Backblaze for offsite backup - alternative to Amazon S3
  • Hi Ben, I have motion-detected events saving to an Amazon S3 bucket, deleted after one month. However, I have just signed up for Backblaze, who offer an Amazon S3 compatible API (see link), because they are significantly cheaper, and my offsite storage costs for motion events across 4 cameras are mounting.

    Unfortunately, when I tried to use the Backblaze Amazon S3 compatible credentials, I couldn't specify enough settings in SecuritySpy to make it work. Would you be able to support this? It makes a significant cost saving for offsite storage while still being essentially S3 at a cheaper price, and I am sure this would appeal to your users.

    https://www.backblaze.com/b2/docs/s3_compatible_api.html
  • This is an interesting potential option for cheaper offsite storage. When you attempt a test upload via Preferences -> Uploads in SecuritySpy, what error messages do you get?

    We use Amazon's AWS Command Line Interface tool to do the S3 uploads - perhaps this isn't compatible with the Backblaze API for some reason. Unfortunately, if this is the case, there would be little we can do about it. But the error message may shed some light as to what is going wrong.
  • They do have some documentation for developers using the API
    https://www.backblaze.com/b2/docs/
    - in the meantime a screenshot of the error is here:

    https://www.dropbox.com/s/abe9t9n4bsl7b3i/Screenshot 2020-05-23 at 18.56.08.png?dl=0

    At 1/4 of the price of S3 it is a significant incentive for offsite event storage!
  • I was intrigued by this, since I didn't have an offsite solution set up yet for my SecuritySpy captures. I created a Backblaze B2 bucket that was S3 compatible, and received the same errors when attempting a test upload from SecuritySpy into that bucket.

    Ben, your clue that you utilize the AWS CLI led me to the root of the failure, as well as a workaround solution.

    The AWS CLI uses Amazon endpoints by default. It will upload to B2 buckets ONLY IF you supply the correct endpoint via a command-line parameter (--endpoint-url). I had hoped that the endpoint URL could be specified in $HOME/.aws/config, but that doesn't seem to be the case.

    I poked around until I found the location where SecuritySpy installs the AWS CLI ($HOME/.local/lib/aws). I then performed these steps to hardcode the Backblaze endpoint so that uploads from SecuritySpy would work.

    1) Rename the distribution AWS application. [mv $HOME/.local/lib/aws/bin/aws $HOME/.local/lib/aws/bin/aws-dist]

    2) Create a wrapper script at $HOME/.local/lib/aws/bin/aws that calls "aws-dist". The contents of this wrapper script (between the "------" markers):
    ------
    #! /bin/sh

    $HOME/.local/lib/aws/bin/aws-dist --endpoint-url=https://s3.us-west-002.backblazeb2.com "$@"
    exit $?
    ------
    Note that your endpoint URL might be different; it might have been better to read it from an environment variable. Also, the quotes around $@ are critical.

    3) Make the wrapper script executable. [chmod 755 $HOME/.local/lib/aws/bin/aws]

    4) Configure in SecuritySpy. "Path on server" is just a top-level folder inside your bucket, and not the endpoint like in your screenshot. Mine is just "SSpy" - yours can be whatever you choose.

    After that, attempt a test upload. Mine worked, and uploads have been working since.
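    The steps above can be rehearsed safely in a throwaway directory before touching the real install, using a fake aws-dist that just echoes its arguments instead of uploading. Everything below is a sketch: the temp paths and sample file name are invented, and only the endpoint URL comes from my setup.

```shell
#!/bin/sh
# Rehearse the wrapper trick in a temporary directory, with a fake
# "aws-dist" that echoes its arguments one per line instead of uploading.
set -e
DEMO=$(mktemp -d)

# Step 1 stand-in: the renamed distribution binary.
cat > "$DEMO/aws-dist" <<'FAKE'
#!/bin/sh
printf '%s\n' "$@"
FAKE
chmod 755 "$DEMO/aws-dist"

# Step 2: the wrapper prepends the B2 endpoint and forwards the quoted "$@".
cat > "$DEMO/aws" <<WRAP
#!/bin/sh
"$DEMO/aws-dist" --endpoint-url=https://s3.us-west-002.backblazeb2.com "\$@"
exit \$?
WRAP

# Step 3: make the wrapper executable.
chmod 755 "$DEMO/aws"

# A filename with spaces must arrive as ONE argument - this is why
# the quotes around "$@" in the wrapper are critical.
OUT=$("$DEMO/aws" s3 cp "MC 2020-05-23.m4v" s3://mybucket/SSpy/)
printf '%s\n' "$OUT"
```

    If the first echoed line is the endpoint flag and the spaced filename comes through as a single argument, the real wrapper will behave the same way.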
  • @krose - Thank you so much!

    This is a little beyond my general Mac experience, but I will attempt to follow the instructions and see what I can make work!

    Very pleased to know that it can be used with Backblaze, and it would be nice if this could be introduced into a release as well!

  • @sachaski - You are welcome. Please be aware that following these steps will disable your ability to use Amazon's S3 service, since the B2 endpoint is hardcoded.
  • Thanks! This also works for DreamObjects with their URL.
  • @krose - I'll not need the amazon services if I can get Backblaze going! :)
  • Many thanks @krose for outlining exactly what needs to be done in order to support this. I think this would be a useful addition to SecuritySpy, so we've added this functionality to the latest beta version of SecuritySpy (currently 5.2.3b22). Under the upload settings, you will now find separate bucket-name and endpoint fields for S3 uploads. Leave the endpoint empty for standard AWS S3, or specify your Backblaze B2 endpoint address.

    To test, please undo your custom aws config (e.g. rename the folders so that the standard aws tool is at ~/.local/lib/aws/bin), and then test with the B2 endpoint address entered into SecuritySpy. Please confirm this works as expected.

    @sachaski - assuming you haven't done the custom setup yet, could you please test this too? Enter into SecuritySpy the specific endpoint address that Backblaze has provided to you. It can be entered with or without the "https://" protocol specifier.
  • @dnchen, if you can test this with DreamObjects too that would be great!
  • This is great, thank you Ben! I can confirm it works when using my own S3 endpoint with software called Minio.

    If anyone is interested in creating their own S3 storage, I can recommend setting up Minio in a Docker container. I've set this up on a little Linux server running in my parents' house, which acts as a free offsite S3/backup solution.

    https://min.io

    I access this via a reverse proxy using my own URL, e.g. s3.myhome.com.
    A very useful guide to set up a reverse proxy is here: https://www.smarthomebeginner.com/traefik-2-docker-tutorial/

    example docker-compose details:

      s3:
        image: minio/minio:latest
        hostname: minio
        container_name: minio
        restart: always
        ports:
          - "9007:9000"
        volumes:
          - /data/backups:/data
          - ${USERDIR}/docker/minio/config:/root/.minio
        command: server /data

    If you want to remove old video files, you could use the following MinIO Client (mc) command:

    mc rm -r --force --older-than 1d2h30m myminio/mybucket
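
    To run that cleanup automatically, it could go into cron. This is just a sketch: the mc path, the daily 03:00 schedule, and the 30-day retention window are my assumptions - keep whatever --older-than value suits you.

```
# Example crontab entry (edit with crontab -e): prune objects older than
# 30 days, every day at 03:00. Path, schedule, and retention are examples.
0 3 * * * /usr/local/bin/mc rm -r --force --older-than 30d myminio/mybucket
```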

    I also use Duplicati as a backup client for various machines to my offsite S3 storage:
    https://www.duplicati.com
  • Hi @paul2020 this sounds great, thanks for posting!
  • Thanks, @Ben, for adding the endpoint-url into the GUI! I can confirm that it does work when using the Backblaze servers (after backing out my workarounds).
  • @ben Thank you and I will get on to testing it but probably at the weekend when I will have a moment to do so!
  • @krose - great to hear, and thanks for posting your solution, this made it easier for us to implement this in SecuritySpy.
  • @ben I have the beta software uploading from only one of my cameras, as far as I can see, and not the others. I'll look at Backblaze in the morning to see what, if anything, has changed there.

    On another note, after installing the beta I am getting 4001 errors on one of my Reolink cameras. I have two Reolink cameras, and at first they were both getting these errors, but one of them settled down; the other is mostly still trying to connect.

    My computer has 128 GB RAM, 2 x 3.46 GHz 6-core Intel Xeon CPUs, and a Radeon RX 580 8 GB, running Mojave.

    The Reolink model is an RLC-422

    I never had a problem with it until the beta software, so I was wondering if there is perhaps a bug to be ironed out there?
  • @ben - Backblaze seems to be accepting the other streams now. Perhaps it was an initialisation issue, but I did notice that the one stream it was recording had no spaces in its camera name, so I hyphenated the other camera names, and perhaps that solved it.

    Also, the Reolink seems to have settled down as well, so possibly a false alarm.
  • Hi @sachaski, good to hear the uploads are now working, yes perhaps this was an issue with the file names.

    As for the Reolink cameras, this is not a problem with the beta, but rather with the cameras themselves, whereby they provide unreliable RTSP streams. Please see our notes on our list of supported cameras, where we advise against using Reolink cameras.
  • @ben the beta crashed last night.

    I'll send the crash report to you by email.
  • @ben sorry, I just realized you sent an update with the endpoint. It works well with DreamObjects. Thanks!
  • @dnchen - great, thanks for reporting back!
