Recently, I've been pruning a very leggy jade plant. Hiding it in shade hasn't been good for it. I'm excited that the pruning is leading to new branches and leaves, and I wanted to see its progress after pruning in the form of a timelapse.

I've written this as a reminder of the journey to build this and the mistakes I made along the way! While the code for this is open source, it's not accessible to the public for privacy reasons.

To make a timelapse, I needed a camera that could take photos on a timer. There were a few options here and I favoured two: use an existing camera and trigger it remotely, or build a Raspberry Pi timelapse rig. I initially looked to borrow my partner's digital camera, but it couldn't be triggered from another device or be charged without removing the battery.

So, Raspberry Pi it was then! I got a Raspberry Pi 4 and a Raspberry Pi Camera Module. My plan at this point: build a web application that would control when the photos were taken and store them. I also wanted to deploy it to the Raspberry Pi as a Docker image, as part of my home Kubernetes "cluster" (a manager node and this new Raspberry Pi).

(Image: an example)

Initial setup

The first step, at this point, was to get the camera working with the Raspberry Pi. It took a few attempts, including plugging the camera into the wrong port (I hadn't realised there were two ports on the Pi!). It took a while, but the camera eventually worked, tested using the raspistill command.
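For reference, a basic test capture with raspistill looks something like this:

```shell
# Take a single still and write it to test.jpg
raspistill -o test.jpg

# Preview for five seconds before capturing (-t is in milliseconds),
# which is handy for checking where the camera is pointing
raspistill -t 5000 -o test.jpg
```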

A basic web application

Initially, I considered building separate components (camera, timer, video composer). However, I decided against this as it would be more difficult to manage the configuration across multiple applications. Instead, I decided to build a single application that would handle most of this functionality, except for the video composer, to limit the processing on the Pi.

I chose Go for writing the application as I had found the raspicam library.

I implemented this in a few broad steps:

  1. Test raspicam to get images, choosing the right file format. I settled on PNG for its lossless compression.
  2. Design the timelapse config model, setting the config manually.
  3. Build an endpoint to list the current images.

This gave me a basic application to start generating timelapses. However, I wasn't very happy with the product at this stage; there were a few problems with it:

  • It was difficult to work out where the camera was pointing and how it was focussed.
  • The style of the pages wasn't great.
  • It was difficult to reset the timelapse.
  • The lighting wasn't always uniform.

Now, fixing these things wasn't particularly difficult, so I'll skip to the hard parts.

Race conditions

Hard part number one.

Race conditions.

I noticed that after a little while the software would no longer be able to get photos from the camera. I narrowed this down to raspistill freezing if more than one instance ran at once. The only way to fix this was to restart the Pi. Not great for a web service. I initially tried using mutexes to prevent the camera from being used by more than one caller at a time but, in the end, it was clearer to use channels.
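The channel approach boils down to one goroutine owning the camera outright, so raspistill can never be running twice. A minimal sketch of the pattern, with illustrative names and the real raspistill invocation stubbed out as a function argument:

```go
package main

// captureRequest asks the camera goroutine for one photo; the result
// (or error) comes back on Reply.
type captureRequest struct {
	Path  string
	Reply chan error
}

// cameraLoop is the only goroutine allowed to touch the camera, so
// requests are naturally serialised. capture stands in for the real
// raspistill call.
func cameraLoop(requests <-chan captureRequest, capture func(path string) error) {
	for req := range requests {
		req.Reply <- capture(req.Path)
	}
}

// takePhoto sends a request and blocks until the camera goroutine
// has finished with it.
func takePhoto(requests chan<- captureRequest, path string) error {
	reply := make(chan error)
	requests <- captureRequest{Path: path, Reply: reply}
	return <-reply
}
```

Compared with a mutex, the ownership is explicit: there is exactly one place in the program that can run the camera command.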

The other race condition I needed to solve was reading and creating images at the same time.
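One way to handle that reader/writer race, sketched here with illustrative names rather than the actual implementation, is a sync.RWMutex: the capture path takes the write lock while recording a new image, and the HTTP handlers take the read lock to list them.

```go
package main

import "sync"

// imageStore guards the image list so HTTP handlers can read it
// while the camera loop is adding new captures.
type imageStore struct {
	mu    sync.RWMutex
	paths []string
}

// Add records a newly captured image (writer side).
func (s *imageStore) Add(path string) {
	s.mu.Lock()
	defer s.mu.Unlock()
	s.paths = append(s.paths, path)
}

// List returns a copy of the current image paths (reader side);
// copying means callers can't mutate the shared slice.
func (s *imageStore) List() []string {
	s.mu.RLock()
	defer s.mu.RUnlock()
	return append([]string(nil), s.paths...)
}
```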

This didn't solve every single problem, unfortunately, so, in the end, there's a cron script that checks whether the application can still take timelapse images and, if not, restarts the Pi. A bit hacky!
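The watchdog could look something like this; the /health endpoint, port, script path and schedule are all assumptions for illustration, not the actual script.

```shell
#!/bin/sh
# Hypothetical watchdog: ask the application whether it can still take
# photos via an assumed /health endpoint, and reboot the Pi if not.
if ! curl -fsS --max-time 30 http://localhost:8080/health > /dev/null; then
    sudo reboot
fi

# Example crontab entry to run it every 15 minutes:
#   */15 * * * * /home/pi/camera-watchdog.sh
```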

Containers and orchestration

As part of this project, I wanted to build more experience with Kubernetes and Docker.

This came with some hard problems:

  • Getting the raspistill binary
  • Finding the other required libraries
  • Building without mtrace and execinfo
  • Mounting the devices
  • Running on the right device

This was solvable but it did take significant work that isn't quite captured by this commit!
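The device-mounting and node-pinning parts can be sketched as a pod spec along these lines; the image name, hostname label and privileged setting are illustrative, but the camera does need access to the host's /dev/vchiq device, and nodeSelector is what keeps the pod on the Pi with the camera attached.

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: timelapse
spec:
  # Pin to the Pi that has the camera attached (label value is illustrative)
  nodeSelector:
    kubernetes.io/hostname: raspberrypi
  containers:
    - name: timelapse
      image: registry.local/timelapse:latest  # illustrative image name
      securityContext:
        privileged: true  # raspistill needs access to the GPU/camera devices
      volumeMounts:
        - name: vchiq
          mountPath: /dev/vchiq
  volumes:
    - name: vchiq
      hostPath:
        path: /dev/vchiq
```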

Once this was done, it started to get frustrating waiting for the image to rebuild every time I wanted to deploy it. I solved this in a few ways:

Actually making the videos

Now that I'd got this far, I needed to actually make the video. I wrote a script to pull all of the files together and then turn them into a video.

A quick overview of this:

  1. Fetch the images from the Raspberry Pi, using rsync.
  2. For each image, do some colour correction and image cleanup.
  3. Use mencoder to stitch them together into a video.
  4. Use ffmpeg to compress the video into something usable.
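Sketched as a script, the pipeline looks roughly like this; the host, paths, cleanup step and codec settings are illustrative assumptions, not the exact values used.

```shell
#!/bin/sh
# 1. Pull the captured frames from the Pi (host and path are assumed).
rsync -avz pi@raspberrypi:/data/timelapse/ ./frames/

# 2. Per-image cleanup; as one example, normalise levels with ImageMagick.
mogrify -auto-level ./frames/*.png

# 3. Stitch the frames into a video with mencoder.
mencoder "mf://./frames/*.png" -mf fps=24:type=png \
    -ovc lavc -lavcopts vcodec=mpeg4 -o timelapse.avi

# 4. Re-encode with ffmpeg into a smaller, more portable H.264 file.
ffmpeg -i timelapse.avi -c:v libx264 -crf 23 -pix_fmt yuv420p timelapse.mp4
```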