What a difference a year makes. Last year I made a time-lapse video of the Vernal equinox in Wooster Hall by standing next to a wall and taking a photo about once a minute for an hour. This year I have a much nicer video from a Raspberry Pi camera which captures an image every 5 seconds. Here is the result:
I’ll edit this post later to provide more information, but for now I’m posting it in the hopes that the video is useful for tonight’s event, which alas I will miss because I teach a lab during that time. Enjoy!
Added the next day…
Here are some of the details about how this video was created. You can compare this to how I did it a year ago and see if you think I’ve made progress.
First, the images were all captured by a Raspberry Pi 2 with an attached camera, driven by a Python script that starts at boot time (unless a mouse or keyboard is plugged into a USB port). The camera simply takes an image every 5 seconds and saves it. No further processing is done on the Pi.
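A minimal sketch of what such a capture loop might look like. The helper names (`input_device_present`, `capture_frames`) and the `/dev/input` check are my assumptions, not the author's actual script; the real camera call (e.g. picamera's `PiCamera.capture`) is injected as a plain callable so the loop can run anywhere.

```python
# Sketch of a boot-time capture loop: one frame every 5 seconds, inhibited
# when a USB mouse or keyboard is plugged in. Names here are hypothetical.
import glob
import time

def input_device_present():
    # /dev/input gains mouse* nodes when a USB mouse is attached; something
    # like this condition could be used to skip capture at boot.
    return bool(glob.glob("/dev/input/mouse*"))

def capture_frames(capture, n_frames, interval=5, sleep=time.sleep):
    # `capture` stands in for the real camera call, e.g. PiCamera.capture().
    # Injecting it (and `sleep`) keeps the loop testable off the Pi.
    for n in range(n_frames):
        capture("frame%05d.jpg" % n)  # zero-padded names keep frames ordered
        sleep(interval)

if __name__ == "__main__":
    if not input_device_present():
        # On the real Pi this would wrap picamera and run for the full event;
        # here we just print three stand-in filenames with no delay.
        capture_frames(print, n_frames=3, interval=0, sleep=lambda s: None)
```

An hour of capture at this rate yields 720 frames, which is why zero-padded, sortable filenames matter for the later assembly step.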
Then I take the Pi back to my lab and plug in a monitor, keyboard, and mouse. The only reason for the mouse is to inhibit starting the camera. I bundle the images into a tarball and copy that to a memory stick, which I then take to the Mac in my office.
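The packaging step might look something like this (the directory and archive names are my assumptions, and the `touch` lines create stand-in frames so the sketch runs anywhere):

```shell
# Sketch of bundling captured frames into a gzipped tarball for transfer.
mkdir -p frames
touch frames/frame00001.jpg frames/frame00002.jpg  # stand-ins for real captures
tar czf equinox_frames.tar.gz -C frames .
tar tzf equinox_frames.tar.gz  # list the archive to confirm its contents
```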
The Mac runs the free version of the software TLDF (the name comes from “Time-Lapse De-Flicker”). I drag the files into TLDF, check the box for “Blend”, set it to blend 3 frames, and press “Render”. It does the rest and produces an MP4 video. That’s it.
It’s not as fun as writing my own Python script to do variable duration frames in an animated GIF, but I sure do like the results.
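For comparison, the do-it-yourself approach mentioned above can be sketched with Pillow, which accepts a per-frame list of display durations when saving a GIF. This is only an illustration of the technique, not the author's original script, and the filenames are invented:

```python
# Sketch of writing an animated GIF with variable-duration frames via Pillow.
from PIL import Image

def save_variable_gif(frames, durations_ms, path):
    # durations_ms is one display time (in milliseconds) per frame;
    # Pillow applies the list element-wise to the appended frames.
    frames[0].save(
        path,
        save_all=True,
        append_images=frames[1:],
        duration=durations_ms,
        loop=0,  # loop forever
    )

# Three solid-color stand-in frames with increasing display times.
frames = [Image.new("RGB", (32, 32), c) for c in ("red", "green", "blue")]
save_variable_gif(frames, [100, 500, 1000], "equinox.gif")
```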