kivy, the hero of iot gui developers

Kivy – An open-source Python library for rapid development of applications that make use of innovative user interfaces, such as multi-touch apps.

The Internet of Things (IoT) is the likely future of the gadgets and devices you'll have in your homes and cars (if you don't already), as well as the technology you wear. At a minimum, these things use the cloud to gather and store information. In a way, even smartphones fall into this category if you think about it. Amazon's Echo is a good example of such a device.

Many of us who self-identify as “makers” use small computers with similar capabilities to build these kinds of gadgets. When we're coding software in this space, Python is the usual choice for the task.

Until now, we haven't had many options for displaying graphical menus on such tiny screens other than the full “desktop” GUI of some Linux-like operating system, which didn't really work well, to be honest.

Introducing Kivy

A relatively new technology is the Kivy library. Imagine being able to describe the many screens you'd find in an application, whether it's for a smartphone, the touchscreen of a printer, or even a watch. Kivy then takes care of the rest for you, rendering those screens with a graphics engine behind the scenes. It even manages clicks and other gestures, routing them to fire off portions of your code.
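To make that concrete, here's a rough sketch of what a tiny Kivy app can look like. The screen is described declaratively in Kivy's KV language and a touch on the button calls an ordinary Python method; the TimelapseRoot name is just a placeholder I made up for illustration.

    from kivy.app import App
    from kivy.lang import Builder
    from kivy.uix.boxlayout import BoxLayout

    # The screen layout, described in Kivy's KV language.
    KV = """
    <TimelapseRoot>:
        orientation: "vertical"
        Label:
            text: "Time-lapse controller"
        Button:
            text: "Start"
            on_press: root.start_pressed()
    """

    class TimelapseRoot(BoxLayout):
        def start_pressed(self):
            # Kivy routes the touch event here; the real work would go in this method.
            print("Start button touched")

    class DemoApp(App):
        def build(self):
            Builder.load_string(KV)   # register the KV rule, then build the root widget
            return TimelapseRoot()

    if __name__ == "__main__":
        DemoApp().run()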

Kivy comes equipped with an impressive collection of pre-defined screen widgets as well as the ability to create your own custom types. And you get all this for the low, low price of free (unlike its competitor Qt, whose commercial licensing can run $5K+ per year).
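Rolling your own widget type can be as simple as composing a few stock widgets into a new class. Here's a hedged sketch; LabeledSlider is a made-up name for illustration, not something Kivy ships with.

    from kivy.uix.boxlayout import BoxLayout
    from kivy.uix.label import Label
    from kivy.uix.slider import Slider

    class LabeledSlider(BoxLayout):
        """A custom widget: a slider that reports its value in a label."""

        def __init__(self, title="Value", **kwargs):
            super(LabeledSlider, self).__init__(orientation="horizontal", **kwargs)
            self._title = title
            self.label = Label(text="{}: 0".format(title))
            self.slider = Slider(min=0, max=100, value=0)
            # Update the label whenever the slider's value changes.
            self.slider.bind(value=self._on_value)
            self.add_widget(self.label)
            self.add_widget(self.slider)

        def _on_value(self, slider, value):
            self.label.text = "{}: {}".format(self._title, int(value))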

I've had the pleasure of working with Kivy on an almost daily basis over the last two months, and I must say I'm still just as fond of it now as the day I first learned of it.

If you’re a coder and you know Python, I would suggest that you add Kivy to your toolbelt. You’ll find that it’s easy to use and worth the effort you put into it.

small screen for the raspberry pi 3

I thought I'd do some prep work for a project that I'd like to finish before the Christmas break: a time-lapse rail kit for the Nikon D750 DSLR camera. I'll be going to Arches National Park in Utah for that week and want to shoot some astrophotography and sunset time-lapse videos. Here's roughly what the rig will look like:

[Image: David Hunt's pi-lapse rig]

Photographer and inventor David Hunt has done a pretty good job on his rig and has produced some stunning videos. I hope to take things up a notch since I have access to a 3D printer and a variety of extruded 80/20 aluminum rails from ActoBotics, for example.

Oh… and the entire rig will need to be portable since I’ll likely be backpacking it into the park. Fortunately, I have a sewing machine and a good supply of marine-grade canvas to create something to hold and carry all of this.

TFT

Fortunately, Fry's Electronics sells some of what Adafruit has to offer, and in this case it's a tiny TFT screen with a touchscreen built in. It's technically called a Pi “HAT” since it connects right onto the top of a Raspberry Pi 3, for example.

[Image: Adafruit PiTFT touchscreen (product 1601) on a Raspberry Pi]

I've got it connected to a Raspberry Pi 3, inserted a new 4GB microSD card for this project, and loaded Raspbian Jessie Lite onto that card. Although my setup won't have a nifty graphical desktop like the photo above, it will still run touch-based graphical menus.
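One wrinkle worth noting: Kivy generally has to be told where the touchscreen's input device lives. Something along these lines should do it, though the option name and the /dev/input/touchscreen path are assumptions that depend on how the PiTFT driver exposes the device on a particular setup.

    # Register the PiTFT's resistive touchscreen as a Kivy input provider.
    # NOTE: the device path below is an assumption -- check `ls /dev/input/`
    # (or the udev rules your PiTFT setup created) for the real one.
    from kivy.config import Config

    Config.set('input', 'pitft', 'hidinput,/dev/input/touchscreen')

    # Import the rest of Kivy only after the config tweak above.
    from kivy.app import App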

Python

Looks like I’ll be using the Python programming language for this project. I’d prefer JavaScript but I only have a few weeks to get this “production-ready”, so to speak.

Kivy

The next step in developing graphical menus that respond to touch is to install the Kivy framework for Python. The menu should let you set configuration options for the interval between photos, the number of photos in the series, and things pertinent to stepping the camera along the rail with a motor. Finally, there would be start and stop controls for each session as well as ongoing status.
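Here's a rough sketch of how that menu might hang together, assuming a single settings screen. The option names (interval, shot count) mirror the list above, and start_session/stop_session are just placeholders for the eventual motor and camera logic.

    from kivy.app import App
    from kivy.lang import Builder
    from kivy.uix.boxlayout import BoxLayout
    from kivy.properties import NumericProperty, StringProperty

    KV = """
    <ControlPanel>:
        orientation: "vertical"
        Label:
            text: "Interval: {}s   Shots: {}".format(int(root.interval), int(root.shot_count))
        Slider:
            min: 1
            max: 60
            value: root.interval
            on_value: root.interval = self.value
        Slider:
            min: 10
            max: 999
            value: root.shot_count
            on_value: root.shot_count = self.value
        BoxLayout:
            Button:
                text: "Start"
                on_press: root.start_session()
            Button:
                text: "Stop"
                on_press: root.stop_session()
        Label:
            text: "Status: " + root.status
    """

    class ControlPanel(BoxLayout):
        interval = NumericProperty(5)      # seconds between shots
        shot_count = NumericProperty(240)  # photos in the series
        status = StringProperty("idle")

        def start_session(self):
            self.status = "running"   # placeholder: kick off the motor/camera loop here

        def stop_session(self):
            self.status = "stopped"   # placeholder: halt the session here

    class TimelapseApp(App):
        def build(self):
            Builder.load_string(KV)
            return ControlPanel()

    if __name__ == "__main__":
        TimelapseApp().run()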

Nikon

The Nikon D750 has a remote-shutter connector, and I've managed to find a good third-party version of the cable, which should come in handy for this. I've spec'd out that interface, so I should be able to fire the camera remotely from the Raspberry Pi.
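Here's a hedged sketch of what the trigger code could look like, assuming the cable's focus and shutter lines are switched to ground through optocouplers (or transistors) driven by two GPIO pins; the pin numbers are placeholders, not a wiring recommendation.

    from time import sleep
    from gpiozero import DigitalOutputDevice

    FOCUS_PIN = 23    # BCM numbering -- placeholder, depends on the wiring
    SHUTTER_PIN = 24  # placeholder

    focus = DigitalOutputDevice(FOCUS_PIN)
    shutter = DigitalOutputDevice(SHUTTER_PIN)

    def take_photo(hold=0.2):
        """Pulse focus, then shutter -- roughly like pressing the release button."""
        focus.on()
        sleep(0.1)        # give the camera a moment to wake up / focus
        shutter.on()
        sleep(hold)       # hold the shutter line long enough to register
        shutter.off()
        focus.off()

    if __name__ == "__main__":
        take_photo()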

Overall

This should be a fun project. I hope I can finish it in the roughly five weeks left before Christmas break.