autonomous tank

I managed to snag some great track data today at the venue. I wrote a small service to take a snapshot every second while I manually drove around the track a few times.
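The service itself is nothing fancy; it amounts to a one-second capture loop. Here's a minimal sketch in Python using OpenCV to grab frames from the webcam (the device index and output path are placeholders, and the real service may capture frames differently):

```python
import time
import cv2

# Grab one timestamped frame per second from the webcam while driving laps.
camera = cv2.VideoCapture(0)   # assumes the webcam shows up as device 0

try:
    while True:
        ok, frame = camera.read()
        if ok:
            cv2.imwrite("/home/pi/track/%d.jpg" % int(time.time()), frame)
        time.sleep(1)
except KeyboardInterrupt:
    camera.release()
```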

With the data in hand back at home, I was able to do some processing on the images from my own webcam.

A New Perspective

I thought I'd compensate for the lines-of-perspective effect so that the trending portion of the software could have accurate data. Since Jimp doesn't have a skew function, and since its convolute() method didn't work as expected with what seemed like the right matrix for the job, I ended up writing my own prototype, which now works as shown below.

Screen Shot 2018-09-22 at 9.34.22 PM
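Prototype details aside, the underlying remap is easy to sketch: stretch the far (upper) rows horizontally so the trapezoidal view of the track becomes roughly rectangular. Here's the idea in Python with Pillow, assuming a simple linear stretch; the top_scale calibration value is an illustrative guess rather than the number the tank uses:

```python
from PIL import Image

def compensate_perspective(src_path, dst_path, top_scale=0.4):
    """Stretch the far (upper) rows so the trapezoidal view of the track
    becomes roughly rectangular.  top_scale is the assumed horizontal
    compression at the very top of the frame."""
    src = Image.open(src_path).convert("RGB")
    w, h = src.size
    dst = Image.new("RGB", (w, h))
    s_px, d_px = src.load(), dst.load()
    center = (w - 1) / 2.0
    for y in range(h):
        # rows near the top are farther away and therefore more compressed
        scale = top_scale + (1.0 - top_scale) * (y / float(h - 1))
        for x in range(w):
            # inverse map: pull each output pixel from the squeezed source row
            sx = int(center + (x - center) * scale)
            d_px[x, y] = s_px[sx, y]
    dst.save(dst_path)
```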

file management for 3d printers

I use the software OctoPrint to control my 3D printer. It's an excellent web service with a rich collection of methods in its REST API. It was designed and coded by Gina Häußge, who continues to maintain it.

I've just created my first plugin for OctoPrint. It should allow those who need version control to pull their sliced files from a GitHub repository of their choosing. The interface and concepts are simple enough: identify the repository and press the button to pull the latest from it.

settings

buttonfiles
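Conceptually there isn't much to it: read the repository from the plugin settings and pull it into a folder when the button is pressed. Here's a rough outline using OctoPrint's plugin mixins, with the pull implemented as a shell-out to git; the setting names, folder layout and command wiring are simplified for illustration and aren't the plugin's exact code:

```python
import subprocess
import octoprint.plugin

class PullFromRepoPlugin(octoprint.plugin.SettingsPlugin,
                         octoprint.plugin.SimpleApiPlugin):

    def get_settings_defaults(self):
        # the GitHub repository holding the sliced files, plus where to keep it
        return dict(repo="", folder="/home/pi/.octoprint/uploads/repo")

    def get_api_commands(self):
        # the button in the UI would POST {"command": "pull"} to /api/plugin/<id>
        return dict(pull=[])

    def on_api_command(self, command, data):
        if command == "pull":
            repo = self._settings.get(["repo"])
            folder = self._settings.get(["folder"])
            try:
                # pull the latest if the clone already exists...
                subprocess.check_call(["git", "-C", folder, "pull"])
            except subprocess.CalledProcessError:
                # ...otherwise clone it for the first time
                subprocess.check_call(["git", "clone", repo, folder])

__plugin_implementation__ = PullFromRepoPlugin()
```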

Update:

My new plugin is now listed and published on plugins.octoprint.org. Yay, me.

plugins.octoprint.org

tanks a lot

I decided to build a very cool-looking robotic tank kit made by OSEPP. They have a variety of grown-up toys like this in the geekspace.

I guess I’ve been inspired lately by some of the local meetups which involve races with autonomously-driven cars.

A surprising amount of hardware is going into this project, along with several programming languages all at once. I've had to bounce back and forth between Python and C as I interface the Raspberry Pi Zero W computer with the Arduino Mega 2560 R3 Plus board. This Arduino doesn't come with Bluetooth, Wi-Fi or even an Ethernet jack, so I opted to add the Pi since it's inexpensive and comes with a full operating system. The Pi of course carries a webcam, which initially makes the remote-control features easy. Later, that same camera will be used to capture images to be processed for autonomous driving.
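The two boards need a way to talk to each other, and a USB serial link is the usual arrangement for a Pi-plus-Arduino pairing. A rough sketch of the Pi half in Python using pyserial follows; the port name, baud rate and the one-line command format are illustrative assumptions, with the C sketch on the Arduino parsing the same strings on its end:

```python
import serial

# Assumes the Arduino is plugged into the Pi over USB and enumerates as /dev/ttyACM0.
arduino = serial.Serial("/dev/ttyACM0", 115200, timeout=1)

def drive(left, right):
    """Send a made-up 'L<speed> R<speed>' command for the Arduino to turn
    into motor PWM values."""
    arduino.write(("L%d R%d\n" % (left, right)).encode("ascii"))
    return arduino.readline()   # optional acknowledgement from the Arduino

drive(128, 128)   # e.g. straight ahead at roughly half speed
```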

DSC_0072

Repository

Update:

I decided to design some plastic parts for the tank. It's now looking awesome and has some quick-release pins. I've also purchased a 12-battery AA charger and batteries for the project, since it seems to be hungry for power.

DSC_0073

DSC_0074

Screen Shot 2018-09-12 at 2.00.44 PM

The first three attempts at managing the tracks for steering didn't seem accurate enough, at least for my own driving-related expectations. I finally resorted to trigonometry in the last set of calculations; this appears to be a more natural steering interface.
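Roughly, the trig approach converts the controller's x/y position into an angle and a magnitude and then mixes those into left and right track speeds. A sketch of that idea (an illustrative mixing formula rather than the exact calculation in the code):

```python
import math

def steer(x, y, max_speed=255):
    """Convert a joystick-style (x, y), each in the range -1..1, into
    left and right track speeds for differential ("tank") steering."""
    angle = math.atan2(y, x)                 # heading, in radians
    magnitude = min(1.0, math.hypot(x, y))   # how far the stick is pushed
    left = magnitude * (math.sin(angle) + math.cos(angle))
    right = magnitude * (math.sin(angle) - math.cos(angle))
    # normalise so neither track exceeds full speed
    scale = max(1.0, abs(left), abs(right))
    return (int(max_speed * left / scale), int(max_speed * right / scale))

print(steer(0.0, 1.0))   # full forward: both tracks equal
print(steer(1.0, 0.0))   # hard right: tracks spin in opposite directions
```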

It looks like the first two phases of the project are complete and I'm well into the third (autonomous) phase.

Autonomous (Self-Driving) Mode

Next up is the part where a service takes snapshots from the camera and then uses them to make steering decisions. The strategy here is to use image processing to find the road, so to speak: our position relative to the path ahead as well as any competitors also on the track.

The first interesting piece of the data processing involves some linear algebra and a handful of matrices, each performing a distinct function. You basically multiply a particular matrix element-by-element against each 3×3 array of pixels, sum the results, and use that sum to replace the center pixel's color. The first and most useful matrix is named findEdgesKernel and looks like this:

-1 -1 -1
-1  8 -1
-1 -1 -1

This produces a new, simpler image which should highlight only the track (masking tape) for the path ahead. This part is working quite well, so the next step is to process the resulting path image to determine how the tank should steer, both now and in the near future.
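In other words, it's a straightforward 3×3 convolution. The loop below sketches that arithmetic in Python with numpy on a greyscale frame; the real pipeline may differ in detail, but the kernel and the multiply-and-sum step are the same:

```python
import numpy as np

findEdgesKernel = np.array([[-1, -1, -1],
                            [-1,  8, -1],
                            [-1, -1, -1]])

def convolve(gray, kernel=findEdgesKernel):
    """Replace every interior pixel with the weighted sum of its 3x3
    neighbourhood.  `gray` is a 2-D numpy array of values 0-255."""
    h, w = gray.shape
    out = np.zeros((h, w))
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            region = gray[y - 1:y + 2, x - 1:x + 2]
            out[y, x] = (region * kernel).sum()
    # uniform areas cancel out to ~0; edges such as tape against floor survive
    return np.clip(out, 0, 255).astype(np.uint8)
```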

Screen Shot 2018-09-12 at 2.03.10 PM