iCE40 and the IceStorm Open Source FPGA Workflow

Project IceStorm is the first, and currently only, fully open source workflow for FPGA programming. The toolchain targets the iCE40 series of FPGAs from Lattice Semiconductor. It enables you to take a project from high-level Verilog code to a synthesized bitstream and then use that bitstream to program an FPGA, generally by loading the bitstream to flash memory on a development board or other circuit housing one of the supported chips.

An FPGA (field programmable gate array) is an integrated circuit made up of user-programmable logic blocks, accompanied by an assortment of interconnects, memory, and input/output functionality. Using an HDL (hardware description language) like Verilog, the FPGA can be configured as a digital logic circuit. The complexity of that circuit is limited essentially only by the resources (logic blocks and memory) on the chip. Modern FPGAs have many thousands of logic blocks and can be used to implement complete large-scale computing circuits.
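The logic-block idea is easy to demonstrate in software. Here is a minimal Python sketch - not iCE40-specific, and with the LUT width and behavior simplified - of the look-up table (LUT) at the heart of a typical logic block. A 4-input LUT is just a 16-entry truth table, so "programming" the fabric amounts to filling in tables like this one:

```python
def make_lut4(truth_table):
    """Return a function implementing a 16-entry truth table,
    the software analog of a 4-input LUT."""
    assert len(truth_table) == 16
    def lut(a, b, c, d):
        # the four inputs form an index into the stored table
        index = (a << 3) | (b << 2) | (c << 1) | d
        return truth_table[index]
    return lut

# "Program" the LUT as a 4-input AND gate: only index 15 outputs 1.
and4 = make_lut4([0] * 15 + [1])

# The same "hardware" becomes a 4-input XOR by changing only the table.
xor4 = make_lut4([bin(i).count("1") % 2 for i in range(16)])
```

The point of the sketch is that the gate's function lives entirely in the table contents - which is exactly the kind of configuration data a bitstream carries.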

These days, it is easy to become complacent about the availability of quality open source tools for just about any tech task. But the existence of IceStorm is a genuinely noteworthy accomplishment by developers Clifford Wolf and Mathias Lasser. FPGA manufacturers have been notoriously protective of their intellectual property, and creating a functioning workflow required a deep reverse engineering of the iCE40 FPGA structure and the format of the configuration data (the bitstream) used in programming. No mean feat with scant documentation available and no support from the manufacturer.

Adding a Character LCD to an FPGA Project

Adding a character liquid crystal display (LCD) to an FPGA project is a simple and inexpensive way to get your project talking. In this post I discuss interfacing an FPGA with a garden-variety generic 1602 LCD like the one pictured nearby. These alphanumeric displays include an integrated HD44780-compatible controller with a parallel interface that enables you to send data in ASCII format one byte at a time. The output is two lines of 16 characters each on a fixed dot matrix grid. They're cheap as chips (as my English friends say) and perfect for displaying measurements, results of calculations, and any other simple messages.
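As a rough sketch of what "one byte at a time" means, here is some Python that builds the command/data byte sequence for such a display. The helper names are my own, and this covers only cursor positioning and character data - the real HD44780 protocol also involves an initialization sequence and enable-pin timing:

```python
# The HD44780 distinguishes commands from character data with the RS
# (register select) line: RS=0 for commands, RS=1 for ASCII data.
LINE_ADDR = {0: 0x00, 1: 0x40}  # DDRAM start address of each line

def set_cursor(line, col):
    """Return the 'set DDRAM address' command byte (sent with RS=0)."""
    return 0x80 | (LINE_ADDR[line] + col)

def message_bytes(text, line=0):
    """Build the (rs, byte) sequence to print text on the given line."""
    seq = [(0, set_cursor(line, 0))]           # command: move cursor
    seq += [(1, ord(ch)) for ch in text[:16]]  # data: one ASCII byte per char
    return seq
```

For example, `message_bytes("Hi", line=1)` yields a cursor command (0xC0) followed by the bytes for 'H' and 'i', which a driver would then clock out on the parallel data lines.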


Buster - A Voice Controlled Raspberry Pi Robot Arm

Buster is a fully voice interactive robot built around a tabletop robotic arm. He acts upon commands given in spoken English, such as "Lift the arm two centimeters" or "Close the gripper." He also answers questions about his status, such as "How high is the arm?" or "What is the angle of the elbow?"

Buster accepts considerable variety in the syntax of the spoken commands, recognizing that "Lift the arm two centimeters." is roughly equivalent to "Move arm twenty millimeters up." Buster's responses also include a variety of messages about the system. For example, he will tell you if the command given would put the arm outside its range of motion and will not follow the command. As a bonus, Buster will also engage in brief conversation. He can answer questions from a small knowledge database. He can also act as a talking calculator and give the answers for some basic calculations.
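One way to see how such flexibility can work is to normalize every recognized command into a common internal form. The sketch below is purely illustrative - the word lists and function names are my own, not Buster's actual grammar - but it shows how "lift the arm two centimeters" and "move arm twenty millimeters up" can land on the same result:

```python
# Illustrative vocabularies; a real system would cover far more words.
NUMBERS = {"one": 1, "two": 2, "five": 5, "ten": 10, "twenty": 20}
UNITS_MM = {"millimeters": 1, "centimeters": 10}
UP_WORDS = {"lift", "raise", "up"}
DOWN_WORDS = {"lower", "down"}

def parse_move(command):
    """Return (direction, distance_mm), or None if not understood."""
    words = command.lower().strip(".").split()
    direction = value = unit = None
    for w in words:
        if w in UP_WORDS:
            direction = "up"
        elif w in DOWN_WORDS:
            direction = "down"
        elif w in NUMBERS:
            value = NUMBERS[w]
        elif w in UNITS_MM:
            unit = UNITS_MM[w]
    if direction and value and unit:
        return (direction, value * unit)   # distance in millimeters
    return None
```

Because parsing can fail (returning None), the same structure also gives the robot a natural place to respond with a spoken error message instead of acting.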

As a project, Buster is both affordable and accessible. He is built around the Raspberry Pi and other inexpensive, garden variety hobbyist parts, and programmed using PocketSphinx and other open source tools and libraries. To that end, he represents not only a certain level of personal achievement, but also the truly exciting state of the robotics hobby. With the availability of increasingly capable Linux-based single board computers and mature libraries, robots with solid speech recognition and synthesis are now within every hobbyist's reach.

Haar LBP and HOG - Experiments in OpenCV Object Detection

I've spent some time lately coming up to speed and playing with OpenCV - especially the object detection routines. OpenCV is an open source computer vision library, currently in version 3.1. As far as I can tell it is the most widely used such library, and I think it is also unusual in its comprehensive scope. OpenCV is a catch-all of vision-related utilities. Its functionality ranges from the most pedestrian routines for opening and manipulating a graphic file to elaborate implementations of full computer vision algorithms.

My original interest in OpenCV was at the more basic end. I want ways to be able to access graphic files and feeds from a camera, and to do some preprocessing before handing the image off to a neural network for recognition. OpenCV is rich with options for identifying shapes and colors within images, finding edges of objects, tracking motion and more.

As I dug deeper into OpenCV I realized, somewhat to my surprise, how mature and capable some of the library's object detection algorithms are. Three that caught my eye for further investigation were Haar Cascades, Local Binary Patterns (LBP), and Histogram of Oriented Gradients (HOG). These capabilities are often presented in terms of sample applications. For example, the library includes a fully trained human face detector that lets you build face detection into a project with just a few lines of code. However, the library also includes the necessary functionality to develop customized object recognizers that use these very same algorithms.
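To give a flavor of one of these algorithms, here is a minimal pure-Python sketch of the core HOG idea: compute per-pixel gradients, then accumulate their magnitudes into orientation bins. Real HOG (e.g., OpenCV's HOGDescriptor) adds cells, blocks, and normalization on top of this - the sketch shows only the histogram step:

```python
import math

def hog_histogram(image, bins=9):
    """image: 2D list of grayscale values. Returns a bins-length
    histogram of gradient orientations weighted by magnitude."""
    h, w = len(image), len(image[0])
    hist = [0.0] * bins
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = image[y][x + 1] - image[y][x - 1]  # horizontal gradient
            gy = image[y + 1][x] - image[y - 1][x]  # vertical gradient
            mag = math.hypot(gx, gy)
            # unsigned orientation folded into [0, 180)
            ang = math.degrees(math.atan2(gy, gx)) % 180.0
            hist[int(ang / 180.0 * bins) % bins] += mag
    return hist
```

Feeding this a tiny image containing a single vertical edge puts all of the gradient energy into the 0-degree bin, which is exactly the kind of signature the full descriptor builds on.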

Getting Up and Running With a Tamiya Twin-Motor Gearbox

For my latest no-nonsense, keep-it-simple robot platform I'm using a Tamiya twin-motor gearbox. Tamiya makes a full line of small gearbox kits for different applications. While it might be tempting to dismiss these gearboxes as toys, they're actually fairly capable for their size and an easy, economical way to get a small to medium size wheeled robot project up and running. One of their virtues is that they fit handily with other parts from the educational series. The twin-motor gearbox screws directly to a universal plate without the need for any additional mounting hardware. Tamiya also makes a range of wheels and other parts to fit the 3mm hex shafts.

When you assemble the twin-motor gearbox you have two gear ratio options: 58:1 or 203:1. The rule with gear ratios is that the higher the ratio, the slower the rotation and the greater the torque. Here I need strength more than speed, so I'm using the 203:1 configuration. I'm even considering upgrading to Tamiya's double gearbox, which has a beefier 344:1 gear ratio option - this robot is only going to get heavier.
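The tradeoff is easy to put in numbers. In the sketch below the motor free-run speed is a placeholder figure, not a measured value for the Tamiya motors, but the ratio arithmetic holds regardless:

```python
def output_speed_rpm(motor_rpm, ratio):
    """Higher ratio -> proportionally slower output shaft."""
    return motor_rpm / ratio

def output_torque(motor_torque, ratio, efficiency=1.0):
    """Higher ratio -> proportionally more torque; pass an efficiency
    factor below 1.0 to account for gear-train losses."""
    return motor_torque * ratio * efficiency

motor_rpm = 12000  # assumed free-run speed, for illustration only

fast = output_speed_rpm(motor_rpm, 58)    # ~207 rpm at the wheel
slow = output_speed_rpm(motor_rpm, 203)   # ~59 rpm at the wheel
```

Going from 58:1 to 203:1 makes the output exactly 3.5x slower and, losses aside, 3.5x stronger - which is the whole reason to pick the higher ratio for a heavy robot.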

Setting up a Simple Machine Learning Experiment for the Artificial Neural Network

The basic back propagation algorithm demonstrated in the Arduino Neural Network Tutorial does a pretty effective job of learning to recognize patterns. The training portion of the demo cycles through a series of potential inputs and the desired outputs, and the network converges on a solution fairly quickly. Once solved, you can feed the network any of the input sets from the training data and it will give you the correct outputs. (I should probably say the network "generally" converges on a solution, because with some of the training sets I've been working with, it can still occasionally go into oscillation and never solve the set.)
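The heart of that training loop fits in a few lines. Here is a toy Python version - a single sigmoid neuron learning the two-input AND function by gradient descent. The full tutorial network has hidden layers and momentum; this shows only the core weight-update rule:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# training data: inputs and desired output for AND
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]   # weights
b = 0.0          # bias
rate = 1.0       # learning rate

for epoch in range(10000):
    for (x0, x1), target in data:
        out = sigmoid(w[0] * x0 + w[1] * x1 + b)
        # delta rule: gradient of squared error through the sigmoid
        delta = (target - out) * out * (1.0 - out)
        w[0] += rate * delta * x0
        w[1] += rate * delta * x1
        b += rate * delta

# after training, the neuron rounds to the correct AND outputs
predictions = [round(sigmoid(w[0] * x0 + w[1] * x1 + b))
               for (x0, x1), _ in data]
```

AND is linearly separable, so this toy always converges; the oscillation I mentioned shows up only with harder training sets that need hidden layers.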

I've arrived at a very simple concept for getting started applying the network to a robot machine learning scenario. The test robot has three infra-red (IR) reflective proximity sensors and two bump switches. For the experiment, the robot will use the bump switches to register collisions, and based on those collisions will learn to interpret the proximity sensors and avoid obstacles in the future.

As simple as the experimental setup sounds, right out of the gate the homebrew reflective sensors introduce some interesting nuance. These sensors work by emitting IR light and measuring the intensity of the light reflected back. Unlike ultrasonic sensors and prism-based IR detectors, which give predictable readings that correlate to distance, the readings from simple reflective sensors vary greatly depending on background IR light and on the color and other physical characteristics of the obstacle.
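One common mitigation for the background-light problem - offered here as an illustration, not as the project's exact code - is to sample the sensor with the emitter off, then on, and use the difference as the reading:

```python
def compensated_reading(read_adc, set_emitter):
    """read_adc() returns a raw ADC value; set_emitter(on) toggles
    the IR LED. Subtracting the dark sample removes ambient IR."""
    set_emitter(False)
    ambient = read_adc()      # background IR only
    set_emitter(True)
    lit = read_adc()          # background IR + reflection
    set_emitter(False)
    return max(0, lit - ambient)

# Simulated hardware for demonstration: 300 counts of ambient light,
# with the reflection adding 150 counts when the emitter is on.
_state = {"on": False}
def _set_emitter(on): _state["on"] = on
def _read_adc(): return 300 + (150 if _state["on"] else 0)

reading = compensated_reading(_read_adc, _set_emitter)  # -> 150
```

Ambient subtraction helps with changing light, but it does nothing about the obstacle-color dependence - which is exactly the part the learning experiment is meant to cope with.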

Migrating to the 1284P

For the next steps in the neural network project, I'm upgrading the microcontroller from an Atmel ATmega328P (Arduino Uno) to an ATmega1284P. Although no "official" Arduino boards use the 1284P, unofficial Arduino board files are available for the chip, and it's a pretty simple matter to download them from GitHub and add them to the IDE. You can also get the files from the developer's blog, which includes a fairly comprehensive write-up of the chip and the process of getting it going.

As an Arduino-compatible controller the 1284P is not too shabby. It's available in a hobbyist- and breadboard-friendly 40-pin PDIP package. The process of assembling your own circuit is essentially identical to the 328P - you can use the same 16 MHz clock crystal and 5V power regulator, just mind the pinout. In addition to the extra input/output pins, the 1284P sports 128K of flash memory for programs (4x the Uno) and 16K of SRAM for variables and data (8x the Uno). The 16K of SRAM is absolutely key here, because the neural network is a RAM hog. The 2K on the Uno was barely enough to run a demo, with nothing left over for an actual application.
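To see why RAM goes so fast, it helps to tally what a fully connected network stores. The sketch below uses hypothetical layer sizes (the 7-12-4 example is mine, not the project's actual topology) and assumes Arduino-style 4-byte floats; training also needs delta and change arrays on top of this, which is how 2K gets eaten:

```python
def network_ram_bytes(layer_sizes, bytes_per_float=4):
    """RAM for weights, biases, and activations of a fully
    connected feed-forward network."""
    total = 0
    for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
        total += (n_in * n_out + n_out) * bytes_per_float  # weights + biases
    total += sum(layer_sizes) * bytes_per_float            # activations
    return total

# e.g. a 7-12-4 network: 148 parameters plus 23 activations
usage = network_ram_bytes([7, 12, 4])   # 684 bytes
```

Even this small example commits a third of the Uno's 2K before any training buffers, sensor code, or serial I/O get a byte.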

Back to Basics

Over the past couple of months I've gone through a reset of sorts on the robotics front. After spending quite a while exploring various approaches to walking and other mechanical conundrums, I wanted to start making better progress implementing some of my machine learning ideas. I've been particularly interested in developing an environment where I could do some foundational work applying the Arduino Artificial Neural Network to some real world sensor data and robot control tasks.

To get on track with these experiments I needed a simple but robust robotic platform. It needed to be medium size, large enough to carry a modest payload of batteries, breadboarded circuits, and sensors, yet small enough to explore the rather confined space of my home office. I also wanted it to be reasonably inexpensive, repeatable, and easy to tear down and reconfigure. In short I wanted to get back to basics.

Articles and Tutorials

An Arduino Neural Network
An artificial neural network developed on an Arduino Uno. Includes tutorial and source code.

Raspberry Pi to Arduino SPI Communication
This tutorial presents a basic framework for Raspberry Pi to Arduino communication and control using SPI - the Serial Peripheral Interface bus. Explores SPI in some detail, discusses hardware and software considerations, and develops a working example of a bi-directional communications scheme that could be adapted for any number of command and control applications.

Flexinol and other Nitinol Muscle Wires
With its unique ability to contract on demand, Muscle Wire (or more generically, shape memory actuator wire) presents many intriguing possibilities for robotics. Nitinol actuator wires are able to contract with significant force, and can be useful in many applications where a servo motor or solenoid might be considered.

Flexinol Control Circuit Using PIC 16F690 and ULN2003A
This article describes a microcontroller circuit to drive Flexinol muscle wires using a ULN2003A integrated circuit and a PIC 16F690. The method and circuit presented here provide a simple, low-cost solution with good flexibility well suited to the medium gauge Flexinol wires popular with students and hobbyists.

Precision Flexinol Position Control Using Arduino
An approach to precision control of Flexinol contraction based on controlling the voltage in the circuit. In addition, taking advantage of the fact that the resistance of Flexinol drops predictably as it contracts, the mechanism described here uses the wire itself as a sensor in a feedback control loop.

First Look at the Texas Instruments LaunchPad
Priced at less than you paid for your first soldering iron, the Texas Instruments LaunchPad started a new era in low-cost microcontrollers for hobbyists. Here is my review, written shortly after the release.

LaunchPad MSP430 Assembly Language Tutorial
One of my more widely read tutorials. Uses the Texas Instruments LaunchPad with its included MSP430G2231 processor to introduce MSP430 assembly language programming.

Analog Sensors and the MSP430 Launchpad
An introduction to interfacing analog sensors to MSP430 series microcontrollers. Presents a configuration that will be useful for interfacing many simple sensors that is easy to understand and integrate into your own projects.

Design and Build Your Own Robot
This tutorial provides an overview of some of the most common techniques and parts found in hobby robots with an emphasis on beginner projects.

K'nexabeast - A Theo Jansen Style Octopod Robot
K'nexabeast is an octopod robot built with K'nex. The electronics are built around a PICAXE microcontroller and it uses a leg structure inspired by Theo Jansen's innovative Strandbeests.

K'nexapod - A Hexapod Robot Built With K'nex and PICAXE
K'nexapod is a hexapod robot built with K'nex building toys and controlled by a PICAXE microcontroller. Propelled by small hobby servos, it can move forward, backward, and make turns. It is equipped with an ultrasonic sensor and follows a simple algorithm to wander about while avoiding obstacles in its path.

Robot Obstacle Detection and Avoidance with the Devantech SRF05 Ultrasonic Range Finder
Describes the fundamentals of applying ultrasonic sonar in robot navigation while considering some of the characteristics specific to the Devantech SRF05.

Older PICAXE Articles

Introduction to the PICAXE Microcontroller
An introduction to microcontrollers and an overview of the PICAXE product line - an exciting family of microcontrollers that is simultaneously easy, inexpensive, and powerful.

Setting Up A Differential Drive For Your PICAXE Project
It's easy to set up a differential drive system for your vehicle or robot project using a PICAXE microcontroller and the SN754410 quad half-H driver.

Basic PICAXE Servo Interfacing
Servos are found in a number of hobby applications such as the steering mechanisms of RC (radio control) cars, the flaps of RC planes, and the arms and legs of robotics projects. This quick guide will get you started interfacing servos with PICAXE microcontrollers.

Understanding Variables and Symbols in PICAXE BASIC
A concise overview and tutorial of PICAXE variables and aliases, with tips on managing memory and program readability.

Calculating Right Triangles with PICAXE BASIC
The more recent PICAXE microcontrollers provide new math functions that make trigonometry and related calculations much easier. These functions have applications in many fields of science and technology. In robotics, trig functions are invaluable for tasks such as range finding, triangulation, localization, and mapping.

Adding Memory with i2c EEPROMs
An introduction to interfacing PICAXE processors with the Microchip 24LC16B and 24LC256 i2c EEPROMs.