Inspired by recent fantastic work showing event-based cameras detecting fast-moving objects, I decided to try something similar with one of Centeye's vision chips. Our vision chips aren't event-based, but they do have flexible acuity and direct pixel addressing functions that can provide some similar benefits (a logarithmic response and dynamic allocation of pixel sampling). This little project used a Centeye RockCreek chip, the same one I designed a few years ago to provide visual obstacle avoidance to a Crazyflie nano drone.
The setup was just a Teensy 4.0 and a module I built using a single RockCreek chip, some support components, and a cell-phone lens. All image acquisition and processing were performed on the Teensy in real time. The goal was simple: track and image a fast-moving LED at 1200 Hz. The LED was hung from the ceiling and released to swing about 12 cm from the sensor (which you can see in the video).
Each frame used a three-step process, sketched below. First, enable on-chip binning (using 8×8 super pixels) and acquire a whole image at low resolution. Second, find the brightest super pixel. Third, disable binning and grab just the 8×8 high-resolution block of pixels associated with the bright super pixel. The pixel data was then sent over USB to a PC running MATLAB, where the final video was assembled offline. The Teensy 4's native USB stack was plenty fast to handle the data transfer.
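To make the per-frame loop concrete, here is a minimal Teensyduino-style sketch of those three steps. The `rc_*` functions and the 20×20 super-pixel grid are my own placeholder assumptions (the post doesn't describe the RockCreek register interface or pixel geometry), so the stubs below just mark where the real register writes and ADC reads would go.

```cpp
// Hypothetical sketch of the per-frame loop described above, for the
// Arduino/Teensyduino environment. The rc_* functions stand in for the
// actual RockCreek register writes and ADS8860 reads, and the 20x20
// binned grid is an assumed size, not the chip's real geometry.

#include <Arduino.h>

const int BIN    = 8;   // 8x8 on-chip binning (super pixels)
const int GRID_W = 20;  // assumed super-pixel grid width
const int GRID_H = 20;  // assumed super-pixel grid height

uint16_t binned[GRID_H][GRID_W];  // low-resolution binned frame
uint16_t block[BIN][BIN];         // high-resolution 8x8 block

// Stubs standing in for the real chip interface; they do nothing here
// so the sketch compiles.
void rc_setBinning(bool enable) { (void)enable; }              // would set the binning mode register
void rc_readBinnedFrame(uint16_t img[GRID_H][GRID_W]) { (void)img; }  // would clock out + digitize the binned frame
void rc_readBlock(int row, int col, uint16_t blk[BIN][BIN]) {  // would address and digitize one 8x8 block
  (void)row; (void)col; (void)blk;
}

void setup() {
  Serial.begin(115200);  // baud rate is ignored on Teensy native USB
}

void loop() {
  // Step 1: binning on, acquire the whole image at low resolution
  rc_setBinning(true);
  rc_readBinnedFrame(binned);

  // Step 2: find the brightest super pixel
  int bestR = 0, bestC = 0;
  uint16_t best = 0;
  for (int r = 0; r < GRID_H; r++) {
    for (int c = 0; c < GRID_W; c++) {
      if (binned[r][c] > best) {
        best = binned[r][c];
        bestR = r;
        bestC = c;
      }
    }
  }

  // Step 3: binning off, grab only the 8x8 high-resolution block
  // corresponding to the winning super pixel
  rc_setBinning(false);
  rc_readBlock(bestR * BIN, bestC * BIN, block);

  // Ship the block location and pixels to the PC over native USB
  Serial.write((uint8_t)bestR);
  Serial.write((uint8_t)bestC);
  Serial.write((const uint8_t*)block, sizeof(block));
}
```

One nice property of this scheme is that the expensive full-resolution readout is confined to a single 8×8 block per frame, so the per-frame pixel count stays small and constant no matter where the LED moves.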
Currently we are using just one of the four output channels of the RockCreek chip, with a single 16-bit ADC (ADS8860) performing the conversion. Using all four channels and faster ADCs, we could probably boost the frame rate to 10 kHz.
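As a rough sanity check on that 10 kHz figure, here is a back-of-envelope tally of ADC conversions per frame. The 20×20 binned grid is the same assumption as in the sketch above, and the 1 MSPS rate is the ADS8860's rated maximum; neither number comes from the post itself.

```cpp
// Back-of-envelope throughput check for the 10 kHz estimate. The 20x20
// binned grid is an assumption; 1 MSPS is the ADS8860's rated speed.

#include <cstdio>

int main() {
  const double gridConversions  = 20.0 * 20.0;  // assumed binned frame
  const double blockConversions = 8.0 * 8.0;    // high-res 8x8 block
  const double perFrame = gridConversions + blockConversions;  // 464

  const double atCurrentRate = perFrame * 1200.0;   // ~0.56 MSPS, one ADC
  const double atTargetRate  = perFrame * 10000.0;  // ~4.64 MSPS total
  const double perChannel    = atTargetRate / 4.0;  // ~1.16 MSPS each

  printf("1200 Hz needs %.2f MSPS (one ADS8860 @ 1 MSPS suffices)\n",
         atCurrentRate / 1e6);
  printf("10 kHz needs %.2f MSPS total, %.2f MSPS per channel over four\n",
         atTargetRate / 1e6, perChannel / 1e6);
  return 0;
}
```

Under those assumed numbers, spreading the load across four channels nearly gets there on its own; the remaining per-channel rate sits just above the ADS8860's 1 MSPS ceiling, which is where the faster ADCs come in.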
In the past we’ve used similar techniques for optical beacon tracking and triangulation from a constellation of such beacons. That is a story for another time…