Vision-Based Hover in Place


Centeye has developed and demonstrated an omnidirectional vision system capable of “hover in place”: holding a position in an environment indefinitely. The system as demonstrated uses purely visual information; no IMU or gyro is required. Visual information allows motion in the world to be measured directly, eliminating the drift and offset that affect all IMUs.

The system works by using an array of vision sensors arranged to view the environment in an omnidirectional fashion. The system computes an array of optical flow measurements in different directions all around the helicopter. These optical flow measurements are aggregated to measure deviations from the starting position. Standard control theory techniques may then be used to keep the helicopter in one position by eliminating these deviations.
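The aggregation step described above can be sketched in code. The following is a minimal illustration of the idea, not Centeye's actual algorithm: it assumes eight sensors spaced evenly around the ring, subtracts the mean flow to discard the yaw-rotation component, and projects the residual flow into an (x, y) drift estimate that a simple PD loop then nulls. All function names, gains, and geometry here are our own assumptions.

```python
import math

NUM_SENSORS = 8  # assumed: sensors spaced evenly around the ring

def drift_from_flow(flows):
    """Aggregate per-sensor horizontal optical flow into an (x, y)
    translational drift estimate.

    flows[i] is the flow (pixels/frame) seen by sensor i. A pure yaw
    rotation produces equal flow at every sensor, so subtracting the
    mean leaves only the translation-induced component."""
    mean_flow = sum(flows) / len(flows)
    dx = dy = 0.0
    for i, f in enumerate(flows):
        theta = 2 * math.pi * i / len(flows)  # viewing direction of sensor i
        residual = f - mean_flow              # translation component only
        # Flow seen while looking along theta corresponds to motion
        # perpendicular to that viewing direction.
        dx += residual * -math.sin(theta)
        dy += residual * math.cos(theta)
    return dx / len(flows), dy / len(flows)

class PDAxis:
    """One control axis: integrate flow into displacement, then apply
    command = -Kp*position - Kd*velocity to drive the deviation to zero."""
    def __init__(self, kp, kd):
        self.kp, self.kd, self.pos = kp, kd, 0.0

    def update(self, velocity, dt):
        self.pos += velocity * dt  # accumulate measured drift
        return -self.kp * self.pos - self.kd * velocity
```

With all sensors reporting identical flow (pure yaw), the drift estimate is zero; a sinusoidal flow pattern around the ring, as produced by a lateral translation, yields a nonzero drift vector along the direction of motion.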

Qualitatively, the system is similar to the expensive external “motion capture” systems used by many research institutions to demonstrate precise flight control of quadrotors. However, in our case, the “motion capture” system was shrunk by a factor of many thousands in mass, turned inside out, and mounted on the helicopter.

[Photo: EyeStrip2009 omnidirectional sensor ring]

The photo above shows a version of the system fabricated in 2009. It weighs between 3 g and 5 g, depending on the configuration. This mass includes eight vision sensors, embedded optics, a flexible sensor ring, and processing on a 32-bit microcontroller (an Atmel AT32UC3B). The processor acquires imagery from the vision sensors, computes omnidirectional optical flow, and generates measured perturbations at a frame rate of over 100 Hz, which is adequate for many small helicopter platforms. We have since implemented similar capabilities in less than a gram.

For testing we mounted this system on a modified eFlite Blade mCX toy RC helicopter. This helicopter has a rotor span of 18 cm and an out-of-box mass of about 28 grams, making it ideal for small form-factor demonstrations. Other than adding the sensor ring, our modifications included removing the canopy and replacing the integrated wireless controller board with a board of our own design.

We have tested the system in a variety of environments, flying it in spaces ranging from a 1 m x 2 m room to a 10 m hallway without changing any control constants. We have obtained flights as long as six minutes in duration, limited only by battery life. Using Centeye's logarithmic vision chips, the system works in visual environments ranging from bright sunlight to under one lux, and we have also demonstrated flight in complete darkness using on-board LEDs for illumination. We have also developed methods to integrate human “control stick” input with the hover-in-place system, so that a human may fly the helicopter through a room and then release the control sticks, at which point the sensor ring takes over.
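The hand-off between pilot input and the hover controller can be illustrated with a simple mixing rule. This is a hypothetical sketch of one way to do it; the deadband value and the convention of resetting the hover reference while the pilot is active are our assumptions, not Centeye's published design.

```python
def blend_command(stick, hover_cmd, deadband=0.05):
    """Return (command, pilot_active) for one control axis.

    While the stick is deflected beyond a small deadband, the pilot's
    command passes through unchanged and the hover reference should be
    reset to the current position. Once the stick returns to center,
    the hover controller's command takes over."""
    if abs(stick) > deadband:
        return stick, True   # pilot in control; reset hover reference
    return hover_cmd, False  # stick released; hover controller active
```

In a full system, the `pilot_active` flag would be used to re-zero the accumulated position deviation each frame the pilot is flying, so the helicopter holds whatever position it occupies the moment the sticks are released.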

More recently, taking advantage of advances in wide field-of-view optics, we have reduced the mass of the system to less than a gram, including a 203-milligram camera module and a generic microcontroller. A sample flight is shown below. This configuration is appropriate for true nano-scale air vehicles, as well as higher-volume, low-cost systems such as hobby-scale vehicles and toys.


Our ultimate goal is to develop a system that weighs a fraction of a gram and can both auto-hover a helicopter and avoid obstacles in all lighting conditions, “from daylight to no light”. We are almost there.

This system is ready for integration on a wide variety of robotic helicopter platforms of all sizes and capabilities. Please contact us if you have any questions.