Living organisms are very good at making sense of what they see. Designing machines that can recognize objects seen from an angle or at various distances is challenging, even though facial and gesture recognition is becoming common in our computing devices.
In an attempt to improve on the current state of the art in visual systems, scientists are attempting to reverse-engineer biological visual systems.
Huge advances have recently been made in visualizing the structure of our visual cortex (the hardware), but the inner workings of the neuronal systems (the software) remain a mystery. Mimicking natural selection, scientists are testing thousands of software algorithms at a time.
Using graphics processors from gaming hardware (such as those found in the PlayStation 3 and high-end NVIDIA graphics cards), scientists have discovered better visual modeling systems.
"The best of these models, drawn from thousands of candidates, outperformed a variety of state-of-the-art vision systems across a range of object and face recognition tasks."
"GPUs are a real game-changer for scientific computing. We made a powerful parallel computing system from cheap, readily available off-the-shelf components, delivering over hundred-fold speed-ups relative to conventional methods,"
Engineers are trying to design machines that can "think for themselves" when on surveillance or search and rescue missions. Somehow the machine has to look at its environment and decide what to do.
Have you ever tried to catch a fly? They are pretty good at seeing your hand and knowing just how to escape your grasp.
Can we figure out how a fly is able to see, find food, and escape from our fly swatters? With today's super microscopes, I am sure that we can visualize and model every nerve connection, muscle fiber, and eye facet.
David O’Carroll, a computational neuroscientist who studies insect vision at Australia’s University of Adelaide, has been studying the optical flight circuits of flies, measuring their cell-by-cell activity. In a paper published in Public Library of Science Computational Biology, O’Carroll and fellow University of Adelaide biologist Russell Brinkworth describe an algorithm composed of a series of five equations through which data from cameras can be run. Each equation represents tricks used by fly circuits to handle changing levels of brightness, contrast and motion, and their parameters constantly shift in response to input.
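The paper itself does not list the five equations, but the key idea described above, a cascade of stages whose parameters continually adapt to the input rather than staying fixed, can be sketched in code. The stage below is a hypothetical illustration (the gain-tracking rule and time constant are my assumptions, not the published model): it slowly tracks mean brightness and normalizes each frame against that running estimate, loosely analogous to photoreceptor light adaptation.

```python
import numpy as np

def adaptive_brightness_stage(frame, state, rate=0.05):
    """One illustrative adaptive stage: normalize a frame against a
    slowly updated running estimate of mean brightness.
    NOTE: this is a sketch, not the equations from the PLoS paper."""
    # Exponential moving average of scene brightness (the adapting parameter)
    state["mean"] = (1 - rate) * state["mean"] + rate * float(frame.mean())
    # Divide out the adapted brightness estimate
    return frame / (state["mean"] + 1e-6), state

def run_pipeline(frames):
    """Feed a sequence of frames through the adaptive stage, carrying
    the adaptation state forward in time."""
    state = {"mean": 1.0}
    outputs = []
    for f in frames:
        out, state = adaptive_brightness_stage(f, state)
        outputs.append(out)
    return outputs
```

Because the gain adapts gradually, a sudden jump in scene brightness is only partially compensated at first, then progressively normalized over subsequent frames, which is the kind of behavior a fixed-parameter filter cannot reproduce.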
“It’s amazing work,” said Sean Humbert, who builds miniaturized, autonomous flying robots,
“For traditional navigational sensing, you need lots of payload to do the computation. But the payload on these robots is very small — a gram, a couple of Tic Tacs. You’re not going to stuff dual-core processors into a couple Tic Tacs.”
Source: Secret Math of Fly Eyes Could Overhaul Robot Vision (Wired Science)
Paper: Robust Models for Optic Flow Coding in Natural Scenes Inspired by Insect Biology (PLoS Computational Biology)
"We have constructed a full model for motion processing in the insect visual pathway incorporating known or suspected elements in as much detail as possible. We have found that it is only once all elements are present that the system performs robustly, with reduction or removal of elements dramatically limiting performance. The implementation of this new algorithm could provide a very useful and robust velocity estimator for artificial navigation systems."
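One of the classic "known elements" of insect motion processing referenced by this line of work is the Hassenstein–Reichardt elementary motion detector (EMD): each photoreceptor signal is correlated with a delayed copy of its neighbor's signal, and the difference between the two mirror-image correlations is direction-selective. The sketch below is a minimal, discrete-time version of that correlator; the delay length and input format are my simplifying assumptions, not the full model from the paper.

```python
import numpy as np

def reichardt_emd(left, right, delay=1):
    """Minimal Hassenstein–Reichardt correlator.
    `left` and `right` are 1-D time series from two adjacent
    photoreceptors; the output is positive for left-to-right motion
    and negative for right-to-left motion."""
    left = np.asarray(left, dtype=float)
    right = np.asarray(right, dtype=float)
    # Delayed copies of each channel (zero-padded at the start)
    l_delayed = np.roll(left, delay)
    r_delayed = np.roll(right, delay)
    l_delayed[:delay] = 0.0
    r_delayed[:delay] = 0.0
    # Opponent correlation: delayed-left x right minus delayed-right x left
    return l_delayed * right - r_delayed * left
```

A bright spot crossing the left receptor one time step before the right one makes the first correlation term fire and the second stay silent, so the summed output's sign reports motion direction; this is exactly the kind of cheap, local computation that fits on a robot whose payload is "a couple of Tic Tacs."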