Wednesday, November 10, 2010

A computer may drive your car someday

NeuFlow is a supercomputer that mimics human vision to analyze complex environments, such as a street scene. (Image: Eugenio Culurciello, e-lab)

Navigating our way down the street is something most of us take for granted; we seem to recognize cars, other people, trees, and lampposts instantly and without much thought. In fact, visually interpreting our environment as quickly as we do is an astonishing feat that requires an enormous number of computations, which is only one of the reasons it has proved so difficult to build a machine-driven system that can mimic the human brain's visual recognition of objects.

Eugenio Culurciello of Yale's School of Engineering & Applied Science has now developed a supercomputer based on the human visual system that operates much more quickly and efficiently than ever before. Dubbed NeuFlow, the system takes its inspiration from the mammalian visual system, mimicking its neural network to quickly interpret the world around it. Culurciello presented the results on September 15 at the High Performance Embedded Computing (HPEC) workshop in Boston, Massachusetts.


The system uses complex vision algorithms developed by Yann LeCun at New York University to run large neural networks for synthetic vision applications. One idea Culurciello and LeCun are focusing on is a system that would allow cars to drive themselves. In order to recognize the various objects encountered on the road, such as other cars, people, traffic lights, and sidewalks, not to mention the road itself, NeuFlow processes tens of megapixel images in real time.
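The article does not detail LeCun's algorithms, but the core operation of the convolutional neural networks his group is known for is sliding a small learned filter over an image to produce a feature map. The sketch below is purely illustrative, not NeuFlow's actual code: it uses a hand-picked Sobel edge filter in place of a learned one, with a toy 6x6 image, to show the kind of convolution-plus-nonlinearity step such networks repeat billions of times per second.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation of a grayscale image with a filter."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Dot product of the filter with one image patch.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Stand-in for a learned filter: a Sobel vertical-edge detector.
sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)

# Toy "street scene": a 6x6 image whose right half is bright,
# i.e. a single vertical edge down the middle.
image = np.zeros((6, 6))
image[:, 3:] = 1.0

# Convolution followed by a ReLU nonlinearity gives one feature map.
feature_map = np.maximum(conv2d(image, sobel_x), 0.0)
```

A real network stacks many such filter banks, pooling, and nonlinearities, and a chip like NeuFlow accelerates exactly these dense multiply-accumulate loops in hardware.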


The system is also extremely efficient: it runs more than 100 billion operations per second using only a few watts (less power than a cell phone uses), while desktop computers with multiple graphics processors need more than 300 watts to achieve the same.
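The efficiency gap can be made concrete with back-of-the-envelope arithmetic. The 5-watt figure below is an assumption standing in for the article's "a few watts"; the 100 billion operations per second and the 300-watt desktop figure come from the text.

```python
# Figures from the article, plus one labeled assumption.
neuflow_ops_per_sec = 100e9   # >100 billion operations per second
neuflow_watts = 5.0           # assumption: "a few watts"
desktop_watts = 300.0         # multi-GPU desktop achieving the same throughput

# Operations per joule (ops/sec divided by watts) for each platform.
neuflow_ops_per_joule = neuflow_ops_per_sec / neuflow_watts
desktop_ops_per_joule = neuflow_ops_per_sec / desktop_watts

# How many times more energy-efficient NeuFlow is under this assumption.
efficiency_ratio = neuflow_ops_per_joule / desktop_ops_per_joule
```

Under these numbers NeuFlow delivers roughly 60 times more computation per joule, which is what makes embedding it in a car (or a wallet-sized box) plausible.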


"One of our first prototypes of this system is already capable of beating graphics processors at vision tasks," Culurciello says.


Culurciello embedded the supercomputer on a single chip, making the system much smaller, yet even more powerful and efficient, than full-scale computers. "The complete system is going to be no bigger than a wallet, so it could easily be embedded in cars and other places," Culurciello said.


Beyond autonomous car navigation, the system could be used to improve robot navigation in dangerous or difficult-to-reach places, to provide synthetic 360-degree vision for soldiers in combat situations, or in assisted-living settings, where it could monitor an elderly person's movements and call for help, for example.


Other contributors include Clement Farabet (Yale University and New York University), Berin Martini, Polina Akselrod, Selcuk Talay (Yale University), and Benoit Corda (NYU).
