
A new kind of hybrid sensor aims to give autonomous cars a human-like view

A startup, AEye, has built a new kind of hybrid sensor that aims to give autonomous cars a human-like view of their surroundings. The device combines a solid-state lidar, a low-light camera, and chips running embedded artificial-intelligence algorithms that can reprogram how the hardware is used in real time. That allows the system to prioritize where it looks, giving vehicles a more refined view of the world.
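
AEye has not published the interface behind this feedback loop, so the following is only a minimal Python sketch of the general idea of software-defined scanning: detections from the embedded perception algorithms feed back into the scan plan so that regions of interest are revisited more densely than the background. All names here (ScanRegion, plan_scan, the revisit rates) are hypothetical, not AEye's actual API.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ScanRegion:
    azimuth_deg: Tuple[float, float]    # horizontal extent (min, max), degrees
    elevation_deg: Tuple[float, float]  # vertical extent (min, max), degrees
    revisit_hz: float                   # how often this region is re-scanned

def plan_scan(detections: List[dict], base_region: ScanRegion) -> List[ScanRegion]:
    """Return a coarse background sweep plus dense, high-rate regions
    around each object the perception algorithms flagged."""
    plan = [base_region]
    for det in detections:
        az, el = det["azimuth_deg"], det["elevation_deg"]
        # Revisit a small window around each detection far more often
        # than the background sweep.
        plan.append(ScanRegion(azimuth_deg=(az - 2.0, az + 2.0),
                               elevation_deg=(el - 1.0, el + 1.0),
                               revisit_hz=60.0))
    return plan

# Example: one object detected slightly left of center
background = ScanRegion((-60.0, 60.0), (-10.0, 10.0), revisit_hz=10.0)
for region in plan_scan([{"azimuth_deg": -5.0, "elevation_deg": 0.0}], background):
    print(region)
```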

AEye programs the lidar to emit laser pulses in focused areas rather than in a regular grid. The device uses data from its camera to add color to the raw lidar point cloud, which reduces computation time: ordinarily, lidar data must be sent to a central computer to be fused with data from other sensors. The company claims the device can see as far as 300 meters with an angular resolution as fine as 0.1 degrees.
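
For context on those figures, 0.1 degrees of angular resolution at the claimed 300-meter range corresponds to roughly half a meter between adjacent laser samples; the small-angle calculation below, based only on the numbers quoted above, makes that concrete.

```python
import math

range_m = 300.0        # claimed maximum range
angular_res_deg = 0.1  # claimed angular resolution

# For small angles, the cross-range spacing between adjacent samples
# is approximately range * angle (in radians).
spacing_m = range_m * math.radians(angular_res_deg)
print(f"Sample spacing at {range_m:.0f} m: {spacing_m:.2f} m")  # ~0.52 m
```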

The detailed article from MIT Technology Review can be read here.
