AEye's New Sensor To Facilitate Human-Like Vision In Autonomous Vehicles
AEye, a company founded by Luis Dussan, a veteran of Lockheed Martin, Northrop Grumman, and NASA, is hoping to reach the next stage of vehicle autonomy by building better sensors. The majority of autonomous vehicles currently depend on LiDAR sensors, which use laser beams to take readings of their surroundings. These are limited in two key areas: they're expensive, and they can only emit beams at preset angles.
AEye wants to use solid-state LiDAR sensors, which sweep a laser beam back and forth across a scene. Most current applications of this technology rely on a fixed, regular scanning pattern, but Dussan wants to combine two distinct types of scan instead: low-resolution scans of a wide area, and high-resolution scans of a smaller section, where the priorities can be reprogrammed on the fly. The key innovation, though, is how a camera can be used to direct what the LiDAR focuses on. With high-speed image recognition algorithms running on its chips, the LiDAR can steer its gaze to pay special attention to cars, pedestrians, or whatever else the onboard AI is told to consider important.
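The two-tier scanning idea described above can be illustrated with a small sketch. This is not AEye's actual software; the `Region` type, angle ranges, and step sizes are all hypothetical, chosen only to show how a scheduler might mix a coarse background sweep with dense revisits of camera-flagged regions of interest.

```python
# Illustrative sketch only (not AEye's implementation): a scan planner
# that emits one coarse sweep of the full field of view, then fine
# sweeps over camera-flagged regions, highest priority first.

from dataclasses import dataclass

@dataclass
class Region:
    label: str                 # e.g. "pedestrian", "car" (hypothetical labels)
    azimuth_deg: tuple         # (min, max) horizontal angles to cover
    priority: int              # higher = scanned sooner

def build_scan_plan(regions, fov=(-60.0, 60.0),
                    coarse_step=2.0, fine_step=0.25):
    """Return a list of (azimuth, resolution) scan commands."""
    plan = []
    # Coarse background sweep at low angular resolution.
    az = fov[0]
    while az <= fov[1]:
        plan.append((round(az, 2), "coarse"))
        az += coarse_step
    # Fine sweeps over prioritized regions, most important first.
    for r in sorted(regions, key=lambda r: -r.priority):
        az = r.azimuth_deg[0]
        while az <= r.azimuth_deg[1]:
            plan.append((round(az, 2), "fine"))
            az += fine_step
    return plan

plan = build_scan_plan([
    Region("pedestrian", (10.0, 12.0), priority=2),
    Region("car", (-30.0, -25.0), priority=1),
])
```

Because the region list can be rebuilt every frame from the camera's detections, the fine-scan budget follows whatever the perception stack currently flags as important, which is the "reprogrammed on the fly" behavior the article describes.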
These sensors are adaptable much like the human eye: the hardware can focus on what's most important given the driving conditions at any moment. AEye isn't the only company looking to push the state of sensors forward. Apple has been working on a self-driving project in relative secrecy, with hopes of changing the face of transportation in the future.