Not only lidar: start-ups are focusing on car perception technology

Due to insufficient demand in the driverless car industry, many lidar companies are in trouble; only a few standouts have pulled through on the strength of their technology. This year's trend, moreover, goes beyond lidar: new sensing and imaging methods are emerging that both compete with and complement laser-based approaches.

Lidar pulled ahead of traditional cameras because it can do things they cannot. Now some companies are trying to accomplish the same things with less exotic technology.

These companies are attacking the perception problem from different directions. A good example is Eye Net's V2X (vehicle-to-everything) tracking platform. V2X is one of the technologies frequently mentioned in the context of 5G, whose short range and low latency make possible applications that could genuinely save lives.

Eye Net provides collision warnings to vehicles equipped with its technology, whether or not they also carry cameras or other sensors. Imagine a car driving through a parking lot, unaware that an electric motorcycle is speeding toward it on a collision course, completely hidden behind the parked cars. Eye Net's platform tracks the positions of the devices on both vehicles and issues a warning in time for one or both to brake.
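
Eye Net has not published how its warning logic works, but the general shape of a V2X collision warning is straightforward: each device broadcasts its position and velocity over the low-latency network, and a predicted closest-approach check decides whether to alert. The sketch below is a minimal, hypothetical illustration of that idea in Python; the class, thresholds, and numbers are assumptions, not Eye Net's implementation.

```python
from dataclasses import dataclass
import math

@dataclass
class Track:
    """State broadcast by a V2X device: position (m) and velocity (m/s) in a shared local frame."""
    x: float
    y: float
    vx: float
    vy: float

def time_to_closest_approach(a: Track, b: Track) -> tuple[float, float]:
    """Return (t*, distance at t*) for two constant-velocity tracks."""
    rx, ry = b.x - a.x, b.y - a.y          # relative position
    vx, vy = b.vx - a.vx, b.vy - a.vy      # relative velocity
    v2 = vx * vx + vy * vy
    t_star = 0.0 if v2 == 0 else max(0.0, -(rx * vx + ry * vy) / v2)
    dx, dy = rx + vx * t_star, ry + vy * t_star
    return t_star, math.hypot(dx, dy)

def should_warn(a: Track, b: Track, horizon_s: float = 3.0, radius_m: float = 2.0) -> bool:
    """Warn both parties if they come within radius_m of each other within horizon_s seconds."""
    t_star, d_min = time_to_closest_approach(a, b)
    return t_star <= horizon_s and d_min <= radius_m

# The car in the parking lot and the motorcycle hidden behind the parked row:
car = Track(x=0.0, y=0.0, vx=3.0, vy=0.0)           # rolling forward at ~3 m/s
motorcycle = Track(x=7.5, y=-10.0, vx=0.0, vy=4.0)  # crossing from behind the parked cars
print(should_warn(car, motorcycle))                 # True -> send a brake warning to both
```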

Eye Net is not the only company attempting this, but it hopes that by offering a white-label solution it can get a network established at scale relatively easily, so that at least some protection is in place, which is better than nothing, until every vehicle and device is equipped.

Vision will remain the core of car navigation, and progress is being made on many fronts.

For example, Brightway Vision addresses the limited visibility of ordinary RGB cameras in many real-world conditions by going multi-spectral. In addition to an ordinary visible-light image, its camera is paired with a near-infrared laser emitter that scans the road ahead multiple times per second, at set distance intervals.

The idea is that if the main camera cannot make out the scene 100 feet ahead because of heavy fog, the near-infrared slices can still pick up obstacles or road features when the regular scans sweep that range. The approach combines the advantages of a traditional camera and an infrared camera while avoiding the drawbacks of both. The selling point is that this one camera can do the same job as an ordinary camera, or do it better, and may even replace another sensor.
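
Brightway Vision has not described its processing pipeline, but the range-gating idea above can be illustrated with a toy example: the camera captures one near-infrared "slice" per distance band, and for each pixel the nearest slice with a strong return gives a coarse distance to whatever reflected the laser, even when the full visible-light image is washed out by fog. Everything in this sketch (array shapes, band edges, threshold) is made up for illustration.

```python
import numpy as np

def coarse_depth_from_gated_slices(slices: np.ndarray,
                                   band_edges_m: np.ndarray,
                                   return_threshold: float = 0.2) -> np.ndarray:
    """
    slices:       (n_bands, H, W) near-infrared intensity, one image per gated distance band
                  (e.g. 0-25 m, 25-50 m, ...), nearest band first.
    band_edges_m: (n_bands,) far edge of each band in meters.
    Returns an (H, W) coarse depth map: the far edge of the nearest band whose return
    exceeds the threshold, or NaN where nothing reflected in any band.
    """
    n_bands, h, w = slices.shape
    depth = np.full((h, w), np.nan)
    hit = slices > return_threshold                    # strong return in this band?
    for i in range(n_bands - 1, -1, -1):               # iterate far -> near so nearer bands win
        depth[hit[i]] = band_edges_m[i]
    return depth

# Toy scene: 4 bands out to 100 m, an obstacle reflecting only in the 50-75 m band.
slices = np.zeros((4, 120, 160))
slices[2, 40:80, 60:100] = 0.9
depth = coarse_depth_from_gated_slices(slices, np.array([25.0, 50.0, 75.0, 100.0]))
print(np.nanmin(depth))                                # 75.0 -> something roughly 75 m ahead
```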

Foresight Automotive also uses multi-spectral imagery in its cameras; within a few years, hardly any car camera will be limited to the visible spectrum. Through a partnership with FLIR it has moved into thermal imaging, but what it really sells is something else.

Providing 360-degree coverage usually requires multiple cameras. But the cameras on a compact car and on an SUV from the same manufacturer are not placed identically, let alone those on an automated freight truck. Because these cameras must work together, they need to be precisely calibrated so that each knows exactly where the others are: when they all look at the same tree or a passing bicycle, their views must line up rather than each doing its own thing.

Foresight's advance is to simplify the calibration step, so that manufacturers, designers, or test platforms do not have to laboriously retest and recertify every time a camera needs to move half an inch in one direction. In Foresight's demonstration, the cameras were attached to the car's roof only seconds before driving.
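
Foresight has not said how its quick calibration works, so the following is only a sketch of the kind of chore it removes: with standard computer-vision tools, the relative rotation and translation between two cameras can be estimated automatically from matched features in their overlapping views, instead of being fixed and certified at the factory. The sketch uses OpenCV; the feature-matching setup, intrinsics, and file names are assumptions, not Foresight's method.

```python
import cv2
import numpy as np

def relative_pose_from_matches(img_left, img_right, K):
    """
    Estimate the rotation R and (unit-scale) translation t of the right camera
    relative to the left one from ORB feature matches. K is the 3x3 intrinsic
    matrix, assumed identical for both cameras here for simplicity.
    """
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(img_left, None)
    kp2, des2 = orb.detectAndCompute(img_right, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:500]

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # Essential matrix with RANSAC rejects bad matches, then decompose into R, t.
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t  # t has unit norm; the absolute baseline needs one known distance to fix scale

# Usage sketch (hypothetical file names and intrinsics):
# K = np.array([[900, 0, 640], [0, 900, 360], [0, 0, 1]], dtype=np.float64)
# left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
# right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
# R, t = relative_pose_from_matches(left, right, K)
```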

Foresight has something in common with another startup, Nodar, which also relies on stereo cameras but takes a different approach. Binocular triangulation has been used to derive depth for decades, and nature arrived at a similar arrangement millions of years earlier. The limitation holding the approach back has not been that optical cameras simply cannot provide the depth information autonomous vehicles require, but that they could not be trusted to stay calibrated.

Nodar says its paired stereo cameras do not even need to be mounted to the car's rigid body, which would normally be required to limit jitter and mismatched camera angles. Mounted on the side mirrors, its "Hammerhead" setup, named for the shark it resembles, gains accuracy from the wider gap, or baseline, between the cameras. And since distance is determined from the differences between the two images, there is no need for object recognition or complex machine learning to work out an object's size and distance, as there is with single-camera solutions.
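
The geometry behind Nodar's wide-baseline argument is the standard stereo relation Z = f·B/d, where f is the focal length in pixels, B the baseline between the cameras, and d the disparity in pixels. A wider baseline yields more disparity per meter of depth, so the same one-pixel matching error costs far less accuracy at long range. A small sketch with made-up numbers (not Nodar's specifications):

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole stereo: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

def depth_error(focal_px: float, baseline_m: float, depth_m: float, match_err_px: float = 1.0) -> float:
    """Approximate depth error for a given matching error: dZ ~= Z^2 * err / (f * B)."""
    return depth_m ** 2 * match_err_px / (focal_px * baseline_m)

focal = 1000.0                   # focal length in pixels (assumed)
for baseline in (0.2, 1.4):      # narrow bumper-width vs. mirror-to-mirror baseline (illustrative)
    err = depth_error(focal, baseline, 150.0)
    print(f"baseline {baseline} m: depth error at 150 m is about {err:.1f} m")
# baseline 0.2 m: depth error at 150 m is about 112.5 m
# baseline 1.4 m: depth error at 150 m is about 16.1 m
```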

Brad Rosen, COO and co-founder of Nodar, said: "The industry has shown that camera arrays can perform well in harsh weather conditions, much like the human eye. For example, results published by Daimler engineers show that in severe weather, current stereo approaches provide more stable depth calculations than either monocular methods or lidar. What sets us apart is that the hardware we use is already automotive-grade and available today, giving manufacturers and suppliers more choices."

In fact, lidar's most serious disadvantage is cost: even "cheap" lidar units are often several times the price of an ordinary camera, and costs climb quickly from there. But development among the major players has not stopped.

Sense Photonics has presented a new method that seems to combine the advantages of both worlds: a relatively cheap and simple flash lidar (as opposed to spinning or scanning units, which tend to add complexity) paired with a traditional camera, so that the two see the same version of the scene and can work together to identify objects and determine distances.
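
Sense Photonics has not detailed how it registers the flash lidar with the camera, but the basic fusion step, projecting each lidar return into the camera image so that objects detected in the image can be tagged with a measured distance, can be sketched as follows. The intrinsics, extrinsics, and points below are placeholders, not the company's calibration.

```python
import numpy as np

def project_lidar_to_image(points_lidar: np.ndarray, K: np.ndarray,
                           R: np.ndarray, t: np.ndarray):
    """
    points_lidar: (N, 3) XYZ returns in the lidar frame (meters).
    K:            (3, 3) camera intrinsics.
    R, t:         rotation (3, 3) and translation (3,) taking lidar coords into the camera frame.
    Returns (M, 2) pixel coordinates and (M,) depths for points in front of the camera.
    """
    pts_cam = points_lidar @ R.T + t          # lidar frame -> camera frame
    in_front = pts_cam[:, 2] > 0.1            # keep points ahead of the camera
    pts_cam = pts_cam[in_front]
    uvw = pts_cam @ K.T                       # pinhole projection
    uv = uvw[:, :2] / uvw[:, 2:3]             # divide by depth to get pixel coordinates
    return uv, pts_cam[:, 2]

# Placeholder calibration: lidar and camera co-located and axis-aligned.
K = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros(3)
points = np.array([[2.0, 0.5, 30.0], [-1.0, 0.0, 120.0]])   # two returns at ~30 m and ~120 m
uv, depth = project_lidar_to_image(points, K, R, t)
print(uv.round(1), depth)   # pixel locations a camera-based detector can be matched against
```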

Since emerging in 2019, Sense Photonics has been refining its technology for production. The latest development is custom hardware for long-range imaging: it can image objects up to 200 meters away, a long reach for either a lidar or a traditional camera.

Shauna McIntyre, CEO of Sense Photonics, said: "In the past, we purchased an off-the-shelf detector to pair with our laser source, the sensor's illuminator. We have since completed development of our own detector in-house, with great success, which allows us to build both short-range and long-range automotive products."

"Sense has laid the foundation for a family of similar lidar designs that can be paired with different optics to achieve different fields of view, ranges, and resolutions. We did this with a very simple design that can actually be mass-produced. You can think of our architecture like an SLR camera: there is a 'base camera' that can be fitted with a macro lens, zoom lens, fisheye lens, and so on, to achieve different functions."

All of these companies seem to agree that no single sensing method will dominate the industry from top to bottom. Quite apart from the large gap between the requirements of fully autonomous vehicles (Level 4-5) and driver-assistance systems, the field is moving too fast for any one approach to stay ahead for long.

McIntyre added: "Autonomous driving companies will not succeed if the public does not believe their platforms are safe, and that safety margin only grows with redundant sensors operating at different wavelengths."

Whether it is visible light, near-infrared, thermal imaging, radar, lidar, or a combination of two or three of the technologies seen here, it is clear the market will continue to support this kind of differentiation, even though the lidar industry's boom-and-bust cycle of a few years ago stands as a warning and consolidation will surely come at some point.
