DigiKey-eMag-EdgeAI-Vol 18

Understanding computer and machine vision

Technical advantages:
■ Provides dense, real-time depth maps with minimal computational overhead

provide high sensitivity for detecting weak return signals but introduce additional noise, requiring sophisticated filtering and amplification techniques. Single-photon avalanche diodes (SPADs), on the other hand, enable time-correlated single-photon counting, offering extreme sensitivity and superior performance in low-light conditions. The effectiveness of these receivers directly impacts point cloud density and resolution, which define the granularity of the captured 3D environment. The angular resolution, typically between 0.1 and 1 degree, determines the level of detail in the scan and affects object detection accuracy. Scan rates, ranging from 10 to 100Hz, influence real-time application viability, with higher rates improving responsiveness in dynamic environments. High-end LiDAR systems can generate millions of points per second, allowing for precise environmental
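The relationship between field of view, angular resolution, and scan rate can be sketched with a quick back-of-the-envelope calculation. The function and parameter values below are illustrative, assuming an idealized uniform angular grid over the field of view (real scan patterns are rarely uniform):

```python
def lidar_point_rate(h_fov_deg, v_fov_deg, ang_res_deg, scan_rate_hz):
    """Estimate points per second for a scanning LiDAR.

    Assumes an idealized uniform angular grid: one return per
    angular step in each axis, per scan frame.
    """
    points_per_frame = (h_fov_deg / ang_res_deg) * (v_fov_deg / ang_res_deg)
    return points_per_frame * scan_rate_hz

# Illustrative example: 360° x 30° field of view,
# 0.2° angular resolution, 20 Hz scan rate
rate = lidar_point_rate(360.0, 30.0, 0.2, 20.0)
print(f"{rate:,.0f} points/s")  # several million points per second
```

Even at these modest illustrative settings, the estimate lands in the millions of points per second, consistent with the figures quoted above for high-end systems.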

Figure 6: An illustration of iToF

Figure 7: Basic LiDAR principle visualized


■ Effective in low-light and textureless environments
■ Well-suited for gesture recognition, AR/VR applications, and industrial process monitoring

Challenges:
■ Depth precision decreases with distance due to multi-path interference
■ Susceptible to errors in highly reflective or absorptive surfaces
■ Requires precise calibration of illumination and sensor exposure times to avoid artifacts

LiDAR (Light Detection and Ranging)

LiDAR uses pulsed laser beams to generate high-resolution 3D point clouds of an environment. By measuring the time delay between pulse emission and reception, LiDAR systems construct depth maps with sub-centimeter accuracy. LiDAR can operate in 1D (single-line scanning), 2D (rotating plane scanning), or 3D (solid-state or MEMS-based scanning). LiDAR calculates distance, d, via the following formula:

d = (c × t) / 2


■ MEMS-based scanning: employs micro-electromechanical mirrors to reduce size and cost while maintaining scanning capability
■ Solid-state (Flash LiDAR): illuminates the entire scene at once, eliminating the need for moving parts and enhancing robustness
■ Optical Phased Arrays (OPAs): utilize interference patterns to steer beams electronically, enabling ultra-compact, solid-state implementations without mechanical components

LiDAR systems also rely on advanced receiver technologies to capture returning laser pulses accurately and generate high-fidelity depth maps. For instance, avalanche photodiodes (APDs)

the division by 2 accounts for the round trip. LiDAR systems commonly operate at 905nm or 1,550nm. The 905nm wavelength is cost-effective but has a shorter range and is subject to eye safety limitations. In contrast, 1,550nm LiDAR is safer at higher power emissions and can achieve longer ranges but requires more expensive components such as InGaAs detectors. Various beam steering methods can be used to direct the laser beam for different outcomes or use cases. These include:

■ Mechanical scanning: uses rotating mirrors to direct the laser beam over the field of view, common in traditional automotive and mapping LiDAR

real-time depth estimation, particularly in high-resolution applications
■ Occlusion and edge bleeding effects introduce depth inaccuracies

(SiPMs). This method is well-suited for applications requiring long-range, high-precision depth measurements, such as automotive LiDAR and industrial metrology. Indirect Time-of-Flight (iToF) emits modulated infrared signals and measures the phase shift between emitted and received light using CMOS-based image sensors with demodulation pixels. iToF is commonly used in consumer electronics and AR/VR applications due to its compact design and lower power consumption. ToF sensor performance can be influenced through several system-level design factors to improve precision, coverage, or robustness under varying lighting conditions. This can be achieved by altering modulation frequencies/wavelengths, emitter

Time-of-Flight (ToF) sensors

Time-of-Flight (ToF) depth sensing is based on measuring the time delay or phase shift of emitted infrared (IR) light as it reflects off a target and returns to the sensor. This enables accurate distance measurements, generating real- time depth maps with minimal computational complexity. ToF systems are implemented in two main architectures:

Direct Time-of-Flight (dToF) measures the absolute travel time of individual photons using Single-Photon Avalanche Diodes (SPADs) or Silicon Photomultipliers

technology, optical designs, or controlling calibration and exposure.
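The iToF phase-shift principle lends itself to a short sketch. The conversion d = c · Δφ / (4π · f_mod) follows from the round trip of the modulated signal (hence 4π rather than 2π); the 20 MHz modulation frequency and phase values below are illustrative, not tied to any specific sensor:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def itof_depth_m(phase_shift_rad, mod_freq_hz):
    """Depth from the measured phase shift of a modulated iToF signal.

    d = c * delta_phi / (4 * pi * f_mod); the phase accumulates over
    the round trip, which is why the denominator uses 4*pi.
    """
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

def unambiguous_range_m(mod_freq_hz):
    """Maximum depth before the phase wraps past 2*pi (aliasing)."""
    return C / (2.0 * mod_freq_hz)

# Illustrative values: 20 MHz modulation, quarter-cycle phase shift
print(itof_depth_m(math.pi / 2, 20e6))   # ~1.87 m
print(unambiguous_range_m(20e6))         # ~7.5 m
```

The unambiguous-range function shows why altering the modulation frequency is one of the system-level levers mentioned above: lower frequencies extend the usable range, while higher frequencies improve depth precision within it.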

The choice of technology depends on environmental constraints, computational resources, and the precision required for machine vision applications.
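The dToF round-trip relationship, d = (c × t) / 2, can be sketched in a few lines; the 66.7 ns example value is illustrative:

```python
C = 299_792_458.0  # speed of light in m/s

def dtof_distance_m(round_trip_time_s):
    """Distance from a direct time-of-flight measurement.

    The division by 2 accounts for the pulse travelling to the
    target and back.
    """
    return C * round_trip_time_s / 2.0

# Illustrative example: a pulse returning after 66.7 ns
# corresponds to a target roughly 10 m away
print(dtof_distance_m(66.7e-9))
```

The nanosecond scale of the example also hints at why dToF receivers need picosecond-class timing: at the speed of light, a 1 cm depth error corresponds to only about 67 ps of round-trip timing error.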

Here, c is the speed of light, t is the total time for the laser pulse to travel to the object and back, and

Figure 8: Visualization of how LiDAR works on a vehicle for object detection and recognition.
