How sensor fusion enables AMRs to maneuver around factory floors efficiently
Sensor fusion functions

Mapping the facility is an essential aspect of AMR commissioning, but it's not a one-and-done activity. It's also part of an ongoing process called simultaneous localization and mapping (SLAM), sometimes called synchronized localization and mapping: continuously updating the map of an area to capture any changes while keeping track of the robot's location within it. Sensor fusion is needed to support SLAM and enable the safe operation of AMRs. Not all sensors work equally well under all operating circumstances, and different sensor technologies produce different data types. AI can be used in sensor fusion systems to combine information about the local operating environment (is it hazy or smoky, how humid is it, how bright is the ambient light, etc.) and produce a more meaningful result by combining the outputs of different sensor technologies.

Sensor elements can be categorized by function as well as technology. Examples of sensor fusion functions in AMRs include (Figure 1):

■ Distance sensors like encoders on wheels and inertial measurement units (IMUs) using gyroscopes and accelerometers measure the AMR's movement and determine the range between reference positions.
■ Image sensors like three-dimensional (3D) cameras and 3D LiDAR identify and track nearby objects.
■ Communications links, compute processors, and logistics sensors like barcode scanners and radio frequency identification (RFID) devices link the AMR to facility-wide management systems and integrate information from external sensors into the AMR's sensor fusion system for improved performance.
■ Proximity sensors like laser scanners and two-dimensional (2D) LiDAR detect and track objects near the AMR, including the movement of people.

2D LiDAR, 3D LiDAR, and ultrasonics

2D and 3D LiDAR and ultrasonics are common sensor technologies that support SLAM and safety in AMRs. The differences between these technologies enable one sensor to compensate for the weaknesses of the others, improving performance and reliability. 2D LiDAR uses a single plane of laser illumination to identify objects based on X and Y coordinates. 3D LiDAR uses multiple laser beams to create a highly detailed 3D representation of the surroundings called a point cloud. Both types of LiDAR are relatively immune to ambient light conditions, but they require that objects to be detected have a minimum threshold of reflectivity at the wavelength emitted by the laser. In general, 3D LiDAR can detect low-reflectivity objects more reliably than 2D LiDAR.
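A 2D LiDAR scan is typically delivered as a list of range measurements taken at evenly spaced angles; identifying objects "based on X and Y coordinates" means converting those polar samples into Cartesian points in the sensor's plane. A minimal sketch of that conversion (the scan values and angles below are illustrative example data, not from any specific sensor):

```python
import math

def scan_to_xy(ranges, angle_min, angle_increment):
    """Convert a 2D LiDAR scan (polar ranges) into X/Y points in the sensor frame."""
    points = []
    for i, r in enumerate(ranges):
        angle = angle_min + i * angle_increment
        # Common robotics convention: X points forward, Y points left
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

# Example: three beams at -45, 0, and +45 degrees, each returning 2.0 m
pts = scan_to_xy([2.0, 2.0, 2.0], math.radians(-45), math.radians(45))
```

The resulting point list is what downstream SLAM and obstacle-detection stages consume when matching the scan against the map.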
Figure 1: Examples of common sensor types and related system elements used in AMR sensor fusion designs. (Image source: Qualcomm)
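The distance-sensor pairing above (wheel encoders plus an IMU gyroscope) illustrates why fusion helps: the gyro tracks fast heading changes but drifts over time, while encoder-derived heading is noisier step to step but does not drift. One classic, lightweight way to combine them is a complementary filter. A simplified sketch, where the blend gain `alpha` and the example rates are illustrative assumptions rather than values from the article:

```python
def fuse_heading_step(prev_heading, gyro_rate, dt, encoder_heading, alpha=0.98):
    """One complementary-filter step fusing two heading sources.

    prev_heading:    fused heading from the previous step (rad)
    gyro_rate:       angular rate from the IMU gyroscope (rad/s)
    dt:              time since the last step (s)
    encoder_heading: heading derived from differential wheel encoders (rad)
    alpha:           blend gain; closer to 1.0 trusts the gyro more short-term
    """
    # Short-term: integrate the gyro rate from the previous fused estimate
    gyro_prediction = prev_heading + gyro_rate * dt
    # Long-term: pull toward the drift-free encoder estimate
    return alpha * gyro_prediction + (1.0 - alpha) * encoder_heading

# Example: starting at 0 rad, gyro reports 1.0 rad/s over 0.1 s,
# and the encoders independently estimate the heading is now 0.1 rad
fused = fuse_heading_step(0.0, gyro_rate=1.0, dt=0.1, encoder_heading=0.1)
```

When both sources agree, as here, the fused value matches them; when the gyro drifts, the small encoder weight steadily corrects the estimate without losing responsiveness.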