
How to Optimize Intra Logistics to Streamline and Speed Industry 4.0 Supply Chains

elements. Navigation combines visual SLAM and/or LiDAR SLAM, overlay technologies like 5G TRP, and ML to detect changes in the environment and provide constant location updates. Sensor fusion supports SLAM in several ways:

■ Continuous updates of the spatial and semantic model of the environment based on inputs from various sensors using AI and ML

■ Identification of obstacles, thus enabling path planning algorithms to make the needed adjustments and find the most efficient path through the environment

■ Implementation of the path plan, requiring real-time control to alter the planned path, including the speed and direction of the AMR, as the environment changes
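To make these roles concrete, the following Python sketch shows one cycle of a highly simplified fusion-driven navigation loop: obstacle detections from multiple sensors are merged into an occupancy grid, the planned path is checked against the updated grid, and the commanded speed is reduced near obstacles. The grid size, speed values, and helper names are illustrative assumptions, not part of any particular AMR software stack.

import numpy as np

def update_map(grid, detections):
    """Mark grid cells reported as occupied by any sensor (LiDAR, camera, etc.)."""
    for r, c in detections:
        grid[r, c] = 1
    return grid

def path_blocked(grid, path):
    """Return True if any remaining waypoint now falls on an occupied cell."""
    return any(grid[r, c] == 1 for r, c in path)

def choose_speed(grid, next_cell, caution_radius=3):
    """Reduce speed when an occupied cell lies near the next waypoint."""
    r, c = next_cell
    window = grid[max(0, r - caution_radius):r + caution_radius + 1,
                  max(0, c - caution_radius):c + caution_radius + 1]
    return 0.3 if window.any() else 1.0   # m/s, illustrative values

# One pass through the loop on a toy 50 x 50 grid.
grid = np.zeros((50, 50), dtype=np.uint8)
path = [(10, col) for col in range(5, 45)]           # straight corridor
grid = update_map(grid, [(10, 20), (11, 20)])        # fused obstacle reports
if path_blocked(grid, path):
    path = [(9, col) for col in range(5, 45)]        # stand-in for a replanner
print("commanded speed:", choose_speed(grid, path[0]), "m/s")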

■ Scan-to-scan matching uses sequential LiDAR range data to estimate the position of an AMR between scans. This method provides updated location and pose information for the AMR independent of any existing map and can be useful during map creation. However, it is an incremental algorithm that can be subject to drift over time, with no ability to identify the inaccuracies the drift introduces (see the sketch below).

Safety needs sensor fusion

Safety is a key concern for AGVs and AMRs, and several standards must be considered. Examples include American National Standards Institute/Industrial Truck Standards Development Foundation (ANSI/ITSDF) B56.5-2019, Safety Standard for Driverless, Automatic Guided Industrial Vehicles and Automated Functions of Manned Industrial Vehicles; ANSI/Robotic Industries Association (RIA) R15.08-1-2020, Standard for Industrial Mobile Robots – Safety Requirements; several International Organization for Standardization (ISO) standards; and others.

Safe operation of AGVs and AMRs requires sensor fusion that combines safety-certified 2D LiDAR sensors (sometimes called safety laser scanners) with encoders on the wheels.
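As a minimal illustration of the scan-to-scan matching described in the bullet above, the sketch below aligns each new LiDAR scan with the previous one and chains the incremental motion estimates into a pose. It assumes translation-only motion and beam-by-beam correspondence, which real matchers (for example, ICP variants) do not; the simulated wall, noise level, and beam subsampling are illustrative assumptions. Because each incremental match error is never corrected, the chained estimate wanders away from the true pose.

import numpy as np

rng = np.random.default_rng(0)

def match_scans(prev_scan, curr_scan, n=30):
    """Estimate the sensor translation between two scans (N x 2 point arrays).
    A random subset of beams mimics the imperfect correspondences of a real
    matcher; with a static world, the mean point displacement equals the motion."""
    idx = rng.choice(len(prev_scan), size=n, replace=False)
    return (prev_scan[idx] - curr_scan[idx]).mean(axis=0)

# A fixed wall of points seen from a robot that moves +0.1 m in x per scan.
wall = np.column_stack([np.full(100, 5.0), np.linspace(-2.0, 2.0, 100)])
true_step = np.array([0.1, 0.0])

pose = np.zeros(2)                       # accumulated (x, y) estimate
prev = wall.copy()                       # scan taken at the starting pose
for k in range(1, 51):
    curr = wall - k * true_step + rng.normal(0.0, 0.05, wall.shape)  # noisy scan
    pose += match_scans(prev, curr)      # chain the incremental estimate
    prev = curr

print("estimated pose:", np.round(pose, 3), " true pose:", 50 * true_step)
# The gap between the two is accumulated drift that the algorithm itself
# cannot detect, as noted above.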

Scanners are available and certified for use in Safety Category 3, Performance Level d (PLd), and Safety Integrity Level 2 (SIL2) applications, and are housed in an IP65 enclosure suitable for most outdoor as well as indoor applications (Figure 3). The scanners include an input for incremental encoder information from the wheels to support sensor fusion.

Conclusion

Intra logistics supports faster and more efficient supply chains in Industry 4.0 warehouses and factories. AMRs and AGVs are important tools for intra logistics to move material from place to place in a timely and safe manner. Sensor fusion is necessary to support AMR and AGV functions, including determining pose, calculating SLAM data, improving navigational performance using scan-to-map matching and scan-to-scan matching, and ensuring safety for personnel and objects throughout the facility.

Figure 2: Scan-to-map and scan-to-scan matching algorithms can be used to complement and improve the performance of SLAM systems. The diagram shows an integrated approach in which SLAM (mapping and localization) is combined with scan-to-scan and scan-to-map matching for navigation. (Image source: Aethon)

Figure 3: 2D LiDAR sensors like this can be combined with encoders on the wheels in a sensor fusion system that provides safe operation of AMRs and AGVs. (Image source: Idec)

The 2D LiDAR simultaneously supports two detection distances, can have a 270° sensing angle, and coordinates with the vehicle speed reported by the encoders. When an object is detected in the farther detection zone (up to 20 m away, depending on the sensor), the vehicle can be slowed as needed. If the object enters the closer detection zone in the line of travel, the vehicle stops moving. Safety laser scanners are often used in sets of four, with one placed on each corner of the vehicle. They can operate as a single unit and communicate directly with the safety controller on the vehicle.
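The two-zone behavior just described can be summarized in a few lines of logic. The following sketch is purely illustrative: the zone distances, speed values, and the speed-based field-selection rule are assumptions, and in a real vehicle this behavior is executed by the certified safety laser scanner and safety controller rather than by application code like this.

PROTECTIVE_ZONE_M = 1.5     # stop distance (illustrative)
WARNING_ZONE_M = 5.0        # slow-down distance (illustrative; sensors reach up to ~20 m)

def select_protective_zone(encoder_speed_mps):
    """Faster travel needs a longer stopping distance, so the scanner's
    protective field is typically switched based on encoder-reported speed."""
    return 3.0 if encoder_speed_mps > 1.0 else PROTECTIVE_ZONE_M

def speed_command(nearest_obstacle_m, encoder_speed_mps, cruise_mps=1.5):
    """Return the commanded speed for the measured obstacle distance."""
    if nearest_obstacle_m <= select_protective_zone(encoder_speed_mps):
        return 0.0                       # object in protective zone: stop
    if nearest_obstacle_m <= WARNING_ZONE_M:
        return min(cruise_mps, 0.5)      # object in warning zone: slow down
    return cruise_mps                    # path clear: full speed

for d in (12.0, 4.0, 1.0):
    print(f"obstacle at {d:4.1f} m -> command {speed_command(d, 1.2):.1f} m/s")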

The remaining step in SLAM estimation calls for external sensor readings to refine the initial estimates. This two-step process helps correct small errors that would otherwise compound over time into significant errors.

SLAM depends on the availability of sensor inputs. In some instances, relatively low-cost 2D LiDAR may not work, such as when there are no objects in the direct line of sight of the sensor. In those instances, 3D stereo cameras or 3D LiDAR can improve system performance. However, 3D stereo cameras and 3D LiDAR are more expensive and require more compute power to implement.

Another alternative is to use a navigation system that integrates SLAM with scan-to-map matching and scan-to-scan matching techniques that can be implemented using only 2D LiDAR sensors (Figure 2):

■ Scan-to-map matching uses LiDAR range data to estimate the AMR's position by matching the range measurements to a stored map (see the sketch below). The efficacy of this method relies on the accuracy of the map. It does not experience drift over time, but in repetitive environments it can produce errors that are difficult to identify, cause discontinuous changes in perceived position, and are challenging to eliminate.
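A minimal sketch of scan-to-map matching follows, under simplifying assumptions: a translation-only, brute-force search scores candidate positions by how many LiDAR returns land on occupied cells of a stored occupancy grid, and the best-scoring candidate becomes the position estimate. The grid, resolution, and search window are illustrative; real implementations seed the search from odometry and use more efficient optimizers. Note that along the featureless wall the score changes only slightly from one candidate to the next, which is the kind of repetitive-environment ambiguity mentioned above.

import numpy as np

RES = 0.1                                    # map resolution, meters per cell
stored_map = np.zeros((100, 100), dtype=np.uint8)
stored_map[50, 20:80] = 1                    # a wall at x = 5.0 m, y = 2.0-7.9 m

def score(pose_xy, scan_xy, grid):
    """Count scan points that land on occupied cells when placed at pose_xy."""
    cells = np.rint((scan_xy + pose_xy) / RES).astype(int)
    ok = (cells[:, 0] >= 0) & (cells[:, 0] < grid.shape[0]) & \
         (cells[:, 1] >= 0) & (cells[:, 1] < grid.shape[1])
    cells = cells[ok]
    return int(grid[cells[:, 0], cells[:, 1]].sum())

def scan_to_map(scan_xy, grid, xs, ys):
    """Brute-force search over candidate (x, y) positions for the best score."""
    best_score, best_pose = -1, None
    for x in xs:
        for y in ys:
            s = score(np.array([x, y]), scan_xy, grid)
            if s > best_score:
                best_score, best_pose = s, (round(x, 1), round(y, 1))
    return best_pose, best_score

# Scan of the wall as seen from a robot actually located at (4.6, 3.0) m.
true_pose = np.array([4.6, 3.0])
wall_world = np.column_stack([np.full(60, 5.0), np.arange(2.0, 8.0, RES)])
scan = wall_world - true_pose                # range returns in the robot frame

pose, hits = scan_to_map(scan, stored_map,
                         xs=np.arange(4.0, 5.05, RES), ys=np.arange(2.5, 3.55, RES))
print("estimated position:", pose, " matched points:", hits)   # ~(4.6, 3.0), 60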

When SLAM is not enough

SLAM is a vital tool for efficient AMR navigation, but SLAM alone is insufficient. Like pose algorithms, SLAM is implemented with an extended Kalman filter that provides estimated values. The SLAM estimates extend the pose data, adding linear and rotational velocities and linear accelerations, among others. SLAM estimation is a two-step process; the initial step involves compiling predictions using internal sensor analytics based on the physical laws of motion.
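The predict/correct cycle can be illustrated with a deliberately simplified, one-dimensional Kalman filter, sketched below under stated assumptions: a single position state, a linear motion model driven by noisy internal (odometry-style) data, and a noisy external position reading standing in for a LiDAR-derived fix. A real SLAM EKF estimates pose, velocities, and map features jointly and linearizes nonlinear motion and sensor models, but the two-step structure is the same.

import numpy as np

def predict(x, P, u, q=0.02):
    """Prediction step: propagate the state with the motion model
    (here: new position = old position + odometry-reported motion u)."""
    return x + u, P + q                        # uncertainty grows

def correct(x, P, z, r=0.25):
    """Correction step: blend in an external sensor reading z, weighted
    by the Kalman gain, to refine the prediction."""
    K = P / (P + r)                            # Kalman gain
    return x + K * (z - x), (1 - K) * P        # uncertainty shrinks

rng = np.random.default_rng(1)
x_est, P = 0.0, 1.0
x_true = 0.0
for _ in range(20):
    x_true += 0.1                                                 # robot really moves 0.1 m
    x_est, P = predict(x_est, P, u=0.1 + rng.normal(0.0, 0.05))   # noisy internal motion data
    x_est, P = correct(x_est, P, z=x_true + rng.normal(0.0, 0.1)) # noisy external reading

print(f"true: {x_true:.2f} m  estimated: {x_est:.2f} m  variance: {P:.3f}")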
