DigiKey-emag-Sensors-Vol-7

Explore the world of sensors with our e-magazine, covering smart air quality sensors for environmental monitoring, ultrasonic transducers for object and fluid flow sensing, and angle sensors for power steering, motors, and robotics. Learn how IMUs provide precise location data when GPS falls short. Stay informed on the latest sensor innovations driving smarter technology.

We get technical

Sensors | Volume 7

How to use smart air quality sensors for environmental monitoring

The basics of applying ultrasonic transducers for sensing objects or fluid flow

How to choose and use angle sensors for power steering, motors and robotics

Use IMUs for precise location data when GPS won’t suffice


Get started quickly with 3D time-of-flight applications

How to use smart air quality sensors for environmental monitoring


Understand drone design trade-offs before piling on the sensors

An introduction to pressure sensors

Special feature: RetroElectro
The rise of designators: from DeForest to Western Electric

The basics of applying ultrasonic transducers for sensing objects or fluid flow

How to choose and use angle sensors for power steering, motors and robotics

Using a MEMS sensor for vibration monitoring

Use IMUs for precise location data when GPS won’t suffice

Use a PCR module to rapidly develop accurate, low-power radar-based sensors


Editor’s note

Recent advancements in sensor hardware and algorithms are driving significant transformations across industrial and consumer applications, particularly in automated systems, robotics, and digitalization efforts within industrial environments. The widespread adoption of key sensor technologies across different industries has been instrumental in fostering innovation. These advancements have enabled the miniaturization, ruggedization, and enhanced connectivity of transducers and processing elements.

Moreover, there is a growing trend toward integrated sensing solutions that combine multiple modalities such as angle, speed, and temperature. This integration not only reduces design complexity and footprint but also lowers costs. 3D time-of-flight (ToF) imaging exemplifies this trend by integrating optical design, precision timing circuits, and advanced signal processing capabilities. It is tailored for applications in industrial safety, robotic navigation, and gesture-based controls.

Advancements in sensors now incorporate wireless communication capabilities, real-time data collection, and sophisticated calibration techniques to achieve higher levels of accuracy. These technologies, complemented by machine-learning algorithms, support predictive maintenance in industrial settings and enable adaptive functionalities in drones and other mobile platforms. There is a notable shift toward energy-efficient and energy-harvesting sensors, which are increasingly integrated into new applications and upgraded legacy systems.

In this magazine, we delve into 10 specific sensor technologies, exploring how they enable scalability, enhance design performance, and facilitate the smart engineering functions crucial for today's evolving needs.

Get started quickly with 3D time-of-flight applications

By Stephen Evanczuk
Contributed By DigiKey's North American Editors

3D time-of-flight (ToF) imaging offers an efficient alternative to video imaging for a broad range of applications including industrial safety, robotic navigation, gesture control interfaces, and much more. This approach does, however, require a careful blend of optical design, precision timing circuits, and signal processing capabilities that can often leave developers struggling to implement an effective 3D ToF platform.


Figure 1: Human and robot collaboration includes a broad range of possible levels of interaction. (Image source: SICK)

This article will describe the nuances of ToF technology before showing how two off-the-shelf 3D ToF kits—Analog Devices’ AD-96TOF1-EBZ development platform and ESPROS Photonics’ EPC660 evaluation kit—can help developers quickly prototype 3D ToF applications and gain the experience needed to implement 3D ToF designs that meet their unique requirements.

What is ToF technology?

ToF technology relies on the familiar principle that the distance between an object and some source point can be found by measuring the difference between the time that energy is transmitted by the source and the time that its reflection is received by the source (Figure 1). Although the basic principle remains the same, ToF solutions vary widely and bear the capabilities and limitations inherent in their underlying technologies including ultrasound, light detection and ranging (LiDAR), cameras, and millimeter wave (mmWave) RF signals:

■ Ultrasonic ToF solutions offer a low-cost solution but with limited range and spatial resolution of objects
■ Optical ToF solutions can achieve greater range and spatial resolution than ultrasonic systems but are compromised by heavy fog or smoke
■ Solutions based on mmWave technology are typically more complex and expensive, but they can operate at significant range while providing information about the target object's velocity and heading despite smoke, fog, or rain

Manufacturers take advantage of the capabilities of each technology as needed to meet specific requirements. For example, ultrasonic sensors are well suited for detecting obstructions as robots move across a path or as drivers park their vehicles. In contrast, mmWave technology provides vehicles with the kind of long-distance sensing capability needed to detect approaching road hazards even when other sensors are unable to penetrate heavy weather conditions.

ToF designs can be built around a single transmitter/receiver pair. For example, a simple optical ToF design conceptually requires only an LED to illuminate some region of interest and a photodiode to detect reflections from objects within that region of interest. This seemingly simple design nevertheless requires precise timing and synchronization circuits to measure the delay. In addition, modulation and demodulation circuits may be needed to differentiate the illumination signal from background sources or support more complex continuous wave methods.
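As a worked example of the pulsed ToF principle described earlier (this sketch is illustrative and not from the original article), the round-trip relationship d = c·Δt/2 can be expressed in a few lines of Python:

```python
# Pulsed time-of-flight: distance is half the round-trip path
# traveled at the speed of light during the measured delay.
C = 299_792_458.0  # speed of light, m/s

def tof_distance_m(round_trip_delay_s: float) -> float:
    """Convert a measured round-trip delay into target distance."""
    return C * round_trip_delay_s / 2.0

# A 10 ns round trip corresponds to roughly 1.5 m of range, which
# shows why the timing circuits matter: ~6.7 ps of timing error
# already translates to about 1 mm of distance error.
print(tof_distance_m(10e-9))
```

The tiny delays involved are why even a conceptually simple LED-plus-photodiode design needs precise timing and synchronization circuits.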


Design complexity rises quickly as developers work to enhance the signal-to-noise ratio (SNR) and eliminate artifacts in ToF systems. Further compounding complexity, more advanced detection solutions will employ multiple transmitters and receivers to track multiple objects or support more sophisticated motion tracking algorithms. For example, mmWave systems will often employ multiple receivers to track the heading and velocity of multiple independent objects. (See, "Use Millimeter Wave Radar Kits for Fast Development of Precision Object Detection Designs".)

3D optical ToF systems

3D optical ToF systems extend the idea of using more receivers by using imaging sensors typically based on an array of charge-coupled devices (CCDs). When a set of lenses focuses some region of interest onto the CCD array, each charge storage device in the CCD array is charged by the return illumination reflected from a corresponding point in that region of interest. Synchronized with pulsed or continuous wave illumination, reflected light reaching the CCD array is essentially captured in a sequence of windows or phases, respectively. This data is further processed to create a 3D depth map comprising voxels (VOlume piXELs) whose value represents the distance to the corresponding point in the region of interest. Like frames in a video, individual depth maps can be captured in sequence to provide measurements with temporal resolution limited only by the frame rate of the image capture system and with spatial resolution limited only by the CCD array and optical system. With the availability of larger 320 x 240 CCD imagers, higher resolution 3D optical ToF systems find applications in broadly diverse segments including industrial automation, unmanned aerial vehicles (UAVs), and even gesture interfaces (Figure 2).
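The continuous-wave phase capture mentioned above is commonly implemented by sampling each pixel at four phase offsets 90° apart. The following is a generic sketch of that standard four-phase demodulation (an illustration of the technique, not code from either vendor's kit; the exact sample convention varies by imager):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def cw_tof_depth_m(a0, a90, a180, a270, f_mod_hz):
    """Estimate per-pixel distance from four phase-offset samples of a
    continuous-wave ToF pixel using standard 4-phase demodulation."""
    # Phase shift of the returned light relative to the emitted wave;
    # opposite samples are differenced to cancel ambient offset.
    phi = math.atan2(a90 - a270, a0 - a180) % (2 * math.pi)
    # Distance scales with phase over one modulation period, so the
    # unambiguous range is c / (2 * f_mod): ~7.5 m at 20 MHz.
    return C * phi / (4 * math.pi * f_mod_hz)
```

Differencing opposite-phase samples is what lets these systems reject ambient illumination, one reason 3D ToF remains accurate under changing lighting conditions.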

Unlike most camera-based methods, 3D ToF systems can provide accurate results despite shading or changing lighting conditions. These systems provide their own illumination, typically using lasers or high-power infrared LEDs such as Lumileds' Luxeon IR LEDs able to operate at the megahertz (MHz) switching rates used in these systems. Unlike methods such as stereoscopic cameras, 3D ToF systems provide a compact solution for generating detailed distance information.

Figure 2: With their high frame rate and spatial resolution, 3D optical ToF can provide gesture interface systems with detailed data such as a person's hand being raised toward the ToF camera as shown here. (Image source: ESPROS Photonics)


Figure 3: The ESPROS Photonics epc660 integrates a 320 x 240 pixel imager with a full complement of timing circuits and controllers required to convert raw imager data into depth maps. (Image source: ESPROS Photonics)

Pre-built solutions

To implement 3D ToF systems, however, developers face multiple design challenges. Besides the timing circuits mentioned earlier, these systems depend on a carefully designed signal processing pipeline optimized to rapidly read results from the CCD array for each window or phase measurement, and then complete the processing required to turn that raw data into depth maps. Advanced 3D ToF imagers such as ESPROS Photonics' EPC660-CSP68-007 ToF imager combine a 320 x 240 CCD array with the full complement of timing and signal processing capabilities required to perform 3D ToF measurements and provide 12-bit distance data per pixel (Figure 3).

ESPROS Photonics’ EPC660-007 card-edge connector chip carrier mounts the epc660 imager on a 37.25 x 36.00 millimeter (mm) printed circuit board (pc board) complete with decoupling capacitors and card edge connector. Although this chip carrier addresses the basic hardware interface in a 3D ToF system design, developers are left with the tasks of completing the appropriate optical design on the front end and providing processing resources on the back end. ESPROS Photonics’ epc660 evaluation kit eliminates these tasks by providing a full 3D ToF application development environment that includes a pre-built 3D ToF imaging system and associated software (Figure 4).


Figure 4: The ESPROS Photonics’ epc660 evaluation kit provides a pre-built 3D ToF camera system and associated software for using depth information in applications. (Image source: ESPROS Photonics)

Designed for evaluation and rapid prototyping, the ESPROS kit provides a pre-assembled camera system that combines the epc660 CC chip carrier, optical lens assembly, and a set of eight LEDs. Along with the camera system, a BeagleBone Black processor board with 512 megabytes (Mbytes) of RAM and 4 gigabytes (Gbytes) of flash serves as the host controller and application processing resource. ESPROS also provides epc660 eval kit support software that can be downloaded from its website and opened with a password

that can be requested from the company’s local sales office. After gaining access to the software, developers simply run a graphical user interface (GUI) application with one of several provided configuration files to begin operating the camera system. The GUI application also provides control and display windows for setting additional parameters including spatial and temporal filter settings and finally for viewing the results. With minimal effort developers can use the kit to begin capturing depth maps in real time and use them as input to their own applications software.

Enhanced resolution 3D ToF systems A 320 x 240 imager such as the ESPROS epc660 can serve many applications but may lack the resolution required to detect small movements in gesture interfaces or to distinguish small objects without severely restricting the range of interest. For these applications, the availability of ready-made development kits based on 640 x 480 ToF sensors enables developers to quickly prototype high resolution applications.


Seeed Technology's DepthEye Turbo depth camera integrates a 640 x 480 ToF sensor, four 850 nanometer (nm) vertical-cavity surface-emitting laser (VCSEL) diodes, illumination and sensing operating circuitry, power, and USB interface support in a self-contained cube measuring 57 x 57 x 51 mm. Software support is provided through an open-source libPointCloud SDK GitHub repository with support for Linux, Windows, Mac OS, and Android platforms.

Along with C++ drivers, libraries, and sample code, the libPointCloud SDK distribution includes a Python API for rapid prototyping as well as a visualization tool. After installing the distribution package on their host development platform, developers can connect the camera via USB to their computer and immediately begin using the visualization tool to display phase, amplitude, or point cloud maps, which are essentially enhanced depth maps rendered with texture surfaces to provide a smoother 3D image (Figure 5).

Analog Devices' AD-96TOF1-EBZ 3D ToF evaluation kit provides a more open hardware design built with a pair of boards and designed to use the Raspberry Pi 3 Model B+ or Raspberry Pi 4 as the host controller and local processing resource (Figure 6).

Figure 5: Used in combination with the Seeed Technology DepthEye Turbo depth camera, the associated software package enables developers to easily visualize 3D ToF data in a variety of renderings including point clouds as shown here in the main window pane. (Image source: Seeed Technology/PointCloud.AI)

Figure 6: The Analog Devices AD-96TOF1-EBZ 3D ToF evaluation kit combines a two-board assembly for illumination and data acquisition with a Raspberry Pi board for local processing. (Image source: Analog Devices)


The kit's analog front-end (AFE) board holds the optical assembly, CCD array and buffers, firmware storage, and a processor that manages overall camera operation including illumination timing, sensor synchronization, and depth map generation. The second board holds four 850 nm VCSEL laser diodes and drivers and is designed to connect to the AFE board so that the laser diodes surround the optical assembly as shown in the figure above.

Analog Devices supports the AD-96TOF1-EBZ kit with its open-source 3D ToF software suite featuring the 3D ToF SDK along with sample code and wrappers for C/C++, Python, and MATLAB. To support both host applications and low-level hardware interactions in a networked environment, Analog Devices splits the SDK into a host partition optimized for USB and network connectivity, and a low-level partition running on Embedded Linux and built on top of a Video4Linux2 (V4L2) driver (Figure 7). This network-enabled SDK allows applications running on network-connected hosts to work remotely with a ToF hardware system to access the camera and capture depth data. User programs can also run in the Embedded Linux partition and take full advantage of advanced options available at that level.

Figure 7: The Analog Devices 3D ToF SDK API supports applications running on the local Embedded Linux host and applications running remotely on networked hosts. (Image source: Analog Devices)

As part of the software distribution, Analog Devices provides sample code demonstrating key low-level operational capabilities such as camera initialization, basic frame capture, remote access, and cross-platform capture on a host computer and locally with Embedded Linux. Additional sample applications build on these

basic operations to illustrate the use of captured data in higher level applications such as point cloud generation. In fact, a sample application demonstrates how a deep neural network (DNN) inference model can be used to classify data generated by the camera system. Written in Python, this DNN sample application


(dnn.py) shows each step of the process required to acquire data and prepare its classification by the inference model (Listing 1). Here, the process begins by using OpenCV's DNN methods (cv.dnn.readNetFromCaffe) to read the network and associated weights for an existing inference model. In this case, the model is a Caffe implementation of the Google MobileNet Single Shot Detector (SSD) detection network, known for achieving high accuracy with relatively small model sizes. After loading the class names with the supported class identifiers and class labels, the sample application identifies the available cameras and executes a series of initialization routines (not shown in Listing 1).

The bulk of the sample code deals with preparing the depth map (depth_map) and IR map (ir_map) before combining them (cv.addWeighted) into a single array to enhance accuracy. Finally, the code calls another OpenCV DNN method (cv.dnn.blobFromImage), which converts the combined image into the four-dimensional blob data type required for inference. The next line of code sets the resulting blob as the input to the inference model (net.setInput(blob)). The call to net.forward() invokes the inference model, which returns the classification results.

Listing 1: This snippet from a sample application in the Analog Devices 3D ToF SDK distribution demonstrates the few steps required to acquire depth and IR images and classify them with an inference model. (Code source: Analog Devices)
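Listing 1 appears as an image in the original layout. The preprocessing chain it describes — weighted blending of the depth and IR maps, then conversion to a four-dimensional blob — can be sketched with NumPy alone. The functions below are stand-ins for the cv.addWeighted and cv.dnn.blobFromImage calls, not Analog Devices' actual code, and the weights and normalization constants are illustrative assumptions:

```python
import numpy as np

def combine_depth_ir(depth_map, ir_map, alpha=0.4, beta=0.6):
    """Blend depth and IR frames, mimicking cv.addWeighted:
    out = depth*alpha + ir*beta, clipped back to 8-bit range."""
    out = depth_map.astype(np.float32) * alpha + ir_map.astype(np.float32) * beta
    return np.clip(out, 0, 255).astype(np.uint8)

def to_blob(image_hwc, scalefactor=1.0 / 127.5, mean=127.5):
    """Mimic cv.dnn.blobFromImage: normalize an HxWxC image and
    reshape it to the 1xCxHxW layout inference models expect."""
    normalized = (image_hwc.astype(np.float32) - mean) * scalefactor
    return normalized.transpose(2, 0, 1)[np.newaxis, ...]

# Toy 300x300 3-channel frames standing in for real camera output.
depth = np.random.randint(0, 256, (300, 300, 3), dtype=np.uint8)
ir = np.random.randint(0, 256, (300, 300, 3), dtype=np.uint8)
blob = to_blob(combine_depth_ir(depth, ir))
print(blob.shape)  # (1, 3, 300, 300)
```

In the real sample, the resulting blob would be passed to net.setInput() and net.forward() as described above.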


The remainder of the sample application identifies classification results that exceed a preset threshold and, for each of those, generates a label and bounding box displaying the captured image data, the label identified by the inference model, and its distance from the camera (Figure 8).

Figure 8: Using a few lines of Python code and the OpenCV library, the DNN sample application in Analog Devices’ 3D ToF SDK distribution captures depth images, classifies them, and displays the identified object's label and distance. (Image source: Analog Devices)

As Analog Devices’ DNN sample application demonstrates, developers can use 3D ToF depth maps in combination with machine learning methods to create more sophisticated application features. Although applications that require low latency responses will more likely build these features with C/C++, the basic steps remain the same. Using 3D ToF data and high performance inference models, industrial robotic systems can more safely synchronize their movements with other equipment

or even with humans in "cobot" environments where humans and robots work cooperatively in close proximity. With different inference models, another application can use a high-resolution 3D ToF camera to classify fine movements for a gesture interface. In automotive applications, this same approach can help improve the accuracy of advanced driver- assistance systems (ADAS), taking full advantage of the high temporal and spatial resolution available with 3D ToF systems.


Conclusion

ToF technologies play a key role in nearly any system that depends critically on accurate measurement of the distance between the system and other objects. Among ToF technologies, optical 3D ToF can provide both high spatial resolution and high temporal resolution, enabling finer distinction between smaller objects and more precise monitoring of their relative distance. To take advantage of this technology, however, developers have needed to deal with multiple challenges associated with the optical design, precision timing, and synchronized signal acquisition of these systems. As shown, the availability of pre-built 3D ToF systems, such as Analog Devices’ AD-96TOF1-EBZ development platform and ESPROS Photonics’ EPC660 evaluation kit, removes these barriers to the application of this technology in industrial systems, gesture interfaces, automotive safety systems, and more.


How to use smart air quality sensors for environmental monitoring

By Jeff Shepard Contributed By DigiKey's North American Editors


Environmental monitoring using smart air quality sensors is expanding across applications ranging from smart homes, buildings, and cities to conventional and electric vehicles (EVs) and battery energy storage systems (BESS). In smart homes, buildings, and cities, air quality sensors can help ensure health and safety by monitoring airborne particles and gases associated with poor air quality, as well as providing smoke detection for early fire warnings. In vehicle passenger compartments, these sensors can identify volatile organic compounds (VOCs) and high levels of CO2 that can raise health concerns. In EVs and BESS, they can be used to detect an increase in pressure and high levels of hydrogen in a battery enclosure following the first venting phase of a cell, enabling the battery management system (BMS) to react and prevent a second venting event or thermal runaway of the whole battery system.

The sensors used in these applications need to be compact, low power, and able to support secure boot and secure firmware updates. They often need to include multiple sensors, covering a broad spectrum of air quality monitoring. Integrating this range of functionality in a compact and low-power unit can be a daunting process, prone to restarts, resulting in a high-cost solution and delaying time to market.

To speed time to market and control costs, designers can turn to sensor modules that are factory calibrated, support secure boot and firmware updates, and provide connectivity options, including sending data to the cloud or using a CAN or other bus for local connections. This article begins by comparing optical particulate counters, screen-printed electrochemical, and multi-parameter sensor technologies. It presents air quality sensor solutions and development platforms from Sensirion, Metis Engineering, and Spec Sensors, along with companion devices from Infineon Technologies, and includes suggestions to speed the development process.

Particulate matter (PM) sensors provide counts for specific particle sizes such as PM2.5 and PM10, which correspond to particles with diameters of 2.5 microns and 10 microns, respectively, as well as other particle sizes as needed by the specific application. Optical particle counters (OPCs) are a specific PM technology that moves the air to be measured through a measurement cell that contains a laser and a photodetector (Figure 1). Particles in the air scatter the light from the laser, and the detector measures the scattered light. The measurement is converted into mass concentration in micrograms per cubic meter (μg/m3) and counts the number of particles per cubic centimeter (cm3). Counting particles using an OPC is straightforward but converting that information into

Figure 1: An OPC uses a laser and photodiode to count airborne particles. (Image source: Sensirion)


a mass concentration number is more complex. The software used for the conversion needs to consider the particles’ optical parameters like shape and refractive index. As a result, OPCs can suffer from greater inaccuracy compared with other PM sensing methods such as direct, weight-based, gravimetric technologies.

Not all OPCs are the same. Highly accurate and expensive, laboratory-grade OPCs can count every particle in the measurement cell. Lower-cost commercial-grade OPCs are available that sample only about 5% of the aerosol particles and use software-based estimation techniques to arrive at an overall ‘measurement.’ In particular, the density of large particles like PM10 is typically very low, and they can’t be measured directly by low-cost OPCs. As particle size increases, the number of particles in a given particle mass drops dramatically. Compared with an aerosol of PM1.0 particles, an aerosol with PM8 particles has about 500 times fewer particles for a given mass. To measure larger particles with the same accuracy as small particles, a low-cost OPC has to integrate data over several hours to arrive at an estimate. Fortunately, aerosols have fairly consistent distributions of small and large particles in real-world environments. With properly designed algorithms, it’s possible to accurately estimate the number of larger particles, such as PM4.0 and PM10, using measurements of PM0.5, PM1.0, and PM2.5 particles.

Amperometric gas sensors

Instead of measuring particle counts, amperometric sensors measure gas concentrations. They are electrochemical devices that produce a current linearly proportional to the volumetric fraction of the gas being measured. A basic amperometric sensor consists of two electrodes and an electrolyte. Gas concentration is measured at the sensing electrode, which consists of a catalytic metal that optimizes the reaction of the gas to be measured. The gas reacts with the sensing electrode after entering the sensor through a capillary diffusion barrier. The counter electrode acts as a half-cell and completes the circuit (Figure 2). An external circuit measures the current flow and determines the gas concentration. In some designs, a third ‘reference’ electrode is included to improve the stability and signal-to-noise ratio, and to speed the response time of the basic amperometric sensor.

Figure 2: Amperometric sensors use two electrodes separated by an electrolyte to measure the concentrations of gases. (Image source: Spec Sensor)
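Because the output current is linearly proportional to gas concentration, converting a measured current into a reading is a one-line calculation once the sensor's calibrated sensitivity is known. A minimal sketch follows; the sensitivity and current values are illustrative assumptions, not datasheet figures:

```python
def gas_concentration_ppm(sensor_current_na, sensitivity_na_per_ppm, baseline_na=0.0):
    """Amperometric sensor transfer function: concentration is the
    baseline-corrected output current divided by the sensitivity."""
    return (sensor_current_na - baseline_na) / sensitivity_na_per_ppm

# Hypothetical CO sensor with 4.75 nA/ppm sensitivity reading
# 47.5 nA above its zero-gas baseline -> 10 ppm CO.
print(gas_concentration_ppm(47.5, 4.75))  # 10.0
```

The baseline term matters in practice: the zero-gas offset drifts with temperature and age, which is one reason these sensors are factory calibrated.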


Multi-parameter sensor for battery packs

Monitoring air quality is just the start for sensors designed to protect battery packs in EVs and BESS installations. These sensors monitor pressure, air temperature, humidity, dew point, and absolute water content, in addition to volatile organic compounds (VOCs) such as methane (CH4), ethylene (C2H4), hydrogen (H2), carbon monoxide (CO), and carbon dioxide (CO2). During the first phase of battery venting, the gaseous product of a common lithium-ion battery with a nickel manganese cobalt cathode has a known chemical composition (Figure 3). The hydrogen concentration is critical; if it approaches 4%, hydrogen’s lower explosive limit, there is a possibility of an explosion or fire, and actions should be taken to prevent the cell from going into thermal runaway. The pressure sensor can detect small increases in pressure inside a battery pack caused by venting. False positives can be avoided by cross-checking any increase in pressure with the other sensor measurements.

This multi-parameter sensor also monitors for overcooling. Large battery packs in EVs and BESS often include active cooling to keep the packs from overheating when they are charged or discharged. If they are cooled too much, the internal temperature can drop below the dew point, resulting in condensation inside the pack, potentially shorting the cells and causing thermal runaway. The dew point sensor alerts the BMS before condensation can collect on the battery terminals.

Laser AQ sensor

Designers of heating, ventilation, and air conditioning (HVAC) systems, air purifiers, and similar applications can use Sensirion’s SPS30 PM sensor to monitor air quality indoors or outdoors. SPS sensors measure mass concentrations of PM1.0, PM2.5, PM4, and PM10, as well as PM0.5, PM1.0, PM2.5, PM4, and PM10 particle counts. The SPS30 has a mass concentration precision of ±10%, a mass concentration range of 0 to 1000 μg/m3, and an operational life of over ten years. It includes an I2C interface for short connections and a UART for cables longer than 20 centimeters (cm).

Figure 3: A specific mixture of gases is characteristic of the first phase of battery venting (Image source: Metis Engineering)
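The cross-checking logic described for venting detection can be illustrated with a short sketch. Apart from the 4% hydrogen lower explosive limit cited in the text, every threshold below is an illustrative assumption, not a Metis Engineering specification:

```python
H2_LEL_PCT = 4.0  # hydrogen lower explosive limit from the text

def venting_alarm(h2_pct, pressure_rise_bar, voc_ppb,
                  h2_warn_pct=1.0, pressure_thresh_bar=0.05, voc_thresh_ppb=500):
    """Flag probable first-stage cell venting only when a pressure rise
    is corroborated by gas readings, reducing false positives."""
    gas_evidence = h2_pct >= h2_warn_pct or voc_ppb >= voc_thresh_ppb
    pressure_evidence = pressure_rise_bar >= pressure_thresh_bar
    if h2_pct >= H2_LEL_PCT:
        return "critical"           # explosive concentration reached
    if pressure_evidence and gas_evidence:
        return "venting_suspected"  # corroborated by two measurements
    return "ok"

print(venting_alarm(0.2, 0.01, 50))    # ok
print(venting_alarm(1.5, 0.08, 900))   # venting_suspected
print(venting_alarm(4.5, 0.2, 2000))   # critical
```

Requiring two independent measurements before raising an alarm is exactly the false-positive strategy the article describes: a pressure blip alone is not treated as venting.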


An automatic fan cleaning mode can be triggered at a preset interval to ensure consistent measurements. Fan cleaning accelerates the fan to maximum speed for 10 seconds and blows out accumulated dust. The PM measurement function is offline during fan cleaning. The default cleaning interval is weekly, but other intervals can be set to meet specific application requirements.

Dev kits and secure boot

Figure 4: This dev kit from Sensirion and Infineon can implement secure boot and secure firmware updates. (Image source: DigiKey)

The SEK-SPS30 air quality monitor sensor evaluation board can be used to connect the SPS30 to a PC to start exploring the capabilities of this PM sensor. In addition, DigiKey offers a platform to combine Sensirion’s air quality sensors with Infineon’s PSoC 6 MCUs to develop next-generation intelligent air quality monitoring systems. For smart building systems where privacy is a concern, PSoC 6 supports secure boot and secure firmware updates (Figure 4).

Battery pack sensor

EV and BESS battery pack designers can use the CANBSSGEN1 from Metis Engineering for battery safety monitoring. It’s designed to detect early failures due to cell venting. This CAN bus-based sensor includes a replaceable air filter and is especially useful in EVs (Figure 5). An optional accelerometer can monitor for shocks up to 24G and impact duration, enabling the system to identify when the battery pack has been exposed to shocks above safe levels. It can measure:

■ 0.2 to 5.5 bar absolute pressure
■ -30°C to +120°C air temperatures
■ VOCs, equivalent CO2 (eCO2), and H2 in parts per billion (ppb)
■ Absolute humidity in milligrams of water vapor per cubic meter (mg/m3)
■ Dew point temperature

CAN sensor dev kit

The DEVKGEN1V1 development kit helps to shorten the system integration time when using Metis CAN sensors. The sensors include a configurable CAN bus speed and address, along with a DBC CAN database that supports integration into almost any vehicle with a CAN bus. The basic dev kit can be expanded, enabling developers to add more sensors to the CAN network.

Figure 5: This battery safety monitor sensor includes a replaceable air filter (center white circle). (Image source: Metis Engineering)
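Integrating a CAN-based sensor like this typically comes down to decoding its periodic frames according to the vendor's DBC file. The sketch below unpacks a purely hypothetical 8-byte payload (pressure, temperature, H2); the real CANBSSGEN1 message layout comes from Metis Engineering's DBC database, not from this example:

```python
import struct

def decode_battery_safety_frame(data: bytes):
    """Unpack a hypothetical 8-byte little-endian frame:
    uint16 pressure (millibar), int16 temperature (0.1 degC),
    uint32 H2 concentration (ppb)."""
    pressure_mbar, temp_decidegc, h2_ppb = struct.unpack("<HhI", data)
    return {
        "pressure_bar": pressure_mbar / 1000.0,
        "temperature_c": temp_decidegc / 10.0,
        "h2_ppb": h2_ppb,
    }

# Example frame: 1013 mbar, 25.3 degC, 1200 ppb H2.
frame = struct.pack("<HhI", 1013, 253, 1200)
print(decode_battery_safety_frame(frame))
```

Fixed-point scaling (millibar, tenths of a degree) keeps the payload within CAN's 8-byte classic frame limit while preserving useful resolution.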


Indoor air quality sensor

Designers of indoor and vehicle in-cabin air quality monitoring systems can use the 110-801 from SPEC Sensors. The 110-801 is a screen-printed amperometric gas sensor that can detect a broad array of gases associated with poor air quality, including alcohols, ammonia, carbon monoxide, various odorous gases, and sulfides. The response of these sensors is linearly proportional to the volumetric fraction of the gas being measured, which simplifies system integration (Figure 6). Other features of this 20 x 20 x 3 mm sensor include:

■ Parts per million (ppm) sensitivity
■ Less than ten microwatts (μW) sensor power
■ -10°C to +40°C operating temperature range (0°C to +40°C continuous operation)
■ Robust and stable operation in the presence of a wide range of contaminants

Amperometric gas sensor integration

A potentiostat circuit controls the working electrode’s potential in an amperometric gas sensor and converts the electrode current to an output voltage (Figure 7). The voltage at pin 2 of the operational amplifier (op amp) U1 sets the reference electrode voltage, and the working electrode’s potential is set by pin 6 of op amp U2. Op amp U2 also converts the current output from the sensor to a voltage signal. At the same time, op amp U1 supplies current to the counter electrode that is equal to the working electrode current.

Figure 7: Simplified potentiostat circuit used to implement gas detection using an amperometric sensor. (Image source: Spec Sensors)
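The signal chain just described — working-electrode current through U2's current-to-voltage stage, then back out to a gas concentration — can be sketched numerically. The feedback resistor and sensitivity values below are illustrative assumptions, not SPEC Sensors figures:

```python
def potentiostat_output_v(i_we_na, r_feedback_ohm=100_000):
    """U2 acting as a transimpedance stage: V_out = I_we * R_f."""
    return (i_we_na * 1e-9) * r_feedback_ohm

def ppm_from_output(v_out, r_feedback_ohm=100_000, sensitivity_na_per_ppm=5.0):
    """Invert the chain: recover the electrode current from the output
    voltage, then divide by sensitivity to get gas concentration."""
    i_we_na = v_out / r_feedback_ohm * 1e9
    return i_we_na / sensitivity_na_per_ppm

v = potentiostat_output_v(50.0)  # 50 nA working-electrode current
print(v)                         # ~5 mV at the assumed 100 kOhm R_f
print(ppm_from_output(v))        # ~10 ppm at the assumed sensitivity
```

The feedback resistor sets the gain trade-off: a larger R_f gives more volts per ppm but saturates the op amp output at a lower concentration.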

Summary

As shown, designers have a range of air quality sensor technologies to choose from when designing environmental monitoring systems. OPCs can be used to monitor for potentially dangerous particulate levels indoors and outdoors. CAN-based, multi-sensor systems can monitor for first-stage venting in EV and BESS battery packs and help to prevent thermal runaway and possible fires or explosions. Low-power, screen-printed amperometric gas sensors can be used to detect a broad array of gases that cause poor air quality.

Figure 6: This screen-printed amperometric gas sensor can measure the presence of a variety of gases. (Image Source: Spec Sensors)
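Because the sensor response is linear, converting the potentiostat's output voltage to a gas concentration is a simple scaling. The sketch below assumes hypothetical values for sensitivity, transimpedance gain, and zero offset; they are placeholders, not figures from the 110-801 datasheet.

```python
# Convert a potentiostat output voltage to a gas concentration for an
# amperometric sensor with a linear response. All constants below are
# illustrative assumptions, not 110-801 datasheet values.

SENSITIVITY_NA_PER_PPM = 4.75   # sensor current per ppm of gas (assumed)
R_FEEDBACK_OHMS = 100_000       # transimpedance gain of op amp U2 (assumed)
V_ZERO = 1.65                   # output voltage at zero concentration (assumed)

def voltage_to_ppm(v_out: float) -> float:
    """Linear conversion: concentration = sensor current / sensitivity."""
    i_sensor_a = (v_out - V_ZERO) / R_FEEDBACK_OHMS   # working-electrode current, A
    i_sensor_na = i_sensor_a * 1e9                    # amperes -> nanoamperes
    return i_sensor_na / SENSITIVITY_NA_PER_PPM

if __name__ == "__main__":
    # 0.095 V above the zero offset corresponds to 200 ppm with these constants
    print(round(voltage_to_ppm(1.745), 1))
```

In a real design, the sensitivity is read from the individual sensor's calibration label, since amperometric sensors vary part to part.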


Understand drone design trade-offs before piling on the sensors

By Steve Taranovich, contributed by DigiKey's North American Editors

Drones are increasingly finding use in many applications, including as part of a first responder’s tool kit at the scene of an emergency or disaster. For example, during the fire at the Notre-Dame Cathedral in Paris they were used to initially report the size, heat, and extent of the active fire. They were also outfitted with thermal imaging capability to search for people still inside. Later, they were used to assess the damage. Clearly, this kind of application presents challenges in trying to see through complex conditions such as smoke and flames with adequate resolution. As enticing as it may be to add more sensors to a drone to address these challenges, designers need to remain aware that drones are battery powered and, in many cases, cost sensitive. As a result, designers need to perform a delicate balancing act between functionality, cost, and size, weight, and power (SWaP). Finding this balance is the primary objective when considering the addition of sensors and imaging equipment to a drone design. This article discusses the architectural trade-offs designers need to consider when adding sensors to a drone. In doing so, particular attention is paid to the power supply, which will likely have magnetics that can add excess weight and take up precious space. In addition, suitable power supply and sensor solutions

are introduced from vendors that include Texas Instruments, Efficient Power Conversion, Analog Devices, Bosch Sensortec, STMicroelectronics, and SparkFun Electronics.

Drone architectural design considerations

The power supply: Once the designer knows the key areas on which to focus for optimum drone performance, they can look at ways to minimize its physical size and weight, beginning with creating the most efficient power supply possible. This minimizes the overall power supply size and weight, and so leads to a smaller, lighter drone. Being battery operated, a drone with a more efficient power supply can operate with a smaller, lighter battery. A typical choice for a drone battery is a rechargeable lithium battery (Li-ion or Li-Po type), especially if the designer plans to recharge the battery when landing or hovering over a wireless charger, or upon landing with an external charger. Designers can also use a standard non-rechargeable battery as the power source and replace it once it is discharged. When choosing a DC/DC converter, designers will need to use a wide input device due to the high voltage pulse of back EMF



(BEMF) from the rotor motors. Under motor deceleration, this BEMF appears at the DC/DC converter’s input because the converter comes after the separate DC/DC conversion powering the rotor motors. The Texas Instruments LM5161 DC/DC power converter IC is a good choice for a drone power supply because, when programmed for discontinuous conduction mode (DCM) operation, it provides a tightly regulated buck output without any additional external feedback ripple injection circuit. It also has integrated high-side and low-side MOSFETs, which save board space. For added reliability, the LM5161 has peak and valley current limit circuits that protect against overload conditions. As an added precaution, an undervoltage lockout (UVLO) circuit provides independently adjustable input undervoltage threshold and hysteresis. There will most likely be many sensors aboard a drone, along with an associated sensor fusion IC, the main processor, and propeller motors. These require a good battery control system. Designers may opt for gallium nitride (GaN) power transistors wherever their chosen power supply architecture normally uses a power transistor. GaN helps achieve optimum efficiency with minimum size and footprint.
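To see why converter efficiency matters so much, a back-of-envelope flight-time estimate can be made from the power budget. Every number below is an illustrative assumption for a small drone, not a measurement of any particular airframe.

```python
# Rough flight-time estimate from a drone power budget. All numbers are
# illustrative assumptions for a small drone.

BATTERY_CAPACITY_WH = 3.7 * 5.0      # one 3.7 V Li-Po cell, 5000 mAh -> 18.5 Wh
CONVERTER_EFFICIENCY = 0.90          # assumed overall DC/DC efficiency

loads_w = {
    "rotor_motors": 60.0,            # the dominant load (assumed)
    "main_processor": 2.5,
    "sensors_and_fusion": 0.5,
}

total_load_w = sum(loads_w.values())
input_power_w = total_load_w / CONVERTER_EFFICIENCY   # battery must supply losses too
flight_time_min = BATTERY_CAPACITY_WH / input_power_w * 60

print(f"{total_load_w:.1f} W load -> {flight_time_min:.1f} min of flight")
```

Re-running the estimate with a higher efficiency shows the payoff directly: every percentage point of converter efficiency buys flight time, or lets the designer shrink the battery for the same mission.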

Wireless power, recharging while hovering (see References 1, 2, and 3): This is desirable because when a drone lands, powers down to recharge, and takes off again, the startup and liftoff of the rotor motors draw a great deal of power from the battery. Efficient Power Conversion is one of many companies researching wireless charging while hovering. An option for the power supply could be a wireless charging architecture based on a GaN FET, such as Efficient Power Conversion’s EPC2019. GaN-based FETs allow switching at 13.56 megahertz (MHz), a switching frequency difficult to reach with ordinary silicon FETs. This high switching frequency also minimizes the size and weight of power supply magnetics. In addition, GaN transistors are five to ten times smaller than silicon devices yet can handle the same power levels. With this type of power supply, drones do not have to land; they can instead hover over a wireless charging base.
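For a resonant link at 13.56 MHz, the tank component values follow from the familiar resonance relation f = 1/(2π√(LC)). The capacitance below is an assumed example for illustration, not a value taken from any EPC design.

```python
# Size the resonant tank coil for a 13.56 MHz wireless-power link using
# f = 1 / (2*pi*sqrt(L*C)). The tank capacitance is an assumed example.
import math

F_RES_HZ = 13.56e6        # HF wireless-power operating frequency
C_TANK_F = 100e-12        # assumed tank capacitance, 100 pF

# Solve the resonance equation for L:
l_tank_h = 1 / ((2 * math.pi * F_RES_HZ) ** 2 * C_TANK_F)
print(f"Required coil inductance: {l_tank_h * 1e6:.2f} uH")  # ~1.38 uH
```

The small inductance that falls out of the arithmetic is the point: at 13.56 MHz the coil and any magnetics stay physically tiny, which is exactly the size and weight advantage the article attributes to GaN's high switching frequency.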

Designers will find that there are a great many evaluation/development boards to speed time to market with wireless power. In the case of the EPC2019 GaN FET, Efficient Power Conversion supports it with the EPC9513 wireless power receiver development board for use inside the drone. This development board is important to designers because it is based on the AirFuel standard, which ensures a certified wireless design that is interoperable with other wireless charging products globally. Designers can request the Gerber files for the demo board from the supplier to recreate the board’s optimized layout.


Figure 1: Designers can create reliable, efficient drone power with the addition of this 2 A solar-powered battery charger where thermistor RNTC has been added to compensate for a solar cell (like the PT15-75) temperature coefficient at maximum power levels. (Image source: Analog Devices)

Solar power: Another power option is to use solar energy to charge a drone battery. For this purpose, the PT15-75 solar cell from PowerFilm Inc. is a good option. The PT15-75 can be used in conjunction with an Analog Devices LT3652 battery charger IC to implement a clever, compact battery charger design (Figure 1). Remember, there really is no situation in which open-circuit voltage (Voc) is output when the panel is attached to a load and providing current. The LT3652 input regulation loop also has the capability to find the maximum power operating point of the solar panel, which optimizes the efficiency of conversion from the sun’s power to supply maximum output power to the battery.

Sensors: Sensors will increase both the controllability of drones and their usefulness. With regard to controlling the drone, a sensor can enable an auto-level mode, a constant-altitude mode, or an orbit mode for circling around a specific object or point of interest. All of these added features rely on higher performance inertial measurement units (IMUs) and barometric pressure sensors to achieve an optimal user experience, as well as improved reliability for special purpose or commercial drones.

Designers may need to increase drone performance, which may require a gyroscope with extremely low output signal drift to ensure drone orientation, position, and balance, especially under changing temperature conditions. This can be achieved using a Bosch Sensortec BMI160 accelerometer and gyroscope combination that comes as a small, low-power six-axis IMU supporting nine-axis sensor data fusion when paired with an external magnetometer. It measures 2.5 x 3.0 millimeters (mm) with a height of 0.83 mm and consumes only 925 microamps (µA), even when the gyroscope and accelerometer are in full operating mode. It operates from a 1.71 volt to 3.6 volt supply.
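As an illustration of the kind of fusion an IMU such as the BMI160 feeds, a minimal complementary filter can blend gyroscope and accelerometer data into a pitch estimate. All sample readings and coefficients below are made-up example values, not a flight-controller implementation.

```python
# Minimal complementary filter fusing accelerometer and gyroscope data
# into a pitch-angle estimate. Readings and coefficients are illustrative.
import math

ALPHA = 0.98      # weighting: trust the gyro short-term, the accel long-term
DT = 0.01         # 100 Hz sample period, in seconds

def update_pitch(pitch_deg, gyro_y_dps, accel_x_g, accel_z_g):
    """Blend the integrated gyro rate with the accelerometer's gravity angle."""
    accel_pitch_deg = math.degrees(math.atan2(accel_x_g, accel_z_g))
    gyro_pitch_deg = pitch_deg + gyro_y_dps * DT   # integrate angular rate
    return ALPHA * gyro_pitch_deg + (1 - ALPHA) * accel_pitch_deg

# Feed one second of steady readings from a drone tilted 10 degrees:
pitch = 0.0
for _ in range(100):
    pitch = update_pitch(pitch, 0.0,
                         math.sin(math.radians(10)),   # accel x, in g
                         math.cos(math.radians(10)))   # accel z, in g
print(f"Converging pitch estimate: {pitch:.1f} deg")
```

The accelerometer term slowly pulls the estimate toward the true 10° tilt while the gyro term suppresses vibration noise; this drift correction is why low gyro offset drift over temperature matters so much in the part selection above.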



To complement the BMI160, a digital barometric pressure sensor with a temperature sensor will help measure vertical velocity, enhance GPS navigation, and determine a drone’s altitude. It is recommended that barometers occasionally be calibrated against sea-level pressure to stay accurate. Bosch Sensortec’s BMP388 barometric pressure and temperature sensor is a good example of an IC that designers can integrate into their architecture. With a small footprint of 2 x 2 mm at 0.88 mm high, and a low power consumption of just 3.4 µA at 1 hertz (Hz), this sensor module is well suited for battery operation. The device has a typical relative accuracy of ±8 Pa and a typical absolute accuracy of ±50 Pa, which will improve drone hovering and obstacle avoidance capabilities.

To detect motion along multiple axes, STMicroelectronics’ ISM330DLCTR iNEMO IMU system-in-package (SiP) module combines a three-axis accelerometer and a three-axis gyroscope in a monolithic six-axis IC. This kind of configuration enables a drone to maintain horizontal, vertical, and rotational stability while hovering. For applications like professional-grade drone photography and 3D imagery, six-axis gyro stabilization is necessary and is provided by the ISM330DLCTR.
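The pressure-to-altitude relationship behind this hovering accuracy can be sketched with the standard international barometric formula. In practice the sea-level reference pressure must be calibrated; the constant below is simply the standard-atmosphere default.

```python
# Convert a barometric-pressure reading to altitude using the standard
# international barometric formula (valid in the troposphere).
P_SEA_LEVEL_PA = 101_325.0   # standard-atmosphere default; calibrate in practice

def pressure_to_altitude_m(pressure_pa: float) -> float:
    """Altitude above the reference pressure level, in meters."""
    return 44_330.0 * (1.0 - (pressure_pa / P_SEA_LEVEL_PA) ** (1.0 / 5.255))

# 100,000 Pa reads as roughly 111 m above sea level:
print(round(pressure_to_altitude_m(100_000.0), 1))
# A sensor error of 50 Pa shifts the altitude estimate by roughly 4 m:
print(round(pressure_to_altitude_m(100_000.0)
            - pressure_to_altitude_m(100_050.0), 1))
```

The second print makes the datasheet numbers concrete: the BMP388's ±50 Pa absolute accuracy corresponds to a few meters of absolute altitude, while its ±8 Pa relative accuracy corresponds to sub-meter changes, which is what matters for hovering.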

The gyroscope measures and maintains drone orientation. By integrating three accelerometers, each oriented along a different axis, the degree of motion of a drone along any axis can be determined. This better enables the collection of information regarding the drone’s roll, pitch, and yaw, which is then fed back to the drone’s proportional-integral-derivative (PID) controller. The magnetometer measures the strength and direction of the Earth’s magnetic field so the drone can correct its trajectory. Be sure the magnetometer is calibrated frequently; power lines, motors, and other strong fields emitted from electrical devices can affect it. Drone movement caused by external forces, like a strong gust of wind, is detected by the accelerometer and relayed to the PID controller, which in turn adjusts the motors to compensate.

Rangefinders: landing, hovering, and distance from an object

Drones need good sensors to land safely, hover when wirelessly charging, and sense objects to avoid collisions when in motion. This ranging can be performed using sound or light.
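The feedback loop described above can be sketched as a textbook PID controller. The gains and the disturbance value below are illustrative, not tuned parameters for any real airframe.

```python
# Textbook PID loop of the kind a flight controller runs per axis to hold
# attitude against disturbances such as wind gusts. Gains are illustrative.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt                    # accumulate offset
        derivative = (error - self.prev_error) / self.dt    # react to rate of change
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Hold a level roll angle (0 deg) after a 5 deg wind-induced disturbance:
pid = PID(kp=1.2, ki=0.1, kd=0.05, dt=0.01)
correction = pid.update(setpoint=0.0, measurement=5.0)
print(f"Motor correction command: {correction:.2f}")
```

The proportional term reacts to the current tilt, the integral term removes steady-state offset (a constant crosswind), and the derivative term damps the response; the controller's sign-flipped output is what drives the motor mixer to level the drone.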

Ultrasonic rangefinder sensing: Drone landing, hovering, and ground tracking capabilities can be provided using ultrasonic sensors. When a drone is in the process of landing, it needs to detect the distance from the bottom of the drone to the area in which it is landing. Although GPS and a barometer are part of this control function, accurate distance sensing is the key to a safe landing. Ultrasonic sensors can also assist in safe hovering and ground tracking, which may require the drone to fly at a fixed height. One such distance ranging sensor for landing assistance, hovering, and ceiling detection is MaxBotix’s MB1010-000 ultrasonic time-of-flight (ToF) ranging sensor board.

Understanding ToF

All of these cases need to use the ToF method, which is the time taken for an emitted ultrasonic wave to reach a target, plus the time for the reflected signal to travel back to the drone’s sensor (Figures 2 and 3). To calculate the distance from the drone to any object, use the equation:

distance = (speed of sound × round-trip ToF) / 2

The division by two accounts for the wave traveling to the target and back.
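A minimal sketch of this calculation follows, assuming dry air and an illustrative echo time; note that the speed of sound varies with air temperature, which matters for landing accuracy.

```python
# Ultrasonic ToF distance: sound travels to the target and back, so the
# one-way distance is half the round trip.
def tof_to_distance_m(round_trip_s: float, temp_c: float = 20.0) -> float:
    speed_of_sound = 331.3 + 0.606 * temp_c   # m/s, dry-air approximation
    return speed_of_sound * round_trip_s / 2.0

# A 5.83 ms echo at 20 degrees C corresponds to about 1 m of ground clearance:
print(round(tof_to_distance_m(5.83e-3), 2))
```

Feeding the temperature from an onboard sensor (such as the barometric module's temperature output) into the conversion removes a distance error of roughly 0.18% per degree Celsius.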


Texas Instruments offers the PGA460PSM-EVM ultrasonic proximity sensing evaluation module, which will shorten design time.

LiDAR range sensing: Another means of distance sensing is light detection and ranging (LiDAR) with pulsed lasers. The information gained from ToF LiDAR systems may be used to create a three-dimensional image. LiDAR technology allows for high accuracy and resolution, and a large coverage area. Designers can select an optical laser distance ranging sensor such as the SparkFun Electronics SEN-14032, a laser-based optical ranging sensor with a range of 40 meters (m). An external microcontroller is needed to interface with the sensor via I2C. There are two primary architectures used for this kind of LiDAR: solid-state LiDAR and motorized 360˚ field-of-view rotating LiDAR. Both use the same principle, with a laser sending out a beam of light. In the solid-state case a mirror is used to scan, while the rotating architecture uses a spinning disk driven by a motor.

Figure 2: Designers will need to understand the concepts of ToF during a drone landing, hovering, or wireless charging. (Image source: Texas Instruments)

Figure 3: The three phases of ultrasonic ToF. Initial transmitted sound (1), silence (2), and received echo (3) for accurate range finding in their drone designs. An understanding of this graphic, coupled with the evaluation board and sensors discussed in this article, can help designers meet the goals of flight stability, collision avoidance, and optimum wireless charging when implementing the hardware suggestions in this section. (Image source: Texas Instruments)



A third type of LiDAR, known as flash LiDAR, flashes many short pulses at the same time, uses a camera chip to receive the pulse reflections, and subsequently measures the ToF. Flash LiDAR has very high resolution but is limited to about 30 meters (m).

Sensing the environment

Thermal imaging camera: A thermal imaging camera on a drone detects heat signatures/temperatures from objects and materials and displays them as still images or videos. The Notre-Dame fire in Paris was observed and tracked using thermal imaging cameras. These cameras can detect small differences in heat, sometimes as small as 0.01˚C. Another important area for drone thermal imaging is disaster recovery, such as after an earthquake or severe hurricane, which can leave behind damaged or collapsed structures with people trapped inside (Figures 4 and 5). A good way for designers to start using thermal imaging in a drone is with something like the 500-0771-01, a FLIR Lepton micro thermal camera. The camera has a spectral range of 8 micrometers (µm) to 14 µm, a scene dynamic range of 0˚C to 120˚C, and a nominal power consumption of 150 milliwatts (mW) (operating), 650 mW (during a shutter event), and 5 mW (standby).

Figure 4: A drone’s-eye view of a collapsed building is an important first step a drone would take with a conventional camera. Then, with the use of a thermal imaging camera, it could sense the body heat of those trapped in the rubble. (Image source: IEEE, Reference 4)

Figure 5: Designers now have the tools to locate and save lives in disaster situations. This image of a trapped person was taken using a DJI drone during a fire fighter drill. (Image source: Industrial Equipment News/Menlo Fire UAS/Drone program, via AP)


Humidity, pressure, and temperature sensing: To help determine atmospheric conditions, designers can use Bosch Sensortec’s BME280, a digital humidity, pressure, and temperature sensor with an SPI interface. It’s highly integrated, measuring 2.5 mm x 2.5 mm x 0.93 mm, and consumes as little as 0.1 µA in sleep mode, or up to 3.6 µA when sensing all three parameters.

Accelerate time to market with multi-sensor development kits

The DA14585IOTMSENSOR is a multi-sensor development kit from Dialog Semiconductor that uses environmental sensors from Bosch Sensortec and motion sensors from TDK InvenSense. This kit is important for designers because it is a good platform upon which to experiment with and develop drone sensor fusion capabilities and accelerate time to market. It has a BME680 low-power gas, humidity, pressure, and temperature sensor, as well as an accelerometer, gyroscope, and magnetometer. The DA14585IOTMSENSOR’s sensor fusion capabilities let designers see how this feature can be used to get better overall sensing performance while also extending drone battery life.

Conclusion

Drones present an unusually difficult design challenge, requiring high functionality and long flight times. As with any design, the main tasks the device will be required to perform must be known in order to develop a plan that ensures an optimal architecture meeting the project requirements.

References

1. "Drones…Up, Up, and Away"
2. Samer Aldhaher, Paul D. Mitcheson, Juan M. Arteaga, George Kkelis, and David C. Yates, "Light-Weight Wireless Power Transfer for Mid-Air Charging of Drones," IEEE, 2017.
3. Jiali Zhou, Bo Zhang, Wenxun Xiao, Dongyuan Qiu, and Yanfeng Chen, "Nonlinear Parity-Time-Symmetric Model for Constant Efficiency Wireless Power Transfer: Application to a Drone-in-Flight Wireless Charging Platform," IEEE, 2018.
4. Rameesha Tariq, Maham Rahim, Nimra Aslam, Narmeen Bawany, and Ummay Faseeha, "DronAID: A Smart Human Detection Drone for Rescue," IEEE, 2018.


An introduction to pressure sensors

By Ryan Smoot, Technical Support Engineer, CUI Devices


A pressure sensor is an electronic component that monitors or detects gas or liquid pressure (force) and transforms that information into an electrical signal that can be used to monitor or regulate that force. Before discussing pressure sensors further, however, it is worthwhile to start with some fundamental definitions. Pressure is the magnitude of force exerted by a gas or a liquid on a unit area of a surface. The relationship between pressure (P), force (F), and area (A) is given by the equation P = F/A. The traditional unit of pressure is the pascal (Pa), defined as one newton (N) per square meter. Pressure can also be described as the force needed to impede a fluid's expansion. Pressure sensors come in a variety of technologies, which are discussed later in this article, and each technology will ultimately determine how a particular pressure sensor operates.

Addressing naming confusion

At a fundamental level, pressure sensors, pressure transducers, and pressure transmitters are comparable in function, and hence the terms are often used interchangeably. However, the main distinctions among them are in their output signals. A pressure sensor senses the force of the pressure and generates an output signal that corresponds to the magnitude of the force being exerted. A pressure transducer transforms the detected force into a continuous voltage output (V), while a pressure transmitter converts the detected force into a current output (mA). In common usage, pressure sensors may be referred to using a variety of terms, such as pressure transducers, pressure transmitters, pressure senders, pressure indicators, piezometers, and manometers. Regardless of the nomenclature, these devices are implemented for the monitoring and regulation of pressure in numerous applications, and can also be used for measuring other variables, such as fluid/gas flow, altitude, and water level.

Pressure measurement types

In the realm of pressure measurement and pressure sensors, there are a variety of terms that must be understood to ensure optimal system performance and measurement accuracy. The specific type of pressure sensor utilized in your application can significantly impact these factors, as pressure is typically measured in relation to a reference, such as atmospheric pressure at sea level. One crucial term is Gauge Pressure, which is a measurement of pressure relative to the local ambient or atmospheric pressure. The indicated pressure is either higher or lower than the local atmospheric pressure. Another significant term is Absolute Pressure, which is a pressure measurement relative to a reference of zero pressure, or a vacuum. The measurement obtained using an absolute pressure sensor remains the same irrespective of the location where it is measured. Differential Pressure pertains to the difference in pressure between two distinct points in a system, and is frequently used to calculate the flow of liquids or gases within pipes. Vacuum Pressure measures a negative pressure range compared to the ambient or local atmospheric pressure. Lastly, Compound Pressure involves the measurement of both positive and negative pressure or vacuum, essentially combining Gauge Pressure and Vacuum Pressure.

Although many pressure sensors available today can be used with a broad range of fluids and gases, some fluids that are more viscous or thick (paper pulp, asphalt, crude oil, etc.) may require customized pressure sensors. Nevertheless, there is a pressure sensor type suitable for almost any scenario.
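The measurement references described in this section are related by simple arithmetic: a gauge reading is an absolute reading minus the local atmospheric pressure, and a differential reading is just the difference between two points. A short sketch with illustrative values:

```python
# Relating pressure measurement references: gauge = absolute - atmospheric,
# and differential is the difference between two points in a system.
ATMOSPHERIC_PA = 101_325.0   # standard sea-level atmosphere (local value varies)

def gauge_from_absolute(absolute_pa: float, atmos_pa: float = ATMOSPHERIC_PA) -> float:
    """Gauge pressure; negative results are vacuum readings."""
    return absolute_pa - atmos_pa

def differential(p1_pa: float, p2_pa: float) -> float:
    """Differential pressure between two points, e.g. across a flow restriction."""
    return p1_pa - p2_pa

# A sealed vessel at 201,325 Pa absolute reads 100,000 Pa (about 1 bar) gauge:
print(gauge_from_absolute(201_325.0))
# Below atmospheric, the gauge reading goes negative (a vacuum reading):
print(gauge_from_absolute(91_325.0))
```

This is also why a gauge-referenced sensor's reading changes with weather and altitude while an absolute sensor's does not: only the atmospheric term in the subtraction moves.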


