Talking Hands: American Sign Language Gesture Recognition Glove

Roberto developed a glove that enables communication between the user and those
around him. While the design is intended for use by people communicating in American Sign Language, you can apply what you learn in this article to a variety of communications applications.
Photo 1: Here you see the finished product with all of the sensors sewn in. The use of string as opposed to adhesive for the sensors allowed the components to smoothly slide back and forth as the hand was articulated.

By Roberto Villalba

While studying at Cornell University in 2014, my lab partner Monica Lin and I designed and built a glove to be worn on the right hand that uses a machine learning (ML) algorithm to translate sign language into spoken English (see Photo 1). Our goal was to create a way for the speech impaired to be able to communicate with the general public more easily. Since every person’s hand is a unique size and shape, we aimed to create a device that could provide reliable translations regardless of those differences. Our device relies on a variety of sensors, such as flex sensors, a gyroscope, an accelerometer, and touch sensors to quantify the state of the user’s hand. These sensors allow us to capture the flex on each of the fingers, the hand’s orientation, rotation, and points of contact. By collecting a moderate amount of this data for each sign and feeding it into an ML algorithm, we are able to learn the association between sensor readings and their corresponding signs. We make use of a microcontroller to read, filter, and send the data from the glove to a PC. Initially, some data is gathered from the users and the information is used to train a classifier that learns to differentiate between signs. Once the training is done, the user is able to put on the glove and make gestures, which the computer then turns into audible output.

Figure 1: After performing some calculations and characterizing our flex sensors, we decided to use a 10-kΩ resistor. Note that the rightmost point goes into one of the microcontroller’s ADC inputs.

HIGH-LEVEL DESIGN
We use the microcontroller’s analog-to-digital converter (ADC) to read the voltage drop across each of the flex sensors. We then move on to reading the linear acceleration and rotation values from the accelerometer and gyro sensor using I²C. And finally, we get binary readings from each of the touch sensors indicating whether or not there is contact. We perform as many readings as possible within a given window of time and use all of this data to do some smoothing. This information is then sent over serial to the PC, where it is gathered and processed. On the PC, Python listens to information coming in from the microcontroller and either stores data or predicts based on already learned information. Our code includes scripts for gathering data, loading stored data, classifying the data that is being streamed live, and some additional scripts to help with visualization of sensor readings and so on.

MCU & SENSORS
The design comprises an Atmel ATmega1284P microcontroller and a glove onto which the various sensors and necessary wires were sewn. Each finger has one Spectra Symbol flex sensor stitched on the backside of the glove. The accelerometer and gyro sensors are attached to the center of the back of the glove. The two contact sensors were made out of copper tape and wire that was affixed to four key locations.

Since each flex sensor has a resistance that varies depending on how much the finger is bent, we attached each flex sensor as part of a voltage divider circuit in order to obtain a corresponding voltage that can then be input into the microcontroller.

Vout = Vin × R1 / (R1 + Rflex)

We determined a good value for R1 by analyzing expected values from the flex sensor. Each one has a flat resistance of 10 kΩ and a maximum expected resistance (obtained by measuring its resistance on a clenched fist) of about 27 kΩ. In order to obtain the maximum range of possible output voltages from the divider circuit given an input voltage of 5 V, we plotted the expected ranges using the above equation and values of R1 in the range of 10 kΩ to 22 kΩ. We found that the differences between the ranges were negligible and opted to use 10 kΩ for R1 (see Figure 1).
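To see how little the choice of R1 matters, here is a minimal Python sketch of this analysis. The resistance and voltage values are the ones quoted above; the assumption that the output is taken across R1 (the node feeding the ADC, per Figure 1) and the script itself are illustrative rather than the original tooling:

```python
# Sweep candidate R1 values and report the divider's output swing.
VIN = 5.0        # divider input voltage (V)
R_FLAT = 10e3    # flex sensor resistance, finger straight (ohms)
R_BENT = 27e3    # flex sensor resistance, clenched fist (ohms)

def vout(r1, r_flex, vin=VIN):
    """Output of the divider, assumed measured across R1."""
    return vin * r1 / (r1 + r_flex)

for r1 in (10e3, 12e3, 15e3, 18e3, 22e3):
    swing = vout(r1, R_FLAT) - vout(r1, R_BENT)
    print(f"R1 = {r1 / 1e3:4.1f} kOhm -> output swing {swing:.2f} V")
```

With R1 = 10 kΩ the swing works out to roughly 1.1 V, which agrees with the output range described next.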

Our resulting voltage divider has an output range of about 1 V. We were initially concerned that the resulting values from the microcontroller’s ADC would be too close together for the learning algorithm to discern between different values sufficiently. We planned to address this by increasing the input voltage to the voltage divider if necessary, but we found that the range of voltages described earlier was sufficient and performed extremely well.

The InvenSense MPU-6050 accelerometer and gyro sensor package operates on a lower VCC (3.3 V) compared to the microcontroller’s 5 V. So as not to burn out the chip, we created a voltage regulator using an NPN transistor and a trimpot, connected as shown. The trimpot was adjusted so that the output of the regulator reads 3.3 V. This voltage also serves as the source for the pull-up resistors on the SDA and SCL wires to the microcontroller. Since the I²C devices are capable only of driving the input voltages low, we connect them to VCC via two 4.7-kΩ pull-up resistors (see Figure 2).

As described later, we found that we needed to add contact sensors to several key spots on the glove (see Figure 3). These essentially function as switches that pull the microcontroller input pins to ground to signal contact (be sure to set up the microcontroller pins to use the internal pull-up resistors).

Figure 2: Here we see the schematic of the voltage regulator circuit that we created in order to obtain 3.3 V. The bottom of the schematic shows how this same regulator was used to pull up the signals at SCL and SDA.

Figure 3: The contact sensor circuitry was quite simple. The input pins of the microcontroller are set to the internal pull-up resistors and whenever the two corresponding copper ends on the fingers touch the input is pulled low.

I2C COMMUNICATIONS
Interfacing with the MPU-6050 required I²C communication, for which we chose to use Peter Fleury’s public I²C library for AVR microcontrollers. I²C is designed to support multiple devices using a single dedicated data bus (SDA) and a single clock bus (SCL). Even though we were only using the interface for the microcontroller to regularly poll the MPU-6050, we had to adhere to the I²C protocol. Fleury’s library provided us with macros for issuing start and stop conditions from the microcontroller (signals indicating that the microcontroller is requesting data from the MPU-6050 or releasing control of the bus). These macros allowed us to easily initialize the I²C interface, set up the MPU-6050, and request and receive the accelerometer and gyroscope data (described later).

Figure 4: This image is the visual output received from plotting sequences of sensor readings. The clear divisions along the horizontal axis mark the different signs A, B, C, and D, respectively.

While testing our I²C communication with the MPU-6050, we found that the microcontroller would on rare occasions hang while waiting for data from the I²C bus. To prevent this from stalling our program, we enabled a watchdog timer that would reset the system every 0.5 s unless our program continued to reach regular checkpoints, at which point we would reset the watchdog timer to prevent it from unnecessarily resetting the system. We were able to leverage the fact that our microcontroller’s work consists primarily of continuously collecting sensor data and sending packets to a separate PC.

Photo 2: In this image we see the hand gestures for R, U, and V. As you can tell, there is not much difference in the hand’s orientation or the amount of flex on the fingers. However, note that the copper pieces make different kinds of contact for each of the signs.

TINYREALTIME
For the majority of the code, we used Dan Henriksson and Anton Cervin’s TinyRealTime kernel. The primary reason for using this kernel is that we wanted to take advantage of the already implemented non-blocking UART library in order to communicate with the PC. While we only had a single thread running, we tried to squeeze in as much computation as possible while the data was being transmitted.

The program first initializes the I²C interface, the MPU-6050, and the ADC. After it enters an infinite loop, it resets the watchdog timer and gets 16 readings from all of the sensors: accelerometer, gyroscope, flex sensors, and touch sensors. We then take all of the sensor values and compute filtered values by summing all 16 readings from each sensor. Since summation of the IMU sensors’ values can overflow, we make sure to right-shift each of their values by 8 bits before summing them up. The data is then wrapped into a byte-array packet organized in the form of a header (0xA1B2C3D4), the data, and a checksum of the data. Each of the sensors is stored in 2 bytes, and the checksum is calculated by summing the unsigned representation of each of the bytes in the data portion of the packet into a 2-byte integer. Once the packet has been created, it is sent through the USB cable to the computer and the process repeats.

PYTHON COMMUNICATION
Communication with the microcontroller was established through the use of Python’s socket and struct libraries. We created a class called SerialWrapper whose main goal is to receive data from the microcontroller. It does so by opening a port and running a separate thread that waits for new data to become available. The data is then scanned for the header, and a packet of the right length is removed when available. The checksum is then calculated and verified and, if valid, the data is unpacked into the appropriate values and fed into a queue for other processes to extract. Since we know the format of the packet, we can use the struct library to extract all of the data from the packet, which is in a byte-array format. We then provide the user with two modes of use: one continuously captures and labels data in order to build a dataset, and the other continuously tries to classify incoming data. Support Vector Machines (SVMs) are a widely used set of ML algorithms that learn to classify by using a kernel. While the kernel can take various forms, the most common kind is the linear SVM. Simply put, the classification, or sign, for a set of readings is decided by taking the dot product of the readings and the classifier’s weights. While this may seem like a simple approach, the results are quite impressive. For more information about SVMs, take a look at scikit-learn’s “Support Vector Machines” (http://scikit-learn.org/stable/modules/svm.html).
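To make the packet handling concrete, here is a short Python sketch in the spirit of SerialWrapper. The header value, the 13 two-byte sensor fields, and the summed-byte checksum all come from the packet format described earlier; the big-endian byte order, unsigned fields, and function name are assumptions made for the example:

```python
import struct

HEADER = b"\xA1\xB2\xC3\xD4"                  # packet header from the article
N_SENSORS = 13                                # accel (3) + gyro (3) + flex (5) + touch (2)
PACKET_LEN = len(HEADER) + 2 * N_SENSORS + 2  # header + data + 2-byte checksum

def extract_packets(buf):
    """Pull validated sensor tuples out of a bytearray of raw serial data."""
    while True:
        start = buf.find(HEADER)
        if start < 0 or len(buf) - start < PACKET_LEN:
            return                            # wait for more data to arrive
        data = bytes(buf[start + 4 : start + 4 + 2 * N_SENSORS])
        (checksum,) = struct.unpack(">H", buf[start + PACKET_LEN - 2 : start + PACKET_LEN])
        if sum(data) & 0xFFFF == checksum:    # checksum: sum of data bytes
            yield struct.unpack(">%dH" % N_SENSORS, data)
            del buf[: start + PACKET_LEN]
        else:
            del buf[: start + 1]              # corrupt packet: resync past header

# Usage: append incoming serial bytes to a bytearray, then
# for values in extract_packets(rx_buffer): queue.put(values)
```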

PYTHON MACHINE LEARNING
For the purposes of this project we chose to focus primarily on the alphabet, A to Z, and we added two more labels, “nothing” and “relaxed,” to the set. Our rationale for giving the classifier “nothing” was to have a class made up mostly of noise. This class would not only provide negative instances to help learn our other classes, but it also gave the classifier a way of indicating that the gestured sign is not recognized as one of the ones that we care about. In addition, we didn’t want the classifier to try to predict any of the letters when the user was simply standing by, so we taught it what a “relaxed” state was. This state was simply the position of the user’s hand when he or she was not signing anything. In total there were 28 signs, or labels. For our project we made extensive use of Python’s scikit-learn library. Since we were using various kinds of sensors with drastically different ranges of values, it was important to scale all of our data so that the SVM would have an easier time classifying. To do so, we made use of the preprocessing tools available in scikit-learn. We chose to take all of our data and scale it so that the mean for each sensor was centered at zero and the readings had unit variance. This approach brought about drastic improvements in our performance and is strongly recommended. The classifier that we ended up using was an SVM provided by scikit-learn under the name SVC.
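In code, the scale-then-classify recipe takes only a few lines. StandardScaler and SVC are the scikit-learn tools the text refers to; the file names and array shapes below are placeholders for illustration:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder dataset: one row of 13 sensor readings per packet,
# one of 28 labels ("a".."z", "nothing", "relaxed") per row.
X = np.load("glove_readings.npy")   # shape (n_samples, 13); file name assumed
y = np.load("glove_labels.npy")     # shape (n_samples,); file name assumed

scaler = StandardScaler().fit(X)    # zero mean, unit variance per sensor
clf = SVC()                         # scikit-learn's SVM classifier
clf.fit(scaler.transform(X), y)

def classify(packet):
    """Predict the sign for a single 13-value sensor packet."""
    row = np.asarray(packet, dtype=float).reshape(1, -1)
    return clf.predict(scaler.transform(row))[0]
```

Note that the same scaler fitted on the training data must be applied to live readings before prediction; skipping that step quietly degrades accuracy.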

Figure 5: The confusion matrix demonstrates how many times each label is predicted and how many times that prediction is accurate. We would like to see a perfect diagonal line, but we see that one square does not adhere to this. This square corresponds to “predicted V when it was really U” and it shows about a 66% accuracy.

Another part that was crucial to us as developers was the use of plotting in order to visualize the data and judge how well a learning algorithm should be able to predict the various signs. The main tool that we developed for this was the plotting of a sequence of sensor readings as an image (see Figure 4). Since each packet contains a value for each of the sensors (13 in total), we can concatenate multiple packets to create a matrix. Each row is thus one of the sensors, and scanning a row from left to right gives progressively later sensor readings. In other words, every packet makes up a column. This matrix can then be plotted with instances of the same sign grouped together, and the differences between these and the others can be observed. If the difference is clear to us, then the learning algorithm should have no issue telling them apart. If this is not the case, then it is possible that the algorithm could struggle more and changes to the approach might be necessary.
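A sketch of that plotting tool, assuming the packets have already been collected into a NumPy array grouped by sign (the file name is a placeholder):

```python
import numpy as np
import matplotlib.pyplot as plt

# Packets grouped by sign: e.g., all "A" packets, then all "B" packets, ...
packets = np.load("packets_grouped_by_sign.npy")   # shape (n_packets, 13)

image = packets.T            # rows = the 13 sensors, columns = packets (time)
plt.imshow(image, aspect="auto", interpolation="nearest")
plt.xlabel("packet index (grouped by sign)")
plt.ylabel("sensor index")
plt.colorbar(label="sensor reading")
plt.show()
```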

The final step of classification is to pass the output of the classifier through a final level of filtering and debouncing before the output reaches the user. To accomplish this, we fill a buffer with the last 10 predictions and only consider something a valid prediction if it has been predicted for at least nine out of the last 10 predictions. Furthermore, we debounce this output and only notify the user if this is a novel prediction and not just a continuation of the previous one. We print this result on the screen and also make use of Peter Parente’s pyttsx cross-platform text-to-speech library to output the result as audio when it is neither “nothing” nor “relaxed.”
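A compact sketch of this final stage might look as follows. The 9-of-10 threshold and the skipped labels come from the text, and init(), say(), and runAndWait() are pyttsx’s standard calls; the function itself is illustrative:

```python
from collections import deque
import pyttsx

engine = pyttsx.init()
recent = deque(maxlen=10)    # the last 10 raw classifier outputs
last_spoken = None

def report(prediction):
    """Filter and debounce classifier output before notifying the user."""
    global last_spoken
    recent.append(prediction)
    if recent.count(prediction) < 9:
        return                          # not stable enough yet
    if prediction == last_spoken:
        return                          # debounce: already announced
    last_spoken = prediction
    print(prediction)
    if prediction not in ("nothing", "relaxed"):
        engine.say(prediction)
        engine.runAndWait()
```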

RESULTS
Our original glove did not have contact sensors on the index and middle fingers. As a result, it had a hard time predicting “R,” “U,” and “V” properly. These signs are actually quite similar to each other in terms of hand orientation and flex. To mitigate this, we added two contact sensors: one set on the tips of the index and middle fingers to detect “R,” and another pair in between the index and middle fingers to discern between “U” and “V.”

As you might have guessed, the speed of our approach is limited by the rate of communication between the microcontroller and the computer and by the rate at which we are able to poll the ADC on the microcontroller. We determined how quickly we could send data to the PC by sending data serially and increasing the send rate until we noticed a difference between the rate at which data was being received and the rate at which it was being sent. We then reduced the send frequency back to a reasonable value and converted this into a loop interval (about 3 ms).

We then aimed to gather as much data as possible from the sensors in between packet transmissions. In addition to sending a packet, the microcontroller also sent the number of readings that it had performed. We then used this number to come up with a reasonable number of values to poll before aggregating the data and sending it to the PC. We concluded that the microcontroller was capable of reading and averaging each of the sensors 16 times per packet, which for our purposes provided enough room to do some averaging.

The Python algorithm is currently limited by the rate at which the microcontroller sends data to the PC and the time that it takes the speech engine to say the word or letter. The rate of transfer is currently about 30 Hz, and we wait to fill a buffer with about 10 unanimous predictions. This means that the fastest we could output a prediction would be about three times per second, which for our needs was suitable. Of course, one can play with these values in order to get faster but slightly less accurate predictions. However, we felt that the glove was responsive enough at three predictions per second.

While we were able to get very accurate predictions, we did see some slight variations in accuracy depending on the size of the person’s hands. The accuracy of each flex sensor is limited beyond a certain point. Smaller hands result in a greater degree of bend. As a result, the difference between slightly different signs with a lot of flex tends to be smaller for users with more petite hands. For example, consider the signs for “M” and “S.” The only difference between these signs is that “S” elicits slightly more flex in the fingers. However, for smaller hands, the change in the flex sensor’s resistance is small, and the algorithm may be unable to discern the difference between these signs.

Figure 6: We can see that even with very small amounts of data the classifier does quite well. After gathering just over 60 readings per sign it achieves an accuracy of over 98%.

In the end, our current classifier was able to achieve an accuracy of 98% (the error being composed almost solely of U/V confusion) on a task of 28 signs: the full alphabet as well as “relaxed” and “nothing” (see Figure 5). A random classifier would guess correctly only about 4% of the time, clearly indicating that our device is quite accurate. It is, however, worth noting that the algorithm could greatly benefit from improved touch sensors (seeing as the most common mistake is confusing U for V), from being trained on a larger population of users, and especially from larger datasets. With a broad enough dataset, we could provide new users with a small test script that covers only the letters that are difficult to predict and rely on the already available data for the rest. The software has currently been trained on the two team members and tested on some users outside of the team. The results were excellent for the team members who trained the glove and mostly satisfying, though not perfect, for the other volunteers. Since the volunteers did not have a chance to train the glove and were not very familiar with the signs, it is hard to say whether their lower accuracy was a result of overfitting, individual variations in signing, or inexperience with American Sign Language. Regardless, the accuracy of the software was near perfect for users who trained it and mostly accurate for users who did not train the glove and did not know American Sign Language beforehand.

Lastly, it is worth noting that the amount of data necessary for training the classifier was actually surprisingly small. With about 60 instances per label, the classifier was able to reach the 98% mark. Given that we receive 30 samples per second and that there are 28 signs, gathering data for training could be done in under a minute (28 signs × 60 readings ÷ 30 readings per second ≈ 56 s; see Figure 6).

FUTURE UPGRADES
The project met our expectations. Our initial goal was to create a system capable of recognizing and classifying gestures. We were able to do so with more than 98% average accuracy across all 28 classes. While we did not have a solid time requirement for the rate of prediction, the resulting speed made using the glove comfortable, and it did not feel sluggish. Looking ahead, it would make sense to improve our approach to the touch sensors, since the majority of the ambiguity between signs comes from the difference between U and V. We want to use materials that lend themselves more seamlessly to clothing and provide a more reliable connection. In addition, it would be beneficial to test and train our project on a large group of people, since this would provide us with richer data and more consistency. Lastly, we hope to make the glove wireless, which would allow it to easily communicate with phones and other devices and make the system truly portable.

RESOURCES
Arduino, “MPU-6050 Accelerometer + Gyro,” http://playground.arduino.cc/Main/MPU-6050.

Atmel Corp., “8-Bit AVR Microcontroller with 128K Bytes In-System Programmable Flash: ATmega1284P,” 8059D-AVR-11/09, 2009, www.atmel.com/images/doc8059.pdf.

P. Fleury, “AVR-Software,” 2006, http://homepage.hispeed.ch/peterfleury/avrsoftware.html.

Lund University, “Tiny Real Time,” 2006, www.control.lth.se/~anton/tinyrealtime/.

P. Parente, “pyttsx – Text-to-speech x-platform.”

Python Software Foundation, “struct – Interpret Strings as Packed Binary Data,” https://docs.python.org/2/library/struct.html.

scikit-learn, “Preprocessing Data,” http://scikit-learn.org/stable/modules/preprocessing.html.

scikit-learn, “Support Vector Machines,” http://scikit-learn.org/stable/modules/svm.html.

Spectra Symbol, “Flex Sensor FS,” 2015, www.spectrasymbol.com/wp-content/themes/spectra/images/datasheets/FlexSensor.pdf.

R. Villalba and M. Lin, “Sign Language Glove,” ECE4760, Cornell University, 2014, http://people.ece.cornell.edu/land/courses/ece4760/FinalProjects/f2014/rdv28_mjl256/webpage/.

SOURCES
ATmega1284P Microcontroller Atmel | www.atmel.com

MPU-6050 MEMS MotionTracking Device InvenSense | www.invensense.com

Article originally published in Circuit Cellar June 2016, Issue #311

Transducer Class Multi-Grid Strain Sensors for Multi-Axis Force, Axial, and Torsional Load Measurements

Vishay Precision Group’s Micro-Measurements brand recently introduced the S5060 Series of transducer-class, multi-grid advanced strain sensors. Designed for accurate, cost-effective multi-axis force, torque/axial, and torsional load measurements, the Series is well suited for a wide variety of applications, including robotics, factory automation, machinery, materials testing, and more.

The S5060 Series’s features, specs, and benefits:

  • Incorporates proprietary Advanced Sensors Technology
  • When installed in pairs, the circuitry pattern can be used to construct both a full-torsion bridge and a full-Poisson bridge via the installation of just two strain sensors.
  • The alignment of a single pair automatically aligns all other grids installed on the common backing
  • The number of required circuit refinements for initial zero balance, as well as temperature compensation for zero balance, is further reduced via improvements in resistance tolerance (±0.2%) and grid-to-grid thermal performance matching specifications.

Sample and production quantities are now available. Prototype sensors can be produced and delivered within six weeks, with standard volumes available in 10 weeks.

Source: Micro-Measurements

New Reflective Optical Sensor for Industrial and Medical Applications

TT Electronics recently introduced the Photologic V OPB9000, a reflective CMOS logic-output sensor with programmable sensitivity, output polarity, and drain select. It provides dependable edge and presence detection of reflective media under a wide range of ambient light conditions. The OPB9000 is well suited for a variety of applications, including industrial printing, dispensing, manufacturing automation, security devices, and portable medical equipment.

The OPB9000’s features, benefits, and specs:

  • Programmable sensitivity, output polarity, and drain select
  • 25+ kilolux ambient light immunity along with a wide operating temperature range
  • The self-calibration feature avoids the need for constant recalibration as the LED ages, saving valuable time and effort.
  • Temperature compensation and automatic gain control features
  • 6-µs response time ensures high-speed detection for time-critical applications.
  • Fully integrated analog front end and digital interface
  • Combines an infrared emitter and integrated logic sensor in a 4.0 mm × 2.2 mm × 1.5 mm surface-mount package

Source: TT Electronics

Scalable Wearable Development Kit

ON Semiconductor recently announced the availability of a new Wearable Development Kit (WDK1.0). The kit comprises the following: a touchscreen display; wired and AirFuel-compatible wireless charging capability; a six-axis motion sensor and temperature sensor; an alarm, timer, and stopwatch; schematics; firmware and sample code; a dock station for charging; and a downloadable SmartApp for evaluating and controlling the smartwatch’s multiple functions.

The WDK1.0’s features, specs, and benefits:

  • NCP6915 power management IC provides five LDOs and one DC-DC converter
  • NCP1855 battery charger IC, an LC709203F fuel gauge, and a 10-W rated SCY1751 wireless charging front-end controller
  • MEMS-based FIS1100 IMU, with three‐axis gyroscope and three‐axis accelerometer operation for multidimensional motion tracking
  • Embedded temperature sensor included and an LC898301 driver IC for initiating haptic feedback
  • nRF52832 multi-protocol system-on-chip (SoC)
  • Eclipse-based IDE
  • 1.44″ 128 × 128 pixel TFT display with a capacitive touch screen
  • 26‐pin expansion port

Source: ON Semiconductor

New Sensor Technologies for Next-Gen Temperature Measurement

Melexis recently announced two new sensing technologies for next-generation temperature measurement. The MLX90640 sensor array is an alternative to high-end thermal cameras. The MLX90342 is a quad thermocouple interface that addresses automotive sensing up to 1300°C.

The MLX90640 IR sensor array’s benefits, characteristics, and specs:

  • 32 × 24 pixels
  • –40° to 85°C operational temperature range; measures object temperatures from –40°C to 300°C
  • ±1°C target object temperature accuracy
  • Noise equivalent temperature difference (NETD) of 0.1K RMS at a 1-Hz refresh rate
  • Doesn’t require frequent recalibration
  • Field-of-view (FoV) options: 55° × 35° version and 110° × 75° wide-angle version
  • Compact, four-pin TO39 package incorporating the requisite optics
  • I2C-compatible digital interface
  • Target applications: fire prevention systems, HVAC equipment, smart buildings, and IP/surveillance systems

The MLX90342 high-performance quadruple thermocouple interface’s benefits, characteristics, and specs:

  • Supports a –40° to 1300°C thermocouple temperature range
  • Operating temperature specification of –40° to 155°C
  • On-board cold junction compensation and linearization
  • Factory calibration; guaranteed intrinsic accuracy of ±5°C at 1100°C.
  • 26-pin 6 mm × 4 mm QFN package
  • Rapid 50-Hz refresh rate
  • Temperature data can be transmitted via a SENT Revision 3 digital interface
  • Target applications: turbo charger temperature control, exhaust gas recirculation, and diesel/gas particle filtering systems

Source: Melexis

79-GHz CMOS Radar Sensor Chips for Automotive Applications

Infineon Technologies recently announced at the Imec Technology Forum in Brussels (ITF Brussels 2016) it is cooperating with Imec to develop integrated CMOS-based, 79-GHz sensor chips for automotive radar applications. According to the announcement, Infineon and Imec expect functional samples to be available in Q3 2016. A complete radar system demonstrator is slated for early 2017.

There are usually up to three radar systems built into vehicles equipped with driver assistance functions. In the future, fully automated cars will be equipped with up to 10 radar systems and 10 additional sensor systems using camera or lidar technologies.

Source: Infineon Technologies

OEM Controller for Fiber Optic Emergency Stop and Signaling Sensors

Micronor’s MR380-0 OEM Controller provides a low-cost, turn-key solution for OEM manufacturers and control system providers integrating any of the Micronor MR38X series ZapFREE Fiber Optic Signaling Sensors into their design. The sensor range includes Emergency Stop, E-Actuator, U-Beam, Key Switch, Push Button, Foot Switch, and Microswitch sensors.

The OEM Controller contains a stable transmitter and a sensitive optical receiver that operates over a Duplex LC multimode fiber optic link. The transmitter sends a constant light level via the transmit fiber that is interrupted when the fiber optic switch activates or the sensor beam is broken. The system is compatible with either OM1 (62.5 µm/125 µm) or OM2/OM3 (50 µm/125 µm) multimode fiber to distances up to 1.5 km. The Controller operates over a wide 5 to 24 VDC range and provides a Digital Logic as well as Open-Collector Output for activating external relays.

The MR380 ZapFREE Signaling Sensor System outperforms electromechanical and electronics-based switches and sensors, specifically where EMI immunity, high-voltage isolation, inherent safety, MRI compatibility, or operation over long distances is required. Applications include medical and MRI, transportation, and more.

For ATEX applications and hazardous locations, the Signaling Sensors are classified as simple mechanical devices and can be installed in any manner of explosive atmosphere (mines, gas, and dust). The Controller outputs inherently safe optical radiation and is approved for EPL Mb/Gb/Gc/Db/Dc applications.

For functional safety applications, depending on the sensor type, the controller defaults to the emergency state when the optical path is blocked, a fiber is broken or disconnected, or power to the controller is lost.

In small quantities, the MR380-0 OEM Controller is $250 and MR38X Sensors range from $350 to $495, with a typical lead time of stock to two weeks. Discounts are available for OEM applications. Specially engineered versions are available for MRI, radiation, and vacuum environments.

Source: Micronor

3-D Image Sensor Chips for Virtual Reality

Infineon Technologies AG and pmdtechnologies gmbh recently announced the development of REAL3 3-D image sensor chips for virtual and augmented reality applications, spatial measurement, photo effects, and more. The new sensors have improved optical sensitivity and power consumption in comparison to the previous version.
Features and specs:

  • Specifically designed for mobile devices, where most applications only need a resolution of 38,000 pixels.
  • Small sensor chip area
  • Each sensor chip features microlenses
  • The chips operate with infrared light and use the time-of-flight (ToF) measuring principle

The IRS1125C will be available in volume as of the first quarter of 2016. The IRS1645C and IRS1615C are slated for the second quarter of 2016.

Source: Infineon Technologies

New Low-Power Smart Sensor Wireless Platform for IoT Devices

Dialog Semiconductor recently announced that it is collaborating with Bosch Sensortec to develop a low-power smart sensor platform for Internet of Things (IoT) devices. The 12-DOF smart sensor reference platform is intended for gesture recognition in wearable computing devices and immersive gaming, including augmented reality and 3-D indoor mapping and navigation.

The platform comprises Dialog’s DA14580 Bluetooth Smart SoC with three low-power Bosch Sensortec sensors: the BMM150 (for three-axis geomagnetic field measurement), the BME280 (a pressure, humidity, and temperature sensor), and the six-axis BMI160 (a combination of a three-axis accelerometer and three-axis gyroscope in one chip). The resulting 14 × 14 mm unit draws less than 500 µA from a 3-V coin cell when updating and transferring all 12 × 16 bits of data wirelessly to a smartphone.

The 2.5 × 2.5 × 0.5 mm DA14580 SmartBond SoC integrates a Bluetooth Smart radio with an ARM Cortex-M0 application processor and intelligent power management. It more than doubles the battery life of an application-enabled smartphone accessory, wearable device, or computer peripheral in comparison with other solutions. The DA14580 includes a variety of analog and digital interfaces and features less than 15 mW power consumption in active mode and a 600-nA standby current.

Bosch Sensortec’s BMI160 six-axis Inertial Measurement Unit (IMU) integrates a 16 bit, three-axis, low-g accelerometer and an ultra-low power three-axis gyroscope within a single package. When the accelerometer and gyroscope are in full operation mode, the typical current consumption is 950 µA.

The BMM150 integrates a compact three-axis geo-magnetic field sensor using Bosch Sensortec’s high performance FlipCore technology. The BME280 Integrated Environmental Unit combines sensors for barometric pressure, humidity, and temperature measurement. Its altitude measurement function is a key requirement in applications such as indoor navigation with floor tracking.

Source: Dialog Semiconductor

Sensor Interface Connects Multiple Sensors to MCUs or FPGAs

Exar Corp. has announced the XR10910, a new sensor interface analog front end (AFE) for the calibration of sensor outputs. The XR10910 features an onboard 16:1 differential multiplexer, offset correction DAC, programmable gain instrumentation amplifier, and voltage reference. In addition, it provides 14-bit signal path linearity and is designed to connect multiple bridge sensors to a microcontroller or FPGA with an embedded ADC.

Operating from 2.7- to 5-V supplies, the XR10910 has a wide digital supply range of 1.8 to 5 V. It typically consumes 457 µA of supply current and offers a sleep mode that reduces the supply current to 45 µA.

The XR10910 is available in a 6 mm × 6 mm QFN package. Pricing starts at $8.10 each for 1,000-piece quantities.

Source: Exar Corp.

High-Side Current/Power Sensor

Microchip Technology recently introduced the PAC1921, a high-side current sensor with both a digital output and a configurable analog output that can present power, current, or voltage over the single output pin. Simultaneously, all power-related output values are also available over the 2-wire digital bus, which is compatible with I²C. The PAC1921 is available in a 10-lead 3 × 3 mm VDFN package. It was designed with the 2-wire bus to maximize data and diagnostic reporting, while having the analog output to minimize data latency. The analog output can also be adjusted for use with 3-, 2-, 1.5-, or 1-V microcontroller inputs.

The PAC1921 is ideal for networking, power-distribution, power-supply, computing, and industrial-automation applications that cannot allow for latency when performing high-speed power management. A 39-bit accumulation register and 128× gain configuration make this device ideal for both heavy and light system-load power measurement, from 0 to 32 V. It has the ability to integrate more than two seconds of power-consumption data. Additionally, the PAC1921 has a READ/INT pin for host control of the measurement period; this pin can be used to synchronize readings of multiple devices.

The PAC1921 is supported by Microchip’s $64.99 PAC1921 High-Side Power and Current Monitor Evaluation Board (ADM00592). The PAC1921 is available for sampling and volume production, in a 10-lead 3 × 3 mm VDFN package, starting at $1.18 each in 5,000-unit quantities.

Source: Microchip Technology

Liquid Flow Sensor Wins Innovation Prize

Sensirion recently won the DeviceMed OEM-Components innovation prize at the Compamed 2014 exhibition. The disposable liquid flow sensor LD20-2000T for medical devices features an integrated thermal sensor element in a microchip. The pinhead-sized device is based on Sensirion’s CMOSens technology.

The LD20-2000T disposable liquid flow sensor provides liquid flow measurement capability from inside medical tubing (e.g., a catheter) in a low-cost sensor, suitable for disposable applications. As a result, you can measure drug delivery from an infusion set, an infusion pump, or other medical device in real time.

A microchip inside the disposable sensor measures the flow inside a fluidic channel. Accurate (~5%) flow rates from 0 to 420 ml/h and beyond can be measured. Inert medical-grade wetted materials ensure sterile operation with no contamination of the fluid. The straight, open flow channel with no moving parts provides high reliability. Using Sensirion’s CMOSens technology, the fully calibrated signal is processed and linearized on the 7.4 mm2 chip.

Source: Sensirion

New 8-Bit PICs for Sensor Applications

Microchip Technology recently expanded its PIC12/16LF155X 8-bit microcontroller family with the PIC16LF1554 and PIC16LF1559 (PIC16LF1554/9), which are targeted toward a variety of sensor applications. The PIC16LF1554/9 features two independent 10-bit, 100,000-samples-per-second ADCs with hardware Capacitive Voltage Divider (CVD) support for capacitive touch sensing.

The PIC16LF1554 MCUs are available now for sampling and production in 14-pin PDIP, TSSOP, SOIC, and 16-pin QFN (4 × 4 × 0.9 mm) packages. The PIC16LF1559 MCUs are available for sampling and production in 20-pin PDIP, SSOP, and QFN (4 × 4 × 0.9 mm) packages. Pricing starts at $0.63 each in 10,000-unit quantities.

Source: Microchip Technology

High-Performance 4- to 20-mA Output Ultrasonic Sensor

MaxBotix’s new 4-20HR-MaxSonar-WR sensors are high-accuracy ultrasonic sensors featuring a 4- to 20-mA output. Each sensor is an affordable IP67-rated drop-in replacement for use with existing PLC/process control systems. The sensors reject outside noise sources and feature speed-of-sound temperature compensation.

The 4-20HR-MaxSonar-WR sensors provide range information from 50 to 500 cm and have a 1.6-mm resolution, an operational temperature range from –40° to 65°C (–40° to 149°F), real-time automatic calibration, a 200,000-plus hours MTBF, an operational voltage range from 12 to 32 V, and a low 20- to 40-mA average current requirement. The sensors function well with multiple sensors in the same location and they are RoHS- and CE-compliant.

A six-pin screw terminal header is included to simplify system connections for quick installation in applications such as: tank level measurement, tide/water level monitoring, solar/battery powered applications, industrial automation and outdoor vehicle detection.

The 4-20HR-MaxSonar-WR sensors (and previous IP67 MaxBotix sensors) are manufactured in a variety of packages for easy mounting in existing fittings. The sensors are available in M30x1.5, 1″ BSPP, 1″ NPTS, and 0.75″ NPTS PVC pipe fittings.

Pricing starts at $199.95 each, or $134.37 each in 100-unit quantities.

Source: MaxBotix, Inc.

Ultra-Compact Ultrasonic Sensor Series

The UCXL-MaxSonar-WR series of sensors are flexible, OEM-customizable products that can be integrated into a system with MaxBotix’s horns or flush-mounted into an existing housing. Mounting design recommendations are provided through MaxBotix’s 3-D CAD models (available in multiple formats) to facilitate your design process. The sensor layout offers four conveniently placed mounting holes for design flexibility.

The rugged, high-performance sensors are individually calibrated and feature a 1-cm resolution, an operational temperature range from –40°C to 70°C, real-time automatic calibration (voltage, humidity, and ambient noise), a 200,000+ h mean time between failures (MTBF), and an operational voltage range of 3 to 5.5 V with a low 3.4-mA average current requirement.

Contact MaxBotix for pricing.

MaxBotix, Inc.
www.maxbotix.com