While the fundamentals of data acquisition remain the same, the computing technology it rides on keeps evolving. USB and PCI Express brought data acquisition off the rack and onto the lab benchtop. Today, solutions are emerging that leverage web interfacing, Raspberry Pi and more.
Even though data acquisition has its unique requirements and industry demands, it’s a segment like any other when it comes to embracing the latest and greatest technologies and form factors from the computing world. More than a decade ago, desktop and laptop computing technologies transformed data acquisition. Typical data acq setups once required large racks of slot-cards based on VXI, PXI or other form factors. Today such systems can be implemented with well-established desktop computing interfaces like USB and PCI Express. And if you factor out the kind of data acquisition development done by the defense industry—radar and such—the data acquisition board vendor industry has moved over completely to USB and PCI Express for its core products.
Over the past 12 months, the latest trends revolve around increasing channel counts and embracing popular embedded architectures like Raspberry Pi. To keep pace, data acquisition product vendors are rolling out board-, module- and box-level solutions that leverage the best available interfacing and analog conversion technology. At the same time, they are improving on ways to make data acquisition easier for non-experts to leverage the technology.
DATA ACQ AND RASPBERRY PI
Thanks to its embrace by the hacker/maker community, Raspberry Pi is arguably the most popular SBC in use today. As a result, many traditional data acq users are designing systems around it because of its flexibility and low cost. This growing base of Raspberry Pi users, along with wider industry acceptance of open-source software, has made the platform increasingly prevalent in professional data acq applications.
The creators of Raspberry Pi recognized the obvious synergy between Raspberry Pi and data acquisition over four years ago. The board lets users attach physical hardware via its GPIO (General Purpose Input/Output) connector, and many third-party add-on boards attach to the Raspberry Pi to extend its functionality. These include modules like motor controllers, sensors, microcontrollers, LCDs, ADCs, DACs and more.
With all that in mind, the popular Raspberry Pi B+ was designed specifically to support add-on boards, and in 2014 the Raspberry Pi Foundation introduced ‘HATs’ (Hardware Attached on Top). A HAT is an add-on board for the B+ that conforms to a specific set of rules. HATs include a system that allows the B+ to identify a connected HAT and automatically configure the GPIOs and drivers for the board.
HAT FOR VOLTAGE MEASUREMENT
In a recent data acquisition HAT example, Measurement Computing Corp. (MCC) in August released its MCC 118 voltage measurement DAQ HAT for Raspberry Pi. The MCC 118 provides eight single-ended, 12-bit, ±10 V analog inputs with sample rates up to 100 kS/s for taking single-point or waveform voltage measurements (Figure 1). The company claims its board offers higher resolution, greater accuracy and much faster sample rates than most other data acquisition HAT add-ons for the Raspberry Pi.
Up to eight MCC HATs can be stacked onto one Raspberry Pi, providing up to 64 channels of data and a maximum aggregate throughput of 320 kS/s, says MCC. Multiple boards can be synchronized using external clock and trigger input options. The MCC 118 is the first in a series of MCC DAQ HATs.
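The stacking arithmetic is worth making concrete. The numbers below come straight from the specs MCC quotes (100 kS/s per board, 320 kS/s aggregate across a full stack); the helper function itself is only an illustration, not MCC software:

```python
BOARD_MAX_KSPS = 100.0   # MCC 118 per-board aggregate sample rate
STACK_MAX_KSPS = 320.0   # maximum aggregate across a stack of HATs
CHANNELS_PER_BOARD = 8

def per_channel_rate_ksps(boards: int) -> float:
    """Highest per-channel rate when every channel on `boards` stacked
    MCC 118 HATs shares the available aggregate throughput."""
    aggregate = min(boards * BOARD_MAX_KSPS, STACK_MAX_KSPS)
    return aggregate / (boards * CHANNELS_PER_BOARD)

# A full stack of 8 boards yields 64 channels at 5 kS/s per channel;
# a single board can scan its 8 channels at 12.5 kS/s each.
print(per_channel_rate_ksps(8))   # 5.0
print(per_channel_rate_ksps(1))   # 12.5
```

The cap shows why the stack limit matters: beyond three boards, the 320 kS/s aggregate budget, not the per-board converter, sets the per-channel rate.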
The MCC 118 also provides an external scan clock and an external digital trigger input. The 65 mm × 56.5 mm × 12 mm board has a 0 to 55°C temperature range and is powered at 3.3 V from the Raspberry Pi via the GPIO connector. The MCC 118 ships with an open-source, Raspbian-based MCC DAQ HAT Library available for C/C++ and Python. API and hardware documentation are provided, along with sample programs including a C/C++-based DataLogger and a Python-based web server and IFTTT web service.
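The DAQ HAT Library returns readings already scaled to volts, but the 12-bit, ±10 V conversion described above can be sketched in a few lines of Python. This is an idealized, uncalibrated transfer function for illustration only, not MCC's actual code:

```python
def code_to_volts(code: int, bits: int = 12, full_scale: float = 10.0) -> float:
    """Map a straight-binary ADC code (0 .. 2**bits - 1) onto a bipolar
    +/-full_scale input range. Ideal converter: no calibration applied."""
    span = 2 ** bits                          # 4096 codes for 12 bits
    return (code / span) * (2 * full_scale) - full_scale

print(code_to_volts(0))       # -10.0  (negative full scale)
print(code_to_volts(2048))    # 0.0    (mid-scale)
print(code_to_volts(4095))    # one LSB below +10 V
```

The 12-bit resolution works out to about 4.9 mV per code step across the 20 V span.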
MULTIFUNCTION I/O SOLUTION
The magic of chip integration has enabled module vendors to cram more functionality onto a single board. The result is that data acquisition modules can now serve multiple functions on a single product. Exemplifying this trend, ACCES I/O Products in early 2018 rolled out a new family of low-cost USB analog I/O modules. Called the USB-AIO Family, this line of 12- and 16-bit USB modules began with its flagship model, the USB-AIO16-16F (Figure 2). This high-speed, 16-bit multifunction analog input/output board is well-suited for precision measurement, analysis, monitoring and control.
The USB-AIO16-16F can sample inputs at speeds up to 1 MHz for the board’s 16 single-ended or 8 differential analog input channels. Standard features in the USB-AIO Family include up to four 16-bit analog outputs and 16 high-current digital I/O lines. This family of boards also includes models with slower A/D speeds and a group of 12-bit modules for less demanding applications. The OEM USB/104 version provides just the board without the enclosure and is ideal for a variety of embedded OEM applications. Users simply connect it to any available USB port.
The USB-AIO Family includes a dozen models with list prices ranging from only $374 to $879. The boards feature eight standard analog voltage input ranges, two factory current input ranges (4-20 mA or 10-50 mA), 16 factory pseudo-differential inputs and include a data sample buffer and hardware real-time calibration capability. A channel-by-channel programmable gain feature enables measurement of an assortment of large and small signals in one scan—all under software control at up to 1 MHz. The board’s data buffer and ability to trigger the A/D in real time assures synchronized sampling that is unaffected by other computer operations. The USB-AIO family is designed to be used in rugged industrial environments but is small enough to easily fit onto any desk or testing station. The boards measure just 3.550″ × 3.775″ and ship inside a steel powder-coated enclosure with an anti-skid bottom. A DIN rail mounting provision is available for installation in industrial environments.
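The channel-by-channel programmable gain idea can be sketched as follows. The scan-list structure and names here are illustrative only, not the ACCES I/O driver API; the scaling assumes an ideal bipolar straight-binary 16-bit converter:

```python
# Hypothetical scan list: each channel carries its own full-scale range,
# so a 10 V signal and a 100 mV signal can share one hardware scan.
SCAN_LIST = [
    {"channel": 0, "range_v": 10.0},   # large signal, low gain
    {"channel": 1, "range_v": 0.1},    # small signal, high gain
]

def counts_to_volts(raw: int, range_v: float, bits: int = 16) -> float:
    """Scale a bipolar straight-binary reading to volts for its range."""
    return (raw / 2 ** bits) * (2 * range_v) - range_v

# Mid-scale (32768 counts on a 16-bit converter) reads 0 V on any range.
for entry in SCAN_LIST:
    print(counts_to_volts(32768, entry["range_v"]))   # 0.0
```

The point of per-channel gain is visible in the math: on the 0.1 V range, one count is 200× smaller in volts than on the 10 V range, so small signals keep their full 16-bit resolution.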
In keeping with the theme of embracing the latest and greatest technologies, data acquisition vendors are making use of cloud and IoT functionalities. Along just those lines, Delphin Technology in June 2018 introduced its ProfiMessage D system as a new master device in its Message series. The ProfiMessage D now enables use of a PROFINET interface and communication with third-party systems via an OPC UA Server/Client. An optional WLAN interface is also available.
The ProfiMessage D device can acquire and analyze any measurement data and automatically transmit it to a cloud database (Figure 3). The device is equipped with extensive signal processing functions and interfaces and uses the latest measurement technology to make it an ideal IoT device. In this way, users gain an optimal environment to process and transmit data from their machines, systems and test stands.
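As a generic illustration of the cloud-transmission side (the field names below are hypothetical, not Delphin's actual schema or protocol), a measurement record bound for a cloud database might be serialized like this:

```python
import json
import time

def make_record(channel: str, value: float, unit: str) -> str:
    """Serialize one measurement as a JSON payload of the kind an IoT
    data acq device might post to a cloud database over HTTPS or MQTT.
    Field names are illustrative, not Delphin's schema."""
    return json.dumps({
        "timestamp": time.time(),   # epoch seconds at acquisition
        "channel": channel,
        "value": value,
        "unit": unit,
    })

print(make_record("temp_1", 21.5, "degC"))
```

A structured, self-describing payload like this is what lets third-party systems and dashboards consume the data without knowing the device's internals.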
A display is available to read out important configuration data and measurement values on-site, all without requiring PC support. The system’s embedded processor technology gives the ProfiMessage D device the resources needed for “industrial analytics.” Analysis functions are intuitive to set up and use, and simplify the computation, analysis and monitoring of data.
The ProfiMessage D features universal sensor inputs, a high level of data security and a diverse range of interfaces. The modular design is based on master and slave devices, which can be equipped with one or two I/O modules. Slave devices can be decentrally located and managed from a single master device. A large number of I/O modules make the device adaptable to different sensor types and channel counts. The I/O modules are equipped with 8–24 analog or digital inputs and outputs. Depending on the I/O module being used, each input can be individually configured for millivolts, milliamps, RTDs and thermocouples.
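For the milliamp case, the standard 4-20 mA loop scaling works as follows. This is a generic sketch of industrial current-loop conversion, not Delphin firmware; it assumes a linear transmitter:

```python
def ma_to_engineering(current_ma: float, lo: float, hi: float) -> float:
    """Convert a 4-20 mA loop current to engineering units, where 4 mA
    maps to `lo` and 20 mA maps to `hi` (linear transmitter assumed)."""
    if not 4.0 <= current_ma <= 20.0:
        # A reading below 4 mA usually means a broken loop or dead sensor.
        raise ValueError("current outside the 4-20 mA span")
    return lo + (current_ma - 4.0) / 16.0 * (hi - lo)

# Example: a pressure transmitter spanning 0-250 kPa reading 12 mA
# is exactly at mid-span.
print(ma_to_engineering(12.0, 0.0, 250.0))   # 125.0
```

The live-zero at 4 mA is the reason 4-20 mA endures: a zero-current reading is unambiguously a fault rather than a legitimate zero measurement.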
UPGRADES TO LABVIEW NXG
On the software side of data acquisition, innovation has continued as well. More than three decades after its launch, LabVIEW from National Instruments (NI) remains a key tool among data acquisition engineers. LabVIEW was designed to help engineers automate their measurement systems without having to learn the esoterica of traditional programming languages. The problem is that, as NI added capabilities and complexity to LabVIEW over the years, it became daunting for the non-expert to begin using, according to NI.
With that in mind, in 2017, NI decided to re-focus on the ease-of-use side of things by introducing its LabVIEW NXG. The new variant of the software was designed from the ground up to embrace a streamlined workflow. LabVIEW NXG is intended for bridging the gap between configuration-based software and custom programming languages. The idea is to empower domain experts to focus on the problem, not the tool.
In November 2018, NI announced a new release of LabVIEW NXG. The newest version simplifies the most time-consuming tasks in automated test and automated measurement applications, from setting up and configuring systems to developing test and measurement code and creating web-ready applications (Figure 4). These enhancements help engineers meet challenging time-to-market requirements.
New LabVIEW NXG features include the LabVIEW NXG FPGA Module. The module supports USRP (Universal Software Radio Peripheral) and Kintex-7 FlexRIO targets. It also features new workflows for faster FPGA development and debugging. LabVIEW FPGA helps you more efficiently and effectively design complex systems by providing a highly integrated development environment, IP libraries, a high-fidelity simulator and debugging features. You can create embedded FPGA virtual instruments (VIs) that combine direct access to I/O with user-defined LabVIEW logic to define custom hardware for applications such as digital protocol communication, hardware-in-the-loop simulation and rapid control prototyping. Though LabVIEW FPGA contains many built-in signal processing routines, you can also integrate your existing HDL code as well as third-party IP.
The new version of NXG includes some development environment enhancements, such as integration and software engineering tools for rapid customization of applications, which offer support for registered .NET assemblies as well as new project dependency tools. Support is provided for interfacing with MathWorks’ MATLAB software. The “Interface for MATLAB” feature lets engineers directly call MATLAB code from the LabVIEW NXG environment for complete reuse of their existing IP.
PUBLISHED IN CIRCUIT CELLAR MAGAZINE • JANUARY 2019 #342
Jeff served as Editor-in-Chief for both LinuxGizmos.com and its sister publication, Circuit Cellar magazine, from 6/2017 to 3/2022. In nearly three decades of covering the embedded electronics and computing industry, Jeff has also held senior editorial positions at EE Times, Computer Design, Electronic Design, Embedded Systems Development, and COTS Journal. His knowledge spans a broad range of electronics and computing topics, including CPUs, MCUs, memory, storage, graphics, power supplies, software development, and real-time OSes.