About Circuit Cellar Staff

Circuit Cellar's editorial team comprises professional engineers, technical editors, and digital media specialists. You can reach the Editorial Department at editorial@circuitcellar.com, @circuitcellar, and facebook.com/circuitcellar

New Frequency-Programmable, Narrow-Band Transmitter

Leading RF module designer and manufacturer Lemos International/Radiometrix recently launched a new range of flexible, frequency-programmable, RF-power-adjustable radios. The new NTX2B transmitter offers industry-leading true narrow-band FM performance and is available on user- or factory-programmable custom frequencies between 425 and 470 MHz. Superseding the popular NTX2, the new transmitter offers greater stability and improved performance thanks to its VCTCXO reference. The NTX2B enables users to dynamically reprogram the module via a microcontroller UART to other channel frequencies in the band or to store new frequency/power settings in EEPROM.

Source: Lemos International


The standard NTX2B version is a 10-mW, 25-kHz narrow-band transmitter with data rates up to 10 kbps, available on the 434.075- and 434.650-MHz European SRD frequencies and in a 25-mW version on 458.700 MHz for the UK. The NTX2B is also available with 12.5- or 20-kHz channel spacing for licensed US FCC Part 90 or legacy European telemetry/telecommand bands. An internal LDO voltage regulator enables the transmitter to operate from a 2.9- to 15-V supply at a nominal current consumption of 18 mA, dropping to less than 3 µA in power-down mode, which can be enabled within 5 ms. The NTX2B can transmit both digital and 3-VPP analog signals. Offering greater range than wideband modules, the transmitter can be paired with the new NRX2B receiver for a usable range of over 500 m, which is ideal for performance-critical, low-power wireless applications including security, sensor networks, industrial/commercial telemetry, and remote control.

Source: Lemos

 

Consumer Interest in Wearables Increases

New consumer research from Futuresource Consulting highlights a significant increase in consumers’ intentions to purchase wearable devices. Based on interviews with more than 8,000 people in the US, the UK, France, and Germany in May and October, the study found that interest in fitness trackers and smartwatches rose by 50% and 125%, respectively. However, interest in smart glasses and heart rate monitors has stalled.

Source: Futuresource


The overall wearables market has seen significant growth so far in 2014, with Futuresource forecasting full-year sales of over 51 million units worldwide. However, it’s only just warming up, and wearables sales are expected to accelerate from 2015 as new brands enter the space.

The most marked change since May is the strong growth in the number of iPhone owners intending to purchase wearable devices. iPhone owners now lead the way in all categories, particularly smartwatches, with 17% expressing an intent to purchase one in the next 12 months, up from only 6% in May 2014. This increase coincides with September’s announcement of the Apple Watch. Because Apple customers are typically among the earliest adopters of new technologies, their increasing engagement with the smartwatch category is a strong positive for the Apple Watch release in early 2015.

Source: Futuresource Consulting

Microcontroller-Based Air Quality Mapper

Raul Alvarez Torrico’s Air Quality Mapper is a portable device designed to track levels of CO2 and CO gases for constructing “Smog Maps” that help determine the healthiest routes. Featuring a Renesas YRDKRL78G13 development board, the Mapper receives location data from its GPS module, takes readings of the CO2 and CO concentrations along a specific route, and stores the data on an SD card. With the aid of PC utility software, you can upload the data to a web server and see maps of gas concentrations in a web browser.


The portable data logger prototype

In his Circuit Cellar 293 article (December 2014), Torrico notes:

My design, the Air Quality Mapper, is a data-logging, online visualization system comprising a portable data logger and a webserver for the purpose of measuring and visualizing readings of the quality of air in given areas. You take readings over a given route and then upload the data to the server, which in turn serves a webpage containing a graphical representation of all readings using Google Maps technology.

The webpage displaying CO2 measurements acquired in a session


The data logging system features a few key components: a Renesas YRDKRL78G13 development board, a Polstar PMB-648 GPS module, an SD card, and gas sensors.

The portable data logger hardware prototype is based on the Renesas YRDKRL78G13 development board, which contains a Renesas R5F100LEA 16-bit microcontroller with 64 KB of program memory, 4 KB of data flash memory, and 4 KB of RAM, running from a 12-MHz external crystal…

Air Quality Mapper system


The board itself is a bit large for a portable or hand-held device (5,100 x 5,100 mils); but on the other hand, it includes the four basic peripherals I needed for the prototype: a graphic LCD, an SD card slot, six LEDs, and three push buttons for the user interface. The board also includes other elements that could come in very handy when developing an improved version of the portable device: a three-axis accelerometer, a temperature sensor, an ambient light sensor, a 512-KB serial EEPROM, a small audio speaker, and various connection headers (not to mention other peripherals less appealing for this project: an audio mic, an infrared emitter and detector, a FET, and a TRIAC, among other things). The board includes a Renesas USB debugger, which makes it a great entry-level prototyping board for Renesas RL78/G13 microcontrollers.

For the GPS module, I used a Polstar PMB-648 with 20 parallel satellite-tracking channels. It’s advertised as a low-power device with a built-in rechargeable battery for backup memory and RTC backup. It supports the NMEA0183 v2.2 data protocol, includes a serial port interface, and has a position accuracy (2DRMS) of approximately 5 m and a velocity accuracy of 0.1 m per second without selective availability imposed. It has an acquisition time of 42 s from a cold start and 1 s from a hot start. It also includes a built-in patch antenna and accepts a 3.3- to 5-V power supply input.

The GPS module provides NMEA0183 V2.2 GGA, GSV, GSA, and RMC formatted data streams via its UART port. A stream comes out every second containing, among other things, latitude, longitude, a timestamp, and date information. In the system, this module connects to the R5F100LEA microcontroller’s UART0 port at 38,400 bps and sources the 3.3-VDC power from the YRDKRL78G13 board.
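
As a rough illustration of the kind of parsing involved (this is not Torrico’s actual firmware), the following C sketch extracts the UTC time, latitude, and longitude fields from a $GPRMC sentence of the type the module streams out each second. Field positions follow the standard NMEA 0183 RMC definition; the function name, buffer sizes, and the omission of checksum verification are illustrative choices.

#include <math.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Minimal $GPRMC parser (illustrative only): extracts UTC time, latitude,
 * and longitude.  Returns 0 on a valid fix, -1 otherwise.  Checksum
 * verification is omitted for brevity. */
static int parse_gprmc(const char *sentence, double *lat, double *lon,
                       char *utc, size_t utc_len)
{
    char buf[96];
    char *fields[13] = { 0 };
    int n = 0;

    if (strncmp(sentence, "$GPRMC,", 7) != 0)
        return -1;

    strncpy(buf, sentence, sizeof(buf) - 1);
    buf[sizeof(buf) - 1] = '\0';

    /* Split the sentence on commas; empty fields stay as empty strings. */
    for (char *p = buf; p != NULL && n < 13; n++) {
        fields[n] = p;
        p = strchr(p, ',');
        if (p != NULL)
            *p++ = '\0';
    }

    /* Field 2 is the status flag: 'A' = valid fix, 'V' = void. */
    if (n < 7 || fields[2][0] != 'A')
        return -1;

    /* Field 1: UTC time as hhmmss; fields 3-6: latitude and longitude
     * in ddmm.mmmm / dddmm.mmmm form with N/S and E/W indicators. */
    strncpy(utc, fields[1], utc_len - 1);
    utc[utc_len - 1] = '\0';

    double raw_lat = atof(fields[3]);
    double raw_lon = atof(fields[5]);
    *lat = (int)(raw_lat / 100.0) + fmod(raw_lat, 100.0) / 60.0;
    *lon = (int)(raw_lon / 100.0) + fmod(raw_lon, 100.0) / 60.0;
    if (fields[4][0] == 'S') *lat = -*lat;
    if (fields[6][0] == 'W') *lon = -*lon;
    return 0;
}

int main(void)
{
    const char *s =
        "$GPRMC,123519,A,4807.038,N,01131.000,E,022.4,084.4,230394,003.1,W*6A";
    double lat, lon;
    char utc[16];

    if (parse_gprmc(s, &lat, &lon, utc, sizeof(utc)) == 0)
        printf("UTC %s  lat %.5f  lon %.5f\n", utc, lat, lon);
    return 0;
}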

For the CO2 sensor, I used a Hanwei Electronics Co. MG-811 sensor, which has an electrolyte that, in the presence of heat, reacts in proportion to the CO2 concentration present in the air. The sensor has an internal heating element that needs to be powered with 6 VDC or 6 VAC. For small CO2 concentrations, the sensor outputs a higher voltage, and for high concentrations the output voltage decreases. Because I didn’t have proper calibration instrumentation at hand for this type of sensor, I used a very simple calibration process, just exposing the sensor to a “clean air” environment outside the city. I took an average of various readings over a 15-minute period to define a 400-PPM concentration, which is generally accepted as the average for a clean-air environment. This is not an optimal calibration method, of course, but I thought it was acceptable for getting some meaningful data for prototyping purposes. For a proper calibration of the sensor, I would’ve needed another CO2 sensing system already calibrated with a high degree of accuracy and a setup in a controllable environment (e.g., a laboratory) in order to generate and measure the amount of CO2.
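
A minimal sketch of that single-point calibration step, assuming the conditioned MG-811 output is sampled through the microcontroller’s ADC once per second over the 15-minute window, might look like the following. The adc_read_mg811_mv() and delay_ms() routines are hypothetical placeholders, not drivers from the actual firmware.

#include <stdint.h>

#define CAL_SAMPLES    900U   /* one sample per second for 15 minutes        */
#define CLEAN_AIR_PPM  400U   /* assumed CO2 level of "clean" outdoor air    */

/* Hypothetical HAL calls: conditioned MG-811 output in millivolts, and a
 * blocking millisecond delay.  These stand in for the real firmware drivers. */
extern uint16_t adc_read_mg811_mv(void);
extern void delay_ms(uint32_t ms);

/* Average the sensor output over the calibration window and return the
 * millivolt reading that will be treated as the 400-PPM reference point. */
uint16_t mg811_calibrate_clean_air(void)
{
    uint32_t acc = 0;

    for (uint16_t i = 0; i < CAL_SAMPLES; i++) {
        acc += adc_read_mg811_mv();
        delay_ms(1000);        /* 1-s spacing -> 15-minute averaging window */
    }
    return (uint16_t)(acc / CAL_SAMPLES);
}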

The sensor provides an output voltage between 30 and 50 mV, and due to its high output impedance, the signal must be properly conditioned with an op-amp. So, I used a Microchip Technology MCP6022 op-amp in a noninverting configuration with a gain of 9.2.
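
For reference, the gain of a noninverting stage is G = 1 + RF/RG, so a gain of 9.2 corresponds to a feedback-to-ground resistor ratio of 8.2 (for example, 82 kΩ and 10 kΩ; the article does not list the actual resistor values, so these are illustrative only). At that gain, the sensor’s 30-to-50-mV output maps to roughly 276 to 460 mV at the microcontroller’s ADC input.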

You can read the complete article in Circuit Cellar 293 (December 2014).

Liquid Flow Sensor Wins Innovation Prize

Sensirion recently won the DeviceMed OEM-Components innovation prize at the Compamed 2014 exhibition. The LD20-2000T disposable liquid flow sensor for medical devices features an integrated thermal sensor element in a microchip. The pinhead-sized device is based on Sensirion’s CMOSens technology.

The LD20-2000T disposable liquid flow sensor provides liquid flow measurement capability from inside medical tubing (e.g., a catheter) in a low-cost sensor, suitable for disposable applications. As a result, you can measure drug delivery from an infusion set, an infusion pump, or other medical device in real time.

A microchip inside the disposable sensor measures the flow inside a fluidic channel. Flow rates from 0 to 420 ml/h and beyond can be measured with approximately 5% accuracy. Inert, medical-grade wetted materials ensure sterile operation with no contamination of the fluid. The straight, open flow channel with no moving parts provides high reliability. Using Sensirion’s CMOSens technology, the fully calibrated signal is processed and linearized on the 7.4-mm² chip.

Source: Sensirion

Data Center Power & Cost Management

Computers drive progress in today’s world. Both individuals and industry depend on a spectrum of computing tools. Data centers are at the heart of many computational processes, from communication to scientific analysis. They also consume over 3% of the total power in the United States, and this amount continues to increase.[1]

Data centers service jobs submitted by their customers on a shared resource: the data center’s servers. Data centers and their customers negotiate a service-level agreement (SLA), which establishes the average expected job completion time. Servers are allocated to each job and must satisfy the job’s SLA. Job-scheduling software already provides some solutions for budgeting data center resources.

Data center construction and operation include fixed and accrued costs. Initial building expenses, such as purchasing and installing computing and cooling equipment, are one-time costs and are generally unavoidable. An operational data center must power this equipment, contributing an ongoing cost. Power management and the associated costs define one of the largest challenges for data centers.

To control these costs, the future of data centers lies in active participation in advanced power markets. More efficient cooling also provides cost-saving opportunities, but it requires infrastructure updates, which are costly and impractical for existing data centers. Fortunately, existing physical infrastructure can support participation in demand response programs, such as peak shaving, regulation services (RS), and frequency control. In demand response programs, consumers adjust their power consumption based on real-time power prices. The most promising mechanism for data center participation is RS.

Independent system operators (ISOs) manage demand response programs like RS. Each ISO must balance the power supply with the demand, or load, on the power grid in the region it governs. RS program participants provide necessary reserves when demand is high or consume more energy when demand is lower than the supply. The ISO communicates this need by transmitting a regulation signal, which the participant must follow with minimal error. In return, ISOs provide monetary incentives to the participants.

This essay appears in Circuit Cellar #293 (December 2014).

 
Data centers are ideal participants for demand response programs. A single data center requires a significant amount of power from the power grid. For example, the Massachusetts Green High-Performance Computing Center (MGHPCC), which opened in 2012, has a power capacity of 10 MW, equivalent to the demand of as many as 10,000 homes (www.mghpcc.org). Additionally, some workload types are flexible; jobs can be delayed or sped up within the given SLA.

Data centers have the ability to vary power consumption based on the ISO regulation signal. Server sleep states and dynamic voltage and frequency scaling (DVFS) are power modulation techniques. When the regulation signal requests lower power consumption from participants, data centers can put idle servers to sleep. This successfully reduces power consumption but is not instantaneous. DVFS performs finer power variations; power in an individual server can be quickly reduced in exchange for slower processing speeds. Demand response algorithms for data centers coordinate server state changes and DVFS tuning given the ISO regulation signal.
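
As a conceptual sketch only (not an algorithm from this essay or from any production scheduler), the following C program shows one way that coordination could be structured: each new regulation-signal power target is met coarsely by choosing how many servers stay awake and then finely by scaling the awake servers with DVFS. The server count, per-server power figures, and scaling limits are assumed values.

#include <stdio.h>

#define N_SERVERS      1000
#define P_ACTIVE_W    300.0   /* assumed power of an awake server at full speed */
#define P_SLEEP_W      15.0   /* assumed power of a sleeping server             */
#define DVFS_MIN_SCALE  0.6   /* assumed lowest DVFS power scaling factor       */

/* Given a power target derived from the ISO regulation signal, decide how
 * many servers stay awake and what DVFS scaling they run at.  Sleep states
 * provide the coarse (slow) adjustment; DVFS provides the fine (fast) one. */
static void plan_power(double target_w, int *awake, double *dvfs_scale)
{
    /* Coarse step: awake-server count if they all ran at full power. */
    int n_awake = (int)(target_w / P_ACTIVE_W);
    if (n_awake > N_SERVERS) n_awake = N_SERVERS;
    if (n_awake < 1)         n_awake = 1;

    /* Fine step: scale the awake servers with DVFS to hit the remainder. */
    double sleep_w = (N_SERVERS - n_awake) * P_SLEEP_W;
    double scale   = (target_w - sleep_w) / (n_awake * P_ACTIVE_W);
    if (scale > 1.0)            scale = 1.0;
    if (scale < DVFS_MIN_SCALE) scale = DVFS_MIN_SCALE;

    *awake      = n_awake;
    *dvfs_scale = scale;
}

int main(void)
{
    /* Example regulation-signal power targets (in watts) arriving over time. */
    double targets[] = { 250e3, 180e3, 300e3, 120e3 };

    for (int i = 0; i < 4; i++) {
        int awake;
        double scale;
        plan_power(targets[i], &awake, &scale);
        printf("target %.0f kW -> %d servers awake, DVFS scale %.2f\n",
               targets[i] / 1e3, awake, scale);
    }
    return 0;
}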

Accessing data from real data centers is a challenge. Demand response algorithms are therefore tested via simulations of simplified data center models. Before data centers can participate in RS, these algorithms must account for the complexity of real data centers.

Data collection within data center infrastructure enables more detailed models. Monitoring aids performance evaluation, model design, and operational changes to data centers. As part of my work, I analyze power, load, and cooling data collected from the MGHPCC. Sensor integration for data collection is essential to the future of data center power and cost management.

The power grid also benefits from data center participation in demand response programs. Renewable energy sources, such as wind and solar, are more environmentally friendly than traditional fossil fuel plants. However, the intermittent nature of such renewables creates a challenge for ISOs to balance the supply and load. Data center participation makes larger scale incorporation of renewables into the smart grid possible.

The future of data centers requires the management of power consumption in order to control costs. Currently, RS provides the best opportunities for existing data centers. According to preliminary results, successful participation in demand response programs could yield monetary savings around 50% for data centers.[2]


[1] J. Koomey, “Growth in Data Center Electricity Use 2005 to 2010,” Analytics Press, Oakland, CA, August 1, 2010, www.analyticspress.com/datacenters.html.

[2] H. Chen, M. Caramanis, and A. K. Coskun, “The Data Center as a Grid Load Stabilizer,” Proceedings of the Asia and South Pacific Design Automation Conference (ASP-DAC), pp. 105–112, January 2014.


Annie Lane studies computer engineering at Boston University, where she performs research as part of the Performance and Energy-Aware Computing Lab (www.bu.edu/peaclab). She received the Clare Boothe Luce Scholar Award in 2014. Annie received additional funding from the Undergraduate Research Opportunity Program (UROP) and Summer Term Alumni Research Scholars (STARS). Her research focuses on power and cost optimization strategies in data centers.