3D Tool Strengthens Marriage of PCB Design with Mechanical Design

Cadence Design Systems has announced its Cadence Sigrity 2018 release, which includes new 3D capabilities that enable PCB design teams to accelerate design cycles while optimizing cost and performance. According to the company, a 3D design and 3D analysis environment integrating Sigrity tools with Cadence Allegro technology provides a more efficient and less error-prone solution than current alternatives using third-party modeling tools, saving days of design cycle time and reducing risk.

In addition, a new 3D Workbench methodology bridges the gap between the mechanical and electrical domains, allowing product development teams to analyze signals that cross multiple boards quickly and accurately.

Since many high-speed signals cross PCB boundaries, effective signal integrity analysis must encompass the signal source and destination die, as well as the intervening interconnect and return path including connectors, cables, sockets and other mechanical structures.

Traditional analysis techniques utilize a separate model for each piece of interconnect and cascade these models together in a circuit simulation tool, which can be an error-prone process due to the 3D nature of the transition from the PCB to the connector. In addition, since the 3D transition can make or break signal integrity, at very high speeds designers also want to optimize the transition from the connector to the PCB or the socket to the PCB.

According to the company, the Sigrity 2018 release enables designers to take a holistic view of their system, extending design and analysis beyond the package and board to also include connectors and cables. An integrated 3D design and 3D analysis environment lets PCB design teams optimize the high-speed interconnect of PCBs and IC packages in the Sigrity tool and automatically implement the optimized PCB and IC package interconnect in Allegro PCB, Allegro Package Designer or Allegro SiP Layout without the need to redraw.
Until now, this has been an error-prone, manual effort requiring careful validation. By automating this process, the Sigrity 2018 release reduces risk, saves designers hours of re-drawing and re-editing and can save days of design cycle time by eliminating editing errors not found until the prototype reaches the lab. This reduces prototype iterations and potentially saves hundreds of thousands of dollars by avoiding re-spins and schedule delays.

A new 3D Workbench utility available with the Sigrity 2018 release bridges the mechanical components and the electronic design of PCBs and IC packages, allowing connectors, cables, sockets and the PCB breakout to be modeled as one, with no double counting of any of the routing on the board. Interconnect models are divided at a point where the signals are more 2D in nature and predictable. 3D extraction is performed only where it is needed, while fast, accurate 2D hybrid-solver extraction handles the remaining structures; the interconnect models are then stitched back together, so full end-to-end channel analysis of signals crossing multiple boards can be performed efficiently and accurately.

In addition, the Sigrity 2018 release offers Rigid-Flex support for field solvers such as the Sigrity PowerSI technology, enabling robust analysis of high-speed signals that pass from rigid PCB materials to flexible materials. Design teams developing Rigid-Flex designs can now use the same techniques previously used only on rigid PCB designs, creating continuity in analysis practices while PCB manufacturing and material processes continue to evolve.

Cadence | www.cadence.com

Tool Environment Upgrade Boosts Efficiency of Multi-Board PCB Designs

The latest release of Zuken’s system-level PCB design environment, CR-8000, includes several enhancements aimed at ensuring performance, quality and manufacturability. The CR-8000 family of applications spans the complete PCB engineering lifecycle: from system level planning through implementation and design for manufacturability. The CR-8000 environment also supports 3D IC packaging and chip/package/board co-design.

The focus of CR-8000 2018 is on enabling efficient front-loading of design constraints and specifications to the design creation process, coupled with sophisticated placement and routing capabilities for physical layout. This will increase efficiency and ensure quality through streamlined collaboration across the PCB design chain.
Front-loading of design intent from Design Gateway to Design Force has been achieved by adding an enhanced, unified constraint browser for both applications. This enables hardware engineers to assign topology templates, modify differential signals and assign clearance classes to individual signals. Using a rule stack editor during the circuit design phase, hardware engineers can now load design rules that include differential pair routing and routing width stacks directly from the design rule library into their schematic. Here they can modify and assign selected rules for improved crosstalk and differential pair control. Finally, an enhanced component browser enables component variants to be managed in the schematic and assigned in a user-friendly table.

Manual routing is supported by a new auto complete & route function that layout designers can use to complete manually routed traces in an automated way. Designers also have the option to look for paths on different layers while automatically inserting vias.

A new bus routing function allows layout designers to sketch paths for multiple nets to be routed over dense areas. An added benefit is the routing of individual signals to the correct signal length as per the hardware engineer’s front-loaded constraints, to meet timing and skew budgets. If modifications to fully placed and routed boards are required, an automatic re-route function allows connected component pins to remain connected with a simple reroute operation during the move process. In all operations, clearance and signal length specifications are automatically controlled and adjusted by powerful algorithms.

To address manufacturing requirements for high-speed design, the automatic stitching of vias in poured conductive areas can be specified in comprehensive detail: for example, inside the area outline, along the perimeter outline, or both. Design-for-manufacturing (DFM) checks have been enhanced to include non-conductor items, such as silkscreen and assembly-drawing reference designators. A design rule check makes sure component reference designators are listed in the same order as the parts, for visual inspection accuracy.

As many product engineers do not work with EDA tools, intelligent PDF documentation is required, especially in 3D. Design Force now supports creation of PRC files commonly used for 3D printing. The PRC files can be opened in PDF authoring applications such as Adobe Acrobat, where they are realized as a 3D PDF file complete with 3D models and bookmarks to browse the design.

Zuken Americas | us.zuken.com

Siemens Acquires Austemper Design Systems

Siemens has entered into an agreement to acquire Austin, Texas-based Austemper Design Systems, a startup software company that offers analysis, auto-correction and simulation technology. This technology allows customers to test and harden IC designs for functional safety in applications such as automotive, industrial and aerospace systems. These are systems where functional safety and high reliability are mandatory for compliance to safety standards like ISO 26262.

ICs in these applications require three types of functional safety verification: for systematic faults, malicious faults and random hardware faults. Mentor’s existing Questa software (shown) is a leading technology for functional verification of systematic faults and provides solutions for verification of malicious faults for IC security. The software technology from Austemper adds state-of-the-art safety analysis, auto-correction and fault simulation technology to address random hardware faults. This is expected to complement Mentor’s existing functional safety offerings, including its Tessent product suite and Veloce platform.

Design teams at leading semiconductor and IP companies use Austemper’s innovative technology to analyze the register-transfer level (RTL) code versions of their designs for faults and vulnerabilities. It can automatically correct and harden vulnerable areas, subsequently performing fault simulation to ensure the design is hardened and no longer susceptible to errors. Moreover, the Austemper technology performs fault simulation orders of magnitude faster than competing solutions.

Siemens will integrate Austemper’s technology into Mentor’s IC verification portfolio as part of Siemens’ larger digitalization strategy, leveraging Siemens’ world-wide sales channel to make this functional safety solution available to companies developing digital twins of safety-critical systems at the heart of autonomous vehicles, smart cities and industrial equipment in Factory 4.0.

Mentor, A Siemens Business | www.mentor.com

Component Tolerance

Accuracy Unmasked

We take for granted sometimes that the tolerances of our electronic components fit the needs of our designs. In this article, Robert takes a deep look into the subject of tolerances, using the simple resistor as an example. He goes through the math to help you better understand accuracy and drift along with other factors.

By Robert Lacoste

One of the last projects I worked on with my colleagues was a kind of high-precision current meter. It turned out to be far more difficult than anticipated, even with our combined experience totaling almost 100 years. Maybe this has happened with your projects too: You discover that, even when you’re not looking for top performance out of your electronic components, the accuracy and stability of those components can be pernicious. My topic this month is examining component tolerances. And, for simplicity, I will focus on the simplest possible electronic device: a resistor.

Figure 1: A very simple voltage divider. With these values, Uout will be 1 V with Uin = 100 V.

Let’s start with a basic application. Imagine that you have to design a voltage divider with a ratio of 1/100 (Figure 1). I will assume that the source impedance is very low and that the load connected on the output draws no current at all. With those parameters the calculations are very easy. You just need to know Ohm’s Law. Because the resistors are in series, the current circulating through the two resistors is:

I = Uin / (R1 + R2)

Similarly, the output voltage is:

Uout = R2 × I

Given that the current I is the same in both equations, we get:

Uout / Uin = R2 / (R1 + R2)

This circuit is indeed a voltage divider, with a ratio of R2/(R1+R2). We want a ratio of 1/100, so one resistor can be fixed arbitrarily and the second easily calculated. For example, R1 = 9,900 Ω and R2 = 100 Ω will do the job, as:

R2 / (R1 + R2) = 100 / (9,900 + 100) = 100 / 10,000 = 1/100

Of course, you can easily simulate such a circuit with any SPICE-based circuit simulator if you wish. I personally used Proteus from Labcenter to draw and simulate the small schematic provided in Figure 1, and the output voltage is 1 V with 100 V applied on the input, as expected. As usual, I encourage you to reproduce these small examples with your preferred simulator, for example the free LTspice.

Now let’s talk about accuracy. You want your divider to be as precise as possible and therefore you want to buy reasonably accurate resistors. But what if your budget is constrained? Will you use a high-accuracy resistor for R1 (9,900 Ω)? Or for R2 (100 Ω)? Or for both? The correct answer is both. In that case, a 1% error on either R1 or R2 gives close to a 1% error in the output voltage, as shown in Figure 2. Even though R1 has a less common value than R2 (9,900 Ω vs. 100 Ω), the accuracy of each is just as critical.

Figure 2: A 1% error on either the top or the bottom resistor will induce a roughly 1% error on the output. That would not be the case for other division ratios.

Maybe you think this is too obvious? In that case I will give you another exercise: What happens with a divide-by-2 circuit using two resistors of the same value? Do the calculation or simulate it and you will find that both resistors still have the same impact on accuracy. But now a 1% error on one of the resistors has only a 0.5% impact on the output voltage. That means you could buy slightly less expensive resistors for the same overall precision! In fact, the higher the division ratio, the higher the impact of each resistor on the overall accuracy.
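
This sensitivity is easy to check numerically. The short C++ sketch below is a minimal illustration, not code from the article: it perturbs one resistor at a time by 1% and prints the resulting output error for the 1/100 and 1/2 dividers discussed above, reporting roughly 1% for the first case and roughly 0.5% for the second.

#include <cstdio>

// Perturb one resistor at a time by 1% and report the divider output error.
static double ratio(double r1, double r2) { return r2 / (r1 + r2); }

int main() {
    const double tol = 0.01;  // 1% resistor tolerance
    const struct Case { double r1, r2; const char *name; } cases[] = {
        {9900.0, 100.0, "1/100 divider"},
        {1000.0, 1000.0, "1/2 divider"},
    };
    for (const Case &c : cases) {
        double nom = ratio(c.r1, c.r2);
        double r1_high = ratio(c.r1 * (1.0 + tol), c.r2);  // only R1 is 1% high
        double r2_high = ratio(c.r1, c.r2 * (1.0 + tol));  // only R2 is 1% high
        std::printf("%s: nominal %.5f, R1 +1%% gives %+.2f%%, R2 +1%% gives %+.2f%%\n",
                    c.name, nom,
                    100.0 * (r1_high - nom) / nom,
                    100.0 * (r2_high - nom) / nom);
    }
    return 0;
}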

E Series Resistors

Let’s go back to the 1/100 divider example. If you want to build it and look for a 9,900-Ω resistor, you will have some difficulties because nobody sells them …

Read the full article in the April issue of Circuit Cellar (#333).

Don’t miss out on upcoming issues of Circuit Cellar. Subscribe today!
Note: We’ve made the October 2017 issue of Circuit Cellar available as a free sample issue. In it, you’ll find a rich variety of the kinds of articles and information that exemplify a typical issue of the current magazine.

Updated LiveLink for SOLIDWORKS

COMSOL recently updated LiveLink for SOLIDWORKS. An add-on to the COMSOL Multiphysics software, LiveLink for SOLIDWORKS enables a CAD model to be synchronized between the two software packages. Furthermore, it provides easy access for running simulation apps that work in sync with SOLIDWORKS software. You can build apps with the Application Builder to let users analyze and modify a geometry from SOLIDWORKS software right from the app’s interface. Users can browse and run apps from within the SOLIDWORKS interface, including those that use a geometry synchronized with SOLIDWORKS software.

The update includes a new Bike Frame Analyzer app in the Application Libraries. It leverages LiveLink for SOLIDWORKS to interactively update the geometry while computing the stress distribution in the frame subject to various loads and constraints. You can use the app to easily test different configurations of a bike frame for parameters such as dimensions, materials, and loads. The app computes the stress distribution and the deformation of the frame based on the structural dimensions, materials, and loads/constraints of the bike frame.

Source: COMSOL

New Embedded Solution for Debugging FPGAs

Exostiv Labs recently announced that its EXOSTIV solution for Intel FPGAs will be available in December 2016. Providing up to 200,000 times more visibility on an FPGA than other solutions, EXOSTIV enables the debugging and verification of FPGA board prototypes at speed of operation. It provides extended visibility on internal nodes over long periods of time with minimal impact on the FPGA resources. Thus, you can discover issues related to complex interactions between numerous IPs when simulation is impracticable.

EXOSTIV for Intel FPGAs will be released in December 2016 with support for Arria 10 devices first. Pricing starts at $5,100.

Source: Exostiv Labs 

Thermoelectric Module Simulation Software Simplifies Design

To decrease thermal design time for design engineers, Laird recently improved the algorithms in the AZTEC thermoelectric module (TEM) simulation program. The AZTEC product selection tool enables you to specify input variables based on application attributes and then review the software’s analysis outputs, so you can select the best TEM by easily comparing TEM datasheets. In addition, the software includes an analysis worksheet for simulating TEM device functionality.

The AZTEC product selection tool—which is available at Lairdtech.com—uses a variety of input variables (i.e., heat load, ambient and control temperatures, input voltage requirement and thermal resistance of hot side heat exchangers) to recommend appropriate TEMs to meet your application’s needs. Laird updated the software with its newest TEM product offerings.

The Analysis Worksheet Tool simulates expected thermoelectric output parameters based on a given set of thermal and electrical operating points. The included output parameters are:

  • the hot and cold side temperatures of the TEM
  • heat pumped at the cold surface of the TEM
  • coefficient of performance (COP)
  • input power requirements

The total hot side heat dissipation is also calculated.
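
For context, the relationships behind these outputs are simple energy bookkeeping: the COP is the heat pumped at the cold side divided by the electrical input power, and the hot-side dissipation is their sum. The C++ sketch below only illustrates that arithmetic; it is not the AZTEC worksheet, and the numbers are assumed examples.

#include <cstdio>

int main() {
    const double Qc  = 25.0;  // heat pumped at the cold surface, W (assumed example)
    const double Pin = 40.0;  // electrical input power to the TEM, W (assumed example)

    const double cop = Qc / Pin;  // coefficient of performance
    const double Qh  = Qc + Pin;  // total heat dissipated on the hot side, W

    std::printf("COP = %.2f, hot-side heat dissipation = %.1f W\n", cop, Qh);
    return 0;
}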

The included Qc Estimating Worksheet calculates an estimate on the heat load for device (spot) or chamber (volume) cooling applications. Computations are made based on the input (e.g., temperature requirements, volumetric dimensions, insulation thickness, material properties, and active heat load) you provide.
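
A rough version of such an estimate for a chamber-cooling case simply adds the active heat dissipated inside to the passive conduction through the insulation. The sketch below illustrates that idea only; it is not Laird’s worksheet, and every value in it is an assumed example.

#include <cstdio>

int main() {
    // All values below are assumed examples for a small insulated chamber.
    const double k       = 0.033;  // insulation thermal conductivity, W/(m*K)
    const double area    = 0.5;    // total insulated surface area, m^2
    const double thick   = 0.025;  // insulation thickness, m
    const double deltaT  = 20.0;   // ambient minus chamber temperature, K
    const double activeQ = 5.0;    // heat dissipated by the device inside, W

    const double passiveQ = k * area * deltaT / thick;  // conduction through the walls, W
    const double totalQc  = activeQ + passiveQ;         // heat the TEM must pump, W

    std::printf("passive load = %.1f W, total Qc estimate = %.1f W\n", passiveQ, totalQc);
    return 0;
}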

Source: Laird

New Software for Creating Simulation Apps

COMSOL recently released COMSOL Multiphysics 5.2 with the latest version of the Application Builder and COMSOL Server products. The new software enables organizations to share simulation work, from design and development to production and testing. This version of COMSOL Multiphysics and COMSOL Server simulation software environment provides new features, improved stability, and faster execution. Notable upgrades to the Application Builder available in COMSOL Multiphysics include the new Editor Tools for easy creation of user interface components, commands for dynamic updates of graphics, and more control over the deployment of simulation apps. Numerous updates, new features, and simulation application examples are also available for the add-on electrical, mechanical, fluid, and chemical products.

The new software makes it possible to update graphics while running an app. The app designer can present app users with different plots while solving; this takes them through the progression of the solution process and presents them, for example, with geometry, mesh, and solution plots. The app designer can also customize the graphics toolbar with new buttons and include camera movements.

To demonstrate the power of the Application Builder, a wide variety of new apps have been added to the extensive Application Libraries. They cover membrane dialysis, water treatment, thermoelectric cooling, heat exchangers, touchscreen design, magnetic prospecting, piezoacoustic transducers, muffler design, MEMS sensors, and pressure vessels.

Version 5.2 also introduces enhancements to the existing functionalities of COMSOL Multiphysics and its add-on products. Users will benefit from more flexible license management. Users of the Structural Mechanics Module and the AC/DC Module will benefit from the new External Materials functionality, which allows materials to be algorithmically defined by shared library files written in the C language. The most prominent use of this new functionality will be for nonlinear materials that include hysteresis (history dependency) and irreversibility effects.

Source: COMSOL

Software-Only Hardware Simulation

Simulating embedded hardware in a Windows environment can significantly reduce development time. In this article, Michael Melkonian provides techniques for the software-only simulation of embedded hardware. He presents a simple example of an RTOS-less embedded system that uses memory-mapped I/O to access a UART-like peripheral to serially poll a slave device. The simulator is capable of detecting bugs and troublesome design flaws.

Melkonian writes:

In this article, I will describe techniques for the software-only simulation of embedded hardware in the Windows/PC environment. Software-only simulation implies an arrangement with which the embedded application, or parts of it, can be compiled and run on the Windows platform (host) talking to the software simulator as opposed to the real hardware. This arrangement doesn’t require any hardware or tools other than a native Windows development toolset such as Microsoft Developer Studio/Visual C++. Importantly, the same source code is compiled and linked for both the host and the target. It’s possible and often necessary to simulate more complex aspects of the embedded target such as interrupts and the RTOS layer. However, I will illustrate the basics of simulating hardware in the Windows environment with an example of an extremely simple hypothetical target system (see Figure 1).

Figure 1: There is a parallel between the embedded target and host environment. Equivalent entities are shown on the same level.

Assuming that the source code of the embedded application is basically the same whether it runs in Windows or the embedded target, the simulation offers several advantages. You have the ability to develop and debug device drivers and the application before the hardware is ready. An extremely powerful test harness can be created on the host platform, where all code changes and additions can be verified prior to running on the actual target. The harness can be used as a part of software validation.

Furthermore, you have the ability to test conditions that may not be easy to test using the real hardware. In the vast majority of cases, debugging tools available on the host are far superior to those offered by cross development tool vendors. You have access to runtime checkers to detect memory leaks, especially for embedded software developed in C++. Lastly, note that where the final system comprises a number of CPUs/boards, simulation has the additional advantage of simulating each target CPU via a single process on a multitasking host.

FIRST THINGS FIRST
Before you decide to invest in simulation infrastructure, there are a few things to consider. For instance, when the target hardware is complex, the software simulator becomes a fairly major development task. Also, consider the adequacy of the target development tools. This especially applies to debuggers. The absence, or insufficient capability, of the debugger on the target presents a strong case for simulation. When delivery times are more critical than the budget limitations and extra engineering resources are available, the additional development effort may be justified. The simulator may help to get to the final product faster, but at a higher cost. You should also think about whether or not it’s possible to cleanly separate the application from the hardware access layer.

Remember that when exact timings are a main design concern, the real-time aspects of the target are hard to simulate, so the simulator will not help. Moreover, when the embedded application’s complexity is relatively minor compared to that of the hardware drivers, the simulator may not be justified. However, when the application is complex and sitting on top of fairly simple hardware, the simulator can be extremely useful.

You should also keep in mind that when it’s likely that the software application will be completed before the hardware delivery date, there is a strong case for simulation …

SOFTWARE DESIGN GUIDE
Now let’s focus on what makes embedded software adaptable for simulation. It’s hardly surprising that the following guidelines closely resemble those for writing portable code. First, you need a centralized access mechanism to the hardware (read_hw and write_hw macros). Second, the application code and device driver code must be separated. Third, you must use a thin operating level interface. Finally, avoid using the nonstandard add-ons that some cross-compilers may provide.
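
To make the first guideline concrete, here is one way such a centralized access layer can be arranged. This is a sketch under assumed details rather than code from the article: the read_hw and write_hw macros are the names Melkonian mentions, but the simulator stub and the UART register addresses below are hypothetical.

#include <cstdint>

#ifdef TARGET_BUILD
// On real hardware the macros resolve to memory-mapped register accesses.
#define read_hw(addr)        (*(volatile uint32_t *)(addr))
#define write_hw(addr, val)  (*(volatile uint32_t *)(addr) = (val))
#else
// On the host they call into the software simulator. Here the "simulator"
// is a trivial stub so the example is self-contained.
#include <cstdio>
#include <map>
static std::map<uintptr_t, uint32_t> sim_regs;
static uint32_t sim_read(uintptr_t addr) {
    if (addr == 0x40001000u) return 0x1u;  // UART status: TX always ready
    return sim_regs[addr];
}
static void sim_write(uintptr_t addr, uint32_t val) {
    sim_regs[addr] = val;
    std::printf("sim write: 0x%08lx <- 0x%02x\n", (unsigned long)addr, val);
}
#define read_hw(addr)        sim_read((uintptr_t)(addr))
#define write_hw(addr, val)  sim_write((uintptr_t)(addr), (val))
#endif

// Driver code is written once against the macros and does not know whether
// it is talking to real registers or to the simulator.
#define UART_STATUS  0x40001000u
#define UART_TXDATA  0x40001004u
#define TX_READY     0x1u

void uart_send_byte(uint8_t b) {
    while ((read_hw(UART_STATUS) & TX_READY) == 0) {
        // poll until the (real or simulated) UART can accept a byte
    }
    write_hw(UART_TXDATA, b);
}

#ifndef TARGET_BUILD
int main() {  // host-side test harness
    uart_send_byte('O');
    uart_send_byte('K');
    return 0;
}
#endif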

Download the entire article: M. Melkonian, “Software-Only Hardware Simulation,” Circuit Cellar 164, 2004.

The Future of Engineering Research & Systems Modeling

So many bytes, so little time. Five years ago, I found myself looking for a new career. After 20 years in the automotive sector, the economic downturn hit home and my time had come. I was lucky enough to find a position at the University of Notre Dame designing and building lab instrumentation and data acquisition equipment in the Department of Civil and Environmental Engineering & Earth Sciences, and teaching microprocessor programming in the evenings at Ivy Tech Community College. The transition from industry to the academic world has been challenging and rewarding. Component and system modeling using computer simulation is an integral part of all engineering disciplines. Much of the industry’s simulation software started out in a university computer lab.

A successful computer simulation of a physical phenomenon has several requirements. The first requirement is a stable model based on a set of equations relating to the physics and scale of the event. For complex systems, this model may not exist, and a simplified model may be used to approximate the event as closely as possible. Assumptions are made where data is scarce. The second requirement is a set of initial conditions that all the equation variables need to start the simulation. These values are usually determined by running real-world experiments and capturing a “snapshot” of the event at a specific time. The quality of this data depends on the technology available at the time. The technology behind sensors and data acquisition for these experiments is evolving at an incredible rate. Some sensors that may have cost $500 ten years ago are available now for $5 and have been miniaturized to one-tenth of their original size to fit into a cell phone or smart band. Equipment that was too large to be used outside a lab environment is now pocket sized and portable. Researchers are taking advantage of this, and taking much more data than ever imagined.

So how will this affect the future of simulation? Multicore processors and distributed computing are allowing researchers to run more simulations and get results quicker. Our world has become Internet driven and people want data immediately, so data must become available as close to real-time as possible. As more and more sensors become wireless, low cost, energy efficient, and “smart” due to the Internet of Things movement, empirical data is available from places never before conceived. Imagine the possible advancements in weather modeling and forecasting if every cell phone in the world sent temperature, humidity, barometric pressure, GPS, and light intensity data to a cloud database automatically. More sensors lead to higher simulation resolution and more accuracy.

A popular saying, “garbage in = garbage out,” still applies, and is the bane of the Internet. Our future programmers must be able to sift through all of this new data and determine the good from the bad. Evil hackers enjoy destroying databases, so security is a major concern. Some of this new technology that could be useful in research is being rejected by the public due to criminal use. For example, a UAV “drone” that can survey a farmer’s crop can also deliver contraband or cause havoc at an airport or sporting event. While these issues are tackled in the courtroom and at the FAA, researchers are waiting to take more data.

Simulation is still only a guess at what may happen under specific conditions based on assumptions of how our world works. The advancements in sensor and data acquisition technology will continue to improve the accuracy of these guesses, as long as we can depend on the reliability of the input sources and keep the evil hackers out of the databases. Schools still need to train students on how to determine good data from questionable data. The terabyte question for the future of simulation is whether or not we will be able to find the data we need in the format we need, searching through all these new data sources in less time than it would take to run the original experiments ourselves. So many bytes, so little time.

R. Scott Coppersmith earned a BSc in Electrical Engineering at Michigan Technological University. He held several engineering positions in the automotive industry from the late 1980s until 2010, when he joined the University of Notre Dame’s Civil Engineering and Geological Sciences department as a Research Engineer to help build an Environmental Fluid Dynamics laboratory and assist students, faculty, and visiting researchers with their projects. Scott also teaches a variety of engineering courses (e.g., Intro to Microcontrollers and Graphic Communication for Manufacturing) at Ivy Tech Community College.

Streamlined Touchscreen Design with Application Builder and COMSOL Server

Cypress Semiconductor R&D engineers are creating simulation apps that streamline their touchscreen design processes. To do so, they’re sharing their simulation expertise with colleagues using the Application Builder and COMSOL Server, released with COMSOL Multiphysics simulation software version 5.

With the Application Builder, engineers can create ready-to-use simulation applications that can be implemented across departments, including by product development, sales, and customer service. The Application Builder enables simulation experts to build intuitive simulation apps based on their models directly within the COMSOL environment. COMSOL Server lets them share these apps with colleagues and customers around the globe.

To incorporate advances in touchscreen technology and embedded system products, Cypress simulation engineers use COMSOL for research and design initiatives. Their touchscreens are used in phones and MP3 devices, industrial applications, and more.

Source: COMSOL

 

Fast Quad IF DAC

The AD9144 is a four-channel, 16-bit, 2.8-GSPS DAC that supports high data rates and ultra-wide signal bandwidth to enable wideband and multiband wireless applications. The DAC features 82-dBc spurious-free dynamic range (SFDR) and a 2.8-GSPS maximum sample rate, which permits multicarrier generation up to the Nyquist frequency.

With –164-dBm/Hz noise spectral density, the AD9144 enables higher dynamic range transmitters to be built. Its low-spurious and low-distortion design techniques provide high-quality synthesis of wideband signals from baseband to high intermediate frequencies. The DAC features a JESD204B eight-lane interface and low inherent latency of less than two DAC clock cycles. This simplifies hardware and software system design while permitting multichip synchronization.

The combination of programmable interpolation rate, high sample rates, and low power at 1.5 W provides flexibility when choosing DAC output frequencies. This is especially helpful in meeting four- to six-carrier Global System for Mobile Communications (GSM) transmission specifications and other communications standards. For six-carrier GSM intermodulation distortion (IMD), the AD9144 operates at 77 dBc at 75-MHz IF. Operating with the on-chip phase-locked loop (PLL) at a 30-MHz DAC output frequency, the AD9144 delivers a 76-dB adjacent-channel leakage ratio (ACLR) for four-carrier Wideband Code Division Multiple Access (WCDMA) applications.

The AD9144 includes integrated interpolation filters with selectable interpolation factors. The dual DAC data interface supports word and byte load, enabling users to reduce the number of input pins at lower data rates to save board space, power, and cost.

The DAC is supported by an evaluation board with an FPGA Mezzanine Card (FMC) connector, software, tools, a SPI controller, and reference designs. Analog Devices’s VisualAnalog software package combines a powerful set of simulation and data analysis tools with a user-friendly graphical interface that enables users to customize their input signal and data analysis.

The AD9144BCPZ DAC costs $80. The AD9144-EBZ and AD9144-FMC-EBZ FMC evaluation boards cost $495.

Analog Devices, Inc.
www.analog.com

Doing the Robot, 21st-Century Style

Growing up in the 1970s, the first robot I remember was Rosie from The Jetsons. In the 1980s, I discovered Transformers, which were touted as “robots in disguise,” I imitated Michael Jackson’s version of “the robot,” and (unbeknownst to me) the Arthrobot surgical robot was first developed. This was years before Honda debuted ASIMO, the first humanoid robot, in 2004.

“In the 1970s, microprocessors gave me hope that real robots would eventually become part of our future,” RobotBASIC codeveloper John Blankenship told me in a 2013 interview. It appears that the “future” may already be here.

Honda’s ASIMO humanoid robot

Welcome to the 21st century. Technology is becoming “smarter,” as evidenced at the Consumer Electronics Show (CES) 2014, which took place in January. The show unveiled a variety of smartphone-controlled robots and drones as well as wireless tracking devices.

Circuit Cellar’s columnists and contributors have been busy with their own developments. Steve Lubbers wondered if robots could be programmed to influence each other’s behavior. He used Texas Instruments’s LaunchPad hardware and a low-cost radio link to build a group of robots to test his theory. The results are on p. 18.

RobotBASIC’s Blankenship wanted to program robots more quickly. His article explains how he uses robot simulation to decrease development time (p. 30).

The Internet of Things (IoT), which relies on embedded technology for communication, is also making advancements. According to information technology research and advisory company Gartner, by 2020, there will be close to 26 billion devices on the IoT.

With the IoT, nothing is out of the realm of a designer’s imagination. For instance, if you’re not at home, you can use IoT-based platforms (such as the one columnist Jeff Bachiochi writes about on p. 58) to preheat your oven or turn off your sprinklers when it starts to rain.

Meanwhile, I will program my crockpot and try to explain to my 8-year-old how I survived childhood without the Internet.

Arduino MOSFET-Based Power Switch

Circuit Cellar columnist Ed Nisley has used Arduino SBCs in many projects over the years. He has found them perfect for one-off designs and prototypes, since the board’s all-in-one layout includes a microcontroller with USB connectivity, simple connectors, and a power regulator.

But the standard Arduino presents some design limitations.

“The on-board regulator can be either a blessing or a curse, depending on the application. Although the board will run from an unregulated supply and you can power additional circuitry from the regulator, the minute PCB heatsink drastically limits the available current,” Nisley says. “Worse, putting the microcontroller into one of its sleep modes doesn’t shut off the rest of the Arduino PCB or your added circuits, so a standard Arduino board isn’t suitable for battery-powered applications.”

In Circuit Cellar’s January issue, Nisley presents a MOSFET-based power switch that addresses such concerns. He also refers to one of his own projects where it would be helpful.

“The low-resistance Hall effect current sensor that I described in my November 2013 column should be useful in a bright bicycle taillight, but only if there’s a way to turn everything off after the ride without flipping a mechanical switch…,” Nisley says. “Of course, I could build a custom microcontroller circuit, but it’s much easier to drop an Arduino Pro Mini board atop the more interesting analog circuitry.”

Nisley’s January article describes “a simple MOSFET-based power switch that turns on with a push button and turns off under program control: the Arduino can shut itself off and reduce the battery drain to nearly zero.”
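
In firmware terms, the idea boils down to a single output pin that holds the MOSFET switch closed. The Arduino-style sketch below only illustrates that behavior as described above; it is not Nisley’s actual code, and the pin number and auto-off timeout are hypothetical.

// The pin number and the auto-off timeout below are hypothetical.
const int POWER_HOLD_PIN = 8;                            // drives the p-MOSFET gate circuit
const unsigned long AUTO_OFF_MS = 30UL * 60UL * 1000UL;  // shut down after 30 minutes

void setup() {
  // The push button brought the board up; latch the supply on right away
  // so the rail stays up after the button is released.
  pinMode(POWER_HOLD_PIN, OUTPUT);
  digitalWrite(POWER_HOLD_PIN, HIGH);
}

void loop() {
  // ... normal application work (taillight PWM, current monitoring, etc.) ...

  if (millis() > AUTO_OFF_MS) {
    // Release the latch: the MOSFET opens and battery drain drops to nearly zero.
    digitalWrite(POWER_HOLD_PIN, LOW);
    while (true) { }  // wait for the supply to collapse
  }
}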

Readers should find the article’s information and circuitry design helpful in other applications requiring automatic shutoff, “even if they’re not running from battery power,” Nisley says.

Figure 1: This SPICE simulation models a power p-MOSFET with a logic-level gate controlling the current from the battery to C1 and R2, which simulate a 500-mA load that is far below Q2’s rating. S1, a voltage-controlled switch, mimics an ordinary push button. Q1 isolates the Arduino digital output pin from the raw battery voltage.

The article takes readers from SPICE modeling of the circuitry (see Figure 1) through developing a schematic and building a hardware prototype.

“The PCB in Photo 1 combines the p-MOSFET power switch from Figure 2 with a Hall effect current sensor, a pair of PWM-controlled n-MOSFETs, and an Arduino Pro Mini into a brassboard layout,” Nisley says. “It’s one step beyond the breadboard hairball I showed in my article “Low-Loss Hall Effect Current Sensing” (Circuit Cellar 280, 2013), and will help verify that all the components operate properly on a real circuit board with a good layout.”

Photo 1: The power switch components occupy the upper left corner of the PCB, with the Hall effect current sensor near the middle and the Arduino Pro Mini board to the upper right. The 3-D printed red frame stiffens the circuit board during construction.

For much more detail about the verification process, PCB design, Arduino interface, and more, download the January issue.

Figure 2: The actual circuit schematic includes the same parts as the SPICE schematic, as well as the assortment of connectors and jumpers required to actually build the PCB shown in Photo 1.