Reflections on Software Development

Present-day equipment relies on increasingly complex software, creating ever-greater demand for software quality and security. The two attributes, while similar in their effects, are different: quality software is not necessarily secure, and secure software is not necessarily of good quality. Safe software is both of high quality and secure. That means the software does what it is supposed to do, it prevents hackers and other external causes from modifying it, and, should it fail, it does so in a safe, predictable way. Software verification and validation (V&V) reduces issues attributable to defects, that is, to poor quality, but does not currently address misbehavior caused by external effects.

Poor software quality can result in huge material losses, and even loss of life. Consider some notorious examples from the past. An F-22 Raptor flight-control error caused the $150 million aircraft to be destroyed. An RAF Chinook engine-controller fault caused a helicopter crash with 29 fatalities. A Therac radiotherapy machine gave patients massive radiation overdoses, causing the deaths of two people. A General Electric power-grid monitoring system's failure resulted in a 48-hour blackout across eight US states and one Canadian province. Toyota's electronic throttle controller was alleged to be responsible for the deaths of 89 people.

Clearly, software quality is paramount, yet too often it takes a back seat to time to market and development cost. One essential attribute of quality software is traceability. This means that every requirement can be traced via documentation from the specification down to the particular line of code—and, vice versa, every line of code can be traced up to the specification. The documentation process (not including testing and integration) is illustrated in Figure 1.

FIGURE 1: Simplified software design process documentation. Testing, verification and validation (V&V) and control documents are not shown.

The terminology is that of the DO-178 standard, which is mandatory for aerospace and military software. (Similarly, hardware development is guided by DO-254.) Other software standards may use different terminology, but the intentions are the same. DO-178 prescribes a document-driven process, for which many tools are available to the designer. Once the hardware-software partitioning has been established, the software requirements define the software architecture and the derived requirements. Derived requirements are those that the customer doesn't include in the specification and might not even be aware of. For instance, turning on an indicator light may take one sentence in the specification, but the decomposition of this simple task might lead to many derived requirements.

Safety-Instrumented Functions

While requirements are being developed, test cases must be defined for each and every one of them. Additionally, to increase system safety, so-called safety-instrumented functions (SIFs) should be considered. SIFs are monitors that cause the system to shut down safely if its performance fails to meet previously defined safety limits. This is typically accomplished by redundancy in hardware, software, or both. If you neglect to address such issues at an early development stage, you might end up with an unsafe system and a great deal of rework later.

Quality design is also a bureaucratic chore. Version control and a configuration index must be maintained. The configuration index comprises the list of modules, and their versions, to be compiled for a specific version of the product under development. Without it, a configuration can be lost, and a great deal of development effort with it.

Configuration control and traceability are not just best engineering practices. They should be mandated whenever software is being developed. Some developers believe that software qualification to a specific standard is required only by the aerospace and military industries. Worse, some commercial software developers still subscribe to the so-called iron triangle: “Get to market fast with all the features planned and a high level of quality. But pick only two.”

Engineers in safety-critical industries (such as medical, nuclear, automotive, and manufacturing) work with methods similar to DO-178 to ensure their software performs as expected. Large original equipment manufacturers (OEMs) now demand adherence to software standards: IEC 61508 for industrial controls, IEC 62304 for medical equipment, ISO 26262 for automotive, and so forth. The reason is simple: unqualified software can lead to costly product returns and expensive lawsuits.

Software qualification is highly labor intensive and very demanding in terms of resources, time, and money. Luckily, its cost has been coming down thanks to a plethora of automated tools now being offered. Those tools are not inexpensive, but they do pay for themselves quickly. Considering the risk of lawsuits, recalls, brand damage, and other associated costs of software failure, no company can really afford not to go through a qualification process.


As with hardware, quality must be built into the software, and this means following strict process rules. You can’t expect to test quality into the product at the end. Some companies have tried and the results have been the infamous failures noted above.
Testing embedded controllers often presents a challenge because you need the final hardware before it is finished. Nevertheless, if you give testing due consideration as you prepare the software requirements, much can be accomplished by working in virtual or simulated environments. LDRA is one great tool for this task.
Numerous methods exist for software testing. For example, dynamic code analysis examines the program during its execution, while static analysis inspects the code without running it, looking for vulnerabilities as well as programming errors. It has been shown mathematically that 100% test coverage is impossible to achieve. But even if it were, 35% to 40% of defects result from missing logic paths and another 40% from the execution of unique combinations of logic paths. Such defects wouldn't get caught by testing, but they can be mitigated by SIFs.

Much embedded code is still developed in-house (see Figure 2). Is it possible for companies to improve programmers' efficiency in this most labor-intensive task? Once again, the answer lies in automation. Nowadays, many tools come as complete suites providing various analyses, code coverage, coding-standards compliance, requirements traceability, code visualization, and so forth. These tools are regularly used by developers of avionics and military software, but less frequently by commercial developers because of their perceived high cost and steep learning curve.

FIGURE 2: Distribution of embedded software sources. Most is still developed in-house.

With the growth of cloud computing and the Internet of Things (IoT), software security is gaining unprecedented importance. Some security measures can be incorporated in hardware, others in software. Data encryption and password protection are vital parts. Unfortunately, some developers still do not treat security as seriously as they should. Security experts warn that numerous IoT developers have failed to learn the lessons of the past and that a “big IoT hack” in the near future is inevitable.

Security Improvements

On a regular basis, the media report on security breaches (e.g., governmental organization hacks, bank hacks, and automobile hacks). What can be done to improve security?

There are several techniques—such as the Common Weakness Enumeration (CWE)—that can help improve our chances. However, securing software is likely a far more daunting task than achieving comprehensive V&V test coverage. One successful hack proves the security is weak. But how many unsuccessful hacks by test engineers are needed to establish that security is adequate? Eventually a manager, probably relying on some statistics, will have to decide that enough effort has been spent and the software can be released. Different types of systems require different levels of security, but how is this to be determined? And what about the human factor? Not every test engineer has the necessary talent for code breaking.

History teaches us that no matter how good a lock, a cipher, or a password, someone has eventually broken it. Several security developers in the past challenged the public to break their “unbreakable” code for a reward, only to see it broken within hours. How responsible is it to keep sensitive data and systems access available in cyberspace just because it may be convenient, inexpensive, or fashionable? Have the probability and the consequences of a potential breach always been duly considered?

I have used cloud-based tools, such as the excellent mbed, but would not dream of using them for a sensitive design. I don't store data in the cloud, nor would I consider IoT for any system whose security was vital. I don't believe cyberspace can provide sufficient security for many systems at this time. Ultimately, the responsibility for security is ours. We must judge whether using IoT or the cloud for a given product would be responsible. At present, I see little evidence that the industry is adequately serious about security. It will surely improve with time, but until it does, I am not about to take unnecessary risks.

George Novacek is a professional engineer with a degree in Cybernetics and Closed-Loop Control. Now retired, he was most recently president of a multinational manufacturer of embedded control systems for aerospace applications. George wrote 26 feature articles for Circuit Cellar between 1999 and 2004. Contact him at with “Circuit Cellar” in the subject line.

Aurora Software for Evaluation of ArcticPro eFPGA IP

QuickLogic Corp. recently announced the release of its new Aurora software, which enables SoC developers to evaluate the integration of embedded FPGA (eFPGA) IP into devices designed for different GlobalFoundries (GF) process nodes. The Aurora eFPGA development tool supports design implementation from RTL through place and route. It enables SoC developers to determine the amount of eFPGA resources needed to support a design (including logic cell count, clock network requirements, and routing utilization) and also provides the estimated eFPGA die area associated with those resources. The current version of the tool supports GF's 40-nm node. Support for the 65-nm node and the 22FDX (FD-SOI) platform will be released in the future.

Source: QuickLogic Corp.

Embedded Software: Tips & Insights (Sponsor: PRQA)

When it comes to embedded software, security matters. Read the following whitepapers to learn about: securing your embedded systems, MISRA coding standard, and using static analysis to overcome the challenges of reusing code.

  • Developing Secure Embedded Software
  • Guide to MISRA Coding
  • Using Static Analysis to Overcome the Challenges of Reusing Code for Embedded Software


Programming Research Ltd (PRQA) helps its customers to develop high-quality embedded source code—software which is impervious to attack and executes as intended.

Updated LiveLink for SOLIDWORKS

COMSOL recently updated LiveLink for SOLIDWORKS. An add-on to the COMSOL Multiphysics software, LiveLink for SOLIDWORKS enables a CAD model to be synchronized between the two software packages. Furthermore, it provides easy access for running simulation apps in concert with SOLIDWORKS software. You can build apps with the Application Builder to let users analyze and modify a geometry from SOLIDWORKS software right from the app's interface. Users can browse and run apps from within the SOLIDWORKS interface, including those that use a geometry synchronized with SOLIDWORKS software.

The update includes a new Bike Frame Analyzer app in the Application Libraries. It leverages LiveLink for SOLIDWORKS to interactively update the geometry while computing the stress distribution in a frame subject to various loads and constraints. You can use the app to easily test different configurations of a bike frame for different parameters such as dimensions, materials, and loads. The app computes the stress distribution and the deformation of the frame based on the structural dimensions, materials, and loads/constraints of the bike frame.

Source: COMSOL

The Future of Test-First Embedded Software

The term “test-first” software development comes from the original days of extreme programming (XP). In Kent Beck’s 1999 book, Extreme Programming Explained: Embrace Change (Addison-Wesley), his direction is to create an automated test before making any changes to the code.

Nowadays, test-first development usually means test-driven development (TDD): a well-defined, continuous feedback cycle of code, test, and refactor. You write a test, write some code to make it pass, make improvements, and then repeat. Automation is key though, so you can run the tests easily at any time.

TDD is well regarded as a useful software development technique. The proponents of TDD (including myself) like the way in which the code incrementally evolves from the interface as well as the comprehensive test suite that is created. The test suite is the safety net that allows the code to be refactored freely, without worry of breaking anything. It’s a powerful tool in the battle against code rot.

To date, TDD has had greater adoption in web and application development than with embedded software. Recent advances in unit test tools however are set to make TDD more accessible for embedded development.

In 2011, James Grenning published his book, Test Driven Development for Embedded C (Pragmatic Bookshelf). Six years later, it is still the authoritative reference for embedded test-first development and the entry point to TDD for many embedded software developers. It explains in detail how TDD works for an unfamiliar audience and addresses many of the traditional concerns, such as how TDD will work with custom hardware. The book remains completely relevant today, but when it was published, the state-of-the-art tools were simple unit test and mocking frameworks. These frameworks require a lot of boilerplate code to run tests, and any mock objects need to be created manually.

In the rest of the software world though, unit test tools are significantly more mature. In most other languages used for web and application development, it’s easy to create and run many unit tests, as well as to create mock objects automatically.
Since 2011, the state of TDD tools has advanced considerably with the development of the open-source tool Ceedling. It automates the running of unit tests and the generation of mock objects in C applications, making it a lot easier to do TDD. Today, if you want to test-drive embedded software in C, you don't need to roll your own test build system or mocks.

With better tools making unit testing easier, I suspect that test-first development will be more widely adopted by embedded software developers in the future. While TDD was previously relegated to the few early adopters willing to put in the effort, tools that lower the barrier to entry will make it easier for everyone.
Besides the tools to make TDD easier, another driving force behind greater adoption of test-first practices will be the simple need to produce better-quality embedded software. As embedded software continues its infiltration into all kinds of devices that run our lives, we’ll need to be able to deliver software that is more reliable and more secure.

Currently, unit tests for embedded software are most popular in regulated industries—like medical or aviation—where the regulators essentially force you to have unit tests. This is one part of a strategy to prevent you from hurting or killing people with your code. The rest of the “unregulated” embedded software world should take note of this approach.

With the rise of the Internet of things (IoT), our society is increasingly dependent on embedded devices connected to the Internet. In the future, the reliability and security of the software that runs these devices is only going to become more critical. There may not be a compelling business case for it now, but customers—and perhaps new regulators—are going to increasingly demand it. Test-first software can be one strategy to help us deal with this challenge.

This article appears in Circuit Cellar 318.

Matt Chernosky wants to help you build better embedded software—test-first with TDD. With years of experience in the automotive, industrial, and medical device fields, he’s excited about improving embedded software development. Learn more from Matt about getting started with embedded TDD at

The Flow Coder

Products come and go. New products are developed all the time. So, what's the key to success? John Dobson has successfully run Halifax, UK-based Matrix TSL for 23 years. During that time, the company has gone through some changes. He recently gave Circuit Cellar a tour of Matrix's headquarters, shared a bit about the company's history, and talked about product diversification.

Matrix started 23 years ago as Matrix Multimedia, a CD-ROM publisher. “The Internet came along and destroyed that business, and we had to diversify, and so we diversified into electronics and into education in particular,” Dobson said.

Matrix’s flagship product is Flowcode software. “It basically uses flowcharts to allow people without a huge amount of coding experience to develop complex electronics systems,” Dobson explained. “Sometimes programming in C or other languages is a little complicated and time consuming. So Flowcode has a lot of components to it with libraries and things and lots of features that allow people to design complex electronic systems quickly.”

Today, while still focused on the education market, the latest version of Flowcode has about 3,000 industrial users. Dobson said many of the industrial users are test engineers whose specialty isn't necessarily coding.

Note: Circuit Cellar is currently running a Flowcode 7 promotion with Matrix TSL. Learn More

eSOL RTOS & Debugger Support for Software Development

Imperas Software recently announced its support for eSOL’s eMCOS RTOS and eBinder debugger. The partnership is intended to accelerate embedded software development, debugging, and testing.

The Imperas Extendable Platform Kit (EPK) features a Renesas RH850F1H device and runs the eSOL eMCOS real-time operating system. Imperas simulators can use eBinder, the debugger from the eSOL IDE, for efficient software debugging and testing.

Source: eSOL

October Code Challenge (Sponsor: Programming Research)

Ready to put your programming skills to the test? Take the new Electrical Engineering Challenge (sponsored by Programming Research). Find the error in the code for a shot to win prizes, such as an Amazon Gift Card, a Circuit Cellar magazine digital subscription, or a discount to the Circuit Cellar webshop.

The following program will compile with no errors. It runs and completes with no errors.

Click to enlarge. Find the error and submit your answer via the online submission form below. Submission deadline: 2 PM EST, October 20.

Take the challenge now!

September Code Challenge (Sponsor: Programming Research)

Ready to put your programming skills to the test? Take the new Electrical Engineering Challenge (sponsored by Programming Research). Find the error in the code for a shot to win prizes, such as an Amazon Gift Card, a Circuit Cellar magazine digital subscription, or a discount to the Circuit Cellar webshop.

The following program will compile with no errors. It will crash when run. This is an example of working with linked lists. The output should be:

LinkedList : 4321

LinkedList in reverse order : 1234

Click to enlarge. Find the error and submit your answer via the online submission form below. Submission deadline: 2 PM EST, September 20.

Take the challenge now!

August Code Challenge (Sponsor: Programming Research)

Ready to put your programming skills to the test? Take the new Electrical Engineering Challenge (sponsored by Programming Research). Find the error in the code for a shot to win prizes, such as an Amazon Gift Card or a Circuit Cellar magazine digital subscription.

The following is a sample piece of code with a subtle programming error that would cause the software to fail. The code is a C++ program written as a function.

Specification: The program does a bubble sort for a list (array) of numbers. That means that it takes a list of numbers like this: 5 23 7 1 9 and turns it into 1 5 7 9 23.

Click to enlarge. Find the error and submit your answer via the online submission form below. Submission deadline: 2 PM EST, July 20.

Take the challenge now!

Latest Release of COMSOL Multiphysics and COMSOL Server

COMSOL has announced the latest release of the COMSOL Multiphysics and COMSOL Server simulation software environment. With hundreds of user-driven features and enhancements, COMSOL software version 5.2a expands electrical, mechanical, fluid, and chemical design and optimization capabilities.

In COMSOL Multiphysics 5.2a, three new solvers deliver faster and more memory-efficient computations:

  • The smoothed aggregation algebraic multigrid (SA-AMG) solver is efficient for linear elastic analysis and many other types of analyses. It is very memory conservative, making it possible to run structural assemblies with millions of degrees of freedom on a standard desktop or laptop computer.
  • The domain decomposition solver has been optimized for handling large multiphysics models.
  • A new explicit solver based on the discontinuous Galerkin (DG) method for acoustics in the time-domain enables you to perform more realistic simulations for a given memory size than was previously possible.

The complete suite of computational tools provided by COMSOL Multiphysics software and its Application Builder enables you to design and optimize your products and create apps. Simulation apps allow users with no prior simulation software experience to run them. With version 5.2a, designers can build even more dynamic apps in which the appearance of the user interface can change during run time, centralize unit handling to better serve teams working across different countries, and include hyperlinks and videos.

Source: COMSOL

BenchVue 3.5 Software Update for Instrument Control, Test Automation, & More

Keysight Technologies recently released BenchVue 3.5, which is an intuitive platform for the PC that provides multiple-instrument measurement applications, data capture, and solution applications. Programming or separate instrument drivers are not required.

When you connect an instrument to your PC over LAN, GPIB or USB, the instrument is automatically configured for use in BenchVue. With the BenchVue Test Flow app, you can quickly create automated test sequences. BenchVue 3.5 also features new apps that support signal generators, universal counters, and Keysight’s FieldFox Series of handheld analyzers.

BenchVue’s features and specs:

  • Expandable apps that provide instrument control with plug and play functionality
  • Data logging for instruments (e.g., digital multimeters, power supplies, oscilloscopes, and FieldFox analyzers)
  • Rapid test automation development and analysis
  • Three-click exporting to common data formats (e.g., .csv, MATLAB, Word, and Excel)

BenchVue 3.5 software is available free of charge. Upgrades to extend BenchVue’s functionality are also available and are priced accordingly.

Source: Keysight

Virtual Software Development for Embedded Developers

Embeddetech will launch a Kickstarter campaign on June 20 for its Virtuoso software. Virtuoso is a powerful virtual device framework intended for custom electronics designers. With it, you can virtualize embedded systems: firmware application developers can drag and drop commonly used components (e.g., LEDs, touch screens, and keypads) or develop new components from scratch and then start developing applications. With Virtuoso, a fully functional replica of the hardware accelerates firmware development while the hardware is developed in parallel.

In early 2017, Embeddetech plans to bring photo-realistic, real-time, 3-D virtualization to embedded software development using Unreal Engine, which is a powerful game engine developed by Epic Games. Embeddetech has developed a second framework which adapts Unreal Engine to Microsoft’s .NET Framework, allowing business applications to leverage the power of the modern 3-D game development workflow.

Source: Embeddetech

Thermoelectric Module Simulation Software Simplifies Design

To decrease thermal design time for design engineers, Laird recently improved the algorithms in its AZTEC thermoelectric module (TEM) simulation program. The AZTEC product selection tool enables you to specify input variables based on application attributes, and the software outputs an analysis. You can then select the best TEM by easily comparing TEM datasheets. In addition, the software includes an analysis worksheet for simulating TEM device functionality.

The AZTEC product selection tool—which is available at—uses a variety of input variables (i.e., heat load, ambient and control temperatures, input voltage requirement and thermal resistance of hot side heat exchangers) to recommend appropriate TEMs to meet your application’s needs. Laird updated the software with its newest TEM product offerings.

The Analysis Worksheet Tool simulates expected thermoelectric output parameters based on a given set of thermal and electrical operating points. The included output parameters are:

  • the hot and cold side temperatures of the TEM
  • heat pumped at the cold surface of the TEM
  • coefficient of performance (COP)
  • input power requirements

The total hot side heat dissipation is also calculated.

The included Qc Estimating Worksheet calculates an estimate of the heat load for device (spot) or chamber (volume) cooling applications. Computations are made based on the inputs (e.g., temperature requirements, volumetric dimensions, insulation thickness, material properties, and active heat load) you provide.

Source: Laird

USB 3.1 Gen 2 Protocol Trigger and Decode Software

Keysight Technologies' new oscilloscope-based USB 3.1 Gen 2 10-Gbps protocol decode software features real-time triggering and protocol decode performance.

Intended to help you verify and debug devices that implement the 128b/132b encoding technology, the Keysight N8821A USB 3.1 protocol trigger and decode software enables you to quickly see protocol decode, search with protocol-level triggers, and use time-correlated views to troubleshoot serial protocol problems back to timing or signal integrity.

Additional information about Keysight’s new N8821A USB 3.1 protocol trigger and decode software is available at

Source: Keysight Technologies