Latest Release of COMSOL Multiphysics and COMSOL Server

COMSOL has announced the latest release of the COMSOL Multiphysics and COMSOL Server simulation software environment. With hundreds of user-driven features and enhancements, COMSOL software version 5.2a expands electrical, mechanical, fluid, and chemical design and optimization capabilities.

In COMSOL Multiphysics 5.2a, three new solvers deliver faster and more memory-efficient computations:

  • The smoothed aggregation algebraic multigrid (SA-AMG) solver is efficient for linear elastic analysis and many other types of analyses. It is very memory conservative, making it possible to run structural assemblies with millions of degrees of freedom on a standard desktop or laptop computer.
  • The domain decomposition solver has been optimized for handling large multiphysics models.
  • A new explicit solver based on the discontinuous Galerkin (DG) method for acoustics in the time domain enables you to perform more realistic simulations for a given memory size than was previously possible.

The complete suite of computational tools provided by COMSOL Multiphysics software and its Application Builder enables you to design and optimize your products and create apps. Simulation apps let users with no previous simulation software experience run analyses. With version 5.2a, designers can build even more dynamic apps in which the appearance of the user interface can change during run time, centralize unit handling to better serve teams working across different countries, and include hyperlinks and videos.

Source: COMSOL

BenchVue 3.5 Software Update for Instrument Control, Test Automation, & More

Keysight Technologies recently released BenchVue 3.5, an intuitive PC platform that provides multiple-instrument measurement applications, data capture, and solution applications. Neither programming nor separate instrument drivers are required.

When you connect an instrument to your PC over LAN, GPIB or USB, the instrument is automatically configured for use in BenchVue. With the BenchVue Test Flow app, you can quickly create automated test sequences. BenchVue 3.5 also features new apps that support signal generators, universal counters, and Keysight’s FieldFox Series of handheld analyzers.

BenchVue’s features and specs:

  • Expandable apps that provide instrument control with plug and play functionality
  • Data logging for instruments (e.g., digital multimeters, power supplies, oscilloscopes, and FieldFox analyzers)
  • Rapid test automation development and analysis
  • Three-click exporting to common data formats (e.g., .csv, MATLAB, Word, and Excel)

BenchVue 3.5 software is available free of charge. Upgrades to extend BenchVue’s functionality are also available and are priced accordingly.

Source: Keysight

Virtual Software Development for Embedded Developers

Embeddetech will launch a Kickstarter campaign on June 20 for its Virtuoso software. Virtuoso is a powerful virtual device framework intended for custom electronics designers. With it, you can virtualize embedded systems. This means that firmware application developers can drag-and-drop commonly used components (e.g., LEDs, touch screens, and keypads) or develop new components from scratch and then start developing applications. With Virtuoso, a fully functional replica of the hardware accelerates firmware development while the hardware is developed in parallel.

In early 2017, Embeddetech plans to bring photo-realistic, real-time, 3-D virtualization to embedded software development using Unreal Engine, which is a powerful game engine developed by Epic Games. Embeddetech has developed a second framework which adapts Unreal Engine to Microsoft’s .NET Framework, allowing business applications to leverage the power of the modern 3-D game development workflow.

Source: Embeddetech

Thermoelectric Module Simulation Software Simplifies Design

To decrease thermal design time for design engineers, Laird recently improved the algorithms in its AZTEC thermoelectric module (TEM) simulation program. The AZTEC product selection tool enables you to specify input variables based on application attributes, and the software outputs its analysis. You can then select the best TEM by easily comparing TEM datasheets. In addition, the software includes an analysis worksheet for simulating TEM device functionality.

The AZTEC product selection tool—which is available at Lairdtech.com—uses a variety of input variables (i.e., heat load; ambient and control temperatures; input voltage requirement; and thermal resistance of hot-side heat exchangers) to recommend appropriate TEMs to meet your application’s needs. Laird updated the software with its newest TEM product offerings.
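For illustration only, the following C sketch shows the kind of screening such a selection tool performs: given a required heat load, temperature difference, and supply voltage, it keeps only the modules whose datasheet maximums leave adequate margin. The module table, the first-order derating formula Qc ≈ Qmax(1 − ΔT/ΔTmax), and all numeric values are assumptions for the example, not Laird’s actual catalog or algorithm.

```c
#include <stdio.h>

struct tem {
    const char *name;
    double qmax_w;   /* max heat pumped at dT = 0 (datasheet), W  */
    double dtmax_c;  /* max temperature difference (datasheet), K */
    double vmax_v;   /* max input voltage (datasheet), V          */
};

int main(void)
{
    /* Hypothetical catalog entries */
    const struct tem catalog[] = {
        { "TEM-A", 20.0, 68.0,  8.8 },
        { "TEM-B", 35.0, 70.0, 15.4 },
        { "TEM-C", 60.0, 69.0, 24.6 },
    };
    const double q_req  = 18.0;  /* application heat load, W           */
    const double dt_req = 30.0;  /* required temperature difference, K */
    const double v_sup  = 12.0;  /* available supply voltage, V        */

    for (size_t i = 0; i < sizeof catalog / sizeof catalog[0]; i++) {
        const struct tem *m = &catalog[i];
        /* First-order derating: pumping capacity falls roughly
         * linearly as dT approaches dTmax. */
        double q_avail = m->qmax_w * (1.0 - dt_req / m->dtmax_c);
        if (q_avail >= q_req && m->vmax_v >= v_sup)
            printf("%s: candidate (approx. %.1f W available at dT = %.0f K)\n",
                   m->name, q_avail, dt_req);
        else
            printf("%s: not suitable\n", m->name);
    }
    return 0;
}
```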

The Analysis Worksheet Tool simulates expected thermoelectric output parameters based on a given set of thermal and electrical operating points. The included output parameters are:

  • the hot and cold side temperatures of the TEM
  • heat pumped at the cold surface of the TEM
  • coefficient of performance (COP)
  • input power requirements

The total hot side heat dissipation is also calculated.

The included Qc Estimating Worksheet calculates an estimate of the heat load for device (spot) or chamber (volume) cooling applications. Computations are based on the inputs you provide (e.g., temperature requirements, volumetric dimensions, insulation thickness, material properties, and active heat load).
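For a chamber-cooling application, the dominant terms in such an estimate are the conduction leak through the insulation and any heat actively dissipated inside. Below is a minimal sketch of that arithmetic, assuming simple one-dimensional conduction (Q = k·A·ΔT/t); Laird’s actual worksheet method may differ, and all numeric values here are illustrative.

```c
#include <stdio.h>

int main(void)
{
    double k        = 0.030;  /* insulation thermal conductivity, W/(m*K)  */
    double area     = 0.25;   /* total chamber wall area, m^2              */
    double thick    = 0.025;  /* insulation thickness, m                   */
    double t_amb    = 35.0;   /* ambient temperature, degrees C            */
    double t_ctrl   = 5.0;    /* controlled chamber temperature, degrees C */
    double q_active = 2.0;    /* active (dissipated) heat load, W          */

    /* Passive leak through the insulation plus the active load */
    double q_passive = k * area * (t_amb - t_ctrl) / thick;
    double q_total   = q_passive + q_active;

    printf("Passive load: %.1f W, total heat load: %.1f W\n",
           q_passive, q_total);
    return 0;
}
```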

Source: Laird

USB 3.1 Gen 2 Protocol Trigger and Decode Software

Keysight Technologies’ new oscilloscope-based USB 3.1 Gen 2 10-Gbps protocol decode software features real-time triggering and protocol decode performance.

Intended to help you verify and debug devices that implement the 128b/132b encoding technology, the Keysight N8821A USB 3.1 protocol trigger and decode software enables you to quickly see protocol decode, search with protocol-level triggers, and use time-correlated views to troubleshoot serial protocol problems back to timing or signal integrity.

Additional information about Keysight’s new N8821A USB 3.1 protocol trigger and decode software is available at www.keysight.com/find/n8821a.

Source: Keysight Technologies

Software for Automated PAM-4 Pre-Compliance Testing, Reporting

Keysight Technologies recently announced new measurement application software for quickly and accurately measuring and quantifying pulse amplitude modulation with four amplitude levels (PAM-4) signals. The software is used with the Keysight S-Series, 90000A, V-Series, 90000 X- and Z-Series real-time oscilloscope platforms and the 86100D DCA-X Infiniium sampling oscilloscope. The new Keysight N8836A PAM-4 analysis software (for S-Series, 90000A, V-Series, 90000 X- and Z-Series oscilloscopes) and the new N1085A PAM-4 analysis software (for 86100D oscilloscopes) provide comprehensive characterization of electrical PAM-4 signals based on the Optical Internetworking Forum’s Common Electrical Interface (OIF-CEI 4.0) proposed 56G interfaces and the IEEE 400 Gigabit Ethernet (P802.3bs) standard.


The N8836A and N1085A PAM-4 measurement applications for S-Series, 90000A, V-Series, 90000 X- and Z-Series real-time oscilloscopes and 86100D DCA-X Series sampling oscilloscopes deliver measurements such as:

  • Linearity and output voltage measurements, including the level separation mismatch ratio (RLM; sketched in code after this list)
  • Eye width (EW) and eye height (EH)
  • Jitter measurements, including even-odd jitter and clock random jitter
  • Differential and common-mode return losses, performed using a Keysight time domain reflectometer (TDR) or VNA
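Of these measurements, RLM is a purely arithmetic function of the four measured PAM-4 levels. The C sketch below follows the level-mismatch definition used in the IEEE P802.3bs and OIF-CEI proposals (RLM = 1 when the four levels are equally spaced); it is an illustration, not Keysight’s implementation, and the sample level values are made up.

```c
#include <stdio.h>

static double min4(double a, double b, double c, double d)
{
    double m = a;
    if (b < m) m = b;
    if (c < m) m = c;
    if (d < m) m = d;
    return m;
}

/* Level separation mismatch ratio for measured levels v0 < v1 < v2 < v3 */
static double pam4_rlm(double v0, double v1, double v2, double v3)
{
    double vmid = (v0 + v3) / 2.0;
    double es1  = (v1 - vmid) / (v0 - vmid);  /* normalized inner level 1 */
    double es2  = (v2 - vmid) / (v3 - vmid);  /* normalized inner level 2 */
    return min4(3.0 * es1, 3.0 * es2, 2.0 - 3.0 * es1, 2.0 - 3.0 * es2);
}

int main(void)
{
    /* Hypothetical measured levels in volts; slightly compressed
     * inner levels yield RLM < 1. */
    printf("RLM = %.3f\n", pam4_rlm(-0.30, -0.11, 0.09, 0.30));
    return 0;
}
```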

The N8836A and N1085A software applications cost $2,500 each.

Source: Keysight Technologies

New Software to Obtain Measurement Results without MIPI or Arbitrary Waveform Generator Expertise

Keysight Technologies recently introduced a software plug-in for the M8070A system software for M8000 Series BER test solutions. The M8085A MIPI C-PHY receiver test solution is designed for conformance and margin tests.

The MIPI C-PHY 1.0 standard supports camera and display applications. The standard comprises multilevel non-NRZ, non-differential signaling. The Keysight M8190A arbitrary waveform generator (AWG) is the right instrument to generate such signals. The M8085A’s easy-to-use editor option enables you to set up the parameters and pattern content of test signals for turn-on and debug interactively from the GUI, in familiar application terms. During parameter adjustments, the software controls the AWG hardware to maintain uninterrupted signal generation.

In addition, the M8085A software provides the industry’s first complete and standard-conformant routines for calibrating signal parameters and for physical layer (PHY) receiver tests. Thus, you can achieve results without expertise in the MIPI standard or in arbitrary waveform generators.

The software plug-in provides several options for selecting the error-detecting device. You can connect to the built-in detector in the device under test via the IBERReader interface, which transfers the test result to the M8085A software and displays the result in the GUI. Plus, it enables fully automated unattended tests.

The M8085A C-PHY software with various options is now available.

Source: Keysight 

Software-Only Hardware Simulation

Simulating embedded hardware in a Windows environment can significantly reduce development time. In this article, Michael Melkonian provides techniques for the software-only simulation of embedded hardware. He presents a simple example of an RTOS-less embedded system that uses memory-mapped I/O to access a UART-like peripheral to serially poll a slave device. The simulator is capable of detecting bugs and troublesome design flaws.

Melkonian writes:

In this article, I will describe techniques for the software-only simulation of embedded hardware in the Windows/PC environment. Software-only simulation implies an arrangement with which the embedded application, or parts of it, can be compiled and run on the Windows platform (host) talking to the software simulator as opposed to the real hardware. This arrangement doesn’t require any hardware or tools other than a native Windows development toolset such as Microsoft Developer Studio/Visual C++. Importantly, the same source code is compiled and linked for both the host and the target. It’s possible and often necessary to simulate more complex aspects of the embedded target such as interrupts and the RTOS layer. However, I will illustrate the basics of simulating hardware in the Windows environment with an example of an extremely simple hypothetical target system (see Figure 1).

Figure 1: There is a parallel between the embedded target and host environment. Equivalent entities are shown on the same level.

Assuming that the source code of the embedded application is basically the same whether it runs in Windows or the embedded target, the simulation offers several advantages. You have the ability to develop and debug device drivers and the application before the hardware is ready. An extremely powerful test harness can be created on the host platform, where all code changes and additions can be verified prior to running on the actual target. The harness can be used as a part of software validation.

Furthermore, you have the ability to test conditions that may not be easy to test using the real hardware. In the vast majority of cases, debugging tools available on the host are far superior to those offered by cross development tool vendors. You have access to runtime checkers to detect memory leaks, especially for embedded software developed in C++. Lastly, note that where the final system comprises a number of CPUs/boards, simulation has the additional advantage of simulating each target CPU via a single process on a multitasking host.

FIRST THINGS FIRST
Before you decide to invest in simulation infrastructure, there are a few things to consider. For instance, when the target hardware is complex, the software simulator becomes a fairly major development task. Also, consider the adequacy of the target development tools. This especially applies to debuggers. The absence, or insufficient capability, of the debugger on the target presents a strong case for simulation. When delivery times are more critical than the budget limitations and extra engineering resources are available, the additional development effort may be justified. The simulator may help to get to the final product faster, but at a higher cost. You should also think about whether or not it’s possible to cleanly separate the application from the hardware access layer.

Remember that when exact timings are a main design concern, the real-time aspects of the target are hard to simulate, so the simulator will not help. Moreover, the embedded application’s complexity is relatively minor compared to the hardware drivers, so the simulator may not be justified. However, when the application is complex and sitting on top of fairly simple hardware, the simulator can be extremely useful.

You should also keep in mind that when it’s likely that the software application will be completed before the hardware delivery date, there is a strong case for simulation …

SOFTWARE DESIGN GUIDE
Now let’s focus on what makes embedded software adaptable for simulation. It’s hardly surprising that the following guidelines closely resemble those for writing portable code. First, you need a centralized access mechanism to the hardware (read_hw and write_hw macros). Second, the application code and device driver code must be separated. Third, you must use a thin operating-system interface layer. Finally, avoid using the nonstandard add-ons that some cross-compilers may provide.
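To make the first guideline concrete, here is a minimal C sketch of how such centralized access macros might be arranged. Only the read_hw/write_hw names come from the article; the SIMULATION switch, simulator stubs, and UART register addresses are illustrative assumptions.

```c
#include <stdint.h>
#include <stdio.h>

#ifdef SIMULATION
/* Host build: route all hardware accesses to the simulator. These
 * stubs are trivial stand-ins; a real simulator would model the UART. */
static uint32_t sim_read_hw(uintptr_t addr)
{
    printf("sim: read  0x%08lx\n", (unsigned long)addr);
    return 1u;  /* report the transmitter as always ready */
}
static void sim_write_hw(uintptr_t addr, uint32_t value)
{
    printf("sim: write 0x%08lx <- 0x%02lx\n",
           (unsigned long)addr, (unsigned long)value);
}
#define read_hw(addr)        sim_read_hw((uintptr_t)(addr))
#define write_hw(addr, val)  sim_write_hw((uintptr_t)(addr), (val))
#else
/* Target build: direct memory-mapped access. */
#define read_hw(addr)        (*(volatile uint32_t *)(uintptr_t)(addr))
#define write_hw(addr, val)  (*(volatile uint32_t *)(uintptr_t)(addr) = (val))
#endif

/* Hypothetical register map for the UART-like peripheral */
#define UART_STATUS    0x40001000u
#define UART_TXDATA    0x40001004u
#define UART_TX_READY  0x00000001u

/* Driver code is written once against the macros and compiles
 * unchanged for host and target. */
void uart_send_byte(uint8_t b)
{
    while ((read_hw(UART_STATUS) & UART_TX_READY) == 0)
        ;  /* poll until the transmitter is free */
    write_hw(UART_TXDATA, b);
}
```

Because the driver is written only against the macros, the same uart_send_byte source links against the Windows simulator on the host and against the real memory-mapped peripheral on the target, which is the arrangement the article describes.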

Download the entire article: M. Melkonian, “Software-Only Hardware Simulation,” Circuit Cellar 164, 2004.

AcqirisMAQS Software Simplifies Multichannel Data Acquisition Systems

Keysight Technologies recently announced a new version of its U1092A AcqirisMAQS Multichannel Acquisition Software. AcqirisMAQS software enables configuration management as well as visualization of data for hundreds of channels from a single console. The client-server architecture supports remote operation, making it possible for the data acquisition system to be distributed over a LAN.

A fully configurable GUI enables you to easily select instruments and measurement channels for the configuration of the acquisition parameters. Acquired data is presented in multiple display windows. Each window is fully configurable, including the selection of the number of plot areas and axes. Additional display functions provide multi-record overlay, persistence, frequency spectrum, and scalar computations on each trace. A special option for triggered single shot experiments adds an advanced configuration manager, digitizer memory protection locking and fail-safe operation.

Keysight’s multichannel digitizers contain all the timing and synchronization technologies needed to create synchronous sampling across tens or hundreds of channels at a time. For example, reliable triggering and synchronization are essential to correctly recreate the event or process from captured data. More information about simplifying high-speed multichannel acquisition systems is available in the AcqirisMAQS Multichannel Acquisition Software Document Library.

The new version of the AcqirisMAQS multichannel acquisition software—including a 30-day free trial (evaluation mode)—is now available for the Keysight M9709A AXIe 8-bit high-speed digitizer as well as all other Keysight high-speed digitizers. The 30-day free trial provides the Master and Monitoring functionality.

Source: Keysight

Streamlined Touchscreen Design with Application Builder and COMSOL Server

Cypress Semiconductor R&D engineers are creating simulation apps that streamline their touchscreen design processes. To do so, they’re sharing their simulation expertise with colleagues using the Application Builder and COMSOL Server, released with COMSOL Multiphysics simulation software version 5.

With the Application Builder, engineers can create ready-to-use simulation applications that can be implemented across departments, including by product development, sales, and customer service. The Application Builder enables simulation experts to build intuitive simulation apps based on their models directly within the COMSOL environment. COMSOL Server lets them share these apps with colleagues and customers around the globe.

To incorporate advances into their touchscreen technology and embedded system products, Cypress simulation engineers use COMSOL for research and design initiatives. Their touchscreens are used in phones, MP3 devices, industrial applications, and more.

Source: COMSOL


How to Improve Software Development Predictability

The analytical methods of failure modes effects and criticality analysis (FMECA) and failure modes effects analysis (FMEA) have been around since the 1940s. In recent years, much effort has been spent on bringing hardware related analyses such as FMECA into the realm of software engineering. In “Software FMEA/FMECA,” George Novacek takes a close look at software FMECA (SWFMECA) and its potential for making software development more predictable.

The roots of failure modes effects and criticality analysis (FMECA) and failure modes effects analysis (FMEA) date back to World War II. FMEA is a subset of FMECA in which the criticality assessment has been omitted. Therefore, for simplicity, I’ll be using the terms FMECA and SWFMECA only in this article. FMECA was developed for identification of potential hardware failures and their mitigation to ensure mission success. During the 1950s, FMECA became indispensable for analyses of equipment in critical applications, such as those occurring in military, aerospace, nuclear, medical, automotive, and other industries.

FMECA is a structured, bottom-up approach considering a failure of each and every component, its impact on the system and how to prevent or mitigate such a failure. FMECA is often combined with fault tree analysis (FTA) or event tree analyses (ETA). The FTA differs from the ETA only in that the former is focused on failures as the top event, the latter on some specific events. Those analyses start with an event and then drill down through the system to their root cause.

In recent years, much effort has been spent on bringing hardware related analyses, such as reliability prediction, FTA, and FMECA into the realm of software engineering. Software failure modes and effects analysis (SWFMEA) and software failure modes, effects, and criticality analysis (SWFMECA) are intended to be software analyses analogous to the hardware ones. In this article I’ll cover SWFMECA as it specifically relates to embedded controllers.

Unlike the classic hardware FMECA based on statistically determined failure rates of hardware components, software analyses assume that the software design is never perfect because it contains faults introduced unintentionally by software developers. It is further assumed that in any complicated software there will always be latent faults, regardless of development techniques, languages, and quality procedures used. This is likely true, but can it be quantified?

SOFTWARE ANALYSIS

SWFMECA should consider the likelihood of latent faults in a product and/or system, which may become patent during operational use and cause the product or the system to fail. The goal is to assess severity of the potential faults, their likelihood of occurrence, and the likelihood of their escaping to the customer. SWFMECA should assess the probability of mistakes being made during the development process, including integration, verification and validation (V&V), and the severity of these faults on the resulting failures. SWFMECA is also intended to determine the faults’ criticality by combining fault likelihood with the consequent failure severity. This should help to determine the risk arising from software in a system. SWFMECA should examine the development process and the product behavior in two separate analyses.

First, Development SWFMECA should address the development, testing and V&V process. This requires understanding of the software development process, the V&V techniques and quality control during that process. It should establish what types of faults may occur when using a particular design technique, programming language and the fault coverage of the verification and validation techniques. Second, Product SWFMECA should analyze the design and its implementation and establish the probability of the failure modes. It must also be based on thorough understanding of the processes as well as the product and its use.

In my opinion, SWFMECA is a bit of a misnomer, bearing little resemblance to hardware FMECA. Speculating about what faults might be hidden in every line of code or in every activity during software development is hardly realistic. However, there is a resemblance to functional-level FMECA. There, system-level effects of failures of functions can be established and addressed accordingly. Establishing the probability of those failures is another matter.

The data needed for such considerations are mostly subjective, their sources esoteric and their reliability debatable. The data are developed statistically, based on history, experience and long term fault data collection. Some data may be available from polling numerous industries, but how applicable they are to a specific developer is difficult to determine. Plausible data may perhaps be developed by long established software developers producing a specific type of software (e.g., Windows applications), but development of embedded controllers with their high mix of hardware/software architectures and relatively low-volume production doesn’t seem to fit the mold.

Engineers understand that hardware has limited life and customers have no problem accepting mean time between failures (MTBF) as a reality. But software does not fail due to age or fatigue. It’s all in the workmanship. I have never seen an embedded software specification requiring software to have some minimum probability of faults. Zero seems always implied.

SCORING & ANALYSIS

In the course of SWFMECA preparation, scores for potential faults should be determined: severity, likelihood of occurrence, and potential for escaping to the finished product. The three scores, each between 1 and 10, are multiplied to obtain the risk priority number (RPN). An RPN larger than 200 should warrant prevention and mitigation planning. Yet the scores are very much subjective—that is, they depend on the software complexity, the people, and other factors that are impossible to predict accurately. For embedded controllers, the determination of the RPN appears to be just an analysis for the sake of analysis.
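For reference, the RPN arithmetic described above is simply the product of the three analyst-assigned scores. A minimal sketch follows; the score values are illustrative.

```c
#include <stdio.h>

int main(void)
{
    int severity   = 7;  /* consequence of the failure (1-10)            */
    int occurrence = 5;  /* likelihood the fault is introduced (1-10)    */
    int escape     = 6;  /* likelihood it escapes to the customer (1-10) */

    int rpn = severity * occurrence * escape;  /* range 1..1000 */

    printf("RPN = %d\n", rpn);
    if (rpn > 200)
        printf("Prevention and mitigation planning warranted.\n");
    return 0;
}
```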

Statistical analyses are used every day from science to business management. Their usefulness depends on the number of samples and even with an abundance of samples there are no guarantees. SWFMECA can be instrumental for fine-tuning the software development process. In embedded controllers, however, software related failures are addressed by FMECA. SWFMECA alone cannot justify the release of a product.

EMBEDDED SOFTWARE

In embedded controllers, causes of software failures are often hardware related, and exact outcomes are difficult to predict. Software faults need to be addressed by testing and code analyses and, most important, mitigated by the architecture. Redundancy and hardware monitors, among others, are time-proven methods.

Software begins as an idea expressed in requirements. Design of the system architecture, including hardware/software partitioning, is next, followed by software requirements, usually presented as flowcharts, state diagrams, pseudocode, and so forth. High and low levels of design follow, until code is compiled. Integration and testing come next. This is shown in the ubiquitous chart in Figure 1.

Figure 1: Software development “V” model

During an embedded controller design, I would not consider performing the RPN calculation, just as I would not try to calculate software reliability. I consider those purely statistical calculations to be of little practical use. However, SWFMECA activity with software ETA and FTA based on functions should be performed as a part of the system FMECA. The software review can be to a large degree automated by tools, such as Software Call Tree and many others. Automation notwithstanding, one should always check the results for plausibility.

TOOLS

Software Call Tree tells us how different modules interface and how a fault or an event would propagate through the system. Similarly, Object Relational Diagram shows how objects’ internal states affect each other. And then there are Control Flow Diagram, Entity Relationship Diagram, Data Flow Diagram, McCabe Logical Path, State Transition Diagram, and others. Those tools are not inexpensive, but they do generate data that make it possible to produce high-quality software. However, it is important to plan all the tests and analyses ahead of time. It is easy to get mired in so many evaluations that the project’s cost and schedule suffer with little benefit to software quality.

The assumed probability of a software fault becomes a moot point. We should never plunge ahead releasing a code just because we’re satisfied that our statistical development model renders what we think is an acceptable probability of a failure. Instead, we must assume that every function may fail for whatever reason and take steps to ensure those failures are mitigated by the system architecture.

System architecture and software analyses can only be started upon determination that the requirements for the system are sufficiently robust. It is not unusual for a customer to insist on beginning development before signing the specification, which is often full of TBDs (i.e., “to be defined”). This may leave so many open issues that the design cannot and should not be started in earnest. Besides, development at such a stage is a violation of certification rules and will likely result in exceeding the budget and the schedule. Unfortunately, customers can’t or don’t always want to understand this, and their pressure often prevails.

The ongoing desire to introduce software into the hardware paradigm is understandable. It could bring software development into a fully predictable scientific realm. So far it has been resisting those attempts, remaining to a large degree an art. Whether it can ever become a fully deterministic process, in my view, is doubtful. After all, every creative process is an art. But great strides have been made in development of tools, especially those for analyses, helping to make the process increasingly more predictable.

This article appears in Circuit Cellar 297, April 2015.

The Future of Embedded Linux

My first computer was a COSMAC Elf. My first “desktop” was a $6,500 Heathkit H8. An Arduino today costs $3 and has more of nearly everything—except cost and size—and even my kids can program it. I became an embedded software developer without knowing it. When that H8 needed bigger floppy disks, a hard disk, or a network, you wrote the drivers yourself—in assembler if you were lucky and machine code if you were not.

Embedded software today is on the cusp of a revolution. The cost of hardware capable of running Linux continues to decline. A Raspberry Pi (RPi) can be purchased for $25. A BeagleBone Black (BBB) costs $45. An increasing number of designers are building products such as Cubi, GumStik, and Olinuxino and seeking to replicate the achievements of the RPi and BBB, which are modeled on the LEGO-like success of Arduino.

These are not “embedded Linux systems.” They are full-blown desktops—less peripherals—that are more powerful than what I owned less than a decade ago. This is a big deal. Hardware is inexpensive, and designs like the BBB and RPi are becoming easily modifiable commodities that can be completed quickly. On the other hand, software is expensive and slow. Time to market is critical. Target markets are increasingly small, with runs of a few thousand units for a specific product and purpose. Consumers are used to computers in everything. They expect computers and assume they will communicate with their smart phones, tablets, and laptops. Each year, consumers expect more.

There are not enough bare metal software developers to hope to meet the demand, and that will not improve. Worse, we can’t move from concept to product with custom software quickly enough to meet market demands. A gigabyte of RAM adds $5 to the cost of a product. The cost of an eight-week delay to value engineer software to work in a few megabytes of RAM instead, on a product that may only ship 5,000 units per year, could make the product unviable.

Products have to be inexpensive, high-quality, and fast. They have to be on the shelves yesterday and tomorrow they will be gone. The bare metal embedded model can’t deliver that, and there are only so many software developers out there with the skills needed to breathe life into completely new hardware.

That is where the joy in embedded development is for me—getting completely new hardware to load its first program. Once I get that first LED to blink everything is downhill from there. But increasingly, my work involves Linux systems integration for embedded systems: getting an embedded Linux system to boot faster, integrating MySQL, and recommending an embedded Linux distribution such as Ubuntu or Debian to a client. When I am lucky, I get to set up a GPIO or write a driver—but frequently these tasks are done by the OEM. Today’s embedded ARMs have everything, including the kitchen sink integrated (probably two).

Modern embedded products are being produced with client-server architectures by developers writing in Ruby, PHP, Java, or Python using Apache web servers and MySQL databases and an assortment of web clients communicating over an alphabet soup of protocols to devices they know nothing about. Often, the application developers are working and testing on Linux or even Windows desktops. The time and skills needed to value engineer the software to accommodate small savings in hardware costs do not exist. When clients ask for an embedded software consultant, they are more likely after an embedded IT expert rather than someone who writes device drivers or develops BSPs.

There will still be a need for those with the skills to write a TCP/IP stack that uses 256 bytes of RAM on an 8-bit processor, but that growing market will still be a shrinking portion of the even faster growing embedded device market.

The future of embedded technology is more of everything. We’ll require larger and more powerful systems, such as embedded devices running full Linux distributions like Ubuntu (even if they are in systems as simple as a pet treadmill) because it’s the easiest, most affordable solution with a fast time to market.


David Lynch owns DLA Systems. He is a software consultant and an architect, with projects ranging from automated warehouses to embedded OS ports. When he is not working with computers, he is busy attempting to automate his house and coerce his two children away from screens and into the outdoors to help build their home.


COMSOL Multiphysics 5.0 and the Application Builder

COMSOL recently announced the availability of Multiphysics 5.0 and Application Builder, with which “the power and accuracy of COMSOL Multiphysics is now accessible to everyone through the use of applications.”

Image made using COMSOL Multiphysics, provided courtesy of COMSOL

According to COMSOL, Version 5.0 includes numerous enhancements to the existing Multiphysics software. The COMSOL Multiphysics product suite includes “25 application-specific modules for simulating any physics in the electrical, mechanical, fluid, and chemical disciplines.”

  • Multiphysics – Predefined multiphysics couplings, including Joule Heating with Thermal Expansion; Induction, Microwave, and Laser Heating; Thermal Stress; Thermoelectric and Piezoelectric Effect; and more.
  • Geometry and Mesh – You can create geometry from an imported mesh and call geometry subsequences using a linked subsequence.
  • Optimization and Multipurpose – The Particle Tracing Module includes accumulation of particles, erosion, and etch features. Multianalysis optimization was added as well.
  • Studies and Solvers – Improvements were made for the simulation of CAD assemblies, support for extra dimensions, and the ability to sweep over sets of materials and user-defined functions. Improved probe-while-solving and more.
  • Materials and Functions – Materials can now be copied, pasted, duplicated, dragged, and dropped. Link to Global Materials using a Material Link when the same material is used in multiple components.
  • Mechanical – Model geometrically nonlinear beams, nonlinear elastic materials, and elasticity in joints using the products for modeling structural mechanics.
  • Fluid – Create automatic pipe connections to 3-D flow domains in the Pipe Flow Module. The CFD Module is expanded with two new algebraic turbulence models, as well as turbulent fans and grilles.
  • Electrical – The AC/DC Module, RF Module, and Wave Optics Module now contain a frequency- and material-controlled auto mesh suggestion that offers the easy, one-click meshing of infinite elements and periodic conditions. The Plasma Module now contains interfaces for modeling equilibrium discharges.
  • Chemical – The Chemical Reaction Engineering Module now contains a new Chemistry interface that can be used as a Material node for chemical reactions.

Source: COMSOL


One Professor and Two Orderly Labs

Professor Wolfgang Matthes has taught microcontroller design, computer architecture, and electronics (both digital and analog) at the University of Applied Sciences in Dortmund, Germany, since 1992. He has developed peripheral subsystems for mainframe computers and conducted research related to special-purpose and universal computer architectures for the past 25 years.

When asked to share a description and images of his workspace with Circuit Cellar, he stressed that there are two labs to consider: the one at the University of Applied Sciences and Arts and the other in his home basement.

Here is what he had to say about the two labs and their equipment:

In both labs, rather conventional equipment is used. My regular duties are essentially concerned with basic student education and hands-on training. Obviously, one does not need top-notch equipment for such comparatively humble purposes.

Student workplaces in the Dortmund lab are equipped for basic training in analog electronics.

In adjacent rooms at the Dortmund lab, students pursue their own projects, working with soldering irons, screwdrivers, drills, and other tools. Hence, these rooms are occasionally called “the blacksmith’s shop.” Two such workstations are shown.

Oscilloscopes, function generators, multimeters, and power supplies are of an intermediate price range. I am fond of analog scopes, because they don’t lie. I wonder why neither well-established suppliers nor entrepreneurs see a business opportunity in offering quality analog scopes, something that could be likened to Rolex watches or Leica analog cameras.

The orderly lab in Matthes’s home is shown here.

Matthes prefers to build mechanically sturdy projects. So his lab is appropriately equipped.

Matthes, whose research interests include advanced computer architecture and embedded systems design, pursues a variety of projects in his workspace. He describes some of what goes on in his lab:

The projects comprise microcontroller hardware and software, analog and digital circuitry, and personal computers.

Personal computer projects are concerned with embedded systems, hardware add-ons, interfaces, and equipment for troubleshooting. For writing software, I prefer PowerBASIC. Those compilers generate executables, which run efficiently and show a small footprint. Besides, they allow for directly accessing the Windows API and switching to Assembler coding, if necessary.

Microcontroller software is done in Assembler and, if required, in C or BASIC (BASCOM). As the programming language of the toughest of the tough, Assembler comes second after wire [i.e., the soldering iron].

My research interests are directed at computer architecture, instruction sets, hardware, and interfaces between hardware and software. To pursue appropriate projects, programming at the machine level is mandatory. In student education, introductory courses begin with the basics of computer architecture and machine-level programming. However, Assembler programming is only taught at a level that is deemed necessary to understand the inner workings of the machine and to write small time-critical routines. The more sophisticated application programming is usually done in C.

Bring-up and debugging of the master controller board of the digital analog computer in Matthes’s home lab. Each of the six microcontrollers is connected to a general-purpose human-interface module.

Additional photos of Matthes’s workspace and his embedded electronics and microcontroller projects are available at his new website.


Quartus II Software Arria 10 Edition v14.0

Altera Corp. has released Quartus II software Arria 10 edition v14.0, which is an advanced 20-nm FPGA and SoC design environment. Quartus II software delivers fast compile times and enables high performance for 20-nm FPGA and SoC designs. You can further accelerate Arria 10 FPGA and SoC design cycles by using the range of 20-nm-optimized IP cores included in the latest software release.

Altera’s 20-nm design tools feature advanced algorithms. The Quartus II software Arria 10 edition v14.0 provides notably fast compile times on average. This productivity advantage enables you to shorten design iterations and rapidly close timing on 20-nm designs.

Included in the latest software release is a full complement of 20-nm-optimized IP cores to enable faster design cycles. The IP portfolio includes standard protocol and memory interfaces, as well as DSP and SoC IP cores. Altera also optimized its popular IP cores for Arria 10 FPGAs and SoCs, including 100G Ethernet, 300G Interlaken, Interlaken Look-Aside, and PCI Express Gen3 IP. When implemented in Altera’s Arria 10 FPGAs and SoCs, these IP cores deliver high performance.

The Quartus II software Arria 10 edition v14.0 is available now for download. The software is available as a subscription edition and includes a free 30-day trial. The annual software subscription is $2,995 for a node-locked PC license. Engineering samples of Arria 10 FPGAs are shipping today.

Source: Altera Corp.