
The Future of Engineering Research & Systems Modeling

So many bytes, so little time. Five years ago, I found myself looking for a new career. After 20 years in the automotive sector, the economic downturn hit home and my time had come. I was lucky enough to find a position at the University of Notre Dame designing and building lab instrumentation and data acquisition equipment in the Department of Civil and Environmental Engineering & Earth Sciences, and teaching microprocessor programming in the evenings at Ivy Tech Community College. The transition from industry to the academic world has been challenging and rewarding. Component and system modeling using computer simulation is an integral part of every engineering discipline, and much of industry's simulation software started out in a university computer lab.

A successful computer simulation of a physical phenomenon has several requirements. The first is a stable model based on a set of equations relating to the physics and scale of the event. For complex systems, such a model may not exist, and a simplified model may be used to approximate the event as closely as possible, with assumptions made where data is scarce. The second requirement is a set of initial conditions: the values every equation variable needs to start the simulation. These values are usually determined by running real-world experiments and capturing a "snapshot" of the event at a specific time. The quality of this data depends on the technology available at the time, and the technology behind sensors and data acquisition is evolving at an incredible rate. A sensor that cost $500 ten years ago may now sell for $5, miniaturized to one tenth of its original size so that it fits into a cell phone or smart band. Equipment that was once too large to leave a lab is now pocket sized and portable. Researchers are taking advantage of this and collecting far more data than was ever imagined.
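To make those two requirements concrete, here is a minimal sketch in Python of a toy simulation. The model is a damped mass-spring oscillator, a hypothetical stand-in for a real system, and the initial conditions play the role of the experimental "snapshot." Every numeric value here is invented for illustration.

```python
# A minimal sketch of the two ingredients described above: a model
# (a damped mass-spring oscillator, a hypothetical stand-in for a
# real system) and initial conditions captured as an experimental
# "snapshot". All numeric values are invented for illustration.

m = 0.5      # mass, kg (assumed)
k = 40.0     # spring constant, N/m (assumed)
c = 0.8      # damping coefficient, N*s/m (assumed)

# Initial conditions: the "snapshot" of the system at t = 0
x = 0.02     # displacement, m (e.g., read from a position sensor)
v = 0.0      # velocity, m/s

dt = 0.001   # time step, s
for step in range(5001):
    if step % 1000 == 0:
        print(f"t={step * dt:.3f} s  x={x:+.5f} m  v={v:+.5f} m/s")
    a = (-k * x - c * v) / m   # Newton's second law for this model
    x += v * dt                # forward-Euler integration step
    v += a * dt
```

In a real research code the model would be far richer and the initial displacement would come from an actual sensor reading, but the structure is the same: equations plus a starting snapshot.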

So how will this affect the future of simulation? Multicore processors and distributed computing are allowing researchers to run more simulations and get results more quickly. Our world has become Internet driven and people want data immediately, so data must become available as close to real time as possible. As more and more sensors become wireless, low cost, energy efficient, and "smart" thanks to the Internet of Things movement, empirical data is available from places never before conceived. Imagine the possible advancements in weather modeling and forecasting if every cell phone in the world automatically sent temperature, humidity, barometric pressure, GPS, and light-intensity data to a cloud database. More sensors mean higher simulation resolution and greater accuracy.
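As a hedged illustration of that crowd-sensing idea, the sketch below shows one device pushing readings to a cloud database over HTTP. The endpoint URL, the field names, and the read_sensors() helper are all hypothetical; a real deployment would also need authentication, local buffering, and retries.

```python
# A sketch of one node in the crowd-sensing scenario above: a device
# periodically posting its readings to a cloud database. The URL,
# payload fields, and read_sensors() are hypothetical placeholders.
import time
import requests

ENDPOINT = "https://example.com/api/v1/readings"  # hypothetical URL

def read_sensors():
    """Placeholder for real sensor reads on the device."""
    return {
        "temperature_c": 21.4,   # illustrative values only
        "humidity_pct": 48.0,
        "pressure_hpa": 1013.2,
        "lat": 41.70, "lon": -86.24,
        "light_lux": 320.0,
    }

while True:
    payload = read_sensors()
    payload["timestamp"] = time.time()
    try:
        requests.post(ENDPOINT, json=payload, timeout=5)
    except requests.RequestException:
        pass  # a real node would buffer the reading and retry later
    time.sleep(60)  # one reading per minute
```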

A popular saying, "garbage in = garbage out," still applies, and is the bane of the Internet. Our future programmers must be able to sift through all of this new data and separate the good from the bad. Evil hackers enjoy destroying databases, so security is a major concern. Some new technology that could be useful in research is being rejected by the public because of criminal misuse. For example, a UAV "drone" that can survey a farmer's crop can also deliver contraband or cause havoc at an airport or sporting event. While these issues are worked out in the courtroom and at the FAA, researchers are waiting to take more data.
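What might sifting the good from the bad look like in practice? One minimal approach, sketched below, is to apply plausibility range checks and a simple outlier filter before any reading is allowed into a simulation. The valid ranges and the three-sigma rule are illustrative choices, not a standard.

```python
# "Garbage in = garbage out": a minimal sketch of sanity-checking
# incoming readings before they feed a simulation. The ranges and
# the three-sigma cutoff are illustrative, not a standard.
import statistics

VALID_RANGE = {"temperature_c": (-60.0, 60.0),
               "pressure_hpa": (870.0, 1085.0)}

def in_range(field, value):
    lo, hi = VALID_RANGE[field]
    return lo <= value <= hi

def reject_outliers(values, n_sigma=3.0):
    """Drop points more than n_sigma standard deviations from the mean."""
    mu = statistics.fmean(values)
    sigma = statistics.stdev(values)
    return [v for v in values if abs(v - mu) <= n_sigma * sigma]

raw = [21.2, 21.4, 21.3, 98.6, 21.5]   # one bogus reading
clean = [v for v in raw if in_range("temperature_c", v)]
clean = reject_outliers(clean)
print(clean)   # the 98.6 reading is filtered out by the range check
```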

Simulation is still only a guess at what may happen under specific conditions, based on assumptions about how our world works. Advancements in sensor and data acquisition technology will continue to improve the accuracy of those guesses, as long as we can depend on the reliability of the input sources and keep the evil hackers out of the databases. Schools still need to train students to distinguish good data from questionable data. The terabyte question for the future of simulation is whether we will be able to find the data we need, in the format we need, by searching through all these new data sources in less time than it would take to run the original experiments ourselves. So many bytes, so little time.



R. Scott Coppersmith earned a BSc in Electrical Engineering at Michigan Technological University. He held several engineering positions in the automotive industry from the late 1980s until 2010, when he joined the University of Notre Dame's Civil Engineering and Geological Sciences department as a Research Engineer to help build an Environmental Fluid Dynamics laboratory and assist students, faculty, and visiting researchers with their projects. Scott also teaches a variety of engineering courses (e.g., Intro to Microcontrollers and Graphic Communication for Manufacturing) at Ivy Tech Community College.