So many bytes, so little time. Five years ago, I found myself looking for a new career. After 20 years in the automotive sector, the economic downturn hit home and my time had come. I was lucky enough to find a position at the University of Notre Dame designing and building lab instrumentation and data acquisition equipment in the Department of Civil and Environmental Engineering & Earth Sciences, and teaching microprocessor programming in the evenings at Ivy Tech Community College. The transition from industry to the academic world has been both challenging and rewarding. Component and system modeling using computer simulation is an integral part of every engineering discipline, and much of today's industry simulation software started out in a university computer lab.
A successful computer simulation of a physical phenomenon has several requirements. The first is a stable model based on a set of equations appropriate to the physics and scale of the event. For complex systems, such a model may not exist, and a simplified model may be used to approximate the event as closely as possible, with assumptions made where data is scarce. The second requirement is a set of initial conditions: starting values for every variable in the equations. These values are usually determined by running real-world experiments and capturing a "snapshot" of the event at a specific time, so the quality of the data depends on the technology available at the time. The technology behind sensors and data acquisition for these experiments is evolving at an incredible rate. Sensors that cost $500 ten years ago are available now for $5, miniaturized to one-tenth of their original size to fit into a cell phone or smart band. Equipment that was once too large to leave the lab is now pocket sized and portable. Researchers are taking advantage of this and collecting far more data than was ever imagined.
So how will this affect the future of simulation? Multicore processors and distributed computing are allowing researchers to run more simulations and get results more quickly. Our world has become Internet driven and people want data immediately, so data must become available as close to real time as possible. As more and more sensors become wireless, low cost, energy efficient, and "smart" thanks to the Internet of Things movement, empirical data is available from places never before conceived. Imagine the possible advancements in weather modeling and forecasting if every cell phone in the world automatically sent temperature, humidity, barometric pressure, GPS, and light intensity data to a cloud database. More sensors mean higher simulation resolution and greater accuracy.
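To make the cell-phone weather scenario concrete, here is a minimal sketch of how such crowd-sourced readings might be aggregated: snap each GPS-tagged report to a coarse grid cell and average the readings per cell. All coordinates, values, and the grid resolution below are made-up illustrations, not part of any real system.

```python
from collections import defaultdict

# Hypothetical crowd-sourced readings: (latitude, longitude, temperature in C)
readings = [
    (41.70, -86.23, 21.5),
    (41.71, -86.24, 22.0),
    (41.70, -86.25, 21.8),
    (40.42, -86.91, 24.1),
]

def grid_cell(lat, lon, resolution=0.5):
    """Snap a GPS coordinate to a grid cell of the given resolution (degrees)."""
    return (round(lat / resolution) * resolution,
            round(lon / resolution) * resolution)

def average_by_cell(readings, resolution=0.5):
    """Average the temperature readings that fall within each grid cell."""
    cells = defaultdict(list)
    for lat, lon, temp in readings:
        cells[grid_cell(lat, lon, resolution)].append(temp)
    return {cell: sum(temps) / len(temps) for cell, temps in cells.items()}

print(average_by_cell(readings))
```

The grid resolution is the knob that trades sensor density for simulation resolution: the more phones reporting, the finer the grid can be while still having enough readings per cell to average out individual sensor error.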
The popular saying "garbage in = garbage out" still applies, and it is the bane of the Internet. Our future programmers must be able to sift through all of this new data and separate the good from the bad. Malicious hackers enjoy destroying databases, so security is a major concern. Some new technology that could be useful in research is being rejected by the public because of criminal use. For example, a UAV "drone" that can survey a farmer's crop can also deliver contraband or cause havoc at an airport or sporting event. While these issues are being tackled in the courtroom and by the FAA, researchers are waiting to take more data.
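Separating good readings from bad ones can start with something as simple as the sketch below: reject values outside a physically plausible range, then drop statistical outliers using the median absolute deviation. The thresholds and the sample feed are invented for illustration; a real pipeline would tune these to the sensor in question.

```python
import statistics

def filter_readings(values, plausible=(-60.0, 60.0), max_dev=3.0):
    """Discard readings outside a plausible physical range, then drop
    values more than max_dev median-absolute-deviations from the median."""
    lo, hi = plausible
    in_range = [v for v in values if lo <= v <= hi]
    if len(in_range) < 3:
        return in_range          # too few points to judge outliers
    med = statistics.median(in_range)
    mad = statistics.median(abs(v - med) for v in in_range)
    if mad == 0:
        return in_range          # all readings agree; nothing to drop
    return [v for v in in_range if abs(v - med) / mad <= max_dev]

# Hypothetical temperature feed with a glitch (999.0) and a spoofed value (-40.0)
raw = [21.4, 21.6, 21.5, 999.0, 21.7, -40.0, 21.3]
print(filter_readings(raw))  # keeps only the cluster around 21.5
```

Note that the range check alone is not enough: -40.0 is a physically possible temperature, so only the statistical test catches it. Teaching students to layer checks like this is exactly the "good data from questionable data" skill the column argues for.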
Simulation is still only a guess at what may happen under specific conditions, based on assumptions about how our world works. Advances in sensor and data acquisition technology will continue to improve the accuracy of those guesses, as long as we can depend on the reliability of the input sources and keep the malicious hackers out of the databases. Schools still need to train students to distinguish good data from questionable data. The terabyte question for the future of simulation is whether we will be able to find the data we need, in the format we need, by searching through all these new data sources in less time than it would take to run the original experiments ourselves. So many bytes, so little time.