Electrical Engineering Crossword (Issue 282)

The answers to Circuit Cellar’s January electronics engineering crossword puzzle are now available.

282-Crossword-key

Across

4.    VENNDIAGRAM—Represents many relation possibilities [two words]
5.    PERSISTRON—Produces a persistent display
9.    CODOMAIN—A set that includes attainable values
12.    HOMOPOLAR—Electrically symmetrical
13.    TRUTHTABLE—Determines a complicated statement’s validity [two words]
17.    POWERCAPPING—Controls either the instant or the average power consumption [two words]
18.    MAGNETRON—The first form, invented in 1920, was a split-anode type
19.    MAGNETICFLUX—Φ
20.    TURINGCOMPLETE—The Z3 functional program-controlled computer, for example [two words]

Down

1.    CHAOSCOMPUTERCLUB—Well-known European hacker association [three words]
2.    LOGICLEVEL—When binary, it is high and low [two words]
3.    LINEARINTERPOLATION—A simple, but inaccurate, way to convert A/D values into engineering units [two words]
6.    SYNCHRONOUSCIRCUIT—A clock signal ensures this device’s parts are in parallel [two words]
7.    BOARDBRINGUP—Design validation process [three words]
8.    HORNERSRULE—An algorithm for any polynomial order [two words]
10.    MEALY—This machine’s current state and inputs dictate its output values
11.    SQUAREWAVE—It is produced by a binary logic device [two words]
14.    THEREMIN—Its electronic signals can be amplified and sent to a loudspeaker
15.    ABAMPERE—10 A
16.    SCOPEPROBE—Connects test equipment to a DUT [two words]

 

Q&A: Andrew Godbehere, Imaginative Engineering

Engineers are inherently imaginative. I recently spoke with Andrew Godbehere, an Electrical Engineering PhD candidate at the University of California, Berkeley, about how his ideas become realities, his design process, and his dream project. —Nan Price, Associate Editor

Andrew Godbehere

NAN: You are currently working toward your Electrical Engineering PhD at the University of California, Berkeley. Can you describe any of the electronics projects you’ve worked on?

ANDREW: In my final project at Cornell University, I worked with a friend of mine, Nathan Ward, to make wearable wireless accelerometers and find some way to translate a dancer’s movement into music, in a project we called CUMotive. The computational core was an Atmel ATmega644V connected to an Atmel AT86RF230 802.15.4 wireless transceiver. We designed the PCBs, including the transmission line to feed the ceramic chip antenna. Everything was hand-soldered, though I recommend using an oven instead. We used Kionix KXP74 tri-axis accelerometers, which we encased in a lot of hot glue to create easy-to-handle boards and to shield them from static.

This is the central control belt-pack to be worn by a dancer for CUMotive, the wearable accelerometer project. An Atmel ATmega644V and an AT86RF230 were used inside to interface to a synthesizer. The plastic enclosure has holes for the belt to attach to a dancer. Wires connect to accelerometers, which are worn on the dancer’s limbs.

The dancer had four accelerometers connected to a belt pack with an Atmel chip and transceiver. On the receiver side, a musical instrument digital interface (MIDI) communicated with a synthesizer. (Design details are available at http://people.ece.cornell.edu/land/courses/ece4760/FinalProjects/s2007/njw23_abg34/index.htm.)
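As a rough illustration of that signal chain, here is a minimal sketch of how one accelerometer sample might be turned into a MIDI note-on message. The linear magnitude-to-velocity mapping and the resting 1 g offset are assumptions made for illustration; CUMotive’s actual mapping isn’t documented here.

```python
import math

def accel_to_midi(ax, ay, az, note=60):
    """Map one accelerometer sample (in g) to a MIDI note-on tuple.

    The magnitude-to-velocity mapping below is an assumption for
    illustration only, not CUMotive's actual algorithm.
    """
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    # Subtract the 1 g gravity baseline so a motionless limb stays silent,
    # then clamp to the 0-127 MIDI velocity range.
    velocity = min(127, max(0, int((magnitude - 1.0) * 64)))
    return (0x90, note, velocity)   # 0x90 = note-on, channel 1

# A sharp ~2.5 g wrist flick would produce a fairly loud note:
print(accel_to_midi(2.0, 1.0, 1.0))
```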

I was excited about designing PCBs for 802.15.4 radios and making them work. I was also enthusiastic about trying to figure out how to make some sort of music with the product. We programmed several possibilities, one of which was a sort of theremin; another was a sort of drum kit. I found that this was the even more difficult part—not just the making, but the making sense.

When I got to Berkeley, my work switched to the theoretical. I tried to learn everything I could about robotic systems and how to make sense of them and their movements.

NAN: Describe the real-time machine vision-tracking algorithm and integrated vision system you developed for the “Are We There Yet?” installation.

ANDREW: I’ve always been interested in using electronics and robotics for art. Having a designated emphasis in New Media on my degree, I was fortunate enough to be invited to help a professor on a fascinating project.

This view of the Yud Gallery is from the installed camera with three visitors present. Note the specular reflections on the floor. They moved throughout the day with the sun. This movement needed to be discerned from a visitor’s typical movement.

For the “Are We There Yet?” installation, we used a PointGrey FireFlyMV camera with a wide-angle lens. The camera was situated a couple hundred feet away from the control computer, so we used a USB-to-Ethernet range extender to communicate with the camera.

We installed a color camera in a gallery in the Contemporary Jewish Museum in San Francisco, CA. We used Meyer Sound speakers with a high-end controller system, which enabled us to “position” sound in the space and to sweep audio tracks around at (the computer’s programmed) will. The Meyer Sound D-Mitri platform was controlled by the computer with Open Sound Control (OSC).
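Open Sound Control messages are straightforward to generate from a control program. The sketch below uses the python-osc package to sweep a sound source around a room; the IP address, port, and “/track/1/xy” path are placeholders for illustration, not the actual D-Mitri OSC namespace.

```python
import math
import time

from pythonosc.udp_client import SimpleUDPClient

# Hypothetical processor address and OSC path; the real D-Mitri namespace differs.
client = SimpleUDPClient("192.168.1.50", 9000)

# Sweep a sound source once around a unit circle over about five seconds.
for step in range(100):
    angle = 2 * math.pi * step / 100
    client.send_message("/track/1/xy", [math.cos(angle), math.sin(angle)])
    time.sleep(0.05)
```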

This view of the Yud Gallery is from the perspective of the computer running the analysis. This is a probabilistic view, where the brightness of each pixel represents the “belief” that the pixel is part of an interesting foreground object, such as a pedestrian. Note the hot spots corresponding nicely with the locations of the visitors in the image above.

The hard work was to then program the computer to discern humans from floors, furniture, shadows, sunbeams, and cloud reflections. The gallery had many skylights, which made the lighting very dynamic. Then, I programmed the computer to keep track of people as they moved and found that this dynamic information was itself useful to determine whether detected color-perturbance was human or not.

Once complete, the experience of the installation was beautiful, enchanting, and maybe a little spooky. The audio tracks were all questions (e.g., “Are we there yet?”) and they were always spoken near you, as if addressed to you. They responded to your movement in a way that felt to me like dancing with a ghost. You can watch videos about the installation at www.are-we-there-yet.org.

The “Are We There Yet?” project opens itself up to possible use as an embedded system. I’ve been told that the software I wrote works on iOS devices by the start-up company Romo (www.kickstarter.com/projects/peterseid/romo-the-smartphone-robot-for-everyone), which was evaluating my vision-tracking code for use in its cute iPhone rover. Further, I’d say that if someone were interested, they could create a similar pedestrian, auto, pet, or cloud-tracking system using a Raspberry Pi and a reasonable webcam.
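For anyone tempted to try that, here is a minimal OpenCV sketch of the same idea: an adaptive background model produces a per-pixel foreground “belief,” and anything large enough to be a pedestrian gets a bounding box. It assumes a USB webcam on the default device and is a starting point, not the installation’s actual algorithm.

```python
import cv2

# MOG2 adapts its background model over time, which helps with slowly
# drifting sunbeams and shadows like those in the gallery.
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16,
                                                detectShadows=True)

cap = cv2.VideoCapture(0)  # any reasonable USB webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)             # per-pixel foreground "belief"
    mask = cv2.medianBlur(mask, 5)             # suppress speckle noise
    _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)  # drop shadow pixels
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) > 1000:          # ignore small color perturbations
            x, y, w, h = cv2.boundingRect(c)
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("tracking", frame)              # requires a display; save frames on a headless Pi
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```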

I may create an automatic cloud-tracking system. I think computers could be capable of that kind of abstraction, even though we think of the leisurely pastime of watching clouds as the mark of a dreamer.

NAN: Some of the projects you’ve contributed to focus on switched linear systems, hybrid systems, wearable interfaces, and computation and control. Tell us about the projects and your research process.

ANDREW: I think my research is all driven by imagination. I try to imagine a world that could be, a world that I think would be nice, or better, or important. Once I have an idea that captivates my imagination in this way, I have no choice but to try to realize the idea and to seek out the knowledge necessary to do so.

For the wearable wireless accelerometers, it began with the thought: Wouldn’t it be cool if dance and music were inherently connected the way we try to make it seem when we’re dancing? From that thought, the designs started. I thought: The project has to be wireless and low power, it needs accelerometers to measure movement, it needs a reasonable processor to handle the data, it needs MIDI output, and so forth.

My switched linear systems research came about in a different way. As I was in class learning about theories regarding stabilization of hybrid systems, I thought: Why would we do it this complicated way, when I have this reasonably simple intuition that seems to solve the problem? I happened to see the problem a different way as my intuition was trying to grapple with a new concept. That naive accident ended up as a publication, “Stabilization of Planar Switched Linear Systems Using Polar Coordinates,” which I presented in 2010 at Hybrid Systems: Computation and Control (HSCC) in Stockholm, Sweden.
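A minimal sketch of the polar-coordinate idea for a single planar system (an illustration of the intuition only, not the paper’s full switching construction): for $\dot{x} = Ax$ with $x \in \mathbb{R}^2$, write $x = r\,c(\theta)$, where $c(\theta) = (\cos\theta, \sin\theta)^\top$ and $c_\perp(\theta) = (-\sin\theta, \cos\theta)^\top$. Then

$$\dot{r} = r\,c(\theta)^\top A\,c(\theta), \qquad \dot{\theta} = c_\perp(\theta)^\top A\,c(\theta),$$

so radial growth or decay depends only on $c(\theta)^\top A\,c(\theta)$, and a switching rule can choose, at each angle, a subsystem for which that quantity is negative.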

NAN: How did you become interested in electronics?

ANDREW: I always thought things that moved seemingly of their own volition were cool and inherently attention-grabbing. I would think: Did it really just do that? How is that possible?

Andrew worked on this project when computers still had parallel ports. a—This photo shows manually etched PCB traces for a digital EKG (the attempted EEG) with 8-bit LED optoisolation. The rainbow cable connects to a computer’s parallel port. The interface code was written in C++ and ran on DOS. b—The EKG circuitry and digitizer are shown on the left. The 8-bit parallel computer interface is on the right. Connecting the two boards is an array of coupled LEDs and phototransistors, encased in heat shrink tubing to shield against outside light.

Electric rally-car tracks and radio-controlled cars were a favorite of mine. I hadn’t really thought about working with electronics or computers until middle school. Before that, I was all about paleontology. Then, I saw an episode of Scientific American Frontiers, which featured Alan Alda excitedly interviewing RoboCup contestants. Watching RoboCup [a soccer game involving robotic players], I was absolutely enchanted.

While my childhood electronic toys moved and somehow acted as their own entities, they were puppets to my intentions. Watching RoboCup, I knew these robots were somehow making their own decisions on-the-fly, magically making beautiful passes and goals not as puppets, but as something more majestic. I didn’t know about the technical blood, sweat, and tears that went into it all, so I could have these romantic fantasies of what it was, but I was hooked from that moment.

That spurred me to apply to a specialized science and engineering high school program. It was there that I was fortunate enough to attend a fabulous electronics class (taught by David Peins), where I learned the basics of electronics, the joy of tinkering, and even PCB design and assembly (drilling included). I loved everything involved. Even before I became academically invested in the field, I fell in love with the manual craft of making a circuit.

NAN: Tell us about your first design.

ANDREW: Once I’d learned something about designing and making circuits, I jumped in whole-hog, to a comical degree. My very first project without any course direction was an electroencephalograph!

I wanted to make stuff move on my computer with my brain, the obvious first step. I started with a rough design and worked on tweaking parameters and finding components.

In retrospect, I think that first attempt was actually an electromyograph that read the movements of my eye muscles. And it definitely was an electrocardiograph. Success!

Someone suggested that it might not be a good idea to have a power supply hooked up in any reasonably direct path with your brain. So, in my second attempt, I digitized the signal on the brain side and hooked it up to eight white LEDs. On the other side, I had eight phototransistors coupled with the LEDs and covered with heat-shrink tubing to keep out outside light. That part worked, and I was excited about it, even though I was having some trouble properly tuning the op-amps in that version.

NAN: Describe your “dream project.”

ANDREW: Augmented reality goggles. I’m dead serious about that, too. If given enough time and money, I would start making them.

I would use some emerging organic light-emitting diode (OLED) technology. I’m eyeing the start-up MicroOLED (www.microoled.net) for its low-power “near-to-eye” display technologies. They aren’t available yet, but I’m hopeful they will be soon. I’d probably hook that up to a Raspberry Pi SBC, which is small enough to be worn reasonably comfortably.

Small, high-resolution cameras have proliferated with modern cell phones, which could easily be mounted into the sides of goggles, driving each OLED display independently. Then, it’s just a matter of creativity for how to use your newfound vision! The OpenCV computer vision library offers a great starting point for applications such as face detection, image segmentation, and tracking.
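As a taste of that starting point, the sketch below runs OpenCV’s stock Haar-cascade face detector on a single frame and draws a placeholder label where a recognized name could eventually go. The input filename is hypothetical; in the goggles it would be a grab from one of the cameras.

```python
import cv2

# The pre-trained frontal-face Haar cascade ships with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

frame = cv2.imread("camera_frame.jpg")         # hypothetical frame from a goggle camera
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 0, 0), 2)
    # A recognition step could replace this placeholder with an actual name.
    cv2.putText(frame, "name?", (x, y - 8),
                cv2.FONT_HERSHEY_SIMPLEX, 0.6, (255, 0, 0), 2)

cv2.imwrite("overlay.jpg", frame)
```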

Google Glass is starting to get some notice as a sort of “heads-up” display, but in my opinion, it doesn’t go nearly far enough. Here’s the craziest part—please bear with me—I’m willing to give up directly viewing the world with my natural eyes. I would be willing to have full field-of-vision goggles with high-resolution OLED displays showing stereoscopic views from two high-resolution smartphone-style cameras. (At least until the technology gets better, as described in Rainbows End by Vernor Vinge.) I think, for this version, all the components are just now becoming available.

Augmented reality goggles would do a number of things for vision and human-computer interaction (HCI). First, 3-D overlays in the real world would be possible.

Crude example: I’m really terrible with faces and names, but computers are now great with that, so why not get a little help and overlay nametags on people when I want? Another fascinating thing for me is that this concept of vision abstracts the body from the eyes. So, you could theoretically connect to the feed from any stereoscopic cameras around (e.g., on an airplane, in the Grand Canyon, or on the back of some wild animal), or you could even switch points of view with your friend!

Perhaps reality goggles are not commercially viable now, but I would unabashedly use them for myself. I dream about them, so why not make them?

Q&A: Marilyn Wolf, Embedded Computing Expert

Marilyn Wolf has created embedded computing techniques, co-founded two companies, and received several Institute of Electrical and Electronics Engineers (IEEE) distinctions. She is currently teaching at Georgia Institute of Technology’s School of Electrical and Computer Engineering and researching smart-energy grids.—Nan Price, Associate Editor

NAN: Do you remember your first computer engineering project?

MARILYN: My dad is an inventor. One of his stories was about using copper sewer pipe as a drum memory. In elementary school, my friend and I tried to build a computer and bought a PCB fabrication kit from RadioShack. We carefully made the switch features using masking tape and etched the board. Then we tried to solder it and found that our patterning technology outpaced our soldering technology.

NAN: You have developed many embedded computing techniques—from hardware/software co-design algorithms and real-time scheduling algorithms to distributed smart cameras and code compression. Can you provide some information about these techniques?

Marilyn Wolf

MARILYN: I was inspired to work on co-design by my boss at Bell Labs, Al Dunlop. I was working on very-large-scale integration (VLSI) CAD at the time and he brought in someone who designed consumer telephones. Those designers didn’t care a bit about our fancy VLSI because it was too expensive. They wanted help designing software for microprocessors.

Microprocessors in the 1980s were pretty small, so I started on simple problems, such as partitioning a specification into software plus a hardware accelerator. Around the turn of the millennium, we started to see some very powerful processors (e.g., the Philips Trimedia). I decided to pick up on one of my earliest interests, photography, and look at smart cameras for real-time computer vision.

That work eventually led us to form Verificon, which developed smart camera systems. We closed the company because the market for surveillance systems is very competitive. We have started a new company, SVT Analytics, to pursue customer analytics for retail using smart camera technologies. I also continued to look at methodologies and tools for bigger software systems, yet another interest I inherited from my dad.

NAN: Tell us a little more about SVT Analytics. What services does the company provide and how does it utilize smart-camera technology?

MARILYN: We started SVT Analytics to develop customer analytics software. Our goal is to do for bricks-and-mortar retailers what web retailers can do to learn about their customers.

On the web, retailers can track the pages customers visit, how long they stay at a page, what page they visit next, and all sorts of other statistics. Retailers use that information to suggest other things to buy, for example.

Bricks-and-mortar stores know what sells, but they don’t know why. Using computer vision, we can determine how long people stay in a particular area of the store, where they came from, where they go next, and whether employees are interacting with customers.
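A toy sketch of the dwell-time part of that idea is shown below. The track format and zone rectangles are invented for illustration and are not SVT Analytics’ actual data model.

```python
from collections import defaultdict

def dwell_times(track, zones):
    """Accumulate seconds spent in each named zone along one visitor's track.

    track: list of (timestamp_seconds, x, y) samples from a vision tracker
    zones: dict of zone name -> (xmin, ymin, xmax, ymax) in floor coordinates
    Both formats are assumptions made for this sketch.
    """
    totals = defaultdict(float)
    for (t0, x, y), (t1, _, _) in zip(track, track[1:]):
        for name, (xmin, ymin, xmax, ymax) in zones.items():
            if xmin <= x <= xmax and ymin <= y <= ymax:
                totals[name] += t1 - t0
                break
    return dict(totals)

# Example: a short synthetic track passing through two display areas.
zones = {"endcap": (0, 0, 2, 2), "register": (4, 0, 6, 2)}
track = [(0.0, 1.0, 1.0), (1.0, 1.5, 1.0), (2.0, 5.0, 1.0), (3.5, 5.2, 1.1)]
print(dwell_times(track, zones))   # -> {'endcap': 2.0, 'register': 1.5}
```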

Our experience with embedded computer vision helps us develop algorithms that are accurate but also run on inexpensive platforms. Bad data leads to bad decisions, but these systems need to be inexpensive enough to be sprinkled all around the store so they can capture a lot of data.

NAN: Can you provide a more detailed overview of the impact of IC technology on surveillance in recent years? What do you see as the most active areas for research and advancements in this field?

MARILYN: Moore’s law has advanced to the point that we can provide a huge amount of computational power on a single chip. We explored two different architectures: an FPGA accelerator with a CPU and a programmable video processor.

We were able to provide highly accurate computer vision on inexpensive platforms, about $500 per channel. Even so, we had to design our algorithms very carefully to make the best use of the compute horsepower available to us.

Computer vision can soak up as much computation as you can throw at it. Over the years, we have developed some secret sauce for reducing computational cost while maintaining sufficient accuracy.

NAN: You have written several books, including Computers as Components: Principles of Embedded Computing System Design and Embedded Software Design and Programming of Multiprocessor System-on-Chip: Simulink and SystemC Case Studies. What can readers expect to gain from reading your books?

MARILYN: Computers as Components is an undergraduate text. I tried to hit the fundamentals (e.g., real-time scheduling theory, software performance analysis, and low-power computing) but wrap them around real-world examples and systems.

Embedded Software Design is a research monograph that primarily came out of Katalin Popovici’s work in Ahmed Jerraya’s group. Ahmed is an old friend and collaborator.

NAN: When did you transition from engineering to teaching? What prompted this change?

MARILYN: Actually, being a professor and teaching in a classroom have surprisingly little to do with each other. I spend a lot of time funding research, writing proposals, and dealing with students.

I spent five years at Bell Labs before moving to Princeton, NJ. I thought moving to a new environment would challenge me, which is always good. And although we were very well supported at Bell Labs, ultimately we had only one customer for our ideas. At a university, you can shop around to find someone interested in what you want to do.

NAN: How long have you been at Georgia Institute of Technology’s School of Electrical and Computer Engineering? What courses do you currently teach and what do you enjoy most about instructing?

MARILYN: I recently designed a new course, Physics of Computing, which is a very different take on an introduction to computer engineering. Instead of directly focusing on logic design and computer organization, we discuss the physical basis of delay and energy consumption.

You can talk about an amazingly large number of problems involving just inverters and RC circuits. We relate these basic physical phenomena to systems. For example, we figure out why dynamic RAM (DRAM) gets bigger but not faster, then see how that has driven computer architecture as DRAM has hit the memory wall.
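A back-of-the-envelope calculation in that spirit: treat a DRAM bitline as a distributed RC wire whose total resistance and capacitance both grow with the number of cells attached, so its Elmore delay grows roughly with the square of the array dimension. The per-cell values below are made up purely for illustration.

```python
# Toy model: a DRAM bitline as a distributed RC wire. Per-cell values are
# assumed for illustration only, not taken from any real process.
r_per_cell = 2.0        # ohms of bitline resistance added per attached cell
c_per_cell = 0.05e-15   # farads of bitline capacitance added per attached cell

for cells in (128, 256, 512, 1024):
    # Elmore delay of a distributed RC line is roughly (R_total * C_total) / 2,
    # so doubling the number of cells roughly quadruples the wire delay.
    delay = 0.5 * (r_per_cell * cells) * (c_per_cell * cells)
    print(f"{cells:5d} cells on the bitline -> ~{delay * 1e12:.2f} ps")
```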

NAN: As an engineering professor, you have some insight into what excites future engineers. With respect to electrical engineering and embedded design/programming, what are some “hot topics” your students are currently attracted to?

MARILYN: Embedded software—real-time, low-power—is everywhere. The more general term today is “cyber-physical systems,” which are systems that interact with the physical world. I am moving slowly into control-oriented software from signal/image processing. Closing the loop in a control system makes things very interesting.

My Georgia Tech colleague Eric Feron and I have a small project on jet engine control. His engine test room has a 6” thick blast window. You don’t get much more exciting than that.

NAN: That does sound exciting. Tell us more about the project and what you are exploring with it in terms of embedded software and closed-loop control systems.

MARILYN: Jet engine designers are under the same pressures now that have faced car engine designers for years: better fuel efficiency, lower emissions, lower maintenance cost, and lower noise. In the car world, CPU-based engine controllers were the critical factor that enabled car manufacturers to simultaneously improve fuel efficiency and reduce emissions.

Jet engines need to incorporate more sensors and more computers to use those sensors to crunch the data in real time and figure out how to control the engine. Jet engine designers are also looking at more complex engine designs with more flaps and controls to make the best use of that sensor data.

One challenge of jet engines is the high temperatures. Jet engines are so hot that some parts of the engine would melt without careful design. We need to provide more computational power while living with the restrictions of high-temperature electronics.

NAN: Your research interests include embedded computing, smart devices, VLSI systems, and biochips. What types of projects are you currently working on?

MARILYN: I’m working with Santiago Grijalva of Georgia Tech on smart-energy grids, which are really huge systems that would span entire countries or continents. I continue to work on VLSI-related topics, such as the work on error-aware computing that I pursued with Saibal Mukhopadhyay.

I also work with my friend Shuvra Bhattacharyya on architectures for signal-processing systems. As for more unusual things, I’m working on a medical device project that is at the early stages, so I can’t say too much specifically about it.

NAN: Can you provide more specifics about your research into smart energy grids?

MARILYN: Smart-energy grids are also driven by the push for greater efficiency. In addition, renewable energy sources have different characteristics than traditional coal-fired generators. For example, because winds are so variable, the energy produced by wind generators can quickly change.

The uses of electricity are also more complex, and we see increasing opportunities to shift demand to level out generation needs. For example, electric cars need to be recharged, but that can happen during off-peak hours. But energy systems are huge. A single grid covers the eastern US from Florida to Minnesota.

To make all these improvements requires sophisticated software and careful design to ensure that the grid is highly reliable. Smart-energy grids are a prime example of Internet-based control.

We have so many devices on the grid that need to coordinate that the Internet is the only way to connect them. But the Internet isn’t very good at real-time control, so we have to be careful.

We also have to worry about security. Internet-enabled devices enable smart-grid operations, but they also provide opportunities for tampering.

NAN: You’ve earned several distinctions. You were the recipient of the Institute of Electrical and Electronics Engineers (IEEE) Circuits and Systems Society Education Award and the IEEE Computer Society Golden Core Award. Tell us about these experiences.

MARILYN: These awards are presented at conferences, and the presentation is a very warm, happy experience. Everyone is happy. These events are a time to celebrate the field and the many friends I’ve made through my work.

Electrical Engineering Crossword (Issue 281)

The answers to Circuit Cellar’s December electronics engineering crossword puzzle are now available.

281-Crossword-key

Across

4. VENNDIAGRAM—Represents many relation possibilities [two words]
5. PERSISTRON—Produces a persistent display
9. CODOMAIN—A set that includes attainable values
12. HOMOPOLAR—Electrically symmetrical
13. TRUTHTABLE—Determines a complicated statement’s validity [two words]
17. POWERCAPPING—Controls either the instant or the average power consumption [two words]
18. MAGNETRON—The first form, invented in 1920, was a split-anode type
19. MAGNETICFLUX—Φ
20. TURINGCOMPLETE—The Z3 functional program-controlled computer, for example [two words]

Down

1. CHAOSCOMPUTERCLUB—Well-known European hacker association [three words]
2. LOGICLEVEL—When binary, it is high and low [two words]
3. LINEARINTERPOLATION—A simple, but inaccurate, way to convert A/D values into engineering units [two words]
6. SYNCHRONOUSCIRCUIT—A clock signal ensures this device’s parts are in parallel [two words]
7. BOARDBRINGUP—Design validation process [three words]
8. HORNERSRULE—An algorithm for any polynomial order [two words]
10. MEALY—This machine’s current state and inputs dictate its output values
11. SQUAREWAVE—It is produced by a binary logic device [two words]
14. THEREMIN—Its electronic signals can be amplified and sent to a loudspeaker
15. ABAMPERE—10 A
16. SCOPEPROBE—Connects test equipment to a DUT [two words]

Electrical Engineering Crossword (Issue 280)

The answers to Circuit Cellar’s November electronics engineering crossword puzzle are now available.

280-crossword-key

Across

2.    DOPPLERLIMIT—Temperature restrictions for laser cooling techniques [two words]
4.    PINKNOISE—This signal sees the world through rose-colored glasses [two words]
5.    BOLTZMANNCONSTANT—k or kB [two words]
7.    XORGATE—A half adder is made of an AND gate and one of these [two words]
14.    NEPER—Symbolized by “Np”
15.    GOTOPAIR—Two tunnel diodes used in high-speed gate circuits [two words]
17.    YAGI—Unidirectional antenna
18.    PHOSPHOR—Used as a light source in a cathode ray tube
19.    KERREFFECT—All materials show this, but certain liquids display it more strongly than others [two words]

Down

1.    WEINBRIDGEOSCILLATOR—This type generates sine waves [three words]
3.    INTEGRATEDINJECTIONLOGIC—These digital circuits are built with several collector BJTs [three words]
6.    CARBONNANOTUBES—Their electronic properties can be metallic or semiconducting [two words]
8.    GILBERTCIRCUIT—Uses diodes and transistors’ logarithmic properties to compensate for nonlinearities and instabilities [two words]
9.    ADDRESSINGMODE—Can be implied by the instruction’s function [two words]
10.    SINADRATIO—Used to measure a signal’s standards
11.    FLATPACK—Semiconductor network sealed in a thin rectangular package
12.    SPEECHCLIPPING—Process that limits peak signals [two words]
13.    COMPANDOR—Condenses or enlarges an electric signal’s dynamic range
15.    GLASS—Device under development by search engine giant
16.    LILLIPUTIAN—A very small robot