By Peter Weiss
A decade ago, Philip H. Rittmueller was a man on a mission. By the early 1990s, the automobile industry knew that airbags, while successful at saving lives in crashes, could also prove deadly to children and small adults (SN: 9/26/98, p. 206: https://www.sciencenews.org/pages/sn_arc98/9_26_98/bob3.htm). As an engineer with NEC Technologies Automotive Electronics Division at that time, Rittmueller was looking for a technological fix for this lethal threat. Yet none of the approaches he knew of for automatically sizing up car seat occupants—including weight sensing, ultrasonic scanning, and optical imaging—seemed good enough. “I was looking under all sorts of rocks,” Rittmueller recalls.
Then, in the fall of 1994 at the Massachusetts Institute of Technology, Rittmueller found the right rock. He was visiting the university’s hotbed of invention, known as the Media Lab, when he saw what looked like a throne flanked by two lighted Plexiglas poles, and he viewed a startling video showing how such a “spirit chair” was used in magic shows by the famous duo Penn and Teller.
In the video, Penn sat in the chair. As he gestured wildly with both hands and feet, drums, trumpets, cymbals, and other musical sounds blared out. No wires linked Penn to the synthesizers making the sounds, yet the device sensed his every move.
When Rittmueller tried the gadget in the lab, its speed and three-dimensional awareness were “amazing,” he recalls. If such a wireless-sensing system could be adapted to track an occupant of an automobile, he figured, he would be on his way to a superior airbag controller that could determine the size and position of a passenger or driver and then judge whether it was dangerous to fire the airbag to its full extent.
On the spot, Rittmueller and the chair’s designers started sketching possible airbag-related designs. Today, that collaboration between MIT and Rittmueller’s Suwanee, Ga.–based automotive-electronics company, now called Elesys North America, is bearing its first fruit—an airbag controller that’s already in some cars and soon to be introduced in many more.
The commercial use of the technology is expected to mushroom beyond the airbag niche, say developers of electric field imaging, the heart of the new technology. “We’re letting objects know what’s around them,” says Media Lab physicist Neil A. Gershenfeld. In this increasingly automated world, any technology that can do that reliably, cheaply, and autonomously could be useful in many places.
Taking charge
Imaging things in the world by means of electric fields actually started out as nature’s own technology.
Several species of fish in South America and Africa generate and detect weak electric fields. Because small fish, larvae, and other prey perturb the electric field around the field-generating fish, voltage-sensitive cells in that fish’s skin can detect the objects. Weakly electric fish, as they are called, also use electric signals to recognize and attract potential mates.
Electric field sensing has made it into the technoscape too. Anyone who has pushed one of those elevator buttons that responds to a finger’s touch without itself moving has triggered an electric field sensor. A weak electric field continuously emanates from such buttons. When a finger contacts the button, it distorts that field, increasing the current in the button’s circuit. The elevator’s control circuitry then interprets that change as, say, the “go to floor 10” command.
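In essence, the button’s controller is just watching for a reading that climbs past a threshold. The Python sketch below illustrates that idea; the current values, threshold, and function names are invented for illustration and don’t describe any actual elevator hardware.

```python
# Illustrative sketch only: the kind of threshold test a touch-button
# controller might perform. The readings, baseline, and threshold here
# are invented for illustration and don't describe any real product.

BASELINE_CURRENT = 1.00   # current with nothing near the button (arbitrary units)
TOUCH_THRESHOLD = 0.15    # rise above baseline that counts as a touch

def read_button_current() -> float:
    """Stand-in for hardware that samples the current in the sensing circuit."""
    return 1.18  # a fixed example value; real hardware would read an ADC

def button_is_touched() -> bool:
    # A nearby finger distorts the button's electric field, which shows up
    # as a rise in current above the no-touch baseline.
    return read_button_current() - BASELINE_CURRENT > TOUCH_THRESHOLD

if button_is_touched():
    print("Touch detected: issue the 'go to floor 10' command")
else:
    print("No touch")
```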
Similar ways of using electric fields to sense nearby objects have been around for the better part of a century. Back in the 1920s, the Russian inventor Leon Theremin created one of the first electronic musical instruments, the theremin, which was based on electric field sensing.
Today, in addition to elevators, electric field sensing shows up in computer touch screens and touch pads, as well as in stud finders that locate wooden supports in walls. Still, “That’s the old-fashioned measurement,” notes Media Lab physicist Joseph A. Paradiso.
Media Lab researchers in the early 1990s realized that it might be possible to use electric fields to image objects, not just to detect their presence. The scientists weren’t looking for crisp optical imaging, but rather a fuzzier result that would be good enough to make out sizes, shapes, and positions of objects. This imprecise approach saves processing time and resources, says Joshua R. Smith of Intel Research in Seattle.
In 1991, the MIT researchers began adding electronics to conventional musical instruments, such as cellos and violins, so that computers could detect players’ bowing or other movements and produce electronic sounds to accompany the natural tones of the instrument. Tod Machover, a Media Lab composer and electronic-instrument developer, and Gershenfeld built the first such hyperinstrument, a hypercello for Yo-Yo Ma.
Working on a wireless bow for a hyperviolin in 1993, Gershenfeld, Paradiso, and their colleagues ran into some puzzling observations stemming from unexpected ways that objects, including people, can perturb electric fields.
In some configurations, for example, part of a person or object would intercept a portion of a field emanating from an electrode. “It’s like you’re casting a shadow,” Paradiso notes.
In other configurations, some conductive object—for instance, a person—would effectively meld with the field-emitting electrode, thereby changing the geometry of the system. “You’re like the antenna. You’re like a piece of metal,” Paradiso adds.
What might have loomed as engineering annoyances, however, instead became new opportunities to gather information about the sizes, shapes, conductivities, masses, and positions of objects in a field-filled space. The Media Lab researchers designed and built arrays of electrodes and associated circuitry, as well as mathematical models and computer algorithms, to analyze the signals. “People started doing electric field interfaces all over [at the Media Lab],” Paradiso recalls. In honor of the natural masters of electric field sensing, the researchers gave their equipment names such as Smart Fish, Lazy Fish, Scan Fish, and School of Fish.
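To get a feel for how readings from an electrode array can yield position information, consider the rough Python sketch below. It uses a simple signal-weighted centroid, only one of many possible approaches; the electrode layout and readings are invented for illustration and aren’t drawn from the Media Lab designs.

```python
# Illustrative sketch only: turning readings from an array of electrodes
# into a rough 2-D position by taking a signal-weighted centroid. The
# electrode layout and readings are invented for illustration and are not
# taken from the Media Lab designs.

# (x, y) positions of four sensing electrodes, in centimeters (assumed layout)
ELECTRODES = [(0.0, 0.0), (30.0, 0.0), (0.0, 30.0), (30.0, 30.0)]

def estimate_position(signals):
    """Electrodes that report stronger coupling pull the estimate toward
    themselves. Crude, but in the spirit of 'fuzzy' imaging."""
    total = sum(signals)
    if total == 0:
        return None  # nothing detected in the sensing volume
    x = sum(s * ex for s, (ex, _) in zip(signals, ELECTRODES)) / total
    y = sum(s * ey for s, (_, ey) in zip(signals, ELECTRODES)) / total
    return x, y

# Example: a hand nearest the electrode at (30, 30) couples to it most strongly.
print(estimate_position([0.2, 0.5, 0.3, 1.0]))  # roughly (22.5, 19.5)
```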
Fields of dreams
By the mid-1990s, the MIT team had developed not only the spirit chair but other gadgets that responded to gestures: electrode-equipped tubular frames and a field-sensing wall, for example. These converted arm and body motions into signals that drove music synthesizers. Other efforts led to musical toys for children, though they’re not yet on the market.
In another development, a stage backdrop for the juggling troupe The Flying Karamazov Brothers used electric field sensing of the performers to create the illusion that they were juggling unlikely objects, which were actually just images projected on a screen in response to the performers’ movements.
Along less-theatrical lines, electrodes hidden in a table monitored hand positions and motions to determine how best to display information on screens projected on the table. As part of a 1999 architectural exhibit at the Museum of Modern Art in New York City, the table provided details about the exhibit.
In each of these devices, the MIT researchers implemented crude imaging. A long-term goal for the imaging technology, they say, is to create more detailed representations of three-dimensional contours of people or objects within a particular space.
Although images derived from electric field data will always be “kind of blurry,” the technique can pinpoint objects in space, Smith says. Some devices that locate the center of a hand or other object “can be precise to millimeters, or even better,” he notes.
The MIT team hasn’t been alone in devising prototypes in which spatial awareness derives from electric fields. In 1990, for instance, scientists at Bell Laboratories in Murray Hill, N.J., patented a music-making baton that sends radio signals to electrodes linked to a synthesizer.
“There are pieces of [electric-field imaging] that are new and pieces that are based on preexisting and familiar technologies,” but the end result is new, Smith notes. Until recently, the benefits of using arrays of electrodes in the shadow and antenna configurations had been largely overlooked, he says.
Because electric field interactions are so complex, the math needed to extract actual images of objects from electric-field data is trickier than the already-challenging calculations needed to make pictures from the data collected in computer-assisted tomography scans or magnetic resonance images.
As an MIT graduate student, Smith worked out ways to simplify that math so that computers could carry it out in reasonable amounts of time. He has used this achievement in his Field Mouse, a prototype computer-input device that recognizes the gestures of a person’s hand.
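As a very loose illustration of what such an inversion can look like in its simplest, linearized form, here is a toy Python sketch. The sensitivity matrix, readings, and regularization value are all invented; the real problem is nonlinear and considerably harder, which is why Smith’s simplifications mattered.

```python
# Illustrative sketch only: a toy, linearized version of the inverse problem.
# Real electric field imaging is nonlinear and much harder; this just shows
# the general shape of "electrode readings in, coarse occupancy map out."
# The sensitivity matrix, readings, and regularization value are invented.

import numpy as np

# Each row says how strongly one electrode's reading responds to an object
# sitting in each of four coarse grid cells (a made-up sensitivity model).
A = np.array([
    [0.9, 0.3, 0.2, 0.1],
    [0.3, 0.9, 0.1, 0.2],
    [0.2, 0.1, 0.9, 0.3],
    [0.1, 0.2, 0.3, 0.9],
])

readings = np.array([0.5, 0.4, 1.0, 0.8])  # example electrode measurements

# Regularized least squares: favor the simplest occupancy map that is
# consistent with the readings, which keeps the answer stable despite noise.
lam = 0.1
occupancy = np.linalg.solve(A.T @ A + lam * np.eye(4), A.T @ readings)
print(occupancy)  # larger values suggest "something is here" in that cell
```

The point is only the overall shape of the calculation: electrode readings go in, and a coarse map of where something sits comes out.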
Fish in chips
In 1999, only a few years after Rittmueller waded into the creative tumult at MIT with his quest for an airbag controller, Honda began rolling out models equipped with side airbags controlled by electric field imaging systems. This was a voluntary step on Honda’s part, since the U.S. government doesn’t regulate deployment of side airbags.
As of last year, however, front airbags on more and more cars sold in the United States have to distinguish small drivers and passengers from big ones. Most carmakers now pass the new standards by means of mechanical weight-sensing systems in the cars’ seats. Starting this summer, however, two General Motors models will feature electric field sensing systems. A French car seat maker, Faurecia, has also demonstrated an electric field–based system.
Working together with MIT and Rittmueller’s group, electronics giant Motorola has developed an electric field imaging microchip. It went into production for automotive use in 2002. The device takes the place of about 90 discrete electronic components, says electrical engineer Kevin S. Anderson of Freescale, a Phoenix-based Motorola spin-off.
Since February 2003, the chip has also been on the market for general-purpose electric field imaging beyond automotive applications. In January, the chip won a “Product of the Year” award from Electronic Products magazine.
The availability of the chip appears to be creating ferment among technology developers. “We’ve shipped millions of the devices,” mostly for the automotive market, Anderson says. Twenty products incorporating the chip are imminent, and hundreds more are in the conceptual stage, he adds.
“People are finding a lot of neat things to apply [the chip] to—things we never thought of ourselves,” says Ron DeLong, a recently retired engineer who led the development of the chip for Motorola. Although companies aren’t yet revealing these products, the Media Lab developments over the past decade hint at what’s in store.
Also, a contest held last year by the electronics magazine Circuit Cellar suggests other potential uses of the new chip. Winning entries included an antitheft briefcase, a sleep-monitoring system, and an electronic whoopee cushion. Other products might rely on electric fields to detect water from leaky pipes or a pot boiling over.
Meanwhile, an architectural firm in Los Angeles has revealed plans to use the chip in the pavement of a building’s courtyard to detect passersby as part of a public-art installation. The system will control an electronic-lighting display depicting faux shadows of passersby moving across the face of the building.
As popular as the chip is, it implements only some of the capabilities explored by the MIT team. A next-generation device will provide more-refined measurements, DeLong says. Microchips connected to advanced electrodes in walls and floors could locate and count people as they move, says Anderson.
By then, the human technology may finally rival nature’s. As DeLong puts it, “We’re trying to get to where we’re as good as those fish.”