Sensors: The Next Wave of Infotech Innovation
From 1997 Ten-Year Forecast
The infotech revolution is fifty years young: despite all the innovation and surprises served up to date, it is quite clear that far greater change lies ahead. We marvel at how computers have insinuated themselves into every corner of our lives, knowing all the while that in a few years today’s marvels will seem quaint compared to what follows. Amid all this change, a half-century of history provides us with one important constant — a clear trajectory of innovation and consequence that reveals important insights about the nature of surprises to come.
It turns out that about once a decade a new technology comes along that completely reshapes the information landscape. Just before 1980, that key enabling technology was the microprocessor, and its arrival set off a decade-long processing revolution symbolized by the personal computer. In a classic confusion of cause and effect, we called it the “Personal Computer Revolution,” but it was really a processing revolution, a decade during which we were utterly preoccupied with processing everything we could stuff into our machines.
Then just as the 1980s were closing, another new enabling technology came along to displace the centrality of the microprocessor — cheap lasers. Much as the microprocessor had slipped into the lives of ordinary citizens hidden in PCs a decade earlier, lasers slipped into their lives hidden in everyday appliances — compact disc music players, CD-ROMs, and long-distance optical fiber phone lines. Lasers delivered bandwidth — huge volumes of storage on optical disk and high-quality communications bandwidth over optical fiber.
The consequence was a shift in emphasis from processing to access (Figure 1). In the 1980s, the processing decade, our devices were defined by what they processed. In the 1990s, the access decade, our devices are defined by what they connect us to. The advent of cheap lasers completely reinvented our desktop environment. Machines on the desk outwardly looked the same but changed profoundly in function, from stand-alone devices defined by what they processed to networked devices defined by what they connected us to, from 1980s-era “data laundries” to 1990s network windows on a larger information world.
Figure 1: The Shift From Processing to Access
Just as the PC symbolized the 1980s processing revolution, the centerpiece of today’s laser-enabled access revolution is the Internet in general and the World Wide Web in particular. Web surfing would be an outlandish impracticality but for massive amounts of laser-enabled fiber-optic bandwidth.
Note one important detail. The arrival of each successive new technology does not make the older technology obsolete. Microprocessors did not become irrelevant in the laser decade. In fact, lasers and the access they enabled created demand for new kinds of access-oriented microprocessors, such as digital signal processor (DSP) chips. The communications tail is now wagging the processing dog: cheap lasers generated enormous demand for microprocessor innovation.
We are approaching the end of the laser decade, and even though a few laser-enabled surprises are still waiting in the wings, we are beginning to see diminishing returns from merely adding more bandwidth to our access-oriented world.(1) What, then, will replace lasers as the foundational technology of the next decade?
Hints are lurking in many areas, but one of the most intriguing indicators appeared in Los Angeles in the last two years. What is the most popular item to steal out of automobiles in Los Angeles today? Air bags — because they contain an expensive and not-entirely-reliable accelerometer trigger. The consequence has been a booming market for replacement air bags — a demand thieves are happy to fulfill.
Air bags are about to become too cheap to steal, however, because, using MEMS (MicroElectroMechanical systems) technology, one can build an accelerometer on a single chip for a couple of dollars, creating a device that is not only cheaper than today’s sensors, but also smarter and more reliable. Today’s systems dumbly explode whenever they sense an abrupt acceleration, whether or not a passenger is present. Future systems will incorporate sensors capable of identifying not only the presence of a passenger, but their weight and size as well, and adjusting the force of inflation accordingly.
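The contrast between today’s dumb trigger and a sensor-informed one can be sketched in a few lines of logic. This is a minimal illustration, not an automotive algorithm: the 20 g crash threshold, the 80-kilogram reference weight, and the 50% force floor are all invented for the example.

```python
# A hypothetical sketch of the "smart" airbag logic described above.
# The threshold and scaling values are illustrative, not real automotive
# calibration figures.
from typing import Optional

CRASH_THRESHOLD_G = 20.0  # deceleration, in g, treated as a crash


def inflation_force(decel_g: float, passenger_weight_kg: Optional[float]) -> float:
    """Return inflation force as a fraction of maximum (0.0 = no deployment).

    A "dumb" 1990s trigger fires at full force whenever decel_g exceeds
    the threshold; this version suppresses deployment for an empty seat
    and scales force with occupant weight.
    """
    if decel_g < CRASH_THRESHOLD_G:
        return 0.0  # no crash detected
    if passenger_weight_kg is None:
        return 0.0  # empty seat: no point in exploding
    # Scale force from 50% (small occupant) up to 100% (80 kg and above).
    return max(0.5, min(passenger_weight_kg / 80.0, 1.0))
```

The point of the sketch is the extra inputs, not the control law: once a cheap MEMS accelerometer shares the chip with processing, the deployment decision can depend on who is actually in the seat.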
Such new devices — cheap, ubiquitous, high-performance sensors — are going to shape the coming decade. In the 1980s, we created our processor-based computer “intelligences.” In the 1990s, we networked those intelligences together with laser-enabled bandwidth. Now in the next decade we are going to add sensory organs to our devices and networks. The last two decades have served up more than their share of digital surprises, but even those surprises will pale beside what lies ahead. Processing plus access plus sensors will set the stage for the next wave — interaction. By “interaction” we don’t mean just Internet-variety interaction among people — we mean the interaction of electronic devices with the physical world on our behalf (Figure 2).
Figure 2: The Shift From Processing and Access to Interaction
What Are Sensors?
A suite of technologies underlies the rise of sensors, including MEMS, piezo-materials, micromachines, very large scale integration (VLSI) video, and a handful of others (Figure 3).
Figure 3: Building Blocks
MicroElectroMechanical Systems (MEMS)
MEMS are by far the most important of the technologies enabling the rise of sensors in the near term. In concept, MEMS technology is simplicity itself: it amounts to nothing more than using semiconductor manufacturing techniques to create analog devices. But underlying MEMS technology is an interesting mind-shift in chip design.
Traditional chips are little more than intricate race tracks for electrons built up through an elaborate process of etching and deposition. One of the worst bugs a traditional chip can have is a “released layer” — in effect, a loose piece of circuit material hanging out in microspace above the chip surface. That loose layer interferes with the smooth flow of electrons because it interacts with the surrounding analog environment. In the MEMS world, however, that “bug” is a crucial feature because such released layers can serve as analog sensors, sensing everything from acceleration and temperature to pressure and fluid flows.
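The physics that turns a released layer into an accelerometer is first-order simple: the layer acts as a proof mass on a silicon spring, acceleration deflects it, and the deflection is read out as a capacitance change between the layer and the substrate. The sketch below uses Hooke’s law and the parallel-plate formula; the mass, spring constant, plate area, and gap are made-up illustrative values, not parameters of any real device.

```python
# First-order model of a released-layer accelerometer: a proof mass on a
# silicon spring, with deflection read out capacitively. All numeric
# parameters here are illustrative, not from any real device.

EPSILON_0 = 8.854e-12  # permittivity of free space, F/m


def deflection_m(accel_ms2: float, mass_kg: float, spring_n_per_m: float) -> float:
    """At equilibrium the spring balances inertia: k*x = m*a, so x = m*a/k."""
    return mass_kg * accel_ms2 / spring_n_per_m


def capacitance_f(plate_area_m2: float, gap_m: float) -> float:
    """Parallel-plate capacitance between the released layer and substrate."""
    return EPSILON_0 * plate_area_m2 / gap_m


# Under 1 g, a 1-microgram mass on a 1 N/m spring deflects about 10 nm,
# narrowing a 2-micron gap and measurably raising the capacitance.
x = deflection_m(9.8, 1e-9, 1.0)
c_rest = capacitance_f(1e-7, 2e-6)
c_accel = capacitance_f(1e-7, 2e-6 - x)
```

Nanometer deflections and sub-picofarad shifts are exactly the scales semiconductor processes and on-chip electronics handle well, which is why the “bug” makes such a good sensor.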
MEMS research has been underway for over a decade,(2) and MEMS-based devices are already finding their way into the marketplace. The automobile industry is a major consumer of MEMS devices and is likely to be the single largest early market, as carmakers add them to everything from emissions systems to tire hubs.
The fact that MEMS is not a “new” technology underscores an important point about how each successive decade unfolds. What defines each decade is not the underlying technology’s invention, but rather a dramatic favorable shift in price and performance that triggers a sudden burst in diffusion from lab to marketplace. Thus, like MEMS, both the microprocessor and communications laser were “old” technologies from a research perspective by the time their respective decades began. The novelty was that the devices suddenly were cheap enough to put into ordinary products in the marketplace.
Piezo-Materials
Piezo-materials are materials (typically ceramics) that give off an electrical charge when deformed and, conversely, deform in the presence of an electrical field.(3) Put a charge in, and the material deforms; deform the material, and it sends out a charge. Piezos are particularly useful as surface-mount sensors for measuring physical movement and stress in materials. More importantly, piezos are useful not just for sensing, but for effecting — manipulating the analog world. This is an indicator of the real significance of the sensor decade: our devices won’t merely sense and observe. They will also interact with the physical world on our behalf.
Like MEMS, piezo-materials have been around for some time, and there is no shortage of interesting work underway. Researchers at the Georgia Institute of Technology are engaged in one of the most whimsical applications: creating a piezo-augmented “smart guitar” that mimics the sound of a high-end traditional guitar at much lower cost. As is discussed below, such research is bringing us to the verge of creating new classes of “smart materials” — materials that actively sense and respond to the surrounding analog environment.
Micromachines
Micromachines are semiconductor cousins of MEMS technology. Like MEMS, micromachines are built using semiconductor manufacturing techniques, but they are more complex in design, incorporating in some instances micrometer-scale gears and other moving parts. At the bleeding edge of this field, in 1996, Nippondenso, an affiliate of Toyota Motor Corporation, constructed a “microcar” not much larger than a grain of rice — a replica of an early Toyota complete with electromagnetic motor and tiny ring-gear drive.
Micromachines exploit the often-overlooked structural qualities of silicon: a low coefficient of thermal expansion, high thermal conductivity, a strength-to-weight ratio more favorable than that of aluminum, and elasticity comparable to that of steel. At the same time, the process of manufacturing micromachines is in its infancy, and it will be some years before elaborate micromachines are anything more than lab curiosities. Simpler devices will arrive slightly behind MEMS devices.
VLSI Video
Today, a videocam with all the attendant circuitry required to attach it to a computer costs approximately $9 a unit in OEM (manufacturers’ price to other manufacturers) quantities. Expect this price to drop precipitously as the next generation packages everything on a single chip: the charge-coupled device (CCD), all the necessary circuitry, and even the lens, glued directly to the chip. Cheap video translates into cheap “eyes” that can be used for a myriad of applications, including surveillance, security, and even party games.
Other Sensor Technologies
A host of other technologies are being pressed into the service of mediating between the analog and digital worlds. One example is Micropower Impulse Radar (MIR), a recent invention of Lawrence Livermore National Laboratory. Personal radar sounds like an unlikely consumer hit, but consider the following applications, all under commercial development: “intelligent” oil dipsticks for autos, handheld wall stud sensors, bulk tank level sensors, land mine detectors, and nondestructive testers for concrete structures.
Global Positioning System (GPS) sensors are also undergoing radical reinvention, with costs falling as performance rises. Systems once costing tens of thousands of dollars can now be had in a handheld package for under $500. In the not-too-distant future, integrated sensor/GPS modules will be small and inexpensive enough to embed in courier parcels in order to track the location and treatment of valuable cargoes.
Cheap laser technology is also rapidly changing gyroscopic technology as ring laser gyros (RLGs) displace traditional spinning-mass systems in aircraft, delivering dramatically increased performance in cheaper, more reliable packages. In the long run, it is likely that advanced MEMS accelerometer arrays will in turn displace RLG technology.
Implications: From Sensors to Effectors
The impact of sensors will be as surprising in the decade ahead as that of microprocessors in the 1980s and lasers in the 1990s. And the surprises will be additive because of the synergistic interaction among the generations of technology. Some of the most interesting applications of sensing technology will solve existing information technology problems. One such problem, created by lasers, is switching. Data moves along fiber-optic threads as photons traveling at the speed of light, far faster than the fastest electronic switches can switch it. Thus the performance of fiber-optic communications systems is often “switch-bound” — limited by the speed of the switches rather than of the cable itself. Modest micromachine/MEMS technology could break this impasse: researchers are working on a micromirror deformable diffraction grating “switch” theoretically capable of 20-nanosecond switching speeds.
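A back-of-the-envelope calculation gives a feel for what “switch-bound” means. The 20-nanosecond switching time is the figure cited above; the 10 Gb/s line rate and the fiber refractive index of 1.5 are assumed round numbers for illustration.

```python
# How much traffic streams past while even a fast switch changes state.
# The 20 ns switching time is the micromirror figure cited in the text;
# the 10 Gb/s line rate and n = 1.5 fiber index are assumed round numbers.

C_VACUUM = 3.0e8    # speed of light in vacuum, m/s
FIBER_INDEX = 1.5   # typical refractive index of silica fiber


def bits_in_flight(line_rate_bps: float, switch_time_s: float) -> float:
    """Bits that pass a point on the fiber during one switching interval."""
    return line_rate_bps * switch_time_s


def fiber_distance_m(switch_time_s: float) -> float:
    """Metres light travels inside the fiber during one switching interval."""
    return (C_VACUUM / FIBER_INDEX) * switch_time_s


# During a single 20 ns switching event at 10 Gb/s, roughly 200 bits and
# about 4 metres' worth of light-in-fiber go by: the glass itself is
# never the bottleneck.
bits = bits_in_flight(10e9, 20e-9)
metres = fiber_distance_m(20e-9)
```

Even a nanosecond-class switch throws away hundreds of bit-times per event, which is why purely mechanical-optical switching at these speeds matters.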
MEMS technology could also deliver interesting storage options, such as a MEMS-augmented optical disk system with a capacity orders of magnitude larger than that of a conventional CD-ROM.
But these examples merely touch the most prosaic of possibilities. A casual comparison of prior forecasts with the subsequent reality of the microprocessor and laser decades makes it clear that the scale of surprise will be enormous, even for professional forecasters. The good news is that hints of what is to come are already appearing. As novelist William Gibson once observed, “The future’s already arrived; it’s just not evenly distributed yet.”
One place to look is the World Wide Web. In 1995, the California Department of Transportation’s home page posted a pointer to a Web-based map of the San Diego freeway system. The map displays traffic speeds and densities in real time, using sensors embedded in the asphalt that send data to a Sun workstation.
What’s more certain is that the most expected of futures will — as always — arrive late and in utterly unexpected ways. Even as telecommunications executives continue to try to sell tired old notions of videoconferencing, the interaction of cheap video and laser-based Web bandwidth has already delivered a hint of what the future will really hold. Recall the “Cambridge Coffee Pot,” an early curiosity on the Web: researchers in the Computer Science Department of Cambridge University aimed a networked camera at the coffeepot down the hall in order to know when there was coffee fresh enough to make a trek to the kitchen worthwhile. A world of ubiquitous video is not a world of people looking at each other via videoconferencing. Rather, it is a world of cameras aimed at everything everywhere, watched over by machines, and only occasionally examined by people.
But the impact of sensors doesn’t stop at mere sensing. What happens when we put eyes, ears, and sensory organs on devices? Inevitably, we are going to ask those devices to respond to what they “see,” to manipulate the world around them. The sensor decade will really be a sensor/effector decade, where devices will not only observe things, they will also manipulate them.
This has profound implications. Two parallel universes exist today — the everyday analog universe we inhabit, and a newer digital universe created by humans, but inhabited by digital machines. We visit this digital world by peering through the portholes of our computer screens, and we manipulate it with keyboard and mouse much as a nuclear technician works with radioactive materials via glovebox and manipulator arms. Our machines manipulate the digital world directly, but they are rarely aware of the analog world that surrounds their cyberspace.
Now we are handing sensory organs and manipulators to the machines and inviting them to enter analog reality. The scale of possible surprise that this may generate over the next several decades as sensors, lasers, and microprocessors coevolve is breathtakingly uncertain.
Scaling Change: Orders of Impact
Such change seems overwhelmingly uncertain because we tend to compress outcomes into a telephoto view of the future. Just as a telephoto lens compresses distance, creating the illusion that distant objects lie close to nearby ones, our expectations compress chronology and overlook the logic of orders of impact, in which early developments contribute to later innovation. The way to make long-term sense of sensors and their place in the digital technology complex is to think explicitly in terms of first-, second-, and third-order impact, and beyond.
The history of the internal combustion engine provides a good example of orders of impact and their predictability (Figure 4).
Figure 4: Auto Impacts
The first-order impact was the “horseless carriage,” and that was no surprise to anyone for the simple reason that it was precisely what everyone was trying to build. The process of invention and subsequent diffusion was chaotic, but the outcome was clear.
The second-order impact — the traffic jam — came as something of a surprise, but only to idealists and others who had not taken the time to anticipate consequences. In fact, traffic jams were not unfamiliar to city dwellers in the horse and buggy era, so it was all but a foregone conclusion that the same would happen with cars.
But the third-order impact — suburbs — was rather more surprising, even though the first suburbs had already been around for decades on a small scale.(4) But the mobility afforded by the automobile led to the reinvention and dramatic spread of suburban life.
The biggest surprise though was the fourth-order impact — the rise of huge regional conurbations, such as the Atlantic Seaboard and the Los Angeles basin. This was unexpected in 1900 because everyone assumed that by conferring mobility, the auto would lead to the dispersal of populations, rather than their further concentration.
Sensors’ Orders of Impact
What assumptions are now blinding us to the impact of cheap and ubiquitous sensors? Notice the interesting pattern of alternating tension between expansion and constraint in the successive orders of impact of the internal combustion engine. The horseless carriage gave us sudden expansion locally, yielding a new constraint — the traffic jam. Early adopters responded with a third-order expansion — moving to the suburbs. This in turn led to a new fourth-order constraint — megacities.
Look for the same pattern of surprising consequence and interplay between expansion and constraint as sensors assume center stage in the information revolution in the decades ahead. And keep in mind that just as microprocessor and laser innovations continue today, sensor advances will have reverberating consequences well beyond the next decade. While the leading edge of sensing is with us today, the trailing edge will be felt as far out as 50 years from now, just as we’re still getting reverberations from earlier advances in lasers and microprocessors — not to mention the internal combustion engine.
Like the evolution of automobiles, the evolution of sensing technologies will pass through several orders of change (Figure 5).
Figure 5: Sensor Impacts
The first-order impact of sensors is quite obvious — cheap input/output (I/O) for networks and computing devices, plus modest levels of effecting — actual computer-controlled modification of the analog world — and the creation of simple “smartifacts” (defined below). A few examples include:
- Sensor-augmented heating, ventilation, and air-conditioning (HVAC) systems, delivering dramatic improvements in performance and energy savings
- “Smart” courier boxes with sensors embedded in the cardboard skin that sense accelerations and box treatment en route
- Disposable video cameras as a consumer fad
The second-order impact is more interesting. As effecting becomes richer, look for sensor/effector arrays to mature into simple classes of “smart-stuff” — smart materials and intelligent artifacts, or smartifacts.(5) In addition, cheap sensors will contribute greatly to making old notions of hyperautomated manufacturing — cyber-manufacturing — a practical reality. The obstacle in the past has been measurement and control granularity: the available sensors and effectors have been too coarse to deliver the requisite level of control over the materials. MEMS-scale devices radically reduce the scale of control and make true automation practical.
The possibilities of smartifacts will lead to important third-order consequences, such as the advent of mass-customization. Ever since Stan Davis popularized this concept in the late 1980s, the philosopher’s stone of manufacturing has been finding a means of combining the appeal of unique one-purchaser customization with the economies of scale associated with mass manufacturing.(6) The scale-change triggered by sensors and effectors could set the stage for this to become a reality across a broad segment of industries, from autos to consumer apparel.
But there are even more interesting third-order impacts. One of the most important will be an acceleration in the decay of the centrality of Von Neumann computing architectures. Consider a research initiative already underway to build turbulence-damping “smart-skins” for fighter wings.(7) This work contemplates a leading-edge array of myriad 0.2-millimeter silicon microflaps, interspersed with equally small MEMS turbulence sensors.
Such an array could be built today, but computational control is another matter. Even if one had an infinitely fast supercomputer controller in the fuselage linked by fiber-optic network to the array elements, the speed of light alone would make it impossible for the flaps to respond quickly enough to sensor data sent down the wire to the computer and back out as control instructions. The only option is to create radically new hyperdistributed computational architectures — in effect, a community of processors interspersed throughout the array, where each element is a triad of processor, sensor, and effector. This kind of development opens the information world to a host of way-radical exotica: from theories based on ecology and symbiosis to, in one case, models built around economics.(8)
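The speed-of-light argument can be made concrete with a toy calculation and a toy array element. The 15-metre fiber run and the proportional control law below are assumptions for illustration; only the processor/sensor/effector triad comes from the text.

```python
# Why the smart-skin needs local control loops: wire latency alone eats
# the response budget. The 15 m fiber run and the toy proportional
# control law are illustrative assumptions; the processor/sensor/effector
# triad is the architecture described in the text.

C_IN_FIBER = 2.0e8  # m/s, light in silica fiber (n ~ 1.5)


def central_round_trip_s(fiber_run_m: float) -> float:
    """Sensor-to-computer-and-back delay over fiber, ignoring compute time."""
    return 2.0 * fiber_run_m / C_IN_FIBER


class SkinElement:
    """One triad in the array: processor, turbulence sensor, and microflap.

    The element closes its control loop locally, so its response time is
    set by the element itself rather than by any wire to a central CPU.
    """

    def __init__(self, gain: float = 0.8):
        self.gain = gain
        self.flap_angle = 0.0

    def step(self, turbulence_reading: float) -> float:
        # Toy proportional law: deflect the flap against the disturbance.
        self.flap_angle = -self.gain * turbulence_reading
        return self.flap_angle


# A 15 m run to a fuselage computer costs 150 ns per round trip before a
# single instruction executes; a local element has no wire delay at all.
latency = central_round_trip_s(15.0)
```

Multiply that 150-nanosecond floor by thousands of array elements contending for one controller and the case for pushing the processor out to each sensor/effector pair makes itself.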
At the fourth-order level, we will witness a generalized substitution of computation for stuff, and possibly the erosion of the entire digital order that we now take for granted. We will literally dematerialize objects, substituting, as Nick Negroponte likes to observe, “electrons for atoms.” Using arrays of sensors and effectors, one could take a structure (say, a bridge truss or aircraft spar) that in inert form lacks the intrinsic structural strength to support a given load, and dynamically sense and align its elements to yield the desired strength at a fraction of the weight of a traditional structure.(9)
The essence of this fourth order is that we will be connecting two previously parallel universes — the digital universe of our creation and the preexisting physical analog universe. The two worlds are in collision, and the biggest surprises will come when the boundaries between the two blur beyond recognition. Warriors fighting “virtual war games” over networks may discover after the fact they were killing real opponents.(10) Autonomous smartifacts, successors to today’s military UAVs (unmanned aerial vehicles), will become annoyingly commonplace. And just as the first biplanes were quickly turned from reconnaissance to battle, these new autonomous smartifacts will inevitably be pressed into service as “war-bots” of unprecedented lethality. The Advanced Research Projects Agency (ARPA) recently commissioned research on micro UAVs — autonomous flyers smaller than a dollar bill using micromachine engines to sustain a one-hour flight time and a 16-kilometer range. Just the thing for a 21st century James Bond — or a terrorist bent on assassinating a well-guarded head of state.
As the foregoing example implies, things get especially interesting as device size shrinks. If one shrinks the device sufficiently, it becomes possible to dispense with batteries entirely and allow the gizmo to run off ambient energy — sunlight, vibration, or perhaps even breezes flowing over tiny MEMS cilia. And cost shrinks with size, opening the door to what researchers refer to as “MEMS dust” — tiny, disposable devices used in a “toss-out-and-forget” manner for any number of applications from environmental sensing to surveillance.
Digital Is Dead?
The impact of ubiquitous sensors on the digital computing order could be especially surprising. In the short term, the challenge is interfacing analog sensor devices with digital computers and networks. Inevitably, though, there will come a point when the logical next step is to create analog computers and networks in order to interface more effectively with the growing sensor arrays. There may even be instances where it is simply impossible to accomplish a desired goal with digital technology at all. Perhaps the ultimate solution to controlling the aircraft-wing smart-skin mentioned earlier is distributed analog electronics rather than arrays of digital processors.
A modest indicator of this trend is visible today in the audiophile world. CDs may have replaced phonograph records, but the most sophisticated audiophile stereo systems available today rely on old-fashioned vacuum tube technology to perform their magic. Audiophiles can tell the difference between sound that has been deconstructed into bits and reconstituted as an analog waveform and sound that has remained in analog form all along.
Thus, the long-term consequence of the coming sensor revolution may be the emergence of a new analog computing industry in which digital technology plays a mere supporting role, or in some instances no role at all. At first, these new analog devices will probably occupy a place similar to that once held by supercomputers and parallel processing systems — specialized devices tailored to especially challenging tasks. But in the longer term, say 25 to 50 years out, the digital order we take for granted may prove to be merely a transitional phase in a longer process of connecting symbolic universes of our creation with the preexisting physical world. Outlandish as this may sound, imagine telling information professionals in 1948 that one day they would do their work on then-nonexistent digital electronic computers based on microprocessor descendants of the transistor invented that year.
Digital Is Dull?
Yet as shocking as this sounds, one short-term surprise will prove even more startling to today’s digital establishment. Ever since the invention of the transistor, digital has been cool, and analog has been the forgotten old-fashioned stepchild. That is going to reverse itself in the next decade. Analog is going to be the great new unexplored frontier, and digital will seem just a bit dull.
Three decades ago, a generation of graduate students quietly made fun of their professors, who were trained in a world of analog electromechanical devices. They thought, “Oh, those old fuddy-duddies, vacuum tubes and analog, how quaint. Digital is hip.” Well, those professors will now have their revenge, for their once-arrogant students are the old fuddy-duddies now. The next generation will think of their digitally steeped teachers: “You had it so easy. I mean, digital representations — so straightforward, so discrete, so easy to contain. Analog is messy and subtle and unpredictable, and that’s where the big wins are, so get out of the damn way and let us assume the mantle of innovation.”
Of course, reality will be subtly different. Analog will be the frontier, but it will in turn lead to new digital challenges. That said, researchers would do well to ask their departmental librarians to dust off some of those overlooked 1950s Ph.D. dissertations on then-interesting, now seemingly irrelevant analog problems, because we may suddenly discover that a host of insights from the analog 1950s will be very relevant in the analog years after 2000.
(1) For example, the next big surprise will occur on the World Wide Web. At the moment, the Web is defined by people accessing information. Over the next two years, look for the Web’s focus to shift away from this to a new model of people accessing other people in information-rich environments. In other words, the Web will go from being an information environment to an interpersonal environment in which information plays an important role supporting human interactions.
(2) It is noteworthy that the Advanced Research Projects Agency (ARPA) — the same agency responsible for the initial research leading to the development of the Internet — has been a key player in catalyzing MEMS research. Just as ARPA’s investments in the early 1970s led to huge 1990s commercial payoffs in the form of the Internet “revolution,” its 1980s-era investment in MEMS could prove to be of crucial importance in the next decade.
(3) Anyone who has used a disposable lighter has experienced piezo-materials in action — pushing down on the tab flexes a fleck of piezo-crystal, generating an electrical charge converted into a spark.
(4) The first suburbs arguably appeared in the greater Boston area in the 1820s, and later experienced dramatic growth in the late 1800s, thanks to the advent of street car systems (see Henry C. Binford’s The First Suburbs [University of Chicago Press, 1985]).
(5) Smartifact is a term first coined by researcher Harry Vertelney at Apple Computer in the 1980s to refer to new forms of software-based agents. Smartifact is used here to connote something different: physical objects possessing rudimentary “intelligence” sufficient to be aware of and affect the environment around them.
(6) See Stan Davis’s Future Perfect (Addison-Wesley, 1987; revised edition, 1996).
(7) This research is being led by Dr. John Kim at the University of California at Los Angeles, under a grant from ARPA.
(8) Bernardo Huberman at Xerox’s Palo Alto Research Center is doing especially interesting work on this front.
(9) This research was performed by Andy Berlin, currently at Xerox’s Palo Alto Research Center.
(10) This is often referred to as the “Ender’s Game scenario,” a reference to a science fiction story in which a group of kids train in a computer simulation to eventually save Earth from invaders, only to learn that their graduation “simulation” was, unbeknownst to them, an actual war commanded by them (see Orson Scott Card’s “Ender’s Game,” Analog [August 1977]; later expanded into a novel published by Tor).