Thin screens for televisions, computers and other big display devices may soon have brilliant, sharp pictures while consuming much less power, thanks to transistors that use carbon nanotubes to deliver current in a new way.
It will be at least a few years until the technology, described in the April 29 Science, graces your flat screen. But eventually such displays may cost less, last longer and use less energy than today's finest liquid crystal displays.
The new technology employs organic light-emitting diodes, or OLEDs, thin films that emit light in response to electrical current. Such displays have several advantages over traditional liquid crystal displays: they aren't backlit, for example, so darkness comes not from blocking light but from individual diodes emitting less of it. That saves energy.
But making OLED displays much bigger than a smartphone's has been problematic. While OLED displays consume less power overall, each pixel needs a hefty burst of current to fire up. Transistors that deliver this much current are bulky and take up valuable screen space; they also require elaborate, expensive construction and yield pixels that aren't uniform, a problem that grows with display size, says study coauthor Andrew Rinzler of the University of Florida.
To skirt these issues, Rinzler and his colleagues used a network of carbon nanotubes to drive the current. The nanotube layer is porous enough to let light through, so the transistor and light-emitting layers can be stacked vertically instead of sitting side by side, saving real estate. Because transistors don't have to be squeezed in right next door to the OLEDs, more of the device's area goes to emitting light: 98 percent of it, in fact. That's no small feat, says nanotechnologist Chongwu Zhou of the University of Southern California in Los Angeles.
“This is a wonderful piece of work,” says Zhou. “It pulls together a bunch of innovations.”