The fuzzy sound of a distorted electric guitar is, for rock fans, a thing of beauty. It has been a staple of music since the early 1950s, when guitarists strained the vacuum tubes inside their amplifiers to match the raw voices of blues singers. Later generations relied on digital devices: “effects pedals” with circuits built from silicon diodes and transistors.
But last year, a completely new source of distortion hit the market—one that uses an electronic junction made from organic molecules (J. Phys.: Condens. Matter 2016, DOI: 10.1088/0953-8984/28/9/094011). Billed as the world’s first commercial device to rely on this form of “molecular electronics,” the Heisenberg Molecular Overdrive contains aromatic azo compounds strung between two electrodes just a few nanometers apart. This forms a molecular junction that only allows a current to flow once a threshold voltage is reached.
It is an unlikely pioneer in a field that once hoped to reinvent the computer.
More than four decades ago, researchers suggested that individual molecules could act as components in electronic circuits, potentially offering the ultimate in miniaturization. By the turn of the century, this vision was seemingly becoming a reality: Science magazine heralded the first molecular-scale circuits as its “Breakthrough of the Year” in 2001 (DOI: 10.1126/science.294.5551.2442). And since then, researchers have created a bewildering array of organic molecules and nanotubes that can act as diodes, transistors, and memory devices.
And not a moment too soon, proponents said. Our computers’ capabilities have blossomed thanks to the inexorable shrinking of silicon devices, which has reliably doubled the number of transistors that can fit on a chip every couple of years (a trend known as Moore’s law, after Gordon Moore, the cofounder of Intel). But transistors, a kind of electrical switch, were becoming so tiny that electrons could leak through them even when they were off, screwing up their performance and threatening to stall this remarkable rise in computing power. Perhaps single molecules could take silicon’s place and save the day?
Today, most researchers admit that is unlikely to happen. The difficulties of building a practical computer using molecular electronics—along with the microelectronics industry’s success in carving ever-smaller features into silicon devices and working around the electron leakage problem—have forced the field to adjust its goals, says Mark A. Ratner of Northwestern University, who first proposed a molecular diode with his colleague Ari Aviram back in 1974 (Chem. Phys. Lett. 1974, DOI: 10.1016/0009-2614(74)85031-1). This may mean targeting niche applications such as sensors or the Heisenberg pedal—or harnessing single molecules as tools to explore the fundamentals of electron behavior, which may yield unexpected strategies for improving computing.
Still, the idea of computing at a molecular scale is not dead. Others are eyeing hybrid technologies that combine silicon with molecules like carbon nanotubes, and using molecules to perform logic and memory operations in ways that take advantage of their chemical properties, rather than trying to bend their electronic behavior to imitate silicon. “The field has had to reinvent itself because of the continuing miniaturization of silicon,” says Kevin F. Kelly, a nanoimaging researcher at Rice University.
Films of organic compounds are already used to make commercial electronic devices such as light-emitting diodes, but these depend on the behavior of the bulk material. In contrast, molecular electronics aims to use the properties of individual molecules to control the flow of charge between electrodes that are just nanometers apart. That potentially offers a way to construct very densely packed circuits, something that looked attractive when silicon transistors were still thousands of times as large as a molecule. But today’s chips use silicon transistors with features as small as 14 nm, and 10-nm technology is just around the corner, considerably narrowing any benefit molecular electronics might offer.
Moreover, few molecular electronics devices can beat silicon on both energy efficiency and speed of charge transport, a necessity for developing practical circuits that are good enough to usurp a well-established industry: “So many billions of dollars have been invested in silicon fabrication, nobody wants to throw that away,” says Subhasish Mitra of Stanford University.
Still, researchers have achieved some impressive milestones. In 2015, for example, Latha Venkataraman and Luis M. Campos of Columbia University and Jeffrey B. Neaton of Lawrence Berkeley National Laboratory unveiled what is arguably the best single-molecule diode ever made (Nat. Nanotechnol. 2015, DOI: 10.1038/nnano.2015.97). Diodes allow current to flow freely in one direction but not in the opposite direction, and the team’s thiophene-based device had a very high on-off ratio of more than 200—meaning 200-fold as many electrons flowed in one direction as the other. The diode relies on a polar solvent, which aligns the molecule’s orbitals so they produce the desired electrical response.
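For readers wondering how that figure is defined, the on-off (rectification) ratio is simply the current at a given forward bias divided by the current at the same magnitude of reverse bias. The sketch below uses made-up placeholder currents and an assumed bias of 0.4 V; it is not data from the Columbia/Berkeley device.

```python
# Illustrative calculation of a diode's on-off (rectification) ratio.
# The current values and the 0.4 V bias are hypothetical placeholders,
# not measurements from the thiophene-based single-molecule diode.

def rectification_ratio(i_forward_amps, i_reverse_amps):
    """Ratio of current at +V to current at -V (same |V|)."""
    return abs(i_forward_amps) / abs(i_reverse_amps)

i_forward = 4.0e-7   # current at +0.4 V, in amperes (hypothetical)
i_reverse = 1.8e-9   # current at -0.4 V, in amperes (hypothetical)

print(f"On-off ratio: {rectification_ratio(i_forward, i_reverse):.0f}")
# A ratio above 200 means more than 200-fold as much current flows
# in one direction as in the other.
```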
Crucially, Venkataraman’s work has been consistently reproducible, unlike a lot of previous single-molecule work, and has revealed important details about electronic behavior. “She’s one of the top three people in the world at doing these measurements,” Ratner says.
Nongjian Tao of Arizona State University has made similar fundamental discoveries that have addressed central problems in molecular computing. In some molecular electronics devices, electrons can travel from one electrode to the other by quantum tunneling, a kind of subatomic disappearing act that allows them to dive right through a normally insurmountable energy barrier. For longer molecules, though, the electron can actually hop along the molecule in several steps, akin to conventional conduction along a wire.
Last year, Tao’s team found that changing the sequence of base pairs in a DNA helix flipped the mechanism electrons used to flow through the biomolecule: In most cases, it flipped from tunneling to hopping, and in some circumstances, the base-pair swap led to an intermediate form of tunneling-hopping (Nat. Chem. 2015, DOI: 10.1038/nchem.2183). Understanding exactly how electron transport works in single molecules—and how chemical structure, mechanical stress, temperature, and other factors affect it—will be crucial for chemists trying to design molecules with very specific electronic properties, he says.
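A rough way to see why the mechanism matters: tunneling conductance falls off exponentially with molecular length, while hopping conductance falls off only roughly as the inverse of the number of hopping sites. The toy comparison below uses generic textbook forms with made-up parameters; it is not a model of Tao's DNA measurements.

```python
# Toy comparison of the two charge-transport regimes mentioned above:
# coherent tunneling (conductance decays exponentially with length) versus
# incoherent hopping (conductance decays roughly as 1/length). All parameters
# are illustrative placeholders, not values fitted to any real DNA junction.

import math

BETA = 0.3          # tunneling decay constant, per angstrom (assumed)
RISE_PER_BP = 3.4   # length added per base pair, in angstroms (B-DNA geometry)
G_TUNNEL_0 = 1.0    # contact-limited prefactor, arbitrary units (assumed)
G_HOP_0 = 0.01      # hopping prefactor, smaller due to thermal activation (assumed)

def g_tunneling(n_base_pairs):
    return G_TUNNEL_0 * math.exp(-BETA * RISE_PER_BP * n_base_pairs)

def g_hopping(n_base_pairs):
    # Ohmic-like: resistance grows linearly with the number of hops.
    return G_HOP_0 / n_base_pairs

for n in (2, 4, 8, 16):
    winner = "tunneling" if g_tunneling(n) > g_hopping(n) else "hopping"
    print(f"{n:2d} bp: tunneling {g_tunneling(n):.1e}, "
          f"hopping {g_hopping(n):.1e} -> {winner}")
# With these made-up numbers, tunneling wins for short sequences and hopping
# takes over for longer ones, mirroring the crossover described above.
```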
Not only have molecular electronics devices revealed fundamental information about electronic behavior, they’ve also led to computing breakthroughs, at least indirectly. For example, fundamental studies of electron behavior in molecules helped inspire a new generation of memory chip called resistive random access memory (RRAM). Researchers realized that, in certain molecular electronics devices, current was not actually flowing through the molecules. Instead it moved along nanoscale metal filaments that unexpectedly grew across the gap between electrodes.
Santa Clara-based company Crossbar has exploited this effect to develop a type of RRAM that stores data using the presence or absence of these filaments as the ones and zeroes of binary code. The company says that its RRAM operates faster and at lower voltages than traditional flash memory and began licensing the technology to chipmakers earlier this year. “Developments in molecular electronics definitely helped to develop resistive memory,” says Victor Zhirnov, chief scientist at Semiconductor Research Corp., an industry-funded research consortium.
Making smaller components is not the only way to improve on silicon electronics, though. The semiconductor industry is desperate for technologies that reduce the energy consumption of chips—for mobile devices with limited battery lives, and for companies running vast clusters of computers. Improving energy efficiency also means that components generate less waste heat, so they can be packed more closely together to make smaller devices.
Chipmakers would also like to increase their circuits’ clock speeds—essentially the number of instructions they can carry out per second—which have barely risen over the past decade. That is partly due to the relatively poor mobility of electrons within thin slivers of silicon, a property that limits how quickly a pulse of charge moves through a transistor, delaying the stream of instructions.
Unfortunately, most materials either transmit charges quickly or have low energy demands, but not both. Carbon nanotubes (CNTs) offer a way out of this Catch-22 situation because they are very thin, just one nanometer or so across, and have an intrinsically high charge mobility that can also be switched with a relatively small voltage. Imagine that the transistor is a water hose, Mitra says: You would have to stand on a fat hose with all your weight to staunch the flow, whereas a thin pipe is much easier to pinch off.
In 2013, Mitra and colleagues, including Max M. Shulaker at Massachusetts Institute of Technology, unveiled the first CNT computer, built from 178 CNT transistors and able to run simple programs (Nature 2013, DOI: 10.1038/nature12502).
Several techniques lay behind this success, including a circuit design strategy that worked around problems with mispositioned CNTs, and a better method for aligning and packing CNTs tightly so they carried a decent current. “All too often they looked like a bowl of noodles,” Mitra says of previous attempts. Instead, his team grew CNTs on a quartz substrate before transferring them to silicon. By 2014, the researchers could neatly pack around 100 CNTs per micrometer inside their transistors, the first such devices to match the overall performance of ones made of silicon.
A decade ago, the microelectronics industry had all but given up on CNTs. But experiments such as Mitra’s are showing that problems with alignment and nanotube imperfections can be overcome, and industry interest is growing again. In 2015, for example, a team from IBM unveiled a better way to bind CNTs to metal electrodes, avoiding long-standing problems with high resistance at the contact point (Science 2015, DOI: 10.1126/science.aac8006).
The key is to realize CNTs will not act alone, Mitra says. “We’re not talking about throwing away silicon; we’re talking about enhancing it.” A source of delay in circuits is the time it takes to pass signals between components. One way to reduce a signal’s transit time is to simply place those components closer together. Stacking circuits in 3-D would be a big help. But building a layer of silicon components takes high temperatures, enough to melt parts of the layers beneath it. Mitra’s CNT circuits, in contrast, are constructed at much lower temperatures, so they can be safely stacked on top of silicon circuits, he says.
Molecules might not replace silicon processors, but they could be the way forward for memory applications, some say. Silicon technology is ideal for storing data that is needed in short order because it reads and writes data using speedy electrical currents. But silicon memory is an expensive and bulky way to archive the exabytes of data (10¹⁸ bytes) the world generates every day.
One of the most promising alternatives is to encode data in DNA by using patterns of base pairs to represent digital ones and zeros. Bits can be stored at such a high density that 1 kg of DNA could meet the world’s entire storage needs, Zhirnov estimates. “It’s the only solution I can think of to support the data explosion,” he says. Stored correctly, DNA is also extremely durable; and because it is the basis of our own genome, the technology needed to read it—a DNA sequencer—is unlikely to become obsolete in the future.
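A back-of-envelope calculation shows why estimates like that are plausible. Assuming a simple 2 bits per base pair and an average base-pair mass of about 650 daltons, and ignoring the redundancy, addressing, and error-correction overhead a real archive would need:

```python
# Back-of-envelope estimate of DNA's raw storage density, assuming 2 bits per
# base pair and ~650 g/mol per base pair. Real systems store far less per gram
# because of redundancy, addressing, and error correction.

AVOGADRO = 6.022e23          # base pairs per mole
BP_MOLAR_MASS_G = 650.0      # approximate grams per mole of one base pair
BITS_PER_BP = 2              # A/C/G/T can encode 2 bits per base

base_pairs_per_kg = (1000.0 / BP_MOLAR_MASS_G) * AVOGADRO
bytes_per_kg = base_pairs_per_kg * BITS_PER_BP / 8

print(f"Base pairs per kilogram: {base_pairs_per_kg:.1e}")
print(f"Raw capacity: {bytes_per_kg:.1e} bytes "
      f"(~{bytes_per_kg / 1e21:.0f} zettabytes)")
# Roughly 2e23 bytes per kilogram, comfortably in the range of the
# zettabytes of data the world produces.
```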
Over the past few years, researchers have stored Shakespeare sonnets and entire books in DNA. In July 2016, researchers at Microsoft and the University of Washington set a new record when they stored 200 megabytes of data—including 100 books and a high-definition music video—in a tiny smear of DNA that contained more than 1.5 billion base pairs.
Although that is half the size of the human genome, data-storing DNA is built in short strands that are easier to synthesize. The researchers add identifying sequences to the ends of each 150-base-pair segment so that they can locate specific chunks of data. Reading back the data simply involves sequencing the DNA, and glitches can be caught by error-correcting software, making the readout more robust than sequencing natural human DNA with the same methods.
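To make the scheme concrete, here is a minimal sketch of how bits could be mapped onto bases and split into addressed segments. It is a toy illustration, not the encoding the Microsoft/University of Washington team used; real schemes also avoid long runs of repeated bases and add error-correcting redundancy.

```python
# Toy DNA data encoder: maps each pair of bits to one base and splits the
# result into fixed-length segments, each tagged with a short address so
# specific chunks can be located later. Illustration only; not the scheme
# used in the Microsoft/University of Washington work.

BIT_PAIR_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
PAYLOAD_BASES = 150   # bases of payload per segment (illustrative)

def bytes_to_bases(data: bytes) -> str:
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BIT_PAIR_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def to_segments(data: bytes):
    bases = bytes_to_bases(data)
    segments = []
    for index, start in enumerate(range(0, len(bases), PAYLOAD_BASES)):
        # Encode the segment's index as bases and prepend it as an address.
        address = bytes_to_bases(index.to_bytes(2, "big"))
        segments.append(address + bases[start:start + PAYLOAD_BASES])
    return segments

for segment in to_segments(b"Shall I compare thee to a summer's day?"):
    print(segment)
```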
The big challenges to deploying DNA data storage are cost and speed. For archival purposes, magnetic tape is still the most cost-effective medium, and although it is slower than flash memory, it can still transfer hundreds of megabytes per second. Matching that would mean reading and writing about 1 billion DNA bases per second, something that could be achieved only with new chemistry running in a massively parallel system.
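That equivalence is simple arithmetic: at 2 bits per base, a tape throughput of roughly 250 megabytes per second (an assumed round figure) corresponds to about a billion bases per second.

```python
# Rough throughput equivalence between tape and DNA, assuming 2 bits per base.
# The 250 MB/s figure is an assumed round number for a modern tape drive.

tape_bytes_per_second = 250e6
bits_per_base = 2

bases_per_second = tape_bytes_per_second * 8 / bits_per_base
print(f"Equivalent DNA throughput: {bases_per_second:.1e} bases per second")
# About 1e9 bases per second, far beyond today's synthesis and sequencing rates.
```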
Luis Ceze, part of the University of Washington team, does not claim that DNA will replace the memory inside your computer. “But I’m optimistic that we can replace archival media like magnetic tape,” he says. Zhirnov predicts that researchers will build prototype DNA memory devices in the next five years and potentially create practical devices by 2030. “DNA storage is not as far out as it might seem,” he says.
To prove the practical value of any molecular computing devices, Mitra says, researchers must also learn how to connect the components into functional systems, a task that will require a lot more cross-disciplinary research with computer system engineers.
Richard L. McCreery, who invented the molecular junctions at the heart of the Heisenberg Molecular Overdrive, says that fundamental science such as Tao’s and Venkataraman’s is still needed to provide a better understanding of electronic behavior, something that will be crucial to discovering a killer app for molecular electronics.
But in the meantime, the University of Alberta nanotechnologist also hopes that his guitar pedal, while clearly a niche application, will lend credibility to the field. He and his colleague Adam Bergren last year formed Nanolog Audio, a company that is now in talks with musical instrument maker Roland to supply the molecular junctions for other effects circuits.
“Molecular electronics has to coexist with silicon, and like any technology it’s got to establish that it’s got some advantages,” McCreery says. “I’m sure it can do that, but it’s going to take some very good research.”
Mark Peplow is a freelance writer. A version of this story first appeared in ACS Central Science: cenm.ag/molecularcomputer.