A Posteriori

Attempts to grapple with and elucidate empirical knowledge

Graphene ICs around the corner? January 3, 2010

Filed under: Electronics,Semiconductors — Rāhul @ 13:51
Tags:

2009 marked the 50th anniversary of the physicist Richard Feynman’s 1959 speech to the American Physical Society, in which he foresaw the coming age of nanotechnology. Much of his vision of atomic-level fabrication and nanoscale surgical robots in the bloodstream is yet to be realised. However, the electronics industry has surged relentlessly down this path, following the now famous Moore’s law, which says that the number of electronic building blocks called transistors on a chip roughly doubles every two years.

Ever since 1965, when Gordon Moore, the co-founder of Intel, foresaw this exponential shrinking of individual transistors on a computer chip, the unbridled advance of electronics has delivered progressively more powerful and compact computers. But as applications and user aspirations demand ever more powerful machines, Moore’s law is finally coming up against the brick wall of Quantum Physics. Transistors on today’s chips measure only of the order of a millionth of an inch, nearing the atomic scale where large-scale physical laws break down and the uncertainties of Quantum Mechanics take over.
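As a back-of-the-envelope sketch of what the two-year doubling implies (the baseline count here is a made-up figure for illustration, not a historical datum):

```python
def transistor_count(year, base_year=1965, base_count=64, doubling_years=2):
    """Project the transistor count per chip under Moore's law.

    base_count is a hypothetical starting figure chosen for illustration;
    the two-year doubling period is the commonly quoted form of the law.
    """
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Each two-year step doubles the count; 20 years gives 2**10 = 1024x growth.
```

The exponential form makes clear why the trend must eventually hit physical limits: a fixed doubling period means the same chip area must accommodate a thousandfold more transistors every two decades.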

Scientists are now looking at ways to redesign the transistor so as to take advantage of the quantum laws. All such approaches mean a shift away from Silicon, which is used near-universally today. One promising new material is Graphene, a one-atom-thick sheet of Graphite, one of the pure forms of elemental Carbon.

Graphene was first isolated in 2004 by peeling a layer off Graphite with scotch tape. Since then, ways have been found to grow Graphene on Silicon Carbide and to deposit it from solution. The thermal and electronic properties of Graphene have proved ideal for transistor operation: current carriers in Graphene can travel very fast while generating very little noise. Its very high thermal conductivity also makes it attractive for densely packed integrated circuits, which need to dissipate heat efficiently.

Integrated circuits found on computer chips are essentially a clever combination of millions of transistors and other electronic elements in a logical circuit whose output depends on its inputs. Individual transistors, connected to form electronic switches, lie at the heart of the circuit. Just as mechanical switches can be flipped on and off to control the flow of current through a circuit, transistors have a channel through which the flow of current is regulated by the voltage at a terminal called the gate. When the gate voltage is flipped between high and low, the transistor channel is switched on and off. By cascading many thousands of such switches, or gates, complicated logical operations can be performed electronically.
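The switch picture above can be sketched in a few lines of code, with each transistor idealised as a perfect on/off switch (a deliberately simplified model, not the analogue behaviour of a real device):

```python
def inverter(gate_high: bool) -> bool:
    """A high gate voltage turns the pull-down switch on,
    pulling the output low, and vice versa."""
    return not gate_high

def nand(a: bool, b: bool) -> bool:
    # Two switches in series pull the output low only when both are on.
    return not (a and b)

def and_gate(a: bool, b: bool) -> bool:
    # Cascading gates: an AND is just a NAND followed by an inverter.
    return inverter(nand(a, b))
```

Because NAND is a universal gate, cascading enough of these switches yields any logical operation, which is all an integrated circuit ultimately does.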

Although single transistors of Graphene had already been demonstrated, it was only recently that a Graphene logic gate was shown. Floriano Traversi and Roman Sordan from the Politecnico di Milano and Valeria Russo from the Department of Energy, Micro and Nanostructured Materials Laboratory, both in Italy, reported in Applied Physics Letters their demonstration of a Graphene inverter. An inverter is the most basic logic gate: it gives a low output if the input is high and vice versa, essentially inverting the input at the output.

The researchers deposited two adjacent Graphene layers on a Silicon substrate covered with oxide to form the channels of two different transistors. Metal contacts were then formed behind the Silicon substrate to act as the controlling gate, and in between the Graphene layers so that they were connected end to end. In an ingenious step, they then electrically annealed just one of the Graphene channels so that the gate voltage at which it flips from off to on changed. If a constant voltage is now applied across both transistors together, the output voltage at the terminal they share depends on which transistor is off and which is on. By controlling the gate voltage, the scientists thus controlled the channel conductivities and hence the output voltage.
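The shared output node behaves like a simple conductance divider between the two channels in series across the supply. A minimal sketch, with illustrative (not measured) conductance values:

```python
def inverter_output(v_supply, g_top, g_bottom):
    """Two channels in series across the supply share an output node.

    g_top and g_bottom are the channel conductances (in siemens).
    The output sits closer to the supply rail when the top channel
    conducts more strongly, and closer to ground when the bottom does.
    """
    return v_supply * g_top / (g_top + g_bottom)

# When the top channel conducts ten times more strongly, the output
# is pulled near the supply rail; swap the roles and it is pulled low.
high = inverter_output(1.0, g_top=1e-3, g_bottom=1e-4)  # ~0.91 V
low = inverter_output(1.0, g_top=1e-4, g_bottom=1e-3)   # ~0.09 V
```

The model also shows why the demonstrated gate dissipates power at rest: a Graphene channel never switches fully off, so neither conductance ever reaches zero and the output never quite reaches either rail.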

By tying together many such switches in circuits similar to today’s Silicon integrated circuits, much faster computer chips with less noise could be obtained. But this technology is still at a nascent stage. The output voltage from the demonstrated gate does not switch between values that can be fed directly to another gate’s input. Also, unlike today’s Silicon switches, this Graphene gate cannot be turned fully off to shut off current completely, so it dissipates power even while not switching.

Silicon electronics has matured over half a century of continuous improvement. The intense research interest and rapid progress in Graphene-based devices, and this demonstration of the feasibility of Graphene integrated circuits, indicate that post-Silicon electronics might be just around the corner. Moore’s law will probably prove resilient into the near future.


All the room at the bottom December 8, 2009

Filed under: General Physics — Rāhul @ 18:42
Tags:

In December 1959, 50 years ago this month, Richard Feynman gave a talk to the American Physical Society at Caltech. Titled “There’s plenty of room at the bottom”, it laid out the promise of the as-yet-unborn field we today call nanotechnology and challenged physicists to turn their attention to unlocking the consequences of the laws of physics at that small scale. The potential of nanotechnology is widely recognised today, and significant effort and funding are directed to it. On the occasion of the 50th anniversary, I would like to review briefly Professor Feynman’s original talk and to explore how it has shaped nanotechnology research.

Feynman starts the talk by appreciating the unique journey of an experimentalist who makes the first inroads into a hitherto unreachable field, as Kamerlingh Onnes did in low-temperature physics, and proposes as a similar area the “problem of manipulating and controlling things on a small scale”. He then lays out the intriguing challenge of writing the entire 24 volumes of the Encyclopaedia Britannica on a pinhead by reducing all its writing linearly by a factor of 25,000 and, in the same vein, of putting all the information in the great libraries into a small block that can be carried about. Next, he talks about using codes of a few atoms, instead of letters and symbols, to compress information into even smaller dimensions, illustrating just how much room there is at the bottom. The central technological advance Feynman anticipates would drive all this is a better electron microscope. In 1959, electron microscopes could resolve dimensions as small as 1 nm. He challenges physicists to reduce this to 10 pm, a hundredfold improvement, which would let us look at and manipulate individual atoms.
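The arithmetic of the pinhead challenge is easy to check. A quick sketch, assuming ordinary print with letters about 2 mm tall (my figure, not Feynman’s):

```python
reduction = 25_000            # Feynman's linear reduction factor
letter_height_m = 2e-3        # ordinary print, roughly 2 mm tall (assumed)

# Shrinking linearly by 25,000x takes a 2 mm letter down to about 80 nm.
shrunk_height_m = letter_height_m / reduction
```

At roughly 80 nm, each letter would still span a few hundred atoms, which is why Feynman saw no physical obstacle to the feat, only a technological one.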

Throughout the lecture, Feynman only described possibilities that followed the laws of Physics as then understood but lay beyond the reach of technology. He focussed on the effects of miniaturisation on computers. In the 50s, computers with relatively few circuits filled entire rooms. If all the devices and circuits were made at the atomic level, he suggested, we could have computers with far more complicated circuits in a much smaller space, which is exactly what we have today. He then talked of how the problems of lubrication and heat dissipation would scale favourably at small dimensions. He also raised the possibility of nanorobots entering the bloodstream to conduct surgery, an idea that has since received considerable play in Science Fiction. Addressing the problem of assembly at the nano level, he suggested using a cascade of master-slave connections, either mechanical or electrical, that would progressively assemble at smaller and smaller scales, and he identified the need to improve the precision of the apparatus at each stage. As the final frontier, he considered the problem of rearranging atoms themselves so as to create everything from elements and compounds to minerals and virtually anything else. He ended by talking about how physical laws are very different at such small scales and by announcing prizes for a technology challenge in this direction.

Although his groundbreaking work in Quantum Electrodynamics was well behind him, Professor Feynman did not then enjoy the public reputation he does today, of a supremely brilliant and erudite yet witty and charming scientist. So it is interesting to ask why so many papers in nanotechnology cite this lecture as the beginning of the field, for there is no direct link between the talk and the various advances that came later. But in many ways, Feynman was prophetic. The electron microscope can today resolve down to 50 pm, which is as good as a biologist needs. Computers have indeed packed more and more circuits, devices and memory into smaller areas and grown powerful and complicated. But his vision of nano-level assembly and surgery seems no closer today than when he talked about it. In a series of articles this month, Nature Nanotechnology describes how a nascent field looked to this lecture as a focal point that drove the enormous advances of the last few decades. While Feynman got a lot right through his crystal ball, he also got some things that aren’t right yet!

Throughout the talk, the reader (and, I am sure, the listener!) can sense the scientific zeitgeist of the 50s: a reductionist viewpoint in which everything could finally be analysed by a set of physical laws. Chemistry, Biology and other studies, it was thought, could eventually be reduced to Physics, and once we had all the fundamental physical laws, we could build everything else from them. Although this point of view still holds much water and exerts a persistent romantic sway, it is undeniable that the major advances of the last few decades have been in Biology, Psychology and Neuroscience, and even many Physicists today take an emergent, rather than reductionist, view of the science. It can be argued that this signifies a failure of the vision and intellectual firepower required to make fundamental advances. Perhaps we will return, with some momentous discovery, to the reductionist viewpoint. But for now, Science continues to look where the light is for the needle lost in the dark and tries to push the frontiers of the lighted area ever outwards. Maybe it will be the ability to manipulate things on an atomic scale that eventually leads us to the next great leap forward!