
Graphene is Next

Graphene. If you’ve never heard of it, don’t worry; a lot of people haven’t, because it was only “discovered” relatively recently, and most of the truly interesting news about it has come in the last year. The amazing thing is that we’ve actually been using it for centuries, in the form of the common pencil. Graphene is a form of carbon, much like carbon nanotubes and other fullerenes, with one major difference: while fullerenes are 3D structures of carbon atoms, graphene is a flat sheet, a 2D lattice of carbon with bonds as strong as diamond. It’s this sheetlike nature that makes it so useful in a pencil. As you write, individual planes of graphite are sheared off the tip and deposited on the paper. Those individual planes are pure graphene.

By now, most of you are familiar with carbon nanotubes, a.k.a. CNTs, and their potential for computers. Graphene has equally amazing properties, including some that might make it far more readily usable than CNTs. First, like CNTs, graphene is capable of conducting electricity with much less resistance than copper. That alone makes it useful, but graphene has even more interesting properties. As New Scientist reports, bending graphene creates strains between atoms that can form isolated pathways, which then act as nanoribbons (wires) within the still-connected sheet. In other words, the morphology of graphene affects its electrical properties: change the flat sheet by bending parts of it, and you change how electricity flows through it.
But that isn’t all. The pattern of carbon bonds has effects as well. Graphene is a hexagonal grid of carbon, much like a roll of chicken wire. Remove a random atom from the pattern every so often, and graphene can exhibit magnetic behavior without the presence of any magnetic metals. Adding hydrogen into the mix creates graphene’s non-conductive cousin, graphane. Taking precisely defined patterns of atoms out of the sheet can create well-defined circuits, with wires that are almost superconducting.
All of these properties make graphene a very important material for the future of electronics. It has already been used to create field-effect transistors, the primary component of a computer processor. Combine this with the other features above, and you have a single material that could be used for the majority of the components in every electronic device we currently have, with one major difference: speed. Current silicon-based chips have a limited speed at which they can run at room temperature without overheating and malfunctioning; go much over 3 GHz without some major cooling and chips melt down. But replace those chips with graphene equivalents, without making any other changes to the circuits, and you can raise that limit much higher. Potentially 100 to 1,000 times higher.

Let’s think about that for a moment. That’s 300 GHz to 3,000 GHz, or 3 terahertz.
That’s a jump of two or three orders of magnitude up the exponential curve, my friends, especially when you combine it with the advances in multi-core technology and parallel computing. We’re talking about the smartphone in your pocket having a thousand times the computing power of your desktop PC, while using no more power than it does right now. The resistance of graphene at room temperature is so much lower than that of copper and silicon that even running at 1,000 times the speed, it uses no more current, and wastes no more energy as heat, than an identical silicon device, and that’s before considering any other possible advances in the field of electronics design.
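As a quick sanity check on the arithmetic above, here is a minimal Python sketch. The 3 GHz baseline and the 100x to 1,000x factors are simply the figures quoted in this article, not measured values:

```python
# Back-of-the-envelope check of the clock-speed claims above.
# Baseline and speedup factors are the article's figures, not measurements.
base_clock_hz = 3e9  # ~3 GHz, a typical room-temperature silicon clock

for factor in (100, 1000):
    clock = base_clock_hz * factor
    print(f"{factor:>4}x -> {clock / 1e9:,.0f} GHz ({clock / 1e12:g} THz)")
```

Running it confirms the range quoted above: 100x lands at 300 GHz (0.3 THz) and 1,000x at 3,000 GHz (3 THz).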
That big a leap in processing speed will simplify a lot of extremely complex tasks that involve processing enormous amounts of data. From SETI’s search for extraterrestrial intelligence to the search for all the ways a protein can fold, scientists already use millions of processors in parallel to speed up research. A thousand-fold increase in computer speed could cut months or years off the time their projects need. The same goes for DNA sequencing, data mining, and a host of other areas.
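To put the "months or years" claim in concrete terms, here is a toy calculation. It assumes a purely compute-bound workload that scales linearly with processor speed, which real projects only approximate (I/O, memory, and coordination overheads don't speed up for free); the 18-month figure is a made-up example, not a real project:

```python
def accelerated_runtime_days(current_days: float, speedup: float) -> float:
    """Runtime after a uniform speedup, assuming the job is purely compute-bound."""
    return current_days / speedup

# A hypothetical protein-folding run that takes 18 months (~548 days) today:
original_days = 18 * 30.44  # average days per month
print(f"{accelerated_runtime_days(original_days, 1000):.2f} days")  # roughly half a day
```

Under that idealized assumption, an 18-month computation collapses to about 13 hours.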
And science will not be the sole beneficiary. Most smartphones these days can use their cameras to create virtual overlays on the images they see, a technique called Augmented Reality. AR has advanced to the point that it’s possible to place virtual characters in photos on your phone using nothing more than a 2D patterned target on the ground, or to create interactive “virtual assistants” in projected video that are capable of interacting with real-world objects. Ultrafast computers will be essential for ushering in the age of Virtual Reality.
A massive increase in computer speeds is likely to benefit other complex computing tasks as well, such as real-time speech translation. Right now, it is difficult to make these programs run quickly enough to be useful. A thousand-fold increase in computer speed could make brute-force approaches practical, enabling computers to crunch through entire dictionaries in milliseconds. It could make possible the elusive conversational interface that so many people believe will be the next step in operating systems. That speed will also be useful in the next generation of robotics, quite possibly bringing us a step closer to the kind of robots seen in movies like I, Robot or Star Wars. Ultrafast computers would enable a major reduction in the size of the computers needed to run some of the most complex robots we currently have, bringing the day of Rosie the Robot maid that much closer.

Obviously, ultrafast computers are going to have a very far-reaching effect on the way we do things, as well as how we interact with each other and our world, so the real questions are how practical is it to make graphene chips, and how soon can they be made? The answer is probably going to surprise you. Graphene has already been proven to be usable in current chip manufacturing processes with only minimal retooling needed. In fact, IBM has already created working 30GHz test devices using graphene transistors. In other words, graphene could begin making its way into computers as early as 2012 to 2015, and almost certainly by 2020.
Graphene, that same single-atom-thick layer of carbon that is a part of every pencil mark, is going to make all of this possible. Not bad for the humble Number 2, huh?

And for an update:
At Berkeley Lab’s Advanced Light Source, scientists working with graphene have made the first observation of the energy bands of composite particles known as plasmarons. Understanding the relationships among three kinds of particles, charge carriers, plasmons, and plasmarons, may hasten the day when graphene can be used for “plasmonics”: building ultrafast computers, perhaps even room-temperature quantum computers, plus a wide range of other electronic, photonic, and plasmonic devices on the nanoscale.
“The interesting properties of graphene are all collective phenomena,” says Rotenberg, an ALS senior staff scientist responsible for the scientific program at ALS beamline 7, where the work was performed. “Graphene’s true electronic structure can’t be understood without understanding the many complex interactions of electrons with other particles.”
The electric charge carriers in graphene are negative electrons and positive holes, which in turn are affected by plasmons—density oscillations that move like sound waves through the “liquid” of all the electrons in the material. A plasmaron is a composite particle, a charge carrier coupled with a plasmon.
Plasmons have been considered as a means of transmitting information on computer chips, since they can support much higher frequencies (into the 100 THz range, whereas conventional wires become very lossy above tens of GHz). For plasmon-based electronics to be useful, an analog to the transistor, called a plasmonster, must be invented.
Graphene used conventionally could reach 3 to 10 THz; graphene plasmonic computers could reach 300 to 1,000 THz; and graphene plasmonic "wires" could carry data at those speeds across continents.

And further updates:
A high-performance top-gate graphene field-effect transistor (G-FET) has been fabricated and used to construct a highly efficient frequency doubler. Taking advantage of the high gate efficiency and low parasitic capacitance of the top-gate device geometry, the gain of the graphene frequency doubler is about ten times that of the back-gate G-FET based device. The frequency response is also pushed from 10 kHz for a back-gate device to 200 kHz, at which most of the output power is concentrated at the doubled fundamental frequency of 400 kHz.
IBM recently showed that graphene transistors can operate at up to 100 GHz, and the group at Peking University believes the material may operate well even in the THz regime.
* * * * * * * * * * * * * * * * * *

And yet more Graphene news:
[quote] Quantum dots are crystalline molecules from a few to many atoms in size that interact with light and magnetic fields in unique ways. The size of a dot determines its band gap – the amount of energy needed to close the circuit – and makes it tunable to a precise degree. The frequencies of light and energy released by activated dots make them particularly useful for chemical sensors, solar cells, medical imaging and nanoscale circuitry.
Singh and Penev calculated that removing islands of hydrogen from both sides of a graphane matrix leaves a well with all the properties of quantum dots, which may also be useful in creating arrays of dots for many applications.
Their work revealed several interesting characteristics. They found that when chunks of the hydrogen sublattice are removed, the area left behind is always hexagonal, with a sharp interface between the graphene and graphane. This is important, they said, because it means each dot is highly contained; calculations show very little leakage of charge into the graphane host material.[/quote]
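The quote's point that a dot's size sets its band gap can be illustrated with the textbook particle-in-a-box estimate, where the confinement energy scales as 1/L². This is a generic quantum-confinement sketch, not a calculation specific to graphene/graphane dots, and the idealized 1D infinite well overstates real band gaps:

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
M_E = 9.1093837015e-31  # electron mass, kg
EV = 1.602176634e-19    # joules per electronvolt

def confinement_energy_ev(box_nm: float) -> float:
    """Ground-state energy of an electron in a 1D infinite well of width box_nm."""
    L = box_nm * 1e-9
    return (HBAR * math.pi) ** 2 / (2 * M_E * L ** 2) / EV

for size in (1.0, 2.0, 4.0):
    print(f"{size:.0f} nm dot -> {confinement_energy_ev(size):.3f} eV")
```

A 1 nm well comes out at about 0.38 eV, and doubling the size cuts the energy by a factor of four, which is the "tunable to a precise degree" behavior the quote describes.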
Now, if you are a reader of science fiction, in particular that of Wil McCarthy, you will have read about a substance called WELLSTONE, a form of programmable matter (a concept also explored under the name claytronics).
Need I go on?
Now there are far more applications that will be exploited much sooner than the possibilities of wellstone, such as quantum dot transistors, LEDs, etc. So imagine a carbon display, with pixels smaller than the rods and cones in your eyes, built into a contact lens.
Such an amazingly useful material carbon is. Graphane is graphene with a layer of hydrogen bonded to each side. If you look closely at the top picture in the article, I do believe it actually shows GRAPHANE (i.e. the little red balls are hydrogen atoms). This process creates hexagonal "wells" of conductive graphene (carbon with no hydrogen) isolated from one another by nonconductive graphane (carbon with hydrogen). Combine this with graphene's other properties, and you can see how it could enable some amazing possibilities.

