Table of contents

ICT
- Qubits in the pink
- Quantum physics: Information on heat
- Coherence in Excitons
- Physics promises wireless power
- Cloudy day won't rain on laser communications
- Bringing combined positioning and communications technologies to market
- Integrated solution for managing produce transport fleets

Microelectr. & Nanotech
- First routine use of nanotubes in production of CMOS
- A Step Closer to Nanotube Computers
- Electron beams shrink carbon nanotubes to order
- Gels could power tiny devices
- Neural networking nanotubes
- Three-dimensional polymer with unusual magnetism

Energy
- Electricity from Sugar Water

Life sciences
- Antiprotons Four Times More Effective than Protons for Cell Irradiation
- Bacteria Could Make New Library Of Cancer Drugs That Are Too Complex To Create Artificially
- Plaster cure for cancer
- Nanotech Triple Threat to Cancer
- Comprehensive model is first to map protein folding at atomic level

ICT
Qubits in the pink
Crystal imperfections known as nitrogen–vacancy defects give some diamonds a characteristic pink
colour. Appropriately manipulated, these defects might have rosy prospects as the 'qubits' of a
quantum computer. Ronald Hanson and colleagues from the University of California, Santa Barbara, report in Physical Review Letters new developments in the study of negatively charged 'nitrogen–vacancy defects' in
diamond. These systems are rapidly becoming a front-runner for use as the basic unit of quantum
information — the 'qubit' — in a solid-state quantum computer.
The lattice of carbon atoms that makes up diamond can contain various substitutional impurities,
such as nitrogen or boron atoms. These defects give diamonds their colour, and are often called
colour centres. Nitrogen–vacancy (NV) defects give diamond a pink hue, and arise when a nitrogen
atom replaces a carbon atom at a position in the diamond lattice next to a vacant site. It might seem
unlikely that these two defects should sit right next to each other, but if the diamond is heated,
vacancies can diffuse through the lattice until they encounter nitrogen atoms. When this happens,
the 'random walk' comes to a halt because the configuration of the two adjacent defects is extremely
stable.
NV centres come in two flavours: electrically neutral (NV0) and negatively charged (NV-).
Here, we are interested only in the NV- centres, which have an extra electron that is probably
donated by another nitrogen defect. The total number of electrons in an NV- defect that do not form
bonds between neighbouring carbon atoms is six, and these electrons have a ground state with total
spin S = 1. This spin can have three different orientations, described by a magnetic quantum
number mS = +1, 0 or -1. If the spin tends to take one of these values, it is said to be polarized.
A well-defined, 'coherent' spin state can be preserved in the NV- centre for a long time (more than
50 µs, even at room temperature). The energy difference between the mS=0 state and the mS=1
states is about 12 µeV. Photons with a microwave frequency of 3 GHz have precisely this energy,
and can therefore be used to manipulate the spin state of the NV- centre. To 'read out' the spin state,
transitions to much higher energy states are used. These transitions can be induced by shining a
laser on the diamond. The light that is emitted when the NV- centre relaxes back to the lower energy
levels tells us what the spin state was. This process is called photoluminescence.
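As a quick check of the numbers quoted above (our own back-of-the-envelope sketch, not part of the paper), the ~12 µeV splitting can be converted to a photon frequency through E = hf:

```python
# Back-of-the-envelope check (not from the paper): does a ~12 ueV
# splitting really correspond to a ~3 GHz microwave photon via E = h*f?
from scipy.constants import e, h   # elementary charge, Planck constant

E_split = 12e-6 * e                # 12 microelectronvolts, converted to joules
f = E_split / h                    # photon frequency in hertz

print(f"Transition frequency: {f / 1e9:.2f} GHz")   # ~2.90 GHz
```

The result, roughly 2.9 GHz, matches the microwave frequency given above.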
The NV- spins give us much of what we need for a practical qubit — long coherence times and
precise external control — and have been widely studied in the context of quantum computing [4].
But considerable obstacles must be overcome before effective quantum computing with NV- is
possible. Perhaps the toughest of these is scaling the system up to many qubits.
Hanson et al. take a significant step towards solving this problem. They demonstrate a coupling
between an NV- centre and a substitutional nitrogen centre, N, with no associated vacancy (this is
not the same nitrogen defect that supplies the NV- with its extra electron). This 'N centre' also has a
spin, but with different magnetic quantum numbers: mS = ±1/2. When a magnetic field is applied, the
energy-level structure of both the NV- and N spins changes, a phenomenon known as the Zeeman
effect. At certain values of the field, the energy gaps between the spin levels in the two defects
become equal. If there is an interaction between NV- and N, this 'resonance' allows the two defects
to exchange energy and polarization through a so-called virtual photon. This reduces the
photoluminescence signal because the NV- is no longer in the correct quantum state to emit light.
In a further experiment, the authors bring a microwave field into resonance with the NV- defect.
This too leads to an exchange of polarization, this time between the NV- spin and the field. If the
NV- is isolated in the diamond lattice, a single dip should be observed in the photoluminescence
signal. But Hanson et al. see two dips in their signal, again because of the interaction with the N
centre. The two possible orientations of the N spin cause different effective magnetic fields at the
NV- centre, and shift the resonance accordingly. Importantly, this shift allows us to read out the spin
state of the N centre by simply observing the photoluminescence of the NV- centre. The same
experiment shows that a specific state of the N centre can be prepared on demand.
As N centres are very common in samples containing NV- defects, the interactions described above
are a main cause of NV- qubits losing coherence. Being able to control an N centre is a promising
way of reducing this decoherence to an acceptable level.
The N–NV- system can embody two electron-spin qubits, so these experiments represent a
significant step towards a larger-scale quantum computer based on the NV- system. The authors
suggest that several NV- centres could be connected through chains of N defects. This seems to us
very ambitious; a better way of achieving a many-qubit register could be to use a form of
'measurement-based' quantum computing, in which quantum correlations are created between qubits
before any quantum algorithms are executed. In this context, the N–NV- system could be ideal for a
protocol that creates the required quantum correlations between 'broker' qubits through optical
manipulations and subsequently transfers them to 'client' qubits that have longer decoherence times,
but no optical transitions [6]. With such a strategy, the future for quantum computing would look rosy
indeed.
Quantum physics: Information on heat
There is a fundamental quantum limit to heat flow, just as there is to electric current. This
limit is independent of what carries the heat, and could also have a role in an unexpected
quarter: information theory.
In the past 20 years, physicists have learnt a tremendous amount about the transport of matter and
energy through devices small enough for quantum effects to come into play. One surprising fact that
has emerged is that the rates of transport in such devices, expressed for example by their electronic
or thermal conductance, have simple quantum-mechanical limits. Writing in Nature, Meschke et al. extend
this principle to heat conduction by photons. Although the result will certainly have practical
ramifications for the engineering of ultra-sensitive detectors, sensors and microelectronic
refrigerators, the physics behind it hints at more fundamental truths.
The experimental trail leading to this point starts in 1988, when two groups independently
demonstrated the quantization of electron transport through a single 'ballistic' channel in which the
electron's movement is impeded only negligibly by scattering.
Experimental evidence that the transport of thermal energy is also quantized dates back to 1999.
Then Michael Roukes and colleagues demonstrated that the movement of heat through discrete, freely
suspended mechanically vibrating channels approached a previously predicted maximum rate for
each independent vibrational mode — longitudinal, transverse and torsional — of the structure.
This universal rate is the quantum of thermal conductance, GQ, and comes in units of π²kB²T/3h.
Here, kB is Boltzmann's constant and T is the prevailing temperature; the exact maximum rate of
heat transport through a quantum device therefore increases linearly with temperature.
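For readers who want to put a number on GQ, here is a minimal Python sketch (ours, using only the formula above and standard constants):

```python
# The quantum of thermal conductance, G_Q = pi^2 * k_B^2 * T / (3h),
# evaluated at two temperatures to show the linear growth with T.
import math
from scipy.constants import k as k_B, h   # Boltzmann and Planck constants

def thermal_conductance_quantum(T):
    """Maximum heat conductance (W/K) of one ballistic channel at T kelvin."""
    return math.pi ** 2 * k_B ** 2 * T / (3 * h)

print(f"G_Q at 1 K:   {thermal_conductance_quantum(1.0):.2e} W/K")   # ~9.5e-13
print(f"G_Q at 0.1 K: {thermal_conductance_quantum(0.1):.2e} W/K")
```

At around 1 K, the scale of such experiments, a single channel can therefore carry heat at no more than roughly a picowatt per kelvin.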
The mechanical vibrations that transported the heat in this case are called phonons, and are
analogous to the quantized vibrations of the electromagnetic field — better known as photons.
The advance made by Meschke’s team is to demonstrate for the first time the quantized conduction
of heat by photons. In their experiment, two microscopic electronic resistors exchanged heat
through random thermal voltage fluctuations transmitted through two superconducting wires. With
some clever use of further superconducting circuitry, the authors could switch the electrical
conduction channel on and off, and thus expose the thermal connection between the two resistors
brought about by the photons emitted and absorbed by them both. The authors show that the rate of
heat exchange between the two resistors in this case is given simply by GQ.
Thus, like the quantum of electrical conductance, GQ is very general, and independent of the nature
of the material connection between two heat reservoirs. Nor does it depend on the type of particle
that carries the heat: the same GQ is the limit for the conductance of heat by electrons, phonons,
photons, gravitons, you-name-it-ons. Thus, GQ is universal in a much deeper sense than the
quantum of electrical conductance, which depends on the quantum statistics of the particles.
A further connection that might seem somewhat surprising at first glance is that the quantum limit
for heat transport is intimately related to the maximum classical information capacity of a single
quantum channel. This connection rests on the deep relationship between information and entropy established in Shannon's theory. One of the more interesting treatments made explicit the connection between the limits on information flow and the maximum heating and cooling rates. In this sense, the work of Meschke and colleagues is
of more fundamental importance than just investigating the behaviour of microscopic resistors
exchanging heat. What they have demonstrated is two resistors babbling to each other using thermal
voltage noise. As these resistors are near-perfect 'black-body' radiators, they emit and absorb
radiation fields of maximum entropy, and so — according to Shannon's work — maximum
information content. The authors have proved that this information can be carried by particles of
very different natures: photons do just as well as phonons.
Coherence in Excitons
Physicists at UC San Diego have for the first time observed the spontaneous production of
coherence within “excitons,” the bound pairs of electrons and holes that enable semiconductors to
function as novel electronic devices. This newly discovered property could eventually help the
development of novel computing devices and provide new insights into the quirky quantum
properties of matter.
[Figure: Excitons self-organize into an ordered array of microscopic droplets, like a miniature pearl necklace; the wavelike interference pattern reveals the spontaneous coherence of the excitons.]
The study, headed by Leonid Butov, led to the discovery that excitons, when made sufficiently
cold, tend to self-organize into an ordered array of microscopic droplets, like a miniature pearl
necklace. “What is coherence and why is it so important?” said Butov. “To start with, modern
physics was born by the discovery that all particles in nature are also waves. Coherence means that
such waves are all ‘in sync.’ The spontaneous coherence of the matter waves is the reason behind
some of the most exciting phenomena in nature such as superconductivity and lasing.”
“Excitons are particles that can be created in semiconductors, in our case, gallium arsenide. One can
make excitons, or excite them, by shining light on a semiconductor. The light kicks electrons out of
the atomic orbitals they normally occupy inside of the material. And this creates a negatively
charged ‘free’ electron and a positively charged ‘hole’.” The force of electric attraction keeps these
two objects close together, like an electron and proton in a hydrogen atom. It also enables the
exciton to exist as a single particle rather than a non-interacting electron and hole. However, it can
be the cause of the excitons’ demise. Since the electron and hole remain in close proximity, they
sometimes annihilate one another in a flash of light, similar to annihilation of matter and antimatter.
To suppress this annihilation, Butov and his team separate electrons and their holes in different
nano-sized structures called quantum wells.
“Excitons in such nano-structures can live a thousand or even a million times longer than in a
regular bulk semiconductor,” said Butov. “These long-lived excitons can be prepared in large
numbers and form a high density exciton gas. But whether excitons can cool down to low
temperatures before they recombine and disappear has been a key question for scientists.”
“What we found was the emergence of spontaneous coherence in an exciton gas,” added Butov.
“This is evidenced by the behavior of the coherence length we were able to extract from the light
pattern (as shown in the figure) emitted by excitons as they recombine. Below a temperature of about five degrees above absolute zero, the coherence length becomes clearly resolved and
displays a steady and rapid growth as temperature decreases. This occurs in concert with the
formation of the beads of the ‘pearl necklace.’ The coherence length reaches about two microns at
the coldest point available in the experiment.”
Journal Reference: Physical Review Letters.
Physics promises wireless power
The tangle of cables and plugs needed to recharge today's electronic gadgets could soon be a thing
of the past. US researchers have outlined a relatively simple system that could deliver power to
devices such as laptop computers or MP3 players wirelessly. The concept exploits century-old
physics and could work over distances of many metres, the researchers said.
Although the team had not built and tested a system, computer models and mathematics suggest it
would work. "There are so many autonomous devices such as cell phones and laptops that have
emerged in the last few years," said Marin Soljacic from the Massachusetts Institute of Technology
and one of the researchers behind the work. "We started thinking, 'it would be really convenient if
you didn't have to recharge these things'. And because we're physicists we asked, 'what kind of
physical phenomenon can we use to do this wireless energy transfer?'."
The answer the team came up with was "resonance", a phenomenon that causes an object to vibrate
when energy of a certain frequency is applied. "When you have two resonant objects of the same
frequency they tend to couple very strongly," he said. In particular, the team's system exploits the resonance
of electromagnetic waves.
Typically, systems that use electromagnetic radiation, such as radio antennas, are not suitable for
the efficient transfer of energy because they scatter energy in all directions, wasting large amounts
of it into free space. To overcome this problem, the team investigated a special class of "nonradiative" objects with so-called "long-lived resonances". When energy is applied to these objects it
remains bound to them, rather than escaping to space. "Tails" of energy, which can be many metres
long, flicker over the surface.
"If you bring another resonant object with the same frequency close enough to these tails then it
turns out that the energy can tunnel from one object to another," said Professor Soljacic.
Hence, a simple copper antenna designed to have long-lived resonance could transfer energy to a
laptop with its own antenna resonating at the same frequency. The computer would be truly wireless.
Any energy not diverted into a gadget or appliance is simply reabsorbed.
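The essence of this resonant transfer can be illustrated with textbook coupled-mode theory. The following Python sketch is our own toy illustration with arbitrary parameters, not the MIT team's model; it shows the energy of two identical, lossless resonators sloshing completely from one to the other:

```python
# Toy coupled-mode sketch (not the MIT model): two lossless resonators of
# the same frequency, coupled at rate kappa, exchange all of their energy.
import numpy as np

kappa = 2 * np.pi * 1e3                        # coupling rate, rad/s (arbitrary)
t = np.linspace(0, np.pi / (2 * kappa), 500)   # up to the full-transfer time

# With all energy initially in resonator 1, the mode amplitudes beat at
# rate kappa, so the two energies follow cos^2 and sin^2 envelopes.
energy_1 = np.cos(kappa * t) ** 2
energy_2 = np.sin(kappa * t) ** 2

print(f"Energy left in resonator 1: {energy_1[-1]:.3f}")   # -> 0.000
print(f"Energy in resonator 2:      {energy_2[-1]:.3f}")   # -> 1.000
```

In a real system, losses and the fall-off of the coupling with distance determine how much power actually arrives, which is where the long-lived non-radiative resonances described above come in.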
The systems that the team have described would be able to transfer energy over three to five metres.
"This would work in a room let's say but you could adapt it to work in a factory," he said. "You
could also scale it down to the microscopic or nanoscopic world."
The team from MIT is not the first group to suggest wireless energy transfer. Others have worked
on highly directional mechanisms of energy transfer such as lasers. However, these require an
uninterrupted line of sight, and are therefore not good for powering objects around the home. A UK
company called Splashpower has also designed wireless recharging pads onto which gadget lovers
can directly place their phones and MP3 players to recharge them. The pads use electromagnetic
induction to charge devices, the same process used to charge electric toothbrushes.
However, transferring the power is only part of the solution. "There are a number of other aspects
that need to be addressed to ensure efficient conversion of power to a form useful to input to
devices."
Professor Soljacic will present the work at the American Institute of Physics Industrial Physics
Forum.
Cloudy day won't rain on laser communications
Just as clouds block the sun, they interfere with laser communications systems, but Penn State
researchers are using a combination of computational methods to avoid this effect.
"Radio frequency communications are generally reliable and well understood, but cannot support
emerging data rate needs unless they use a large portion of the radio spectrum," says lead author
Mohsen Kavehrad. "Free space optical communications offer enormous data rates but operate much
more at the mercy of the environment."
Laser light used in communications systems can carry large amounts of information, but the dust, dirt, water vapor and gases in clouds scatter the light and create echoes. The loss of some light to scattering matters less than the parts of the beam that are deflected yet still reach their target, because those deflected parts arrive at the endpoint at different times.
"All of the laser beam photons travel at the speed of light, but different paths make them arrive at
different times," says Kavehrad. "We would like to deliver close to 3 gigabits per second of
data over a distance of 6 to 8 miles through the atmosphere." That 6 to 8 miles is sufficient to cause
an overlap of arriving data of hundreds of symbols, which causes echoes. The information arrives,
but then it arrives again because the signal is distributed throughout the laser beam. In essence, the
message is continuously being stepped on.
"In the past, laser communications systems have been designed to depend on optical signal
processing and optical apparatus," says Kavehrad. "We coupled state-of-the-art digital signal
processing methods to a wireless laser communications system to obtain a reliable, high capacity
optical link through the clouds." The researchers developed an approach called free-space optical
communications that not only can improve air-to-air communications, but also ground-to-air links.
Because their approach provides fiber optic quality signals, it is also a solution for extending fiber
optic systems to rural areas without laying cable and may eventually expand the Internet in a third
dimension, allowing airplane passengers a clear, continuous signal.
Using a computer simulation called the atmospheric channel model, the researchers first process the
signal to shorten the overlapping data and reduce the number of overlaps. Then the system
processes the remaining signal, picking out parts of the signal to make a whole and eliminate the
remaining echoes. This process must be continuous with overlap shortening and then filtering so
that a high-quality, fiber optic caliber message arrives at the destination. All this, while one or both
of the sender and receiver are moving. "We modeled the system using cumulus clouds, the dense
fluffy ones, because they cause the most scattering and the largest echo," says Kavehrad. "Our
model is also being used by Army contractors to investigate communications through smoke and
gases and it does a very good job with those as well."
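As a rough illustration of this echo problem and the equalization idea (a toy of ours, not Kavehrad's actual algorithm; the channel taps and filter sizes are invented), consider a channel that smears each symbol into delayed echoes and an adaptive filter that learns to undo them:

```python
# Toy multipath-echo channel plus an LMS equalizer; all parameters are
# invented for illustration, not the Penn State system's values.
import numpy as np

rng = np.random.default_rng(0)
symbols = rng.choice([-1.0, 1.0], size=20000)         # random bipolar data

channel = np.array([1.0, 0.0, 0.5, 0.0, 0.3])         # direct path + two echoes
received = np.convolve(symbols, channel)[: len(symbols)]
received = received + 0.01 * rng.standard_normal(len(received))

n_taps, delay, mu = 11, 5, 0.005                      # equalizer settings
w = np.zeros(n_taps)
for i in range(n_taps - 1, len(received)):
    window = received[i - n_taps + 1 : i + 1][::-1]   # most recent samples first
    err = symbols[i - delay] - w @ window             # train against known data
    w += mu * err * window                            # LMS tap update

equalized = np.convolve(received, w)[: len(symbols)]
decisions = np.sign(equalized[delay:])                # hard symbol decisions
ser = np.mean(decisions != symbols[: len(symbols) - delay])
print(f"Symbol error rate after equalization: {ser:.4f}")
```

The real system works on a far harsher channel, with overlaps spanning hundreds of symbols, which is why the two-step shortening-then-filtering structure matters.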
The computer modeled about a half-mile traverse of a cumulus cloud. While the researchers admit
that they could simply process the signal to remove all echoes, the trade-offs would degrade the
system in other ways, such as distance and time. Using a two-step process provides the most
reliable, high-quality data transfer.
The system also uses commercially available off-the-shelf equipment and proven digital signal
processing techniques.
Bringing combined positioning and communications technologies to market
LIAISON is one of the largest current initiatives to develop and implement a new generation of
location-based services (LBS) for the professional market. The project approach, based on what is
called 'enhanced assisted GPS', is designed to improve the speed, accuracy and reliability of
existing GPS systems, allowing a whole new range of time, cost and life-saving services to be
developed.
The three-and-a-half year initiative involves more than 30 partners from 10 European countries and,
after an extensive test programme commencing in November 2006, should result in several
commercial systems. Some applications are likely to be on the market before the project ends in
April 2008.
The first tests involve deploying location-based services to aid remote workers for French broadcast
service provider TDF, and also to enhance data collection for Ama, a waste management company
in Rome. The second set of trials, which are scheduled to begin May 2007, entail assisting
maintenance workers for Spanish electricity supplier Endesa, and operating an automated dispatch
system for taxi drivers in Greece. In the third test phase, beginning January 2008, LBS will be
implemented for the Sussex police force in the United Kingdom and for fire fighters in Italy.
“The last two trials will be the most demanding in terms of performance requirements, and will be
the definitive test of our approach to deploying LBS for professionals,” says LIAISON coordinator
Rémi Challamel of Alcatel Space in France.
Armed with a mobile terminal such as a smart phone, PDA or notebook computer linked to the
LIAISON system, police officers and fire fighters could respond more quickly to an incident and
control centres would be able to better manage emergencies. “In the event of a terrorist attack at an
airport, emergency response coordinators could track police officers at all times and be able to
quickly cordon off the area. In the event of a hospital fire, fire fighters could pinpoint precisely the
source of the blaze and better manage the evacuation of patients,” Challamel explains.
The key difference between the enhanced assisted GPS being implemented by LIAISON and
standard GPS is a “substantial” improvement in every aspect of the location-based services, the
coordinator notes. By combining GPS with an external server to refine the raw location data,
LIAISON can pinpoint a person's location to within one or two metres, compared to a variation of up to 20 metres that is common with standard GPS, especially when someone is moving. The EU’s
Galileo positioning system could further improve performance.
“Even more importantly, a user can pick up the signal and identify their location within seconds, not
minutes, and the system works even in the most challenging environments, such as in urban
canyons surrounded by high buildings or in dense forests,” Challamel says. “We are also working
on techniques to provide LBS inside buildings.” While for outdoor use the LIAISON system relies
on a combination of GPS and mobile communications technologies, inside buildings WiFi can be
used for accurate location mapping. In the trials, user terminals employing the Terrestrial Trunked
Radio (TETRA) standard will be used, while the services themselves will be designed for each
specific user community. “A police officer has different needs to a taxi driver, so different services
need to be developed even if the core components of the system remain the same,” he explains.
Though much of the project team’s work is focused on refining and testing the technology, they are
also tackling other challenges that have hindered wider LBS deployment to date. The LIAISON
partners held a joint workshop in September 2006 together with the team from ISHTAR, another
IST initiative working on the harmonisation and standardisation of LBS technologies. The
workshop focused primarily on defining new business models and analysing current market trends.
Challamel is confident that LBS will take off over the coming years, with the professional market
likely to account for 70 percent of initial demand. From there, he expects positioning systems to be
the next big thing for mobile users in the mass market, just as mobile phones equipped with cameras
and MP3 players have been the must-haves of recent years.
Integrated solution for managing produce transport fleets
Transport firms that supply Europe’s supermarkets and grocery stores with refrigerated and frozen
produce could soon save millions of euros a year, once a new, integrated fleet-management solution
hits the market. An innovative software system that allows transport managers to track the status of
their haulage fleet at all times has recently been implemented by three European companies, thanks
to the Cold-Trace project co-funded under the eTEN programme. The technology was originally
developed under the two-year IST project ColdRoad. In addition to tracking the movements of
vehicles, the Cold-Trace system also monitors temperatures inside the trailers, to ensure that
produce reaches its destination in the optimum condition.
“It may seem surprising, but although systems to monitor the temperature inside trucks and GPS
systems to track their location have long been on the market, there has been no integrated solution
combining those applications with work-order management,” explains Yolanda Ursa, the manager
of the Cold-Trace project at INMARK in Spain. By combining these three applications, the system
evaluated by the Cold-Trace team provides fleet managers with all the information necessary to
ensure the safe, prompt and efficient delivery of produce in the most cost-effective way possible.
“Fleet managers know where all their trucks are at all times, if the driver has stopped, if there have
been any accidents and if the goods are at the right temperature.”
Where in the past fleet managers have relied on calling drivers to find out where they are or to
check if a pick-up or drop-off went ok, the Cold-Trace system gives them the information for the
whole fleet on a PC screen. A 'black box' in each truck is connected to a server in the fleet
manager’s office via a GPRS connection, while each driver has a standard PDA. GPS location data
and information from sensors placed around the vehicle are all fed into the black box, and from
there fed back to fleet headquarters. “Normally there are temperature sensors inside and outside the
vehicle, as well as a sensor to tell if the door is open, which would obviously severely disrupt the
cold chain,” Ursa explains. If any temperature increase above set parameters is detected, an alarm
sounds, alerting both the driver (who receives the information on his PDA) and the fleet manager.
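The kind of per-report check this implies might look like the following sketch (our illustration; the field names and thresholds are invented, not Cold-Trace's actual data model):

```python
# Hypothetical telemetry check of the kind a fleet server might run on each
# black-box report; all names and thresholds here are invented.
from dataclasses import dataclass

@dataclass
class TruckReport:
    truck_id: str
    lat: float
    lon: float
    speed_kmh: float
    trailer_temp_c: float
    door_open: bool

def check_alarms(report: TruckReport, max_temp_c: float = -18.0) -> list[str]:
    """Return alarm messages raised by a single telemetry report."""
    alarms = []
    if report.trailer_temp_c > max_temp_c:
        alarms.append(f"{report.truck_id}: trailer at {report.trailer_temp_c} C, "
                      f"above the {max_temp_c} C set point")
    if report.door_open and report.speed_kmh > 0:
        alarms.append(f"{report.truck_id}: door open while moving")
    return alarms

# A frozen-goods trailer drifting above its set point triggers an alarm.
print(check_alarms(TruckReport("T-042", 40.4, -3.7, 85.0, -15.5, False)))
```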
Fleet managers are informed where the vehicle is at all times, including the speed it is travelling and
if the driver makes any unplanned stops, while the driver is provided with real-time location information and guidance to his destination.
A remote-control system even allows the fleet manager or driver to set refrigeration levels in the
truck remotely. “That feature alone saves the driver half an hour every morning, which is the time it
takes on average to cool off a truck before it can be loaded,” Ursa explains.
“In terms of the economic savings, we estimate that remote-controlled pre-cooling would save
companies over two thousand euros per driver per year, roughly six percent of labour costs, or result
in an equivalent increase in productivity,” she adds.
Other cost and time-saving benefits of the system have also been quantified by the project team. For
example, the validation trials indicated that companies could save as much as 1,600 euros per truck
per year, just by using the system to optimise routes and cargo and thus lessen the amount of time
the truck is on the road empty.
Microelectronics & Nanotech
First routine use of nanotubes in production of CMOS
Nantero Inc., a Massachusetts company using carbon nanotubes for the development of next-generation semiconductor devices, announced it has resolved the major obstacles that had been
preventing carbon nanotubes from being used in mass production in semiconductor fabs.
Nanotubes are widely acknowledged to hold great promise for the future of semiconductors, but
most experts had predicted it would take a decade or two before they would become a viable
material. This was due to several historic obstacles that prevented their use, including a previous
inability to position them reliably across entire silicon wafers and contamination previously mixed
with the nanotubes that made the nanotube material incompatible with semiconductor fabs.
Nantero announced it has developed a method for positioning carbon nanotubes reliably on a large
scale by treating them as a fabric which can be deposited using methods such as spincoating, and
then patterned using lithography and etching. The company said it has been issued patents on all the
steps in the process, as well as on the article of the carbon nanotube fabric itself.
Nantero has also developed a method for purifying carbon nanotubes to the standards required for
use in a production semiconductor fab, which means consistently containing less than 25 parts per
billion of any metal contamination.
With these innovations, Nantero has become the first company in the world to introduce and use
carbon nanotubes in mass-production semiconductor fabs.
A Step Closer to Nanotube Computers
Stanford researchers' new etching method shows promise for bulk manufacturing of
nanotube electronics.
Semiconducting carbon nanotubes could be the centerpiece of low-power, ultra-fast electronics of the
future. The challenge is getting them to work with today's manufacturing processes. Now
researchers at Stanford University have made an important advance toward large-scale nanotube
electronics. They have created functional transistors using an etching process that can be integrated
with the methods used to carve out silicon-based computer chips.
A major roadblock to making carbon-nanotube transistors has been the difficulty of separating
semiconducting tubes from a typical batch of nanotubes, in which about a third of the material is
metallic. Even a tiny percentage of metallic tubes would short a device, causing it to fail. The
established but tricky approach to making transistors involves separating out semiconducting
nanotubes and then arranging them into circuits.
Hongjie Dai and his colleagues take a new approach. They grow a mixed bunch of semiconducting
and metallic nanotubes on a silicon wafer and have them bridge the source and drain of a transistor.
Then they expose the devices to methane plasma at 400 °C. The hot, ionized methane particles eat
away the carbon atoms, but only in the metallic nanotubes, converting the tubes into a hydrocarbon
gas. (The plasma also etches out nanotubes with diameters smaller than about 1.4 nm.) Next, the
researchers treat the wafer in a vacuum at a temperature of 600 °C; this treatment gets rid of carbon-hydrogen groups that latch on to the semiconducting nanotubes during the plasma treatment. This
leaves behind purely semiconducting nanotubes with a consistent range of diameters stretching
across the source and drain.
According to Dai, the process, which the researchers described in Science, could be made into a
bulk manufacturing process, because it is compatible with silicon-semiconductor processing. In fact,
the researchers utilize a furnace that was previously used for silicon chips. The process should not
be expensive once the equipment is set up, Dai adds, because "methane is really cheap and the
temperature is only a few hundred degrees Celsius."
Combining the new process with traditional separating methods would be very powerful, Dai says.
Current sorting methods--growing nanotubes selectively or separating them chemically in a
solution--are tedious and nonscalable, and at best they create a mix containing 5 to 10 % metallic
nanotubes. But one could use this high concentration of semiconducting material to make devices
and then "add selective etching to get to 100 % selectivity," Dai says.
The technique's main constraint is that it works in a narrow diameter range. As the Stanford
researchers show in their paper, the plasma selectively etches metallic nanotubes only when the
molecules are between 1.4 and 2 nm wide; the method gets rid of all nanotubes narrower than 1.4
nm, and leaves intact metallic and semiconducting nanotubes that are wider than 2 nm. Dai
acknowledges this limitation and notes that "the method will have the most potential if the starting
material is carefully chosen."
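The survival rule the paper reports can be summarized in a few lines (a restatement of the numbers above, not code from the study):

```python
# The diameter/type selection window described above: the plasma removes
# all tubes below 1.4 nm, removes only metallic tubes between 1.4 and 2 nm,
# and leaves everything wider than 2 nm intact.
def survives_etch(diameter_nm: float, metallic: bool) -> bool:
    if diameter_nm < 1.4:
        return False              # all narrow tubes are etched away
    if diameter_nm <= 2.0:
        return not metallic       # the selective window
    return True                   # wide tubes survive regardless of type

for d, m in [(1.2, False), (1.7, True), (1.7, False), (2.5, True)]:
    kind = "metallic" if m else "semiconducting"
    print(f"{d} nm {kind}: {'kept' if survives_etch(d, m) else 'etched'}")
```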
Electron beams shrink carbon nanotubes to order
[Figure: Hooked up to a current, a nanotube shrinks in diameter from 16 nm to 11 nm, to 4 nm, and to 3 nm.]
A way of controllably shrinking carbon nanotubes has been developed by US researchers. They say
the technique could someday be used to make faster computers and other novel electronic devices.
Carbon nanotubes have been used to make a variety of different nanoscale electronic devices,
including sensors and transistors. These can outperform conventional components, working at
higher frequencies and sensitivities, thanks to the novel physical and electronic properties of
nanotubes.These properties, however, depend strongly on the dimensions of each tube. And, until
now, there has been no reliable way to make nanotubes to order. This means "nanotube device
fabrication is an unpredictable process", says lead author Alex Zettl, at the University of California
at Berkeley, who worked with another team at Lawrence Berkeley National Laboratory.
The shrinking technique can be used to compress a nanotube to a particular diameter, which could
then in theory be used as particular types of electronic component. The process begins with
depositing a solution of nanotubes on top of a silicon wafer. A scanning electron microscope is used
to select a tube to which gold electrodes can be easily attached at each end. The shrinking begins
when the wafer carrying this tube is loaded into a transmission electron microscope. An electron
beam is fired at the tube, knocking carbon atoms out of their honeycomb arrangement within its
walls and causing them to either crowd into other parts of the arrangement, disturbing the shape, or
fall out altogether.At the same time, a current is run through the tube, via the gold contacts, and this
reshuffles the remaining carbon atoms back into a regular, albeit narrower, nanotube structure. The
current also causes some atoms to form new bonds with others.
The researchers were able to gradually shrink a nanotube measuring 16 nanometres in diameter
down to 3 nm using the technique, until finally it broke in two.
The tiny tubes can be created in the first place by depositing a carbon vapour or by blasting graphite
with high-energy lasers or electricity. These chemically created tubes can be made in different
diameters but are especially prone to defects. On the other hand, tubes made from graphite have
fewer defects but cannot be made to order.
The method employed by Zettl and colleagues will need some modification, though, to become a
manufacturing process.
Journal reference: Nanoletters
Gels could power tiny devices
A gel that pulses regularly when doused with certain chemicals has been modelled in detail for the
first time. The scientists behind the modelling say it may one day be used to power miniature robots
or other devices. So-called Belousov-Zhabotinsky (BZ) gels were first discovered in 1996 by
researchers at the National Institute of Materials and Chemical Research in Japan. They consist of
long polymer molecules containing a metal catalyst made from ruthenium – a rare metal similar in
structure to platinum. When the right nitrogen compound solution is added to the gel, a cyclical
reaction starts. The ruthenium catalyst alternately gains and loses electrons, causing the polymer
strands within the gel to oscillate in length.
"A sample of this gel beats autonomously by itself like a little heart in a Petri dish," says Anna
Balazs, at the University of Pittsburgh. "A piece a few millimetres across will go for hours until the
fuel reagent runs out.” Such materials have the potential to power small mechanical devices, Balazs
says, but until now only crude models describing the gel's transformative properties have existed.
Balazs worked with colleagues to create a model that describes these changes in two dimensions for
the first time. Their model describes the gel as a lattice of tiny springs connected at specific points.
The gel's pulsing movements are defined through thermodynamic equations that calculate the
energy liberated by chemical reactions. "From that energy, we can consider the force on every point
on the lattice," explains Balazs, "and that can be used to calculate how the points move."
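A minimal version of such a model can be sketched in a few dozen lines; the following is our toy, not the published model, with arbitrary spring constants and a hand-waved chemical drive:

```python
# Toy 2D gel model: lattice nodes joined by springs whose rest length is
# modulated in time to mimic the oscillating BZ chemistry; overdamped
# dynamics move each node along the net spring force. Parameters invented.
import numpy as np

N = 6
pos = np.array([[(i, j) for j in range(N)] for i in range(N)], dtype=float)

def spring_forces(pos, rest, k=1.0):
    """Net Hookean force on each node from springs to right/down neighbours."""
    F = np.zeros_like(pos)
    for i in range(N):
        for j in range(N):
            for di, dj in ((1, 0), (0, 1)):
                ni, nj = i + di, j + dj
                if ni < N and nj < N:
                    d = pos[ni, nj] - pos[i, j]
                    length = np.linalg.norm(d)
                    f = k * (length - rest) * d / length
                    F[i, j] += f
                    F[ni, nj] -= f
    return F

dt = 0.05
for step in range(200):
    rest = 1.0 + 0.1 * np.sin(0.1 * step)     # chemistry swells/shrinks the gel
    pos += dt * spring_forces(pos, rest)      # overdamped: velocity ~ force

span = pos[:, :, 0].max() - pos[:, :, 0].min()
print(f"Lattice span after 200 steps: {span:.2f} (started at {N - 1})")
```

The published model couples such mechanics to the actual reaction kinetics and thermodynamics rather than to a prescribed sine wave.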
"This really is a prediction out there for people to test," says Balazs. So far, there is little detailed
research on the shape changes undergone by BZ gels, but preliminary tests suggest that the model
rings true, she adds.
Jonathan Howse, who works on polymer-based artificial muscles at the University of Sheffield, UK,
agrees that the model should help experimentalists, but he believes the usefulness of BZ gels could
be limited by a lack of structural uniformity. "The cross links between the polymer molecules are
distributed randomly," he explains. "This means they are more likely to fail because they can pull in
different directions." By contrast, conventional polymer materials can be made with more regular
molecular arrangements, making them much less vulnerable to stress fatigue.
Journal reference: Science (vol 314, p798)
Neural networking nanotubes
Bridging neurons and electronics with carbon nanotubes
New implantable biomedical devices that can act as artificial nerve cells, control severe pain, or
allow otherwise paralyzed muscles to be moved might one day be possible thanks to developments
in materials science. Writing in Advanced Materials, Nicholas Kotov of the University of Michigan,
and colleagues describe how they have used carbon nanotubes to connect an integrated circuit to
nerve cells. The new technology offers the possibility of building an interface between biology and
electronics.
Kotov and colleagues at Oklahoma State University and the University of Texas have explored the
properties of single-walled nanotubes (SWNTs) with a view to developing these materials as
biologically compatible components of medical devices, sensors, and prosthetics. The researchers
built up layers of their SWNTs to produce a film that is electrically conducting even at a thickness
of just a few nanometers. They next grew neuron precursor cells on this film. These precursor cells
successfully differentiated into highly branched neurons. A voltage could then be applied, lateral to
the SWNT film layer, and a so-called whole cell patch clamp used to measure any electrical effect
on the nerve cells. When a lateral voltage is applied, a relatively large current is carried along the
surface but only a very small current, in the region of billionths of an amp, is passed across the film
to the nerve cells. The net effect is a kind of reverse amplification of the applied voltage that
stimulates the nerve cells without damaging them.
Kotov and his colleagues report that such devices might find use in pain management, for instance,
where nerve cells involved in the pain response might be controlled by reducing the activity of
those cells. An analogous device might be used conversely to stimulate failed motor neurons, nerve
cells that control muscle contraction. The researchers also suggest that stimulation could be applied
to heart muscle cells to stimulate the heart.
They caution that a great deal of work is yet to be carried out before such devices become available
to the medical profession.
Three-dimensional polymer with unusual magnetism
Up to now it has not been possible to fabricate magnets from organic materials such as plastics. Recently, however, experiments at the Forschungszentrum Dresden-Rossendorf in
collaboration with an international research team have revealed magnetic order in a polymer.
The structure, which consists in particular of hydrogen, fluorine, carbon and copper, has been
realized in an entirely novel, three-dimensional and very stable form. This will be described in an
upcoming issue of the journal "Chemical Communications".
Magnetism is a physical property of matter related to the magnetic spins of electrons. Iron, for
example, is a ferromagnet because these spins are aligned parallel to each other, generating a
uniform magnetic field. Antiferromagnetism, on the other hand, arises when neighboring spins are
oriented antiparallel to each other.
Such antiferromagnetism has been shown to exist for the new polymeric compound studied at the
Forschungszentrum Dresden-Rossendorf (FZD). This polymer is characterized by a novel and
unusual structure in which copper atoms together with pyrazine molecules build layers, which are in turn connected to each other through bridges of hydrogen and fluorine. The three-dimensional
polymer was prepared by chemists working with Jamie Manson at Eastern Washington University.
Metallic copper is not magnetic. Joachim Wosnitza and his colleagues discovered that at a temperature of 1.54 kelvin – that is, 1.54 degrees above absolute zero (-273.15 °C) – the embedded copper atoms order themselves antiferromagnetically. In the compound, every copper ion possesses a
magnetic spin which interacts with neighboring spins through organic units. How this interaction
arises and how it can be influenced is presently under investigation.
Additional polymeric samples from the laboratory of Manson will be studied at the
Forschungszentrum Dresden-Rossendorf with the objective of a better understanding of the newly
discovered magnetism in this class of polymers. Synthesizing organic materials with tailored magnetic properties would be a significant next step. Permanent magnets can be made from iron and other ferromagnetic materials; according to current knowledge, this is not possible with polymers. The great vision of the scientists is to realize ferromagnetic properties in
novel polymeric compounds that eventually would permit the development of innovative magnets.