Wednesday, May 09, 2007

Adobe Mars - (no bars) The evolution of the PDF

Mars is the code name for technology being developed by Adobe that provides an Extensible Markup Language (XML)-based representation of Portable Document Format (PDF) documents. Adobe® PDF is a universal file format for representing documents in a manner independent of the application software, hardware, and operating system used to create them and of the output device on which they are to be displayed or printed. XML is a cross-platform, extensible, and text-based standard for representing data that was created so that richly structured documents could be used over the web.

The Mars Project is an XML-friendly representation of PDF documents. Already an open specification, PDF is the global standard for trusted, high-fidelity electronic documentation. The Mars file format incorporates additional standards such as SVG, PNG, JPG, JPG2000, OpenType, XPath and XML into a ZIP-based document container. The Mars plug-ins for Acrobat 8 and Adobe Reader 8 enable creation and recognition of the Mars file format by Adobe Acrobat 8 and Adobe Reader 8 software. For additional Mars information see the documents below and check back for regular updates.
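Since a Mars file is essentially a ZIP-based container holding XML, SVG and image parts, you can already peek inside one with standard tools. A minimal sketch follows; the file name "example.mars" and whatever parts it contains are assumptions for illustration, not something taken from Adobe's documentation.

```python
# Minimal sketch: list the parts stored in a ZIP-based document container
# such as a Mars file. The file name and its internal layout are assumptions.
import zipfile

def list_container_parts(path):
    """Print the name and size of every part in a ZIP-based container."""
    with zipfile.ZipFile(path) as container:
        for info in container.infolist():
            print(f"{info.filename}  ({info.file_size} bytes)")

if __name__ == "__main__":
    list_container_parts("example.mars")
```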

The Mars Project is early release software and we welcome your feedback. Please use the feedback link below to request features, make comments and report problems. Please understand that this is an early release and is not yet feature complete. The file format contained within the Mars plug-ins is subject to change and using this technology for production may require you to make significant changes to your documents at a later date. Content created with this release may not be compatible with the shipping version.

Check the Adobe Labs webpage to read more. Plug-ins for Adobe Reader and some example files are available for download there.





Powered by ScribeFire.

Friday, April 27, 2007

Reptile Molecules



The one-dimensional motion of a chain of N beads is studied to determine its drift velocity when an external field is applied. The dependences of the drift velocity with the chain length and field strength are addressed. Two cases are considered, chains with all their beads charged and chains having an end bead charged. In the last case, an analytical expression for the drift velocity is proposed for all N. Results are tested with the help of Monte Carlo simulations.
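The abstract gives no implementation details, but the setup is easy to play with. Below is a minimal Monte Carlo sketch under assumptions of my own (a chain of beads on a 1D lattice, bonds limited to one lattice unit, Metropolis acceptance of a field bias acting only on the charged beads); it is not the authors' algorithm, just a toy for comparing the "all beads charged" and "end bead charged" cases.

```python
import math
import random

def drift_velocity(n_beads=10, field=0.1, charged=None, steps=200_000, seed=1):
    """Crude Monte Carlo estimate of the drift velocity of a 1D bead chain.

    charged: indices of the beads that feel the field (default: all of them).
    A randomly chosen bead attempts a +-1 hop; the move is rejected if it
    would stretch a bond beyond one lattice unit, otherwise accepted with
    a Metropolis factor exp(field * charge * displacement).
    """
    rng = random.Random(seed)
    charged = set(range(n_beads)) if charged is None else set(charged)
    x = [0] * n_beads                      # bead positions on the lattice
    com_start = sum(x) / n_beads
    for _ in range(steps):
        i = rng.randrange(n_beads)
        dx = rng.choice((-1, 1))
        new_xi = x[i] + dx
        # connectivity: neighbouring beads must stay within one lattice unit
        if i > 0 and abs(new_xi - x[i - 1]) > 1:
            continue
        if i < n_beads - 1 and abs(new_xi - x[i + 1]) > 1:
            continue
        q = 1.0 if i in charged else 0.0
        if rng.random() < min(1.0, math.exp(field * q * dx)):
            x[i] = new_xi
    return (sum(x) / n_beads - com_start) / steps   # drift per attempted move

if __name__ == "__main__":
    print("all beads charged:    ", drift_velocity())
    print("only end bead charged:", drift_velocity(charged=[0]))
```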



Could this concept be useful as a deployment system?

Here you can find more information (we have access to the journal).





Powered by ScribeFire.

Wednesday, April 25, 2007

Habitable Exoplanet detected

The new planet is not much bigger than the Earth


Astronomers have found the most Earth-like planet outside our Solar System to date, a world which could have water running on its surface.

The planet orbits the faint star Gliese 581, which is 20.5 light-years away in the constellation Libra.

Scientists made the discovery using the ESO 3.6m telescope in Chile.









Powered by ScribeFire.

Friday, April 20, 2007

SHIELDS FOR THE STARSHIP ENTERPRISE: A REALITY?



In the last year space agencies in the United States, Europe, China, Japan and India have announced their intention to resume human exploration of the Solar system, beginning with the Moon and perhaps ultimately moving on to Mars. But travel beyond the immediate vicinity of the Earth carries significant risks for astronauts, not the least of which is the exposure to sometimes high levels of radiation. Now a team of scientists at the Rutherford Appleton Laboratory are set to construct an experimental magnetic shield that would protect explorers in their journeys between the planets. Dr Ruth Bamford will present this idea in her talk on Wednesday 18 April at the Royal Astronomical Society National Astronomy Meeting in Preston.



Cosmic rays and radiation from the Sun itself can cause acute radiation sickness in astronauts and even death. Between 1968 and 1973, the Apollo astronauts going to the moon were only in space for about 10 days at a time and were simply lucky not to have been in space during a major eruption on the sun that would have flooded their spacecraft with deadly radiation. In retrospect, Neil Armstrong's 'one small step for Man' would have looked very different if one had occurred.



On the International Space Station there is a special thick-walled room to which the astronauts have had to retreat during times of increased solar radiation. However on longer missions the astronauts cannot live within shielded rooms, since such shielding would add significantly to the mass of the spacecraft, making them much more expensive and difficult to launch. It is also now known that the 'drip-drip' of even lower levels of radiation can be as dangerous as acute bursts from the sun.



On the surface of the Earth we are protected from radiation by the thick layers of the atmosphere. And the terrestrial magnetic field extends far into space, acting as a natural 'force field' to further protect our planet and deflecting the worst of the energetic particles from the Sun by creating a 'plasma barrier'.



Now scientists at the Rutherford Appleton Laboratory in Oxfordshire plan to mimic nature. They will build a miniature magnetosphere in a laboratory to see if a deflector shield can be used to protect humans living on space craft and in bases on the Moon or Mars.



In order to work, an artificial mini-magnetosphere on a space craft will need to utilise many cutting edge technologies, such as superconductors and the magnetic confinement techniques used in nuclear fusion.



Thus science is following science fiction once again. The writers of Star Trek realised that any space craft containing humans would need protection from the hazardous effects of cosmic radiation. They envisioned a 'deflector shield' spreading out from the Starship Enterprise that the radiation would bounce off. These experiments will help to establish whether this idea could one day become a practical reality.



Powered by ScribeFire.

Wednesday, April 18, 2007

Intelsat to Test Internet Routing In Space for the U.S. Military

Intelsat General Corp., a wholly-owned subsidiary of Intelsat Ltd, today announced that it has been selected for an industry-government collaboration to demonstrate the viability of conducting military communications through an Internet router in space. The Department of Defense project to test Internet routing in space (IRIS) will be managed by Intelsat General, and the payload will convert to commercial use once testing has been completed. The IRIS project is one of seven projects – out of hundreds of applicants – funded and announced in fiscal 2007 as a Joint Capability Technology Demonstration (JCTD) by the Department of Defense.

Intelsat to Test Internet Routing In Space for the U.S. Military | SpaceRef - Your Space Reference

Blogged with Flock

'Smart dust' to explore planets

Tiny "smart" devices that can be borne on the wind like dust particles could be carried in space probes to explore other planets, UK engineers say.

The devices would consist of a computer chip covered by a plastic sheath that can change shape when a voltage is applied, enabling it to be steered.

http://news.bbc.co.uk/2/hi/science/nature/6566317.stm

That is totally ACT!



OPTIMUM FRYING FOR HEALTH AND QUALITY

The principal aim of this International meeting is to cover emerging developments on frying: e.g. new trans-free frying oils, innovations in enhancing the nutritional profile of fried snacks, new designs of industrial fryers, disposal of used frying oil and HACCP issues. The programme – spread over two days – includes fourteen cutting-edge lecture presentations given by leading experts from industry and academia. The programme also includes equipment display and poster viewing, and provides ample opportunity for delegates to network. This meeting will be invaluable to practicing food scientists, researchers, nutritionists, equipment manufacturers, new product developers, and quality control and technical managers.



http://www.soci.org/SCI/events/details.jsp?eventID=EV986

http://www.soci.org/SCI/events/writeups/2007/pdf/ev986.pdf







...my first blog entry ever. Living in the Netherlands, we should seriously consider attending this highly scientific meeting.

(TSe)





Powered by ScribeFire.

Friday, April 13, 2007

New Experiment Probes Weird Zone Between Quantum and Classical





Scientists at the Max Planck Institute for Quantum Optics in Germany have created a tiny silicon cantilever arm on a chip that, after being cooled down to 0.0001 degrees above absolute zero, will sway back and forth in multiple modes at once, becoming the world's first macroscopic system in a purely quantum mechanical state. Image: Max Planck Institute, Munich/Jorg Kotthaus, University of Munich




The strange boundary between the macroscopic world and the weird realm of quantum physics is about to be probed in a unique experiment.



Might this be of any interest, Luzi, Jose?



Go to the original article.....





Powered by ScribeFire.

Quantum secrets of photosynthesis revealed

A study led by researchers with the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) and the University of California (UC) at Berkeley reports that the answer lies in quantum mechanical effects. Results of the study are presented in the April 12, 2007 issue of the journal Nature.



"We have obtained the first direct evidence that remarkably long-lived wavelike electronic quantum coherence plays an important part in energy transfer processes during photosynthesis," said Graham Fleming, the principal investigator for the study. “This wavelike characteristic can explain the extreme efficiency of the energy transfer because it enables the system to simultaneously sample all the potential energy pathways and choose the most efficient one.”





[Quantum secrets of photosynthesis revealed]



Electronic spectroscopy measurements made on a femtosecond (millionths of a billionth of a second) time-scale showed these oscillations meeting and interfering constructively, forming wavelike motions of energy (superposition states) that can explore all potential energy pathways simultaneously and reversibly, meaning they can retreat from wrong pathways with no penalty. This finding contradicts the classical description of the photosynthetic energy transfer process as one in which excitation energy hops from light-capturing pigment molecules to reaction center molecules step-by-step down the molecular energy ladder.



The photosynthetic technique for transferring energy from one molecular system to another should make any short-list of Mother Nature’s spectacular accomplishments. If we can learn enough to emulate this process, we might be able to create artificial versions of photosynthesis that would help us effectively tap into the sun as a clean, efficient, sustainable and carbon-neutral source of energy.



Original papers: 1 and 2



Powered by ScribeFire.

3D Solar Cells




The Georgia Tech Research Institute (GTRI) News archive announces the successful development of a new solar cell concept.

The research project Nano-Manhattan has been conducted by Jud Ready.
The idea behind the device is to create a solar cell able to harvest every last photon that is available. This aim is achieved by following a 3D design.

"The GTRI photovoltaic cells trap light between their tower structures, which are about 100 microns tall, 40 microns by 40 microns square, 10 microns apart -- and built from arrays containing millions of vertically-aligned carbon nanotubes. Conventional flat solar cells reflect a significant portion of the light that strikes them, reducing the amount of energy they absorb. "

"Because the tower structures can trap and absorb light received from many different angles, the new cells remain efficient even when the sun is not directly overhead. That could allow them to be used on spacecraft without the mechanical aiming systems that maintain a constant orientation to the sun, reducing weight and complexity – and improving reliability. "

The design is described in the March issue of the Journal of Minerals, Metals and Materials Society (JOM), but sadly we don't have access.



Wednesday, April 11, 2007

Young Scientists Design Open-Source Program at NASA

Jessy Cowan-Sharp and Robert Schingler set up CosmosCode to help NASA develop open-source software for space exploration.

NASA scientists plan to announce a new open-source project this month called CosmosCode -- it's aimed at recruiting volunteers to write code for live space missions, Wired News has learned.

The program was launched quietly last year under NASA's CoLab entrepreneur outreach program, created by Robert Schingler, 28, and Jessy Cowan-Sharp, 25, of NASA's Ames Research Center in Mountain View, California. Members of the CosmosCode group have been meeting in Second Life and will open the program to the public in the coming weeks, organizers said.

"CosmosCode is ... allowing NASA scientists to begin a software project in the public domain, leveraging the true value of open-source software by creating an active community of volunteers," said Cowan-Sharp, a NASA contractor.

CosmosCode is indicative of a larger shift at NASA toward openness and transparency -- things for which complex and bureaucratic government labs are not known. The software project is part of CoLab, an effort to invite the public to help NASA scientists with various engineering problems. The space agency is also digging into its files from previous missions and releasing code that until now remained behind closed doors. Together, these projects are creating a sort of SourceForge for space.

Powered by ScribeFire.

Wednesday, April 04, 2007

FBI investigates virtual casinos in Second Life

FBI investigators have visited Second Life's internet casinos at the invitation of the virtual world's creator Linden Lab, but the US government has not yet decided on the legality of virtual gambling.

Tuesday, April 03, 2007

Superconductors inspire quantum test for dark energy

Dark energy is so befuddling that it's causing some physicists to do their science backwards.

"Usually you propose your theory and then work out an experiment to test it," says Christian Beck of Queen Mary, University of London. A few years ago, however, he and his colleague Michael Mackey of McGill University in Montreal, Canada, proposed a table-top experiment to detect the elusive form of energy, without quite knowing why it might work. Now the pair have come up with the theory behind the experiment. "It is certainly an upside-down way of doing things," Beck admits.

Dark energy is the mysterious force that many physicists think is causing the expansion of the universe to accelerate. In 2004, Beck and Mackey claimed that the quantum fluctuations of empty space could be the source of dark energy and suggested a test for this idea. This involved measuring the varying current induced by quantum fluctuations in a device called a Josephson junction – a very thin insulator sandwiched between two superconducting layers.

Beck reasoned that if quantum fluctuations and dark energy are related, the current in the Josephson junction would die off beyond a certain frequency (see A table-top test for dark energy?). But they hadn't worked out what exactly caused the cut-off.

Now the duo say they know, and last week Beck presented the theory at a conference on unsolved problems for the standard model of cosmology held at Imperial College London.

Frequency cut-off

Quantum mechanics says that the vacuum of space is seething with virtual photons that are popping in and out of existence. Beck and Mackey suggest that when these virtual photons have a frequency below a certain threshold, they are able to interact gravitationally, contributing to dark energy.

Their theory is inspired by superconducting materials. "Below a critical temperature, electrons in the material act in a fundamentally different way, and it starts superconducting," says Beck. "So why shouldn't virtual photons also change character below a certain frequency?"

If so, virtual photons should behave differently below a frequency of around 2 terahertz, causing any currents in the Josephson junction to taper off above this frequency. Physicist Paul Warburton at University College London is building such a dark energy detector and could have results next year.
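Where the roughly 2 THz figure comes from can be sketched with a back-of-envelope calculation (my paraphrase of the Beck-Mackey argument, not a formula quoted in the article): equate the zero-point energy density of the electromagnetic field, integrated up to a cutoff frequency, with the measured dark energy density.

```latex
% Zero-point energy density of the EM field up to a cutoff frequency \nu_c:
\rho_{\mathrm{vac}}
  = \int_0^{\nu_c} \frac{8\pi\nu^2}{c^3}\,\frac{h\nu}{2}\,\mathrm{d}\nu
  = \frac{\pi h}{c^3}\,\nu_c^{4} .
% Setting this equal to the observed dark energy density
% \rho_\Lambda \approx 7 \times 10^{-10}\ \mathrm{J\,m^{-3}} gives
\nu_c = \left(\frac{\rho_\Lambda c^3}{\pi h}\right)^{1/4} \approx 1.7\ \mathrm{THz},
```

which is just the frequency range a Josephson-junction noise measurement can reach.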

Some evidence that dark energy works like this may already have been found. In 2006, Martin Tajmar at the Austrian Research Centers facility in Seibersdorf and his colleagues noticed bizarre behaviour in a spinning niobium ring. At room temperature, niobium does not superconduct, and accelerometers around the ring measured that it was spinning at a constant rate. But once the temperature fell, the niobium started to superconduct, and the accelerometers suddenly picked up a signal (Gravity's secret).

Odd acceleration

"We measured an acceleration even though the ring's motion hadn't changed at all," says Clovis de Matos, who works at the European Space Agency in Paris and established the theory behind the experiment. He thinks the results could be explained if gravity got a boost inside the superconductor. "Beck and Mackey's gravitationally activated photon would have that effect," he says.

The controversial experiment seemed to fall foul of Einstein's equivalence principle, which states that all objects should accelerate under gravity at the same rate. It implied that "if you have two elevators, one made of normal matter and one made of superconducting matter, and accelerate them by the same amount, objects inside will feel different accelerations", de Matos says. Astronomers may have seen a similar violation of the principle (see "Two-speed gravity", below).

The odd acceleration detected in the niobium ring also suggests that energy isn't conserved in the superconductor – another major violation of known physics. Dark energy could solve that problem, however. "We did the sums and found out that energy wasn't conserved, but perhaps that was just because we were missing dark energy," de Matos says.

Paul Frampton, a cosmologist at the University of North Carolina at Chapel Hill, thinks Beck and Mackey's reasoning is flawed. "I don't think for a second they'll measure dark energy, but they should certainly try."

original article from NewScientist

This article seems to be driven by the hope that a puzzle of many wrong pieces still gives a good picture... I know Paul Frampton, a very reasonable guy, I hope he turns out to be right. Lesson learned: A correct paper is not a prerequisite to make it into the news.

powered by performancing firefox

Thursday, March 22, 2007

NIAC may die soon

Nothing official yet, but New Scientist reports that NASA might shut down the ACT "cousin" in the US ...

Saturday, March 10, 2007

Will Biology Solve the Universe?

weird interview just read in wired ...

For years, scientists have tried to develop a universal theory of everything. Stephen Hawking predicts that such a theory will be discovered in the next 20 years. A new theory asserts that biology, not physics, will be the key to unlocking the deepest mysteries of the universe, such as quantum mechanics.

"The answer to the universe is biology -- it's as simple as that," says Dr. Robert Lanza, vice president of research and scientific development at Advanced Cell Technology. He details his theory in The American Scholar's spring issue, published on Thursday. Lanza says scientists will establish a unified theory only if they radically rethink their understanding of space and time using a "biocentric" approach. His article is essentially a biological and philosophical response to Hawking's A Brief History of Time, in which he questions how we interpret the big bang, the existence of space and time, as well as many other theories -- assertions that might ruffle the feathers of some physical scientists.

(LS)

Thursday, March 01, 2007

Robot swarms 'evolve' effective communication



Robots that artificially evolve ways to communicate with one another have been demonstrated by Swiss researchers. The experiments suggest that simulated evolution could be a useful tool for those designing swarms of robots.

Roboticists Dario Floreano, Sara Mitri, and Stéphane Magnenat at the Swiss Federal Institute of Technology in Lausanne collaborated with biologist Laurent Keller from the University of Lausanne.

They first evolved colonies of robots in software then tested different strategies on real bots, called s-bots. Both simulated and real robots were set loose in an arena containing two types of objects – one classified as "food" and another designated "poison" – both lit up red.

Each bot has a built-in attraction to food and an aversion to poison. The bots also have a randomly generated set of parameters, dubbed "genomes", that define the way they move, process sensory information, and flash their own blue lights.

"They start with completely random behaviour," Keller explains. "All they can do is discriminate food from poison." The robots can see both food or poison from a distance of several metres but can only tell them apart when almost touching.

Saturday, February 10, 2007

very nice biomimicry research - we should try to see if we can't apply this also to landing spacecraft!!

A miniature robotic helicopter has revealed a simple yet effective visual trick that lets insects fly so adeptly without sophisticated avionics.

Besides explaining how insects zoom around and land without crashing into the ground, the technique could potentially be used to help control aircraft.

As insects fly forwards the ground beneath them sweeps backwards through their field of view. This "optical flow" is thought to provide crucial cues about speed and height. For example, the higher an insect's altitude, the slower the optical flow; the faster it flies, the faster the optical flow.

Previous experiments involving bees suggest that optical flow is crucial to landing. Maintaining a constant optical flow while descending should provide a constant height-to-groundspeed ratio, which makes a bee slow down as it approaches the ground. Distorting this optical flow can cause it to crash-land instead.

General flight

Now Nicolas Franceschini at the University of the Mediterranean in Marseilles, France, and colleagues have shown the same technique may explain more general flying behaviours.

They fitted a miniature helicopter with a simple software feedback loop to ensure that optical flow remains constant as it flies along. This allowed the tethered micro-copter to take off gracefully, maintain altitude over varying terrain and land, all without any means of directly measuring its speed or height. A video produced by the researchers shows the micro-copter in action (43.7 MB .avi format, requires DivX).
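In control terms the trick is a single feedback loop that commands climb rate so that the measured optical flow (groundspeed divided by height) stays at a setpoint. The toy simulation below illustrates the idea; the gain, setpoint, speed profile and dynamics are invented for illustration and are not the researchers' controller.

```python
# Toy "optical flow regulator": hold optical flow = speed / height constant
# by commanding a climb rate. All numbers are invented for illustration.
DT = 0.01            # s, integration step
OF_SETPOINT = 1.0    # rad/s, desired optical flow
GAIN = 2.0           # proportional gain on the climb-rate command

def simulate(duration=60.0, height=0.2, speed=0.0):
    t = 0.0
    while t < duration and height > 0.01:             # "landed" below 1 cm
        # forward-speed profile (accelerate, cruise, then brake to land);
        # this part is not controlled by the regulator
        if t < 10.0:
            speed = min(speed + 2.0 * DT, 2.0)
        elif t > 40.0:
            speed = max(speed - 1.0 * DT, 0.0)
        optical_flow = speed / height                  # rad/s
        climb_rate = GAIN * (optical_flow - OF_SETPOINT) * height
        height += climb_rate * DT
        t += DT
    return t, height, speed

if __name__ == "__main__":
    t, h, v = simulate()
    print(f"touched down at t = {t:.1f} s with v = {v:.2f} m/s")
```

Note how the same loop that holds altitude during cruise also produces a progressively gentler descent once the forward speed drops, which is the landing behaviour described above.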

The fact that insects are such effective fliers could all be thanks to a similar feedback mechanism hardwired into their brains, Franceschini says.

Bee brains

Maintaining a constant optical flow should be relatively easy for an insect, says Rob Harris, a specialist in insect vision at the Centre for Computational Neuroscience and Robotics at the University of Sussex in Brighton, UK, who was not involved with the project.

Maintaining a constant optical flow precludes the need to calculate height and groundspeed manually. "You have to assume they are not doing complicated trigonometry in their little brains," he adds.

"It explains about 70 years of experiments," Franceschini says. For example, it explains why bees sometimes drown when flying over still water. Without any features on the surface of water, a bee detects no optical flow and instinctively descends, eventually landing in the water.

Franceschini is currently talking to helicopter manufacturers about developing optical flow regulators for their aircraft. Such feedback mechanisms would be lightweight and trivial to develop and could help prevent crashes, he claims.

Journal reference: Current Biology (vol 17, issue 4, manuscript 5340)

Blogged with Flock

New prize by Branson for a method to remove CO2 from the atmosphere ... start using your brains!

A prize of $25 million for anyone who can come up with a system for removing greenhouse gases from the atmosphere was launched on Friday. It is the biggest prize in history, claims its sponsor, Richard Branson.

The head of Virgin Group said at the launch in London, UK, that the prize was not for removing emissions from power plants before they reach the atmosphere and storing them deep underground – an existing technology known as carbon capture and sequestration.

Instead, the brief is to devise a system to remove a "significant amount" of greenhouse gases – equivalent to 1 billion tonnes of carbon dioxide or more – every year from the atmosphere for at least a decade. It was inspired by the £20,000 prize for developing a way of measuring longitude won by 18th century clockmaker John Harrison, and recounted in the book Longitude. The $10 million X-Prize for private human spaceflight, won in 2004, was also an inspiration.

The initial closing date for Branson's Earth Challenge is 8 February 2010. If the judges deem that no design submitted by that stage is worthy of the prize, it will re-open for two more year-long phases.

$25 million prize for greenhouse gas removal - earth - 09 February 2007 - New Scientist Environment

let's win this prize :-)

Blogged with Flock

Friday, February 09, 2007

The brain scan that can read people's intentions (The Guardian)

CT scan of a human head
A team of world-leading neuroscientists has developed a powerful technique that allows them to look deep inside a person's brain and read their intentions before they act.

The brain scan that can read people's intentions | Science | Guardian Unlimited

Blogged with Flock

Wednesday, January 31, 2007

US climate scientists pressured on climate change



15:01 31 January 2007 | NewScientist.com news service | New Scientist Environment and Reuters



US scientists were pressured to tailor their reports on global warming to fit the Bush administration's climate change scepticism, a congressional committee heard on Tuesday 30 January. In some cases, this occurred at the request of a former oil-industry lobbyist.

"High-quality science [is] struggling to get out," Francesca Grifo, of the watchdog group Union of Concerned Scientists, told members of the House Oversight and Government Reform Committee. A UCS survey found that 150 climate scientists personally experienced political interference in the past five years in a total of at least 435 incidents.

"Nearly half of all respondents perceived or personally experienced pressure to eliminate the words 'climate change', 'global warming' or other similar terms from a variety of communications," Grifo said.

Rick Piltz, a former US government scientist, told the committee that former White House official Phil Cooney took an active role in casting doubt on the consequences of global climate change. Piltz said he resigned in 2005 as a result of pressure to soft-pedal findings on global warming.

Cooney, who was a lobbyist for the American Petroleum Institute before becoming chief of staff at the White House Council on Environmental Quality, also resigned in 2005. He went on to work for oil giant ExxonMobil, which was recently accused of spending $16 million on supporting climate sceptics.



"Speculative musing"

Documents on global climate change required Cooney's review and approval, Piltz said, adding that "If you know what you are writing has to go through a White House clearance before it is to be published, […] an anticipatory kind of self-censorship sets in."

He added: "[Cooney's] edits of programme reports, which had been drafted and approved by career science programme managers, had the cumulative effect of adding an enhanced sense of scientific uncertainty about global warming and minimising its likely consequences."

According to The Guardian newspaper, Piltz described how Cooney had personally edited out a key section of an Environmental Protection Agency report to Congress on the dangers of climate change, calling it "speculative musing".



Seeking answers


Henry Waxman, a California Democrat who chairs the oversight committee, complained that the White House has balked at supplying documents requested over six months to investigate these allegations.

"The committee isn't trying to obtain state secrets or documents that could affect our immediate national security," Waxman said. "We are simply seeking answers to whether the White House's political staff is inappropriately censoring impartial government scientists."

Kristen Hellmer, of the Council on Environmental Quality, part of the Executive Office of the US President, said the CEQ had been cooperating with Congress. When asked about allegations of political interference in scientific documents, she said: "We do have in place a very transparent system in science reporting."



Spate of accusations


Reporting on the hearing, the New York Times says that even Republicans had little good to say of the Bush administration's handling of climate change science. Almost all the Republicans on the panel began by stating that global warming was happening and that greenhouse gases from human activities were largely to blame.

The Bush administration has suffered a spate of accusations of muzzling climate scientists in recent years. In January 2006, James Hansen, director of the US space agency's Goddard Institute for Space Studies, said that officials at NASA headquarters had ordered their staff to review his lectures, papers, postings on the Goddard website and requests for media interviews (see Top climatologist accuses US of trying to gag him).

In February 2006, the topic was brought up in the House Committee on Science, and New Scientist reported that scientists at another government agency, the National Oceanic and Atmospheric Administration, were also upset about the situation.

The UCS issued its first accusations in February 2004, followed closely by more finger pointing in July of that year.



Mandatory limits


President George W Bush's position on global warming has evolved over his presidency, from open scepticism about the reality of the phenomenon to acknowledgment at a global summit in 2006 that climate change is occurring and that human activities speed it up.

In his 2007 State of the Union address, Bush called climate change "a serious challenge" that should be addressed by technology and greater use of alternative sources of energy. But he stopped short of calling for mandatory limits on US emissions of carbon dioxide, a greenhouse gas blamed in part for global warming.

The congressional discussions come in the run-up to the release of a major United Nations report on climate change, scheduled for Friday in Paris, France.

Leaked drafts of the report suggest it will state that "there is a 90% chance humans are responsible for climate change", mostly due to the burning of fossil fuels. That contrasts with the last version of the Intergovernmental Panel on Climate Change's report, issued in 2001, which concluded there was a 66% chance that humans were responsible for rising temperatures.







powered by performancing firefox

Tuesday, January 30, 2007

The quantum world is about to get bigger

The quantum world is about to get bigger thanks to a technique that will allow objects big enough to see with the naked eye to exist in two places at once.








Quantum properties are most prominent in single particles. In bigger objects thermal vibrations destroy the quantum effects. So in theory, chilling a large object should allow its quantum properties to shine through.








This week, three teams of physicists have perfected a way of doing this (Nature, vol 444, p 67). Their technique is to bombard a mirror of roughly 10^14 atoms with photons in a way that damps out thermal vibrations, cooling it to 135 millikelvin.








However, the researchers will need sophisticated techniques to see the quantum behaviour. "You can see the mirror with the naked eye but you won't be able to resolve the quantum effects," says Markus Aspelmeyer, at the University of Vienna in Austria.

Of course, you won't be. If you were, you would be a measuring device, and according to the laws of quantum mechanics the quantum effect should disappear (decoherence and, in the end, collapse of the wave function). So in the end it is not a question of size!





powered by performancing firefox

Quantum computers? Don't hold your breath

Quantum computing will never work. At least, that's the view of one physicist who thinks that unavoidable noise will always stand in its way.

In theory, a quantum computer could be far more powerful than any existing device. Making it work, however, means protecting the quantum particles used in its calculations from the disrupting noise of the outside world. To date, this has been achieved only for a few particles for fractions of a second. A useful device would have to be noiseless for far longer, and use hundreds or thousands of particles.

Michael Dyakonov of the University of Montpellier in France believes this feat is akin to achieving perpetual motion. It has been assumed till now that errors caused by noise can be fixed. In reality, Dyakonov argues, such errors would grow far too rapidly with the number of particles, making correction impractical. What's more, he says, correction schemes make unproven assumptions: for example, that the various errors introduced by imperfect devices will be independent of one another, and so largely cancel each other out. "I have serious doubts about the possibility of large-scale quantum computations," he says (www.arxiv.org/quant-ph/0610117).

Others think Dyakonov has got it wrong. "It is true that the quantum computing community should be cautiously optimistic, rather than confident," says Andrew Steane of the University of Oxford. "But his arguments are largely misleading."





powered by performancing firefox

Friday, January 26, 2007

use of bacteria as small "motors" ...

One of the main challenges in developing microscale robots lies in miniaturising their power and propulsion. Now, researchers in the US may have found a solution to this problem, by exploiting the natural movement of bacteria to propel micro-objects through water.

Many bacteria propel themselves along in a fluid by rotating their corkscrew-like tails, called flagella, at relatively high speeds. These flagella are only around 20 nanometres in diameter and are about 10,000 nm long.

Motors made from bacterial flagella have been used as novel "nano-actuators" before (see Bacteria harnessed as miniature pumps), but Metin Sitti and Bahareh Behkam of Carnegie Mellon University in Pennsylvania, US, have taken another approach. They use the entire microorganism as the motor and control its on/off motion with chemicals.

Bacteria harnessed as micro propeller motors - tech - 26 January 2007 - New Scientist Tech

Blogged with Flock

Nature article on new very high density molecular electronic memory circuit

The primary metric for gauging progress in the various semiconductor integrated circuit technologies is the spacing, or pitch, between the most closely spaced wires within a dynamic random access memory (DRAM) circuit [1]. Modern DRAM circuits have 140 nm pitch wires and a memory cell size of 0.0408 µm². Improving integrated circuit technology will require that these dimensions decrease over time. However, at present a large fraction of the patterning and materials requirements that we expect to need for the construction of new integrated circuit technologies in 2013 have 'no known solution' [1]. Promising ingredients for advances in integrated circuit technology are nanowires [2], molecular electronics [3] and defect-tolerant architectures [4], as demonstrated by reports of single devices [5-7] and small circuits [8, 9]. Methods of extending these approaches to large-scale, high-density circuitry are largely undeveloped. Here we describe a 160,000-bit molecular electronic memory circuit, fabricated at a density of 10^11 bits cm⁻² (pitch 33 nm; memory cell size 0.0011 µm²), that is, roughly analogous to the dimensions of a DRAM circuit [1] projected to be available by 2020. A monolayer of bistable, [2]rotaxane molecules [10] served as the data storage elements. Although the circuit has large numbers of defects, those defects could be readily identified through electronic testing and isolated using software coding. The working bits were then configured to form a fully functional random access memory circuit for storing and retrieving information.
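The quoted figures are mutually consistent; for a crossbar memory the cell area is roughly the pitch squared (a quick check of mine, not a number from the paper):

```latex
(33\ \mathrm{nm})^2 = 1089\ \mathrm{nm}^2 \approx 0.0011\ \mu\mathrm{m}^2,
\qquad
\frac{1\ \mathrm{bit}}{0.0011\ \mu\mathrm{m}^2}
  \approx 9 \times 10^{10} \approx 10^{11}\ \mathrm{bits\ cm^{-2}} .
```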

Blogged with Flock

Strange but True: Turning a Wobbly Table Will Make It Steady

For every table—turn, turn, turn... there is a proof

It's a problem as old as civilization: the wobbly table. You may have thought your only recourse against this scourge is a hastily folded cocktail napkin stuffed under the offending leg. If so, take heart, because mathematicians have recently proved a more elegant solution. Just rotate the table.

The intuitive argument, which dates back at least to a 1973 Scientific American column by Martin Gardner, is straightforward. Consider a square table with four equally long legs. Any three of the legs must be able to rest on the floor simultaneously, as a tripod does. Assume the floor undulates smoothly and the fourth leg hovers above it. Now imagine turning the table about its center while keeping the first three legs grounded, or balanced. Once the table has rotated by 90 degrees, the wobbly leg must lie below the floor. (If you do not see why, imagine pushing down equally on the wobbly leg and a neighboring leg until the neighbor sinks below the floor and the wobbly leg touches down.) And so, at some point along the wobbly leg's arc, it has to hit a spot on which it can rest. As simple as this argument may sound, however, proof was a long time coming.

The first serious mathematical inroad against table wobbling seems to have occurred in the late 1960s with Roger Fenn, a PhD student at the University of London. One day Fenn and his graduate adviser ended up at a coffee shop faced with—you guessed it—an unsteady table. "The table wouldn't stop wobbling and we fiddled it around until we got it to stop," recalls Fenn, who is now at the University of Sussex. At his adviser's suggestion, Fenn wrote out a proof that for any smoothly curving floor that bulges upward like a hill, there is at least one way to position the table so that it is balanced and horizontal. But he did not reveal how exactly to find that sweet spot, and he quickly tabled the subject. "I didn't think people were going to take this very seriously," he admits. "You say to somebody you've met, 'Well I'm trying to put a table on the floor so it doesn't wobble'; they'll say, 'Oh yeah?'"

The season for proving the table turning hypothesis would not arrive for another 35 years. By then, the idea had become such a part of mathematical lore that two years ago mathematician Burkard Polster of Monash University in Australia included it in an article on neat math tricks for teachers. He promptly received a letter pointing out that the idea would not work if a floor was too uneven or possessed sheer cliffs, such as between tiles.

Polster rose to the challenge. "It's never been really pinpointed exactly what the ground should be like," he says. So he and some of his colleagues ran through the appropriate trigonometry and satisfied themselves that if a floor has no spots that slope by more than 35 degrees, then turning will indeed balance a square or rectangular table. They detail the proof in a paper accepted for publication by the Mathematical Intelligencer. (In one of those odd cases of co-discovery, a retired CERN physicist named André Martin published a similar result a few months before the Australians did.)

Polster's group even spells out a procedure for balancing the table [see video above]. First lift up the leg of the table diagonal from the wobbly leg. Make sure both legs are roughly equal distances off the ground and then begin rotating. "In practice," the researchers write, "it does not seem to matter how exactly you turn your table on the spot, as long as you turn roughly around the center of the table."

So, next time you feel a table start to tilt, put that napkin down and don't be shy about turning the tables on a wobbly dining experience. Rest assured, mathematics is on your side.

Scientific American: Strange but True: Turning a Wobbly Table Will Make It Steady

Tuesday, January 23, 2007

Photon Compression??

Researchers condense entire image into single photon - Engadget

A team of researchers has managed to find a way to store a large amount of data in a single photon of light. Although the first stored item -- an image of the characters "UR" -- implies that the inventor was a 13 year old girl dealing with an extremely low text messaging limit, the image was in fact intended to signify the institution which developed the technology, the University of Rochester (either that or it's the shortest example of the "UR IN MY ... " meme that we've seen in the while.) Apparently the system works because "instead of storing ones and zeros" (a la binary code), the team has figured out how to store an entire image in a single photon, which sounds sort of impossible to us. Funny, because that's exactly what John Howell, the leader of the team said about the system. One of the key components of the process is the particle-wave duality nature of light: by firing a single photon of light through a stencil -- we presume one heckuva small one -- the wave carries a shadow of the image along with it at a very high signal-to-noise ratio, even with low light levels. The light is then slowed down in a cell of cesium gas, where it is compressed to 1 percent of its original length. This is where the storage aspect of the device comes in, as the researchers hope to be able to delay a single photon almost permanently, resulting in a device that can store "incredible amounts of information in just a few photons": an enticing thought for a world currently satisfied with a maximum of 1TB hard drives based on physical platters. A pity then that the world is completely distracted by the potential for "Photon on photons" jokes that this throws into the ring.
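The "compressed to 1 percent of its original length" figure is just the slow-light effect: a pulse shortens spatially in proportion to its group velocity (a rough reading of the quoted number on my part, not a figure from the Rochester paper):

```latex
\frac{L_{\mathrm{medium}}}{L_{\mathrm{vacuum}}} = \frac{v_g}{c}
\;\Rightarrow\; v_g \approx 0.01\,c \approx 3 \times 10^{6}\ \mathrm{m\,s^{-1}} .
```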

Blogged with Flock

Monday, January 22, 2007

Untitled

Can't Touch This - Jeff Han - Touch Screen

Jefferson Han, a pale, bespectacled engineer dressed in Manhattan black, faced the thousand or so attendees on the first day of TED 2006, the annual technology, entertainment, and design conference in Monterey, California. The 30-year-old was little more than a curiosity at the confab, where, as its ad copy goes, "the world's leading thinkers and doers gather to find inspiration." And on that day, the thinkers and doers included Google (NASDAQ:GOOG) gazillionaires Sergey Brin and Larry Page, e-tail amazon Jeff Bezos, and Bill Joy, who helped code Sun Microsystems (NASDAQ:SUNW) from scratch. Titans of technology. It was enough to make anyone feel a bit small.


Blogged with Flock

Wednesday, January 17, 2007

String Theory's Extra Dimensions Must Be Less Than Half the Width of a Human Hair

If extra dimensions of space exist, they must be smaller than about half the width of a human hair, according to new measurements of the strength of gravity at short distances. Researchers found that the same law governing the gravitational pull between planets continues to work when objects are separated by as little as 56 micrometers. The finding rules out extra dimensions of 44 micrometers or larger, they report in this week's Physical Review Letters.

Discovering extra dimensions with the relatively huge size of a few micrometers would offer spectacular confirmation for string theory, the still unproved body of equations that may unify gravity with the normally incompatible realm of quantum physics. "Even though we haven't seen anything, these results put boundaries on what people can legitimately propose," says experimental physicist and study author Eric Adelberger of the University of Washington. "Testing the inverse square law [meaning Newton's law of gravity] is the bombproof way to look for extra dimensions."

"I'm a big admirer of this class of experiments; I think they're awesome," says theoretical particle physicist Raman Sundrum of Johns Hopkins University. In principle, such tests could effectively rule out theories of micrometer-size extra dimensions, he says. To study such questions researchers would normally expect to use giant particle accelerators, such as the Large Hadron Collider (LHC), set to switch on in Geneva later this year. Sundrum says the LHC may still get its shot at large extra dimensions, because the new result leaves the idea some wiggle room. "It's not killing that scenario," he says.

With no pressing reason to check, researchers, until a few years ago, had never measured the strength of gravity when objects were separated by much less than a millimeter (roughly the width of a period on this page). But beginning in the late 1990s, some physicists proposed that string theory might cause gravity to grow stronger at such distances if the universe came with relatively big extra dimensions of micrometers in width. (To make its arithmetic come out right, string theory requires that space have extra dimensions beyond the three we can readily experience, but researchers had assumed that these dimensions are extraordinarily tiny.)

Adelberger and his colleagues on the so-called Eot-Wash experiment have led the way in checking gravity's short-distance strength. They employed a small metal pendulum suspended above a stacked pair of fused metal disks, which exerted a gravitational tug on a metal ring on the bottom of the pendulum. The ring and the upper disk contained a series of matching holes. If the holes lined up, gravity was pulling straight down, but if the holes were offset, the disk's gravity was twisting the pendulum. As a result, the experiment was able to measure the strength of gravity at the distance between the ring and the upper disk.

The key to the experiment is the lower disk, which contains holes of different sizes that are designed to cancel out the twisting caused by the upper disk when the ring and disks are in certain orientations. If that canceling does not occur, it means that the force between the ring and the upper disk has changed, either because the strength of gravity has changed or because some new force has intervened that has no effect at the slightly larger distance between the ring and the bottom disk.

In a prior experiment, the Eot-Wash team ruled out extra dimensions larger than 100 micrometers. This time researchers attained greater sensitivity by using more holes and covering the apparatus in gold to screen out electromagnetic forces between the ring and disks.

Adelberger says they might be able to get down to a few micrometers, but it would be very tough. "As the dimensions get smaller and smaller, the force [they cause] gets smaller much faster," he says. The payoff could be worth it, though. Sundrum says that if extra dimensions failed to turn up at that distance, it would likely prune off that branch of string theory.
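For background (this parametrisation is standard in torsion-balance searches for new short-range forces, though it is not spelled out in the article): deviations from Newtonian gravity are usually written as a Yukawa correction, and a compact extra dimension of size R would show up as such a correction on length scales comparable to R.

```latex
V(r) = -\,\frac{G\,m_1 m_2}{r}\left[\,1 + \alpha\, e^{-r/\lambda}\,\right]
```

The new Eot-Wash data leave the plain inverse-square law intact down to separations of about 56 micrometres, which is what rules out gravitational-strength (alpha of order one) extra dimensions larger than about 44 micrometres.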

Scientific American: String Theory's Extra Dimensions Must Be Less Than Half the Width of a Human Hair

Thursday, January 11, 2007

Optimized adhesives

ESA launches new project to protect biodiversity


"The United Nations Convention on Biodiversity (UNCBD) agreed on a set of headline indicators to assess the progress made towards this target. DIVERSITY will make a contribution to the required monitoring efforts that will help us to determine whether we are making progress and which management and policy measures are most effective and thereby support decision-making," the UNCBD Secretariat Robert Höft said.
[...]
The DIVERSITY project, developed under ESA's Data User Element (DUE) programme, is being carried out in collaboration with the UNCBD Secretariat and UNESCO, which, in addition to being a user, is also the main coordinator between the users and contractors selected by ESA.

ESA Portal - ESA launches new project to protect biodiversity

Blogged with Flock

Wednesday, January 03, 2007

Robo-rights:a study commissioned by the UK government

Humans are increasingly reliant on computers, robots and machines. Currently, robots and machines are inanimate objects without rights or duties. If artificial intelligence is achieved and widely deployed (or if robots can reproduce and improve themselves), calls may be made for human rights to be extended to robots. If so, this may be balanced with citizen responsibilities (e.g. voting, paying tax). A push for robots' rights may clash with owners' property rights. More strain may be placed on the environment (e.g. energy, waste, resource & space usage).

View Issue