
How Many Moons Does Mars Have?

Even though speculation about the existence of life on Mars has brought the planet back into the limelight, few people can claim to know everything about it. Among the lesser-known facts about Mars, one of the most prominent concerns its natural satellites: the exact number of moons it has. It wouldn't be surprising to see people stumped when asked how many moons Mars has. Many are also unaware that other planets have moons of their own: Jupiter leads the pack with 63 known moons, Saturn has 61 orbiting it, and Uranus has 27.


How Many Moons Does Mars Have?

The fact that Mars has two moons may come as a surprise to many, since the planet is only about half the size of Earth, which has just one. Speculation that Mars had moons surfaced long before Asaph Hall, Sr., actually discovered them in 1877. The famous German astronomer Johannes Kepler predicted as far back as the 17th century that Mars had two moons, but his claim was more assumption than science: it rested on the fact that Earth had one moon and Jupiter was then believed to have four.


What are the Names of the Two Moons of Mars?

The two moons of planet Mars are named Phobos and Deimos, after the sons of Ares, the god of war in Greek mythology; the former name means 'fear' and the latter 'terror'. Both were discovered by the American astronomer Asaph Hall, Sr., in August 1877. Owing to their irregular shapes, astronomers believe these moons are actually captured asteroids, and a comparison with Earth's moon shows that they differ from it in several respects.


Moons of Mars: Facts About Phobos and Deimos

Of the two moons, Phobos is the larger, with a diameter of 14.1 miles, while Deimos measures 7.82 miles across. Phobos orbits at a distance of 5,826.59 miles from the planet and Deimos at 14,577.36 miles; considering that Earth's moon orbits 238,856.95 miles away, the moons of Mars circle their planet at very close range. Another notable difference is that Phobos crosses the Martian sky from west to east, while Deimos travels from east to west. Phobos takes 7 hours and 39 minutes to orbit Mars, while Deimos takes about 30 hours, i.e. roughly 1.2 Martian days.

Recent observations have revealed that the gravity of Mars is gradually dragging Phobos into a lower orbit. At the rate this is happening, Phobos will reach the Roche limit, the closest distance at which a satellite can orbit its parent body without being torn apart by tidal forces, and disintegrate within a span of about 15 million years. NASA has collected a significant amount of information about Phobos and Deimos, but that doesn't mean we know everything about them. If the number of Martian moons took you by surprise, the moons of Jupiter and Saturn are bound to leave you bewildered. NASA's efforts to facilitate low-cost planetary exploration opened the realms of the solar system for mankind, and with its space escapades continuing, it wouldn't be surprising if yet another fascinating attribute of Mars surfaces in the near future.
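
These numbers hang together, and it's easy to check with Kepler's third law, which gives the orbital period from the orbital radius and Mars' standard gravitational parameter (GM ≈ 4.2828 × 10¹³ m³/s², a textbook value). One assumption in the sketch below: the quoted 5,826.59 miles is treated as Phobos' orbital radius measured from the planet's center rather than from its surface, since that reading reproduces the stated 7-hour-39-minute period almost exactly.

```python
import math

GM_MARS = 4.2828e13       # m^3/s^2, Mars' standard gravitational parameter
MILES_TO_M = 1609.344

# Assumption: the article's 5,826.59 miles is Phobos' orbital radius
# measured from the center of Mars, not from its surface.
a = 5826.59 * MILES_TO_M  # semi-major axis in meters

# Kepler's third law: T = 2*pi*sqrt(a^3 / GM)
T = 2 * math.pi * math.sqrt(a**3 / GM_MARS)
h, m = divmod(T / 60, 60)
print(f"Phobos period: {T/3600:.2f} h (~{int(h)} h {int(m)} min)")  # ~7 h 39 min
```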




Jay Miller, a research scientist in the Integrated Ocean Drilling Program who has made numerous trips to the region and studied there under a Fulbright grant, says the ash produced from Icelandic volcanoes can be a real killer, which is why hundreds of flights from Europe have been canceled for fear of engine trouble.


"What happens is that the magma from the volcano is around 1,200 degrees and it hits the water there, which is near freezing," he explains. "What is produced is a fine ash that actually has small pieces of glass in it, and it can very easily clog up a jet engine. If you were to inhale that ash, it would literally tear up your lungs."


Miller says most volcanoes in Iceland erupt only about every five years on average and are relatively mild, but history is repeating itself: extremely large eruptions there in 934 A.D. and again in 1783 covered Europe with ash, much as today's eruption has.


"Ben Franklin was ambassador to France in 1783 and he personally witnessed the large ash clouds over Europe, and he later wrote that it was a year in which there was no summer," Miller adds. "The big question now is, what happens next? It's very possible this eruption could last for quite some time, but no one knows for sure. Volcanoes in that part of the world are very hard to predict."


Sound From Exploding Volcanoes Compared With Jet Engines


These new research findings provide scientists with a more useful probe of the inner workings of volcanic eruptions. Infrasound is sound that is lower in frequency than 20 cycles per second, below the limit of human hearing.


The study led by Robin Matoza, a graduate student at Scripps Oceanography, will be published in an upcoming issue of the journal Geophysical Research Letters, a publication of the American Geophysical Union (AGU). Matoza measured infrasonic sound from Mount St. Helens in Washington State and Tungurahua volcano in Ecuador, both of which are highly active volcanoes close to large population centers.


"We hypothesized that these very large natural volcanic jets were making very low frequency jet noise," said Matoza, who conducts research in the Scripps Laboratory for Atmospheric Acoustics.
Using 100-meter-aperture arrays of microbarometers (similar to weather barometers but sensitive to smaller changes in atmospheric pressure) and low-frequency infrasonic microphones, the research team tested the hypothesis, revealing the physics of how the large-amplitude signals from eruptions are produced. Jet noise is generated by the turbulent flow of air out of a jet engine. Matoza and colleagues recorded very large-amplitude infrasonic signals during the times when ash-laden gas was being ejected from the volcano, and the study concluded that these large-scale volcanic jets produce sound in a similar way to smaller-scale man-made jets.
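
To make the infrasound definition concrete, here is a minimal sketch of the kind of processing such a recording implies: low-pass filtering a pressure time series at 20 Hz to keep only the infrasonic band. The sample rate and the synthetic signal are assumptions for illustration, not the team's actual pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 100.0                     # sample rate in Hz (assumed for illustration)
t = np.arange(0, 60, 1 / fs)

# synthetic pressure record: a 0.5 Hz "volcanic jet" tone buried in broadband noise
pressure = np.sin(2 * np.pi * 0.5 * t) + 0.5 * np.random.randn(t.size)

# infrasound is everything below ~20 Hz, so a 20 Hz low-pass isolates it
b, a = butter(4, 20.0 / (fs / 2), btype="low")
infrasound = filtfilt(b, a, pressure)   # zero-phase filtering keeps timing intact
print(infrasound.shape)
```
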
"We can draw on this area of research to speed up our own study of volcanoes for both basic research interests, to provide a deeper understanding of eruptions, and for practical purposes, to determine which eruptions are likely ash-free and therefore less of a threat and which are loaded with ash," said Michael Hedlin, director of Scripps' Atmospheric Acoustics Lab and a co-author on the paper.
Large-amplitude infrasonic signals from volcanic eruptions are currently used in a prototype real-time warning system that informs the Volcanic Ash Advisory Center (VAAC) when large infrasonic signals have come from erupting volcanoes. Researchers hope this new information can improve hazard mitigation and inform pilots and the aviation industry.


"The more quantitative we can get about how the sound is produced the more information we can provide to the VAAC," said Matoza. "Eventually it could be possible to provide detailed information such as the size or flow rate of the volcanic jet to put into ash-dispersal forecasting models."


The paper's co-authors include D. Fee and M.A. Garcés, Infrasound Laboratory at the University of Hawaii at Manoa; J.M. Seiner of the National Center for Physical Acoustics at the University of Mississippi; and P.A. Ramón of the Instituto Geofísico, Escuela Politécnica Nacional. The research study was funded by a National Science Foundation grant.


Scientists Uncover Novel Role for DNA Repair Protein Linked to Cancer

Assistant Professor of Biology Mitch McVey and his research team report that DNA polymerase theta, or PolQ, promotes an inaccurate repair process, which can ultimately cause mutations, cell death or cancer. The research is published in the July 1 edition of the open-access journal PLoS Genetics.
"Although scientists have known for several years that the PolQ protein is somehow related to the development of cancer, its exact cellular role has been difficult to pin down," says McVey."Our finding that it acts during inaccurate DNA repair could have implications for biologists who study genomic changes associated with cancer."
DNA is a double stranded molecule shaped like a spiral staircase. Its two strands are linked together by nucleotides -- guanine, cytosine, adenine and thymine -- that naturally complement one another. Under normal conditions, a guanine matches with a cytosine, and an adenine with a thymine.
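
For illustration only, here is a minimal sketch of that pairing rule in code (my example, not something from the study):

```python
# Complementary base pairing: G <-> C, A <-> T
PAIR = {"G": "C", "C": "G", "A": "T", "T": "A"}

def complement(strand: str) -> str:
    """Return the base-pair complement of one DNA strand."""
    return "".join(PAIR[base] for base in strand)

print(complement("GATTACA"))  # -> CTAATGT
```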

How DNA Double-Strand Breaks Are Repaired

But during the course of a cell's life, the staircase can become severed into two molecules. These breaks must be repaired if the cells are to accurately replicate and pass on their genetic material. Most breaks are quickly and accurately fixed during the process of homologous recombination (HR), which uses an intact copy of DNA as a template for repair.

However, there is a second, error-prone process called end-joining repair. Here, the broken, double-stranded ends are stitched back together without regard to the original sequence. The ends of the broken strands may be altered by removal or addition of small DNA segments, which can change the genomic architecture.

In a previous paper, McVey and doctoral student Amy Marie Yu were able to demonstrate an alternative form of end-joining by studying how repair proceeds in the absence of DNA ligase 4, an important protein that links together two broken DNA ends.

After analyzing hundreds of inaccurately repaired breaks in the fruit fly Drosophila melanogaster, the scientists observed two things. First, extra nucleotides were often inserted into the DNA strands at the point of the break. Second, the insertions were closely related to the original DNA sequences directly adjacent to the break.

Polymerase Theta's Role in DNA Repair and Cancer

In the current PLoS Genetics paper, McVey, Yu and undergraduate Sze Ham Chan showed that polymerase theta plays a dominant role in this alternative repair process. First, it reads the genetic material in DNA adjacent to the break and makes a copy of it. The newly copied DNA can then be used as a molecular splint that holds the broken ends together until they can be permanently joined. In addition, the scientists speculated that the PolQ protein also has the ability to unwind DNA sequences near a break, thereby facilitating alternative end-joining.
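
As a loose illustration of that "molecular splint" idea, here is a toy model (my sketch, with invented sequences; it is not code or data from the study):

```python
def alt_end_join(left: str, right: str, copy_len: int = 5) -> str:
    """Toy model of PolQ-style alternative end joining: a short stretch
    adjacent to the break is copied and inserted at the junction."""
    splint = left[-copy_len:]        # sequence read from beside the break
    return left + splint + right    # rejoined strand carries a templated insertion

# a double-strand break between ...GATTC | CAGGA... is rejoined with an
# insertion copied from the flanking sequence, changing the original DNA
print(alt_end_join("ACGTGATTC", "CAGGATTCA"))  # -> ACGTGATTCGATTCCAGGATTCA
```
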
Other research groups have previously shown that levels of the PolQ protein are higher in several types of human tumors. McVey and his team are currently working to determine if a PolQ-dependent type of alternative end-joining is involved in the development of cancer in people. If this is indeed the case, the PolQ protein could represent a novel target for the development of new cancer drugs.
"Our first goal is to determine which parts of PolQ are required for its role in alternative end-joining," McVey says. "This will give us a road map for determining how its activity might be altered in a clinical setting."
This work was funded by grants from the National Science Foundation and the Ellison Medical Foundation.


Urgent Computing: Exploring Supercomputing's New Role


Pete Beckman, Mathematics and Computer Science Division, Argonne National Laboratory
Large-scale parallel simulation and modeling have changed our world. Today, supercomputers are not just for research and development or scientific exploration; they have become an integral part of many industries. A brief look at the Top 500 list of the world’s largest supercomputers shows some of the business sectors that now rely on supercomputers: finance, entertainment and digital media, transportation, pharmaceuticals, aerospace, petroleum, and biotechnology. While supercomputing may not yet be considered commonplace, the world has embraced high-performance computation (HPC). Demand for skilled computational scientists is high, and colleges and universities are struggling to meet the need for cross-disciplinary engineers who are skilled in both computation and an applied scientific domain. It is on this stage that a new breed of high-fidelity simulations is emerging – applications that need urgent access to supercomputing resources.

For some simulations, insights gained through supercomputer computation have immediate application. Consider, for example, an HPC application that could quickly calculate the exact location and magnitude of tsunamis immediately after an undersea earthquake. Since the evacuation of local residents is both costly and potentially dangerous, promptly beginning an orderly evacuation in only those areas directly threatened could save lives. Similarly, imagine a parallel wildfire simulation that coupled weather, terrain, and fuel models and could accurately predict the path of a wildfire days in advance. Firefighters could cut firebreaks exactly where they would be most effective. For these urgent computations, late results are useless results. As the HPC community builds increasingly realistic models, applications are emerging that need on-demand computation. Looking into the future, we might imagine event-driven and data-driven HPC applications running on-demand to predict everything from where to look for a lost boater after a storm to tracking a toxic plume after an industrial or transportation accident.
Of course, as we build confidence in these emerging computations, they will move from the scientist’s workbench and into critical decision-making paths. Where will the supercomputer cycles come from? It is straightforward to imagine building a supercomputer specifically for these emerging urgent computations. Even if such a system led the Top 500 list, however, it would not be as powerful as the combined computational might of the world’s five largest computers. Aggregating the country’s largest resources to solve a critical, national-scale computational challenge could provide an order of magnitude more power than attempting to rely on a prebuilt system for on-demand computation.

Furthermore, costly public infrastructure, idle except during an emergency, is inefficient. A better approach, when practical, is to temporarily use public resources during times of crisis. For example, rather than build a nationwide set of radio towers and transmitters to disseminate emergency information, the government requires that large TV and radio stations participate in the Emergency Alert System. When public broadcasts are needed, most often in the form of localized severe weather, broadcasters are automatically interrupted, and critical information is shared with the public.

As high-fidelity computation becomes more capable in predicting the future and being used for immediate decision support, governments and local municipalities must build infrastructures that can link together the largest resources from the NSF, DOE, NASA, and the NIH and use them to run time-critical urgent computations. For embarrassingly parallel applications, we might look to the emerging market for “cloud computing.” Many of the world’s largest Internet companies have embraced a model for providing software as a service. Amazon’s elastic computing cloud (EC2), for example, can provide thousands of virtual machine images rapidly and cost effectively. For applications with relatively small network communication needs, it might be most effective for urgent, on-demand computations simply to be injected into the nation’s existing Internet infrastructure supported by Amazon, Yahoo, Google, and Microsoft.
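
To make "embarrassingly parallel" concrete, the sketch below farms independent work units out to a local process pool; because the tasks share nothing, the same pattern scales to thousands of rented cloud machines. The task function and counts are invented placeholders, not part of any real urgent-computing system.

```python
from concurrent.futures import ProcessPoolExecutor

def simulate_cell(cell_id: int) -> int:
    # placeholder for one independent unit of work, e.g. one grid cell
    # of an ash-dispersal or wildfire forecast
    return sum(i * i for i in range(1000)) * cell_id % 97

if __name__ == "__main__":
    # each task needs no communication with the others, so the pool
    # could just as well be thousands of cloud-hosted virtual machines
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(simulate_cell, range(256)))
    print(f"{len(results)} cells simulated")
```
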
In April 2007, an urgent computing conference at Argonne National Laboratory brought together an international group of scientists to discuss how on-demand computations for HPC might be supported and change the landscape of predictive modeling. The organizers of that workshop realized that CTWatch Quarterly would be the ideal venue for exploring this new field. This issue describes how applications, urgent-computing infrastructures, and computational resources can support this new role for computing.


Gamma-Ray Wipe-Out



Gamma-ray bursts: the biggest, most powerful explosions in the Universe since the Big Bang. They are a million trillion times brighter than the Sun. Each day, we detect a burst of gamma radiation coming from a random direction in the sky. Lasting anywhere from a fraction of a second to several minutes, these high-energy astronomical enigmas have a deadly power.

The monitoring of nuclear-weapon tests led researchers to the biggest bangs in the Universe
Until recently, we didn't know very much about them. Gamma-ray bursts were only discovered in the late 1960s, quite by accident. During the Cold War, the US military sent up satellites to detect signs of illegal nuclear testing by the Soviet Union. These Vela satellites were fitted with gamma-ray detectors, since nuclear explosions release copious amounts of this extremely energetic radiation. The American military was surprised when their satellites instead detected great explosions of gamma-ray photons coming from space.


So what are these gamma-ray bursts? How are they formed, and where do they come from? Over the past 15 or so years, satellites and ground-based observations have provided a mass of data that has helped scientists to answer these questions.


Scientists have identified two kinds of gamma-ray burst (GRB): short duration (average 0.3 second) and long duration (average 30 seconds). Experts suspect that the two kinds form in totally different ways. Although they are fairly confident they understand the origin of long-duration GRBs, short-duration GRBs remain a puzzle.


3 Robots That Move Just Like Animals

Biomimicry, imitating nature’s designs and processes to create products for humans, has been heralded as key to creating our sustainable future. Innovations such as self-cleaning paint based on lotus leaves, swimsuits made like sharkskin, and wind turbines in the likeness of whale flippers have all been inspired by parts of nature. But why stop there? A number of developers are capturing the movement and grace of entire animals, giving us robots that crawl, walk, and swim just like their biological counterparts. If this research one day spawns an uncomplaining robotic mule to carry our physical burdens or dogs that can save children from fiery buildings without fear of harm, man’s best friend may also be humanity’s own invention.


Science Sets Its Eyes on the Prize


A growing number of organizations are taking a cue from reality TV, offering prize money for successful solutions to science and technology problems.

Three major prizes are currently up for grabs from the X Prize Foundation, which aims to spur innovation. The $10 million Archon X Prize will reward any group or person who can sequence the human genome in 10 days or less for no more than $10,000 per genome. So far, eight teams have registered. The $10 million Progressive Automotive X Prize, recognizing high-efficiency, commercially viable vehicles, completed two rounds of judging this past year. Performance tests will start in the spring of 2010, with winners announced in September. And 21 teams are vying to land a privately funded rover on the moon in pursuit of the $30 million Google Lunar X Prize. Last October a smaller X Prize–operated contest, the Northrop Grumman Lunar Lander Challenge, awarded $1 million to Masten Space Systems and $500,000 to Armadillo Aerospace for their progress in building a commercial rocket capable of safe vertical takeoff and landing, as demonstrated by successful tests in the Mojave Desert.

Other groups were also busy in 2009. Entries poured in for the £10 million ($17 million) Saltire Prize, to be awarded by the government of Scotland for wave or tidal energy technology that can produce a continuous output of 100 gigawatt-hours for two years. More than 100 teams will begin competition this month. In September, the DVD rental company Netflix paid out a $1 million purse to a seven-member team that developed an algorithm to improve its predictions of customers’ movie preferences. Netflix plans to announce a sequel early this year. Meanwhile, a company called InnoCentive is hosting hundreds of open questions in science and technology. Rewards range from $5,000 to $1 million.


20 Things You Didn't Know About... Light


1 : God commanded, “Let there be light,” but it didn’t happen for nearly half a million years. That’s how long after the Big Bang the universe took to expand enough to allow photons (light particles) to travel freely.

2 : Those photons are still running loose, detectable as the cosmic microwave background, a microwave glow from all parts of the sky.

3 : Light moves along at full “light speed”—186,282.4 miles per second—only in a vacuum. In the dense matrix of a diamond, it slows to just 77,500 miles per second. (See the quick check after this list.)

4 : Diamonds are the Afghanistan of gemstones: Any entering photon quickly gets bogged down. It takes a lot of pinging back and forth in a thicket of carbon atoms to find an exit. This action is what gives diamonds their dazzling sparkle.

5 : Eyeglasses can correct vision because light changes speed when it passes from air to a glass or plastic lens; this causes the rays to bend.

6 : Plato fancied that we see by shooting light rays from our eyes.

7 : The Greek philosopher was not completely wrong. Like all living things, humans are bioluminescent: We glow. We are brightest during the afternoon, around our lips and cheeks. The cause may be chemical reactions involving molecular fragments known as free radicals.

8 : Bioluminescence is the largest source of light in the oceans; 90 percent of all creatures who live below about 1,500 feet are luminous.

9 : World War II aviators used to spot ships by the bioluminescence in their wakes. In 1954 Jim Lovell (later the commander of Apollo 13) used this trick to find his darkened aircraft carrier.

10 : Incandescent bulbs convert only 10 percent of the energy they draw into light, which is why Europe will outlaw them by 2012. Most of the electricity turns into unwanted heat.

11 : In the confined space of an Easy-Bake oven, a 100-watt bulb can create a temperature of 325 degrees Fahrenheit.

12 : Light has no mass, but it does have momentum. Later this year the Planetary Society will launch LightSail-1, attempting to capture the pressure of sunlight the way a boat’s sail gathers the wind.

13 : Laser beams bounced off mirrors left behind by Apollo astronauts show that the moon is moving 1.5 inches farther from Earth each year.

14 : Visible light makes up less than one ten-billionth of the electromagnetic spectrum, which stretches from radio waves to gamma rays.

15 : Goldfish can see infrared radiation that is invisible to us. Bees, birds, and lizards have eyes that pick up ultraviolet.

16 : Photography means “writing with light.” English astronomer John Herschel, whose father discovered infrared, coined the term.

17 : Shoot now: The “golden hour,” just after sunrise and before sunset, produces the prettiest shadows and colors for photographs.

18 : Day and night are everywhere the same length on the vernal equinox, which occurs this year on March 20.

19 : Auroras light up the night sky when solar wind particles excite atoms in the upper atmosphere. Oxygen mostly shines green; nitrogen contributes blue and red.

20 : But to the Inuits, auroras are spirits of the dead kicking around the head of a walrus.
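
Item 3's diamond figure is easy to sanity-check: light's speed in a medium is the vacuum speed divided by the medium's refractive index. In this quick sketch, the index of diamond (about 2.42, a textbook value that varies slightly with wavelength) is the only assumption, which is why the result lands near, rather than exactly on, 77,500:

```python
C_MILES_PER_S = 186_282.4  # vacuum speed of light, from item 3
N_DIAMOND = 2.42           # refractive index of diamond (approximate textbook value)

v = C_MILES_PER_S / N_DIAMOND
print(f"{v:,.0f} miles per second")  # ~77,000, in line with item 3's figure
```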


Are Black Holes the Architects of the Universe?


Black holes are finally winning some respect. After long regarding them as agents of destruction or dismissing them as mere by-products of galaxies and stars, scientists are recalibrating their thinking. Now it seems that black holes debuted in a constructive role and appeared unexpectedly soon after the Big Bang. “Several years ago, nobody imagined that there were such monsters in the early universe,” says Penn State astrophysicist Yuexing Li. “Now we see that black holes were essential in creating the universe’s modern structure.”

Black holes, tortured regions of space where the pull of gravity is so intense that not even light can escape, did not always have such a high profile. They were once thought to be very rare; in fact, Albert Einstein did not believe they existed at all. Over the past several decades, though, astronomers have realized that black holes are not so unusual after all: Supermassive ones, millions or billions of times as hefty as the sun, seem to reside at the center of most, if not all, galaxies. Still, many people were shocked in 2003 when a detailed sky survey found that giant black holes were already common nearly 13 billion years ago, when the universe was less than a billion years old. Since then, researchers have been trying to figure out where these primordial holes came from and how they influenced the cosmic events that followed.

In August, researchers at the Kavli Institute for Particle Astrophysics and Cosmology at Stanford University ran a supercomputer simulation of the early universe and provided a tantalizing glimpse into the lives of the first black holes. The story began 200 million years after the Big Bang, when the universe’s first stars formed. These beasts, about 100 times the mass of the sun, were so large and energetic that they burned all their hydrogen fuel in just a few million years. With no more energy from hydrogen fusion to counteract the enormous inward pull of their gravity, the stars collapsed until all of their mass was compressed into a point of infinite density.


The first-generation black holes were puny compared with the monsters we see at the centers of galaxies today. They grew only slowly at first—adding just 1 percent to their bulk in the next 200 million years—because the hyperactive stars that spawned them had blasted away most of the nearby gas that they could have devoured. Nevertheless, those modest-size black holes left a big mark by performing a form of stellar birth control: Radiation from the trickle of material falling into the holes heated surrounding clouds of gas to about 5,000 degrees Fahrenheit, so hot that the gas could no longer easily coalesce. “You couldn’t really form stars in that stuff,” says Marcelo Alvarez, lead author of the Kavli study.

Even as Alvarez’s computer model offered a glimpse into the universe’s infancy, it sowed confusion about what happened next. In 2007 scientists spotted a billion-solar-mass black hole that existed some 840 million years after the Big Bang, the earliest and most distant one ever observed. (Black holes themselves are invisible, but astronomers detect them by looking for the brilliantly hot gas that swirls around them before getting sucked in.) This past September another research team announced it had found a large, star-forming galaxy surrounding that black hole. These discoveries were puzzling, to say the least. About 400 million years after the Big Bang, the universe still consisted of scattered stars and small, starving black holes. Less than 500 million years later, it was full of monster black holes embedded in vast galaxies. How did things change so rapidly?

Penn State’s Li is trying to find out. While Alvarez’s simulations focus mostly on individual stars and black holes, Li studies the interaction of those objects and their influence on large-scale structures in the early universe. Her work shows that the first black holes were enveloped by halos of dense, invisible matter tens of thousands of times more massive. Together, these constituted protogalaxies, building blocks of today’s galaxies. During a period of frequent, violent collisions among the protogalaxies, their resident black holes experienced rapid growth spurts by merging with one another and gobbling up new supplies of gas and dust. A 100-solar-mass black hole ballooned into a billion-mass beast within 800 million years, and in especially dense regions that growth could have occurred even more quickly. During this dynamic period, Li’s model shows, black holes suddenly became a lot more star-friendly. Merging protogalaxies sent out shockwaves that compressed dense clumps of gas, helping trigger widespread star birth even in regions previously dominated by black hole radiation. In a remarkably short period of time, black holes shifted from lightweight bullies to supermassive centerpieces of star-breeding galaxies.
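
The growth rate implied by that last sentence is worth spelling out: going from 100 to a billion solar masses is a factor of ten million, about 16 e-foldings, and fitting that into 800 million years requires a doubling roughly every 35 million years, comparable to the canonical timescale for a black hole feeding at its maximum (Eddington) rate. A quick sketch of the arithmetic:

```python
import math

m_seed, m_final = 100.0, 1.0e9   # solar masses, from the paragraph above
t_total_myr = 800.0              # million years

efolds = math.log(m_final / m_seed)   # ~16.1 e-foldings of growth
t_efold = t_total_myr / efolds        # ~50 Myr per e-folding
t_double = t_efold * math.log(2)      # ~34 Myr per doubling
print(f"{efolds:.1f} e-folds; doubling time ~{t_double:.0f} Myr")
```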

The most distant black hole known, nearly 13 billion light-years away, is the white spot in the middle of this false-color image. (Image: Tomotsugu Goto/University of Hawaii)
Although this simulation offers a comprehensive account of this formative epoch, Li concedes that her models are still just models; they are no match for direct observation. So while she and other theorists refine their calculations, other astronomers are using powerful telescopes to peer ever further back in time, looking for objects that are currently known only from computer simulations. “There are aggressive campaigns to search for the first supermassive black holes,” Li says. “We still may not have found the very first ones.” She says it would not surprise her if the earliest of these giant black holes appeared as little as 500 million years after the birth of the universe.

The recently refurbished Hubble Space Telescope will aid this search. This past April, one of Li’s Penn State colleagues discovered the burst of energy from a star that exploded, probably in the process of collapsing to form a black hole, when the cosmos was just 630 million years old. Hubble’s successor, the James Webb Space Telescope, will delve even deeper following its 2014 launch.

Soon, astronomers may be able to directly observe the improbable era when black holes were among the most important objects in the universe, helping to bring order to the Big Bang’s formlessness. “In theoretical and observational astronomy,” Li says, “this is the cosmic frontier.”


4-D Microscopy Captures the Movements of Individual Atoms

Electron microscopes have given scientists unprecedented views of atomic and molecular structures. Now those vistas are evolving from still photographs into motion pictures. Physicist Ahmed Zewail of Caltech, who won the 1999 Nobel Prize in Chemistry for his use of lasers to study chemical reactions, is pioneering 4-D electron microscopy, which allows direct observation of atomic-scale changes in real time. His filmmaking technique relies on applying a short laser pulse to the subject, followed by an ultrafast pulse of electrons, which scatter off the material to produce an image for the microscope. A series of these images can be collected in rapid succession and viewed as a movie. The extremely brief electron pulses ensure that the image remains sharp, much like a short-exposure photograph of a speeding object.
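
The stroboscopic scheme can be pictured as a loop over pump-probe delays. The sketch below is a cartoon with an entirely hypothetical stand-in for the instrument; no real control API is implied:

```python
import random

def capture_frame(delay_fs: float) -> list[float]:
    """Hypothetical stand-in for one pump-probe exposure: the laser pump
    fires, then an ultrafast electron pulse probes the sample delay_fs
    femtoseconds later and its scattering pattern is recorded."""
    random.seed(delay_fs)                        # deterministic fake data
    return [random.random() for _ in range(4)]   # fake diffraction intensities

# stepping the pump-probe delay frame by frame builds up the "movie"
movie = [capture_frame(dt) for dt in range(0, 1000, 50)]
print(f"{len(movie)} frames collected")
```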

In late 2008 Zewail and his colleagues announced that they had observed atomic motion in gold and in graphite (a sheet of carbon atoms). They discovered that atoms in heated graphite began to pulse in an unexpected synchronized “drumming” action, much like a heartbeat. Last summer the lab reported capturing changes in the pattern of bonding among graphite’s carbon atoms following intense compression of the sample. And in December they described watching nanotubes briefly glow after being hit with laser light. “You can see things evolve over time in a way that you never could with a snapshot,” says physicist and collaborator Brett Barwick. Eventually the group hopes to study chemical reactions in living cells.


Technology Innovator’s Mobile Move


MENLO PARK, Calif. — The film “2001: A Space Odyssey” presents a dramatic vision of the future, where sentient robots double as secretaries, performing daily tasks and simple services for their human masters.
Norman Winarsky, a vice president at SRI. (Photo: Peter DaSilva for The New York Times)
The founders of Chattertrap, a kind of personal assistant: David Schairer, left, Henry Nothhaft Jr. and Gary Griffiths. (Photo: Peter DaSilva for The New York Times)
Now, SRI International, the research institute, is hoping to bring the concept of virtual personal assistants closer to reality — without the malevolent malfunctions, of course.
“We are looking to augment human capability,” said Norman Winarsky, vice president for licensing and strategic programs at SRI. “But with artificial intelligence.”
Established in 1946 by Stanford University, SRI created early prototypes of the computer mouse and the technologies involved in ultrasound and HDTV.
Although SRI does roughly 80 percent of its work for the federal government, many of its technologies have been adapted for commercial purposes. Recently, the institute has set its sights on the mobile phone and Web market, especially on creating applications that perform personal functions.
“We have companies in every space: drug discovery, flexible circuits, new medical devices, solar, clean tech,” said Mr. Winarsky, who oversees the establishment of new companies that are spun off from SRI. “But right now, half of the companies we’re thinking of creating are strongly related to virtual personal assistants.”
SRI’s newest venture is a Web-based personalized news feed, Chattertrap, that monitors what people are reading to learn what they like, and then serves up articles and links that suit their interests.
Another recent project is a mobile application, Siri, that allows people to perform Web searches by voice on a cellphone. Siri users can speak commands like “find a table at an Italian restaurant for six at 8 tonight,” and the application can translate the request and use GPS functions and search algorithms to find an answer.
Siri’s software is sophisticated enough that over time, it can even remember if someone prefers places that serve Northern Italian cuisine, rather than Sicilian, and make recommendations around that preference.
The application has already been a big hit; in April, Apple acquired Siri for a price said to be as high as $200 million. But some analysts wonder whether SRI will be able to duplicate this kind of success. Variations on the virtual personal assistant concept have been around for a while. Two services, for example — Remember the Milk and Jott — are types of electronic crutches intended to help users be more efficient at ticking off items in their daily to-do lists.
But SRI is betting that its expertise in artificial intelligence will help make software that can break away from the pack. And it has high hopes that Chattertrap will be as successful as Siri.
“The popular news sites aren’t always the most interesting,” said Gary Griffiths, one of the two entrepreneurs SRI recruited to guide Chattertrap. “But by using technology to evolve with you as you use it, watching what you’re doing and giving more of what you like and less of what you’re ignoring, we can create a very personal information service.”
Although Chattertrap is in a limited test period right now, the company hopes to allow more users later this summer and release the product in its entirety by the end of the year.
Chattertrap has already caught the eye of Li Ka-shing, a Chinese billionaire who has invested in Facebook and the music-streaming service Spotify. Mr. Li recently led a $1.5 million round of venture financing in the Chattertrap project.


SRI’s newfound interest in mobile and Web applications was born, in part, from a research project commissioned by the Defense Department to develop software that can learn, in an effort to create a more efficient way for the military to communicate and stay organized in the field. The project’s underlying technology, a combination of adaptive machine learning and natural-language processing, has spawned several offshoots.
Each year, SRI tests the marketability of roughly 2,000 technology ventures, but typically only three or four are ever established as independent businesses.


Charles S. Golvin, an analyst with Forrester Research who follows the mobile industry, said SRI was tapping into the mobile market at a time when the need to simplify searching is greater than ever.


“The old paradigm of having a desktop computer in front of you with a large screen to search around for what you want is going away,” Mr. Golvin said. “More and more, the information you want online is coming from the palm of your hand.”


Since most mobile phones have small, cramped screens and tiny keyboards, voice-activated search and speech recognition become much more powerful, Mr. Golvin said.
“It’s a very compelling offer for a mobile company,” he said.


In addition, companies like Apple and Google are sizing up the market opportunity for location-based search and the potential advertising opportunities that come with it, said Brent Iadarola, director of mobile research at Frost & Sullivan.


“The acquisition that Apple has made provides powerful clues as to what the mobile landscape will look like in the future,” Mr. Iadarola said.


“When you’re in a mobile environment there’s a higher propensity to spend, and tying that into mobile advertising could be lucrative.”


Still, he said, it’s not clear yet whether SRI can recreate the same type of successes it had with Siri with its future virtual personal assistants. “That was hitting it out of the ballpark, in my opinion,” he said. “I don’t know if they can replicate that.”


Mr. Winarsky said the intellectual property licensed to Apple as part of the acquisition of Siri is a fraction of what has been generated by the institute.


“Siri is the first and in some cases, the simplest, of what we’ll do,” he said.
Mr. Winarsky said SRI was in the early stages of determining what will be the next start-up to become an independent company.


One area he is particularly excited about is translation, he said.
“Virtually every industry and platform has a need for translation services,” he said.
In addition, he said, a virtual personal assistant could be of great use to the health industry and patients, by helping figure out which procedures are covered by insurance or quickly finding and booking a doctor’s appointment.


“We’ll only be able to tell in 20 years,” he said. “But I truly believe this is the dawn of a new era of artificial intelligence. It is on the vanguard of a great revolution in computer science.”


Technology; Continuous Casting in Steel

ALTHOUGH the technology of continuous casting has been around for two decades, most domestic steel producers had found it uneconomical and impractical until recently.
But steelmaking energy costs have doubled since the 1973 oil embargo and pressure has intensified from such competitors as the Japanese, who in 1980 cast 58 percent of their steel through the new process. Some of the large domestic steelmakers are now making a major effort to build new continuous casters. The domestic industry now casts about 17 percent of its steel through the process.

The process was patented by Henry Bessemer in 1865 but was not developed on a large scale until after World War II. Its use in the 1980's is expected to be significant, however. ''The most important technological change for integrated steelmakers during the next 10 years will be greater adoption of continuous casting,'' according to a report by the Federal Office of Technology Assessment.

The United States Steel Corporation, for example, announced last year that it would install a continuous slab caster at its historic Edgar Thomson Works and at its Lorain Works in Ohio. David M. Roderick, chairman, said that the nation's largest steel company would then have eight such casters and would be able to cast about 29 percent of its raw steel through the process.
The National Steel Corporation, which installed the first large continuous caster in this country in 1968 at its Weirton Works in West Virginia, will be able to cast more than 50 percent of its steel continuously by 1982.

And such other large steelmakers as Armco are adding casters. In continuous casting, freshly refined molten steel is transformed directly into steel products such as slabs, billets and blooms, which can then be finished in mills into various steel products such as sheet, rods and beams.
In the process, molten steel is poured into a vessel called a tundish, which allows the steel to flow at a constant rate into a water-cooled mold. A solid skin is formed and the hardened steel is pulled continuously out of the bottom of the mold. It is then cut into various lengths.
In the traditional method, molten steel is poured into ingot molds. When it cools and hardens, the mold is stripped away. The ingot can then be stored. It must be reheated, however, before it can be formed into slabs, billets and blooms suitable for rolling.
During heating in the old process, the surface oxidizes and part of the steel must be removed through a process called scarfing. These various steps lead to the loss of some of the steel, which becomes scrap and is recycled.

The savings in energy, manpower and waste in continuous casting come mostly from the elimination of various steps needed under traditional methods.
The total energy saving is estimated to be about 10 percent in making semifinished steel products. The average saving on a ton of steel has been estimated to be about 3 million British thermal units, or about $12.
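
Those two figures are mutually consistent: together they imply an energy price of about $4 per million Btu. The arithmetic (mine, not the article's):

```python
btu_saved_per_ton = 3_000_000  # Btu saved per ton of steel, from the article
dollars_per_ton = 12.0         # dollar saving per ton, from the article

price = dollars_per_ton / (btu_saved_per_ton / 1_000_000)
print(f"${price:.2f} per million Btu")  # implied early-1980s energy price
```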

The other major advantage is that the process eliminates much of the scrap and increases the yield from the raw steel of semifinished steel products by about 10 to 15 percent. Continuous casters thus fit in with current industry strategy of seeking to raise capacity by making plants more efficient.

The manpower savings are estimated to be 10 to 15 percent by the Department of Labor. In older plants some workers are absorbed in running the caster and in the finishing mills.
There are a few potential disadvantages. If a continuous caster breaks down, the primary steelmaking process has to be held up as well. There are also a small number of steels that have not yet been cast through the process, basically because the casting methods have not been perfected, though the list has been decreasing.

Milton Deaner, vice president of engineering for National Steel, which has had one of the most extensive and longest experiences with continuous casting, said that the company has been very satisfied with the process. He said that executives of plants without casters were eager to have them installed. ''No one has reservations about building them; it's just a matter of getting the money to build one,'' he said. A caster can cost from $120 million to $150 million.


Bluetooth Headset for iPod Touch


So you have that gadget, the iPod touch? What next? You will want to use your Apple iPod touch to the fullest, and that means getting the best accessories: a sturdy case, a screen guard, a good docking system, and, not least, a Bluetooth headset. A good Bluetooth headset is among the must-haves for the iPod touch, letting you enjoy the best quality audio while keeping tangled wires at bay. Thanks to Bluetooth technology, wired headsets are no longer necessary; just make sure the device you buy is compatible with your version of the iPod touch. Headsets come in many styles, including in-ear, on-ear, overhead band, neckband, and stereo headphones. The picks below are worth checking out before you buy.

Beats by Dr. Dre Solo HD Headphones

An easy-folding, easy-packing headset with an ingenious tri-fold design, this is a perfect pick for your iPod touch. This ultra-light headset delivers amazing high-definition sound while you are on the move. It has a ControlTalk feature for iPod playback control and hands-free calling with an iPhone or any music phone, which makes it a good headset for the iPhone as well. It also comes with a touring case to keep your headphones protected while you travel.

Sennheiser PXC 450 Travel Headphones

The highest-rated product among headphones for the iPod touch, the PXC 450 is a high-end travel headphone set with NoiseGard 2.0 technology and a TalkThrough function that ensures the best possible audio quality. With TalkThrough and the microphones mounted on the headphones, you can talk with your friends without removing the headphones. The comfortable, sturdy, foldable design is another plus point and makes it a perfect travel companion. You can expect the best sound quality from your iPod or iPhone even in noisy environments.

Motorola HT820

One of the best Bluetooth headsets for the third-generation iPod touch, the HT820 delivers high-quality audio, and its light weight makes it easy to carry. It has an embedded microphone and lets you answer a call by pausing the music track, with music buttons on the right ear piece and call buttons on the left. With a music time of approximately 12 hours and a standby time of approximately 500 hours, it is a solid pick for your iPod touch.

Sony Ericsson HBH-IS800

Experience extraordinary music quality with this stereo headset from Sony Ericsson. It comes with a short cord that lies almost hidden at the back of your neck, and its in-ear stereo headphones ensure high-quality wireless audio. Its Adaptive Frequency Hopping (AFH) feature reduces the impact of interference on sound, giving you excellent audio quality.

Audio-Technica ATH-ESW9A

Another contender for the best headset for the iPod touch, this Apple-recommended, second-highest-rated portable headset has foldable headphones with 42 mm drivers for outstanding audio quality and soft earpads for comfortable listening. It is compatible with all models of iPod.

With Bluetooth technology, clipping on wires and holding the iPod is simply not needed. You can place your gadget in a pocket, dock, stand, or dashboard and have your favorite track playing right in your ears. Pick the best Bluetooth headset for your iPod touch and have fun listening to music.


Best Wireless Router for Home


If you want a truly unplugged Internet browsing experience at home, switch to a wireless router. A wireless router frees you from wired connections and lets you access the net from anywhere in your house. To get secure, high-speed wireless Internet access, it is essential to pick the best wireless router for home use, and the reviews below will help you choose one.

Choosing the Best Wireless Router for Home

Before we look at the best wireless routers for home use, here are the main factors to take into consideration while shopping and reading wireless router reviews:


Speed: The speed of data transmission, measured in MB/sec, is the most important factor. Go for a router with a high data transfer speed.

Number of Antennas: The more internal antennas a router has, the better its coverage range and connectivity.

Dual Band/Single Band: A dual band wireless router transmits on two radio frequencies, which makes it faster than a single band router.

Firewall & Network Security: Make sure the router has state-of-the-art security, including a firewall and the WPA2 encryption standard. This protects your network from eavesdroppers and hackers.

Network Standard: Opt for one of the 802.11 a/b/g/n standards, of which 802.11n and 802.11g are the fastest.

Range: Make sure the router has a wide coverage range that covers your home adequately.

Best Wireless Router for Home Use

After that brief overview of the features that make a great home wireless router, let us review some of the best products on the market.

Apple AirPort Extreme

The Apple AirPort Extreme is voted one of the best wireless routers for home use. It is a dual band wireless router with advanced security features, 802.11 a/b/g/n compatibility, and support for external hard drive and printer sharing. What makes it stand out is the high-speed Internet access it offers along with excellent coverage range; it is also a top choice for home gaming. With a one-year warranty, it is priced at $175.

Linksys WRT610N

With simultaneous dual band connectivity, WPA2 encryption, an SPI firewall, six internal antennas, and high-speed data transfer, the Linksys WRT610N is a great choice for the home user. It is compatible with all of the 802.11 a/b/g/n network standards. Home users also benefit from two added features: external hard drive sharing and four Ethernet ports. With all these features, it will cost you only $165.

Linksys WRT400N

At a price of only $115, the Linksys WRT400N offers dual band connectivity and excellent coverage range. Though not as fast as the higher-end Linksys products, its features make it a good choice for a home user without high bandwidth needs. It may not be the fastest wireless router for the home, but what sets it apart, of course, is the low price.

Linksys WRT54GL

If you are bound by budget constraints and need a wireless router that offers all the bare essentials at a reasonable price, the Linksys WRT54GL is the best choice. It has WPA2 security, single band connectivity, and an SPI firewall with decent data transfer speeds. Its two antennas ensure adequate coverage for small homes, and it will cost you only $55.

The best wireless router for home use is one that provides secure Internet access, high data transfer speeds, and excellent coverage range. Every one of the products above qualifies; choose whichever matches your requirements and suits your budget.
