
How Many Moons Does Mars Have?

Even though speculation about the existence of life on Mars has brought the planet back into the limelight, few people can claim to know everything about it. Among the lesser-known facts about Mars, one of the most prominent concerns its natural satellites, namely the exact number of moons it has. It wouldn't be surprising to see people stumped when asked how many moons Mars has. Many are also unaware that other planets have moons of their own: Jupiter leads the pack with 63 known moons, Saturn has 61 moons orbiting it, and Uranus has 27.


How Many Moons Does Mars Have?

The fact that Mars has two moons may come as a surprise to many, since the planet is only about half the size of Earth, which has just one moon. Speculation that Mars had moons surfaced long before Asaph Hall, Sr., actually discovered them in 1877. The famous German astronomer Johannes Kepler, for instance, predicted in the 17th century that Mars had two moons. His claim, however, didn't quite hold ground, as it was more an assumption than a finding backed by scientific evidence: Kepler reasoned that since Earth had one moon and Jupiter was believed to have four, Mars should have two.


What are the Names of the Two Moons of Mars?

The two moons of Mars are named Phobos and Deimos, after the sons of Ares, the god of war in Greek mythology; the former name means 'fear' and the latter 'terror'. Both moons were discovered by the American astronomer Asaph Hall, Sr., in August 1877, Deimos on August 12 and Phobos six days later on August 18. Owing to their irregular shapes, astronomers believe these moons are actually captured asteroids. A comparison with Earth's moon shows that the Martian moons differ from it in several respects.


Moons of Mars: Facts About Phobos and Deimos

Of the two moons, Phobos is the larger, with a diameter of 14.1 miles, while Deimos is smaller, with a diameter of 7.82 miles. Phobos orbits at a distance of about 5,826.59 miles from the center of Mars, whereas Deimos orbits at about 14,577.36 miles. Considering that Earth's moon orbits roughly 238,856.95 miles from the center of the Earth, the moons of Mars circle their planet at a very close range.

Another notable difference is how the two moons move across the Martian sky: Phobos, which orbits faster than Mars rotates, rises in the west and sets in the east, while Deimos rises in the east and sets in the west. Phobos takes 7 hours and 39 minutes to orbit Mars, while Deimos takes about 30 hours, roughly 1.2 Martian days. Recent observations have revealed that tidal forces from Mars are gradually dragging Phobos inward, lowering its orbit. At the rate at which this is happening, Phobos will reach the Roche limit, the distance within which tidal forces tear a natural satellite apart, and disintegrate within a span of about 15 million years.

NASA has collected a significant amount of information about Phobos and Deimos, but that doesn't mean we know everything about them. If this information about how many moons Mars has took you by surprise, learning about the moons of Jupiter or how many moons Saturn has is bound to leave you bewildered. NASA's efforts to facilitate low-cost planetary exploration have opened the realms of the solar system to mankind, and with NASA continuing its space escapades, it wouldn't be surprising if yet another fascinating attribute of Mars surfaces in the near future.
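
As a rough cross-check of the figures above, the quoted orbital periods follow from Kepler's third law, which relates a moon's period to its distance from the planet's center. The short Python sketch below is illustrative only; it assumes circular orbits and uses the standard gravitational parameter of Mars, which is not a figure from the article.

    # Rough check of the quoted orbital periods via Kepler's third law,
    # T = 2*pi*sqrt(a^3 / GM), treating both orbits as circular.
    import math

    GM_MARS = 4.2828e13        # gravitational parameter of Mars, m^3/s^2
    MILES_TO_M = 1609.344      # miles -> meters

    def orbital_period_hours(distance_miles):
        """Period of a circular orbit at the given distance from Mars' center."""
        a = distance_miles * MILES_TO_M
        return 2 * math.pi * math.sqrt(a ** 3 / GM_MARS) / 3600.0

    print(f"Phobos: {orbital_period_hours(5826.59):.1f} h")    # about 7.7 h (7 h 39 min)
    print(f"Deimos: {orbital_period_hours(14577.36):.1f} h")   # about 30 h (~1.2 Martian days)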



Jay Miller, a research scientist in the Integrated Ocean Drilling Program who has made numerous trips to the region and studied there under a Fulbright grant, says the ash produced by Icelandic volcanoes can be a real killer, which is why hundreds of flights from Europe have been canceled for fear of engine trouble.


"What happens is that the magma from the volcano is around 1,200 degrees and it hits the water there, which is near freezing," he explains. "What is produced is a fine ash that actually has small pieces of glass in it, and it can very easily clog up a jet engine. If you were to inhale that ash, it would literally tear up your lungs."


Miller says most volcanoes in Iceland erupt only about every five years on average and are relatively mild, but history is repeating itself: extremely large eruptions occurred there in 934 A.D. and again in 1783, covering Europe with ash much as is happening today.


"Ben Franklin was ambassador to France in 1783 and he personally witnessed the large ash clouds over Europe, and he later wrote that it was a year in which there was no summer," Miller adds. "The big question now is, what happens next? It's very possible this eruption could last for quite some time, but no one knows for sure. Volcanoes in that part of the world are very hard to predict."


Sound From Exploding Volcanoes Compared With Jet Engines


These new research findings provide scientists with a more useful probe of the inner workings of volcanic eruptions. Infrasound is sound that is lower in frequency than 20 cycles per second, below the limit of human hearing.


The study led by Robin Matoza, a graduate student at Scripps Oceanography, will be published in an upcoming issue of the journal Geophysical Research Letters, a publication of the American Geophysical Union (AGU). Matoza measured infrasonic sound from Mount St. Helens in Washington State and Tungurahua volcano in Ecuador, both of which are highly active volcanoes close to large population centers.


"We hypothesized that these very large natural volcanic jets were making very low frequency jet noise," said Matoza, who conducts research in the Scripps Laboratory for Atmospheric Acoustics.
Using 100-meter aperture arrays of microbarometers (similar to weather barometers but sensitive to smaller changes in atmospheric pressure) and low-frequency infrasonic microphones, the research team tested the hypothesis, revealing the physics of how the large-amplitude signals from eruptions are produced. Jet noise is generated by the turbulent flow of air out of a jet engine, and Matoza and colleagues recorded very large-amplitude infrasonic signals during the periods when ash-laden gas was being ejected from the volcano. The study concluded that these large-scale volcanic jets produce sound in much the same way as smaller-scale man-made jets.
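
The core signal-processing idea, isolating energy below 20 cycles per second and flagging unusually large pressure excursions, can be sketched in a few lines. The Python snippet below is a hedged illustration only, not the team's actual processing chain: the sampling rate, window length, and alert threshold are invented, and NumPy and SciPy are assumed to be available.

    # Illustrative only: isolate infrasound (< 20 Hz) in a pressure record and
    # flag windows with unusually large amplitude. All parameters are invented.
    import numpy as np
    from scipy.signal import butter, filtfilt

    def detect_infrasound(pressure_pa, fs=100.0, window_s=10.0, threshold_pa=1.0):
        """Return start times (s) of windows whose sub-20 Hz RMS exceeds the threshold."""
        b, a = butter(4, 20.0, btype='low', fs=fs)   # 4th-order low-pass at 20 Hz
        infra = filtfilt(b, a, pressure_pa)          # zero-phase filtering
        n = int(window_s * fs)
        alerts = []
        for start in range(0, len(infra) - n, n):
            rms = np.sqrt(np.mean(infra[start:start + n] ** 2))
            if rms > threshold_pa:
                alerts.append(start / fs)
        return alerts
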
"We can draw on this area of research to speed up our own study of volcanoes for both basic research interests, to provide a deeper understanding of eruptions, and for practical purposes, to determine which eruptions are likely ash-free and therefore less of a threat and which are loaded with ash," said Michael Hedlin, director of Scripps' Atmospheric Acoustics Lab and a co-author on the paper.
Large-amplitude infrasonic signals from volcanic eruptions are currently used in a prototype real-time warning system that informs the Volcanic Ash Advisory Center (VAAC) when large infrasonic signals have come from erupting volcanoes. Researchers hope this new information can improve hazard mitigation and inform pilots and the aviation industry.


"The more quantitative we can get about how the sound is produced the more information we can provide to the VAAC," said Matoza. "Eventually it could be possible to provide detailed information such as the size or flow rate of the volcanic jet to put into ash-dispersal forecasting models."


The paper's co-authors include D. Fee and M.A. Garcés, Infrasound Laboratory at the University of Hawaii at Manoa; J.M. Seiner of the National Center for Physical Acoustics at the University of Mississippi; and P.A. Ramón of Instituto Geofísico, Escuela Politécnica Nacional. The research study was funded by a National Science Foundation grant.


Scientists Uncover Novel Role for DNA Repair Protein Linked to Cancer

Assistant Professor of Biology Mitch McVey and his research team report that DNA polymerase theta, or PolQ, promotes an inaccurate repair process, which can ultimately cause mutations, cell death or cancer. The research is published in the July 1 edition of the open-access journal PLoS Genetics.
"Although scientists have known for several years that the PolQ protein is somehow related to the development of cancer, its exact cellular role has been difficult to pin down," says McVey."Our finding that it acts during inaccurate DNA repair could have implications for biologists who study genomic changes associated with cancer."
DNA is a double stranded molecule shaped like a spiral staircase. Its two strands are linked together by nucleotides -- guanine, cytosine, adenine and thymine -- that naturally complement one another. Under normal conditions, a guanine matches with a cytosine, and an adenine with a thymine.
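
That pairing rule is easy to express in code. The toy Python function below is purely illustrative and simply returns the sequence that would sit opposite a given strand.

    # Watson-Crick base pairing: A pairs with T, G pairs with C. Illustrative only.
    PAIR = {'A': 'T', 'T': 'A', 'G': 'C', 'C': 'G'}

    def complement(strand):
        """Sequence of the bases that would pair opposite the given strand."""
        return ''.join(PAIR[base] for base in strand)

    print(complement("GATTACA"))  # -> CTAATGT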

How DNA Double-Strand Breaks Are Repaired

But during the course of a cell's life, the staircase can become severed into two molecules. These breaks must be repaired if the cells are to accurately replicate and pass on their genetic material. Most breaks are quickly and accurately fixed during the process of homologous recombination (HR), which uses an intact copy of DNA as a template for repair.

However, there is a second, error-prone process called end-joining repair. Here, the broken, double-stranded ends are stitched back together without regard to the original sequence. The ends of the broken strands may be altered by removal or addition of small DNA segments, which can change the genomic architecture.

In a previous paper, McVey and doctoral student Amy Marie Yu were able to demonstrate an alternative form of end-joining by studying how repair proceeds in the absence of DNA ligase 4, an important protein that links together two broken DNA ends.

After analyzing hundreds of inaccurately repaired breaks in the fruit fly Drosophila melanogaster, the scientists observed two things. First, extra nucleotides were often inserted into the DNA strands at the point of the break. Second, the insertions were closely related to the original DNA sequences directly adjacent to the break.

Polymerase Theta's Role in DNA Repair and Cancer

In the current PLoS Genetics paper, McVey, Yu and undergraduate Sze Ham Chan showed that polymerase theta plays a dominant role in this alternative repair process. First, it reads the genetic material in DNA adjacent to the break and makes a copy of it. The newly copied DNA can then be used as a molecular splint that holds the broken ends together until they can be permanently joined. In addition, the scientists speculated that the PolQ protein also has the ability to unwind DNA sequences near a break, thereby facilitating alternative end-joining.
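
A toy sketch can make the "molecular splint" idea concrete. In the hypothetical Python example below, a few bases adjacent to the break are copied and re-inserted at the junction, which is why the insertions seen in the fly experiments resembled nearby sequence; the sequences and the copy length are invented for illustration.

    # Toy model of templated ("splint") insertion at a double-strand break.
    # Sequences and the 4-base copy length are invented for illustration.
    def templated_end_joining(left_end, right_end, copy_len=4):
        """Join two broken ends, inserting a copy of sequence adjacent to the break."""
        splint = left_end[-copy_len:]            # copy bases just left of the break
        return left_end + splint + right_end     # repaired molecule carries an insertion

    print(templated_end_joining("ACGTTGCA", "GGATCCTA"))
    # -> ACGTTGCATGCAGGATCCTA (the duplicated 'TGCA' is the templated insertion)
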
Other research groups have previously shown that levels of the PolQ protein are higher in several types of human tumors. McVey and his team are currently working to determine if a PolQ-dependent type of alternative end-joining is involved in the development of cancer in people. If this is indeed the case, the PolQ protein could represent a novel target for the development of new cancer drugs.
"Our first goal is to determine which parts of PolQ are required for its role in alternative end-joining," McVey says. "This will give us a road map for determining how its activity might be altered in a clinical setting."
This work was funded by grants from the National Science Foundation and the Ellison Medical Foundation.


Urgent Computing: Exploring Supercomputing's New Role


Pete Beckman, Mathematics and Computer Science Division, Argonne National Laboratory
Large-scale parallel simulation and modeling have changed our world. Today, supercomputers are not just for research and development or scientific exploration; they have become an integral part of many industries. A brief look at the Top 500 list of the world’s largest supercomputers shows some of the business sectors that now rely on supercomputers: finance, entertainment and digital media, transportation, pharmaceuticals, aerospace, petroleum, and biotechnology. While supercomputing may not yet be considered commonplace, the world has embraced high-performance computation (HPC). Demand for skilled computational scientists is high, and colleges and universities are struggling to meet the need for cross-disciplinary engineers who are skilled in both computation and an applied scientific domain. It is on this stage that a new breed of high-fidelity simulations is emerging – applications that need urgent access to supercomputing resources.

For some simulations, insights gained through supercomputer computation have immediate application. Consider, for example, an HPC application that could quickly calculate the exact location and magnitude of tsunamis immediately after an undersea earthquake. Since the evacuation of local residents is both costly and potentially dangerous, promptly beginning an orderly evacuation in only those areas directly threatened could save lives. Similarly, imagine a parallel wildfire simulation that coupled weather, terrain, and fuel models and could accurately predict the path of a wildfire days in advance. Firefighters could cut firebreaks exactly where they would be most effective. For these urgent computations, late results are useless results. As the HPC community builds increasingly realistic models, applications are emerging that need on-demand computation. Looking into the future, we might imagine event-driven and data-driven HPC applications running on-demand to predict everything from where to look for a lost boater after a storm to tracking a toxic plume after an industrial or transportation accident.
Of course, as we build confidence in these emerging computations, they will move from the scientist’s workbench and into critical decision-making paths. Where will the supercomputer cycles come from? It is straightforward to imagine building a supercomputer specifically for these emerging urgent computations. Even if such a system led the Top 500 list, however, it would not be as powerful as the combined computational might of the world’s five largest computers. Aggregating the country’s largest resources to solve a critical, national-scale computational challenge could provide an order of magnitude more power than attempting to rely on a prebuilt system for on-demand computation.

Furthermore, costly public infrastructure, idle except during an emergency, is inefficient. A better approach, when practical, is to temporarily use public resources during times of crisis. For example, rather than build a nationwide set of radio towers and transmitters to disseminate emergency information, the government requires that large TV and radio stations participate in the Emergency Alert System. When public broadcasts are needed, most often in the form of localized severe weather, broadcasters are automatically interrupted, and critical information is shared with the public.

As high-fidelity computation becomes more capable of predicting the future and is increasingly used for immediate decision support, governments and local municipalities must build infrastructures that can link together the largest resources from the NSF, DOE, NASA, and the NIH and use them to run time-critical urgent computations. For embarrassingly parallel applications, we might look to the emerging market for “cloud computing.” Many of the world’s largest Internet companies have embraced a model of providing software as a service. Amazon’s Elastic Compute Cloud (EC2), for example, can provide thousands of virtual machine images rapidly and cost-effectively. For applications with relatively small network communication needs, it might be most effective for urgent, on-demand computations simply to be injected into the nation’s existing Internet infrastructure supported by Amazon, Yahoo, Google, and Microsoft.
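
For the embarrassingly parallel case described above, the work splits into independent pieces that need almost no communication, which is exactly what makes it a good fit for rented cloud capacity. The Python sketch below shows only the pattern; it uses the standard library as a stand-in rather than any real EC2 API, and run_plume_member is a hypothetical placeholder for an actual dispersion model.

    # Pattern sketch for an embarrassingly parallel urgent computation:
    # independent ensemble members farmed out to whatever workers are available.
    from concurrent.futures import ProcessPoolExecutor

    def run_plume_member(seed):
        """Hypothetical stand-in: run one ensemble member of a toxic-plume forecast."""
        # ...perturb initial conditions with `seed` and integrate the model here...
        return {"seed": seed, "peak_concentration": 0.0}

    def run_ensemble(n_members=100, max_workers=8):
        """Run all members independently; results only need gathering at the end."""
        with ProcessPoolExecutor(max_workers=max_workers) as pool:
            return list(pool.map(run_plume_member, range(n_members)))

    if __name__ == "__main__":
        results = run_ensemble()
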
In April 2007, an urgent computing conference at Argonne National Laboratory brought together an international group of scientists to discuss how on-demand computations for HPC might be supported and change the landscape of predictive modeling. The organizers of that workshop realized that CTWatch Quarterly would be the ideal venue for exploring this new field. This issue describes how applications, urgent-computing infrastructures, and computational resources can support this new role for computing.
