Tuesday, July 25, 2017

Scientists Get Best Measure of Star-forming Material in Galaxy Clusters in Early Universe




The Tadpole Galaxy is a disrupted spiral galaxy showing streams of gas stripped by gravitational interaction with another galaxy. Molecular gas is the required ingredient to form stars in galaxies in the early universe. Credit: Hubble Legacy Archive, ESA, NASA and Bill Snyder.



The international Spitzer Adaptation of the Red-sequence Cluster Survey (SpARCS) collaboration based at the University of California, Riverside has combined observations from several of the world’s most powerful telescopes to carry out one of the largest studies yet of molecular gas – the raw material which fuels star formation throughout the universe – in three of the most distant clusters of galaxies ever found, detected as they appeared when the universe was only four billion years old.

Results were recently published in The Astrophysical Journal Letters. Allison Noble, a postdoctoral researcher at the Massachusetts Institute of Technology, led this newest research from the SpARCS collaboration.

Clusters are rare regions of the universe consisting of tight groups of hundreds of galaxies containing trillions of stars, as well as hot gas and mysterious dark matter. First, the research team used spectroscopic observations from the W. M. Keck Observatory on Mauna Kea, Hawai’i, and the Very Large Telescope in Chile that confirmed 11 galaxies were star-forming members of the three massive clusters. Next, the researchers took images through multiple filters from NASA’s Hubble Space Telescope, which revealed a surprising diversity in the galaxies’ appearance, with some galaxies having already formed large disks with spiral arms.

One of the telescopes the SpARCS scientists used is the extremely sensitive Atacama Large Millimeter Array (ALMA) telescope capable of directly detecting radio waves emitted from the molecular gas found in galaxies in the early universe. ALMA observations allowed the scientists to determine the amount of molecular gas in each galaxy, and provided the best measurement yet of how much fuel was available to form stars.

The researchers compared the properties of galaxies in these clusters with the properties of “field galaxies” (galaxies found in more typical environments with fewer close neighbors). To their surprise, they discovered that cluster galaxies had higher amounts of molecular gas relative to the amount of stars in the galaxy, compared to field galaxies. The finding puzzled the team because it has long been known that when a galaxy falls into a cluster, interactions with other cluster galaxies and hot gas accelerate the shut off of its star formation relative to that of a similar field galaxy (the process is known as environmental quenching).

“This is definitely an intriguing result,” said Gillian Wilson, a professor of physics and astronomy at UC Riverside and the leader of the SpARCS collaboration. “If cluster galaxies have more fuel available to them, you might expect them to be forming more stars than field galaxies, and yet they are not.”

Noble, a SpARCS collaborator and the study’s leader, suggests several possible explanations: It is possible that something about being in the hot, harsh cluster environment surrounded by many neighboring galaxies perturbs the molecular gas in cluster galaxies such that a smaller fraction of that gas actively forms stars. Alternatively, it is possible that an environmental process, such as increased merging activity in cluster galaxies, results in the observed differences between the cluster and field galaxy populations.

“While the current study does not answer the question of which physical process is primarily responsible for causing the higher amounts of molecular gas, it provides the most accurate measurement yet of how much molecular gas exists in galaxies in clusters in the early universe,” Wilson said.

The SpARCS team has developed new techniques using infrared observations from NASA’s Spitzer Space Telescope to identify hundreds of previously undiscovered clusters of galaxies in the early universe. In the future, they plan to study a larger sample of clusters. The team has recently been awarded additional time on ALMA, the W. M. Keck Observatory, and the Hubble Space Telescope to continue investigating how the neighborhood in which a galaxy lives determines for how long it can form stars.

Credit: ucr.edu

Astronomer Develops New Ways to See the Formation of Stars and Discovers Never-Before Seen Areas in Our Milky Way Galaxy




A representative color image of infrared light from an infant star cluster: Young stars predominantly show up as orange. Regions where gas is being heated by intense radiation from luminous young stars show up as white. Newly discovered jets from the young stars show up as blue in the image. Credit: Adler Planetarium




A research team led by Adler Planetarium astronomer Dr. Grace Wolf-Chase has discovered new evidence of stars forming in our Milky Way Galaxy. By using a telescope equipped to detect infrared light invisible to our eyes, this exciting new science is revealing how stars, including our very own Sun, grow up within clusters and groups. The Astrophysical Journal has published a paper on the subject titled, “MHOs toward HMOs: A Search for Molecular Hydrogen Emission-Line Objects toward High-Mass Outflows.”

The team found huge gas clouds moving outward from areas where “baby” stars are forming, using a new way of disentangling these outflows from other processes in densely-populated stellar nurseries. These stellar nurseries can produce dozens or even hundreds of stars with different sizes and masses.

“The Sun, though isolated from other stars today, is thought to have formed in a cluster with many other stars, so the environments we’re studying can tell us a lot about the origin of our own Solar System,” said Wolf-Chase.

Stars form when cold, rotating clouds of gas and dust in space are pulled together by gravity into flattened “disks” that spin faster as they shrink, similar to what happens when twirling figure skaters pull their outstretched arms in toward their bodies. In order for a star to form at the center of a spinning disk, the rotation of the disk must slow down. This happens through powerful outflows of gas that are channeled into tight streams, known as “jets.” Jets can span more than 10 trillion miles, even though the disks that launch them are “mere” billions of miles across (comparable to the size of our Solar System). Since planets can form in the disks, the presence of a jet can be a good indicator of a nascent planetary system, even when the disk isn’t observed directly. Stars more than eight times as massive as the Sun bathe their surroundings in intense ultraviolet radiation that destroys their natal clouds quickly, so it’s not clear if these massive stars develop disks and jets similar to stars like the Sun.
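
To put the figure-skater analogy in symbols (a standard textbook angular-momentum argument, not a calculation from the published study), consider a parcel of gas of mass m orbiting at radius r with speed v. Conservation of angular momentum gives:

```latex
L = m v r = \mathrm{const.}
\quad\Rightarrow\quad
v \propto \frac{1}{r},
\qquad
\omega = \frac{v}{r} \propto \frac{1}{r^{2}}
```

So a cloud that contracts to half its radius spins roughly four times faster, which is why the disk must shed angular momentum (for example, through jets) before a star can finish forming at its center.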

The researchers used an instrument called NICFPS (which stands for Near-Infrared Camera and Fabry-Perot Spectrometer) on the Astrophysical Research Consortium (ARC) 3.5-meter telescope at the Apache Point Observatory (APO) in Sunspot, New Mexico. NICFPS peered into 26 dusty clouds thought to be forming clusters containing massive stars. Using a combination of infrared filters that allowed them to distinguish jets launched by infant stars from other types of light produced by the radiation in these massive stellar nurseries, they identified 36 jets across 22 of the regions. These results provide compelling evidence that, like their lower-mass siblings, massive stars also launch powerful jets. The jet shuts off shortly after radiation from the massive star begins to disrupt its environment.

Grace Wolf-Chase, an Astronomer at the Adler Planetarium, is first author of this paper. Coauthors are former Postdoctoral Scholar at the Adler Planetarium, Kim Arvidsson, who is currently Assistant Professor of Physics in the Trull School of Sciences and Mathematics at Schreiner University in Kerrville, Texas; and Michael Smutko, Professor of Instruction in the Department of Physics and Astronomy at Northwestern University and former Astronomer at the Adler Planetarium. Astronomical research at the Adler Planetarium is funded in part through a generous grant from the Brinson Foundation. This project also received a Research Seed Grant through NASA’s Illinois Space Grant Consortium.

Holographic Imaging Could Be Used to Detect Signs of Life in Space




Plumes of water ice and vapor spray from many locations near the south pole of Saturn's moon Enceladus, as documented by the Cassini-Huygens mission. Credit: NASA/JPL/Space Science Institute




We may be capable of finding microbes in space—but if we did, could we tell what they were, and that they were alive? This month the journal Astrobiology is publishing a special issue dedicated to the search for signs of life on Saturn's icy moon Enceladus. Included is a paper from Caltech's Jay Nadeau and colleagues offering evidence that a technique called digital holographic microscopy, which uses lasers to record 3-D images, may be our best bet for spotting extraterrestrial microbes.

No probe since NASA's Viking program in the late 1970s has explicitly searched for extraterrestrial life—that is, for actual living organisms. Rather, the focus has been on finding water. Enceladus has a lot of water—an ocean's worth, hidden beneath an icy shell that coats the entire surface. But even if life does exist there in some microbial fashion, the difficulty for scientists on Earth is identifying those microbes from 790 million miles away.

"It's harder to distinguish between a microbe and a speck of dust than you'd think," says Nadeau, research professor of medical engineering and aerospace in the Division of Engineering and Applied Science. "You have to differentiate between Brownian motion, which is the random motion of matter, and the intentional, self-directed motion of a living organism."

Enceladus is the sixth-largest moon of Saturn, and is 100,000 times less massive than Earth. As such, Enceladus has an escape velocity—the minimum speed needed for an object on the moon to escape its surface—of just 239 meters per second. That is a fraction of Earth's, which is a little over 11,000 meters per second.
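
As a quick check of that number (using approximate published values for Enceladus's mass and radius, which are not stated in the article), the escape velocity follows from the standard formula:

```latex
v_{\mathrm{esc}} = \sqrt{\frac{2GM}{R}}
\approx \sqrt{\frac{2 \times (6.67\times10^{-11}\ \mathrm{m^{3}\,kg^{-1}\,s^{-2}}) \times (1.08\times10^{20}\ \mathrm{kg})}{2.52\times10^{5}\ \mathrm{m}}}
\approx 239\ \mathrm{m\,s^{-1}}
```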

Enceladus's minuscule escape velocity allows for an unusual phenomenon: enormous geysers, venting water vapor through cracks in the moon's icy shell, regularly jet out into space. When the Saturn probe Cassini flew by Enceladus in 2005, it spotted water vapor plumes in the south polar region blasting icy particles at nearly 2,000 kilometers per hour to an altitude of nearly 500 kilometers above the surface. Scientists calculated that as much as 250 kilograms of water vapor were released every second in each plume. Since those first observations, more than a hundred geysers have been spotted. This water is thought to replenish Saturn's diaphanous E ring, which would otherwise dissipate quickly, and was the subject of a recent announcement by NASA describing Enceladus as an "ocean world" that is the closest NASA has come to finding a place with the necessary ingredients for habitability.

Water blasting out into space offers a rare opportunity, says Nadeau. While landing on a foreign body is difficult and costly, a cheaper and easier option might be to send a probe to Enceladus and pass it through the jets, where it would collect water samples that could possibly contain microbes.

Assuming a probe were to do so, it would open up a few questions for engineers like Nadeau, who studies microbes in extreme environments. Could microbes survive a journey in one of those jets? If so, how could a probe collect samples without destroying those microbes? And if samples are collected, how could they be identified as living cells?

The problem with searching for microbes in a sample of water is that they can be difficult to identify. "The hardest thing about bacteria is that they just don't have a lot of cellular features," Nadeau says. Bacteria are usually blob-shaped and always tiny—smaller in diameter than a strand of hair. "Sometimes telling the difference between them and sand grains is very difficult," Nadeau says.

Some strategies for demonstrating that a microscopic speck is actually a living microbe involve searching for patterns in its structure or studying its specific chemical composition. While these methods are useful, they should be used in conjunction with direct observations of potential microbes, Nadeau says.

"Looking at patterns and chemistry is useful, but I think we need to take a step back and look for more general characteristics of living things, like the presence of motion. That is, if you see an E. coli, you know that it is alive—and not, say, a grain of sand—because of the way it is moving," she says. In earlier work, Nadeau suggested that the movement exhibited by many living organisms could potentially be used as a robust, chemistry-independent biosignature for extraterrestrial life. The motion of living organisms can also be triggered or enhanced by "feeding" the microbes electrons and watching them grow more active.

To study the motion of potential microbes from Enceladus's plumes, Nadeau proposes using an instrument called a digital holographic microscope that has been modified specifically for astrobiology.

In digital holographic microscopy, an object is illuminated with a laser and the light that bounces off the object and back to a detector is measured. The scattered light carries information about its amplitude (the intensity) and about its phase (a separate property that can be used to tell how far the light traveled after it scattered). With those two pieces of information, a computer can reconstruct a 3-D image of the object—one that can show motion through all three dimensions.
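
For readers curious how that reconstruction works in practice, here is a minimal numerical-refocusing sketch based on the angular-spectrum method, a common technique in digital holographic microscopy. It is purely illustrative: the function and parameter names are ours, and it is not the pipeline used on Nadeau's instrument.

```python
import numpy as np

def refocus_hologram(hologram, wavelength, pixel_size, z):
    """Numerically refocus a recorded in-line hologram to a plane at
    distance z from the detector (angular-spectrum method). Returns the
    amplitude and phase of the reconstructed field at that plane."""
    ny, nx = hologram.shape
    k = 2.0 * np.pi / wavelength

    # Spatial-frequency grids matching the detector sampling
    fx = np.fft.fftfreq(nx, d=pixel_size)
    fy = np.fft.fftfreq(ny, d=pixel_size)
    FX, FY = np.meshgrid(fx, fy)

    # Free-space transfer function; evanescent components are suppressed
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    H = np.where(arg > 0,
                 np.exp(1j * k * z * np.sqrt(np.maximum(arg, 0.0))),
                 0)

    # Propagate the recorded field: FFT, apply transfer function, inverse FFT
    field = np.fft.ifft2(np.fft.fft2(hologram) * H)
    return np.abs(field), np.angle(field)

# Refocusing the same hologram over a stack of z values builds up a 3-D
# volume in which a swimming microbe can be tracked frame by frame.
```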

"Digital holographic microscopy allows you to see and track even the tiniest of motions," Nadeau says. Furthermore, by tagging potential microbes with fluorescent dyes that bind to broad classes of molecules that are likely to be indicators of life—proteins, sugars, lipids, and nucleic acids—"you can tell what the microbes are made of," she says.

To study the technology's potential utility for analyzing extraterrestrial samples, Nadeau and her colleagues obtained samples of frigid water from the Arctic, which is sparsely populated with bacteria; those that are present are rendered sluggish by the cold temperatures.

With holographic microscopy, Nadeau was able to identify organisms with population densities of just 1,000 cells per milliliter of volume, similar to what exists in some of the most extreme environments on Earth, such as subglacial lakes. For comparison, the open ocean contains about 10,000 cells per milliliter and a typical pond might have 1–10 million cells per milliliter. That low threshold for detection, coupled with the system's ability to test a lot of samples quickly (at a rate of about one milliliter per hour) and its few moving parts, makes it ideal for astrobiology, Nadeau says. 

Next, the team will attempt to replicate their results using samples from other microbe-poor regions on Earth, such as Antarctica.

Nadeau collaborated with Caltech graduate student Manuel Bedrossian and Chris Lindensmith of JPL.

Credit: caltech.edu

Flashes of Light on the Dark Matter




On the left, the cosmic web in the standard cold dark matter scenario; on the right, how it would look in the Fuzzy Dark Matter model. The curved lines in both panels show how absorption by the neutral hydrogen in the cosmic web behaves in the two models. The right curve does not agree with the data, while the left one does. Credit: Matteo Viel



A web that passes through infinite intergalactic spaces, a dense cosmic forest illuminated by very distant lights and a huge enigma to solve. These are the picturesque ingredients of a scientific study - carried out by an international team composed of researchers from the International School for Advanced Studies (SISSA) and the Abdus Salam International Center for Theoretical Physics (ICTP) in Trieste, the Institute of Astronomy of Cambridge and the University of Washington - that adds an important element to our understanding of one of the fundamental components of our Universe: dark matter.

In order to study its properties, scientists analyzed the interaction of the "cosmic web" - a network of filaments made up of gas and dark matter spanning the whole Universe - with the light coming from very distant quasars and galaxies. Photons interacting with the hydrogen in the cosmic filaments create many absorption lines known as the "Lyman-alpha forest". This microscopic interaction reveals several important properties of the dark matter at cosmological distances. The results further support the theory of Cold Dark Matter, which is composed of particles that move very slowly. Moreover, for the first time, they highlight an incompatibility with another model, the Fuzzy Dark Matter model, in which dark matter particles have larger velocities. The research was carried out through simulations performed on international parallel supercomputers and has recently been published in Physical Review Letters.

Although it constitutes an important part of our cosmos, dark matter is not directly observable: it does not emit electromagnetic radiation and is visible only through its gravitational effects. Its nature, moreover, remains a deep mystery. Various theories attempt to explain it. In this research, scientists investigated two of them: the so-called Cold Dark Matter, considered a paradigm of modern cosmology, and an alternative model called Fuzzy Dark Matter (FDM), in which dark matter is thought to consist of ultralight bosons with a non-negligible pressure at small scales. To carry out their investigations, the scientists examined the cosmic web by analyzing the so-called Lyman-alpha forest, a series of absorption lines produced in the light coming from very distant and extremely luminous sources as it passes through intergalactic space on its way toward Earth's telescopes. The atomic interaction of photons with the hydrogen present in the cosmic filaments is used to study the properties of the cosmos and of dark matter at enormous distances.

Through simulations carried out with supercomputers, the researchers reproduced the interaction of this light with the cosmic web and were thus able to infer some of the characteristics of the particles that make up dark matter. In particular, the evidence showed for the first time that the particle mass required by the FDM model is not consistent with the Lyman-alpha forest observed by the Keck telescope (Hawaii, US) and the Very Large Telescope (European Southern Observatory, Chile). In short, the study does not appear to confirm the theory of Fuzzy Dark Matter. The data instead support the scenario envisaged by the Cold Dark Matter model.

The results - the scientists say - are important because they make it possible to build new theoretical models describing dark matter and new hypotheses about the characteristics of the cosmos. They can also provide useful indications for laboratory experiments and can guide observational efforts aimed at making progress on this fascinating scientific question.

Credit: sissa.it

Hunting Molecules with the MWA




This image shows the centre of the Milky Way as seen by the Galactic Centre Molecular Line Survey. Credit: Chenoa Tremblay (ICRAR-Curtin)



Astronomers have used an Australian radio telescope to observe molecular signatures from stars, gas and dust in our galaxy, which could lead to the detection of complex molecules that are precursors to life. Using the Murchison Widefield Array (MWA), a radio telescope located in the Murchison region of Western Australia, the team successfully detected two molecules called the mercapto radical (SH) and nitric oxide (NO).

“The molecular transitions we saw are from slow variable stars—stars at the end of their lives that are becoming unstable,” said Chenoa Tremblay from the International Centre for Radio Astronomy Research (ICRAR) and Curtin University.

“We use molecules to probe the Milky Way, to better understand the chemical and physical environments of stars, gas and dust,” she said.

“One of the unique aspects of this survey is that until now, no one has ever reported detections of molecules within the 70-300MHz frequency range of the MWA and this is the widest field-of-view molecular survey of the Milky Way ever published.”

Since the 1980s, frequencies greater than 80GHz have been used for this type of work due to the freedom from radio frequency interference emitted by our mobile phones, televisions and orbiting satellites. But the extreme “radio quietness” of the Murchison Radio-astronomy Observatory, where the telescope is located, allows astronomers to study molecular signatures from stars and star-forming regions at lower frequencies.

“Before this study, the mercapto radical had only been seen twice before at infrared wavelengths, in a different part of the electromagnetic spectrum,” said Dr Maria Cunningham from the University of New South Wales.

“This shows that molecules are emitting photons detectable around 100MHz and we can detect these molecular signatures using the MWA—it’s very exciting for us,” she said.

Following on from the pilot study, a survey of the Orion region is now in progress, again using the MWA, in the frequency range of 99-270MHz. The Orion nebula is a chemical-rich environment and one of the closest star-forming regions to Earth. The aim is to detect more chemical tracers in stars, compare these regions to the observations from the Galactic Centre pilot region and to better understand the emission mechanisms of these molecules.

“This new technique paves the way for deeper surveys that can probe the Milky Way and other galaxies in search of molecular precursors to life,” said Tremblay.

“We might even discover signatures from long chain amino acids in the cold gas environments we’re observing—which is where they are likely to be most stable.”

‘A First Look for Molecules between 103 and 133MHz using the Murchison Widefield Array’ was published in the Monthly Notices of the Royal Astronomical Society on July 21, 2017.

Credit: icrar.org

Superluminous Supernova Marks the Death of a Star at Cosmic High Noon




The yellow arrow marks the superluminous supernova DES15E2mlf in this false-color image of the surrounding field. The image was taken with the Dark Energy Camera (DECam) gri-band filters mounted on the Blanco 4-meter telescope on December 28, 2015, around the time when the supernova reached its peak luminosity. (Observers: D. Gerdes and S. Jouvel)




The death of a massive star in a distant galaxy 10 billion years ago created a rare superluminous supernova that astronomers say is one of the most distant ever discovered. The brilliant explosion, more than three times as bright as the 100 billion stars of our Milky Way galaxy combined, occurred about 3.5 billion years after the big bang at a period known as "cosmic high noon," when the rate of star formation in the universe reached its peak.

Superluminous supernovae are 10 to 100 times brighter than a typical supernova resulting from the collapse of a massive star. But astronomers still don't know exactly what kinds of stars give rise to their extreme luminosity or what physical processes are involved.

The supernova known as DES15E2mlf is unusual even among the small number of superluminous supernovae astronomers have detected so far. It was initially detected in November 2015 by the Dark Energy Survey (DES) collaboration using the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory in Chile. Follow-up observations to measure the distance and obtain detailed spectra of the supernova were conducted with the Gemini Multi-Object Spectrograph on the 8-meter Gemini South telescope.

The investigation was led by UC Santa Cruz astronomers Yen-Chen Pan and Ryan Foley as part of an international team of DES collaborators. The researchers reported their findings in a paper published July 21 in the Monthly Notices of the Royal Astronomical Society.

The new observations may provide clues to the nature of stars and galaxies during peak star formation. Supernovae are important in the evolution of galaxies because their explosions enrich the interstellar gas from which new stars form with elements heavier than helium (which astronomers call "metals").

"It's important simply to know that very massive stars were exploding at that time," said Foley, an assistant professor of astronomy and astrophysics at UC Santa Cruz. "What we really want to know is the relative rate of superluminous supernovae to normal supernovae, but we can't yet make that comparison because normal supernovae are too faint to see at that distance. So we don't know if this atypical supernova is telling us something special about that time 10 billion years ago."

Previous observations of superluminous supernovae found they typically reside in low-mass or dwarf galaxies, which tend to be less enriched in metals than more massive galaxies. The host galaxy of DES15E2mlf, however, is a fairly massive, normal-looking galaxy.

"The current idea is that a low-metal environment is important in creating superluminous supernovae, and that's why they tend to occur in low mass galaxies, but DES15E2mlf is in a relatively massive galaxy compared to the typical host galaxy for superluminous supernovae," said Pan, a postdoctoral researcher at UC Santa Cruz and first author of the paper.

Foley explained that stars with fewer heavy elements retain a larger fraction of their mass when they die, which may cause a bigger explosion when the star exhausts its fuel supply and collapses.

"We know metallicity affects the life of a star and how it dies, so finding this superluminous supernova in a higher-mass galaxy goes counter to current thinking," Foley said. "But we are looking so far back in time, this galaxy would have had less time to create metals, so it may be that at these earlier times in the universe's history, even high-mass galaxies had low enough metal content to create these extraordinary stellar explosions. At some point, the Milky Way also had these conditions and might have also produced a lot of these explosions."

"Although many puzzles remain, the ability to observe these unusual supernovae at such great distances provides valuable information about the most massive stars and about an important period in the evolution of galaxies," said Mat Smith, a postdoctoral researcher at University of Southampton. The Dark Energy Survey has discovered a number of superluminous supernovae and continues to see more distant cosmic explosions revealing how stars exploded during the strongest period of star formation.

In addition to Pan, Foley, and Smith, the coauthors of the paper include Lluís Galbany of the University of Pittsburgh, and other members of the DES collaboration from more than 40 institutions. This research was funded by the National Science Foundation, the Alfred P. Sloan Foundation, and the David and Lucile Packard Foundation.

Credit: ucsc.edu

Scientists Spy New Evidence of Water in the Moon’s Interior




Evidence from ancient volcanic deposits suggests that lunar magma contained substantial amounts of water, bolstering the idea that the Moon's interior is water-rich. Credit: Olga Prilipko Huber




A new study of satellite data finds that numerous volcanic deposits distributed across the surface of the Moon contain unusually high amounts of trapped water compared with surrounding terrains. The finding of water in these ancient deposits, which are believed to consist of glass beads formed by the explosive eruption of magma coming from the deep lunar interior, bolsters the idea that the lunar mantle is surprisingly water-rich.

Scientists had assumed for years that the interior of the Moon had been largely depleted of water and other volatile compounds. That began to change in 2008, when a research team including Brown University geologist Alberto Saal detected trace amounts of water in some of the volcanic glass beads brought back to Earth from the Apollo 15 and 17 missions to the Moon. In 2011, further study of tiny crystalline formations within those beads revealed that they actually contain similar amounts of water as some basalts on Earth. That suggests that the Moon’s mantle — parts of it, at least — contains as much water as Earth’s.

“The key question is whether those Apollo samples represent the bulk conditions of the lunar interior or instead represent unusual or perhaps anomalous water-rich regions within an otherwise ‘dry’ mantle,” said Ralph Milliken, lead author of the new research and an associate professor in Brown’s Department of Earth, Environmental and Planetary Sciences. “By looking at the orbital data, we can examine the large pyroclastic deposits on the Moon that were never sampled by the Apollo or Luna missions. The fact that nearly all of them exhibit signatures of water suggests that the Apollo samples are not anomalous, so it may be that the bulk interior of the Moon is wet.”

The research, which Milliken co-authored with Shuai Li, a postdoctoral researcher at the University of Hawaii and a recent Brown Ph.D. graduate, is published in Nature Geoscience. The work was part of Li's Ph.D. thesis.

Detecting the water content of lunar volcanic deposits using orbital instruments is no easy task. Scientists use orbital spectrometers to measure the light that bounces off a planetary surface. By looking at which wavelengths of light are absorbed or reflected by the surface, scientists can get an idea of which minerals and other compounds are present.

The problem is that the lunar surface heats up over the course of a day, especially at the latitudes where these pyroclastic deposits are located. That means that in addition to the light reflected from the surface, the spectrometer also ends up measuring heat.

“That thermally emitted radiation happens at the same wavelengths that we need to use to look for water,” Milliken said. “So in order to say with any confidence that water is present, we first need to account for and remove the thermally emitted component.”
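
The idea behind such a correction can be sketched in a few lines: model the thermal emission as a blackbody at the local surface temperature and subtract it from the measured radiance before computing reflectance. The toy example below is our own illustration of that general approach (the Lambertian conversion and parameter names are assumptions); as described next, the published analysis relies on laboratory spectra of Apollo samples and a much more detailed temperature model.

```python
import numpy as np

H_PLANCK = 6.626e-34   # Planck constant, J s
C_LIGHT = 2.998e8      # speed of light, m/s
K_BOLTZ = 1.381e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance B(lambda, T) in W m^-2 sr^-1 m^-1."""
    a = 2.0 * H_PLANCK * C_LIGHT**2 / wavelength_m**5
    b = np.expm1(H_PLANCK * C_LIGHT / (wavelength_m * K_BOLTZ * temp_k))
    return a / b

def thermally_corrected_reflectance(measured_radiance, solar_irradiance,
                                    wavelength_m, surface_temp_k,
                                    emissivity=0.95):
    """Subtract the estimated thermal emission from a measured spectrum,
    then convert the remaining reflected sunlight into a reflectance
    factor (simple Lambertian approximation)."""
    thermal = emissivity * planck_radiance(wavelength_m, surface_temp_k)
    reflected = measured_radiance - thermal
    return np.pi * reflected / solar_irradiance
```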

To do that, Li and Milliken used laboratory-based measurements of samples returned from the Apollo missions, combined with a detailed temperature profile of the areas of interest on the Moon’s surface. Using the new thermal correction, the researchers looked at data from the Moon Mineralogy Mapper, an imaging spectrometer that flew aboard India’s Chandrayaan-1 lunar orbiter.

The researchers found evidence of water in nearly all of the large pyroclastic deposits that had been previously mapped across the Moon’s surface, including deposits near the Apollo 15 and 17 landing sites where the water-bearing glass bead samples were collected.

“The distribution of these water-rich deposits is the key thing,” Milliken said. “They’re spread across the surface, which tells us that the water found in the Apollo samples isn’t a one-off. Lunar pyroclastics seem to be universally water-rich, which suggests the same may be true of the mantle.”

The idea that the interior of the Moon is water-rich raises interesting questions about the Moon’s formation. Scientists think the Moon formed from debris left behind after an object about the size of Mars slammed into the Earth very early in solar system history. One of the reasons scientists had assumed the Moon’s interior should be dry is that it seems unlikely that any of the hydrogen needed to form water could have survived the heat of that impact.

“The growing evidence for water inside the Moon suggest that water did somehow survive, or that it was brought in shortly after the impact by asteroids or comets before the Moon had completely solidified,” Li said. “The exact origin of water in the lunar interior is still a big question.”

In addition to shedding light on the water story in the early solar system, the research could also have implications for future lunar exploration. The volcanic beads don’t contain a lot of water — about 0.05 percent by weight, the researchers say — but the deposits are large, and the water could potentially be extracted.

“Other studies have suggested the presence of water ice in shadowed regions at the lunar poles, but the pyroclastic deposits are at locations that may be easier to access,” Li said. “Anything that helps save future lunar explorers from having to bring lots of water from home is a big step forward, and our results suggest a new alternative.”

Credit: brown.edu

Choose Your Star: First Gaia Data Release Catalogs More than a Billion Celestial Objects




Gaia's first sky map. Credit: ESA/Gaia/DPAC. Acknowledgement: A. Moitinho & M. Barros (CENTRA – University of Lisbon), on behalf of DPAC.



The European Space Agency’s (ESA) Gaia satellite is on a crucial mission to create the most detailed 3D map ever made of our Milky Way galaxy. Last year, the agency published the first data release from Gaia, which contains more than one billion stars along with information about their brightness and precise positions on the sky.

The Gaia Data Release 1 (or DR1 for short) makes finding an interesting star almost as easy as browsing an online catalog, allowing astronomers to investigate a wide variety of peculiar objects in the sky.

DR1 is a real treasure trove for astronomers studying stars in our galaxy. The catalog consists of astrometry and photometry data for over one billion sources brighter than magnitude 20.7 in the white-light photometric band G of the Gaia satellite. It is the largest all-sky survey of celestial objects to date.

ESA scientists underline that DR1 shows the density of stars measured by Gaia across the entire sky, and confirms that the satellite has been collecting superb data since the beginning of its operational life in July 2014. They note that Gaia charts the sky at a precision that has never been achieved before.

In particular, DR1 contains about 1.14 billion stars with precise measurements of their position on the sky and brightness. The dataset also allows astronomers to estimate distances and proper motions for about two million stars in common with the earlier Hipparcos and Tycho-2 catalogs, based on data from ESA's Hipparcos mission. Moreover, DR1 contains nearly 3,200 variable stars, including details about their brightness variations, as well as positions and brightness for more than 2,000 quasi-stellar objects (quasars).
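
For those who want to browse the catalog themselves, the archive can also be queried programmatically. The snippet below is a minimal sketch using the community astroquery package with an illustrative ADQL query against the gaiadr1.gaia_source table; it is not taken from ESA's documentation.

```python
from astroquery.gaia import Gaia

# Illustrative query: the ten brightest DR1 sources in Gaia's G band,
# with their sky positions. Table and column names follow the archive's
# gaiadr1.gaia_source schema.
job = Gaia.launch_job(
    "SELECT TOP 10 source_id, ra, dec, phot_g_mean_mag "
    "FROM gaiadr1.gaia_source "
    "ORDER BY phot_g_mean_mag ASC"
)
print(job.get_results())
```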

The promising results provided by DR1 leave the researchers hungry for more. Gaia mission scientists are convinced that subsequent data releases will revolutionize our understanding of how stars are distributed and move across the Milky Way.

“1,000 days after launch and thanks to the great work of everyone involved, we are thrilled to present this first dataset and are looking forward to the next release, which will unleash Gaia’s potential to explore our galaxy as we have never seen it before,” Fred Jansen, Gaia mission manager at ESA, said on September 14, 2016, when DR1 was published.

In general, DR1 confirms that Gaia is well on track to achieve its main goal – charting the positions, distances, and motions of one billion stars. The result will be a 3D map of about one percent of the Milky Way’s stellar content – all with unprecedented accuracy. Gaia’s second data release is currently planned for April 2018.

First-ever laser communications terminal to be tested on the Moon






Astrobotic’s Peregrine Lander will deliver a laser communications terminal built by ATLAS to the Moon. Image Credit: Astrobotic
ATLAS Space Operations Inc., a company specializing in cloud-based satellite management and control services, has announced that it will test the first-ever laser communications terminal on the lunar surface. The company has recently signed a contract with Astrobotic Technology Inc., which could see their system fly to the Moon in late 2019.

The terminal, under development by ATLAS, is expected to establish the world’s first laser communication link from the lunar surface. This could mark a significant breakthrough in terms of laser communications for planetary missions.



ATLAS plans to test a laser communications system on the Moon. Photo Credit: Mark Usciak / SpaceFlight Insider


It is hoped this new system could serve to revolutionize deep space communications. Photo Credit: Mark Usciak / SpaceFlight Insider
“Our main goal is to demonstrate the viability of a commercial laser communications capability from the lunar surface. This is a stepping-stone to establishing a permanent infrastructure in support of future lunar activity,” Dan Carey, Director of Marketing at ATLAS Space Operations, told SpaceFlight Insider.

The terminal, which will be sent to the Moon on board Astrobotic’s Peregrine Lander, will carry out the first crucial tests for the development of this potentially ground-breaking technology. This hardware is intended to be a baseline for ATLAS’ future interplanetary communications technology. Carey noted that the tests on the lunar surface will allow the company to “learn the hard lessons closer to home, on the Moon, before venturing beyond.”

By sending its payload to the Moon, ATLAS also aims to provide a platform for the public to access a virtual lunar experience. With this technology and lunar capability, the company would be able to offer the rest of humanity an experience that previously has been reserved for an elite class of explorers.

“Organizations like NASA and MIT/Lincoln Labs are the ones who have developed the revolutionary technology. ATLAS is taking that technology and commercializing it for the advancement of human interest in space. Our company was founded on the ideal of making space accessible to all,” Carey said.

The laser communications terminal is expected to weigh less than 22 pounds (10 kilograms) and to consume less than 60 W while providing up to 1.0 Gbps of data transfer to Earth. The ground segment of the system will consist of Earth Observation Stations, part of the International Laser Ranging Service adapted for this mission, and other commercially available ground terminal technology previously used for laser communications.

For ATLAS’ management, the partnership with Astrobotic is considered to be key to showcase its capabilities. Moreover, both companies share the same vision of space exploration and look forward to a long-lasting collaboration.

“Astrobotic is progressive and forward thinking. Our companies share a common goal in advancing human interest in lunar and interplanetary exploration. We aim to make the ‘heavens’ more available and affordable than ever before to all who have similar interests,” Carey concluded.



Credit: spaceflightinsider.com

Lawbreaking Particles May Point to a Previously Unknown Force in the Universe

A diagram of two protons colliding inside the LHCb experiment.
Credit: CERN


For decades physicists have sought signs of misbehaving particles — evidence of subtle cracks in the "Standard Model" of particle physics, the dominant theory describing the most fundamental building blocks of our universe. Although the Standard Model has proved strikingly accurate, scientists have long known some adjustments will be needed. Now, as a recent review paper in Nature documents, experimenters have started seeing suggestions of particles flouting the theory — but they're not quite the violations theorists were looking for.

The evidence comes from electrons and their more massive cousins, muons and tau leptons. According to the Standard Model, these three particles should behave like differently sized but otherwise identical triplets. But three experiments have produced growing evidence — including results announced in just the last few months — that the particles react differently to some as-yet mysterious influence. The findings are not yet conclusive, but if they hold up, "it would be a complete revolution," says California Institute of Technology theorist Mark Wise.

Tantalizing Signs

A shake-up in the Standard Model would be huge. This theory has formed the bedrock of particle physics research since it was fleshed out in the late 20th century. It carves the universe into twelve elementary particles that make up all matter, plus 'force-carrier' particles that transmit the fundamental forces of nature. (For instance, particles exert electrical or magnetic forces by exchanging transient photons.) Despite its successes, however, the Standard Model predicts nothing that would explain gravity or the dark matter thought to invisibly inhabit space. To marry particle physics with these larger-scale observations, theorists have proposed all manner of "new physics" — matter or forces beyond the Standard Model's menagerie. But most experiments have stubbornly upheld the theory with impressive fidelity, finding no evidence of the hypothesized particles or forces.

Since 2012, though, signs of particle misbehavior have started emerging from a less-explored corner of the Standard Model: a pattern called "lepton universality." Here "lepton" refers to the class of particles including electrons, muons and taus. The Standard Model predicts these three species should commune with one another and other particles in exactly the same way except for differences attributable to their unique masses — a commonality of behavior that accounts for the "universality" in the pattern's name.

The first lepton surprise showed up in results announced in 2012 from the BaBar experiment at the SLAC National Accelerator Laboratory in Menlo Park, Calif. BaBar’s particle accelerator rammed together electrons and their antimatter equivalents, known as positrons. The collisions produced many composite particles that were heavy but unstable: They acted like absurdly radioactive uranium atoms, lasting just fractions of a nanosecond before decaying into smaller and smaller particles. The final products spewed out into the accelerator's detectors, allowing scientists to reconstruct the chain of particle decays. If the Standard Model is right, two of the types of decays examined by the BaBar team should produce taus just 25 to 30 percent as often as electrons, which are lighter and thus easier to make. But that is not what the team saw. Taus were far more common than they should have been, hinting at a difference between taus and electrons beyond their masses.

BaBar's result was just the beginning. Two other experiments, the LHCb experiment at the Large Hadron Collider in Switzerland and the Belle experiment at the High Energy Accelerator Research Organization in Japan, studied the same decays and published similar results in 2015. Belle, like BaBar, collides electrons and positrons. But LHCb collides protons with other protons at much higher energies, and uses different methods to detect the products. Those differences make it harder to wave away the results as experimental mistakes, bolstering the prospect that the anomaly is real.

Furthermore, LHCb has also found signs of lepton universality violation in another type of lepton-producing decay, and several months ago it announced possible deviations in yet a fourth decay type. Just last month it reported a similar disparity between electrons and muons (rather than taus) in a related decay. All these converging lines of evidence make an increasingly compelling case that something is systematically fishy. "If [the deviations] turn out to be real," says BaBar spokesperson and University of Victoria professor Michael Roney, "it would be kind of weird if they weren't related."

A Revolution — If It's Real

If the various leptons really behave differently, the only explanation would be some previously unrecognized force. Under the Standard Model, larger particles decay into leptons (and other products) via the "weak force," the same force that causes radioactive decay. But the weak force treats all leptons equally. If more taus are coming out than the weak force should produce, then some unknown force, associated with some undiscovered attendant force-carrier particle, must be breaking down the larger particles in a way that favors taus. Finding such a force would be as fundamental as the discovery of electromagnetism, albeit with much less effect on our daily lives. "It does actually constitute, with little exaggeration, a revolution in physics," says Hassan Jawahery, a University of Maryland, College Park, physicist and a member of the LHCb collaboration.

Because the implications would be so dramatic, physicists will demand overwhelming evidence — a burden the experimenters are well aware of. Greg Ciezarek, lead author on the Nature review and a postdoctoral researcher at Nikhef National Institute for Subatomic Physics in Amsterdam, says lepton universality violations "would be in the territory of making extraordinary claims," which, as the adage goes, require extraordinary evidence. Roney sums up the skepticism: "You don't bet against the Standard Model."

The evidence to date is not insubstantial. Combining all the data, the probability that the tau/electron deviations are just statistical flukes now stands at about one in 10,000. For any everyday question, that would more than suffice. But particle physicists are a skeptical bunch; the community will not consider a discovery confirmed until there's just a one-in-3.5-million chance of a false alarm. As some "chronologically more advanced" scientists can attest, they’ve been burned before, says Zoltan Ligeti, a professor of theoretical physics at Lawrence Berkeley National Laboratory. "We have seen similar fluctuations in the past that have come and gone."
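
For context, those odds map onto the Gaussian "sigma" language physicists use for significance. A quick back-of-the-envelope conversion (one-sided tail probabilities computed with SciPy; the numbers are our illustration, not from the review):

```python
from scipy.stats import norm

# 5 sigma, the conventional discovery threshold, as a one-sided tail probability
p_discovery = norm.sf(5)        # ~2.9e-7, roughly 1 in 3.5 million
# The roughly 1-in-10,000 chance quoted above, expressed in sigma
sigma_now = norm.isf(1e-4)      # ~3.7 sigma

print(f"5 sigma -> about 1 in {1 / p_discovery:,.0f}")
print(f"1 in 10,000 -> about {sigma_now:.1f} sigma")
```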

The evidence is even harder to swallow given how far lepton universality is from theorists' expectations of where cracks in the Standard Model might show up. "There's sort of a story line that the theorists tell," Wise says, and "this isn't in the story line." What's worse, the proposed explanations for the leptons' behavior seem ad hoc and unsatisfying. "The kind of models that can fit the…anomalies don't really do anything else at first sight," Ligeti says. "For example, they don't get you any closer to understanding what dark matter might be."

Still, he adds, "nature tells us how nature is." Physicists are increasingly taking note of the violations' continued persistence, and proposing new theoretical explanations. Experimentalists and theorists alike are also looking to reduce existing measurements' uncertainties. Ultimately, the biggest revelations will come when LHCb and the next version of Belle produce more data. Physicists are optimistic that within about five years not only will we know whether the effect is real, we will have an explanation for it. "If there is a new [force-carrier] particle," says Svjetlana Fajfer, a theorist at the University of Ljubljana in Slovenia, "[it] should have a mass in reach of LHC," meaning the collider should be able to produce and identify such a particle. For some theorists, that testability is a big draw. "That makes it actually exciting, because if I do something, it can be proven right or wrong," Ligeti says. "One way or another, the case will become clear."

Explore the International Space Station with Google Street View


Aspiring astronauts can now pretend to float on the International Space Station (ISS), thanks to Google. The company worked with astronauts on the orbiting complex to provide a Google Street View of the space station, from its science labs to its beautiful Earth-facing Cupola window.

Thomas Pesquet, a European Space Agency astronaut who helped collect the images earlier this year, said in a blog post that the experience of capturing the tour "describes the feeling of being in space" better than words or a picture can. But there were limitations to collecting the data. For one, astronauts float in space, so the imagery of the ISS couldn't be captured the same way as other Google Street View locations.

NASA's Johnson Space Center in Houston and Marshall Space Flight Center in Alabama worked with Google to create a "gravity-free method of collecting the imagery," Pesquet said in the blog post. These methods included using DSLR cameras and other equipment already available at the space station. An extended video provides an additional look at how the view came together. (Pesquet didn't specify the other equipment in the blog post.)

"I collected still photos in space, that were sent down to Earth where they were stitched together to create panoramic 360 degree imagery of the ISS," Pesquet wrote.

"We did a lot of troubleshooting before collecting the final imagery that you see today in Street View," he added.

"The ISS has technical equipment on all surfaces, with lots of cables and a complicated layout with modules shooting off in all directions — left, right, up, down," Pesquet wrote. "And it's a busy place, with six crew members [at the time] carrying out research and maintenance activities 12 hours a day. There are a lot of obstacles up there, and we had limited time to capture the imagery, so we had to be confident that our approach would work."

The International Space Station's U.S. laboratory module as seen through Google Street View.
Credit: Google Street View


The tour is the first Google Street View captured in space, and it features annotations that pop up to explain additional information about each module, such as how astronauts stay physically fit or the kinds of food they eat.

You can read the entire blog post here: https://www.blog.google/products/maps/welcome-outer-space-view/ and take a virtual tour of the International Space Station here in Google Street View: https://www.google.com/streetview/#international-space-station/

The International Space Station's Cupola observation module as seen through Google Street View.
Credit: Google Street View


The ISS has been occupied continuously since November 2000. It generally houses three to six crewmembers, who split their days between science and maintenance activities. Crewmembers currently "commute" to space on the Russian Soyuz spacecraft, but within the next few years, commercial spacecraft from SpaceX and Boeing will ferry astronauts from U.S. soil for the first time since the space shuttle's retirement in 2011.

Original article on Space.com

'Star Trek' Historians Say CBS 'Spared No Expense' with 'Star Trek: Discovery'

Sonequa Martin-Green plays Michael Burnham in the new CBS show "Star Trek: Discovery."
Credit: CBS


SAN DIEGO — With the "Star Trek" franchise now well into its 51st year, and a new TV series set to debut this fall, it's clear the iconic science fiction epic is definitely living long and prospering. But according to the authors of a book series chronicling the history of Star Trek, that wasn't always the case.

Mark A. Altman and Edward Gross are the authors of the two-volume book series called "The Fifty Year Mission: The Complete, Uncensored, Unauthorized Oral History of Star Trek: The First 25 Years" (Thomas Dunne Books, 2016), based on hundreds of interviews with people who witnessed the five-decade saga. They spoke Friday (July 20) here at Comic-Con International in San Diego on a panel hosted by TV producer and lifelong Star Trek fan Scott Mantz.

"We're going to talk about where [Star Trek is] going but first we gotta talk about where it started," Mantz said, kicking off a discussion about a few of the many instances when the "Star Trek" universe might have fallen into oblivion, and what fans can expect from the upcoming TV series, "Star Trek: Discovery" when it debuts on Sept. 24.

While the original "Star Trek" series is now a modern classic, its fate was uncertain for most of its run, according to the authors. And its enduring success had a lot to do with Paramount Studio's decision to syndicate it, which was thanks to Richard Block, who worked for Kaiser Broadcasting in 1969 when the show was airing.

"He just kept pushing and pushing, and finally on a napkin in a bar they made a deal to syndicate 'Star Trek' on a couple of stations," Gross said. "They put it on, and it started winning the timeslot against the news. And everyone's looking around saying 'Star Trek's beating the news?' And they started bulk airing it five nights a week and blowing everything away."

The late popularity of the series kicked off the creation of the Star Trek movies starting in the late 1970s. But there were a few "Star Trek" projects in the mid-1970s that never came to fruition, including a reboot of the TV series and a movie called "Star Trek: Planet of the Titans," with an extremely complicated and bizarre plot with a scope similar to "2001: A Space Odyssey," according to Altman.

"They had hugely great people involved … it was going to be expensive, it was going to have an A-list director," Altman said. There were even rumors that Robert Redford's had been suggested to play Kirk. But the movie was planned as a one-off, not part of a series like the movies that were ultimately made, and wouldn't have spawned the "Star Trek" we know today.

"It probably would have been the end of the franchise had that movie been made," he said. [Comic-Con 2017 in Photos: Aliens, Mutants & More Invade San Diego]

Panel moderator Scott Mantz (left) with Mark A. Altman and Edward Gross (right) at Comic-Con International in 2017.
Credit: Calla Cofield/Space.com


The Future of "Star Trek"

This fall, CBS will debut a new TV series, "Star Trek: Discovery." The show takes place before the events of the original series.

The first episode of the show will air on CBS, but all subsequent episodes will only be available on the network's online viewing platform, CBS All Access, which requires a subscription fee. Many fans had decried the additional cost, Altman said, but he thinks it's a good thing.

"'Star Trek' has had good demographics but never great ratings. And I think by CBS having [the show] on [the network's] own platform, it will ensure the longevity of the show," Altman said. "So whether 'Star Trek: Discovery' is amazing and fantastic, or just mediocre, it will be ensured a long life because it's on the platform. I also think they wouldn't spend the kind of money they are spending if it weren't for this platform … and it is quite a lot of money. They spared no expense."

"Star Trek: Discovery" will also be serialized (meaning the episodes won't stand alone — they will have an overarching story such that viewers will have to watch the entire series in order to understand what's going on). That decision has been met with some skepticism by fans, the panelists said, because only one of the five previous live-action "Star Trek" TV shows attempted to do this (the rest were done in a "stand alone" format, in which viewers could watch any episode and not require any additional information to understand the plot).

Serialized shows are extremely common in the modern television landscape, perhaps largely because people don't have to watch the shows on live TV, but can instead watch them online and don't have to worry about missing an episode. The TV series "Star Trek: Deep Space Nine" did incorporate serialized plotlines and was "way, way ahead of its time" in that regard, according to Altman.

Because "Discovery" takes place before the events of the original series, the creators must deal with the difficult challenge of making the show look older than the original series, but also look good in 2017. And even though the show takes place in the past in the "Star Trek" universe, it is supposed to take place in the 23rd century (so, the future for those of us in the real world). Along with that tricky balancing act is the fact that many fans may come into the show with certain expectations about which elements from the original series should be included in the new show, right down to thinks like the design of the ship and the crew uniforms.

"There's a danger of doing fan service when you go backwards, because everyone's going to have expectations of what they think should be or what they saw in their mind. And if any show is about going forward it's 'Star Trek," Altman said. "I think that for too long Star Trek has imitated itself. I'd really love to see what the future of Star Trek looks like because we've kind of got stuck in this temporal loop where we're just exploring the same eras over and over again."

Mantz, the die-hard Trek fan, said, "I love fan service! Bring on the fan service!"

"[But] you need new fans," Gross countered.

"But at the end of the day, what's the most important thing about 'Star Trek'? It's the vision. It's the optimism," Altman said. "It's believing that we can be a better people, that we are striving to be better. The respect for science, the respect for each other, the lack of xenophobia, all the things we don't have right now in our society that need to come back."

Original article on Space.com.

Cool the Planet? Geoengineering Is Easier Said Than Done

Cool the Planet? Geoengineering Is Easier Said Than Done:

Cool the Planet? Geoengineering Is Easier Said Than Done
Credit: Narith Thongphasuk/Shutterstock


Planet Earth is feeling the heat.

With the world facing increased warming, melting ice caps, rising sea levels, intense weather events and other global disasters, scientists are exploring ways to re-engineer the planet to counter the effects of global warming.

Earth's surface has warmed, on average over land and sea, 1.53 degrees Fahrenheit (0.85 degrees Celsius) since 1880, according to the Intergovernmental Panel on Climate Change, an international organization created by the United Nations to evaluate the state of climate change science. [Changing Earth: 7 Ideas to Geoengineer Our Planet]

In the most recent issue of the journal Science, published online Thursday (July 20), two researchers provided perspective on two geoengineering methods that could reduce the so-called greenhouse effect, under which gases and clouds in Earth's atmosphere trap the sun's heat. Both schemes could contribute to a cooler climate, but they are not without risks. And as both researchers made clear, neither idea addresses the rising levels of carbon dioxide (CO2) in the atmosphere that are primarily to blame for global warming and for the increasing acidity of the oceans. This acidity is killing the coral reefs that shelter marine life and support the fish that humans eat.

Ulrike Lohmann and Blaž Gasparini, both researchers at the Institute of Atmospheric and Climate Science at ETH Zurich in Switzerland, proposed a counterintuitive plan: Seed the upper atmosphere with tiny particles of desert dust to reduce cirrus clouds. These are the wispy, nearly invisible clouds that form at high altitudes. Unlike fat, billowy clouds that reflect sunlight, cirrus clouds trap heat energy radiating up from Earth that would otherwise escape into space.

"If cirrus clouds behave like a blanket around the Earth, you're trying to get rid of that blanket," Lohmann, a professor of experimental atmospheric physics at ETH Zurich, told Live Science.

Thinning the clouds

Seeding the atmosphere with dust would paradoxically thin out cirrus clouds, Lohmann said. Under normal circumstances, the atmosphere at altitudes of about 16,000 to 40,000 feet (4,800 to 12,200 meters) is full of tiny particles. Some are solid particles like mineral dust, and some are liquid aerosols, such as sulfuric acid. The liquid aerosols freeze instantly and create the ice crystals that form long-lasting cirrus clouds.

Cirrus thinning changes this dynamic, Lohmann said. The idea is to inject solid particles, like desert dust, into the atmosphere at spots slightly lower than where cirrus clouds would naturally form. The quantity of dust introduced would be far smaller than the number of particles that exist higher up. This part is key: with fewer particles available, each one attracts more water vapor, creating larger crystals. As the ice crystals grow larger and heavier, they would fall as precipitation and, depending on the conditions, evaporate before reaching the ground.

"You remove the water vapor, you remove the humidity and you prevent the normal cirrus cloud formation," Lohmann said. [8 Ways Global Warming is Already Changing the World]

Ideally, the method would be applied to locations most susceptible to cirrus cloud formation, Lohmann said — geographical latitudes above 60 degrees, including the Arctic, where temperature increases from CO2 are the greatest.

The researchers' computer models have shown that if done correctly, cirrus thinning could reduce global temperatures by 0.9 degrees F (0.5 degrees C), Lohmann said. But if done incorrectly, the activity could produce cirrus clouds where none existed before, contributing to the very problem it's meant to solve, she added.

Risky business

The risk of doing more harm than good is a concern, said Ulrike Niemeier, a climate scientist at the Max Planck Institute for Meteorology in Hamburg, Germany, and her colleague Simone Tilmes, a project scientist at the National Center for Atmospheric Research in Boulder, Colorado. Niemeier and Tilmes published a separate commentary in this week's issue of the journal Science that discusses a geoengineering method called stratospheric aerosol modification (SAM).

SAM involves injecting sulfur aerosols into the stratosphere to increase the reflectivity of Earth's atmosphere. Computer models have shown that SAM could reduce the amount of sunlight that reaches the planet's surface. The effect would resemble that of ash clouds that linger after volcanic eruptions, which have been shown to lower global temperatures, the researchers wrote.

But the science behind SAM is in its very early stages, and the technologies to deploy it are not developed, the researchers added.

"It was our intention to say that [geoengineering] is not something that we should have in the back of our minds as the main solution," Niemeier told Live Science.

Niemeier and Tilmes wrote that different computer models consistently identify side effects of SAM. For example, reducing incoming solar radiation also reduces evaporation, which in turn reduces precipitation, and that can slow the hydrological cycle, particularly in the tropics, the authors wrote. Less rainfall could worsen droughts that are already devastating parts of the world.

Although computer models tend to agree that it’s best to inject the aerosols into the stratosphere above the tropics or subtropics, and that the aerosols would disperse globally, the models differ on the extent of injection required for a given level of cooling, the authors wrote.

"Most current Earth-system models do not adequately capture important interactions, such as the coupling between stratospheric aerosols, chemistry, radiation and climate. They cannot, therefore, simulate the full impact of the interventions," Niemeier and Tilmes wrote.

Complicated solutions

Even if scientists could figure out a precise method, the economics are mind-boggling. Using SAM to bring global temperatures down just 2 degrees F (1 degree C), to preindustrial levels, would require injection amounts equivalent to one volcanic eruption per year the size of the 1991 Mount Pinatubo blast in the Philippines — the largest volcanic eruption in the last 100 years, according to the U.S. Geological Survey. Dispersing that much material artificially would cost $20 billion per year and require 6,700 aircraft flights per day over 160 years, the researchers wrote.
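(For a rough sense of scale, here is a minimal back-of-the-envelope sketch in Python. The per-year cost, the daily flight count and the 160-year horizon are the figures reported above; the cumulative totals are simple arithmetic added for illustration, not numbers from the Science commentary.)

```python
# Back-of-the-envelope totals implied by the reported SAM deployment figures.
# Assumes the reported rates stay constant over the full 160-year period.

cost_per_year_usd = 20e9   # reported: roughly $20 billion per year
flights_per_day = 6_700    # reported: roughly 6,700 aircraft flights per day
duration_years = 160       # reported deployment horizon

total_cost_usd = cost_per_year_usd * duration_years
total_flights = flights_per_day * 365 * duration_years

print(f"Cumulative cost:    ${total_cost_usd / 1e12:.1f} trillion")
print(f"Cumulative flights: {total_flights / 1e6:.0f} million")
```

Under those assumptions, the program prints a cumulative cost of about $3.2 trillion and roughly 390 million flights over the 160 years.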

No single method can solve the climate change problem as a whole, either, they said.

"Any geoengineering method we know of can only offset part of the global warming that we have," Lohmann said.

And no method designed to cool the planet deals with the gases in the atmosphere that are the sources of the problem and are contributing to increasing levels of acid in the oceans, the researchers said.

"It doesn't get at the heart of the problem," Lohmann said. "The ocean acidification is ongoing."

If society decides to undertake any geoengineering method, she said, this action should be accompanied by large efforts to reduce greenhouse gas emissions.

Niemeier said emission reductions should be the primary focus. "We are quite critical about [geoengineering], and we want people to be aware it would be difficult."

Original article on Live Science.