Pale Blue Dot: Visualization Challenge

Satellite visualization of shifting land use over 40 years in East Asia.

Our world is facing many urgent challenges, such as climate change, water insecurity, and food insecurity. Maintaining and improving quality of life around the world requires bringing together innovators across disciplines and countries to find creative solutions.

One critical tool for understanding and improving the urgent challenges facing our world is Earth observation data, meaning data that is gathered in outer space about life here on Earth! Earth observation data provides accurate and publicly accessible information on our atmosphere, oceans, ecosystems, land cover, and built environment. The United States and its partners have a long history of exploring outer space and making satellite, airborne, and in-situ sensor datasets openly available to all.

Your goal in this challenge is to create a visualization using Earth observation data that advances one or more of the United Nations Sustainable Development Goals (SDGs).

By participating, you can be part of NASA’s initiative to Transform to Open Science and to make Earth observation data available to all.

Award: 10-day Space Study program, with travel, lodging, and tuition covered.

Open Date: November 15, 2023

Close Date: January 26, 2024

For more information, visit: https://www.drivendata.org/competitions/256/

Sarah Douglas

A View Through Skylab

A view from Skylab's airlock hatch, looking down the length of the orbiting workshop. Skylab has a hexagon shape, with metal mesh floors that you can see through. Equipment lines the walls, and at center, two astronauts, Edward G. Gibson (left, in a white t-shirt) and Jerry P. Carr (right, in a brown t-shirt) smile for the camera. Also in frame are parts of three spacesuits.
NASA / William R. Pogue

Astronaut William R. Pogue, Skylab 4 pilot, recorded this wide scene of his crewmates, astronauts Edward G. Gibson (left), science pilot, and Jerry P. Carr (right), commander, on the other end of the orbital workshop on Feb. 1, 1974. Also in the frame are parts of three spacesuits, used on several EVA sessions during the third and final mission on the Skylab space station.

Skylab 4 launched on Nov. 16, 1973. Pogue, Gibson, and Carr were the first all-rookie crew since Gemini 8 in 1966. The crew continued the science program begun by the previous two Skylab crews, including biomedical investigations on the effects of long-duration space flight on the human body, Earth observations using the Earth Resources Experiment Package, and solar observations with instruments mounted on the Apollo Telescope Mount. Added to their science program were observations of the comet Kohoutek, discovered earlier in the year and predicted to make its closest approach to the Sun in December.

Watch a recap of Skylab’s legacy as a major stepping stone to the successful construction and operation of the International Space Station and future long-duration human missions to asteroids, Mars and other destinations.

Image Credit: NASA/William R. Pogue

Powered by WPeMatico

Get The Details…
Monika Luabeya

NASA’s Cold Atom Lab Sets Stage for Quantum Chemistry in Space


This animation depicts six finely tuned lasers used inside NASA’s Cold Atom Lab to slow down atoms, lowering their temperature. Scientists can now use the lab to see how different types of atoms interact with each other at these cold temperatures.
NASA/JPL-Caltech

The remotely operated facility aboard the International Space Station has created another tool that researchers can use to probe the fundamental nature of the world around us.

For the first time in space, scientists have produced a quantum gas containing two types of atoms. Accomplished with NASA’s Cold Atom Laboratory aboard the International Space Station, the achievement marks another step toward bringing quantum technologies currently available only on Earth into space.

Quantum tools are already used in everything from cellphones to GPS to medical devices. In the future, they could be used to enhance the study of planets, including our own, and help solve mysteries of the universe while deepening our understanding of the fundamental laws of nature.

The new work, performed remotely by scientists on Earth, is described in the Nov. 16 issue of the journal Nature.

With this new capability, the Cold Atom Lab can now study not only the quantum properties of individual atoms, but also quantum chemistry, which focuses on how different types of atoms interact and combine with each other in a quantum state. Researchers will be able to conduct a wider range of experiments with Cold Atom Lab and learn more about the nuances of performing them in microgravity. That knowledge will be essential for harnessing the one-of-a-kind facility to develop new space-based quantum technologies.

Quantum Chemistry

The physical world around us depends on atoms and molecules staying bound together according to an established set of rules. But different rules can dominate or weaken depending on the environment the atoms and molecules are in – like microgravity. Scientists using the Cold Atom Lab are exploring scenarios where the quantum nature of atoms dominates their behaviors. For example, instead of acting like solid billiard balls, the atoms and molecules behave more like waves.

In one of those scenarios, the atoms in two- or three-atom molecules can remain bound together but grow increasingly far apart, almost as though the molecules are getting fluffy. To study these states, scientists first need to slow the atoms down. They do this by cooling them to fractions of a degree above the lowest temperature matter can reach, far colder than anything found in the natural universe: absolute zero, or minus 459 degrees Fahrenheit (minus 273 degrees Celsius).
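The temperature figures quoted above can be sanity-checked with the standard Kelvin conversions. This is a quick reader aid, not Cold Atom Lab software:

```python
# Reader aid: convert absolute zero (0 K) to the Celsius and
# Fahrenheit values quoted in the text. Standard unit conversions only.

def kelvin_to_celsius(t_kelvin: float) -> float:
    return t_kelvin - 273.15

def kelvin_to_fahrenheit(t_kelvin: float) -> float:
    return t_kelvin * 9.0 / 5.0 - 459.67

print(kelvin_to_celsius(0.0))     # -273.15, rounded to -273 in the text
print(kelvin_to_fahrenheit(0.0))  # -459.67, rounded to -459 in the text
```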

NASA’s Cold Atom Lab lets scientists investigate the quantum nature of atoms in the freedom of microgravity. Learn how quantum science has led to the development of everyday technologies like cellphones and computers, and how Cold Atom Lab is paving the way for new breakthroughs. Credit: NASA/JPL-Caltech

Physicists have created these fluffy molecules in cold atom experiments on the ground, but they are extremely fragile and either break apart quickly or collapse back down to a normal molecular state. For that reason, enlarged molecules with three atoms have never been directly imaged. In the microgravity of the space station, the fragile molecules can exist for longer and potentially get larger, so physicists are excited to start experimenting with the Cold Atom Lab’s new capability.

These types of molecules likely don’t occur in nature, but it’s possible they could be used to make sensitive detectors that can reveal subtle changes in the strength of a magnetic field, for example, or any of the other disturbances that cause them to break apart or collapse.

“What we’re doing with cold atom science in general is looking for and learning about new tools that nature gives us,” said Jason Williams of NASA’s Jet Propulsion Laboratory in Southern California, project scientist for the Cold Atom Lab and a co-author on the new study. “It’s like we’ve discovered a hammer and we’re just starting to investigate all the ways we could use it.”

A Modern Mystery

One possible way of using a quantum gas with two types of atoms would be to test something called the equivalence principle, which holds that gravity affects all objects the same way regardless of their mass. It’s a principle that many physics teachers will demonstrate by putting a feather and a hammer in a sealed vacuum chamber and showing that, in the absence of air friction, the two fall at the same rate. In 1971, Apollo 15 astronaut David Scott did this experiment on the Moon’s surface without the need for a vacuum chamber.

Using an instrument called an atom interferometer, scientists have already run experiments on Earth to see if the equivalence principle holds true at atomic scales. Using a quantum gas with two types of atoms and an interferometer in the microgravity of the space station, they could test the principle with more precision than what’s possible on Earth. Doing so, they might learn whether there’s a point where gravity doesn’t treat all matter equally, indicating Albert Einstein’s general theory of relativity contains a small error that could have big implications.
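One standard way to quantify such a test is the Eötvös ratio, the normalized difference between two measured accelerations, which is zero if the equivalence principle holds exactly. A minimal sketch, using invented illustrative numbers rather than Cold Atom Lab measurements:

```python
# Hedged sketch: the Eotvos ratio quantifies how differently two test
# bodies fall; it is zero when the equivalence principle holds. The
# accelerations below are invented for illustration, not measurements.

def eotvos_ratio(a_1: float, a_2: float) -> float:
    """Normalized differential acceleration of two test bodies."""
    return 2.0 * (a_1 - a_2) / (a_1 + a_2)

g = 9.80665  # m/s^2, standard gravity

# Two atomic species falling at (almost) the same rate, with a
# hypothetical violation at the parts-per-trillion level:
print(eotvos_ratio(g, g))                # 0.0: principle holds exactly
print(eotvos_ratio(g, g * (1 + 1e-12)))  # about -1e-12
```

Precision tests of this kind amount to pushing the measurable bound on this ratio ever closer to zero.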

The equivalence principle is part of the general theory of relativity, the backbone of modern gravitational physics, which describes how large objects, like planets and galaxies, behave. But a major mystery in modern physics is why the laws of gravity don’t seem to match up with the laws of quantum physics, which describe the behaviors of small objects, like atoms. The laws of both fields have proven to be correct again and again in their respective size realms, but physicists have been unable to unite them into a single description of the universe as a whole.

Looking for features of gravity not explained by Einstein’s theory is one way to search for a means of unification.

Better Sensors

Scientists already have ideas to go beyond testing fundamental physics in microgravity inside the Cold Atom Lab. They have also proposed space-based experiments that could use a two-atom interferometer and quantum gases to measure gravity with high precision in order to learn about the nature of dark energy, the mysterious driver behind the accelerating expansion of the universe. What they learn could lead to the development of precision sensors for a wide range of applications.

The quality of those sensors will depend on how well scientists understand the behavior of these atoms in microgravity, including how those atoms interact with each other. The introduction of tools to control the atoms, like magnetic fields, can make them repel each other like oil and water or stick together like honey. Understanding those interactions is a key goal of the Cold Atom Lab.

More About the Mission

A division of Caltech in Pasadena, JPL designed and built Cold Atom Lab, which is sponsored by the Biological and Physical Sciences (BPS) division of NASA’s Science Mission Directorate at the agency’s headquarters in Washington. BPS pioneers scientific discovery and enables exploration by using space environments to conduct investigations not possible on Earth. Studying biological and physical phenomena under extreme conditions allows researchers to advance the fundamental scientific knowledge required to go farther and stay longer in space, while also benefitting life on Earth. 

To learn more about Cold Atom Lab, go here:

https://coldatomlab.jpl.nasa.gov/

News Media Contact

Calla Cofield
Jet Propulsion Laboratory, Pasadena, Calif.
626-808-2469
calla.e.cofield@jpl.nasa.gov

2023-170

Anthony Greicius

Satellite Data Can Help Limit the Dangers of Windblown Dust


Dust storms present a growing threat to the health and safety of U.S. populations. A new model, powered by NASA and NOAA satellite data, provides important early warnings.
Credits:
Stock Footage Provided by Pond5/EnglerAerial

Interstate 10, an artery that cuts through the rural drylands of southern New Mexico, is one of the country’s deadliest roadways. On one stretch of the highway, just north of a dry lakebed called Lordsburg Playa, fatal collisions occur with such regularity that officials often call it the “dust trap.” It’s a fitting name. Since 1967, at least 55 deaths in the area have been linked to dust storms. 

This stretch of Interstate 10 offers a concentrated example of the hazards that dust storms carry. But across the U.S. Great Plains, levels of windblown dust have increased steadily, by about 5% each year between 2000 and 2018, contributing to a decline in air quality and an increase in fatal collisions.

“Dust storms are appearing with greater frequency for reasons that include extended drought conditions and urban sprawl, which disrupt the fragile biotic crust of the desert,” said John Haynes, program manager for NASA’s Health and Air Quality Applied Sciences Team. As reduced rainfall in arid regions and warmer weather become regular fixtures of the U.S. climate, experts expect the trend to continue.   

“Dust storms can cause traffic accidents, negatively impact air quality, and even carry pathogens that cause diseases.”

John Haynes, program manager for NASA’s Health and Air Quality Applied Sciences Team

On the ground, dust storms form menacing palls that can swallow entire cities whole. From space, dust storms can be observed moving across continents and oceans, revealing their tremendous scale. It’s from this vantage point, high above the clouds, that NASA and NOAA Earth-observing satellites help scientists and first responders track windblown dust.

Daniel Tong, professor of atmospheric chemistry and aerosols at George Mason University, working closely with NASA’s Health and Air Quality Applied Sciences Team, leads a NASA-funded effort to improve the country’s dust forecasting capabilities. Tong’s forecasting system relies on an algorithm called FENGSHA, which means “windblown dust” in Mandarin. By plugging real-time satellite data into a complex model of Earth’s atmosphere – one that accounts for site-specific variables like soil type, wind speed, and how Earth’s surface interacts with winds – the system churns out hourly forecasts that can predict dust storms up to three days in advance. 
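The general idea behind threshold-based windblown-dust schemes such as FENGSHA can be sketched as follows. This is an illustrative simplification, not the operational FENGSHA formulation: the coefficient, erodibility value, and friction velocities are invented for the example.

```python
# Illustrative threshold-based dust emission rule, the general idea
# behind windblown-dust schemes such as FENGSHA. NOT the operational
# formulation: k, erodibility, and the velocities below are invented.

def dust_emission_flux(u_star: float, u_star_threshold: float,
                       erodibility: float, k: float = 1.0) -> float:
    """Emission is zero until the wind friction velocity u_star exceeds
    a soil-dependent threshold, then grows rapidly (~u_star**3)."""
    if u_star <= u_star_threshold:
        return 0.0
    return k * erodibility * u_star**3 * (1.0 - (u_star_threshold / u_star) ** 2)

print(dust_emission_flux(0.20, 0.35, erodibility=0.5))        # 0.0: calm winds, no dust
print(dust_emission_flux(0.60, 0.35, erodibility=0.5) > 0.0)  # True: emission begins
```

The site-specific variables mentioned above (soil type, surface roughness) enter through the threshold and erodibility terms, which is why accurate surface data matters so much to the forecasts.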

On March 16, 2021, images acquired by the Visible Infrared Imaging Radiometer Suite (VIIRS) on the NASA/NOAA Suomi NPP satellite show large dust plumes sweeping across New Mexico, Texas, and Mexico. Credit: NASA Earth Observatory

FENGSHA was initially developed using a dust observation method trained by NASA’s Aqua and Terra satellites. It’s these “space truths,” as Tong calls them, that make reliable forecasting possible. Comparing the model’s predictions with satellite imagery from real dust storms allows the team to identify shortcomings and improve accuracy. The most recent version of the model includes data from the Visible Infrared Imaging Radiometer Suite (VIIRS) on the NASA-NOAA Suomi-NPP, NOAA-20, and NOAA-21 satellites, which observe each location on the planet at least twice a day.

Currently, the dust monitoring system is available to all 122 of the National Weather Service’s regional forecasting offices. When a forecast calls for dust, local teams assess each case individually and decide whether to send out alerts. These could involve a warning to transit authorities or weather alerts sent directly to people’s phones.

“Dust storms cause traffic accidents, negatively impact air quality, and even carry pathogens that cause diseases,” Haynes said.  “Early warning systems empower individuals to take necessary actions, such as sheltering indoors or clearing roadways until the storm passes.”

The Benefits of Early Warning  

On May 1, 2023, high winds in Illinois sent a dark cloud of dust creeping along Interstate 55, the state’s main throughway. Visibility was reduced to zero in a matter of minutes – leaving drivers with little time to react. The resulting collision involved 72 vehicles and killed eight people. Dozens more were hospitalized.

In some hotspots for dust, officials are taking steps to minimize the damage. On Interstate 10 in New Mexico and Arizona, for example, drivers are now met with 100 miles of roadside warning signs that urge them to pull over when dust is detected. But Interstate 55, in Illinois, isn’t a hotspot. No one saw the storm coming. And as dust claims new territory, local ground-based solutions may not provide sufficient coverage. 

This is why satellite-based forecasting is essential, said Morgan Gorris, an Earth system scientist and geohealth expert at Los Alamos National Laboratory. “When we see a dust storm developing in radar returns or on dust sensors, people are already on the road, and it’s more difficult to make safety decisions.”

Tong hopes to see forecasts used more frequently in commercial trucking “to prevent delays, traffic jams, and accidents,” he said. Notably, semi-trucks and tractor-trailers are involved in nearly all fatal dust-related collisions. By rerouting or delaying truck drivers, the worst accidents could be avoided.

Tong also promotes advanced forecasting as a way to reduce the frequency and intensity of dust storms. Storms like the one in Illinois – which rose from the overworked soil of the surrounding farmland – might be preventable. “If we know that there might be a dust storm tomorrow, farmers might stop tilling their land,” he said.

Most fatal collisions are the result of smaller, quick-forming dust storms. But larger storms carry additional hazards. Billowing plumes of dust lofted from loose soil or desert floors by high-speed winds can reach thousands of feet into the air and travel hundreds of miles, affecting the respiratory health of populations across great distances.

Valley fever – an infectious disease caused by a soil-dwelling fungus endemic to the arid and semi-arid climates of Texas, New Mexico, Arizona, and California – is also a threat. The fungus is harmless in the ground, but airborne spores can lead to infections that are sometimes fatal. The Centers for Disease Control and Prevention has reported more than 200,000 Valley fever infections since 1998. The current infection rate is about 10 times higher than that of West Nile virus, a vector-borne disease that often receives far more attention.

An image of Baja California taken from the International Space Station depicts strong winds blowing dust into the Pacific Ocean. Valley fever cases have been discovered off the California coast among populations of bottlenose dolphins and other marine mammals, a sign that windblown dust could be carrying the fungus to non-endemic regions of the country. Credit: NASA

“The areas where we see dust storms and the areas endemic to Valley fever are both expanding,” said Gorris, who also warns that the expanding reach of dust storms might unearth new airborne diseases. “We don’t yet know what other biology is in the soil that might infect us.”

It’s not just what’s in the soil. Even when traces of chemical or biological toxins are absent, the soil itself can be a significant irritant. “People think that it’s a natural phenomenon carrying natural material, so it’s probably innocuous,” said Thomas E. Gill, professor of Earth and environmental sciences at the University of Texas at El Paso. But that’s not the case. Fine grains of dust can penetrate deep into lung tissue and are linked to an increase in respiratory illness and premature death.

According to a global study conducted by atmospheric scientists at NASA’s Goddard Space Flight Center, 2.89 million premature deaths were connected to fine particulate matter (PM2.5) in 2019 – and 22% of those deaths were attributed to dust. Most at risk were children and those with pre-existing conditions like asthma.

A New Way to See an Old Problem

In the 1930s, during the Dust Bowl years, severe drought and poor land management sent deadly “black blizzards” sweeping across the landscape. From Texas to Nebraska, wind stripped the soil of vital nutrients, generating massive dust storms that blocked out the Sun for days at a time and reached as far east as New York City – where the sky was dark enough for streetlights to switch on in the middle of the day.

Some scientists claim that the threat of a “dust bowl 2.0” is imminent. Urban sprawl, industrial-scale agriculture, wildfires, drought, and a warming climate can all strip the land of vegetation and remove moisture from the soil. But it can be difficult to draw a hard line from these individual sources to their cumulative effects. “We have to continue developing our understanding of the consequences on our communities and come up with better ways to protect citizens,” Tong said.

The next generation of FENGSHA will soon be integrated into an atmospheric model developed by NASA called the Goddard Chemistry Aerosol Radiation and Transport (GOCART) model. Features of Earth’s surface like rocks, vegetation, and uneven soil all influence how much dust the wind can kick up. As a result, both the amount of dust in the air and the direction that windblown dust travels are often governed by what’s on the ground. GOCART’s ability to model these surface features will improve the accuracy of the forecasting system, said Barry Baker, an atmospheric physicist and lead of chemical modeling at the National Oceanic and Atmospheric Administration, who led the research-to-operations transition of FENGSHA for NOAA’s oceanic and atmospheric research team. The ultimate goal, though, he added, is a geostationary satellite: polar-orbiting satellites pass over each spot on the globe twice a day, while a geostationary satellite could hover over the U.S. and monitor dust around the clock, tracking storms as they develop and grow.

Each year, 182 million tons of dust escapes into the atmosphere from the Sahara. This image, captured by the VIIRS instrument on the NOAA-20 satellite, shows Saharan dust blowing west over the Atlantic and conveys its tremendous scale. Credit: NASA Earth Observatory

Despite its hazards, windblown dust is a fundamental feature of the atmosphere and a critical ingredient for life on Earth. Dust from the Sahara carries life-sustaining nutrients across the Atlantic Ocean to the Amazon rainforest, roughly 1,600 miles away. It also feeds the vast algal ecosystems that teem near the surface of Earth’s oceans, which in turn support a diverse menagerie of marine life. Even if we could rid the planet of dust, we would not want to.

“There’s no way to contain the situation; you can’t just eliminate the desert,” Tong said. “But what we can do is increase awareness and try to help those who are impacted most.”

Last Updated: Nov 15, 2023

NASA Data Reveals Possible Reason Some Exoplanets Are Shrinking


This artist’s concept shows what the sub-Neptune exoplanet TOI-421 b might look like. In a new study, scientists have found new evidence suggesting how these types of planets can lose their atmospheres. Credit: NASA, ESA, CSA, and D. Player (STScI)

A new study could explain the ‘missing’ exoplanets between super-Earths and sub-Neptunes.

Some exoplanets seem to be losing their atmospheres and shrinking. In a new study using NASA’s retired Kepler Space Telescope, astronomers find evidence of a possible cause: The cores of these planets are pushing away their atmospheres from the inside out.

Exoplanets (planets outside our solar system) come in a variety of sizes, from small, rocky planets to colossal gas giants. In the middle lie rocky super-Earths and larger sub-Neptunes with puffy atmospheres. But there’s a conspicuous absence – a “size gap” – of planets between 1.5 and 2 times the size of Earth (between super-Earths and sub-Neptunes) that scientists have been working to better understand.

This video explains the differences between the main types of exoplanets, or planets outside our solar system. Credit: NASA/JPL-Caltech

“Scientists have now confirmed the detection of over 5,000 exoplanets, but there are fewer planets than expected with a diameter between 1.5 and 2 times that of Earth,” said Caltech/IPAC research scientist Jessie Christiansen, science lead for the NASA Exoplanet Archive and lead author of the new study in The Astronomical Journal. “Exoplanet scientists have enough data now to say that this gap is not a fluke. There’s something going on that impedes planets from reaching and/or staying at this size.”

Researchers think that this gap could be explained by certain sub-Neptunes losing their atmospheres over time. This loss would happen if the planet doesn’t have enough mass, and therefore gravitational force, to hold onto its atmosphere. So sub-Neptunes that aren’t massive enough would shrink to about the size of super-Earths, leaving the gap between the two sizes of planets.
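The retention argument can be made concrete with a rough comparison: to first order, a planet holds its atmosphere when its escape velocity exceeds the thermal speed of the gas by a comfortable margin. This is a back-of-the-envelope sketch with an assumed factor-of-six rule of thumb and invented planet parameters, not the study's methodology:

```python
import math

# Back-of-the-envelope sketch: compare a planet's escape velocity to
# the thermal speed of hydrogen in its atmosphere. The factor-of-six
# retention rule of thumb and the planet parameters are illustrative
# assumptions, not taken from the study.

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
K_B = 1.381e-23      # Boltzmann constant, J/K
M_EARTH = 5.972e24   # kg
R_EARTH = 6.371e6    # m
M_H2 = 3.35e-27      # mass of an H2 molecule, kg

def escape_velocity(mass: float, radius: float) -> float:
    return math.sqrt(2.0 * G * mass / radius)

def thermal_speed(temperature: float, molecule_mass: float = M_H2) -> float:
    return math.sqrt(3.0 * K_B * temperature / molecule_mass)

def retains_atmosphere(mass: float, radius: float, temperature: float) -> bool:
    """Rough rule of thumb: retention needs v_esc > ~6x the thermal speed."""
    return escape_velocity(mass, radius) > 6.0 * thermal_speed(temperature)

# A hot, low-mass sub-Neptune struggles to hold a hydrogen envelope...
print(retains_atmosphere(4 * M_EARTH, 2 * R_EARTH, temperature=1000))     # False
# ...while a heavier one keeps its atmosphere.
print(retains_atmosphere(15 * M_EARTH, 3.5 * R_EARTH, temperature=1000))  # True
```

The same mass-dependence is why a shrinking, atmosphere-losing sub-Neptune ends up at super-Earth size rather than somewhere in the gap.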

But exactly how these planets are losing their atmospheres has remained a mystery. Scientists have settled on two likely mechanisms: one called core-powered mass loss, and the other photoevaporation. The study has uncovered new evidence supporting the first.

This infographic details the main types of exoplanets. Scientists have been working to better understand the “size gap,” or conspicuous absence, of planets that fall between super-Earths and sub-Neptunes. Credit: NASA/JPL-Caltech

Solving the Mystery

Core-powered mass loss occurs when radiation emitted from a planet’s hot core pushes the atmosphere away from the planet over time, “and that radiation is pushing on the atmosphere from underneath,” Christiansen said.

The other leading explanation for the planetary gap, photoevaporation, happens when a planet’s atmosphere is essentially blown away by the hot radiation of its host star. In this scenario, “the high-energy radiation from the star is acting like a hair dryer on an ice cube,” she said.

While photoevaporation is thought to occur during a planet’s first 100 million years, core-powered mass loss is thought to happen much later – closer to 1 billion years into a planet’s life. But with either mechanism, “if you don’t have enough mass, you can’t hold on, and you lose your atmosphere and shrink down,” Christiansen added.

For this study, Christiansen and her co-authors used data from NASA’s K2, an extended mission of the Kepler Space Telescope, to look at the star clusters Praesepe and Hyades, which are 600 million to 800 million years old. Because planets are generally thought to be the same age as their host star, the sub-Neptunes in these clusters would be past the age where photoevaporation could have taken place but not old enough to have experienced core-powered mass loss.

So if the team saw that there were a lot of sub-Neptunes in Praesepe and Hyades (as compared to older stars in other clusters), they could conclude that photoevaporation hadn’t taken place. In that case, core-powered mass loss would be the most likely explanation of what happens to less massive sub-Neptunes over time.

In observing Praesepe and Hyades, the researchers found that nearly 100% of stars in these clusters still have a sub-Neptune planet or planet candidate in their orbit. Judging from the size of these planets, the researchers think they have retained their atmospheres.

This differs from the other, older stars observed by K2 (stars more than 800 million years old), only 25% of which have orbiting sub-Neptunes. The older age of these stars is closer to the timeframe in which core-powered mass loss is thought to take place.
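The comparison described here boils down to an occurrence fraction: the share of stars in each sample that host a sub-Neptune. A minimal sketch with stand-in catalogs (the host flags below are invented to mirror the quoted ~100% and 25% figures, not the study's actual data):

```python
# Hedged sketch of the study's comparison: what fraction of stars in a
# sample host a sub-Neptune? The "catalogs" below are invented
# stand-ins shaped to mirror the quoted figures, not real K2 data.

def sub_neptune_fraction(stars):
    """Fraction of stars flagged as hosting a sub-Neptune (or candidate)."""
    hosts = sum(1 for star in stars if star["has_sub_neptune"])
    return hosts / len(stars)

# Younger clusters (Praesepe/Hyades-like): nearly every star is a host.
young_cluster = [{"has_sub_neptune": True}] * 99 + [{"has_sub_neptune": False}]
# Older K2 stars: only about a quarter are hosts.
older_stars = [{"has_sub_neptune": True}] * 25 + [{"has_sub_neptune": False}] * 75

print(sub_neptune_fraction(young_cluster))  # 0.99
print(sub_neptune_fraction(older_stars))    # 0.25
```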

From these observations, the team concluded that photoevaporation could not have taken place in Praesepe and Hyades. If it had, it would have occurred hundreds of millions of years earlier, and these planets would have little, if any, atmosphere left. This leaves core-powered mass loss as the leading explanation for what likely happens to the atmospheres of these planets.

Christiansen’s team spent more than five years building the planet candidate catalog necessary for the study. But the research is far from complete, she said, and it is possible that the current understanding of photoevaporation and/or core-powered mass loss could evolve. The findings will likely be put to the test by future studies before anyone can declare the mystery of this planetary gap solved once and for all.

This study was conducted using the NASA Exoplanet Archive, which is operated by Caltech in Pasadena under contract with NASA as part of the Exoplanet Exploration Program, which is located at NASA’s Jet Propulsion Laboratory in Southern California. JPL is a division of Caltech.

More About the Mission

On Oct. 30, 2018, Kepler ran out of fuel and ended its mission after nine years, during which it discovered more than 2,600 confirmed planets around other stars along with thousands of additional candidates astronomers are working to confirm.

NASA’s Ames Research Center in Silicon Valley, California, manages the Kepler and K2 missions for NASA’s Science Mission Directorate. JPL managed Kepler mission development. Ball Aerospace & Technologies Corporation operated the flight system with support from the Laboratory for Atmospheric and Space Physics at the University of Colorado in Boulder.

For more information about the Kepler and K2 missions, visit:

https://science.nasa.gov/mission/kepler

News Media Contacts

Calla Cofield
Jet Propulsion Laboratory, Pasadena, Calif.
626-808-2469
calla.e.cofield@jpl.nasa.gov

Karen Fox / Alise Fisher 
NASA Headquarters, Washington
202-358-1257 / 202-358-2546
karen.c.fox@nasa.gov / alise.m.fisher@nasa.gov

Written by Chelsea Gohd

2023-169   

Anthony Greicius