Just in Time for Halloween, NASA’s Juno Mission Spots Eerie “Face” on Jupiter



On Sept. 7, 2023, during its 54th close flyby of Jupiter, NASA’s Juno mission captured this view of an area in the giant planet’s far northern regions called Jet N7. The image shows turbulent clouds and storms along Jupiter’s terminator, the dividing line between the day and night sides of the planet. The low angle of sunlight highlights the complex topography of features in this region, which scientists have studied to better understand the processes playing out in Jupiter’s atmosphere.

As often occurs in views from Juno, Jupiter’s clouds in this picture lend themselves to pareidolia, the effect that causes observers to perceive faces or other familiar shapes in largely random patterns.

Citizen scientist Vladimir Tarasov made this image using raw data from the JunoCam instrument. At the time the raw image was taken, the Juno spacecraft was about 4,800 miles (about 7,700 kilometers) above Jupiter’s cloud tops, at a latitude of about 69 degrees north.

JunoCam’s raw images are available for the public to peruse and process into image products at https://missionjuno.swri.edu/junocam/processing. More information about NASA citizen science can be found at https://science.nasa.gov/citizenscience.

More information about Juno is at https://www.nasa.gov/juno and https://missionjuno.swri.edu. For more about this finding and other science results, see https://www.missionjuno.swri.edu/science-findings.

Image credit:
Image data: NASA/JPL-Caltech/SwRI/MSSS
Image processing by Vladimir Tarasov © CC BY


Naomi Hartono

AWE Launching to Space Station to Study Atmospheric Waves via Airglow


NASA’s Atmospheric Waves Experiment, or AWE, mission is scheduled to launch to the International Space Station in November 2023, where it will make use of a natural, ethereal glow in Earth’s sky to study waves in our planet’s atmosphere.

Built by Utah State University’s Space Dynamics Laboratory in North Logan, Utah, AWE will be mounted on the exterior of the space station. From this perch, AWE will stare down toward Earth, tracking undulations in the air known as atmospheric gravity waves (AGWs).

Primarily originating in the lowest level of the atmosphere, AGWs may be caused by strong weather events such as tornadoes, hurricanes, or even thunderstorms. These weather events can momentarily push pockets of high-density air upwards into the atmosphere before the air sinks back down. This up-and-down bobbing often leaves behind distinctive ripple patterns in the clouds.

This photo shows examples of cloud patterns caused by atmospheric gravity waves (AGWs). Warmer, denser air from lower in the atmosphere holds more water, so as weather events like wind and storms push those pockets of air to higher altitudes, that water forms clouds at the crests of those waves.
Courtesy Alexa Halford; used with permission

But AGWs continue all the way to space, where they contribute to what’s known as space weather – the tumultuous exchange of energy in the area surrounding our planet that can disrupt satellite and communications signals. AWE will measure AGWs at an atmospheric layer that begins some 54 miles (87 kilometers) in altitude, known as the mesopause.

“This is the first time that AGWs, especially the small-scale ones, will be measured globally at the mesopause, the gateway to space,” said Michael Taylor, professor of physics at Utah State University and principal investigator for the mission. “More importantly, this is the first time we will be able to quantify the impacts of AGWs on space weather.”

This image taken from the International Space Station shows swaths of airglow hovering in Earth’s atmosphere. NASA’s new Atmospheric Waves Experiment will observe airglow from a perch on the space station to help scientists understand, and ultimately improve forecasts of, space weather changes in the upper atmosphere.
NASA

At the mesopause, where AWE will make its measurements, AGWs are revealed by colorful bands of light in our atmosphere known as airglow. AWE will “see” these waves by recording variations of airglow in infrared light, a wavelength range too long for human eyes to see. At these altitudes our atmosphere dips to its coldest temperatures – reaching as low as -150 degrees Fahrenheit (-101 degrees Celsius) – and the faint glow of infrared light is at its brightest.

By watching that infrared airglow grow brighter and dimmer as waves move through it, AWE will enable scientists to compute the size, power, and dispersion of AGWs like never before. It was also designed to see smaller AGWs, detecting short-scale ripples in airglow that previous missions would miss.

“AWE will be able to resolve waves at finer horizontal scales than what satellites can usually see at those altitudes, which is part of what makes the mission unique,” said Ruth Lieberman, AWE mission scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland.

This artist’s conception depicts AWE scanning the atmosphere from aboard the International Space Station. AWE will measure variations in infrared airglow to track atmospheric gravity waves as they move up from the lower atmosphere into space.
Utah State University Space Dynamics Laboratory

From its vantage point on the space station, AWE’s Advanced Mesospheric Temperature Mapper (AMTM) instrument will scan the mesopause below it. AWE’s AMTM consists of four identical telescopes that together make up a wide-field-of-view imaging radiometer, an instrument that measures the brightness of light at specific wavelength ranges. The relative brightness of different wavelengths can be used to create temperature maps, which in turn reveal how AGWs are moving through the atmosphere. It will be the most thorough study of AGWs and their effects on the upper atmosphere ever conducted.
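To illustrate the kind of ratio technique a radiometer like the AMTM can use, the sketch below infers a temperature from the brightness ratio of two emission lines, assuming a simple Boltzmann relationship between line intensities. The energy gap and calibration constant here are hypothetical placeholders for illustration, not AMTM’s actual calibration:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K


def rotational_temperature(ratio: float, delta_e_j: float, c0: float) -> float:
    """Infer a temperature from the brightness ratio of two emission lines.

    Assumes I1/I2 = c0 * exp(-delta_E / (k*T)), which rearranges to
    T = delta_E / (k * ln(c0 / ratio)).

    ratio:     measured brightness ratio I1/I2
    delta_e_j: energy gap between the lines' upper states, in joules (hypothetical)
    c0:        combined line-strength/instrument constant (hypothetical)
    """
    return delta_e_j / (K_B * math.log(c0 / ratio))


# Made-up inputs chosen to land near cold mesopause temperatures:
t = rotational_temperature(ratio=0.5, delta_e_j=4e-21, c0=2.3)
print(f"Inferred temperature: {t:.0f} K")  # roughly 190 K
```

Mapping such per-pixel temperatures across the four telescopes’ combined field of view is what would reveal the wave crests and troughs as warm and cool bands.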


From its unique vantage point on the International Space Station, NASA’s Atmospheric Waves Experiment (AWE) will look directly down into Earth’s atmosphere to study how gravity waves travel through the upper atmosphere. Data collected by AWE will enable scientists to determine the physics and characteristics of atmospheric gravity waves and how terrestrial weather influences the ionosphere, which can affect communication with satellites. Credit: NASA’s Goddard Space Flight Center Conceptual Image Lab

As a payload headed to the space station, AWE was required to hold four crucial safety reviews. The mission was successfully certified as a station payload at its last review in July 2023. Part of this certification involved “sharp edge” testing with astronaut gloves to ensure safety during AWE’s installation and maintenance on the exterior of the space station.

AWE is the first NASA mission to attempt this type of science, providing insight into how terrestrial and space weather interactions may affect satellite communications and tracking in orbit.

Following AWE’s installation on the International Space Station, the team’s focus will be to share the instrument’s data and results with the science community and the public. More information about AWE is available on the mission website: https://www.awemission.org/.

By J. Titus Stupfel, NASA’s Goddard Space Flight Center



NASA Improves GIANT Optical Navigation Technology for Future Missions


As NASA scientists study the returned fragments of asteroid Bennu, the team that helped navigate the mission on its journey refines their technology for potential use in future robotic and crewed missions.

The optical navigation team at NASA’s Goddard Space Flight Center in Greenbelt, Maryland, served as a backup navigation resource for the OSIRIS-REx (Origins, Spectral Interpretation, Resource Identification, and Security – Regolith Explorer) mission to near-Earth asteroid Bennu. They double-checked the primary navigation team’s work and proved the viability of navigation by visual cues.

The sample return capsule from NASA’s OSIRIS-REx mission is seen shortly after touching down in the desert, Sunday, Sept. 24, 2023, at the Department of Defense’s Utah Test and Training Range. The sample was collected from the asteroid Bennu in October 2020 by NASA’s OSIRIS-REx spacecraft.
NASA/Keegan Barber

Optical navigation uses observations from cameras, lidar, or other sensors to navigate the way humans do. This cutting-edge technology works by taking pictures of a target, such as Bennu, and identifying landmarks on the surface. GIANT software – that’s short for the Goddard Image Analysis and Navigation Tool – analyzes those images to provide information, such as precise distance to the target, and to develop three-dimensional maps of potential landing zones and hazards. It can also analyze a spinning object to help calculate the target’s mass and determine its center – critical details to know for a mission trying to enter an orbit.

“Onboard autonomous optical navigation is an enabling technology for current and future mission ideas and proposals,” said Andrew Liounis, lead developer for GIANT at Goddard. “It reduces the amount of data that needs to be downlinked to Earth, reducing the cost of communications for smaller missions, and allowing for more science data to be downlinked for larger missions. It also reduces the number of people required to perform orbit determination and navigation on the ground.”
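To illustrate the geometry behind landmark-based ranging (a sketch of the general idea, not GIANT’s actual implementation), a pinhole-camera model relates a landmark’s known physical size to its apparent size in the image. The focal length and landmark values below are hypothetical:

```python
def estimate_range(landmark_size_m: float, apparent_size_px: float,
                   focal_length_px: float) -> float:
    """Pinhole-camera range estimate: a feature of known physical size
    that spans apparent_size_px pixels lies at approximately
    range = focal_length * size / pixels (small-angle approximation)."""
    return focal_length_px * landmark_size_m / apparent_size_px


# Hypothetical values: a 500 m crater spanning 100 px in an image
# taken by a camera with an effective focal length of 50,000 px.
r_m = estimate_range(500.0, 100.0, 50_000.0)
print(f"Estimated range: {r_m / 1000:.0f} km")  # 250 km
```

Real systems fuse many such landmark observations, plus attitude knowledge, to solve for the full spacecraft state rather than a single range.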

Asteroid Bennu ejecting particles from its surface on Jan. 19, created by combining two images from NASA’s OSIRIS-REx spacecraft. GIANT optical navigation technology used to process images like these helped establish the size and velocity of the particles.
NASA / Goddard / University of Arizona

During OSIRIS-REx’s orbit of Bennu, GIANT identified particles flung from the asteroid’s surface. The optical navigation team used images to calculate the particles’ movement and mass, ultimately helping determine they did not pose a significant threat to the spacecraft.

Since then, Liounis said, the team has refined and expanded GIANT’s backbone collection of software utilities and scripts.

New GIANT developments include an open-source version of the software released to the public and a celestial navigation capability for deep space travel that observes stars, the Sun, and other solar system objects. They are now working on a slimmed-down package to aid in autonomous operations throughout a mission’s life cycle.

“We’re also looking to use GIANT to process some Cassini data with partners at the University of Maryland in order to study Saturn’s interactions with its moons,” Liounis said.

Other innovators like Goddard engineer Alvin Yew are adapting the software to potentially aid rovers and human explorers on the surface of the Moon or other planets.

Adaptation, Improvement

Shortly after OSIRIS-REx left Bennu, Liounis’ team released a refined, open-source version for public use. “We considered a lot of changes to make it easier for the user and a few changes to make it run more efficiently,” he said.

An intern modified their code to make use of a graphics processor for ground-based operations, boosting the image processing at the heart of GIANT’s navigation.

A simplified version called cGIANT works with Goddard’s autonomous Navigation, Guidance, and Control software package, or autoNGC, in ways that can be crucial to both small and large missions, Liounis said.

Liounis and colleague Chris Gnam developed a celestial navigation capability which uses GIANT to steer a spacecraft by processing images of stars, planets, asteroids, and even the Sun. Traditional deep space navigation uses the mission’s radio signals to determine location, velocity, and distance from Earth. Reducing a mission’s reliance on NASA’s Deep Space Network frees up a valuable resource shared by many ongoing missions, Gnam said.

Next on their agenda, the team hopes to develop planning capabilities so mission controllers can develop flight trajectories and orbits within GIANT – streamlining mission design.

“On OSIRIS-REx, it would take up to three months to plan our next trajectory or orbit,” Liounis said. “Now we can reduce that to a week or so of computer processing time.”

Their innovations have earned the team continuous support from Goddard’s Internal Research and Development program, individual missions, and NASA’s Space Communications and Navigation program.

“As mission concepts become more advanced,” Liounis said, “optical navigation will continue to become a necessary component of the navigation toolbox.”

By Karl B. Hille

NASA’s Goddard Space Flight Center in Greenbelt, Md. 



Readying a Little Rover


An engineer in white protective gear, a blue mask, and glasses works on a small rover. The rover is small enough to fit on a black tabletop. The rover has a flat top and four gear-like wheels.
NASA/JPL-Caltech

An engineer prepares a small rover for testing in a thermal vacuum chamber on Oct. 24, 2023, at NASA’s Jet Propulsion Laboratory in Southern California. This rover is part of the agency’s Cooperative Autonomous Distributed Robotic Exploration (CADRE) technology demonstration that’s headed to the Moon as part of the Commercial Lunar Payload Services initiative. CADRE is designed to demonstrate that multiple robots can cooperate and explore together autonomously – without direct input from human mission controllers.

Learn more about these miniature rovers.

Image Credit: NASA/JPL-Caltech


Monika Luabeya

NASA’s First Two-way End-to-End Laser Communications System



NASA’s ILLUMA-T payload communicating with LCRD over laser signals.

Credits:
NASA/Dave Ryan

NASA is demonstrating laser communications on multiple missions – showcasing the benefits infrared light can have for science and exploration missions transmitting terabytes of important data.

The International Space Station is getting a “flashy” technology demonstration this November. The ILLUMA-T (Integrated Laser Communications Relay Demonstration Low Earth Orbit User Modem and Amplifier Terminal) payload is launching to the International Space Station to demonstrate how missions in low Earth orbit can benefit from laser communications.

Laser communications uses invisible infrared light to send and receive information at higher data rates, providing spacecraft with the capability to send more data back to Earth in a single transmission and expediting discoveries for researchers.

NASA’s ILLUMA-T payload was delivered to SpaceX Dragonland, and the team integrated the payload into the Dragon trunk in preparation for its November launch.
SpaceX

Managed by NASA’s Space Communications and Navigation (SCaN) program, ILLUMA-T is completing NASA’s first bi-directional, end-to-end laser communications relay by working with the agency’s LCRD (Laser Communications Relay Demonstration). LCRD launched in December 2021 and is currently demonstrating the benefits of laser communications from geosynchronous orbit by transmitting data between two ground stations on Earth in a series of experiments.

Some of LCRD’s experiments include studying atmospheric impact on laser signals, confirming LCRD’s ability to work with multiple users, testing network capabilities like delay/disruption tolerant networking (DTN) over laser links, and investigating improved navigation capabilities.

The Laser Communications Relay Demonstration (LCRD) launched in December 2021. Together, LCRD and ILLUMA-T will complete NASA’s first bi-directional end-to-end laser communications system.
Dave Ryan

Once ILLUMA-T is installed on the space station’s exterior, the payload will complete NASA’s first in-space demonstration of two-way laser relay capabilities.

How It Works:

ILLUMA-T’s optical module consists of a telescope and a two-axis gimbal that allow it to point at and track LCRD in geosynchronous orbit. The optical module is about the size of a microwave, and the payload itself is comparable to a standard refrigerator.

NASA’s ILLUMA-T payload in a Goddard cleanroom. The payload will be installed on the International Space Station and demo higher data rates with NASA’s Laser Communications Relay Demonstration.
Dennis Henry

ILLUMA-T will relay data from the space station to LCRD at 1.2 gigabits-per-second, then LCRD will send the data down to optical ground stations in California or Hawaii. Once the data reaches these ground stations, it will be sent to the LCRD Mission Operations Center located at NASA’s White Sands Complex in Las Cruces, New Mexico. After this, the data will be sent to the ILLUMA-T ground operations teams at the agency’s Goddard Space Flight Center in Greenbelt, Maryland. There, engineers will determine if the data sent through this end-to-end relay process is accurate and of high quality.

“NASA Goddard’s primary role is to ensure successful laser communications and payload operations with LCRD and the space station,” said ILLUMA-T Deputy Project Manager Matt Magsamen. “With LCRD actively conducting experiments that test and refine laser systems, we are looking forward to taking space communications capabilities to the next step and watching the success of this collaboration between the two payloads unfold.”



Once ILLUMA-T transmits its first beam of laser light through its optical telescope to LCRD, the end-to-end laser communications experiment begins. After its experimental phase with LCRD, ILLUMA-T could become an operational part of the space station and substantially increase the amount of data NASA can send to and from the orbiting laboratory.

Transmitting data to relay satellites is no new feat for the space station. Since its first modules launched in 1998, the orbiting laboratory has relied on the fleet of radio frequency relay satellites known as NASA’s Tracking and Data Relay Satellites, which are part of the agency’s Near Space Network. Relay satellites provide missions with constant contact with Earth because they can see the spacecraft and a ground antenna at the same time.

Laser communications could be a game-changer for researchers on Earth with science and technology investigations aboard the space station. Astronauts conduct research in areas like biological and physical sciences, technology, Earth observations, and more in the orbiting laboratory for the benefit of humanity. ILLUMA-T could provide enhanced data rates for these experiments and send more data back to Earth at once. In fact, at 1.2 Gbps, ILLUMA-T can transfer the amount of data equivalent to an average movie in under a minute.
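The movie figure is easy to check with back-of-envelope arithmetic. The sketch below assumes a hypothetical 5 GB movie file (the article does not specify a size):

```python
def transfer_seconds(size_gigabytes: float, rate_gbps: float) -> float:
    """Time to move a file of size_gigabytes at rate_gbps (gigabits per second)."""
    return size_gigabytes * 8.0 / rate_gbps  # 8 bits per byte


# A hypothetical 5 GB movie over the 1.2 Gbps ILLUMA-T-to-LCRD link:
t = transfer_seconds(5.0, 1.2)
print(f"{t:.1f} seconds")  # about 33 s, i.e. under a minute
```

Real-world throughput would be somewhat lower once protocol overhead and link availability are accounted for, but the order of magnitude holds.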

The ILLUMA-T / LCRD end-to-end laser communications relay system is one small step for NASA, but one giant leap for space communications capabilities. Together with previous and future demonstrations, NASA is showcasing the benefits laser communications systems can have for both near-Earth and deep space exploration.

The goal of these demonstrations is to integrate laser communications as a capability within NASA’s space communications networks: the Near Space Network and Deep Space Network. If you are a mission planner interested in using laser communications, please reach out to scan@nasa.gov.

NASA’s Laser Communications Roadmap – proving the technology’s validity in a variety of regimes.
NASA / Dave Ryan

The ILLUMA-T payload is funded by the Space Communications and Navigation (SCaN) program at NASA Headquarters in Washington. ILLUMA-T is managed by NASA’s Goddard Space Flight Center in Greenbelt, Maryland. Partners include the International Space Station program office at NASA’s Johnson Space Center in Houston and the Massachusetts Institute of Technology (MIT) Lincoln Laboratory in Lexington, Massachusetts.

LCRD is led by Goddard and in partnership with NASA’s Jet Propulsion Laboratory in Southern California and the MIT Lincoln Laboratory. LCRD is funded through NASA’s Technology Demonstration Missions program, part of the Space Technology Mission Directorate, and the Space Communications and Navigation (SCaN) program at NASA Headquarters in Washington.

By Kendall Murphy and Katherine Schauer

Goddard Space Flight Center, Greenbelt, MD

