NASA Mars Mission Fueled By Amazon Web Services - InformationWeek



Just how is NASA keeping up with all that Mars mission data and imagery? Think cloud: Take a look at the wide range of Amazon Web Services and tools working behind the scenes.

Curiosity's Mars Mission
Cloud computing helps many businesses to wrangle and transmit data packets from different parts of the world. Amazon Web Services (AWS) and NASA's Jet Propulsion Laboratory (JPL) have now upped the stakes with a project that manages the flow of information from another part of the solar system--Mars.

The space agency's $2.6 billion Curiosity rover successfully landed on the red planet early on Aug. 6 and immediately began transmitting information back to Earth. These messages, which travel at the speed of light, take 14 minutes--at the planets' present orientation--to speed across the cosmos to waiting scientists. This long-distance transit would be logistically daunting under any circumstances, but NASA faced an even greater burden because of the huge volume of data these transmissions carry.
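The 14-minute figure is easy to sanity-check: dividing the approximate Earth-Mars distance at the time of landing (an assumed figure of roughly 248 million km, not stated in the article) by the speed of light gives the one-way delay. A quick sketch:

```python
# One-way light-time from Mars to Earth. The distance is an assumed,
# approximate figure for the planets' orientation at landing.
C_KM_PER_S = 299_792.458      # speed of light in km/s
DISTANCE_KM = 2.48e8          # ~248 million km (assumed)

delay_min = DISTANCE_KM / C_KM_PER_S / 60
print(f"One-way delay: {delay_min:.1f} minutes")  # ~13.8 minutes
```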

To address this profound challenge, JPL is using a wide gamut of AWS tools and services, including EC2, S3, SimpleDB, Route 53, CloudFront, the Relational Database Service, Simple Workflow, CloudFormation, and Elastic Load Balancing. This array of services is vital not only to the mission's research objectives but also to public outreach, as images recorded by the rover are made available almost immediately via JPL's Mars Science Laboratory site.

"NASA wanted to ensure that this thrilling experience was shared with fans across the globe by providing up-to-the-minute details of the mission," according to a case study AWS released to illustrate the project's technical accomplishments. With hundreds of thousands of concurrent visitors anticipated during traffic peaks, the case study asserts that "availability, scalability, and performance of the [site] was of the utmost essence." It also says that prior to AWS implementation, NASA/JPL did not possess the requisite Web and live streaming infrastructure to push hundreds of gigabits of content per second to the legions of site users.

"The public gets access as soon as we have access," Khawaja Shams, manager of data services for tactical operations at JPL, said in an interview. "All the images that come from Mars are processed in the cloud environment before they're disseminated." Amazon's cloud, he added, "allows us to leverage multiple machines to do actual processing."

The processing itself is complex, as the rover captures images using a stereoscopic system that uses two camera lenses. "In order to produce a finished image, each pair (left and right) of images must be warped to compensate for perspective, then stereo matched to each other, stitched together, and then tiled into a larger panorama," Jeff Barr, an AWS evangelist, wrote in a blog post.
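The final tiling step can be sketched in a few lines. The helper below is hypothetical (the function name, 256-pixel tile size, and image dimensions are illustrative, not JPL's actual pipeline); it computes the rectangles a finished panorama would be cut into:

```python
import math

def tile_grid(width, height, tile=256):
    """Split a width x height panorama into tile-sized rectangles,
    the 'tiled into a larger panorama' step described above."""
    cols = math.ceil(width / tile)
    rows = math.ceil(height / tile)
    tiles = []
    for r in range(rows):
        for c in range(cols):
            x0, y0 = c * tile, r * tile
            # Edge tiles are clipped to the image boundary.
            tiles.append((x0, y0, min(x0 + tile, width), min(y0 + tile, height)))
    return tiles

# A 1024x512 panorama yields a 4x2 grid of 256-pixel tiles.
print(len(tile_grid(1024, 512)))  # 8
```

Serving small fixed-size tiles, rather than one giant image, is what lets a browser-based viewer fetch only the portion of a panorama a user is looking at.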

Though complicated, this method is vital to the mission's research goals. "One of the big misconceptions about rovers is that they're driven via joystick," Shams stated, "but even at the speed of light, we can't get there." Because of this limitation, a plan must be uploaded into the rover, which then "semi-autonomously takes care of the whole thing." The image acquisition technique allows researchers to generate geographic metadata that serves as a foundation for these plans. "It gives scientists situational awareness to do the best science and keep the rover safe," said Shams.

According to the AWS case study, such awareness maximizes "the time that scientists have to identify potential hazards or areas of particular scientific interest." This enables researchers to send longer command sequences to the rover, thereby increasing "the amount of exploration that the Mars Science Laboratory can perform during any given sol," or Martian day.

JPL's use of AWS technology furthers NASA's reputation as an early and avid adopter of cloud computing. It likewise continues the agency's effort to engage the general public, one that has led to close relationships with Google and Microsoft, among others. In this instance, public engagement culminates in the Mars Science Laboratory site, which runs on the open-source CFML application server Railo on Amazon's EC2.

The Curiosity mission also extends NASA's recent effort to streamline operations and reduce costs by utilizing Amazon services for cloud-based enterprise infrastructure. NASA CIO Linda Cureton detailed the initiative in a June 8 blog post, writing, "This cloud-based model supports a wide variety of Web applications and sites using an interoperable, standards-based, and secure environment while providing almost a million dollars in cost savings each year." In the context of this mission, AWS allows NASA to track Web traffic in real time, and to scale capacity to meet demand. The cloud infrastructure also allows assets to be distributed intelligently across AWS regions depending on the part of the world from which requests originate. This functionality produces a secure and stable environment despite the high bandwidth logistics, AWS said. It also can be economical because AWS downsizes activity when traffic is low, avoiding the problem of expensive but under-used resources.
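The region-aware distribution described above amounts to routing each request to the region nearest its origin. A toy lookup illustrates the idea (the mapping and function names below are hypothetical, not NASA's configuration):

```python
# Hypothetical sketch: choose an AWS region from a requester's
# continent; unknown origins fall back to a default region.
REGION_BY_CONTINENT = {
    "north_america": "us-east-1",
    "europe": "eu-west-1",
    "asia": "ap-southeast-1",
}

def pick_region(continent: str, default: str = "us-east-1") -> str:
    return REGION_BY_CONTINENT.get(continent, default)

print(pick_region("europe"))      # eu-west-1
print(pick_region("antarctica"))  # us-east-1 (fallback)
```

In practice this kind of decision is made by DNS-level services such as Route 53 and edge caches such as CloudFront rather than application code, but the effect is the same: users pull content from infrastructure close to them.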

"Science data is growing at an exponential rate. Some upcoming instruments will produce terabytes of data every single day," Shams said. Such a deluge would have left NASA "out of data center space," making the ability to provision cloud-based machines invaluable. As NASA uses the cloud to solve its own puzzles, opportunities for other applications naturally arise.
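A back-of-the-envelope calculation shows how fast "terabytes every single day" outgrows fixed capacity (the daily rate below is an assumed, illustrative figure, not from the article):

```python
# Illustrative only: a single instrument at an assumed 5 TB/day
# accumulates petabytes of storage within one year.
TB_PER_DAY = 5
total_pb = TB_PER_DAY * 365 / 1024   # 1024 TB per PB
print(f"{total_pb:.2f} PB per year")  # 1.78 PB per year
```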

"We can provision a supercomputing cluster in the cloud that would qualify as one of the top 500 in the world" at a cost of "a couple hundred dollars an hour," he said. "Think of the possibilities."

