Now that the Curiosity rover has made an action hero’s entrance onto the red planet, members of NASA’s Jet Propulsion Laboratory (JPL) can breathe a quick sigh of relief. The car-sized rolling laboratory has already transmitted a small but breathtaking collection of images of the planet’s landscape, and will continue to do so over the course of its mission. Eager earthlings who want a glimpse of what Mars looks like from Curiosity’s point of view can turn to the JPL website and browse the pictures captured by the rover’s cameras. All this has been made possible by a collection of tools provided by Amazon, which recently published a case study about the Curiosity mission.
NASA and Amazon have an established history when it comes to working on missions to Mars. The cloud provider is handling images transmitted by the Opportunity exploratory rover, which continues to function after eight years of service. Amazon also had a role in handling Web traffic during the new rover’s complex landing procedure.
In preparation for Curiosity’s big debut, NASA asked Amazon to help serve the estimated hundreds of thousands of visitors hoping to watch the landing. A complex system was devised, incorporating load balancing, traffic monitoring, and a method to de-provision resources after the event took place. The system was benchmarked by SOASTA, which verified the stream could handle requests on the order of hundreds of gigabits per second.
There was good news all around as the landing was a success and the stream worked without any noticeable issues. Now that Curiosity is on the red planet, AWS will process pictures taken by the new inhabitant, making them available to JPL researchers and the public.
The workflow is slightly more complicated than sharing a photo from a smartphone. Although the rover carries 17 cameras in total, the panoramic pictures are assembled from images gathered by a stereoscopic camera mounted on its mast. An Amazon blog entry explains the process:
In order to produce a finished image, each pair (left and right) of images must be warped to compensate for perspective, then stereo matched to each other, stitched together, and then tiled into a larger panorama.
This process is completed using Amazon Simple Workflow Service (SWF) and the AWS Flow Framework, producing the graphics available to the public. The service provider says accelerated analysis of these images will lead to better decision-making, ultimately allowing the rover to undertake more exploration.
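The four steps the blog entry lists (warp, stereo match, stitch, tile) can be pictured as a simple sequential pipeline. The sketch below is purely illustrative and is not NASA/JPL code: the function names are made up, and the trivial NumPy operations stand in for the real computer-vision algorithms each step would perform.

```python
# Illustrative stand-in for the panorama pipeline described above:
# warp -> stereo match -> stitch -> tile. All operations here are
# placeholders, not the actual image-processing used by JPL.
import numpy as np

def warp(img):
    # Placeholder for perspective correction (here: just cast to float).
    return img.astype(float)

def stereo_match(left, right):
    # Placeholder for disparity estimation: per-pixel difference.
    return left - right

def stitch(left, right):
    # Placeholder for blending a warped pair into one frame: average.
    return (left + right) / 2.0

def tile(frames, cols):
    # Arrange the stitched frames into a larger panorama grid.
    rows = [np.hstack(frames[i:i + cols]) for i in range(0, len(frames), cols)]
    return np.vstack(rows)

def process_pairs(pairs, cols):
    frames = []
    for left, right in pairs:
        l, r = warp(left), warp(right)
        _disparity = stereo_match(l, r)  # would drive alignment in real code
        frames.append(stitch(l, r))
    return tile(frames, cols)

# Four tiny 2x2 "image" pairs tiled into a 4x4 panorama.
pairs = [(np.full((2, 2), i), np.full((2, 2), i + 1)) for i in range(4)]
panorama = process_pairs(pairs, cols=2)
print(panorama.shape)  # (4, 4)
```

In the real system, each of these steps would run as an SWF activity, letting the service coordinate and retry the stages independently rather than executing them in a single loop as the sketch does.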