While NASA scientists were busy binge-eating turkey and online shopping last November, the Curiosity Rover wasn't taking a break: it was snapping a 1.8 billion-pixel image from the surface of Mars. The 360-degree panorama, shared by NASA on March 4, is the highest-resolution image the rover has captured yet, offering enough detail to zoom in on impact craters, ridges, mountains, and patches of Martian dirt believed to have once been underwater.

The Curiosity Rover had to stay in the same spot for four days to capture the panorama, which is likely one of the reasons NASA hadn't attempted such a high-resolution panorama before. To keep the rover working while the scientists were off on Thanksgiving break, the Mastcam operators programmed it in advance to take the photos, adjusting the position of the rover's mast and the camera's focus.

The process took Curiosity six and a half hours spread over four days at the end of November 2019, because the camera was set to work only between noon and 2 p.m. Mars time to keep the lighting consistent across the entire panorama.

That same panorama then took NASA researchers months to stitch together, with between 1,000 and 1,200 individual photos going into the final composite.

At the same time, Curiosity also snapped photos with its medium-angle lens, which is wide enough to capture a panorama that includes part of the rover itself. That 650-megapixel panorama is lower in resolution but still offers enough detail for a close-up look at the rover's wires and the Martian dust coating its surface.

Both photos capture the area called Glen Torridon.

While the photo is the highest-resolution stitch from the now 7-year-old rover, the team has previously instructed Curiosity to capture the images for a 1.3 billion-pixel panorama.

“While many on our team were at home enjoying turkey, Curiosity produced this feast for the eyes,” Ashwin Vasavada, Curiosity’s project scientist at NASA’s Jet Propulsion Laboratory, said in a statement. “This is the first time during the mission we’ve dedicated our operations to a stereo 360-degree panorama.”

To get the full effect of what 1.8 billion pixels look like, explore the image using NASA’s navigation tool.
