As thrilling and entertaining as seeing all the historical footage from the Apollo Moon landing is, you must admit that the quality isn’t all that great sometimes.
Some of Apollo’s most famous footage remains grainy or blurry despite NASA’s efforts to restore and improve it. But recent advances in artificial intelligence have come to the rescue, giving viewers an almost entirely new experience of the historic Apollo footage.
Photo and film restoration with the help of AI
A photo and film restoration specialist known as DutchSteamMachine applied some AI magic to the original Apollo films, producing stunning video clips and photographs.
“I really wanted to provide an experience on this old footage that has not been seen before,” he told Universe Today.
However, he pointed out that the frame rate of the original footage is often off or fluctuating. “So the best way to find the framerate is to listen to landmarks the astronauts are talking about and match the footage to that.”
Take a look at this enhanced video of an Apollo 16 lunar rover traverse with Charlie Duke and John Young. The video, originally filmed at 12 frames per second (FPS), was boosted to 60 FPS.
Isn’t that astonishing? In this enhanced view of the Apollo 15 landing site at Hadley Rille, you’ll also be blown away by how clearly the Moon’s surface can be seen.
Neil Armstrong taking his first step
Look at how clearly Neil Armstrong can be seen in this enhanced version of Apollo 11’s often-seen “first step” footage. The visuals were originally shot on a camera mounted inside the Lunar Module, which produced a hazy video on 16 mm film.
NASA astronauts Neil Armstrong, Michael Collins, and Buzz Aldrin, armed with film cameras and video recorders, captured their historic flight. On the day of the lunar landing, Armstrong and Aldrin flew the Eagle Lunar Module (LM) while Collins remained in lunar orbit. Their perilous descent to the Moon’s surface was captured on camera, but the image quality has left much to be desired.
The video was shot from the right-hand window of the LM. The camera, however, was mounted at an awkward angle, and the small window gave only a narrow field of view.
“Due to the small size of the LM windows and the angle at which the movie camera was mounted, what mission commander Neil Armstrong saw as he flew and landed the LM was not recorded,” NASA explained.
Fortunately, the NASA team behind the Lunar Reconnaissance Orbiter Camera (LROC) has been able to reconstruct the view of the landing.
What inspired the content creator to enhance the videos?
The AI used by DutchSteamMachine is called Depth-Aware video frame INterpolation, or DAIN for short. This AI is open source, freely available online, and under continuous development and improvement.
Motion interpolation is a type of video processing in which intermediate frames are generated between existing ones, making motion in the video appear smoother.
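The idea of inserting in-between frames can be illustrated with a minimal sketch. DAIN itself uses a learned, depth-aware model to estimate per-pixel motion; the naive cross-fade below (a simple linear blend, not DAIN's actual method) only shows where the new frames go and how a 12 FPS clip becomes 60 FPS. The function name and test frames are illustrative, not from the original tooling.

```python
import numpy as np

def interpolate_frames(frame_a, frame_b, n_new):
    """Insert n_new intermediate frames between two frames by linear blending.

    A naive stand-in for learned motion interpolation: real interpolators
    like DAIN warp pixels along estimated motion paths, while a plain
    cross-fade only blends intensities.
    """
    frames = []
    for i in range(1, n_new + 1):
        t = i / (n_new + 1)                        # blend weight, 0 < t < 1
        blended = (1 - t) * frame_a + t * frame_b  # computed in float
        frames.append(blended.astype(frame_a.dtype))
    return frames

# Going from 12 FPS to 60 FPS means 4 new frames between each original pair:
# each of the 12 original frames per second becomes 5 frames, 12 * 5 = 60.
a = np.zeros((4, 4), dtype=np.uint8)           # dark frame
b = np.full((4, 4), 250, dtype=np.uint8)       # bright frame
mid = interpolate_frames(a, b, 4)
print(len(mid))        # 4 intermediate frames
print(mid[1][0, 0])    # second in-between: 0.4 * 250 = 100
```

A real pipeline would run a model such as DAIN on consecutive decoded frames and re-encode at the higher frame rate; the frame count arithmetic, however, is the same.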
He also noted that interpolating the motion-compensated frames that make the iconic Moon landing footage appear more fluid required a high-end GPU.
Explaining how long it takes to edit an entire clip, the specialist said a 5-minute video could take anywhere from six to 20 hours to complete.
“People have used the same AI programs to bring old film recordings from the 1900s back to life, in high definition and colour,” he said. “This technique seemed like a great thing to apply to much newer footage.”
Originally published at Tech Times