Artificial Intelligence Makes Old Films Look Like New

In 1906, the Miles brothers left their studio on Market Street in San Francisco, boarded a cable car, and started filming what would become an iconic short movie, “A Trip Down Market Street.” It is a fascinating document of life at the time. The brothers aim their camera straight ahead as the cable car rolls slowly along, capturing women in outrageous frilly Victorian hats hurrying across the tracks, a policeman strolling by wielding a billy club, and newsboys peddling their wares.

Early automobiles swerve in front of the cable car; some are convertibles, so we can see their drivers bounce inside. After nearly a dozen minutes, the filmmakers arrive at the turntable in front of the Ferry Building, whose towering clock would stop at 5:12 a.m. just four days later, when a massive earthquake and the fires that followed virtually obliterated San Francisco.

Well over a century later, Denis Shiryaev, an artificial intelligence geek, has transformed “A Trip Down Market Street” into something even more magical. Using a variety of publicly available algorithms, Shiryaev sharpened and colorized the film, upscaled it to 4K resolution (3,840 by 2,160 pixels), and bumped the choppy frame rate up to 60 frames per second, a process known as frame interpolation. The result is mesmerizing. We can finally see vibrant colors on those flamboyant Victorian hats and puckish looks on those newsboys’ faces. Most important of all, we can see, in unprecedented detail, the byproducts that horses left on the ground along the cable car’s tracks.
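
For readers curious what frame interpolation means in practice, here is a minimal Python sketch (the file names are hypothetical). It simply blends each pair of adjacent frames with OpenCV to double the frame rate; the learned interpolators used for work like Shiryaev’s instead estimate motion and synthesize genuinely new in-between frames, which is why his results look so much smoother.

```python
# Minimal sketch: naive frame interpolation by blending adjacent frames.
# This is NOT the neural-network method described above -- learned
# interpolators synthesize new detail, while this simply averages
# neighboring frames to double the frame rate.
import cv2

def double_frame_rate(src_path: str, dst_path: str) -> None:
    cap = cv2.VideoCapture(src_path)
    fps = cap.get(cv2.CAP_PROP_FPS)
    width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    fourcc = cv2.VideoWriter_fourcc(*"mp4v")
    out = cv2.VideoWriter(dst_path, fourcc, fps * 2, (width, height))

    ok, prev = cap.read()
    while ok:
        ok, curr = cap.read()
        out.write(prev)
        if not ok:
            break
        # Insert a synthetic in-between frame: a 50/50 blend of neighbors.
        midpoint = cv2.addWeighted(prev, 0.5, curr, 0.5, 0)
        out.write(midpoint)
        prev = curr

    cap.release()
    out.release()

# Hypothetical file names, for illustration only.
double_frame_rate("market_street_1906.mp4", "market_street_60fps.mp4")
```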

Shiryaev, the product director of Neural.love, a company that provides artificial intelligence-driven video enhancement for clients, has not stopped at 1906 San Francisco.

He has waved his magic artificial intelligence wand over other historical films as well, such as the Lumière brothers’ 1895 French short of a train pulling into a station and spilling its passengers onto the platform. You can also join the astronauts of Apollo 16 as they drive their lunar rover around the Moon in 1972, or take a trip through New York City in 1911. All of these films have been gussied up with remarkable clarity, giving us modern folk an enchanting view of life long ago.

You cannot quite call these films restorations, because the algorithms are not just removing imperfections; they are filling in approximations of the data missing from old, blurry, low-frame-rate footage. The algorithms make stuff up based on their previous training. The colorization algorithm DeOldify, for example, was trained on more than 14 million images to build an understanding of how objects in the world are usually colored. It can then apply that knowledge to old black-and-white films, painting the footage with vibrant hues. That distinction, Shiryaev says, is important.
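
As a rough illustration of how such a colorizer is applied, not of DeOldify’s actual interface, the sketch below treats a black-and-white frame as the lightness channel of a Lab image and asks a model to predict the two missing color channels. The `predict_ab` function and the file names are placeholders standing in for a trained network.

```python
# Generic sketch of learned colorization (NOT DeOldify's actual API).
# `predict_ab` stands in for a trained network that, given the grayscale
# L channel of a frame, predicts the two color (a, b) channels it learned
# from millions of example images.
import cv2
import numpy as np

def predict_ab(l_channel: np.ndarray) -> np.ndarray:
    """Placeholder for a trained colorization network.
    Returns an (H, W, 2) array of predicted a/b color channels."""
    h, w = l_channel.shape
    return np.full((h, w, 2), 128, dtype=np.uint8)  # neutral stand-in values

def colorize_frame(gray_frame: np.ndarray) -> np.ndarray:
    # Treat the black-and-white frame as the L (lightness) channel of Lab.
    ab = predict_ab(gray_frame)
    lab = np.dstack([gray_frame, ab])
    # Convert the completed Lab image back to BGR for display or encoding.
    return cv2.cvtColor(lab, cv2.COLOR_Lab2BGR)

# Hypothetical file names, for illustration only.
frame = cv2.imread("market_street_frame.png", cv2.IMREAD_GRAYSCALE)
color = colorize_frame(frame)
cv2.imwrite("market_street_frame_color.png", color)
```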

His team calls the process an enhancement because it relies on trained neural networks: when a network redraws a picture, it adds a new layer of data. Colorization, in that sense, is an enhancement rather than a restoration.