
'The Beatles: Get Back' shows that deepfake tech isn't always evil

Dec 06, 2021 Hi-network.com

It doesn't happen often, but occasionally I observe something that is a genuine watershed moment in technological advancement and can acknowledge that a technology's potential for good might outweigh its potential for evil.


For me, such a moment was watching The Beatles: Get Back, which is now airing on Disney+. I had heard about it and seen some of its footage slowly being released for about a year, but I had no idea just how much technology had been applied to its production.

I've seen many films made from historical footage from the 1960s and even the 1970s and 1980s, and so much of the source material is in poor condition. In many cases, even if the film stock is in pristine condition and requires minimal restoration, it will have muted colors. The audio from the time period can be hissy and often sounds like it was produced with less than professional recording equipment. Even with live performances such as Woodstock, the archival material is frequently in suboptimal condition due to the nature of what was used to record it, such as 16mm film -- an economical format popular with enthusiasts of the era that doesn't age well.

In these cases, when you watch such content, you feel like you are looking at something that is, in fact, archival. There is a detachment -- you are viewing history, it feels like the past, something that occurred decades before. It doesn't feel real or current.

Before watching the first hours of Get Back, only once had a film made me feel thrown back in time, watching events as they happened or as if they had been recorded yesterday. The documentary film Apollo 11, produced by Todd Douglas Miller and released in 2019, used pristine 70mm prints from NASA and the National Archives that were then scanned digitally, drawing on over 11,000 hours of recordings to produce the film.

But what makes it all the more significant with Get Back is the condition of the source material: the 60 hours of 16mm film and 150+ hours of NAGRA audio created during 21 days in 1969 at Twickenham Studios in England. Unlike what was used for Apollo 11, it was not in pristine condition -- it was washed out and grainy, and the audio was poor or muted. So Disney brought in Peter Jackson, the director of The Lord of the Rings and The Hobbit films, to restore the video and the audio.

What happened next is quite astonishing. Jackson had the grainy, desaturated film fed through a computer algorithm. Suddenly, the resulting video had bright, vibrant colors and sharp images that looked as if they had been filmed yesterday, not in 1969. But even more impressive than the film restoration, which is a technical triumph in and of itself, is what they did with the audio. Per Jackson, in his Variety interview:

"To me, the sound restoration is the most exciting thing. We made some huge breakthroughs in audio. We developed a machine learning system that we taught what a guitar sounds like, what a bass sounds like, what a voice sounds like. In fact, we taught the computer what John sounds like and what Paul sounds like. So we can take these mono tracks and split up all the instruments we can just hear the vocals, the guitars. You see Ringo thumping the drums in the background but you don't hear the drums at all. That that allows us to remix it really cleanly."

The machine learning technology used in this restoration is very similar (if not identical) to what has been used in the past for deepfakes, making fake video look and sound real. A prime example is the Emmy Award-winning demonstration video produced by MIT's Center for Advanced Virtuality, "In Event of Moon Disaster," which depicts then-President Nixon reading a prepared statement that the Apollo 11 astronauts had perished in a catastrophe. To create it, MIT took Nixon's likeness and speech from television appearances and fed them into a machine learning system to synthesize the audio and video and produce the uncanny film.

The demonstration is a warning that these technologies can be used for nefarious purposes. Efforts are currently underway, such as the Coalition for Content Provenance and Authenticity (C2PA), to create standards for attaching context and history to digital media, so that the authenticity of a particular image, video, or audio stream can be established in the future -- when these technologies are expected to be used far more heavily.
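C2PA's actual manifest format is considerably more elaborate, but the core mechanism is simple to sketch: bind a cryptographic hash of the media to a signed provenance claim, so that any later alteration is detectable. A minimal conceptual illustration in Python follows -- these are not the real C2PA data structures, and the claim text is invented:

```python
# Conceptual sketch only -- NOT the real C2PA manifest format.
import hashlib
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

def make_claim(media_bytes: bytes, key: Ed25519PrivateKey) -> dict:
    # Bind a hash of the media to a human-readable provenance claim.
    manifest = {
        "sha256": hashlib.sha256(media_bytes).hexdigest(),
        "claim": "captured January 1969, Twickenham",  # hypothetical claim
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    return {"manifest": manifest, "signature": key.sign(payload).hex()}

def verify_claim(media_bytes: bytes, record: dict, public_key) -> bool:
    payload = json.dumps(record["manifest"], sort_keys=True).encode()
    try:
        public_key.verify(bytes.fromhex(record["signature"]), payload)
    except InvalidSignature:
        return False  # the manifest itself was tampered with
    # Signature is valid; now confirm the media is unchanged.
    expected = record["manifest"]["sha256"]
    return expected == hashlib.sha256(media_bytes).hexdigest()

key = Ed25519PrivateKey.generate()
media = b"...raw video bytes..."
record = make_claim(media, key)
print(verify_claim(media, record, key.public_key()))         # True
print(verify_claim(media + b"x", record, key.public_key()))  # False
```

Real C2PA manifests add certificate chains, edit histories, and embedding of the manifest into the media file itself, but the tamper-evidence principle is the same.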

So can this deepfake technology be used for evil? Yes. But if Get Back proves anything, it is that the same techniques can be used for "deep restoration" as well. A great deal of vintage content -- original films and archival footage alike -- can be repaired this way, making it look brand new again, or at least the freshest it has ever looked, on modern content delivery platforms.

Can machine learning be used positively with archival source materials through deep restoration instead of deep fakery? Talk Back and Let Me Know.


