Google and YouTube are addressing AI and copyright issues

Google has made clear it is going to use the open web to inform and create anything it wants, and nothing can get in its way. Except, perhaps, Frank Sinatra.

Our Staff


Google and YouTube are working hard to address AI and copyright issues. But who pops up in your brain when you’re pondering the wild frontier of online copyright law? Frank Sinatra, of course!

Isn’t it great to ensure his estate and his label, Universal Music Group, profit every time an AI rendition of his song is played on YouTube? Isn’t it brilliant to create a new class of royalty contracts for these music labels?

All to protect the dominance of your video platform while arguing that using content from books and news sites for AI search results – without paying anyone – is completely fair? Isn’t that correct?

Google has announced a deal with Universal Music Group to develop an AI framework for mutual goals. This isn’t just any framework – they’re inventing new intellectual property rights that could reshape the industry. And they’re hinting at a condition for inclusion in Search: letting Google use your data for AI training.

Let’s walk through it.

In April, a user named Ghostwriter977 released a song called “Heart on My Sleeve” featuring Drake and The Weeknd, created with AI technology. Both artists are represented by Universal Music Group.

Streaming services like Apple and Spotify keep tight control of their music catalogs and quickly comply with label requests. Open platforms like YouTube, by contrast, typically only remove user content for clear policy violations – mainly copyright infringement. But the rules around copyright are complicated. Notably, voices cannot be copyrighted.

Individual songs used for AI training can be copyrighted, but there is no federal law protecting likenesses, leaving a complex patchwork of state laws.

This presents a challenge for large music corporations like UMG, but they found a workaround: the track contained a sample of the Metro Boomin producer tag.

Google, like other AI companies, is collecting large amounts of web data to train its AI systems, without paying for the data used. They believe their actions are permissible under Section 107 of the Copyright Act, the fair use provision.

Google’s Obligation to the Music Industry

Here’s the rub, folks. This whole “fair use” thing is 1) an affirmative defense to copyright infringement, which basically means you have to fess up to making the copy in the first place. And 2) it’s judged case by case in the courts, a process as clear as mud and slower than a snail on tranquilizers.

This haphazard system often results in face-palm worthy outcomes, throwing entire creative fields into chaos for years on end. How’s that for progress?

Well, Google has to keep the music bigwigs especially grinning from ear to ear, right? Because without those all-encompassing licenses from the labels, YouTube would be up the proverbial creek without a paddle. Do we really want to return to the days of labels hauling unsuspecting parents into court?

It Seems Like YouTube Has Finally Bowed Down.

Well, isn’t this interesting? In a blog post where he’s trumpeting a new deal with UMG, YouTube’s head honcho, Neal Mohan, starts muttering about working on AI… things. And get this: he’s making these nebulous pledges to broaden Content ID.

You know, that sometimes-contentious YouTube system that usually ensures copyright holders see some green for their sweat and tears? Yeah, that one. Apparently, it’s going to cover “generated content” now. Whatever that means.

Mohan has announced a new “AI Music Incubator”. It will feature numerous UMG artists and producers, including the estate of Frank Sinatra.

They plan to extend their content moderation policies to address “AI challenges”. However, they omitted the issue of AI deepfakes. Their proposed solution to tech problems is more technology.

AI, a buzzword in technology, can also be used for copyright protection – identifying stolen content. YouTube will keep investing in AI-powered tools like Content ID, policy enforcement, and detection systems to keep its community of viewers, creators, artists, and songwriters safe, Neal explains.

Combining “copyright and trademark abuse” with serious issues like malicious deepfakes and AI misuse seems inappropriate. One may affect your profits, but the others are potentially life-threatening and undermine democracy. That’s noteworthy, isn’t it?

Let’s get real here, folks. The music industry, and we’re talking chiefly about UMG here, won’t settle for any half-baked solutions like AI councils that don’t have any bite.

Nah, they’re holding out for the big guns – a shiny new royalty system that gives them a slice of the pie every time an artist’s voice is used. And brace yourselves, ’cause this is a system that isn’t even a twinkle in the eye of current copyright law.

When AI Drake created a stir on YouTube, UMG issued numerous takedowns based on the Metro Boomin sample in the track. UMG’s EVP of Digital Strategy, Michael Nash, confirmed as much on the company’s quarterly earnings call.

“Generative AI, powered by those hulking, hefty language models, is just merrily pirating our intellectual property and stamping all over copyright law like it’s a soggy doormat, in more ways than one,” he lamented. “These corporate behemoths need to get their act together, ask nicely for permission, and sign on the dotted line to use our copyrighted content for AI training or, heaven forbid, any other scheme.”

 YouTube, in all its infinite wisdom, will try to beef up Content ID to start flagging content with voices that even mildly resemble UMG artists. And then what? UMG will swoop in, either axing those videos or raking in the royalties. Why? Because they can, that’s why.

We’ll also get to enjoy some polished promotional videos. Imagine UMG’s Ryan Tedder asking Google Bard for a melancholic beat for a rainy day, then praising the AI. Exciting, isn’t it?

This is a great solution for the team at YouTube. They are successful, and it keeps them out of drawn-out legal disputes over AI and fair use. But what about the rest of us?

Problems with This

Let’s consider a scenario. Currently, Content ID is grounded in actual intellectual property law. Imagine you’re a music critic and Content ID mistakenly flags your work as copyright infringement. If you dispute it, does YouTube step in to resolve the issue?

No, they send you through a tedious process, and if that fails, they subtly suggest you pursue legal action. (Most YouTubers don’t take this route, instead they find complex ways to avoid Content ID’s overeager flagging.)

When YouTube gives big record labels more rights to artists’ voices, it complicates matters. There’s no foolproof watermark for AI content yet, so no system can perfectly distinguish between a robot Drake and a youngster aspiring to be Drake.

Consider YouTube providing this unusual private right to everyone. What will happen next? Will our favorite Donald Trump impersonators disappear during elections? What about Joe Biden imitations? Where does it end?

DeSantis is known for advocating speech regulations. How will YouTube respond if he demands removal of depictions of himself, especially after taking down AI Frank Sinatra? Are we ready for that situation, or do we only worry about music rights?

Well, if there are answers buried somewhere in this blog post, they’re playing a darn good game of hide and seek. But, lo and behold, what do we have here? Oh, it’s just Universal Music Group grinning like a Cheshire cat.

YouTube and UMG are collaborating, while Google uses its dominant internet position to gather data for its AI models. 

We find ourselves in a time where Google dominates web traffic. In response, websites are quickly becoming AI-created SEO traps. This situation is indeed deteriorating.

Google seems to have a strong hold over web publishers who invest considerably in content creation, in hopes of improved page rankings. Moreover, Google uses this content to further train its AI models. 

Google has introduced the Search Generative Experience (SGE), an AI-powered tool designed to directly respond to search queries, particularly those concerning purchases. Interestingly, most SGE demos performed by Google conclude with a cash register sound.

Over time, this will just be how search works.

Google’s thriving, but publishers are struggling with diminishing Google referrals and reduced affiliate revenue. They can’t sever ties with search traffic, can they, Sundar Pichai?

The concept of feeding an AI with Sinatra songs until it mimics his style is comparable to filling it with bike reviews so it becomes a cycling expert. However, there’s no AI Music Incubator for the web, and no agreements with web publishers are in sight. Google’s position? Simple: if its search crawlers can access content on the open web, they’ll use it to enhance their AI. They’ve even updated their privacy policy to say so.

Yes, a website can theoretically block Google’s crawlers using its robots.txt file. OpenAI, the creator of ChatGPT, now lets sites block its GPTBot crawler the same way – after, of course, having already hoovered up the available data. But there’s a catch: blocking Google’s crawlers means your site will be removed from search results.
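To make the mechanics concrete, here’s a minimal sketch of how a well-behaved crawler consults robots.txt, using Python’s standard library. The rules and URL below are hypothetical examples, not any real site’s file or any company’s actual crawler logic.

```python
# Toy demonstration of robots.txt gatekeeping (hypothetical rules).
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: Googlebot
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot is welcome; GPTBot is shut out entirely.
print(parser.can_fetch("Googlebot", "/articles/some-story"))  # True
print(parser.can_fetch("GPTBot", "/articles/some-story"))     # False
```

The catch the paragraph above describes is exactly this: there is one `Googlebot` rule for both search indexing and AI training, so a site can’t refuse one without refusing the other.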

Well, well, well, isn’t this a spicy meatball of a situation? Our good old friend, the New York Times, has been playing a bit of a cat-and-mouse game with OpenAI’s GPTBot and Google.

You see, their robots.txt file is like the bouncer that lets Google in, but gives the cold shoulder to OpenAI’s GPTBot. How’s that for picking favorites, eh? And it doesn’t stop there. The Times recently got all high and mighty, updating their terms of use to say “no way, José” to anyone wanting to use their content to train AI.
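For illustration, a robots.txt playing favorites that way might look like the following. This is a simplified, hypothetical sketch, not the Times’ actual file (which changes over time):

```text
# Hypothetical sketch, not any site's real robots.txt.
# Shut the door on OpenAI's crawler:
User-agent: GPTBot
Disallow: /

# Everyone else, including Googlebot, remains welcome:
User-agent: *
Disallow:
```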

Smells like they’re trying to keep their cake and eat it too, doesn’t it? But here’s the kicker: they had the chance to block both Google and OpenAI on the tech front, but instead opted for the legal route. Yeah, you heard that right. They inked a deal with Google and are now flexing their muscles, pondering a lawsuit against OpenAI. Meanwhile, OpenAI’s been doing some deal-making of its own.

They’ve shaken hands with The Associated Press. This sets up a juicy dynamic where AI companies are peeling off big players from coalitions that could’ve had some serious bargaining power over the platforms. Oh, and just so we’re clear, The Verge’s parent company, Vox Media, backs a bill called the JCPA.

Remember the old days of internet remixing and sharing? Those are gone. Welcome to the era of pay-to-play. 

AI, copyright, and fair use are thorny subjects. How the lawsuits from major players like Sarah Silverman and Getty Images will shake out is anyone’s guess. Humans can’t be programmed like computers: you might imitate your favorite author’s style by studying their work, but you can’t duplicate their mind.

Aren’t these copyright dramas a hoot? The only thing that’s as clear as a vodka tonic is how they’re about to throw a spanner into the whole damn internet. They’re poised to shake up copyright law like a snow globe and might even force us all to reevaluate how we interact with art.

Remember when the social web was all about Everything is a Remix? Those were good times, weren’t they? But brace yourselves, folks, ’cause the new decade’s catchphrase seems more like a rude awakening – “Screw You, Pay Me.”

So, this is all gonna be a long, drawn-out process, huh? And wouldn’t ya know it, Google’s just sitting pretty, taking its sweet time with everything. They’re over there, chin-stroking and musing about whipping up a new version of robots.txt.

AI could potentially overwhelm the internet, not only by flooding user-generated platforms with irrelevant content but also by damaging Google’s search results. This could force Google to resort to high-quality content deals. 

Interestingly, the ‘future’ Google appears to be similar to today’s YouTube, filled with user-generated content alongside profitable licensing deals with TV networks, record labels, and sports franchises. It looks like Google is becoming the very thing it vowed to dismantle in its early days.

On a Final Note

Google and YouTube’s Copyright Dilemma

Despite the challenges, there are potential solutions that could help Google and YouTube strike a better balance. For example, they could invest in more sophisticated AI algorithms that can better distinguish between fair use and copyright infringement. They could also work more closely with content creators to ensure that their rights are protected while still allowing for creative expression and innovation.

Video Source: U.S. Copyright Office

Frequently Asked Questions

How are Google and YouTube involved in AI and copyright issues?

Google owns YouTube, and both companies use artificial intelligence (AI) to identify and manage copyrighted content. However, this has led to complex legal issues regarding copyright infringement and fair use.

How does YouTube use AI to manage copyright?

YouTube uses AI to scan and identify copyrighted content uploaded by users. This helps prevent copyright infringement and allows content owners to manage their intellectual property on the platform.
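Content ID’s internals are proprietary, so purely as a loose illustration, fingerprint-style matching can be sketched as follows: hash overlapping windows of a signal into a reference index, then check how many of an upload’s windows appear in it. Every detail here (the window size, the hash-set lookup, the toy lists standing in for audio) is an assumption for demonstration, not YouTube’s actual method.

```python
# Illustrative sketch of fingerprint-style matching, NOT Content ID.
# We fingerprint short overlapping windows of a "signal" (a list of
# numbers standing in for audio) and score an upload by how many of
# its windows appear in the reference index.

def fingerprints(signal, window=4):
    """Hash every overlapping window of the signal."""
    return {hash(tuple(signal[i:i + window]))
            for i in range(len(signal) - window + 1)}

def match_score(reference_index, upload, window=4):
    """Fraction of the upload's windows found in the reference index."""
    fps = fingerprints(upload, window)
    return len(fps & reference_index) / max(len(fps), 1)

# Build an index for one "copyrighted" track.
track = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8, 9, 7]
index = fingerprints(track)

reupload = track[2:12]                        # a clipped re-upload
original_work = [7, 7, 1, 2, 0, 4, 4, 2, 1, 6]  # unrelated content

print(match_score(index, reupload))       # 1.0 – every window matches
print(match_score(index, original_work))  # 0.0 – no windows match
```

Exact-hash matching like this breaks on any pitch shift or re-recording, which is why real fingerprinting systems work on perceptual features – and why, as discussed above, a sound-alike human voice is so much harder to flag than a re-uploaded recording.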

What are the challenges of using AI for copyright enforcement?

One challenge is that AI is not always accurate in identifying copyrighted content. This can lead to false positives, where non-infringing content is mistakenly flagged as copyrighted. Additionally, some content creators have criticized YouTube’s copyright policies, claiming that they unfairly target smaller creators and limit creative expression.

What solutions have been proposed?

One solution is to improve the accuracy of AI algorithms used to identify copyrighted content. Another solution is to provide better support and resources for content creators who are affected by copyright claims. Additionally, there have been calls for YouTube to reform its copyright policies to better balance the interests of content creators and copyright holders.

How does copyright law affect YouTube users?

Copyright law can limit the ways in which content creators can use copyrighted material in their own work. It can also result in copyright claims and takedowns for users who upload copyrighted content without permission. However, copyright law also provides protections for content creators and encourages the creation of new and original works.

What is the future of AI and copyright management?

The use of AI to manage copyright on platforms like YouTube is likely to continue to evolve and impact the way that content is created and shared. As AI becomes more accurate and efficient, it may be able to better balance the interests of content creators and copyright holders. However, there are also concerns that overreliance on AI could limit creative expression and lead to censorship.


