Two separate actions, one a US lawsuit and the other a Canadian law, have begun to inject some sanity into questions of ownership, copyright, and technology's relationship with both.
Canada's Online News Act, which takes effect at the end of this year, forces Google and Meta to pay for the news to which they link. They will have to compensate publishers for the content they consume, just like any other user. Both companies are apoplectic at the idea, and in response both have threatened to block Canadian news in-country.
Meanwhile, comedian and author Sarah Silverman and authors Christopher Golden and Richard Kadrey are suing OpenAI and Meta for copyright infringement, alleging that the companies' AIs were trained on their works without permission or payment. In other words, their work is being used to generate corporate profits without compensating its owners.
Both of these are incredibly welcome developments. Leading tech companies tend to think that everything that is not a service they provide should be free—to them. Thus, they believe they can allow their users to disseminate articles from any newspaper or magazine, and train AIs on other people's books, music, art, etc., without paying anyone a dime.
That's not the way it works. Already, social media companies like TikTok, Facebook, Instagram, and YouTube pay music publishers a blanket fee for content shared on their platforms. (Music publishers are suing Twitter for $250 million for failing to do the same.) Why should print content—news, fiction, or non-fiction—be any different?
When Australia passed a law requiring payment to news publishers, Facebook briefly removed news content from its Australian site. In the end, Facebook lifted the ban on Australian news content and agreed to pay for it.
Had Facebook maintained its news embargo, might that have been a welcome outcome? The free sharing of copyrighted content has driven journalism into a state of financial crisis. Social media's presentation of legitimate, fact-based news on par with propaganda, misinformation, and outright lies has proven a threat to public health, election integrity, and democracy itself. In fact, social media algorithms tend to promote the most sensational items, not the most accurate ones.
There is a difference between ease of use and intellectual torpor. Taking news on a come-what-may basis via social media is easy. Some may not care that what they get could be complete BS; but if accuracy matters to you, fact-based news is worth the effort of a bit of curation, and even a few dollars a month out of pocket. You pay for Spotify. You pay for Tidal. If you want to consider yourself well-informed, you have to pay for the best information, too.
If AI companies want to profit by training their machines on copyrighted content, they should pay for that content. Perhaps having to do so would make them more careful and selective in training their models—a very good thing, considering their current propensity for producing lies, inaccurate information, and bigotry.
Leonce Gaiter – Vice-President, Content & Strategy