Anybody know if ChatGPT has passed the bar exam yet? It might want to start studying.
Microsoft and OpenAI — the creator of DALL-E and ChatGPT — face a lawsuit alleging they illegally used code that developers posted to GitHub. Last month, artists sued AI art generators such as Stability AI.
This litigation phase may be new for AI, but it’s old news when it comes to disruptive technologies. In fact, according to Vox’s Peter Kafka, the rise of AI might just match the evolution of the music industry.
Remember Napster?
It was a file-sharing service where you could download a song over the internet for free (and for several hours of your time if you had a 14.4k modem like the one at my house).
The OG version of Napster fell apart after a torrent of lawsuits from artists and labels who considered the service a medium for stealing music.
As an attorney who litigated against record labels told Kafka: Many creators, and creative companies, will see AI as stealing their work, too.
Like Napster, AI involves sharing
Most generative AI tools crawl the web for data, collecting material from numerous sources, then pump out an essay, a painting, or whatever.
- Many of those sources are copyrighted or available only under license. The lawsuit against Microsoft and OpenAI essentially argues that the AI tools don’t follow the proper protocols for using code shared on GitHub.
- AI backers say the machines are “learning” rather than stealing, and creating something new from a melange of existing materials.
How does this all get solved? Post-Napster, the music industry waged a losing battle against the internet for the next decade before capitulating — and making enormous bank by licensing songs to streamers.
The big companies with content on the internet may eventually strike a similar deal with AI.