
Proposed Law Would Force AI Companies To Reveal When Their Tech Is Trained On Copyrighted Material


There is no shortage of controversy when it comes to AI technology. From development to deployment, AI companies are facing heavy scrutiny and pushback.

One of the most public fights has been over whether companies can use copyrighted material to train their AI models.

This new bill doesn’t weigh in on whether the practice is ethical, but it would force companies to disclose when a model has been trained on copyrighted material.

Representative Adam Schiff (D-CA) introduced the bill, which would (if passed) require businesses to disclose a list of all the copyrighted content they used in a training data set.

This disclosure would have to be made at least 30 days before the model is released to the public.

For models currently in use, businesses would be required to disclose copyrighted content if they alter training data “in a significant manner.”

Affected companies would face monetary penalties for non-compliance.

“We must balance the immense potential of AI with the crucial need for ethical guidelines and protections. My Generative AI Copyright Disclosure Act is a pivotal step in this direction,” Schiff said.

The proposed legislation has been endorsed by groups representing creative industries, including the Recording Industry Association of America and SAG-AFTRA.

Companies like OpenAI have already been sued over their use of copyrighted materials.

AI companies have argued that training on copyrighted materials qualifies as fair use, which means they don’t believe they should have to pay the rights holders.

Obviously, the owners of those materials do not agree.

The court of public opinion is turning against AI in a lot of sectors, so we’ll have to see if more disclosure spurs more lawsuits.

Either way, I don’t think the creators of the world are going down without a fight.

If you enjoyed that story, check out what happened when a guy gave ChatGPT $100 to make as much money as possible, and it turned out exactly how you would expect.
