A bipartisan group of senators has introduced the Content Origin Protection and Integrity from Edited and Deepfaked Media (COPIED) Act, which aims to protect content creators against the unauthorized use of their work to train AI models or generate AI content.
Key provisions of the COPIED Act: The bill would require standards for watermarking AI-generated content with provenance information, make it illegal to tamper with this information, and allow individuals and authorities to sue for violations:
- The National Institute of Standards and Technology (NIST) would be tasked with creating guidelines and standards for attaching provenance information to AI-generated content, i.e., watermark-style, machine-readable details about where a piece of content came from (a rough sketch of what such metadata could look like follows this list).
- Removing, disabling, or tampering with this “content provenance” information would be illegal under the proposed law.
- Individuals could sue for violations, and the FTC and state attorneys general would be authorized to enforce the requirements.
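To make "content provenance information" concrete, here is a minimal, purely illustrative sketch of embedding a provenance record in an image file. The field names, the model identifier, and the use of PNG text chunks are assumptions for illustration only; the actual format and watermarking method would be whatever standard NIST defines under the bill.

```python
import json
from PIL import Image
from PIL.PngImagePlugin import PngInfo

# Hypothetical provenance record; the field names are illustrative,
# not the standard the bill would direct NIST to define.
provenance = {
    "generator": "example-image-model-v1",   # assumed model identifier
    "created": "2024-07-12T00:00:00Z",
    "synthetic": True,
}

# Stand-in for an AI-generated image.
image = Image.new("RGB", (256, 256), color="white")

# Attach the record as a PNG text chunk and save the file.
metadata = PngInfo()
metadata.add_text("ai_provenance", json.dumps(provenance))
image.save("output.png", pnginfo=metadata)

# Downstream tools can read the record back; under the COPIED Act,
# stripping or altering this information would be unlawful.
print(Image.open("output.png").text["ai_provenance"])
```

In practice, a robust scheme would likely pair metadata like this with watermarks embedded in the content itself, since plain metadata is trivial to strip; that is exactly the kind of tampering the bill would prohibit.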
Protecting artists and journalists from AI exploitation: Co-sponsors emphasize the need to defend content creators against AI models profiting from their work without permission:
- Sen. Marsha Blackburn (R-TN) says the bill takes an important step to “better defend common targets like artists and performers against deepfakes and other inauthentic content.”
- Senate Commerce Committee Chair Maria Cantwell (D-WA) believes the act will provide “much-needed transparency around AI-generated content” and put creators back in control of their content.
Broad support from creative industries: Several groups have endorsed the COPIED Act, citing the need for greater visibility into AI development and fair competition:
- The Recording Industry Association of America (RIAA) says leading tech companies refuse to share basic data about their AI models while profiting from unlicensed copyrighted material.
- Other backers include SAG-AFTRA, Nashville Songwriters Association International, Recording Academy, National Music Publishers’ Association, and various news media organizations.
Broader implications for the AI landscape: If passed, the COPIED Act could have significant impacts on the development and deployment of AI systems across industries:
- Requiring watermarks and provenance information for AI-generated content would introduce a new level of transparency and accountability in the field.
- Empowering individuals and authorities to take legal action against AI companies could deter unauthorized use of copyrighted material and encourage more ethical practices.
- However, enforcing these requirements and proving violations could be challenging, given the complex and often opaque nature of AI systems.
The COPIED Act reflects growing concerns over the potential misuse of AI and the need for stronger regulations to protect content creators. While some publishers have signed deals with AI companies for permission to use their data, many artists and journalists feel exploited by AI models that profit from their work without consent or compensation. The bill’s broad support from creative industries suggests a growing consensus around the need for legislative action to address these issues. However, the full implications of the COPIED Act, if passed, remain to be seen in the rapidly evolving AI landscape.
Source: Senate Introduces Bill to Guard Artists' and Journalists' Creations Against AI