What we know about the artists who leaked OpenAI’s Sora model

The tension between AI companies and artists has reached a new flashpoint as OpenAI faces backlash from testers of its Sora video generation platform.

The incident unfolds: A group of artists calling themselves PR Puppets briefly shared public access to OpenAI’s Sora video generation platform, leading to an immediate suspension of the testing program.

  • The group posted a “Generate with Sora” access point on Hugging Face, allowing public access to OpenAI’s actual Sora API
  • Social media users quickly confirmed the access was genuine, as the generated videos were served from the videos.openai.com domain
  • OpenAI revoked access within hours, though many users managed to generate and share videos during the brief window

Core grievances: PR Puppets claims to represent approximately 300 artists who feel exploited by OpenAI’s alpha testing program.

  • The group argues they’re providing unpaid labor through bug testing and feedback for a company valued at $150 billion
  • They protest OpenAI’s requirement for approval before sharing any Sora-generated content publicly
  • Only select artists will reportedly receive broader exposure for their Sora-created films

OpenAI’s response: The company maintains that participation in the alpha test is entirely voluntary with no mandatory feedback requirements.

  • An OpenAI spokesperson emphasized that Sora remains in research preview
  • The company states it’s working to balance creativity with safety measures
  • Sources close to OpenAI indicate only a few protest signatories were actually part of the alpha testing group

Development timeline: Sora’s public release faces continued delays despite initial enthusiasm.

  • The platform was first showcased in February 2024, generating significant interest
  • Original plans called for a public release by the end of 2024
  • CPO Kevin Weil cites the need to refine the model, address safety concerns, and scale computing capacity as reasons for the delay
  • Competitors such as MiniMax, Google, and Meta have meanwhile announced their own video generation tools

Industry implications and ethics debate: The conflict highlights ongoing tensions between AI companies and the creative community.

  • The protest raises questions about the relationship between tech companies and artists in AI development
  • Artists express support for AI technology while criticizing implementation and rollout strategies
  • The incident exposes broader concerns about compensation and recognition in AI testing programs
  • The situation underscores the challenge of balancing rapid AI development with fair treatment of creative contributors

Looking ahead: The intersection of AI development and artistic contribution remains contentious, with this incident likely to influence how tech companies approach future collaborations with creative professionals.

  • The controversy may lead to more structured compensation models for AI testing programs
  • Questions about proper attribution and artist recognition in AI development persist
  • The incident could impact how other companies approach their own AI tool development and testing
