Former Google AI researcher claims ChatGPT’s success was delayed

Modern language models trace back to the 2017 research paper “Attention Is All You Need,” which introduced the transformer architecture and laid the groundwork for today’s generative AI technologies like ChatGPT, Sora, and Midjourney.

Origins and impact: The transformer architecture emerged from collaborative research at Google, fundamentally changing how AI processes and transforms data tokens into meaningful outputs.

  • Eight Google researchers, including Jakob Uszkoreit, developed the transformer architecture that now powers most major AI language models
  • The technology enables various AI applications, from language processing to audio synthesis and video generation
  • The research built upon previous work in the field, representing an evolution rather than a sudden breakthrough
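At the heart of the architecture described above is scaled dot-product attention, the mechanism the paper uses to let every token weigh its relevance to every other token. The sketch below is a minimal NumPy illustration of that one operation, not the full transformer; the function name and toy dimensions are illustrative choices, not anything from the paper.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Minimal sketch of the attention operation from 'Attention Is All You Need'.

    Q, K, V: (seq_len, d_k) arrays of query, key, and value vectors.
    """
    d_k = Q.shape[-1]
    # Pairwise affinities between tokens, scaled to keep gradients stable
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over the key dimension turns affinities into mixing weights
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output token is a weighted mix of all value vectors
    return weights @ V

# Toy self-attention: 3 tokens with 4-dimensional embeddings
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(X, X, X)
print(out.shape)  # (3, 4)
```

In a real model this operation is repeated across multiple heads and layers, with learned projection matrices producing Q, K, and V from the token embeddings.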

Google’s missed opportunity: Internal conservatism at Google may have delayed the public deployment of transformer-based language models.

  • Google had developed impressive language models around the time of the paper’s publication
  • A conservative approach to product development prevented earlier public release of these capabilities
  • Uszkoreit suggests that transformative AI products “could have happened sooner” had different decisions been made

ChatGPT’s unexpected success: The widespread adoption and creative applications of ChatGPT surprised even those familiar with the underlying technology.

  • While the technology itself wasn’t necessarily a breakthrough, its utility and accessibility marked a significant milestone
  • The public response to ChatGPT demonstrated previously unforeseen potential for practical applications
  • The success highlighted the importance of experimentation and willingness to release products despite imperfections

Biological computing frontier: Uszkoreit has shifted his focus to applying AI in biological computing through his company Inceptive.

  • The company aims to develop “biological software” using AI to translate specified behaviors into RNA sequences
  • This technology could potentially revolutionize medicine by programming molecular behaviors in biological systems
  • The approach builds upon principles similar to mRNA vaccines but aims for more complex therapeutic applications
  • Safety protocols and established medical safeguards guide the development process

Future implications: While transformer architecture has already transformed AI applications, its potential impact on biological computing and medicine represents an entirely new frontier that could fundamentally change how we approach healthcare and biological engineering.
