
Big AI news: Major OpenAI employee leaves, Google prepares for AGI, AI self-improves and more

Google gets serious about AGI

Google is taking artificial general intelligence (AGI) more seriously than many expected. Beyond publishing a comprehensive report on AGI, the company is actively hiring for a "post-AGI research scientist" position, which suggests Google believes AGI might arrive sooner than most anticipate.

The job description is quite revealing:

"We are seeking a highly motivated research scientist to join our team and contribute to groundbreaking research that will focus on what comes after AGI. Key questions include the trajectory of AGI to ASI, machine consciousness, the impact of AGI and the foundations of human society."

This is significant because Google is committing real resources to understanding what happens after AGI is achieved. The role covers AGI's influence on domains including economics, law, health, machine consciousness, and education.

Meanwhile, Google continues releasing cutting-edge models. Its new Gemini 2.5 Flash ranks alongside GPT-4.5 Preview and Claude 3 Opus, yet costs 5-10 times less than Gemini 2.5 Pro. Pairing advanced capabilities with lower prices could significantly broaden developer access.

Meta takes a different approach to advanced AI

Meta is charting its own path toward advanced AI. Rather than focusing on AGI, Yann LeCun (Meta's AI chief) prefers the term "advanced machine intelligence" (AMI). LeCun doesn't believe in AGI in the traditional sense, arguing that humans themselves have limited general intelligence and are mostly specialists.

Meta recently released several new research artifacts:

  1. Meta Perception Encoder – a vision encoder for image and video tasks
  2. Meta's Perception Language Model – a vision language model for visual recognition
  3. Meta Locate 3D – for accurate object localization in 3D environments
  4. Dynamic Byte Latent Transformer – an 8B-parameter alternative to traditional tokenization methods

Meta's approach is notable because it is pursuing different technical paths than its competitors, with LeCun even stating that he is no longer focused on large language models (LLMs).

OpenAI's head of preparedness quietly steps down

A top Open
