Apple’s new AI studies predict software bugs with 98% accuracy
Apple has quietly released three research studies that could reshape how software gets built, tested, and debugged across the technology industry. While the company is better known for consumer products, these papers reveal Apple's deeper ambitions in artificial intelligence-powered development tools—technology that could eventually accelerate software creation while reducing the costly errors that plague large-scale projects. The studies tackle three fundamental challenges in software development: predicting where bugs will occur before they cause problems, automating the time-intensive process of creating comprehensive test plans, and training AI systems to actually fix code defects. For business leaders managing software teams, these advances...
Oct 13, 2025
TDK develops analog AI chip that mimics brain function for edge computing
TDK, a company best known for audio cassettes, has developed a prototype analog reservoir AI chip that mimics brain function for real-time learning applications. The chip, created in collaboration with Hokkaido University, uses analog circuitry to process time-varying data at high speed and ultra-low power, making it suitable for robotics and human-machine interfaces that require instant feedback. How it works: The chip mimics the human cerebellum and uses natural physical dynamics of analog signals for efficient processing. Unlike traditional deep learning models that rely on cloud processing and extensive datasets, this technology learns directly at the edge using analog circuitry...
Oct 7, 2025
JEDEC unveils UFS 5.0 storage standard with 10.8GB/s speeds for AI apps
The Joint Electron Device Engineering Council (JEDEC), the organization that sets industry standards for semiconductor devices, has officially announced UFS 5.0, a new Universal Flash Storage standard that nearly doubles data transfer speeds to 10.8GB per second. The upgrade represents a significant leap from UFS 4.0's 5.8GB per second speeds and is specifically designed to meet the demanding performance requirements of AI-powered mobile applications and computing systems. What you should know: UFS 5.0 delivers substantial performance improvements while maintaining backward compatibility with existing hardware. The new standard reaches speeds of 10.8GB per second, compared to UFS 4.0's 5.8GB per second...
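As a sanity check on the "nearly doubles" claim, the two quoted peak rates can be compared directly. This is a back-of-the-envelope sketch; the 14 GB model size is an arbitrary illustrative figure, not something from the JEDEC standard.

```python
# Back-of-the-envelope comparison of the quoted UFS peak transfer rates.
ufs4_gbps = 5.8   # UFS 4.0 peak, GB/s (as quoted above)
ufs5_gbps = 10.8  # UFS 5.0 peak, GB/s (as quoted above)

speedup = ufs5_gbps / ufs4_gbps
print(f"Speedup: {speedup:.2f}x")  # ~1.86x, i.e. "nearly doubles"

# Time to load a hypothetical 14 GB on-device AI model at each peak rate:
model_gb = 14
print(f"UFS 4.0: {model_gb / ufs4_gbps:.2f} s, UFS 5.0: {model_gb / ufs5_gbps:.2f} s")
```

The load-time comparison is why the standard is pitched at AI workloads: model weights are large sequential reads, which benefit directly from peak bandwidth.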
Words at a loss: Perplexity makes AI browser free despite massive compute costs
Perplexity CEO Aravind Srinivas announced Wednesday that the company's Comet web browser will soon be free for all users, despite the significant compute costs associated with its autonomous AI capabilities. The move positions Perplexity, an AI-powered search company, to challenge established players while highlighting the complex economics facing AI companies as they scale their most resource-intensive features. Why this matters: The decision to offer compute-heavy AI browsing for free illustrates how startups can exploit advantages that tech giants like Google cannot easily replicate due to their massive user bases and existing infrastructure constraints. Key details: The browser's autonomous, agentic features...
Oct 2, 2025
No pain, no TX-GAIN: MIT unveils the most powerful AI supercomputer at any US university
MIT Lincoln Laboratory has unveiled TX-GAIN (TX-Generative AI Next), the most powerful AI supercomputer at any U.S. university, with a peak performance of two AI exaflops. The system is optimized specifically for generative AI applications and is already accelerating research across biodefense, materials discovery, cybersecurity, and other critical domains for both Lincoln Laboratory and MIT campus collaborations. What you should know: TX-GAIN represents a significant leap in university-based AI computing capabilities, powered by over 600 NVIDIA graphics processing unit accelerators designed specifically for AI operations. The system achieved recognition from TOP500, which biannually ranks the world's top supercomputers across various...
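The two headline figures imply a rough per-accelerator budget. The division below uses only the numbers quoted above, treats "over 600" as exactly 600 (so the result is an upper bound), and concerns peak low-precision "AI" FLOPs, not FP64.

```python
# Rough per-accelerator arithmetic implied by the quoted TX-GAIN figures.
total_ai_flops = 2e18   # "two AI exaflops" peak
gpus = 600              # "over 600" accelerators, so per-GPU figure is an upper bound

per_gpu = total_ai_flops / gpus
print(f"~{per_gpu / 1e15:.1f} petaFLOPs per accelerator, at most")  # ~3.3
```

A few petaFLOPs of low-precision compute per accelerator is consistent with current data-center GPUs, which is a useful plausibility check on the aggregate claim.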
Oct 1, 2025
Crusoe powers up Central Texas AI data center with Oracle partnership
Crusoe has officially powered up the first phase of its flagship Stargate data center campus in Abilene, Texas, developed in partnership with Oracle Cloud Infrastructure. The rapid deployment—going from groundbreaking in June 2024 to operational facilities running Nvidia GB200 racks just over a year later—represents a significant milestone in U.S. AI infrastructure development and demonstrates how quickly large-scale AI computing facilities can be brought online. What you should know: The Abilene campus is already processing AI training and inference workloads with cutting-edge hardware delivered by Oracle. Oracle began delivering Nvidia GB200 racks in June 2025, enabling the site to run...
Sep 30, 2025
DeepSeek cuts AI processing costs 50% with new sparse attention tech
Chinese AI startup DeepSeek has launched DeepSeek-V3.2-Exp, an experimental model that introduces "sparse attention" technology to cut AI processing costs in half while maintaining performance levels. The release builds on DeepSeek's reputation for creating efficient AI systems using fewer resources than traditional approaches, though experts question whether the cost-cutting architecture compromises model reliability and safety. What you should know: DeepSeek's new experimental model represents a significant shift in AI architecture design, focusing on efficiency over raw computational power. The V3.2-Exp model introduces DeepSeek Sparse Attention (DSA), which selectively processes only the most relevant information rather than analyzing all available data....
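The description of DSA, selectively processing only the most relevant information, matches the general family of top-k sparse attention. The sketch below illustrates that family in its simplest form; it is not DeepSeek's actual DSA implementation, whose details go beyond this summary.

```python
import numpy as np

def topk_sparse_attention(q, K, V, k=32):
    """Generic top-k sparse attention sketch (NOT DeepSeek's actual DSA):
    score every key cheaply, then run softmax attention over only the
    k highest-scoring keys instead of the full sequence."""
    scores = K @ q / np.sqrt(q.shape[0])       # query-key similarity for all keys
    top = np.argsort(scores)[-k:]              # indices of the k most relevant keys
    w = np.exp(scores[top] - scores[top].max())
    w /= w.sum()                               # softmax over the selected keys only
    return w @ V[top]                          # weighted sum of the selected values

rng = np.random.default_rng(0)
seq_len, dim = 1024, 64
q = rng.normal(size=dim)
K = rng.normal(size=(seq_len, dim))
V = rng.normal(size=(seq_len, dim))
out = topk_sparse_attention(q, K, V, k=32)
print(out.shape)  # (64,) -- attended over 32 keys instead of all 1024
```

The cost saving comes from the value aggregation touching k rows instead of the whole sequence; the open question the experts raise is whether discarding the low-scoring keys loses information the model needed.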
Sep 30, 2025
CoreWeave shares rise as company signs $14B deal with Meta for AI computing power
CoreWeave has signed a $14 billion agreement with Meta to supply computing power, marking the latest massive infrastructure deal as companies scramble to meet surging demand for AI applications. The deal sent CoreWeave shares up 10% in premarket trading and underscores the intense capital flows driving AI infrastructure expansion, though it also raises questions about potential market bubbles and circular financing patterns. What you should know: This represents one of the largest AI infrastructure deals to date, highlighting the enormous capital requirements for modern AI operations. CoreWeave will provide Meta access to Nvidia's latest GB300 systems as part of the...
Sep 29, 2025
Survey: 73% of CIOs now use Macs primarily for AI processing
Apple Macs are gaining significant traction in enterprise IT environments, with 73% of CIOs now citing AI processing as the primary use case for Apple hardware according to a new MacStadium survey of 300 chief information officers. This shift represents a fundamental change from Macs' traditional role in creative work and app development, positioning them as critical infrastructure for AI workloads as enterprises prioritize artificial intelligence capabilities. What you should know: Enterprise adoption of Mac infrastructure is accelerating rapidly across multiple dimensions.
• Apple technologies now account for an average of 63% of enterprise endpoints, with nearly all respondents (96%) expecting...
Sep 24, 2025
Apple’s SimpleFold AI matches AlphaFold performance with 90% less computing power
Apple researchers have developed SimpleFold, a lightweight AI model for protein folding prediction that achieves comparable performance to Google DeepMind's AlphaFold while requiring significantly less computational power. The breakthrough uses flow matching models instead of the complex architectures employed by existing systems, potentially making protein structure prediction more accessible to researchers with limited computing resources. What you should know: SimpleFold represents a fundamental shift in how AI approaches protein folding by prioritizing simplicity over complex engineering. Rather than relying on multiple sequence alignments, pairwise interaction maps, triangular updates or other specialized modules, Apple's model uses flow matching techniques that were...
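Flow matching itself is a general generative-modeling technique. The toy objective below (fit on random 2-D points, not proteins, and in no way Apple's actual model) shows the core idea: interpolate along a straight path from noise to data and regress the model onto that path's velocity.

```python
import numpy as np

rng = np.random.default_rng(0)

def flow_matching_loss(x1, model):
    """Toy flow-matching objective: sample noise x0, pick a random time t,
    form the straight-line interpolant, and train the model to predict the
    constant velocity (x1 - x0) of that path. Simple regression, no score
    matching or multiple sequence alignments involved."""
    x0 = rng.normal(size=x1.shape)            # noise sample
    t = rng.uniform(size=(x1.shape[0], 1))    # random time in [0, 1] per example
    xt = (1 - t) * x0 + t * x1                # point on the noise-to-data path
    target_v = x1 - x0                        # velocity of the straight path
    pred_v = model(xt, t)
    return np.mean((pred_v - target_v) ** 2)

# A placeholder "model" that predicts zero velocity, just to show the loss runs.
zero_model = lambda xt, t: np.zeros_like(xt)
x1 = rng.normal(size=(128, 2))                # a batch of 2-D "data" points
loss = flow_matching_loss(x1, zero_model)
print(loss)                                   # finite positive value
```

The appeal for protein folding is exactly this simplicity: one regression target per sample, instead of the pairwise maps and triangular updates the teaser lists.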
Sep 24, 2025
Qualcomm CEO plans to bring AI everywhere with next-gen chips
Qualcomm CEO Cristiano Amon appeared on CNBC's "Closing Bell Overtime" to discuss the company's next generation of mobile and PC processors designed to integrate AI capabilities across devices. The interview focused on Qualcomm's strategy to embed artificial intelligence into its semiconductor offerings and the broader outlook for the chip industry as AI becomes increasingly central to consumer and enterprise technology. What you should know: The discussion centered on Qualcomm's vision to democratize AI access through its chip technology. Amon emphasized the company's commitment to "bring AI everywhere" through its latest processor developments. The conversation covered both mobile and PC processors,...
Sep 18, 2025
Huawei unveils world’s most powerful AI computing systems with 1M+ NPUs
Huawei has unveiled what it claims are the world's most powerful AI computing systems, featuring SuperPoDs with up to 15,488 NPUs (neural processing units, specialized chips for AI calculations) and superclusters containing over one million NPUs. The announcement positions the Chinese tech giant as a direct challenger to Nvidia and AMD in the high-performance AI computing market, particularly as US export restrictions limit access to American-made AI chips. What you should know: Huawei's new Atlas series represents a comprehensive AI infrastructure ecosystem designed to compete with established players.
• The Atlas 950 SuperPoD contains 8,192 Ascend NPUs, while the superior Atlas...
Sep 17, 2025
Tencent Cloud shifts to Chinese AI chips amid $50B Nvidia standoff
Tencent Cloud has fully integrated Chinese AI chips into its computing infrastructure, marking another significant step in China's push for semiconductor self-sufficiency amid escalating U.S. export restrictions. The move positions Tencent as the latest major Chinese tech company to embrace domestic processors, while highlighting Beijing's broader strategy to reduce reliance on foreign semiconductors like those from Nvidia. What you should know: Tencent Cloud president Qiu Yuepeng announced the company had "fully adapted to mainstream domestic chips" within its AI computing infrastructure during the annual Global Digital Ecosystem Summit. The integration spans multiple domestic chip partnerships, with Tencent working to deploy...
Sep 11, 2025
Nvidia unveils Rubin CPX GPU with 128GB memory for AI inference
Nvidia has unveiled the Rubin CPX GPU, a new compute-focused graphics processor featuring 128GB of GDDR7 memory specifically designed for enterprise AI inference workloads. The announcement positions Nvidia to address the growing demand for long-context AI applications in software development, research, and high-definition video generation, with shipments planned for late 2026. What you should know: The Rubin CPX represents Nvidia's first GPU to reach 128GB memory capacity, delivering up to 30 petaFLOPs of NVFP4 compute performance.
• The GPU integrates hardware attention acceleration that Nvidia claims is three times faster than the GB300 NVL72.
• Four NVENC and four NVDEC units are...
Sep 5, 2025
Germany puts “Jupiter,” Europe’s first exascale supercomputer, in orbit. So to speak.
German Chancellor Friedrich Merz inaugurated the Jupiter supercomputer at the Juelich research centre, marking the debut of Europe's first exascale-class supercomputer and the world's fourth-fastest computing system. The milestone represents a strategic push by Germany and Europe to compete with the United States and China in AI development and high-performance computing capabilities. What you should know: Jupiter represents a major technological achievement for European computing infrastructure and research capabilities. The supercomputer can perform one billion times one billion (10^18) calculations per second, equivalent to the power of about 10 million standard notebook computers. It was assembled through a collaboration between Nvidia (the chip manufacturer),...
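The notebook comparison checks out arithmetically; dividing the two quoted figures yields the per-notebook performance the comparison implicitly assumes.

```python
# The quoted comparison, worked out from the two numbers in the teaser.
jupiter_flops = 1e9 * 1e9   # "one billion times one billion" per second = 1e18, one exaFLOP
notebooks = 10_000_000      # "about 10 million standard notebook computers"

per_notebook = jupiter_flops / notebooks
print(f"{per_notebook:.0e} FLOP/s per notebook")  # 1e+11, i.e. ~100 GFLOP/s
```

Around 100 GFLOP/s is a reasonable figure for an ordinary laptop CPU, so the "10 million notebooks" framing is internally consistent.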
Sep 5, 2025
GPUs overtake CPUs as the dominant force in modern computing
GPUs have officially overtaken CPUs as the dominant force in modern computing, driven by their superior parallel processing architecture that excels at handling today's most demanding computational workloads. This shift represents a fundamental change in how we approach computing power, with GPUs now leading in artificial intelligence, scientific research, and high-performance computing applications that define the future of technology. The big picture: The transition from CPU to GPU dominance stems from architectural differences that favor modern computing needs, where thousands of smaller GPU cores outperform fewer, more powerful CPU cores for parallel processing tasks. Training complex neural networks on CPUs...
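The architectural point, many simple lanes beating one fast serial stream on elementwise work, can be illustrated even without a GPU. Below, NumPy's vectorized execution stands in for data parallelism; this is an analogy for the core-count argument, not a GPU benchmark.

```python
import numpy as np
import time

# The same elementwise multiply, expressed two ways: a scalar Python loop
# (one element at a time, like purely serial code) versus one vectorized
# operation over the whole array. A GPU pushes the second pattern much
# further by running thousands of such lanes simultaneously.
n = 1_000_000
a = np.random.rand(n)
b = np.random.rand(n)

t0 = time.perf_counter()
out_loop = np.empty(n)
for i in range(n):          # serial: one multiply per iteration
    out_loop[i] = a[i] * b[i]
t_loop = time.perf_counter() - t0

t0 = time.perf_counter()
out_vec = a * b             # data-parallel: one operation over all elements
t_vec = time.perf_counter() - t0

print(f"loop: {t_loop:.3f}s  vectorized: {t_vec:.4f}s")  # vectorized is far faster
```

Neural-network training is dominated by exactly this kind of bulk array arithmetic, which is why the workload shift favors GPU-style architectures.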
Sep 3, 2025
Huawei unveils 245TB AI SSDs to bypass costly HBM restrictions
Huawei has unveiled three new AI SSDs designed to reduce reliance on expensive high-bandwidth memory (HBM), addressing supply restrictions that Chinese firms face in accessing advanced memory chips. The OceanDisk series includes the industry's largest SSD at 245TB capacity, positioning solid-state storage as a partial alternative to costly HBM in AI workloads. What you should know: Huawei's new OceanDisk series targets the "memory wall" and "capacity wall" problems that currently bottleneck AI training and inference performance. The OceanDisk EX 560 delivers extreme performance with 1,500K IOPS write speeds, sub-7µs latency, and can increase fine-tunable model parameters on a single machine...
Sep 3, 2025
Fed study links AI adoption to rising unemployment in tech jobs, most notably computing and math
A new Federal Reserve study suggests the U.S. may be witnessing the early stages of AI-driven job displacement, with industries that adopted artificial intelligence most heavily now showing the highest unemployment increases. The research found a 0.57 correlation coefficient between AI adoption rates and rising joblessness, particularly affecting computing and mathematics professionals where AI usage reached nearly 80% while unemployment climbed 1.2% over three years. What you should know: The St. Louis Fed research examined unemployment changes between 2022 and 2025 across different sectors, correlating them with AI exposure rates to identify potential displacement patterns. Computing and math professionals experienced...
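For readers unfamiliar with the statistic, a correlation coefficient of 0.57 indicates a moderate positive linear association (the measure runs from -1 to 1). The sketch below computes Pearson's r on made-up sector numbers, purely to illustrate the calculation; these are not the Fed's data.

```python
import numpy as np

# Illustrative sector-level numbers (invented for this example, NOT the study's data):
ai_adoption = np.array([10, 25, 35, 50, 62, 79])                 # % AI usage by sector
unemployment_change = np.array([0.1, 0.9, -0.2, 0.5, 0.3, 1.2])  # unemployment change

r = np.corrcoef(ai_adoption, unemployment_change)[0, 1]
print(f"Pearson r = {r:.2f}")  # moderate positive association, same ballpark as the study's 0.57
```

Note the usual caveat, which applies to the study as well: a moderate correlation shows the two series move together, not that AI adoption caused the job losses.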
Aug 27, 2025
IBM and AMD team up to build quantum-centric supercomputers
IBM and AMD have announced a collaboration to develop "quantum-centric supercomputing" architectures that integrate quantum systems with high-performance computing and AI accelerators. The partnership aims to create hybrid computing models that could tackle scientific and industrial problems at unprecedented speeds and scales, with quantum hardware simulating atoms and molecules while traditional supercomputers analyze large-scale data. What you should know: The collaboration focuses on blending IBM's quantum expertise with AMD's high-performance computing and AI technologies to push beyond traditional computing limitations. The companies plan to integrate AMD's CPUs, GPUs, and FPGAs (specialized processing chips) with IBM quantum computers to explore new...
Aug 21, 2025
Cloud company claims AI crawlers now account for 80% of AI bot traffic, threatening server stability
Fastly's new report reveals that AI crawlers and fetchers are overwhelming websites, with crawlers alone generating a staggering 80 percent of all AI bot traffic and some individual bots hitting sites with over 39,000 requests per minute. The surge is primarily driven by Meta (52% of crawler traffic) and OpenAI (98% of fetcher traffic), creating unsustainable server loads that threaten website performance and the business models of content creators. What you should know: AI bots are fundamentally reshaping internet traffic patterns, with crawlers scraping training data and fetchers delivering real-time responses, each creating new operational challenges. Fastly, a cloud services company,...
Aug 20, 2025
Intel’s new feature boosts AI performance by allocating more RAM to integrated graphics
Intel has introduced "Shared GPU Memory Override," a new feature for its Core Ultra systems that allows users to allocate additional system RAM for integrated graphics use. This capability mirrors AMD's earlier "Variable Graphics Memory" feature and targets compact laptops and mobile workstations that rely on integrated solutions rather than discrete GPUs, potentially improving AI workload performance where memory availability is often the limiting factor. What you should know: The feature requires the latest Intel Arc drivers to function and is specifically designed for systems without dedicated graphics cards.
• Bob Duffy, who leads Graphics and AI Evangelism at Intel, confirmed...
Aug 20, 2025
Vantage Data Centers builds $25B AI campus in rural Texas the size of 900 football fields
Vantage Data Centers is making the largest investment in its corporate history: a $25 billion artificial intelligence campus in rural Texas that signals just how dramatically the AI boom is reshaping America's digital infrastructure landscape. The massive project, dubbed Frontier, will span 1,200 acres in Shackelford County, Texas—roughly the size of 900 football fields. When complete, this hyperscale facility will deliver 1.4 gigawatts of power capacity, enough electricity to power roughly one million homes, making it the largest data center campus in Vantage's global portfolio. Why this matters: This investment represents more than just corporate expansion—it's a window into the...
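The power comparison implies an average per-household draw, worked out below. Reading the figure as continuous average draw is an assumption made here, not something the article states.

```python
# Implied per-household figure from the quoted campus capacity.
campus_watts = 1.4e9   # 1.4 gigawatts of capacity
homes = 1_000_000      # "roughly one million homes"

per_home_kw = campus_watts / homes / 1000
print(f"{per_home_kw} kW per home")  # 1.4 kW average draw
```

An average draw in the low single-digit kilowatts is roughly in line with typical US household consumption, so the comparison is a fair order-of-magnitude framing of how much electricity one AI campus can absorb.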
Aug 19, 2025
Nvidia’s 9B parameter AI model offers toggleable reasoning on single GPU
Nvidia has released Nemotron-Nano-9B-v2, a compact 9-billion parameter language model that features toggleable AI reasoning capabilities and achieves top performance in its class on key benchmarks. The model represents Nvidia's entry into the competitive small language model market, offering enterprises a balance between computational efficiency and advanced reasoning capabilities that can run on a single GPU. What you should know: Nemotron-Nano-9B-v2 combines hybrid architecture with user-controllable reasoning to deliver enterprise-ready AI at reduced computational costs. The model was pruned from 12 billion to 9 billion parameters specifically to fit on a single Nvidia A10 GPU, making deployment more accessible for...
Aug 18, 2025
Not WWE: AMD’s new 64-core “Threadripper” 9980X offers local AI alternative for $5K
AMD has launched its Ryzen Threadripper 9000 Series workstation processors, featuring the 32-core Threadripper 9970X ($2,499) and 64-core Threadripper 9980X ($4,999). The processors target enterprise developers building AI applications locally, offering a cost-effective alternative to cloud-based solutions while addressing security concerns and providing the computational power needed for large language models and intensive development work. What you should know: These workstations bridge the performance gap between consumer PCs and enterprise servers, specifically designed for AI development, STEM applications, and content creation. The Threadripper 9970X delivers 32 cores and 64 threads, while the flagship 9980X offers 64 cores and 128 threads...