News/Computing

Nov 18, 2024

Nvidia and Google partner on quantum computing initiative

The intersection of quantum computing and artificial intelligence reaches new heights as Nvidia partners with Google Quantum AI to enhance quantum computing device design through advanced simulation capabilities. Key partnership details: Nvidia and Google Quantum AI are collaborating to accelerate quantum computing development using Nvidia's supercomputing infrastructure. The partnership leverages the Nvidia Eos supercomputer and a hybrid quantum-classical computing platform for quantum processor simulations. Google Quantum AI aims to address fundamental limitations in quantum computing, particularly the issue of "noise" that currently restricts quantum operations. The collaboration utilizes 1,024 Nvidia H100 Tensor Core GPUs to conduct some of the world's...

Nov 17, 2024

Experts weigh in on what happens when AI models don’t keep getting better

The rapid advancement of artificial intelligence may be approaching a significant slowdown, particularly in the development of large language models (LLMs), as traditional training methods show signs of diminishing returns. Current state of AI development: OpenAI's next major model release, codenamed Orion, is demonstrating smaller performance improvements compared to previous generational leaps, such as the one from GPT-3 to GPT-4. Internal researchers at OpenAI report that Orion isn't consistently outperforming its predecessor on various tasks. This plateauing effect represents a significant departure from the exponential growth in AI capabilities observed in recent years. The development raises questions about the sustainability of current...

Nov 16, 2024

New research explores how cutting-edge AI may advance quantum computing

The intersection of artificial intelligence and quantum computing represents a significant frontier in computer science, where AI's learning capabilities are being leveraged to solve complex quantum computing challenges. The foundational context: AI's unprecedented advancements are now being applied to quantum computing, where its data-driven approaches are particularly well-suited to handle the field's counterintuitive nature and high-dimensional mathematics. Quantum computing's complex challenges make it an ideal candidate for AI applications, potentially relying on AI developments for major scaling breakthroughs. The collaboration between these fields requires expertise from two of computer science's most sophisticated domains, highlighting the need for cross-disciplinary knowledge. Scope...

Nov 15, 2024

Mass. economic bill includes millions in funding for AI, quantum computing

Massachusetts is positioning itself as a leader in emerging technologies through a significant economic development bill that allocates substantial funding for artificial intelligence and quantum computing initiatives. Major funding allocation: The Massachusetts legislature has approved a $3.96 billion economic development bill that includes specific provisions for advancing technological innovation across the state. The bill designates $103 million to enhance the state's competitiveness in AI technologies across various sectors including life sciences, financial services, and healthcare. A new grant program, administered by Mass Tech Collaborative, will support AI companies and promote development, adoption, and commercialization. At least $3 million is earmarked...

Nov 15, 2024

The latest trends in spatial computing, AI glasses and the players behind them

Spatial computing and mixed reality technologies are rapidly evolving, with major tech companies launching new products and features that blend digital experiences with the physical world. Ray-Ban Meta's breakthrough: Meta's latest smart glasses collaboration with Ray-Ban introduces AI-powered voice interactions and computer vision capabilities, marking a significant advancement in wearable technology. The glasses maintain Ray-Ban's signature style while incorporating voice-activated AI features for tasks like translation and information retrieval. Meta is promoting the product through experiential retail events in Los Angeles and Phoenix. The device connects to smartphones via Bluetooth, though it currently lacks an integrated display. Industry competition heats...

Nov 15, 2024

A 2nd Trump term might completely reshape the data center industry

A second Trump presidency could bring significant changes to data center industry regulations, energy policies, and domestic semiconductor production, with implications for the sector's growth trajectory and technological leadership. Policy shift implications: The return of Trump administration policies could reshape key aspects of data center operations and development in the United States. Energy regulations are expected to be relaxed, potentially easing restrictions on power consumption and generation methods for data centers. Domestic semiconductor production could receive renewed focus, affecting supply chains and infrastructure development. Construction regulations may see broader deregulation, as indicated by positive feedback from the Associated Builders and...

Nov 14, 2024

New AI models are falling short of expectations — here’s why

The rapid advancement of artificial intelligence models appears to be hitting unexpected roadblocks, with major tech companies struggling to achieve significant improvements in their next-generation AI systems. Current challenges facing OpenAI: OpenAI's newest language model, Orion, is showing less impressive gains over its predecessor compared to the leap from GPT-3 to GPT-4. Internal testing reveals minimal improvements in certain capabilities, particularly in coding tasks. The underperformance suggests potential limitations in the current approach to AI development. This setback represents a significant deviation from OpenAI's historical pattern of achieving substantial improvements with each new model iteration. Industry-wide struggles: The challenges extend...

Nov 14, 2024

Red Hat upgrades OpenShift with AI and edge capabilities

Red Hat's latest OpenShift update represents a significant evolution in enterprise Kubernetes platforms, introducing major enhancements across artificial intelligence, edge computing, and security capabilities. Core platform updates: OpenShift 4.17 delivers substantial improvements to Red Hat's enterprise Kubernetes distribution, focusing on hybrid cloud innovation while maintaining strict security standards. The new version introduces enhanced virtualization management capabilities, including safe memory oversubscription and dynamic workload rebalancing. Security enhancements now feature native network isolation for namespaces and a Confidential Compute Attestation Operator. Advanced multi-cluster management capabilities enable seamless virtual machine orchestration across distributed environments. AI integration and capabilities: Red Hat has significantly expanded...

Nov 14, 2024

Hybrid compute adoption surges as enterprises seek control over AI assets

The growing adoption of artificial intelligence by large enterprises is driving a shift toward hybrid computing models that combine public cloud services with private infrastructure, allowing organizations to maintain greater control over their AI capabilities. The evolving AI landscape: Large enterprises are increasingly adopting a hybrid approach to artificial intelligence deployment, combining public cloud services with private computing resources and locally controlled models. Organizations spending over $10 million annually on AI are particularly motivated to develop private computing capabilities alongside their use of public cloud services. This trend is especially prominent among companies with significant security concerns, regulatory requirements, or specific...

Nov 14, 2024

How Microsoft’s next-gen BitNet architecture is turbocharging LLM efficiency

Microsoft's research team has developed BitNet a4.8, a new architecture that advances the efficiency of one-bit large language models (LLMs) by drastically reducing their memory and computational requirements while maintaining performance levels. The fundamentals of one-bit LLMs: Traditional large language models use 16-bit floating-point numbers to store their parameters, which demands substantial computing resources and limits their accessibility. One-bit LLMs represent model weights with significantly reduced precision while achieving performance comparable to full-precision models. Previous BitNet models used 1.58-bit values (-1, 0, 1) for weights and 8-bit values for activations. Matrix multiplication costs remained a bottleneck despite reduced memory usage...
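
To make the 1.58-bit idea concrete, here is a minimal NumPy sketch of ternary (absmean) weight quantization combined with 8-bit (absmax) activation quantization, in the spirit of the previous BitNet models described above. It is an illustrative toy with made-up function names, not Microsoft's BitNet a4.8 implementation; it simply shows why the matrix multiply can run on small integers, with the scales applied once at the end.

```python
import numpy as np

def quantize_weights_ternary(W):
    """Absmean quantization of weights to {-1, 0, +1}, in the style of
    1.58-bit BitNet models (illustrative sketch only)."""
    scale = np.mean(np.abs(W)) + 1e-8          # per-tensor absmean scale
    W_q = np.clip(np.round(W / scale), -1, 1)  # ternary weight values
    return W_q.astype(np.int8), scale

def quantize_activations_int8(x):
    """Symmetric absmax quantization of activations to 8-bit integers."""
    scale = np.max(np.abs(x)) / 127.0 + 1e-8
    x_q = np.clip(np.round(x / scale), -127, 127)
    return x_q.astype(np.int8), scale

# Toy linear layer: the matmul runs on small integers, and the two scales
# are applied once afterwards to recover a floating-point output.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8))   # full-precision weights
x = rng.normal(size=(8,))     # input activations

W_q, w_scale = quantize_weights_ternary(W)
x_q, x_scale = quantize_activations_int8(x)

y_quant = (W_q.astype(np.int32) @ x_q.astype(np.int32)) * (w_scale * x_scale)
y_full = W @ x
print("max abs error vs. full precision:", np.max(np.abs(y_quant - y_full)))
```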

Nov 13, 2024

Why ARK Research believes power limitations won’t hinder AI data center growth

The rapid expansion of AI infrastructure is creating new demands on power systems, but innovative solutions are emerging to address potential constraints. Key findings: ARK's research indicates that power limitations will not significantly hinder AI data center growth or profitability. Current estimates show electricity costs represent only about 9% of total AI data center operating expenses. Global electricity demand growth is projected to reach 3.2% annually through 2030, driven by AI infrastructure expansion. The required capital investment for additional power generation is estimated at $235 billion in 2030, representing roughly 6% of expected AI hardware spending. Industry adaptations: Companies are...
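
For a sense of the scale these figures imply, the two numbers above can be combined with simple arithmetic. This is a back-of-envelope reading of the cited figures, not a number stated in the article:

```python
# Back-of-envelope arithmetic from the figures cited above: if roughly $235B
# of power-generation capex corresponds to about 6% of expected AI hardware
# spending in 2030, the implied hardware spend is on the order of $3.9 trillion.
power_capex_billions = 235
share_of_hardware_spend = 0.06
implied_hardware_spend_billions = power_capex_billions / share_of_hardware_spend
print(f"~${implied_hardware_spend_billions / 1000:.1f} trillion")  # ~$3.9 trillion
```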

Nov 13, 2024

xAI competitors are flying spy planes over the ‘Colossus’ supercomputer facility

AI companies' rising concerns about Elon Musk's expanding supercomputing capabilities have led to unprecedented surveillance tactics in the increasingly competitive artificial intelligence sector. Key development: A new supercomputer facility dubbed 'Colossus,' operated by Musk's xAI company, has attracted attention from rival AI firms who are now conducting aerial surveillance of the installation. Competitors have resorted to flying spy planes over the data center, suggesting significant industry concern about xAI's growing computational capabilities. The facility's rapid expansion has particularly worried other AI companies, indicating a possible shift in the competitive landscape of AI computing infrastructure. Industry implications: The use of surveillance...

Nov 12, 2024

AMD launches Versal Premium Gen 2 for data centers

The rapid evolution of data center processing capabilities continues with AMD's latest advancement in adaptive computing technology. Product overview: AMD has introduced the Versal Premium Series Gen 2, a new adaptive FPGA platform designed specifically for data center applications and AI processing workloads. The platform represents AMD's latest iteration of field-programmable gate array (FPGA) technology, which allows customers to configure hardware circuits after manufacturing. This new series targets multiple markets including data centers, communications, test and measurement, and aerospace and defense sectors. The system-on-chip design integrates various computing capabilities into a single package. Technical innovations: The Versal Premium...

Nov 12, 2024

How Dell is empowering enterprises to unlock the value of edge data

Edge computing and artificial intelligence are rapidly converging, with more than half of enterprise data expected to be processed outside traditional data centers by 2025, creating both opportunities and challenges for businesses seeking to leverage AI at the edge. Platform evolution and key features: Dell has announced significant updates to its NativeEdge platform, expanding capabilities for edge operations and AI deployment. The platform now offers multi-node high-availability capabilities, allowing multiple endpoints to function as a single system. A new catalog features over 55 pre-built blueprints to streamline AI deployment across edge locations. The solution supports virtual machine migration and automatic...

Nov 7, 2024

AI outpaces quantum computing in real-world applications

The AI revolution in scientific computing: Artificial intelligence is making significant strides in physics, chemistry, and materials science, potentially challenging the long-held belief that quantum computing would dominate these fields. AI's ability to simulate quantum systems is advancing at a rapid pace, with the scale and complexity of models growing exponentially. Researchers are now questioning whether AI could solve many interesting problems in chemistry and materials science before large-scale quantum computers become operational. For weakly correlated quantum systems, which encompass most systems of practical interest, classical AI approaches may prove sufficient without the need for quantum computers. AI's progress in...

Nov 5, 2024

AMD overtakes Intel in datacenter sales for first time

AMD's historic milestone in datacenter CPU sales: For the first time, AMD has overtaken Intel in datacenter CPU revenue, marking a significant shift in the competitive landscape of the semiconductor industry. AMD's datacenter segment revenue reached $3.549 billion in Q3, surpassing Intel's datacenter and AI group earnings of $3.3 billion. This achievement represents a dramatic reversal from just two years ago, when Intel's datacenter group was consistently earning $5-6 billion per quarter. The shift is largely attributed to the competitive advantages of AMD's EPYC processors over Intel's Xeon CPUs, which have forced Intel to offer price discounts to remain competitive...

Nov 5, 2024

The 7 cloud computing trends shaping business success in 2025

The cloud computing revolution: Cloud technology is poised to undergo a fundamental transformation by 2025, with several key trends set to redefine business success and innovation. Emerging technologies like AI, quantum computing, and edge computing are converging with cloud services to create new possibilities for organizations across industries. These advancements promise to deliver unprecedented efficiency, cost savings, and performance improvements for businesses that embrace them. AI as the brain of cloud computing: Artificial intelligence is evolving from a service running in the cloud to the intelligent force optimizing all aspects of cloud operations. AI-driven systems will predict resource needs, automatically...

Nov 4, 2024

AI at the edge: Key architecture decisions for future success

Edge intelligence revolutionizes AI deployment: By bringing processing and decision-making closer to the point of value creation, edge intelligence enhances responsiveness, reduces latency, and enables applications to function independently, even with limited cloud connectivity. Edge intelligence moves AI and analytics capabilities to devices, sensors, and localized systems, enabling real-time intelligence crucial for applications like autonomous vehicles and hospital monitoring. Running AI locally bypasses network delays, improving reliability in environments that demand split-second decisions. This approach scales AI for distributed applications across various sectors, including manufacturing, logistics, and retail. Architectural considerations for edge intelligence: IT leaders must carefully balance latency, data...

Nov 4, 2024

MIT’s latest breakthrough is tiny, but it has big implications for the semiconductor industry

Breakthrough in nanoscale transistor technology: MIT researchers have developed a novel three-dimensional nanoscale transistor that could potentially revolutionize the efficiency of electronic devices by surpassing the inherent limitations of silicon semiconductor technology. Key innovations: The transistors utilize ultrathin semiconductor materials, specifically gallium antimonide and indium arsenide, as alternatives to silicon. These devices harness quantum mechanical properties such as quantum tunneling and quantum confinement to achieve low-voltage operation while maintaining high performance. With a diameter of only 6 nanometers, these vertical nanowire transistors are potentially the smallest 3D transistors reported to date. Performance advantages: The new transistors can operate efficiently at...

Nov 3, 2024

AI energy use and the new era of data center design innovation

AI's relentless pace drives system design revolution: The exponential growth of artificial intelligence is forcing a fundamental rethink of how data centers and computing systems are designed, from the chip level to entire facilities. Performance demands outpace hardware capabilities: AI's insatiable appetite for computing power is pushing beyond what traditional architectures can deliver. AI workloads require 100x to 1000x performance increases between generations, far exceeding the 10x to 20x improvements typical in other areas. The slowing of Moore's Law compounds the challenge, as hardware performance gains are increasingly difficult to achieve. These factors are driving unprecedented power consumption and heat...

Oct 30, 2024

AI will permeate every aspect of computing, says AMD CEO

AI's pervasive future in computing: AMD CEO Lisa Su emphasizes the critical role artificial intelligence will play across all computing sectors, signaling a transformative shift in the technology landscape. Lisa Su, CEO of Advanced Micro Devices (AMD), shared her insights on the company's quarterly earnings and future prospects during an appearance on CNBC's "Squawk on the Street" program. Su highlighted the growing importance of AI in AMD's business strategy, suggesting that artificial intelligence will become an integral part of every aspect of computing. The CEO's comments reflect the broader industry trend of major tech companies increasingly focusing on AI integration...

Oct 29, 2024

What ‘GDDR7’ is and why it promises performance gains for AI

The rise of GDDR7 memory in AI inference: GDDR7, the latest graphics memory solution, is set to revolutionize AI inference with its exceptional bandwidth and low latency capabilities, making it ideal for AI-powered edge and endpoint devices. GDDR7 offers a performance roadmap of up to 48 Gigatransfers per second (GT/s) and memory throughput of 192 GB/s per device, significantly outperforming previous generations. This new memory standard is expected to be utilized in the next generation of GPUs and accelerators for AI inference workloads. AI training vs. inference requirements: While AI training demands high memory bandwidth and capacity, inference prioritizes throughput...
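
The two per-device figures quoted above are consistent with the usual 32-bit (x32) GDDR device interface; the bus width is an assumption here, since the summary does not state it. A minimal check:

```python
# Relating the GDDR7 figures quoted above, assuming a 32-bit-wide device
# interface (an assumption; the article does not state the bus width).
data_rate_gtps = 48             # gigatransfers per second, per pin
device_bus_width_bits = 32      # bits transferred in parallel per device
throughput_gb_per_s = data_rate_gtps * device_bus_width_bits / 8
print(throughput_gb_per_s)      # 192.0 GB/s per device
```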

Oct 24, 2024

The top tech trends shaping 2025, according to Gartner

The big picture: Gartner has unveiled its top strategic technology trends for 2025, highlighting key areas that enterprises should focus on to stay ahead in the rapidly evolving tech landscape. AI dominates the landscape: Artificial intelligence continues to be a major driving force in technological innovation, with several AI-related trends making Gartner's list. Agentic AI, which refers to intelligent software entities that can autonomously complete tasks and achieve goals, is expected to make 15% of day-to-day work decisions by 2028. AI governance platforms are becoming crucial for managing the legal, ethical, and operational aspects of AI systems, with the potential...

Oct 23, 2024

Denmark unveils AI supercomputer to tackle societal challenges

Denmark's AI leap: A sovereign supercomputer for scientific breakthroughs: Denmark has unveiled Gefion, its largest AI supercomputer, aimed at tackling global challenges and fostering innovation across various scientific domains. The supercomputer, named after a Danish mythological goddess, is an NVIDIA DGX SuperPOD powered by 1,528 NVIDIA H100 Tensor Core GPUs and connected via NVIDIA Quantum-2 InfiniBand networking. Gefion was inaugurated by King Frederik X of Denmark, NVIDIA CEO Jensen Huang, and Danish Center for AI Innovation (DCAI) CEO Nadia Carlsten in Copenhagen. The project is a collaboration between NVIDIA and DCAI, funded by the Novo Nordisk Foundation and the Export...
