The case against LLMs in software development

A software industry veteran offers a critical analysis of Large Language Models and the degradation of software quality over time.

The core argument: The rise of Large Language Models (LLMs) represents a concerning shift in computing, where corporations prioritize profit over software quality and user experience.

Historical context: Earlier software development operated under priorities and constraints that differ sharply from today’s landscape:

  • Programs were faster and more efficient despite limited hardware capabilities
  • Quality control was paramount due to the difficulty of distributing patches
  • Software was typically standalone, purchasable, and didn’t require internet connectivity
  • Applications were simpler, focused on specific use cases, and supported multiple hardware architectures
  • Independent developers could create meaningful applications with limited resources

Current state of technology: Major tech companies have invested heavily in AI and LLM technology, creating significant market pressure:

  • Every major tech company, including Google, Microsoft, Amazon, NVIDIA, AMD, and Apple, had positioned itself as “AI-first” by the end of 2024
  • Traditional services like Google Search and Windows have deteriorated in quality while companies focus on AI integration
  • The substantial financial investments in LLM technology make it unlikely that companies will abandon this direction, regardless of effectiveness

Technical concerns about LLMs: Several fundamental issues exist with current LLM implementation:

  • Systems are slow, expensive, and non-deterministic by design (see the sketch after this list)
  • Results can be inconsistent and unreliable
  • The technology inserts an opaque abstraction layer between users and the computing functions they rely on
  • Environmental impact of massive data center requirements raises sustainability concerns
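
To make the non-determinism point concrete, here is a minimal Python sketch. It is not tied to any real LLM API; the token distribution and the temperature handling are illustrative assumptions. It contrasts a conventional function, which always returns the same output for the same input, with sampling-based text generation, which can return a different token on every call:

    import random

    def deterministic_add(a, b):
        # Conventional software: identical inputs always produce identical output.
        return a + b

    def sample_next_token(token_probs, temperature=0.8):
        # Toy stand-in for LLM decoding: the next token is drawn at random
        # from a probability distribution, so repeated calls with the exact
        # same input can yield different results. Raising each probability
        # to 1/temperature roughly mimics how decoders rescale logits
        # before sampling.
        tokens = list(token_probs)
        weights = [p ** (1.0 / temperature) for p in token_probs.values()]
        return random.choices(tokens, weights=weights, k=1)[0]

    # A hypothetical next-token distribution, for illustration only.
    probs = {"cat": 0.5, "dog": 0.3, "fish": 0.2}
    print(deterministic_add(2, 3))                       # always 5
    print([sample_next_token(probs) for _ in range(5)])  # varies between runs

Pinning a random seed can make a single sampler reproducible, but across model versions, hardware, and batched serving, production LLM systems generally do not guarantee identical outputs for identical inputs.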

Generational impact: The normalization of poor software quality could have lasting effects:

  • Younger users who have never experienced better alternatives may accept subpar performance as normal
  • Growing dependency on LLM tools could reduce understanding of fundamental computing concepts
  • Future generations might never experience deterministic software behavior

Industry implications: The focus on LLMs reflects broader changes in the technology sector:

  • Competition has diminished across operating systems, search engines, and mobile platforms
  • Investment returns often trump technical merit in determining which technologies succeed
  • Companies are positioning LLMs as intermediaries for all computer interactions

Looking ahead: The momentum behind LLM technology, combined with massive corporate investments, suggests this trend will continue despite technical limitations:

  • The technology sector’s consolidation means few alternatives exist
  • Environmental and computational costs may continue to rise
  • Individual resistance options are limited to personal choice in software usage and development practices

Market realities and resistance: While the trajectory seems set due to massive financial investments, individual choices remain important:

  • Some developers and users are actively choosing to avoid LLM integration in their work
  • Small-scale resistance through personal software choices continues
  • The future impact of this technology on computing, the economy, and the environment remains uncertain
Source: “LLMs are everything that is wrong in the world of computing”
