Ruby, particularly with Rails, might be the ideal programming language for AI code generation due to its expressive efficiency and readability. Large language models excel at generating small-scale code but struggle with larger codebases that exceed their context windows. This creates a fundamental constraint that favors languages requiring fewer tokens to express complex functionality.
The big picture: Language models face diminishing performance as context windows fill with code, making token efficiency a crucial factor in determining which programming languages work best with AI assistants.
- The effectiveness of AI-assisted programming depends directly on how many tokens are needed to express a given function or feature: the fewer tokens, the more functionality fits in context.
- Even models advertising large context windows experience degraded performance as more content is added, limiting their practical usefulness with verbose programming languages.
- Code completion tools like GitHub Copilot and Cursor remain the gold standard because they leverage AI’s strength in small-scale changes rather than attempting to generate entire applications.
Why this matters: Programming languages designed for developer happiness and concise expression, like Ruby, may provide significant advantages when working with token-limited AI systems.
- A language requiring fewer tokens per feature allows developers to build more complex applications before hitting the AI’s context limitations.
- This creates a counterintuitive advantage for languages that prioritize human readability and concise syntax over raw performance.
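A hypothetical sketch of what "fewer tokens per feature" looks like in practice: a small "top spenders" feature expressed in idiomatic Ruby. The `Order` struct, the method name, and the data are all invented for illustration; the point is that the whole behavior fits in a handful of short, readable lines an LLM can generate or hold in context cheaply.

```ruby
# Hypothetical example: find the customers with the highest total spend.
# The entire feature is a single short method chain -- few tokens, yet
# every name stays expressive enough for a model to follow the logic.
Order = Struct.new(:customer, :total)

def top_spenders(orders, limit: 3)
  orders
    .group_by(&:customer)                        # orders per customer
    .transform_values { |os| os.sum(&:total) }   # total spend per customer
    .max_by(limit) { |_, total| total }          # top N by spend
    .map(&:first)                                # keep just the names
end

orders = [
  Order.new("ann", 120), Order.new("bob", 30),
  Order.new("ann", 80),  Order.new("cy", 250)
]
top_spenders(orders, limit: 2)  # => ["cy", "ann"]
```

An equivalent feature in a more ceremony-heavy language would typically need explicit types, loops, and intermediate collections, each consuming context-window tokens that Ruby's enumerable chain avoids.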
Technical considerations: The ideal language for AI code generation balances token efficiency with maintainability and readability.
- Languages with extensive boilerplate code (like Go's error handling patterns) consume valuable context window space that could otherwise be used for additional features.
- Unlike humans who can skim repetitive code patterns, AI models must process every token, making verbose languages less efficient for machine-assisted programming.
- Using minified code isn’t the solution, as AI models still need expressive variable names to understand program logic effectively.
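To make the boilerplate point concrete, here is a hypothetical comparison written entirely in Ruby: first a Go-style function that returns an explicit `[value, error]` pair and forces every call site to repeat a check, then the idiomatic Ruby version that lets exceptions carry the failure. Both the function names and the port-parsing scenario are invented for illustration.

```ruby
# Go-style explicit error returns, transliterated into Ruby: every call
# site must repeat the check-and-propagate pattern, and an LLM has to
# emit (and later re-read) all of those tokens.
def parse_port_verbose(value)
  n = Integer(value, exception: false)
  return [nil, "not a number"] if n.nil?
  return [nil, "out of range"] unless (1..65_535).cover?(n)
  [n, nil]
end

port, err = parse_port_verbose("8080")
raise err if err

# Idiomatic Ruby: Integer() raises on bad input, so call sites need no
# per-call boilerplate -- errors propagate until someone rescues them.
def parse_port(value)
  n = Integer(value)
  raise ArgumentError, "port out of range" unless (1..65_535).cover?(n)
  n
end
```

Note that the concise version keeps fully expressive names (`parse_port`, `value`); the token savings come from eliminating repeated control-flow ceremony, not from minifying identifiers.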
Reading between the lines: Ruby on Rails’ focus on developer experience and convention over configuration makes it unexpectedly well-suited for AI programming, despite its performance limitations.
- While typed languages provide important safety checks that compensate for LLMs’ inability to test their own code, Ruby’s expressiveness may outweigh this disadvantage in certain scenarios.
- The argument for using JavaScript and Python remains strong due to their outsized presence in training data, but Ruby’s efficiency might eventually overcome this training bias.
The irony: Ruby, designed to be the most “human” programming language with its natural language-like syntax and focus on programmer happiness, may end up becoming the preferred language for AI-driven development.
The future of AI is Ruby on Rails