Anil Dash, a tech entrepreneur and writer, can be found on his blog anildash.com and LinkedIn. He previously appeared on the podcast in 2020 to discuss Glitch and Glimmer.

Princeton researcher Arvind Narayanan coined the concept of “normal technology,” which Anil Dash explored further in his own article.
AI as a Normal Technology
The discussion begins by addressing the idea of AI as a normal technology. For those familiar with software development, machine learning and adaptive systems have existed for decades. Large Language Models (LLMs) represent a significant breakthrough, but they are an evolution of that lineage rather than something entirely new. It is crucial to evaluate these advancements in that context, as many claims about AI are exaggerated or simply untrue.
Misrepresenting AI’s capabilities can be detrimental to the economy and culture, attracting opportunists. It also detracts from the genuine excitement of creating with code. The real capabilities of AI are impressive enough without exaggeration, and an honest assessment allows for better preparation against potential harms.
For non-technical individuals, advanced technology often appears magical. However, AI can be viewed as another layer of abstraction and automation, continuing a long-standing goal in computer science. LLMs significantly improve natural language interfaces, making technology more accessible to a wider audience. This increased accessibility is powerful, but the hype surrounding AI often misrepresents its true nature, leading to a less nuanced conversation.
The Challenge of Non-Deterministic Systems
A key challenge with LLMs is their anthropomorphic interface, which makes it difficult to avoid attributing human-like qualities to them. When an LLM makes an error, it is often termed a “hallucination,” further reinforcing the perception of personhood. This framing can lead non-experts to misinterpret bugs and leaves them vulnerable to exploitation by those building these systems.
Unlike traditional software, which behaves deterministically (the same input always produces the same output), LLMs are probabilistic: the same prompt can yield different results. Applying these fuzzy tools to problems that require precise, deterministic code can be problematic. Many developers face pressure to integrate LLMs into existing, reliable systems even when a simple, deterministic script would be more appropriate, reliable, and cost-effective. This often stems from a desire to simply “have AI” rather than to use the right tool for the job.
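As a sketch of the distinction: a task like pulling ISO-format dates out of log lines is fully deterministic, so a few lines of regular-expression code (a hypothetical example, not anything from the episode) handle it reliably, repeatably, and at no inference cost:

```python
import re

# Deterministic extraction: the same input always yields the same output,
# with no inference cost and no risk of fabricated results.
DATE_PATTERN = re.compile(r"\b(\d{4})-(\d{2})-(\d{2})\b")

def extract_dates(text: str) -> list[str]:
    """Return all ISO-format dates (YYYY-MM-DD) found in the text."""
    return [match.group(0) for match in DATE_PATTERN.finditer(text)]

log = "deploy 2024-03-15 ok; rollback 2024-03-16 started"
print(extract_dates(log))  # ['2024-03-15', '2024-03-16']
```

Routing such a task through an LLM would add latency, cost, and the possibility of a wrong answer, for no gain in capability.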
Historically, technology was evaluated based on its ability to perform a task reliably. In current hype cycles, decisions are sometimes driven by mandates to adopt new tech, even if it’s not the optimal solution. While past hype cycles, like the adoption of Java, involved programming languages that were Turing complete and capable of the job, LLMs are not always the right tool for tasks requiring reliable, deterministic output. This distinction is often lost on non-experts, highlighting the need for technical fluency in decision-making.
Empowering Developers and Preserving Community Ethos
The capabilities of LLMs are constantly evolving, and they are becoming increasingly powerful tools. For individuals with foundational coding knowledge, LLM-assisted development tools can significantly accelerate the creation process, reigniting the joy of coding by expanding what can be achieved within limited timeframes.
Viewing AI and LLMs as part of a continuum of empowering, accessible, and understandable tools, rather than as mysterious, magical entities, aligns with the spirit of open information sharing. The ethos of Stack Overflow, founded on democratizing access to technical knowledge, is particularly relevant here. The generosity of the Stack Overflow community in openly sharing information was instrumental in training many major AI platforms and LLMs, making their coding assistance possible.
This shared spirit implies a social obligation to reciprocate, ensuring that access to technology remains open and not controlled by a select few. While some argue for treating AI as uniquely special, the overwhelming consensus among developers, product people, and designers is to approach it as normal technology.
Addressing Concerns and the Future of Coding
Concerns exist about the potential for individuals to “vibe code” insecure or non-performant applications without a deep understanding of underlying principles. Previous platforms like Glitch, which aimed to democratize app building, also encountered issues with varying levels of user expertise and security. While the principle of lowering barriers to entry for creators is positive, the execution can range from beneficial to problematic.
The argument that “this is the worst AI will ever be” and that it will constantly improve overlooks the reality that not all technologies consistently get better. A balanced approach involves using LLMs for what they excel at (e.g., creative assistance) and combining them with traditional deterministic tools like security scanners, profilers, and software tests to ensure reliability and security. The solution is not an “all LLM” or “no LLM” dichotomy, but rather an integrated use of tools.
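One way to picture that integrated use (a sketch, with the hypothetical `generate_with_llm` standing in for any real model call) is to treat the model's output as untrusted input that must pass deterministic checks before it is accepted:

```python
import ast

def generate_with_llm(prompt: str) -> str:
    # Hypothetical stand-in for a real model call; it returns a canned
    # snippet here so the sketch is self-contained and runnable.
    return "def add(a, b):\n    return a + b\n"

def passes_deterministic_checks(source: str) -> bool:
    """Gate the model's output behind checks that always give the same verdict."""
    try:
        ast.parse(source)          # 1. must be syntactically valid Python
    except SyntaxError:
        return False
    namespace: dict = {}
    exec(source, namespace)        # 2. must define the expected function...
    add = namespace.get("add")
    if add is None:
        return False
    return add(2, 3) == 5          # 3. ...and pass a concrete test case

snippet = generate_with_llm("write an add function")
print(passes_deterministic_checks(snippet))  # True
```

The creative step is probabilistic, but the acceptance criteria are deterministic, which is the balance the discussion describes: security scanners, profilers, and test suites playing the same gatekeeping role at larger scale.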
There is a concern that the widespread use of AI coding tools could diminish the overall skill level of software developers. However, throughout the history of coding, every new level of abstraction has prompted similar worries, yet developers, as builders, continue to gravitate toward powerful tools and explore deeper levels of the stack as needed. Just as people still write assembly code, new prompt-based engineers may eventually delve into lower-level languages like Rust, driven by curiosity and the inherent challenges of debugging and building.
Coding is inherently social, fostering collaboration, pride in creation, and shared curiosity. This social aspect, central to Stack Overflow’s lessons, will continue to draw people deeper into the technical stack. The community’s commitment to open access and generosity of spirit, championed by its founders, remains vital. In an era where AI models consume vast amounts of internet data, it is crucial to ensure fair exchange and control over intellectual property, akin to artists owning their work. Nurturing and fighting for this community spirit is essential for its continued thriving.

