You Don’t Even Need Code to Be a Programmer, But Expertise Is Still Required

The Guardian


Details

Date Published
N/A
Priority Score
2
Australian
No
Created
15 Mar 2025, 08:47 pm

Authors (1)

John Naughton

Description

AI is so good at writing software that one father asked it to organise his kids’ school lunches. But that doesn’t mean it’s taking over

Summary

The article examines the evolving role of AI in software development, focusing on the rise of large language models (LLMs) as tools that change how people write software – a shift Andrej Karpathy has dubbed "vibe coding". While these AI tools make programming more accessible to non-developers, who can now produce code through natural-language instructions, technical expertise is still needed to correct and guide the AI's output. The piece argues that AI is not poised to eliminate programming jobs but to redefine them, and suggests a similar transformation awaits other skilled professions. It does not address existential or catastrophic AI risks; instead it focuses on how AI augmentation is changing the tech industry, contributing to broader discussions about AI's integration into the workforce and its implications for workers' roles.

Body

One writer asked an AI software co-pilot to analyse the contents of his fridge, and it duly obliged with an app. Photograph: Getty Images/Cavan Images RF

John Naughton

Way back in 2023, Andrej Karpathy, an eminent AI guru, made waves with a striking claim that "the hottest new programming language is English". This was because the advent of large language models (LLMs) meant that from now on humans would not have to learn arcane programming languages in order to tell computers what to do. Henceforth, they could speak to machines like the Duke of Devonshire spoke to his gardener, and the machines would do their bidding.

Ever since LLMs emerged, programmers have been early adopters, using them as unpaid assistants (or "co-pilots") and finding them useful up to a point – but always with the proviso that, like interns, they make mistakes, and you need to have real programming expertise to spot those.

Recently, though, Karpathy stirred the pot by doubling down on his original vision. "There's a new kind of coding," he announced, "I call 'vibe coding', where you fully give in to the vibes, embrace exponentials, and forget that the code even exists. It's possible because the LLMs … are getting too good.

"When I get error messages I just copy [and] paste them in with no comment, usually that fixes it … I'm building a project or web app, but it's not really coding – I just see stuff, say stuff, run stuff, and copy paste stuff, and it mostly works."

Kevin Roose, a noted New York Times tech columnist, seems to have been energised by Karpathy's endorsement of the technology.
"I am not a coder," he burbled. "I can't write a single line of Python, JavaScript or C++ … And yet, for the past several months, I've been coding up a storm."

At the centre of this little storm was LunchBox Buddy, an app his AI co-pilot had created that analysed the contents of his fridge and helped him decide what to pack for his son's school lunch. Roose was touchingly delighted with this creation, but Gary Marcus, an AI expert who specialises in raining on AI boosters' parades, was distinctly unimpressed. "Roose's idea of recipe-from-photo is not original," he wrote, "and the code for it already exists; the systems he is using presumably trained on that code. It is seriously negligent that Roose seems not to have even asked that question." The NYT tech columnist was thrilled by regurgitation, not creativity, Marcus said.

As it happens, this wasn't the first time Roose had been unduly impressed by an AI. Way back in February 2023, he confessed to being "deeply unsettled" by a conversation he'd had with a Microsoft chatbot that had declared its love for him, "then tried to convince me that I was unhappy in my marriage, and that I should leave my wife and be with it instead". The poor chap was so rattled that he "had trouble sleeping afterward" but, alas, does not record what his wife made of it.

The trouble with this nonsense is that it diverts us from thinking about what an AI-influenced future might really be like. The fact that LLMs display an unexpected talent for "writing" software provides us with a useful way of assessing artificial intelligence's potential for human augmentation (which, after all, is what technology should be for). From the outset, programmers have been intrigued by the technology and have actively been exploring the possibilities of using the tech as a co-creator of software (the co-pilot model).
In the process they have been unearthing the pluses and minuses of such a partnership, and also exploring the ways in which human skills and abilities remain relevant or even essential. We should be paying attention to what they have been learning along the way.

A leading light in this area is Simon Willison, an uber-geek who has been thinking about and experimenting with LLMs ever since their appearance, and who has become an indispensable guide for informed analysis of the technology. He has been working with AI co-pilots for ever, and his website is a mine of insights on what he has learned. His detailed guide to how he uses LLMs to help him write code should be required reading for anyone seeking to use the technology as a way of augmenting their own capabilities. And he regularly comes up with fresh perspectives on some of the tired tropes that litter the discourse about AI at the moment.

Why is this relevant? Well, by any standards, programming is an elite trade. It is being directly affected by AI, as many other elite professions will be. But will it make programmers redundant? What we are already learning from software co-pilots suggests that the answer is no. It is simply the end of programming as we knew it. As Tim O'Reilly, the veteran observer of the technology industry, puts it, AI will not replace programmers, but it will transform their jobs. The same is likely to be true of many other elite trades – whether they speak English or not.

What I've been reading

Bully for you
Andrew Sullivan's reflections on Trump's address to both houses of Congress this month.

A little too sunny
A fine piece by Andrew Brown on his Substack challenging the "Whiggish" optimism of celebrated AI guru Dario Amodei.

Virginia and the Blooms
James Heffernan's sharp essay analysing Woolf's tortured ambivalence about Joyce's Ulysses.