Software development

by Steven Vaughan-Nichols 

And you try telling the kids of today that…

That first version had only 14 commands. They included PRINT, IF/THEN, and the soon-to-be-infamous GOTO. Thanks to GOTO, the famous Dutch computer scientist Edsger Dijkstra said, "It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: As potential programmers, they are mentally mutilated beyond hope of regeneration."

With GOTO, it was all too easy for would-be programmers to write what would become known as spaghetti code -- a tangled mess of source code that was almost impossible to understand or debug. Yes, it was easy to write simple programs in BASIC, but it was awful for anything complex.
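To get a feel for the problem, here is a toy listing in the style of early microcomputer BASIC (a made-up illustration, not code from the article). Even at a handful of lines, reading it means hopping back and forth between line numbers to follow the GOTOs; stretch that to a few hundred lines and you have spaghetti.

5 REM TOY EXAMPLE: CONTROL FLOW HOPS AROUND VIA GOTO
10 LET I = 1
20 GOTO 70
30 PRINT "I ="; I
40 LET I = I + 1
50 IF I <= 3 THEN GOTO 30
60 GOTO 90
70 PRINT "START"
80 GOTO 30
90 END

Run it and it prints START followed by I = 1 through I = 3 -- but the only way to see that is to trace the jumps by hand, which is exactly the habit structured programming later replaced with loops and subroutines.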

Still, the keyword was "easy." So, early developers kept using BASIC and porting it to one computer after another. 

Then, as the years rolled by, another paradigm for computing power emerged: the PC. In 1975, instead of sharing computers, you could have one of your very own, with all the power of a 2 MHz Intel 8080 processor.

Two young men, Paul Allen and Bill Gates, proposed to Ed Roberts, maker of the first PC, the Altair 8800, that they port BASIC to his computer. He agreed, and shortly thereafter, they founded Micro-Soft. You know it better as Microsoft.

Yes, that's right. Without BASIC, you're not running Windows today. At about the same time, Steve Wozniak was working on porting BASIC to the first Apple computer, the Apple I. BASIC was essential for Apple's early growth as well.

BASIC also became a staple in home computers like the Atari 400, Commodore 64, and TRS-80. It was featured prominently in early computer magazines, where readers could find BASIC listings and type them in themselves. Or, you could pay real money and get a cassette tape with such popular games as Lunar Lander.

Then, when IBM came out with its first PC, Gates and Allen were ready to take advantage of this new platform. As Don Estridge, president of IBM's Entry Systems Division, said, "Microsoft BASIC had hundreds of thousands of users around the world. How are you going to argue with that?"

via Kathy Reid
by Jennifer Moore 

The intersection of AI hype with that elision of complexity seems to have produced a kind of AI-booster fanboy: people making personal brands out of convincing others to use AI to automate programming. This is an incredibly bad idea. The hard part of programming is building and maintaining a useful mental model of a complex system. The easy part is writing code. They're positioning this tool as a universal solution, but it's only capable of doing the easy part. And even then, it can't do that part reliably.

Human engineers will still have to evaluate and review the code that an AI writes, but they'll now have to do it without the benefit of anyone who understands it. No one can explain it. No one can explain what they were thinking when they wrote it. No one can explain what they expect it to do. Every choice made in writing software is a choice not to do things a different way, and there will be no one who can explain why they made this choice and not the others. In part because it wasn't even a decision that was made. It was a probability that was realized.

But it's worse than AI being merely inadequate for software development. Developing that mental model requires learning about the system. We do that by exploring it. We have to interact with it. We manipulate and change the system, then observe how it responds. We do that by performing the easy, simple programming tasks. Delegating that learning work to machines is the tech equivalent of eating our seed corn. That holds true beyond the scope of any team, or project, or even company. Building those mental models is itself a skill that has to be learned, and we learn it by doing it; there's no other way. As people, and as a profession, we need the early-career jobs so that we can learn how to do the later-career ones. Giving those learning opportunities to computers instead of people is profoundly myopic.