Since the launch of ChatGPT, artificial intelligence has affected people's work and lives in many ways. The author of this article is a self-taught programmer who long believed that programming was an essential skill, but the emergence of GPT-4 changed the way he works. The craft he cherished, and the sense of accomplishment he drew from solving problems, seemed to evaporate in the face of AI. It forced him to rethink the value of programming. What will programming look like in the future?

Just as my parents made sure I could read and write, I took it for granted that my children would learn to program as I did. Programming is among the newer arts and the more fundamental skills, and proficiency in it seemed a natural complement to literacy, a way to stay employable. As I write these words, my wife is pregnant with our first child; she is about three weeks from her due date. I am a professional programmer, but by the time my child can type, coding as a valuable skill might have faded from the world.

With GPT, I lost my passion for programming

The idea first occurred to me on a Friday morning in the summer of 2023, while I was working on a small side project. My friend Ben and I had decided to try to make a New York Times-style crossword puzzle entirely by computer. We had made a crossword in 2018 with the help of software, and our own contribution was minimal: a few nudges here and there as needed. Now we were trying to build a crossword generator that required no human intervention at all.

Our past projects had involved both hardware and software, with Ben the stronger on hardware. We once made a neon sign that lit up when the subway was approaching the stop near our apartment. Ben bent the glass tubes and wired up the transformer's circuit board; I wrote the code that processed the transit data. Ben had done some programming too, but only briefly, more than two decades earlier, so the programming work fell to me. For the new crossword project, however, Ben brought in a third "player": he had subscribed to ChatGPT Plus and was using GPT-4 as a programming assistant.

Then something strange started happening. Ben and I would discuss a program the project needed, and very soon afterward, Ben would have built it. At one point, we needed a command that would print a hundred random lines from a dictionary file. I thought about it for a while and, coming up empty, resorted to Googling. I tried a few of the approaches I found, and just as I started writing code, Ben told GPT-4 what we wanted and got back code that worked perfectly. Commands like these are notoriously fiddly; everyone has to look them up, so you could argue they are not really programming. (A sketch of what such a command might look like follows at the end of this section.)

A few days later, Ben said he wanted an iPhone app that could rate words in the dictionary. But he had no idea how painful iPhone development could be. I had tried several times myself and always given up halfway. I found Apple's development environment forbidding: you have to learn a new programming language, get used to new programs for editing and running code, learn the various "user interface components" and how to fit them together, and finally figure out how to package the app. It was a mountain of things to learn, and it never seemed worth it.
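About that dictionary command: it amounts to only a few lines. Here is a minimal sketch in Python; this is my reconstruction, not GPT-4's actual output, and the dictionary path is the usual Unix location, which may differ on your system.

    import random

    # Print a hundred random lines from a dictionary file.
    # /usr/share/dict/words is a common location on Linux and macOS.
    with open("/usr/share/dict/words") as f:
        words = f.read().splitlines()

    for word in random.sample(words, 100):
        print(word)

The point is not that such code is hard; it's that hunting down the right incantation used to cost twenty minutes, and now it costs nothing.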
Most people today have had some experience with AI, but not everyone has been impressed. "I didn't really respect it until I started using it to write code for me," Ben said recently. I suspect that natural skeptics, watching ChatGPT spew stilted prose or fake facts, are still underestimating what is happening. Knowledge and skill that once took a lifetime to acquire can now be swallowed in one gulp. Programming has always felt to me like an endlessly rich field. Now I find myself wanting to write its elegy.

I thought of Lee Sedol, once the world's top Go player and a national hero in South Korea, now best known for his 2016 loss to a computer program called AlphaGo. Lee went into the match confident he could easily beat the AI. After days of play, he had lost. "I'm sorry for my incompetence," he said at a news conference. Three years later, he retired. A question seemed to nag at him: what would become of this thing that had consumed so much of his life?

I was once obsessed with programming, and worshipped it

My fascination with computers began when I was six years old, in Montreal in the early 1990s, when my brother and I were playing Mortal Kombat. He told me about the game's "fatalities": terrible and clever ways to finish off an opponent, though neither of us knew how to perform them. He dialed into an FTP server on an MS-DOS terminal and typed some obscure commands. Soon he had printed out a page of codes, the instructions for the game's special moves. We went back down to the basement and started beating each other's heads in.

I came to think of my brother as a hacker. Like many programmers, I dreamed of breaking into and taking control of remote systems, not to cause chaos but to find hidden things. "My crime is that of curiosity," Loyd Blankenship wrote in "The Hacker Manifesto" in 1986. There is a scene in the 1995 movie "Hackers" in which Dade Murphy proves his mettle in an underground club. Colorful computer manuals are pulled out of a backpack, and Dade identifies each one by its cover: the green one on the international Unix environment; the red one on the NSA's trusted networks; the one with the guy in a pink shirt on the cover, playing with an IBM PC. Dade put his expertise to work on the school's sprinkler system, and helped clean the ballast tanks of an oil tanker, all by tapping away at a keyboard. The lesson was that knowledge is power.

But how do you learn to hack? My family settled in New Jersey when I was in fifth grade, and in high school I went to the Borders bookstore in the Short Hills mall and bought Beginning Visual C++ by Ivor Horton. It ran to 1,200 pages: my first grimoire. Like many tutorials, it started out easy and then, suddenly, wasn't. Medieval students had a name for the point where a beginner fails and turns back: the pons asinorum, the Bridge of Asses. The term comes from the fifth proposition of Book I of Euclid's Elements, the first genuinely difficult idea in the book. Those who cross the bridge go on to master geometry; those who don't are left to sigh on the near side. Section 4.3 of Beginning Visual C++, on "Dynamic Memory Allocation," was my bridge. I did not cross it.

But I didn't give up on the subject, and I still remember the moment things changed.
It was on a long flight. I had brought a bulky laptop and a CD-ROM with the Borland C++ compiler, the program that converts the code you write into machine code the machine can execute; I had spent days just getting it to run. Traditionally, the first program every beginner writes prints "Hello, world." Imagine explaining over the phone, in a language you barely speak, how to assemble furniture to someone who can't see any pictures. Imagine that the only responses you get are complaints that your suggestions are absurd and that the furniture is coming together worse and worse. When you finally get it done, the relief is enormous. I clearly remember lying on my stomach in the airplane aisle and pressing the Enter key one last time. I sat up, and this time the program did what I had told it to do. "Hello, world" appeared above the cursor, as if some intelligence had woken up and introduced itself to me.

Most of us will never be the hackers depicted in "Hackers." To "hack," in programmer parlance, is really to tinker, to express creativity through code. I never formally studied programming; I just messed around, making computers do small things that were useful or fun. In 2006, during my freshman year of college, I wrote a program to keep up with Tiger Woods's standing at the Masters: it scraped the leaderboard on pgatour.com and texted me whenever he made a birdie or a bogey. Later, after reading Ulysses in an English class, I wrote a program that pulled random sentences from the book, counted their syllables, and assembled haikus. It was a more primitive form of verbal regurgitation than today's chatbots, but one, I thought, capable of real poetry:

I'll flay him alive
Uncertainly he waited
Heavy of the past

(A toy reconstruction of the haiku program appears at the end of this section.)

As a programmer, I lived through the best of times

I started taking programming seriously. I offered to write programs for a friend's startup. I came to understand that the world of computing is vast and as ordered as sedimentary rock. From web browsers down to transistors, each subfield or system is built atop other, older ones, layer upon layer, but in order. The deeper you dig, the more you develop what the racing driver Jackie Stewart called "mechanical sympathy": a feel for the machine's strengths and limits, and for how to make it work for you.

I felt that mechanical sympathy growing in the company of friends. In my sophomore year, while watching Jeopardy! with a friend, he suggested I make a game based on it. I thought about it for a few hours, then told him, deflated, that it was beyond my ability. But the idea came back to me in my junior year, and this time I saw a solution; by then I had a better sense of what the machine could do. I spent the next fourteen hours building the game. Within a few weeks, playing it had become a regular activity among my friends. The experience was profound. I understood why people spend their lives on handicraft: there is nothing more fun than watching other people enjoy the thing you have made.

During this period I was completely absorbed, and I put my studies aside. I was working hard, just not on my classes. One night, I had six machines in a basement lab running a program in parallel. I spread printouts covered in numbers across the floor, thinking through a path-planning algorithm. The price was a recurring nightmare: sitting down for a final exam knowing nothing at all.
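As for the haiku program: it is easy to imagine in outline and hard to get right in the details, since English syllable counting has no clean rules. Here is a toy reconstruction under my own assumptions (the original is long gone), with a crude vowel-group heuristic standing in for a real syllable counter:

    import random
    import re

    def count_syllables(word: str) -> int:
        # Crude heuristic: count runs of vowels, discounting a silent final 'e'.
        word = word.lower()
        n = len(re.findall(r"[aeiouy]+", word))
        if word.endswith("e") and not word.endswith(("le", "ee")) and n > 1:
            n -= 1
        return max(1, n)

    def line_syllables(line: str) -> int:
        return sum(count_syllables(w) for w in re.findall(r"[A-Za-z']+", line))

    def pick_line(sentences, target):
        candidates = [s for s in sentences if line_syllables(s) == target]
        return random.choice(candidates) if candidates else None

    # In the real program the sentences came from the text of Ulysses;
    # a few stand-ins keep this sketch self-contained.
    sentences = [
        "I'll flay him alive",
        "Uncertainly he waited",
        "Heavy of the past",
        "He walked by the shore",
    ]

    haiku = [pick_line(sentences, n) for n in (5, 7, 5)]
    if all(haiku):
        print("\n".join(haiku))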
In 2009, in the depths of the worst financial crisis in decades, I graduated with a 2.9 GPA. And I easily landed my first full-time job. I had experience as a programmer; nobody asked about my grades. For young programmers, these were boom times. Companies fought over the good ones, and the competition for experienced programmers was so fierce that they complained about the flood of job e-mails. The popularity of computer science majors began to explode. (My own degree was in economics.) Coding "boot camps" appeared, claiming to turn beginners into highly paid programmers in less than a year. At a job interview in my early twenties, the CEO asked how much I thought I was worth. I named a number that embarrassed me. He drew up a contract on the spot offering ten percent more than I had asked for.

The skills of the "software engineer" were highly prized. At one company I worked for, someone got in trouble for using HipChat, a predecessor of Slack, to ask a colleague a question directly. He was told to "never contact engineers directly over HipChat." We were too important for that!

This was an era of near-zero interest rates and staggering growth in the tech industry, and certain norms took hold. Companies like Google taught the industry that programmers could expect free espresso and hot meals, world-class health care and parental leave, on-site gyms and bike rooms, casual dress codes, and "twenty percent time," meaning one day a week to spend however they liked. Their skills were held to be so important that a kind of superstition grew up around them. Estimating how long a coding task might take was considered foolish, because a programmer might turn over a rock at any moment and dig up a pile of bugs. Delivery deadlines were anathema, and if the pressure to deliver got too heavy, a programmer could buy a few extra months simply by uttering the word "burnout."

From the start, I felt that something was off. Was what we did really that valuable? How long could the boom last? In my early teens I had done some web design, work that was popular and respected then: you could finish a project in a weekend and make thousands of dollars. But along came tools like Squarespace, which let pizzeria owners and freelance artists build their own websites with a few clicks, and a chunk of the well-paid, relatively easy work disappeared for professional programmers. The community's answer to such changes was that you had to keep upgrading your skills, learning harder and more obscure things. Software engineers, as a species, love automation. Inevitably, the best of them build tools that make other kinds of work obsolete. That instinct explains why we were so well taken care of: code has enormous leverage. One piece of software can affect the jobs of millions of people, including the programmers' own. We told ourselves to think of these advances as a tide rising over our bare feet; as long as we kept learning, we would not be submerged. Wise advice, unless there is a tsunami.

ChatGPT is here, and it changed our work

When we were first allowed to use AI chatbots at work to help with programming, I deliberately avoided them. I assumed my colleagues would do the same. But soon I saw the telltale colors of AI chat sessions on their screens: the zebra stripes of question and answer.
A common claim is that these tools make you more productive, in some cases ten times faster at solving problems. I'm not sure I want that. I enjoy the process of programming itself, and I enjoy feeling useful. The tools I have mastered, like the text editor I use to format and navigate my code, serve both needs. They enhance my practice of the craft, and while they let me finish faster, I still feel that the work was mine. AI, as described, seemed different: it helps so much that I worried it would rob me of both the pleasure of solving the puzzle and the satisfaction of being the one who solved it. I might become vastly more productive, but with nothing to show for it except the results.

The actual work products of most programmers are rarely exciting; in fact, they are often mundane. A few months ago, I came home from work and told my wife I'd had a very good day wrestling with a particularly interesting problem. I was working on a program that generated tables, and someone wanted headers that spanned multiple columns, a feature our custom layout engine didn't support. The work was urgent: the tables went into important documents. So I shut myself in my room for most of the afternoon. There were sub-problems all the way down. How should users of the layout engine express that they want a header spanning columns? What should their code look like? And there were trivial details that would become bugs if ignored: what happens, say, when one of the columns under a spanning header is deleted because it has no data? I knew it was a good day because I had to get out pen and paper, drawing possible scenarios and double-checking my logic.

Yet take a bird's-eye view of that day, and it is hard to imagine anything more mundane than a table getting a new kind of header. For me the joy was entirely in the process, not the outcome. And what would that process amount to if it were a three-minute ChatGPT conversation? Our jobs as programmers involve much more than literally writing code, such as mentoring juniors and designing systems, but coding had always been the root of it. Throughout my career, I had been hired for my ability to solve fiddly little programming puzzles. Suddenly, that ability mattered less.

Conquered by Generative AI

I heard a lot from Ben, who kept telling me about the remarkable results he was getting from GPT-4. It was not just good at the fiddly stuff; it had the qualities of a senior engineer, drawing on a deep well of knowledge to propose solutions. On one project, Ben wired a small speaker and a red LED into a portrait of King Charles. The idea was that when a user typed a message on his website, the speaker would play a tune and the light would flash the message in Morse code. But programming the device to fetch the messages stumped him: it seemed to demand specialized knowledge, not only of the microcontroller he was using but of Firebase, the back-end server technology that stored the messages. Ben asked me for help, and I mumbled a few words; in truth, I wasn't sure what he wanted was even possible. Then he asked GPT-4. It told him that Firebase had a feature that would make the project much simpler, and, just like that, produced code that was compatible with the microcontroller.
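I never saw Ben's code, but the device side of the idea is simple enough to sketch. Below is a hypothetical version of the Morse-flashing logic in Python, not Ben's or GPT-4's actual program; on a real microcontroller the led() function would toggle a GPIO pin, so here it just prints a mark instead:

    import time

    # International Morse code for the letters.
    MORSE = {
        "a": ".-",   "b": "-...", "c": "-.-.", "d": "-..",  "e": ".",
        "f": "..-.", "g": "--.",  "h": "....", "i": "..",   "j": ".---",
        "k": "-.-",  "l": ".-..", "m": "--",   "n": "-.",   "o": "---",
        "p": ".--.", "q": "--.-", "r": ".-.",  "s": "...",  "t": "-",
        "u": "..-",  "v": "...-", "w": ".--",  "x": "-..-", "y": "-.--",
        "z": "--..",
    }

    DOT = 0.2  # one Morse time unit, in seconds

    def led(on: bool) -> None:
        # Stand-in for toggling a GPIO pin on the microcontroller.
        print("#" if on else " ", end="", flush=True)

    def flash(message: str) -> None:
        for word in message.lower().split():
            for letter in word:
                for symbol in MORSE.get(letter, ""):
                    led(True)
                    time.sleep(DOT if symbol == "." else 3 * DOT)  # dot or dash
                    led(False)
                    time.sleep(DOT)      # gap between symbols
                time.sleep(2 * DOT)      # extra gap between letters
            time.sleep(4 * DOT)          # extra gap between words

    flash("god save the king")

The hard part, the one that stumped us, was the plumbing: getting the message from Firebase onto the device. That is exactly the kind of glue work GPT-4 turned out to be good at.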
Although I was apprehensive about GPT-4, and a little uncomfortable paying OpenAI twenty dollars a month for it, I began exploring its capabilities with Ben. We worked on the crossword project together. I'd say, "Why don't you try this prompt?" He'd hand me the keyboard. I'd say, "No, you drive." Together we gradually mapped out what the AI could do. Ben had more experience with it than I did and seemed to use it more effectively. As he later put it, his own neural network had begun to align with GPT-4's. I would say he had achieved machine empathy. At one point, he did something that genuinely surprised me: he had the AI build him a Snake game, like the one on old Nokia phones. Then, after a brief exchange with GPT-4, he had it modify the game so that when you lost, it showed how far you had strayed from the most efficient route. The bot did it in about ten seconds. Frankly, I'm not sure I could have.

Chess has been dominated by AI for decades, and a player's only hope is to partner with the machine. Such half-human, half-AI teams, known as centaurs, could still beat the best humans and the best engines playing alone. Programming has not yet advanced to chess's state, but the centaurs have arrived. So far, GPT-4 alone is a worse programmer than I am, and Ben is much worse, but Ben plus GPT-4 is dangerous.

Soon, I gave in. I had built a small search tool and wanted to highlight the parts of the user's query that matched the results. Splitting the query into words complicated things. My patience wore thin, and I started thinking about GPT-4: maybe I could spend some of the afternoon writing "prompts," chatting with an AI, instead of all of it programming.

In 1978, the computer scientist Edsger W. Dijkstra argued, in an essay titled "On the foolishness of 'natural language programming,'" that if you instructed computers in your native tongue rather than in a formal programming language like C++ or Python, you would be rejecting the very precision that made computers useful. Formal languages, he wrote, are "an amazingly effective tool for ruling out all sorts of nonsense that, when we use our native tongues, are almost impossible to avoid." Dijkstra's view became the consensus of the programming community. When the essay resurfaced on Reddit in 2014, one commenter wrote: "I'm not sure which is scarier: how obvious this idea is, or how many people still don't know it."

Edsger W. Dijkstra (1930-2002)

When I started using GPT-4, I understood what Dijkstra was talking about. You can't just say to an AI, "Solve my problem." That day may come, but for now it is more like an instrument you have to learn to play. You have to specify what you want carefully, as if talking to a beginner. On the search-highlighting problem, I found myself asking GPT-4 to do too much at once, watching it fail, and starting over. Each time, my prompts got less ambitious. In the end, I wasn't talking to it about search or highlighting at all: I had broken the problem down into concrete, well-defined subproblems that, added together, would give me what I wanted. (The sketch below gives the flavor.)

Having discovered what the AI could do, I found my working life beginning to change. GPT-4 was a hammer that made everything look like a nail. I finally understood why my colleagues' screens were always full of chat boxes, and how Ben had become so productive. I opened myself up to it and started using it more often.
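The highlighting task, for example, eventually reduced to subproblems on the order of this one. Below is a minimal sketch under my own assumptions, not the code we actually ended up with:

    import re

    def highlight(query: str, result: str) -> str:
        # Wrap every word of the query that appears in the result in [brackets].
        words = re.findall(r"\w+", query)
        if not words:
            return result
        pattern = r"\b(" + "|".join(re.escape(w) for w in words) + r")\b"
        return re.sub(pattern, r"[\1]", result, flags=re.IGNORECASE)

    print(highlight("snake game", "A Snake game for old Nokia phones."))
    # prints: A [Snake] [game] for old Nokia phones.

Stated like this, the problem is small and well defined, which is precisely what GPT-4 needed from me.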
I went back to the crossword project. Our puzzle generator printed its output in an ugly text format, rows of letters like "s", "c", "a", "r", "*", "k", "u", "n", "i", "s", "*", "a", "r", "e", "a". I wanted to turn that output into a nice web page that let users explore the words in the grid and see scoring information at a glance. But I knew the task would be tricky: every letter had to be labeled with the words it belonged to, both across and down. It was a delicate problem, the kind that might have taken up an entire evening. With the baby coming and my free time draining away, I started chatting with GPT-4 instead. We went back and forth; sometimes I had to study the code it produced to understand what it was doing, but I did very little of what I once would have considered programmatic thinking. I wasn't thinking about numbers or patterns or loops; I never used my brain to simulate the computer's activity. As another programmer, Geoffrey Litt, wrote after a similar experience, "I never engaged my programmer brain." So what did I do?

What will the future be like?

Perhaps it was because he felt the game had been permanently devalued that Lee Sedol quit Go. I had learned programming because computers seemed like a kind of magic. The machine gave you powers, but you had to study its secrets, learn the spells, acquire a special way of thinking. I had felt chosen. I worked hard, thought hard, accumulated arcane knowledge. Then, one day, the same ends could be achieved without the thinking and without the knowledge. Seen from a certain angle, most of a life's work could suddenly look like a waste of time.

Whenever I think of Lee Sedol, I think of chess. When machines conquered that game, some thirty years ago, people feared there would be no reason to play it anymore. Yet chess has never been more popular than it is now, and AI has made the game more alive. A friend of mine recently started learning to play. He has at his disposal an AI coach that serves him games matched to his ability and tells him, after a loss, exactly where he went wrong. Meanwhile, grandmasters study the computer's moves as if reading an oracle. Learning chess has never been easier; studying its deepest secrets has never been more exciting.

Computer science has not been conquered. GPT-4 is impressive, but a layperson can't wield it the way a programmer can. I still feel my career is secure; in fact, I feel more secure than before. As software development becomes easier, it will become more ubiquitous, and programmers will shift toward design, configuration, and maintenance. And while I have always found the fiddly inner parts of programming the most engaging and important, they are not what I am especially good at; I didn't make it through many coding interviews at big tech companies. What I am relatively good at is knowing what's worth building, what users like, and how to communicate, both technically and humanly. A friend of mine called this AI moment "the revenge of the mediocre programmer." As coding itself starts to matter less, maybe the soft skills will shine.

All of which leaves me unsure what to teach my child once he is born. I even suspect that by the time my kids grow up, we will regard "programmers" the way we now look back on "computers," a word that was once a job title for people who did calculations by hand. Typing out C++ or Python code yourself may come to look as ridiculous as punching binary instructions onto cards.
Dijkstra would probably be shocked that getting a computer to do what you want could be as simple as asking politely. So perhaps what needs teaching is not the skill but the spirit. I sometimes wonder what I would have tinkered with had I been born in another era. In the agrarian age, coders might have studied water wheels and crop varieties; in Newton's day, they might have been obsessed with glass, dyes, and timekeeping. I recently read an oral history of the neural-networks field and was struck by how many of the people interviewed, people born around the 1930s, had played with radios as children. Perhaps the next generation will spend its late nights exploring the AI that its parents once treated as a black box. I shouldn't worry about the end of programming. The hacker spirit lives on!

References
[1] This article is reproduced from InfoQ: https://www.infoq.cn/article/0PY4NyD9cPDe4pEQlRVa
[2] Translated from James Somers, "A Coder Considers the Waning Days of the Craft," The New Yorker: https://www.newyorker.com/magazine/2023/11/20/a-coder-considers-the-waning-days-of-the-craft

Source: Fanpu. By James Somers. Translated by Xue Mingdeng. Editor: Wang Mengru.