Part 01 Introduction

The history of computer development spans nearly 80 years, and it would be unrealistic to cover it all in a short article. But don't worry: as an ordinary user, you only need to remember one magical era, and the history of computing will snap into focus before your eyes. That era is the early 1970s.

Part 02 This Present Life at a Glance

Why the early 1970s? Because the computers we use today, and the various technologies they rely on, were all born between 1970 and 1973. If you don't believe it, just look:

➵ In 1970, Intel released the first commercial DRAM memory chip. Computers cannot function without memory.

➵ In 1971, Intel released the first commercial microprocessor, the ancestor of today's CPUs. With it, computers could become small and elegant.

➵ In 1971, Bell Labs released the first version of the Unix operating system. Unix is the ancestor of most operating systems in use today, including the ubiquitous Linux (and therefore Android) and Apple's systems. The "starting point" of computer time, 00:00 on January 1, 1970, is also defined by Unix (a minimal demonstration follows this list).

➵ In 1972, Bell Labs developed the C language, arguably the most influential programming language ever. Many operating systems and programming languages are directly or indirectly built on C.

➵ In 1973, ARPA designed the TCP/IP protocol suite, the cornerstone of the Internet. When the movie "The Wandering Earth 2" saves the Earth by restarting the global Internet, it is the TCP/IP protocol suite doing the work.

➵ In 1973, IBM released the first mechanical hard disk with the structure still used today.

➵ In 1973, Xerox built the first operating system with a graphical interface, enabling simple, intuitive human-computer interaction and laying an important foundation for the popularization of computers.
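As a small aside on that Unix "starting point": the C standard library counts time as seconds elapsed since 00:00:00 UTC on January 1, 1970, so a time_t value of zero decodes to exactly that moment. A minimal sketch:

#include <stdio.h>
#include <time.h>

int main(void) {
    time_t t = 0;                       /* 0 seconds since the Unix epoch */
    /* gmtime() interprets t as UTC; this prints:
       Thu Jan  1 00:00:00 1970 */
    printf("%s", asctime(gmtime(&t)));
    return 0;
}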
In general, the most important components of today's computers were invented in the early 1970s. You might be thinking: wow, the early 1970s were amazing, so many great things happened then. So what have computers achieved in the years since?

Later Stories

➵ In the late 1970s, the personal computer was born, and expensive computers were no longer the exclusive preserve of governments, businesses, and universities.

➵ In the 1980s, graphical operating systems, a boon to consumers, began to spread in commercial products, namely Apple's OS and Windows (written in C/C++).

➵ In the 1990s, programmers' favorite productivity tools were born: Python and Java (both implemented in C/C++).

➵ In the 1990s, the information superhighway on which billions of people now work and live was born: the World Wide Web (built on TCP/IP).

Into the 21st century:

➵ In the 2000s, the kings of smartphone operating systems, Android and iOS, were born (both indirectly descended from Unix).

➵ In the 2010s, the mobile Internet and cloud computing became mainstream.

➵ In the 2020s, the Metaverse, billed as the culmination of existing technologies, took off in 2021.

By contrast, the early 1970s look almost like a cheat code, with groundbreaking achievements packed densely together, while the decades that followed seem comparatively subdued, focused more on optimizing and upgrading existing technology. For example, the CPU has grown from the original 4 bits to today's mainstream 64 bits, with enormous gains in performance. Likewise, memory started out simply as DRAM (dynamic random access memory); it was then upgraded again and again, accumulating prefixes along the way, until today it is called LPDDR5 SDRAM (5th-generation low-power double-data-rate synchronous dynamic RAM).

(Figure 1: Explanation of the memory name.)

If we regard computers after 1970 as their present life, what were they doing in their past life? Didn't the hardware and software systems we have today exist before the 1970s? Remember that the first human moon landing was in 1969, before any of the early-1970s technologies above existed. So what did the systems the moon landing relied on look like?

Part 03 Looking Back at the Past

It is generally agreed that the computer was born in 1946. So what did computers go through on the way to 1970? We can summarize the period in three aspects.

Hardware Miniaturization

The earliest computer circuits used vacuum tubes, and a whole machine was as big as two rooms. It is hard to imagine how much effort our predecessors needed just to keep this huge beast under control. Fortunately, the 1950s brought transistors and then integrated circuits, which allowed the CPU and memory to shrink step by step, and the whole machine to shrink with them. The evolutionary path ran: giant -> large -> medium -> small computers. Microcomputers, the machines on our desks today, did not appear until the 1970s. Integrated circuits made chips smaller and more powerful, supporting a wide range of applications above them. The key process behind integrated circuits is lithography, which is why advanced lithography is called a chokepoint: it sits at the top of the industrial chain, and if it is cut off, everything downstream is helpless.

Humanized Operating Systems

Today's computers seem to have three heads and six arms: they can play music, download videos, and chat on WeChat all at once without breaking a sweat. Early computers, by contrast, were very "focused", finishing one task completely before starting the next. Computers were scarce and slow then, so dozens of people had to queue up and take turns using them. The "batch processing" and then "time-sharing" operating systems that emerged in the 1950s and 1960s improved this poor user experience. The essence of time-sharing is that tasks execute in turns at the micro level but appear to run in parallel at the macro level, an illusion users are mostly unaware of (a toy sketch of the idea follows below). This humanized design solved a real user pain point, and of course it was inseparable from improvements in CPU performance.
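To make the time-sharing idea concrete, here is a deliberately simplified sketch, not any real scheduler: three hypothetical tasks each hold a made-up number of remaining work units, and a loop hands every unfinished task one "time slice" per round.

#include <stdio.h>

#define NTASKS 3

int main(void) {
    int remaining[NTASKS] = {3, 5, 2};  /* hypothetical work units left per task */
    int done = 0;

    while (done < NTASKS) {
        /* Round-robin: give each unfinished task one slice per pass. */
        for (int i = 0; i < NTASKS; i++) {
            if (remaining[i] > 0) {
                printf("slice -> task %d\n", i);   /* micro level: one task at a time */
                if (--remaining[i] == 0) {
                    printf("task %d finished\n", i);
                    done++;
                }
            }
        }
    }
    /* Macro level: the interleaved slices make all tasks appear to run "in parallel". */
    return 0;
}

At the micro level only one task ever runs; at the macro level all three make steady progress, which is exactly the illusion a time-sharing system sells to its users.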
Advanced Programming Languages (Automation)

Python has been hugely popular in recent years, partly because it is simple yet powerful: even non-professionals can use it to automate office work and free themselves from repetitive chores. For that we can thank the high-level languages of decades past. At bottom, a computer understands only 0s and 1s. At first, humans programmed directly in machine language made of 0s and 1s, which was painful to read and write, let alone to debug after an error. Later came assembly language, which was somewhat more readable but still inefficient to work in. Then, in 1957, the first high-level language, Fortran, arrived and made programming automatic in a new sense: people only had to write simple, meaningful notation, and the computer itself translated it into machine language. A few plain symbols could stand in for complex operations, achieving a great deal with little effort, which can fairly be said to have liberated enormous amounts of manpower (a small illustration follows).
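To see what that liberation looks like, compare one line of a high-level language with the machine-level steps it replaces. The program below is a minimal sketch in C; the variable names are hypothetical, and the assembly in the comment is illustrative x86-64-style pseudocode, not exact compiler output.

#include <stdio.h>

int main(void) {
    int price = 7, quantity = 6, tax = 3;

    /* One readable high-level line... */
    int total = price * quantity + tax;

    /* ...stands in for the several machine instructions a compiler emits,
       roughly (illustrative sketch only):
           mov  eax, [price]
           imul eax, [quantity]
           add  eax, [tax]
           mov  [total], eax                                            */
    printf("total = %d\n", total);
    return 0;
}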
Table 1 lists the key developments in early computer technology, as a reference for interested readers.

(Table 1: Key points of early computer development.)

Part 04 Looking to the Future

What is the point of understanding the history of computer development? On the one hand, no matter how much things change, the underlying foundation stays the same. Computing power networks, smart homes, and the Metaverse have been hot topics in recent years, but they are largely integrations and recombinations of existing technologies, such as cloud computing and the Internet of Things, which in turn rest firmly on CPUs, memory, operating systems, networks, and so on. Only by understanding the roots and seeing the essence through the phenomena can we stay calm in the face of ever-changing novelties. The computing power network, for example, did not appear out of thin air: the vision it describes (making computing power an on-demand utility like water and electricity) was already conceived, and practiced at scale, in the 1960s after time-sharing operating systems emerged. It simply could not last, owing to the limitations of its era. (To learn how brilliant, efficient, and fast the computer systems built on this hardware and software are, see the article "Revealing the Secret: Why Letters Can Arrive Instantly Across Thousands of Miles" on the Mobile Labs official account.)

On the other hand, history is a mirror that helps us understand rise and fall. As the international situation shifts, the urgency of domesticating key technologies such as chips and operating systems has grown sharply. The bottom layer, after all, is the foundation: if the underlying technology is pulled away, the superstructure, however splendid and magnificent, will collapse. In the process of domestication we will, to some extent, retrace the development of the computer ecosystem, learning from history and avoiding detours. This article, of course, is only a starting point, an attempt to clear away some of the fog consumers feel about computers and to glimpse the full picture. For more historical material, see the references at the end of the article.

Part 05 Conclusion

Then again, was the early 1970s really that important? In truth, what matters more are the integrated circuit, the time-sharing system, the high-level language, and so on: technologies born in the 1950s and 1960s that laid the groundwork for everything after. The explosion of achievements in the early 1970s was less a coincidence than a natural inevitability. But that does not stop us from treating the early 1970s as a milestone. After all, what we consumers know best is the computer at hand (the microcomputer), and its extraordinary origins trace back precisely to that magical watershed:

In the first three years of the 1970s, the most important hardware and software of today's computers stepped onto the stage.

The "present life" that followed has largely been the continuous upgrading and recombination of those achievements.

The "past life" before it was roughly characterized by three trends: miniaturization of hardware (integrated circuits), humanization of operating systems (time-sharing), and advancement of programming languages (automation).

Easter Eggs

The early-1970s technologies above were all created in the United States. You might guess the United States was at its peak then. On the contrary, times were hard: in the 1970s the post-World-War-II boom ended and the American economy entered stagnation, while politically the country was in the "Soviet offense, American defense" phase of the Cold War and under great pressure.
Even Intel, which would later dominate the CPU industry for decades, was in desperate straits at the time, and in 1975 even considered selling itself. But crises carry both danger and opportunity, don't they? Worried that a Soviet nuclear strike could paralyze its command systems, the United States built the ARPANET (1969), the beginning of the Internet. And although the early 1970s produced achievement after achievement, none of them were yet widespread; the Internet economy did not boom until the 1990s, yet the technical foundation of that boom was laid back in the 1970s.

Today, our country is in the midst of a sanctions crisis, but it is also vigorously promoting the domestication of key technologies, and the industrial system will undergo a bottom-up rebirth. This moment in the early 2020s may be just like that moment in the early 1970s. I believe that the stubborn growth now surviving in the cracks will, before long, bear fruit.