The History of the Computer: First PCs and the Future Computer Timeline

As time marches on, it becomes harder and harder to recall our world before the invention of the computer. For those who tire of answering an endless stream of daily emails, imagining that world may seem like a pleasant dream. But for the rest of modern society, it is probably not something we’d like to imagine. Whether you are a technophile or someone who simply needs a computer for day-to-day tasks, you’ve probably wondered at least once who is responsible for the modern computer.

The History of the Computer

First, a necessary digression: some will argue that the first computer was invented 5,000 years ago, when the Sumerians developed the abacus. For those who can’t remember World History 101, the abacus was a man-made wooden calculating tool that let the user perform and keep track of simple arithmetic. Today, however, a clear distinction is drawn between such calculating aids and modern computers.


The First Electronic Computer

So, let’s move forward to the commonly accepted definition of the modern digital computer. A German engineer named Konrad Zuse developed the first freely programmable computer. Zuse’s machine required three basic elements: a control unit, a memory, and a calculating unit for the arithmetic it needed to process. Zuse continued to build on his work over the years, developing the first algorithmic programming language, and in 1941 he completed the first fully functioning electro-mechanical computer. Despite this progress, Zuse was unable to convince the Nazi government to support his work on a successor based on electronic valves, because building such a machine would have cost an enormous sum. The Nazis thought they were close to winning the war and felt no need to fund further research.


Remarkably, the microchip is only one of many things the 1950s and 1960s are remembered for, yet it is arguably one of the most important inventions in the history of modern man. Jack Kilby (at Texas Instruments) and Robert Noyce (then at Fairchild Semiconductor, later a co-founder of Intel) were not partners; they didn’t even know each other. But as fate would have it, they both invented nearly identical integrated circuits (a.k.a. the microchip) at almost the same time, in 1958 and 1959 respectively. Put simply, integrated circuits allowed computers to do more with fewer parts. Most notably, it was the microchip that enabled man to fly into space and land on the moon. Even so, in the days of vacuum tubes and the early microchip, a computer with less than a megabyte of memory would fill a quarter of a soccer field and cost millions of dollars to produce. It wasn’t until the 1970s that the microchip allowed a computer to fit on top of a desk instead of filling an entire house.


Another important year for the computer was 1962. It is best known for the Cuban Missile Crisis, but it was also the year the first computer game was invented. “Spacewar” was written by a team of MIT programmers led by a young programmer named Steve Russell; the first version took the team about 200 hours to write. Just as significant was the machine it ran on, whose operating system was among the first to allow multiple users to share one computer simultaneously. But it wasn’t until the 1970s that the computer moved out of the expensive university lab and into the living room.

The First Personal Computer

In 1976, Steve Wozniak and Steve Jobs built the first 50 of their personal computers by hand and sold them for the controversial price of $666.66 apiece. It has now been over thirty years since the first mass-market personal computers (the Commodore PET, Apple II, and TRS-80) were introduced to the world in 1977, and the world has been forever changed. In the first three decades of consumer computing, these machines dramatically changed the way billions of people conduct their daily lives, and we have witnessed the growth of a global, multi-billion-dollar industry. The pace at which the technology has advanced, and the increasing availability of products over the years, have had a dramatic impact on the affordability, and in turn the distribution, of personal computers at the consumer level.

Initially, one of these computers, along with a printer and software, would cost the consumer in the range of $2,000 to $3,000. That might not sound like much, but adjusted for inflation to 2008, it would be like spending $7,000 to $10,000 for a computer with 4 to 16 KB of RAM. Considering that the average family income in 1977 was between $13,000 and $16,000, this was obviously not a regular household item. In fact, buying one of these machines in its release year meant spending more on a computer than on the most popular new car of the era, the Ford Pinto, which sold for just under $2,000.
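The inflation figures quoted throughout this article can be reproduced with a simple CPI ratio. The sketch below uses approximate annual-average CPI-U values for 1977 and 2008 (rounded figures from U.S. Bureau of Labor Statistics data, assumed here for illustration); the exact conversion factor depends on which index and year-averaging you choose.

```python
# Rough inflation adjustment via the ratio of Consumer Price Index values.
# CPI figures are approximate annual averages (assumed for illustration).
CPI_1977 = 60.6
CPI_2008 = 215.3

def to_2008_dollars(amount_1977: float) -> float:
    """Convert a 1977 price into approximate 2008 dollars."""
    return amount_1977 * CPI_2008 / CPI_1977

def to_1977_dollars(amount_2008: float) -> float:
    """Convert a 2008 price into approximate 1977 dollars."""
    return amount_2008 * CPI_1977 / CPI_2008

# A $2,000-$3,000 system in 1977 lands near $7,100-$10,700 in 2008 dollars,
# matching the article's "$7,000 to $10,000" range.
low, high = to_2008_dollars(2000), to_2008_dollars(3000)

# A $300 budget PC in 2008 works out to roughly $85 in 1977 money --
# about 95% below the cheapest 1977-era systems.
budget_in_1977 = to_1977_dollars(300)
```

The same two functions also reproduce the later claims in this article: the $105,000 household-income figure and the $85 budget-PC comparison both fall out of the same ~3.5x conversion factor.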

Apple Makes Personal Computers Affordable

Skip ahead a few years. It’s January 1984, and millions of Americans are watching the Raiders pummel the Redskins in Super Bowl XVIII when, during a commercial break in the third quarter, a one-minute ad airs for Apple’s new personal computer. This Orwell-inspired advertisement helps Apple sell 50,000 units of its $2,500 Macintosh within the first two months on the market. That feat had never before been accomplished in the personal computing industry, and it marked a turning point in the market for such devices. Even so, the average computer still cost too much for the average consumer: personal computers were in only 7.9% of American households, the majority of them households earning over $50,000 a year. Adjusted for inflation to 2008, that would be households earning over $105,000 a year.

The PC Goes Mainstream

Within a decade, the percentage of households owning personal computers would more than quadruple to 36.6%, thanks in large part to one computer company: Dell. Michael Dell started his company in the mid-eighties, while still in college, and by 1997 Dell, Inc. had become the largest seller of PCs, shipping its 10 millionth system that year. Dell built its business model around assembling each personal computer to order. That model set Dell apart from the competition, as did the company’s consumer-oriented focus, which let customers customize their computers during the ordering process. By the mid-nineties, competition within the industry had driven prices down to a more affordable $1,000 to $2,000, and, as a result, more and more people from diverse backgrounds were able to purchase personal computers.

Fast-forward to today, when technology has become affordable enough for 62% of the U.S. population to own computers. Within a twenty-year time frame, then, the availability of personal computers has increased to the point that nearly 190 million people in the U.S. now reap the benefits of computing: around 170 million people who couldn’t afford a personal computer twenty years ago now have the means to purchase one. Anyone can look through a catalog in the Sunday newspaper and find PCs selling for as little as $300. Adjust that for inflation and it is like spending $85 for a PC in 1977. Run the numbers and that comes out to roughly 95% less than the going rate when personal computing first emerged on the market, with exponentially greater computing power on top of it.

The Future Timeline of the Computer

Why is this important? Everyone knows that technology is most expensive when it first comes out. When we compare the availability and cost of personal computers in 1977 with 2008, we can begin to see how much cheaper technology is today. And once we understand this, revolutionary ideas like the $100 laptop come about.


Taking that a step further, a team of MIT students has set out to build a very simple computer that would retail for just $12. It is loosely based on the old Apple II, with Nintendo-like controls for basic functions. Ideas like these shape our world and make it a better place; the goal is to make technology available to all people, from all backgrounds, around the world. Thanks to advances in the affordability of technology, that goal is now becoming a reality, even in the Third World. The revolutionary innovators who consistently push the envelope of what is thought possible will continue to transform the way we go about our daily lives and open new realms of opportunity across the globe.
