
Exploring the history of computers

Without computers, HCI would not be a profession. The software that HCI designers work on is steeped in history, and knowing the foundations will allow you to take steps into the future more confidently. So, let's rewind a bit and understand how we got here.

Very early history – the 17th century

Since the beginning of civilization, there has been proof of human beings' ability to quantify and record their interactions. The computer is the outcome of millennia of knowledge, now combined into handheld devices that allow us to quantify our existence. The human accomplishments and innovations that have brought us to today are too numerous to count, but we have been able to advance faster than at any other time because of our ability to harness the achievements of our forefathers in computing history.

In the Enlightenment era, the logician Gottfried Wilhelm Leibniz (a 17th/18th-century German philosopher and mathematician) invented and refined the binary number system, which is the foundation of all computers. Computers have a long history, and they are rooted in machines that can do mathematics.


For more history on Gottfried Wilhelm Leibniz, check out https://www.iep.utm.edu/leib-met/.

Early history – the 17th to 19th centuries

During the Industrial Revolution, we find an explosion of shared ideas accompanying banking, the stock market, and the industrialization of the workforce, which led to the invention of many machines that increased productivity while decreasing the reliance on human labor to execute the work. Take mathematics, for example: counting large numbers is tough, and it is a tedious process for any human being to do manually. Thus, the invention of the mechanical calculator began with Wilhelm Schickard and Blaise Pascal during the 17th century.

During the 18th and 19th centuries, these adding machines were mechanical devices that helped speed up bookkeepers' work. Adding machines and cash registers were the precursors to computers:

The preceding photo shows William S. Burroughs' (1855-1898) adding machine. Adding machines were invented to compensate for the human inability to memorize numbers and to take manually laborious tasks off our hands. In 1886, William S. Burroughs founded the American Arithmometer Company. His first U.S. patent covered a machine with a nine-digit keyboard and a printing mechanism that would print out the total of the computation, with the original model selling for $475. All machines were crank-operated until the first electric models were introduced in 1928. By 1935, the company produced 350 different models of adding machines, both electric and non-electric. Adding machines and typewriters answered specific human tasks of the time, tasks that our computers have now fully taken off our hands through software programs and computation.

Moving into the 20th century, we arrive at the post-Depression era (the 1930s-40s). Machines continued to build on human limitations, and in the late 1930s the first programmable machine (a computer), the Z1, was created by the German engineer Konrad Zuse. The programmable computer is the foundation on which all computers today are built. A computer is transformed by the programs installed on it, each executing a variety of different tasks. Programmability shapes how we think about what a computer can and cannot do. Today, there are millions of programs that do everything from helping us write emails to managing computer networks. The software that is ubiquitous in our world ranges from general-purpose programs used by everyone, such as word processors and internet browsers, to specialized software for specific users, such as 3D modeling or film-editing software.

All computer programs are processed as bits. A bit is the smallest unit of data in a computer: a 0 or a 1 held by a transistor. See the following representation of 8 bits = 1 byte:

Transistors are organized in groups of eight, so each group can store a byte. Computer processing power, memory, and storage are all measured in multiples of bytes: a kilobyte (KB) is 1,024 bytes, 1 MB is 1,024 KB, 1 GB is 1,024 MB, and so on. Computer storage, speed, and size have led to the proliferation of devices, from personal computers to smartphones to smart TVs to internet-connected light bulbs. Now, you might be asking why binary logic and basic computer history matter here.
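The unit arithmetic is easy to verify for yourself. The following is a minimal sketch in Python (my choice of language for examples here, not something from the original text) that simply walks up the binary, 1,024-based units described in this section:

# Byte arithmetic using the binary (1,024-based) units described above.
BITS_PER_BYTE = 8          # 8 bits = 1 byte
KB = 1_024                 # 1 kilobyte = 1,024 bytes
MB = 1_024 * KB            # 1 megabyte = 1,024 kilobytes
GB = 1_024 * MB            # 1 gigabyte = 1,024 megabytes

print(f"1 KB = {KB:,} bytes")    # 1 KB = 1,024 bytes
print(f"1 MB = {MB:,} bytes")    # 1 MB = 1,048,576 bytes
print(f"1 GB = {GB:,} bytes = {GB * BITS_PER_BYTE:,} bits")
# 1 GB = 1,073,741,824 bytes = 8,589,934,592 bits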

Since the 1930s, a great deal of innovation has been directed into the production of computers, and we have been reliant on them for some time. During World War II, computers were used by the Allies (the USA, Britain, France, the Soviet Union, and Poland) to break German communication encryption codes. Alan Turing and his team helped break the Enigma code, which helped the Allies win World War II. Computers allowed organizations to process data far faster than humans could on their own.

Recent history – the 20th century

The post-World War II economies created many opportunities, and the 1950s-1980s saw an explosion in computing technology and computer software creation. The movement of the computer out of specialized clubs and enthusiasts' groups and into the hands of the masses was not arbitrary. Computers are useful, and humans will spend money on valuable products. Just as the adding machine helped people execute math faster, the personal computer is that same idea times a thousand.

The computer was once used only by a small set of researchers, scientists, and academics. Luckily for us, computer enthusiasts broke down those ivory towers and democratized computer programs, making them useful for everyday people. For example, I am part of the generation that had computers in the classroom for the first time. I learned to type on a typewriter before using word-processing software on a Macintosh IIci. My generation was one of the first to be taught 21st-century computing skills as part of our basic education, including access to computers to learn to type, play games, learn math, and make art. The computer in my elementary classroom is the foundation of why I continue to work with computers to this day.

The addition of digital literacy to K-12 education is inseparable from computer software, and its ability to permeate the systems we use and learn from shapes our own capacity to innovate. As computers came down in price and in size, their usability increased to the point where even schoolchildren could learn and operate a program's interface without mastering command-line systems such as MS-DOS. The computer became essential to operating in the modern world. The power of the computer in our society is nothing short of remarkable. Still, you already know this, because you are here attempting to grasp and shape how humans engage with technology.

When computers started being used, they were the size of entire rooms. Over time, they got smaller and faster. The Xerox Palo Alto Research Center (PARC) became a catalyst for many of the ideas that propelled computers into the homes of billions of users. The Xerox Alto systems pioneered the power of the graphical user interface (GUI) and were used for a variety of research into human-computer interaction and computer usage:

Research computers at Xerox PARC inspired Steve Jobs, Steve Wozniak, and others to design a GUI for Apple computers. Some HCI pioneers who came out of Xerox PARC are the following:

  • Butler Lampson (1943-present): A computer scientist and founding member of Xerox PARC who was instrumental in developing the Xerox Alto in 1973, with its three-button mouse and GUI.
  • Charles "Chuck" Patrick Thacker (1943-2017): A computer scientist who helped create a system that allowed users to interact with a computer through a GUI and a mouse. The GUI was later implemented by Steve Jobs, Steve Wozniak, and their colleagues at Apple in the 1984 Macintosh.
  • Alan Kay (1940-present): A pioneering computer scientist well known for his work on object-oriented programming and windowed graphical user interfaces.
  • Mark Weiser (1952-1999): The CTO at Xerox PARC, considered the father of ubiquitous computing (ubicomp).

In September 1991, Mark Weiser wrote The Computer for the 21st Century. On the cusp of the creation of the World Wide Web, these thinkers, engineers, and designers started to understand something profound about the computer. They began to see the potential of computers not just as useful tools but as drivers of culture, the new engines of the modern world.

As computers came down in size and started to run software that was friendlier and more accessible, they quickly became instruments for education, business, and government. In the 1980s and 1990s, the personal computer took off, and companies such as Microsoft, HP, Xerox, IBM, and Apple started creating consumer-friendly hardware and software that could be made portable in the form of laptops. Portability allowed users and workers to be unchained from their desks and move freely throughout the office or the world. This freedom was then augmented by the ability to stay connected at all times through the internet.

There is a rich history of all the factors and technologies that came together to produce the internet, which we will discuss quickly to get us all on the same page.

The 21st century – the internet, smartphones, cloud computing, and IoT

The origins of the internet lie in Cold War government research going as far back as the 1960s and agencies such as DARPA, but in 1991, Tim Berners-Lee invented the World Wide Web and thus the consumer internet, allowing computers to communicate over a network through the HyperText Transfer Protocol (HTTP). The world was then fundamentally altered. Using HyperText Markup Language (HTML), websites could publish their content for all the world to see through a web address. HTML was limited as a language and was augmented by Cascading Style Sheets (CSS), which control the look and feel of a web page, and then JavaScript (JS), which controls its behavior. This built the foundation for modern web pages that both function well and look good. The internet is loved by many because of a combination of standardized web languages (HTML/CSS/JS) plus a way to quickly deliver content around the globe through content delivery networks (CDNs).
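To make that request/response cycle concrete, here is a minimal sketch in Python (the same language used for examples in this section; the address example.com is just a placeholder) showing a computer fetching an HTML document over HTTP, the same basic exchange a browser performs before CSS and JS take over:

from urllib.request import urlopen   # Python's built-in HTTP client

# Request a page over HTTP(S); the server answers with a status code,
# headers, and the HTML document that CSS and JavaScript then enhance.
with urlopen("https://example.com") as response:
    print(response.status)                        # e.g. 200 (OK)
    print(response.headers.get("Content-Type"))   # e.g. text/html; charset=UTF-8
    html = response.read().decode("utf-8")        # the raw HTML markup
    print(html[:120])                             # a peek at the start of the document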

The internet fundamentally altered our existence. You could write a library of books on the impact the internet has made on the world, so I won't go into it too much; however, the expansion of computer networks and the ability to communicate with anyone around the globe has transformed how we consume knowledge. The ability to serve content around the planet via a CDN has a profound impact not only on the number of people we can reach, but also on the content they can consume.

Computers thus moved from devices of business to points of access and entertainment. Connecting users around the world through computer networks has altered how we communicate, exchange ideas, and think. The acceleration toward smaller and smaller computers exploded alongside the expansion of the internet and the communication technologies of Wi-Fi and cellular networks. This has resulted in the acceleration of smartphone technology, cloud-based application infrastructure, and the proliferation of the Internet of Things (IoT). IoT is the ability of everyday objects to be networked and connected to the internet, which allows them to communicate and collect data. All this change has occurred in half a century. The potential of what the next half-century has to offer is where we will pick up the torch. As the personal computer and the software designed to operate it have permeated our jobs, our education, and our media lives, we start to understand that computers are like Pandora's box: once opened, you can't put anything back in. There is no undoing computer technology; we can only ride the wave and learn how to approach our technology systems so that they reflect our human values.

This is not a history book, but some context-setting with the history of HCI and the language that has sprung out of the computer domain is necessary. Let's discuss the role you will play.