As we work with computers today, we should always remember the many people who created the technology we take for granted. There are hundreds we could name, but let’s look at just five of them.
Alan Turing
If any one person made “computer science” into a real science, it was Alan Turing. He addressed the most basic questions: What can be computed? What are the limits of computation? He described a very simple hypothetical device, known today as the “Turing machine,” and showed that it could compute anything that the most sophisticated computing machine can compute.
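To make the idea concrete, here is a minimal sketch of a Turing machine simulator in Python. It illustrates the concept rather than Turing’s original notation, and the bit-flipping machine and rule names are invented for the example:

```python
def run_turing_machine(tape, rules, state="start", head=0, blank="_"):
    """Step the machine until it reaches the 'halt' state; return the tape."""
    tape = list(tape)
    while state != "halt":
        # Read the symbol under the head; past the end of the tape is blank.
        symbol = tape[head] if head < len(tape) else blank
        state, write, move = rules[(state, symbol)]
        if head == len(tape):
            tape.append(blank)  # grow the tape on demand
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape).rstrip(blank)

# Transition rules: (state, symbol read) -> (next state, symbol to write, move).
# This machine walks right, flipping each bit, and halts at the first blank.
flip_bits = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}

print(run_turing_machine("10110", flip_bits))  # prints 01001
```

Simple as it looks, a bigger table of states and rules like this one is enough, in principle, to express any computation a modern machine can perform; that is the heart of Turing’s result.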
During World War II, his work on breaking Germany’s “Enigma” cipher contributed greatly to the Allied victory. He identified several weaknesses in the encryption scheme and in the way it was used.
His 1950 paper, “Computing Machinery and Intelligence,” proposed what we now call the Turing test: if a computer’s conversational responses are indistinguishable from a person’s, we should regard it as thinking. The question is still a regular topic of debate.
Fred Brooks
In 1964 Frederick P. Brooks, Jr. was put in charge of one of the most ambitious software projects to date, an operating system for IBM’s new System/360 line of computers. His classic of computing literature, The Mythical Man-Month, tells stories from that project that are still worth reading.
The problems his team encountered are familiar to developers today, but Brooks had little history to go by. The same architecture had to work on everything from small machines to powerful ones. The project was huge, and he learned firsthand what is now called Brooks’s law: piling more people onto a late software project makes it later, not faster.
Brooks couldn’t afford the delays of first designing a complete architecture and then implementing it, which today we’d call a “waterfall” process. Instead, implementers started with partial specifications, so different stages of work could proceed in parallel. This idea is basic to modern software development.
Margaret Hamilton
People think of software development as a male-dominated endeavor, but women such as Ada Lovelace and Grace Hopper have played a major role in shaping it. Margaret Hamilton, one of the few female programmers on the Apollo space program in the sixties, is often credited with inventing the term “software engineering.”
At MIT, Hamilton led the development of the on-board code for the Apollo 11 command module and lunar module (most of which you can now download from GitHub for free). It was one of the earliest examples of what we’d now call “real-time” software. It accomplished things that are routine today, but were leading-edge technology on computers with a tiny fraction of a modern thermostat’s computing power. Crucially, it prioritized its tasks: during the descent, when the navigation process needed all the available computing power, the computer shed lower-priority tasks and kept running. Without Hamilton’s work, Apollo 11 might have had to abort its landing, and a failure that dramatic could have ended the whole moon program.
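The scheduling idea itself is easy to sketch. The toy Python below is an assumed illustration of priority-based load shedding, not the actual Apollo Guidance Computer code; the task names and capacity figure are invented:

```python
# Toy priority scheduler: when there isn't capacity for every task,
# keep the highest-priority ones and shed the rest.

def schedule(tasks, capacity):
    """tasks is a list of (priority, name); higher numbers matter more."""
    ranked = sorted(tasks, key=lambda t: t[0], reverse=True)
    return ranked[:capacity], ranked[capacity:]  # (kept, shed)

# Hypothetical workload during a descent, with room for only two tasks.
tasks = [(3, "navigation"), (1, "display update"), (2, "radar polling")]
kept, shed = schedule(tasks, capacity=2)
print("running:", [name for _, name in kept])  # navigation, radar polling
print("shed:", [name for _, name in shed])     # display update
```

The design choice that mattered was deciding, ahead of time, which tasks could safely be dropped under overload, so the machine degraded gracefully instead of failing outright.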
In November 2016, she was awarded the Presidential Medal of Freedom for her work.
Bill Gates
We know Bill Gates as the man who built Microsoft and became a billionaire. Before that, though, he was a hands-on pioneer of personal computing. When the first personal computers appeared in the seventies, they were devices for electronics hobbyists, not for offices or casual home users.
In 1975, Gates and Paul Allen wrote an interpreter for the BASIC programming language that would run on the Altair computer, which sold for $439 in kit form. They didn’t even have an Altair; they simulated it on a PDP-10 minicomputer they had access to through earlier support work. With this interpreter, Altair owners could write their own programs.
In 1980, IBM contracted with a tiny software company called Microsoft to supply an operating system for its new personal computer. Gates convinced IBM to let his company keep the rights to the software, and the result, MS-DOS, was the direct ancestor of Windows.
Tim Berners-Lee
In the eighties, the Internet grew out of the military-oriented ARPANET. Researchers were using email and other tools to share information over the network, but there was no standard way to make information available to everyone. Tim Berners-Lee, working at CERN in Switzerland, wrote up a document called “Information Management: A Proposal,” which his boss called “vague, but exciting.” The proposal suggested hypertext — text with embedded links to other documents — as the way to present documents and connect them to existing ones.
He developed this idea into the Hypertext Markup Language, or HTML, and wrote the first Web browser and server to go with it. By promoting these open standards, he led the creation of the World Wide Web.
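For flavor, this is roughly what hypertext looks like in HTML; the address below is a placeholder, not a real document:

```html
<!-- A link embedded in ordinary text: following "the original proposal"
     fetches another document. The URL is illustrative only. -->
<p>See <a href="http://example.org/proposal.html">the original proposal</a>
   for the full argument.</p>
```

The embedded address is the whole trick: any document, anywhere on the network, can point directly at any other.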
Berners-Lee founded the World Wide Web Consortium and, later, the World Wide Web Foundation, and he continues to be active in developing Web standards.
And many others
Highlighting just these five was a tough choice; many more deserve the honor. Whenever you read a blog, send an online message, or watch a video on the Web, take a moment to think of the people whose achievements made it possible.