Yes, Alan Turing is “the father” of modern computer science.
Without him, no modern algorithms, no contemporary concepts of computation.
But what about the personal computer? I’m talking about the one I’m using right now to type these words.
And the web — how you’re reading these words.
So, yes. We do stand on the shoulders of giants.
But the people who made the personal computer possible were science-fiction loving, long-haired hippies.
‘Ask not what your country can do for you. Do it yourself,’ we said, happily perverting J.F.K.’s Inaugural exhortation. Our ethic of self-reliance came partly from science fiction. We all read Robert Heinlein’s epic Stranger in a Strange Land as well as his libertarian screed-novel, The Moon Is a Harsh Mistress. Hippies and nerds alike reveled in Heinlein’s contempt for centralized authority. To this day, computer scientists and technicians are almost universally science-fiction fans. And ever since the 1950s, for reasons that are unclear to me, science fiction has been almost universally libertarian in outlook.
So there you have the answer to the perennial question: why are all computer science geeks Star Wars fans?
Vintage science-fiction books have anti-authoritarian slants that appealed to young people in the sixties and seventies.
First, then: the rejection of authority.
In his 1984 book, Hackers: Heroes of the Computer Revolution, Steven Levy enshrined the hacker ethic — the political beliefs that motivated the creation and promotion of personal computers by said hippies:
- “Access to computers should be unlimited and total.”
- “All information should be free.”
- “Mistrust authority – promote decentralization.”
- “You can create art and beauty on a computer.”
- “Computers can change your life for the better.”
So the idea is that the books brought the original enthusiasm.
What brought the vision?
Can we go as far as arguing that LSD — “turn on, tune in, drop out”, a phrase popularised by Timothy Leary — is what enabled these people to create machines that would set them free? If not, where did the inspiration for the abstractions computers require come from?
For instance, do all the people using IBM’s Lotus software know that the company began with Lotus 1-2-3, a spreadsheet created by Mitch Kapor, a former transcendental meditation teacher, hence the name Lotus?
My point is that the invention of personal computers has a political origin, unlike the invention of the lightbulb — but please let me know if I’m wrong.
The people responsible for the promotion of computers rejected the ideas of authority that led to the horrors of the 20th century. Aided by drugs and the mild climate of California, new ideas popped into their open minds.
Somewhere along the road, though, it went a bit sour.
Initially, Steve Jobs was a different creature than the one known in popular culture today:
In the 1960s and early ’70s, the first generation of hackers emerged in university computer-science departments. They transformed mainframes into virtual personal computers, using a technique called time sharing that provided widespread access to computers. Then in the late ’70s, the second generation invented and manufactured the personal computer. These nonacademic hackers were hard-core counterculture types – like Steve Jobs, a Beatle-haired hippie who had dropped out of Reed College, and Steve Wozniak, a Hewlett-Packard engineer. Before their success with Apple, both Steves developed and sold “blue boxes,” outlaw devices for making free telephone calls. Their contemporary and early collaborator, Lee Felsenstein, who designed the first portable computer, known as the Osborne 1, was a New Left radical who wrote for the renowned underground paper the Berkeley Barb.
In 1995, when Brand wrote this article, he could not have known that by 2016 scores of people would despise Steve Jobs for his wrongdoings — ranging from how he treated his own daughter, to the closed nature of the Apple ecosystem and the despicable way the iPhone is manufactured by Foxconn.
I understand the betrayal computer scientists may feel today.
Jobs took their ideas, applied a healthy dose of human-centred design and marketed them to death.
Brand concludes his piece (again, written in 1995) with a hopeful vision that unfortunately — as iOS and Android, Facebook and the overall rise of social media cement our entrance into the 21st century — is not unfolding as he intended:
Our generation proved in cyberspace that where self-reliance leads, resilience follows, and where generosity leads, prosperity follows. If that dynamic continues, and everything so far suggests that it will, then the information age will bear the distinctive mark of the countercultural ’60s well into the new millennium.
The Internet-equipped smartphone created a new frontier for computer science. Everyone has a computer in their pocket, and people can see their loved ones’ faces from across the globe in the blink of an eye with software like Skype. Closed distribution models (Google Play and the App Store), however, remain hurdles.
But the Internet enables anyone to learn new skills — do you want to learn how to code? How to fix an oven? How to write in Portuguese? — and the Internet enables anyone to then sell those skills by means of products or services. If you’re not into making software, you can use it to advance yourself or your business. And if you’re into making software, it is up to you and me to make it useful — politically and economically.
People like Aaron Swartz, whom Brand could not have known about, are the ones behind new, public technologies like RSS and other open standards. They are part of the fourth generation of hackers he mentioned in his Time article.
We have the tools and the information. Our cleverness will help us cut through the noise and get at the signal.
Next time you boot your computer, maybe you’ll think of its history, maybe you won’t.
Now though, you can’t say you didn’t know.
P.S. Original pieces (categorised in “Commentary”) will now be preceded by a black circle, ●.