The Asimovian Epoch

Centuries past are like the long night, a mysterious dream world. The modern, networked Earth is like daybreak, a world-wide awakening.

The fifty years 1945-1995 could be considered a sort of ‘twilight zone’ between the two. The bulk of the thinking and research done during this period is too new to be of great historical or even deep nostalgic interest, yet it is too old to be treated as current knowledge in the Internet age, where freshness and novelty are prized. In addition, we are still inside several of the epic transformations that began in this time, and it is difficult to appreciate and comprehend a revolution while living within it. Our blind spot for this period is a great tragedy.

This period could be called the ‘Asimovian Epoch’. Here’s why.

Isaac Asimov (1920-1992) was perhaps the last great polymath. He was a biochemistry professor and author who wrote or edited over 500 books (covering 90% of the Dewey Decimal Classification system) and many thousands of shorter works – an average of roughly one book per month over his long career, and one of the greatest outbursts of creativity in history. His favorite topics included history, physics, comedy, and English literature. He was arguably the greatest science fiction writer of all time. He, Robert Heinlein, and Arthur C. Clarke are known as ‘The Big Three’ of the Golden Age of science fiction.

Asimov was a nexus of 20th century thought.

He was strongly influenced by Greek and Roman classics, Shakespeare, Gilbert & Sullivan, and also by contemporaries such as Kurt Vonnegut, Heinlein, Clarke, Harlan Ellison, Carl Sagan, and Marvin Minsky. He in turn influenced the likes of Paul Krugman (Nobel Prize in Economics), Elon Musk (SpaceX, Tesla), most science popularizers (happily now becoming too numerous to list), virtually all science fiction writers, and of course millions of readers.

Optimism came early to Asimov, perhaps because he spent so much time as a child in his father’s candy store. Many were initially drawn to science as teenagers by his optimistic and compelling vision of science and technology. Given the advances that 20th century science brought – space exploration, computation, automation, microbiology, and more – our debt to him on this account alone is inestimable. His masterpiece, the Foundation series, has deeply influenced many others, including me (see Future Psychohistory).

It has become fashionable for some to be suspicious of science and even of liberal education – a trend that Asimov fought tirelessly all his life. He warned against the false notion that democracy means that ‘my ignorance is just as good as your knowledge’.

He went even further, exploring the frontiers of individuality and self-education:

Self-education is, I firmly believe, the only kind of education there is.

Some even credit Asimov with predicting the Internet, especially its use for personalized education. He envisioned a world in which people could pursue their own interests, unfettered by the standard classroom, open to new ideas and serendipity.

Ironically, the very technology that he hoped would spur individuality and diversity has the opposite effect at times. One example is the digitization and archiving of published material. There was a time when books were expensive, highly personal possessions that stayed with the owner for a lifetime, and were sometimes even handed down across generations. People would store letters, pressed flowers, and other snippets of life between their pages, and write personal notes in the margins. Books were not just repositories of language; they were time capsules and valuable historical accounts. However, the best book to scan is a pristine, un-personalized volume, and once it has been scanned, future researchers rely increasingly on that single version to the exclusion of all others. Individuality and diversity are squelched.

A pervasive basic understanding of science was Asimov’s constant goal. He often took the opportunity to describe science not as a storehouse of absolute truth, but rather as a mechanism for exploring nature. Science is not a dogmatic belief system. Learning happens when one asks questions of the universe and then listens attentively and objectively with the ears of science. He wrote eloquently on scientific models, logic, and inference.

Of course Asimov is best known for his robot stories, and his ‘Three Laws of Robotics’. In 2010, the U.S. Congress established National Robotics Week in his honor. It’s possible that he saw artificial intelligence as a way for intellect to exceed the bounds of human bias, subjectivity, and mortality.

I never met Isaac Asimov, but I have spoken to several people who did have that pleasure. Each one related the profound impression he made on them. He died in 1992, but his books continued to be published posthumously. By the end of 1994, the World Wide Web Consortium had been founded at MIT. The dramatic rise of the Web marks the end of the Asimovian epoch.

Academia can sometimes degenerate into a vicious paper chase, with Ego in the front seat, Science in the back seat, and Humanity locked in the trunk. Asimov famously said,

The most exciting phrase to hear in science, the one that heralds new discoveries, is not ‘Eureka!’ but ‘That’s funny…’

Sadly, with hyper-specialization and the loss of interdisciplinary thinkers like Isaac Asimov, the catch-phrase of science has increasingly become: “Oooh! Shiny!”

Minecraft and STEM

Microsoft recently bought Mojang AB, the Swedish company that makes Minecraft, for a reported $2.5 billion. It was not Microsoft’s first attempt to woo Mojang, and many thought they would never be able to close the deal.

In 2009, Markus Persson (‘Notch’), an unknown independent Swedish game developer, released the first version of his creation, Minecraft. It was not a normal ‘game’. It had no defined sequence, no milestones or levels of achievement, and most curiously, no end goal. What it did have was a charming, minimalist world in which a player could ‘mine’ and ‘craft’ for hours, days, or weeks according to their own desires. In fact, the game could be played without any other mobile agents (‘mobs’) at all, leaving an entire world to explore and terraform without limits. In 2010, maps became effectively infinite, with new terrain generated automatically as the player’s avatar (‘Steve’) approached an edge. The fact that no one else ever has or ever will live in, or even see, your privately created world gives one a compelling, almost haunting feeling.
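The trick behind those ‘infinite’ maps is lazy, seeded procedural generation: terrain is built in fixed-size chunks, each derived deterministically from the world seed, and a chunk only comes into existence when a player first wanders near it. Here is a minimal sketch of the idea in Python – the 16×16 chunk size matches Minecraft’s, but the hash-based ‘terrain’ and all the names below are illustrative stand-ins, not Mojang’s actual algorithm:

```python
import hashlib

CHUNK_SIZE = 16  # Minecraft builds terrain in 16x16-block chunks

def terrain_height(seed: int, x: int, z: int) -> int:
    """Deterministically derive a terrain height for block (x, z).
    (The real game layers Perlin-style noise; a hash stands in here.)"""
    digest = hashlib.sha256(f"{seed}:{x}:{z}".encode()).digest()
    return 60 + digest[0] % 16  # heights in the range 60..75

class World:
    """A lazily generated 'infinite' world: each chunk is built only
    when first visited, then cached, so memory use stays finite."""

    def __init__(self, seed: int):
        self.seed = seed
        self.chunks = {}  # (cx, cz) -> 2D grid of terrain heights

    def get_chunk(self, cx: int, cz: int):
        if (cx, cz) not in self.chunks:
            self.chunks[(cx, cz)] = [
                [terrain_height(self.seed,
                                cx * CHUNK_SIZE + dx,
                                cz * CHUNK_SIZE + dz)
                 for dz in range(CHUNK_SIZE)]
                for dx in range(CHUNK_SIZE)]
        return self.chunks[(cx, cz)]

    def height_at(self, x: int, z: int) -> int:
        # Python's floor division and modulo handle negative
        # coordinates correctly, so the map extends in all directions.
        chunk = self.get_chunk(x // CHUNK_SIZE, z // CHUNK_SIZE)
        return chunk[x % CHUNK_SIZE][z % CHUNK_SIZE]
```

Two worlds created from the same seed produce identical terrain at any coordinate, yet only the chunks a player actually visits ever occupy memory – which is what lets a world be boundless in principle while staying finite in practice.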

It reminds me of the 1967 Star Trek episode “Arena”. Captain Kirk is locked in mortal combat with a frightening alien creature on a deserted and barren world. He must find or craft weapons from whatever materials he can find. He finally succeeds by assembling the raw materials for a crude cannon. We have all imagined ourselves cast back to a more primitive environment, where our own imagination and gumption are nearly all we have at our disposal. Interestingly, the goal of many advanced Minecraft players is to populate their world with coffee shops, restaurants, and villages.

Over time, Minecraft has expanded to include many characters, tools, and landscape features, but it has always maintained its minimalist look and feel. That’s the secret to its success – this is a world that is understandable to everyone, from children right through to seniors who grew up before the computer age. While the richness and physics of this world have evolved, the quaint, blocky look of it hasn’t.

The game has since sold many millions of copies, inspired LEGO sets, books, films, and has a huge and active fan base on YouTube. Conventions are held regularly, and Notch achieved rock star status years ago. Minecraft is the very definition of viral Internet success, massively disrupting traditional corporate and marketing models. Without shrink wrap or advertisements, it took over the gaming world.

So, many folks were left scratching their heads over this Microsoft acquisition. On the surface (pardon the pun), it seems like an unlikely fit. However, I don’t see it that way at all. I see it as the happy marriage announcement of two dear old friends.

Around the turn of the millennium, I was a corporate trainer with MCT and MCSD certifications (the ‘M’ is for Microsoft). I often found myself sticking up for Microsoft amidst the disdain of my colleagues. I would cite Bill Gates’ early years as an example of out-of-the-box thinking – the Tandy Model 100, the first true notebook computer, was the last machine he programmed personally. The best Christmas present I ever got was DOS 3.0, and QuickBASIC was one of the best languages I ever used (and I’ve used 50 or more). Maybe Apple and others were more stylish, but it was Microsoft that really put the excitement into personal computing.

Microsoft hopes to gain leverage in the mobile and youth markets. I cannot think of any better vehicle than Minecraft. Far from seeing it as the death knell for Minecraft, I see Microsoft’s acquisition as a renaissance and a real chance for positive, imaginative, and compelling games to get the much larger piece of the pie that they deserve.

“If you talk about STEM education, the best way to introduce anyone to STEM or get their curiosity going on, it’s Minecraft.”
– Microsoft CEO Satya Nadella

Also, there’s Microsoft’s augmented reality headset – HoloLens. There appears to be a plan coming together:
Minecraft + HoloLens + ElementaryProgrammingSkills = Syntonic Learning

If this is what Microsoft has in mind, it could be the most disruptive technology yet. Well done Redmond, and best wishes to Notch in all his future endeavors.