News & Ruminations

Here are some newer happenings and considerations since the article was written in 2011.

Neural Networks, Really?
In 2011, this was the technology that was most on my mind, so it’s not surprising that I named it specifically as the probable base of a model. Since then, I have increasingly leaned towards agent-based models. The problem is that even if I were to write the article anew, I would probably stick with neural networks, only because the very idea of billions of intelligent, learning, interacting agents is still too far “out there”. I don’t like being laughed at, especially by software engineers I respect and admire, so computational psychohistory is fanciful enough for now. However, IBM recently announced a “neurosynaptic” chip (TrueNorth) with 5.4 billion transistors arranged as 1 million neurons and 256 million synapses, so who knows?
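For the curious, here is a minimal sketch of what I mean by an agent-based model, in Haskell of course (my own toy illustration – the agent fields and update rule are invented): each agent carries some state plus a rule for updating it in response to the others.

    -- A toy agent: an identifier and a single piece of evolving state.
    data Agent = Agent { agentId :: Int, opinion :: Double } deriving Show

    -- One synchronous step: every agent nudges its opinion toward the mean.
    step :: [Agent] -> [Agent]
    step agents = map update agents
      where
        mean     = sum (map opinion agents) / fromIntegral (length agents)
        update a = a { opinion = opinion a + 0.1 * (mean - opinion a) }

    -- Run n steps; with this rule the population drifts toward consensus.
    simulate :: Int -> [Agent] -> [Agent]
    simulate n = foldr (.) id (replicate n step)

Scale that up to billions of agents with learned (rather than fixed) rules, and you have the flavor of the thing.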

Haskell? Really?
I’m aware of the criticism that it’s contradictory to say, on the one hand, that the formula (mathematics) is being supplanted by the algorithm (computation), and on the other to recommend a very formal, mathematical computer language (Haskell). I will stick with Haskell, however, for three main reasons.

First, the theme of the article was that Asimov’s original mathematical psychohistory is perhaps more realizable as computational psychohistory, and therefore any computational language, even a formal one such as Haskell, is the essential, big step. The rest is semantics.

Second, a project such as this redefines the concept of scalability in computer programs. The system being considered here would begin with billions of intelligent agents and grow exponentially as they learn. This enforces formality, because only machines could tackle such a chore. More human-friendly, pragmatic languages (scripting languages, say) would quickly be left in the dust.

Third, the concept of ‘laziness’ – evaluating an expression only when its result is actually needed – is crucial, and not often stressed in computer science.
“Laziness in doing stupid things can be a great virtue”
– Father Perrault in Lost Horizon
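To make that concrete, here is the classic demonstration (standard Haskell fare, nothing specific to psychohistory): an infinite computation is perfectly safe, because laziness does only the work that is actually demanded.

    -- An infinite list of primes via the classic (if inefficient) sieve.
    primes :: [Int]
    primes = sieve [2..]
      where sieve (p:xs) = p : sieve [x | x <- xs, x `mod` p /= 0]

    -- Demanding ten elements forces only ten elements’ worth of work:
    -- take 10 primes  ==>  [2,3,5,7,11,13,17,19,23,29]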

“… as soon as it could have…”
The early appearance of life on Earth is well established. Recently, the Kepler mission discovered a planetary system (Kepler-444, roughly 11 billion years old) nearly two and a half times the age of our solar system.

Big Data is not nearly big enough
There has been much written about big data analysis for prediction. However, sifting and massaging massive databases is how one might find nuggets, not direction. What I proposed in the article was a simulation approaching nature itself in detail and complexity – perhaps a good name would be “reality data”.

GPU Programming
In the article, I used GPU as a euphemism for “F—ing big pile of transistors”. Programming GPUs is really a very restrictive, synchronous process. However, the first general-purpose GPU programming languages are beginning to emerge. Not surprisingly, they seem to use the Functional Programming paradigm, just as I predicted 🙂

One of the first I’ve seen is called Harlan (I hope that’s for Harlan Ellison)
Harlan on github
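To show why the functional paradigm fits GPUs so naturally, here is a sketch using Haskell’s Accelerate library rather than Harlan itself (my substitution, since I write Haskell): the program is pure combinators, and the backend decides how to map them onto the hardware.

    import Data.Array.Accelerate as A

    -- Dot product with no explicit threads, blocks, or synchronization;
    -- the Accelerate backend compiles these combinators to GPU kernels.
    dotp :: Acc (Vector Float) -> Acc (Vector Float) -> Acc (Scalar Float)
    dotp xs ys = A.fold (+) 0 (A.zipWith (*) xs ys)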

Dawkins, Dennett, Hofstadter, Asimov, and me
(Yes, the name list is intended to be humorous. I am not worthy of washing the socks of any of the four names listed before mine.) Hofstadter and Dennett have been criticized for reifying Dawkins’ meme metaphor (that ideas are like genes). I too have been rebuked (well, laughed at actually) for doing the same thing with Asimov’s psychohistory. I’m glad I put the non-advocacy line in the introduction, and I take every opportunity to remind people that Asimov died long ago, I never met him, and he wouldn’t have had any reason to even ask me for the time of day.

Stan: A Language for Bayesian Inference
Obviously I’m a big fan of Bayesian methods. A new language promises to deliver such capability into the hands of ordinary developers:
Stan home page
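As a taste of what Stan automates at scale, here is a toy grid approximation in Haskell of Bayesian inference for a coin’s bias (my own sketch – Stan itself samples the posterior with Hamiltonian Monte Carlo rather than using a grid):

    -- Posterior over a coin’s bias after observing some heads and tails,
    -- computed on a crude grid with a flat prior.
    posterior :: Int -> Int -> [(Double, Double)]
    posterior heads tails = zip grid (map (/ total) unnorm)
      where
        grid   = [0.0, 0.01 .. 1.0]           -- candidate bias values
        like t = t ^ heads * (1 - t) ^ tails  -- likelihood of the data
        unnorm = map like grid                -- flat prior: likelihood only
        total  = sum unnorm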

Common Sense: A Tough Nut to Crack
I made the point that common sense is the crux of the matter. There’s just so much tacit and temporal knowledge involved in anything like human intelligence that machines have a long, long way to go. An example is this recent, very powerful AI that can beat many classic arcade games, but not one of the earliest and simplest:
DeepMind and Pac-Man

January 24, 2016: Marvin Minsky dies
“If you just have a single problem to solve, then fine, go ahead and use a neural network.”
– Marvin Minsky

Machine Learning Chips Begin to Arrive
Application Specific Integrated Circuits (ASICs) and Field Programmable Gate Arrays (FPGAs) may offer major improvements in machine learning speed and power. Also, the easing of the demand for near-infinite accuracy/reliability harks back to that Father Perrault line again. Remember that Kirk could beat Spock, even at chess.
Google’s TPU
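The reduced-precision point is easy to demonstrate. Here is an illustrative 8-bit affine quantization in Haskell (my sketch – the exact schemes these chips use vary): values survive being squeezed into a single byte remarkably well, and the arithmetic gets far cheaper.

    import Data.Word (Word8)

    -- Map a value in [lo, hi] onto 8 bits and back; precision is traded
    -- for large savings in memory traffic and arithmetic cost.
    quantize :: Double -> Double -> Double -> Word8
    quantize lo hi x = round (255 * (clamp x - lo) / (hi - lo))
      where clamp = max lo . min hi

    dequantize :: Double -> Double -> Word8 -> Double
    dequantize lo hi q = lo + (hi - lo) * fromIntegral q / 255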

Expanding the Psychohistory Model Beyond Humanity
In reading Asimov’s original description of the Prime Radiant, perhaps the most compelling aspect is its zoom capability. Increasing or decreasing the resolution of events and timelines is, of course, done at the familiar human scale.

This is no longer a technical limitation. The speed of light is a large number, but it is not infinite. Space-time can be resolved to almost arbitrary precision using new technology (subject to Planck, Heisenberg, and sensors), and to fully arbitrary precision using computation. We learn a lot by scaling time (e.g., high-speed photography, or a single book that summarizes the Peloponnesian War). If psychohistory is expanded to model nature in general, not just humanity…
Just sayin’
Computational Vision
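For the flavor of the zoom idea, here is a crude “zoom out” on a timeline in Haskell (purely illustrative – a real Prime Radiant would need far more than block averaging): resolution is traded for span.

    -- Coarsen a time series by averaging consecutive blocks of n samples.
    zoomOut :: Int -> [Double] -> [Double]
    zoomOut _ [] = []
    zoomOut n xs = avg (take n xs) : zoomOut n (drop n xs)
      where avg ys = sum ys / fromIntegral (length ys)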

The Hubris of Neuroscience
I contend that computation >> neuroscience. How far would our modern, much-vaunted methodology get in understanding a simple microprocessor? Not very far…
Could a neuroscientist understand a microprocessor?

DIY Psychohistory
Technology is outpacing the Ivory Tower. AI research is becoming democratized. Several important recent pulsar discoveries have been made by citizen scientists. Could computational psychohistory be far behind?
DIY Deep Learning

TPU Programming
I seem to have underestimated the rate of computational progress. Google has just made a 1,000-TPU cluster available on the cloud.
TPU Cluster
