Author Archives: scidata

The picoXpert Story

picoXpert was one of the first (if not THE first) handheld artificial intelligence (AI) tools ever. It captured human expert knowledge and later gave general users access to that knowledge. It was a simple yet portable implementation of an Expert System Shell. Here is the brief story of how it came to be.

When I was about 10, my grandfather (an accomplished machinist in his day) gave me his slide rule. It was a professional grade, handheld device that quickly performed basic calculations using several etched numeric scales with a central slider. I was immediately captivated by its near-magical power.

In high school, I received an early 4-function pocket calculator as a gift. Such devices were often called ‘electronic slide rules’. It was heavy, slow, and sucked battery power voraciously. I spent many long hours mesmerized by its operation. I scraped my pennies together to try to keep up with ever newer and more capable calculators, finally obtaining an early programmable model in 1977. Handheld machines that ‘think’ were now my obsession.

I read and watched many science fiction stories, and the ones that most fired my imagination were those that involved some sort of portable computation device.

By 1980, I was building and programming personal computers. These were assembled on an open board, using either soldering or wire wrap to surround an 8-bit microprocessor with support components. I always sought those chips with orthogonality in memory and register architecture. They offered the most promise for the unfettered fields on which contemporary AI languages roamed. I liked the COSMAC 1802 for this reason. It had 5,000 transistors; modern processors have several billion. The biggest, baddest, orthogonal processor was the 16- or 32-bit Motorola 68000, but it was too new and expensive, so I used its little brother, the 6809, which was an 8-bit chip that looked similar to a 68000 to the programmer.

I spent much of the 1980s canoeing in Muskoka and Northern Ontario, with a Tandy Model 100 notebook, a primitive solar charger, and paperback editions of Asimov’s “Foundation” trilogy and sequels on-board (I read them five times).

By the mid-1990s, Jeff Hawkins had created the Palm™ handheld computer. The processor he chose was a tiny, cheap version of the 68000 called the ‘DragonBall’. I don’t know which I found more compelling – this little wonder or the fact that it was designed by a neuroscientist. I finally had in my hand a device with the speed, memory, and portability to fulfill my AI dreams.

The 1990s saw the death of Isaac Asimov (one of my greatest heroes), but also saw me finally gaining enough software skills to implement a few Palm designs. These were mainly created in Forth and Prolog. The Sojourner rover of the 1997 Mars Pathfinder mission was based on the same 80C85 microprocessor used in the Tandy Model 100 that I had used years earlier. This fact warmed my heart.

In 2001, I formed Picodoc Corporation, and released picoXpert.

Here are the original brochure, a primer on Expert Systems, and a few slides.

It met with initial enthusiasm from a few, such as this review:

Handheld Computing Mobility
Jan/Feb 2003  p. 51
picoXpert  Problem-solving in the palm of your hand
by David Haskin

However, the time for handheld AI had not yet come. After a couple of years of trying to penetrate the market, I moved on to other endeavours. These included more advanced AI such as Neural Networks and Agent-Based Models. In 2011, I wrote Future Psychohistory to explore Asimov’s greatest idea in the context of modern computation.

Picodoc Corporation still exists, although it has been dormant for many years. It’s encouraging to see the current explosion of interest in AI, especially the burgeoning Canadian AI scene. For those like me, who have been working away in near anonymity for decades, it’s a time of great excitement and hope. Today, I’m mainly into computational citizen science, and advanced technologies that might be applied to it.

Microsoft, Minecraft, and AI: Project Malmo

In a previous post, I complimented Microsoft on their purchase of Minecraft. I ruminated on the potential for STEM and experiential learning it opens up, particularly with the addition of HoloLens and augmented reality. Recently, Microsoft announced the public availability of their Project Malmo platform that uses Minecraft to provide an interactive environment for development and testing of intelligent agents. This further illuminates Microsoft’s long-term plans for Minecraft.

In contrast to highly specific, applied AI, Project Malmo harnesses the almost unlimited exploration and experimentation possibilities of Minecraft as a research tool for general artificial intelligence. Agents can learn, comprehend, communicate, and do tasks in the Minecraft world. A mod for Minecraft provides an API for programming, and uses XML and JSON to hold representations of the world in computer memory. Agents can explore their surroundings, see and touch Minecraft blocks, take actions, and receive feedback from the environment. This enables reinforcement learning. Instead of just applying deductive, symbolic reasoning, agents can benefit from inductive (experiential) learning.
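The observe-act-feedback cycle described above can be sketched as a minimal reinforcement-learning loop. The GridWorld environment, its actions, and its reward values below are invented stand-ins for illustration only, not the real Project Malmo API:

```python
# A toy environment standing in for a Minecraft world (illustrative only;
# not the Project Malmo API). The agent must reach the last cell of a
# one-dimensional grid.
class GridWorld:
    def __init__(self, size=5):
        self.size = size
        self.pos = 0            # agent starts at cell 0; goal is the last cell

    def step(self, action):
        # action is +1 (forward) or -1 (back); moves are clamped to the grid
        self.pos = max(0, min(self.size - 1, self.pos + action))
        reward = 1.0 if self.pos == self.size - 1 else -0.01
        done = self.pos == self.size - 1
        return self.pos, reward, done   # observation, feedback, episode end

def run_episode(policy, env):
    """One full observe-act-receive-feedback cycle until the episode ends."""
    total, done = 0.0, False
    obs = env.pos
    while not done:
        obs, reward, done = env.step(policy(obs))
        total += reward
    return total

# A trivial policy that always moves toward the goal; a learning agent
# would instead improve its policy from the accumulated rewards.
score = run_episode(lambda obs: +1, GridWorld())
```

The small negative reward per step is what nudges a learning agent toward efficient paths; Malmo provides this same loop against a full Minecraft world rather than a toy grid.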

The potential benefits are compelling.

Sandbox development allows strategies and algorithms developed in the Minecraft world to be later moved to a much larger simulation environment, possibly on distributed systems and/or supercomputers. Computational agents do not require sensory augmentation or biological interfaces, since they connect ‘directly’ to the simulated world. The ability to ‘overclock’ the simulation frees agents from the limits of our time scale (they can ‘live’ many days in an hour of real time).

For remote sensing applications, such as planetary probes & rovers, the benefits are huge. Autonomous machines must be developed and tested before they are deployed. The time delay incurred by vast distance precludes ground-based control. By the time people on Earth receive video from Mars and send the command, “don’t drive off that cliff”, it’s far too late. The ‘smarts’ to navigate and make decisions locally must exist in the robot. Hazardous locations, even here on Earth, also require considerable autonomous learning and decision-making.

Collaboration and comparison between people and teams are possible with a common testing ground. Minecraft can be played individually, but the real power is that a Minecraft world can be hosted on a server, enabling many people, wherever they are, to participate in that world simultaneously. Players, agents, landscapes & artifacts, or even a combination can all exist and interact. This is much more like the natural world.

Project Malmo encourages public participation and involvement. It’s an open architecture that invites experimentation by all. This is somewhat similar to the citizen science movement, which not only provides the benefits of ‘crowdsourcing’ to scientific research, but also enhances public understanding of scientific methods. In the modern world, a pervasive, basic understanding of artificial intelligence would be of tremendous benefit.

The ability to go far beyond the development of a single agent also opens up the possibility of social simulations of almost limitless scale. Agent-based models have long shown their usefulness in the study of social interaction, from ant colonies to human nations and cultures. One example is the simulation of ancient civilizations, which could lead to an entirely new approach to deciphering Linear A. I considered a much vaster model in Future Psychohistory. In 2011, when I wrote that article, I had a standard neural network model in mind, but if I were to write it today, I’d definitely go with intelligent agents.

The concept of agency is central in many areas of life science. Relational databases are far too static for life science (see Cassandra for Life Science). The proper representation of life is not tabular, but associative. Dynamic systems interact, adapt, and even learn. This is the basis of evolution. Machinery as tiny as enzymes and ribosomes, and possibly even medical nano-robots in the near future, all require much more dynamic models to study.

Project Malmo offers major benefits for teaching computer programming. Different paradigms such as procedural, functional, object oriented, declarative, and concatenative are all suitable and helpful in the construction of agents. And of course, there’s one of my favourite hobby horses – constructionism. This only adds to the already expansive use of Minecraft in education.

And for game play itself, Project Malmo can be of great usefulness. Dangerous and tedious tasks can be handed off to agents, amplifying a human player’s efforts. Tutoring and nurturing AI agents is a good use of human intelligence — and it’s a lot of fun.

AI research and Minecraft now have a powerful and resourceful champion – nice.
I look forward to seeing the fruits of Project Malmo.

Learning does not mirror Teaching

An implicit assumption in most educational infrastructure is that teaching and learning are closely similar processes, perhaps even mirror images of each other. In the abstract at least, there is a transfer of knowledge from teacher to learner. It’s even possible that once a learner has assimilated enough taught knowledge, they could ‘switch polarity’ and become a teacher. No, this is not another well-crafted advocacy for sweeping reform in the educational system. I am neither qualified nor motivated to deliver such a thing. I just want to say a few words in the context of some aspects of computational thinking. There are several ways to categorize programming languages: procedural, declarative, functional, concatenative, syntonic, object oriented, data oriented, etc. My point is merely this: teaching is declarative, learning is syntonic.

The ability to acquire and the ability to impart are wholly different talents. The former may exist in the most liberal manner without the latter.
– Horace Mann

When we start out, we immediately inherit a vast and exponentially expanding body of human knowledge. This is our birthright; it does not belong to gatekeepers or authorities who mete out crumbs as they see fit. Academia has the task of adding to and curating this knowledge – it doesn’t own it.

The unifying term ‘education’ implies a deep connection, a yin-yang, almost mathematical sort of symmetry between teaching and learning. This symmetry is a perception eagerly supported by academia; it’s an intuitive and widely held view, a ‘central dogma’. There is little evidence for it, however. It should be remembered that mathematics is only shorthand for the complexity of nature. Nature is a realm of computation and evolution, and mathematics is one of the tools that enables a vastly simplified model of reality to be held in a three-pound hominid brain. It is often said that mathematical concepts such as π and the Fibonacci Sequence are seen everywhere in nature. That’s true, but seen by whom? Snails and daisies, or humans? The fact that we see a pattern does not necessarily mean that a ‘Deep Truth’ has been discovered. Anthropomorphizing nature is a mistake. Furthermore, it is difficult to see even a logical similarity between Plato in the olive groves of Athens and the result of many millions of years of evolution by variation and natural selection.

There is one aspect of teaching, though, that is highly influential on learning: a teacher’s capacity to inspire.

If you want to build a ship, don’t drum up people to collect wood and don’t assign them tasks and work, but rather teach them to long for the endless immensity of the sea.
– Antoine de Saint-Exupéry

Human knowledge may be a birthright, but the storage and delivery systems for that knowledge are subject to the laws of socio-economics just like every other industry. Papyrus, the printing press, telegraphy, telephony, electronic media, and ultimately the Internet have marked the path of technology.

While not able to lay out an exact, guaranteed path, a teacher can describe the landscape, list known boundary conditions, and illustrate and clarify goals and heuristics. Teaching is, therefore, a formal, objective, descriptive task. In programming parlance, it is ‘declarative’.

Learning stuff is a very different topic from teaching. We have basically the same neurology as people did way back when banging rocks together was high technology. We evolved to find food, avoid predators, and reproduce. Of course, when intelligence arrived on the scene, things became ‘non-linear’. When social behaviour and language arrived, Alice tumbled down the rabbit hole.

The smartphone is a testament to language, science, and technology, but not increased individual intelligence. In 1965, a good pocket radio had a handful of transistors. Today’s smartphone has over a billion. People haven’t gotten a hundred million times smarter in the last 50 years (at least I know that I haven’t). Buckminster Fuller’s “Knowledge Doubling Curve” goes from 100 years around 1900, to 25 years around 1950, to 1 year today, to months/weeks/days/hours? soon. Accurate predictions are difficult because human activity is now blending with machine learning, and it’s a whole new ball game. If the central dogma that teaching and learning are symmetrical ever was true, it is becoming less true with each passing year.

So how do human learners continue to even be relevant? Well, the good news is that the same evolved learning capacity we’ve always had is applicable to any level of abstraction. In fact, perhaps a serious exploration of exactly what ‘level of abstraction’ means would be a good thing for young minds. An associated idea is that ‘things’ are not of primary importance, but rather that the connections between things are. Metaphors are examples of such connections. If we can conceptualize atoms and galaxies in terms of table-top models, we have a shot at comprehension. Also, people can learn on their own using reasoning, common sense (bootstrapping), reverse engineering, and intelligent trial and error.

The key element to learning is experience. It makes little difference how logical or well laid out an argument is if the learner has no connection to it. That’s what is meant by ‘syntonic learning’:

Educators sometimes hold up an ideal of knowledge as having the kind of coherence defined by formal logic. But these ideals bear little resemblance to the way in which most people experience themselves. The subjective experience of knowledge is more similar to the chaos and controversy of competing agents than to the certitude and orderliness of p’s implying q’s. The discrepancy between our experience of ourselves and our idealizations of knowledge has an effect: It intimidates us, it lessens the sense of our own competence, and it leads us into counterproductive strategies for learning and thinking.
– Seymour Papert

A body of knowledge is much more compelling if it can be explored subjectively, at the learner’s own speed and depth, because memorability is a big part of learning:

When you make the finding yourself – even if you’re the last person on Earth to see the light – you’ll never forget it.
– Carl Sagan

Teaching does not and cannot encompass learning:

What we become depends on what we read after all of the professors have finished with us. The greatest university of all is a collection of books.
– Thomas Carlyle

Learning is not containable in bricks and mortar or bureaucracy. It is, very simply, what every human does whenever free to do so. ‘Education’ is really just another word for ‘learning’:

Self-education is, I firmly believe, the only kind of education there is.
– Isaac Asimov

It may be tempting to assert that Socratic dialectic is a suitable substitute for syntonicity. However, the former, while undeniably powerful and valuable, still involves knowledge transfer between human minds. This, by necessity, requires formalism, symbolism, and formulae. Syntonicity, on the other hand, requires nothing but a human mind exploring reality, with the aid of machine computation (algorithms) if necessary. Learning is, therefore, an informal, subjective, experiential task. In programming parlance, it is ‘syntonic’.

Institutional Citizen Science

Personnel motivation and esprit de corps have always been important in any organization. Citizenship and Corporate Social Responsibility (CSR) have sometimes become as important to the brand as the trade name or logo. For many reasons, it is wise to consider institutional citizen science.

Traditionally, participation in citizen science projects has been done at the individual level. That is, observations (e.g. ecosystem projects), identifications (e.g. galaxy classification), and computational contributions (e.g. protein folding simulation) have been made by individuals. People sometimes join teams of like-minded or geographically grouped participants, and their efforts are often reported or tallied as a team. However, there has been very little organizational-level participation. There are many potential benefits of institutional citizen science, and in particular the computational variety.

Employee engagement can be improved. IT staff can provide leadership in setting up the required infrastructure, even with minimal initial effort. They can provide ongoing maintenance, expansion, and IT efficiency improvements. They can learn a lot along the way. Communications staff can prepare and disseminate any required internal information, and again, learn a lot along the way. Seeing the daily progress of the organization’s participation can engage everyone. A well-run project can advance the cause of a more inspired, invigorated, enthusiastic, energized, and empowered staff with more of a sense of ownership for their organization. Progress in citizen science projects could be shown in a dedicated section of the organization’s intranet. Perhaps even a big screen could be located in common areas such as the lobby or cafeteria to show live content (e.g. simulations, animations, numerical results) and promote a sense of community. Management can simultaneously learn a lot about concepts such as computing as talent, cognitive computing, ‘gamification’, and integrating technology.

Restoration of Miss Canada IV speedboat (courtesy http://www.tom-adams-muskoka-boats.com)

Institutional culture can benefit. Loyalty and pride in the institution are valuable assets. Leadership in ‘doing good’ is a strong motivator and has been a cornerstone of CSR for decades. There are opportunities for recognition and appreciation of both individual and team efforts. Both individuals and groups can suggest which projects to participate in from the large and growing menu available. Citizen science projects offer opportunities for people to think outside the box, to step out of their comfort zone, to consider more diverse possibilities, to form new partnerships, and to take the long view. A culture with all specialists and no generalists needs fresh air to breathe. The study of nature can offer a welcome break from politics and policy considerations, immediately and easily putting everyone on the same level: an observer.

Institutional innovation can benefit. Although it is tempting (when myopically studying spreadsheets) to farm everything out to consultants and sub-contractors, in-house innovation can be extremely valuable. Skunkworks (small teams for experimental projects) and Bimodal IT (production and exploration as separate yet symbiotic streams) can provide huge benefits, and citizen science is a natural skunkworks project. Notions of siloed knowledge and operation can be skeptically reviewed and perhaps even stress-tested without having to disrupt the larger organization. ‘What if’ models can move from pure theory to at least partial practicality. Owning innovation, moving it vertically through an organization at the appropriate pace, and finally delivering it to the world can cultivate innovation itself as an asset. Like the old proverb says: “Give someone a fish and they’ll eat for a day. Teach someone to fish and they’ll eat for a lifetime.”

Internal HR can benefit. Management and leadership talent can be identified and incubated in a non-threatening, non-competitive domain. Understanding the internal talent ecosystem is essential for the health and future scalability of any organization.

Governments can draw upon citizen science as well. On a regional or national scale, public policy can both encourage and benefit from an actively engaged citizenry. In-depth issues such as climate change, demographic change, disruptive technologies such as Artificial Intelligence (AI) and Automation, and general scientific and digital literacy become much easier to create a dialog around if communication and participation are two-way. Agile, multi-disciplinary, multi-lingual, and age-spanning efforts are all increasingly valuable. A sparse and diverse population can come together on a unified effort without sacrificing, and perhaps even benefiting from, that diversity.

In the coming age of AI and Computer-Generated Imagery (CGI), there will be a tsunami of hoaxes, spoofing, and fake news. At best, mistaking such things for real content is embarrassing. At worst, these could represent an existential threat. The surest defense against these dangers is scientific literacy, both in the general population, and particularly within the organization. The first step in avoiding a trap is knowing of its existence.

There is also of course, a direct benefit to scientific research. Citizen science is not a replacement for academic research, it’s an adjunct. Projects run under the supervision and purview of scientists benefit in several ways from citizen participation. There is an increase in resources, harnessing more labor (e.g. collecting data), human intelligence (e.g. categorizing images), and computational power (e.g. crunching numbers for simulations). There is an increase in scope, drawing from a wider pool of time, space, and experience. There is also an increase in public awareness of scientific research and methodology. Scientific research is its own reward, and is worth defending.

Finally, there are the usual advantages that come with economy of scale. By gathering the efforts of many individuals under one ‘roof’, much wasteful duplication is avoided. Looking at computational citizen science in particular, instead of having members individually set up and run their own ‘crunching’ computers at home, they can participate at the workplace or remotely. The performance per watt of one big machine is much better than that of many smaller ones. It’s also a way for social skills to advance, rather than atrophy in isolation, in the internet age.

Organizational learning requires interaction and participation. Growth requires innovation. Basic scientific literacy improves objectivity and comprehension of a complex world. Computational thinking improves problem solving skills. Improved use of reason and logic for analytical thinking, deduction, and inference might result. These skills and attitudes may not be easily quantified or measured, yet they surely benefit the organization, especially in the long term. Learning becomes an organizational task and goal, resulting in a more knowledgeable enterprise as a whole. Improvements in individual skills, together with deeper and wider internal communication, go a long way toward that end. Diversity of learning styles, participation levels, and paces can be accommodated. The best organizations assign the ends, not the means.

“If you want to build a ship, don’t drum up people to collect wood and don’t assign them tasks and work, but rather teach them to long for the endless immensity of the sea.”
– Antoine de Saint-Exupéry

Forth: A Syntonic Language

Educators sometimes hold up an ideal of knowledge as having the kind of coherence defined by formal logic. But these ideals bear little resemblance to the way in which most people experience themselves. The subjective experience of knowledge is more similar to the chaos and controversy of competing agents than to the certitude and orderliness of p’s implying q’s. The discrepancy between our experience of ourselves and our idealizations of knowledge has an effect: It intimidates us, it lessens the sense of our own competence, and it leads us into counterproductive strategies for learning and thinking.¹
– Seymour Papert

People are sentient and we know that we are sentient. We are also social creatures. The human mind is aware of ‘agency’, in both itself and others. Since the beginning of recorded history, and probably long before, we have even ascribed agency to objects and events in the natural world. Once we attain consciousness, it becomes powerful and efficient to reuse this mind machinery to understand the world around us in familiar terms. Mirror neurons in the human brain enable us to read or infer intention in others. They do not merely form patterns of thought – they reflect them.

Syntonic is a word sometimes found in music theory, but in psychology, syntonicity means having a strong emotional connection with the environment. It is understanding something through identification with it. It is achieved by putting oneself “in another’s shoes”, and it is key to human learning. One form of this is imitation, and children do this all the time, with everything from trees and animals to teapots and airplanes. Another form is metaphor, where an established understanding is substituted for a new and unfamiliar thing. See Ancient Metaphors for examples from the IT world.

When we mature, many of us discover computer programming and would like to use the same built-in learning method that allowed us to develop ‘common sense’. However, most programming languages and their textbooks are so abstract, formal, and doctrinal that it’s nigh-on impossible to get very far this way. As a result, computational thinking recedes into the ivory tower, and the world is much worse off because of it.

An interesting and more in-depth exploration of syntonicity in programming languages was written by Stuart Watt.²

 

I have found Forth to be a very syntonic language.

Syntax and structure are the first face that we see of a new language. Some are fairly natural (e.g. BASIC), most are verbose and intricate, and some are at the extreme end of mathematical formality (e.g. Haskell). Forth is positively Spartan by comparison, with the majority of keywords being three letters or less, and the ‘: … ;’ pair providing most of the structure and context. Some long-time users of Forth do not even like its syntax and quickly override keywords (a trivial task in Forth), inventing their own to craft the language to be more in tune with their way of thinking. Ironically, this makes them love Forth more, not less. It is an extremely simple language. At its core, Forth is just a mechanism to reduce the cost of calling a subroutine to merely referring to it by name. But this is not what makes Forth syntonic.

“syntonicity is not directly a property of the syntax, or even the semantics or pragmatics of a language, but a property of the stories that we tell about it” (Watt p.5)

Forth is a stack-based, threaded, concatenative language with a tightly coupled interpreter and compiler. To explain these terms, and thus ‘tell a story of Forth’, we’ll use a metaphor:
Forth – a hiker (named ‘Arkady’ for convenience)
Stack – her backpack
Thread – the trail
Interpreter – her mind (neurology)
Compiler – her logbook

Notice that there is no clear distinction between traveler (Forth) and journey (program). Arkady’s journey is of more interest to her than her destination. She is open to distraction, side-trips, and unexpected eventualities. Even mistakes are sometimes learning opportunities. This is the way of Forth – less focus on design and protocol, more on exploration, discovery, and emergence. While most languages are applicative (functions are applied to data), Forth is concatenative, which is a much more natural and emergent process. There are also few safeguards in Forth; the entire environment (machine) is wide open. This freedom is anathema to authorities of language design, which might partly explain why Forth-like languages are not more widespread. It’s much safer to clone an application from an existing template and/or framework than to invent a new one. Of course, the result is a copy, not an original. Incidentally, in much research funding, the outcome must almost be predicted in advance, along with careful metrics and timetables (ask anyone who has prepared a grant application). This squelches creativity and serendipity. It also puts a premium on positive results over negative (and perhaps greatly informative) ones. In fairness, just so we don’t make the mistake of assuming that everyone agrees that doctrine is a bad thing, there has been a large and successful effort to come up with a more standardized ANS Forth.
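The applicative/concatenative contrast can be sketched in a few lines of Python (purely illustrative; a real concatenative language like Forth does this natively):

```python
def double(x): return x * 2
def inc(x): return x + 1

# Applicative style: functions are applied to arguments, inside-out.
applicative = inc(double(5))                  # 11

# Concatenative style: a program is just words written one after another,
# each transforming a shared stack; composition is mere juxtaposition.
def run(program, stack):
    for word in program:
        stack.append(word(stack.pop()))
    return stack

concatenative = run([double, inc], [5])[-1]   # also 11
```

In the concatenative version, joining two programs is simply list concatenation – no nesting, no argument plumbing – which is exactly what makes Forth words compose so freely.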

Her backpack (the stack) is a readily available, last in, first out (LIFO) pile of items that she adds or removes as required. As a good hiker, Arkady is always aware of what’s in her backpack. She keeps often-used items (tools, map, compass) near the top. Longer term necessities (food, tent) are towards the bottom.

She walks along the trail (thread), step-over-step. Some steps are short and obvious, like avoiding big rocks and snakes. Some are more complex and subtle, such as crossing streams and staying downwind of bears. Many are actually composites of smaller steps. Some require more items from the backpack than others. At each step, she has the same backpack, although the exact contents may vary. She may sometimes just take the top few items, drop the pack, and head off down an interesting side trail. This is how a Forth program is built up. New ‘words’ are written to accomplish tasks. These words are then threaded together as higher level words invoke a series of them. Eventually, a single, highest level word is the starting point for the whole program, which is just a chain of subroutine calls.
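The threading described above can be sketched in Python (all names here are illustrative, not a real Forth system): each word operates on one shared stack, and a higher-level word is nothing more than a sequence of lower-level ones.

```python
stack = []

def lit(n):                 # push a literal onto the stack
    return lambda: stack.append(n)

def dup():                  # DUP: duplicate the top of the stack
    stack.append(stack[-1])

def mul():                  # *: multiply the top two items
    b, a = stack.pop(), stack.pop()
    stack.append(a * b)

def word(*steps):
    """Define a new word as a thread of existing words."""
    def run():
        for step in steps:
            step()
    return run

square = word(dup, mul)          # : SQUARE  DUP *  ;
fourth = word(square, square)    # : FOURTH  SQUARE SQUARE ;

program = word(lit(3), fourth)   # the single, highest-level word
program()                        # stack is now [81]
```

Running `program` walks the whole chain of subroutine calls, just as invoking the top word of a Forth program does.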

Using all of her neurology, from senses to brain to common sense, logic, and rationality, she ‘interprets’ her world. She learns new facts and skills as she walks, gradually ‘bootstrapping’ a deeper understanding of her environment and more powerful and efficient use of resources (food, energy, time). To learn if and how something works, she prods it and observes the results. If she finds an unfamiliar mineral, she could bring whatever tools she is carrying to bear. Or, she could invent an entirely new tool. She could craft a crude microscope from the magnifying lenses and rolled up map in her backpack. If she has time, a mass spectrometer may be suitable. In Forth, quick tests and trials are inexpensive and easy. There is no odious edit-compile loop to get in the way. For example, to find what OVER does, put a few numbers on the stack, type OVER, and examine the stack to see the result. Stumbling around with her nose in manuals and books would perhaps cause Arkady to miss a fossil or patch of berries. It is also a good way for her to become lunch for a mountain lion.
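As a sketch of that experiment, here is OVER modeled with a Python list as the stack (in Forth itself you would simply type the numbers and the word at the interactive prompt):

```python
stack = [1, 2]           # put a few numbers on the stack
stack.append(stack[-2])  # OVER copies the second item to the top
# examine the stack: it is now [1, 2, 1]
```

That one-line experiment is the whole cost of the trial – no edit-compile loop, just prod it and look.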

A word about bootstrapping. Sometimes, when a skill is applied to produce a result, new knowledge (or a new tool) is gained in the process. This new knowledge opens the door to skill-set improvement or refinement, which enables hitherto impractical or unseen possibilities. In turn, even more new knowledge is obtainable, and the process repeats. The result is flint knife to bow and arrow to alchemy to chemistry to electronics to spaceflight. Bootstrapping is the essence of Forth. Sadly, multitasking, object-orientation, dissociation and data-hiding, and that ultimate chestnut, ‘leveraging’, are all much more in vogue today. Multitasking in human thinking is greatly overrated. Deeper, syntonic thinking enables more creativity and innovation. If Forth were more widely used and understood, you might well be reading this on Mars right now. If it had been invented earlier, you might be reading this on Triton.

From time to time, Arkady considers new observations, knowledge, thoughts, and ‘steps’ to be well-established and important enough to jot down in her logbook. This log serves as a permanent record of all she has learned on her journey. It is, in fact, a purified, corrected, ‘frozen concentrate’ of her trail.

Arkady eventually arrives at the end of her journey. It may have been cut short for various reasons, or maybe her destination is different from (maybe even better than) the one she originally intended. This is a function of her curiosity as much as of weather or circumstance. In any case, she still has her logbook, for her or others (be they human or machine) to use as a future guide.

Some would say that it’s wrong to argue for Forth going forward. After all, it’s an old language, and was originally created for control of machinery in an era of sparse computational resources. As programs and systems grow ever-more complex, capable, and intelligent, such a language has outlived its usefulness. However, I must disagree. The future is not just about frameworks, big data, and augmented reality. It will be at least as important to build new, ad hoc, ‘micro’ systems rapidly and locally in a time of dizzying acceleration of technology. Creating, testing, and problem solving starting from first principles will always be valuable.

Understanding the world requires thinking in new languages. Forth is to computer science what math is to physics. Computational thinking, syntonicity, learning-by-doing*, and good old human common sense are not headed for the dust bin of history any time soon. Here are some further thoughts on learning.

 

Greek Pattern 1

 

 

* I am, however, perhaps not a very good constructionist, try as I might. I once tried (unsuccessfully) to acquire a minor Logo implementation. I believe in learning-by-doing something else! A wide, diverse search is often the most efficient (just ask the ants).

(1) Papert, S. (1980). Mindstorms: Children, Computers, and Powerful Ideas. New York: Basic Books
(2) Watt, S. (1998). Syntonicity and the psychology of programming – Psychology of Programming Interest Group 10th Annual Workshop.

The Asimovian Epoch

Centuries past are like the long night, a mysterious dream world. The modern, networked Earth is like daybreak, a world-wide awakening.

The fifty years 1945-1995 could be considered as a sort of ‘twilight zone’ between the two. The bulk of thinking and research done during this period is too new to be of any great historical or even deep nostalgic interest. Yet it is too old to be treated as current knowledge in the Internet age, where freshness and novelty are valued. In addition, we are still inside several of the epic transformations that began in this time. It’s difficult to appreciate and comprehend a revolution when you live within it. Our blind spot for this period is a great tragedy.

This period could be called the ‘Asimovian Epoch’. Here’s why.

Isaac Asimov (1920-1992) was perhaps the last great polymath. He was a biochemistry professor and author, publishing (writing or editing) over 500 books (covering 90% of the Dewey Decimal Classification system) and many thousands of shorter works. That’s an average rate of over one book per month over his long career – one of the greatest outbursts of creativity in history. His favorite topics included history, physics, comedy, and English literature. He was quite arguably the greatest science fiction writer of all time. He, Robert Heinlein, and Arthur C. Clarke are known as ‘The Big Three’ of the Golden Age of science fiction.

Asimov was a nexus of 20th century thought.

He was strongly influenced by Greek and Roman classics, Shakespeare, Gilbert & Sullivan, and also by contemporaries such as Kurt Vonnegut, Heinlein, Clarke, Harlan Ellison, Carl Sagan, and Marvin Minsky. He in turn influenced the likes of Paul Krugman (Nobel Prize in Economics), Elon Musk (SpaceX, Tesla), most science popularizers (happily now becoming too numerous to list), virtually all science fiction writers, and of course millions of readers.

Optimism came early to Asimov, perhaps because he spent so much time as a child in his father’s candy store. Many were initially drawn to science as teenagers by his optimistic and compelling vision of science and technology. Given the advances that 20th century science brought, such as space exploration, computation, automation, and microbiology, our debt to him on this account alone is inestimable. His masterpiece, Foundation, has deeply influenced many others, including me: Future Psychohistory

It has become fashionable for some to be suspicious of science and even liberal education. This was a trend that Asimov fought tirelessly against all his life. He warned against
the false notion that democracy means that ‘my ignorance is just as good as your knowledge’

He went even further, exploring the frontiers of individuality and self-education:
Self-education is, I firmly believe, the only kind of education there is.

Some even credit Asimov with predicting the Internet, especially its use for personalized education. He envisioned a world in which people could pursue their own interests, unfettered by the standard classroom, open to new ideas and serendipity.

Ironically, the very technology that he hoped would spur individuality and diversity has the opposite effect at times. One example is the digitization and archiving of published material. There was a time when books were expensive, highly personal possessions that stayed with the owner for a lifetime, and were even sometimes handed down across generations. People would store letters, pressed flowers, and other snippets of life between their pages, and write personal margin notes. Books were not just repositories of language, they were time capsules and valuable historical accounts. However, the best book to scan is a pristine, un-personalized volume. Once it is scanned, future researchers increasingly use this one version exclusively. Individuality and diversity are squelched.

A pervasive basic understanding of science was Asimov’s constant goal. He often took the opportunity to describe science not as a storehouse of absolute truth, but rather as a mechanism for exploring nature. Science is not a dogmatic belief system. Learning happens when one asks questions of the universe and then listens attentively and objectively with the ears of science. He wrote eloquently on scientific models, logic, and inference. An example is this essay.

Of course Asimov is best known for his robot stories, and his ‘Three Laws of Robotics’.  In 2010, the U.S. Congress established National Robotics Week in his honor. It’s possible that he saw artificial intelligence as a way for intellect to exceed the bounds of human bias, subjectivity, and mortality.

I never met Isaac Asimov, but I have spoken to several people who did have that pleasure. Each one related the profound impression he made on them. He died in 1992, but his books continued to be published posthumously. By the end of 1994, the World Wide Web Consortium had been founded at MIT. The dramatic rise of the Web marks the end of the Asimovian Epoch.

Academia can sometimes degenerate into a vicious paper chase, with Ego in the front seat, Science in the back seat, and Humanity locked in the trunk. Asimov famously said,

The most exciting phrase to hear in science, the one that heralds new discoveries,
is not ‘Eureka!’ but ‘That’s funny…’

Sadly, with hyper-specialization and the loss of interdisciplinary thinkers like Isaac Asimov, the catch-phrase of science has increasingly become: “Oooh! Shiny!”

 

 

 


Microsoft, Minecraft, and STEM

Microsoft recently bought Mojang AB, the Swedish company that makes Minecraft, for a reported $2.5 billion. It was not Microsoft’s first attempt to woo Mojang, and many thought they would never be able to close the deal.

In 2009, Markus Persson (‘Notch’), an unknown independent Swedish game developer, released the first version of his creation, Minecraft. It was not a normal ‘game’. It had no defined sequence, no milestones or levels of achievement, and most curiously, no end goal. What it did have was a charming, simplistic world in which a player could ‘mine’ and ‘craft’ for hours (or days) (or weeks) according to their own desires. In fact, the game could be played without any other mobile agents (‘mobs’) at all, leaving an entire world to explore and terraform with no limits. In 2010, maps became truly infinite, growing automatically as the player’s avatar (‘Steve’) approached an edge. The fact that no one else ever has or ever will live in, or even see, your privately created world gives one a compelling, almost haunting feeling.

It reminds me of the 1967 Star Trek episode “Arena”. Captain Kirk is locked in mortal combat with a frightening alien creature on a deserted and barren world. He must find or craft weapons from whatever materials he can find. He finally succeeds by assembling the raw materials for a crude cannon. We have all imagined ourselves cast back to a more primitive environment, where our own imagination and gumption are nearly all we have at our disposal. Interestingly, the goal of many advanced Minecraft players is to populate their world with coffee shops, restaurants, and villages.

Over time, Minecraft has expanded to include many characters, tools, and landscape features, but it has always maintained its minimalist look and feel. That’s the secret to its success – this is a world that is understandable to everyone, from children right through to seniors who grew up before the computer age. While the richness and physics of this world have evolved, the quaint, blocky look of it hasn’t.

The game has since sold many millions of copies, inspired LEGO sets, books, films, and has a huge and active fan base on YouTube. Conventions are held regularly, and Notch achieved rock star status years ago. Minecraft is the very definition of viral Internet success, massively disrupting traditional corporate and marketing models. Without shrink wrap or advertisements, it took over the gaming world.

minecraft block

 

So, many folks were left scratching their heads over this Microsoft acquisition. On the surface (pardon the pun), it seems like an unlikely fit. However, I don’t think that at all. I see it as the happy marriage announcement of two dear old friends.

Around the turn of the millennium, I was a corporate trainer, with MCT & MCSD certifications (the ‘M’ is for Microsoft). I often found myself sticking up for Microsoft amidst the disdain of my colleagues. I would use Bill Gates’ early years as an example of out-of-the-box thinking, like the Tandy Model 100, the first true notebook, and the last machine that he programmed personally. The best Christmas present I ever got was DOS 3.0, and QuickBASIC was one of the best languages I ever used (and I’ve used 50 or more). Maybe Apple and others were more stylish, but it was Microsoft that really put the excitement into personal computing.

Microsoft hopes to gain leverage in the mobile and youth markets. I cannot think of any better vehicle than Minecraft. Far from seeing it as the death knell for Minecraft, I see Microsoft’s acquisition as a renaissance and a real chance for positive, imaginative, and compelling games to get the much larger piece of the pie that they deserve.

“If you talk about STEM education, the best way to introduce anyone to STEM
or get their curiosity going on, it’s Minecraft”

– Microsoft CEO Satya Nadella

Also, there’s Microsoft’s augmented reality headset – HoloLens. There appears to be a plan coming together:
Minecraft + HoloLens + ElementaryProgrammingSkills = Syntonic Learning

If this is what Microsoft has in mind, it could be the most disruptive technology yet.
Well done Redmond, and best wishes to Notch in all his future endeavors.

 

Forth words

 

Morse Code and Minecraft

Several clever machinations have been implemented for the game Minecraft. These often include circuits and devices that mimic their real-world counterparts. They even include complete calculators and computer sub-systems (e.g. ECC memory and ALUs). Perhaps one of the more compelling inventions is the addition of Morse code operations. There are numerous implementations, all of which are interesting and a few of which are excellent.

I have a particular fondness for Morse code. See “Morse Code and Me”.

 

Telegraph Poles

In the 18th and 19th centuries, ‘semaphores’ and other signalling codes were often used in attempts to communicate over longer distances than the human voice could carry. With the rapid development of electricity and magnetism theory and practice in the early 1800s, the time was ripe for the telegraph. In 1837, Samuel Morse et al demonstrated a working telegraph setup in America (there were earlier versions in Europe). The principle was that codes tapped on a mechanical key would be converted to electrical signals, then sent along a wire to another key at the other end which would receive (repeat) the code. Since this transmission happened at almost the speed of light, communication at a distance became essentially instantaneous. Later refinements to and extensions of this principle led to the telephone, radio, television, and of course the Internet.

Why is Morse a good fit for Minecraft? Basically, it is in keeping with Minecraft’s quaint and minimalist nature. Crafting rudimentary equipment and stringing telegraph wires matches this game (it’s simple and fun). Minecraft already has support for elementary circuits and power via “Redstone”. Morse code is at home in a rustic world of dirt, stone, and wood. You can’t get much more binary and ‘blocky’ than a string of dots and dashes. In contrast, popup chat windows are harshly anachronistic. Telegraphy also fits in well with Minecraft’s ‘bootstrapping’ model; you don’t absolutely require it, but it speeds gameplay and opens up new possibilities once the technology is achieved.

Educational use of Minecraft benefits greatly from this bootstrapping model. Finding, crafting, and mastering tools and gadgets, and thus learning as you go is half the story. Exploring endlessly rich and surprising 3D worlds is the other half.

Also, learning a language is never a bad thing, and Morse code is one of the simplest ‘languages’ there is. Telegraphy is laden with historical significance, another opportunity for learning.

What are the possible uses for telegraphy within Minecraft? First, as I said, it’s a more primitive alternative to text chat windows in multi-player modes. It would also enable the transfer of information across distance, for example a crude sort of ‘fax machine’ for maps, books, and signs. Automated beacons could be built, and perhaps they might even show up on maps. Remote control of processes and apparatus would extend the player’s reach. Short messages could even be written right into the landscape (for example, dirt for dots and stone for dashes). How about time capsules or ‘notes to self’?

So, Morse code would fit very well into the Minecraft world and mindset, with many novel and fun uses. The educational benefits of this addition would be welcome and easily realized.

MC Morse

Concatenative Biology

There are several ways to categorize programming languages. One is to distinguish between applicative and concatenative evaluation. Most languages are applicative – functions are applied to data. In contrast, a concatenative language moves a single store of data or knowledge along a ‘pipeline’ with a sequence of functions each operating on the store in turn. The output of one function is the input of the next function, in a ‘threaded’ fashion. This sequence of functions is the program. Forth is an example of a concatenative language, with the stack serving as the data store that is passed along the thread of functions (‘words’). “Forth is a simple, natural computer language” – Charles Moore, inventor of Forth.

One of the great advantages of concatenative languages is the opportunity for extreme simplicity. Since each function really only needs to know about its own input, machinery, and output, there is a greatly reduced need for overall architecture. The big picture, or state, of the entire program is neither guiding nor informing each step. As long as a function can read, compute, and write, it can be an entity unto itself, with little compliance or doctrine to worry about. In fact, in Forth, beyond the stack and the threaded execution model, there’s precious little doctrine anyway!  Program structure is a simple sequence, with new words appended to the list (concatenated). The task of the programmer is just to get each word right, then move on to the next.
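This threaded execution model can be caricatured in a few lines of Python. This is an illustration, not real Forth: a ‘program’ is a plain list of words (functions), each reading and writing a shared stack, and defining a new word is just concatenation.

```python
# Each 'word' knows only its own input, machinery, and output.
# There is no overall program state to consult.

def DUP(stack):
    # ( a -- a a ): duplicate the top stack item
    stack.append(stack[-1])

def MUL(stack):
    # ( a b -- a*b ): multiply the top two stack items
    b, a = stack.pop(), stack.pop()
    stack.append(a * b)

def run(program, stack):
    # The inner interpreter: execute each word in sequence.
    for word in program:
        word(stack)
    return stack

# A new word is simply the concatenation of existing words.
SQUARE = [DUP, MUL]
print(run(SQUARE, [7]))  # [49]
```

The programmer’s task is exactly as described: get `DUP` right, get `MUL` right, then move on; `SQUARE` needs no architecture beyond their sequence.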

In nature, the concatenative approach is the only game in town. Small genetic changes occur due to several causes, random mutation being one of them. Each change is then put through the survivability sieve of natural selection, with large changes accumulating over large time scales (evolution). (Evolution is active across the entire spectrum of abstraction levels. Hierarchies emerge naturally, not through doctrine or design.)  Concatenation is the way by which these small changes are accumulated. Much of the epic and winding road of human evolution is recorded in our DNA, which is billions of letters long.

Concatenative DNA

This process can be seen happening right now in molecular biology. Consider the ribosome. This is the little machine inside a cell that reads a piece of RNA (a chain of nucleotides) and translates it into a protein (a chain of amino acids). There is no Master Control Program assigning names, delegating work, and applying functions. There is only a concatenative program, built up over the ages by evolution. So, basic life uses a fairly powerful and successful form of computation: DNA for storage, RNA for code, ribosome for computing, protein for application.
(and natural selection for testing) 🙂
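The ribosome’s pipeline can be caricatured in a few lines of Python. The codon table below is only a tiny fragment of the real genetic code, and the function is a sketch of the read-translate-append loop, not a biologically complete model:

```python
# A tiny fragment of the standard genetic code: RNA codons
# (nucleotide triplets) mapped to amino acids.
CODON_TABLE = {
    'AUG': 'Met',   # also the start codon
    'UUU': 'Phe',
    'GGC': 'Gly',
    'UAA': 'STOP',  # a stop codon
}

def translate(rna):
    # Read the RNA three letters at a time, appending one amino
    # acid per codon -- a purely concatenative process.
    protein = []
    for i in range(0, len(rna) - 2, 3):
        amino = CODON_TABLE[rna[i:i + 3]]
        if amino == 'STOP':
            break
        protein.append(amino)
    return protein

print(translate('AUGUUUGGCUAA'))  # ['Met', 'Phe', 'Gly']
```

Note what is absent: no names are assigned, no work is delegated, no global state is consulted. The output chain is built step by step, exactly as the text describes.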

We flatter ourselves when we talk of our ‘invention’ of levers, gears, factories, and computers. Nature had all that stuff and much more long before we ever came down from the trees. Math, engineering, and science are great not because of their products, but rather because they enable 3-pound hominid brains to explore nature and ponder the possibilities.

Ancient Metaphors

New nations often use comparisons and metaphors to build a more familiar and solid framework or ‘story’ using the past, especially to evoke visions of ‘glory days’. One example is the symbolism of early America harkening back to ancient Greece and Rome. The founding fathers often learned Greek, Latin, and much of the history of those cultures in their formative years. We can still see that influence today in the architecture of capital government buildings.

It should come as no surprise then, that the same thing happens with computers and the insatiable need for new descriptors and naming conventions. Much of this framework has sprouted up within the last few decades. That doesn’t leave enough time for growing familiarity organically. Past language and literature provide a rich crop ready for immediate harvesting.

Here are two examples:

Beowulf

This epic poem is one of the earliest known pieces of vernacular English literature. The story is about a brave hero who slays monsters in the north of Europe sometime around the 6th century. It is a tale of great strength, difficulties overcome, and distances traveled. The fame of this poem (especially the Old English version) has grown steadily over the last two centuries. It has even been retold in literature, popular media, and the cinema. This provides a ready-made metaphor that can be used by something new.

In 1994, Thomas Sterling and Donald Becker built a computer at a NASA contractor. It was the prototype for what would become the “Beowulf Cluster” architecture. This type of machine uses modest, commodity, even oddly matched PCs connected together to form a single parallel computer. It has acquired a rather stoic and renegade reputation in academia, where it has been used to gain entrance to the exclusive supercomputer club ‘on-the-cheap’. The name is said to have been inspired by the line in the epic poem: “thirty men’s heft of grasp in the gripe of his hand”.

 

Inferno

Another old epic poem is Dante Alighieri’s “Divine Comedy”. It is an allegory for the soul’s travels and a map of 14th century spirituality. It is also a multi-dimensional exploration of sensitive/rational human nature and psychology. Again, this work is one of the most studied pieces of literature in history. It is another ready-made treasure trove of meaning and metaphor.

The first of its three parts is “Inferno” (hell). It is graphic and horrifying. It is also strangely open-minded and thoughtful in that figures from classical literature are woven into a Christian narrative. These include Socrates, Plato, and indeed Virgil, Dante’s guide. Perhaps the most apt adjective for “Inferno” is: deep.

“Inferno” was recently used as a template for a “Software Inferno” describing “sinners against software” and their torments (1). By using this template, the author begins with an extensive toolbox of metaphors. Of course, neither “Software Inferno” nor this short blog post should be used as an authoritative representation of Dante. There are plenty of good academic sources for that.

Instead of Virgil, Bell uses the Countess of Lovelace as our guide. More commonly known as Ada Lovelace, she was a mathematician and the daughter of Lord Byron, and is widely considered the world’s first computer programmer for her work with Charles Babbage. She leads our hapless traveler down through the Inferno, towards the Earth’s core, along a roughly cone-shaped set of circular slices:

inferno1

LEVEL: DESCRIPTION

ANTE-INFERNO: developers who couldn’t decide whether to use their skills for good or evil; denied entry to both Heaven and Hell, doomed to eternal obscurity

LIMBO: developers born too early in history to know of proper software engineering

LUST: developers who chose power and fame over commitment and responsibility; they gave their time to their computers instead of their family; they sought the spotlight, now they are blinded by its glare for eternity

GLUTTONY: developers with human heads and pig bodies; IPO and stock bubbles had led them into wasteful, gross over-consumption

GREED: IT types who gathered every crumb of profit for themselves, neglecting to share anything with the people who got them there; condemned to a cacophony of falling pennies forever

ANGER: realm of vitriol and negativity, with inhabitants broiled by burning paper; the only water available had been poisoned, just like their own workplaces

HERESY: developers who knew of good software engineering, yet refused to use it; now chaotically and aimlessly going their own way for all time

VIOLENCE: developers who had inflicted malware, viruses, and phishing on the public; this pestilence now torments them in turn, and even sleep offers no escape

FRAUD: techno-hucksters, advocates, and evangelists who had foisted crappy software, standards, and models upon desperate users who longed for guidance; they now dwell in sewage for all time

TREACHERY: those who had enabled and encouraged this fraud; doomed to keep shoveling this detritus forever

 

Although intended for an audience of computer developers, the macabre humour is widely understood thanks to the template provided by Dante.

Here are a few other names for computer technology borrowed from the past:
Apollo, Athena, Delphi, Hercules, Hermes, Homer, Hydra, Jason, Janus, Merlin, Midas, Oracle, Odyssey, Phoenix, Phoebe, Pegasus, Sisyphus, Tantalus, Troy, Ulysses, Valhalla, Valkyrie, Zeus
… and of course a whole host of planets, moons, stars, and constellations.

 

Greek Pattern 1

 

(1) Bell, A.E. (2014). The Software Inferno. Communications of the ACM, 57(1), 48-53.
