Author Archives: scidata

Ancient Metaphors

New nations often use comparisons and metaphors to build a more familiar and solid framework or ‘story’ from the past, especially to evoke visions of ‘glory days’. One example is the symbolism of early America harking back to ancient Greece and Rome. The Founding Fathers often learned Greek, Latin, and much of the history of those cultures in their formative years. We can still see that influence today in the architecture of capitol and other government buildings.

It should come as no surprise, then, that the same thing happens with computers and their insatiable need for new descriptors and naming conventions. Much of this framework has sprouted up within the last few decades, which doesn’t leave enough time for familiarity to grow organically. Past language and literature provide a rich crop ready for immediate harvesting.

Here are two examples:

Beowulf

This epic poem is one of the earliest known pieces of vernacular English literature. The story is about a brave hero who slays monsters in the north of Europe sometime around the 6th century. It is a tale of great strength, difficulties overcome, and distances travelled. The fame of the poem (especially the Old English version) has grown steadily over the last two centuries, and it has been retold in literature, popular media, and the cinema. This provides a ready-made metaphor that something new can borrow.

In 1994, Thomas Sterling and Donald Becker, working at a NASA contractor, built the prototype for what would become the “Beowulf Cluster” architecture. This type of machine networks modest, commodity, even oddly matched PCs together to form a single parallel computer. It has acquired a rather stoic and renegade reputation in academia, where it has been used to gain entrance to the exclusive supercomputer club ‘on the cheap’. The name is said to have been inspired by a line in the epic poem: “thirty men’s heft of grasp in the gripe of his hand”.
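
The scatter/gather pattern behind such clusters can be sketched without any actual networking. Here is a minimal single-machine stand-in in Python, using a thread pool in place of cluster nodes; the function names are invented for illustration, and a real Beowulf system would distribute the chunks over a network (typically with MPI) rather than within one process:

```python
from concurrent.futures import ThreadPoolExecutor

# A Beowulf cluster scatters one job across many commodity nodes and
# gathers the partial results. As a single-machine stand-in for that idea,
# this sketch splits a numerical task across a small worker pool: the same
# decompose/compute/combine pattern, minus the actual network.

def partial_sum(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def scattered_sum_of_squares(n, workers=4):
    step = n // workers
    # Split [0, n) into one chunk per worker; the last chunk absorbs the remainder.
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

print(scattered_sum_of_squares(1000))  # equals sum(i*i for i in range(1000))
```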


Inferno

Another old epic poem is Dante Alighieri’s “Divine Comedy”. It is an allegory for the soul’s travels and a map of 14th century spirituality. It is as well a multi-dimensional exploration of sensitive/rational human nature and psychology. Again, this work is one of the most studied pieces of literature in history. It is another ready-made treasure trove of meaning and metaphor.

The first of its three parts is “Inferno” (hell). It is graphic and horrifying. It is also strangely open-minded and thoughtful in that figures from classical literature are woven into a Christian narrative. These include Socrates, Plato, and indeed Virgil, Dante’s guide. Perhaps the most apt adjective for “Inferno” is: deep.

“Inferno” was recently used as a template for a “Software Inferno” describing “sinners against software” and their torments (1). By using this template, the author begins with an extensive toolbox of metaphors. Of course, neither “Software Inferno” nor this short blog post should be used as an authoritative representation of Dante. There are plenty of good academic sources for that.

Instead of Virgil, Bell uses the Countess of Lovelace as our guide. More commonly known as Ada Lovelace, she was a mathematician, the daughter of Lord Byron, and is widely considered the world’s first computer programmer for her work with Charles Babbage. She leads our hapless traveler down through the Inferno, towards the Earth’s core, along a roughly cone-shaped set of circular slices:


LEVEL: DESCRIPTION

ANTE-INFERNO: developers who couldn’t decide whether to use their skills for good or evil; denied entry to both Heaven and Hell, they are doomed to eternal obscurity

LIMBO: developers born too early in history to know of proper software engineering

LUST: developers who chose power and fame over commitment and responsibility; they gave their time to their computers instead of their families, and having sought the spotlight, they are now blinded by its glare for eternity

GLUTTONY: developers with human heads and pig bodies; IPOs and stock bubbles led them into wasteful, gross over-consumption

GREED: IT types who gathered every crumb of profit for themselves, neglecting to share anything with the people who got them there; condemned to a cacophony of falling pennies forever

ANGER: a realm of vitriol and negativity, its inhabitants broiled by burning paper; the only water available has been poisoned, just like their own workplaces

HERESY: developers who knew of good software engineering yet refused to use it; now chaotically and aimlessly going their own way for all time

VIOLENCE: developers who inflicted malware, viruses, and phishing on the public; this pestilence now torments them in turn, and even sleep offers no escape

FRAUD: techno-hucksters, advocates, and evangelists who foisted crappy software, standards, and models upon desperate users who longed for guidance; they now dwell in sewage for all time

TREACHERY: those who enabled and encouraged this fraud; doomed to keep shoveling this detritus forever


Although the piece is intended for an audience of computer developers, its macabre humour is widely understood thanks to the template provided by Dante.

Here are a few other names for computer technology borrowed from the past:
Apollo, Athena, Delphi, Hercules, Hermes, Homer, Hydra, Jason, Janus, Merlin, Midas, Oracle, Odyssey, Phoenix, Phoebe, Pegasus, Sisyphus, Tantalus, Troy, Ulysses, Valhalla, Valkyrie, Zeus
… and of course a whole host of planets, moons, stars, and constellations.


(1) Bell, A.E. (2014). The Software Inferno. Communications of the ACM, 57(1), 48-53.


Morse Code Therapy

Some years ago I had speech therapy for dysarthria. The therapy consisted mainly of breathing and vocal exercises, aimed at strengthening and enhancing the many components of physiology involved in speech. It was very successful and the professionals who helped me have my eternal gratitude. Along the way, I noticed that I was having some small trouble with initiating speech. Although a thought would form quickly and lucidly in my mind, actually triggering the vocal machinery to convert it into spoken words was sometimes problematic. It was like using a computer with a slow, damaged printer as the output device. Having been a computer designer and programmer for many years, it seemed to me that what was needed was a second output device, perhaps even a full two-way communications channel.

It’s certainly nothing new for people to compensate for one shortcoming by drawing upon one or more other senses or faculties. For example, there is Braille for the visually impaired and sign language for the hearing impaired. Even for someone without any such disability, learning a new language has profound benefits.

“To have another language is to possess a second soul”
– Charlemagne

If I could open such a second communications channel, I could link my mind to the outside world while bypassing speech altogether. This would allow me to work on both problems, the dysarthria and the imperfect speech initiation, as two processes in parallel (my inner geek again). However, another spoken language would not help here, and sign language would require another person for learning and practice. Additionally, I wanted to stay in the audio realm, because working on neural audiology would reinforce my other therapy. I needed something that was audio-based, flexible enough for real communication, and simple enough to learn quickly, perhaps even with automated help.

Back in high school, I had belonged to the amateur radio club, mainly due to my interest in electronics. We built circuits, from oscillators to amplifiers to modest radio transmitters and receivers. The really keen members of the club (which did not include me) went so far as to obtain their amateur radio licenses. One of the requirements for this was proficiency in Morse code, that dih-dah-dih beeping from a bygone era sometimes featured in movie plots, from the Old West to the air and the sea. Aha – I had found the answer.


I assembled a few resources – a Morse code chart, an inexpensive key, an audio oscillator and speaker, and even a few text snippets encoded as Morse audio files from the Internet. Surprisingly, a bit of my old familiarity with Morse had stayed with me across the years. I keyed a few short phrases and found to my great joy that there was no delay or difficulty in initiating them. It also provided good fine motor control exercise for my right hand – another therapy which I needed but had not taken up as yet. I quickly improved at both writing and reading Morse, and soon began tapping out thoughts even if a pencil or fork was all I had available.
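
The translation step behind those charts and audio files is simple enough to sketch in a few lines of code. Here is a minimal Python version; the function names are my own, the table covers letters and digits only, and real charts also include punctuation and prosigns:

```python
# A minimal text-to-Morse translation table and encoder/decoder,
# in the spirit of the paper chart described above.

MORSE = {
    'A': '.-',   'B': '-...', 'C': '-.-.', 'D': '-..',  'E': '.',
    'F': '..-.', 'G': '--.',  'H': '....', 'I': '..',   'J': '.---',
    'K': '-.-',  'L': '.-..', 'M': '--',   'N': '-.',   'O': '---',
    'P': '.--.', 'Q': '--.-', 'R': '.-.',  'S': '...',  'T': '-',
    'U': '..-',  'V': '...-', 'W': '.--',  'X': '-..-', 'Y': '-.--',
    'Z': '--..',
    '0': '-----', '1': '.----', '2': '..---', '3': '...--', '4': '....-',
    '5': '.....', '6': '-....', '7': '--...', '8': '---..', '9': '----.',
}

def encode(text):
    """Encode text as Morse: letters separated by spaces, words by ' / '."""
    words = text.upper().split()
    return ' / '.join(' '.join(MORSE[c] for c in w if c in MORSE)
                      for w in words)

def decode(morse):
    """Decode the format produced by encode()."""
    reverse = {v: k for k, v in MORSE.items()}
    return ' '.join(''.join(reverse[sym] for sym in w.split())
                    for w in morse.split(' / '))

print(encode('SOS'))  # ... --- ...
```

Feeding the dots and dashes to an audio oscillator (with the standard timing ratios) is then a separate, purely mechanical step.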

After a few weeks, I noticed a marked improvement in my speech initiation too, although I cannot say for sure whether this was the result of Morse practice or normal therapy and recovery. I like to think it was a combination of both. It could also be that any kind of learning, even just passive reading, is beneficial during this type of recovery.

The biggest surprise is the one that comes with any new comprehension of previously unknown words or language. Now, whenever I hear some innocuous Morse in the background of a film, I read it without even trying; usually it’s meaningful, but occasionally it’s just gibberish!


“I see no reason why intelligence may not be transmitted instantaneously by electricity”
– Samuel Morse

We sometimes give ourselves too much credit for all that’s new and shiny. Staying with telegraphy, here’s a prediction of cell phones (regarding Marconi’s “Syntonic Wireless Telegraphy”) from 1901!


I found a few other tools helpful in the course of my rehabilitation. The concept of ‘syntonicity’ was important, that is, the idea that learning is a subjective, experiential process. I revived my old interest in the Forth programming language, as well as Minecraft and similar exploration & strategy games.

Scientific Models

I get nervous when people enthusiastically advocate models. This happens mostly, though not exclusively, in sociopolitical debates such as climate change. A scientific model is not a belief system. All those who claim that science is on their side might better consider whether they are on science’s side. Most scientific models, especially statistical ones, are ways to measure and predict the behaviour of a bounded system (a subset of nature). Such a model does not claim to be the final, fundamental, and ultimate description of nature itself. It’s important to not confuse the map with the territory, since rabbit holes rarely appear on maps.

In fact, even a major shift in the underlying paradigm may correspond to only a small change in a model’s accuracy. A very good essay on this, “The Relativity of Wrong”, was penned by Isaac Asimov. In short, he argued that models may be updated with more subtlety and scope, but they are not summarily discarded as simply and absolutely wrong. Long after we replaced force with space-time geometry in our gravitational models, we still design elevators with Isaac Newton looking over our shoulder. We are all very much still ‘Flat Earthers’ when we plan a walk (unless that walk will take us across a time zone boundary or we sneak a peek at our GPS location along the way).

In this view, science is models and observations only. A theory is only a specific model, not a deep insight into the true nature of reality. Nature simply is. That’s it. Our beliefs about its fundamental structure are illusions. The meaning of observations is model dependent, but the observations themselves are not. Antares is in the same position in the sky, whether it is a red supergiant or “the heart of the scorpion”.

It should be noted that not everyone agrees with this view. For example, Thomas Kuhn argued that newer models are not just ‘better approximations’ (straw man alert) and that it is a serious mistake to teach that they are. He argued that the consensus of the scientific community is also a component of scientific truth. I don’t happen to share that view, but then, as I consider consensus to be irrelevant in science, my opinion means naught anyway. Of course paradigm shifts and revolutions do happen, but they are artifacts of history and psychology, not objective science. It’s a fallacy to define science as being only the set of all scientists, past and present. Again, there’s that map and territory confusion.

Raw predictive power is not the only metric for judging a model. Probably the most universal heuristic is agreement with observation. Many a compelling model has been ruined by the facts. Mathematical beauty, symmetry, and ‘feel’ are not important. Elegance and simplicity are helpful, perhaps even necessary, but not sufficient conditions for a good model. Enthusiasm for a model is a strong cause for suspicion; humility before nature is a much better heuristic. Darwin and Planck had to be dragged along reluctantly with their own theories. Models can incorporate ‘tricks’ that are obviously not part of reality. For example, adding an equal ‘negative mass’ allows one to easily calculate the centre of mass of a plate with a hole in it. Perhaps the best thing a model can do is not just answer questions, but ask even better ones. Heliocentricity and the germ theory of disease launched tremendously fruitful new science.
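
The negative-mass trick can be checked with a few lines of arithmetic. A sketch with illustrative numbers of my own choosing (a 4×4 plate with a unit-radius hole; the function name is invented for the example):

```python
import math

# Centre of mass of a uniform plate with a hole, via the 'negative mass'
# trick: treat the hole as a second body with negative area, then take the
# area-weighted average of the two centroids.

def centroid_with_hole(plate_area, plate_c, hole_area, hole_c):
    a1, a2 = plate_area, -hole_area           # the hole counts as negative area
    x = (a1 * plate_c[0] + a2 * hole_c[0]) / (a1 + a2)
    y = (a1 * plate_c[1] + a2 * hole_c[1]) / (a1 + a2)
    return (x, y)

# Illustrative numbers: a 4x4 square plate centred at the origin, with a
# unit-radius circular hole centred at (1, 0).
plate_area = 4.0 * 4.0            # 16
hole_area = math.pi * 1.0 ** 2    # pi
x, y = centroid_with_hole(plate_area, (0.0, 0.0), hole_area, (1.0, 0.0))
print(round(x, 4), y)  # the centroid shifts left, away from the hole
```

No negative mass exists on the plate; the trick is purely a property of the model, which is exactly the point.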

Nature is a realm of computation and evolution on an unimaginable scale (quite literally). Local agency leads to emergent complexity. Mathematics is one of the tools that can allow a vastly simplified model of nature to be stored in a three-pound hominid brain. However, daisies and snails know nothing about mathematics. Anthropomorphization of nature is not science. Consensus and comfort are not science. Clinging to teddy bears is really cute in youngsters, not so much in adults. The good news is that nature is far, far, far more wonderful than anything any individual or advocacy group can come up with (including well-crafted teddy bears). You just have to let objectivity take the wheel.


Worth Defending

In ancient times, love of wisdom had a name – “philosophy”. That context brings to mind sun-drenched hilltops in Greece and great libraries like Pergamum and Alexandria. Scholars were few, ‘pure’, supported by patrons or states, and breathed the rarefied air of privilege.

In modern times, we talk of “science”. There are many, many scientists around the world. They work on increasingly specialized research, and increasingly on applied science. They are largely supported by a vast international academic system that is quickly merging with complex economic systems. In fact, direct support from the commercial and corporate world is becoming the norm. Many scientists now breathe the chemical soup of industrialization.

The general public has the widest technology usage rate in history. There are billions of times as many transistors as people, and the number of cell phone accounts is roughly equal to the world’s population.

Technology is becoming almost synonymous with science. “Techno-Science” has reached a level of acceptance and respect that was previously reserved only for religion. Even many of those who remain scientifically illiterate hold science in mystical reverence.

This is the greatest tragedy of all.

The death of pure science directly implies the death of human civilization. We will either be enslaved in a dystopian future or simply be replaced by artificial intelligence. It ends not with a bang, but a whimper.

In 1969, Robert R. Wilson, an advocate of a new particle accelerator research facility that later became Fermilab, was testifying before a committee in Washington, DC. His exchange with one Senator has become legendary and can be found in many places on the Internet, with Wilson himself memorialized at Fermilab.

Senator: “Is there anything connected with the hopes of this accelerator that in any way involves the security of the country?”
Wilson: “No sir, I don’t believe so.”
Senator: “Nothing at all?”
Wilson: “Nothing at all.”
Senator: “It has no value in that respect?”
Wilson: “It has only to do with the respect with which we regard one another, the dignity of men, our love of culture. It has to do with those things. It has to do with, are we good painters, good sculptors, great poets? I mean all the things that we really venerate and honor in our country and are patriotic about. It has nothing to do directly with defending our country except to help make it worth defending.”


There’s little I can add to that. Thomas Huxley (“Darwin’s Bulldog”) once said,

“the scientific spirit is of more value than its products”

Forth – not quite dead

In my previous post, Thinking in Forth, I said that Forth is a dead language. I did this to make two points. The first is that I don’t have to revive or advocate a dead language, hurray, because that’s boring. The second point is that even a dead language still has value, like Latin does. However, just because I consider it to be a dead language doesn’t mean that everyone else does too.

Forth words

Forth is an excellent choice for education for several reasons:

  • it’s small, maybe 25 required core words
  • most of these words are one, two, or three letters long, so familiarity with English is not a prerequisite for learning Forth
  • it’s simple with no built-in abstractions – source code maps directly to assembly code (a Forth word is just a list of calls to previously defined words)
  • it has little syntax to learn
  • there is only one data type in basic Forth – integers (called ‘cells’)
  • useful programs can be written early in the learning process
    (a Forth program is just a list of one or more Forth words)
  • it’s very ‘bootstrap-able’ – one learns by concatenation, not starkly delineated ‘levels’
  • paradigms such as object-oriented and functional programming are consistent with Forth (and in fact many of them have already been well implemented and documented)
  • concatenative programming is much more pure and natural
    See Threaded Nature
  • syntonic programming is easier to learn and use
    See Forth: A Syntonic Language
  • Forth is useful for building models and simulations
  • open and non-proprietary versions of Forth are freely available for a wide range of platforms
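
The point above, that a Forth word is just a list of calls to previously defined words, can be illustrated with a toy model. This is a Python sketch of the idea, not a real Forth system; the word set and behaviour are deliberately simplified:

```python
# A toy model of the Forth execution idea: a data stack, a dictionary of
# words, and new words defined as lists of old words. An illustrative
# sketch only, not a real Forth.

stack = []

def _add():  b, a = stack.pop(), stack.pop(); stack.append(a + b)
def _mul():  b, a = stack.pop(), stack.pop(); stack.append(a * b)
def _dup():  stack.append(stack[-1])

# The dictionary: word name -> executable behaviour.
words = {'+': _add, '*': _mul, 'dup': _dup}

def define(name, body):
    """A colon definition: the new word just runs a list of old words."""
    words[name] = lambda: run(body)

def run(source):
    for token in (source.split() if isinstance(source, str) else source):
        if token in words:
            words[token]()            # execute a defined word
        else:
            stack.append(int(token))  # anything else is a number literal

define('square', 'dup *')   # : square  dup * ;
run('3 square 4 square +')  # 3*3 + 4*4
print(stack.pop())          # 25
```

Note that `define` adds no new machinery at all: the interpreter loop and the stack do everything, which is why learning proceeds by concatenation rather than by ‘levels’.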

There are several efforts at delivering basic, classic Forth to beginning programmers. One that I particularly like is 4E4th, which runs on a tiny, inexpensive prototyping card. The Raspberry Pi can easily run a bigger, more capable Forth.

Several commercial enterprises continue to promote Forth. Some can be found towards the end of the excellent and extensive Forth page on Wikipedia.

The creator of Forth, Chuck Moore, is a tireless, brilliant, and dynamic force. He has never stopped developing cutting-edge technology and innovation based on Forth. For a look at some of his latest work, here’s a talk he gave in September 2013 on the GA144 chip. It has 144 cores, with energy measured in picojoules per instruction and power in nanowatts per core, and it is of course programmed in Forth. That’s hardly dead technology.

So, while I do not aspire to revive Forth or advocate it generally, I may have been in error in describing it as dead. I come to praise Forth, not to bury it.

Thinking in Forth

I consider Forth to be a dead language, at least in the same sense that Latin is a dead language – i.e. nobody uses it in daily life anymore. I now only use it and write about it in the context of computational thinking. Forth may be a dead language, but that doesn’t preclude it from being vibrantly important. If you don’t like irony, you’re reading the wrong blog.

Forth was developed for over a decade by Charles (Chuck) Moore before being written down formally in 1970. Moore was a radio telescope guy at the start. He needed an expressive, helpful computer language for his own work, much as Newton needed calculus. The story is best told in his own words: http://www.colorforth.com/HOPL.html

What makes Forth important is just that – it was created to serve as a computational assistant. Using it is very conversational and immersive. One doesn’t think for an hour, consult authority and standards, edit for an hour, then compile, then test, then go for lunch (or go home), etc. There is a non-stop dialectic, a friendly and thoughtful argument, with you handling the imagination and it handling the bits and bytes.
And a program emerges.

Forth words

I started using Forth in the late 1970s. Using it, I learned binary and hexadecimal numbers and arithmetic, register manipulation (the main task of a CPU), memory manipulation (the main task of a computer), and many concepts and paradigms of programming. These included imperative (states, flow control), functional (stateless, mathematical), object-oriented (encapsulation, re-use), factoring, concatenative, declarative (mostly through a Prolog written in Forth), machine control, etc. I read several books and many articles, but the real learning happened by doing. What’s key here is that I’ve never been a great programmer (very good, but never great). Anyone can gain tremendous benefit from spending even a few hours learning Forth. It enables one to invent an ad hoc language to solve an ad hoc problem. It’s not magic and otherworldly like Excalibur. It’s more like an early hominid’s stone tool – it teaches one to think, especially to think about the next tool.

My Forth days are long in the past. Today, I write mainly in Haskell, Assembler, Prolog, and English. For database work, I use SQL, key-value stores, and JSON. Whenever I think about any problem, I think computationally. I’ve spent almost four decades thinking in Forth.

I intend to write many posts and maybe even a few articles on Forth; it will be a mainstay of this blog.