The Atomic Human


Playing in People's Backyards

The Princeton statistician John Tukey is said to have told a colleague, “The best thing about being a statistician is that you get to play in everyone’s backyard”. The same is true for machine learning, data science and AI.

Most of this play is welcomed, but there’s a challenge when what starts out as play becomes a problem. AI play can get a bit raucous. It often ignores those who live in the house and the neighbourhood. It ignores experts in the domain. Before you know it the garden play-shelter becomes a concrete carbuncle … and the AI people are having a big party down there with a load of noise blaring.

Respectful play involves understanding the homeowners and what they’d find useful. It involves first asking, “How can we help?”

With this in mind, we approached my colleague Dr Jonathan Tenney, an Assyriologist in the University’s Department of Archaeology. Instead of imposing neural networks on him, we followed approaches from the social sciences, such as grounded theory, to understand Jonathan’s work.

The collaboration can then enrich both fields, AI and Assyriology, offering insights into ancient and modern ways of processing information. Part of this work features in The Atomic Human, where we explore the opportunities presented by human-analogue machines. Today we conclude the traditional 24 days of the advent calendar by highlighting this approach.

Dan Andrews’ illustration for Chapter 11, Human Analogue Machines. See scribeysense.com.

Part of the chapter Human Analogue Machines focuses on the extraordinary opportunities we gain when we can interface directly with computers through natural language. Jonathan’s work looks back to the development of writing. We watched and recorded him as he translated a tablet from the ancient city of Ur. It was a legal decision made under the Code of Hammurabi, one of the oldest legal codes, dating from around 1700 BC.

In the judgment, witness statements contradicted one another, and in such cases Hammurabi’s code prescribes trial by ordeal: the accused is thrown into the river so the gods can decide.

Although this was over 3,000 years ago, we see the same tendency in modern AI systems: when things get complicated, let the AI decide.

This is playing in the garden of others in the worst possible manner: undermining the owners’ confidence to such a degree that they no longer trust their own judgment and prefer the modern equivalent of trial by ordeal to exercising their professional judgment.
