The Limits Of Robotics and AI

I knew that I had asked the wrong question. The toolbox thumped onto the floor and the top was slammed open. I would not ask again.

It was Friday morning and there had been a slow drip of water through the light fitting in the sitting room. I guessed it was from the upstairs shower, but I am no expert so I called our plumber, Paul, who arrived promptly. He has repaired many leaks over the years, as well as installing new showers and bathrooms in our house. However, just that morning I had been reading about robotics and Artificial Intelligence (AI) and how they would take over all of our jobs. This had prompted me to ask: “Do you think plumbers will ever be replaced by a robot?”

Coverage of robotics and AI is reaching fever pitch in the media. Headlines proclaim that they are “a new form of life”, or that they can think “… and could make decisions against our will”, and then there is the killer robot ARMS RACE - if climate change doesn’t wipe us out then robots will!

Ideas about robots have been building up in the public consciousness since the early 1940s, when Isaac Asimov formulated the Three Laws of Robotics, a set of rules to ensure friendly behaviour, and in the process coined the word “robotics”. This awareness has increased with films such as 2001: A Space Odyssey, with HAL and its ominous catchphrase “I’m afraid I can’t do that, Dave”; Blade Runner, where it seems to be impossible to distinguish between people and “replicants”; and the Terminator films, where Skynet (something similar to the Internet) becomes self-aware and starts to wipe out humans, who are seen as a “threat”. The public are seeing robots everywhere, which has prompted the government to respond, as only they can, with a strategy! The Trades Union Congress has waded into the debate with a discussion paper that includes the phrase “there is no need to panic” - which is enough to make anyone feel nervous!

And yet amongst the swirl of headlines about this revolutionary technology I still find myself waiting in a phone queue for over thirty minutes to sort out problems with an energy supplier, or waiting nearly two weeks to see my doctor! And, of course, I still need a plumber to repair a leak in our shower.

Upstairs I heard Paul starting to lift the floorboards and search for the leak. From previous repair jobs I know that he will bring a multitude of skills and experience to the task: navigating around our rooms to find the pipes and locate the leak; working out why it leaked - was it a loose connection, or had the seal rotted away?; repairing and testing the new joint; and then putting everything back so that it looks as though nothing had gone wrong. My imagination struggles to see all those skills and experience in one robot!

Robots are machines that have been programmed to carry out a series of actions by themselves, and artificial intelligence refers to computer programs that mimic human intelligence on tasks such as learning, perception, problem-solving, language understanding and logical reasoning. The first industrial robot, a programmable arm called Unimate, emerged in the early 1960s. It paved the way for industrial robots to take on repetitive, difficult or dangerous tasks, mainly in manufacturing, and most predictions show that they will continue to grow in numbers, with sales expected to triple by 2025. It is a similar story with AI. In the mid-1960s ELIZA was a computer program that could hold a discussion with people by replying to typed-in questions. Today AI is beating experts at games such as chess, Go and the TV game show Jeopardy!. AI will creep into all walks of life, and the market for the technology is expected to increase from $500 million in 2015 to $14 billion by 2025.
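To give a flavour of how a program like ELIZA could appear to hold a conversation, here is a minimal sketch in Python. It is a toy illustration of the pattern-matching idea, not ELIZA’s actual script (which had a much richer set of rules): match a phrase in what the user types and reflect part of it back inside a canned template.

```python
import re

# A few toy rules in the spirit of ELIZA: match a pattern in the
# user's input and echo part of it back inside a canned template.
RULES = [
    (re.compile(r"i need (.*)", re.I), "Why do you need {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def reply(text: str) -> str:
    """Return a canned response built by reflecting the user's own words."""
    for pattern, template in RULES:
        match = pattern.search(text)
        if match:
            return template.format(match.group(1).rstrip(".!?"))
    return "Please go on."  # default when no rule matches

print(reply("I am worried about a leak in the shower"))
# -> How long have you been worried about a leak in the shower?
```

The real ELIZA also swapped pronouns (“my” becomes “your”) and ranked its rules, but the core trick - reflecting the user’s words back at them - is the same, and it is striking how little intelligence is needed to give the impression of a conversation.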

Amongst the predictions about the declining future of humans there are some applications of robotics and AI that can make a positive difference. For example, in healthcare robots are helping with prostate cancer surgery and reducing hospital-acquired infections such as MRSA. There are exciting new developments in detecting disease, for example using AI to detect the onset of Alzheimer’s from how people speak.

Similarly in education, although there are some over-excited headlines such as “Intelligent machines will replace teachers within 10 years”, there are more practical applications of AI that are helping teachers. For example, systems from Knewton and ACER use AI to tailor the learning material to the ability of the individual student, as well as monitor their progress. Using these types of systems could free teachers from increasing amounts of administration.
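I don’t know the details of Knewton’s or ACER’s algorithms, but the underlying idea of adaptive learning can be sketched very simply: keep a running estimate of the student’s ability, always serve the question closest to it, and nudge the estimate after each answer. A minimal illustration in Python - the questions, ratings and step size below are all invented:

```python
# A toy adaptive tutor: not Knewton's or ACER's method, just the general
# idea of matching question difficulty to an estimate of student ability.
questions = {
    "add fractions": 1.0,
    "solve a linear equation": 2.0,
    "factorise a quadratic": 3.0,
    "differentiate a polynomial": 4.0,
}  # invented difficulty ratings

ability = 2.0  # initial estimate of the student's level
STEP = 0.5     # how far one answer moves the estimate

def next_question():
    """Pick the remaining question whose difficulty is closest to ability."""
    return min(questions, key=lambda q: abs(questions[q] - ability))

def record_answer(question, correct):
    """Nudge the ability estimate up or down and retire the question."""
    global ability
    ability += STEP if correct else -STEP
    del questions[question]

while questions:
    q = next_question()
    print(f"ability={ability:.1f}  asking: {q}")
    record_answer(q, correct=True)  # a real system would wait for the student
```

Even this crude loop shows the appeal for teachers: the selection and the progress record are produced as a by-product of the teaching itself, rather than as extra administration.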

The hot topic at the moment is driverless cars, which according to the headlines will be on our “roads within the next few years”. But the jury is still out on how quickly they will be taken up by drivers, especially when leading lights such as Jeremy Clarkson have had a couple of near-death experiences when using the latest autonomous cars. However, in agriculture there is more scope for automation (and less opportunity to wipe out people). Driverless tractors are already here, and there must be more scope for automated picking machines to cover the shortage of migrant workers since Brexit.

As Bill Gates once said: “We tend to overestimate the pace of change in the short term, but underestimate it in the long term.” Robotics and AI will have an increasing role in our lives, but to what level and extent is still not clear. My hope is that they will continue to take over the repetitive and dangerous work and improve our capabilities in learning, problem solving and making decisions.

Paul stuck his head around the door. “All finished!” Then, with a smile, “I bet a robot couldn’t have done that!” We both laughed. As I waved Paul off I made a few mental notes: update the contact details of the electricians, roofers, bathroom tilers, painters, decorators and anybody else we rely on to maintain our house - and not to ask them silly questions!

A formula for a successful business is a very rare thing - it is never that simple! But when one emerges it is worth a look.

In the autumn of 1999 the executives of Southwestern Energy Co, a gas exploration and production company based in Arkansas, USA, were meeting to discuss their business strategy. The company was struggling. It had lost a $109 million lawsuit, and the total value of the company’s assets was greater than its capitalisation. They were wondering whether it was time to pack it in. “We were not healthy. No one wanted our stock,” the CEO, Mr Korell, recalls. He remembers uncapping a black marker pen and writing:

\[\frac{R \times R}{A} = V+,\]

in other words, the Right people doing the Right thing by wisely investing the cash flow from Southwestern’s Assets creates Value. Using this formula the company steered its way to success.

What was written on the board shows the relationship between people and assets. The “right people” are those with the correct skills, experience and attitude for the business. Doing the “right thing” covers not only the activities in using the assets but also making the correct decisions about future assets. The numerator of the formula therefore shows that, to improve the performance of the business, it is people who have the greatest impact on creating value.

Although the formula was derived by a business operating in the Oil & Gas industry, where the assets are very large, it can be used for any business no matter how small the assets - for example, a coffee machine is a small asset in a coffee shop but it is critical to adding value: without it there is no business! For businesses where the assets are knowledge, e.g. consultancies, marketing companies, training providers etc., the formula still applies. The knowledge assets can cover intellectual property, trademarks, licensing agreements, methodologies and so on. And just like physical assets that wear out or are no longer generating any value, knowledge assets need to be maintained or replaced with new ones.

So what makes this a good formula for a business? First, it is easy to understand - it doesn’t require graph-filled reports to explain its elements. Secondly, it generates a series of questions: do we have the right people doing the right things? Are the people properly trained to take full advantage of the assets? Which assets should we invest in or sell to improve value generation? And so on. The formula implies a disciplined approach - if the questions are not focussed on improving the elements of the formula then time shouldn’t be wasted trying to answer them. Finally, and most importantly, it can be quickly communicated and easily remembered; in the case of Southwestern it has become part of their culture, where it is known as “The Formula” and is so important that it has a trademark.

Sometimes business formulas can be viewed as being too simplistic - too abstract and not capturing enough of the complexities of a business to be of any use. But when they are found and used they can provide clarity and direction to a business. In the case of Southwestern it was applied by focusing on the company’s core competencies, increasing production, adding low-cost gas reserves, improving efficiencies, lowering drilling costs, reducing debt and investing wisely. The company’s capitalisation went from $187 million in 1998 to $2 billion in 2005. Today it is a successful and vibrant company with a capitalisation of over $2.6 billion - proof that a formula can work!

Headlines regularly announce the latest scientific breakthroughs which promise radical improvements in the fight against illness.

They are eye-catching: “DEMENTIA BUSTER: New wonder drug hailed as a ‘game changer’ in battle against Alzheimer’s”, or “Miracle made in Britain! How the microscopic substance graphene can make sea water drinkable and even fight cancer”, and, from a few years ago, “Personal Genomes Will Spawn Made-to-Measure Drugs”. But the medication that my doctor prescribes can be decades old and sometimes works and sometimes doesn’t. So why does it take so long for a scientific breakthrough to make its way into an effective medication?

Getting from the eureka moment in the laboratory to an effective medication is very complex. First there is the scientific discovery, which establishes new facts, or explanations, that answer a question: what is the cause of cancer? Why do cells divide? Why is the blueprint for our bodies wrapped up in something called DNA? Once something new has been discovered it has to be turned into something that can be used, which can require developing new skills, equipment, processes and techniques. Finally, it has to be developed into something that people can easily use to cure, or manage, their illness.

Straddling all the stages from scientific breakthrough to medication are many regulations, intellectual property rights, industry standards and so on, some of which may need changing before the medication can be used. Then there is the tension between science and commerce, where one side’s priority is to discover new scientific facts and the other wants to take the breakthrough and make money from it (the gap between the two groups is often given the ominous name “the valley of death”). Many steps through layers of complexity add up to a long time and a lot of money before the new medication is available for use.

To illustrate the complexities of taking a scientific breakthrough into a medicine it is interesting to trace the history of something that sits on most cupboard shelves: aspirin. The story starts around 400 BC in Greece, when Hippocrates gave women willow leaf tea to relieve the pain of childbirth. It took until 1823 for the active ingredient to be extracted from willow and named salicin. Then in 1853 salicylic acid was made from salicin by French scientists, but it was found to irritate the gut. It took another 40 years until German scientists found a way to reduce its irritant properties. Then through the late 1890s a process for synthesising aspirin was developed, clinical trials were completed and aspirin was launched. New applications of aspirin are still being explored today, for example in the reduction of cancer risk. It is now the best known and most widely used medicine in the world, with an estimated 100 billion tablets taken every year. The history of aspirin shows how long it takes from the initial scientific breakthrough to being prescribed by a doctor, or bought off the shelf.

Digging behind the headlines quoted at the beginning of this post reveals their underlying status. For the dementia buster claim, the results are at Phase III of clinical trials (there are four phases). However, research into a cure for Alzheimer’s is notoriously difficult, with success rates declining at each phase of clinical trials. Fingers crossed! For the announcement about using graphene to fight cancer, university medical teams are working with it to produce minuscule drug delivery systems that can penetrate patients’ tumours before releasing cancer-killing medicines - but it is at the laboratory phase, with a long way to go before clinical trials! With using the genome for personalised medicine there has been more progress. Specific genetic disorders have been identified; for example, most inherited cases of breast cancer are associated with two abnormal genes: BRCA1 (BReast CAncer gene one) and BRCA2 (BReast CAncer gene two). There is also progress with gene-targeted cancer drugs, in particular in helping to identify targeted cancer therapies for a wide range of cancers. Also, companies are popping up that can analyse an individual’s DNA for a relatively low cost, e.g. 23andMe, but linking the results to medical conditions is still work in progress. Personalised, or precision, medicine continues to be a promising area of research - let’s hope that it gets lots more support.

Every scientific breakthrough should be celebrated - increasing our understanding of the world and ourselves can only be a good thing. But caution is needed when the breakthrough is proclaimed through media headlines: a dose of reality should be applied, with realistic estimates of when we can get access to the new medication. It would also prevent embarrassing discussions with my doctor that begin “But I have read about …”.

I use the web as a source for learning - see Using The Web To Learn - about a wide range of subjects: making sourdough bread; writing software to analyse a shopping basket; or understanding why football teams are no longer using the 4-4-2 formation. However, there is one area that needs to be improved: access to relevant information based on my level of knowledge about a subject.

Let me give you an example. I have reached an age when medical breakthroughs are becoming more important. Recently a breakthrough was announced in the fight against the terrible disease Alzheimer’s. I was curious about the science behind the breakthrough, in particular the analysis of the data to support the claim, and a quick search found a copy of the paper.

The paper was full of terminology that I didn’t understand. More searches helped to clarify the different proteins, scanning techniques, assessments of Alzheimer’s etc. that give an overview of the research. However, I was still interested in the statistical analysis used to support the claim that the test is 87% accurate. This is when the problems started. Further searches produced a mish-mash of information: poor quality videos, badly written lecture slides, chapters from draft books, chapters from books no longer available, lecture notes, different mathematical notation and more. After many coffee-drenched hours I eventually found the right level of information to help me understand the breakthrough. Instead of trawling through piles of confusing information, it would have been better if my search had brought back information at a level I could understand, in the time it takes to drink one cup of coffee!
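For readers wondering what a claim like “87% accurate” typically means, it usually comes from a confusion matrix: how many people with and without the disease the test classified correctly. A quick sketch in Python - the counts below are invented for illustration, not the figures from the paper:

```python
# Invented illustrative counts - NOT the data from the Alzheimer's paper.
true_positives = 44   # test says ill, patient is ill
false_negatives = 6   # test says healthy, patient is ill
true_negatives = 43   # test says healthy, patient is healthy
false_positives = 7   # test says ill, patient is healthy

total = true_positives + false_negatives + true_negatives + false_positives

accuracy = (true_positives + true_negatives) / total                 # overall hit rate
sensitivity = true_positives / (true_positives + false_negatives)    # ill correctly caught
specificity = true_negatives / (true_negatives + false_positives)    # healthy correctly cleared

print(f"accuracy={accuracy:.0%} sensitivity={sensitivity:.0%} specificity={specificity:.0%}")
# accuracy=87% sensitivity=88% specificity=86%
```

A headline accuracy figure can mislead when a disease is rare in the tested population, which is exactly why I wanted to see the paper’s full statistical analysis rather than the single number.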

This is my challenge to the search engine companies: when I search on a particular subject I need the information to be at a level that I can understand. So, search engine companies - sharpen up your algorithms, speed up your computers and let’s see what you can really do!

Using The Web To Learn

On a good day I can answer two of the questions that come up during the BBC’s programme University Challenge: the quiz where university students answer questions on a wide range of subjects. Using my smartphone I can answer more - it depends on how fast and accurately I can type! Answering questions using the web is a powerful thing. But what if I want to learn more about a subject?

I often come across a subject - when reading about a hobby, developing a new skill, or sparked by curiosity - where I need to find out more. But the subject can be like a brick wall built of strange words and unfamiliar concepts. I am stuck! For me there is only one approach, and that is self-learning; in other words, working through the subject by myself. The following is the way that I use the web to help me learn.

I start by identifying the people who are working on the subject and searching for what they have written - for example blogs, papers, public presentations, books, videos on YouTube and so on. I also search out any commentators, or critics, on the subject. This initial phase is like detective work: pulling together scraps of information about the subject and assembling them into some sort of overview.

Let me give an example. Tim Harford wrote an interesting article about how we get it wrong when imagining the effects of new technology on the future of everyday life. In the article he mentioned an interesting idea proposed by the economists Daron Acemoglu and David Autor: that we should view work in bite-size chunks - tasks rather than jobs. Basically they think that “routine, codifiable tasks” can be automated while purely human skills such as problem solving and creativity cannot. Searches produced papers by the two economists with titles such as Skills, Tasks and Technologies: Implications for Employment and Earnings; Polanyi’s Paradox and the Shape of Employment Growth; and The “task approach” to labor markets: an overview. I found Tim’s email address on the web and contacted him for advice on the best papers, and he kindly suggested Why Are There Still So Many Jobs? The History and Future of Workplace Automation.

The next stage is to immerse myself in all of the information that I have gathered which feels like drowning. But after surfacing for a few coffees I find that some of the new terminology makes sense and the subject starts to take shape.

Eventually I reach a point where I feel confident enough to test my understanding of the new subject. This is done by working through some examples or tutorials, and then using my new knowledge on a problem. It can be done more easily in some areas than others; for example, using open source software I can test my understanding of new concepts in computer technology. Other subjects, such as surgery, I can only read about.

Continuing with my example, the subject falls under the heading of economics. So, applying what I have learned, can I draw any sensible conclusions about changes in business? For example, what is driving the increase in call centres? Part of the job of a salesperson was to make regular calls to customers to try to drum up new business or chase repeat orders; this was usually when selling to a business, and it is still practised today. Call centres, however, take the “task” of calling customers to retail customers and probably reduce the cost per call through centralisation of the task. Not a rigorous conclusion, but the start of a few interesting thoughts.
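As a back-of-the-envelope check on that thought, here is the arithmetic in Python. The figures are invented purely to illustrate the centralisation argument - they are not real industry data:

```python
# Invented illustrative figures - not real industry data.
# A field salesperson's call: part of an expensive day, with travel overhead.
salesperson_day_cost = 300.0  # salary, car and expenses per day
sales_calls_per_day = 10      # calls squeezed in between visits and admin

# A call-centre agent: lower wage, shared infrastructure, calling all day.
agent_day_cost = 100.0        # wage plus a share of building and phone system
agent_calls_per_day = 80      # scripted calls, no travel

print(f"salesperson: £{salesperson_day_cost / sales_calls_per_day:.2f} per call")
print(f"call centre: £{agent_day_cost / agent_calls_per_day:.2f} per call")
# salesperson: £30.00 per call
# call centre: £1.25 per call
```

Even if the real figures differ, the size of the gap suggests why moving the calling task to a centralised team is attractive - a small test of my new knowledge.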

What I have described above is the first step. To gain a greater level of expertise in a subject can take many years: there is a debate raging on the web which suggests that around 10,000 hours are required to gain a high level of expertise in a subject. That is a long time, but exploring a new subject is the fun of learning.

Next time I sit down to watch University Challenge I hope to have gained more knowledge from my self-learning so that I can answer more than my usual couple of questions - if not, I can always access the web!