With the recent passing of Stephen Hawking and Elon Musk’s latest predictions of life on Mars, now is a timely moment to reflect on Artificial Intelligence (AI), and on the opportunities it offers.
Firstly, what actually is meant by AI? From a benign home help bot doing chores or ordering groceries via Alexa through to Terminator and a robot Armageddon, the images the term conjures up are legion, often heavily influenced by science fiction. To my mind, simply put, AI is a set of programmes which allow any type of machinery with an internal processor to mimic or replace tasks or interactions currently done by humans.
To some extent, this has been underway for decades – auto-pilots on airplanes, even robot vacuum cleaners, have taken over jobs that originally required a human. It is a trend which began with the Industrial Revolution, as machines replaced human labour, and which has continued through to the present day. There are no more switchboard operators, mailrooms have largely disappeared, and lighthouses and railway signals are automated and computerised.
However, recent improvements in computing at scale (aka the Cloud), smart connected devices and the Internet of Things (IoT) have all created an environment where the pace of change in human society is about to rapidly increase.
Self-driving vehicles alone could soon see professions such as taxi driver, chauffeur and truck driver disappear. A sky full of drones delivering groceries to homes based on interactions with our smart fridges is already feasible, if not yet quite accepted by anyone outside Silicon Valley.
So, what does this future world have to do with the day-to-day life of IT professionals? For most of us, not very much. We still work through tasks and schedules based on 20th-century approaches to business and management. We plan incremental business improvements as we juggle budget cycles, gradually adopt newish technologies such as VoIP and video calling, update computers to the latest flavour of the Microsoft OS, struggle to keep our young millennial developers and BAs engaged and productive, and hope that our controls are sufficient to prevent a major cyber-security breach!
Nevertheless, there are opportunities for us all to help guide our colleagues in the wider business into the brave new world of AI. As technology becomes more pervasive, and as functions that once sat firmly within IT move out to other business teams (teams which have themselves adopted Agile and Lean techniques to continuously improve processes and speed to market), the IT group can claim to be a true thought-leader, adding value and revenue to the bottom line and finally becoming a profit centre rather than just a cost centre.
In all of my roles in IT management, the search for new ways to leverage technology to drive better business has always been a constant source of excitement and fascination.
One such way is to use AI to automate the more tedious, routine deliverables, freeing the team to think outside the box and challenge assumptions. Another is the analysis of disparate data sources to improve business and customer outcomes.
A good example of how AI, through machine learning and natural language generation, could make a significant difference to us all is in fault-tolerance mapping, automated analytics and alert generation. Smart electricity meters capture enough data about power usage that it is now feasible to identify failing, or soon-to-fail, hot water cylinders. Working through that data is a laborious task even for highly skilled data analysts, but not for an AI engine. Imagine getting proactive warnings from our electricity provider, with remedial actions to minimise risk, or even discounts on a new cylinder or a gas water heating solution!
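To make the idea concrete, here is a deliberately minimal sketch of the kind of rule such an engine might start from: flag any day whose consumption jumps well above its recent rolling baseline, since a sustained rise in heating load can point to a faulty cylinder. The readings, baseline window and threshold below are all invented for illustration; a real system would learn from far richer data.

```python
def flag_cylinder_drift(daily_kwh, baseline_days=7, threshold=1.25):
    """Flag days where usage exceeds the rolling baseline by `threshold`x.

    A failing hot water cylinder (e.g. lost insulation or a faulty
    thermostat) typically shows as a sustained rise in heating load.
    Returns the indices of the flagged days.
    """
    flags = []
    for i in range(baseline_days, len(daily_kwh)):
        baseline = sum(daily_kwh[i - baseline_days:i]) / baseline_days
        if daily_kwh[i] > baseline * threshold:
            flags.append(i)
    return flags

# Hypothetical meter data: steady usage, then a sustained jump.
usage = [10, 11, 10, 12, 11, 10, 11, 10, 16, 17, 18]
print(flag_cylinder_drift(usage))  # → [8, 9, 10]
```

The point is not the rule itself but that an AI engine can apply and refine thousands of such checks across every customer, continuously, where human analysts cannot.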
A similar opportunity is using a combination of hindcast data (i.e. looking for patterns in historical data) and forecast weather data to provide real-time warnings for severe weather events, such as lightning storms. The technology already exists but, as above, requires intensive manual intervention. Far better to have an AI engine doing the real-time analysis and identifying potential risk situations – say, a thunderstorm over an airfield, or a lightning storm rapidly approaching a rock concert (as happened in early June 2016). The meteorologists could then focus on these situations and manage the warnings proactively.
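As a toy illustration of the kind of check such an engine would automate (the distances and warning radius here are invented), consider flagging a storm that is both closing on a venue and already inside a warning radius:

```python
def storm_alert(distances_km, warning_radius_km=15.0):
    """Return True if the latest reading shows the storm both closing on
    the venue and inside the warning radius. Figures are illustrative."""
    if len(distances_km) < 2:
        return False  # need at least two readings to judge direction
    closing = distances_km[-1] < distances_km[-2]
    return closing and distances_km[-1] <= warning_radius_km

# A storm tracked at 40, 30, 20 then 12 km out is closing and close by.
print(storm_alert([40.0, 30.0, 20.0, 12.0]))  # → True
```

Run continuously over live radar tracks for thousands of venues at once, even a rule this simple becomes something no team of forecasters could match by hand.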
The same opportunities for AI to add significant value exist in healthcare. AI can manage medicine dispensing from a central warehouse, automating every step from receipt of the order, through selection of the appropriate medicines and packaging, to dispatch via drone to the patient's home. AI could also manage appliances, medical equipment and other technologies in elderly people's homes, reducing the burden of chores and maintenance so that older people can stay in their own homes and live much more fulfilling lives, rather than moving into assisted-living environments.
The technologies that, in whole or in part, deliver AI outputs offer enormous benefits to us all. The ability to process data into information assets, providing enhanced knowledge to individuals and businesses at speeds and volumes never before imagined, is exhilarating. For example, a fascinating factoid I learned several years ago is that sales of refrigerated soft drinks increase and then plateau as temperatures rise. Imagine a vending machine beside a swimming pool automatically increasing the price of Coke in summer!
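Such a pricing rule could be sketched in a few lines; the uplift per degree and the cap below are purely invented, with the cap echoing the observation that demand eventually plateaus:

```python
def summer_price(base_price, temp_c, start_c=25.0, step=0.05, cap=1.5):
    """Raise the price 5% per degree above 25C, capped at 150% of base.
    The cap mirrors the observation that demand levels off in the heat.
    All numbers are invented for illustration."""
    multiplier = 1.0 + max(0.0, temp_c - start_c) * step
    return round(base_price * min(multiplier, cap), 2)

print(summer_price(2.00, 20))  # cool day: base price, 2.0
print(summer_price(2.00, 30))  # warm day: 25% uplift, 2.5
print(summer_price(2.00, 45))  # heatwave: capped at 150%, 3.0
```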
In conclusion, AI is not a threat to the IT professionals reading this article. Rather, it behoves us all to harness the opportunities that it and other new, disruptive technologies present, and to leverage them to improve the lives of everyone in our communities.