The Barbican’s ‘More than Human’ exhibition explores creative and scientific developments in artificial intelligence (AI) and aims to demonstrate its potential to revolutionise our lives in many ways. Having visited it last week and come away overwhelmed by the kaleidoscope of concepts, sounds, information and objects combining to explore our relationships with technology, I thought it would be useful to focus this article on the basics of the AI revolution.
So, what is AI?
According to the Barbican’s literature, “AI is comprised of a collection of algorithms, which are executable mathematical functions that process data through a series of rules and step by step instructions. AI algorithms make use of data and rule-based logics to model and simulate environments, with little human interference.”
We are all familiar with the term AI. It’s a technology that already influences how we experience the world, and it has the potential to transform all aspects of our lives, from healthcare and mobile apps to city planning, surveillance and manufacturing. Fundamentally, it’s a game-changing technological development.
How AI has been used so far
One of the most widely reported AI systems is IBM Watson. This is because it’s available as a set of open APIs and SaaS products, democratising artificial intelligence for developers to do with as they please. For that reason, its name has cropped up alongside some of the most interesting experiments in AI to date. Watson’s website showcases how businesses across diverse industries have improved their processes using the system. The Toronto Raptors basketball team, for example, uses IBM Watson to analyse gaps in its squad and identify the players that best fit the bill, while elevator maintenance firm KONE predicts elevator conditions to assess the likelihood of faults and keep things moving smoothly.
Beyond Watson, the engineering sector is already embracing AI and its potential: metal products used for construction, car frames, antennae and bike stems have seen weight reductions, improved design and increased effectiveness thanks to AI.
The list goes on: quote-to-cash provider Apttus uses AI-driven contract management to speed up contract processing and close sales faster; Google uses AI to provide better search results, cool its data centres and translate languages; and artificial intelligence company Human aims to use the technology to curb suicides on public transport.
Other examples represent revolutionary change in their sector. In 2017, Microsoft announced the creation of an entirely new healthcare division with artificial intelligence solutions at its core. As The Times reported: “the team’s research could include developing predictive analytic tools, and personal health information systems, as well as using AI to target interventions.”
We also have Apple’s ARKit and Google’s ARCore: while these are augmented reality (AR) rather than AI developments, they rely on machine learning to function. Building block technologies like these, including IBM Watson, image recognition and voice recognition, are already production-ready, or close to being so – and will play a key role in making AI technology accessible to consumers.
A disruptive force
Custom machine learning is the most important potential disruption to the workplace since the Industrial Revolution. In this sense, AI will be a game-changer. Automation may see jobs replaced by robots or AI software programmes – affecting low-skilled and high-skilled, knowledge-based roles alike. Insight from Deloitte reveals that 39% of legal sector roles could be automated in the coming decade, while NPR’s automation calculator suggests 95% of accounting roles could go the same way. Farmers are finding their milking duties taken over by robots, and a bricklaying robot can do the same job as two to three human labourers.
AI, should it keep developing at the current rate, promises to improve medical diagnoses and treatment, remove workers from dangerous, high-risk roles, and increase productivity. With this, inevitably, will come a reduction in the need for human employees in certain fields – but with baby boomers retiring and the generations behind them smaller in number, plus Britain’s startup numbers increasing at a rapid rate, the employment problem may not materialise at all. Remember, too, that previous industrial revolutions have all created new jobs as well as reducing the need for existing ones. There’s no longer a need for pinsetters in bowling alleys, for example – but those roles were replaced by the people needed to build and install the machines that now set the pins mechanically.
AI apps: Where to start
Step 1: Data collection
If you’re excited by the possibilities of AI, you might be wondering where to start. The answer is data – and plenty of it.
For a computer to be able to recognise something – an object in an image, a voice, a location – you’ll need 10,000 to 100,000 examples that contain that element. You may need to generate this data yourself, or, depending on your app’s purpose, you may be able to trawl Google image search for 10,000 photos of the thing in question.
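To give a feel for the scale involved, here is a minimal Python sketch – the dataset folder and category layout are hypothetical, not part of any particular toolkit – that counts how many example images you have per category and how far short of the 10,000 mark each one falls.

```python
from pathlib import Path

# Hypothetical layout: one sub-folder per category, e.g.
#   dataset/cats/0001.jpg, dataset/dogs/0001.jpg, ...
DATASET_DIR = Path("dataset")
IMAGE_SUFFIXES = {".jpg", ".jpeg", ".png"}
TARGET = 10_000  # the rough lower bound discussed above

for class_dir in sorted(p for p in DATASET_DIR.iterdir() if p.is_dir()):
    count = sum(1 for f in class_dir.rglob("*")
                if f.suffix.lower() in IMAGE_SUFFIXES)
    status = "ok" if count >= TARGET else f"need roughly {TARGET - count} more"
    print(f"{class_dir.name}: {count} images ({status})")
```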
Step 2: Making sense of the data
As we’re always keen to tell our clients who show an interest in AI, it’s not just having the data that counts. Owning a database of 10,000 photos, for example, doesn’t mean a thing; those photos, objects or whatever else need to have meaning attached before machine learning can happen. This means tagging. This really is the grunt work of AI – long, arduous and time-consuming work that is often outsourced to other companies. We have one client keen to implement AI in their app in a couple of years; their data gathering and tagging has already started.
Companies often get the public to do this work for them without people realising it. If you’ve ever had to confirm that you’re human online by typing what you see in a picture, more often than not you were helping to train an artificial intelligence.
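To make the idea of tagging concrete, here is a minimal Python sketch – the folder names, file names and labels are hypothetical – that writes a simple manifest mapping each image file to a tag, which is roughly the kind of artefact all that grunt work produces.

```python
import csv
from pathlib import Path

# Hypothetical example: write a manifest that records a tag for every image,
# so the training step knows what each file is supposed to show.
DATASET_DIR = Path("dataset")
MANIFEST = Path("labels.csv")

with MANIFEST.open("w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["filepath", "label"])
    for class_dir in sorted(p for p in DATASET_DIR.iterdir() if p.is_dir()):
        # Here the folder name stands in for a human-applied tag; in real
        # projects the tags usually come from people reviewing each item.
        for image in sorted(class_dir.glob("*.jpg")):
            writer.writerow([str(image), class_dir.name])

print(f"Wrote {MANIFEST} - spot-check a sample of rows by hand before training.")
```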
Step 3: Building the model
Once you have a solid bank of tagged data, the next step is to find someone to build an AI model – still a highly specialised role. Most software developers are used to working with deterministic systems – systems that always produce the same output for a given input. The logic in most apps is simple: if this happens, do this; if that happens, do that.
AI, however, is different: it is statistical rather than rule-based. The tools for establishing what an AI system is doing, and how it is ‘thinking’, are still poorly developed and poorly understood. Deep learning may be great for problem solving, but the systems behind it are incredibly complex, relying on highly technical algorithms and techniques that grow more involved as the scope of the project increases.
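For a sense of what building a model looks like in practice, here is a deliberately tiny sketch using TensorFlow/Keras and random placeholder data; a real project would train on the tagged dataset from Steps 1 and 2, with far more data and a far larger network.

```python
import numpy as np
import tensorflow as tf

# Placeholder data standing in for the tagged images from Steps 1 and 2:
# 1,000 tiny 32x32 RGB images split across 2 hypothetical classes.
images = np.random.rand(1000, 32, 32, 3).astype("float32")
labels = np.random.randint(0, 2, size=(1000,))

# A small convolutional network: it learns a statistical mapping from
# examples to labels rather than following hand-written if/then rules.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(32, 32, 3)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(2, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Training on random noise won't learn anything useful; the point is the
# workflow: feed tagged examples in, let the model adjust itself.
model.fit(images, labels, epochs=3, batch_size=32, validation_split=0.2)
```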
Step 4: Computing power
This isn’t really a stage as such, but a necessity for anyone developing an AI system. An AI project requires a huge amount of computing power – the equivalent of a couple of hundred computers working at any one time. With the likes of Amazon Web Services and the Google Cloud Platform available, this is less of a problem than it used to be. That computing power matters because the machines have to work through every example provided, make sense of it, and train themselves to recognise exactly what you’re looking for.
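As a small illustration – and assuming a TensorFlow-based workflow like the sketch above – this checks what hardware is available locally, which is often the prompt to move training onto rented cloud machines instead.

```python
import tensorflow as tf

# Check what hardware TensorFlow can see locally before deciding whether to
# rent cloud machines (e.g. on Amazon Web Services or Google Cloud) to train.
gpus = tf.config.list_physical_devices("GPU")
cpus = tf.config.list_physical_devices("CPU")

print(f"CPUs visible: {len(cpus)}")
print(f"GPUs visible: {len(gpus)}")
if not gpus:
    print("No GPU found - training anything substantial here will be slow; "
          "consider spinning up cloud compute instead.")
```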
Once these steps are complete and the AI system has been created and trained, the resulting product is often quite simple and able to run on a phone or a small server in real time. The implementation itself is far less of a challenge than the system’s initial creation.
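To illustrate that last point, here is a hedged sketch of how a trained Keras model might be exported with TensorFlow Lite for real-time use on a phone or small server; the tiny stand-in model below is hypothetical, standing in for whatever the training step actually produced.

```python
import tensorflow as tf

# A tiny stand-in for the trained model from Step 3; a real project would
# load its own trained network here instead.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(32, 32, 3)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(2, activation="softmax"),
])

# Convert to TensorFlow Lite, a format designed for running models on
# phones and other small devices in real time.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)

print(f"Wrote model.tflite ({len(tflite_model)} bytes)")
```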
Ready for the future?
While AI has plenty of benefits and potential uses – as well as a prestigious exhibition about the past, present and future of the subject – the technology is still very much in its infancy, and the human skills businesses need to understand both the strategic opportunities of artificial intelligence and its operational requirements aren’t always readily available. As highlighted above, AI takes time and data to reach the point where the technology becomes useful. It also takes people with know-how. However, if you believe this technology will be relevant to your business in the future, start planning now for tomorrow.
If you’d like to know more about new technologies for business, read our blockchain article with insights from our Technical Director, Ben Clayton.