App Insights

How should you prepare your business for the Artificial Intelligence (AI) revolution?

Date Published:

You may or may not have heard of the Chinese game, Go.

All you need to know is that it’s the most complicated game on earth: there are more possible moves in a game of Go than there are atoms in the universe. And in May 2017, Google’s AlphaGo artificial intelligence programme went head to head with the world’s best Go player and won every game of their three-game match. This was a feat that many believed was impossible.

You may remember chess grandmaster Garry Kasparov being beaten by a computer – IBM’s Deep Blue – in 1997. This latest victory was something more significant. Deep Blue was designed specifically to play chess: it learnt moves, positions and tactics based on the way the pieces were laid out on the board. AlphaGo was different. It used general-purpose AI to learn the game from scratch, improving its skill by playing against different versions of itself. This was a computer making its own decisions independently of human programming and becoming better at the game than our best players. This is why AI is creating such a fuss.

AI, of course, isn’t all fun and games. Rather, it’s a game-changing technological development for everything from healthcare to the creative sectors. So what is the state of play in AI at the moment, what are the challenges, and how should you be preparing for the AI revolution?

How AI has been used so far

One of the most widely reported AI systems is IBM Watson. The reason is that it’s available as a set of open APIs and SaaS products, democratising artificial intelligence for savvy developers to do with as they please. For that reason, its name has cropped up alongside some of the most interesting experiments in AI to date. Watson’s website showcases how businesses in diverse industries have improved their processes using the system. The Toronto Raptors basketball team, for example, uses IBM Watson to analyse gaps in its roster and identify the players that best fit the bill, while elevator maintenance firm KONE uses it to predict elevator conditions and assess the likelihood of faults, keeping things moving smoothly.

Beyond Watson, the engineering sector is already embracing AI and its potential: metal products used for construction, car frames, antennae and bike stems have seen weight reductions, improved design and increased effectiveness thanks to AI.

The list goes on: quote-to-cash service Apttus uses AI-driven contract management to speed up contract processing and close sales faster; Google uses AI to provide better search results, cool its data centres and translate languages; and artificial intelligence company Human aims to curb suicides on public transport using the technology.

Other examples represent revolutionary change in their sector. Microsoft recently announced the creation of an entirely new healthcare division with artificial intelligence solutions at its core. As The Times reports: “the team’s ­research could include developing predictive analytic tools, and personal health information systems, as well as using AI to ­target interventions.”

We also have Apple’s ARKit and Google’s recently announced ARCore: while these are augmented reality (AR) rather than AI developments, they rely on machine learning to function. Building block technologies like these, including IBM Watson, image recognition and voice recognition, are already production-ready, or close to being so – and will play a key role in making AI technology accessible to consumers.

A disruptive force

Custom machine learning is potentially the most important disruption to the workplace since the Industrial Revolution. In that sense, AI will be a game-changer.

Automation may see jobs replaced by robots or AI software programmes – and that includes both low-skilled roles and high-skilled, knowledge-based ones. Insight from Deloitte suggests that 39% of legal sector roles could be automated in the coming decade, while NPR’s automation calculator suggests 95% of accounting roles could go the same way. Farmers are finding their milking duties taken over by robots, and a bricklaying robot can do the same job as two to three human labourers.

AI, should it keep developing at the current rate, promises to improve medical diagnoses and treatment, remove workers from dangerous, high-risk roles, and increase productivity. With this, inevitably, will come a reduction in the need for human employees in certain fields – but with baby boomers retiring and the generations behind them smaller in number, plus Britain’s startup numbers increasing rapidly, the employment issue may prove smaller than feared. Remember, too, that previous industrial revolutions have all created new jobs as well as reducing the need for existing roles. There’s no longer a need for pinsetters in bowling alleys, for example – but those roles were replaced by the employees needed to build and install the systems that reset pins mechanically.

AI apps: Where to start

Step 1: Data collection

If you’re excited by the possibilities of AI, you might be wondering where to start. The answer is data – and plenty of it.

For a computer to be able to recognise something – an object in an image, a voice, a location – you’ll need 10,000 to 100,000 examples that include that element. It may be that you need to generate this data yourself, or it’s possible that (depending on your app’s purpose) you can trawl Google image search for 10,000 photos of the thing in question.
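However you gather those examples, it helps to hold back a portion for checking the model later, rather than training on everything you have. A minimal Python sketch of that housekeeping – the filenames and labels here are purely hypothetical:

```python
import random

def split_dataset(examples, train_fraction=0.8, seed=42):
    """Shuffle labelled examples and split them into training and validation sets."""
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    shuffled = list(examples)
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_fraction)
    return shuffled[:cut], shuffled[cut:]

# Hypothetical (filename, label) pairs gathered from an image search.
examples = [(f"cat_{i}.jpg", "cat") for i in range(100)] + \
           [(f"dog_{i}.jpg", "dog") for i in range(100)]
train, validation = split_dataset(examples)
print(len(train), len(validation))  # 160 40
```

The held-back validation set is what tells you whether the model has genuinely learnt to recognise the thing, rather than merely memorising the training photos.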

Step 2: Making sense of the data

As we’re always keen to tell our clients who show an interest in AI, it’s not just having the data that counts. Owning a database of 10,000 photos, for example, doesn’t mean a thing; those photos, objects or whatever else need to have some meaning attached before machine learning can happen. This means tagging. This really is the grunt work of AI – long, arduous work that is often outsourced to other companies. We have one client keen to implement AI in their app in a couple of years’ time; their data gathering and tagging has already started.
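At its simplest, the job is to make sure every item in the bank ends up with a label attached, and to keep track of what’s left to do. A toy sketch of that bookkeeping in Python – the filenames and tags are made up for illustration:

```python
def untagged_items(items, tags):
    """Return the items that still have no tags attached - the remaining grunt work."""
    return [item for item in items if not tags.get(item)]

photos = ["img_001.jpg", "img_002.jpg", "img_003.jpg"]
tags = {
    "img_001.jpg": ["dog", "outdoors"],  # tagged and ready for training
    "img_002.jpg": [],                   # opened but never labelled
}
print(untagged_items(photos, tags))  # ['img_002.jpg', 'img_003.jpg']
```

Multiply that check across tens of thousands of items and it becomes clear why the tagging stage takes so long, and why it often starts years before the model itself is built.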

Companies often get the public to do this work for them without their knowledge. If you’ve ever had to confirm that you’re a human online by typing what you see in a picture, that is, more often than not, training for artificial intelligence.

Step 3: Building the model

Once you have a solid bank of tagged data, the next step is to find someone to build an AI model – still a highly specialised role. Most software developers are used to working with deterministic systems – systems that will always produce the same output from the same input. The logic in most apps is simple: if this happens, do this; if that happens, do that.

AI, however, is different. The tools that establish what an AI is doing and how it is “thinking” are currently poorly developed and poorly understood. Deep learning may be great for problem solving, but AI systems are incredibly complex, requiring highly technical algorithms and techniques that grow in complexity with the scope of the project.
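The contrast can be shown with a toy example: the first function below is classic deterministic app logic; the second stands in for a trained model, returning a confidence score rather than following a hand-written rule. The weights here are invented for illustration – in a real system they would come out of training, not a dictionary:

```python
def shipping_cost(order_total):
    """Classic deterministic app logic: the same input always gives the same output."""
    if order_total >= 50:
        return 0.0
    return 4.99

def toy_model_score(features):
    """A stand-in for a trained classifier: it returns a confidence score,
    not a hard yes/no. Real weights would be learnt from tagged data."""
    weights = {"has_fur": 0.6, "barks": 0.35}  # invented 'learned' weights
    return sum(weights.get(f, 0.0) for f in features)

print(shipping_cost(60))                                # 0.0
print(round(toy_model_score(["has_fur", "barks"]), 2))  # 0.95
```

The shift from writing rules to shaping scores is exactly what makes model-building a specialised skill: the developer tunes data and training, not if-statements.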

Step 4: Computing power

This isn’t really a stage as such, but a necessity for any developer building an AI system. An AI project requires a huge amount of computing power – equating to a couple of hundred computers running at any one time. With the likes of Amazon Web Services and the Google Cloud Platform available, this is less of a problem than it once was. But massive computing power matters: it’s what lets the computers look at the examples provided, understand them, and train themselves to spot the exact thing you’re looking for.

Once these steps are complete and the AI system has been created and trained, the resulting product is often quite simple, and able to run on a phone or small computer server in real time. The implementation itself is far less of a challenge than the system’s initial creation.
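That asymmetry is easy to picture: training soaks up racks of machines, but using the finished model can be as cheap as a few multiplications. A toy sketch of inference with an already-trained linear model – the weights below are hypothetical stand-ins for the output of an expensive training run:

```python
def predict(weights, bias, features):
    """Inference with a trained linear model: just a weighted sum.
    All the expensive work happened earlier, during training."""
    return sum(w * x for w, x in zip(weights, features)) + bias

weights = [0.4, -0.2, 0.1]  # hypothetical values produced by training
bias = 0.05
score = predict(weights, bias, [1.0, 2.0, 3.0])
print(round(score, 2))  # 0.35
```

A handful of arithmetic operations like this runs comfortably on a phone, which is why deployment is the easy part.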

Ready for the future?

While AI has plenty of benefits and potential uses, the technology is very much in its infancy, with little common understanding of how AI systems make their decisions – a point proven recently when AI developed by Facebook invented its own language.

It’s weird, wonderful, exciting and disruptive, but AI takes time and data to reach the point where it becomes useful. If you believe the tech will be relevant to your business in the future, start planning for tomorrow now.