App Insights

Google I/O and Apple’s WWDC: Key takeaways

Date Published:

For those not in the know, Google I/O and Apple’s WWDC are the year’s definitive app tech events. The two biggest players in the industry set out their roadmaps, and – with virtually all app engineers designing for their operating systems – the rest of the industry follows.

Last year, hot topics included Instant Apps, integrated voice control and new subscription models for mid-sized developers. This year, Apple implored us to imagine a world without apps – which makes you realise how integral they’ve become to our daily lives. So what were the biggest insights we took home from this year’s conferences?

Google’s AI

Google’s Assistant will now be available on Apple’s iPhone, changing the terms of competition between the biggest players in smartphone tech.

If this causes a shift towards apps that integrate with both Apple and Google-driven voice control, then we’ll see more bang for our development buck. A hybrid app that’s more stable and easier to develop may become more attractive than an exclusive design for one major smartphone or the other.

Google also plans to embed more AI into existing phone functions – especially everything that uses a camera. Google Lens is a huge deal: technology that recognises objects in the real world. At its most basic, the system will build AI into AR, cropping out unwanted details from what your phone camera can ‘see’ – noticeboards that obstruct ‘magic moments’ on heritage trails, for example.

There’s more to Lens than glorified Photoshopping, though; it’ll let smartphones understand what they see and take action accordingly. Point your phone at something, and the system will identify it, search the Web for information about it and interact with it. This could be a game-changer for app-driven heritage trails: it automates a process we previously built with beacons and GPS, putting control back in the hands of the individual visitor and their device.

Apple’s AR

The biggest news from Apple is ARKit – a new platform for developing AR apps for iPhones and iPads that will fundamentally reshape the creative possibilities of AR.

More accurate real-world tracking by devices means their AR display will be less jittery. No more virtual objects bouncing around the screen, struggling to attach to their real counterparts. More detailed scene understanding means smaller objects can be tagged for in-app information displays. Finally, lighting estimation means we’ll be able to use AR to offer a well-lit view of heritage artefacts that – for preservation’s sake – have to be kept in the dark.

Time Magazine’s Tim Bajarin suspects Apple has something bigger up its sleeve for AR. Tim was wrong in his prediction that new AR-capable hardware would be released before the developer kit. But when he put the question to Apple CEO Tim Cook, Cook indicated that, given Apple’s platform-based approach to AR, its software should be able to run on most iPhones and iPads, rather than being tied to a specific design.

Like the drift toward hybridity and cross-compatibility, this can only be good news for developers like us. The more devices that can run a given app, the more take-up we’ll see for that software. ARKit is already open to developers, meaning we can future-proof our software, aware of the capabilities we’ll be working with down the line.

Democratising smartphone tech

Google’s Android operating system now runs on two billion devices around the world. To reach the next billion, Google is making a break for lower-spec hardware that’s more affordable and resilient, opening up smartphone access to developing countries.

The new Android Go project will offer a streamlined user interface and optimised low-memory apps, with a version of the Play Store which foregrounds these more efficient designs. From a placemaking perspective, this is brilliant; it could bring more people and voices into virtual storytelling territory. Some app-driven heritage experiences were previously off-limits to those using lower-spec or older devices.

Meanwhile, Apple is focussed on improving user experience. Siri, Control Centre and the iOS App Store will all become more accessible with voice responses, topic suggestions, touch controls and app discovery functions. More paths to different areas of smartphones mean lower barriers to entry, and more users engaging with app content.

With both of the biggest players in the smartphone industry leaning toward accessibility, flexibility and compatibility, the future looks bright for AR and AI. For design and development teams like ours, that means more opportunities to create engaging new app experiences for customers and heritage visitors. Where will we be twelve months from now?

Have we missed your key highlight from the conferences? Tell us on Twitter and LinkedIn.
Image credit:

Daniel Spiess via Flickr, Creative Commons 2.0