
2016
UX Trends over the Past Few Years and the Evolution of Slippy UX

This White Paper looks at the various ways in which UX design has
changed over the last few years and retraces the evolution of
Slippy UX.
Contents

Introduction
Multitasking Support
Continuity
Wearables
Contextual Technology: Google Now on Tap, 3D Touch, Force Touch,
Proactive Search and Google Instant Apps
Shifting Mindsets of UX Practitioners
Technology to Deliver Enhanced UX
Conclusion
Introduction
According to a 2016 comScore report1, Millennials around the
globe spend around three hours per day using smartphone apps.
Mobile accounts for 67% of all time spent on the internet, and
within this mobile consumption, apps account for almost 90% of
all time spent on mobile.
Advances in touchscreen user interfaces, the rise of apps and a
new paradigm of user experience took the Internet from the
desktop into the palms of users’ hands. Statistics from virtually
every market around the globe show that the average person now
does more on mobile than on any other channel.
Whilst the increased use of mobile over desktop is due to various
factors, such as improved 4G coverage and wider public Wi-Fi
availability, the popularity of mobile has also grown alongside
steady enhancements in user experience design (UX design). In
general, UX design can be defined as the process of anticipating
and designing the experience a user will have with a digital or
physical product. Good UX design renders a product useful and
simple to use; the consumer can expect a pleasurable interaction
with it and, in turn, appreciate the value of the product.
Since the advent of smartphones, mobile operating systems have
received regular manufacturer software upgrades and
improvements. For companies that provide mobile apps, this
creates an opportunity to regularly improve the services they
deliver over mobile, both at the front end in terms of UX design
and at the back end in terms of programming. App providers
must regularly review the experience they are providing, to
ensure their app offers the best possible experience to the user
within the framework of the operating system of the user’s phone.
The days of designing apps around sticky experiences, where the
success of an app was measured only according to the time users
spent in-app, are coming to an end.
1 https://www.comscore.com/Insights/Presentations-and-Whitepapers/2016/The-2016-US-Mobile-App-Report
Thanks to improved background monitoring, experiences are no
longer designed solely to keep people within apps. Interactive
notifications, Bluetooth beacons, wearables, continuity and
omni-device working mean that certain app experiences now
need to be designed for minimal interaction. Mobile is meant to
help users do things more easily than they can elsewhere, so
creating experiences that are smart and slippy, rather than merely
sticky, has become increasingly important in the design of some
app experiences.
This White Paper explores the different ways in which UX design
has changed over the last few years, from interface design
changes such as with Google’s Material Design Philosophy, to the
ways apps interact with the user.
Google’s Material Design Philosophy
As the experience of using a mobile phone and mobile software
has vastly improved over the years, so too has mobile interface
design. Over the past few years, there has been a big move from
more realistic and skeuomorphic designs—intended to mimic
physical objects—to flatter user interfaces, which are clean and
uncluttered, but have sometimes been used to the detriment of
the user experience. For example, the loss of the obvious
tappable button sometimes hampered users’ understanding of
how to use the interface.
Google’s Material Design philosophy, introduced in 2014,
addressed the challenge of providing clean and uncluttered
designs with the most user-friendly experience. This was achieved
by combining the minimal flat design of its predecessors with the
idea of layering designs using light, surface and movement to
convey interactions and relationships within an interface.
Aside from design language changes such as the aforementioned
Material Design philosophy introduced by Google, here are some
of the other top changes to mobile UX over the past few years:
Multitasking Support
Multitasking is a computing term that describes more than one
process or program running at the same time. Early computers
could run only one task at a time but, as processing power has
increased, today’s computers can run multiple applications
simultaneously.
As a way of managing battery life, iOS did not support
multitasking for third-party apps at launch. To use apps such as
Spotify or other third-party music apps in the early years, users
had to keep them open in the foreground. Quitting the app made
the music stop, making app stickiness a vital design philosophy
and encouraging the user to engage with an app or at least keep
it open. This didn’t change until iOS 4, which was introduced in
2010.
There are differences between how Android and iOS handle
multitasking. Android allows apps to truly run in the background:
they can complete tasks, run services, work to a schedule or do
anything else, as long as they have been given permission to do
so. Apple’s original multitasking, released with iOS 4, allowed
different apps to be open at the same time with easy navigation
between them. However, apart from various native and music
apps, once an app was placed into the background it would
freeze in its current state until brought back to the foreground.
This lulled users into believing that they were multitasking
between apps, when only one app was actually running at any
one time.
Since the introduction of iOS 7 in 2013, Apple has improved its
multitasking features, allowing apps to update themselves in the
background. This is a key development in terms of the move
towards Slippy UX, which will be discussed in more detail later in
this paper. If an app does not need to be open to run and update,
the user potentially has a better experience when they return to it.
Over the years, both Android and iOS saw huge updates and
improvements to the ways in which they support multitasking.
Today, both systems support intelligent background monitoring
that can identify regularly used apps and predict when users are
likely to open them. This way, the operating system can ensure
that an app’s information is up to date before the user even
launches it. From the iPad Air 2 onwards, it is also possible to
have two apps running next to each other and to use them
simultaneously.
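
On iOS, an app opts into this pre-emptive updating through the
Background App Refresh APIs. The following Swift sketch shows the
general shape; the FeedStore data layer is a hypothetical stand-in,
and the app’s Info.plist would also need the “fetch” background
mode enabled:

import UIKit

// Hypothetical data layer, included only so the sketch is self-contained.
final class FeedStore {
    static let shared = FeedStore()
    func refresh(completion: @escaping (Bool) -> Void) {
        // Fetch the latest content here and report whether anything changed.
        completion(true)
    }
}

class AppDelegate: UIResponder, UIApplicationDelegate {

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Let the system decide how often to wake the app, based on usage patterns.
        application.setMinimumBackgroundFetchInterval(UIApplication.backgroundFetchIntervalMinimum)
        return true
    }

    // iOS calls this periodically so content is fresh before the user returns.
    func application(_ application: UIApplication,
                     performFetchWithCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void) {
        FeedStore.shared.refresh { updated in
            completionHandler(updated ? .newData : .noData)
        }
    }
}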
Key Considerations for Multitasking:
• Multitasking is a platform-level feature, so you need to ensure
that if users leave your app, they are able to return and pick up
where they left off.
• Multitasking is not recommended for apps where security is a
high priority, such as financial apps. Here you should ensure
that the app logs out as soon as it is closed, or that it closes
after a user-defined amount of time.
Push Notifications
A push notification is a message or alert delivered by a centralised
server to an endpoint device. Push notifications are the alerts
which come through on a lock screen or at the top of a device to
alert the user of news, information or other content from an
application.
Originally, notifications on mobile were limited to native apps.
Android quickly gained support for notifications across its system,
whilst iOS offered support for app notifications with iOS 3.0,
launched in 2009. Notifications helped to encourage app usage
by delivering news alerts or updates, serving as a nudge to get
people to open apps.
Push notifications are one of the best ways to drive engagement
and usage of apps. According to Localytics, a push notification
technology provider, in 2015 users who enabled push
notifications for an app launched it an average of 14.7 times per
month, whereas users who did not enable notifications for the
same app launched it an average of only 5.4 times per month.
Analysis from 2016 shows that push notifications boost app
engagement by 88%.
Yet another study, by Localytics and ResearchNow, identified that
52% of users view push notifications as an annoying distraction.
This is partly down to 35% of notifications being generic
‘broadcast’ blasts rather than contextually relevant messages
tailored to individual users. However, there is no denying that
push notifications are important to apps when done correctly. As
users need to opt into notifications, developers need to
demonstrate the value they will bring as part of the onboarding
process when an app is first installed. Apps that deliver
notifications that users find annoying risk being deleted, or at
least having their notifications turned off.
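
In practice, on iOS 10 the opt-in is requested through the
UserNotifications framework. A minimal Swift sketch, assuming the
request is deferred until the app has had a chance to show its
value:

import UIKit
import UserNotifications

// Ask for permission at a moment that demonstrates value to the user,
// rather than immediately on first launch.
func requestPushPermission() {
    UNUserNotificationCenter.current().requestAuthorization(options: [.alert, .badge, .sound]) { granted, _ in
        guard granted else { return } // the user declined; respect that choice
        DispatchQueue.main.async {
            UIApplication.shared.registerForRemoteNotifications()
        }
    }
}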
Interactive Notifications
Interactive notifications enable users to take pre-defined custom
actions on the notifications they receive. These actions can be
taken from the lock screen or from the notifications bar, without
the user needing to open the app.
As mobile usage was increasing, it became apparent that users
needed easier ways to respond to messages, or take actions on
certain functions which were previously only possible within the
app. This may include adding an article from a news app alert to
read later, responding with a message via a chat app, checking
into a location on apps like Swarm or marking/deleting email
messages.
In 2012, Android added support for interactive notifications, which
enabled users to interact with notifications from the lock screen
or notification bar. This meant that users could respond to, or
complete pre-set actions on notifications that came through.
Interactive notifications meant that users could engage with apps
or respond to messages at a platform level, without having to first
open the app. Two years later, Apple also introduced support for
interactive notifications with iOS 8.
With the current operating system iOS 10, interactive notifications
have been significantly enhanced with the option to link to more
actions and for custom UI to be attached to each notification.
Notifications now enable better functionality and inspire better
engagement with the user, where opening the app is not always
necessary to deliver value. Using interactive notifications is all
about being able to anticipate users’ needs and delivering timely
notifications accordingly.
This brings us back to the concept of Slippy UX. Where a
notification is relevant and, if it calls for action, that action can be
completed with minimal distraction, an interactive push
notification will be welcomed and remain an important part of the
app user experience.
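
On iOS 10 this is done by registering notification categories whose
actions can be completed from the lock screen. A Swift sketch with
illustrative identifiers:

import UserNotifications

// A messaging-style category: the user can reply inline or mark a message
// as read without ever opening the app. Identifiers are illustrative.
func registerMessageCategory() {
    let reply = UNTextInputNotificationAction(identifier: "REPLY",
                                              title: "Reply",
                                              options: [],
                                              textInputButtonTitle: "Send",
                                              textInputPlaceholder: "Message")
    let markRead = UNNotificationAction(identifier: "MARK_READ",
                                        title: "Mark as Read",
                                        options: [])
    let category = UNNotificationCategory(identifier: "MESSAGE",
                                          actions: [reply, markRead],
                                          intentIdentifiers: [],
                                          options: [])
    UNUserNotificationCenter.current().setNotificationCategories([category])
}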
Key Considerations with Notifications:
• Can you enable certain actions for users without them opening
the app? It’s important to consider how using interactive
notifications can empower users, especially when they may
need to quickly complete an action.
• Ensure that you have thought about how notifications will be
used by the app and how they will add value for the user. Often,
delivering information is a key way to let the user decide
whether they want to find out more or just continue with what
they are doing.
Continuity
With iOS 8 in 2014, Apple introduced new features that sought to
improve the experience of moving from one device to another. In
the past, if a user was using an app on their phone but wanted to
continue the experience on their tablet, they would have to
manually navigate to the same place in the app or website on the
other device.
Continuity was created as a seamless way to move from one
device to another. Using a mixture of Wi-Fi and Bluetooth
protocols, Continuity was designed around people being able to
move from one device to another, within the same app or website.
This type of experience means that, for example, within the Mail
app on iOS and OS X, a user can start writing a message on one
device and continue it from another.
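
Under the hood, Handoff is built on NSUserActivity: the originating
device advertises the activity and a nearby signed-in device offers
to continue it. A minimal Swift sketch, with an illustrative
activity type and userInfo key:

import UIKit

// Advertise the in-progress draft so another signed-in device can offer
// to continue it. Activity type and userInfo keys are illustrative.
func startComposeHandoff(on viewController: UIViewController, draftID: String) {
    let activity = NSUserActivity(activityType: "com.example.mail.compose")
    activity.title = "Compose message"
    activity.userInfo = ["draftID": draftID]
    activity.isEligibleForHandoff = true
    viewController.userActivity = activity
    activity.becomeCurrent()
}

// The receiving device restores the draft in the app delegate’s
// application(_:continue:restorationHandler:) method.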
Today, this also means that if a user is looking at an app on their
Apple Watch, they can pick up the experience from their iPhone,
iPad or Mac computer (or any combination of these, with the
exception of picking the experience back up on the Apple Watch
itself). Microsoft also introduced a similar concept, Continuum,
with Windows 10, and Google has similar functionality built into
Chrome for Android devices, helping users to carry on working
on another device with a seamless experience.
Combined, all of these developments are giving rise to slippy
experiences and the omni-channel, omni-device landscape.
This flexibility of moving between devices is part of the user
experience that designers and developers need to consider when
creating apps. It’s not enough to just think about the interface;
you need to think about how the experience will work between
devices and ensure that any possible security concerns are
considered early on.
Wearables
Wearable devices are electronic technologies or computers that
are incorporated into items of clothing and accessories which can
comfortably be worn on the body. These wearable devices can
perform many of the same computing tasks as mobile phones and
laptop computers; however, in some cases, wearable technology
can outperform these handheld devices entirely. Wearable
technology tends to be more sophisticated than handheld
technology on the market today because it can provide sensory
and scanning features not typically seen in mobile and laptop
devices. This includes, for example, biofeedback and tracking of
physiological function.6

6 Source: http://www.wearabledevices.com/what-is-a-wearable-device/
Smartwatches and other wearable devices have been created
around the idea of helping people better manage how they
interact with devices, through alerts and interaction with
smartphones. With the average person looking at their phone 150
times per day (according to a study from Nokia), delivering
glanceable information can help to reduce the number of times
someone needs to pull out or unlock their device.
Outside of notifications, one of the key challenges that designers
need to consider with wearables is how to create smarter ways of
interacting with apps. As a rule, interactions with a wearable are
best measured in seconds. The quicker you can make it for a user
to get the information they need, the better. This requires thinking
about the touchpoints at which someone may need to use their
smartphone or wearable to deliver the quickest interaction
possible.
Though wearables are a relatively new category, adoption of the
technology is rapidly increasing. Our latest Wearables and Mobile
report found that two-thirds of Australians have used a wearable
and that half plan to buy one in the next six months.7

7 http://www.idc.com/getdoc.jsp?containerId=prUS41530816
Key Considerations with Wearables:
• With notifications being one of the key use cases for wearables,
how will you utilise them to provide users with the snippets of
information they need?
• If you’re creating an app for a wearable device, what is its main
focus for a wearable user? The key here is simplicity and focus.
You don’t want users to need to spend too long interacting
with it.
• Does the wearable have any sensors that you can make use of
to enhance the app or experience in any way?
The rise of ‘Slippy UX’
The term ‘Slippy UX’ was first introduced in 2015 by Jake
Zukowski, Assistant Creative Director of Frog Design, the
company founded by famed designer Hartmut Esslinger. Slippy
UX was used to define experiences and interactions that are
invisible and non-distracting to users. Zukowski used the term
Slippy UX during a presentation about design considerations for
in-car entertainment systems. In this scenario, thoughtful design is
required so that an app doesn’t attract unwanted, unnecessary or
unsafe attention with potentially dangerous effects. As a result of
thinking about Slippy UX, Zukowski argued that apps would
increase their usability and user engagement. This type of design
thinking is key when considering the number of current and future
connected devices (such as the smart home, wearables and
connected cars) where much of the experience depends on
minimal engagement or distraction.
Slippy UX can be defined in various ways. Firstly, it denotes the
concept of designing interfaces that display only minimal (i.e.
absolutely essential) information which can be captured in one
glance. Secondly, Slippy UX is contextual: the information being
displayed is displayed contextually, i.e. the app will take into
consideration various user analytics to determine what
information is most useful or necessary to the user at any given
time. Thirdly, where a notification or interaction invites the user to
take action, Slippy UX principles determine that completing the
action requires minimal engagement from the user and that
interactions with the app are always designed to be as seamless
as possible. Thus, the main activity a user is engaging in is
disrupted as little as possible. News apps that use push
notifications to deliver breaking news, without users necessarily
needing to open the app, are well-known early examples of
Slippy UX.
As mentioned earlier, in the early years of app development it was
important to create sticky experiences that helped to keep people
within an app. With technologies such as push and interactive
notifications, as well as background data monitoring becoming
available to developers, it became possible to create smarter
apps. These apps don’t necessarily require constant user
engagement to deliver a contextually relevant experience. Today,
stickiness within certain types of apps has become less important
and the concept of Slippy UX, where an app needs minimal
engagement to function, has become more crucial in delivering
the ultimate user experience.
In many ways, Slippy UX was first consciously introduced with
Windows Phone 7 in 2010, where live app tiles could show
important information that users could glance at. Android had
also previously experimented with similar concepts through
home-screen widgets that let users quickly glance at information
without launching into apps.
Widgets were later added to iOS in iOS 8, with the ability for
developers to create an experience within the Today screen. This
experience has been greatly improved with iOS 10, allowing users
to quickly access widgets without unlocking their phones.
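
For developers, a Today widget is a small view controller that the
system asks to refresh before it is shown. A Swift sketch of the
extension’s entry point; the label and its content are illustrative:

import UIKit
import NotificationCenter

// The entry point of a Today extension. The system calls widgetPerformUpdate
// so the glanceable content is fresh before the user sees it.
class TodayViewController: UIViewController, NCWidgetProviding {

    @IBOutlet private weak var summaryLabel: UILabel!

    func widgetPerformUpdate(completionHandler: @escaping (NCUpdateResult) -> Void) {
        summaryLabel.text = "3 unread messages" // illustrative content
        // Returning .noData instead lets the system skip redrawing the widget.
        completionHandler(.newData)
    }
}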
Apps such as Slack, Outlook and Skype also take advantage of
this type of slippy experience by providing both pre-populated
responses to messages and the ability for users to respond
without opening the app.
Slippy UX goes beyond the interface to focus on the overall
experience of interacting or engaging with an app. It is best
implemented in apps where users may need to be prompted to
take an action, or where a quick text response can be actioned.
As more customer service takes place via apps, this type of
development can help to deliver a more seamless, speedy
customer experience.
“Hey Siri”, “OK, Google”: Virtual Digital Assistants
(VDAs)
A virtual digital assistant (VDA) can come in many forms, from a
relatively basic text-based assistant to a fully-fledged, multimodal
digital assistant. VDAs are automated digital systems that assist
the human user through understanding natural language in
written or spoken form.
When Apple introduced Siri in 2011, it ushered in a new form of
interaction with digital devices. At a platform level, Siri embodied
the Slippy UX concept: users could use voice commands to add
reminders, set alarms, create calendar entries, and send or reply
to messages. With a select group of third-party service providers,
Apple also made it possible for users to complete actions through
Siri at a platform level, the result of those providers collaborating
with Apple to make standard actions directly available from Siri.
A year later, Google Now launched on Android. Google Now went
further than Siri by not just answering questions and being able to
take certain actions, but also by being proactive. Google Now
uses data from across a user's Google Account and some third
party apps to prompt them about meetings that they have or let
them know when to leave for a flight, based on current traffic
conditions or flight changes.
Both these technologies paved the way for a big trend
surrounding contextual computing experiences, where users
engage with apps and services at a platform level. This means the
phone’s assistant can execute commands for third-party
providers, without the user having to launch another application.
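
With iOS 10, Apple formalised this for third parties through the
SiriKit Intents framework, where an app extension handles a whole
category of requests on the app’s behalf. A minimal Swift sketch
for a messaging app; the hand-off to the app’s own service is left
as a comment:

import Intents

// An Intents extension lets Siri complete an action for a third-party app
// at platform level, without the app’s UI ever appearing.
class IntentHandler: INExtension, INSendMessageIntentHandling {

    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // Hand the recipients and content to the app's own messaging service here.
        completion(INSendMessageIntentResponse(code: .success, userActivity: nil))
    }
}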
Contextual Technology: Google Now on Tap, 3D
Touch, Force Touch, Proactive Search and Google
Instant Apps
Contextual computing, also called context-aware computing, is
the use of software and hardware to automatically collect and
analyse data about a device's surroundings in order to present
relevant, actionable information to the end user.
In 2015, Apple and Google both introduced more contextual
technologies to their platforms which can be considered slippy.
Google took Google Now and expanded it to be platform-wide
and part of the user interface. With Marshmallow, released in 2015,
Now on Tap provided a contextual menu that brought up
information or interaction tools within web pages or apps. Now
on Tap works by Android parsing the text content of the screen a
user is looking at, searching for keywords and other information
in order to generate cards, display information, make suggestions
or undertake actions.
With the release of Android Nougat in 2016, Google Now is being
replaced by Google Assistant, which is based on natural language
processing and focuses on the user being able to speak naturally
to software. The assistant uses the phone’s context, including
location, orientation and whether it’s playing music, to determine
what a user is referring to or requesting, and it uses this
information to provide the user with relevant assistance.
For example, if a user receives a message from a friend asking
whether they want to see ‘film a’ or ‘film b’, Google Assistant will
bring up cards with the trailers for each film. In addition to the
trailers, Google Assistant will also bring up cards about the
nearest place to watch the films, with the option to book tickets.
This is Slippy UX in the sense that it dramatically reduces the
number of steps that a user needs to take to complete the action.
It also keeps them within the phone environment they are
currently in and means that they can simply glance at information,
without having to search for it and interrupt their current activity.
Apple, on the other hand, introduced 3D Touch for iOS (and
Force Touch for Mac and Apple Watch, which is essentially the
same technology under a slightly different name) to deliver a
contextual experience to the interaction with apps. With 3D
Touch, users can get context or extra options for apps based on
how hard they press on their screen.
For example, when launching an app, 3D Touch will bring up a
contextual menu that enables users to quickly launch into specific
actions within the app, thus reducing the steps required to
complete an action. Within an app, 3D Touch can be used to
surface limited contextual information, such as a location preview,
contact information or other options. 3D Touch also allows
interaction with notifications from the lock screen, so users can
reply to messages, view maps or check news updates without
ever unlocking their phones.
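
The contextual menu on the app icon is built from home-screen
quick actions, which an app can register dynamically. A Swift
sketch with illustrative identifiers:

import UIKit

// Register a dynamic home-screen quick action, surfaced when the user
// presses firmly on the app icon. The type string is illustrative.
func registerQuickActions() {
    let compose = UIApplicationShortcutItem(
        type: "com.example.app.new-message",
        localizedTitle: "New Message",
        localizedSubtitle: nil,
        icon: UIApplicationShortcutIcon(type: .compose),
        userInfo: nil)
    UIApplication.shared.shortcutItems = [compose]
}

// The chosen action is delivered to the app delegate’s
// application(_:performActionFor:completionHandler:) method.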
Proactive Search for iOS is a system-level search that searches
within apps from the home screen. With Proactive Search, a user
can search for a film and be presented with apps that contain this
film title. It will also bring up podcasts the user has subscribed to
that mention the film’s title, or find the searched term in
messages, top tweets, videos or apps, to name a few examples.
For Proactive Search to work, developers need to have created
apps with Proactive Search in mind to include the searchable
content. This can include apps that a user doesn’t have installed
on their device, enabling them to easily download relevant apps.
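
The usual route for making content searchable is the Core
Spotlight framework, which lets an app add items to the system
index. A Swift sketch with illustrative identifiers and attributes:

import CoreSpotlight
import MobileCoreServices

// Index a piece of in-app content so it surfaces in system-level search.
func indexFilm(id: String, title: String, synopsis: String) {
    let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeText as String)
    attributes.title = title
    attributes.contentDescription = synopsis

    let item = CSSearchableItem(uniqueIdentifier: id,
                                domainIdentifier: "films",
                                attributeSet: attributes)
    CSSearchableIndex.default().indexSearchableItems([item]) { error in
        if let error = error {
            print("Indexing failed: \(error.localizedDescription)")
        }
    }
}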
Recently, Google announced another innovation to improve the
accessibility of apps. At Google I/O 2016, Google announced
Instant Apps to improve the browsing experience in mobile and
help developers get their apps into the hands of users. Android
Instant Apps enables users to click on a URL to load parts of
apps, even if the user doesn’t have them installed. Native apps
will be available within a few seconds and will provide access to
fully functional modules that users can interact with, in the same
way they would have if they had downloaded the app to their
device.
Google has indicated that it will be a fairly simple process to
modularise Android apps for the programme, which will be rolled
out later this year. Instant apps could replace the need for web
pages for some companies, where clicking on a link will deliver
native app functionality instead.
Shifting Mindsets of UX Practitioners
One of the biggest changes in smartphone user experience over
the past eight years has been the amount of activity that can
happen outside of the core app interface. What matters now,
therefore, is the experience across different devices and the
experience between different apps.
Today, it’s important for designers and developers to consider
how users will engage with their apps, regardless of whether they
are actively using them or not. The overall aim, at every stage, is
to consider how to make it easier to complete actions, or to get
users where they need to be as quickly as possible.
Here are the key points to consider:
• Metrics about longevity of time in apps are less important
As mentioned at the beginning of this paper, the average
person now spends more time on their smartphone than any
other device. Combined with the use of tablets, wearables and
PCs, digital devices and mobile OSes have transformed the
ways in which we use, and the kind of people who use,
technology.
Slippy experiences must still be considered for long-form
productivity apps, such as Word, Excel or PowerPoint, or for
gaming, video or news apps that may be used for longer periods
of time. For example, with a productivity app such as Microsoft
Office, a slippy experience may be implemented where there is
a quick action, such as granting a colleague access to edit or
view a document. Similarly, with a magazine app, a slippy
experience may be suggesting an article or update in a
magazine, with the ability to get a brief summary or save it for
later.
Focusing too heavily on metrics around time spent in apps may
mean that the overall user experience suffers as a result. Why
would a user continue to use your app if they can do what they
need to do much more quickly and conveniently in another app?
• Using analytics to deliver the best result
Analytics are an essential tool for designers both pre and post
app creation. Pre-creation, analytics from your website will
reveal which devices people are accessing your services from.
These same analytics will also reveal the activity that takes
place on mobile, often helping to reveal where apps have the
biggest potential to deliver a better, more focused product.
In the absence of analytics, companies can still identify where
apps can deliver value by speaking to people. This should be
thought of as the difference between quantitative (analytics) and
qualitative (speaking to users) research. In the case of speaking
to users, finding out where their sticking points are and the issues
they face can be hugely valuable.
With in-app analytics, it's important to decide what it is you
want to measure. It may be to track how people move around
the app or which features they use more than others.
Alternatively, it may be load times, retention rates, geographical
usage or metrics around crashes. These insights will help with
further iterations and future improvements to the app
experience.
• Background activity is important, but think battery life
When considering the user experience, it’s also important to
consider how your use of background monitoring or other
technologies may impact the battery life of devices. Facebook
has come under a large amount of criticism in recent years due
to the ways in which it uses background activity and the impact
the Facebook app has on the battery life of smartphones.
The Guardian revealed that deleting the Facebook app from a
user’s phone can save 15% of an iPhone’s battery life8, whilst
BGR looked at how deleting the Facebook app on Android
improves not only battery life, but also performance. As the
world’s most used app with the highest levels of engagement,
Facebook is a prominent example of an app that is designed to
be sticky to the point of being addictive.

8 https://www.theguardian.com/technology/2016/feb/08/uninstalling-facebook-app-saves-iphone-battery-life
When developing apps, the planning of the user experience
needs to consider the impact your app will have on a user’s
device. With all mobile OSes now able to show users which apps
are using the most battery, failing to think about this could mean
your app gets deleted.
• Remove as many steps as possible early on
A good user experience is centered around delivering what
users need in as few steps as possible. Too often, companies
can become complacent with designs that look good on the
surface, but take too long for users to reach what they need.
The use of widgets and, more recently, 3D Touch and universal
search for apps means that it’s now easier than ever for users to
launch into specific functionality or easily find what they are
looking for.
At the wireframe stage, it’s important to focus on the
functionality of the app and how it is going to work. User
experience at this stage is more about how the experience
flows, without worrying about aesthetics. This will help you to
see how many steps it will take users to complete certain tasks
and how the app links together.
Saving time with mobile is a key metric that app providers
should be working towards. This can be done by utilising other
technologies, which are discussed in more detail later in this
paper.
Using the camera and other pieces of hardware is a key way to
save time. This is especially true for scenarios where users may
need to get information from a physical object, such as a driving
licence, bank card or piece of paper. Designing around
automating processes and utilising onboard sensors and other
technologies is key to creating smarter interactions.
Technology to Deliver Enhanced UX
Beyond the platform-level improvements that are enabling new
forms of user experience, there are a number of other
technologies that can be utilised to deliver an enhanced user
journey. In each of these cases, it comes down to how developers
and designers integrate the technology to apply it to new types
of user experience.
• Geo-location
Geo-location was one of the defining features of early apps,
reducing the number of steps it took someone to complete
actions. The simple idea of using someone’s location to deliver
the information most relevant to where they are helped people
to quickly understand the value of apps and mobility.
Core to any use of a user’s location is explaining the value to
end users. Use of location is something that users need to
approve when first launching apps. When designing apps,
therefore, it’s important to think about the ways in which location
will be used. For some apps, it will be delivering services such as
weather or local information, for example featuring nearby
businesses. For others, it may be to help users easily check in or
log where they are.
In the app onboarding process, it is generally recommended to
quickly and easily explain how the user’s location will be used.
There have been many reports of apps that have exploited
location permissions for commercial benefit. It is vital that you
only use location if there is a valid, user-centric reason for doing
so.
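
On iOS, for example, the system prompt is triggered by the Core
Location request itself, so the app controls when it appears. A
Swift sketch, assuming a hypothetical “nearby places” feature and
an NSLocationWhenInUseUsageDescription string in Info.plist:

import CoreLocation

// Request location access only when the user first triggers a feature that
// needs it, so the system prompt appears in a context that explains itself.
final class NearbyPlacesFinder: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
    }

    func locationManager(_ manager: CLLocationManager,
                         didChangeAuthorization status: CLAuthorizationStatus) {
        if status == .authorizedWhenInUse {
            manager.requestLocation() // a one-shot fix is enough for "nearby" features
        }
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let location = locations.last else { return }
        // Use the coordinate to fetch nearby businesses, weather and so on.
        print("User is near \(location.coordinate)")
    }

    func locationManager(_ manager: CLLocationManager, didFailWithError error: Error) {
        // Degrade gracefully, e.g. fall back to a manually entered location.
    }
}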
Geo-location can also be used today to deliver suggested apps
to a user's lock screen across both iOS and Android. In some
cases, this can be used to suggest apps that a user doesn't
have installed. The idea here is to make it easier for users to
discover your apps within a contextual environment.
• Beacons
Building on geo-location, Bluetooth beacons can also be
utilised to deliver hyperlocation services, especially when a user
is indoors. First integrated into consumer-facing devices by
Apple with the launch of iOS 7, beacons use Bluetooth low
energy protocols to trigger an in-app action. This may be
something simple, like a notification that delivers information or
encourages a user to open the app, or something more complex,
like automated check-in that happens without the user opening
the app at all.
Core to the beacon experience is the idea that when a user
opens the app, they will be presented with the right content or
buttons, contextual to their hyper-location. For example, if a user
were to enter a meeting room and open a company app, the app
would provide the tools relevant to that room, such as controlling
the lights, projector, screen or other in-room services. If the user
were to move to a different room, the app would change based
on the functionality within that room.
Beacons are also being used today for delivering indoor
location services, similar to the blue dot experience found in
navigation apps.
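
On iOS this is exposed through Core Location’s region monitoring.
A Swift sketch of the meeting-room scenario above, with an
illustrative UUID and identifiers; background monitoring also
requires “always” authorisation and the matching usage string in
Info.plist:

import CoreLocation

// Monitor a beacon region so the app can react when the user walks into a
// specific room. The UUID, major and minor values are illustrative.
final class MeetingRoomMonitor: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.requestAlwaysAuthorization()

        let uuid = UUID(uuidString: "F7826DA6-4FA2-4E98-8024-BC5B71E0893E")!
        let region = CLBeaconRegion(proximityUUID: uuid,
                                    major: 1,   // e.g. the building
                                    minor: 12,  // e.g. meeting room 12
                                    identifier: "meeting-room-12")
        region.notifyOnEntry = true
        region.notifyOnExit = true
        manager.startMonitoring(for: region)
    }

    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        // Present the controls relevant to this room (lights, projector, screen).
    }

    func locationManager(_ manager: CLLocationManager, didExitRegion region: CLRegion) {
        // Swap the room-specific UI back out when the user leaves.
    }
}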
• Connected devices
As with wearables, our mobile devices can now be used to
connect with thousands of other devices and systems. Mobile is
everything today and sits at the heart of the digital ecosystem.
Whether controlling your heating or lighting, acting as an
advanced remote control for your car, smart TV or security
system, or connecting with assets, medical equipment, drones or
any other connected device, thinking about the experience for
end users is key.
Designing around a holistic experience means that today, it’s
about more than just the experience of the app itself. You need
to think about how an app will connect with a device, what
function the app will play, why someone would want to use the
app, where they will be using it, and various other important
considerations.
Conclusion
Mobile user experience goes beyond pure visual design: it is a
consideration of every stage of the user’s journey and interaction
with an app. At the start of projects, it’s important to create user
personas, with developers mapping out a typical user’s day and
imagining what the app they are creating needs to do to help
them. Ultimately, with mobile, you are trying to create experiences
that make someone’s life easier.
With regards to user experience, the aim is to make the app as
easy and simple as possible to use. App developers should look at
how they can remove as much clutter as possible to create a
cohesive experience that users will benefit from. This becomes
especially apparent when the earlier discussed concept of Slippy
UX is considered. Users should be able to interact with an app in
the most effective way possible, whereby simplicity of design and
contextual information serve to determine the kind of interaction
the user has.
The key to user experience is that the user’s journey should feel
like it was always meant to be that way, which is ultimately what
great design is always about. Achieving this is not always easy,
but ultimately, apps that require training indicate that companies
have failed to focus on the user experience.
When developing internal apps, many companies think that they
can save money by speeding up the development process and
not focusing on the user experience. However, if they then need to
train employees on how to use the app, they’ll ultimately end up
spending more on lost productivity and training costs.
Platform- and device-level improvements arrive roughly every
year, helping to further improve the user experience and meaning
that new contextual capabilities must be considered. Apps require
maintenance, updates and improvements over time, as this is the
only way that they can continue to deliver a relevant, productive
and up-to-date experience.
Ease of use, access and convenience have all played roles in the
shift away from desktop towards mobile usage. With millions of
apps available, apps need to stand out to attract and retain users
through optimal functionality combined with intuitive design. An
app that offers a discreet, slippy experience can have equal value
to an app that is sticky. However, as mentioned earlier, the
introduction of fully-functional built-in assistants means that the
trend is towards apps that offer slippy, non-intrusive experiences
for an overall uninterrupted user experience. It’s therefore vital
that companies work with designers who have a deep
understanding of the different platforms and technologies
available. This is what will ultimately lead to better apps that
deliver an intuitive, helpful and enjoyable user experience.
Get in touch
Ansible
[email protected]
+61 (0) 2 8373 2399
www.ansibleww.com.au
Ansible is the world’s most awarded mobile agency.
Ansible helps businesses explore the opportunities for
mobility, establish strategic frameworks and roadmaps,
and deliver an end-to-end design, development and
integration service. Outside of enterprise mobility,
Ansible offers a full range of mobile marketing services.
Mobility Solutions
Strategy | Enterprise | Marketing