Our addiction to digital technology has reached its apex. Already in 2017, addiction to smartphones was out of control, with new terms such as ‘smombie’ or ‘technoference’ emerging to describe the detrimental impact of mobile phones. And it seems the addiction hasn’t gotten any better. The average UK adult spends 2 hours and 34 minutes daily on a smartphone. In the US, this rises to an astonishing 3 hours and 43 minutes. On average, we check our smartphones every 12 minutes, and it’s not just millennial behaviour. Gen X (born 1965–1979) spend more time on their smartphones than those currently in their 20s and 30s, and 76 percent of smartphone-owning baby boomers (born 1946–1964) use the internet ‘at least several times per day’.
And while many are in denial – with four in five smartphone users believing their personal usage is below the national average – the truth is staggering. Our phones are never more than a metre away from us, and every vibration or noise immediately captures our attention. A growing number of users are now considering the harmful impact of mobile, and of social media in particular. Six in ten millennials said reducing time on social media would make them happier, and 64 percent said it would make them physically healthier. This is why the concept of ‘digital wellbeing’ has been gathering pace over the last few years. As a key driver of addictive behaviour, social media has been designed to exploit some of our core psychological needs: to belong, to have a sense of connection, to feel appreciated, to boost our self-esteem. Its addictive properties are based on immediacy, activating the reward pathway in our brains to deliver hits of dopamine and instant gratification, with features such as infinite scroll further deepening the psychological dependency through repeated cycles of uncertainty, anticipation and feedback (a tactic also used by casinos).
There’s a shared responsibility that falls on the companies designing and building apps and user interfaces, a responsibility that also extends to policymakers and users. Pursuing effective digital wellbeing should be the norm, and the shared vision should go beyond narrowly categorising this as ‘reducing screen time’.
Apple’s iOS 12 software update focused strongly on this theme. Google produced informative guides to educate users. And Instagram trialed removing likes to temper the addictive patterns. Ultimately, these features are targeted at boosting productivity, empowering users to manage their screen time and promoting healthier relationships with mobile phones. But these are not enough on their own, and time spent on devices continues to increase.
Digital wellbeing will be exceptionally important for us all to consider: we need to have empathy with our users as we design new products and services. We need to create experiences that, at best, promote a healthy relationship with the digital world and, at worst, do not harm customers’ wellbeing.
We must remember, however, that wellbeing covers a much broader area of everyday life, with digital services having a big impact on the personal, social and mental wellbeing of users. Many popular services have been built using addictive design patterns, playing on our psychological fears and forcing us to always be ‘on’. Dark patterns, for example, are tricks that steer user behaviour towards a certain action, such as buying things we didn’t want, giving an answer we didn’t intend or subscribing to a paid service we didn’t mean to. One that many of us have encountered is ‘forced continuity’, where your card is silently charged the moment a free trial comes to an end.
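The alternative to forced continuity is simple to express in code. The sketch below is illustrative only (the function name, reminder window and return values are our own assumptions, not any platform’s API): a subscription service that refuses to bill unless the user was explicitly warned before the trial ended.

```python
from datetime import date, timedelta

REMINDER_DAYS = 3  # illustrative choice: warn the user three days before the trial ends

def trial_status(trial_end: date, today: date, reminder_sent: bool) -> str:
    """Decide what a subscription service should do on a given day.

    Returns one of:
      'remind' - trial is ending soon and no reminder has gone out yet
      'charge' - trial is over AND the user was warned in advance
      'pause'  - trial is over but the user was never warned, so do NOT bill
      'wait'   - nothing to do yet
    """
    if today < trial_end:
        if today >= trial_end - timedelta(days=REMINDER_DAYS) and not reminder_sent:
            return "remind"
        return "wait"
    # Trial has ended: only bill if the user was explicitly warned beforehand.
    return "charge" if reminder_sent else "pause"
```

The design choice is the last line: charging is gated on the reminder having been sent, so the ‘dark’ path (silent billing) is structurally impossible rather than merely discouraged.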
Building digital products must be about more than simply satisfying needs in our consumerism-driven society. Design practices have often focused too heavily on the consumer output, with UX seen as a means to deliver business value. But the industry should instead create designs that support the wellbeing of users. Governments and brands have often used behavioural economics theories, with practices such as nudges, to steer consumer decision-making towards positive choices. The prevalent use of addictive patterns – primarily exploited to help companies drive usage, boost engagement or revenue – is not acceptable given the collateral damage to society and the individual.
So, how can we – as designers, developers and brands – make things better? Challenge yourself to answer the following questions:
What tools can you implement to support digital wellbeing?
What addictive patterns have you built and how can you deconstruct these?
Lastly, the accountability of businesses storing, processing and moving user data must improve. Data breaches are extremely damaging, yet surprisingly common. It’s clear that data is often not stored securely enough; systems are all too frequently built without the right security in place. Take a look at the ‘Have I Been Pwned’ website to check your own exposure: it will tell you whether your email address has been affected by any known data breaches.
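The email-breach lookup on the site requires an API key, but its companion Pwned Passwords service exposes a free range endpoint built on k-anonymity: only the first five characters of your password’s SHA-1 hash ever leave your machine. A minimal Python sketch (error handling omitted; the endpoint URL is the one the service publicly documents):

```python
import hashlib
import urllib.request

def sha1_prefix_suffix(password):
    """Split the uppercase SHA-1 hex digest into the 5-character prefix
    that is sent to the API and the 35-character suffix matched locally."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

def pwned_count(password):
    """Return how many times a password appears in known breaches (0 = not found)."""
    prefix, suffix = sha1_prefix_suffix(password)
    url = f"https://api.pwnedpasswords.com/range/{prefix}"
    with urllib.request.urlopen(url) as resp:
        body = resp.read().decode("utf-8")
    # The response lists hash suffixes and breach counts, one per line.
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count.strip())
    return 0
```

Because the server only ever sees a five-character hash prefix (shared by hundreds of unrelated passwords), the check reveals nothing useful about the password being queried.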
The continued rise of new technologies such as the Internet of Things (IoT) and facial recognition poses a real threat to cybersecurity; nowadays, anything with a microphone or a camera is a potential risk to digital privacy. Even global brands like Amazon, with a huge product line based around IoT, have been shown to be vulnerable to data breaches and to misuse of the technology in their products. Its security device, Ring, has been subject to leaks of customer information and to hackers taking control of the device.
In an attempt to improve customer experiences, Alexa-based devices have been recording and uploading some of the conversations you have to the cloud, irrespective of the wake command. The biggest red flag is that teams at Amazon analyse parts of these conversations, and none of it is strictly anonymised. If devices cannot be kept secure from the point of manufacture all the way through to the consumer’s hands, vulnerabilities are exposed across the entire network of that device.
It’s no surprise that consumers are becoming more data conscious, with research showing users are now much more concerned with online privacy than they were a year ago (up by 43 percent in the UK). One thing to remember is that the frameworks of digital privacy should be built upon trust and transparency. Delivering real product differentiation, solving real customer pain points and building an open forum with the community that fuels your business are stepping stones towards nurturing digital trust.
As consumers become savvier about their data, we expect 2020 to see greater adoption of regulations and tools to educate users and manage their data. So, the questions you should be asking yourself are:
What steps should you be taking to build transparency and win consumer trust?
How do you move from legal compliance for new data regulation to a pro-consumer stance in your product strategy?
What are your product and customer communications plans in the event of a data breach?
Data has been called the most valuable resource of the 21st century. But as with anything valuable, there is an opportunity for exploitation: the recent case of Travelex being held to ransom by hackers demanding sensitive customer data is a prime example. The amount of digital data being created every minute is accelerating exponentially. Experts predict that, by the end of 2020, there will be 40 times more bytes of data than there are stars in the observable universe. Take Facebook: if you’re a user, even an infrequent one, download your profile to see the magnitude of information collected from your messages, photos, comments and likes. This slightly terrifying exercise portrays very clearly why digital privacy is so important, and why tech giants are being scrutinised for their approach to user privacy and data monetisation.
Without delving any deeper into the landscape where companies too often exploit data for their own business gain, we must remember there is more to the story. Data is a powerful tool and – if collected with consent, protected, and transparently utilised – it can be massively beneficial not just to the company, but also to the user.
Users effectively trade their data in return for convenience, personalisation, better user experiences or time-saving benefits. They typically feel comfortable allowing access to basic personally identifiable information to increase the personalisation of an app or a website, but there’s a limit to what they’re prepared to hand over.
New regulations such as GDPR in Europe and the CCPA in California are now enshrining the fact that personal data collected by businesses belongs to the customer, not the business. This clear mandate that customers own their data (whether they want to or not, it must be stressed) should lead to increased data transparency, something companies must not only be prepared for but should fully embrace. Companies must be clear about why data is being collected, what will be done with it and what customers will get in return.
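One way to make ‘why, what, and what you get back’ explicit is to record consent as structured data rather than a buried checkbox. The field names below are our own illustrative assumptions, not a schema mandated by GDPR or the CCPA:

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    user_id: str
    purpose: str           # why the data is collected
    data_categories: list  # what is collected, e.g. ["email", "location"]
    benefit: str           # what the customer gets in return
    retention_days: int    # how long the data is kept
    granted_at: str        # ISO-8601 timestamp of when consent was given

def grant_consent(user_id, purpose, data_categories, benefit, retention_days):
    """Create an auditable record of exactly what the user agreed to."""
    return ConsentRecord(
        user_id=user_id,
        purpose=purpose,
        data_categories=data_categories,
        benefit=benefit,
        retention_days=retention_days,
        granted_at=datetime.now(timezone.utc).isoformat(),
    )
```

Storing consent this way means the same record can drive both the privacy notice shown to the customer and the data-export response when they exercise their rights.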
In the earlier years of the internet, web protocols evolved to allow users to publish information that could then be found via search engines. Fast forward to today, and anyone can publish content instantly from any device. As the authentication methods available to platforms, publishers and customers are limited, digital trust becomes key. One of the latest AI-based technologies, ‘deepfakes’, has brought digital trust to the forefront of media attention. The technology uses artificial neural networks and machine learning techniques to manipulate existing media of one person and superimpose it over media showing another, creating completely fabricated images, videos and sounds. Naturally, this could be extremely detrimental in political campaigns, for example, where reputation and trust are key to voters’ decision-making. The effect is only amplified in echo chambers: closed systems that reinforce pre-existing beliefs.
Facebook’s AI division recognised the lack of verification methods online to detect deepfakes. It released a statement saying it will remove manipulated videos from the platform, and it has been investing in AI research and development. It also set up a challenge to deliver publicly available technology that can be used to determine the legitimacy of information presented online.
Many of these issues (data breaches, illegal surveillance, lack of transparency and so on) can have a profound effect on a company’s reputation. In a recent Deloitte survey, nine out of ten Americans stated that business transparency is more important today than ever before. The goal for brands is to invest in transparent customer experiences and develop trustworthy relationships, as these can ultimately be more important than the product itself. Privacy, transparency, truth, security, control and ethics are some of the many foundations for building digital trust. A few questions worth considering:
Do you support the accessibility of data across your platforms? And can users easily digest policies and other terms and conditions?
Are you truthful about how your products are presented and do you use explicit language to convey what product or service benefits are gained by submitting information?