Usability Design for Screen Time Apps

The document discusses screen time tracking apps and how they can be improved through design. It conducted research on existing apps, surveyed users, and created prototypes to evaluate design elements and features. The results provide guidelines to improve screen time app design and encourage behavior change.

Uploaded by

Shreya Chipalu
Copyright © All Rights Reserved

Keep your screen happy:
Improving the usability of screen time tracking apps

Milena Pacherazova

Department of Informatics
Master thesis, 30 hp
Human-computer Interaction and Social Media
SPM 2019.18
Abstract
The adoption of technology in our daily activities has increased the time we spend in front
of the screen and changed the way we communicate and work. In recent years, many big
companies have started to develop and implement screen time management tools in their products
to educate users on how to improve their digital health. These tools are an important step
in the process: they raise awareness and help users change their habits. Several studies
have focused on screen time tracking apps, but not from a design perspective. Therefore,
this thesis aims to explore the design of screen time management apps by developing two
prototypes, which were used to evaluate different design elements and features. The results
of this thesis present guidelines on how to improve the design of existing screen time
tracking tools and what additional features could be added to fulfil their aim and encourage
users to change their behaviour.

Keywords: screen time, screen time management, screen time apps, persuasive design,
machine learning, usability, user experience design

1. Introduction
As smartphones become ubiquitous, the time that people spend in front of the screen has
noticeably increased. The rapid development of smartphone applications (apps) extended
their functions, and today they are used as universal tools, from buying a ticket to playing
games. They offer access to huge amounts of information at any time and place and allow their
owner to stay in touch with people from all over the world. They are also helpful for
organising tasks, increasing work efficiency and even improving the users’ health. Last but not
least, they keep people entertained when they feel sad or bored. However, while all this sounds
amazing, smartphones have disadvantages, especially when they are overused, which is easily
achieved while exploring their practically endless possibilities.
Various researchers have investigated the changes in human health due to smartphone usage.
Smartphone overuse increases the risk of inflammation of the hand muscles and is one of the
causes of arthritis (Megna, Napolitano, Patruno & Balato, 2018). The prolonged use of
smartphones and tablets causes neck pain due to tilting the head and taking an unhealthy
posture, which is a frequent habit among both younger and older users. This is currently
coined as “text-neck syndrome”, and according to research by Elnaffar & El Allam (2018),
the popularity of smartphones increases the risk of the syndrome spreading, especially
among children who adopt the bad habit at a very young age. The overuse of smartphones
causes not only physical problems but cognitive and emotional ones as well. According to
various studies, overuse leads to depression and anxiety (Elhai, Levine, Dvorak & Hall,
2016) and is associated with poor sleep quality as well (Demirci, Akgonul & Akpinar, 2015).
The problem is not only addiction to the device itself but also to the apps that are
installed on it. The constant connectivity makes users pay more attention to the smart device
than to everything around them. An analysis by Ding, Xu, Chen & Xu (2016) shows that social
media apps and communication apps such as messaging apps are considered addictive,
especially among college students. Indeed, social media apps are viewed as one of the
predictors of addiction (Salehan & Negahban, 2013). According to Goggin, Lincoln & Robards
(2014), the development of smartphones brings many benefits for social media apps. Thanks
to their mobility and connectivity, users can now access social media almost everywhere.
However, in recent years, many companies have started to develop apps that promote behaviour
change, for example apps related to physical activity and/or dietary behaviour. According
to several studies (Bardus, Van Beurden, Smith & Abraham, 2016; Direito, Pfaeffli Dale,
Shields, Dobson, Whittaker, & Maddison, 2014; Edwards, Lumsden, Rivas, Steed, Edwards,
Thiyagarajan, … Walton, 2016), the most commonly used behaviour change techniques are
asking the user to provide personal details such as height, weight and age (typical for
dietary apps), setting a specific goal, and encouraging them to monitor their progress.
Studies like these suggest that smartphone apps should not be viewed as something inherently
bad, because they can be helpful in many situations, especially if behaviour change
techniques are used appropriately and do not violate users' privacy. Combining them with
a visually appealing design provides a smooth user experience, which maximises the apps’
efficiency (Chhabra, Sharma & Verma, 2018).
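The three techniques above (collecting personal details, goal setting, and progress self-monitoring) can be illustrated with a minimal sketch. The Python below is not taken from any cited app; all class names, fields and values are invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Goal:
    """A self-set target, e.g. 'walk 8000 steps per day' (illustrative)."""
    name: str
    target: float
    unit: str
    progress: float = 0.0

    def log(self, amount: float) -> None:
        """Self-monitoring: the user records progress toward the goal."""
        self.progress += amount

    def completion(self) -> float:
        """Fraction of the goal reached, capped at 1.0 for display."""
        return min(self.progress / self.target, 1.0)

@dataclass
class UserProfile:
    """Personal details of the kind dietary/fitness apps ask for."""
    age: int
    height_cm: float
    weight_kg: float
    goals: list[Goal] = field(default_factory=list)

# Usage: set a goal, then monitor progress against it.
profile = UserProfile(age=30, height_cm=175, weight_kg=70)
steps = Goal("daily steps", target=8000, unit="steps")
profile.goals.append(steps)
steps.log(3000)
steps.log(2500)
print(f"{steps.completion():.0%}")  # 69%
```

The point of the sketch is only that all three techniques reduce to very little mechanism; their persuasive effect comes from how the progress is presented back to the user.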
Unfortunately, the usage of mobile apps requires users to spend considerable time in front of
the screen. Addictive or not, apps compete for attention. This is probably the reason another
group of apps has been developed – screen time1 tracking apps, which are the focus of this
research. Currently, many apps and tools track the time spent in front of the screen. To the
best of my knowledge, several papers investigate this kind of app. However, they emphasise
improving focus (Whittaker, Kalnikaite, Hollis & Guydish, 2016), reducing smartphone
usage (Hiniker, Hong, Kohno & Kientz, 2016; Ko, Yang, Lee, Heizmann, Jeong, Lee, . . . Chung,
2015) and fighting smartphone addiction (Löchtefeld, Böhmer & Ganev, 2013).
None of them focuses on the design, which is an essential factor for the future of screen time
tracking apps (and not only) and for the relationship between humans and their devices. To
address this research gap, additional research is needed. Improving the design of screen time
management tools can attract attention to the problems that those tools
are trying to solve, increase their popularity and educate users on how to take care of their
digital health. Following this line leads to the research question that this study will answer:

How can a screen time management tool be designed and what features could improve its
usability?

To answer the question, this study uses the following approach: first, research on the most
commonly used screen time tracking apps was conducted, in which 10 apps were selected and
tested. Then an online survey was shared among people of various ages and locations to learn
how aware users are of their screen time, whether they consider screen overuse a problem
and how they deal with it. Based on the answers from the survey, several short online
interviews were conducted with users of screen time management tools to investigate their
advantages and disadvantages. This led to the creation of a high-fidelity prototype, which was

1 The term “screen time” refers to the time spent in front of digital devices
with screens. It should not be confused with “Screen Time”, which is the name of an app used
in the text below.

evaluated with potential users. The next step was improving the prototype and evaluating it
again.
The present study addresses the gap mentioned above and explores the challenges of
developing a screen time management tool that supports several different devices, uses
machine learning and is inspired by persuasive technology to promote behaviour change.
The main contribution of the study is a set of guidelines that help designers and researchers
interested in the integration of screen time tracking tools into daily activities.

2. Related research
This section presents an overview of the current state of the literature in relation to the study.
Screen time management apps have not been an object of study by other researchers at least
not in terms of design. Therefore, there is a need for additional research to find out how these
apps can be designed and what features need to be included to improve their usability. First,
the effects of the addictive technology are explored, what makes the users spend time in front
of the screen and how they can protect themselves from falling into this trap. Lastly, several
papers investigating screen time tracking tools will be briefly presented.

Addictive technology and its effects


Mobile devices can be helpful in a wide range of areas. Thanks to the additional
applications available for smartphones, users can experience the benefits of having access to
a powerful tool. Smartphones are always around us, and we use them for almost everything,
from a simple search to communicating with friends and relatives. However, according to
Sherry Turkle, an MIT technology specialist, their usage leads to disconnection from the real
world (Turkle, 2011). Her book “Alone Together” is the third part of her exploration and
research on human life and behaviour in the digital age. She shares the stories of many young
people and adolescents whose social networking and communication happen entirely online via
text messages or emails. Turkle claims that by being constantly connected through mobile devices
and networks, and being continuously distracted by all the information and entertainment that
we have in our hands, we will lose our ability to think. Even though the book was published
in 2011, many of its statements are still applicable today. She also mentions that “with
constant connection comes new anxieties of disconnection” (Turkle, 2011).
Many products and services are designed to be addictive. By “fighting” for the users’
attention, such products prevent the user from concentrating and focusing on the
current task. Many researchers report that while users interact with mobile devices, they
become less productive and more stressed (Kushlev, Proulx & Dunn, 2016; Yuan, Gao & Lindqvist,
2017; De Russis & Monge Roffarello, 2017; Furst, Evans & Roderick, 2018; Stothart, Mitchum,
Yehnert & Enns, 2015). Furst et al. (2018) conducted a mixed-method study that explores the
effects of smartphone usage on the homework of college students in New York City. The results
show that students who use their mobile phones more while doing their homework are
distracted from their assignments (Furst et al., 2018). Smartphones are more than just a
device; for many young people they are the hub that connects everything. If for some reason
they are “disconnected”, they usually experience FoMO, or Fear of Missing Out. It is a term

defined as “a pervasive apprehension that others might be having rewarding experiences from
which one is absent, FoMO is characterised by the desire to stay continually connected with
what others are doing.” (Przybylski, Murayama, DeHaan & Gladwell, 2013). Even though
smartphones offer practically everything to the user, their mere presence harms cognitive
capacity (Ward, Duke, Gneezy & Bos, 2017). Another study (Stothart et al., 2015)
investigates what happens when users receive a notification on their device. The results show
that even if the user does not respond to the received notification, their performance
significantly decreases because they are disturbed by it.
Excessive smartphone usage also has a negative impact on people’s physical health. It is
known that prolonged smartphone usage affects body posture and can lead to changes
in different parts of the spine (cervical and lumbar) due to the forward head position (Jung,
Lee, Kang, Kim & Lee, 2016; Kim, Kang, Kim, Jang & Oh, 2013). Tilting the head into such an
irregular position causes muscle pain in the neck and shoulders as well. According to
Elnaffar & El Allam (2018), there is a high risk of “text-neck syndrome”, especially among
children and teenagers who overuse smartphones and tablets. The syndrome is harmful
to the neuro-musculoskeletal apparatus and already affects people of various ages, in
particular young people (Giansanti, Colombaretti, Simeoni & Maccioni, 2019). Another term
related to this syndrome is “SMS thumb”, which affects the muscles of the hand. Studies show
(Shah & Sheth, 2018) that the repetitive usage of hand-held devices such as smartphones leads
to musculoskeletal disorders, including arthritis (Megna et al., 2018). Other research shows
that exposure to blue light, not limited to smartphones but screens in general, can be
harmful to the retina and is one of the causes of DES, or Digital Eye Strain (Sheppard &
Wolffsohn, 2018).
In the studies mentioned so far, smartphones are considered devices that cause various
kinds of mental and physical issues. However, looked at from another perspective, they have
positive sides as well. Indeed, they can be used as a tool to encourage users to change.
One example is the game Pokémon Go, known as the first augmented reality game.
It motivates players to move in the physical environment and to “catch” imaginary characters
with their smartphones. The game is not promoted as a health app, but studies show that it
encourages players to walk, which brings health benefits (Mccartney, 2016; Howe,
Suharlim, Ueda, Howe, Kawachi & Rimm, 2016). The Health and Fitness categories in the App
Store and Google Play include various apps that promote healthy living and physical
activity. Research shows that the most popular features of those kinds of apps are tracking
progress, setting goals/limitations and self-monitoring (Subramanian, Freivogel, Iyer,
Ratnapradipa, Veenstra & Xie, 2015). Additionally, there is a need for improvements in terms
of features such as experts’ recommendations and advice on how to set limitations the right
way and how to get better and faster results (Schoffman, Turner-McGrievy, Jones & Wilcox,
2013; Pagoto, Jojic & Mann, 2013).

Building the change


The growing discussion around the downside of using smartphones continues with Adam
Alter, who shows the business perspective of behavioural and technology addictions in his book
“Irresistible”. According to the book, the majority of people spend 4 hours per day on their
smartphones, which means that checking email, social media feeds and the like adds up to
around 100 hours per month (Alter, 2017). People are so addicted to their devices that even the
urge to touch the smartphone results in anxiety, depression and FoMO (Elhai et al., 2016;
Hoffner & Lee, 2015). Alter (2017) suggests that there is a need for behavioural architecture,
which means creating surroundings that will help us to thrive. It is indeed difficult to avoid
using technologies and certain apps, email for example, but Alter encourages us to try to reduce
their usage. He also mentions that everything nearby has an impact on our mental health; that
is why, according to different studies, looking at a screen before falling asleep obstructs our
ability to sleep well: “Surround yourself with temptation and you’ll be tempted; remove
temptation from arm’s reach and you’ll find hidden reserves of willpower.” (Alter, 2017).
Additionally, users need to create this behavioural architecture on their own or, as Nir
Eyal explains in his book, echoing a famous quote by Mahatma Gandhi, “build the change
they want to see in the world.” (Eyal & Hoover, 2014). Eyal argues that by understanding how
we get attracted and even addicted to technology, we can break and change the unwanted
habits in our lives. The solution is hidden in the design of the products and services around us.
The book describes the Hook model (figure 1), which is “a simple yet powerful way to help your
customers form habits that connect their problem with your solution”. The model has four parts
– Trigger, Action, Reward and Investment (Eyal & Hoover, 2014) – illustrated and explained
below.

Figure 1. The Hook model.


The Hook model starts with a Trigger, which tells the user what to do next. One of the
most common examples of a Trigger is a button, which we can see on many websites and in
mobile apps. Usually, buttons carry an action word like “Buy”, “Click here”, “Download” or a
well-known symbol like the triangle on the play button. These types of triggers are called
external ones, but there are also internal triggers, which build the long-term habit (Eyal &
Hoover, 2014). The difference is that internal triggers do not contain any information; they are
formed by an association or a memory in the brain and, most importantly, when the user
experiences negative emotions without conscious thought. According to various studies
(Banjanin, Banjanin, Dimitrijevic & Pantic, 2015; Moreno, Breland & Jelenchick, 2015;
Muñoz-Rivas, Fernández & Gámez-Guadix, 2010; Özdemir, Kuzucu & Ak, 2014), people who
suffer from depression use the internet more, to boost their mood. These people are more likely
to become addicted to technology because it offers them relief (Kotikalapudi, Chellappan,
Montgomery, Wunsch & Lutzen, 2012; Larose, Lin & Eastin, 2003), and “Once a technology has
created an association in users’ minds that the product is the solution of choice, they return
on their own, no longer needing prompts from external triggers.” (Eyal & Hoover, 2014).
The next part of the Hook model is the Action; this is the part in which the habitual behaviour
occurs. It is performed in expectation of a reward and draws on the motivation and abilities
of the user. For the Action to work, it needs to be designed simply while at the same time
increasing the user’s motivation. The Action “draws upon the art and science of usability
design to ensure that the user acts the way the designer intends.”2 A simple search bar or a
scroll function are examples of this part of the model.
The third part of the Hook model is the Reward: the user is rewarded, and their problem is
solved. Rewards can be anything that keeps the user’s attention, from a simple image or
experience to a physical product. They activate desire simply by being unknown. Eyal suggests
three types of rewards: the Tribe, the Hunt and the Self (Eyal & Hoover, 2014). Tribe
rewards are also known as social rewards. They make the user come back and look for
more. This is one of the reasons why social media platforms are so popular: they offer the user
this type of reward. Every time the user comes back, the content is different, and there is
always the uncertainty of what other people might have posted or commented on. One
example of a reward of the Hunt is the slot machine. Here the reward is the money
that the user might win, and that is what makes gambling so addictive. The social
media feed is similar: on opening an app with a feed the user might see something interesting,
but if they continue to scroll, the next things they see might be even more interesting. Feeds
motivate the user to keep searching for the next “reward”. The last type of rewards are the
rewards of the Self. They do not come from other people; they make the user feel good just by
being there. They are all about the search for mastery and control. An example of this type of
reward is games. Even if the user plays alone, without other people, reaching the next level or
getting the next achievement makes him or her feel good. For people who do not play games,
rewards of the Self come in another format: checking email and notifications just
because there is an indicator with some number on it, which makes the user want to clear those
numbers away. The purpose of those rewards is to “satisfy users’ needs while leaving them
wanting to reengage with the product.” (Eyal & Hoover, 2014).
The final part of the model is the Investment, which should increase the likelihood that the
user passes through the Hook again and again in the long term. In this part, users are asked
to invest something that will make the products they use better. Here the difference between
physical and digital products is visible: as time passes, physical products lose their value,
while digital ones should do the opposite. An example of this part is the data shared
online: the more data is shared, the better products and services become at providing more
relevant content (Eyal & Hoover, 2014).
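As a toy illustration of the four phases (Eyal's book contains no code, so everything below is an invented sketch, including the investment threshold at which the trigger becomes internal), the cycle can be written out in Python:

```python
import random
from dataclasses import dataclass

@dataclass
class HookCycle:
    """Toy model of the Hook: Trigger -> Action -> Reward -> Investment."""
    investment: float = 0.0          # accumulated stored value (e.g. shared data)
    internal_trigger: bool = False   # habit formed: user returns without prompts

    def trigger(self, external_prompt: bool) -> bool:
        # An external prompt (button, notification) or an internal trigger
        # starts a pass through the cycle.
        return external_prompt or self.internal_trigger

    def action(self) -> None:
        # The simplest possible behaviour, done in anticipation of a reward.
        pass

    def reward(self, rng: random.Random) -> str:
        # Variable rewards (Tribe / Hunt / Self) keep their appeal by
        # being unpredictable, so one is drawn at random here.
        return rng.choice(["tribe", "hunt", "self"])

    def invest(self, amount: float) -> None:
        # Stored value raises the odds of another pass through the Hook.
        self.investment += amount
        if self.investment >= 3:            # invented threshold
            self.internal_trigger = True    # habit: no external prompt needed

hook = HookCycle()
rng = random.Random(0)
for day in range(4):
    if hook.trigger(external_prompt=True):
        hook.action()
        hook.reward(rng)
        hook.invest(1.0)
print(hook.internal_trigger)  # True: the trigger has become internal
```

The sketch makes the model's claim concrete: once enough investment has accumulated, the cycle no longer depends on external triggers at all.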
However, the Hook model is not flawless. According to Filippou, Cheong and Cheong (2016),
it has a limitation – it does not check the effectiveness of a Trigger. The authors also suggest

2 https://www.nirandfar.com/hooked-user-behavior-resources/

that Fogg’s behaviour model can be used to deal with this disadvantage. B. J. Fogg’s behaviour
model requires three components: “a person must have sufficient motivation, sufficient ability,
and an effective trigger” (Fogg, 2009). Fogg visualises the model as a plane formed by two axes
– a vertical one for motivation and a horizontal one for ability. The point where the two axes
meet is marked as low (low motivation and low ability) and their far ends as high, which means
that “high motivation and high ability are typically necessary for a target behaviour to occur.”
(Fogg, 2009). If either the ability or the motivation is low, the trigger will be unsuccessful,
and the system should support the user by turning the lower parameter into a high one
(Figure 2).

Figure 2. Successful and unsuccessful triggers, according to Fogg’s model.
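Fogg's condition – sufficient motivation, sufficient ability, and a trigger occurring at the same moment – can be sketched as a small function. Reducing the motivation/ability plane to a single product score with a threshold is an illustrative simplification of this sketch, not part of Fogg's model:

```python
def behaviour_occurs(motivation: float, ability: float, trigger: bool,
                     activation_threshold: float = 1.0) -> bool:
    """Fogg (2009): a behaviour needs motivation, ability and a trigger
    together. The product score and threshold are invented for this
    sketch; Fogg describes a curve on the motivation/ability plane,
    not a formula."""
    if not trigger:
        return False  # without a trigger, nothing happens at all
    return motivation * ability >= activation_threshold

# High motivation and high ability: the trigger succeeds.
print(behaviour_occurs(motivation=0.9, ability=1.5, trigger=True))   # True
# Low ability: the same trigger fails, so the system should raise
# ability (e.g. simplify the task) rather than fire more triggers.
print(behaviour_occurs(motivation=0.9, ability=0.3, trigger=True))   # False
```

This is exactly the check that Filippou, Cheong and Cheong propose adding to the Hook model: evaluate whether a Trigger can succeed before relying on it.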

Regulating smartphone usage


The discussions around the negative effects of smartphone overuse have led to the emergence of
numerous mobile apps and tools that claim to help users reduce distractions and
limit their smartphone usage. Some companies started to implement features in their
products that inform the user about the time they have spent using a specific service. YouTube,
for example, implemented the “Time Watched” feature in its mobile apps, where users can
see how many minutes they have been watching. The data is collected from all devices
connected to the YouTube account. In this case, the feature is integrated into the app by
default, but there are also stand-alone apps for regulation, as well as various studies
investigating whether those apps help. Apps that simplify the data collection process and
visualise the data are known as personal informatics (PI) tools, and in recent years they have
gained huge popularity (Rapp & Cena, 2016). Collins, Cox, Bird & Cornish-Tresstail (2014)
conducted a series of studies regarding the usage of PI tools for productivity. They used the
tool RescueTime to investigate whether it can increase awareness of social networking site
usage and whether it can encourage users to change their behaviour. The results of the studies
showed that RescueTime presents data which lacks “salience; contextual information;
credibility; and action advice.” (Collins et al., 2014).
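The core of such a PI tool – rolling per-device usage logs into one account-level total, the way "Time Watched" combines data from every signed-in device – can be sketched as follows. The input format (a list of device/app/minutes tuples) is an assumption made for this illustration:

```python
from collections import defaultdict

def aggregate_screen_time(sessions):
    """Sum per-app usage minutes reported by several devices into one
    account-level total per app. `sessions` is a list of
    (device, app, minutes) tuples - an invented input format."""
    per_app = defaultdict(int)
    for device, app, minutes in sessions:
        per_app[app] += minutes
    return dict(per_app)

# Usage: the same app's time on two devices is merged into one figure.
sessions = [
    ("phone",  "YouTube",   42),
    ("tablet", "YouTube",   18),
    ("phone",  "Messaging", 25),
]
totals = aggregate_screen_time(sessions)
print(totals["YouTube"])  # 60
```

Collins et al.'s critique applies precisely to the output of a function like this: a raw total such as 60 minutes lacks salience, context and action advice unless the design adds them.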
Freedom3 is another popular software tool (and mobile app) for blocking specific websites. It
was used in a study by Mark, Iqbal, & Czerwinski (2017), who investigated the effects of
distractions in the workplace on people’s cognitive absorption, productivity, workload and
stress. By removing distractions, productivity and focus did increase, but as reported by the
researchers, “half the participants experienced more stress as well” (Mark et al., 2017).
Another study by Marotta and Acquisti (2017), who also used the same tool, again showed
that limiting access to specific websites does increase productivity.
Other researchers took a different approach – developing and evaluating their own tools.
Whittaker et al. (2016) designed a tool called meTime, which brings awareness to users of
“how they allocate their time across applications”. The study resulted in improved focus, since

3 https://freedom.to

the participants reduced their usage of social media sites and browsing, as well as their total
online activity (Whittaker et al., 2016). NUGU (when No Use is Good Use) is another example
of an app created for a study that also brings awareness. It encourages users to limit phone
usage and to share their limiting information with other users. The results showed that this
feature is “critical in assisting the participants to limit their smartphone use” (Ko et al., 2015).
Another app that also supports self-limiting and is still popular (reviewed in the next section)
is AppDetox, which lets users set rules for the apps they want to use less. The
findings of the research on this app demonstrated that users mainly limit social media
and messaging apps (Löchtefeld et al., 2013). Hiniker et al. (2016) created a standalone mobile
app, called MyTime, that creates a balance between use and non-use of the device. Their
study reports that targeted non-use of a smartphone can help users reach their target
of appropriate usage. Apart from that, the study also demonstrated the advantages of technical
solutions over self-preventive measures such as leaving the phone in another room or
completely removing apps.
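An AppDetox-style self-limiting rule ("use app X at most N minutes per day") can be sketched as a simple check at launch time. This is a hypothetical reimplementation of the idea, not AppDetox's actual code; all names are invented:

```python
from dataclasses import dataclass

@dataclass
class LimitRule:
    """A self-set rule in the spirit of AppDetox: cap an app's daily use."""
    app: str
    max_minutes_per_day: int

def allowed_to_open(app: str, used_today: dict, rules: list) -> bool:
    """Deny launching an app once its daily budget has been spent.
    Apps without a rule are always allowed."""
    for rule in rules:
        if rule.app == app and used_today.get(app, 0) >= rule.max_minutes_per_day:
            return False
    return True

# Usage: the social feed is over budget, messaging is not.
rules = [LimitRule("SocialFeed", 30), LimitRule("Messenger", 60)]
used = {"SocialFeed": 35, "Messenger": 20}
print(allowed_to_open("SocialFeed", used, rules))  # False
print(allowed_to_open("Messenger", used, rules))   # True
```

The design choice worth noting is that the user writes the rules themselves, which is what Löchtefeld et al. observed: the rules people actually create target mainly social media and messaging apps.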
The results of these studies indicate that apps for limiting smartphone use are appreciated by
users (Hiniker et al., 2016) and that users understand the advantages of using their devices
less on a daily basis (Ko et al., 2015). Therefore, this study aims to investigate the existing
apps for smartphone usage regulation, which is strongly connected to limiting screen time.
Details about the study are presented in the next chapter.

3. Research methodology
This section presents the methods used to conduct the study. Following is a brief overview of
the section. The study consists of 2 parts, each of which is explained in detail below. Lastly, the
data analysis method will be explained and the ethical considerations.

Overview of the section


• Part 1: Understanding the users
o Evaluating current apps
o Exploring users’ needs
• Part 2: Design and user experience
o Prototype v1.0.
o Evaluation of Prototype v1.0
o Prototype v2.0.
o Evaluation of Prototype v2.0
• Data analysis method
• Ethical considerations

3.1. Part 1: Understanding the users

3.1.1. Evaluating current apps


Understanding the users’ needs is an important step in any research project. The study began
with research on the most popular mobile applications for screen time tracking. Based on
their ratings and number of downloads in the two app stores – Google Play and the App Store
on iOS – as well as several articles from well-known tech review websites, 10 applications were
selected and evaluated. They were downloaded on two mobile devices (one Android and one
iOS) and their features were explored, including the accuracy of time tracking and their
user interfaces. The evaluation was conducted over one and a half months. Having two
devices to test on allowed two apps to be tested at the same time – one on Android and one on
iOS – and their features and interfaces to be compared. Each application was tested
for about a week of daily usage of the device. Table 1 below presents a summarised
version of the data collected during the evaluation of the existing apps. The full version of the
table can be seen in Appendix A.

Features (grouped as in the original table):
Limits – (1) limit specific apps; (2) limit categories of apps; (3) set limits for time/days;
(4) block apps after time limit ends; (5) opt-out when time limit is reached.
Count – (6) count time for every app; (7) count the time the device is used; (8) pause counting.
Notifications – (9) sends "run out of time" notifications; (10) sends motivational notifications.
Other – (11) asks for location in order to work.

App name       1    2     3     4     5    6    7    8    9    10   11
Screen Time    ✓    ✓     ✓     ✓     ✗    ✓    ✓    ✗    ✗    ✗    ✗
Moment         ✓    Paid  Paid  Paid  ✓    ✓    •    ✗    ✓    ✓    ✓
Mute           ✗    ✗     ✗     ✗     •    ✓    ✗    •    ✓    ✗    ✓
Space          ✗    ✗     ✗     ✗     ✗    ✓    ✗    ✓    ✓    ✗    ✓
Forest         ✗    ✗     N/A   N/A   N/A  ✓    N/A  ✓    ✓    N/A  ✗
Quality time   ✗    ✗     •     N/A   ✓    ✓    ✓    ✗    ✗    N/A  ✗
Antisocial     ✓    •     ✓     ✓     •    ✓    •    ✗    ✗    ✗    ✗
App Detox      ✓    ✗     •     •     ✓    ✗    ✗    ✗    ✗    ✓    ✗

Table 1. Evaluated apps and the availability of important features.

The evaluation tested the availability and reliability of some of the most important features
that a screen time tracking app should have, for example tracking screen time, setting limits
and notifying the user. As seen in the table above, those features are included as main
categories with additional subcategories for more specific functions. Setting a limit is one of
the core features of a screen time management app, and to allow the user additional freedom
it should be available in a variety of ways: for example, limiting an app, limiting a category of
apps, limiting for several minutes/hours, or even setting limits per day. None of the tested
apps offers that wide a range of options. However, to meet the needs of a wider target group,
it should be considered in the future development of screen time tools. Counting the time is
the next important feature and, similar to the previous one, it should count more than just
the total screen time. This feature was included in the evaluation because it brings awareness
to the user. The table includes a subcategory called “Pause counting”, which is perceived as a
way to bypass the counting in case the user wants to use their device a little longer than the
set limit. While it is good for a screen time app to have such a feature, it should be used
responsibly. The next category in the table is “Notifications”, included in the evaluation
because this is one of the most convenient ways for any kind of app to “communicate” with
the user. Again, it should be used responsibly by the designer so as not to bother and annoy
the user, because screen time management apps should reduce screen time, not make the user
check their device regularly. Additional notification settings inside the screen time app would
add flexibility and make it appealing to more users. The last column in the table is called
“Other” and includes features that do not fit any of the other categories, for example asking
the user for their location. After its evaluation, each app was kept installed on the devices,
but all of its features were turned off so as not to interfere with the results of the apps
currently being tested. It turned out that this behaviour is also important and is therefore
included in the extended version of the table, available in Appendix A.
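The limit-and-notify behaviour evaluated above can be sketched as a single decision function. The 80% warning point and the function name are illustrative assumptions, not features observed in any particular tested app:

```python
def check_usage(app: str, used_minutes: int, limit_minutes: int,
                warn_fraction: float = 0.8) -> str:
    """Decide what a screen time tracker should do for one app:
    warn as the limit approaches, then notify (or block) when it is
    reached. The 80% warning point is an invented choice."""
    if used_minutes >= limit_minutes:
        return "limit-reached"   # send a "run out of time" notification / block
    if used_minutes >= warn_fraction * limit_minutes:
        return "warn"            # a gentle heads-up, used sparingly
    return "ok"

# Usage: approaching versus exceeding a 60-minute limit.
print(check_usage("SocialFeed", used_minutes=50, limit_minutes=60))  # warn
print(check_usage("SocialFeed", used_minutes=60, limit_minutes=60))  # limit-reached
```

Keeping the warning stage separate from the blocking stage reflects the responsibility point made above: a tracker should inform the user without itself becoming another source of notification noise.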

3.1.2. Exploring users’ needs


Survey
Based on the problems found during the evaluation of the most popular apps, an online survey
consisting of 10 questions was created (see Appendix B). It consisted of multiple-choice
questions with single or multiple answers, and a matrix question. The survey aimed to check
what features screen time management tools should have, how aware users are of the existence
of those apps, and how popular the apps are. Therefore, the survey was not focused on a
specific target group of users but rather on a broad range. The survey contained a question
about the different ways of dealing with overuse and distractions coming from mobile devices,
which served as inspiration for the researcher to think of new features that could be added to
the concept. As an integral part of any survey, there was also one demographic question, used
to identify the age group of the respondents. The survey was distributed in Slack channels, in
Facebook groups with users from all over the world, and among friends and colleagues. It was
open for one week and received 113 responses.

Short interviews
Initially, the plan was to use only a survey, but the results showed that most of the participants
were not aware of screen time tracking apps and that there was a need to target a specific group
of users, namely users who are using or have used screen time tracking apps. In this phase of
the study, specific problems of the screen time management apps had to be identified so that
they could be solved later. In total, 13 online structured interviews were conducted with
individuals (Table 2). Some interviewees contacted the researcher directly, others were invited
personally based on the criteria mentioned above. Each interview lasted between 5 and 15
minutes and was recorded on a mobile device. The data was transcribed and analysed with
thematic analysis.

Participant   Age range   Screen time app used now   Time used   Screen time app used before
P1            18-25       Screen Time (iOS)          6 months    No
P2            26-32       Screen Time (iOS)          3 months    No
P3            26-32       Screen Time (iOS)          4 months    No
P4            26-32       Screen Time (iOS)          4 months    No
P5            26-32       Screen Time (iOS)          6 months    Yes, Moment
P6            26-32       Screen Time (iOS)          5 months    No
P7            26-32       Digital Wellbeing          4 months    No
P8            26-32       Screen Time (iOS)          4 months    Yes, Space
P9            18-25       Screen Time (iOS)          5 months    No
P10           18-25       Digital Wellbeing          3 months    No
P11           26-32       Space                      1 month     No
P12           18-25       Digital Wellbeing          2 months    No
P13           18-25       Screen Time (iOS)          4 months    Yes, Digital Wellbeing

Table 2. Interviewed participants during part 1 of the study.


Before each interview, the participants were informed about the aim of the study, the recording
of their answers, and their right not to answer a question and/or abort the interview at any
time. To preserve their anonymity, the participants were asked only about their age range
instead of their specific age. Their name, gender, location and other personal information were
not needed for the study, and therefore such data was not collected. The interviewees were
asked 10 open questions (see Appendix B) and encouraged to share their experience with the
apps they use. The questions started with the name of the app and the period for which it had
been used, so that the researcher could judge the accuracy of the data from each participant.
The next group of questions asked what the users liked, disliked and would change about the
app they use. The data from those questions helped the researcher to understand which
features of screen time tracking apps are appreciated by the users and which are not. In order
to understand the effects of screen time tracking apps on users’ behaviour, the participants
were asked about their opinion on long-term usage.

3.2. Part 2: Design and user experience


For this part of the study, two interactive prototypes were created and evaluated. The results
from the first evaluation influenced the creation and development of the second prototype.
The following paragraphs describe both prototypes, their differences, and the evaluation
methods.

3.2.1. Prototype v1.0


The development of the first prototype started with several sketches on paper, which were
transferred into a design tool (Sketch app) and converted into a low-fidelity interactive
prototype with InVision. To make users use their devices less and to build a friendly
relationship between the person and the app, a persuasive design element was used as part of
the user interface of the prototype. It is represented by a simple icon of the user’s device (also
called the mascot below) with eyes and a mouth which express different emotions based on
the user’s screen time usage. The initial state of this icon is a smiley face, which makes it look
happy, and this is where the name of the app comes from - “Happy Screen”. This icon is also
inspired by a digital pet product of the mid-1990s called Tamagotchi, which changed the way
a digital device is perceived by its users. It is one example of the “functional triad” framework,
created by B. J. Fogg, which deals with the different ways users perceive the roles of computer
technologies in their lives (Fogg, 2003). The mascot keeps the interface minimalistic, and by
changing its facial expressions, it is expected to keep users engaged with the app longer, which
in turn will help them to change their habits.
The prototype starts with an onboarding consisting of 3 separate screens, between which
the user can navigate by swiping left or right (Figure 3). The onboarding aims to welcome the
user and present the main idea of the app. In the text below, the word “onboarding” is used to
refer to the process of welcoming new users and making them engage with the app. After the
onboarding, there is a Home screen (Figure 3) with several clickable blocks:
• a mascot, followed by information about the screen time usage of the device today;
• 2 blocks with the number of notifications received and the number of pickups of the
device;
• an Updates and news section with tips and information from the mascot, combined with
its emotional states.
Each of the blocks leads to a different screen with more detailed information and data, some
of it visualised in the form of graphs - widely used in screen time management apps.

Figure 3. Example of an onboarding screen and Home screen of Prototype 1

By using graphs, the user gets a better understanding of large amounts of statistical data, and
this allows them to make comparisons between different days/hours of the day.
The prototype was “connected” to a fake Android “home” screen with links to fake
Facebook and Instagram apps, used during the evaluation. The Facebook and Instagram feeds
were specially designed with the effects that the “Happy Screen” app will “cause” when the
user uses them for too long. The effects used in the prototype are blurred content in the
Facebook feed and a “broken” screen for the Instagram feed. They are active only in the
mentioned apps. In the text below, the word “effects” refers to the state of “Happy Screen”
when a user has set a limit for an app and uses it over the limit. Effects are another feature of
“Happy Screen” that makes the app different from all of the existing ones, which simply block
the usage of a certain app when the user exceeds the set limit, either by preventing them from
opening the app (App Detox) or by showing a white screen with an opt-out option (Screen
Time on iOS). The effects are inspired by a modified version of the Hook model (Eyal &
Hoover, 2014). Instead of keeping the app addictive and increasing the user’s curiosity to open
it again and again, applying effects aims to break the hook. Starting with the Trigger, the user
is prompted to open a certain app; then there is an Action – tapping on the icon – but the next
part of the model, the Reward, is never “received”. After reaching the previously set limit, the
user will still be able to open the app and post content, but they will not be able to read the
content already posted by their friends. In practice, the app will be useless, and it is expected
that the user will not be motivated to use it. In the case of Instagram, the user will still be able
to open it, but when they start to scroll, a picture of broken glass will cover the screen and the
scrolling function will stop working.
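The over-limit logic described above can be expressed as a small sketch: the app is never blocked outright; instead, an effect that degrades the Reward step of the hook is chosen per app category. This is an illustrative Python sketch only; the category names and the EFFECTS mapping are hypothetical and not part of the actual prototype implementation.

```python
# Illustrative sketch of "Happy Screen"-style effects: once an app's set
# limit is exceeded, the app stays usable but its reward (the feed) is
# degraded instead of the app being blocked outright.

# Hypothetical mapping from app category to the effect that breaks the
# "reward" step of the Hook model (trigger -> action -> reward).
EFFECTS = {
    "text": "blur_feed",        # e.g. the Facebook feed in prototype 1
    "photo": "broken_screen",   # e.g. the Instagram feed in prototype 1
}

def effect_for(app_category, used_minutes, limit_minutes):
    """Return the effect to apply, or None while the user is under the limit."""
    if limit_minutes is None or used_minutes <= limit_minutes:
        return None  # no limit set, or still within it: normal usage
    return EFFECTS.get(app_category, "blur_feed")  # default to blurring

print(effect_for("photo", used_minutes=45, limit_minutes=30))  # over limit
print(effect_for("text", used_minutes=10, limit_minutes=30))   # under limit
```

The key design choice mirrored here is that the Action still succeeds (the app opens), so the user is not punished with a hard block; only the Reward is withheld.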

3.2.2. Evaluation of prototype 1


Method
Focus groups were used as the method for the evaluation of the first prototype. The reason
for using focus groups is that the discussion can take unpredictable directions and make the
participants more talkative; they influence each other and build on each other’s contributions.
The prototype was low fidelity, in the hope of motivating the users to talk more and suggest
new ideas for the development of the concept. The feedback from the focus groups helped the
researcher to make a strategic decision about the refinement of the prototype.

Participants
For the evaluation of the first version of the prototype, three focus groups were organised with
a total of 8 participants (Table 3). Group 1 consisted of 3 students from the HCI master
program; Group 2 had 2 participants, both master students in the IT Management field; and
Group 3 again had 3 participants, students from different master programs. Unfortunately,
the data from Group 2 will not be considered due to the fatigued look of the participants and
the vague comments shared during the session. Each session started with a brief introduction
of the tasks and basic questions to each of the participants. Of all participants, only one had
used several screen time tracking apps. Having not achieved any results in reducing her screen
time usage, she was looking for something new and more reliable. The others had not heard
of such apps but admitted that they use social media apps a lot and would like to try a tool that
could help them use those apps less. One of them highlighted her addiction to the device:
according to the Screen Time widget on her iPhone, her average daily screen time was 5 hours.

Participant Focus group Gender Age


P1 Focus group 1 female 27
P2 Focus group 1 female 24
P3 Focus group 1 female 25
P4 Focus group 2 female 24
P5 Focus group 2 female 30
P6 Focus group 3 male 32
P7 Focus group 3 male 25
P8 Focus group 3 female 24

Table 3. Participants in focus groups, who evaluated Prototype 1.


Procedure
Before the evaluation, the participants were informed that the discussion would be recorded
and their opinions would be used for the improvement of the concept. Each session lasted
between 20 and 30 minutes. All participants tested 2 scenarios: (1) first time using the app and
(2) experiencing some of its features (see Appendix C). For the first scenario, the participants
were given about 5 minutes to open the app on their devices and imagine that they had just
installed it. They had to read the texts on the onboarding screens and explore the user interface
of the app. For the second scenario, the participants had to imagine that they had been using
the app for some time and had set limits for Facebook and Instagram. Their task was to open
those apps again to see the effects of “Happy Screen” on apps whose previously set limits had
been exceeded. When opening Facebook, the participants see that the main content is blurred,
but some of the other features, like Stories and posting, are still available. On Instagram, after
several scrolls, the screen of the device is covered by an image that represents a broken screen
and the scroll function is blocked. Examples of those effects can be seen in Appendix C, fig. 1.

3.2.3. Prototype 2
Based on the analysis of the data collected from the evaluation of the first prototype, a second,
improved version was created and evaluated. The improved version contains additional
screens for the onboarding experience, with details on how to do particular tasks. For example,
instead of only 3 screens, the new version has 7 in total: one introducing the app, five with
how-to explanations and one final screen to welcome the user. The Home screen of the app is
also improved. Instead of having 3 big buttons that lead to detailed graphs, there is only 1
(Figure 4). The Notifications and Pickups buttons and their corresponding pages were
removed. In the new version, those parameters are kept as informative numbers because most
of the participants were concerned that the app would collect their notifications and they
might miss an important one.

Figure 4. Comparison between the Home screens of prototype 1 and 2.

The “News & Updates” section is also removed from the main screen and it
is now situated on a separate page – “Notifications” – accessible via a button in the upper right
corner of the “Home” screen. Some participants were concerned that they might accidentally
tap on a notification (which would mark it as read and remove it from the section) without
actually reading it. The information on the detailed screen (called Graphs in this version) is
also simplified. The gradient colours in the graph are replaced with one solid colour because
they were not consistent with the design. The recommended times are also removed because
they confused the users. The limited-apps section is moved to a separate screen, accessible
through the Settings menu, which helps to make the Graphs page less text-heavy (see Appendix
D, fig. 1). Grouping the apps into categories and showing their limits, as well as the time spent
to reach each limit, are now accessible through two icons which can be switched on and off
based on the users’ needs.
The effects that “Happy Screen” applies when the user is over the limit are also improved.
In this version, the blurry effect is applied to apps containing mainly text, such as email apps
(Gmail, Outlook etc.) and text-based social media apps (Twitter, Medium etc.). The “broken
screen” is applied to Facebook and also stops the scrolling function of the feed. According to
Eyal and Hoover (2014), Facebook’s feed is the function that makes the social medium
addictive, because the user does not know what the next post will be and their curiosity keeps
them scrolling endlessly. This is also the reason why many websites and apps have integrated
similar feeds. For social media apps that contain mainly photos and images (Instagram,
Pinterest, Flickr etc.), a new effect is applied – simulating an acrylic painting – which aims to
destroy the users’ pleasure of looking at beautiful images and make them stop using the app
temporarily (Figure 5).

Figure 5. Acrylic painting effect. Evaluated in prototype 2.


In case the user needs to use an app with an “effect” on it, there is a workaround: the user
just needs to open “Happy Screen” again and pause or temporarily exclude a device. This
action requires several steps, in order to make it difficult for the user and encourage them to
change their mind. Two new features are also added to this version of the prototype – changing
a time limit and adding a new device. The concept of supporting more devices than just a
smartphone existed in the first prototype, but it was not fully developed and tested. Because
of the addictive patterns of most mobile apps, users can easily change the medium to continue
their “tasks” in case of a drained battery or poor connection. Therefore, the ability to support
more than just smartphones could be highly appreciated by everyone who realises the
devastating effects of screen overtime and is willing to do something to change their habits.
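The multi-device idea above can be sketched in a few lines: the total shown on the Home screen is the sum over all linked devices, with paused (excluded) devices filtered out. This is a minimal Python sketch under the assumption of a simple per-device minute count; the device names and values are hypothetical.

```python
# Sketch of the multi-device concept: total screen time shown on the Home
# screen is the sum over all linked devices that are not excluded (paused).

def total_screen_time(devices, excluded=()):
    """Sum per-device minutes, skipping devices the user has paused."""
    return sum(minutes for name, minutes in devices.items()
               if name not in excluded)

# Hypothetical linked devices with today's usage in minutes:
usage = {"phone": 180, "pc": 95, "tablet": 40}
print(total_screen_time(usage))                   # all devices
print(total_screen_time(usage, excluded={"pc"}))  # pc paused, acts as a filter
```

Excluding a device here only filters the displayed total, matching the prototype's behaviour of treating a paused device as a filter rather than deleting its data.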

3.2.4. Evaluation of prototype 2


Method
For this evaluation, a moderated usability test was used, in which users had to complete several
tasks using the prototype described above and answer a few questions after each task. To avoid
the technical difficulties experienced during the first evaluation, the participants were asked
to use a specifically prepared device instead of their own. All participants were encouraged to
share their thoughts and questions during the session. The prototype was evaluated with each
participant individually, and each session lasted about 1 hour.

Participants
For the evaluation of this prototype, 4 participants were recruited – 2 males and 2 females,
aged between 28 and 35 (Table 4). Three of the participants work full time in the marketing,
software development and creative fields and use social media apps for several hours daily for
inspiration, work-related tasks or just for fun. The fourth participant is a recent graduate,
actively looking for a job, which also requires checking social media groups and other pages
with job advertisements. They were selected as a potential target group: young people working
full time in various fields. As active users of social media platforms, it was assumed that they
might be addicted without knowing it.

Participant Gender Age Occupation / Work field


P9 female 32 Marketing
P10 male 35 Software development
P11 female 33 Recent graduate
P12 male 28 Creative field

Table 4. Participants who evaluated Prototype 2.


Procedure
Similar to the evaluation of the first prototype, this one also started with a short presentation
by the researcher about the aim of the study and the rights of the participants. The answers of
the participants were recorded on a mobile device, and their progress with the tasks was
observed by the researcher. Each session lasted about one hour. For this evaluation, the
participants had to use a specifically prepared device instead of their own, due to technical
limitations that occurred during the first evaluation.
Each participant had to complete 5 tasks: (1) First time using the app, (2) Get to know the
Home and Graphs screen, (3) Set and delete a limit, (4) Add new device and (5) Effects after
overuse (Appendix F). After each task, there were several follow-up questions.
For the first task, the participants were asked to imagine that they had just installed the app
and had to go through the onboarding and read the texts, in which some hints were hidden for
the next tasks. The follow-up questions were about the functionalities of the app, the user
interface and the texts written on the onboarding screens.
For the next task, each participant had to go to the Home screen, where there was an
indication of a new notification. They had to read the message concerning their privacy and
press a button indicating that they agreed with the text. Part of this task was also to go to the
Graphs screen and understand how the total used time is formed, and to check how long they
had been using each app. During the task, the participants were asked basic questions about
the interface, the functionalities and the information they saw.
The third task was again in two parts. In the first one, the users had to set a limit for an
app, to experience the ease of setting limits when the app supports machine learning and
recommends limits based on app usage. In the second part, the participants had to delete a
limit, whereupon the app asks them if they are sure about this choice. The follow-up questions
concerned the process of setting and deleting limits.

The fourth task was to add a new device (Appendix E, fig 1.). At the beginning of the
evaluation, this feature was briefly mentioned, and during the previous task all participants
guessed at the availability of this feature and were even eager to try it. The users were asked
to imagine that they had a personal computer (PC) on which “Happy Screen” was also installed
and that they had to link the PC to the phone to see the statistics of both of their devices. Part
of this task was also to exclude (pause) an already added device (Appendix E, fig 2.), which
acts as a filter. In this case, the data that the user sees on the Home screen comes only from
the devices that are not excluded.
For the last task in this evaluation, the participants were asked to imagine that they had
used their device too much, over the previously set limits. They then had to try to use some of
the apps again and experience the effects that “Happy Screen” shows over the overused apps.
Depending on the type of app, there were different effects. This task tested the participants’
reactions when they cannot use their devices. Part of the task was also to bypass the restriction
by excluding a device. When they opened “Happy Screen”, they saw the sad faces of the
mascots, which should make them change their decision.

3.3. Data analysis method


Thematic analysis was used as the data analysis method throughout the study. The verbal
communication during the interviews and focus groups was recorded and then transcribed to
identify similarities between the two prototypes tested as well as connections to the data
collected from the online interviews and the survey.
Thematic analysis is known as “the most commonly used method of analysis in qualitative
research” (Guest, MacQueen & Namey, 2012). It is used to uncover both clearly stated ideas
and those which are not expressed directly. Therefore, it requires the researcher to be able to
interpret the data. The first step of the analysis was to transcribe and read through the
recordings from the evaluation. Parts of the texts containing opinions, ideas, suggestions and
questions from the users were coded. Coded data represents small parts of text relevant to the
study, which are used to organise it easily into themes and analyse it later (Clarke and Braun,
2017). When coding the data, it is important to take the context into account. This led to adding
short comments in brackets to properly categorise the data before extracting it from the text.
As a next step, the extracted parts were read again and the ones not relevant to the study were
excluded. According to Clarke and Braun (2017), thematic analysis is a useful method for
analysing data because it is flexible, and this is what distinguishes it from other qualitative
approaches. The analysis resulted in several subgroups, which were combined into two main
groups: data regarding the user interface of the app and data regarding the “effects” of the app.

3.4. Ethical considerations


All participants took part in the study voluntarily. At the beginning of every interview and focus
group session, the participants were verbally informed about the aim of the study and the
recording of the data, as well as their right to cancel the interview at any time or not to answer
a question. They were also assured that the recorded data would be used only for the purposes
of the study and that their anonymity would be fully preserved. The recordings were stored
securely, and after the data was transcribed, all recordings were immediately deleted.

4. Results
This section presents the results of the study. The concept of a mobile application that tracks
screen time and helps the user to reduce it was accepted by all of the participants who took
part in the evaluations. Most of them mentioned that they would use the app if it existed. The
results cover three focus groups with 8 participants in total, evaluating the first prototype, and
four moderated usability testing sessions with four individuals.

4.1. Results from Part 1

4.1.1. Evaluation of the current apps


Evaluating the existing apps revealed various problems in terms of the way they work, their
interaction and their user interfaces. Below is a summary of some of the problems and
limitations found during the evaluation.
As already mentioned, one of the most important features of screen time management apps
is setting limits for other apps. Supporting such a feature is a great way to give users the
freedom to limit whatever they want. However, one of the disadvantages is that it takes a lot
of time, especially if the user has more than 50 apps and wants to set limits for all of them, or
perhaps half of them. Each of the tested apps has its own way of doing this. For example, in
the free version of Moment (iOS), the user can set limits for individual apps but not for a
category, which could be a solution to the time-consuming initial setup. Moment (iOS),
Quality Time (Android) and App Detox (Android) support setting limits for specific times and
days, which is helpful for different types of users and can make the setup of those apps faster
and easier, because a limit set per day usually includes more than one app or even all apps.
None of the tested apps supports all of the ways to set limits. This should, however, be
considered in the future development of screen time tools in order to meet the needs of a wider
target group. One of the identified limitations regarding limits was found during the test of
Quality Time. The app can block the phone function of the device. In case of an emergency,
the user should be able to make and answer calls regardless of their screen time usage.
It is well known that screen time management apps bring awareness to the user; therefore,
they must support a feature such as counting the time for which the device is used. The results
here show that almost all of the evaluated apps have it, apart from App Detox (Android) and
Hold (iOS). Moment supports it differently. The app needs permission to access the photos
folder of the device and sends reminders to the users at the end of every week, asking them to
take a screenshot of the battery usage of the device, which will be “analysed” later. The way
this feature works is entirely inappropriate. The app should be able to get this information
without analysing screenshots. It demands too much attention, and the fact that the user is
required to think and do something to receive a report makes the feature intrusive. Besides,
anyone can open the battery settings and analyse their phone usage without needing to install
a screen time tracking app.

Counting pickups and unlocks is another addition in the counting column, and it is
supported by most of the evaluated apps. The ability to stop or pause the counting could be
considered a way to “cheat”, but in some cases it could be an important feature. Unfortunately,
it is supported by only 3 of the tested apps. Mute has it in its premium version, which also
supports an option to “delete” time that is not supposed to be there – another way to reduce
screen time, even though it is a form of cheating (see Appendix A, fig. 2).
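The pause feature can be sketched as subtracting paused intervals from the tracked total, so that, for example, an important call or a navigation session does not inflate the counted screen time. A minimal sketch, assuming intervals are given in minutes; the values are illustrative, not taken from any of the tested apps.

```python
# Sketch of the "pause counting" idea: time accumulated while the counter
# is paused is subtracted from the tracked total, so legitimate usage
# (e.g. maps, calls) does not inflate the screen time figure.

def counted_time(total_minutes, paused_intervals):
    """Subtract paused (start, end) intervals, in minutes, from the total."""
    paused = sum(end - start for start, end in paused_intervals)
    return max(total_minutes - paused, 0)

# 120 tracked minutes, of which minutes 20-50 were spent paused:
print(counted_time(120, [(20, 50)]))
```

Clamping at zero guards against the "cheating" edge case where a user pauses for longer than the tracked total.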
Notifications are a feature used by many apps to communicate with the user. In screen
time management apps, this feature could be used to inform users that they are running out
of time when they have set a limit, or to motivate them to use their devices less. However, in
some cases (Quality Time) receiving such a notification makes the user unlock the device,
which is then tracked as screen time usage.
One of the problems identified during the evaluation is that some of the apps, especially
those tested on iOS, require the physical location of the device to work. As part of the test, this
setting was changed in the device settings. Moment, Mute and Space instantly sent
notifications asking the user to allow them access to the location, otherwise they could not
work. Location services drain the battery, and in the case of screen time apps this feature
should not be required.
A major problem found in almost all of the apps, especially those running on Android, is
the inaccuracy of the tracked time. Quality Time tracked 10 minutes and 38 seconds for a call
which lasted only 1 minute and 15 seconds. Such a difference makes the user question how
exactly the app works and creates distrust. An example of this case can be seen in Appendix A,
fig. 3. Antisocial also had a tracking problem: sometimes the app worked fine and at other
times it could not track anything.
Another limitation is the user interface of some of the apps. Space, for example, shows a
big empty circle on the “Home” screen (see Appendix A, fig. 4). After several days of usage, it
turned out that this circle symbolises “the space” based on the set time usage; it gets smaller
when the device is used more and even disappears when the limit is reached. Quality Time
also has some strange UI elements, such as the sliders it uses. As shown in Appendix A, fig. 4,
the sliders are too long, and it is not clear whether the user has to actually slide them or just
tap.

4.1.2. Survey and Interviews


The results from the survey show that screen time management apps are not as widely spread
as initially thought. Even though there are numerous articles on the topic, and many existing
services and apps implement some screen time regulation features, users in general are still
not well informed. More than half of the respondents (63 out of 113) chose the answer “I’m
not using/have never used such apps” when asked if they use apps to track their screen time.
As already mentioned, the results from the survey showed that in general people are not
informed about screen time tracking tools, but the study needed more precise data. This led
to the use of another method and the conduct of interviews with a narrowed target group:
people who use or have used a screen time management app.
Most of the interviewed people were using the built-in Screen Time tool in iOS, which has a
clean and minimalistic design. All of the interviewees see potential in screen time management
apps and think they bring awareness of screen time usage, which helps them to learn how to
regulate it. However, some of them mentioned that they need more information and additional
graphs where they could see their progress compared with previous months.
The limited variety in the ways a user can set limits inspired the researcher to think about
all of those different ways to set limits and to include those features in the prototype. During
the evaluation of the existing apps, it was noticeable that setting limits takes users too much
time. This problem was mentioned during the interviews by several users and was pointed out
as one of the features that annoyed them and made them unwilling to use a particular app. A
possible solution to this problem could be the integration of machine learning algorithms that
would make the screen time management app “smarter”, so that it knows in advance which
are the most used apps on a specific device. Machine learning is gaining a lot of popularity,
and in one way or another it is implemented in many of the apps and services that we use
today.
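The "smarter" setup idea can be illustrated without a full machine learning model: a simple heuristic stand-in recommends limits only for the most-used apps, slightly below their average daily usage, so the initial setup no longer requires going through every installed app. The app names, the 20% reduction factor and the top-3 cutoff below are all hypothetical choices made for the sketch.

```python
# Illustrative stand-in for the suggested ML-based limit recommendation:
# rank apps by average daily usage and propose a limit slightly below the
# average for the most-used ones only.

def suggest_limits(daily_usage, top_n=3, reduction=0.8):
    """daily_usage: {app: [minutes per day]} -> {app: suggested limit}."""
    averages = {app: sum(days) / len(days) for app, days in daily_usage.items()}
    most_used = sorted(averages, key=averages.get, reverse=True)[:top_n]
    return {app: round(averages[app] * reduction) for app in most_used}

# Hypothetical usage history over three days, in minutes:
usage = {
    "Instagram": [60, 75, 90],   # average 75
    "Facebook":  [30, 30, 45],   # average 35
    "Maps":      [5, 10, 0],     # average 5, below the cutoff
}
print(suggest_limits(usage, top_n=2))
```

A real implementation would learn from richer signals (time of day, pickups, category), but even this heuristic removes the tedious per-app setup the interviewees complained about.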
When asked what features they would include if they were designing and developing a
screen time management tool, most of the interviewees mentioned support for multiple
devices. Many of them said that when they reach the limit, they either switch to a PC/laptop
or just open the browser and continue. This is particularly true for social media services, which
offer both mobile apps and web versions accessed through a browser. As of today, Screen Time
partly supports a similar feature and includes other iOS devices connected to the same user
account. The upcoming macOS version supports the Screen Time feature on both mobile and
desktop systems; however, it is not yet clear exactly how it will work4. Support for various
devices regardless of their operating system would be a great addition to any screen time
management app.

4.1.3. Summary of the results from part 1


Evaluating the existing apps gave the researcher a deeper grasp of the way they work, as well
as inspiration on what to focus on in the next step. The results from the next two approaches,
the survey and the interviews, (1) show how common screen time tracking apps are among
people from all over the world and (2) identify issues and benefits of screen time tracking apps.
Together, the data from those methods suggests that future screen time management tools
should have a clean and minimalistic design in order to be easy to understand and use. They
should bring awareness to the users by providing additional data, for example monthly or
weekly reports which help the user to compare their progress. Such details could be sent to
the user by notification, for example. The setup process should be simplified and should take
less time. Another welcome feature would be information on how to set limits, especially for
users who have never used a screen time management app before. All of those improvements
can make screen time apps more user-friendly and trustworthy.

4.2. Results from Part 2

4.2.1. Evaluation of Prototype 1


Scenario 1: First time using the app

4 https://blog.macsales.com/50187-macos-catalina-features-using-screen-time/

In general, all participants expressed positive feelings about the prototype when they saw it for
the first time. During the onboarding, the mascot was highly appreciated and one of the
participants commented: “It makes me sympathises more” (P1). However, the design of the
next screens seemed confusing to most of the participants, who experienced difficulties
completing the tasks.
Two of the buttons on the Home screen, Notifications and Pickups, had green arrows
pointing down, which showed a comparison with the same data from the previous day.
However, one of the participants mistook those elements for a drop-down menu, and
according to another they resembled arrows from stock exchange data. Another source of
confusion was the title "Pickups" on one of the buttons. Since almost all of the participants
had not used a screen time management tool before, they did not know what this title meant
or what the button's functions were. One of the participants expressed their opinion about the
Pickups function as follows:
“The pickups will be useless in that case (if people want to limit only apps).” - P3
The results from the focus groups show that the usefulness of this parameter can be
questioned.
Another confusing part of the Home screen was a section called "News&Updates". The
problem in this section was again at the user interface level. According to the participants, this
part of the screen had too many arrows. One participant shared that they were drawn to the
visual elements and did not read the text, which in this case is the more important part: "The
text should be more or less the first thing, not the arrows." (P2). The results from the
discussion after completing this scenario show that this section is useful. However, some
participants were concerned that they might accidentally tap one of the buttons with
checkmarks (which would be perceived as completed) without actually reading the text. All of
the participants took this section seriously and suggested various improvements from both a
user interface and a user experience perspective.
Another confusing part of this scenario was the Graphs screen, which followed the design
of the iOS Screen Time tool. Almost all of the participants in this evaluation were Android
users and did not understand all of the functionalities of this screen. The screen time graph
used gradient colours, combining the data with the pickups and notifications, which were
marked in two different shades of blue. The same shades were used for the recommended
times below the graph. One of the participants expressed their confusion as follows:
“The gradient is not consistent with the design. I am confused between the
recommended and the usage colours.” - P3
The used and recommended times shown below the graph were also not understood by
most of the participants. Some of them preferred to see how much time remained before
reaching a limit, but others mentioned that they would feel somewhat stressed if they saw
only a few minutes left. The results showed that all of the participants were confused by the
"Limited apps" section and did not understand why it showed two numbers. Most of them
suggested that the section should look different and be moved to the Settings, which should be
the place to set limits. Overall, the Graphs screen contained too much text, and one of the
participants expressed their opinion as follows:
“There is a lot of information to process here” - P7

Scenario 2: Experiencing some of Happy Screen’s features
The second scenario of this evaluation presented the effects that a potential user would
experience when using their mobile device over the limits. All participants reacted differently
when they saw the "Blur screen" effect. Due to a technical limitation, the effect was at first
mistaken for a bad internet connection, and some participants did not understand that the
blurred part of the screen was an effect of the prototype. However, the results showed that
most participants recognised the cause of the effect, as is visible from the following quote:
“I think it’s because of the time (the screen got blurred). And the app is mad at
me.”. - P2
The other effect, "Broken screen", was positively received. Almost instantly, the participants
started to compare the two effects. One participant had already broken the physical screen of
their device, so this effect made little difference to their experience. However, both effects
encouraged some participants to ask a lot of questions, such as:
“Are there other options or it’s only blurry/cracked?” - P6
“Can we choose what effect we prefer?” - P2
Such questions express the participants' desire to choose an effect that they like and prefer
to see, which does not correspond to the initial concept of "Happy Screen".
Both effects triggered various opinions and concerns. Some participants were worried that
there was no information about when an effect would appear:
“I think I’ll panic if my phone suddenly just cracked.” - P7
This comment encouraged the other participants in the group to think about possible ways
the upcoming effect could be communicated before it occurs. According to the findings,
such a preventive measure could take the form of a notification, for example. However, a
notification could also cause a distraction and encourage the user to open and read it.
Inspired by another screen time management app, one of the participants suggested:
“When you’re in an app there could be a clock that shows how much time have left
(before the limit).” - P1

4.2.2. Evaluation of Prototype 2


Task 1: First time using the app
As in the evaluation of the first prototype, the participants reacted positively to the concept of
the app. However, as the subject was explored more deeply, various concerns emerged.
Starting with the first task, all of the participants went through the onboarding without any
problems and read the texts on each screen. In this prototype, the onboarding was somewhat
longer and contained more screens than in the first prototype. All participants noted that
the onboarding was too long and too explanatory, as seen in the following quote:
“I think it should be short…it’s like when you play a game and the tutorial is too
long, the risk is high that you won’t play it again and don’t manage to go through
the tutorial.” - P10
Still, it was a shared view among all participants that the onboarding was helpful: they
understood the concept of the app and the way it works. One of the participants noted that at
the beginning she had a lot of questions about the app's functionality, but by the end of the
task she felt well informed about all the details.

Task 2: Get to know the Home and Graphs screen
The second task started with many comments on the titles used in the user interface. According
to almost all of the participants, the title "Total working time" should be changed, because it
reminds them of the hours during which they are at work:
“It sounds like “Working time from 9 to 5” - P9
Part of the task included a privacy notification, which asked the participants to agree to it (by
pressing a button) in order to continue using the app. All of the participants agreed to the text,
although some of them (P11, P12) were concerned about the app's access to their private
emails and chat messages. One of the participants read the description carefully and
admitted that he did not understand how the app could "scan" his content while at the same
time not reading his email:
“After reading this text, I’d probably uninstall the app immediately. I can’t trust
it” - P12
The next part of the task, the Graphs screen, was well received. One of the participants
completed the task with some difficulty because she could not see the icons for the categories
and the limits, which can be considered a design issue: the icons were too thin, and the Limits
icon was unclear to her:
“It should be something else. This symbol doesn’t say that I can see the limits when
I press it” - P9
A screen time management tool needs an understandable user interface with appropriate
icons that correspond to the various actions. Regarding the part with the limits, all
participants agreed that it is useful:
“It is clear for which apps the time limit is reached and for which not” - P11
Using the colour red as a warning sign was highly appreciated. One participant noted that the
limited apps section should be more visible by making it the default state of the Graphs screen.
One of the participants was curious about setting a limit for a whole category, which could
be considered a feature suggestion. However, the results show that some apps can belong to
more than one category, which could cause problems for users who use an app for a purpose
other than its primary one. For example, some users open the YouTube app to listen to music
without watching the videos.

Task 3: Set and delete a limit (Settings screen)


All participants expressed positive thoughts when they opened the Settings screen and saw the
first section - “Devices”:
“So, you can track more than one device? Cooooool!” - P10
“Wow, this app will be… hell!” - P11
“Devices… wooow! Will this app be available for laptops too?” - P12
The next part of the task was to set a limit and to change one that was already set. To help the
user choose the correct option, the call-to-action button was designed differently. One
participant questioned why the text on the button was negative, but the other participants
were amused rather than concerned:
“Ahahaha, that’s nice! It’s like when you see those annoying ads and they tell you
“Buy this”, “Subscribe here” and then you click just to close the ad” - P11

Task 4: Add new device
In contrast to the positive reactions to the "Devices" section, the task of adding a new
device to it was difficult for almost all participants. The issues came partly from the
instructions, which were not entirely clear even though they were ordered and separated into
four points. To add/link a device, the user first had to install the app on the other device
(a desktop PC was used as an example), open the Settings screen there and enter the PC's
Happy code into a field on the phone. This process was too complicated, and each participant
had a different idea of how the linking would work.
The next part of the task was to pause the phone, so that only the data from the other device
would be visible. The app offered two ways to achieve this: (1) through the Settings screen
and (2) by tapping the mascots on the Home screen. Instructions for this "special" feature
were also included in the onboarding. The results showed that half of the participants (P9 and
P10) did not read all of the texts during the onboarding and accordingly chose the first way
of pausing. Their comments, however, were valid:
“But those devices look like an image. There should be something like a comics
bubble to remind me that I can click on them. Now it’s not clear that they are
clickable.” - P9
“I don’t get that part… So, I have to click on one of the devices?… The idea to tap
on the devices is cool but the graphic doesn’t look like something that I can click
on. As a [profession] I see that kind of graphics all the time and I’d never click on
them. Maybe if they are not overlapping and there is a “plus” sign somewhere… I
don’t know, now they are just a normal picture for me.” - P10

Task 5: Effects after overuse


For the last task, the participants commented on the effects they would experience if they
used their devices for too long during the day. They all reacted with amazement when they saw
the blurred Messenger chat. One of them (P11) curiously tried to write a message and saw a
popup asking her to call her friend instead of typing and looking at the screen. The results
showed that the message in this popup might not be appropriate or valid for many young
people, who prefer typing to calling or, just as in the quote below, live in different time zones:
“What if you and your friend are in different time zones? I can’t call all of my
friends because it might be night where they live and the same goes for them.
Calling is allowed only for the most urgent situations.” - P11
The new effect evaluated in this prototype, an acrylic painting simulation, caused very positive
reactions, visible from the quote below:
“For a moment I thought “This is a nice photo” and then I saw that all pics have
this effect. I like that.” - P11
One of the participants (P12) expressed his concern that this effect might not be appropriate
for some people: it would make him curious to see the original image and, consequently, to
pause Happy Screen to see it. He suggested that instead of applying effects to the images,
they should be replaced with a boring static image, which would remove his temptation to
pause the app.
The "Broken screen" effect on the Facebook feed was not received with the same
enthusiasm. One of the participants (P11) wondered whether the effect would be visible if she
opened Facebook through a browser. It is common practice for people working in big
companies to bypass the restrictions on the local network and visit social media websites by
using VPN services, incognito browser modes and so on. As for her smartphone, she (P11)
mentioned that she tries to reduce her Facebook usage and has deleted the app from her
phone; now she opens it only from the browser:
“Facebook is the most toxic thing you can have on your phone” - P11
Before the end of the evaluation, all participants were encouraged to share their general
thoughts about the app, as well as suggestions, comments and questions. The results show that
they all liked the concept of the app and felt it would help them track their screen time. One of
the participants (P12) expressed his opinion about the effects and his concern that they might
not help him:
“They (the effects) are just technical obstacles, which I can overcome” - P12
According to the results, apps that use artificial intelligence are generally not well accepted.
Some users (P12) prefer to have more control and to decide what kind of content the app can
"read"; others admitted that they felt scared:
“I’m scared that someone is watching my every move. The AI part is the thing that
will make me not using the app. But it looks great and I’d install it just to see if it
works as it should be”. - P12
One suggested solution to this problem was to assure the users that their data is stored
anonymously. However, such a feature needs further testing. If the app reaches a
development phase, the artificial intelligence should either be removed or replaced with
another method that simulates it without actually reading the users' content.

4.2.3. Summary of the results from part 2


The results from the first evaluation identified several problems in terms of design and user
experience, which were taken into account when the second prototype was developed. The
main points from this part are summarised below:
• The findings pointed towards short and interactive onboarding instead of long
explanations;
• Various kinds of information should be presented in a visually appealing way while
keeping the app consistent and minimalistic;
• A screen time management tool should take into account that people own and use
more than one device, and it should be able to support various devices;
• The process of adding/linking more devices should be straightforward, with clear
explanations, and as short as possible;
• Adding effects over existing apps to make their content useless should be carefully
considered, as it might not work for all users.
Finally, the mascot was highly appreciated by everyone, although for one participant (P9)
the PC looked too depressed. Another (P10) noted that he would download the app because
of the mascot, which he absolutely adored.

5. Discussion
Screen time tracking apps are still gaining popularity among both users and researchers. As
mentioned before, very few studies have focused on the effects this kind of app offers, and at
the time this study was conducted, none of them examined the design. Therefore, this study
aims to fill this research gap. This thesis presents guidelines on how the design can be
improved from a user experience point of view; they can be used for the development of new
screen time apps and the improvement of existing ones. In this section, the findings and
possible interpretations of the collected data are discussed, based on the results described in
the previous section.
For this study, the researcher created and evaluated two prototypes, by using the research
through design (RtD) approach. RtD generates knowledge by utilizing methods and processes
from design practice (Zimmerman, Forlizzi & Evenson, 2007). In the book “Design research
through practice” (Koskinen, Zimmerman, Binder, Redström & Wensveen, 2011), the authors
explain the RtD practices used in the interaction design research community by relating them
to the Lab, Field, and Showroom. The Lab practice "focuses on creating novel and much more
aesthetically appealing ways for people to interact with things” (Koskinen et al., 2011), which
is one of the contributions of my study. The Field practice outlines a problem and offers design
solutions that can solve it, such as the “effects” included in the prototypes. The Showroom
practice is used to “design provocative things that challenge the status quo” (Koskinen et al.,
2011). In the study, this practice is used to challenge the users by including machine learning
features and evaluate their reactions.

5.1. Design contributions


The most popular apps used today are designed to include elements and features that make
them addictive. Screen time apps can help by bringing awareness to those users who realise
that their overuse of screen time is becoming problematic. One of the problems of screen
time management apps identified during the study is that they are all designed in more or
less the same way. Any app can visualise data in colourful graphs, but the results show that
this alone will not help users or trigger them to change the way they use their devices.
Bringing awareness is only part of the solution. That is why "Happy Screen" was used
to evaluate potential improvements related to the design of screen time management apps.
One of these improvements is breaking the addictiveness. Instead of just blocking apps when
the user reaches the set limit or showing a white screen, Happy Screen disrupts the experience
of the other apps by making their content useless. As already mentioned, its effects break the
Hook model by preventing the user from receiving their Reward. According to Löchtefeld et
al. (2013), the most restricted apps are social networking and messaging apps, which was the
reason to simulate the "effects" of "Happy Screen" on such apps in the prototype. During the
first evaluation, this approach was misunderstood, and some participants asked whether
"Happy Screen" blocks only social media apps. The mistake was corrected for the second
evaluation, where it was clear to the participants that the effects can be applied to any app
with a set time limit. None of the existing apps currently offers such a feature. The results
of this study show that a sudden change could be disturbing for some users and could motivate
them to stop using screen time tracking tools, or to "ignore" the limits. All of the participants in
the second evaluation appreciated the effects, especially the Art effect on Instagram. Instead
of just showing white screens and popups that stop the user from opening an app, interaction
designers would do better to let the user continue and make them smile, or even laugh, at
the effects:
“This effect looks funny! [smiles and continues to scroll] Oh…all photos look like
this [smiles again].” – P1
Another problem that "Happy Screen" tries to solve through improved design is seriousness.
The existing screen time tracking apps are designed to be perceived as "serious". They
visualise the screen time data in a variety of ways, but so far none of them offers a persuasive
element; even those with gamification features, like Forest and Hold, become tedious for the
user after a few days of use and are eventually deleted from the device. The concept of
"Happy Screen" includes such an element and aims to develop a friendly relationship with the
user, which might help them to change. The mascot was inspired by B. J. Fogg's "functional
triad" framework (Fogg, 2003) and, just like a Tamagotchi, it can have different facial
expressions based on the screen time used so far. Surprisingly, it received very positive
comments during both evaluations, visible from the quotations below:
“I like that the phone icon is always different. It’s cool!” – P2
“The icon is friendly, and it looks cute.” – P4

5.2. Usability
The second contribution of this study is related to usability. According to Nielsen (2012),
usability is "a quality attribute that assesses how easy user interfaces are to use". Usually,
usability evaluations are conducted in laboratories and require complex procedures (Genc-
Nayebi & Abran, 2018). This study did not measure all of the dimensions of usability
(learnability, efficiency, memorability, errors and satisfaction (Nielsen, 2012)) for screen time
management apps. Instead, the researcher mapped them to the collected data and drew
conclusions from there. The existing screen time tracking tools work in almost the same way:
they offer little beyond collecting and visualising data, and their long-term success in terms
of behaviour change is unknown. The problems identified in the tested apps, such as confusing
interface elements, make them unfriendly and even repulsive, and would most likely cause
them to fail if their usability were measured in a laboratory. This aspect is unhelpful for the
future vision of screen time apps. Therefore, through specific tasks in the evaluations of
"Happy Screen", the researcher incorporated new features that might improve usability, for
example by evaluating how users react to features such as machine learning.

5.3. Machine learning and privacy concerns


According to research by Dove, Halskov, Forlizzi and Zimmerman (2017), machine learning
(ML) is already used in the HCI field to increase the interaction possibilities of products
and to reduce users' effort by automating some tasks. The results of another study
(Hiniker et al., 2016) demonstrate the variety of users' needs in terms of limiting apps,
which motivated me to include different ways to set limits. In this context, machine
learning features could be helpful and could reduce the time spent setting limits manually, as in all
of the existing apps. The concept of "Happy Screen" includes a "Recommended limits" option,
an automated feature that recommends time limits for every app based on the user's screen
time usage. It would have been beneficial for the study to include at least two participants who
had used screen time limiting apps before; they would likely have appreciated the amount of
time saved when setting limits. However, during the evaluations this feature did not receive
many comments. A possible explanation could be the choice of participants and the fact that
they had not used such apps before.
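Although the thesis does not specify the algorithm behind "Recommended limits", one plausible sketch is to suggest a limit slightly below the user's recent average daily usage of an app. The 10% reduction and the one-week window below are purely illustrative assumptions, not the prototype's actual rules:

```python
def recommend_limit(daily_minutes, reduction=0.10):
    """Recommend a daily limit (minutes) from recent per-day usage of one app.

    The limit is set slightly below the recent average, nudging the user
    to reduce usage gradually rather than imposing an arbitrary cut-off.
    """
    if not daily_minutes:
        return 0
    average = sum(daily_minutes) / len(daily_minutes)
    return round(average * (1 - reduction))

# One week of daily usage for a single app, in minutes
print(recommend_limit([60, 75, 50, 90, 65, 70, 80]))  # 63
```

A production feature would likely replace this heuristic with a learned model, but even a simple rule like this one removes the manual setup step that the interviewees found tedious.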
Machine learning offers many advantages for future mobile apps, but it also raises a lot of
questions. The results from the current study show that users are not yet ready to accept
machine learning as part of their mobile experience. On the one hand, this could be due to the
limitations of the prototype. Studies show that incorporating machine learning into a
prototype is a difficult task, which requires a new way of prototyping with data that changes
over time (Dove et al., 2017). Besides, since the data is dynamic, the outcomes can also be
unpredictable: "It is hard to effectively imagine what the experience will be, or the likely
performance errors until the system is built; therefore, making it difficult to assess potential
value versus "creepiness".", as Dove et al. (2017) note. Building such a system can allow
designers to evaluate concepts and fix potential problems. However, this process takes much
more time than the traditional UX process (Yang, Scuito, Zimmerman, Forlizzi & Steinfeld, 2018).
Although the screen time tracking tools evaluated in existing studies (Rooksby, Asadzadeh,
Rost, Morrison & Chalmers, 2016; Mehrotra, Pejovic, Vermeulen, Hendley & Musolesi, 2016;
Hiniker et al., 2016; Whittaker et al., 2016; Ko et al., 2015; Löchtefeld et al., 2013) do not have
machine learning features, they show significantly better results because the researchers used
apps built specifically for a particular medium. Evaluating with a real app allows researchers
to gather large amounts of quantitative data over longer evaluation periods, which leads to
better results. Another advantage of using a real app is that the participants in the study are
in a real situation and act as they normally do, without having to imagine a certain scenario.
It is sometimes difficult for participants to imagine themselves in a specific situation, and the
data received in that case will differ from data collected in a real environment. It is possible
to hypothesise that the problems mentioned in the previous section are less likely to occur
when testing with an already developed tool. Unfortunately, the time frame of my study and
my lack of programming skills did not allow for testing with a real application.
On the other hand, the problem could be connected with the privacy scandals of recent years,
which showed that users do not have control over their private data. This has made them more
cautious about what they share online and who has access to their data. Research findings show
that one of the measures people took after these events was to stop downloading and using
specific apps in order to protect their private data (Brandtzaeg, Pultier & Moen, 2018). Further
research should be conducted to investigate possible solutions and ways to incorporate
machine learning into mobile apps without affecting users' privacy.

5.4. Breaking the addictiveness


As mentioned, many apps use different methods and techniques to make users addicted.
However, instead of being used to gain profit, those methods can be applied differently and
become part of the solution by changing the user mentality. Addiction to apps and devices is
a serious problem, and extensive research is needed in this area to find out what is causing it
and how to solve it. It might turn out that screen time tracking apps are part of the solution to
this problem. The existing ones offer some great features, but to bring about real behaviour
change and help their users, their design and features should be improved. The concept of
"Happy Screen" should not be considered a universal solution; it needs further research to
assess the presented design solution in the long term, to determine what features should be
added or changed to attract more users, and to understand how it can be improved as users'
behaviour patterns change over time. This study outlines some of the features that could be
included in the future development of similar concepts, or that could be helpful for upgrading
the existing screen time tracking tools.

5.5. Limitations
Like every study, this one had several limitations. Probably the most common limitation in
this type of study is time. Due to time constraints, the prototype was not evaluated a third
time, as initially planned. Having more time for testing would have allowed the researcher to
include more participants and to gain a broader view of the concept and its potential problems.
The number of participants was not sufficient for this type of study. Due to a limited budget,
all of the participants in the first evaluation of the prototype were students, and participants
of this type tend to express a positive opinion about the prototype in order not to hurt the
interviewer's feelings. This limitation was partly resolved for the second evaluation, when
non-students were recruited.
Another limitation was the prototyping tool. During the first evaluation, the participants
used their own devices and experienced various technical problems. First, the user interface
looked different because of the different screen sizes. Second, when the prototyping tool
generates a link for sharing the prototype, it opens in the device's browser, and some of the
features included in "Happy Screen" could not be tested properly. For example, during the
"Broken screen" effect the participants should be prevented from scrolling, but when the
prototype is opened through a browser this feature of the prototyping tool is not supported.
There were no issues with scrolling when the prototype was opened with the accompanying
app, provided by the prototyping tool for previewing and testing prototypes. However, the
participants were warned that there might be some minor problems, which changed their
perception and resulted in them looking for problems instead of actually testing. For example,
the blur effect was mistaken for one of these problems by several participants, which caused
misunderstanding of the concept. To avoid those issues, for the second evaluation the
prototype was designed specifically for one device (iOS) and the participants were asked to
use it; the prototype ran in the mobile app provided by InVision. Some of the participants who
use the Android system experienced difficulties in completing some of the tasks.
Additional issues arise from the concept itself: for the participants to test and experience
"Happy Screen" in a real environment, it requires longer evaluations and data collected by
the app itself. This was another major limitation of the study. Mobile applications that use
machine learning and claim to "adapt" to the user's habits require time and budget to be built
and evaluated. Even if not all of the features are included, evaluating with a real app offers
more accurate data and eliminates issues such as incompatibility with different screen sizes.

6. Conclusion
The purpose of this study was to investigate some of the existing screen time tracking mobile
apps and to explore the possibilities of improving their design by including new features, which
could increase their popularity among the users and trigger a behaviour change – reduce the
screen time. To conduct the study, first, a survey was created and used to improve the
understanding of the researcher about how aware people are about their screen time usage.
During the study ten of the most popular screen time tracking apps were downloaded and
evaluated from the App Store and Google Play store. Contrary to their popularity, the results
from the survey showed that most of the participants are not using and/or have never used
screen time tracking apps. This caused a change in the initial plan of the study and the inclusion
of another step – online interviews targeting only users of screen time apps. The data collected
at this stage of the study brought insights about the popularity of those apps and some of the
problems that the users experience. As a next step, a low-fidelity prototype was created and
evaluated. The results from the evaluation were used to improve the prototype and develop
new features which were evaluated with a second prototype, created as a next step of the study.
Both evaluations showed that the users approved of the concept. In the future, the
research on those apps could be continued by developing and evaluating additional features
and improvements of the existing ones.
Although the study was conducted by a single researcher, it managed to contribute
knowledge on the primary features that could be included in screen time management
tools. This study acts as a basic guideline and could serve as a starting point for in-depth
research on improving the design of screen time management tools, which will help the users
to keep their screens happy.

“So, what’s the solution? We can’t abandon technology, nor should we.”
(Alter, 2017)

There is a better way.
Acknowledgements
I would like to thank everyone who supported me and my work during the past months. First
and most importantly, to my supervisor – Fatemeh Moradi for the great support and all
inspirational meetings. Thank you for guiding me and giving me hope throughout the thesis
process. To everyone who took part in the interviews, focus groups and user tests. This thesis
would not be possible without you. To all my friends from the HCI program, thank you for the
amazing two years together. And finally, to my family, for being always by my side and
supporting me during my studies.
Thank you!
References
Alter, A. (2017). Irresistible: The rise of addictive technology and the business of keeping us
hooked. Penguin.
Banjanin, Nikolina, Banjanin, Nikola, Dimitrijevic, Ivan, & Pantic, Igor. (2015). Relationship
between internet use and depression: Focus on physiological mood oscillations, social
networking and online addictive behavior. Computers in Human Behavior, 43(C), 308-
312.
Bardus, M., Van Beurden, S., Smith, J., & Abraham, C. (2016). A review and content analysis
of engagement, functionality, aesthetics, information quality, and change techniques in
the most popular commercial apps for weight management. The International Journal
of Behavioral Nutrition and Physical Activity, 13(35), 35.
Brandtzaeg, P., Pultier, A., & Moen, G. (2018). Losing Control to Data-Hungry Apps: A Mixed-
Methods Approach to Mobile App Privacy. Social Science Computer Review,
089443931877770.
Chhabra, H. S., Sharma, Sunil, & Verma, Shalini. (2018). Smartphone app in self-management
of chronic low back pain: A randomized controlled trial. European Spine Journal,
27(11), 2862-2874.
Clarke, V., & Braun, V. (2017). Thematic analysis. The Journal of Positive Psychology, 12(3),
297-298.
Collins, E., Cox, A., Bird, J., & Cornish-Tresstail, C. (2014). Barriers to engagement with a
personal informatics productivity tool. Proceedings of the 26th Australian Computer-
Human Interaction Conference on Designing Futures, 370-379.
De Russis, L., & Monge Roffarello, A. (2017). On the Benefit of Adding User Preferences to
Notification Delivery. Proceedings of the 2017 CHI Conference Extended Abstracts on
Human Factors in Computing Systems, 127655, 1561-1568.
Demirci, Kadir, Akgonul, Mehmet, & Akpinar, Abdullah. (2015). Relationship of smartphone
use severity with sleep quality, depression, and anxiety in university students. Journal
of Behavioral Addictions, 4(2), 85-92.
Ding, Xiang, Xu, Jing, Chen, Guanling, & Xu, Chenren. (2016). Beyond Smartphone Overuse:
Identifying Addictive Mobile Apps. Proceedings of the 2016 CHI Conference Extended
Abstracts on Human Factors in Computing Systems, 07-12, 2821-2828.
Direito, Artur, Pfaeffli Dale, Leila, Shields, Emma, Dobson, Rosie, Whittaker, Robyn, &
Maddison, Ralph. (2014). Do physical activity and dietary smartphone applications
incorporate evidence-based behaviour change techniques? BMC Public Health, 14(1),
646.
Dove, G., Halskov, K., Forlizzi, J., & Zimmerman, J. (2017). UX Design Innovation: Challenges
for Working with Machine Learning as a Design Material. Proceedings of the 2017 CHI
Conference on Human Factors in Computing Systems, 2017, 278-288.
Edwards, E A, Lumsden, J, Rivas, C, Steed, L, Edwards, L A, Thiyagarajan, A, . . . Walton, R T.
(2016). Gamification for health promotion: Systematic review of behaviour change
techniques in smartphone apps. BMJ Open, 6(10), E012447.
Elhai, J. D., Levine, J. C., Dvorak, R. J., & Hall, B. (2016). Fear of missing out, need for touch,
anxiety and depression are related to problematic smartphone use. Computers in
Human Behavior, 63, 509-516.
Elnaffar, S., & El Allam, A. (2018). An app approach to correcting the posture of smartphone
users. 2018 Advances in Science and Engineering Technology International
Conferences (ASET), 1-4.
Eyal, N., & Hoover, R. (2014). Hooked: How to build habit-forming products. Penguin UK.

Filippou, Justin, Cheong, Christopher, & Cheong, France. (2016). Combining The Fogg
Behavioural Model And Hook Model To Design Features In A Persuasive App To
Improve Study Habits. ArXiv.org, ArXiv.org, Jun 11, 2016.
Fogg, B. J. (2009). A behavior model for persuasive design. Proceedings of the 4th
International Conference on Persuasive Technology, 350, 1-7.
Fogg, B. J. (2003). Persuasive technology: Using computers to change what we think and do.
Retrieved from https://ebookcentral.proquest.com
Furst, R., Evans, T., & Roderick, D. (2018). Frequency of College Student Smartphone Use:
Impact on Classroom Homework Assignments. Journal of Technology in Behavioral
Science, 3(2), 49-57.
Genc-Nayebi, N., & Abran, A. (2018). A measurement design for the comparison of expert
usability evaluation and mobile app user reviews.
Giansanti D., Colombaretti L., Simeoni R., Maccioni G. (2019) The Text Neck: Can Smartphone
Apps with Biofeedback Aid in the Prevention of This Syndrome. In: Masia L., Micera
S., Akay M., Pons J. (eds) Converging Clinical and Engineering Research on
Neurorehabilitation III. ICNR 2018. Biosystems & Biorobotics, vol 21. Springer, Cham
Goggin, G., Lincoln, S., & Robards, B. (2014). Facebook’s mobile career. New Media & Society,
16(7), 1068-1086.
Guest, G., MacQueen, K. M., & Namey, E. E. (2011). Applied thematic analysis. Sage
Publications.
Hiniker, A., Hong, S., Kohno, T., & Kientz, J. (2016). MyTime: Designing and Evaluating an
Intervention for Smartphone Non-Use. Proceedings of the 2016 CHI Conference on
Human Factors in Computing Systems, 4746-4757.
Hoffner, C., & Lee, S. (2015). Mobile Phone Use, Emotion Regulation, and Well-Being.
Cyberpsychology, Behavior, and Social Networking, 18(7), 411-416.
Howe, Katherine B, Suharlim, Christian, Ueda, Peter, Howe, Daniel, Kawachi, Ichiro, & Rimm,
Eric B. (2016). Gotta catch’em all! Pokémon GO and physical activity among young
adults: Difference in differences study. BMJ, 355, I6270.
Jung, Sang In, Lee, Na Kyung, Kang, Kyung Woo, Kim, Kyoung, & Lee, Do Youn. (2016). The
effect of smartphone usage time on posture and respiratory function. Journal of
Physical Therapy Science, 28(1), 186-9.
Kim, Y. G., Kang, M. H., Kim, J. W., Jang, J. H., & Oh, J. S. (2013). Influence of the duration
of smartphone usage on flexion angles of the cervical and lumbar spine and on
reposition error in the cervical spine. Physical Therapy Korea, 20(1), 10-17.
Ko, Minsam, Yang, Subin, Lee, Joonwon, Heizmann, Christian, Jeong, Jinyoung, Lee, Uichin,
. . . Chung, Kyong-Mee. (2015). NUGU: A Group-based Intervention App for Improving
Self-Regulation of Limiting Smartphone Use. Proceedings of the 18th ACM Conference
on Computer Supported Cooperative Work & Social Computing, 1235-1245.
Koskinen, I., Zimmerman, J., Binder, T., Redström, J., & Wensveen, S. (2011). Design research
through practice: From the lab, field, and showroom. Waltham, MA: Morgan
Kaufmann.
Kotikalapudi, R., Chellappan, S., Montgomery, F., Wunsch, D., & Lutzen, K. (2012).
Associating Internet Usage with Depressive Behavior Among College Students. IEEE
Technology and Society Magazine, 31(4), 73-80.
Kushlev, K., Proulx, J., & Dunn, E. (2016). "Silence Your Phones": Smartphone Notifications
Increase Inattention and Hyperactivity Symptoms. Proceedings of the 2016 CHI
Conference on Human Factors in Computing Systems, 1011-1020.
Larose, R., Lin, C., & Eastin, M. (2003). Unregulated Internet Usage: Addiction, Habit, or
Deficient Self-Regulation? Media Psychology, 5(3), 225-253.
Löchtefeld, M., Böhmer, M., & Ganev, L. (2013). AppDetox: Helping users with mobile app
addiction. Proceedings of the 12th International Conference on Mobile and Ubiquitous
Multimedia, 1-2.
Mark, G., Iqbal, S., & Czerwinski, M. (2017). How blocking distractions affects workplace focus
and productivity. Proceedings of the 2017 ACM International Joint Conference on
Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International
Symposium on Wearable Computers, 928-934.
Marotta, V., & Acquisti, A. (2017). Online Distractions, Website Blockers, and Economic
Productivity: A Randomized Field Experiment. Preliminary Draft.
McCartney, M. (2016). Margaret McCartney: Game on for Pokémon Go. BMJ, 354, I4306.
Megna, Gisonni, Napolitano, Orabona, Patruno, Ayala, & Balato. (2018). The effect of
smartphone addiction on hand joints in psoriatic patients: An ultrasound-based study.
Journal of the European Academy of Dermatology and Venereology, 32(1), 73-78.
Mehrotra, A., Pejovic, V., Vermeulen, J., Hendley, R., & Musolesi, M. (2016). My Phone and
Me: Understanding People's Receptivity to Mobile Notifications. Proceedings of the
2016 CHI Conference on Human Factors in Computing Systems, 1021-1032.
Moreno, M. A., Breland, D. J., & Jelenchick, L. (2015). Exploring depression and problematic
internet use among college females: A multisite study. Computers in Human Behavior,
49, 601-607.
Muñoz-Rivas, M. J., Fernández, L., & Gámez-Guadix, M. (2010). Analysis of the indicators of
pathological Internet use in Spanish University students. The Spanish Journal of
Psychology, 13(2), 697–707.
Nielsen, J. (2012). Usability 101: Introduction to Usability. Nielsen Norman Group. Available
at: https://www.nngroup.com/articles/usability-101-introduction-to-usability/
Özdemir, Kuzucu, & Ak. (2014). Depression, loneliness and Internet addiction: How important
is low self-control? Computers in Human Behavior, 34, 284-290.
Pagoto, Schneider, Jojic, Debiasse, & Mann. (2013). Evidence-Based Strategies in Weight-Loss
Mobile Apps. American Journal of Preventive Medicine, 45(5), 576-582.
Przybylski, A. K., Murayama, K., DeHaan, C. R., & Gladwell, V. (2013). Motivational,
emotional, and behavioral correlates of fear of missing out. Computers in Human
Behavior, 29(4), 1841-1848.
Rapp, & Cena. (2016). Personal informatics for everyday life: How users without prior self-
tracking experience engage with personal data. International Journal of Human -
Computer Studies, 94(C), 1-17.
Rooksby, J., Asadzadeh, P., Rost, M., Morrison, A., & Chalmers, M. (2016). Personal Tracking
of Screen Time on Digital Devices. Proceedings of the 2016 CHI Conference on Human
Factors in Computing Systems, 284-296.
Salehan, & Negahban. (2013). Social networking on smartphones: When mobile phones
become addictive. Computers in Human Behavior, 29(6), 2632-2639.
Schoffman, D., Turner-McGrievy, E., Jones, G., & Wilcox, S. (2013). Mobile apps for pediatric
obesity prevention and treatment, healthy eating, and physical activity promotion: Just
fun and games? Translational Behavioral Medicine, 3(3), 320-325.
Shah, P. P., & Sheth, M. S. (2018). Correlation of smartphone use addiction with text neck
syndrome and SMS thumb in physiotherapy students. International Journal Of
Community Medicine And Public Health, 5(6), 2512-2516.
Sheppard, A. L., & Wolffsohn, J. S. (2018). Digital eye strain: prevalence, measurement and
amelioration. BMJ open ophthalmology, 3(1), e000146.
Stothart, C., Mitchum, A., Yehnert, C., & Enns, James T. (2015). The Attentional Cost of
Receiving a Cell Phone Notification. Journal of Experimental Psychology: Human
Perception and Performance, 41(4), 893-897.
Subramanian, R., Freivogel, William, Iyer, Narayanan, Ratnapradipa, Dhitinut, Veenstra,
Aaron, & Xie, Wenjing. (2015). Diet, Exercise, and Smartphones - A Content Analysis
of Mobile Applications for Weight Loss, ProQuest Dissertations and Theses.
Turkle, S. (2011). Alone together: Why we expect more from technology and less from each
other. New York: Basic Books.
Ward, A., Duke, K., Gneezy, A., & Bos, M. (2017). Brain Drain: The Mere Presence of One’s
Own Smartphone Reduces Available Cognitive Capacity. Journal of the Association for
Consumer Research, 2(2), 140-154.
Whittaker, S., Kalnikaite, V., Hollis, V., & Guydish, A. (2016). 'Don't Waste My Time': Use of
Time Information Improves Focus. Proceedings of the 2016 CHI Conference on
Human Factors in Computing Systems, 1729-1738.
Yang, Q., Scuito, A., Zimmerman, J., Forlizzi, J., Steinfeld, A. (2018). Investigating How
Experienced UX Designers Effectively Work with Machine Learning. Proceedings of
the 2018 Designing Interactive Systems Conference (DIS ’18), 585-596. DOI:
https://doi-org.proxy.ub.umu.se/10.1145/3196709.3196730
Yuan, F., Gao, X., & Lindqvist, J. (2017). How Busy Are You?: Predicting the Interruptibility
Intensity of Mobile Users. Proceedings of the 2017 CHI Conference on Human Factors
in Computing Systems, 2017, 5346-5360.
Zimmerman, J., Forlizzi, J., & Evenson, S. (2007). Research through design as a method for
interaction design research in HCI. Proceedings of the SIGCHI Conference on Human
Factors in Computing Systems, 493-502.
Appendix A

Table 1. Results from the tests of existing screen time tracking apps.
Fig. 2. Screenshot from Mute; reduce the
tracked time by deleting a “pickup” card.

Fig. 3. Duration of a call and time tracked by Quality Time.
Fig. 4. Screenshots from Space and Quality Time showing some UI problems.

Fig. 5. Notifications containing odd words, screenshots from Forest and Mute.
Appendix B:
Questions from the survey:
1. What is your age group?
2. How do you assess the time you spend on your phone per day? (incl. social media, emails,
messaging, games, etc.)
3. What do you do to reduce the distractions from your phone while you are working?
4. Do you feel distracted when you are in the middle of something and receive a notification on
your mobile?
5. How do you react when you receive a notification in the following situations: [In a boring
lecture/meeting; During working hours; Meeting with friend(s); During
breakfast/lunch/dinner; Watching favorite movie/tv series; Listening to music; Playing a
favorite game]
6. Which of the following actions can help to reduce the usage of the phone?
7. Are you using/have you used any apps to track/reduce your screen time?
8. Have you ever tried to limit the time you use a specific app, and do you follow such limits?
9. What do you like about screen time management apps?
10. What do you dislike about screen time management apps?

Questions asked during the short interviews:


1. What is the app that you’re using and how long have you been using it?
2. What do you like about it?
3. What do you dislike?
4. Is there something that annoys you when you use it?
5. If you had the possibility to add or change some features or the design of this app, what
would they be?
6. Does the usage of this app help you to reduce/limit your screen time?
7. Have you used other apps similar to this one? Which one? Was it better or worse than the
one you use now?
8. Do you think those apps can change people’s behaviour?
9. If you were a designer of such an app, what features would you include?
10. Can you briefly describe what your app would look like?

Appendix C:
Evaluation Prototype v1.0, test cases description:
Imagine that you just installed the app. First you will see the onboarding screen, which will
give you some basic information about what the app does. Your first task is to read the texts
on the onboarding screens and then look around the app. I will give you some time to do that.
The battery indicator is a “secret spot” so please don’t click on it for now.
Questions asked after the first task:
• Do you understand the concept of the app?
• Is there something that bothers you? Maybe in the way it works?
• Is everything on the interface clear?
Now, imagine that several hours have passed. You’ve been actively using your device and
now you want to check what your friends have posted on Facebook. If there’s nothing to see
there, use the “secret spot” to go back to the Android’s main screen and then try to open
Instagram.
Questions asked after the second task:
• How do you feel knowing that you can’t control your phone?
• Do you think that those effects could make you put your phone aside?
• How likely would you be to install the app if it existed?

Fig 1. Blur effect and “Broken screen” effect, evaluated in prototype v1.0
Appendix D:

Fig. 1. Prototype v1.0 and 2.0 Graph screens improvements

fig. 2. Prototype v2.0 Graphs screen, filter the used apps by category (left) and by limits (right).
Appendix E:

Fig. 1. Instructions on how to add a device (left) and successfully added one (right).

Fig. 2. One device paused; All devices paused. Evaluated in prototype v2.0
Appendix F:
Evaluation, prototype v2.0 description:
Link to prototype: https://invis.io/YZRYN25K4B9

You are going to test a prototype of an app, called Happy Screen, which tracks the usage of
your screen time and helps you to reduce it, if it is too high. I have prepared a few tasks for
you that will help me to identify any problems in the functionality and the design of this app.

TASK 1: ONBOARDING EXPERIENCE.


Let’s start. Imagine that you just installed the app, and this is the first time that you’re
opening it. You have to go through the onboarding, carefully read the messages there and
answer a few questions after that. When you reach the screen with a blue button with text
“Yaaaay”, stop for a moment.
Questions asked after task 1:
• Can you tell me what the main function of the app is? What does it do?
• What do you think about this onboarding? Is it too long?
• Were all sentences written in an understandable way?

TASK 2: HOME, NOTIFICATIONS, GRAPHS


Press the “Yaaaay” button and you’ll see the “Home” screen of the app. You have a notification.
Can you see it? Tap on the icon in the upper right corner and read the privacy notification
(the one with the blue dot).
Questions asked after task 2:
• Is there something that bothers you in this text?
• Do you have any concerns about your privacy?
Now, let’s go back to the “Home” screen of the app. It says that you’ve been using your device
for 43 minutes, but you want to see some details.
• Where do you think you can find this information? [the user goes to the Graphs
screen]
Great, look around. Do you understand the information on this screen?
You’ve been using several apps today, but you want to see to which categories those apps
belong.
• Can you find a way to do that?
One of the features of Happy Screen is to set time limits to some apps.
• Can you see from this screen if there are any limits set to the apps on this list?
General questions about the Graphs screen:
• Is the information here enough?
• Are the “categories” and “limits” icons understandable and visible?
• Is the title text appropriate? (Total working time … )

TASK 3: SETTINGS
Our next stop is the “Settings” screen. Your next task is to set some limits, but instead of doing
this manually, the app has a feature that analyzes all of the apps on your phone and
recommends time limits that you can set. Imagine that you want to see those
recommendations now.
• Where are you going to click?
You really like Instagram and want to change this time limit from 20m to 1h30m. For your
convenience this is already set in the prototype, no need to type anything.
• Do you still want to change it, after reading the popup?
• Any comments about the popup buttons?
Now, after you made some changes, you have to “Approve” those recommended limits.
• How are you going to do this? [the user should scroll to the end of the page and click a
button]
Great, you just set limits to the most used apps with only a few taps. If you come back to
this screen [the Settings screen] in a few hours and want to see all of the apps with set limits,
how are you going to do that? Can you delete a limit from here? Let’s say the limit for Twitter.
Questions after task 3:
• What do you think about the process of setting and deleting limits?

TASK 4: ADD NEW DEVICE


Now it is time to add another device. Maybe you have a desktop PC that you use daily and want
to see the screen time of both your devices.
• How are you going to add a new device?
• Is the process written on the screen clear and understandable?
For your convenience you don’t have to enter any codes or device names, just press the button
to add the device.
You can check the details for those two devices if you want or just go directly to the Home
screen by tapping on the Home button. If everything is successful, you should see two mascots
there – one for each device.
• Do you know why the icon of the PC is not really happy?
You can see your 2 devices here but now it shows the total time from all devices. You already
know that you have used the phone for 43 minutes and want to see the data from the PC. You
have to exclude the phone.
• How are you going to do that?
If you see a hotspot on the title, do not click on it for now. We’ll get back to this in a few
minutes.
TASK 5: EFFECTS OF THE APP
Now that you already know how to exclude devices, turn on both of them. The prototype
contains a secret button that won’t exist in the real app; tap on it (the title of the screen) to
continue with the test. Imagine that this is the main screen of your device. You have been
using your device actively and now you want to send a message to your friend on Messenger.
Open Messenger and try to write a message.
• What do you think about this effect?
Go back to the main screen of the phone by tapping on the name of the person. Now, open
Outlook and try to send an email.
• Are you going to take those 5 minutes or leave the email for some other time?
If you press the “take 5 minutes” button, your content is visible. You don’t have to write the email
in the prototype. Go back to the main screen by tapping in the middle top part of the screen.
Now let’s check social media. Start with Facebook; when you open it, start scrolling and, if
you want, read the new posts. When you see the effect (broken screen),
the scroll function shouldn’t work anymore. Tap once again in the middle top part of the
screen and you’ll be back to the main screen of the phone.
Open Instagram. [the user sees the Art effect] Go back to the main screen of the phone by
tapping on the Instagram logo.
Questions about the effects:
• What do you think about those effects – the blur, the broken screen and the Art
effect?
• Are they relevant for the type of those apps?
• Do you think they can make you temporarily stop using those apps?
Imagine that you really want to see your Facebook and Instagram feeds, but the app doesn’t
allow you now.
• Can you think of a way to bypass this restriction? [the user should open “Happy
Screen” again and see that the mascots are crying]
• Maybe if you pause the phone?
Tap on the title again to go back to the main screen and open Instagram or Facebook one
more time.
• Do you see strange effects now?

General questions about the app:


• What do you think about the app in general?
• Would you install it if it were real?
• Do you think those “effects” can help you to change your habits?
• What do you think about the “Happy screen” icon?
• Any thoughts about the popups?

Fig. 1. Photo of the analysis. The data was grouped into 6 groups: Onboarding, Home
and Graphs, Settings and Limits, Add a device, Effects (this photo), General
comments.