Usability Design for Screen Time Apps
Milena Pacherazova
Department of Informatics
Master thesis, 30 hp
Human-computer Interaction and Social Media
SPM 2019.18
Abstract
The adoption of technology in our daily activities increased the time that we spend in front
of the screen and changed the way we communicate and work. In recent years, many big
companies started to develop and implement screen time management tools in their products
to educate the user on how to improve their digital health. Those tools are an important step
in the process: they bring awareness and help users to change their habits. Several studies
have focused on screen time tracking apps but not from the design perspective. Therefore,
this thesis aims to explore the design of screen time management apps by developing two
prototypes, which were used to evaluate different design elements and features. The results
of this thesis present guidelines on how to improve the design of the existing screen time
tracking tools and what additional features could be added to fulfil their aim and encourage
users to change their behaviour.
Keywords: screen time, screen time management, screen time apps, persuasive design,
machine learning, usability, user experience design
1. Introduction
As smartphones become ubiquitous, the time that people spend in front of the screen has
noticeably increased. The rapid development of smartphone applications (apps) has extended
their functions, and today they are used as universal tools for everything from buying a ticket to playing
games. They offer access to huge amounts of information at any time and place and allow their
owners to stay in touch with people from all over the world. They are also helpful in terms of
organising tasks, increasing work efficiency and even improving the users’ health. Last but not
least, they keep people entertained when they feel sad or bored. However, while all this sounds
amazing, smartphones have disadvantages especially when they are overused, which is easily
achieved during the exploration of their practically endless possibilities.
Various researchers have investigated the changes in human health due to the usage of
smartphones. Smartphone overuse increases the risk of inflammation of the muscles of
the hands and is one of the causes of arthritis (Megna, Napolitano, Patruno & Balato, 2018).
The prolonged use of smartphones and tablets causes neck pain due to tilting the head and
taking an unhealthy posture, which is a frequent habit among both younger and older users. This is
currently coined as “text-neck syndrome” and, according to research by Elnaffar & El Allam
(2018), the popularity of smartphones increases the risk of spreading the syndrome, especially
among children who adopt the bad habit at a very young age. The overuse of smartphones
causes not only physical problems but cognitive and emotional ones as well. According to
various studies, the overuse leads to depression and anxiety (Elhai, Levine, Dvorak & Hall,
2016) and is associated with poor sleep quality as well (Demirci, Akgonul & Akpinar, 2015).
The problem is not only the addiction to the device itself but also to the apps that are
installed on it. The constant connectivity makes users pay more attention to the smart device
than to everything around them. An analysis by Ding, Xu, Chen & Xu (2016) shows that social
media apps and communication apps, such as messaging apps, are considered addictive,
especially among college students. Indeed, social media apps are viewed as one of the
predictors of addiction (Salehan & Negahban, 2013). According to Goggin, Lincoln & Robards
(2014), the development of smartphones brings many benefits for social media apps. Thanks
to their mobility and connectivity, users can now access social media almost everywhere.
However, in recent years, many companies started to develop apps that promote behaviour
change. Examples of such apps are the ones related to physical activity and/or dietary
behaviour. According to different studies (Bardus, Van Beurden, Smith & Abraham, 2016;
Direito, Pfaeffli Dale, Shields, Dobson, Whittaker, & Maddison, 2014; Edwards, Lumsden,
Rivas, Steed, Edwards, Thiyagarajan, … Walton, 2016) the most commonly used behaviour
change techniques are to ask the user to provide personal details such as height, weight, age
and others (typical for the dietary apps), to set a specific goal and to encourage them to monitor
their progress. Studies like these and many others show that smartphone apps should not be
viewed as something inherently bad because they can be helpful in many situations, especially if
the behaviour change techniques are used appropriately and do not violate users' privacy.
Combining them with a visually appealing design will provide a smooth user experience, which
will maximise the apps’ efficiency (Chhabra, Sharma & Verma, 2018).
Unfortunately, the usage of mobile apps requires users to spend a lot of time in front of
the screen. Addictive or not, the apps compete for the users’ attention. Probably this is the reason why
another group of apps has been developed – screen time1 tracking apps, which are the focus of this
research. Currently, many apps and tools track the time spent in front of the screen. To my
knowledge, several papers investigate these kinds of apps. However, they emphasise
improving focus (Whittaker, Kalnikaite, Hollis & Guydish, 2016), reducing smartphone
usage (Hiniker, Hong, Kohno & Kientz, 2016; Ko, Yang, Lee, Heizmann, Jeong, Lee, . . . Chung,
2015) and fighting smartphone addiction (Löchtefeld, Böhmer & Ganev, 2013). However,
none of them focuses on the design, which is an essential factor for the future of screen time
tracking apps and for the relationship between humans and their devices. Addressing
this research gap requires additional research. The improvement of the
design of screen time management tools can attract attention to the problems that those tools
are trying to solve, increase their popularity and educate the users on how to take care of their
digital health. Following this line leads to the research question that this study will answer:
How can a screen time management tool be designed and what features could improve its
usability?
To answer the question, this study uses the following approach: first, research of the most
commonly used screen time tracking apps was conducted, where 10 apps were selected and
tested. Then an online survey was shared among people of various ages and locations to find out
how aware the users are of their screen time, whether they consider screen overuse a problem
and how they are dealing with it. Based on the answers from the survey, several short interviews
were conducted online with users of screen time management tools to investigate their
advantages and disadvantages. This led to the creation of a high-fidelity prototype, which was
1The term “screen time” defines the action of spending too much time in front of digital devices
with screens. It should not be confused with “Screen Time” which is the name of an app, used in
the text below.
evaluated with potential users. The next step was improving the prototype and evaluating it
again.
The present study addresses the gap mentioned above and explores the challenges of
developing a screen time management tool that supports several different devices, uses
machine learning and is inspired by persuasive technology to promote behaviour change.
The main contribution of the study is to present guidelines that help designers and researchers
interested in the integration of screen time tracking tools into daily activities.
2. Related research
This section presents an overview of the current state of the literature in relation to the study.
Screen time management apps have not been an object of study by other researchers, at least
not in terms of design. Therefore, there is a need for additional research to find out how these
apps can be designed and what features need to be included to improve their usability. First,
the effects of addictive technology are explored: what makes users spend time in front
of the screen and how they can protect themselves from falling into this trap. Lastly, several
papers investigating screen time tracking tools will be briefly presented.
defined as “a pervasive apprehension that others might be having rewarding experiences from
which one is absent, FoMO is characterised by the desire to stay continually connected with
what others are doing.” (Przybylski, Murayama, DeHaan & Gladwell, 2013). Even though
smartphones offer practically everything to the user, their presence harms the cognitive
capacity (Ward, Duke, Gneezy & Bos, 2017). Another study by Stothart et al. (2015)
investigates what happens when users receive a notification on their device. The results show
that even if the user does not respond to the received notification, their performance
significantly decreases because they are disturbed by this notification.
Excessive smartphone usage also has a negative impact on people’s physical health. It is
known that the prolonged smartphone usage affects the body posture and can lead to changes
in different parts of the spine (cervical and lumbar) due to the forward head position (Jung,
Lee, Kang, Kim & Lee, 2016; Kim, Kang, Kim, Jang & Oh, 2013). Tilting the head in such an
irregular position causes muscle pain in the neck and shoulders as well. According to
Elnaffar & El Allam (2018), there is a high risk of “text-neck syndrome” especially among
children and teenagers who overuse smartphones and tablets. The syndrome is harmful
to the neuro-musculoskeletal apparatus and it already affects people of various ages, in
particular young people (Giansanti, Colombaretti, Simeoni & Maccioni, 2019). Another term
related to this syndrome is “SMS thumb”, which affects the muscles of the hand. Studies show
(Shah & Sheth, 2018) that the repetitive usage of hand-held devices such as smartphones leads
to musculoskeletal disorders, including arthritis (Megna et al., 2018). Other research shows
that exposure to blue light, not limited to smartphones but screens in general, can be
harmful to the retina and is one of the causes of DES or Digital Eye Strain (Sheppard &
Wolffsohn, 2018).
In the studies mentioned so far, smartphones are considered devices that cause various
kinds of mental and physical issues. However, looked at from another perspective, it is known that
they have positive sides as well. Indeed, they can be used as a tool to encourage users to change.
One of the examples is a game called Pokemon Go, known as one of the first popular augmented reality games.
It motivates the player to move in a physical environment and to “catch” imaginary characters
with their smartphones. The game is not promoted as a health app, but studies show that it
encourages the players to walk which brings health benefits (Mccartney, 2016; Howe,
Suharlim, Ueda, Howe, Kawachi & Rimm, 2016). The Health and Fitness categories in the App
Store and Google Play store include various apps that promote healthy living and physical
activity. Research shows that the most popular features of those kinds of apps are tracking
progress, setting goals/limitations and self-monitoring (Subramanian, Freivogel, Iyer,
Ratnapradipa, Veenstra & Xie, 2015). Additionally, there is a need for improvements in terms
of features such as experts’ recommendations and advice on how to set limitations the right
way and how to get better and faster results (Schoffman, Turner-McGrievy, Jones & Wilcox,
2013; Pagoto, Jojic & Mann 2013).
smartphones, which means that checking email, social media feeds etc. equals 100 hours for
a month (Alter, 2017). People are so addicted to their devices that even the need to touch the
smartphone results in anxiety, depression and FoMO (Elhai et al., 2016; Hoffner & Lee, 2015).
Alter (2017) suggests that there is a need for behavioural architecture, which means to create
surroundings which will help us to thrive. It is indeed difficult to avoid using technologies and
certain apps, like email for example, but Alter encourages users to try to reduce their usage. He also
mentions that everything nearby has an impact on our mental health, which is why, according to
different studies, looking at a screen before falling asleep obstructs our ability to sleep well:
“Surround yourself with temptation and you’ll be tempted; remove temptation from arm’s
reach and you’ll find hidden reserves of willpower.” (Alter, 2017).
Additionally, the user needs to create this behavioural architecture on their own or, as Nir
Eyal explains in his book, following a famous quote by Mahatma Gandhi, “build the change
they want to see in the world” (Eyal & Hoover, 2014). Eyal argues that by understanding how
we get attracted and even addicted to technology, we can break and change the unwanted
habits in our lives. The solution is hidden in the design of the products and services around us.
The book describes the Hook model (figure 1), which is “a simple yet powerful way to help your
customers form habits that connect their problem with your solution”. The model has 4 parts
– Trigger, Action, Reward and Investment (Eyal & Hoover, 2014), illustrated and explained
below (figure 1).
Lutzen, 2012; Larose, Lin & Eastin, 2003) and “Once a technology has created an association
in users’ minds that the product is the solution of choice, they return on their own, no longer
needing prompts from external triggers.” (Eyal & Hoover, 2014).
The next part of the Hook Model is Action: this is the part in which the habitual behaviour
occurs. It is defined as an expectation of a reward and it uses the motivation and the abilities
of the user. For the Action to work it needs to be designed in a simple way and at the same time
to increase the motivation of the user. The Action “draws upon the art and science of usability
design to ensure that the user acts the way the designer intends.”2. A simple search bar or a
scroll function are examples of this part of the model.
The third part of the Hook model is Rewards: the user is rewarded, and their problem is
solved. Rewards can be anything that keeps the user’s attention - from a simple image or
experience to a physical product. They activate desire simply by being unknown. Eyal suggests
three types of rewards: the Tribe, the Hunt and the Self (Eyal & Hoover, 2014). The Tribe
rewards are also known as social rewards. They make the user come back and look for
more. This is one of the reasons why social media platforms are so popular - they offer the user
this type of rewards. Every time the user comes back, the content is different and there is
always the uncertainty of what other people might have posted or commented on. One
example of rewards of the Hunt is the slot machine. In this example, the reward is the money
that the user might win, and that is what makes gambling so addictive. The social
media feed is similar: by opening an app with a feed the user might see something interesting, but if they
continue to scroll, the next things they see might be even more interesting. Feeds motivate
the user to continue searching for the next “reward”. The last type of rewards are the rewards
of the Self. They do not come from other people; they make the user feel good just by being
there. They are all about the search for mastery and control. An example of this type of reward
is games. Even if the user plays alone, without other people, reaching the next level or
getting the next achievement makes him or her feel good. For people who do not play games,
rewards of the Self come in another form - checking email and notifications just
because there is an indicator with some number, which makes the user want to clear those numbers
away. The purpose of those rewards is to “satisfy users’ needs while leaving them wanting to
reengage with the product.” (Eyal & Hoover, 2014).
The final part of the model is Investment, which should increase the likelihood that
the user will pass through the hook again and again in the long term. In this part, the users are asked
to invest something which will make the products they use better. The difference here is visible
between physical and digital products. As time passes, physical products lose their value;
the digital ones, however, should do the opposite. An example of this part is the data shared
online: the more data is shared, the better products and services become at providing more
relevant content (Eyal & Hoover, 2014).
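To make the four stages concrete, the sketch below models one pass through the Hook cycle as a small data structure. This is only an illustration: the names (HookStage, HookCycle, describe) are invented for this example and do not come from Eyal’s book or any existing library.

```kotlin
// A minimal, illustrative model of one pass through the Hook cycle
// (Trigger -> Action -> Variable Reward -> Investment). Names are hypothetical.
enum class HookStage { TRIGGER, ACTION, REWARD, INVESTMENT }

data class HookCycle(
    val trigger: String,      // e.g. a push notification (external trigger)
    val action: String,       // e.g. tapping the app icon and opening the feed
    val reward: String?,      // variable reward; null if the reward is withheld
    val investment: String?   // e.g. posting content, which improves future cycles
)

fun describe(cycle: HookCycle): String =
    if (cycle.reward == null)
        "Hook broken after '${cycle.action}': no reward was delivered."
    else
        "Completed cycle: ${cycle.trigger} -> ${cycle.action} -> ${cycle.reward}" +
            (cycle.investment?.let { " -> $it" } ?: "")

fun main() {
    // A typical social media cycle that keeps the user coming back.
    val feedCycle = HookCycle(
        trigger = "notification: 'You have 3 new likes'",
        action = "open the app and scroll the feed",
        reward = "new posts from friends (reward of the Tribe)",
        investment = "leave a comment"
    )
    println(describe(feedCycle))
}
```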
However, the Hook model is not flawless. According to Filippou, Cheong and Cheong (2016)
it has a limitation – it does not check the effectiveness of a Trigger. The authors also suggest
2 https://www.nirandfar.com/hooked-user-behavior-resources/
6
that Fogg’s behaviour model can be used to deal with this disadvantage. B. J. Fogg’s behaviour
model needs three components: “a person must have sufficient motivation, sufficient ability,
and an effective trigger” (Fogg, 2009). Fogg visualises the model as a plane formed by two
axes - a vertical one for motivation and a horizontal one for ability. The place where the two
axes are connected is marked as low (low motivation and low ability) and their ends as high,
which means that “high motivation and high ability are typically necessary for a target
behaviour to occur.” (Fogg, 2009). If the ability or the motivation is low, then the trigger will
be unsuccessful, and the system should support the user by turning the lower parameter into
a high one (Figure 2).
Figure 2. Successful and unsuccessful triggers, according to Fogg’s model.
3 https://freedom.to
the participants reduced the usage of social media sites and browsing as well as their total
online activity (Whittaker et al., 2016). NUGU (when No Use is Good Use) is another example
of an app, created for a study, that also brings awareness. It encourages users to limit phone
usage and share their limiting information with other users. The results showed that this
feature is “critical in assisting the participants to limit their smartphone use” (Ko et al., 2015).
Another app that also supports self-limiting and is still popular (reviewed in the next section)
is AppDetox, which allows the users to set rules for the apps that they want to use less. The
findings of the research on this app demonstrated that users are limiting mainly social media
and messaging apps (Löchtefeld et al., 2013). Hiniker et al. (2016) created a standalone mobile
app, called MyTime, that creates a balance between the use of the device and the non-use. Their
study reports that the targeted non-use of a smartphone can help the users to reach their target
of appropriate usage. Apart from that, the study also demonstrated the advantages of technical
solutions over self-preventive measures such as leaving the phone in another room and
completely removing apps.
The results of these studies indicate that the need for apps that limit smartphone use
is appreciated by the users (Hiniker et al., 2016) and that they understand the advantages of using
their devices less on a daily basis (Ko et al., 2015). Therefore, this study aims to investigate the
existing apps for smartphone usage regulation which is strongly connected to limiting the
screen time usage. Details about the study will be presented in the next chapter.
3. Research methodology
This section presents the methods used to conduct the study. Following is a brief overview of
the section. The study consists of 2 parts, each of which is explained in detail below. Lastly, the
data analysis method and the ethical considerations will be explained.
iOS, as well as several articles from well-known tech review websites, 10 applications were
selected and evaluated. They were downloaded on two mobile devices (Android system and
iOS system) and their features were explored, including the accuracy of time tracking and their
user interfaces. The evaluation was conducted over one and a half months. The availability of two
devices to test on allowed for testing two apps at the same time – one on Android and one on
an iOS system – as well as comparing their features and interfaces. Each application was tested
individually for about a week during daily usage of the device. Table 1 below presents a summarised
version of the data collected during the evaluation of the existing apps. The full version of the
table can be seen in Appendix A.
Table 1. Summarised evaluation of five of the tested apps across the evaluated feature
categories (tracking screen time, setting limits, pause counting, notifications when a limit is
reached or ends, and additional requirements in order to work). The full table with all column
headings is available in Appendix A.
Screen Time  ✓ ✓ ✓ ✓ ✗ ✓ ✓ ✗ ✗ ✗ ✗
Mute         ✗ ✗ ✗ ✗ • ✓ ✗ • ✓ ✗ ✓
Space        ✗ ✗ ✗ ✗ ✗ ✓ ✗ ✓ ✓ ✗ ✓
Antisocial   ✓ • ✓ ✓ • ✓ • ✗ ✗ ✗ ✗
App Detox    ✓ ✗ • • ✓ ✗ ✗ ✗ ✗ ✓ ✗
In the evaluation, the availability and reliability of some of the most important features that a
screen time tracking app should have were tested: for example, tracking screen time, setting limits
and notifying the user. As seen in the table above, those features are included as main categories
with additional subcategories for more specific functions. Setting a limit is one of the core
features of a screen time management app and, to allow the user additional freedom, it should
be available in a variety of ways: for example, limiting an app, limiting a category of apps, limiting for
several minutes/hours or even setting limits per day. None of the tested apps offers that wide a
range of options. However, to meet the needs of a wider target group, it should be considered
for the future development of screen time tools. Counting the time is the next important feature
and, similar to the previous one, it should count more than just the total screen time. This
feature was included in the evaluation because it brings awareness to the user. The table
includes a subcategory called “Pause counting” which is perceived as a way to bypass the
counting in case the user wants to use their device a little longer than the set limit. While it is
good for a screen time app to have such a feature, it should be used responsibly. The next
category in the table is “Notifications” and it is included in the evaluation because this is one
of the most convenient ways for any kind of app to “communicate” with the user. Again, it
should be used responsibly by the designer in order not to bother and annoy the user,
because screen time management apps should reduce the screen time, not make the user
check their device regularly. Having additional settings for notifications
inside the screen time app would add flexibility and make it appealing to more users. The last column in the table
is called “Other” and includes features that cannot be part of any of the other categories, for
example asking the user for their location. After the evaluation, each app was kept installed on
the devices, but all of their features were turned off in order not to interfere with the results
of the apps that were currently being tested. It turned out that the ability to turn the features off is also important and is
therefore included in the extended version of the table, available in Appendix A.
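As a concrete illustration of the variety of limits discussed above, a minimal sketch of a data model that supports per-app, per-category and per-day limits could look as follows. The type and field names are hypothetical and do not come from any of the evaluated apps.

```kotlin
import java.time.DayOfWeek
import java.time.Duration

// Illustrative data model for the limit types discussed above: a limit can target
// a single app or a whole category, and apply only on certain days. Names are hypothetical.
sealed class LimitTarget {
    data class App(val packageName: String) : LimitTarget()
    data class Category(val name: String) : LimitTarget()
}

data class ScreenTimeLimit(
    val target: LimitTarget,
    val allowance: Duration,                              // e.g. 45 minutes per day
    val activeDays: Set<DayOfWeek> = DayOfWeek.values().toSet()
)

fun isOverLimit(limit: ScreenTimeLimit, usedToday: Duration, today: DayOfWeek): Boolean =
    today in limit.activeDays && usedToday > limit.allowance

fun main() {
    val socialMediaLimit = ScreenTimeLimit(
        target = LimitTarget.Category("Social media"),
        allowance = Duration.ofMinutes(45),
        activeDays = setOf(DayOfWeek.MONDAY, DayOfWeek.TUESDAY, DayOfWeek.WEDNESDAY,
                           DayOfWeek.THURSDAY, DayOfWeek.FRIDAY)   // weekdays only
    )
    println(isOverLimit(socialMediaLimit, Duration.ofMinutes(50), DayOfWeek.MONDAY)) // true
}
```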
Short interviews
Initially, the plan was to use only a survey, but the results showed that most of the participants
were not aware of screen time tracking apps and that there was a need for targeting a specific group of
users: for example, users who are using or have used screen time tracking apps. In this phase
of the study, specific problems of the screen time management apps had to be identified, to be
solved later. In total, 13 structured interviews were conducted online with individuals (Table
2). Some interviewees contacted the researcher directly; others were invited personally based
on the criteria mentioned above. Each interview lasted between 5 and 15 minutes and was
recorded on a mobile device. The data was transcribed and analysed with thematic analysis.
Participant   Age range   Screen time app used now   Time used   Screen time app used before
P1 18-25 Screen Time (iOS) 6 months No
P2 26-32 Screen Time (iOS) 3 months No
P3 26-32 Screen Time (iOS) 4 months No
P4 26-32 Screen Time (iOS) 4 months No
P5 26-32 Screen Time (iOS) 6 months Yes, Moment
P6 26-32 Screen Time (iOS) 5 months No
P7 26-32 Digital Wellbeing 4 months No
P8 26-32 Screen Time (iOS) 4 months Yes, Space
P9 18-25 Screen Time (iOS) 5 months No
P10 18-25 Digital Wellbeing 3 months No
P11 26-32 Space 1 month No
P12 18-25 Digital Wellbeing 2 months No
P13 18-25 Screen Time (iOS) 4 months Yes, Digital Wellbeing
mouth which express different emotions based on the users’ screen time usage. The initial state
of this icon is a smiley face, which makes it look happy and this is where the name of the app
comes from - “Happy Screen”. This icon is also inspired by a digital pet product of the
mid-1990s called Tamagotchi, which changed the way a digital device is perceived by its users. It
is one of the examples of the “functional triad” framework, created by B. J. Fogg, which deals
with the different ways users perceive the roles of computer technologies in their lives (Fogg,
2003). The mascot will keep the interface minimalistic and, by changing its facial expressions,
it is expected to make users engage with the app longer, which in turn will help them to
change their habits.
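One way to read this design is that the mascot’s expression is simply a function of how much of the daily screen time budget has been used. The sketch below illustrates that idea; the thresholds and state names are assumptions of this example, not values taken from the prototype.

```kotlin
// Illustrative mapping from screen time usage to the mascot's facial expression.
// The thresholds (50%, 80%, 100%) are assumptions for this sketch.
enum class MascotMood { HAPPY, NEUTRAL, WORRIED, SAD }

fun mascotMood(usedMinutes: Long, dailyBudgetMinutes: Long): MascotMood {
    val ratio = usedMinutes.toDouble() / dailyBudgetMinutes
    return when {
        ratio < 0.5 -> MascotMood.HAPPY     // initial, smiling state
        ratio < 0.8 -> MascotMood.NEUTRAL
        ratio < 1.0 -> MascotMood.WORRIED
        else -> MascotMood.SAD              // limit reached or exceeded
    }
}

fun main() {
    println(mascotMood(usedMinutes = 30, dailyBudgetMinutes = 120))  // HAPPY
    println(mascotMood(usedMinutes = 130, dailyBudgetMinutes = 120)) // SAD
}
```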
The prototype starts with an onboarding with 3 separate screens, between which the user can
navigate by swiping left or right (figure 3). The onboarding aims to welcome the
user and present the main idea of the app. In the text below, the word “onboarding” will be used
to define the process of welcoming new users and making them engage with the app. After
the onboarding, there is a Home screen (Figure 3) with several clickable blocks:
• a mascot followed by information about the screen time usage of the device today;
• 2 blocks with the number of notifications received and the number of pickups of the
device;
• Updates and news section with tips and information from the mascot, combined with
its emotional states.
Each of the blocks leads to different screens with more detailed information and data, where
some of it is visualised in the form of graphs - widely used in screen time management apps.
By using graphs, the user gets a better understanding of large amounts of statistical data, and
this allows them to make a comparison between different days/hours of the day.
The prototype was “connected” with a fake Android “home” screen, with links to fake
Facebook and Instagram apps, used during the evaluation. The Facebook and Instagram feeds
are specially designed with the effects that the “Happy Screen” app will “cause” when the user is
using them for too long. The effects used in the prototype are blurred content in the Facebook
feed and a “broken” screen for the Instagram feed. They will be active only in the mentioned
apps. In the text below, the word “effects” will define the state of “Happy Screen” when a user
sets a limit for an app and uses it over the limit. Effects are another feature of “Happy Screen”
that makes the app different from all of the existing ones, which just block the usage of a certain
app when the user is using it over the set limit, either by preventing them from opening the app (App
Detox) or by showing a white screen with an opt-out option (Screen Time on iOS). As inspiration for
the effects, the Hook model (Eyal & Hoover, 2014) is used and modified. Instead of keeping
the app addictive and increasing users’ curiosity to open it again and again, applying effects
aims to break the hook. Starting with the Trigger, the user is prompted to open a certain app, then
there is an Action – tapping on the icon – but the next part of the model – Reward – is never
“received”. After reaching the previously set limit, the user will still be able to open the app and
post content, but they will not be able to read the content that has already been posted by their
friends. Practically, the app will be useless, and it is expected that the user will not be motivated
to use it. In the case of Instagram, the user will still be able to open it, but when they start to
scroll, a picture of broken glass will cover the screen and the scrolling function will not work.
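Seen from the logic side, breaking the hook amounts to letting the trigger and action happen but intercepting the reward once the limit is reached. The sketch below illustrates that decision; all names are hypothetical, since the prototype itself was a non-functional mock-up.

```kotlin
// Illustrative decision logic for "breaking the hook": the trigger and action are
// allowed, but once the limit is reached the reward (the feed content) is withheld
// by applying an effect instead. All names are hypothetical.
enum class Effect { NONE, BLUR_CONTENT, BROKEN_SCREEN }

data class AppUsage(val packageName: String, val usedMinutes: Long, val limitMinutes: Long?)

fun effectOnOpen(usage: AppUsage): Effect {
    val limit = usage.limitMinutes ?: return Effect.NONE   // no limit set: normal reward
    if (usage.usedMinutes <= limit) return Effect.NONE     // under the limit: normal reward
    return when (usage.packageName) {                      // over the limit: withhold the reward
        "com.facebook.katana" -> Effect.BLUR_CONTENT
        "com.instagram.android" -> Effect.BROKEN_SCREEN
        else -> Effect.BLUR_CONTENT
    }
}

fun main() {
    val facebook = AppUsage("com.facebook.katana", usedMinutes = 70, limitMinutes = 60)
    println(effectOnOpen(facebook))  // BLUR_CONTENT: the feed can be opened but not read
}
```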
Participants
For the evaluation of the first version of the prototype, three focus groups were organised with
a total of 8 participants (Table 3). Group 1 consisted of 3 students from the HCI master
program, Group 2 had 2 participants - both master students within the IT Management
field - and Group 3 again had 3 participants, students from different master programs.
Unfortunately, the data from Group 2 will not be considered due to the fatigued look of the
participants and the vague comments that were shared during the session. Each session
started with a brief introduction of the tasks and basic questions to each of the participants.
Of all participants, only one had used several screen time tracking apps. Having not achieved
any results in reducing her screen time usage, she was looking for something new and more
reliable. The others had not heard of such apps but admitted that they use social media apps a
lot and would like to try a tool that can help them to use those apps less. One of them
highlighted her addiction to the device and according to the Screen time widget on her iPhone,
her average daily screen time was 5 hours.
3.2.3. Prototype 2
Based on the analysis of the collected data from the evaluation of the first prototype, a second
improved version was created and evaluated. The improved version contains additional
screens for the onboarding experience, with details on how to do particular tasks. For example,
instead of only 3 screens, the new version has 7 in total – one for the introduction of the app,
five with how-to explanations and a final one to welcome the user. The Home screen of
the app is also improved. Instead of having 3 big buttons that lead to detailed graphs there is
only 1 (Figure 4). Notifications and Pickup buttons and their corresponding pages were
removed. In the new version, those parameters are kept as informative numbers because most
of the participants were concerned that the app would collect their notifications and they might
miss an important one.
Figure 4. Comparison between the Home screens of prototype 1 and 2.
The “News & Updates” section is also removed from the main screen and it
is now situated on a separate page – “Notifications”, accessible by a button in the upper right
corner of the “Home” screen. Some participants were concerned that they might accidentally
tap on a notification (which will mark it as read and remove it from the section) without
actually reading it. The information on the detailed screen (called Graphs in this version) is
also simplified. The gradient colours from the graph are replaced with one solid colour because
they were not consistent with the design. Also, the recommended times are removed because
the users were confused. The limited apps section is moved to a separate screen, accessible
through the Settings menu, which helps to make the Graphs page less text-heavy (see Appendix
D, fig. 1). Grouping the apps into categories and showing their limits as well as the time spent
to reach the limit are now accessible through two icons which can be switched on and off based
on the users’ needs.
The effects that “Happy Screen” is using when the user is over the limit are also improved.
In this version, the blurry effect is applied on apps containing mainly text such as email apps
(Gmail, Outlook etc.) and text-based social media apps (Twitter, Medium etc.). The “broken
screen” is applied to Facebook and also stops the scrolling function of the feed. According to
Eyal and Hoover (2014), Facebook’s feed is the function that makes the social media platform addictive
because the user does not know what the next post will be, and their curiosity keeps them
scrolling endlessly. This is also the reason why many websites and apps integrated similar
feeds. For the social media that contain mainly photos and images (Instagram, Pinterest, Flickr
etc.) a new effect will be applied - simulating acrylic painting - which aims to destroy the users’
pleasure of looking at beautiful images and make them stop using an app temporarily (Figure 5).
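In this version the chosen effect therefore depends on the dominant content type of the limited app. A minimal sketch of that mapping is shown below; the category names and effect labels are assumptions of this example.

```kotlin
// Illustrative mapping from an app's dominant content type to the effect applied
// in prototype 2: blur for text-heavy apps, a cracked screen for endless feeds,
// and an acrylic-paint filter for photo-centric apps. Names are hypothetical.
enum class ContentType { TEXT, FEED, PHOTO }
enum class OverLimitEffect { BLUR, BROKEN_SCREEN, ACRYLIC_PAINT }

fun effectFor(contentType: ContentType): OverLimitEffect = when (contentType) {
    ContentType.TEXT -> OverLimitEffect.BLUR            // e.g. Gmail, Outlook, Twitter, Medium
    ContentType.FEED -> OverLimitEffect.BROKEN_SCREEN   // e.g. Facebook: also stops scrolling
    ContentType.PHOTO -> OverLimitEffect.ACRYLIC_PAINT  // e.g. Instagram, Pinterest, Flickr
}

fun main() {
    println(effectFor(ContentType.PHOTO))  // ACRYLIC_PAINT
}
```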
Participants
For the evaluation of this prototype 4 participants were recruited – 2 males and 2 females, aged
between 28 and 35 (Table 4.). Three of the participants are working full time in marketing,
software development and creative fields and use social media apps for several hours daily for
inspiration, work-related tasks or just for fun. The fourth participant is a recently graduated
student, actively looking for a job, which also requires checking social media groups and other
pages with job advertisements. They were selected as a potential target group – young people
working full time in various fields. Since they are active users of social media platforms, it was assumed
that they might be addicted without knowing it.
The fourth task was to Add new device (Appendix E, fig. 1). At the beginning of the
evaluation, this feature was briefly mentioned and, during the previous task, all participants
guessed the availability of this feature and were even eager to try it. The users were asked
to imagine that they have a personal computer (PC) where “Happy Screen” is also installed
and that they have to link the PC to the phone to see the statistics of both of their devices. Part of
this task was also to exclude (pause) an already added device (Appendix E, fig. 2), which acts
as a filter. In this case, the data that the user sees on the Home screen comes only from the devices
that are not excluded.
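Functionally, excluding a device acts as a filter applied before the screen time of the linked devices is summed. A minimal sketch of that aggregation, under assumed type names, could look as follows.

```kotlin
import java.time.Duration

// Illustrative aggregation across linked devices: a paused (excluded) device is
// filtered out before the total screen time shown on the Home screen is computed.
// Type and property names are assumptions of this sketch.
data class LinkedDevice(
    val name: String,
    val screenTimeToday: Duration,
    val paused: Boolean = false
)

fun totalScreenTime(devices: List<LinkedDevice>): Duration =
    devices.filterNot { it.paused }
        .fold(Duration.ZERO) { acc, device -> acc.plus(device.screenTimeToday) }

fun main() {
    val devices = listOf(
        LinkedDevice("Phone", Duration.ofMinutes(150)),
        LinkedDevice("PC", Duration.ofMinutes(240), paused = true)  // excluded by the user
    )
    println(totalScreenTime(devices))  // PT2H30M: only the phone counts
}
```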
For the last task in this evaluation, the participants were asked to imagine that they have
used their device too much, over the previously set limits. Now they had to try to use some of
the apps again and experience the effects that “Happy Screen” is showing over the overused
apps. Depending on the type of apps, they had different effects. This task tested the
participants’ reactions when they cannot use their devices. Part of the task was also to bypass
the restriction by excluding a device. When they opened “Happy Screen”, they saw the sad faces of
the mascots, which should make them reconsider their decision.
the study and their anonymity will be fully preserved. The recordings were stored securely and
after the data was transcribed, all recordings were immediately deleted.
4. Results
This section presents the results of the study. The concept of a mobile application that will
track screen time and help the user to reduce it was accepted by all of the participants that took
part in the evaluations. Most of them mentioned that they would use the app if it existed. The
results cover three focus groups with 8 participants in total, evaluating the first prototype, and
four moderated usability testing sessions with four individuals.
Counting pickups and unlocks is another addition in the counting column and it is
supported by most of the evaluated apps. The ability to stop or pause the counting could be
considered a way to “cheat”, but in some cases it could be an important feature.
Unfortunately, it is supported by only 3 of the tested apps. Mute has it in its premium
version, which also offers an option to “delete” time that is not supposed to be there –
another way to reduce screen time, even though it is a way of cheating (see Appendix A, fig. 2).
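For illustration, counting pickups while honouring a paused interval can be reduced to filtering unlock events, as in the sketch below. The event model and names are assumptions of this example, not a platform API.

```kotlin
import java.time.Instant

// Illustrative counting of "pickups": each unlock event counts as one pickup,
// unless it falls inside an interval in which the user paused the counting.
data class UnlockEvent(val at: Instant)
data class PausedInterval(val from: Instant, val until: Instant)

fun countPickups(events: List<UnlockEvent>, pauses: List<PausedInterval>): Int =
    events.count { event ->
        pauses.none { pause -> event.at >= pause.from && event.at < pause.until }
    }

fun main() {
    val now = Instant.parse("2019-05-01T10:00:00Z")
    val events = listOf(UnlockEvent(now), UnlockEvent(now.plusSeconds(3600)))
    val pauses = listOf(PausedInterval(now.plusSeconds(3000), now.plusSeconds(5000)))
    println(countPickups(events, pauses))  // 1: the second unlock falls inside the pause
}
```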
Notifications are a feature used by many apps to communicate with the user. In screen
time management apps, this feature could be used to inform the user that they are running out
of time if they have set a limit or to motivate them to use their devices less. However, in some
cases (Quality Time) receiving such a notification makes the user unlock the device, which is then
tracked as screen time usage.
One of the problems identified during the evaluation is that some of the apps, especially
those tested on the iOS system are using the physical location of the device to work. As part of
the test, this setting was changed from the device settings. Moment, Mute and Space instantly
sent notifications asking the user to allow them access to the location, otherwise they could not
work. Location services always drain the battery and in the case of screen time apps, this
feature should not be used.
A huge problem found in almost all of the apps, especially those running on Android is the
inaccuracy of the time tracked. Quality Time tracked 10 minutes and 38 seconds for a call,
which lasted only 1 minute and 15 seconds. Such a difference makes the user question how
exactly the app is working and creates distrust. An example of this case can be seen in
Appendix A, fig. 3. Antisocial also had a tracking problem: sometimes the app was working
fine and sometimes it could not track anything.
Another limitation is the user interfaces of some of the apps. Space, for example, shows a
big empty circle on the “Home” screen (see Appendix A, fig. 4). After several days of usage, it
turned out that this circle symbolises “the space”, based on the set time usage and it gets
smaller when the device is used more, even disappears when the limit is reached. Quality Time
also has some strange UI elements, like the sliders it uses. As shown in Appendix A, fig.
4, the sliders are too long, and it is not clear if the user has to actually slide them or just tap.
to regulate it. However, some of them mentioned that they need more information and
additional graphs where they could see their progress compared with the previous months.
This lack of variety in the ways that the user can set limits inspired the researcher to think
about those different ways to set limits and to include them as features in the prototype.
During the evaluation of the existing apps, it was noticeable that it takes too much time for
users to set limits. This problem was mentioned during the interviews by several users and
was pointed out as one of the things that make them annoyed and unwilling to use a particular
app. A possible solution to this problem could be the integration of machine learning
algorithms that would make the screen time management app “smarter”, so that it knows in
advance which are the most used apps on a specific device. Machine learning is gaining a lot of
popularity and in one way or another it is implemented in many of the apps and services that
we use today.
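Such a “smart” pre-filling of limits does not necessarily require a full learning pipeline; even a simple heuristic over past usage could rank the most used apps and suggest a limit slightly below their average daily use. The sketch below illustrates the idea; the ranking heuristic, the 90% factor and all names are assumptions of this example, and a learned model could replace them.

```kotlin
import java.time.Duration

// Illustrative "smart" pre-filling of limits: rank apps by past usage and suggest
// a limit slightly below each app's average daily use, so the user only has to
// confirm instead of configuring everything manually.
data class DailyUsage(val packageName: String, val minutesPerDay: List<Long>)

fun suggestLimits(history: List<DailyUsage>, topN: Int = 3): Map<String, Duration> =
    history.sortedByDescending { it.minutesPerDay.average() }
        .take(topN)
        .associate { usage ->
            val suggested = (usage.minutesPerDay.average() * 0.9).toLong()
            usage.packageName to Duration.ofMinutes(suggested)
        }

fun main() {
    val history = listOf(
        DailyUsage("com.instagram.android", listOf(90, 110, 100)),
        DailyUsage("com.google.android.gm", listOf(20, 15, 25))
    )
    println(suggestLimits(history, topN = 1))  // {com.instagram.android=PT1H30M}
}
```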
When asked what features they would include if they were designing and developing
a screen time management tool, most of the interviewees mentioned support for multiple
devices. Many of them said that when they reach the limit, they either switch to a
PC/laptop or just open the browser and continue. This is particularly true for social
media apps, which offer both mobile apps and web versions accessed by a browser. As of
today, Screen Time partly supports a similar feature and includes other iOS devices connected
to the same user account. The upcoming macOS version supports the Screen Time feature on
both mobile and desktop systems; however, it is not clear yet how exactly it will work4. The support
of various devices regardless of their operating system would be a great addition to any screen
time management app.
4 https://blog.macsales.com/50187-macos-catalina-features-using-screen-time/
In general, all participants expressed positive feelings about the prototype when they saw it for
the first time. During the onboarding, the mascot was highly appreciated and one of the
participants commented: “It makes me sympathises more” (P1). However, the design of the
next screens seemed confusing for most of the participants and they experienced difficulties
completing the tasks.
On two of the buttons on the Home screen - Notifications and Pickups, there were green
arrows, pointing down, which showed a comparison of the same data from the previous day.
However, those elements were mistaken for a drop-down menu by one of the participants and,
according to another one, they were reminiscent of arrows from stock exchange data. Another
confusion was the title “Pickups” on one of the buttons. Since almost none of the participants had
used a screen time management tool before, they did not know what this title meant or
what the function of this button was. One of the participants expressed their opinion about the
Pickups function as follows:
“The pickups will be useless in that case (if people want to limit only apps).” - P3
The results from the focus groups show that the usefulness of this parameter can be
questioned.
Another confusing part of the Home screen was the section called “News & Updates”. The
problem in this section was again on a user interface level. According to the participants, this
part of the screen has too many arrows. One participant shared that they are attracted by the
visual element and do not read the text, which in this case is more important: “The text should
be more or less the first thing, not the arrows.” (P2). The results from the discussion after
completing this scenario show that this section is useful. However, some participants were
concerned that they might accidentally tap on one of the buttons with checkmarks (which will
be perceived as completed) without actually reading the text. All of the participants took this
section too seriously and suggested various improvements from both a user interface and a user
experience perspective.
Another confusing part of this scenario was the Graphs screen, which followed the design
of the iOS screen time tool. Almost all of the participants in this evaluation were Android users
and did not understand all of the functionalities of this screen. The graph for the screen time
had gradient colours, which represented the screen time data combined with the pickups and
notifications, marked by two different shades of blue. The same shades were used for the
recommended times below this graph. One of the participants expressed their confusion as
follows:
“The gradient is not consistent with the design. I am confused between the
recommended and the usage colours.” - P3
The used and recommended times, shown below the graph, were also not understood by
most of the participants. Some of them preferred to see how much time they have left until
reaching a limit, but others mentioned that they would feel a little bit stressed if they saw a
small number of minutes left. The results showed that all of the participants were confused by
the section “Limited apps” and they did not understand why it shows two numbers. Most of
them suggested that the section should look different and that it should be moved to the Settings,
which should be the place to set the limits. Overall, the Graphs screen had too much text and
one of the participants expressed their opinion as follows:
“There is a lot of information to process here” - P7
Scenario 2: Experiencing some of Happy Screen’s features
The second scenario from this evaluation presented the effects that a potential user will
experience when they use their mobile devices over the limits. All participants reacted
differently when they saw the “Blur screen” effect. Due to a technical limitation, the effect was
mistaken for a bad internet connection and, at first, some participants did not understand that
the blurred part of the screen was an effect from the prototype. However, the results showed that
most participants recognised the cause of the effect, which is visible from the following quote:
“I think it’s because of the time (the screen got blurred). And the app is mad at
me.”. - P2
The other effect - the “broken screen” - was positively received. Almost instantly, the participants
started to compare the two effects. One participant had already broken the physical screen of their
device and this effect did not make a difference to his experience. However, both effects
encouraged some participants to ask a lot of questions, such as:
“Are there other options or it’s only blurry/cracked?” - P6
“Can we choose what effect we prefer?” - P2
Asking such questions expresses the participants’ desire to choose an effect that they like and
prefer to see, which does not correspond to the initial concept of “Happy Screen”.
Both effects triggered various opinions and concerns. Some participants were concerned that
there is no information when the effect will show up:
“I think I’ll panic if my phone suddenly just cracked.” - P7
This comment encouraged the other participants in this group to think about possible ways
that the upcoming effect could be communicated before it occurs. According to the findings,
such a preventive measure could be included in the form of a notification, for example. However,
it could also cause a distraction and encourage the user to open the notification to read it.
Inspired by another screen time management app, one of the participants suggested:
“When you’re in an app there could be a clock that shows how much time have left
(before the limit).” - P1
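Both suggestions – an advance warning and an in-app countdown – can be derived from the same remaining-time value. The sketch below illustrates this; the 5-minute warning threshold and the names are assumptions of this example, not values from the prototype.

```kotlin
import java.time.Duration

// Illustrative pre-limit warning: both the in-app countdown and an advance warning
// can be derived from the remaining allowance.
fun remainingTime(used: Duration, limit: Duration): Duration =
    if (used >= limit) Duration.ZERO else limit.minus(used)

fun shouldWarn(used: Duration, limit: Duration, warnBefore: Duration = Duration.ofMinutes(5)): Boolean {
    val remaining = remainingTime(used, limit)
    return remaining > Duration.ZERO && remaining <= warnBefore
}

fun main() {
    val used = Duration.ofMinutes(56)
    val limit = Duration.ofMinutes(60)
    println(remainingTime(used, limit)) // PT4M: value shown by the in-app clock
    println(shouldWarn(used, limit))    // true: warn before the effect is applied
}
```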
Task 2: Get to know the Home and Graphs screen
The second task started with many comments on the titles used in the user interface. According
to almost all of the participants, the title “Total working time” should be changed, because it
reminds them of a period during which they go to work:
“It sounds like “Working time from 9 to 5” - P9
Part of the task included a privacy notification, which asked the participants to agree with it (by
pressing a button) in order to continue working with the app. All of the participants agreed with the text,
although some of them (P11, P12) were concerned about the app’s access to their private
emails and chat messages. One of the participants took the description too seriously and
admitted that he did not understand how the app would “scan” his content while at the same time
not reading his email:
“After reading this text, I’d probably uninstall the app immediately. I can’t trust
it” - P12
The next part of the task - the Graphs screen - was well accepted. One of the participants
completed the task with some difficulties because she could not see the icons for the categories
and the limits, which can be considered a design issue. They were too thin and, for her, the
Limits icon was also unclear:
“It should be something else. This symbol doesn’t say that I can see the limits when
I press it” - P9
A screen time management tool needs to have an understandable user interface with
appropriate icons that correspond to the various actions. Regarding the part with the limits, all
participants agreed that it is useful:
“It is clear for which apps the time limit is reached and for which not” - P11
Using the red colour as a warning sign was highly appreciated. One participant noted that the
limited apps section should be more visible by making it the default state of the Graphs screen.
One of the participants was curious about setting a limit for a category, which could be
considered a suggestion for a feature. However, the results show that several apps can
belong to more than one category and this could cause problems for some users who use an app
for a different purpose than its initial one - for example, users who use the YouTube app
to listen to music but do not watch the video attached to it.
Task 4: Add new device
In contrast to the positive reactions to the “Devices” section, the task of adding a new
device to this section was difficult for almost all participants. Partly, the issues came from the
instructions, which were not very clear, even though they were ordered and separated into 4
points. To add/link a device, the user had to first install the app on the other device
(a desktop PC, used as an example), open the Settings screen and enter the Happy code of the
PC into the field provided on the phone. This process was too complicated, and each participant
had a different idea of how the linking would work.
The next part of the task was to pause the phone, so that only the data from the other device
would be visible. To achieve this, the app offered two possible ways - (1) through the Settings
screen and (2) by using the mascots on the Home screen. Instructions for this “special” feature
were also included in the onboarding. The results showed that half of the participants (P9 and
P10) did not read all of the texts during the onboarding and accordingly they chose the first
way of pausing. However, their comments were valid:
“But those devices look like an image. There should be something like a comics
bubble to remind me that I can click on them. Now it’s not clear that they are
clickable.” - P9
“I don’t get that part… So, I have to click on one of the devices?… The idea to tap
on the devices is cool but the graphic doesn’t look like something that I can click
on. As a [profession] I see that kind of graphics all the time and I’d never click on
them. Maybe if they are not overlapping and there is a “plus” sign somewhere… I
don’t know, now they are just a normal picture for me.” - P10
browser will the effect be visible. It is a common practice for people who work in big companies
to try to bypass the restrictions on the local network to visit social media websites by using
VPN services, incognito modes on the browsers and so on. As for the smartphone, Sally
mentioned that she tries to reduce her Facebook usage and has deleted the app from her phone.
Now she opens it only from the browser:
“Facebook is the most toxic thing you can have on your phone” - P11
Before the end of the evaluation, all participants were encouraged to share their general
thoughts about the app, as well as suggestions, comments and questions. The results show that
they all like the concept of the app and believe it would help them to track their screen time. One of the
participants (P12) expressed his opinion about the effects and his concern that the
effects might not help him:
“They (the effects) are just technical obstacles, which I can overcome” - P12
According to the results, apps that use artificial intelligence are generally not well
accepted. Some users (P12) prefer to have more control and to decide what kind of content the
app can “read”; others admitted that they felt scared:
“I’m scared that someone is watching my every move. The AI part is the thing that
will make me not using the app. But it looks great and I’d install it just to see if it
works as it should be”. - P12
One of the suggestions for this problem was to assure the users that their data is stored
anonymously. However, such a feature needs further testing. In case the app reaches a
development phase, the artificial intelligence should be either removed or replaced with
another method that simulates it in some way without reading the users’ content.
5. Discussion
Screen time tracking apps are still gaining popularity among both users and researchers. As
mentioned before, very few studies have focused on the effects that these kinds of apps offer. At the
time of conducting this study, none of them examined the design. Therefore, this study aims to
fill this research gap. This thesis presents guidelines on how the design can be improved from
a user experience point of view. It can be used for the development of new screen time apps
and improvement of the existing ones. In this section, the findings and possible interpretations
of the collected data will be discussed, based on the results described in the previous section.
For this study, the researcher created and evaluated two prototypes, by using the research
through design (RtD) approach. RtD generates knowledge by utilizing methods and processes
from design practice (Zimmerman, Forlizzi & Evenson, 2007). In the book “Design research
through practice” (Koskinen, Zimmerman, Binder, Redström & Wensveen, 2011), the authors
explain the RtD practices used in the interaction design research community by referring
to Lab, Field, and Showroom. The Lab practice “focuses on creating novel and much more
aesthetically appealing ways for people to interact with things” (Koskinen et al., 2011), which
is one of the contributions of my study. The Field practice outlines a problem and offers design
solutions that can solve it, such as the “effects” included in the prototypes. The Showroom
practice is used to “design provocative things that challenge the status quo” (Koskinen et al.,
2011). In this study, this practice is used to challenge the users by including machine learning
features and evaluating their reactions.
interaction designers: instead of just showing white screens and popups that stop the user from
opening an app, it is much better to allow them to use it and to make them smile and even laugh
at the effects:
“This effect looks funny! [smiles and continues to scroll] Oh…all photos look like
this [smiles again].” – P1
Another problem that “Happy Screen” tries to solve by improving the design is seriousness.
The existing screen time tracking apps are designed in a way to be perceived as “serious”. They
visualise the screen time data in a variety of ways but so far none of them offers a persuasive
element. Even those with gamification features like Forest and Hold, where after a few days of
usage they become tedious for the user and eventually deleted from the device. The concept of
“Happy Screen” includes such an element and aims to develop a friendly relationship with the
user, which might help them to change. The mascot was inspired by B. J. Fogg’s “functional
triad” framework (Fogg, 2003) and just like Tamagotchi, it can have different facial
expressions, based on the screen time used so far. Surprisingly, it received very positive
comments during both evaluations, as visible from the quotations below:
“I like that the phone icon is always different. It’s cool!” – P2
“The icon is friendly, and it looks cute.” – P4
5.2. Usability
The second contribution of this study is related to usability. According to Nielsen (2012),
usability is “a quality attribute that assesses how easy user interfaces are to use”. Usually,
usability evaluations are conducted in laboratories and require complex procedures (Genc-
Nayebi & Abran, 2018). This study did not measure all of the dimensions of the usability
(Learnability, Efficiency, Memorability, Errors and Satisfaction (Nielsen, 2012)) of screen time
management apps. Instead, the researcher mapped them to the collected data and drew
conclusions from there. The existing screen time tracking tools work in almost the same
way. They do not offer anything other than collecting and visualising data, and their success in terms
of behaviour change in the long term is also unknown. The problems identified in the tested
apps, such as confusing interface elements, make them unfriendly and even repulsive,
which would most likely cause them to fail if their usability were measured in a laboratory. This
aspect is unhelpful for the future vision of screen time apps. Therefore, by including specific
tasks for the evaluations of “Happy Screen” the researcher incorporates new features which
might improve the usability. For example, evaluating how users will react to features such as
machine learning.
usage for every app. It would have been beneficial for the study to have at least two participants who
had used different screen time limiting apps before. They would appreciate the amount
of time that they would save when they decide to set limits. However, during the evaluations,
this feature did not receive many comments. A possible explanation could be the choice of
participants and the fact that they had not used such apps before.
Machine learning offers many advantages for future mobile apps, but it also raises a lot of
questions. The results from the current study show that users are not ready to accept machine
learning as part of their mobile experience yet. On one hand, this could be due to the limitations
of the prototype. Studies show that incorporating machine learning into a prototype is a
difficult task, which requires a new way of prototyping with data that will change over time
(Dove et al., 2017). Besides, since the data is dynamic, the outcomes can also be unpredictable.
“It is hard to effectively imagine what the experience will be, or the likely performance errors
until the system is built; therefore, making it difficult to assess potential value versus
“creepiness”.”, Dove et al. (2017) mention. Building such a system can allow designers to
evaluate concepts and fix potential problems. However, this process will take too much time
compared to the traditional UX process (Yang, Scuito, Zimmerman, Forlizzi & Steinfeld, 2018).
Even though the screen time tracking tools evaluated in the existing studies (Rooksby, Asadzadeh, Rost, Morrison & Chalmers, 2016;
Mehrotra, Pejovic, Vermeulen, Hendley & Musolesi, 2016; Hiniker et al., 2016; Whittaker et
al., 2016; Ko et al., 2015; Löchtefeld et al., 2013) do
not have machine learning features, those studies show significantly better results because the
researchers used apps specifically made for a certain medium. Evaluating with a real app allows
them to gather a great amount of quantitative data over longer periods of evaluation, which
leads to better results. Another advantage of using a real app is that it allows the participants
in the study to be in a real situation and act as they normally do, without the need to imagine
a certain situation. Sometimes it is difficult for the participants to imagine themselves in a
specific situation, and the data received in this case will be different from the data received in a
real environment. It is possible to hypothesise that the problems mentioned in the previous
section are less likely to occur when testing with an already developed tool. Unfortunately, the
time frame of my study and the lack of programming skills did not allow for testing with a real
application.
On the other hand, the problem could be connected with the privacy scandals of recent years,
which showed that users do not have full control over their private data. This has made them
more cautious about what they share online and who has access to their data. Research findings
show that one of the measures people took after these events was to stop downloading and
using specific apps in order to protect their private data (Brandtzaeg, Pultier & Moen, 2018).
Further research should be conducted to investigate possible solutions and ways to incorporate
machine learning into mobile apps without affecting users’ privacy.
and how to solve it. It might turn out that screen time tracking apps are part of the solution to
this problem. The existing ones offer some useful features, but to bring about a real behaviour
change and help their users, their design and features should be improved. The concept of
“Happy Screen” should not be considered a universal solution; it needs further research to
assess the presented design solution in the long term, to determine which features should be
added or changed to attract more users, and to establish how it can be improved if the users’
behaviour patterns also change over time. This study outlines some of the features that could
be included in the future development of similar concepts or could be helpful for upgrading
the existing screen time tracking tools.
5.5. Limitations
Like every study, this one had several limitations. Probably the most common limitation in
this type of study is time. Due to time constraints, the prototype was not evaluated a third
time, as originally planned. Having more time for testing would also have allowed the
researcher to include more participants and to gain a broader view of the concept and its
potential problems.
The number of participants was not sufficient for this type of study. Due to a limited budget,
all of the participants in the first evaluation of the prototype were students, and such
participants tend to express a positive opinion about the prototype in order not to hurt the
interviewer’s feelings. This limitation was partly resolved for the second evaluation, when
non-students were recruited.
Another limitation was the prototyping tool. During the first evaluation, the participants used
their own devices and experienced various technical problems. First, the user interface looked
different due to the different screen sizes. Second, when the prototyping tool generates a link
for sharing the prototype, the link opens in the device’s browser, and some of the features
included in “Happy Screen” could not be tested properly. For example, during the “Broken
screen” effect the participants should be prevented from scrolling, but when the prototype is
opened in a browser this feature of the prototyping tool is not supported. There were no issues
with scrolling when the prototype was opened with the companion app provided by the
prototyping tool for previewing and testing prototypes. However, the participants were warned
that there might be some minor problems, which changed their perception and resulted in
them looking for problems instead of actually testing. For example, several participants
mistook the blur effect for one of those problems, which caused misunderstanding of the
concept. To avoid these issues, for the second evaluation the prototype was designed
specifically for one device (iOS) and the participants were asked to use it. The prototype ran
in the mobile app provided by InVision. Some of the participants, who normally use Android,
experienced difficulties in completing some of the tasks.
Additional issues arise from the concept itself. It requires longer evaluations and data collected
by the app itself for the participants to test and experience “Happy Screen” in a real
environment. This was another major limitation of the study. Mobile applications that use
machine learning and claim to “adapt” to the users’ usage require time and budget to be built
and evaluated. Even if not all of the features are included, evaluating with a real app offers
more accurate data and eliminates issues such as incompatibility with different screen sizes.
6. Conclusion
The purpose of this study was to investigate some of the existing screen time tracking mobile
apps and to explore the possibilities of improving their design by including new features that
could increase their popularity among users and trigger a behaviour change, namely reducing
screen time. To conduct the study, a survey was first created and used to improve the
researcher’s understanding of how aware people are of their screen time usage. During the
study, ten of the most popular screen time tracking apps from the App Store and Google Play
Store were downloaded and evaluated. Despite their popularity, the results from the survey
showed that most of the participants were not using and/or had never used screen time
tracking apps. This caused a change in the initial plan of the study and the inclusion of another
step: online interviews targeting only users of screen time apps. The data collected at this stage
brought insights about the popularity of those apps and some of the problems that their users
experience. Next, a low-fidelity prototype was created and evaluated. The results from the
evaluation were used to improve the prototype and to develop new features, which were
evaluated with a second prototype created in the next step of the study. Both evaluations
demonstrated approval of the concept from the users. In the future, the research on those apps
could be continued by developing and evaluating additional features and improving the
existing ones.
Despite being conducted by a single researcher, the study managed to contribute knowledge
on the primary features that could be included in screen time management tools. It acts as a
basic guideline and could be used for in-depth research on improving the design of screen
time management tools, which will help users to keep their screens happy.
“So, what’s the solution? We can’t abandon technology, nor should we.”
(Alter, 2017)
Acknowledgements
I would like to thank everyone who supported me and my work during the past months. First
and most importantly, to my supervisor, Fatemeh Moradi, for the great support and all the
inspirational meetings. Thank you for guiding me and giving me hope throughout the thesis
process. To everyone who took part in the interviews, focus groups and user tests: this thesis
would not have been possible without you. To all my friends from the HCI program, thank you
for the amazing two years together. And finally, to my family, for always being by my side and
supporting me during my studies.
Thank you!
References
Alter, A. (2017). Irresistible: The rise of addictive technology and the business of keeping us
hooked. Penguin.
Banjanin, Nikolina, Banjanin, Nikola, Dimitrijevic, Ivan, & Pantic, Igor. (2015). Relationship
between internet use and depression: Focus on physiological mood oscillations, social
networking and online addictive behavior. Computers in Human Behavior, 43(C), 308-
312.
Bardus, M., Van Beurden, S., Smith, J., & Abraham, C. (2016). A review and content analysis
of engagement, functionality, aesthetics, information quality, and change techniques in
the most popular commercial apps for weight management. The International Journal
of Behavioral Nutrition and Physical Activity, 13(35), 35.
Brandtzaeg, P., Pultier, A., & Moen, G. (2018). Losing Control to Data-Hungry Apps: A Mixed-
Methods Approach to Mobile App Privacy. Social Science Computer Review,
089443931877770.
Chhabra, H. S., Sharma, Sunil, & Verma, Shalini. (2018). Smartphone app in self-management
of chronic low back pain: A randomized controlled trial. European Spine Journal,
27(11), 2862-2874.
Clarke, V., & Braun, V. (2017). Thematic analysis. The Journal of Positive Psychology, 12(3),
297-298.
Collins, E., Cox, A., Bird, J., & Cornish-Tresstail, C. (2014). Barriers to engagement with a
personal informatics productivity tool. Proceedings of the 26th Australian Computer-
Human Interaction Conference on Designing Futures, 370-379.
De Russis, L., & Monge Roffarello, A. (2017). On the Benefit of Adding User Preferences to
Notification Delivery. Proceedings of the 2017 CHI Conference Extended Abstracts on
Human Factors in Computing Systems, 127655, 1561-1568.
Demirci, Kadir, Akgonul, Mehmet, & Akpinar, Abdullah. (2015). Relationship of smartphone
use severity with sleep quality, depression, and anxiety in university students. Journal
of Behavioral Addictions, 4(2), 85-92.
Ding, Xiang, Xu, Jing, Chen, Guanling, & Xu, Chenren. (2016). Beyond Smartphone Overuse:
Identifying Addictive Mobile Apps. Proceedings of the 2016 CHI Conference Extended
Abstracts on Human Factors in Computing Systems, 07-12, 2821-2828.
Direito, Artur, Pfaeffli Dale, Leila, Shields, Emma, Dobson, Rosie, Whittaker, Robyn, &
Maddison, Ralph. (2014). Do physical activity and dietary smartphone applications
incorporate evidence-based behaviour change techniques? BMC Public Health, 14(1),
646.
Dove, G., Halskov, K., Forlizzi, J., & Zimmerman, J. (2017). UX Design Innovation: Challenges
for Working with Machine Learning as a Design Material. Proceedings of the 2017 CHI
Conference on Human Factors in Computing Systems, 2017, 278-288.
Edwards, E A, Lumsden, J, Rivas, C, Steed, L, Edwards, L A, Thiyagarajan, A, . . . Walton, R T.
(2016). Gamification for health promotion: Systematic review of behaviour change
techniques in smartphone apps. BMJ Open, 6(10), E012447.
Elhai, J. D., Levine, J. C., Dvorak, R. J., & Hall, B. (2016). Fear of missing out, need for touch,
anxiety and depression are related to problematic smartphone use. Computers in
Human Behavior, 63, 509-516.
Elnaffar, S., & El Allam, A. (2018). An app approach to correcting the posture of smartphone
users. 2018 Advances in Science and Engineering Technology International
Conferences (ASET), 1-4.
Eyal, N., & Hoover, R. (2014). Hooked: How to build habit-forming products. Penguin UK.
Filippou, Justin, Cheong, Christopher, & Cheong, France. (2016). Combining The Fogg
Behavioural Model And Hook Model To Design Features In A Persuasive App To
Improve Study Habits. ArXiv.org, ArXiv.org, Jun 11, 2016.
Fogg, B. J. (2009). A behavior model for persuasive design. Proceedings of the 4th
International Conference on Persuasive Technology, 350, 1-7.
Fogg, B. J. (2003). Persuasive technology: Using computers to change what we think and do.
Retrieved from https://ebookcentral.proquest.com
Furst, R., Evans, T., & Roderick, D. (2018). Frequency of College Student Smartphone Use:
Impact on Classroom Homework Assignments. Journal of Technology in Behavioral
Science, 3(2), 49-57.
Genc-Nayebi, N., & Abran, A. (2018). A measurement design for the comparison of expert
usability evaluation and mobile app user reviews.
Giansanti D., Colombaretti L., Simeoni R., Maccioni G. (2019) The Text Neck: Can Smartphone
Apps with Biofeedback Aid in the Prevention of This Syndrome. In: Masia L., Micera
S., Akay M., Pons J. (eds) Converging Clinical and Engineering Research on
Neurorehabilitation III. ICNR 2018. Biosystems & Biorobotics, vol 21. Springer, Cham
Goggin, G., Lincoln, S., & Robards, B. (2014). Facebook’s mobile career. New Media & Society,
16(7), 1068-1086.
Guest, G., MacQueen, K. M., & Namey, E. E. (2011). Applied thematic analysis. Sage
Publications.
Hiniker, A., Hong, S., Kohno, T., & Kientz, J. (2016). MyTime: Designing and Evaluating an
Intervention for Smartphone Non-Use. Proceedings of the 2016 CHI Conference on
Human Factors in Computing Systems, 4746-4757.
Hoffner, C., & Lee, S. (2015). Mobile Phone Use, Emotion Regulation, and Well-Being.
Cyberpsychology, Behavior, and Social Networking, 18(7), 411-416.
Howe, Katherine B, Suharlim, Christian, Ueda, Peter, Howe, Daniel, Kawachi, Ichiro, & Rimm,
Eric B. (2016). Gotta catch’em all! Pokémon GO and physical activity among young
adults: Difference in differences study. BMJ, 355, I6270.
Jung, Sang In, Lee, Na Kyung, Kang, Kyung Woo, Kim, Kyoung, & Lee, Do Youn. (2016). The
effect of smartphone usage time on posture and respiratory function. Journal of
Physical Therapy Science, 28(1), 186-9.
Kim, Y. G., Kang, M. H., Kim, J. W., Jang, J. H., & Oh, J. S. (2013). Influence of the duration
of smartphone usage on flexion angles of the cervical and lumbar spine and on
reposition error in the cervical spine. Physical Therapy Korea, 20(1), 10-17.
Ko, Minsam, Yang, Subin, Lee, Joonwon, Heizmann, Christian, Jeong, Jinyoung, Lee, Uichin,
. . . Chung, Kyong-Mee. (2015). NUGU: A Group-based Intervention App for Improving
Self-Regulation of Limiting Smartphone Use. Proceedings of the 18th ACM Conference
on Computer Supported Cooperative Work & Social Computing, 1235-1245.
Koskinen, I., Zimmerman, J., Binder, T., Redström, J., & Wensveen, S. (2011). Design research
through practice: From the lab, field, and showroom. Waltham, MA: Morgan
Kaufmann.
Kotikalapudi, R., Chellappan, S., Montgomery, F., Wunsch, D., & Lutzen, K. (2012).
Associating Internet Usage with Depressive Behavior Among College Students. Ieee
Technology And Society Magazine, 31(4), 73-80.
Kushlev, K., Proulx, J., & Dunn, E. (2016). "Silence Your Phones": Smartphone Notifications
Increase Inattention and Hyperactivity Symptoms. Proceedings of the 2016 CHI
Conference on Human Factors in Computing Systems, 1011-1020.
Larose, R., Lin, C., & Eastin, M. (2003). Unregulated Internet Usage: Addiction, Habit, or
Deficient Self-Regulation? Media Psychology, 5(3), 225-253.
Löchtefeld, M., Böhmer, M., & Ganev, L. (2013). AppDetox: Helping users with mobile app
addiction. Proceedings of the 12th International Conference on Mobile and Ubiquitous
Multimedia, 1-2.
Mark, G., Iqbal, S., & Czerwinski, M. (2017). How blocking distractions affects workplace focus
and productivity. Proceedings of the 2017 ACM International Joint Conference on
Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International
Symposium on Wearable Computers, 928-934.
Marotta, V., & Acquisti, A. (2017). Online Distractions, Website Blockers, and Economic
Productivity: A Randomized Field Experiment. Preliminary Draft.
Mccartney, M. (2016). Margaret McCartney: Game on for Pokémon Go. BMJ, 354, I4306.
Megna, Gisonni, Napolitano, Orabona, Patruno, Ayala, & Balato. (2018). The effect of
smartphone addiction on hand joints in psoriatic patients: An ultrasound-based study.
Journal of the European Academy of Dermatology and Venereology, 32(1), 73-78.
Mehrotra, A., Pejovic, V., Vermeulen, J., Hendley, R., & Musolesi, M. (2016). My Phone and
Me: Understanding People's Receptivity to Mobile Notifications. Proceedings of the
2016 CHI Conference on Human Factors in Computing Systems, 1021-1032.
Moreno, M. A., Breland, D. J., & Jelenchick, L. (2015). Exploring depression and problematic
internet use among college females: A multisite study. Computers in Human Behavior,
49, 601-607.
Muñoz-Rivas, M. J., Fernández, L., & Gámez-Guadix, M. (2010). Analysis of the indicators of
pathological Internet use in Spanish University students. The Spanish Journal of
Psychology, 13(2), 697–707.
Nielsen, J. (2012). Usability 101: Introduction to Usability. Nielsen Norman Group. Available
at: https://www.nngroup.com/articles/usability-101-introduction-to-usability/
Özdemir, Kuzucu, & Ak. (2014). Depression, loneliness and Internet addiction: How important
is low self-control? Computers in Human Behavior, 34, 284-290.
Pagoto, Schneider, Jojic, Debiasse, & Mann. (2013). Evidence-Based Strategies in Weight-Loss
Mobile Apps. American Journal of Preventive Medicine, 45(5), 576-582.
Przybylski, A. K., Murayama, K., DeHaan, C. R., & Gladwell, V. (2013). Motivational,
emotional, and behavioral correlates of fear of missing out. Computers in Human
Behavior, 29(4), 1841-1848.
Rapp, & Cena. (2016). Personal informatics for everyday life: How users without prior self-
tracking experience engage with personal data. International Journal of Human -
Computer Studies, 94(C), 1-17.
Rooksby, J., Asadzadeh, P., Rost, M., Morrison, A., & Chalmers, M. (2016). Personal Tracking
of Screen Time on Digital Devices. Proceedings of the 2016 CHI Conference on Human
Factors in Computing Systems, 284-296.
Salehan, & Negahban. (2013). Social networking on smartphones: When mobile phones
become addictive. Computers in Human Behavior, 29(6), 2632-2639.
Schoffman, D., Turner-McGrievy, E., Jones, G., & Wilcox, S. (2013). Mobile apps for pediatric
obesity prevention and treatment, healthy eating, and physical activity promotion: Just
fun and games? Translational Behavioral Medicine, 3(3), 320-325.
Shah, P. P., & Sheth, M. S. (2018). Correlation of smartphone use addiction with text neck
syndrome and SMS thumb in physiotherapy students. International Journal Of
Community Medicine And Public Health, 5(6), 2512-2516.
Sheppard, A. L., & Wolffsohn, J. S. (2018). Digital eye strain: prevalence, measurement and
amelioration. BMJ open ophthalmology, 3(1), e000146.
Stothart, C., Mitchum, A., Yehnert, C., & Enns, James T. (2015). The Attentional Cost of
Receiving a Cell Phone Notification. Journal of Experimental Psychology: Human
Perception and Performance, 41(4), 893-897.
Subramanian, R., Freivogel, William, Iyer, Narayanan, Ratnapradipa, Dhitinut, Veenstra,
Aaron, & Xie, Wenjing. (2015). Diet, Exercise, and Smartphones - A Content Analysis
of Mobile Applications for Weight Loss, ProQuest Dissertations and Theses.
Turkle, S. (2011). Alone together: Why we expect more from technology and less from each
other. New York: Basic Books.
Ward, A., Duke, K., Gneezy, A., & Bos, M. (2017). Brain Drain: The Mere Presence of One’s
Own Smartphone Reduces Available Cognitive Capacity. Journal of the Association for
Consumer Research, 2(2), 140-154.
Whittaker, S., Kalnikaite, V., Hollis, V., & Guydish, A. (2016). 'Don't Waste My Time': Use of
Time Information Improves Focus. Proceedings of the 2016 CHI Conference on
Human Factors in Computing Systems, 1729-1738.
Yang, Q., Scuito, A., Zimmerman, J., Forlizzi, J., Steinfeld, A. (2018). Investigating How
Experienced UX Designers Effectively Work with Machine Learning. Proceedings of
the 2018 Designing Interactive Systems Conference (DIS ’18), 585-596. DOI:
https://doi.org/10.1145/3196709.3196730
Yuan, F., Gao, X., & Lindqvist, J. (2017). How Busy Are You?: Predicting the Interruptibility
Intensity of Mobile Users. Proceedings of the 2017 CHI Conference on Human Factors
in Computing Systems, 2017, 5346-5360.
Zimmerman, J., Forlizzi, J., & Evenson, S. (2007). Research through design as a method for
interaction design research in HCI. Proceedings of the SIGCHI Conference on Human
Factors in Computing Systems, 493-502.
Appendix A
Fig. 2. Screenshot from Mute; reduce the
tracked time by deleting a “pickup” card.
Fig. 4. Screenshots from Space and Quality Time showing some UI problems.
Fig. 5. Notifications containing odd words, screenshots from Forest and Mute.
Appendix B:
Questions from the survey:
1. What is your age group?
2. How do you assess the time you spend on your phone per day? (incl. social media, email,
messaging, games, etc.)
3. What do you do to reduce the distractions from your phone while you are working?
4. Do you feel distracted when you are in the middle of something and receive a notification on
your mobile?
5. How do you react when you receive a notification in the following situations: [In a boring
lecture/meeting; During working hours; Meeting with friend(s); During
breakfast/lunch/dinner; Watching favorite movie/tv series; Listening to music; Playing a
favorite game]
6. Which of the following actions can help to reduce the usage of the phone?
7. Are you using/have you used any apps to track/reduce your screen time?
8. Have you ever tried to limit the time you use a specific app, and do you follow such limits?
9. What do you like about screen time management apps?
10. What do you dislike about screen time management apps?
Appendix C:
Evaluation Prototype v1.0, test cases description:
Imagine that you just installed the app. First you will see the onboarding screen, which will
give you some basic information about what the app does. Your first task is to read the texts
on the onboarding screens and then look around the app. I will give you some time to do that.
The battery indicator is a “secret spot” so please don’t click on it for now.
Questions asked after the first task:
• Do you understand the concept of the app?
• Is there something that bothers you? Maybe in the way it works?
• Is everything on the interface clear?
Now, imagine that several hours have passed. You’ve been actively using your device and
now you want to check what your friends have posted on Facebook. If there’s nothing to see
there, use the “secret spot” to go back to the Android main screen and then try to open
Instagram.
Questions asked after the second task:
• How do you feel when you know that you can’t control your phone?
• Do you think that those effects can make you leave your phone?
• How likely is it that you would install the app if it existed?
Fig. 1. Blur effect and “Broken screen” effect, evaluated in prototype v1.0
Appendix D:
Fig. 2. Prototype v2.0 Graphs screen: filtering the used apps by category (left) and by limits (right).
Appendix E:
Fig. 1. Instructions on how to add a device (left) and successfully added one (right).
Fig. 2. One device paused; All devices paused. Evaluated in prototype v2.0
Appendix F:
Evaluation, prototype v2.0 description:
Link to prototype: https://invis.io/YZRYN25K4B9
You are going to test a prototype of an app called Happy Screen, which tracks your screen
time and helps you to reduce it if it is too high. I have prepared a few tasks for you that will
help me identify any problems in the functionality and the design of this app.
TASK 3: SETTINGS
Our next stop is the “Settings” screen. Your next task is to set some limits, but instead of doing
this manually, the app has a feature that analyses all of the apps on your phone and
recommends time limits that you can set. Imagine that you want to see those
recommendations now.
• Where are you going to click?
You really like Instagram and want to change this time limit from 20m to 1h30m. For your
convenience, this is already set in the prototype; no need to type anything.
• Do you still want to change it, after reading the popup?
• Any comments about the popup buttons?
Now, after you made some changes, you have to “Approve” those recommended limits.
• How are you going to do this? [the user should scroll to the end of the page and click a
button]
Great, you just set limits for the most used apps with only a few taps. If you come back to
this screen [the Settings screen] in a few hours and want to see all of the apps with set limits,
how are you going to do that? Can you delete a limit from here? Let’s say the limit for Twitter.
Questions after task 3:
• What do you think about the process of setting and deleting limits?
the scroll function shouldn’t work anymore. Tap once again in the middle top part of the
screen and you’ll be back to the main screen of the phone.
Open Instagram. [the user sees the Art effect] Go back to the main screen of the phone by
tapping on the Instagram logo.
Questions about the effects:
• What do you think about those effects – the blur, the broken screen and the Art
effect?
• Are they relevant for these types of apps?
• Do you think they can make you temporarily stop using those apps?
Imagine that you really want to see your Facebook and Instagram feeds, but the app does not
allow it now.
• Can you think of a way to bypass this restriction? [the user should open “Happy
Screen” again and see that the mascots are crying]
• Maybe if you pause the phone?
Tap on the title again to go back to the main screen and open Instagram or Facebook one
more time.
• Do you see strange effects now?
Fig. 1. Photo of the analysis. The data was grouped into 6 groups: Onboarding, Home
and Graphs, Settings and Limits, Add a device, Effects (this photo), General
comments.