
Implications of Computer Vision Driven Assistive Technologies Towards Individuals with Visual Impairment∗

Linda Wang and Alexander Wong
Waterloo Artificial Intelligence Institute
University of Waterloo
{ly8wang,a28wong}@uwaterloo.ca

arXiv:1905.07844v1 [cs.CV] 20 May 2019

Abstract

Computer vision based technology is becoming ubiquitous in society. One application area that has seen an increase in computer vision is assistive technologies, specifically for those with visual impairment. Research has shown the ability of computer vision models to achieve tasks such as providing scene captions, detecting objects and recognizing faces. Although assisting individuals with visual impairment with these tasks increases their independence and autonomy, concerns over bias, privacy and potential usefulness arise. This paper addresses the positive and negative implications computer vision based assistive technologies have on individuals with visual impairment, as well as considerations for computer vision researchers and developers in order to mitigate the negative implications.

∗ We thank NSERC, the Canada Research Chairs program, and Microsoft.

Figure 1. Positive implications: computer vision-based devices allow blind individuals to navigate independently, recognize faces and read text, which helps them overcome social barriers.

1. Introduction

In recent years, the rise of deep learning has made previously unsolvable tasks possible. One particular area where deep learning has made tremendous progress is computer vision, such as in image recognition, object detection and image understanding. As computer vision results are becoming more promising, larger issues regarding the use of this technology need to be considered. An important area to consider is assistive technologies for those with visual impairments, as computer vision technologies have the potential to aid in tasks where previous solutions have struggled.

Although there are positive aspects of computer vision applications, there are also negative aspects that should be addressed. The use of black-box artificial intelligence solutions raises many concerns, such as the fairness and bias of the models [13]. There are also ethical concerns related to privacy protection, as many computer vision models rely on camera input. In addition, the exclusion of certain groups during the development process may also lead to negative aspects of the technology [15] and, as a result, to low adoption rates.

As AI is becoming more ubiquitous, it is crucial to address issues related to the implications of AI, specifically computer vision driven assistive technology for individuals with visual impairment. The goal of this paper is to review what implications computer vision has on assistive technologies for individuals with visual impairment, as well as considerations for computer vision researchers. The paper is guided by the following questions:

• What are the positive and negative aspects of using computer vision in assistive technologies with respect to the impact on the lives of individuals with visual impairment?

• What should researchers consider while conducting computer vision research to reduce the negative implications of AI-powered assistive technology on the lives of individuals with visual impairment?

2. Positive Implications

Vision impairment and blindness cause a considerable amount of economic and emotional burden not only for the affected persons but also for their caregivers and society at large [11]. The recent rise in computer vision based assistive technologies shows the potential to reduce some of the burden placed on the individuals, as well as on caregivers and society. By assisting visually impaired individuals with tasks
they would otherwise need help in, as shown in Figure 1, their level of independence and autonomy is increased.

Overcoming social barriers: One area where assistive technologies have become an integral part of the lives of those with visual impairment is overcoming barriers faced in everyday life. These individuals face adversity at all stages of life. For instance, severely visually impaired young people use their assistive technology as more than just a device to overcome environmental barriers, but also as a means of communication with peers in their school [18].

Face recognition and optical character recognition: The ever-growing presence of smartphones and advancements in computer vision are transforming the accessibility of assistive technologies, allowing individuals to overcome social barriers and have autonomy over when and how they access information. Smartphone applications such as SeeingAI and Lookout use auditory cues to assist users in identifying scenes, recognizing faces, and reading short text, documents and currency [8, 6].

Navigation assistance: Individuals with visual impairment also face difficulty localizing themselves in unknown indoor and outdoor environments. Research projects are using cameras and sensors to give directions so these individuals can navigate outdoor and indoor environments independently. For instance, a prototype was developed for guiding the visually impaired across streets in a straight line using a wearable computer-based orientation and wayfinding aid [16]. For indoor navigation, Tian et al. developed a proof-of-concept computer vision based indoor wayfinding aid that detects doors and elevators, as well as text on signs, to find different rooms [19].

3. Negative Implications

Although the advancement of technology is evident, only a limited number of assistive technology solutions have emerged to make a social or economic impact and improve quality of life. Fundamental challenges, such as those shown in Table 1, are still to be thoroughly addressed before deploying into assistive technologies.

    Bias                              Gender; Age; Race/Ethnicity
    Privacy                           Exploitation of personal information;
                                      Obtrusiveness of cameras;
                                      Tradeoff between autonomy and privacy costs
    Exclusion in development process  Poor device evaluation;
                                      Age and condition dependent;
                                      Inefficiency in development process

Table 1. Negative implications: bias in computer vision algorithms, privacy concerns related to data collection and cameras, and exclusion in the development process.

Bias: In machine learning, bias refers to statistics that lead to a skew and, as a result, bring an unjust outcome for a population [13]. Bias often stems from training data sample sets that are non-representative of the general population. When algorithms are trained with biased data, they are inherently bound to produce skewed results [5].

One of the biggest implications of applying AI systems with bias is the potential to adversely impact already marginalized groups. In 2012, Klare et al. conducted a study on the influence of gender, race/ethnicity and age on the performance of six different face recognition algorithms, three of which are commercial [10]. The results found lower matching accuracies for females than males, for Blacks compared to other races/ethnicities, and for 18 to 30 year olds compared to other age groups.

In recent years, the low error rates achieved by facial recognition models have led to even more commercialization. However, studies have shown consistent bias in the areas of gender, race and age in these commercial models. Buolamwini and Gebru evaluated the bias present in three commercial automated facial analysis algorithms from IBM, Microsoft and Megvii with respect to phenotypic subgroups [5]. The results showed a significant drop in the performance of state-of-the-art models when applied to images of a particular gender and/or ethnicity group. For instance, male subjects were more accurately classified than females, and lighter subjects were more accurately classified than darker subjects. All three commercial classifiers performed the worst on darker female subjects.

Raji and Buolamwini conducted a second audit of commercial facial analysis models [14]. In this study, performances of target companies, ones that were in the first audit, and non-target companies, Amazon and Kairos, are presented. The results showed that all targets had the greatest reduction in error rates for female and darker faces. For the non-target companies, the performance results were similar to the first audit, with the largest disparity gap between black females and white males.

Although the awareness of disparity improved the facial recognition models from the target companies and produced lower error rates than the non-target companies, the commercialization of these models before evaluating biases and potential impacts on protected groups raises a concern.

Privacy: As shown in Section 2, computer vision based assistive technologies for the visually impaired allow these individuals to gain independence and autonomy over different aspects of their life. However, these devices also pose privacy risks because of the vast amounts of personal data stored. Although individuals with visual impairment
felt that smartphones help them communicate and achieve greater independence, these devices create privacy risks because of the amount of personal data stored. As well, their poor visual acuity makes it hard to safeguard their information, such as noticing whether someone is nearby and eavesdropping [4].

Home monitoring for older adults, who represent the majority of those with visual impairment [1], relieves caregiver burden and allows individuals with severe visual impairment to live independently, but the devices used for monitoring also store personal data. Studies have found that older adults are willing to have activity monitoring shared with family members and doctors if the collected data is useful, but expressed that the greatest concern is exploitation and misuse of their personal health information [9].

Based on these studies, the greatest fear associated with the collection of personal data is that the collected data could end up in the wrong hands and be misused. In addition to the fear of personal information being exploited, the use of cameras is obtrusive and has been found to elicit greater fears than wearable solutions. In a comparison of four ambient intelligence systems, the camera-based behaviour and emergency detection system was perceived with the greatest fear and highest level of concern [9]. However, studies have also shown that there is a tradeoff between gained autonomy and privacy costs. Older adults with lower levels of functioning are willing to accept video cameras and trade off the privacy lost if a camera-based solution could prevent transfer to a long-term care facility [20].

The different perceptions of privacy over the use of data, as well as the potential benefits of using cameras for home monitoring, suggest that privacy is a complex topic. Understanding the variables that influence privacy concerns, and how these concerns can be mediated by potential benefits, is important when developing computer vision based assistive technologies.

Exclusion in development process: The main goal of assistive technologies is to improve the lives of end users. However, when the design of the form or function of the technology is poor, or when inequality exists in technological accessibility, the lives of those affected can be negatively impacted, as can perceptions of their abilities [12]. For instance, a device that has good design, usability and accessibility can still be poorly evaluated: the user's lifestyle and aspirations have to be taken into consideration to receive a positive user evaluation [12].

The lifestyle and desired function of assistive technologies depend on age and level of adaptation to the condition. A predominant want for young disabled people is the significance of being ordinary [18]. No matter the degree of visual impairment, all the participants expressed that inclusion by peers and being ordinary is a big part of their daily lives [18]. In addition to age, how the user has adapted to their condition also impacts the desired functionality of the assistive technology. As users become more accustomed to their condition, they may prefer to perform some activities independently [16].

Not only does including users during the development process point out which areas to focus on, it also saves development time. When testing the usability of the indoor wayfinding device on blind participants, the researchers found that the participants were able to find doors without any problem, since the participants use canes, and realized that text localization and recognition were more useful for indoor navigation [19]. Including the users earlier in the development process could have identified that locating doors is not a problem, and the saved time could have been used to address text localization and recognition.

Figure 2. Design considerations for computer vision researchers.

4. Design Considerations for Researchers

Computer vision has the potential to impact people's lives. However, algorithmic advances to the accuracy of computer vision models alone are insufficient for assistive technologies, which interact with and around humans. Recently, the term human-centered artificial intelligence has been used to refer to intelligent systems that are aware of their interaction with humans and are designed with social responsibility in mind [15]. As researchers, it is important to uphold society's moral and legal obligations to treat citizens fairly, especially those in protected groups that face discrimination. Figure 2 illustrates some considerations to reduce the negative implications mentioned in Section 3.

Bias mitigation: One method to uphold fairness is to mitigate bias. For instance, researchers can use tools such as Google's What-If tool [3] and the IBM AI Fairness 360 toolkit [2] to analyze and identify unwanted bias in datasets and ML models in order to mitigate it. For age, gender and ethnicity, there are different methods to reduce the negative impacts of bias. Das et al. proposed a Multi-Task Convolution Neural Network that employs joint dynamic loss weight adjustments to minimize bias when classifying gender, age and race [7]. There are also methods to reduce bias at the dataset level. Salimi et al. introduced a database repair algorithm, which uses causal pre-processing to reduce or eliminate sources of discrimination for fair ML [17].
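The audit-then-mitigate loop behind these tools can be sketched in a few lines of Python. Everything below is illustrative: the toy labels, predictions and group identifiers are hypothetical, and the inverse-frequency weighting is a simplified version of the reweighing idea implemented in toolkits such as AI Fairness 360, not a method from the works cited above.

```python
from collections import defaultdict

def group_accuracies(y_true, y_pred, groups):
    # Accuracy per demographic group (e.g. gender x skin tone),
    # mirroring the intersectional audits discussed in Section 3.
    correct, total = defaultdict(int), defaultdict(int)
    for t, p, g in zip(y_true, y_pred, groups):
        total[g] += 1
        correct[g] += int(t == p)
    return {g: correct[g] / total[g] for g in total}

def disparity_gap(accs):
    # Largest accuracy gap between the best- and worst-served group.
    return max(accs.values()) - min(accs.values())

def reweighing(groups):
    # Inverse-frequency sample weights so every group contributes
    # equally to training; a simplified take on dataset-level repair.
    counts = defaultdict(int)
    for g in groups:
        counts[g] += 1
    n, k = len(groups), len(counts)
    return [n / (k * counts[g]) for g in groups]

# Hypothetical audit: an unbalanced dataset where the model serves
# one intersectional group far better than the other.
y_true = [1, 0, 1, 1, 0, 1, 0, 1]
y_pred = [1, 0, 1, 1, 0, 1, 1, 1]
groups = ["lighter_male"] * 6 + ["darker_female"] * 2

accs = group_accuracies(y_true, y_pred, groups)
print(accs)                 # per-group accuracy
print(disparity_gap(accs))  # the gap an audit would flag
print(reweighing(groups))   # weights for a mitigation pass
```

In practice the same audit would be run over a held-out set for every protected attribute, and the resulting weights would be fed to a weighted loss during training.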
Disability discrimination: Like age, gender and race, disability status is also a protected characteristic. However, disability discrimination has not been well explored in the literature [21]. Similar to the under-representation of age, gender and racial groups in datasets, as shown in Section 3, there is also potential for under-representation of individuals with disabilities. Ways to mitigate disability bias have likewise not been explored. Compared to gender, race and age, gathering a balanced training dataset is not enough to address biased outcomes for those with a disability [21]. The many different forms and degrees of disability make it difficult for a machine learning model to find patterns, form groups and generalize. With the rise of machine learning based assistive technologies, understanding and assessing the impact on people with disabilities is crucial, especially since disability bias has not been widely explored.

Inclusion of end users: By taking into account where end users will use the assistive technologies, as well as their needs and goals, a task-specific training set and an appropriate model architecture can allow computer vision based devices to be perceived as useful, allowing individuals to gain independence and autonomy. Based on the studies mentioned in Section 3, users are willing to trade off privacy for more autonomy. Thus, by including users in the development process, the devices will be perceived as more useful and will see greater adoption.

Diverse skill set: The ethical implications presented in this paper are difficult for computer vision researchers to address alone. Instead, a team with a diverse set of skills is required to address both the positive and negative implications of an assistive technology. The underlying bias in the models can cause protected groups to feel more isolated. Researchers should be aware of the possible biases the dataset and algorithm may have before the system becomes commercialized and interacts with people in everyday contexts. In addition to bias, the use of cameras raises privacy concerns over who has access to the stored data and the security measures taken to protect personal data. Before deployment, developers should ensure that measures are in place to reduce the chances of data exploitation. By understanding the needs and goals of individuals with visual impairment, designers can effectively address these requirements in the design of the computer vision system, resulting in a more useful device for the end users.

References

[1] Blindness and vision impairment, Oct 2018.
[2] Introducing AI Fairness 360, Sep 2018.
[3] The What-If Tool, Sep 2018.
[4] T. Ahmed, R. Hoyle, K. Connelly, D. Crandall, and A. Kapadia. Privacy concerns and behaviors of people with visual impairments. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, pages 3523–3532, 2015.
[5] J. Buolamwini and T. Gebru. Gender shades: Intersectional accuracy disparities in commercial gender classification. In S. A. Friedler and C. Wilson, editors, Proceedings of the 1st Conference on FAT, volume 81 of PMLR, pages 77–91, 23–24 Feb 2018.
[6] P. Clary. Lookout: an app to help blind and visually impaired people learn about their surroundings, May 2018.
[7] A. Das, A. Dantcheva, and F. Bremond. Mitigating bias in gender, age and ethnicity classification: a Multi-Task Convolution Neural Network approach. In ECCVW 2018, Sept. 2018.
[8] S. Kelley. Seeing AI: Artificial intelligence for blind and visually impaired users, 2019.
[9] F. Kirchbuchner, T. Grosse-Puppendahl, M. R. Hastall, M. Distler, and A. Kuijper. Ambient intelligence from senior citizens' perspectives. In Ambient Intelligence, pages 48–59, Cham, 2015. Springer International Publishing.
[10] B. F. Klare, M. J. Burge, J. C. Klontz, R. W. Vorder Bruegge, and A. K. Jain. Face recognition performance: Role of demographic information. IEEE Transactions on Information Forensics and Security, 7(6):1789–1801, Dec 2012.
[11] J. Köberlein, K. Beifus, C. Schaffert, and R. P. Finger. The economic burden of visual impairment and blindness: a systematic review. BMJ Open, 3(11), 2013.
[12] M. Leo, G. Medioni, M. Trivedi, T. Kanade, and G. Farinella. Computer vision for assistive technologies. Computer Vision and Image Understanding, 154:1–15, 2017.
[13] K. Lloyd. Bias amplification in artificial intelligence systems. CoRR, abs/1809.07842, 2018.
[14] I. D. Raji and J. Buolamwini. Actionable auditing: Investigating the impact of publicly naming biased performance results of commercial AI products. In Conference on AIES, 2019.
[15] M. O. Riedl. Human-centered artificial intelligence and machine learning. CoRR, abs/1901.11184, 2019.
[16] D. A. Ross. Implementing assistive technology on wearable computers. IEEE Intelligent Systems, 16(3):47–53, May 2001.
[17] B. Salimi, L. Rodriguez, B. Howe, and D. Suciu. Capuchin: Causal database repair for algorithmic fairness, 2019.
[18] S. Söderström and B. Ytterhus. The use and nonuse of assistive technologies from the world of information and communication technology by visually impaired young people: a walk on the tightrope of peer inclusion. Disability & Society, 25(3):303–315, 2010.
[19] Y. Tian, X. Yang, C. Yi, and A. Arditi. Toward a computer vision-based wayfinding aid for blind persons to access unfamiliar indoor environments. Machine Vision and Applications, 24:521–535, 2013.
[20] D. Townsend, F. Knoefel, and R. Goubran. Privacy versus autonomy: A tradeoff model for smart home monitoring technologies. In 2011 IEEE EMBC, pages 4749–4752, Aug 2011.
[21] S. Trewin. AI fairness for people with disabilities: Point of view. CoRR, abs/1811.10670, 2018.
