Learning analytics in virtual laboratories: a systematic literature review of empirical research
Ramy Elmoazen, Mohammed Saqr, Mohammad Khalil and Barbara Wasson
*Correspondence: ramy.elmoazen@uef.fi
1 School of Computing, University of Eastern Finland, Yliopistokatu 2, 80100 Joensuu, Finland
2 Centre for the Science of Learning and Technology (SLATE), University of Bergen, Bergen, Norway

Abstract
Remote learning has advanced from the theoretical to the practical sciences with the advent of virtual labs. Although virtual labs allow students to conduct their experiments remotely, it is a challenge to evaluate student progress and collaboration using learning analytics. So far, a study that systematically synthesizes the status of research on virtual laboratories and learning analytics does not exist, which is a gap our study aimed to fill. This study aimed to synthesize the empirical research on learning analytics in virtual labs by conducting a systematic review. We reviewed 21 articles that were
published between 2015 and 2021. The results of the study showed that 48% of studies
were conducted in higher education, with the main focus on the medical field. There
is a wide range of virtual lab platforms, and most of the learning analytics used in the
reviewed articles were derived from student log files for students’ actions. Learning
analytics was utilized to measure the performance, activities, perception, and behavior
of students in virtual labs. The studies cover a wide variety of research domains, platforms, and analytical approaches. The landscape of platforms and applications is therefore fragmented, small-scale, and exploratory, and has thus far not tapped into the potential of learning analytics to support learning and teaching. Educators may therefore need to find common standards, protocols, or platforms to build on each other's findings and advance our knowledge.
Keywords: Virtual laboratory, Remote laboratories, Learning analytics, Distance
education, Online learning
Introduction
The COVID-19 pandemic created an extremely difficult situation that caused anxiety in the academic field. Practical sessions and experiments, which are essential for students' experience and skill development in laboratory-based disciplines, were suspended in schools and universities (Vasiliadou, 2020). Despite the pandemic
conditions, some specialties have started to use virtual labs for teaching biology, chemis-
try, and the natural sciences. Virtual labs have the advantages of unlimited time, immediate
feedback, experiment repetition, and safety for students and the subjects of the experi-
ment (Vasiliadou, 2020). Students’ experience with virtual and simulated experiments helps
prepare them for their physical laboratories and offers a reasonable solution, at least in emergencies (Breakey et al., 2008). Technology affords students several means of commu-
nication, allowing students to interact with teachers, ask for help, or provide feedback about
their learning. Furthermore, students can conduct virtual experiments in groups, allow-
ing for social engagement and collaboration through teamwork (Manchikanti et al., 2017).
Virtual laboratories can generate digital traces to monitor students’ learning and identify
their learning strategies. These traces of students’ interactions with virtual labs revealed an
enhancement in students’ ability to solve problems, engage in critical thinking, develop lab-
oratory skills, and acquire knowledge (Ramadahan & Irwanto, 2018). To take advantage of
such data, the "learning analytics" field was conceptualized to provide insights into learning
by analyzing various student-generated data (Hantoobi et al., 2021).
Learning analytics (LA) is commonly defined as “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs” (SoLAR, 2011). Therefore, LA adopts a
data-driven strategy in educational settings with the ultimate goal of enhancing and opti-
mizing the educational experience for students and teachers. LA has a broad range of
applications in many fields of education, from preschool to postgraduate studies (Adejo &
Connolly, 2017). The LA implementation may provide educational institutions and stake-
holders with multiple significant benefits (Howell et al., 2018; Ifenthaler, 2017). These
include LA being used for students’ collaboration measurement (Saqr, Elmoazen, et al.,
2022), grade prediction (Agudo-Peregrina et al., 2014; Strang, 2017), learning gap identi-
fication (Nyland et al., 2017), failure prediction (Tempelaar et al., 2018), decision making
(Vanessa Niet et al., 2016), active learning support (Kwong et al., 2017), profiling students
(Khalil & Ebner, 2017) and assessment improvement (Azevedo et al., 2019).
LA has been implemented in many contexts, such as the early identification of at-risk stu-
dents for underachievement, the tracking of students’ online activity, the provision of auto-
mated feedback, the facilitation of learning strategies, and the optimization of teamwork
in collaborative learning (Kaliisa et al., 2022; Papamitsiou & Economides, 2014). Previous
systematic reviews have either focused narrowly on the technology and design of virtual laboratories in a single discipline, such as biology (Udin et al., 2020) or chemistry (Chan et al., 2021b), covered a
wider range of disciplines while focusing on a single technology, such as virtual reality (Rah-
man et al., 2022) or provided a more broad-based review of the theoretical and practical
approaches of virtual labs in various fields (Reeves & Crippen, 2021). However, a systematic
review that synthesizes research about how learning analytics are used to monitor, support,
or assess virtual laboratory work does not exist. In this study, we aim to bridge such a gap
and contribute to the literature with a systematic review encompassing all research about
learning analytics and virtual laboratories. We investigate the characteristics, research
methods, and findings of learning analytics in virtual labs. Therefore, the main research question for this study is: How has research on virtual laboratories used learning analytics with regard to educational levels, subjects, applications, and methods of analysis?
Background
Virtual labs
Technology-based training is growing across many areas of practice, and education is not an exception. Organizations are adopting virtual and simulated applications.
Learning analytics
Educational technology has evolved in three distinct waves. The first wave started with
the development of learning management systems (LMS). Social networks are consid-
ered the second wave of educational development that affects learning. Learning analyt-
ics, which is the third wave, is used to improve and optimize education (Fiaidhi, 2014).
LA is a multidisciplinary field that draws on diverse scientific fields, including computer science, education science, data mining, statistics, pedagogy, and behavioral science (Chatti et al., 2012).
The main objectives explored in LA research, which are also its most promising applications in education, are to support instructional strategies; identify at-risk students and provide them with effective interventions; recommend reading materials and learning activities to students; and assess their outcomes (Romero & Ventura, 2020). The use of
LA allows for tracking students’ activities and providing feedback to improve the learn-
ing experience. LA pursued its objectives using various data mining techniques to cre-
ate analytical models, which give a deep look into the learning process and could lead
to more effective learning and pedagogical intervention (Elmoazen et al., 2022; Heik-
kinen et al., 2022). Among the approaches utilized, improved, or introduced in LA are
machine learning, predictive analytics, process and sequence mining, and social network
analysis (Romero & Ventura, 2020). The initial work was mostly algorithms for the pre-
diction of students’ success, and at-risk student identification (Ifenthaler & Yau, 2020).
Then some researchers argued that relying on learning analytics for prediction is not sufficient (Saqr et al., 2022b; Tempelaar et al., 2018) and that it is essential to include pedagogical perspectives while studying the learning process (Gašević et al., 2015; Wong et al., 2019). Accordingly, scholars have given more attention to pedagogical practices and feedback in recent LA research (Banihashem et al., 2022; Wise & Jung, 2019).
In virtual labs, LA techniques have been applied in a variety of ways to investigate the impact of using virtual labs on gaining the necessary skills and competencies. Govaerts et al.
(2012) applied the Student Activity Meter (SAM) to visualize students’ performance
based on many metrics, which they then displayed in a comprehensive dashboard with
dimensional filtering. Similarly, in the FORGE European online learning project, a dash-
board was used to visualize students’ interactions with course materials and each other,
in addition to surveys and questionnaires (Mikroyannidis et al., 2015). The dashboards
of virtual labs present a summary of student progress by visualization using different
statistical charts such as histograms and plots (Garcia-Zubia et al., 2019; Tobarra et al.,
2014).
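To make the underlying computation concrete, the following sketch (in Python, with pandas and matplotlib) derives per-student time-on-task from a toy event log and plots the kind of histogram such dashboards display. The log format and column names are illustrative assumptions, not those of SAM, FORGE, or any reviewed platform.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Toy event log: one row per logged action in the virtual lab.
# Column names (student_id, timestamp, action) are illustrative assumptions.
log = pd.DataFrame({
    "student_id": ["s1", "s1", "s1", "s2", "s2", "s3", "s3"],
    "timestamp": pd.to_datetime([
        "2021-11-08 10:00", "2021-11-08 10:05", "2021-11-08 10:30",
        "2021-11-08 11:00", "2021-11-08 11:20",
        "2021-11-08 12:00", "2021-11-08 12:45",
    ]),
    "action": ["start", "adjust", "end", "start", "end", "start", "end"],
})

# Time-on-task per student: minutes between first and last logged action.
time_on_task = (
    log.groupby("student_id")["timestamp"]
    .agg(lambda t: (t.max() - t.min()).total_seconds() / 60)
    .rename("minutes_on_task")
)
print(time_on_task)

# The kind of histogram a virtual-lab dashboard would display.
time_on_task.plot(kind="hist", bins=5, title="Distribution of time-on-task")
plt.xlabel("Minutes")
plt.show()
```

A real dashboard would layer dimensional filtering (by course, task, or time window) on top of such aggregates.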
Many research papers use interaction data, including statistical extraction of students’
interactions in relation to time spent, the distribution of time-on-task per student, and
different user configurations (Elmoazen et al., 2022; Heikkinen et al., 2022; Ifenthaler &
Yau, 2020). Another approach is to develop an autonomous assessment and recommen-
dation system to analyze real-time activity results and improve students’ performance in
virtual labs (Considine et al., 2019; Gonçalves et al., 2018). For instance, for optimal per-
formance of virtual labs, students should spend appropriate amounts of time interacting
with tools and resources. The relationship between students’ interactions and their aca-
demic progress may be used to study students’ behavior. Moreover, clustering method-
ologies can categorize students by their weaknesses and strengths to study their learning
progress (Tulha et al., 2022).
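As a minimal sketch of such a clustering step, assuming each student has already been summarized by a few engagement features extracted from logs (the feature names, data, and choice of three clusters are illustrative assumptions):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-student features extracted from virtual-lab logs:
# [minutes_on_task, experiments_completed, error_count]
features = np.array([
    [120, 8, 2],
    [30, 2, 9],
    [95, 7, 3],
    [20, 1, 12],
    [110, 9, 1],
])

# Standardize so no single feature dominates the distance metric.
X = StandardScaler().fit_transform(features)

# Group students into three profiles (e.g., strong, average, struggling).
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(labels)  # cluster index per student
```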
Methodology
The authors conducted this review according to the Preferred Reporting Items for Sys-
tematic Reviews and Meta-Analyses (PRISMA) 2020 guidelines (Page et al., 2021) and
the eight fundamental steps of systematic reviews by Okoli (2015). The authors followed
these guidelines to identify the purpose of the review, prepare a protocol draft, identify
inclusion and exclusion criteria, and conduct the search process in order to extract data
and appraise articles’ quality before writing the review.
First, the authors determined that the purpose of the study was to report on the appli-
cation of learning analytics in virtual labs to answer research questions. Following the
assessment of the review’s scope, the authors frequently convened to draft the proto-
col. This document organizes all subsequent actions to reduce the possibility of bias in
the selection of publications and data processing. The protocol ensures reproducibility
and consistency by planning the strategy for practicing and conducting the review (Fink,
2019). Accordingly, the protocol included research questions, the literature search strat-
egy, inclusion criteria, the assessment of the studies, the data extraction, and the planned
schedule (Kitchenham & Charters, 2007).
The inclusion and exclusion criteria for study selection were based on the research questions and guided by a previous review. All articles to be included should use learning analytics in virtual labs and meet the following inclusion and exclusion criteria:
Table 1 The used keywords with wildcards to cover all keyword forms
This combination of keywords was searched in the title, abstract, and author-keyword fields of articles. The search was conducted within two days, on the eighth of November 2021. It returned 1069 articles across the specified databases: 653 from Scopus, 248 from Web of Science (WoS), 120 from the ERIC database, and 48 from the Journal of Learning Analytics.
All articles were uploaded to the Covidence web-based system (https://www.covidence.org/, last accessed September 2022) for analysis, and duplicates were removed. Papers were checked for quality before beginning the stage of synthesis. At this point, the authors organized all of the data within the framework of the review hypothesis (Webster & Watson, 2002). The data analysis is then presented comprehensively from a learning analytics perspective.
Results
The included studies are listed in the appendix, and each one is given an identifier consisting of the capital letter S and a number.
Educational levels
The reviewed studies have populations from various educational levels (Fig. 3). The
majority of the research in the reviewed articles (57.1%) was conducted in higher edu-
cation institutions (n = 12), and two of these studies involved postgraduate students in
their analysis (Burbano & Soler, 2020; Considine et al., 2021). Six studies (28.6%) were
conducted on secondary education, and four of them focused on STEM (science, tech-
nology, engineering, and math) subjects (de Jong et al., 2021; Rodríguez-Triana et al.,
2021; Sergis et al., 2019; Vozniuk et al., 2015). Only two research projects (9.5%) focused
on elementary and middle school students (Metcalf et al., 2017; Reilly & Dede, 2019a).
Finally, one study was conducted online as a Massive Open Online Course (MOOC) for
students of varying education levels (Hossain et al., 2018).
Subjects
The reviewed studies covered different disciplines of science, medicine, and engineering
(Fig. 4). The medical and dental virtual practices were used in practical-based physiol-
ogy courses (King et al., 2016), virtual patient cases (Berman et al., 2018), periodontol-
ogy and oral pathology (Burbano & Soler, 2020) and prosthodontics courses (Chan et al.,
2021a, 2021b). Chemistry virtual labs were used in concentration experiments (Liu et al.,
2018) and organic chemistry (Qvist et al., 2015), while biology labs covered interactive experiments with live Euglena cells (Hossain et al., 2018) and molecular biology experiments (Qvist et al.,
2015). Virtual labs for science classes were available for school students (Metcalf et al.,
2017; Reilly & Dede, 2019b) and students in science, technology, engineering, and math-
ematics (STEM) (de Jong et al., 2021; Rodríguez-Triana et al., 2021; Sergis et al., 2019;
Vozniuk et al., 2015). Virtual labs were used in different fields of computer science,
namely Java programming (Castillo, 2016), cloud applications (Manske & Hoppe, 2016),
and network virtual labs (Venant et al., 2017). The engineering virtual labs covered auto-
motive engineering (Goncalves et al., 2018), container-based virtual labs (Robles-Gómez
et al., 2019), and building electrical circuits (Considine et al., 2021). Other practices include digital electronic simulation environments (Vahdat et al., 2015) and remote labs in the field of image processing (Schwandt et al., 2021).
Virtual environment
The authors of the reviewed articles used a wide range of virtual environments. Go-Lab was used in STEM education (de Jong et al., 2021; Rodríguez-Triana et al., 2021; Sergis et al., 2019) and was combined with other applications such as the GRAASP platform
(Vozniuk et al., 2015) and cloud applications (Manske & Hoppe, 2016). In addition,
the EcoXPT system was utilized in science classes (Metcalf et al., 2017; Reilly & Dede,
2019b). In the medical field, the LabTutor platform was used in physiology courses (King
et al., 2016), the ASUS virtual patient package (Berman et al., 2018), and the M-Health
Smilearning application with TIMONEL platform in the dental field (Burbano & Soler,
2020). Chemistry virtual labs were accessible on two platforms: the ChemVLab+ tutor
(Liu et al., 2018) and the LabLife3D platform (Qvist et al., 2015). In the field of biol-
ogy, virtual labs were available in LabLife3D for molecular biology (Qvist et al., 2015)
and Open edX for Euglena experiments (Hossain et al., 2018). The virtual labs used in
computer science were the Magentix 2 platform with virtual hosts (Castillo, 2016), and
the network Lab4CE (Laboratory for Computer Education) (Venant et al., 2017). Vari-
ous engineering fields utilized different virtual lab platforms, such as Falstad's Circuit Simulator Applet and Virtual Instrumentation Systems in Reality (VISIR) (Goncalves et al., 2018), NetLab for building electrical circuits (Considine et al., 2021), and a con-
tainer-based virtual laboratory (CVL) using Linux Docker containers (Robles-Gómez
et al., 2019). Other labs included DEEDS (Digital Electronics Education and
Design Suite) for digital electronic simulation environments (Vahdat et al., 2015) and
the WebLab-Deusto remote lab management system (RLMS) for image processing
(Schwandt et al., 2021).
Learning analytics
The reviewed studies mainly covered one or more of these variables: performance,
activities, perception, and behavior. Performance was assessed in 11 studies, either
to evaluate the impact of the virtual labs on learning achievement (King et al.,
2016; Metcalf et al., 2017; Reilly & Dede, 2019b; Robles-Gómez et al., 2019; Vah-
dat et al., 2015); improve knowledge (Burbano G & Soler, 2020; Manske & Hoppe,
2016); assess the need for support (Goncalves et al., 2018; Venant et al., 2017) or
assess the inquiry-based educational designs by teachers (de Jong et al., 2021; Sergis
et al., 2019). Ten studies focused on the analysis of students' activities and
the pattern of virtual lab utilization (Castillo, 2016; King et al., 2016; Hossain et al.,
2018; Metcalf et al., 2017; Berman et al., 2018; Liu et al., 2018; Burbano and Soler
2020; Considine et al., 2021; Schwandt et al., 2021; Chan et al., 2021a, 2021b). Nine
studies measured the perceptions towards virtual labs in one of three forms: self-
reported feedback (Berman et al., 2018; Chan et al., 2021a, 2021b; Considine et al.,
2021; Hossain et al., 2018), teacher’s opinions (Qvist et al., 2015; Rodríguez-Triana
et al., 2021; Vozniuk et al., 2015) and students’ satisfaction questionnaires (Castillo,
2016; Robles-Gómez et al., 2019). Three studies identified the behavior pattern of
the students in virtual labs (Robles-Gómez et al., 2019; Vahdat et al., 2015; Venant
et al., 2017).
The learning analytics in the reviewed articles were mainly based on log data from
the virtual lab platforms. The data collected from system log files consist of general
data such as user ids, students’ clicks, the start and end of experiments, and users’
actions (Schwandt et al., 2021). Authors used log files to analyze the patterns of
experiments (Metcalf et al., 2017; Qvist et al., 2015), long-term patterns in MOOC
courses (Hossain et al., 2018), and interactions between learners (Venant et al.,
2017). Some authors used the time sequence as part of their analysis to monitor the
timeline pattern (Qvist et al., 2015), durations of system activities (Burbano G &
Soler, 2020; Vozniuk et al., 2015), time spent on tasks (King et al., 2016), sequence of
actions (Manske & Hoppe, 2016) and comparison between more than one academic
year to assess the improvement when using virtual labs (Robles-Gómez et al., 2019).
Also, the students’ performance can be predicted using engagement metrics of stu-
dent activity (Berman et al., 2018; Castillo, 2016), complexity metrics (Vahdat et al.,
2015), and behavior during practical learning (Venant et al., 2017). Thus, learning
analytics help teachers identify when students are having difficulties and support
them when needed (Goncalves et al., 2018; Sergis et al., 2019; Venant et al., 2017).
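A small sketch of the first step in such analyses is shown below: parsing raw log lines into per-student experiment durations. The log format and action names are invented for illustration and do not come from any reviewed platform.

```python
from datetime import datetime

# Hypothetical raw log lines in "timestamp,user_id,action" format (an assumption).
raw_log = """2021-11-08T10:00:00,s1,experiment_start
2021-11-08T10:25:00,s1,experiment_end
2021-11-08T11:00:00,s2,experiment_start
2021-11-08T11:40:00,s2,experiment_end"""

starts, durations = {}, {}
for line in raw_log.splitlines():
    ts, user, action = line.split(",")
    t = datetime.fromisoformat(ts)
    if action == "experiment_start":
        starts[user] = t
    elif action == "experiment_end" and user in starts:
        # Duration of the experiment session in minutes.
        durations[user] = (t - starts.pop(user)).total_seconds() / 60

print(durations)  # {'s1': 25.0, 's2': 40.0}
```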
Process mining was used as a temporal method to discover the hidden strategies of
students to achieve their goals (Castillo, 2016). Similarly, students’ learning strate-
gies and practical activity sequences were analyzed using sequential pattern mining
to identify behavior variations at different performance levels (Venant et al., 2017).
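Full sequential pattern mining algorithms (e.g., PrefixSpan) are beyond the scope of a short example, but counting frequent consecutive action pairs across students' sequences, as sketched below on invented data, conveys the core idea of surfacing recurring behavioral patterns:

```python
from collections import Counter

# Hypothetical action sequences per student (invented for illustration).
sequences = [
    ["open_lab", "read_instructions", "run_experiment", "submit"],
    ["open_lab", "run_experiment", "run_experiment", "submit"],
    ["open_lab", "read_instructions", "run_experiment", "submit"],
]

# Count consecutive action pairs (bigrams) across all sequences.
bigrams = Counter(
    (a, b) for seq in sequences for a, b in zip(seq, seq[1:])
)

# Patterns shared by many students hint at common strategies.
for pattern, count in bigrams.most_common(3):
    print(pattern, count)
```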
The learning trajectories of students were identified by meta-classification of the
events with their timestamps (Reilly & Dede, 2019b) and by selecting the segments
of interest in log data and then coding the video and audio recordings for these seg-
ments (Liu et al., 2018). Correlation analysis and multiple linear regression analy-
sis were used to address the relationship between access to learning resources and
academic achievement (Chan et al., 2021a, 2021b). Students’ performance was part
of the analysis by monitoring the students’ results in exams (Goncalves et al., 2018)
and extracting their mistakes (Considine et al., 2021). In many studies, virtual labs included built-in learning analytics tools, such as the Learning Analytics Data Collector (LADC) (Vahdat et al., 2015), the Inquiry Learning Space (ILS) dashboard, Teaching and Learning Analytics (TLA), and algorithm-based measurements to analyze the correlation between students' performance and actions (Schwandt et al., 2021).
Finally, virtual patients’ metrics were used to monitor students (Berman et al., 2018).
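As an illustration of the correlation and multiple linear regression analyses mentioned above, the following sketch relates two invented resource-access counts to invented grades; the variable names and numbers are assumptions, not data from the reviewed studies.

```python
import numpy as np

# Invented data: accesses to two resource types and final grades.
video_views = np.array([5, 12, 3, 9, 15, 7])
lab_sessions = np.array([2, 6, 1, 4, 8, 3])
grades = np.array([58, 80, 50, 72, 90, 64])

# Pearson correlation between each predictor and achievement.
print(np.corrcoef(video_views, grades)[0, 1])
print(np.corrcoef(lab_sessions, grades)[0, 1])

# Multiple linear regression via least squares:
# grade ~ b0 + b1 * views + b2 * sessions
X = np.column_stack([np.ones_like(video_views), video_views, lab_sessions])
coef, *_ = np.linalg.lstsq(X, grades, rcond=None)
print(coef)  # [intercept, b1, b2]
```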
In addition, we stress the findings by Birkeland (2022) regarding the absence of collaborative environments and teamwork incentives in current virtual labs; collaboration and teamwork are very common practices in laboratory work, so their absence points to a critical issue.
Since studies do not use a common standard, protocol, or shared methods, their findings
are not shared, portable, or cumulative. Authors and researchers need to think of
common protocols, standards, or application programming interfaces (APIs). Such com-
mon protocols would make efforts more likely to build on each other and results more
likely to be shared.
Virtual laboratory dashboards are in the very early stages of development, and little is
known about the effective elements of dashboards that could help students or teachers. In
fact, how learning analytics can help teachers optimize learning and teaching with virtual
laboratories is still an open area of inquiry. In the same vein, how learning analytics can
help teachers design, assess, or improve learning tasks is still largely unexplored.
This systematic review comes with the following limitations. The search was performed using five search terms in the title and keywords. These terms were generic, which resulted in a large number of articles being initially retrieved and then excluded. Although this complicated the search and filtration processes, it reduced the risk of excluding relevant articles whose authors did not include "virtual labs" among their keywords. Nevertheless, work whose authors used none of our keywords may still have been missed in this review. Although the coding process worked well for most articles, the coders had to make an effort to interpret some of them; to facilitate coding, the authors discussed and determined the primary emphasis of research that was unclear. Also, construct validity may be a concern, as we relied on the authors' descriptions and on code groupings that sometimes differ from the authors' own domain labels, which do not follow a standardized approach or protocol. Finally, the qualitative analysis and the relatively small number of included articles, drawn from a variety of disciplines and research approaches, restrict the ability to make broad generalizations due to a lack of standardization. Nonetheless, this research presents the first systematic overview
of learning analytics in virtual labs. Researchers may utilize our work as a framework and
lens to perform further rigorous research, and we believe that the results we have provided
can serve as a new basis for learning analytics in laboratories.
In summary, our review addressed questions pertaining to the use of learning analytics in virtual laboratories, an area that still has significant knowledge gaps that only future research can shed light on.
Appendix
S1: Qvist 2015 (Qvist et al., 2015). Design of Virtual Learning Environments: Learning Analytics and Identification of Affordances and Barriers. Purpose: to present the design and implementation of virtual laboratories, and to discover student and teacher views on the affordances and barriers to learning in these environments.
Acknowledgements
The authors would like to thank Hanna Birkeland for her contribution to the article screening process. This study is co-funded by the EU's Erasmus+ program within the project "European Network for Virtual lab & Interactive SImulated ONline
learning (ENVISION_2027)” (2020-1-FI01-KA226-HE-092653). The paper is also co-funded by the Academy of Finland
(Suomen Akatemia) Research Council for Natural Sciences and Engineering for the project Towards precision education:
Idiographic learning analytics (TOPEILA), Decision Number 350560, which was received by the second author.
Author contributions
RE led the project. RE and MS contributed to the study design, methods, and manuscript writing; RE contributed to the
results, and MS contributed to the discussion and conclusions. All authors contributed to the writing, provided critical
feedback, and helped shape the research and analysis. All authors read and approved the final manuscript.
Funding
This study is co-funded by the EU's Erasmus+ program within the project "European Network for Virtual lab & Interac-
tive SImulated ONline learning (ENVISION_2027)” (2020-1-FI01-KA226-HE-092653). The paper is also co-funded by the
Academy of Finland (Suomen Akatemia) Research Council for Natural Sciences and Engineering for the project Towards
precision education: Idiographic learning analytics (TOPEILA), Decision Number 350560, which was received by the
second author.
Declarations
Competing interests
The authors have no competing interests to declare that are relevant to the content of this article.
References
Abdulwahed, M., & Nagy, Z. K. (2013). Developing the TriLab, a triple access mode (hands-on, virtual, remote) laboratory,
of a process control rig using LabVIEW and Joomla. Computer Applications in Engineering Education, 21(4), 614–626.
https://doi.org/10.1002/cae.20506
Adejo, O., & Connolly, T. (2017). Learning analytics in a shared-network educational environment: ethical issues and
countermeasures. International Journal of Advanced Computer Science and Applications. https://doi.org/10.14569/
ijacsa.2017.080404
Agudo-Peregrina, Á. F., Iglesias-Pradas, S., Conde-González, M. Á., & Hernández-García, Á. (2014). Can we predict success
from log data in VLEs? Classification of interactions for learning analytics and their relation with performance in
VLE-supported F2F and online learning. Computers in Human Behavior, 31(1), 542–550. https://doi.org/10.1016/j.
chb.2013.05.031
Ali, N., & Ullah, S. (2020). Review to analyze and compare virtual chemistry laboratories for their use in education. Journal
of Chemical Education, 97(10), 3563–3574. https://doi.org/10.1021/acs.jchemed.0c00185
Ali, N., Ullah, S., Alam, A., & Rafique, J. (2014). 3D interactive virtual chemistry laboratory for simulation of high school
experiments. Proceedings of Eurasia Graphics, Vember, 2015, 1–6.
Ângulo, A., & Velasco, G. V. de. (2014). Immersive Simulation of Architectural Spatial Experiences. pp. 495–499. https://doi.
org/10.5151/despro-sigradi2013-0095
Azevedo, J. M., Oliveira, E. P., & Beites, P. D. P. D. (2019). Using learning analytics to evaluate the quality of multiple-choice
questions: A perspective with classical test theory and item response theory. International Journal of Information
and Learning Technology, 36(4), 322–341. https://doi.org/10.1108/IJILT-02-2019-0023
Banihashem, S. K., Noroozi, O., van Ginkel, S., Macfadyen, L. P., & Biemans, H. J. A. (2022). A systematic review of the role of
learning analytics in enhancing feedback practices in higher education. Educational Research Review, 37, 100489.
https://doi.org/10.1016/j.edurev.2022.100489
Beardsley, P., Csikari, M., Ertzman, A., & Jeffus, M. (2022). BioInteractive’s free online professional learning course on evolu-
tion. The American Biology Teacher, 84(2), 68–74. https://doi.org/10.1525/abt.2022.84.2.68
Berman, N. B., & Artino, A. R., Jr. (2018). Development and initial validation of an online engagement metric using virtual patients. BMC Medical Education, 18(1), 213. https://doi.org/10.1186/s12909-018-1322-z
Birkeland, H. (2022). Understanding collaboration in virtual labs: A learning analytics framework. The University of Bergen.
Breakey, K. M., Levin, D., Miller, I., & Hentges, K. E. (2008). The use of scenario-based-learning interactive software to create
custom virtual laboratory scenarios for teaching genetics. Genetics, 179(3), 1151–1155. https://doi.org/10.1534/
genetics.108.090381
Burbano, G. D. C., & Soler, J. A. (2020). Learning analytics in m-learning: Periodontic education. Communications in Com-
puter and Information Science, 1280, 128–139. https://doi.org/10.1007/978-3-030-62554-2_10
Castillo, L. (2016). A virtual laboratory for multiagent systems: Joining efficacy, learning analytics and student satisfaction.
In International symposium on computers in education (SIIE 2016): learning analytics technologies, 1–6. https://doi.
org/10.1109/SIIE.2016.7751820
Chan, A. K. M., Botelho, M. G., & Lam, O. L. T. (2021a). The relation of online learning analytics, approaches to learning
and academic achievement in a clinical skills course. European Journal of Dental Education: Official Journal of the
Association for Dental Education in Europe, 25(3), 442–450. https://doi.org/10.1111/eje.12619
Chan, P., Van Gerven, T., Dubois, J.-L., & Bernaerts, K. (2021b). Virtual chemical laboratories: A systematic literature review of
research, technologies and instructional design. Computers and Education Open, 2, 100053.
Chatti, M. A., Dyckhoff, A. L., Schroeder, U., & Thüs, H. (2012). A reference model for learning analytics. International Journal
of Technology Enhanced Learning, 4(5/6), 318. https://doi.org/10.1504/ijtel.2012.051815
Childers, G., & Jones, M. G. (2015). Students as Virtual scientists: An exploration of students’ and teachers’ perceived real-
ness of a remote electron microscopy investigation. International Journal of Science Education, 37(15), 2433–2452.
https://doi.org/10.1080/09500693.2015.1082043
Chiu, J. L., Dejaegher, C. J., & Chao, J. (2015). The effects of augmented virtual science laboratories on middle school
students’ understanding of gas properties. Computers and Education, 85, 59–73. https://doi.org/10.1016/j.compe
du.2015.02.007
Christian, W., Esquembre, F., & Barbato, L. (2011). Open source physics. Science, 334(6059), 1077–1078. https://doi.org/10.
1126/science.1196984
Civelek, T., Ucar, E., Ustunel, H., & Aydin, M. K. (2014). Effects of a haptic augmented simulation on K-12 students’ achieve-
ment and their attitudes towards physics. Eurasia Journal of Mathematics, Science and Technology Education, 10(6),
565–574.
Considine, H., Nedic, Z., & Nafalski, A. (2019). Automation of basic supervision tasks in a remote laboratory-case study
netlab. In Proceedings of the 2019 5th experiment at international conference, exp.at 2019, (pp. 189–192). https://doi.
org/10.1109/EXPAT.2019.8876508
Considine, H., Nafalski, A., & Milosz, M. (2021). An Automated support system in a remote laboratory in the context of
online learning. In M. E. Auer & T. Rüütmann (Eds.), Educating Engineers for future industrial revolutions: proceedings
of the 23rd international conference on interactive collaborative learning (ICL2020), Volume 2 (pp. 657–665). Springer
International Publishing. https://doi.org/10.1007/978-3-030-68201-9_64
Dalgarno, B., Bishop, A. G., & Bedgood, R. (2003). The potential of virtual laboratories for distance education science teach-
ing: reflections from the development and evaluation of a virtual chemistry laboratory. In K. Placing (Ed.), UniServe
science improving learning outcomes symposium proceedings (pp. 90–115).
de Jong, T., Gillet, D., Rodríguez-Triana, MJe., Hovardas, T., Dikke, D., Doran, R., Dziabenko, O., Koslowsky, J., Korventausta,
M., Law, E., Pedaste, M., Tasiopoulou, E., Vidal, G., & Zacharia, Z. C. (2021). Understanding teacher design practices
for digital inquiry–based science learning: the case of Go-Lab. Educational Technology Research and Development,
69(2), 417–444. https://doi.org/10.1007/s11423-020-09904-z
de Ribaupierre, S., Kapralos, B., Haji, F., Stroulia, E., Dubrowski, A., & Eagleson, R. (2014). Healthcare training enhancement
through virtual reality and serious games. Intelligent Systems Reference Library, 68, 9–27. https://doi.org/10.1007/
978-3-642-54816-1_2
Dziabenko, O., & Budnyk, O. (2019). Go-Lab Ecosystem: Using Online Laboratories in a Primary School. EDULEARN19
Proceedings, 1, 9276–9285. https://doi.org/10.21125/edulearn.2019.2304
Elmoazen, R., Saqr, M., Tedre, M., & Hirsto, L. (2022). A systematic literature review of empirical research on epistemic
network analysis in education. IEEE Access, 10, 17330–17348. https://doi.org/10.1109/ACCESS.2022.3149812
Eve, E. J., Koo, S., Alshihri, A. A., Cormier, J., Kozhenikov, M., Donoff, R. B., & Karimbux, N. Y. (2014). Performance of dental
students versus prosthodontics residents on a 3D immersive haptic simulator. Journal of Dental Education, 78(4),
630–637.
Fiaidhi, J. (2014). The next step for learning analytics. IT Professional, 16(5), 4–8. https://doi.org/10.1109/MITP.2014.78.
Fink, A. (2019). Conducting research literature reviews: From the Internet to Paper. SAGE Publications.
Freina, L., & Ott, M. (2015). A literature review on immersive virtual reality in education: State of the art and perspectives.
In Proceedings of eLearning and Software for Education (eLSE) (Bucharest, Romania, April 23–24, 2015), 8.
Garcia-Zubia, J., Cuadros, J., Serrano, V., Hernandez-Jayo, U., Angulo-Martinez, I., Villar, A., Orduna, P., & Alves, G. (2019).
Dashboard for the VISIR remote lab. In Proceedings of the 2019 5th experiment at international conference, Exp.at
2019, (pp. 42–46). https://doi.org/10.1109/EXPAT.2019.8876527
Gašević, D., Dawson, S., & Siemens, G. (2015). Let’s not forget: Learning analytics are about learning. TechTrends: For Leaders
in Education & Training, 59(1), 64–71. https://doi.org/10.1007/s11528-014-0822-x
Goncalves, A. L., Carlos, L. M., Da Silva, J. B., & Alves, G. R. (2018). Personalized student assessment based on learning ana-
lytics and recommender systems. In 3rd International conference of the portuguese society for engineering education
(CISPEE 2018). https://doi.org/10.1109/CISPEE.2018.8593493
Gonçalves, A. L., Alves, G. R., Carlos, L. M., Da Silva, J. B., & Alves, D. M. (2018). Learning analytics and recommender systems
toward remote experimentation. CEUR Workshop Proceedings, 2188, 26–37.
Govaerts, S., Verbert, K., Duval, E., & Pardo, A. (2012). The student activity meter for awareness and self-reflection. CHI ’12,
Austin, Texas, 869–884. https://doi.org/10.1145/2212776.2212860
Green, J., Wyllie, A., & Jackson, D. (2014). Virtual worlds: A new frontier for nurse education? Collegian, 21(2), 135–141.
https://doi.org/10.1016/j.colegn.2013.11.004
Griol, D., Molina, J. M., & Callejas, Z. (2014). An approach to develop intelligent learning environments by means of
immersive virtual worlds. Journal of Ambient Intelligence and Smart Environments, 6(2), 237–255. https://doi.org/10.
3233/AIS-140255
Hantoobi, S., Wahdan, A., Al-Emran, M., & Shaalan, K. (2021). A review of learning analytics studies. In M. Al-Emran & K.
Shaalan (Eds.), Recent advances in technology acceptance models and theories (pp. 119–134). Cham: Springer Inter-
national Publishing. https://doi.org/10.1007/978-3-030-64987-6_8
Hauze, S., & Frazee, J. (2019). Virtual immersive teaching and learning: How immersive technology is shaping the way
students learn. EdMedia+ Innovate Learning, (pp. 1445–1450).
Heikkinen, S., Saqr, M., Malmberg, J., & Tedre, M. (2022). Supporting self-regulated learning with learning analytics
interventions – a systematic literature review. Education and Information Technologies. https://doi.org/10.1007/
s10639-022-11281-4
Hossain, Z., Bumbacher, E., Brauneis, A., Diaz, M., Saltarelli, A., Blikstein, P., & Riedel-Kruse, I. H. (2018). Design guidelines and
empirical case study for scaling authentic inquiry-based science learning via open online courses and interactive
biology cloud labs. International Journal of Artificial Intelligence in Education, 28(4), 478–507. https://doi.org/10.
1007/s40593-017-0150-3
Hossain, Z., Bumbacher, E. W., Chung, A. M., Kim, H., Litton, C., Walter, A. D., Pradhan, S. N., Jona, K., Blikstein, P., & Riedel-
Kruse, I. H. (2016). Interactive and scalable biology cloud experimentation for scientific inquiry and education.
Nature Biotechnology, 34(12), 1293–1298. https://doi.org/10.1038/nbt.3747
Howell, J. A., Roberts, L. D., Seaman, K., & Gibson, D. C. (2018). Are we on our way to becoming a “Helicopter University”?
Academics’ views on learning analytics. Technology, Knowledge and Learning, 23(1), 1–20. https://doi.org/10.1007/
s10758-017-9329-9
Ifenthaler, D., & Yau, J. Y. K. (2020). Utilising learning analytics to support study success in higher education: A systematic review. Educational Technology Research and Development. https://doi.org/10.1007/s11423-020-09788-z
Ifenthaler, D. (2017). Are higher education institutions prepared for learning analytics? TechTrends, 61(4), 366–371. https://
doi.org/10.1007/s11528-016-0154-0
Izatt, E., Scholberg, K., & Kopper, R. (2014). Neutrino-KAVE: An immersive visualization and fitting tool for neutrino physics
education. Proceedings - IEEE Virtual Reality. https://doi.org/10.1109/VR.2014.6802062
Jona, K., & Vondracek, M. (2013). A remote radioactivity experiment. Physics Teacher, 51(1), 25–27. https://doi.org/10.
1119/1.4772033
Jones, N. (2018). Simulated labs are booming. Nature, 562(7725), S5–S7. https://doi.org/10.1038/d41586-018-06831-1
Kaliisa, R., Rienties, B., Mørch, A. I., & Kluge, A. (2022). Social learning analytics in computer-supported collaborative learn-
ing environments: A systematic review of empirical studies. Computers and Education Open, 3, 100073. https://doi.
org/10.1016/j.caeo.2022.100073
Khalil, M., & Ebner, M. (2017). Clustering patterns of engagement in Massive Open Online Courses (MOOCs): The use of
learning analytics to reveal student categories. Journal of Computing in Higher Education, 29(1), 114–132. https://
doi.org/10.1007/s12528-016-9126-9
King, D. A., Arnaiz, I. A., Gordon-Thomson, C., Randal, N., & Herkes, S. M. (2016). Evaluation and use of an online data acqui-
sition and content platform for physiology practicals and tutorials. International Journal of Innovation in Science and
Mathematics Education, 24(5), 24–34.
Kitchenham, B., & Charters, S. (2007). Guidelines for performing Systematic Literature Reviews in Software Engineering.
Kleven, N. F., & Prasolova-Førland, E. (2014). Virtual university hospital as an arena for medical training and health education.
Kumpulainen, M., & Seppänen, M. (2022). Combining web of science and scopus datasets in citation-based literature
study. Scientometrics, 127(10), 5613–5631. https://doi.org/10.1007/s11192-022-04475-7
Kwong, T., Wong, E., & Yue, K. (2017). Bringing abstract academic integrity and ethical concepts into real-life situations.
Technology, Knowledge and Learning, 22(3), 353–368. https://doi.org/10.1007/s10758-017-9315-2
Liu, R., Stamper, J. C., & Davenport, J. (2018). A novel method for the in-depth multimodal analysis of student learning
trajectories in intelligent tutoring systems. Journal of Learning Analytics. https://doi.org/10.18608/jla.2018.51.4
Lynch, T., & Ghergulescu, I. (2017). Review of virtual labs as the emerging technologies for teaching STEM subjects. In INTED2017 Proceedings, 1, 6082–6091. https://doi.org/10.21125/inted.2017.1422
Manchikanti, P., Kumar, B. R., & Singh, V. K. (2017). Role of Virtual Biology Laboratories in Online and Remote Learning. In
Proceedings - IEEE 8th International Conference on Technology for Education, T4E 2016, (pp. 136–139). https://doi.org/
10.1109/T4E.2016.035
Manske, S., & Hoppe, H. U. (2016). The concept cloud: Supporting collaborative knowledge construction based on
semantic extraction from learner-generated artefacts. In 16th international conference on advanced learning tech-
nologies (ICALT 2016), 302–306. https://doi.org/10.1109/ICALT.2016.123
Metcalf, S. J., Kamarainen, A. M., Grotzer, T. A., & Dede, C. J. (2017). Changes in student experimentation strategies within an
inquiry-based immersive virtual environment (Annual Meeting of the American Educational Research Association
(AERA)).
Mikroyannidis, A., Gomez-Goiri, A., Domingue, J., Tranoris, C., Pareit, D., Vanhie-Van Gerwen, J., & Marquez Barja, J. M.
(2015). Deploying learning analytics for awareness and reflection in online scientific experimentation. CEUR Work-
shop Proceedings, 1465, 105–111.
Nyland, R., Davies, R. S., Chapman, J., & Allen, G. (2017). Transaction-level learning analytics in online authentic assess-
ments. Journal of Computing in Higher Education, 29(2), 201–217. https://doi.org/10.1007/s12528-016-9122-0
Okoli, C. (2015). A guide to conducting a standalone systematic literature review. Communications of the Association for
Information Systems, 37(1), 879–910.
Page, M. J., McKenzie, J. E., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., Shamseer, L., Tetzlaff, J. M., Akl, E. A.,
Brennan, S. E., Chou, R., Glanville, J., Grimshaw, J. M., Hróbjartsson, A., Lalu, M. M., Li, T., Loder, E. W., Mayo-Wilson,
E., McDonald, S., & Moher, D. (2021). The PRISMA 2020 statement: an updated guideline for reporting systematic
reviews. Systematic Reviews, 10(1), 89. https://doi.org/10.1186/s13643-021-01626-4
Papamitsiou, Z., & Economides, A. A. (2014). Learning analytics and educational data mining in practice: A systematic
literature review of empirical evidence. Journal of Educational Technology & Society, 17(4), 49–64.
Potkonjak, V., Gardner, M., Callaghan, V., Mattila, P., Guetl, C., Petrović, V. M., & Jovanović, K. (2016). Virtual laboratories for
education in science, technology, and engineering: A review. Computers and Education, 95, 309–327. https://doi.
org/10.1016/j.compedu.2016.02.002
Qvist, P., Kangasniemi, T., Palomäki, S., Seppänen, J., Joensuu, P., Natri, O., Närhi, M., Palomäki, E., Tiitu, H., Nordström, K.,
Palomaki, S., Seppanen, J., Joensuu, P., Natri, O., Narhi, M., Palomaki, E., Tiitu, H., Nordstrom, K., Palomäki, S., & Nord-
ström, K. (2015). Design of virtual learning environments: learning analytics and identification of affordances and
barriers. International Journal of Engineering Pedagogy (IJEP), 5(4), 64. https://doi.org/10.3991/ijep.v5i4.4962
Rahman, F., Mim, M. S., Baishakhi, F. B., Hasan, M., & Morol, M. K. (2022). A systematic review on interactive virtual reality
laboratory. In ACM international conference proceeding series (pp. 491–500). https://doi.org/10.1145/3542954.3543025
Ramadahan, M. F., & Irwanto. (2018). Using virtual labs to enhance students’ thinking abilities, skills, and scientific atti-
tudes. In International conference on educational research and innovation (ICERI 2017), Iceri, (pp. 494–499).
Reeves, S. M., & Crippen, K. J. (2021). Virtual laboratories in undergraduate science and engineering courses: A sys-
tematic review, 2009–2019. Journal of Science Education and Technology, 30(1), 16–30. https://doi.org/10.1007/
S10956-020-09866-0
Reilly, J. M., & Dede, C. (2019a). Differences in student trajectories via filtered time series analysis in an immersive virtual world. In Proceedings of the 9th international conference on learning analytics & knowledge (LAK 2019). https://doi.org/10.1145/3303772.3303832
Reilly, J. M., & Dede, C. (2019b). Differences in student trajectories via filtered time series analysis in an immersive virtual world. In Proceedings of the 9th international conference on learning analytics & knowledge (LAK 2019) (pp. 130–134). https://doi.org/10.1145/3303772.3303832
Richard, E., Tijou, A., Richard, P., & Ferrier, J. L. (2006). Multi-modal virtual environments for education with haptic and
olfactory feedback. Virtual Reality, 10(3–4), 207–225. https://doi.org/10.1007/s10055-006-0040-8
Richter, T., Boehringer, D., & Jeschke, S. (2011). LiLa: A European project on networked experiments. In Automation, com-
munication and cybernetics in science and engineering 2009/2010 (pp. 307–317). Springer Berlin Heidelberg. https://
doi.org/10.1007/978-3-642-16208-4_27
Robbins, J. B. (2001). ERIC: Mission, structure, and resources. Government Information Quarterly, 18(1), 5–17. https://doi.
org/10.1016/S0740-624X(00)00062-9
Robles-Gómez, A., Tobarra, L., Pastor, R., Hernández, R., Duque, A., & Cano, J. (2019). Analyzing the students’ learning within
a container-based virtual laboratory for cybersecurity. 7th International Conference on Technological Ecosystems for
Enhancing Multiculturality (TEEM 2019), (pp. 275–283). https://doi.org/10.1145/3362789.3362840
Rodrigues, H. F., Machado, L. S., & Valença, A. M. G. (2014). Applying haptic systems in serious games: A game for adult’s
oral hygiene education. Journal on Interactive Systems, 5(1), 1. https://doi.org/10.5753/jis.2014.639
Rodríguez-Triana, M. J., Prieto, L. P., Dimitriadis, Y., de Jong, T., & Gillet, D. (2021). ADA for IBL: Lessons Learned in aligning
learning design and analytics for inquiry-based learning orchestration. Journal of Learning Analytics, 8(2), 22–50.
https://doi.org/10.18608/jla.2021.7357
Romero, C., & Ventura, S. (2020). Educational data mining and learning analytics: An updated survey. Wires Data Mining
and Knowledge Discovery. https://doi.org/10.1002/widm.1355
Salmi, H., Thuneberg, H., & Vainikainen, M. P. (2017). Making the invisible observable by Augmented Reality in informal sci-
ence education context. International Journal of Science Education, Part b: Communication and Public Engagement,
7(3), 253–268. https://doi.org/10.1080/21548455.2016.1254358
Saqr, M., Jovanovic, J., Viberg, O., & Gašević, D. (2022b). Is there order in the mess? A single paper meta-analysis approach
to identification of predictors of success in learning analytics. In Studies in Higher Education, (pp. 1–22). https://doi.
org/10.1080/03075079.2022.2061450
Saqr, M. (2018). A literature review of empirical research on learning analytics in medical education. International Journal
of Health Sciences, 12(2), 80–85.
Saqr, M., Elmoazen, R., Tedre, M., López-Pernas, S., & Hirsto, L. (2022a). How well centrality measures capture student
achievement in computer-supported collaborative learning? – A systematic review and meta-analysis. Educational
Research Review, 35, 100437. https://doi.org/10.1016/j.edurev.2022.100437
Schwandt, A., Winzker, M., & Rohde, M. (2021). Utilizing user activity and system response for learning analytics in a
remote lab. In M. E. Auer & D. May (Eds.), Cross reality and data science in engineering: proceedings of the 17th interna-
tional conference on remote engineering and virtual instrumentation (pp. 63–74). Springer International Publishing.
https://doi.org/10.1007/978-3-030-52575-0_5
Sergis, S., Sampson, D. G., Rodríguez-Triana, M. J., Gillet, D., Pelliccione, L., & de Jong, T. (2019). Using educational data from
teaching and learning to inform teachers’ reflective educational design in inquiry-based STEM education. Comput-
ers in Human Behavior, 92, 724–738. https://doi.org/10.1016/j.chb.2017.12.014
Slater, T. F., Burrows, A. C., French, D. A., Sanchez, R. A., & Tatge, C. B. (2014). A proposed astronomy learning progression for
remote telescope observation. Journal of College Teaching & Learning (TLC). https://doi.org/10.19030/tlc.v11i4.8857
SoLAR. (2011). What is Learning Analytics? https://www.solaresearch.org/about/what-is-learning-analytics/
Strang, K. D. (2017). Beyond engagement analytics: Which online mixed-data factors predict student learning outcomes?
Education and Information Technologies, 22(3), 917–937. https://doi.org/10.1007/s10639-016-9464-2
Tempelaar, D., Rienties, B., Mittelmeier, J., & Nguyen, Q. (2018). Student profiling in a dispositional learning analytics
application using formative assessment. Computers in Human Behavior, 78, 408–420. https://doi.org/10.1016/j.chb.
2017.08.010
Tobarra, L., Ros, S., Hernández, R., Robles-Gómez, A., Caminero, A. C., & Pastor, R. (2014). Integrated Analytic dashboard for
virtual evaluation laboratories and collaborative forums. In Proceedings of XI Tecnologias Aplicadas a La Ensenanza
de La Electronica (Technologies Applied to Electronics Teaching), TAEE 2014. https://doi.org/10.1109/TAEE.2014.6900177
Tulha, C. N., Carvalho, M. A. G., & De Castro, L. N. (2022). LEDA: A Learning Analytics Based Framework to Analyze Remote
Labs Interaction. In Proceedings of the 9th ACM Conference on Learning @ Scale (L@S ’22), (pp. 379–383). https://doi.
org/10.1145/3491140.3528324
Udin, W. N., Ramli, M., & Muzzazinah. (2020). Virtual laboratory for enhancing students’ understanding on abstract biology
concepts and laboratory skills: A systematic review. Journal of Physics. Conference Series, 1521(4), 042025. https://
doi.org/10.1088/1742-6596/1521/4/042025
Vahdat, M., Oneto, L., Anguita, D., Funk, M., & Rauterberg, M. (2015). A learning analytics approach to correlate the
academic achievements of students with interaction data from an educational simulator. In G. Conole, T. Klobučar,
C. Rensing, J. Konert, & E. Lavoué (Eds.), Design for teaching and learning in a networked world: 10th european confer-
ence on technology enhanced learning, EC-TEL 2015, Toledo, Spain, September 15-18, 2015, Proceedings (pp. 352–366).
Cham: Springer International Publishing. https://doi.org/10.1007/978-3-319-24258-3_26
Vanessa Niet, Y., Diaz, V. G., & Montenegro, C. E. (2016). Academic decision making model for higher education institutions
using learning analytics. 2016 4th International Symposium on Computational and Business Intelligence ISCBI, 2016,
27–32. https://doi.org/10.1109/ISCBI.2016.7743255
Vasiliadou, R. (2020). Virtual laboratories during coronavirus (COVID-19) pandemic. Biochemistry and Molecular Biology
Education: A Bimonthly Publication of the International Union of Biochemistry and Molecular Biology, 48(5), 482–483.
https://doi.org/10.1002/bmb.21407
Venant, R., Sharma, K., Vidal, P., Dillenbourg, P., & Broisin, J. (2017). Using sequential pattern mining to explore learners’
behaviors and evaluate their correlation with performance in inquiry-based learning. In É. Lavoué, H. Drachsler, K.
Verbert, J. Broisin, & M. Pérez-Sanagustín (Eds.), Data driven approaches in digital education (pp. 286–299). Springer
International Publishing. https://doi.org/10.1007/978-3-319-66610-5_21
Vozniuk, A., Rodriguez-Triana, M. J., Holzer, A., Govaerts, S., Sandoz, D., & Gillet, D. (2015). Contextual learning analytics
apps to create awareness in blended inquiry learning. In International Conference on Information Technology Based
Higher Education and Training (ITHET 2015), (pp. 1–4). https://doi.org/10.1109/ITHET.2015.7218029
Webster, J., & Watson, R. T. (2002). Analyzing the past to prepare for the future: Writing a literature review. MIS Quarterly, 26(2), xiii–xxiii.
Wise, A. F., & Jung, Y. (2019). Teaching with analytics: Towards a situated model of instructional decision-making. Journal of
Learning Analytics, 6(2), 53–69.
Wong, J., Baars, M., de Koning, B. B., van der Zee, T., Davis, D., Khalil, M., Houben, G.-J., & Paas, F. (2019). Educational theories
and learning analytics: From data to knowledge: The whole is greater than the sum of its parts. In D. Ifenthaler,
D.-K. Mah, & J.Y.-K. Yau (Eds.), Utilizing learning analytics to support study success (pp. 3–25). Springer International
Publishing. https://doi.org/10.1007/978-3-319-64792-0_1
Yaron, D., Karabinos, M., Lange, D., Greeno, J. G., & Leinhardt, G. (2010). The chemcollective—virtual labs for introductory
chemistry courses. Science, 328(5978), 584–585. https://doi.org/10.1126/science.1182435
Zhang, J., Sung, Y. T., Hou, H. T., & Chang, K. E. (2014). The development and evaluation of an augmented reality-based
armillary sphere for astronomical observation instruction. Computers and Education, 73, 178–188. https://doi.org/
10.1016/j.compedu.2014.01.003
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.