Development of Assessment Tools for Depth of Understanding Quantitatively with Cognitive Diagnostic Models

  • Conference paper
  • In: Advances in Information and Communication (FICC 2023)

Part of the book series: Lecture Notes in Networks and Systems (LNNS, volume 651)

Abstract

Fueled by the demand in 21st-century education to cultivate deep understanding, schoolteachers are eager to assess the elements of understanding that students struggle to achieve. However, little attention has been paid to identifying these strengths and weaknesses quantitatively. This study therefore proposes a way to assess them using a class of statistical models called cognitive diagnostic models. The study consists of three parts: development and administration of a diagnostic mathematics test, analysis with the cognitive diagnostic models, and development of feedback sheets for students and teachers. We believe this assessment system can be implemented in digital tools in the future.
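The abstract names cognitive diagnostic models as a class without specifying which model is used. As a minimal illustration only, the sketch below implements the ideal-response rule of the DINA model, one widely used cognitive diagnostic model; the Q-matrix, mastery profiles, and slip/guess values are all hypothetical and invented for this example.

```python
import numpy as np

# Hypothetical Q-matrix: 3 items x 2 skills (1 = item requires the skill).
Q = np.array([[1, 0],
              [0, 1],
              [1, 1]])

# Hypothetical mastery profiles for 2 students (1 = skill mastered).
alpha = np.array([[1, 0],
                  [1, 1]])

# DINA ideal response: a student answers an item correctly (ignoring noise)
# only if they have mastered every skill the item requires.
eta = (alpha @ Q.T == Q.sum(axis=1)).astype(int)

# Hypothetical slip and guess parameters turn ideal responses into
# probabilities of a correct answer.
slip, guess = 0.1, 0.2
p_correct = eta * (1 - slip) + (1 - eta) * guess

print(eta)        # which items each student "should" solve
print(p_correct)  # modeled probability of a correct response
```

In practice, the mastery profiles are unobserved and are estimated from students' item responses (e.g., by Bayesian methods); feedback to students and teachers would then be based on the estimated skill-mastery probabilities rather than known profiles.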



Author information

Correspondence to Shun Saso.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Saso, S., Oka, M., Uesaka, Y. (2023). Development of Assessment Tools for Depth of Understanding Quantitatively with Cognitive Diagnostic Models. In: Arai, K. (eds) Advances in Information and Communication. FICC 2023. Lecture Notes in Networks and Systems, vol 651. Springer, Cham. https://doi.org/10.1007/978-3-031-28076-4_55
