International Journal of Educational Methodology

IJEM is a leading, peer-reviewed, open-access research journal that provides an online forum for studies in education, by and for scholars and practitioners worldwide.



Publisher (HQ)

Eurasian Society of Educational Research
College House, 2nd Floor 17 King Edwards Road, Ruislip, London, UK. HA4 7AE

Rethinking the Components of Regulation of Cognition through the Structural Validity of the Meta-Text Test

Marcio Alexander Castillo-Diaz, Cristiano Mauro Assis Gomes, Enio Galinkin Jelihovschi


The field of metacognition research points to limitations in how the construct has traditionally been measured and shows a near absence of performance-based tests. The Meta-Text is a recently created performance-based test that assesses components of cognition regulation: planning, monitoring, and judgment. This study presents the first evidence on the structural validity of the Meta-Text by analyzing its dimensionality and reliability in a sample of 655 Honduran university students. Different models were tested via item-level confirmatory factor analysis. The results indicated that the specific factors of planning and monitoring do not hold empirically. A bifactor model containing a general cognition regulation factor and a judgment-specific factor was evaluated as the best model (CFI = .992; NFI = .963; TLI = .991; RMSEA = .021). The reliability of the factors in this model was acceptable (Ω = .701 and .699). The judgment items loaded substantially only on the judgment factor, suggesting that judgment may actually be another component of the metacognitive knowledge dimension, playing little role in cognition regulation. These results provide initial evidence of the structural validity of the Meta-Text and yield information previously unidentified by the field, with conceptual implications for theorizing metacognitive components.

Keywords: Metacognition, performance-based testing, regulation of cognition, structural validity.
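The Ω values reported in the abstract are McDonald's omega reliability coefficients. As a minimal sketch of how omega is obtained from a fitted factor model (the loadings below are hypothetical placeholders, not values from the Meta-Text study), it is the ratio of common-factor variance to total variance, assuming standardized loadings and uncorrelated errors:

```python
# McDonald's omega: proportion of total score variance attributable to the
# common factor, computed from standardized factor loadings.
# Omega = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)

def mcdonald_omega(loadings):
    """Compute omega assuming standardized loadings and uncorrelated errors."""
    common = sum(loadings) ** 2                       # common-factor variance
    error = sum(1.0 - l ** 2 for l in loadings)       # residual (error) variance
    return common / (common + error)

# Hypothetical standardized loadings for a six-item factor
lambdas = [0.55, 0.60, 0.48, 0.52, 0.45, 0.58]
print(round(mcdonald_omega(lambdas), 3))  # → 0.702
```

In practice, omega for bifactor models is typically computed directly from the fitted model (e.g., with the semTools package in R, which the study's software references suggest); the hand computation above only illustrates the underlying formula.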



