
The term Computational Thinking (CT) is now widely accepted as a 21st-century skill on a par with reading, writing, and arithmetic, leading many states to adopt new policies concerning curricula, didactic material, teaching and learning methods, and assessment methods. In this frame, we have developed a series of Arduino lessons for teacher training, covering physical computing and the development of computational thinking. To improve the quality of the training material, we have developed a set of assessment criteria (a rubric) that will be used for formative assessment during the teacher training course. The aim of this work is to analyze and justify the rubric's format and priorities, and to inform teacher trainers about the assessment priorities of the course.


Introduction

Although the term “Computational Thinking” has not yet reached global consensus, the majority of researchers and policymakers perceive it as the thought process involved in designing solutions that a computer and/or a human can execute. This conceptualization is based on the definition devised by Jeannette Wing [1]–[3]: “Computational thinking is the thought processes involved in formulating a problem and expressing its solution(s) in such a way that a computer—human or machine—can effectively carry out”. Wing also declared that CT is a fundamental skill like reading, writing, and arithmetic and must be taught to everyone, not only to those who plan a career in the CS or STEM fields [1], [4]. CT research addresses fundamental issues of STEM's transdisciplinary approach across different knowledge objects [5]–[7]. Moreover, Computational Thinking concepts applied through the Computational Pedagogy Model in STEM constitute a research field for education [7]–[9].

An updated definition of CT which involves Computing is given by the Digital Education Action Plan 2021–2027 [10]: computational thinking, programming, and coding are often used interchangeably in educational settings, but they are distinct activities. Programming refers to the activity of analyzing a problem, designing a solution, and implementing it. Coding means implementing solutions in a particular programming language. Computational thinking, shorthand for “thinking as a computer scientist,” refers to the ability to understand the underlying notions and mechanisms of digital technologies in order to formulate and solve problems. Generally, the most dominant definitions of the CT core include concepts such as abstraction, algorithmic thinking, automation, decomposition, and generalization [11].

The above skills are critical for everyone, both for participating in the digital society and for professional development [3]. The challenge of teaching and learning CT skills has pushed many educational systems to integrate CT skills into their curricula [12]. Many open hardware platforms (such as Arduino, micro:bit, and Raspberry Pi), together with educational robotics, constitute tools of Education 4.0 [13]–[17]. Similar initiatives exist in Greece [18]–[20]. Currently, a course has been designed for teachers as trainees. The aim of the course is to improve skills in physical computing, using the Arduino platform and Ardublock (open-source software) to solve real-world problems. The lessons were designed to introduce physical computing, the fundamentals of sensors and actuators, the definition of an algorithm, logical connectives, and the if-then-else and while statements [21].
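To illustrate the programming level targeted by these lessons, the minimal Arduino sketch below combines digital I/O, a logical connective, an if-then-else decision, and a while loop. It is an illustrative sketch of our own, not part of the course material; the pin numbers and the light threshold are assumptions.

```cpp
// Illustrative sketch only: the pins and the threshold value are assumptions.
const int buttonPin = 2;        // push button (digital input)
const int ledPin = 13;          // LED (digital output)
const int lightPin = A0;        // LDR voltage divider (analog input)
const int darkThreshold = 500;  // assumed "dark" level on the 0..1023 scale

void setup() {
  pinMode(buttonPin, INPUT_PULLUP);
  pinMode(ledPin, OUTPUT);
}

void loop() {
  int light = analogRead(lightPin);                // 0..1023 after A/D conversion
  bool pressed = (digitalRead(buttonPin) == LOW);  // pressed pulls the pin LOW

  // Logical connective + if-then-else: light the LED only when it is
  // dark AND the button is pressed.
  if (pressed && light < darkThreshold) {
    digitalWrite(ledPin, HIGH);
    // While statement: keep the LED on as long as the button stays pressed.
    while (digitalRead(buttonPin) == LOW) {
      delay(10);
    }
  } else {
    digitalWrite(ledPin, LOW);
  }
}
```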

However, although a great deal of didactic material has been developed, teachers suffer from a lack of CT assessment and/or evaluation material [22]. Thus, the present work presents a proposed assessment rubric that allows teachers, as trainees, to be assessed. The effectiveness and usability of the proposed rubric will be examined during teacher training at the Open Lab of the Regional Unit of Viotia, under the Annual Pedagogical Training Program (EPPAIK) of the Levadia departments of the Higher School of Pedagogical and Technological Education (ASPETE).

Related Works

The majority of assessment tools and methods are based on the rubric schema formulated in [23]. The evaluated concepts, which are broken down into criteria and scales, are based on Computing and aim to depict the development of CT concepts [24]. Selby's work is limited to abstraction, decomposition, algorithmic thinking, evaluation, and generalization [25]. Another important work, in which CT assessment is based on a framework of four interconnected stages, presents the procedure of developing an assessment approach for a proposed lesson in the classroom [26]. This case study shows a clear way of assessing CT concepts through a lesson. An alternative approach to CT assessment was proposed by Brennan and Resnick, who stated that CT is strongly connected with a set of attitudes and skills (practices) that involve the creation of computational artefacts, debugging, testing, collaboration, and creativity [27].

The Proposed Rubric

The present work proposes a holistic rubric focusing on the assessment of didactic material that offers activities based on Computing and STEM. Even though the concepts of the proposed rubric are based on Selby's work [24], it is an innovative rubric because the included criteria address domain-specific CT definitions (such as the offered Arduino/STEM course, Table I). This domain requires specific knowledge or skills in order to solve problems in the subject of Computer Science or programming [28], [29]. Thus, we followed a framework similar to the one proposed by Tang [30], which separates CT definitions into computing concepts and competencies.

Session Hours Learning objective
Introduction to physical computing 1 The role of microprocessors in the physical world; analog-to-digital conversion.
Fundamentals of common electronic components 2 Introduction to electronic components (LEDs, resistors, switches, and potentiometers) and basic electrical connections using wires, breadboards, connectors, and components.
Fundamentals of common sensors 2 Introduction to common sensor components (LDR, IR sensor, ultrasonic sensor, etc.) and basic electrical connections.
Fundamentals of common actuators 2 Introduction to common actuator components (servo, DC motor, etc.) and basic electrical connections.
Programming by solving a real problem 4 Traffic light and rail crossing with an automatic barrier: the aim of the project, as in the real world, is to let a car pass safely over a rail crossing by using basic programming (loops, cases, and digital I/O) with electronic components, an ultrasonic sensor, an LED, and a servo. The artifact that the trainees develop is shown in Fig. 1 (a simplified code sketch follows the figure).
Programming by solving a real problem 4 Climate control in a greenhouse: the aim of the project is to control the climate inside a greenhouse by using basic programming (loops, cases, and digital I/O) with electronic components such as an LDR, temperature and humidity sensors, LEDs, and servos. The artifact that the trainees develop is shown in Fig. 2 (a simplified code sketch follows the figure).
Table I. Sessions of the Physical Computing Course Using the Arduino Platform for Formative Assessment

Fig. 1. Artifact for project traffic light and rail crossing using an automatic barrier.
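A minimal sketch of the barrier-control logic behind this artifact is given below. It is an illustrative example only, not the actual course solution; the pin numbers, servo angles, and the 15 cm detection threshold are assumptions, and an HC-SR04-style ultrasonic sensor is assumed.

```cpp
// Illustrative sketch of the barrier logic only; pins, angles, and the
// distance threshold are assumptions, not course material.
#include <Servo.h>

const int trigPin = 9;     // ultrasonic sensor trigger (assumed HC-SR04 style)
const int echoPin = 10;    // ultrasonic sensor echo
const int redLedPin = 7;   // warning light at the crossing
Servo barrier;             // servo that lowers/raises the barrier

void setup() {
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
  pinMode(redLedPin, OUTPUT);
  barrier.attach(6);
  barrier.write(90);       // barrier raised
}

long readDistanceCm() {
  // Trigger a 10 us pulse and convert the echo time to centimeters.
  digitalWrite(trigPin, LOW);  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH); delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  long duration = pulseIn(echoPin, HIGH);
  return duration / 58;
}

void loop() {
  if (readDistanceCm() < 15) {      // something (e.g., a train) detected near the crossing
    digitalWrite(redLedPin, HIGH);  // warn the car
    barrier.write(0);               // lower the barrier
  } else {
    digitalWrite(redLedPin, LOW);
    barrier.write(90);              // raise the barrier; the car may pass
  }
  delay(200);
}
```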

Fig. 2. Artifact for project climate control in a greenhouse.
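Similarly, the following sketch illustrates the kind of control logic behind the greenhouse artifact. Again it is only an illustrative example: the pins and thresholds are assumptions, and an analog (LM35-style) temperature sensor is assumed here for simplicity, whereas the course material uses temperature and humidity sensors.

```cpp
// Illustrative sketch of the greenhouse logic only; pins, thresholds, and
// the choice of an analog temperature sensor are assumptions.
#include <Servo.h>

const int ldrPin = A0;      // light level via LDR voltage divider
const int tempPin = A1;     // analog temperature sensor (assumed LM35-style)
const int growLedPin = 8;   // supplementary grow light
Servo vent;                 // servo that opens/closes a ventilation window

void setup() {
  pinMode(growLedPin, OUTPUT);
  vent.attach(5);
  vent.write(0);            // vent closed
}

void loop() {
  int light = analogRead(ldrPin);
  // LM35-style conversion: 10 mV per degree Celsius on a 5 V reference.
  float tempC = analogRead(tempPin) * (5.0 / 1023.0) * 100.0;

  // If-then-else on temperature: open the vent when it is too warm.
  if (tempC > 28.0) {
    vent.write(90);
  } else {
    vent.write(0);
  }

  // Turn on the grow light when natural light is low.
  digitalWrite(growLedPin, light < 400 ? HIGH : LOW);

  delay(1000);
}
```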

The term “assessment” is used for judging a trainee's performance in relation to specific goals, and a formative orientation requires (a) feedback and (b) an indication of how the work can be improved to reach the required standard [31]–[33]. For the term “feedback” we adopt the definition of Ramaprasad [34], [35], who describes feedback as information about the gap between the actual and the expected result, which is subsequently used to alter that gap in some way. Generally, there are two types of rubrics: (a) the “analytic rubric”, in which each criterion is evaluated separately, and (b) the “holistic rubric”, in which all dimensions are assessed simultaneously to provide a single overall score [36]. The use of rubrics in the educational process offers many benefits to both trainees and trainers: rubrics (a) inform trainees of expectations, (b) provide feedback, (c) maintain consistent grading and fair assessment, and (d) enhance trainee learning and self-assessment [37]. In addition, rubrics provide trainers with mechanisms to (a) clarify teaching and learning goals, (b) relate trainees' scores to specific criteria and skills, (c) summarize trainee performance reliably, and (d) identify patterns of strengths.

The proposed rubric, shown in Table II, is structured in four levels so that it can capture not only whether a concept is present or used but, mainly, a qualitative measure of how well it is applied. An odd number of levels, for example 3 or 5, usually prompts evaluators to opt for the middle option instead of one of its neighbors, which often yields a misleading result. On the other hand, a five-level scale would offer more granularity than evaluators need.

Criteria Level-1 Level-2 Level-3 Level-4 CT concepts
Decomposes a problem into subproblems, leading to a solution based on the subproblems' solutions. Ineffectively tries to notionally separate the initial problem. Formulates a partly acceptable decomposed problem solution. Formulates a series of individual subproblems which are part of the initial problem. Designs solvable individual subproblems which result in the solution of the initial problem. DE, AL, AB
Finds similarities/differences during problem analysis (pattern recognition). Cannot identify similarities and/or differences in a problem solution. Identifies some of the similarities and/or differences in a problem solution. Can transform the similarities into patterns. Successfully applies the recognized patterns to the final problem solution. GE, AL
Data and information analysis and evaluation. Cannot recognize the role of accurate data in a problem's solution. Uses the notions of data and information only partly effectively. Can collect, organize, and store data and evaluate the resulting information. Evaluates the received feedback and improves the offered solution. EV
Designs and creates artefacts. Inadequately designs artefacts. Designs adequate artefacts but faces difficulties in the implementation phase. Creates “weak” digital artefacts (unstable, untrustworthy, untested, etc.). Creates solid, effective, and well-designed digital artefacts. AL, AB
Exploitation and use of procedures. Has no clear picture of how procedures are used. Knows that a procedure can be used to hide the details of a subsolution. Uses procedures in a program solution, but inadequately (still needs improvement). Proficiently designs, writes, and debugs programs using procedures. AL, DE, AB, GE
Proposes solutions based on the solutions of smaller instances of the initial problem (recursion). Has no clear picture of the notion of recursion. Cannot identify the recursive parts of the problem. Uses recursion ineffectively (no results, long delays). Uses recursion proficiently. AL, DE, AB, GE
Table II. The Proposed Rubric

By default, each criterion contributes equally to the final assessment. Each criterion was chosen so as to capture at least one of the CT concepts, which we regard as a more comprehensive basis for a holistic assessment of the trainee.

The abbreviations in Table II are: AB (Abstraction), DE (Decomposition), AL (Algorithmic Thinking), GE (Generalization), and EV (Evaluation). It is worth mentioning that each criterion includes/expresses more than one Computational Thinking concept. Fig. 3 shows the proposed rubric as it is previewed in the Rubrics.io application, which is used for in-classroom teacher assessment. In addition, each criterion can be assigned a weighting factor that determines how strongly it influences the final mark. For example, in our case the first criterion was assigned a factor of 20%, while most of the remaining criteria were assigned 15%; the factors of all criteria must sum to 100%.

Fig. 3. The proposed rubric previewed in the rubrics.io application.
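A minimal sketch of how such a weighted final mark can be computed is shown below. It is not part of the Rubrics.io tool; the weighting factors and the awarded levels in this example are hypothetical values chosen only to illustrate the calculation.

```cpp
// Minimal sketch of a weighted rubric mark; weights and levels are
// hypothetical illustration values, not the configuration used in rubrics.io.
#include <cstdio>

int main() {
  const int numCriteria = 6;
  // Hypothetical weighting factors (fractions of 100%), summing to 1.0.
  const double weight[numCriteria] = {0.20, 0.15, 0.15, 0.15, 0.15, 0.20};
  // Hypothetical level awarded per criterion (1..4, as in Table II).
  const int level[numCriteria] = {3, 4, 2, 3, 4, 3};

  double mark = 0.0;
  for (int i = 0; i < numCriteria; ++i) {
    // Each criterion contributes its weight times its level, normalized
    // to the maximum level (4), giving a final mark between 0 and 1.
    mark += weight[i] * (static_cast<double>(level[i]) / 4.0);
  }
  std::printf("Final mark: %.2f (i.e., %.0f%%)\n", mark, mark * 100.0);
  return 0;
}
```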

Fig. 4 depicts the rubric in use, as applied by a trainer for a teacher's formative assessment. For our research needs, we created a demo teacher and assigned him/her two separate evaluations. The evaluations are presented visually, making the results easier for the evaluator (trainer) to perceive and understand.

Fig. 4. The proposed rubric in practice for a teacher’s formative assessment.

Future Work

The effectiveness and usability of the proposed rubric will be evaluated with a sample of 90 students at the Open Lab of the Regional Unit of Viotia, under the Annual Pedagogical Training Program (EPPAIK) of the Levadia departments of the Higher School of Pedagogical and Technological Education (ASPETE). The results will be presented in a future paper.

References

1. Wing JM. Computational thinking's influence on research and education for all. Ital J Educ Technol. Jul 2017;25(2):7–14. doi: 10.17471/2499-4324/922.
2. Wing JM. Computational Thinking. ACM Press; 2006.
3. Wing JM. Research notebook: computational thinking—what and why? theLink. 2011;8.
4. Voogt J, Fisser P, Good J, Mishra P, Yadav A. Computational thinking in compulsory education: towards an agenda for research and practice. Educ Inf Technol. Dec 2015;20(4):715–28. doi: 10.1007/s10639-015-9412-6.
5. Psycharis S. STEAM in education: a literature review on the role of computational thinking, engineering epistemology and computational science. Sci Cult. Apr 2018;4(2):51–72. doi: 10.5281/ZENODO.1214565.
6. Psycharis S, Kalovrektis K, Sakellaridi E, Korres K. Unfolding the curriculum: physical computing, computational thinking and computational experiment in STEM's transdisciplinary approach. Presented at the 9th Conference on Informatics in Education 2017, vol. 15, 2017.
7. Psycharis S, Kalovrektis K. A conceptual framework for computational STEAM integration: crosscutting concepts, threshold concepts, border objects and their propagation in STEM integrational fusion. In STE(A)M Educators & Education; 2021.
8. Psycharis S, Kalovrektis K, Sakellaridi E, Korres K, Mastorodimos D. Unfolding the curriculum: physical computing, computational thinking and computational experiment in STEM's transdisciplinary approach. EJENG. Mar 2018:19–24. doi: 10.24018/ejeng.2018.0.CIE.639.
9. Psycharis S, Kalovrektis K. Assessment and integrated STEAM in engineering education. 2022 IEEE Global Engineering Education Conference (EDUCON), pp. 695–703, Tunis, Tunisia: IEEE; Mar 2022. doi: 10.1109/EDUCON52537.2022.9766654.
10. European Commission. Digital education action plan 2021–2027: resetting education and training for the digital age. 2020. Available from: https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:52020SC0209&qid=1647943853396.
11. Curzon P, Bell T, Waite J, Dorling M. Computational thinking. In: Fincher SA, Robins AV, eds. The Cambridge Handbook of Computing Education Research. 1st ed. Cambridge University Press; 2019. pp. 513–46. doi: 10.1017/9781108654555.018.
12. European Commission, Joint Research Centre. Reviewing Computational Thinking in Compulsory Education: State of Play and Practices from Computing Education. LU: Publications Office; 2022. Accessed Mar 27, 2022. Available from: https://data.europa.eu/doi/10.2760/126955.
13. Chatzopoulos A, Kalovrektis K, Xenakis A, Papoutsidakis M, Kalogiannakis M, Psycharis S. An advanced physical computing-based educational robot platform evaluated by technology acceptance model. 2022 10th International Conference on Information and Education Technology (ICIET), pp. 6–10, Matsue, Japan: IEEE; Apr 2022. doi: 10.1109/ICIET55102.2022.9779049.
14. Kalovrektis K, Papoutsidakis M, Drosos C, Stamoulis G. Information technology and μcontroller applications to support experiential learning of students. IJCA. Oct 2017;175(8):33–7. doi: 10.5120/ijca2017915649.
15. Papoutsidakis M, Chatzopoulos A, Kalovrektis K, Drosos C. A brief guide for the continuously evolving μcontroller Raspberry Pi Mod.B. IJCA. Oct 2017;176(8):30–3. doi: 10.5120/ijca2017915651.
16. Papoutsidakis M, Chatzopoulos A, Drosos C, Kalovrextis K. An Arduino family controller and its interactions via an intelligent interface. IJCA. Mar 2018;179(30):5–8. doi: 10.5120/ijca2018916684.
17. Plageras A, Xenakis A, Kalovrektis K, Vavougios D. An application study of the UTAUT methodology for the flipped classroom model adoption by applied sciences and technology teachers. Int J Emerg Technol Learn. Jan 2023;18(02):190–202. doi: 10.3991/ijet.v18i02.35585.
18. Dimos I, Velaora C, Kakarountas A. Computational thinking in Greek educational system for K-12: towards the future teaching approach. 2022 Panhellenic Conference on Electronics & Telecommunications (PACET), pp. 1–6, Tripolis, Greece: IEEE; Dec 2022. doi: 10.1109/PACET56979.2022.9976359.
19. Plageras A, Kalovrektis K, Xenakis A, Vavougios D. The application of TPACK in the methodology of the flipped classroom and with the evaluation of the UTAUT to measure the impact of STEM activities in improving the understanding of concepts of applied sciences. Presented at the 15th International Conference on Education and New Learning Technologies, pp. 569–75, Palma, Spain, Jul 2023. doi: 10.21125/edulearn.2023.0243.
20. Psycharis S, Iatrou P, Kalovrektis K, Xenakis A. The impact of the computational pedagogy STEAM model on prospective teachers' computational thinking practices and computational experiment capacities: a case study in a training program. In: Auer ME, Pachatz W, Rüütmann T, eds. Learning in the Age of Digital and Green Transition. Lecture Notes in Networks and Systems, vol. 634. Cham: Springer International Publishing; 2023. pp. 400–11. doi: 10.1007/978-3-031-26190-9_41.
21. Xenakis A, Kalovrektis K, Theodoropoulou K, Karampelas A, Giannakas G, Sotiropoulos DJ, Vavougios D. Using sensors and digital data collection/analysis technologies in K-12 physics education under the STEM perspective. In: Taşar MF, Heron PRL, eds. The International Handbook of Physics Education Research: Teaching Physics. Melville, NY: AIP Publishing LLC; 2023. pp. 6-1–6-46. doi: 10.1063/9780735425712_006.
22. Otero Avila C, Foss L, Bordini A, Simone Debacco M, da Costa Cavalheiro SA. Evaluation rubric for computational thinking concepts. 2019 IEEE 19th International Conference on Advanced Learning Technologies (ICALT), pp. 279–81, 2019. doi: 10.1109/ICALT.2019.00089.
23. Mertler CA. Designing scoring rubrics for your classroom. Pract Assess Res Eval. 2000;7:9. doi: 10.7275/gcy8-0w24.
24. Selby C, Woollard J. Computational thinking: the developing definition. Special Interest Group on Computer Science Education (SIGCSE) 2014. 2013;6:7–8.
25. Selby C, Dorling M, Woollard J. Evidence of assessing computational thinking. 2014;11.
26. Curzon P, Dorling M. Developing computational thinking in the classroom: a framework. 2014.
27. Sentance S, Barendsen E, Schulte C. Computer Science Education: Perspectives on Teaching and Learning in School. Bloomsbury Academic; 2018. doi: 10.5040/9781350057142.
28. Panagiotis K, Valentina D, Stefania B, Augusto C, Katja E, Gabrielė S, et al. Integrating computational thinking into primary and lower secondary education: a systematic review. Educ Technol Soc. 2023;26(2):99–117. doi: 10.30191/ETS.202304.
29. Tikva C, Tambouris E. Mapping computational thinking through programming in K-12 education: a conceptual model based on a systematic literature review. Comput Educ. 2021;162:104083. doi: 10.1016/j.compedu.2020.104083.
30. Tang X, Yin Y, Lin Q, Hadad R, Zhai X. Assessing computational thinking: a systematic review of empirical studies. Comput Educ. 2020;148:103798. doi: 10.1016/j.compedu.2019.103798.
31. Guggemos J, Seufert S, Román-González M. Computational thinking assessment—towards more vivid interpretations. Technology, Knowledge and Learning. 2022;28:539–68. doi: 10.1007/s10758-021-09587-2.
32. Hadad R, Thomas K, Kachovska M, Yin Y. Practicing formative assessment for computational thinking in making environments. J Sci Educ Technol. 2020;29(1):162–73. doi: 10.1007/s10956-019-09796-6.
33. Taras M. Assessment—summative and formative—some theoretical reflections. Brit J Educ Stud. 2005;53(4):466–78. doi: 10.1111/j.1467-8527.2005.00307.x.
34. Fong CJ, Schallert DL. Feedback to the future: advancing motivational and emotional perspectives in feedback research. Educ Psychol. 2023;58(3):146–61. doi: 10.1080/00461520.2022.2134135.
35. Ramaprasad A. On the definition of feedback. Behav Sci. 1983;28(1):4–13. doi: 10.1002/bs.3830280103.
36. Dimos I, Velaora C, Louvaris K, Kakarountas A, Antonarakou A. How a rubric score application empowers teachers' attitudes over computational thinking leverage. Information. 2023;14(2):118. doi: 10.3390/info14020118.
37. Chowdhury F. Application of rubrics in the classroom: a vital tool for improvement in assessment, feedback and learning. Int Educ Stud. 2018;12(1):61. doi: 10.5539/ies.v12n1p61.

