320 D. A. López et al.: Radioprotection 2025, 60( 4), 318 – 327
Table 1. Schedule and program.

Previous months
    International call for enrollment (via ALFIM's website, ALFIM's newsletter, ALFIM's networks, and the member societies' official communication channels)
    Enrollment process using a Google Form
    Acceptance letter, general course information, and Drive access

Part 1: Theoretical course
Week 1
    Introduction and general considerations (30 min)
    Lecture 1: Diagnostic Nuclear Medicine: Current Situation and Future Challenges (1 h)
    Lecture 2: Introduction to Medical Exposure Optimization (1 h)
    Open session for online discussions (1 h)
Week 2
    Lecture 3: Establishing Reference Levels in Nuclear Medicine (1 h)
    Lecture 4: Activity Measurement: a Key Aspect of the DRL Process (30 min)
    Lecture 5: Practical Aspects for Generating Typical Exposure Levels for Hybrid Studies (30 min)
    Tutorial 1: Demonstration of the data collection sheet (15 min)
    Tutorial 2: Introduction to the practical exercise (15 min)
    Open session for online discussions (1 h)
    Final theoretical test (Google Forms questionnaire)

Part 2: Theoretical and practical course (practical exercise)
Week 3
    Collection of anonymous data from the participant's nuclear medicine department
    Open session for online discussions (1 h)
Week 4
    Data collection and analysis; analysis report
    Open session for online discussions (1 h)
Evaluation of individual practical activities (data sheets and report)
2.2 Practical Component
The optional practical component consisted of a project: establishing a typical DRL in a Nuclear Medicine department for at least one diagnostic procedure (Tab. 1, Part 2). The exercise covered all steps needed to obtain typical reference levels, including the proper selection of studies and patient data. Participants then produced a critical analysis report comparing their findings with similar existing DRL studies, identifying national reference levels where available, and proposing action plans based on their analyses. Participants were provided with an Excel-based data collection template aligned with the ICRP methodological recommendations regarding the minimum number of studies to be collected (20), as well as the patient information variables and study data to record (Vañó et al., 2017). The template required, at a minimum, the patient's weight and age and the administered activity for the procedure. Participants also received guidelines outlining the essential requirements for their final reports, along with bibliographic references useful for their analysis.
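The core computation in this exercise can be sketched in a few lines. The snippet below is illustrative only (it is not the course's Excel template): it derives a facility's typical value as the median administered activity over at least the ICRP-recommended 20 studies, with hypothetical record field names (`weight_kg`, `age`, `activity_mbq`) and invented sample data.

```python
# Illustrative sketch of deriving a typical value for one procedure,
# following the ICRP recommendation of collecting at least 20 studies.
# Field names and sample data are hypothetical, not from the course template.
from statistics import median

MIN_STUDIES = 20  # ICRP-recommended minimum number of studies

def typical_value(records):
    """Median administered activity (MBq) across the collected studies.

    `records` is a list of dicts with hypothetical keys:
    'weight_kg', 'age', 'activity_mbq'.
    """
    activities = [r["activity_mbq"] for r in records]
    if len(activities) < MIN_STUDIES:
        raise ValueError(
            f"Need at least {MIN_STUDIES} studies, got {len(activities)}"
        )
    return median(activities)

# Example: 20 simulated administrations for one procedure
sample = [{"weight_kg": 70, "age": 55, "activity_mbq": 720 + i}
          for i in range(20)]
print(typical_value(sample))
```

The resulting typical value would then be compared against a national DRL, if one exists, as the basis of the participant's critical analysis report.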
2.3 Data Analysis
Educational outcomes were analyzed both quantitatively and qualitatively. Participant engagement with the pre-recorded theoretical activities was monitored through YouTube. Quantitative results from the theoretical assessment were gathered via the Google Forms questionnaire, while the competencies acquired during the practical exercise were evaluated through the submitted reports.
Additionally, an anonymous, voluntary quality survey was developed in Google Forms to gather participant feedback on several aspects of the course (Tab. 2). The survey comprised seven Likert-scale questions, scored from 1 (strongly disagree) to 5 (strongly agree), evaluating the usefulness and quality of the course (Ankur, 2015), and one open-ended question soliciting suggestions for improvement.
The survey results were analyzed with SPSS version 20 (IBM SPSS Statistics 20 Documentation, 2016), employing descriptive statistics to calculate absolute and relative frequencies. The mean scores were categorized as follows:
* Scores of 4–5 indicated strong satisfaction or effectiveness.
* A score of 3 suggested sufficient but improvable performance.
* Scores of 1–2 highlighted areas needing significant enhancement.
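The descriptive analysis and categorization above can be sketched as follows. SPSS was used in the study; this plain-Python version reproduces the same statistics for one question, with invented response data.

```python
# Minimal sketch of the descriptive statistics applied to one Likert
# question (1-5 scale): absolute/relative frequencies, mean score, and
# the interpretation bands described in the text. Responses are invented.
from collections import Counter

def describe(responses):
    """Absolute and relative frequencies plus the mean score."""
    n = len(responses)
    counts = Counter(responses)
    abs_freq = {score: counts.get(score, 0) for score in range(1, 6)}
    rel_freq = {score: c / n for score, c in abs_freq.items()}
    mean = sum(responses) / n
    return abs_freq, rel_freq, mean

def category(mean):
    """Interpretation bands used for the mean scores."""
    if mean >= 4:
        return "strong satisfaction or effectiveness"
    if mean >= 3:
        return "sufficient but improvable"
    return "needs significant enhancement"

answers = [5, 4, 4, 5, 3, 4, 5]  # hypothetical answers to one question
_, _, m = describe(answers)
print(round(m, 2), "->", category(m))
```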
This comprehensive approach ensured that both theoretical knowledge acquisition and practical skill development were rigorously evaluated, providing valuable insights into the effectiveness of this educational initiative.
To ensure the representativeness of the survey results, given a finite number of students (Np), the minimum required answer sample size (n) was determined based on a 95 %