Research Article
Revised

Digital teaching competence of higher education professors: self-perception study in an Ecuadorian university

[version 2; peer review: 2 approved]
PUBLISHED 29 Oct 2024

Abstract

Background

Teaching professionalization aimed at the digital transformation of educational scenarios and training processes for students in contemporary higher education requires the mastery of digital competence by the teaching staff. The objectives of the study were to analyze the self-perceived level of digital teaching competence (DTC) of the faculty of the Technical University of Manabí (UTM), Ecuador, and to establish the relationship between age, sex, and academic profile variables with digital teaching competence.

Methods

A quantitative methodological approach was adopted to develop a descriptive-correlational field study with a non-experimental design. The participants were 277 professors, selected through non-probabilistic and voluntary sampling, who completed the DigCompEdu Check-In questionnaire sent by e-mail.

Results

The data revealed that the “integrator” and “expert” categories achieved high levels in all competencies. In particular, 48.74% of the participants placed themselves in the integrator category for facilitating competences, while 46.21% positioned themselves as integrators in evaluation and feedback. Additionally, a significant difference was found in the digital pedagogy variable for the interaction between sex and academic profile.

Conclusions

It is concluded that the competences self-perceived by the professors fall within the intermediate categories of integrator and expert. Likewise, age, sex, and academic profile differ in their relationship with the digital pedagogy level, producing an inconsistent pattern, with the exception of the evaluates-and-provides-feedback variable, where a significant relationship was found.

Keywords

Digital competences, university professors, self-perception, educational modality, Information and Communication Technologies

Revised Amendments from Version 1

The changes made in version 2 of the article compared to version 1 primarily address reviewer feedback aimed at enhancing the methodological clarity and theoretical justification. Key updates include a detailed explanation of the sampling method used, correcting inconsistencies regarding the declaration of probabilistic procedures and emphasizing the use of a non-probabilistic, purposive, and voluntary sampling technique. Additionally, the adaptation of the DigCompEdu Check-in instrument to the Ecuadorian context was theoretically and scientifically substantiated by highlighting cultural, socio-educational, and economic differences between Spain and Ecuador. This involved acknowledging the work of Cabero-Almenara and Palacios-Rodríguez in adapting the instrument to the Spanish context while justifying its relevance and applicability in Ecuador. Furthermore, the results section was expanded with a more in-depth discussion on the significance of the two-factor analysis, including gender differences and other demographic variables, aligning with the findings of recent studies on digital teaching competence in similar educational contexts. These adjustments aim to provide a more comprehensive understanding of the study’s results and ensure the methodological rigor necessary for scientific publication.


Introduction

Technology plays a crucial role in contemporary and future society, with significant implications for the educational field.1 The rapid pace of technological change has introduced profound challenges in teaching and learning.2 As a result, educators must recognize the necessity of self-directed professional development to enhance their expertise, including achieving an adequate level of digital literacy. This proficiency is essential for success across various dimensions of education and training.3

As key institutions responsible for fostering the development of professional competence, universities must possess qualified resources, both in terms of materials and human capital, to prepare well-rounded professionals.4,5 These professionals should be equipped with problem-solving skills, critical thinking abilities, and the right attitudinal disposition necessary to navigate the complexities of the digital age.6,7 However, despite the acknowledged importance of technology in education, there are significant gaps in research regarding how educators can effectively select and utilize technological tools within their teaching practices.

Therefore, it becomes crucial for educators to develop proficiency in selecting appropriate technological tools and seamlessly integrating them into their instructional approaches.8,9,11,12 By doing so, they can plan and implement activities that foster learning environments conducive to student success.10 Unfortunately, there is a dearth of comprehensive research that explores how educators can fully leverage the potential of technology to enhance teaching and learning in the classroom.13–15

The findings of this research will have significant implications for educational institutions, policymakers, and educators themselves.16 Understanding educators’ self-assessment of their digital competencies will provide clear insight into their level of confidence and awareness regarding the use of technology in education.17 Additionally, identifying the challenges they face in selecting and utilizing technological tools will offer valuable insights into the barriers that hinder the effective integration of technology into their pedagogical practices.

In light of this situation, this study aims to address these research gaps by examining educators’ self-assessment of their digital competencies and their ability to effectively select and utilize technological tools in their teaching practice. Understanding the challenges and needs faced by educators in this context will enable the development of more effective training and support strategies,18 contributing to their professional development and enhancing the quality of education in the digital age.19

In this context, the following question arises: What level of preparedness do the faculty members at the Technical University of Manabí (UTM) have in managing digital tools? To answer this question, two essential objectives are set. The first is to analyze the self-perceived level of digital competence (DC) among the faculty at UTM. The second is to establish the relationship between the variables of age, gender, and academic profile and this digital competence. The study was conducted using a quantitative approach, applying surveys to the faculty and considering gender, age, and academic background as variables. These factors were analyzed to identify potential correlations and establish patterns that contribute to a better understanding of the digital preparedness of faculty members in the current educational context.

Theoretical framework

As education increasingly moves towards digitalization, teachers’ digital competences have become crucial for ensuring teaching quality and effectiveness. The integration of technology into pedagogical processes demands continuous training and adaptation from educators to meet the challenges of the digital age. In this context, Digital Teaching Competence (DTC) is vital, allowing educators to use technology effectively while fostering interactive and dynamic learning environments that support student success. This section outlines the key theoretical concepts, dimensions, evaluation models, and self-assessment tools like DigCompEdu Check-in, emphasizing DTC’s transformative role in higher education and its importance for professional development.

Digital Teaching Competence (DTC)

Digital Teaching Competence (DTC) in higher education refers to the set of knowledge, skills, and abilities necessary for the effective use of digital media in an educational context.20 This competence enables educators to achieve pedagogical objectives through the use of technologies, making the design, implementation, and execution of training initiatives aimed at incorporating digital tools an essential professional requirement.21 Various studies highlight that DTC is an evolving skill that advances in parallel with rapid technological developments, requiring continuous updating by educators.22,23

Environmental Factors in Technological Integration

The integration of technology in education does not solely depend on the availability of technological tools but also on various environmental factors, such as institutional support, educational policies, and available technological infrastructure.24 In this regard, DTC should be conceived as a dynamic competence linked not only to technological advances but also to educators’ ability to adapt to changing educational contexts.25 When applied through interactive pedagogical tools, DTC enhances learning and contributes to the creation of more meaningful educational environments.26

Dimensions and Evaluation Models of DTC

Within the framework of DTC, different dimensions are identified, categorized by models and standards established for its evaluation. The International Society for Technology in Education (ISTE) and the National Institute of Educational Technologies and Teacher Education (INTEF) propose five areas of digital competences for educators: information and information literacy, communication and collaboration, digital content creation, security, and problem-solving.27–29 These areas are organized into competence levels, from basic (A1, A2) to intermediate (B1, B2) and advanced (C1, C2), depending on the educator’s proficiency in using digital technologies.30,31

Transformative Importance of DTC in Higher Education

In the context of higher education, several studies have demonstrated that DTC has a transformative impact. Educators not only need to develop their own digital skills but also foster the necessary competencies in students to adapt to the digital world.31,32 These studies reveal that DTC is crucial in preparing teachers and students for contemporary technological challenges. Moreover, its relevance is reinforced in the context of the United Nations’ Sustainable Development Goals (SDGs), where digital competences contribute to advancing equality and social progress.

Operational Definition of DTC

DTC is operationally defined as the ability of educators to develop operational skills with the use of technological devices, facilitating access to information and digital resources. This competence has evolved into a flexible and critical tool that adapts to social and educational realities and continuously evolves. Its integration into the educational field has allowed for the development of frameworks and self-assessment tools that measure the degree of DTC appropriation, both in educators and educational managers.31 These tools are aligned with educational policies and strategies proposed by organizations such as UNESCO and the European Union.

Importance of Digital Teaching Competence (DTC) and Evaluation Tools

The literature review highlights that the importance of Digital Teaching Competence (DTC) has been widely recognized through the development of frameworks and self-assessment instruments designed to measure the degree of DTC appropriation by teachers and educational managers, as well as the integration and use of Information and Communication Technologies (ICT).33,34 These frameworks and instruments align closely with educational policies and strategies proposed by Digitally Competent Educational Organizations, the European Framework for Digital Competence of Educators (DigCompEdu), the Mentoring Technology-Enhanced Pedagogy (MENTEP) project, and the United Nations Educational, Scientific and Cultural Organization (UNESCO).24

International Projects to Promote DTC

UNESCO’s Mentoring Technology-Enhanced Pedagogy (MENTEP) project aims to incorporate the technical model of ICT into the pedagogical environment to enhance teaching through technology,24,35 particularly through the use of Massive Open Online Courses (MOOCs).36 This project also provides teachers with the opportunity to self-assess their digital competences and identify areas for improvement through the standardized TET-SAT test.24 This initiative emphasizes the importance of self-assessment in acquiring and strengthening digital competences in educational settings.

The European Framework for Digital Competence of Educators (DigCompEdu)

The European Framework for Digital Competence of Educators (DigCompEdu) provides a solid reference for the development of digital competences at all educational levels.37,38 Based on scientific foundations, DigCompEdu helps identify needs, deepen knowledge, expand skills, and promote professional development.39 This framework covers six competence areas: professional engagement, digital resources, teaching and learning, assessment and feedback, student empowerment, and fostering students’ digital competence.40,41 These areas stand out for their comprehensive approach to DTC development, adapting to various educational contexts and levels of proficiency.

Methods

To explore the proposed objectives, a descriptive correlational study was designed, supported by a quantitative approach and a non-experimental design structure. The methodological choice was neither random nor incidental but was deliberately aligned with the overall goals of the research. The focus was on delving into the essential core of the studied phenomenon and the intricate relationships between predetermined variables.42 It is noteworthy that the strategy adopted is based on a strictly observational model, excluding any form of direct intervention or manipulation of the subject matter.

Within this framework, the following key elements are established:

  • Critical phases in the research process were meticulously identified. These stages include the recruitment period for participants scheduled for the year 2022, specific moments for exposing the study subjects to the variables of interest, carefully structured follow-up phases, and predetermined timeframes for the empirical data collection.

  • Proactive measures were taken to identify and mitigate potential sources of bias arising from the non-probabilistic, voluntary sampling. For this, weighted tests and related statistical checks were applied, adapted to the peculiarities and demands of the research, with the aim of preserving objectivity and impartiality in the results.

  • The selection of the number of participants was based on advanced statistical algorithms. These considered both the analytical power and the estimated effect size of the variables in question. This dual approach not only maximized the capability to detect significant interactions or impacts but also minimized the risks of erroneous inferences. In the event of missing data, consolidated statistical methodologies were applied. Initially, the nature and extent of the missing data were assessed. Subsequently, either data imputation techniques were used or analyses were restricted to complete records. This procedure ensured the integrity and reliability of the data and its subsequent interpretations.

Ethical considerations

In accordance with the ethical imperatives governing academic research, a meticulous procedure for obtaining informed consent from participants was diligently executed prior to the administration of the research instrument. This is a critical facet in the realm of scientific inquiry, designed to ensure that participants are not only fully aware of the study’s overarching aim and methodology but also of any prospective risks and benefits that may arise from their involvement.

To facilitate a comprehensive understanding of the study’s parameters, informed consent documents were articulated in a lucid and accessible language, deliberately avoiding any complex technical terminology that could potentially obfuscate participants’ comprehension of the study’s scope and implications. This approach was adopted to reinforce the principle of voluntariness, emphasizing that participants were free to either abstain from or withdraw from the study at any point, without suffering any negative repercussions.

At the same time, strict protocols were established to safeguard the confidentiality and privacy of the data collected from the participants. Detailed explanations were provided regarding the mechanisms to protect the identity and personal data of the participants. Once any pending questions or concerns were addressed, participants were invited to officially register their consent. This was achieved through the signing of the informed consent document, before proceeding to administer the questionnaire through Google Forms.

The authorization for the execution of the current research was granted by the Institutional Ethics Committee, and the funding was facilitated by the Honorable University Council of the Technical University of Manabí. This support was institutionalized through the issuance of resolution RHCU.UTM-No.259-SO-10-2022, dated January 10, 2022, thus preceding the data collection phase.

Participants and sampling

The study population comprised the entire faculty body of the Technical University of Manabí, totaling 992 academic professionals (N = 992). A non-probabilistic, purposive, and voluntary sampling method was employed, resulting in an estimated sample of 277 faculty members (n ≈ 277). The sample size was determined using the finite population sampling formula proposed by Hernández et al.,43 which takes into account the population size, confidence level, estimated proportion, and acceptable margin of error. This approach ensured adequate capacity to detect significant interactions while guaranteeing that the sample size was representative of the finite population under study. Although the voluntary nature of the sampling may limit the representativeness of the sample relative to the entire population, this method was deemed the most appropriate given resource constraints and the specific objectives of the research.
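For transparency, the finite-population calculation can be reproduced in a few lines of Python. The sketch below assumes the conventional parameters (95% confidence, z = 1.96, an estimated proportion of 0.5, and a 5% margin of error); these specific values are assumptions not reported in the article, but they recover n ≈ 277 for N = 992.

    import math

    def finite_population_sample_size(N, z=1.96, p=0.5, e=0.05):
        # n = N * z^2 * p * q / (e^2 * (N - 1) + z^2 * p * q), with q = 1 - p
        q = 1 - p
        return round((N * z**2 * p * q) / (e**2 * (N - 1) + z**2 * p * q))

    print(finite_population_sample_size(992))  # -> 277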

Data collection instrument

The DigCompEdu Check-in was employed as the data collection instrument for this study, originally developed by Ghomi and Redecker17 and published by the Joint Research Center. This tool has been widely used to assess digital teaching competencies in various educational contexts. While the instrument was initially adapted to the Spanish context by Cabero-Almenara and Palacios-Rodríguez,44 adjustments were necessary to ensure its relevance and applicability to the Ecuadorian educational setting. The adaptation process considered the distinct cultural, demographic, socio-educational, and economic factors present in Ecuador, which differ significantly from those of the European context where the tool was first applied.

The adaptation of the instrument to the Ecuadorian context was carried out with theoretical and methodological rigor, ensuring that it accurately reflected the realities of local educational systems. Ecuador, like many other Latin American countries, faces unique challenges related to access to technology, infrastructure limitations, and varying levels of digital literacy across different regions and populations. These factors were considered when adapting the survey, ensuring that the instrument captured the specific needs and competencies relevant to Ecuadorian educators.

Additionally, the socio-economic disparities present in Ecuador necessitated a more inclusive approach to assessing digital competencies, as educators may have varying levels of access to digital resources. The adaptation involved reviewing the language, examples, and scenarios used in the questionnaire to reflect the local educational environment and the realities of teaching in both urban and rural settings. Moreover, this adjustment aligned with recommendations from authors such as Martínez-Bravo et al.,45 who emphasize the importance of contextualizing digital competency frameworks to account for regional educational disparities and technological constraints.

The final version of the instrument retained its original structure, comprising 22 items categorized into six domains: Professional Commitment, Digital Resources, Digital Pedagogy, Assessment and Feedback, Empowering Students, and Facilitating Students’ Digital Competence. The responses were measured using a five-point Likert scale, ranging from “strongly disagree” to “strongly agree,” and supplemented with demographic variables such as gender and age. The items were coded alphanumerically, with the first letter representing the specific competency domain (e.g., “C” for Professional Commitment and “R” for Digital Resources).

The survey was distributed via Google Forms, ensuring efficient data collection and anonymity of responses, thus adhering to both ethical and logistical requirements. This online distribution method was particularly relevant in the Ecuadorian context, where accessibility to digital tools can vary, but the widespread availability of internet connections in academic institutions allowed for broad participation.

Data analysis

Following the data collection via surveys, an inferential analysis was conducted with the aim of elucidating the self-perception that educators have regarding their digital competencies. To eliminate ambiguities in the categorization of these competencies, a secondary alphabetical character was assigned to each group when two groups shared the same initial letter. Thus, “Evaluation and Feedback” was coded as “EfV,” while “Empowerment” was labeled as “EP.”

During the variable construction phase, a composite index (CALIF) was calculated for each participant. This was achieved by summing the corresponding values of the responses for the items linked to each competency domain. The composite index ranged from 0 to 88 points. Once these scores were obtained, the individuals’ competency levels were categorized according to the grading scheme outlined in Table 1.

Table 1. Classification and scoring system of the "DigCompEdu Check-In" competence level.

Level of competence | Score (out of 88 points)
Novice (A1) | <20
Explorer (A2) | 20-33
Integrator (B1) | 34-49
Expert (B2) | 50-65
Leader (C1) | 66-80
Pioneer (C2) | >80
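As an illustration, the banding in Table 1 can be expressed as a small lookup function; the thresholds are taken directly from the table (a minimal sketch, not the authors' actual scoring script).

    def digcompedu_level(calif):
        # CALIF is the 0-88 composite index; bands follow Table 1.
        if calif < 20:
            return "Novice (A1)"
        if calif <= 33:
            return "Explorer (A2)"
        if calif <= 49:
            return "Integrator (B1)"
        if calif <= 65:
            return "Expert (B2)"
        if calif <= 80:
            return "Leader (C1)"
        return "Pioneer (C2)"

    print(digcompedu_level(47))  # -> Integrator (B1)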

For each area of competence, a variable was generated with the sum of the scores of the items that constituted the area of competence, and together with each numerical variable, a categorical variable was conceived as suggested by Ref. 46, as specified below:

  • COMP_C (Commitment): Based on responses in “Professional Commitment,” with scores ranging from 0 to 16 points.

  • PEDAGO_C (Digital Pedagogy): Based on responses in “Digital Pedagogy,” with scores ranging from 0 to 16 points.

  • RECDIGC (Digital Resources): Based on responses in “Digital Resources,” with scores ranging from 0 to 12 points.

  • EMPODERAC (Empowerment): Based on responses in “Empowering Students,” with scores ranging from 0 to 12 points.

  • EVALUAYRC (Evaluation and Feedback): Based on responses in “Evaluation and Feedback,” with scores ranging from 0 to 12 points.

  • FACILITACOC (Facilitating Students’ Digital Competence): Based on responses in “Facilitating Students’ Digital Competence,” with scores ranging from 0 to 20 points.

This structured approach facilitated a comprehensive analysis of the various dimensions of digital competencies among educators, ensuring that each competency area was accurately represented and analyzed.
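A hypothetical pandas sketch of this variable construction is shown below. The item column names (C1…C4, P1…P4, and so on) are invented for illustration, since the article specifies only the alphanumeric coding scheme and the per-domain score ranges.

    import pandas as pd

    # Hypothetical item columns per domain (names are assumptions);
    # each item is scored 0-4 on the five-point Likert scale.
    domains = {
        "COMP_C":      ["C1", "C2", "C3", "C4"],        # 0-16 points
        "PEDAGO_C":    ["P1", "P2", "P3", "P4"],        # 0-16 points
        "RECDIGC":     ["R1", "R2", "R3"],              # 0-12 points
        "EMPODERAC":   ["EP1", "EP2", "EP3"],           # 0-12 points
        "EVALUAYRC":   ["EfV1", "EfV2", "EfV3"],        # 0-12 points
        "FACILITACOC": ["F1", "F2", "F3", "F4", "F5"],  # 0-20 points
    }

    def build_scores(items: pd.DataFrame) -> pd.DataFrame:
        # Sum the items of each competence area, then the 0-88 composite.
        scores = pd.DataFrame({name: items[cols].sum(axis=1)
                               for name, cols in domains.items()})
        scores["CALIF"] = scores.sum(axis=1)  # composite index, 0-88
        return scores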

The reliability of the DigCompEdu Check-in instrument was evaluated using Cronbach’s Alpha to assess both general and internal consistency. The analysis was conducted with SPSS-21 software, yielding a Cronbach’s Alpha of 0.949 across the 22 items, which signifies a high level of reliability (see Table 2).

Table 2. Reliability statistics.

Reliability statistic | Value | Number of elements
Cronbach’s Alpha | 0.949 | 22

A reliability test using Cronbach’s alpha coefficient was performed to assess the internal consistency of the 22 survey items, providing an indication of how well the instrument consistently measures the same underlying construct. A higher Cronbach’s alpha value signifies greater reliability, while a lower value suggests potential inconsistencies. The analysis, conducted with SPSS-21, confirmed the robustness and stability of the instrument, ensuring that the items consistently reflect the digital competencies being assessed. The results, shown in Table 3, demonstrate the instrument’s overall reliability, reinforcing the validity of the findings.
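The coefficient itself is straightforward to reproduce outside SPSS. The following numpy sketch implements the standard alpha formula and the “alpha if item deleted” column of Table 3, assuming an items matrix with one row per respondent and one column per item (a sketch, not the authors' procedure).

    import numpy as np

    def cronbach_alpha(items):
        # items: (n_respondents, k_items) array of Likert scores
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars.sum() / total_var)

    def alpha_if_deleted(items):
        # Recompute alpha with each item removed in turn (cf. Table 3).
        return np.array([cronbach_alpha(np.delete(items, j, axis=1))
                         for j in range(items.shape[1])])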

Table 3. Cronbach's reliability test results.

Area | Competence | Scale mean if item deleted | Scale variance if item deleted | Corrected item-total correlation | Cronbach's alpha if item deleted
Professional Engagement | Organizational Communication | 52.5343 | 129.018 | 0.500 | 0.949
Professional Engagement | Professional Collaboration | 52.8448 | 128.885 | 0.611 | 0.948
Professional Engagement | Reflective Practice | 52.5812 | 127.708 | 0.627 | 0.947
Professional Engagement | Digital Training | 51.9458 | 124.283 | 0.654 | 0.947
Digital Resources | Selection | 52.5668 | 128.609 | 0.596 | 0.948
Digital Resources | Creation and Modification | 52.5307 | 129.388 | 0.623 | 0.948
Digital Resources | Administration, Sharing, and Protection | 52.3827 | 126.686 | 0.607 | 0.948
Digital Pedagogy | Teaching | 52.4585 | 125.698 | 0.694 | 0.947
Digital Pedagogy | Guidance | 52.1913 | 127.510 | 0.703 | 0.946
Digital Pedagogy | Collaborative Learning | 52.2888 | 128.699 | 0.640 | 0.947
Digital Pedagogy | Self-directed Learning | 52.2166 | 129.272 | 0.669 | 0.947
Evaluation and Feedback | Evaluation Strategies | 52.2635 | 128.166 | 0.702 | 0.947
Evaluation and Feedback | Analysis of Evidence and Tests | 52.3863 | 126.252 | 0.716 | 0.946
Evaluation and Feedback | Feedback and Planning | 52.6643 | 127.840 | 0.723 | 0.946
Empowering Students | Accessibility and Inclusion | 52.1047 | 128.029 | 0.642 | 0.947
Empowering Students | Differentiation and Personalization | 52.4693 | 126.235 | 0.677 | 0.947
Empowering Students | Active Student Participation | 52.3899 | 125.492 | 0.751 | 0.946
Facilitating Students' Digital Competence | Information and Media Literacy | 52.5235 | 127.316 | 0.689 | 0.947
Facilitating Students' Digital Competence | Digital Communication and Collaboration | 52.4513 | 127.401 | 0.700 | 0.947
Facilitating Students' Digital Competence | Creation of Digital Content | 52.2491 | 127.115 | 0.718 | 0.946
Facilitating Students' Digital Competence | Responsible Use and Well-being | 52.4801 | 126.555 | 0.735 | 0.946
Facilitating Students' Digital Competence | Digital Problem Solving | 52.4296 | 127.688 | 0.653 | 0.947

Table 3 presents the results of Cronbach’s alpha reliability assessment across several key competency domains, including Professional Engagement, Digital Resources, Digital Pedagogy, Evaluation and Feedback, Empowering Students, and Facilitating Students’ Digital Competence. The analysis reveals high reliability across all competencies, with Cronbach’s alpha values ranging from 0.946 to 0.949, surpassing the commonly accepted threshold of 0.7, which indicates substantial internal consistency. These results suggest that the instrument consistently measures the intended competencies across the various domains.

Further analysis of the scale mean and variance in the event of item deletion shows minimal variation, with the scale mean averaging around 52.5 and variance values ranging between the mid-120s and 130. This consistency across competencies implies a balanced scale. Additionally, correlations between individual competencies and the corrected total score reveal positive linear relationships, with coefficients ranging from 0.500 in Organizational Communication (Professional Engagement) to 0.751 in Active Student Participation (Empowering Students). This indicates that an increase in individual competency scores correlates with a rise in the overall score. Lastly, evaluating Cronbach’s alpha when specific items are removed suggests that the exclusion of any single competency would have a negligible impact on the overall reliability of the instrument, further reinforcing its robustness.

Results

In this section, the results of the study are presented. Building on the work of Moreira-Choez et al.,70 they provide a detailed analysis of participants’ self-evaluations of their digital teaching competencies. Using descriptive statistics, the data were interpreted to reveal trends, frequencies, and averages, helping to identify key insights regarding faculty members’ proficiency in various digital competency areas. The analysis utilized a five-level Likert scale, ranging from “strongly disagree” to “strongly agree,” allowing for a nuanced interpretation of the responses. This categorization highlighted general trends and specific areas where digital competency training may be needed, offering a foundation for future discussions and recommendations aimed at improving digital teaching practices within the institution.

Table 4. Results of the descriptive analysis on the areas of digital competence of professors (n=277).

Competence | Novice | Explorer | Integrator | Expert | Leader | Pioneer
PROFESSIONAL COMMITMENT | 3.25 | 16.97 | 36.46 | 38.99 | 3.61 | 0.72
RECOGNIZES, EVALUATES, AND EMPOWERS | 2.53 | 13.00 | 39.35 | 35.38 | 9.03 | 0.72
DIGITAL PEDAGOGY | 0.36 | 9.39 | 40.07 | 39.71 | 7.94 | 2.53
EVALUATES AND PROVIDES FEEDBACK | 1.44 | 9.03 | 46.21 | 32.13 | 7.58 | 3.61
EMPOWERS STUDENTS | 2.53 | 8.30 | 32.85 | 42.60 | 9.03 | 4.69
FACILITATES COMPETENCES | 2.53 | 5.05 | 48.74 | 36.82 | 3.61 | 3.25

Values are percentages of participants in each category.

Table 4 presents the distribution of participants (n=277) across different competence levels for six key areas of digital competence. The results provide a comprehensive overview of how professors self-assess their proficiency in these domains, highlighting areas of strength and potential improvement.

In the area of Professional Commitment, the majority of professors are categorized as experts (38.99%) and integrators (36.46%). These findings indicate a significant level of engagement in professional development and digital competence. However, only a small proportion of respondents are classified as pioneers (0.72%) or novices (3.25%), suggesting a lower presence of both extreme levels of competence. Intermediate categories, such as explorers and leaders, account for 16.97% and 3.61% respectively.

The Recognizes, Evaluates, and Empowers competency shows a similar trend, with the integrator category representing the largest group (39.35%), followed closely by experts (35.38%). Novices and pioneers are underrepresented, with values of 2.53% and 0.72% respectively, while explorers (13.00%) and leaders (9.03%) occupy intermediate levels. This distribution reflects an overall positive self-perception among the majority of respondents, though it indicates room for growth in leadership and pioneering roles.

In the Digital Pedagogy domain, integrators (40.07%) and experts (39.71%) dominate the responses, demonstrating high levels of competence in the application of digital tools in teaching practices. Explorers account for 9.39%, while leaders (7.94%) and pioneers (2.53%) remain in the minority. Only 0.36% of respondents identify as novices, indicating a general familiarity with digital pedagogical tools across the sample.

For the Evaluates and Provides Feedback competence, the largest group consists of integrators (46.21%), followed by experts (32.13%). Explorers (9.03%) and leaders (7.58%) show a moderate presence, whereas pioneers (3.61%) and novices (1.44%) constitute the smallest categories. This suggests that most professors are proficient in using digital tools for evaluation and feedback, with fewer at the novice or pioneer levels.

Regarding Empowers Students, 42.60% of the participants identify as experts, and 32.85% as integrators. The remaining categories include leaders (9.03%), explorers (8.30%), pioneers (4.69%), and novices (2.53%), illustrating that the majority of professors feel confident in empowering students through digital means, though there is still a notable proportion in lower competence levels.

Finally, the Facilitates Competences competence is predominantly represented by integrators (48.74%) and experts (36.82%). The explorer category accounts for 5.05%, while leaders (3.61%), pioneers (3.25%), and novices (2.53%) remain less frequent. This distribution indicates that a considerable number of professors perceive themselves as proficient in facilitating digital competencies in their educational practices.

Table 5 presents the analysis of variance, which examines the sources of variation, the degrees of freedom, and the sum of squares for each numerical variable investigated. Analysis of variance (ANOVA) is a statistical technique used to determine the significance of differences between groups or categories.

Table 5. Summary of variance analyses.

Source of variation | df | Commitment | Digital resources | Digital pedagogy | Empowers | Evaluates and provides feedback | Facilitates competences
Sex | 1 | 0.194 | 11.978 | 3.956 | 0.561 | 3.168 | 14.771
Academic profile | 3 | 13.812 | 6.320 | 4.432 | 5.814 | 0.963 | 10.764
Age | 3 | 5.725 | 7.215 | 30.762 | 20.350 | 16.314 | 59.216
Sex × academic profile | 3 | 42.867 | 23.227 | 46.227 | 17.368 | 11.432 | 43.846
Sex × age | 3 | 31.892 | 9.857 | 15.598 | 6.605 | 6.975 | 27.227
Academic profile × age | 5 | 11.394 | 6.288 | 13.966 | 17.036 | 8.279 | 32.639
Sex × academic profile × age | 2 | 8.429 | 11.755 | 8.799 | 3.864 | 2.409 | 2.676
Error | 257 | 1635.739 | 857.105 | 1386.356 | 1039.333 | 880.300 | 2313.342
Total corrected | 277 | 1805.726 | 954.671 | 1500.686 | 1104.801 | 932.108 | 2533.199

Entries are sums of squares for each competence-area score.

When examining the sources of variation (sex, academic profile, and age) and their interactions, significant differences are observed in the digital pedagogy variable due to the interaction effect between sex and academic profile. This suggests that the combination of sex and academic profile influences the variability in pedagogical competence.

Furthermore, the sex × age interaction yields a sum of squares of 31.892 for professional commitment, indicating that the combination of sex and age contributes appreciably to variations in the level of professional commitment.

Similarly, the confluence of sex, academic profile, and age has a notable effect on digital resources, with a sum of squares of 11.755. This suggests that the interaction among these three factors plays a role in determining the level of digital resource competency.
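Although the authors worked in SPSS, an equivalent three-factor ANOVA decomposition can be sketched with statsmodels. The data frame below is synthetic and the factor names (sex, profile, age_group) are placeholders, so only the structure of the output, not its values, mirrors Table 5.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    rng = np.random.default_rng(0)
    n = 277
    df = pd.DataFrame({
        "sex": rng.choice(["F", "M"], n),
        "profile": rng.choice(["A", "B", "C", "D"], n),
        "age_group": rng.choice(["<35", "35-44", "45-54", "55+"], n),
        "PEDAGO_C": rng.integers(0, 17, n),  # placeholder 0-16 subscale score
    })

    # Full factorial model: main effects plus all interactions.
    model = smf.ols("PEDAGO_C ~ C(sex) * C(profile) * C(age_group)", data=df).fit()
    print(anova_lm(model, typ=2))  # sums of squares and df per source of variation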

Table 6 provides a summary of the chi-square test results, which examine the independence of each factor from the categorical variables in each competence area. The “P value” column indicates the significance of each test, with a significance criterion of α = 0.05. If the test is not significant, the variables can be considered independent. For example, in the case of the “Evaluates and provides feedback” variable, a significant relationship with sex is found.

Table 6. Chi-square test for the factors of sex, age and academic education level.

Factor | Competence area | Chi-square | df | P value
Sex | Professional commitment | 3.767 | 5 | 0.583
Sex | Digital pedagogy | 4.983 | 5 | 0.418
Sex | Recognizes, evaluates | 6.142 | 5 | 0.293
Sex | Evaluates and provides feedback | 15.543 | 5 | 0.008
Sex | Empowers students | 7.204 | 5 | 0.206
Sex | Facilitates competences | 1.903 | 5 | 0.862
Age | Professional commitment | 29.113 | 15 | 0.016
Age | Digital pedagogy | 32.688 | 15 | 0.005
Age | Recognizes, evaluates | 36.243 | 15 | 0.002
Age | Evaluates and provides feedback | 24.878 | 15 | 0.052
Age | Empowers students | 15.910 | 15 | 0.388
Age | Facilitates competences | 42.415 | 15 | 0.0001
Academic profile | Professional commitment | 15.540 | 15 | 0.413
Academic profile | Digital pedagogy | 24.410 | 15 | 0.058
Academic profile | Recognizes, evaluates | 9.339 | 15 | 0.859
Academic profile | Evaluates and provides feedback | 20.957 | 15 | 0.138
Academic profile | Empowers students | 28.461 | 15 | 0.019
Academic profile | Facilitates competences | 16.066 | 15 | 0.378

Regarding age, independence is observed in the “Evaluates and provides feedback” and “Empowers students” variables, suggesting that age does not significantly affect these competences. However, in the other variables, a significant relationship is found between age and the rest of the competences, indicating a non-independent effect of age on those variables.

For the academic profile, independence from the competence variables is observed, except for the “Empowers students” variable. This implies that the academic profile is significantly related only to empowering students, while remaining independent of the other competences.

These findings highlight the complex interplay of factors and their interactions in influencing the variations observed in different competence areas. Understanding these relationships is essential for tailoring educational interventions and designing targeted strategies to enhance specific competences based on sex, academic profile, age, and their combinations.
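A chi-square test of independence like those in Table 6 can be reproduced with scipy. The crosstab below uses synthetic placeholder data and assumed column names, but a 2 × 6 contingency table yields the same 5 degrees of freedom reported for the sex factor.

    import numpy as np
    import pandas as pd
    from scipy.stats import chi2_contingency

    rng = np.random.default_rng(1)
    levels = ["Novice", "Explorer", "Integrator", "Expert", "Leader", "Pioneer"]
    df = pd.DataFrame({
        "sex": rng.choice(["F", "M"], 277),
        "evaluates_level": rng.choice(levels, 277),  # hypothetical category column
    })

    table = pd.crosstab(df["sex"], df["evaluates_level"])
    chi2, p, dof, _ = chi2_contingency(table)
    print(f"chi2 = {chi2:.3f}, df = {dof}, p = {p:.3f}")  # df = 5 for a 2x6 table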

Within the multidimensional scaling performed with the multivariate technique, stress is reported as a measure of goodness of fit; its value of 0.1828 indicates a ‘weak to poor’ fit, which is reflected in the regression plot in Figure 1, where the degree of coupling of the data can be observed. Nevertheless, the configuration showed closeness between the items of the ‘Empowers’, ‘Evaluates’, and ‘Facilitates Digital Competence’ competence areas. The items of these areas formed two compact groups, sharing items between one group and the other, but indicating that, in general, these items formed a single construct.


Figure 1. Scaling of competences.

Source: Data provided by respondents, processed with SPSS-21 statistical software.

As can be observed, Professional Commitment and Digital Resources were left out of this group. The most dispersed items were those of Professional Commitment: only “Reflective Practice” and “Professional Collaboration” were relatively close to each other, and both were far from “Digital Training” and “Organizational Communication”. Likewise, the “Digital Resources” items were found to be close to each other. Consequently, the items of these two areas could be considered to form another construct.
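Kruskal's stress-1, the goodness-of-fit measure quoted above (0.1828), compares the fitted configuration distances against the input dissimilarities. A minimal numpy sketch, assuming dissim is the condensed vector of item dissimilarities and embedding the fitted coordinates (for example, from sklearn.manifold.MDS with dissimilarity="precomputed"):

    import numpy as np
    from scipy.spatial.distance import pdist

    def kruskal_stress1(dissim, embedding):
        # stress-1 = sqrt( sum((d_ij - delta_ij)^2) / sum(d_ij^2) ), where d_ij
        # are pairwise distances in the fitted configuration and delta_ij the
        # input dissimilarities (both in condensed form, as returned by pdist).
        d = pdist(embedding)
        return np.sqrt(((d - dissim) ** 2).sum() / (d ** 2).sum())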

Next, a two-stage factor analysis was performed.

The results of the factor analysis with the maximum likelihood method and the “Oblimin” oblique rotation method option showed a KMO value of 0.961 and Bartlett’s test of sphericity was significant, indicating that the application of the factor analysis was adequate.
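These diagnostics and the rotated solution can be approximated with the factor_analyzer package. The sketch below generates synthetic two-factor data purely so the snippet runs end to end; it is not the authors' SPSS procedure.

    import numpy as np
    import pandas as pd
    from factor_analyzer import FactorAnalyzer
    from factor_analyzer.factor_analyzer import (
        calculate_bartlett_sphericity, calculate_kmo)

    rng = np.random.default_rng(2)
    latent = rng.normal(size=(277, 2))  # two latent factors (synthetic)
    items = pd.DataFrame(latent @ rng.uniform(0.4, 0.9, (2, 22))
                         + rng.normal(scale=0.5, size=(277, 22)))

    chi2, p = calculate_bartlett_sphericity(items)  # Bartlett's test of sphericity
    _, kmo_total = calculate_kmo(items)             # study reports KMO = 0.961

    # Maximum likelihood extraction with oblique (oblimin) rotation, two factors.
    fa = FactorAnalyzer(n_factors=2, method="ml", rotation="oblimin")
    fa.fit(items)
    loadings = pd.DataFrame(fa.loadings_, index=items.columns)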

The lower part of Table 7 shows the eigenvalues of the first two factors; together they yield a cumulative explained variance of 55.21%, which was considered acceptable and suggests a two-factor model. The first factor was made up of the competences mainly aimed at students’ empowerment with digital tools and their application to the teaching-learning process.

Table 7. Factor analysis of items.

Area/Competence | Factor 1 | Factor 2
Facilitating Students' Digital Competence/Digital Communication and Collaboration | 0.859 |
Facilitating Students' Digital Competence/Responsible Use and Well-being | 0.809 |
Empowering Students/Differentiation and Personalization | 0.800 |
Facilitating Students' Digital Competence/Digital Problem Solving | 0.788 |
Facilitating Students' Digital Competence/Creation of Digital Content | 0.787 |
Facilitating Students' Digital Competence/Information and Media Literacy | 0.702 |
Evaluation and Feedback/Feedback and Planning | 0.659 |
Evaluation and Feedback/Analysis of Evidence and Tests | 0.631 |
Digital Pedagogy/Collaborative Learning | 0.597 |
Empowering Students/Accessibility and Inclusion | 0.595 |
Empowering Students/Active Student Participation | 0.590 |
Digital Pedagogy/Self-directed Learning | 0.472 |
Digital Pedagogy/Guidance | 0.461 |
Digital Pedagogy/Teaching | 0.393 | 0.373
Professional Engagement/Digital Training | | 0.783
Digital Resources/Selection | | 0.678
Digital Resources/Creation and Modification | | 0.640
Professional Engagement/Organizational Communication | | 0.633
Professional Engagement/Reflective Practice | | 0.442
Evaluation and Feedback/Evaluation Strategies | 0.368 | 0.409
Digital Resources/Administration, Sharing, and Protection | | 0.373
Eigenvalue | 10.862 | 1.285
% Variance explained | 49.374 | 5.840
% Cumulative variance | 49.374 | 55.214

The second factor generated a construct that described the professors’ personal attitudes toward their own training and use of digital tools. That is, the first construct referred to the entire pedagogical structure, and the second to the professors’ personal training and their use of digital tools.

Similarly, Table 7 presents the factors and the loadings of each of the items within the factors. Thus, the rotation analysis made it possible to achieve what the literature suggests as a good result in factor analysis, in other words, that each factor is made up of variables with high values and some variables with values close to zero, and that each variable belongs to only one of the factors.

Therefore, variables with small loadings were omitted to avoid duplicities, taking values lower than 0.35 as the elimination criterion. Under this criterion, only two items could not be fully discriminated between the factors: Teaching and Evaluation Strategies (see Table 7). Teaching loaded slightly more strongly on factor 1, while Evaluation Strategies loaded more strongly on factor 2. Finally, Professional Collaboration did not form part of either factor when the scores were obtained.

Through the assisted clustering analysis, three groups were identified; these are shown in Figure 2.


Figure 2. Assisted clustering analysis of the constructs.

Source: Data provided by respondents, processed with SPSS-21 statistical software.

In this analysis, it was found that the grouping of participants was strongly influenced by factor 1, which pertained to the pedagogical aspect. Group 1 comprised professors who exhibited higher ratings on construct 1, indicating a strong emphasis on pedagogy. Group 2 represented an intermediate category, characterized by a balance between the two constructs. On the other hand, Group 3 consisted of professors whose ratings on construct 2 prevailed.

These groupings suggest that participants varied in their emphasis on different aspects of teaching and learning. Group 1 highlighted a strong focus on pedagogical approaches, while Group 3 demonstrated a greater emphasis on the second construct, which may be related to other dimensions or competences.

Understanding these groupings provides insights into the diverse perspectives and preferences of educators in relation to teaching practices and competences. It enables researchers and educators to identify specific areas where professional development and support can be targeted to enhance teaching effectiveness and promote a well-rounded approach to education.
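The article describes this step as an “assisted clustering analysis” without naming the algorithm. K-means with k = 3 on the two factor scores is one plausible way to obtain such a three-group partition, sketched here on stand-in data.

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(3)
    scores = rng.normal(size=(277, 2))  # stand-in for the two factor scores
    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(scores)
    print(np.bincount(km.labels_))      # group sizes (study: 117, 101, 59)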

To describe the constitution of the groups more precisely, the means of each of the variables used to calculate the score in each area of competence were generated; the results are shown in Table 8.

Table 8. Means and standard deviation of the groups formed.

Variable | G1 (n=117) mean | G1 SD | G2 (n=101) mean | G2 SD | G3 (n=59) mean | G3 SD
COMMITMENT | 11.85 | 1.428 | 9.0792 | 1.56002 | 6.509 | 1.569
DIGRESOURCES | 8.69 | 1.296 | 6.8119 | 1.14641 | 5.034 | 1.144
DIGPEDAGOGY | 12.35 | 1.698 | 9.8119 | 1.42627 | 7.797 | 1.200
EMPOWER ESTUD | 9.29 | 1.327 | 7.1683 | 1.49713 | 5.661 | 1.360
FACILITATESCO | 14.86 | 2.297 | 11.1782 | 1.83519 | 9.509 | 1.995
EVALUATES AND PROVIDES FEEDBACK | 8.85 | 1.538 | 6.7426 | 1.00651 | 5.610 | 1.145


Table 8 displays the mean values for each variable within each competence area. These means offer a comprehensive understanding of the specific aspects or dimensions that contribute to the overall score in each competence area. By examining the means, researchers can identify the relative strengths and weaknesses of the participants in different areas of competence.

Analyzing the means enables a more nuanced interpretation of the group composition and allows for a deeper exploration of the participants’ competences. It provides valuable insights into the specific areas where educators excel or require further development.

By considering these mean values, researchers and educators can design targeted interventions and strategies to enhance specific competences and address any areas of improvement identified. This information contributes to the overall goal of fostering professional growth and enhancing the quality of teaching and learning experiences.
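For reproducibility, the per-group summary of Table 8 amounts to a grouped aggregation. A pandas sketch with placeholder data and hypothetical column names:

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(4)
    df = pd.DataFrame({
        "group": rng.choice(["G1", "G2", "G3"], 277),  # hypothetical cluster label
        "DIGPEDAGOGY": rng.integers(0, 17, 277),
        "FACILITATESCO": rng.integers(0, 21, 277),
    })

    # Mean and standard deviation of each competence score per group.
    print(df.groupby("group").agg(["mean", "std"]).round(3))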

Discussion

Upon examining the results, it is evident that the Integrator and Expert categories consistently present higher percentages in various competencies, such as PROFESSIONAL COMMITMENT, RECOGNIZE, EVALUATE AND EMPOWER, DIGITAL PEDAGOGY, EVALUATE AND PROVIDE FEEDBACK, EMPOWER STUDENTS, and FACILITATE COMPETENCIES. This indicates that teachers who perceive themselves as integrators and experts demonstrate a higher level of Digital Teaching Competence compared to explorers and novices. Additionally, leaders and pioneers, albeit fewer in number, show greater knowledge and abilities in these competencies.

These findings align with studies conducted by Espino-Díaz et al.47 and Hämäläinen et al.,48 which similarly highlight a pattern of behavior in competencies leaning towards a low to medium competence level. The study by Peled49 also supports this idea, suggesting a positive correlation between teachers’ self-perception as experts or integrators and their level of Digital Teaching Competence.

In terms of demographic factors, Table 5 shows that the sex × academic profile interaction exhibits its highest value on the digital pedagogy variable. This suggests that, regardless of sex, educational teaching techniques are applied without gender bias, contributing to the development of Digital Teaching Competence. Furthermore, the low values for sex on its own imply that sex is not a significant factor in acquiring commitments or fulfilling professional responsibilities in teaching.

These findings align with the discoveries of Guillén-Gámez et al.,50 who emphasize the limited differences between sex and age in terms of self-perception of digital competencies. This corroborates that sex does not have a significant impact on learning and teaching abilities related to the use and management of digital tools. It also suggests the need to plan and implement strategies for integrating digital competencies into teacher training programs.

Regarding age, the percentages indicate that it is not perceived as a limitation for acquiring or fulfilling commitments related to the appropriation of digital resources. This finding aligns with Suárez and Colmenero,51 where no statistically significant association was found between age and competence level, indicating that there are no significant differences among different age groups. This result is also backed by the study of Gudmundsdottir and Hatlevik,52 suggesting that digital competence is not determined by age, but rather by attitude and training.

Table 6 examines the relationship between the factors of sex, age, and academic profile and the competence variables. It indicates that sex is independent of these variables, except for “Evaluates and provides feedback,” suggesting that sex influences the search for possibilities to identify and verify knowledge. Also, for age, the “Evaluates and provides feedback” and “Empowers students” variables show the weakest associations, indicating that age does not determine processes involving communication with students.

Regarding the academic profile, the “Empower students” variable obtains the highest score, indicating that teachers strive to provide students with the necessary support and information to guide their development. Although some teachers possess basic or intermediate digital skills, it is essential that they empower their students through adequate training. This aligns with the study conducted by Colás-Bravo et al.,53 emphasizing the need for teachers to contribute to developing critical, reflective, creative, and innovative thinking in students through proper training. Aidoo et al.54 study supports this idea, suggesting that teachers need to have a deeper understanding of how to use digital tools to empower students and foster their autonomous learning.

The correlation analysis presented in Figure 1 highlights the interrelationship between items related to competencies such as “Empower”, “Evaluate”, and “Facilitate Digital Competence.” This suggests that as teachers set out to empower students, they contribute their own competencies to improve students’ skills, thus demonstrating their commitment to professional performance and teaching praxis.

However, it is important to note that Professional Commitment and Digital Resources were not included in this construct, indicating that not all teachers share the same attitude towards student training, possibly due to resource limitations. Another construct formed by “Digital Training” and “Organizational Communication” suggests the existence of divided positions regarding empowerment practices.

Findings from List55 support this idea, as they indicate that the management of digital identities in the educational context, specifically in the dimensions of communication and collaboration, is limited among teachers. This suggests that there is a need for further development of digital competencies among educators. Similarly, the study of Benitt et al.56 concludes that teachers need more comprehensive training in digital competencies to effectively utilize technologies in teaching.

The factor analysis draws attention to two competencies, Professional Collaboration and Evaluation Strategies, both of which play crucial roles in integrating digital competencies in education. Professional Collaboration, although it could not be clearly assigned to either factor, underscores the importance of adapting virtual environments to foster collaborative learning. This competency not only facilitates the application of strategies for enhancing student engagement but also contributes to creating more inclusive and participatory learning environments, as supported by Kempe and Grönlund,57 who emphasize the need for collaborative digital practices in contemporary educational contexts. According to Majid and Ali,58 digital collaboration among teachers is essential for building learning communities that promote shared knowledge and innovation. Therefore, promoting teachers’ participation in professional development programs that focus on collaboration in digital environments becomes critical to advancing pedagogical practices and responding to the evolving demands of online education.59–61

The second factor, Evaluation Strategies, highlights the essential role of assessment in determining student learning outcomes and identifying areas for pedagogical improvement. This aligns with findings by Amhag et al.,62 who assert that effective digital evaluation tools are indispensable for modern education, allowing teachers to track performance and adjust instruction accordingly. As Sillat et al.63 suggest, digital competence training programs must incorporate emerging methodologies for assessment, such as formative and summative evaluations using technological platforms. Additionally, Kaswan et al.64 argue that leveraging digital tools in assessment processes enables more personalized feedback and continuous improvement in teaching methods. Evaluation strategies are, therefore, pivotal for ensuring that pedagogical practices evolve alongside technological advancements, equipping educators with the ability to organize, track, and improve learning outcomes in virtual environments. By adopting these methods, educational institutions can enhance the quality of learning and better prepare students for future challenges in a digitized society.65–67

Figure 2 identifies three distinct groups of teachers. The first group (G1) places the greatest emphasis on the pedagogical construct, while the second group (G2) occupies an intermediate position without a clearly defined profile. The third group (G3) contributes comparatively little on this construct. These findings are consistent with the conclusions drawn by Gil-Jaurena and Domínguez,41 who recommend incorporating more profound educational technology content into training programs to improve educational quality and enable appropriate teaching praxis for the digital age. Moreover, the findings of the study by Rapanta et al.68 support this recommendation, suggesting that a better understanding of educational technology can help teachers adapt their practices to students’ needs and expectations in the digital environment.

Finally, Table 8 presents the mean values and standard deviations of the formed groups. It reveals that, within Group 1 (G1), the “DIGRESOURCES” variable shows the lowest proportional effect, indicating that without adequate digital resources teachers face limitations in providing quality education. On the other hand, Group 1 (G1) shows the highest weighted average on the “FACILITATESCO” variable, suggesting that the ability to facilitate the required competencies largely depends on the interaction between teachers and their students. In Group 3 (G3), the “DIGRESOURCES” variable receives the lowest score, underlining the importance of having digital resources to promote skills for the effective use of digital environments. These findings align with the study conducted by Pozo-Sánchez et al.,69 which associates higher digital competence scores with dimensions related to information and communication literacy and collaboration.

The study provides a comprehensive view of how teachers perceive themselves in terms of digital competence and how this perception relates to demographic variables such as gender, age, and academic profile. However, certain limitations and ambiguous areas suggest the need for further and more detailed research in the future.

Conclusions

The descriptive analysis revealed valuable insights into the self-perception of digital teaching competence among professors at the Technical University of Manabí (UTM). The majority of participants demonstrated a tendency towards the “Integrator” and “Expert” categories, with competencies such as “Facilitating Competences,” “Evaluating and Providing Feedback,” and “Digital Pedagogy” being the most frequently highlighted. Furthermore, the study found significant relationships between certain demographic variables and professors’ digital competence. Specifically, age, sex, and academic profile were shown to influence competencies in the “Digital Pedagogy” domain, although the relationship between sex and academic profile appeared inconsistent. Most variables displayed independence except for “Evaluates and Provides Feedback,” which demonstrated statistical significance.

Limitations

One of the primary limitations of this study is the lack of random sampling, which restricted the creation of a balanced and homogeneous sample. The non-probabilistic nature of the sample may have introduced bias, potentially affecting the generalizability of the findings to the broader population of university professors. Additionally, the voluntary participation of professors may have resulted in the underrepresentation of certain demographic groups, particularly those less inclined to engage with digital tools. These limitations should be considered when interpreting the study’s results and their applicability to other contexts.

The findings of this study underscore the importance of digital teaching competencies in higher education, especially in the current digital age where virtual learning environments are increasingly common. The relationship between demographic factors and digital competencies highlights the need for targeted professional development initiatives. By understanding how variables such as age and academic profile affect digital competence, institutions can design more effective training programs that address the specific needs of educators. This, in turn, will lead to enhanced teaching practices and better student outcomes, as faculty are better equipped to integrate digital tools into their pedagogy.

Recommendations

Based on the study’s findings, future research should focus on designing and implementing training programs that are specifically tailored to the digital teaching competence needs of university professors. These programs should emphasize both formative and procedural aspects, encouraging educators to actively produce and apply their learning in real-world contexts. The training should incorporate hands-on practice, allowing professors to develop digital competencies that meet the specific challenges of their teaching environments. Additionally, future studies should aim to employ random sampling techniques to create more representative samples, thus enhancing the validity and generalizability of the results. It is also recommended to explore the impact of digital competence on student learning outcomes, as this would provide further evidence of the importance of digital skills in education.
