«   Guidelines: main page     |     D-ChallengHE: project's website   *

 

 

Digital Challenges in Higher Education
Guidelines
for online and blended learning

Premises for academic curriculum digitalisation

 

 

 

Chapter 2   The role of teachers’ digital competences
                 2.4.   Ethical aspects, limits and challenges of using digital technologies in education. Safety and data protection

 

Feedback form for this chapter: https://forms.gle/tNyWC1HYMsP46t6WA
 

2.4.   Ethical aspects, limits and challenges of using digital technologies in education. Safety and data protection

A thorough, multifaceted approach to digital competence requires a departure from traditional pedagogical methods: educators are encouraged to adopt a collaborative, integrated framework that prioritizes continuous professional growth, adaptability, and the ethical implementation of digital tools in their teaching practice (Cook et al., 2023; Srivastava, 2023; Falloon, 2020; González et al., 2023).

Generally speaking, the penetration of new technologies has brought to the fore a series of aspects that require rigorous examination, as well as new educational policies and curriculum approaches. These concern education decision-makers, HEI leadership, researchers, and teachers at all educational levels.

1. Curriculum alignment. Ensuring that digital technologies are aligned with curriculum goals and learning outcomes is a challenge, as many educators struggle to integrate technology meaningfully into their lesson plans.

2. Policy implementation. The effectiveness of policies regarding the use of digital technologies in education can vary widely, with some regions lacking clear guidelines or support for implementation.

3. Changing pedagogical approaches. The need to rethink traditional pedagogical approaches in light of new technologies can be daunting for educators, particularly those who are less digitally literate. Similarly, developing effective assessment strategies that incorporate digital technologies while ensuring fairness and validity remains a significant challenge for educators.

4. Teacher burnout and workload. The rapid shift to online and hybrid learning models during the pandemic has contributed to increased stress and burnout among teachers, impacting their ability to effectively use digital tools.

5. Professional development and digital competence of educators. Many teachers lack the skills and confidence needed to integrate digital technologies into their teaching practices, leading to a reliance on traditional methods. Moreover, ongoing professional learning opportunities for teachers to develop their digital skills and pedagogical strategies for using technology effectively are often lacking.

6. Access to technology. Disparities in access to devices and reliable internet connectivity can hinder the ability of both teachers and students to engage with digital learning tools.

7. Equity and inclusion. The digital divide exacerbates existing inequalities in education, as students from disadvantaged backgrounds may not have the same access to technology and resources as their peers.

8. Ethical use of technology. Educators face challenges in understanding the ethical implications of using digital technologies, including issues related to data privacy, security, and the responsible use of generative AI.

For education professionals, this multifaceted responsibility calls for a thorough understanding of the potential implications of technology for student learning and well-being, and for training that emphasizes both the advantages and the risks of digital tools in education. Educators must be equipped with strategies to address the challenges posed by digital platforms, ensuring that these platforms not only enhance learning but also prioritize student safety and privacy; establishing robust frameworks for the ethical use of data is essential to maintaining trust and integrity within the educational ecosystem. Teachers must ensure that their practices safeguard students' privacy and uphold ethical standards, particularly in an age where the collection and use of data are prevalent in educational settings (Bhat, 2023). Consequently, educational institutions and technology providers must collaborate to create and implement robust frameworks that protect students' data while promoting the ethical use of technology, so that the digital classroom can serve as a space for equitable access, engagement, and personalized learning (Bhat, 2023; Srivastava, 2023).

Educators bear a significant responsibility to use data ethically in educational settings, balancing the benefits of learning analytics against student privacy concerns (Mandinach & Jimerson, 2022; Reidenberg & Schaub, 2018). This requires data literacy and knowledge of relevant regulations, such as FERPA and the Executive Order on the “Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence” in the US, and the AI Act in the European Union.

According to Hakimi et al. (2021), ethical considerations include four areas: (1) privacy, informed consent, and data ownership; (2) validity and integrity; (3) ethical decision making; and (4) governance and accountability. The authors emphasize the need for more rigorous evidence-based practices, and a more unified approach that incorporates ethical theory and addresses both immediate and long-term concerns within the broader learning and educational ecosystem. A privacy-compliant framework for capturing, storing, and using student data is essential. Overall, educators must couple data literacy with ethical practices, using the right data for the right purposes to benefit students (Mandinach & Jimerson, 2022). Beyond regulations and top-down recommendations, this requires ongoing attention to evolving technologies and their implications for student privacy and autonomy.
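The idea of a privacy-compliant framework for capturing and storing student data can be made concrete with a minimal sketch: raw student identifiers are replaced with keyed pseudonyms before analytics records are stored, so records remain linkable without exposing real identities. The function names, the key handling, and the record shape below are illustrative assumptions, not part of any cited framework; in practice the key would be managed by the institution, separately from the analytics store.

```python
import hmac
import hashlib

# Placeholder secret held by the institution, NOT stored with analytics data.
SECRET_KEY = b"institution-held-secret"

def pseudonymise(student_id: str, key: bytes = SECRET_KEY) -> str:
    """Return a stable, non-reversible pseudonym for a student identifier.

    HMAC-SHA256 keeps the same student linkable across records while
    preventing anyone without the key from recovering or guessing the ID.
    """
    return hmac.new(key, student_id.encode("utf-8"), hashlib.sha256).hexdigest()

def record_event(student_id: str, event: str) -> dict:
    """Build an analytics record that never contains the raw identifier."""
    return {"student": pseudonymise(student_id), "event": event}
```

This is only a sketch of the pseudonymisation step; a full framework would also cover informed consent, retention limits, and access controls, as the ethical areas listed above suggest.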

Of increasing importance is plagiarism, especially given the proliferation of AI-based assistants. The integration of AI into academic work raises significant questions regarding ownership, attribution, copyright, and authorship. As students grapple with these issues, distinguishing between legitimate AI assistance and academic misconduct becomes increasingly complex, necessitating educational interventions that address what constitutes plagiarism and the importance of maintaining individuality in writing. While there are small variations and nuances across legislative bodies, educational institutions, publishing houses, and journals, the consensus, for very good reasons, is that AI or AI-assisted tools cannot be credited as an author or co-author: authorship is strictly a human responsibility. Attitudes toward AI usage are still evolving, and regulations are still developing, particularly as AI detection technology remains in its early stages and often produces false positives. Consequently, AI detection is primarily used to flag possible AI involvement in a text for further examination, rather than to issue a simple pass/fail verdict based on AI-usage percentages. While AI-generated ideas can expedite the writing process, scholars and students must not use AI outputs verbatim, as this constitutes plagiarism. Instead, AI tools should be employed to assist with tasks such as article recommendations, summaries, and brainstorming, with the academic narrative itself crafted by the author.

Authors share a responsibility to avoid misconduct, and institutions such as Cambridge University emphasize the importance of proper referencing. Educators have an even more prominent role: engaging students in meaningful discussions about the line between assistance and appropriation, emphasizing proper attribution, and preserving academic integrity in the face of AI advancements. They must evolve their teaching strategies to address the ethical implications of AI usage, fostering a nuanced understanding of what constitutes plagiarism and ensuring that students are equipped to navigate these new challenges responsibly (Chan, 2023). Institutions should also consider curricular components that specifically address the implications of AI for authorship, exploring questions of originality and the moral responsibilities involved in using such technologies in academic contexts, thereby preparing students for the ethical dilemmas they may encounter in the evolving landscape of professional creativity and responsibility. Schools and universities have to create supportive environments that highlight the potential risks of AI use and promote critical thinking and ethical decision-making among students, ensuring they understand the long-term consequences of cheating and its implications in professional and societal contexts.

 

 

» Provide feedback for this chapter: https://forms.gle/tNyWC1HYMsP46t6WA
« Get back to main page: digital-pedagogy.eu/Guidelines

An open access guide. A perfectible product, for an evolving reality.

You can use the form available on this page to provide feedback and/or suggestions.
For social annotations on any chapter, you can use Hypothesis or any other similar tool.
You can also send direct feedback to: olimpius.istrate@iEdu.ro | +40 722 458 000

 

Guidelines for online and blended learning
Available online: https://digital-pedagogy.eu/Guidelines
Full pdf version to download: Guidelines (version 6)

The Romanian partner in the D-ChallengHE project in charge of WP5 is
the Institute for Education (Bucharest): https://iEdu.ro
Contact: office@iEdu.ro
