EPALE - Electronic Platform for Adult Learning in Europe


Not Just Another Tool: Why Adult Education Needs a Clear Stance on AI

Keynote contribution at the Austrian EPALE and ERASMUS+ Adult Education Conference 2025

 

Workshops at the EPALE Theme Conference 2025.

Author: Birgit Aschemann

Since early 2023, the widespread use of generative AI - particularly text generators - has triggered a profound cultural shift. With the rise of ChatGPT and similar tools, chatbot usage has become deeply embedded in creative and writing processes. Their integration is transforming established work routines, the way we engage with language, our expectations around information, and, consequently, the epistemic and cultural foundations of education.

AI – More Than Just a New Tool

The new culture of writing shaped by the proliferation of AI chatbots is characterized by a lowered threshold for text production. Writing is increasingly being replaced by curating and editing, while the practice of “thinking through writing” is receding into the background. AI chatbots encourage a smooth, algorithmically optimized use of language.

AI now generates emails, social media posts, and chat messages - reshaping trust in digital communication. At the same time, AI chatbots are setting new standards for service orientation, perceived “friendliness,” and “patience.”

But AI also influences published content. Text generators operate on probabilities, which can cause information to lose its authoritative character. They make implicit selections and define relevance - unless explicitly steered through careful prompting. Moreover, AI models are increasingly trained on AI-generated content, reinforcing existing errors and biases over time.

Implications for Education

A 2025 study[1] found that by the end of 2024, approximately 24% of all press releases were written using AI. Barbara Geyer, professor and program director at the University of Applied Sciences Burgenland, noted on LinkedIn: “We are currently educating for a world that no longer exists. In professional practice, AI tools are already standard.”

A Europe-wide study conducted by the Vodafone Foundation[2] in 2024 revealed that two-thirds of students consider access to AI essential for academic success—and that they are mostly learning these skills from their peers.

AI usage among university students is even more widespread. A 2025 study by the University of Darmstadt found that nearly 92% of students in Germany use tools like ChatGPT for text analysis and production, research, and clarifying questions.

However, this does not mean that educational practices are aligned with the realities of the modern workforce. AI use remains error-prone and is often carried out without sufficient guidance or critical reflection. 

(Perhaps) the Most Important AI Competence

Given the rise of AI chatbots as text generators, perhaps the most essential AI competence today is the ability to use them purposefully and in accordance with their function. The simulation https://moebio.com/mind/ illustrates this well by showing how ChatGPT completes a sentence like “Intelligence is...” in countless ways - each continuation unique and semantically distinct. The results highlight the variability and contextual sensitivity of AI-generated language.
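The probabilistic behaviour illustrated by the simulation can be sketched in a few lines of Python. The word table below is invented purely for illustration (it is a toy stand-in, not a real language model): each next word is sampled from a probability distribution, so the same prompt yields different continuations on repeated runs.

```python
import random

# Toy illustration only - not a real LLM. A text generator picks each next
# word by sampling from a probability distribution over candidates, which is
# why one prompt can produce many semantically distinct continuations.
# The words and probabilities here are invented for demonstration.
NEXT_WORD = {
    "Intelligence is": {"the": 0.35, "a": 0.30, "not": 0.20, "adaptive": 0.15},
}

def sample_next(prompt: str, rng: random.Random) -> str:
    """Pick one continuation word, weighted by its assigned probability."""
    dist = NEXT_WORD[prompt]
    words = list(dist)
    weights = [dist[w] for w in words]
    return rng.choices(words, weights=weights, k=1)[0]

# Sampling the same prompt ten times shows the variability directly:
# the draws differ even though the prompt never changes.
rng = random.Random(42)
samples = [sample_next("Intelligence is", rng) for _ in range(10)]
print(samples)
```

Real systems like ChatGPT do the same thing at vastly greater scale, over tens of thousands of tokens with learned (not hand-written) probabilities - which is exactly why a knowledge question can receive a fluent but different answer each time.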

Because answers are generated in this way, knowledge-based questions cannot be expected to yield reliable results. AI tools like ChatGPT can deliver convincingly worded responses even to nonsensical prompts - for example, “Explain the term pedagogical double burger” or “Is the pedagogical double burger a synonym for the pedagogical hot dog?” Such replies may sound plausible but lack factual grounding, highlighting the need for critical evaluation of AI-generated content.

The key competence lies in knowing when it is sufficient to generate content - for example, for inspiration or brainstorming - and when proper research is required to obtain fact-based, reliable information. In such cases, AI-assisted search tools like Perplexity or chatbots with web access - where original sources are provided and can be critically evaluated - offer a more responsible use of AI. This kind of awareness must become a core element of basic digital education on a broad scale. The AI Act, with its emphasis on competence development, reinforces this imperative.

What the AI Act Demands from Adult Education

The AI Act came into force in August 2024, with various provisions being implemented gradually over the following years. Since February 2025, the provision on AI competence has been in effect. It reads as follows:

“Providers and deployers of AI systems shall take measures to ensure, to the best of their ability, that their staff and other persons acting on their behalf who are involved in the operation and use of AI systems possess an adequate level of AI competence, taking into account their technical knowledge, experience, education and training, the context in which the AI systems are to be used, and the persons or groups on whom the AI systems will be used.”[3]

This requirement applies to all AI systems - regardless of their risk level. Under the AI Act, a “deployer” is defined as any individual or organization that uses an AI system under their own responsibility. This means that if an educational institution decides to integrate tools like Microsoft Copilot, ChatGPT, or Mistral into its courses, it qualifies as a deployer and is therefore responsible for ensuring that its staff possesses the necessary level of AI competence.

The AI Act defines AI competence as follows: “The abilities, knowledge, and understanding that enable providers, deployers, and affected individuals, considering their respective rights and obligations under this Regulation, to use AI systems knowledgeably and to be aware of the opportunities, risks, and potential harm they may cause.”

At its core, AI competence is about informed and responsible use - recognizing opportunities and risks, and preventing harm.

In practice, inadequate or improper use of AI can lead to liability claims or administrative penalties for non-compliance with the AI Act or the General Data Protection Regulation (GDPR).

In each case, “the context in which the AI systems are to be used, as well as the individuals or groups they are intended for, must be taken into account”. This means that AI competence in adult education differs from that in a PR agency, a law firm, or a manufacturing company – and AI competence must be specifically defined for adult education, with a focus on informed application and risk awareness.

It is clear that adult education is called upon to take an active stance - especially at the level of individual institutions. Possible approaches include targeted training for staff, as well as written guidelines or collectively shared minimum standards for the use of AI. However, as of early 2025, a large proportion of providers have yet to address this challenge in a systematic way.

Recommendations for a Proactive Approach in Adult Education

A new practice of individual learning is currently emerging: more and more people are learning with chatbots. The know-how required to do this effectively must be actively integrated into educational programs and course design. This includes not only technical handling but also critical reflection on the strengths and limitations of AI-supported learning.

At the same time, it is essential to create spaces for discourse that allow us to reflect on this cultural shift - so that the transformation brought about by AI can be consciously acknowledged, discussed, and actively shaped.

In addition, entirely new applications with transformative potential for educational practice are emerging at a rapid pace. Just to name a few: agent-based systems, multilingual avatars, computer vision integrated into AI chatbots, and the rise of audio-based chatbot interactions. These innovations need to be piloted, researched, and - where appropriate - scaled up. A crucial starting point is the development of targeted professional learning opportunities for adult education practitioners.

A simple yet important contribution within courses is to avoid mystifying AI - by not presenting it as a subject with human-like qualities or illustrating it with typical anthropomorphic images. This applies both to visual materials and to the language used when talking about AI. Normalising AI as a tool, rather than a being, helps foster critical distance and informed engagement.

Competence-oriented didactics involving AI are equally important. A well-designed AI-related learning task should require human intelligence. For instance, educators can ask learners to guide a chatbot in addressing a specific subject-related question. This not only fosters logical thinking and prompting skills, but also deepens engagement with the actual content - bridging technological and subject-matter competence.

An AI Strategy Is Essential

Ideally, all of these activities should be embedded at the institutional level within a provider-specific AI strategy. While each institution or association must ultimately develop such a strategy itself, there are key principles derived from the AI Act that apply across the board; these are included in the AI guidelines for SMEs published by the Austrian Economic Chamber (WKO). Establishing such strategies is not optional - it is essential for ensuring responsible, competent, and future-oriented use of AI in adult education.

For educational providers, it is also essential to define their position on AI based on their own mission statement and to formulate AI-specific goals - both internally and for their educational programs. At the heart of any such AI strategy lie minimum standards for the use of AI in teaching and learning processes. These standards provide clarity, ensure responsible implementation, and support a shared understanding among staff and learners.

These minimum standards are aimed at educators and should ideally be accompanied by informational materials and professional development opportunities. Finally, a comprehensive AI strategy should also include appropriate measures for personnel development or recruitment, supported by clear guidelines tailored to the different professional roles within the institution.

A learning path for developing an AI strategy is offered by the EBmooc 2025, which will be published starting in September 2025. The course supports the creation of provider-specific guidelines through structured reflection questions and ready-to-use materials—helping institutions take practical, well-informed steps toward AI readiness in adult education.

[1] Liang/Zhang/Codreanu/Wang/Cao/Zou (2025): The Widespread Adoption of Large Language Model-Assisted Writing Across Society. Available online: https://arxiv.org/pdf/2502.09747 (18.04.2025)

[3] AI Act, see https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=OJ:L_202401689#d1e39-1-1

About the Author:

Birgit Aschemann is a psychologist and educational scientist who has been working at the CONEDU Institute for around 10 years, where she leads the area of Digital Professionalisation. In addition, she has been a part-time lecturer for future educators at the University of Graz for around 25 years. She considers AI to be the greatest call for independent thinking she has encountered in her professional life - both for herself and for all of us.

About this blog:

This blog post is based on a keynote at the Austrian 2025 EPALE and Erasmus+ Conference entitled "Artificial intelligence meets adult education: Promoting innovation, strengthening competences".
