When to use technology in the classroom

The introduction of technology into the classroom is not, and should not be, an end in itself. / Photo: Faisal Mehmood – Pixabay

 


 

Antoni Hernández-Fernández

 

I often leave the classroom, on foot and in thought, with a sense of contradiction. It happens whenever I become part of the machinery of the system—when I train teachers or students in the use of technology. I teach them. Ergo, I am encouraging its use. I reassure myself (or perhaps deceive myself) by thinking that, as a Design and Technology teacher, this is precisely my role. And I like to believe that, at the same time, I foster critical thinking rather than passive use. That this is my duty, I tell myself. I am not sure I succeed. Allow me, then, to clarify a few points that may be of help to colleagues—both within my field and beyond it. Or perhaps to myself.

The introduction of technology into the classroom is not, and should not be, an end in itself. Except in subjects belonging to the field of technology, in vocational education and training, or at specific points in the curriculum where technology is itself the object of study, technology is a means. It should be a tool for improving learning. In a context marked by the rapid expansion of artificial intelligence and other digital technologies, the relevant question is no longer which technology to use, but when and why to use it. This distinction is crucial if we take into account both students’ cognitive limits and the ethical, environmental and global socio-economic impacts associated with these technologies. Because AI, like other technologies and artefacts, is not neutral; it is not merely a tool (Laba, 2025).

From a pedagogical perspective, introducing technology before students have acquired the necessary conceptual frameworks is counterproductive. Far from facilitating learning, it can generate instrumental dependence, superficiality, cognitive offloading or a false sense of understanding. Would it ever occur to us to give a calculator to pupils who do not yet know how to multiply, before they have learned their times tables? Or to give generative AI to students to create an outline, write a text or produce a video before they have learned how to do these things themselves?

Technology used merely as a substitute kills learning. As B. F. Skinner already argued in 1970, abandoning a student in front of a technology is tantamount to giving up on teaching; it is encouraging them to learn alone, without being taught. Technology—AI chatbots, for example—should be reserved for moments when the student is genuinely alone, with no one else to turn to. And it is precisely there, in the solitude of the learner without human support, that we must teach—especially in the field of Design and Technology—how to use technology properly. But timing, tempo, is crucial. Technology should appear only when it adds real value to the learning process: when the student has been properly instructed and possesses the cognitive scaffolding required to interpret its outputs critically; not as an artefact that replaces reasoning or skill, but as a conscious and situated support. And it is only natural that, at their age and given their immaturity, students are tempted by the path of least resistance, by cutting corners, by skipping the effort of learning. Forgive them, for they know not what they do. But supervise them—and be strict.

Students must be honest about their use of AI, as with any other technology, and acknowledge it to the teacher. But first they must be honest with themselves: do I actually have the knowledge? Have I learned? Or has the AI done everything while I have learned nothing? We need to create a classroom climate that opens channels of communication and allows students to feel safe enough to practise that honesty, to speak frankly. Or we can open alternative, virtual channels of communication to support more timid students. That is precisely where technology should be used: as a complementary tool to improve communication.

To this didactic dimension we must add an unavoidable ethical responsibility. In the case of generative AI, we have already seen the global disruption it causes, as illustrated by the cartography produced by Taller Estampa (2023). As that visualisation showed—partly inspired by Atlas of AI by Kate Crawford (2021)—these technologies have a significant environmental impact, involving high energy consumption, intensive use of natural resources and forms of human extractivism, as well as an uneven socio-economic impact that disproportionately affects certain regions of the world and marginalised, vulnerable and exploited groups. Ignoring this reality in the classroom amounts to training uncritical consumer-users of harmful systems whose footprint remains invisible—left in the cloud, connected and ignorant.

For this reason, when it is necessary and we decide to use technology in class, we should opt for free, public and open-source tools, trained transparently and guided by clear ethical criteria. It is not just about teaching how to use a tool, but about educating students in the consequences of its use: who developed it, with what data, at what environmental cost and with what social implications. It may be that, at present, such free tools lag slightly behind corporate ones in strictly technical terms and measurable outputs. But as has happened in other technological domains, communities and users can improve and empower them. And, in any case, for most classroom tasks we do not need the latest, most powerful version of everything. A technologist’s view.

The guiding principle should be clear: do not use AI—or any other digital technology—unless it is strictly necessary. Teaching also means knowing when to refrain. Reducing is one of the best things we can do for the environment, both in material consumption and in digital use. In many cases, century-old technologies such as writing (pencil and paper), books, conversation and slow, social reasoning—cornerstones of human communication—remain not only sufficient, but indispensable. Our symbols, cultural rituals and ancestral ways of life are not optimisation algorithms. Using technology in the classroom is a decision that must always be justified in terms of genuine learning needs, never by passing fashions of innovation or novelty, and certainly not by external pressure—whether from educational institutions themselves or from the media propaganda of large technology corporations.

And so, I continue, with my tribulations and contradictions, doubting in a way no machine ever will. I am off to prepare my next lesson: Creativity and imagination in the technological age.


References:

Crawford, K. (2021). Atlas of AI. Yale University Press.

Estampa, Taller (2023). Cartography of generative AI. https://cartography-of-generative-ai.net/

Hernández-Fernández, A. (2024). Técnicas y tecnologías útiles en el aprendizaje: de las máquinas de Skinner a la inteligencia artificial. In: Inteligencia artificial en la educación: desarrollo y aplicaciones. Madrid: OEI, pp. 22–40. https://oei.int/wp-content/uploads/2025/04/desarrollo-ia-educacion-7.pdf

Laba, N. (2025). AI is not a tool. AI & Society. https://doi.org/10.1007/s00146-025-02784-y


Source: educational EVIDENCE

Rights: Creative Commons
