ChatGPT and Teacher Education

Announcements

  1. AERA Meet-Up this Friday!: We are holding an in-person Civics of Technology meet-up at the American Educational Research Association (AERA) Annual Meeting in Chicago at D4 Irish Pub & Cafe (345 E Ohio St, Chicago, IL 60611, USA) on Friday, April 14th, 2023 from 5-7pm CDT. Check our Twitter account for updates. See you there!

  2. Next Book Club: We are reading Sasha Costanza-Chock’s 2020 book, Design Justice: Community-Led Practices to Build the Worlds We Need. We will meet to discuss the book at 8pm EDT on Thursday, May 18th, 2023. Register on our Events page.

  3. Next “Talking Tech” Monthly Meeting on 05/02: We hold a “Talking Tech” event on the first Tuesday of every month from 8-9pm EDT/7-8pm CDT/6-7pm MDT/5-6pm PDT. These Zoom meetings include discussions of current events, new books or articles, and more. Participants can bring topics, articles, or ideas to discuss. Join us on Tuesday, May 2nd for our next Talking Tech. Register on our Events page.

by Jacob Pleasants & Jeffrey Radloff

On November 30, 2022, OpenAI released the ChatGPT chatbot to the public, unleashing a torrent of attention and commentary. The underlying technologies it employs – natural language processing and language models (OpenAI, 2023) – are not new. Over a decade ago, IBM’s Watson used those AI approaches to win the Jeopardy! challenge (Baker, 2012). Language models are used in many public-facing technologies, such as personal assistants (e.g., Amazon’s Alexa, Apple’s Siri, and Google Assistant) and the chatbots that companies use for customer service (Adamopoulou & Moussiades, 2020). They have also entered educational spaces through essay scoring systems (Ramesh & Sanampudi, 2022; Schneider et al., 2022) and automated feedback systems (Chen et al., 2020).

ChatGPT nevertheless represents a significant departure from its predecessors. For one, it is publicly available, at least at the time of writing. More striking is its ability to parse natural language prompts and provide a wide array of human-like responses. It can provide reasonable answers to a great variety of questions (though certainly not all; Christian, 2023) and, even more impressively, compose entire essays in an array of styles (e.g., rap lyrics in the style of Snoop Dogg; Zahn, 2022). Moreover, much to the consternation of many educators, the text it generates gives few overt signs that a machine created it. ChatGPT might not employ radically new AI processes, but it does demonstrate what can happen when enormous computing resources and a staggering amount of data are brought to bear.

Since its release, ChatGPT has garnered a great deal of attention in education circles. Much of the initial discourse focused on the possibility of students using the chatbot to cheat (Rosenblatt, 2022). Its capabilities led some scholars to herald the death of English education (Herman, 2022) and essay writing (Marche, 2022) as we know them. For their part, colleges and universities scrambled to take action (Huang, 2023), and school districts such as New York City banned ChatGPT outright (Elsen-Rooney, 2023). These conversations and developments are the latest entries in what seems like an ever-escalating arms race between ‘academic dishonesty’ on one side and ‘methods of detection and deterrence’ on the other. As technologies have given students novel ways to evade the mental workload of school (e.g., using the internet to search for information, using software to solve mathematical problems), new technologies have been developed to surveil and constrain them (e.g., Respondus, 2023) – though often with problematic results (Flaherty, 2020, 2021; Grajek, 2020). True to form, ChatGPT is already spawning its own wave of countermeasures (GPTZero, 2023; Rosalsky & Peaslee, 2023).

The initial panic has calmed (somewhat), and educators have begun to write about ways ChatGPT can be employed for more positive educational purposes (e.g., Ferlazzo, 2023; Shields, 2023). Beyond cataloging positive and negative use cases, some commentators have also brought important critical perspectives to bear (e.g., Singer, 2023). For instance, in a recent blog post, Autumm Caines addressed critical ethical questions about students’ use of ChatGPT, providing suggestions for how to engage students in critical conversations about what it means to use the technology responsibly. Unfortunately, such nuanced and critical perspectives remain rare in the educational discourse.

As teacher educators working with pre-service and in-service public school teachers, we have watched uneasily as this discourse has unfolded. We are not particularly concerned about how our teacher education students might use ChatGPT in our classes. Nevertheless, that does not mean we should ignore it. Even though ChatGPT is unlikely to remain freely accessible, it will not be the last generative AI of its kind. We therefore need to think carefully about how to prepare teachers for an educational landscape in which AI technologies, from generative chatbots to scoring algorithms, will only become more powerful and ubiquitous.

When working with teachers, we want to avoid conveying that ChatGPT is a threat to the integrity of education against which a defense must be mounted. Treating it this way feeds into reactive conversations about the cheating/detection arms race, which will likely be of little lasting value. At the same time, we do not want to simply outline different ways to use ChatGPT for more positive educational purposes. Doing so would imply that it is just a “neutral tool” that can be put to various uses. Instead, we consider ChatGPT an opportunity to develop teachers’ critical thinking about novel educational technologies. Its provocative nature creates a valuable and timely invitation to conduct the kinds of critical inquiries into educational technology that are all too rare in teacher education programs (Bradshaw, 2017; Heath et al., 2023; Krutka et al., 2019, 2022).

Technical Capabilities and Uses Are a Starting Point, Not an Ending Point

- If you ask ChatGPT a question, how accurate is its response?

- What kinds of questions does it answer well, and which does it answer poorly?

- What genres of writing can it compose, and how well?

- To what extent are its responses distinguishable from a human's?

When we encounter new educational technologies, we often begin with these kinds of questions. We treat them as tools and ask about their technical capabilities and potential uses (Selwyn, 2010). For example, we might imagine how a student could use ChatGPT when completing school tasks such as homework assignments or essays. Naturally, this provokes anxiety, given that ChatGPT is not just assistive but can efficiently complete many assignments for the student. However, we can also imagine more positive use cases: a student could use it as an editor and collaborator rather than a surrogate writer, or as a starting point for learning about a new topic (Ferlazzo, 2023; Pavlik, 2023).

Assessing a technology's technical capabilities and use cases is a fine place to begin. Unfortunately, all too often, this is as far as analyses of educational technologies go before judgments are made and actions are taken (Selwyn, 2010). In the case of ChatGPT, the response has generally been to react to the harmful use cases (cheating) by constraining its use (Elsen-Rooney, 2023) or adjusting assignment and assessment protocols (Huang, 2023). The issue is that viewing technologies (educational or otherwise) merely as value-neutral tools with ‘good’ and ‘bad’ uses is a profoundly limiting perspective (Martin et al., 2020; Pleasants et al., 2019; Spector, 2016). Understanding a technology's uses and technical capabilities is necessary but woefully insufficient for understanding its effects (Bonk & Wiley, 2020).

One crucial idea that the “tool” perspective misses is how, in using any technology, we are changed by it in ways that are unintended, unexpected, and often unnoticed (Carr, 2014; Ihde, 1990, 1998; Verbeek, 2005). Communication technologies from smartphones to email reshape work and social life in rarely anticipated ways (Newport, 2021; Turkle, 2012). Even “mundane” technologies change how we perceive and act in the world. When we sit behind the wheel of an automobile, we can seemingly become an entirely different person, often one who is more belligerent and impatient. Other people, especially those on bicycles, become little more than obstacles and impediments (Delbosc et al., 2019). Paradoxically, the technical capabilities of cars to move us around more rapidly are often accompanied by heightened feelings that we are never moving as fast as we want to be (Burkeman, 2021).

Also missing from the “tool” perspective is the recognition that technologies are part of broader social and technical systems. As elements of those systems, technologies inevitably interact with social values and beliefs, and those interactions strongly influence what technologies do and what effects they have (Feenberg, 2010; Kranzberg, 1986; Van de Poel & Kroes, 2014). Because of those interactions, the benefits and costs of seemingly neutral technologies end up distributed inequitably and disproportionately (Benjamin, 2019; O’Neil, 2017). For instance, critical studies of computer algorithms have shown how they reinscribe and reinforce societal biases and inequities, despite their ostensible impartiality (Bogina et al., 2021; Buolamwini & Gebru, 2018; Eubanks, 2017; Noble, 2018).

What is true for technology in general is also true for educational technologies. When educational technologies are introduced into a classroom, the changes they bring are both additive and ecological (Garcia & Nichols, 2021; Nichols & Garcia, 2022). Not all of those changes will be intentional, and while some might be desirable, others will be less so (Ciccone, 2022; Dixon-Román et al., 2020; Macgilchrist, 2021). Educational technologies encourage and facilitate specific interactions and learning activities while discouraging others (Manolev et al., 2019; Teräs et al., 2020; Witzenberger & Gulson, 2021). Many of these complex ecological effects go unanticipated because educational technologies are not recognized as belonging to broader social and technical systems. The values embedded in those systems influence how developers design educational technologies, how they are utilized, and how they affect students and teachers (Garcia & Nichols, 2021; Selwyn & Bulfin, 2016; Watters, 2020). Even unintentionally, educational technologies will be shaped by, and often reinforce, inequities embedded in those broader systems (Heath & Segal, 2021; Heath et al., 2023; Raji et al., 2020; Resta et al., 2018).

These are the kinds of understandings that we need to instill in our current and future teachers. To slightly modify the words of historian of technology Melvin Kranzberg (1986), teachers need to see that “[Educational] technology is neither good nor bad; nor is it neutral” (p. 545).

Enter ChatGPT

In many ways, ChatGPT provides an ideal object of study and point of entry into the broader, more perceptive ways of thinking about educational technology described above. ChatGPT is so well suited for this task primarily because its effects on education are anything but subtle. With more mundane, everyday educational technologies such as PowerPoint or iPads, the ecological impacts on the classroom environment can be harder to recognize. While it may not be entirely clear what ChatGPT will do to education, it is much more apparent that its effects will be substantial and far-reaching. The fact that it has already spurred changes in so many classrooms (Huang, 2023) is evidence of its potential to transform teaching and learning (and not necessarily in desirable ways).

Yet even though the power of ChatGPT is evident, much of the discourse around it remains confined to “tool” perspectives. In holding up ChatGPT as an object of inquiry, we need to help teachers think about it in ways that go beyond listing its potential use cases. We especially need to help teachers go beyond merely discussing how to react to the specter of ChatGPT-based cheating. The questions below could be used to initiate conversations about ChatGPT that attend to those broader ideas:

1. Even if ChatGPT is free to use (for now), what costs are associated with using it? Who bears those costs?

2. What kinds of uses does ChatGPT encourage? Which does it make more difficult?

3. If students were to frequently use ChatGPT when completing assignments, how might that change the way they engage with school tasks more generally? How might it change the way that they interact with teachers and classmates?

4. How might using ChatGPT change the way that students and teachers think about the nature of writing? About the nature of knowledge? About the nature of learning?

5. What are some of the characteristics of our educational systems that make ChatGPT so threatening? How could those systems be set up so that ChatGPT would not be so threatening?

6. What and whose values are reflected in how ChatGPT was designed?

7. What societal values and biases are likely to exist in the data on which ChatGPT was trained? How might those values and biases manifest themselves when using it?

8. What educational biases and inequities might ChatGPT reinforce?

9. What would an AI system look like that was more aligned with your values as an educator?

10. What societal inequities does ChatGPT support or perpetuate, and how can its fairness and transparency be improved?

These questions are only starting points for deeper critical conversations about ChatGPT. Although many of them take a skeptical stance (Krutka et al., 2022), that does not mean the inquiries must result in pessimistic or negative evaluations. Not all of the transformative changes that ChatGPT might bring are undesirable. It can, for instance, force educators to rethink their educational objectives and assessments. If the culminating assignment for a class is something that ChatGPT can complete, what does that say about the assignment? What does that say about the educational goals of that class? By rendering certain classroom tasks trivial, ChatGPT can reveal a triviality that was already present. When we prepare students merely to complete the kinds of tasks that ChatGPT can do, we do little more than train those students to be machines (and be machined).

ChatGPT, though, is just a single point of entry into conversations and inquiries that ought to be had about any educational technology. We should help teachers apply the ways of thinking they used for ChatGPT to other influential educational technologies. Teachers might, for instance, analyze other AI systems that have made their way into educational spaces. AI-powered “intelligent tutoring systems” (Ma et al., 2014) continue to be promoted as tools for individualized learning (Chen et al., 2020; Tetzlaff et al., 2021). Instead of narrowly examining their technical capabilities, teachers might analyze the values and beliefs built into those systems. Clearly, personalized AI tutors were not designed from sociocultural perspectives on learning (e.g., Gutiérrez & Rogoff, 2003; Vygotsky, 1978). They instead embody a behaviorist view of learning, one that imagines it as a reinforcement-driven process of knowledge and skill acquisition that can presumably be made more efficient by using data to deploy “effective” teaching techniques (Biesta, 2010; Watters, 2021; Witzenberger & Gulson, 2021). They are tied to larger social systems that foreground data as an educational value (Nichols & Garcia, 2022; Selwyn, 2015; Selwyn et al., 2021). Recognizing those systems and values will raise teachers’ awareness of the kinds of impacts these technologies might bring.

Critical examinations need not and should not be limited to “cutting-edge” AI technologies. More everyday educational technologies, from whiteboards to learning management platforms to classroom furniture, are worthy of critical examination (Nichols & Garcia, 2022; Selwyn, 2023; Smith et al., 2005). As teacher educators, our goal is not to focus only on the latest educational technologies but to provide teachers with the conceptual tools to think critically and make informed decisions about educational technologies in general (Bradshaw, 2017; Heath et al., 2023; ISTE, 2018; Krutka et al., 2022). AI technologies like ChatGPT and others (e.g., facial recognition software and art generators) are starting points, not ending points, for critical thinking.

References

Adamopoulou, E., & Moussiades, L. (2020, June). An overview of chatbot technology. In Proceedings of IFIP International Conference on Artificial Intelligence Applications and Innovations (pp. 373-383). Springer. https://doi.org/10.1007/978-3-030-49186-4_31

Baker, S. (2012). Final Jeopardy: The story of Watson, the computer that will transform our world. Harper Collins.

Benjamin, R. (2019). Race after technology: Abolitionist tools for the new Jim Code. Polity.

Biesta, G. J. (2010). Why ‘what works’ still won’t work: From evidence-based education to value-based education. Studies in Philosophy and Education, 29(5), 491-503.

Bogina, V., Hartman, A., Kuflik, T., & Shulner-Tal, A. (2021). Educating software and AI stakeholders about algorithmic fairness, accountability, transparency and ethics. International Journal of Artificial Intelligence in Education, 32, 808-833. https://doi.org/10.1007/s40593-021-00248-0

Bonk, C. J., & Wiley, D. A. (2020). Preface: Reflections on the waves of emerging learning technologies. Educational Technology Research and Development, 68(4), 1595-1612.

Bradshaw, A.C. (2017). Critical pedagogy and educational technology. In A. D. Benson, R. Joseph, & J. L. Moore (Eds.), Culture, learning and technology: Research and practice (pp. 8–27). Routledge.

Buolamwini, J., & Gebru, T. (2018, January). Gender shades: Intersectional accuracy disparities in commercial gender classification. In Proceedings of Machine Learning Research Conference on Fairness, Accountability, and Transparency (pp. 77–91). PMLR.

Burkeman, O. (2021). Four thousand weeks: Time management for mortals. Farrar, Straus and Giroux.

Chen, L., Chen, P., & Lin, Z. (2020). Artificial intelligence in education: A review. IEEE Access, 8, 75264-75278.

Christian, J. (2023, February 9). Magazine publishes serious errors in first AI-generated health article. Neoscope. https://futurism.com/neoscope/magazine-mens-journal-errors-ai-health-article

Ciccone, M. (2022). Surveillance and the edtech imaginary via the mundane stuff of schooling. In B. S. De Abreu (Ed.), Media Literacy, Equity, and Justice (pp. 197–205). Routledge.

Delbosc, A., Naznin, F., Haslam, N., & Haworth, N. (2019). Dehumanization of cyclists predicts self-reported aggressive behaviour toward them: A pilot study. Transportation Research Part F: Traffic Psychology and Behaviour, 62(4), 681–689.

Dixon-Román, E., Nichols, T. P., & Nyame-Mensah, A. (2020). The racializing forces of/in AI educational technologies. Learning, Media and Technology, 45(3), 236-250.

Elsen-Rooney, M. (2023, January 3). NYC education department blocks ChatGPT on school devices, networks. Chalkbeat New York. https://ny.chalkbeat.org/2023/1/3/23537987/nyc-schools-ban-chatgpt-writing-artificial-intelligence

Eubanks, V. (2017). Automating inequality: How high-tech tools profile, police, and punish the poor. St. Martin’s Press.

Feenberg, A. (2010). Ten paradoxes of technology. Techné: Research in Philosophy and Technology, 14(1), 3–15.

Ferlazzo, L. (2023, January 18). 19 ways to use ChatGPT in your classroom. Education Week. https://www.edweek.org/teaching-learning/opinion-19-ways-to-use-chatgpt-in-your-classroom/2023/01

Flaherty, C. (2020, May 11). Big proctor. Inside Higher Ed. https://www.insidehighered.com/news/2020/05/11/online-proctoring-surging-during-covid-19

Flaherty, C. (2021, February 1). U of Illinois says goodbye to Proctorio. Inside Higher Ed. https://www.insidehighered.com/news/2021/02/01/u-illinois-says-goodbye-proctorio

Garcia, A., & Nichols, T. P. (2021). Digital platforms aren’t mere tools—they’re complex environments. Phi Delta Kappan, 102(6), 14-19.

GPTZero (2023). GPTZero [computer software]. Retrieved from http://gptzero.me/

Grajek, S. (2020, April 10). EDUCAUSE COVID-19 QuickPoll Results: Grading and Proctoring. EDUCAUSE Review. https://er.educause.edu/blogs/2020/4/educause-covid-19-quickpoll-results-grading-and-proctoring#fn2

Gutiérrez, K. D., & Rogoff, B. (2003). Cultural ways of learning: Individual traits or repertoires of practice. Educational Researcher, 32(5), 19-25.

Heath, M. K., & Segal, P. (2021). What pre-service teacher technology integration conceals and reveals: “Colorblind” technology in schools. Computers & Education, 170, 1–9.

Heath, M., Asim, S., Milman, N., & Henderson, J. (2023). Confronting tools of the oppressor: Framing just technology integration in educational technology and teacher education. Contemporary Issues in Technology and Teacher Education, 22(4).

Herman, D. (2022, December 9). ChatGPT will end high-school English. The Atlantic. https://www.theatlantic.com/technology/archive/2022/12/openai-chatgpt-writing-high-school-english-essay/672412/

Huang, K. (2023, January 16). Alarmed by A.I. chatbots, universities start revamping how they teach. The New York Times. https://www.nytimes.com/2023/01/16/technology/chatgpt-artificial-intelligence-universities.html

Ihde, D. (1990). Technology and the lifeworld: From garden to earth. Indiana University Press.

Ihde, D. (1998). Expanding hermeneutics: Visualism in science. Northwestern University Press.

International Society for Technology in Education. (2018). ISTE standards for teachers. ISTE.

Kranzberg, M. (1986). Technology and history: Kranzberg’s laws. Technology and Culture, 27(3), 544-560.

Krutka, D.G., Heath, M.K., & Willet, K.B.S. (2019). Foregrounding technoethics: Toward critical perspectives in technology and teacher education. Journal of Technology and Teacher Education, 27(4), 555-574.

Krutka, D. G., Heath, M. K., & Smits, R. M. (2022). Toward a civics of technology. Journal of Technology and Teacher Education, 30(2), 229-237.

Ma, W., Adesope, O. O., Nesbit, J. C., & Liu, Q. (2014). Intelligent tutoring systems and learning outcomes: A meta-analysis. Journal of Educational Psychology, 106(4), 901–918. https://doi.org/10.1037/a0037123

Macgilchrist, F. (2021). What is ‘critical’ in critical studies of edtech? Three responses. Learning, Media and Technology, 46(3), 243-249.

Manolev, J., Sullivan, A., & Slee, R. (2019). The datafication of discipline: ClassDojo, surveillance and a performative classroom culture. Learning, Media and Technology, 44(1), 36–51.

Marche, S. (2022, December 6). Will ChatGPT kill the student essay? The Atlantic. https://www.theatlantic.com/technology/archive/2022/12/chatgpt-ai-writing-college-student-essays/672371/

Martin, F., Dennen, V. P., & Bonk, C. J. (2020). A synthesis of systematic review research on emerging learning environments and technologies. Educational Technology Research and Development, 68(4), 1613-1633.

Newport, C. (2021). A world without email: Reimagining work in an age of information overload. Penguin.

Nichols, T. P., & Garcia, A. (2022). Platform studies in education. Harvard Educational Review, 92(2), 209–230.

Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York University Press.

O'Neil, C. (2017). Weapons of math destruction: How big data increases inequality and threatens democracy. Crown.

OpenAI (2022, November 30). ChatGPT: Optimizing language models for dialogue. OpenAI Blog. https://openai.com/blog/chatgpt/

OpenAI (2023). ChatGPT [computer software]. Retrieved from https://chat.openai.com/chat

Pavlik, J. V. (2023). Collaborating with ChatGPT: Considering the implications of generative artificial intelligence for journalism and media education. Journalism & Mass Communication Educator. https://doi.org/10.1177/10776958221149577

Pleasants, J., Clough, M. P., Olson, J. K., & Miller, G. (2019). Fundamental issues regarding the nature of technology: Implications for STEM education. Science & Education, 28, 561–597.

Raji, I. D., Gebru, T., Mitchell, M., Buolamwini, J., Lee, J., & Denton, E. (2020, February). Saving face: Investigating the ethical concerns of facial recognition auditing. In Proceedings of the AAAI/ACM Conference on AI, Ethics, and Society (pp. 145–151).

Ramesh, D., & Sanampudi, S. K. (2022). An automated essay scoring systems: a systematic literature review. Artificial Intelligence Review, 55(3), 2495-2527.

Rapp, A., Curti, L., & Boldi, A. (2021). The human side of human-chatbot interaction: A systematic literature review of ten years of research on text-based chatbots. International Journal of Human-Computer Studies, 151, 102630.

Respondus (2023). Respondus Monitor [computer software]. Retrieved from https://web.respondus.com/he/monitor/

Resta, P., Laferrière, T., McLaughlin, R., & Kouraogo, A. (2018). Issues and challenges related to digital equity: An overview. In J. Voogt et al. (Eds.) Second handbook of information technology in primary and secondary education (pp. 987–1001). Springer.

Rosalsky, G., & Peaslee, E. (2023, January 17). This 22-year-old is trying to save us from ChatGPT before it changes writing forever. NPR. https://www.npr.org/sections/money/2023/01/17/1149206188/this-22-year-old-is-trying-to-save-us-from-chatgpt-before-it-changes-writing-for

Rosenblatt, K. (2022, December 7). ChatGPT can generate an essay. But can it generate an “A”? NBC News. https://www.nbcnews.com/tech/chatgpt-can-generate-essay-generate-rcna60362

Schneider, J., Richner, R., & Riser, M. (2022). Towards trustworthy autograding of short, multi-lingual, multi-type answers. International Journal of Artificial Intelligence in Education, 33(3), 88-118. https://doi.org/10.1007/s40593-022-00289-z

Selwyn, N. (2010). Looking beyond learning: Notes towards the critical study of educational technology. Journal of Computer Assisted Learning, 26(1), 65–73.

Selwyn, N. (2015). Data entry: Towards the critical study of digital data and education. Learning, Media and Technology, 40(1), 64–82.

Selwyn, N. (2023). The modern classroom chair: Exploring the ‘coercive design’ of contemporary schooling. Power and Education, 17577438231163043.

Selwyn, N., & Bulfin, S. (2016). Exploring school regulation of students’ technology use–rules that are made to be broken? Educational Review, 68(3), 274–290.

Selwyn, N., Pangrazio, L., & Cumbo, B. (2021). Knowing the (datafied) student: The production of the student subject through school data. British Journal of Educational Studies, 1-17.

Shields, C. (2023, January 5). Don’t ban ChatGPT. Use it as a teaching tool. Education Week. https://www.edweek.org/technology/opinion-dont-ban-chatgpt-use-it-as-a-teaching-tool/2023/01

Singer, N. (2023, February 6). At this school, computer science class now includes critiquing chatbots. The New York Times. https://www.nytimes.com/2023/02/06/technology/chatgpt-schools-teachers-ai-ethics.html

Smith, H. J., Higgins, S., Wall, K., & Miller, J. (2005). Interactive whiteboards: Boon or bandwagon? A critical review of the literature. Journal of Computer Assisted Learning, 21(2), 91–101.

Spector, J. M. (2016). Ethics in educational technology: Towards a framework for ethical decision making in and for the discipline. Educational Technology Research and Development, 64(5), 1003-1011.

Teräs, M., Suoranta, J., Teräs, H., & Curcher, M. (2020). Post-Covid-19 education and education technology ‘solutionism’: A seller’s market. Postdigital Science and Education, 2(3), 863-878.

Tetzlaff, L., Schmiedek, F. & Brod, G. (2021) Developing personalized education: A dynamic framework. Educational Psychology Review, 33(3), 863–882. https://doi.org/10.1007/s10648-020-09570-w

Turkle, S. (2012). Alone together: Why we expect more from technology and less from each other. Basic Books.

Van de Poel, I., & Kroes, P. (2014). Can technology embody values? In P. Kroes, & P. P. Verbeek (Eds.), The moral status of technical artefacts (pp. 103–124). Springer.

Verbeek, P. P. (2005). What things do: Philosophical reflections on technology, agency, and design. Penn State University Press.

Vygotsky, L. (1978). Mind in society: The development of higher psychological processes. Harvard University Press.

Watters, A. (2020, November 5). Ed-tech and trauma. Hack Education. http://hackeducation.com/2020/11/05/trauma

Watters, A. (2021). Teaching machines: The history of personalized learning. MIT Press.

Witzenberger, K., & Gulson, K. N. (2021). Why EdTech is always right: Students, data and machines in pre-emptive configurations. Learning, Media and Technology, 46(4), 420–434. https://doi.org/10.1080/17439884.2021.1913181

Zahn, M. (2022, December 9). What is ChatGPT, the artificial intelligence text bot that went viral? ABC News. https://abcnews.go.com/Technology/chatgpt-artificial-intelligence-text-bot-viral/story?id=94857599
