Brief: AI, Consciousness, and Posthumanism – A Community-Engaged Critical Research Exploration
Abstract: This study explores the intersection of artificial intelligence (AI), consciousness, and posthumanism through a Community-Engaged Critical Research (CECR) approach. Inspired by Noemie Florant’s TEDx talk on reducing algorithmic bias in AI through youth engagement, researchers Julika von Stackelberg and Florant collaborated to explore the process of a CECR approach in a school-based, youth-led context. The research aimed to stimulate rapid consciousness-raising about AI and its implications, particularly among youth. Drawing on the World Café model for community dialogue and Critical Participatory Action Research (CPAR) principles, the dialogue engaged seven high school students in discussions about their experiences with AI, the meaning of being human, and the impact of technology on human experience. Students expressed a cautious and critical stance toward AI and technology and emphasized the need for regulation, ethical considerations, and the preservation of uniquely human traits. As a result of the discussions and the collective evaluation of the emerging themes, students reported an immediate change in their behavior and in their interactions with technology and each other: they consciously chose to disconnect from their devices to prioritize human-to-human interaction as an ongoing practice. The study highlights the importance of including youth voices in AI discussions, challenging adultism, and promoting democratic knowledge production. The CECR approach proved effective in raising consciousness and fostering community-building in an increasingly posthuman world. This research suggests potential applications in education, particularly in addressing critical sustainability issues. Future directions include expanding the project to connect college and high school students, to explore these methods further, and to collaboratively develop programs that engage the community in building resilient communities in the face of technological advancement and climate change.
Keywords: Artificial Intelligence, Consciousness Raising, Posthumanism, Ethics, Community-Engaged Critical Youth-led Research, Resilience
In her TEDx talk at Oneonta, then-high schooler Noemie Florant (2023) argues that algorithmic bias in artificial intelligence (AI) can be reduced by engaging youth in critical AI education and, consequently, building greater consciousness about accessibility, diversity, and integrity in the development of AI systems (Lee & Kwon, 2024). Given the ongoing prevalence of adultism, which marginalizes youth voices in dialogues about current issues such as climate change and the implications of AI, Florant’s argument struck a chord with Julika von Stackelberg. Von Stackelberg’s research investigates how community dialogues can stimulate rapid consciousness-raising on such urgent topics at the grassroots level, contributing to the collective development of a healing-informed sociotechnical imaginary as an ethic for posthuman and resilient communities.
What’s Love Got to Do With It? is the title of an interactive workshop and a model von Stackelberg designed for researchers to collaborate with youth like Florant, who is deeply committed to data science, ethics, and social justice. The workshop is built on recognizing the impacts of trauma as well as the mechanisms that lead to well-being, which are stimulated when the nervous system can regulate itself by feeling safe and connected in an environment that evokes the sensation of being loved and of loving others and the world (Doppelt, 2023). Following a study von Stackelberg conducted with three school districts in Orange County, NY, to identify the pillars of a healing-informed sociotechnical imaginary as a tool for change, von Stackelberg and Florant joined as co-researchers to further explore the process of this Community-Engaged Critical Research (CECR) approach in a school-based, youth-led context. The co-researchers designed a method based on the World Café model (Steier et al., 2015) while leaning on Critical Participatory Action Research (CPAR) values (Fine & Torre, 2021) and principles of phenomenology (Scharrer & Ramasubramanian, 2021) to engage seven high school students in discussing their ideas, perspectives, and lived experiences with AI.
After introducing the process, which entailed youth-led group discussions, reflections, and collective synthesizing, the students chose to explore these questions: “What is your experience with AI?” “What does being human mean to you?” “How do you see yourself related to non-human life like animals and plants?” “How do technology and artificial intelligence impact your experience of being human?” and “How can humans evolve alongside technology while maintaining and preserving the core of what it means to be human?” The responses were noted, and Florant prompted peers to expand further, which led to a lively discussion filled with philosophical reflection, laughter, wondering, confessions, and nuance.
When asked about their experience with AI, one student chuckled and said, “I used AI to write a really nice break-up message,” which led to laughter and a discussion about the value of quarrels that only come through human-to-human interaction. The students discussed using AI for homework and writing assignments, even though they were aware of frequent AI hallucinations, but they also raised more serious issues. One student shared that they had heard of robots receiving rights in another country, to which another student immediately sat back in their seat and responded, “Hold on! African Americans fought for their rights for so long, and a robot just gets it? That’s an insult!” This led to a deeper discussion about the value of humans versus robots and about economics, which one of the students summarized by commenting, “It’s not just about building, building, and building. We need to remember not all humans benefit in the same way.” Other concerns the students expressed related to how “AI makes relationships synthetic” and how “impersonation and voice AI can cause real harm with deepfakes.”
During the Harvest, the part of the World Café method used to summarize, review, and collect takeaways (Steier et al., 2015), the students highlighted their interest in spending less time online and cultivating more in-person connections. One student said, “You have to have the courage to go up and talk to people,” and others emphasized that future generations “have to know how to think for themselves.” Concerns about the types of work they would one day be doing also surfaced several times, pointing to the anxiety the students said they felt about their future.
Analysis
As the conversation drew to a close, the group took a “gallery walk” to collectively reflect on the notes written on large sheets of paper. Together, they clarified any ambiguities and identified common themes, outliers, and specific highlights. After the meeting, Florant further synthesized the notes using in vivo and axial coding, allowing specific topics to emerge. In vivo coding creates codes that stem directly from the participants’ own words in the text (Scharrer & Ramasubramanian, 2021), ensuring that the participants’ voices are centered in the research. She then shared the emerging themes with the students for feedback and to share ownership of the results.
Results
Overall, the high schoolers’ perspectives reflect a cautious and critical stance toward AI and technology, emphasizing the need for regulation, ethical considerations, and the preservation of uniquely human traits and experiences. There was a clear call for a balanced and thoughtful approach to integrating AI into society, ensuring that it supports rather than undermines human growth, mental health, and job security. The most immediate and striking outcome was revealed during a follow-up conversation: Florant reported that the students had consciously decided to turn off their phones while engaging with each other over lunch, as an ongoing practice. Having become conscious of the implications of AI in their lives, the students deliberately chose to invest in human-to-human connection rather than be distracted by devices that can create both connection and isolation.
Discussion
Florant’s argument at TEDx Oneonta (2023) highlights how often children and youth are overlooked in discussions about AI. She argues that they are often seen merely as consumers, and thus as a source of data extraction to inform the development of entertainment, technology, and artificial intelligence for profit, rather than as crucial stakeholders in shaping the future of a humanity at risk of extinction. Her argument points to the purpose of CECR, which is to improve access and ownership of outcomes and to avoid ethical pitfalls (Chapman, 2019). By building on CPAR values using the World Café model, each participant essentially became a co-researcher who wanted to collectively explore their experiences with AI as a phenomenon in their everyday lives and in the context of their community. Centering their voices made room for shifting the power dynamics reflected in the adultism that puts young people at risk of being actively excluded from these conversations. Fine and Torre (2021) describe this process as democratic knowledge production, which untangles the capitalistic values and motives behind much conventional research and centers the voices and well-being of all co-researchers.
As the follow-up revealed, part of the result was the creation of a youth-led movement that rejects being defined through online avatars, streaks, and likes, resists being used as a vehicle for data extraction, and focuses on personal connections instead. This rapid consciousness-raising method led to tangible change, and the students felt it was an effective way of researching social and environmental justice and wellness. To include youth, as Florant calls for in her TEDx talk, this research approach can be promoted, particularly in high schools, as an accessible and responsive tool that considers the dialoguers’ needs and desires and reveals their humanness in a world that turns all information, including personal information, into data for profit and consumption.
Limitations
Organizing and conducting a CPAR project requires time, and ideally it takes place over a period of several days or weeks. This research represents a pilot project limited to a two-hour time block, and parents and guardians did not consent to photographs being taken. Including photographs in the documentation would have illustrated the depth the students created in the time that was available. Additionally, designated scribes and note-takers are needed to add more nuance to the analysis; with careful planning, these roles can be assigned to the participants/co-researchers on a rotating basis.
Conclusion
CECR can create an important space in the future of education, a future unfolding in the context of climate change and the continued evolution of technology. Incorporating methods that stimulate consciousness-raising in an increasingly posthuman world sustains human connection and community-building. According to the Center for the Study of Social Policy (2019), such connections serve as protective factors and build the resilience that supports regenerative well-being.
There is room for expansion, and next steps could include collaboratively developing curricula that address critical sustainability. Von Stackelberg is currently working on a project that will connect college-age students with high schoolers to further explore these methods and topics and to build community among young people in education. Such projects value ongoing dialogue and direct human connection on critical topics rather than interactions facilitated through structures and systems that emerged from extraction-based models. At the core of developing relationships between researchers, students, administrators, faculty, and staff is what Brooks (2017) calls a critical theory of love, which resists dehumanization and erasure. Such theories and methods are growing in value as they provide pathways to keep humans in the loop in a posthuman world.
References
Brooks, D. (2017). (Re)conceptualizing love: Moving towards a critical theory of love in education for social justice. Journal of Critical Thought and Praxis, 6(3), 102–114.
Center for the Study of Social Policy. (2019, January 11). Protective factors framework. https://cssp.org/our-work/projects/protective-factors-framework/
Chapman, M. (2019). Changing the world without doing harm: Critical pedagogy, participatory action research, and the insider student researcher. Religious Studies and Theology, 100–116. https://doi.org/10.1558/rsth.38715
Doppelt, B. (2023). Preventing and healing climate traumas: A guide to building resilience and hope in communities (1st ed.). Routledge.
Fine, M., & Torre, M. E. (2021). Essentials of critical participatory action research. American Psychological Association.
Lee, S. J., & Kwon, K. (2024). A systematic review of AI education in K-12 classrooms from 2018 to 2023: Topics, strategies, and learning outcomes. Computers and Education: Artificial Intelligence, 100211. https://doi.org/10.1016/j.caeai.2024.100211
Scharrer, E., & Ramasubramanian, S. (2021). Quantitative research methods in communication: The power of numbers for social justice. Routledge.
Steier, F., Brown, J., & Mesquita da Silva, F. (2015). The World Café in action research settings. In H. Bradbury (Ed.), The SAGE handbook of action research (3rd ed.). SAGE Publications.
TEDx Oneonta. (2023). TEDx Oneonta 2023 [Video]. YouTube. https://www.youtube.com/watch?v=Vqmoo8V71cI