AI and Human Relationships at FGCU
(Adapted from remarks made at the Provost’s Annual Retreat, August 1, 2023.)
I’ve been following and sometimes participating in the conversation about AI and higher ed over the last several months, and I’ve noticed two themes that seem to be getting quite a bit of attention. The first is about academic integrity and is encapsulated in the following question: How do we prevent students from using ChatGPT or another AI tool to produce assignments we wish them to complete on their own, and how do we “catch” them when they do inappropriately or surreptitiously use AI? The second theme I’ll call “adapting to the marketplace.” This theme is related to the idea that the “genie is out of the bottle” and, therefore, we have to find ways to integrate AI tools into our teaching, or at least help students become familiar with these tools and able to use them in the work they will do in their chosen careers.
To be fully transparent, I’m not especially interested in the first theme, to the extent that it entails helping faculty increase their level of surveillance over students. And, among those in the know, there seems to be a growing consensus that AI detection is a losing battle. Also, while the “adapting to the marketplace” theme is somewhat closer to the work I do in the Lucas Center to support faculty—in that it is directly related to teaching and learning—I don’t believe the function of our work as educators is to meet the labor needs of the marketplace. So, I don’t find this second theme especially compelling either.
This is not to say, however, that I don’t believe AI has the potential to be transformational in higher ed. I just hope that it will be transformational in ways that are as yet underrepresented in the discourse on AI and education, so I will turn instead to a third theme that I hope will occupy us in substantive ways in the coming academic year.
What I find to be missing from the conversation about AI and higher ed is recognition of the ways in which AI enhances our opportunities for relationship building with students. In a recent article in The Atlantic, Adrienne LaFrance (2023) wrote:
Now is the time…to recommit to making deeper connections with other people. Live videochat can collapse time and distance, but such technologies are a poor substitute for face-to-face communication, especially in settings where creative collaboration or learning is paramount. The pandemic made this painfully clear. Relationships cannot and should not be sustained in the digital realm alone, especially as AI further erodes our understanding of what is real. Tapping a “Like” button is not friendship; it’s a data point. And a conversation with an artificial intelligence is one-sided—an illusion of connection (para. 17).
In a similar vein, in early 2023 David Brooks wrote in the New York Times:
[AI] is missing a humanistic core. It’s missing an individual person’s passion, pain, longings and a life of deeply felt personal experiences. It does not spring from a person’s imagination, bursts of insight, anxiety and joy that underlie any profound work of human creativity (para. 4).
Finally, in a 2011 New York Times article titled “What is College For?”, philosopher Gary Gutting of Notre Dame wrote:
First of all, [colleges] are not simply for the education of students. This is an essential function, but the raison d’être of a college is to nourish a world of intellectual culture; that is, a world of ideas, dedicated to what we can know scientifically, understand humanistically, or express artistically (para. 7).
He went on to write:
Students…need to recognize that their college education is above all a matter of opening themselves up to new dimensions of knowledge and understanding. Teaching is not a matter of (as we too often say) “making a subject (poetry, physics, philosophy) interesting” to students but of students coming to see how such subjects are intrinsically interesting. It is more a matter of students moving beyond their interests than of teachers fitting their subjects to interests that students already have. Good teaching does not make a course’s subject more interesting; it gives the students more interests — and so makes them more interesting…[T]he truth is that, for both students and faculty members, the classroom is precisely where the most important learning occurs (paras. 9-10).
So, if we appreciate the value of the humanistic responses of Brooks and LaFrance to AI and of Gutting’s philosophical perspective on the purpose of higher ed and the transformative potential of the classroom experience, what are the implications for us as teachers, students, administrators, advocates, staff—essentially anyone involved in the university community—as we attempt to come to terms with the impact of AI on our work? I noted earlier that my hope is that our current preoccupation with AI will lead us to seek deeper, more meaningful relationships with our students, and this is the point on which I will conclude.
In the service of building trust and rapport in the classroom, I believe we can and should approach AI as an object of intellectual inquiry with our students, rather than as a tool to be mastered or an enemy to be subverted. We can engage students in a collaborative process to develop shared guidelines for the use of AI in our courses, while increasing our transparency about the ways in which vital transferable skills (critical thinking, oral and written communication, teamwork, leadership, professionalism), all of which are identified in FGCU’s newest Quality Enhancement Plan, may be inhibited or constrained when we outsource or delegate our intellectual effort to a machine.
I agree with the following point made by LaFrance (2023) in her article in The Atlantic, and I believe it is fully aligned with our fundamental purpose as educators: “We should trust human ingenuity and creative intuition, and resist overreliance on tools that dull the wisdom of our own aesthetics and intellect… We can and should layer on technological tools that will aid us in this endeavor, but never at the expense of seeing, feeling, and ultimately knowing for ourselves” (para. 24).
In the end, then, I hope that AI will force us to reflect more deeply on what it means to be human and what we in the university community mean to our students—and they to us. It is the depth and commitment of our shared goals and connections to one another that cannot be replicated by AI and that are absolutely necessary for the achievement of our educational, personal, and professional aspirations.
References

Brooks, D. (2023, February 2). In the age of A.I., major in being human. The New York Times. https://www.nytimes.com/2023/02/02/opinion/ai-human-education.html

Gutting, G. (2011, December 14). What is college for? The New York Times. https://archive.nytimes.com/opinionator.blogs.nytimes.com/2011/12/14/what-is-college-for/

LaFrance, A. (2023, July/August). The coming humanist renaissance. The Atlantic. https://www.theatlantic.com/magazine/archive/2023/07/generative-ai-human-culture-philosophy/674165/