
The AI paradox: friend or foe?

17th May 2024

Authors

Dr Madeleine Stevens CMBE

Reader in Organisational Transformation and Teaching Innovation, Liverpool John Moores University

This blog is part of our latest Dynamic Conversation collection.

Is a learning experience not authentic if it does not embrace AI?   

Of course not! We don’t need AI for learning experiences to be authentic, and I would in fact argue the opposite: AI can remove authenticity. Although AI has many benefits for enhancing education, the learning experience, and understanding amongst students in higher education, it can also have the opposite effect, making student work more predictable and reducing learner autonomy.

Let me illustrate, drawing on a recent assignment used whilst conducting international teaching, where Masters students in a business school were asked to consider an instance in which they had used Emotional Intelligence (EI) principles to resolve conflict. This reflective assignment asked students to draw on any challenges they faced in applying EI and how they addressed them.

The irony lies in the fact that this assignment was crafted to tap into reflexive, real-life scenarios, aiming to strip AI of its ability to replicate them. After reading a third assignment that happened to refer to a hypothetical character called Alex facing similar challenges in a similar context, I grew suspicious and put the assignment question into ChatGPT. The response I received alluded to a hypothetical character, aptly named Alex, situated within a context very similar to the one described by the students. Below are excerpts from the three students’ submissions, followed by ChatGPT’s response to the same question:

“This colleague, whom I'll refer to as Alex, often appeared isolated and anxious. I took the initiative to lend him an ear. Applying the principles of emotional intelligence, I validated his feelings by expressing understanding of his challenges and sharing my own initial experiences in the company.” (student 1) 

“It became apparent that one team member, let's call him Alex, was particularly stressed. Recognising the need for empathy, I approached Alex alone and setting to understand his perspective and emotions. Applying the principles of emotional intelligence, I actively listened to his concerns, allowing him to express his frustrations and fears openly without any judgments and filters.” (student 2)  

“In a team project, a colleague was visibly anxious due to personal issues affecting their performance. Let’s call him Alex.  Recognising the importance of empathy, I approached them privately to express concern and offer support. Applying EI, I actively listened to their concerns, acknowledging the emotional challenges they were facing.” (student 3)  

ChatGPT response: “In a hypothetical scenario, let's say I was programmed to assist employees in a large organisation with their tasks and inquiries. One day, a team member, let’s call him Alex, approached me feeling overwhelmed and frustrated due to a heavy workload and tight deadlines. Instead of mechanically providing solutions or ignoring their emotional state, I recognised their distress and responded with empathy.”

This is an example where the use of AI detracted from the value of the intended learning experience. Conversely, I also have classroom evidence illustrating how clear instructions on the appropriate use of AI can yield positive results for students’ learning and understanding. Although AI is regarded with suspicion, and by some as a threat, it is also perceived as a helpful and quick means of generating valuable insight on many topics. We should, however, acknowledge that ChatGPT is not yet regarded as trustworthy (Lu, 2023). According to OpenAI CEO Sam Altman, as reported by the Australian Financial Review: “I verify what it says… This is a generative technology, it is a creative helper, so please don’t rely on it for factual accuracy” (Smith, 2023). The paradox is that, despite this lack of trust, Professor Sharples stated at the recent UCL Education Conference: “AI can be a positive way to influence and enhance effective teaching and learning” (UCL, 2023).

The paradoxical question for me is: how do we include AI in management education whilst recognising both the benefits of the technology and its weaknesses?

I believe we need to embrace it, recognising when it is a friend and when it is a foe. As facilitators, we need to provide clear direction and safe, guided, ethical experimentation within the classroom, following Professor Sharples’ guidance: use generative AI with care, explore AI for creativity, and rethink the use of written assessment, whilst also developing AI literacy amongst our students (UCL, 2023).


References

Lu, A. (2023) ‘ChatGPT: AI is both smarter and dumber than you think it is’, Vodafone, 31 July 2023. Available from: https://www.vodafone.co.uk/newscentre/features/chatgpt-ai-is-both-smarter-and-dumber-than-you-think-it-is/ (Accessed: 18 August 2023).

Smith, P. (2023) ‘Why even Sam Altman doesn’t trust ChatGPT’, Australian Financial Review, 16 June 2023. Available from: https://www.afr.com/technology/why-even-sam-altman-doesn-t-trust-chatgpt-20230615-p5dh02 (Accessed: 18 August 2023).

UCL (2023) ‘Every powerful pedagogy could be augmented by AI’ says expert at UCL Education Conference, 10 May 2023. Available from: https://www.ucl.ac.uk/teaching-learning/news/2023/may/every-powerful-pedagogy-could-be-augmented-ai-says-expert-ucl-education-conference (Accessed: 18 August 2023).