The Advocate Magazine 2025 Number 48, Issue 1 | Page 11

Why to Use — or Avoid — AI in Counseling
continued from page 10
WILL AI REPLACE COUNSELORS?

When you map your route to work, unlock your phone using a facial scan, interact with a chatbot on a website, or open Google Translate to communicate with someone who speaks a different language, you're using AI.
AI isn't new. The term was coined in the mid-1950s, and the concept originated decades before the term existed. AI is not even new to counseling. You can still converse with a Rogerian mock-therapist chatbot called "Eliza," developed in the 1960s by a researcher at MIT. Though Eliza's counseling skills are basic, focusing on open-ended questions, reflections, and empathetic responses, her — or should we say, its — services are free!
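For readers curious about what is happening under the hood, Eliza's signature move — reflecting a client's statement back as a question — can be approximated in a few lines of code. The sketch below is a toy reconstruction for illustration only, not Weizenbaum's actual 1960s program, which used a richer set of pattern-matching scripts:

```python
# Toy, Eliza-style "reflection": swap first- and second-person words,
# then echo the statement back as an open-ended question.
# Illustrative sketch only — not Weizenbaum's original program.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my", "mine": "yours",
}

def reflect(statement: str) -> str:
    """Turn a client statement into a Rogerian-style reflection."""
    words = [REFLECTIONS.get(w.lower(), w)
             for w in statement.rstrip(".!?").split()]
    return "Why do you say that " + " ".join(words) + "?"

print(reflect("I am unhappy with my progress"))
# -> Why do you say that you are unhappy with your progress?
```

The trick, then as now, is that the program understands nothing; it merely transforms the user's own words, and the user supplies the meaning.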
Fear and discomfort with AI are as old as the concept of AI itself. Dismal stories focused on the potential horrors of AI predate any of our entries into the counseling profession. For example, a list on the Internet Movie Database (IMDb) site identifies 113 movies filmed since 1927 with plotlines involving AI, often consisting of dysphoric, post-apocalyptic futures in which humanity struggles to survive.
If AI isn't new, then why do so many people think it is? I think it's mostly because of ChatGPT, a revolutionary generative AI platform created by a San Francisco-based company called OpenAI. ChatGPT is unique because of its advanced ability for natural language generation.
I should back up here a little and offer some additional definitions, courtesy of USF's Innovative Education team. We've already learned that AI can be defined as "the theory and development of computer systems able to perform tasks that normally require human intelligence." Natural language processing (NLP) is a subfield of AI that focuses on enabling computers to understand, interpret, and generate human language. Natural language generation (NLG) is, in turn, a subfield of NLP that focuses on generating human-like language from structured data or other inputs.
ChatGPT is unique because its NLG feels more like communicating with a real human than any of its predecessors did. You can converse with it much as you would with any other person, except that ChatGPT is, in a sense, more knowledgeable than any single human being: it draws on a substantially greater pool of data than any one person can possess, and its capacity for analogical reasoning (i.e., identifying similarities between concepts and ideas) is superior to that of previous generative AI platforms.
So, if ChatGPT is, in a sense, smarter than any single counselor, does that mean it can counsel better than we can?
Many counselors fear that they'll eventually become fully or partly obsolete as the influence of AI in society continues to grow. As time progresses, AI bots are becoming increasingly difficult to differentiate from humans. The Turing test, named after Alan Turing, who proposed the procedure in 1950, employs what is known as the "imitation game" to determine whether humans can differentiate between responses offered by AI and those offered by other humans. In a study published in 2024, humans incorrectly identified text-based responses from ChatGPT 54 percent of the time, which is about the same as random chance.
In a 2022 study published in Computers in Human Behavior, 55 percent of a sample of 872 adults preferred AI-based psychotherapy over psychotherapy with a human. At the same time, participants in that study acknowledged that they generally trust human therapists more than AI. In a 2023 study published in JAMA Internal Medicine, a panel of healthcare professionals judged human physicians' answers to patients' questions as much less empathetic and lower in quality than the answers offered by ChatGPT.
Though I know of no such study being conducted with mental health counselors (any doctoral student reading this article who is looking for a good dissertation idea should consider replicating that study with counselors instead of physicians), it highlights how generative AI tools may, in the future, become better sources of mental health-related information than any single counselor.
Interestingly, in a 2022 study published in Current Psychology, therapy clients preferred AI bots that disclose or exhibit humanlike emotions over those that provide information only. This is notable, given that AI bots do not truly experience human emotions — they can simply be programmed to mimic them. Yet, people would rather communicate with an AI bot that essentially feigns human emotion than one that doesn't.
I can't pretend to know what will happen in 100 years, but I don't believe that in our lifetime we will see mental health counseling dominated by AI bots. I'll come back to this issue after exploring some of the beneficial, and some of the potentially detrimental, uses of AI in counseling.
BENEFICIAL USES OF AI IN COUNSELING
Let's talk about examples of how AI could be helpful as an adjunct to counseling and psychotherapy.

continued on page 12

The Advocate Magazine 2025, Issue #1 | American Mental Health Counselors Association (AMHCA) | www.amhca.org