Will AI Replace Psychotherapists?
- Deric Hollings
- May 11
- 17 min read
Tl;dr: It’s plausible (appearing worthy of belief) that digital mental, emotional, and behavioral health (collectively “mental health”) services could one day replace human psychotherapists, though I’m unsure how probable (likely to be or become true or real) this proposal is at present.
Defining Terms
To explore the current topic it may be useful to define a number of terms, because I’m addressing a personal hypothesis (an assumption or concession made for the sake of argument) shared by others regarding digital replacement of human mental health professional practitioners.
Bear in mind that not all of the following concepts will be addressed in this blogpost. Still, I consider it worthwhile to familiarize oneself with these terms for the sake of understanding. Helpfully, a single source is relied upon to describe many concepts addressed herein:
Artificial Intelligence (or “AI”) – The simulation of human intelligence in machines that are programmed to think and learn like humans. Example: A self-driving car that can navigate and make decisions on its own using AI technology.
Artificial General Intelligence (or “AGI”) – Artificial General Intelligence (AGI) refers to an AI system that possesses a wide range of cognitive abilities, much like humans, enabling it to learn, reason, adapt to new situations, and devise creative solutions across various tasks and domains, rather than being limited to specific tasks as narrow AI systems are.
Machine Learning (or “ML”) – A subfield of AI that involves the development of algorithms and statistical models that enable machines to improve their performance with experience. Example: A machine learning algorithm that can predict which customers are most likely to churn based on their past behavior.
Deep Learning – A subfield of ML that uses neural networks with multiple layers to learn from data. Example: A deep learning model that can recognize objects in an image by processing the image through multiple layers of neural networks.
Large Language Model (or “LLM”) – A type of deep learning model trained on a large dataset to perform natural language understanding and generation tasks. There are many famous LLMs like BERT, PaLM, GPT-2, GPT-3, GPT-3.5, and the groundbreaking GPT-4. All of these models vary in size (number of parameters that can be tuned), in the breadth of tasks (coding, chat, scientific, etc.), and in what they’re trained on.
Chatbot – A user-friendly interface that allows the user to ask questions and receive answers. Depending on the backend system that fuels the chatbot, it can be as basic as pre-written responses to a fully conversational AI that automates issue resolution.
Reasoning – AI reasoning is the process by which artificial intelligence systems solve problems, think critically, and create new knowledge by analyzing and processing available information, allowing them to make well-informed decisions across various tasks and domains.
Summarization – Summarization is the ability of generative models to analyze large texts and produce concise, condensed versions that accurately convey the core meaning and key points.
Quantum Computing – Quantum computing is a computational approach that could be used to dramatically increase processing power, with significant potential to enhance AI capabilities.
Hallucination – Hallucination refers to a situation wherein an AI system, especially one dealing with natural language processing, generates outputs that may be irrelevant, nonsensical, or incorrect based on the input provided. This often occurs when the AI system is unsure of the context, relies too much on its training data, or lacks a proper understanding of the subject matter.
Now that necessary descriptions are out of the way, let us continue the exploration of whether or not AI will replace psychotherapists. For the sake of simplicity, herein I’ll refer to AI, AGI, ML, LLM, chatbots, and other forms of non-human models of learning and thinking as “AI.”
REBT
I informally began life coaching in 1991, received specialized training on coaching people when in the military in 2001, earned a Master of Arts in Counseling degree in 2011, earned a Master of Science in Social Work degree in 2014, and began Hollings Therapy, LLC in 2021.
Also in 2021, I attained licensure as a Licensed Professional Counselor and Licensed Clinical Social Worker, and received formal training in Rational Emotive Behavior Therapy (REBT), though I had studied this psychotherapeutic modality during both graduate programs.
REBT uses two main techniques. The first tool is the ABC model which illustrates that when an undesirable Action occurs and you Believe an unhelpful narrative about the event, it’s your unfavorable assumption and not the occurrence itself that causes an unpleasant Consequence.
From a psychological standpoint, people disturb themselves using a Belief-Consequence (B-C) connection. Of course, this isn’t to suggest that in the context of the naturalistic or physical world there is no Action-Consequence (A-C) connection.
For example, if AI replaces all human software coding jobs (Action), then coders will become unemployed (Consequence). However, this A-C connection doesn’t cause a coder to become angry at AI that replaced the individual’s job. Rather, the B-C connection explains this outcome.
For instance, if coder John Doe loses his job due to AI having replaced all human software coding jobs (Action) and John unhelpfully Believes, “I shouldn’t have been phased out, and I can’t stand that this is happening,” then it’s John’s attitude that causes anger (Consequence).
Addressing how people upset themselves with unhelpful attitudes, the ABC model uses Disputation of unproductive assumptions in order to explore Effective new beliefs. Of note, Actions and Consequences aren’t Disputed; only unproductive Beliefs are challenged.
Herein, I won’t delve into the finer points of how I challenge unfavorable attitudes. For the sake of expediency, imagine that John instead helpfully tells himself, “Although I’d prefer not to be phased out, I can endure this untimely and unfortunate event,” as he remains disappointed.
With the ABC model, I invite people to consider healthy and tolerable elements of distress which relate to frustration, annoyance, or disappointment rather than unhealthy negative emotions of disturbance such as fear, anger, sorrow, or disgust. Here, John prefers disappointment to anger.
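For readers who think in structured terms, the B-C connection described above can be sketched as a small illustrative program. This is a hypothetical toy of my own construction, not a clinical tool; the belief-detection rule and the example phrases are assumptions made purely for demonstration:

```python
# Illustrative sketch of the REBT B-C connection: the Belief held about
# an Action, not the Action itself, produces the emotional Consequence.
# The keyword check below is a deliberately crude stand-in for real
# disputation work done in session.

def consequence(action: str, belief: str) -> str:
    """Return the emotional Consequence implied by a Belief about an Action."""
    # Rigid demands ("shouldn't") and low frustration tolerance
    # ("can't stand") mark an unhelpful, self-disturbing Belief.
    unhelpful = "shouldn't" in belief or "can't stand" in belief
    return "anger (disturbance)" if unhelpful else "disappointment (healthy distress)"

action = "AI replaces all human coding jobs"

rigid_belief = "I shouldn't have been phased out, and I can't stand that this is happening."
flexible_belief = "Although I'd prefer not to be phased out, I can endure this event."

print(consequence(action, rigid_belief))     # anger (disturbance)
print(consequence(action, flexible_belief))  # disappointment (healthy distress)
```

The same Action yields different Consequences depending solely on the Belief, which is the point of the model: John prefers disappointment to anger.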
The second tool of REBT is the helpful technique of unconditional acceptance (UA) to relieve self-induced suffering. This is accomplished through use of unconditional self-acceptance (USA), unconditional other-acceptance (UOA), and unconditional life-acceptance (ULA).
Existential and Stoic philosophy is interwoven into UA. Not to oversimplify matters, an example of an existential philosophical principle is acknowledgement about the impermanent and uncertain nature of life. Each and every person alive will one day die.
There’s no utility in John bemoaning this fact. Simply put, John will one day cease to exist in his current form. Again, at the risk of oversimplifying things, an example of how I teach Stoic principles is by inviting people to consider imagined circles and an area.
The sphere of control encompasses only oneself, the sphere of influence encapsulates elements which may be subject to one’s sway, the sphere of concern engrosses most matters one can imagine, and the area of no concern relates to all content which isn’t yet imagined.
John has control only of his reaction to AI. Although he may be able to influence politicians to stop AI expansion, it’s unlikely that John will have any meaningful sway in this regard. Even if United States (U.S.) politicians slowed AI advancement, other nations are pursuing this matter.
Without the ability to control or likely influence AI from replacing his job as a coder, John may still be concerned with what the future of his employment may look like. It’s natural for John to be frustrated, annoyed, or disappointed in this regard. Thus, healthy distress isn’t abnormal.
Then, there’s an area of no concern which relates to whether or not AI, when connected to quantum computing, could contact aliens in an alternate dimension that will seek to enslave the human race. This improbable, though not impossible, scenario isn’t worth self-disturbing over.
Thus, John can practice USA if or when he isn’t perfect at stopping himself from moving toward disturbance rather than distress. Likewise, he can practice UOA with those fallible humans who advance AI efforts which could potentially impact the jobs of millions or billions of people.
As well, John can practice ULA by admitting how little control or influence he has over AI, the future, or aliens in an alternative dimension, for that matter. Will AI replace John Doe’s occupation? It’s plausible, though I’m unsure how probable this proposal is at present.
Will AI Replace Psychotherapists?
Recently, I came across a Reddit post that prompted the current blogpost. In the subreddit submission, an individual stated the following:
I am a psychotherapist running a successful private practice for years and last year set up a clinic, due to high demand. I am in my mid forties and this is the only job I know how to do, having studied psychology as my first degree when I was 19 and then following the therapy training and career route. I am confident in my experience and skills and my work has been very stable over the years. However recently AI terrifies me. I have used it and I can totally understand what the hype is. I can’t imagine it replacing the depth I reach at times with clients, but I am aware that it is at very early stages. I was always fascinated by technology, sci fi and the possibilities, but this exceeds that.
In the last couple of months enquiries have dramatically dropped. I am in the UK [United Kingdom], and although we have a cost of living problem here, I don’t think work would be impacted so suddenly here as much as in the US, where I hear therapists struggling a lot with enquiries. I am talking a sudden 80% drop. I am convinced that enquires have dropped because of use of AI. What is your opinion? Am I just being too anxious or is there an element of truth there?
Although I suspect there are many complex system issues which call into question one being “convinced that enquiries have dropped because of use of AI,” as correlation doesn’t imply causation, I imagine that AI possibly plays a role in the observed reduction in engagement.
When reading the Redditor psychotherapist’s post, I thought about what other therapists would say regarding the individual’s circle of concern. Interestingly, one Redditor responded:
Just my experience and opinion, the patients I see rarely know about (or trust) AI. Mostly younger (20s). While AI has gained popularity, I don’t see how it would cause an 80% drop so quickly. On a side note, we’ll be outsourced to AI within next 5 , maybe 10 years. The reason? Insurance companies. They will push an AI system that just barely does the job (and cheaper) instead of paying humans to go deeper. But AI is not there yet. Last week, there was a drastic difference in quality coming from ChatGpt. One week it was great, spot on. Right now it just runs surface level loops.
This individual’s reply is something I consider plausible. Within my blog, I’ve made no secret about my displeasure with insurance companies. Given my professional experience, I think it’s likely that in the future insurance companies will transition to AI in order to save money.
Of course, this raises moral and ethical questions regarding the field of mental health and the foreseeable direction into which it’s likely headed. (Perhaps that’s a post for another day.) On December 29, 2022, I posted a blog entry entitled Artificial Influence in which I stated:
Though rated as probabilistically low, some people have begun discussing whether or not A.I. will replace psychotherapists in the future. In fact, one person reports ChatGPT made him “feel better” after conversing with the A.I. […]
While I don’t think the modern Johnny 5 machine learning is sentient quite yet, I’m intrigued to consider how the future of mental health treatment may look if or when influenced by A.I., and despite its potential bias […]
I searched a number of sites for a free A.I. behavioral health experience and settled on Character.AI, offering a free chatbot conversation. Is it possible for A.I. to render helpful care, given apparent bias? […]
I suspect that with enough time and patience an A.I. chatbot could be programmed to effectively address any number of issues with which I’ve assisted clients throughout the years. Is humanity ready for A.I. psychotherapists? Time will tell.
Much advancement in AI technology has occurred since 2022. For instance, one 2024 source stated of whether or not AI will replace psychotherapists:
[A]s AI becomes more integrated into mental health care, it is essential to recognize and address its limitations. Challenges related to memory retention, algorithmic bias, and ethical considerations underscore the need for a balanced approach.
AI systems lack genuine empathy, ethical judgment, and the ability to interpret non-verbal cues—qualities that are intrinsic to human therapists. Therefore, these systems must be carefully designed and implemented to complement rather than replace the critical human elements of empathy, cultural competence, and nuanced understanding in therapy.
While the referenced source may seem optimistic for psychotherapists, I argue that “vicariously experiencing” another person’s cognitive and/or emotive state (“empathy”) is virtually impossible. Thus, focusing on how supposedly important empathy is in psychotherapy is unhelpful.
Still, I understand that some people use flexible and preferential should statements regarding this matter (i.e., preferably, AI should be empathic like human therapists). Therefore, for those who value empathy, consider what one 2025 source states about AI replacing psychotherapists:
Although previous research has found that humans can struggle to tell the difference between responses from machines and humans, recent findings suggest that AI can write empathically and the generated content is rated highly by both mental health professionals and voluntary service users to the extent that it is often favored over content written by professionals.
In their new study involving over 800 participants, Hatch and colleagues showed that, although differences in language patterns were noticed, individuals could rarely identify whether responses were written by ChatGPT or by therapists when presented with 18 couple’s therapy vignettes.
This finding echoes Alan Turing’s prediction that humans would be unable to tell the difference between responses written by a machine and those written by a human. In addition, the responses written by ChatGPT were generally rated higher in core psychotherapy guiding principles.
This referenced source indicates that there’s a growing body of evidence which suggests that AI could plausibly replace psychotherapists as the go-to source for mental health services in the future. For a final reference regarding continued discussion of this issue, one 2025 source states:
“The question of whether AI will replace therapists isn’t a simple yes or no—rather, it’s about understanding how AI will transform the mental health care landscape. While some patients will continue to prefer human therapists, valuing the aspects of human connection, others may gravitate toward AI systems, finding them more accessible and feeling more comfortable sharing their deepest struggles without fear of human judgment.
“The dramatically lower cost of AI therapy, combined with its constant availability and elimination of waitlists, will likely accelerate this shift—particularly as insurance companies recognize the potential for both improved outcomes and reduced expenses. The mental health care field will likely develop in multiple directions simultaneously. Some practices might adopt hybrid approaches, where AI handles initial assessments and provides between-session support while human therapists focus on core therapeutic work.
“Other services might be fully AI-driven, particularly as these systems become increasingly sophisticated in their ability to understand human psychology and develop treatment plans. In psychiatry, while early AI adoption might focus on routine cases, advancing AI systems will likely surpass human capabilities in handling even the most complex diagnostic and medication management challenges, potentially leading to AI becoming the primary psychiatric care provider.
I’m intrigued by the prospect of AI support in the way of intakes and assessments, as well as providing support between sessions. Likewise, I suspect that for psychiatrists (who often do little more than prescribe medications), AI poses a significant challenge to their field.
Now, take a moment to reflect upon what I’ve stated about the ABC model and UA. I often encourage the people with whom I work to consider that there are generally three elements needed for their success with REBT: understanding of, belief in, and practice of the modality.
Without these components – and with heavy emphasis on routine (daily!) practice of this approach to rational living – I’ve witnessed many clients experience self-disturbance when not achieving their expressed interests and goals. How may AI assist people in this regard?
Suppose I see John Doe for weekly sessions which last up to 50 minutes per appointment. In our sessions, I help John practice disputation of unproductive beliefs while providing psychoeducational lessons on UA. Also, we negotiate homework to reinforce REBT techniques.
John understands REBT and believes that this approach to wellness can help improve his level of functioning and quality of life. Yet, John often neglects completion of his homework. Instead, he simply relies on me to provide catharsis in our sessions while he does little practice on his own.
Being that the demands of my personal and professional life disallow me from taking a more substantial role in John’s mental health care outside of sessions, AI augmentation of John’s care could be useful. For instance, an AI assistant could provide daily reminders of REBT principles.
As an example, John could converse with AI about challenges faced when initiating homework tasks. “I don’t know; what if I do this and fail?” John may text to an AI chatbot. Perhaps motivation to begin the task, or even fear of failure, is John’s issue in this case.
“What if you do so and achieve success?” the chatbot may respond while using the tool of what if in the opposite direction. This mild form of disputation to the distorted inference John uses (i.e., I shouldn’t fail when trying) may be all John needs to push through amotivation or fear.
The AI assistant could then chart the interaction in John’s digital record. Prior to meeting with John, I could briefly review this matter. Thus, the intervention strategy may save John time and money, as we wouldn’t spend time unraveling the reasons why John didn’t complete homework.
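The between-session workflow just described could, in principle, look something like the following minimal sketch. Every name, prompt, and the charting format here is a hypothetical assumption for illustration, not a description of any actual product or clinical record system:

```python
# Hypothetical sketch of an AI homework assistant that reframes a
# client's "what if I fail?" worry and charts the exchange for the
# therapist to review briefly before the next session.
import datetime

def reframe(client_message: str) -> str:
    """Apply 'what if in the opposite direction': a mild disputation
    of the distorted inference behind fear of failure."""
    if "fail" in client_message.lower():
        return "What if you do so and achieve success?"
    return "What would an imperfect attempt look like today?"

def chart(record: list, client_message: str, reply: str) -> None:
    """Append the exchange to the client's digital record."""
    record.append({
        "time": datetime.datetime.now().isoformat(timespec="minutes"),
        "client": client_message,
        "assistant": reply,
    })

record = []  # John's digital record (in practice, a secure clinical system)
message = "I don't know; what if I do this and fail?"
reply = reframe(message)
chart(record, message, reply)
print(reply)  # What if you do so and achieve success?
```

The design choice worth noting is the charting step: by logging the exchange, the assistant saves session time otherwise spent unraveling why homework went uncompleted.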
In this way, I view AI as a potentially hopeful utility for the field in which I work. Will AI replace psychotherapists? It’s plausible that digital mental health services could one day replace human psychotherapists, though I’m unsure how probable this proposal is at present.
If you’re looking for a provider who works to help you understand how thinking impacts physical, mental, emotional, and behavioral elements of your life—helping you to sharpen your critical thinking skills—I invite you to reach out today by using the contact widget on my website.
As a psychotherapist, I’m pleased to try to help people with an assortment of issues ranging from anger (hostility, rage, and aggression) to relational issues, adjustment matters, trauma experience, justice involvement, attention-deficit hyperactivity disorder, anxiety and depression, and other mood or personality-related matters.
At Hollings Therapy, LLC, serving all of Texas, I aim to treat clients with dignity and respect while offering a multi-lensed approach to the practice of psychotherapy and life coaching. My mission includes: Prioritizing the cognitive and emotive needs of clients, an overall reduction in client suffering, and supporting sustainable growth for the clients I serve. Rather than simply trying to help you to feel better, I want to try to help you get better!
Deric Hollings, LPC, LCSW

Photo credit (edited), fair use
References:
Affectionate_Duck663. (2025, May 9). To what extend do you think AI will replace psychotherapists? [Post]. Reddit. Retrieved from https://www.reddit.com/r/ArtificialInteligence/comments/1kijjdh/to_what_extend_do_you_think_ai_will_replace/
APA Dictionary of Psychology. (2023, November 15). Empathy. American Psychological Association. Retrieved from https://dictionary.apa.org/empathy
Bhaskar, C. (2025, February 12). AI vs. human therapists: Study finds ChatGPT responses rated higher. NeuroscienceNews.com. Retrieved from https://neurosciencenews.com/ai-chatgpt-psychotherapy-28415/
Character.AI. (n.d.). Home [Official website]. Retrieved from https://beta.character.ai/
Daren, S. (2021, October 12). Will AI replace psychologists? InData Labs. Retrieved from https://indatalabs.com/blog/will-ai-replace-psychologists
Dry-Objective7330. (2025, May 9). To what extend do you think AI will replace psychotherapists? [Post]. Reddit. Retrieved from https://www.reddit.com/r/ArtificialInteligence/comments/1kijjdh/to_what_extend_do_you_think_ai_will_replace/
Eväkallio, J. [@jevakallio]. (2022, December 4). All this ChatGPT shit has been making me feel anxious, so I had a therapy session with it, and it uhhhhh it actually made me feel better??? [Tweet]. Twitter. Retrieved from https://twitter.com/jevakallio/status/1599439122879635456
Freepik. (n.d.). 3d character emerging from a smartphone [Image]. Retrieved from https://www.freepik.com/free-ai-image/3d-character-emerging-from-smartphone_170857869.htm#fromView=search&page=1&position=0&uuid=dd4a8e7f-8e8d-4f25-8abd-ae9a6e3b0bf2&query=digital+therapy
Harb, C. (2025, January 8). Will AI replace therapists? Newsweek. Retrieved from https://www.newsweek.com/will-ai-replace-therapists-chatgpt-mental-health-usa-2010934
Hollings, D. (2024, January 14). An adaptive approach. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/an-adaptive-approach
Hollings, D. (2022, December 29). Artificial influence. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/artificial-influence
Hollings, D. (2024, November 15). Assumptions. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/assumptions
Hollings, D. (2024, November 10). Catharsis. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/catharsis
Hollings, D. (2022, May 17). Circle of concern. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/circle-of-concern
Hollings, D. (2024, October 29). Cognitive continuum. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/cognitive-continuum
Hollings, D. (2024, July 11). Concern and no concern. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/concern-and-no-concern
Hollings, D. (2024, October 27). Correlation does not imply causation. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/correlation-does-not-imply-causation
Hollings, D. (2024, November 4). Critical thinking. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/critical-thinking
Hollings, D. (2022, October 31). Demandingness. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/demandingness
Hollings, D. (2022, March 15). Disclaimer. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/disclaimer
Hollings, D. (2024, March 28). Distorted inferences. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/distorted-inferences
Hollings, D. (2025, March 12). Distress vs. disturbance. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/distress-vs-disturbance
Hollings, D. (2024, April 21). Existentialism. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/existentialism
Hollings, D. (2023, September 8). Fair use. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/fair-use
Hollings, D. (2024, May 17). Feeling better vs. getting better. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/feeling-better-vs-getting-better-1
Hollings, D. (2023, October 12). Get better. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/get-better
Hollings, D. (2024, February 24). High frustration tolerance. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/high-frustration-tolerance
Hollings, D. (n.d.). Hollings Therapy, LLC [Official website]. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/
Hollings, D. (2024, April 18). Homework. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/homework
Hollings, D. (2022, November 4). Human fallibility. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/human-fallibility
Hollings, D. (2024, October 21). Impermanence and uncertainty. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/impermanence-and-uncertainty
Hollings, D. (2022, June 20). Insurance coverage and lengthy wait times. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/insurance-coverage-and-lengthy-wait-times
Hollings, D. (2024, January 2). Interests and goals. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/interests-and-goals
Hollings, D. (2025, January 14). Level of functioning and quality of life. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/level-of-functioning-and-quality-of-life
Hollings, D. (2023, September 19). Life coaching. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/life-coaching
Hollings, D. (2022, December 2). Low frustration tolerance. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/low-frustration-tolerance
Hollings, D. (2024, March 4). Mental, emotional, and behavioral health. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/mental-emotional-and-behavioral-health
Hollings, D. (2023, October 2). Morals and ethics. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/morals-and-ethics
Hollings, D. (2024, September 27). My attitude. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/my-attitude
Hollings, D. (2022, October 22). On empathy. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/on-empathy
Hollings, D. (2023, April 24). On truth. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/on-truth
Hollings, D. (2025, April 25). Preferences vs. expectations. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/preferences-vs-expectations
Hollings, D. (2024, July 10). Preferential should beliefs. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/preferential-should-beliefs
Hollings, D. (2024, May 26). Principles. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/principles
Hollings, D. (2024, January 1). Psychoeducation. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/psychoeducation
Hollings, D. (2023, September 15). Psychotherapeutic modalities. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/psychotherapeutic-modalities
Hollings, D. (2024, May 5). Psychotherapist. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/psychotherapist
Hollings, D. (2022, March 24). Rational emotive behavior therapy (REBT). Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/rational-emotive-behavior-therapy-rebt
Hollings, D. (2024, May 15). Rational living. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/rational-living
Hollings, D. (2024, January 1). Rational vs. irrational. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/rational-vs-irrational
Hollings, D. (2024, December 5). Reasoning. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/reasoning
Hollings, D. (2024, July 18). REBT flexibility. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/rebt-flexibility
Hollings, D. (2024, July 10). Recommendatory should beliefs. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/recommendatory-should-beliefs
Hollings, D. (2022, November 1). Self-disturbance. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/self-disturbance
Hollings, D. (2022, October 7). Should, must, and ought. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/should-must-and-ought
Hollings, D. (2024, April 21). Stoicism. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/stoicism
Hollings, D. (2024, January 17). Summarizing. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/summarizing
Hollings, D. (2023, September 6). The absence of suffering. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/the-absence-of-suffering
Hollings, D. (2022, December 23). The A-C connection. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/the-a-c-connection
Hollings, D. (2022, December 25). The B-C connection. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/the-b-c-connection
Hollings, D. (2023, August 6). The science. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/the-science
Hollings, D. (2024, January 28). Think for yourself. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/think-for-yourself
Hollings, D. (2023, February 16). Tna. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/tna
Hollings, D. (2025, February 28). To try is my goal. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/to-try-is-my-goal
Hollings, D. (2025, April 18). Tolerable FADs. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/tolerable-fads
Hollings, D. (2025, January 9). Traditional ABC model. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/traditional-abc-model
Hollings, D. (2024, October 20). Unconditional acceptance redux. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/unconditional-acceptance-redux
Hollings, D. (2023, March 11). Unconditional life-acceptance. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/unconditional-life-acceptance
Hollings, D. (2023, February 25). Unconditional other-acceptance. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/unconditional-other-acceptance
Hollings, D. (2023, March 1). Unconditional self-acceptance. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/unconditional-self-acceptance
Hollings, D. (2024, January 16). Understanding, belief, and practice. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/understanding-belief-and-practice
Hollings, D. (2024, March 18). Unhealthy vs. healthy negative emotions. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/unhealthy-vs-healthy-negative-emotions
Hollings, D. (2024, April 10). Welcome to complex systems. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/welcome-to-complex-systems
Hollings, D. (2024, September 29). Well, well, well. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/well-well-well
Hollings, D. (2024, June 7). What if in the opposite direction. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/what-if-in-the-opposite-direction
Hollings, D. (2023, September 22). You’re gonna die someday. Hollings Therapy, LLC. Retrieved from https://www.hollingstherapy.com/post/you-re-gonna-die-someday
Moveworks, Inc. (n.d.). AI terms explained. Retrieved from https://www.moveworks.com/us/en/resources/ai-terms-glossary
Soonmme. (2008, July 15). Tl;dr. Urban Dictionary. Retrieved from https://www.urbandictionary.com/define.php?term=tl%3Bdr
Wikipedia. (n.d.). ChatGPT. Retrieved from https://en.wikipedia.org/wiki/ChatGPT
Wikipedia. (n.d.). Short Circuit (1986 film). Retrieved from https://en.wikipedia.org/wiki/Short_Circuit_(1986_film)
Will Robots Take My Job? (n.d.). Mental health counselors. Retrieved from https://willrobotstakemyjob.com/mental-health-counselors
Zhang, Z. and Wang, J. (2024, October 31). Can AI replace psychotherapists? Exploring the future of mental health care. Frontiers in Psychiatry. Retrieved from https://pmc.ncbi.nlm.nih.gov/articles/PMC11560757/