Is AI Making Us Bossy?


Using AI may train its users to speak in direct, command-oriented ways.

AI tools tend to reward clarity and precision, shaping how users structure thoughts and requests.

Habitual AI language may lower users' patience for emotion, ambiguity, and human unpredictability.

Being mindful of AI’s influence can help users preserve empathy and relational connection.

I have begun to notice a shift in the way we use language.

When many of us speak to artificial intelligence (AI), we do not offer greetings, nuance, or hesitation. We command. Rewrite this, summarise that, plan my schedule. The language is direct, efficient, and focused on output.

It made me wonder whether this way of speaking is influencing how we interact with people. Are AI users becoming commanders in human relationships, too?

Why AI Rewards Direct, Command-Based Language

AI systems tend to respond best to clear, precise prompts. Users quickly learn to optimise their language for performance and accuracy.

This makes sense when the goal is task completion; efficiency and clarity yield better results. But language is not just a tool for tasks—it is a vehicle for thought. Psychological research shows that habitual patterns of language influence how we think and behave. Regularly practising directive, goal-focused language with AI may subtly reinforce similar habits in cognition and communication.

Human Conversation Is Not a Task to Optimise

Unlike the typical requests given to AI, human conversation is not a task to be completed. It is a process, rich with emotion, ambiguity, hesitation, repair, and connection.

Interpersonal communication requires patience, empathy, and tolerance for imperfection. When daily language practice becomes increasingly command-oriented, there is a risk that these essential relational skills may be deprioritized or overlooked.

AI responds instantly, accurately, and without its own needs. Over time, frictionless responsiveness may recalibrate users' expectations of human interaction.

Research shows that using AI-generated suggestions can influence perceptions of warmth and cooperation in relationships (Hohenstein et al., 2023). Other studies suggest that high reliance on chat-based AI may be associated with increased loneliness and lower real-world social engagement (Fang et al., 2025 [preprint]; Hau & Winthrop, 2025).

While these patterns do not prove causation, they highlight the potential for AI-shaped habits to carry over into human contexts.

Language Shapes How We Think and Relate

Language is more than a tool for communication; it shapes thought, attention, and relational style. The Sapir-Whorf hypothesis suggests that habitual language patterns influence cognition and behaviour (Whorf, 1956). When language becomes primarily directive and outcome-driven, it can prime people to approach relationships as problems to solve rather than experiences to share. Over time, these habits may subtly reduce patience for human unpredictability and emotional nuance.

This is not to say AI is inherently harmful. Some users report that they benefit from the clarity AI provides, particularly those who struggle to organise their thoughts or express their needs. Evidence even suggests that for some users, AI-assisted feedback can enhance empathic communication in structured settings when used thoughtfully (Sharma et al., 2023). The key is intent: Are we using AI as a partner in reflection, or as a model for commanding behaviour?

What This Means for the Future of Human Connection

The question is twofold: What is AI doing for people, and what is it doing to people? Are machines being trained to follow commands while humans train themselves to expect precision, speed, and control in all interactions? And what happens when attention returns to relationships that are inherently complex, slow, and wonderfully imperfect?

AI may be teaching people how to be better commanders. The real challenge is learning when not to take that stance with other humans. People do not respond to commands—they respond to curiosity, patience, and care.

Yet there is another side to this story. The very act of instructing AI may also be training people to be clearer, more direct, and more goal-oriented in their communication. Whether this represents a cognitive gain or a relational cost may depend on context—a question worth exploring in a follow-up discussion.

Hau, I., & Winthrop, R. (2025, July 2). What happens when AI chatbots replace real human connection. Brookings. https://www.brookings.edu/articles/what-happens-when-ai-chatbots-replac…

Fang, C., Liu, A., Danry, V., Lee, E., Chan, S., Pataranutaporn, P., Maes, P., Phang, J., Lampe, M., Ahmad, L., & Agarwal, S. (2025). How AI and human behaviors shape psychosocial effects of chatbot use: A longitudinal randomized controlled study [Preprint]. arXiv. https://doi.org/10.48550/arXiv.2503.17473

Sharma, A., Lin, I. W., Miner, A. S., Atkins, D. C., & Althoff, T. (2023). Human–AI collaboration enables more empathic conversations in text-based peer-to-peer mental health support. Nature Machine Intelligence, 5(1), 46–57. https://doi.org/10.1038/s42256-022-00593-2

Hohenstein, J., Kizilcec, R. F., DiFranzo, D., et al. (2023). Artificial intelligence in communication impacts language and social relationships. Scientific Reports, 13, 5487. https://doi.org/10.1038/s41598-023-30938-9

Whorf, B. L. (1956). Language, thought, and reality: Selected writings of Benjamin Lee Whorf. MIT Press.


© Psychology Today