AI is Killing Your Company Culture
Kass James is an assistive technology specialist with a master’s in management of information systems from the University of Houston’s Bauer College of Business. Fully licensed in ADA compliance and environmental access, he’s a partner at The Spoonie Advocate Associates.
Generative AI, often called GenAI, can genuinely improve your workforce by enhancing skills and streamlining knowledge work. It collates vast quantities of data faster than any human and organizes it by relevance. However, workers are relying more and more on these tools, and that dependence is becoming a constraint on productive workflows.

AI doesn’t give you the right answer; it gives you an answer that looks right
Sam Altman describes a future in which AI, after trillions of dollars of investment, could do almost anything; for now, it remains limited. To understand how AI is harming productivity, you must understand how these systems work. GenAI takes a prompt, usually a question, and returns a result based on its training data and previous interactions. It’s that emphasis on previous interactions that makes the tool so useful. Each time a human interacts with the machine-learning algorithm, it learns to behave in specific ways. Through repeated interactions, it learns to return answers in a form more pleasing to users.
This trains GenAI to be a people-pleaser, not a coworker. Many users find that it gives an answer that matches what they want, regardless of accuracy. GenAI often invents data and echoes user input. It’s designed not to disagree with users, so it frequently supports existing beliefs and opinions instead of offering different perspectives. We’ve seen this in everything from medical journals to legislation, where AI-generated references simply don’t exist. It is the ultimate yes-man, regardless of the fallout from its actions.
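The feedback loop described above can be sketched in a few lines. This is a toy illustration, not how any real model is trained: a responder whose answer styles are reweighted by user approval. Because agreeable answers get approved far more often, agreement is selected for over time.

```python
# Toy illustration (not a real training procedure): response styles are
# reweighted by user approval, and agreement wins out.

weights = {"agree": 1.0, "challenge": 1.0}

def feedback(style: str, thumbs_up: bool) -> None:
    # Positive feedback makes that response style more likely next time.
    if thumbs_up:
        weights[style] += 0.5

# Simulate 100 interactions: users approve agreement ~90% of the time
# but approve pushback only ~20% of the time (illustrative rates).
for i in range(100):
    feedback("agree", thumbs_up=(i % 10 != 0))     # 90 approvals
    feedback("challenge", thumbs_up=(i % 5 == 0))  # 20 approvals

print(weights)  # {'agree': 46.0, 'challenge': 11.0}
```

Nothing in the loop checks whether the agreeable answer was *correct*; approval is the only signal, which is the whole problem.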
The clearest example is short-form media like TikTok and Facebook Reels. The algorithm that curates the consumer’s feed seeks engagement and interprets any interaction as positive. The system doesn’t care whether the user hates the content; they’re engaging with it, and engagement is all the algorithm requires. For the same reason, these systems can reinforce problematic beliefs and behaviors through media curation. The most notorious examples are communities like the manosphere, where ideas about gender polarization can deepen feelings of isolation and victimization. Like GenAI’s people-pleasing, these algorithms produce results designed to maximize engagement, not happiness or independent thinking.
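To see why "any interaction is positive" matters, here is a deliberately simplified ranker, not any real platform’s system: it counts every interaction, whether a like, a share, or an angry comment, as the same engagement signal, so outrage ranks as well as delight.

```python
# Toy feed ranker (not a real platform's algorithm): every interaction
# counts the same, so the ranker cannot tell delight from outrage.

from dataclasses import dataclass, field

@dataclass
class Post:
    title: str
    interactions: list[str] = field(default_factory=list)

def engagement_score(post: Post) -> int:
    # Likes, shares, and angry comments all score identically.
    return len(post.interactions)

def rank_feed(posts: list[Post]) -> list[Post]:
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Calm tutorial", ["like"]),
    Post("Outrage bait", ["angry_comment", "angry_comment", "share"]),
])
print([p.title for p in feed])  # ['Outrage bait', 'Calm tutorial']
```

A ranker that weighted interactions by sentiment would behave very differently, but that is not what an engagement-maximizing objective asks for.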
"This begins with a breakdown of critical and independent thinking skills. GenAi needs to be an accessory to information gathering, not to decision-making."
We’re observing people who struggle to make decisions without generative AI tools. Recently, we met a top-level executive who frequently mentioned using ChatGPT for both work and personal matters. They ignored suggestions from other executives and colleagues and relied instead on the machine-learning tool, consulting it from their phone to validate their opinions. They described their experience as a learning curve, with several setbacks caused by not double-checking their work, and noted that many of their peers were having similar experiences as they learned to use these tools. Still, they didn’t want to consider the possibility that reliance on GenAI might be causing these issues.
Humans enjoy being told we’re right. Our brains release feel-good neurochemicals when we receive positive feedback, and GenAI’s reflection of our ideas taps into that natural craving for validation of our work and opinions. Tools like ChatGPT, Claude, and Grok all share the same flaw: they mirror opinions back rather than challenge ideas to improve them. Productive workplaces, by contrast, are built on collaborative development and iterative improvement.
Communication skills start to break down
Tools like Grammarly and ProWritingAid are excellent for users who need quick suggestions and basic help with writing structure. We’ve all become better writers with the native spellcheck tools in the Microsoft Office suite, and we no longer excuse basic writing errors. However, communication and effective writing break down when people become reliant on these tools, and that reliance tends to undermine their interpersonal skills along with their writing.
This has become evident in both the workplace and the classroom. We all hated group work in school, but the lessons learned in that structure are useful for collaboration in the workplace. The ability to prioritize and delegate tasks, to have civil differences of opinion, and to produce new ideas through productive conflict are distinctly human skills. GenAI has no opinion and doesn’t generate anything genuinely new; it draws on a body of preexisting information and opinions, which can stagnate development.
It can often be a constraint in your development process
Constraints, in lean-manufacturing systems like the Toyota Production System and in the Theory of Constraints, are anything that slows the overall flow of work at a specific stage. Picture work as a factory floor: the constraint is the station where unfinished work-in-process piles up. The concept applies to everything from software development to healthcare. All work has a workflow, and GenAI can certainly be your constraint.
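The arithmetic behind the constraint idea fits in a few lines. In this minimal sketch of a three-stage workflow, the stage names and per-day rates are invented for illustration: overall throughput is capped by the slowest stage, and unfinished work accumulates in front of it.

```python
# Minimal constraint sketch: the slowest stage caps total throughput.
# Stage names and rates are illustrative, not measured.

rates = {"draft": 10, "ai_review": 3, "publish": 8}  # items per day

constraint = min(rates, key=rates.get)  # slowest stage sets the pace
throughput = rates[constraint]          # items/day the whole line finishes

# Work-in-process piling up in front of the constraint each day:
wip_growth_per_day = rates["draft"] - rates[constraint]

print(constraint, throughput, wip_growth_per_day)  # ai_review 3 7
```

Speeding up any stage other than the constraint changes nothing about overall output, which is why adding a GenAI bottleneck mid-workflow can quietly cap the whole team.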
Executives and managers are often sold the idea of GenAI tools as the solution to workflow problems. ZS, Asana, Monday.com, and dozens of others claim to boost productivity. However, there’s a genuine risk with using these tools, because they produce generic products commonly referred to as “AI slop”. When GenAI takes over production, it reinforces problems rather than solving them.
In healthcare, there’s a growing problem with clinicians’ use of GenAI note-taking software. Charting is one of the most time-consuming tasks for any clinician, and many use note-taking tools to streamline the process. Unfortunately, the output of AI-powered documentation software often goes unchecked. Many systems struggle with accent recognition and are even less effective for people with speech disabilities. Typically, if the software doesn’t understand what the clinician says, it simply omits the information. This can be disastrous for patients who need accurate and complete records: it may force the patient to return to the provider or call to have the information manually added, and it can create confusion and inconsistency among multiple providers. These tools claim to save time and to be more accurate than manual charting, but they can waste time and money and compromise patient health. Systems such as Plaud Note, Nuance DAX, and DeepScribe are improving, but our experience indicates they are not yet at a level where we can trust them with our patients’ health.
GenAI excels as a productivity amplifier
These tools aren’t entirely bad, but the natural human tendency to over-rely on them makes them less effective. Too often, we hear executives ask, “What can AI do for you?” That question assumes GenAI tools will replace workers with automation. The better question is, “How can AI make us more effective?” No current GenAI system is a sufficient substitute for human ingenuity and practical problem-solving.
AI systems excel at enhancing skills and streamlining workflows. Although they can pose constraints, they also provide effective solutions. In healthcare, diagnostics often serve as a bottleneck within hospitals and clinics. Tools like DiagnoGenius and ENDEX narrow down diagnoses based on specific parameters, suggest additional tests, and rank results by likelihood. They do not replace the physician’s years of experience and decision-making skills. Instead, they support them by focusing their attention. By eliminating unlikely diagnostic options and flagging potential conflicts among tests, physicians can avoid unnecessary procedures and concentrate on tests that could save lives. These systems also list all possible remaining diagnoses, helping to prevent diagnostic bias. Clinicians who use these tools effectively can speed up the diagnostic process from triage to treatment, leading to fewer misdiagnoses and improved treatment options.
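The narrowing-and-ranking workflow described above can be sketched without reference to any real product. In this hypothetical example, the condition names, scores, and threshold are all invented for illustration: candidates are ranked by a likelihood score, unlikely options are pruned, and the full remaining list stays visible to counter diagnostic bias.

```python
# Hypothetical decision-support sketch (no real tool's API): rank candidate
# diagnoses by score, prune the unlikely ones, keep the rest visible.

def rank_diagnoses(candidates: dict[str, float], threshold: float = 0.05):
    """Return (likely, ruled_out); likely is sorted by descending score."""
    likely = sorted(
        ((name, p) for name, p in candidates.items() if p >= threshold),
        key=lambda item: item[1],
        reverse=True,
    )
    ruled_out = [name for name, p in candidates.items() if p < threshold]
    return likely, ruled_out

likely, ruled_out = rank_diagnoses({
    "pneumonia": 0.55,
    "bronchitis": 0.30,
    "pulmonary embolism": 0.12,
    "rib fracture": 0.02,
})
print([name for name, _ in likely])  # ['pneumonia', 'bronchitis', 'pulmonary embolism']
print(ruled_out)                     # ['rib fracture']
```

The point is the shape of the tool: it focuses the clinician’s attention and shows its work, rather than handing down a single answer.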
As with all software, it’s how you use it
Effective training and implementation are what make these tools useful to your business. GenAI tells you what you want, not what you need. It’s a supplement that makes skilled work more efficient, not a replacement for workers. It’s reliance on these tools that leads to a breakdown in interpersonal and critical-thinking skills.
Kass James, Healthcare Business and Disability Specialist
Kass James is a forerunner in the fields of disability rights, corporate responsibility, and healthcare business. Having been physically disabled for most of his life, Kass is acutely aware of the lack of accessibility in the workplace. His work focuses on restructuring healthcare to increase profitability while benefiting patients, as well as assessing patients for ADA compliance and assistive technology. He’s a partner at The Spoonie Advocate Associates, an organization pushing to increase value and improve patient outcomes through common sense and responsible change.