The words used by healthcare professionals carry significant weight. Language that stigmatizes patients — terms like “difficult” or “non-compliant” — can have profound consequences for the medical care they receive. When healthcare professionals describe a patient as “difficult,” they may be less inclined to thoroughly investigate that patient’s complaints, creating a barrier to accurate diagnosis and effective treatment. Recent research finds that these kinds of comments are not only unhelpful but can also increase the patient’s risk of misdiagnosis.
When it comes to health records, word choice matters
Research finds that the use of stigmatizing language influences not just a single interaction between patient and physician but also the patient’s future treatment. Chart notes include information about the patient’s history, vital signs, and test results, as well as potential treatment plans. A study out of Johns Hopkins found that the tone used by the medical professional entering information into these records affects other physicians’ empathy toward, and treatment of, that patient for years to come.
This impact can have catastrophic consequences. Patients may receive a misdiagnosis, or even no diagnosis at all, with results that can be serious or fatal. Furthermore, a physician may be less likely to refer a patient labeled as “difficult” for further testing or to specialists, which compounds the risk of misdiagnosis.
A publication in JAMA Internal Medicine, a journal of the American Medical Association, discusses recent research that further highlights this concern. The researchers examined the records of patients who were admitted to a hospital and, within 48 hours, were transferred to intensive care or died. Within this group of seriously ill patients, those with stigmatizing language in their medical records were 23.2% more likely to have been misdiagnosed. Looking more closely at the misdiagnosed patients, the researchers found that the rate of diagnostic error doubled for those who also had negative language in their records.
Impact of AI
More and more healthcare facilities are using artificial intelligence (AI) to help with everything from scheduling to managing patient records. Critics have already voiced concern that these models are trained on material containing the same biases noted above, which could further compound the problem.
Takeaway
It is also important to note that bias can persist even when a physician never puts it in writing in the patient’s records. The ripple effect of stigmatizing language is clear — it undermines the thoroughness and accuracy of medical evaluations. To mitigate this risk, healthcare providers must recognize and adjust their language and approach, ensuring they assess all patients with the same level of diligence.
Those who believe they were the victim of a misdiagnosis because their treating medical team failed to take their symptoms seriously are wise to advocate for their legal right to remedies. Doing so can serve a number of functions, including recovering funds to help pay off the medical bills that result from delayed treatment and discouraging other medical professionals from making the same mistake.