In the high-stakes world of internal medicine, a diagnosis is a weight carried by a human being. As part of my research for the Emotion Encoded study, I spoke with Dr. Sharon Cordner to explore the tension between a doctor's seasoned intuition and the cold objectivity of artificial intelligence.
When an algorithm suggests a path that contradicts a doctor's gut feeling, the first reaction usually isn't curiosity. It is a red flag. Dr. Cordner told me that for an experienced physician, an unexpected AI output triggers an immediate sense of responsibility toward the patient, a non-medical person who might not know better.
This concern captures exactly what I am looking for with this initiative: the human emotion of protection, something a machine simply cannot simulate.
In complex cases, the "black box" of AI becomes a danger. Dr. Cordner argues that AI should never be a solo pilot. Instead, it must be treated like a witness under cross-examination: it can be part of the team, but it is the member of the team that must prove itself most rigorously.
A key part of her insight was a warning about how the technology might change doctors themselves. If we lean too hard on the machine, we don't just risk losing accuracy. We might lose the very human drive to learn from experience.
Dr. Cordner points out a fascinating psychological split in the medical field. Some doctors might use AI to mask their own uncertainty, drawn to the comfort of its apparent objectivity, while others might let vanity or ego lead them to ignore the tool entirely.