Writing · Research · Speaking
Writing at the edge of what psychology knows.
On AI suicide risk, forensic practice, psychedelic medicine, and the clinical questions that don’t have easy answers yet.
Why AI Gets Suicide Risk Wrong
AI suicide detection systems treat risk as a language classification problem — flag the right words, trigger the right response. Clinically, suicide risk is a state transition problem. Risk lies in the trajectory, not in any one sentence. Language often becomes quieter and flatter as risk peaks, not louder. The gap between what AI systems are designed to detect and what a trained clinician recognizes as elevated risk is the central problem in AI safety for mental health applications.
This essay introduces a framework of six clinical signals — including narrative closure, cognitive rigidity, and emotional flattening — that AI systems currently miss and that point toward a more clinically grounded approach to detection.
Read the full essay →

The standard of care for AI suicide risk detection is being defined right now.
Wrongful death suits involving AI chatbots are working through federal courts. New state laws are mandating suicide risk protocols for companion AI systems. The clinical literature on what these systems should do is sparse.
This essay is a contribution to that conversation — written from 20 years of clinical practice in suicidality, not from a machine learning lab.
AI safety consulting →

REFRAMED
by Dr. Laura L. Walsh
A Substack publication at the intersection of clinical psychology, emerging technology, and the questions practitioners don’t always get to ask out loud. Professional pieces on AI risk, forensic psychology, and psychedelic medicine — alongside essays on grief, widowhood, and the inner life of clinical work.
Written for clinicians, researchers, attorneys, and anyone who wants to think carefully about where psychology is going.
Visit REFRAMED →

Why AI Gets Suicide Risk Wrong
On the clinical gap between what AI detects and what actually signals elevated risk — and what that means for the systems being deployed at scale.
Writing from the inside out
Personal essays on widowhood, grief, suicide loss, and the experience of being both a clinician and a person who has lived through the things she treats.
Research & academic contributions
Published under the name Laura Linebarger, PsyD prior to 2020.
Chapter contributor — Real Life: The Hands-on Pounds-off Guide
A 300-page clinically grounded guide to sustainable weight loss and healthy lifestyle change, published by TOPS (Take Off Pounds Sensibly). Dr. Walsh contributed a chapter drawing on her clinical expertise in addiction, behavioral change, and the psychology of eating.
View on Amazon →

Addiction Factors Within Obesity
Original empirical research examining the overlap between addictive processes and obesity, including the psychological, behavioral, and neurobiological factors that link compulsive eating to substance use disorders. Published through Lulu Press.
Available for conferences, panels, and media.
Dr. Walsh speaks to professional and lay audiences on the clinical questions at the center of her practice — AI suicide risk, forensic psychology, psychedelic medicine, and the psychology of grief and loss. She brings 20 years of clinical experience and an unusual ability to make complex psychological concepts accessible without losing their precision.
Formats include conference keynotes and panels, clinical training programs, podcast appearances, and media commentary. Inquiries welcome.
- AI suicide risk detection — where systems fail and what clinical expertise requires
- Forensic psychology and AI chatbot harm litigation
- Suicide risk in psychedelic medicine settings
- The psychology of grief, suicide loss, and bereavement
- ADHD, addiction, and complex adult psychopathology
Reach out about a speaking engagement
Include a brief description of the event or format, audience, and timeline. Dr. Walsh is based in Trinidad, Colorado, and speaks both in person and remotely.
Email Dr. Walsh →