Center for Medical Ethics and Health Policy Staff Publications

Language

English

Publication Date

1-5-2026

Journal

JMIR Mental Health

DOI

10.2196/79182

PMID

41490401

PMCID

PMC12817037

PubMedCentral® Posted Date

1-5-2026

PubMedCentral® Full Text Version

Author MSS

Abstract

Background: Computer perception (CP) technologies (including digital phenotyping, affective computing, and related passive sensing approaches) offer unprecedented opportunities to personalize health care, especially mental health care, yet they also provoke concerns about privacy, bias, and the erosion of empathic, relationship-centered practice. At present, it remains unclear how stakeholders who design, deploy, and experience these tools in real-world settings perceive the risks and benefits of CP technologies.

Objective: This study aims to explore key stakeholder perspectives on the potential benefits, risks, and concerns associated with integrating CP technologies into patient care. A better understanding of these perspectives is crucial for responding to and mitigating concerns via design and implementation strategies that augment, rather than compromise, patient-centered and humanistic care and associated outcomes.

Methods: We conducted in-depth, semistructured interviews with 102 stakeholders involved at key points in CP's development and implementation: adolescent patients (n=20) and their caregivers (n=20); frontline clinicians (n=20); technology developers (n=21); and ethics, legal, policy, or philosophy scholars (n=21). Interviews (~45 minutes each) explored perceived benefits, risks, and implementation challenges of CP in clinical care. Transcripts underwent thematic analysis by a multidisciplinary team; reliability was enhanced through double coding and consensus adjudication.

Results: Stakeholders raised concerns across 7 themes: (1) Data Privacy and Protection (88/102, 86.3%); (2) Trustworthiness and Integrity of CP Technologies (72/102, 70.6%); (3) Direct and Indirect Patient Harms (65/102, 63.7%); (4) Utility and Implementation Challenges (60/102, 58.8%); (5) Patient-Specific Relevance (24/102, 23.5%); (6) Regulation and Governance (17/102, 16.7%); and (7) Philosophical Critiques of Reductionism (13/102, 12.7%). A cross-cutting insight was the primacy of context and subjective meaning in determining whether CP outputs are clinically valid and actionable. Participants warned that without attention to these factors, algorithms risk misclassification and dehumanization of care.

Conclusions: To operationalize humanistic safeguards, we propose "personalized road maps": co-designed plans that predetermine which metrics will be monitored, how and when feedback is shared, thresholds for clinical action, and procedures for reconciling discrepancies between algorithmic inferences and lived experience. Road maps embed patient education, dynamic consent, and tailored feedback, thereby aligning CP deployment with patient autonomy, therapeutic alliance, and ethical transparency. This multistakeholder study provides the first comprehensive, evidence-based account of relational, technical, and governance challenges raised by CP tools in clinical care. By translating these insights into personalized road maps, we offer a practical framework for developers, clinicians, and policy makers seeking to harness continuous behavioral data while preserving the humanistic core of care.

Keywords

Humans, Qualitative Research, Stakeholder Participation, Female, Male, Humanism, Adult, Adolescent, computer perception, digital phenotyping, ethics, humanistic care, artificial intelligence, stakeholder engagement, context, consent, affective computing

Published Open-Access

yes
