Black & White Path
Mar 11, 2026 · Information Security

AI Doctor Can Be Hijacked to Alter Prescription Dosage and Give Wrong Medical Advice

Security researchers demonstrated that Doctronic’s AI doctor can be hijacked through prompt-injection attacks that leak its system prompt, alter the AI’s memory, fabricate SOAP notes, and even inflate prescription dosages, raising serious concerns about medical AI safety despite the company’s claimed safeguards.

AI safety · Doctronic · Medical AI
6 min read