A medical professional vibe-coded a patient management system that exposed all patient data, unencrypted, to the open internet within days of launch.
A non-technical medical practitioner, inspired by a video about how easy AI coding has become, used an AI coding agent to build a custom patient management system. The app stored unencrypted patient data on a US server with no Data Processing Agreement, recorded appointment audio and sent it to multiple US-based AI APIs without patient consent, and had no authentication at all, leaving every record fully exposed. A visiting patient discovered the vulnerabilities within 30 minutes of poking around. The practitioner responded with an AI-generated statement, and their fix was to add basic authentication, leaving the underlying legal and structural violations entirely unresolved.
This is the first documented real-world case in which an AI coding agent produced a production healthcare app with zero authentication, unencrypted storage, and illegal cross-border data transfers, all in a single session. The dev community built the tools that made this possible. The legal and reputational fallout won't land on the vibe coder; it will land on the ecosystem that normalized 'ship without understanding'. If your API or SDK is downstream of one of these deployments, you may already be implicated as a data processor under GDPR.
Audit any public-facing app you've helped build or whose API you expose: send a quick unauthenticated curl request against its endpoints and check whether protected routes return data without a valid token. Fix it before a non-technical user stumbles onto it accidentally.
Open your terminal and run: curl -s https://your-app-domain.com/api/patients (replace the URL with any protected route in a project you maintain or have access to).
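The one-liner above can be extended into a small repeatable audit. Below is a minimal sketch in Python using only the standard library; the base URL and route names are hypothetical placeholders, so substitute the routes your own app actually serves. The idea is simple: any protected route that answers an unauthenticated GET with a non-empty 2xx body is a red flag.

```python
import urllib.request
import urllib.error

# Hypothetical routes to probe -- replace with your app's real protected routes.
ROUTES = ["/api/patients", "/api/appointments", "/api/recordings"]

def classify(status, body):
    """Classify an unauthenticated response.

    A well-protected route should answer 401/403. A 2xx with a non-empty
    body means data came back without credentials -- the failure mode in
    the incident above.
    """
    if status in (401, 403):
        return "protected"
    if status in (301, 302, 307, 308):
        return "redirect (check the destination)"
    if 200 <= status < 300 and body.strip():
        return "EXPOSED: data returned without credentials"
    return "inconclusive"

def probe(base_url, route, timeout=5):
    """Send one unauthenticated GET to base_url + route and classify it."""
    req = urllib.request.Request(base_url + route, method="GET")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return classify(resp.status, resp.read().decode("utf-8", "replace"))
    except urllib.error.HTTPError as e:
        # 4xx/5xx arrive here; the status code is what we care about.
        return classify(e.code, "")
    except urllib.error.URLError:
        return "unreachable"

if __name__ == "__main__":
    base = "https://your-app-domain.example"  # placeholder, not a real host
    for route in ROUTES:
        print(route, "->", probe(base, route))
```

Running this against a staging deployment before launch catches exactly the class of exposure the visiting patient found in half an hour; anything printed as EXPOSED needs authentication before the app goes anywhere near real data.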