Cybersecurity Girl Weekly Drop
Cyber news, tools & one smart career path.
5 min read

Quick Reality Check
ChatGPT now allows users to ask health-related questions and upload personal health information. This includes symptoms, medical details, and other sensitive data that people may assume is harmless to share in the moment.
What happened:
AI health tools are becoming more common, and people are starting to use them as a private space to talk about health concerns. The BBC reports that ChatGPT now supports health-focused features, allowing users to input personal health details to receive guidance or explanations.
The risk is not one single upload. It is how small details add up over time. Symptoms, medications, sleep issues, stress, and mental health questions can be combined to build a health profile, even if you never upload official medical records. AI does not need a confirmed diagnosis to infer one.
Why it matters:
The danger lies in how AI builds a complete health profile from tiny, unrelated details. You don’t need to upload official medical records for this to happen. Instead, the AI stitches together your mentions of symptoms, medications, and sleep issues to "guess" a diagnosis, and that guess can be used even if it’s wrong. Once this data is shared, you lose control.
Unlike a doctor’s office, these platforms are not covered by HIPAA and offer no doctor-patient confidentiality. Your private info can outlive the reason you shared it, moving between companies as they change policies or get sold. Eventually, these AI "guesses" can influence your insurance costs or even hiring tools without you ever knowing. You will never see the profile or the decision; you will only feel the outcome.
60-Second Protection Fix
If you choose to use ChatGPT or other AI health tools, reduce your risk:
- Keep questions general instead of personal
- Do not upload medical records, lab results, or test reports
- Avoid sharing timelines or repeat patterns over time
- Do not include names, dates of birth, or diagnoses
- Turn off chat history and training in settings
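If you like a belt-and-suspenders approach, the habits above can even be turned into a small pre-paste check. Here is a minimal sketch (the `scrub` helper and its regex patterns are illustrative assumptions, not a real PII detector — genuine identifier detection needs far more than a few regexes) of stripping obvious personal details from a question before it ever reaches an AI tool:

```python
import re

# Illustrative patterns only -- a real PII scrubber would need a much
# broader set of rules (names, addresses, record numbers, and so on).
PATTERNS = {
    "date": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),   # e.g. 03/14/1990
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def scrub(prompt: str) -> str:
    """Replace matched identifiers with generic placeholders."""
    for label, pattern in PATTERNS.items():
        prompt = pattern.sub(f"[{label} removed]", prompt)
    return prompt

print(scrub("Born 03/14/1990, reach me at jane@example.com about my results."))
```

The point is the habit, not the script: reread what you typed and swap specifics (dates, contact details, names) for general phrasing before you hit send.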
Must-Have Tool:
Protect Your Privacy: Managing Data in AI Tools
I put together a guide to help you understand what AI tools know about you and how to take back some control. In this guide, I walk you through the key settings to review, the small toggles that actually help, and a few habits that keep your data safer without overthinking it.
What You Missed This Week
This is how you’ll get scammed, Part 1. In this series, we’re breaking down the most common scams, showing you how cybercriminals get in, and what you can do to protect yourself before it’s too late. Cybercrime is a business, and your info is the product. Click on the image or watch here
Check Out Data Privacy (aka “The Guardians of Personal Information”)
Data Privacy professionals focus on how personal and sensitive data is collected, used, stored, and shared. In cases like AI health tools, they work to limit unnecessary data collection, reduce long-term data retention, and prevent sensitive information from being misused or repurposed without user awareness. If you care about protecting personal data, understanding privacy risks, and helping people stay in control of their information, this path might be for you.
Learn more about Data Privacy in my Free Intro Course: Cyber Paths 101
Let’s keep building together!
Stay protected,
Cybersecurity Girl
Know someone who’d enjoy this? Pass it along and have them sign up here! And if you have thoughts or feedback, just hit reply; I’d love to hear from you.
