Cybersecurity Girl Weekly Drop
Cyber news, tools & one smart career path.
5 min read

Quick Reality Check
You might be using AI wrong and not even know it
What happened:
CNET published a guide on 11 things not to use ChatGPT for. It flags tasks with privacy, accuracy, or legal risk and points to safer habits. Consumer AI tools can store chats, use them to improve models if your settings allow, and permit limited internal review for safety or legal reasons.
Why it matters:
People share real-life info in chats. Work notes, finances, health questions, even kids’ schedules. If settings allow training or review, your words can live longer, be seen by more humans, and show up in more places. That raises exposure if an account is taken over or records are requested. Chatbots are also confidently wrong, so high-stakes decisions can go sideways fast.
Read more here
60-Second Protection Fix
Here is what to do right now to protect yourself
What you can do:
- Keep sensitive info out of chats. Do not paste IDs, account numbers, passwords, or medical details.
- Separate work and personal. Use an enterprise or school account for work data. Keep personal use in a consumer account.
- Check your privacy settings. Turn off “use my chats to improve the model” if the tool allows it.
- Use private modes. Turn on Incognito or “do not save” before sensitive prompts.
- Lock down your account. Turn on multi-factor authentication and review connected apps.
- Clean up history. Delete chats you no longer need and disable auto-save if you can.
- Verify high-stakes answers. Double-check medical, legal, tax, or investment guidance with trusted sources or a professional.
Want more tips? Download my guide, Protect Your Privacy: Managing Data In AI Tools, here
Must-Have Tool:
Protect Your Privacy: Managing Data In AI Tools
I put together a new guide to help you see what AI tools know about you and how to take back some control. In this guide, I walk you through the key settings to review, the small toggles that actually help, and a few habits that keep your data safer without overthinking it. Download the guide here
Check Out Governance, Risk & Compliance (GRC) (aka “the Rulekeepers”)
These teams write and enforce the rules that keep companies accountable for how they handle data. In AI tools, that means deciding what’s collected, how long it’s stored, and who audits it. If you like connecting policy to practice and making sure user rights aren’t ignored, this could be your path.
Learn more about GRC in my Free Intro Course: Cyber Paths 101
What You Missed This Week
New York SIM farm near the UN threatens NYC… I think the news is hallucinating. Here is what is actually happening. Click on the image or watch here
What We're Hearing From You!
"HI heard from another cybersecurity expert that the sites you pay to delete your data also harvest your data and will sell it once you stop paying for the service. What is your comment on this? "- @shadowplay.jpg
Good question. The legit data deletion services don’t make money selling your info; they make money from subscriptions. Their policies back that up, and there’s no evidence they start selling your data if you stop paying. The real risk is shady knock-off sites, which could do exactly what you’re describing. That’s why I personally use Incogni and even partnered with them to help more people clean up their data safely.
If you want to try it out, click here and use code CSG60 for 60% off.
Let’s keep building together!
Stay protected,
Cybersecurity Girl
Know someone who’d enjoy this? Pass it along and have them sign up here! And if you have thoughts or feedback, just hit reply; I’d love to hear from you.
