From The Desk Of Beatrix Lamarr

Once upon a time, Hedy Lamarr was coerced into filming a nude scene she never consented to. It nearly ended her career.

Consent matters, and not just in the past.

This month, as Sora 2 made it possible to create videos of anyone with just a prompt and a face, consent got more complicated. From Hollywood to Tokyo, the backlash was swift.

It's also Cybersecurity Awareness Month. For women especially, that means more than passwords and phishing scams. It means protecting your image, your voice, your brand.

So many entrepreneurs build their business and reputation on their personal brand. Which makes cyber protection essential.

Yours in Beauty, Brains & Bots,

Beatrix 💋

HER POWER BRIEF

💄 Sora 2 Just Dropped. So Did The Backlash

Image: Forbes

OpenAI's new text-to-video model, Sora 2, can do a lot. Including, apparently, infringing on copyrighted material with abandon.

The rollout came with a quiet policy twist: copyrighted characters and celebrity likenesses were opted in by default. If you didn't want to be included, you had to ask not to be.

Spoiler: that didn't go over well.

Here's who's pushing back:

  • Hollywood came out swinging, including Walter White himself (a.k.a. Bryan Cranston), calling for serious guardrails on AI-generated actors.

  • Japan's government requested that OpenAI protect anime and manga from copyright infringement, calling them "irreplaceable treasures."

  • The MLK Jr. estate demanded OpenAI block unauthorized use of his likeness after deepfakes surfaced online.

When consent becomes optional, creativity gets complicated, and the risk falls hardest on the artists, voices, and founders least protected.

But there's an even bigger concern when it comes to Sora…

💄 Sora's Got a Perv Problem

OpenAI's new video app lets users make content with your face, and yes, it's already being used for fetish videos.

Tech reporter Katie Notopoulos discovered strangers using her likeness for AI-generated pregnancy and inflation content after leaving her cameo settings public.

🤖 No nudity. Just no consent either.

It's yet another reminder that OpenAI's "ask forgiveness, not permission" approach has real-world consequences, and that ethical, enforceable guardrails in AI aren't optional. They're overdue.

💄 AI Is The New Scam Artist 👀

Move over, Nigerian Prince. There’s a new scam artist in town.

According to a new global report by ISACA, AI-powered social engineering is now the top cybersecurity threat facing businesses, outranking even malware and ransomware.

The reason? These scams don't look like scams anymore.

AI makes phishing emails more convincing, voice clones more realistic, and fake identities nearly impossible to detect.

The takeaway?

✨ If you're the face of your brand, then your voice, image, and inbox are now part of your security perimeter.

It’s time to think like a founder and a firewall.

October is Cybersecurity Awareness Month. How at ease do you feel about cybersecurity in the AI era?

💄 The Future Of Cyber Defense Is Female

Cybercrime is expected to cost $10.5 trillion this year, and most of the industry is still focused on passive monitoring and damage control.

Enter: May Chen-Contino, CEO of women-led cybersecurity firm Unit 221B.

A lifelong martial arts student and self-defense instructor, Chen-Contino is building a different kind of protection:

→ Threat disruption, not just passive observation.

→ Purpose, not panic.

→ Martial arts discipline meets data science.

In a space still dominated by men, her mission-driven model is a reminder that empathy can be a power move.

"Cybercrime is about people, not just code. It takes people who care enough to fight back."

⬇️ Spread The Movement ⬇️

Know another incredible woman in AI who would love this newsletter? Use the button below to share it with her, your email list, or your community.

Keep Reading