    The quest to defend against tech in intimate partner violence

By Techurz · June 19, 2025 · 5 min read

    As technology evolved, the ways abusers took advantage evolved too. Realizing that the advocacy community “was not up on tech,” Southworth founded the National Network to End Domestic Violence’s Safety Net Project in 2000 to provide a comprehensive training curriculum on how to “harness [technology] to help victims” and hold abusers accountable when they misuse it. Today, the project offers resources on its website, like tool kits that include guidance on strategies such as creating strong passwords and security questions. “When you’re in a relationship with someone,” explains director Audace Garnett, “they may know your mother’s maiden name.” 
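The tool-kit advice above can be made concrete. A minimal sketch, assuming only Python's standard library: random credentials generated with the `secrets` module cannot be guessed from personal knowledge the way a mother's maiden name can. The word list here is illustrative only; a real tool would draw from a large dictionary.

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Generate a random password from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

def generate_security_answer(num_words: int = 4) -> str:
    """Generate a random multi-word security answer that an abuser
    cannot derive from personal knowledge."""
    # Illustrative word list (hypothetical); use a large dictionary in practice.
    words = ["harbor", "violet", "granite", "comet", "willow",
             "ember", "falcon", "meadow", "quartz", "tundra"]
    return " ".join(secrets.choice(words) for _ in range(num_words))

print(generate_password())
print(generate_security_answer())
```

The point of using `secrets` rather than `random` is that it draws from the operating system's cryptographically secure source, so the output is unpredictable even to someone who knows the victim well.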

    Big Tech safeguards

    Southworth’s efforts later extended to advising tech companies on how to protect users who have experienced intimate partner violence. In 2020, she joined Facebook (now Meta) as its head of women’s safety. “What really drew me to Facebook was the work on intimate image abuse,” she says, noting that the company had come up with one of the first “sextortion” policies in 2012. Now she works on “reactive hashing,” which adds “digital fingerprints” to images that have been identified as nonconsensual so that survivors only need to report them once for all repeats to get blocked.
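The reactive-hashing idea can be sketched in principle: once an image is reported, its fingerprint goes into a blocklist checked on every future upload. This toy model (not Meta's implementation) uses an exact SHA-256 hash for simplicity; production systems use perceptual hashes, which also match resized or re-encoded copies of the same image.

```python
import hashlib

class ReactiveHashRegistry:
    """Toy model of reactive hashing: once an image is reported as
    nonconsensual, its fingerprint is stored so any re-upload of the
    same bytes is blocked automatically."""

    def __init__(self) -> None:
        self._blocked: set[str] = set()

    @staticmethod
    def _fingerprint(image_bytes: bytes) -> str:
        # Exact hash for simplicity; real systems use perceptual hashes
        # that tolerate resizing and re-encoding.
        return hashlib.sha256(image_bytes).hexdigest()

    def report(self, image_bytes: bytes) -> None:
        """Survivor reports the image once."""
        self._blocked.add(self._fingerprint(image_bytes))

    def is_blocked(self, image_bytes: bytes) -> bool:
        """Checked on every subsequent upload."""
        return self._fingerprint(image_bytes) in self._blocked

registry = ReactiveHashRegistry()
registry.report(b"reported-image-bytes")
print(registry.is_blocked(b"reported-image-bytes"))  # True
print(registry.is_blocked(b"different-image"))       # False
```

This is why survivors "only need to report them once": matching happens against the stored fingerprint, not the image itself, so the platform never needs to re-circulate the content to block it.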

    Other areas of concern include “cyberflashing,” in which someone sends unwanted explicit photos. Meta has worked to prevent that on Instagram by not allowing accounts to send images, videos, or voice notes unless they follow you. Beyond that, though, many of Meta’s practices surrounding potential abuse appear to be more reactive than proactive. The company says it removes online threats that violate its policies against bullying and that promote “offline violence.” But earlier this year, Meta made its policies about speech on its platforms more permissive. Now users are allowed to refer to women as “household objects,” CNN reported, and to post transphobic and homophobic comments that had formerly been banned.

    A key challenge is that the very same tech can be used for good or evil: A tracking function that’s dangerous for someone whose partner is using it to stalk them might help someone else stay abreast of a stalker’s whereabouts. When I asked sources what tech companies should be doing to mitigate technology-assisted abuse, researchers and lawyers alike tended to throw up their hands. One cited the problem of abusers using parental controls to monitor adults instead of children—tech companies won’t do away with those important features for keeping children safe, and there is only so much they can do to limit how customers use or misuse them. Safety Net’s Garnett said companies should design technology with safety in mind “from the get-go” but pointed out that in the case of many well-established products, it’s too late for that. A couple of computer scientists pointed to Apple as a company with especially effective security measures: Its closed ecosystem can block sneaky third-party apps and alert users when they’re being tracked. But these experts also acknowledged that none of these measures are foolproof. 

    Over roughly the past decade, major US-based tech companies including Google, Meta, Airbnb, Apple, and Amazon have launched safety advisory boards to address this conundrum. The strategies they have implemented vary. At Uber, board members share feedback on “potential blind spots” and have influenced the development of customizable safety tools, says Liz Dank, who leads work on women’s and personal safety at the company. One result of this collaboration is Uber’s PIN verification feature, in which riders have to give drivers a unique number assigned by the app in order for the ride to start. This ensures that they’re getting into the right car. 
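The PIN flow described above amounts to a simple shared-secret check: the app assigns the rider a one-time code, and the trip can begin only after the driver enters it. A hypothetical, simplified model (not Uber's actual code) might look like:

```python
import secrets

class Ride:
    """Hypothetical model of PIN-verified ride start (not Uber's real code)."""

    def __init__(self) -> None:
        # The app assigns the rider a one-time 4-digit PIN.
        self.pin = f"{secrets.randbelow(10_000):04d}"
        self.started = False

    def driver_enters(self, pin: str) -> bool:
        """The ride starts only if the driver enters the rider's PIN,
        confirming the rider is in the right car."""
        if secrets.compare_digest(pin, self.pin):
            self.started = True
        return self.started

ride = Ride()
print(ride.driver_enters(ride.pin))  # True
```

Because only the rider's app knows the PIN, a driver who can repeat it must have heard it from the rider in person, which is what ties the physical car to the booked trip.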

    Apple’s approach has included detailed guidance in the form of a 140-page “Personal Safety User Guide.” Under one heading, “I want to escape or am considering leaving a relationship that doesn’t feel safe,” it provides links to pages about blocking and evidence collection and “safety steps that include unwanted tracking alerts.” 

    Creative abusers can bypass these sorts of precautions. Recently Elizabeth (for privacy, we’re using her first name only) found an AirTag her ex had hidden inside a wheel well of her car, attached to a magnet and wrapped in duct tape. Months after the AirTag debuted, Apple had received enough reports about unwanted tracking to introduce a security measure letting users who’d been alerted that an AirTag was following them locate the device via sound. “That’s why he’d wrapped it in duct tape,” says Elizabeth. “To muffle the sound.”

    Laws play catch-up

    If tech companies can’t police tech-facilitated abuse, law enforcement should—but its responses vary. “I’ve seen police say to a victim, ‘You shouldn’t have given him the picture,’” says Lisa Fontes, a psychologist and an expert on coercive control, about cases where intimate images are shared nonconsensually. When people have brought police hidden “nanny cams” planted by their abusers, Fontes has heard responses along the lines of “You can’t prove he bought it [or] that he was actually spying on you. So there’s nothing we can do.”
