Close Menu
TechurzTechurz

    Subscribe to Updates

    Get the latest creative news from FooBar about art, design and business.

    What's Hot

    Less than 3 days to secure your exhibit table at Disrupt 2025

    October 15, 2025

    The full Space Stage agenda at Disrupt 2025

    October 15, 2025

    The new iPad Pro’s biggest upgrade isn’t the M5 chip – I’d buy it for this feature instead

    October 15, 2025
    Facebook X (Twitter) Instagram
    Trending
    • Less than 3 days to secure your exhibit table at Disrupt 2025
    • The full Space Stage agenda at Disrupt 2025
    • The new iPad Pro’s biggest upgrade isn’t the M5 chip – I’d buy it for this feature instead
    • How Attackers Bypass Synced Passkeys
    • Flax Typhoon exploited ArcGIS to gain long-term access
    • When Face Recognition Doesn’t Know Your Face Is a Face
    • There’s one critical reason why I choose this Garmin smartwatch over competing models
    • Two CVSS 10.0 Bugs in Red Lion RTUs Could Hand Hackers Full Industrial Control
    Facebook X (Twitter) Instagram Pinterest Vimeo
    TechurzTechurz
    • Home
    • AI
    • Apps
    • News
    • Guides
    • Opinion
    • Reviews
    • Security
    • Startups
    TechurzTechurz
    Home»Startups»A Single Poisoned Document Could Leak ‘Secret’ Data Via ChatGPT
    Startups

    A Single Poisoned Document Could Leak ‘Secret’ Data Via ChatGPT

    TechurzBy TechurzAugust 7, 2025No Comments3 Mins Read
    Share Facebook Twitter Pinterest LinkedIn Tumblr Reddit Telegram Email
    A Single Poisoned Document Could Leak ‘Secret’ Data Via ChatGPT
    Share
    Facebook Twitter LinkedIn Pinterest Email

    The latest generative AI models are not just stand-alone text-generating chatbots—instead, they can easily be hooked up to your data to give personalized answers to your questions. OpenAI’s ChatGPT can be linked to your Gmail inbox, allowed to inspect your GitHub code, or find appointments in your Microsoft calendar. But these connections have the potential to be abused—and researchers have shown it can take just a single “poisoned” document to do so.

    New findings from security researchers Michael Bargury and Tamir Ishay Sharbat, revealed at the Black Hat hacker conference in Las Vegas today, show how a weakness in OpenAI’s Connectors allowed sensitive information to be extracted from a Google Drive account using an indirect prompt injection attack. In a demonstration of the attack, dubbed AgentFlayer, Bargury shows how it was possible to extract developer secrets, in the form of API keys, that were stored in a demonstration Drive account.

    The vulnerability highlights how connecting AI models to external systems and sharing more data across them increases the potential attack surface for malicious hackers and potentially multiplies the ways where vulnerabilities may be introduced.

    “There is nothing the user needs to do to be compromised, and there is nothing the user needs to do for the data to go out,” Bargury, the CTO at security firm Zenity, tells WIRED. “We’ve shown this is completely zero-click; we just need your email, we share the document with you, and that’s it. So yes, this is very, very bad,” Bargury says.

    OpenAI did not immediately respond to WIRED’s request for comment about the vulnerability in Connectors. The company introduced Connectors for ChatGPT as a beta feature earlier this year, and its website lists at least 17 different services that can be linked up with its accounts. It says the system allows you to “bring your tools and data into ChatGPT” and “search files, pull live data, and reference content right in the chat.”

    Bargury says he reported the findings to OpenAI earlier this year and that the company quickly introduced mitigations to prevent the technique he used to extract data via Connectors. The way the attack works means only a limited amount of data could be extracted at once—full documents could not be removed as part of the attack.

    “While this issue isn’t specific to Google, it illustrates why developing robust protections against prompt injection attacks is important,” says Andy Wen, senior director of security product management at Google Workspace, pointing to the company’s recently enhanced AI security measures.

    ChatGPT data document leak Poisoned secret single
    Share. Facebook Twitter Pinterest LinkedIn Tumblr Email
    Previous ArticleDreame’s Three New L40 Robot Vacuums Don’t Get Stuck Easily
    Next Article Apple CEO Tim Cook presents special gold gift to Donald Trump
    Techurz
    • Website

    Related Posts

    Security

    Single 8-Byte Write Shatters AMD’s SEV-SNP Confidential Computing

    October 14, 2025
    Security

    npm, PyPI, and RubyGems Packages Found Sending Developer Data to Discord Channels

    October 14, 2025
    Security

    Satellites Are Leaking the World’s Secrets: Calls, Texts, Military and Corporate Data

    October 14, 2025
    Add A Comment
    Leave A Reply Cancel Reply

    Top Posts

    The Reason Murderbot’s Tone Feels Off

    May 14, 20259 Views

    Start Saving Now: An iPhone 17 Pro Price Hike Is Likely, Says New Report

    August 17, 20258 Views

    CNET’s Daily Tariff Price Tracker: I’m Keeping Tabs on Changes as Trump’s Trade Policies Shift

    May 27, 20258 Views
    Stay In Touch
    • Facebook
    • YouTube
    • TikTok
    • WhatsApp
    • Twitter
    • Instagram
    Latest Reviews

    Subscribe to Updates

    Get the latest tech news from FooBar about tech, design and biz.

    Most Popular

    The Reason Murderbot’s Tone Feels Off

    May 14, 20259 Views

    Start Saving Now: An iPhone 17 Pro Price Hike Is Likely, Says New Report

    August 17, 20258 Views

    CNET’s Daily Tariff Price Tracker: I’m Keeping Tabs on Changes as Trump’s Trade Policies Shift

    May 27, 20258 Views
    Our Picks

    Less than 3 days to secure your exhibit table at Disrupt 2025

    October 15, 2025

    The full Space Stage agenda at Disrupt 2025

    October 15, 2025

    The new iPad Pro’s biggest upgrade isn’t the M5 chip – I’d buy it for this feature instead

    October 15, 2025

    Subscribe to Updates

    Get the latest creative news from FooBar about art, design and business.

    Facebook X (Twitter) Instagram Pinterest
    • About Us
    • Contact Us
    • Privacy Policy
    • Terms and Conditions
    • Disclaimer
    © 2025 techurz. Designed by Pro.

    Type above and press Enter to search. Press Esc to cancel.