Airbnb Busted After Host Uses AI Images to Pin $7K in Fake Damages on Guest
A London academic is speaking out after narrowly avoiding a $7,000 bill from Airbnb — the result of a host allegedly using AI-generated photos to falsely accuse her of causing major property damage.
The guest had rented a one-bedroom Manhattan apartment for a long-term stay but left early due to safety concerns about the neighborhood. Not long after her departure, the host filed a damage claim for nearly $16,000, submitting photos of cracked furniture, stained bedding, and broken appliances — including a robot vacuum and air conditioner — as alleged evidence.
But something didn’t add up.
Upon examining the photos, the guest — who maintains she left the unit in good condition — noticed visual discrepancies suggesting the images had been heavily altered or fabricated entirely with AI tools. Despite her objections and her offer of eyewitness testimony, Airbnb initially sided with the host and demanded she pay $7,000 in damages.
It wasn’t until The Guardian began investigating the case that Airbnb changed course.
As AI enhancement of images becomes commonplace for selfies and social media, one woman found that her Airbnb host likely used the tech to try to scam her out of thousands of dollars in a false damage claim.
— 10-100 Consultancy (@10_100) August 4, 2025
After the media scrutiny, Airbnb issued a full apology, refunded the guest $5,700 — the entire cost of her stay — and removed a retaliatory negative review the host had left on her profile. The host, designated a “superhost” on the platform, did not respond to inquiries and has since received a warning from Airbnb. The company admitted it could not verify the authenticity of the photos the host submitted.
Airbnb now says it is reviewing its internal claims process. The company acknowledged the growing challenge of AI-manipulated images being used fraudulently and emphasized its commitment to fair resolutions for both hosts and guests.