I’ve been reading more about how AI can create fake or edited nude images, and honestly, I’m confused about where the law stands on this. A friend of mine found out her photo was used in some AI “undressing” tool, and even though she got it removed, there were no clear consequences for the person who did it. I feel like the tech is moving faster than the rules. Does anyone know what kind of laws actually exist around this stuff, or is it still a grey area?
That’s a tough situation, and sadly it’s more common than people think. From what I’ve seen, most countries are still catching up with legislation. Some have started treating deepfakes and fake nudes as privacy violations, but enforcement is tricky. Some of these apps claim they don’t keep user data or store images, which helps a bit with accountability, but the real issue is what users do outside the platform. Until laws get stricter, I think people just need to be extra cautious.
It’s crazy how quickly technology keeps outpacing regulations. Governments always seem a few steps behind, trying to figure out how to label new digital behavior as either art, tech innovation, or potential harm. It reminds me of when social media first exploded — no one really knew what was allowed, and rules were made only after problems started piling up.