What the Take It Down Law Means for Middle and High School Students

Facebook and Instagram logos over a face scan background, showing how the Take It Down Law forces platforms to remove harmful content


If you’re in middle or high school right now, chances are you’ve seen your fair share of wild stuff online. Screenshots. Rumors. AI-edited videos. Maybe even deepfakes.

And maybe, at some point, you’ve wondered: what happens if someone crosses a line? What if a photo or video gets shared that’s not just embarrassing, but deeply personal, and fake?

Well, as of May 2025, there’s a new federal law in place designed to deal with exactly that. It’s called the Take It Down Act, and it has serious weight behind it: lawmakers on both sides of the political aisle, celebrities, tech giants, and, most importantly, students who lived through the worst of what the internet can do.

What Is the Take It Down Act?

 


At its core, the Take It Down Act makes it illegal to knowingly share or post non-consensual intimate imagery (NCII), including fake or AI-generated images, without the person’s permission. That includes stuff like:

  • Real photos or videos shared without consent
  • AI-edited or “deepfake” images that look real
  • Anything made to humiliate, harass, or exploit someone sexually

And here’s the real headline: if someone reports that kind of content, websites and apps have 48 hours to remove it. Period.

That means platforms like TikTok, Instagram, Reddit, or Discord can’t just shrug it off or make you “prove” it’s fake. Once there’s a valid report from the victim, they’re required by law to act quickly or face serious consequences.

Why Should Students Care?

If you’re thinking, “This doesn’t affect me,” take a minute. Because a big chunk of the law’s momentum came from teenagers, including a girl named Elliston Berry.

When Elliston was just 14, someone at her school used AI to create a fake nude image of her and shared it. It wasn’t real, but it looked real enough.

And it wrecked her sense of safety. For nearly a year, she and her mom begged platforms to take it down. Some did. Some didn’t. Some ignored them altogether.

Now? That kind of delay is no longer legal. The law forces tech companies to respond, fast.

And it’s not just about deepfakes. Say you break up with someone, and they decide to share a private picture you sent. That’s NCII, too. Even if you sent it willingly at the time, sharing it without your permission is now a federal crime.

What Counts as “Non-Consensual Intimate Imagery”?

The law is pretty specific. Here’s what falls under the NCII umbrella:

  • Nude photos or videos shared without permission
  • Any sexual content where the person didn’t agree to it being posted
  • Fake nudes or AI-created images (aka “deepfake porn”) showing someone’s face on another person’s body
  • Content that looks real to a regular viewer, even if it’s AI-made

And just because someone agreed to have a photo taken doesn’t mean they agreed to have it shared. The law makes that clear.

What About AI Deepfakes?


That’s a big part of why the law was created.

Deepfakes are fake videos, images, or audio that use artificial intelligence to make it seem like someone did or said something they didn’t. They can be used for pranks, memes, or movie effects, but they’ve also been used in really harmful ways.

For example, someone could take your school photo, slap it onto someone else’s body in a sexually explicit image, and then send it around. The image might be fake, but to the people seeing it, it looks real. And that’s enough to destroy someone’s reputation, cause mental health issues, or lead to real-life harassment.

Under the Take It Down law, those deepfakes are treated the same as real images: if they look real enough to fool people, they qualify as NCII.

How Does the Take It Down Law Work?

Here’s the process, step-by-step, for when someone wants a fake or explicit image removed:

  1. The victim notices the image or gets alerted, whether on social media, in a group chat, on a Discord server, wherever.
  2. They notify the platform, either through a complaint form or a direct message.
  3. The platform has 48 hours to take it down, including the original post and any reposts or re-shares they can find.
  4. The FTC (Federal Trade Commission) is in charge of making sure platforms follow through. If they don’t, they could get fined.

So if you ever hear, “They don’t care, they’re not gonna do anything,” that’s no longer true. It’s the law now.

What If You’re the One Who Posts Something?

A man takes a photo of a blurred woman without consent, highlighting what the Take It Down Law now treats as a federal offense
Source: YouTube/Screenshot. Sharing sexual content that looks real without consent makes you legally responsible

Let’s be blunt: if you knowingly share something that qualifies as NCII, even if it’s a joke, even if you didn’t make it, you could be committing a federal crime.

That’s a big deal.

It doesn’t matter if:

  • You’re under 18
  • You didn’t make the image
  • You only sent it to a few people
  • It was meant as a joke or a prank

If you shared something sexual that looks real and the person didn’t say yes to posting it, you can be held responsible. Even if you delete it later. Even if you didn’t think it was “a big deal.”

What About Free Speech?


Good question. Some people wondered if this law would go too far and end up censoring stuff unfairly. But the law includes a safeguard: it only applies when someone knowingly publishes non-consensual intimate content, and when that content would look real to a “reasonable person.”

So satire, memes, obvious parodies, and artistic commentary are not targeted. It’s not about policing the internet; it’s about protecting real people from real harm.

Who’s Behind the Law?

The Take It Down Act came from a surprising mix of people.

  • Senator Ted Cruz (Republican, Texas) helped write it.
  • Senator Amy Klobuchar (Democrat, Minnesota) co-led it.
  • First Lady Melania Trump threw major support behind it as part of her “Be Best” campaign, especially after meeting victims like Elliston Berry.
  • It passed Congress with overwhelming support: 409–2 in the House, and unanimously in the Senate.

And beyond politics, the law got backing from some heavy hitters:

  • Google
  • Meta (Facebook, Instagram)
  • TikTok
  • Snap
  • Amazon
  • The National Center for Missing & Exploited Children
  • RAINN (Rape, Abuse, & Incest National Network)

Even celebrities like Paris Hilton spoke out in favor.

What Can You Do if It Happens to You?

Pixelated face of a woman showing the risk of sharing explicit content without consent
Source: YouTube/Screenshot. If the image keeps coming back, some services can help remove it permanently

If you ever find yourself in a situation where something explicit is posted without your permission, real or fake, you’ve got options.

Here’s what to do:

  • Take screenshots of the post, comments, and any usernames involved.
  • Report the content directly to the platform (look for “report” or “flag” options).
  • Tell a trusted adult, such as a parent, teacher, counselor, or coach. Don’t try to handle it alone.
  • Contact law enforcement if the image is spreading or if you feel unsafe.
  • Use legal resources: sites like the Cyber Civil Rights Initiative can guide you through getting things taken down.

If the image keeps resurfacing or you’re running into roadblocks getting it taken down, some services specialize in persistent content removal.

According to Guaranteed Removals, a company that works on online reputation and image takedown cases, non-consensual content, especially content shared across multiple platforms, can require repeated takedown efforts and professional follow-up before it is fully removed from search engines and social media archives.

You’re not being dramatic. You’re protecting your identity.

Final Thoughts

Middle and high school can be intense enough without the internet weaponizing your face, your body, or your image. The Take It Down Act doesn’t solve everything, but it gives power back to the people who’ve had it taken away.

It’s a sign that adults in power are finally starting to take digital harassment seriously, especially the kind that targets teens, girls, and students caught up in the wrong place at the wrong time.

So, if you’ve ever thought, “They won’t do anything about it,” think again.

Now they have to.

And if you’ve ever worried you’d be blamed, dismissed, or ignored, know this: the law is on your side. And you’re not alone.


Jessica Giles

Hi, I’m Jessica Giles, a passionate education specialist with a Bachelor’s degree in Education from Boston University and over 10 years of hands-on classroom experience teaching middle school students. My expertise lies in developing innovative strategies to enhance critical thinking, creativity, and collaborative learning. At Springfield Renaissance School, I combine my real-world teaching experiences with my enthusiasm for educational writing, aiming to empower both students and teachers alike.
