March 11, 2025

Durbin Questions Witnesses In Judiciary Subcommittee Hearing On The STOP CSAM Act

Durbin and Hawley’s STOP CSAM Act would crack down on the proliferation of child sexual abuse material online

WASHINGTON – U.S. Senate Democratic Whip Dick Durbin (D-IL), Ranking Member of the Senate Judiciary Committee, today questioned witnesses during the Senate Judiciary Subcommittee on Crime and Counterterrorism hearing entitled “Ending the Scourge: The Need for the STOP CSAM Act.” Durbin’s bipartisan legislation, the STOP CSAM Act, which he announced he would soon reintroduce with U.S. Senator Josh Hawley (R-MO), would crack down on the proliferation of child sexual abuse material (CSAM) online by allowing victims to sue companies that host this material, among other things. The Judiciary Committee unanimously advanced this legislation last Congress.

Durbin began by asking Mr. Greg Schiller, CEO of Child Rescue Coalition, why the STOP CSAM Act’s provisions to strengthen privacy protections for CSAM victims are so important. These provisions include creating a presumption in favor of keeping protected information under seal and providing the court with remedies against any attorney who does not comply with the privacy protections. Child Rescue Coalition is a nonprofit organization that rescues children from sexual abuse by building technology for law enforcement to identify, arrest, and prosecute child predators. Mr. Schiller previously served as an Assistant U.S. Attorney in the Southern District of Florida, where he prosecuted child sexual exploitation cases.

“The STOP CSAM Act is a comprehensive child protection bill, as we all know. We counted on survivors, law enforcement, prosecutors, and victim advocates to help write it. A significant portion of the bill is dedicated to strengthening privacy protections for child victims. All CSAM victims have their dignity and privacy further ripped away from them when their abuse is captured in pictures or videos and put on display. That’s why the STOP CSAM Act includes significant updates to federal law,” Durbin said. “Mr. Schiller, can you explain why it’s necessary to make these updates? And particularly, I’d like your thoughts on why it’s important to have an enforcement provision within this statute.”

Mr. Schiller responded, “I cannot begin to tell you how many times in my years of prosecuting cases that I have had to redact documents to do what I needed to do to protect the victims’ identity. For years we did not know how to handle it. We didn’t know exactly what we were supposed to do. And this legislation gives us that direction.”

Durbin then asked Ms. Taylor Sines, a survivor of internet-facilitated online enticement, child sexual exploitation, and CSAM, about her experience participating in the prosecution of her exploiter in federal court. Ms. Sines has spoken to thousands of child-serving professionals, both domestic and international, about the harms of CSAM, as well as the challenges with getting images, videos, and fake profiles removed from platforms such as Facebook, Instagram, X, and more.

“Ms. Sines, your exploiter was recently prosecuted in federal court. I understand you participated in that process,” Durbin said. “What, if anything, could have been done to make it easier or better for you?”

Ms. Sines responded, “In all transparency, it was the easiest thing I think I have ever done. From my prosecutor to the detective who worked my case, everyone made it effortless for me… I consider myself very lucky that I don’t think anything could have been done better for my case, but I know that is not how it is for every victim. I know that law enforcement falls short. I know prosecution can fall short. In my case, my school resource officer fell short. I think it’s making sure that everyone is well educated on CSAM and what sextortion and exploitation mean and how prevalent it is in their own backyards.”

Durbin concluded by asking the witnesses whether any social media companies are responding properly and in a timely fashion to CSAM.

Mr. Schiller responded, “I think some of them have been getting better. But I think that a lot of that has to do with the pressure they have been receiving from law enforcement, from prosecutors, from the Department of Justice, to provide more information. But that’s not the way it is supposed to happen. There is supposed to be a uniform standard to make sure they are all providing the same and necessary information.”

Video of Durbin’s questions in Committee is available here.

Audio of Durbin’s questions in Committee is available here.

Footage of Durbin’s questions in Committee is available here for TV stations.

Durbin has used his role on the Senate Judiciary Committee to prioritize child safety online through hearings, legislation, and oversight efforts. On January 31, 2024, while Durbin was serving as Chair, the Committee held a hearing featuring testimony from the CEOs of social media companies Discord, Meta, Snap, TikTok, and X (formerly known as Twitter). This hearing highlighted the ongoing risk to children and the immediate need for Congress to act on the bipartisan bills reported by the Committee.

In addition, Durbin’s bipartisan Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024 (DEFIANCE Act) passed the Senate in July 2024. The legislation would hold accountable those responsible for the proliferation of nonconsensual, sexually explicit “deepfake” images and videos. The volume of “deepfake” content available online is increasing exponentially as the technology used to create it has become more accessible to the public. The overwhelming majority of this material is sexually explicit and is produced without the consent of the person depicted.

Last month, the Judiciary Committee held a hearing entitled “Children’s Safety in the Digital Era: Strengthening Protections and Addressing Legal Gaps.” Durbin’s opening statement from that hearing is available here, and his questions for the witnesses are available here.

-30-