© 2026 WVIK

Illinois and Iowa US Senators back ending law that offers protections for internet platform providers, claiming companies are negatively impacting children

Illinois Senator Dick Durbin and Iowa Senator Chuck Grassley
Senate Judiciary Committee

A bipartisan bill in Congress would end a key foundation of the internet, impacting providers and users across the country. The Sunset Section 230 Act, cosponsored by Illinois US Senator Dick Durbin (D-Springfield), would allow victims to bring legal action against online companies for harmful content posted on their platforms. Some opponents argue, however, the measure could compromise free speech and shut down all but the largest online platforms, which rely on liability protections to operate.

David Schwartz, Associate Professor of Journalism and Communication at Augustana College, said Section 230 is a relic of the 1996 Communications Decency Act — the only portion of the law left standing after the Supreme Court sided with the American Civil Liberties Union in the 1997 case Reno v. ACLU, in which the organization argued the CDA violated adults' First Amendment rights.

“Imagine if your toddler or your first grader goes downstairs before you wake up in the morning,” Schwartz said in an interview with WVIK. “Who knows what they're going to be looking at online. According to the CDA, it could have been the adults who were responsible for that. Or if you put certain limits on what they could see, then that violates the First Amendment of the adults. Therefore, the CDA was ruled unconstitutional by the Supreme Court. I say it's a relic only because it was the only thing that was saved. So it's as if they threw the entire piece of legislation into a burning building. Somebody ran in to save it, and all they saved coming out was Section 230.”

Senior Counsel David Greene at the nonprofit Electronic Frontier Foundation said the section does not offer blanket immunity for providers.

“It didn't apply to violations of intellectual property law. So there's a whole different statutory scheme they created for copyright infringement, and it didn't apply to violations of criminal law,” Greene said in an interview with WVIK. “But for everything else, they chose immunity, which essentially means that online intermediaries will bear no liability for harm caused by user-generated content. So not things they wrote themselves, but only things that their users wrote.”

Mary Anne Franks, Professor of Law at the George Washington University Law School, said the section can be broken down into two parts.

“One is that Section 230 says that there is this legal shield when a platform or other provider tries to remove information that the provider considers to be objectionable in some kind of manner,” Franks said in an interview with WVIK. “And that's a really straightforward and pretty positive aspect of Section 230, known as C2, which really focuses on encouraging that original goal of letting these companies say, ‘[Y]ou know what, whether or not this speech is protected or unprotected or defensible or indefensible, we as a company have the right to decide that we're going to take some of it out or not leave it on our platforms.’ And Section 230 C2 says you can't sue these companies for making that kind of choice, even if you, as a content creator, are upset because of the choices that were made.”

“The other part of Section 230 that gets a lot more airtime is the C1 provision, which is a lot more ambiguously worded. It says that you can't treat these kinds of platforms or providers of internet services as the publisher or the speaker of somebody else's content. So it makes the distinction between the platform that is providing access to third-party content and the third-party content itself. But it, in some ways, is the inverse almost of the two, because it says or has been interpreted to say there should be no liability, or broadly speaking, shielding from liability.”

Iowa Senator Chuck Grassley (R-New Hartford) is also co-sponsoring the legislation introduced by Senator Lindsey Graham of South Carolina. Both Senators Grassley and Durbin, in interviews with WVIK, said the repeal is meant to protect children.

“[T]here was a good faith effort to put this protection from lawsuit into the legislation because we had then very infant high tech and social media networks just starting, and we wanted to do everything we could to let them get off the ground,” Senator Grassley told WVIK in February. “And because we didn't know what the future held. Now we know what the future did hold. It did hold that there's a lot of bad things happening from social media. And the extent to which it is happening, whether that's sextortion or whether it's human trafficking [The Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA) amended Section 230, explicitly stating it does not shield platforms that host, promote, or facilitate sex trafficking. That bill was signed into law in 2018] or whether it's people having mental health issue problems, I think that it's going to hold these big tech companies more responsible for the messages that go out over their platform.”

In an interview in March, Senator Durbin gave the example of photos of children shared online: parents seeking to have the photos removed could be rebuffed by a provider citing the section as protection to do nothing.

“Section 230 was written at a time when we didn't think the Internet was going to make it on its own, and it needed protection. Now it should be treated like every other business in America. If they knowingly, recklessly continue to publish these images at the expense of children, they ought to pay a price in court,” Senator Durbin said.

EFF Senior Counsel Greene said the senator’s goal would not be achieved efficiently.

“There's nothing that's stopping Congress now from passing a law that would address child exploitation. In particular, Congress has passed other laws that have special rules for liability for intermediaries in certain situations,” Greene said. “And there's nothing stopping Congress from doing [that] now but sunsetting the entire law, and really removing a fundamental piece of architecture from the Internet is a really ham-handed blunderbuss way of trying to accomplish that. So no, I don't think that will accomplish the goal either efficiently, and I think it will do so at great cost to all other types of speech.”

Jared Schroeder, Associate Professor of Journalism at the Missouri School of Journalism, said that if the bill passed, providers would have to approve every message on their platforms to avoid legal action.

“[O]ne possible outcome is they're suddenly being sued countless times a day and their financial model just simply cannot handle that. And they would have to change the systems to basically where we don't publish almost anything, or we run everything through lawyers. That's not feasible, right? It's not really feasible to run everything through a lawyer,” Schroeder said in an interview with WVIK.

“[W]hen we're talking on the scale that we're talking about, and like it's just mind-numbing work, you could say that they could run it through AI, and one of two things would happen,” Schroeder said. “The AI, it's very likely that the AI would basically just take anything that could possibly be the legal problem out, which would destroy the space as it is. Because it would just vanillafy the entire Internet. Or it would take all the spice out of the Internet, or it wouldn't catch things. Right. Because it just doesn't fundamentally understand human language or human behavior.”

Schroeder said smaller providers could not keep up with the threat of lawsuits and would probably close up shop, removing niche platforms, unlike Meta and Google, which could afford such litigation.

During their interviews, Schroeder, Schwartz, and Franks said they do not support a full repeal, but rather amending the section.

“[M]ake it explicitly clear that that kind of immunity shield should only apply if we're actually talking about speech as opposed to any number of other kinds of things that could be occurring online. Transactions, financial issues, economic practices, things of that nature,” Franks said. “We should restrict it to expressive content and it should not be given to companies that are making deliberate choices about harmful content that they are then amplifying for their own purposes and for their own commercial interests. In other words, if a company is aware of certain types of harms that flow from content that they are, that they have on their platform, and they choose to not only continue to have that content on their platform, but they even choose to maybe to amplify it or to solicit it, then that shield should go away. Now, that doesn't mean that they would be found liable for the harm that does flow from that content, but it would mean that if they have that kind of awareness and have that kind of control and have made that kind of reckless choice to continue to promote certain harmful content, they shouldn't have this preemptive immunity.”

The Senate Commerce, Science, and Transportation Committee is considering the legislation and held a hearing on Section 230 on March 18th. The legislation, which would sunset the section two years after passage, has not come up for a vote. Committee members have until Wednesday, March 25th, to submit questions to the guest speakers, and their responses are due back by April 8th.

EFF Senior Counsel Greene said the immunity provided by the section is not unique.

“But like most states have an immunity for live television, for statements made during live television and radio broadcast because state legislatures recognize that there was some value in being able to hear from people live. But there would also be great risks of liability if broadcasters were held liable if someone said something illegal or harmful. So that's just an example of one of the cases where the legislature made a thoughtful decision to have immunity,” Greene said. “The point I think is really important to understand, especially because I hear members of Congress say one of the reasons I think a sunset is such a bad faith idea is because the US is not the only nation in the world that's dealing with this issue.

“Every country around the world has addressed intermediary liability as an issue. And so there are a lot of models and a lot of learning and a lot of knowledge out there that if Congress was really thoughtful about this and really was going to engage in good-faith efforts. There are lots of places to look to see what the alternatives are. And when you do look internationally, what you see is that most places in the world have adopted a system that, if it's not immunity, is really close to it. So if there's some reason the US experience is going to be different, that needs to be based on sound policy reasons, not this type of ham-handed thing. We're going to sunset this, which seems more like a punishment and a threat than a good-faith effort to reform a regulation.”

This story was produced by WVIK, Quad Cities NPR.

Brady is a 2021 Augustana College graduate majoring in Multimedia Journalism-Mass Communication and Political Science. Over the last eight years, he has reported in central Illinois at various media outlets, including The Peoria Journal Star, WCBU Peoria Public Radio, Advanced Media Partners, and WGLT Bloomington-Normal's Public Media.