The government's contentious Online Safety Bill, intended to protect children online, has finally become law after years of debate.
It aims to compel digital companies to take greater responsibility for the content that appears on their platforms.
According to Michelle Donelan, the Secretary of State for Technology, it "ensures the online safety of British society not only now but for decades to come."
However, others have raised concerns about the privacy implications.
A number of messaging providers, including WhatsApp, have threatened to leave the UK in response to the legislation.
The new law gives the regulator, Ofcom, additional enforcement powers and places on companies the responsibility of protecting minors from some legal but harmful content.
It also introduces new rules, such as requiring pornographic websites to verify users' ages before allowing them to view content.
Other new offences include sharing "deepfake" pornography, which uses artificial intelligence (AI) to insert someone's likeness into pornographic content, and cyber-flashing, the sending of unsolicited sexual images online.
The act also contains provisions to make it easier for bereaved parents to obtain data about their deceased children from digital companies.
Particularly contentious are the act's provisions under which messaging providers can be compelled to scan encrypted messages for evidence of child abuse.
Some messaging platforms, such as WhatsApp, Signal, and iMessage, have vowed to leave the UK rather than jeopardize message security, claiming they cannot access or read anyone's communications without undermining the privacy safeguards that are in place for all users.
Proton, a privacy-focused email provider, says it is prepared to fight the government in court if it is asked to weaken its end-to-end encryption.
According to Proton CEO Andy Yen, "the internet as we know it faces a very real threat," since the legislation allows the government to read private messages sent by citizens. "No one would tolerate this in the physical world, so why do we in the digital world?"
The fact-checking organisation Full Fact endorsed the measure, but said that because of "retrograde changes" made to it, it did not go far enough "to address the way that platforms treat harmful misinformation and disinformation."
"Our freedom of expression is left in the hands of self-interested internet companies, while dangerous health misinformation is allowed to run rampant," said Glen Tarman, head of policy and advocacy at Full Fact.