Prime Minister Theresa May has put big technology companies in the crosshairs of the fight against terrorism, saying they must do more to prevent hate speech online.
"We cannot allow this ideology the safe space it needs to breed," May said in a Sunday statement outside Number 10 Downing Street in the wake of Saturday's terror attack in the London Bridge and Borough Market area of the capital.
"Yet that is what the internet - and the big companies that provide internet-based services - provide," May added. "We need to work with allied, democratic governments to reach international agreements that regulate cyberspace to prevent the spread of extremism and terrorist planning.
"And we need to do everything we can at home to reduce the risks of extremism online," she added.
The attack killed seven people and injured 48. It followed a series of attacks in the U.K., including a suicide bombing at an Ariana Grande concert in Manchester in May, which killed 22 people, and a vehicle attack on Westminster Bridge, which killed five.
In a response to May's statement posted on Facebook, the company's policy director wrote: "Using a combination of technology and human review, we work aggressively to remove terrorist content from our platform as soon as we become aware of it.
"If we become aware of an emergency involving imminent harm to someone's safety, we notify law enforcement.
"Online extremism can only be tackled with strong partnerships. We have long collaborated with policymakers, civil society, and others in the tech industry, and we are committed to continuing this important work together."
In early May, after a row erupted with advertisers after commercials were shown alongside extremist content, Facebook hired 3,000 new people to moderate and remove inappropriate content.
A spotlight is also being put on the encryption of messaging services, including Facebook's WhatsApp, which can be used to plan attacks.
U.K. Home Secretary Amber Rudd has demanded that social media companies "limit the amount of end-to-end encryption."
However, campaigners such as the Open Rights Group have warned that this is a risky approach that could force "these vile networks into even darker corners of the web, where they will be harder to observe."
Facebook, YouTube and Twitter have all pledged to remove flagged extremist content within 24 hours. According to European Commission data, Facebook succeeded in doing so in 57.9% of cases, while YouTube managed it in only 42.6% of cases and Twitter in 39%.
Jim Cramer and the AAP team hold positions in Facebook and Alphabet for their Action Alerts PLUS Charitable Trust Portfolio.