Reacting to a tweet in which YouTube claimed it was working to remove the footage, Mr Javid said YouTube, Google, Facebook and Twitter "really need to do more to stop violent extremism being promoted on your platforms".
Facebook and YouTube did not immediately respond to HuffPost's request for comment on the matter.
An armed police officer stands guard on a perimeter outside the Linwood mosque after Friday's gun attacks in Christchurch, New Zealand, March 16, 2019. Reuters was unable to confirm the authenticity of the footage.
Twitter has also been battling to remove shared videos.
But that's just a drop in the bucket of what is needed to police the social media platform, said Siva Vaidhyanathan, author of "Antisocial Media: How Facebook Disconnects Us and Undermines Democracy".
The shootings in New Zealand show how the services they offer can be exploited by extremist groups, said Lucinda Creighton, senior advisor to the Counter Extremism Project.
The massacre in Christchurch was live-streamed by an attacker through his Facebook profile for 17 minutes, according to a copy seen by Reuters. Facebook says it does not want to act as a censor, as videos of violence, such as those documenting police brutality or the horrors of war, can serve an important goal.
In August, a shooting at a Madden 19 video-game tournament in Jacksonville, Florida, was captured on live video.
If Facebook wanted to monitor every livestream to prevent disturbing content from making it out in the first place, "they would have to hire millions of people", something it's not willing to do, said Vaidhyanathan, who teaches media studies at the University of Virginia.
Court rules Remington can be sued over Sandy Hook shooting
But there are exceptions for violations of state laws, and that is what the Connecticut court will now allow a state court trial to examine. The lawsuit, brought by relatives of nine victims and one survivor, points to the "militaristic" marketing of Remington's AR-15 rifle.
"While Google, YouTube, Facebook and Twitter all say that they're cooperating and acting in the best interest of citizens to remove this content, they're actually not, because they're allowing these videos to reappear all the time", Lucinda Creighton, a senior adviser at the Counter Extremism Project, a global policy organization, told CNN.
"The responsibility for content of the stream lies completely and exclusively on the person who initiated the stream".
He said the company condemned "the actions of these terrible people and their disgusting use of our app for these purposes".
"Our hearts are broken over today's terrible tragedy in New Zealand", the video portal said in a Twitter posting. "We will do whatever is humanly possible for it to never happen again".
Ionescu said that's because video and images are harder to block than words.
Facebook is "removing any praise or support for the crime and the shooter or shooters as soon as we're aware", she said.
But private online communities dedicated to violent content were still looking for ways to share copies of the video.
Facebook, Twitter and YouTube said they would take down content from the mass shooting that was posted online as the attack unfolded.
"They have the tools with social listening to go in with keyword terms and have moderators view and remove all videos linked to this type of incident", she said.