Video of the Christchurch terror attack was viewed 4,000 times before it was removed from Facebook, the social media platform has said.
No one reported the video while it was being streamed live, and the first user flagged the footage to moderators only 29 minutes after the broadcast began.
The company earlier revealed that it had removed 1.5 million videos of the attack worldwide in the 24 hours after the shootings, 1.2 million of which were blocked at upload.
Facebook and other social media firms have come under fire over the rapid spread of the footage across the networks and around the world.
In an article on Tuesday, Chris Sonderby, vice president and deputy general counsel at Facebook, confirmed the video was viewed fewer than 200 times during its live broadcast. “No users reported the video during the live broadcast,” he said.
“The first user report on the original video came in 29 minutes after the video started, and 12 minutes after the live broadcast ended.
“Before we were alerted to the video, a user on 8chan posted a link to a copy of the video on a file-sharing site.”
Sonderby said Facebook was “working around the clock” to prevent the video from appearing on its site.
New Zealand Prime Minister Jacinda Ardern has called on social media companies to take responsibility for ensuring that such content cannot be distributed or viewed on their platforms, saying they are “the publisher, not just the postman”.
She told the country’s parliament: “There is no question that ideas and language of division and hate have existed for decades, but their form of distribution, the tools of organisation, they are new.
“We cannot simply sit back and accept that these platforms just exist and that what is said on them is not the responsibility of the place where they are published.”
In the UK, Home Secretary Sajid Javid told social media companies “enough is enough” in the wake of last Friday’s shootings.
Reacting to a tweet from YouTube saying that the video-sharing service was working to remove the footage, he said: “You really need to do more @YouTube @Google @facebook @Twitter to stop violent extremism being promoted on your platforms. Take some ownership. Enough is enough.”
Damian Collins, the Conservative chairman of the Digital, Culture, Media and Sport Select Committee, called for a review into how the footage was shared and “why more effective action wasn’t taken to remove them”.
And Downing Street said social media companies needed to act “more quickly” to remove terrorist content.