Facebook's moderation of graphic content is "very wrong" and must "urgently change", Yvette Cooper has said.
The former chair of the Home Affairs Select Committee made the assessment after leaked documents claimed the social networking giant instructs its moderators to remove only certain threats of violence, while allowing some references to child abuse to remain.
Ms Cooper said while she welcomed Facebook's recent announcement to increase its number of moderators to tackle the issue, "too much harmful and dangerous content is getting through".
Facebook has come under increased pressure in recent months over its influence on almost two billion active users and the control it has over the content that appears on the platform.
According to files published by the Guardian newspaper, Facebook does not automatically delete evidence of non-sexual child abuse in order to help identify and rescue the child involved.
Ms Cooper, who chaired the committee in the last parliament, said this approach was "very wrong".
"These files demonstrate why powerful social media companies, including Facebook, have to be more transparent as the Home Affairs Select Committee recommended," she said.
"They also show why we were right to call on social media companies to urgently review their community guidelines, as too much harmful and dangerous content is getting through.
"None of this is easy, and we welcomed Facebook's commitment a fortnight ago to hire thousands more staff to tackle the problem and bring in more safety measures.
"But on child abuse they are still getting this very wrong and now the guidelines are public and will be discussed, I hope they will urgently change them."
The leaked dossier also claimed comments posted about killing Donald Trump are banned by the social networking site, although violent threats against other people are often allowed to remain.
It shows "credible violence" such as posting the phrase "someone shoot Trump" must be removed by the staff because he is a head of state.
However, generic posts stating someone should die are permitted as they are not regarded as credible threats, the Guardian reported.
Facebook will also allow people to live-stream attempts to self-harm because it "doesn't want to censor or punish people in distress", it added.
"Facebook say they 'do not action photos of child abuse,' they only mark videos as disturbing and only remove them if they are shared with sadism," Ms Cooper said.
"And they claim this allows for the child to be identified and rescued.
"However it is only likely to be in exceptional circumstances that continued sharing of child abuse is essential to the identifying and rescuing of a child.
"In most cases the reality of sharing vile and violent images of violence and child abuse simply perpetuates the humiliation and abuse of a child.
"Images should be given to the police and removed instead. Facebook are getting this wrong and need to urgently change."
Facebook has previously come under fire for allegedly failing to remove sexualised pictures of children from its website after the BBC said it used the site's "report button" to flag 100 such photos, of which 82 were not removed.
Monika Bickert, head of global policy management at Facebook, said: "Keeping people on Facebook safe is the most important thing we do.
"Founder Mark Zuckerberg recently announced that over the next year, we'll be adding 3,000 people to our community operations team around the world - on top of the 4,500 we have today - to review the millions of reports we get every week, and improve the process for doing it quickly.
"In addition to investing in more people, we're also building better tools to keep our community safe.
"We're going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help."