
Child sexual abuse is the internet’s darkest secret (World Economic Forum, Davos)

“Historically, from ancient Greece, to Rome, to Nazi Germany… just before a civilisation collapses, there is a sudden and unexplained rise in child sex abuse… In the 21st century, online child sex abuse seems to be a global epidemic!”

In the UK, online paedophile rings (many of their members women) are acting with impunity! They engage in gang stalking against anyone brave enough to speak out, and share images of children online… they don’t give a fuck!

“If it was a disease or a health epidemic (which it can be considered), the world’s powers would act immediately. The question, above all the solutions and preventative actions that can be taken, is this: why the fuck do so many adults (not always men) from around the world, across various cultures, backgrounds, ages and races, feel the need to view a child being sexually molested (tortured… sometimes murdered)??? Pfffft!”

Child sexual abuse is the internet’s darkest secret

A 59-year-old European man is currently awaiting extradition to France after abusing two girls in their early teens in Madagascar. One of the survivors testified that the man, who took nude photographs of both girls and put them online, sexually assaulted her.
This is far from an isolated case. Child sexual abuse has emerged as the internet’s darkest secret. It is an $8bn industry in itself – and that sum does not include the exponential profits made by data service providers across the globe.
Blocking child porn websites is just a quick-fix solution. The real challenge is to block individuals from disseminating such content and data servers from hosting it.
Internet users these days can access child abuse material with near impunity, taking advantage of decentralised networks accessed through special algorithms. As many of those who seek to view such content use crypto-currencies, it is extremely difficult to track the seller and buyer.
Every picture is a crime scene in itself which not only violates the rights of children at the time but also remains on the internet forever, a life-long psychological burden.
Therefore, as world leaders gather at the World Economic Forum in Davos, I am demanding a legally-binding UN convention against online child sexual abuse and pornography.

Several Nobel laureates, Pope Francis, Angela Merkel, and international bodies such as the OECD have extended support for the endeavour.
In 2017 the Internet Watch Foundation found 78,589 individual web addresses worldwide showing images of child abuse. The five countries that host 87 per cent of this material are the Netherlands, the USA, Canada, France and Russia. However, this tells us nothing about where this material was being produced or viewed.
The content may be produced in one part of the world, hosted in another, and viewed in an altogether different location. Some studies claim that videos of infants as young as eighteen months being raped or tortured sell for anywhere between $7,000 and $8,000.
At the same time, cybersex trafficking of children is one of the most brutal forms of modern day slavery. Paedophiles lure children and their parents online to watch acts of sexual abuse from wherever they may be abroad.
While the United Nations Convention on the Rights of the Child (UNCRC) and a few other international treaties do mention the crime, they are not legally binding.
The UN Convention that I envisage will focus on the prevention of all forms of online sexual abuse. It will be backed by a new Global Task Force against online child pornography, child sexual abuse and child trafficking to provide victims with holistic support.

It will include a dedicated international toll-free helpline for reporting cases under real-time supervision of INTERPOL or any other relevant agency. It will create a treaty body to provide assistance to stakeholders where the expertise to deal with such cyber-crimes is inadequate. And it will facilitate extradition procedures.
Finally, the Convention will ensure a convergence of efforts at national, bilateral and international levels. This will lead to a uniform legal regime dealing with online sexual abuse of children, as well as uniformity in standards and efficiency of global law enforcement response.
Together we must make the internet safer for our children.
Mr Satyarthi is attending the World Economic Forum

Record number of online images showing child abuse removed by internet charity

More than 100,000 webpages showing images of child sexual abuse have been removed from the internet after intervention by the Internet Watch Foundation (IWF), the UK charity has said.
In total, 105,047 webpages were removed in 2018, up by a third on the previous year, with almost half of those flagged by the IWF displaying the sexual abuse of children aged 10 or younger, according to new figures from the charity.
IWF chief executive Susie Hargreaves described it as “shocking and deeply upsetting” that the images had been created.
In response to the figures, Home Secretary Sajid Javid said he wanted internet companies to do more to improve online safety.
“The horrifying amount of online child sexual abuse material removed by the IWF shows the true scale of the vile threat we are facing. This is why I have made tackling it one of my personal missions,” he said.
“I welcome this impressive work and have been encouraged by the progress being made by the tech companies in the fight against online predators. But I want the web giants to do more to make their platforms safe.”

(PA Graphics)
Ms Hargreaves said much of the content had originated outside the UK, and that analysing the images had been challenging for the charity’s workers.
“These 105,047 webpages each contained up to thousands of images and videos showing the sexual abuse of children. It amounted to millions of horrific images,” she said.
“Virtually all – more than 99% – were hosted outside of the UK. Whilst we use sophisticated and cutting-edge technology in our work, ultimately, every webpage was assessed by human eyes.
“Watching the repeated abuse of children, some of whom are very young, is a difficult job but we have an amazing team of compassionate, resilient and highly-trained internet content analysts. They work to give hope to the victims of sexual abuse whose images are shared online repeatedly.”
According to the IWF’s figures, 49% of the webpages assessed by the charity came from website URLs linked to the Netherlands, with the United States and Russia the next most common countries of origin on the list at 13% and 12% respectively.
“We work with more than 140 internet companies to keep their networks safe but it’s a sad fact that the vast majority of these webpages – four out of five – were hosted by image-hosting companies based overseas, who do not want to engage, and frankly have little regard to providing safe networks, or relieving the suffering of child victims,” Ms Hargreaves said.
“It is shocking and deeply upsetting that these images should have been created in the first place. We have set ourselves an ambitious programme of work for 2019. By getting better at finding, and combatting this material, we offer real hope to the victims whose images are shared online.”
The IWF said it will publish its full annual report, detailing full statistics and trend analysis for 2018, in April.

Two Hat Leads the Charge in the Fight Against Child Sexual Abuse Images on the Internet

Technology company releases CEASE.ai, an artificial intelligence model to detect new child sexual abuse material for law enforcement and social platforms

KELOWNA, British Columbia, Jan. 22, 2019 /PRNewswire/ — Leading AI technology company Two Hat announced today that it has released CEASE.ai, an image recognition technology for social platforms and law enforcement that detects images containing child sexual abuse. By making the technology available to public and private sectors, Two Hat aims to address the problem not only at the investigative stage but at its core, by preventing images from being posted online in the first place.
“This issue affects everyone, from the child who is a victim, to the law enforcement agents who investigate these horrific cases, to the social media platforms where the images are posted,” said Two Hat CEO and founder Chris Priebe. “With one hat in social networks and the other in law enforcement we are uniquely positioned to solve this problem. With CEASE.ai, we’ve leveraged our relationship with law enforcement to help platforms protect their most vulnerable users.”
Built in collaboration with Canadian law enforcement, and with support from the Government of Canada’s Build in Canada Innovation Program and Mitacs (https://www.mitacs.ca), a national research organization that partners with top Canadian universities, CEASE.ai is an artificial intelligence model that uses ensemble technology for groundbreaking precision. Through its recent acquisition of image moderation company ImageVision, Two Hat has boosted its existing technology to achieve even greater accuracy and efficiency.
Unlike similar technology that only identifies known images (“hash lists”), CEASE.ai detects new child sexual abuse material (CSAM). Developed for law enforcement, CEASE.ai aims to reduce investigators’ workloads and reduce trauma by prioritizing images that require immediate review, ultimately rescuing innocent victims faster. Now social platforms can use CEASE.ai to detect and remove child abuse images as they are uploaded, preventing them from being shared.
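To make that distinction concrete, here is a minimal, hypothetical sketch of the two approaches: a hash list can only flag images that have already been identified and hashed, while a trained classifier can also score images it has never seen before. The imagehash usage and the classifier interface below are illustrative assumptions only, not Two Hat’s actual implementation.

```python
# Hypothetical illustration only -- not Two Hat's implementation.
# Requires: pip install pillow imagehash
from PIL import Image
import imagehash

# "Hash list" approach: only images whose hashes already appear in a
# database of known material can be flagged.
KNOWN_HASHES = set()  # in practice, a large database of hashes of known images

def is_known_image(path: str) -> bool:
    """Return True if the image's perceptual hash matches a known entry."""
    return imagehash.phash(Image.open(path)) in KNOWN_HASHES

def moderate_upload(path: str, classifier) -> str:
    """Route an upload: block known images, score never-seen ones with a model."""
    if is_known_image(path):
        return "block"  # previously identified material
    # New images fall through to a trained model -- the gap a classifier such
    # as CEASE.ai is described as closing. `classifier.predict` is an assumed
    # interface returning a probability between 0 and 1.
    score = classifier.predict(path)
    return "review" if score >= 0.5 else "allow"
```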
Predators are increasingly using social platforms to solicit and share images. According to a 2018 NetClean report, “Grooming and extortion are now coming from social media apps, unlike a few years ago where most of it occurred by someone that had access to the child.”
“Removing child abuse material from the internet and protecting kids is a responsibility that we all share, regardless of sector,” says Julie Inman Grant, Australian eSafety Commissioner. “It’s exciting to see innovative technology solutions being deployed in a space where it’s crucial that the good guys stay one step ahead.”
Learn about Two Hat’s efforts to detect CSAM through CEASE.ai on its site. Priebe will host a webinar in February to share his vision of the future of AI, including CEASE.ai and updates to Two Hat’s suite of content moderation solutions.
About Two Hat
Founded in 2012, Two Hat is an AI-based technology company that empowers gaming and social platforms to grow and protect their online communities.

A UN convention should not be required to force internet giants to act on child abuse material

As the globe’s super-rich and political elite gather for their annual get-together at the World Economic Forum in the Swiss alpine resort of Davos, many observers look on aghast, feeling distinctly uneasy about what a terrible waste of time and resources the whole thing is. Rather than being an opportunity for world leaders to debate what they should be doing to help solve some of the more pressing problems the planet is facing, the forum is in reality little more than a major networking opportunity for those fortunate enough to be able to afford a ticket. In many cases, attendees only show up to the conference to raise their profiles or boast to the well-connected about the important work they are doing.
On exceptionally rare occasions, somebody attending the conference will seek to use it to promote an idea that really could benefit tens of thousands of people. And so it was this morning, when Nobel Peace Prize Laureate Kailash Satyarthi used an opinion piece in Britain’s Daily Telegraph to announce that he would take the event as an opportunity to demand that world leaders adopt a legally-binding UN convention against online child sexual abuse and pornography.
Explaining how the online child abuse industry has grown to be worth some $8 billion a year, Satyarthi claimed that “blocking child porn websites is just a quick-fix solution”, while at the same time warning that paedophiles are now able to access child abuse material online with near impunity. He told readers that the convention would focus on the prevention of all forms of online child sexual abuse, and would be backed by a new global task force that would offer victims “holistic support”. Satyarthi’s convention would also seek to help nations work towards creating a uniform legal regime for tackling online sexual abuse of children, and uniformity “in standards and efficiency of global law enforcement response”.
While Satyarthi’s proposed convention as laid out in his Telegraph article sounds entirely commendable, it barely mentions the role internet companies should play in stamping out child abuse content online. It is of course vital to ensure that victims of online child sexual exploitation are provided with as much help and support as can be made available, but so long as it remains technologically possible to disseminate this type of content so easily, there will most likely always be an audience for it, and those who are willing to produce it. As such, the first step towards eliminating child abuse material from the internet should involve going after those who facilitate it – internet and social media companies.
It has been pointed out for many years now that internet firms are amazingly skilled at developing innovative new technology that stands to make them a lot of money, but seem less adept at creating new solutions to tackle problems such as child abuse material online that might not be so profitable. Every now and then a company such as Google will unveil a new tool or initiative that promises to help deal with the issue, but by and large, little progress is being made. This is because there is still no real incentive for technology firms to take action, as is routinely evidenced by a steady stream of revelations relating to paedophiles using online tools to access child abuse material.
Earlier this month, an investigation conducted by US magazine The Atlantic revealed that paedophiles are using Facebook-owned photo-sharing platform Instagram to distribute Dropbox links to child pornography. Members of the suspected online paedophile ring were said to be setting up anonymous Instagram accounts and then sending blank posts with captions asking users to direct message them for Dropbox links to child abuse material. Elsewhere, a separate investigation commissioned by TechCrunch found that Microsoft’s Bing search engine was helping paedophiles locate child pornography. Meanwhile, abusers are still routinely able to access streams of children being abused in live sex shows, many of which are filmed in developing countries where child protection laws might not be as strong as they are in the West.
As has been suggested previously with terrorist content, governments across the globe should introduce meaningful fines for internet and social media companies who allow their services to be used to facilitate the sexual exploitation of children. While doing so will not eliminate child abuse material from the internet overnight, we must surely look to compel technology companies to fulfil their moral responsibility of ensuring children are safe online before moving forward. Satyarthi’s proposed UN convention sounds like it would be a welcome step in the right direction, but planning any other action while major internet companies’ products are still routinely being used by paedophiles seems like putting the cart before the horse. The quicker we realise these firms will need to be forced to act, the better.
