Wednesday, December 25, 2013

Google blocks child porn in 100,000 searches, plans filter for YouTube


Google Chairman Eric Schmidt today said his company's search engine has made it far more difficult to find images of child pornography and that Google is developing a technology that will identify children being abused in YouTube videos.

"While no algorithm is perfect—and Google cannot prevent pedophiles adding new images to the Web—these changes have cleaned up the results for over 100,000 queries that might be related to the sexual abuse of kids," Schmidt wrote.

UK Prime Minister David Cameron recently called on search engines to impose a blacklist of search terms related to child sexual abuse. Microsoft's Bing was "the first to introduce pop-up warnings for people in the UK who seek out online images of child abuse" back in July. The UK is also forcing Internet service providers to roll out filters targeting all porn.

Google's changes aren't just for the UK. "[W]e will soon roll out these changes in more than 150 languages, so the impact will be truly global," Schmidt wrote. In addition to the 100,000 search queries mentioned above, Google is now showing warnings for more than 13,000 queries. "These alerts make clear that child sexual abuse is illegal and offer advice on where to get help," Schmidt wrote.

Schmidt noted that "Google and Microsoft have been working with law enforcement for years to stop pedophiles sharing illegal pictures on the Web." In this case, Schmidt credited Microsoft for sharing its picture detection technology with Google to help its rival better identify pictures of children being sexually abused.

To prevent false positives, Google has employees reviewing photos before blocking them. "This is because computers can't reliably distinguish between innocent pictures of kids at bathtime and genuine abuse. So we always need to have a person review the images. Once that is done—and we know the pictures are illegal—each image is given a unique digital fingerprint," Schmidt wrote.
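The workflow Schmidt describes—human review first, then a fingerprint that lets machines match future copies automatically—can be sketched in a few lines. This is a hypothetical illustration, not Google's or Microsoft's actual system: it uses a plain SHA-256 hash, which only matches byte-identical files, whereas production tools such as Microsoft's PhotoDNA use perceptual hashes that survive resizing and re-encoding. All function and variable names here are invented for the example.

```python
import hashlib

# Hypothetical blocklist of fingerprints of known illegal images.
# In the workflow described above, entries are added only after a
# human reviewer has confirmed an image is illegal.
known_fingerprints = set()


def fingerprint(data: bytes) -> str:
    """Compute a fingerprint for an image's raw bytes.

    Simplification: a cryptographic hash matches only exact byte
    copies; real systems use perceptual hashing so that re-encoded
    or resized copies still match.
    """
    return hashlib.sha256(data).hexdigest()


def add_reviewed_image(data: bytes) -> None:
    # Called only after human review, mirroring the process Schmidt
    # describes ("we always need to have a person review the images").
    known_fingerprints.add(fingerprint(data))


def is_known_illegal(data: bytes) -> bool:
    # Automated matching: no human needed once the fingerprint exists.
    return fingerprint(data) in known_fingerprints


# Placeholder bytes stand in for image files:
add_reviewed_image(b"reviewed-image-bytes")
print(is_known_illegal(b"reviewed-image-bytes"))  # True
print(is_known_illegal(b"different-bytes"))       # False
```

The design point is the division of labor: the expensive, fallible judgment (is this abuse or an innocent bathtime photo?) is made once by a person, and the cheap, reliable operation (does this file match a known fingerprint?) is then repeated by machines at Web scale.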

Google also has a plan to target child pornography videos. "[P]edophiles are increasingly filming their crimes. So our engineers at YouTube have created a new technology to identify these videos," he wrote. "We're already testing it at Google, and in the new year we hope to make it available to other Internet companies and child safety organisations."

About the Author
Just a cool guy who loves the tech world and the opportunities it presents to this generation. I am an electrical/electronics engineer, but computers, programming, and software are the things that drive me crazy. Email me: akinyomioluwafemi@gmail.com


