Childhood and the Media: Facebook & Google panel discussion – Part of @SWSCmedia #CyberStudies & #Liminality Series + #Childhood Studies

From the fourth in a series of Safeguarding conferences hosted by Bath Spa University, in association with David Niven Associates. (Courtesy of David Niven Associates.)

Simon Milner explained the different ways in which Facebook protects young people who use the social network. These range from automatically stopping them from publishing to everyone to making them invisible to web searches.

Facebook also gives users control over the content they see and share.

Facebook has a series of procedures in place to deal with inappropriate content, particularly around suicide prevention, terrorist content and underage users. There are various ways in which content can be flagged as inappropriate and reported.

Its social reporting system can also put people in touch with a third party offline.

Facebook works in partnership with five leading safety organisations.

Naomi Gummer opened by saying there were a number of myths about the internet, such as the idea that it is used only for chatting, bullying and viewing sexual content. While recognising an element of truth in these, she gave statistics challenging the idea that they reflect how most young people use it. Instead, she said the internet “is absolutely transforming how young people communicate and socialise and access information.”

Naomi Gummer stated that managing the acknowledged risks requires input from parents and web companies, not just lawmakers.

Google has a three-strand approach to online safety, which involves education and improving computer literacy for all users.

In answer to questions from the audience, Simon Milner says that although Facebook does not allow under-13s to have Facebook accounts, it cannot stop every such account, and some parents are colluding with their children to help underage users set up accounts.

Naomi Gummer says Google is working with schools on internet literacy and wants to see an overhaul of computer science teaching in schools.

On regulation, Naomi Gummer says that given the sheer volume of material, community regulation is the way ahead: users who are aware of the guidelines can police the space themselves.

Simon Milner says that it is not technically feasible for Facebook to filter out inappropriate language, and that it is more appropriate for users to moderate content after it has been posted.

Join our debate about Childhood & Media on Tuesday (22-May-2012) at 8:00 PM (London) / 3:00 PM (New York) with @SWSCmedia, @U4Change & @MHchat and share your views.

Simon Milner, UK Policy Director, Facebook 

Naomi Gummer, UK Public Policy Analyst, Google

David Niven Associates  (@dnassociates) is a training and consultancy company offering services to statutory, voluntary and private sector organisations.
