
A checklist for digital safeguarding

Talk London
Created on 14 October 2020

The online and digital space is an increasingly important engagement space for us at City Hall: from the almost 60,000 members of the online Talk London community to the hundreds of daily emails in the Public Liaison Unit's inbox, and the ever-growing City Hall social media channels and mailing lists. It creates many opportunities, especially during a pandemic, but it also comes with challenges.

It is important that we understand these challenges and have the appropriate procedures and policies in place to address them: from escalating online incidents to protecting staff who work with our online channels.



This digital safeguarding checklist has been developed by the Greater London Authority (GLA) and safeguarding expert Charlotte Aynsley (Digital Consultants – Rethinking Safeguarding) to help organisations think about the risks and the context of the digital space. It will help them to develop an approach that meets their needs and provides protection for staff and users.



The checklist is designed as a set of reflective questions for organisations to consider. It is based on the steps the GLA took when it lowered the registration age of its online community Talk London from 18 to 16 on International Youth Day in 2019.



When considering any change to the online context, it is important that all of the risks are considered and that the appropriate processes and technical solutions are put in place. This may mean retraining staff and developing a clear accountability and governance structure for managing and handling any online incidents.

Case study

Talk London is City Hall’s online community, a space where anyone who lives, works or studies in London can have their say. By taking part in surveys and discussions, members help shape policy and make London better for everyone. We capture personal data upon registration, which allows us to analyse who we are hearing from and who we aren’t. To make Talk London as reflective of London’s population as possible, we were keen to bring young voices into policymaking too. This is a challenge, as this age group does not often actively engage with government agencies.



A digital safeguarding expert helped us with a thorough review: from the sign-up process to our community guidelines, from moderation to the language used, and from data privacy to legal review. We involved young people every step of the way and made a series of updates before lowering the registration age on International Youth Day, 12 August 2019: new safety features, improved moderation, avatars instead of profile pictures, a revised privacy policy, safeguarding training and the development of City Hall’s first Digital Safeguarding Policy. The policy sets out expectations of City Hall’s online platforms (both owned and third-party) and how to deal with online issues. We have also launched a digital safeguarding network and are keen to share this policy and knowledge with other boroughs and organisations, so they can include young Londoners in their online activities too.

Checklist


Before embarking on the move into the online context, it is important to understand the demand and need for the service and how the offer may apply to your users.

  • Is there a clear need/demand to switch to online services?
  • Will the service need adapting to be delivered in this way?
  • Is there a clear need/demand to include children under the age of 18? Under 16? Under 13?
  • Are all your internal stakeholders on board?
  • How do you operationalise new processes and procedures with staff?
  • What kind of platform is going to work best?
  • Will a third-party platform work, or do you need to develop your own?
  • Have you tested (the idea of) your service with your target audience?
  • Were they involved in the making of it?

The online and digital space is an increasingly important engagement space, full of opportunities and challenges. It is important that you understand the associated risks and issues and have the appropriate procedures and policies in place to address them. Your risk assessment should consider both the safeguarding and the data protection elements.



In your risk assessment, highlight the main challenges and ensure that you are mitigating them as far as possible. For example: if you are a provider of services to vulnerable young people and you need to continue to engage with them on a 1-2-1 basis digitally, you will need to ensure that you apply a high level of protection to the young people and your staff.



Ensure that your risk assessment follows the appropriate sign-off processes and procedures and that there is transparency and ownership.



When considering the risks, the three categories below are a good starting point:

  • Content – content that can be viewed that is age-inappropriate, violent or sexual in nature
  • Contact – contact that can be either peer-to-peer or adult-to-child and can involve bullying or grooming
  • Conduct – individuals sharing inappropriate content or misusing your platform



The level of harm that a user can experience by engaging with you online will vary depending on the type of service you are offering, how you engage, and the user's own background, abilities and experience. All of this needs to be considered when you are planning your approach.



It is important that you consider the worst-case scenarios and plan for them.



If you are planning on engaging with children and young people you will need to consider what additional measures you need to have in place and what the additional risks might be.

After you have assessed the need for the service, considered your target audience and developed your risk assessment, you will need to develop the appropriate responses and ensure that you have the approach in place.

  • How will you ensure that all of your policies are updated to reflect your new way of working?
  • How do you ensure that all staff are aware of the policies?
  • Do you need a code of conduct for staff?
  • How will the interactions be monitored and audited and who will oversee the process?
  • Who will be responsible for the leadership and implementation?
  • Do you need a separate acceptable use policy to highlight expectations to users?

  • How will you ensure that all of your users can access your service?
  • Is there parity of access?
  • Will the service be available via mobile?
  • Have you considered and addressed the cybersecurity issues?
  • How will you support all users including children to understand the risks?

  • How will you ensure that you are complying with data protection legislation?
  • How will you ensure informed consent?
  • Will the service involve encryption and how does that impact on safeguarding?
  • Who will have access to the data? (See the sketch after this list.)
  • How will you provide extra protection for children, especially those under the age of 13?
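One way to make the data access question above concrete is to gate every read of personal data on the reader's role and to log the stated purpose. The sketch below is a minimal illustration in TypeScript, not a description of any particular platform; the roles, purposes and function names are assumptions to be replaced by your own data protection policy.

```typescript
// Minimal sketch of role-based access to personal data, with every
// permitted read logged for audit. Roles, purposes and function names
// are illustrative assumptions, not features of any particular platform.

type StaffRole = "moderator" | "safeguarding_lead" | "analyst";

interface AccessRequest {
  staffId: string;
  role: StaffRole;
  purpose: string;       // recorded so each access can be justified later
  subjectUserId: string; // the user whose personal data is being read
}

// Example rule: only these roles may read personal data directly;
// adjust the list to match your own data protection policy.
const ROLES_WITH_PERSONAL_DATA_ACCESS: StaffRole[] = ["moderator", "safeguarding_lead"];

function canAccessPersonalData(request: AccessRequest, auditLog: AccessRequest[]): boolean {
  const allowed = ROLES_WITH_PERSONAL_DATA_ACCESS.includes(request.role);
  if (allowed) {
    auditLog.push(request); // keep a record of who accessed what, and why
  }
  return allowed;
}
```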

  • Is there a platform that already exists that will meet your requirements?
  • How does the functionality support the mitigations that you need to put in place to help to protect users?
  • Are you able to take control of the platform and monitor usage and interactions in the way that you need to?
  • Are there appropriate privacy settings?
  • Is the platform appropriate for your users' profiles, e.g. is it appropriate for children?
  • Does the platform allow for 1-2-1 conversations? If so, what additional safeguarding measures do you need to put in place?
  • How will you integrate a reporting system within the platform so that users can flag any incidents? (See the sketch after this list.)
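As one illustration of the reporting question above, a simple in-platform flagging mechanism can record who reported what, classify it against the content, contact and conduct categories, and push it straight to a moderation queue. This is a minimal sketch in TypeScript; ContentReport, ModerationQueue and submitReport are illustrative names, not features of any specific product.

```typescript
// Minimal sketch of an in-platform reporting mechanism. ContentReport,
// ModerationQueue and submitReport are illustrative names only.

type ReportCategory = "content" | "contact" | "conduct";

interface ContentReport {
  id: string;
  reporterId: string;       // the user raising the flag
  targetId: string;         // the post, message or profile being reported
  category: ReportCategory; // maps to the three risk categories above
  description: string;
  createdAt: Date;
}

interface ModerationQueue {
  enqueue(report: ContentReport): void;
}

// Records the report and hands it straight to the moderation queue,
// so no flag depends on a moderator happening to spot the content.
function submitReport(
  queue: ModerationQueue,
  reporterId: string,
  targetId: string,
  category: ReportCategory,
  description: string
): ContentReport {
  const report: ContentReport = {
    // placeholder ID for illustration; use a proper UUID in practice
    id: `report-${Date.now()}-${Math.random().toString(36).slice(2, 8)}`,
    reporterId,
    targetId,
    category,
    description,
    createdAt: new Date(),
  };
  queue.enqueue(report);
  return report;
}
```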

  • If you are involving children and young people how will you ensure that you gain consent from parents/carers?
  • How will that consent be collected?
  • How will you verify that users are the appropriate age for the platform? (See the sketch after this list.)
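For the age question above, a registration flow can enforce a minimum age and record whether parental or carer consent is still needed. The sketch below is illustrative only; the thresholds of 16 and 18 mirror the Talk London example, but the field names and rules are assumptions to be replaced by your own policy.

```typescript
// Minimal sketch of an age gate at registration. The thresholds mirror the
// Talk London example (minimum age 16, parental/carer consent below 18);
// field names and rules are assumptions to replace with your own policy.

const MINIMUM_AGE = 16;
const PARENTAL_CONSENT_AGE = 18;

interface RegistrationDecision {
  allowed: boolean;
  parentalConsentRequired: boolean;
  reason?: string;
}

function checkRegistrationAge(dateOfBirth: Date, today: Date = new Date()): RegistrationDecision {
  // Work out the applicant's age in whole years.
  let age = today.getFullYear() - dateOfBirth.getFullYear();
  const hadBirthdayThisYear =
    today.getMonth() > dateOfBirth.getMonth() ||
    (today.getMonth() === dateOfBirth.getMonth() && today.getDate() >= dateOfBirth.getDate());
  if (!hadBirthdayThisYear) {
    age -= 1;
  }

  if (age < MINIMUM_AGE) {
    return { allowed: false, parentalConsentRequired: false, reason: "Below minimum registration age" };
  }
  return { allowed: true, parentalConsentRequired: age < PARENTAL_CONSENT_AGE };
}
```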

  • How will safeguarding concerns/incidents be recorded and where?
  • How will safeguarding incidents be managed and escalated? (See the sketch after this list.)
  • What arrangements are in place for referrals to third parties, e.g. the police?
  • How can users report any incidents and concerns?
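For the recording and escalation questions above, one approach is a single incident log that captures who raised a concern, its current status and any onward referral, with a full history kept for audit. The sketch below is a minimal illustration; the statuses, referral targets and field names are assumptions, and real escalation routes must follow your own safeguarding policy and local procedures.

```typescript
// Minimal sketch of a safeguarding incident log with an escalation trail.
// Statuses, referral targets and field names are illustrative assumptions.

type IncidentStatus = "open" | "escalated" | "referred" | "closed";

interface SafeguardingIncident {
  id: string;
  raisedBy: string;     // staff member or user reference
  summary: string;
  status: IncidentStatus;
  escalatedTo?: string; // e.g. a designated safeguarding lead
  referredTo?: string;  // e.g. the police or a local authority
  history: { at: Date; note: string }[]; // audit trail of every action taken
}

// Moves an incident to "escalated" and records who now owns it, keeping
// the full history so the process can be audited later.
function escalate(incident: SafeguardingIncident, owner: string, note: string): SafeguardingIncident {
  return {
    ...incident,
    status: "escalated",
    escalatedTo: owner,
    history: [...incident.history, { at: new Date(), note: `Escalated to ${owner}: ${note}` }],
  };
}
```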

  • How will you ensure that there are appropriate safeguarding measures in place to support children and young people?
  • What information and support will you offer?
  • How will you test that it's appropriate?

  • How will you ensure that staff are upskilled to support users in relation to safeguarding?
  • What staff training will be offered?
  • How will you support staff in these new ways of working?

Conclusion

Whatever online solution you choose, it is important to consider the risks and to put the appropriate mitigations in place to support the wellbeing and safety of your users.



For more information, please email [email protected] or [email protected].