Unit 3: Prevention


3.5 Digital safeguarding

Safeguarding in the digital and online worlds has become a serious source of concern. This section considers the scale of the problem, some of the major risks, and how we might mitigate them.

Alongside the huge advantages brought about by the internet and social media have come many challenges to safeguarding.

We connect using platforms such as Zoom, Teams, WhatsApp, Facebook, Instagram, TikTok, Slack and many others. There are high safeguarding risks associated with using these platforms if we are not clear about what is acceptable and unacceptable behaviour online.

As a result of the Covid-19 global pandemic, the use of these forms of communication has intensified, and as organisations we must ensure our staff and associated personnel understand the risks of using personal devices and of unsupervised online contact.

One of the problems with online platforms is the amount of pornography that is available, particularly child sexual abuse material, and the ease with which this content can be shared.

In 2020 alone, the Internet Watch Foundation identified over 150,000 web pages containing child sexual abuse imagery, a 16% increase on those reported in 2019, including images of the rape of children by adults. Girls featured in over 90% of the images, with the highest proportion aged between 7 and 13.

A horizontal bar chart headed ‘Analysis of individual image hashes by age of the child and severity of the abuse’. Category A is the number of images showing sexual activity between adults and children, including rape and sexual torture; Category B is the number of images involving non-penetrative sexual activity; Category C is the number of indecent images not falling within Category A or B. Percentages are rounded to the nearest whole number; the percentage after each total is that age group’s share of all hashes. Source: IWF Annual Report 2020. The figures are as follows:

  • Age 16–17: Category A 165 (15%); Category B 398 (35%); Category C 571 (50%); total 1,134 (0%)
  • Age 14–15: Category A 3,800 (25%); Category B 2,523 (17%); Category C 8,954 (59%); total 15,277 (3%)
  • Age 11–13: Category A 42,825 (17%); Category B 62,539 (25%); Category C 139,916 (57%); total 245,280 (43%)
  • Age 7–10: Category A 54,262 (23%); Category B 82,874 (35%); Category C 98,819 (42%); total 235,955 (42%)
  • Age 3–6: Category A 14,338 (24%); Category B 27,106 (45%); Category C 18,163 (30%); total 59,607 (11%)
  • Age 0–2: Category A 2,684 (27%); Category B 5,833 (59%); Category C 1,406 (14%); total 9,923 (2%)

(Source: Internet Watch Foundation)


Mitigating against online abuse

A silhouetted image of the back of a male head and shoulders. The person is in a dark room looking at a computer screen.

The ability of perpetrators to sexually groom and exploit vulnerable children and adults online is a major safeguarding concern. For example, vulnerable children and adults can be tricked, persuaded or coerced into posting pictures of themselves online, or into posting videos through the use of webcams. Once these images are created, they can remain in online circulation for years, which effectively perpetuates the original abusive act.

Such direct contact also facilitates the ‘grooming’ of children: it enables a ‘private’ relationship to form, through which children can eventually be compromised, either via the images they have provided or by agreeing to meet face to face.

Aid workers can easily form inappropriate relationships with other workers or beneficiaries through social networks, the short message service (SMS), messaging apps, online chats, comments on live-streaming sites and voice chat in games.

Activity 3.11 Identifying risks and developing strategies

Reflect on the scenarios below and note your concerns and the mitigation strategies you would put in place to reduce the risk of harm.

Scenario 1

Juan is a camp manager for a children’s camp held during the school holidays. He is happy to accept Friend Requests on social media from children, their families and young volunteers. Sometimes he also initiates contact on social media.

Scenario 2

Sharmila, an aid worker for a partner organisation, is in constant contact with community members through her personal mobile phone. Recently she has been receiving sexualised jokes and pictures which make her feel uncomfortable.

Scenario 3

Morgan is a school counsellor and since the pandemic he carries out counselling sessions alone with a child online.

Scenario 4

Peter, a Project Support Officer, likes to keep in touch with everyone and loves to add several colleagues to WhatsApp groups on his personal phone. He uses these groups to share news about beneficiaries and their families, including safeguarding concerns.

Scenario 5

Leila is a researcher and provides her personal number to her research participants. These research participants now call her at all times of the day and night asking her for jobs, education opportunities, and even money.



A question about instant messaging and social media


Sherine, a learner on this course, has a question about using instant messaging and social media platforms to keep in touch with beneficiaries and programme participants.

Watch the video above to learn more about Sherine’s question and the response she is given.


Promoting digital safety


Watch the video above on children’s data being used online and learn about UNICEF’s manifesto – a set of 10 demands to protect children and their data.

Cameras on our smartphones make it easy to document real events and to share that information immediately.

Often we use the personal information and data of the people we serve in the same way, without written, informed consent and without any thought about how uploading that personal data leaves a ‘digital footprint’ that can persist even after the content is deleted.

Children have a right to privacy under Article 16 of the UN Convention on the Rights of the Child.


Promoting digital safeguarding

Technology is a wonderful innovation that can promote participation and be very inclusive of children and vulnerable adults, even when we are working remotely.

When used correctly and safely, technology can increase the participation of children and vulnerable groups and make our organisations more accountable to the communities we serve. For example, if you are a co-ordinator of a youth group, you may want to use messaging apps to stay in contact with its members, creating online groups, forums and communities.

However, as explained earlier, there are risks associated with this work:

  • Children and vulnerable adults may be exposed to upsetting or inappropriate content online, particularly if the platform you’re using doesn’t have robust privacy and security settings or if you’re not checking posts and comments.
  • Children and vulnerable adults may be at risk of being groomed if they have an online profile that means they can be contacted privately. Note that most social media platforms have a low minimum joining age, which increases risk.
  • Perpetrators of sexual abuse and exploitation may create fake profiles to try to contact children and vulnerable adults through the platform you’re using: for example, an adult posing as a child. They may also create anonymous accounts and engage in ‘cyberbullying’ or ‘trolling’. People known to a child or vulnerable adult can also perpetrate abuse in this way.

Here are some mitigation actions to promote digital safeguarding:

  • If possible, staff and volunteers should use only work devices to contact beneficiaries, where online activity can be monitored.
  • Include at least one supervisor or manager on messaging apps to ensure transparency and accountability, as well as safety for the worker concerned.
  • Ensure organisations have included online and digital concerns in their safeguarding policies and that staff and associated personnel have all been trained.
  • Always use appropriate language and behaviour online and do not share photos or videos that would make others feel uncomfortable or unsafe.
  • Consider the right of privacy of individuals and ensure informed, written consent has been obtained.
  • Ensure any online forums and communities have been set up safely.
  • Ensure livestreaming webinars or online one-to-one sessions involving children and/or vulnerable adults are carried out safely and supervised closely.
  • If online sessions are going to be recorded, always ask permission first. Similarly, do not take photos of online sessions without express consent.

Remember children have a right to privacy under Article 16 of the UN Convention on the Rights of the Child.

To learn more about online safeguarding policy, open this PDF file: Online Safeguarding Policy (UK), NSPCC.

For guidance on digital safeguarding for children please see Digital safeguarding for migrating and displaced children | Safeguarding Resource and Support Hub (safeguardingsupporthub.org).