This part of the Digital Capabilities Statement looks at the knowledge and skills that social workers require to demonstrate their digital capabilities. It is based on the Practice ‘super-domain’ of the PCF and is drawn from all the domains of A Health and Care Digital Capabilities Framework.

The contents of this section are:

  • Understanding the online and technology needs of people using services
  • Delivering services through digital technology
  • Ethical decision-making: knowledge and skills
  • Online safeguarding: knowledge and skills
  • Understanding applicable legislation and regulations

Understanding the online and technology needs of people using services

Social workers should know and understand how distinct groups of people who use services engage with digital technology to meet their needs. For instance, some autistic people prefer online social interactions over face-to-face relationships because of their sensory needs (Bertilsdotter et al, 2013). People with physical and mental impairments also use assistive technology ‘to maintain or improve functioning to live healthy, productive, independent and dignified lives, participating in education, the labour market and social life’ (World Health Organisation, 2016, p.1).

Social workers need to understand the range of technologies for meeting these health and social care needs. They can attain this knowledge through online searching, interpretation, verification, analysis and evaluation skills (National Association of Social Workers (NASW), 2017). They can also learn from service users themselves, who may use particular apps or digital tools to manage their needs.

Social workers should:

  • Become familiar with the range of assistive technologies, communication tools and online systems that people with similar needs use
  • Empower people to explain the technologies they use in their everyday lives to meet their needs. This can be achieved through professional curiosity and asking appropriate questions
  • Explain in their assessment and care planning how the human rights of people who use services can be maximised through technology

Delivering services through digital technology

Increasingly, social work interventions are delivered online. Examples of online social work highlighted by Shevellar (2015, p. 160) include ‘online counselling, online self-help support groups, cybertherapy and avatar therapy, video counselling, telephone counselling, self-guided web-based interactions’.

Technology can speed up access to these interventions, increasing personalisation and choice because people can access them at their convenience from their homes.

For people with long-term health conditions, digital technologies can be used to streamline and combine their appointments, which enhances their sense of control and enables them to adhere to their care plans. People can use apps to order their prescriptions and support their compliance with medications.

Social workers should always assume that people who use services utilise technology in some form to address their needs, and should explore what technology is used and why. In assessment and care planning, social workers should explain how access to these technologies can be maintained for people who use services, because they may be indispensable to their needs. Social workers should also offer people the choice about how they want their services delivered. For instance, in respect of counselling, an online option might be quicker and more convenient.

Social workers should:

  • Ensure that they regularly discuss with people who use services any technology that they use and analyse how this addresses their needs in their assessments
  • Understand the technologies that are available to assist people to manage their care
  • Research the range of online interventions available and communicate these to people who use services

Ethical decision-making: knowledge and skills

The proliferation of digital technologies poses ethical issues similar to those social workers encounter in their offline work, such as maintaining confidentiality, seeking consent before recording or sharing personal information, and respecting privacy. However, widespread use of digital technology extends these ethical requirements into new areas.

Social workers now process large quantities of personal data and therefore they have an ethical duty to maintain data security (NASW, 2017).

Another new ethical dilemma caused by the widespread use of social media concerns respecting privacy. The digitalisation of everyday life has increased the chances of social workers inadvertently accessing inappropriate personal information about people who use services. For the most part this may be in the public domain; however, it can have an adverse impact on the professional relationship because people may feel that boundaries have been crossed.

Similarly, social workers’ social media profiles can be accessed by people who use services, who may then invite them to connect. Reamer (2019) calls these ‘boundary issues’ because social workers’ private information is readily available to the people they work with in their professional roles.

In earlier work, Reamer (2013, p.170; emphasis added) noted that in using digital technology, social workers can encounter:

  • Ethical decisions – ‘social workers sometimes face circumstances that require deliberate ethical decisions.’ In using digital technology social workers have to frequently reflect on the ethical implications of their actions
  • Ethical mistakes – ‘which are often unintentional’ and occur because of ‘omission or commission’. The social worker takes (or doesn’t take) an action arising out of their use of digital technology, which has ethical implications
  • Ethical misconduct – ‘Potential pitfalls include misrepresenting one's credentials and expertise online, engaging in inappropriate dual relationships with clients electronically’ (for example, on Facebook or via email)

The use of digital technology in social work has distinct ethical implications. The framework below is useful for ethical decision-making.

Digital technology in social work: model for ethical decision-making

1. Recognise that there are ethical issues in the use of digital technology

Know about the ethical issues and accept that the same ethical dilemmas that apply offline also apply online, while new issues arise. Understand the distinct ethical issues that arise within your organisational and local contexts, and your professional responsibility towards people using services

2. Understand the facts that apply

What is required of me, the social worker? What are the expressed wishes of the person (or people) using services? What are the laws and regulations that apply? Is this within or beyond the scope of my role? Should I refer to senior managers and seek their direction and if so when? Which professionals should I consult?

3. Evaluate your decision using ethical principles

Be explicit about the ethical principles shaping your decisions. Is it to maximise human rights and, if so, what are they? Are you prioritising promotion of the greatest benefits to the person using services – e.g. safeguarding – over their other rights, such as autonomy, privacy and confidentiality, and if so why? Consult the Code of Ethics (BASW, 2014), the Professional Standards (SWE, 2019) and other guidance in your decision-making

4. Take action to achieve desired results

At this stage, implement the actions agreed with managers, people who use services and/or other professionals

5. Review decisions and reflect on outcomes

Re-evaluate decisions and actions at the earliest opportunity. What was the outcome and was it desirable? What have you learnt from this and what are your strengths and the gaps in your digital capabilities?

Adapted from Markkula Centre for Applied Ethics (undated)

Social workers should:

  • Understand and apply regulatory standards – i.e. Professional Standards (SWE, 2019) – and professional Codes of Practice – e.g. Code of Ethics (BASW, 2020) and Digital Ethics Charter and other guidance
  • Reflect on the ethical issues that emanate from their use of and interactions with digital technology in their professional and private lives. This reflection can occur in formal supervision with their manager or in informal group supervision with their colleagues
  • Understand and apply the BASW Social Media Policy (BASW, 2018) and relevant guidance

Online safeguarding: knowledge and skills

Digital technology brings benefits and risks, especially to children and to adults who may be vulnerable. Social workers need to be able to identify, balance and manage these benefits and risks.

Online safeguarding issues

According to the NSPCC, online risks to children include:

  • ‘Sexting and sending sexually explicit photos’
  • Children being sent, sharing, or accessing age-inappropriate information
  • Cyber-bullying and trolling
  • Adults sending sexual information to children or engaging in sexual exploitation of children through online contact with them
  • ‘Disproportionate’ use of social media, which may lead to children missing out on physical activities, with an impact on their health and wellbeing
  • Manipulation through ‘fake news’ and ‘deep fake’ software

Online risks to adults include:

  • Adults with impaired capacity being subject to unwanted sexual advances and online grooming
  • Sexual exploitation
  • Adults being approached through social media for ‘cuckooing’
  • Adults with impaired decision-making accessing online resources that may be detrimental to their needs. For instance, they may order goods and services they don’t need, apply for loans and ‘gift’ money to ‘strangers’
  • Risk of financial abuse through unwanted solicitation
  • Risk of information breach through unauthorised access to data or through granting others consent to access their data
  • Hacking of social media and financial accounts
  • Cyber-bullying of adults at risk
  • Susceptibility to manipulation through ‘fake news’ and ‘deep fake’ software

Social workers should consider the above in assessments, care planning and safeguarding processes.

Assessments – Drawing on professional curiosity, social workers should ask about and discuss the online services and digital tools people use and the online groups they participate in. Where there is evidence that some groups of people using services are at particular online risk, this should be explored. For instance, for some adults with learning disabilities, explore the likelihood of unwanted advances and the risks of sexual or financial exploitation. Social workers should balance these risks against identified protective factors and the benefits that people derive from their online interactions.

Care planning – Where risks are identified, social workers should ensure that care plans include appropriate safeguards. For instance, an autistic adult may spend an inordinate amount of time on the internet because community groups are inaccessible to them. Similarly, an adult recovering from mental distress may use the internet more than others to address their social isolation. Appropriate care plans can ensure that services are arranged to match how people prefer to interact – i.e. online or in the community.

Using digital technology to safeguard people – Social workers need to understand how they can use digital technology to protect people from harm. Some digital technology can be used to contact people or groups who may be at risk – for instance, when someone is missing from home – or to enable social workers to share information with professionals, or to allow people at risk to contact emergency services or the police quickly – for example, people at risk of domestic abuse.

Owing to the need to balance carefully the benefits and risks of online interactions for people who use services, social workers need critical analysis skills. These include asking about people’s online use, analysing risks and benefits, and making evidence-informed intervention plans (Wilkins and Boahen, 2013).

Social workers should:

  • Appreciate that there are risks with using digital technology – they should explore them while not making assumptions that some people are inherently ‘vulnerable’
  • Balance risks with the benefits of using digital technology
  • Understand how digital technology can be used in safeguarding people who have been judged to be at risk

Understanding applicable legislation and regulations

The legislation and regulations that apply include those that govern social workers’ and their employers’ responsibilities – for instance, for data security and information sharing – and people’s rights over the data that organisations hold about them.

Social workers are central to these responsibilities because, as employees, they are effectively representatives of the organisations that employ them. In their everyday professional activities, social workers also process data about people who use services and therefore have legal responsibilities of their own. Social work is a value-based profession, and social workers may be called upon to advocate for and support people whose human rights have been violated through digital technology. Consequently, social workers need to understand human rights laws pertaining to digital technology (see Smith et al, 2019, pp. 44-47 for an overview).

Laws and advice that underpin social work and digital technology

  • The Data Protection Act 2018, which sits alongside the General Data Protection Regulation (GDPR)
  • Guide to the General Data Protection Regulation (GDPR)  
  • UK Information Commissioner website
  • Regulation of Investigatory Powers Act (2000)
  • The Mental Capacity Act 2005 – the statutory framework in England and Wales for determining people’s capacity to consent, for instance to their personal data being collected or shared