Charities and Artificial Intelligence

Categories: AI, Artificial intelligence

Over the past year, the transformative potential of Artificial Intelligence (AI) has emerged as a major theme for national discussion and debate, including within the charity sector.   

Whether you are a trustee already using AI, are planning to do so or don’t yet know how it might be useful, it is important that you are aware of the opportunities and risks involved. The key consideration is that AI should be used responsibly in a way that furthers your charity’s purposes.  

What is AI?  

AI is commonly described as a machine’s ability to perform the cognitive functions we usually associate with human minds. AI is not entirely new, despite what some recent headlines may lead us to think. Machine learning (a type of AI) has been used in the healthcare industry since the 1970s. Whilst a lot of commentary has focused on uses of the new ChatGPT tool, many popular services we may not think of as “AI” already use it in some form. This includes the algorithms behind streaming services, traffic prediction data in navigation apps and email spam filters, as well as the voice assistants in our homes and pockets. 

How are charities currently using AI? 

AI is predicted to bring about seismic changes to all sectors of the economy, and charities are unlikely to be an exception. The 2023 Charity Digital Skills report suggested that 35% of charities were already using AI for certain tasks and that a further 26% had plans to do so in the future.  

AI has the potential to bring about many benefits, particularly if it can help charities free up valuable time spent on resource-intensive tasks, making more hours available for high priority areas. In our conversations with the sector, we hear that Generative AI, which uses prompts from humans to create written and picture content, is amongst the fastest expanding areas. Some charities are finding the writing tools helpful for fundraising materials, bid writing, speeches or drafting policies, while ‘speech to text’ tools take meeting minutes. There are also emerging opportunities to use AI directly in service delivery. 

As one innovative example, Surrey Wildlife Trust is undertaking a three-year Space4Nature programme (funded by People’s Postcode Lottery) that uses satellite earth observation imagery, combined with volunteers’ observations and AI, to help map and assess habitats across Surrey. 

Where to start

First, consider how you might use AI, and whether the options available are right for your charity. Think about the advantages and risks – and how these would be managed – in the context of your trustee duties and charity objectives. That could involve looking at what gaps an AI tool could fill or insights it could generate, what skills are needed to use these tools to your charity’s advantage, and whether the charity’s trustees, staff or volunteers have those skills. You can also consider how staff or volunteers may already be using AI. 

As use of AI develops and more applications become available, consider if having an internal AI policy would be beneficial so it is clear how and when it can be used in governance, by employees in their work, or in delivering services to your beneficiaries. For example, the British Heart Foundation (BHF) has been considering AI since June 2023 and has established an AI working group, a wider AI community of users and an AI strategy.  

Remain mindful of trustee duties and managing AI risks 

While there are opportunities, it is wise to proceed with caution as there are risks involved that need to be considered and managed. These risks are inherent to the way AI is built, operates, and continues to learn. AI is a work in progress, so it won’t always get things right. It is not yet sophisticated enough to give accurate legal advice, for example, and Generative AI models can confidently produce inaccurate, plagiarised, copyright-infringing or biased results without any awareness that the results they have offered may be problematic. 

Trustees remain responsible for decision making. Given the consequences if incorrect advice is relied upon, it is vital that this process is not delegated to AI or based on AI generated content alone. For example, trustees may not be complying with their duties if a charity relied solely on AI generated advice to make a critical decision about the charity without undertaking reasonable independent checks to confirm its accuracy.  

It is important too that charities comply with wider legal obligations; these can include taking care around copyright and avoiding harmful content. Risks can increase when AI is put to use directly in charity operations – in particular around data security, especially GDPR, alongside other legal and regulatory challenges. AI is still in its infancy in terms of safety and potential and should be used responsibly. Some AI tools handle data in less secure ways than others, and you will need to understand this in your own circumstances. Charities with beneficiaries who might be at higher risk, such as children, or that hold sensitive data like medical information, will need to be particularly mindful of the level of risk.  

We will expect trustees and others in charities to ensure that human oversight is in place to prevent material errors, not least because the human touch is key to the way many charities operate and interact with their beneficiaries. 

Charities should also consider external risks and reputational damage arising from the misuse and recirculation of AI, such as fake news or deep fakes (AI generated images).  

AI and the regulatory landscape  

We are doing further work on AI at the Commission to learn more about its potential and risks, and how it fits in with our regulatory role. We also continue to engage with the sector, central government and other regulators.  

We don’t currently anticipate producing specific new guidance on the use of AI, preferring – as for cryptocurrency – to encourage trustees to apply our existing guidance to new technologies as they emerge. We will, though, update guidance where appropriate to reference examples of new technology, as we did with our refreshed guidance on internal financial controls. 

Resources to learn more about AI: 

The potential for AI to grow the impact of charities remains largely unknown, which itself can make this evolving technology seem daunting to many, especially in light of the risks, which must be handled carefully. Yet there are opportunities for charities large and small to engage with the technology now that it is more widely available. As you explore the possibilities, here are some examples of further reading:  


Comments


  1. Comment by Graham posted on

    For all its advantages, what I have experienced dealing with a charitable organisation in recent years is that it enables them to falsify information and images.

    Have a nice day!

  2. Comment by Reynold Ebert posted on

    Your blog is a constant source of inspiration for me. Your passion for your subject matter is palpable, and it's clear that you pour your heart and soul into every post. Keep up the incredible work!

  3. Comment by Kevin Hardern posted on

    This is the problem. It is very difficult to be sure of anything unless you experience it yourself.

  4. Comment by Graham posted on

    If there is anyone reading this who has negative perceptions about the CC and its role to benefit the public, I would like to hear from you.
    Based on my previous communication dated April 02nd, I must reiterate that in our experiences over these past few years in dealing with an Independent Charitable Trust, AI has enabled them to hide away from the realities.

    I await feedback, from anyone!

