By Gabby Krite, Head of Operations
For non-profit organisations, every penny counts: their funds are primarily donated, giving them a corporate responsibility to be judicious about spending in a way that other brands aren't obliged to be. They are also held far more accountable by the public for poor decision-making. Collectively, this means they have to be more considered about adopting bigger societal innovations – I recall the struggle to convince a client that the algorithmic delivery of content didn't make TikTok an unsafe platform, and certainly no less safe than Facebook. This doesn't mean charities don't innovate; I have seen some amazing creativity precisely because of their smaller budgets. Perhaps it's more accurate to say they are more considered about bandwagons. The current bandwagon being AI.
A recent report from the Charities Aid Foundation (CAF) found that donors globally gave a net positive score of +15% when asked whether the opportunities of AI outweigh the risks, suggesting that charities may have some grace when it comes to adopting AI.
https://www.cafonline.org/insights/research/what-the-public-think-of-charities-using-ai
Even respondents on the negative side didn't think that meant charities shouldn't use AI.
With this in mind, we'll unpack an approach for bringing AI into your organisation, responsibly.
Don’t be scared, you are already using AI
The recent boom happened because ChatGPT made generative AI accessible to the public. But generative AI existed before that, and Machine Learning, a subset of AI, is something we have all quietly integrated into our lives without any fuss. Traffic predictions in GPS apps, junk-mail filtering, fraud detection and product recommendations are all everyday uses we engage with frequently. Within a charity you might already be using algorithms to optimise your paid media campaigns, predict donor lifetime value or manage your warehouse.
Start with the actual problems
A top-down approach to new tech never works – always start with the problems your staff are telling you they experience. Identify the early tech adopters in your organisation who have a natural interest and create a working group with a leadership team sponsor. Having that senior representative allows ongoing communication on what is feasible, what is a no-go and what should be prioritised – their role is to empower, but also to keep your most enthusiastic people in check.

Workshops and curious conversations with your colleagues are the best avenue for identifying the kind of work that could benefit from AI integration, but always keep in mind that AI can't solve everything, and even where it can help it isn't always the best solution. A robust cost-benefit analysis will help you unpick this. With something so new and a little scary, it can be difficult to generate ideas, so I would recommend the working group pursue a small-scale example first to use as a case study in your workshops.
Utilise your amazing governance structures
Charities are set up fantastically to adopt technology responsibly, and you likely already have the processes in place to proceed with confidence. Ensure any new technology being considered has gone through your due diligence processes from a legal and GDPR perspective. Any rollout should integrate risk mapping and a small-scale testing phase to identify issues before wider use. There are also a lot of businesses out there making big claims about what they can do with AI that aren't necessarily accurate – rigorous due diligence will help you see through these.
Define your boundaries
Related to the above, you need a clear usage policy for your organisation, and this will vary by business. There are some non-negotiables that apply everywhere (e.g. do not enter PII or competitively sensitive information into openly available models), but additional boundaries will need to be specific to your services. We would recommend co-creating the policy with stakeholders from across the business. Examples that charities might consider:
- Emotional connection with potential donors is vital in your external communications – can you achieve that if your external-facing imagery of people is generated by AI?
- If you integrate AI into the delivery of your services, does this change in processing put any recipients at risk?
Educate your staff
AI is unavoidable. It is being integrated into our daily lives everywhere, whether we want it to be or not. Even if you have a restrictive policy, it is naïve to think that staff won't be using it to make their work lives easier too. Therefore, if you want to minimise risk to your organisation, it is responsible to ensure that staff are trained on responsible usage and what the risks are. There are many courses out there, so ensure you are going with a reputable source – speaking to your data compliance teams may be a good start.

One of the biggest fears about AI universally is that jobs will be at risk, which can make people resistant to uptake. This is particularly strong in charities, where budgets are always tight. I would therefore recommend identifying people who may be impacted before any rollout and bringing them along on the rollout journey.