Improving our data – lessons from our first efforts at data visualisation

Programme Manager Ana Van Bilsen Irias shares some insights and lessons from our first efforts at data visualisation.

Access has started to make some changes to the way we use our data. Last week marked a particular milestone: a new approach to publishing our quarterly data, using an interactive visualisation platform (Power BI).

Sharing learning and knowledge is central to Access’s approach, and part of our commitment to learning from what we do and sharing it as openly as possible. 

Lessons and challenges

Access holds considerable data across a range of programmes, and for the first time, we wanted to bring that together to look across our programmes and show their impact on partners and frontline organisations. We also wanted to present a clearer picture of where our money goes and to be as transparent as possible about what we are learning from it.

As part of a broader commitment to transparency, we’ve shared below some of the challenges and lessons that we have taken away from this process:

Data discipline and planning

Collecting data for different programmes with different aims can be a challenge. Even when we have access to the data (all of our programmes are delivered for us by partners), we realised that for some programmes the information we held wasn't suitable for analysis, and in some cases wasn't easily comparable. We chose several metrics that could be gathered from each of the different programmes and that would enable us to understand where our money is going. For some programmes, this meant manually adding data that we hadn't necessarily recorded before. Going forward, we will be working with our partners to think about how we can make better use of all the data being collected.
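Bringing records from programmes with different schemas onto one comparable footing can be sketched as a set of per-programme mapping functions feeding a shared schema. The field names, programme records, and amounts below are purely illustrative, not Access's actual data:

```python
# A minimal sketch (with hypothetical field names and values) of mapping
# records from two programmes, each with its own schema, onto a common
# set of metrics so they can be analysed together.

# Hypothetical raw records as they might arrive from two delivery partners.
reach_fund_rows = [
    {"Org Name": "Hope Trust", "Region": "North West", "Award (GBP)": "15000"},
]
growth_fund_rows = [
    {"organisation": "Hope Trust", "area": "North West", "grant": 20000},
]

def normalise_reach(row):
    """Map a Reach Fund row onto the shared schema."""
    return {
        "programme": "Reach Fund",
        "organisation": row["Org Name"],
        "region": row["Region"],
        "grant_amount": float(row["Award (GBP)"]),
    }

def normalise_growth(row):
    """Map a Growth Fund row onto the shared schema."""
    return {
        "programme": "Growth Fund",
        "organisation": row["organisation"],
        "region": row["area"],
        "grant_amount": float(row["grant"]),
    }

combined = [normalise_reach(r) for r in reach_fund_rows] + \
           [normalise_growth(r) for r in growth_fund_rows]

# Once everything shares one schema, cross-programme questions
# (e.g. total awarded by region) become simple aggregations.
total_by_region = {}
for rec in combined:
    total_by_region[rec["region"]] = (
        total_by_region.get(rec["region"], 0) + rec["grant_amount"]
    )
```

The design point is that each programme keeps its own mapping function, so a schema change in one partner's data only touches one place.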

Preparing the data

Most of the time spent analysing a data set goes into cleaning it and ensuring the data is correct. A lot of work went into ensuring the data was in the right format to be analysed, including:

  1. Correcting organisations’ names to avoid double counting
  2. Standardising partners’ names
  3. Adding/correcting geographical and postcode data
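Steps like the first two above amount to collapsing near-duplicate names into a single canonical form before counting. A minimal sketch of that idea, with hypothetical names and an illustrative (not exhaustive) list of suffixes:

```python
import re

def canonical_name(name: str) -> str:
    """Normalise an organisation name so near-duplicate spellings collapse
    to one form: lower-case, strip punctuation, drop common legal suffixes.
    The suffix list here is illustrative, not complete."""
    cleaned = re.sub(r"[^\w\s]", "", name.lower())
    cleaned = re.sub(r"\b(ltd|limited|cio|cic)\b", "", cleaned)
    return " ".join(cleaned.split())

# Three spellings of the same (hypothetical) organisation...
names = ["Hope Trust Ltd.", "HOPE TRUST", "hope trust limited"]

# ...collapse to a single canonical form, avoiding double counting.
unique = {canonical_name(n) for n in names}
```

In practice a lookup table of manually verified corrections usually sits alongside a rule like this, since no automatic normalisation catches every variant.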

Lessons learnt or always learning?

Using Power BI to showcase our dashboards has highlighted that working with data is an ongoing, exploratory process, and one we are still learning from.

What the dashboard is telling us

As this is the first time we are publishing our dashboards in this way, we thought it would be useful to pull out some highlights:

Where our money goes

To date, a total of £30m in Access grants has been awarded through four types of programme.

The Reach Fund has made the most grant awards so far, closely followed by the Growth Fund.

For Blended Finance Programmes (combining grants and loans to make social investment more accessible to charities and social enterprises), the region with the most grant awards is the North West. The Reach Fund shows a similar trend.

This is interesting because the North West is a region with a high concentration of social investors, perhaps suggesting that stronger local infrastructure multiplies social investment activity rather than dividing it.

London appears to be the top region for grants awarded; however, this should be read with two caveats in mind:

  • Many of the organisations that received support through our infrastructure programme (the Connect Fund) have a national remit, and their headquarters are often in London for fundraising and networking purposes.
  • The data set doesn’t distinguish between an organisation’s headquarters and where its impact takes place, potentially skewing regional results.

What’s next

This is just the beginning of our data sharing and learning journey. We’ll be improving and evolving our approach to data as we engage with our partners and stakeholders to improve our data collection mechanisms and the insights we subsequently share. Do get in touch with us if you are interested in being part of that conversation.