What have we learned from collecting and collating early Growth Fund data?

One year on from the launch of the Growth Fund’s first social investment fund, we reflect on what we have learned so far. Although it’s too early to draw any conclusions about the financial and social impact of the loans, simply collecting and collating the early data has already taught us a lot…

It’s been one year since we announced the launch of the Health and Wellbeing Fund (South West), which saw Resonance become our first Growth Fund social investor. One year on, we have now signed deals with eight such partners, six of whom are operating live funds and two of whom will launch theirs in the coming months. With these, plus several others in the pipeline, it has been a busy and exciting first operational year for the Growth Fund, and we already have a wide variety of funds available under the programme. Some social investors, like Resonance, have chosen to focus on a specific region and/or sector, whilst others have launched funds which are open to charities and social enterprises targeting a wide range of social impact areas across the whole of England. However, what all of the funds have in common is that they are helping to plug a gap in the market by offering accessible, affordable, unsecured finance on a small (sub-£150k) scale.

Whilst supporting prospective social investors through the application, due diligence and launch processes remains a significant part of our day-to-day Growth Fund activity, we are also now well into the programme’s main phase. The live funds are making increasing numbers of investments into charities and social enterprises, and for every loan made, data is collected by the social investor and shared with us on a quarterly basis. We have big ambitions for all of this financial and social impact information, which we will continue to gather, collate, analyse and share widely in line with our learning strategy. You may already have come across our quarterly dashboards, which are an early example of this. As we prepare to publish our third dashboard this month, we have been refining and reflecting on the processes we have in place to manage this ever-increasing flow of information.
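To give a concrete (if simplified) picture of what that quarterly collation involves, here is a minimal sketch in Python. The file layout and field names are hypothetical, invented purely for illustration; they are not the actual Growth Fund reporting format.

```python
from pathlib import Path

import pandas as pd

# Hypothetical layout: one CSV per social investor per quarter,
# e.g. reports/2018-Q1/investor_name.csv (names illustrative only).
def collate_quarter(quarter_dir: Path) -> pd.DataFrame:
    """Combine every social investor's report for one quarter into a single dataset."""
    frames = []
    for report in sorted(quarter_dir.glob("*.csv")):
        df = pd.read_csv(report)
        df["social_investor"] = report.stem  # record who submitted each row
        frames.append(df)
    return pd.concat(frames, ignore_index=True)

combined = collate_quarter(Path("reports/2018-Q1"))
# 'loan_amount' is an invented field name, used here only for illustration.
print(combined.groupby("social_investor")["loan_amount"].sum())
```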

Since our commitment to learning, and to sharing that learning, is one of the key reasons that we collect and analyse all of this data, we feel it would be remiss of us not to reflect on, and take lessons from, the way in which we are doing so. So what have we learned from receiving and collating this data so far?

  • We have to justify the costs of collecting data

Charities and social enterprises receiving loans and grants may sometimes feel that they are asked for a lot of information. Whilst we and the social investors are working hard to absorb as much of this complexity as possible, we recognise that any form of reporting means less time for charity and social enterprise staff to devote to the activity that we are helping them to fund. This is particularly important for small and medium-sized organisations which, given our goal is to make social investment accessible to those needing small-scale loans, are the most likely beneficiaries of the Growth Fund. We are aware of this and do not take decisions about which information to ask for lightly. We listen to charities and social enterprises’ experiences by meeting with them directly, by hearing feedback passed on through the social investors we work with, and by commissioning and funding sector research; we are also constantly evaluating our own processes and working with our stakeholders and partners to review our, and their, information requirements. We often challenge ourselves by asking ‘what would we use this piece of data for?’, and if we cannot answer that question, we know that we should not be asking for it.

  • Being less prescriptive can sometimes increase, rather than reduce, complexity

One of our biggest challenges has been to try to strike the right balance between bespoke and standardised reporting.

Our aim was to ensure that our wide range of social investor partners could each utilise their individual expertise and design reports which worked for them, whilst also ensuring that we had enough consistency between reports to be able to combine them into one large dataset. We therefore agreed up front the information that would be reported, but were deliberately far less prescriptive about the way in which it should be presented. Although we did provide guidance when requested, we initially encouraged each organisation to design reports in whatever way worked best for them. We worked hard to build our own internal processes to allow for this variation as much as possible, by standardising and reformatting the information ourselves.

However, although we were able to do this to a large extent, we found that, in order to ensure we had usable data, there were some variations and inconsistencies which could only be overcome by asking one or more of the social investors to make certain changes to their own reporting mechanisms. We sought input and feedback from the social investors throughout the process, including by bringing together representatives from all live and soon-to-be-live funds at our Growth Fund partners’ community meetings. One of the most significant pieces of feedback was that many social investors actually preferred being given specific guidelines, and most would have welcomed a more prescriptive approach from the start. We took this on board, and used the submissions and feedback we had received to create a template which we could offer to new social investors joining the programme. We also shared the template with the existing social investors to give them the option of using it if they preferred, and for this quarter all have chosen to do so.
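To illustrate what a shared template makes possible, here is a minimal sketch of the kind of automated check a standardised submission allows before it joins the combined dataset. The required fields below are invented for the example; they are not the actual template’s.

```python
import pandas as pd

# Illustrative required fields for a quarterly loan report;
# the real Growth Fund template will differ.
REQUIRED_COLUMNS = {
    "loan_id", "investee_name", "loan_amount",
    "loan_term_months", "drawdown_date", "region",
}

def validate_submission(df: pd.DataFrame) -> list[str]:
    """Return a list of problems found in a submitted report."""
    problems = []
    missing = REQUIRED_COLUMNS - set(df.columns)
    if missing:
        problems.append(f"missing columns: {sorted(missing)}")
    if "loan_amount" in df.columns and (df["loan_amount"] <= 0).any():
        problems.append("loan_amount must be a positive figure")
    return problems

report = pd.read_csv("example_submission.csv")  # hypothetical file
for problem in validate_submission(report):
    print("Fix before collation:", problem)
```

The value of a template is precisely that it makes this sort of checking mechanical: the less judgement needed to interpret each submission, the more reliably the reports can be combined.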

  • Standardised headings don’t necessarily mean standardised data…

It turns out there are dozens of potential ways of defining an ‘application stage’, elements such as ‘loan term’ can be described in anything from a paragraph to a single figure, and there are as many ways of describing ‘geographical reach’ as there are towns, counties and regions in the UK! Given that these are some of the factors which we had initially thought would be simplest to standardise and compare across the dataset, you can imagine how complex collating some of the more qualitative and subjective ones has been.

At first we tried to avoid asking social investors to select options from short lists, for fear that doing so could limit the accuracy or the level of detail in their data; we learned that, if done well, it can actually improve both. Luckily, the fact that we did not take this approach from the start meant that, when we did come to create these lists, a few funds were already underway and reporting. We were therefore able to use their initial reports, as well as feedback from conversations we had had with the social investors about their data collection and reporting methods, to inform these categories and, we hope, to select ones which would be appropriate across the portfolio.
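To make the ‘loan term’ example concrete, here is a sketch (with made-up inputs) of normalising free-text descriptions into a single comparable figure in months. In practice the category lists were agreed with the social investors rather than inferred like this, but the principle is the same: one canonical value per field is what makes the dataset comparable.

```python
import re
from typing import Optional

def parse_loan_term_months(raw: str) -> Optional[int]:
    """Convert a free-text loan term into months, or None if unrecognised."""
    text = raw.strip().lower()
    match = re.match(r"(\d+)\s*(?:year|yr)s?\b", text)
    if match:
        return int(match.group(1)) * 12
    match = re.match(r"(\d+)\s*(?:month|mo)s?\b", text)
    if match:
        return int(match.group(1))
    if text.isdigit():  # a bare figure is assumed to mean months
        return int(text)
    return None

# Made-up examples of the kind of variation described above.
for raw in ["5 years", "60 months", "60", "repayable over the medium term"]:
    print(f"{raw!r} -> {parse_loan_term_months(raw)}")
```

Anything that cannot be parsed is better flagged for a follow-up conversation than guessed at.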

  • However well you plan, there’s no substitute for learning by doing

Two or three years ago, when the Growth Fund was being designed, Access spent a lot of time thinking about what data to collect and talking to our partners about how this might best be done. Inevitably, many initial decisions had to be made before any funds were launched. However much thought went into how these processes might work, we didn’t know what we didn’t know, so it was impossible to foresee every potential problem or inconsistency.

Whilst we would have preferred to be able to work out a perfect data collection and reporting method ready for the first social investor to launch their fund, this was never going to be possible. Although making changes after embarking on a process can sometimes feel like adding complication, we hope that the tweaks we have made have helped, rather than hindered, both current and future Growth Fund social investors. The fact that, for the latest quarter, all of them have chosen to use the new template is a positive sign.

We will of course keep listening to feedback and working to absorb complexity as much as we can, without compromising on the completeness and accuracy of the data we receive. In this way we’re hopeful we will build a meaningful dataset from which we can draw conclusions and learnings for the benefit of the social investors, the wider market and, most importantly, the charities and social enterprises we all exist to support.


We are grateful to all of the Growth Fund social investors for their input into, and feedback on, the reporting processes, and for working with the Growth Fund’s charity and social enterprise recipients to gather all of this information. Particular thanks to Resonance, Key Fund, First Ark and Big Issue Invest, the Growth Fund’s first four social investors, for bearing with us as we learned too. We look forward to sharing the data and learnings with all parties, and more widely, throughout the life of the Growth Fund.

This blog has focussed on the data and processes of our Growth Fund. We are equally committed to utilising and sharing information and learning from our capacity building programmes, currently the Reach Fund, Impact Management Programme and Connect Fund, in conjunction with our partners who are administering these programmes. Data from the capacity building programmes are also included in our quarterly dashboards and updates on all strands of our work are regularly published on our blog.