Feedback on expression of interest conversations for “Implementing evidence-based interventions to support older adults’ independence”

Thank you to everybody who took part in a formal expression of interest conversation for our recent funding call focused on implementing evidence-based interventions to support older adults’ independence. We received a huge amount of interest in the call, and held 35 conversations by the deadline of Friday 6th September 2024. We recently let applicants know whether they had been invited to submit a full application and the success rate at this stage was 37%.

Conversations followed the format and assessment process outlined in the call guidelines. We were impressed by the wide variety of work taking place throughout the country, and the panel’s general feedback on proposals at this stage of the application process is outlined below:

  • A number of proposals appeared to be ‘too early’ for the call. We were looking for proposals focused on existing interventions which already had robust evidence of effectiveness, and which were ready to be implemented and evaluated at greater scale in a ‘real world’ setting. In some cases, proposals appeared to be seeking to develop new interventions and then gather evidence on their feasibility / acceptability and/or effectiveness. In other cases, the existing evidence base centred primarily on the feasibility / acceptability of the intervention(s) – something the guidelines stated did not constitute robust evidence of effectiveness. Often evidence of effectiveness was claimed, but the slides lacked supporting data / figures, making it hard to discern how robust this evidence base was. Whilst we weren’t limiting ‘robust evidence’ to published papers, for unpublished evidence (e.g. a commissioned evaluation by an external partner), we would have expected to see data / sources supporting these claims within the presentation.
  • Related to the above, on some occasions the proposed intervention(s) appeared to comprise components from various other existing interventions, often with differing levels of evidence supporting their effectiveness. Without a strong rationale for how these components would work together in the proposed setting(s) / population(s), this approach often led to questions over the extent to which the proposed intervention(s) were truly evidence-based. Sometimes there appeared to be some extrapolation of evidence (e.g. making the case that, because physical activity overall is proven to improve older adults’ independence, any intervention that aims to increase physical activity is thus evidence-based), and again this raised concerns.
  • The panel provided a number of points of feedback related to the presentations and overall delivery:
    • A number of presentations ran over the 20-minute limit, or were close to doing so. This either meant we had to bring the presentation to a close early – to ensure fairness between applicants – or that applicants had to rush through their final slides, which often covered important points such as the proposed approach to equity, diversity and inclusion (EDI) and patient, carer and/or public involvement and engagement (PPIE).
    • In a number of cases, applicants spent a lot of time on the background to their proposal and/or the case for supporting ageing-related research – an important aspect, but not the key focus of the conversation given that it involved a knowledgeable panel – and this meant that there was insufficient time to fully describe the intervention(s) and proposed programme of work (i.e. what the proposal was actually seeking to do). As outlined in the call guidelines, this was what we were really looking for the conversation to focus on. The best presentations gave adequate time to each aspect of the proposal, and provided real clarity on the intervention(s), their evidence base and the work the team were looking to complete.
    • Sometimes answers to questions were simply not concise / on-topic, or they involved a lot of sector-specific jargon. This had the effect of using up valuable time in the conversation and/or making it unclear exactly what was being proposed.
    • Some presentations did not appear to be well-coordinated (e.g. uncertainty over who was speaking to each aspect of the proposal), which raised questions over the wider management / leadership of the programme. In general, it wasn’t always clear how the programme would be managed and governed, and this is particularly important as implementation programmes can often be complex and/or involve a large number of team members / partners.
  • Similarly, the proposed team was often a key factor in the panel’s consideration of the proposal:
    • On a number of occasions there were perceived knowledge / expertise gaps within the proposed team – specific areas that were often raised included expertise in health economic evaluation (more detail on this is provided below) and implementation science.
    • It was sometimes queried whether the choice of lead applicant was driven more by the eligibility criteria for the call (i.e. the need to have an eligible lead institution) than the needs of the programme. This was the case, for example, where the development / previous evaluation of the intervention(s) seemed to have been led by other partners and/or where the lead appeared to have had only limited input into the presentation. This gave the impression the lead may have been brought in out of necessity, rather than having a deep understanding of / connection to the intervention(s) and the proposed work.
    • It wasn’t always clear why each individual was present, particularly when they didn’t make a clear and credible contribution to the presentation and/or conversation.
    • One aspect of the presentation that was often lacking was the specific role(s) of each team member in the proposed work. Whilst we weren’t expecting applicants to provide large amounts of detail on this, given the time restrictions, the best presentations made it clear why each named team member was involved and what aspect(s) of the work they would oversee / be involved in.
  • A key aspect of the call is the generation of evidence on the economic impacts / implications / sustainability of the intervention(s), to help facilitate its longer-term adoption. However, the panel often felt that the financial impact / social impact / health economics aspects of the proposed programme were under-developed. In many cases it wasn’t clear who in the proposed team would be carrying out this work, or what the rationale for the work was (i.e. why the proposed approach had been chosen). Proposals were more convincing when the dedicated impact / economics lead was present on the call and/or named in the presentation and, crucially, when there was a compelling explanation of how this analysis could support the longer-term adoption of the intervention(s) – and importantly, this message was strongest when it was supported by / co-developed with the relevant commissioning / adopter partner (more detail on this is provided below).
  • On that note, there sometimes appeared to be some confusion / conflation of the commissioning / adopter partner(s) and other delivery partners. By “commissioning / adopter partner”, we meant an organisation that would support the implementation of the intervention(s) during the programme and could be in a position to support its longer-term commissioning / adoption following the programme. In some conversations, it became apparent that the organisation(s) upon which longer-term commissioning / adoption would be dependent (e.g. commissioners, Integrated Care Boards, local authorities etc.) were not involved in the conversation and/or had not yet been engaged. From experience, we know that it’s crucial that these partners are involved in the development of the proposal from an early stage to help facilitate longer-term adoption. Relatedly, it wasn’t always clear what the potential routes to adoption were and how the work being proposed could support this. With this in mind, the strongest conversations involved representatives from the relevant commissioning / adoption partner(s), who could clearly articulate their involvement in the proposal’s development and how the proposed programme would inform their decision-making regarding the longer-term adoption of the intervention. They also provided a strong justification for why the proposed work was needed, over and above the evidence of effectiveness that had already been gained.
  • It was sometimes unclear to what extent the intervention(s) was genuinely ageing-related / targeted at older adults. Where an intervention was available to all, but the case was made that a majority / large proportion of ‘users’ were or could be older adults, supporting evidence was needed to provide confidence in this claim.
  • Some proposals appeared not to have fully considered the potential risks that the intervention(s) might pose to older adults, or how these would be managed / mitigated against in a ‘real-world’ setting. Similarly, it wasn’t always clear what support would be provided to older adults to help them to engage with the intervention(s) (more detail on this is provided below).
  • Regarding equity, diversity and inclusion (EDI):
    • It was pleasing to see a stated commitment to engaging with and/or targeting under-served communities in many of the proposals. However, we often found that these ambitions lacked specific actions / strategies to back them up. The strongest teams provided specific details as to how under-served communities would be involved in the proposed work (e.g. connections with relevant community organisations, consideration of language barriers etc.) and, most compellingly, could evidence that they had the appropriate expertise and/or had been successful in doing this previously.
    • There were sometimes outstanding questions over how accessibility had been considered in the development of the intervention(s). In particular, for digital interventions there were sometimes concerns regarding digital inclusion / exclusion which were not fully addressed. In contrast, some applicants explained how they had carried out equality impact assessments on the intervention(s), which highlighted such issues and resulted in a more considered approach regarding how to address them.
  • In a few instances we noticed the use of the terms “elderly”, “subjects” and “hard-to-reach groups”. Whilst we didn’t factor this into our assessment, we thought it would be helpful to highlight this and signpost to some useful guidance on these terms.

With so many interesting and high-quality proposals, the decision on who to shortlist was difficult. We very much appreciate the time and effort put into developing a proposal, and hope that this feedback is helpful to those who were unsuccessful at this stage in taking their work forward. If you are looking for new networks to assist you, do consider joining / reaching out to members of the DMT Academy. You may also be interested in the DMT Academy Ignition Fund, which provides small pots of funding (up to £5k) to be used flexibly, for example to bring together different stakeholders / partners and develop ideas for future funding applications. Finally, do visit the UK Ageing Research Funders’ Forum news page to view other potential funding opportunities.

If you wish to stay updated on other funding opportunities from the Trust, then you can join our mailing list by adding your details to the ‘Join our community’ section of our “Contact us” page. You can also follow us on LinkedIn.
