
Evaluating your Prosper offering

Evaluating the impact of implementing Prosper at your institution can be pivotal in helping tailor your offering to your postdoc audience. Communicating your findings effectively can help you promote engagement and leverage additional resources.

Here we provide a one-stop shop covering all the different types of evaluation we’ve found useful. For the more experienced, the evaluation checklist, the factors to consider when evaluating, and the recommendations on this page may provide some useful reminders.

If you’re unsure what type of evaluation will suit you, check out the ‘What type of evaluation to use?’ section on this page. A range of survey examples, communication emails and dissemination examples are provided in the further resources, for you to use and adapt.

We present how Prosper approached evaluation, including our success criteria and what we sought to measure in our postdocs before, during and after pilot cohort participation. We detail the types of evaluation surveys we ran and the other evaluation methods we used during the cohort, specify how we analysed the qualitative and quantitative data collected, and describe how the analysed data was then used, reported and publicised. Finally, we note some discrepancies.

Evaluation checklist 

  1. Set your success criteria 
  2. Define the data you need to collect – what questions do you need to ask? How do they address your success criteria?
  3. Define how and when you’ll collect evaluation data 
  4. Decide whether follow-up or longitudinal evaluation is required – if so, consider what, when and how 
  5. Plan your data analysis 
  6. Decide how you’ll use the data – checking against success criteria, reporting, disseminating, advertising, and improving your offering 

Factors to consider when choosing how you evaluate

  • Be clear about what success criteria you wish to set for Prosper at your institution. What data will you need to collect in order to confirm you’ve achieved these criteria? What questions do you need to ask in order to improve your offering? What do you need to monitor and evidence? As an externally funded project we had a comprehensive set of success criteria – see Prosper’s success criteria here.
  • Alternatively, consider what you’d like to be able to say or report after a year of implementing Prosper and use that as a guide for the data you’ll need to collect. 
  • Evaluation needs to be considered at the outset so you can collect the necessary data at the right moment. 
  • How will you communicate with your audience? Which communication channels will work best to communicate to the audience you wish to evaluate? 
  • Be pragmatic: ensure your evaluation is not unreasonably burdensome to your time-poor postdocs (or other stakeholders). How often will you evaluate? How long will each evaluation interaction take?
  • What do you plan to do with your findings once you’ve collected them? Consider how you’ll disseminate your findings and what audience you want to reach. How will you tailor your impact reporting to appeal to them? 
  • Consider the blend of quantitative (objective, numeric data) and qualitative (subjective, narrative data) you wish to use.
  • Follow UK GDPR legislation and data retention best practices. 

What type of evaluation to use?

Evaluation doesn’t just mean one-off, anonymous surveys. You can select an evaluation tool that works best for what you want to measure.

Type: Attributable start and end ‘bookend’ surveys 

Qualitative, quantitative or both?: Both 

Pros:

  • Can track individual growth as well as collective group trends over a period of time 
  • Only have to ask for EDI data once 
  • Can analyse data for trends across various EDI characteristics and disciplines 
  • Can get testimonial quotes 
  • Using a blend of quantitative and qualitative questions gives a comprehensive overview and makes for a compelling report 
  • The majority of questions, and the question order, are the same for both surveys 
  • Can invite specific respondents to follow-up focus groups

Recommended for: Cohorts, groups of postdocs

Cons:

  • GDPR agreement (opt-in) needed 
  • Attributable data may affect how honestly respondents answer 
  • Can be lengthy for respondents to complete 
  • Can be lengthy to analyse 
  • Can be tricky to get good response rates to the ‘end’ survey 
  • If this is your main or only feedback mechanism, it can be slow to surface and address issues 
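As an illustration of why attributable ‘bookend’ data repays the analysis effort, the sketch below pairs each respondent’s start and end survey scores and reports both individual and group-level change. The data shape, identifiers and the single ‘confidence’ rating are invented for illustration, not Prosper’s actual survey schema.

```python
# Illustrative sketch: pairing attributable 'start' and 'end' survey
# responses to measure individual growth and the group trend.
# Respondent IDs and the 1-5 confidence rating are hypothetical.

start_responses = {  # respondent_id -> self-rated confidence (1-5)
    "pd001": 2, "pd002": 3, "pd003": 1,
}
end_responses = {
    "pd001": 4, "pd002": 4, "pd003": 3,
}

def individual_changes(start, end):
    """Return per-respondent change for those who completed both surveys."""
    return {rid: end[rid] - start[rid] for rid in start if rid in end}

changes = individual_changes(start_responses, end_responses)
group_mean_change = sum(changes.values()) / len(changes)
```

Because the two surveys share most questions and their order, this kind of like-for-like comparison is straightforward; anonymous one-off surveys cannot support it.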

Type: Attributable ‘pulse’ surveys

Qualitative, quantitative or both?: Both 

Pros:

  • Can provide a quick insight into how the career development offering is going 
  • Can provide an institution-wide postdoc baseline to measure against, if run before and then after your Prosper implementation 
  • Can be relatively quick to complete 
  • Can track individuals over multiple pulse surveys 
  • Can provide a broader overview than a survey of just one session 
  • If EDI and disciplinary questions are asked, you can check for any patterns 

Recommended for: Cohorts, all postdocs at your institution

Cons:

  • Only provides a snapshot; may need to be combined with other methods (such as a focus group to follow up any trends or recurring issues) 
  • If used too frequently, can lead to survey fatigue 
  • Can get poor completion rates 

Type: Anonymous one-off surveys

Qualitative, quantitative or both?: Both 

Pros:

  • Quick to complete 
  • Quick to analyse 
  • Fast turnaround allows issues to be rapidly identified 

Recommended for: Any session attendees (or those registered to attend) 

Cons:

  • No EDI or disciplinary data, so you can’t check for any patterns or correlations 
  • Typically low completion rate 
  • Can overwhelm respondents if lots of these surveys are issued 
  • Anonymity can result in less constructive feedback in comments or free text 
  • No longer-term tracking of impact 

Type: Focus groups

Qualitative, quantitative or both?: Qualitative 

Pros:

  • A topic of interest can be rapidly and effectively discussed 
  • Possible solutions to the issues/concerns/barriers raised can be discussed 
  • Focus group attendees can benefit from hearing others’ views and perspectives 

Recommended for: Cohorts, groups of postdocs, any stakeholders at your institution 

Cons:

  • Can be time-consuming to arrange (date/time/invitees) 
  • Needs a facilitator and a note taker (or to be recorded) 
  • The facilitator needs to strike a balance between overly steering the discussion and keeping to the topic 
  • Time-consuming to go through notes (or recordings) 
  • Care must be taken about which groups of individuals are brought together, to ensure they all feel able to contribute freely

Type: Long form reflective journaling

Qualitative, quantitative or both?: Qualitative 

Pros:

  • A personal narrative and journey is recorded 
  • A good way to track changes over time, such as shifts in mindset, which can be tough to track in other ways 
  • Individuals have lots of flexibility in what and how they reflect 
  • Can reveal issues, barriers or concerns which other evaluation methods haven’t 

Recommended for: Cohorts, groups of postdocs 

Cons:

  • This method of reflection doesn’t suit all individuals 
  • Creating practical prompts can be time-consuming 

If individuals share their reflective journal with you:

  • Can dissuade some from engaging, as they may be reluctant to share their inner thoughts/narrative 
  • Time-consuming to analyse and anonymise 

Evaluation recommendations

  • Use a pragmatic combination of quantitative and qualitative methods for evaluation 
  • Carefully consider which data you collect attributably and which anonymously 
  • Question whether you need to conduct a survey after every individual session; consider what you want to achieve and be realistic about completion rates 
  • As a rule of thumb, we aimed for a 70 to 80% survey completion rate for cohort postdocs 
  • A mixture of quantitative and qualitative methods lets you tell a story with a case study or real-life example, giving a personal insight into the numbers you present 
  • Ask questions which require postdocs to use three words to describe their experience, as these answers can be used to create word clouds.
  • If you run focus groups ensure invited participants represent the diversity of postdocs working across your institution
  • Do consider when you evaluate. Be mindful of other evaluations (CEDARS, institution-wide surveys or initiatives requiring evaluation) and try not to clash or overlap; combine effort where possible. 
  • Do consider asking postdocs to set themselves goals as part of your evaluation as this can assist their career development 
  • Do disseminate your evaluation findings and impact, celebrate your successes and plan how to address or adapt to any issues 
  • Do use evaluation evidence and testimonials to recruit your next cohort or build advocacy for your implementation of Prosper 
  • Don’t fall foul of survey-fatigue! Consider how much and how frequently you request feedback. 
  • Don’t send more than four follow-up reminders to complete a survey
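One recommendation above suggests asking postdocs to describe their experience in three words so the answers can feed a word cloud. Word-cloud tools generally take word frequencies as input, which can be counted with a few lines of standard-library Python. A minimal sketch, assuming responses arrive as free-text strings (the sample answers are invented):

```python
from collections import Counter

# Hypothetical three-word responses to
# "Describe your experience in three words"
responses = [
    "challenging rewarding supportive",
    "Rewarding, eye-opening, supportive",
    "empowering rewarding practical",
]

def word_frequencies(answers):
    """Normalise case and commas, then count word occurrences."""
    counts = Counter()
    for answer in answers:
        for word in answer.replace(",", " ").lower().split():
            counts[word] += 1
    return counts

freqs = word_frequencies(responses)
```

The resulting counts (here, ‘rewarding’ appears most often) can be passed directly to a word-cloud generator or charted as-is in a report.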

Further information and evaluation resources 

Example evaluation survey questions downloads

Please note that all surveys were created and hosted on Jisc Online Surveys. We’ve provided the examples in Word document format so you can more readily adapt them. 

For a postdoc audience


Results, Impact and Dissemination 

See how Prosper did evaluation: find out more