What questions should I ask?

Evaluation should no longer be about numbers alone. A large number of attendees through the door does not capture who participates, how deeply they engage, or what they take home.

Consider measuring how your project has supported participants’ feelings of agency, achievement, inclusion (e.g. a sense of belonging), connection to science or confidence (among others).

There are many frameworks for evaluation, but STFC produced a simple toolkit for projects and programmes working with ‘Wonder’ audiences. The toolkit rearranges outputs and Generic Learning Outcomes (GLOs) into three areas:

Contextual: What is your starting point? (This could include personal characteristics or pre-existing relationships to science.)

Reactive: What did you think of it? (Gathering immediate reactions following an engagement, including whether they feel welcome, inspired, etc.)

Reflective: What sticks/what will you do next? (This might take time to emerge and looks at more substantial change in interaction, identity or engagement with science, so ideally would be collected after a period of time following the interaction.)

Top Tip

If you are truly looking for iterative and honest responses, particularly to make tweaks and changes to activities, then consider inclusive techniques that support anonymous answers (e.g. asking for anonymous comments on post-it notes). This helps gather more of the seldom-heard voices from your participants and also helps reduce the temptation of participants to please the facilitator by only being positive, which is a particularly high risk when you have created positive relationships!

Contextual data you may wish to capture might include:

  • Ages, genders (ensure this is not binary)
  • Postcode data (which can be used alongside the Indices of Multiple Deprivation or, for schools, the percentage of pupils eligible for Free School Meals); descriptive details of your community partner
  • A pre-existing relationship to science may be captured by the question: “Someone in my family is really into science or works in science” (Yes/No/Don’t Know).

Reactive questions you may wish to ask might include:

  • Did you feel you were able to join in and ask questions? (e.g. 5-point scale from ‘Very much’ to ‘Not at all’)
  • Do you want to find out more about something you have learned?
  • Did you feel comfortable in this science space?

Reflective questions you may wish to ask might include:

  • Do you feel science has relevance to your daily life?
  • Do you think science can have a positive impact on our future?
  • Are you more or less interested in studying science or working in science?

Science Capital questions, in their entirety, are really personal. When we have to do these surveys, people can start to wonder if we are judging them or questioning their validity to be here.

CEO, Science centre, Audience monitoring and Science Capital report (October 2021)

Steer clear of biased questions

Think about how to collect data in ways that will make the data meaningful. A key concern is capturing overly positive responses because people are in the middle of having fun and they want to be kind. How might you manage to collect the data you need without making people feel obliged to be positive?

Think about how you frame your questions. “How much did you enjoy your visit today?” or “How much do you love science?” will lead to more biased answers than open questions that avoid leading participants or being too subject focused, for example: “Thinking about today, how do you feel? What would you like to share with a friend? What would you change?”

When you finally sit down to look at the data you’ve collected, consider whether other biases may have been in play that you did not anticipate at the time.

Navigate intrusive questions

The tension between what feels acceptable and welcoming and the requirements of rigorous data capture from individuals representing our most marginalised audiences is an ongoing concern.

If you use a questionnaire or survey to gather this data, be sure it meets the needs of your participants in terms of language, wording and levels of literacy.

Case Study from Explore Your Universe

When community partners did not want their participants to be involved in any evaluation personally, this had to be respected. Explore Your Universe therefore used reflection journals as the main requirement for data capture. These journals had to be completed – in part – in consultation with the community partner practitioner, and covered a description of the demographics of the participants, whether people were actively involved, and whether there were any moments where participation or behaviour was more engaged than normal (termed ‘meerkat moments’).

Top Tip

If the planned evaluation feels uncomfortable for you, it’s probably worse for your participants. Don’t put your relationship and trust at risk. Take your lead from your community partner. There are always alternative ways to capture the data you need (e.g. an interview with your community partner).