What questions should I ask?

Evaluation should no longer be about numbers alone. A large number of attendees through the door tells you little about who participates, how deeply they engage or what they take home.

Consider measuring how your project has supported participants’ sense of agency, achievement, inclusion (e.g. feeling they belong), connection to science or confidence (among others).

There are many frameworks for evaluation, but STFC produced a simple toolkit for projects and programmes working with ‘Wonder’ audiences. The toolkit rearranges outputs and Generic Learning Outcomes (GLOs) into three areas:

Contextual: What is your starting point? (This could include personal characteristics or pre-existing relationships to science.)

Reactive: What did you think of it? (Gathering immediate reactions following an engagement, including whether they feel welcome, inspired, etc.)

Reflective: What sticks/what will you do next? (This might take time to emerge and looks at more substantial change in interaction, identity or engagement with science, so ideally would be collected after a period of time following the interaction.)

TOP TIP

If you are truly looking for iterative and honest responses, particularly to make tweaks and changes to activities, consider inclusive techniques that support anonymous answers (e.g. asking for anonymous comments on post-it notes). This helps gather more of the seldom-heard voices among your participants and reduces the temptation for participants to please the facilitator by being only positive, which is a particularly high risk when you have built positive relationships!

Contextual data you may wish to capture include:

  • Ages and genders (ensure the options are not binary)
  • Postcode data (which can be used alongside the Indices of Multiple Deprivation or, for schools, the percentage of pupils eligible for Free School Meals), plus descriptive details of your community partner (see the sketch after this list)
  • A pre-existing relationship to science may be captured by the question: “Someone in my family is really into science or works in science” (Yes/No/Don’t Know).
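
A minimal sketch, in Python with pandas, of how postcode data might be joined to IMD deciles for this kind of contextual summary. The file names and column headers below are assumptions rather than any published dataset; substitute whatever lookup table you are using (for example, an ONS postcode-to-LSOA file joined to the published IMD figures).

    import pandas as pd

    # Hypothetical lookup: one row per postcode with its IMD decile (1 = most deprived).
    imd_lookup = pd.read_csv("postcode_to_imd_decile.csv")    # columns: postcode, imd_decile

    # Participant postcodes collected at sign-up (no names attached).
    participants = pd.read_csv("participant_postcodes.csv")   # column: postcode

    # Normalise postcode formatting before joining (strip spaces, upper-case).
    for df in (imd_lookup, participants):
        df["postcode"] = df["postcode"].str.replace(" ", "").str.upper()

    merged = participants.merge(imd_lookup, on="postcode", how="left")

    # Summarise how many participants fall into each decile.
    print(merged["imd_decile"].value_counts().sort_index())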

Reactive questions you may wish to ask include:

  • Did you feel you were able to join in and ask questions? (e.g. 5-point scale from ‘Very much’ to ‘Not at all’)
  • Do you want to find out more about something you have learned?
  • Did you feel comfortable in this science space?

Reflective questions you may wish to ask include:

  • Do you feel science has relevance to your daily life?
  • Do you think science can have a positive impact on our future?
  • Are you more or less interested in studying science or working in science?

Case Study from Explore Your Universe

When community partners did not want their participants to be involved in any evaluation personally, this had to be respected. Explore Your Universe therefore used reflection journals as the main requirement for data capture. These journals had to be completed – in part – in consultation with the community partner practitioner, and covered a description of the participants’ demographics, whether people were actively involved, and whether there were any moments where participation or behaviour was more engaged than usual (termed ‘meerkat moments’).

Steer clear of biased questions

Think about how to collect data in ways that will make the data meaningful. A key risk is capturing overly positive responses simply because people are in the middle of having fun and want to be kind. How might you collect the data you need without making people feel obliged to be positive?

Think about how you frame your questions. “How much did you enjoy your visit today?” or “How much do you love science?” will lead to more biased answers than open questions that avoid leading participants or focusing too narrowly on the subject, for example: “Thinking about today, how do you feel? What would you like to share with a friend? What would you change?”

When you finally sit down to look at the data you’ve collected, keep in mind that other biases you did not anticipate at the time may also be in play.

Science Capital questions, in their entirety, are really personal. When we have to do these surveys, people can start to wonder if we are judging them or questioning their validity to be here.

CEO, Science centre, Audience monitoring and Science Capital report (October 2021)

Navigate intrusive questions

There is an ongoing tension between what feels acceptable and welcoming, and the requirement for rigorous data capture from individuals representing our most marginalised audiences.

If you use a questionnaire or survey to gather this data, be sure it meets the needs of your participants in terms of language, wording and levels of literacy.

TOP TIP

If the planned evaluation feels uncomfortable for you, it’s probably worse for your participants. Don’t put your relationship and trust at risk. Take your lead from your community partner. There are always alternative ways to capture the data you need (e.g. an interview with your community partner).

How to capture the magic

There are many creative and playful methods that can ‘gamify’ certain questions into activities (e.g. participants returning lab-coats onto coloured hooks based on their answer to how enjoyable they found the session). Different approaches to acquiring the answers you seek can help reduce the need for long questionnaires.

Be creative

Case Study from Explore Your Universe

Used to strict requirements from project funders, staff from one science centre spent a large proportion of their first session completing feedback forms with the families. The forms needed facilitation, and participants struggled with the level of literacy required by the multiple questions, particularly as the forms had not been translated into the participants’ first language.

The learning from this was rapid and was taken forward to produce far simpler and more creative ways (single questions, stickers, emojis, etc.) of capturing experience. A flexible approach to evaluation methodology avoids diminishing the activity, experience or relationship when capturing impact.

TOP TIP

You don’t have to rely on a post-event survey form to gather feedback. Questions could be asked during ticket sign-up, conducted as interviews or observations, or embedded into the experience itself.

Co-creating evaluation tools with community partners

Community partners and, where possible, participants should be involved in the design and co-creation of evaluation tools. They will know, for example, whether the language needs to be modified or translated, if the content needs more detail or simplification, or if images are misleading. They can also help their families or young people to fully understand voluntary informed consent.

Before you decide on a particular tool be sure to:

  1. Sit down with your community partner and discuss the tools you are using. Explain what they measure and why you want these answers.
  2. Find out what they wish to measure and settle on commonalities. Avoid adding to the list; instead, agree on what you will really use, so you don’t overload your participants with questions.
  3. Think about which methods are practical, how you will gather the data and how you will inform participants about consent.
  4. Strike a balance between asking for the community partner’s support and demanding too much of them.

Many of the most meaningful and long-lasting impacts of partnership work are too subtle or awkward to translate into quantitative data (for example, the trust developing over the course of a project between partners). This is where qualitative approaches and case studies can help capture the full story.

TOP TIP

A reflective journal is a great tool for both qualitative and quantitative data capture. It should be completed after every session and can help capture the ‘magic’ key moments, the stories of engagements and the unexpected outcomes.

In addition to metrics data (e.g. number of attendees), reflection journals can also capture formative evaluation: all the changes you made along the way in response to feedback.

The reflection journals were a useful tool and the principles of reflective practice that were stimulated through conversation about the journals will be integrated into team planning in W5 going forwards.

Practitioner, W5

Case Study from Explore Your Universe

At the start of the programme, purpose-designed partnership cards were used to share how new relationships were settling in. The cards contained single words such as ‘tourist’, ‘guide’, ‘pilot’, ‘passenger’ and even ‘baggage’ as options to choose from! The cards were intended as conversation starters, but the pairs of words chosen did reveal some interesting dynamics. An exceptional, long-lasting partnership shared ‘teammate’ from the perspective of both the science centre and community partner practitioners, whereas a partnership that struggled to reach an equitable balance in the first phase of delivery shared ‘passenger’ and ‘performer’ from the community partner’s perspective.

Both your own experience and that of your partner should be a valued part of the evaluation picture. It is often your partner who can spot early signs of challenge or draw attention to ‘light bulb’ or ‘meerkat’ moments, or other indicators of impact that would otherwise be missed. Including this voice in your evaluation uncovers a wealth of impact and understanding from a different but critically important perspective.

This can be captured effectively through interviews or by completing reflection journals together.

Interrogating practice with the reflective diaries gave us a new dimension to how we were doing things... Reflective diaries are now central in everything we do at Aberdeen Science Centre. All team members are encouraged to use reflection on a daily basis.

Practitioner, Aberdeen Science Centre

Capturing the magic

Evaluation is about building a broad picture that can tell a story of impact and learning. No single evaluative tool is perfect by itself, so a good rule of thumb is to collect and triangulate data from three different approaches to help capture this bigger picture (e.g. reflective journals, practitioner surveys and observations).

Pre and post data

If you are looking at multiple interactions or changes over a sustained engagement, another consideration is whether you want to connect results to an individual (for tracking impact or change). This raises questions about anonymity, but there are creative ways to track individuals anonymously from pre to post without asking outright for names or dates of birth.
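
One creative option, sketched below in Python, is a self-generated linking code: participants answer the same few stable, non-identifying prompts on both the pre- and post-surveys, and the answers are combined and hashed so the two responses can be matched without any name or date of birth being stored. The prompts shown are purely illustrative assumptions; agree suitable ones with your community partner.

    import hashlib

    def linking_code(*answers: str) -> str:
        """Combine a participant's answers into a short, non-reversible code."""
        cleaned = "|".join(a.strip().lower() for a in answers)
        return hashlib.sha256(cleaned.encode("utf-8")).hexdigest()[:8]

    # The same answers given before and after the programme produce the same code,
    # so pre and post records can be paired without identifying anyone.
    pre_code = linking_code("B", "July", "2")    # e.g. first letter of home town,
    post_code = linking_code("b", "july ", "2")  # birth month, number of older siblings
    assert pre_code == post_code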

Allow enough time

Build in plenty of time to do this properly, not at the last minute when everyone is jumping on the minibus home!

Gathering quality reflections doesn’t require a formal focus group. Depending on the age and nature of the group, you could gather reflections through a short discussion, or by asking everyone to think for five minutes before writing anything down on their form. Either way, it is worth encouraging them to take their time.