Are we asking the right questions?

I have used the Summer Reading surveys for adults, teens, and kids, and added my own questions to them. One question was “Did you have fun?”; another was “Do you think summer reading is an important component of Teen Services?” I found these questions more important than the “What did you learn?” type questions.

As I prepare to begin Book Discussion group surveys (for kids, teens, and adults) using the Education/Lifelong Learning survey, I am concerned about the focus on learning something new, using resources better, etc. We run discussion groups to encourage reading, to foster communication with one another, and to offer an opportunity to explore different genres of literature. We also do this to “have fun, relax and enjoy a community experience.”

I understand why there is a strong focus on a “result” to connect with funders, but we don’t always see “results” with all of our programs. Couldn’t we find a survey that treats as results things like making someone feel included, experiencing enjoyment, meeting new friends, or becoming aware of community opportunities?


I have so many thoughts about this. First, before I was a library director, I was a director of marketing, so I spent lots of time designing and analyzing surveys; you get the idea. One of the things that jumps out at me is the language: “Do you think summer reading is an important component of Teen Services?” It could be simplified a bit and made more approachable. There has to be a simpler way to ask this, and who are we asking? Would this feel like a test to them?
And I agree 100% that the community experience should be, and can be, a “result.” It seems to me that this kind of community engagement, which also includes literacy engagement, is a good and worthwhile result. The way our current society is set up, an outcome that includes communication and the ability to learn with others during that discussion is so very important.

One of the things we notice with our toddler programs is that, for new parents, they are an opportunity to meet and bond with other parents. That is important, and while we work with a pre-school focus, another outcome is socializing for the adults, especially those with English as a second language.

You are spot on with your concerns, and unless there is some sort of testing going on to judge results, you will be hard-pressed to demonstrate them. If you define the results another way and manage expectations for funders (and why can’t fun be a result?), then the questions will design themselves.


Good question, Diane. While the Task Force did not prioritize measuring fun or relaxation outcomes, we know those are important components of many library programs. You’re on the right path in choosing the Education/Lifelong Learning survey, since it is our most generic and can be widely used across a variety of programming.

We did add those customization pieces to the survey management tool so libraries could have more flexibility in what they’re surveying. So while you can’t edit the standardized questions, you can add some intro messaging to provide context about what your program aims to achieve, so that patrons read it before filling out the generic outcome questions. For example: “The Book Discussion Group is designed to encourage reading, communication with others, and discovering new genres of literature.” Adding your own custom questions at the end can also help target those other outcomes, such as feeling included, enjoying a community experience, or discovering new genres. Also, patrons can always respond “n/a” if an outcome doesn’t apply to a particular program.

I hope this helps!


We are looking carefully at our SRP survey data to do a variety of things: inform programmatic design, understand community reach, and contribute to reporting to a program funder. Fortunately, the learning focus works for us and aligns with the library/funder’s logic model.

Staff still ponder the impact of SRP as only one of many potential interventions in the lives of readers during the summer. Sifting through the results has helped staff consider responses to the additional questions, the baseline questions, and the comments more carefully.

What we do have, for the first time, is data to help us understand the difference the programs are making in our community. The next step for us is increased cross-agency collaboration (parks/rec, etc.) to deepen program opportunities. We knew the kids were having fun at the programs we offered. What we didn’t know was whether they were reading more as a result of participating, whether they were more prepared to return to school, etc. As I tell staff regularly, just because someone borrows a book doesn’t mean they read it.

The SRP surveys, as well as the others, help us understand patron engagement with the programs in ways we couldn’t before. One change from the data is that we will engage SRP participants differently in the surveys throughout the summer, not just at the end of the program. We will see what we learn from that. Still a work in progress!


Thanks for your input! It is true that you can customize, which we are doing. I think my bigger concern is that if we ONLY align with needing to “learn something new” we are not recognizing the larger mission of the public library. We say that we are a “community center for all ages,” which is not a new concept for libraries.

When we did our Summer Reading surveys, we received many notes on the question for adults about reading more. Many said some variation of “I can’t possibly read any more than I do already.” When the numbers were crunched, it looked like people were not reading as much.


I agree, but I think that we have an opportunity with this large data collection tool that we should not miss!


Thank you for the discussion. I agree that these questions are important for stand-alone programs. But if surveys are used after a series of programs, like a language course, we don’t have to ask whether patrons liked it or had fun. If they kept coming back, that means they enjoyed the program and possibly had fun.
I have another question:
I am running the Education/Lifelong Learning surveys for adult English learners, and our students find some of the questions very hard to understand. Although these are the right questions, they need to be simplified for beginning and even intermediate-level students before they can be answered. For example: “You feel more confident about what you just learned” and “You intend to apply what you just learned.” I think the follow-up questions are even harder.
I would love to have these questions simplified or rephrased.
Please advise if I can do so.
Thank you.

At Sno-Isle Libraries, while considering what meaningful measures of our programming services would be, we also had to think about the intent of those services. This led to a discussion and a differentiation of event types.

A “program” is an event designed and measured by outcomes in knowledge, awareness, behavior, and skill. We assume we will deliver these events with fun, engagement, activities, etc., because we understand and value the tenets of instructional design. And we like to have fun at work. :wink: We can measure outcomes for these with Project Outcome survey tools and methodology.

A “community event” is an event planned and measured by outcomes in awareness, community partnerships, or perception of the library as a community anchor. This is where the “just for fun” connection can be found. We are still figuring out how to measure outcomes for these, or whether specific outputs are “good enough” indicators of success. For example, if the intent is to get people to have fun at the library, is sustained attendance a success measure? Happy or Not buttons?

Great question & fun discussion! Thank you!

I should add: is surveying even necessary for these types of events and their outcomes? Or are we really looking at community-level indicators that measure the library’s position and influence in the communities we serve and strive to make successful…