**This blog entry originally appeared on the website oceanspaces.org.**
One of the more encouraging aspects of working at the boundary between science and decision making is hearing scientists increasingly express the desire for their research to more effectively inform policy and management. However, for many scientists, that boundary appears difficult to navigate.
In an approach we call the Science Needs Assessment, we regularly conduct interviews with managers and policymakers to explore their science needs and identify ways that we can more effectively deliver relevant and useful science. By asking questions about how decision makers access and use science in their work, we are gathering knowledge about the process of science integration from beginning to end, and building stronger relationships across communities in the science-policy landscape.
Here are a few lessons we’ve learned about interviewing for science needs. To learn more about the Science Needs Assessment, visit our webpage at: http://calost.org/science-initiatives/?page=asna
**Process, not product**
Science is not just a piece of information; it’s a process. The same can be said of science needs. When we interview, for example, a manager in the California Department of Fish and Wildlife, we’re not just looking for a list of pieces of information that she thinks would be helpful. We’re interested in how science gets taken up and used as part of her job. What are her approaches to accessing science, and how much, if at all, does that represent a standard approach adopted by the agency as a whole? What capacity does she have to commission new research, or vet existing research? What kinds of constraints are shaping the ways she interacts with science? What makes science credible in her eyes, and why?
**Focus on concrete, tangible examples**
We have found that specific examples can help cut through the jargon and vagueness that often pervade more general discussions about programs. We often ask, “Can you describe an example in which science played a particularly constructive or positive role in something you were doing?” Then we pepper our interviewee with follow-up questions about the details of their answer. What was typical about this example, and what was out of the ordinary?
It’s hard to get managers to talk about “failures” as a counterpoint to the success stories. But people are often willing to respond when we ask, “What’s an example of a process where the role of science could have been better?” This can lead to valuable lessons for scientists on ways they can be more effective, and in reflecting on these situations, managers often acknowledge things they could have done better themselves.
**Check your ego at the door**
There are a few ways that your ego or existing knowledge can get in the way of a good interview. It’s hard to listen to someone tell you things that you already know, but of course this will inevitably happen, especially as you conduct more and more interviews. But proving to your interviewee that you know things is a distraction.
Avoid the impulse to ask leading questions. These push your interviewee toward answers that align with your own thinking and are therefore less informative to you on the whole. For example, instead of asking:
“Would you say that your ideas about which science is credible are driven by advice from scientists, or the views of your constituents?”
try something like:
“What do you look for in order to feel that science is credible?”
You also need to remember that your interviewee is doing you a favor, even if your intention is ultimately to help him. In this setting, he is the expert, and you are there to learn from him.
This points to a fundamental aspect of working across the boundary between science and decision making. We are not there to bring expertise to non-experts; we are there to find ways to connect different domains of expertise, such as management, science, and local knowledge.