
Volume 102, Issue 1 e01739
Communicating Science
Open Access

Research Briefs through Interpreters’ Eyes: Recommendations for Scientists Sharing Park-Based Research

Martha Merson (Corresponding Author), TERC, 2067 Massachusetts Ave., Cambridge, Massachusetts, 02140 USA

Alyssa Parker-Geisman, TERC, 2067 Massachusetts Ave., Cambridge, Massachusetts, 02140 USA

Louise C. Allen, Winston-Salem State University, 601 South Martin Luther King Junior Drive, Winston-Salem, North Carolina, 27110 USA

Nickolay I. Hristov, Winston-Salem State University, 601 South Martin Luther King Junior Drive, Winston-Salem, North Carolina, 27110 USA
First published: 24 September 2020

Today’s scientists and science communicators must navigate a complex and rapidly changing media environment (National Academies 2016). Americans are divided along party lines in how they view the value and objectivity of scientists. In particular, there are sizable gaps between Democrats and Republicans in trust of scientists whose work is related to the environment (Funk et al. 2019).

In a media landscape pocked with skepticism and false claims, partnerships with trusted institutions are essential (Nesbit 2014, Skorton 2018). Intermediaries like docents, interpretive rangers, and guides who work in museums, zoos, and aquariums win high marks for trust among public audiences. These interpreters have the potential to increase the public’s perception of the value of scientific research (National Research Council 2009, Schneider 2017). As truth is replaced with true-sounding “post-facts,” it is critical for science to advance the most accurate knowledge (Plavén-Sigray et al. 2017). One way to do so is to maximize its accessibility to non-specialists. However, scientists and interpreters face time constraints that limit opportunities for in-depth, face-to-face exchanges about place-based studies. Furthermore, scientists’ published, peer-reviewed articles are often not accessible to professional audiences outside academe (Merson et al. 2017). With the readability of scientific texts decreasing (Plavén-Sigray et al. 2017), research briefs are a practical compromise.

Briefs are produced across disciplines. Here, we focus on their features and usability within the National Park Service (NPS). As explained on NPS websites, 18 research learning centers and 32 regional offices of the Inventory & Monitoring (I&M) Division support science scholarship and communicate park-relevant findings. Because lengthy technical reports do not meet the needs of all audiences, these offices produce resource briefs: “concise highlights of monitoring findings in a visually appealing format that are directed towards a wider range of NPS staff” (Mims et al. 2016). Briefs are vetted by scientists and staff with disciplinary expertise, making them reliable sources of information that convey essential facts to their readers.

Briefs in the NPS have a long history and have long been intended for a broad audience. According to park historians, today's resource briefs are similar in purpose to the Yellowstone Nature Notes of the 1920s. Written by the park naturalist, the notes were primarily intended for internal distribution but were also more widely available (Schuller and Whittlesey 2000). More recently, The Science Communication Framework Vision and Principles set forth a vision for science communication to inform decision-making and to provide opportunities for diverse audiences to develop personally relevant intellectual and emotional connections with parks (Outreach Technical Advisory Group 2016). While science coordinators and science communicators within the Park Service identify resource managers and upper-level administrators as a desired audience, they also imagine that their briefs will meet the needs of a broad audience, satisfying the curiosity of members of the public and giving interpreters background on park-based research (Beer, personal communication, 4 May 2017; see also O’Herron 2009, Mims et al. 2016). For example, the Ocean Alaska Science and Learning Center (OASLC) describes its process as follows:

The OASLC staff collaborates with park staff from a variety of disciplines as well as natural resource staff with Alaska's Southeast and Southwest Inventory and Monitoring Networks (I&M) in developing and providing resource briefs. … The OASLC works with park and I&M staff in determining who the main audience will be when drafting these resource briefs, as well as working with staff in order to determine the best ways to disseminate resource brief information (for example: websites, interpretive training, and/or presentations). The OASLC posts current and past resource briefs on the OASLC website, as well as links to where these briefs are housed on park and I&M sites (available online; referred to on March 25, 2020).1

In this article, we focus on the format preferences and priorities of an essential group of potential users of research briefs. We contend that slight changes in the typical format and content will make briefs more useful to docents, volunteers, and interpretive rangers (hereafter all referred to as interpreters) who are in a position to communicate science in parks, nature centers, and other informal learning environments. The parks where they work function as our nation’s valuable outdoor laboratories, an underappreciated aspect of protected land (Shea 2015, Char et al. 2020). Discussing place-based, ecological studies is a natural fit for park interpreters. The National Association for Interpretation defines interpretation as “a mission-based communication process that forges emotional and intellectual connections between the interests of the audience and the meanings inherent in the resource.” The tag line is even more explicit about the place-based nature of interpretation: “If you help visitors learn about a place that’s important to you, you’re an interpreter” (National Association for Interpretation 2020).

Methods

In the name of advancing the public’s understanding of park-based science, Interpreters and Scientists Working on Our Parks (iSWOOP, NSF DRL-1514776) project leaders have worked to promote partnerships between scientists and National Park Service interpretive rangers. iSWOOP did so mainly by coordinating on-site and online professional development opportunities. However, we were also curious to gauge interpreters’ impressions of briefs as another avenue for increasing the accessibility of relevant research. To do so, we needed to understand both what interpreters encounter when they consult briefs and how they use them.

We downloaded and analyzed features of existing briefs from Park Service websites and conducted interviews with a convenience sample of interpreters. Because our goal was to arrive at a format that would be most supportive of interpreters’ efforts to highlight the value and relevance of park-based research, the semi-structured interviews included a hypothetical situation and task as well as questions about interpretive practice. To gauge their impressions of briefs, we offered two versions of a brief: one traditionally formatted, and one with a shorter title, sub-headings posed as questions to pique curiosity, and a first page with half the space allocated to photos (Fig. 1; complete versions of the briefs are included in Appendix S1).

Fig. 1. Comparison of elements from two versions of briefs. The map indicating the study sites is from Version 1, the more traditionally formatted brief. The map offers information on locations and area covered. Maps of land forms tend to depersonalize a study. In Version 2, a photo from the field replaced the map. The photo introduces a member of the research team. An observant reader who notices the trap in hand finds the beginning of an answer to the question: How do scientists know what they know? Photo credit: Ryan Lebar.

One researcher conducted in-person, semi-structured interviews with individuals and pairs of NPS interpreters. An effort was made to pair resource management staff, who might bring different strategies and lenses to the briefs, with interpreters with 3–5 years of experience. In this group of interpreters, one would expect mastery of the basics of the job, capacity to take on a challenge, and several hours each workday spent interacting with the public. Ultimately, interviewees included interns and research learning center staff as well as interpreters focusing on social media rather than in-person interpretation. Interviewees had a range of experience and backgrounds: years with NPS ranged from nine months to 30 years, and degrees included science education, ecology, archeology, criminal justice, graphic design, and biology. In this article, we focus on the reactions of five experienced interpreters whose job responsibilities included formal and informal interpretive interactions with the public. The five included two male and three female frontline interpreters with 3–8 years of experience interpreting for the National Park Service.

Interview structure and briefs presented: A task to elicit interpreters’ responses

In order to understand interpreters' impressions of the format and content of research briefs, we prepared two versions of a brief summarizing a study of the impact of climate change on amphibians in northwest Indiana (Appendix S1). The briefs drew on annual reports submitted by a herpetologist researching amphibian abundance in the Midwest. The timing of breeding at field sites across northwest Indiana illustrated the value of parks as an outdoor laboratory. An NPS science coordinator had authored and shared her draft (but retired before finalizing and publishing it). iSWOOP staff updated the format to resemble published briefs. Version 1 was the more traditional. Its sub-headings included: Importance, Methods, Preliminary Results, Future Research Directions, and Implications for Park Management. iSWOOP staff used the content from Version 1 to create Version 2, giving it a shorter title (Frogs and Salamanders of Indiana Dunes), sub-heads in the form of questions (such as Methods: How did scientists determine an answer?), and large photos with a researcher and salamander on page 1 in place of the map and graph that appeared on Version 1 (Fig. 1). Audio- and video-recorded interviews ranged from 35 to 75 minutes. About half the total interview time was dedicated to a think-aloud protocol. The interview began with a question about the interpreter’s background and a question about what they found most rewarding in interpreting science. Rangers were then asked to imagine they were new at a park that wanted to increase interpretation of park-based research on amphibians (Appendix S2). The wording of the task (Fig. 2) was meant to elicit interpreters’ professional and practical needs as they imagined interpreting the information for the public.

Fig. 2. The task elicited interpreters’ needs as they imagined interpreting information on Brodman’s amphibian research for the public.

Interviewees could choose one of the two briefs on offer. While looking over Versions 1 and 2, interpreters were asked why the one they selected appealed to them and to comment as they read. Interpreters verbalized their thoughts as they perused, read, or skimmed the brief of their choice (Nielsen 2012). After completing the read-through, interpreters answered open-ended questions about how they could use the information they had gleaned in an interpretive interaction: what they would use as a hook for capturing the attention of visitors, and how they would establish relevance for audience members of different ages. These open-ended, semi-structured interview questions related to interpretive practice: knowledge of the audience for capturing interest and making the content relevant; knowledge of the resource for confidence with the content; and the personal rewards of interpretation. Finally, interpreters were asked to rate on a 10-point scale: How confident would you feel talking about this topic with visitors for less than 5 minutes, 5–10 minutes, or more than 10 minutes?

Coding interview transcripts

We created transcripts based on notes and audio recordings. Utterances from the think-aloud portion of the interview were grouped into three categories: Format/Layout, Text, and Images. Utterances were then coded according to their positive or negative valence. Responses such as “nice” and “useful” were coded as positive. Critiques, expressed lack of clarity, and indications of disengagement (“This picture is dark,” or “I can’t make out the axes,” or “I didn’t feel like reading more”) were coded as negative. Descriptive codes were developed based on responses to open-ended, semi-structured interview questions related to interpretive practice (Charmaz 2006, Hsiung 2010). We also used a set of a priori categories related to interpretive craft, for example, emotional and intellectual connections, hooks, and relevance. Emerging themes were signaled by repeated use of phrases (Saldaña 2015).
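
To make the bookkeeping behind this coding step concrete, the sketch below (in Python) tallies coded think-aloud utterances by category and valence. It is a minimal illustration only; the example records are invented for demonstration, not data from the study.

```python
# Minimal sketch: tallying coded think-aloud utterances by category and valence.
# The records below are hypothetical examples, not data from the interviews.
from collections import Counter

# Each coded utterance is recorded as a (category, valence) pair.
coded_utterances = [
    ("Images", "positive"),         # e.g., "nice"
    ("Images", "negative"),         # e.g., "This picture is dark"
    ("Format/Layout", "positive"),  # e.g., "useful"
    ("Text", "negative"),           # e.g., "I didn't feel like reading more"
]

tally = Counter(coded_utterances)
for (category, valence), count in sorted(tally.items()):
    print(f"{category:<14} {valence:<9} {count}")
```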

Coding features of research briefs

When interpreters do seek out research briefs, what do they encounter? To understand the common features and variation among existing briefs, we created a sample from the pool of briefs made available on the NPS I&M networks’ regional websites. From the 32 regional I&M home pages, we navigated to Reports and Publications. The dropdown menus vary from there; we selected Briefs and Summaries. If that was not an option, we chose Quick Reads. If neither option appeared, the region was not represented in our sample. We selected briefs on wildlife and plant species because of public interest in seeing wildlife and iconic plant species.

Two researchers developed and tested the protocol for reviewing existing briefs, making determinations such as whether to count text in captions (we did). We established categories based on readability, which affects access; the presence of visuals and stories, because both are popular interpretive techniques; and relevance, which is tied to interpreters’ mission to form intellectual and emotional connections (Ham 1992). The interviews and the first round of our analysis of briefs occurred concurrently. Because interpreters spoke to the complexity of graphs and the usefulness of photos, we refined our assessment by coding for different types of graphs and the percentage of space allocated to visuals. In addition, we created codes for the purpose of images, for example, sets the scene, shows the researcher in action, or displays a resource or phenomenon that is rarely seen.

Results

We looked at a number of characteristics of briefs, including use of images and figures, story elements, relevance, and readability, as these elements commonly inform interpretive interactions. The 61 briefs collected were reviewed for readability. On the Flesch reading ease scale, the higher the score (0–100), the more readable the text. The scale takes into account the average number of words per sentence and the average number of syllables per word (McGovern 2019). Scores ranged from 10 to 66, with a mean of 34.5. Fifty-three of the briefs scored “very difficult” (0–30, which may be comparable to reading The Affordable Care Act) or “difficult” (30–50, which may be comparable to reading an academic paper on chess) (McGovern 2019). Just one scored in the range considered plain English (60–70, an eighth or ninth grade reading level), while five could be considered appropriate for high school students (50–60). Half of the briefs explicitly addressed the “importance” of the topic or research efforts. Though we planned to count briefs with a story arc and expected to code for one of three archetypal conflicts (good vs. evil, unanticipated connections, and breakthroughs), just one brief had an identifiable narrative.
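
For readers who want to run a similar screen on their own briefs, the sketch below shows one way to compute a Flesch reading ease score and assign the bands cited above. It is a minimal illustration under stated assumptions, not the tooling used in this study, and its syllable counter is a rough vowel-group heuristic.

```python
# Minimal sketch: Flesch reading ease from word, sentence, and syllable counts.
# The syllable counter is a rough heuristic; this is not the study's tooling.
import re

def count_syllables(word: str) -> int:
    """Approximate syllables as groups of consecutive vowels (minimum 1)."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    words_per_sentence = len(words) / max(1, len(sentences))
    syllables_per_word = syllables / max(1, len(words))
    # Standard Flesch reading ease formula.
    return 206.835 - 1.015 * words_per_sentence - 84.6 * syllables_per_word

def band(score: float) -> str:
    """Bands used in the Results: <30 very difficult, 30-50 difficult,
    50-60 high school, 60-70 plain English."""
    if score < 30:
        return "very difficult"
    if score < 50:
        return "difficult"
    if score < 60:
        return "high school"
    if score < 70:
        return "plain English"
    return "easier than plain English"

sample = "Frogs at the monitored wetlands began calling earlier in warm springs."
score = flesch_reading_ease(sample)
print(f"{score:.1f}: {band(score)}")
```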

Images are a large part of most research briefs. By images we mean graphic elements, including figures, maps, and photos. We estimated the space given to graphic elements (to the nearest quarter page). In the 61 briefs collected from inventory and monitoring projects, authors typically gave graphic elements between 25% and 50% of the space. All of the briefs featured at least one image. These typically provided visual interest, illustrated text, or described trends with photos, maps, and graphs. Although nine briefs included 6–10 visual elements, the mode was just two (16 briefs of 61). We coded for images that presented something puzzling or rarely seen, or that showed the researcher in action. However, we found very few. Of 228 total images and figures, just eleven included photos of researchers, and just eight presented an image that might prompt curiosity or posed an actual question that would provoke active reading for an answer. More common were photos of the landscape or of a particular species at a common or familiar scale.

A focus on images and use for public programs

Interpreters followed the interviewer’s request to talk aloud as they previewed and then read a research brief. Three aspects of interpreters’ reactions were striking: their first impressions, their purpose-driven approach, and their interest in next steps for visitors. Where interviewees are referred to, we have used pseudonyms. Commenting on their first impressions, interpreters noted that a recognizable format (NPS banner, logo, and typeface) gave credibility to the information.

In particular, images were a focal aspect of interpreters’ attention when they reviewed briefs. Interpreters were interested in reading graphs and figures, though one commented that she would not show a graph to visitors unless it was a simple, colorful bar graph. Interpreters also commented that they like “before and after” photos (e.g., of habitat restoration). Interpreters specifically noted that photos of wildlife drew them in. On one brief, Version 2, the close-up photo of a salamander sparked interest, and interpreters thought they would enjoy reading that brief. During the think-aloud portion of her interview, Annie was purposeful in her approach:

I am wondering if I can get photos. … I’m wondering in my head about the quality of photos. … I’m probably going to use those frog calls in my program. … I’m going to key in on a few species, otherwise it’s too overwhelming … What can I get out of these researchers for my program? Can we get a PowerPoint®?

As Annie’s response exemplifies, interpreters’ sense of professional responsibility also shaped their reactions to briefs. Sections labeled “Importance,” “Next Steps,” or “Help” drew their attention. In one case an interpreter said, “Because of my job, I’d go to this … because there’s a graph on there, I’d be learning more science-y stuff that I could share with the public.” Indeed, throughout the interviews, interpreters were mission-driven. Looking through the briefs, they noted impressive facts, images, or an intriguing question they could imagine using with visitors to form or elicit connections. Interpreters also wanted to be prepared with suggestions for ongoing engagement. “…We’ll scan through resource information looking for catchy things we can use with the public,” Johanna explained.

Briefs are a starting point

Ike and Carolyn expressed high levels of confidence (7 and 9 on a scale of 1 to 10) that they could lead impromptu conversations (<10 minutes in duration), based on what they had read in a single brief, though Ike made it clear that the brief is not a handy reference during a program. He commented:

I couldn’t use this as part of the talk. … I would get lost if I had an audience around me and I was trying to refer to it …unless someone asked me a specific question then I could answer based on my memory or refer back to this. I would feel lost if I were giving a talk based on this. Too much text to use in the moment with visitors.

Echoing this sentiment, Elliott noted, “The researchers aren’t focused on communicating their ideas in an easy-to-access fashion. That’s why I have a job.”

Johanna and Annie expressed less confidence about designing a formal program with a research brief as their main source for content (giving their confidence a 4 or 5 on a scale from 1 to 10). While a brief can be a sufficient basis for informal interactions, interpreters were clear that the brief is only a starting point for developing a longer program.

Experienced interpreters expressed an interest in learning more, posing questions about the impetus for the study: Why was the study a good idea? What were researchers seeing that motivated the funding and effort? Johanna said:

If I haven’t researched it myself, I usually spend some time talking with the researcher/resources person and going through it. I usually take my time. I want to get down and dirty and find out what’s going on first. … I try to make sure it’s personal and I have something to add and that’s where we try to get involved … [with resource managers who may be conducting the study or at least more familiar with the premise and methods]. Okay, you guys are doing a survey or getting data, “Can we help you?” … For me, that’s really where I connect: learn by doing. … That really helps me a lot.

Even the most stellar brief will likely be one source among many in the development of a formal program. Not only did interpreters want to have personal, firsthand, hands-on knowledge to incorporate, but they also wanted details they could draw on if their participants expressed interest in more. Annie explained:

I would want to dig more. … If I’m going to do a program, I’m going to keep researching and talk to the people who are doing the studies. I want to have as much in my back pocket as possible. Do I have the knowledge beyond the narrow window of what I set out to talk about?

Missed opportunities

While interpreters perceived briefs as useful, they also remarked on content they felt was missing. Despite interpreters’ interest in compelling images, the thousands of national park images available for public use (e.g., on Instagram or Flickr accounts) are not all high resolution. Interpreters’ use of these images is further complicated because the restrictions and permissions for use are often unclear. High-resolution graphics they could use, or a clear contact for requesting permission, were a high priority.

Interpreters expressed disappointment when briefs solely described actions the park could take based on the research. Excluding the visitor made little sense from these interpreters’ perspectives. Without a clear sense of the relevance to visitors, interpreters had less confidence that talking with visitors about the study would enable them to achieve their mission of connecting visitors to the park. Carolyn offered an example from her experience connecting geology to a human-interest story and dune protection. She described leading a group of older visitors on a hike up a famous dune, setting the scene by telling them about the summer day a few years earlier when a kid disappeared into a hole and the dad’s frantic attempt to dig him out.

One woman started to cry. If you get that emotional response … they are so much easier to pull in. Then they care why you should stay on this side of the fence. They start to understand why we are doing things a certain way. Handing them a briefing on the geological formation of the sand dune may not have provoked the same response.

As with other comments interpreters made about their purpose and profession, the integrity of the research and an emotional, human framing are intertwined here. Carolyn articulated it this way:

Our job is to give the data and get the emotional response. That’s why we are the interpreters. We are reading all the books and translating it. Because who wants to read 30 pages on why Mt Baldy has moved this far in this amount of time? One or two will, but the [vast majority of] people on vacation hiking in their bathing suits won’t.

When interpreters said briefs were a starting point and that they would dig for more, they aimed to deepen their knowledge of what happens behind the scenes. Annie reflected: “Do I have the knowledge beyond the narrow window of what I set out to talk about?” Simultaneously, interpreters looked for details scientists are trained to leave out. “I try to make sure it’s personal and I have something to add,” Johanna explained (authors’ emphasis).

Discussion

As a group, the briefs delivered factual reportage. Nevertheless, several included elements that demonstrate the flexibility of the genre. Appendix S3 shows examples of such features (e.g., the brief in Fig. 3 leads with a conversational title).

Fig. 3. Some briefs, like this one from the Northern Colorado Plateau Inventory & Monitoring Network, include elements that demonstrate the flexibility of the genre. Note the conversational title, which emphasizes the familiar concept of thirst, signaling that readers without a technical background will find the content accessible. In additional examples of briefs in Appendix S3, authors organize their content to prominently label questions, explain how managers can apply findings, and hint at a story. Information in this brief was summarized from Witwicki et al. (2016).

When presented with briefs, interpreters read through densely worded text. They were drawn to images of wildlife and curious to read about the “importance” of the study. These elements of briefs are well suited to interpreters’ purposes, especially if they can track down high-resolution images to use in their interactions with visitors. While reading, interpreters collected ideas for images and content they could use with visitors; they were simultaneously reading briefs to gain knowledge and to prepare for visitor interactions. They explained that they were looking for information on the value or application of the research so they could establish its relevance for visitors. They spoke of briefs as a first step; they wanted more information, more context, and a personal connection. Their instincts were well aligned with the current trend to incorporate stories in science communication (e.g., Olson 2009, ElShafie 2018, Zimmer 2018). Authors who craft more narrative abstracts are cited more often; because of this, their work may be more influential as well as more visible (Hillier et al. 2016).

Similarly, storytelling is a recognized tool for interpretation (Beck and Cable 2002, Ham 1992, Larsen 2003, Tilden 2007) and has been identified as a program characteristic that influenced visitor satisfaction (Stern and Powell 2013). Indeed, maintaining the audience’s attention is critical and requires planning (Ham 1992). Stories have a unique ability to capture and hold our attention (Olson 2009). Studies show that when we are absorbed in a story, we are less skeptical. For example, people who were highly absorbed in a story were more than twice as likely to help out when the experimenter “accidentally” dropped a handful of pens (Gottschall 2012). Stories are thus an invaluable technique for fostering connections and stewardship. iSWOOP has found two ways to wed stories and research briefs; both expand the Methods section of a brief.

The methods section is where readers find details about scientists’ process and instrumentation. This is exactly where readers could also find stories of innovation, wherein researchers improve existing technology or improvise new solutions to collect data. Rather than glossing over such details, we recommend featuring compelling properties of gadgets and instruments. Interpreters can amuse and impress visitors with how we know what we know based on laser scanning and 3D motion capture, for example (see also Kark 2017 and Merson et al. 2017). If they have a grasp of researchers’ methods, interpreters can shape vignettes for visitors, revealing how researchers go about painstakingly counting pollen grains or finding out when frogs begin mating calls. They can pique interest with a statement like “Dr. Bob’s study cost less than $10. Want to hear why?” or invite listeners to put themselves in the shoes of a paleoecologist: “Imagine the researchers hiking up this same mountain with coring gear and the tools and materials to build a raft.”

Although research briefs do not follow the structure of stories, it is still possible for brief authors to seed stories. One sentence with evocative imagery can cue interpreters to look further for a story. In one brief, we read that river rapids are known for “eating kayaks.” Such vivid language signals a story worth unearthing. Further, if methods sections were retitled “Methods and Missteps,” this section could be a home for a sentence that hints at the unexpected, the near-catastrophic, or the serendipitous events that do not typically get included in peer-reviewed articles but are the stuff that makes science resonate with public audiences (Pinnix 2017). Sentences like “data were collected under Austin’s bridges where the researcher slept (barely) for two months” or “researchers scheduled their field work around bomb testing near their field site” hook interest. By eliminating details about obstacles and near catastrophes, briefs implicitly support the unspoken and too-common belief or narrative that science proceeds in a linear fashion as outlined in the scientific method (Firestein 2012) and that those who are good at science encounter few difficulties in accomplishing their work (Dweck 2008). It is worth noting that at no point did interpreters express that they wanted stories scripted for them. As Carolyn mentioned, the art of encapsulating science for public consumption is the interpreter’s job. The art of storytelling keeps the emphasis on the construction of stories or narratives in a particular context. The “same” story may be told quite differently from one instance to another, even by the same teller, depending on audience, purpose, timing, and location (Moezzi et al. 2017). Interpreters are trained to adapt their techniques and approaches based on their observation of their audience.

Those of us who care about cultivating and promoting (1) an understanding of the process of science, (2) the value of parks as outdoor laboratories, and (3) a scientifically literate and engaged public know that we have to not only do the science, but communicate it in a way that leaves others feeling connected, excited to hear more, and itching to talk about what they have learned. Interpreters’ comments about adding something personal, evoking an emotional response, and offering a way to stay involved are underscored by Pinnix (2017). An interpreter and science communicator, she notes that while scientists want to talk science, the public “wants to hear about them. Who are they? What do they care about? Do they have kids? …”

The potential for interpreters to leverage briefs, thereby introducing STEM learning into conversations with tens of thousands of visitors annually, is exciting. With certain shifts, interpreters who work in parks can more easily use briefs as a starting point and add to them with firsthand experiences in the field or follow-up conversations with scientists. In conversations with the public, they can instill curiosity and include a “call to action,” which lets visitors know how they and other members of the public can follow up and contribute to scientific studies. In working with scientists to convey their questions, methods, and findings to groups of interpreters, we have adopted certain practices and urge authors of research briefs to take into account the following considerations and recommendations (Table 1).

Table 1. Adapting briefs.
Considerations for authors | Recommendations to serve interpretation
Check for accessibility/readability | Adopt active voice, explain terms, shorten sentences
Answer why | Explain (or provide a link to) the back story: why this study, why the researcher took it on, what came before
Use questions | Encourage prediction, active reading, or listening
Name the source for high-resolution images | Provide credit and fair use information
Use photos that extend the experience of looking | Share what those in the field or laboratory see, rather than a landscape or typical view of the focal species
Use photos that show the researcher in action | Add a sense of adventure; spark conversation about how scientists come to know what they know
Add content that matches interpreters’ needs | Hint at a story; offer impressive facts, images, or an intriguing question
Make suggestions for ongoing engagement | Hook and sustain the relationship with the park and the research

Conclusion

The workforce of interpreters is dedicated to enhancing the visitor experience. Since 2015, NPS has logged over 300 million recreation visits annually (National Park Service 2020). Interpreters told us briefs are a useful starting point, and their commitment to their mission will carry them through densely written texts. They will dig for more, especially if they know that their efforts will be rewarded with relevance, an intriguing story, or riveting images. We concur with Pinnix, who wrote:

Ultimately, great work goes nowhere unless the story is told. … No scientific paper can fire the imagination like a story; but we need both … Working hand in hand with scientists, we can bring stories to the wider world that inspire curiosity, action and wonder (Pinnix 2017).

With certain shifts by researchers who produce briefs, interpreters who work in parks can more easily draw on briefs to instill curiosity and increase the number of park visitors who see the value of parks as outdoor laboratories. There are inevitable obstacles to interactions about park-based research (bus tours with little time, or school groups with a tight, standards-based focus). Given these, authors should seek to create briefs that appeal to interpreters and serve the mutually beneficial goal of showcasing park-based science. Brief authors can simplify language, eliminate jargon, and check the readability of their texts. More importantly, we recommend aligning briefs with interpretive techniques: providing novel, intriguing, high-resolution, fair-use visual material; including or alluding to stories; adding surprising, vivid details to explain how we know what we know; and describing the importance of the research within and beyond park boundaries in terms that will matter to visitors.

Documenting details in a scientifically rigorous way and writing to support the interpreters’ mission are not mutually exclusive. Ideally interpreters will draw on informative text and images in briefs, with confidence that they can reveal the significance of a park’s natural resources to science, inspire curiosity, spark new interests and connections, craft stories to shape new memories, and foster stewardship. Briefs have the potential to be an invaluable resource, serving interpreters’ priorities while communicating information.

Acknowledgments

This article is dedicated to the memory of Carrie Jordan, a devoted and gifted interpreter. Special thanks to Robert Brodman, Cynthia Char, Brian Drayton, Bethann Garramon Merkle, Morgan Newport, Gilly Puttick, Tracey Wright, and the interpreters who participated in this process. We are grateful to dozens of National Park Service staff in many roles and divisions who have allowed us to learn from them and with them. iSWOOP owes thanks to generous scientists who have contributed so much to this endeavor and to our thoughtful advisors for their guidance. Interpreters and Scientists Working on Our Parks is supported by National Science Foundation awards DRL-1514776 and 1514766. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

Note

1. https://www.nps.gov/rlc/oceanalaska/research.htm