Moving One Step Ahead

Environment, Sustainability, Renewables, Conservation, Water Quality, Green Building — And How to Talk about it All!


How to Help Reporters Get it Right

Recently, the media jumped on what they thought was a man-bites-dog story about a vegetarian diet—a study appeared to have found a link between a vegetarian diet and cancer. A link between a presumably healthy diet and cancer? The reporters were all over it.

“Being a Vegetarian Could Kill You,” warned the New York Post, and “Why a Vegetarian Diet May Be Bad for You,” posted the American Council on Science and Health.

Unfortunately (fortunately?), that’s not what the researchers found. At all.

According to a good summary of the story by Motherboard, one of the study’s lead authors was understandably not too happy about the misinterpretations: “In the beginning, we were pretty happy to see our research getting so much attention,” Kaixiong Ye, a biology post-doc at Cornell University and co-author of the study in question, told Motherboard. “But over the last few days I have found that most of the news coming out right now [on our study] is wrong. It’s kind of frustrating.”

The story made the rounds for a good 24 hours before the media caught up. A more accurate summary of the research can be found in The Washington Post: “Cornell study finds some people may be genetically programmed to be vegetarians”.

So what went wrong? And how can any scientist hope to prevent this from happening with their research?

It helps to know who is typically involved in the process and what their role is:

  • Scientist – typically one or more lead researchers and authors of the published journal article;
  • Media staff – author of the press release based on the published journal article; and
  • Reporter – the media person at the news outlet who picked up the press release and wrote the article.

Each person has their own role and responsibilities here. But it’s a little bit like playing Chinese whispers sometimes, so we’ll never really know exactly where the message went off-course. Perhaps the scientist didn’t have time to review the press release before it went out. Or maybe the media staff at his or her organization misinterpreted the findings and didn’t ask the researcher to review the release before it went out. Finally, the reporter could have taken that eye-catching headline on the press release and run with it, without bothering to read the study in depth or with a critical eye.

Any and all of these scenarios may have occurred.

In all of these cases, however, the best effort the scientist could make would be to ensure that the media staff at his/her institution had a good, solid understanding of:

  • The problem that the research was investigating;
  • The findings of the study itself;
  • The implications of those findings, i.e., what they really mean; and
  • What the next steps are in this body of work.

First, and perhaps most important, one of the lead authors should be designated the main contact. Ideally, this should be the individual on the research team with the most experience dealing with the press or the best communication skills.

This author should then make themselves available to their organization’s media person: the person who will be writing the press release and any other summaries that will be posted on the website or distributed. The designated author should offer to sit down with the media person for a briefing or a quick call and be available to answer any questions.

Once the release is drafted, the designated author should request at least one opportunity to review the final language that the media person intends to use, anticipating that there may be some hyperbole (which is okay as long as, overall, it is accurate). Headlines are often afforded a bit of leeway, the better to catch the reporter’s eye; though again, they should be (roughly) accurate. The scientist should also make themselves available for any questions from the media person and serve as a contact for journalists who may call.

Certainly these additional steps add a bit of complexity to the process and more work for the scientist. This collaborative, teamwork approach, however, is really critical to getting the messaging correct from the beginning. And the investment of time up front could save time and effort later. Unfortunately, even with the best efforts, sometimes the reporter may still get it wrong. Which brings me to:

Next week: How to Respond When Reporters Get it Wrong

Elizabeth Striano



7 Tips for Effective Science Communication and Outreach

As a scientist, it’s a very satisfying feeling to publish research that solves a complex problem. But your job no longer ends there. To really ensure your work has relevance, you need to be able to communicate about your science to a broader audience. When journalists pay attention, the public can become aware of your work, and then, potentially, policymakers and other decision-makers will take notice, making it more likely that your work can have a real impact.

Scientists who embrace public communication, and who know what they want to communicate, to whom, how and when, can become leaders in their fields. And others will take notice of this leadership. As a result, being a good communicator can help you ensure your research gets noticed and has an impact, can support the next round of scientific development in your field, and can grant your career a boost through professional recognition.

Below are seven steps you can take to ensure outreach and communication for your work:

  1. Commit to communicate. As you embark on any research, commit at the beginning to communicating about the results when your work is complete. You should already be thinking about the potential impact of your work and why the public might care about the results, who might be most interested, and how it can make a difference. Embracing this critical role from the beginning will help prepare you when you have completed your work and are ready to begin communicating.
  2. Commit to your research. Always remember why you are doing your research, why it is so important to you, and why others should care as well. Presumably, you are taking on this complex project to solve a problem that you care about—keeping this goal in mind will help you have confidence in speaking about your work and generating the passion necessary to get others to believe as well.
  3. Expect and embrace criticism. Criticism of your work will not end with the completion of the peer review process, unfortunately. Genuine, thoughtful criticism can actually be valuable, both in helping others understand your work and in helping you see what the next phase of your research should be. By embracing this input and engaging with your critics, you can potentially create a new network of collaborators and colleagues who can act as sounding boards moving forward, leading to new discoveries.
  4. Be prepared. Good communication requires preparation, practice, and passion. Preparation is just that—knowing what you want to say and to whom you will be saying it. Knowing your audience is key in this step so that you can tailor your message appropriately. Practice allows your presentation to flow smoothly and effortlessly. Practice also helps keep your nerves under control. Passion is infectious and is critical in engaging your audience.
  5. Present solutions, not problems. When a journalist contacts you about an issue and wants your expert opinion, they want to learn about the solution from an expert—you. You need to be able to state definitively what you have learned and the implications of that information. Focusing on solutions can make the science accessible and actionable.
  6. Take advantage of opportunities. It almost goes without saying that you should take advantage of opportunities to communicate your science when they present themselves. When a journalist calls you, be ready to chat; if you receive an invitation to speak, accept. One way to always be ready is to have a series of talking points, an op-ed, or even a short, accessible article about your work ready to go at any time.
  7. Build a network. Scientists need to think beyond their immediate colleagues and build relationships outside of their world, including with journalists, policy advocates, local and state officials, and others who may care about the science. In this way, these individuals can also become champions of your work, helping to build a chorus of voices speaking for your research.

These are just a few quick tips to get you started. There are many, many more, and we can dive into some of the details on individual steps in future posts.

Elizabeth Striano


Emergency Crew Formed to Collect Scientific Data and Remove Water Chestnut Invader

Last Friday, Sept. 5, I spent the morning with Dr. Nancy Rybicki, aquatic plant ecologist with the U.S. Geological Survey, and John Odenkirk, fisheries biologist with the Virginia Department of Game and Inland Fisheries, as part of an emergency harvest crew assembled to manually remove a recently discovered infestation of invasive water chestnut at the Pohick Bay park boat rental area in Gunston Cove just off the Potomac River.

This aggressive plant is a prolific reproducer: One acre of water chestnut can produce enough seeds to cover 100 acres the following year. On Aug. 21, the water chestnut (Trapa spp.) covered 1100 square meters (about 0.3 acres); by Sept. 5, the plant had expanded to almost 1500 square meters.
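
To put those numbers in perspective, here is a quick back-of-the-envelope calculation. This is my own illustration, not part of the researchers’ analysis, and it assumes simple exponential growth between the two survey dates:

```python
import math

# Rough growth-rate estimate from the two survey figures cited above.
# Assumes simple exponential growth between Aug. 21 and Sept. 5 (15 days);
# a back-of-the-envelope illustration only, not the USGS team's analysis.
area_aug21 = 1100.0  # square meters measured on Aug. 21
area_sep5 = 1500.0   # square meters measured on Sept. 5
days = 15            # days between the two surveys

daily_rate = (area_sep5 / area_aug21) ** (1.0 / days) - 1  # ~2.1% per day
doubling_time = math.log(2) / math.log(1 + daily_rate)     # ~34 days

print(f"Implied growth rate: {daily_rate:.1%} per day")
print(f"Area doubling time at that rate: {doubling_time:.0f} days")
```

At that pace, an unchecked patch would roughly double in area every month or so, which helps explain the urgency of the harvest.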

This highly invasive plant last invaded the Potomac in the 1920s. The water chestnut grew rapidly from a small patch, and by 1933, thick mats of it covered more than 10,000 acres of the Potomac River from Washington to near Quantico, Va. The U.S. Army Corps of Engineers mechanically harvested the plants through 1945 at a cost of nearly $3 million (in 1992 dollars), and hand harvesting continued until the mid-1960s. We’re hoping to prevent that from happening again.

In addition to its reputation as a prolific spreader, this particular invasive plant can cause several problems for the environment and recreational activities. The seedpods have hard half-inch-long spines, which are sharp enough to penetrate shoe soles and large enough to keep people off beaches. Large mats of the nuisance plant can make boating almost impossible. Additionally, as the water chestnut leaves fan out, they block sunlight from reaching native bay grasses, which provide critical habitat for native fish and birds and help to maintain and improve water quality.

Odenkirk and Rybicki assembled a team of 30 volunteers from DGIF and USGS, George Mason University (GMU), Pohick Bay park staff, Virginia Master Naturalists, and local hunters, fishers, and boaters to remove the plants and collect scientific information about the infestation. Participants waded through the water and removed 751 bushels of plants weighing over 3.6 tons. The discarded plants were placed into piles for onsite composting in the park.

Water chestnut seeds can remain viable in sediments for many years, which means this area will need to be followed closely for several years moving forward. Rybicki is collecting information on the ecology of this population of water chestnut to inform park managers, natural resource agencies, and others who need scientific information to address the threat.

I worked with Cindy Smith, Ph.D., a fellow researcher from George Mason University’s Potomac Environmental Research and Education Center, to collect plant samples. We determined that one seed can produce a single multi-branching plant with up to 25 rosettes, over 30 seed pods, and stems that are up to 3 m long. The plant, which is rooted in the sediment, has green, triangular leaves that are shiny and waxy above and coated with fine hairs below.

Although it is unclear how the water chestnut arrived in Gunston Cove, it was likely transferred via boat, particularly since the mat was discovered near the boat launch. A plant fragment or seed may have been inadvertently transported on a boat that had previously come from waters with an existing population. In addition, the seeds adhere readily to many surfaces, including the legs of volunteers during the emergency harvest. Because of this “stickiness”, the seeds could also have been transported on the feathers of waterfowl. Another possibility being investigated is whether this is a new introduction from one of the countries where it is native, rather than the spread of water chestnut from another state.

Because of limited funding for invasive species monitoring and removal, Rybicki and Odenkirk are hoping to get the word out about this potentially large problem, so please share as widely as possible. I’ll provide regular updates here as well.



EPA’s Power Plant Rules to Cut CO2 18%, Not 30% as Reported

In early June 2014, the U.S. Environmental Protection Agency (U.S. EPA) announced proposed new rules that would cut emissions of carbon dioxide—a primary greenhouse gas (GHG)—from power plants by approximately 30%. The rules are part of the more comprehensive Climate Action Plan the Obama Administration rolled out in mid-2013, which included a series of discrete steps the United States will take to address the growing effects of climate change.

The proposed rules would require each state (except Vermont) to cut carbon dioxide emissions by a set amount predetermined by U.S. EPA based on a series of metrics, including the percentage of existing coal-fired power in the state, current use of renewables, and what is reasonably achievable without too great a burden. Based on these reductions, U.S. EPA has estimated that, taken together, carbon dioxide emissions from all power sources in the United States will decline by approximately 30% below 2005 levels by 2030.

It is important to note, however, that carbon dioxide emissions from power plants were already reduced by approximately 12% from 2005 to 2013, so the rule would deliver an additional “real” reduction of only about 18 percentage points (relative to the 2005 baseline) by 2030.
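
To make the arithmetic behind the headline concrete, here is a quick back-of-the-envelope sketch. It is my own illustration of the percentages cited above, not U.S. EPA’s methodology; the rule’s actual state-by-state accounting is far more complicated:

```python
# Back-of-the-envelope sketch of the headline arithmetic, using the
# approximate figures cited in this post (illustrative only).
baseline_2005 = 100.0        # index 2005 power-sector CO2 emissions to 100
cut_achieved_by_2013 = 0.12  # ~12% reduction already realized, 2005-2013
target_cut_by_2030 = 0.30    # rule's goal: ~30% below 2005 levels by 2030

emissions_2013 = baseline_2005 * (1 - cut_achieved_by_2013)  # 88
emissions_2030 = baseline_2005 * (1 - target_cut_by_2030)    # 70

# The additional cut still required, expressed two ways:
additional_points = target_cut_by_2030 - cut_achieved_by_2013  # 0.18
vs_today = (emissions_2013 - emissions_2030) / emissions_2013  # ~0.205

print(f"Additional cut: {additional_points:.0%} of the 2005 baseline")
print(f"Equivalent to roughly {vs_today:.0%} below 2013 levels")
```

Note that the 18% figure is measured in percentage points of the 2005 baseline; expressed relative to today’s (2013) levels, the remaining cut works out to roughly 20%.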

Since the industrial revolution, the burning of fossil fuels and clearing of forests have led to increasing concentrations of GHGs in the Earth’s atmosphere. Documentation dating back to the late 1800s—when records started being kept—shows a clear relationship between the amount of carbon dioxide in the atmosphere and global temperature increases.

In addition, ice core samples taken from the Arctic Circle can confirm these changes going back thousands of years. The ice cores show that pre-industrial levels of atmospheric carbon dioxide remained relatively unchanged; it is only after the industrial revolution that a rapid increase in these concentrations occurred.

The presence of GHGs in the atmosphere is a concern because these gases absorb and trap heat radiated from the Earth’s surface. Although a certain amount of carbon dioxide and other GHGs is necessary to maintain a relatively warm Earth that can support life, excess amounts are causing temperatures to increase beyond those to which most living organisms have adapted.

Industry and other special interest groups have been fighting U.S. EPA’s ability to regulate carbon dioxide emissions for decades. In fact, the United States has been reluctant to participate in any type of GHG emission cutting, and signed but did not ratify the Kyoto Protocol, an international agreement among industrialized nations to cut GHG emissions.

Fortunately, the Supreme Court ruled in Massachusetts v. EPA (2007) that EPA did have the authority to regulate carbon dioxide as an air pollutant under the Clean Air Act. Obama’s Climate Action Plan, and the carbon dioxide rule in particular, is perhaps the most significant step the federal government has ever taken to curb U.S. GHG emissions.

But although this rule to regulate carbon dioxide emissions is unprecedented in the United States, it may be too little, too late. The rule is a good first step, but it may do little to slow the changes already occurring in the Earth’s climate.

Power generation is a significant source of carbon dioxide emissions in the United States, accounting for approximately one-third to one-half of all GHGs. In particular, aging coal-fired power plants are the largest current sources of carbon dioxide emissions. So by targeting these primary sources, U.S. EPA is focusing on the area where reductions are most likely to make a significant difference to the climate.

In addition, the rule is designed to give states a significant amount of autonomy and flexibility in how they achieve their reduction goals. For instance, states may choose to upgrade existing plants; migrate to more natural-gas-powered plants (though many of these, too, are aging and hence produce significant GHG emissions); rely more on renewables; incentivize energy efficiency; or some combination of these.

While not calling out other emissions specifically, the rule also will lead to reductions in other air pollutants typically produced by power plants, such as sulfur dioxide and nitrogen oxides (which in turn contribute to ozone formation).

As expected, much of the protest came from states that rely heavily on coal-fired power, industry groups, and others concerned about implementation costs. However, U.S. EPA addressed these concerns from the very beginning by illustrating how implementation costs will be far outstripped by the benefits reaped through improvements in health and the environment.

Unfortunately, however, there are many concerns regarding the real, tangible value of this rule. First, as mentioned earlier, the additional reduction in GHG emissions will likely be only around 18 percentage points, because approximately 12% of the promised 30% has already been achieved since 2005 as states took measures to reduce emissions in anticipation of an impending rule. In addition, this 18% is only an approximation: given the way the reductions are divided among states, the actual reduction could be significantly less. There is a small chance it could be more, though this is unlikely for a variety of reasons.

Assuming, however, that the rule passes as is, with the approximately 18% “real” reduction, the reality is that it will have little effect on the climate change already in progress. For most of the United States, 2012 was the hottest year on record to date; several years in the last decade have been the hottest ever recorded worldwide. In fact, average temperatures have increased by nearly 1 degree C since the industrial revolution began, and carbon dioxide concentrations are nearly 400 ppm, compared with approximately 300 ppm for the entire period up until the industrial revolution.

Another problem is that the rule does not address transportation emissions at all, even though transportation produces about as much GHG as energy production. Perhaps the biggest problem with attempting to reduce carbon dioxide, however, is the excess level of GHGs already present in the atmosphere. Carbon dioxide and other GHGs can remain in the atmosphere for years, if not decades (some even for millennia). In addition, rates of emission greatly exceed rates of removal through natural processes such as absorption in sinks and chemical breakdown. As a result, some percentage of existing GHGs will remain in the atmosphere for thousands of years even if current emissions were cut to zero immediately, and these gases will continue to negatively affect the Earth’s climate for as long as they are present.

As a result, even with this pending reduction, atmospheric concentrations of GHGs will continue to increase. On the other hand, without any cuts, concentrations of GHGs would increase far more rapidly. The only scenario that might work is an immediate 50 to 100% cut from present-day emission levels, and even then it would take many years to several decades for temperature increases to stabilize. The moderate reductions indicated in the proposed rule will have little tangible effect; even a complete elimination of carbon dioxide emissions would lead to only a modest decrease in atmospheric concentrations in the coming decades.

Finally, the United States is only one of many countries in the world. Although it is a significant producer, contributing about 20% of total GHGs and carbon dioxide, many other countries would also need to reduce emissions to have a measurable impact. Most problematic is that many developing nations—in particular, China and India, with over 30% of the world’s population combined—are rapidly catching up in their GHG emissions.

Previously, many countries agreed on a somewhat arbitrary benchmark of keeping global temperature increase to less than 2 degrees C; unfortunately, based on current projections, an increase of almost 4 degrees C may well occur by the end of this century. Scientists do not really know what temperature increase the world can absorb without significant environmental, economic, and health impacts. California, for example, a state that grows nearly 50% of U.S. fruits, nuts, and vegetables, is already in a period of historic drought.

The proposed power plant rules are, however, a start.

Additional Resources:
More Details on the Rules from Vox
Federal Register Filing of New Rules

Elizabeth Striano


The End of the Bluefin Tuna?

The plight of the Bluefin tuna is finally back in the news. Unfortunately, the full, sad story is still not being told. In July 2014, the National Oceanic and Atmospheric Administration (NOAA) said that it is considering a ban on commercial and recreational fishing of the Pacific Bluefin tuna. Scientists have estimated that only about 4% of the fish’s historic population remains, or approximately 40,000 adults. But a ban is only being “considered” at this point. And there is no mention of the two other closely related species of Bluefin tuna, the Atlantic and southern Bluefin, which are already considered “endangered.” In fact, scientists say the Atlantic Bluefin tuna is on the brink of collapse, while the southern Bluefin tuna collapsed in the 1960s; yet both species are still being commercially fished.

The Atlantic Bluefin tuna (Thunnus thynnus) symbolizes the many problems facing the world’s remaining fisheries, including severe overfishing, unchecked and open access in international waters, high market value, and deficient governance at both the international and national levels.

About the Tuna

The Atlantic Bluefin tuna is one of the world’s largest bony fish, weighing an average of 550 pounds and up to 1,500 pounds (700 kg). At maturity, it averages about 6.5 feet long. A highly migratory species, the Atlantic Bluefin inhabits a huge range of the ocean on both sides of the Atlantic and in the Mediterranean. It can live up to 40 years and reaches maturity at 5 to 12 years of age. One unique aspect of this tuna is that it is warm-blooded; as a result, the Atlantic Bluefin can regularly make transoceanic migrations, using its elevated body temperature to hunt in frigid waters.

Unfortunately, this species is highly prized for its economic value—individual fish can fetch hundreds of thousands of dollars in the famed Tokyo Fish Market. In fact, Japan consumes approximately 80% of the worldwide Atlantic Bluefin tuna catch. The value of the industry is estimated at $7.2 billion per year.

Tuna Quotas

The International Commission for the Conservation of Atlantic Tunas (ICCAT) is the intergovernmental organization that oversees management of the Atlantic Bluefin and about 30 other species of tuna and tuna-like fishes in the Atlantic Ocean. Any member of the United Nations can join ICCAT; currently there are 49 “contracting parties,” including the United States. The Commission’s official charter is “maintaining the populations of these fishes at levels which will permit the maximum sustainable catch for food and other purposes.” (Maximum sustainable catch is defined as the “number (weight) of fish in a stock that can be taken by fishing without reducing the stock biomass from year to year,” a definition that is itself widely criticized, but that is perhaps a topic for another post.)

Unfortunately, ICCAT has been unable to achieve this goal; populations of the Atlantic Bluefin tuna have plummeted by more than 95% from their historic levels.

The Commission meets biannually to set fishing quotas for the Bluefin, purportedly based on information provided by its own scientific committee. The commission’s managers, however—and not the scientists—are the ones who set the actual fishing quotas. It’s worth noting that these managers often have strong industry ties.

ICCAT consistently has set quotas—known as total allowable catch (TAC) limits—for member countries that are much greater than those recommended by its own scientists. In 2008, for example, ICCAT scientists recommended that the TAC for the Eastern/Mediterranean Bluefin tuna not exceed 15,000 tons; the commission set the figure at 22,000 tons, nearly 50% greater than recommended. (In 1982, ICCAT somewhat arbitrarily divided the population of Atlantic Bluefin tuna into a Western population off the coast of North America and an Eastern/Mediterranean population.) The commission also ignored its scientists’ recommendation that the fishery be closed during the spawning months of May and June, despite their warning that it was in danger of collapse.

Rampant Cheating

In addition, ICCAT scientists have estimated that illegal fishing in the Mediterranean adds approximately 20-30% to reported catches. Even with quotas that scientists and conservation groups say are not protective enough, ICCAT members often exceed their quotas and otherwise cheat the system. Illegal and unreported fishing are estimated to account for $23.5 billion annually worldwide.

According to a report from the International Consortium of Investigative Journalists, France, Italy, Spain, Turkey, and Japan have violated their quotas by misreporting catch size, hiring banned spotter planes, catching undersized fish, and trading quotas. In fact, until 2008, there was no enforcement; member countries did not even report their catches. As a result, there was significant cheating.

In addition, unregulated fish “ranches”—large pens in the middle of the ocean used to hold and fatten caught tuna—are used in the Mediterranean. Typically financed by Japan and placed in laxly regulated countries like Tunisia, Cyprus, and Turkey, these ranches are not policed and are used to “funnel” illegally caught tuna, which are then sold into the Tokyo Market. In 2010, ICCAT started putting “observers” on tuna fishing vessels, but this effort was found to be of limited value.

Conservation groups have lobbied ICCAT members to adopt scientists’ advice. These groups, including the World Wildlife Fund, took their fight to the Convention on International Trade in Endangered Species (CITES) early in 2010, calling on CITES to list Atlantic Bluefin tuna under the treaty’s Appendix 1, which would have banned international trade of the fish. The United States and the European Union supported the listing, but after heavy lobbying by opponents, including Japan, Canada, and Tunisia, member countries rejected it, saying that ICCAT would be the more appropriate body to manage and protect Bluefin tuna.

Overfishing has become a critical problem in the marine environment, and the future of all global fisheries remains uncertain. The imminent collapse of the Atlantic Bluefin tuna symbolizes the failure of management of marine species that is occurring in virtually all global fisheries. Scientific research is not being incorporated into management decisions. Ironically, ICCAT was developed in response to the collapse of the Southern population of the Atlantic Bluefin tuna in the 1960s. The species has not recovered.

At this point, populations of the Atlantic Bluefin tuna are unlikely to recover, because action may have been delayed for too long. A temporary moratorium on all fishing of Atlantic Bluefin tuna may be the only way to save the species. Unfortunately, this is unlikely to occur.

For more information, here is a list of Resources used in this article.

Elizabeth Striano


Using Graphics, Part Two

I recently posted about the importance of using graphics to illustrate complex concepts. Well, I had to post a quick update from Mental Floss, which compiled examples of humorous graphics from its own content and from other clever bloggers who rely heavily on infographics, comics, and other visuals. Everything from Venn diagrams to flowcharts, pie charts, and other types of charts and graphs was included.

As a communications consultant and PhD student in the sciences, I particularly related to the following:

[A] Gantt chart is a type of timeline, often used for projects, in which different elements each have their own timeline, but they are all coordinated. … The Universal Gantt Chart for Project Managers shows how projects actually go, instead of how they are planned.

Some of these can also serve as a cautionary tale about when not to use charts, i.e., when they only serve to confuse rather than clarify. When deployed for humorous effect, however, that confusion is exactly what makes them work.

Elizabeth Striano



No Escaping the Need for a Digital Strategy

A compelling internal report from the New York Times has surfaced that analyzes the paper’s digital strategy and has implications for anyone involved in any type of communications. The report, a summary of which was leaked last week, outlines the many missteps the Times has made in implementing its digital strategy and contrasts its approach with that of more successful upstarts like The Huffington Post and Buzzfeed. As outlined in this excellent summary from Nieman Journalism Lab, the report shows that the Times has remained firmly rooted in its history as a print publication and has not successfully made the transition to digital, which has hurt readership and distribution.

If an organization as large and established as the New York Times can miss these seemingly blatant opportunities, then what are the implications for smaller organizations or those with a smaller, more focused audience? If there is only one take-home message from the report, it is this: digital outreach has become the single most important route to getting news and information to your target audience.

The report outlines this and many other important findings, all of which have significant implications for science journalism, which often struggles for accurate coverage from the very beginning. Because of this, the ability to harness social media has become even more critical for those in the sciences. Although I recommend reading the full report, at the very least I recommend that everyone read Nieman’s summary.

The report makes the case that authors of any type of content have to become their own social media advocates:

In a section addressing promotion of New York Times content — essentially, social media distribution — the report’s authors survey the techniques of “competitors” and compare them to the Times’ strategy. For example, at ProPublica, “that bastion of old-school journalism values,” reporters have to submit 5 possible tweets when they file stories, and editors have a meeting regarding social strategy for every story package. Reuters employs two people solely to search for underperforming stories to repackage and republish. (p. 43)

By contrast, when the Times published Invisible Child, the story of Dasani, not only was marketing not alerted in time to come up with a promotional strategy, “the reporter didn’t tweet about it for two days.” Overall, less than 10 percent of Times traffic comes from social, compared to 60 percent at BuzzFeed. (p. 43)

At the extreme end of the scale, scientists can be reluctant even to collaborate with the media at a basic level, much less feel comfortable using social media themselves. Yet the ability to exploit digital channels for readership has become inescapable. Any organization can learn something from this report, perhaps even using it as a roadmap for its own digital success, because the New York Times report may well be “one of the key documents of this media age.”

Elizabeth Striano
Science writer and editor