Moving One Step Ahead

Environment, Sustainability, Renewables, Conservation, Water Quality, Green Building — And How to Talk about it All!



Emergency Crew Formed to Collect Scientific Data and Remove Water Chestnut Invader

Last Friday, Sept. 5, I spent the morning with Dr. Nancy Rybicki, an aquatic plant ecologist with the U.S. Geological Survey, and John Odenkirk, a fisheries biologist with the Virginia Department of Game and Inland Fisheries. We were part of an emergency harvest crew assembled to manually remove a recently discovered infestation of invasive water chestnut at the Pohick Bay park boat rental area in Gunston Cove, just off the Potomac River.

This aggressive plant is a prolific reproducer: One acre of water chestnut can produce enough seeds to cover 100 acres the following year. On Aug. 21, the water chestnut (Trapa spp.) covered 1100 square meters (about 0.3 acres); by Sept. 5, the plant had expanded to almost 1500 square meters.
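
For readers who want to see the arithmetic, here is a rough back-of-the-envelope sketch (written in Python) of how fast the patch was expanding between the two surveys. It uses only the dates and areas reported above and assumes a constant daily growth rate, which is a simplification of how a real infestation spreads.

import math

# Surveyed areas of the Gunston Cove patch (figures reported above)
area_aug21 = 1100.0   # square meters on Aug. 21
area_sep5 = 1500.0    # square meters on Sept. 5
days = 15             # days between the two surveys

# Implied constant daily growth rate and doubling time
daily_rate = (area_sep5 / area_aug21) ** (1 / days) - 1
doubling_time = math.log(2) / math.log(1 + daily_rate)

print(f"Implied daily growth: {daily_rate:.1%}")           # roughly 2% per day
print(f"Implied doubling time: {doubling_time:.0f} days")  # roughly a month

At that pace, a patch left alone would roughly double in size every month, which helps explain why this was treated as an emergency.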

The last time this highly invasive plant was seen in the Potomac was in the 1920s. The water chestnut grew rapidly from a small patch, and by 1933, thick mats of it covered more than 10,000 acres of the Potomac River from Washington to near Quantico, Va. The U.S. Army Corps of Engineers mechanically harvested the plants through 1945 at a cost of nearly $3 million (in 1992 dollars). They continued hand harvesting until the mid-1960s. We’re hoping to prevent that from happening again.

In addition to its reputation as a prolific spreader, this particular invasive plant can cause several problems for the environment and for recreation. The seedpods have hard, half-inch-long spines that are sharp enough to penetrate shoe soles and large enough to keep people off beaches. Large mats of the nuisance plant can make boating almost impossible. And as the water chestnut leaves fan out, they block sunlight from reaching native bay grasses, which provide critical habitat for native fish and birds and help maintain and improve water quality.

Odenkirk and Rybicki assembled a team of 30 volunteers from DGIF and USGS, George Mason University (GMU), Pohick Bay park staff, Virginia Master Naturalists, and local hunters, fishers, and boaters to remove the plants and collect scientific data on the infestation. Participants waded through the water and removed 751 bushels of plants weighing over 3.6 tons. The discarded plants were placed into piles for onsite composting in the park.

Water chestnut seeds can remain viable in sediments for several years, so the site will need to be monitored closely for years to come. Rybicki is collecting information on the ecology of this population of water chestnut to inform park managers, natural resource agencies, and others who need scientific information in order to address the threat.

I worked with Cindy Smith, Ph.D., a fellow researcher from George Mason University’s Potomac Environmental Research and Education Center, to collect plant samples. We determined that one seed can produce a single multi-branching plant with up to 25 rosettes, over 30 seed pods, and stems that are up to 3 m long. The plant, which is rooted in the sediment, has green, triangular leaves that are shiny and waxy above and coated with fine hairs below. Here is a short video we made:

 

Although it is unclear how the water chestnut arrived in Gunston Cove, it was likely transferred via boat, particularly since the mat was discovered near the boat launch. A plant fragment or seed may have been inadvertently transported on a boat that had previously been in waters with an existing population. In addition, the seeds adhere readily to many surfaces, including the legs of volunteers during the emergency harvest. Because of this “stickiness,” the seeds could also have been transported on the feathers of waterfowl. Another possibility being investigated is whether this is a new introduction from one of the countries where the plant is native, rather than a spread from another state.

Because of limited funding for invasive species monitoring and removal, Rybicki and Odenkirk are hoping to get the word out about this potentially large problem, so please share as widely as possible. I’ll provide regular updates here as well. 

Sources:

http://www.fcwa.org/water/Invasive%20Species%20final%20brochure%20small%20file.pdf

http://www.dnr.state.md.us/bay/sav/water_chestnut.asp

http://www.anstaskforce.gov/Species%20plans/Water%20Chestnut%20Mgt%20Plan.pdf 

http://www.fws.gov/r5crc/water_chestnut.htm

http://water.usgs.gov/nrp/proj.bib/sav/timeline.pdf



EPA’s Power Plant Rules to Cut CO2 18%, Not 30% as Reported

In early June 2014, the U.S. Environmental Protection Agency (U.S. EPA) announced proposed new rules that would cut emissions of carbon dioxide—a primary greenhouse gas (GHG)—from power plants by approximately 30%. The rules are part of the more comprehensive Climate Action Plan the Obama Administration rolled out in mid-2013, which included a series of discrete steps the United States will take to address the growing effects of climate change.

The proposed rules would require each state (except Vermont) to cut carbon dioxide emissions by a set amount predetermined by U.S. EPA based on a series of metrics, including the percentage of existing coal-fired power in the state, current use of renewables, and what is reasonably achievable without undue burden. U.S. EPA estimates that, taken together, these reductions would cut carbon dioxide emissions from all power sources in the United States by approximately 30% below 2005 levels by 2030.

It is important to note, however, that carbon dioxide emissions from power plants already fell by approximately 12% between 2005 and 2013, so the rule would deliver an additional “real” reduction of roughly 18% (measured in points of the 2005 baseline) between now and 2030.
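
To make the arithmetic explicit, here is a minimal sketch in Python that simply treats the 2005 emissions level as an index of 100 and applies the percentages quoted above.

# Rough arithmetic behind the "18%, not 30%" headline
baseline_2005 = 100.0    # index the 2005 emissions level at 100
cut_by_2013 = 12.0       # percent reduction already achieved, 2005-2013
target_cut_2030 = 30.0   # percent reduction below 2005 required by 2030

# Additional reduction still required, in points of the 2005 baseline
remaining_points = target_cut_2030 - cut_by_2013
print(remaining_points)  # 18.0

# Measured against today's (2013) level instead of the 2005 baseline,
# the remaining cut works out slightly higher in relative terms
level_2013 = baseline_2005 - cut_by_2013      # 88
level_2030 = baseline_2005 - target_cut_2030  # 70
print(f"{(level_2013 - level_2030) / level_2013:.0%}")  # about 20%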

Since the industrial revolution, the burning of fossil fuels and the clearing of forests have led to increasing concentrations of GHGs in the Earth’s atmosphere. Records dating back to the late 1800s, when systematic measurements began, show a clear relationship between the amount of carbon dioxide in the atmosphere and rising global temperatures.

In addition, ice cores drilled from polar ice sheets confirm these trends going back thousands of years. The cores show that pre-industrial levels of atmospheric carbon dioxide remained relatively stable; only after the industrial revolution did concentrations begin to rise rapidly.

The presence of GHGs in the atmosphere is a concern because these gases absorb and trap heat radiating from the Earth’s surface. Although a certain amount of carbon dioxide and other GHGs is necessary to keep the Earth warm enough to support life, excess amounts are pushing temperatures beyond the range to which most living organisms have adapted.

Industry and other special interest groups have fought U.S. EPA’s ability to regulate carbon dioxide emissions for decades. The United States as a whole has been reluctant to commit to any type of GHG emission cuts; it signed but never ratified the Kyoto Protocol, an international agreement among industrialized nations to cut GHG emissions.

Fortunately, the Supreme Court ultimately ruled, in Massachusetts v. EPA (2007), that EPA does have the authority to regulate carbon dioxide as an air pollutant under the Clean Air Act. Obama’s Climate Action Plan, and the carbon dioxide rule in particular, is perhaps the most significant step the federal government has ever taken to curb U.S. GHG emissions.

But although this rule to regulate carbon dioxide emissions is unprecedented in the United States, it may be too little, too late. The rule is a good first step, but it may do little to slow the changes already under way in the Earth’s climate.

Power generation is a significant source of carbon dioxide emissions in the United States, accounting for roughly one-third of all U.S. GHG emissions. Aging, coal-fired power plants are the largest individual sources of carbon dioxide, so by targeting them, U.S. EPA is focusing on the area most likely to yield a significant reduction in the drivers of climate change.

In addition, the rule is designed to give states a significant amount of autonomy and flexibility in how they achieve their reduction goals. For instance, states may choose to upgrade existing plants; shift to more natural gas-fired plants (though many of these, too, are aging and produce significant GHG emissions); rely more on renewables; incentivize energy efficiency; or pursue some combination of these.

Although it does not target them specifically, the rule also will lead to reductions in other power plant pollutants, such as sulfur dioxide and nitrogen oxides, and in the ground-level ozone those emissions help form.

As expected, much of the protest has come from states heavily reliant on coal-fired power, from industry groups, and from others concerned about implementation costs. U.S. EPA addressed these concerns from the very beginning, however, by showing that implementation costs will be far outweighed by the health and environmental benefits.

Unfortunately, there are real questions about the tangible value of this rule. First, as mentioned earlier, the net new reduction in GHG emissions will likely be only around 18 percentage points, because roughly 12% has already been achieved since 2005 as states cut emissions in anticipation of an impending rule. Even that 18% is only an approximation: given how the reductions are divided among states, the actual reduction could be significantly less, and while there is a small chance it could be more, that is unlikely for a variety of reasons.

Assuming, however, that the rule is finalized as proposed, with the roughly 18% “real” reduction, the reality is that it will have little effect on the climate change already in progress. For most of the United States, 2012 was the hottest year on record to date, and several years in the last decade have been the hottest ever recorded worldwide. Average temperatures have increased nearly 1 degree C since the industrial revolution began, and carbon dioxide concentrations are now nearly 400 ppm, compared with roughly 280 ppm before the industrial revolution.

Another problem is that the rule does not address transportation emissions at all, even though transportation produces roughly as much GHG as power generation. Perhaps the biggest problem with attempting to reduce carbon dioxide, however, is the excess GHGs already in the atmosphere. Carbon dioxide and other GHGs can persist in the atmosphere for years, decades, or in some cases millennia, and emission rates greatly exceed the rates at which natural processes, such as uptake by carbon sinks and chemical breakdown, remove them. As a result, some portion of existing GHGs would remain in the atmosphere for thousands of years even if current emissions were cut to zero immediately, and these gases will continue to affect the Earth’s climate as long as they are present.

As a result, even with this pending reduction, atmospheric concentrations of GHGs will continue to increase; without any cuts, they will simply increase far more rapidly. The only scenario that might work is an immediate 50 to 100% cut from present-day emissions, and even then, it would take years to decades for temperature increases to stabilize. The moderate reductions in the proposed rule will have little tangible effect, and even the complete elimination of carbon dioxide emissions would lead to only a modest decrease in atmospheric concentrations over the coming decades.

Finally, the United States is only one of many countries in the world. Although it is a significant producer, contributing about 20% of total GHG and carbon dioxide emissions, many other countries would also need to reduce emissions to have a measurable impact. Most problematic, several developing nations, in particular China and India, which together hold over 30% of the world’s population, are rapidly catching up in their GHG emissions.

Previously, many countries agreed on a somewhat arbitrary benchmark of keeping the global temperature increase to less than 2 degrees C; based on current projections, however, an increase of almost 4 degrees C may occur by the end of this century. Scientists do not really know how much warming the world can absorb without significant environmental, economic, and health impacts. California, for example, which grows nearly 50% of U.S. fruits, nuts, and vegetables, is already in a period of historic drought.

The proposed power plant rules are, however, a start.

Additional Resources:
More Details on the Rules from Vox
Federal Register Filing of New Rules

Elizabeth Striano
http://www.agreenfootprint.com



The End of the Bluefin Tuna?

The plight of the Bluefin tuna is finally back in the news. Unfortunately, the full, sad story is still not being told. In July 2014, the National Oceanic and Atmospheric Administration (NOAA) said it is considering a ban on commercial and recreational fishing of the Pacific Bluefin tuna; scientists estimate that only about 4% of the fish’s historic population remains, or approximately 40,000 adults. But a ban is only being “considered” at this point. And there is no mention of the two other closely related species, the Atlantic and southern Bluefin tuna, which are already considered endangered. In fact, scientists say the Atlantic Bluefin tuna is on the brink of collapse, while the southern Bluefin tuna collapsed in the 1960s, yet both species are still being commercially fished.

The Atlantic Bluefin tuna (Thunnus thynnus) symbolizes the many problems facing the world’s remaining fisheries, including severe overfishing, unchecked and open access in international waters, high market value, and deficient governance at both the international and national levels.

About the Tuna

The Atlantic Bluefin tuna is one of the largest bony fish in the ocean, weighing an average of 550 pounds and reaching up to 1,500 pounds (about 700 kg); at maturity, it averages about 6.5 feet long. A highly migratory species, the Atlantic Bluefin inhabits a huge range on both sides of the Atlantic Ocean and in the Mediterranean Sea. It can live up to 40 years and reaches spawning maturity at anywhere from 5 to 12 years of age. One unique aspect of this tuna is that it is warm-blooded; as a result, the Atlantic Bluefin can regularly make transoceanic migrations, using its elevated body temperature to hunt in frigid waters.

Unfortunately, this species is highly prized for its economic value—individual fish can fetch hundreds of thousands of dollars in the famed Tokyo Fish Market. In fact, Japan consumes approximately 80% of the worldwide Atlantic Bluefin tuna catch. The value of the industry is estimated at $7.2 billion per year.

Tuna Quotas

The International Commission for the Conservation of Atlantic Tunas (ICCAT) is the intergovernmental organization that oversees management of the Atlantic Bluefin and about 30 other tuna and tuna-like species in the Atlantic Ocean. Any member of the United Nations can join ICCAT, and there are currently 49 “contracting parties,” including the United States. The Commission’s official charter is “maintaining the populations of these fishes at levels which will permit the maximum sustainable catch for food and other purposes.” (Maximum sustainable catch is defined as the “number (weight) of fish in a stock that can be taken by fishing without reducing the stock biomass from year to year,” a concept that is itself widely criticized, but that is a topic for another post.)

Unfortunately, ICCAT has been unable to achieve this goal; populations of the Atlantic Bluefin tuna have plummeted to less than 5% of their historic levels.

The Commission meets biannually to set fishing quotas for the Bluefin, purportedly based on information provided by its own scientific committee. The commission’s managers, however—and not the scientists—are the ones who set the actual fishing quotas. It’s worth noting that these managers often have strong industry ties.

ICCAT consistently has set quotas, known as total allowable catch (TAC) limits, for member countries that are much greater than its own scientists recommend. In 2008, for example, ICCAT scientists recommended that the TAC for the Eastern/Mediterranean Bluefin tuna not exceed 15,000 tons; the commission set the figure at 22,000 tons, approximately 50% more than recommended. (In 1982, ICCAT somewhat arbitrarily divided the Atlantic Bluefin tuna into a Western population off the coast of North America and an Eastern/Mediterranean population.) The commission also ignored its scientists’ recommendation that the fishery be closed during the May and June spawning months, despite warnings that the stock was in danger of collapse.
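
As a quick sanity check on that figure, here is a short calculation using only the tonnages quoted above:

# How far the 2008 quota exceeded the scientific advice
recommended_tac = 15_000   # tons, upper bound advised by ICCAT scientists
adopted_tac = 22_000       # tons, quota the commission actually set

overage = (adopted_tac - recommended_tac) / recommended_tac
print(f"{overage:.0%}")    # about 47%, i.e., roughly half again the scientific advice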

Rampant Cheating

ICCAT scientists have also estimated that illegal fishing in the Mediterranean adds approximately 20-30% to reported catches. Even with quotas that scientists and conservation groups say are not protective enough, ICCAT members often exceed their quotas and otherwise cheat the system. Worldwide, illegal and unreported fishing is estimated to account for $23.5 billion annually.

According to a report from the International Consortium of Investigative Journalists, France, Italy, Spain, Turkey, and Japan have violated their quotas by misreporting catch size, hiring banned spotter planes, catching undersized fish, and trading quotas. In fact, until 2008, there was no enforcement; member countries did not even report their catches. As a result, there was significant cheating.

In addition, unregulated fish “ranches”—large pens in the middle of the ocean used to hold and fatten caught tuna—are used in the Mediterranean. Typically financed by Japan and placed in laxly regulated countries like Tunisia, Cyprus, and Turkey, these ranches are not policed and are used to “funnel” illegally caught tuna, which are then sold into the Tokyo Market. In 2010, ICCAT started putting “observers” on tuna fishing vessels, but this effort was found to be of limited value.

Intervention

Conservation groups have lobbied ICCAT members to adopt their scientists’ advice. These groups, including the World Wildlife Fund, took the fight to the Convention on International Trade in Endangered Species (CITES) early in 2010, calling on CITES to list the Atlantic Bluefin tuna under the treaty’s Appendix 1, which would have banned international trade in the fish. The United States and the European Union supported the listing, but other member countries disagreed, arguing that ICCAT was the more appropriate body to manage and protect the Bluefin, and the proposal was defeated after heavy lobbying by opponents including Japan, Canada, and Tunisia.

Overfishing has become a critical problem in the marine environment, and the future of all global fisheries remains uncertain. The imminent collapse of the Atlantic Bluefin tuna symbolizes the failure of management occurring in virtually all global fisheries: scientific research is not being incorporated into management decisions. Ironically, ICCAT was created in response to the collapse of the southern population of the Atlantic Bluefin tuna in the 1960s; that population has never recovered.

At this point, populations of the Atlantic Bluefin tuna may be unable to recover because action has been delayed for too long. A temporary moratorium on all fishing of the Atlantic Bluefin may be the only way to save the species; unfortunately, that is unlikely to occur.

For more information, here is a list of Resources used in this article.

Elizabeth Striano
www.agreenfootprint.com



Using Graphics Part Two

I recently posted about the importance of using graphics to illustrate complex concepts. Well, I had to post a quick update from Mental Floss, which compiled examples of humorous graphics drawn both from its own content and from other clever bloggers who rely heavily on infographics, comics, and other visuals. Everything from Venn diagrams to flowcharts, pie charts, and other types of charts and graphs is included.

As a communications consultant and PhD student in the sciences, I particularly related to the following:

Gantt chart is a type of timeline, often used for projects, in which different elements each have their own timeline, but they are all coordinated. … The Universal Gantt Chart for Project Managers shows how projects actually go, instead of how they are planned.

Some of these also serve as a cautionary tale about when not to use charts, i.e., when they only confuse rather than clarify. When used for humorous effect, however, they tend to work even better.

Elizabeth Striano
www.agreenfootprint.com

 



No Escaping the Need for a Digital Strategy

A compelling internal report from the New York Times has surfaced that analyzes the paper’s digital strategy and has implications for anyone involved in any type of communications. The report, a summary of which was leaked last week, outlines the many missteps the Times has made in implementing its digital strategy and contrasts its approach with that of more successful upstarts like The Huffington Post and BuzzFeed. As outlined in this excellent summary from Nieman Journalism Lab, the report shows that the Times has remained firmly rooted in its history as a print publication and has not successfully made the transition to digital, which has hurt readership and distribution.

If an organization as large and established as the New York Times can miss these seemingly blatant opportunities, what are the implications for smaller organizations or those with a narrower, more focused audience? If there is only one take-home message from the report, it is this: Digital outreach has become the single most important route for getting news and information to your target audience.

The report outlines this and many other important findings, all of which have significant implications for science journalism, which often struggles for accurate coverage to begin with. Because of this, the ability to harness social media has become even more critical for those in the sciences. Although I recommend reading the full report, at the very least I recommend that everyone read Nieman’s summary.

The report makes the case that authors of any type of content have to become their own social media advocates:

In a section addressing promotion of New York Times content — essentially, social media distribution — the report’s authors survey the techniques of “competitors” and compare them to the Times’ strategy. For example, at ProPublica, “that bastion of old-school journalism values,” reporters have to submit 5 possible tweets when they file stories, and editors have a meeting regarding social strategy for every story package. Reuters employs two people solely to search for underperforming stories to repackage and republish. (p. 43)

Contrastingly, when the Times published Invisible Child, the story of Dasani, not only was marketing not alerted in time to come up with a promotional strategy, “the reporter didn’t tweet about it for two days.” Overall, less than 10 percent of Times traffic comes from social, compared to 60 percent at BuzzFeed. (p. 43)

At the extreme end of the scale, scientists can be reluctant even to collaborate with the media at a basic level, let alone feel comfortable using social media themselves. Yet the ability to exploit digital channels for readership has become inescapable. Any organization can learn something from this report, perhaps even using it as a roadmap for its own digital success, because the New York Times report may be “one of the key documents of this media age”.

Elizabeth Striano
Science writer and editor
www.agreenfootprint.com



Using Graphics to Tell the Story

It’s such a cliche that I’m reluctant to repeat the adage, but it is so true that a picture is worth a thousand words, perhaps even more so in the sciences. I have an excellent example from the last few weeks of just how true this adage is. In this case, a scientific fact–that mosquitoes are actually, by far, the most deadly creature to humans–was skillfully conveyed via a readily consumable graphic that quickly went viral. How wonderful an achievement would that be for so many other facts, especially in an era when so many scientific facts are disputed or questioned?

Originally posted on Gates Notes, The Blog of Bill Gates, on April 24, the simple but ingenious graphic was quickly picked up by many other outlets, including The Washington Post, CBS News, and more, after being tweeted and shared through social media. The end result was that the story stuck around through the end of April, by which time many other articles had long since disappeared. But what were some of the key points that made this graphic so readily sharable?

1) Shock value–very few people knew that the answer was going to be “mosquito” before they clicked on that link. The element of surprise contributed significantly to a reader’s willingness to share.

2) Simplicity–the graphic itself is very streamlined and easy to follow. All extraneous data was eliminated. The images that were used to represent the different animals were simple and readily identifiable silhouettes.

3) Minimal text–As stated above, the image itself provided much of the data. Actual text was kept to a minimum and used only where necessary, including a short title.

4) Snapshot effect–it really only takes a few seconds to convey the entire message of the image.

Graphics can help anyone reach a wider audience. If we follow the principles outlined above, we can create information that is used and shared and that, ultimately, has an impact. Unfortunately, it’s still unclear exactly what makes a post go viral, although tips abound. Having an excellent graphic that tells a story is a great start.

Elizabeth Striano
www.agreenfootprint.com

 



Lectures Don’t Work for Science

Instinctively, we have all suspected on some level that there must be a better way to teach and share scientific information than the traditional classroom lecture. Well, we now have the science to back that up, along with approaches to help make some changes!

A new study of undergraduate science, technology, engineering, and mathematics (STEM) courses showed that scores improved approximately 6% when active learning was included in the classroom. It also found that students in “classes with traditional lecturing were 1.5 times more likely to fail than were students in classes with active learning”. The findings applied across all STEM fields and all class sizes.

The study, a meta-analysis of 225 studies of undergraduate STEM teaching methods published in Proceedings of the National Academy of Sciences, is “the largest and most comprehensive meta-analysis of undergraduate STEM education published to date”.

The authors concluded that when students were active participants rather than passive listeners, they were better able to grasp concepts. Active participation included any activity that required students to engage, from answering questions to collaborating with other students in small groups.

The Alan Alda Center for Communicating Science recently taught a class on using improv to improve understanding of scientific concepts in a presentation-type environment:

“Scientists need to make abstract concepts clear and relevant to any audience they are talking to,” says Lantz-Gefroh. The exercise “is a playful way of getting them to be vivid and expressive when selling a nonsensical idea and then apply those lessons to talking about their real science.”

This course is one of many the Center is using to help scientists communicate better and to “put aside the jargon and connect with the public in language it can understand”. This type of work and investigation points to growing acceptance of the importance of improving public understanding of the science behind issues such as climate change and other complex topics.

Elizabeth Striano
Science writer and editor
www.agreenfootprint.com
