When it comes to the Indian River Lagoon, it was left to Marine Resources Council (MRC) Executive Director Leesa Souto to put a positive spin on a failing grade during a sneak peek of the Indian River Lagoon Report Card Feb. 22 at Florida Tech in Melbourne.
Like the “before” picture in a comparison advertisement, the first edition is the greatest challenge, because presumably the good news comes later: after substantive changes in policy and increased community awareness, the success stories of lagoon restoration can be tracked and the supporting data presented to the public.
The bad news is that, as the study nears completion following review by the National Estuary Program, it appears the majority of the 156-mile Indian River Lagoon, which extends from Ponce de Leon Inlet in the north to Jupiter Inlet in the south and is home to 3,500 species, would be considered flunking.
“We could just put an F on the entire lagoon or, for management purposes, we need to kind of do the gradients of F,” Souto said.
Now less a report card than a scorecard, the document swaps letter grades A through F for numeric scores: the researchers divided the lagoon into 10 areas and created a standardized scoring system ranging from 0 to 100. In its current form, the document shows some painful baseline numbers, all of which would be considered an F.
The origins of the study go back to the 2014 Lagoon Action Assembly, organized by MRC, in which 100 delegates drawn from a cross-section of the community sought consensus on ways to restore the IRL. One of the assembly’s top priorities was the process and production of an IRL report card.
The effort, set to culminate this spring with the report’s publication, required valid comparisons of a vast amount of data from all areas of the lagoon, and that data needed to go back 20 years. Because of gaps, or concentrations of study in only specific areas, the list of criteria was pared down to measures of chlorophyll, nitrogen, phosphorus, seagrass depth and the turbidity of the water as an aspect of water quality.
“We had over 20 to start with, and every time we had to eliminate an indicator, we mourned, because it’s a lot of work. I knew it was going to be complicated, but it was way more than we thought, deciding which indicators to use and which not to, and the method of standardizing different types of data (for comparison purposes),” Souto said.
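The article does not spell out the MRC’s standardization method, but one common way to collapse unlike indicators onto a shared 0-to-100 scale is to score each measurement against reference thresholds and average the results per area. The sketch below is purely illustrative; the threshold values, indicator names and example readings are hypothetical, not drawn from the MRC study.

```python
# Illustrative only: one common approach to standardizing unlike indicators
# (different units, different "good" directions) onto a shared 0-100 scale.
# The thresholds here are hypothetical; the real study would derive its own
# reference values from regulatory targets or historical data.

# (worst, best) reference values per indicator
THRESHOLDS = {
    "chlorophyll_ugL":  (30.0, 2.0),   # lower is better
    "nitrogen_mgL":     (2.0, 0.5),    # lower is better
    "phosphorus_mgL":   (0.5, 0.05),   # lower is better
    "seagrass_depth_m": (0.0, 1.7),    # deeper seagrass edge is better
    "turbidity_NTU":    (25.0, 2.0),   # lower is better
}

def score(indicator: str, value: float) -> float:
    """Map a raw measurement to 0 (worst) .. 100 (best)."""
    worst, best = THRESHOLDS[indicator]
    fraction = (value - worst) / (best - worst)
    return 100.0 * max(0.0, min(1.0, fraction))  # clamp to the scale

def area_score(readings: dict[str, float]) -> float:
    """Average the indicator scores for one of the 10 lagoon areas."""
    return sum(score(k, v) for k, v in readings.items()) / len(readings)

# Hypothetical readings for one lagoon segment
segment = {"chlorophyll_ugL": 24.0, "nitrogen_mgL": 1.6,
           "phosphorus_mgL": 0.4, "seagrass_depth_m": 0.4,
           "turbidity_NTU": 18.0}
print(round(area_score(segment), 1))  # a low score: a "gradient of F"
```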
The two-year, $180,000 study has so far been funded by local foundations, private donors and a $47,000 grant from the National Estuary Program.
It was inspired by the Chesapeake Bay Foundation’s annual community report, which includes repeatable scientific methods to measure progress of restoration efforts, Souto said.
“It has a lot of data and charts in there, but it’s more about stories and the people behind the successful projects. That’s what I want to get to,” she said.