
A Community-Led Effort to Measure the Health of Lake Erie River Basins

The Lake Erie Volunteer Science Network developed an overview of river basin health across the Lake Erie region to inform understanding of overall lake health
Digital Services
November 29, 2023

The Lake Erie Volunteer Science Network (LEVSN) brings together community science water quality monitoring groups to present a region-wide perspective on water(shed) health. They asked the question: can independent monitoring groups collectively analyze their monitoring data to provide an overview of Lake Erie's health? The groups, led by Cleveland Water Alliance and grounded in a culture of communication, collaboration, and standard data management, augmented their existing monitoring efforts by implementing a set of standards called the Lake Erie Baseline Assessment Framework. The data collected throughout the monitoring season is shared, analyzed, and presented in an online application as well as a complementary analytical paper and public-facing report. This article walks through how LEVSN's early inclusion of data management in their SOPs and strategies increased their capacity to build a more impactful, resilient, and data-driven movement.

Section 1: Introduction

Citizen science groups across the Lake Erie region collectively rewrote the book on how community groups collect water quality monitoring data. Since 2020, groups across New York, Ohio, and Michigan have worked together to advance the credibility of community science data, resulting in a better understanding of the health of tributaries that flow into Lake Erie.

Original LEVSN Participating Communities. Photo Credit: Cleveland Water Alliance

Previously, groups across the region monitored water quality in support of their individual water quality goals. An outside observer would not have concluded that these efforts were monitoring within the same hydrologic region. While each program engaged volunteers to identify and address local concerns, they lacked a way to demonstrate the reliability of their data or collaborate with each other for greater collective impact. To elevate the credibility of the data and tell a regional story about the condition of Lake Erie watersheds, the Cleveland Water Alliance created the Lake Erie Volunteer Science Network (LEVSN). This regional collaboration of local monitoring groups works together to unlock the potential of volunteer science to address gaps in regional water quality data collection. In a short time, the participating groups have become a force for community-driven water stewardship - harnessing data, technology, and collaboration to protect our shared natural resources. The strategies executed to date, and the information they have produced, offer lessons learned that other broad watershed regions can adopt.

“Partnering with The Commons on LEVSN’s Data Management System is one of the best decisions I have made over my four years of launching and growing the network,” says Max Herzog, Program Manager with CWA. “Access to and adoption of the Water Reporter platform served as an early rallying cry to get volunteer groups across the region engaged and helped folks see the potential of standardization and collaboration across communities. Over the years, The Commons team has consistently acted more as a collaborator than a vendor as they iterated with us to incorporate participant feedback into powerful tools that have taken our work to the next level. We look forward to continuing with them on this journey to elevate the credibility and impact of the volunteer water monitoring movement.”

What happens before you build a successful regional network? You launch the prototype

The predecessor of the Lake Erie Volunteer Science Network was Cleveland Water Alliance’s Smart Citizen Science Initiative (SCSI), founded with critical funding support from the Great Lakes One Water Partnership. In 2020, twelve pioneering groups joined the Initiative as Champions ready to bring together their water monitoring program data. From the beginning, Max Herzog, Program Manager at CWA, recognized that a unifying data management system would be a cornerstone of this initiative and assembled technical service providers to support the development of the network. The investment paid off.

“When a group gets this part of their house, or program, in order, they recoup resources and labor to apply to other mission-critical elements, for example expanding the size and scope of monitoring,” explains Barb Horn - the study design consultant who supported this project. “The groups’ decision to manage their data using Water Reporter allowed them to focus on standard data collection and develop a network data analyses and reporting method. Before LEVSN, most groups were not engaged in data analysis, only data generation. Participants now understand the critical role of data validation in generating consistent, predictable data of a known quality. Furthermore, engaged data users' understanding and confidence in what was possible via a community network was vastly increased.”

That first year provided a lot of early learning and advancements that sketched the blueprint for a stronger, unified regional effort. In 2021, SCSI re-branded and began planning as the Lake Erie Volunteer Science Network (LEVSN). The group took the time to build relationships and a solid foundation of leadership, inclusion, and trust that centered on shared goals, outcomes, and collective impact.

Initial investments in LEVSN included data use goals and data management planning when creating the Lake Erie Baseline Assessment Framework

LEVSN’s Standards Working Group, a task force composed of volunteer monitoring programs and external experts from Ohio Sea Grant, The Commons, and Ohio EPA, developed the Lake Erie Baseline Assessment Framework (LEBAF) to standardize data collection, analysis, and communication around water quality data. Three outputs anchor LEBAF:

  1. A set of Standard Operating Procedures (SOPs), which describe the program's monitoring purpose and data uses, along with the technical, information, and evaluation design elements that guide mutually reinforcing activities for volunteer scientists across the Lake Erie Basin. The SOP includes essential documentation for data management, volunteer management, quality assurance and control, and data analysis and reporting.
  2. A “standardization menu” that tracks external data users' and program participants' monitoring priorities across the region. This information is updated and prioritized annually. This output aggregates elements that can be standardized to tell a more complete story about watershed health.
  3. A mechanism to look at the collected data at a finer scale. Individual organizations conduct annual analyses and reporting of aquatic life to screen the health of the Lake Erie Basin waterways relevant to their communities. Outcomes and impact are still in progress and are yet to be evaluated for this output, but the consistent contribution of data annually is a key input to achieve this long-term, localized analysis. 

Section 2: Planning

The Standard Operating Procedures (SOP) are framed around three pillars:

  1. A monitoring purpose - aquatic life health conditions.
  2. Data use - screening and reporting at both the large river basin scale and the regional scale.
  3. Targeted data users - the participating groups themselves and high-level decision-makers, whom the standardization and SOPs are designed to serve.

The Standards Working Group created a shared framework based on these pillars to empower participants to implement standardized data collection, analysis, interpretation, and reporting processes that protect local interests while contributing to information gathering and dissemination efforts at the Great Lake Basin level.

Thanks to the commitment to standardization across all monitoring program functions, LEVSN can present a regional volunteer-driven perspective on the condition of watersheds that feed Lake Erie and provide a benchmark against which future monitoring can be compared. The first season of monitoring validated that the LEBAF SOP created by LEVSN and partners can create a regional snapshot of watershed health. Leadership was excited to report that the interpretation of the 2022 monitoring season data provided an excellent window into watershed health. 

Implementation of the evaluation SOP verified that the network could support more data, sites, and groups and that growth would only increase collective impact. As such, the 2023 monitoring season saw an expansion and maturity of the program, highlights of which we discuss in the methodology and results sections of this use case.

How to get answers to the ultimate question

We all want to get to the ultimate answer - does this network approach generate valuable information on watershed health? That answer, however, is not easily obtained. The work to establish a system that could reliably answer the question took much of LEVSN's initial resources. Developing the SOPs and then building a complementary data management system (DMS) to function behind the scenes was paramount to producing a pathway toward tangible and demonstrable wins for water quality. The DMS has two core functions. First, it ensures all groups provide structured, machine-readable data at the end of the monitoring season. Second, it facilitates analytics and information building based on the collected data. The methodology and results sections of this use case focus on the intersection of the data management system and the engagement of the network participants. More discussion of interpretations of data-derived information can be found on Cleveland Water Alliance's Lake Erie Volunteer Science Network page.

Section 3: Methodology

This use case focuses on the components of the program guided by data management. We address the question: how did LEVSN partners manage their data to achieve their programmatic goals? The elements outlined in this section characterize the behind-the-scenes work undertaken and tools used to facilitate efficient collective analysis, reporting, and communication. Documentation exists that shares the SOPs, implementation, data analysis results, and more. For anyone interested in adopting a similar framework and approach in their region, we recommend looking at those resources as well.

Launching a new network - how to put a tourniquet on fragmented monitoring efforts

The success of this network is made possible in part because of the inclusion of all stakeholders, including the technical service providers, from the beginning. When everyone is at the table, each stakeholder identifies their contribution, role, and benefits. This allows them to achieve a deeper understanding of the processes and tools that produce collective outcomes and impact. For example, by being part of the study design process, The Commons team provided real-time consultation in three main areas. The team:

  1. Gained insight into how to structure the data models based on data collection and data interpretation workflow needs. 
  2. Assimilated strategies to ensure the adoption of best data management practices.
  3. Oversaw the integration of data management processes into the official SOPs. 

This liberated participants to focus on the science of monitoring data analyses, use, and community engagement.

By the end of the 2023 monitoring season, the network and development work done to date will have produced four key outcomes: 

  1. Two years of credible, standardized data collection, 
  2. Regional analyses of Lake Erie watersheds’ aquatic life conditions, 
  3. Data-based recommendations for resource improvements to a broad network of stakeholders, and 
  4. A scalable data management system. 

What was the technical recipe for success?

Assembling the software toolkit: curating a data management system

Let’s walk through the software we used to complete this project. 

Figure 1. Diagram of the data flow, software utilized, and role of each system in supporting LEVSN 

Water Reporter. Water Reporter is the data management platform built and hosted by The Commons that enables water quality monitoring groups to configure and manage their quantitative, site-based monitoring data in a pre-established relational data model. Each group can designate their station locations and define the parameters so that information collected in each sample can be organized appropriately. A simple map visualizes the data collected over time at each site and for each parameter.

Airtable. Airtable is a cloud-based data management platform that allows users to configure, input, store, and organize large datasets in user-friendly ways. For this project, Airtable serves as the centralized database to aggregate each LEVSN group's water quality data. Additionally, Airtable's open and accessible format allows for multiple integrations with various third-party applications, setting the project up for successful public visualizations and interactive data exploration. Airtable is configured to push hosted data to an ArcGIS Online feature service that can then power a public-facing data visualization. Further, Airtable's flexibility with scripting and formulas enabled project stakeholders to quickly score and tabulate exceedances against an array of complex parameter thresholds derived from state water quality standards.

ArcGIS Online. ArcGIS Online is a cloud-based geospatial mapping and analysis service offered by Esri as part of their larger geographic information system (GIS) offerings. With ArcGIS Online, users can produce maps, dashboards, story maps, and various other public-facing tools to better share their data with the public at large. For this project, ArcGIS Online was used to produce a map and data dashboard that publicly share the discrete and continuous water quality data analyzed by LEVSN partners.

Tableau. Tableau is a visual analytics platform that helps people see, understand, and interact with data. Tableau’s broad array of analytical options assisted our team in filling in gaps in the capacity of the other analytic software in use. The components derived from Tableau were inserted into the final ArcGIS-hosted dashboard, keeping all of the data-derived analytics accessible for end data user audiences. 

Implementing the LEBAF SOP via the Data Management System with all LEVSN partners

How can we align processes and products?

Step 1. Establish requirements for an effective data model. The LEVSN Working Group detailed two requirements that their data model needed for success.

First, they determined that it needed to support two data workflows. The workflow and functions needed to get data from the field or lab, validated, and into a repository available for analyses and reporting are different from the workflow and functions needed to pull valid data, conduct analyses, report, and deliver results and information. Additionally, the working group determined that their data model needed to serve, first, the monitoring purpose of aquatic life conditions and the data screening use, and second, the information needs of each group and other data users. Both of these requirements anchored the building blocks of the data model design, scoped to serve LEVSN's specific monitoring purpose, data use, and the information needs of targeted data users. The resulting data model organized the data to ensure the elements and connections met the initial requirements. Specific elements included connections between monitoring results, metadata, ancillary data, data quality objectives, and information needed for analyses and reporting, such as parameter benchmarks and exceedance criteria.
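To make those requirements concrete, here is a minimal sketch, in TypeScript, of the kinds of entities and connections such a data model ties together. The type names and fields are illustrative assumptions for this article, not the actual LEBAF or Water Reporter schema.

```typescript
// Illustrative sketch only: entity names and fields are assumptions,
// not the actual LEBAF/Water Reporter schema.

interface Station {
  id: string;
  name: string;
  riverBasin: string;          // large river or direct tributary the site drains to
  latitude: number;
  longitude: number;
}

interface ParameterDefinition {
  code: string;                // e.g. "pH", "DO", "conductivity"
  units: string;
  lowerBenchmark?: number;     // exceedance criteria used in analysis and reporting
  upperBenchmark?: number;
}

interface MonitoringResult {
  stationId: string;           // connects a result to its station
  parameterCode: string;       // connects a result to its benchmark
  value: number;
  collectedAt: Date;
  collectedBy: string;         // metadata: group / volunteer
  qaqcStatus: "raw" | "validated" | "rejected";  // supports data quality objectives
}
```

The point of the sketch is the connections: every result can be traced to a station, a parameter definition, its benchmarks, and its validation status, which is what allows the later analysis and reporting workflows to run without manual reconciliation.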

The meticulous preparation and resulting framework set rigorous standards for collecting, analyzing, and communicating volunteer-collected water quality data. This preparation sets up the network for success because it guarantees the production of data of a known quality. If these high standards were ignored, the data-to-information pathway made possible through this data model would produce unusable data and, ultimately, capsize the entire network.

What is the standard data model?

Step 2. Build a standard data model and template.  The initial LEBAF SOP developed by the LEVSN working group identified the basic chemical parameters that would support the monitoring purpose and data use of screening for aquatic life conditions in rivers. Next, the group researched and developed regional benchmarks and assessment criteria for each parameter to tell that aquatic life condition story. The parameters collected were pH, dissolved oxygen, water temperature, and conductivity.

Conductivity quickly became the most complex parameter to interpret. Conductivity is a measurement of ions suspended and/or dissolved in the water; however, it doesn't identify the composition of those ions, which can be a product of a multitude of different potential pollutants. Hence, it is much more difficult to develop an effective conductivity standard that will protect aquatic life compared to pH, dissolved oxygen, or temperature. Despite this shortcoming, conductivity is relatively easy and feasible to measure and can be an effective parameter for screening. The relationships between conductivity and chloride, salinity, and total dissolved solids are predictable and reproducible via respective mathematical equations. Thus, a conductivity result can be translated into a chloride, salinity, or total dissolved solids result, parameters for which standards of health do exist. Ultimately, chloride, salinity, and total dissolved solids, as expressions of conductivity, became surrogate parameters for LEVSN's data management, analysis, and reporting - adding to the ability to meet their main goal of assessing aquatic life conditions.
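The sketch below shows the surrogate-parameter pattern in TypeScript. The actual conversion equations and coefficients live in the LEBAF SOP; the constant factors here are placeholder assumptions used only to illustrate how one conductivity reading becomes three screenable results.

```typescript
// Illustrative sketch: the real conversion equations and coefficients are
// defined in the LEBAF SOP; the factors below are placeholder assumptions.

const TDS_FACTOR = 0.6;          // assumed mg/L of TDS per µS/cm of specific conductance
const CHLORIDE_FACTOR = 0.2;     // assumed mg/L of chloride per µS/cm
const SALINITY_FACTOR = 0.00055; // assumed ppt of salinity per µS/cm

interface ConductivitySurrogates {
  tds_mgL: number;
  chloride_mgL: number;
  salinity_ppt: number;
}

// Translate one conductivity reading (µS/cm) into its surrogate parameters,
// which can then be screened against existing aquatic-life standards.
function surrogatesFromConductivity(conductivity_uScm: number): ConductivitySurrogates {
  return {
    tds_mgL: conductivity_uScm * TDS_FACTOR,
    chloride_mgL: conductivity_uScm * CHLORIDE_FACTOR,
    salinity_ppt: conductivity_uScm * SALINITY_FACTOR,
  };
}
```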

Understanding the parameters and the purpose of those parameters is important when discussing the development of the data model and analytics. Based on this information, The Commons team constructed a data model that collected conductivity results, hosted mathematical conversions for the three additional expressions of conductivity, and applied the three additional standards across the assessment and reporting workflows. The components under consideration included exceedance calculations, tables, graphs, and mapping outputs.

Based on the determined parameters and metadata fields associated with each sample, The Commons built a data model and software system that all volunteer groups could use to store, validate, manage, analyze, report, and share data. The data model included all of the information to be collected to streamline data aggregation and, where possible, automated components based on mathematical expressions and algorithms. Volunteer groups started the monitoring season knowing what data they needed to collect and the format in which it needed to be shared to prepare the data for participation in the LEVSN regional effort.

Figure 2. Diagram of the data schema created to host all LEVSN data in Airtable

How can data be prepared and shared?

Step 3. Get all data into a standard, structured format. By the end of the monitoring season, each participating volunteer monitoring group employs the same standard template to structure and import their monitoring data into their Water Reporter account. Getting data into a standardized, structured format is the instrumental first step to collaborative analytics and communication. Water Reporter helped participating groups organize and structure their data in a machine-readable format, preparing it for the next step. Next, each group follows standard steps, as outlined in the relevant workflows in the SOPs, to validate the imported data and metadata before that data moves on to the analysis and reporting workflows that turn the data into information.
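As a sketch of what that structural validation can look like, the TypeScript below checks template-shaped rows for missing fields, bad timestamps, and unit mismatches before import. The row shape, unit list, and rules are hypothetical illustrations, not the LEBAF template itself.

```typescript
// Hypothetical validation pass over records shaped like a standard template.
// Field names, units, and rules are assumptions for illustration only.

interface TemplateRow {
  stationId: string;
  parameterCode: string;
  value: number | null;
  units: string;
  collectedAt: string;   // ISO 8601 timestamp
}

const EXPECTED_UNITS: Record<string, string> = {
  pH: "SU",
  DO: "mg/L",
  temperature: "deg C",
  conductivity: "uS/cm",
};

// Return the list of problems found in a row; an empty list means the row
// is structurally ready to import.
function validateRow(row: TemplateRow): string[] {
  const problems: string[] = [];
  if (!row.stationId) problems.push("missing station ID");
  if (row.value === null || Number.isNaN(row.value)) problems.push("missing or non-numeric value");
  if (Number.isNaN(Date.parse(row.collectedAt))) problems.push("unparseable timestamp");
  const expected = EXPECTED_UNITS[row.parameterCode];
  if (expected && row.units !== expected) problems.push(`expected units ${expected}, got ${row.units}`);
  return problems;
}
```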

How can we streamline analytics and understanding?

Step 4. Aggregate data and automate calculations. Once each group structured their data in Water Reporter, it could be retrieved and aggregated into the Airtable base used for analysis and reporting. The prepared Airtable base hosts the data model developed by The Commons and simplifies the importing process from the relational data model schema hosted in Water Reporter. This means that, with minimal human intervention and reconfiguring, we centralized the disparate data into a single database, streamlining analysis tasks and workflows at different geographic scales.

As mentioned in Step 2, the LEVSN Working Group designed its analysis and reporting strategy for the basic chemical parameters before the monitoring season got underway in 2022. In 2022, The Commons team wrote scripts in a third-party application, R Shiny, to conduct analytics on the entire annual data set. This produced exceedance tables, graphs, and maps for each parameter at the station and large river level, but the approach came with some significant drawbacks: groups could not run the script themselves or select the boundaries of their maps. In 2023, we moved the automation from the R Shiny script into Airtable, making the scripts more accessible and editable. The new approach consists of calculated fields and scripting for more complex, efficient analysis that, as an added bonus, users can run anytime on their own.
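Airtable's calculated fields and scripts are written against its own formula language and JavaScript scripting API; the TypeScript sketch below only illustrates the kind of per-record exceedance flag that step computes. The benchmark values shown are placeholders, not LEBAF's benchmarks.

```typescript
// Sketch of a per-record exceedance flag. Thresholds and parameter codes are
// placeholder assumptions, not LEBAF benchmarks, and this is not the actual
// Airtable script.

interface Benchmark {
  lower?: number;   // values below this are exceedances
  upper?: number;   // values above this are exceedances
}

const EXAMPLE_BENCHMARKS: Record<string, Benchmark> = {
  DO: { lower: 5.0 },              // placeholder minimum dissolved oxygen, mg/L
  pH: { lower: 6.5, upper: 9.0 },  // placeholder range
  temperature: { upper: 30 },      // placeholder maximum, deg C
};

function isExceedance(parameterCode: string, value: number): boolean {
  const b = EXAMPLE_BENCHMARKS[parameterCode];
  if (!b) return false;  // no benchmark defined for this parameter
  return (b.lower !== undefined && value < b.lower) ||
         (b.upper !== undefined && value > b.upper);
}
```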

Before data was imported and validated from the groups’ completed monitoring seasons, we prepared the end-of-year calculations and analytics. As individual groups completed their internal QA/QC process and moved their data into the Water Reporter structure, our team finished coding and verifying the calculation scripts. 

Example of script created in Airtable to automate calculations.

The primary metric LEVSN uses to tell the aquatic life condition data screening story is the number of exceedances against the respective standard for each parameter. Thus, the analysis centered on the calculation of the number and percent of exceedances for each monitored and surrogate water quality parameter at each site, as well as the subsequent aggregation of results for all sites across the watershed of each river/tributary and the Lake Erie Basin. Exceedances were determined using benchmarks for each parameter. LEBAF's benchmarks were derived by integrating the state Clean Water Act criteria from Michigan, New York, and Ohio, which are based on research, literature, and laboratory studies. Those criteria were then translated into, and adopted as, the framework in the SOP, selecting the best combination to serve as a regional Lake Erie standard for rivers.
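Building on the flag sketch above, the following TypeScript shows how per-record exceedance flags can roll up into counts and percents at the station and river scales. The record shape and grouping keys are assumptions for illustration.

```typescript
// Sketch of rolling per-record exceedance flags up to station and river scale.
// Record shape and grouping fields are assumptions for illustration.

interface FlaggedResult {
  stationId: string;
  river: string;          // large river or direct tributary the station reports to
  parameterCode: string;
  exceeds: boolean;       // output of the per-record flag step
}

interface ExceedanceSummary {
  samples: number;
  exceedances: number;
  percent: number;
}

// Group results by an arbitrary key (station, river, or river + parameter)
// and compute the number and percent of exceedances for each group.
function summarize(
  results: FlaggedResult[],
  keyOf: (r: FlaggedResult) => string
): Map<string, ExceedanceSummary> {
  const summaries = new Map<string, ExceedanceSummary>();
  for (const r of results) {
    const key = keyOf(r);
    const s = summaries.get(key) ?? { samples: 0, exceedances: 0, percent: 0 };
    s.samples += 1;
    if (r.exceeds) s.exceedances += 1;
    s.percent = (100 * s.exceedances) / s.samples;
    summaries.set(key, s);
  }
  return summaries;
}

// Station-level and river-level rollups for each parameter:
// summarize(results, r => `${r.stationId}|${r.parameterCode}`);
// summarize(results, r => `${r.river}|${r.parameterCode}`);
```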

How do you share data with target decision-makers?

Step 5. Create an ArcGIS Online dashboard to power an interactive application. Data analysis should be repeatable and transparent, and an effective data model (and software system) makes it possible to show and share analyses. A powerful feature of Airtable is the ability to integrate your bases with third-party software. To build a public-facing dashboard that requires minimal human oversight and intervention, an Airtable-hosted database is a perfect backend-as-a-service solution for integrating data with third-party systems. Our team created an ArcGIS Online (AGOL) Feature Service using LEVSN's Airtable base. If you're familiar with ArcGIS Online feature services, you know that managing data in AGOL can be complex and limited compared to the fluid editing environment offered by Airtable. Most often, a manager is required to routinely update and edit the data within the feature service in ArcGIS Online. This creates issues of version control and introduces opportunities for human error and stale data to be passed off to the public as the authoritative source.

In this project, to bypass common data struggles experienced in ArcGIS Online-hosted applications, we wrote an API connection that pushes the data managed in Airtable directly to the Feature Service. Our connection includes commands to push edits, additions, and deletions made in the Airtable base directly to the data hosted in the Feature Service. This focuses the attention of data management on the Airtable data and reduces the opportunity to introduce errors to the data powering the dashboard. This is key to maintaining data integrity during the analyses and reporting functions and workflows.
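The sketch below illustrates the shape of such a connection in TypeScript, reading records through the public Airtable REST API and posting them to a hosted feature layer through the ArcGIS REST API's applyEdits operation. The base ID, table name, layer URL, field names, and token handling are placeholder assumptions; a production connection would also handle updates, deletions, and paging.

```typescript
// Sketch of an Airtable -> ArcGIS Online push. Base ID, table name, layer URL,
// field names, and token handling are placeholders; this is an illustration of
// the pattern, not the production connection.

const AIRTABLE_URL = "https://api.airtable.com/v0/YOUR_BASE_ID/Results"; // placeholder
const FEATURE_LAYER_URL =
  "https://services.arcgis.com/YOUR_ORG/arcgis/rest/services/LEVSN/FeatureServer/0"; // placeholder

async function pushAirtableToFeatureService(airtableToken: string, agolToken: string): Promise<void> {
  // 1. Read the analyzed records out of the Airtable base.
  const res = await fetch(AIRTABLE_URL, {
    headers: { Authorization: `Bearer ${airtableToken}` },
  });
  const { records } = (await res.json()) as {
    records: { id: string; fields: Record<string, any> }[];
  };

  // 2. Reshape each Airtable record into an ArcGIS feature (point geometry + attributes).
  const adds = records.map((r) => ({
    geometry: { x: r.fields.Longitude, y: r.fields.Latitude, spatialReference: { wkid: 4326 } },
    attributes: {
      station_id: r.fields.StationId,
      parameter: r.fields.Parameter,
      pct_exceedance: r.fields.PercentExceedance,
    },
  }));

  // 3. Apply the edits to the hosted feature layer in one request.
  const body = new URLSearchParams({ f: "json", adds: JSON.stringify(adds), token: agolToken });
  await fetch(`${FEATURE_LAYER_URL}/applyEdits`, { method: "POST", body });
}
```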

The LEBAF protocol includes some rigorous analytics that could not be visually supported with the base offerings of the ArcGIS Online software. Our team used Tableau to conduct more complex analyses and embedded those analytics into the final application alongside the geospatial and sample statistics hosted and shared through ArcGIS. The end result is an application that does justice to all of the levels of complexity, maintains data integrity, and offers multiple entry points for a wide array of decision-makers. It not only makes the analyses transparent and reproducible, but data users external to the network can conduct their own analyses, expanding the potential collective impact.

The resulting dashboard built with the LEVSN feature service delivers incredible information to decision-makers. All of the analytics performed in Airtable via script appear as interactive, approachable data points within the dashboard. Users from Ann Arbor, Michigan to Cleveland, Ohio, and Buffalo, New York can home in on their local watershed or zoom out to understand their community's connection to the broader basin.


Section 4: Results

LEVSN is a leader in community-science collaboration not only because of its commitment to collective impact through a broad network of stakeholders that agreed to adhere to a common framework and SOPs but also because participants committed time, energy, and resources to build processes and procedures that are inclusive, results-oriented, and repeatable across all partners. As a collective group, LEVSN is fostering a culture of collaboration and a commitment to data management as an integral program element. That has become a catalyst for tackling big issues regarding water quality and community data.

“Many groups believe generating data is the endpoint, it is never the endpoint. Even turning that data into information and using the information is not the endpoint. The endpoint is the impact of the data use. If a group has centered on generating data, taking the next step to turn that data into usable information - the data to information workflow - can be overwhelming to design and implement.” Barb Horn described the foundation that LEVSN’s data management strategy provided for the growth of the network. “With the commitment to get their data used for collective impact, having Water Reporter eased the sense of overwhelmingness and facilitated LEVSN groups to push themselves and understand the value and role of data validation, how to generate predictable and consistent data, and design analyses and reporting that aligns with their monitoring purpose and data uses. If a data management system hadn’t been a key program element and determined at the beginning, the group wouldn’t have had as much energy to focus on standardizing the science elements, such as what to monitor, where and how, and the entire effort could have lost relevancy and dissolved, which is common.”

Outputs

This collaborative and inclusive network project produced a data management system that allows all participants to manipulate and manage the data, validate inputs, automatically produce calculations and analytics based on partner-collected data, develop a variety of information products, and host an online dashboard that displays all monitoring information in an easily accessible and digestible format for both their data users and a broader audience.

Automated data analysis produced standardized summary statistics, graphs, and maps at each scale. The analysis centered on calculating the number and percent of exceedances for each monitored and surrogate water quality parameter at each site, and on the subsequent aggregation of results for all sites across the watershed of each river/tributary and the Lake Erie Basin. Outputs are scalable, facilitating not only analyses but also reporting and supporting information products at the station level for local organizations and the equivalent at the large river level for LEBAF, LEVSN, and the region.

Outcomes

Optimizing Data-to-Information Workflows. LEVSN adopted a data-management-centric culture that fostered a commitment to pushing their data into the prepared robust data model. LEVSN did not rush to define its database structure. They started by determining what they needed their data model to accomplish, then worked through the structuring and data workflows to achieve their end goals.  

Unfortunately, nothing requires groups and networks to start data modeling with a requirements analysis. A group can jump ahead, organize their data fields within a database, and then start populating it with data. The consequences of postponing or avoiding these discussions can be disastrous and detrimental to programmatic goals. Completing this planning work upfront saved LEVSN from leaking resources and spinning its wheels further along in the data-to-information workflow. If the conversations had been delayed, the groups could not have produced a database full of consistent and reproducible data of a known quality. Often the consequence of skipping that planning is data that goes unused; the result of doing it is data that gets used. The longevity of LEVSN comes down, in part, to their focus on the foundations.

Synergistic monitoring efforts. Participating members of LEBAF opt in to the standardization through an annual memorandum of understanding that requires adherence to the technical requirements and minimum performance criteria of the shared framework and associated SOPs. The framework is designed to synergize with, rather than replace, their pre-existing monitoring programs. Because groups included and committed to standardizing and digitizing the data management of their monitoring work, LEBAF participants were able to meet their data objectives, monitoring purposes, and intended data uses. The conducted analysis tells stories at three scales:

  1. Local rivers, at each monitoring site
  2. Large rivers and other direct tributaries to Lake Erie
  3. Lake Erie river basins (as a region)

Evolution through evaluation. Key to LEBAF's success was designing and implementing an evaluation process that included formative and summative reviews. Evaluation of the data management system, associated workflows, monitoring program, and analysis changes led to the evolution from Water Reporter alone, to an R Shiny script, to Airtable and its associated outputs. Evaluation is why that evolution occurred. Without it, it is not certain but highly probable that participants' interest, commitment, and engagement would have dissolved, threatening the future of LEBAF at the vulnerable start-up phase. That is the strength and benefit of taking the time to design and support program foundation elements such as the data management system, independent of a program's longevity. By working together and diving deep into the procedures of standardization, the groups collectively built a system that produces predictable, reliable, and consistent data of a known quality, helping data users, who are often influential decision-makers, use it to protect and restore waters for all beings across Lake Erie.

Developing Lake Erie Regional River Basin thresholds. Before this project, regional thresholds for pH, DO, temperature, conductivity, chloride, salinity, and total dissolved solids for surface water and rivers did not exist. LEBAF used existing scientifically defensible state or research thresholds to produce a set of regional thresholds. Furthermore, LEBAF developed a standardized, robust analysis and reporting process that any entity could employ, complementing local threshold analyses. The larger outcome is an effective data-to-information path, from analyses to reporting to data use to impact, that evaluates aquatic life conditions, screens for further protection, restoration, or exploration recommendations, and is scalable from the local level to the Lake Erie region.

Large River and Direct Tributary Analysis. Station monitoring results were aggregated to large river geographies. This provided a scalable perspective to identify aquatic life conditions and gaps in monitoring or knowledge. Thanks to the data system's automated calculations, a summary for each parameter and its exceedances within each watershed boundary can easily be derived, displayed, and used for analyses. Based on these large river data summaries, recommendations, conclusions, gaps, assumptions, and limitations are produced through the lens of the larger river and, when possible, the large river's potential influence on aquatic life conditions in Lake Erie. In the first year, this approach was applied to the large rivers (Clinton River, Cuyahoga River, Huron River, and Rocky River) and the direct tributaries to Lake Erie (Buffalo River, Doan Brook, Eighteen Mile Creek, Euclid Creek, Rush Creek, Smoke Creek, Mills Creek, Old Woman Creek, and Pipe Creek). Five stations along the Lake Erie shoreline or in the lake itself were also monitored.

Accelerated and collective analysis. Conducting a collective analysis across groups and a region would not be possible without a DMS. The DMS standardizes and formats the data, stores the data in an accessible repository, and allows quality assurance and control procedures and data management practices to join, share, and use the data. This may seem obvious, but only because it was built in from the original formation of LEVSN. Furthermore, with the adoption of the DMS, the window between completing the monitoring season data collection and producing the first round of analytics for public consumption is a matter of weeks rather than months. This allows an annual analysis and reporting process and expands the number of created products. Even more important, it allows groups that were not conducting analyses before to add this to their program. In a collaboration, engagement is key for sustainability, and when a process does not produce results, attrition follows.

Accelerated analysis is a significant improvement over past efforts to collect and analyze disparate data, which involved time-intensive aggregation, translation, and QA/QC periods before the hands-on work to complete the analytics and then translate those results into papers and products. This is true for individual groups, let alone a collaboration joining data across groups. There is no shortcut for interpreting data from analyses and findings, developing recommendations, delivering information, and evaluating the impact. The DMS used by LEVSN means that the dashboard can be embedded in participant groups' websites and shared with decision-makers within a month of the closure of the monitoring season, facilitating timely delivery and outreach, and thus creating the desired collective impact for the health of the Lake Erie Basin's waterways.

Impact

We provided the DMS to conduct local station and river analyses, helping each LEBAF organization expand and strengthen existing programs and efforts. The same DMS also provided the ability to conduct a larger river analysis, resulting in recommendations that direct, focus, and help prioritize restoration, protection, and exploration in that river basin. The DMS also produced a unique regional analysis, perspective, and set of recommendations across Lake Erie watersheds. The regional analyses were based on results from all stations, direct tributaries, large rivers, and lake stations taken together. LEBAF's monitoring purpose of condition and screening data use provides the question; the DMS and associated analyses and reporting process allow us to find the story in the data and tell that story to create collective impact within and across watersheds. After a robust evaluation process, LEBAF made numerous improvements in the technical, information, and evaluation designs, including those that evolved the DMS and will continue to do so. In the second field season, 12 participating groups collected, analyzed, interpreted, and disseminated data from 142 stations on 20 local waterways.

As a result of this robust and thorough assessment, LEVSN can present a regional volunteer-driven perspective on the condition of watersheds that feed Lake Erie, provide a baseline data set, analysis, and reporting process against which future monitoring can be compared, and continue to add parameters, organizations, locations, and monitoring purposes to the network. Using the definition of health adopted by the LEBAF SOP, field measurements appear to indicate that Lake Erie's watersheds are generally healthy and support aquatic life, with identified localized areas to explore or support restoration efforts.

“It’s the nature of who The Commons is that allowed this project to happen. They learned and adapted with the groups and created a product that took into account what they were looking for. They were not a sterile development company,” explained Barb Horn on the approach that The Commons took to this project.

The long-term commitment by participants allows LEVSN to focus on refining the operation and maintenance of existing protocols, tools, documentation, and workflows, with emphasis on more exchanges between groups and the sustainability of the network. This commitment, coupled with CWA's financial investment in the network's launch, is critical to the ongoing success of the network. Building on existing standards will provide depth and direction, reinforcing the work's credibility and broadening our understanding of the Lake Erie Basin. Analysis conducted to date shows that while the overall health of Lake Erie needs improvement, some individual rivers and tributaries are thriving, and some localized parameters indicate stations or waterways that are at risk of (or currently) experiencing unhealthy conditions.

Author’s note: Thank you to Max Herzog, Barb Horn, and John Dawes for their contributions to this article.