As a program manager using software or overseeing the development of a new application, you may find yourself weighing options for how to get more out of your software, whether through the expanded reach of your data or increased functionality within the application itself. Perhaps you have considered using APIs. APIs provide enormous value to the technology ecosystem by expanding the power and reach of Software as a Service systems and the information stored within them. They are an incredibly useful and wide-reaching tool, so they often emerge in discussions as an option to explore in software development. But is integrating software via an API actually an option that will deliver the reliability and results you expect?
In this article, we share what budding product managers, and software users looking to provide input into development roadmaps, should look for when determining whether an API meets minimum quality standards. We created an API quality framework that we use to assess which platforms we will consider for integration with our products, and we will share it with you too. Whether you are a software product manager or a user helping to manage a development project, we hope this article helps you make smart choices about when to use APIs and when to consider alternative options to achieve your goals.
What are some common APIs that a software developer may use to meet the needs of the project manager or user base?
At The Commons, we have an example of a powerful API that we rely on in our products. Both FieldDoc and Water Reporter use a MapBox integration for our base layers and many other services like geo-coding.
In Water Reporter users can even interchange base layers from MapBox’s suite of options.
Our system gets to make use of powerful mapping software that a whole team of developers maintains and upgrades, while our developers focus on writing the code for managing restoration installation data (FieldDoc) and monitoring data (Water Reporter). The MapBox API is integral to our user experience.
The Commons’ Ultimate API Standards Checklist
The Commons relies on a simple but effective checklist when looking at a new API integration opportunity. We’ve shared our checklist here for two reasons. First, we want to help other program managers and developers create their own systems of discovery and determination for integration. Second, we want to broaden the interoperability of software applications in the environmental space, and by sharing these standards we hope other developers considering opening public APIs will build their products to meet them. Without further ado, here is our checklist:
- Does the software have a public API?
- Can a third party application authenticate?
- Does it support common API functions such as GET, POST, PUT, PATCH, and DELETE?
- Does the API have adequate, thorough documentation?
- Does the API have version control?
- Does the API provide utility for the integration we are trying to perform?
- Does the API have an uptime that is aligned to the uptime of our products?
- Does the API return developer-usable responses when tasks succeed or fail?
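The pass/fail logic behind this checklist is deliberately all-or-nothing: a single “no” sinks the integration. As a minimal sketch of that gate (the checklist keys and function name here are our own illustrative inventions, not part of any product):

```python
# Sketch of the all-or-nothing integration gate described above.
# Keys are shorthand for the checklist questions; names are illustrative.
API_CHECKLIST = [
    "public_api",              # Does the software have a public API?
    "third_party_auth",        # Can a third party application authenticate?
    "supports_http_verbs",     # GET, POST, PUT, PATCH, and DELETE
    "thorough_docs",           # Adequate, thorough documentation
    "version_control",         # Versioned API releases
    "provides_needed_utility", # Useful for the integration at hand
    "adequate_uptime",         # Uptime aligned to our products' uptime
    "usable_responses",        # Developer-usable success/failure responses
]

def should_integrate(answers: dict) -> bool:
    """Return True only if every checklist question is answered yes."""
    return all(answers.get(item, False) for item in API_CHECKLIST)
```

Any unanswered question counts as a “no”, which mirrors how we treat incomplete discovery: if we cannot verify a standard, we do not assume it is met.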
If the answer to these questions is universally yes, then we will consider developing, deploying, and maintaining the components to stand up the integration.
If the answer to any of these questions is no, however, then building a direct integration becomes a greater liability than asset for our products and users.
What to do when an API misses the mark?
It’s important to acknowledge that APIs are only the optimal solution some of the time. We are dedicating an entire blog post to discussing our API checklist not because they are the only option for software integrations but because they are a popular, albeit commonly misunderstood, option that deserves recognition and cost-benefit weighing for program managers.
At The Commons, we recognize the power of APIs is not infinite. While we are currently expanding opportunities for more software integrations across our core products, we have not built our entire roadmap and service structure around the expectations of software integrations. We see APIs as a tool to explore when expanding use and application of our products, but not the only solution.
How can an API fail?
APIs often fail when they are insufficiently structured and poorly documented, resulting in a communication breakdown between applications. Similarly, an API layered on top of shoddy code won’t improve the underlying system it exposes to the integrating software. Avoiding failing APIs is the whole reason we made the checklist above.
The lion’s share of malfunctioning APIs disrupt the user experience. We write code for the user experience, so we will table any work that would flatline rather than galvanize our products and user base. Temperamental or break-prone APIs may leave the user stalled in an incomplete task, whether that task is pushing data to a third party aggregator or data visualization application, or trying to find a point on a map. To a user, malfunctions look like the dreaded “image not available”, problematic response codes like ‘ERROR 404’, a stalled load bar, or directions to a pizza shop on the other side of a half-constructed bridge to nowhere.
Users don’t care that a third party API disrupted their experience; they see the issue within the platform they are working in, which puts the liability squarely on the integrator, reducing usability and confidence in the service. In other words, the primary software can be held back by broken integrations.
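One way an integrator can soften these failures is to translate raw third-party responses into messages the user can act on, instead of surfacing a bare “ERROR 404”. A hypothetical sketch of such a mapping (the messages and function name are our own illustrations; a real product would also log, retry, and degrade gracefully):

```python
from typing import Optional

def user_facing_message(status_code: Optional[int]) -> str:
    """Map a third-party API result to a message the user can act on.

    status_code is None when the request timed out or never connected.
    Illustrative only; not taken from any particular product's code.
    """
    if status_code is None:
        return "The map service is not responding. Please try again shortly."
    if 200 <= status_code < 300:
        return "OK"
    if status_code == 404:
        return "That layer could not be found. It may have been moved or removed."
    if status_code >= 500:
        return "The upstream service is having trouble. Your data is safe; please retry later."
    return "Unexpected response from the map service. Please contact support."
```

The design choice here is that the integrator, not the third party, owns the wording the user sees, which keeps the failure in context instead of leaking another vendor’s error codes into your interface.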
What it takes to integrate an API into a Commons product
Our users are looking to amplify their data and streamline data management as much as possible to extend the use of their information. Quite often at The Commons, we are approached with requests to help a user who stores data in one of our products, such as Water Reporter or FieldDoc, get that data into another system, such as the EPA’s Water Quality Exchange or a reporting platform like the National Environmental Information Exchange Network. Our best clients envision a world of federated databases sharing information between data producers and aggregators. In this dream, users store data in one system and push it to another with the click of a button. Aren’t we all coding for the holy grail of Easy Buttons? Technically, APIs can make this dream a reality by sharing and discovering data across systems; however, the environmental space has miles to go before it can make that dream real.
At The Commons, we rely heavily on building replicable structures that meet the standards practiced across the modern software development industry. That means we often have to choose between options by ranking and prioritizing applications that we connect to and determining if other systems meet our standards. When it comes to aligning API integrations to our product roadmap and user needs, our checklist has helped us make some difficult decisions a lot more straightforward. We encourage our partners and developers in the environmental space to dedicate the resources necessary to open up reliable public APIs in order to help realize the dream of a more federated system of databases and data sharing.
The future of APIs in the Environmental Monitoring Sector
When there is a will, there is a way. At The Commons, we have the honor and privilege of participating in multi-sector stakeholder conversations around the notion of elevating community science and augmenting data sharing. We have been able to peek under the hood of many regional, state, and federal data warehouses and work directly with system engineers that build the infrastructure for these repositories. These databases are sound in their design but sometimes lack API infrastructure that meets our minimum standards for integration with our SaaS products.
With the broad efforts to standardize monitoring programs underway thanks to national groups like the Internet of Water, Water Data Collaborative, and regional groups like Smart Citizen Science Initiative, Unified Water Study, and Chesapeake Monitoring Cooperative, data producers are still looking for opportunities and mechanisms to send data to data portals while building more accessible and flexible systems for data sharing and discovery.
Dinosaur-era data management strategies required the slow re-formatting of data to fit each aggregator’s schema for both data and metadata. The modern approach we need to reach as a movement is systems integrated through APIs. Building a network of systems that meet our checklist is an excellent strategy for accelerating communication between existing platforms. In the meantime, when data warehouse applications fall short of the technical specifications required for successful APIs, the intermediary approach involves three steps:
- Collect and store data in machine-readable formats that optimize structure for data sharing rather than aesthetic data management, as products like Water Reporter do.
- Understand the requirements for sharing data with the receiving data portal and build data cleanup scripts that restructure the schema of the stored data to fit the aggregation system.
- Manually import the newly-formatted data into the receiving database or warehouse.
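Step 2 in this list, the cleanup script, is often little more than a field-by-field remapping. A minimal sketch in Python, where the field names on both sides are illustrative placeholders rather than an actual WQX submission schema (real submissions carry far more metadata):

```python
import csv
import io

# Hypothetical mapping from a source product's column names to an
# aggregator's schema. Real mappings are defined per receiving portal.
FIELD_MAP = {
    "site_name": "MonitoringLocationName",
    "sample_date": "ActivityStartDate",
    "parameter": "CharacteristicName",
    "value": "ResultMeasureValue",
}

def convert(source_csv: str) -> str:
    """Rename the columns of a source CSV to fit the receiving schema."""
    reader = csv.DictReader(io.StringIO(source_csv))
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=list(FIELD_MAP.values()))
    writer.writeheader()
    for row in reader:
        # Missing source fields become empty cells rather than errors,
        # so partial exports still produce an importable file.
        writer.writerow({dst: row.get(src, "") for src, dst in FIELD_MAP.items()})
    return out.getvalue()
```

The converted file then goes through step 3, the manual import into the receiving database, by whatever upload mechanism that warehouse provides.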
Writing the Script on Structured Data Sharing
Writing cleanup scripts that allow one data source to be distributed widely helps break down data silos without overburdening data owners with distributing their data sources to third parties. Broader distribution of and access to data ultimately amplifies its use in science-driven decision making. Script writing is an iterative process that takes a unique set of skills, and if it seems like a band-aid, you’re right. Unfortunately, it’s your best and most accessible option when APIs are not up to par to do that hard data sharing work for you.
Fortunately, lessons can be learned during this suboptimal stepping-stone phase. If you document the workflow and spend the time investigating possible pitfalls and pathways, building an API infrastructure later down the road becomes much easier.
Parity for APIs at the intersection of technology and environment
We recognize that many databases used by the environmental sector right now are not as state-of-the-art as our users expect, given their access to broader technology offerings like Google Maps, Twitter, or Slack. But consider that we are building software on a fraction of the funding these million, billion, and trillion dollar companies operate on, and the needle is still moving. We are a glass-half-full shop: we look for pathways, sometimes outside of APIs, to give the environmental sector software solutions as valuable as what these technology giants provide with their generic offerings. The Commons continues to expand both our direct software offerings and our network of committed technical service provider partners. We know that the developers supporting this sector with modern software development approaches will help our industry’s API-fueled dreams become reality.
Our goal is to simultaneously amplify the environmental sector by building technology for them to communicate their message while also giving them access to software solutions that make their day-to-day data management tasks a breeze. Building integrations via sub-par APIs wastes already limited development resources. The Commons takes the stance that spending the vast majority of our time planning and defining what an API integration should be takes time away from writing code. Even though the corporate tech sector makes documentation look easy, it’s not. All of the integrations you work with in your day to day are the result of hundreds of thousands of planning hours spent identifying improvements to the user experience through highly scoped workflows. Further, these decisions are backed by a significant amount of inward focus on bolstering the company’s bottom line. Squarespace integrated with PayPal so that small business owners could build micro online marketplaces via their websites; the move headed off possible competition and further expanded the reach of their core product: building drag-and-drop websites. In some nefarious instances, large tech companies buy up and then integrate with existing platforms in an effort to squash the competition or phase out a rival product.
Going back to being an optimist: I am in no way saying that the tech sector in the environmental space has reached a stage in its integration approach where it wants to, or can, implement a “catch and kill” strategy. I am simply trying to illustrate that integration in the corporate sector has massive user and business implications that should be looked at holistically.
Any integration pursued in our space needs to be rooted in collaboration and mutualism. Which components of my system work great, and which fall short? What systems out there can I integrate with that will bolster the value of my offering, my partner’s offering, and, most importantly, my users’ experience? It’s not enough to simply say, oh, you have an API, let’s integrate.
Commons Data Conversion Services
As an intermediary between data sources and direct integrations, The Commons offers a conversion service. This helps monitoring programs that manage their data in Water Reporter easily convert it into a format accepted by the WQX data model, and assists program managers in porting their data into that system.
The beauty of the conversion service is that users are not tied to one data bridge. If The Commons can work with the data ingestor operators, we can build conversions from one data source to multiple data streams. This conversion approach may lack the one-click desirability of an API integration, but it allows data owners to retain their data in a structure that best suits their strategic goals while confidently participating in data sharing activities. Furthermore, the conversion service overcomes current inadequacies and infrastructure issues lingering in third party data repository systems.