For years, water takeaway, gathering and distribution have been a challenge for industrial companies. More than 80% of the industrial waste discharged into rivers, streams and lakes is untreated. Compliance, sustainability and safety requirements protecting environmental and public health have become more stringent and will likely continue to tighten in the coming years.
When developing and implementing a water quality monitoring and testing strategy, industrial water operators often face large, distributed water networks that are poorly suited to conventional monitoring processes built on manual sampling and testing, with long turnaround times for test results and rush fees for expedited analysis.
Other challenges include decentralized water quality data, device reliability, the inability to test for heavy metals autonomously, overspending on chemical treatment and cost-prohibitive business models that discourage the adoption of new technology.
Large Distributed Water Networks
Many industrial operations have enormous water distribution networks, and monitoring and testing water quality across the plant or facility takes considerable time and expense. In some cases, sampling, testing and monitoring water quality across an operation is a full-time job. Operators must control the cost of their chemical feed while managing waste, and they must remain compliant with the discharge limits in their National Pollutant Discharge Elimination System (NPDES) permits.
Traditionally, water operators have collected samples, sent them to a laboratory and waited for the results. Until recently, results were mailed back to operators for review and analysis (in certain situations, that is still the case). In recent years, results have instead been posted to various web applications, but those applications are often siloed from other data sources and require manual, time-intensive data input.
In addition to manual sampling, data analysis and laboratory testing, operators can monitor water quality with probes and online analyzers. Probes measure physical parameters like pH, electrical conductivity (EC) or turbidity, which helps with estimates and initial correlations, but that covers only a small subset of the parameters a lab sample would measure. Probes do not eliminate the need for lab analysis, still carry a high labor component, and must be cleaned and calibrated frequently, which introduces the potential for operator error. Alternatively, more sophisticated operations might leverage online analyzers to monitor water quality remotely, though accuracy is compromised over time by the data drift associated with continuous monitoring, particularly with optical methods. Like probes, online analyzers require frequent calibration and the ongoing purchase of consumables tied to sampling frequency, which can make even a single chlorine or nitrate analyzer cost-prohibitive. Unless the instrument is self-calibrating and can automatically sanitize itself between tests, the calibration process causes operational downtime, and the instrument cannot deliver the low detection limits many operations require.
Decentralized Water Quality Data
Water data and analytics have become increasingly available and relevant, but the insights themselves are often not actionable. There has been a proliferation of data sources and business intelligence (BI) tools across the enterprise, regardless of industry or application. As a result, it is critical to integrate water quality data and analytics into an actionable monitoring strategy that is pervasive across the whole organization. A metrics-driven approach to data handling should both enhance top-line operations and increase savings. Unfortunately, building the right approach to collecting, aggregating, assessing and forecasting that data can be complicated and expensive, and it requires specialized expertise.
Some analytics and BI tools are standalone, and others are part of a larger infrastructure solution. Standalone tools tend to be completely customizable but require extensive data architecture, analysis and reporting expertise to produce a meaningful outcome for a water operator. Bespoke BI tools that come as part of a larger infrastructure implementation tend to be closed and poorly interoperable with other systems. In either case, data is often siloed and disconnected.
Device Reliability and Accuracy
Laboratories are without question the most accurate option for water quality testing, and several aspects of water, such as biologicals, pathogens and viruses, still cannot be accurately detected by any online instrumentation. Unfortunately, laboratory testing is static, expensive and time-consuming. Instruments such as probes and current online analyzers cannot match laboratory precision or durability, but they provide faster results. However, each device requires frequent cleaning and calibration along with the purchase of consumables, and nearly every parameter needs its own dedicated online instrument. That forces companies to acquire several analyzers to measure critical parameters like free chlorine, residual chlorine, total hardness and calcium or, in several cases, to rely on calculated estimates without any direct measurement. When cleaning or calibration does not occur regularly, the reliability and accuracy of these devices suffer significantly.
Autonomous Heavy Metal Detection
Precisely measuring heavy metals like iron, selenium or arsenic is critical in a wide range of industrial water applications, including metals, mining and manufacturing. Yet online instrumentation that can monitor heavy metals autonomously without compromising data accuracy is essentially non-existent.
Ideally, water operators could leverage water quality solutions that autonomously test for heavy metals and automatically self-clean and self-calibrate. That information could be sent to a central data warehouse with an analytics interface, allowing operators to remotely monitor and control testing and to be proactive in their remediation and process planning. A rough sketch of that data flow follows.
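As an illustration of what that architecture might look like, the Python sketch below publishes a single autonomous test result to a central data warehouse over HTTP. This is a minimal sketch under stated assumptions: the endpoint URL, payload fields and site identifier are hypothetical placeholders, not any specific vendor's API.

```python
# Minimal sketch: publish an autonomous heavy-metal reading to a central
# data warehouse. The endpoint URL and payload fields are hypothetical.
import datetime

import requests

WAREHOUSE_URL = "https://example.com/api/v1/readings"  # hypothetical endpoint

def publish_reading(site_id: str, parameter: str, value_ppb: float) -> None:
    """Push one test result to the central warehouse for remote analytics."""
    payload = {
        "site_id": site_id,
        "parameter": parameter,   # e.g., "arsenic", "selenium", "iron"
        "value_ppb": value_ppb,   # concentration in parts per billion
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }
    response = requests.post(WAREHOUSE_URL, json=payload, timeout=10)
    response.raise_for_status()   # fail loudly on transport errors

# Example: a self-calibrating analyzer reporting an arsenic measurement.
publish_reading("plant-07-effluent", "arsenic", 4.2)
```

With every reading centralized as it is produced, the analytics layer, rather than the operator, becomes responsible for spotting trends that warrant remediation.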
Excessive Chemical Treatment
Because many traditional sampling and testing methodologies are manual and time-intensive, water operators cannot always dial in chemical treatment processes with a high level of accuracy. It is not uncommon for operators to use temperature, pH and turbidity measurements, along with years of experience, to arrive at the right chemical treatment.
This approach can work but is not as data-driven as it could be given the systems and tools available to water operators today. In addition, and not inconsequentially, there are growing concerns about how the skills and capabilities of current operators are passed down in a meaningful way as the next generation takes the reins.
Less experienced water operators may use excessive amounts of chemicals in the treatment process to be “safe” and avoid potential fines. In extreme circumstances, excessive chemical treatment can degrade the overall quality of effluent water and affect public health and safety.
Safety and cost are two critical industrial water treatment concerns. Leveraging technology and data that offer real-time insight into water quality is essential to an efficient, safe and cost-effective treatment process.
Critical Considerations When Developing a Water Monitoring Strategy
The challenges that operators face when developing a water quality strategy are far-reaching and can often seem overwhelming. It is not uncommon for operators to default to methods that “have always worked” such as handheld instrumentation, sampling and certified lab testing.
Recent technological developments have enabled operators to modernize traditional operational processes and make them more efficient and effective. These include online monitoring technology, SaaS software, cloud data warehousing, and remote access and control. With Water 4.0 top of mind for most operators, the pros and cons of these decisions weigh heavily as they determine what is next in their individual digital transformation journeys.
When developing a water quality strategy that leverages new technologies, there are several items to consider in the evaluation and selection process.
Autonomous and Remote Capability
Instrumentation that requires no manual intervention is a critical component of any water quality monitoring strategy, particularly as operators expand their operations and plan resources through disruptions like the COVID-19 pandemic, which have made digital water a priority in planning for growth, survival and business sustainability.
The benefits of an autonomous solution are many and include the opportunity to:
• Eliminate the time and expense associated with manual testing.
• Eliminate operator error for compliance and process.
• Increase the frequency of testing for each parameter without a cost penalty.
• Improve measurement reliability and consistency.
• Proactively learn water composition to allow for efficient dosing, remediation and management across applications.
In addition to autonomous operation, remote monitoring is an important consideration because it lets water operators interact with instrumentation bi-directionally and focus on what they do best rather than waiting for test results or aggregating and analyzing data. Remote monitoring solutions also often include configurable, threshold-based alerting workflows and integrated service events exposed through application program interface (API) capabilities, so customers can design plant operations around best-of-breed technologies that are interoperable, as in the sketch below.
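Here is a minimal sketch of such a threshold-based alerting workflow in Python, assuming a generic webhook receiver; the parameter bands and URLs are illustrative assumptions, not regulatory limits or a particular product's configuration.

```python
# Sketch of a threshold-based alerting workflow. The operating bands and
# webhook endpoint are hypothetical, not regulatory values.
import requests

ALERT_WEBHOOK = "https://example.com/hooks/water-alerts"  # hypothetical

# Hypothetical alert thresholds, keyed by parameter name.
THRESHOLDS = {
    "free_chlorine_mg_l": (0.2, 4.0),  # (min, max) operating band
    "ph": (6.5, 8.5),
    "turbidity_ntu": (0.0, 1.0),
}

def check_reading(parameter: str, value: float) -> None:
    """Compare a reading to its configured band and notify on a breach."""
    low, high = THRESHOLDS[parameter]
    if not low <= value <= high:
        requests.post(
            ALERT_WEBHOOK,
            json={"parameter": parameter, "value": value,
                  "band": [low, high], "action": "notify_operator"},
            timeout=10,
        )

check_reading("ph", 9.1)  # out of band: fires an operator notification
```

Because the alert is delivered over a plain webhook, the same event can trigger an operator page, a work order or a dosing adjustment without changes to the instrument itself.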
Data-driven and Interoperable
In today’s environment, there is no reason why an operation cannot leverage data-driven solutions to inform (or make) operational decisions. However, with so many solutions generating data, it is nearly impossible to know what data is needed, how it might interact with other systems and how it might be leveraged in the future. The goal is not to be data rich and information poor, but to take an outcome-driven approach to data design.
As a result, when mapping out a water quality strategy, a primary consideration must be the interoperability of those systems. Standalone systems create data silos that limit the capability of the overall solution. Implementing solutions with an API that can interact with all types of external data processing systems and services is critical; a simple illustration of that kind of integration appears below.
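To make the point concrete, this sketch reads the latest results from one monitoring platform's API and forwards them to a separate analytics service. Both endpoints and the JSON shapes are hypothetical assumptions; the point is that an open API reduces integration to a few lines of glue code.

```python
# Sketch of API-level interoperability: read the latest results from a
# monitoring platform and forward them to a separate analytics system.
# Both endpoints and the JSON shapes are hypothetical assumptions.
import requests

MONITOR_API = "https://example.com/api/v1/sites/plant-07/latest"  # hypothetical
ANALYTICS_API = "https://example.net/ingest/water-quality"        # hypothetical

def sync_latest_readings() -> int:
    """Copy the newest readings into the downstream analytics platform."""
    # Assumed to return a JSON list of reading objects.
    readings = requests.get(MONITOR_API, timeout=10).json()
    response = requests.post(ANALYTICS_API, json=readings, timeout=10)
    response.raise_for_status()
    return len(readings)

if __name__ == "__main__":
    count = sync_latest_readings()
    print(f"Forwarded {count} readings to the analytics platform")
```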
Custom Configuration & Range of Parameters
Given the current marketplace for water quality instrumentation, an important consideration is the breadth of parameters a particular solution will measure. Many of the technologies available today measure only one parameter (or a small subset of parameters).
In a perfect world, a singular, autonomous solution would measure every parameter necessary in an operator’s environment. Finding solutions that support an array of parameters and can be configured to meet an operation’s specific requirements independent of the water source or category is ideal.
About the Author: Meena Sankaran is the founder and CEO of KETOS. KETOS’ mission is to transform the water industry with the goal of making water safer and more sustainable for future generations. KETOS has automated mission-critical testing and monitoring processes that were traditionally manual, slow and expensive. Sankaran has a bachelor’s degree in electronics engineering from India and a master’s in electrical engineering from the University of Texas at Arlington (UTA).