Computer science

Pipe Dreams used artificial intelligence and online modelling to monitor water quality in real time.

Figure 7: Machine learning applications

Pipe Dreams aimed to use data derived from field and laboratory experiments to develop techniques, tools and models that quantify the interactions between the physical, chemical and biological processes occurring in distribution systems, and hence to assess how water quality would change. Key to this was the development of hybrid artificial intelligence and online modelling tools that integrate structural and water quality performance into a suite of near real-time operation and management tools.

Artificial intelligence techniques were used to derive, from the experimental data, a library of composite hydrological and geochemical 'fingerprints' across the measured parameters that could be associated with detailed molecular microbial analysis. When subsequently deployed in the field, the techniques were reversed to recognise these fingerprints and infer the microbial and chemical changes affecting water quality.
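
As a minimal sketch of this fingerprint idea (the parameter set, library values and nearest-neighbour matching below are illustrative assumptions, not the project's actual method):

```python
import numpy as np

# Hypothetical fingerprint library: each entry maps a labelled water-quality
# condition to a vector of sensor parameters (turbidity, conductivity, pH,
# chlorine residual). Values and labels are illustrative only.
FINGERPRINTS = {
    "baseline":       np.array([0.2, 450.0, 7.4, 0.8]),
    "discolouration": np.array([4.5, 460.0, 7.3, 0.6]),
    "ingress_event":  np.array([1.8, 900.0, 7.9, 0.1]),
}

def match_fingerprint(sample: np.ndarray) -> str:
    """Reverse recognition: infer the most likely condition for a new
    field sample by nearest-neighbour distance to the library entries."""
    # Normalise each parameter so no single unit scale dominates the distance.
    scale = np.max(np.stack(list(FINGERPRINTS.values())), axis=0)
    best_label, best_dist = None, np.inf
    for label, fp in FINGERPRINTS.items():
        dist = np.linalg.norm((sample - fp) / scale)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# A new sensor reading from the field: raised turbidity, low chlorine.
print(match_fingerprint(np.array([4.1, 455.0, 7.2, 0.5])))  # -> "discolouration"
```

Normalising each parameter before computing distances keeps the differently scaled units (NTU, µS/cm, pH, mg/l) from dominating the match.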

Pattern recognition and machine learning techniques for classification were utilised, drawing on a comprehensive toolbox of methodologies including classical linear regression, binary discrimination or classification, neural networks, kernel methods (such as support vector machines), Bayesian graphical models, variational inference, Monte Carlo sampling methods, hidden Markov models, pattern matching and fusion of classifiers.
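
A hedged example of one technique from this toolbox, a kernel method (support vector machine), applied to synthetic water-quality features; the feature set, class labels and scikit-learn usage are assumptions for illustration only:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic training set: rows are sensor-parameter vectors
# (turbidity, conductivity, chlorine residual); labels are event classes.
rng = np.random.default_rng(0)
normal = rng.normal([0.3, 450, 0.8], [0.1, 20, 0.1], size=(50, 3))
event = rng.normal([3.0, 600, 0.3], [0.8, 50, 0.1], size=(50, 3))
X = np.vstack([normal, event])
y = np.array([0] * 50 + [1] * 50)  # 0 = normal, 1 = water-quality event

# Scale the features, then fit an RBF-kernel support vector machine.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)

# Classify a new reading from the network.
print(clf.predict([[2.5, 580, 0.35]]))  # -> [1]
```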

This innovative combination of the latest inline instrumentation and artificial intelligence enabled near real-time data acquisition for decision making through the envisioned modelling and management tools. Future operational applications will improve on the quasi real-time data currently provided, allowing data to be processed locally or remotely to generate exception alerts or other information that draws on more than one signal.
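
One way such multi-signal exception alerting could work is sketched below; the rolling-baseline z-score scheme, window size and thresholds are assumptions, not the project's design:

```python
import numpy as np
from collections import deque

class MultiSignalAlert:
    """Illustrative exception-alert generator: raises an alert only when
    two or more signals deviate from their rolling baselines at once,
    reducing false alarms caused by single-sensor noise."""

    def __init__(self, n_signals: int, window: int = 60, z_threshold: float = 3.0):
        self.history = [deque(maxlen=window) for _ in range(n_signals)]
        self.z_threshold = z_threshold

    def update(self, readings: list[float]) -> bool:
        deviating = 0
        for hist, value in zip(self.history, readings):
            if len(hist) >= 10:  # need a minimal baseline before judging
                mean, std = np.mean(hist), np.std(hist)
                if std > 0 and abs(value - mean) / std > self.z_threshold:
                    deviating += 1
            hist.append(value)
        return deviating >= 2  # alert requires agreement across signals

# Feed synchronised readings (e.g. turbidity, conductivity) each time step:
#   monitor = MultiSignalAlert(n_signals=2)
#   if monitor.update([0.3, 452.0]):
#       ...  # raise an exception alert downstream
```

Requiring agreement across signals is one plausible way to realise "information involving more than one signal" while keeping single-sensor glitches from triggering alerts.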

Benefits accrued with increased coverage, but depended on progress with low-cost sensors and on communications using peer-to-peer technologies or local hubs. The capture of such data then needed to be supported by data management (using distributed middleware infrastructure such as DataTurbine) and by receiving systems that make it accessible, via remote communications, to downstream applications and to actual use in the field. Visualisation and aggregation of multiple data sources were vital to support the management of events, reducing their impact while improving the overall level of service.
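
As an illustrative sketch of aggregating multiple sensor streams into time-aligned records for downstream visualisation (the sensor names, timestamps and pandas-based alignment are assumptions, not the project's middleware):

```python
import pandas as pd

# Hypothetical readings from two remote sensor hubs, timestamped separately.
turbidity = pd.DataFrame({
    "time": pd.to_datetime(["2015-06-01 10:00:05", "2015-06-01 10:01:02"]),
    "turbidity_ntu": [0.3, 2.9],
})
chlorine = pd.DataFrame({
    "time": pd.to_datetime(["2015-06-01 10:00:08", "2015-06-01 10:01:00"]),
    "chlorine_mg_l": [0.8, 0.4],
})

# Align the nearest readings within a tolerance, so downstream visualisation
# and event-management tools see one fused record per time step.
fused = pd.merge_asof(turbidity, chlorine, on="time",
                      direction="nearest", tolerance=pd.Timedelta("10s"))
print(fused)
```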
