A portal for the analysis of water
Regular monitoring: The quality of drinking water is regularly checked by the waterworks, and recently artificial intelligence has been helping to evaluate the information. Photo: Anderson Rian Klwak/Unsplash
Clean water is essential for life, which makes it all the more important to know what is in it. After all, industry is constantly experimenting with new chemicals, agriculture with fertilisers and pesticides, and nature itself regularly produces new germs and biological substances. But how do you track down the unknown? This is the question addressed by the K2I project, funded by the Federal Ministry of Education and Research (BMBF) (funding code 02WDG1593 A-D), which brings together the Leibniz Supercomputing Centre (LRZ), the Water Technology Centre (TZW), the Landeswasserversorgung Langenau (LW) and the Technical University of Munich (TUM); laboratories of other water suppliers are involved as associated partners. The goal of K2I is a portal with databases in which laboratories and waterworks can store analysis data on water quality: "We are working together on a cloud solution in which water data is recorded in a standardised way and evaluated using artificial intelligence methods," says Viktoria Pauw, a member of the LRZ research team. "We are also developing workflows and analytics algorithms for this."
Collective and artificial intelligence
Utilities need to know what is in the water and how to treat it: "To set limits and determine water quality, laboratories use target analyses to examine water samples and measure the levels of known substances such as calcium, pesticides, pharmaceuticals or pathogens," explains Uwe Müller, who holds a doctorate in engineering and chemistry, from the TZW, an institution of the Deutscher Verein des Gas- und Wasserfaches (DVGW). "The challenge, however, lies in non-target screenings, which detect unknown or unexpected substances in the water." Non-target screenings (NTS), searches without a defined target, are considered the supreme discipline of environmental analytics; for them, liquid chromatographs are usually coupled to high-resolution mass spectrometers. Together, the two instruments measure the molecular masses of the water's constituents and thus allow individual substances to be identified. This method is expensive and time-consuming, which is why it is rarely used; it also produces data volumes of 300 megabytes and more per measurement and substance, volumes that individual laboratories can hardly analyse in detail. "It works better collectively," says Tobias Bader, a chemist with a doctorate at the Landeswasserversorgung in Langenau. "Until now, it was hardly possible to reconcile all the interests of the water industry; the waterworks usually find it difficult to share their analysis data. Gradually, however, a rethink is setting in, because together we can achieve more."
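The distinction between target analysis and non-target screening can be sketched in a few lines of code. This is purely illustrative: the substance names, reference masses and the 5 ppm tolerance below are made-up example values, not data or methods from the K2I project.

```python
# Illustrative sketch: separating known ("target") substances from
# unknown features in a mass-spectrometry peak list.
# All masses and tolerances are invented example values.

# Hypothetical reference list of known substance masses (in daltons)
KNOWN_SUBSTANCES = {
    "atrazine": 215.0938,
    "carbamazepine": 236.0950,
    "diclofenac": 295.0162,
}

def classify_features(measured_masses, tolerance_ppm=5.0):
    """Split measured masses into target hits and non-target candidates."""
    targets, non_targets = [], []
    for mass in measured_masses:
        match = None
        for name, ref in KNOWN_SUBSTANCES.items():
            # relative mass error in parts per million
            if abs(mass - ref) / ref * 1e6 <= tolerance_ppm:
                match = name
                break
        if match:
            targets.append((mass, match))
        else:
            non_targets.append(mass)
    return targets, non_targets

# Two masses match known substances; the third is a candidate
# for closer non-target investigation.
hits, unknowns = classify_features([215.0940, 310.1772, 236.0949])
```

A target analysis only ever reports the entries in the reference list; everything that lands in `non_targets` is exactly the "unknown or unexpected" material that non-target screening is about.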
The planned portal will therefore collect sample data from different laboratories and water utilities, gathering enough analysis results from target and non-target screenings for artificial intelligence (AI) systems to search them for known patterns or anomalies. "We discover similarities and differences in water quality more easily this way," Müller says. The search for unknown patterns could also be accelerated, and the sources of harmful substances pinpointed more quickly. Not for nothing does the project name K2I stand for "Artificial and collective intelligence in trace substance tracking in surface water": "Together we can either focus the evaluations on important substances or better assess the relevance of new patterns or trace substances," says Bader. "If a certain pattern also appears in the screenings of other laboratories and waterworks, the substance is widespread. If I only find it in my own dataset, we commission targeted sampling and search for the regional source."
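Bader's reasoning can be sketched as a simple tally across laboratories. The lab names and features (here simplified to rounded masses) are invented for illustration, and the threshold for "widespread" is an arbitrary example, not a project criterion.

```python
# Sketch of the cross-laboratory reasoning described above: a feature
# seen by several labs suggests a widespread substance; one seen only
# locally points to a regional source. All values are invented.

from collections import Counter

lab_features = {
    "lab_A": {215.09, 310.18, 402.25},
    "lab_B": {215.09, 118.07},
    "lab_C": {215.09, 310.18},
}

def assess_spread(lab_features, widespread_threshold=2):
    """Count in how many labs each feature appears and split accordingly."""
    counts = Counter()
    for features in lab_features.values():
        counts.update(features)
    widespread = {f for f, n in counts.items() if n >= widespread_threshold}
    local = {f for f, n in counts.items() if n == 1}
    return widespread, local

widespread, local = assess_spread(lab_features)
# 215.09 appears in all three labs, 310.18 in two -> widespread;
# 402.25 and 118.07 appear once each -> candidates for regional tracing
```

The point of pooling data in a shared portal is that this comparison becomes possible at all: a single laboratory can never tell from its own dataset whether a pattern is regional or widespread.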
Processing and standardising data
The K2I teams have been working on their tasks for a good year. Especially the handling of the data required a lot of experience and ideas at the beginning: "We had few samples; non-target screenings are expensive, and we carry out about 100 per year. But at least 1,000 would be needed to train an artificial intelligence and obtain results," Bader reports. "We therefore asked other institutes for data." The Federal Institute of Hydrology (BfG) contributed analytical data, as did the Bavarian Landesamt für Umwelt and the North Rhine-Westphalian Landesamt für Natur, Umwelt und Verbraucherschutz. However, because laboratories work with their own equipment and methods, the information and measured values must be harmonised, standardised and documented. Uniform workflows and storage processes now help standardise the data, as does software from Envibee, a Zurich start-up and also a project partner: "If we succeed in making the data formats comparable, that's a big leap forward," explains Pauw. Artificial neural networks could then be trained to recognise patterns and assign substances.
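What harmonising heterogeneous lab exports might look like can be sketched as a mapping onto one common schema. The two "lab formats" below, their field names and their units are hypothetical examples; the real K2I formats and the Envibee software will differ.

```python
# Sketch of harmonising heterogeneous lab records into one schema.
# Field names and units of the two "lab formats" are invented; the
# point is mapping differing columns and units onto one standard.

def from_lab_a(record):
    # hypothetical lab A: mass in Da, concentration already in µg/L
    return {
        "substance": record["name"],
        "mass_da": record["mass"],
        "conc_ug_per_l": record["conc_ugl"],
    }

def from_lab_b(record):
    # hypothetical lab B: concentration reported in ng/L -> convert
    return {
        "substance": record["analyte"],
        "mass_da": record["monoisotopic_mass"],
        "conc_ug_per_l": record["conc_ngl"] / 1000.0,
    }

harmonised = [
    from_lab_a({"name": "atrazine", "mass": 215.0938, "conc_ugl": 0.08}),
    from_lab_b({"analyte": "atrazine", "monoisotopic_mass": 215.0938,
                "conc_ngl": 95.0}),
]
# both records now share the same keys and units and can be compared
```

Only after such a mapping do records from different laboratories become comparable, which is the precondition for training neural networks on the pooled data.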
The first databases are already filling up with results from samples and screenings, and the teams are already experimenting with algorithms for smart water analysis. "In the first year we worked more conceptually, but now we're really getting into practice," says chemist Bader happily. "The processing of the data is running stably on the servers, we are continuously receiving new data, and I am confident that we will have the technology and workflows for the water portal developed by next year." An even bigger goal is already in sight: a water portal for the whole of Germany. Other comparable research projects analysing water with artificial intelligence are underway in the country. If their results and technologies are brought together, a supra-regional testing and warning system could emerge, along with a treasure trove of data that is likely to inspire researchers as well. (vs)