HydroTransNet: A Transformer-Based Model for Enhanced Water Quality Prediction

Venkata Simhadri Naidu Surapu, Kanusu Srinivasa Rao*, Ratnakumari Challa, Arepalli Peda Gopi

Department of Computer Science & Technology, Yogi Vemana University, Kadapa 516005, India

Department of Computer Science & Engineering, AP-IIIT, Rajiv Gandhi University of Knowledge Technologies, RK Valley, Kadapa 516330, Andhra Pradesh, India

Department of Computer Science & Engineering, Vignan’s Nirula Institute of Technology and Science for Women, Guntur 522009, India

Corresponding Author Email: kanususrinivas@yvu.edu.in

Page: 1250-1256 | DOI: https://doi.org/10.18280/mmep.120416

Received: 20 December 2024 | Revised: 18 February 2025 | Accepted: 25 February 2025 | Available online: 30 April 2025

© 2025 The authors. This article is published by IIETA and is licensed under the CC BY 4.0 license (http://creativecommons.org/licenses/by/4.0/).

OPEN ACCESS

Abstract: 

Water quality is critical for most aquatic species and is crucial in supporting juvenile fish. However, current approaches to water quality assessment do not provide the level of detail required for pollution classification because they fail to account for temporal dependencies between different water quality variables. This research introduces a new transformer-based model, HydroTransNet, developed to classify water quality using multi-head self-attention and positional encoding. HydroTransNet captures the dynamic interactions of water quality variables over time and provides temporal resolution of the dependencies among the various metrics at the sampling points within large and complex aquatic systems. The model architecture consists of several transformer encoder layers, layer normalization, and a fully connected layer that enhances the model's predictive accuracy for water quality parameters. HydroTransNet was tested on several datasets containing water quality data collected under different conditions. The results indicate that HydroTransNet outperforms traditional machine learning methods, achieving an accuracy of 99.1%. The proposed model has important implications for ecological monitoring: it affords a real opportunity for assessing pollution levels in freshwater ecosystems and contributes to the improvement of environmental quality and resources. Because of HydroTransNet's effectiveness in observing critical water parameters, it presents an effective model for real-time environmental observation, with applications in the preservation and rehabilitation of natural aquatic habitats.

Keywords: 

deep learning, water quality, transformers, aquaculture, HydroTransNet

1. Introduction

Assessment of water quality in rivers, lakes, and oceans is one of the most important indicators of ecological status, as it affects fish and other aquatic inhabitants [1]. These ecosystems are disturbed by pollution and contamination, which results in a decrease in species and creates conditions that can affect entire food chains [2]. Monitoring of water quality [3-8] is crucial to the preservation of the affected ecosystems, but the traditional methods of water sampling and laboratory analysis are time-consuming and do not provide real-time data [9-14]. These methods do not easily fit the dynamic and variable nature of natural water bodies, especially those influenced by seasonal changes, anthropogenic impacts, and the effects of urbanization [15-18]. Hydroinformatics has also been applied to supplement traditional water quality monitoring through the development of models that provide outlooks based on historical data. While these models help improve the efficiency of monitoring, they have shortcomings in describing the behavior of multiple water quality parameters and their interactions [19]. Standard approaches to data analysis based on machine learning and statistical methods are unable to identify these connections because variability and interdependence are high in real-world aquatic environments [20-24].

The latest advances in deep learning [25, 26], including the transformer model, open new opportunities for overcoming these limitations. Transformer models were initially introduced for processing sequential data in natural language processing and are particularly effective at capturing patterns and dependencies in time-series data, which makes them well suited to capturing temporal relationships within water quality metrics. To this end, we propose HydroTransNet, a transformer-based classifier for water quality classification based on the temporal dependencies of environmental features. HydroTransNet incorporates multi-head self-attention mechanisms and positional encoding to improve the interpretability of the temporal features of water quality data and offers a more flexible and robust solution for environmental monitoring. By taking into account the temporal order of observations, HydroTransNet provides a precise and adaptable system for long-term water quality monitoring, with the potential to improve water quality monitoring activities and, in turn, ecosystem health. Section 2 includes a detailed review of existing research in the field. Section 3 explains the proposed model in detail, introducing its framework, methodology, and important system components. Section 4 provides an extensive examination of the results together with a discussion that interprets the findings and their implications. Section 5 concludes by summarizing the major findings and recommending directions for future research.

2. Related Work

Several techniques have been proposed to predict water quality. For example, Arepalli et al. [6] employed a gated recurrent unit (GRU) model to forecast water quality for salmon fish, which helps in understanding and managing water quality for salmon farming. Other research focused on specific aspects of water quality: Ubah et al. [7] used AI to predict the factors influencing irrigation water quality, and Talukdar et al. [8] developed a model to predict the overall health of aquatic ecosystems. Other sectors, such as fish farming or aquaculture, also benefit from such developments. Li et al. [9] put forward a smart system that regulates the level of ammonia in fishponds, which in turn increased fish growth rates. Metin et al. [10] and Nagaraju et al. [11] proposed methods to predict toxic compounds such as ammonia in aquaculture water in order to improve its quality for fish farming, including an intelligent approach to accurately predict ammonia levels in real time, aiding fish health and productivity. Yu et al. [12] employed modern computational methods for better conservation and utilization of water. Panwar et al. [13] proposed AquaVision, a deep transfer learning model that automates the detection of waste in bodies of water and achieved improved accuracy on the AquaSat dataset.

Nasir et al. [14] applied deep learning to detect types of water bodies using optical variables and ensembling, while Chen et al. [15] investigated defense responses to ammonia stress in freshwater turtles. Huu and Duc [16] developed an ammonium monitoring system for aquaculture with up to 80% accuracy using deep learning image processing and Internet of Things technologies. Wang et al. [17] applied deep-learning-based monitoring and early warning methods for ammonia nitrogen prediction in rivers, drawing on earlier work such as that of Chen et al. [15]. Arepalli and Naik [18] pointed out that fish farming can cause serious pollution in water bodies and affect aquatic ecosystems and organisms; it is therefore important to evaluate water quality accurately to prevent further degradation. They noted that traditional methods are expensive, time-consuming, and prone to errors, and therefore suggested integrating AI, IoT, and data analytics into the evaluation process. In their framework, data are collected using IoT devices and an attention-based Ordinary Differential Equation Gated Recurrent Unit (AODEGRU) model is applied to achieve better classification accuracy. Their experiments show accuracy rates considerably higher than those of existing methods.

Arepalli and Naik [19] proposed a water quality classification framework based on a Dilated Spatial-temporal Convolution Neural Network (DSTCNN) model, identifying both the shortcomings of traditional aquaculture water quality monitoring procedures and the difficulties that deep learning models encounter, including overfitting and interpretation barriers. Real-time monitoring through IoT systems has enhanced assessment capacity, yet accuracy and generalization capabilities still need improvement.

3. Proposed Work

The model takes as input a series of water quality measurements $X=\{x_1, x_2, \ldots, x_T\}$, where T is the length of the sequence and each feature vector $x_t$ contains different water quality parameters. The output of the model is a predicted class that indicates the quality of the water.

HydroTransNet is a transformer-based architecture that processes water quality information to identify temporal dependencies effectively. The model is built from a stack of transformer encoder layers with multi-head self-attention units that examine the connections between water quality measurements across different time intervals. The architecture contains six encoder layers, and each layer has eight attention heads that simultaneously examine different aspects of the input data. An embedding dimension of 512 produces representations that deliver significant features while remaining computationally efficient. Positional encoding is incorporated to preserve the sequential order of the information, since transformers do not by themselves model sequence arrangement. The model gains stability through layer normalization together with dropout regularization, which prevents overfitting. The final layer converts the extracted features into classification labels through a fully connected layer. This architectural design allows HydroTransNet to surpass traditional machine learning models because it succeeds at modeling the complex and dynamic nature of water quality variations.
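For concreteness, the following is a minimal PyTorch sketch (not the authors' implementation) of the architecture described above: a linear embedding, sinusoidal positional encoding, six encoder layers with eight attention heads and an embedding dimension of 512, dropout and layer normalization inside each encoder layer, global average pooling, and a fully connected classifier. The number of input features, number of classes, feed-forward width, dropout rate, and maximum sequence length are illustrative assumptions.

```python
import math
import torch
import torch.nn as nn


class SinusoidalPositionalEncoding(nn.Module):
    """Fixed sine/cosine positional encoding, Eq. (1)."""

    def __init__(self, d_model: int, max_len: int = 500):
        super().__init__()
        pe = torch.zeros(max_len, d_model)
        position = torch.arange(max_len, dtype=torch.float).unsqueeze(1)
        div_term = torch.exp(torch.arange(0, d_model, 2).float()
                             * (-math.log(10000.0) / d_model))
        pe[:, 0::2] = torch.sin(position * div_term)
        pe[:, 1::2] = torch.cos(position * div_term)
        self.register_buffer("pe", pe.unsqueeze(0))          # shape (1, max_len, d_model)

    def forward(self, x):                                     # x: (batch, T, d_model)
        return x + self.pe[:, : x.size(1)]


class HydroTransNet(nn.Module):
    """Sketch: embedding -> positional encoding -> 6 encoder layers -> pooling -> classifier."""

    def __init__(self, n_features: int, n_classes: int, d_model: int = 512,
                 n_heads: int = 8, n_layers: int = 6, dropout: float = 0.1):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)           # E = X W_e + b_e
        self.pos_enc = SinusoidalPositionalEncoding(d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dim_feedforward=4 * d_model,
            dropout=dropout, batch_first=True)                # attention + FFN + LayerNorm
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.classifier = nn.Linear(d_model, n_classes)       # fully connected output layer

    def forward(self, x):                                     # x: (batch, T, n_features)
        h = self.pos_enc(self.embed(x))
        h = self.encoder(h)
        z = h.mean(dim=1)                                     # global average pooling, Eq. (8)
        return self.classifier(z)                             # logits; softmax via the loss, Eq. (9)
```

With PyTorch's CrossEntropyLoss, the softmax of Eq. (9) is applied implicitly to the logits returned by this sketch during training.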

Compute positional encoding:

The positional encoding equations use sinusoidal functions to encode token positions, ensuring unique and smooth variations across dimensions. The combination of sine and cosine functions with different frequencies allows the model to capture both short-term and long-term positional relationships in the sequence.

$\begin{aligned} & P E_{(p o s, 2 i)}=\sin \left(\frac{p o s}{10000^{\frac{2 i}{d}}}\right) \\ & P E_{(p o s, 2 i+1)}=\cos \left(\frac{p o s}{10000^{\frac{2 i}{d}}}\right)\end{aligned}$            (1)

Here, d is the dimension of the embedding.

Map input sequence to embedding space:

o Embed the input sequence using a linear layer: $E=X W_e+b_e$

o Add positional encodings to the embedded input:

$E^{\prime}=E+P E$               (2)
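As a worked illustration of Eqs. (1) and (2), the short sketch below computes the sinusoidal encoding with the explicit 2i/2i+1 indexing and adds it to a linearly embedded toy input; the sequence length, feature count, and embedding dimension are assumptions chosen only for readability.

```python
import math
import torch
import torch.nn as nn

T, n_features, d = 4, 5, 8                      # toy sizes (assumptions)
X = torch.randn(T, n_features)                  # one sequence of water quality vectors

# Eq. (1): PE(pos, 2i) = sin(pos / 10000^(2i/d)), PE(pos, 2i+1) = cos(pos / 10000^(2i/d))
PE = torch.zeros(T, d)
for pos in range(T):
    for i in range(d // 2):
        angle = pos / (10000 ** (2 * i / d))
        PE[pos, 2 * i] = math.sin(angle)
        PE[pos, 2 * i + 1] = math.cos(angle)

embed = nn.Linear(n_features, d)                # E = X W_e + b_e
E = embed(X)
E_prime = E + PE                                # Eq. (2): E' = E + PE
print(E_prime.shape)                            # torch.Size([4, 8])
```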

Multi-head self-attention:

In the mechanism of self-attention with multiple heads, every head of attention calculates queries, keys and values by multiplying different weight matrices with input embeddings. This enables each head of attention to concentrate on various parts of information and grasp diverse connections between elements in a sequence.

For each attention head h, compute queries Q, keys K, and values V:

$\begin{aligned} & Q^h=E^{\prime} W_h^Q \\ & K^h=E^{\prime} W_h^K \\ & V^h=E^{\prime} W_h^V\end{aligned}$             (3)

Compute scaled dot-product attention:

In each attention head, the attention scores are calculated by taking the dot product of the queries and keys, scaling it by the square root of the dimension of the keys, and then applying the Softmax function to obtain normalized attention weights. These weights are used to compute a weighted sum of the values, producing the output of the attention mechanism.

Attention ${ }^h\left(Q^h, K^h, V^h\right)=\operatorname{Softmax}\left(\frac{Q^h\left(K^h\right)^T}{\sqrt{d_k}}\right) V^h$           (4)

The multi-head attention mechanism concatenates the outputs of all attention heads into one vector. This vector is then transformed by a final weight matrix to give the output of the multi-head attention layer.

Concatenate attention heads:

$\operatorname{MultiHead}(Q, K, V)=\operatorname{Concat}\left(\operatorname{head}_1, \ldots, \operatorname{head}_H\right) W^O$         (5)
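The following sketch writes Eqs. (3)-(5) out directly for a single unbatched sequence; the head count, embedding dimension, and sequence length are assumptions, and a practical implementation would normally rely on PyTorch's built-in torch.nn.MultiheadAttention rather than per-head loops.

```python
import math
import torch
import torch.nn as nn

T, d_model, H = 10, 512, 8                       # assumed sizes; d_k = d_model / H
d_k = d_model // H
E_prime = torch.randn(T, d_model)                # embedded input with positional encoding

W_Q = [nn.Linear(d_model, d_k, bias=False) for _ in range(H)]   # per-head projections
W_K = [nn.Linear(d_model, d_k, bias=False) for _ in range(H)]
W_V = [nn.Linear(d_model, d_k, bias=False) for _ in range(H)]
W_O = nn.Linear(H * d_k, d_model, bias=False)    # output projection W^O

heads = []
for h in range(H):
    Q, K, V = W_Q[h](E_prime), W_K[h](E_prime), W_V[h](E_prime)   # Eq. (3)
    scores = Q @ K.transpose(0, 1) / math.sqrt(d_k)               # scaled dot product
    attn = torch.softmax(scores, dim=-1)                          # Eq. (4)
    heads.append(attn @ V)                                        # weighted sum of values

multi_head = W_O(torch.cat(heads, dim=-1))       # Eq. (5): concatenate heads, project with W^O
print(multi_head.shape)                          # torch.Size([10, 512])
```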

Add and normalize:

The original input embeddings are added to the output of the multi-head attention mechanism in the add and normalize step (residual connection). This sum is then passed through layer normalization, which stabilizes and improves the training of the model by normalizing the output and ensuring it has a mean of zero and variance of one.

  • Add residual connection and apply layer normalization:

$\mathrm{E}^{\prime \prime}=\operatorname{LayerNorm}\left(\mathrm{E}^{\prime}+\operatorname{MultiHead}(Q, K, V)\right)$              (6)

Feed-forward network (FFN):

  • Apply position-wise FFN:

$\operatorname{FFN}(x)=\operatorname{ReLU}\left(x W_1+b_1\right) W_2+b_2$            (7)

  • Add residual connection and apply layer normalization:

Output $=$ LayerNorm $\left(\mathrm{E}^{\prime \prime}+\mathrm{FFN}\left(\mathrm{E}^{\prime \prime}\right)\right)$   
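A minimal sketch of this add-and-normalize plus feed-forward sub-block, under assumed dimensions and with a stand-in tensor for the attention output, is shown below; the feed-forward width of 2048 and the dropout rate of 0.1 are illustrative choices, not values reported by the paper.

```python
import torch
import torch.nn as nn

T, d_model, d_ff = 10, 512, 2048                # d_ff (FFN width) is an assumption
E_prime = torch.randn(T, d_model)               # embedded input E'
multi_head = torch.randn(T, d_model)            # stand-in for the multi-head attention output

norm1, norm2 = nn.LayerNorm(d_model), nn.LayerNorm(d_model)
ffn = nn.Sequential(                            # Eq. (7): ReLU(x W1 + b1) W2 + b2
    nn.Linear(d_model, d_ff),
    nn.ReLU(),
    nn.Dropout(0.1),                            # dropout between the two transformations
    nn.Linear(d_ff, d_model),
)

E_double_prime = norm1(E_prime + multi_head)            # Eq. (6): residual + layer normalization
output = norm2(E_double_prime + ffn(E_double_prime))    # residual + LayerNorm after the FFN
print(output.shape)                                     # torch.Size([10, 512])
```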

Global average pooling:

  • Apply global average pooling to the output sequence to obtain a fixed-size representation:

$z=\frac{1}{T} \sum_{t=1}^T \text{Output}_t$         (8)

Fully connected layer:

  • Pass the pooled representation through a fully connected layer to map it to the output classes:

$y=\operatorname{Softmax}(z W+b)$            (9)
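Assuming an encoder output of length T with model dimension 512 and two output classes (contaminated / not contaminated), Eqs. (8) and (9) reduce to the short sketch below.

```python
import torch
import torch.nn as nn

T, d_model, n_classes = 10, 512, 2              # assumed sizes
output = torch.randn(T, d_model)                # encoder output sequence

z = output.mean(dim=0)                          # Eq. (8): global average pooling over time
fc = nn.Linear(d_model, n_classes)              # fully connected output layer
y = torch.softmax(fc(z), dim=-1)                # Eq. (9): class probabilities
print(y)                                        # probabilities for the two classes
```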

Algorithm 1: HydroTransNet

Input: $X=\{x_1, x_2, \ldots, x_T\}$

Output: Classification of whether the water quality is contaminated for aquaculture or not

  1. Positional encoding: for each position pos in the sequence and each dimension i, compute

$\begin{gathered}P E_{(pos, 2 i)}=\sin \left(\frac{pos}{10000^{\frac{2 i}{d}}}\right) \\ P E_{(pos, 2 i+1)}=\cos \left(\frac{pos}{10000^{\frac{2 i}{d}}}\right)\end{gathered}$

  2. Input embedding:

    1. Embed the input sequence X using a linear layer: $E=X W_e+b_e$

    2. Add the positional encodings to the embedded input: $E^{\prime}=E+P E$

  3. Transformer encoder layer:

    1. Multi-head self-attention: for each attention head h, compute queries Q, keys K, and values V:

$Q^h=E^{\prime} W_h^Q, \quad K^h=E^{\prime} W_h^K, \quad V^h=E^{\prime} W_h^V$

    2. Compute scaled dot-product attention:

$\operatorname{Attention}^h\left(Q^h, K^h, V^h\right)=\operatorname{Softmax}\left(\frac{Q^h\left(K^h\right)^T}{\sqrt{d_k}}\right) V^h$

    3. Concatenate the attention heads:

$\operatorname{MultiHead}(Q, K, V)=\operatorname{Concat}\left(\operatorname{head}_1, \ldots, \operatorname{head}_H\right) W^O$

    4. Add the residual connection and apply layer normalization:

$E^{\prime \prime}=\operatorname{LayerNorm}\left(E^{\prime}+\operatorname{MultiHead}(Q, K, V)\right)$

    5. Apply the position-wise FFN:

$\operatorname{FFN}(x)=\operatorname{ReLU}\left(x W_1+b_1\right) W_2+b_2$

    6. Add the residual connection and apply layer normalization:

$\text{Output}=\operatorname{LayerNorm}\left(E^{\prime \prime}+\operatorname{FFN}\left(E^{\prime \prime}\right)\right)$

  4. Repeat step 3 for each of the stacked transformer encoder layers.

  5. Global average pooling: aggregate the output sequence into a fixed-size representation:

$z=\frac{1}{T} \sum_{t=1}^T \text{Output}_t$

  6. Fully connected layer: pass the pooled representation through a fully connected layer to map it to the output classes:

$y=\operatorname{Softmax}(z W+b)$

The output y represents the predicted class probabilities for water quality.

The water contamination analysis algorithm starts with a series of water quality measurements, each containing different parameters. Initially, positional encodings are calculated to indicate where each measurement stands in the sequence. These encodings are added to the embedded input data to create a position-aware representation. The heart of the model consists of several transformer encoder layers. Each layer has a multi-head self-attention mechanism that computes queries, keys, and values to determine which measurements are more important than others. The values are weighted by attention scores, and the outputs from all attention heads are concatenated and transformed. This is followed by adding residual connections and normalizing the results. Next, the FFN is applied to further process the data. After repeating these steps for multiple encoder layers, global average pooling aggregates the sequence into a fixed-size vector. This vector then passes through a fully connected layer that classifies whether the water quality is contaminated, with the final output representing the predicted class probabilities.
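To make this flow concrete, the sketch below runs one training step and one prediction with the HydroTransNet class sketched earlier in this section; the batch size, sequence length, feature count, learning rate, and optimizer are illustrative assumptions rather than the authors' training configuration.

```python
import torch
import torch.nn as nn

# Assumes the HydroTransNet sketch defined above, 8 water quality features,
# sequences of length 50, a batch of 32 samples, and 2 classes (contaminated / not).
model = HydroTransNet(n_features=8, n_classes=2)
criterion = nn.CrossEntropyLoss()                 # applies the softmax of Eq. (9) internally
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

x = torch.randn(32, 50, 8)                        # dummy batch: (batch, T, features)
y = torch.randint(0, 2, (32,))                    # dummy contamination labels

optimizer.zero_grad()
logits = model(x)                                 # forward pass through all encoder layers
loss = criterion(logits, y)
loss.backward()                                   # backpropagate
optimizer.step()                                  # update parameters

pred = logits.softmax(dim=-1).argmax(dim=-1)      # predicted class per sequence
```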

The FFN within the HydroTransNet architecture turns the multi-head self-attention output into improved features for analysis. After the attention mechanism retrieves the relationships among water quality parameters, the output enters a position-wise FFN, i.e., a fully connected sublayer. It consists of two successive linear transformations with ReLU as the non-linear function. The first transformation expands the embedding dimension and the second reduces it back to the input dimension, maintaining consistency with the original representation. This two-layer design lets the model develop richer abstractions of the water quality data without compromising computational efficiency.

The FFN processes each position of the sequence independently; it treats every individual data point without examining its contextual position. The contextual relationships learned by the attention mechanism therefore stay intact while the model refines the individual representations to improve classification precision. To prevent overfitting and ensure generalization across datasets, dropout regularization is applied between the two transformations. Layer normalization is applied a second time after the feed-forward operations to keep activation distributions stable, which improves training convergence. The combination of the fully connected sublayer, normalization steps, and residual connections makes HydroTransNet an accurate platform for real-world environmental monitoring applications.

4. Results and Discussion

We evaluated the proposed model against contemporary models using accuracy, precision, recall, and F1-score on publicly available datasets. To ensure reliability and robustness, we also used confusion matrix analysis. IoT-enabled sensors collected the water quality parameters from aquaculture fish ponds through continuous measurement of the essential environmental factors that sustain fish health. The sensors regularly measured pH, dissolved oxygen (DO), temperature, turbidity, conductivity, ammonia levels, nitrate concentration, and biochemical oxygen demand (BOD). Calibrated multi-parameter water quality probes were placed at various depths and locations throughout the fish ponds to capture distinct water quality patterns. Data communication between the sensors and cloud storage occurred through wireless technologies, including LoRa, Wi-Fi, and GSM, for real-time transmission. The system performed automated data recording, programmed anomaly detection, and pattern evaluation to recognize water quality problems early.
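A minimal sketch (not the authors' evaluation code) of how the reported metrics and the confusion matrix can be computed with scikit-learn from true and predicted contamination labels is given below; the label vectors are hypothetical.

```python
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, confusion_matrix)

y_true = [1, 0, 1, 1, 0, 1, 0, 0]   # hypothetical labels: 1 = contaminated, 0 = not
y_pred = [1, 0, 1, 0, 0, 1, 0, 0]   # hypothetical model predictions

print("Accuracy :", accuracy_score(y_true, y_pred))
print("Precision:", precision_score(y_true, y_pred))
print("Recall   :", recall_score(y_true, y_pred))
print("F1-score :", f1_score(y_true, y_pred))
print("Confusion matrix:\n", confusion_matrix(y_true, y_pred))
```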

Data preprocessing techniques applied to the dataset included noise reduction, removal of missing values, and normalization to maintain dataset reliability and consistency. The data were structured in CSV or JSON formats before being placed in a public repository for easy tool integration and access. The dataset gained additional value for research and predictive modeling through the recording of essential metadata such as sensor calibration specifications, environmental conditions, and timestamps. This dataset serves as an important resource that helps researchers, aquaculture managers, and environmental scientists build predictive models that optimize water quality control and create data-driven, sustainable strategies for improving fish health within aquaculture systems.
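The sketch below illustrates these preprocessing steps under stated assumptions: the CSV file name, column names, smoothing window, and sequence length are hypothetical, and min-max scaling stands in for whichever normalization the original pipeline used.

```python
import numpy as np
import pandas as pd

SEQ_LEN = 50                                        # assumed sequence length T
FEATURES = ["ph", "do", "temperature", "turbidity",
            "conductivity", "ammonia", "nitrate", "bod"]

df = pd.read_csv("pond_readings.csv", parse_dates=["timestamp"])  # hypothetical file
df = df.sort_values("timestamp")
df[FEATURES] = df[FEATURES].interpolate().bfill()                  # fill missing sensor readings
df[FEATURES] = df[FEATURES].rolling(3, min_periods=1).mean()       # simple noise smoothing

# Min-max normalization per feature
mins, maxs = df[FEATURES].min(), df[FEATURES].max()
df[FEATURES] = (df[FEATURES] - mins) / (maxs - mins)

# Slice the time series into non-overlapping sequences of length SEQ_LEN for the model
values = df[FEATURES].to_numpy(dtype=np.float32)
n_seq = len(values) // SEQ_LEN
sequences = values[: n_seq * SEQ_LEN].reshape(n_seq, SEQ_LEN, len(FEATURES))
print(sequences.shape)                              # (num_sequences, T, n_features)
```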

The following subsections present and discuss the results in detail.

Figure 1 illustrates the accuracy of four models (DSTCNN, the attention-based AODEGRU, the temporal fusion transformer (TFT), and HydroTransNet) across multiple epochs. The accuracy of HydroTransNet increases consistently and reaches 99.1% by the 100th epoch, which shows its strong learning ability and robustness. TFT also performs well with an accuracy of 98%, but falls slightly behind HydroTransNet. AODEGRU starts well with a peak performance of 94.9% but fluctuates around this value in later epochs, which indicates that it may be sensitive to changes in the data. DSTCNN improves steadily but plateaus at around 94.7%, suggesting that its feature learning has saturated.

Figure 1. Accuracy

Figure 2 compares the precision of the models, where HydroTransNet again leads, achieving a precision of 98% by the final epoch. This means that HydroTransNet makes fewer false positive predictions than the other models and can be relied upon when high confidence in positive predictions is required. TFT also performs strongly, with a final precision of 96%, closely following HydroTransNet. AODEGRU and DSTCNN exhibit lower precision, indicating a higher tendency toward false positives, especially during the later stages of training. The higher precision shown by HydroTransNet is important in critical applications where false positives are expensive.

Figure 2. Precision

In Figure 3, the recall values of the models are plotted. HydroTransNet achieves the highest recall of 99% by the 100th epoch, which shows its capability to detect true positives, i.e., its high sensitivity in finding contaminated water (the target class). TFT also performs well, maintaining a recall of around 97%, meaning that most positive instances are captured by this model, making it reliable as well. The slightly lower recall shown by AODEGRU and DSTCNN implies that they may miss more true positives than HydroTransNet. The high recall of HydroTransNet is particularly useful when all positive cases must be detected.

Figure 3. Recall

Figure 4 presents the F1-score, a measure that balances precision and recall and therefore provides an overall assessment of how well each model performs. HydroTransNet achieves the highest F1-score of 98% at the end of training, indicating balanced and superior performance in both precision and recall. TFT also shows strong performance with a final F1-score of 96%, but still does not outperform HydroTransNet. AODEGRU and DSTCNN have slightly lower F1-scores, meaning that while they perform well on either precision or recall individually, their effectiveness is not as balanced as that of HydroTransNet. The higher F1-score of HydroTransNet highlights its general superiority in classification tasks.

Figure 4. F1-score

Figure 5 shows the loss values of each model during training; lower loss represents better convergence and overall performance. The most significant drop in loss is observed for HydroTransNet, which reaches a minimum value of 0.08 by the 100th epoch, showing its efficiency in learning and optimization. TFT and AODEGRU also exhibit a steady decrease in loss, but their final values are slightly higher than those of HydroTransNet, implying less efficient convergence. DSTCNN ends up with a relatively higher loss although it improves throughout training, which may indicate overfitting or difficulty with complex patterns in the data. The lower loss achieved by HydroTransNet reflects stronger training robustness and an ability to generalize well to unseen data.

Figure 5. Loss

Figure 6 compares the confusion matrices of the existing models and the proposed HydroTransNet; the proposed model is better at detecting water contamination. It has almost perfect accuracy with only one false negative and no false positives, which means that it identifies both contaminated and non-contaminated samples correctly. This combination of precision and recall makes HydroTransNet the most dependable model in settings where accurate detection is important for preventing contamination risks. The Deep Convolutional Neural Network (DCNN) and AODEGRU also have their own strengths, but they come with weaknesses as well. DCNN has more false negatives than HydroTransNet, which may result in missed contamination events, while AODEGRU shows slightly better sensitivity with fewer false negatives than DCNN but records higher rates of false positives together with lower counts of true negatives, showing signs of over-predicting contamination. TFT takes a middle ground with a higher number of true negative detections, but this comes at the cost of sensitivity, because there are more false negatives, which matters in contamination-sensitive environments. These limitations highlight why HydroTransNet was designed to provide better accuracy than the existing models while remaining reliable enough to address their weaknesses.

Figure 6. Confusion matrix analysis

5. Conclusion

This research paper proposes HydroTransNet, a novel transformer-based model for water quality classification. In contrast to traditional approaches, HydroTransNet identifies temporal dependencies between the water quality parameters effectively and delivers superior performance. The model can represent the spatial and temporal dynamics of water bodies by using multi-head self-attention mechanisms and positional encoding. HydroTransNet has demonstrated the ability to achieve near-perfect results with an accuracy of 99.1%, which means that it could change the way water quality is assessed and monitored. The model can be employed by environmental agencies and researchers to analyze and control aquatic environments. With the water quality information provided by HydroTransNet, aquatic life such as fish, and the general health of water bodies, can be protected. Furthermore, the performance of the model in predicting the state of water indicates that it could support decision-making concerning water resource management, pollution control, and ecosystem conservation. This paper paves the way for subsequent studies that explore the applicability of transformer-based models to other environmental monitoring data sources, such as satellite imagery and deeper historical records.

  References

[1] Aghel, B., Rezaei, A., Mohadesi, M. (2019). Modeling and prediction of water quality parameters using a hybrid particle swarm optimization-neural fuzzy approach. International Journal of Environmental Science and Technology, 16(8): 4823-4832. https://doi.org/10.1007/s13762-018-1896-3

[2] Ahmed, U., Mumtaz, R., Anwar, H., Shah, A.A., Irfan, R., García-Nieto, J. (2019). Efficient water quality prediction using supervised machine learning. Water, 11(11): 2210. https://doi.org/10.3390/w11112210

[3] Bisht, A.K., Singh, R., Bhatt, A., Bhutiani, R. (2018). Development of an automated water quality classification model for the River Ganga. In Smart and Innovative Trends in Next Generation Computing Technologies: Third International Conference, NGCT 2017, Dehradun, India, pp. 190-198. https://doi.org/10.1007/978-981-10-8657-1_15

[4] Cao, S., Zhou, L., Zhang, Z. (2021). Prediction of dissolved oxygen content in aquaculture based on clustering and improved ELM. IEEE Access, 9: 40372-40387. https://doi.org/10.1109/ACCESS.2021.3064029

[5] Central Pollution Control Board (CPCB). (2019). Water Quality Standards. https://cpcb.nic.in/wqstandards/, accessed on May 5, 2021.

[6] Arepalli, P.G., Akula, M., Kalli, R.S., Kolli, A., Popuri, V.P., Chalichama, S. (2022). Water quality prediction for salmon fish using gated recurrent unit (GRU) model. In 2022 Second International Conference on Computer Science, Engineering and Applications (ICCSEA), Gunupur, India, pp. 1-5. https://doi.org/10.1109/ICCSEA54677.2022.9936539

[7] Ubah, J.I., Orakwe, L.C., Ogbu, K.N., Awu, J.I., Ahaneku, I.E., Chukwuma, E.C. (2021). Forecasting water quality parameters using artificial neural network for irrigation purposes. Scientific Reports, 11(1): 24438. https://doi.org/10.1038/s41598-021-04062-5

[8] Talukdar, S., Ahmed, S., Naikoo, M.W., Rahman, A., Mallik, S., Ningthoujam, S., Bera, S., Ramana, G.V. (2023). Predicting lake water quality index with sensitivity-uncertainty analysis using deep learning algorithms. Journal of Cleaner Production, 406: 136885. https://doi.org/10.1016/j.jclepro.2023.136885

[9] Li, H.C., Yu, K.W., Lien, C.H., Lin, C., Yu, C.R., Vaidyanathan, S. (2023). Improving aquaculture water quality using dual-input fuzzy logic control for ammonia nitrogen management. Journal of Marine Science and Engineering, 11(6): 1109. https://doi.org/10.3390/jmse11061109

[10] Metin, A., Kasif, A., Catal, C. (2023). Temporal fusion transformer-based prediction in aquaponics. The Journal of Supercomputing, 79(17): 19934-19958. https://doi.org/10.1007/s11227-023-05389-8 

[11] Nagaraju, T.V., Sunil, B.M., Chaudhary, B., Prasad, C.D., Gobinath, R. (2023). Prediction of ammonia contaminants in the aquaculture ponds using soft computing coupled with wavelet analysis. Environmental Pollution, 331: 121924. https://doi.org/10.1016/j.envpol.2023.121924

[12] Yu, H., Yang, L., Li, D., Chen, Y. (2021). A hybrid intelligent soft computing method for ammonia nitrogen prediction in aquaculture. Information Processing in Agriculture, 8(1): 64-74. https://doi.org/10.1016/j.inpa.2020.04.002

[13] Panwar, H., Gupta, P.K., Siddiqui, M.K., Morales-Menendez, R., Bhardwaj, P., Sharma, S., Sarker, I.H. (2020). AquaVision: Automating the detection of waste in water bodies using deep transfer learning. Case Studies in Chemical and Environmental Engineering, 2: 100026. https://doi.org/10.1016/j.cscee.2020.100026

[14] Nasir, N., Kansal, A., Alshaltone, O., Barneih, F., Shanableh, A., Al-Shabi, M., Al Shammaa, A. (2023). Deep learning detection of types of water-bodies using optical variables and ensembling. Intelligent Systems with Applications, 18: 200222. https://doi.org/10.1016/j.iswa.2023.200222

[15] Chen, X., Li, M., Niu, C. (2022). Diverse defense responses to ammonia stress in three freshwater turtles. Aquaculture, 546: 737302. https://doi.org/10.1016/j.aquaculture.2021.737302

[16] Huu, P.N., Duc, H.N. (2021). Propose an automatic ammonia concentration of water measuring system combining image processing for aquaculture. Journal Européen des Systèmes Automatisés, 54(3): 453-460. https://doi.org/10.18280/jesa.540308

[17] Wang, X., Qiao, M., Li, Y., Tavares, A., Qiao, Q., Liang, Y. (2023). Deep-learning-based water quality monitoring and early warning methods: A case study of ammonia nitrogen prediction in rivers. Electronics, 12(22): 4645. https://doi.org/10.3390/electronics12224645

[18] Arepalli, P.G., Naik, K.J. (2024). Water contamination analysis in IoT enabled aquaculture using deep learning based AODEGRU. Ecological Informatics, 79: 102405. https://doi.org/10.1016/j.ecoinf.2023.102405

[19] Arepalli, P.G., Naik, K.J. (2024). An IoT based smart water quality assessment framework for aqua-ponds management using Dilated Spatial-temporal Convolution Neural Network (DSTCNN). Aquacultural Engineering, 104: 102373. https://doi.org/10.1016/j.aquaeng.2023.102373

[20] Peterson, K.T., Sagan, V., Sloan, J.J. (2020). Deep learning-based water quality estimation and anomaly detection using Landsat-8/Sentinel-2 virtual constellation and cloud computing. GIScience & Remote Sensing, 57(4): 510-525. https://doi.org/10.1080/15481603.2020.1738061

[21] Wang, Y., Zhou, J., Chen, K., Wang, Y., Liu, L. (2017). Water quality prediction method based on LSTM neural network. In 2017 12th International Conference on Intelligent Systems and Knowledge Engineering (ISKE), Nanjing, China, pp. 1-5. https://doi.org/10.1109/ISKE.2017.8258814

[22] Singha, S., Pasupuleti, S., Singha, S.S., Kumar, S. (2020). Effectiveness of groundwater heavy metal pollution indices studies by deep-learning. Journal of Contaminant Hydrology, 235: 103718. https://doi.org/10.1016/j.jconhyd.2020.103718

[23] Pyo, J., Hong, S.M., Jang, J., Park, S., Park, J., Noh, J. H., Cho, K.H. (2022). Drone-Borne sensing of major and accessory pigments in algae using deep learning modeling. GIScience & Remote Sensing, 59(1): 310-332. https://doi.org/10.1080/15481603.2022.2027120

[24] Shams, M.Y., Elshewey, A.M., El-Kenawy, E.S.M., Ibrahim, A., Talaat, F.M., Tarek, Z. (2024). Water quality prediction using machine learning models based on grid search method. Multimedia Tools and Applications, 83(12): 35307-35334. https://doi.org/10.1007/s11042-023-16737-4

[25] Arepalli, P.G., Khetavath, J.N. (2023). An IoT framework for quality analysis of aquatic water data using time-series convolutional neural network. Environmental Science and Pollution Research, 30(60): 125275-125294. https://doi.org/10.1007/s11356-023-27922-1

[26] Arepalli, P.G., Naik, K.J. (2023). An IoT-Based water contamination analysis for aquaculture using lightweight multi-headed GRU model. Environmental Monitoring and Assessment, 195(12): 1516. https://doi.org/10.1007/s10661-023-12126-4