
Journal of Water Process Engineering 55 (2023) 104088


Modeling and optimization of Graphene Oxide (GO) membranes for nanofiltration with artificial neural networks

Haodong Yang a, Zhe Chen a,*, Yong Li b, Lei Yao b, Geming Wang a, Quanrong Deng a, Ping Fu a, Shenggao Wang a

a Hubei Key Laboratory of Plasma Chemical and Advanced Materials, School of Materials Science and Engineering, Wuhan Institute of Technology, Wuhan 430205, China
b School of Electrical and Information Engineering, Wuhan Institute of Technology, Wuhan 430205, China

A R T I C L E  I N F O

Keywords: GO membrane; Genetic algorithm; Back-propagation artificial neural network; Water flux; Rejection

A B S T R A C T

It is important to understand the correlation between the nanostructure and the membrane performances (water flux and rejection) in membrane separation technology, which is helpful for developing novel membranes. In this study, a back-propagation artificial neural network model optimized with a genetic algorithm was proposed to predict the nanofiltration performances (water flux and rejection) of GO membranes. The aim is to explore the feature importance of the membrane structure parameters (such as interlayer spacing, Zeta potential, water contact angle, roughness and thickness) and the operating condition (operation pressure) on the membrane properties, so as to provide clues for improving the membrane performances. The results showed that the genetic algorithm-back propagation artificial neural network (GABPANN) predicted the correlation between these parameters (including membrane structure and operation pressure) and the membrane performances more accurately on the published experimental results. Moreover, the GABPANN results indicated that the water contact angle is the most powerful parameter determining the water flux, while the surface charge is the most powerful one determining the rejection of GO membranes. This work provides a novel strategy to efficiently optimize the nanofiltration performance of GO membranes, beneficial for better understanding and controlling the structural design of GO membranes.

1. Introduction

Membrane technology has attracted extensive attention due to its economic and environmental advantages in industrial applications [1–4]. Several two-dimensional (2D) nanoscale materials, such as molybdenum disulfide (MoS2), MXene, and graphene-based materials, have been applied in membrane fabrication processes and exhibit excellent performances. Since the ion transport through the interlayer spacing between the 2D sheets can be easily managed by tuning the physical and/or chemical properties of the 2D materials [5–7], GO membranes are expected to exhibit excellent properties when the nanostructure is controlled. Previous literature has reported that many factors influence the membrane properties: a hydrophilic surface endows a higher water flux, while the surface charge and interlayer spacing are responsible for the ion rejection [8–10]. Due to this trade-off effect, it is not easy to develop a separation membrane with both high rejection and high water flux at the same time. Therefore, it is necessary to explore the feature importance of each factor on the membrane performance. The findings could be useful for membrane scientists to design a GO membrane with novel properties.

In order to describe and possibly predict the fluxes and rejections through porous or dense membranes, many transport models have been proposed. These transport models can be roughly divided into three groups: (1) models based on irreversible thermodynamics [11,12], (2) solution-diffusion models [13,14], and (3) pore-flow models [15–18]. However, a number of model parameters have to be fitted or obtained from independent measurements to describe the permeation, and obtaining the model parameters by independent measurements is often difficult or impossible. Semiempirical models have therefore been proposed to describe the model parameters as a function of certain solute/solvent/membrane properties [19,20]. Regression of experimental data is often used to quantify the unknown model parameters.

* Corresponding author.
E-mail address: [email protected] (Z. Chen).

https://doi.org/10.1016/j.jwpe.2023.104088
Received 4 April 2023; Received in revised form 10 July 2023; Accepted 22 July 2023
Available online 7 August 2023
2214-7144/© 2023 Elsevier Ltd. All rights reserved.

Fig. 1. Process flowchart for machine learning (ML) based GO membrane performance prediction.

Artificial intelligence (AI) has received extensive attention in recent years. The wide prediction range of artificial intelligence methods in various applications represents a hot frontier of research in many technical disciplines, including membrane technology [21]. The AI tools include fuzzy logic, particle swarm optimization (PSO), Monte Carlo simulation (MCS), support vector machine (SVM), random forest (RF), artificial neural network (ANN), etc. Among them, SVM, RF and ANN have been frequently applied to solve environmental engineering problems and have shown good performances [22–26]. Okon et al. used extensive logging data to predict the petrophysical properties of individual reservoirs with ANN models and obtained reliable results for porosity, permeability, and water saturation [27]. Through the modeling and prediction of multi-dimensional systems, AI can overcome the limitations of traditional methods and can be used to solve various regression, classification, anomaly detection, and other problems. AI approaches have been used to predict membrane fouling [28], to design and optimize thin-film nanocomposite membranes [29,30], and to estimate the membrane lifetime [31]. Since the various factors affecting the membrane performance are interrelated and exhibit complex nonlinear relationships, traditional AI takes a long time for modeling and training. Therefore, we plan to apply artificial neural networks (ANN) to study GO membranes. ANN is inspired by the framework of biological neural networks. The objective of a neural network is to establish a mapping between inputs and desired outputs through supervised learning, such that the learned model can make reasonable predictions on unseen data. It is a black-box model, which can obtain a relationship between the input variables and output variables after the training process. Besides, ANN can predict the feature importance of each input variable. To date, most ANN-related literature has focused on the relationship between the external conditions of the membrane separation processes and the performances. Delgrange-Vincent et al. predicted membrane fouling using ANN [32]. Si et al. presented an effective approach utilizing ANN to predict the vacuum membrane distillation process of sulfuric acid solution [33]. Zhao et al. established a quantitative structure-property relationship model for predicting the gas separation performance of polyimide membranes using ANN [34]. However, few reports have applied ANN to predict the GO membrane performance from the membrane microstructure, which is the most concerned point for GO membrane applications.

This study collected and utilized laboratory-scale experimental data from the literature, including interlayer spacing, Zeta potential, roughness, and GO layer thickness, and further developed an artificial neural network model to describe the relationship between membrane microstructure and membrane performance, so as to analyze and predict the GO membrane performance. The genetic algorithm (GA) was introduced and combined with BPANN (GABPANN) to avoid falling into local minima, and the fit quality of the GABPANN model was compared with three traditional models (BPANN, RF, and SVM). The purpose of this study is to provide guidance on the design of novel GO films and even other thin-film nanocomposite membranes.

2. Methodology

2.1. Data sets

2.1.1. Data collection

For laminar GO membranes, the interlayer nanochannels between adjacent nanosheets act as molecular transport channels during the filtration process. The precise control of GO membrane transport channels has a significant impact on the membrane properties. The size of the channel leads to molecular sieving behavior, and thicker GO layers provide longer channels, thus resulting in enhanced selectivity and reduced permeability [35,36]. The hydrophilic hydroxyl, carboxyl, and epoxide functional groups on GO sheets benefit the absorption of water molecules and confer Donnan exclusion by providing a negatively charged surface [37,38]. Besides, the surface roughness of the GO membrane would vary the water flux [39]. In this case, the collected data are divided into two categories: six input variables and two dependent output variables. As illustrated in Fig. 1, six variables, namely the interlayer spacing between the GO sheets, Zeta potential, water contact angle, roughness of the GO layer, thickness of the GO layer, and the operation pressure, were selected as the input parameters. Since water flux and rejection are two independent engineering parameters which can evaluate the membrane performance, these two parameters were selected as the output parameters. Experimental data were extracted from 72 records (2017 to 2021) in the published journal literature, which can be seen in Table S1 of the Supporting Information.
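To make the data organization concrete, the sketch below shows one way the collected records could be arranged as a feature matrix and a target matrix. It is only an illustration under assumptions: the column names and the CSV file are hypothetical placeholders, and the actual values are those tabulated in Table S1 of the Supporting Information.

```python
# Minimal sketch (not the authors' code) of how the collected literature data
# could be organized. The CSV file and column names are hypothetical placeholders;
# the actual records are listed in Table S1 of the Supporting Information.
import pandas as pd

INPUT_COLS = [
    "interlayer_spacing",   # d-spacing between adjacent GO sheets
    "zeta_potential",       # surface charge
    "water_contact_angle",  # surface hydrophilicity
    "roughness",            # surface roughness of the GO layer
    "thickness",            # thickness of the GO layer
    "pressure",             # operation pressure
]
OUTPUT_COLS = ["water_flux", "rejection"]

data = pd.read_csv("go_membrane_dataset.csv")  # hypothetical file holding the 72 records
X = data[INPUT_COLS].to_numpy()
y = data[OUTPUT_COLS].to_numpy()
print(X.shape, y.shape)  # expected: (72, 6) (72, 2)
```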


2.1.2. Data preprocessing

Generally speaking, artificial neural network modeling is divided into three stages: training, validation, and prediction. Therefore, the data set is divided into three different categories, namely the training data set, the validation data set, and the prediction data set. In this study, 70 % of the data were randomly selected as the training set, and the remaining 30 % of the data were used for the validation set and the test set.

In order to ensure that each input contributes equally to the prediction of the output and to minimize redundancy, the data are normalized to obtain better prediction results. The data are normalized by the "mapminmax" function (refer to Eq. (1)) and converted into the range of (−1, 1):

y = \frac{(y_{max} - y_{min})(x - x_{min})}{x_{max} - x_{min}} + y_{min}    (1)

where x is the actual value in the data set, x_{max} and x_{min} are the maximum and minimum values of the data set, respectively, y_{max} and y_{min} are the bounds of the normalized range, and y is the normalized value of x.

Biases are easily induced when the number of samples is limited. Therefore, we adopted 5-fold cross-validation, which randomly arranges the data set into five subsets or folds, to train and verify the entire data set. Four of the five subsets were used for training, and the remaining one was used as the validation data set. This was repeated until each subset had been used as the validation set once.

Fig. 2. Artificial neuronal structure.

Table 1
Activation functions used in the BP-ANN network.

Activation function            Equation                              Range
Logistic sigmoid               f(z) = 1 / (1 + e^{-z})               (0, 1)
Hyperbolic tangent sigmoid     f(z) = (e^{2z} - 1) / (e^{2z} + 1)    (-1, 1)
Pure linear                    f(z) = z                              (-∞, +∞)

2.2. Model development

In this work, random forest (RF), support vector machine (SVM), back-propagation artificial neural network (BPANN), and genetic algorithm-back propagation artificial neural network (GABPANN) were employed to predict the performance of GO membranes.

Table 2
The optimum configuration of the BPANN model.

Parameters                                   BPANN (water flux)      BPANN (rejection)
Number of neurons in the input layer         6                       6
Structure of hidden layers                   5-9-11                  12-5-5
Activation function                          tansig-tansig-tansig    tansig-logsig-tansig
Number of output neurons                     1                       1
Number of folds in the cross-validation      5                       5
Number of epochs                             10,000                  10,000
Learning rate                                0.001                   0.001
Loss function                                MSE                     MSE

Parameters                                   GABPANN
Population size                              60
Max generations                              150
Crossover fraction                           0.6
Migration fraction                           0.001

Fig. 3. Flow chart of BPANN and GABPANN.
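As a concrete illustration of the preprocessing described in Section 2.1.2, the sketch below applies Eq. (1)-style min-max scaling and a 5-fold cross-validation split. It is a sketch under assumptions, not the authors' MATLAB workflow: the random seeds and placeholder arrays are illustrative only.

```python
# Minimal sketch (assumptions, not the authors' MATLAB workflow): Eq. (1)-style
# min-max scaling to (-1, 1) and a 5-fold cross-validation split of the training data.
import numpy as np
from sklearn.model_selection import KFold, train_test_split

def mapminmax(x, y_min=-1.0, y_max=1.0):
    """Rescale each column of x to [y_min, y_max] following Eq. (1)."""
    x_min, x_max = x.min(axis=0), x.max(axis=0)
    return (y_max - y_min) * (x - x_min) / (x_max - x_min) + y_min

rng = np.random.default_rng(0)
X = rng.random((72, 6))   # placeholder for the 72 collected records (6 inputs)
y = rng.random((72, 2))   # placeholder outputs: water flux and rejection

X_scaled, y_scaled = mapminmax(X), mapminmax(y)

# 70 % of the data for training, 30 % held out for validation/testing
X_train, X_test, y_train, y_test = train_test_split(
    X_scaled, y_scaled, test_size=0.3, random_state=0)

# 5-fold cross-validation: four folds train the model, the fifth validates it
kf = KFold(n_splits=5, shuffle=True, random_state=0)
for fold, (train_idx, val_idx) in enumerate(kf.split(X_train)):
    X_tr, X_val = X_train[train_idx], X_train[val_idx]
    y_tr, y_val = y_train[train_idx], y_train[val_idx]
    # ...fit the chosen model on (X_tr, y_tr) and score it on (X_val, y_val)
```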


Fig. 4. Performance of training, validation, and errors for GABPANN of a) pure water flux and b) rejection.

Fig. 5. Regression plot of training, validation, and prediction test for GABPANN of a) pure water flux and b) rejection.

2.2.1. Support vector machine (SVM)

Support vector machine (SVM) is a set of related supervised learning methods used for classification and regression, which can be applied to maximize predictive accuracy while automatically avoiding over-fitting to the data [40,41].

Assume a given data set (x, f(x)), where x refers to the space of the input data set and f(x) refers to the output vector. The relationship between x and f(x) is expressed as Eq. (2):

f(x) = \omega^{T} \cdot \varphi(x) + b    (2)

where \varphi(x) denotes the nonlinear mapping function and \omega and b are the coefficients. These should follow the relation of Eq. (3):

\left| f(x) - \omega^{T} \cdot \varphi(x) \right| \leq \varepsilon    (3)

Usually, a loss function (Eq. (4)), a penalty parameter (Eq. (5)), and a sum of positive slack variables (Eq. (6)) are introduced to ensure the existence of a solution by transforming the regression problem into a convex quadratic programming problem:

\min \left( \| \omega \| + C \sum_{i=1}^{m} \left( \xi_{i} + \xi_{i}^{*} \right) \right)    (4)

\text{s.t.} \; y_{i} - \omega^{T} \cdot \varphi(x_{i}) - b \leq \varepsilon + \xi_{i}    (5)

\omega^{T} \cdot \varphi(x_{i}) + b - y_{i} \leq \varepsilon + \xi_{i}^{*}, \; \xi_{i} \geq 0, \; \xi_{i}^{*} \geq 0    (6)
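For readers who wish to reproduce an SVM baseline of this kind, the following sketch fits an ε-SVR with the Gaussian RBF kernel of Eq. (8) using scikit-learn. It is not the authors' implementation: the hyperparameters C, epsilon, and gamma (the g of Eq. (8)) are illustrative assumptions rather than the values tuned in this work, and the arrays are random placeholders.

```python
# Minimal sketch (not the authors' implementation): epsilon-SVR with the Gaussian
# RBF kernel of Eq. (8). C, epsilon and gamma (the g in Eq. (8)) are illustrative
# values, not tuned ones, and the arrays are random placeholders.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.random((72, 6))   # placeholder normalized inputs
y_flux = rng.random(72)   # placeholder target, e.g. normalized water flux

svr = SVR(kernel="rbf", C=10.0, epsilon=0.01, gamma=0.5)
svr.fit(X, y_flux)             # solves the convex QP of Eqs. (4)-(6) in its dual form
y_pred = svr.predict(X)        # evaluates the kernel expansion of Eq. (7)
print(len(svr.support_))       # number of support vectors (n_sv in Eq. (7))
```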


Fig. 6. Comparison of R2 between BP-ANN and GABPANN for a) pure water flux and b) rejection.

Fig. 7. Comparison of output errors between BPANN and GABPANN for a) pure water flux and b) rejection.


nsv
Table 3 f (x) = ai K(xi , x) + b (7)
The statistical indexes of different models. i=1

Output Model Statistical indexes


The Gaussian radial basis kernel function is chosen for its flexibility
MAE MSE RMSE MAPE R2 and versatility, and is expressed as Eq. (8):
Water flux RF 13.82 333.82 18.27 150.2 0.23 ( )
SVM 12.74 835.47 28.9 138.47 0.27
K(xi , x) = exp − g⋅‖xi − x‖2 (8)
BPANN 26.94 1190.44 34.5 9.3 0.84
GABPANN 3.57 24.06 4.90 45.10 0.91 where g denotes the kernel function parameter.
Rejection RF 10.76 246.53 15.7 117 0.35
SVM 9.34 195.31 13.98 101.49 0.39
2.2.2. Random forest (RF)
BPANN 12.48 260.1 16.13 2.02 0.18
GABPANN 3.61 23.02 4.80 4.16 0.85 Random forest (RF) is a machine learning algorithm based on deci­
sion trees [40]. It is composed of multiple decision trees, and each de­
cision tree is different. When constructing the decision tree, we
Lagrange multiplier is introduced to solve the problem. Any function randomly select a part of the samples from the training data, and
satisfying Mercer's condition can be called a kernel function. The cor­ randomly select some features for training. Each tree uses different
responding regression is Eq. (7): samples and features, and the training results are also different. RF has
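The sketch below illustrates an RF regressor in the spirit of Section 2.2.2, where each tree is grown on a bootstrap sample of the training rows and considers a random subset of the features at every split. The hyperparameters are assumptions for illustration, not the settings used in this study, and the arrays are placeholders.

```python
# Minimal sketch (assumed hyperparameters): a random forest regressor where each
# tree sees a bootstrap sample of the training rows and a random feature subset
# at every split, as described in Section 2.2.2.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.random((72, 6))        # placeholder normalized inputs
y_rej = rng.random(72)         # placeholder target, e.g. rejection

rf = RandomForestRegressor(
    n_estimators=200,      # number of decision trees (illustrative)
    max_features="sqrt",   # random feature subset considered at each split
    bootstrap=True,        # each tree is trained on a random bootstrap sample
    random_state=0,
)
rf.fit(X, y_rej)
print(rf.predict(X[:3]))             # prediction = average over all trees
print(rf.feature_importances_)       # impurity-based importance of the 6 inputs
```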


Fig. 8. Relative weights of different parameter contribution to (a) pure water flux and (b) rejection data.

2.2.3. Back-propagation artificial neural network (BPANN)

Artificial neural network (ANN) is a machine learning model whose design is inspired by the human nervous system [43]. The processing units in the ANN model are hierarchically arranged, consisting of three layers (an input layer, one or more hidden layers, and an output layer). Each neuron in the hidden layer is connected to each neuron in the input layer, and also to each neuron in the output layer. Each unit receives the input of all units in the upper layer according to the connection weights, and then transfers its output to the next layer. Back-propagation was proposed by Rumelhart et al. [44]. This procedure repeatedly adjusts the weights of the connections in the network to minimize the difference between the actual output data and the desired output data.

In this study, a back-propagation artificial neural network (BPANN) model was developed. BPANN exhibits the general advantages of self-learning, adaptive and nonlinear mapping capabilities; training starts from the initial weights, minimizes the error at each iteration (gradient descent algorithm implementation), and stops according to the stopping criteria.

The BPANN paradigm consists of an artificial neuronal architecture and individual neurons as processing units, as shown in Fig. 2. Neurons are converted by introducing a nonlinear activation function into the network. Common activation functions are the sigmoid (log-sigmoid), tan-sigmoid, and linear transfer function (purelin), whose output ranges are shown in Table 1.

The feed-forward network calculation by the BPANN is as follows:

s = \sum_{n=1}^{N} \omega_{ij} x_{i} + b    (9)

y = F(s) = \frac{1}{1 + e^{-s}}    (10)

where \omega_{ij} represents the weight of the connection between neurons i and j, x_{i} is the input from neuron i, s represents the weighted input passed from neuron i to neuron j, b is a bias term in the form of a threshold, and y is the output value given by the "sigmoid" function.

2.2.4. Genetic algorithm (GA)

Genetic algorithm (GA), which imitates the phenomenon of natural evolution, constitutes a class of search algorithms based on the mechanics of natural selection and natural genetics [45]. It can be used to solve optimization problems with or without constraints and various optimization problems that are not suitable for standard optimization algorithms. Sahoo and Ray applied a GA-based ANN model to predict the flux decline of a cross-flow membrane [46]. GABPANN combines the advantages of GA and BPANN, which makes it powerful in specific fields.

GA solves a problem by creating an initial population and iteratively modifying a set of independent populations. GA randomly selects individuals from the current population (parents) that contribute to the next generation, and the new population (children) is inherited from the previous generation of parents. With the continuous generation of new populations, the population develops towards the optimal solution. The GA is mainly composed of five parts: the initial population size, fitness function, selection, crossover, and mutation. See the supporting documents for the specific explanation. The algorithm flow of BPANN optimized by GA is shown in Fig. 3.

2.2.5. Evaluation index

The performance of the models can be evaluated by the coefficient of determination (R2), mean absolute error (MAE), mean absolute percentage error (MAPE), mean square error (MSE), and root mean square error (RMSE); the calculation formulas are shown below:

R^{2} = 1 - \frac{\sum_{i=1}^{N} (\hat{y}_{i} - y_{i})^{2}}{\sum_{i=1}^{N} (\overline{y} - y_{i})^{2}}    (11)

MAE = \frac{1}{N} \sum_{i=1}^{N} \left| y_{i} - \hat{y}_{i} \right|    (12)

MAPE = \frac{1}{N} \sum_{i=1}^{N} \left| \frac{y_{i} - \hat{y}_{i}}{y_{i}} \right| \times 100\%    (13)

MSE = \frac{1}{N} \sum_{i=1}^{N} (y_{i} - \hat{y}_{i})^{2}    (14)

RMSE = \sqrt{\frac{1}{N} \sum_{i=1}^{N} (y_{i} - \hat{y}_{i})^{2}}    (15)

where N is the number of real training data points, \hat{y} is the predicted value, \overline{y} is the mean value, and y_{i} is the true value.

It is important to note that MAE and MSE measure the absolute size of the deviation of the true value from the predicted value, while MAPE measures the relative size of the deviation. MAE and MAPE are not easily affected by extreme values, whereas MSE and RMSE calculate the square of the error, which is more sensitive to outlier data and can highlight the error values with greater impact. In general, a satisfactory model should have low values of MAE, MAPE, MSE, and RMSE, and an R2 close to 1.
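The evaluation indexes of Eqs. (11)–(15) can be written out directly, as in the sketch below. The small example arrays at the bottom are arbitrary and for illustration only.

```python
# Minimal sketch: the evaluation indexes of Eqs. (11)-(15) written out with NumPy.
# The small example arrays at the bottom are arbitrary, for illustration only.
import numpy as np

def evaluation_indexes(y_true, y_pred):
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_true - y_pred
    mae = np.mean(np.abs(err))                                   # Eq. (12)
    mape = np.mean(np.abs(err / y_true)) * 100.0                 # Eq. (13); y_true must be non-zero
    mse = np.mean(err ** 2)                                      # Eq. (14)
    rmse = np.sqrt(mse)                                          # Eq. (15)
    r2 = 1.0 - np.sum(err ** 2) / np.sum((y_true - y_true.mean()) ** 2)  # Eq. (11)
    return {"MAE": mae, "MAPE": mape, "MSE": mse, "RMSE": rmse, "R2": r2}

# A satisfactory model has low MAE/MAPE/MSE/RMSE and an R2 close to 1.
print(evaluation_indexes([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8]))
```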


3. Results and discussion

3.1. Optimization and prediction of GABPANN model

All algorithms need parameter tuning to improve the efficiency and generalization performance of the model, and GABPANN parameters such as the number of hidden layer nodes, learning rate, momentum factor, and initial weights have a great impact on the performance of the GABPANN model. The learning rate is a hyper-parameter that affects the magnitude of the loss and the convergence rate of the model during network training. If the learning rate is too small, underfitting occurs; if it is too large, overfitting occurs, resulting in a rapid decline in training loss or in divergence. In the process of modeling, the performance of the model is improved by trial and error to achieve the optimal fit with different activation functions, nodes, and layers. The optimal BPANN structural model parameters are shown in Table 2.

Fig. 4 shows the performance of the model during the training, validation, and testing phases. In the training process, training stops when the test error is the lowest and the mean square error remains unchanged for at least 6 iterations. In this study, the networks were trained for 8 and 9 epochs, respectively.

Fig. 5 shows the training, validation, and predictive test regression of GABPANN for pure water flux and rejection. The dashed line shows the best linear fit, while the solid line shows the linear fit of the current model between the predicted and actual values. As shown in Fig. 5, there is a good correlation between the experimental and predicted values in the training and testing sets. The R2 values of the training set and test set obtained by the prediction model of pure water flux were 0.991 and 0.976, respectively. The R2 values of the training set and test set obtained by the prediction model of rejection rate were 0.979 and 0.914, respectively.

The data sets were introduced into the BPANN and GABPANN models respectively for the precision comparison. As shown in Fig. 6, compared with BPANN, the coefficient of determination R2 of the GABPANN model is closer to 1, indicating a better fit. As seen in Fig. 7, the output error of GABPANN is obviously lower than that of BPANN, indicating that the introduction of GA leads to better prediction of the performance of GO membranes.

3.2. Comparison of GABPANN model with SVM and RF models

With the feature variables extracted in Section 2.1 as inputs, the prediction results of the four nonlinear models (RF, SVM, BPANN, and GABPANN) were obtained in Table 3. It can be concluded that the GABPANN model exhibited the highest accuracy among the four different AI models. In other words, the developed GABPANN model in this study is superior and can be used to predict the water flux and salt rejection of any lab-scale GO membrane for nanofiltration. Moreover, multiple input parameters can be included simultaneously.

3.3. MIV variable screening of GABPANN

In this study, we use MIV (mean impact value) importance analysis as a key indicator to evaluate the importance of the feature variables in the model. Fig. 8 shows the calculated weights of the membrane parameters affecting the pure water flux and rejection performance of the membrane. It is commonly believed that the enhancement of water flux can be influenced by the surface hydrophilicity, interlayer spacing, GO layer thickness, as well as the surface roughness. The fitting results of our model confirmed that the surface hydrophilicity is the most important factor affecting the water flux. Interestingly, the developed model indicated that the surface Zeta potential is the key factor that determines the salt rejection. Therefore, regarding the five GO membrane input parameters, proper control of surface charge and surface hydrophilicity could be the key to breaking the permeability-selectivity trade-off. Besides, a decrease of the GO layer thickness could also favor high water flux. Since chemical modification would simultaneously influence the Zeta potential and surface hydrophilicity, a GO membrane with better performance might be achieved with a thin dense layer and chemical modification.

4. Conclusion

In this paper, a GABPANN model was developed to predict the pure water flux and rejection rate of GO membranes. The model used six input parameters, including five membrane structure parameters (layer spacing, Zeta potential, water contact angle, roughness, and thickness) and one operating condition variable (operating pressure). The statistical results confirmed that the GABPANN model exhibited high accuracy and stability. In addition, the MIV algorithm reveals the degree of influence of each feature variable on the model output. During the development of GO membranes, the membrane preparation process can be adjusted according to the key characteristic parameters obtained from the MIV results, which can regulate the membrane performance to the maximum extent. The results of this study show that the water contact angle (degree of hydrophilicity) plays the most important role in the water flux, while the Zeta potential is the most important one for the rejection rate of GO membranes. According to the results, a novel GO membrane might be developed by grafting appropriate chemical groups and decreasing the thickness of the top dense layer.

The established GABPANN model was compared with BPANN, SVM, and RF. The results showed that the GABPANN model optimized by GA performed better than the other three types of models and exhibited higher stability and accuracy. It can be used for laboratory-scale GO membranes and further extended to the development of various types of materials.

Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Data availability

Data will be made available on request.

Acknowledgments

This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.

Appendix A. Supplementary data

Supplementary data to this article can be found online at https://doi.org/10.1016/j.jwpe.2023.104088.

References

[1] T. Ahmad, X. Liu, C. Guria, Preparation of polyvinyl chloride (PVC) membrane blended with acrylamide grafted bentonite for oily water treatment [J], Chemosphere 310 (2023), 136840, https://doi.org/10.1016/j.chemosphere.2022.136840.
[2] T. Ahmad, C. Guria, S. Shekhar, Effects of inorganic salts in the casting solution on morphology of poly (vinyl chloride)/bentonite ultrafiltration membranes [J], Mater. Chem. Phys. 280 (2022), 125805, https://doi.org/10.1016/j.matchemphys.2022.125805.
[3] T. Ahmad, C. Guria, A. Mandal, Optimal synthesis, characterization and antifouling performance of Pluronic F127/bentonite-based super-hydrophilic polyvinyl chloride ultrafiltration membrane for enhanced oilfield produced water treatment [J], J. Ind. Eng. Chem. 90 (2020) 58–75, https://doi.org/10.1016/j.jiec.2020.06.023.


[4] T. Ahmad, C. Guria, A. Mandal, Kinetic modeling and simulation of non-solvent induced phase separation: immersion precipitation of PVC-based casting solution in a finite salt coagulation bath [J], Polymer 199 (2020), 122527, https://doi.org/10.1016/j.polymer.2020.122527.
[5] X. Yu, H. Cheng, M. Zhang, et al., Graphene-based smart materials [J], Nat. Rev. Mater. 2 (9) (2017) 1–13, https://doi.org/10.1038/natrevmats.2017.46.
[6] X. Liu, M.C. Hersam, Interface characterization and control of 2D materials and heterostructures [J], Adv. Mater. 30 (39) (2018) 1801586, https://doi.org/10.1002/adma.201801586.
[7] Z. Zhu, D. Wang, Y. Tian, et al., Ion/molecule transportation in nanopores and nanochannels: from critical principles to diverse functions [J], J. Am. Chem. Soc. 141 (22) (2019) 8658–8669, https://doi.org/10.1021/jacs.9b00086.
[8] Y. Zhang, Z. Chen, L. Yao, et al., Study of ion permeation through the graphene oxide/polyether sulfone membranes [J], ChemElectroChem 7 (2) (2020) 493–499, https://doi.org/10.1002/celc.201902108.
[9] W. Choi, K.Y. Chun, J. Kim, et al., Ion transport through thermally reduced and mechanically stretched graphene oxide membrane [J], Carbon 114 (2017) 377–382, https://doi.org/10.1016/j.carbon.2016.12.041.
[10] H. Huang, Y. Mao, Y. Ying, et al., Salt concentration, pH and pressure controlled separation of small molecules through lamellar graphene oxide membranes [J], Chem. Commun. 49 (53) (2013) 5963–5965, https://doi.org/10.1039/C3CC41953C.
[11] O. Kedem, A. Katchalsky, Thermodynamic analysis of the permeability of biological membranes to non-electrolytes [J], Biochim. Biophys. Acta 27 (1958) 229–246, https://doi.org/10.1016/0006-3002(58)90330-5.
[12] K.S. Spiegler, O. Kedem, Thermodynamics of hyperfiltration (reverse osmosis): criteria for efficient membranes [J], Desalination 1 (4) (1966) 311–326, https://doi.org/10.1016/S0011-9164(00)80018-1.
[13] J.G. Wijmans, R.W. Baker, The solution-diffusion model: a review [J], J. Membr. Sci. 107 (1–2) (1995) 1–21, https://doi.org/10.1016/0376-7388(95)00102-I.
[14] D.R. Paul, Reformulation of the solution-diffusion theory of reverse osmosis [J], J. Membr. Sci. 241 (2) (2004) 371–386, https://doi.org/10.1016/j.memsci.2004.05.026.
[15] J.D. Ferry, Ultrafilter membranes and ultrafiltration [J], Chem. Rev. 18 (3) (1936) 373–455, https://doi.org/10.1021/cr60061a001.
[16] B. Van der Bruggen, J. Schaep, D. Wilms, et al., A comparison of models to describe the maximal retention of organic molecules in nanofiltration [J], Sep. Sci. Technol. 35 (2) (2000) 169–182, https://doi.org/10.1081/SS-100100150.
[17] H. Mehdizadeh, J.M. Dickson, Theoretical modification of the surface force-pore flow model for reverse osmosis transport [J], J. Membr. Sci. 42 (1–2) (1989) 119–145, https://doi.org/10.1016/S0376-7388(00)82369-8.
[18] X.L. Wang, T. Tsuru, S. Nakao, et al., Electrolyte transport through nanofiltration membranes by the space-charge model and the comparison with Teorell-Meyer-Sievers model [J], J. Membr. Sci. 103 (1–2) (1995) 117–133, https://doi.org/10.1016/0376-7388(94)00317-R.
[19] N.V. Dale, M.D. Mann, H. Salehfar, Semiempirical model based on thermodynamic principles for determining 6 kW proton exchange membrane electrolyzer stack characteristics [J], J. Power Sources 185 (2) (2008) 1348–1353, https://doi.org/10.1016/j.jpowsour.2008.08.054.
[20] Z. Chen, K. Ito, H. Yanagishita, et al., Correlation study between free-volume holes and molecular separations of composite membranes for reverse osmosis processes by means of variable-energy positron annihilation techniques [J], J. Phys. Chem. C 115 (37) (2011) 18055–18060, https://doi.org/10.1021/jp203888m.
[21] C. Niu, X. Li, R. Dai, et al., Artificial intelligence-incorporated membrane fouling prediction for membrane-based processes in the past 20 years: a critical review [J], Water Res. (2022), 118299, https://doi.org/10.1016/j.watres.2022.118299.
[22] A. Tayyebi, A.S. Alshami, X. Yu, et al., Can machine learning methods guide gas separation membranes fabrication? [J], J. Membr. Sci. Lett. (2022), 100033, https://doi.org/10.1016/j.memlet.2022.100033.
[23] S. Park, A.T. Angeles, M. Son, et al., Predicting the salt adsorption capacity of different capacitive deionization electrodes using random forest [J], Desalination 537 (2022), 115826, https://doi.org/10.1016/j.desal.2022.115826.
[24] N.T.S. Khomami, P.M. Patel, C.P. Jusi, et al., Influential parameters of surface waters on the formation of coating on TiO2 nanoparticles under natural conditions [J], Environ. Sci.: Nano 8 (11) (2021) 3153–3166, https://doi.org/10.1039/D1EN00431J.
[25] Y. Yan, T.N. Borhani, S.G. Subraveti, et al., Harnessing the power of machine learning for carbon capture, utilisation, and storage (CCUS) – a state-of-the-art review [J], Energy Environ. Sci. 14 (12) (2021) 6122–6157, https://doi.org/10.1039/D1EE02395K.
[26] L. Yao, Y. Li, Q. Cheng, et al., Modeling and optimization of metal-organic frameworks membranes for reverse osmosis with artificial neural networks [J], Desalination 532 (2022), 115729, https://doi.org/10.1016/j.desal.2022.115729.
[27] A.N. Okon, S.E. Adewole, E.M. Uguma, Artificial neural network model for reservoir petrophysical properties: porosity, permeability and water saturation prediction [J], Model. Earth Syst. Environ. 7 (4) (2021) 2373–2390, https://doi.org/10.1007/s40808-020-01012-4.
[28] F. Schmitt, K.U. Do, Prediction of membrane fouling using artificial neural networks for wastewater treated by membrane bioreactor technologies: bottlenecks and possibilities [J], Environ. Sci. Pollut. Res. 24 (29) (2017) 22885–22913, https://doi.org/10.1007/s11356-017-0046-7.
[29] M.M. Gromiha, Y. Yabuki, Functional discrimination of membrane proteins using machine learning techniques [J], BMC Bioinformatics 9 (2008) 1–8, https://doi.org/10.1186/1471-2105-9-135.
[30] C.S.H. Yeo, Q. Xie, X. Wang, et al., Understanding and optimization of thin film nanocomposite membranes for reverse osmosis with machine learning [J], J. Membr. Sci. 606 (2020), 118135, https://doi.org/10.1016/j.memsci.2020.118135.
[31] H. Liu, J. Chen, D. Hissel, et al., Remaining useful life estimation for proton exchange membrane fuel cells using a hybrid method [J], Appl. Energy 237 (2019) 910–919, https://doi.org/10.1016/j.apenergy.2019.01.023.
[32] N. Delgrange-Vincent, C. Cabassud, M. Vabassud, L. Durand-Bourlier, J.M. Laine, Neural networks for long term prediction of fouling and backwash efficiency in ultrafiltration for drinking water production [J], Desalination 131 (2000) 353–362, https://doi.org/10.1016/S0011-9164(00)90034-1.
[33] Z. Si, D. Zhou, J. Guo, X. Zhuang, J. Xiang, Prediction of sulfuric acid solution in the vacuum membrane distillation process using artificial neural network [J], J. Water Process Eng. 53 (2023), 103888, https://doi.org/10.1016/j.jwpe.2023.103888.
[34] M. Zhao, C. Zhang, Y. Weng, Improved artificial neural networks (ANNs) for predicting the gas separation performance of polyimides [J], J. Membr. Sci. 681 (2023), 121765, https://doi.org/10.1016/j.memsci.2023.121765.
[35] Y. Chen, X. Yang, Molecular simulation of layered GO membranes with amorphous structure for heavy metal ions separation [J], J. Membr. Sci. 660 (2022), 120863, https://doi.org/10.1016/j.memsci.2022.120863.
[36] J. Wang, P. Zhang, B. Liang, et al., Graphene oxide as an effective barrier on a porous nanofibrous membrane for water treatment [J], ACS Appl. Mater. Interfaces 8 (9) (2016) 6211–6218, https://doi.org/10.1021/acsami.5b12723.
[37] P. Sun, F. Zheng, M. Zhu, et al., Selective trans-membrane transport of alkali and alkaline earth cations through graphene oxide membranes based on cation–π interactions [J], ACS Nano 8 (1) (2014) 850–859, https://doi.org/10.1021/nn4055682.
[38] M. Zhang, K. Guan, Y. Ji, et al., Controllable ion transport by surface-charged graphene oxide membrane [J], Nat. Commun. 10 (1) (2019) 1253, https://doi.org/10.1038/s41467-019-09286-8.
[39] Y. Mao, Q. Huang, B. Meng, et al., Roughness-enhanced hydrophobic graphene oxide membrane for water desalination via membrane distillation [J], J. Membr. Sci. 611 (2020), 118364, https://doi.org/10.1016/j.memsci.2020.118364.
[40] Y. Zhou, Z. Lu, W. Yun, Active sparse polynomial chaos expansion for system reliability analysis [J], Reliab. Eng. Syst. Saf. 202 (2020), 107025, https://doi.org/10.1016/j.ress.2020.107025.
[41] A. Tayyebi, A.S. Alshami, X. Yu, E. Kolodka, Can machine learning methods guide gas separation membranes fabrication? [J], J. Membr. Sci. Lett. 2 (2022), 100033, https://doi.org/10.1016/j.memlet.2022.100033.
[42] Z. Zhang, Y. Luo, H. Peng, Y. Chen, R.Z. Liao, Q. Zhao, Deep spatial representation learning of polyamide nanofiltration membranes [J], J. Membr. Sci. 620 (2021), 118910, https://doi.org/10.1016/j.memsci.2020.118910.
[43] C. Niu, X. Li, R. Dai, Z. Wang, Artificial intelligence-incorporated membrane fouling prediction for membrane-based processes in the past 20 years: a critical review [J], Water Res. 216 (2022), 118299, https://doi.org/10.1016/j.watres.2022.118299.
[44] D.E. Rumelhart, G.E. Hinton, R.J. Williams, Learning representations by back-propagating errors [J], Nature 323 (1986) 533–536, https://www.nature.com/articles/323533a0.
[45] S. Yuan, H. Ajam, Z.A.B. Sinnah, F.M.A. Altalbawy, S.A.A. Ameer, A. Husain, A. Al Mashhadani, A. Alkhayyat, A. Alsalamy, R.A. Zubaid, Y. Cao, The role of artificial intelligence techniques for increasing the prediction performance of important parameters and their optimization in membrane processes: a systematic review [J], Ecotoxicol. Environ. Saf. 260 (2023), 115066, https://doi.org/10.1016/j.ecoenv.2023.115066.
[46] G.B. Sahoo, C. Ray, Predicting flux decline in crossflow membranes using artificial neural networks and genetic algorithms [J], J. Membr. Sci. 283 (1–2) (2006) 147–157, https://doi.org/10.1016/j.memsci.2006.06.019.
