apply the attention mechanism trained in the LSTM to target the extracted relevant features, which has greatly enhanced the original predictive power of LSTM. An LSTM approach for short-term forecasts on a timescale that includes global horizontal irradiance (GHI) one hour ahead and one day ahead has been applied in [11]. Furthermore, authors in [12] have implemented univariate and multivariate GRU models employing historical solar radiation, external meteorological variables, and cloud cover data to predict solar radiation. In [13], authors have presented a multivariate GRU to predict Direct Normal Irradiance (DNI) hourly, and the suggested method is evaluated against LSTM using historical irradiance data. In addition, a hybrid deep learning model has been introduced in [14] that combines a GRU neural network with an attention mechanism for solar radiation prediction. All the research contributions cited above are valuable. However, they are not able to identify the important parameters that would have an impact on the accuracy of predictions, nor to make real-time predictions for an efficient and optimized management. Indeed, the fluctuating nature of output power generation as a function of meteorological conditions such as temperature, wind speed, cloud cover, atmospheric aerosol levels and humidity leads to high uncertainties on the output power of PV [15]. Besides, according to the forecast horizon, PV forecasts range from very short term to long term. In general, researchers focus on short-term, hourly and daily forecasts rather than long-term ones. The former are of major significance for the management of PV systems and the related security constraints (planning, control of PV storage and the rapprochement of the electricity market, ensuring the secure operation of generation and distribution services, and reinforcing the security of grid operation), while long-term forecasts are useful particularly for maintenance [16].

In this context, we study the efficiency of three DL models. We also focus in this study on real-time prediction of solar energy, in order to help planners, decision-makers, power plant operators and grid operators make responsive decisions as early as possible, and to manage smart grid PV systems more reliably and efficiently [17]. Moreover, the use of real-time prediction allows adjusting to changes in production and reacting to complex events (exceptionally high or low production or consumption). In addition, it decreases the amount of operating reserves needed by the system, thus reducing system balancing costs [18]. However, the models we have selected give good results and also seem suitable for long-term forecasting, thanks to their power as deep learning models and their ability to perform complex processing on huge data sets. Moreover, we outline that our work has been experimented on the Moroccan case, particularly the region of Errachidia. In this perspective, Moroccan decision-makers have launched a global plan to improve the share of renewable energy in the energy mix and substantially improve energy efficiency, with a view to increasing the percentage of electricity generation capacity from renewable energy sources (42% by 2020 and 52% by 2030) [19]. Therefore, we believe that the case of Morocco remains an interesting case study that could lead to important findings, since it is not only a promising future PV energy supplier, but also one of the leading countries in the global energy transition, particularly in Africa.

The remainder of this paper is organized into four sections. Section 2 gives an overview of our comparative methodology. Subsequently, section 3 presents the approach applied to elaborate the forecast models. Section 4 presents and discusses the obtained results. Finally, section 5 summarizes the conclusions and perspectives of this study.

2 Methodology

Solar energy prediction is a key element in enhancing the competitiveness of solar power plants in the energy market and decreasing reliance on fossil fuels in socio-economic development. Our work aims to accurately predict solar energy. For this purpose, we explore the architectures of the RNN, LSTM and GRU algorithms, which are suitable for forecasting such time-series data, and we experiment with and evaluate them on Morocco's case, especially the Errachidia area. This section first presents the basic architecture of these recurrent neural networks before explaining the important steps that we follow to build our models and perform our comparative study.

2.1 Recurrent Neural Network architecture

The recurrent neural network (RNN) is a category of neural network used in sequential data prediction, where the output is dependent on the input [20]. The RNN [21] is capable of capturing the dynamics of time-series data by storing information from previous computations in its internal memory. RNN has been applied in contexts where past values of the output make a significant contribution to the future. It is mainly used in forecasting applications because of its ability to process sequential data of different lengths.

The basic principle of RNNs is that a hidden neuron takes its input from the neurons at the preceding time step [22]. To this purpose, they employ cells whose gates influence the output: observations of historical data are used to generate the output. The RNN is particularly efficient for learning the dynamic temporal behaviors exhibited in time-series data [23]. In an RNN, the hidden state h_t for a given input sequence x_t receives feedback information from the neurons at the preceding time step, multiplied by W_hh, the weight of the preceding hidden state h_{t-1}; it can be calculated by Eq. 1, where x_t is the input at time t, W_xh is the weight of the current input state and tanh is the activation function. The output state y_t is computed according to Eq. 2, where W_hy is the weight at the output state.

h_t = \tanh(W_{hh} h_{t-1} + W_{xh} x_t)    (1)

y_t = W_{hy} h_t    (2)
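As a concrete illustration of Eqs. (1) and (2), the following minimal NumPy sketch runs a vanilla RNN forward pass. The dimensions, the random weights and the omission of bias terms are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def rnn_forward(x_seq, W_hh, W_xh, W_hy, h0):
    """Vanilla RNN forward pass: h_t = tanh(W_hh h_{t-1} + W_xh x_t), y_t = W_hy h_t."""
    h, outputs = h0, []
    for x_t in x_seq:
        h = np.tanh(W_hh @ h + W_xh @ x_t)   # Eq. (1): new hidden state
        outputs.append(W_hy @ h)             # Eq. (2): output at time t
    return np.array(outputs), h

# Toy dimensions (assumed): 3 input features, 5 hidden units, 1 output (predicted PV power).
rng = np.random.default_rng(42)
W_hh = rng.normal(scale=0.1, size=(5, 5))
W_xh = rng.normal(scale=0.1, size=(5, 3))
W_hy = rng.normal(scale=0.1, size=(1, 5))
x_seq = rng.normal(size=(10, 3))             # a 10-step input sequence

y_seq, h_last = rnn_forward(x_seq, W_hh, W_xh, W_hy, h0=np.zeros(5))
print(y_seq.shape)  # (10, 1)
```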
The LSTM is a special kind of RNN, designed to avoid and resolve the vanishing gradient problem that limits the efficiency of the simple RNN [8]. An LSTM network has memory blocks that are connected through a succession of layers. In the LSTM cells, there are three types of gates: the input gate, the forget gate and the output gate. This makes it possible to achieve good results on a variety of time-series learning tasks, particularly on the nonlinearities of a given dataset [24]. Each LSTM block handles the state of the block and the output; it operates at different time steps and transmits its output to the following block, and the final LSTM block generates the sequential output [9]. Besides, LSTM is a robust algorithm that allows the recurrent neural network to efficiently process time-series data. Its key component is the memory block, which was introduced to address the vanishing gradient disadvantage by memorizing network parameters for long durations [25]. An LSTM block receives an input sequence; after that, the activation units are used by each gate to decide whether or not it is activated. This operation enables the change of state and the adding of information passing through the block conditionally. In the training phase, the gates have weights that can be learned. In fact, the gates make the LSTM blocks smarter than conventional neurons and allow them to remember current sequences [10]. LSTM is flexible and estimates dependencies at different time scales thanks to its ability to process long task sequences and to identify long-range features. Basically, LSTM starts with a forget gate layer (f_t) that uses a sigmoid function combined with the preceding hidden layer (h_{t-1}) and the current input (x_t), as described by its gate equations.

The GRU (gated recurrent unit) is a simplified variant of the LSTM, known for its capacity to learn and acquire dependencies over the long term and over observations of varying length [29]. This characteristic is particularly useful for time-series data, and it is helpful in reducing the computational complexity [30]. GRU cells are described using the following equations:

z_t = \sigma(W_z \cdot [h_{t-1}, x_t])    (9)

r_t = \sigma(W_r \cdot [h_{t-1}, x_t])    (10)

\tilde{h}_t = \tanh(W \cdot [r_t \odot h_{t-1}, x_t])    (11)
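To make Eqs. (9)–(11) concrete, the sketch below implements a single GRU cell step in NumPy. The weight shapes, the absence of bias terms and the final state update h_t = (1 − z_t) ⊙ h_{t−1} + z_t ⊙ h̃_t follow the standard GRU formulation and are assumptions of this sketch; the excerpt above stops at Eq. (11), and the paper's exact notation may differ.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(h_prev, x_t, W_z, W_r, W_h):
    """One GRU time step following Eqs. (9)-(11).

    h_prev : previous hidden state, shape (hidden,)
    x_t    : current input, shape (features,)
    W_z, W_r, W_h : weights applied to the concatenation [h_prev, x_t],
                    each of shape (hidden, hidden + features).
    """
    hx = np.concatenate([h_prev, x_t])                            # [h_{t-1}, x_t]
    z_t = sigmoid(W_z @ hx)                                       # update gate, Eq. (9)
    r_t = sigmoid(W_r @ hx)                                       # reset gate,  Eq. (10)
    h_cand = np.tanh(W_h @ np.concatenate([r_t * h_prev, x_t]))   # candidate state, Eq. (11)
    # Standard GRU state update (assumed; not shown in the excerpt above):
    h_t = (1.0 - z_t) * h_prev + z_t * h_cand
    return h_t

# Tiny usage example with random weights: 3 input features, 4 hidden units.
rng = np.random.default_rng(0)
hidden, features = 4, 3
W_z, W_r, W_h = (rng.normal(size=(hidden, hidden + features)) for _ in range(3))
h = np.zeros(hidden)
for x in rng.normal(size=(5, features)):   # a 5-step toy sequence
    h = gru_step(h, x, W_z, W_r, W_h)
print(h.shape)  # (4,)
```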
to learn from the past to predict PV solar energy for the next half hour. This choice has been adopted because of the size of the data and in order to perform real-time predictions for an efficient and optimized management. Finally, based on best practices in the area of data analytics, the dataset is split into 80% and 20% subsets, used respectively to train and test the developed models.
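As an illustration of this setup, the sketch below builds fixed-length input windows for one-step-ahead (next half-hour) prediction, applies a chronological 80/20 train/test split, and fits a small recurrent model with TensorFlow/Keras. The window length, the synthetic series and the network size are assumptions for demonstration only, not the paper's configuration.

```python
import numpy as np
import tensorflow as tf

def make_windows(series, lookback=12):
    """Turn a 1-D series into (samples, lookback, 1) windows and next-step targets."""
    X, y = [], []
    for i in range(len(series) - lookback):
        X.append(series[i:i + lookback])
        y.append(series[i + lookback])        # value one half-hour ahead
    return np.asarray(X, dtype="float32")[..., None], np.asarray(y, dtype="float32")

# Hypothetical half-hourly PV power series (replace with the real measurements).
power = np.sin(np.linspace(0, 60, 2000)).astype("float32") ** 2
X, y = make_windows(power, lookback=12)

# Chronological 80/20 split: no shuffling, so the test set follows the training period.
split = int(0.8 * len(X))
X_train, X_test = X[:split], X[split:]
y_train, y_test = y[:split], y[split:]

# Minimal GRU regressor; SimpleRNN or LSTM layers can be swapped in for comparison.
model = tf.keras.Sequential([
    tf.keras.layers.GRU(32, input_shape=(X.shape[1], X.shape[2])),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])
model.fit(X_train, y_train, epochs=5, batch_size=32,
          validation_data=(X_test, y_test), verbose=0)
```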
3.4 Performance metrics

To evaluate the performance of the proposed DL models for PV solar energy prediction, a set of widely used evaluation metrics is employed to assess the accuracy of the forecasting results. In this step, we have applied those most suitable to the context of DL and regression problems. For this purpose, five performance metrics, whose equations are presented below, have been employed for model testing as well as training: MAE (Mean Absolute Error), MSE (Mean Square Error), RMSE (Root Mean Square Error), Max Error (ME) and R squared (R²).

MAE = \frac{1}{n} \sum_{j=1}^{n} |y_j - \hat{y}_j|    (14)

MSE = \frac{1}{n} \sum_{j=1}^{n} (y_j - \hat{y}_j)^2    (15)

RMSE = \sqrt{\frac{1}{n} \sum_{j=1}^{n} (y_j - \hat{y}_j)^2}    (16)

ME = \max_{1 \le j \le n} |y_j - \hat{y}_j|    (17)

R^2 = 1 - \frac{SS_{res}}{SS_{tot}}    (18)

where y is the actual output, ŷ is the predicted output and n is the number of samples. Eq. 14 computes MAE as the average of the absolute errors (the absolute values of the differences between the actual and the predicted values). Eq. 15 shows the MSE, which is the average of the squared errors (differences between the real values and what is estimated). Eq. 16 calculates RMSE, which is the square root of the MSE and is applied in cases with small errors [33], whereas Eq. 17 measures the maximum residual error ME and reveals the worst error between the actual and predicted values. In addition, to calculate the predictive accuracy of the proposed models, we also used R-squared, a statistical metric for regression models that identifies the proportion of variance of the dependent variable that can be explained by the independent variables. Expressly, Eq. 18 indicates how well the data fit the regression model (the goodness of fit), where SS_res is the sum of the squared residuals and SS_tot is the total sum of squares [34]. Eventually, in the different steps outlined in this section, we employed a set of technical tools for the implementation of the studied algorithms based on Python libraries, including Pandas, NumPy, SciPy and Matplotlib in a Jupyter Notebook, in addition to Scikit-learn (sklearn) and TensorFlow.

Concerning the efficiency of R² as an appropriate metric for determining the best regression model [35, 36], we additionally evaluate our results using other statistical metrics that are among the most widely used: MAE, MSE, Max Error and RMSE. The RMSE remains the most frequently employed in the regression area [37]. Table 2 shows the metric values corresponding to real-time solar energy prediction for the studied models. Considering the parameters presented in Table 2, we can see the results of our experiments for one-half-hour-ahead prediction with the recurrent neural networks RNN, LSTM and GRU. In spite of their architectural differences, RNN and LSTM have demonstrated similar performance, showing that the better model is strongly dependent on the task. Thanks to their ability to handle long-term dependencies in sequential information for time-series prediction applications, both RNN and LSTM perform better, with very high accuracy, as shown by their related R² values, which are respectively equal to 94.68% and 94.26%. GRU gives slightly lower performance than RNN and LSTM, with a value of 90.32%. It also yields larger errors than RNN and LSTM: for GRU, we observe 2.74, 15.53, 3.94 and 37.30 as the MAE, MSE, RMSE and ME values, respectively. Furthermore, these measures are nearly similar for RNN and LSTM, with MAE=1.83, MSE=8.53, RMSE=2.92 and ME=30.42 for RNN, and MAE=1.90, MSE=9.20, RMSE=3.03 and ME=29.73 for LSTM. Figure 3, Figure 4, Figure 5, Figure 6, Figure 7 and Figure 8 show their training and testing curves. In reality, more accurate prediction results mean less uncertainty and fewer fluctuations in PV solar energy generation. Accuracy is a crucial factor in the efficient planning and use of electrical energy and of energy systems with high PV power penetration. In short, as shown in Table 2, the prediction capability of RNN and LSTM seems very reliable compared to GRU under meteorological conditions, and they are better adapted to meet real-time PV solar energy prediction needs with high accuracy. Due to their ability to minimize errors during the learning and test processes, they can reduce uncertainties related to the operation and planning of PV energy integrated management systems.

Table 2: Metrics related to the PV solar energy dataset.

Models  Data      MAE   MSE    RMSE  ME     R² (%)
RNN     Training  1.73  7.22   2.68  34.06  95.62
RNN     Test      1.83  8.53   2.92  30.42  94.68
LSTM    Training  1.77  7.65   2.76  35.42  95.36
LSTM    Test      1.90  9.20   3.03  35.42  94.26
GRU     Training  2.63  13.69  3.70  34.26  91.70
GRU     Test      2.74  15.53  3.94  37.30  90.32
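For reference, the five metrics of Eqs. (14)–(18) reported in Table 2 can be computed with scikit-learn as sketched below; the y_true and y_pred arrays are placeholder values for illustration, not the paper's data.

```python
import numpy as np
from sklearn.metrics import (mean_absolute_error, mean_squared_error,
                             max_error, r2_score)

# Illustrative arrays; in practice y_true holds measured PV power for the test
# period and y_pred the model's one-half-hour-ahead predictions.
y_true = np.array([10.2, 12.5, 14.1, 13.3, 11.8])
y_pred = np.array([10.0, 12.9, 13.6, 13.5, 12.4])

mae  = mean_absolute_error(y_true, y_pred)   # Eq. (14)
mse  = mean_squared_error(y_true, y_pred)    # Eq. (15)
rmse = np.sqrt(mse)                          # Eq. (16)
me   = max_error(y_true, y_pred)             # Eq. (17)
r2   = r2_score(y_true, y_pred)              # Eq. (18)

print(f"MAE={mae:.2f} MSE={mse:.2f} RMSE={rmse:.2f} ME={me:.2f} R2={r2:.2%}")
```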
than the RNN and LSTM. From these findings, it can be concluded that RNN and LSTM showed very high accuracy and low errors; they are reliable models that can minimize errors in the learning and testing process, in addition to reducing the uncertainties associated with the operation and planning of integrated PV energy management systems. Moreover, they are promising techniques that also appear to be suitable for long-term PV forecasting and should be studied and recommended as a possible unified and standard tool for real-time, short-term and long-term PV prediction. As future work of this study, we plan to enhance these models by employing other approaches, extend them to longer-term PV forecasting, and study other promising deep learning methods with a view to providing and deploying a powerful model for PV energy prediction.

Conflict of Interest The authors declare no conflict of interest.

References

[1] E. Hache, “Do renewable energies improve energy security in the long run?,” Les cahiers de l'économie, no. 109, Série Recherche, September 2016.

[2] M.G. De Giorgi, P.M. Congedo, M. Malvoni, “Photovoltaic power forecasting using statistical methods: Impact of weather data,” IET Science, Measurement and Technology, 8(3), 90–97, 2014. DOI:10.1049/iet-smt.2013.0135.

[3] P. Li, K. Zhou, S. Yang, “Photovoltaic Power Forecasting: Models and Methods,” 2nd IEEE Conference on Energy Internet and Energy System Integration, EI2 2018 - Proceedings, 1–6, 2018. DOI:10.1109/EI2.2018.8582674.

[4] U.K. Das, K.S. Tey, M. Seyedmahmoudian, S. Mekhilef, M.Y.I. Idris, W. Van Deventer, B. Horan, A. Stojcevski, “Forecasting of photovoltaic power generation and model optimization: A review,” Renewable and Sustainable Energy Reviews, 81(August 2017), 912–928, 2018. DOI:10.1016/j.rser.2017.08.017.

[5] A. Youssef, M. El-telbany, A. Zekry, “The role of artificial intelligence in photovoltaic systems design and control: A review,” Renewable and Sustainable Energy Reviews, 78(November), 72–79, 2017. DOI:10.1016/j.rser.2017.04.046.

[6] A.P. Yadav, A. Kumar, L. Behera, “RNN Based Solar Radiation Forecasting Using Adaptive Learning Rate,” 442–452, 2013.

[7] G. Li, H. Wang, S. Zhang, J. Xin, H. Liu, “Recurrent Neural Networks Based Photovoltaic Power Forecasting Approach,” 1–17, 2019.

[8] P. Li, K. Zhou, X. Lu, S. Yang, “A hybrid deep learning model for short-term PV power forecasting,” Applied Energy, (November), 114216, 2019. DOI:10.1016/j.apenergy.2019.114216.

[9] M. Abdel-Nasser, K. Mahmoud, “Accurate photovoltaic power forecasting models using deep LSTM-RNN,” Neural Computing and Applications, 31(7), 2727–2740, 2019. DOI:10.1007/s00521-017-3225-z.

[10] H. Zhou, Y. Zhang, L. Yang, Q. Liu, K. Yan, Y. Du, “Short-Term Photovoltaic Power Forecasting Based on Long Short Term Memory Neural Network and Attention Mechanism,” IEEE Access, 7, 78063–78074, 2019. DOI:10.1109/ACCESS.2019.2923006.

[11] Y. Yu, J. Cao, J. Zhu, “An LSTM Short-Term Solar Irradiance Forecasting Under Complicated Weather Conditions,” IEEE Access, 7, 145651–145666, 2019. DOI:10.1109/ACCESS.2019.2946057.

[12] J. Wojtkiewicz, M. Hosseini, R. Gottumukkala, T.L. Chambers, “Hour-Ahead Solar Irradiance Forecasting Using Multivariate Gated Recurrent Units,” 1–13, 2019. DOI:10.3390/en12214055.

[13] M. Hosseini, S. Katragadda, J. Wojtkiewicz, R. Gottumukkala, “Direct Normal Irradiance Forecasting Using Multivariate Gated Recurrent Units,” 1–15, 2020. DOI:10.3390/en13153914.

[14] K. Yan, H. Shen, L. Wang, H. Zhou, M. Xu, Y. Mo, “Short-Term Solar Irradiance Forecasting Based on a Hybrid Deep Learning Methodology,” 1–13.

[15] M.Q. Raza, M. Nadarajah, C. Ekanayake, “On recent advances in PV output power forecast,” Solar Energy, 136(September 2019), 125–144, 2016. DOI:10.1016/j.solener.2016.06.073.

[16] C. Wan, J. Zhao, Y. Song, Z. Xu, J. Lin, Z. Hu, “Photovoltaic and solar power forecasting for smart grid energy management,” CSEE Journal of Power and Energy Systems, 1(4), 38–46, 2016. DOI:10.17775/cseejpes.2015.00046.

[17] A. Tuohy, J. Zack, S. E. Haupt, J. Sharp, M. Ahlstrom, S. Dise, E. Grimit, C. Mohrlen, M. Lange, M. G. Casado, J. Black, M. Marquis, C. Collier, “Solar forecasting: Methods, challenges, and performance,” IEEE Power and Energy Magazine, 13(6), 50-59, 2015.

[18] G. Notton, M.L. Nivet, C. Voyant, C. Paoli, C. Darras, F. Motte, A. Fouilloy, “Intermittent and stochastic character of renewable energy sources: Consequences, cost of intermittence and benefit of forecasting,” Renewable and Sustainable Energy Reviews, 87(December 2016), 96–105, 2018. DOI:10.1016/j.rser.2018.02.007.

[19] A.A. Merrouni, F.E. Elalaoui, A. Mezrhab, A. Mezrhab, A. Ghennioui, “Large scale PV sites selection by combining GIS and Analytical Hierarchy Process. Case study: Eastern Morocco,” Renewable Energy, 2017. DOI:10.1016/j.renene.2017.10.044.

[20] A. Alzahrani, P. Shamsi, C. Dagli, M. Ferdowsi, “Solar Irradiance Forecasting Using Deep Neural Networks,” Procedia Computer Science, 114, 304–313, 2017. DOI:10.1016/j.procs.2017.09.045.

[21] A. Alzahrani, P. Shamsi, M. Ferdowsi, “Solar Irradiance Forecasting Using Deep Recurrent Neural Networks,” in 2017 IEEE 6th International Conference on Renewable Energy Research and Applications (ICRERA), 988-994, 2017.

[22] Y. LeCun, Y. Bengio, G. Hinton, “Deep learning,” Nature, 521(7553), 436-444, 2015.

[23] H. Wang, Z. Lei, X. Zhang, B. Zhou, J. Peng, “A review of deep learning for renewable energy forecasting,” Energy Conversion and Management, 198(July), 111799, 2019. DOI:10.1016/j.enconman.2019.111799.

[24] X. Qing, Y. Niu, “Hourly day-ahead solar irradiance prediction using weather forecasts by LSTM,” Energy, 148, 461–468, 2018. DOI:10.1016/j.energy.2018.01.177.

[25] H. Liu, X. Mi, Y. Li, “Comparison of two new intelligent wind speed forecasting approaches based on Wavelet Packet Decomposition, Complete Ensemble Empirical Mode Decomposition with Adaptive Noise and Artificial Neural Networks,” Energy Conversion and Management, 155(July 2017), 188–200, 2018. DOI:10.1016/j.enconman.2017.10.085.

[26] Y. Wang, W. Liao, Y. Chang, “Gated Recurrent Unit Network-Based Short-Term Photovoltaic Forecasting,” Energies, 11(8), 2163, 2018. DOI:10.3390/en11082163.

[27] K. Cho, B. van Merrienboer, C. Gulcehre, F. Bougares, H. Schwenk, Y. Bengio, “Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation,” arXiv preprint arXiv:1406.1078, 2014.

[28] M. C. Sorkun, C. Paoli, Ö. D. Incel, “Time series forecasting on solar irradiation using deep learning,” 2017 10th International Conference on Electrical and Electronics Engineering (ELECO), Bursa, 151-155, 2017.

[29] H.M. Lynn, S.B.U.M. Pan, “A Deep Bidirectional GRU Network Model for Biometric Electrocardiogram Classification Based on Recurrent Neural Networks,” IEEE Access, 7, 145395–145405, 2019. DOI:10.1109/ACCESS.2019.2939947.

[30] Z. Che, S. Purushotham, K. Cho, D. Sontag, Y. Liu, “Recurrent Neural Networks for Multivariate Time Series with Missing Values,” Scientific Reports, 1–12, 2018. DOI:10.1038/s41598-018-24271-9.

[31] H. Zhou, Z. Deng, Y. Xia, M. Fu, “A new Sampling Method in Particle Filter Based on Pearson Correlation Coefficient,” Neurocomputing, 2016. DOI:10.1016/j.neucom.2016.07.036.

[32] D.P. Kingma, J. Ba, “Adam: A Method for Stochastic Optimization,” Proceedings of the 3rd International Conference on Learning Representations (ICLR), arXiv:1412.6980, 2014.

[33] F. Wang, Z. Mi, S. Su, H. Zhao, “Short-term solar irradiance forecasting model based on artificial neural network using statistical feature parameters,” Energies, 5(5), 1355–1370, 2012. DOI:10.3390/en5051355.

[34] B.J. Miles, “R Squared, Adjusted R Squared,” Wiley, 2014.

[35] C. Onyutha, “From R-squared to coefficient of model accuracy for assessing ‘goodness-of-fits’,” Geoscientific Model Development Discussions, 1-25, 2020. DOI:10.5194/gmd-2020-51.

[36] D.B. Figueiredo Filho, J.A.S. Júnior, E.C. Rocha, “What is R2 all about?,” Leviathan (São Paulo), 3, 60–68, 2011.

[37] I. Jebli, F. Belouadha, M.I. Kabbaj, “The forecasting of solar energy based on Machine Learning,” 2020 International Conference on Electrical and Information Technologies (ICEIT), IEEE, 1-8, 2020. DOI:10.1109/ICEIT48248.2020.9113168.