
Advances in Science, Technology and Engineering Systems Journal (ASTES Journal)
Vol. 6, No. 1, 349-355 (2021)
www.astesj.com | ISSN: 2415-6698 | DOI: https://dx.doi.org/10.25046/aj060140

Special Issue on Multidisciplinary Sciences and Engineering

Deep Learning based Models for Solar Energy Prediction


Imane Jebli*,1, Fatima-Zahra Belouadha1, Mohammed Issam Kabbaj1, Amine Tilioua2

1 AMIPS Research Team, E3S Research Center, Computer Science Department, Ecole Mohammadia d'Ingénieurs, Mohammed V University in Rabat, Avenue Ibn Sina B.P. 765, Agdal Rabat 10090, Morocco
2 Thermal and Applied Thermodynamics, Mechanics Energy Efficiency and Renewable Energies Laboratory, Department of Physics, Faculty of Sciences and Techniques Errachidia, Moulay Ismaïl University of Meknès, Boutalamine Errachidia, B.P-509, Morocco

* Corresponding Author: Imane Jebli, Avenue Ibn Sina B.P. 765, Agdal Rabat 10090, Morocco, Email: [email protected]

ARTICLE INFO

Article history:
Received: 10 November, 2020
Accepted: 06 January, 2021
Online: 22 January, 2021

Keywords:
Photovoltaic energy prediction
Deep Learning
Recurrent Neural Network
Long Short-Term Memory
Gated Recurrent Units

ABSTRACT

Solar energy is becoming widely used in the global power grid. Enhancing the accuracy of solar energy predictions is therefore essential for the efficient planning, management and operation of power systems. To minimize the negative impacts of photovoltaics on electricity and energy systems, a highly accurate and advanced forecasting approach is urgently needed. In this paper, we study the use of Deep Learning techniques for solar energy prediction, in particular the Recurrent Neural Network (RNN), Long Short-Term Memory (LSTM) and Gated Recurrent Units (GRU). The proposed prediction methods are based on real meteorological data series of Errachidia province, from 2016 to 2018. A set of error metrics was adopted to evaluate the efficiency of these models for real-time photovoltaic forecasting, with the aim of achieving more reliable grid management and safe operation, in addition to improving the cost-effectiveness of the photovoltaic system. The results reveal that RNN and LSTM slightly outperform GRU thanks to their capacity to maintain long-term dependencies in time-series data.

1 Introduction

Currently, the global trend is to integrate renewable energies to generate electricity and to rethink the energy mix. In this regard, the use and expansion of renewable energy have become an important element in maintaining energy security, as well as in establishing an ecological and sustainable electrical system [1]. Nevertheless, the energy context is affected by price variations, changes in demand and the volatility of renewable energy production. In order to confront these main challenges of the energy mix, it is essential to have a good forecasting model for renewable energies, which is particularly useful for optimizing and adapting supply to demand. Such a model must reliably provide advance information on the availability of power, which helps to achieve the grid's stable operation and allows for optimal unit engagement and economic dispatching [2].

In this work, we concentrate on solar energy forecasting. Photovoltaic (PV) forecasting methods fall into two groups: direct and indirect prediction models. As described in [3, 4], in indirect prediction, solar radiation at different time scales is predicted with different methods and then converted to power based on the panel characteristics, while direct predictions are performed directly from the output power of the plant. Statistical, physical, artificial intelligence (AI) and hybrid approaches have been used in predicting PV solar energy. Thanks to their great learning and regression capabilities, AI and especially Deep Learning (DL) techniques [5] have been widely used in this area. They have the capability to extract in-depth features from PV power datasets and obtain more reliable predictive outputs. Authors in [6] have applied a Recurrent Neural Network (RNN) as a good tool for predicting solar irradiation time series using recorded meteorological data. Besides, in [7] the authors have proposed an RNN-based forecasting method for short-term PV power prediction. Moreover, another research work [8] has proposed a hybrid deep learning model using Wavelet Packet Decomposition (WPD) and Long Short-Term Memory (LSTM) to forecast PV power one hour ahead with an interval of five minutes. WPD is applied to split the PV power output time series and the LSTM is then implemented to forecast the high- and low-frequency subseries. The results of this study indicate that the WPD-LSTM method outperforms LSTM, Gated Recurrent Units (GRU), RNN, and Multilayer Perceptron (MLP) in various seasons and weather conditions. Authors in [9] have presented a deep LSTM based on historical power data to predict the PV system output power one hour ahead. The aim in [10] was to suggest a hybrid model using LSTM and an attention mechanism for short-term PV power forecasting; LSTM was employed to extract features from the historical data and to learn the long-term dependence information in sequence, and then to


apply the attention mechanism trained on the LSTM to target the extracted relevant features, which greatly enhanced the original predictive power of LSTM. An LSTM approach for short-term forecasts of global horizontal irradiance (GHI) one hour ahead and one day ahead has been applied in [11]. Furthermore, authors in [12] have implemented univariate and multivariate GRU models employing historical solar radiation, external meteorological variables, and cloud cover data to predict solar radiation. In [13], authors have presented a multivariate GRU to predict Direct Normal Irradiance (DNI) hourly, and the suggested method is evaluated against LSTM using historical irradiance data. In addition, a hybrid deep learning model has been introduced in [14] that combines a GRU neural network with an attention mechanism for solar radiation prediction. All of these research contributions are valuable. However, they are not able to identify the important parameters that would have an impact on the accuracy of predictions, nor to make real-time predictions for efficient and optimized management, given the fluctuating nature of output power generation as a function of meteorological conditions such as temperature, wind speed, cloud cover, atmospheric aerosol levels and humidity, which leads to high uncertainties on the output power of PV [15].

Besides, according to the forecast horizon, PV forecasts range from very short to long term. In general, researchers focus on short-term, hourly and daily forecasts as opposed to long-term ones. The former are of major significance for the management of PVs and the related security constraints (planning, control of PV storage and the reconciliation of the electricity market, providing the secure operation of generation and distribution services, and reinforcing the security of grid operation), while long-term forecasts are useful particularly for maintenance [16]. In this context, we study the efficiency of three DL models. We also focus in this study on real-time prediction of solar energy, in order to help planners, decision-makers, power plant operators and grid operators to make responsive decisions as early as possible, and to manage smart grid PV systems more reliably and efficiently [17]. Moreover, the use of real-time prediction allows adjusting to changes in production and reacting to complex events (exceptionally high or low production or consumption). In addition, it decreases the amount of operating reserves needed by the system, thus reducing system balancing costs [18]. The models we have selected give good results and also seem to be suitable for long-term forecasting thanks to their power as deep learning models and their ability to perform complex processing on huge data sets. Moreover, we outline that our work has been experimented on the Moroccan case, particularly in the region of Errachidia. In this perspective, Moroccan decision-makers have launched a global plan to improve the percentage of renewable energy in the energy mix and substantially improve energy efficiency, in view of increasing the percentage of electricity generation capacity from renewable energy sources (42% by 2020 and 52% by 2030) [19]. Therefore, we believe that the case of Morocco remains an interesting case study that could lead to important findings, since it is not only a promising future PV energy supplier, but also one of the leading countries in the global energy transition, particularly in Africa.

The remainder of this paper is organized into four sections. Section 2 gives an overview of our comparative methodology. Subsequently, Section 3 presents the approach applied to elaborate the forecast models. Section 4 presents and discusses the obtained results. Finally, Section 5 summarizes the conclusions and perspectives of this study.

2 Methodology

Solar energy prediction is a key element in enhancing the competitiveness of solar power plants in the energy market, and in decreasing reliance on fossil fuels in socio-economic development. Our work aims to accurately predict solar energy. For this purpose, we explore architectures of the RNN, LSTM and GRU algorithms, which are suitable for forecasting such time-series data, and we experiment with and evaluate them in Morocco's case, especially the Errachidia area. This section first presents the basic architecture of these recurrent neural networks before explaining the important steps that we follow to build our models and perform our comparative study.

2.1 Recurrent Neural Network architecture

A recurrent neural network (RNN) is a category of neural network used in sequential data prediction where the output is dependent on the input [20]. The RNN [21] is capable of capturing the dynamics of time-series data by storing information from previous computations in its internal memory. RNN has been applied in contexts where past values of the output make a significant contribution to the future. It is mainly used in forecasting applications because of its ability to process sequential data of different lengths.

The basic principle of RNNs is that a hidden neuron takes input from neurons at the preceding time step [22]. To this purpose, they employ cells represented by gates that influence the output using historical data observations. RNN is particularly efficient for learning the dynamic temporal behaviors exhibited in time-series data [23]. In an RNN, the hidden state h_t for a given input sequence x_t receives information feedback from the neurons of the preceding time step, multiplied by W_hh, the weight of the preceding hidden state h_{t-1}, and is calculated sequentially by Eq. 1, where x_t is the input at time t, W_xh is the weight of the actual input state, and tanh is the activation function. The output state y_t is computed according to Eq. 2, where W_hy is the weight at the output state.

h_t = tanh(W_hh h_{t-1} + W_xh x_t)  (1)

y_t = W_hy h_t  (2)
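To make Eq. 1 and Eq. 2 concrete, the following minimal NumPy sketch (not the authors' code; the weight shapes and the 100-unit hidden size are illustrative assumptions borrowed from Table 1) performs one forward step of a simple RNN cell.

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, W_hy):
    """One forward step of a simple RNN cell (Eq. 1 and Eq. 2)."""
    h_t = np.tanh(W_hh @ h_prev + W_xh @ x_t)  # Eq. 1: new hidden state
    y_t = W_hy @ h_t                           # Eq. 2: output state
    return h_t, y_t

# Illustrative dimensions: 4 input features (solar energy, temperature,
# humidity, pressure) and a 100-unit hidden state, as in Table 1.
n_features, n_hidden = 4, 100
rng = np.random.default_rng(0)
W_xh = rng.normal(scale=0.1, size=(n_hidden, n_features))
W_hh = rng.normal(scale=0.1, size=(n_hidden, n_hidden))
W_hy = rng.normal(scale=0.1, size=(1, n_hidden))   # single predicted value

h = np.zeros(n_hidden)
for x in rng.normal(size=(10, n_features)):        # a dummy 10-step sequence
    h, y = rnn_step(x, h, W_xh, W_hh, W_hy)
```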


The LSTM is a special kind of RNN, designed to avoid and resolve the vanishing gradient problems that limit the efficiency of the simple RNN [8]. An LSTM network has memory blocks that are connected through a succession of layers. In the LSTM cells, there are three types of gates: the input gate, the forget gate and the output gate. This makes it possible to achieve good results on a variety of time-series learning tasks, particularly on the nonlinear parts of a given dataset [24]. Each LSTM block handles the state of the block and the output; it operates at different time steps and transmits its output to the following block, and the final LSTM block then generates the sequential output [9]. Besides, LSTM is a robust algorithm, which allows the recurrent neural network to efficiently process time-series data. Its key component is the memory block, which was introduced to address the vanishing gradient disadvantage by memorizing network parameters for long durations [25]. An LSTM block gets an input sequence, after which the activation units are used by each gate to decide whether or not it is activated. This operation makes the change of state and the addition of information passing through the block conditional. In the training phase, the gates have weights that can be learned. In fact, the gates make the LSTM blocks smarter than conventional neurons and allow them to remember current sequences [10]. LSTM is flexible and estimates dependencies at different time scales thanks to its ability to process long task sequences and to identify long-range features. Basically, LSTM starts with a forget gate layer (f_t) that uses a sigmoid function combined with the preceding hidden layer (h_{t-1}) and the current input (x_t), as described in the following equations:

i_t = σ(W_i · [h_{t-1}, x_t] + b_i)  (3)

c̃_t = tanh(W_c · [h_{t-1}, x_t] + b_c)  (4)

f_t = σ(W_f · [h_{t-1}, x_t] + b_f)  (5)

o_t = σ(W_o · [h_{t-1}, x_t] + b_o)  (6)

c_t = f_t · c_{t-1} + i_t · c̃_t  (7)

h_t = o_t · tanh(c_t)  (8)

where i_t, c̃_t, f_t, o_t, c_t and h_t represent the input gate, the cell input activation, the forget gate, the output gate, the cell state and the hidden state, respectively. W_i, W_c, W_f and W_o represent their respective weight matrices, and b_i, b_c, b_f and b_o represent the biases. x_t is the input, h_{t-1} is the last hidden state, h_t is the internal state, and σ is the sigmoid function.
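As an illustration of Eq. 3 through Eq. 8, the sketch below (a hypothetical NumPy implementation, not the authors' code; weight and bias names mirror the symbols above) performs a single LSTM cell step.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM cell step following Eq. 3-8. W and b hold one matrix/vector per gate."""
    z = np.concatenate([h_prev, x_t])          # [h_{t-1}, x_t]
    i_t = sigmoid(W["i"] @ z + b["i"])         # Eq. 3: input gate
    c_hat = np.tanh(W["c"] @ z + b["c"])       # Eq. 4: cell input activation
    f_t = sigmoid(W["f"] @ z + b["f"])         # Eq. 5: forget gate
    o_t = sigmoid(W["o"] @ z + b["o"])         # Eq. 6: output gate
    c_t = f_t * c_prev + i_t * c_hat           # Eq. 7: new cell state
    h_t = o_t * np.tanh(c_t)                   # Eq. 8: new hidden state
    return h_t, c_t

# Illustrative sizes: 4 input features, 100 hidden units (Table 1).
n_in, n_hid = 4, 100
rng = np.random.default_rng(1)
W = {k: rng.normal(scale=0.1, size=(n_hid, n_hid + n_in)) for k in "icfo"}
b = {k: np.zeros(n_hid) for k in "icfo"}
h, c = np.zeros(n_hid), np.zeros(n_hid)
h, c = lstm_step(rng.normal(size=n_in), h, c, W, b)
```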
2.2 Gated Recurrent Units architecture

The GRU [26] is a particular type of recurrent neural network proposed by Cho in 2014, which serves the same purpose as the LSTM in resolving the long-term memory and backpropagation issues of the network. The GRU network involves the design of several cells that store important information and forget information that is deemed irrelevant for the future [27]. The feedback loops of the GRU network can be regarded as a time loop, because the output of the cell from the past period is taken as an input of the following cell in addition to the actual input. This key property allows the model to remember the patterns of interest and to predict sequential time-series datasets over time. Unlike the LSTM, the GRU [28] consists of only two gates: the update and reset gates. The GRU substitutes the forget gate and the input gate of the LSTM with an update gate that controls how much of the previous state information is moved to the current cell, while the reset gate defines whether new information will be combined with the preceding state. For this reason, GRU has proven to be one of the most efficient RNN techniques, because of its capacity to learn and acquire dependencies over the long term and over observations of varying length [29]. This characteristic is particularly useful for time-series data, and it is helpful in reducing the computational complexity [30]. GRU cells are described using the following equations:

z_t = σ(W_z · [h_{t-1}, x_t])  (9)

r_t = σ(W_r · [h_{t-1}, x_t])  (10)

ĥ_t = tanh(W · [r_t · h_{t-1}, x_t])  (11)

h_t = (1 − z_t) · h_{t-1} + z_t · ĥ_t  (12)

y_t = σ(W_o · h_t)  (13)

where z_t and r_t respectively correspond to the outputs of the update gate and the reset gate, while W_z, W_r, W and W_o respectively are the weights of each gate. σ and tanh respectively are the sigmoid and the hyperbolic tangent activation functions, and x_t is the network input at time t. h_t and h_{t-1} correspond to the hidden layer information at the actual and preceding time, and ĥ_t is the candidate state of the input. Initially, the network input x_t at time t and the hidden state h_{t-1} from the last state are used to compute the outputs of the reset gate and the update gate by Eq. 9 and Eq. 10. Next, after the reset gate r_t determines how much of the stored memory to keep, the candidate hidden layer ĥ_t is computed by Eq. 11, which is the new information at the actual time t. Then, by using Eq. 12, the update gate z_t determines how much information from the previous time is removed and how much information from the candidate hidden layer ĥ_t is stored at this time, and the hidden layer information h_t is subsequently obtained. Finally, the output of the GRU, computed according to Eq. 13 with a sigmoid function, is passed to the next GRU gated loop unit.
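The following hypothetical NumPy sketch (again illustrative rather than the authors' implementation) applies Eq. 9 through Eq. 13 for a single GRU step, using an element-wise product between the reset gate and the previous hidden state.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gru_step(x_t, h_prev, W_z, W_r, W, W_o):
    """One GRU cell step following Eq. 9-13 (bias terms omitted as in the text)."""
    zx = np.concatenate([h_prev, x_t])                           # [h_{t-1}, x_t]
    z_t = sigmoid(W_z @ zx)                                      # Eq. 9: update gate
    r_t = sigmoid(W_r @ zx)                                      # Eq. 10: reset gate
    h_cand = np.tanh(W @ np.concatenate([r_t * h_prev, x_t]))    # Eq. 11: candidate state
    h_t = (1.0 - z_t) * h_prev + z_t * h_cand                    # Eq. 12: new hidden state
    y_t = sigmoid(W_o @ h_t)                                     # Eq. 13: cell output
    return h_t, y_t

# Illustrative sizes: 4 input features, 100 hidden units (Table 1).
n_in, n_hid = 4, 100
rng = np.random.default_rng(2)
W_z, W_r, W = (rng.normal(scale=0.1, size=(n_hid, n_hid + n_in)) for _ in range(3))
W_o = rng.normal(scale=0.1, size=(1, n_hid))
h = np.zeros(n_hid)
h, y = gru_step(rng.normal(size=n_in), h, W_z, W_r, W, W_o)
```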
3 Our approach for predicting solar energy

In this study, we apply three different DL models, namely RNN, LSTM and GRU, to predict the PV solar energy output half an hour ahead. The aim of this study is to compare the relevance of the mentioned models in order to identify the best algorithm to be used for predicting solar energy. The process of the method employed in this work is illustrated in Figure 1. It is organized into four main stages: data collection, data pre-processing, model training and parameterization, and model testing and validation. This section details how each of these processing/selection steps was performed, as well as key details on the architecture and parameters of the developed models.


Figure 1: Our method for predicting solar energy

3.1 Dataset

The datasets used in this work represent meteorological data of Errachidia province, which is situated in a sunny region of Morocco and benefits from an important solar energy potential throughout the year. The data consist of measurements from three years (2016, 2017 and 2018), and they are used to predict the solar energy level based on a set of measurable weather conditions, with the purpose of analyzing the performance of the studied models.

3.2 Preprocessing and Feature Selection

Pre-processing and feature selection are fundamental steps in DL approaches for putting the collected data into an appropriate form. Pre-processing covers the numerous tasks and operations that convert the source data into clean data, in such a way that it can be easily integrated into DL models; it impacts the accuracy of the model and its results. Feature selection provides the most appropriate features that properly affect the learning process, and can minimize the number of variables to efficiently enhance the accuracy of the model and avoid costly computations. Therefore, in this work, we proceed with the following steps to prepare and select features from the source data: extraction of target data inputs, selection of relevant features, filling in the null values, normalization of the data, and adjustment of the steps to be taken into account for prediction.

The source data provide multiple features, but not all of them are important for use in the prediction. In our case, we have selected four important features: solar energy, temperature, humidity and pressure. These features are the variables most highly correlated with the targeted output (solar energy) according to Pearson correlations [31]. The objective of this step is to help develop the most accurate models, which learn from the most correlated data in order to give the most accurate forecasts. We note that we have selected the median in order to fill missing data.

The features of a given dataset are usually presented at different scales. To ensure that all these values are at the same scale and to add uniformity to our dataset, we employ Min-Max scaling, which transforms all values to between 0 and 1; this removes the noise from our data and simplifies the learning process of our models.
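The following sketch illustrates these preprocessing steps with pandas and scikit-learn. It is a hypothetical reconstruction, not the authors' code: the file name weather_errachidia.csv and the column names are assumed for illustration.

```python
import pandas as pd
from sklearn.preprocessing import MinMaxScaler

# Hypothetical file and column names; the real dataset covers 2016-2018
# at a 30-minute resolution.
df = pd.read_csv("weather_errachidia.csv", parse_dates=["timestamp"])

# Feature selection: inspect Pearson correlations with solar energy and keep
# the most correlated variables, as described in Section 3.2.
corr = df.corr(numeric_only=True)["solar_energy"].abs().sort_values(ascending=False)
print(corr)
features = ["solar_energy", "temperature", "humidity", "pressure"]

# Fill missing values with the median of each selected column.
data = df[features].fillna(df[features].median())

# Min-Max scaling to [0, 1].
scaler = MinMaxScaler()
scaled = pd.DataFrame(scaler.fit_transform(data), columns=features, index=df["timestamp"])
```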

Besides, the meteorological data are a time series with a time step of 30 minutes, and therefore the actual data values are needed as inputs for making predictions. Time-series data cannot use future values as input features; the inputs of a time-series model are the values of past features. In this work, we adjust our model to learn from the past in order to predict PV solar energy for the next half hour. This choice has been adopted because of the size of the data and in order to perform real-time predictions for an efficient and optimized management. Finally, based on best practices in the area of data analytics, the dataset is split into 80% and 20% portions, respectively used to train and test the developed models.
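As a sketch of how such lagged inputs and a chronological 80/20 split can be built (the exact window length is an assumption, since the paper does not specify it), one could write:

```python
import numpy as np

def make_supervised(series: np.ndarray, n_lags: int = 4):
    """Turn a scaled 2D array (time steps x features) into lagged inputs X and
    the next-step solar energy target y (column 0 assumed to be solar energy)."""
    X, y = [], []
    for t in range(n_lags, len(series)):
        X.append(series[t - n_lags:t])   # the previous n_lags half-hour steps
        y.append(series[t, 0])           # solar energy at the next step
    return np.array(X), np.array(y)

# Stand-in for the scaled feature array from the preprocessing sketch above.
values = np.random.default_rng(3).random((1000, 4))
X, y = make_supervised(values, n_lags=4)

# Chronological 80% / 20% train-test split (no shuffling for time series).
split = int(0.8 * len(X))
X_train, X_test = X[:split], X[split:]
y_train, y_test = y[:split], y[split:]
```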
3.3 Training and models parametrization

Each of the models developed must be parameterized during the training phase in order to be able to provide the most accurate forecasts, while taking into account the size of the batches. After setting different parameters for our DL models during training and evaluating the results obtained, the Adam optimizer proved to be the best optimizer and was then selected as the common parameter for all proposed DL models. Adam, which is widely used and works better than other stochastic optimizers in empirical results [32], allows the models to learn quickly. In addition, the tanh activation function has shown a good fit for all DL models. On the other hand, we note that our DL models have different architectures and that the number of layers and neurons, for example, has been fixed after tuning several values and selecting the ones that gave the best precision. Figure 2 illustrates the neural network architecture, and Table 1 shows details on the architectures and the key parameters of each model suggested for the studied dataset.

Figure 2: ANN architecture for predicting solar energy

Table 1: Architecture and parametrization of the models.

Model | Layers                 | Epochs | Activation function | Optimizer | Batch size
RNN   | RNN cell of 100 units  | 20     | Tanh                | Adam      | 12719
LSTM  | LSTM cell of 100 units | 20     | Tanh                | Adam      | 12719
GRU   | GRU cell of 100 units  | 20     | Tanh                | Adam      | 12719
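Under the parameters listed in Table 1, a Keras definition of the three models could look like the following sketch (a hedged reconstruction: the input window shape and the loss function are assumptions, since the paper reports only the layer, epoch, activation, optimizer and batch-size settings).

```python
import tensorflow as tf
from tensorflow.keras import Sequential
from tensorflow.keras.layers import SimpleRNN, LSTM, GRU, Dense

def build_model(cell: str, n_lags: int = 4, n_features: int = 4) -> tf.keras.Model:
    """One recurrent cell of 100 units with tanh activation, followed by a
    single-output Dense layer, as suggested by Table 1."""
    layer = {"RNN": SimpleRNN, "LSTM": LSTM, "GRU": GRU}[cell]
    model = Sequential([
        layer(100, activation="tanh", input_shape=(n_lags, n_features)),
        Dense(1),                       # predicted solar energy for the next half hour
    ])
    model.compile(optimizer="adam", loss="mse")  # loss assumed; Adam as in Table 1
    return model

# Training settings from Table 1 (20 epochs, batch size 12719), using the
# X_train/y_train arrays from the windowing sketch above:
model = build_model("LSTM")
# model.fit(X_train, y_train, epochs=20, batch_size=12719,
#           validation_data=(X_test, y_test))
```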


3.4 Performance metrics

To evaluate the performance of the proposed DL models for PV solar energy prediction, a set of widely used evaluation metrics is employed to assess the accuracy of the forecasting results. In this step, we have applied the metrics most suitable to the context of DL and regression problems. For this purpose, five performance metrics, whose equations are presented below, have been employed for model testing as well as training: MAE (Mean Absolute Error), MSE (Mean Square Error), RMSE (Root Mean Square Error), Max Error (ME) and R squared (R2).

MAE = (1/n) Σ_{j=1}^{n} |y_j − ŷ_j|  (14)

MSE = (1/n) Σ_{j=1}^{n} (y_j − ŷ_j)²  (15)

RMSE = sqrt( (1/n) Σ_{j=1}^{n} (y_j − ŷ_j)² )  (16)

ME = max_{1≤j≤n} |y_j − ŷ_j|  (17)

R² = 1 − SSreg / SStot  (18)

where y is the actual output, ŷ is the predicted output and n is the number of samples. Eq. 14 computes MAE as the average of the absolute errors (absolute values indicating the differences between the actual and the predicted values). Eq. 15 shows the MSE, which is the average of the squared errors (differences between the real values and what is estimated). Eq. 16 calculates RMSE, which is the square root of the MSE and is applied in cases with small errors [33], whereas Eq. 17 measures the maximum residual error ME and reveals the worst error between the actual and predicted values. In addition, to calculate the predictive accuracy of the proposed models, we also used R-squared, a statistical metric for regression models that identifies the proportion of the variance of the dependent variable that can be explained by the independent variables. Expressly, Eq. 18 indicates how well the data correspond to the regression model (the goodness of fit), where SSreg is the sum of the squares due to the regression and SStot is the total sum of the squares [34]. Eventually, in the different steps outlined in this section, we employed a set of technical tools for the implementation of the studied algorithms based on Python libraries, including Pandas, NumPy, SciPy and Matplotlib in a Jupyter Notebook, in addition to Scikit-learn (sklearn) and TensorFlow.
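These five metrics can be computed with scikit-learn and NumPy as in the sketch below; the arrays y_true and y_pred are placeholders for the test targets and model predictions.

```python
import numpy as np
from sklearn.metrics import (mean_absolute_error, mean_squared_error,
                             max_error, r2_score)

def evaluate(y_true: np.ndarray, y_pred: np.ndarray) -> dict:
    """Compute the error metrics of Section 3.4 for one model."""
    mse = mean_squared_error(y_true, y_pred)             # Eq. 15
    return {
        "MAE": mean_absolute_error(y_true, y_pred),      # Eq. 14
        "MSE": mse,
        "RMSE": np.sqrt(mse),                            # Eq. 16
        "ME": max_error(y_true, y_pred),                 # Eq. 17
        "R2": r2_score(y_true, y_pred),                  # Eq. 18 (scikit-learn's convention)
    }

# Placeholder arrays; in practice y_pred would come from model.predict(X_test).
y_true = np.array([10.0, 12.5, 9.8, 11.2])
y_pred = np.array([10.4, 12.0, 9.5, 11.6])
print(evaluate(y_true, y_pred))
```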

4 Results and discussion

In this section, we present and discuss the results of our three studied models tested for PV solar energy forecasting. It should be noted that although R-squared (R2) is a commonly used metric, it is not regarded as an effective indicator of how well the models fit the data [35]. A low or high R2 value does not always indicate that the model is wrong or that it is automatically right [36]. Due to the controversy concerning the efficiency of R2 as an appropriate metric for determining the best regression model [35, 36], we additionally evaluate our results using other statistical metrics that are the most widely used: MAE, MSE, Max Error and RMSE. The RMSE remains the most frequently employed in the regression area [37]. Table 2 shows the metric values corresponding to real-time solar energy prediction for the studied models.

Considering the values presented in Table 2, we can see the results of our experiments for one-half hour ahead prediction with the recurrent neural networks RNN, LSTM and GRU. In spite of their architectural differences, RNN and LSTM have demonstrated similar performance, showing that the better model is strongly dependent on the task. Thanks to their ability to handle long-term dependencies in sequential information for time-series prediction applications, both RNN and LSTM perform better, with very high accuracy, as shown by their related R2 values, which are respectively equal to 94.68% and 94.26%. GRU gives slightly lower performance than RNN and LSTM, with a value of 90.32%. It also produces larger errors than RNN and LSTM: we observe that for GRU, 2.74, 15.53, 3.94 and 37.30 are given as the MAE, MSE, RMSE and ME values, respectively. Furthermore, these measures are nearly similar for RNN and LSTM, with MAE=1.83, MSE=8.53, RMSE=2.92 and ME=30.42 for RNN, and MAE=1.90, MSE=9.20, RMSE=3.03 and ME=29.73 for LSTM. Figure 3, Figure 4, Figure 5, Figure 6, Figure 7 and Figure 8 show their training and testing curves.

In reality, more accurate prediction results mean less uncertainty and fewer fluctuations in PV solar energy generation. Accuracy is a crucial factor in the efficient planning and use of electrical energy and of energy systems with high PV power penetration. In short, as shown in Table 2, the prediction capability of RNN and LSTM seems very reliable compared to GRU under meteorological conditions, and they are better adapted to meet real-time PV solar energy prediction needs with high accuracy. Due to their ability to minimize errors during the learning and test processes, they can reduce uncertainties related to the operation and planning of PV energy integrated management systems.

Table 2: Metrics related to the PV solar energy dataset.

Model | Data     | MAE  | MSE   | RMSE | ME    | R2 (%)
RNN   | Training | 1.73 | 7.22  | 2.68 | 34.06 | 95.62
RNN   | Test     | 1.83 | 8.53  | 2.92 | 30.42 | 94.68
LSTM  | Training | 1.77 | 7.65  | 2.76 | 35.42 | 95.36
LSTM  | Test     | 1.90 | 9.20  | 3.03 | 35.42 | 94.26
GRU   | Training | 2.63 | 13.69 | 3.70 | 34.26 | 91.70
GRU   | Test     | 2.74 | 15.53 | 3.94 | 37.30 | 90.32

Figure 3: RNN training curves of solar energy


Figure 4: RNN testing curves of solar energy

Figure 5: LSTM training curves of solar energy

Figure 6: LSTM testing curves of solar energy

Figure 7: GRU training curves of solar energy

Besides, while RMSE is regarded in the literature as a good metric for evaluating and comparing regression models, it is still difficult to interpret properly. Therefore, we also normalize the RMSE values to provide a more meaningful representation of our results, which allows for more meaningful conclusions. The normalized RMSE (NRMSE) represents the ratio of the RMSE value to the range (the maximum value minus the minimum value) of the actual values. Table 3 shows a synthesis of the results based on this metric for real-time solar energy prediction. We can observe that RNN and LSTM provide good and similar NRMSE values of 0.055 and 0.058, respectively.

Figure 8: GRU testing curves of solar energy

Table 3: Results synthesis for real-time prediction.

Model | Data     | NRMSE
RNN   | Training | 0.055
RNN   | Test     | 0.066
LSTM  | Training | 0.058
LSTM  | Test     | 0.068
GRU   | Training | 0.077
GRU   | Test     | 0.089
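As defined above, NRMSE is the RMSE divided by the range of the actual values; a minimal sketch (with placeholder arrays, not the paper's data) is:

```python
import numpy as np
from sklearn.metrics import mean_squared_error

def nrmse(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Normalized RMSE: RMSE divided by the range of the actual values."""
    rmse = np.sqrt(mean_squared_error(y_true, y_pred))
    return rmse / (y_true.max() - y_true.min())

# Placeholder arrays for illustration only.
y_true = np.array([10.0, 12.5, 9.8, 11.2])
y_pred = np.array([10.4, 12.0, 9.5, 11.6])
print(f"NRMSE = {nrmse(y_true, y_pred):.3f}")
```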

5 Conclusion and Perspectives

With the growing deployment of solar energy in modern grids, PV solar energy prediction has become increasingly important to deal with the volatility and uncertainty associated with solar power in these systems. In the literature, different models have been proposed for the prediction of PVs using the capacity of AI techniques. However, the majority of the research contributions in this area offer different techniques for making separate short-term and long-term forecasts, and do not focus on real-time forecasts. For this reason, we are working to find a model capable of producing continuous real-time forecasts using meteorological data, which is primordial for providing important decision support to power system operators, to ensure more efficient management and secure operation of the grid and to enhance the cost-effectiveness of the PV system. In this study, we investigated the efficiency of three different DL models: RNN, LSTM and GRU. The models were tested with data from a Moroccan region to make real-time PV forecasts. Further, the analysis of the results was based on six metrics, MAE, MSE, RMSE, ME, R2 and NRMSE, to determine both forecast accuracy and margin of error. The results prove the efficiency of RNN and LSTM for real-time PV prediction, thanks to their ability to address long-term dependencies in sequential information for time-series regression problems, compared to GRU, which gives slightly lower performance


than RNN and LSTM. From these findings, it can be concluded that RNN and LSTM showed very high accuracy and low errors; they are reliable models that can minimize errors in the learning and testing process, in addition to reducing the uncertainties associated with the operation and planning of integrated PV energy management systems. Moreover, they are promising techniques that also appear to be suitable for long-term PV forecasting and should be studied and recommended as a possible unified and standard tool for real-time, short-term and long-term PV prediction. As future work of this study, we plan to enhance these models by employing other approaches, to extend them to longer-term PV forecasting, as well as to study other promising deep learning methods with a view to providing and deploying a powerful model for PV energy prediction.

Conflict of Interest The authors declare no conflict of interest.

References

[1] E. Hache, "Do renewable energies improve energy security in the long run?" Septembre 2016, Les cahiers de l'économie - no 109, Série Recherche, 2016.

[2] M.G. De Giorgi, P.M. Congedo, M. Malvoni, "Photovoltaic power forecasting using statistical methods: Impact of weather data," IET Science, Measurement and Technology, 8(3), 90–97, 2014. DOI:10.1049/iet-smt.2013.0135.

[3] P. Li, K. Zhou, S. Yang, "Photovoltaic Power Forecasting: Models and Methods," 2nd IEEE Conference on Energy Internet and Energy System Integration, EI2 2018 - Proceedings, 1–6, 2018. DOI:10.1109/EI2.2018.8582674.

[4] U.K. Das, K.S. Tey, M. Seyedmahmoudian, S. Mekhilef, M.Y.I. Idris, W. Van Deventer, B. Horan, A. Stojcevski, "Forecasting of photovoltaic power generation and model optimization: A review," Renewable and Sustainable Energy Reviews, 81(August 2017), 912–928, 2018. DOI:10.1016/j.rser.2017.08.017.

[5] A. Youssef, M. El-telbany, A. Zekry, "The role of artificial intelligence in photovoltaic systems design and control: A review," Renewable and Sustainable Energy Reviews, 78(November), 72–79, 2017. DOI:10.1016/j.rser.2017.04.046.

[6] A.P. Yadav, A. Kumar, L. Behera, "RNN Based Solar Radiation Forecasting Using Adaptive Learning Rate," 442–452, 2013.

[7] G. Li, H. Wang, S. Zhang, J. Xin, H. Liu, "Recurrent Neural Networks Based Photovoltaic Power Forecasting Approach," 1–17, 2019.

[8] P. Li, K. Zhou, X. Lu, S. Yang, "A hybrid deep learning model for short-term PV power forecasting," Applied Energy, (November), 114216, 2019. DOI:10.1016/j.apenergy.2019.114216.

[9] M. Abdel-Nasser, K. Mahmoud, "Accurate photovoltaic power forecasting models using deep LSTM-RNN," Neural Computing and Applications, 31(7), 2727–2740, 2019. DOI:10.1007/s00521-017-3225-z.

[10] H. Zhou, Y. Zhang, L. Yang, Q. Liu, K. Yan, Y. Du, "Short-Term Photovoltaic Power Forecasting Based on Long Short Term Memory Neural Network and Attention Mechanism," IEEE Access, 7, 78063–78074, 2019. DOI:10.1109/ACCESS.2019.2923006.

[11] Y. Yu, J. Cao, J. Zhu, "An LSTM Short-Term Solar Irradiance Forecasting Under Complicated Weather Conditions," IEEE Access, 7, 145651–145666, 2019. DOI:10.1109/ACCESS.2019.2946057.

[12] J. Wojtkiewicz, M. Hosseini, R. Gottumukkala, T.L. Chambers, "Hour-Ahead Solar Irradiance Forecasting Using Multivariate Gated Recurrent Units," 1–13, 2019. DOI:10.3390/en12214055.

[13] M. Hosseini, S. Katragadda, J. Wojtkiewicz, R. Gottumukkala, "Direct Normal Irradiance Forecasting Using Multivariate Gated Recurrent Units," 1–15, 2020. DOI:10.3390/en13153914.

[14] K. Yan, H. Shen, L. Wang, H. Zhou, M. Xu, Y. Mo, "Short-Term Solar Irradiance Forecasting Based on a Hybrid Deep Learning Methodology," 1–13.

[15] M.Q. Raza, M. Nadarajah, C. Ekanayake, "On recent advances in PV output power forecast," Solar Energy, 136(September 2019), 125–144, 2016. DOI:10.1016/j.solener.2016.06.073.

[16] C. Wan, J. Zhao, Y. Song, Z. Xu, J. Lin, Z. Hu, "Photovoltaic and solar power forecasting for smart grid energy management," CSEE Journal of Power and Energy Systems, 1(4), 38–46, 2016. DOI:10.17775/cseejpes.2015.00046.

[17] A. Tuohy, J. Zack, S.E. Haupt, J. Sharp, M. Ahlstrom, S. Dise, E. Grimit, C. Mohrlen, M. Lange, M.G. Casado, J. Black, M. Marquis, C. Collier, "Solar forecasting: Methods, challenges, and performance," IEEE Power and Energy Magazine, 13(6), 50-59, 2015. DOI:10.1016/j.111799.

[18] G. Notton, M.L. Nivet, C. Voyant, C. Paoli, C. Darras, F. Motte, A. Fouilloy, "Intermittent and stochastic character of renewable energy sources: Consequences, cost of intermittence and benefit of forecasting," Renewable and Sustainable Energy Reviews, 87(December 2016), 96–105, 2018. DOI:10.1016/j.rser.2018.02.007.

[19] A.A. Merrouni, F.E. Elalaoui, A. Mezrhab, A. Mezrhab, A. Ghennioui, "Large scale PV sites selection by combining GIS and Analytical Hierarchy Process. Case study: Eastern Morocco," Renewable Energy, 2017. DOI:10.1016/j.renene.2017.10.044.

[20] A. Alzahrani, P. Shamsi, C. Dagli, M. Ferdowsi, "Solar Irradiance Forecasting Using Deep Neural Networks," Procedia Computer Science, 114, 304–313, 2017. DOI:10.1016/j.procs.2017.09.045.

[21] A. Alzahrani, P. Shamsi, M. Ferdowsi, "Solar Irradiance Forecasting Using Deep Recurrent Neural Networks," in 2017 IEEE 6th International Conference on Renewable Energy Research and Applications (ICRERA), 988-994, 2017.

[22] Y. LeCun, Y. Bengio, G. Hinton, "Deep learning," Nature, 521(7553), 436-444, 2015.

[23] H. Wang, Z. Lei, X. Zhang, B. Zhou, J. Peng, "A review of deep learning for renewable energy forecasting," Energy Conversion and Management, 198(July), 111799, 2019. DOI:10.1016/j.enconman.2019.111799.

[24] X. Qing, Y. Niu, "Hourly day-ahead solar irradiance prediction using weather forecasts by LSTM," Energy, 148, 461–468, 2018. DOI:10.1016/j.energy.2018.01.177.

[25] H. Liu, X. Mi, Y. Li, "Comparison of two new intelligent wind speed forecasting approaches based on Wavelet Packet Decomposition, Complete Ensemble Empirical Mode Decomposition with Adaptive Noise and Artificial Neural Networks," Energy Conversion and Management, 155(July 2017), 188–200, 2018. DOI:10.1016/j.enconman.2017.10.085.

[26] Y. Wang, W. Liao, Y. Chang, "Gated Recurrent Unit Network-Based Short-Term Photovoltaic Forecasting," Energies, 11(8), 2163, 2018. DOI:10.3390/en11082163.

[27] K. Cho, B. van Merrienboer, C. Gulcehre, F. Bougares, H. Schwenk, Y. Bengio, "Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation," arXiv preprint arXiv:1406.1078, 2014.

[28] M.C. Sorkun, C. Paoli, Ö.D. Incel, "Time series forecasting on solar irradiation using deep learning," 2017 10th International Conference on Electrical and Electronics Engineering (ELECO), Bursa, 151-155, 2017.

[29] H.M. Lynn, S.B.U.M. Pan, "A Deep Bidirectional GRU Network Model for Biometric Electrocardiogram Classification Based on Recurrent Neural Networks," IEEE Access, 7, 145395–145405, 2019. DOI:10.1109/ACCESS.2019.2939947.

[30] Z. Che, S. Purushotham, K. Cho, D. Sontag, Y. Liu, "Recurrent Neural Networks for Multivariate Time Series with Missing Values," Scientific Reports, 1–12, 2018. DOI:10.1038/s41598-018-24271-9.

[31] H. Zhou, Z. Deng, Y. Xia, M. Fu, "A new Sampling Method in Particle Filter Based on Pearson Correlation Coefficient," Neurocomputing, 2016. DOI:10.1016/j.neucom.2016.07.036.

[32] D.P. Kingma, J. Ba, "Adam: A Method for Stochastic Optimization," Proceedings of the 3rd International Conference on Learning Representations (ICLR), abs/1412.6980, 2014.

[33] F. Wang, Z. Mi, S. Su, H. Zhao, "Short-term solar irradiance forecasting model based on artificial neural network using statistical feature parameters," Energies, 5(5), 1355–1370, 2012. DOI:10.3390/en5051355.

[34] B.J. Miles, "R Squared, Adjusted R Squared," Wiley, 2014.

[35] C. Onyutha, "From R-squared to coefficient of model accuracy for assessing "goodness-of-fits"," Geoscientific Model Development Discussions, 1-25, 2020. DOI:10.5194/gmd-2020-51.

[36] D.B. Figueiredo Filho, J.A.S. Júnior, E.C. Rocha, "What is R2 all about?," Leviathan (São Paulo), 3, 60–68, 2011.

[37] I. Jebli, F. Belouadha, M.I. Kabbaj, "The forecasting of solar energy based on Machine Learning," 2020 International Conference on Electrical and Information Technologies (ICEIT), IEEE, 1-8, 2020. DOI:10.1109/ICEIT48248.2020.9113168.
