
Journal of Advances in Mathematics and Computer Science 28(1): 1-17, 2018; Article no.JAMCS.42625
ISSN: 2456-9968 (Past name: British Journal of Mathematics & Computer Science, Past ISSN: 2231-0851)

Fuzzy Risk Measure for Operational Risk

Assem Tharwat1, Ramadan A. ZeinEldin2, Hamiden Abd El-Wahed Khalifa3 and Ahmed M. Saleim4*
1 College of Business Administration, American University of the Emirates, Dubai, United Arab Emirates.
2 Deanship of Scientific Research, King Abdul Aziz University, Kingdom of Saudi Arabia.
3 Institute of Statistical Studies & Research, Cairo University, Egypt.
4 Institute of Statistical Studies & Research, Cairo University, Egypt.
*Corresponding author: E-mail: [email protected]

Authors' contributions
This work was carried out in collaboration between all authors. Author AMS designed the study, performed the statistical analysis and wrote the first draft of the manuscript. Authors AT and RAZE managed the analyses of the study. Author HAEWK managed the literature searches. All authors read and approved the final manuscript.

Article Information
DOI: 10.9734/JAMCS/2018/42625
Editor(s): (1) Dr. Morteza Seddighin, Professor, Indiana University East, Richmond, USA.
Reviewers: (1) Farzad Tahriri, Tehran University, Iran. (2) Neven Saleh Khalil Saeh, Higher Institute of Engineering, Egypt.
Complete peer review history: http://www.sciencedomain.org/review-history/25469
Received: 26th April 2018; Accepted: 4th July 2018; Published: 9th July 2018

Original Research Article

Abstract
Operational risk is one of the most hazardous types of risk banks face. Banks must exercise caution and reserve capital to meet these risks. Value at Risk (VaR) and Expected Shortfall (ES) are used to measure operational risk and to estimate the capital required to meet it. VaR is not sub-additive and measures risk at a single point of the loss distribution, i.e. it does not measure the risk in the tail, while ES examines only the extreme tail of the loss distribution. At the same time, applying VaR or ES requires sufficient historical data. On the other hand, banks need an early warning indicator to monitor the movement of the capital needed to meet operational risk and to take corrective action in time. In this paper we introduce a new risk measure based on fuzzy numbers. Its main advantage is that banks can use it as an early warning indicator for the capital required to meet risk; at the same time it can be used as an alternative to VaR while having more desirable properties. The application of the proposed risk measure shows that the obtained results are more reliable and accurate, and that the proposed measure has more desirable properties than VaR and ES.

Keywords: Operational risk; loss distribution approach; risk measure; value at risk; expected shortfall; fuzzy number; α-cut.

1 Introduction
To find the capital needed to meet operational risk under the Advanced Measurement Approach, two things are required. The first is to estimate the expected operational loss; the most common and theoretically grounded method for measuring operational risk is the loss distribution approach. The second, once the expected loss is found, is to map this loss to a single number that represents the capital required to meet future losses. This is done using what is called a risk measure.
The most common risk measures are Value at Risk (VaR) and Expected Shortfall (ES). VaR is calculated at a high confidence level (the 99.9th percentile) over a given period; at the same time, the high-severity/low-frequency nature of operational risk data leaves few data points in the tail of the loss distribution. This means that VaR does not estimate or predict losses due to extreme movements: it provides an estimate at a particular point of the loss distribution and therefore does not measure the risk in the tail. In addition, the standard VaR methodology rests on a normality assumption, so many authors have criticized its use when losses are not normally distributed, which is the case for operational risk, [1] and [2]. In 2014 the Basel Committee on Banking Supervision recommended the use of Expected Shortfall. Since then ES has been widely accepted as a risk measure, but it has been criticized for issues relating to back-testing, [3] and [4]. Beyond the criticisms of VaR and ES and the requirement of sufficient historical data, banks need an indicator with which to monitor the amount of capital required to meet risk. In this paper, fuzzy numbers are used to introduce a new risk measure that estimates the capital required to meet operational risk and can also be used as an early warning indicator.

The remainder of this paper is organized as follows. Section 2 provides the background for risk measures: the definition of a risk measure, a literature review and survey, risk measure properties and popular risk measures. Section 3 reviews the fuzzy set preliminaries used in the paper. Section 4 describes why a new risk measure is needed. Section 5 presents the solution methodology: the proposed risk measure, its properties, and its use to determine the capital required to meet operational risk or as an early warning indicator. Applications are presented and discussed in Section 6. Conclusions are given in Section 7.

2 Risk Measures
Risk measurement is the most important step in risk management and plays a crucial role for all parties involved in the decision-making process (regulators, supervisors, risk managers and top management, public or private companies, and investors), [5,6] and [4].

a) Regulators use risk measures to provide rules that guarantee the stability of the system and to determine the risk capital a bank is required to hold; the Basel II and Basel III Accords illustrate the importance of choosing a proper risk measure for regulatory purposes.
b) Supervisors need risk measures to ensure that bank activities respect the legal framework.
c) Risk managers and top management: the decisions made by a risk manager are affected by the risk measure and its accuracy, and top management uses risk measures for management purposes.
d) Public or private companies use risk measures to manage the wealth of their customers.
e) Investors use risk measures to make investment decisions, since these decisions depend on the return as well as the degree of risk.

Based on the aforementioned, having an accurate risk measure to estimate the required capital, and an early warning indicator to monitor the amount of capital required to meet operational risk, is very important.
2.1 Risk measure definition
A risk measure provides the decision maker with a system for assigning a single value to a set of random losses in order to make informed risk decisions, [7,6] and [3]. There are many definitions of a risk measure. The most common and most formal one defines a risk measure as a function that assigns a single numerical value to a random loss, i.e. a map from a set of loss distributions $\Omega$ to the real numbers $\mathbb{R}$, $\rho : \Omega \rightarrow \mathbb{R}$, [8,6,3,4] and [9].

2.2 Literature review and survey
In 1930 Cramér introduced ruin theory. Ruin theory (sometimes called risk theory) uses mathematical models to describe an insurer's vulnerability to ruin/insolvency; in such models the key quantity of interest is the probability of ruin. The classical model has two opposing cash flows: incoming cash premiums arriving at a constant rate and outgoing claims following a Poisson process. The central object of the model is the probability of ruin, [3] and https://en.wikipedia.org/wiki/Ruin_theory.

In 1952 Markowitz introduced modern portfolio theory, which uses the variance of the profit and loss as the risk measure. It suffers from two drawbacks: it assumes that the risks are random variables with finite variance, and it supposes that the distributions are symmetric around the mean, since the variance does not distinguish between positive and negative deviations from the mean, [10].

Value at Risk (VaR) has been widely used as a single aggregate risk measure since its introduction in the 1990s. VaR is the most widely used risk measure in banks and can be applied to all risk types. It captures downside risk, so it is used as the basis for measuring risk-based capital (regulatory or economic capital) to ensure that the bank never becomes insolvent. VaR gives the potential or maximum loss at a specific confidence level, [11] and [1].

In 1997 Wang, Young and Panjer started the first axiomatic approach to risk measures. They present four axioms to describe the behaviour of market insurance prices (the pure premium for an insurance risk) and propose an additional axiom for reducing compound risks. Thus insurance risk measures satisfy five axioms: law invariance, monotonicity, comonotonic additivity, continuity and scale normalization, [12] and [6].

In 1998 Artzner, Delbaen, Eber and Heath presented a unified framework for the construction, analysis and implementation of measures of risk (market and non-market risks, with a focus on market risk). Based on the principle that bygones are bygones, they define risk as the variability of the future value of a position due to uncertain events, rather than as the change in the position's value between two points in time. To achieve their goals they define an acceptance set (a set of acceptable future net worths), which is the basic object used to decide whether to accept or reject a risk; that is, the risk measure can be defined by describing how close to or far from acceptance a position is. The measurement of the risk of a currently held position or portfolio is therefore whether its future value belongs to the set of acceptable risks. They then define the measure of risk of a position with an unacceptable future net worth as the minimum capital that must be invested so that the future value of the new position or portfolio becomes acceptable.
They also define four axioms or properties (translation invariance, sub-additivity, positive homogeneity and monotonicity) for a risk measure, and call a risk measure that satisfies these axioms a coherent risk measure. They argue that, to effectively regulate or measure risk, these axioms should hold for any risk measure, [13].

In 2013 Heyde, Kou and Peng extended coherent risk measures by relaxing the sub-additivity axiom and introducing two new axioms, scenario-wise comonotonic sub-additivity and empirical law invariance. A risk statistic satisfying translation invariance, positive homogeneity and monotonicity (from the coherent risk measure axioms) together with the two new axioms is called a natural risk statistic, [6] and [14].

In 2017 Eliza Khemissi [15] introduced an additional axiom for measures of risk and reviewed which risk measures fulfil it, in order to enrich Artzner's and other axiom systems.

2.3 Risk measure properties
The desirable properties of risk measures have been established as sets of axioms. The axioms are stated in groups as follows.

Coherent risk measure: a risk measure satisfying the following axioms is called a coherent risk measure:
- Translation invariance
- Sub-additivity
- Positive homogeneity
- Monotonicity

Insurance risk measure: a risk measure satisfying the following axioms is called an insurance risk measure:
- Monotonicity
- Comonotonic additivity
- Law invariance
- Continuity
- Scale normalization

Natural risk statistic: a risk measure satisfying the following axioms is called a natural risk statistic:
- Translation invariance
- Positive homogeneity
- Monotonicity
- Scenario-wise comonotonic sub-additivity
- Empirical law invariance

Convex risk measure: a risk measure satisfying the following axioms is called a convex risk measure:
- Translation invariance
- Monotonicity
- Convexity

In addition to these sets of axioms, two further significant properties (elicitability and robustness) should be considered, [3].

2.3.1 The popular risk measure axioms
The popular properties (axioms) for risk measures are exhibited and clarified below. Let $\Omega = \{X_1, X_2, \dots, X_n\}$ be the set of random variables representing the random losses associated with operational risk ($X_i$ is the loss in the $i$th operational risk cell) and let $\rho : \Omega \rightarrow \mathbb{R}$ be the risk measure.

Axiom 1: Translation invariance
A risk measure $\rho$ is translation invariant if for all loss variables $X \in \Omega$ and $m \in \mathbb{R}$ it holds that $\rho(X + m) = \rho(X) + m$. Translation invariance means that if the loss increases (decreases) by the amount $m$, the risk increases (decreases) by the same amount $m$. Translation invariance determines the capital needed to compensate the risk, [3,6,16,13] and [4].

Axiom 2: Sub-additivity
A risk measure $\rho$ is sub-additive if for all loss variables $X, Y \in \Omega$ it holds that $\rho(X + Y) \leq \rho(X) + \rho(Y)$. Sub-additivity means that merging risks does not create extra risk, which is a natural requirement, [15,3,6] and [13].

Axiom 3: Positive homogeneity
A risk measure $\rho$ is positively homogeneous if for all loss variables $X \in \Omega$ and $h \in \mathbb{R}$, $h \geq 0$, it holds that $\rho(hX) = h\rho(X)$. Positive homogeneity means that the risk measure of a position doubles if the position doubles, [9,3].
Axiom 4: Monotonicity
A risk measure $\rho$ is monotonic if for all loss variables $X, Y \in \Omega$ with $X \leq Y$ it holds that $\rho(X) \leq \rho(Y)$. In terms of payoffs: if one portfolio has payoff "y", a second has payoff "x", and the first dominates the second, then the portfolio with payoff "y" must have risk less than or equal to that of the portfolio with payoff "x". Monotonicity is the minimum requirement for a rational risk measure; it ensures that higher losses lead to a higher risk measure, [15,17,3,6] and [4].

There is another property, called comonotonic additivity, which can be considered complementary to the sub-additivity property (Axiom 2), [3].

Axiom 5: Comonotonic additivity
A risk measure $\rho$ is comonotonic additive if for any comonotonic random variables $X$ and $Y$ it holds that $\rho(X + Y) = \rho(X) + \rho(Y)$, [3].

A risk measure is called a coherent risk measure if it satisfies axioms 1, 2, 3 and 4. A drawback of coherent risk measures is that they are not robust, [3] and [6].

Axiom 6: Law invariance
A risk measure $\rho$ is called law invariant if, whenever two random variables $X$ and $Y$ have the same distribution, $\rho(X) = \rho(Y)$, i.e. $P(X \leq t) = P(Y \leq t)\ \forall t \in \mathbb{R} \Rightarrow \rho(X) = \rho(Y)$. This axiom means that the risk measure depends entirely on the distribution of the random variable associated with it, [9,6] and [4].

Axiom 7: Continuity
A risk measure $\rho$ is continuous if for $X \in \Omega$ and $d \in \mathbb{R}$, $d \geq 0$, it satisfies
$\lim_{d \to 0} \rho((X - d)^{+}) = \rho(X)$ and $\lim_{d \to \infty} \rho(\min(X, d)) = \rho(X)$, where $(X - d)^{+} = \max(X - d, 0)$, [6] and [12].
The first condition means that a small truncation in the loss variable results in a small change in capital. The second condition means that the risk measure can be estimated by approximating the loss by bounded variables, [12].

Axiom 8: Scale normalization
A risk measure $\rho$ is scale normalized if $\rho(1) = 1$, [6].

Any risk measure satisfying axioms 4, 5, 6, 7 and 8 is called an insurance risk measure. An insurance risk measure is not always sub-additive, does not incorporate scenario analysis and does not allow comparison of different distortion functions or different priors, [12] and [6].

Axiom 9: Scenario-wise comonotonic sub-additivity (also called comonotonic sub-additivity)
A risk measure $\rho$ is scenario-wise comonotonic sub-additive if $\rho(X + Y) \leq \rho(X) + \rho(Y)$ for any $X, Y$ that are scenario-wise comonotonic. Scenario-wise comonotonicity is the counterpart, for data under scenarios, of the notion of comonotonicity for two random variables; it means that $X$ and $Y$ move in the same direction, [6] and [14].

Axiom 10: Empirical law invariance (also called permutation invariance)
Empirical law invariance is the counterpart of law invariance (Axiom 6) for risk statistics based on observed data. It means that if two data sets $X$ and $Y$ have the same empirical distribution under each scenario, then $X$ and $Y$ give the same risk measurement, [14] and [6].

A natural risk statistic satisfies axioms 1, 3, 4, 9 and 10, [6] and [14].

Axiom 11: Convexity
Axioms 2 and 3 (sub-additivity and positive homogeneity) are relaxed to a single convexity axiom, equation (1). The convexity axiom relaxes sub-additivity, since it ensures that diversification never increases the risk measure, [4,6] and [9]:

$\rho(\lambda X + (1 - \lambda)Y) \leq \lambda \rho(X) + (1 - \lambda)\rho(Y) \quad \forall X, Y \in \Omega,\ \forall \lambda \in [0,1]$   (1)

A risk measure is called a convex risk measure if it satisfies axioms 1, 4 and 11, [9].
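As a concrete illustration of Axioms 1-4 (this sketch is ours, not part of the original paper; the distribution parameters and names are purely illustrative), the Python snippet below numerically spot-checks translation invariance, sub-additivity, positive homogeneity and monotonicity for the simple expectation-based measure ρ(X) = E[X], which is known to satisfy all four:

```python
import numpy as np

rng = np.random.default_rng(0)

def rho(losses):
    """Expectation-based risk measure rho(X) = E[X] (satisfies Axioms 1-4)."""
    return losses.mean()

# Simulated loss samples for two hypothetical risk cells
X = rng.lognormal(mean=10, sigma=1.0, size=100_000)
Y = rng.lognormal(mean=9, sigma=1.5, size=100_000)
m, h = 1_000.0, 2.5

print(np.isclose(rho(X + m), rho(X) + m))        # Axiom 1: translation invariance
print(rho(X + Y) <= rho(X) + rho(Y) + 1e-9)      # Axiom 2: sub-additivity
print(np.isclose(rho(h * X), h * rho(X)))        # Axiom 3: positive homogeneity
print(rho(np.minimum(X, Y)) <= rho(X))           # Axiom 4: monotonicity, since min(X, Y) <= X pointwise
```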
2.3.2 Elicitability and robustness
Elicitability. Elicitability is a criterion that helps in determining optimal point forecasts. Its definition is linked to that of scoring functions: a scoring function assigns a numerical score to a single-valued point forecast, depending on the predicted point and the realization, [3].

Robustness. A risk measure is said to be robust if it is quite insensitive to measurement errors. Robustness measures the effect on the risk measure estimate of small changes in the data or small deviations from the model assumptions. If the risk measure is not robust, then a small change in the data can produce a huge impact on the estimate. To investigate the robustness of a risk measure it is useful to consider the Wasserstein distance, [3,4] and [9].

2.4 Popular risk measures
2.4.1 Value at Risk (VaR)
Value at Risk is the maximum loss the bank may incur, given current information, at a specific confidence level. VaR is defined as a quantile of the distribution of aggregate losses. If $L$ denotes a loss random variable, $F_L(x) = P(L \leq x)$ is its cumulative distribution function and $\alpha \in [0,1]$ is the confidence level, then the Value at Risk of $L$ at confidence level $\alpha$, denoted $VaR_\alpha(L)$, is the $\alpha$-percentile (or quantile) of the loss distribution of $L$. Formally, VaR is given by equation (2), [18,10,6] and [16]:

$VaR_\alpha(L) = F_L^{-1}(\alpha) = \inf\{x \in \mathbb{R} : F_L(x) \geq \alpha\} = \inf\{x \in \mathbb{R} : P(L \leq x) \geq \alpha\}$   (2)

VaR can be used to estimate the capital required to meet risk and provides a basis for comparing the risk levels of different loss distributions, but it suffers from drawbacks when used to model operational risk. VaR gives information about the minimum loss in the $100(1-\alpha)\%$ worst cases but provides no information about the size of the losses in the remaining tail. Because of this, VaR cannot be used to compare different loss distributions, since it fails to distinguish between different tail behaviours beyond $VaR_\alpha(L)$. In addition, VaR does not fulfil the sub-additivity property, [16] and [9].

2.4.2 Expected Shortfall (ES)
The criticism directed at VaR triggered many attempts to modify it, which led to a new risk measure called Expected Shortfall (ES), also known as Tail VaR, super-quantile, Conditional VaR or Tail Conditional Expectation. Instead of measuring the minimum loss incurred in an accepted percentage of worst cases, ES measures the expected loss contained in that portion of unfortunate outcomes. Unlike VaR, ES uses the extreme losses in the tail of the distribution. There are several alternative formulas defining ES, [15,16] and [9]. Susanne Emmer et al. [3] define Expected Shortfall as follows: let $L_i$, $i = 1, 2, 3, \dots$ represent the loss in the $i$th position and $L = \sum_i L_i$ be the generic loss for one period; then ES is defined by equation (3):

$ES_\alpha(L) = \frac{1}{1-\alpha}\int_\alpha^1 VaR_u(L)\,du = E[L \mid L \geq VaR_\alpha(L)] + \big(E[L \mid L \geq VaR_\alpha(L)] - VaR_\alpha(L)\big)\left(\frac{P[L \geq VaR_\alpha(L)]}{1-\alpha} - 1\right)$   (3)

which reduces to $ES_\alpha(L) = E[L \mid L \geq VaR_\alpha(L)]$ if the aggregate loss $L$ has a continuous distribution (i.e. if $P[L = VaR_\alpha(L)] = 0$).
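To make the two definitions concrete, the following sketch (ours, not part of the paper; the lognormal distribution and its parameters are assumptions chosen only for illustration) estimates VaR and ES empirically from a simulated sample of aggregate losses, using the quantile definition in equation (2) and the tail-average form of equation (3) for a continuous loss distribution:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical aggregate losses (a lognormal severity is a common modelling choice)
losses = rng.lognormal(mean=12, sigma=1.2, size=100_000)

def var(losses, alpha):
    """Empirical VaR_alpha: the alpha-quantile of the loss sample (equation (2))."""
    return np.quantile(losses, alpha)

def es(losses, alpha):
    """Empirical ES_alpha: mean of losses at or above VaR_alpha (continuous case of equation (3))."""
    q = var(losses, alpha)
    return losses[losses >= q].mean()

alpha = 0.999
print(f"VaR_{alpha}: {var(losses, alpha):,.0f}")
print(f"ES_{alpha}:  {es(losses, alpha):,.0f}")   # ES is always >= VaR at the same level
```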
3 Notations
In this section we review the fuzzy notions used in this paper.

3.1 Fuzzy sets
A fuzzy set is a class of objects with a continuum of grades of membership. Let $X$ be a universal set; then a fuzzy set $\tilde{A}$ is given by $\tilde{A} = \{(x, \mu_{\tilde{A}}(x)) : x \in X\}$, where $\mu_{\tilde{A}}(x)$ is called the membership function and represents the grade of membership of $x$ in $\tilde{A}$, [19] and [20].

3.2 α-cut of a fuzzy set
Given a fuzzy set $\tilde{A}$ in a universal set $X$, the α-cut (also called the interval of confidence at level α, or the α-level set) of $\tilde{A}$ is the crisp set $A_\alpha$ that contains all the elements of $X$ whose membership values in $\tilde{A}$ are greater than or equal to $\alpha$, i.e. $A_\alpha = \{x : \mu_{\tilde{A}}(x) \geq \alpha,\ x \in X,\ \alpha \in [0,1]\}$, [19] and [20].

3.3 Convex fuzzy set
A fuzzy set $\tilde{A} = \{(x, \mu_{\tilde{A}}(x))\}$ in a universal set $X$ is called convex if all its α-cuts $A_\alpha$ are convex sets, i.e. for every pair of elements $x_1, x_2 \in A_\alpha$ and every $\lambda \in [0,1]$, $\lambda x_1 + (1-\lambda)x_2 \in A_\alpha$ for all $\alpha \in [0,1]$, [19].

3.4 Fuzzy numbers
Fuzzy numbers allow us to build mathematical models for linguistic variables or fuzzy environments. A fuzzy number is an extension of a regular number: it does not refer to one single, certain value but rather to a connected set of possible (imprecise or uncertain) values, where each possible value has a membership grade (weight) between 0 and 1, [20] and [21]. The α-cut operation can be applied to a fuzzy number: if $\tilde{A}$ is a fuzzy number, its α-cut is the crisp interval $A_\alpha = [a_1^{(\alpha)}, a_2^{(\alpha)}]$, [21].

3.5 Triangular fuzzy numbers
Among the different shapes of fuzzy numbers, the triangular fuzzy number (TFN) is the most popular one. A triangular fuzzy number can be represented by a triplet $[a, b, c]$, interpreted through the membership function given by equation (4), [21] and [19]:

$\mu_{\tilde{A}}(x) = \begin{cases} \dfrac{x-a}{b-a}, & a \leq x \leq b \\[4pt] \dfrac{c-x}{c-b}, & b \leq x \leq c \\[4pt] 0, & \text{otherwise} \end{cases}$   (4)

For a TFN $[a, b, c]$ the α-cut is the crisp interval $A_\alpha = [(b-a)\alpha + a,\ c - (c-b)\alpha]$.

3.6 Fuzzy arithmetic using α-cuts
Palash Dutta et al. [19] show that the α-cut method is general enough to handle the different types of fuzzy arithmetic, including addition, subtraction, division, multiplication, extracting the nth root, exponentiation and taking logarithms.
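The α-cut machinery of this section can be sketched in a few lines of Python (an illustration we add here, not part of the paper; the two example TFNs are built from the first three 1992 fraud losses used later in Section 6): the α-cut of a TFN $[a, b, c]$ is the interval $[(b-a)\alpha + a,\ c - (c-b)\alpha]$, and TFNs are added by adding the endpoints of their α-cuts.

```python
from typing import Tuple

TFN = Tuple[float, float, float]        # (a, b, c) with a <= b <= c
Interval = Tuple[float, float]

def alpha_cut(tfn: TFN, alpha: float) -> Interval:
    """alpha-cut of a triangular fuzzy number: [(b-a)*alpha + a, c - (c-b)*alpha]."""
    a, b, c = tfn
    return ((b - a) * alpha + a, c - (c - b) * alpha)

def add_cuts(x: Interval, y: Interval) -> Interval:
    """Addition of two alpha-cuts (interval addition: endpoints add)."""
    return (x[0] + y[0], x[1] + y[1])

# Example TFNs built from successive operational losses
A1: TFN = (50_000, 68_000, 182_435)
A2: TFN = (68_000, 182_435, 220_357)

for alpha in (0.999, 0.5, 0.001):
    cut = add_cuts(alpha_cut(A1, alpha), alpha_cut(A2, alpha))
    print(f"alpha = {alpha:>5}: aggregate cut = [{cut[0]:,.0f}, {cut[1]:,.0f}]")
```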
We can express the operational loss set A as triangular fuzzy numbersA = (a , a Fig.(1). ,a ) as given in ( ) 1 x Fig. 1. Represent operational losses in fuzzy triangular number With membership function of A given by equation (7): Μ (x) = a ≤x≤a a (7) ≤x≤a Recalling the definition of α –cut, given a fuzzy set A in X and any real numberα ∈ [0, 1], then the α -cut of a set of A, denoted by Aα is the crisp set given by Aα = a ∈ A: μ (x) ≥ α , [16]. Thus α –cut can represent the confidence that the expected next loss fall in crisp set Aα withαconfidence, [22]. Consequently, the capital required to cover the expected loss fall in the crisp set defined by equation (8): Aα = [(a − a )α + a , a − (a −a )α] (8) α_RM for n operational loss can be summarized in following steps: let A = (a , a , a , … . , a )be a set of operational loss during specific period or part of current period. Arrange A = (a , a , a , … . , a ) in ascending order. Let A = (a , a , a , … . , a ) be the arranged operational loss. Represent each three successive losses as triangular fuzzy numbers, thus we have n − 2 triangular fuzzy numbers as indict in Fig. (2), where A = (a , a , a ) , A = (a , a , a ), … … ., A = (a , a , a ), … … , A = (a , a , a ). The membership function for A is given by equation (9): 1. 2. μ (x) = a ≤x≤a a (9) ≤x≤a 4. α-cut of i fuzzy triangular numberA given by equation (10): 5. The aggregate α-cut given by equation (11): 3. Aα = [(a − a )α + a , a − (a −a )α] (10) 9 Tharwat et al.; JAMCS, 28(1): 1-17, 2018; Article no.JAMCS.42625 A= = [(a Aα = Aα + Aα + ⋯ . +Aα − a )α + ∑ a ,∑ (11) a − (a − a )α] with membership function given by equation (12): μ (x) = 6. ∑ ∑ ∑ ∑ ∑ ∑ = = ∑ ∑ ∑ ∑ a ≤x≤∑ a ≤x≤∑ a a (12) The required capital for operational risk falls in the crisp set given by the interval in equation (13). = [(a a , ∑ a − (a − a )α] (13) − a )α + ∑ Where, C is the required capital. ( ) 1 Fig. 2. Represent operational loss in TFN Banks can use α _RM during the financial period to monitor the change in capital required to meet operational risk, increasing the required capital consistently and unnaturally or increase it above predetermined limit will give a pointer to bank to take a correction action. 5.2 Solution algorithm To find capital required to meet operational risk, under the Advanced Measurement Approach (AMA), banks use the Loss Distribution Approach (LDA), which is the most common method used to measure operational risk. LDA is a frequency/severity model extensively used in many applications. It is a parametric technique use internal historical data and sometime enriched by external data to determine the frequency distribution and severity distribution of operational risk for each business line/event type (risk cell) at a specific time horizon. Then a suitable technique (e.g. Monte Carlo simulation or Panjer’s recursive algorithms) used to combine the two distributions to obtain aggregate loss for the next period for each risk cell. Then bank can use the Value at Risk or Excepted Shortfall to determine the capital charge need to meet operational risk, [2]. In this section, we will present algorithm to use α_RM as an alternative to Value at Risk and Expected Shortfall to measure operational risk. Supposed that, there are a historical operational loss data, to find the required capital to meet operational loss next period, we follow the following steps: 1- Quantify the distribution of frequency and severity. 
5.2 Solution algorithm
To find the capital required to meet operational risk under the Advanced Measurement Approach (AMA), banks use the Loss Distribution Approach (LDA), the most common method for measuring operational risk. The LDA is a frequency/severity model used extensively in many applications. It is a parametric technique that uses internal historical data, sometimes enriched by external data, to determine the frequency distribution and the severity distribution of operational risk for each business line/event type (risk cell) over a specific time horizon. A suitable technique (e.g. Monte Carlo simulation or Panjer's recursive algorithm) is then used to combine the two distributions and obtain the aggregate loss for the next period for each risk cell. The bank can then use Value at Risk or Expected Shortfall to determine the capital charge needed to meet operational risk, [2].

In this section we present an algorithm that uses α_RM as an alternative to Value at Risk and Expected Shortfall for measuring operational risk. Suppose historical operational loss data are available; to find the capital required to meet operational losses in the next period, we follow these steps (a simulation sketch is given after the list):

1. Fit the frequency distribution and the severity distribution.
2. From the frequency distribution, generate a random number n representing the number of losses occurring in the period.
3. From the severity distribution, generate n losses representing the losses occurring in one period.
4. Use α_RM to find the interval (lower bound and upper bound) of capital needed to cover these losses.
5. Repeat steps 2 to 4 a large number of times (the applications in Section 6 use 10,000 runs).
6. The interval of capital needed to meet the operational loss is given by the 99.9% quantiles of the lower and upper bounds of the intervals generated in the previous step.
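A minimal Monte Carlo sketch of this algorithm follows (our illustration, with an assumed Poisson frequency and Exponential severity, as used in the applications of Section 6; the parameter values and helper names are ours and purely illustrative). It reuses the `alpha_rm` helper sketched in Section 5.1 and reports the 99.9% quantiles of the simulated lower and upper bounds.

```python
import numpy as np

rng = np.random.default_rng(7)

def alpha_rm(losses, alpha):
    """Aggregate alpha-cut interval of equation (13); see the sketch in Section 5.1."""
    a = np.sort(np.asarray(losses, dtype=float))
    lo = ((a[1:-1] - a[:-2]) * alpha + a[:-2]).sum()
    hi = (a[2:] - (a[2:] - a[1:-1]) * alpha).sum()
    return lo, hi

def simulate_capital_interval(lam, mean_severity, alpha=0.999, runs=10_000, q=0.999):
    """Steps 2-6 of Section 5.2 with Poisson(lam) frequency and Exponential severity."""
    lows, highs = [], []
    for _ in range(runs):
        n = max(rng.poisson(lam), 3)                      # our guard: at least 3 losses to build TFNs
        losses = rng.exponential(mean_severity, size=n)   # severity draws for one period
        lo, hi = alpha_rm(losses, alpha)
        lows.append(lo)
        highs.append(hi)
    return np.quantile(lows, q), np.quantile(highs, q)

# Illustrative parameters only (not fitted to any data set in the paper)
low_q, high_q = simulate_capital_interval(lam=50, mean_severity=100_000)
print(f"Capital interval at the 99.9% quantile: [{low_q:,.0f}, {high_q:,.0f}]")
```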
5.3 α_RM properties as a risk measure
Reviewing the axioms listed above, some of them are redundant (counterparts of one another) and others are not necessary for operational risk. The axioms to be checked for the proposed risk measure α_RM are therefore Axiom 1 (translation invariance), Axiom 2 (sub-additivity), Axiom 3 (positive homogeneity), Axiom 4 (monotonicity), Axiom 7 (continuity), Axiom 8 (scale normalization) and Axiom 11 (convexity), together with elicitability and robustness. Below we show that α_RM satisfies axioms 1, 2, 3, 4, 7, 8 and 11; verification of its elicitability and robustness is postponed to further work. Throughout, $X = (a_1, a_2, a_3)$ and $Y = (b_1, b_2, b_3)$ denote loss variables represented as triangular fuzzy numbers, and $\alpha\_RM(X) = X_\alpha = [(a_2-a_1)\alpha + a_1,\ a_3 - (a_3-a_2)\alpha]$.

1. Translation invariance (Axiom 1): $\alpha\_RM(X + m) = \alpha\_RM(X) + m$.
Left-hand side: $X + m = (a_1+m, a_2+m, a_3+m)$, so $\alpha\_RM(X+m) = [((a_2+m)-(a_1+m))\alpha + (a_1+m),\ (a_3+m) - ((a_3+m)-(a_2+m))\alpha] = [(a_2-a_1)\alpha + a_1 + m,\ a_3 + m - (a_3-a_2)\alpha]$.
Right-hand side: $\alpha\_RM(X) + m = [(a_2-a_1)\alpha + a_1 + m,\ a_3 - (a_3-a_2)\alpha + m]$. The two sides coincide.

2. Sub-additivity (Axiom 2): if $X, Y \in \Omega$ then $\alpha\_RM(X+Y) \leq \alpha\_RM(X) + \alpha\_RM(Y)$.
$X + Y = (a_1+b_1, a_2+b_2, a_3+b_3)$, so the left-hand side is $[((a_2+b_2)-(a_1+b_1))\alpha + (a_1+b_1),\ (a_3+b_3) - ((a_3+b_3)-(a_2+b_2))\alpha]$. The right-hand side is $[(a_2-a_1)\alpha + a_1,\ a_3-(a_3-a_2)\alpha] + [(b_2-b_1)\alpha + b_1,\ b_3-(b_3-b_2)\alpha]$, which equals the left-hand side; hence sub-additivity holds (with equality).

3. Positive homogeneity (Axiom 3): $\alpha\_RM(hX) = h\,\alpha\_RM(X)$ for $h \geq 0$.
$hX = (ha_1, ha_2, ha_3)$, so $\alpha\_RM(hX) = [(ha_2-ha_1)\alpha + ha_1,\ ha_3 - (ha_3-ha_2)\alpha] = h[(a_2-a_1)\alpha + a_1,\ a_3 - (a_3-a_2)\alpha] = h\,\alpha\_RM(X)$.

4. Monotonicity (Axiom 4): if $X \leq Y$ then $\alpha\_RM(X) \leq \alpha\_RM(Y)$.
$X \leq Y$ means $a_1 \leq b_1$, $a_2 \leq b_2$ and $a_3 \leq b_3$. Since $a_1 \leq b_1$ and $a_2 \leq b_2$, we have $(a_2-a_1)\alpha + a_1 = (1-\alpha)a_1 + \alpha a_2 \leq (1-\alpha)b_1 + \alpha b_2 = (b_2-b_1)\alpha + b_1$. In a similar way, $a_3 - (a_3-a_2)\alpha = (1-\alpha)a_3 + \alpha a_2 \leq (1-\alpha)b_3 + \alpha b_2 = b_3 - (b_3-b_2)\alpha$. Hence $\alpha\_RM(X) \leq \alpha\_RM(Y)$.

5. Scale normalization (Axiom 8): $\alpha\_RM(1) = [1,1]$.
With the degenerate triangular fuzzy number $(1,1,1)$, $\alpha\_RM(1) = [(1-1)\alpha + 1,\ 1 - (1-1)\alpha] = [1,1]$.

6. Continuity (Axiom 7): we want to show $\lim_{d \to 0} \alpha\_RM((X-d)^{+}) = \alpha\_RM(X)$.
$X - d = (a_1-d, a_2-d, a_3-d)$, so $\alpha\_RM(X-d) = [((a_2-d)-(a_1-d))\alpha + (a_1-d),\ (a_3-d) - ((a_3-d)-(a_2-d))\alpha] = [(a_2-a_1)\alpha + a_1 - d,\ a_3 - d - (a_3-a_2)\alpha]$, and letting $d \to 0$ gives $[(a_2-a_1)\alpha + a_1,\ a_3 - (a_3-a_2)\alpha] = \alpha\_RM(X)$.

7. Convexity (Axiom 11): $\alpha\_RM(\lambda X + (1-\lambda)Y) \leq \lambda\,\alpha\_RM(X) + (1-\lambda)\,\alpha\_RM(Y)$.
From the definition of the α-cut, the fact that scaling and adding triangular fuzzy numbers produces a triangular fuzzy number, and properties 2 and 3 above, the convexity condition holds.

6 Application
To test the accuracy of operational risk models, banks back-test them. Back-testing analyses the differences between the predicted and the actual operational losses; it measures the accuracy of an operational risk model by comparing the model output with the actual results over a specific period. Four types of test can be used: clustering of violations, frequency of violations, size of violations, and size of over/under allocation of capital, [23]. Due to lack of data, we rely on the size of violations and the size of over/under allocation of capital to validate our model. To validate α_RM, the data obtained from [23] (fraud data and legal data) and from [24] (operational losses for a bank) are used.

6.1 α_RM as an early warning indicator
To validate α_RM as an early warning indicator in the presence of extreme events, the fraud data obtained from [23] are used. The fraud data were collected from fraud events that took place between 1992 and 1996 and are provided as monthly aggregates; they are summarized in Table 1. In [23], a Generalized Extreme Value (GEV) distribution is used to find the amount of capital needed to cover fraud risk; the results are summarized in Table 2. Marcelo G. Cruz [23] attributes the high capital figures at 99% to the asymptotic behaviour of the Generalized Extreme Value distribution.

We apply α_RM to the fraud data in [23]. For example, in the first year we have 10 triangular fuzzy numbers, $\tilde{A}_1 = (50{,}000,\ 68{,}000,\ 182{,}435)$, $\tilde{A}_2 = (68{,}000,\ 182{,}435,\ 220{,}357)$, ..., $\tilde{A}_{10} = (734{,}900,\ 845{,}000,\ 907{,}077)$. We then find the α-cut of each triangular fuzzy number $\tilde{A}_i$, and the required capital falls in the interval given by the aggregate α-cut. The results of applying α_RM are given in Table 3.

Table 1. Fraud data, [23]
Month           1992            1993            1994             1995            1996
1               50,000.000      47,500.000      64,600.000       52,700.000      89,540.000
2               68,000.000      51,908.050      107,000.000      75,177.000      122,650.000
3               182,435.320     52,048.500      107,031.200      83,613.700      128,412.000
4               220,357.000     78,375.000      109,543.000      86,878.460      210,536.560
5               350,000.000     120,000.000     129,754.000      116,000.000     229,368.500
6               360,000.000     157,083.000     176,000.000      120,000.000     230,000.000
7               360,000.000     160,000.000     200,000.000      165,000.000     294,835.230
8               406,001.470     200,000.000     350,000.000      239,102.930     332,000.000
9               550,000.000     214,634.950     410,060.720      248,341.960     423,319.620
10              734,900.000     556,000.000     1,300,000.000    260,000.000     426,000.000
11              845,000.000     560,000.000     3,950,000.000    394,672.110     750,000.000
12              907,077.000     1,100,000.000   6,600,000.000    600,000.340     1,820,000.000
Sum             5,033,770.790   3,297,549.500   13,503,988.920   2,441,486.500   5,056,661.910
No. of frauds   586             454             485              658             798

Table 2. Capital required to cover fraud risk, [23]
        1992          1993          1994           1995         1996
99%     28,113,271    26,992,371    144,659,944    1,012,706    38,023,601
95%     5,656,815     5,522,432     22,135,683     444,486      7,311,518
90%     2,808,238     2,760,593     9,693,673      335,950      3,547,268
85%     1,851,669     1,827,292     5,911,090      292,046      2,302,561
80%     1,370,670     1,356,118     4,124,980      267,195      1,682,761

Table 3. The results of applying α_RM to the fraud data in [23]
α        1992                      1993                      1994                       1995                      1996
0.999    [4,075,899  4,077,532]    [2,149,537  2,151,097]    [6,835,504  6,845,881]     [1,788,444  1,789,310]    [3,146,461  3,148,819]
0.950    [4,036,944  4,118,647]    [2,124,425  2,202,454]    [6,645,119  7,164,038]     [1,771,688  1,815,027]    [3,114,099  3,231,989]
0.001    [3,282,489  4,914,931]    [1,638,062  3,197,093]    [2,957,874  13,325,895]    [1,447,156  2,313,084]    [2,487,322  4,842,774]
0.050    [3,321,444  4,873,816]    [1,663,175  3,145,736]    [3,148,259  13,007,738]    [1,463,913  2,287,368]    [2,519,685  4,759,604]

The results obtained using the GEV in [23] are not realistic. In [23], the 99%, 95%, 90%, 85% and 80% levels are used to calculate the required capital, whereas the 99.9% level is the one used in regulatory capital calculations. The capital estimated at 99% is greatly overestimated; at 95% it is overestimated for 1994 and underestimated for 1995; and at 85% or 80% it is greatly underestimated and not associated with the actual losses. On the other hand, the result obtained from α_RM in each year is very consistent with the actual operational losses, which means that the proposed method can give the bank a general view of the amount of capital required to meet operational losses, so it can be used as an early warning indicator.
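As a usage illustration (ours, not part of the paper), the sketch below applies the `alpha_rm` helper from Section 5.1 to each year of Table 1 and prints the capital interval at α = 0.999 and α = 0.001 next to the actual annual loss; the narrow α = 0.999 intervals land close to the values reported in Table 3.

```python
# Reuses alpha_rm() from the sketch in Section 5.1.
fraud = {  # monthly fraud losses from Table 1
    1992: [50_000.000, 68_000.000, 182_435.320, 220_357.000, 350_000.000, 360_000.000,
           360_000.000, 406_001.470, 550_000.000, 734_900.000, 845_000.000, 907_077.000],
    1993: [47_500.000, 51_908.050, 52_048.500, 78_375.000, 120_000.000, 157_083.000,
           160_000.000, 200_000.000, 214_634.950, 556_000.000, 560_000.000, 1_100_000.000],
    1994: [64_600.000, 107_000.000, 107_031.200, 109_543.000, 129_754.000, 176_000.000,
           200_000.000, 350_000.000, 410_060.720, 1_300_000.000, 3_950_000.000, 6_600_000.000],
    1995: [52_700.000, 75_177.000, 83_613.700, 86_878.460, 116_000.000, 120_000.000,
           165_000.000, 239_102.930, 248_341.960, 260_000.000, 394_672.110, 600_000.340],
    1996: [89_540.000, 122_650.000, 128_412.000, 210_536.560, 229_368.500, 230_000.000,
           294_835.230, 332_000.000, 423_319.620, 426_000.000, 750_000.000, 1_820_000.000],
}

for year, losses in fraud.items():
    narrow = alpha_rm(losses, 0.999)   # high alpha: narrow interval (more certainty)
    wide = alpha_rm(losses, 0.001)     # low alpha: wide interval (more uncertainty)
    print(f"{year}: actual {sum(losses):>14,.0f}  "
          f"alpha=0.999 [{narrow[0]:,.0f}, {narrow[1]:,.0f}]  "
          f"alpha=0.001 [{wide[0]:,.0f}, {wide[1]:,.0f}]")
```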
To test α_RM with operational losses following a distribution other than the generalized extreme value distribution, the data obtained from [24] are used. The operational losses were collected over two years (2004, 2005) from a South African retail bank; they are summarized (in Rand) in Table 4.

Table 4. Operational losses summary, [24]
Sum                   31,586,348.00
Mean                  277,073
Minimum               1,000.00
Maximum               9,669,000.00
Standard deviation    1,032,814.09

Ja'Nel Esterhuysen et al. [24] use the LDA (a Poisson distribution for the frequency with mean 57 and an Exponential distribution for the severity, generating 5,000 yearly losses) and find that the VaR (required capital) at the 99.9th percentile is 13,384,748 Rand for one year. We apply α_RM to the same data given in [24]; the obtained results are given in Table 5.

Table 5. The results of applying α_RM to the operational losses in [24]
α                   0.999                       0.990                       0.001                      0.010
Required capital    [10,931,279  11,006,514]    [10,931,279  11,006,514]    [8,271,364  15,787,306]    [8,295,569  15,743,800]

The required capital according to [24] is 13,384,748 Rand and the actual loss for the year is 15,793,174 Rand. The result obtained by α_RM, as shown in Table 5, is consistent with the actual loss and with the result obtained by [24], which means that the proposed method can be used as an early warning indicator for operational risk.

6.2 α_RM as an alternative to VaR and ES
To test α_RM as a risk measure equivalent to VaR and ES but with more desirable properties, the fraud losses described in the previous section and the legal loss data from [23] are used; the results are presented below.

6.2.1 The estimated capital for the fraud data in [23]
To validate α_RM as a risk measure equivalent to VaR and ES but with more desirable properties in the presence of extreme events, the fraud data obtained from [23] are used. Marcelo G. Cruz [23] uses the Generalized Extreme Value distribution to find the required capital for each year and concludes that the high estimated capital is due to the high quantiles required for VaR, as described in Section 6.1. We therefore consider the GEV unsuitable, fit the fraud data, and find the required capital for each year using VaR, ES and α_RM, with a Poisson distribution for the frequency and an Exponential distribution for the severity. The results of the simulation with 10,000 runs are given in Table 6. Table 6 shows that the results obtained from α_RM are consistent with those obtained from VaR and ES, and more reliable than those from the GEV, which gives greatly overestimated capital.

Table 6. The estimated capital for the fraud data in [23] using α_RM
        VaR at 99.9%    ES at 99.9%    α_RM at 99.9%
1992    5,973,932       5,852,970      [4,720,118  5,270,209]
1993    3,997,615       3,891,325      [3,134,743  3,516,099]
1994    16,235,068      15,822,683     [12,774,334  14,284,808]
1995    2,858,207       2,814,362      [2,263,260  2,515,480]
1996    4,912,240       4,828,310      [3,918,151  4,268,070]

6.2.2 The estimated capital for the operational losses in [24]
To validate α_RM, the operational losses in [24] are used. Ja'Nel Esterhuysen et al. [24] use a Poisson distribution for the frequency and an Exponential distribution for the severity; the VaR (required capital) at the 99.9th percentile is 13,384,748 Rand.
We execute another simulation (10,000 runs) using the same distributions for the frequency and the severity to find VaR, ES and α_RM. The results of the simulation are given in Table 7.

Table 7. The estimated capital for the operational losses in [24]
VaR      13,152,676
ES       12,248,406
α_RM     [11,488,918  13,146,037]

The actual (average) operational loss for one year is 15,793,174 Rand; using VaR the estimated required capital is 13,152,676 Rand, while using ES the required capital is 12,248,406 Rand. The proposed risk measure α_RM suggests reserving capital in the range [11,488,918, 13,146,037] to meet operational risk, which is consistent with the results obtained from VaR and ES.

6.2.3 The estimated capital for the legal losses in [23]
The legal loss data consist of 75 losses; the total loss is $32,974,449, the average loss is $739,299, the minimum loss is $142,774 and the maximum loss is $3,921,879. First, we apply α_RM to the 75 given losses to obtain an indication of the expected capital required to meet these losses; the results for different values of α are given in Table 8.

Table 8. Estimated capital for the legal losses in [23] using α_RM
α        0.999            0.99             0.001            0.01
         28,980,263       28,958,437       26,559,967       26,581,793
α_RM     28,986,367.28    32,805,720.79    32,661,822.17    32,663,179.40

Second, we apply the steps given in Section 5.2 to test α_RM as an alternative to VaR and ES, with 10,000 runs. The required capital according to VaR is $46,590,199 and the estimated capital using α_RM falls in the range [41,988,218, 46,557,688]. Comparing these with the total actual loss ($32,974,449), the results show that α_RM can be used as an early warning indicator: it gives an idea of the amount of capital required to meet legal risk during the year, as shown in Table 8. This is reinforced by the result obtained from the simulation (using α_RM as an alternative to VaR and ES), where the required capital falls in the range [41,988,218, 46,557,688].

7 Conclusion
This paper introduces a new risk measure based on fuzzy numbers (α_RM), which offers two contributions to improving risk measurement for operational risk. First, banks can use α_RM as an early warning indicator for operational risk: using α_RM at α = 0.001 reflects high uncertainty in the environment surrounding the bank, while α = 0.999 reflects greater certainty. Used as an early warning indicator, α_RM helps banks monitor the required capital in a regular and progressive manner, so that preventive actions can be taken. Second, α_RM can be used as an alternative to VaR and ES, while having more desirable properties. The main advantage of the proposed risk measure is that it can give top management an indication of the amount of capital required to meet operational risk without a large historical database, and it has more desirable properties than VaR and ES. The main disadvantage is that α_RM still depends on fitting distributions to the frequency and severity; if the fit is not correct, the obtained results will be misleading.

Competing Interests
Authors have declared that no competing interests exist.
References

[1] Imad A. Moosa. A critique of the advanced measurement approach to regulatory capital against operational risk. Journal of Banking Regulation. 2008;9(3):151-164.
[2] Karam E, Planchet F. Operational risks in financial sectors. Advances in Decision Sciences. Hindawi Publishing Corporation; 2012. [Article ID 385387]
[3] Susanne Emmer, Marie Kratz, Dirk Tasche. What is the best risk measure in practice? A comparison of standard measures. Journal of Risk. 2015;18(2):31-60.
[4] Philipp Gschopf. Measuring risk with expectile based expected shortfall estimates. Master of Science in Statistics thesis. Institute for Statistics and Econometrics, Humboldt-Universität zu Berlin; 2014.
[5] Alejandro Balbas. Mathematical methods in modern risk measurement: A survey. Applied Mathematics. 2007;101(2):205-219.
[6] Steven Kou, Xianhua Peng, Chris C. Heyde. External risk measures and Basel accords. Mathematics of Operations Research. 2013;38(3):393-417.
[7] Kailan Shang, Zakir Hossen. Applying fuzzy logic to risk assessment and decision-making. Sponsored by the Casualty Actuarial Society, Canadian Institute of Actuaries, Society of Actuaries; 2013.
[8] Sovan Mitra. Risk measures in quantitative finance. In Proceedings of the National UK University Risk Conference and a Risk Management Industry Workshop; 2009.
[9] Simona Roccioletti. Back-testing value at risk and expected shortfall. Springer Fachmedien Wiesbaden; 2014.
[10] Xiaoping Zhou, Antonina V. Durfee, Frank J. Fabozzi. On stability of operational risk estimates by LDA: From causes to approaches. Journal of Banking & Finance. 2016;266-278.
[11] Joel Bessis. Risk management in banking. John Wiley & Sons Ltd; 2002.
[12] Shaun S. Wang, Virginia R. Young, Harry H. Panjer. Axiomatic characterization of insurance prices. Insurance: Mathematics and Economics. 1997;21:173-183.
[13] Philippe Artzner, Freddy Delbaen, Jean-Marc Eber, David Heath. Coherent measures of risk. Mathematical Finance. 1999;9(3):203-228.
[14] Shabbir Ahmed, Damir Filipovic, Gregor Svindland. A note on natural risk statistics. Working Paper Series, Vienna Institute of Finance; 2008.
[15] Eliza Khemissi. Axiomatic extension of risk measurement. Econometrics. 2017;2(56).
[16] Michal Lebovič. Use of coherent risk measures in operational risk modeling. Diploma thesis, Charles University in Prague, Faculty of Social Sciences, Institute of Economic Studies; 2012.
[17] Shelton Nicholls. Techniques for risk analysis. 8th Annual Senior Level Policy Seminar on Risk Management and Investments in the Caribbean. Trinidad Hilton and Conference Centre; 2003.
[18] Harry H. Panjer. Operational risk: Modeling analytics. John Wiley & Sons, Ltd; 2006.
[19] Palash Dutta, Hrishikesh Boruah, Tazid Ali. Fuzzy arithmetic with and without using α-cut method: A comparative study. International Journal of Latest Trends in Computing. 2011;2(1):99-107.
[20] Sanhita Banerjee, Tapan Kumar Roy. Arithmetic operations on generalized trapezoidal fuzzy number and its applications. Turkish Journal of Fuzzy Systems. 2012;3(1):16-44.
[21] Shang Gao, Zaiyue Zhang. Multiplication operation on fuzzy numbers. Journal of Software. 2009;4(4):331-338.
[22] Antonina Durfee, Alexey Tselykh. Evaluating operational risk exposure using fuzzy number approach to scenario analysis. European Society for Fuzzy Logic and Technology (EUSFLAT-LFA); 2011.
[23] Marcelo G. Cruz. Modeling, measuring and hedging operational risk. John Wiley & Sons, Ltd; 2002.
[24] Ja'Nel Esterhuysen, Paul Styger, Gary Van Vuuren. Calculating operational value-at-risk (OpVaR). South African Journal of Economic and Management Sciences (SAJEMS). 2008;11(1).