International Journal of Computer Theory and Engineering, 2022
A critical step in hypothesis testing at the computer theory and/or engineering decision-making stage is to optimally compute and use the type-I (α) and type-II (β) error probabilities. The article's first research objective is to optimize the α and β errors, also known as the producer's and consumer's risks, or the risks of false positives (FP) and false negatives (FN), by employing the merits of a game-theoretical framework. To achieve this goal, a cross-products of errors and non-errors model is proposed. The second objective is to apply the proposed model to an industrial manufacturing quality control mechanism, namely sequential sampling plans (SSP). The article proposes an alternative technique to prematurely selecting the conventionally pre-specified type-I and type-II error probabilities. The study employs the minimax rule for mixed-strategy, two-player, zero-sum games derived by von Neumann and executes it with Dantzig's linear programming (LP) algorithm. Further, a one-equation, one-unknown scenario yielding simple algebraic roots validates the computationally intensive LP optimal solutions. The cost and utility constants are elicited through company-specific input data management. The contrasts between the conventional and proposed results are illustrated by tables, figures, individual and comparative plots, and Venn diagrams in order to modify and improve the traditionally executed SSP's final decisions. Index Terms: cross-products of errors, minimax rule, accept-reject-continue-terminate, cost and utility.
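The minimax-via-LP step mentioned in this abstract can be illustrated with a generic sketch: computing the row player's optimal mixed strategy in a two-player zero-sum game by linear programming. The payoff matrix below is a hypothetical placeholder, not the cost/utility matrix elicited in the paper; only the standard LP formulation of the minimax rule is shown.

```python
import numpy as np
from scipy.optimize import linprog

def solve_zero_sum(payoff):
    """Optimal mixed strategy and game value for the row player.

    payoff[i, j] is the row player's gain when row i meets column j.
    Variables are (x_1, ..., x_m, v): maximize v subject to
    A^T x >= v for every column response, sum(x) = 1, x >= 0.
    """
    A = np.asarray(payoff, dtype=float)
    m, n = A.shape
    c = np.r_[np.zeros(m), -1.0]              # linprog minimizes, so minimize -v
    A_ub = np.c_[-A.T, np.ones(n)]            # column j: v - sum_i A[i, j] x_i <= 0
    b_ub = np.zeros(n)
    A_eq = np.r_[np.ones(m), 0.0].reshape(1, -1)   # probabilities sum to one
    b_eq = np.array([1.0])
    bounds = [(0, 1)] * m + [(None, None)]    # x_i in [0, 1], v free
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    return res.x[:m], res.x[m]

# Hypothetical 2x3 payoff matrix, purely for illustration.
strategy, value = solve_zero_sum([[3, -1, 2], [1, 4, -2]])
print(strategy, value)
```

The dual solution of the same LP gives the column player's optimal mixed strategy, which is the symmetry Dantzig's algorithm exploits.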
Hacettepe Journal of Mathematics and Statistics, 2017
Exponential smoothing models are simple, accurate and robust forecasting models, and for these reasons they are widely applied in the literature. Holt's linear trend method is a valuable extension of exponential smoothing that helps deal with trending data. In this study, we propose a modified version of Holt's linear trend method that eliminates the initialization issue faced when fitting the original model and simplifies the optimization process. The proposed method is compared empirically with the most popular forecasting algorithms based on exponential smoothing and Box-Jenkins ARIMA with respect to predictive performance on the M3-Competition data set, and it is shown to outperform its competitors.
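For readers unfamiliar with the baseline being modified, here is a minimal sketch of the standard (unmodified) Holt linear trend recursion. The smoothing parameters and the naive initialization used below are illustrative assumptions, not the paper's proposed scheme; the paper's contribution is precisely to remove the need for such ad hoc starting values.

```python
import numpy as np

def holt_linear(y, alpha=0.5, beta=0.1, h=6):
    """Standard Holt linear trend: level/trend recursions plus an h-step forecast.

    Initialization (level = first observation, trend = first difference)
    is one common convention among several.
    """
    y = np.asarray(y, dtype=float)
    level, trend = y[0], y[1] - y[0]
    for t in range(1, len(y)):
        prev_level = level
        level = alpha * y[t] + (1 - alpha) * (prev_level + trend)   # level update
        trend = beta * (level - prev_level) + (1 - beta) * trend    # trend update
    return level + trend * np.arange(1, h + 1)                      # linear extrapolation

print(holt_linear([10, 12, 13, 15, 18, 20, 23], h=3))
```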
This study analyzed poisonings caused by pesticides that were reported to the Drug and Poison Information Center (DPIC) in Izmir from 1993 to 2001. Patient demographics, type of pesticide, distribution by month and year, route and reason for exposure, clinical effects, and outcome were analyzed from 25,572 poisoning calls. Pesticide intoxications accounted for 8.8% of the poisonings, with 80.3% insecticides and 19.7% rodenticides. The largest age groups among the poisonings were 0-6 y (28.2%) and 19-29 y (23.2%). More than half of the accidental exposures (57.7%) were in the 0-6 y group, whereas attempted suicide was predominant in the 19-29 y group (39.8%). Most patients were intoxicated with organophosphates (47.6%); 54.1% did not develop signs and symptoms of toxicity. Fatality due to pesticide poisoning was 0.4%. Preventive education against pesticide poisoning is needed.
The objective of this study is to analyze exposures concerning analgesics that were reported to the Dokuz Eylul University Drug and Poison Information Center (DPIC) and admitted to the Department of Emergency Medicine in Dokuz Eylul University Hospital (EMDEU) between 1993 and 2004. Patient demographics, characteristics of the analgesic exposures, treatment attempts performed, and outcomes of the poisoned patients were recorded on standard data forms and then entered into a computerized database. Statistical analysis was performed using the chi-square test. The DPIC recorded 55,962 poisoning calls, 48,654 (86.9%) of them related to medicines. Analgesics accounted for 16.3% (7,939 cases) of all medicine-related exposures; among them, 446 exposures were admitted to EMDEU. More than half of the analgesic exposure calls and admitted cases involved adults (55.9%, 4,440). Females dominated in all age groups (70.3%, 5,578). The mean age was 20.2 ± 11.8 years. The most involved analgesic...
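As a purely illustrative aside, the chi-square test of independence mentioned above operates on a contingency table of counts; a minimal sketch follows. The 2 x 2 table is hypothetical and does not reproduce the study's data.

```python
from scipy.stats import chi2_contingency

# Hypothetical counts: rows = sex, columns = accidental vs. intentional exposure.
table = [[120, 80],
         [60, 140]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.4f}, dof={dof}")
```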
With the advances in pervasive computing and wireless networks, quantitative risk measurement of component (unit) and network availability has become a challenging task. It is widely recognized that the forced outage ratio (FOR) of an embedded hardware component is defined as the failure rate divided by the sum of the failure and repair rates; equivalently, FOR is the non-operating time divided by the total exposure time. However, it is also well documented that FOR is not a constant but a random variable. The probability density function (p.d.f.) of the FOR is the Sahinoglu-Libby (SL) probability model, which applies when certain underlying assumptions hold. The failure and repair rates are taken to be generalized gamma variables whose corresponding shape and scale parameters, respectively, are not equal. The SL model is shown to reduce to a standard two-parameter beta p.d.f. when the shape parameters are identical. Decision-theoretic solutions are sought to compute small-sample Bayesian estimators using informative and non-informative priors for the failure and repair rates with respect to three definitions of loss functions. These estimators of component availability are then propagated to calculate the expected source-target availability of simple and complex networks. On the other hand, an often overlooked fact is that many real-life grid units, from routers and servers in cyber-systems to electric-power generating plants, water-supply networks, and dams, do not operate at a dichotomously full or empty capacity. Because the derated forced outage ratio (DFOR) in the three-state model lacks a closed-form solution, in contrast to the two-state model, the analysis can be conducted by Monte Carlo simulation using empirical Bayesian principles to estimate the full and derated availability of a repairable hardware unit. Industrial applications for units are numerically illustrated. For the three-state model, following the Monte Carlo simulations, the authors show how to estimate the resultant p.d.f.s obtained from the numerical analyses of single units.
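A minimal Monte Carlo sketch of the two-state quantity described above: treat the failure and repair rates as gamma-distributed random variables and propagate them through FOR = λ / (λ + μ) to obtain an empirical density. Using plain (rather than generalized) gamma priors and the hyperparameter values below are simplifying assumptions for illustration only; this is not the Sahinoglu-Libby estimator of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_for(shape_fail, scale_fail, shape_repair, scale_repair, n=100_000):
    """Empirical distribution of FOR = lambda / (lambda + mu) under gamma priors."""
    lam = rng.gamma(shape_fail, scale_fail, size=n)      # failure-rate draws
    mu = rng.gamma(shape_repair, scale_repair, size=n)   # repair-rate draws
    return lam / (lam + mu)

# Illustrative hyperparameters only.
samples = simulate_for(2.0, 0.05, 3.0, 0.5)
print(f"mean FOR = {samples.mean():.4f}, 95% interval = "
      f"({np.quantile(samples, 0.025):.4f}, {np.quantile(samples, 0.975):.4f})")
```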
Hacettepe Journal of Mathematics and Statistics, 2019
In this study, the forecasting accuracy of a new forecasting method that is an alternative to two major forecasting approaches, exponential smoothing (ES) and ARIMA, is evaluated. Using the results from the M3-Competition, the forecasting performance of this method is compared not only to these two major approaches but also to other successful methods derived from them, with respect to simplicity and cost in addition to accuracy.
Like the previous M competitions, the M4 competition resulted in great contributions to the field of forecasting. The Ata method, a new forecasting method that is an alternative to exponential smoothing, competed in this competition with five different models. The results obtained from these five models are discussed in detail in this paper. According to various error metrics, the models perform better than their exponential smoothing-based counterparts. Despite their simplicity, they rank satisfactorily high compared to the other methods. In addition, the forecasting accuracy of simple combinations of these Ata models and ARIMA is reported for the M4 competition data set. The combinations work significantly better than models that are much more complex. Therefore, besides the fact that Ata models perform well on their own, Ata should be considered a candidate for inclusion in forecast combinations.
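The forecast combinations referred to above can be as simple as averaging the point forecasts of two methods and scoring the result. The sketch below uses the symmetric MAPE, one of the M4 accuracy metrics, with made-up hold-out values and forecast vectors standing in for the Ata and ARIMA outputs.

```python
import numpy as np

def smape(actual, forecast):
    """Symmetric MAPE (in percent), one of the M4 accuracy metrics."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return 200.0 * np.mean(np.abs(actual - forecast) / (np.abs(actual) + np.abs(forecast)))

# Hypothetical hold-out values and forecasts from two methods.
y_true = np.array([102.0, 105.0, 103.0, 110.0])
f_method_a = np.array([100.0, 104.0, 106.0, 108.0])   # stand-in for an Ata model
f_method_b = np.array([105.0, 107.0, 101.0, 112.0])   # stand-in for ARIMA
f_combo = (f_method_a + f_method_b) / 2               # simple equal-weight combination

for name, f in [("A", f_method_a), ("B", f_method_b), ("combo", f_combo)]:
    print(name, round(smape(y_true, f), 3))
```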
Exponential smoothing is a popular forecasting method widely used for various kinds of time series data. There are two important problems associated with exponential smoothing. The first is deciding the value of the smoothing constant; the second is determining the initial value. In this study, the importance of the initial value and its effect on forecasting are investigated, and a cross table is constructed to assist researchers.
Dokuz Eylül Üniversitesi Sosyal Bilimler Enstitüsü Dergisi, 2016
The Importance of the Initial Value in Exponential Smoothing Methods. Abstract: Exponential smoothing is a popular forecasting method widely used for various kinds of time series data. There are two important problems associated with exponential smoothing. The first is deciding the value of the smoothing constant; the second is determining the initial value. In this study, the importance of the initial value and its effect on forecasting are investigated, and a cross table is constructed to assist researchers. Keywords: Exponential Smoothing, Simple Exponential Smoothing, Initial Value.
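To make the initialization issue concrete, here is a minimal simple exponential smoothing sketch with two common, equally ad hoc, choices of starting value; the series and smoothing constant are illustrative and not taken from the study.

```python
import numpy as np

def ses_forecast(y, alpha, s0):
    """One-step-ahead forecast from simple exponential smoothing started at s0."""
    s = s0
    for obs in y:
        s = alpha * obs + (1 - alpha) * s   # smoothed level update
    return s

y = [23.0, 25.0, 24.0, 27.0, 26.0, 29.0]

# Two frequently used initializations: the first observation,
# or the mean of the first few observations. For small alpha the
# choice lingers in the forecast for a long time.
for label, s0 in [("s0 = y[0]", y[0]), ("s0 = mean(y[:3])", float(np.mean(y[:3])))]:
    print(label, round(ses_forecast(y, alpha=0.3, s0=s0), 3))
```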
In recent years, the large increase in the number of digitally stored images has steadily increased the need for Content-Based Image Retrieval (CBIR) techniques. In this field, color moments are widely used to represent the color content of images and give quite good results. This article describes an alternative computation method, called Low-Cost Color Moments (DMRM, from the Turkish Düşük Maliyetli Renk Momentleri), which computes the color moments from a classed (binned) frequency distribution. It is observed that the DMRM method provides a significant reduction in computational cost with no significant loss in accuracy.
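The idea of computing color moments from a binned frequency distribution rather than from every pixel can be sketched as follows; the bin count, range, and channel handling are generic assumptions, not the exact DMRM procedure of the article.

```python
import numpy as np

def color_moments_from_histogram(channel, bins=32):
    """Mean, standard deviation and skewness of one color channel,
    computed from a binned frequency distribution instead of raw pixels."""
    counts, edges = np.histogram(channel, bins=bins, range=(0, 256))
    centers = (edges[:-1] + edges[1:]) / 2
    p = counts / counts.sum()                       # relative bin frequencies
    mean = np.sum(p * centers)
    var = np.sum(p * (centers - mean) ** 2)
    std = np.sqrt(var)
    skew = np.sum(p * (centers - mean) ** 3) / std**3 if std > 0 else 0.0
    return mean, std, skew

# Hypothetical 8-bit channel values standing in for an image's red channel.
rng = np.random.default_rng(1)
red = rng.integers(0, 256, size=10_000)
print(color_moments_from_histogram(red))
```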