Value at Risk Research Papers
Unlike the value at risk, the expected shortfall is a coherent measure of risk. In this paper, we discuss estimation of the expected shortfall of a random variable Y_t with special reference to the case when auxiliary information is available in the form of a set of predictors X_t. We consider three classes of estimators of the conditional expected shortfall of Y_t given X_t: a class of fully non-parametric estimators and two classes of analog estimators based, respectively, on the empirical conditional quantile function and the empirical conditional distribution function. We study their sampling properties by means of a set of Monte Carlo experiments and analyze their performance in an empirical application to financial data.
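A compact illustration of one of these estimator families may help. Below is a minimal Python sketch of an analog estimator of the conditional expected shortfall built from a kernel-weighted empirical conditional distribution; the function name, Gaussian kernel and bandwidth are our illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

def conditional_expected_shortfall(y, x, x0, alpha=0.05, bandwidth=0.5):
    """Kernel-weighted analog estimator of ES_alpha(Y | X = x0).

    Weights each observation y_t by a Gaussian kernel in x_t, reads the
    conditional alpha-quantile (VaR) off the weighted empirical CDF,
    then averages the outcomes beyond it. A sketch, not the paper's
    exact estimator.
    """
    w = np.exp(-0.5 * ((x - x0) / bandwidth) ** 2)
    w = w / w.sum()

    # weighted empirical conditional quantile (the VaR at level alpha)
    order = np.argsort(y)
    cum_w = np.cumsum(w[order])
    var_alpha = y[order][np.searchsorted(cum_w, alpha)]

    # expected shortfall: weighted mean of outcomes in the lower tail
    tail = y <= var_alpha
    return var_alpha, np.sum(w[tail] * y[tail]) / np.sum(w[tail])

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = 0.5 * x + rng.standard_t(df=4, size=5000)  # heavy-tailed returns
print(conditional_expected_shortfall(y, x, x0=0.0))
```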
How malleable are preferences? This paper provides experimental evidence on the extent to which insurance sellers can influence buyers and whether mandatory information disclosure offsets these effects. The experiment involves 214 subjects who were seeking or had recently obtained unsecured loans, and 25 sellers with experience of commission selling and in receipt of substantial performance pay. Potential insurance buyers had up to £5,750 value at risk. Extravert sellers are particularly effective. The most persuasive sellers raise willingness to pay by a substantial amount relative to the least successful. These differences arise even though sellers are only given a couple of minutes to make a "pitch". Trusting buyers seem the most susceptible to seller influence. Revealing information concerning value for money (the claims ratio) or the seller's commission is regarded as important by most consumers but is found to have negligible effects on behaviour. Knowing the claims ratio tends to make subjects less confident in their decisions. Overall, there is little evidence that disclosure benefits consumers.
The extent to which the money supply affects the aggregate cash balance demanded at a certain level of nominal income and interest rates is determined by the interest-rate elasticity and stability of the money demand. An actuarial approach is adopted in this paper for dealing with investors facing liquidity constraints and maintaining different expectations about risks. Under such circumstances, a level of surplus exists which maximises expected value. Moreover, when the distorted probability principle is introduced, the optimal liquidity demand is expressed as a Value at Risk, and the comonotonic dependence structure determines the amount of money demanded by the economy. As a consequence, the more unstable the economy, the greater the interest-rate elasticity of the money demand. Moreover, for different parametric characterisations of risks, market parameters are expressed as the weighted average of sectorial or individual estimations, in such a way that multiple equilibria of the economy are possible.
In this paper we solve the problem of static portfolio allocation based on historical Value at Risk (VaR) by using a genetic algorithm (GA). VaR is a predominantly used measure of the risk of extreme quantiles in modern finance. Estimating the historical VaR of a static portfolio requires calculating the time series of portfolio returns. To avoid daily recalculation of the proportion of capital invested in portfolio assets, we introduce a novel set of weight parameters based on proportions of shares. Optimal portfolio allocation in the VaR context is computationally very complex, since VaR is not a coherent risk metric and the number of local optima increases exponentially with the number of securities. We present two different single-objective techniques and a multi-objective technique for generating mean-VaR efficient frontiers. The results document good risk/reward characteristics of the solution portfolios, with a trade-off between the ability to control the diversity of solutions and computation time.
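A sketch of the share-based parameterization the abstract describes may be helpful: holding share counts fixed means the capital proportions drift with prices, so the portfolio return series needs no daily reweighting before the historical VaR is read off. All names and numbers below are our illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def historical_var_from_shares(prices, shares, alpha=0.05):
    """Historical VaR of a buy-and-hold portfolio.

    prices: (T, N) array of asset prices; shares: (N,) fixed share counts.
    Fixing share counts avoids recalculating capital weights each day,
    which is the idea behind the share-based parameterization above.
    """
    values = prices @ shares                   # daily portfolio value
    returns = values[1:] / values[:-1] - 1.0   # portfolio return series
    return -np.quantile(returns, alpha)        # loss at the alpha tail

rng = np.random.default_rng(1)
prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, (500, 3)), axis=0))
print(historical_var_from_shares(prices, shares=np.array([10.0, 5.0, 2.0])))
```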
In order to promote competition in the Indian electricity market under the Indian Electricity Act 2003, power exchanges were established and have been running successfully since 2008. A power exchange is an electronic exchange that provides a common place for electricity trading. The price of electricity is more volatile than that of any other commodity, and this volatility introduces risk for market participants. Hence, the assessment and control of risk have become essential tasks for players in the power market. In this paper, Value at Risk (VaR) and Conditional Value at Risk (CVaR) methods are used to assess price risk. Market Clearing Price (MCP) data from the Indian Energy Exchange (IEX) are used for the risk calculations, and the results are discussed in detail.
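For concreteness, here is a minimal sketch of the empirical VaR and CVaR calculation applied to a loss series; the synthetic heavy-tailed data below stands in for the IEX market clearing prices, and all names are ours.

```python
import numpy as np

def var_cvar(losses, alpha=0.95):
    """Empirical VaR and CVaR at confidence level alpha.

    VaR is the alpha-quantile of the loss distribution; CVaR is the
    mean loss beyond it, so CVaR >= VaR by construction.
    """
    var = np.quantile(losses, alpha)
    cvar = losses[losses >= var].mean()
    return var, cvar

rng = np.random.default_rng(2)
mcp_changes = rng.standard_t(df=3, size=2000) * 50.0  # volatile, heavy-tailed
print(var_cvar(mcp_changes))
```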
The new Basel II regulation contains a number of new regulatory features. Most importantly, internal ratings will be given a central role in the evaluation of the riskiness of bank loans. Another novelty is that retail credit and loans to small and medium-sized enterprises will receive special treatment, in recognition of the fact that the riskiness of such exposure derives to a greater extent from idiosyncratic risk and much less from common factor risk. Much of the work done on the differences between the risk properties of retail, SME and corporate credit has been based on parameterized models of credit risk. In this paper we present new quantitative evidence on the implied credit loss distributions for two Swedish banks, using a non-parametric Monte Carlo re-sampling method following Carey [1998]. Our results are based on a panel data set containing both loan and internal rating data from the banks' complete business loan portfolios over the period 1997-2000. We compute the credit loss distributions that each rating system implies and compare the required economic capital implied by these loss distributions with the regulatory capital under Basel II. By exploiting the fact that a subset of all businesses in the sample is rated by both banks, we can generate loss distributions for SME, retail and corporate credit portfolios with a constant risk profile. Our findings suggest that special treatment for retail credit and SME loans may not be justified. We also investigate whether any alternative definition of SMEs and retail credit would warrant different risk weight functions for these types of exposure. Our results indicate that it may be difficult to find a simple risk weight function that can account for the differences in portfolio risk properties between banks and asset types.
- by Tor Jacobson et al. · Monte Carlo, Panel Data, Economic Capital, Sampling methods
We compare capital requirements derived by tail conditional expectation (TCE) with those derived by tail conditional median (TCM) and find that there is no clear-cut relationship between these two measures in empirical data. Our results highlight the relevance of TCM as a robust alternative to TCE, especially for regulatory control.
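The two measures differ only in the statistic taken over the tail beyond the VaR threshold, which a few lines make concrete; the names and synthetic data below are our illustrative choices.

```python
import numpy as np

def tce_tcm(losses, alpha=0.99):
    """Tail conditional expectation vs. tail conditional median.

    Both condition on losses beyond the alpha-quantile (VaR); TCE
    averages the tail while TCM takes its median, which makes TCM
    robust to a handful of extreme outliers.
    """
    var = np.quantile(losses, alpha)
    tail = losses[losses >= var]
    return tail.mean(), np.median(tail)

rng = np.random.default_rng(3)
losses = rng.pareto(2.5, size=100_000)  # heavy-tailed loss sample
print(tce_tcm(losses))  # TCE typically exceeds TCM under heavy tails
```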
Risk managers are increasingly required by international regulatory institutions to adopt accurate techniques for measuring and controlling the financial risks of portfolios. The task first requires identifying the different risk sources affecting the portfolio and measuring their impact, and then adopting appropriate portfolio strategies aimed at neutralising these risks. The comprehensive concept of Value-at-Risk (VaR) as a maximum tolerable loss, at a given confidence level, has in this regard become the industry standard in risk management.
We examine the risk characteristics and capital adequacy of hedge funds through the Value-at-Risk approach. Using extensive data on nearly 1,500 hedge funds, we find that only 3.7% of live and 10.9% of dead funds are undercapitalized as of March 2003. Moreover, the undercapitalized funds are relatively small and constitute a tiny fraction of total fund assets in our sample. Cross-sectionally, the variability in fund capitalization is related to size, investment style, age, and management fee. Hedge fund risk and capitalization also display significant time variation. Traditional risk measures like standard deviation or leverage ratios fail to detect these trends.
As a benchmark for measuring market risk, Value-at-Risk (VaR) reduces the risk associated with any kind of asset to just a number (an amount in terms of a currency), which can be well understood by regulators, board members, and other interested parties. This paper employs a new kind of VaR approach, due to Engle and Manganelli [4], to forecast oil price risk. In doing so, we provide two original contributions: we introduce a new exponentially weighted moving average CAViaR model and develop a least squares regression model for multi-period VaR prediction.
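For reference, the symmetric absolute value CAViaR recursion of Engle and Manganelli, together with an EWMA-style restriction of the kind the abstract introduces; the second equation is our hedged paraphrase of such a variant, not the paper's exact model.

```latex
% Symmetric absolute value CAViaR: the conditional quantile q_t
% evolves autoregressively in past quantiles and past returns.
\begin{align}
  q_t(\beta) &= \beta_1 + \beta_2\, q_{t-1}(\beta) + \beta_3\, \lvert y_{t-1} \rvert \\
  % An EWMA-style restriction ties the coefficients together
  % (our illustrative reading of an "exponentially weighted" variant):
  q_t(\lambda) &= \lambda\, q_{t-1}(\lambda) + (1-\lambda)\, c\, \lvert y_{t-1} \rvert
\end{align}
```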
Risk analyses require not only knowledge about the impact of different types of hazards but also information about the elements and values at risk. This article introduces a methodology for a countrywide estimation of asset values for commercial and industrial properties, using Germany as an example. It consists of a financial appraisal of asset values on the municipal level and a further disaggregation by means of land use data. Novelties are the distinction of 60 economic activities, the consideration of production site sizes and the application of a dasymetric mapping technique for a sector-specific estimation and disaggregation of asset values. A validation with empirical data confirms the feasibility of the calculation. The resulting maps can be used for loss estimations, e.g. in the framework of cost–benefit analyses that aim to evaluate hazard mitigation measures, or for portfolio analyses by banks and insurance companies. The approach can be used for other countries if the necessary data are available (mainly in industrialized countries). In any case, it reveals the critical points when estimating commercial and industrial asset values.
Academic research has highlighted the inherent flaws within the RiskMetrics model and demonstrated the superiority of the GARCH approach in-sample. However, these results do not necessarily extend to forecasting performance. This paper asks whether RiskMetrics volatility forecasts are adequate in comparison to those obtained from GARCH models. To answer this question, stock index data are taken from 31 international markets and subjected to two exercises: a straightforward volatility forecasting exercise and a Value-at-Risk exceptions forecasting competition. Our results provide some simple answers. When forecasting the volatility of the G7 stock markets, the APARCH model in particular provides superior forecasts that are significantly different from those of the RiskMetrics models in over half the cases. This result also extends to the European markets, with the APARCH model typically preferred. For the Asian markets the RiskMetrics model performs well and is significantly dominated by the GARCH models for only one market, although there is evidence that the APARCH model provides a better forecast for the larger Asian markets. Regarding the Value-at-Risk exercise, when forecasting the 1% VaR the RiskMetrics model does a poor job and is typically the worst performing model, while the APARCH model again does well. When forecasting the 5% VaR, however, the RiskMetrics model does provide adequate performance. In short, the RiskMetrics model only performs well in forecasting the volatility of small emerging markets and for broader VaR measures.
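The two model families being compared reduce to simple variance recursions, sketched below in Python; the parameters are illustrative, and both estimation and the APARCH extension are omitted for brevity.

```python
import numpy as np

def ewma_vol(returns, lam=0.94):
    """RiskMetrics EWMA variance recursion with the standard lambda = 0.94."""
    var = np.var(returns[:20])            # seed with an initial window
    for r in returns:
        var = lam * var + (1 - lam) * r**2
    return np.sqrt(var)

def garch11_vol(returns, omega, alpha, beta):
    """One-step GARCH(1,1) forecast given (already estimated) parameters."""
    var = omega / (1 - alpha - beta)      # start at the unconditional level
    for r in returns:
        var = omega + alpha * r**2 + beta * var
    return np.sqrt(var)

rng = np.random.default_rng(4)
r = rng.normal(0, 0.01, 1000)
print(ewma_vol(r), garch11_vol(r, omega=1e-6, alpha=0.05, beta=0.9))
```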
Extreme stock returns need to be captured for a risk management function to successfully estimate unexpected portfolio losses. Traditional value-at-risk models based on parametric distributions are not able to capture the extremes in emerging markets, where high volatility and nonlinear behavior of returns are observed. The Extreme Value Theory (EVT) approach with conditional quantiles proposed by McNeil and Frey (2000) is based on a limit theorem applied to the extremes rather than the mean of the return distribution: the limiting distribution of extreme returns always has the same form, regardless of the distribution of the parent variable. This paper uses 8 filtered EVT models built on conditional quantiles to estimate value-at-risk for the Istanbul Stock Exchange (ISE). The performance of the filtered expected shortfall models is compared to that of GARCH, GARCH with Student's t distribution, GARCH with skewed Student's t distribution and FIGARCH, using alternative back-testing procedures.
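A sketch of the McNeil-Frey two-step idea applied to standardized residuals: fit a generalized Pareto distribution (GPD) to exceedances over a high threshold and invert the tail formula. The threshold level and the data are our illustrative assumptions, and the GARCH filtering step is omitted.

```python
import numpy as np
from scipy.stats import genpareto

def evt_quantile(z, q=0.99, threshold_q=0.90):
    """GPD-based tail quantile for standardized residuals z.

    Fits a GPD to exceedances over threshold u, then inverts
        VaR_q = u + (beta/xi) * (((1 - q) / (Nu/N)) ** (-xi) - 1).
    The conditional VaR would be mu_t + sigma_t * VaR_q, with mu_t and
    sigma_t coming from a GARCH filter (not shown here).
    """
    u = np.quantile(z, threshold_q)
    exceed = z[z > u] - u
    xi, _, beta = genpareto.fit(exceed, floc=0.0)
    nu, n = len(exceed), len(z)
    return u + (beta / xi) * (((1 - q) / (nu / n)) ** (-xi) - 1.0)

rng = np.random.default_rng(5)
print(evt_quantile(rng.standard_t(df=4, size=10_000)))
```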
Many financial institutions assess portfolio decisions using RAROC, the ratio of expected return to risk (or 'economic') capital. We use asset pricing theory to determine the appropriate hurdle rate, finding that this varies with the skewness of asset returns. We quantify this effect under a range of assumptions, showing that the RAROC hurdle rate differs substantially: it is higher by a factor of five or more for equity, which has a right skew, than for debt, which has a pronounced left skew, and it also differs between different qualities of debt exposure. We discuss implications for both financial institution risk management and supervision.
The Basel II Accord requires regulatory capital to cover stress tests, yet no coherent and objective framework for stress testing portfolios exists. We propose a new methodology for stress testing in the context of market risk models that can incorporate both volatility clustering and heavy tails. Empirical results compare the performance of eight risk models with four possible conditional and unconditional return distributions over different rolling estimation periods. When applied to major currency pairs using daily data spanning more than 20 years, we find that stress test results should have little impact on current levels of foreign exchange regulatory capital.
- by Elizabeth Sheedy et al. · Applied Mathematics, GARCH, Value at Risk, Foreign Exchange
In this paper, we examine the effect of implicit seller reserves on the estimation of value-at-risk based on historical asset sales data. We direct our examination toward how and whether fine art might prove an appropriate form of loan collateral for banks and other financial institutions. Using a data set of French Impressionist paintings brought to auction from 1985 to 2001, we control for the effect of works that are bought in-house to construct a distribution of potential sale values that corrects for sample selection bias. It turns out that the downside risk surrounding deviations of auction prices from expert presale estimates depends critically on how buy-ins are incorporated. If downside risk is assessed solely on historical experience with successful auction sales, the data appear to support loan-to-value ratios between 50% and 100% larger than loan-to-value ratios that countenance the existence of seller reserves. The auction process, however, is quantifiable and can reveal the risk information required for loan consideration.
This paper provides details of the pricing and hedging characteristics of re-settable strike-price puts as compared to plain vanilla puts and makes explicit the differences between the two. Examples are provided of exposures that arise for issuers of re-settable strike-price puts, either as separate instruments or as instruments embedded in investment products such as protected index notes. The paper also provides guidance to practitioners concerning the need to hedge re-settable strike-price puts differently from plain vanilla puts when the time to reset of the option is small and the underlying is near S*, a key value defined in the paper. Specifically, the gamma of the re-settable strike-price put is much greater than the gamma of the plain vanilla put, so a delta hedge for the former will likely need to be reset more frequently than for the latter, particularly near the reset date.
High-performance computing in finance: The last 10 years and the next, by Stavros A. Zenios
The aim of this paper is to forecast accurately and efficiently from multivariate generalized autoregressive conditional heteroscedastic models. The Rotated Dynamic Conditional Correlation (RDCC) model with Normal, Student's-t and Multivariate Exponential Power error distributions was used to account for the heavy tails commonly observed in financial time series data. The daily stock price data of the Karachi, Bombay, Kuala Lumpur and Singapore stock exchanges from January 2008 to December 2017 were used. The predictive capability of the RDCC models with the various error distributions in forecasting one-day-ahead Value-at-Risk (VaR) was assessed by several back-testing procedures. The empirical results reveal that the RDCC model with Student's-t distribution produces more accurate and reliable risk forecasts than the competing models.
Modeling and forecasting the volatility of Brazilian asset returns: a realized variance approach
We consider the single period stochastic inventory (newsvendor) problem with downside risk constraints. The aim in the classical newsvendor problem is to maximize expected profit. This formulation does not take into account the risk of earning less than a desired target profit, or of losing more than an acceptable level, due to the randomness of demand. We utilize Value at Risk (VaR) as the risk measure in a newsvendor framework and investigate the multi-product newsvendor problem under a VaR constraint. To this end, we first derive the exact distribution function for the two-product newsvendor problem and develop an approximation method for the profit distribution of the N-product case (N > 2). A mathematical programming approach is used to determine the solution of the newsvendor problem with a VaR constraint. This approach allows us to handle a wide range of cases, including the correlated demand case, which yields new results and insights. The accuracy of the approximation method and the effects of the system parameters on the solution are investigated numerically.
Default probabilities (PDs) and correlations play a crucial role in the New Basel Capital Accord. In commercial credit risk models they are an important constituent. Yet, modeling and estimation of PDs and correlations is still under active discussion. We show how the Basel II one factor model which is used to calibrate risk weights can be extended to a model for estimating PDs and correlations. The important advantage of this model is that it uses actual information about the point in time of the credit cycle. Thus, uncertainties about the parameters which are needed for Value-at-Risk calculations in portfolio models may be substantially reduced. First empirical evidence for the appropriateness of the models and underlying risk factors is given with S&P data.
- by Thilo Liebig et al. · Credit Rating, Credit Risk, Value at Risk, Risk factors
Pharmaceutical development and manufacturing systems typically rely on a Quality by Testing (QbT) model that uses release testing and other measures to ensure product quality. However, there is a significant gap between typical pharmaceutical production system capability and supplied quality. To sustain high levels of product supply quality, the industry incurs a high cost of quality and retains value at risk. This paper presents research results from a systems engineering perspective, using case study data that quantitatively evaluate the gap between pharmaceutical production system sigma and supplied quality. It also identifies the extent to which emerging Quality by Design (QbD) eliminates system contradictions that prohibit higher production system sigma performance.
Returns to a model farm are simulated to assess the impact of marketing and insurance risk management tools as measured by mean net returns and returns at 5% value-at-risk (VaR). Results indicate that revenue insurance strategies and strategies involving a combination of price and yield protection provide substantial downside revenue protection, while mean net returns only modestly differ from the benchmark harvest sale strategy when considering all years between 1986 and 2000.
This paper studies a strategy that minimizes the risk of a position in a zero coupon bond by buying a percentage of a put option, subject to a fixed budget available for hedging. We consider two popular risk measures: Value-at-Risk (VaR) and Tail Value-at-Risk (TVaR). We derive a formula for determining the optimal strike price for this put option in the case of a Hull-White stochastic interest rate model. We calibrate the Hull-White model parameters to a set of cap prices in order to provide a credible numerical illustration. We demonstrate the relevance of searching for the optimal strike price, since moving away from the optimum implies a loss due to an increased (T)VaR. In this way, we extend the results of [Ahn et al., 1999], who minimize VaR for a position in a share.
Benati and Rizzi [S. Benati, R. Rizzi, A mixed integer linear programming formulation of the optimal mean/Value-at-Risk portfolio problem, European Journal of Operational Research 176 (2007) 423-434], in a recent proposal of two linear integer programming models for portfolio optimization using Value-at-Risk as the measure of risk, claimed that the two counterpart models are equivalent. This note shows that this claim is only partly true. The second model attempts to minimize the probability of the portfolio return falling below a certain threshold instead of minimizing the Value-at-Risk. However, the discontinuity of real-world probability values makes the second model impractical. An alternative model with Value-at-Risk as the objective is thus proposed.
This paper links communications and media usage to social and household economics boundaries. It highlights that, in present-day society, communications and media are a necessity but not always affordable, and that they furthermore open the door to addictive behaviours which raise additional financial and social risks. A simple and efficient methodology, compatible with state-of-the-art social and communications business statistics, is developed, which produces the residual communications and media affordability budget and ultimately the value-at-risk in terms of usage and tariffs. Sensitivity analysis provides precious information on communications and media adoption on the basis of affordability. Case data are surveyed from various countries. ICT policy recommendations are made to support widespread and responsible communications access.
The credit value-at-risk model underpinning the Basel II Internal Ratings-Based approach assumes that idiosyncratic risk has been diversified away fully in the portfolio, so that economic capital depends only on systematic risk contributions. We develop a simple methodology for approximating the effect of undiversified idiosyncratic risk on VaR. The supervisory review process (Pillar 2) of the new Basel framework offers a potential venue for application of the proposed granularity adjustment (GA).
The role of information processing in banking intermediation is of prime importance. A bank can access different types of information to manage risk by covering Value at Risk through the allocation of capital. Hard information, contained in accounting documents and produced by scoring models, is quantitative and...
This Master's thesis applies the GARCH model and fat-tailed distributions to the calculation of market risk, using the parametric VaR methodology, for an equity portfolio over the period 1/1/2014 to 31/12/2018 covering the main banking and financial stocks listed on the Spanish stock market and belonging to the IBEX-35: Santander, BBVA, CaixaBank, Sabadell and Bankia. The empirical methodology applies mathematical, statistical and econometric techniques. Results are compared between the normal distribution and the Student's t distribution (fat tails), and according to how volatility is computed (standard or GARCH). Finally, backtesting is performed on the portfolio results over the period 1/1/2019 to 30/6/2019, and the final conclusions of the study are drawn.
Value at risk is currently the standard in risk reporting. In this document we describe methods to more accurately assess the risk in a portfolio with derivatives such as options. We describe the delta-gamma method and Monte Carlo simulation. These techniques can be used to enhance a classical Value at Risk model. To show a practical implementation, we also give some VBA pseudo-code structures that can be used in a program like Excel.
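Since the document promises VBA pseudo-code, a Python stand-in for the delta-gamma Monte Carlo method may be useful here; the position parameters are illustrative, not taken from the document.

```python
import numpy as np

def delta_gamma_var(S, delta, gamma, sigma, alpha=0.01, n=100_000, seed=6):
    """Monte Carlo VaR of an option position via a delta-gamma approximation.

    The P&L of the position is approximated by the second-order expansion
        dP ~= delta * dS + 0.5 * gamma * dS**2,
    with dS simulated as a normal one-day price move. This captures the
    curvature that a pure delta (linear) VaR model misses.
    """
    rng = np.random.default_rng(seed)
    dS = S * sigma * rng.standard_normal(n)        # simulated price moves
    pnl = delta * dS + 0.5 * gamma * dS**2         # delta-gamma P&L
    return -np.quantile(pnl, alpha)                # 99% one-day VaR

print(delta_gamma_var(S=100.0, delta=0.6, gamma=0.05, sigma=0.02))
```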
This paper discusses the construction of an optimal portfolio containing a number of stocks belonging to the LQ45 index. LQ45 stocks were chosen because they are highly liquid and attract many participants in the stock market. The data used are daily stock prices for the period 1 January 2016 to 31 March 2016, serving as the sample for computing risk (variance), the expected return of each stock, and VaR using the historical price method, and for determining the optimal portfolio. The results show that the two-stock combination with the lowest risk according to the portfolio VaR formula is the portfolio of HMSP and BBCA, with a portfolio VaR borne by the investor of Rp 35,973,642.78 on a total fund of Rp 1 billion.
Risk is no longer a constraint on business; it has become an important source of competitive advantage. Risk is now an essential part of the business environment in general: without risk there is no profit, while ignoring risk can threaten even the largest institutions with failure and bankruptcy, a threat that reflects on an institution's reputation, its financial stability, and the continuity of its existence.
Risk and its management have occupied the minds of many researchers, academic and practitioner alike, given their fundamental impact on the soundness and stability of commercial enterprises and financial institutions. The importance of risk management has grown after the successive financial crises that shook the global financial and banking system, prompting the emergence of an innovative financial industry that protects enterprises from various risks, known as the risk management industry, with its own producers, tools, and markets.
This book therefore presents the reader with the most important topics in financial and banking risk, its modern management, and its application in practice. The first chapter gives a brief overview of the concept of risk management, its causes, its types, and the methods of managing it.
The second chapter is devoted to financial risks (credit risk, interest rate risk, exchange rate risk): defining these risks, measuring the losses resulting from them using mathematical and statistical models, and proposing internal methods and techniques for managing them in-house, with examples and applied cases showing how these methods and techniques are used.
The third chapter explains and applies the financial engineering instruments designed for risk management (financial derivatives), consisting of options, forwards, futures, and swaps. These contracts are explained in a simplified manner, the mechanics of applying them in practice are illustrated, and specific strategies are proposed for using these instruments properly.
The fourth chapter addresses banking risks, consisting of credit portfolio risk, market portfolio risk, bank liquidity risk, and operational risk. The author starts from the consolidated financial statements of banks, explaining their main items, and then discusses banking risks and the methods of managing them in a practical way that includes multiple applied cases.
The fifth chapter is devoted to the Basel Committee on Banking Supervision, the main player in setting the prudential rules that help achieve the soundness of the banking system at both the local and global levels. The methods proposed by the Committee for managing banking risks are explained in detail, with practical and applied cases showing how these methods are used.
The focus of this work is the computation of efficient strategies for commodity trading in a multi-market environment. In today's "global economy" commodities are often bought in one location and then sold (right away, or after some storage period) in different markets. Thus, a trading decision in one location must be based on expectations about future price curves in all other relevant markets, and on current and future storage and transportation costs. Investors try to compute a strategy that maximizes expected return, usually with some limitations on assumed risk. With standard stochastic assumptions on commodity price fluctuations, computing an optimal strategy can be modeled as a Markov decision process (MDP). However, in general such a formulation does not lead to efficient algorithms. In this work we propose a model for representing the multi-market trading problem and show how to obtain efficient structured algorithms for computing optimal strategies for a number of commonly used trading objective functions (Expected NPV, Mean-Variance, and Value at Risk).
In this paper we document that realized variation measures constructed from high-frequency returns reveal a large degree of volatility risk in stock and index returns, where we characterize volatility risk by the extent to which forecasting errors in realized volatility are substantive. Even though returns standardized by ex post quadratic variation measures are nearly Gaussian, this unpredictability brings considerably more uncertainty to the empirically relevant ex ante distribution of returns. Carefully modeling this volatility risk is fundamental.
Value-at-Risk (VaR) is the most popular tool for risk measurement in the banking and finance industry today. This study estimates volatility for market risk measurement in order to calculate diversified VaR. Using the multivariate GARCH BEKK model proposed by Engle and Kroner (1993) and the variance-covariance matrix method, this paper compares both methods in generating volatility forecasts to estimate diversified VaR, particularly for market risk. The paper examines three exchange rates: GBP/USD, USD/JPY, and USD/SGD, over the period 2000 to 2005. The empirical results show that the GARCH BEKK model, though it has a more sophisticated specification, performs better than the variance-covariance matrix method in estimating volatility. The estimation results are as follows: the VaR estimate generated by GARCH BEKK is 0.1388%, which leads to a capital charge of 5.2063%, while the estimate generated by the variance-covariance matrix is 0.1982%, which leads to a capital charge of 7.433%. The results also show that volatility changes significantly every 125 observations, or at least once in three months. This suggests that volatility forecasts should be re-evaluated at least every three months.
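The variance-covariance benchmark in the abstract reduces to the delta-normal formula VaR = z_alpha * sqrt(w' Sigma w). A minimal sketch with synthetic FX-like returns follows; the weights and data are our illustrative choices, and the BEKK side would replace the sample covariance with a forecast.

```python
import numpy as np
from scipy.stats import norm

def diversified_var(returns, weights, alpha=0.99):
    """Variance-covariance (delta-normal) portfolio VaR.

    Computes z_alpha * sqrt(w' Sigma w) from the sample covariance of
    the return series; a GARCH BEKK forecast of Sigma would slot in
    here in place of np.cov.
    """
    cov = np.cov(returns, rowvar=False)
    port_vol = np.sqrt(weights @ cov @ weights)
    return norm.ppf(alpha) * port_vol

rng = np.random.default_rng(7)
fx = rng.multivariate_normal([0, 0, 0],
                             [[1.0, 0.3, 0.2],
                              [0.3, 1.0, 0.4],
                              [0.2, 0.4, 1.0]],
                             size=1500) * 0.005   # three FX return series
print(diversified_var(fx, weights=np.array([1/3, 1/3, 1/3])))
```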
We provide a mathematical definition of fragility and antifragility as negative or positive sensitivity to a semi-measure of dispersion and volatility (a variant of negative or positive "vega") and examine the link to nonlinear effects. We integrate model error (and biases) into the fragile or antifragile context. Unlike risk, which is linked to psychological notions such as subjective preferences (hence cannot apply to a coffee cup) we offer a measure that is universal and concerns any object that has a probability distribution (whether such distribution is known or, critically, unknown).
Financial time series analysis deals with the understanding of data collected on financial markets. Several parametric distribution models have been entertained for describing, estimating and predicting the dynamics of financial time series. Alternatively, this article considers a Bayesian semiparametric approach. In particular, the usual parametric distributional assumptions of the GARCH-type models are relaxed by entertaining the class of location-scale mixtures
Basel III seeks to improve the financial sector's resilience to stress scenarios which calls for a reassessment of banks' credit risk models and, particularly, of their dependence on business cycles. This paper advocates a Mixture of Markov Chains (MMC) model to account for stochastic business cycle effects in credit rating migration risk. The MMC approach is more efficient and provides superior out-of-sample credit rating migration risk predictions at long horizons than a naïve approach that conditions deterministically on the business cycle phase. Banks using the MMC estimator would counter-cyclically increase capital by 6% during economic expansion and free up to 17% capital for lending during downturns relative to the naïve estimator. Thus the MMC estimator is well aligned with the Basel III macroprudential initiative to dampen procyclicality by reducing the recession-versus-expansion gap in capital buffers. JEL classifications: C13; C41; G21; G28.
- by Elena Kalotychou et al. · Credit Rating, Credit Risk, Value at Risk, Financial Sector
Value at Risk (VaR) is a common statistical method that has recently been used to measure market risk. In other words, it is a risk measure that predicts the maximum loss of a portfolio at a given confidence level. Banks generally use Value at Risk to determine the minimum capital to hold against market risks. It can also be used to calculate the maximum loss of investment portfolios in stock markets. The purpose of this study is to compare the VaR and Markowitz efficient frontier approaches in terms of portfolio risk. To this end, we calculate optimal portfolios by mean-variance portfolio optimization using the daily closing prices of the ninety-one stocks traded under the Ulusal-100 index of the Istanbul Stock Exchange in 2011. Then, for each designated portfolio, a Monte Carlo simulation was run one thousand times to calculate the VaR. We conclude that there is a parallel relationship between the calculated optimum portfolio risks and the VaR values of the portfolios.
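A minimal sketch of the Monte Carlo VaR step described above, drawing one thousand scenarios from estimated moments; the moments, weights and the multivariate-normal assumption are ours, not the study's exact setup.

```python
import numpy as np

def monte_carlo_var(mu, cov, weights, alpha=0.95, n=1000, seed=8):
    """Monte Carlo portfolio VaR from estimated mean and covariance.

    Draws n multivariate-normal return scenarios and reads VaR off the
    simulated portfolio return distribution at the (1 - alpha) quantile.
    """
    rng = np.random.default_rng(seed)
    scenarios = rng.multivariate_normal(mu, cov, size=n)
    port_returns = scenarios @ weights
    return -np.quantile(port_returns, 1 - alpha)

mu = np.array([0.0004, 0.0006])
cov = np.array([[1.0e-4, 0.2e-4],
                [0.2e-4, 2.0e-4]])
print(monte_carlo_var(mu, cov, weights=np.array([0.5, 0.5])))
```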
- by Umut Uyar et al. · Finance, Portfolio Management, Simulation, Value at Risk
This paper develops a methodology for quantifying model risk in quantile risk estimates. The application of quantile estimates to risk assessment has become common practice in many disciplines, including hydrology, climate change, statistical process control, insurance and actuarial science, and the uncertainty surrounding these estimates has long been recognized. Our work is particularly important in finance, where quantile estimates (called Value-at-Risk) have been the cornerstone of banking risk management since the mid-1980s. A recent amendment to the Basel II Accord recommends additional market risk capital to cover all sources of 'model risk' in the estimation of these quantiles. We provide a novel and elegant framework whereby quantile estimates are adjusted for model risk relative to a benchmark which represents the state of knowledge of the authority that is responsible for model risk. A simulation experiment in which the degree of model risk is controlled illustrates how to quantify model risk and compute the required regulatory capital add-on for banks. An empirical example based on real data shows how the methodology can be put into practice, using only two time series (daily Value-at-Risk and daily profit and loss) from a large bank. We conclude with a discussion of potential applications to non-financial risks.
We analyze the predictive performance of various volatility models for stock returns. To compare their performance, we choose loss functions for which volatility estimation is of paramount importance. We deal with two economic loss functions (an option pricing function and a utility function) and two statistical loss functions (a goodness-of-fit measure for a Value-at-Risk (VaR) calculation and a predictive likelihood function). We implement the tests for superior predictive ability of White and Hansen. We find that, for option pricing, simple models like the RiskMetrics exponentially weighted moving average (EWMA) or a simple moving average, which do not require estimation, perform as well as other more sophisticated specifications. For a utility-based loss function, an asymmetric quadratic GARCH seems to dominate, and this result is robust to different degrees of risk aversion. For a VaR-based loss function, a stochastic volatility model is preferred. Interestingly, the RiskMetrics EWMA model, proposed to calculate VaR, seems to be the worst performer. For the predictive likelihood based loss function, modeling the conditional standard deviation instead of the variance seems to be a dominant modeling strategy.
Copulae provide investors with tools to model the dependency structure among financial products. The choice of copulae plays an important role in successful copula applications; however, selecting copulae usually relies on general goodness-of-fit (GoF) tests which are independent of the particular financial problem. This paper first proposes a pair-copula-GARCH model to construct the dependency structure and simulate the joint returns of five U.S. equities. It then discusses the copula selection problem from the perspective of downside risk management with the so-called D-vine structure, which considers the Joe-Clayton copula and the Student t copula as building blocks for the vine pair-copula decomposition. Value at risk, expected shortfall, and the Omega function are considered as downside risk measures in this study. As an alternative to the traditional bootstrap approaches, the proposed pair-copula-GARCH model provides simulated asset returns for generating future scenarios of portfolio value.
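For reference, standard definitions of the three downside risk measures the study uses; sign conventions for VaR and ES vary across the literature, and the forms below are one common choice, not necessarily the paper's.

```latex
% Downside risk measures for a return R with CDF F and threshold \theta.
\begin{align}
  \operatorname{VaR}_\alpha(R) &= -\inf\{\, r : F(r) \ge 1-\alpha \,\}, \\
  \operatorname{ES}_\alpha(R)  &= -\,\mathbb{E}\bigl[\, R \mid R \le -\operatorname{VaR}_\alpha(R) \,\bigr], \\
  \Omega(\theta) &= \frac{\mathbb{E}\bigl[(R-\theta)^{+}\bigr]}{\mathbb{E}\bigl[(\theta-R)^{+}\bigr]}
\end{align}
```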