Emerald | The Journal of Risk Finance | Table of Contents http://www.emeraldinsight.com/1526-5943.htm Table of contents from the most recently published issue of The Journal of Risk Finance. en-gb Mon, 17 Mar 2014 00:00:00 +0000 © 2013 Emerald Group Publishing Limited editorial@emeraldinsight.com support@emeraldinsight.com

Incentives for complexity in financial regulation http://www.emeraldinsight.com/journals.htm?issn=1526-5943&volume=15&issue=2&articleid=17107138&show=abstract http://www.emeraldinsight.com/10.1108/JRF-12-2013-0083 <strong>Abstract</strong><br /><br /><B>Purpose</B> – The purpose of the paper is to identify the incentives facing the persons who handle financial regulation in practice, and how these incentives shape their attitudes towards the complexity of financial regulation. <B>Design/methodology/approach</B> – Based on recent contributions, the reasons behind the increase in complexity observed in financial regulation are discussed, and the actual incentives of the persons involved in setting up and enforcing regulation are detailed. <B>Findings</B> – The incentives of the persons who shape the drafting and implementation of financial regulation produce a bias towards excessive complexity. Additional complexity reduces the risk of being exposed to aggressive journalism and pressure from populist politicians. Increasing complexity of regulation also benefits large players, since compliance costs are largely fixed. <B>Research limitations/implications</B> – Careful studies measuring the costs of increased complexity in terms of increased resource requirements are needed.
<B>Practical implications</B> – To reduce the bias towards excess complexity, a body of knowledgeable persons with high integrity is required, with an explicit mandate to scrutinise regulation in order to reduce, or at least not increase, complexity. This body must be empowered with sufficient discretion to tackle cases that lack precedents. <B>Originality/value</B> – The paper introduces an explicit discussion of the existing incentives on the regulator side of financial markets, to increase understanding of the issues behind the growing complexity of the rules that guide behaviour in financial markets. Article literatinetwork@emeraldinsight.com (Tom Berglund) Mon, 17 Mar 2014 00:00:00 +0000

A note on the appropriate choice of risk measures in the solvency assessment of insurance companies http://www.emeraldinsight.com/journals.htm?issn=1526-5943&volume=15&issue=2&articleid=17107139&show=abstract http://www.emeraldinsight.com/10.1108/JRF-11-2013-0082 <strong>Abstract</strong><br /><br /><B>Purpose</B> – The concept of value at risk is used in the risk-based calculation of solvency capital requirements in the Basel II/III banking regulations and in the Solvency II insurance regulation framework planned in the European Union. While this measure controls the ruin probability of a financial institution, the expected policyholder deficit (EPD) and expected shortfall (ES) measures, which are relevant from the customer's perspective because they value the amount of the shortfall, are not controlled at the same time. Hence, if the asset-liability situation varies or changes, financial companies may still comply with the capital requirement while the EPD or ES reach unsatisfactory levels. This is a significant drawback of the solvency frameworks. The paper aims to discuss these issues.
<B>Design/methodology/approach</B> – The author develops a model framework in which the relevant risk measures are evaluated using the distribution-free approach of the normal power approximation. This allows analytical approximations of the risk measures to be derived solely from the first three central moments of the underlying distributions. For the case of a reference insurance company, the author calculates the required capital using the ruin probability and EPD approaches, performing sensitivity analyses across different asset allocations and different liability characteristics. <B>Findings</B> – The author concludes that only simultaneous monitoring of the ruin probability and the EPD can lead to satisfactory results guaranteeing a constant level of customer protection. For the reference firm, the author evaluates the relative changes in the capital requirement when the EPD approach is applied alongside the ruin probability approach. Depending on the development of the assets and liabilities, and in the cases illustrated, the reference company would need to provide substantial amounts of additional equity capital. <B>Originality/value</B> – A comparative assessment of alternative risk measures is relevant given the debate among regulators, industry representatives and academics about which measures are adequate. The approach is borrowed in part from the work of Barth, who compares the ruin probability and EPD approaches in discussing the risk-based capital (RBC) formulas introduced by the US National Association of Insurance Commissioners in the 1990s. The author reconsiders several of these findings and discusses them in the light of the new regulatory frameworks. More precisely, the author first performs sensitivity analyses of the risk measures using different parameter configurations.
Such analyses are relevant because, in practice, parameter values may differ from the estimates used in the model and have a significant impact on the values of the risk measures. Second, the author goes beyond a simple discussion of the outcomes for each risk measure, deriving the firm conclusion that both the frequency and the magnitude of shortfalls need to be controlled. Article literatinetwork@emeraldinsight.com (Joël Wagner) Mon, 17 Mar 2014 00:00:00 +0000

Jointly estimating jump betas http://www.emeraldinsight.com/journals.htm?issn=1526-5943&volume=15&issue=2&articleid=17107140&show=abstract http://www.emeraldinsight.com/10.1108/JRF-07-2013-0052 <strong>Abstract</strong><br /><br /><B>Purpose</B> – This paper aims to enhance the co-skew-based risk measurement methodology introduced in Polimenis, extending it to the joint estimation of the jump betas for two stocks. <B>Design/methodology/approach</B> – The authors introduce the possibility of idiosyncratic jumps and analyze the robustness of the estimated sensitivities when two stocks are jointly fit to the same set of latent jump factors. When an individual stock's skew differs substantially from that of the market, the requirement that the individual skew be exactly matched places a strain on the single-stock estimation system. <B>Findings</B> – The authors argue that once this restrictive requirement is relaxed in an enhanced joint framework, the system calibrates to a more robust solution in terms of uncovering the true magnitude of the latent parameters of the model, while also revealing information about the level of idiosyncratic skew in individual stock return distributions. <B>Research limitations/implications</B> – Allowing for idiosyncratic skews relaxes the demands placed on the estimation system and hence improves its explanatory power by focusing on matching the more informative systematic skew.
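A short sketch may help make the normal power approximation used in the solvency article above concrete. This is an illustrative reconstruction, not the paper's code: the function name `np2_quantile` and all parameter values are hypothetical, and the NP2 correction shown uses only the first three central moments, as the abstract describes.

```python
# Normal power (NP2) approximation of a quantile (e.g. value at risk) from the
# first three central moments of a loss distribution. Illustrative sketch only;
# function and parameter names are hypothetical, not taken from the paper.
from scipy.stats import norm

def np2_quantile(mu, sigma, skew, alpha):
    """Approximate the alpha-quantile by shifting the normal quantile z_alpha
    with the skewness correction (skew / 6) * (z_alpha**2 - 1)."""
    z = norm.ppf(alpha)
    return mu + sigma * (z + skew / 6.0 * (z ** 2 - 1.0))

# Required capital at the 99.5 percent level for a hypothetical loss distribution:
var_normal = np2_quantile(mu=100.0, sigma=20.0, skew=0.0, alpha=0.995)
var_skewed = np2_quantile(mu=100.0, sigma=20.0, skew=0.5, alpha=0.995)
# Positive skewness raises the approximated quantile relative to the normal case.
```

With zero skewness the formula collapses to the usual normal quantile, which is one way to sanity-check an implementation.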
Furthermore, allowing for stock-specific jumps that are unrelated to the market is a realistic assumption: there is now evidence that idiosyncratic risks are priced as well, which has been a major drawback of, and criticism against, using the CAPM to assess risk premia. <B>Practical implications</B> – Since jumps in stock prices incorporate the most valuable information, quantifying a stock's exposure to jump events can have important practical implications for financial risk management, portfolio construction and option pricing. <B>Originality/value</B> – This approach boosts the “signal-to-noise” ratio by utilizing co-skew moments, so that the diffusive component is filtered out through higher-order cumulants. Without making any distributional assumptions, the authors are able not only to capture the asymmetric sensitivity of a stock to latent upward and downward systematic jump risks, but also to uncover the magnitude of idiosyncratic stock skewness. Since the cumulants of a Lévy process evolve linearly in time, the approach is horizon independent and can be deployed at all frequencies. Article literatinetwork@emeraldinsight.com (Vassilis Polimenis, Ioannis Papantonis) Mon, 17 Mar 2014 00:00:00 +0000

Impacts of the US macroeconomic news on Asian stock markets http://www.emeraldinsight.com/journals.htm?issn=1526-5943&volume=15&issue=2&articleid=17107141&show=abstract http://www.emeraldinsight.com/10.1108/JRF-09-2013-0064 <strong>Abstract</strong><br /><br /><B>Purpose</B> – This paper aims to investigate the spillover effects of 14 key US macroeconomic news announcements on the first two moments of 12 Asian stock market returns. <B>Design/methodology/approach</B> – The authors collect market expectations and actual scheduled announcement data for 14 key US macroeconomic announcements from January 2002 to April 2012 from Bloomberg.
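The co-skew logic behind the jump-betas article above can be illustrated with a stylized sketch. This is not the authors' estimator: it assumes a simple one-factor jump model r_i = beta_i * J + eps_i in which the Gaussian diffusive part contributes no third moment, so cross third moments isolate the jump sensitivity; all names and the simulated data are hypothetical.

```python
# Stylized illustration of estimating a jump beta from co-skew moments: in
# r_i = beta_i * J + eps_i with Gaussian eps_i, the diffusive noise has zero
# third moment, so E[(r_i - mu_i)(J - mu_J)^2] = beta_i * E[(J - mu_J)^3].
import numpy as np

def coskew(x, y):
    """Third cross central moment E[(x - mu_x) * (y - mu_y)**2]."""
    return np.mean((x - x.mean()) * (y - y.mean()) ** 2)

def jump_beta(r_i, r_m):
    """Moment-based jump beta: co-skew with the factor over the factor's own
    third central moment (only meaningful when the factor is skewed)."""
    return coskew(r_i, r_m) / coskew(r_m, r_m)

rng = np.random.default_rng(0)
jumps = rng.exponential(1.0, 100_000) - 1.0   # skewed systematic jump factor
noise = rng.normal(0.0, 0.5, 100_000)         # symmetric diffusive noise
stock = 1.5 * jumps + noise                   # true jump beta = 1.5
beta_hat = jump_beta(stock, jumps)
```

Because the symmetric noise drops out of the third moments, the ratio recovers the jump sensitivity even though the noise variance is sizeable; a diffusion-based covariance beta would not have this filtering property.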
The dataset consists of six groups: monetary policy and general macroeconomic indicators: the Federal Reserve's target interest rate (FOMC), gross domestic product (GDP) and the leading indicator (LI); price indicators: the consumer price index (CPI) and producer price index (PPI); business indicators: housing starts (HS) and industrial production (IP); consumption indicators: retail sales (RS) and the consumer confidence level (CONSUM); labor market indicators: non-farm payrolls (NFP), the unemployment level (UE) and jobless claims (JOB); and external sector indicators: the current account (CA) and trade balance (TB). The authors also collect daily opening and closing data for 12 Asian stock markets. Following the Dow Jones classification, these are divided into five developed markets (Japan, Hong Kong, Republic of Korea, Singapore and Taiwan) and seven emerging markets (China, India, Indonesia, Malaysia, Pakistan, Sri Lanka and Thailand). The MA-EGARCH (1,1) model is used for the empirical tests. <B>Findings</B> – First, stronger-than-expected news from the USA is, in general, associated with a higher conditional mean and lower conditional variance of Asian stock market returns. Second, the Asian stock markets tend to put more weight on US labor market news than on the other announcements, as this indicator reveals much about the underlying health of the US economy: full employment is a central mandate for the US administration and policy makers. Third, the Asian emerging markets respond more strongly to the US news than the Asian developed markets, both in the number of responses and in the magnitude of the reaction, suggesting that emerging markets are more dependent on the information content of the US news than developed markets.
Fourth, the US news is absorbed gradually, leading to persistent volatility responses in the Asian stock markets. <B>Originality/value</B> – The authors fill a gap in the extant literature by investigating the speed of news absorption across the Asia region, examining the spillover effects over three time horizons: daily, overnight and intraday. Article literatinetwork@emeraldinsight.com (Tho Nguyen, Chau Ngo) Mon, 17 Mar 2014 00:00:00 +0000

Analysis of the impact of improved market trading efficiency on the speculation-hedging relation http://www.emeraldinsight.com/journals.htm?issn=1526-5943&volume=15&issue=2&articleid=17107142&show=abstract http://www.emeraldinsight.com/10.1108/JRF-11-2013-0077 <strong>Abstract</strong><br /><br /><B>Purpose</B> – In this study, the author examines the behavior of QQQ options at the time of the QQQ move from AMEX to NASDAQ on December 1, 2004. The author addresses three questions: is there a relation between hedging and speculation; if such a relation exists, did it strengthen at the time of the QQQ move, given the improvement in market trading efficiency; and, if so, does hedging activity follow speculative activity? <B>Design/methodology/approach</B> – The author uses the fact that deep-out-of-the-money puts are used for hedging, whereas deep-out-of-the-money calls are used for speculation, and applies spectral analysis to QQQ options in the attempt to answer the research questions. Spectral analysis is used because the data in the study are non-normally distributed, which would make parametric testing meaningless. <B>Findings</B> – The author finds that the relation between speculative demand and hedging demand for options indeed exists and strengthens after the consolidation of trading on NASDAQ, and that hedging follows speculation.
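The MA-EGARCH (1,1) specification named in the US-news article above can be sketched at the level of its variance recursion. The filter below is a generic EGARCH(1,1): the MA mean equation and the news-surprise regressors of the paper are omitted, and the parameter values and return series are placeholders, not estimates from the paper's data.

```python
# EGARCH(1,1) conditional-variance filter:
#   ln s2[t] = omega + alpha * (|z| - E|z|) + gamma * z + beta * ln s2[t-1],
# where z = r[t-1] / s[t-1] and E|z| = sqrt(2/pi) for standard normal z.
# Illustrative sketch only; parameters below are placeholders, not estimates.
import numpy as np

def egarch_filter(returns, omega, alpha, gamma, beta):
    """Run the EGARCH(1,1) log-variance recursion over a return series."""
    e_abs_z = np.sqrt(2.0 / np.pi)
    log_s2 = np.empty(len(returns))
    log_s2[0] = np.log(np.var(returns))       # initialize at the sample variance
    for t in range(1, len(returns)):
        z = returns[t - 1] / np.exp(0.5 * log_s2[t - 1])
        log_s2[t] = omega + alpha * (abs(z) - e_abs_z) + gamma * z + beta * log_s2[t - 1]
    return np.exp(log_s2)                     # conditional variances

rng = np.random.default_rng(1)
r = rng.normal(0.0, 0.01, 500)                # placeholder daily returns
s2 = egarch_filter(r, omega=-0.46, alpha=0.1, gamma=-0.05, beta=0.95)
```

Because the recursion is written in log variance, positivity of the conditional variance is automatic; a negative gamma produces the usual leverage effect, in which negative shocks raise volatility more than positive ones.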
The existence of this relation is economically meaningful, as it is established empirically for the first time, in support of the theoretical models predicting it. <B>Originality/value</B> – Market participants on both the speculation side of the investment spectrum, such as hedge funds, and the hedging side, such as mutual funds and money managers, would be interested in this topic and the findings of this paper. The main contribution of this study is in examining the relation between the differential demands for options using the non-parametric tools of spectral analysis. This helps extend the understanding of exchange-traded fund (ETF) option behavior and contributes to this strand of the ETF literature. Article literatinetwork@emeraldinsight.com (Stoyu I. Ivanov) Mon, 17 Mar 2014 00:00:00 +0000

Forecasting bank credit ratings http://www.emeraldinsight.com/journals.htm?issn=1526-5943&volume=15&issue=2&articleid=17107143&show=abstract http://www.emeraldinsight.com/10.1108/JRF-11-2013-0076 <strong>Abstract</strong><br /><br /><B>Purpose</B> – This study aims to present an empirical model designed to forecast bank credit ratings using only quantitative, publicly available information from banks' financial statements. To this end, the authors use the long-term ratings provided by Fitch in 2012. The sample consists of 92 US banks and publicly available information, in annual frequency, from their financial statements from 2008 to 2011. <B>Design/methodology/approach</B> – First, to select the most informative regressors from a long list of financial variables and ratios, the authors use stepwise least squares and select several alternative sets of variables. These sets of variables are then used in an ordered probit regression setting to forecast the long-term credit ratings.
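The non-parametric spectral toolkit used in the QQQ options article above can be illustrated with SciPy's Welch-based coherence estimator. This is not the author's code: the two series below are simulated stand-ins for a speculation proxy (deep-out-of-the-money call activity) and a hedging proxy (deep-out-of-the-money put activity), with the second built to lag the first.

```python
# Squared coherence between two simulated activity series, stand-ins for a
# speculation proxy (deep-OTM call activity) and a hedging proxy (deep-OTM
# put activity). Illustrative only; the data here are not from the paper.
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(2)
n = 1024
call_activity = rng.normal(size=n)            # speculation proxy
put_activity = (np.roll(call_activity, 3)     # hedging proxy, lagging by 3 periods
                + rng.normal(scale=0.5, size=n))

# Coherence near 1 at frequencies where the series co-move; the phase of the
# cross-spectrum (not shown) would reveal which series leads.
freqs, coh = coherence(call_activity, put_activity, fs=1.0, nperseg=256)
```

Because coherence is estimated frequency by frequency from averaged periodograms, it makes no normality assumption about the underlying series, which is the motivation the abstract gives for a spectral rather than parametric approach.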
<B>Findings</B> – Under this scheme, the forecasting accuracy of the best model reaches 83.70 percent when nine explanatory variables are used. <B>Originality/value</B> – The results indicate that bank credit ratings largely rely on historical data, causing them to respond sluggishly, only after any financial problems are already known to the public. Article literatinetwork@emeraldinsight.com (Periklis Gogas, Theophilos Papadimitriou, Anna Agrapetidou) Mon, 17 Mar 2014 00:00:00 +0000

Invitation to submit your research to the Journal of Risk Finance http://www.emeraldinsight.com/journals.htm?issn=1526-5943&volume=15&issue=2&articleid=17107144&show=abstract Call for papers Mon, 17 Mar 2014 00:00:00 +0000
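The ordered probit setting used in the bank-ratings article above can be sketched with a hand-rolled maximum-likelihood fit. This is a simulated illustration, not the paper's model: the regressors stand in for financial ratios, the three outcome categories stand in for rating buckets, and all names and values are hypothetical.

```python
# Ordered probit by maximum likelihood:
#   P(y = j) = Phi(c_j - x'b) - Phi(c_{j-1} - x'b),
# with increasing cutpoints parameterized through positive increments.
# Simulated illustration; the data are not the paper's bank ratings.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def ordered_probit_nll(params, X, y, n_cats):
    """Negative log-likelihood of the ordered probit model."""
    k = X.shape[1]
    beta = params[:k]
    raw = params[k:k + n_cats - 1]
    # First cutpoint free, later ones built from positive increments.
    cuts = np.concatenate([[raw[0]], raw[0] + np.cumsum(np.exp(raw[1:]))])
    xb = X @ beta
    lower = np.concatenate([[-np.inf], cuts])
    upper = np.concatenate([cuts, [np.inf]])
    p = norm.cdf(upper[y] - xb) - norm.cdf(lower[y] - xb)
    return -np.sum(np.log(np.clip(p, 1e-12, None)))

rng = np.random.default_rng(3)
X = rng.normal(size=(400, 2))                  # stand-ins for financial ratios
latent = X @ np.array([1.0, -0.5]) + rng.normal(size=400)
y = np.digitize(latent, [-1.0, 0.5])           # three rating buckets: 0, 1, 2
start = np.zeros(2 + 2)                        # 2 betas + 2 cutpoint parameters
res = minimize(ordered_probit_nll, start, args=(X, y, 3), method="BFGS")
beta_hat = res.x[:2]                           # should be near the true (1.0, -0.5)
```

The exponentiated increments keep the cutpoints ordered without explicit constraints, which is the standard trick for fitting ordered-response models with an unconstrained optimizer.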