Pin Ng - Academia.edu

Papers by Pin Ng

Risk Contagion between Global Commodities from the Perspective of Volatility Spillover

Energies

Prices of oil and other commodities have fluctuated wildly since the outbreak of the COVID-19 pandemic. It is crucial to explore the causes of price fluctuations and understand the source and path of risk contagion to better mitigate systemic risk and maintain economic stability. The paper adopts the method of network topology to examine the path of risk contagion between China’s and foreign commodities, focusing on the dynamic evolution and transmission mechanism of risk contagion during the pandemic. This research found that among China’s commodities, energy, grain, and textiles are net recipients of risk contagion, while chemical products and metals are net risk exporters. Among international commodities, industries have positive risk spillover effects on metals and textiles. During the first phase of the pandemic, China’s commodities were the main exporters of risk contagion. However, international industries and metals became the main risk exporters and exerted risk spillover o...

The Adverse Impact of Temperature on Income

Income and Temperatures Working Paper Series — 1006 | April 2010

Robust tests for heteroskedasticity and autocorrelation in the multiple regression model: Working paper series--02-05

The standard Rao's (1948) score or Lagrange multiplier test for heteroskedasticity was originally developed assuming normality of the disturbance term [see Godfrey (1978b), and Breusch and Pagan (1979)]. Therefore, the resulting test depends heavily on the normality assumption. Koenker (1981) suggests a studentized form which is robust to nonnormality. This approach seems limited, however, because no general procedure is available for transforming a test into a robust one. Following Bickel (1978), we use a different approach to account for nonnormality. Our tests are based on the score function, defined as the negative derivative of the log-density function with respect to the underlying random variable. To implement the test we use a nonparametric estimate of the score function. Our robust test for heteroskedasticity is obtained by running a regression of the product of the score function and ordinary least squares residuals on some exogenous variables which ...
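
A rough sketch of the kind of auxiliary regression the abstract describes, with several stand-ins: the score is estimated from a simple kernel density rather than the paper's estimator, and the n·R² statistic with a chi-square reference is only an assumed LM-type analogue, not the paper's test.

```python
"""Score-based heteroskedasticity check, sketched from the abstract: regress
(nonparametric score of residual) x (OLS residual) on exogenous variables.
The KDE-based score estimate and the n*R^2 statistic are stand-ins."""
import numpy as np
import statsmodels.api as sm
from scipy.stats import gaussian_kde, chi2

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(0, 1, n)
y = 1.0 + 2.0 * x + (0.5 + 2.0 * x) * rng.standard_normal(n)  # error variance grows with x

X = sm.add_constant(x)
e = sm.OLS(y, X).fit().resid

# Nonparametric score: psi(e) = -d/de log f(e), from a kernel density estimate.
grid = np.linspace(e.min(), e.max(), 400)
dens = gaussian_kde(e)(grid)
score_grid = -np.gradient(np.log(dens), grid)
psi = np.interp(e, grid, score_grid)

# Auxiliary regression of psi(e_i) * e_i on the exogenous variable(s).
aux = sm.OLS(psi * e, sm.add_constant(x)).fit()
stat = n * aux.rsquared      # assumed LM-type analogue, not the paper's exact statistic
pval = chi2.sf(stat, df=1)   # df = number of exogenous variables in the auxiliary regression
print(f"n*R^2 = {stat:.2f}, p-value = {pval:.4f}")
```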

The impact of climate change on tourism economies: Working paper series--12-01

Tourism economies depend on tourism for promoting economic growth, and tourism is highly sensitive to climate. Therefore, with mounting evidence of climate change, an important question is: what is the impact of climate change on tourism economies? To answer this question, we use a model in Ng and Zhao (2010) to estimate the economic impacts of climate change on different types of economies. Our main finding is that climate change's impact on tourism economies is not smaller than its impact on other types of economies if temperature increases by more than 1 degree Celsius. Therefore, our findings suggest that tourism economies should also implement aggressive climate mitigation policies.

What do regressions estimate? Working paper series--08-04

Along with the readily available computer software that performs least squares regressions comes the proliferation of misuse and misinterpretation of regression results. We illustrate in the paper that the solutions for the intercept and slope coefficients in a simple linear regression model depend on the loss function used in the optimization problem. Likewise, there is no universal population linear regression line. If a squared error loss function is used, the population regression line turns out to be the conditional mean function, while an absolute loss function yields the conditional median. When the asymmetric loss function defined in Koenker and Bassett (1978) is used, the population regression line becomes the conditional quantile function. Therefore, what the sample regression is estimating and how the estimated intercept and slope coefficients should be interpreted depend upon what loss function is being used in the optimization probl...
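
Since this abstract turns on how the loss function determines the population regression line, a compact worked summary may help; the symbols m(x), med(x), and q_tau(x) are notation chosen here, not the paper's:

```latex
% Population regression functions implied by three loss functions.
\begin{align*}
  \text{Squared error:}\quad & m(x) = \arg\min_{c}\; \mathbb{E}\big[(Y-c)^2 \mid X=x\big]
      = \mathbb{E}[Y \mid X=x] \\
  \text{Absolute error:}\quad & \mathrm{med}(x) = \arg\min_{c}\; \mathbb{E}\big[\,|Y-c|\, \mid X=x\big]
      = \mathrm{median}(Y \mid X=x) \\
  \text{Check loss } \rho_\tau(u) = u\big(\tau - \mathbf{1}\{u<0\}\big):\quad
      & q_\tau(x) = \arg\min_{c}\; \mathbb{E}\big[\rho_\tau(Y-c) \mid X=x\big]
      = Q_{Y\mid X=x}(\tau)
\end{align*}
```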

Community tourism resilience: some applications of the scale, change and resilience (SCR) model

Tourism and resilience, 2017

This chapter discusses applications of the scale, change and resilience (SCR) model in tourism. The SCR model extends resilience theory by making it more practical and applied. It is solidly grounded in resilience thinking, incorporating essential concepts such as spatial and temporal scale, and slow- and fast-change drivers and variables. The SCR model is a framework for understanding how different systems (actors), among the complexity of subsystems that comprise a community, respond to different types of change, both individually and as a collective. The chapter presents a study comparing rural tourism communities in Taiwan that have experienced major natural disasters with those that have not experienced such disasters, as a first effort at understanding resilience from the SCR framework.

Temporal and Spatial Evolution of Carbon Emissions and Their Influencing Factors for Tourist Attractions at Heritage Tourist Destinations

Sustainability, 2019

Carbon emissions play an important role in sustainable tourism development at heritage sites. The study takes the Wulingyuan Scenic and Historic Interest Area (WSHIA) as an example, and primary and secondary data sources are used to measure and estimate the carbon emissions of tourist attractions from 1979 to 2014. The temporal and spatial evolution of carbon emissions and their influencing factors for tourist attractions at heritage tourist destinations are analyzed. The results show that there are great differences in carbon emissions per visitor across the different types of tourism attractions at the heritage tourist destination, and there are significant monthly and interannual differences in the carbon emissions of the tourism attractions in the WSHIA. The main influencing factors include tourism seasonality, the rapid growth of China’s tourism market, and the rising popularity of heritage tourism. The spatial evolution of carbon emissions of the tourist attractions can be div...

Stochastic dominance via quantile regression with applications to investigate arbitrage opportunity and market efficiency

European Journal of Operational Research, 2017

Mincer–Zarnowitz quantile and expectile regressions for forecast evaluations under asymmetric loss functions

Journal of Forecasting, 2017

Forecasts are pervasive in all areas of applications in business and daily life. Hence, evaluating the accuracy of a forecast is important for both the generators and consumers of forecasts. There are two aspects of forecast evaluation: (a) measuring the accuracy of past forecasts using some summary statistics, and (b) testing the optimality properties of the forecasts through some diagnostic tests. On measuring the accuracy of a past forecast, this paper illustrates that the summary statistics used should match the loss function that was used to generate the forecast. If there is strong evidence that an asymmetric loss function has been used in the generation of a forecast, then a summary statistic that corresponds to that asymmetric loss function should be used in assessing the accuracy of the forecast instead of the popular root mean square error or mean absolute error. On testing the optimality of the forecasts, it is demonstrated how the quantile regressions set in the predictio...
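
To make the point about matching the summary statistic to the loss concrete, here is a small simulated sketch; the asymmetry parameter tau = 0.7 and the data-generating process are arbitrary choices for illustration, not values from the paper. A forecast that is optimal under a lin-lin (check) loss looks worse than the conditional-mean forecast by RMSE and MAE but better by the matching mean check loss.

```python
"""Toy illustration: evaluate forecasts with RMSE, MAE, and the mean check
(lin-lin) loss, so the statistic can be matched to the loss that generated them."""
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n, sigma, tau = 2000, 1.0, 0.7          # tau is an arbitrary illustrative asymmetry
signal = rng.normal(size=n)
actual = signal + sigma * rng.standard_normal(n)

mean_fc = signal                               # optimal under squared error loss
quant_fc = signal + sigma * norm.ppf(tau)      # optimal under the tau check loss

def check_loss(e, tau):
    """Mean lin-lin loss: tau*e for under-prediction (e > 0), (tau-1)*e otherwise."""
    e = np.asarray(e, float)
    return np.mean(np.where(e >= 0, tau * e, (tau - 1.0) * e))

for name, fc in [("mean forecast", mean_fc), ("quantile forecast", quant_fc)]:
    e = actual - fc
    print(f"{name:18s}  RMSE={np.sqrt(np.mean(e**2)):.3f}  "
          f"MAE={np.mean(np.abs(e)):.3f}  check(tau={tau})={check_loss(e, tau):.3f}")
```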

The Elasticity of Demand for Gasoline: A Semi-parametric Analysis

Advanced Studies in Theoretical and Applied Econometrics, 2014

Inequality constrained quantile regression

Sankhya Ser A

An algorithm for computing parametric linear quantile regression estimates subject to linear inequality constraints is described. The algorithm is a variant of the interior point algorithm described in Koenker and Portnoy (1997) for unconstrained quantile regression and is consequently quite efficient even for large problems, particularly when the inherent sparsity of the resulting linear algebra is exploited. Applications to qualitatively constrained nonparametric regression are described in the penultimate sections. Implementations of the algorithm are available in MATLAB and R.
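
For readers who want to experiment, a minimal sketch of the underlying optimization is below. It poses inequality-constrained quantile regression as a generic linear program solved with SciPy, which is not the sparse interior point algorithm the paper describes, but it yields the same estimates on small problems; the function name constrained_rq and the example constraint are hypothetical.

```python
"""Quantile regression with linear inequality constraints on the coefficients,
written as a linear program: min tau*sum(u+) + (1-tau)*sum(u-)
subject to X b + u+ - u- = y, R b <= r, u+, u- >= 0."""
import numpy as np
from scipy.optimize import linprog

def constrained_rq(X, y, tau, R=None, r=None):
    """Solve min_b sum_i rho_tau(y_i - x_i'b) subject to R b <= r."""
    n, p = X.shape
    # Decision vector: [beta (p), u_plus (n), u_minus (n)]; residual = u_plus - u_minus.
    c = np.concatenate([np.zeros(p), tau * np.ones(n), (1.0 - tau) * np.ones(n)])
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])
    A_ub = b_ub = None
    if R is not None:
        A_ub = np.hstack([R, np.zeros((R.shape[0], 2 * n))])
        b_ub = r
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
                  bounds=bounds, method="highs")
    return res.x[:p]

# Example: median regression with the slope constrained to be non-negative.
rng = np.random.default_rng(2)
x = rng.uniform(0, 1, 200)
y = 1.0 - 0.1 * x + rng.standard_normal(200)
X = np.column_stack([np.ones_like(x), x])
R = np.array([[0.0, -1.0]])   # -slope <= 0, i.e. slope >= 0
r = np.array([0.0])
print(constrained_rq(X, y, tau=0.5, R=R, r=r))
```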

Mincer-Zarnowitz Quantile and Expectile Regressions for Forecast Evaluations under Asymmetric Loss Functions

Forecasts are pervasive in all areas of applications in business and daily life and, hence, evaluating the accuracy of a forecast is important for both the generators and consumers of forecasts. There are two aspects of forecast evaluation: (1) measuring the accuracy of past forecasts using some summary statistics and (2) testing the optimality properties of the forecasts through some diagnostic tests. On measuring the accuracy of a past forecast, we illustrate that the summary statistics used should match the loss function that was used to generate the forecasts. If there is strong evidence that an asymmetric loss function has been used in the generation of a forecast, then a summary statistic that corresponds to that asymmetric loss function should be used in assessing the accuracy of the forecast instead of the popular RMSE or MAE. On testing the optimality of the forecasts, we demonstrate how the quantile regressions and expectile regressions set in the prediction-realization framework of Mincer and Zarnowitz (1969) can be used to recover the unknown parameter that controls the potentially asymmetric loss function used in generating the past forecasts. Finally, we apply the prediction-realization framework to the Federal Reserve's economic growth forecast and forecast sharing in a PC manufacturing supply chain. We find that the Federal Reserve treats over-prediction as approximately 1.5 times more costly than under-prediction. We also find that the PC manufacturer weighs positive forecast errors (under-forecasts) as about four times as costly as negative forecast errors (over-forecasts).
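
A rough sketch of the recovery idea described above, under simplifying assumptions: simulated forecasts are optimal under a check loss with an unknown tau, and a grid of Mincer-Zarnowitz quantile regressions of realizations on forecasts is scanned for the tau at which the intercept and slope are closest to 0 and 1. The grid search and the simple distance criterion are stand-ins for the paper's formal procedure.

```python
"""If past forecasts were optimal under a lin-lin (check) loss with unknown tau,
the Mincer-Zarnowitz quantile regression of realizations on forecasts should have
intercept 0 and slope 1 at that tau; scan a grid of tau to locate it."""
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(3)
n, sigma, true_tau = 3000, 1.0, 0.6
signal = rng.normal(size=n)
realized = signal + sigma * rng.standard_normal(n)
forecast = signal + sigma * norm.ppf(true_tau)   # optimal forecast under tau = 0.6 check loss

X = sm.add_constant(forecast)
grid = np.arange(0.3, 0.81, 0.05)
dist = []
for tau in grid:
    params = np.asarray(sm.QuantReg(realized, X).fit(q=tau).params)
    dist.append(params[0] ** 2 + (params[1] - 1.0) ** 2)   # distance from (0, 1)
best = grid[int(np.argmin(dist))]
print(f"tau closest to MZ optimality: {best:.2f}")
```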

Refining Our Understanding of Beta through Quantile Regressions

Journal of Risk and Financial Management, 2014

The Capital Asset Pricing Model (CAPM) has been a key theory in financial economics since the 1960s. One of its main contributions is to attempt to identify how the risk of a particular stock is related to the risk of the overall stock market using the risk measure Beta. If the relationship between an individual stock's returns and the returns of the market exhibits heteroskedasticity, then the estimates of Beta for different quantiles of the relationship can be quite different. The behavioral ideas first proposed by Kahneman and Tversky (1979), which they called prospect theory, postulate that: (i) people exhibit "loss-aversion" in a gain frame; and (ii) people exhibit "risk-seeking" in a loss frame. If this is true, people could prefer lower Beta stocks after they have experienced a gain and higher Beta stocks after they have experienced a loss. Stocks that exhibit converging heteroskedasticity (22.2% of our sample) should be preferred by investors, and stocks that exhibit diverging heteroskedasticity (12.6% of our sample) should not be preferred. Investors may be able to benefit by choosing portfolios that are more closely aligned with their preferences.
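
As an illustration of Beta varying across quantiles under heteroskedasticity, the sketch below fits quantile regressions of simulated stock returns on simulated market returns whose volatility rises with the market return; the data-generating process and parameter values are invented for the example, and the fanning-out pattern mimics what the paper calls diverging heteroskedasticity.

```python
"""Estimate Beta at several quantiles of the return relationship with quantile
regression; simulated data only, not the paper's sample."""
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 2000
market = rng.uniform(-0.03, 0.03, n)                 # market excess returns
scale = 0.01 + 0.2 * (market + 0.03)                 # volatility increases with the market return
stock = 1.0 * market + scale * rng.standard_normal(n)

X = sm.add_constant(market)
ols_beta = np.asarray(sm.OLS(stock, X).fit().params)[1]
print(f"OLS Beta: {ols_beta:.3f}")
for tau in (0.1, 0.25, 0.5, 0.75, 0.9):
    beta_tau = np.asarray(sm.QuantReg(stock, X).fit(q=tau).params)[1]
    print(f"Beta at tau={tau}: {beta_tau:.3f}")       # Betas fan out across quantiles
```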

Quantile smoothing splines

Biometrika, 1994

Although nonparametric regression has traditionally focused on the estimation of conditional mean functions, nonparametric estimation of conditional quantile functions is often of substantial practical interest. We explore a class of quantile smoothing splines, defined as solutions to a penalized quantile regression problem. We characterize the solutions as splines, i.e. piecewise polynomials, and discuss computation by standard linear programming techniques. For sufficiently small values of the bandwidth parameter the solutions interpolate the specified quantiles of the response variable at the distinct design points, while for sufficiently large bandwidths the solutions specialize to the linear regression quantile fit (Koenker and Bassett (1978)) to the observations. Because the methods estimate conditional quantile functions, they possess an inherent robustness to extreme observations in the response variable. Remarkably, the entire path of solutions, in the quantile parameter or the bandwidth parameter, may be computed efficiently by parametric linear programming methods. Finally we note that the approach may be easily adapted to impose monotonicity, convexity, or other constraints on the fitted function. Two examples are provided to illustrate the use of the proposed methods.
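
One standard way to write the penalized problem behind quantile smoothing splines is sketched below; the total-variation form of the penalty shown here is the common choice in this literature that yields piecewise linear solutions, with lambda playing the role of the bandwidth parameter mentioned in the abstract.

```latex
% Penalized quantile regression problem defining a quantile smoothing spline.
% rho_tau is the Koenker-Bassett check function; lambda is the smoothing
% (bandwidth) parameter; V(g') denotes the total variation of the derivative of g.
\[
  \hat g_\tau = \arg\min_{g}\;
    \sum_{i=1}^{n} \rho_\tau\bigl(y_i - g(x_i)\bigr) + \lambda\, V(g'),
  \qquad
  \rho_\tau(u) = u\bigl(\tau - \mathbf{1}\{u < 0\}\bigr).
\]
```

As lambda shrinks to zero the fit interpolates the specified quantile at the design points, and as lambda grows it collapses to the linear regression quantile fit, matching the two limits described above.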

Computing Cox's Smoothing Spline Score Estimator

Time and Regime Dependence of Foreign Exchange Exposure

Using Quantile Regression to Evaluate Human Thermal Climates in China

A large sample normality test

The score function, defined as the negative logarithmic derivative of the probability density function, plays a ubiquitous role in statistics. Since the score function of the normal distribution is linear, testing normality amounts to checking the linearity of the empirical score function. Using the score function, we present a graphical alternative to the Q-Q plot for detecting departures from normality. Even though graphical approaches are informative, they lack the objectivity of formal testing procedures. We therefore supplement our graphical approach with a formal large-sample chi-square test. Our graphical approach is then applied to a wide range of alternative data-generating processes. The finite-sample size and power performance of the chi-square test are investigated through a small-scale Monte Carlo study.
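
A small sketch of the graphical idea, with a kernel-density-based score estimate standing in for whatever estimator the paper uses and no attempt at the formal chi-square test: under normality the estimated score should track the straight line (x - mean)/variance.

```python
"""Check normality by estimating the score psi(x) = -d/dx log f(x) and
comparing it with the linear score implied by a normal distribution."""
import numpy as np
from scipy.stats import gaussian_kde

def empirical_score(sample, num=200):
    """Return a grid and the estimated score -d/dx log f_hat(x) on it."""
    sample = np.asarray(sample, float)
    grid = np.linspace(np.quantile(sample, 0.01), np.quantile(sample, 0.99), num)
    log_dens = np.log(gaussian_kde(sample)(grid))
    return grid, -np.gradient(log_dens, grid)

rng = np.random.default_rng(5)
for name, data in [("normal", rng.standard_normal(5000)),
                   ("chi-square(3)", rng.chisquare(3, 5000))]:
    grid, psi = empirical_score(data)
    linear_score = (grid - data.mean()) / data.var()   # the score if the data were normal
    # Crude linearity summary: squared correlation with the normal (linear) score.
    corr = np.corrcoef(psi, linear_score)[0, 1]
    print(f"{name:14s} squared correlation with linear score: {corr**2:.3f}")
```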

The effect of learning styles on course performance: A quantile regression analysis
