Autcha Araveeporn - Academia.edu

Papers by Autcha Araveeporn

Tseries: An R Package for Stationarity Tests in Time Series Data

Research Highlights in Science and Technology Vol. 1, Apr 22, 2023

Comparing Penalized Regression Analysis of Logistic Regression Model with Multicollinearity

Proceedings of the 2019 2nd International Conference on Mathematics and Statistics - ICoMS'19, 2019

The goal of this research is to estimate the parameters of the logistic regression model by penalized regression analysis, consisting of the ridge regression, lasso, and elastic net methods. The logistic regression relates a binary dependent variable to 3 and 5 independent variables. The independent variables are generated from the normal, contaminated normal, and t distributions with correlation coefficients of 0.1, 0.5, and 0.99, the last representing a multicollinearity problem. The maximum likelihood estimator serves as the classical benchmark, obtained by differentiating the log-likelihood function with respect to the coefficients. Ridge regression chooses the unknown ridge parameter by cross-validation, so the ridge estimator is obtained by adding the ridge parameter to the penalty term. The lasso (least absolute shrinkage and selection operator) penalizes the sum of the absolute values of the coefficients. The elastic net mixes the ridge and lasso penalties. The methods are compared by the percentage of correctly predicted values. The results show that the lasso performs well when the independent variables are simulated from the normal and t distributions in most cases, and the lasso also outperforms the other methods under the contaminated normal distribution.
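
The following is an illustrative sketch (not the paper's exact simulation) of the comparison described above: ridge-, lasso-, and elastic-net-penalized logistic regressions fitted to highly correlated predictors and compared by the percentage of correct predictions. The sample size, correlation level, true coefficients, and train/test split are assumptions for demonstration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n, p, rho = 500, 5, 0.99                      # assumed: strong correlation (multicollinearity)
cov = rho * np.ones((p, p)) + (1 - rho) * np.eye(p)
X = rng.multivariate_normal(np.zeros(p), cov, size=n)
beta = np.array([1.0, -1.0, 0.5, 0.0, 0.0])   # assumed true coefficients
y = rng.binomial(1, 1 / (1 + np.exp(-(X @ beta))))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
models = {
    "ridge": LogisticRegression(penalty="l2", C=1.0, solver="lbfgs"),
    "lasso": LogisticRegression(penalty="l1", C=1.0, solver="saga", max_iter=5000),
    "elastic net": LogisticRegression(penalty="elasticnet", l1_ratio=0.5, C=1.0,
                                      solver="saga", max_iter=5000),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    acc = model.score(X_te, y_te) * 100       # percentage of correct predictions
    print(f"{name}: {acc:.1f}% correct")
```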

Estimating conditional heteroscedastic nonlinear autoregressive model by using smoothing spline and penalized spline methods

We propose smoothing spline (SS) and penalized spline (PS) methods, from the class of nonparametric regression methods, for estimating the unknown functions in a conditional heteroscedastic nonlinear autoregressive (CHNLAR) model. The CHNLAR model consists of trend and heteroscedastic functions of the past data at lag 1. The SS and PS methods were tested on estimating the unknown functions used to transform the data so that it fits the trend and heteroscedastic functions. In a simulation study, time series data were generated and hypothesis tests on the bias were used to check accuracy. The SS and PS methods exhibit good estimation power in most cases of the generated data. As real data, the gold price was modeled using the SS and PS methods in the CHNLAR model. The results show that the SS method performed better than the PS method.
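
A minimal sketch of the smoothing-spline step for the trend function m(.) in a model of the form y_t = m(y_{t-1}) + sigma(y_{t-1}) * e_t. Only the trend fit is shown; the data-generating process and the smoothing factor are assumptions for illustration, not the paper's settings.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(1)
n = 300
y = np.zeros(n)
for t in range(1, n):                          # assumed nonlinear AR(1)-type series
    y[t] = np.sin(y[t - 1]) + 0.3 * np.sqrt(1 + 0.5 * y[t - 1] ** 2) * rng.normal()

x_lag, y_now = y[:-1], y[1:]
order = np.argsort(x_lag)                      # UnivariateSpline needs increasing x
trend = UnivariateSpline(x_lag[order], y_now[order], s=len(x_lag))  # s = smoothing factor
resid = y_now - trend(x_lag)                   # residuals would feed the heteroscedastic fit
print("trend MSE:", np.mean(resid ** 2))
```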

The Penalized Regression and Penalized Logistic Regression of Lasso and Elastic Net Methods for High-Dimensional Data: A Modelling Approach

Innovations in Science and Technology Vol. 3, 2022

The Maximum Likelihood Method Of Random Coefficient Dynamic Regression Model

The Random Coefficient Dynamic Regression (RCDR) model is developed from the Random Coefficient Autoregressive (RCA) model and the Autoregressive (AR) model. The RCDR model is obtained by adding exogenous variables to the RCA model. In this paper, the Maximum Likelihood (ML) method is used to estimate the parameters of the RCDR(1,1) model. Simulation results use the AIC and BIC criteria to compare the performance of the RCDR(1,1) model. The stationary and weakly stationary variables are well estimated when the exogenous variables are weakly stationary. However, model selection indicated nonstationary variables when the exogenous variables are stationary.

An Estimating Parameter Of The Mean In Normal Distribution By Maximum Likelihood, Bayes, And Markov Chain Monte Carlo Methods

This paper compares the parameter estimation of the mean of a normal distribution by the Maximum Likelihood (ML), Bayes, and Markov Chain Monte Carlo (MCMC) methods. The ML estimator is the average of the data, the Bayes estimator is derived from the prior distribution, and the MCMC estimator is approximated by Gibbs sampling from the posterior distribution. After the parameter is estimated by each method, hypothesis testing is used to check the robustness of the estimators. Data are simulated from a normal distribution with a true mean of 2 and variances of 4, 9, and 16, with sample sizes of 10, 20, 30, and 50. From the results, the ML and MCMC estimates differ perceivably from the true parameter when the sample size is 10 or 20 with variance 16. Furthermore, the Bayes es...
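
A small sketch comparing the ML estimate of a normal mean with a Gibbs-sampling estimate, using the mean 2, variance 16, and sample size 20 mentioned above. The priors (a vague normal for the mean, an inverse-gamma for the variance) and the chain length are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)
data = rng.normal(loc=2.0, scale=4.0, size=20)     # true mean 2, variance 16, n = 20
n = data.size

mu_ml = data.mean()                                # ML estimator: sample average

# Gibbs sampler for (mu, sigma2) with priors mu ~ N(m0, v0), sigma2 ~ Inv-Gamma(a0, b0)
m0, v0, a0, b0 = 0.0, 1e6, 0.01, 0.01              # assumed vague priors
mu, sigma2 = data.mean(), data.var()
draws = []
for it in range(6000):
    # mu | sigma2, data  ~  Normal
    v_post = 1.0 / (1.0 / v0 + n / sigma2)
    m_post = v_post * (m0 / v0 + data.sum() / sigma2)
    mu = rng.normal(m_post, np.sqrt(v_post))
    # sigma2 | mu, data  ~  Inverse-Gamma
    a_post = a0 + n / 2.0
    b_post = b0 + 0.5 * np.sum((data - mu) ** 2)
    sigma2 = 1.0 / rng.gamma(a_post, 1.0 / b_post)
    if it >= 1000:                                  # discard burn-in
        draws.append(mu)

print("ML estimate:", mu_ml, " Gibbs posterior mean:", np.mean(draws))
```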

Contributed paper Forecasting the Stock Exchange Rate of Thailand Index by Conditional Heteroscedastic Autoregressive Nonlinear Model with Autocorrelated Errors

The goal of this work is to develop a nonparametric regression model that accounts not only for a possibly non-linear trend (i.e., the conditional mean of the response variable) but also for a possibly non-linear conditional variance of the response (i.e., heteroscedasticity) as a function of predictor variables in the presence of auto-correlated errors. The trend and the heteroscedasticity are modeled using a class of penalized splines. The residuals are modeled as a long autoregressive process, which can approximate almost any autoregressive moving average (ARMA) process by selecting an appropriate number of lagged residuals. Both classical and Bayesian methodologies are developed to obtain smooth estimates of the conditional mean and variance functions. The resulting estimated residuals are then used to fit a possibly long AR process by suitably choosing the AR order using the Akaike Information Criterion (AIC). The forecasting performance of the proposed methods is then applied to t...
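
A brief sketch of the final step described above: choosing the AR order for the residuals by AIC. The "residuals" here are simulated, standing in for the detrended series that the penalized-spline stage would produce; the data-generating process and maximum lag are assumptions.

```python
import numpy as np
from statsmodels.tsa.ar_model import ar_select_order

rng = np.random.default_rng(3)
n = 400
resid = np.zeros(n)
for t in range(2, n):                              # assumed ARMA-like residual series
    resid[t] = 0.6 * resid[t - 1] - 0.2 * resid[t - 2] + rng.normal(scale=0.5)

sel = ar_select_order(resid, maxlag=15, ic="aic")  # AIC search over AR(1)..AR(15)
print("AIC-selected AR lags:", sel.ar_lags)
fit = sel.model.fit()                              # fit the selected (possibly long) AR
print(fit.params)
```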

The estimated parameter of logistic regression model by Markov Chain Monte Carlo method with multicollinearity

Statistical Journal of the IAOS, 2020

The Markov Chain Monte Carlo (MCMC) method has been a popular way of obtaining information about a probability distribution, estimating the posterior distribution by Gibbs sampling. Standard methods such as the maximum likelihood and logistic ridge regression methods are used for comparison with MCMC. The maximum likelihood method is the classical way to estimate the parameters of the logistic regression model by differentiating the log-likelihood function with respect to the estimator. Logistic ridge regression depends on the choice of the ridge parameter, selected by cross-validation, to compute the estimator from the penalty function. This paper uses maximum likelihood, logistic ridge regression, and MCMC to estimate the parameters of the logit function and transform them into a probability. The logistic regression model predicts the probability of observing a phenomenon. Prediction accuracy is evaluated as the percentage of correct predictions of a binary event. A simulation study conducts a binary...
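
A minimal MCMC sketch for logistic-regression coefficients. The paper describes Gibbs sampling; for a self-contained illustration this uses a random-walk Metropolis sampler instead, with a flat prior, an assumed proposal scale, and simulated data.

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 300, 3
X = rng.normal(size=(n, p))
beta_true = np.array([0.8, -1.2, 0.5])              # assumed true coefficients
y = rng.binomial(1, 1 / (1 + np.exp(-(X @ beta_true))))

def log_post(beta):
    eta = X @ beta
    return np.sum(y * eta - np.log1p(np.exp(eta)))   # log-likelihood (flat prior)

beta = np.zeros(p)
samples = []
for it in range(20000):
    prop = beta + rng.normal(scale=0.1, size=p)      # assumed proposal scale
    if np.log(rng.uniform()) < log_post(prop) - log_post(beta):
        beta = prop
    if it >= 5000:                                   # discard burn-in
        samples.append(beta)

post_mean = np.mean(samples, axis=0)
prob = 1 / (1 + np.exp(-(X @ post_mean)))            # transform the logit into a probability
accuracy = np.mean((prob > 0.5) == y) * 100          # percentage of correct predictions
print("posterior mean:", post_mean, " accuracy:", f"{accuracy:.1f}%")
```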

Bayesian Analysis of Random Coefficient Dynamic AutoRegressive Model

Thailand Statistician, 2012

The goal of this work is to extend the Random Coefficient AutoRegressive (RCAR) model and the AutoRegressive (AR) model to the Random Coefficient Dynamic AutoRegressive (RCDAR) model. The RCDAR model is obtained by adding exogenous variables to the RCAR model, so there are two variables in the RCDAR model. This paper proposes a Bayesian analysis to estimate the parameters of the first-order RCDAR model. A noninformative prior, which works well for the AR model, is used in the Bayesian estimation procedure. Monte Carlo simulations were repeated for each situation to compare the coefficients of the RCDAR model for stationary, weakly stationary, and close-to-nonstationary data. The coefficient estimator is satisfactory for weakly stationary data and fits well for large sample sizes.

Developing Nonparametric Conditional Heteroscedastic Autoregressive Nonlinear Model by Using Maximum Likelihood Method

The goal of this work is to develop a nonparametric conditional heteroscedastic autoregressive nonlinear (NCHARN) model using the maximum likelihood method, which accounts not only for a possibly non-linear trend but also for a possibly non-linear conditional variance of the response as a function of predictor variables in the presence of auto-correlated errors. The trend and the heteroscedasticity are modeled using a class of penalized splines, and the residuals are modeled as an autoregressive (AR) process by selecting an appropriate number of lagged residuals. Both the classical penalized spline and the penalized spline with an AR process under the NCHARN model are developed to obtain smooth estimates of the conditional mean and variance functions. The resulting estimated values are then used in the maximum likelihood method to fit the trend, the volatility, and the coefficient of the AR process, with the AR order suitably chosen using the Akaike Information Criterion (AIC). The forecasting performance of the prop...

A Comparison of Parameter Estimation of Logistic Regression Model by Maximum Likelihood, Ridge Regression, Markov Chain Monte Carlo Methods

The goal of this research is to estimate the parameters of the logistic regression model. The coefficient parameters are evaluated by the maximum likelihood, ridge regression, and Markov chain Monte Carlo methods. The logistic regression considers the relationship between a binary dependent variable and 2, 3, and 4 independent variables generated from the normal, contaminated normal, and t distributions. The maximum likelihood estimator is obtained by differentiating the log-likelihood function with respect to the coefficients. Ridge regression chooses the unknown ridge parameter by cross-validation, so the ridge estimator is evaluated in the form of the maximum likelihood method with the ridge parameter added. The Markov chain Monte Carlo estimator is approximated by the Gibbs sampling algorithm from the posterior distribution based on a probability distribution and a prior probability distribution. The performance of these methods is compared by the percentage of correctly predicted values....
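
A short sketch of the ridge step described above: choosing the unknown ridge parameter by cross-validation for a logistic model. The data generation (t-distributed predictors) and the grid of candidate penalties are assumptions for illustration, not the paper's design.

```python
import numpy as np
from sklearn.linear_model import LogisticRegressionCV

rng = np.random.default_rng(5)
n, p = 400, 4
X = rng.standard_t(df=5, size=(n, p))                # assumed t-distributed predictors
beta = np.array([1.0, -0.5, 0.0, 0.8])               # assumed true coefficients
y = rng.binomial(1, 1 / (1 + np.exp(-(X @ beta))))

# Cs is a grid of inverse penalty strengths; 5-fold CV picks the ridge parameter.
ridge_cv = LogisticRegressionCV(Cs=np.logspace(-3, 3, 20), cv=5, penalty="l2",
                                scoring="accuracy", max_iter=5000)
ridge_cv.fit(X, y)
print("selected C (inverse ridge parameter):", ridge_cv.C_[0])
print("predicted accuracy: %.1f%%" % (ridge_cv.score(X, y) * 100))
```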

Forecasting the Stock Exchange Rate of Thailand Index by Conditional Heteroscedastic Autoregressive Nonlinear Model with Autocorrelated Errors

The goal of this work is to develop a nonparametric regression model that accounts not only for a possibly non-linear trend (i.e., the conditional mean of the response variable) but also for a possibly non-linear conditional variance of the response (i.e., heteroscedasticity) as a function of predictor variables in the presence of auto-correlated errors. The trend and the heteroscedasticity are modeled using a class of penalized splines. The residuals are modeled as a long autoregressive process, which can approximate almost any autoregressive moving average (ARMA) process by selecting an appropriate number of lagged residuals. Both classical and Bayesian methodologies are developed to obtain smooth estimates of the conditional mean and variance functions. The resulting estimated residuals are then used to fit a possibly long AR process by suitably choosing the AR order using the Akaike Information Criterion (AIC). The forecasting performance of the proposed methods is then applied to t...

Review Article: The Estimation of Smoothing Parameter using Smoothing Techniques on Nonparametric Regression

This article discusses the smoothing parameter that controls the interpolating spline in smoothing techniques consisting of the smoothing spline, kernel regression, and penalized spline regression methods. The smoothing parameter controls the fitted model and the trade-off in the bias of the estimator. We also propose the range of the smoothing parameter for these methods to fit the smoothing function when the data are nonlinear, and describe the characteristics of the smoothing function for various values of the smoothing parameter. According to the results, the smoothing parameter of the smoothing spline method works well between zero and one, kernel regression performs well between two and ten, and the penalized spline is useful between one and ten.
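
A small sketch of choosing a smoothing parameter by cross-validation on nonlinear data. Note that scipy's UnivariateSpline parameterises smoothing by a residual target `s`, which is not the same scale as the ranges quoted in the article; the test function and the grid below are assumed purely for illustration.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline
from sklearn.model_selection import KFold

rng = np.random.default_rng(6)
x = np.sort(rng.uniform(0, 1, 200))
y = np.sin(4 * np.pi * x) + rng.normal(scale=0.3, size=x.size)   # assumed nonlinear data

def cv_error(s, n_splits=5):
    errs = []
    for tr, te in KFold(n_splits, shuffle=True, random_state=0).split(x):
        tr = np.sort(tr)                                          # keep x increasing
        spline = UnivariateSpline(x[tr], y[tr], s=s)
        errs.append(np.mean((y[te] - spline(x[te])) ** 2))
    return np.mean(errs)

grid = [1, 5, 10, 20, 40, 80]                                     # assumed candidate values
best = min(grid, key=cv_error)
print("smoothing value chosen by 5-fold CV:", best)
```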

Comparison of Estimating Parameter in Parametric Regression, Nonparametric Regression, and Semiparametric Regression models in Case of Two Explanatory Variables

In this paper, we compare parameter estimation in the parametric, nonparametric, and semiparametric regression models relating a response variable to two explanatory variables. The parametric regression model uses the least squares method to estimate the parameters. The penalized spline method, a nonparametric regression technique, is proposed for estimating the functions of the nonparametric and semiparametric regression models. The minimum Mean Square Error (MSE) is the criterion for choosing the optimal model. Here, we simulate the response variable and two explanatory variables with correlated nonlinear data based on the uniform distribution. Real data are also applied to these models to illustrate the methodology. The estimated values of the nonparametric regression model perform well for both the simulated and real data. Keywords: nonparametric regression model, parametric regression model, penalized spline method, semiparam...
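
An illustrative sketch of the comparison criterion: least squares (parametric) versus a penalized spline basis (nonparametric) for two explanatory variables, judged by mean square error. The data-generating function, the spline basis, and the ridge penalty standing in for the spline penalty are assumptions, not the paper's exact specification.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import SplineTransformer
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(7)
n = 500
X = rng.uniform(0, 1, size=(n, 2))                     # two explanatory variables
y = np.sin(2 * np.pi * X[:, 0]) + (X[:, 1] - 0.5) ** 2 + rng.normal(scale=0.2, size=n)

parametric = LinearRegression().fit(X, y)              # least squares
nonparametric = make_pipeline(SplineTransformer(n_knots=10, degree=3),
                              Ridge(alpha=1.0)).fit(X, y)   # penalized spline basis

print("parametric MSE:   ", mean_squared_error(y, parametric.predict(X)))
print("nonparametric MSE:", mean_squared_error(y, nonparametric.predict(X)))
```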

The Higher-Order of Adaptive Lasso and Elastic Net Methods for Classification on High Dimensional Data

Mathematics

The lasso and elastic net methods are popular techniques for parameter estimation and variable selection. The adaptive lasso and adaptive elastic net methods use adaptive weights in the penalty function based on the lasso and elastic net estimates; the adaptive weight is related to the power order of the estimator. Normally, these methods focus on estimating parameters in linear regression models where both the dependent and independent variables are on a continuous scale. In this paper, we compare the lasso and elastic net methods with the higher-order adaptive lasso and adaptive elastic net methods for classification on high-dimensional data. Classification is used to classify categorical data for the dependent variable given the independent variables, which is the logistic regression model. The categorical dependent variable is binary, and the independent variables are continuous. The high dimensional...
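
A compact sketch of the adaptive lasso for logistic classification via the usual reweighting trick: obtain initial estimates, rescale each column by 1/|beta_init|^gamma, fit an L1-penalized logistic model, then rescale back. The power order gamma, the penalty strength, and the simulated data are assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(8)
n, p = 200, 50                                          # p large relative to n
X = rng.normal(size=(n, p))
beta_true = np.zeros(p); beta_true[:3] = [1.5, -2.0, 1.0]
y = rng.binomial(1, 1 / (1 + np.exp(-(X @ beta_true))))

# Step 1: initial (ridge) estimates give the adaptive weights.
init = LogisticRegression(penalty="l2", C=1.0, max_iter=5000).fit(X, y)
gamma = 1.0                                             # assumed power order
w = 1.0 / (np.abs(init.coef_.ravel()) ** gamma + 1e-8)

# Step 2: lasso on the rescaled design, then undo the scaling.
X_scaled = X / w
lasso = LogisticRegression(penalty="l1", C=0.5, solver="saga", max_iter=10000)
lasso.fit(X_scaled, y)
beta_adaptive = lasso.coef_.ravel() / w
print("selected variables:", np.flatnonzero(np.abs(beta_adaptive) > 1e-6))
```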

Comparing Parameter Estimation of Random Coefficient Autoregressive Model by Frequentist Method

Mathematics

This paper compares frequentist methods, consisting of the least-squares method and the maximum likelihood method, for estimating an unknown parameter in the Random Coefficient Autoregressive (RCA) model. The frequentist methods, namely least squares and maximum likelihood, draw conclusions from observed data by emphasizing the frequency or proportion of the data. The method of least squares is often used to estimate the parameter; the minimum of the sum of squared residuals is found by setting the gradient to zero. The maximum likelihood method uses the observed data to estimate the parameter of a probability distribution by maximizing a likelihood function under the statistical model, with the estimator obtained by differentiating the likelihood function with respect to the parameter. The efficiency of the two methods is assessed by the average mean square error for simulated data, and the mean square error for actual ...
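
A small sketch contrasting the two frequentist estimators discussed above for a random coefficient autoregressive model of order one, y_t = (phi + b_t) y_{t-1} + e_t with b_t ~ N(0, s_b^2) and e_t ~ N(0, s_e^2). The simulation settings are assumed; the conditional least-squares estimate has a closed form, while the ML estimate is obtained numerically.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(9)
n, phi, s_b, s_e = 500, 0.5, 0.3, 1.0                   # assumed true values
y = np.zeros(n)
for t in range(1, n):
    y[t] = (phi + rng.normal(scale=s_b)) * y[t - 1] + rng.normal(scale=s_e)

y_now, y_lag = y[1:], y[:-1]

# Conditional least squares: minimize sum (y_t - phi*y_{t-1})^2, solved in closed form.
phi_ls = np.sum(y_now * y_lag) / np.sum(y_lag ** 2)

# Maximum likelihood: the conditional variance is s_e^2 + s_b^2 * y_{t-1}^2.
def neg_loglik(theta):
    ph, log_se2, log_sb2 = theta
    var = np.exp(log_se2) + np.exp(log_sb2) * y_lag ** 2
    return 0.5 * np.sum(np.log(var) + (y_now - ph * y_lag) ** 2 / var)

res = minimize(neg_loglik, x0=[phi_ls, 0.0, -1.0], method="Nelder-Mead")
print("LS estimate of phi:", phi_ls, " ML estimate of phi:", res.x[0])
```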

An estimating parameter of nonparametric regression model based on smoothing techniques

Statistical Journal of the IAOS

This paper studies parameter estimation for a nonparametric regression model that consists of a function of independent variables and observations of a dependent variable. The smoothing spline, penalized spline, and B-spline methods, from the class of smoothing techniques, are considered for estimating the unknown parameter of the nonparametric regression model. These methods use a smoothing parameter, chosen by cross-validation, to control the smoothing performance on the data set. We also compare these methods by fitting a nonparametric regression model to simulated data and real data. The simulated nonlinear data are generated from two different models specified by mathematical functions based on statistical distributions. According to the results, the smoothing spline, penalized spline, and B-spline methods all fit the nonlinear data well, judged by hypothesis testing of the biased estimator. However, the penalized spline method shows the minimum mean square errors for both models. As real data, we use data from a light detection and ranging (LIDAR) experiment, with the range distance travelled before the light is reflected as the independent variable and the logarithm of the ratio of received light from two laser sources as the dependent variable. From the mean square errors of the fitted data, the penalized spline again shows the minimum values.
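
An illustrative sketch of the comparison described above: smoothing spline, B-spline, and penalized spline fits to simulated nonlinear data, compared by mean square error. The test function, knot counts, and penalties are assumptions rather than the paper's settings, and the LIDAR data are not reproduced here.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline, LSQUnivariateSpline
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import SplineTransformer
from sklearn.linear_model import Ridge

rng = np.random.default_rng(10)
x = np.sort(rng.uniform(0, 1, 300))
y = np.exp(-3 * x) * np.sin(6 * np.pi * x) + rng.normal(scale=0.1, size=x.size)

smooth = UnivariateSpline(x, y, s=3.0)                       # smoothing spline
knots = np.linspace(0.05, 0.95, 15)                          # assumed interior knots
bspline = LSQUnivariateSpline(x, y, t=knots)                 # regression B-spline
pspline = make_pipeline(SplineTransformer(n_knots=20, degree=3),
                        Ridge(alpha=0.5)).fit(x[:, None], y) # penalized spline basis

def mse(pred):
    return np.mean((y - pred) ** 2)

print("smoothing spline MSE:", mse(smooth(x)))
print("B-spline MSE:        ", mse(bspline(x)))
print("penalized spline MSE:", mse(pspline.predict(x[:, None])))
```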

Comparing random coefficient autoregressive model with and without autocorrelated errors by Bayesian analysis

Statistical Journal of the IAOS

We proposed a Bayesian analysis for estimating an unknown parameter in a Random Coefficient Autoregressive (RCA) model with AutoRegressive (AR) process errors. We called this model an RCA model with autocorrelated errors (RCA-AR). A Markov Chain Monte Carlo (MCMC) method was used to generate samples from the posterior distribution which, after averaging, gave the estimated value of the unknown parameter. We used a Gibbs sampling algorithm in our MCMC calculation. To compare the performances of the RCA and RCA-AR models, a simulation was performed with a set of test data, and the resulting mean square errors were used to indicate their performance. The RCA-AR model worked better than the RCA model in every case. Lastly, we tried both models with real data, using them to estimate a series of monthly averages of the Stock Exchange of Thailand (SET) index. The RCA-AR model again worked better than the RCA model, consistent with the simulation results.

Parameter Estimation of Poisson Distribution by Using Maximum Likelihood, Markov Chain Monte Carlo, and Bayes method

Thammasat International Journal of Science and Technology, 2014

The Comparison of Bandwidth Selection Methods using Kernel Function

KMITL Science and Technology Journal, Dec 16, 2014
