Ridge regression

Property Value
dbo:abstract Ridge regression is a method of estimating the coefficients of multiple-regression models in scenarios where the independent variables are highly correlated. It has been used in many fields including econometrics, chemistry, and engineering. Also known as Tikhonov regularization, named for Andrey Tikhonov, it is a method of regularization of ill-posed problems. It is particularly useful for mitigating the problem of multicollinearity in linear regression, which commonly occurs in models with large numbers of parameters. In general, the method provides improved efficiency in parameter estimation problems in exchange for a tolerable amount of bias (see bias–variance tradeoff). The theory was first introduced by Hoerl and Kennard in 1970 in their Technometrics papers “RIDGE regressions: biased estimation of nonorthogonal problems” and “RIDGE regressions: applications in nonorthogonal problems”. This was the result of ten years of research into the field of ridge analysis. Ridge regression was developed as a possible solution to the imprecision of least squares estimators when linear regression models have some multicollinear (highly correlated) independent variables, by introducing the ridge regression (RR) estimator. This estimator yields more precise parameter estimates, as its variance and mean squared error are often smaller than those of the previously derived least squares estimators. (en) Ridge regression is a method of estimating the coefficients of a multiple-regression model when the independent variables are strongly correlated. It is used in fields such as econometrics, chemistry, and engineering. The theory was first introduced by Hoerl and Kennard in 1970 in the Technometrics papers “RIDGE regressions: biased estimation of nonorthogonal problems” and “RIDGE regressions: applications in nonorthogonal problems”. This was the result of ten years of research in the field of ridge analysis. Ridge regression was developed to address the imprecision of least squares estimators when a linear regression model has multicollinear (strongly correlated) independent variables. The ridge regression estimator is more precise than the least squares estimator. (ja)
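For reference (this formulation is standard but is not spelled out in the abstract above), ridge regression chooses coefficients b to minimize ||y - Xb||^2 + lambda * ||b||^2, which has the closed-form solution b_ridge = (X'X + lambda*I)^(-1) X'y, where lambda >= 0 is the ridge parameter and lambda = 0 recovers ordinary least squares. A minimal NumPy sketch under those assumptions (the function name ridge_fit and the argument lam are illustrative, not taken from the article):

    import numpy as np

    def ridge_fit(X, y, lam):
        """Closed-form ridge estimate (X'X + lam*I)^(-1) X'y; illustrative sketch only."""
        p = X.shape[1]
        # Adding lam * I to X'X improves its conditioning when columns of X are highly correlated.
        A = X.T @ X + lam * np.eye(p)
        return np.linalg.solve(A, X.T @ y)

    # Usage example: two nearly collinear predictors; lam > 0 stabilizes the estimate.
    rng = np.random.default_rng(0)
    x1 = rng.normal(size=100)
    X = np.column_stack([x1, x1 + 0.01 * rng.normal(size=100)])
    y = X @ np.array([1.0, 2.0]) + rng.normal(size=100)
    print(ridge_fit(X, y, lam=1.0))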
dbo:wikiPageExternalLink https://books.google.com/books?id=Jv_ZBwAAQBAJ&pg=PA86 https://books.google.com/books?id=v0KCDwAAQBAJ https://books.google.com/books?id=wmA_R3ZFrXYC https://books.google.com/books?id=yPOUDwAAQBAJ&pg=PA69 http://apps.nrbook.com/empanel/index.html#pg=1006
dbo:wikiPageID 954328 (xsd:integer)
dbo:wikiPageLength 26689 (xsd:nonNegativeInteger)
dbo:wikiPageRevisionID 1120671107 (xsd:integer)
dbo:wikiPageWikiLink dbr:Bayesian_probability dbr:Elastic_net_regularization dbr:Non-linear_least_squares dbr:Technometrics dbr:Bayes'_theorem dbc:Regression_analysis dbr:Bias_of_an_estimator dbr:Bias–variance_tradeoff dbr:Residual_(numerical_analysis) dbr:Design_matrix dbr:Integral_equation dbr:Inverse_problem dbr:Standard_deviation dbr:Levenberg–Marquardt_algorithm dbr:Maximum_a_posteriori dbr:Gauss–Markov_theorem dbr:Norm_(mathematics) dbr:Low-pass_filters dbr:Grace_Wahba dbr:Moment_matrix dbr:Multivariate_normal_distribution dbr:Condition_number dbr:Constraint_(mathematics) dbr:Cross-validation_(statistics) dbr:Homoscedasticity dbr:Underdetermined_system dbr:Regression_model dbr:Andrey_Nikolayevich_Tikhonov dbr:Machine_learning dbr:Singular_matrices dbr:Statistics dbr:Compact_operator dbr:Identity_matrix dbr:Kriging dbr:Overdetermined_system dbr:Mahalanobis_distance dbr:Main_diagonal dbr:Errors_and_residuals_in_statistics dbr:Lasso_(statistics) dbr:Least_squares dbr:Linear_regression dbr:Logistic_regression dbc:Linear_algebra dbr:Expected_value dbr:Normal_distribution dbr:Matrix_regularization dbr:Prior_probability dbr:Rank_(linear_algebra) dbr:Regularization_(mathematics) dbr:Residual_sum_of_squares dbr:Hermitian_adjoint dbr:Hilbert_space dbc:Inverse_problems dbr:Covariance_matrix dbc:Estimation_methods dbr:Cholesky_factorization dbr:Lagrange_multiplier dbr:Support_vector_machine dbr:Coefficient dbr:Efficient_estimator dbr:High-pass_filter dbr:Whitening_transformation dbr:Wiener_filter dbr:Difference_operator dbr:Discrete_fourier_transform dbr:Ill-posed_problem dbr:Mikhail_Lavrentyev dbr:Ordinary_least_squares dbr:Multicollinearity dbr:Statistical_classification dbr:Non-binding_constraint dbr:Well-posed_problem dbr:Restricted_maximum_likelihood dbr:Singular-value_decomposition dbr:Statistical_independence dbr:Effective_number_of_degrees_of_freedom dbr:Eigenvalues dbr:Regressand dbr:Generalized_singular-value_decomposition dbr:Arthur_E._Hoerl dbr:Discrepancy_principle dbr:L-curve_method dbr:Unbiased_predictive_risk_estimator
dbp:date May 2020 (en) November 2022 (en)
dbp:reason What are the relative dimensions of A, b and x? Is A a square or non-square matrix? Are x and y of the same dimension? (en) If multiplying a matrix by x is a filter, what in A is a frequency, and what values correspond to high or low frequencies? (en) Does this represent a system of linear equations? (en)
dbp:wikiPageUsesTemplate dbt:Regression_bar dbt:Authority_control dbt:Cite_book dbt:Clarify dbt:Efn dbt:Further dbt:Main dbt:Math dbt:Notelist dbt:Reflist dbt:Short_description dbt:Least_squares_and_regression_analysis
dcterms:subject dbc:Regression_analysis dbc:Linear_algebra dbc:Inverse_problems dbc:Estimation_methods
rdf:type owl:Thing
rdfs:comment Ridge regression is a method of estimating the coefficients of a multiple-regression model when the independent variables are strongly correlated. It is used in fields such as econometrics, chemistry, and engineering. The theory was first introduced by Hoerl and Kennard in 1970 in the Technometrics papers “RIDGE regressions: biased estimation of nonorthogonal problems” and “RIDGE regressions: applications in nonorthogonal problems”. This was the result of ten years of research in the field of ridge analysis. Ridge regression was developed to address the imprecision of least squares estimators when a linear regression model has multicollinear (strongly correlated) independent variables. The ridge regression estimator is more precise than the least squares estimator. (ja) Ridge regression is a method of estimating the coefficients of multiple-regression models in scenarios where the independent variables are highly correlated. It has been used in many fields including econometrics, chemistry, and engineering. Also known as Tikhonov regularization, named for Andrey Tikhonov, it is a method of regularization of ill-posed problems. It is particularly useful for mitigating the problem of multicollinearity in linear regression, which commonly occurs in models with large numbers of parameters. In general, the method provides improved efficiency in parameter estimation problems in exchange for a tolerable amount of bias (see bias–variance tradeoff). (en)
rdfs:label リッジ回帰 (ja) Ridge regression (en)
owl:sameAs wikidata:Ridge regression dbpedia-ja:Ridge regression https://global.dbpedia.org/id/FpbVP
prov:wasDerivedFrom wikipedia-en:Ridge_regression?oldid=1120671107&ns=0
foaf:isPrimaryTopicOf wikipedia-en:Ridge_regression
is dbo:wikiPageDisambiguates of dbr:Ridge_(disambiguation)
is dbo:wikiPageRedirects of dbr:Tikhonov_regularization dbr:L2_regularization dbr:Linear_regularization dbr:Ridge_Regression dbr:Weight_decay dbr:Phillips-Twomey_method dbr:Constrained_linear_inversion dbr:Tikhonov-Miller_method dbr:Tikhonov_regularisation dbr:Weight_regularization
is dbo:wikiPageWikiLink of dbr:Bayesian_linear_regression dbr:Elastic_net_regularization dbr:Electricity_price_forecasting dbr:Bias–variance_tradeoff dbr:Regularized_least_squares dbr:Degrees_of_freedom_(statistics) dbr:Levenberg–Marquardt_algorithm dbr:Gauss–Markov_theorem dbr:Poisson_regression dbr:Coefficient_of_determination dbr:Genome-wide_complex_trait_analysis dbr:Mlpy dbr:Cross-validation_(statistics) dbr:Machine_learning dbr:Feature_selection dbr:Kernel_method dbr:Multi-armed_bandit dbr:Principal_component_regression dbr:Adjusted_Plus_Minus dbr:Lasso_(statistics) dbr:Least_squares dbr:Linear_regression dbr:Logistic_regression dbr:Regularization_(mathematics) dbr:Ridge_(disambiguation) dbr:Learnable_function_class dbr:High-dimensional_statistics dbr:Tikhonov_regularization dbr:Regularized_canonical_correlation_analysis dbr:Multicollinearity dbr:Shrinkage_(statistics) dbr:List_of_statistics_articles dbr:Reservoir_computing dbr:Outline_of_machine_learning dbr:Outline_of_regression_analysis dbr:Outline_of_statistics dbr:Types_of_artificial_neural_networks dbr:L2_regularization dbr:Linear_regularization dbr:Ridge_Regression dbr:Weight_decay dbr:Phillips-Twomey_method dbr:Constrained_linear_inversion dbr:Tikhonov-Miller_method dbr:Tikhonov_regularisation dbr:Weight_regularization
is owl:differentFrom of dbr:Ridge_function
is foaf:primaryTopic of wikipedia-en:Ridge_regression