dbo:abstract |
In information geometry, a divergence is a kind of statistical distance: a binary function which establishes the separation from one probability distribution to another on a statistical manifold. The simplest divergence is squared Euclidean distance (SED), and divergences can be viewed as generalizations of SED. The other most important divergence is relative entropy (Kullback–Leibler divergence, KL divergence), which is central to information theory. There are numerous other specific divergences and classes of divergences, notably f-divergences and Bregman divergences. (en)
In statistics, a divergence is a function or functional that measures the dissimilarity of one probability distribution with respect to another. Depending on the context, divergences can be defined for distributions, for positive (non-normalized) measures, for vectors (for example on the parameter space when considering a parametric model), or for matrices. Divergences are analogous to squared distances and generalize the notion of distance to statistical manifolds, but they are a weaker notion in that they are in general neither symmetric nor satisfy the triangle inequality. (fr) |
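A minimal illustrative sketch (not part of the DBpedia record) of the two divergences named in the abstract, squared Euclidean distance and Kullback–Leibler divergence, computed for two small discrete probability distributions; the vectors p and q are made-up values chosen only for illustration.

import math

def squared_euclidean_distance(p, q):
    # Sum of squared coordinate differences: the simplest divergence.
    return sum((pi - qi) ** 2 for pi, qi in zip(p, q))

def kl_divergence(p, q):
    # Relative entropy D_KL(p || q) in nats; assumes q[i] > 0 wherever p[i] > 0.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

print(squared_euclidean_distance(p, q))  # 0.02
print(kl_divergence(p, q))               # about 0.0253; note D_KL(p || q) != D_KL(q || p),
                                         # since divergences need not be symmetric
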
dbo:wikiPageID |
25896411 (xsd:integer) |
dbo:wikiPageLength |
19371 (xsd:nonNegativeInteger) |
dbo:wikiPageRevisionID |
1103479767 (xsd:integer) |
dbo:wikiPageWikiLink |
dbr:Probability_distribution dbr:Coordinate_chart dbr:Torsion_of_connection dbr:Annals_of_Mathematical_Statistics dbr:Bhattacharyya_angle dbr:Bhattacharyya_distance dbc:Statistical_distance dbr:Relative_entropy dbr:Information_geometry dbr:John_Wiley_&_Sons dbr:Conditional_probability dbr:Convex_set dbr:Statistical_manifold dbr:Statistical_distance dbr:Convex_conjugate dbr:Strictly_convex_function dbr:Hellinger_distance dbr:Triangle_inequality dbc:F-divergences dbr:Least_squares dbr:Linear_inverse_problem dbr:Linear_regression dbr:Logistic_regression dbr:Affine_connection dbr:Bregman_divergence dbr:Parametric_family dbr:Quadratic_form dbr:Pythagorean_theorem dbr:Riemannian_metric dbr:Harold_Jeffreys dbr:Jensen–Shannon_divergence dbr:Binary_function dbr:Differentiability_class dbr:Differentiable_manifold dbr:Dimensional_analysis dbr:Dover_Publications dbr:Positive_semidefinite_matrix dbr:Information_theory dbr:Kullback–Leibler_divergence dbr:Principle_of_maximum_entropy dbr:F-divergence dbr:Metric_(mathematics) dbr:Fisher_information_metric dbr:Total_variation_distance_of_probability_measures dbr:Squared_Euclidean_distance dbr:Chi-squared_divergence dbr:Negative_entropy dbr:Dual_affine_connection dbr:Α-connection |
dbp:wikiPageUsesTemplate |
dbt:! dbt:Citation dbt:Citation_needed dbt:Cite_book dbt:Cite_journal dbt:Details dbt:Distinguish dbt:Efn dbt:Harvtxt dbt:Main dbt:Math dbt:Notelist dbt:Refbegin dbt:Refend dbt:Reflist dbt:Sfn dbt:Slink dbt:Tmath dbt:Isbn dbt:Statistics |
dcterms:subject |
dbc:Statistical_distance dbc:F-divergences |
gold:hypernym |
dbr:Function |
rdf:type |
owl:Thing yago:WikicatStatisticalDistanceMeasures yago:Abstraction100002137 yago:Act100030358 yago:Action100037396 yago:Choice100161243 yago:Decision100162632 yago:Event100029378 yago:Maneuver100168237 yago:Measure100174412 yago:Move100165942 yago:PsychologicalFeature100023100 yago:YagoPermanentlyLocatedEntity dbo:Disease |
rdfs:comment |
In information geometry, a divergence is a kind of statistical distance: a binary function which establishes the separation from one probability distribution to another on a statistical manifold. The simplest divergence is squared Euclidean distance (SED), and divergences can be viewed as generalizations of SED. The other most important divergence is relative entropy (Kullback–Leibler divergence, KL divergence), which is central to information theory. There are numerous other specific divergences and classes of divergences, notably f-divergences and Bregman divergences. (en)
In statistics, a divergence is a function or functional that measures the dissimilarity of one probability distribution with respect to another. Depending on the context, divergences can be defined for distributions, for positive (non-normalized) measures, for vectors (for example on the parameter space when considering a parametric model), or for matrices. (fr)
rdfs:label |
Divergence (statistics) (en) Divergence (statistiques) (fr) |
owl:differentFrom |
dbr:Deviance_(statistics) dbr:Deviation_(statistics) dbr:Discrepancy_(disambiguation) |
owl:sameAs |
freebase:Divergence (statistics) wikidata:Divergence (statistics) dbpedia-fr:Divergence (statistics) https://global.dbpedia.org/id/4j9rd |
prov:wasDerivedFrom |
wikipedia-en:Divergence_(statistics)?oldid=1103479767&ns=0 |
foaf:isPrimaryTopicOf |
wikipedia-en:Divergence_(statistics) |
is dbo:wikiPageDisambiguates of |
dbr:Divergence_(disambiguation) |
is dbo:wikiPageRedirects of |
dbr:Statistical_divergence dbr:Contrast_function |
is dbo:wikiPageWikiLink of |
dbr:Rényi_entropy dbr:Statistical_distance dbr:Central_tendency dbr:Distance dbr:Fairness_(machine_learning) dbr:Bregman_divergence dbr:Discrepancy_function dbr:Quantification_(machine_learning) dbr:Kullback–Leibler_divergence dbr:Euclidean_distance dbr:F-divergence dbr:Discrepancy dbr:Divergence_(disambiguation) dbr:List_of_statistics_articles dbr:Wasserstein_GAN dbr:Paired_opposites dbr:Stein_discrepancy dbr:Statistical_divergence dbr:Contrast_function |
is owl:differentFrom of |
dbr:Deviance_(statistics) |
is foaf:primaryTopic of |
wikipedia-en:Divergence_(statistics) |