Foundation models
A foundation model is a large artificial intelligence model trained on a vast quantity of unlabeled data at scale (usually by self-supervised learning), resulting in a model that can be adapted to a wide range of downstream tasks. Foundation models have helped bring about a major transformation in how AI systems are built since their introduction in 2018. Early examples of foundation models were large pre-trained language models, including BERT and GPT-3. Using the same ideas, domain-specific models built on sequences of other kinds of tokens, such as medical codes, have been developed as well. Subsequently, several multimodal foundation models have been produced, including DALL-E, Flamingo, and Florence. The Stanford Institute for Human-Centered Artificial Intelligence's (HAI) Center for Research on Foundation Models (CRFM) popularized the term.
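The abstract describes a two-stage recipe: self-supervised pretraining on unlabeled data, followed by adaptation to a downstream task with only a small amount of labeled data. The sketch below illustrates that pattern in miniature with plain Python; the corpus, the count-based "pretraining," and the `classify` helper are hypothetical simplifications for illustration, not any real foundation-model API.

```python
from collections import Counter, defaultdict

# Hypothetical unlabeled "pretraining" corpus.
corpus = [
    "cats purr softly",
    "dogs bark loudly",
    "cats chase mice",
    "dogs chase cats",
]

# Stage 1 -- self-supervised "pretraining": learn a representation of each
# word (here, a profile of its neighboring words) using no labels at all.
context = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for i, w in enumerate(words):
        for j, neighbor in enumerate(words):
            if i != j:
                context[w][neighbor] += 1

# Stage 2 -- downstream adaptation: a tiny labeled set reuses the
# pretrained representations instead of learning from scratch.
labeled = {"purr": "cat-like", "bark": "dog-like"}

def classify(word):
    # Choose the labeled anchor word whose context profile overlaps most
    # (Counter's & operator takes the element-wise minimum of counts).
    best = max(labeled, key=lambda a: sum((context[word] & context[a]).values()))
    return labeled[best]

print(classify("mice"))    # shares context with "purr" via "cats"
print(classify("loudly"))  # shares context with "bark" via "dogs"
```

The point of the sketch is the division of labor: the expensive, label-free stage produces general-purpose representations, and the cheap, task-specific stage adapts them, which is what "adapted to a wide range of downstream tasks" means in the abstract.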
Property | Value |
---|---|
dbo:abstract | A foundation model is a large artificial intelligence model trained on a vast quantity of unlabeled data at scale (usually by self-supervised learning) resulting in a model that can be adapted to a wide range of downstream tasks. Foundation models have helped bring about a major transformation in how AI systems are built since their introduction in 2018. Early examples of foundation models were large pre-trained language models including BERT and GPT-3. Using the same ideas, domain-specific models using sequences of other kinds of tokens, such as medical codes, have been built as well. Subsequently, several multimodal foundation models have been produced including DALL-E, Flamingo, and Florence. The Stanford Institute for Human-Centered Artificial Intelligence's (HAI) Center for Research on Foundation Models (CRFM) popularized the term. (en) A foundation model is a large artificial intelligence model trained on a large quantity of unlabeled data (generally by self-supervised learning). The resulting model can be adapted to a wide range of downstream tasks. Since their introduction in 2018, foundation models have brought about a major transformation in how AI systems are built. The first foundation models were large pre-trained language models, notably BERT and GPT-3. Subsequently, multimodal foundation models such as DALL-E, Flamingo, and Florence, which integrate image and text, appeared. The term was popularized by the Center for Research on Foundation Models (CRFM) of the Stanford Institute for Human-Centered Artificial Intelligence (HAI). (fr) |
dbo:wikiPageID | 70984276 (xsd:integer) |
dbo:wikiPageLength | 11434 (xsd:nonNegativeInteger) |
dbo:wikiPageRevisionID | 1122718597 (xsd:integer) |
dbo:wikiPageWikiLink | dbr:DALL-E dbc:Artificial_intelligence dbc:Machine_learning dbc:Computational_fields_of_study dbr:GPT-3 dbr:Artificial_intelligence dbc:Computational_linguistics dbc:Natural_language_processing dbc:Deep_learning dbc:Unsupervised_learning dbr:BERT_(language_model) dbc:Language_modeling dbr:Self-supervised_learning dbr:Deep_neural_networks |
dbp:wikiPageUsesTemplate | dbt:Reflist dbt:Short_description dbt:Natural_Language_Processing dbt:Differentiable_computing dbt:Existential_risk_from_artificial_intelligence |
dct:subject | dbc:Artificial_intelligence dbc:Machine_learning dbc:Computational_fields_of_study dbc:Computational_linguistics dbc:Natural_language_processing dbc:Deep_learning dbc:Unsupervised_learning dbc:Language_modeling |
rdfs:comment | A foundation model is a large artificial intelligence model trained on a vast quantity of unlabeled data at scale (usually by self-supervised learning) resulting in a model that can be adapted to a wide range of downstream tasks. Foundation models have helped bring about a major transformation in how AI systems are built since their introduction in 2018. Early examples of foundation models were large pre-trained language models including BERT and GPT-3. Using the same ideas, domain-specific models using sequences of other kinds of tokens, such as medical codes, have been built as well. Subsequently, several multimodal foundation models have been produced including DALL-E, Flamingo, and Florence. The Stanford Institute for Human-Centered Artificial Intelligence's (HAI) Center for Research on Foundation Models (CRFM) popularized the term. (en) A foundation model is a large artificial intelligence model trained on a large quantity of unlabeled data (generally by self-supervised learning). The resulting model can be adapted to a wide range of downstream tasks. Since their introduction in 2018, foundation models have brought about a major transformation in how AI systems are built. The first foundation models were large pre-trained language models, notably BERT and GPT-3. Subsequently, multimodal foundation models such as DALL-E, Flamingo, and Florence, which integrate image and text, appeared. The term was popularized by the Center for Research on Foundation Models (CRFM) of the Stanford Institute for Human-Centered Artificial Intelligence (HAI). (fr) |
rdfs:label | Foundation models (en) Modèle de fondation (fr) |
owl:sameAs | wikidata:Foundation models dbpedia-fr:Foundation models https://global.dbpedia.org/id/GeFZ6 |
prov:wasDerivedFrom | wikipedia-en:Foundation_models?oldid=1122718597&ns=0 |
foaf:isPrimaryTopicOf | wikipedia-en:Foundation_models |
is dbo:wikiPageRedirects of | dbr:Foundation_model |
is dbo:wikiPageWikiLink of | dbr:History_of_artificial_intelligence dbr:Peter_Norvig dbr:Foundation_model dbr:I._J._Good |
is foaf:primaryTopic of | wikipedia-en:Foundation_models |