Knowledge distillation | Definition, Large Language Models, & Examples | Britannica

Knowledge transfer from a teacher model to a student model

Written by Adam Volle, a freelance writer and editor based in Atlanta, Georgia.

Although logits are the primary focus of knowledge transfer between teacher and student models, there are other forms of knowledge that require consideration.
All forms of knowledge in neural networks can generally be placed in one of three categories: response-based knowledge, feature-based knowledge, or relation-based knowledge. Logits constitute response-based knowledge.
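The response-based approach can be illustrated with a minimal sketch: the teacher's logits are softened with a temperature-scaled softmax, and the student is trained to minimize the divergence between its own softened distribution and the teacher's. The function names and the choice of temperature below are illustrative, not part of any particular library.

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with temperature scaling; higher temperatures
    yield softer (more uniform) probability distributions."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence between the softened teacher and student
    distributions -- the core of response-based distillation.
    Scaled by T^2 so gradient magnitudes stay comparable
    across temperatures."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return temperature ** 2 * sum(
        pi * math.log(pi / qi) for pi, qi in zip(p, q)
    )

# When the student reproduces the teacher's logits, the loss is zero;
# any mismatch produces a positive loss.
print(distillation_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))
print(distillation_loss([1.0, 2.0, 3.0], [3.0, 2.0, 1.0]))
```

In practice this soft-target loss is usually combined with a standard cross-entropy loss on the true labels, weighted by a mixing coefficient.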
