Overview of Large Language Models

Video: http://youtube.com/watch?v=dFwF0cwFd-E

Large language models have achieved huge success in rich content generation across text, speech, images, videos, and code. In this video I will give a brief history of the evolution of such large language models. I will start with Transformers, BERT, and GPT. Then we will cover further natural language understanding models like RoBERTa, ELECTRA, and DeBERTa, as well as natural language generation models like BART and T5. We will also cover multilingual models like XLM, Unicoder, mBART, mT5, and DeltaLM, and multimodal models like VisualBERT, ViLBERT, and CLIP. To deploy these models in real-world settings, model compression and distributed training have become essential, so we will discuss distillation, adapters, and mixture of experts. Finally, since prompt-based models have recently become popular, we will talk about GPT-3, InstructGPT, and prompting in general. This is the story of modern NLP through the lens of large language models.

Here is the agenda:
• 00:00:00 Rich text generation
• 00:03:14 Transformers, BERT, GPT, T5
• 00:08:35 Natural language understanding: RoBERTa, ELECTRA, DeBERTa
• 00:13:21 Natural language generation: BART, T5
• 00:16:20 Multilingual models: XLM, Unicoder, mBART, mT5, DeltaLM
• 00:22:42 Multimodal models: VisualBERT, ViLBERT, CLIP
• 00:28:00 Compression and distributed training: Distillation, Adapters, Mixture of Experts
• 00:41:20 Prompt-based models: GPT-3, InstructGPT, Prompting
