Keynote: Pre-training and Fine-tuning of Code Generation Models - Loubna Ben-Allal, Machine Learning Engineer, Hugging Face
Large Language Models trained on code have showcased remarkable abilities in code completion and in synthesizing code from natural language descriptions. In this talk, we'll explore the behind-the-scenes process of building and training large code models like StarCoder, a robust 15B-parameter code generation model trained on 80+ programming languages, while also incorporating responsible AI practices. Additionally, we'll discuss how to leverage these models with open-source libraries such as transformers, datasets, and PEFT, and how to deploy them efficiently.
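As a rough sketch of the kind of workflow the abstract alludes to (illustrative only, not material from the talk), the Python below loads a StarCoder checkpoint with transformers and attaches a LoRA adapter via peft for parameter-efficient fine-tuning; the checkpoint name, prompt, and LoRA hyperparameters are assumptions chosen for illustration.

# Illustrative only: load a StarCoder checkpoint with transformers and prepare it
# for parameter-efficient fine-tuning with a LoRA adapter from peft.
# Checkpoint name and hyperparameters below are assumptions, not values from the talk.
from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

checkpoint = "bigcode/starcoder"  # 15B-parameter model; license acceptance may be required on the Hub

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint, device_map="auto")  # needs `accelerate` installed

# Quick code-completion sanity check with the base model.
inputs = tokenizer("def fibonacci(n):", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=48)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

# Attach a LoRA adapter so only a small fraction of the weights is trained during fine-tuning.
lora_config = LoraConfig(r=8, lora_alpha=32, lora_dropout=0.05, task_type="CAUSAL_LM")
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the adapter weights are trainable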