imaodong/Adapter-tuning

This is one way to implement Adapter-tuning. Only the general idea is provided; the details can be adjusted slightly to fit your situation.
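As a reference, here is a minimal sketch of that idea, assuming PyTorch and the Hugging Face transformers GPT-2 implementation. The names Adapter and BlockWithAdapter, the bottleneck size of 64, and the choice to place one adapter after each GPT2Block are illustrative assumptions, not necessarily the exact layout used in this repository.

```python
import torch.nn as nn
from transformers import GPT2Model


class Adapter(nn.Module):
    """Bottleneck adapter: down-project, nonlinearity, up-project, plus a residual connection."""

    def __init__(self, hidden_size, bottleneck_size=64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck_size)
        self.act = nn.ReLU()
        self.up = nn.Linear(bottleneck_size, hidden_size)
        # Zero-init the up-projection so each adapter starts out as an identity mapping.
        nn.init.zeros_(self.up.weight)
        nn.init.zeros_(self.up.bias)

    def forward(self, hidden_states):
        return hidden_states + self.up(self.act(self.down(hidden_states)))


class BlockWithAdapter(nn.Module):
    """Wraps one GPT2Block and applies an adapter to its hidden-state output."""

    def __init__(self, block, hidden_size, bottleneck_size=64):
        super().__init__()
        self.block = block
        self.adapter = Adapter(hidden_size, bottleneck_size)

    def forward(self, *args, **kwargs):
        outputs = self.block(*args, **kwargs)
        # outputs[0] is the hidden-state tensor; pass the rest of the tuple through unchanged.
        return (self.adapter(outputs[0]),) + outputs[1:]


model = GPT2Model.from_pretrained("gpt2")  # GPT-2 Small: 12 GPT2Block

# Freeze all pre-trained weights; only the adapters added below remain trainable.
for p in model.parameters():
    p.requires_grad = False

for i, block in enumerate(model.h):
    model.h[i] = BlockWithAdapter(block, model.config.hidden_size)
```

With GPT-2 Small's roughly 117 million parameters frozen, each 768→64→768 adapter adds only about 0.1 million trainable parameters per block, roughly 1.2 million in total across the 12 blocks.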

For reference, here are the layer counts of some relevant models:

GPT-2 Small: 12 GPT2Block, about 117 million parameters.
GPT-2 Medium: 24 GPT2Block, about 348 million parameters.
GPT-2 Large: 36 GPT2Block, about 755 million parameters.
GPT-2 XL (also called Extra Large): 48 GPT2Block, about 1.554 billion parameters.

RoBERTa Base: 12 RobertaLayer, about 125 million parameters in total.
RoBERTa Large: 24 RobertaLayer, about 355 million parameters in total.
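These counts can be checked directly against the Hugging Face checkpoints. The snippet below is a small sanity check, assuming the gpt2-medium and roberta-base checkpoints; the printed parameter totals will be close to, but not exactly, the rounded figures above.

```python
from transformers import GPT2Model, RobertaModel

gpt2 = GPT2Model.from_pretrained("gpt2-medium")         # 24 GPT2Block
roberta = RobertaModel.from_pretrained("roberta-base")  # 12 RobertaLayer

# Print the number of transformer blocks and the total parameter count for each model.
print(len(gpt2.h), sum(p.numel() for p in gpt2.parameters()))
print(len(roberta.encoder.layer), sum(p.numel() for p in roberta.parameters()))
```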

About

An implementation of adapter-tuning.
