Jesse Dodge (@JesseDodge) · May 5, 2023 · From Twitter
Fantastic work from MosaicML open sourcing some large models!
Tweet by Jonathan Frankle · May 5, 2023
MPT is here! Check out our shiny new LLMs, open-source w/ commercial license. The base MPT-7B model is 7B params trained on 1T tokens and reaches LLaMA-7B quality. We also created Instruct (commercial), Chat, and (my favorite) StoryWriter-65k+ variants.
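As a companion to the announcement, here is a minimal sketch of loading the base checkpoint with Hugging Face transformers. The Hub IDs ("mosaicml/mpt-7b" and the variant names), the trust_remote_code flag, and the GPT-NeoX tokenizer choice are assumptions about how the release was packaged, not details stated in the post itself:

# Minimal sketch: load MPT-7B with Hugging Face transformers.
# Assumptions (not from the original tweet): weights are published on the Hub as
# "mosaicml/mpt-7b", the custom model code requires trust_remote_code=True,
# and the model reuses the EleutherAI GPT-NeoX-20B tokenizer.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mosaicml/mpt-7b"  # swap for mpt-7b-instruct, mpt-7b-chat, or mpt-7b-storywriter
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

prompt = "MosaicML's MPT-7B is"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))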
https://www.upcarta.com/posts/73182