MosaicML open sources new 8k context length MPT-30B language model under Apache 2.0 license (22.06.2023 blogpost)
www.mosaicml.com | MPT-30B: Raising the bar for open-source foundation models
Introducing MPT-30B, a new, more powerful member of our Foundation Series of open-source models, trained with an 8k context length on H100s.

Still had some reasoning issues, but looking forward to the fine-tunes!
Does prompting in a way that improves its reasoning work? I'm talking about CoT, ToT, or other prompting strategies.
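For illustration, here's a minimal sketch of zero-shot CoT prompting against MPT-30B with Hugging Face transformers. The mosaicml/mpt-30b-instruct checkpoint is real, but the prompt wording and generation settings are just assumptions for the example, not an official template:

```python
# Minimal sketch: zero-shot chain-of-thought (CoT) prompting with MPT-30B.
# Assumes the mosaicml/mpt-30b-instruct checkpoint; prompt text is illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "mosaicml/mpt-30b-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,   # MPT ships custom model code
    device_map="auto",        # needs the `accelerate` package
)

# The CoT trick: append "Let's think step by step." so the model emits
# intermediate reasoning before committing to a final answer.
prompt = (
    "Q: A train travels 60 km in 45 minutes. "
    "What is its average speed in km/h?\n"
    "A: Let's think step by step."
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

ToT goes a step further: instead of committing to a single reasoning chain, it samples several candidate steps, scores them, and expands only the promising branches.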