MosaicML open-sources new 8k-context-length MPT-30B language model under Apache 2.0 license (22.06.2023 blog post)

www.mosaicml.com — MPT-30B: Raising the bar for open-source foundation models

Introducing MPT-30B, a new, more powerful member of our Foundation Series of open-source models, trained with an 8k context length on H100s.

Posted by manitcor@lemmy.intai.tech in Machine Learning - Learning/Language Models @lemmy.intai.tech


17 comments
  • Still has some reasoning issues, but looking forward to the fine-tunes!

    • Does prompting in a way that improves its reasoning work? I'm thinking of CoT, ToT, or other prompting strategies.
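
      Plain chain-of-thought prompting can be tried directly against the released weights. Below is a minimal sketch using Hugging Face transformers: the model id mosaicml/mpt-30b is the real repo, but the prompt wording, dtype, and generation settings are illustrative assumptions, not anything from the announcement.

      ```python
      # Minimal chain-of-thought (CoT) prompting sketch against MPT-30B.
      # Assumptions: bf16 weights fit on your GPU(s) via device_map="auto";
      # the example question and sampling settings are purely illustrative.
      import torch
      from transformers import AutoModelForCausalLM, AutoTokenizer

      MODEL_ID = "mosaicml/mpt-30b"  # base model; swap in an instruct fine-tune once available

      tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
      model = AutoModelForCausalLM.from_pretrained(
          MODEL_ID,
          torch_dtype=torch.bfloat16,  # 30B params: bf16 plus large/sharded GPU memory
          device_map="auto",
          trust_remote_code=True,      # MPT ships custom model code in its repo
      )

      # CoT: ask the model to reason step by step before answering.
      prompt = (
          "Q: A farmer has 17 sheep. All but 9 run away. How many are left?\n"
          "A: Let's think step by step."
      )

      inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
      output = model.generate(
          **inputs,
          max_new_tokens=128,
          do_sample=False,  # greedy decoding keeps the reasoning chain deterministic
      )
      # Print only the newly generated continuation, not the prompt.
      print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
      ```

      Tree-of-thought (ToT) would layer a search loop on top of this (sample several reasoning branches, score them, expand the best); whether either strategy actually lifts MPT-30B's reasoning is exactly the open question here.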
