H2O-Danube-1.8B: an open-source, free 1.8B language model
H2O-Danube-1.8B is a 1.8B-parameter language model trained on 1T tokens, following the core principles of Llama 2 and Mistral. Although it was trained on significantly fewer total tokens than similarly sized reference models, it achieves highly competitive results across multiple benchmarks. Additionally, we release a chat model trained with supervised fine-tuning followed by direct preference optimization. We open-source H2O-Danube-1.8B under the Apache 2.0 license to further democratize large language models economically to a wider audience.
H2O-Danube-1.8B can be applied in fields such as natural language processing, chatbots, and text generation:
Building intelligent chatbots
Large-scale text generation tasks
Natural language processing research
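As a minimal sketch of the chatbot and text-generation use cases above, the model can be loaded through the Hugging Face `transformers` library. The model id `h2oai/h2o-danube-1.8b-chat` is an assumption based on H2O.ai's Hugging Face naming; verify it before use.

```python
# Hedged sketch: running the H2O-Danube chat model with transformers.
# Assumes the model id "h2oai/h2o-danube-1.8b-chat"; downloads ~3.6 GB
# of weights on first run.
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="h2oai/h2o-danube-1.8b-chat",  # assumed Hugging Face model id
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Format the conversation with the model's own chat template.
messages = [
    {"role": "user", "content": "Why are small open-source language models useful?"}
]
prompt = generator.tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

outputs = generator(prompt, max_new_tokens=128, do_sample=False)
print(outputs[0]["generated_text"])
```

Because the model is only 1.8B parameters, it can also run on CPU by dropping `device_map="auto"` and using `torch_dtype=torch.float32`, at the cost of slower generation.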