H2O-Danube-1.8B

1.8B language model, open source and free

#natural language processing
#open source
#language model

Product Details

H2O-Danube-1.8B is a 1.8B-parameter language model trained on 1T tokens, following the core principles of Llama 2 and Mistral. Although our model is trained on significantly fewer total tokens than similarly sized reference models, it shows highly competitive results on multiple benchmarks. In addition, we release a chat model fine-tuned with supervised fine-tuning (SFT) and direct preference optimization (DPO). We open-source H2O-Danube-1.8B under the Apache 2.0 license to further democratize large language models and make them economically accessible to a wider audience.
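
As a quick orientation, the sketch below shows one way the base model could be loaded for plain text completion with Hugging Face transformers. The repository id h2oai/h2o-danube-1.8b-base, the bf16 precision, and the example prompt are assumptions for illustration, not details taken from this page.

```python
# Minimal sketch: loading the base model for text completion with transformers.
# The repo id "h2oai/h2o-danube-1.8b-base" is an assumption, not from this page.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "h2oai/h2o-danube-1.8b-base"  # assumed Hugging Face repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 1.8B parameters fit comfortably in bf16 on a single GPU
    device_map="auto",
)

prompt = "The Danube is a river that"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```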

Main Features

1. 1.8B-parameter language model
2. Open source and free
3. Competitive benchmark performance
4. Optimized chat model (SFT + DPO) released

Target Users

Can be used in fields such as natural language processing, chatbots, and text generation.

Examples

Building intelligent chatbots (see the sketch after this list)

Large-scale text generation tasks

Natural language processing research
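
For the chatbot use case, a minimal sketch of how the SFT+DPO chat model could be driven through its chat template is shown below. The repository id h2oai/h2o-danube-1.8b-chat and the availability of a chat template in its tokenizer are assumptions, not details taken from this page.

```python
# Minimal chatbot sketch for the SFT+DPO chat variant.
# The repo id "h2oai/h2o-danube-1.8b-chat" and the presence of a chat
# template in its tokenizer are assumptions, not details from this page.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "h2oai/h2o-danube-1.8b-chat"  # assumed Hugging Face repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [
    {"role": "user", "content": "Summarize H2O-Danube-1.8B in one sentence."},
]
# Format the conversation with the tokenizer's chat template and generate a reply.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(input_ids, max_new_tokens=128, do_sample=False)
reply = tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True)
print(reply)
```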

Categories

📁 language
› AI model
› AI language model
