EXAONE-3.0-7.8B-Instruct is a bilingual (English and Korean) pre-trained generative model with 7.8 billion parameters, developed by LG AI Research. The model is pre-trained on 8T curated tokens and post-trained with supervised fine-tuning and direct preference optimization, demonstrating highly competitive benchmark performance against open models of similar size.
The TinyLlama project aims to pre-train a 1.1B-parameter Llama model on 3 trillion tokens. With proper optimization, the project estimates this can be done in "only" 90 days using 16 A100-40G GPUs; training started on 2023-09-01. TinyLlama uses the exact same architecture and tokenizer as Llama 2, so it can be dropped into many open-source projects built on Llama. In addition, with only 1.1B parameters, its compactness lets it serve applications with constrained compute and memory budgets.