This is a 13-billion-parameter pre-trained bilingual large language model supporting Arabic and English, trained on a dataset of 72 billion Arabic tokens and 279 billion English and code tokens. The Arabic data was iterated over for 1.6 epochs (compared to 1 epoch for the English and code data), for a total of 395 billion training tokens. The model is based on a decoder-only Transformer architecture (GPT-3 style) and uses the SwiGLU nonlinear activation function. It employs ALiBi positional biases, which allow extrapolation to longer sequence lengths, providing improved context handling and model accuracy.
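The description names two architectural components, SwiGLU and ALiBi. As an illustration only, the sketch below shows a generic PyTorch rendering of both ideas; it is not the model's released code, and the module names, dimensions, and slope schedule are assumptions chosen for clarity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SwiGLU(nn.Module):
    """SwiGLU feed-forward block: SiLU(x @ W_gate) * (x @ W_up), then a down projection."""
    def __init__(self, d_model: int, d_hidden: int):
        super().__init__()
        self.w_gate = nn.Linear(d_model, d_hidden, bias=False)
        self.w_up = nn.Linear(d_model, d_hidden, bias=False)
        self.w_down = nn.Linear(d_hidden, d_model, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.w_down(F.silu(self.w_gate(x)) * self.w_up(x))

def alibi_bias(n_heads: int, seq_len: int) -> torch.Tensor:
    """ALiBi: a per-head linear penalty proportional to the query-key distance.

    Returns an (n_heads, seq_len, seq_len) tensor added to the attention logits
    before softmax. Because no positional embeddings are learned, the bias can
    be computed for sequences longer than those seen in training, which is what
    enables length extrapolation.
    """
    # Head-specific slopes following the geometric sequence from the ALiBi paper
    # (assumes n_heads is a power of two).
    slopes = torch.tensor([2.0 ** (-8.0 * (i + 1) / n_heads) for i in range(n_heads)])
    # Element [i, j] is (j - i): zero on the diagonal, increasingly negative for
    # more distant (earlier) key positions; future positions are clamped to zero
    # since a causal mask removes them anyway.
    pos = torch.arange(seq_len)
    distance = (pos[None, :] - pos[:, None]).clamp(max=0).float()
    return slopes[:, None, None] * distance[None, :, :]
```

In use, the returned bias would simply be added to the scaled dot-product attention scores of each head before the softmax, in place of adding positional embeddings to the token inputs.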