by Mistral AI
A balanced model in the Ministral 3 family, Ministral 3 8B is a powerful, efficient tiny language model with vision capabilities. This model is the base pre-trained version, not fine-tuned for instruction or reasoning tasks, which makes it well suited for custom post-training. For instruction- and chat-based use cases, we recommend Ministral 3 8B Instruct 2512. The Ministral 3 family is designed for edge deployment and runs on a wide range of hardware. Ministral 3 8B can even be deployed locally, fitting in 24GB of VRAM in BF16 and in less than 12GB of RAM/VRAM when quantized.
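A rough sanity check of those memory figures, assuming a nominal 8B-parameter count and counting weights only (activations, KV cache, and framework overhead add more on top, which is why the BF16 figure above leaves headroom up to 24GB):

```python
# Back-of-envelope, weight-only memory estimate for an 8B-parameter model.
# The 8e9 parameter count and byte widths are illustrative assumptions,
# not exact figures from the model card.

def weight_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Approximate weight storage in GB (1 GB = 1e9 bytes)."""
    return n_params * bytes_per_param / 1e9

N = 8e9  # nominal 8B parameters

bf16 = weight_memory_gb(N, 2.0)   # BF16: 2 bytes per parameter
int4 = weight_memory_gb(N, 0.5)   # 4-bit quantization: ~0.5 byte per parameter

print(f"BF16 weights: ~{bf16:.0f} GB")   # ~16 GB, within a 24 GB GPU
print(f"4-bit weights: ~{int4:.0f} GB")  # ~4 GB, consistent with <12 GB total
```

The gap between the weight-only number and the quoted budget is the usual allowance for the KV cache and runtime overhead, which grow with context length and batch size.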