by Zheng Zian
OpenMoE is a text AI model developed by Zheng Zian, based on OpenMoE-8B and offering limited performance. The model is licensed under Apache-2.0 and was released in July 2023 as an early effort to ignite the open-source MoE community.
1 consideration identified
Review recommended before use
These considerations are automatically identified based on publicly available information about the vendor and AI catalog data. Actual risks may vary based on your specific use case and implementation.