OpenMoE
by Zheng Zian
AI Model
Low Confidence
OpenMoE is a text AI model developed by Zheng Zian and based on OpenMoE-8B. It is licensed under Apache-2.0 and was released in July 2023. An early model with limited performance, it aims to ignite the open-source MoE (Mixture-of-Experts) community.
Transparency Score: 35% (Limited Transparency)
Supply Chain: 0/20
Compliance: 10/20
Policy: 0/25
Technical: 25/25
Ethical & Operational: 0/10
Openness Assessment: 46% (Partially Open)
5 Open, 3 Partial, 6 Closed
Availability: 5/5 open
Documentation: 3/6 open
Access Methods: 0/3 open
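The headline percentages can be reproduced from the subscores above, assuming the transparency score is a plain sum of the five category subscores out of 100 and the openness percentage gives half credit to partial items; neither rule is stated on this page, so the sketch below is only a consistency check:

```python
# Consistency check for the headline scores above.
# ASSUMPTIONS (not documented on the page): transparency is the plain
# sum of category subscores, and "partial" items count for half credit
# in the openness percentage.

transparency = {
    "Supply Chain": (0, 20),
    "Compliance": (10, 20),
    "Policy": (0, 25),
    "Technical": (25, 25),
    "Ethical & Operational": (0, 10),
}
total = sum(score for score, _ in transparency.values())
cap = sum(maximum for _, maximum in transparency.values())
print(f"Transparency: {total}/{cap} = {100 * total / cap:.0f}%")  # 35%

open_items, partial_items, closed_items = 5, 3, 6
items = open_items + partial_items + closed_items
openness = (open_items + 0.5 * partial_items) / items
print(f"Openness: {100 * openness:.0f}%")  # (5 + 1.5) / 14 -> 46%
```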
Capabilities
Text Generation
Summarization
Translation
Question Answering
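The listing gives no usage instructions for these capabilities. A minimal text-generation sketch with Hugging Face transformers follows; the checkpoint id, the float16 dtype, and the need for trust_remote_code are all assumptions to verify against the actual model page:

```python
# Minimal sketch: text generation with OpenMoE-8B via transformers.
# The model id below is hypothetical; check the real hub page.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "OrionZheng/openmoe-8b"  # assumed id, not confirmed by this listing

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # fit the 8B MoE model in GPU memory
    device_map="auto",
    trust_remote_code=True,      # MoE architectures often ship custom modeling code
)

prompt = "Summarize: Mixture-of-Experts models route tokens to specialist subnetworks."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same generate call covers the listed summarization, translation, and question-answering uses; only the prompt changes.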
Vendor Information
Complete information about the vendor/provider of this AI application
Zheng Zian
Contact Information
EU AI Act Provider Information
Verification Status: Unverified
Compliance & Risk
Get insights into risk by running assessments on this AI application.
Data Categories
Types of data commonly processed by this application
Text Data
User Content
Conversation Logs
Added: January 26, 2026
Updated: January 26, 2026
🇪🇺 European-based Alternatives
Discover AI solutions from European providers
Devstral 2 (123B) by Mistral AI (AI Model)
Magistral Medium 1.1 by Mistral AI (AI Model)
Mistral Large 2 by Mistral AI (AI Model)
Mistral Saba by Mistral AI (AI Model)