If something goes wrong inside the EU, this is who to contact.
Not needed — the maker is already based in Europe.
by Zheng Zian
OpenMoE is a text AI model developed by Zheng Zian. This limited-performance model is based on OpenMoE-8B, is licensed under Apache-2.0, and was released in July 2023. It is an early model aiming to ignite the open-source MoE community.
Built by Zheng Zian. Your data passes through them.
We watch for changes to their terms so you don't have to.
The kinds of information this AI typically takes in
The people and company behind this AI
What this maker has officially told EU regulators about how their AI works.
These four sections are required by the EU AI Act. They are intended for auditors and compliance teams, so feel free to skim.
Review recommended before use
Discover EU-based alternatives for this AI application.