Curated literature repository and bibliography for Mixture-of-Experts (MoE) research.
DEFENSIBILITY
stars: 663
forks: 46
The project is a standard 'Awesome List' of Mixture-of-Experts (MoE) papers. With 663 stars accumulated over four years, it is a respectable archive of the field's foundational work (e.g., GShard, Switch Transformer), but it has no technical moat. Defensibility is near zero: the value lies entirely in curation, which can be trivially cloned or superseded by newer repositories or by AI-powered research tools such as Perplexity. The current velocity of 0.0/hr suggests the project may be stagnant, a critical failure in the high-velocity MoE space (post-Mixtral and DeepSeek-V3). Frontier labs pose a low risk because they do not compete in bibliography management, but the project faces a high risk of obsolescence from more active community-driven trackers or living survey papers. It serves as a historical reference rather than a living development tool.
TECH STACK
INTEGRATION: reference_implementation
READINESS