
# MoE

PyTorch implementation of the Sparsely-Gated Mixture-of-Experts (MoE) layer.

Reference: Shazeer et al., [Outrageously Large Neural Networks: The Sparsely-Gated Mixture-of-Experts Layer](https://arxiv.org/abs/1701.06538), 2017.
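To illustrate the idea, below is a minimal sketch of a top-k sparsely-gated MoE layer in PyTorch. This is not the repository's actual API: the class name `SparseMoE`, its constructor arguments, and the expert architecture are assumptions for illustration, and the paper's noise term and load-balancing loss are omitted for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseMoE(nn.Module):
    """Illustrative top-k sparsely-gated Mixture-of-Experts layer (sketch)."""

    def __init__(self, input_dim, hidden_dim, num_experts=8, k=2):
        super().__init__()
        self.k = k
        # Gating network: scores each expert per input token.
        self.gate = nn.Linear(input_dim, num_experts, bias=False)
        # Each expert is a small feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(input_dim, hidden_dim),
                nn.ReLU(),
                nn.Linear(hidden_dim, input_dim),
            )
            for _ in range(num_experts)
        )

    def forward(self, x):
        # x: (batch, input_dim)
        logits = self.gate(x)                           # (batch, num_experts)
        topk_vals, topk_idx = logits.topk(self.k, dim=-1)
        # Softmax over only the selected experts; the paper's noisy
        # top-k gating adds a learned noise term before the top-k step.
        weights = F.softmax(topk_vals, dim=-1)          # (batch, k)

        out = torch.zeros_like(x)
        for slot in range(self.k):
            idx = topk_idx[:, slot]                     # expert chosen in this slot
            w = weights[:, slot].unsqueeze(-1)
            for e, expert in enumerate(self.experts):
                mask = idx == e
                if mask.any():
                    # Run only the tokens routed to expert e.
                    out[mask] += w[mask] * expert(x[mask])
        return out


# Usage: route a batch of 32 embeddings through 8 experts with top-2 gating.
moe = SparseMoE(input_dim=64, hidden_dim=256, num_experts=8, k=2)
y = moe(torch.randn(32, 64))
print(y.shape)  # torch.Size([32, 64])
```

The key property is sparsity: each token activates only `k` of the `num_experts` expert networks, so total parameter count can grow far faster than per-token compute.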