Microsoft’s Tutel optimizes mixture of experts model training
Nov 23, 2021 | Venture Beat


Microsoft this week announced Tutel, a library to support the development of mixture of experts (MoE) models, a particular type of large-scale AI model. Tutel, which is open source and has been integrated into fairseq, Facebook's sequence modeling toolkit for PyTorch, is designed to enable developers across AI disciplines to "execute MoE more easily and efficiently," Microsoft says.

MoE models are made up of small clusters of "neurons" that are active only under specific circumstances. Lower layers of the model extract features, and a routing mechanism calls upon individual experts to evaluate those features. For example, an MoE can power a translation system, with each expert cluster learning to handle a separate part of speech or grammatical rule.
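The sparse-activation idea described above can be sketched in a few lines. The following is a minimal, illustrative toy (not Tutel's or fairseq's API): a router scores the experts for a given input, and only the single top-scoring expert is actually run. All function and variable names here are hypothetical.

```python
import numpy as np

def top1_moe(x, gate_w, expert_ws):
    """Toy top-1 mixture-of-experts layer (illustrative sketch).

    x         : (d,) input feature vector
    gate_w    : (d, n_experts) router weights
    expert_ws : list of n_experts weight matrices, each (d, d)

    Only the highest-scoring expert runs for this input, which is
    what makes MoE compute sparse: capacity grows with the number
    of experts, but per-input cost stays roughly constant.
    """
    scores = x @ gate_w                       # router score per expert
    probs = np.exp(scores - scores.max())
    probs = probs / probs.sum()               # softmax over experts
    k = int(np.argmax(probs))                 # select one expert
    return probs[k] * (x @ expert_ws[k]), k   # gated expert output

rng = np.random.default_rng(0)
d, n_experts = 4, 3
out, chosen = top1_moe(
    rng.normal(size=d),
    rng.normal(size=(d, n_experts)),
    [rng.normal(size=(d, d)) for _ in range(n_experts)],
)
```

In a real system the router and experts are trained jointly, and production libraries add load balancing so that no single expert is overused, but the core dispatch logic is the same.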
