Topic: mixture-of-experts
Unify Efficient Fine-tuning of 100+ LLMs
Topics: llm, gpt, language-model, agent, baichuan, chatglm, fine-tuning, generative-ai, instruction-tuning, large-language-models, llama, lora, mistral, mixture-of-experts, peft, qlora, quantization, qwen, rlhf, transformers · Python
Updated 8 months ago
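This repository's tags include lora, qlora, and peft, which refer to low-rank adaptation methods for parameter-efficient fine-tuning. A minimal sketch of the LoRA idea follows; all names, shapes, and values are illustrative assumptions, not the repository's actual API.

```python
import numpy as np

# Sketch of a LoRA (low-rank adaptation) linear layer:
#   y = W x + (alpha / r) * B A x
# W is the frozen pretrained weight; only the small factors A and B train.
rng = np.random.default_rng(0)

d, r, alpha = 8, 2, 4          # hidden size, LoRA rank, scaling (illustrative)
W = rng.normal(size=(d, d))    # frozen pretrained weight
A = rng.normal(size=(r, d))    # trainable down-projection
B = np.zeros((d, r))           # trainable up-projection, zero-initialized

def lora_forward(x):
    # Low-rank update adds only 2*d*r trainable parameters instead of d*d.
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.normal(size=d)
y = lora_forward(x)
# With B zero-initialized, the adapted layer starts identical to the frozen one.
assert np.allclose(y, W @ x)
```

Because B starts at zero, training begins from the pretrained model's behavior and the adapter learns only a low-rank correction.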
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
Topics: machine-learning, pytorch, deep-learning, inference, mixture-of-experts, gpu, trillion-parameters, zero, billion-parameters, compression, data-parallelism, model-parallelism, pipeline-parallelism · Python
Updated 7 months ago
Mixture-of-Experts for Large Vision-Language Models
Python
Updated 7 months ago
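The repositories above all share the mixture-of-experts tag. The core mechanism is a learned router that sends each input to a small subset of expert networks and combines their outputs. A minimal top-k routing sketch is below; the expert count, k, and dimensions are illustrative assumptions, not taken from any of these projects.

```python
import numpy as np

# Sketch of top-k mixture-of-experts routing: a gate scores the experts,
# only the k best run on the input, and their outputs are mixed by
# renormalized gate weights. Sizes here are illustrative.
rng = np.random.default_rng(1)

n_experts, k, d = 4, 2, 8
gate_W = rng.normal(size=(n_experts, d))              # router weights
experts = [rng.normal(size=(d, d)) for _ in range(n_experts)]  # expert layers

def softmax(z):
    z = z - z.max()                                    # numerical stability
    e = np.exp(z)
    return e / e.sum()

def moe_forward(x):
    logits = gate_W @ x                                # one score per expert
    top = np.argsort(logits)[-k:]                      # indices of top-k experts
    weights = softmax(logits[top])                     # renormalize over top-k
    # Only the selected experts compute; their outputs are weight-averaged.
    return sum(w * (experts[i] @ x) for w, i in zip(weights, top))

x = rng.normal(size=d)
y = moe_forward(x)
assert y.shape == (d,)
```

Sparse routing is what lets models with trillions of total parameters (as in DeepSpeed's tag list) keep per-token compute close to that of a much smaller dense model, since only k of the n experts run per input.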