Topic: data-parallelism
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
Tags: machine-learning, pytorch, deep-learning, inference, mixture-of-experts, gpu, trillion-parameters, zero, billion-parameters, compression, data-parallelism, model-parallelism, pipeline-parallelism. Language: Python
Updated 7 months ago
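The entry above is tagged data-parallelism and zero. As a rough conceptual sketch of plain data parallelism (a toy illustration of the general technique, not DeepSpeed's actual API; all function names here are hypothetical): each worker computes gradients on its own shard of the batch, and the per-worker gradients are then averaged (an all-reduce) so every replica applies the same update.

```python
def grad_mse(w, xs, ys):
    # Gradient of mean squared error for the model y ≈ w * x on one shard.
    n = len(xs)
    return sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / n

def data_parallel_step(w, batch_x, batch_y, num_workers, lr=0.1):
    # Shard the batch across workers (assumes the batch divides evenly).
    shard = len(batch_x) // num_workers
    grads = []
    for i in range(num_workers):
        xs = batch_x[i * shard:(i + 1) * shard]
        ys = batch_y[i * shard:(i + 1) * shard]
        grads.append(grad_mse(w, xs, ys))
    # "All-reduce": average the per-worker gradients, then take one SGD step.
    g = sum(grads) / num_workers
    return w - lr * g
```

Because the gradient of a mean decomposes over shards of equal size, the averaged multi-worker gradient equals the full-batch gradient, so the update is identical to single-worker training on the whole batch.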
Making large AI models cheaper, faster and more accessible
Tags: ai, deep-learning, inference, foundation-models, model-parallelism, pipeline-parallelism, data-parallelism, big-model, distributed-computing, heterogeneous-training, hpc, large-scale. Language: Python
Updated 7 months ago