Topic: mixtral
🐳 Aurora is a Chinese-language MoE model. It is a further work built on Mixtral-8x7B that activates the model's Chinese open-domain chat capability.
llm · large-language-models · language-model · gpt · fine-tuning · lora · instruction-tuning · qlora · chinese · mixtral · mixtral-8x7b · mixtral-8x7b-instruct · Python
Updated 7 months ago
An efficient, flexible and full-featured toolkit for fine-tuning large models (InternLM, Llama, Baichuan, Qwen, ChatGLM)
llm · large-language-models · chatbot · llama · llama2 · agent · peft · qwen · conversational-ai · llava · llm-training · mixtral · chatglm · baichuan · chatglm2 · chatglm3 · internlm · msagent · supervised-finetuning · Python
Updated 7 months ago
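
Both repositories above center on parameter-efficient fine-tuning (the lora/qlora/peft tags). As a rough illustration of what that workflow looks like, here is a minimal sketch of QLoRA-style fine-tuning setup using the Hugging Face transformers and peft libraries; it is not the API of either repository listed above, and the model name and LoRA hyperparameters are placeholder assumptions.

```python
# Minimal QLoRA-style setup sketch (assumptions: transformers, peft, and
# bitsandbytes are installed; model name and hyperparameters are placeholders,
# NOT taken from the repositories listed above).
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model_name = "mistralai/Mixtral-8x7B-Instruct-v0.1"  # placeholder base model

# Load the base model in 4-bit precision (the "Q" in QLoRA) to cut memory use.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    quantization_config=bnb_config,
    device_map="auto",
)
model = prepare_model_for_kbit_training(model)

# Attach LoRA adapters: only small low-rank matrices are trained,
# while the quantized base weights stay frozen.
lora_config = LoraConfig(
    r=16,                                   # rank of the adapter matrices
    lora_alpha=32,                          # scaling factor
    target_modules=["q_proj", "v_proj"],    # assumed attention projections
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # reports the small trainable fraction
```

From here, the wrapped model can be passed to a standard training loop or trainer; toolkits like the one above package this setup together with data pipelines and distributed-training configuration.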