Active open source contributor, interested in LLMs, PL, and their related infrastructure/systems.
Highlights
Pinned
- TUDB-Labs/MixLoRA: State-of-the-art Parameter-Efficient MoE Fine-tuning Method
- TUDB-Labs/mLoRA: An Efficient "Factory" to Build Multiple LoRA Adapters
- TUDB-Labs/MoE-PEFT: An Efficient LLM Fine-Tuning Factory Optimized for MoE PEFT