LoRA Variants Surveys
1 Timeline Order

Summarizes the reviewed literature in chronological order.

2023

📝【EMNLP 2023 - Main】Sparse Low-rank Adaptation of Pre-trained Language Models (SoRA) (Tsinghua University, The University of Chicago)

- Subject: Adaptive rank selection
- Problem: Standard LoRA uses a fixed, inflexible rank (hyperparameter $r$), which requires expensive manual tuning.
- Core idea: Make the rank learnable rather than fixed.
- Mechanism:
  - Gating: introduces an optimizable gating unit on the low-rank matrices.
  - Optimization: updates the gates with a proximal gradient method.
  - Dynamics: automatically prunes less important ranks during training.
- Result: Eliminates discrete rank search; the model discovers its own optimal rank structure.
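The gating mechanism above amounts to a per-rank scaling of the LoRA update plus a soft-threshold proximal step on the gates. A minimal NumPy sketch under that reading (function names, shapes, and hyperparameters here are illustrative assumptions, not the authors' reference code):

```python
import numpy as np

def sora_forward(x, W0, A, B, g):
    """Gated low-rank forward pass: y = x W0^T + (x A^T) diag(g) B^T.

    x: (batch, d_in); W0: frozen (d_out, d_in) pre-trained weight;
    A: (r, d_in) and B: (d_out, r) low-rank factors; g: (r,) gate vector.
    A rank whose gate is exactly zero contributes nothing and can be pruned.
    """
    return x @ W0.T + ((x @ A.T) * g) @ B.T

def proximal_step(g, lr, lam):
    """Soft-thresholding: the proximal operator of lam * ||g||_1.

    Applied to the gates after each gradient update; it drives
    unimportant gates exactly to zero, pruning those ranks.
    """
    return np.sign(g) * np.maximum(np.abs(g) - lr * lam, 0.0)

def active_rank(g):
    """Number of surviving (non-zero) ranks."""
    return int(np.count_nonzero(g))
```

During training one would alternate ordinary gradient steps on `A`, `B`, and `g` with `proximal_step` on `g`; the sparsity weight `lam` controls how aggressively ranks are pruned, replacing a discrete search over `r`.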