A lightweight fine-tuning technique that freezes a model's original weights and trains only small added modules (adapters), rather than retraining the whole model.
"We used tools like Adam and LoRA to create company-specific adapters for general models, so we don't have to retrain an entire model or create our own."