Better HN
0 points | swyx | 9mo ago
do LoRAs conflict with your distillation?
sangwulee | 9mo ago
The architecture is the same, so we found that some LoRAs work out of the box, but some don't. In those cases, I would expect people to re-run their LoRA finetuning with the trainer they've used.
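A shape-level sketch of why this can work: a LoRA adapter stores only two low-rank factors sized to the base layer's dimensions, so any base model with the same architecture (including a distilled one) can accept the update. This is an illustrative numpy sketch with hypothetical dimensions, not tied to any particular model or trainer:

```python
import numpy as np

# Hypothetical dims: a LoRA adapter for a (d_out x d_in) weight stores
# only A (r x d_in) and B (d_out x r), plus a scaling factor alpha/r.
d_in, d_out, r, alpha = 64, 64, 8, 16

rng = np.random.default_rng(0)
W_distilled = rng.normal(size=(d_out, d_in))  # distilled base weights
A = rng.normal(size=(r, d_in))                # LoRA down-projection
B = np.zeros((d_out, r))                      # LoRA up-projection

# Merging the adapter is just a low-rank update. Note it only requires
# matching layer shapes, not matching base-weight values, which is why
# a same-architecture distilled model can load an existing adapter.
W_merged = W_distilled + (alpha / r) * B @ A

# Whether the merged weights still *behave* well depends on how far the
# distilled weights drifted from the adapter's original base, which is
# why some LoRAs need re-finetuning against the new base.
print(W_merged.shape)
```

With `B` initialized to zero (standard LoRA init) the update is a no-op, so the merge above returns the distilled weights unchanged; a trained adapter would carry a nonzero `B`.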