Better HN
0 points
Tostino
1y ago
You could also just continue pre-training an existing foundation model. That would still be cheaper than starting from zero.
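A minimal sketch of what "continued pre-training" means mechanically: take weights that are already trained and keep optimizing the same next-token objective on new in-domain data. The toy model, corpus, and hyperparameters below are all illustrative assumptions, not anyone's actual setup; a real run would load a released checkpoint and stream domain text through the same causal-LM loss.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in "foundation model": a tiny next-token LM whose weights we
# pretend are already pretrained. In practice you would load a checkpoint.
vocab, dim = 50, 32
model = nn.Sequential(nn.Embedding(vocab, dim), nn.Linear(dim, vocab))

# Hypothetical in-domain corpus, already tokenized to ids.
data = torch.randint(0, vocab, (64, 16))
inputs, targets = data[:, :-1], data[:, 1:]

opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

def step():
    # Same causal-LM objective as pre-training: predict the next token.
    logits = model(inputs)  # (batch, seq, vocab)
    loss = loss_fn(logits.reshape(-1, vocab), targets.reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

first = step()
for _ in range(200):
    last = step()

# The "foundation" weights adapt to the new corpus: loss drops.
print(last < first)
```

The point of the sketch is that nothing structural changes between pre-training and continued pre-training; only the starting weights and the data differ, which is where the cost saving comes from.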
a-s-k-af
1y ago
The accuracy you get from fine-tuning or distillation is usually better than from continued pre-training of an existing model, and the gap is even clearer when you plot accuracy against cost.
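For contrast with continued pre-training, here is a minimal sketch of the distillation side of the comparison: a small student is trained to match a frozen teacher's softened output distribution via a KL loss. The teacher/student shapes, temperature, and data are illustrative assumptions only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Hypothetical frozen teacher (larger MLP) and small student.
teacher = nn.Sequential(nn.Linear(8, 64), nn.ReLU(), nn.Linear(64, 10))
student = nn.Linear(8, 10)
for p in teacher.parameters():
    p.requires_grad_(False)

x = torch.randn(256, 8)   # unlabeled transfer set; no ground-truth labels needed
T = 2.0                   # temperature softens the teacher's distribution
opt = torch.optim.Adam(student.parameters(), lr=1e-2)

def kd_loss():
    # Student is trained to match the teacher's soft targets, not hard labels.
    with torch.no_grad():
        soft_targets = F.softmax(teacher(x) / T, dim=-1)
    log_probs = F.log_softmax(student(x) / T, dim=-1)
    return F.kl_div(log_probs, soft_targets, reduction="batchmean") * T * T

first = kd_loss().item()
for _ in range(300):
    opt.zero_grad()
    loss = kd_loss()
    loss.backward()
    opt.step()

# Student moves toward the teacher's behavior at a fraction of the size.
print(kd_loss().item() < first)
```

Because the student only has to imitate an existing model on a fixed transfer set, the compute bill is a small fraction of any pre-training run, which is the cost axis of the comparison above.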