Better HN
XenophileJKO
2mo ago
Context is the information you give the model; attention is which parts of it the model focuses on.
And that focus is finite in capacity and emergent from the architecture.
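A minimal numpy sketch of the distinction, assuming standard scaled dot-product attention (the usual transformer mechanism; the function name and toy shapes here are illustrative, not from the thread). The whole context is present as keys/values, but the softmax weights decide how much each context token actually contributes to the output:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Each query scores every context token; softmax turns scores into focus weights."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of the query to each context token
    # numerically stable softmax: rows sum to 1
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights  # output is a weighted mix of the context

# toy "context" of 4 tokens with 2-dim features
rng = np.random.default_rng(0)
K = V = rng.normal(size=(4, 2))
Q = rng.normal(size=(1, 2))
out, w = scaled_dot_product_attention(Q, K, V)
```

Every context token gets a nonzero weight, but the weights are sharply unequal, which is the sense in which attention "focuses" on a subset of the context without literally discarding the rest.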
lwhi
2mo ago
So attention is based on a smaller subset of context?