Hacker News
Impact of Pretraining Word Co-Occurrence on Compositional Generalization in Multimodal Models (arxiv.org)
2 points by badmonster a day ago | 1 comment
badmonster a day ago
A subtle but powerful insight: large multimodal models like CLIP don't just learn individual concepts. They also depend heavily on how often those concepts appear together during pretraining.
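One common way to quantify "appear together" is pointwise mutual information (PMI) over caption text. A minimal sketch of that statistic, assuming a toy caption list and single-word concepts (both placeholders, not the paper's actual pipeline):

```python
import math
from itertools import combinations
from collections import Counter

def concept_pmi(captions, concepts):
    """Estimate PMI between concept pairs, counting a concept as
    present if its word appears in a caption (a simplifying assumption)."""
    n = len(captions)
    single = Counter()   # how many captions contain each concept
    pair = Counter()     # how many captions contain both concepts of a pair
    for cap in captions:
        present = {c for c in concepts if c in cap.lower().split()}
        single.update(present)
        pair.update(combinations(sorted(present), 2))
    # PMI(a, b) = log( p(a, b) / (p(a) * p(b)) ); positive means the pair
    # co-occurs more often than independent occurrence would predict.
    return {
        (a, b): math.log((n_ab / n) / ((single[a] / n) * (single[b] / n)))
        for (a, b), n_ab in pair.items()
    }

# Toy corpus: "red"+"bus" co-occur; "purple"+"bus" never do, so a model
# trained on these captions has far more evidence for one composition.
captions = [
    "a red bus on the street",
    "a red bus at the stop",
    "a purple umbrella in the rain",
    "a bus in traffic",
]
print(concept_pmi(captions, ["red", "purple", "bus", "umbrella"]))
```

The intuition: a pair like "purple bus" with near-zero co-occurrence in pretraining is exactly where compositional generalization gets tested, since the model has seen each concept alone but rarely their combination.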