
Ask HN: Does Claude use 'prior' in a Bayesian sense more than English?

sslake • 2 days ago • 6 comments
Just an observation: when asked to summarize articles or extract insights, Claude uses the word 'prior' a lot more often than typical English writing does (e.g., journalistic prose). And it's clearly using it in a Bayesian sense, because it keeps saying things like 'updating priors', 'the prior doesn't hold', etc.

Probably something I noticed after reading the 'goblin' and 'gremlin' article.
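The observation is a frequency claim, so it can be sanity-checked by counting occurrences of 'prior'/'priors' per 1,000 words in a model-generated summary versus a baseline passage. A minimal sketch, where `prior_rate` is a hypothetical helper and the two sample strings are made up to stand in for real text:

```python
import re

def prior_rate(text: str) -> float:
    """Occurrences of 'prior' or 'priors' per 1,000 words."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in ("prior", "priors"))
    return 1000 * hits / len(words)

# Hypothetical samples: a Claude-style summary vs. a journalistic paragraph.
claude_like = ("This updates my prior on the result. The prior doesn't hold "
               "once we condition on the new evidence, so revise your priors.")
journalistic = ("Officials said the report would be released next week, "
                "following months of review by the committee.")

print(prior_rate(claude_like), prior_rate(journalistic))
```

On real data you'd want much larger samples and a proper baseline corpus, but the comparison itself is just these two rates.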



Discussion (6 Comments)

nivertech•2 days ago
AI talk is turning into Silicon Valley pseudo-math slang: priors, exponentials, latent space.

You get lines like "no priors" or "embracing exponentials" that sound smart but mostly signal status.

Same move as N. Taleb and "convexity": a real idea turned into a generic intellectual flex.

bjourne•2 days ago
Probably? Reinforcement learning creates bots with specific styles. For example, ChatGPT is very fond of "typically", "unpack this", and "if you want".
ex-aws-dude•2 days ago
Once again, a post with literally 3 points, 2 hours old, is at the top of /ask.

Why is the HN algorithm such ass? Can we talk about that?

pbkompasz•2 days ago
Well it did have Claude both in the title and the description...