
Discussion (6 Comments)

zitterbewegung•about 2 hours ago
This is something I've always thought about: whether you could make an optical computer using MZIs or other technologies that don't have very "exact" requirements on computation. Similar to how LLMs right now are run on consumer devices like a MacBook Pro, where we quantize to 4-bit computations, you could hypothetically run a larger model using MZIs to do inference on those systems.

Since you are only changing the underlying model every so often instead of running a large training loop, once you set up the optical computer to do inference it scales 2n+1, with clock speeds of up to 100 THz at only 100 W of power, versus traditional GPUs at 2 GHz and 1 kW for 15k cores.
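The 4-bit quantization the comment refers to can be sketched in a few lines. This is a generic symmetric per-tensor scheme, not anything from the article: round each weight to one of 16 integer levels, store the integers plus one scale factor, and dequantize at inference time. The function names and the toy data are illustrative assumptions.

```python
import numpy as np

def quantize_4bit(w):
    """Symmetric per-tensor 4-bit quantization (a common simple scheme).

    Maps floats onto int4 levels [-8, 7] using a single scale factor.
    """
    scale = np.max(np.abs(w)) / 7  # use +/-7 so the mapping is symmetric
    q = np.clip(np.round(w / scale), -8, 7).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from int4 codes."""
    return q.astype(np.float32) * scale

# Toy weight tensor standing in for a real model layer.
rng = np.random.default_rng(0)
w = rng.standard_normal(1024).astype(np.float32) * 0.1

q, s = quantize_4bit(w)
err = np.max(np.abs(w - dequantize(q, s)))
print(f"max abs quantization error: {err:.4f}, step size: {s:.4f}")
```

Rounding to the nearest level bounds the per-weight error by half a quantization step, which is the error budget analog hardware would also have to stay inside.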

LeroyRaz•about 2 hours ago
This reads like something AI-generated...
refulgentis•about 3 hours ago
"That’s not a lab toy. That’s a product-trajectory data point."

sigh. (why? because now I have to guess how much is vague handwaving, or an AI trying to fit a square peg into a round hole, and how much is reality)

refulgentis•about 3 hours ago
I get the "AI uses < 32 bit weights sometimes" thing, intimately, but I feel like I'm missing:

A) Why that means calculations can be imprecise. The weights are data stored in RAM; is the idea that we'd use > N-bit weights and call them effectively N-bit due to imprecision, so we're good? Because that would cancel out the advantage of using < N-bit weights (which, of course, is fine if B) has a strong answer).

B) A) aside, why is photonics preferable?
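One way to make the trade-off in A) concrete is to compare quantization error against added analog noise numerically. This is my own toy framing, not from the article: if the analog noise floor sits well below half a 4-bit quantization step, the imprecision mostly hides inside error the model already tolerates. The noise level `sigma` here is an arbitrary assumption.

```python
import numpy as np

rng = np.random.default_rng(1)
w = rng.standard_normal(4096) * 0.1         # toy float weights

scale = np.max(np.abs(w)) / 7               # 4-bit symmetric step size
wq = np.round(w / scale) * scale            # weights after 4-bit quantization

sigma = scale / 10                          # assumed analog noise, well under step/2
wn = wq + rng.normal(0.0, sigma, w.shape)   # noisy "analog" realization of wq

quant_err = np.sqrt(np.mean((w - wq) ** 2))  # RMSE from quantization alone
total_err = np.sqrt(np.mean((w - wn) ** 2))  # RMSE from quantization + noise
print(f"quantization RMSE {quant_err:.4f} vs quantized+noisy RMSE {total_err:.4f}")
```

Because the two error sources are independent, the combined RMSE is roughly the root-sum-square of the parts, so noise an order of magnitude below the step barely moves the total. That doesn't answer B), but it shows why analog imprecision isn't automatically a dealbreaker for already-quantized models.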

irickt•about 3 hours ago
An interesting and accessible article on the increased plausibility of photonic compute.