Optical interconnects for compute chips
Scale has been shown to be one of the best levers for improving machine intelligence, and interconnect bandwidth is the key limiting factor for scaling neural networks on silicon hardware. Fathom is building computing hardware that connects chips at higher bandwidth and at larger scale.
The only example of human-level intelligence is the human brain, which has ~125 trillion synapses. This is orders of magnitude more than today’s largest artificial neural networks. Fathom Radiant was founded to help safely bridge this gap.
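To put "orders of magnitude" in rough numbers, here is a back-of-envelope comparison; the model size used is an assumed illustrative figure for a large contemporary network, not a number from Fathom.

```python
import math

# Rough order-of-magnitude comparison; the model size below is an assumed
# illustrative figure, not a number from Fathom.
brain_synapses = 125e12       # ~125 trillion synapses (figure cited above)
large_model_params = 1e11     # assume ~100 billion parameters for a large model

gap = brain_synapses / large_model_params
print(f"gap: ~{gap:.0f}x, i.e. ~{math.log10(gap):.1f} orders of magnitude")
# -> gap: ~1250x, i.e. ~3.1 orders of magnitude
```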
Fathom believes the main limitation to scaling neural networks is the interconnect technology of traditional electronic computers: put simply, the challenge is moving bits around. By combining the complementary strengths of optics and electronics, they've created a low-latency, high-bandwidth, low-power optical fabric that can scale to millions of compute nodes. This enables computing clusters with PB/s of long-range bandwidth at watt-level power consumption.
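To make those headline figures concrete, interconnect power is roughly bandwidth times energy per bit. The sketch below works through that arithmetic; the ~5 pJ/bit value is an assumed ballpark for conventional long-reach electrical links, not a Fathom figure, and the 1 W budget simply takes "watt-level" at face value.

```python
# Back-of-envelope: interconnect power ~= bandwidth (bits/s) * energy per bit (J/bit).
ONE_PB_PER_S = 1e15 * 8  # 1 PB/s expressed in bits per second

def interconnect_power_watts(bandwidth_bits_per_s: float, energy_per_bit_joules: float) -> float:
    """Power required to move data at a given rate and per-bit energy cost."""
    return bandwidth_bits_per_s * energy_per_bit_joules

# Assume ~5 pJ/bit for a conventional long-reach electrical link (ballpark figure).
print(f"1 PB/s at 5 pJ/bit: {interconnect_power_watts(ONE_PB_PER_S, 5e-12) / 1e3:.0f} kW")
# -> 1 PB/s at 5 pJ/bit: 40 kW

# Conversely, a ~1 W budget at 1 PB/s implies roughly 0.1 fJ per bit moved.
print(f"energy/bit for 1 PB/s in 1 W: {1.0 / ONE_PB_PER_S * 1e15:.2f} fJ")
# -> energy/bit for 1 PB/s in 1 W: 0.12 fJ
```

Read this way, delivering PB/s-scale bandwidth within a watt-level budget requires cutting the energy cost of moving each bit by several orders of magnitude relative to conventional electrical links, which is the gap the optical fabric targets.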