New system could reduce data-transmission delays across
server farms by 99.6 percent.
Big websites usually maintain their own “data centers,” banks of tens of thousands, or even hundreds of thousands, of servers, all passing data back and forth to field users’ requests. Like any big, decentralized network, a data center is prone to congestion: packets of data arriving at the same router at the same time are put in a queue, and if the queues get too long, packets can be delayed.
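That backlog effect is easy to see in a toy model. The sketch below (Python, with made-up arrival and link rates, not drawn from the system described in this article) simulates a single router’s first-in, first-out queue: whenever packets arrive faster than the outgoing link can drain them, both the backlog and the per-packet queueing delay grow step after step.

```python
# Illustrative only: a toy FIFO queue at one router. The rates below are
# invented for demonstration and do not describe any real data center.
from collections import deque
from statistics import mean

LINK_RATE = 10          # packets the router can forward per time step (assumed)
ARRIVALS_PER_STEP = 12  # packets arriving per time step during a burst (assumed)
STEPS = 20              # length of the simulated burst

queue = deque()
for step in range(STEPS):
    # New packets join the back of the queue, stamped with their arrival time.
    queue.extend([step] * ARRIVALS_PER_STEP)

    # The router drains at most LINK_RATE packets this step, oldest first.
    delays = [step - queue.popleft() for _ in range(min(LINK_RATE, len(queue)))]

    print(f"step {step:2d}: backlog = {len(queue):3d} packets, "
          f"avg queueing delay = {mean(delays):.1f} steps")
```

Because arrivals outpace the link by two packets per step, the printed backlog climbs steadily, and so does the average delay experienced by each forwarded packet, which is exactly the congestion problem the article goes on to address.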