The GPUs and other chips used to train AI communicate with each other inside data centers through “interconnects.” But those interconnects have limited bandwidth, which constrains AI training performance.
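To see why bandwidth matters, consider a rough sketch of the time spent synchronizing gradients in data-parallel training. The function below uses the standard ring all-reduce cost model (each GPU sends roughly 2(n−1)/n of the gradient bytes); the parameter counts, byte widths, and bandwidth figures in the usage example are illustrative assumptions, not figures from the original text.

```python
def allreduce_time_seconds(param_count: float, bytes_per_param: int,
                           bandwidth_gb_s: float, num_gpus: int) -> float:
    """Estimate the time one ring all-reduce of the gradients takes.

    Assumes a bandwidth-bound ring all-reduce, where each GPU
    transfers about 2 * (n - 1) / n of the total gradient bytes.
    """
    grad_bytes = param_count * bytes_per_param
    traffic_per_gpu = 2 * (num_gpus - 1) / num_gpus * grad_bytes
    return traffic_per_gpu / (bandwidth_gb_s * 1e9)


# Illustrative example: a 7B-parameter model with fp16 (2-byte) gradients
# on 8 GPUs. A fast intra-node interconnect (assumed 400 GB/s) finishes
# the sync far sooner than a slower link (assumed 25 GB/s).
fast = allreduce_time_seconds(7e9, 2, 400, 8)
slow = allreduce_time_seconds(7e9, 2, 25, 8)
```

With these assumed numbers, the slow link takes roughly 16x longer per synchronization step, which is exactly the kind of gap that makes interconnect bandwidth a bottleneck for training.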