The mainstream adoption of artificial intelligence is disrupting enterprises across all industries. For a variety of reasons, most enterprises are keeping some or all of their AI model training and inferencing on-premises, and many are finding that their existing infrastructure cannot support these new workloads.
While much attention has focused on meeting the compute demands of AI, networking and storage present comparable challenges that are just as difficult to address.