As demand for AI grows, models are expanding at an unprecedented rate, pushing the boundaries of what current hardware can support. Today's landscape is dominated by increasingly large and complex models, with ever larger ones being released, such as NEAR's 1.4T-parameter model. These models require immense computational power and memory. Even though GPUs have grown more powerful in step with them, the traditional approach of vertical scaling is beginning to show its limitations. This a...