The demand for GPU acceleration, which drives AI across diverse markets, continues to rise. However, keeping up with the latest technology is not always cost-effective. The good news is that AI costs don’t have to be a barrier to entry when legacy components are available.
We help companies balance two often-competing priorities: acquiring the computing capability they need and staying within their budget. Once you evaluate your specific requirements, you may find you don't need the latest technology. In many cases, legacy GPUs deliver a lot more for your money.
CNE recommends a conservative approach to building and managing your company’s IT assets. In a volatile industry, it is easy to overspend on technology. Slow and steady wins the race.
Legacy GPUs remain competitive for AI
Because of their extraordinary parallel processing power, GPUs are central to AI, from today's narrow, task-specific systems to research aimed at more general intelligence. On raw FLOPS for parallel workloads, there is no question that GPUs significantly outperform CPUs. But how do legacy GPUs stack up against newer models?
A recent article on The Next Platform compared the cost per image per second of the Tesla K80, a legacy GPU, against the Tesla V100. A single Tesla K80 card, which has two GPUs, can process approximately 52 images per second and costs about $1,000. The Tesla V100 can process almost 600 images per second, but it costs roughly $11,500. Either way, you pay about $19 per image per second of throughput.
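If you want to run the numbers yourself, a few lines of Python make the comparison explicit. This is just a back-of-the-envelope sketch using the approximate prices and throughput figures quoted above; your own pricing and benchmarks may differ.

# Back-of-the-envelope comparison of price per unit of inference throughput,
# using the approximate figures cited above (not official benchmarks).
cards = {
    "Tesla K80": {"price_usd": 1_000, "images_per_sec": 52},
    "Tesla V100": {"price_usd": 11_500, "images_per_sec": 600},
}

for name, specs in cards.items():
    dollars_per_img_per_sec = specs["price_usd"] / specs["images_per_sec"]
    print(f"{name}: ~${dollars_per_img_per_sec:.2f} per image/sec of throughput")

Both cards land at roughly $19 per image per second of throughput, which is why the decision usually comes down to how much total throughput you actually need rather than which card is newer.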
Evaluated on cost per image per second, there is no doubt that a legacy GPU is a smart choice, and those benefits can increase in both distributed systems and tightly integrated systems. Given the difference in upfront price, legacy GPUs are often the more cost-effective investment.
Evaluating AI costs against your computing needs
Investing in new technology is costly, and trying to keep up with the latest hardware is unrealistic unless you have an unlimited budget. With new units priced at over $10,000 apiece, you'd need to make a significant investment per server for technology that is continually changing.
For many companies, purchasing refurbished or legacy GPUs is just smart business. A legacy GPU can often deliver the computing power you need for a fraction of the upfront cost. You get what you need to get the job done, keeping your AI costs in check, without overinvesting in technology.
Just make sure that when purchasing legacy components, you’re buying from a trusted vendor that offers a warranty so that performance won’t be an issue.
Takeaway
The technology behind deep learning and artificial intelligence is changing quickly, and current capabilities do not yet live up to the full potential of AI. By building out infrastructure that meets or slightly exceeds your current needs, you avoid paying for capacity that sits idle and keep the system you do run highly efficient.
And as your needs expand, you can gradually increase your capabilities. This approach fits well within a circular economy framework and helps you achieve a robust ROI on your IT equipment.
As you continue to build out your infrastructure, take a look at the GPUs we have in stock and complete your own analysis. We’re certain you’ll find that legacy GPUs are a smart investment. Remember, AI costs don’t have to break the bank.