Intel kicked off its two-day Innovation 2023 event today, and it looks like most of the focus will be on artificial intelligence (AI), which is to be expected given how much potential the technology holds.
Among other things, the company demonstrated ChatGPT running in the cloud alongside a 7-billion-parameter model running on a single Xeon system. The system in question is powered by a fifth-generation Xeon processor (codenamed “Emerald Rapids”). Although we are not sure, the 7-billion-parameter model Intel is referring to here may be the Falcon LLM.
While GPT and other LLMs (large language models) are very useful and also fun to play with, they can be very demanding both on the hardware side and in their general resource requirements. For example, a recent study suggested that ChatGPT “drinks” about a pint of water for every 20 prompts or so. Financially, a report earlier this year suggested that ChatGPT could cost nearly three-quarters of a million dollars, or $700,000, per day to run. Naturally, hardware vendors like Intel, AMD and Nvidia see an opportunity here, and that’s why they’re designing next-generation solutions with AI acceleration in mind.
Besides the 5th generation Xeon demo, Intel also teased some of the performance we can expect from the next-generation 6th Gen Xeon, “Granite Rapids”. The company claims a 2-3x improvement, thanks in part to an upgraded memory subsystem: Intel will move to 12 memory channels with support for up to DDR5-8800 MCR DIMMs in the 6th generation Xeon, compared to 8 channels of DDR5-8000 in the 5th generation. The former is planned for release in 2024, while the fifth generation is already sampling to customers.
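As a rough sanity check on that memory-subsystem boost, the peak bandwidth implied by the quoted channel counts and transfer rates can be estimated. This is a back-of-envelope sketch assuming standard 64-bit (8-byte) DDR5 channels; the function name is ours, and real-world throughput will be lower than these theoretical peaks:

```python
def peak_bandwidth_gbs(mt_per_s: int, channels: int, bytes_per_channel: int = 8) -> float:
    """Theoretical peak bandwidth in GB/s: transfers/s x bytes per transfer x channels."""
    return mt_per_s * bytes_per_channel * channels / 1000

# 5th Gen "Emerald Rapids": 8 channels of DDR5-8000
emerald_rapids = peak_bandwidth_gbs(8000, 8)
# 6th Gen "Granite Rapids": 12 channels of DDR5-8800 (MCR DIMMs)
granite_rapids = peak_bandwidth_gbs(8800, 12)

print(f"Emerald Rapids: {emerald_rapids:.1f} GB/s")   # 512.0 GB/s
print(f"Granite Rapids: {granite_rapids:.1f} GB/s")   # 844.8 GB/s
print(f"Uplift: {granite_rapids / emerald_rapids:.2f}x")  # 1.65x
```

So the memory change alone accounts for roughly a 1.65x gain in peak bandwidth; the rest of the claimed 2-3x would have to come from core-count, IPC and other architectural improvements.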