New IBM chip will open up AI opportunities for businesses

IBM wants a bigger piece of the AI hardware pie and has announced a new processor to do just that.

During the recent Hot Chips conference, the chip giant unveiled the details of its new Telum Processor, designed to bring deep learning inference to enterprise workloads.

According to a follow-up press release, the new chip is designed to help address fraud in real time. It's also IBM's first chip with on-chip acceleration for AI inferencing while a transaction is taking place.

The chip has been three years in the making, and IBM believes it will find use in banking, finance, trading and insurance applications, as well as customer interactions.

The company is also planning a Telum-based system for the first half of 2022.

AI demand

The idea behind building such a processor, IBM says, came after a market analysis that found 90% of companies want to be able to build and run AI projects wherever their data resides. Telum is designed to do just that, allowing enterprises to conduct high-volume inferencing for real-time, sensitive transactions without invoking off-platform AI solutions that could impact performance.
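To make that concrete, here is a minimal, hypothetical sketch of in-transaction fraud scoring, the kind of workload IBM says Telum's on-chip acceleration targets. The feature names and model weights below are illustrative placeholders, not IBM's software stack; the point is that the scoring call runs on the same platform as the transaction rather than round-tripping to an off-platform service.

```python
# Hypothetical sketch of in-transaction fraud scoring (not IBM's actual stack).
import numpy as np

# Placeholder logistic-regression weights standing in for a trained fraud model.
WEIGHTS = np.array([0.0021, 1.8, 0.65])
BIAS = -4.2

def score_transaction(amount: float, merchant_risk: float, velocity: float) -> float:
    """Return a fraud probability while the transaction is still in flight.

    With on-chip inferencing, this call runs on the same platform as the
    transaction itself, instead of adding a round trip to an off-platform
    scoring service.
    """
    features = np.array([amount, merchant_risk, velocity])
    return float(1.0 / (1.0 + np.exp(-(features @ WEIGHTS + BIAS))))

if __name__ == "__main__":
    risk = score_transaction(amount=1250.0, merchant_risk=0.7, velocity=3.0)
    print("decline" if risk > 0.9 else "approve", f"(fraud risk {risk:.2f})")
```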

Telum has an “innovative centralized design,” IBM says, allowing users to use it for fraud detection, loan processing, trade clearing and settlement, anti-money laundering and risk analysis. 

The chip comes with eight processor cores and a deep super-scalar, out-of-order instruction pipeline running at a clock frequency of more than 5GHz, making it optimized for the demands of heterogeneous enterprise-class workloads, IBM explains.

Telum also has a completely redesigned cache and chip-interconnection infrastructure that provides 32MB of cache per core and can scale up to 32 Telum chips.

The chip was created in partnership with Samsung, which developed the 7nm EUV technology node it is built on. It is also the first IBM chip to feature technology created by the IBM Research AI Hardware Center.
