AWS unveils new machine learning chip Trainium
Published 8 December 2020
Amazon Web Services (AWS) has unveiled Trainium, a new high-performance chip for training machine learning models in the cloud.
Trainium is designed to address the limitations of Inferentia, AWS’ machine learning chip that was released last year.
Development teams have been limited by fixed machine learning training budgets, which hampered the scope and frequency of the training needed to improve their models and applications.
The chip will complement Inferentia by offering AWS customers a smooth end-to-end flow of ML compute from scaling training workloads to deploying accelerated inference.
AWS Trainium shares the same AWS Neuron SDK as AWS Inferentia, making it simple to migrate between the chips or use them in combination.
AWS says that Trainium is cost-effective and capable of accommodating deep learning training workloads, including image classification, semantic search, translation, voice recognition, natural language processing and recommendation engines.
Trainium will be available in 2021 via Amazon EC2 instances and AWS Deep Learning AMIs, as well as through managed services including Amazon SageMaker, Amazon ECS, Amazon EKS, and AWS Batch.
By Jie Yee Ong, Tech Reporter