How Open Source Transformers Are Accelerating AI – Conversations in the Cloud – Episode 261
Saved series ("Inactive feed" status)

When? This feed was archived on September 08, 2022 20:29. The last successful fetch was on August 08, 2022 23:26.

Why? Inactive feed status. A server problem temporarily prevented the podcast from being fetched.

What now? You might be able to find a more up-to-date version using the search function. This series will no longer be checked for updates. If you believe this to be in error, check whether the publisher's feed link below is valid, then contact support to request that the feed be restored or to raise any other concerns.
Manage episode 306325954 series 29991
In this Intel Conversations in the Cloud audio podcast: Jeff Boudier from Hugging Face joins host Jake Smith to talk about the company’s open source machine learning transformers library (formerly known as “pytorch-pretrained-bert”). Jeff talks about how transformers have accelerated the proliferation of natural language processing (NLP) models and their future use in object detection and other machine learning tasks. He goes into detail about Optimum, an open source library to train and run models on specific hardware, like Intel Xeon CPUs, and the benefits of the Intel Neural Compressor, which is designed to help deploy low-precision inference solutions. Jeff also announces Hugging Face’s new Infinity solution, which integrates the inference pipeline to achieve results in milliseconds wherever Docker containers can be deployed.
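The transformers library discussed in the episode can be tried in a few lines. Below is a minimal sketch using its `pipeline` API, which bundles tokenization, model inference, and post-processing behind a single call (the episode itself does not walk through code; the task and input string here are illustrative):

```python
# Minimal sketch of the Hugging Face transformers pipeline API.
# Assumes `pip install transformers` plus a backend such as PyTorch,
# and network access to download a pretrained model on first use.
from transformers import pipeline

# "sentiment-analysis" is one of the library's built-in NLP tasks;
# a default pretrained model is downloaded the first time it runs.
classifier = pipeline("sentiment-analysis")

# The pipeline handles tokenization, inference, and post-processing,
# returning one dict per input with "label" and "score" keys.
result = classifier("Open source transformers are accelerating AI.")
print(result)
```

The same `pipeline` interface covers other tasks mentioned in the conversation, such as object detection, by swapping the task name.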
For more information, visit:
hf.co
Follow Jake on Twitter at:
twitter.com/jakesmithintel
98 episodes