Moving AI from the Data Center to the Edge – Intel Chip Chat – Episode 663

 

In this Intel Chip Chat audio podcast: As AI expands into the mainstream, how will it succeed at the edge? Matt Jacobs, Senior Vice President of Commercial Systems at Penguin Computing, Inc., talks about the growth of AI, its move out of the data center, and easing customers into the edge space.

An established leader in data center and HPC solutions, Penguin takes a system-level view of the move to AI, a trend it has seen developing over the past two years. In the next 18 months, symbiotic technologies will converge to create "real growth opportunity," in Jacobs' words.

Deploying AI at the edge comes with its own needs, which differ from those of traditional data centers. Easily maintained, low-power environments tuned to their workloads are key, and the spread of processing power from the data center to the near edge and far edge calls for software that ensures workload portability.

New compute platforms, such as 2nd Generation Intel Xeon Scalable processors with built-in AI acceleration, suit the unique requirements of workloads at the edge. The upcoming Intel oneAPI software, which will simplify programming across diverse compute engines, will support targeting workloads at varying levels of capability.

For more about Penguin Computing, Inc., visit:
penguincomputing.com

Information about Intel technologies is available at:
intel.com
intel.com/ai

Intel technologies’ features and benefits depend on system configuration and may require enabled hardware, software or service activation. Performance varies depending on system configuration. No product or component can be absolutely secure. Check with your system manufacturer or retailer or learn more at:
intel.com
