Sponsored Keynote: How to Protect AI Workloads in Cloud Native - Grace Lian, Senior Director of Cloud Software Engineering, Intel
As the rapid growth and widespread adoption of artificial intelligence (AI) continue, the cloud computing platforms that underpin its training and inference are gaining unprecedented attention. Data privacy and regulatory considerations, together with the high value of machine learning models, heighten the need for security. In this talk, we focus on confidential computing, in particular how we enable Intel hardware trusted execution environments in cloud native software to support AI workloads. From confidential clusters to confidential VMs to confidential containers, end users can choose how to protect their AI workloads. We also touch on two new open source projects, Confidential Cloud Native Primitives (CCNP) and Cloud Native AI Pipeline (CNAP), which help realize confidentiality for your data and models throughout your AI journey.
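To make the confidential-containers option more concrete, here is a minimal sketch that uses the official Kubernetes Python client to schedule an AI inference pod onto a confidential runtime class. The runtime class name (kata-qemu-tdx), the container image, and the resource limits are illustrative assumptions, not details from the talk; the actual names depend on how the cluster operator has enabled Intel TDX-backed confidential containers.

```python
# Illustrative sketch: place an AI inference pod in a confidential-containers
# runtime class using the official Kubernetes Python client.
# Assumptions: the cluster already has a TDX-capable runtime class named
# "kata-qemu-tdx", and "example.com/ai/inference-server:latest" is a
# hypothetical inference image.
from kubernetes import client, config


def create_confidential_inference_pod(namespace: str = "default") -> None:
    # Load credentials from the local kubeconfig; inside a cluster you would
    # call config.load_incluster_config() instead.
    config.load_kube_config()

    pod = client.V1Pod(
        metadata=client.V1ObjectMeta(
            name="confidential-inference",
            labels={"app": "ai-inference"},
        ),
        spec=client.V1PodSpec(
            # Selecting a confidential runtime class is what runs the pod
            # inside a hardware trusted execution environment.
            runtime_class_name="kata-qemu-tdx",
            containers=[
                client.V1Container(
                    name="inference",
                    image="example.com/ai/inference-server:latest",
                    resources=client.V1ResourceRequirements(
                        limits={"cpu": "4", "memory": "8Gi"},
                    ),
                )
            ],
            restart_policy="Never",
        ),
    )

    client.CoreV1Api().create_namespaced_pod(namespace=namespace, body=pod)


if __name__ == "__main__":
    create_confidential_inference_pod()
```

With a setup like this, the model and the data it processes stay inside the hardware-protected guest rather than a conventional container sandbox; the same pattern applies whether confidentiality is drawn at the cluster, VM, or container boundary.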
CNCF Overview (Slides)
CNCF (Cloud Native Computing Foundation) was founded in December 2015 and is a non-profit organization under the Linux Foundation.
CNCF (the Cloud Native Computing Foundation) fosters and sustains a vendor-neutral open source ecosystem to promote cloud native technology. We democratize state-of-the-art patterns to make these innovations accessible to everyone. Follow the CNCF WeChat official account.