
MLOps.community

Podcast by Demetrios Brinkmann
Weekly talks and fireside chats about everything that has to do with the new space emerging around DevOps for Machine Learning aka MLOps aka Machine Learning Operations.

Available Episodes (5 of 398)
  • Navigating Machine Learning Careers: Insights from Meta to Consulting // Ilya Reznik // #286
    Over his 13 years in software engineering, Ilya Reznik has specialized in commercializing machine learning solutions and building robust ML platforms. He has held technical lead and staff engineering roles at premier firms such as Adobe, Twitter, and Meta. Currently, Ilya channels his expertise into his travel startup, Jaunt, while consulting and advising emerging startups.
    Navigating Machine Learning Careers: Insights from Meta to Consulting // MLOps Podcast #286 with Ilya Reznik, ML Engineering Thought Leader at Instructed Machines, LLC.
    // Abstract
    Ilya Reznik shares his insights into machine learning and career development within the field. With over 13 years of experience at leading tech companies such as Meta, Adobe, and Twitter, Ilya emphasizes the limitations of traditional model fine-tuning methods. He advocates for alternatives like prompt engineering and knowledge retrieval, highlighting their potential to enhance AI performance without the drawbacks associated with fine-tuning (a toy sketch of the retrieval pattern follows this entry). His recent discussions at the NeurIPS conference reflect a shift toward practical applications of Transformer models and innovative strategies like curriculum learning. He also shares valuable perspectives on navigating career progression in tech, offering guidance for aspiring ML engineers aiming for senior roles. The episode blends technical expertise with practical career advice, making it a useful resource for professionals in the AI domain.
    // Bio
    Ilya has navigated a diverse career path since 2011, transitioning from physicist to software engineer, data scientist, ML engineer, and now content creator. He is passionate about helping ML engineers advance their careers and making AI more impactful and beneficial for society. Previously, Ilya was a technical lead at Meta, where he contributed to 12% of the company’s revenue and managed approximately 30 production ML models. He also worked at Twitter, overseeing offline model evaluation, and at Adobe, where his team was responsible for all intelligent services within Adobe Analytics. Based in Salt Lake City, Ilya enjoys the outdoors, tinkering with Arduino electronics, and, most importantly, spending time with his family.
    // MLOps Swag/Merch
    https://shop.mlops.community/
    // Related Links
    Website: mlepath.com
    --------------- ✌️ Connect With Us ✌️ -------------
    Join our Slack community: https://go.mlops.community/slack
    Follow us on Twitter: @mlopscommunity
    Sign up for the next meetup: https://go.mlops.community/register
    Catch all episodes, blogs, newsletters, and more: https://mlops.community/
    Connect with Demetrios on LinkedIn: https://www.linkedin.com/in/dpbrinkm/
    Connect with Ilya on LinkedIn: https://www.linkedin.com/in/ibreznik/
    --------  
    1:00:36
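    A minimal, illustrative sketch of the retrieval pattern mentioned in the abstract above: instead of fine-tuning model weights, retrieve relevant documents and place them in the prompt. The documents, keyword-overlap scoring, and prompt template below are hypothetical stand-ins, not anything Ilya describes; a production system would use embeddings and a vector store.

    from typing import List

    DOCS: List[str] = [
        "Knowledge retrieval grounds a model's answer in external documents.",
        "Prompt engineering shapes model behaviour without updating weights.",
        "Fine-tuning updates model weights and can be costly to maintain.",
    ]

    def score(query: str, doc: str) -> int:
        # Naive keyword overlap; real systems use embedding similarity.
        return len(set(query.lower().split()) & set(doc.lower().split()))

    def build_prompt(query: str, docs: List[str], top_k: int = 2) -> str:
        # Keep only the best-matching documents and prepend them as context.
        ranked = sorted(docs, key=lambda d: score(query, d), reverse=True)
        context = "\n".join(ranked[:top_k])
        return (
            "Answer using only the context below.\n\n"
            f"Context:\n{context}\n\n"
            f"Question: {query}"
        )

    if __name__ == "__main__":
        print(build_prompt("Why prefer knowledge retrieval over fine-tuning?", DOCS))

    The resulting string would then be sent to whatever LLM is in use; only the retrieval corpus changes over time, not the model.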
  • Collective Memory for AI on Decentralized Knowledge Graph // Tomaž Levak // #285
    Tomaž Levak is the Co-founder and CEO of Trace Labs, the core developers of OriginTrail. OriginTrail is a web3 infrastructure project combining a decentralized knowledge graph (DKG) and blockchain technologies to create a neutral, inclusive ecosystem.
    Collective Memory for AI on Decentralized Knowledge Graph // MLOps Podcast #285 with Tomaž Levak, Founder of Trace Labs, Core Developers of OriginTrail.
    // Abstract
    The talk focuses on how the OriginTrail Decentralized Knowledge Graph serves as a collective memory for AI and enables neuro-symbolic AI. We cover the basics of OriginTrail's symbolic AI fundamentals (i.e. knowledge graphs) and go over the details of how decentralization improves data integrity, provenance, and user control. We cover the DKG's role in AI agentic frameworks and how it helps with verifying and accessing diverse data sources while maintaining compatibility with existing standards. We explore practical use cases from the enterprise sector as well as the latest integrations into frameworks like ElizaOS. We conclude by outlining the future potential of decentralized AI, AI becoming the interface that "eats" SaaS, and the general convergence of AI, the Internet, and Crypto.
    // Bio
    Tomaž Levak, founder of OriginTrail, is active at the intersection of Cryptocurrency, the Internet, and Artificial Intelligence (AI). At the core of OriginTrail is the pursuit of a Verifiable Internet for AI, an inclusive framework addressing critical challenges of the world in an AI era. To achieve that goal, OriginTrail's trusted knowledge foundation ensures the provenance and verifiability of information while incentivizing the creation of high-quality knowledge. These advancements are pivotal to unlocking the full potential of AI, as they minimize the technology's shortfalls such as hallucinations, bias, issues of data ownership, and model collapse. Tomaž's contributions to OriginTrail span over a decade and multiple fields. He is involved in strategic technical innovations for the OriginTrail Decentralized Knowledge Graph (DKG) and the NeuroWeb blockchain, and was among the authors of all three foundational White Paper documents that defined how OriginTrail technology addresses global challenges. Tomaž contributed to the design of OriginTrail token economies and is driving adoption with global brands such as the British Standards Institution, Swiss Federal Railways, and the World Federation of Haemophilia, among others. Committed to the ongoing expansion of the OriginTrail ecosystem, Tomaž is a regular speaker at key industry events, where he highlights the value that the OriginTrail DKG brings to diverse sectors, including supply chains, life sciences, healthcare, and scientific research. In a rapidly evolving digital landscape, Tomaž and the OriginTrail ecosystem as a whole are playing an important role in ensuring a more inclusive, transparent, and decentralized AI.
    // MLOps Swag/Merch
    https://shop.mlops.community/
    // Related Links
    Website: https://origintrail.io
    Song recommendation: https://open.spotify.com/track/5GGHmGNZYnVSdRERLUSB4w?si=ae744c3ad528424b
    --------------- ✌️ Connect With Us ✌️ -------------
    Join our Slack community: https://go.mlops.community/slack
    Follow us on Twitter: @mlopscommunity
    Sign up for the next meetup: https://go.mlops.community/register
    Catch all episodes, blogs, newsletters, and more: https://mlops.community/
    Connect with Demetrios on LinkedIn: https://www.linkedin.com/in/dpbrinkm/
    Connect with Tomaž on LinkedIn: https://www.linkedin.com/in/tomazlevak/
    --------  
    53:24
  • Efficient Deployment of Models at the Edge // Krishna Sridhar // #284
    Krishna Sridhar is an experienced engineering leader passionate about building wonderful products powered by machine learning.
    Efficient Deployment of Models at the Edge // MLOps Podcast #284 with Krishna Sridhar, Vice President at Qualcomm. Big shout out to Qualcomm for sponsoring this episode!
    // Abstract
    Qualcomm® AI Hub helps to optimize, validate, and deploy machine learning models on-device for vision, audio, and speech use cases. With Qualcomm® AI Hub, you can:
    - Convert trained models from frameworks like PyTorch and ONNX for optimized on-device performance on Qualcomm® devices.
    - Profile models on-device to obtain detailed metrics including runtime, load time, and compute unit utilization.
    - Verify numerical correctness by performing on-device inference.
    - Easily deploy models using Qualcomm® AI Engine Direct, TensorFlow Lite, or ONNX Runtime.
    The Qualcomm® AI Hub Models repository contains a collection of example models that use Qualcomm® AI Hub to optimize, validate, and deploy models on Qualcomm® devices. Qualcomm® AI Hub automatically handles model translation from the source framework to the device runtime, applies hardware-aware optimizations, and performs physical performance and numerical validation, automatically provisioning devices in the cloud for on-device profiling and inference. A short code sketch of this workflow follows this entry.
    // Bio
    Krishna Sridhar leads engineering for Qualcomm® AI Hub, a system used by more than 10,000 AI developers spanning 1,000 companies to run more than 100,000 models on Qualcomm platforms. Prior to joining Qualcomm, he was Co-founder and CEO of Tetra AI, which made it easy to efficiently deploy ML models on mobile and edge hardware. Before Tetra AI, Krishna helped design Apple's Core ML, a software system mission-critical to running several experiences at Apple, including Camera, Photos, Siri, FaceTime, and Watch, across all major Apple device operating systems and all hardware and IP blocks. He has a Ph.D. in computer science from the University of Wisconsin-Madison and a bachelor's degree in computer science from the Birla Institute of Technology and Science, Pilani, India.
    // MLOps Swag/Merch
    https://shop.mlops.community/
    // Related Links
    Website: https://www.linkedin.com/in/srikris/
    --------------- ✌️ Connect With Us ✌️ -------------
    Join our Slack community: https://go.mlops.community/slack
    Follow us on Twitter: @mlopscommunity
    Sign up for the next meetup: https://go.mlops.community/register
    Catch all episodes, blogs, newsletters, and more: https://mlops.community/
    Connect with Demetrios on LinkedIn: https://www.linkedin.com/in/dpbrinkm/
    Connect with Krishna on LinkedIn: https://www.linkedin.com/in/srikris/
    --------  
    51:33
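    As referenced above, a brief sketch of the convert, profile, and verify workflow using the qai_hub Python client that Qualcomm AI Hub provides. The model choice, device name, and exact argument names here are assumptions and may differ from the current qai_hub release; treat this as an outline under those assumptions, not authoritative usage.

    import torch
    import torchvision
    import qai_hub as hub  # Qualcomm AI Hub Python client (assumed installed and configured)

    # Trace a small PyTorch model so it can be submitted for compilation.
    torch_model = torchvision.models.mobilenet_v2(weights="DEFAULT").eval()
    example_input = torch.rand(1, 3, 224, 224)
    traced_model = torch.jit.trace(torch_model, example_input)

    # Device name is an assumption; any device returned by hub.get_devices() should work.
    device = hub.Device("Samsung Galaxy S23")

    # 1. Convert/compile the model for the target device runtime.
    compile_job = hub.submit_compile_job(
        model=traced_model,
        device=device,
        input_specs=dict(image=(1, 3, 224, 224)),
    )
    target_model = compile_job.get_target_model()

    # 2. Profile on a cloud-provisioned physical device
    #    (runtime, load time, compute-unit utilization).
    profile_job = hub.submit_profile_job(model=target_model, device=device)

    # 3. Run on-device inference to check numerical correctness
    #    against the source model's output.
    inference_job = hub.submit_inference_job(
        model=target_model,
        device=device,
        inputs=dict(image=[example_input.numpy()]),
    )
    on_device_output = inference_job.download_output_data()

    Deployment then targets Qualcomm® AI Engine Direct, TensorFlow Lite, or ONNX Runtime, as described in the abstract.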
  • Real World AI Agent Stories // Zach Wallace // #283
    Real World AI Agent Stories // MLOps Podcast #283 with Zach Wallace, Staff Software Engineer at Nearpod Inc.
    // Abstract
    Demetrios chats with Zach Wallace, engineering manager at Nearpod, about integrating AI agents in e-commerce and edtech. They discuss using agents for personalized user targeting, adapting AI models with real-time data, and ensuring efficiency through clear task definitions. Zach shares how Nearpod streamlined data integration with tools like Redshift and dbt, enabling real-time updates. The conversation covers challenges like maintaining AI in production, handling high-quality data, and meeting regulatory standards. Zach also highlights a cost-efficiency framework for deploying and decommissioning agents and the transformative potential of LLMs in education.
    // Bio
    Software engineer with 10 years of experience. I started my career as an Application Engineer but have since transformed into a Platform Engineer. As a Platform Engineer, I have handled the problems listed below:
    - Localization across 6-7 different languages
    - Building a custom local environment tool for our engineers
    - Building a Data Platform
    - Building standards and interfaces for Agentic AI within ed-tech
    // MLOps Swag/Merch
    https://shop.mlops.community/
    // Related Links
    https://medium.com/renaissance-learning-r-d/data-platform-transform-a-data-monolith-9d5290a552ef
    --------------- ✌️ Connect With Us ✌️ -------------
    Join our Slack community: https://go.mlops.community/slack
    Follow us on Twitter: @mlopscommunity
    Sign up for the next meetup: https://go.mlops.community/register
    Catch all episodes, blogs, newsletters, and more: https://mlops.community/
    Connect with Demetrios on LinkedIn: https://www.linkedin.com/in/dpbrinkm/
    Connect with Zach on LinkedIn: https://www.linkedin.com/in/zachary-wallace/
    --------  
    47:07
  • Machine Learning, AI Agents, and Autonomy // Egor Kraev // #282
    For the last three years, Egor has been bringing the power of AI to bear at Wise, across domains as varied as trading algorithms for Treasury, fraud detection, experiment analysis and causal inference, and, recently, the numerous applications unlocked by large language models. Open-source projects initiated and guided by Egor include wise-pizza, causaltune, and neural-lifetimes, with more on the way.
    Machine Learning, AI Agents, and Autonomy // MLOps Podcast #282 with Egor Kraev, Head of AI at Wise Plc.
    // Abstract
    Demetrios chats with Egor Kraev, principal AI scientist at Wise, about integrating large language models (LLMs) to enhance ML pipelines and humanize data interactions. Egor discusses his open-source MotleyCrew framework, his career journey, and insights into AI's role in fintech, highlighting its potential to streamline operations and transform organizations.
    // Bio
    Egor first learned mathematics in the Russian tradition, then continued his studies at ETH Zurich and the University of Maryland. Egor has been doing data science since last century, including economic and human development data analysis for nonprofits in the US, the UK, and Ghana, and 10 years as a quant, solutions architect, and occasional trader at UBS and then Deutsche Bank. Following the last decade's explosion in AI techniques, Egor became Head of AI at Mosaic Smart Data Ltd, and for the last four years has been bringing the power of AI to bear at Wise, in a variety of domains, from fraud detection to trading algorithms and causal inference for A/B testing and marketing. Egor has multiple side projects, such as RL for molecular optimization, GenAI for generating and solving high school math problems, and others.
    // MLOps Swag/Merch
    https://shop.mlops.community/
    // Related Links
    https://github.com/transferwise/wise-pizza
    https://github.com/py-why/causaltune
    https://www.linkedin.com/posts/egorkraev_a-talk-on-experimentation-best-practices-activity-7092158531247755265-q0kt?utm_source=share&utm_medium=member_desktop
    --------------- ✌️ Connect With Us ✌️ -------------
    Join our Slack community: https://go.mlops.community/slack
    Follow us on Twitter: @mlopscommunity
    Sign up for the next meetup: https://go.mlops.community/register
    Catch all episodes, blogs, newsletters, and more: https://mlops.community/
    Connect with Demetrios on LinkedIn: https://www.linkedin.com/in/dpbrinkm/
    Connect with Egor on LinkedIn: https://www.linkedin.com/in/egorkraev/
    --------  
    1:05:20


About MLOps.community

Weekly talks and fireside chats about everything that has to do with the new space emerging around DevOps for Machine Learning aka MLOps aka Machine Learning Operations.
Podcast website
