0:04 Introduction of Dr Erik Widman
0:57 Introduction by Dr Erik Widman
2:00 What the talk is and is not about
2:50 Acknowledgement
3:06 Table of contents
3:44 Glimpse at "real-world" data science
6:44 Goals of MLOps
7:30 MLOps technical capabilities
8:57 Sample orchestration in an ML deployment
10:22 What is HelloFresh?
13:00 Distribution of HelloFresh's tech department
15:20 HelloFresh's data science journey
17:45 The approach to building an MLOps platform at HelloFresh
19:42 Understanding the data science team structure at HelloFresh
22:37 Mapping MLOps maturity
24:25 Understand the types of models in the organisation
25:30 Survey tools used
26:23 Evaluating HelloFresh with the ML maturity framework
28:13 HelloFresh's model maturity framework
29:30 HelloFresh's MLOps platform vision
30:43 HelloFresh's MLOps product offering
34:25 How SpiceRack alleviates existing data science challenges
35:00 Pain points in choosing MLOps tooling
36:30 MLOps tool selection principles
37:52 How HelloFresh selected their feature store tool
40:35 How to scale tooling selection
42:19 Advice for tool selection
44:50 Design principles for development
46:00 SpiceRack components and high-level experience
47:55 Typical SpiceRack steps
50:34 Release strategy for SpiceRack
51:15 Final advice on building an MLOps platform
55:05 [Question 1] Can you talk about the system tests that you did use to ensure the models are working properly?
55:57 [Question 2] Do you have resources that you would recommend for people who are starting in this to learn the principles you have been talking about on a more specific basis?
Timestamps for: PyData Chicago: Building an MLOps platform at HelloFresh, by Dr. Erik Widman on October 31, 2022