#51 A DevOps Angle to Data Mesh and WePay's Journey - Interview w/ Chris Riccomini
Data Mesh Radio - A podcast by Data as a Product Podcast Network
Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/

Please Rate and Review us on your podcast app of choice!

If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here. Episode list and links to all available episode transcripts here.

Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.

Transcript for this episode (link) provided by Starburst. See their Data Mesh Summit recordings here and their great data mesh resource center here.

What the Heck is a Data Mesh?! Post: https://cnr.sh/essays/what-the-heck-data-mesh
Chris' Twitter: https://twitter.com/criccomini
Chris' LinkedIn: https://www.linkedin.com/in/riccomini/
Chris' website: https://cnr.sh/
The Missing README book: https://themissingreadme.com/

In this episode, Scott interviewed Chris Riccomini, a Software Engineer, Author, and Investor. Chris led the infrastructure team at WePay when the company embarked on its data mesh journey, and he wrote a well-regarded post on thinking about data mesh in DevOps terms.

Like a number of people and organizations that have come on the podcast, Chris was pursuing the general goals of data mesh at WePay and applying some of its approaches as well - but not nearly as cohesively as Zhamak laid things out.

Their initial setup had two teams managing the pipeline and transformation infrastructure: Chris' team mostly handled the extracting and loading, and a team of analytics engineers handled the transformations. The transformation team saw a major increase in demand and quickly became overloaded - a bottleneck. Chris' team also started to get overloaded, so they knew they had to evolve.

One way the team started to address the bottlenecks was by decentralizing the pipelines: teams could make a request and a scalable, reliable pipeline would essentially be set up for them automatically. Because WePay is in the financial services space, teams could mark their sensitive/PII columns as part of those pipelines, and the infra team also added autodetection capabilities to catch anything that was missed (an illustrative sketch of what such a request might look like appears at the end of these notes).

WePay also created a "canonical data representation," or CDR, which is roughly analogous to a data product in data mesh. Chris really liked WePay's use of embedded analytics engineers to serve as data product developers.

One key innovation for WePay was tooling to...
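For a concrete picture of the self-serve pipeline request and PII marking described above, here is a minimal, purely illustrative Python sketch. All names, fields, and detection patterns are assumptions made for illustration; the episode does not describe WePay's actual tooling.

```python
# Illustrative sketch of a self-serve pipeline request with PII column marking
# and a simple name-based autodetection fallback. All names, fields, and
# patterns are hypothetical, not WePay's actual implementation.
import re
from dataclasses import dataclass, field


@dataclass
class PipelineRequest:
    """What a team might submit to get an extract-and-load pipeline provisioned."""
    source_table: str
    destination_dataset: str
    owner_team: str
    # Columns the requesting team has explicitly marked as sensitive/PII.
    declared_pii_columns: set[str] = field(default_factory=set)


# Naive name-based autodetection as a safety net behind the team's declarations.
PII_NAME_PATTERNS = [
    re.compile(p, re.IGNORECASE)
    for p in (r"ssn", r"tax_id", r"email", r"phone", r"birth_date", r"card_number")
]


def detect_pii_columns(column_names: list[str]) -> set[str]:
    """Flag columns whose names look like PII, in case a team missed one."""
    return {
        col for col in column_names
        if any(pattern.search(col) for pattern in PII_NAME_PATTERNS)
    }


def plan_pipeline(request: PipelineRequest, source_columns: list[str]) -> dict:
    """Combine declared and autodetected PII so masking happens before load."""
    pii = request.declared_pii_columns | detect_pii_columns(source_columns)
    return {
        "source": request.source_table,
        "destination": request.destination_dataset,
        "owner": request.owner_team,
        "masked_columns": sorted(pii),
    }


if __name__ == "__main__":
    req = PipelineRequest(
        source_table="payments.transactions",
        destination_dataset="analytics.payments",
        owner_team="payments-analytics",
        declared_pii_columns={"cardholder_name"},
    )
    cols = ["transaction_id", "amount", "cardholder_name", "card_number", "email"]
    print(plan_pipeline(req, cols))
```

The point of the sketch is the shape of the workflow: the requesting team declares the sensitive columns it knows about, and autodetection acts as a safety net before the pipeline is provisioned.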
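Similarly, one way to picture why a CDR is analogous to a data product is as a named, versioned, owned schema contract that consumers build against. The sketch below is purely illustrative; the field names are assumptions, not WePay's actual CDR format.

```python
# Purely illustrative: a "canonical data representation" treated like a
# data-mesh data product - named, owned, and versioned for its consumers.
from dataclasses import dataclass


@dataclass(frozen=True)
class CanonicalDataRepresentation:
    name: str                # the entity in its canonical, consumer-facing shape
    version: int             # bumped on breaking schema changes
    owner_team: str          # the domain team accountable for the data
    schema: dict[str, str]   # column name -> type, as exposed to consumers


payment_cdr = CanonicalDataRepresentation(
    name="payment",
    version=2,
    owner_team="payments",
    schema={"payment_id": "STRING", "amount_cents": "INT64", "status": "STRING"},
)
```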