#79 A Data Success Secret Recipe: Comfort with Ambiguity and Change Management - Interview w/ Vincent Koc

Data Mesh Radio - A podcast by Data as a Product Podcast Network

Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/

Please rate and review us on your podcast app of choice! If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here. Episode list and links to all available episode transcripts here.

Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.

Transcript for this episode (link) provided by Starburst. See their Data Mesh Summit recordings here and their great data mesh resource center here.

Vincent's LinkedIn: https://www.linkedin.com/in/koconder/

In this episode, Scott interviewed Vincent Koc, Head of Data at the merchant platform company hipages.

To start, some big takeaways from Vincent:

- If you aren't comfortable with an agile mindset and ambiguity, the bleeding edge probably isn't for you - and that's okay! You and your organization need to be comfortable with failing, learning, and then iterating.
- To get data mesh - or really any big change initiative in data - right, you should focus on change management much more than you probably think. It ends up being the secret sauce, or the crucial missing factor, far more often than the tech.
- Think problem-specific, not technology-specific. It's easy to over-engineer the problem - technologists want to technology.
- In general, consistency is key to achieving widespread success in data. One domain having a major success won't lead to broader org-wide success if you don't leverage reusability to make consistency across other domains easy - a bunch of great but inconsistent solutions doesn't add up to a valuable whole picture.

For Vincent, every organization considering data mesh should ask whether it is really the correct approach for them. Data mesh isn't right for a large subset of organizations, whether that is right now or ever. If your organization doesn't have an appetite for change, it's going to be very tough to move towards data mesh. If you want to implement data mesh, he recommends embracing an agile methodology, e.g. fast feedback and trial and error.

When thinking about splitting your data monolith into domains, Vincent recommends taking a lot of learnings from what works well in the microservices realm. You shouldn't decompose everything all at once - that just creates chaos. You can split out larger domains one by one and then figure out if you need to split them further when there is more value in doing so. Peel them off instead of taking a big bang approach.

Vincent believes that, in general, ~20% of your teams will consume ~80% of your data team's time and energy. There are a few ways to work with those teams to reduce that, but it is also somewhat a fact of reality....
