
Does our understanding of data bias our analytics outcomes?

66 minutes

About this episode

Why do some data-driven decisions go so disastrously wrong? Ironically, the answer likely isn't found in the data at all, but rather in our subconscious. At a time when companies have never been more data rich, it's often our inherent information biases that doom critical analytics and data science work.

Is it possible to take the bias out of data work? That's the question we ponder in this episode featuring Ciaran Dynes, Chief Product Officer at Matillion.

This episode features
  • What responsibilities companies and people have to curb information bias

  • How hypothesis testing and experimentation can improve data work

  • What’s the most egregious example of information bias in the wild?

Key takeaways

  • Look at the data, but you also need people, context, and relationships to deal with information bias.

  • Always ask whether the data actually supports the hypothesis and the conclusions drawn from it.

  • Tacit collective knowledge may exist within an organization that the data alone doesn't capture.

Special guests

Ciaran Dynes, Chief Product Officer at Matillion