The Heavybit Library
The Heavybit Library is an extensive catalog of educational content featuring hundreds of hours of expert presentations, insightful podcasts, and articles focused on helping technical founders achieve breakout success.
Browse
Generationship Ep. #22, Back to the Real World with Elijah Ben Izzy
In episode 22 of Generationship, Rachel Chalmers speaks with Elijah Ben Izzy, CTO at Dagworks. Elijah emphasizes the importance...

Enterprise AI Infrastructure: Compliance, Risks, Adoption
How Enterprise AI Infrastructure Must Balance Change Management vs. Risk Aversion. 50%-60% of enterprises reportedly “use” AI,...

Generationship Ep. #20, Smells Like ML with Salma Mayorquin and Terry Rodriguez of Remyx AI
In episode 20 of Generationship, Rachel Chalmers is joined by Salma Mayorquin and Terry Rodriguez of Remyx AI. Together they...

Enterprise AI Infrastructure: Privacy, Maturity, Resources
Enterprise AI Infrastructure: Privacy, Economics, and Best First Steps. The path to perfect AI infrastructure has yet to be...

Machine Learning Lifecycle: Take Projects from Idea to Launch
Machine learning is the process of training algorithms to make predictions based on a specific dataset. ML...

Machine Learning Model Monitoring: What to Do In Production
Machine learning model monitoring is the process of continuously tracking and evaluating the performance of a machine learning...

O11ycast Ep. #73, AI’s Impact on Observability with Animesh Koratana of PlayerZero
In episode 73 of o11ycast, Jessica Kerr, Martin Thwaites, and Austin Parker speak with Animesh Koratana, founder and CEO of...

Generationship Ep. #15, Mother of All Life with Gülin Yilmaz
In episode 15 of Generationship, Rachel Chalmers sits down with Gülin Yilmaz of Rosette Health. This episode dives into the...

The Future of Coding in the Age of GenAI
What AI Assistants Mean for the Future of Coding. If you only read the headlines, AI has already amplified software engineers...

AI Inference: A Guide for Founders and Developers
What Is AI Inference (And Why Should Devs Care?) AI inference is the process of machine learning models processing previously...

The Data Pipeline is the New Secret Sauce
Why Data Pipelines and Inference Are AI Infrastructure’s Biggest Challenges. While there’s still great excitement around AI and...

O11ycast Ep. #71, Evaluating LLM-based Apps with Shir Chorev of Deepchecks
In episode 71 of o11ycast, Jessica Kerr and Austin Parker sit down with Shir Chorev to delve into the nuances of incorporating...