The Heavybit Library
The Heavybit Library is an extensive catalog of educational content featuring hundreds of hours of expert presentations, insightful podcasts, and articles focused on helping technical founders achieve breakout success.
How to Make Open-Source & Local LLMs Work in Practice
How to Get Open-Source LLMs Running Locally: Heavybit has partnered with GenLab and the MLOps Community, which gathers thousands...
The Future of Coding in the Age of GenAI
What AI Assistants Mean for the Future of Coding: If you only read the headlines, AI has already amplified software engineers...
AI Inference: A Guide for Founders and Developers
What Is AI Inference (And Why Should Devs Care?) AI inference is the process by which machine learning models process previously...
Enterprise AI Infrastructure: Privacy, Maturity, Resources
Enterprise AI Infrastructure: Privacy, Economics, and Best First Steps. The path to perfect AI infrastructure has yet to be...
Generationship Ep. #22, Back to the Real World with Elijah Ben Izzy
In episode 22 of Generationship, Rachel Chalmers speaks with Elijah Ben Izzy, CTO at Dagworks. Elijah emphasizes the importance...
Enterprise AI Infrastructure: Compliance, Risks, Adoption
How Enterprise AI Infrastructure Must Balance Change Management vs. Risk Aversion: 50%-60% of enterprises reportedly “use” AI,...
Generationship Ep. #20, Smells Like ML with Salma Mayorquin and Terry Rodriguez of Remyx AI
In episode 20 of Generationship, Rachel Chalmers is joined by Salma Mayorquin and Terry Rodriguez of Remyx AI. Together they...
Generationship Ep. #18, Intelligence on Tap with Shawn "swyx" Wang
In episode 18 of Generationship, Rachel Chalmers sits down with Shawn "swyx" Wang to delve into AI Engineering. Shawn shares his...
Machine Learning Lifecycle: Take Projects from Idea to Launch
Machine learning is the process of teaching algorithms to make predictions based on a specific dataset. ML...
Machine Learning Model Monitoring: What to Do In Production
Machine learning model monitoring is the process of continuously tracking and evaluating the performance of a machine learning...
The Data Pipeline is the New Secret Sauce
Why Data Pipelines and Inference Are AI Infrastructure’s Biggest Challenges: While there’s still great excitement around AI and...
Generationship Ep. #16, Brains in Jars with Raiya Kind, PhD
In Episode 16 of Generationship, Rachel Chalmers hosts Raiya Kind, PhD. Together they delve into the human-AI paradigm through...