Feeling inspired to write your first TDS post? We’re always open to contributions from new authors.
As many of us are entering the final stretch of summer, why not take advantage of the calmer weeks before a typically hectic September kicks in and explore new topics in data science and machine learning?
To help all the learners and skill-growers among our readers, this week we’re presenting a special edition of The Variable, dedicated entirely to our best recent deep dives (and other articles that demand a bit more time and focus than usual). Their reading time might be longer, but they do a fantastic job covering their respective topics with nuance, care, and an eye towards practical applications. We hope you enjoy our selection.
- A Practical Guide to Contrastive Learning
Useful for learning underlying data representations without any explicit labels, contrastive learning comes with numerous real-world use cases; Mengliu Zhao guides us through the process of building a SimSiam model using the example of the FashionMNIST dataset (for a taste of the core idea, see the first sketch after this list).
- Paper Walkthrough: Vision Transformer (ViT)
We’re always in the mood for a solid, thorough paper analysis—and even more so when it covers a groundbreaking concept like vision transformers. If you’re new to this topic or would like to expand your existing knowledge of ViT, don’t miss Muhammad Ardi’s debut TDS article.
- Speeding Up the Vision Transformer with BatchNorm
Let’s stay with the vision transformer for a bit longer: if you’re already familiar with it but could use some help making your workflows more efficient and streamlined, Anindya Dey, PhD, provides a comprehensive guide to integrating batch normalization into an encoder-only transformer architecture, leading to reduced training and inference time (a hypothetical version of the swap is sketched after this list).
- Enhancing E-Commerce with Generative AI — Part 1
Some of the promised benefits of recently released AI tools remain to be seen. Mina Ghashami presents a new series that focuses on use cases where generative-AI applications are already poised to make a real impact, starting with one of the most common (and business-critical) tasks for e-commerce platforms: product recommendations.
- Causal Inference with Python: A Guide to Propensity Score Matching
Bringing theory and practice together, Lukasz Szubelak invites us to explore the ins and outs of causal inference in his patient deep dive, which focuses on propensity score matching as a powerful technique for estimating treatment effects in non-randomized settings (a minimal matching sketch appears after this list).
- ChatGPT vs. Claude vs. Gemini for Data Analysis (Part 1)
ML practitioners face an increasingly difficult choice when deciding which LLM-powered products to adopt. Yu Dong’s new series aims to bring clarity to an occasionally chaotic ecosystem by comparing the performance of three major offerings (ChatGPT, Claude, and Gemini) on essential data-analysis tasks—in this case, writing SQL queries.
- Omitted Variable Bias
Reading Sachin Date’s math and statistics explainers is always a highlight for us—and his latest, on “one of the most frequently occurring, and easily missed, biases in regression studies,” is no exception. We invite you to explore his deep dive on omitted variable bias, which also outlines several approaches for analyzing and estimating its effects (a short numerical check of the classic bias formula closes out this issue’s sketches).
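For readers who want a taste of the contrastive-learning piece before committing, below is a minimal, hypothetical SimSiam sketch in PyTorch (not Mengliu Zhao’s implementation): two augmented views of the same image pass through a shared encoder, and a predictor head on each branch learns to match the gradient-stopped projection of the other.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimSiam(nn.Module):
    """Minimal SimSiam sketch: shared encoder/projector plus a predictor head."""
    def __init__(self, dim=2048, pred_dim=512):
        super().__init__()
        # Any backbone works; a tiny MLP stands in for a ResNet here.
        self.encoder = nn.Sequential(
            nn.Flatten(), nn.Linear(28 * 28, dim), nn.ReLU(), nn.Linear(dim, dim)
        )
        self.predictor = nn.Sequential(
            nn.Linear(dim, pred_dim), nn.ReLU(), nn.Linear(pred_dim, dim)
        )

    def forward(self, x1, x2):
        z1, z2 = self.encoder(x1), self.encoder(x2)    # projections of two views
        p1, p2 = self.predictor(z1), self.predictor(z2)
        # The stop-gradient on the target branch is the key trick that
        # prevents representational collapse without any negative pairs.
        loss = -(F.cosine_similarity(p1, z2.detach()).mean()
                 + F.cosine_similarity(p2, z1.detach()).mean()) / 2
        return loss

# x1 and x2 would be two random augmentations of the same FashionMNIST batch.
model = SimSiam()
x1, x2 = torch.randn(8, 1, 28, 28), torch.randn(8, 1, 28, 28)
print(model(x1, x2))  # scalar loss; minimized as the views agree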
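```

The speed-up article centers on replacing LayerNorm with BatchNorm inside the transformer encoder. As a hedged illustration (the exact placement and architecture in the article may differ), here is a hypothetical pre-norm encoder block where BatchNorm1d, applied across the embedding dimension, stands in for the usual nn.LayerNorm:

```python
import torch
import torch.nn as nn

class BatchNormTokens(nn.Module):
    """Apply BatchNorm1d over the embedding dim of a (batch, tokens, dim) tensor."""
    def __init__(self, dim):
        super().__init__()
        self.bn = nn.BatchNorm1d(dim)

    def forward(self, x):  # x: (B, T, D)
        return self.bn(x.transpose(1, 2)).transpose(1, 2)

class EncoderBlock(nn.Module):
    """Pre-norm ViT encoder block with BatchNorm in place of LayerNorm."""
    def __init__(self, dim=192, heads=3, mlp_ratio=4):
        super().__init__()
        self.norm1 = BatchNormTokens(dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm2 = BatchNormTokens(dim)
        self.mlp = nn.Sequential(
            nn.Linear(dim, dim * mlp_ratio), nn.GELU(), nn.Linear(dim * mlp_ratio, dim)
        )

    def forward(self, x):
        h = self.norm1(x)
        x = x + self.attn(h, h, h, need_weights=False)[0]  # residual attention
        return x + self.mlp(self.norm2(x))                 # residual MLP

tokens = torch.randn(8, 65, 192)     # batch of 8, 64 patches + CLS token
print(EncoderBlock()(tokens).shape)  # torch.Size([8, 65, 192])
```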
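Propensity score matching pairs each treated unit with a control unit that had a similar estimated probability of receiving treatment, so that outcomes can be compared as if assignment were locally random. Here is a minimal sketch on synthetic data; the logistic-regression scores and 1-nearest-neighbor matching are our own simplifying assumptions, not necessarily the article’s setup:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)

# Synthetic observational data: confounder x drives both treatment and outcome.
n = 2000
x = rng.normal(size=(n, 1))
treated = rng.binomial(1, 1 / (1 + np.exp(-x[:, 0])))          # confounded assignment
outcome = 2.0 * treated + 3.0 * x[:, 0] + rng.normal(size=n)   # true effect = 2.0

# Step 1: estimate propensity scores P(treated | x).
scores = LogisticRegression().fit(x, treated).predict_proba(x)[:, 1]

# Step 2: match each treated unit to the control with the closest score.
t_idx, c_idx = np.where(treated == 1)[0], np.where(treated == 0)[0]
nn_model = NearestNeighbors(n_neighbors=1).fit(scores[c_idx].reshape(-1, 1))
_, match = nn_model.kneighbors(scores[t_idx].reshape(-1, 1))

# Step 3: the average outcome gap over matched pairs estimates the ATT.
att = np.mean(outcome[t_idx] - outcome[c_idx[match[:, 0]]])
print(f"Matched estimate of treatment effect: {att:.2f}")  # close to 2.0

# The naive comparison is biased upward by the confounder:
naive = outcome[treated == 1].mean() - outcome[treated == 0].mean()
print(f"Naive difference in means: {naive:.2f}")
```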
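Finally, a quick numerical appetizer for the omitted variable bias piece. The textbook result: omitting a relevant regressor x2 biases the estimated coefficient on x1 by beta2 * delta, where delta is the slope from regressing x2 on x1. A tiny NumPy check of that formula, on made-up numbers:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# True model: y = 1.5*x1 + 2.0*x2 + noise, with x1 and x2 correlated.
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(size=n)   # delta = Cov(x1, x2) / Var(x1) = 0.8
y = 1.5 * x1 + 2.0 * x2 + rng.normal(size=n)

# "Short" (misspecified) regression of y on x1 alone:
beta_short = np.cov(x1, y)[0, 1] / np.var(x1, ddof=1)
print(f"Short-regression slope:     {beta_short:.2f}")       # about 3.10
print(f"Predicted by OVB formula:   {1.5 + 2.0 * 0.8:.2f}")  # 1.5 + 2.0*0.8 = 3.10
```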
Thank you for supporting the work of our authors! We love publishing articles from new authors, so if you’ve recently written an interesting project walkthrough, tutorial, or theoretical reflection on any of our core topics, don’t hesitate to share it with us.
Until the next Variable,
TDS Team