Wrap-up & Next Steps

Thank you for participating!

What We Covered Today

A Complete Journey Through Modern RecSys

Part I: Foundations ✅

Classical Recommender Systems

  • Collaborative filtering fundamentals
  • Matrix factorization and embeddings
  • EASE (Embarrassingly Shallow Autoencoders)
  • Two-tower models for scalable retrieval
  • Retrieval → Ranking pipeline

Key Insight: Embeddings are the foundation of modern recommendation systems
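As a pointer back to Part I, EASE is compact enough to fit in a few lines of NumPy. This is a minimal sketch of the closed-form solution (the regularization strength and the toy interaction matrix are illustrative, not from the tutorial):

```python
import numpy as np

# Toy binary user-item interaction matrix: 3 users x 4 items.
X = np.array([
    [1, 1, 0, 0],
    [0, 1, 1, 0],
    [1, 0, 1, 1],
], dtype=float)

lam = 10.0  # L2 regularization strength (illustrative value)

# EASE closed-form solution: invert the regularized Gram matrix,
# derive an item-item weight matrix with a zero diagonal.
G = X.T @ X + lam * np.eye(X.shape[1])
P = np.linalg.inv(G)
B = -P / np.diag(P)        # item-item weights
np.fill_diagonal(B, 0.0)   # an item must not predict itself

scores = X @ B             # predicted score for every user-item pair
```

Despite having no hidden layers, this "embarrassingly shallow" model is a strong baseline; ranking each user's unseen items by `scores` gives the recommendations.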

Part II: Sequential Models ✅

Transformers for Recommendations

  • Transformer architecture (attention mechanisms)
  • Self-Attentive Sequential Recommendation (SASRec)
  • Capturing temporal patterns in user behavior
  • From static to dynamic user preferences

Key Insight: Sequence matters! Past interactions reveal future intent
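The core idea of SASRec from Part II can be sketched as one causal self-attention step over a user's interaction sequence. This is a single-head, single-layer illustration in NumPy (the embedding table is random; a real model learns it and stacks attention with feed-forward layers):

```python
import numpy as np

rng = np.random.default_rng(0)
n_items, d = 10, 8
item_emb = rng.normal(size=(n_items, d))  # stand-in item embedding table

seq = [2, 5, 1, 7]          # one user's interaction history (item ids, oldest first)
H = item_emb[seq]           # (seq_len, d)

# Causal self-attention: each position may attend only to itself and the past.
scores = H @ H.T / np.sqrt(d)
future = np.triu(np.ones((len(seq), len(seq)), dtype=bool), k=1)
scores[future] = -np.inf
attn = np.exp(scores - scores.max(axis=1, keepdims=True))
attn /= attn.sum(axis=1, keepdims=True)
out = attn @ H              # contextualized representation of each position

# Next-item scores: match the last position against all item embeddings.
next_scores = item_emb @ out[-1]
```

The causal mask is what makes this sequential: the representation of the latest interaction summarizes everything before it, and its dot product with the item table ranks candidates for the next step.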

Part III: LLM-Powered Recommendation ✅

Generative AI Meets RecSys

  • Zero-shot recommendation without training data
  • Metadata augmentation with LLMs
  • Solving the cold-start problem
  • Semantic understanding beyond collaborative signals
  • Conversational recommendation

Key Insight: LLMs unlock capabilities that purely collaborative methods cannot offer, such as zero-shot and cold-start recommendation
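Zero-shot recommendation from Part III reduces to prompt construction: describe the user's history, supply candidates, and ask the model to rank them. The function below is a hypothetical sketch of such a prompt builder; the actual LLM call would go through whichever client you use and is omitted here:

```python
def build_zero_shot_prompt(history: list[str], candidates: list[str], k: int = 3) -> str:
    """Build a ranking prompt from a user's history and a candidate list.

    Purely illustrative: names and wording are assumptions, not an API
    from the tutorial materials.
    """
    lines = [
        "You are a movie recommender.",
        "The user recently watched:",
        *[f"- {title}" for title in history],
        "",
        f"From the candidates below, rank the top {k} items for this user,",
        "one title per line, most relevant first:",
        *[f"- {title}" for title in candidates],
    ]
    return "\n".join(lines)

prompt = build_zero_shot_prompt(
    history=["Alien", "Blade Runner"],
    candidates=["The Notebook", "Dune", "Interstellar"],
)
```

No training data or interaction matrix is needed, which is exactly why this approach sidesteps the cold-start problem: a brand-new item only needs a title or description to be rankable.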

Part IV: Generative Pages ✅

Beyond Ranked Lists

  • Re-ranking and slate optimization
  • Diversity and coverage constraints
  • Whole page generation
  • Multi-carousel experiences

Key Insight: Modern platforms generate entire personalized experiences
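One classic way to combine the re-ranking and diversity ideas from Part IV is maximal marginal relevance (MMR): greedily pick the item that best trades off its relevance score against its similarity to items already on the slate. A minimal sketch, with an assumed trade-off weight `lam` (this is the standard MMR heuristic, not the specific method from the tutorial):

```python
import numpy as np

def mmr_rerank(relevance, item_vecs, k, lam=0.7):
    """Greedy slate selection balancing relevance and diversity.

    lam=1.0 reduces to pure relevance ranking; smaller values
    penalize items similar to those already chosen.
    """
    unit = item_vecs / np.linalg.norm(item_vecs, axis=1, keepdims=True)
    sim = unit @ unit.T  # cosine similarity between all candidates
    chosen, remaining = [], list(range(len(relevance)))
    while remaining and len(chosen) < k:
        def score(i):
            redundancy = max((sim[i, j] for j in chosen), default=0.0)
            return lam * relevance[i] - (1 - lam) * redundancy
        best = max(remaining, key=score)
        chosen.append(best)
        remaining.remove(best)
    return chosen
```

With `lam=0.5` and two near-duplicate top candidates, the second slot goes to a less relevant but dissimilar item, which is precisely the behavior whole-page and multi-carousel layouts rely on.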

Key Takeaways

  1. Fundamentals still matter - Collaborative filtering, embeddings, and two-tower models remain essential
  2. Transformers are everywhere - From NLP to sequential recommendation
  3. LLMs are game-changers - Zero-shot capabilities, cold-start solutions, semantic understanding
  4. Think beyond lists - Design entire personalized experiences

Take-Home Materials

All materials available on GitHub:

  • 📓 Jupyter Notebooks
  • 📊 Slide Presentations
  • 🐍 Reusable Python utilities

🔗 github.com/jankislinger/recsys-genai

All references: See the References page

Thank You!