📘 Winter LLM Bootcamp
  1. Word Vectors, Simplified!

What is a Word Vector


Before we tackle the intricate aspects of Large Language Models, it's crucial to understand the foundational concept: Word Vectors. Imagine language as a vast, multi-dimensional space where each word has its unique spot. Word vectors convert words into a numerical format, making text comprehensible to LLMs.
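To make this concrete, here is a minimal sketch of the idea in Python. The three-dimensional vectors and their values are purely illustrative (real LLM embeddings have hundreds or thousands of dimensions, learned from data), but they show how representing words as numbers lets us measure which words sit close together in the space:

```python
import math

# Hypothetical 3-dimensional word vectors, invented for illustration only.
word_vectors = {
    "king":  [0.8, 0.6, 0.1],
    "queen": [0.8, 0.5, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

def cosine_similarity(a, b):
    """Score how closely two vectors point in the same direction (1.0 = identical)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Related words end up with a higher similarity than unrelated ones.
print(cosine_similarity(word_vectors["king"], word_vectors["queen"]))
print(cosine_similarity(word_vectors["king"], word_vectors["apple"]))
```

With these toy values, "king" scores much closer to "queen" than to "apple", which is exactly the property that makes word vectors useful to a language model.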

In the accompanying video, Anup Surendran lays the groundwork for understanding LLMs by exploring basic questions:

  • What exactly are LLMs?

  • What are Word Vectors?

Note: As you move through the modules, you'll notice that some videos are segmented and shared across several submodules. This segmentation is deliberate: it keeps each clip aligned with its accompanying text and maintains a coherent learning journey. The video above, however, isn't split in this manner. :)