Definition:
Zero-shot Learning /ˈzɪə.rəʊ ʃɒt ˈlɜː.nɪŋ/ noun — In machine learning, zero-shot learning (ZSL) is a technique that enables a model to perform tasks or recognize categories it was never explicitly trained on, by leveraging semantic side information such as attributes, textual descriptions, or representations from language models.
Unlike traditional supervised learning, which requires labeled examples for every class, zero-shot learning allows a model to:
- Generalize knowledge from seen classes to unseen classes
- Interpret natural language instructions or descriptions to guide behavior
- Use text embeddings or semantic vectors as bridges between known and novel categories
Common approaches include:
- Using pre-trained models (e.g., CLIP, GPT, BERT) to match text prompts to inputs
- Mapping both visual and textual data into a shared embedding space
- Transferring relational or attribute-based knowledge from labeled to unlabeled classes
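The embedding-space approach above can be sketched in a few lines. The snippet below is a minimal, illustrative toy: the `embed` function here is a hypothetical stand-in (a bag of character trigrams) for a real pre-trained encoder such as CLIP's text tower, and the prompt template and label names are made up for the example. What it demonstrates is the mechanism itself: input and candidate-label descriptions are mapped into a shared space, and the nearest label wins, so labels can change at query time without any retraining.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "semantic" embedding: a bag of lowercase character trigrams.
    # A real zero-shot system would use a pre-trained encoder instead.
    t = text.lower()
    return Counter(t[i:i + 3] for i in range(len(t) - 2))

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[k] * b[k] for k in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def zero_shot_classify(text: str, labels: list[str]) -> str:
    # Embed the input and a natural-language description of each
    # candidate label, then return the closest label in embedding
    # space. The label set is supplied at query time, not at training.
    prompts = {lab: embed(f"this text is about {lab}") for lab in labels}
    doc = embed(text)
    return max(labels, key=lambda lab: cosine(doc, prompts[lab]))

print(zero_shot_classify(
    "a new recipe for cooking pasta",
    ["sports", "finance", "cooking"],
))  # → cooking
```

With a real encoder the structure is identical; only `embed` changes, which is why swapping in a stronger pre-trained model immediately improves zero-shot accuracy without touching the classification logic.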
Zero-shot learning is widely used in:
- Vision-language models
- Conversational AI and chatbots
- Text classification with dynamic labels
- Personalized recommendation systems
It plays a critical role in creating flexible, scalable AI systems that can adapt to new tasks without retraining, a key requirement for real-world deployment and continual learning.

