
Natural Language Processing

From text preprocessing to BERT — the tasks, models, and evaluation methods that define how machines understand human language.
Co-Created by Kiran Shirol and Claude
Topics: Tokenization · Embeddings · Classification · NER · Transformers · Evaluation
Recommended prerequisites: AI Fundamentals and Deep Learning Fundamentals (or equivalent knowledge of neural networks and attention).
10 Chapters · High-Level
Section 1

Text & Representation

Why language is hard for machines, how to preprocess text, and the journey from sparse to dense representations.
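The sparse-to-dense journey this section covers can be sketched in a few lines. A minimal illustration, with a made-up toy vocabulary and made-up embedding values (not from any trained model):

```python
# Sparse vs. dense text representations, on a tiny hypothetical vocabulary.
vocab = ["cat", "sat", "on", "the", "mat"]

def one_hot(word):
    """Sparse representation: one dimension per vocabulary word, a single 1."""
    vec = [0] * len(vocab)
    vec[vocab.index(word)] = 1
    return vec

# Dense representation: a small learned vector per word.
# Values here are invented for illustration; real embeddings are learned.
embeddings = {
    "cat": [0.21, -0.44, 0.83],
    "mat": [0.19, -0.40, 0.79],  # close to "cat": similar words end up nearby
}

print(one_hot("cat"))  # [1, 0, 0, 0, 0]
```

The sparse vector grows with the vocabulary and treats every pair of words as equally unrelated; the dense vector is short and lets similarity emerge as geometric closeness.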
Section 2

Tasks & Models

The core NLP tasks — classification, sequence labeling, generation — and the models that solve them, from Naive Bayes to BERT.
Section 3

Modern NLP

Transfer learning, evaluation metrics, and the modern landscape from instruction tuning to multilingual models.
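Among the evaluation metrics covered here, precision, recall, and F1 are the workhorses. A self-contained sketch for the binary case, using invented example labels:

```python
def precision_recall_f1(y_true, y_pred, positive=1):
    """Compute precision, recall, and F1 for one positive class."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Example: 3 true positives in y_true, the model finds 2 of them.
p, r, f = precision_recall_f1([1, 1, 0, 0, 1], [1, 0, 0, 1, 1])
```

Precision asks "of what I predicted positive, how much was right?", recall asks "of what was actually positive, how much did I find?", and F1 is their harmonic mean.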