In-context learning (ICL) allows large language models (LLMs) to perform new tasks or adapt to new data by leveraging examples provided directly within the input prompt, without requiring explicit model weight updates. It enables flexible, on-the-fly task execution by treating examples as part of the input context for next-token prediction.
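The mechanism above can be sketched concretely: an ICL prompt is just labeled examples concatenated ahead of a new query, which the model completes by next-token prediction. A minimal sketch follows; the translation task, prompt format, and function name are illustrative assumptions, and the actual LLM call is omitted since no particular API is specified here.

```python
def build_icl_prompt(demonstrations, query,
                     instruction="Translate English to French."):
    """Format labeled examples plus a new query into one prompt string.

    The model 'learns' the task purely from these in-context examples:
    no weights are updated, only the input context changes.
    """
    lines = [instruction, ""]
    for source, target in demonstrations:
        lines.append(f"English: {source}")
        lines.append(f"French: {target}")
        lines.append("")
    lines.append(f"English: {query}")
    lines.append("French:")  # the model fills this in via next-token prediction
    return "\n".join(lines)

# Hypothetical few-shot demonstrations for the translation task.
demos = [("cheese", "fromage"), ("dog", "chien")]
prompt = build_icl_prompt(demos, "cat")
print(prompt)
```

Swapping in different demonstrations changes the task the model performs, with no retraining step in between.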
In plain terms, in-context learning is a way for large AI models to pick up new tasks simply by looking at a few examples included in the prompt, without being retrained. This makes them flexible and quick to adapt across many tasks, from translation to question answering, and it is a key reason modern AI models are so versatile.
Also known as: ICL, few-shot prompting, prompt-based learning, demonstration learning