In-Context Learning
In-context learning is a model's ability to pick up a new task from examples or instructions supplied in the prompt at inference time, without any fine-tuning or weight updates. This context lets the model generalise beyond the information present in its training data, and the emergence of this ability at scale was a key breakthrough demonstrated by GPT-3.
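As a minimal sketch of the idea, a few-shot prompt simply interleaves worked examples with a new query; the task, helper name, and example pairs below are illustrative, and the resulting string would be sent to any language model unchanged — the "learning" happens entirely in the prompt, not in the weights:

```python
def build_few_shot_prompt(examples, query):
    """Concatenate labelled examples and a new query into one prompt.

    The model is never trained on these examples; it infers the task
    (here, English -> French translation) from the pattern alone.
    """
    lines = [f"English: {en}\nFrench: {fr}" for en, fr in examples]
    # End with the unanswered query so the model completes the pattern.
    lines.append(f"English: {query}\nFrench:")
    return "\n\n".join(lines)

examples = [
    ("cheese", "fromage"),
    ("dog", "chien"),
]
prompt = build_few_shot_prompt(examples, "house")
print(prompt)
```

Passing this prompt to a sufficiently capable model typically yields "maison", even though no translation pair for "house" appeared anywhere in the prompt.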