This repository is the official implementation of In-context Vectors: Making In Context Learning More Effective and Controllable Through Latent Space Steering.
Large language models (LLMs) demonstrate emergent in-context learning capabilities, adapting to new tasks from example demonstrations. However, in-context learning is of limited effectiveness in many settings, is hard to control quantitatively, and consumes context-window space. To overcome these limitations, we propose an alternative approach that recasts in-context learning as in-context vectors (ICV). Using ICV has two steps. First, a forward pass on the demonstration examples produces the in-context vector from the latent embeddings of the LLM; this vector captures essential information about the intended task. Then, on a new query, instead of adding demonstrations to the prompt, we shift the latent states of the LLM using the ICV. The ICV approach has several benefits: 1) it enables the LLM to follow the demonstration examples more effectively; 2) it is easy to control by adjusting the magnitude of the ICV; 3) it shortens the prompt by removing the in-context demonstrations; 4) it is computationally much more efficient than fine-tuning. We demonstrate that ICV achieves better performance than standard in-context learning and fine-tuning on diverse tasks including safety, style transfer, role-playing, and formatting. Moreover, we show that an LLM can be flexibly taught to follow several types of instructions simultaneously through simple vector arithmetic on the corresponding ICVs.
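The two steps can be sketched in a few lines of PyTorch. The snippet below is a minimal illustration only, not the repository's implementation: the model name, the demonstration pairs, and the hook placement are assumptions, and it averages the per-layer latent differences where the paper extracts a dominant direction.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Illustrative model; the repository supports Falcon and LLaMA variants.
model_name = "tiiuae/falcon-7b"
tok = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()


def last_token_states(text):
    """Hidden state of the final token at every layer (including embeddings)."""
    ids = tok(text, return_tensors="pt")
    with torch.no_grad():
        out = model(**ids, output_hidden_states=True)
    return torch.stack([h[0, -1] for h in out.hidden_states])  # (n_layers + 1, hidden)


# Demonstration pairs (source -> desired style), e.g. toxic -> detoxified.
demos = [
    ("this is a terrible, stupid idea", "I do not think this is a good idea"),
    ("shut up and get out", "please stop and leave"),
]

# Step 1: forward passes on the demonstrations give per-layer latent differences.
diffs = torch.stack(
    [last_token_states(tgt) - last_token_states(src) for src, tgt in demos]
)
# The paper derives the ICV from the dominant direction of these differences;
# averaging them is a cheaper stand-in used here for brevity.
icv = diffs.mean(dim=0)  # (n_layers + 1, hidden)

# Step 2: at query time, shift every layer's latent states by lam * ICV.
lam = 0.1
hooks = []


def make_hook(layer_idx):
    def hook(module, inputs, output):
        hidden = output[0] if isinstance(output, tuple) else output
        hidden = hidden + lam * icv[layer_idx].to(hidden.dtype)
        return (hidden,) + output[1:] if isinstance(output, tuple) else hidden
    return hook


# The attribute holding the decoder blocks differs by architecture
# (model.transformer.h for Falcon, model.model.layers for LLaMA).
for i, block in enumerate(model.transformer.h):
    hooks.append(block.register_forward_hook(make_hook(i + 1)))

query = tok("you are so dumb, just", return_tensors="pt")
print(tok.decode(model.generate(**query, max_new_tokens=30)[0]))

for h in hooks:
    h.remove()
```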
conda create -n icv python=3.9
pip install -r requirements.txt
Download the ParaDetox dataset from Hugging Face.
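For example, the dataset can be pulled with the `datasets` library; the hub id `s-nlp/paradetox` and the column layout are assumptions here, so check the dataset card.

```python
# Hedged sketch: fetch ParaDetox from the Hugging Face Hub.
from datasets import load_dataset

ds = load_dataset("s-nlp/paradetox")  # hub id assumed; verify on the dataset card
print(ds)              # available splits
print(ds["train"][0])  # one toxic/neutral sentence pair
```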
The Jupyter notebook provides simple demo code for experimenting with the in-context vector to steer properties of the generated text using a few demonstrations.
Here is an example of applying the in-context vector with Falcon/LLaMA for text detoxification on the ParaDetox dataset.
python task_style_vector.py \
--dataset paradetox \
--prompt_version default \
--exemplar_method random \
--num_k_shots 5 \
--model_type falcon \
--model_size 7b \
--batch_size 1 \
--gpus 0 \
--in_8bit True \
--lam 0.1 \
--seed 0
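Here `--lam` scales the strength of the in-context vector applied to the latent states (larger values steer generation more strongly toward the demonstrated style), and `--num_k_shots` sets how many demonstration pairs are used to build the vector; the remaining flags select the model, dataset, and runtime settings.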
For evaluation, you can run the following:
python evaluation.py ./logger/main/paradetox/file_name.json paradetox
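Replace `file_name.json` with the generation log produced by the command above (outputs are saved under `./logger/main/<dataset>/`).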
@inproceedings{liu2024context,
title={In-context vectors: Making in context learning more effective and controllable through latent space steering},
author={Liu, Sheng and Ye, Haotian and Xing, Lei and Zou, James},
booktitle={International Conference on Machine Learning},
year={2024},
organization={PMLR}
}
For technical details and full experimental results, please check our paper.