[README] update flash-attn instructions
imoneoi authored Sep 14, 2023
1 parent a9ad313 commit 76ab48c
Showing 1 changed file with 2 additions and 12 deletions.
14 changes: 2 additions & 12 deletions README.md
@@ -108,20 +108,13 @@ We will release the evaluation results as soon as they become available, so stay

## <a id="installation"></a> Installation

-To use OpenChat, you need to install CUDA and PyTorch, then you can install OpenChat via pip:
+To use OpenChat, you need to install PyTorch, then you can install OpenChat via pip:

```bash
pip3 install ochat
```

-If you want to train models, please also install FlashAttention 1.
-
-```bash
-pip3 install packaging ninja
-pip3 install --no-build-isolation "flash-attn<2"
-```
-
-FlashAttention and vLLM may have compatibility issues. If you encounter these problems, you can try to create a new `conda` environment following the instructions below.
+vLLM may have compatibility issues. If you encounter these problems, you can try to create a new `conda` environment following the instructions below.

```bash
conda create -y --name openchat
@@ -131,9 +124,6 @@
conda install -y python
conda install -y cudatoolkit-dev -c conda-forge
pip3 install torch torchvision torchaudio

-pip3 install packaging ninja
-pip3 install --no-build-isolation "flash-attn<2"
-
pip3 install ochat
```
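
Net effect of this commit: FlashAttention is no longer listed as a prerequisite, and the documented setup reduces to installing PyTorch and then the `ochat` package. A minimal sketch of the resulting steps follows; the specific PyTorch wheel for your CUDA setup is an assumption, not part of the diff.

```bash
# Sketch of the post-commit installation path described by the updated README.
# Assumption: the default PyPI torch wheel matches your CUDA version; adjust if needed.
pip3 install torch torchvision torchaudio
pip3 install ochat
```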

