docs: notation
Signed-off-by: Angel Luu <[email protected]>
aluu317 committed Nov 6, 2024
1 parent 0c76bce commit 8ba56af
Showing 1 changed file with 3 additions and 1 deletion.
README.md (3 additions, 1 deletion)
@@ -157,7 +157,7 @@ Granite 8B | LlamaForCausalLM | ✅ | ✅ | ✅ |
Granite 13B | GPTBigCodeForCausalLM | ✅ | ✅ | ✔️ |
Granite 20B | GPTBigCodeForCausalLM | ✅ | ✔️ | ✔️ |
Granite 34B | GPTBigCodeForCausalLM | 🚫 | ✅ | ✅ |
-Llama3.1-8B | LLaMA 3.1 | ✅ - supported from platform up to 8k context length - same architecture as llama3-8b | ✔️ | ✔️ |
+Llama3.1-8B | LLaMA 3.1 | ✅*** | ✔️ | ✔️ |
Llama3.1-70B (same architecture as llama3) | LLaMA 3.1 | 🚫 - same as Llama3-70B | ✔️ | ✔️ |
Llama3.1-405B | LLaMA 3.1 | 🚫 | 🚫 | ✅ |
Llama3-8B | LLaMA 3 | ✅ | ✅ | ✔️ |
@@ -171,6 +171,8 @@ Mistral large | Mistral | 🚫 | 🚫 | 🚫 |

(**) - Supported for q,k,v,o layers. `all-linear` target modules does not infer on vLLM yet.

+(***) - Supported from platform up to 8k context length - same architecture as llama3-8b
+
## Training

### Single GPU
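On the (**) footnote above: restricting LoRA to the q, k, v, and o projection layers is done through the adapter's target modules. The sketch below is a minimal, hypothetical illustration using the Hugging Face `peft` library directly rather than this repo's own tuning entry point; the `q_proj`/`k_proj`/`v_proj`/`o_proj` names assume a Llama-style architecture, and the model id is only an example.

```python
# Hedged sketch: restrict LoRA to the attention projection layers (q, k, v, o)
# instead of target_modules="all-linear". Module names assume a Llama-style
# architecture; other model families may name these layers differently.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Example model id only; substitute the checkpoint you are tuning.
model = AutoModelForCausalLM.from_pretrained("ibm-granite/granite-8b-code-base")

lora_config = LoraConfig(
    r=8,                       # adapter rank
    lora_alpha=16,             # scaling factor
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # q,k,v,o only
    task_type="CAUSAL_LM",
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # confirm only the q/k/v/o adapters are trainable
```

Naming the projection layers explicitly, rather than using `all-linear`, keeps the resulting adapter within the layer set that vLLM can currently serve, per the footnote.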
