Is langchain broken with current version of node-llama-cpp? #7028
Example Code

```typescript
import { ChatLlamaCpp } from "@langchain/community/chat_models/llama_cpp";
import { HumanMessage } from "@langchain/core/messages";

// Can confirm path is correct, but none of these models work
let llamaPath = "./hf_TheBloke_mistral-7b-instruct-v0.2.Q5_K_M.gguf";
llamaPath = "./hf_mradermacher_Mistral-Nemo-Instruct-2407.Q4_K_M.gguf";
llamaPath = "./Mistral-Nemo-Instruct-2407-Q4_K_M.gguf";
llamaPath = "./Hermes-2-Pro-Llama-3-8B-Q4_K_M.gguf";

const model = new ChatLlamaCpp({ modelPath: llamaPath });

const response = await model.invoke([
  new HumanMessage({ content: "My name is John." }),
]);

console.log({ response });
```

Description

https://js.langchain.com/docs/integrations/chat/llama_cpp/
Error:
dependencies from package.json:
System Info

```
[email protected] | MIT | deps: 12 | versions: 297
keywords: llm, ai, gpt3, chain, prompt, prompt engineering, chatgpt, machine learning, ml, openai, embeddings, vectorstores
dist
dependencies:
maintainers:
dist-tags:
published a month ago by jacoblee93 [email protected]
```
Replies: 2 comments
-
There is a similar open issue titled "LlamaCpp throwing 'TypeError: Cannot destructure property '_llama' of 'undefined' as it is undefined.'" which might be related to the problem you're experiencing. The suggested solution there is to ensure that the model is in a compatible format. In your case, the models you're trying to use are already in the GGUF format.
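
One way to narrow this down is to load the same GGUF file directly with node-llama-cpp, outside of LangChain. Below is a minimal sketch, assuming the node-llama-cpp 2.x API (the 3.x release replaced these classes with a `getLlama()` entry point) and reusing one of the model paths from the question:

```typescript
import { LlamaModel, LlamaContext, LlamaChatSession } from "node-llama-cpp";

// Path taken from the question above; adjust to wherever your GGUF file lives.
const modelPath = "./Hermes-2-Pro-Llama-3-8B-Q4_K_M.gguf";

// If this throws, the problem is the model file or the node-llama-cpp install,
// not the LangChain wrapper.
const model = new LlamaModel({ modelPath });
const context = new LlamaContext({ model });
const session = new LlamaChatSession({ context });

const answer = await session.prompt("My name is John.");
console.log(answer);
```

If this loads and answers, the GGUF file itself is fine, and the failure is more likely the node-llama-cpp major-version mismatch discussed in the next reply.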
-
Answer from a LangChain dev: "use version 2, not 3". package.json:
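
For anyone landing here later, here is a sketch of what pinning node-llama-cpp to the 2.x line might look like in package.json. The exact version ranges below are illustrative assumptions, not copied from the original reply:

```json
{
  "dependencies": {
    "@langchain/community": "^0.3.0",
    "@langchain/core": "^0.3.0",
    "node-llama-cpp": "^2.8.0"
  }
}
```

After changing the range, reinstall dependencies (e.g. `npm install`) so the 2.x build of node-llama-cpp is actually the one resolved on disk.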