Thank you for your valuable contribution. I have a question regarding how the LLM generates appropriate KG-grounded reasoning paths. Specifically, I noticed that the KG information does not appear to be explicitly provided as a prompt when the LLM is tasked with generating the reasoning paths.
How does the LLM manage to produce the relations and entities directly from the KG in this case? Is this because the model has been fine-tuned using ground-truth paths, or is there another mechanism involved?
Best.
Hi, thanks for your interest in our work. In GCR, we adopt graph-constrained decoding to ensure that LLM generations remain faithful to the KG; this is realized through a KG-Trie. You can find more details in our paper.
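Roughly, the idea behind this kind of trie-constrained decoding can be sketched as follows. This is a simplified illustration of the general technique, not GCR's actual implementation; the relation and entity strings, the `KGTrie` class, and the greedy scorer are all hypothetical stand-ins.

```python
class KGTrie:
    """Trie over tokenized KG paths; constrains decoding to valid paths."""

    def __init__(self):
        self.children = {}  # token -> KGTrie

    def insert(self, tokens):
        node = self
        for t in tokens:
            node = node.children.setdefault(t, KGTrie())

    def allowed_next(self, prefix):
        """Return the set of tokens that can legally follow `prefix`."""
        node = self
        for t in prefix:
            if t not in node.children:
                return set()
            node = node.children[t]
        return set(node.children)


# Hypothetical KG paths, pre-tokenized for illustration.
trie = KGTrie()
trie.insert(["film.director", "->", "Christopher Nolan"])
trie.insert(["film.director", "->", "Denis Villeneuve"])
trie.insert(["film.genre", "->", "Sci-Fi"])


def constrained_decode(score_fn, trie, max_len=8):
    """Greedy decoding where each step is masked by the trie.

    `score_fn(prefix, token)` stands in for the LLM's next-token score;
    only tokens the trie allows are ever considered, so the output is
    always a prefix of some real KG path.
    """
    prefix = []
    for _ in range(max_len):
        allowed = trie.allowed_next(prefix)
        if not allowed:  # reached the end of a KG path
            break
        prefix.append(max(allowed, key=lambda t: score_fn(prefix, t)))
    return prefix


# Toy scorer (prefers longer token strings) as a stand-in for LLM logits.
path = constrained_decode(lambda p, t: len(t), trie)
```

This is why the KG does not need to appear in the prompt: at every decoding step the model's vocabulary is masked down to tokens that continue some valid KG path, so the generated relations and entities are guaranteed to come from the KG by construction.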