Concept for filling a knowledge graph via information from conversation with an LLM
The aim is to save information extracted from the user's conversation with the LLM. The information must be reviewed and selected, either manually by the user or automatically by the system. The selected information is then stored in a knowledge graph (KG) to ground later conversations with the LLM. The grounding part is handled in #25 (closed).
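The pipeline above (extract, review, store, ground) can be sketched as follows. This is a minimal in-memory illustration with hypothetical names, not tied to any particular graph database or LLM library: reviewed facts are stored as (subject, predicate, object) triples, and the `ground` method renders them as prompt-ready text for a later conversation turn.

```python
from collections import defaultdict

class ConversationKG:
    """Minimal in-memory knowledge graph for conversation facts."""

    def __init__(self):
        # subject -> list of (predicate, object) pairs
        self.edges = defaultdict(list)

    def add_triple(self, subject, predicate, obj):
        """Store one reviewed fact selected from the conversation."""
        self.edges[subject].append((predicate, obj))

    def ground(self, subject):
        """Return stored facts about a subject as prompt-ready strings."""
        return [f"{subject} {p} {o}" for p, o in self.edges[subject]]

# Example: facts selected (manually or automatically) from a chat
kg = ConversationKG()
kg.add_triple("user", "prefers", "Python")
kg.add_triple("user", "works_on", "knowledge graphs")
print(kg.ground("user"))
```

In a real setup the dictionary would be replaced by a persistent graph store (e.g. a graph database, as in the linked Medium article), and the grounding strings would be prepended to the next LLM prompt.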
Similar works
- https://python.langchain.com/docs/modules/memory/agent_with_memory_in_db
- https://medium.com/@a-gilmore/context-is-everything-logging-your-llm-conversations-in-a-graph-database-7fa641265657
Current task
- Running a small LLM
- Understanding KGs
- Choosing an LLM for this use case
- Making a UI for manually selecting information in the conversation
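The manual-selection step in the task list could work as sketched below. This is a hypothetical minimal version: the user marks which conversation messages are worth keeping, and only those are passed on to the KG step. A real UI would replace the index list with clickable selection.

```python
def select_for_kg(messages, selected_indices):
    """Return only the messages the user marked for saving to the KG."""
    return [messages[i] for i in selected_indices if 0 <= i < len(messages)]

conversation = [
    "user: I mainly code in Python.",
    "assistant: Noted! Anything else?",
    "user: My project is about knowledge graphs.",
]
# The user selects messages 0 and 2 as facts worth keeping
print(select_for_kg(conversation, [0, 2]))
```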
References
- https://huggingface.co/TheBloke/SOLAR-10.7B-Instruct-v1.0-uncensored-GGUF
Edited by Danial Hezarkhani