Hi, what a fantastic resource for developing PersonaLLM agents!

I want to highlight a recent paper presented at ACL 2024 Findings: TimeChara: Evaluating Point-in-Time Character Hallucination of Role-Playing Large Language Models. This study focuses on assessing hallucinations in role-playing LLM agents when they simulate characters at specific moments in time.

In addition, I want to introduce another paper presented at ACL 2023 Main: MPCHAT: Towards Multimodal Persona-Grounded Conversation. This study focuses on constructing a multimodal persona-grounded dialogue dataset, where a multimodal (i.e., text+image) persona represents the user's personal experiences in their episodic memories.

We would greatly appreciate it if you could consider adding our papers to your repository and your survey paper. Thanks!

Thank you for sharing your valuable contributions. We will incorporate them into our repository, and they will be reflected in the next update of our survey paper on arXiv.