Mercari AI team presented "A related document retrieval system using past queries" at NLP2024
Overview
Kanta Suga (Graduate School of Advanced Science and Engineering, Waseda University), a former intern of the Mercari AI team, together with Mercari AI team engineers Rintaro Miyamoto, Naoki Katsura, and Keisuke Umezawa, presented their research, "A related document retrieval system using past queries," at the 30th Annual Conference of the Association for Natural Language Processing (NLP2024).
The Association for Natural Language Processing is the largest domestic conference where researchers and engineers in natural language processing (NLP) gather, and the 30th conference, NLP2024, was held in Kobe, Hyogo Prefecture from March 11 to March 15, 2024.
Research Background
At Mercari, we develop in-house tools for customer inquiries and fraud detection. In particular, quickly resolving the issues customers face is very important for improving customer satisfaction.
Although customer issues and their contexts vary, there are similarities among past inquiries, and by leveraging this accumulated information, we aim to reduce resolution time and have proposed this method.
Research Summary
Mercari has a contact form within its service, allowing customers to send inquiries when they face issues. Given the large number of inquiries received, there is a strong need to resolve them efficiently.
Many inquiries are similar to past ones, and referencing similar past inquiries serves as a starting point for improving operational efficiency.
However, because support manuals and systems change as the service evolves, referencing past inquiries alone is sometimes insufficient. Therefore, in this research, we propose a method that uses past inquiries to search the current help guide, and we report experimental results under various conditions.
Key Points of the Presentation
In this paper, we propose a method that, when a customer faces an issue on Mercari, uses the text of the inquiry together with past conversation histories to search the existing help guide for solutions, associating them based on similarity.
In the proposed method, a customer inquiry is vectorized using a document embedding model. Past inquiries, vectorized in the same way, are searched to collect those most similar to the new inquiry; the help guides showing high similarity to the responses of these past inquiries are then linked by majority vote, achieving high retrieval accuracy.
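The retrieval flow described above can be sketched roughly as follows. This is a simplified illustration, not Mercari's actual implementation: the hashed bag-of-words `embed` function is a toy stand-in for a real document embedding model, and the example assumes (for brevity) that each past inquiry is already linked to a help-guide ID, whereas the paper links guides via similarity to past responses. All data and names here are hypothetical.

```python
import numpy as np
from collections import Counter

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Toy stand-in for a document embedding model (illustrative only;
    a real system would use a pretrained sentence-embedding model)."""
    vec = np.zeros(dim)
    for token in text.lower().split():
        vec[hash(token) % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

# Hypothetical past inquiries, each already associated with a help-guide ID.
past_inquiries = [
    ("I cannot cancel my order after payment", "guide_cancel"),
    ("How do I cancel a purchase I just made", "guide_cancel"),
    ("My payment failed with an error", "guide_payment"),
    ("Payment error when buying an item", "guide_payment"),
]

def retrieve_guide(new_inquiry: str, k: int = 3) -> str:
    """Collect the k past inquiries most similar to the new one
    (cosine similarity of unit vectors = dot product), then pick
    the linked help guide by majority vote."""
    q = embed(new_inquiry)
    ranked = sorted(
        past_inquiries,
        key=lambda pair: float(q @ embed(pair[0])),
        reverse=True,
    )
    votes = Counter(guide for _, guide in ranked[:k])
    return votes.most_common(1)[0][0]

print(retrieve_guide("I want to cancel my order"))   # guide_cancel
print(retrieve_guide("payment error during checkout"))  # guide_payment
```

The majority vote over the top-k neighbors makes the result robust to a single noisy match, which is one reason this kind of vote-based linking can outperform taking only the single nearest past inquiry.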
Presenter's Comment
Kanta Suga | Mercari AI Team, Former Intern
Through this opportunity, I realized the importance of exchanging opinions directly at conferences to find new improvements, and of presenting results externally.
Also, while listening to various presentations, I felt the rapid technological advancements in the AI field, and I realized that I need to make more effort to keep up. It was a very valuable experience to have such an opportunity to present even as an intern. Thank you very much.
Naoki Katsura | ML Engineer
At this year's annual conference of the Association for Natural Language Processing, we not only presented a poster but also set up a Mercari sponsor booth, where we engaged in discussions with domestic students, researchers, and engineers.
Researchers and engineers in the industry often share a common understanding of the challenges of applying language models to customer inquiries. This conference was a meaningful venue for input and discussion, and we will apply the insights gained through these discussions to Mercari's operations.
Members at the site
Mercari's sponsor booth at the conference
About the CRE (Customer Reliability Engineering) ML Team
The CRE ML Team develops in-house tools and streamlines operations using machine learning in the area of customer support at Mercari.