Developing Evaluation Metrics for Instant Search Using Mixed Methods

Abstract

Instant search has become a popular search paradigm in which users are shown a new result page in response to every keystroke. In recent years, the paradigm has been widely adopted in several domains, including personal email search, e-commerce, and music search. However, the evaluation of such systems, and the metrics used to evaluate them, have received comparatively little attention in the literature so far. In this work, we describe a mixed methods approach to understanding user expectations and evaluating an instant search system in the context of music search. Our methodology involves conducting a set of user interviews to gain a qualitative understanding of users’ behaviors and expectations. The hypotheses from this user research are then extended and verified through a large-scale quantitative analysis of interaction logs. Using music search as a lens, we show that researchers and practitioners can interpret behavior logs more effectively when they are accompanied by insights from qualitative research. Further, we show that user research eliminates the guesswork involved in identifying user signals that estimate user satisfaction. Finally, we demonstrate that the metrics identified using our approach are more sensitive than the commonly used click-through rate metric for instant search.
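
The abstract's claim about metric sensitivity can be made concrete with a small simulation. What follows is a minimal, illustrative Python sketch, not taken from the paper: the session fields (clicked, time_to_success_s), the distributions, the effect sizes, and the bootstrap-style comparison are all assumptions chosen for illustration. It shows one way to estimate how often a given metric, here click-through rate versus a hypothetical time-to-success signal, detects a small simulated improvement between a control and a treatment variant of an instant search system.

# Illustrative sketch only; not the paper's method. All fields, distributions,
# and thresholds below are hypothetical.
import random
import statistics

def click_through_rate(sessions):
    """Fraction of instant search sessions with at least one click."""
    return sum(1 for s in sessions if s["clicked"]) / len(sessions)

def mean_time_to_success(sessions):
    """Hypothetical satisfaction proxy: mean seconds until a successful play."""
    return statistics.mean(s["time_to_success_s"] for s in sessions)

def simulate_sessions(n, click_p, mean_latency_s):
    """Generate synthetic sessions under assumed (made-up) distributions."""
    return [
        {"clicked": random.random() < click_p,
         "time_to_success_s": random.expovariate(1.0 / mean_latency_s)}
        for _ in range(n)
    ]

def detection_rate(metric_fn, lower_is_better, n_trials=100, n_sessions=500):
    """Fraction of simulated experiments in which the metric cleanly separates
    a control variant from a slightly better treatment variant."""
    detected = 0
    for _ in range(n_trials):
        control = simulate_sessions(n_sessions, click_p=0.60, mean_latency_s=6.0)
        treatment = simulate_sessions(n_sessions, click_p=0.61, mean_latency_s=5.0)
        wins = 0
        for _ in range(100):
            # Crude bootstrap: resample sessions with replacement and compare.
            c = metric_fn(random.choices(control, k=n_sessions))
            t = metric_fn(random.choices(treatment, k=n_sessions))
            wins += (t < c) if lower_is_better else (t > c)
        detected += wins >= 95
    return detected / n_trials

if __name__ == "__main__":
    print("CTR sensitivity:", detection_rate(click_through_rate, lower_is_better=False))
    print("Time-to-success sensitivity:", detection_rate(mean_time_to_success, lower_is_better=True))

Under these assumed numbers, the click-through rate rarely separates the two variants, while the time-to-success proxy does so far more often; this is only meant to illustrate what "more sensitive" means, not to reproduce the paper's results.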

Related

May 2023 | TheWebConf

Improving Content Retrievability in Search with Controllable Query Generation

Gustavo Penha, Enrico Palumbo, Maryam Aziz, Alice Wang, and Hugues Bouchard

March 2023 | Frontiers in Big Data: Recommender Systems

A Survey on Multi-objective Recommender Systems

Dietmar Jannach and Himan Abdollahpouri

March 2023 | Intelligent User Interfaces (IUI)

Enabling Goal-Focused Exploration of Podcasts in Interactive Recommender Systems

Yu Liang, Aditya Ponnada, Paul Lamere, and Nediyana Daskalova