Developing Evaluation Metrics for Instant Search Using Mixed Methods

Abstract

Instant search has become a popular search paradigm in which users are shown a new result page in response to every keystroke. In recent years, the paradigm has been widely adopted in several domains, including personal email search, e-commerce, and music search. However, the evaluation of such systems, and the metrics used to do so, has received comparatively little attention in the literature. In this work, we describe a mixed methods approach to understanding user expectations and evaluating an instant search system in the context of music search. Our methodology involves conducting a set of user interviews to gain a qualitative understanding of users’ behaviors and expectations. The hypotheses from this user research are then extended and verified through a large-scale quantitative analysis of interaction logs. Using music search as a lens, we show that researchers and practitioners can interpret behavior logs more effectively when accompanied by insights from qualitative research. Further, we show that user research eliminates the guesswork involved in identifying user signals that estimate user satisfaction. Finally, we demonstrate that the metrics identified using our approach are more sensitive than the commonly used click-through rate metric for instant search.
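To make the click-through rate baseline mentioned above concrete, the sketch below shows one way a keystroke-level CTR could be computed from instant search interaction logs, where each keystroke produces a new result page. The log schema (`KeystrokeImpression` with `query_prefix`, `results_shown`, `clicked`) and the helper `keystroke_ctr` are hypothetical illustrations, not the paper's implementation or data model.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class KeystrokeImpression:
    """One result page shown in response to a single keystroke (hypothetical log record)."""
    query_prefix: str   # text typed so far, e.g. "bohem"
    results_shown: int  # number of results rendered for this prefix
    clicked: bool       # whether the user clicked any result on this page


def keystroke_ctr(impressions: List[KeystrokeImpression]) -> float:
    """Baseline click-through rate: fraction of keystroke-level result pages
    that received at least one click. In instant search, most prefixes are
    abandoned as the user keeps typing, which is one reason a raw CTR like
    this can be a blunt proxy for satisfaction."""
    if not impressions:
        return 0.0
    clicked_pages = sum(1 for imp in impressions if imp.clicked)
    return clicked_pages / len(impressions)


# Example session: the user types "bohem" one character at a time and
# clicks only on the final result page.
session = [
    KeystrokeImpression("b", 10, False),
    KeystrokeImpression("bo", 10, False),
    KeystrokeImpression("boh", 10, False),
    KeystrokeImpression("bohe", 10, False),
    KeystrokeImpression("bohem", 10, True),
]
print(f"keystroke-level CTR: {keystroke_ctr(session):.2f}")  # 0.20
```

As the example suggests, a single successful session still yields a low keystroke-level CTR, which is why a metric defined purely over per-keystroke result pages may be insensitive to differences in user satisfaction.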
