Personal search
Personal search is an information retrieval task in which the user has access only to their own private documents rather than the web. Personal search has many applications, such as email search, desktop search, and on-device search. Most click models for web search learn the probability of a click from large amounts of click data per query-document pair; since such data is not available per query-document pair in personal search, these models do not apply, which makes personal search a difficult and time-consuming scenario.
Situational Context
Situational context refers to the contextual features of the current search request that are independent of both the query content and the user's history. For example, situational context can depend on the time of the search request, the location, the device, and so on. The recent increase in the number of search requests from mobile devices has intensified the importance of situational context. Moreover, context information can help fill gaps in query understanding when the query is very short and ambiguous (as is common on mobile devices).
Deep Neural Network system for situational context based ranking in personal search
The authors of the paper propose two neural network models for this problem: the Context-Aware Wide and Deep Network Model (WDNN) and the Context-Aware Deep Network Model (Context-DNN).
The following figure gives an overview of the models.
Queries and documents are represented as lists of n-grams. As shown in the figure, embedding layers are applied on top of the document and the query to compute a dense representation for each of them. To incorporate contextual features, an embedding layer is also applied on top of each contextual feature. The reason for using embedding layers is that feature values which are similar to each other obtain close vector representations. For example, the representations of the weekend days should intuitively be close to each other and far from the weekday vectors.
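To make this embedding step concrete, here is a minimal sketch in TensorFlow/Keras. The vocabulary size, embedding dimensions, and the choice of day-of-week and device type as contextual features are illustrative assumptions, not values taken from the paper.

import tensorflow as tf

NGRAM_VOCAB = 100_000   # hypothetical n-gram vocabulary size
EMB_DIM = 64            # hypothetical embedding dimension

# Queries and documents enter as padded sequences of n-gram ids.
query_ids = tf.keras.Input(shape=(None,), dtype="int32", name="query_ngrams")
doc_ids = tf.keras.Input(shape=(None,), dtype="int32", name="doc_ngrams")

ngram_embedding = tf.keras.layers.Embedding(NGRAM_VOCAB, EMB_DIM, mask_zero=True)

# Average the n-gram embeddings to get one dense vector per query / document.
query_vec = tf.keras.layers.GlobalAveragePooling1D()(ngram_embedding(query_ids))
doc_vec = tf.keras.layers.GlobalAveragePooling1D()(ngram_embedding(doc_ids))

# Each contextual feature (e.g. day of week, device type) gets its own
# embedding table, so similar values can end up close in vector space.
day_id = tf.keras.Input(shape=(1,), dtype="int32", name="day_of_week")
device_id = tf.keras.Input(shape=(1,), dtype="int32", name="device_type")

day_vec = tf.keras.layers.Flatten()(tf.keras.layers.Embedding(7, 8)(day_id))
device_vec = tf.keras.layers.Flatten()(tf.keras.layers.Embedding(4, 8)(device_id))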
A non-linear hidden layer (contextual abstraction) is used to combine the contexts. The motivation is that a meaningful representation can be learned from a combination of contextual features. For example, weekends and holidays vary by country, so combining the location and time representations can lead to a more accurate contextual representation, which is then used in the final estimation neuron.
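Continuing the sketch above, the contextual abstraction can be written as a non-linear layer over the concatenated context embeddings, whose output is fed, together with the query and document vectors, into a final scoring neuron (roughly the Context-DNN variant). The layer sizes here are assumptions for illustration.

# Combine the per-feature context embeddings through a non-linear hidden
# layer (the contextual abstraction), then score with a single neuron.
context_concat = tf.keras.layers.Concatenate()([day_vec, device_vec])
context_abstraction = tf.keras.layers.Dense(32, activation="relu")(context_concat)

deep_input = tf.keras.layers.Concatenate()([query_vec, doc_vec, context_abstraction])
hidden = tf.keras.layers.Dense(64, activation="relu")(deep_input)
score = tf.keras.layers.Dense(1, name="context_dnn_score")(hidden)

context_dnn = tf.keras.Model(
    inputs=[query_ids, doc_ids, day_id, device_id], outputs=score
)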
Due to the high level of abstraction, some information might be over-generalized. Therefore, sparse features can be used alongside the deep representations of the features to model memorization (Cheng et al.). In WDNN, the raw contextual features are used in a binarized format, which is extremely sparse, for memorization, and this proves to be quite useful.
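The wide component could then be added roughly as follows: the binarized raw contextual features are concatenated with the last deep layer and fed into the scoring neuron, in the spirit of the wide & deep architecture of Cheng et al. The width of the binarized input is again an assumption (one-hot day of week plus one-hot device type).

# Sparse, binarized raw context features feed the final neuron directly,
# giving the model a memorization path next to the deep generalization path.
sparse_context = tf.keras.Input(shape=(7 + 4,), name="binarized_context")

wide_and_deep = tf.keras.layers.Concatenate()([hidden, sparse_context])
wdnn_score = tf.keras.layers.Dense(1, name="wdnn_score")(wide_and_deep)

context_wdnn = tf.keras.Model(
    inputs=[query_ids, doc_ids, day_id, device_id, sparse_context],
    outputs=wdnn_score,
)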
Conclusion
The authors evaluate the above models and report significant improvements over current state-of-the-art systems.
As the number of smart devices around us increases and these devices become even more personal, the information they provide can significantly improve the search experience. Since this is a relatively unexplored area, there is much scope for further study and improvement.
References
Situational Context for Ranking in Personal Search