
4 - Application of Embedding Technology in Recommender Systems

Published online by Cambridge University Press:  08 May 2025


Summary

Embedding technology plays a pivotal role in deep learning, particularly in industries such as recommendation, advertising, and search. It is considered a fundamental operation for transforming sparse vectors into dense representations that can be further processed by neural networks. Beyond its basic role, embedding technology has evolved significantly in both academia and industry, with applications ranging from sequence processing to multifeature heterogeneous data. This chapter discusses the basics of embedding, its evolution from Word2Vec to graph embeddings and multifeature fusion, and its applications in recommender systems, with an emphasis on online deployment and inference.
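The core operation described above, mapping a sparse (one-hot) ID to a dense vector via a learned table, can be sketched as follows. This is a minimal illustration with made-up sizes and a random table standing in for learned parameters; it is not drawn from the chapter itself.

```python
import numpy as np

# Illustrative sizes; in a real recommender these would be, e.g.,
# millions of item IDs and an embedding dimension of 32-256.
vocab_size, embed_dim = 10, 4

# Stand-in for a learned embedding table (one row per ID).
rng = np.random.default_rng(0)
embedding_table = rng.normal(size=(vocab_size, embed_dim))

def embed(item_id: int) -> np.ndarray:
    """Row lookup: equivalent to one_hot(item_id) @ embedding_table,
    but without materializing the sparse vector."""
    return embedding_table[item_id]

# The dense lookup matches the sparse matrix product.
one_hot = np.zeros(vocab_size)
one_hot[3] = 1.0
assert np.allclose(embed(3), one_hot @ embedding_table)
```

The equivalence checked at the end is why embedding layers are implemented as lookups rather than matrix multiplications: the one-hot product would touch the whole table to recover a single row.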

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2025


References

Mikolov, Tomas, et al. Distributed representations of words and phrases and their compositionality. Advances in Neural Information Processing Systems, 26, 2013.
Mikolov, Tomas, et al. Efficient estimation of word representations in vector space. arXiv preprint arXiv:1301.3781, 2013.
Rong, Xin. word2vec parameter learning explained. arXiv preprint arXiv:1411.2738, 2014.
Goldberg, Yoav, Levy, Omer. word2vec explained: Deriving Mikolov et al.'s negative-sampling word-embedding method. arXiv preprint arXiv:1402.3722, 2014.
Bengio, Yoshua, et al. A neural probabilistic language model. Journal of Machine Learning Research, 3, 2003: 1137–1155.
Barkan, Oren, Koenigstein, Noam. Item2vec: Neural item embedding for collaborative filtering. 2016 IEEE 26th International Workshop on Machine Learning for Signal Processing (MLSP), Salerno, Italy, September 13–16, 2016.
Perozzi, Bryan, Al-Rfou, Rami, Skiena, Steven. DeepWalk: Online learning of social representations. Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, New York, USA, August 24–27, 2014.
Grover, Aditya, Leskovec, Jure. node2vec: Scalable feature learning for networks. Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, USA, August 13–17, 2016.
Wang, Jizhe, et al. Billion-scale commodity embedding for e-commerce recommendation in Alibaba. Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, London, UK, August 19–23, 2018.
Tang, Jian, et al. LINE: Large-scale information network embedding. Proceedings of the 24th International Conference on World Wide Web, Florence, Italy, May 18–22, 2015.
Wang, Daixin, Cui, Peng, Zhu, Wenwu. Structural deep network embedding. Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, USA, August 13–17, 2016.
Slaney, Malcolm, Casey, Michael. Locality-sensitive hashing for finding nearest neighbors [lecture notes]. IEEE Signal Processing Magazine, 25(2), 2008: 128–131.
