
3 - Top of the Tide

Application of Deep Learning in Recommender Systems

Published online by Cambridge University Press:  08 May 2025


Summary

The introduction in 2016 of advanced deep learning models such as Microsoft’s Deep Crossing, Google’s Wide & Deep, and others like FNN and PNN marked a significant shift in recommender systems and computational advertising, establishing deep learning as the dominant approach. This chapter traces the evolution from traditional recommendation models and highlights two main advances that deep learning brings: greater expressivity for uncovering hidden patterns in data, and flexible model structures that can be tailored to specific business use cases. Drawing on techniques from computer vision, speech recognition, and natural language processing, deep learning recommendation models have evolved rapidly. The chapter summarizes several influential deep learning models, selected for their industry impact and their role in advancing deep learning recommender systems, and organizes them into an evolution map. It also introduces applications of large language models (LLMs) in recommender systems, exploring how these models further enhance recommendation technology.
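To make the notion of a flexible model structure concrete, the following is a minimal sketch of a Wide & Deep style model in PyTorch: a linear "wide" component that memorizes sparse feature interactions, combined with an embedding-plus-MLP "deep" component that generalizes to unseen patterns. All names, layer sizes, and the simple averaging of field embeddings are illustrative assumptions, not details taken from this chapter or from the original paper.

import torch
import torch.nn as nn

class WideAndDeep(nn.Module):
    """Illustrative Wide & Deep style model (hypothetical sizes, for sketching only)."""
    def __init__(self, num_wide_features, num_ids, embed_dim=16):
        super().__init__()
        # Wide component: a linear model over raw/cross-product sparse features (memorization).
        self.wide = nn.Linear(num_wide_features, 1)
        # Deep component: learned embeddings of categorical IDs fed through an MLP (generalization).
        self.embedding = nn.Embedding(num_ids, embed_dim)
        self.deep = nn.Sequential(
            nn.Linear(embed_dim, 64), nn.ReLU(),
            nn.Linear(64, 32), nn.ReLU(),
            nn.Linear(32, 1),
        )

    def forward(self, wide_x, cat_ids):
        # Average the field embeddings (a simplification), sum the two branches,
        # and map the logit to a predicted click-through probability.
        deep_x = self.embedding(cat_ids).mean(dim=1)
        logit = self.wide(wide_x) + self.deep(deep_x)
        return torch.sigmoid(logit)

# Hypothetical usage on random inputs: 4 examples, 10 dense wide features, 5 categorical fields.
model = WideAndDeep(num_wide_features=10, num_ids=1000)
p_click = model(torch.randn(4, 10), torch.randint(0, 1000, (4, 5)))

Joint training of the two branches is the point of the design: the wide part captures specific co-occurrences seen in the logs, while the deep part generalizes through shared embeddings.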

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2025


References

Sedhain, Suvash, et al. AutoRec: Autoencoders meet collaborative filtering. Proceedings of the 24th International Conference on World Wide Web, May 18, 2015 (pp. 111–112).
Shan, Ying, et al. Deep Crossing: Web-scale modeling without manually crafted combinatorial features. Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, August 13, 2016 (pp. 255–262).
He, Kaiming, et al. Deep residual learning for image recognition. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, June 27–30, 2016 (pp. 770–778).
He, Xiangnan, et al. Neural collaborative filtering. Proceedings of the 26th International Conference on World Wide Web. International World Wide Web Conferences Steering Committee, April 3, 2017 (pp. 173–182).
Qu, Yanru, et al. Product-based neural networks for user response prediction. IEEE 16th International Conference on Data Mining (ICDM), December 12, 2016 (pp. 1149–1154).
Cheng, Heng-Tze, et al. Wide & deep learning for recommender systems. Proceedings of the 1st Workshop on Deep Learning for Recommender Systems, September 15, 2016 (pp. 7–10).
Wang, Ruoxi, et al. Deep & cross network for ad click predictions. Proceedings of the ADKDD’17, August 14, 2017 (pp. 1–7).
Zhang, Weinan, Du, Tianming, Wang, Jun. Deep learning over multi-field categorical data – a case study on user response prediction. Advances in Information Retrieval: 38th European Conference on Information Retrieval, March 20–23, 2016 (pp. 45–57).
Guo, Huifeng, et al. DeepFM: A factorization-machine based neural network for CTR prediction. arXiv preprint arXiv:1703.04247 (2017).
He, Xiangnan, Chua, Tat-Seng. Neural factorization machines for sparse predictive analytics. Proceedings of the 40th International ACM SIGIR Conference on Research and Development in Information Retrieval, August 7, 2017 (pp. 355–364).
Xiao, Jun, et al. Attentional factorization machines: Learning the weight of feature interactions via attention networks. arXiv preprint arXiv:1708.04617 (2017).
Zhou, Guorui, et al. Deep interest network for click-through rate prediction. Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, July 19, 2018 (pp. 1059–1068).
Zhou, Guorui, et al. Deep interest evolution network for click-through rate prediction. Proceedings of the AAAI Conference on Artificial Intelligence, 33(1), 2019: 5941–5948.
Zheng, Guanjie, et al. DRN: A deep reinforcement learning framework for news recommendation. Proceedings of the 2018 World Wide Web Conference. International World Wide Web Conferences Steering Committee, April 23, 2018 (pp. 167–176).
Devlin, Jacob, et al. BERT: Pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805 (2018).
Sun, Fei, et al. BERT4Rec: Sequential recommendation with bidirectional encoder representations from Transformer. Proceedings of the 28th ACM International Conference on Information and Knowledge Management, November 3, 2019 (pp. 1441–1450).
Zhang, Qi, et al. UNBERT: User-news matching BERT for news recommendation. IJCAI, 21, 2021: 3356–3362.
Vaswani, Ashish, et al. Attention is all you need. Advances in Neural Information Processing Systems, 30, 2017.
Lin, Jianghao, et al. How can recommender systems benefit from large language models: A survey. arXiv preprint arXiv:2306.05817 (2023).
Mysore, Sheshera, McCallum, Andrew, Zamani, Hamed. Large language model augmented narrative driven recommendations. Proceedings of the 17th ACM Conference on Recommender Systems, September 14, 2023 (pp. 777–783).
Kang, Wang-Cheng, et al. Do LLMs understand user preferences? Evaluating LLMs on user rating prediction. arXiv preprint arXiv:2305.06474 (2023).
Gao, Yunfan, et al. Chat-REC: Towards interactive and explainable LLMs-augmented recommender system. arXiv preprint arXiv:2303.14524 (2023).
