Applications of Transformers in Recommendation Models: Capturing Long-Term Dependencies and Optimization Strategies

Authors

  • Abhishek Kumar

Keywords

Transformer Architectures, Self-Attention Mechanisms, Sequential Recommendation, Multimodal Content Understanding, Cross-Domain Transfer Learning

Abstract

The Transformer architecture has reshaped recommendation systems by dismantling the underlying constraints on modelling long-term dependencies and intricate behavioural patterns. Its self-attention mechanism permits dynamic weighting of past interactions according to contextual similarity rather than temporal closeness, enabling the model to capture long-term dependencies across a user's interaction history.
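The abstract's central claim, that attention weights past interactions by contextual similarity rather than recency, can be illustrated with a minimal sketch of scaled dot-product self-attention (Vaswani et al., 2017; see References). Everything below is hypothetical: random matrices stand in for learned projections, the dimensions are arbitrary, and this is not the model evaluated in the paper.

```python
# Illustrative sketch only: a single-head self-attention pass over a user's
# interaction history. Random weights stand in for learned projections.
import numpy as np

def self_attention(history: np.ndarray, d_k: int = 64) -> np.ndarray:
    """Weight past interactions by contextual similarity, not recency.

    history: (seq_len, d_model) embeddings of a user's past interactions.
    Returns contextualised embeddings of the same shape.
    """
    rng = np.random.default_rng(0)
    d_model = history.shape[1]
    # Hypothetical "learned" projections (random here for illustration).
    W_q = rng.normal(size=(d_model, d_k))
    W_k = rng.normal(size=(d_model, d_k))
    W_v = rng.normal(size=(d_model, d_model))

    Q, K, V = history @ W_q, history @ W_k, history @ W_v
    # Scaled dot-product scores: pairwise similarity between interactions,
    # independent of how far apart in time they occurred.
    scores = Q @ K.T / np.sqrt(d_k)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    # Each position becomes a similarity-weighted mix of all interactions.
    return weights @ V

# Example: 10 past interactions, 32-dimensional embeddings.
contextual = self_attention(np.random.default_rng(1).normal(size=(10, 32)))
print(contextual.shape)  # (10, 32)
```

Because the attention scores depend only on embedding similarity, an interaction from months ago can receive a larger weight than yesterday's click, which is how long-term dependencies survive the aggregation.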

References

A. Vaswani et al., "Attention is all you need," arXiv preprint arXiv:1706.03762, 2017. [Online]. Available: https://arxiv.org/abs/1706.03762

Published

2026-01-05

How to Cite

Abhishek Kumar. (2026). Applications of Transformers in Recommendation Models: Capturing Long-Term Dependencies and Optimization Strategies. Journal of Computational Analysis and Applications (JoCAAA), 35(1), 38–47. Retrieved from https://eudoxuspress.com/index.php/pub/article/view/4607

Issue

Vol. 35 No. 1 (2026)

Section

Articles