Abstract: Traditional knowledge-aware propagation recommendation algorithms face several challenges: weak correlation among higher-order features, unbalanced utilization of information sources, and the introduction of noise. To address these challenges, this study proposes a knowledge-enhanced, multi-level contrastive learning algorithm for knowledge-aware propagation recommendation (MCLK-KE). By constructing enhanced views and applying mask-reconstruction-based self-supervised pre-training, the algorithm extracts deeper information from key triples and effectively suppresses noise signals. It balances the utilization of knowledge and interaction signals while strengthening feature representations by contrasting graph views to capture informative node attributes globally. Multi-task training, which combines recommendation prediction, contrastive learning, and mask reconstruction, further improves model performance. In experiments on three publicly available datasets, MCLK-KE achieves up to a 3.3% increase in AUC and a 5.3% increase in F1 score over the best baseline model.
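To make the multi-task training described above concrete, the snippet below gives a minimal sketch of how the three objectives might be combined. It is an illustrative assumption, not the paper's implementation: it assumes a binary cross-entropy recommendation loss, an InfoNCE contrastive loss between two graph views, and an MSE loss for mask reconstruction, summed with hypothetical weights `lam_cl` and `lam_mask`; all function names and tensor shapes are placeholders.

```python
import torch
import torch.nn.functional as F

def infonce_loss(z1, z2, temperature=0.2):
    """InfoNCE contrastive loss between node embeddings of two views.
    Matching rows of z1 and z2 are positive pairs; all other rows serve
    as in-batch negatives."""
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature        # (N, N) cosine-similarity matrix
    labels = torch.arange(z1.size(0))         # row i is the positive for column i
    return F.cross_entropy(logits, labels)

def multi_task_loss(rec_scores, rec_labels, view_a, view_b,
                    recon_emb, target_emb, lam_cl=0.1, lam_mask=0.1):
    """Weighted sum of recommendation, contrastive, and mask-reconstruction
    losses (weights lam_cl / lam_mask are illustrative hyperparameters)."""
    l_rec = F.binary_cross_entropy_with_logits(rec_scores, rec_labels)
    l_cl = infonce_loss(view_a, view_b)
    l_mask = F.mse_loss(recon_emb, target_emb)  # reconstruct masked triple embeddings
    return l_rec + lam_cl * l_cl + lam_mask * l_mask

# Toy example: 8 user-item interactions, 16 graph nodes, 32-dim embeddings.
loss = multi_task_loss(
    rec_scores=torch.randn(8), rec_labels=torch.randint(0, 2, (8,)).float(),
    view_a=torch.randn(16, 32), view_b=torch.randn(16, 32),
    recon_emb=torch.randn(16, 32), target_emb=torch.randn(16, 32),
)
print(loss.item())
```

In this sketch, the contrastive term encourages agreement between the two enhanced views, the reconstruction term forces the model to recover masked triple information, and the recommendation term anchors both auxiliary signals to the prediction task.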