[1] 任磊, 杜一, 马帅, 等. 大数据可视分析综述[J]. 软件学报, 2014, 25(9): 1909-1936.
REN L, DU Y, MA S, et al. Visual analytics towards big data[J]. Journal of Software, 2014, 25(9): 1909-1936.
[2] BORSBOOM D, DESERNO M K, RHEMTULLA M, et al. Network analysis of multivariate data in psychological science[EB/OL]. (2021-08-19)[2022-03-16]. https://doi.org/10.1038/s43586-021-00055-w.
[3] ZHOU J Y, LIU L, WEI W Q, et al. Network representation learning: from preprocessing, feature extraction to node embedding[J]. ACM Computing Surveys, 2023, 55(2): 1-35.
[4] CUI P, WANG X, PEI J, et al. A survey on network embedding[J]. IEEE Transactions on Knowledge and Data Engineering, 2019, 31(5): 833-852.
[5] 涂存超, 杨成, 刘知远, 等. 网络表示学习综述[J]. 中国科学: 信息科学, 2017, 47(8): 980-996.
TU C C, YANG C, LIU Z Y, et al. Network representation learning: an overview[J]. Scientia Sinica Informationis, 2017, 47(8): 980-996.
[6] 高岳林, 杨钦文, 王晓峰, 等. 新型群体智能优化算法综述[J]. 郑州大学学报(工学版), 2022, 43(3): 21-30.
GAO Y L, YANG Q W, WANG X F, et al. Overview of new swarm intelligent optimization algorithms[J]. Journal of Zhengzhou University (Engineering Science), 2022, 43(3): 21-30.
[7] 郑建兴, 郭彤彤, 申利华, 等. 基于评论文本情感注意力的推荐方法研究[J]. 郑州大学学报(工学版), 2022, 43(2): 44-50, 57.
ZHENG J X, GUO T T, SHEN L H, et al. Research on recommendation method based on sentimental attention of review text[J]. Journal of Zhengzhou University (Engineering Science), 2022, 43(2): 44-50, 57.
[8] 程苏琦, 沈华伟, 张国清, 等. 符号网络研究综述[J]. 软件学报, 2014, 25(1): 1-15.
CHENG S Q, SHEN H W, ZHANG G Q, et al. Survey of signed network research[J]. Journal of Software, 2014, 25(1): 1-15.
[9] CHEN J, ZHONG M, LI J, et al. Effective deep attributed network representation learning with topology adapted smoothing[J]. IEEE Transactions on Cybernetics, 2022, 52(7): 5935-5946.
[10] WU Z H, PAN S R, CHEN F W, et al. A comprehensive survey on graph neural networks[J]. IEEE Transactions on Neural Networks and Learning Systems, 2021, 32(1): 4-24.
[11] ZHANG Z W, CUI P, ZHU W W. Deep learning on graphs: a survey[J]. IEEE Transactions on Knowledge and Data Engineering, 2022, 34(1): 249-270.
[12] HEIDER F. Attitudes and cognitive organization[J]. The Journal of Psychology, 1946, 21(1): 107-112.
[13] 刘苗苗, 扈庆翠, 郭景峰, 等. 符号网络链接预测算法研究综述[J]. 计算机科学, 2020, 47(2): 21-30.
LIU M M, HU Q C, GUO J F, et al. Survey of link prediction algorithms in signed networks[J]. Computer Science, 2020, 47(2): 21-30.
[14] ANCHURI P, MAGDON-ISMAIL M. Communities and balance in signed networks: a spectral approach[C]//2012 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining. Piscataway: IEEE, 2012: 235-242.
[15] LESKOVEC J, HUTTENLOCHER D, KLEINBERG J. Predicting positive and negative links in online social networks[C]//Proceedings of the 19th International Conference on World Wide Web. New York: ACM, 2010: 641-650.
[16] PEROZZI B, AL-RFOU R, SKIENA S. DeepWalk: online learning of social representations[C]//Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York: ACM, 2014: 701-710.
[17] WANG S H, TANG J L, AGGARWAL C, et al. Signed network embedding in social media[C]//Proceedings of the 2017 SIAM International Conference on Data Mining. Philadelphia: Society for Industrial and Applied Mathematics, 2017: 327-335.
[18] DERR T, MA Y, TANG J L. Signed graph convolutional networks[C]//2018 IEEE International Conference on Data Mining. Piscataway: IEEE, 2018: 929-934.
[19] LI Y, TIAN Y, ZHANG J W, et al. Learning signed network embedding via graph attention[J]. Proceedings of the AAAI Conference on Artificial Intelligence, 2020, 34(4): 4772-4779.
[20] HUANG J J, SHEN H W, HOU L, et al. Signed graph attention networks[C]//Artificial Neural Networks and Machine Learning - ICANN 2019: Workshop and Special Sessions. Cham: Springer, 2019: 566-577.
[21] HUANG J J, SHEN H W, HOU L, et al. SDGNN: learning node representation for signed directed networks[J]. Proceedings of the AAAI Conference on Artificial Intelligence, 2021, 35(1): 196-203.