Abstract: Graph pooling methods have been widely applied in bioinformatics, chemistry, social networks, recommendation systems, and other fields, yet most existing graph pooling methods neither handle node selection well nor address the loss of node information that pooling causes. To address these problems, a new multilevel-union graph pooling (MUPool) method is proposed. The method uses a multi-view module to capture node features from multiple perspectives, i.e., it extracts distinct features through multiple convolution modules. A multilevel union (cascade) module is also proposed to concatenate the outputs of different pooling layers, so that each layer can fuse the information of all preceding layers. In addition, a late-fusion module builds one classifier per pooling layer and fuses their predictions to obtain the final classification result. Experiments on multiple datasets show that the proposed method improves accuracy by 1.62% on average; it can also be combined with existing hierarchical pooling methods, and the combined methods improve accuracy by 2.45% on average.
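The multilevel-union and late-fusion ideas described in the abstract can be sketched as follows. This is a minimal illustration only, not the authors' implementation: the layer outputs and linear classifiers are random stand-ins, and every name in it is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical graph-level readout vectors produced by three successive
# pooling layers (each summarizing an increasingly coarse graph).
layer_outputs = [rng.standard_normal(8) for _ in range(3)]

# Multilevel union (cascade): the representation fed to classifier k
# concatenates the outputs of pooling layers 1..k, so every stage keeps
# the information of all preceding layers.
union_reps = [np.concatenate(layer_outputs[: k + 1]) for k in range(3)]

# Late fusion: one (here untrained) linear classifier per pooling level;
# the final prediction averages the per-level class scores.
n_classes = 2
classifiers = [rng.standard_normal((n_classes, rep.size)) for rep in union_reps]
logits = [W @ rep for W, rep in zip(classifiers, union_reps)]
fused = np.mean(logits, axis=0)
prediction = int(np.argmax(fused))
```

In a trained model the classifiers would be learned jointly and the fusion could be weighted, but the data flow — concatenate all earlier pooling outputs, classify at every level, average the scores — is the mechanism the abstract describes.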
Table 1. Datasets and parameters
Table 2. Accuracy (%) of MUPool and comparison methods on five datasets

Method          IMDB-B[40]     IMDB-M[40]     PROTEINS[41]   D&D[42]        Mutagenicity[43]
Set2Set[29]     72.90 ± 0.75   50.19 ± 0.39   71.46 ± 2.17   71.94 ± 0.56   77.69 ± 0.55
SortPool[30]    70.03 ± 0.86   47.83 ± 0.85   75.54 ± 0.94   79.37 ± 0.97   —
DiffPool[10]    73.14 ± 0.70   51.31 ± 0.72   76.25 ± 0.88   80.64 ± 0.72   80.44 ± 0.82
TopKPool[11]    71.58 ± 0.95   48.59 ± 0.72   77.68 ± 2.23   82.43 ± 0.55   79.14 ± 0.76
SAGPool[13]     —              —              71.86 ± 0.97   76.45 ± 0.82   79.18 ± 0.82
MinCutPool[43]  72.65 ± 0.75   51.04 ± 0.70   76.05 ± 2.60   80.08 ± 2.30   —
ASAP[15]        72.81 ± 0.50   50.78 ± 0.75   74.19 ± 0.97   76.87 ± 0.70   80.12 ± 0.88
EdgePool[34]    72.46 ± 0.74   50.79 ± 0.59   72.50 ± 3.20   75.85 ± 0.58   —
CGIPool[37]     72.40 ± 0.87   51.45 ± 0.65   74.10 ± 2.31   73.11 ± 0.93   80.65 ± 0.79
GMT[44]         73.48 ± 0.76   50.66 ± 0.82   75.09 ± 0.59   78.72 ± 0.59   —
MUPool          76.05 ± 1.31   53.17 ± 0.62   81.65 ± 0.81   86.60 ± 0.73   81.55 ± 1.16

Table 3. Accuracy (%) of the MUPool method after removing individual modules

Table 4. Accuracy (%) of the original hierarchical pooling methods compared with the same methods after inserting the M&L module
[1] SIMONYAN K, ZISSERMAN A. Very deep convolutional networks for large-scale image recognition[C]//Proceedings of the 3rd International Conference on Learning Representations. San Diego: ICLR, 2015.
[2] HUANG G, LIU Z, VAN DER MAATEN L, et al. Densely connected convolutional networks[C]//Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition. Piscataway: IEEE Press, 2017: 2261-2269.
[3] ZHANG X, ZHAO J B, LECUN Y. Character-level convolutional networks for text classification[C]//Proceedings of the 28th International Conference on Neural Information Processing Systems. New York: ACM, 2015: 649-657.
[4] KIPF T N, WELLING M. Semi-supervised classification with graph convolutional networks[EB/OL]. (2017-02-22) [2022-01-06]. http://arxiv.org/abs/1609.02907.
[5] HAMILTON W L, YING R, LESKOVEC J. Inductive representation learning on large graphs[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems. New York: ACM, 2017: 1025-1035.
[6] NAMAZI R, GHALEBI E, WILLIAMSON S, et al. SMGRL: A scalable multi-resolution graph representation learning framework[EB/OL]. (2022-01-29) [2022-02-02]. https://arxiv.org/abs/2201.12670.
[7] DEFFERRARD M, BRESSON X, VANDERGHEYNST P. Convolutional neural networks on graphs with fast localized spectral filtering[C]//Proceedings of the 30th International Conference on Neural Information Processing Systems. New York: ACM, 2016: 3844-3852.
[8] RHEE S, SEO S, KIM S. Hybrid approach of relation network and localized graph convolutional filtering for breast cancer subtype classification[C]//Proceedings of the 27th International Joint Conference on Artificial Intelligence. California: International Joint Conferences on Artificial Intelligence Organization, 2018: 3527-3534.
[9] MA Y, WANG S H, AGGARWAL C C, et al. Graph convolutional networks with EigenPooling[C]//Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. New York: ACM, 2019: 723-731.
[10] YING R, YOU J X, MORRIS C, et al. Hierarchical graph representation learning with differentiable pooling[C]//Proceedings of the 32nd International Conference on Neural Information Processing Systems. New York: ACM, 2018: 4805-4815.
[11] GAO H Y, JI S W. Graph U-Nets[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2022: 4948-4960.
[12] CANGEA C, VELIČKOVIĆ P, JOVANOVIĆ N, et al. Towards sparse hierarchical graph classifiers[EB/OL]. (2018-11-03) [2022-02-03]. https://doi.org/10.48550/arXiv.1811.01287.
[13] LEE J, LEE I, KANG J. Self-attention graph pooling[C]//Proceedings of the International Conference on Machine Learning. New York: ACM, 2019: 3734-3743.
[14] DIEHL F, BRUNNER T, LE M T, et al. Towards graph pooling by edge contraction[C]//Proceedings of the International Conference on Machine Learning. New York: ACM, 2019.
[15] RANJAN E, SANYAL S, TALUKDAR P. ASAP: Adaptive structure aware pooling for learning hierarchical graph representations[C]//Proceedings of the AAAI Conference on Artificial Intelligence. Washington, D.C.: AAAI, 2020, 34(4): 5470-5477.
[16] BRUNA J, ZAREMBA W, SZLAM A, et al. Spectral networks and locally connected networks on graphs[EB/OL]. (2014-03-21) [2022-02-09]. https://arxiv.org/abs/1312.6203v3.
[17] CHUNG F R K, GRAHAM F C. Spectral graph theory[M]. Providence, R.I.: American Mathematical Society, 1997.
[18] XU B, SHEN H, CAO Q, et al. Graph wavelet neural network[C]//Proceedings of the 7th International Conference on Learning Representations. New Orleans: ICLR, 2019.
[19] LI M, MA Z, WANG Y G, et al. Fast Haar transforms for graph neural networks[J]. Neural Networks, 2020, 128: 188-198. doi: 10.1016/j.neunet.2020.04.028
[20] MA X, WU G, KIM W H. Multi-resolution graph neural network for identifying disease-specific variations in brain connectivity[EB/OL]. (2019-12-03) [2022-02-03]. https://arxiv.org/abs/1912.01181.
[21] WANG Y G, LI M, MA Z, et al. Haar graph pooling[C]//Proceedings of the 37th International Conference on Machine Learning. New York: ACM, 2020: 9952-9962.
[22] ZHENG X, ZHOU B, WANG Y G, et al. Decimated framelet system on graphs and fast g-framelet transforms[J]. Journal of Machine Learning Research, 2022, 23(18): 1-18.
[23] SCARSELLI F, GORI M, TSOI A C, et al. The graph neural network model[J]. IEEE Transactions on Neural Networks, 2009, 20(1): 61-80.
[24] LI Y, TARLOW D, BROCKSCHMIDT M, et al. Gated graph sequence neural networks[C]//Proceedings of the 4th International Conference on Learning Representations. San Juan: ICLR, 2016.
[25] GALLICCHIO C, MICHELI A. Fast and deep graph neural networks[C]//Proceedings of the AAAI Conference on Artificial Intelligence. Washington, D.C.: AAAI, 2020, 34(4): 3898-3905.
[26] CHEN W R, LUO C Y, WANG S K, et al. Representation learning with complete semantic description of knowledge graphs[C]//Proceedings of the 2017 International Conference on Machine Learning and Cybernetics. Piscataway: IEEE Press, 2017: 143-149.
[27] GAO H Y, LIU Y, JI S W. Topology-aware graph pooling networks[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021, 43(12): 4512-4518.
[28] ATWOOD J, TOWSLEY D. Diffusion-convolutional neural networks[C]//Proceedings of the 30th International Conference on Neural Information Processing Systems. New York: ACM, 2016: 2001-2009.
[29] VINYALS O, BENGIO S, KUDLUR M. Order matters: Sequence to sequence for sets[C]//Proceedings of the 4th International Conference on Learning Representations. San Juan: ICLR, 2016.
[30] ZHANG M, CUI Z, NEUMANN M, et al. An end-to-end deep learning architecture for graph classification[C]//Proceedings of the AAAI Conference on Artificial Intelligence. Menlo Park: AAAI, 2018: 4438-4445.
[31] ITOH T D, KUBO T, IKEDA K. Multi-level attention pooling for graph neural networks: Unifying graph representations with multiple localities[J]. Neural Networks, 2022, 145: 356-373.
[32] ZHANG Z, BU J J, ESTER M, et al. Hierarchical multi-view graph pooling with structure learning[J]. IEEE Transactions on Knowledge and Data Engineering, 2023, 35(1): 545-559.
[33] LIU N, JIAN S L, LI D S, et al. Hierarchical adaptive pooling by capturing high-order dependency for graph representation learning[J]. IEEE Transactions on Knowledge and Data Engineering, 2023, 35(4): 3952-3965.
[34] DIEHL F. Edge contraction pooling for graph neural networks[EB/OL]. (2019-05-27) [2022-03-09]. https://arxiv.org/abs/1905.10990.
[35] QIN J, LIU L, SHEN H, et al. Uniform pooling for graph networks[J]. Applied Sciences, 2020, 10(18): 6287-6301. doi: 10.3390/app10186287
[36] TANG H T, MA G X, HE L F, et al. CommPOOL: An interpretable graph pooling framework for hierarchical graph representation learning[J]. Neural Networks, 2021, 143: 669-677. doi: 10.1016/j.neunet.2021.07.028
[37] PANG Y S, ZHAO Y X, LI D S. Graph pooling via coarsened graph infomax[C]//Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval. New York: ACM, 2021: 2177-2181.
[38] SU Z D, HU Z H, LI Y D. Hierarchical graph representation learning with local capsule pooling[C]//ACM Multimedia Asia. New York: ACM, 2021.
[39] BACCIU D, CONTE A, GROSSI R, et al. K-plex cover pooling for graph neural networks[J]. Data Mining and Knowledge Discovery, 2021, 35(5): 2200-2220. doi: 10.1007/s10618-021-00779-z
[40] YANARDAG P, VISHWANATHAN S V N. A structural smoothing framework for robust graph comparison[C]//Proceedings of the 28th International Conference on Neural Information Processing Systems. New York: ACM, 2015: 2134-2142.
[41] FERAGEN A, KASENBURG N, PETERSEN J, et al. Scalable kernels for graphs with continuous attributes[C]//Proceedings of the 26th International Conference on Neural Information Processing Systems. New York: ACM, 2013: 216-224.
[42] DOBSON P D, DOIG A J. Distinguishing enzyme structures from non-enzymes without alignments[J]. Journal of Molecular Biology, 2003, 330(4): 771-783. doi: 10.1016/S0022-2836(03)00628-4
[43] KAZIUS J, MCGUIRE R, BURSI R. Derivation and validation of toxicophores for mutagenicity prediction[J]. Journal of Medicinal Chemistry, 2005, 48(1): 312-320. doi: 10.1021/jm040835a
[44] BAEK J, KANG M, HWANG S J. Accurate learning of graph representations with graph multiset pooling[C]//Proceedings of the 9th International Conference on Learning Representations. Vienna: ICLR, 2021.