Volume 50, Issue 2, February 2024
Citation: DONG X L, HUANG J, QIN F, et al. Graph pooling method based on multilevel union[J]. Journal of Beijing University of Aeronautics and Astronautics, 2024, 50(2): 559-568 (in Chinese). doi: 10.13700/j.bh.1001-5965.2022.0386

Graph pooling method based on multilevel union

doi: 10.13700/j.bh.1001-5965.2022.0386
Funds: National Natural Science Foundation of China (61806005); The University Synergy Innovation Program of Anhui Province (GXXT-2020-012); Natural Science Foundation of the Educational Commission of Anhui Province (KJ2021A0372, KJ2019A0064)
More Information
  • Corresponding author: E-mail: huangjun.cs@ahut.edu.cn
  • Received Date: 19 May 2022
  • Accepted Date: 23 Jun 2022
  • Available Online: 31 Oct 2022
  • Publish Date: 18 Oct 2022
  • Abstract: Graph pooling methods are widely used in bioinformatics, chemistry, social networks, recommendation systems, and other fields. However, existing graph pooling methods do not adequately address node selection or the loss of node information caused by pooling. A new graph pooling method based on multilevel union (MUPool) is proposed. The method uses a multi-view module to extract distinct features from several convolution modules, capturing node properties from different perspectives. A multilevel union module is further proposed to concatenate the outputs of different pooling layers, so that each layer fuses the information of all previous layers. A late fusion module then builds a classifier on each pooling layer and fuses their predictions to obtain the final classification result. The proposed method is evaluated on multiple data sets and improves accuracy by 1.62% on average; it can also be combined with existing hierarchical pooling methods, and the combined method improves accuracy by 2.45% on average.
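To make the three components described in the abstract concrete (multi-view convolution, multilevel union of pooling layers, and late fusion of per-level classifiers), the following is a minimal sketch in plain PyTorch using a dense adjacency matrix and a single unbatched graph. All names (MultiViewConv, topk_pool, MUPoolSketch) and the specific choices of node scoring, readout, and fusion are illustrative assumptions, not the authors' implementation.

```python
# A minimal, self-contained sketch of the pipeline described in the abstract,
# written in plain PyTorch with a dense adjacency matrix and a single
# (unbatched) graph. All names and design choices below are illustrative
# assumptions, not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiViewConv(nn.Module):
    """Multi-view module: two parallel 'views' of the graph, concatenated."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.w_neigh = nn.Linear(in_dim, out_dim)  # view 1: neighborhood aggregation
        self.w_self = nn.Linear(in_dim, out_dim)   # view 2: the node's own features

    def forward(self, x, adj):
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        neigh_view = self.w_neigh(adj @ x / deg)   # mean over neighbors
        self_view = self.w_self(x)
        return torch.cat([neigh_view, self_view], dim=-1)


def topk_pool(x, adj, ratio=0.5):
    """Keep the top-scoring nodes; a simple stand-in for the pooling operator."""
    scores = x.norm(dim=-1)
    k = max(1, int(ratio * x.size(0)))
    idx = scores.topk(k).indices
    return x[idx], adj[idx][:, idx]


class MUPoolSketch(nn.Module):
    def __init__(self, in_dim, hidden_dim, num_classes, num_layers=3):
        super().__init__()
        self.convs = nn.ModuleList()
        self.heads = nn.ModuleList()  # late fusion: one classifier per pooling level
        dim = in_dim
        for i in range(num_layers):
            self.convs.append(MultiViewConv(dim, hidden_dim))
            dim = 2 * hidden_dim
            # level i classifies the concatenation of its own readout
            # and the readouts of all previous levels (multilevel union)
            self.heads.append(nn.Linear(dim * (i + 1), num_classes))

    def forward(self, x, adj):
        readouts, logits = [], []
        for conv, head in zip(self.convs, self.heads):
            x = F.relu(conv(x, adj))
            x, adj = topk_pool(x, adj)
            readouts.append(x.mean(dim=0))       # graph-level readout of this level
            union = torch.cat(readouts, dim=-1)  # union of all levels so far
            logits.append(head(union))
        return torch.stack(logits).mean(dim=0)   # late fusion of per-level predictions


if __name__ == "__main__":
    x = torch.randn(10, 16)                      # 10 nodes, 16-dim features
    adj = (torch.rand(10, 10) > 0.7).float()
    adj = ((adj + adj.t()) > 0).float()          # symmetrize
    model = MUPoolSketch(in_dim=16, hidden_dim=32, num_classes=2)
    print(model(x, adj).shape)                   # torch.Size([2])
```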

     

  • [1]
    SIMONYAN K, ZISSERMAN A. Very deep convolutional networks for large scale image recognition[C]//Proceedings of the 3th International Conference on Learning Representations. San Diego: ICLR, 2015.
    [2]
    HUANG G, LIU Z, VAN DER MAATEN L, et al. Densely connected convolutional networks[C]//Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition. Piscataway: IEEE Press, 2017: 2261-2269.
    [3]
    ZHANG X, ZHAO J B, LECUN Y. Character-level convolutional networks for text classification[C]//Proceedings of the 28th International Conference on Neural Information Processing Systems. New York: ACM, 2015: 649-657.
    [4]
    KIPF T N, WELLING M. Semi-supervised classification with graph convolutional networks[EB/OL]. (2017-02-22) [2022-01-06]. http://arxiv.org/abs/1609.02907.
    [5]
    HAMILTON W L, YING R, LESKOVEC J. Inductive representation learning on large graphs[C]//Proceedings of the 31st International Conference on Neural Information Processing Systems. New York: ACM, 2017: 1025-1035.
    [6]
    NAMAZI R, GHALEBI E, WILLIAMSON S, et al. SMGRL: A scalable multi-resolution graph representation learning framework [EB/OL]. (2022-01-29) [2022-02-02]. https://arxiv.org/abs/2201.12670.
    [7]
    DEFFERRARD M, BRESSON X, VANDERGHEYNST P. Convolutional neural networks on graphs with fast localized spectral filtering[C]//Proceedings of the 30th International Conference on Neural Information Processing Systems. New York: ACM, 2016: 3844-3852.
    [8]
    RHEE S, SEO S, KIM S. Hybrid approach of relation network and localized graph convolutional filtering for breast cancer subtype classification[C]//Proceedings of the 27th International Joint Conference on Artificial Intelligence. California: International Joint Conferences on Artificial Intelligence Organization, 2018: 3527-3534.
    [9]
    MA Y, WANG S H, AGGARWAL C C, et al. Graph convolutional networks with EigenPooling[C]//Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. New York: ACM, 2019: 723-731.
    [10]
    YING R, YOU J X, MORRIS C, et al. Hierarchical graph representation learning with differentiable pooling[C]//Proceedings of the 32nd International Conference on Neural Information Processing Systems. New York: ACM, 2018: 4805-4815.
    [11]
    GAO H Y, JI S W. Graph u-nets[C]//IEEE Transactions on Pattern Analysis and Machine Intelligence. Piscataway: IEEE Press, 2022: 4948-4960.
    [12]
    CANGEA C, VELIČKOVIĆ P, JOVANOVIĆ N, et al. Towards sparse hierarchical graph classifiers[EB/OL]. (2018-11-03) [2022-02-03]. https://doi.org/10.48550/arXiv.1811.01287.
    [13]
    LEE J, LEE I, KANG J. Self-attention graph pooling[C]//Proceedings of the International Conference on Machine Learning. New York: ACM, 2019: 3734-3743.
    [14]
    DIEHL F, BRUNNER T, LE M T, et al. Towards graph pooling by edge contraction[C]//Proceedings of the International Conference on Machine Learning. New York: ACM, 2019.
    [15]
    RANJAN E, SANYAL S, TALUKDAR P. ASAP: Adaptive structure aware pooling for learning hierarchical graph representations[C]//Proceedings of the AAAI Conference on Artificial Intelligence. Washton, D.C.: AAAI, 2020, 34(4): 5470-5477.
    [16]
    BRUNA J, ZAREMBA W, SZLAM A, et al. Spectral networks and locally connected networks on graphs[EB/OL]. (2014-03-21) [2022-02-09]. https://arxiv.org/abs/1312.6203v3.
    [17]
    CHUNG F R K, GRAHAM F C. Spectral graph theory[M]. Providence, R.I.: American Mathematical Society, 1997.
    [18]
    XU B, SHEN H, CAO Q, et al. Graph wavelet neural network[C]//Proceedings of the 7th International Conference on Learning Representations. New Orleans: ICLR, 2019.
    [19]
    LI M, MA Z, WANG Y G, et al. Fast haar transforms for graph neural networks[J]. Neural Networks, 2020, 128: 188-198. doi: 10.1016/j.neunet.2020.04.028
    [20]
    MA X, WU G, KIM W H Multi-resolution graph neural network for identifying disease-specific variations in brain connectivity[EB/OL]. (2019-12-03) [2022-02-03]. https://arxiv.ORG/ABS/1912.01181.
    [21]
    WANG Y G, LI M, MA Z, et al. Haar graph pooling[C]//Proceedings of the 37th International Conference on Machine Learning. New York: ACM, 2020: 9952–9962.
    [22]
    ZHENG X, ZHOU B, WANG Y G, et al. Decimated framelet system on graphs and fast g-framelet transforms[J]. Journal of Machine Learning Research, 2022, 23(18): 1-18.
    [23]
    SCARSELLI F, GORI M, TSOI A C, et al. The graph neural network model[J]. IEEE Transactions on Neural Networks, 2009, 20(1): 61-80.
    [24]
    TARLOW Y D, BROCKSCHMIDT M, ZEMEL R. Gated graph sequence neural networks[C]//Proceedings of the 4th International Conference on Learning Representations. San Juan: ICLR, 2016.
    [25]
    GALLICCHIO C, MICHELI A. Fast and deep graph neural networks[C]//Proceedings of the AAAI Conference on Artificial Intelligence. Washton, D.C.: AAAI, 2020, 34(4): 3898-3905.
    [26]
    CHEN W R, LUO C Y, WANG S K, et al. Representation learning with complete semantic description of knowledge graphs[C]//Proceedings of the 2017 International Conference on Machine Learning and Cybernetics. Piscataway: IEEE Press, 2017: 143-149.
    [27]
    GAO H Y, LIU Y, JI S W. Topology-aware graph pooling networks[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021, 43(12): 4512-4518.
    [28]
    ATWOOD J, TOWSLEY D. Diffusion-convolutional neural networks[C]//Proceedings of the 30th International Conference on Neural Information Processing Systems. New York: ACM, 2016: 2001-2009.
    [29]
    VINYALS O, BENGIO S, KUDLUR M. Order matters: Sequence to sequence for sets[C]//Proceedings of the 4th International Conference on Learning Representations. San Juan: ICLR, 2016.
    [30]
    ZHANG M, CUI Z, NEUMANN M, et al. An end-to-end deep learning architecture for graph classification[C]//Proceedings of the AAAI Conference on Artificial Intelligence. Menlo Park: AAAI, 2018: 4438-4445.
    [31]
    ITOH T D, KUBO T, IKEDA K. Multi-level attention pooling for graph neural networks: Unifying graph representations with multiple localities[J]. Neural Networks, 2022, 145: 356-373.
    [32]
    ZHANG Z, BU J J, ESTER M, et al. Hierarchical multi-view graph pooling with structure learning[J]. IEEE Transactions on Knowledge and Data Engineering, 2023, 35(1): 545-559.
    [33]
    LIU N, JIAN S L, LI D S, et al. Hierarchical adaptive pooling by capturing high-order dependency for graph representation learning[J]. IEEE Transactions on Knowledge and Data Engineering, 2023, 35(4): 3952-3965.
    [34]
    DIEHL F. Edge contraction pooling for graph neural networks[EB/OL]. (2019-05-27) [2022-03-09]. https://arxiv.org/abs/1905.10990.
    [35]
    QIN J, LIU L, SHEN H, et al. Uniform pooling for graph networks[J]. Applied Sciences, 2020, 10(18): 6287-6301. doi: 10.3390/app10186287
    [36]
    TANG H T, MA G X, HE L F, et al. CommPOOL: An interpretable graph pooling framework for hierarchical graph representation learning[J]. Neural Networks, 2021, 143: 669-677. doi: 10.1016/j.neunet.2021.07.028
    [37]
    PANG Y S, ZHAO Y X, LI D S. Graph pooling via coarsened graph infomax[C]//Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval. New York: ACM, 2021: 2177-2181.
    [38]
    SU Z D, HU Z H, LI Y D. Hierarchical graph representation learning with local capsule pooling[C]//ACM Multimedia Asia. New York: ACM, 2021.
    [39]
    BACCIU D, CONTE A, GROSSI R, et al. K-plex cover pooling for graph neural networks[J]. Data Mining and Knowledge Discovery, 2021, 35(5): 2200-2220. doi: 10.1007/s10618-021-00779-z
    [40]
    YANARDAG P, VISHWANATHAN S V N. A structural smoothing framework for Robust graph-comparison[C]//Proceedings of the 28th International Conference on Neural Information Processing Systems. New York: ACM, 2015: 2134-2142.
    [41]
    FERAGEN A, KASENBURG N, PETERSEN J, et al. Scalable kernels for graphs with continuous attributes[C]//Proceedings of the 26th International Conference on Neural Information Processing Systems. New York: ACM, 2013: 216-224.
    [42]
    DOBSON P D, DOIG A J. Distinguishing enzyme structures from non-enzymes without alignments[J]. Journal of Molecular Biology, 2003, 330(4): 771-783. doi: 10.1016/S0022-2836(03)00628-4
    [43]
    KAZIUS J, MCGUIRE R, BURSI R. Derivation and validation of toxicophores for mutagenicity prediction[J]. Journal of Medicinal Chemistry, 2005, 48(1): 312-320. doi: 10.1021/jm040835a
    [44]
    BAEK J, KANG M, HWANG S J. Accurate learning of graph representations with graph multiset pooling[C]//Proceedings of the 9th International Conference on Learning Representations. Vienna: ICLR, 2021.
