TY - GEN
T1 - PrivIM: Differentially Private Graph Neural Networks for Influence Maximization
AU - Hou, Renxuan
AU - Ye, Qingqing
AU - Ran, Xun
AU - Zhang, Sen
AU - Hu, Haibo
N1 - Publisher Copyright:
© 2025 IEEE.
PY - 2025/5
Y1 - 2025/5
N2 - Influence Maximization (IM), aiming to identify a small set of highly influential nodes in social networks, is a critical problem in graph analysis. Recently, Graph Neural Networks (GNNs) have demonstrated superior effectiveness in addressing IM. However, a trained GNN still raises significant privacy concerns, as it may expose sensitive node features and structural information. While Differential Privacy (DP) techniques have been widely applied to GNNs for node-level tasks, they cannot be directly extended to IM problems. This is because IM requires more complex structural information for training, resulting in a far larger DP noise scale than node-level tasks. To tackle these issues, we propose PrivIM, a novel differentially private subgraph-based GNN framework for IM tasks, which ensures node-level DP guarantees. Within PrivIM, we design a unique dual-stage adaptive frequency sampling scheme to optimize model utility. First, it reduces the correlation between nodes by dynamically adjusting each node's sampling probability. Second, additional subgraphs are incorporated to supplement boundary structural information, enhancing utility without increasing the privacy budget. Extensive experiments on six real-world datasets demonstrate that PrivIM maintains high utility in IM compared to baseline methods.
AB - Influence Maximization (IM), aiming to identify a small set of highly influential nodes in social networks, is a critical problem in graph analysis. Recently, Graph Neural Networks (GNNs) have demonstrated superior effectiveness in addressing IM. However, a trained GNN still raises significant privacy concerns, as it may expose sensitive node features and structural information. While Differential Privacy (DP) techniques have been widely applied to GNNs for node-level tasks, they cannot be directly extended to IM problems. This is because IM requires more complex structural information for training, resulting in a far larger DP noise scale than node-level tasks. To tackle these issues, we propose PrivIM, a novel differentially private subgraph-based GNN framework for IM tasks, which ensures node-level DP guarantees. Within PrivIM, we design a unique dual-stage adaptive frequency sampling scheme to optimize model utility. First, it reduces the correlation between nodes by dynamically adjusting each node's sampling probability. Second, additional subgraphs are incorporated to supplement boundary structural information, enhancing utility without increasing the privacy budget. Extensive experiments on six real-world datasets demonstrate that PrivIM maintains high utility in IM compared to baseline methods.
KW - Differential privacy
KW - Graph neural networks
KW - Influence maximization
UR - https://www.scopus.com/pages/publications/105015460532
U2 - 10.1109/ICDE65448.2025.00259
DO - 10.1109/ICDE65448.2025.00259
M3 - Conference article published in proceeding or book
AN - SCOPUS:105015460532
T3 - Proceedings - International Conference on Data Engineering
SP - 3467
EP - 3479
BT - Proceedings - 2025 IEEE 41st International Conference on Data Engineering, ICDE 2025
PB - IEEE Computer Society
T2 - 41st IEEE International Conference on Data Engineering, ICDE 2025
Y2 - 19 May 2025 through 23 May 2025
ER -