An Analysis of Graph Neural Network Memory Access Patterns
Description
Graph Neural Networks (GNNs) are becoming increasingly popular for applying neural networks to graph data. However, as the size of the input graph grows, the GPU memory wall becomes a significant problem. Since current approaches to reducing the memory footprint, such as mini-batch training and memory-efficient tensor manipulations, have drawbacks, we instead expand the effective memory size using virtual memory technology. To overcome the data transfer overhead of virtual memory, in this paper we focus on analyzing the memory access patterns of GNNs with the goal of reducing the data transfer latency perceived by the user. A preliminary result of applying optimization techniques guided by our analysis shows a 40% reduction in the combined training and testing execution time.
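The abstract itself contains no code; the sketch below is only a rough illustration of the general approach it describes, namely expanding usable GPU memory with virtual memory technology and hiding transfer latency by issuing transfers ahead of the computation that needs the data. It is not the authors' implementation: the use of CUDA Unified Memory, the kernel name `aggregate`, and all sizes are assumptions chosen for the example.

```cuda
// Minimal sketch (assumption, not the paper's code): allocate feature data in
// CUDA Unified Memory so the allocation may exceed physical GPU memory, then
// prefetch the pages a kernel will read so the migration overlaps other work
// instead of being paid as on-demand page faults.
#include <cuda_runtime.h>
#include <cstdio>

// Placeholder kernel standing in for a GNN aggregation step.
__global__ void aggregate(const float *features, float *out, size_t n) {
    size_t i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = features[i] * 0.5f;
}

int main() {
    const size_t n = 1 << 24;  // placeholder feature count
    float *features, *out;

    // Managed (unified) memory: pages migrate between host and device on demand.
    cudaMallocManaged(&features, n * sizeof(float));
    cudaMallocManaged(&out, n * sizeof(float));
    for (size_t i = 0; i < n; ++i) features[i] = 1.0f;

    int device = 0;
    cudaGetDevice(&device);
    cudaStream_t stream;
    cudaStreamCreate(&stream);

    // Prefetch the data the next kernel will touch, so the host-to-device
    // transfer is started early and overlaps with other queued work.
    cudaMemPrefetchAsync(features, n * sizeof(float), device, stream);

    aggregate<<<(n + 255) / 256, 256, 0, stream>>>(features, out, n);
    cudaStreamSynchronize(stream);

    printf("out[0] = %f\n", out[0]);
    cudaFree(features);
    cudaFree(out);
    cudaStreamDestroy(stream);
    return 0;
}
```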
Event Type
Workshop
Time
Monday, 13 November 2023, 5:06pm - 5:30pm MST
Location
704-706
Tags
Artificial Intelligence/Machine Learning
Graph Algorithms and Frameworks
Registration Categories
W