
Learning Cortical Parcellations Using Graph Neural Networks

We examine the utility of graph neural networks for learning cortical segmentations. We show that attention-based transformer networks significantly outperform conventional GCN and linear feed-forward variants in generating accurate, reproducible cortical maps.

Jumping-Knowledge Representation Learning With LSTMs

Background

As I mentioned in my previous post on constrained graph attention networks, graph neural networks suffer from overfitting and oversmoothing as network depth increases. These issues can ultimately be linked to the local topologies of the graph.
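The jumping-knowledge scheme addresses this by letting each node aggregate its representations from all GNN layers rather than using only the final one; the LSTM variant scores the layers with a bidirectional LSTM plus attention. A minimal PyTorch sketch of that aggregation step (class name, dimensions, and the surrounding setup are illustrative assumptions, not taken from the post):

```python
import torch
import torch.nn as nn


class JKLSTMAggregator(nn.Module):
    """Jumping-knowledge aggregation over per-layer node embeddings.

    A bidirectional LSTM reads the sequence of layer-wise representations
    for each node, and a learned attention score over layers produces a
    per-node weighted sum. (Hypothetical sketch; not the post's code.)
    """

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.lstm = nn.LSTM(hidden_dim, hidden_dim,
                            bidirectional=True, batch_first=True)
        self.att = nn.Linear(2 * hidden_dim, 1)

    def forward(self, layer_outputs: list[torch.Tensor]) -> torch.Tensor:
        # layer_outputs: list of [num_nodes, hidden_dim] tensors,
        # one entry per GNN layer.
        h = torch.stack(layer_outputs, dim=1)   # [nodes, layers, hidden]
        scores, _ = self.lstm(h)                # [nodes, layers, 2*hidden]
        alpha = torch.softmax(self.att(scores).squeeze(-1), dim=-1)
        # Attention-weighted sum over layers -> [nodes, hidden]
        return (h * alpha.unsqueeze(-1)).sum(dim=1)


# Example: aggregate 3 layers of embeddings for 5 nodes.
agg = JKLSTMAggregator(hidden_dim=8)
layers = [torch.randn(5, 8) for _ in range(3)]
fused = agg(layers)
print(fused.shape)  # torch.Size([5, 8])
```

Because shallow layers capture local structure and deep layers capture global structure, letting the attention weights vary per node allows each node to pick the receptive-field size its local topology calls for, which is exactly the failure mode oversmoothing creates.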