PyTorch

Jumping-Knowledge Representation Learning With LSTMs

Background: As I mentioned in my previous post on constrained graph attention networks, graph neural networks suffer from overfitting and oversmoothing as network depth increases. These issues can ultimately be linked to the local topology of the graph.
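The Jumping-Knowledge idea (Xu et al., 2018) counters oversmoothing by letting each node aggregate its representations from every layer of the network rather than keeping only the last one, with an LSTM plus attention deciding how much each layer contributes. Below is a minimal PyTorch sketch of that LSTM aggregator; the module name and dimensions are illustrative, not the code from the post.

```python
import torch
import torch.nn as nn

class JumpingKnowledgeLSTM(nn.Module):
    """Aggregate per-layer node embeddings with a bidirectional LSTM
    and a learned attention over layers (after Xu et al., 2018)."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        # A bi-LSTM reads the sequence of layer-wise representations per node.
        self.lstm = nn.LSTM(hidden_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        # One attention score per layer, derived from the LSTM output.
        self.attn = nn.Linear(2 * hidden_dim, 1)

    def forward(self, layer_outputs):
        # layer_outputs: list of [num_nodes, hidden_dim] tensors, one per GNN layer.
        h = torch.stack(layer_outputs, dim=1)        # [num_nodes, num_layers, hidden_dim]
        lstm_out, _ = self.lstm(h)                   # [num_nodes, num_layers, 2*hidden_dim]
        scores = self.attn(lstm_out).squeeze(-1)     # [num_nodes, num_layers]
        alpha = torch.softmax(scores, dim=-1)        # per-node attention over layers
        # Weighted sum of the original layer representations.
        return (alpha.unsqueeze(-1) * h).sum(dim=1)  # [num_nodes, hidden_dim]
```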

Constrained Graph Attention Networks

In their recent paper, Wang et al. propose a few updates to the Graph Attention Network (GAT) algorithm (if you want to skip the technical bit and get to the code, click here). Briefly, GATs are a recently developed neural network architecture for learning from data distributed over a graph domain.
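For context, the attention mechanism at the heart of a GAT scores each neighbour of a node and then averages the transformed neighbour features under those scores. Here is a minimal single-head sketch in PyTorch, using a dense adjacency mask for readability; the layer and parameter names are my own, and Wang et al.'s constraints are not reproduced here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GATLayer(nn.Module):
    """Single-head graph attention layer (after Velickovic et al., 2018)."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        # The attention vector is split into source and target halves,
        # so e_ij = LeakyReLU(a_src^T h_i + a_dst^T h_j).
        self.a_src = nn.Linear(out_dim, 1, bias=False)
        self.a_dst = nn.Linear(out_dim, 1, bias=False)

    def forward(self, x, adj):
        # x: [N, in_dim]; adj: dense [N, N] 0/1 mask, assumed to include self-loops.
        h = self.W(x)                                           # [N, out_dim]
        e = F.leaky_relu(self.a_src(h) + self.a_dst(h).T, 0.2)  # [N, N] raw scores
        # Restrict the softmax to each node's neighbourhood.
        e = e.masked_fill(adj == 0, float('-inf'))
        alpha = torch.softmax(e, dim=-1)                        # attention weights
        return alpha @ h                                        # [N, out_dim]
```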

Cross-Entropy With Structure

As I mentioned in my previous post, I work with cortical surface segmentation data. Due to the biology of the human brain, there is considerable reproducible structure and function across individuals (thankfully!). One manifestation of this reproducibility is exemplified by the neocortex…
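The title suggests augmenting the usual cross-entropy objective with a term that exploits this spatial regularity. As a purely illustrative guess at what such a loss could look like (not the formulation from the post), one might add a smoothness penalty that discourages neighbouring vertices from receiving very different predictive distributions:

```python
import torch
import torch.nn.functional as F

def structured_cross_entropy(logits, labels, edge_index, lam=0.1):
    """Cross-entropy plus a hypothetical smoothness penalty over graph edges.

    logits:     [num_nodes, num_classes] raw scores
    labels:     [num_nodes] ground-truth class indices
    edge_index: [2, num_edges] tensor of adjacent vertex index pairs
    lam:        weight on the structural term (arbitrary illustrative value)
    """
    ce = F.cross_entropy(logits, labels)
    probs = F.softmax(logits, dim=-1)
    src, dst = edge_index
    # Penalise disagreement between the predictive distributions of
    # neighbouring vertices, encouraging spatially coherent parcellations.
    smooth = (probs[src] - probs[dst]).pow(2).sum(dim=-1).mean()
    return ce + lam * smooth
```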

Gaussian Graph Convolutional Networks

I’m using graph convolutional networks as a tool to segment the cortical surface of the brain. This research resides in the domain of node classification using inductive learning. By node classification, I mean that we wish to assign a discrete label to cortical surface locations (nodes / vertices in a graph) on the basis of some feature data and brain network topology.
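As a point of reference for this setup, a plain two-layer graph convolutional network (Kipf & Welling, 2017) is the standard template for node classification; the Gaussian variant from the post builds on it but is not reproduced here. Module names and dimensions below are illustrative.

```python
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    """Vanilla graph convolution (Kipf & Welling, 2017): H_out = A_norm @ H @ W."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj_norm):
        # adj_norm: symmetrically normalised adjacency D^{-1/2} (A + I) D^{-1/2}.
        return adj_norm @ self.linear(x)

class NodeClassifier(nn.Module):
    """Two-layer GCN mapping per-vertex features to parcel labels."""

    def __init__(self, in_dim: int, hidden_dim: int, num_classes: int):
        super().__init__()
        self.gc1 = GCNLayer(in_dim, hidden_dim)
        self.gc2 = GCNLayer(hidden_dim, num_classes)

    def forward(self, x, adj_norm):
        h = torch.relu(self.gc1(x, adj_norm))
        return self.gc2(h, adj_norm)  # per-node (per-vertex) class logits
```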