RuntimeError When Modifying Node Count in Networkx Graph for Graph Neural Network Training


I'm encountering a runtime error when I change the number of nodes in a NetworkX-generated graph that I pass through a graph neural network (GNN). Here's my GNN code, which should be independent of the graph's node count:

import torch
import torch.nn as nn
from torch_geometric.nn import GCNConv


class GCN(nn.Module):
    def __init__(self, input_size, hidden_size, num_classes):
        super(GCN, self).__init__()
        self.layer1 = GCNConv(input_size, hidden_size)
        self.layer2 = GCNConv(hidden_size, hidden_size)
        self.layer3 = GCNConv(hidden_size, num_classes)
        self.softmax = nn.Softmax(dim=0)

    def forward(self, node_features, edge_index):
        output = self.layer1(node_features, edge_index)
        output = torch.relu(output)
        output = self.layer2(output, edge_index)
        output = torch.relu(output)
        output = self.layer3(output, edge_index)
        output = self.softmax(output)

        return output

This is how I create the graph and remove a node from it:

import random

import networkx as nx
import torch


def generate_graph(num_nodes):
    # generate a random connected graph
    Graph = nx.gnm_random_graph(num_nodes, random.randint(num_nodes, num_nodes*2), seed=42)
    while not nx.is_connected(Graph):
        # retry without the fixed seed; reusing seed=42 would regenerate
        # the exact same (disconnected) graph forever
        Graph = nx.gnm_random_graph(num_nodes, random.randint(num_nodes, num_nodes*2))

    # add features to nodes
    # node 0 will be the source node
    # each node will have a feature of 3
    # first feature will represent the node's bias (a random value between 0 and 1)
    # second feature will represent if the node is a source node (0 or 1, 1 if the node is the source node)
    # third feature will represent the node's degree
    for node in Graph.nodes:
        Graph.nodes[node]['feature'] = [random.random(), 1 if node == 0 else 0, Graph.degree[node]]

    node_features = Graph.nodes.data('feature')
    node_features = torch.tensor([node_feature[1] for node_feature in node_features])
    edge_index = torch.tensor(list(Graph.edges)).t().contiguous()

    return Graph, node_features, edge_index
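
For reference, the labels on a freshly generated graph are exactly 0 to num_nodes-1, so row i of node_features corresponds to node i and every edge endpoint is a valid row index. A quick standalone sanity check (same construction as generate_graph, without the torch parts):

```python
import random

import networkx as nx

# Same graph construction as in generate_graph.
G = nx.gnm_random_graph(10, random.randint(10, 20), seed=42)

# Labels are contiguous, so they double as row indices into the
# feature tensor built from Graph.nodes.data('feature').
assert sorted(G.nodes) == list(range(10))
assert all(u < 10 and v < 10 for u, v in G.edges)
```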


def remove_node_from_graph(Graph, node):
    # remove the node from the graph
    Graph.remove_node(node)

    # update the degree feature of the remaining nodes
    # (loop variable renamed to avoid shadowing the `node` parameter)
    for n in Graph.nodes:
        Graph.nodes[n]['feature'][2] = Graph.degree[n]

    node_features = Graph.nodes.data('feature')
    node_features = torch.tensor([node_feature[1] for node_feature in node_features])
    edge_index = torch.tensor(list(Graph.edges)).t().contiguous()

    return Graph, node_features, edge_index

Training my GCN on a 10-node graph succeeds, but when I remove one node and pass the modified graph through the GCN, I get this error:

RuntimeError: index 9 is out of bounds for dimension 0 with size 9

Surprisingly, the process works fine when I generate a new 9-node graph after the initial training step. I'm struggling to pinpoint where I might be making a mistake. Any insights would be greatly appreciated!
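
In case it helps, here is what the node labels look like after the removal (a minimal standalone check, separate from my training code):

```python
import networkx as nx

G = nx.gnm_random_graph(10, 15, seed=42)
G.remove_node(5)

# networkx keeps the original labels after remove_node: node 9 still
# exists, but only 9 nodes remain in the graph.
print(sorted(G.nodes))      # [0, 1, 2, 3, 4, 6, 7, 8, 9]
print(G.number_of_nodes())  # 9
```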
