I have a huge amount of data and want to create a model in Neo4j representing this data.
It would be about 3 million nodes and more than 3 billion relationships. Importing the data and then creating the nodes and relationships with the Batch Inserter takes too long.
The question is: can I split this huge model into two separate models and then run a single Cypher query that accesses both models at the same time? If so, how?
I'm afraid Neo4j does not natively support partitioning the graph across multiple instances of the database (yet).
However, 3M nodes and 3B edges isn't a huge amount of data. What exactly do you mean by "too long"?
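At that scale, plain Batch Inserter usage should normally cope, so it would help to see how you are calling it. For comparison, a minimal sketch of the embedded Batch Inserter API (Neo4j 3.x; the store path, label, and relationship type below are illustrative, not from your code) looks roughly like this:

```java
import java.io.File;
import java.util.Map;

import org.neo4j.graphdb.Label;
import org.neo4j.graphdb.RelationshipType;
import org.neo4j.unsafe.batchinsert.BatchInserter;
import org.neo4j.unsafe.batchinsert.BatchInserters;

public class BulkLoad {
    public static void main(String[] args) throws Exception {
        // Open a BatchInserter on an (empty) store directory -- path is illustrative.
        BatchInserter inserter = BatchInserters.inserter(new File("data/graph.db"));
        try {
            Label person = Label.label("Person");
            RelationshipType knows = RelationshipType.withName("KNOWS");

            // createNode returns the node id, which you keep to wire up relationships.
            long a = inserter.createNode(Map.of("name", "A"), person);
            long b = inserter.createNode(Map.of("name", "B"), person);
            inserter.createRelationship(a, b, knows, Map.of());
        } finally {
            // shutdown() flushes buffers and finalizes the store files;
            // skipping it leaves the store in an unusable state.
            inserter.shutdown();
        }
    }
}
```

If your import loop differs substantially from this shape (e.g. opening a transaction-based API per record instead of the Batch Inserter), that alone could explain the slowness.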