Reconfigure dataloaders when migrating from graphql-java-kickstart to spring-graphql


I'm in the process of migrating a GraphQl-java-kickstart project to Spring-GraphQl, as Spring-GraphQl was not available at the time when GraphQL was implemented in the project. The main reason for migrating is that the auto-configuration and annotation based approach offered by Spring may help remove much of the complex boilerplate code in the initial GraphQl project.

There is one dataloader registration in the kickstart dataloader config which I cannot seem to recreate in the new Spring GraphQL setup. In the kickstart GraphQL config, one dataloader is created and then passed as a method argument into another dataloader (simplified below):

@Component
public class RequestContextBuilder extends DefaultGraphQLServletContextBuilder {

    @Override
    public GraphQLKickstartContext build(HttpServletRequest request, HttpServletResponse response) {
        
        Map<Object, Object> map = new HashMap<>();
        // ... 
        return GraphQLKickstartContext.of(buildDataLoaderRegistry(), map);
    }
    
    private DataLoaderRegistry buildDataLoaderRegistry() {

        var noBatching = new DataLoaderOptions().setBatchingEnabled(false);

        var dataLoaderRegistry = new DataLoaderRegistry();

        var firstLoader = DataLoaderFactory.newDataLoader(
                ids1 -> supplyAsync(() -> firstService.getItems(ids1), traceableExecutorService),
                noBatching
        );
        dataLoaderRegistry.register("first", firstLoader);


        var secondLoader = DataLoaderFactory.newDataLoader(
                ids2 -> supplyAsync(() -> secondService.loadUsingOtherLoader(ids2, firstLoader), traceableExecutorService),
                noBatching
        );
        dataLoaderRegistry.register("second", secondLoader);

        return dataLoaderRegistry;
    }
}

...
@Service
public class SecondService {
  
  public List<Item> loadUsingOtherLoader(List<ID> ids, DataLoader<ID, Item> firstLoader) {
    
    return ids.stream()
              .map(firstLoader::load)
              // ...
              .toList();
  }
}
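Stripped of the GraphQL types, the composition in SecondService amounts to mapping each id through an async loader and joining the results. A minimal plain-Java sketch (hypothetical names, no GraphQL dependencies) of that pattern:

```java
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.function.Function;

public class LoaderComposition {

    // Map each id through an async loader (a stand-in for DataLoader.load)
    // and block for each result, as loadUsingOtherLoader effectively does.
    static List<String> loadAll(List<Integer> ids,
                                Function<Integer, CompletableFuture<String>> loader) {
        return ids.stream()
                .map(loader)
                .map(CompletableFuture::join)
                .toList();
    }

    public static void main(String[] args) {
        Function<Integer, CompletableFuture<String>> loader =
                id -> CompletableFuture.supplyAsync(() -> "item-" + id);
        System.out.println(loadAll(List.of(1, 2), loader));
        // prints [item-1, item-2]
    }
}
```

Note that this only avoids deadlock because batching is disabled (setBatchingEnabled(false) above): each load is presumably dispatched immediately, so blocking on the future inside another loader can complete without waiting for a dispatch cycle.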

I cannot seem to recreate this setup in the Spring GraphQL configuration using the BatchLoaderRegistry, as the loaders are registered as lambdas instead of DataLoader instances and cannot be retrieved back from the registry:

@Configuration
public class RequestContextBuilder {

    public RequestContextBuilder(BatchLoaderRegistry registry, FirstService firstService, SecondService secondService) {
    
        registry.forName("first")
                .withOptions(BATCHING_DISABLED)
                .registerBatchLoader((ids, env) -> Flux.fromStream(ids.stream()).map(firstService::getItems));
        
        registry.forName("second")
                .withOptions(BATCHING_DISABLED)
                .registerBatchLoader((ids, env) -> Flux.fromStream(ids.stream()).map(secondService::loadUsingOtherLoader));
    }
}

Although ideally the second loader should not depend on the first, the initial stage of this migration is to switch from graphql-kickstart to spring-graphql with as few changes as possible. In a later refactoring stage the loaders will be decoupled, Spring logic and annotations applied, etc.

Does anyone know how this can be resolved?

1 answer

Michiel (accepted answer):

For anyone interested, I managed to resolve this by retrieving the other dataloader via the BatchLoaderEnvironment:

@Configuration
public class RequestContextBuilder {

    public RequestContextBuilder(BatchLoaderRegistry registry, FirstService firstService, SecondService secondService) {
    
        registry.forName("first")
                .withOptions(BATCHING_DISABLED)
                .registerBatchLoader((ids, env) -> Flux.fromStream(ids.stream()).map(firstService::getItems));
        
        registry.forName("second")
                .withOptions(BATCHING_DISABLED)
                .registerBatchLoader((ids, env) -> {
                    var dfe = obtainDfeFromEnvironment(env);
                    var firstLoader = dfe.getDataLoader("first");
                    // ... etc.
                });
    }

    private static DataFetchingEnvironment obtainDfeFromEnvironment(BatchLoaderEnvironment env) {

        return env.getKeyContextsList().stream()
                .filter(DataFetchingEnvironment.class::isInstance)
                .map(DataFetchingEnvironment.class::cast)
                .findFirst()
                .orElseThrow();
    }
}

And by passing the DataFetchingEnvironment as a second argument while invoking the dataloader:

dfe.<SomeId, List<SomeClass>>getDataLoader("second").load(someId, dfe);
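This works because the key context passed as the second argument to DataLoader.load(key, keyContext) is collected into the BatchLoaderEnvironment's key-context list. The type-scan performed by obtainDfeFromEnvironment can be sketched in plain Java (hypothetical stand-in types, no GraphQL dependencies):

```java
import java.util.List;
import java.util.NoSuchElementException;

public class KeyContextExtractor {

    // Scan a mixed list of key contexts and return the first one of the
    // requested type, mirroring what obtainDfeFromEnvironment does with
    // env.getKeyContextsList().
    static <T> T firstOfType(List<Object> keyContexts, Class<T> type) {
        return keyContexts.stream()
                .filter(type::isInstance)
                .map(type::cast)
                .findFirst()
                .orElseThrow(NoSuchElementException::new);
    }

    public static void main(String[] args) {
        // The DataFetchingEnvironment is simulated by a plain String here.
        List<Object> contexts = List.of(42, "simulated-dfe");
        System.out.println(firstOfType(contexts, String.class));
        // prints simulated-dfe
    }
}
```

One caveat of this approach: it relies on every caller of the "second" loader remembering to pass the DataFetchingEnvironment as the key context, so it fits the stated goal of a minimal migration step rather than a final design.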