Reasoning over an ontology using OWL API


I have used OWL API 4.1.3 to load my ontology, which is not big. Since I need the inferred information, I also ran a reasoner using the HermiT 1.3.8.413 library. The following code shows how I did it.

import java.io.File;
import java.util.ArrayList;
import java.util.List;

import org.semanticweb.HermiT.Configuration;
import org.semanticweb.HermiT.ReasonerFactory;
import org.semanticweb.owlapi.apibinding.OWLManager;
import org.semanticweb.owlapi.model.*;
import org.semanticweb.owlapi.reasoner.ConsoleProgressMonitor;
import org.semanticweb.owlapi.reasoner.OWLReasoner;
import org.semanticweb.owlapi.util.*;

public class ReasonRDF {

    public static void main(String[] args) throws OWLOntologyCreationException, OWLOntologyStorageException {
        readRDF("C:/Users/workspace/Ontology_matching/NVDB_Matching_v18_H_4_1_CONVERTYING/results/NewInstantiated/owl/OSM1.owl");
    }

    public static void readRDF(String address) throws OWLOntologyCreationException, OWLOntologyStorageException {
        OWLOntologyManager manager = OWLManager.createOWLOntologyManager();
        File file = new File(address);
        OWLOntology ont = manager.loadOntologyFromOntologyDocument(IRI.create(file));
        System.out.println("Ontology Loaded...");

        System.out.println("Logical IRI   : " + ont.getOntologyID());
        System.out.println("Format        : " + manager.getOntologyFormat(ont));
        System.out.println("Runtime memory: " + Runtime.getRuntime().totalMemory());

        ReasonerFactory reasonerFactory = new ReasonerFactory();
        ConsoleProgressMonitor progressMonitor = new ConsoleProgressMonitor();
        Configuration config = new Configuration();
        config.ignoreUnsupportedDatatypes = true;
        config.reasonerProgressMonitor = progressMonitor;
        OWLReasoner reasoner = reasonerFactory.createReasoner(ont, config);

        long t0 = System.nanoTime();

        System.out.println("Starting to add axiom generators");
        OWLDataFactory datafactory = manager.getOWLDataFactory();
        List<InferredAxiomGenerator<? extends OWLAxiom>> inferredAxioms = new ArrayList<>();
        //inferredAxioms.add(new InferredSubClassAxiomGenerator());
        inferredAxioms.add(new InferredClassAssertionAxiomGenerator());
        //inferredAxioms.add(new InferredDataPropertyCharacteristicAxiomGenerator());
        //inferredAxioms.add(new InferredObjectPropertyCharacteristicAxiomGenerator());
        //inferredAxioms.add(new InferredEquivalentClassAxiomGenerator());
        //inferredAxioms.add(new InferredPropertyAssertionGenerator());
        //inferredAxioms.add(new InferredInverseObjectPropertiesAxiomGenerator());
        inferredAxioms.add(new InferredSubDataPropertyAxiomGenerator());
        inferredAxioms.add(new InferredSubObjectPropertyAxiomGenerator());
        System.out.println("Finished adding axiom generators");

        // New ontology to hold the inferred axioms
        OWLOntology infOnt = manager.createOntology(IRI.create(ont.getOntologyID().getOntologyIRI().get() + "_inferred"));

        // Use the generators and the reasoner to infer axioms; the actual
        // reasoning work happens inside fillOntology
        System.out.println("Starting to infer");
        InferredOntologyGenerator iog = new InferredOntologyGenerator(reasoner, inferredAxioms);
        iog.fillOntology(datafactory, infOnt);
        System.out.println("Inference is over, results are stored");

        long elapsed_time = System.nanoTime() - t0;
        System.out.println(elapsed_time);

        // Save the inferred ontology
        manager.saveOntology(infOnt, IRI.create("file:///C:/Users/ontologies/NVDB4_test.rdf"));
    }
}

It does not throw any errors, but it takes forever to store the inferred ontology in a new file. In fact, it does not complete the job even after two days. My IDE is Eclipse EE, and I have allocated 6 to 12 GB of memory to run this application. I can't find any problem with my code or my ontology.

Could someone suggest an optimization, or perhaps a better implementation or another API?

Here is my ontology in case someone wants to test it.


There is 1 answer

Answered by Ignazio

The size of an ontology is only loosely related to the complexity of reasoning on it; some small ontologies are much harder for reasoners than other, very large ones. (Of course, there's also the possibility of a bug.)

Is it possible for you to share the ontology contents?

Edit: Having tried the ontology, it looks like size does not matter that much; the ontology is proving quite hard to reason with.

I have tried disabling the SWRL rules and skipping the class assertion generation, and still hit a roadblock. The number and topology of object properties is enough to stress HermiT hard.

I have tried version 1.3.8.500, in case any issues in OWLAPI had been fixed in updated versions; the only significant result I got is that the code is not memory bound: 3 gigabytes of RAM assigned to the VM seem to be more than enough.

Disjointness-related reasoning seems to be taking a large amount of time; this is not unexpected. Consider whether you can remove disjointness axioms from your ontology and still achieve your requirements.
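A minimal sketch of that suggestion, assuming the stock OWL API axiom accessors (the class and method names here are illustrative, not from the question): strip disjointness axioms from the ontology before handing it to the reasoner.

```java
import java.util.HashSet;
import java.util.Set;

import org.semanticweb.owlapi.model.AxiomType;
import org.semanticweb.owlapi.model.OWLAxiom;
import org.semanticweb.owlapi.model.OWLOntology;
import org.semanticweb.owlapi.model.OWLOntologyManager;

public class DisjointnessStripper {

    // Collect and remove the disjointness axioms (class, object property
    // and data property) so the reasoner no longer has to check the
    // corresponding clashes. Call this before createReasoner().
    public static void stripDisjointness(OWLOntologyManager manager, OWLOntology ont) {
        Set<OWLAxiom> toRemove = new HashSet<>();
        toRemove.addAll(ont.getAxioms(AxiomType.DISJOINT_CLASSES));
        toRemove.addAll(ont.getAxioms(AxiomType.DISJOINT_OBJECT_PROPERTIES));
        toRemove.addAll(ont.getAxioms(AxiomType.DISJOINT_DATA_PROPERTIES));
        manager.removeAxioms(ont, toRemove);
    }
}
```

Note this changes the semantics of the ontology: any inference that depended on disjointness (e.g. detecting inconsistent individuals) will be lost, so it is only safe if those inferences are not among your requirements.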

Also consider whether it is meaningful to separate the individuals by partitioning the ABox: if there are individuals that you are sure are not related, it might be good to separate the assertions into multiple ontologies. Large numbers of unrelated individuals might be causing the reasoner to attempt reasoning paths that will never provide useful inferences.
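A hedged sketch of that partitioning idea, assuming you already know (from domain knowledge not shown here) which groups of individuals are unrelated: copy the full TBox plus the assertions about one group into a smaller ontology, then classify each partition separately.

```java
import java.util.Set;

import org.semanticweb.owlapi.model.IRI;
import org.semanticweb.owlapi.model.OWLNamedIndividual;
import org.semanticweb.owlapi.model.OWLOntology;
import org.semanticweb.owlapi.model.OWLOntologyCreationException;
import org.semanticweb.owlapi.model.OWLOntologyManager;
import org.semanticweb.owlapi.model.parameters.Imports;

public class ABoxPartitioner {

    // Build a smaller ontology containing the whole TBox (schema axioms
    // are needed for inference) plus only the assertions about one
    // group of individuals. Reason over each partition independently.
    public static OWLOntology partitionFor(OWLOntologyManager manager,
                                           OWLOntology ont,
                                           Set<OWLNamedIndividual> group,
                                           IRI partitionIri) throws OWLOntologyCreationException {
        OWLOntology part = manager.createOntology(partitionIri);
        manager.addAxioms(part, ont.getTBoxAxioms(Imports.INCLUDED));
        for (OWLNamedIndividual ind : group) {
            manager.addAxioms(part, ont.getClassAssertionAxioms(ind));
            manager.addAxioms(part, ont.getObjectPropertyAssertionAxioms(ind));
            manager.addAxioms(part, ont.getDataPropertyAssertionAxioms(ind));
        }
        return part;
    }
}
```

The partitioning is only sound if the groups really are disconnected: any object property assertion linking individuals across two partitions would be dropped by this sketch, so check for such links before splitting.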