I would like to store a large number of integers in an RDF database (Blazegraph). These values need to be incremented with new data, or created if missing. What's the best way to do that in SPARQL 1.1?

If I were writing this in SQL (MySQL/MariaDB), it would look like the following, assuming the database is empty at first and the table's unique key is set on "s,p" (subject and predicate):
-- Inserting or adding A=10, B=1
INSERT INTO tbl (s,p,o) VALUES ('A','cnt',10), ('B','cnt',1)
ON DUPLICATE KEY UPDATE o=o+VALUES(o);
Resulting RDF data:
:A :cnt 10 .
:B :cnt 1 .
Next run:
-- Inserting or adding A=3, C=5
INSERT INTO tbl (s,p,o) VALUES ('A','cnt',3), ('C','cnt',5)
ON DUPLICATE KEY UPDATE o=o+VALUES(o);
Resulting RDF data:
:A :cnt 13 .
:B :cnt 1 .
:C :cnt 5 .
So the question is: how would one construct a SPARQL query to do a bulk UPSERT based on the existing data and the increments, and make it efficient?
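For reference, here is the rough shape of the update I've been experimenting with; the : prefix URI and the hard-coded VALUES pairs are just placeholders for this example, and I don't know whether this is the idiomatic approach or how well it scales to a large batch:

PREFIX : <http://example.org/>

DELETE { ?s :cnt ?old }
INSERT { ?s :cnt ?new }
WHERE {
  # the new increments for this batch (placeholder values)
  VALUES (?s ?inc) { (:A 3) (:C 5) }
  # the existing counter, if any
  OPTIONAL { ?s :cnt ?old }
  # missing counters start at 0
  BIND (COALESCE(?old, 0) + ?inc AS ?new)
}

The intent is that OPTIONAL plus COALESCE plays the role of ON DUPLICATE KEY UPDATE, but I'm not sure whether this is correct or efficient when the number of subjects gets large.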