Open lehneres opened 7 years ago
It seems odd that a write to the graphindex would fail due to "conditional request failed". Can you share your configuration, schema and sample loading code?
I used the example configuration from https://github.com/awslabs/dynamodb-janusgraph-storage-backend/blob/master/src/test/resources/dynamodb.properties and only changed the region.
No schema is defined, and the loading code looks like this:
```java
ObjectListing listing;
int i;
do {
    listing = s3Client.listObjects(listObjectsRequest);
    listing.getObjectSummaries().parallelStream().forEach(objectSummary -> {
        try {
            System.out.println(" - " + objectSummary.getKey() + " (size = " + objectSummary.getSize() + ")");
            final String object = s3Client.getObjectAsString(GraphLoaderConfig.AWS.bucketName, objectSummary.getKey());
            final TagCloudEntity entity = new ObjectMapper().readValue(object, TagCloudEntity.class);
            if (entity != null) {
                final JanusGraphTransaction tx = graph.newTransaction();
                final JanusGraphVertex curNode = tx.addVertex(entity.getDocId());
                for (final Label label : entity.getTagCloudEntities().getLabels()) {
                    tx.addVertex(label.getText());
                    // note: this adds a self-edge on curNode rather than an edge to the label vertex
                    curNode.addEdge("tcLabel", curNode).property("relevanceScore", label.getRelevanceScore());
                }
                tx.commit();
            }
        } catch (final IOException e) {
            e.printStackTrace();
        }
    });
    listObjectsRequest.setMarker(listing.getNextMarker());
    final Iterator<JanusGraphVertex> iterator = graph.query().vertices().iterator();
    i = 0;
    while (iterator.hasNext()) {
        i++;
        iterator.next();
    }
    System.out.println("\n\nGraph size: " + i + "\n\n");
} while (listing.isTruncated() && i < 10000);
```
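Since the DynamoDB backend implements JanusGraph's locking via conditional writes, concurrent transactions from `parallelStream()` that touch the same index entries can make the conditional check fail, which surfaces as the `PermanentLockingException` above. One general mitigation is to retry the commit with backoff. The sketch below is a self-contained illustration of that pattern; `withRetry` and `TransientException` are hypothetical helpers, not part of the JanusGraph API, and in real code you would inspect the exception's cause before deciding to retry:

```java
import java.util.concurrent.Callable;

public class RetryCommit {
    // Stand-in for a retryable failure such as lock contention on commit.
    static class TransientException extends RuntimeException {
        TransientException(final String msg) { super(msg); }
    }

    // Run a commit-like operation, retrying with exponential backoff
    // on TransientException, up to maxAttempts attempts in total.
    static <T> T withRetry(final Callable<T> op, final int maxAttempts) throws Exception {
        long backoffMillis = 50;
        for (int attempt = 1; ; attempt++) {
            try {
                return op.call();
            } catch (final TransientException e) {
                if (attempt >= maxAttempts) {
                    throw e; // exhausted all attempts, propagate the failure
                }
                Thread.sleep(backoffMillis);
                backoffMillis *= 2; // back off more after each failure
            }
        }
    }
}
```

In the loading loop above, the `tx.commit()` call would be wrapped in such a helper so that transient lock contention does not abort the whole load.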
Could it be the parallelStream()?
EDIT: Yes, it is not failing with a non-parallel stream. Might still be interesting for you.
Hello,
while loading a graph with weighted edges I run into repeated exceptions:
```
org.janusgraph.diskstorage.locking.PermanentLockingException: UpdateItem_jg_graphindex The conditional request failed
```
Is there a known fix or workaround for this?
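Besides serializing the writes, one knob worth trying is JanusGraph's lock-retry configuration in `dynamodb.properties`. The keys below are standard JanusGraph options; the values are illustrative assumptions, not recommendations tested against this workload:

```properties
# Illustrative values only - tune for your workload.
# Number of times to retry acquiring a lock before giving up.
storage.lock.retries=10
# Milliseconds to wait between lock attempts.
storage.lock.wait-time=200
```

This does not remove the contention, it only makes transactions more tolerant of it, so reducing write parallelism remains the more direct fix.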