pyvandenbussche / lov

Linked Open Vocabularies (LOV) - FrontEnd
http://lov.okfn.org/dataset/lov/

Where can I find the metrics of LOV? #68

Closed: jtrillos closed this issue 6 years ago

jtrillos commented 6 years ago

I tried to understand a little more about this project and I have about 90% of it running, but I have no idea where to find the LOV metrics. According to datahub it should be http://lov.okfn.org/dataset/lov/lov.rdf, but that file does not exist anymore.

Can you help me please?

pyvandenbussche commented 6 years ago

Hi, sorry, yes, things have been moved; you can now find all data files compressed here: http://lov.okfn.org/dataset/lov/sparql (dump file on the right of the screen). Specifically, http://lov.okfn.org/lov.nq.gz contains all statistics for each version of each vocabulary.
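
As a minimal sketch (not part of LOV itself), this is how one might download that dump and read the per-version statistics with Python and rdflib. The voaf:classNumber and voaf:propertyNumber predicates are an assumption based on the VOAF vocabulary LOV uses for its metadata; check the dump for the predicates actually present.

```python
# Sketch only: download the lov.nq.gz dump and print vocabulary-version
# statistics. The statistic predicates (VOAF classNumber / propertyNumber)
# are assumptions; adjust them to what the dump actually contains.
import gzip
import urllib.request

from rdflib import ConjunctiveGraph, URIRef

DUMP_URL = "http://lov.okfn.org/lov.nq.gz"
VOAF = "http://purl.org/vocommons/voaf#"

# Download and decompress the N-Quads dump.
with urllib.request.urlopen(DUMP_URL) as response:
    nquads = gzip.decompress(response.read()).decode("utf-8")

# Each vocabulary version is described in its own named graph.
g = ConjunctiveGraph()
g.parse(data=nquads, format="nquads")

# Print the class and property counts attached to each version.
for stat in ("classNumber", "propertyNumber"):
    for subject, value in g.subject_objects(URIRef(VOAF + stat)):
        print(subject, stat, value)
```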

jtrillos commented 6 years ago

@pyvandenbussche, I have another question: how does the project add new datasets to a vocabulary? I mean, how and where does LOV save the information about the datasets that use a specific vocabulary? I thought it was done with a lov.rdf file and the RDF2Mongo script. For example, I tried to add a new vocabulary, and it shows me the statistics (classes, properties, instances...), but it does not say how many datasets it has. How can I do that?

pyvandenbussche commented 6 years ago

Here is how LOV works.

I initially wrote some scripts like RDF2Mongo when I migrated LOV to the new architecture. They are not meant to be run anymore.

Using the LOV editor interface, LOV curators can add a vocabulary, person, organization, or vocabulary version. This is done solely through the GUI. All the data is saved in the main database, MongoDB. When adding a new vocabulary or a new version, the application parses the vocabulary, automatically extracts metadata such as the number of classes, and stores it in MongoDB. On the homepage you should see, on the right, the latest vocabulary updates live.
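
As a rough illustration of that extraction step (LOV itself is a Node.js application; the collection and field names below are made up), this is the kind of processing the editor performs when a new version is added:

```python
# Illustrative sketch only: parse a vocabulary, count classes and
# properties, and store the result in MongoDB. Collection and field
# names are hypothetical; the real schema lives in the LOV application.
from pymongo import MongoClient
from rdflib import Graph
from rdflib.namespace import OWL, RDF, RDFS

def extract_version_stats(vocab_uri):
    g = Graph()
    g.parse(vocab_uri)  # rdflib negotiates/guesses the RDF serialization
    classes = (set(g.subjects(RDF.type, OWL.Class))
               | set(g.subjects(RDF.type, RDFS.Class)))
    properties = (set(g.subjects(RDF.type, RDF.Property))
                  | set(g.subjects(RDF.type, OWL.ObjectProperty))
                  | set(g.subjects(RDF.type, OWL.DatatypeProperty)))
    return {"nbClasses": len(classes), "nbProperties": len(properties)}

if __name__ == "__main__":
    vocab = "http://purl.org/vocommons/voaf"
    stats = extract_version_stats(vocab)
    client = MongoClient("mongodb://localhost:27017")
    # Hypothetical database/collection names, for illustration only.
    client["lov"]["vocabularyVersions"].insert_one(
        {"vocabularyURI": vocab, **stats})
    print(stats)
```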

Once a day, there are other scripts https://github.com/pyvandenbussche/lovScripts running in the background: