mwcvitkovic / Supervised-Learning-on-Relational-Databases-with-GNNs

Code to reproduce the results in the paper Supervised Learning on Relational Databases with Graph Neural Networks.
MIT License

ValueError: Found array with 0 sample In Transaction "productsize" #2

Closed tangwudu closed 4 years ago

tangwudu commented 4 years ago

Hi, thanks for your great work. I'm trying to reproduce your results. I can successfully run your code through "Step 7: Run python -m data..build_dataset_from_database from the root directory of this repo." I did find some configuration problems along the way, but I was able to solve them myself.

However, at "Step 8: Run python -m data..build_db_info from the root directory of this repo." I encountered this problem:

running: MATCH (n:Transaction) RETURN n.productsize ;
Query Result: []
ValueError: Found array with 0 sample(s) (shape=(0, 1)) while a minimum of 1 is required by RobustScaler.

Could you please help to figure out this problem?
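For context, this error occurs because scikit-learn's RobustScaler refuses to fit on an empty array, so any property whose Cypher query returns no rows will trigger it. A minimal standalone sketch of the failure mode and a guard against it (the function name and structure here are illustrative, not from the repo's code):

```python
def prepare_scaler_input(rows, min_samples=1):
    """Drop missing values and check there is enough data to fit a scaler.

    `rows` stands in for the result of a query such as
    "MATCH (n:Transaction) RETURN n.productsize".
    """
    values = [v for v in rows if v is not None]
    if len(values) < min_samples:
        # Mirrors sklearn's message: "Found array with 0 sample(s) ...
        # while a minimum of 1 is required by RobustScaler."
        raise ValueError(
            f"Found array with {len(values)} sample(s) while a minimum "
            f"of {min_samples} is required"
        )
    # Reshape to (n_samples, 1), the shape a scaler's fit() expects.
    return [[v] for v in values]


print(prepare_scaler_input([3.2, None, 1.5]))  # -> [[3.2], [1.5]]
```

An empty query result reaches the scaler as a `(0, 1)` array, which is exactly the shape reported in the traceback above.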

mwcvitkovic commented 4 years ago

Hi @tangwudu, sorry for the delayed response.

You shouldn't actually need to do Step 8 at all since I've included the <db_name>.db_info.json files in the git repo. Does the code work if you skip Step 8?

But in case you want to run it anyway to recreate the <db_name>.db_info.json files, my guess is that the raw data aren't loaded into neo4j. It seems like python is successfully connecting to the neo4j database given that you're getting a Query Result, but the fact that the Query Result is empty makes it look like you have an empty database. Can you use the neo4j web interface to confirm that there's data in the neo4j database?
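A quick sanity check from the neo4j web interface or cypher-shell might look like the following (these queries are a suggestion, not part of the repo's scripts):

```cypher
// Count nodes per label; an empty database returns no rows or zero counts.
MATCH (n) RETURN labels(n) AS label, count(*) AS n_nodes ORDER BY n_nodes DESC;

// Check the specific property that caused the failure:
MATCH (n:Transaction) WHERE n.productsize IS NOT NULL RETURN count(n);
```

If the second query returns 0 while Transaction nodes exist, the nodes were loaded without the productsize property, which would also explain the empty Query Result.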

tangwudu commented 4 years ago

Hi @mwcvitkovic, I've found that I can skip Step 8, and I have actually run your training process. Thanks for your advice.

Yes, I'm planning to do some checks on the neo4j database. I'll get back to you once I find something.

Best

tangwudu commented 4 years ago

BTW, the script can read all the other properties; it fails only when it reaches productsize.

Get Driver: bolt://localhost:9687
running: MATCH (n:Offer) RETURN DISTINCT n.offer_id ORDER BY n.offer_id ;
running: MATCH (n:Offer) WHERE n.offer_id IS null RETURN count(n);
running: MATCH (n:Offer) RETURN n.quantity ;
/ldap_home/tangwu.du/.conda/envs/RDB/lib/python3.6/site-packages/sklearn/preprocessing/_data.py:2357: UserWarning: n_quantiles (1000) is greater than the total number of samples (37). n_quantiles is set to n_samples.
/ldap_home/tangwu.du/.conda/envs/RDB/lib/python3.6/site-packages/sklearn/preprocessing/_discretization.py:197: UserWarning: Bins whose width are too small (i.e., <= 1e-8) in feature 0 are removed. Consider decreasing the number of bins.
running: MATCH (n:Offer) WHERE n.quantity IS null RETURN count(n);
running: MATCH (n:Offer) RETURN n.offervalue ;
/ldap_home/tangwu.du/.conda/envs/RDB/lib/python3.6/site-packages/sklearn/preprocessing/_data.py:2357: UserWarning: n_quantiles (1000) is greater than the total number of samples (37). n_quantiles is set to n_samples.
/ldap_home/tangwu.du/.conda/envs/RDB/lib/python3.6/site-packages/sklearn/preprocessing/_discretization.py:197: UserWarning: Bins whose width are too small (i.e., <= 1e-8) in feature 0 are removed. Consider decreasing the number of bins.
running: MATCH (n:Offer) WHERE n.offervalue IS null RETURN count(n);
running: MATCH (n:History) RETURN DISTINCT n.market ORDER BY n.market ;
running: MATCH (n:History) WHERE n.market IS null RETURN count(n);
running: MATCH (n:History) RETURN DISTINCT n.repeater ORDER BY n.repeater ;
running: MATCH (n:History) WHERE n.repeater IS null RETURN count(n);
running: MATCH (n:History) WHERE n.offerdate IS null RETURN count(n);
running: MATCH (n:Transaction) RETURN DISTINCT n.dept ORDER BY n.dept ;
running: MATCH (n:Transaction) WHERE n.dept IS null RETURN count(n);
running: MATCH (n:Transaction) WHERE n.date IS null RETURN count(n);
running: MATCH (n:Transaction) RETURN n.productsize ;
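Incidentally, the UserWarning lines in that log are benign: scikit-learn's QuantileTransformer defaults to n_quantiles=1000 and simply clamps it to the sample count (37 here) when there is less data than that. A pure-Python illustration of the clamping behavior (not the repo's code):

```python
def effective_n_quantiles(n_samples, requested=1000):
    """Mimic sklearn's capping of n_quantiles at the number of samples.

    sklearn warns: "n_quantiles (1000) is greater than the total number
    of samples (...). n_quantiles is set to n_samples."
    """
    return min(requested, n_samples)


print(effective_n_quantiles(37))  # -> 37
```

So those warnings only indicate small property cardinalities, not a broken database; the real failure is the empty result for n.productsize.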

mwcvitkovic commented 4 years ago

Closing due to inactivity