mikeizbicki / cmc-csci143

big data course materials
40 stars · 76 forks

sql.normalized_batch/04.sql fail #522

Open henrylong612 opened 5 months ago

henrylong612 commented 5 months ago

Hello @mikeizbicki,

I have successfully created an index that speeds up the sql.normalized_batch/04.sql query. However, when I run the test script, I get that the sql.normalized_batch/04.sql test is failing. I imagine that this cannot be caused by my index. Does this mean that I loaded the tweets incorrectly? All of my other test cases seem to be passing.
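For reference, the index that speeds up this kind of full-text query is typically a functional GIN index over the same expression the query filters on (a sketch; the actual index name and definition may differ):

-- Functional GIN index over the query's filter expression.
-- (Sketch: the index name is arbitrary; the expression must match the
-- query's to_tsvector('english', text) call exactly to be usable.)
CREATE INDEX tweets_to_tsvector_idx
    ON tweets
    USING gin (to_tsvector('english', text));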

mikeizbicki commented 5 months ago

It probably means the tweets were loaded incorrectly (although there are some settings on the GIN index that can change the results of the query; see, for example, the gin_fuzzy_search_limit section of the GIN index reading: https://habr.com/en/companies/postgrespro/articles/448746/).

The expected number of tweets is 6056. Can you post the number that you're getting back? Depending on the number, I might not make you reinsert all the data (because that's a lot of work and not really the point of the assignment).
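If you want to rule out the gin_fuzzy_search_limit setting mentioned above, you can check it from psql (a sketch; 0 is the default and disables the limit):

-- Show the current value; 0 (the default) means no limit, so GIN
-- returns the exact match set rather than a sampled subset.
SHOW gin_fuzzy_search_limit;

-- Reset it for the current session if it was changed:
SET gin_fuzzy_search_limit = 0;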

henrylong612 commented 5 months ago

Hi @mikeizbicki! Here are the results I am getting:

postgres=# SELECT
    count(*)
FROM tweets
WHERE to_tsvector('english',text)@@to_tsquery('english','coronavirus')
  AND lang='en'
;
 count 
-------
 11415
(1 row)

This seems much greater than the expected number. What might be causing this?

mikeizbicki commented 5 months ago

Somehow duplicate entries have been added to your database. (There are no UNIQUE constraints, for example, that would prevent this from happening.) The other queries all do something like SELECT count(DISTINCT id), which prevents the duplicates from affecting their results.
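A quick way to confirm this diagnosis (a sketch using the same tweets table and id column referenced above; not part of the test suite):

-- The distinct count should match the expected 6056:
SELECT count(DISTINCT id)
FROM tweets
WHERE to_tsvector('english',text)@@to_tsquery('english','coronavirus')
  AND lang='en';

-- And there should be ids that appear more than once:
SELECT id, count(*)
FROM tweets
GROUP BY id
HAVING count(*) > 1
LIMIT 5;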

You don't have to redo inserting all the data. I'll waive the requirement that question 4 pass the test case for you.

This waiver is for @henrylong612 only. If anyone else here is in a similar situation, you can reply to this comment and request a similar waiver, but I'll have to approve it.

henrylong612 commented 5 months ago

Thank you @mikeizbicki!

luisgomez214 commented 5 months ago

@mikeizbicki I am getting this output:

postgres=# SELECT
    count(*)
FROM tweets
WHERE to_tsvector('english',text)@@to_tsquery('english','coronavirus')
  AND lang='en'
;
 count 
-------
 12112

I believe I am having a similar issue.

mikeizbicki commented 5 months ago

@luisgomez214 I'll also waive the requirement that problem 4 pass the test case for you.

luisgomez214 commented 5 months ago

Thank you @mikeizbicki

My run time is

lambda-server:~/bigdata/twitter_postgres_indexes (master *%=) $ time docker-compose exec pg_normalized_batch ./run_tests.sh sql.normalized_batch
/home/Luis.Gomez.25/.local/lib/python3.6/site-packages/paramiko/transport.py:32: CryptographyDeprecationWarning: Python 3.6 is no longer supported by the Python core team. Therefore, support for it is deprecated in cryptography. The next release of cryptography will remove support for Python 3.6.
  from cryptography.hazmat.backends import default_backend
sql.normalized_batch/01.sql pass
sql.normalized_batch/02.sql pass
sql.normalized_batch/03.sql pass
sql.normalized_batch/04.sql fail
sql.normalized_batch/05.sql pass

real    0m8.743s
user    0m0.572s
sys 0m0.387s

Is this correct or should I aim for 3 seconds?

mikeizbicki commented 5 months ago

@luisgomez214 You should be able to do better than this with the right indexes. You can submit as-is for 12/16 points on this section (-1 point/second over 5 seconds).

JTan242 commented 5 months ago

> @luisgomez214 You should be able to do better than this with the right indexes. You can submit as-is for 12/16 points on this section (-1 point/second over 5 seconds).

My runtime for normalized_batch varies from 2-8 seconds when I run it. I was wondering if we will be graded on our indexes or on our runtimes when the tests are run?

mikeizbicki commented 5 months ago

@JTan242 The grading is described in the homework README file (https://github.com/mikeizbicki/twitter_postgres_indexes/?tab=readme-ov-file#grading), where it states that the grade is based on the runtimes.

mmendiratta27 commented 4 months ago

Hi @mikeizbicki, I am getting a similar error for query 4 as the others.

postgres=# SELECT
    count(*)
FROM tweets
WHERE to_tsvector('english',text)@@to_tsquery('english','coronavirus')
  AND lang='en'
;
 count 
-------
 12112
(1 row)

mikeizbicki commented 4 months ago

@mmendiratta27 I'm not sure if this is still relevant for you, but I'll grant the same waiver: you don't need to have test case 4 passing.