gauravk-in opened this issue 9 months ago
This error also frequently shows up on RQG runs. cc: @sushantrmishra
The problem is likely #20126; the symptoms match, and one prerequisite is met: the plan has a Gather node in the middle.
co=# /*+ Leading ( ( ts3 ts2 ) ) NestLoop(ts3 ts2) */ explain SELECT ts2.k1, ts2.k2, ts3.v1, ts3.v2 FROM ts2 JOIN ts3 on ts2.k1 = ts3.k1 WHERE ts2.k1 >= 300 AND ts2.k1 < 3100 AND ts3.k1 >= 300 AND ts3.k1 < 3100 GROUP BY ts2.k1, ts2.k2, ts3.v1, ts3.v2;
                                                 QUERY PLAN
----------------------------------------------------------------------------------------------------------
 HashAggregate  (cost=1238.11..1248.11 rows=1000 width=72)
   Group Key: ts2.k1, ts2.k2, ts3.v1, ts3.v2
   ->  Nested Loop  (cost=9.41..1228.11 rows=1000 width=72)
         Join Filter: (ts2.k1 = ts3.k1)
         ->  Seq Scan on ts3  (cost=4.71..987.47 rows=5 width=40)
               Storage Filter: ((k1 >= 300) AND (k1 < 3100))
         ->  Materialize  (cost=4.71..168.15 rows=1000 width=36)
               ->  Gather  (cost=4.71..163.15 rows=1000 width=36)
                     Workers Planned: 2
                     ->  Parallel Index Scan using ts2_pkey on ts2  (cost=4.71..163.15 rows=417 width=36)
                           Index Cond: ((k1 >= 300) AND (k1 < 3100))
(11 rows)
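For anyone trying to get this plan shape outside the TAQO workload, here is a minimal repro sketch. The table definitions, column types, and data are assumptions inferred from the query and plan above (the actual DDL is the TAQO basic-workload create.sql linked in the description); the pg_hint_plan hints are the ones from the query and pin the join order and method so the Gather ends up below the Nested Loop rather than at the top of the plan. Whether the Gather / Parallel Index Scan actually appears also depends on parallel-query settings and costing, so the exact shape may differ.

-- Hypothetical DDL, inferred from the column references above; not the actual TAQO schema.
-- Range-sharded primary key (YSQL syntax) so ts2_pkey can serve the k1 range condition.
CREATE TABLE ts2 (k1 int, k2 int, v1 text, v2 text, PRIMARY KEY (k1 ASC));
CREATE TABLE ts3 (k1 int, k2 int, v1 text, v2 text, PRIMARY KEY (k1 ASC));
INSERT INTO ts2 SELECT i, i % 100, 'v1_' || i, 'v2_' || i FROM generate_series(1, 5000) i;
INSERT INTO ts3 SELECT i, i % 100, 'v1_' || i, 'v2_' || i FROM generate_series(1, 5000) i;
ANALYZE ts2;
ANALYZE ts3;

-- pg_hint_plan ships with YSQL; enable hints if the test configuration has not already.
SET pg_hint_plan.enable_hint = on;
/*+ Leading ( ( ts3 ts2 ) ) NestLoop(ts3 ts2) */ EXPLAIN
SELECT ts2.k1, ts2.k2, ts3.v1, ts3.v2
FROM ts2 JOIN ts3 ON ts2.k1 = ts3.k1
WHERE ts2.k1 >= 300 AND ts2.k1 < 3100 AND ts3.k1 >= 300 AND ts3.k1 < 3100
GROUP BY ts2.k1, ts2.k2, ts3.v1, ts3.v2;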
It is not clear, though, whether there is a dependency on the transaction isolation level. Issue #20126 is now fixed; let's see if the problem occurs again.
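To rule the isolation level in or out when retrying, a minimal sketch of varying it per session (standard PostgreSQL/YSQL settings, not taken from the RQG configuration):

-- Set the session default, confirm it, then rerun the hinted query above.
SET default_transaction_isolation = 'serializable';
SHOW default_transaction_isolation;
-- Repeat with 'repeatable read' and 'read committed' to compare behavior.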
Thanks @andrei-mart. I have stopped seeing this error in the RQG runs with the latest master.
Update: the problem still occurs in RQG runs, though much less frequently. I'll try to see whether the RQG test cases can reproduce it reasonably often.
Update 2: created #21320.
Jira Link: DB-10000
Description
A mix of two different issues was observed intermittently when using the TAQO framework for performance testing with the PG Parity configuration. The issues were reproduced on different queries in the test workload, but could not be reproduced manually.
The above queries are from the basic workload, and the DDL can be found at https://github.com/yugabyte/taqo/blob/main/sql/basic/create.sql. The flags used are as follows:
Issue Type
kind/bug