Open shangyanwen opened 1 year ago
Address to download the 10 GB test data: http://192.168.30.30/test/tpch_10g_syw/
```sql
mysql> select
           o_year,
           sum(case
               when nation = 'INDIA' then volume
               else 0
           end) / sum(volume) as mkt_share
       from
           (
               select
                   extract(year from o_orderdate) as o_year,
                   l_extendedprice * (1 - l_discount) as volume,
                   n2.n_name as nation
               from
                   part,
                   supplier,
                   lineitem,
                   orders,
                   customer,
                   nation n1,
                   nation n2,
                   region
               where
                   p_partkey = l_partkey
                   and s_suppkey = l_suppkey
                   and l_orderkey = o_orderkey
                   and o_custkey = c_custkey
                   and c_nationkey = n1.n_nationkey
                   and n1.n_regionkey = r_regionkey
                   and r_name = 'ASIA'
                   and s_nationkey = n2.n_nationkey
                   and o_orderdate between '1995-01-01' and '1996-12-31'
                   and p_type = 'SMALL PLATED COPPER'
           ) as all_nations
       group by
           o_year
       order by
           o_year;
+--------+------------+
| o_year | mkt_share  |
+--------+------------+
|   1995 | 0.00000000 |
|   1996 | 0.00000000 |
+--------+------------+
2 rows in set (1 min 7.27 sec)
```
The original precision is 31, but Tianmu can only support up to 18.
```
(gdb) b query.cpp:556
Breakpoint 1 at 0x2cbcb0e: file /data/codebase/stonedb/storage/tianmu/core/query.cpp, line 556.
(gdb) b query.cpp:560
Breakpoint 2 at 0x2cbcb3a: file /data/codebase/stonedb/storage/tianmu/core/query.cpp, line 560.
(gdb) c
Continuing.
[Switching to Thread 0x7fdec8208700 (LWP 21978)]

Thread 62 "mysqld" hit Breakpoint 2, Tianmu::core::Query::GetPrecisionScale (item=0x7fdb5c0e80c0,
    precision=@0x7fdb5ce342b4: 31, scale=@0x7fdb5ce342b8: 4, max_scale=false)
    at /data/codebase/stonedb/storage/tianmu/core/query.cpp:560
560         precision = 18;
(gdb) p precision
$1 = (int &) @0x7fdb5ce342b4: 31
(gdb)
```
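The clamp hit in the gdb session above can be sketched as follows. This is an illustrative assumption, not the actual Tianmu implementation (which lives in `Tianmu::core::Query::GetPrecisionScale`); the function and constant names here are hypothetical.

```cpp
#include <cassert>

// Assumed engine limit, matching the value assigned at query.cpp:560.
constexpr int kTianmuMaxDecPrecision = 18;

// Hypothetical sketch: any requested decimal precision above 18 is
// silently clamped. A DECIMAL(31,4) intermediate such as
// l_extendedprice * (1 - l_discount) therefore cannot keep its
// high-order digits.
void ClampPrecision(int &precision) {
  if (precision > kTianmuMaxDecPrecision)
    precision = kTianmuMaxDecPrecision;
}
```

Under such a clamp the `sum(volume)` intermediates cannot hold their true values, which is consistent with `mkt_share` coming back as `0.00000000`.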
@hustjieke
This depends on decimal; it'll be fixed after full decimal support lands.
Have you read the Contributing Guidelines on issues?
Please confirm that this bug report does NOT already exist.
Describe the problem
Expected behavior
How To Reproduce
1. Build the TPC-H test environment and download the 10 GB test data (address above).
2. Import the 10 GB data.
3. Execute the SQL shown above.
Environment
Are you interested in submitting a PR to solve the problem?