apache/drill

Apache Drill is a distributed MPP query layer for self-describing data
https://drill.apache.org/
Apache License 2.0

DRILL-8467: Update Netty to 4.1.101 #2857

Closed rymarm closed 9 months ago

rymarm commented 9 months ago

DRILL-8467: Update Netty to 4.1.101

Description

Update Netty to 4.1.101. Netty 4.1.75 introduced two "breaking" changes:

  1. The default PooledByteBufAllocator chunk size was reduced from 16 MiB to 4 MiB
  2. The default value of io.netty.allocator.useCacheForAllThreads was changed to false

Source: https://netty.io/news/2022/03/10/4-1-75-Final.html
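
For context, here is a minimal standalone Java sketch (not part of this PR, and assuming Netty's default 8 KiB page size) showing where the 16 MiB and 4 MiB figures come from: the allocator's chunk size is pageSize << maxOrder, and the maxOrder default went from 11 to 9 in 4.1.75.

```java
// Sketch only: how PooledByteBufAllocator's chunk size follows from page size and
// maxOrder. Assumes Netty's default 8 KiB page size; maxOrder defaulted to 11
// before 4.1.75 (16 MiB chunks) and to 9 from 4.1.75 on (4 MiB chunks).
public class ChunkSizeMath {
    public static void main(String[] args) {
        int pageSize = 8 * 1024;             // Netty default page size (8 KiB)
        int oldChunk = pageSize << 11;       // maxOrder = 11 -> 16 MiB (pre-4.1.75)
        int newChunk = pageSize << 9;        // maxOrder = 9  ->  4 MiB (4.1.75+)
        System.out.printf("old default chunk: %d MiB%n", oldChunk / (1024 * 1024));
        System.out.printf("new default chunk: %d MiB%n", newChunk / (1024 * 1024));
    }
}
```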

I made the InnerAllocator use the chunk size previously accepted for Drill (16 MiB) and left useCacheForAllThreads enabled, because, as far as I understand, caching for all threads (not only the Netty threads) gives better performance in Drill's case.

I left both options overridable through the Netty Java properties io.netty.allocator.useCacheForAllThreads and io.netty.allocator.maxOrder (which controls the chunk size), so they can still be changed without recompiling Drill, as before. A quick way to check the effective values is shown below.
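
As an illustrative sketch only (not code from this PR, and assuming the static accessors defaultPageSize(), defaultMaxOrder(), and defaultUseCacheForAllThreads() exposed by Netty 4.1's PooledByteBufAllocator), the defaults in effect for a JVM can be printed like this:

```java
// Sketch only: prints the Netty pooled-allocator defaults in effect for this JVM.
// Launching with, e.g.,
//   -Dio.netty.allocator.maxOrder=11 -Dio.netty.allocator.useCacheForAllThreads=true
// matches the pre-4.1.75 behaviour that this PR preserves for Drill's internal allocator.
import io.netty.buffer.PooledByteBufAllocator;

public class NettyAllocatorDefaults {
    public static void main(String[] args) {
        int pageSize = PooledByteBufAllocator.defaultPageSize();
        int maxOrder = PooledByteBufAllocator.defaultMaxOrder();
        System.out.println("pageSize              = " + pageSize);
        System.out.println("maxOrder              = " + maxOrder);
        System.out.println("chunkSize             = " + (pageSize << maxOrder));
        System.out.println("useCacheForAllThreads = "
                + PooledByteBufAllocator.defaultUseCacheForAllThreads());
    }
}
```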

Documentation

No changes

Testing

Run unit tests.

cgivre commented 9 months ago

@rymarm Would it make sense to add config options for these "new" parameters?

rymarm commented 9 months ago

@cgivre I don't think we need config options for the default chunk size or for enabling the allocation cache. Both values are fundamental for Drill, and I don't see a case where Drill would work better with anything other than the defaults, so I doubt a user would want to change them.

But even if a user does wish to change them for some reason, they can still use the Java options -Dio.netty.allocator.useCacheForAllThreads and -Dio.netty.allocator.maxOrder to change the values.

rymarm commented 9 months ago

Now I need to find out why the Drill tests on Java 8 failed with an out-of-memory error:

Errors:   
[ERROR]   TestValueVector.testFixedVectorReallocation »  Unexpected exception, expected<org.apache.drill.exec.exception.OversizedAllocationException> but was<org.apache.drill.exec.exception.OutOfMemoryException>   
[ERROR]   TestValueVector.testVariableVectorReallocation »  Unexpected exception, expected<org.apache.drill.exec.exception.OversizedAllocationException> but was<org.apache.drill.exec.exception.OutOfMemoryException>
jnturton commented 9 months ago

Hi @rymarm! Here are two things it would be interesting to try when redoing the Java 8 runs.

  1. Adding the EasyOutOfMemory annotation to TestValueVector.java.
  2. Bumping the directMemoryMb value in pom.xml. We now set up a swap file at the start of the CI run, so you could try taking this up to 3 GB.
jnturton commented 9 months ago

P.S. The failing test class runs fine in isolation on my laptop under IBM JDK 8, so I do think we can fix this in CI with some environment tweaks.

jnturton commented 9 months ago

@rymarm I just got the same test failure in the 1.21 branch which is still based on Netty 4.1.73.Final. I'm testing there with more direct memory and will send a PR to master if that takes care of the issue.

jnturton commented 9 months ago

Update on my comments above: the direct memory limit increase has just been merged to master. Please rebase this and let's see what we get.

rymarm commented 9 months ago

@jnturton Thank you so much for the investigation! I didn't have time to look into the test failures. P.S. Have you decided to finish all the unresolved tasks before the end of the year? :)

jnturton commented 9 months ago

P.S. Have you decided to finish all the unresolved tasks before the end of the year? :)

😂

jnturton commented 9 months ago

Now we just need to bump up the JAR size limit in the hadoop-2 profile at the bottom of the jdbc-all pom.