For anyone looking for a workaround, here's a patch that fixes the issue for PostgreSQL:
```ruby
module WithAdvisoryLock
  class PostgreSQL < Base
    # Prevent the column name alias from ending up in the
    # `Java::ArjdbcUtil::StringCache` and causing out-of-memory issues.
    # See: https://github.com/ClosureTree/with_advisory_lock/issues/106
    def execute_successful?(pg_function)
      comment = lock_name.gsub(/(\/\*)|(\*\/)/, '--')
      sql = "SELECT #{pg_function}(#{lock_keys.join(',')}) /* #{unique_column_name}; #{comment} */"
      result = connection.select_value(sql)
      # MRI returns 't', JRuby returns true. YAY!
      (result == 't' || result == true)
    end
  end
end
```
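To make the effect of the patch concrete, here is a sketch of how it changes the query shape. The lock keys, alias, and lock name below are made up, and the unpatched form is inferred from the gem's use of `unique_column_name` as a column alias:

```ruby
require 'securerandom'

lock_keys  = [1234, 5678]           # made-up advisory lock keys
alias_name = "t#{SecureRandom.hex}" # fresh 33-character alias per call
comment    = 'my_lock'              # sanitized lock name

# Unpatched shape (inferred): the random alias is a result column name, so the
# JDBC layer interns a brand-new string on every lock acquisition.
unpatched = "SELECT pg_try_advisory_lock(#{lock_keys.join(',')}) AS #{alias_name} /* #{comment} */"

# Patched shape: the alias is tucked inside the SQL comment, so the driver
# never sees it as a column label and nothing new gets cached.
patched = "SELECT pg_try_advisory_lock(#{lock_keys.join(',')}) /* #{alias_name}; #{comment} */"

puts unpatched
puts patched
```

With no alias, PostgreSQL names the result column after the function itself, so the label is identical for every call, while the lock name remains visible in `pg_stat_activity` via the SQL comment.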
Recently, our production web environments began failing with OOM (out-of-memory) errors. Analyzing the heap dumps, we noticed a large number of 33-character strings, all prefixed with `t`. It turned out this was caused by our use of `with_advisory_lock`: the gem uses random strings generated by `WithAdvisoryLock::Base#unique_column_name` as aliases for the lock-acquisition result column, and on JRuby each of these aliases ends up stored in the `Java::ArjdbcUtil::StringCache` [2].
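For reference, the observed 33-character strings are consistent with an alias generator along these lines; this is a sketch, not necessarily the gem's exact implementation, and the hash below is only a stand-in for the real `StringCache`:

```ruby
require 'securerandom'

# Sketch: "t" + SecureRandom.hex (16 bytes -> 32 hex digits) = 33 characters,
# matching the strings seen in the heap dumps. A fresh value is produced on
# every call, so no two queries share a column alias.
def unique_column_name
  "t#{SecureRandom.hex}"
end

# Stand-in for Java::ArjdbcUtil::StringCache: column labels are interned and
# never evicted, so each lock acquisition retains one more 33-character string.
string_cache = {}

10_000.times do
  alias_name = unique_column_name
  string_cache[alias_name] ||= alias_name.freeze
end

puts string_cache.size # => 10000 -- the cache grows without bound
```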
While this might be considered an issue in `activerecord` or `activerecord-jdbc-adapter` [1], it would be preferable for this gem to use a gentler strategy to "Prevent AR from caching results improperly". Any thoughts?

[1] https://github.com/jruby/activerecord-jdbc-adapter/blob/master/src/java/arjdbc/jdbc/RubyJdbcConnection.java
[2] https://github.com/jruby/activerecord-jdbc-adapter/blob/master/src/java/arjdbc/util/StringCache.java