I've been caught by a very strange bug on my production server related to decimal fields:
Reading values larger than roughly 20000 returns garbage.
The issue only occurs with decimal(18, 5) fields; decimal(18, 2) fields work fine (I haven't tested other scales).
The data is written correctly in the database; I can verify that with isql directly. The problem only appears when the value is read through fb.
I'm pretty sure the bug lurks in the fb gem: isql works fine, and the fb-activerecord-adapter is bypassed in my test.
Strangely enough, I wasn't able to reproduce the bug on my dev machine, but I've found out why: it's an x86_64 machine, while my production server is x86.
decimal(18, x) values are stored as 64-bit integers, and I think fb is getting the scale math wrong in a way that only affects 32-bit architectures.
Maybe "static VALUE fb_cursor_fetch(struct FbCursor *fb_cursor)" at fb.c@1961 is the culprit? Or maybe the conditional compilation switched by HAVE_LONG_LONG is involved?
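For context, Firebird stores a decimal(18, x) column as a 64-bit scaled integer: the raw stored value is the number multiplied by 10**scale. A minimal sketch of that encoding (encode_decimal/decode_decimal are illustrative names, not the gem's API):

```ruby
# Firebird keeps decimal(18, x) as a 64-bit integer scaled by 10**x.
# These helpers are only a sketch of the encoding, not fb's real API.
def encode_decimal(value, scale)
  (value * 10**scale).to_i
end

def decode_decimal(raw, scale)
  raw / 10.0**scale
end

raw = encode_decimal(50000, 5)  # 5_000_000_000: does NOT fit in 32 bits
puts raw                        # => 5000000000
puts decode_decimal(raw, 5)     # => 50000.0
```

Note that for scale 5 the raw value overflows 32 bits as soon as the number exceeds about 21474 (2**31 / 10**5), whereas for scale 2 the same 50000 is stored as 5_000_000, which fits comfortably.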
How to reproduce (32bit architecture needed):
require 'fb'
include Fb
db = Database.new(:database => "#{File.expand_path(File.dirname(__FILE__))}/test.fdb", :username => 'sysdba', :password => 'masterkey')
conn = db.connect rescue db.create.connect
conn.execute("create table TEST (ID int not null primary key, N decimal(18, 5))") unless conn.table_names.include?("TEST")
conn.execute("delete from TEST")
conn.execute("insert into TEST values (?, ?)", 1, 50000)
puts "50000 = #{conn.query(:hash, "select * from TEST").first['N']} ?"
On my production server I get:
"50000 = 7050.32704 ?"
and on my dev machine, as expected:
"50000 = 50000.0 ?"
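That garbage value is exactly what you get if the 64-bit raw integer is truncated to its low 32 bits somewhere along the way (my guess about what the HAVE_LONG_LONG path is doing wrong), and it also explains why the problem starts near 20000, since 2**31 / 10**5 ≈ 21474.8:

```ruby
raw = 50000 * 10**5           # 5_000_000_000: the scaled integer Firebird stores
truncated = raw & 0xFFFFFFFF  # keep only the low 32 bits => 705_032_704
puts truncated / 10.0**5      # prints 7050.32704, the exact garbage value I see
```

So the arithmetic matches the observed output exactly, which strongly suggests a 32-bit truncation of the int64 raw value rather than a scale-lookup error.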