envato / double_entry

A double-entry accounting system for Ruby applications.
https://rubygems.org/gems/double_entry
MIT License
429 stars, 68 forks

Why does #lock_tables need to be the outermost transaction? #64

Open keithpitt opened 10 years ago

keithpitt commented 10 years ago

Consider the following code:

class Invoice::Payer
  # ...

  def pay
    @invoice.lock do
      payment = CreditCard::Charger.charge(@invoice.account.credit_card, @invoice.description, @invoice.amount)
      if payment.success?
        @invoice.update(paid: true)
      end
    end
  end
end

class CreditCard::Charger
  # ...

  def charge
    # ...

    from_account = DoubleEntry.account(:credit_card, scope: @credit_card.account)
    to_account = DoubleEntry.account(:sales, scope: @credit_card.account)

    DoubleEntry.transfer(@amount, from: from_account, to: to_account, code: :purchase)
  end
end

This seems like pretty reasonable code to me, but unfortunately it doesn't work: in the Invoice class I'm locking the invoice (which creates a transaction), and then the DoubleEntry#transfer method fails because its transaction is not the outermost one.

What's the reasoning behind making it the outer-most transaction? What would the world be like if it "just worked"?

eadz commented 10 years ago

https://github.com/eadz/double_entry/pull/6

That branch passes, so there isn't any test coverage to help explain it.

There also isn't any git history on the locking file to help us :/ https://github.com/envato/double_entry/commits/master/lib/double_entry/locking.rb

In theory, if the code inside CreditCard::Charger doesn't depend on @invoice, I can't see how this can cause any issue, assuming @credit_card is loaded and doesn't reload, and there are no other SQL queries.

I think the main point is that you want all your locks taken at the top, and DoubleEntry certainly does that when it loads and/or creates the accounts. But it makes using DoubleEntry very hard in cases like the above: it inserts itself as an outer layer around any code that wants to do a financial transaction inside it, and it forces code that just wants to run a database query alongside a DoubleEntry transfer to know which accounts to lock.

I would prefer if that wasn't the case, as I like the look of the code above. It would be great to know exactly why it is required, so that if we understand the risks we can design around them and opt not to have DoubleEntry as an outer layer.

keithpitt commented 10 years ago

Thanks for the response @eadz - I think @notahat may know something? I vaguely remember.

notahat commented 10 years ago

It's been a while, and this is complicated, but it goes something like this:

You want to grab all your locks at the start of the transaction, and do them in a consistent order, to minimise deadlocks.

Deadlocks will happen though, particularly when we have to create AccountBalance records on the fly. That means we need the capability to retry the transaction.

The fun is, when you get a deadlock, MySQL will always roll back the outermost transaction. If double entry didn't own the outermost transaction, that could roll back other things you've done.

In the example above, if you wrote anything to the DB between @invoice.lock and DoubleEntry.transfer, you would lose that data whenever a deadlock happened.

Make any sort of sense?
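The failure mode above can be sketched with a toy model (my illustration, not the gem's code): on deadlock the database discards every write in the transaction, but only the inner retry loop sees the error, so the retry "succeeds" while the earlier write is silently gone.

```ruby
# Toy model: `rows` stands in for committed table rows, and clearing it
# models MySQL rolling back the OUTERMOST transaction on deadlock.
Deadlock = Class.new(StandardError)

rows = []
attempts = 0

rows << :invoice_paid            # e.g. @invoice.update(paid: true)
begin
  attempts += 1
  if attempts == 1
    rows.clear                   # MySQL rolls back the whole transaction...
    raise Deadlock               # ...but only this inner block sees the error
  end
  rows << :transfer              # the retried DoubleEntry transfer
rescue Deadlock
  retry
end

rows  # => [:transfer]  (the invoice update was silently lost)
```

If the retry loop instead wrapped everything, including the :invoice_paid write (the layout DoubleEntry enforces by owning the outermost transaction), the retry would redo both writes.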

notahat commented 10 years ago

Or, put another way:

You want reliable locking and deadlock handling? Hard things are hard.

pablolee commented 10 years ago

@rabidcarrot had also mentioned something about concurrency issues if all accounts weren't locked at the outermost level

notahat commented 10 years ago

Yep. If you don't do all your locking in one place and in a consistent order, you're way more likely to get deadlocks.

keithpitt commented 10 years ago

Thanks @notahat, I wonder how much of that is MySQL specific? I'm trying to find some docs on how deadlocks are handled in MySQL and Postgres, but as you put it...hard things are hard :P

notahat commented 10 years ago

I'm not sure whether Postgres behaves the same way.

eadz commented 10 years ago

DoubleEntry doesn't prevent deadlocks from within the lock_accounts block. You could do

DoubleEntry.lock_accounts(*accounts) do
  @user.lock
  @product.lock
end

in one place and

DoubleEntry.lock_accounts(*accounts) do
  @product.lock
  @user.lock
end

in another, which could cause a deadlock, so it's really up to the programmer to avoid this.
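The inconsistent ordering above can be avoided by always taking locks in one globally agreed order; a minimal sketch, where `Record`, `lock_key`, and the `acquired` log are illustrative rather than the gem's API:

```ruby
# Sketch: always take locks in one globally agreed order (here, sorted
# by a stable key).
Record = Struct.new(:lock_key)

def lock_in_order(records, acquired = [])
  # Sorting first means every call site acquires in the same order,
  # which removes the lock-ordering deadlock shown above.
  records.sort_by(&:lock_key).each { |r| acquired << r.lock_key }
  acquired
end

lock_in_order([Record.new("user-1"), Record.new("product-9")])
# => ["product-9", "user-1"]
lock_in_order([Record.new("product-9"), Record.new("user-1")])
# => ["product-9", "user-1"]  (same order either way)
```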

But the current locking API has really caused issues due to the outer transaction requirement (the main issue being that it's impossible to write code like in the first post). Maybe there is a way to improve this API?

The fun is, when you get a deadlock, MySQL will always roll back the outermost transaction. If double entry didn't own the outermost transaction, that could roll back other things you've done.

In Keith's example above, is that a problem?

Deadlocks will happen though, particularly when we have to create AccountBalance records on the fly. That means we need the capability to retry the transaction.

If this is the main reason, I don't see any need for the creation of account balance records to happen during the transaction. It could be a separate transaction that happened at some point earlier, for example by calling DoubleEntry.ensure_account_balance(@account). That transaction may have to be the outermost transaction, but at least we'd have separated the concern of creating new balance records from the concern of locking them, maybe allowing a more flexible locking API.
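The proposed ensure_account_balance is hypothetical (it is not part of the gem); an in-memory sketch of the idea, with a Hash standing in for the account balance table:

```ruby
# In-memory sketch: ACCOUNT_BALANCES stands in for the account balance
# table; ensure_account_balance is the hypothetical API described above.
ACCOUNT_BALANCES = {}

def ensure_account_balance(account_key, initial = 0)
  # Idempotent create-if-missing, mirroring the spirit of
  # AccountBalance.create_ignoring_duplicates!: repeat calls are no-ops.
  ACCOUNT_BALANCES[account_key] ||= initial
end

ensure_account_balance("sales-42")       # creates the record
ensure_account_balance("sales-42", 999)  # no-op: the record already exists
ACCOUNT_BALANCES["sales-42"]             # => 0
```

Run once per account, up front, this guarantees a later lock_accounts never has to create records mid-lock.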

notahat commented 10 years ago

Maybe there is a way to improve this API?

I don't think there is a way to improve the locking without either a) changing ActiveRecord's behaviour, b) changing MySQL's behaviour, or c) changing the database model used.

Keep in mind that this database model is a legacy model that had locking grafted on after the fact. If you were designing to make locking easy, it would likely look pretty different.

(If I was designing from scratch, I'd actually design for eventual consistency and not have to muck around with all this locking. That's the way industrial strength financial systems work in most places.)

In Keith's example above, is that a problem?

Keith's example has a "...". It depends on what's in the "...".

More to the point, there's no way to programmatically know if something happened in the "..." that might cause a problem, so if you allow stuff to happen, you'll get random nasty failures. The failures will be timing related, so are very unlikely to show up in testing. (See the jack_hammer script in this project for what you actually have to do to detect problems.)

eadz commented 9 years ago

Maybe there is a way to improve this API?

I don't think there is a way to improve the locking without either a) changing ActiveRecord's behaviour, b) changing MySQL's behaviour, or c) changing the database model used.

Requiring DoubleEntry.lock_tables to be the outermost transaction can result in less than ideal code.

Keith's example would be great to be able to do. I believe it is possible to change the API without a), b), or c), but by doing d): requiring account balance records to be created before doing a double entry transfer.

The creation of the account balance records is the tricky thing that really needs that outer lock, due to the rollback and retry stuff. Looking at the locking code, it is very AccountBalance specific.

I believe Keith's example could work provided he simply guaranteed that all required double entry account records already existed. This would require more thought on the part of someone implementing a double entry solution, but it could be as simple as creating a matching DoubleEntry::AccountBalance every time you create a CreditCard record. That way, when DoubleEntry tries to lock accounts, the lock won't fail due to no record existing.

notahat commented 9 years ago

The AccountBalance records exist primarily to enable locking! If you don't do per-account locks somehow, the totals in the lines table will get out of whack because of race conditions around knowing the current balance. (This is exactly the problem we had back when this code was first written, and it's what all this locking magic is there to fix.)
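The race described above can be shown as a linearized sketch (an illustration, not the gem's code): two transfers read the same balance before either writes, and the second write clobbers the first.

```ruby
# Lost-update race: without a per-account lock, both transfers read the
# same starting balance and the later write destroys the earlier one.
balance = 100
read_a = balance        # transfer A reads 100
read_b = balance        # transfer B also reads 100 (no lock held)
balance = read_a + 10   # A records +10, writes 110
balance = read_b + 20   # B records +20, writes 120: A's +10 is lost
balance                 # => 120, though 130 was intended
```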

If I was building finance code from scratch today, I'd make it eventually consistent. i.e. Just record that money was moved inside the transaction, and calculate balances after the fact. That's a better model in many ways, but it's not the model of this library, and there's not a way to make this library use that model without fundamentally changing the underlying data structures.

eadz commented 9 years ago

@notahat I'm not saying do away with account balance records or per-account locking. I'm saying the really complicated "ensure outermost transaction" locking code is due to the creation of account balance records, not the locking of them (or, rather, trying to lock them when they don't exist).

My hypothesis is that if, as part of your application, you ensure all account balance records are created separately before using them (and you may need to "ensure outermost transaction" for that), then once you have that assurance you can safely do DoubleEntry transactions within another transaction.

notahat commented 9 years ago

@eadz: There's definitely a lot of complexity in there to support lazy creation of account balances. And yes, you could simplify things if you assumed all account balances were created up front.

I don't think that lazy creation is the only reason for the outermost transaction requirement though.

eadz commented 9 years ago

@eadz: There's definitely a lot of complexity in there to support lazy creation of account balances. And yes, you could simplify things if you assumed all account balances were created up front.

I don't think that lazy creation is the only reason for the outermost transaction requirement though.

I think it is the only reason that really matters.

I think this is the essence of this issue: "Why does #lock_tables need to be the outermost transaction?" So we can agree that it's partly because of account balance creation, but why else?

What is the other reason for being the outermost transaction? (Other than deadlocks, which IMHO the outermost transaction requirement doesn't solve on its own, although it may help.)

And the other question is, whatever the reason, is it so important that it prevents us from having a cleaner API?

notahat commented 9 years ago

Deadlocks are the problem because of ActiveRecord and MySQL having broken behaviour around them.

If you get a deadlock in a nested transaction, ActiveRecord behaves as though just the inner transaction is being rolled back, but MySQL rolls back the outermost transaction. Makes it impossible to retry after a deadlock unless you can be sure you own the outermost transaction.

So you could abandon the requirement for being the outermost transaction as long as you abandon the retrying on deadlock.

eadz commented 9 years ago

If you get a deadlock in a nested transaction, ActiveRecord behaves as though just the inner transaction is being rolled back, but MySQL rolls back the outermost transaction. Makes it impossible to retry after a deadlock unless you can be sure you own the outermost transaction.

You may be able to pass requires_new: true when starting a transaction, in addition to raising ActiveRecord::Rollback outside of the inner transaction when the inner transaction fails, to get the behaviour you're after. But I'm not suggesting this as a solution.
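What requires_new: true buys you can be shown with a pure-Ruby model of savepoint semantics (a sketch under stated assumptions, not ActiveRecord's implementation): rolling back a requires_new block undoes only the inner writes.

```ruby
# Model: with requires_new: true, the nested call acts like a SAVEPOINT,
# so a rollback restores state to the savepoint, not to empty.
RollbackSignal = Class.new(StandardError)

def transaction(writes, requires_new: false)
  savepoint = writes.dup
  yield
rescue RollbackSignal
  raise unless requires_new    # without a savepoint, the error propagates
  writes.replace(savepoint)    # roll back to the savepoint only
end

writes = []
transaction(writes) do
  writes << :outer_write
  transaction(writes, requires_new: true) do
    writes << :inner_write
    raise RollbackSignal       # abort just the inner (savepoint) level
  end
end

writes  # => [:outer_write]
```

As noted above, this doesn't address MySQL's deadlock behaviour; it only illustrates the savepoint mechanics the suggestion relies on.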

I would suggest removing the automatic retry instead, because of the potential benefits to the API, the removal of database-specific and very complicated transactional/locking code, and the reduction in surprises such as the following:

DoubleEntry.lock_accounts(account_a, account_b) do
  payment = CreditCard::Charger.charge(@invoice.account.credit_card...)
  if payment.success?
    DoubleEntry.transfer(Money.new(20_00), :from => account_a, :to => account_b, :code => :purchase)
  end
  # other code that may cause a deadlock
end

It's not clear in the documentation that CreditCard::Charger.charge could be called twice in case of deadlocks... and I'm not sure the benefit you get from restarting after deadlocks is worth the downsides. Maybe deadlocks should be caught and handled higher up the stack, as there are usually other SQL transactions and other code within the DoubleEntry transaction that are not DoubleEntry's responsibility to restart or retry.

ejoubaud commented 8 years ago

What about a :retry_on_deadlock => false or :allow_outer_transaction => true option?

Retrying on deadlock and checking for the outermost transaction remain the default, so it's safe by default, but if you know what you're doing, you can take your chances and handle retries higher up in the stack. When the option is on, deadlocks or any other errors just bubble up and fail the outer transaction, which is free to have its own retry mechanism (and the option's documentation could even recommend one).

Maybe we could even provide a DoubleEntry.outer_transaction(&block) method implementing the deadlock retry mechanism, for use higher up in the stack when you need to wrap DoubleEntry calls along with other stuff?

Having use for a transaction around your financial transfer is not an edge case; I'd argue it's one of the most common and important things you may need to wrap in a transaction with other stuff. It's been a recurring issue for us. Well worth the extra complexity of an option in the code/interface, I reckon.

eadz commented 4 years ago

What about a :retry_on_deadlock => false or :allow_outer_transaction => true option?

I think we definitely need some more options.

I think there were some decisions made early on that have meant the API for using double entry is a little more difficult than it could be.

Having double entry do some "magic" (creating account balance records as part of locking) seems, in my opinion, to complicate things enough to warrant an alternative.

Some of the locking code seems to have code smells; here are three:

def create_missing_account_balances
  @accounts_without_balances.each do |account|
    # Get the initial balance from the lines table. <<- this comment 
    balance = account.balance
    # Try to create the balance record, but ignore it if someone else has done it in the meantime.
    AccountBalance.create_ignoring_duplicates!(account: account, balance: balance)
  end
end

I don't understand how there can be a balance in the lines table, when a lock is required to create a line and therefore an account balance record.

And finally, the DoubleEntry::Locking module seems to break SRP, as it does at least two major things.

From my reading, a potential way forward would be to make the following change:

Require AccountBalance records to exist before trying to lock them

This would remove a lot of the checks and edge cases around account balance records not existing. The creation of account balance records can happen somewhere else: when you call DoubleEntry.account(:my_account), you already know you'll need to lock that account.

From here there are a few options:

The creation of DoubleEntry account balance records DOES NOT need to happen with any locks held. It can't: there is nothing to lock on yet, which is why the code tries to create the record and ignores the error (a duplicate error) if it fails.

kschutt commented 4 years ago

Based on these discussions, and also on #168 and #160, wouldn't it be possible to eliminate locking by performing the following?

1. Before any transfer, ensure the AccountBalance object exists
2. Perform the transfer
3. Periodically perform a line check. I'd probably opt to do this every 15 minutes and have it automatically fix balance errors

eadz commented 4 years ago

@kschutt for most cases, you wouldn't want to eliminate locking. In theory you could rely on a line check, but it's a sanity checker and doesn't fix any errors it finds, AFAIK. For most cases you would want your balances to be accurate all the time, hence the locking.

This thread is about the programmer's API for locking and the requirement to have DoubleEntry as the outermost transaction, not about removing locks entirely.

eadz commented 3 years ago

I've been using advisory locks recently, and I think they could be another tool to potentially simplify the code.

Both Postgres and MySQL (> 5.7) support these.

I believe the following would allow Keith's code to work. Given that we already have a unique handle for the account, we can lock on the row before it exists.

Let's say the account is `"sales-#{customer.id}"`.

we can do

with_advisory_lock("double-entry-account-sales-2134") do
  account = AccountBalance.find_or_create_by_ref("double-entry-account-sales-2134")
  # this will never fail due to race conditions (? i think!! ?)
end

so that means that DoubleEntry.lock_accounts would look something like

# this is just an example of how it works.
# the actual code would map to names, sort the accounts, then run a bunch
# of nested advisory locks based on the names.
with_advisory_lock(account1_ref) do
  with_advisory_lock(account2_ref) do
    # find or create accounts
    # yield to do the transaction (no need to lock the rows).
  end
end
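One wrinkle with the sketch above: Postgres advisory locks are keyed by integers, not strings, so a handle like "double-entry-account-sales-2134" has to be mapped to a number first (the with_advisory_lock gem handles this conversion internally, as I understand it). An illustrative, not collision-proof, mapping:

```ruby
require "zlib"

# Map a string account handle to a stable integer advisory-lock key.
# CRC32 is used here purely as an illustration: it is deterministic, so
# every process contends on the same key for the same handle.
def advisory_lock_key(handle)
  Zlib.crc32(handle)
end

advisory_lock_key("double-entry-account-sales-2134")  # always the same Integer
```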

I'm not sure whether this solves the outermost transaction issue; I think if it just fails on deadlock rather than retrying, then that might be solved. But it would potentially simplify the locking module.

And/or @keithpitt could potentially use an advisory lock instead of @invoice.lock to achieve the code in the first post.