volatiletech / sqlboiler

Generate a Go ORM tailored to your database schema.
BSD 3-Clause "New" or "Revised" License

UpdateAll slices with its own change #248

Closed Dexus closed 6 years ago

Dexus commented 6 years ago

Hello everyone,

how can I update all items of a slice at once?

I have a slice where I update each item with new data, but I don't want to update them one by one.

Is there a way to update everything at once to reduce the number of queries? I haven't seen anything like that.

ceshihao commented 6 years ago

Do you mean UpdateAll ? https://github.com/volatiletech/sqlboiler/blob/master/templates/16_update.tpl#L142

Dexus commented 6 years ago

Maybe, but UpdateAll applies that one column update to every row in the slice. It doesn't update each slice item with its own changes and its own whitelist, the way Update does on a single item.

aarondl commented 6 years ago

Hey Dexus. It's intentional that this functionality is missing. Much like Go wants you to write for loops in order to force you to understand the performance implications of such code, so does sqlboiler. It's important to see a loop with an sqlboiler Update call in the middle of it, because that tells you that you're making N calls to the database. I don't think I'm interested in adding a helper for this at this time. Though if I've misunderstood, please clarify and we can re-open.

Dexus commented 6 years ago

Hi @aarondl,

let's say I have the following simplified code:


    rangAlliances := models.RankingAlliances(tx).AllP()
    sort.Slice(rangAlliances, func(i, j int) bool {
        return rangAlliances[i].Points > rangAlliances[j].Points
    })

    for idx, rangAlliance := range rangAlliances {
        //log.Print(idx)
        //log.Println(rangAlliance)
        rangAlliance.LastRank = rangAlliance.Rank
        rangAlliance.Rank = uint64(idx + 1)
        err := rangAlliance.Update(tx, "rank", "last_rank")
        if err != nil {
            log.Println(err)
        }
    }

Let's assume I now have 1,000,000 records so updating here takes 45 seconds as an example. But if I now have a query that has related updates, the time is reduced to about 15 seconds.

Now, assuming I have hundreds of millions of records, in different variants, that would save a lot of time later on, at least from my point of view. However, I can understand if you think the related updates should be issued individually.
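For this specific ranking use case, a single raw query can sometimes replace the sort-and-loop entirely, assuming a database with window functions (e.g. PostgreSQL) and the `ranking_alliances` table with `points`, `rank`, and `last_rank` columns implied by the model in the snippet above. This is only a sketch, not a sqlboiler feature:

```go
package main

// Sketch only, not part of sqlboiler: the table and column names are
// assumptions taken from the RankingAlliances model in the snippet above,
// and the SQL assumes PostgreSQL window-function support.
const rerankQuery = `
UPDATE ranking_alliances AS ra
SET last_rank = ra.rank,
    rank      = ranked.new_rank
FROM (
    SELECT id, ROW_NUMBER() OVER (ORDER BY points DESC) AS new_rank
    FROM ranking_alliances
) AS ranked
WHERE ra.id = ranked.id;
`

func main() {
	// In real code this would be a single round trip:
	//   _, err := tx.Exec(rerankQuery)
	// replacing the sort + loop + per-row Update entirely.
	_ = rerankQuery
}
```

Whether this is faster than N individual updates depends on the database, but it avoids the per-row round trips that dominate the 45-second figure above.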

Many thanks and greetings

aarondl commented 6 years ago

Hey @Dexus, I'm trying to understand what the actual optimization here is. "But if I now have a query that has related updates, the time is reduced to about 15 seconds." I'm not familiar with what this might be. Can you show me an example of what it might look like using just raw database/sql without sqlboiler, or even using your database directly? That would help a lot, thanks!

Dexus commented 6 years ago

Hey @aarondl, it's very simple:

1,000,000 updates in a for loop, each one a call like db.Exec("UPDATE one item ..."), versus a single db.Exec("UPDATE ...; UPDATE ...; UPDATE ...;") that updates 1,000,000 rows in one query call.

What I need is the option to build a concatenated query that updates all items with one query call.

That would save a lot of round trips (= time) if I'm able to update everything with one query call.
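The concatenated-query idea can be sketched without sqlboiler. One caveat: many Go drivers reject multiple statements in a single Exec by default (MySQL needs `multiStatements=true`, for instance), so instead of literally concatenating `UPDATE ...; UPDATE ...;`, a single statement with CASE expressions is often more portable. The table and column names below are assumptions taken from the earlier snippet, and the placeholders use PostgreSQL's `$N` style:

```go
package main

import (
	"fmt"
	"strings"
)

// rowUpdate holds the per-row values for the batched rank update.
type rowUpdate struct {
	ID       uint64
	Rank     uint64
	LastRank uint64
}

// buildBatchUpdate collapses N per-row updates into one UPDATE statement
// using CASE expressions, so the database is hit once instead of
// len(rows) times. Sketch only, not a sqlboiler API.
func buildBatchUpdate(rows []rowUpdate) (string, []interface{}) {
	var (
		rankCases strings.Builder
		lastCases strings.Builder
		ids       []string
		args      []interface{}
	)
	i := 1
	for _, r := range rows {
		// Three placeholders per row: id, new rank, new last_rank.
		rankCases.WriteString(fmt.Sprintf(" WHEN id = $%d THEN $%d", i, i+1))
		lastCases.WriteString(fmt.Sprintf(" WHEN id = $%d THEN $%d", i, i+2))
		ids = append(ids, fmt.Sprintf("$%d", i))
		args = append(args, r.ID, r.Rank, r.LastRank)
		i += 3
	}
	query := fmt.Sprintf(
		"UPDATE ranking_alliances SET rank = CASE%s END, last_rank = CASE%s END WHERE id IN (%s)",
		rankCases.String(), lastCases.String(), strings.Join(ids, ", "),
	)
	return query, args
}

func main() {
	q, args := buildBatchUpdate([]rowUpdate{
		{ID: 7, Rank: 1, LastRank: 2},
		{ID: 9, Rank: 2, LastRank: 1},
	})
	fmt.Println(q)
	fmt.Println(len(args)) // 6 arguments for 2 rows
	// In real code this would be a single round trip: tx.Exec(q, args...)
}
```

For very large batches the statement would still need to be chunked (parameter counts are limited, e.g. 65535 for PostgreSQL), but each chunk remains one round trip instead of thousands.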