bradfitz / gomemcache

Go Memcached client library #golang
Apache License 2.0

Timeouts when calling Set() in concurrent goroutines #88

Open philippgille opened 5 years ago

philippgille commented 5 years ago

I'm pretty new to Go and very new to Memcached, which is probably a bad combination for using gomemcache, but I couldn't find any info regarding my problem in the docs.

I start Memcached with Docker like this: docker run -it --rm -p 11211:11211 memcached

Then I run this code:

package main

import (
    "fmt"
    "strconv"
    "sync"

    "github.com/bradfitz/gomemcache/memcache"
)

func main() {
    mc := memcache.New("localhost:11211")

    // This one works
    mySet(mc, -1)
    // See:
    myGet(mc, -1)

    goroutineCount := 100
    waitGroup := sync.WaitGroup{}
    waitGroup.Add(goroutineCount)

    for i := 0; i < goroutineCount; i++ {
        go func(i int) {
            // These lead to errors
            mySet(mc, i)
            //waitGroup.Done()
        }(i)
    }
    waitGroup.Wait()
}

func mySet(mc *memcache.Client, i int) {
    item := memcache.Item{
        Key:   strconv.Itoa(i),
        Value: []byte("foo"),
    }
    err := mc.Set(&item)
    if err != nil {
        fmt.Println(err)
    }
}

func myGet(mc *memcache.Client, i int) {
    item, err := mc.Get(strconv.Itoa(i))
    if err != nil {
        fmt.Println(err)
        return
    }
    fmt.Println("value: " + string(item.Value))
}

waitGroup.Done() is commented out because calling it leads to even more error output (?). The code is only meant to reproduce the behavior, so it doesn't matter that the program never finishes this way.

This is what I get:

value: foo
read tcp 127.0.0.1:39385->127.0.0.1:11211: i/o timeout
read tcp 127.0.0.1:39379->127.0.0.1:11211: i/o timeout
read tcp 127.0.0.1:39387->127.0.0.1:11211: i/o timeout
read tcp 127.0.0.1:39382->127.0.0.1:11211: i/o timeout
read tcp 127.0.0.1:39389->127.0.0.1:11211: i/o timeout
read tcp 127.0.0.1:39391->127.0.0.1:11211: i/o timeout
read tcp 127.0.0.1:39401->127.0.0.1:11211: i/o timeout
read tcp 127.0.0.1:39394->127.0.0.1:11211: i/o timeout
read tcp 127.0.0.1:39396->127.0.0.1:11211: i/o timeout
read tcp 127.0.0.1:39390->127.0.0.1:11211: i/o timeout
read tcp 127.0.0.1:39409->127.0.0.1:11211: i/o timeout
read tcp 127.0.0.1:39412->127.0.0.1:11211: i/o timeout
read tcp 127.0.0.1:39410->127.0.0.1:11211: i/o timeout
read tcp 127.0.0.1:39413->127.0.0.1:11211: i/o timeout
read tcp 127.0.0.1:39406->127.0.0.1:11211: i/o timeout
read tcp 127.0.0.1:39405->127.0.0.1:11211: i/o timeout
read tcp 127.0.0.1:39415->127.0.0.1:11211: i/o timeout
read tcp 127.0.0.1:39418->127.0.0.1:11211: i/o timeout
read tcp 127.0.0.1:39416->127.0.0.1:11211: i/o timeout
memcache: connect timeout to 127.0.0.1:11211
memcache: connect timeout to 127.0.0.1:11211
memcache: connect timeout to 127.0.0.1:11211
read tcp 127.0.0.1:39414->127.0.0.1:11211: i/o timeout
memcache: connect timeout to 127.0.0.1:11211
read tcp 127.0.0.1:39417->127.0.0.1:11211: i/o timeout
memcache: connect timeout to 127.0.0.1:11211
read tcp 127.0.0.1:39425->127.0.0.1:11211: i/o timeout
memcache: connect timeout to 127.0.0.1:11211
read tcp 127.0.0.1:39421->127.0.0.1:11211: i/o timeout
memcache: connect timeout to 127.0.0.1:11211
memcache: connect timeout to 127.0.0.1:11211
memcache: connect timeout to 127.0.0.1:11211
memcache: connect timeout to 127.0.0.1:11211
read tcp 127.0.0.1:39443->127.0.0.1:11211: i/o timeout
memcache: connect timeout to 127.0.0.1:11211
memcache: connect timeout to 127.0.0.1:11211
memcache: connect timeout to 127.0.0.1:11211
memcache: connect timeout to 127.0.0.1:11211
memcache: connect timeout to 127.0.0.1:11211
memcache: connect timeout to 127.0.0.1:11211
read tcp 127.0.0.1:39419->127.0.0.1:11211: i/o timeout
memcache: connect timeout to 127.0.0.1:11211
read tcp 127.0.0.1:39422->127.0.0.1:11211: i/o timeout

So the first Set() and Get() calls work, which indicates that I created the client properly and that the server is running fine. But as soon as the calls happen concurrently in multiple goroutines, I get errors.

The server itself is still fine: when I restart the program, the first Set() and Get() succeed again.

philippgille commented 5 years ago

Okay, I just realized it might actually be the server that can't handle the load properly.

Can anyone confirm this?

time2k commented 5 years ago

Manually increasing the Client.MaxIdleConns property may help.