ggtracker / ggtrackerstack

Project to run the whole ggtracker stack in vagrant

3.3.0 Update - Replays cannot be uploaded #47

Closed. gravelweb closed this issue 8 years ago.

gravelweb commented 8 years ago

Some users are reporting that they cannot upload their replays (https://twitter.com/dirak_/status/732745048111087616). Looking at GGTracker, replay uploads have definitely slowed down since the 3.3.0 patch.

dsjoerg commented 8 years ago

People started reporting issues on May 14th, so I suspect that GGTracker has operational problems unrelated to 3.3.0. https://twitter.com/derekduoba/status/731613284965699588

There may also be a 3.3.0 problem; it doesn't have to be just one or the other.

Anyway, if you can get any example of a reproducible problem in production or in dev, that would help a lot.

gravelweb commented 8 years ago

Stack trace from failed upload:

12:45:18 python.1 | 2016-05-18 12:45:18 2174 INFO parsing failed for replay uploads/a4760eb5-afc5-419f-9d35-15e066f10b75/issue47.SC2Replay. oh well. exception='utf8' codec can't decode byte 0xaf in position 0: invalid start byte. <type 'exceptions.UnicodeDecodeError'> jobs.py 73 Traceback (most recent call last):
12:45:18 python.1 |   File "/vagrant/esdb/vendor/ggpyjobs/ggtracker/jobs.py", line 73, in perform
12:45:18 python.1 |     replayDB, blob = sc2reader_to_esdb.processReplay(StringIO(replaystring), args['channel'])
12:45:18 python.1 |   File "/vagrant/esdb/vendor/ggpyjobs/sc2parse/sc2reader_to_esdb.py", line 116, in processReplay
12:45:18 python.1 |     replay = ggfactory.load_replay(stringio)
12:45:18 python.1 |   File "/vagrant/esdb/vendor/ggpyjobs/src/sc2reader/sc2reader/factories/sc2factory.py", line 85, in load_replay
12:45:18 python.1 |     return self.load(Replay, source, options, **new_options)
12:45:18 python.1 |   File "/vagrant/esdb/vendor/ggpyjobs/src/sc2reader/sc2reader/factories/sc2factory.py", line 137, in load
12:45:18 python.1 |     return self._load(cls, resource, filename=filename, options=options)
12:45:18 python.1 |   File "/vagrant/esdb/vendor/ggpyjobs/src/sc2reader/sc2reader/factories/sc2factory.py", line 146, in _load
12:45:18 python.1 |     obj = cls(resource, filename=filename, factory=self, **options)
12:45:18 python.1 |   File "/vagrant/esdb/vendor/ggpyjobs/src/sc2reader/sc2reader/resources.py", line 265, in __init__
12:45:18 python.1 |     self._read_data(data_file, self._get_reader(data_file))
12:45:18 python.1 |   File "/vagrant/esdb/vendor/ggpyjobs/src/sc2reader/sc2reader/resources.py", line 615, in _read_data
12:45:18 python.1 |     self.raw_data[data_file] = reader(data, self)
12:45:18 python.1 |   File "/vagrant/esdb/vendor/ggpyjobs/src/sc2reader/sc2reader/readers.py", line 123, in __call__
12:45:18 python.1 |     ) for i in range(data.read_bits(5))],
12:45:18 python.1 |   File "/vagrant/esdb/vendor/ggpyjobs/src/sc2reader/sc2reader/decoders.py", line 252, in read_aligned_string
12:45:18 python.1 |     return self._buffer.read_string(count, encoding)
12:45:18 python.1 |   File "/vagrant/esdb/vendor/ggpyjobs/src/sc2reader/sc2reader/decoders.py", line 108, in read_string
12:45:18 python.1 |     return self.read_bytes(count).decode(encoding)
12:45:18 python.1 |   File "/usr/lib/python2.7/encodings/utf_8.py", line 16, in decode
12:45:18 python.1 |     return codecs.utf_8_decode(input, errors, True)
12:45:18 python.1 | UnicodeDecodeError: 'utf8' codec can't decode byte 0xaf in position 0: invalid start byte
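
For reference, the failure reproduces outside the worker with a minimal script. This assumes the attached replay is saved locally as issue47.SC2Replay and the same sc2reader checkout is importable; the path is illustrative only.

    # Minimal reproduction sketch: loading an affected 3.3.0 replay raises the
    # same UnicodeDecodeError shown in the traceback above.
    import sc2reader

    replay = sc2reader.load_replay("issue47.SC2Replay")  # local path, adjust as needed
    print(replay.release_string)  # only reached once parsing succeeds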

gravelweb commented 8 years ago

I've included the stack trace, the replay file, and the files extracted from the MPQ archive in the tarball below: issue47.tar.gz
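
The extraction itself can be done with any MPQ tool; a quick sketch with the mpyq library (an assumption about tooling, not necessarily how the tarball was produced) looks like this:

    # Sketch: pull the raw component files out of the replay's MPQ archive.
    from mpyq import MPQArchive

    archive = MPQArchive("issue47.SC2Replay")
    archive.extract_to_disk()  # writes replay.details, replay.initData, etc.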

dsjoerg commented 8 years ago

OK, I've asked Blizzard for a new protocol file: https://github.com/Blizzard/s2protocol/issues/34

gravelweb commented 8 years ago

This is actually not affecting 2v2 and 3v3 games (and possibly not 4v4 either). Looks like only the 1v1 format changed?
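
One quick way to check that, assuming failing and working replays are sorted into per-format directories (a hypothetical layout), is to batch-parse them and count failures per format:

    # Rough check: count parse failures per game format.
    # The replays/1v1/, replays/2v2/, ... layout is an assumption.
    import glob
    import sc2reader

    for fmt in ("1v1", "2v2", "3v3", "4v4"):
        paths = glob.glob("replays/%s/*.SC2Replay" % fmt)
        failed = 0
        for path in paths:
            try:
                sc2reader.load_replay(path)
            except Exception:
                failed += 1
        print("%s: %d of %d failed to parse" % (fmt, failed, len(paths)))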

dsjoerg commented 8 years ago

I pushed out a fix yesterday: https://github.com/ggtracker/sc2reader/commit/3bc825ce73eb6d9b0f8bfe0b38976cea1dd39062

I believe 3.3.0 replays are being handled correctly now. There is still a separate heisenbug that started happening ~4 days ago, before 3.3.0 came out.
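
For anyone hitting the same UnicodeDecodeError elsewhere, the linked commit is the authoritative change; purely as an illustration of the failure mode (and not necessarily what that commit does), a string decode that tolerates stray bytes like 0xaf looks like this:

    # Illustration only: tolerate non-UTF-8 bytes when decoding a string field.
    # latin-1 maps every byte value, so the fallback never raises, though the
    # result may be garbled for genuinely multi-byte text.
    def safe_decode(raw, encoding="utf-8"):
        try:
            return raw.decode(encoding)
        except UnicodeDecodeError:
            return raw.decode("latin-1")

    print(safe_decode(b"\xafabc"))  # the byte from the traceback above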