**Open** · Zanonia opened this issue 6 years ago
Hello,
That's weird. Have you edited db.json by hand? I can't see any other way that could happen (otherwise it would be a serious bug in Python's standard-library `json` module, which I wouldn't expect for such a simple case).
I didn't touch db.json until t2m crashed. I kept a copy of the original; I think the problem is the stray comma on line 259:
```json
865219095100313600,
865303903293906944,
865457539185483776,
865468081333616640,
865472026546364418,
86548177475470,
,
865488515278163969,
865565397780754434,
865582743744385024,
```
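For what it's worth, a bare comma like the one above is enough to make Python's `json` module reject the entire file. A minimal sketch (this is an illustrative fragment shaped like the excerpt, not t2m's actual loading code):

```python
import json

# A fragment shaped like the db.json excerpt above: the bare comma
# on its own line leaves an empty slot in the array, which is invalid JSON.
broken = """
[
    865472026546364418,
    86548177475470,
    ,
    865488515278163969
]
"""

try:
    json.loads(broken)
except json.JSONDecodeError as e:
    # JSONDecodeError carries the position of the offending token,
    # which is how you can find the bad line in a large db.json.
    print(f"invalid JSON: {e.msg} at line {e.lineno}, column {e.colno}")
```

So any crash would happen at load time, before t2m touches the data, which is consistent with the file having been corrupted on a previous write.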
Could a large number of tweets to transfer, combined with running the script too frequently (the cron interval was 2 minutes at the time), cause this kind of problem?
I also noticed in Nginx's error log that when the media attached to a tweet exceeded the instance's size limit (1 MB on mine), the upload was retried on every t2m run, and this went on for several days. I don't know whether that deserves a new issue.
Hello,
I have been getting this error since today, even though the script had been running without problems for months and nothing has changed on my system:
Edit: Apparently the problem was due to malformed data in db.json. I don't understand how that happened. I solved it by rewriting a new db.json by hand.
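One way to catch this kind of corruption before the script runs is a small pre-flight check that just tries to parse the file. A minimal sketch, assuming db.json is any JSON document (the exact schema t2m uses is not shown here; `validate_db` is a hypothetical helper, not part of t2m):

```python
import json
import tempfile

def validate_db(path: str) -> bool:
    """Return True if the file at `path` parses as JSON; report the error otherwise."""
    try:
        with open(path, encoding="utf-8") as f:
            json.load(f)
        return True
    except json.JSONDecodeError as e:
        print(f"{path} is corrupt: {e.msg} (line {e.lineno}, column {e.colno})")
        return False

# Demo with a throwaway file containing the kind of stray comma seen above.
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    f.write("[865472026546364418,\n,\n865488515278163969]")
    bad_path = f.name

print(validate_db(bad_path))  # False: the bare comma breaks the array
```

Running such a check from the same cron entry, before t2m itself, would at least turn a silent crash loop into an explicit error message pointing at the bad line.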