GoogleCodeExporter opened this issue 9 years ago
Agreed. Bitcoincharts seems unable to keep their service up and running, so
no longer relying on them seems like a wise choice. The addition of stale
datafeed detection only helps us mitigate the damage caused by a stale
datafeed, but unfortunately does not solve the problem =(
If the Gox API is also not good enough, maybe we could increase the reliability
of ga-bitbot by simply restarting the feed scripts automatically when a
stale datafeed is detected, so our bitbots keep running without supervision and
manual restarts.
Original comment by purge...@gmail.com
on 3 Apr 2013 at 6:03
Only a temporary 'fix', but could not resist.. =]
If you insert the following line just below line 180 in bid_maker.py:
os.system('kill -2 `pgrep -f "pypy gal.py server"`; sleep 60')
...the bid_maker will kill [as in CTRL+C] the gal.py server process. All the
processes spawned by it should also be terminated. All you need to do next is
to ensure the server process gets respawned automatically - you may run it as:
while true; do nice -5 pypy bid_maker.py; sleep 60; done
...and as soon as the stale datafeed is detected the bid_maker should kill the
gal.py, which in turn should restart, running the feed sync scripts again and
making sure the feed is up-to-date again.
Note: This does not work on Windows unless you manage to get/install the GNU
utils [Cygwin.. hint-hint =]
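The same respawn logic could also live in a single Python script instead of the shell loop. A minimal sketch (Python 3; `supervise` and its parameters are my own hypothetical names, not part of ga-bitbot):

```python
import subprocess
import time


def supervise(cmd, max_restarts=3, delay=0):
    """Respawn cmd each time it exits non-zero, up to max_restarts times,
    mirroring the 'while true; do ...; sleep 60; done' shell loop above."""
    restarts = 0
    while True:
        rc = subprocess.call(cmd)
        if rc == 0 or restarts >= max_restarts:
            return rc
        restarts += 1
        time.sleep(delay)

# e.g. supervise(['pypy', 'bid_maker.py'], delay=60) to match the shell one-liner
```

Unlike `pgrep`/`kill`, this only needs the standard library, so it would also run on Windows.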
Original comment by purge...@gmail.com
on 3 Apr 2013 at 6:42
Rather than killing and restarting the whole system for a stale feed, I am
trying:
(@41) from subprocess import check_output as call, Popen, PIPE
.....
(@181) Popen(shlex.split('python bcfeed_synch.py -d')).wait()
pulled from gal.py, to just sync up the latest data, which is what gal.py does
at startup before starting the ongoing feed.
i.e.:
#capture the price feeds regardless of client or server mode
#servers need it for reporting and clients need it for processing
#update the dataset
print "gal: Synching the local datafeed..."
Popen(shlex.split('python bcfeed_synch.py -d')).wait()
#launch the bcfeed script to collect data from the live feed
print "gal: Starting the live datafeed capture script..."
p = Popen(shlex.split('python bcfeed.py'),stdin=fnull, stdout=fnull, stderr=BCFEED_STDERR_FILE)
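The approach above amounts to re-running the one-shot sync whenever staleness is detected, instead of killing the whole system. A minimal sketch of that handler (Python 3; `resync_datafeed` is my own hypothetical name, not part of ga-bitbot):

```python
import shlex
from subprocess import Popen


def resync_datafeed(command='python bcfeed_synch.py -d'):
    """Run the one-shot sync script and block until it finishes, as gal.py
    does at startup; returns the subprocess exit code."""
    return Popen(shlex.split(command)).wait()
```

bid_maker.py could call this on stale-feed detection and carry on once it returns 0, leaving gal.py and the live capture script running.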
Original comment by anderson...@gmail.com
on 3 Apr 2013 at 7:31
Sorry about the duplicate of mine.
I tried updating the feed from scratch, but it still says the same thing and
attempts to re-sync from the oldest end of the data feed rather than the newer
end. I think this is because bitcoincharts has reversed the order or something?
Looks that way, anyway.
Are people still experiencing this issue?
Original comment by JohnnyGe...@gmail.com
on 22 Apr 2013 at 10:37
I believe I've come up with a good workaround for the current problem with
bcfeed_synch.py. Currently bitcoincharts.com only returns the last 20,000
values when you give it an epoch time value in this URL:
"http://bitcoincharts.com/t/trades.csv?symbol=mtgoxUSD&start={START_TIME}".
Unfortunately, START_TIME is hard-coded to "0" in an attempt to get all
historical data the first time this script is run.
Then, on top of all of this, bitcoincharts.com changed the order of their data
to newest-first-oldest-last, while bcfeed.py writes the data feed in
oldest-first-newest-last order, which means a run of report_gen.py produces
quartile graphs that appear to double back on themselves.
All of that is to say that bitcoincharts themselves have a better, more
canonical source of data that contains all the data they've collected.
Below are my diffs and an explanation of what I'm doing differently.
diff --git a/gal.py b/gal.py
@@ -254,2 +254,2 @@ atexit.register(shutdown)
-print "gal: Synching the local datafeed..."
-Popen(shlex.split('python bcfeed_synch.py -d')).wait()
+#print "gal: Synching the local datafeed..."
+#Popen(shlex.split('python bcfeed_synch.py -d')).wait()
@@ -258,3 +258,3 @@ Popen(shlex.split('python bcfeed_synch.py -d')).wait()
-print "gal: Starting the live datafeed capture script..."
-p = Popen(shlex.split('python bcfeed.py'),stdin=fnull, stdout=fnull, stderr=BCFEED_STDERR_FILE)
-no_monitor.append(p)
+#print "gal: Starting the live datafeed capture script..."
+#p = Popen(shlex.split('python bcfeed.py'),stdin=fnull, stdout=fnull, stderr=BCFEED_STDERR_FILE)
+#no_monitor.append(p)
All this does is remove bcfeed_synch.py from the startup process, as well as
the live datafeed capture script; you'll see why I do that in a minute.
diff --git a/bcfeed_synch.py b/bcfeed_synch.py
@@ -22,2 +22,2 @@ print "-"*80
-link = """http://bitcoincharts.com/t/trades.csv?symbol=mtgoxUSD&start={START_TIME}"""
+link = """http://api.bitcoincharts.com/v1/csv/mtgoxUSD.csv"""
@@ -40 +40 @@ if len(sys.argv) >= 2:
- link = link.replace('{START_TIME}',str(start_time))
+ #link = link.replace('{START_TIME}',str(start_time))
Here is the real meat of the change. Changing this to pull from the pre-packed
CSV files means that we get the complete trade history for MtGox. The downside
is that every time this script runs, we pull a 250MB file down from
bitcoincharts. What I do personally is run bcfeed.py in its own screen session,
since it likes to die suddenly, and I want to know why.
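Since the pre-packed CSV comes back newest-first while bcfeed.py writes (and report_gen.py expects) oldest-first, the rows would need reordering before being written out. A sketch (Python 3), assuming the bitcoincharts column layout of timestamp,price,amount; `normalize_trades` is a hypothetical helper, not part of ga-bitbot:

```python
import csv
import io


def normalize_trades(csv_text):
    """Parse a bitcoincharts trades CSV and return the rows sorted
    oldest-first, the order the rest of the pipeline expects."""
    rows = [r for r in csv.reader(io.StringIO(csv_text)) if r]
    rows.sort(key=lambda r: int(r[0]))  # column 0 is the unix timestamp
    return rows
```

Feeding the downloaded CSV through this before writing the local datafeed should stop the quartile graphs from doubling back on themselves.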
Hopefully all of this has helped you out. If you have any questions, please
drop me a line.
Original comment by geek...@gmail.com
on 11 Sep 2013 at 8:13
Original issue reported on code.google.com by
anderson...@gmail.com
on 2 Apr 2013 at 11:40