huseinzol05 / Stock-Prediction-Models

Gathers machine learning and deep learning models for Stock forecasting including trading bots and simulations
Apache License 2.0

Data entry points to app.py? #54

Closed: augmen closed this issue 4 years ago

augmen commented 4 years ago

I am stuck with the code. I want to know: we are feeding live data to the agent.trade() function with request.ipynb, so why do we need to feed the same data again here in df?

```python
df = pd.read_csv('TWTR.csv')
real_trend = df['Close'].tolist()
parameters = [df['Close'].tolist(), df['Volume'].tolist()]
minmax = MinMaxScaler(feature_range = (100, 200)).fit(np.array(parameters).T)
scaled_parameters = minmax.transform(np.array(parameters).T).T.tolist()
initial_money = np.max(parameters[0]) * 2
```

Here we are using the same CSV file, right? Or are we comparing past data with live data?

@huseinzol05 please guide, or anyone else who can guide, on the right method to feed live data.
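For reference, a minimal sketch of how this kind of app.py setup seems to be intended to work: the CSV is read once at startup, only to fit the scaler and initialise the agent, while /trade receives each new live observation. The `PlaceholderAgent` class below is a hypothetical stand-in for the repo's Agent, not the actual implementation.

```python
import json
import numpy as np
import pandas as pd
from flask import Flask, request, jsonify
from sklearn.preprocessing import MinMaxScaler

# Startup: the historical CSV is read ONCE, only to fit the scaler and seed the
# agent. It is not re-read for every live tick.
df = pd.read_csv('TWTR.csv')
real_trend = df['Close'].tolist()
parameters = [df['Close'].tolist(), df['Volume'].tolist()]
minmax = MinMaxScaler(feature_range=(100, 200)).fit(np.array(parameters).T)
initial_money = np.max(parameters[0]) * 2

class PlaceholderAgent:
    """Hypothetical stand-in for the repo's Agent class, for illustration only."""
    def __init__(self, minmax, initial_money):
        self.minmax = minmax
        self.balance = initial_money
        self._queue = []                    # rolling window of recent scaled ticks

    def trade(self, data):
        close, volume = data
        scaled = self.minmax.transform([[close, volume]])[0].tolist()
        self._queue.append(scaled)
        return {'status': 'received', 'queue_size': len(self._queue),
                'balance': self.balance}

agent = PlaceholderAgent(minmax, initial_money)
app = Flask(__name__)

@app.route('/trade', methods=['GET'])
def trade():
    # Each request carries ONE new live observation, e.g. [close, volume];
    # only live data flows through here, never the CSV.
    data = json.loads(request.args.get('data'))
    return jsonify(agent.trade(data))

if __name__ == '__main__':
    app.run(port=8005)
```

On this reading, the CSV only supplies the price/volume scale and a starting balance, while the live feed supplies the actual observations the agent trades on.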

augmen commented 4 years ago

I am still struggling to connect with the websocket, since the websocket fetches data in real time and in JSON.

First approach: I converted the JSON strings into an array like this:

```python
requests.get('http://localhost:8005/trade?data={}'.format([close, volume])).json()
```

and tried to feed them into the app.py Flask server using:

```python
@app.route('/trade', methods = ['GET'])
def trade():
    data = json.loads(request.args.get('data'))
    return jsonify(agent.trade(data))
```

At the same time we are feeding app.py with the .csv training file in these lines:

```python
df = pd.read_csv('TWTR.csv')
real_trend = df['Close'].tolist()
parameters = [df['Close'].tolist(), df['Volume'].tolist()]
minmax = MinMaxScaler(feature_range = (100, 200)).fit(np.array(parameters).T)
scaled_parameters = minmax.transform(np.array(parameters).T).T.tolist()
initial_money = np.max(parameters[0]) * 2
```
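For the first approach, a small sketch of the client side: take one raw JSON message from the websocket and forward it to the already running Flask /trade endpoint. The field names 'close' and 'volume' and the port are assumptions here; adjust them to whatever your feed actually sends.

```python
import json
import requests

def forward_tick(raw_message: str) -> dict:
    """Convert one raw websocket JSON message into [close, volume] and send it
    to the app.py /trade endpoint. Field names and port are assumptions."""
    tick = json.loads(raw_message)               # e.g. '{"close": 34.2, "volume": 51000}'
    payload = [float(tick['close']), float(tick['volume'])]
    resp = requests.get('http://localhost:8005/trade',
                        params={'data': json.dumps(payload)})
    return resp.json()

# Usage: call this from your websocket library's on_message callback.
# result = forward_tick(message)
# print(result)
```

Using `json.dumps(payload)` instead of `str.format` on a Python list keeps the query string valid JSON, so `json.loads(request.args.get('data'))` on the server side parses it cleanly.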

Second approach: we tried to save the live data to a DB and create a function called trade_iteration, which fetches batches of data from the DB, converts them into a pandas DataFrame, and feeds that into app.py via lines 350-355:

```python
df = get_last_n_records_for_symbol('ETHUSDT', n=RECORDS_TO_FEED)
print(df.shape)
RUN_INTERVAL_TIME_IN_SEC = 6
RECORDS_TO_FEED = 30
real_trend = df['close'].tolist()
print(real_trend)
parameters = [df['close'].tolist(), df['volume'].tolist()]
minmax = MinMaxScaler(feature_range=(100, 200)).fit(np.array(parameters).T)
scaled_parameters = minmax.transform(np.array(parameters).T).T.tolist()
initial_money = 10000
```
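For the second approach, a sketch of one polling iteration. Assumptions: `get_last_n_records_for_symbol` is your own DB helper quoted above, `agent` is the already initialised agent, and the assignment to `agent.minmax` is a guess about the agent's internals, not repo behaviour.

```python
import time
import numpy as np
from sklearn.preprocessing import MinMaxScaler

RUN_INTERVAL_TIME_IN_SEC = 6
RECORDS_TO_FEED = 30

def trade_iteration(agent):
    # Refit the scaler on the most recent window from the DB...
    df = get_last_n_records_for_symbol('ETHUSDT', n=RECORDS_TO_FEED)
    parameters = [df['close'].tolist(), df['volume'].tolist()]
    agent.minmax = MinMaxScaler(feature_range=(100, 200)).fit(np.array(parameters).T)
    # ...but feed the agent only the NEWEST observation, one tick per iteration.
    latest = [df['close'].iloc[-1], df['volume'].iloc[-1]]
    return agent.trade(latest)

# while True:
#     print(trade_iteration(agent))
#     time.sleep(RUN_INTERVAL_TIME_IN_SEC)
```

The idea is that the static DataFrame only defines the rolling context (the scaling window), while each call to agent.trade() consumes a single dynamic observation.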

In both of the above approaches we made sure the JSON data is a dictionary with 'close' and 'volume' keys.

The problem is that the DataFrame is static data while the JSON data is dynamic!

Can you guide us on the right method and format for feeding data into app.py from the websocket?

We are still not sure how app.py generates results through agent.trade().

Also, what is this, and what variables or functions is it used for?

```python
@app.route('/queue', methods = ['GET'])
def queue():
    return jsonify(agent._queue)
```

It looks like a queue for processing multiple iterations on the neural network. I thought it was a method.
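A guess based on the snippet itself (a hypothetical reconstruction, not a verified description of the repository's code): `_queue` looks like the agent's sliding window of recent observations, i.e. the state the model acts on, rather than a task queue, and the /queue route simply returns that window as JSON for inspection. Roughly:

```python
# Hypothetical reconstruction of what agent._queue appears to be: a fixed-length
# window of the most recent ticks that the model's decision is computed from.
class WindowedAgent:
    def __init__(self, window_size=20):
        self.window_size = window_size
        self._queue = []                          # sliding window of recent ticks

    def trade(self, data):
        self._queue.append(data)                  # newest [close, volume] tick
        if len(self._queue) > self.window_size:
            self._queue.pop(0)                    # drop the oldest tick
        if len(self._queue) < self.window_size:
            return {'status': 'filling window', 'action': 'nothing'}
        # ...otherwise the model would decide buy/sell/hold from self._queue...
        return {'status': 'ready', 'action': 'hold'}
```

On that reading, `jsonify(agent._queue)` just exposes whatever the agent is currently holding as state, so you can check that your live feed is actually reaching it.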

@huseinzol05 @AlconDivino @marvin-hansen Plz guide

marvin-hansen commented 4 years ago

@augmen

I have moved away from building trading infrastructure and now use QuantConnect (QC) to develop, test, and deploy strategies. QC supports all major deep learning frameworks and gives you free access to ~400 TB of data. It's not perfect, but at least you can relatively easily:

1) pull historical data from the data warehouse
2) load the data into your AI of choice
3) schedule regular training, say daily, weekly, or monthly
4) use the AI signal to (paper) trade
5) backtest, analyze, and tweak the results

Also, you get essentially for free:

It took me about 2-3 hours to get a simple neural net algo trading equities, and the results were okay (36% CAGR), but with some more tweaking and applying an RL algo to crypto, you should do way better. Equities, Forex, and Crypto all work perfectly fine on QC. Just copy the sample code, modify, test, tweak, and you're done in a few days at most.
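For context, a bare-bones QC Python algorithm has roughly this shape (a generic sketch of the QCAlgorithm API, not code from this repo; the moving-average "signal" is a placeholder for wherever your model's prediction would plug in):

```python
from AlgorithmImports import *

class SimpleSignalAlgorithm(QCAlgorithm):
    def Initialize(self):
        self.SetStartDate(2020, 1, 1)
        self.SetCash(100000)
        self.symbol = self.AddEquity("SPY", Resolution.Daily).Symbol

    def OnData(self, data):
        # Pull a small window of history and compute a placeholder signal;
        # replace this with your trained model's prediction.
        history = self.History(self.symbol, 30, Resolution.Daily)
        if history.empty:
            return
        closes = history['close']
        target = 1.0 if closes.iloc[-1] > closes.mean() else 0.0
        self.SetHoldings(self.symbol, target)
```

The point is just how little scaffolding is needed before you can backtest or paper trade a signal.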

As I said, QC is not perfect and has some caveats:

If you don't need Index Futures for portfolio hedging, or just want some easy & cheap algo cloud hosting for equity & crypto, then QC is perhaps the best thing around. Python sample code for just about everything is already there and the forum is quite helpful most of the time.

If your AUM exceeds $1 million, I would stay clear of QC and go straight to AlgoTerminal or a similar institutional-grade platform that has index data and level 2 access across all instruments.

https://github.com/QuantConnect/Lean/tree/master/Algorithm.Python
https://www.quantconnect.com/docs/home/home

augmen commented 4 years ago

@marvin-hansen thanks 🙏🙇 buddy.