joaosalvado10 opened 6 years ago
You can download the news data here: https://www.kaggle.com/uciml/news-aggregator-dataset/data and save the CSV file in the `data` folder. Then download the historical 2014 data of a stock here: https://finance.yahoo.com/quote/AAPL/history?p=AAPL and save that file to the `data` folder as well. Run `combine_data.py` to prepare the data. The LSTM model can be run by first training it with `LSTM/lstm_train.py`, then evaluating it with `LSTM/lstm_test.py`. To run the sentiment analysis model, run one of the scripts in the `predictionBySentiment` folder.
Disclaimer: this project was made only for a university master's course; do not expect fully tested and functional software.
When I tried to run `tf_test`, it reported that there are too few arguments. Could you please help me with this? :)
The code:
```python
from keras.models import load_model
from tf_data import TF_Data
import argparse

parser = argparse.ArgumentParser(
    prog='tf_test.py',
    formatter_class=argparse.ArgumentDefaultsHelpFormatter,
    description='Evaluating LSTM Model')
parser.add_argument("test_sentence")
args = parser.parse_args()
test_sentence_str = args.test_sentence

model = load_model('combined_MSFT_microsoft_tech_news_day_aftertomorrow.hdf5')
filename = '../data/all_data/combined_MSFT_microsoft_tech_news.csv'
data = TF_Data(filename, top_words=5000)
test_sentence = data.test_sentence(test_sentence_str)
print(model.predict(test_sentence)[0][0])
```
The error:

```
usage: tf_test.py [-h] test_sentence
tf_test.py: error: the following arguments are required: test_sentence
An exception has occurred, use %tb to see the full traceback.
```
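That argparse error means the script was started without the required positional argument: `tf_test.py` declares `test_sentence` as a mandatory command-line argument, so running it with no arguments (as happens when the file is executed inside IPython/Spyder without passing any) makes argparse print the usage line and raise `SystemExit`. A minimal sketch of the same parser behavior (only the parser is reproduced here, not the model code):

```python
import argparse

# Same shape as the parser in tf_test.py: one required positional argument.
parser = argparse.ArgumentParser(prog='tf_test.py',
                                 description='Evaluating LSTM Model')
parser.add_argument("test_sentence")

# Passing a sentence explicitly parses fine:
args = parser.parse_args(["Microsoft releases new Surface"])
print(args.test_sentence)

# With no arguments, argparse prints the usage/error text to stderr and
# raises SystemExit -- which IPython then reports as a traceback.
try:
    parser.parse_args([])
except SystemExit:
    print("missing positional argument")
```

So from a shell, run it as e.g. `python tf_test.py "some headline"`; inside IPython, use `%run tf_test.py "some headline"` so the sentence reaches `sys.argv`.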
Hello!
> The LSTM model can be run by first training the model in `LSTM/lstm_train.py`

Can you please clarify: to run the training, what should be passed as `model_name` in `def train(filename, model_name, day='tomorrow')`?
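Not the author, but judging from the checkpoint filename loaded in `tf_test.py` (`combined_MSFT_microsoft_tech_news_day_aftertomorrow.hdf5`), `model_name` looks like the base name under which the trained model is saved, with the `day` suffix appended. This is a hypothetical reconstruction of that naming convention, not the repository's actual code:

```python
def checkpoint_path(model_name, day='tomorrow'):
    # Hypothetical helper: the checkpoint loaded by tf_test.py suggests
    # the saved file is named "<model_name>_day_<day>.hdf5".
    return f"{model_name}_day_{day}.hdf5"

# This would reproduce the filename the test script loads:
print(checkpoint_path('combined_MSFT_microsoft_tech_news', day='aftertomorrow'))
# -> combined_MSFT_microsoft_tech_news_day_aftertomorrow.hdf5
```

If that guess is right, `model_name` would be something like `'combined_MSFT_microsoft_tech_news'`, matching the combined CSV produced by `combine_data.py`.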
Hello, it seems like you have done great work. I would like to know how to run your code.
Thank you