I succeeded in training the model by executing finbert_training.ipynb. Then I ran a Docker container built from the Dockerfile and sent a POST request to the container (localhost:8080), which produced the following error.
08/27/2021 17:03:42 - INFO - pytorch_pretrained_bert.modeling - Better speed can be achieved with apex installed from https://www.github.com/nvidia/apex .
[nltk_data] Downloading package punkt to /root/nltk_data...
[nltk_data] Unzipping tokenizers/punkt.zip.
08/27/2021 17:03:42 - INFO - pytorch_pretrained_bert.modeling - loading archive file /src/models/classifier_model/finbert-sentiment
08/27/2021 17:03:42 - INFO - pytorch_pretrained_bert.modeling - Model config {
"_name_or_path": "c:\\Users\\user\\projects\\finbert\\finBERT\\models\\language_model\\finbertTRC2",
"attention_probs_dropout_prob": 0.1,
"gradient_checkpointing": false,
"hidden_act": "gelu",
"hidden_dropout_prob": 0.1,
"hidden_size": 768,
"id2label": {
"0": "LABEL_0",
"1": "LABEL_1",
"2": "LABEL_2"
},
"initializer_range": 0.02,
"intermediate_size": 3072,
"label2id": {
"LABEL_0": 0,
"LABEL_1": 1,
"LABEL_2": 2
},
"layer_norm_eps": 1e-12,
"max_position_embeddings": 512,
"model_type": "bert",
"num_attention_heads": 12,
"num_hidden_layers": 12,
"pad_token_id": 0,
"position_embedding_type": "absolute",
"type_vocab_size": 2,
"vocab_size": 30522
}
08/27/2021 17:03:45 - INFO - pytorch_pretrained_bert.modeling - Weights from pretrained model not used in BertForSequenceClassification: ['bert.embeddings.position_ids']
* Serving Flask app 'main' (lazy loading)
* Environment: production
WARNING: This is a development server. Do not use it in a production deployment.
Use a production WSGI server instead.
* Debug mode: off
08/27/2021 17:03:45 - WARNING - werkzeug - * Running on all addresses.
WARNING: This is a development server. Do not use it in a production deployment.
08/27/2021 17:03:45 - INFO - werkzeug - * Running on http://172.17.0.2:8080/ (Press CTRL+C to quit)
08/27/2021 17:10:41 - INFO - filelock - Lock 140200106040192 acquired on /root/.cache/huggingface/transformers/c1d7f0a763fb63861cc08553866f1fc3e5a6f4f07621be277452d26d71303b7e.20430bd8e10ef77a7d2977accefe796051e01bc2fc4aa146bc862997a1a15e79.lock
Downloading: 100% 28.0/28.0 [00:00<00:00, 10.4kB/s]
08/27/2021 17:10:42 - INFO - filelock - Lock 140200106040192 released on /root/.cache/huggingface/transformers/c1d7f0a763fb63861cc08553866f1fc3e5a6f4f07621be277452d26d71303b7e.20430bd8e10ef77a7d2977accefe796051e01bc2fc4aa146bc862997a1a15e79.lock
08/27/2021 17:10:43 - INFO - filelock - Lock 140200106041056 acquired on /root/.cache/huggingface/transformers/3c61d016573b14f7f008c02c4e51a366c67ab274726fe2910691e2a761acf43e.37395cee442ab11005bcd270f3c34464dc1704b715b5d7d52b1a461abe3b9e4e.lock
Downloading: 100% 570/570 [00:00<00:00, 406kB/s]
08/27/2021 17:10:43 - INFO - filelock - Lock 140200106041056 released on /root/.cache/huggingface/transformers/3c61d016573b14f7f008c02c4e51a366c67ab274726fe2910691e2a761acf43e.37395cee442ab11005bcd270f3c34464dc1704b715b5d7d52b1a461abe3b9e4e.lock
08/27/2021 17:10:44 - INFO - filelock - Lock 140200106040720 acquired on /root/.cache/huggingface/transformers/45c3f7a79a80e1cf0a489e5c62b43f173c15db47864303a55d623bb3c96f72a5.d789d64ebfe299b0e416afc4a169632f903f693095b4629a7ea271d5a0cf2c99.lock
Downloading: 100% 232k/232k [00:00<00:00, 490kB/s]
08/27/2021 17:10:45 - INFO - filelock - Lock 140200106040720 released on /root/.cache/huggingface/transformers/45c3f7a79a80e1cf0a489e5c62b43f173c15db47864303a55d623bb3c96f72a5.d789d64ebfe299b0e416afc4a169632f903f693095b4629a7ea271d5a0cf2c99.lock
08/27/2021 17:10:46 - INFO - filelock - Lock 140200106040768 acquired on /root/.cache/huggingface/transformers/534479488c54aeaf9c3406f647aa2ec13648c06771ffe269edabebd4c412da1d.7f2721073f19841be16f41b0a70b600ca6b880c8f3df6f3535cbc704371bdfa4.lock
Downloading: 100% 466k/466k [00:00<00:00, 595kB/s]
08/27/2021 17:10:47 - INFO - filelock - Lock 140200106040768 released on /root/.cache/huggingface/transformers/534479488c54aeaf9c3406f647aa2ec13648c06771ffe269edabebd4c412da1d.7f2721073f19841be16f41b0a70b600ca6b880c8f3df6f3535cbc704371bdfa4.lock
['The Federal Reserve is committed to using its full range of tools to support the U.S. economy in this challenging time, thereby promoting its maximum employment and price stability goals.']
08/27/2021 17:10:49 - INFO - finbert.utils - *** Example ***
08/27/2021 17:10:49 - INFO - finbert.utils - guid: 0
08/27/2021 17:10:49 - INFO - finbert.utils - tokens: [CLS] the federal reserve is committed to using its full range of tools to support the u . s . economy in this challenging time , thereby promoting its maximum employment and price stability goals . [SEP]
08/27/2021 17:10:49 - INFO - finbert.utils - input_ids: 101 1996 2976 3914 2003 5462 2000 2478 2049 2440 2846 1997 5906 2000 2490 1996 1057 1012 1055 1012 4610 1999 2023 10368 2051 1010 8558 7694 2049 4555 6107 1998 3976 9211 3289 1012 102 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
08/27/2021 17:10:49 - INFO - finbert.utils - attention_mask: 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
08/27/2021 17:10:49 - INFO - finbert.utils - token_type_ids: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
08/27/2021 17:10:49 - INFO - finbert.utils - label: None (id = 9090)
[<finbert.utils.InputFeatures object at 0x7f82e1849d90>]
08/27/2021 17:10:49 - INFO - root - tensor([ 2.1882, -2.1247, -0.7895])
[ 2.1882384 -2.1246738 -0.78948754]
08/27/2021 17:10:49 - ERROR - main - Exception on / [POST]
Traceback (most recent call last):
File "/opt/conda/lib/python3.8/site-packages/flask/app.py", line 2070, in wsgi_app
response = self.full_dispatch_request()
File "/opt/conda/lib/python3.8/site-packages/flask/app.py", line 1515, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/opt/conda/lib/python3.8/site-packages/flask_cors/extension.py", line 165, in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
File "/opt/conda/lib/python3.8/site-packages/flask/app.py", line 1513, in full_dispatch_request
rv = self.dispatch_request()
File "/opt/conda/lib/python3.8/site-packages/flask/app.py", line 1499, in dispatch_request
return self.ensure_sync(self.view_functions[rule.endpoint])(**req.view_args)
File "/src/main.py", line 21, in score
return(predict(text, model).to_json(orient='records'))
File "/src/finbert/finbert.py", line 615, in predict
logits = softmax(np.array(logits))
File "/src/finbert/utils.py", line 215, in softmax
e_x = np.exp(x - np.max(x, axis=1)[:, None])
File "<__array_function__ internals>", line 5, in amax
File "/opt/conda/lib/python3.8/site-packages/numpy/core/fromnumeric.py", line 2705, in amax
return _wrapreduction(a, np.maximum, 'max', axis, None, out,
File "/opt/conda/lib/python3.8/site-packages/numpy/core/fromnumeric.py", line 87, in _wrapreduction
return ufunc.reduce(obj, axis, dtype, out, **passkwargs)
numpy.AxisError: axis 1 is out of bounds for array of dimension 1
08/27/2021 17:10:49 - INFO - werkzeug - 172.17.0.1 - - [27/Aug/2021 17:10:49] "POST / HTTP/1.1" 500 -
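For reference, the `numpy.AxisError` above comes from calling `np.max(x, axis=1)` on a 1-D array. A minimal sketch reproducing it outside the container (the `softmax` here mirrors the row-wise version in `finbert/utils.py`, and the logit values are simply the ones printed in the log):

```python
import numpy as np

def softmax(x):
    # Row-wise softmax, as in finbert/utils.py: assumes x is 2-D (batch, classes)
    e_x = np.exp(x - np.max(x, axis=1)[:, None])
    return e_x / e_x.sum(axis=1)[:, None]

# The model call returned a 1-D array of logits, shape (3,)
logits_1d = np.array([2.1882, -2.1247, -0.7895])

try:
    softmax(logits_1d)
    raised = False
except np.AxisError as err:
    raised = True
    print(err)  # axis 1 is out of bounds for array of dimension 1
```

So the 500 response is not a Flask problem: the logits reaching `softmax` have lost their batch dimension.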
['The Federal Reserve is committed to using its full range of tools to support the US economy in this challenging time, thereby promoting its maximum employment and price stability goals.']
08/27/2021 17:22:27 - INFO - finbert.utils - *** Example ***
08/27/2021 17:22:27 - INFO - finbert.utils - guid: 0
08/27/2021 17:22:27 - INFO - finbert.utils - tokens: [CLS] the federal reserve is committed to using its full range of tools to support the us economy in this challenging time , thereby promoting its maximum employment and price stability goals . [SEP]
08/27/2021 17:22:27 - INFO - finbert.utils - input_ids: 101 1996 2976 3914 2003 5462 2000 2478 2049 2440 2846 1997 5906 2000 2490 1996 2149 4610 1999 2023 10368 2051 1010 8558 7694 2049 4555 6107 1998 3976 9211 3289 1012 102 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
08/27/2021 17:22:27 - INFO - finbert.utils - attention_mask: 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
08/27/2021 17:22:27 - INFO - finbert.utils - token_type_ids: 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
08/27/2021 17:22:27 - INFO - finbert.utils - label: None (id = 9090)
[<finbert.utils.InputFeatures object at 0x7f82e1004d90>]
08/27/2021 17:22:27 - INFO - root - tensor([ 2.0840, -2.0827, -0.8532])
[ 2.0839536 -2.0827212 -0.85315543]
08/27/2021 17:22:27 - ERROR - main - Exception on / [POST]
Traceback (most recent call last):
File "/opt/conda/lib/python3.8/site-packages/flask/app.py", line 2070, in wsgi_app
response = self.full_dispatch_request()
File "/opt/conda/lib/python3.8/site-packages/flask/app.py", line 1515, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/opt/conda/lib/python3.8/site-packages/flask_cors/extension.py", line 165, in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
File "/opt/conda/lib/python3.8/site-packages/flask/app.py", line 1513, in full_dispatch_request
rv = self.dispatch_request()
File "/opt/conda/lib/python3.8/site-packages/flask/app.py", line 1499, in dispatch_request
return self.ensure_sync(self.view_functions[rule.endpoint])(**req.view_args)
File "/src/main.py", line 21, in score
return(predict(text, model).to_json(orient='records'))
File "/src/finbert/finbert.py", line 615, in predict
logits = softmax(np.array(logits))
File "/src/finbert/utils.py", line 215, in softmax
e_x = np.exp(x - np.max(x, axis=1)[:, None])
File "<__array_function__ internals>", line 5, in amax
File "/opt/conda/lib/python3.8/site-packages/numpy/core/fromnumeric.py", line 2705, in amax
return _wrapreduction(a, np.maximum, 'max', axis, None, out,
File "/opt/conda/lib/python3.8/site-packages/numpy/core/fromnumeric.py", line 87, in _wrapreduction
return ufunc.reduce(obj, axis, dtype, out, **passkwargs)
numpy.AxisError: axis 1 is out of bounds for array of dimension 1
08/27/2021 17:22:27 - INFO - werkzeug - 172.17.0.1 - - [27/Aug/2021 17:22:27] "POST / HTTP/1.1" 500 -
I solved this issue by changing the following code in finbert.py.
Before:
with torch.no_grad():
    logits = model(all_input_ids, all_attention_mask, all_token_type_ids)[0]
After:
with torch.no_grad():
    logits = model(all_input_ids, all_attention_mask, all_token_type_ids)
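Removing the `[0]` keeps the logits 2-D (one row per input sentence), which is what the row-wise softmax expects. A quick sketch of the shape difference, using the logits from the log (the `softmax` again mirrors the one in `finbert/utils.py`):

```python
import numpy as np

def softmax(x):
    # Row-wise softmax over a 2-D (batch, classes) array
    e_x = np.exp(x - np.max(x, axis=1)[:, None])
    return e_x / e_x.sum(axis=1)[:, None]

# Without the [0] indexing, the batch dimension survives: shape (1, 3)
logits_2d = np.array([[2.1882, -2.1247, -0.7895]])

probs = softmax(logits_2d)
print(probs.shape)  # (1, 3) -- one probability row per sentence
print(probs.sum())  # each row sums to 1
```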
I would appreciate it if you could check this out.