SunYanCN / bert-text

BERT + TF Keras for Chinese NLP Tasks

installation issue #1

Closed emanueledirosa closed 5 years ago

emanueledirosa commented 5 years ago

Hi! Very interesting project, but I could not test it, even on Colab. When I run "from bert_text import run_on_dfs" I get the following error: ModuleNotFoundError: No module named 'bert_text', even though I installed the package.

If you just run the Colab notebook here, you will reproduce the issue: https://github.com/wshuyi/demo-text-binary-classification-with-bert/blob/master/bert_text_classification.ipynb

Thank you
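
For reference, a minimal diagnostic sketch for checking what the installed distribution actually ships (assumptions: Python 3.8+ for importlib.metadata, and that the pip distribution is named "bert-text"); if no bert_text entry shows up in the file list, the ModuleNotFoundError above is expected:

# Inspect the installed "bert-text" distribution (assumption: that is the pip name)
from importlib.metadata import distribution, PackageNotFoundError

try:
    dist = distribution("bert-text")
    print("version:", dist.version)
    for path in dist.files or []:
        print(path)  # a bert_text/... entry should appear if the module is shipped
except PackageNotFoundError:
    print("bert-text is not installed in this environment")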

irakli97 commented 5 years ago

I have the same issue.

ghost commented 5 years ago

Same issue here.

dangmanhtruong1995 commented 5 years ago

Same issue when running on Google Colab, even when I tried git clone followed by python setup.py install.

callzhang commented 5 years ago

same issue

athakur5 commented 5 years ago

Same issue here. I tried installing the module again, that didn't work either.

SunYanCN commented 5 years ago

@emanueledirosa @irakli97 @evkoskib @dangmanhtruong1995 @callzhang @athakur5 The original bert-text package has been deleted and is no longer maintained; the current bert-text repository serves a different purpose. If you want to work on BERT text tasks, I recommend this code: https://kashgari.bmio.net/tutorial/text-classification/

import kashgari
from kashgari.corpus import SMP2018ECDTCorpus
from kashgari.tasks.classification import BiLSTM_Model
from kashgari.embeddings import BERTEmbedding
from kashgari.callbacks import EvalCallBack
from tensorflow.python import keras
from kashgari import utils

# Set to True to use the cuDNN-accelerated cell on GPU, False otherwise
kashgari.config.use_cudnn_cell = True

# dataset
train_x, train_y = SMP2018ECDTCorpus.load_data('train')
valid_x, valid_y = SMP2018ECDTCorpus.load_data('valid')
test_x, test_y = SMP2018ECDTCorpus.load_data('test')

# Path to the pre-trained BERT model folder (replace with your own location)
bert_embed = BERTEmbedding('/home/new/Toxicity/bert_model/models/chinese_L-12_H-768_A-12',
                           task=kashgari.CLASSIFICATION,
                           sequence_length=100)

model = BiLSTM_Model(bert_embed)
tf_board_callback = keras.callbacks.TensorBoard(log_dir='./logs', update_freq=1000)

eval_callback = EvalCallBack(kash_model=model,
                             valid_x=valid_x,
                             valid_y=valid_y,
                             step=5)

model.fit(train_x,
          train_y,
          valid_x,
          valid_y,
          batch_size=100,
          callbacks=[eval_callback, tf_board_callback])

model.evaluate(test_x, test_y)

# save model to `saved_classification_model` 
model.save('saved_classification_model')

# load model
loaded_model = kashgari.utils.load_model('saved_classification_model')

# predict
loaded_model.predict(test_x[:10])

# Export as a TensorFlow SavedModel (e.g. for serving)
utils.convert_to_saved_model(model, 
                             model_path='saved_model/blstm', 
                             version=1)
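
As a follow-up usage sketch (assumptions: Kashgari's predict expects pre-tokenized input, so the Chinese text is split into characters here to match the SMP2018ECDT corpus format, and the sample sentences are made up):

# Predict on new, character-tokenized sentences with the reloaded model
new_samples = [list("今天天气怎么样"), list("帮我订一张去北京的机票")]
print(loaded_model.predict(new_samples))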

kevinwu23 commented 5 years ago

Any chance you might be able to point us to a commit for the old functional code? I would be happy to maintain the code.
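
For reference, a minimal sketch of how such a commit could be pinned once it is identified (assumptions: the repository URL matches the header above, and <COMMIT_SHA> is a placeholder for that commit):

# Install bert-text from a specific GitHub commit (<COMMIT_SHA> is a placeholder)
import subprocess, sys

subprocess.check_call([
    sys.executable, "-m", "pip", "install",
    "git+https://github.com/SunYanCN/bert-text.git@<COMMIT_SHA>",
])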

harshitsinghai77 commented 4 years ago

Same issue

getcontrol commented 4 years ago

+1 same issue

SIG75 commented 4 years ago

+1

Abhinavk910 commented 4 years ago

+1