PaddlePaddle / Paddle

PArallel Distributed Deep LEarning: Machine Learning Framework from Industrial Practice (the core framework of PaddlePaddle 『飞桨』: high-performance single-machine and distributed training and cross-platform deployment for deep learning & machine learning)
http://www.paddlepaddle.org/
Apache License 2.0

jieba segmentation in paddle mode raises an error when the text data volume is large #44082

Closed corleytd closed 1 year ago

corleytd commented 2 years ago

Describe the Bug

The jieba segmentation library can use paddle for deep-learning-based word segmentation. It works fine when the amount of data is small, but raises an error once the amount of data becomes large.
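For context, the failing usage pattern is roughly the following (a minimal sketch; the corpus file name and the line-by-line loop are illustrative assumptions, not the original script), after which the traceback below is raised:

import paddle
import jieba

paddle.enable_static()
jieba.enable_paddle()  # switch jieba to the paddle (LAC) segmentation mode

# Illustrative only: segment a large corpus line by line.
with open("corpus.txt", encoding="utf-8") as f:
    for line in f:
        words = list(jieba.cut(line.strip(), use_paddle=True))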

Traceback (most recent call last):
  File "C:/Users/LENOVO/Desktop/Study/自我学习/AILearning/NLP大厂实训班-深度之眼/codes/p1_chinese_word_segmentation/sequence_labeling_participle/data_process.py", line 90, in <module>
    gen_train_data(raw_data)
  File "C:/Users/LENOVO/Desktop/Study/自我学习/AILearning/NLP大厂实训班-深度之眼/codes/p1_chinese_word_segmentation/sequence_labeling_participle/data_process.py", line 69, in gen_train_data
    for word in words:
  File "E:\Miniconda3\envs\nlpprojectbase\lib\site-packages\jieba\__init__.py", line 306, in cut
    results = predict.get_sent(sentence)
  File "E:\Miniconda3\envs\nlpprojectbase\lib\site-packages\jieba\lac_small\predict.py", line 57, in get_sent
    words, crf_decode = exe.run(
  File "E:\Miniconda3\envs\nlpprojectbase\lib\site-packages\paddle\fluid\executor.py", line 1299, in run
    six.reraise(*sys.exc_info())
  File "E:\Miniconda3\envs\nlpprojectbase\lib\site-packages\six.py", line 719, in reraise
    raise value
  File "E:\Miniconda3\envs\nlpprojectbase\lib\site-packages\paddle\fluid\executor.py", line 1285, in run
    res = self._run_impl(
  File "E:\Miniconda3\envs\nlpprojectbase\lib\site-packages\paddle\fluid\executor.py", line 1464, in _run_impl
    return new_exe.run(list(feed.keys()), fetch_list, return_numpy)
  File "E:\Miniconda3\envs\nlpprojectbase\lib\site-packages\paddle\fluid\executor.py", line 547, in run
    tensors = self._new_exe.run(feed_names, fetch_list)._move_to_list()
RuntimeError: In user code:

    File "C:/Users/LENOVO/Desktop/Study/自我学习/AILearning/NLP大厂实训班-深度之眼/codes/p1_chinese_word_segmentation/sequence_labeling_participle/data_process.py", line 18, in <module>
      jieba.enable_paddle()  # 启用PaddlePaddle的深度学习分词
    File "E:\Miniconda3\envs\nlpprojectbase\lib\site-packages\jieba\_compat.py", line 46, in enable_paddle
      import jieba.lac_small.predict as predict
    File "<frozen importlib._bootstrap>", line 991, in _find_and_load

    File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked

    File "<frozen importlib._bootstrap>", line 671, in _load_unlocked

    File "<frozen importlib._bootstrap_external>", line 843, in exec_module

    File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed

    File "E:\Miniconda3\envs\nlpprojectbase\lib\site-packages\jieba\lac_small\predict.py", line 43, in <module>
      infer_ret = creator.create_model(dataset.vocab_size, dataset.num_labels, mode='infer')
    File "E:\Miniconda3\envs\nlpprojectbase\lib\site-packages\jieba\lac_small\creator.py", line 38, in create_model
      crf_decode = nets.lex_net(
    File "E:\Miniconda3\envs\nlpprojectbase\lib\site-packages\jieba\lac_small\nets.py", line 122, in lex_net
      return _net_conf(word)
    File "E:\Miniconda3\envs\nlpprojectbase\lib\site-packages\jieba\lac_small\nets.py", line 103, in _net_conf
      bigru_output = _bigru_layer(input_feature)
    File "E:\Miniconda3\envs\nlpprojectbase\lib\site-packages\jieba\lac_small\nets.py", line 65, in _bigru_layer
      pre_gru_r = fluid.layers.fc(
    File "E:\Miniconda3\envs\nlpprojectbase\lib\site-packages\paddle\fluid\layers\nn.py", line 370, in fc
      pre_activation = helper.append_bias_op(pre_bias, dim_start=num_flatten_dims)
    File "E:\Miniconda3\envs\nlpprojectbase\lib\site-packages\paddle\fluid\layer_helper.py", line 131, in append_bias_op
      self.append_op(
    File "E:\Miniconda3\envs\nlpprojectbase\lib\site-packages\paddle\fluid\layer_helper.py", line 44, in append_op
      return self.main_program.current_block().append_op(*args, **kwargs)
    File "E:\Miniconda3\envs\nlpprojectbase\lib\site-packages\paddle\fluid\framework.py", line 3615, in append_op
      op = Operator(
    File "E:\Miniconda3\envs\nlpprojectbase\lib\site-packages\paddle\fluid\framework.py", line 2635, in __init__
      for frame in traceback.extract_stack():

    PreconditionNotMetError: The meta data must be valid when call the mutable data function. (at ..\paddle\phi\core\dense_tensor.cc:105)
      [operator < elementwise_add > error]

Is this an environment problem, or something else?

Additional Supplementary Information

No response

paddle-bot-old[bot] commented 2 years ago

Hi! We've received your issue and will arrange technicians to answer your questions as soon as possible; please be patient. Please double-check that you have provided a clear problem description, reproduction code, environment & version, and error messages. You may also check the official website's API documentation, FAQ, historical Issues, and the AI community to look for an answer. Have a nice day!

Aganlengzi commented 2 years ago

It works fine when the amount of data is small, but raises an error once the amount of data becomes large

Thanks for the feedback. Based on the symptom, we suggest checking whether the data loads correctly with the larger data volume, before the Paddle API is called.
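One possible pre-check along these lines (a sketch only; the `lines` argument and the filtering rules are assumptions, not an official recommendation):

# Sketch: validate the raw lines before they reach jieba / Paddle.
def clean_lines(lines):
    cleaned = []
    for i, line in enumerate(lines):
        if not isinstance(line, str):
            print(f"line {i}: not a string ({type(line)!r}), skipped")
            continue
        line = line.strip()
        if not line:
            print(f"line {i}: empty after stripping, skipped")
            continue
        cleaned.append(line)
    return cleaned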

corleytd commented 2 years ago

It has nothing to do with data loading, because the problem only appears when paddle is used inside jieba. The code is as follows:

import paddle
import jieba

paddle.enable_static()
jieba.enable_paddle()  # enable PaddlePaddle deep-learning segmentation
words = jieba.cut(sent, use_paddle=True)  # sent is the text to segment

Could you look at the detailed error message and see what the problem is?
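One way to narrow it down is to wrap the call and record which input actually fails (a sketch; the file name and the loop are assumptions added for illustration):

import paddle
import jieba

paddle.enable_static()
jieba.enable_paddle()

# Illustrative: find the first input that triggers the error.
with open("corpus.txt", encoding="utf-8") as f:
    for i, sent in enumerate(f):
        try:
            words = list(jieba.cut(sent.strip(), use_paddle=True))
        except RuntimeError as err:
            print(f"failed at line {i} (length {len(sent)}): {sent[:50]!r}")
            print(err)
            break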

Aganlengzi commented 2 years ago

The final error says the Tensor's meta information (dtype, layout, etc.) is invalid. Since, by your description, the only difference is the size of the data, we suggest checking the data right before the API is called. Also, which Paddle version does your jieba install correspond to? Is it compatible, and could you try the latest Paddle release? (The line numbers in the traceback no longer match the latest code.)
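For reference, the installed versions can be printed like this when reporting back (a short sketch; run it in the same environment that produced the error):

import jieba
import paddle

print("jieba version:", jieba.__version__)
print("paddle version:", paddle.__version__)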

paddle-bot[bot] commented 1 year ago

Since you haven't replied for more than a year, we have closed this issue/PR. If the problem is not solved or there is a follow-up one, please reopen it at any time and we will continue to follow up.