sun1638650145 / Libraries-and-Extensions-for-TensorFlow-for-Apple-Silicon

This repo provides TensorFlow libraries and extensions that require compilation to build, together with extended build tutorials and pre-compiled wheel files.

M1 Mac RASA not working with these #9

Closed sbilgil closed 2 years ago

sbilgil commented 2 years ago

I am using RASA for my chatbot. It worked on an Intel machine, but on my M1 Mac it does not. I searched the internet a lot and followed guides to install TensorFlow and NumPy, yet my environment still could not train my data. Then I found this page, followed the instructions, and managed (with some difficulty) to create my environment. But when I press the train button I get an error again. It looks like a version mismatch between these packages:

2022-10-04 12:39:55 ERROR softtechnlp.server - Traceback (most recent call last):
  File "/Users/sadikalperbilgil/miniforge3/envs/envNLPV2arm2/lib/python3.9/site-packages/softtechnlp/server.py", line 1062, in train
    training_result = await train_async(training_payload)
  File "/Users/sadikalperbilgil/miniforge3/envs/envNLPV2arm2/lib/python3.9/site-packages/softtechnlp/train.py", line 169, in train_async
    return await _train_async_internal(
  File "/Users/sadikalperbilgil/miniforge3/envs/envNLPV2arm2/lib/python3.9/site-packages/softtechnlp/train.py", line 361, in _train_async_internal
    await _do_training(
  File "/Users/sadikalperbilgil/miniforge3/envs/envNLPV2arm2/lib/python3.9/site-packages/softtechnlp/train.py", line 407, in _do_training
    model_path = await _train_nlu_with_validated_data(
  File "/Users/sadikalperbilgil/miniforge3/envs/envNLPV2arm2/lib/python3.9/site-packages/softtechnlp/train.py", line 841, in _train_nlu_with_validated_data
    await softtechnlp.nlu.train(
  File "/Users/sadikalperbilgil/miniforge3/envs/envNLPV2arm2/lib/python3.9/site-packages/softtechnlp/nlu/train.py", line 116, in train
    interpreter = trainer.train(training_data, kwargs)
  File "/Users/sadikalperbilgil/miniforge3/envs/envNLPV2arm2/lib/python3.9/site-packages/softtechnlp/nlu/model.py", line 210, in train
    updates = component.train(working_data, self.config, context)
  File "/Users/sadikalperbilgil/miniforge3/envs/envNLPV2arm2/lib/python3.9/site-packages/softtechnlp/sf/nlu/model.py", line 342, in train
    super().train(training_data, config, kwargs)
  File "/Users/sadikalperbilgil/miniforge3/envs/envNLPV2arm2/lib/python3.9/site-packages/softtechnlp/nlu/classifiers/diet_classifier.py", line 832, in train
    self.model.fit(
  File "/Users/sadikalperbilgil/miniforge3/envs/envNLPV2arm2/lib/python3.9/site-packages/softtechnlp/utils/tensorflow/models.py", line 224, in fit
    ) = self._get_tf_train_functions(eager, model_data, batch_strategy)
  File "/Users/sadikalperbilgil/miniforge3/envs/envNLPV2arm2/lib/python3.9/site-packages/softtechnlp/utils/tensorflow/models.py", line 489, in _get_tf_train_functions
    self._get_tf_call_model_function(
  File "/Users/sadikalperbilgil/miniforge3/envs/envNLPV2arm2/lib/python3.9/site-packages/softtechnlp/utils/tensorflow/models.py", line 472, in _get_tf_call_model_function
    tf_call_model_function(next(iter(init_dataset)))
  File "/Users/sadikalperbilgil/miniforge3/envs/envNLPV2arm2/lib/python3.9/site-packages/tensorflow/python/util/traceback_utils.py", line 153, in error_handler
    raise e.with_traceback(filtered_tb) from None
  [... AutoGraph-generated frames in /var/folders/.../T/__autograph_generated_*.py omitted; the filtered "in user code" traceback below shows the corresponding source lines ...]
  File "/Users/sadikalperbilgil/miniforge3/envs/envNLPV2arm2/lib/python3.9/site-packages/keras/utils/traceback_utils.py", line 67, in error_handler
    raise e.with_traceback(filtered_tb) from None
AttributeError: in user code:

File "/Users/sadikalperbilgil/miniforge3/envs/envNLPV2arm2/lib/python3.9/site-packages/softtechnlp/utils/tensorflow/models.py", line 298, in train_on_batch  *
    prediction_loss = self.batch_loss(batch_in)
File "/Users/sadikalperbilgil/miniforge3/envs/envNLPV2arm2/lib/python3.9/site-packages/softtechnlp/nlu/classifiers/diet_classifier.py", line 1448, in batch_loss  *
    (
File "/Users/sadikalperbilgil/miniforge3/envs/envNLPV2arm2/lib/python3.9/site-packages/softtechnlp/utils/tensorflow/models.py", line 1066, in _create_sequence  *
    inputs = self._combine_sequence_sentence_features(
File "/Users/sadikalperbilgil/miniforge3/envs/envNLPV2arm2/lib/python3.9/site-packages/softtechnlp/utils/tensorflow/models.py", line 960, in _combine_sequence_sentence_features  *
    sequence_x = self._combine_sparse_dense_features(
File "/Users/sadikalperbilgil/miniforge3/envs/envNLPV2arm2/lib/python3.9/site-packages/softtechnlp/utils/tensorflow/models.py", line 928, in _combine_sparse_dense_features  *
    _f = self._tf_layers[f"sparse_input_dropout.{name}"](
File "/Users/sadikalperbilgil/miniforge3/envs/envNLPV2arm2/lib/python3.9/site-packages/keras/utils/traceback_utils.py", line 67, in error_handler  **
    raise e.with_traceback(filtered_tb) from None
File "/var/folders/3r/lscxbw557b37671_q_xn6d300000gp/T/__autograph_generated_fileqluryb_j.py", line 67, in tf__call
    outputs = ag__.converted_call(ag__.ld(tf_utils).smart_cond, (ag__.ld(training), ag__.ld(dropped_inputs), ag__.autograph_artifact(lambda : ag__.converted_call(ag__.ld(tf).identity, (ag__.ld(inputs),), None, fscope))), None, fscope)

AttributeError: Exception encountered when calling layer "sparse_dropout_1" (type SparseDropout).

in user code:

    File "/Users/sadikalperbilgil/miniforge3/envs/envNLPV2arm2/lib/python3.9/site-packages/softtechnlp/utils/tensorflow/layers.py", line 64, in call  *
        outputs = tf_utils.smart_cond(

    AttributeError: module 'tensorflow.python.keras.utils.tf_utils' has no attribute 'smart_cond'

Call arguments received by layer "sparse_dropout_1" (type SparseDropout):
  • inputs=<tensorflow.python.framework.sparse_tensor.SparseTensor object at 0x2c2d6d640>
  • training=True

2022-10-04 12:39:55 ERROR softtechnlp.server - An unexpected error occurred during training. Error: in user code:

    [... the same "in user code" traceback and SparseDropout call arguments as above, repeated verbatim ...]
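
The bottom of both log entries is the same failure: the SparseDropout layer calls tf_utils.smart_cond through the legacy tensorflow.python.keras.utils.tf_utils module, and in the TensorFlow build installed in this environment that module no longer exposes smart_cond. A minimal diagnostic sketch to confirm what the environment actually provides (assuming it is run inside the same conda env, envNLPV2arm2):

```python
# Quick diagnostic: print the installed TensorFlow/Keras versions and check
# whether the legacy attribute used by the SparseDropout layer still exists.
import tensorflow as tf

print("tensorflow:", tf.__version__)

try:
    import keras
    print("keras:", keras.__version__)
except ImportError:
    print("keras: standalone package not installed")

try:
    from tensorflow.python.keras.utils import tf_utils  # legacy path from the traceback
    print("tf_utils.smart_cond present:", hasattr(tf_utils, "smart_cond"))
except ImportError:
    print("tensorflow.python.keras.utils.tf_utils is not importable at all")
```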

My pyproject.toml for the project looks like this:

[build-system] requires = [ "poetry-core>=1.0.0",] build-backend = "poetry.core.masonry.api"

[tool.black] line-length = 88 target-version = [ "py36", "py37", "py38","py39"] exclude = "((.eggs | .git | .pytest_cache | build | dist))"

[tool.poetry] name = "softtechnlp" version = "2.3.3.2.dev" description = "Open source machine learning framework to automate text- and voice-based conversations: NLU, dialogue management, connect to Slack, Facebook, and more - Create chatbots and voice assistants" authors = [ "",] maintainers = [ "",] homepage = "" repository = "" documentation = "" classifiers = [ "Development Status :: 4 - Beta", "Intended Audience :: Developers", "License :: OSI Approved :: Apache Software License", "Programming Language :: Python", "Programming Language :: Python :: 3.6", "Programming Language :: Python :: 3.7", "Programming Language :: Python :: 3.8", "Topic :: Software Development :: Libraries",] keywords = [ "nlp", "machine-learning", "machine-learning-library", "bot", "bots", "botkit", "conversational-agents", "conversational-ai", "chatbot", "chatbot-framework", "bot-framework",] include = [ "LICENSE.txt", "README.md",] readme = "README.md" license = "Apache-2.0"

[tool.towncrier] package = "softtechnlp" package_dir = "softtechnlp" filename = "CHANGELOG.mdx" directory = "./changelog" underlines = " " title_format = "## [{version}] - {project_date}" template = "./changelog/_template.md.jinja2" start_string = "\n" issue_format = ""

[[tool.towncrier.type]] directory = "removal" name = "Deprecations and Removals" showcontent = true

[[tool.towncrier.type]] directory = "feature" name = "Features" showcontent = true

[[tool.towncrier.type]] directory = "improvement" name = "Improvements" showcontent = true

[[tool.towncrier.type]] directory = "bugfix" name = "Bugfixes" showcontent = true

[[tool.towncrier.type]] directory = "doc" name = "Improved Documentation" showcontent = true

[[tool.towncrier.type]] directory = "misc" name = "Miscellaneous internal changes" showcontent = false

[tool.poetry.dependencies] python = ">=3.6,<3.10" boto3 = "^1.12" requests = "^2.23" requests_futures = "^1.0.0" fuzzy_matcher = "^0.1.0" fuzzywuzzy = "0.18.0" sgqlc = "^14.1" pypred = { git = "https://git@github.com/dialoguemd/pypred.git", rev = "7e30c9078e8a34a4ba3ecf96c6ea826173b25063" } matplotlib = ">=3.1,<3.4" attrs = ">=19.3,<20.4" jsonpickle = ">=1.3,<1.6" redis = "^3.4" numpy = [{version = ">=1.23", markers = "sys_platform!='darwin'"},{version = "=1.19.5", markers = "sys_platform=='darwin'"}] scipy = "^1.4.1" absl-py = ">=0.9,<0.12" apscheduler = ">=3.6,<3.8" tqdm = ">=4.31,<4.57" networkx = ">=2.4,<2.6" fbmessenger = "~6.0.0" pykwalify = ">=1.7,<1.9" coloredlogs = ">=10,<15" "ruamel.yaml" = "^0.16.5" scikit-learn = { version = ">=0.22,<0.25", markers="platform_machine != 'arm64'"} slackclient = "^2.0.0" twilio = ">=6.26,<6.51" webexteamssdk = ">=1.1.1,<1.7.0" mattermostwrapper = "~2.2" rocketchat_API = ">=0.6.31,<1.10.0" colorhash = "~1.0.2" jsonschema = "~3.2" packaging = ">=20.0,<21.0" pytz = ">=2019.1,<2021.0" softtechnlp-sdk = "^2.3.1" colorclass = "~2.2" terminaltables = "~3.1.0" sanic = ">=19.12.2,<21.0.0" sanic-cors = "~0.10.0b1" sanic-jwt = ">=1.3.2,<2.0" cloudpickle = ">=1.2,<1.7" multidict = "^4.6" aiohttp = "~3.6" questionary = "~1.5.1" prompt-toolkit = "^2.0" python-socketio = ">=5,<6" python-engineio = ">=4,<5" pydot = "~1.4" async_generator = "~1.10" SQLAlchemy = "~1.3.3" sklearn-crfsuite = "~0.3" psycopg2-binary = "~2.8.2" python-dateutil = "~2.8" tensorflow = { version = "~2.8.2", markers="platform_machine != 'arm64'"} tensorflow-text = [{ version = "~2.8.0", markers = "sys_platform!='win32' and sys_platform!='darwin'"}] tensorflow_hub = [{ version = "~2.8.0", markers = "sys_platform!='win32' and sys_platform!='darwin'"}] tensorflow-addons = [{version = "~0.10", markers="sys_platform!='darwin'"},] tensorflow-estimator = [{version = "~2.6", markers="sys_platform!='darwin'"},] tensorflow-probability = [{version = "~0.11", markers="sys_platform!='darwin'"},] setuptools = ">=41.0.0" kafka-python = ">=1.4,<3.0" ujson = ">=1.35,<5.0" oauth2client = "4.1.3" regex = ">=2020.6,<2020.10" joblib = "^0.15.1" sentry-sdk = ">=0.17.0,<0.20.0" aio-pika = "^6.7.1" pyTelegramBotAPI = "^3.7.3" prometheus-client = "^0.8.0" instana = "^1.37.4" python-dotenv = "^0.20.0" fasttext = "^0.9.2" spacymoji = "2.0.0" spacy = { version = "2.3.0", markers="sys_platform!='darwin'"} grpcio= ">=1.45.0"

[tool.poetry.dev-dependencies] pytest-cov = "^2.10.0" pytest-localserver = "^0.5.0" pytest-sanic = "^1.6.1" pytest-asyncio = "^0.10.0" pytest-xdist = "^1.32.0" pytest = "^5.3.4" freezegun = "^1.0.0" responses = "^0.12.1" aioresponses = "^0.6.2" moto = "~=1.3.16" fakeredis = "^1.4.0" mongomock = "^3.18.0" black = "^19.10b0" flake8 = "^3.8.3" flake8-docstrings = "^1.5.0" google-cloud-storage = "^1.29.0" azure-storage-blob = "<12.6.0" coveralls = "^2.0.0" towncrier = "^19.2.0" toml = "^0.10.0" pep440-version-utils = "^0.3.0" pydoc-markdown = "^3.5.0" pytest-timeout = "^1.4.2" mypy = "^0.790" bandit = "^1.6.3"

[tool.poetry.extras] jieba = [ "jieba",] transformers = [ "transformers",] full = [ "transformers", "jieba",] gh-release-notes = [ "github3.py",]

[tool.poetry.scripts] softtechnlp = "softtechnlp.main:main"

[tool.poetry.dependencies.PyJWT] version = "^2.0.0" extras = [ "crypto",]

[tool.poetry.dependencies.colorama] version = "^0.4.4" markers = "sys_platform == 'win32'"

[tool.poetry.dependencies."github3.py"] version = "~1.3.0" optional = true

[tool.poetry.dependencies.transformers] version = ">=2.4,<2.12" optional = true

[tool.poetry.dependencies.jieba] version = ">=0.39, <0.43" optional = true

[tool.poetry.dependencies.pymongo] version = ">=3.8,<3.11" extras = [ "tls", "srv",]
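
Note the platform markers in [tool.poetry.dependencies]: on an M1 (darwin/arm64) Poetry skips tensorflow, tensorflow-text, tensorflow_hub, tensorflow-addons, tensorflow-estimator and tensorflow-probability entirely, so the versions that actually end up in the environment are whatever was installed by hand (see the next paragraph). A small standard-library sketch to list them; the Apple Silicon package names (tensorflow-macos, tensorflow-metal) are assumptions based on that manual install:

```python
# List which TensorFlow-related distributions are installed in the environment
# and at what version; "not installed" entries show where the markers left gaps.
from importlib import metadata

candidates = [
    "tensorflow", "tensorflow-macos", "tensorflow-metal",
    "tensorflow-text", "tensorflow-hub", "tensorflow-addons",
    "tensorflow-estimator", "tensorflow-probability",
    "keras", "numpy",
]

for name in candidates:
    try:
        print(f"{name}=={metadata.version(name)}")
    except metadata.PackageNotFoundError:
        print(f"{name}: not installed")
```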

As you can see, I installed tensorflow-macos, tensorflow-metal, tensorflow-addons, and tensorflow-text manually, following the instructions on this page. Now I am stuck.
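
One unverified stopgap, since the error is a missing attribute rather than a missing module, would be to re-expose smart_cond under the legacy name before training starts. This sketch assumes tf.__internal__.smart_cond.smart_cond is available in the installed TensorFlow build; the cleaner route is still to match the manually installed TensorFlow version to the ~2.8 pin shown in the pyproject above.

```python
# Unverified compatibility shim: only patches the missing attribute, and
# assumes the rest of the legacy keras utils still behave as the trainer
# expects. Run (or import) this before kicking off training.
import tensorflow as tf
from tensorflow.python.keras.utils import tf_utils  # legacy module used in layers.py

if not hasattr(tf_utils, "smart_cond"):
    # Point the old name at TensorFlow's current smart_cond implementation.
    tf_utils.smart_cond = tf.__internal__.smart_cond.smart_cond
```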

Do you have any insight?

sun1638650145 commented 2 years ago

This question does not seem to be related to building the whl files. This repo only covers how to build the whl; it does not discuss other bugs or usage issues.