Open jtresko opened 11 months ago
Hey @jtresko I was wondering if you ever found a fix for this? I'm just trying to get the migration up and running, and I'm getting this as well. Is there a known workaround? Thanks in advance :-)
No, I've sorta abandoned this one; I was just curious about the capability. A critical piece seems to be making sure Tree-sitter has grammar definitions for both the source and target languages, though I'm not sure whether that's where this error came from.
I believe it worked when I tested something like Python to Node.js.
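If you want to sanity-check that quickly, here's a minimal sketch assuming the `tree-sitter-languages` package (`pip install tree-sitter-languages`). To be clear, this package and the `has_grammar` helper are my assumption for illustration; gpt-migrate may resolve grammars some other way:

```python
# Minimal grammar-availability check, assuming the tree-sitter-languages
# package. gpt-migrate may resolve grammars differently; this is only an
# illustration of verifying both languages have a Tree-sitter definition.
from tree_sitter_languages import get_parser


def has_grammar(lang: str) -> bool:
    """Return True if a bundled Tree-sitter grammar exists for `lang`."""
    try:
        get_parser(lang)
        return True
    except Exception:  # exact exception type varies by package version
        return False


for lang in ("python", "go"):
    print(f"{lang}: {'ok' if has_grammar(lang) else 'missing grammar'}")
```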
Also getting this error. This repo seems useless now. @joshpxyne FYI
Yeah, I've got the same issue trying to run the ollama/llama3 model on Windows 10 with Python 3.11.
│ I:\nasty\Python_Projects\LLM\gpt_migrate\gpt_migrate\main.py:100 in main │
│ │
│ 97 │ │ │ file_name = write_migration(sourcefile, external_deps_list, target_deps_per_ │
│ 98 │ │ │ target_deps_per_file[parent_file].append(file_name) │
│ 99 │ │ │
│ > 100 │ │ migrate(sourceentry, globals) │
│ 101 │ │ add_env_files(globals) │
│ 102 │ │
│ 103 │ ''' 3. Testing ''' │
│ │
│ ┌─────────────────────────────────────────── locals ───────────────────────────────────────────┐ │
│ │ ai = <ai.AI object at 0x00000210E5A7CD10> │ │
│ │ detected_language = None │ │
│ │ globals = <__main__.Globals object at 0x00000210E5976BD0> │ │
│ │ guidelines = '' │ │
│ │ migrate = <function main.<locals>.migrate at 0x00000210E3311BC0> │ │
│ │ model = 'ollama/llama3' │ │
│ │ operating_system = 'linux' │ │
│ │ source_directory_structure = ' ├── .github/\n │ ├── ISSUE_TEMPLATE/\n │ │
│ │ │ │ '+2224752 │ │
│ │                     sourcedir = 'C:\\Users\\chalu\\Desktop\\myapp-17.0'                     │ │
│ │ sourceentry = 'C:\\Users\\chalu\\Desktop\\myapp-17.0\\myapp\\__main__.py' │ │
│ │ sourcelang = 'python' │ │
│ │ sourceport = None │ │
│ │ step = 'all' │ │
│ │ target_deps_per_file = defaultdict(<class 'list'>, {}) │ │
│ │ targetdir = 'C:\\Users\\chalu\\Desktop\\myappGolang' │ │
│ │ targetlang = 'golang' │ │
│ │ targetport = 8080 │ │
│ │ temperature = 0.0 │ │
│ │ testfiles = 'app.py' │ │
│ └──────────────────────────────────────────────────────────────────────────────────────────────┘ │
│ │
│ I:\nasty\Python_Projects\LLM\gpt_migrate\gpt_migrate\main.py:94 in migrate │
│ │
│ 91 │ │ target_deps_per_file = defaultdict(list) │
│ 92 │ │ def migrate(sourcefile, globals, parent_file=None): │
│ 93 │ │ │ # recursively work through each of the files in the source directory, starti │
│ > 94 │ │ │ internal_deps_list, external_deps_list = get_dependencies(sourcefile=sourcef │
│ 95 │ │ │ for dependency in internal_deps_list: │
│ 96 │ │ │ │ migrate(dependency, globals, parent_file=sourcefile) │
│ 97 │ │ │ file_name = write_migration(sourcefile, external_deps_list, target_deps_per_ │
│ │
│ ┌───────────────────────────────────── locals ─────────────────────────────────────┐ │
│ │ globals = <__main__.Globals object at 0x00000210E5976BD0> │ │
│ │ migrate = <function main.<locals>.migrate at 0x00000210E3311BC0> │ │
│ │ parent_file = None │ │
│ │ sourcefile = 'C:\\Users\\chalu\\Desktop\\myapp-17.0\\myapp\\__main__.py' │ │
│ │ target_deps_per_file = defaultdict(<class 'list'>, {}) │ │
│ └──────────────────────────────────────────────────────────────────────────────────┘ │
│ │
│ I:\nasty\Python_Projects\LLM\gpt_migrate\gpt_migrate\steps\migrate.py:58 in get_dependencies │
│ │
│ 55 │ │ │ │ │ │ │ │ │ │ │ │ │ sourcelang=globals.sourcelang, │
│ 56 │ │ │ │ │ │ │ │ │ │ │ │ │ sourcefile_content=sourcefile_conten │
│ 57 │ │
│ > 58 │ external_dependencies = llm_run(prompt, │
│ 59 │ │ │ │ │ │ │ waiting_message=f"Identifying external dependencies for {sou │
│ 60 │ │ │ │ │ │ │ success_message=None, │
│ 61 │ │ │ │ │ │ │ globals=globals) │
│ │
│ ┌─────────────────────────────────────────── locals ───────────────────────────────────────────┐ │
│ │ external_deps_prompt_template = 'The following prompt is a composition of prompt sections, │ │
│ │ each with different pr'+1799 │ │
│ │ file = <_io.TextIOWrapper │ │
│ │ name='C:\\Users\\chalu\\Desktop\\myapp-17.0\\myapp\\__main__.… │ │
│ │ mode='r' encoding='cp1252'> │ │
│ │ globals = <__main__.Globals object at 0x00000210E5976BD0> │ │
│ │ internal_deps_prompt_template = 'The following prompt is a composition of prompt sections, │ │
│ │ each with different pr'+2207 │ │
│ │ prompt = 'The following prompt is a composition of prompt sections, │ │
│ │ each with different pr'+1775 │ │
│ │ sourcefile = 'C:\\Users\\chalu\\Desktop\\myapp-17.0\\myapp\\__main__.py' │ │
│ │ sourcefile_content = 'from .cli.command import main\n\nmain()\n' │ │
│ └──────────────────────────────────────────────────────────────────────────────────────────────┘ │
│ │
│ I:\nasty\Python_Projects\LLM\gpt_migrate\gpt_migrate\utils.py:39 in llm_run │
│ │
│ 36 │ │
│ 37 │ output = "" │
│ 38 │ with yaspin(text=waiting_message, spinner="dots") as spinner: │
│ > 39 │ │ output = globals.ai.run(prompt) │
│ 40 │ │ spinner.ok("✅ ") │
│ 41 │ │
│ 42 │ if success_message: │
│ │
│ ┌─────────────────────────────────────────── locals ───────────────────────────────────────────┐ │
│ │ globals = <__main__.Globals object at 0x00000210E5976BD0> │ │
│ │ output = '' │ │
│ │ prompt = 'The following prompt is a composition of prompt sections, each with │ │
│ │ different pr'+1775 │ │
│ │ spinner = <Yaspin frames=⠋⠙⠹⠸⠼⠴⠦⠧⠇⠏> │ │
│ │ success_message = None │ │
│ │ waiting_message = 'Identifying external dependencies for │ │
│ │ C:\\Users\\chalu\\Desktop\\myapp-17.0\\myapp\\__ma'+10 │ │
│ └──────────────────────────────────────────────────────────────────────────────────────────────┘ │
│ │
│ I:\nasty\Python_Projects\LLM\gpt_migrate\gpt_migrate\ai.py:49 in run │
│ │
│ 46 │ │ for chunk in response: │
│ 47 │ │ │ delta = chunk["choices"][0]["delta"] │
│ 48 │ │ │ msg = delta.get("content", "") │
│ > 49 │ │ │ chat += msg │
│ 50 │ │ return chat │
│ 51 │
│ 52 │
│ │
│ ┌─────────────────────────────────────────── locals ───────────────────────────────────────────┐ │
│ │ chat = 'logrus,github.com/spf13/cobra,github.com/google/go-cmp/matchers,github.com/goog… │ │
│ │ chunk = ModelResponse( │ │
│ │ │ id='chatcmpl-0022f59d-3cf6-4202-8205-3dfbe33133a1', │ │
│ │ │ choices=[ │ │
│ │ │ │ StreamingChoices( │ │
│ │ │ │ │ finish_reason='stop', │ │
│ │ │ │ │ index=0, │ │
│ │ │ │ │ delta=Delta( │ │
│ │ │ │ │ │ content=None, │ │
│ │ │ │ │ │ role=None, │ │
│ │ │ │ │ │ function_call=None, │ │
│ │ │ │ │ │ tool_calls=None │ │
│ │ │ │ │ ), │ │
│ │ │ │ │ logprobs=None │ │
│ │ │ │ ) │ │
│ │ │ ], │ │
│ │ │ created=1715994803, │ │
│ │ │ model='llama3', │ │
│ │ │ object='chat.completion.chunk', │ │
│ │ │ system_fingerprint=None │ │
│ │ ) │ │
│ │ delta = Delta(content=None, role=None, function_call=None, tool_calls=None) │ │
│ │ message = [ │ │
│ │ │ { │ │
│ │ │ │ 'role': 'user', │ │
│ │ │ │ 'content': 'The following prompt is a composition of prompt sections, │ │
│ │ each with different pr'+1775 │ │
│ │ │ } │ │
│ │ ] │ │
│ │ msg = None │ │
│ │ prompt = 'The following prompt is a composition of prompt sections, each with different │ │
│ │ pr'+1775 │ │
│ │ response = <generator object ollama_completion_stream at 0x00000210E5A38460> │ │
│ │ self = <ai.AI object at 0x00000210E5A7CD10> │ │
│ └──────────────────────────────────────────────────────────────────────────────────────────────┘ │
└──────────────────────────────────────────────────────────────────────────────────────────────────┘
TypeError: can only concatenate str (not "NoneType") to str
It can be sidestepped in ai.py on line 49 by changing chat += msg to chat += msg or "". The cause is visible in the locals above: the final streaming chunk (the one with finish_reason='stop') carries delta.content=None, and delta.get("content", "") returns None rather than the default because the key is present with a None value, so the string concatenation fails.
Commented here: https://github.com/joshpxyne/gpt-migrate/commit/bfb06d6c81818fc0d66f1741f8179efa3225419e#r145282262
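For reference, here's a minimal, runnable sketch of that guard. The loop body follows the ai.py frame shown in the traceback (lines 46–50); the `collect_stream` wrapper and the fake chunks are mine, added only so the snippet runs standalone:

```python
def collect_stream(response):
    """Accumulate streamed deltas into one string, tolerating None content."""
    chat = ""
    for chunk in response:
        delta = chunk["choices"][0]["delta"]
        # On the final chunk (finish_reason='stop') the "content" key is
        # present but set to None, so .get("content", "") still returns None.
        msg = delta.get("content", "")
        chat += msg or ""  # coerce None to "" before concatenating
    return chat


# Tiny demo with fake chunks shaped like the ones in the traceback:
fake = [
    {"choices": [{"delta": {"content": "hello "}}]},
    {"choices": [{"delta": {"content": None}}]},  # final 'stop' chunk
]
print(collect_stream(fake))  # -> "hello "
```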
I'm getting this on pretty much any command, even when going from Python to Python to have it debug, test, etc., or to convert to different libraries with --guidelines.
Here's the full console output (note this example uses an obscure language, but I get the same error on Python to Python):
Another example: