MagicStack / MagicPython

A cutting-edge Python syntax highlighter for Sublime Text, Atom, and Visual Studio Code. Used by GitHub to highlight your Python code!
MIT License

Tokenization for match-case #235

Open alexr00 opened 3 years ago

alexr00 commented 3 years ago

Originally from @SNvMK in https://github.com/microsoft/vscode/issues/120734

So, in Python 3.10, there is match/case syntax. Currently, it is rendered as plain white words (in Monokai). I'd like it if you added highlighting for this. Thanks!

a = 'world'
match a:
    case 'world':
        print('Hello!')
    case _:
        print('I dont know you!')

Currently, match and case are only tokenized as source.python.

cdce8p commented 3 years ago

I opened #237 to address this one.

hawkinsw commented 2 years ago

Though @cdce8p closed his PR, I think that this would still be beneficial. The VS Code markdown "previewer" uses this grammar when rendering Python-labeled triple-backtick blocks. I am happy to pick up where @cdce8p left off with his great work.

I am trying to figure out whether there is a reference for the YAML format that drives the production of the other files. In particular, I am wondering whether there is documentation explaining how to determine (using regular expressions as much as possible) whether the tokenizer is in a particular section. In other words, does the YAML file support defining a region (or something like that) bounded by two sets of regular expressions, which could be used to label a section of code as belonging to the body of, say, a lambda or a with statement? We could write such a group detector for match statements, which would make it easier to label case and _ as keywords in the contexts where they are likely to be keywords.
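The begin/end idea above can be sketched in plain Python. This is only an illustration of the region-tracking concept, not MagicPython's actual grammar machinery: one regex opens a match block, another recognizes case arms inside it, and a dedent closes the region. All names here are hypothetical.

```python
import re

# Illustrative "begin" and "inside" patterns, in the spirit of
# TextMate-style begin/end region rules (names are made up here).
BEGIN = re.compile(r"^\s*match\b.*:\s*(#.*)?$")  # opens a match block
CASE = re.compile(r"^\s+case\b")                 # a case arm within it

def classify(lines):
    """Label each line as 'match', 'case', or 'other'."""
    labels = []
    inside = False
    indent = 0
    for line in lines:
        stripped = line.lstrip()
        if BEGIN.match(line):
            inside = True
            indent = len(line) - len(stripped)
            labels.append("match")
        elif inside and CASE.match(line):
            labels.append("case")
        else:
            # A non-blank line at or below the match statement's own
            # indentation ends the region (the "end" condition).
            if inside and stripped and len(line) - len(stripped) <= indent:
                inside = False
            labels.append("other")
    return labels

src = [
    "a = 'world'",
    "match a:",
    "    case 'world':",
    "        print('Hello!')",
    "    case _:",
    "        print('I dont know you!')",
]
print(classify(src))  # ['other', 'match', 'case', 'other', 'case', 'other']
```

A real grammar rule would also need to scope the pattern after each case, but even this crude detector shows how pairing a begin regex with an end condition narrows where case is treated as a keyword.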

Obviously, regular expressions are not enough to capture the full context sensitivity required to definitively mark match, case, and _, but perhaps it could be a good start?
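The context sensitivity is real: match and case are soft keywords in Python 3.10+, so the very same tokens remain legal identifiers. A purely lexical rule that highlights every occurrence of match as a keyword would misfire on ordinary code like this:

```python
import re

# 'match' and 'case' are soft keywords: outside a match statement they
# are plain identifiers, so keyword-everywhere highlighting is wrong here.
match = re.match(r"\d+", "123abc")  # 'match' as a variable name
case = "lower"                       # 'case' as a variable name
print(match.group(), case)           # prints: 123 lower
```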

contang0 commented 2 years ago

This seems to be addressed by 7d0f2b22a5ad8fccbd7341bc7b7a715169283044. Is there going to be a new release? The last one was in 2018.