Open keaiwkaw opened 1 year ago
@keaiwkaw Hi, have you solved this problem?
@xueerli @keaiwkaw
I solved it by patching the wireTmGrammars method from here. I just don't let the tokenizer tokenize lines longer than 300 characters.
Easy peasy:
/**
 * Wires up monaco-editor with monaco-textmate
 */
export function wireTmGrammars(monaco: MonacoEditor, registry: Registry, languages: Map<string, string>, editor?: MonacoEditorInstance) {
  const maxLineLength = 300;
  return Promise.all(
    Array.from(languages.keys()).map(async (languageId) => {
      const grammar = await registry.loadGrammar(languages.get(languageId));
      monaco.languages.setTokensProvider(languageId, {
        getInitialState: () => new TokenizerState(INITIAL),
        tokenize: (line: string, state: TokenizerState) => {
          // Return plaintext for long lines
          if (line.length > maxLineLength) {
            return {
              endState: state,
              tokens: [{ startIndex: 0, scopes: 'plaintext' }]
            };
          }
          const res = grammar.tokenizeLine(line, state.ruleStack);
          editor ??= monaco.editor.getEditors()[0] as MonacoEditorInstance;
          return {
            endState: new TokenizerState(res.ruleStack),
            tokens: res.tokens.map(token => ({
              ...token,
              scopes: editor ? TMToMonacoToken(editor, token.scopes) : token.scopes.at(-1)
            })),
          };
        }
      });
    })
  );
}
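The core of the fix is the guard at the top of tokenize: bail out with a single plaintext token before the TextMate grammar ever sees an oversized line. Here is a minimal, monaco-free sketch of just that guard (the names guardedTokenize and Token are illustrative, not part of monaco-editor-textmate), which makes the fallback behavior easy to test in isolation:

```typescript
// Hypothetical standalone version of the long-line guard used above.
type Token = { startIndex: number; scopes: string };

const maxLineLength = 300;

function guardedTokenize(
  line: string,
  tokenizeLine: (line: string) => Token[]
): Token[] {
  if (line.length > maxLineLength) {
    // Skip expensive TextMate tokenization entirely and
    // render the whole line as a single plaintext token.
    return [{ startIndex: 0, scopes: 'plaintext' }];
  }
  // Short lines go through the real tokenizer unchanged.
  return tokenizeLine(line);
}

// Example: a trivial stand-in tokenizer that scopes everything as source.js.
const jsStub = (_line: string): Token[] => [{ startIndex: 0, scopes: 'source.js' }];

console.log(guardedTokenize('const x = 1;', jsStub)[0].scopes);   // source.js
console.log(guardedTokenize('x'.repeat(500), jsStub)[0].scopes);  // plaintext
```

Returning the unchanged state for skipped lines (as the patch does with endState: state) keeps monaco's incremental tokenizer consistent, at the cost of losing multi-line constructs (e.g. block comments) that span an oversized line.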
The browser crashed while I was loading 60k of compressed JS code into the monaco-editor. When I commented out the highlighting logic, the compressed code displayed normally.