Closed: mcepl closed this issue 9 months ago.
Should I understand the migration note at https://orbitalquark.github.io/scintillua/api.html#migrating-legacy-lexers, which says:

Lexers no longer specify styling information. Remove any calls to lex:add_style(). You may need to add styling information for custom tags to your editor's theme.

to mean that all calls to lex:add_style() should be removed? I see that these files still contain them:
And indeed, when I apply this patch to rest.lua, the error reports go away:
diff --git a/vis/lexers/rest.lua b/vis/lexers/rest.lua
index 7080003..df94656 100644
--- a/vis/lexers/rest.lua
+++ b/vis/lexers/rest.lua
@@ -20,7 +20,7 @@ local block = '::' * (lexer.newline + -1) * function(input, index)
return #input + 1
end
lex:add_rule('literal_block', token('literal_block', block))
-lex:add_style('literal_block', lexer.styles.embedded .. {eolfilled = true})
+-- MC lex:add_style('literal_block', lexer.styles.embedded .. {eolfilled = true})
-- Lists.
local option_word = lexer.alnum * (lexer.alnum + '-')^0
@@ -62,7 +62,7 @@ end
local code_block =
prefix * 'code-block::' * S(' \t')^1 * lexer.nonnewline^0 * (lexer.newline + -1) * indented_block
lex:add_rule('code_block', #prefix * token('code_block', starts_line(code_block)))
-lex:add_style('code_block', lexer.styles.embedded .. {eolfilled = true})
+-- MC lex:add_style('code_block', lexer.styles.embedded .. {eolfilled = true})
-- Directives.
local known_directive = token('directive', prefix * word_match{
@@ -102,8 +102,8 @@ local unknown_directive = token('unknown_directive', prefix * word * '::' * lexe
lex:add_rule('directive',
#prefix * starts_line(known_directive + sphinx_directive + unknown_directive))
lex:add_style('directive', lexer.styles.keyword)
-lex:add_style('sphinx_directive', lexer.styles.keyword .. {bold = true})
-lex:add_style('unknown_directive', lexer.styles.keyword .. {italics = true})
+-- MC lex:add_style('sphinx_directive', lexer.styles.keyword .. {bold = true})
+-- MC lex:add_style('unknown_directive', lexer.styles.keyword .. {italics = true})
-- Substitution definitions.
lex:add_rule('substitution', #prefix * token('substitution', starts_line(prefix * lexer.range('|') *
Except, of course, that it also eliminates the desired functionality (grep -c 'lex:add_style' rest.lua gives me 12 hits). What is the correct way to deal with these styling issues? Or is it vis-specific? Do I have to create custom styles in all of our themes/ files? For example, how do I make diff.lua work (with tags like addition, deletion, and change)?
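For diff.lua, I would guess this means adding style strings for the custom tags to each theme file, next to the standard styles the themes already define. The names and values below are only my guess at the convention (I have not verified how vis resolves custom tag names to theme styles):

```lua
-- Hypothetical additions to a vis theme file (themes/<name>.lua).
-- The STYLE_* names for custom tags are an assumption, modeled on the
-- standard styles (lexers.STYLE_KEYWORD etc.) that themes already set.
lexers.STYLE_ADDITION = 'fore:#00aa00'
lexers.STYLE_DELETION = 'fore:#aa0000'
lexers.STYLE_CHANGE = 'fore:#aaaa00'
-- eolfilled is a standard Scintilla style attribute:
lexers.STYLE_LITERAL_BLOCK = 'back:#303030,eolfilled'
```

If that is the intended mechanism, every theme shipped with vis would need these entries for every custom tag used by the bundled lexers.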
I don't think lex:add_style() is the issue. The problem is that the highlighted code is not valid Lua. I'm not sure what the intention was, but it is probably something like:
lexer.styles.embedded.eolfilled = true
lex:add_style('literal_block', lexer.styles.embedded)
lexer.styles.embedded.eolfilled = false
@orbitalquark will have to confirm since they introduced the code in 0649455. Matěj is credited but those lines were not in the original patch (#71).
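For context: in plain Lua, applying `..` to tables raises an error unless a `__concat` metamethod is defined, which is presumably what the `lexer.styles.embedded .. {eolfilled = true}` idiom relied on in the older API. A minimal sketch of how such non-destructive style merging can be implemented (this is an illustration of the Lua mechanism, not Scintillua's actual code):

```lua
-- style() wraps a table so that `a .. b` returns a merged copy,
-- leaving both operands untouched.
local function style(t)
  return setmetatable(t, {
    __concat = function(a, b)
      local merged = {}
      for k, v in pairs(a) do merged[k] = v end
      for k, v in pairs(b) do merged[k] = v end -- b's keys override a's
      return style(merged)
    end
  })
end

local embedded = style{fore = '#000000', back = '#dddddd'}
local literal_block = embedded .. {eolfilled = true}
assert(literal_block.eolfilled == true)
assert(literal_block.back == '#dddddd')
assert(embedded.eolfilled == nil) -- the original style is unchanged
```

With a scheme like this, the three-line workaround above (setting and resetting `eolfilled` on the shared style) would be unnecessary.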
BTW, is it correct that we could eliminate all of these lines, because this rule is now the default? Why are they still in this repository?
-- Whitespace.
lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
I mean, something as radical as this?
From 5df0b23e9ecc126abc966206d42e46e2b63e184f Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Mat=C4=9Bj=20Cepl?= <mcepl@cepl.eu>
Date: Mon, 28 Aug 2023 15:01:37 +0200
Subject: [PATCH vis] Remove now default (in scintillua 6.0+) whitespace
declaration.
---
lua/lexers/actionscript.lua | 3 ---
lua/lexers/ada.lua | 3 ---
lua/lexers/antlr.lua | 3 ---
lua/lexers/apdl.lua | 3 ---
lua/lexers/apl.lua | 3 ---
lua/lexers/applescript.lua | 3 ---
lua/lexers/batch.lua | 3 ---
lua/lexers/boo.lua | 3 ---
lua/lexers/caml.lua | 3 ---
lua/lexers/clojure.lua | 3 ---
lua/lexers/coffeescript.lua | 3 ---
lua/lexers/context.lua | 3 ---
lua/lexers/crystal.lua | 3 ---
lua/lexers/csharp.lua | 3 ---
lua/lexers/dart.lua | 3 ---
lua/lexers/dot.lua | 3 ---
lua/lexers/eiffel.lua | 3 ---
lua/lexers/elixir.lua | 3 ---
lua/lexers/elm.lua | 3 ---
lua/lexers/erlang.lua | 3 ---
lua/lexers/fantom.lua | 4 ----
lua/lexers/faust.lua | 3 ---
lua/lexers/fennel.lua | 3 ---
lua/lexers/fish.lua | 3 ---
lua/lexers/forth.lua | 3 ---
lua/lexers/fortran.lua | 3 ---
lua/lexers/fsharp.lua | 3 ---
lua/lexers/fstab.lua | 3 ---
lua/lexers/gap.lua | 3 ---
lua/lexers/gettext.lua | 3 ---
lua/lexers/gherkin.lua | 3 ---
lua/lexers/gleam.lua | 4 ----
lua/lexers/groovy.lua | 3 ---
lua/lexers/hare.lua | 3 ---
lua/lexers/haskell.lua | 3 ---
lua/lexers/icon.lua | 3 ---
lua/lexers/idl.lua | 3 ---
lua/lexers/inform.lua | 3 ---
lua/lexers/ini.lua | 3 ---
lua/lexers/io_lang.lua | 3 ---
lua/lexers/jq.lua | 3 ---
lua/lexers/julia.lua | 3 ---
lua/lexers/ledger.lua | 3 ---
lua/lexers/lilypond.lua | 3 ---
lua/lexers/lisp.lua | 3 ---
lua/lexers/logtalk.lua | 3 ---
lua/lexers/man.lua | 3 ---
lua/lexers/matlab.lua | 3 ---
lua/lexers/meson.lua | 3 ---
lua/lexers/moonscript.lua | 3 ---
lua/lexers/myrddin.lua | 3 ---
lua/lexers/nemerle.lua | 3 ---
lua/lexers/networkd.lua | 3 ---
lua/lexers/nim.lua | 3 ---
lua/lexers/nsis.lua | 3 ---
lua/lexers/objective_c.lua | 3 ---
lua/lexers/pascal.lua | 3 ---
lua/lexers/pike.lua | 3 ---
lua/lexers/pkgbuild.lua | 3 ---
lua/lexers/pony.lua | 4 ----
lua/lexers/powershell.lua | 3 ---
lua/lexers/prolog.lua | 3 ---
lua/lexers/protobuf.lua | 3 ---
lua/lexers/ps.lua | 3 ---
lua/lexers/pure.lua | 3 ---
lua/lexers/rc.lua | 3 ---
lua/lexers/reason.lua | 3 ---
lua/lexers/rebol.lua | 3 ---
lua/lexers/rexx.lua | 3 ---
lua/lexers/routeros.lua | 3 ---
lua/lexers/rpmspec.lua | 3 ---
lua/lexers/rstats.lua | 3 ---
lua/lexers/scala.lua | 4 ----
lua/lexers/scheme.lua | 3 ---
lua/lexers/smalltalk.lua | 3 ---
lua/lexers/sml.lua | 4 ----
lua/lexers/snobol4.lua | 3 ---
lua/lexers/spin.lua | 3 ---
lua/lexers/sql.lua | 3 ---
lua/lexers/systemd.lua | 3 ---
lua/lexers/tcl.lua | 3 ---
lua/lexers/vala.lua | 3 ---
lua/lexers/vcard.lua | 3 ---
lua/lexers/verilog.lua | 3 ---
lua/lexers/vhdl.lua | 3 ---
lua/lexers/xs.lua | 3 ---
lua/lexers/xtend.lua | 4 ----
lua/lexers/zig.lua | 3 ---
88 files changed, 270 deletions(-)
diff --git a/lua/lexers/actionscript.lua b/lua/lexers/actionscript.lua
index 07b68fd8..b25aad2f 100644
--- a/lua/lexers/actionscript.lua
+++ b/lua/lexers/actionscript.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('actionscript')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'break', 'continue', 'delete', 'do', 'else', 'for', 'function', 'if', 'in', 'new', 'on', 'return',
diff --git a/lua/lexers/ada.lua b/lua/lexers/ada.lua
index 6533ab30..4d933e75 100644
--- a/lua/lexers/ada.lua
+++ b/lua/lexers/ada.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('ada')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match({
'abort', 'abs', 'abstract', 'accept', 'access', 'aliased', 'all', 'and', 'array', 'at', 'begin',
diff --git a/lua/lexers/antlr.lua b/lua/lexers/antlr.lua
index 984ca3eb..30ce8757 100644
--- a/lua/lexers/antlr.lua
+++ b/lua/lexers/antlr.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('antlr')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'abstract', 'break', 'case', 'catch', 'continue', 'default', 'do', 'else', 'extends', 'final',
diff --git a/lua/lexers/apdl.lua b/lua/lexers/apdl.lua
index d25b889f..9b0e6b6d 100644
--- a/lua/lexers/apdl.lua
+++ b/lua/lexers/apdl.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('apdl', {case_insensitive_fold_points = true})
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match({
'*abbr', '*abb', '*afun', '*afu', '*ask', '*cfclos', '*cfc', '*cfopen', '*cfo', '*cfwrite',
diff --git a/lua/lexers/apl.lua b/lua/lexers/apl.lua
index 77373c51..698319d1 100644
--- a/lua/lexers/apl.lua
+++ b/lua/lexers/apl.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('apl')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Comments.
lex:add_rule('comment', token(lexer.COMMENT, lexer.to_eol(P('⍝') + '#')))
diff --git a/lua/lexers/applescript.lua b/lua/lexers/applescript.lua
index 30118195..611d7e0f 100644
--- a/lua/lexers/applescript.lua
+++ b/lua/lexers/applescript.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('applescript')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match({
'script', 'property', 'prop', 'end', 'copy', 'to', 'set', 'global', 'local', 'on', 'to', 'of',
diff --git a/lua/lexers/batch.lua b/lua/lexers/batch.lua
index 6c6765b7..8a53b2e5 100644
--- a/lua/lexers/batch.lua
+++ b/lua/lexers/batch.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('batch', {case_insensitive_fold_points = true})
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match({
'cd', 'chdir', 'md', 'mkdir', 'cls', 'for', 'if', 'echo', 'echo.', 'move', 'copy', 'ren', 'del',
diff --git a/lua/lexers/boo.lua b/lua/lexers/boo.lua
index 67a75192..6a539275 100644
--- a/lua/lexers/boo.lua
+++ b/lua/lexers/boo.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('boo')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'and', 'break', 'cast', 'continue', 'elif', 'else', 'ensure', 'except', 'for', 'given', 'goto',
diff --git a/lua/lexers/caml.lua b/lua/lexers/caml.lua
index e269d754..419e1fe3 100644
--- a/lua/lexers/caml.lua
+++ b/lua/lexers/caml.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('caml')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'and', 'as', 'asr', 'begin', 'class', 'closed', 'constraint', 'do', 'done', 'downto', 'else',
diff --git a/lua/lexers/clojure.lua b/lua/lexers/clojure.lua
index 201733d6..6e06cf7d 100644
--- a/lua/lexers/clojure.lua
+++ b/lua/lexers/clojure.lua
@@ -8,9 +8,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('clojure')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'fn', 'try', 'catch', 'finaly', 'defonce', 'and', 'case', 'cond', 'def', 'defn', 'defmacro', 'do',
diff --git a/lua/lexers/coffeescript.lua b/lua/lexers/coffeescript.lua
index 862b34d5..ec3b07a0 100644
--- a/lua/lexers/coffeescript.lua
+++ b/lua/lexers/coffeescript.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('coffeescript', {fold_by_indentation = true})
--- Whitespace.
-lex:add_rule('whitespace', lex:tag(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', lex:tag(lexer.KEYWORD, word_match{
'all', 'and', 'bind', 'break', 'by', 'case', 'catch', 'class', 'const', 'continue', 'default',
diff --git a/lua/lexers/context.lua b/lua/lexers/context.lua
index a4d48091..8d9fd2b4 100644
--- a/lua/lexers/context.lua
+++ b/lua/lexers/context.lua
@@ -11,9 +11,6 @@ local lex = lexer.new('context')
local beginend = (P('begin') + 'end')
local startstop = (P('start') + 'stop')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Comments.
lex:add_rule('comment', token(lexer.COMMENT, lexer.to_eol('%')))
diff --git a/lua/lexers/crystal.lua b/lua/lexers/crystal.lua
index 703bc689..0af69be2 100644
--- a/lua/lexers/crystal.lua
+++ b/lua/lexers/crystal.lua
@@ -8,9 +8,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('crystal')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'alias', 'begin', 'break', 'case', 'class', 'def', 'defined?', 'do', 'else', 'elsif', 'end',
diff --git a/lua/lexers/csharp.lua b/lua/lexers/csharp.lua
index 628ebd21..20027484 100644
--- a/lua/lexers/csharp.lua
+++ b/lua/lexers/csharp.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('csharp')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'class', 'delegate', 'enum', 'event', 'interface', 'namespace', 'struct', 'using', 'abstract',
diff --git a/lua/lexers/dart.lua b/lua/lexers/dart.lua
index cf6fd28b..03d968ca 100644
--- a/lua/lexers/dart.lua
+++ b/lua/lexers/dart.lua
@@ -8,9 +8,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('dart')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'assert', 'break', 'case', 'catch', 'class', 'const', 'continue', 'default', 'do', 'else', 'enum',
diff --git a/lua/lexers/dot.lua b/lua/lexers/dot.lua
index 230e61f8..612f5323 100644
--- a/lua/lexers/dot.lua
+++ b/lua/lexers/dot.lua
@@ -8,9 +8,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('dot')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'graph', 'node', 'edge', 'digraph', 'fontsize', 'rankdir', 'fontname', 'shape', 'label',
diff --git a/lua/lexers/eiffel.lua b/lua/lexers/eiffel.lua
index 6f73eb2e..f02dd6ab 100644
--- a/lua/lexers/eiffel.lua
+++ b/lua/lexers/eiffel.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('eiffel')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'alias', 'all', 'and', 'as', 'check', 'class', 'creation', 'debug', 'deferred', 'do', 'else',
diff --git a/lua/lexers/elixir.lua b/lua/lexers/elixir.lua
index 995401f2..6d399b16 100644
--- a/lua/lexers/elixir.lua
+++ b/lua/lexers/elixir.lua
@@ -8,9 +8,6 @@ local B, P, S = lpeg.B, lpeg.P, lpeg.S
local lex = lexer.new('elixir', {fold_by_indentation = true})
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Sigils.
local sigil11 = '~' * S('CRSW') * lexer.range('<', '>')
local sigil12 = '~' * S('CRSW') * lexer.range('{', '}')
diff --git a/lua/lexers/elm.lua b/lua/lexers/elm.lua
index c609431c..50238ede 100644
--- a/lua/lexers/elm.lua
+++ b/lua/lexers/elm.lua
@@ -8,9 +8,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('elm', {fold_by_indentation = true})
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match(
'if then else case of let in module import as exposing type alias port')))
diff --git a/lua/lexers/erlang.lua b/lua/lexers/erlang.lua
index 923891e9..75c219ee 100644
--- a/lua/lexers/erlang.lua
+++ b/lua/lexers/erlang.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('erlang')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'after', 'begin', 'case', 'catch', 'cond', 'end', 'fun', 'if', 'let', 'of', 'query', 'receive',
diff --git a/lua/lexers/fantom.lua b/lua/lexers/fantom.lua
index c7275ec7..7696e581 100644
--- a/lua/lexers/fantom.lua
+++ b/lua/lexers/fantom.lua
@@ -8,10 +8,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('fantom')
--- Whitespace.
-local ws = token(lexer.WHITESPACE, lexer.space^1)
-lex:add_rule('whitespace', ws)
-
-- Classes.
local type = token(lexer.TYPE, lexer.word)
lex:add_rule('class_sequence',
diff --git a/lua/lexers/faust.lua b/lua/lexers/faust.lua
index a4b20bcf..f50b4092 100644
--- a/lua/lexers/faust.lua
+++ b/lua/lexers/faust.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('faust')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'declare', 'import', 'mdoctags', 'dependencies', 'distributed', 'inputs', 'outputs', 'par', 'seq',
diff --git a/lua/lexers/fennel.lua b/lua/lexers/fennel.lua
index 62681ed3..0442a0d4 100644
--- a/lua/lexers/fennel.lua
+++ b/lua/lexers/fennel.lua
@@ -8,9 +8,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('fennel', {inherit = lexer.load('lua')})
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:modify_rule('keyword', token(lexer.KEYWORD, word_match{
'#', '%', '*', '+', '-', '->>', '->', '-?>>', '-?>', '..', '.', '//', '/', ':', '<=', '<', '=',
diff --git a/lua/lexers/fish.lua b/lua/lexers/fish.lua
index cd0247a3..0be43292 100644
--- a/lua/lexers/fish.lua
+++ b/lua/lexers/fish.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('fish')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'alias', 'and', 'begin', 'bg', 'bind', 'block', 'break', 'breakpoint', 'builtin', 'case', 'cd',
diff --git a/lua/lexers/forth.lua b/lua/lexers/forth.lua
index 851c3d09..2a3f4973 100644
--- a/lua/lexers/forth.lua
+++ b/lua/lexers/forth.lua
@@ -8,9 +8,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('forth')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Strings.
local c_str = 'c' * lexer.range('"', true, false)
local s_str = 's' * lexer.range('"', true, false)
diff --git a/lua/lexers/fortran.lua b/lua/lexers/fortran.lua
index d3729310..1bd0bd5c 100644
--- a/lua/lexers/fortran.lua
+++ b/lua/lexers/fortran.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('fortran')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Comments.
local line_comment = lexer.to_eol(lexer.starts_line(S('CcDd!*')) + '!')
lex:add_rule('comment', token(lexer.COMMENT, line_comment))
diff --git a/lua/lexers/fsharp.lua b/lua/lexers/fsharp.lua
index 451c794b..0ab14d33 100644
--- a/lua/lexers/fsharp.lua
+++ b/lua/lexers/fsharp.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('fsharp', {fold_by_indentation = true})
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'abstract', 'and', 'as', 'assert', 'asr', 'begin', 'class', 'default', 'delegate', 'do', 'done',
diff --git a/lua/lexers/fstab.lua b/lua/lexers/fstab.lua
index c76cb6a3..3f9a1743 100644
--- a/lua/lexers/fstab.lua
+++ b/lua/lexers/fstab.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('fstab', {lex_by_line = true})
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
-- Basic filesystem-independent mount options.
diff --git a/lua/lexers/gap.lua b/lua/lexers/gap.lua
index d770bd89..b401be94 100644
--- a/lua/lexers/gap.lua
+++ b/lua/lexers/gap.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('gap')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'and', 'break', 'continue', 'do', 'elif', 'else', 'end', 'fail', 'false', 'fi', 'for', 'function',
diff --git a/lua/lexers/gettext.lua b/lua/lexers/gettext.lua
index 35bb607e..406eed61 100644
--- a/lua/lexers/gettext.lua
+++ b/lua/lexers/gettext.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('gettext')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match(
'msgid msgid_plural msgstr fuzzy c-format no-c-format', true)))
diff --git a/lua/lexers/gherkin.lua b/lua/lexers/gherkin.lua
index a44dbbcf..e4aada30 100644
--- a/lua/lexers/gherkin.lua
+++ b/lua/lexers/gherkin.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('gherkin', {fold_by_indentation = true})
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match(
'And Background But Examples Feature Given Outline Scenario Scenarios Then When')))
diff --git a/lua/lexers/gleam.lua b/lua/lexers/gleam.lua
index 1f7191d0..4aa4e800 100644
--- a/lua/lexers/gleam.lua
+++ b/lua/lexers/gleam.lua
@@ -11,10 +11,6 @@ local KEY, OP = lexer.KEYWORD, lexer.OPERATOR
local lex = lexer.new('gleam')
--- Whitespace.
-local gleam_ws = token(lexer.WHITESPACE, lexer.space^1)
-lex:add_rule('whitespace', gleam_ws)
-
-- Types.
local typ_tok = token(lexer.TYPE, lexer.upper * lexer.alnum^0)
lex:add_rule('type', typ_tok)
diff --git a/lua/lexers/groovy.lua b/lua/lexers/groovy.lua
index e9fe4c5f..26fce012 100644
--- a/lua/lexers/groovy.lua
+++ b/lua/lexers/groovy.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('groovy')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'abstract', 'break', 'case', 'catch', 'continue', 'default', 'do', 'else', 'extends', 'final',
diff --git a/lua/lexers/hare.lua b/lua/lexers/hare.lua
index eef2e85e..985a2787 100644
--- a/lua/lexers/hare.lua
+++ b/lua/lexers/hare.lua
@@ -9,9 +9,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('hare')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'as', 'break', 'case', 'const', 'continue', 'def', 'defer', 'else', 'export', 'false', 'fn',
diff --git a/lua/lexers/haskell.lua b/lua/lexers/haskell.lua
index 55855748..cc34dec0 100644
--- a/lua/lexers/haskell.lua
+++ b/lua/lexers/haskell.lua
@@ -8,9 +8,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('haskell', {fold_by_indentation = true})
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'case', 'class', 'data', 'default', 'deriving', 'do', 'else', 'if', 'import', 'in', 'infix',
diff --git a/lua/lexers/icon.lua b/lua/lexers/icon.lua
index 5c24d054..1ba3a93b 100644
--- a/lua/lexers/icon.lua
+++ b/lua/lexers/icon.lua
@@ -9,9 +9,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('icon')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'break', 'by', 'case', 'create', 'default', 'do', 'else', 'end', 'every', 'fail', 'global', 'if',
diff --git a/lua/lexers/idl.lua b/lua/lexers/idl.lua
index 0ae55330..1d93a391 100644
--- a/lua/lexers/idl.lua
+++ b/lua/lexers/idl.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('idl')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'abstract', 'attribute', 'case', 'const', 'context', 'custom', 'default', 'enum', 'exception',
diff --git a/lua/lexers/inform.lua b/lua/lexers/inform.lua
index 075a3af2..3e056ded 100644
--- a/lua/lexers/inform.lua
+++ b/lua/lexers/inform.lua
@@ -8,9 +8,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('inform')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'Abbreviate', 'Array', 'Attribute', 'Class', 'Constant', 'Default', 'End', 'Endif', 'Extend',
diff --git a/lua/lexers/ini.lua b/lua/lexers/ini.lua
index aee3a811..6cef99ec 100644
--- a/lua/lexers/ini.lua
+++ b/lua/lexers/ini.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('ini')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match('true false on off yes no')))
diff --git a/lua/lexers/io_lang.lua b/lua/lexers/io_lang.lua
index 4737b69a..e0d1367e 100644
--- a/lua/lexers/io_lang.lua
+++ b/lua/lexers/io_lang.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('io_lang')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'block', 'method', 'while', 'foreach', 'if', 'else', 'do', 'super', 'self', 'clone', 'proto',
diff --git a/lua/lexers/jq.lua b/lua/lexers/jq.lua
index 49ea848c..4ed8832a 100644
--- a/lua/lexers/jq.lua
+++ b/lua/lexers/jq.lua
@@ -8,9 +8,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('jq')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
-- keywords not listed by jq's "builtins", minus operators 'and' and 'or', plus the '?' shorthand
diff --git a/lua/lexers/julia.lua b/lua/lexers/julia.lua
index db0a1de0..d0fd55fa 100644
--- a/lua/lexers/julia.lua
+++ b/lua/lexers/julia.lua
@@ -7,9 +7,6 @@ local B, P, S = lpeg.B, lpeg.P, lpeg.S
local lex = lexer.new('julia')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
local id = lexer.word * P('!')^0
-- Keyword
diff --git a/lua/lexers/ledger.lua b/lua/lexers/ledger.lua
index e41d7100..3ae8767f 100644
--- a/lua/lexers/ledger.lua
+++ b/lua/lexers/ledger.lua
@@ -18,9 +18,6 @@ lex:add_rule('amount', token(lexer.NUMBER, delim * (1 - S(';\r\n'))^1))
-- Comments.
lex:add_rule('comment', token(lexer.COMMENT, lexer.to_eol(S(';#'))))
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Strings.
local sq_str = lexer.range("'")
local dq_str = lexer.range('"')
diff --git a/lua/lexers/lilypond.lua b/lua/lexers/lilypond.lua
index 86eb4de1..8f9231f5 100644
--- a/lua/lexers/lilypond.lua
+++ b/lua/lexers/lilypond.lua
@@ -8,9 +8,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('lilypond')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords, commands.
lex:add_rule('keyword', token(lexer.KEYWORD, '\\' * lexer.word))
diff --git a/lua/lexers/lisp.lua b/lua/lexers/lisp.lua
index e344c0d6..5f11aaef 100644
--- a/lua/lexers/lisp.lua
+++ b/lua/lexers/lisp.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('lisp')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'defclass', 'defconstant', 'defgeneric', 'define-compiler-macro', 'define-condition',
diff --git a/lua/lexers/logtalk.lua b/lua/lexers/logtalk.lua
index 36c50dea..f8d8fd31 100644
--- a/lua/lexers/logtalk.lua
+++ b/lua/lexers/logtalk.lua
@@ -20,9 +20,6 @@ lex:modify_rule('directive',
token(lexer.PREPROCESSOR, word_match(directives))
) + lex:get_rule('directive'))
--- Whitespace.
-lex:modify_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
local zero_arity_keywords = {
-- extracted from test document in logtalk distribution
'comment', 'argnames', 'arguments', 'author', 'version', 'date', 'parameters', 'parnames',
diff --git a/lua/lexers/man.lua b/lua/lexers/man.lua
index 3ca9910c..0db92b5c 100644
--- a/lua/lexers/man.lua
+++ b/lua/lexers/man.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('man')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Markup.
lex:add_rule('rule1', token(lexer.STRING, '.' * lexer.to_eol('B' * P('R')^-1 + 'I' * P('PR')^-1)))
lex:add_rule('rule2', token(lexer.NUMBER, lexer.to_eol('.' * S('ST') * 'H')))
diff --git a/lua/lexers/matlab.lua b/lua/lexers/matlab.lua
index 6e09d4fd..d0d3be61 100644
--- a/lua/lexers/matlab.lua
+++ b/lua/lexers/matlab.lua
@@ -8,9 +8,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('matlab')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match({
'break', 'case', 'catch', 'continue', 'do', 'else', 'elseif', 'end', 'end_try_catch',
diff --git a/lua/lexers/meson.lua b/lua/lexers/meson.lua
index c8e029b2..4857527c 100644
--- a/lua/lexers/meson.lua
+++ b/lua/lexers/meson.lua
@@ -7,9 +7,6 @@ local P, R, S = lpeg.P, lpeg.R, lpeg.S
local lex = lexer.new('meson', {fold_by_indentation = true})
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match(
'and or not if elif else endif foreach break continue endforeach')))
diff --git a/lua/lexers/moonscript.lua b/lua/lexers/moonscript.lua
index 436924eb..64a8eb02 100644
--- a/lua/lexers/moonscript.lua
+++ b/lua/lexers/moonscript.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('moonscript', {fold_by_indentation = true})
--- Whitespace.
-lex:add_rule('whitspace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Table keys.
lex:add_rule('tbl_key', token('tbl_key', lexer.word * ':' + ':' * lexer.word))
lex:add_style('tbl_key', lexer.styles.regex)
diff --git a/lua/lexers/myrddin.lua b/lua/lexers/myrddin.lua
index 2af5010c..a6cae07e 100644
--- a/lua/lexers/myrddin.lua
+++ b/lua/lexers/myrddin.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('myrddin')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'break', 'const', 'continue', 'elif', 'else', 'extern', 'false', 'for', 'generic', 'goto', 'if',
diff --git a/lua/lexers/nemerle.lua b/lua/lexers/nemerle.lua
index c4937f59..4fa77018 100644
--- a/lua/lexers/nemerle.lua
+++ b/lua/lexers/nemerle.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('nemerle')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'_', 'abstract', 'and', 'array', 'as', 'base', 'catch', 'class', 'def', 'do', 'else', 'extends',
diff --git a/lua/lexers/networkd.lua b/lua/lexers/networkd.lua
index e98106de..cadc52a4 100644
--- a/lua/lexers/networkd.lua
+++ b/lua/lexers/networkd.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('networkd', {lex_by_line = true})
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
-- Boolean values.
diff --git a/lua/lexers/nim.lua b/lua/lexers/nim.lua
index 20c4c361..01678303 100644
--- a/lua/lexers/nim.lua
+++ b/lua/lexers/nim.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('nim', {fold_by_indentation = true})
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match({
'addr', 'and', 'as', 'asm', 'atomic', 'bind', 'block', 'break', 'case', 'cast', 'const',
diff --git a/lua/lexers/nsis.lua b/lua/lexers/nsis.lua
index 47f2414d..e5d0013d 100644
--- a/lua/lexers/nsis.lua
+++ b/lua/lexers/nsis.lua
@@ -8,9 +8,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('nsis')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Comments (4.1).
local line_comment = lexer.to_eol(S(';#'))
local block_comment = lexer.range('/*', '*/')
diff --git a/lua/lexers/objective_c.lua b/lua/lexers/objective_c.lua
index b1b6f71c..a99c2a19 100644
--- a/lua/lexers/objective_c.lua
+++ b/lua/lexers/objective_c.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('objective_c')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
-- From C.
diff --git a/lua/lexers/pascal.lua b/lua/lexers/pascal.lua
index db141c45..59302d65 100644
--- a/lua/lexers/pascal.lua
+++ b/lua/lexers/pascal.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('pascal')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match({
'and', 'array', 'as', 'at', 'asm', 'begin', 'case', 'class', 'const', 'constructor', 'destructor',
diff --git a/lua/lexers/pike.lua b/lua/lexers/pike.lua
index c3bf1029..b3e33551 100644
--- a/lua/lexers/pike.lua
+++ b/lua/lexers/pike.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('pike')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'break', 'case', 'catch', 'continue', 'default', 'do', 'else', 'for', 'foreach', 'gauge', 'if',
diff --git a/lua/lexers/pkgbuild.lua b/lua/lexers/pkgbuild.lua
index 05bea6c2..0bc162cb 100644
--- a/lua/lexers/pkgbuild.lua
+++ b/lua/lexers/pkgbuild.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('pkgbuild')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Comments.
lex:add_rule('comment', token(lexer.COMMENT, lexer.to_eol('#')))
diff --git a/lua/lexers/pony.lua b/lua/lexers/pony.lua
index 5f65fa07..61c06d59 100644
--- a/lua/lexers/pony.lua
+++ b/lua/lexers/pony.lua
@@ -7,10 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('pony')
--- Whitespace.
-local ws = token(lexer.WHITESPACE, lexer.space^1)
-lex:add_rule('whitespace', ws)
-
-- Capabilities.
local capability = token(lexer.LABEL, word_match('box iso ref tag trn val'))
lex:add_rule('capability', capability)
diff --git a/lua/lexers/powershell.lua b/lua/lexers/powershell.lua
index d1dd8e1b..d1dd10eb 100644
--- a/lua/lexers/powershell.lua
+++ b/lua/lexers/powershell.lua
@@ -8,9 +8,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('powershell')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Comments.
lex:add_rule('comment', token(lexer.COMMENT, lexer.to_eol('#')))
diff --git a/lua/lexers/prolog.lua b/lua/lexers/prolog.lua
index 20ac2a25..dc7d9b69 100644
--- a/lua/lexers/prolog.lua
+++ b/lua/lexers/prolog.lua
@@ -75,9 +75,6 @@ lex:add_rule('directive',
token(lexer.WHITESPACE, S(' \t')^0) *
token(lexer.PREPROCESSOR, word_match(directives[dialect])))
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
local zero_arity_keywords = {}
zero_arity_keywords.iso = [[
diff --git a/lua/lexers/protobuf.lua b/lua/lexers/protobuf.lua
index 25007c84..8c72ce1a 100644
--- a/lua/lexers/protobuf.lua
+++ b/lua/lexers/protobuf.lua
@@ -8,9 +8,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('protobuf')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'contained', 'syntax', 'import', 'option', 'package', 'message', 'group', 'oneof', 'optional',
diff --git a/lua/lexers/ps.lua b/lua/lexers/ps.lua
index 17d93a19..14f563f4 100644
--- a/lua/lexers/ps.lua
+++ b/lua/lexers/ps.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('ps')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'pop', 'exch', 'dup', 'copy', 'roll', 'clear', 'count', 'mark', 'cleartomark', 'counttomark',
diff --git a/lua/lexers/pure.lua b/lua/lexers/pure.lua
index c31f8f24..e21946cf 100644
--- a/lua/lexers/pure.lua
+++ b/lua/lexers/pure.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('pure')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'namespace', 'with', 'end', 'using', 'interface', 'extern', 'let', 'const', 'def', 'type',
diff --git a/lua/lexers/rc.lua b/lua/lexers/rc.lua
index a42efbb9..6498b395 100644
--- a/lua/lexers/rc.lua
+++ b/lua/lexers/rc.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('rc')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'for', 'in', 'while', 'if', 'not', 'switch', 'case', 'fn', 'builtin', 'cd', 'eval', 'exec',
diff --git a/lua/lexers/reason.lua b/lua/lexers/reason.lua
index 1852e72e..a96004a0 100644
--- a/lua/lexers/reason.lua
+++ b/lua/lexers/reason.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('reason')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'and', 'as', 'asr', 'begin', 'class', 'closed', 'constraint', 'do', 'done', 'downto', 'else',
diff --git a/lua/lexers/rebol.lua b/lua/lexers/rebol.lua
index c2e1e3ba..bfce206d 100644
--- a/lua/lexers/rebol.lua
+++ b/lua/lexers/rebol.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('rebol')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Comments.
local line_comment = lexer.to_eol(';')
local block_comment = 'comment' * P(' ')^-1 * lexer.range('{', '}')
diff --git a/lua/lexers/rexx.lua b/lua/lexers/rexx.lua
index d60954d1..54a74ccc 100644
--- a/lua/lexers/rexx.lua
+++ b/lua/lexers/rexx.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('rexx')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match({
'address', 'arg', 'by', 'call', 'class', 'do', 'drop', 'else', 'end', 'exit', 'expose', 'forever',
diff --git a/lua/lexers/routeros.lua b/lua/lexers/routeros.lua
index 8b199f9b..ca26b738 100644
--- a/lua/lexers/routeros.lua
+++ b/lua/lexers/routeros.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('routeros')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
-- Control.
diff --git a/lua/lexers/rpmspec.lua b/lua/lexers/rpmspec.lua
index c86df627..8712e865 100644
--- a/lua/lexers/rpmspec.lua
+++ b/lua/lexers/rpmspec.lua
@@ -6,9 +6,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('rpmspec')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Comments.
lex:add_rule('comment', token(lexer.COMMENT, lexer.to_eol('#')))
diff --git a/lua/lexers/rstats.lua b/lua/lexers/rstats.lua
index 2ea8097a..2077525f 100644
--- a/lua/lexers/rstats.lua
+++ b/lua/lexers/rstats.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('rstats')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'break', 'else', 'for', 'if', 'in', 'next', 'repeat', 'return', 'switch', 'try', 'while', --
diff --git a/lua/lexers/scala.lua b/lua/lexers/scala.lua
index 6f77ab86..27005cc3 100644
--- a/lua/lexers/scala.lua
+++ b/lua/lexers/scala.lua
@@ -7,10 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('scala')
--- Whitespace.
-local ws = token(lexer.WHITESPACE, lexer.space^1)
-lex:add_rule('whitespace', ws)
-
-- Classes.
lex:add_rule('class', token(lexer.KEYWORD, 'class') * ws^1 * token(lexer.CLASS, lexer.word))
diff --git a/lua/lexers/scheme.lua b/lua/lexers/scheme.lua
index 4ccb06df..d2fecd7a 100644
--- a/lua/lexers/scheme.lua
+++ b/lua/lexers/scheme.lua
@@ -8,9 +8,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('scheme')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'and', 'or', 'not', 'else',
diff --git a/lua/lexers/smalltalk.lua b/lua/lexers/smalltalk.lua
index 2e04fb6b..0543e686 100644
--- a/lua/lexers/smalltalk.lua
+++ b/lua/lexers/smalltalk.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('smalltalk')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match(
'true false nil self super isNil not Smalltalk Transcript')))
diff --git a/lua/lexers/sml.lua b/lua/lexers/sml.lua
index 4d72f6df..233cdb5d 100644
--- a/lua/lexers/sml.lua
+++ b/lua/lexers/sml.lua
@@ -7,10 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('sml')
--- Whitespace.
-local ws = token(lexer.WHITESPACE, lexer.space^1)
-lex:add_rule('whitespace', ws)
-
-- Structures.
local id = (lexer.alnum + "'" + '_')^0
local aid = lexer.alpha * id
diff --git a/lua/lexers/snobol4.lua b/lua/lexers/snobol4.lua
index 3f79fe96..04c2be6d 100644
--- a/lua/lexers/snobol4.lua
+++ b/lua/lexers/snobol4.lua
@@ -8,9 +8,6 @@ local B, P, S = lpeg.B, lpeg.P, lpeg.S
local lex = lexer.new('snobol4')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match({
'ABORT', 'ARRAY', 'CONTINUE', 'DEFINE', 'END', 'FRETURN', 'INPUT', 'NRETURN', 'OUTPUT', 'PUNCH',
diff --git a/lua/lexers/spin.lua b/lua/lexers/spin.lua
index 3a1d1c83..4508f22b 100644
--- a/lua/lexers/spin.lua
+++ b/lua/lexers/spin.lua
@@ -7,9 +7,6 @@ local P, R, S = lpeg.P, lpeg.R, lpeg.S
local lex = lexer.new('spin')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'_clkfreq', '_clkmode', '_free', '_stack', '_xinfreq', 'abort', 'abs', 'absneg', 'add', 'addabs',
diff --git a/lua/lexers/sql.lua b/lua/lexers/sql.lua
index 0652bbec..8e65f6ec 100644
--- a/lua/lexers/sql.lua
+++ b/lua/lexers/sql.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('sql')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match({
'add', 'all', 'alter', 'analyze', 'and', 'as', 'asc', 'asensitive', 'before', 'between', 'bigint',
diff --git a/lua/lexers/systemd.lua b/lua/lexers/systemd.lua
index 5fed9b72..1e358678 100644
--- a/lua/lexers/systemd.lua
+++ b/lua/lexers/systemd.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('systemd', {lex_by_line = true})
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
-- Boolean values.
diff --git a/lua/lexers/tcl.lua b/lua/lexers/tcl.lua
index 3ec38d72..a01565c4 100644
--- a/lua/lexers/tcl.lua
+++ b/lua/lexers/tcl.lua
@@ -9,9 +9,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('tcl')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Comment.
lex:add_rule('comment', token(lexer.COMMENT, lexer.to_eol('#' * P(function(input, index)
local i = index - 2
diff --git a/lua/lexers/vala.lua b/lua/lexers/vala.lua
index 8f8d7514..6a9ac905 100644
--- a/lua/lexers/vala.lua
+++ b/lua/lexers/vala.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('vala')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'class', 'delegate', 'enum', 'errordomain', 'interface', 'namespace', 'signal', 'struct', 'using',
diff --git a/lua/lexers/vcard.lua b/lua/lexers/vcard.lua
index 2ee82baf..a040b281 100644
--- a/lua/lexers/vcard.lua
+++ b/lua/lexers/vcard.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('vcard')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Begin vCard, end vCard.
lex:add_rule('begin_sequence', token(lexer.KEYWORD, 'BEGIN') * token(lexer.OPERATOR, ':') *
token(lexer.COMMENT, 'VCARD'))
diff --git a/lua/lexers/verilog.lua b/lua/lexers/verilog.lua
index e8ffac1c..db218750 100644
--- a/lua/lexers/verilog.lua
+++ b/lua/lexers/verilog.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('verilog')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'always', 'assign', 'begin', 'case', 'casex', 'casez', 'default', 'deassign', 'disable', 'else',
diff --git a/lua/lexers/vhdl.lua b/lua/lexers/vhdl.lua
index af5f9aa5..e45ada07 100644
--- a/lua/lexers/vhdl.lua
+++ b/lua/lexers/vhdl.lua
@@ -7,9 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('vhdl')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'access', 'after', 'alias', 'all', 'architecture', 'array', 'assert', 'attribute', 'begin',
diff --git a/lua/lexers/xs.lua b/lua/lexers/xs.lua
index a08e3cf4..86ca77f6 100644
--- a/lua/lexers/xs.lua
+++ b/lua/lexers/xs.lua
@@ -8,9 +8,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('xs')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
'access', 'alias', 'catch', 'cd', 'dirs', 'echo', 'else', 'escape', 'eval', 'exec', 'exit',
diff --git a/lua/lexers/xtend.lua b/lua/lexers/xtend.lua
index 01385ee9..0166ceb0 100644
--- a/lua/lexers/xtend.lua
+++ b/lua/lexers/xtend.lua
@@ -7,10 +7,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('xtend')
--- Whitespace.
-local ws = token(lexer.WHITESPACE, lexer.space^1)
-lex:add_rule('whitespace', ws)
-
-- Classes.
lex:add_rule('class', token(lexer.KEYWORD, 'class') * ws^1 * token(lexer.CLASS, lexer.word))
diff --git a/lua/lexers/zig.lua b/lua/lexers/zig.lua
index 0a4499d6..a53eb3a2 100644
--- a/lua/lexers/zig.lua
+++ b/lua/lexers/zig.lua
@@ -8,9 +8,6 @@ local P, S = lpeg.P, lpeg.S
local lex = lexer.new('zig')
--- Whitespace.
-lex:add_rule('whitespace', token(lexer.WHITESPACE, lexer.space^1))
-
-- Keywords.
lex:add_rule('keyword', token(lexer.KEYWORD, word_match{
-- Keywords.
--
2.41.0
I don't think lex:add_style() is the issue. The problem is that the highlighted code is not valid Lua. I'm not sure what the intention was, but it is probably something like:

    lexer.styles.embedded.eolfilled = true
    lex:add_style('literal_block', lexer.styles.embedded)
    lexer.styles.embedded.eolfilled = false
@orbitalquark will have to confirm since they introduced the code in 0649455. Matěj is credited but those lines were not in the original patch (#71).
@rnpnr lex:add_style('literal_block', lexer.styles.embedded .. {eolfilled = true}) is valid Lua code. Styles have a metatable that allows table concatenation to update fields: https://github.com/orbitalquark/scintillua/blob/028953206432b8572b309224134ccebfdec61e31/lexers/lexer.lua#L1926-L1928.
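To make the mechanism concrete, here is a minimal self-contained sketch of that legacy shim (a reconstruction for illustration, not the actual lexer.lua source):

```lua
-- Minimal sketch of the legacy styles shim: indexing any style name
-- returns a throwaway table whose __concat metamethod swallows a
-- suffix like `.. {eolfilled = true}`, so legacy add_style() calls
-- still parse and run, while styling itself is a no-op.
local styles = setmetatable({}, {
  __index = function()
    return setmetatable({}, {__concat = function() return nil end})
  end
})

-- Evaluates without error; the concatenation simply yields nil.
local style = styles.embedded .. {eolfilled = true}
assert(style == nil)
```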
BTW, is it correct that we could eliminate all those lines because that behavior is now the default? Why are they still in this repository?
There is an open issue to migrate legacy lexers: https://github.com/orbitalquark/scintillua/issues/76. Legacy lexers are still valid (though legacy features like style settings may not have an effect).
Styles have a metatable that allows table concatenation to update fields:
Ahhh, my bad. But that doesn't really explain this issue then. Perhaps you still have some legacy lexer.lua hanging around, @mcepl?
I'm also getting that same error when opening a .rst file. So that metatable isn't really getting applied to the .embedded member. Since the __concat method is just supposed to return nil, can that argument just be directly replaced with nil (or the line removed altogether, since it doesn't seem to do anything in that case)?
Are you redefining lexer.styles by accident? As written in lexer.lua, lexer.styles.embedded will trigger the __index metamethod, which should return an empty table with a __concat metamethod that should get triggered by the following .. operator.
You know what, we were adding members to styles. Since that's no longer useful, I can fix our code. While this is definitely our fault, something like this would avoid the issue:
--- a/lexers/lexer.lua
+++ b/lexers/lexer.lua
@@ -1918,7 +1918,8 @@ end
M.colors = {} -- legacy
M.styles = setmetatable({}, { -- legacy
- __index = function() return setmetatable({}, {__concat = function() return nil end}) end
+ __index = function() return setmetatable({}, {__concat = function() return nil end}) end,
+ __newindex = function() return end
})
M.property_expanded = setmetatable({}, {__index = function() return '' end}) -- legacy
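For what it's worth, the failure mode and the proposed fix can both be reproduced in isolation (a sketch mirroring the shim's shape, not the actual lexer.lua code):

```lua
-- Why writing into the legacy `styles` table breaks the shim: the
-- assignment raw-stores a plain table, so later reads bypass __index
-- and the stored table has no __concat metamethod.
local styles = setmetatable({}, {
  __index = function()
    return setmetatable({}, {__concat = function() return nil end})
  end
})

styles.embedded = {} -- user code adds a member
local ok = pcall(function()
  return styles.embedded .. {eolfilled = true} -- errors: plain table
end)
assert(not ok)

-- The proposed fix: a __newindex that discards all writes, so every
-- lookup keeps going through the __index path.
local safe = setmetatable({}, {
  __index = function()
    return setmetatable({}, {__concat = function() return nil end})
  end,
  __newindex = function() end
})
safe.embedded = {} -- silently ignored
assert((safe.embedded .. {bold = true}) == nil)
```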
Styles have a metatable that allows table concatenation to update fields:
Ahhh, my bad. But that doesn't really explain this issue then. Perhaps you still have some legacy lexer.lua hanging around, @mcepl?
I am almost certain that I have not: I have even run find / -name lexer.lua to make sure.
You know what, we were adding members to styles. Since that's no longer useful, I can fix our code. While this is definitely our fault, something like this would avoid the issue:

--- a/lexers/lexer.lua
+++ b/lexers/lexer.lua
@@ -1918,7 +1918,8 @@ end
 M.colors = {} -- legacy
 M.styles = setmetatable({}, { -- legacy
-  __index = function() return setmetatable({}, {__concat = function() return nil end}) end
+  __index = function() return setmetatable({}, {__concat = function() return nil end}) end,
+  __newindex = function() return end
 })
 M.property_expanded = setmetatable({}, {__index = function() return '' end}) -- legacy
That's a good idea. Committed via https://github.com/orbitalquark/scintillua/commit/e88bbcfecae46b48b79d8156ea7129411b5c847d
One more question:
tumbleweed-pkg/u/s/v/lexers$ grep 'add_style' rest.lua
lex:add_style('literal_block', lexer.styles.embedded .. {eolfilled = true})
lex:add_style('footnote_block', lexer.styles.label)
lex:add_style('citation_block', lexer.styles.label)
lex:add_style('link_block', lexer.styles.label)
lex:add_style('code_block', lexer.styles.embedded .. {eolfilled = true})
lex:add_style('directive', lexer.styles.keyword)
lex:add_style('sphinx_directive', lexer.styles.keyword .. {bold = true})
lex:add_style('unknown_directive', lexer.styles.keyword .. {italics = true})
lex:add_style('substitution', lexer.styles.variable)
lex:add_style('inline_literal', lexer.styles.embedded)
lex:add_style('role', lexer.styles.class)
lex:add_style('interpreted', lexer.styles.string)
tumbleweed-pkg/u/s/v/lexers$
Ignoring those {eolfilled = true} and {bold = true}, I have just another question … would it be possible to have some kind of alias (in the lexer file), so that I don't have to rename all those role, interpreted, etc. to the completely nonsensical (in the context of a ReST file) class and string ones?
I'm not familiar with ReST, so take this with a grain of salt, but the goal with the new lexers is to be more agnostic about theming. When migrating a legacy lexer, the goal is to minimize the number of lexer-specific tags. So if an existing tag is good enough, that is preferred. For places where it absolutely must be different from existing tags, you can do something like in the diff lexer:

But this is not ideal, since anyone who wants to use this lexer to (for example) add styling to an editor will need to specifically style the non-standard tags. This is what I understand from migrating a couple of lexers and also updating the default themes in vis. It's a bit more annoying for people using Scintillua to provide source code highlighting, but it makes more sense for Scintillua to simply be a general-purpose lexer.
Is it possible to call add_rule multiple times with the result that the different patterns would be connected with an OR operator? I.e., is this correct: https://paste.opensuse.org/pastes/7894264590df (search for the commented-out calls to lex:add_style())?
Rules are matched in the order they are added. While you can technically use the same rule ID, it will not "OR" it with the previously added rule of that name, and any lexers derived from yours that call modify_rule() with your duplicate ID will not work as expected. It's highly recommended to use unique rule IDs.
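If the goal is simply to have one rule match several alternative patterns, the usual approach is to combine them with lpeg's ordered-choice + operator and register the result once under a single unique ID. A sketch, with hypothetical pattern names, assuming the Scintillua lexer module is on the path:

```lua
-- Hypothetical fragment: combine alternatives into one rule instead
-- of calling add_rule() twice with the same ID.
local lexer = require('lexer')
local token, word_match = lexer.token, lexer.word_match

local lex = lexer.new('example')
local plain_kw = word_match('foo bar')
local extra_kw = word_match('baz qux')
-- lpeg's `+` is ordered choice: the single rule matches either set.
lex:add_rule('keyword', token(lexer.KEYWORD, plain_kw + extra_kw))
```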
On Tue Sep 26, 2023 at 5:33 PM CEST, orbitalquark wrote:
Rules are matched in the order they are added. While you can technically use the same rule ID, it will not "OR" it with an ID of the previous name, and any lexers derived from yours that calls modify_rule() with your duplicate ID will not work as expected. It's highly recommended to use unique rule IDs.
OK, then I am completely confused. Using unique IDs is exactly what got us into this mess: unique IDs begat many more unique styles and quadratic growth in the number of changes needed in all styles.
What am I missing?
I do not follow vis and its development, so I don't know what "mess" you are referring to. Scintillua has always relied on unique rule names. https://orbitalquark.github.io/scintillua/api.html#rules provides a pretty simple (IMO) overview of rules and how they are matched.
I do not follow vis and its development, so I don't know what "mess" you are referring to. Scintillua has always relied on unique rule names. https://orbitalquark.github.io/scintillua/api.html#rules provides a pretty simple (IMO) overview of rules and how they are matched.
How well does your editor (whatever it is, as long as it is based on Scintillua) work with rST files? Do you have all those styles (e.g., footnote_block) in each of its stylesheets?
This probably could be closed, or it is obsolete.
https://github.com/orbitalquark/scintillua/blob/0a8c23d5229eaa4942facaefa5e48e017d0d4636/lexers/rest.lua#L23
@rnpnr This is another blocker (at least for me) for the rebase to scintillua_6.2.