Whitespace token modernization - a* lexers - regarding #1905 (#1914)
Anteru merged 7 commits into pygments:master
Conversation
Further playing with the automation lexer wasn't very fruitful... I tried to detect the labels correctly for the following snippet:

```
#p::
Run, https://www.pygments.org/
return
^t::
Run, calc.exe
return
:*:pyg::pyg
::pygmentize::pygmentize
```

This snippet results in differing tokens as well. Anyhow, I am unable to find a quick solution. Rethinking this lexer would require some time... I'd drop the respective commit, stashing it for later. Maybe I'll have a better understanding of pygments' code by then - or someone else will address this...
Sounds like a plan. I'd rather have this noted as an open issue than introduce this kind of change during a cleanup -- thanks for investigating this!
Merged, thanks a lot!
This PR is a chunk of the effort (#1905) to insert the Whitespace token wherever it applies.
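The gist of that effort can be sketched without pygments itself. In the toy tokenizer below (a hypothetical stand-in for a pygments `RegexLexer`, not the actual lexer code), the "before" rule table lumps whitespace into a generic `Text` token, while the "after" table gives whitespace runs their own `Whitespace` token type, mirroring the kind of rule change this PR applies:

```python
import re

# Hypothetical sketch: the token names mirror pygments' Text/Whitespace
# types, but the tokenizer itself is a minimal stand-in, not pygments.
RULES_OLD = [(re.compile(r'\s+'), 'Text'), (re.compile(r'\S+'), 'Text')]
RULES_NEW = [(re.compile(r'\s+'), 'Whitespace'), (re.compile(r'\S+'), 'Text')]

def tokenize(code, rules):
    """Scan left to right, emitting the first rule that matches."""
    pos, tokens = 0, []
    while pos < len(code):
        for pattern, toktype in rules:
            m = pattern.match(code, pos)
            if m:
                tokens.append((toktype, m.group()))
                pos = m.end()
                break
    return tokens

# With the new rules, whitespace is distinguishable from other text:
print(tokenize('Run, calc.exe\n', RULES_NEW))
# → [('Text', 'Run,'), ('Whitespace', ' '), ('Text', 'calc.exe'), ('Whitespace', '\n')]
```

Distinguishing whitespace this way lets formatters and style sheets treat it separately from ordinary text, which is the motivation behind #1905.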
The automation lexer also received a minor fix for multiline comments (making the multiline-comment-content regex greedy).
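The PR doesn't reproduce the regex in question, but the general effect of such a change can be illustrated with a hedged, stdlib-only sketch: inside a multiline-comment state, matching the body one character at a time emits a token per character, whereas a greedy repetition of the same character class emits one token per run of content (the character class below is illustrative, not the lexer's actual pattern):

```python
import re

# Illustrative patterns, not the automation lexer's real rules:
PER_CHAR = re.compile(r'[^*/]')   # matches a single comment-body character
GREEDY = re.compile(r'[^*/]+')    # same class with greedy repetition

body = 'a multiline\ncomment body '
print(len(PER_CHAR.findall(body)))  # one match per character
print(len(GREEDY.findall(body)))    # a single match covering the whole run
```

Fewer, longer matches mean fewer tokens and less per-character overhead, which is why greedy repetition is generally preferred for content rules like this.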