-
A lexer (also called a tokenizer) is the first step of the whole pipeline. My Lexer class takes the text/user input (plus the filename in my case, for context when throwing errors) and goes through every single character, checking whether it matches any of the characters I test for in if statements. If it matches, say on a '+', it appends a new token to a list (mine is called tokens) using the Token class, which you can check out too. A Token takes a token type and a value, so here the type is PLUS and the value is '+'. All my types are stored in consts.py, and then the tokens get passed to the parser.
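The description above can be sketched roughly like this. Only the Token class, the tokens list, the PLUS type, and consts.py come from the description; the Lexer method names, the other token types, and the digit handling are illustrative guesses:

```python
# Token types -- in the described project these constants live in consts.py.
PLUS = "PLUS"
MINUS = "MINUS"
INT = "INT"

class Token:
    """A token type (e.g. PLUS) paired with the matched text (e.g. '+')."""
    def __init__(self, type_, value):
        self.type = type_
        self.value = value

    def __repr__(self):
        return f"Token({self.type}, {self.value!r})"

class Lexer:
    def __init__(self, text, filename="<stdin>"):
        self.text = text
        self.filename = filename  # kept around so error messages have context
        self.pos = 0
        self.tokens = []

    def tokenize(self):
        # Walk the input one character at a time, appending tokens as we match.
        while self.pos < len(self.text):
            ch = self.text[self.pos]
            if ch.isspace():
                self.pos += 1
            elif ch == "+":
                self.tokens.append(Token(PLUS, "+"))
                self.pos += 1
            elif ch == "-":
                self.tokens.append(Token(MINUS, "-"))
                self.pos += 1
            elif ch.isdigit():
                # Greedily consume a run of digits into a single INT token.
                start = self.pos
                while self.pos < len(self.text) and self.text[self.pos].isdigit():
                    self.pos += 1
                self.tokens.append(Token(INT, self.text[start:self.pos]))
            else:
                raise SyntaxError(f"{self.filename}: unexpected character {ch!r}")
        return self.tokens
```

For example, `Lexer("1 + 2").tokenize()` yields an INT, a PLUS, and another INT token, which is the list a parser would then consume.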
-
Lark
Lark is a parsing toolkit for Python, built with a focus on ergonomics, performance and modularity.
There's also Lark, which is used by a plethora of projects (I haven't used it myself, but I heard about PreQL on a podcast where they talked for a bit about what it's like to develop a new language in Lark)