Investigate accepting a stream of tokens as input #202
Comments
… external dependency. Lots of TODOs, somewhat thin testing, but the basics work. See #202.
detail::make_input_subrange() out into its own header for reuse. See #202.
advancing past the end of the cache. Filter out whitespace tokens entirely. Make tokens_view noncopyable+nonmovable. See #202.
…xpected sequence of tokens. See #202.
… code to use a bool and a double as its value, respectively, instead of a string_view. See #202.
…ation. There's room for it, since it's in a union with a string_view anyway. See #202.
… and parser_interface. token_spec is now a variable template that generates a parser_interface wrapping a token_parser, which is parameterized on the token_spec_t. This way, a single token_spec can be used to specify both how to lex and how to parse. See #202.
…s that produce string_views. See #202.
…hpp is now required to come before parser.hpp. See #202.
…n the range's value type is a specialization of token. See #202.
underlying sequence, and change the way that the error handler is invoked, so that it detects token iterators, and passes iterators into the underlying range to the error handler, instead of the token iterators. See #202.
…ror, to ensure that the message is preserved exactly when translating exceptions in failed lexer parsing. See #202.
… for the full set of *parse() functions that support token parsing; fix errors. See #202.
…f the character parsers out into a standalone detail:: function; use it in all the character parsers. See #202.
…fully exercise the token parsing code. See #202.
exercise the token parsing code; fix errors. See #202.
…cate which error handling functions apply this function to their inputs and which do not. See #202.
…lue, and doing transcoding for the match in detail::token_with_string_view. See #202.
Ok, this stuff is pretty much ready to go. I say "pretty much" only because I don't want to merge it quite yet, since I don't really have a big enough use case for it to really try it out. @jefftrull, that's where you come in. :) Please give it a try and tell me what's under-documented, hard to use, etc. You can find all the code on the ctre branch, and the pre-built updated docs can be found on the ctre_doc branch (just point your browser at …).
Things are a little crazy right now, but I'm very excited to try this. Thanks for taking on this big feature!
No worries! The branch isn't going anywhere.
Not sure exactly how this should work -- should it take some abstract notion of a token, a lexertl token, something I create, a CTRE-style thing, or what?
But it definitely deserves at least some investigation. Some parsers are very hard to write (or even ambiguous) without lexing first.