Age | Commit message | Author
---|---|---
2016-04-25 | Implement endAtWhitespace flag which tells TokenizedDataReader to stop reading data after the first whitespace character | Andreas Stöckel
2016-04-25 | Change the way indent and dedent special tokens are produced by TokenizedData: move dedent to the end of the previous line, leave indent at the first character of the current line, and emit dedent as many times as indent | Andreas Stöckel
2015-03-01 | Prefer longer non-primary tokens | Andreas Stöckel
2015-03-01 | Allow storing gaps in SourceOffsetVector and fix a bug where trim did not reset offsets correctly when the new length is zero | Andreas Stöckel
2015-02-28 | Add test case for data being empty if a token is found | Andreas Stöckel
2015-02-26 | Reactivated TokenizerTest | Andreas Stöckel
2015-02-26 | Moved "assert" functions to own header | Andreas Stöckel
2015-02-25 | Start of branch; commit log will be rewritten | Andreas Stöckel
2015-02-22 | Adapted old Tokenizer infrastructure to new Tokens.hpp | Andreas Stöckel
2015-02-22 | Implemented TokenizedData, a facility for storing data together with tokens, where tokens can be dynamically enabled and the whitespace mode specified at the moment the tokens are read | Andreas Stöckel
2015-02-22 | Implemented SourceOffsetVector -- a class for storing the SourceOffset for each character in a sequence in a fairly efficient manner | Andreas Stöckel
2015-02-15 | Moved TokenTrieTest to new directory | Andreas Stöckel
2015-02-14 | Moved DynamicTokenizer and TokenTrie to parser/utils | Andreas Stöckel