Module lepl.lexer.stream

Class TokenHelper


This wraps a sequence of values generated by the lexer. The sequence is a source of (tokens, stream) pairs, where each stream was generated from the source.

It follows that the value returned by s_next is also a (tokens, stream) pair. This is interpreted by Token, which forwards the stream to sub-matchers.

The implementation is broadly similar to IterableHelper in that we use a Cons-based linked list, so that data already consumed can be garbage-collected. However, instead of a "line" of data, each node again contains a (tokens, stream) pair, and there is no need to store the line_stream explicitly in the state.
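To make the Cons-based approach concrete, here is a minimal sketch (not LEPL's actual implementation) of a lazily expanded linked list over (tokens, stream) pairs. The `Cons` class and the sample pairs below are illustrative assumptions; the key point is that each node holds a (tokens, stream) pair, the tail is only built on demand, and nodes no longer referenced can be reclaimed.

```python
class Cons:
    """A lazily expanded linked-list node over an iterator of
    (tokens, stream) pairs (a sketch, not LEPL's actual Cons)."""

    __slots__ = ('head', '_source', '_tail')

    def __init__(self, source):
        self.head = next(source)  # the (tokens, stream) pair at this node
        self._source = source
        self._tail = None

    @property
    def tail(self):
        # Expand the next node only when it is first requested; earlier
        # nodes can be garbage-collected once nothing references them.
        if self._tail is None:
            self._tail = Cons(self._source)
        return self._tail

# Hypothetical lexer output: (token names, remaining text) pairs.
pairs = iter([(['Int'], '1 + 2'), (['Sym'], '+ 2'), (['Int'], '2')])
node = Cons(pairs)
print(node.head)       # -> (['Int'], '1 + 2')
print(node.tail.head)  # -> (['Sym'], '+ 2')
```

Because the list is built lazily from the lexer's iterator, memory use tracks how far matching has progressed rather than the whole token sequence.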

Instance Methods
__init__(self, id=None, factory=None, max=None, global_kargs=None, cache_level=None, delta=None, len=None)

key(self, cons, other)

next(self, cons, count=1)

line(self, cons, empty_ok)
    This doesn't have much meaning in terms of tokens, but may be used for some debug output, so it returns something vaguely useful.

len(self, cons)

stream(self, state, value, id_=None)
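As a rough illustration of the next() contract described above, the sketch below advances along a linked list of (tokens, stream) pairs and returns the current value together with the new state. The function name, the eager nested-pair representation, and the sample data are all hypothetical; LEPL's real helper operates on its own Cons nodes.

```python
def next_(cons, count=1):
    """Advance `count` nodes along a linked list of (tokens, stream)
    pairs, returning (value, new_state). A sketch of the s_next
    contract, not LEPL's actual code."""
    value = None
    for _ in range(count):
        value, cons = cons[0], cons[1]
    return value, cons

# A tiny eager list of (head, tail) pairs standing in for Cons nodes.
stream3 = ((['Int'], '2'), None)
stream2 = ((['Sym'], '+ 2'), stream3)
stream1 = ((['Int'], '1 + 2'), stream2)

value, rest = next_(stream1)
print(value)  # -> (['Int'], '1 + 2')
```

The returned value is itself a (tokens, stream) pair, which matches the behaviour of s_next described in the class overview.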