Package lepl :: Package lexer :: Package lines :: Module lexer :: Class _OffsideLexer

Class _OffsideLexer



An alternative lexer that adds LineStart and LineEnd tokens.

Note that, because of the extended argument list, this must be used in the config via make_offside_lexer() (although in normal use it is supplied simply by calling config.lines(), so you do not need to refer to this class at all).

Instance Methods
__init__(self, matcher, tokens, alphabet, discard, t_regexp=None, s_regexp=None, tabsize=8, blocks=False)
matcher is the head of the original matcher graph, which will be called with a tokenised stream.
_tokens(self, stream, max)
Generate tokens, on demand.

Inherited from lexer.Lexer: token_for_id

Inherited from lexer.Lexer (private): _match

Inherited from support.context.NamespaceMixin (private): _lookup

Inherited from matchers.support.BaseMatcher: __repr__, __str__, clone, kargs, tree, tree_repr

Inherited from support.graph.ArgAsAttributeMixin: __iter__

Inherited from support.graph.PostorderWalkerMixin: postorder

Inherited from object: __delattr__, __format__, __getattribute__, __hash__, __new__, __reduce__, __reduce_ex__, __setattr__, __sizeof__, __subclasshook__

Inherited from matchers.matcher.Matcher: indented_repr

Properties

Inherited from object: __class__

Method Details

__init__(self, matcher, tokens, alphabet, discard, t_regexp=None, s_regexp=None, tabsize=8, blocks=False)
(Constructor)

matcher is the head of the original matcher graph, which will be called with a tokenised stream.

tokens is the set of Token instances that define the lexer.

alphabet is the alphabet for which the regexps are defined.

discard is the regular expression for spaces (which are silently dropped if no token can be matched).

t_regexp and s_regexp are internally compiled state, used in cloning, and should not be provided by non-cloning callers.

Overrides: matchers.matcher.Matcher.__init__

_tokens(self, stream, max)

Generate tokens, on demand.
Overrides: lexer.Lexer._tokens
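The on-demand behaviour can be pictured with a small, purely illustrative generator. This is not LEPL's implementation: the function name, the (tag, value) tuples, and the word-splitting are invented for the sketch; it only shows the general shape of a lexer that lazily brackets each physical line with LineStart and LineEnd markers, with LineStart carrying the indentation (after tab expansion, as the tabsize argument suggests).

```python
def offside_tokens(text, tabsize=8):
    """Lazily yield (tag, value) tokens, bracketing each line with
    LineStart/LineEnd markers (hypothetical sketch, not LEPL code)."""
    for line in text.splitlines():
        expanded = line.expandtabs(tabsize)
        # Indentation is measured after tabs are expanded to spaces.
        indent = len(expanded) - len(expanded.lstrip(' '))
        yield ('LineStart', indent)
        for word in expanded.split():
            yield ('Word', word)
        yield ('LineEnd', None)
```

Because this is a generator, no token is produced until the consumer asks for it, which is the "on demand" property the real _tokens method provides to the tokenised stream.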