Module lepl.lexer.matchers

Generate and match a stream of tokens that are identified by regular expressions.
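
For example (a minimal sketch, assuming the usual LEPL Token API; the token names are illustrative only):

    from lepl import Token

    value = Token(r'\d+')         # a token with a user-specified regexp
    symbol = Token(r'[+\-*/]')

    # Tokens are matched by a lexer pass, so whitespace between
    # tokens is discarded by the default configuration.
    expr = value & symbol & value
    print(expr.parse('1 + 2'))    # -> ['1', '+', '2']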
Classes
  BaseToken
      Introduce a token that will be recognised by the lexer.
  Token
      A token with a user-specified regexp (see the specialisation sketch after this list).
  EmptyToken
      A token that cannot be specialised, and that returns nothing.
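
A Token instance can be specialised by calling it with a pattern that the token's contents must also match (a minimal sketch following the LEPL tutorial style; the names are illustrative):

    from lepl import Token

    symbol = Token(r'[^0-9a-zA-Z \t\r\n]')
    # Calling the token restricts its contents: each specialised
    # matcher accepts the symbol token only for the given literal.
    plus = symbol('+')
    minus = symbol('-')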
Functions
 
RestrictTokensBy(*tokens)
    A matcher factory that generates a new matcher that will transform the stream passed to its arguments so that they do not see the given tokens.
Variables
  NonToken = ABCMeta('NonToken', (object,), {})
      ABC used to identify matchers that actually consume from the stream.
Function Details

RestrictTokensBy(*tokens)

A matcher factory that generates a new matcher that will transform the stream passed to its arguments so that they do not see the given tokens.

So, for example:

    MyFactory = RestrictTokensBy(A(), B())
    RestrictedC = MyFactory(C())

will create a matcher, RestrictedC, that is like C, but which will not see the tokens matched by A and B.

In other words, this filters tokens from the input.
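
A slightly fuller sketch (the comment and number tokens here are hypothetical, not part of this module):

    from lepl import Token
    from lepl.lexer.matchers import RestrictTokensBy

    comment = Token(r'#[^\n]*')
    number = Token(r'\d+')

    # Matchers wrapped by NoComments never see comment tokens,
    # so comments are silently skipped in the filtered stream.
    NoComments = RestrictTokensBy(comment)
    numbers = NoComments(number[1:])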


Variables Details

NonToken

ABC used to identify matchers that actually consume from the stream. These are the "leaf" matchers that "do the real work"; they cannot be used at the same level as Tokens, but must be embedded inside them.

This is a purely informative interface used, for example, to generate warnings for the user. Not implementing this interface will not block any functionality.
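
Since NonToken is a plain ABC created with ABCMeta, a matcher class can be marked by registration rather than inheritance (a minimal sketch; MyLeafMatcher is hypothetical):

    from abc import ABCMeta

    NonToken = ABCMeta('NonToken', (object,), {})

    class MyLeafMatcher:
        '''A hypothetical matcher that consumes directly from the stream.'''

    # Registration is purely advisory: it lets tools detect the
    # marker and warn the user, but changes no behaviour.
    NonToken.register(MyLeafMatcher)
    assert issubclass(MyLeafMatcher, NonToken)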

Value:
ABCMeta('NonToken', (object,), {})