Markup Parsertools
inyoka.markup.parsertools
Useful classes for parsers.
- copyright:
2007-2024 by the Inyoka Team, see AUTHORS for more details.
- license:
BSD, see LICENSE for more details.
- class inyoka.markup.parsertools.MultiMap(sequence)
A special structure used to represent metadata and other data that has multiple values for one key.
- get(key, default=None)
Return the first value for the key, or default if the requested key doesn’t exist.
- pop(k[, d]) → v, remove specified key and return the corresponding value.
If the key is not found, return the default if given; otherwise, raise a KeyError.
- popitem(*args)
Remove and return a (key, value) pair as a 2-tuple.
Pairs are returned in LIFO (last-in, first-out) order. Raises KeyError if the dict is empty.
- popitemlist(*args)
- poplist(*args)
- setdefault(*args)
Insert key with a value of default if key is not in the dictionary.
Return the value for key if key is in the dictionary, else default.
- setlist(*args)
- setlistdefault(*args)
- update([E, ]**F) → None. Update D from dict/iterable E and F.
If E is present and has a .keys() method, then does: for k in E: D[k] = E[k]. If E is present and lacks a .keys() method, then does: for k, v in E: D[k] = v. In either case, this is followed by: for k in F: D[k] = F[k].
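The one-value-per-read, many-values-per-key behaviour described above can be illustrated with a minimal stand-in. This is a hypothetical sketch of the semantics, not the Inyoka implementation (`MiniMultiMap` and its `getlist` helper are names invented for this example):

```python
# Hypothetical sketch of MultiMap semantics: each key maps to a list of
# values, and the scalar accessor returns the first stored value.
class MiniMultiMap:
    def __init__(self, sequence):
        self._data = {}
        for key, value in sequence:
            self._data.setdefault(key, []).append(value)

    def get(self, key, default=None):
        # Return the first value for key, or default if the key is missing.
        values = self._data.get(key)
        return values[0] if values else default

    def getlist(self, key):
        # Return every value stored for key (empty list if missing).
        return list(self._data.get(key, []))

mm = MiniMultiMap([("tag", "python"), ("tag", "wiki"), ("title", "Home")])
```

With this shape, `mm.get("tag")` yields only the first value (`"python"`), while `mm.getlist("tag")` yields both, which is the distinction the `get`/`poplist`/`setlist` method pairs above rely on.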
- class inyoka.markup.parsertools.Token(type, value)
Represents one token.
- _asdict()
Return a new dict which maps field names to their values.
- _field_defaults = {}
- _fields = ('type', 'value')
- classmethod _make(iterable)
Make a new Token object from a sequence or iterable.
- _replace(**kwds)
Return a new Token object replacing specified fields with new values.
- type
Alias for field number 0
- value
Alias for field number 1
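The documented attributes (`_fields`, `_make()`, `_replace()`, and the positional field aliases `type` and `value`) match Python's `collections.namedtuple` interface exactly, so a Token can be modelled as a two-field named tuple. This stand-in is for illustration; the real class lives in `inyoka.markup.parsertools`:

```python
from collections import namedtuple

# A stand-in with the same documented shape as Token.
Token = namedtuple('Token', ('type', 'value'))

tok = Token('text', 'hello')
tok.type                     # 'text'  (alias for field number 0)
tok.value                    # 'hello' (alias for field number 1)
new = tok._replace(value='world')   # Token(type='text', value='world')
eof = Token._make(['eof', None])    # build a Token from any iterable
```

Because named tuples are immutable, `_replace()` returns a new Token rather than mutating the original, which keeps tokens safe to push back onto a stream.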
- class inyoka.markup.parsertools.TokenStream(generator)
A token stream wraps a generator and supports pushing tokens back. It also provides some functions to expect tokens and similar stuff.
- Important note: never push more than one token back to the
stream. Although the stream object won’t stop you from doing so, the behavior is undefined. Multiple pushed tokens are only used internally!
- debug(stream=None)
Displays the tokenized code on the stream provided or stdout.
- property eof
Are we at the end of the tokenstream?
- expect(type, value=None)
Expect a given token.
- classmethod from_tuple_iter(tupleiter)
- look()
Return the next token without consuming it.
- push(token, current=False)
Push a token back to the stream (only one!).
- shift(token)
Push one token into the stream.
- skip(n)
Skip n tokens ahead.
- test(type, value=Ellipsis)
Test the current token.
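A minimal stand-in sketches how a parser typically drives this interface: peek with `look()`, consume with `expect()`, and stop at `eof`. This is an illustration of the documented semantics under assumed behavior, not the Inyoka implementation (`MiniTokenStream` and its `next()` method are names invented here):

```python
from collections import namedtuple

Token = namedtuple('Token', ('type', 'value'))

# Hypothetical stand-in for the TokenStream interface described above.
class MiniTokenStream:
    def __init__(self, generator):
        self._iter = iter(generator)
        self._pushed = None          # at most one pushed-back token
        self.current = self._fetch()

    def _fetch(self):
        if self._pushed is not None:
            token, self._pushed = self._pushed, None
            return token
        return next(self._iter, Token('eof', None))

    @property
    def eof(self):
        # Are we at the end of the token stream?
        return self.current.type == 'eof'

    def look(self):
        # Peek at the next token without consuming the current one.
        ahead = self._fetch()
        self._pushed = ahead
        return ahead

    def next(self):
        token, self.current = self.current, self._fetch()
        return token

    def expect(self, type, value=None):
        # Fail loudly when the current token is not what the parser needs.
        assert self.current.type == type, self.current
        if value is not None:
            assert self.current.value == value, self.current
        return self.next()

stream = MiniTokenStream([Token('text', 'a'), Token('bold', '*')])
```

Note the single `_pushed` slot: it is why pushing more than one token back has undefined behavior in the real class, as the warning above states.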
- class inyoka.markup.parsertools.TokenStreamIterator(stream)
The iterator for tokenstreams. Iterate over the stream until the eof token is reached.
- inyoka.markup.parsertools._unpickle_multimap(d)
Helper that recreates a MultiMap when unpickling. We need this because the default pickle system for dicts requires a mutable interface, which MultiMap does not provide. Do not make this a closure, as this object must itself be pickleable.
- inyoka.markup.parsertools.flatten_iterator(iter)
Flatten an iterator into one without any sub-elements.
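A plausible sketch of what such a flattening helper does, assuming it recurses into nested iterables while treating strings as leaf values (the function body here is an illustration, not the Inyoka source):

```python
# Hypothetical sketch: yield leaf items, recursing into any nested
# iterables; strings/bytes are treated as atomic leaves.
def flatten_iterator(iterable):
    for item in iterable:
        if hasattr(item, '__iter__') and not isinstance(item, (str, bytes)):
            yield from flatten_iterator(item)
        else:
            yield item

flat = list(flatten_iterator([1, [2, [3, 4]], 5]))  # [1, 2, 3, 4, 5]
```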