All Generators are Iterators, but the converse is false.
The typical Generator is written as a function.
It uses yield instead of return,
but it can also return early, e.g. to prune recursive generators.
The syntax yield from (items)
is shorthand for the equivalent
for _ in (items): yield _
from typing import Iterable, Iterator
def gen_primes():
    yield "Some prime numbers:"
    yield from (2, 3, 5, 7, "(etc.)")
print(*gen_primes())
The generator is not the function itself, but the object the function returns.
print(type(gen_primes))
# class 'function'
print(type(primes := gen_primes()))
# class 'generator'
next() is used to retrieve successive yields.
print(next(primes)) # "Some prime numbers:"
print(next(primes)) # 2
Generators do not store values; they can only be iterated once.
They may be infinite, or may yield different values on each run,
so generators cannot be treated as container types.
primes = gen_primes()
# Prints the yields
print(" generated:", *primes)
# Prints nothing (exhausted)
print(" exhausted:", *primes)
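Once exhausted, any further next() call raises StopIteration. A quick self-contained check (gen_primes is repeated here so the snippet runs on its own):

```python
def gen_primes():
    yield "Some prime numbers:"
    yield from (2, 3, 5, 7, "(etc.)")

primes = gen_primes()
print(*primes)       # consumes every yield
try:
    next(primes)     # the generator is now exhausted
except StopIteration:
    print("no more yields")
```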
From Iterator to Generator
Generators can be used to transform an almost-Iterator
into a true Iterator (Generators are all Iterators).
A class implementing
__next__ is not quite an Iterator,
since it lacks the required
__iter__.
However, a function wrapping the next call
adapts it into a Generator, and thus into an Iterator.
def iteratorify(has_next):
    while ...:
        try: yield next(has_next)
        except StopIteration: break
class Nums:
    def __init__(self, max):
        self._i, self._max = -1, max
    def __next__(self):
        if self._i >= self._max:  # >= so that Nums(5) stops after yielding 5
            raise StopIteration
        self._i += 1; return self._i
next5 = Nums(5)
assert not isinstance(next5, Iterable)
assert not isinstance(next5, Iterator)
iter5 = iteratorify(next5)
assert isinstance(iter5, Iterable)
assert isinstance(iter5, Iterator)
generator.close()
raises GeneratorExit at the point
where the generator function was paused.
It is more an interrupt than an error;
as such it inherits from
BaseException rather than
Exception.
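A short illustration: catching GeneratorExit lets a paused generator clean up when closed (the chatty generator and log list are just for demonstration):

```python
log = []

def chatty():
    try:
        yield 1
        yield 2
    except GeneratorExit:
        # close() resumes the paused generator by raising GeneratorExit here
        log.append("closed while paused")
        raise  # re-raising (or returning) lets close() complete cleanly

g = chatty()
print(next(g))  # 1 -- pauses at the first yield
g.close()       # GeneratorExit is raised inside chatty
print(log)      # ['closed while paused']
```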
Checks, corner cases
isgeneratorfunction
tells whether a function makes generators, i.e. whether its body contains the yield keyword.
from inspect import isgeneratorfunction
assert isgeneratorfunction(gen_primes) # True
isinstance(_, GeneratorType)
checks if _ is a generator, i.e. the output
of a yield-function.
Note, however, that range, map, filter, zip...
aren't generator-functions.
Thus,
two = map(func, one)
strips the GeneratorType away from one.
from types import GeneratorType
def numbers():
    yield from range(100)
assert isinstance(numbers(), GeneratorType)
plus_one = map(lambda _:_ + 1, numbers())
assert not isinstance(plus_one, GeneratorType)
assert isinstance(range(6),
GeneratorType) is False
assert isinstance(map(str, range(1, 4)),
GeneratorType) is False
assert isinstance(filter(lambda _: _ < 4, range(1, 6)),
GeneratorType) is False
# All true
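Generator expressions, by contrast, do produce true generator objects (a quick check):

```python
from types import GeneratorType

squares = (n * n for n in range(5))
assert isinstance(squares, GeneratorType)
print(*squares)  # 0 1 4 9 16
```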
Variadic decorator for gen-expr, comprehension args
SEQTYPES = (range, list, tuple, set)
def variadic(callable):
    def _cast_args(*_):
        if len(_) != 1 or type(_[0]) == str:
            return _
        if isinstance(_[0], SEQTYPES) \
           or hasattr(_[0], "__iter__") \
           or hasattr(_[0], "__next__"):
            return _[0]
        return _
    def _newfunc(*args):
        return callable(*_cast_args(*args))
    return _newfunc
Strings are handled as literal values instead of character sequences.
@variadic
def count(*args): return len(args)
# len(_) would return the same for:
print(count(range(5))) # 5
print(count("01234")) # 1
# but len(_) would say it takes one arg for:
print(count(0, 1, 2, 3, 4)) # 5
print(count(*"01234")) # 5
# and would say generator has no len for:
print(count(_ for _ in range(5))) # 5
Generators with asyncio
import asyncio

async def agen(items):  # a minimal async generator (not in the original snippet)
    for _ in items: yield _

async def get_result(_):
    buffer = []
    async for item in _: buffer.append(item)
    print("asyncgen:", *buffer)

asyncio.run(get_result(agen((2, 3, 5, 7))))
The following snippet makes a decorator that turns a generator method
into a cached property equal to its result, i.e. it is computed once and then,
rather than being "exhausted", cached onto the instance for later reads.
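A minimal sketch of such a decorator; the names gen_cached_property and Fib are illustrative, not part of the original:

```python
from functools import wraps

def gen_cached_property(method):
    """Turn a generator method into a property whose value is
    computed (and fully consumed) once, then cached on the instance."""
    attr = "_cached_" + method.__name__
    @property
    @wraps(method)
    def wrapper(self):
        if not hasattr(self, attr):
            # Exhaust the generator once; store the result as a tuple.
            setattr(self, attr, tuple(method(self)))
        return getattr(self, attr)
    return wrapper

class Fib:
    @gen_cached_property
    def first_ten(self):
        a, b = 0, 1
        for _ in range(10):
            yield a
            a, b = b, a + b

f = Fib()
print(f.first_ten)  # computed once, fully consumed
print(f.first_ten)  # read from the instance cache, not exhausted
```

Storing the result as a tuple is what prevents exhaustion: later reads return the same materialized sequence instead of a spent generator.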