A Function That Pauses
A generator looks like a regular function, but instead of computing a whole result and returning it, it yields one value at a time, pausing between yields until whoever's asking wants the next value.
The simplest possible one:
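Something like this (a sketch matching the walkthrough below; count_up_to and its bound are illustrative names):

```python
def count_up_to(limit):
    # Start at 1 and hand back one value per loop iteration.
    current = 1
    while current <= limit:
        yield current   # pause here; resume on the next request
        current += 1

for n in count_up_to(5):
    print(n)
```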
Notice yield instead of return. The first time the for loop asks for a value, Python runs the function body until it hits yield with current equal to 1. The function pauses right there, hands 1 back to the loop, and remembers exactly where it stopped — variables and all. The next iteration picks up where it left off: current += 1, back to the while, yield 2. And so on until the loop condition fails, at which point the generator simply stops.
That pause-and-resume is the whole trick.
Why Not Just Build a List?
Because the list version allocates all the values up front:
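A list-building version of the same counter might look like this (count_up_to_list is an illustrative name):

```python
def count_up_to_list(limit):
    # Build the entire result before returning anything.
    result = []
    current = 1
    while current <= limit:
        result.append(current)
        current += 1
    return result

print(count_up_to_list(5))   # [1, 2, 3, 4, 5], all five in memory at once
```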
Fine for 5 items. Now imagine you want 50 million integers, and you only care about the first one that matches some condition. The list version allocates 50 million ints and then you throw most of them away. The generator version creates exactly as many as the caller consumes. When the for loop finds what it wants and breaks, the generator simply stops.
That's the pattern worth internalising: generators let you write iteration code without deciding up front how much of the result you'll need.
Generator Expressions
If you've written a list comprehension, you already know the syntax — swap the square brackets for parentheses:
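For example (squares_gen matches the name used below; the range bound is arbitrary):

```python
squares_list = [n * n for n in range(10)]   # all ten computed right now
squares_gen = (n * n for n in range(10))    # nothing computed yet

print(squares_list)        # a real list of values
print(squares_gen)         # a generator object, not values
print(list(squares_gen))   # iterating it runs the recipe
```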
squares_gen doesn't compute anything yet. It's just a recipe. Iterating it runs the recipe one step at a time.
Generator expressions are perfect as arguments to functions that consume an iterable:
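A few examples of the pattern (the numbers are arbitrary sample data):

```python
numbers = [3, 1, 4, 1, 5, 9, 2, 6]

total = sum(n * n for n in numbers)          # no list of squares is ever built
biggest = max(n * n for n in numbers)
has_even = any(n % 2 == 0 for n in numbers)  # stops at the first even number

print(total, biggest, has_even)   # 173 81 True
```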
No intermediate list. sum, max, and any read values one at a time, which is exactly what they want.
Reading a Large File, Line by Line
This is the canonical real-world case for generators — process a file that's too big to load into memory:
```python
def parse_log_lines(path):
    with open(path) as f:
        for line in f:
            if line.startswith("ERROR"):
                yield line.rstrip()

for error in parse_log_lines("app.log"):
    print(error)
```
The file is read lazily. Each call to the generator pulls one line from disk, filters it, and yields. Memory usage stays flat regardless of file size.
Once and Done
A generator gives you a single pass. Once you've iterated to the end, it's exhausted:
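For instance:

```python
gen = (n * n for n in range(5))

for value in gen:
    print(value)   # prints 0, 1, 4, 9, 16, one per line

for value in gen:
    print(value)   # never runs: gen is already exhausted
```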
The second loop prints nothing. The generator has nothing left.
If you need to iterate more than once, either call the generator function again for a fresh generator, or materialise the sequence with list(...) and iterate the list repeatedly. Pick based on cost: rebuilding is fine if the work is cheap; a list is fine if the sequence is small.
next() and Manual Iteration
You don't have to use a for loop. next() pulls one value at a time:
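A quick demonstration:

```python
gen = (n * n for n in range(3))

print(next(gen))           # 0
print(next(gen))           # 1
print(next(gen))           # 4
print(next(gen, "done"))   # "done": the default instead of StopIteration
```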
StopIteration is how a generator signals "I'm done." for loops catch it silently. In manual code you can pass a default to next(gen, default) to avoid the exception.
Infinite Generators
Because values are produced on demand, a generator can represent a sequence with no end — as long as the consumer stops asking:
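A sketch of the idea (naturals is an illustrative name):

```python
def naturals():
    # An endless counter: produces 0, 1, 2, ... for as long as anyone asks.
    n = 0
    while True:
        yield n
        n += 1

for n in naturals():
    if n >= 5:
        break      # the consumer decides when to stop
    print(n)
```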
while True with a yield inside doesn't hang the program — it just means "if someone keeps asking, keep producing." The consumer decides when to stop.
This pattern shows up in streaming data, event loops, and anywhere you pull values from a source that doesn't have a defined length.
yield from: Delegating to Another Iterable
If your generator wants to yield every value from another iterable, yield from does it in one line:
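A small example (the contents of the iterables are arbitrary):

```python
def numbers_then_letters():
    yield from [1, 2, 3]   # yields 1, 2, 3 one at a time
    yield from "ab"        # then 'a', 'b'

print(list(numbers_then_letters()))   # [1, 2, 3, 'a', 'b']
```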
Without yield from you'd write a nested for loop with yield x inside. It also forwards send() and throw() calls correctly if you ever use those — but for everyday code, think of it as "yield every value from this thing."
When to Reach for a Generator
Three signals that a generator is the right tool:
- The sequence is large, possibly infinite, or expensive to produce in full.
- The consumer might stop before the end (a break on first match, for example).
- You want to chain transformations — filter, map, take — without building intermediate lists.
And when not to:
- You need random access (seq[42]). Generators only go forward.
- You need to iterate the same sequence several times. Use a list.
- The sequence is small and you already have it. A list comprehension is simpler.
Generators, list comprehensions, and plain lists are each the right answer for different jobs. The skill is picking one without thinking too hard about it — and the quickest way to develop that instinct is to notice, for each iteration you write, whether "produce everything first" or "produce one at a time" fits better.
Next: Context Managers in Depth
You've now seen most of the idioms Python uses for iteration. Context managers — the with statement — are next, and they pair well with generators for streaming data out of files and network connections.
Frequently Asked Questions
What is a generator in Python?
A generator is a function that produces values one at a time, pausing between them. You write it with def like a normal function, but use yield instead of return. Calling it returns a generator object; each iteration of for or each next() call runs the function until the next yield.
What's the difference between a list and a generator?
A list holds every element in memory at once. A generator computes elements on demand and forgets them after they're consumed. For large or infinite sequences, generators use a tiny fixed amount of memory; for small results you need repeatedly, a list is better.
Can I iterate a generator twice?
No. A generator is exhausted after the first full pass — a second for loop over it produces nothing. If you need to iterate more than once, call the generator function again to get a fresh generator, or materialise the results into a list.