Audience: Grade 8–9 students who already know intermediate Python (functions, loops, lists, and basic modules)
Learning goals
- Understand what decorators and generators are, and why they're useful.
- Write your own decorators (including ones with arguments).
- Write and use generator functions to process data efficiently.
- Combine both in a small, practical project.
Part A: Generators
Idea in simple words
- A normal function computes everything and returns it all at once (like baking all cookies and then serving).
- A generator makes values one at a time when you ask for them (like a cookie machine that gives you one cookie at a time). This saves memory and can be faster for large data.
How generators work
- A generator function uses the keyword `yield`.
- Calling a generator function returns a generator object (an iterator). You can loop over it or call `next()` on it.
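To make `next()` and `StopIteration` concrete, here is a tiny sketch (the `countdown` generator is made up just for this example):

```python
def countdown(n):
    """Yield n, n-1, ..., 1, then stop."""
    while n > 0:
        yield n
        n -= 1

g = countdown(2)
print(next(g))  # 2
print(next(g))  # 1
try:
    next(g)  # the generator is exhausted
except StopIteration:
    print("done")
```

A `for` loop catches `StopIteration` for you behind the scenes, which is why looping over a generator just stops cleanly instead of crashing.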
Example 1: List vs generator
```python
# Normal function: returns a list of squares
def squares_list(n):
    result = []
    for i in range(n):
        result.append(i * i)
    return result

# Generator function: yields squares one by one
def squares_gen(n):
    for i in range(n):
        yield i * i

print("List:", squares_list(5))  # [0, 1, 4, 9, 16]
print("Gen:", squares_gen(5))    # <generator object ...>

# You can loop over a generator:
for x in squares_gen(5):
    print("From generator:", x)

# Or pull values manually:
g = squares_gen(3)
print(next(g))  # 0
print(next(g))  # 1
print(next(g))  # 4
# next(g) would now raise StopIteration because it's out of values
```

Why generators are useful
- Memory efficient: They don't keep the whole result in memory.
- Lazy: They only compute when you need the next value.
- Can represent infinite sequences.
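You can see the memory difference directly with `sys.getsizeof` (a rough sketch; exact byte counts vary across Python versions):

```python
import sys

# The list stores one million results; the generator stores almost nothing,
# because it computes each value only when asked.
big_list = [i * i for i in range(1_000_000)]
big_gen = (i * i for i in range(1_000_000))

print(sys.getsizeof(big_list))  # several megabytes
print(sys.getsizeof(big_gen))   # a few hundred bytes at most
```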
Example 2: An infinite counter generator
```python
from itertools import islice

def count(start=0, step=1):
    n = start
    while True:  # Infinite! Only safe if you stop iterating (e.g., break or islice)
        yield n
        n += step

# Take the first 5 numbers from the infinite counter
for num in islice(count(10, 2), 5):
    print(num)  # 10, 12, 14, 16, 18
```

Example 3: Generator pipelines (chaining)
```python
def numbers():
    for i in range(10):
        yield i

def evens(seq):
    for n in seq:
        if n % 2 == 0:
            yield n

def squares(seq):
    for n in seq:
        yield n * n

# Pipeline: numbers -> evens -> squares
for val in squares(evens(numbers())):
    print(val)  # 0, 4, 16, 36, 64
```

Generator expressions (short form)
```python
# Like list comprehensions, but with parentheses:
gen = (i * i for i in range(5))  # a generator, not a list
print(list(gen))  # [0, 1, 4, 9, 16] (this consumes it)
```

Part B: Decorators
Idea in simple words
- A decorator is a function that takes another function and returns a new function that adds extra behavior.
- It's like putting a function in a "wrapper" that runs before/after the original function.
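Before the `@` syntax, it helps to see the wrapping done by hand (the `exclaim` wrapper here is made up just for illustration):

```python
def exclaim(func):
    def wrapper():
        return func() + "!"
    return wrapper

def say_hi():
    return "hi"

say_hi = exclaim(say_hi)  # rebinding by hand: this is what @exclaim would do
print(say_hi())  # hi!
```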
Why decorators are useful
- Add features without changing the original code (DRY: Don't Repeat Yourself).
- Common uses: logging, timing, caching, access control, retries, validation.
Basic decorator
```python
def announce(func):
    def wrapper():
        print("About to run", func.__name__)
        result = func()
        print("Finished", func.__name__)
        return result
    return wrapper

@announce  # Same as: greet = announce(greet)
def greet():
    print("Hello!")

greet()
# About to run greet
# Hello!
# Finished greet
```

Decorators that work with any function (args and kwargs)
```python
from functools import wraps

def debug(func):
    @wraps(func)  # keeps the original name/docstring
    def wrapper(*args, **kwargs):
        print(f"Calling {func.__name__} with {args} {kwargs}")
        result = func(*args, **kwargs)
        print(f"{func.__name__} returned {result}")
        return result
    return wrapper

@debug
def add(a, b):
    """Add two numbers."""
    return a + b

add(3, 4)
```

Decorators with their own arguments (parameterized)
This is a "decorator factory": a function that returns a decorator.
```python
from functools import wraps

def repeat(times=2):
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            result = None
            for _ in range(times):
                result = func(*args, **kwargs)
            return result
        return wrapper
    return decorator

@repeat(times=3)
def beep():
    print("Beep!")

beep()
# Beep!
# Beep!
# Beep!
```

A very useful built-in: caching with lru_cache
```python
from functools import lru_cache

@lru_cache(maxsize=None)  # remembers results for the same inputs
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print([fib(i) for i in range(10)])  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```

Practical project: Streaming word counter with timed stages
Goal
- Process a text file to find the top N most common words without loading everything at once.
- Use generators to build a memory-friendly pipeline.
- Use decorators to time and debug stages.
What you'll build
Generators:
- `read_lines(path)`: yields lines from a file.
- `to_words(lines)`: converts lines to lowercase words.
- `filter_short(words, min_len)`: filters out tiny words.
A consumer:
top_n(words, n): returns a list of the most common words.
Decorators:
- `time_call` for normal functions.
- `time_generator` for generator functions.
Complete code (you can paste this into one .py file and run it)
```python
import time
import re
from collections import Counter
from functools import wraps
from itertools import islice
import os

# ---------- Decorators ----------
def time_call(func):
    """Time a normal function (not a generator)."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        end = time.perf_counter()
        print(f"[time] {func.__name__} took {end - start:.4f}s")
        return result
    return wrapper

def time_generator(func):
    """Time a generator function during iteration."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        # We wrap the iteration to measure the total time to consume it
        for item in func(*args, **kwargs):
            yield item
        end = time.perf_counter()
        print(f"[time] {func.__name__} (generator) took {end - start:.4f}s")
    return wrapper

def log_calls(func):
    """Log calls for debugging."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        print(f"[call] {func.__name__} args={args} kwargs={kwargs}")
        result = func(*args, **kwargs)
        print(f"[ret ] {func.__name__} -> {type(result).__name__}")
        return result
    return wrapper

# ---------- Generators ----------
@time_generator
def read_lines(path):
    """Yield lines from a UTF-8 text file."""
    with open(path, "r", encoding="utf-8") as f:
        for line in f:
            yield line

@time_generator
def to_words(lines):
    """Split lines into lowercase words (letters only)."""
    pattern = re.compile(r"[a-zA-Z]+")
    for line in lines:
        for word in pattern.findall(line.lower()):
            yield word

@time_generator
def filter_short(words, min_len=3):
    """Filter out words shorter than min_len."""
    for w in words:
        if len(w) >= min_len:
            yield w

# ---------- Consumer ----------
@time_call
@log_calls
def top_n(words_iter, n=10):
    """Count words and return the top n as a list of (word, count)."""
    counts = Counter(words_iter)  # consumes the iterator
    return counts.most_common(n)

# ---------- Demo / Activity ----------
def ensure_sample_file(path="sample.txt"):
    if not os.path.exists(path):
        text = """
In the world of Python, generators and decorators are powerful.
Generators save memory by yielding items one at a time.
Decorators wrap functions to add features like timing and logging.
Python makes it fun to build pipelines using both!
"""
        with open(path, "w", encoding="utf-8") as f:
            f.write(text.strip())
    return path

def main():
    path = ensure_sample_file("sample.txt")

    # Build the pipeline
    lines = read_lines(path)
    words = to_words(lines)
    long_words = filter_short(words, min_len=4)

    # Optionally limit for testing:
    # long_words = islice(long_words, 50)

    # Consume the pipeline and get the top words
    result = top_n(long_words, n=5)

    print("\nTop words:")
    for w, c in result:
        print(f"{w}: {c}")

if __name__ == "__main__":
    main()
```

How to explore
- Change `min_len` in `filter_short` to see different results.
- Replace sample.txt with a bigger text file (like an ebook) and watch the timing messages.
- Remove the decorators to see how easy it is to turn logging/timing on or off.
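If you like the short form, the same pipeline stages can be sketched as generator expressions (fed here from an in-memory `io.StringIO` instead of a real file, so the sketch is self-contained):

```python
import io
from collections import Counter

fake_file = io.StringIO("Python is fun\nPython generators are fun\n")

# The same stages as the project, written as generator expressions
lines = (line for line in fake_file)
words = (w for line in lines for w in line.lower().split())
long_words = (w for w in words if len(w) >= 4)

print(Counter(long_words).most_common(2))  # [('python', 2), ('generators', 1)]
```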
Important notes
- The `time_generator` decorator only prints its timing when the generator is fully consumed. If you only take a few items, the reported time may not represent the full dataset.
- Decorators can be stacked. Order matters: the decorator closest to the function is applied first, so the one listed on top wraps all the others.
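A tiny sketch of the stacking order (both decorators here are made up for the demo):

```python
def outer(func):
    def wrapper():
        return "outer(" + func() + ")"
    return wrapper

def inner(func):
    def wrapper():
        return "inner(" + func() + ")"
    return wrapper

@outer  # applied second, so it wraps inner's wrapper
@inner  # applied first (closest to the function)
def core():
    return "core"

print(core())  # outer(inner(core))
```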
Extra mini-exercises
- Write a decorator called `count_calls` that counts how many times a function is called and prints the total.
- Write a generator `chunked(seq, size)` that yields lists of up to `size` items at a time.
- Create a parameterized decorator `limit_time(seconds)` that warns if a function runs longer than the given time.
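If you get stuck on the first exercise, here is one possible sketch of `count_calls` (it stores the counter on the wrapper function itself; other designs work too):

```python
from functools import wraps

def count_calls(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        wrapper.calls += 1  # attributes on the wrapper survive between calls
        print(f"{func.__name__} has been called {wrapper.calls} times")
        return func(*args, **kwargs)
    wrapper.calls = 0
    return wrapper

@count_calls
def ping():
    return "pong"

ping()
ping()
```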
Concept summary
Generators:
- Use `yield` to produce values one at a time.
- Are memory efficient and lazy.
- Can be chained into pipelines for clean, readable processing.
- Generator expressions are a short, readable form.
Decorators:
- A function that takes a function and returns a new function.
- Useful for adding cross-cutting concerns: logging, timing, caching.
- Use `*args` and `**kwargs` in wrappers to handle any function.
- Use `functools.wraps` to preserve the original function's name and docstring.
- Can be parameterized by returning a decorator from a function.
By practicing both together, you'll write Python that's faster, cleaner, and easier to change.