pynfinity

Build a mountain with each pebble 🧗

⚡ Generators and Coroutines in Python

Generators and coroutines provide a way to write code that can be paused and resumed, enabling lazy evaluation and cooperative multitasking.


🏭 Generators

A generator is a function that uses the yield keyword. Calling it returns a generator object (an iterator) that produces values on the fly, one per yield.

def countdown(n):
    while n > 0:
        yield n
        n -= 1

for x in countdown(3):
    print(x)
# Output: 3, 2, 1

Benefits:
- Memory Efficient: Values are generated one by one, not stored in a list.
- Infinite Sequences: Can represent infinite streams of data.
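For instance, a generator can model an infinite stream that you slice lazily. A minimal sketch (the helper name naturals is illustrative, not from the post above):

```python
import itertools
import sys

def naturals():
    """Infinite stream of natural numbers, produced lazily."""
    n = 0
    while True:
        yield n
        n += 1

# Take only the first five values; the stream itself never ends.
first_five = list(itertools.islice(naturals(), 5))
print(first_five)  # [0, 1, 2, 3, 4]

# A generator object stays small no matter how many values it can yield,
# unlike a list that must hold every element in memory at once.
print(sys.getsizeof(naturals()) < sys.getsizeof(list(range(1_000_000))))  # True
```

itertools.islice is the standard way to take a finite prefix of an otherwise infinite generator.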


🤝 Coroutines

Coroutines generalize subroutines. A subroutine is entered at one point and exits at another; a coroutine can be suspended and resumed at many different points.

In Python, classic generator-based coroutines (the kind shown here, distinct from the native async def coroutines added later) consume data sent to them via send().

def grep(pattern):
    print(f"Searching for {pattern}")
    while True:
        line = (yield)
        if pattern in line:
            print(line)

search = grep('coroutine')
next(search)  # Prime the coroutine
search.send("I love generators")
search.send("I love coroutines") # Output: I love coroutines
search.close()

  • yield is used as an expression: line = (yield).
  • send() pushes a value into the coroutine at the paused yield.
  • close() terminates the coroutine.
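Priming every coroutine with next() is easy to forget. A common convention (not part of the standard library) is a small decorator that primes on creation; the names coroutine and collector here are illustrative:

```python
import functools

def coroutine(func):
    """Decorator that primes a generator-based coroutine on creation."""
    @functools.wraps(func)
    def primer(*args, **kwargs):
        gen = func(*args, **kwargs)
        next(gen)  # advance to the first yield so send() works right away
        return gen
    return primer

@coroutine
def collector(results):
    """Append every value sent in to the results list."""
    while True:
        results.append((yield))

items = []
c = collector(items)   # no manual next() needed
c.send("a")
c.send("b")
c.close()
print(items)  # ['a', 'b']
```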

🔗 yield from

yield from allows a generator to delegate part of its operations to another generator.

def sub_gen():
    yield 1
    yield 2

def main_gen():
    yield 0
    yield from sub_gen()
    yield 3

print(list(main_gen())) # [0, 1, 2, 3]

It also establishes a bidirectional channel, allowing send() and exceptions to pass through to the sub-generator.
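To see that bidirectional channel in action, here is a small sketch (the names accumulator and delegator are illustrative) where values sent to the outer generator reach the sub-generator:

```python
def accumulator():
    """Sub-generator that keeps a running total of sent values."""
    total = 0
    while True:
        value = (yield total)
        total += value

def delegator():
    # yield from forwards every send() straight to accumulator()
    yield from accumulator()

d = delegator()
print(next(d))      # 0  -- prime the coroutine, get the initial total
print(d.send(10))   # 10
print(d.send(5))    # 15
d.close()
```

Without yield from, the outer generator would have to loop and re-send values by hand, and exception handling would need to be forwarded manually as well.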


🧪 Generator Pipelines

You can chain generators together to form processing pipelines.

def numbers(n):
    for i in range(n):
        yield i

def square(nums):
    for n in nums:
        yield n * n

pipeline = square(numbers(5))
print(list(pipeline)) # [0, 1, 4, 9, 16]
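Stages compose freely. For example, adding a filtering stage (the name evens is illustrative) to the pipeline above:

```python
def numbers(n):
    for i in range(n):
        yield i

def square(nums):
    for x in nums:
        yield x * x

def evens(nums):
    """Filter stage: pass through only even values."""
    for x in nums:
        if x % 2 == 0:
            yield x

# Each stage pulls one item at a time from the stage before it;
# nothing is materialized until list() drains the pipeline.
pipeline = evens(square(numbers(5)))
print(list(pipeline))  # [0, 4, 16]
```

Because every stage is lazy, the same pipeline works unchanged on very large or even infinite input streams.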

📝 Summary

  • Generators: Produce data lazily using yield. Save memory.
  • Coroutines: Consume data. Can be paused and resumed.
  • yield from: Delegates to sub-generators.
  • Pipelines: Compose complex data processing flows from simple generators.

Created with ❤️ by Pynfinity