
⚡ Generators and Coroutines in Python

Generators and coroutines provide a way to write code that can be paused and resumed, enabling lazy evaluation and cooperative multitasking.


🏭 Generators

A generator function is any function containing the yield keyword. Calling it returns a generator object, an iterator that produces values lazily, one at a time.

def countdown(n):
    while n > 0:
        yield n
        n -= 1

for x in countdown(3):
    print(x)
# Output: 3, 2, 1

Benefits:
- Memory Efficient: Values are generated one by one, not stored in a list.
- Infinite Sequences: Can represent infinite streams of data.
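To sketch the infinite-sequence point, a generator can describe an unbounded stream that callers slice lazily (the name naturals below is just for illustration):

```python
import itertools

def naturals():
    """Yield 0, 1, 2, ... forever; each value is computed only on demand."""
    n = 0
    while True:
        yield n
        n += 1

# itertools.islice takes a finite slice without materializing the stream
print(list(itertools.islice(naturals(), 5)))  # [0, 1, 2, 3, 4]
```

A plain list could never hold this sequence; the generator only does work when the consumer asks for the next value.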


🤝 Coroutines

Coroutines are generalizations of subroutines. While subroutines are entered at one point and exited at another, coroutines can be entered, exited, and resumed at many different points.

In Python, classic generator-based coroutines consume data that is pushed into them with send().

def grep(pattern):
    print(f"Searching for {pattern}")
    while True:
        line = (yield)
        if pattern in line:
            print(line)

search = grep('coroutine')
next(search)  # Prime the coroutine
search.send("I love generators")
search.send("I love coroutines") # Output: I love coroutines
search.close()

  • yield is used as an expression: x = (yield).
  • send() pushes values into the coroutine.
  • close() stops the coroutine.
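The priming step is easy to forget, so a common pattern is a small decorator that advances a new coroutine to its first yield automatically. This is a sketch, not a standard-library helper; the name primed is made up for illustration:

```python
from functools import wraps

def primed(func):
    """Wrap a coroutine function so callers never need the initial next()."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        coro = func(*args, **kwargs)
        next(coro)  # advance to the first (yield) so send() works immediately
        return coro
    return wrapper

@primed
def echo():
    while True:
        line = (yield)
        print(f"got: {line}")

e = echo()        # already primed, no manual next() needed
e.send("hello")   # prints: got: hello
e.close()
```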

🔗 yield from

yield from allows a generator to delegate part of its operations to another generator.

def sub_gen():
    yield 1
    yield 2

def main_gen():
    yield 0
    yield from sub_gen()
    yield 3

print(list(main_gen())) # [0, 1, 2, 3]

It also establishes a bidirectional channel, allowing send() and exceptions to pass through to the sub-generator.
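A small sketch of that bidirectional channel: values passed to send() on the outer generator travel straight into the sub-generator, and the sub-generator's return value becomes the value of the yield from expression (the names collect and wrapper are illustrative):

```python
def collect():
    """Sub-generator: gather sent values until a 'stop' sentinel arrives."""
    items = []
    while True:
        x = yield
        if x == "stop":
            return items  # return value surfaces through yield from
        items.append(x)

def wrapper():
    # send() on wrapper is forwarded directly into collect()
    result = yield from collect()
    yield result  # expose the sub-generator's return value to the caller

g = wrapper()
next(g)                # advance to the first yield inside collect()
g.send("a")
g.send("b")
print(g.send("stop"))  # ['a', 'b']
```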


🧪 Generator Pipelines

You can chain generators together to form processing pipelines.

def numbers(n):
    for i in range(n):
        yield i

def square(nums):
    for n in nums:
        yield n * n

pipeline = square(numbers(5))
print(list(pipeline)) # [0, 1, 4, 9, 16]
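Stages compose freely, so a new step slots in without touching the others. As a sketch, a filter stage (the name evens is illustrative) added between the two stages above:

```python
def numbers(n):
    for i in range(n):
        yield i

def evens(nums):
    # pass through only even values
    for x in nums:
        if x % 2 == 0:
            yield x

def square(nums):
    for x in nums:
        yield x * x

print(list(square(evens(numbers(10)))))  # [0, 4, 16, 36, 64]
```

Because every stage pulls one item at a time, the whole pipeline still runs in constant memory regardless of how many stages you add.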

📝 Summary

  • Generators: Produce data lazily using yield. Save memory.
  • Coroutines: Consume data. Can be paused and resumed.
  • yield from: Delegates to sub-generators.
  • Pipelines: Compose complex data processing flows from simple generators.

Created with ❤️ by Pynfinity


