Join multiple async generators in Python


I would like to listen for events from multiple instances of the same object and then merge these event streams into one stream. For example, using async generators:



import asyncio


class PeriodicYielder:
    def __init__(self, period: int) -> None:
        self.period = period

    async def updates(self):
        while True:
            await asyncio.sleep(self.period)
            yield self.period


I can successfully listen for events from one instance:



async def get_updates_from_one():
    each_1 = PeriodicYielder(1)
    async for n in each_1.updates():
        print(n)
        # 1
        # 1
        # 1
        # ...


But how can I get events from multiple async generators? In other words: how can I iterate over multiple async generators in the order they are ready to produce their next value?



async def get_updates_from_multiple():
    each_1 = PeriodicYielder(1)
    each_2 = PeriodicYielder(2)
    async for n in magic_async_join_function(each_1.updates(), each_2.updates()):
        print(n)
        # 1
        # 1
        # 2
        # 1
        # 1
        # 2
        # ...


Is there such a magic_async_join_function in the stdlib or in a third-party module?










python python-asyncio

asked Mar 22 at 12:25 by Andrey Semakin
  • Is gather() what you are looking for?

    – SuperShoot
    Mar 22 at 12:38











  • @SuperShoot gather() returns a list of size n as the result of gathering n coroutines. I would like to merge n async generators into one async generator, something like itertools.chain() but for async iterables, without exhausting the first iterable before checking whether the second is ready to yield.

    – Andrey Semakin
    Mar 22 at 12:50
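
To make the distinction concrete: asyncio.gather() awaits all of its arguments and only then returns their results as a list, so values are not interleaved as they become ready. A minimal sketch of that behavior (the slow() helper is illustrative, not code from the thread):

import asyncio


async def slow(n):
    await asyncio.sleep(n)
    return n


async def main():
    # gather() produces nothing until every coroutine is done,
    # then returns all results at once as a list
    results = await asyncio.gather(slow(1), slow(2))
    print(results)  # [1, 2], printed only after ~2 seconds


asyncio.run(main())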


















2 Answers
































You can use the wonderful aiostream library. It'll look like this:



import asyncio
from aiostream import stream


async def test1():
    for _ in range(5):
        await asyncio.sleep(0.1)
        yield 1


async def test2():
    for _ in range(5):
        await asyncio.sleep(0.2)
        yield 2


async def main():
    combine = stream.merge(test1(), test2())

    async with combine.stream() as streamer:
        async for item in streamer:
            print(item)


asyncio.run(main())


Result:



1
1
2
1
1
2
1
2
2
2
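

Applied to the PeriodicYielder class from the question, the same merge/stream pattern would look roughly like this (a sketch reusing only the API calls shown above; the generators are infinite, so the loop runs until cancelled):

import asyncio
from aiostream import stream


class PeriodicYielder:
    def __init__(self, period: int) -> None:
        self.period = period

    async def updates(self):
        while True:
            await asyncio.sleep(self.period)
            yield self.period


async def get_updates_from_multiple():
    # stream.merge interleaves items in the order the sources yield them,
    # playing the role of the magic_async_join_function from the question
    combine = stream.merge(PeriodicYielder(1).updates(),
                           PeriodicYielder(2).updates())
    async with combine.stream() as streamer:
        async for n in streamer:
            print(n)  # 1, 1, 2, 1, 1, 2, ...


asyncio.run(get_updates_from_multiple())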





answered Mar 22 at 13:19 by Mikhail Gerasimov
  • Wow, looks good! Will check it out!

    – Andrey Semakin
    Mar 22 at 13:34











  • Mikhail Gerasimov, thank you!

    – Andrey Semakin
    Mar 22 at 13:35











  • @AndreySemakin you're welcome! :)

    – Mikhail Gerasimov
    Mar 22 at 13:41
































If you wanted to avoid the dependency on an external library (or as a learning exercise), you could merge the async iterators using a queue:



def merge_async_iters(*aiters):
    # merge async iterators, proof of concept
    queue = asyncio.Queue(1)
    async def drain(aiter):
        async for item in aiter:
            await queue.put(item)
    async def merged():
        while not all(task.done() for task in tasks):
            yield await queue.get()
    tasks = [asyncio.create_task(drain(aiter)) for aiter in aiters]
    return merged()


This passes the test from Mikhail's answer, but it's not perfect: it doesn't propagate the exception if one of the async iterators raises. Also, if the task that exhausts the merged generator returned by merge_async_iters() gets cancelled, or if the merged generator is not exhausted to the end, the individual drain tasks are left hanging.



A more complete version could handle the first issue by catching the exception and transmitting it through the queue. The second issue can be resolved by having the merged generator cancel the drain tasks as soon as the iteration is abandoned. With those changes, the resulting code looks like this:



def merge_async_iters(*aiters):
    queue = asyncio.Queue(1)
    run_count = len(aiters)
    cancelling = False

    async def drain(aiter):
        nonlocal run_count
        try:
            async for item in aiter:
                await queue.put((False, item))
        except Exception as e:
            if not cancelling:
                await queue.put((True, e))
            else:
                raise
        finally:
            run_count -= 1

    async def merged():
        try:
            while run_count:
                raised, next_item = await queue.get()
                if raised:
                    cancel_tasks()
                    raise next_item
                yield next_item
        finally:
            cancel_tasks()

    def cancel_tasks():
        nonlocal cancelling
        cancelling = True
        for t in tasks:
            t.cancel()

    tasks = [asyncio.create_task(drain(aiter)) for aiter in aiters]
    return merged()
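

For completeness, a small driver in the spirit of Mikhail's test, assuming the merge_async_iters() defined above is in scope:

import asyncio


async def test1():
    for _ in range(5):
        await asyncio.sleep(0.1)
        yield 1


async def test2():
    for _ in range(5):
        await asyncio.sleep(0.2)
        yield 2


async def main():
    # merge_async_iters comes from the snippet above
    async for item in merge_async_iters(test1(), test2()):
        print(item)


asyncio.run(main())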


Different approaches to merging async iterators can be found in this answer, and also this one, where the latter allows adding new streams mid-stride. The complexity and subtlety of these implementations show that, while it is useful to know how to write one, actually doing so is best left to well-tested external libraries such as aiostream that cover all the edge cases.






answered Mar 23 at 19:37 by user4815162342, edited Mar 24 at 10:35
  • thank you for a great explanation!

    – Andrey Semakin
    Mar 29 at 19:10










