Discussion on Python Coroutines and asyncio
- 2021-11-13 02:00:01
- OfStack
1. Coroutines
Coroutines are not provided by the computer itself the way processes and threads are; they are a construct created by programmers to run functions asynchronously. A coroutine (also called a micro-thread) is a user-space context-switching technique: within a single thread, execution switches between code blocks. A coroutine looks like a subroutine, but it can be suspended partway through, another subroutine can run, and execution can later resume at the point of suspension. The official documentation describes it as follows:
Coroutines are a more generalized form of subroutines. Subroutines are entered at one point and exited at another point. Coroutines can be entered, exited, and resumed at many different points. They can be implemented with the async def statement. See PEP 492.
# Requires Python 3.7+
import asyncio

async def main():
    print('hello')
    await asyncio.sleep(1)
    print('world')

asyncio.run(main())
# Prints "hello", waits 1 second, then prints "world"
Note: simply calling a coroutine does not schedule it for execution. Calling main() directly is a problem:
main()
RuntimeWarning: coroutine 'main' was never awaited
RuntimeWarning: Enable tracemalloc to get the object allocation traceback
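To make the warning concrete, here is a minimal sketch (not from the original article) showing that calling a coroutine function only creates a coroutine object, and nothing runs until an event loop executes it:

```python
import asyncio

async def main():
    return "hello"

# Calling the coroutine function does not run its body;
# it only creates a coroutine object.
coro = main()
print(type(coro).__name__)  # coroutine

# asyncio.run() schedules the coroutine on an event loop and runs it.
result = asyncio.run(coro)
print(result)  # hello
```
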
By contrast, ordinary function calls execute strictly in order:
def func1():
    print(1)
    ...
    print(2)

def func2():
    print(3)
    ...
    print(4)

func1()
func2()
# Result: 1 2 3 4
Ways to implement coroutines:
- greenlet, an early third-party module (not recommended)
- the yield keyword: Python generators can save state, switch to another function, and then switch back to the original function
- the asyncio decorator (introduced in Python 3.4)
- the async and await keywords (Python 3.5, recommended)
1.1 greenlet Implementation
# greenlet is a third-party module, so it must be installed first:
# pip install greenlet
from greenlet import greenlet

def func1():
    print(1)
    gr2.switch()  # switch to func2
    print(2)
    gr2.switch()  # switch back to func2

def func2():
    print(3)
    gr1.switch()  # switch back to func1
    print(4)

gr1 = greenlet(func1)
gr2 = greenlet(func2)
gr1.switch()  # start func1
# Result: 1 3 2 4
1.2 yield Keyword
def func1():
    yield 1
    yield from func2()  # delegate to func2's generator
    yield 2

def func2():
    yield 3
    yield 4

f1 = func1()
for item in f1:
    print(item)
# Result: 1 3 2 4
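The 1 3 2 4 interleaving seen in the greenlet example can also be reproduced with plain generators: each yield saves the function's state and hands control back, and a simple loop acts as the scheduler. This is a sketch, not part of the original article's code:

```python
def task1():
    print(1)
    yield  # pause here; state is saved and control returns to the scheduler
    print(2)
    yield

def task2():
    print(3)
    yield
    print(4)
    yield

# A minimal round-robin "scheduler": resume each generator in turn.
tasks = [task1(), task2()]
for _ in range(2):
    for t in tasks:
        next(t)
# Prints: 1 3 2 4
```
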
1.3 asyncio Decorator
Supported in Python 3.4 and later, but deprecated since Python 3.8:
DeprecationWarning: "@coroutine" decorator is deprecated since Python 3.8, use "async def" instead
So this approach is no longer recommended either.
import asyncio

@asyncio.coroutine
def func1():
    print(1)
    # On an IO-bound operation, automatically switch to another task in tasks,
    # e.g. network IO such as downloading an image
    yield from asyncio.sleep(2)
    print(2)

@asyncio.coroutine
def func2():
    print(3)
    yield from asyncio.sleep(2)  # same: switch to another task during the IO wait
    print(4)

tasks = [
    asyncio.ensure_future(func1()),
    asyncio.ensure_future(func2())
]
loop = asyncio.get_event_loop()
loop.run_until_complete(asyncio.wait(tasks))
# Result: 1 3 2 4
1.4 async & await Keywords
import asyncio

async def func1():
    print(1)
    # On an IO-bound operation, automatically switch to another task in tasks,
    # e.g. network IO such as downloading an image
    await asyncio.sleep(2)
    print(2)

async def func2():
    print(3)
    await asyncio.sleep(2)  # same: switch to another task during the IO wait
    print(4)

tasks = [
    asyncio.ensure_future(func1()),
    asyncio.ensure_future(func2())
]
loop = asyncio.get_event_loop()
loop.run_until_complete(asyncio.wait(tasks))
# Result: 1 3 2 4
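Since Python 3.7 the same example can be written without creating the event loop by hand, using asyncio.run and asyncio.gather. This is an equivalent sketch of the code above:

```python
import asyncio

async def func1():
    print(1)
    await asyncio.sleep(2)  # during the IO wait, control switches to the other task
    print(2)

async def func2():
    print(3)
    await asyncio.sleep(2)
    print(4)

async def main():
    # gather runs both coroutines concurrently and waits for both to finish.
    await asyncio.gather(func1(), func2())

asyncio.run(main())
# Prints: 1 3 2 4 (total time about 2 seconds, not 4)
```
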
2. The Significance of Coroutines
Coroutines make full use of a thread: within a single thread, when an IO wait is encountered, the thread does not sit idle; the waiting time is used to do other work.
Take downloading four images as an example:
Normal (synchronous) download:
import time
import requests

def download_image(url, img_name):
    print("Start downloading:", url)
    # Send the network request and download the image
    response = requests.get(url)
    print("Download complete")
    # Save the image to a local file
    file_name = str(img_name) + '.png'
    with open(file_name, mode='wb') as file:
        file.write(response.content)

if __name__ == '__main__':
    start = time.time()
    url_list = [
        'https://tse4-mm.cn.bing.net/th/id/OIP.866vRxQ8QvyDsrUuXiu7qwHaNK?w=182&h=324&c=7&o=5&pid=1.7',
        'https://tse2-mm.cn.bing.net/th/id/OIP.HUcWtoYPG-z2pu4ityajbAHaKQ?w=182&h=252&c=7&o=5&pid=1.7',
        'https://tse2-mm.cn.bing.net/th/id/OIP.MvncR0-Pt9hVxKTdrvD9dAHaNK?w=182&h=324&c=7&o=5&pid=1.7',
        'https://tse1-mm.cn.bing.net/th/id/OIP._nGloaeMWbL7NB7Lp6SnXQHaLH?w=182&h=273&c=7&o=5&pid=1.7',
    ]
    img_name = 1
    for item in url_list:
        download_image(item, img_name)
        img_name += 1
    end = time.time()
    print(end - start)
# Total time: 7.25s
Coroutine (asynchronous) download:
import aiohttp
import asyncio
import time

async def fetch(session, url):
    print("Sending request:", url)
    async with session.get(url, verify_ssl=False) as response:
        content = await response.content.read()
        file_name = url.rsplit('_')[-1]
        with open(file_name, mode='wb') as file_object:
            file_object.write(content)
        print("Download complete")

async def main():
    async with aiohttp.ClientSession() as session:
        url_list = [
            'https://www3.autoimg.cn/newsdfs/g26/M02/35/A9/120x90_0_autohomecar__ChsEe12AXQ6AOOH_AAFocMs8nzU621.jpg',
            'https://www3.autoimg.cn/newsdfs/g26/M02/35/A9/120x90_0_autohomecar__ChsEe12AXQ6AOOH_AAFocMs8nzU621.jpg',
            'https://www3.autoimg.cn/newsdfs/g26/M02/35/A9/120x90_0_autohomecar__ChsEe12AXQ6AOOH_AAFocMs8nzU621.jpg',
            'https://www3.autoimg.cn/newsdfs/g26/M02/35/A9/120x90_0_autohomecar__ChsEe12AXQ6AOOH_AAFocMs8nzU621.jpg',
        ]
        tasks = [asyncio.ensure_future(fetch(session, url)) for url in url_list]
        await asyncio.wait(tasks)

if __name__ == '__main__':
    start = time.time()
    asyncio.get_event_loop().run_until_complete(main())
    end = time.time()
    print(end - start)
# Result: 0.05s
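The speed-up does not depend on aiohttp itself; any awaitable IO wait overlaps the same way. A standard-library-only sketch, with a hypothetical fake_download coroutine standing in for the network request, shows four 1-second "downloads" completing in about 1 second total:

```python
import asyncio
import time

async def fake_download(name):
    # Stand-in for a network request: control is yielded during the sleep,
    # so all four "downloads" wait concurrently.
    await asyncio.sleep(1)
    return name

async def main():
    # gather preserves the order of its arguments in the result list.
    return await asyncio.gather(*(fake_download(i) for i in range(4)))

start = time.time()
results = asyncio.run(main())
print(results)                     # [0, 1, 2, 3]
print(round(time.time() - start))  # 1  (about 1 second total, not 4)
```
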