Wanted: experts in Twisted, Tornado, asyncore or other Python async APIs (greenlets/gevent, Stackless, libevent all welcome!). On firstname.lastname@example.org we're trying to hash out the async API of the future (for the Python stdlib), and we need input from expert users of the current generation of async APIs.
- Can you summarize how Worq differentiates itself from PEP 3153 (which is implemented in Python 3.2 and up)?Nov 1, 2012
- It seems that PEP 3153 is (maybe necessarily?) an abstract discussion of the components needed to implement an async library, but it has few details of what the API would actually look like. Some code examples would be helpful. Is ``concurrent.futures`` the implementation in 3.2 that you're referring to? PEP 3153 does not mention concurrent.futures, and the concurrent.futures docs (http://docs.python.org/dev/library/concurrent.futures.html) do not refer to that PEP. I really need to make my way over to python-ideas and do a bunch of reading; I'm sure most of these questions are answered there, but it's going to take me a while to catch up :)
The thing I like most about Worq is its very simple API for invoking asynchronous tasks. It tries hard to use reasonable defaults to keep the most common case simple, while making more complicated things possible, if a bit more verbose. In the simplest, and hopefully most common, case, invoking a task is just a function call:
deferred = q.task(*args, **kw)
# more complex
task = Task(q.task, **options)
deferred = task(*args, **kw)
The returned "deferred" object can be passed to another task as an argument, or one can wait for the real result to become available. The result-as-argument feature makes it easy to queue up a graph of tasks to be executed asynchronously.
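To make the result-as-argument idea concrete, here is a toy sketch (not Worq's actual implementation; `ToyQueue` and its behavior are invented for illustration) of a queue that resolves any deferred arguments before running a downstream task, which is what lets you wire up a graph of tasks:

```python
from concurrent.futures import ThreadPoolExecutor, Future

class ToyQueue:
    """Toy illustration of a Worq-like queue: a Future returned by one
    task can be passed directly as an argument to another task."""

    def __init__(self):
        # Two workers so a downstream task can wait on an upstream one.
        self._ex = ThreadPoolExecutor(max_workers=2)

    def task(self, fn, *args):
        def run():
            # Resolve any deferred arguments to their concrete results
            # before calling the task function.
            resolved = [a.result() if isinstance(a, Future) else a
                        for a in args]
            return fn(*resolved)
        return self._ex.submit(run)

q = ToyQueue()
d1 = q.task(lambda x: x * 2, 10)   # deferred result of the first task
d2 = q.task(lambda x: x + 1, d1)   # consumes d1's result when ready
print(d2.result())  # 21
```

The point is that the caller never blocks between the two `q.task` calls; only the worker running the second task waits for the first.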
Why does concurrent.futures.Executor provide a map() function when Python has moved toward recommending list comprehensions over map()? Worq does it like this:
results = [q.task(item) for item in items]
total = q.sum(results)
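For comparison, the same shape with stock concurrent.futures, written with a comprehension over submit() rather than Executor.map() (a sketch; `square` and `items` are made-up placeholders):

```python
from concurrent.futures import ThreadPoolExecutor

def square(x):
    return x * x

items = [1, 2, 3]

with ThreadPoolExecutor(max_workers=4) as ex:
    # Equivalent of ex.map(square, items), but as a comprehension,
    # which also hands you the Future objects themselves:
    futures = [ex.submit(square, item) for item in items]
    total = sum(f.result() for f in futures)

print(total)  # 14
```

One practical difference: map() yields results in input order and hides the Futures, while the comprehension gives you the Future objects to pass around or wait on selectively.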
Can I do that with a list of concurrent.future.Future objects? That is, can I pass a Future object as an argument to a task and have the task execution deferred until the Future result is available?
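As far as I can tell, the answer is no out of the box: a concurrent.futures.Future passed as an argument is just an ordinary object, and the executor does not defer the downstream task until it resolves. The chaining has to be done by hand, e.g. (a sketch; `task` is a placeholder function):

```python
from concurrent.futures import ThreadPoolExecutor

def task(x):
    return x + 1

with ThreadPoolExecutor(max_workers=2) as ex:
    f1 = ex.submit(task, 1)
    # The Future is NOT resolved automatically when passed along;
    # the caller must extract the concrete result itself (blocking
    # here) or arrange chaining via f1.add_done_callback(...).
    f2 = ex.submit(task, f1.result())

print(f2.result())  # 3
```

So the Worq behavior described above would need a compatibility layer on top of concurrent.futures rather than falling out of it directly.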
I also think Worq is interesting because it expands the scope of task execution beyond a single machine when using a queue backend such as Redis. Someone has encouraged me to implement a concurrent.futures-compatible API for Worq, but I haven't gotten around to that yet ;-)Nov 1, 2012
- Sorry, send email with a new subject to python-ideas if you want to discuss all that.Nov 1, 2012
- I don't know if anybody has noticed, but uvent (a gevent core based on libuv: https://github.com/saghul/uvent) is cross-platform and delivers high-performance I/O on Windows (not select) too.
You have said before that being cross-platform is crucial and gevent is not, so maybe this approach would be better?
Additionally, gevent itself would probably switch to libuv in the future.Dec 26, 2012
- Well, there is [Python Futures](http://code.google.com/p/pythonfutures/).
But if you really want the best for async, look to Node.js for inspiration. Don't just look at Node.js's standard library; check out the most popular packages on NPM.
It's not always comfortable looking at JS, but you can bet that a programming community built around an inherently async language will have a rich set of perspectives on API design.Feb 18, 2013
- Turning off comments here -- I want discussion on email@example.com (subscribe on mail.python.org), not on G+.Feb 18, 2013