Naios / continuable

C++14 asynchronous allocation aware futures (supporting then, exception handling, coroutines and connections)
https://naios.github.io/continuable/
MIT License

Question - Convenience with using executors #43

Closed: noisypants closed this issue 3 years ago

noisypants commented 3 years ago

@Naios Hello. I am trying to use custom executors to push work onto a specific thread. I have this working at the moment; however, I am trying to make it more convenient to use.

I will try to demonstrate what I would like to do with your examples.
In order to push the work to the right queue, I am capturing the identifier of the queue in the executor.

int queueId = 1;
auto executor = [queueId](auto&& work) {
   pushWorkToQueue(queueId, std::forward<decltype(work)>(work));
};

http_request("github.com")
   .then([](std::string github) {
     // Do something...
   }, executor);

This is working for me. However, I am curious whether I could write a factory method that creates the executor, something of the form:

template <class T>
auto makeQueueExecutor(int queueId) {
   return [queueId](T&& work) {      
      pushWorkToQueue(queueId, std::forward<decltype(work)>(work));
   };
}

http_request("github.com")
   .then([](std::string github) {
     // Do something...
   },  makeQueueExecutor(queueId));

This becomes tricky because the template parameter T can't be deduced from the call to makeQueueExecutor; the actual type of the work callable is only known later, when the library invokes the executor.

This is a contrived example. In practice the parameters I need to create the executor are more complex, and it would be convenient to have a factory.

Can you think of any way to support this?

Thanks for the help.

Naios commented 3 years ago

Hey,

sure, it should be possible. Maybe you can progress further with your solution if you rewrite your code like this:

auto makeQueueExecutor(int queueId) {
   return [queueId](auto&& work) {
      // The concrete work type is only known once the executor is invoked,
      // so recover it here if you need to name it:
      using T = std::remove_reference_t<decltype(work)>;
      pushWorkToQueue(queueId, std::forward<decltype(work)>(work));
   };
}
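
For what it's worth, a minimal self-contained sketch of the factory in use. pushWorkToQueue here is just a hypothetical stand-in that runs the work inline so the example compiles on its own; a real implementation would enqueue it onto your queue instead:

#include <cstdio>
#include <utility>

// Hypothetical stand-in for the queue API from above: it just runs the work
// inline so this sketch is self-contained.
template <class Work>
void pushWorkToQueue(int queueId, Work&& work) {
   std::printf("dispatching on queue %d\n", queueId);
   std::forward<Work>(work)();
}

// The suggested factory: a generic lambda defers deduction of the work type
// until the executor is actually invoked.
auto makeQueueExecutor(int queueId) {
   return [queueId](auto&& work) {
      pushWorkToQueue(queueId, std::forward<decltype(work)>(work));
   };
}

int main() {
   // In real code the executor would be passed as the second argument to .then(...),
   // e.g. http_request("github.com").then(callback, makeQueueExecutor(1));
   auto executor = makeQueueExecutor(1);
   executor([] { std::printf("work executed\n"); });
}
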
noisypants commented 3 years ago

Nice, that worked! Thanks for the quick response.