Replies: 2 comments 3 replies
- We would need adjustments for OCP providers:
- Hah, I love it.
- Regarding yesterday's discussion about async AI providers
From the Nextcloud server side, all requests look like this:
https://github.com/nextcloud/server/blob/fdc64ea2f527d25c382901ed906f71fca89fd1b3/lib/public/TextProcessing/Task.php#L98-L107
From our implementation side, without changing NC Core, we can make it asynchronous in the provider implementations, but we will have to add a table that tracks requests by ID.
How we can do it by changing only AppAPI:
1. We create a separate table for each implemented provider type:
   id: bigint | finished: bool | error: string | result: depends on provider
2. We add a route in AppAPI for each provider type to receive the results.
3. During "process" we assign each request its own ID and send the request to the ExApp; after that we poll in a loop until the record for this ID has "finished" set to true.
4. We rewrite this part to take the task result from our new DB table instead of from the response:
   https://github.com/cloud-py-api/app_api/blob/f169bae7b4a3088a6991b0482f1a333ade836f33/lib/Service/TextProcessingService.php#L198-L206
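The dispatch-and-poll flow described above could be sketched roughly like this. The real code would be PHP inside AppAPI; this is a hypothetical Python model where an in-memory dict stands in for the proposed DB table, and all function names are illustrative, not actual AppAPI APIs:

```python
import time
import uuid

# Hypothetical in-memory stand-in for the proposed per-provider-type table:
# id | finished | error | result
task_table: dict = {}

def dispatch_to_exapp(task_id: str, payload: str) -> None:
    """Stand-in for forwarding the request to the ExApp: just create the row.
    In reality the ExApp would reply later via the new result route."""
    task_table[task_id] = {"finished": False, "error": "", "result": None}

def receive_result(task_id: str, result, error: str = "") -> None:
    """What the proposed AppAPI result route would do when the ExApp reports back."""
    task_table[task_id].update(finished=True, error=error, result=result)

def process(payload: str, timeout: float = 5.0, poll_interval: float = 0.05):
    """Assign the request an ID, send it to the ExApp, then poll the table
    until the record for this ID has finished == True."""
    task_id = uuid.uuid4().hex
    dispatch_to_exapp(task_id, payload)
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        row = task_table[task_id]
        if row["finished"]:
            if row["error"]:
                raise RuntimeError(row["error"])
            return row["result"]
        time.sleep(poll_interval)
    raise TimeoutError(f"task {task_id} did not finish within {timeout}s")
```

The polling loop keeps the existing synchronous "process" contract toward NC Core while the actual work happens asynchronously in the ExApp.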
This proposed solution would let ExApp providers easily implement a queue: on receiving a request, the ExApp adds the request data to its queue and returns an empty HTTP response, for example.
After the ExApp finishes processing a request, it stores the result using the route added in Point 2.
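The ExApp side of that queue idea might look roughly like this hypothetical Python sketch, where `report_result_to_appapi` stands in for an authenticated HTTP call to the result route from Point 2, and uppercasing the payload stands in for real processing:

```python
import queue
import threading

task_queue: queue.Queue = queue.Queue()
reported: dict = {}  # stands in for AppAPI's result table on the server side

def handle_request(task_id: str, payload: str) -> dict:
    """HTTP handler: enqueue the request data, return an empty response at once."""
    task_queue.put((task_id, payload))
    return {}  # empty HTTP response; processing continues in the background

def report_result_to_appapi(task_id: str, result: str) -> None:
    """Stand-in for an HTTP call to the result route AppAPI would expose."""
    reported[task_id] = result

def worker() -> None:
    """Background worker: drain the queue and report each finished result."""
    while True:
        item = task_queue.get()
        if item is None:  # shutdown sentinel
            break
        task_id, payload = item
        report_result_to_appapi(task_id, payload.upper())  # dummy "processing"
        task_queue.task_done()
```

Returning immediately from the handler keeps the Nextcloud-facing request short, while the worker thread controls how fast queued tasks are actually processed.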
Any thoughts or criticism?