I am designing a REST API to be implemented in Flask. It is a simple in-house module that will be accessed very occasionally by a few people. Most of its services are trivial CRUD operations – add this row to this database table, delete that one – but I’ve discovered that one of them is really computationally heavy and takes around 15 minutes to finish. Therefore, it can’t be wrapped in a synchronous REST API call, because the HTTP request would time out.
This suggests splitting it into three services:
- enqueueHeavyRequest(params) -> returns request_id
- checkHeavyRequestStatus(request_id) -> QUEUED, RUNNING, FINISHED
- getHeavyRequestResults(request_id) -> results (if finished)
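In other words, something like the following in-memory job store (just a sketch of the shape I have in mind – a real version would presumably persist jobs to a database so they survive restarts):

```python
import uuid

# In-memory job store: request_id -> job record.
# A background worker would pick up QUEUED jobs, set them to RUNNING,
# and store the result when done.
jobs = {}

def enqueueHeavyRequest(params):
    """Register a new heavy job and return its id."""
    request_id = str(uuid.uuid4())
    jobs[request_id] = {"status": "QUEUED", "params": params, "result": None}
    return request_id

def checkHeavyRequestStatus(request_id):
    """Return QUEUED, RUNNING, or FINISHED."""
    return jobs[request_id]["status"]

def getHeavyRequestResults(request_id):
    """Return the result if finished, otherwise None."""
    job = jobs[request_id]
    return job["result"] if job["status"] == "FINISHED" else None
```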
I would also like to ensure that:
- only one such heavy request is processed at a time (otherwise the server would get overloaded)
- this heavy processing does not run as a thread of the web server hosting the Flask app, but ideally as a separate process
How would you implement this, and which tools/libraries would you use? I guess some kind of queue manager is needed, possibly communicating with the web server via a local SQLite database or something?
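For the two constraints above, I wonder if the stdlib `concurrent.futures.ProcessPoolExecutor` would already be enough: `max_workers=1` means at most one job runs at a time, and it runs in a worker process separate from the web server. A rough sketch of what I mean (function names are my own placeholders, and the `"fork"` start method is POSIX-only):

```python
import multiprocessing
import uuid
from concurrent.futures import ProcessPoolExecutor

# One worker process => at most one heavy job runs at a time,
# and it runs outside the web server process.
executor = ProcessPoolExecutor(
    max_workers=1, mp_context=multiprocessing.get_context("fork")
)
futures = {}  # request_id -> Future

def heavy_task(params):
    # Stand-in for the real 15-minute computation.
    return sum(params)

def enqueueHeavyRequest(params):
    request_id = str(uuid.uuid4())
    futures[request_id] = executor.submit(heavy_task, params)
    return request_id

def checkHeavyRequestStatus(request_id):
    fut = futures[request_id]
    if fut.done():
        return "FINISHED"
    return "RUNNING" if fut.running() else "QUEUED"

def getHeavyRequestResults(request_id):
    fut = futures[request_id]
    return fut.result() if fut.done() else None
```

The obvious downside is that the `futures` dict lives in one Flask process, so this breaks if the app is served by multiple workers – maybe that's why a proper queue manager plus a shared database is the usual answer?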
submitted by /u/pachura3