Add max execution time for Workflows (#755)

* 🎉 basic setup and execution stopping

* 🚧 soft timeout for own process executions

* 🚧 add hard timeout for subprocesses

* 🚧 add soft timeout to main thread

* 🔧 set default timeout to 5 mins --> 500s

* 💡 adding documentation to configs

* 🚧 deactivate timeout by default

* 🚧 add logic of max execution timeout

* adding timeout to settings in frontend and server

* 🎨 improve naming

* 💡 fix change in config docs

* ✔️ fixing compilation issue

* 🎨 add format for new config variables

* 👌 type cast before checking equality

* Improve error message if NodeType is not known

* 🐳 Also tag rpi latest image

* 🐛 Fix Postgres issue with Node.js 14 #776

* 🚧 add toggle to activate workflow timeout

* 💄 improving UX of setting a timeout and its duration

Co-authored-by: Jan Oberhauser <jan.oberhauser@gmail.com>
Ben Hesseldieck authored on 2020-07-29 14:12:54 +02:00, committed by GitHub
parent 6e06da99fb
commit 051598d30e
13 changed files with 232 additions and 25 deletions


@@ -159,6 +159,30 @@ const config = convict({
env: 'EXECUTIONS_PROCESS'
},
// A workflow times out and gets canceled after this time (seconds).
// If the workflow is executed in the main process, a soft timeout
// is applied (it takes effect after the current node finishes).
// If a workflow is running in its own process, a soft timeout is
// tried first; the process is killed only after waiting an
// additional fifth of the given timeout duration.
//
// To deactivate the timeout, set it to -1.
//
// The timeout is currently not activated by default; this will
// change in a future version.
timeout: {
doc: 'Max run time (seconds) before stopping the workflow execution',
format: Number,
default: -1,
env: 'EXECUTIONS_TIMEOUT'
},
maxTimeout: {
doc: 'Max execution time (seconds) that can be set for a workflow individually',
format: Number,
default: 3600,
env: 'EXECUTIONS_TIMEOUT_MAX'
},
// If a workflow executes, all the data gets saved by default. This
// could be a problem when a workflow gets executed a lot and processes
// a lot of data. To not exceed the database's capacity it is possible to