Monitor your Workflow activity using tools in the Blink platform
When creating a Workflow, a Workflow type must be selected (On-Demand, Event-Based, or Scheduled). Depending on which Workflow type is selected, different run methods are available. These run methods enable users to easily integrate with Blink from external systems.
Running a Workflow using a REST HTTP call. This method can be used to run Workflows from other tools or from the command line, as in the following example, which uses the cURL utility. Any utility that can make an HTTP call can be used.
1
Hover over the Workflow
Hover over the Workflow you want to run.
2
Click the Overview button
A blue ‘Run’ button and a grey ‘Overview’ button will appear. Select the ‘Overview’ button and navigate to the run methods section of the overview.
3
Select the REST API tab
Navigate to the REST API tab.
4
Add your personal API key
Under the Add my personal API key section, choose a personal API Key from the dropdown menu. Your personal API key will automatically be added to the command.
5
Run the command in your CLI
Copy the command and paste it into your CLI and your workflow will run.
To supply input parameters named x and a (with values y and b) to the Workflow execution, add the following flag to the command:
-d '{"x":"y", "a":"b"}'
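Putting the pieces together, a complete invocation might look like the following sketch. The webhook URL and the API-key header name here are placeholders; copy the exact command from your Workflow's Overview > REST API tab.

```shell
# Placeholder URL and header name -- use the values shown in the
# Workflow's Overview > REST API tab.
curl -X POST "https://app.blinkops.com/api/v1/webhooks/<WEBHOOK_ID>" \
  -H "BLINK-API-KEY: <YOUR_PERSONAL_API_KEY>" \
  -H "Content-Type: application/json" \
  -d '{"x":"y", "a":"b"}'
```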
Synchronous REST API invocation
It is possible to execute the Workflow synchronously and wait for it to finish by adding the run_sync parameter and, optionally, a timeout parameter, while still passing input parameters in the request body as before. For example:
-d '{"x":"y", "a":"b"}'
If the Workflow succeeds, the response contains the output parameters as JSON. If not, an error is returned.
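A synchronous call might look like the sketch below. The placement of run_sync and timeout as query parameters is an assumption based on the description above, and the URL and header name are placeholders; copy the exact command from the Overview tab.

```shell
# Sketch only: run_sync and timeout shown as query parameters is an
# assumption; URL and header name are placeholders.
curl -X POST "https://app.blinkops.com/api/v1/webhooks/<WEBHOOK_ID>?run_sync=true&timeout=60" \
  -H "BLINK-API-KEY: <YOUR_PERSONAL_API_KEY>" \
  -H "Content-Type: application/json" \
  -d '{"x":"y", "a":"b"}'
```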
Note: To use the Blink CLI as a run method, you need to have Docker installed on your computer.
1
Hover over the Workflow
Hover over the Workflow you want to run.
2
Click the Overview button
A blue ‘Run’ button and a grey ‘Overview’ button will appear. Select the ‘Overview’ button and navigate to the run methods section of the overview.
3
Select the Blink CLI tab
Navigate to the Blink CLI tab.
4
Add your personal API key
Under the Add my personal API key section, choose a personal API Key from the dropdown menu. Your personal API key will automatically be added to the command.
5
Run the Workflow from your CLI
Copy the command and paste it into your CLI and your Workflow will run.
Blink CLI - Use Case Example
This command downloads a container image with blink-cli and runs blink-cli inside Docker. By modifying the --input parameters, you can pass Workflow input parameters.
The execution queue manages how many workflows can run concurrently across the Blink platform. The tenant-level execution queue allows a maximum of 50 workflows to run concurrently across all workspaces in the tenant. When this limit is reached, additional workflows are queued and run as capacity becomes available, typically after a short delay of a few seconds. While this may introduce a delay, all queued executions are still processed, ensuring consistent performance and system stability even during peak activity.
Note: The limit is shared across all workspaces in the tenant. When multiple workflows are queued across different workspaces in the tenant, Blink cycles through the workspaces in turn, giving each one a fair opportunity to run its next batch of workflows. This helps ensure balanced resource usage across the entire tenant.
Workflows that include Interactive Actions, Web Forms, or the Wait action with a timeout greater than 60 seconds are not immediately added to the execution queue. Instead, they are placed in a pending state while waiting for input or a response from an external user. These pending workflows do not count toward the execution concurrency limit. Once a response is received, if the execution queue has available capacity, the workflow proceeds to execute immediately. If the queue is full at that moment, the workflow enters the queue and waits its turn to execute.
When using one workflow within another, known as a subflow, you have the option to run it asynchronously. Running a subflow asynchronously can improve overall execution speed, as it allows the subflow to run independently of other steps in the main workflow.
For more information, refer to the Subflow section of the documentation.
Event-based Workflows provide an easy way to execute a workflow when an event happens on an external system. To achieve this, the external system must perform an HTTP POST to the custom webhook’s URL. Other Event-Based Workflows function by periodically polling the external system to check for new events.
Custom Syslog offers real-time event triggering in the Blink system when a Syslog event occurs, by receiving events from an external Syslog client provider. Custom Syslog is a service that converts Syslog messages into webhooks and activates the webhook with the content of the Syslog message as the payload.
Syslog Message Conversion
Custom Syslog performs the conversion of Syslog messages to webhooks in the following way:
Standard Syslog Message (before conversion):
<30>Jul 20 10:00:00 app-server-01 kernel: Out of memory: Kill process 12345 (app) score 678 and restart
Converted Syslog Message (after being processed by the Custom Syslog service):
{
  "timestamp": "Jul 20 10:00:00",
  "hostname": "app-server-01",
  "source": "kernel",
  "message": "Out of memory: Kill process 12345 (app) score 678 and restart"
}
Custom Syslog - Deployment and Configuration
1
Create an Event-based Workflow with Custom Webhook
When creating a new Workflow, choose Event-based > Custom Webhook. You will see the Webhook URL, which you will use in the docker run command later on.
2
Copy the Webhook URL into the Docker command
Copy the Webhook URL from the Event-Based Workflow you created and paste it in the following docker run command:
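The exact docker run command, including the image reference, is provided in the Blink console; the sketch below only illustrates the shape of such a command. The image name and the environment-variable name are hypothetical placeholders.

```shell
# Hypothetical image reference and environment-variable name -- use the
# exact command provided in the Blink console. Port 514 (UDP and TCP)
# is exposed so Syslog clients can reach the container.
docker run -d \
  -p 514:514/udp -p 514:514/tcp \
  -e WEBHOOK_URL="<your webhook URL>" \
  <blink-custom-syslog-image>
```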
3
Deploy the Syslog container and run the Docker command
Deploy the Syslog Docker container on any infrastructure of your choice and run the docker command specified above. This could be on-premises hardware, cloud-based virtual machines, or managed services like Amazon ECS or Google Kubernetes Engine.
4
Expose the Syslog container to clients
Ensure that the Syslog Docker container is exposed to the Syslog client. Make sure the necessary ports (UDP and TCP 514) are open to receive messages.
5
Configure your Syslog client to send messages
Configure your Syslog client to send messages to the IP address of the machine where the Syslog Docker container is deployed.
Once the Docker container is running and the Syslog client is configured, the Docker container will start receiving Syslog events. Each received Syslog event will trigger the associated workflow.
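As a quick end-to-end check, you can emit a one-off test message from any Linux host using the util-linux logger utility. The host address below is a placeholder for the machine running the Syslog container.

```shell
# Send a test message over UDP port 514 to the Custom Syslog container.
# Replace syslog-host with the address of the machine where the
# container is deployed.
logger --server syslog-host --port 514 --udp \
  --tag kernel "Out of memory: Kill process 12345 (app) score 678 and restart"
```

If everything is wired up, this message should trigger the associated Workflow with a payload like the converted example above.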
Blink Azure EventHub offers real-time event triggering in the Blink system when an Azure EventHub event is ingested. Blink Azure EventHub is a service that converts Azure EventHub events into webhooks and activates the webhook with the content of the event data as the payload.
EventHub - Deployment and Configuration
To enable real-time event triggering in the Blink system using Azure EventHub, follow these steps:
1
Create an Event-based Workflow with Custom Webhook
When creating a new workflow, choose Event-based > Custom Webhook. You will see the Webhook URL and API Key, which you will use in the docker run command later on.
2
Construct your full Webhook URL
Copy the Webhook URL and the API Key from the event-based automation you created. To create your full webhook URL, combine them as follows: your_webhook_url = <Webhook URL>?apikey=<API Key>
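For example, the combination can be done in the shell before pasting the result into the docker command. The URL and key values below are placeholders; substitute the ones shown in your Workflow's Overview tab.

```shell
# Placeholder values -- substitute the Webhook URL and API Key shown in
# your Workflow's Overview tab.
WEBHOOK_URL="https://app.blinkops.com/api/v1/webhooks/abc123"
API_KEY="my-api-key"

# Combine them as described above: <Webhook URL>?apikey=<API Key>
FULL_URL="${WEBHOOK_URL}?apikey=${API_KEY}"
echo "$FULL_URL"
```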
3
Get your Azure EventHub connection string
To acquire your ‘EventHub Connection String’:
Log in to the Azure portal.
In the EventHub menu, select the EventHub you want to use.
Under ‘Settings’, click on ‘Shared access policies’.
Create a new SAS policy and select the ‘Listen’ permission.
Copy the ‘Connection string–primary key’ from the new SAS policy.
4
Get your Azure Blob connection string and blob name
To acquire your ‘Blob Connection String’ and ‘Blob Name’:
Log in to the Azure portal.
In the Storage Accounts menu, select the account you want to use.
Under ‘Security + Networking’, click on ‘Access Keys’ and copy one of the ‘Connection Strings’.
Under ‘Data Storage’, click on ‘Containers’ and create a new container with the ‘Blob Name’ you want to use.
5
Paste your credentials into the Docker command
Paste your credentials into the following docker run command:
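The exact docker run command, including the image reference, is provided in the Blink console; the sketch below only illustrates the shape of such a command. The image name and the environment-variable names are hypothetical placeholders for the credentials gathered in steps 2-4.

```shell
# Hypothetical image reference and environment-variable names -- use the
# exact command provided in the Blink console.
docker run -d \
  -e WEBHOOK_URL="<your full webhook URL>" \
  -e EVENTHUB_CONNECTION_STRING="<EventHub connection string>" \
  -e BLOB_CONNECTION_STRING="<Blob connection string>" \
  -e BLOB_NAME="<blob container name>" \
  <blink-eventhub-image>
```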
6
Deploy the container and run the Docker command
Deploy the Azure EventHub Docker container on any infrastructure of your choice and run the docker command above. This could be on-premises hardware, cloud-based virtual machines, or managed container services like Amazon ECS or Google Kubernetes Engine.
7
Ensure HTTPS connectivity for the container
Ensure that your Azure EventHub container has access to make and receive HTTPS requests.
1
Docker Container Starts Listening
Once the Docker container is running, it will start listening for Azure EventHub events. Each received event will trigger the associated automation.
EventHub Message Example
An example of the data that the automation will trigger on:
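The exact payload depends on the events your producers send. As an illustrative sketch only, a webhook payload sourced from an EventHub event might carry the event body together with standard EventHub metadata such as the enqueued time and sequence number (field names here are assumptions, not the guaranteed schema):

```json
{
  "body": { "message": "example event" },
  "enqueuedTimeUtc": "2024-07-20T10:00:00Z",
  "partitionKey": null,
  "sequenceNumber": 12345
}
```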
Polling: Events Occurring on an External Service Provider
There are no manual run methods available from the Blink platform for this type of Workflow. Instead, the Workflow is automatically triggered when the polling mechanism detects new events from the external system.
1
Create a polling-based Workflow
When creating a new Workflow, select ‘Event-based’ and then choose a relevant event from the list.
2
View the polling trigger details
The Run Methods section will display the polling frequency or condition defined by the event trigger.