Running Workflows
When creating a Workflow, a Workflow type must be selected (On-Demand, Event-Based, or Scheduled). Depending on which type is selected, the Workflow has different run methods available. These run methods let users easily integrate with Blink from external systems.
On-Demand and Scheduled Workflows
Run methods for On-Demand and Scheduled Workflows include:
- Manual: Using the User Interface.
- URL: The Workflow URL can be copied and used in Postman or another tool. For authentication, add a BLINK-API-KEY header.
- CLI: Using the Blink CLI.
- REST API: Invoking an HTTP request.
The default run method for a Scheduled Workflow is the frequency set when creating the Workflow.
In the Workflow Overview page, you can see all the Run Method options displayed.
Manually Running a Workflow
A Workflow can be run from the Workflows screen.
Hover over the Workflow you want to run.
A blue "Run" button and a grey "Overview" button will appear.
If required, you will be prompted to enter input parameters.
Click the "Run" button.
URL
The URL method can be used to run the Workflow from your browser.
Hover over the Workflow you want to run.
A blue "Run" button and a grey "Overview" button will appear.
Select the "Overview" button and navigate to the run methods section of the overview.
Navigate to the URL tab.
Copy and paste the URL into the browser address bar. A dialog box will open.
Enter the Workflow's input parameters.
Click Run Workflow to start the Workflow execution.
REST API
Running a Workflow using a REST HTTP call. This method can be used to run Workflows from other tools or from the command line. The example below uses the cURL utility, but any tool that can make an HTTP call will work.
Hover over the Workflow you want to run.
A blue "Run" button and a grey "Overview" button will appear.
Select the "Overview" button and navigate to the run methods section of the overview.
Navigate to the REST API tab.
Under the Add my personal API key section, choose a personal API Key from the dropdown menu. Your personal API key will automatically be added to the command.
Copy the command, paste it into your CLI, and your Workflow will run.
REST API Example:
curl -XPOST -H 'BLINK-API-KEY: <api-key>' -H 'Content-Type: application/json' https://app.blinkops.com/api/v1/workspace/aeb40442-ff12-4071-be15-b0709e59030e/playbooks/f6a68c7f-e585-4f8c-b8b2-f8a82192f4e/execute
To supply input parameters (here named x and a) to the Workflow execution, add a JSON body:
-d '{"x":"y", "a":"b"}'
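Putting the endpoint and the parameter body together, a complete call might look like the sketch below. The workspace and workflow IDs are the example values from the command above, and the API key is a placeholder; the assembled command is printed for review rather than executed (drop the echo to actually send the request).

```shell
#!/bin/sh
# Example IDs copied from the command above; <api-key> is a placeholder.
BASE_URL="https://app.blinkops.com/api/v1"
WORKSPACE_ID="aeb40442-ff12-4071-be15-b0709e59030e"
WORKFLOW_ID="f6a68c7f-e585-4f8c-b8b2-f8a82192f4e"
PAYLOAD='{"x":"y", "a":"b"}'
EXECUTE_URL="$BASE_URL/workspace/$WORKSPACE_ID/playbooks/$WORKFLOW_ID/execute"

# Print the assembled command; remove the leading echo to run it for real.
echo curl -XPOST -H "BLINK-API-KEY: <api-key>" \
  -H "Content-Type: application/json" \
  -d "$PAYLOAD" "$EXECUTE_URL"
```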
Synchronous REST API invocation
It is possible to execute the Workflow synchronously and wait for it to finish by adding the run_sync parameter and, optionally, a timeout parameter. For example (here run_sync and timeout are passed alongside the input parameters):
-d '{"x":"y", "a":"b", "run_sync": true, "timeout": 60}'
If the Workflow succeeds, the response is a JSON object containing the output parameters; otherwise, an error is returned.
Blink CLI
To use the Blink CLI as a run method, you need to have Docker installed on your computer.
Hover over the Workflow you want to run.
A blue "Run" button and a grey "Overview" button will appear.
Select the "Overview" button and navigate to the run methods section of the overview.
Navigate to the Blink CLI tab.
Under the Add my personal API key section, choose a personal API Key from the dropdown menu. Your personal API key will automatically be added to the command.
Copy the command and paste it into your CLI and your Workflow will run.
Blink CLI Example:
docker run --rm -it blinkops/blink-cli playbooks execute --id f6a68c7f-e585-4f8c-b8b2-f8a8219c2f4e --workspace "Demo Workspace" --inputs '{"name": "value"}' --hostname "https://app.blinkops.com" --api-key <api-key>
- This command downloads the blink-cli container and runs blink-cli inside Docker. By modifying the --inputs parameter you can pass Workflow input parameters.
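The command above can be wrapped in a small script so the API key is read from the environment instead of being pasted inline. This is a sketch: the workflow ID, workspace name, and inputs are the example values from above, and BLINK_API_KEY is an environment variable name of our own choosing. The assembled command is printed for review; run "$@" by itself to actually execute it.

```shell
#!/bin/sh
# Hypothetical wrapper around the Blink CLI container example above.
WORKFLOW_ID="f6a68c7f-e585-4f8c-b8b2-f8a8219c2f4e"
WORKSPACE="Demo Workspace"
INPUTS='{"name": "value"}'

# Assemble the command as positional parameters to keep the quoting intact.
set -- docker run --rm -it blinkops/blink-cli playbooks execute \
  --id "$WORKFLOW_ID" \
  --workspace "$WORKSPACE" \
  --inputs "$INPUTS" \
  --hostname "https://app.blinkops.com" \
  --api-key "${BLINK_API_KEY:-<api-key>}"

echo "$@"   # print for review; replace this line with plain "$@" to run it
```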
Running Workflows Steps asynchronously
When using a Workflow as a subflow in another Workflow, you can opt to run it asynchronously. This speeds up execution, since subflows do not wait on the execution of other steps. For more details, refer to Subflow Actions.
Running Event-based Workflows
Event-based Workflows provide an easy way to execute a Workflow when an event happens on an external system. To achieve this, the external system must perform an HTTP POST to the custom webhook's URL. Other Event-Based Workflows function by periodically polling the external system to check for new events.
Custom Syslog
Custom Syslog offers real-time event triggering in the Blink system when a Syslog event occurs, by receiving events from an external Syslog client provider.
Custom Syslog is a service which converts Syslog messages into webhooks and activates the webhook with the content of the Syslog message as the payload.
Syslog Message Conversion
Custom Syslog performs the conversion of Syslog messages to webhooks in the following way:
Standard Syslog Message (before conversion):
<30>Jul 20 10:00:00 app-server-01 kernel: Out of memory: Kill process 12345 (app) score 678 and restart
Converted Syslog Message (after being processed by the Custom Syslog service):
{
  "timestamp": "Jul 20 10:00:00",
  "hostname": "app-server-01",
  "source": "kernel",
  "message": "Out of memory: Kill process 12345 (app) score 678 and restart"
}
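For reference, the `<30>` prefix on the raw message is the syslog PRI value, computed as facility × 8 + severity (per RFC 3164); 30 decodes to facility 3 (daemon) and severity 6 (informational). The sketch below reassembles the sample line from its parts; the commented-out logger invocation at the end shows how a test message could hypothetically be sent to the container (the hostname is a placeholder).

```shell
#!/bin/sh
# Reassemble the sample syslog line shown above.
# PRI 30 = facility 3 (daemon) * 8 + severity 6 (informational).
PRI=30
TIMESTAMP="Jul 20 10:00:00"
HOST="app-server-01"
TAG="kernel"
MSG="Out of memory: Kill process 12345 (app) score 678 and restart"
LINE="<$PRI>$TIMESTAMP $HOST $TAG: $MSG"
echo "$LINE"

# Hypothetical test send with util-linux logger (placeholder hostname):
# logger -n syslog-host.example.com -P 514 --udp -t "$TAG" "$MSG"
```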
Syslog Deployment and Configuration
To enable real-time event triggering in the Blink system using Custom Syslog, follow these steps:
When creating a new Workflow, choose Event-based > Custom Webhook.
- You will see the Webhook URL, which you'll use in the docker run command later on.
Copy the Webhook URL from the Event-Based Workflows you created and paste it in the following docker run command:
docker run -d --name blink-syslog -p 514:514/udp -p 514:514/tcp -e WEBHOOK_URL=<your_webhook_url> --restart unless-stopped blinkops/blink-syslog:1.0.224
Deploy the Syslog Docker container on any infrastructure of your choice and run the docker command that's specified above. This could be on-premises hardware, cloud-based virtual machines, or even managed container services like Amazon ECS, Google Kubernetes Engine, etc.
Ensure that the Syslog Docker container is exposed to the Syslog client. Make sure the necessary ports (UDP and TCP 514 by default for Syslog) are open to receive Syslog messages from the clients.
Configure your Syslog client to send messages to the IP address of the machine where the Syslog Docker container is deployed.
Once the Docker container is running and the Syslog client is configured, the Docker container will start receiving Syslog events. Each received Syslog event will trigger the associated Workflow.
Azure EventHub
Blink Azure EventHub offers real-time event triggering in the Blink system when an Azure EventHub event is ingested.
Blink Azure EventHub is a service which converts Azure EventHub events into webhooks and activates the webhook with the content of the event data as the payload.
EventHub Message Example
An example of the data that the automation will trigger on:
{
  "Body": "eyJEYXRhIjogIk15IERhdGEifQ==",
  "ContentType": "application/json",
  "CorrelationID": "23:16:08",
  "MessageID": "b43d80eb-deea-4cab-bcd5-e78e62fe4fef",
  "Properties": {
    "MyNumber": 123,
    "MyProperty": "MyValue"
  }
}
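The Body field arrives base64-encoded. Decoding the sample value above recovers the original event payload:

```shell
#!/bin/sh
# Decode the sample Body from the event above (base64 -d is in GNU coreutils).
BODY="eyJEYXRhIjogIk15IERhdGEifQ=="
DECODED=$(printf '%s' "$BODY" | base64 -d)
echo "$DECODED"   # → {"Data": "My Data"}
```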
EventHub Deployment and Configuration
To enable real-time event triggering in the Blink system using Azure EventHub, follow these steps:
When creating a new Workflow, choose Event-based > Custom Webhook.
- You will see the Webhook URL and API Key, which you'll use in the docker run command later on.
Copy the Webhook URL and the API Key from the Event-Based Workflow you created. To create your full webhook URL, combine them as follows:
your_webhook_url = <Webhook URL>?apikey=<API Key>
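The combination above can be scripted; the Webhook URL and API Key below are placeholders, to be replaced with the values copied from your own Workflow:

```shell
#!/bin/sh
# Placeholder values; substitute the Webhook URL and API Key from your Workflow.
WEBHOOK_URL="https://app.blinkops.com/api/v1/webhooks/example-webhook-id"
API_KEY="example-api-key"
FULL_WEBHOOK_URL="${WEBHOOK_URL}?apikey=${API_KEY}"
echo "$FULL_WEBHOOK_URL"
```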
To acquire your EventHub Connection String:
- Log in to the Azure portal.
- In the EventHub menu, select the EventHub you want to use.
- In the left-hand menu, under Settings click on Shared access policies.
- Create a new SAS policy and select the Listen permission. Copy the Connection string–primary key for the newly created SAS policy.
To acquire your Blob Connection String and Blob Name:
- Log in to the Azure portal.
- In the storage accounts menu, select the account you want to use.
- In the left-hand menu, under Security + Networking click on Access Keys.
- Copy one of the Connection strings shown.
- In the left-hand menu, under Data storage click on Containers.
- Create a new container with the Blob Name you want to use.
Paste your credentials into the following docker run command:
docker run -d --name blink-azure-eventhub -e "WEBHOOK_URL=<your_webhook_url>" -e "EVENTHUB_CONNECTION_STRING=<your_eventhub_connection_string>" -e "BLOB_CONNECTION_STRING=<your_blob_connection_string>" -e "BLOB_NAME=<your_blob_name>" blinkops/blink-azure-eventhub:1.241205-15
Deploy the Azure EventHub Docker container on any infrastructure of your choice and run the docker command that's specified above. This could be on-premises hardware, cloud-based virtual machines, or even managed container services like Amazon ECS, Google Kubernetes Engine, etc.
Ensure that your Azure EventHub container has access to make and receive HTTPS requests.
Once the Docker container is running, it will start listening for Azure EventHub events. Each received event will trigger the associated Workflow.
Event occurring on an external service provider
No run methods are available from the Blink platform; the Workflow is triggered when the polling job finds new events on the external system.
- When creating a new Workflow, select Event-based > Select an event.
Run methods displays the event or schedule specified when setting up the trigger. For example: Automatically will search for new events every 5 minutes.