Previously
In my previous blog I wrote about how to deploy an API connection that allows MSI authorization from an Azure Logic App. Today we will dive deeper into how to deploy a Logic App with Bicep. This can be quite a hassle, because an Azure Logic App definition is written in JSON and Bicep files are not: every single double quote (") needs to be removed or replaced, which doesn't make it an enjoyable process.
Luckily there is an easy way, so let's look at what we can do to prepare and eventually deploy it via Bicep!
Preparing your Azure Logic App
In this example I will use my Logic App from one of my earlier blogs for scaling my database up/down during working hours.
When you are in the Azure portal, you will find the Export template option under the Automation category in the Logic App blade.
This allows you to export your Azure Logic App as a fully deployable ARM template. Now I know what you're thinking: this isn't Bicep. But hold on! For your convenience I'll share the ARM template:
{
"$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"workflows_La_DB_Scaler_name": {
"defaultValue": "La-DB-Scaler",
"type": "String"
}
},
"variables": {},
"resources": [
{
"type": "Microsoft.Logic/workflows",
"apiVersion": "2017-07-01",
"name": "[parameters('workflows_La_DB_Scaler_name')]",
"location": "westeurope",
"identity": {
"type": "SystemAssigned"
},
"properties": {
"state": "Enabled",
"definition": {
"$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
"contentVersion": "1.0.0.0",
"parameters": {},
"triggers": {
"Trigger_before_and_after_business_hours": {
"recurrence": {
"frequency": "Week",
"interval": 1,
"schedule": {
"hours": [
"7",
"19"
],
"minutes": [
0
],
"weekDays": [
"Monday",
"Tuesday",
"Wednesday",
"Thursday",
"Friday"
]
},
"timeZone": "W. Europe Standard Time"
},
"evaluatedRecurrence": {
"frequency": "Week",
"interval": 1,
"schedule": {
"hours": [
"7",
"19"
],
"minutes": [
0
],
"weekDays": [
"Monday",
"Tuesday",
"Wednesday",
"Thursday",
"Friday"
]
},
"timeZone": "W. Europe Standard Time"
},
"type": "Recurrence"
}
},
"actions": {
"Check_if_business_hours": {
"actions": {
"ScaleUp_Database_to_S1": {
"runAfter": {},
"type": "Http",
"inputs": {
"authentication": {
"audience": "https://management.azure.com/",
"type": "ManagedServiceIdentity"
},
"body": {
"location": "West Europe",
"sku": {
"name": "S1",
"tier": "Standard"
}
},
"headers": {
"Content-Type": "application/json"
},
"method": "PUT",
"uri": "https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Sql/servers/{serverName}/databases/{databaseName}?api-version=2021-02-01-preview"
}
}
},
"runAfter": {},
"else": {
"actions": {
"ScaleDown_Database_to_B1": {
"runAfter": {},
"type": "Http",
"inputs": {
"authentication": {
"audience": "https://management.azure.com/",
"type": "ManagedServiceIdentity"
},
"body": {
"location": "West Europe",
"sku": {
"name": "B1",
"tier": "Standard"
}
},
"headers": {
"Content-Type": "application/json"
},
"method": "PUT",
"uri": "https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Sql/servers/{serverName}/databases/{databaseName}?api-version=2021-02-01-preview"
}
}
}
},
"expression": {
"and": [
{
"equals": [
"@formatDateTime(convertFromUtc(utcNow(),'W. Europe Standard Time'),'HH')",
7
]
}
]
},
"type": "If"
}
},
"outputs": {}
},
"parameters": {}
}
}
]
}
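If you would rather script the export than click through the portal, the Azure CLI can produce the same template. A minimal sketch, assuming a placeholder resource group name (YourResourceGroup) and the Logic App name from this example:
workflowId=$(az resource show --resource-group YourResourceGroup --name La-DB-Scaler --resource-type "Microsoft.Logic/workflows" --query id --output tsv)
az group export --name YourResourceGroup --resource-ids $workflowId > La-DB-Scaler.json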
Now that you have the ARM code available, save it to a file with the .json extension.
For the next part we will be using Visual Studio Code with the Bicep extension. This gives us IntelliSense and autocompletion when writing Bicep files, which will save you a lot of time! We will also need to install the Bicep CLI, which allows you to convert ARM templates to Bicep files. While this is a neat feature, it comes with some warnings, as it will not always produce clean Bicep files, but for converting the Logic App JSON to Bicep it shouldn't be an issue at all.
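If you do not have the Bicep CLI yet, it can be installed straight from the Azure CLI. A quick sketch (the exact version reported will differ):
az bicep install
az bicep version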
By going to Terminal > New Terminal, we can run the command that converts our Logic App code:
az bicep decompile --file YourFileName.json
When converting the ARM template we will receive the following:
param workflows_La_DB_Scaler_name string = 'La-DB-Scaler'
resource workflows_La_DB_Scaler_name_resource 'Microsoft.Logic/workflows@2017-07-01' = {
name: workflows_La_DB_Scaler_name
location: 'westeurope'
identity: {
type: 'SystemAssigned'
}
properties: {
state: 'Enabled'
definition: {
'$schema': 'https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#'
contentVersion: '1.0.0.0'
parameters: {}
triggers: {
Trigger_before_and_after_business_hours: {
recurrence: {
frequency: 'Week'
interval: 1
schedule: {
hours: [
'7'
'19'
]
minutes: [
0
]
weekDays: [
'Monday'
'Tuesday'
'Wednesday'
'Thursday'
'Friday'
]
}
timeZone: 'W. Europe Standard Time'
}
evaluatedRecurrence: {
frequency: 'Week'
interval: 1
schedule: {
hours: [
'7'
'19'
]
minutes: [
0
]
weekDays: [
'Monday'
'Tuesday'
'Wednesday'
'Thursday'
'Friday'
]
}
timeZone: 'W. Europe Standard Time'
}
type: 'Recurrence'
}
}
actions: {
Check_if_business_hours: {
actions: {
ScaleUp_Database_to_S1: {
runAfter: {}
type: 'Http'
inputs: {
authentication: {
audience: 'https://management.azure.com/'
type: 'ManagedServiceIdentity'
}
body: {
location: 'West Europe'
sku: {
name: 'S1'
tier: 'Standard'
}
}
headers: {
'Content-Type': 'application/json'
}
method: 'PUT'
uri: 'https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Sql/servers/{serverName}/databases/{databaseName}?api-version=2021-02-01-preview'
}
}
}
runAfter: {}
else: {
actions: {
ScaleDown_Database_to_B1: {
runAfter: {}
type: 'Http'
inputs: {
authentication: {
audience: 'https://management.azure.com/'
type: 'ManagedServiceIdentity'
}
body: {
location: 'West Europe'
sku: {
name: 'B1'
tier: 'Standard'
}
}
headers: {
'Content-Type': 'application/json'
}
method: 'PUT'
uri: 'https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Sql/servers/{serverName}/databases/{databaseName}?api-version=2021-02-01-preview'
}
}
}
}
expression: {
and: [
{
equals: [
'@formatDateTime(convertFromUtc(utcNow(),\'W. Europe Standard Time\'),\'HH\')'
7
]
}
]
}
type: 'If'
}
}
outputs: {}
}
parameters: {}
}
}
As you can see it parsed everything properly and did all the cumbersome changes that we would otherwise have had to make by hand. But this is still a very static Bicep file and not very modular at all.
We will now have to make the necessary changes. So let's look at what we might need to change!
Modules and parameters
For a proper Bicep deployment, we will need to create our module and main file. We will first create a Logic App module which can be used for all your Logic App deployments, independent of the logic you will have in it.
First up, create a new LogicApp.bicep file in Visual Studio Code. This can be done by going to File > New File and saving it with the .bicep extension, but there are more ways to do this.
When you have your Logic App module file, add the following code:
param location string
param logicAppName string
param workflowParameters object
param workflowDefinition object
resource LogicApp 'Microsoft.Logic/workflows@2019-05-01' = {
name: logicAppName
location: location
identity: {
type: 'SystemAssigned'
}
properties: {
state: 'Enabled'
definition: workflowDefinition
parameters: workflowParameters
}
}
output LogicAppMSI string = LogicApp.identity.principalId
The above Bicep code is completely parameterized and gives us a lot of flexibility in our Logic App deployments. It also outputs the principalId of the system-assigned identity, which we need for the MSI role assignment on the SQL server in this example.
Now for the Role Assignment, create a SQLRoleAssignment.bicep file with the following code:
param subscriptionID string
@allowed([
'b24988ac-6180-42a0-ab88-20f7382dd24c' //Contributor
'8e3af657-a8ff-443c-a75c-2fe8c4bcb635' //Owner
'acdd72a7-3385-48ef-bd42-f606fba81ae7' //Reader
])
param roleID string
param principalID string
param principalType string
param serverName string
resource scope 'Microsoft.Sql/servers@2021-11-01-preview' existing = {
name: serverName
}
resource roleAssignment 'Microsoft.Authorization/roleAssignments@2020-08-01-preview' = {
name: guid(subscriptionID, principalID, roleID)
scope: scope
properties: {
roleDefinitionId: '/providers/Microsoft.Authorization/roleDefinitions/${roleID}'
principalId: principalID
principalType: principalType
}
}
The above Bicep code allows us to apply a role assignment to a scope. In this case the scope is our existing SQL server, for which we provide the name. Besides that, we only allow three specific roles in our role assignment: the Contributor, Owner and Reader roles.
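The GUIDs in the @allowed list are the IDs of the built-in Azure roles. If you want to double-check them, or allow additional roles, you can look them up with the Azure CLI, for example:
az role definition list --name "Contributor" --query "[].name" --output tsv
This should return b24988ac-6180-42a0-ab88-20f7382dd24c, the same ID we allow for the Contributor role.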
Main bicep file
Now that we have two modules ready to use, they need to be called and provided with the necessary parameters to deploy correctly.
For this, create a new Bicep file called main.bicep with the following code:
param subscriptionID string = subscription().subscriptionId
param resourcegroup string = resourceGroup().name
param location string = resourceGroup().location
param environment string
param logicAppName string = 'La-DB-Scaler-${environment}'
param serverName string = 'sql-cloudshift-${environment}'
param dbName string = 'sqldb-cloudshift-${environment}'
param roleID string = 'b24988ac-6180-42a0-ab88-20f7382dd24c'
param principalType string = 'ServicePrincipal'
param LogicAppWorkflowParameters object = {}
param LogicAppWorkflowDefinition object = {
'$schema': 'https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#'
contentVersion: '1.0.0.0'
parameters: {}
triggers: {
Trigger_before_and_after_business_hours: {
recurrence: {
frequency: 'Week'
interval: 1
schedule: {
hours: [
'7'
'19'
]
minutes: [
0
]
weekDays: [
'Monday'
'Tuesday'
'Wednesday'
'Thursday'
'Friday'
]
}
timeZone: 'W. Europe Standard Time'
}
evaluatedRecurrence: {
frequency: 'Week'
interval: 1
schedule: {
hours: [
'7'
'19'
]
minutes: [
0
]
weekDays: [
'Monday'
'Tuesday'
'Wednesday'
'Thursday'
'Friday'
]
}
timeZone: 'W. Europe Standard Time'
}
type: 'Recurrence'
}
}
actions: {
Check_if_business_hours: {
actions: {
ScaleUp_Database_to_S1: {
runAfter: {}
type: 'Http'
inputs: {
authentication: {
audience: 'https://management.azure.com/'
type: 'ManagedServiceIdentity'
}
body: {
location: 'West Europe'
sku: {
name: 'S1'
tier: 'Standard'
}
}
headers: {
'Content-Type': 'application/json'
}
method: 'PUT'
uri: 'https://management.azure.com/subscriptions/${subscriptionID}/resourceGroups/${resourcegroup}/providers/Microsoft.Sql/servers/${serverName}/databases/${dbName}?api-version=2021-02-01-preview'
}
}
}
runAfter: {}
else: {
actions: {
ScaleDown_Database_to_B1: {
runAfter: {}
type: 'Http'
inputs: {
authentication: {
audience: 'https://management.azure.com/'
type: 'ManagedServiceIdentity'
}
body: {
location: 'West Europe'
sku: {
name: 'B1'
tier: 'Standard'
}
}
headers: {
'Content-Type': 'application/json'
}
method: 'PUT'
uri: 'https://management.azure.com/subscriptions/${subscriptionID}/resourceGroups/${resourcegroup}/providers/Microsoft.Sql/servers/${serverName}/databases/${dbName}?api-version=2021-02-01-preview'
}
}
}
}
expression: {
and: [
{
equals: [
'@formatDateTime(convertFromUtc(utcNow(),\'W. Europe Standard Time\'),\'HH\')'
7
]
}
]
}
type: 'If'
}
}
outputs: {}
}
module LogicApp 'modules/LogicApp.bicep' = {
name: logicAppName
params: {
location: location
logicAppName: logicAppName
workflowParameters: LogicAppWorkflowParameters
workflowDefinition: LogicAppWorkflowDefinition
}
}
module SQLRoleAssignment 'modules/SQLRoleAssignment.bicep' = {
name: serverName
params: {
subscriptionID: subscriptionID
roleID: roleID
principalID: LogicApp.outputs.LogicAppMSI
principalType: principalType
serverName: serverName
}
dependsOn: [
LogicApp
]
}
A lot is happening in the above code, so let's break it down bit by bit. First up are the parameters, in which we specify a lot of information dynamically, such as the subscriptionID, the resource group name and the location to which everything needs to be deployed.
A few other parameters are "hardcoded", since these will be static for this deployment, but they are made semi-dynamic by including the environment parameter in their naming. The LogicAppWorkflowParameters parameter remains empty for this example, but this is where you would add your API connections if you use those (see my previous blog).
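As a rough, hypothetical sketch of what that parameter could look like when your Logic App uses an API connection (the connection name azuresql and the managed API name sql are placeholders here, so treat this as an illustration rather than the exact format for your connector):
param LogicAppWorkflowParameters object = {
  '$connections': {
    value: {
      azuresql: {
        connectionId: resourceId('Microsoft.Web/connections', 'azuresql')
        connectionName: 'azuresql'
        id: subscriptionResourceId('Microsoft.Web/locations/managedApis', location, 'sql')
      }
    }
  }
}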
The LogicAppWorkflowDefinition is the big star here: it contains all the workflow logic in a single parameter. If you have multiple Logic Apps, this could become an array with multiple workflow definitions, and the same goes for the naming of the Logic App(s).
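If you do go down that route, a minimal sketch of the pattern could look like the following, assuming a hypothetical logicApps array parameter that pairs each name with its workflow definition, and replacing the single LogicApp module call shown further down:
param logicApps array = [
  {
    name: logicAppName
    definition: LogicAppWorkflowDefinition
  }
]
module LogicApps 'modules/LogicApp.bicep' = [for app in logicApps: {
  name: app.name
  params: {
    location: location
    logicAppName: app.name
    workflowParameters: LogicAppWorkflowParameters
    workflowDefinition: app.definition
  }
}]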
Following the parameters are the modules we created earlier. We call them from a folder named modules, to keep everything nice and tidy, and provide them with the necessary information (parameters) for their configuration.
As you can see, we created a dependency between the SQLRoleAssignment and the LogicApp module. This is so that the role assignment can only occur after the Logic App is created; otherwise the required principalID would not be available yet.
Deployment
Last but not least, we still need to deploy these Bicep files. I have written many times about how to do this in a YAML pipeline, but for convenience I will include the YAML code here as well for a deployment to a single environment:
trigger:
branches:
include: # Collaboration branch
- master
paths:
exclude:
- CICD/*
variables:
azureServiceConnection: 'YourServiceConnection'
resourceGroupName: 'YourResourceGroupNameMinusEnvironment'
location: 'westeurope'
templateFile: 'main.bicep'
pool:
vmImage: ubuntu-latest #windows-latest #macOS-latest
stages:
###################################
# Deploy Dev environment
###################################
- stage: DeployDev
displayName: Deploy Dev
jobs:
- job: Deploy
steps:
- task: AzureCLI@2
displayName: 'Deploy Bicep'
inputs:
azureSubscription: $(azureServiceConnection)
scriptType: bash
scriptLocation: inlineScript
inlineScript: |
az group create --name "$(resourceGroupName)-DEV" --location $(location)
az deployment group create --resource-group "$(resourceGroupName)-DEV" --template-file $(templateFile) --parameters environment='DEV'
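Before handing this off to the pipeline, you can also test the Bicep locally from a terminal. A minimal sketch, assuming you are logged in with az login and substitute your own resource group name (rg-cloudshift-DEV is just a placeholder):
az group create --name rg-cloudshift-DEV --location westeurope
az deployment group what-if --resource-group rg-cloudshift-DEV --template-file main.bicep --parameters environment='DEV'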
What's next?
Next time we will dive back into YAML and look at how we can make a neat deployment script for a DTAP environment. Stay tuned for next week's blog!