AWS DevOps CI/CD Pipelines
Introduction
This tutorial will guide you through setting up two CI/CD pipelines using AWS CodeBuild to automate the management of business rules in DecisionRules. The first pipeline will handle moving business rules from one DecisionRules space to another, even across different environments (e.g., from development to production). The second pipeline will focus on Restore-in-Time Recovery, allowing you to revert to a previous state of your business rules when necessary.
Prerequisites
For the following steps you will need access to:
AWS Account
AWS CodeBuild - used to orchestrate the migration of the rules
Amazon S3 - used to store backups of your rules
Setting up the migration pipeline
This pipeline will export rules from a selected source space and import them into a destination space.
Amazon S3 setup
First we need to set up the storage that will be used to back up the rules. In your AWS Console, execute the following steps:
Navigate to S3 -> Buckets
Click Create Bucket
Choose a name for the bucket
All the other settings can be left at their defaults
Click Create bucket
Now you should have everything you need in S3.
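If you prefer the command line, the same bucket can be created with the AWS CLI. A minimal sketch, assuming a placeholder bucket name and region (substitute your own; bucket names must be globally unique):

# create the backup bucket; outside us-east-1, also pass
# --create-bucket-configuration LocationConstraint=<your-region>
aws s3api create-bucket \
  --bucket my-decisionrules-backups \
  --region us-east-1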
AWS CodeBuild setup
Now that the backup storage is prepared, we can configure the migration pipeline itself. Execute the following steps:
Navigate to CodeBuild (using the search bar)
Navigate to Build projects
Click Create project
Fill in name, description, etc.
Buildspec - Insert build commands - Paste the following code:
version: 0.2
phases:
  install:
    commands:
      - echo "starting"
      - mkdir decisionRules
      - cd decisionRules
      - git clone https://github.com/decisionrules/decisionrules-cicd-tools.git
      - cd decisionrules-cicd-tools
      - npm install
      - mkdir export
      - echo "start export of SOURCE space"
      - npm run export export/export.json ${ENV_URL_SRC} ${SRC_ENV}
      - echo "end export of SOURCE space"
      - echo "start export of DESTINATION space"
      - npm run export export/exportDestination.json ${ENV_URL_DEST} ${DEST_ENV}
      - echo "end export of DESTINATION space"
      - echo "clear destination space"
      - npm run clear ${ENV_URL_DEST} ${DEST_ENV}
      - echo "start import"
      - npm run import export/export.json ${ENV_URL_DEST} ${DEST_ENV}
      - echo "end import"
      - echo "END"
artifacts:
  files:
    - 'decisionRules/decisionrules-cicd-tools/export/*'
  name: backup-$(date +%Y-%m-%dT%H:%M:%S)
  discard-paths: yes
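Before wiring these commands into CodeBuild, you can run the same tooling locally to verify your API URL and keys. A minimal sketch using the commands from the buildspec above (the URL and key are placeholders):

git clone https://github.com/decisionrules/decisionrules-cicd-tools.git
cd decisionrules-cicd-tools
npm install
mkdir export
# export the source space to a local file (replace the key with your
# source space's Management API key)
npm run export export/export.json https://api.decisionrules.io <SRC_MANAGEMENT_API_KEY>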
Under Environment - Environment variables, add the following:
BRANCH_NAME (plaintext)
Fill in the name of your storage bucket that will be used to store backups
ENV_URL_DEST and ENV_URL_SRC (plaintext)
URL of your DecisionRules API (e.g. https://api.decisionrules.io for the public cloud)
SRC_ENV (plaintext)
Management API key of the source space
DEST_ENV (plaintext)
Management API key of the destination space
Under Artifact 1 - Primary
Select Amazon S3
Fill in your bucket's name
Click Create build project
Open the newly created project and click Start build
After the build finishes successfully, the destination space should contain a copy of the rules from the source space, and backup files for both spaces should be available in the selected bucket.
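You can also start the build from the command line instead of the console. A minimal sketch, assuming the project was named decisionrules-migration (a hypothetical name; the bucket name is also a placeholder):

# trigger the migration pipeline
aws codebuild start-build --project-name decisionrules-migration
# afterwards, verify that the backup artifacts landed in the bucket
aws s3 ls s3://my-decisionrules-backups/ --recursive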
Setting up a rollback pipeline
This pipeline will restore your space to a previous state based on the backups created by the migration pipeline and stored in S3.
Amazon S3 setup
The storage bucket was already created for the previous pipeline. If you have not created it yet, follow the instructions above to create the bucket.
AWS CodeBuild setup
The setup closely resembles the setup of the previous pipeline
Navigate to CodeBuild (using the search bar)
Navigate to Build projects
Click Create project
Fill in name, description, etc.
Buildspec - Insert build commands - Paste the following code:
version: 0.2
phases:
  install:
    commands:
      - echo "starting"
      - mkdir decisionRules
      - cd decisionRules
      - git clone https://github.com/decisionrules/decisionrules-cicd-tools.git
      - cd decisionrules-cicd-tools
      - npm install
      - mkdir export
      - mkdir backup
      # download the selected backup file from S3; it keeps its original name
      - aws s3 cp s3://${BUCKET_NAME}/${FOLDER_NAME}/${FILE_NAME} backup/
      - echo "start export of DESTINATION space"
      - npm run export export/exportDestination.json ${ENV_URL_DEST} ${DEST_ENV}
      - echo "end export of DESTINATION space"
      - echo "clear destination space"
      - npm run clear ${ENV_URL_DEST} ${DEST_ENV}
      - echo "start import"
      # import the downloaded backup file
      - npm run import backup/${FILE_NAME} ${ENV_URL_DEST} ${DEST_ENV}
      - echo "end import"
      - echo "END"
artifacts:
  files:
    - 'decisionRules/decisionrules-cicd-tools/export/*'
  name: rollbackBackup-$(date +%Y-%m-%dT%H:%M:%S)
  discard-paths: yes
Under Environment - Environment variables, add the following:
BUCKET_NAME
Fill in the name of your storage bucket that is used to store backups
ENV_URL_DEST
URL of your DecisionRules API (e.g. https://api.decisionrules.io for the public cloud)
DEST_ENV
Management API key of the destination space
FOLDER_NAME
Path of the folder used to store the backup of the space that is rolled back (e.g. env-back-ups/Rollback-backup)
FILE_NAME
Name of the backup file to roll back to, e.g. destinationBackup-20250506T1220.json (it should be in the BUCKET_NAME bucket; you can list the available files as shown below)
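To find the exact FOLDER_NAME and FILE_NAME values, you can list the backups stored in the bucket with the AWS CLI. A minimal sketch, assuming the placeholder bucket name and the example folder from above:

aws s3 ls s3://my-decisionrules-backups/env-back-ups/Rollback-backup/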
Click Create build project
Running the pipeline
For this pipeline we want to choose, for each execution, the specific backup file to roll back to. This can be achieved in two ways:
Running the build manually ("Start build with overrides" button)
a. Open the project and click Start build with overrides
b. Under the environment variables override section, change the FILE_NAME and FOLDER_NAME variables to point to the required backup file
c. Click Start build
Running the build using the AWS CLI
a. Start the build from the command line with "aws codebuild start-build", using the --environment-variables-override parameter; a minimal example is shown below
After running the command you should be able to monitor the build progress in CodeBuild
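A minimal sketch of such an invocation, assuming the project was named decisionrules-rollback (a hypothetical name; substitute your own folder and file values):

aws codebuild start-build \
  --project-name decisionrules-rollback \
  --environment-variables-override \
    name=FOLDER_NAME,value=env-back-ups/Rollback-backup,type=PLAINTEXT \
    name=FILE_NAME,value=destinationBackup-20250506T1220.json,type=PLAINTEXT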