Depending on the current configuration of your database, you might need to change some settings to be able to execute the necessary API calls. The setup should be as follows.
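In particular, the OLE Automation stored procedures used below (sp_OACreate, sp_OAMethod) are disabled by default and must be enabled with sp_configure:

-- Enable OLE Automation procedures (required by sp_OACreate / sp_OAMethod)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Ole Automation Procedures', 1;
RECONFIGURE;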
First we will take a look at how to set up a simple serial flow that processes multiple API calls one after another, and then we will expand on it using parallel API calls, which can be useful when you need to evaluate very large sets of data.
Method 1: Serial Approach
Using this method, if we run more than one API call, we execute them one after another.
First we need to set up a procedure which takes as input the ruleId, your API key, the body of the request, and the version, and returns the response JSON.
--DROP PROCEDURE decisionRulesSolve;
CREATE PROCEDURE decisionRulesSolve (
    @ruleId Varchar(50),
    @token Varchar(100),
    @body Varchar(max),
    @json Varchar(max) OUTPUT,
    @version int
)
AS
BEGIN
    -- DecisionRules URL
    -- If version is provided, append '/version_num' to the URL
    DECLARE @versionPath Varchar(20);
    IF @version IS NULL
        SET @versionPath = ''
    ELSE
        SET @versionPath = '/' + LTRIM(STR(@version));

    -- Set up the API key authentication
    IF @token IS NULL
        SET @token = 'Bearer <Default Token>'
    ELSE
        SET @token = 'Bearer ' + @token;

    IF @body IS NULL
        SET @body = '{"data": {}}';

    -- Construct the API endpoint
    DECLARE @URL NVARCHAR(MAX) = 'http://api.decisionrules.io/rule/solve/' + @ruleId + @versionPath;

    -- Variables for executing the API call
    DECLARE @Object as Int;
    DECLARE @ResponseText as Varchar(max);
    DECLARE @status as Varchar(max);
    DECLARE @statusText as Varchar(max);

    -- Temporary table for storing the parsed response
    -- (storing the response into a table circumvents a possible error if the response is large)
    DECLARE @tempTable table ( json_table varchar(max) );

    -- API call setup
    -- Initiate the object
    EXEC sp_OACreate 'MSXML2.XMLHTTP', @Object OUT;
    -- Specify the type of call
    EXEC sp_OAMethod @Object, 'open', NULL, 'post', @URL, 'False'
    -- Set the Bearer token header
    EXEC sp_OAMethod @Object, 'setRequestHeader', NULL, 'Authorization', @token
    -- Set the Content-Type header
    EXEC sp_OAMethod @Object, 'setRequestHeader', NULL, 'Content-type', 'application/json'
    -- Send the call, with data in the body
    EXEC sp_OAMethod @Object, 'send', NULL, @body
    -- Store the call's response
    EXEC sp_OAMethod @Object, 'responseText', @ResponseText OUTPUT
    -- Get the status
    EXEC sp_OAGetProperty @Object, 'status', @status OUT
    -- Get the status text
    EXEC sp_OAMethod @Object, 'statusText', @statusText OUTPUT

    INSERT INTO @tempTable EXEC sp_OAMethod @Object, 'responseText'
    -- Load the response text into the output variable
    SET @json = (SELECT * FROM @tempTable);

    EXEC sp_OADestroy @Object
END;
Now let's take a look at the main script that takes the data from the database, sends it to DecisionRules, and stores the output. We will describe the code below in parts, but here is the full file.
Load data from database
In the process we first load the data, potentially from several tables, into one table variable. The structure of the input JSON is then set according to the aliases of the columns (a simple alias will create a simple JSON field, but you can create a nested object using dot notation, e.g. "root.nestedValue").
DECLARE @json table (
    Row# int,
    promoCode varchar(20),
    period Varchar(20),
    productType Varchar(20),
    first_name varchar(20),
    last_name varchar(20)
);
BEGIN
    -- Fetch the input data into the @json variable
    -- Mapping for the input JSON is done through column aliases
    -- A simple alias 'example' results in { "example": columnValue }
    -- An alias separated by dots 'example.data' results in the nested JSON object { "example": { "data": columnValue } }
    -- The column row# has to be included to allow separation of the data into batches
    INSERT INTO @json
    SELECT ROW_NUMBER() OVER (ORDER BY promoCode DESC, period, productType) row#,
           tab.promoCode 'promoCode',
           tab.period 'period',
           tab.productType 'productType',
           'John' 'name.first_name',
           'Doe' 'name.last_name'
    FROM dbo.my_DR_input tab
Transform data to JSON
To transform the data to JSON we use the FOR JSON PATH clause, which creates a JSON object from the table with a structure according to the column aliases. We can also specify the ROOT('data') clause, which wraps the input JSON in the necessary wrapper automatically.
SET @currentBatch = (SELECT * FROM @json
                     WHERE row# BETWEEN @lastBatchStart AND @lastBatchStart + @batchSize - 1
                     FOR JSON PATH, ROOT('data'));
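For illustration, here is a minimal, self-contained query (the values are made up) showing how the dot-separated aliases nest the output:

-- Illustrative only: dot-separated aliases produce nested JSON objects
SELECT 'SUMMER10' AS 'promoCode',
       'John'     AS 'name.first_name',
       'Doe'      AS 'name.last_name'
FOR JSON PATH, ROOT('data');
-- yields: {"data":[{"promoCode":"SUMMER10","name":{"first_name":"John","last_name":"Doe"}}]}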
Execute the API call
To get the result of an API call, we use the decisionRulesSolve procedure that we created. The response of the call is then returned through the @json OUTPUT variable.
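A minimal sketch of such a call (the rule ID and API key are placeholders to be replaced with your own values):

DECLARE @response varchar(max);
EXEC decisionRulesSolve
    @ruleId  = '<your-rule-id>',        -- placeholder rule ID
    @token   = '<your-solver-api-key>', -- placeholder Solver API key
    @body    = @currentBatch,           -- batch built above with FOR JSON PATH
    @json    = @response OUTPUT,
    @version = NULL;                    -- NULL omits the version segment from the URL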
To get the data from the response JSON you can use the OPENJSON function, where in the WITH clause you specify the names and data types of the fields you want to extract, and their location in the JSON.
The output of this function can then be used for any DML statements, in our example an INSERT into the RESULTS table.
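A minimal sketch, assuming (hypothetically) that the solved rule returns promoCode and discount fields and that a dbo.RESULTS table with matching columns exists:

-- Parse the response array and insert one row per solved result
INSERT INTO dbo.RESULTS (promoCode, discount)
SELECT promoCode, discount
FROM OPENJSON(@response)               -- @response holds the solver output
WITH (
    promoCode varchar(20) '$.promoCode',
    discount  int         '$.discount'
);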
Method 2: Parallel Approach

This approach will not work in SQL Server Express Edition, since Express does not include SQL Server Agent.
To execute API calls in parallel we will take advantage of jobs and SQL Server Agent. Unfortunately, there is no package that serves the same purpose as DBMS_PARALLEL_EXECUTE does in Oracle, so the code is a little more complicated in this case.
All we need to do is convert the previous code into a procedure that loads only the rows specified by the input parameters.
CREATE PROCEDURE processDRChunk ( @lastBatchStart int, @batchSize int )
AS
BEGIN
    DECLARE ...
    SET @currentBatch = (SELECT * FROM msdb.dbo.my_DR_input
                         WHERE row# BETWEEN @lastBatchStart AND @lastBatchStart + @batchSize - 1
                         FOR JSON PATH, ROOT('data'));
    -- Rest of the code ---
END
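For orientation, the elided part might look roughly like this sketch, reusing decisionRulesSolve and the OPENJSON insert from the serial example (the rule ID is a placeholder):

-- A possible body for the elided section, based on the serial example
DECLARE @API_KEY varchar(100) = '<your-solver-api-key>';  -- see the note below
DECLARE @response varchar(max);
EXEC msdb.dbo.decisionRulesSolve
    @ruleId  = '<your-rule-id>',   -- placeholder rule ID
    @token   = @API_KEY,
    @body    = @currentBatch,
    @json    = @response OUTPUT,
    @version = NULL;
-- Parse @response with OPENJSON and INSERT into your results table, as in the serial example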
Please provide your Solver API key in the API_KEY variable
Now we need to create a script that sets up and runs jobs for the entire input, divided into chunks of a predefined size.
DECLARE @cnt int;
-- size of one batch
DECLARE @chunkSize int = 1000;
-- variable that stores the current parallel command
DECLARE @cmd varchar(1000);
-- prefix for all created jobs
DECLARE @jobPrefix varchar(7) = 'MyDRJob';
-- variable to store the current job name
DECLARE @jobName varchar(100);
DECLARE @iter int = 1;
DECLARE @runningJobsCount int = 0;
-- maximum parallel calls
DECLARE @parallelLevel int = 4;
BEGIN
    -- count of inputs
    SET @cnt = (SELECT count(*) FROM msdb.dbo.my_dr_input);
    print @cnt

    WHILE @iter <= @cnt
    BEGIN
        -- prepare the command
        SET @cmd = 'set textsize -1 exec msdb.dbo.processDRChunk @lastBatchStart = ' + str(@iter) + ' , @batchSize = ' + str(@chunkSize) + ';'
        -- prepare the job name
        SET @jobName = @jobPrefix + trim(str(@iter));
        print 'setting up job: ' + @jobName

        -- If the maximum number of jobs is already running, then wait
        WHILE @runningJobsCount >= @parallelLevel
        BEGIN
            print 'Waiting for others to finish, currently running: ' + str(@runningJobsCount)
            WAITFOR DELAY '00:00:01'
            SET @runningJobsCount = (SELECT count(*) FROM msdb.dbo.sysjobs WHERE name LIKE @jobPrefix + '%');
        END;

        -- Create a job that deletes itself after successful execution
        exec sp_add_job @job_name = @jobName, @delete_level = 1;
        -- Add the command to a job step
        exec sp_add_jobstep @job_name = @jobName, @step_name = 'First', @command = @cmd;
        -- specify the server
        exec sp_add_jobserver @job_name = @jobName;
        -- Start the job
        exec sp_start_job @job_name = @jobName;
        print 'job ran: ' + @jobName

        SET @iter = @iter + @chunkSize
        SET @runningJobsCount = (SELECT count(*) FROM msdb.dbo.sysjobs WHERE name LIKE @jobPrefix + '%');
    END
END
We need to explicitly limit how many jobs run in parallel by setting the @parallelLevel variable. Set this variable according to your DecisionRules server specifications.
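Note that the loop above only throttles job creation; if you want the script to block until the last chunks are processed, you can wait for the self-deleting jobs to disappear, for example:

-- Wait until all jobs with our prefix have finished (they delete themselves on success)
WHILE (SELECT count(*) FROM msdb.dbo.sysjobs WHERE name LIKE 'MyDRJob%') > 0
    WAITFOR DELAY '00:00:01';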