Oracle PL/SQL

How to call DecisionRules directly from Oracle PL/SQL

How to call DecisionRules Solver API from Oracle PL/SQL

Database setup

To execute the API call directly from Oracle PL/SQL we will take advantage of Oracle's built-in UTL_HTTP package. However, in a typical database a regular user will not have the necessary privileges to use this package.

You will need privileges to access the UTL_HTTP package and an Access Control List (ACL) set up.
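For illustration, a minimal privilege grant and ACL setup might look like the sketch below (run as a privileged user; MY_USER and the host are placeholders, and the syntax assumes Oracle 12c or later):

```sql
-- Run as a privileged user (e.g. SYS); MY_USER and the host are placeholders.
GRANT EXECUTE ON UTL_HTTP TO my_user;

BEGIN
  -- allow MY_USER to connect to and resolve the DecisionRules host
  DBMS_NETWORK_ACL_ADMIN.APPEND_HOST_ACE(
    host => 'api.decisionrules.io',
    ace  => xs$ace_type(privilege_list => xs$name_list('connect', 'resolve'),
                        principal_name => 'MY_USER',
                        principal_type => xs_acl.ptype_db));
END;
/
```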

For the purposes of a first test you can use the code in the files below, which create the MY_DR_INPUT table containing sample input data and the RESULTS table to store the data received from DecisionRules.

Below you can find all the code files necessary to set up and test your first API call.

Create command for RESULTS table
Create command for MY_DR_INPUT table

Showcase rule for testing the call; please import this rule into your DecisionRules space
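If you only want to see the shape of the tables without downloading the files, a minimal sketch might look like this (the column names are illustrative assumptions; the attached files contain the actual definitions):

```sql
-- Illustrative only; the attached files define the real sample tables.
CREATE TABLE my_dr_input (
  id    NUMBER PRIMARY KEY,  -- numeric key, later also used for chunking
  attr1 VARCHAR2(100),       -- sample input attributes consumed by the rule
  attr2 NUMBER
);

CREATE TABLE results (
  id     NUMBER,             -- id of the source row
  result VARCHAR2(4000)      -- value returned by DecisionRules
);
```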

How to set up the ACL is described in this article: https://oracle-base.com/articles/misc/utl_http-and-ssl#acl (first chapter, Access Control List (ACL)).

Using HTTP or HTTPS

The showcase script uses HTTP to call the DecisionRules API, which makes it easier to make your first call and verify that you have all the necessary permissions and prerequisites in place.

If you want to use HTTPS you will need to set up an Oracle Wallet with the proper SSL certificate, otherwise Oracle will not let you execute the call.

How to get the certificate and create the wallet is explained here: https://oracle-base.com/articles/misc/utl_http-and-ssl#acl (third chapter, Get Site Certificates, and fourth chapter, Create an Oracle Wallet Containing the Certificates).

Once you have the wallet set up you can change the call URL to https and uncomment the utl_http.set_wallet('file:<FILE LOCATION HERE>', NULL); command, providing the path to your newly created wallet.
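For example, assuming the api_url variable sketched later in this article (the wallet path is a placeholder for your own location):

```sql
-- switch the endpoint to HTTPS and point UTL_HTTP at your wallet
api_url := 'https://api.decisionrules.io/rule/solve/<RULE ID>';
utl_http.set_wallet('file:/u01/app/oracle/dr_wallet', NULL);
```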

Methods of processing

First we will take a look at how to set up a simple serial approach that processes multiple API calls one after another, and then we will expand on it with parallel API calls, which can be useful in cases where you need to evaluate very large sets of data.

Serial method

The DecisionRules API calls are made in batches using the Bulk input feature. You can set the size of the batches by changing the value of the batch_size variable.

Using the serial approach we execute the API calls one after another in a typical SQL fashion. Below we describe the necessary code in parts; the full script can be downloaded here.

You should change the following variables to suit your own case.

The code executes the calls batch by batch, one after another

You will need to provide your own API Key in the API_KEY variable
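As an illustration, the configurable part of the declaration section might look like this excerpt (the names follow the variables mentioned in this article; the URL and key are placeholders):

```sql
DECLARE
  -- placeholders: use your own rule endpoint and Solver API key
  api_url    VARCHAR2(200) := 'http://api.decisionrules.io/rule/solve/<RULE ID>';
  api_key    VARCHAR2(100) := '<YOUR API KEY HERE>';
  batch_size NUMBER        := 50;   -- rows sent per bulk API call
  l_payload  CLOB;                  -- request body built from MY_DR_INPUT
  l_response CLOB;                  -- response body returned by the API
```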

Transforming data to JSON format

The DecisionRules API accepts the input data in the form of JSON wrapped in a 'data' wrapper. To transform the data into the required JSON format we use Oracle's built-in functions (a sketch follows the list below):

  • JSON_OBJECT( key VALUE column,...) - creates a JSON object with specified keys and values

  • JSON_ARRAYAGG( values ) - creates a JSON array with elements specified in values
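Put together, building the bulk payload for one batch might look like this sketch (table and column names are the illustrative ones used above; l_start_id and l_end_id mark the batch boundaries):

```sql
-- build the {"data":[{...},{...}]} payload from one batch of input rows
SELECT JSON_OBJECT(
         'data' VALUE JSON_ARRAYAGG(
                        JSON_OBJECT('id'    VALUE id,
                                    'attr1' VALUE attr1,
                                    'attr2' VALUE attr2)
                        RETURNING CLOB)
         RETURNING CLOB)
INTO   l_payload
FROM   my_dr_input
WHERE  id BETWEEN l_start_id AND l_end_id;
```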

Depending on the size of your data you might choose to store your input and output data in either VARCHAR2 or CLOB variables. In the provided code we showcase the use of CLOB variables, which results in more complex code but works for large datasets.

VARCHAR2 variables are limited to 32 KB. CLOBs can typically store up to 2 GB.

Executing the API Call

To execute the call we use the UTL_HTTP package.

You will need to provide the API key of your own space to test the rule evaluation (the value is set in the API_KEY variable).

In this part we begin the request, set up the headers, and then load the data into the body. Note that the UTL_HTTP.write_text function has an input size limit of 32 KB, so we need to use a loop to accommodate larger inputs.

Finally we load the response into a prepared variable.
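A condensed sketch of this part (it assumes the declarations above plus req utl_http.req, resp utl_http.resp, an l_offset NUMBER, a VARCHAR2(32767) buffer l_buffer, and l_response initialized with DBMS_LOB.CREATETEMPORARY; the Bearer header follows the DecisionRules authentication scheme):

```sql
-- begin the request and set the headers
req := utl_http.begin_request(api_url, 'POST', 'HTTP/1.1');
utl_http.set_header(req, 'Authorization', 'Bearer ' || api_key);
utl_http.set_header(req, 'Content-Type', 'application/json');
utl_http.set_header(req, 'Transfer-Encoding', 'chunked');  -- body length unknown up front

-- write_text accepts at most 32 KB, so send the CLOB in pieces
l_offset := 1;
WHILE l_offset <= dbms_lob.getlength(l_payload) LOOP
  utl_http.write_text(req, dbms_lob.substr(l_payload, 32767, l_offset));
  l_offset := l_offset + 32767;
END LOOP;

-- read the whole response body into the prepared CLOB
resp := utl_http.get_response(req);
BEGIN
  LOOP
    utl_http.read_text(resp, l_buffer, 32767);
    dbms_lob.writeappend(l_response, LENGTH(l_buffer), l_buffer);
  END LOOP;
EXCEPTION
  WHEN utl_http.end_of_body THEN
    utl_http.end_response(resp);
END;
```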

Extracting data from the response

To extract the data from the call's response we use the JSON_TABLE function. This function takes a JSON object and transforms it into a table with columns specified in the COLUMNS clause, where you specify the name and data type of each resulting column followed by the path to the value in the JSON.

After transforming the data into table format you can use it to execute any DML statement you need; in our example we show a simple INSERT into the RESULTS table.
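A sketch of this step; the JSON paths ('$[*]', '$.id', '$.result') are assumptions about the sample rule's output shape, so adjust them to your own rule:

```sql
-- turn the JSON response into rows and store them in RESULTS
INSERT INTO results (id, result)
SELECT jt.id, jt.result
FROM   JSON_TABLE(l_response, '$[*]'
         COLUMNS (
           id     NUMBER        PATH '$.id',
           result VARCHAR2(100) PATH '$.result'
         )) jt;
COMMIT;
```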

Parallel approach

If your dataset is so large that the serial execution of the API calls takes too long, you can take advantage of the DBMS_PARALLEL_EXECUTE package to utilize the computation power of the server more efficiently and run several API calls at once.

We use one more table to store the final status of the parallel execution, which gives us the option to check for errors if any calls fail.

Table for logging the status of parallel execution
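A minimal sketch of such a table (illustrative only; the attached file contains the actual definition used by the script):

```sql
-- illustrative only; see the attached file for the real definition
CREATE TABLE parallel_execution_log (
  task_name VARCHAR2(128),
  status    VARCHAR2(30),
  logged_at TIMESTAMP DEFAULT SYSTIMESTAMP
);
```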

This approach requires some additional setup. First we transform our API call into a procedure that takes start_id and end_id as parameters and processes the API call for the set of rows delimited by these two values.
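The shape of that procedure might look as follows (the name process_dr_batch is an assumption for illustration; the attached file below contains the full body, which performs the same JSON build, UTL_HTTP call, and INSERT as the serial script, restricted to the given ID range):

```sql
-- signature sketch; see the attached file for the full body
CREATE OR REPLACE PROCEDURE process_dr_batch (
  start_id IN NUMBER,
  end_id   IN NUMBER
) AS
BEGIN
  -- build the JSON payload for rows WHERE id BETWEEN start_id AND end_id,
  -- execute the API call and insert the results (see the serial method)
  NULL;
END;
/
```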

Full code of the procedure, please insert your API key into the API_KEY variable

Script for parallel execution, you first need to create the procedure above, then you can run this code

Now we can use the DBMS_PARALLEL_EXECUTE package to execute several calls in parallel. We use the following functions of DBMS_PARALLEL_EXECUTE in particular (a usage sketch follows the list):

  • CREATE_TASK - creates a new task that will contain all the individual API calls

  • CREATE_CHUNKS_BY_NUMBER_COL - divides the table into chunks of a specified size by a column of data type NUMBER (each chunk will then be the input of one API call)

  • RUN_TASK - executes the task, running the provided SQL code for each chunk

  • RESUME_TASK - executes a task that finished with errors again

  • DROP_TASK - deletes the task once it is no longer needed

  • TASK_STATUS - function that returns the status of the task (finished, finished with errors, etc.)
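Put together, a driver block might look like this sketch (it assumes the hypothetical process_dr_batch procedure from above and chunks MY_DR_INPUT by its numeric ID column; the task name, chunk size, and parallel level are arbitrary):

```sql
DECLARE
  l_task   VARCHAR2(30) := 'DR_PARALLEL_CALLS';
  l_status NUMBER;
BEGIN
  dbms_parallel_execute.create_task(l_task);

  -- split MY_DR_INPUT into chunks of 50 rows by its NUMBER column ID
  dbms_parallel_execute.create_chunks_by_number_col(
    task_name    => l_task,
    table_owner  => USER,
    table_name   => 'MY_DR_INPUT',
    table_column => 'ID',
    chunk_size   => 50);

  -- each chunk runs one API call; :start_id/:end_id are bound per chunk
  dbms_parallel_execute.run_task(
    task_name      => l_task,
    sql_stmt       => 'BEGIN process_dr_batch(:start_id, :end_id); END;',
    language_flag  => dbms_sql.native,
    parallel_level => 4);

  -- retry once if some chunks failed, then clean up
  l_status := dbms_parallel_execute.task_status(l_task);
  IF l_status = dbms_parallel_execute.finished_with_error THEN
    dbms_parallel_execute.resume_task(l_task);
  END IF;
  dbms_parallel_execute.drop_task(l_task);
END;
/
```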
