🧩 Using OIC3 To Integrate Azure Blob Storage and On-Premises Files - Part 2

A brief guide to integrating on-premises files and Azure Storage using OIC3


🧭 Overview

We have multiple on-premises servers hosting file shares that are used for integrating data between systems. We also have Azure storage, both our own and third-party accounts, used for integrating file-based data.

We now have an Oracle cloud-based solution which uses Oracle Integration Cloud version 3 (OIC3) to import and export data. We would like to leverage our investment in Oracle to integrate data between our file systems and Azure storage.

This second part will explain how to move files from an Azure Blob Container to an on-premises folder.


📌 What This Post Covers

  • ✅ Configuring an on-premises file connection from OIC3
  • ✅ Configuring an Azure Blob Storage connection from OIC3
  • ✅ Configuring a REST connection to access the Azure Storage API from OIC3
  • ✅ Handling the XML data returned by the Azure Storage REST API
  • ✅ Utilizing those connections to read and write data

⚠️ Configuring an on-premises Oracle agent is out of scope for this post. The process for setting up an agent will be covered in its own post.


🧱 High-Level Approach

In this solution we will get data from an Azure blob storage container hosted by a third party and write that data to an on-premises folder. There will be no data mapping or enrichment, to keep us focused on the key points of this article.


🛠️ Technical Solution

Step 1 of 4: Create A Connection To An On-Premises File System

For this demo we'll start with a new OIC3 project.


Click the Add button in the Connections section and then scroll down to the File option.


You have both Trigger and Invoke options (Trigger allows the integration to start based on activity detected by the connector, while Invoke allows you to perform actions with the connector during the integration). For this integration we are going to choose just Invoke. Fill in the other fields with suitably descriptive values and, when everything is set correctly, click the Create button.


You'll notice there aren't many options for this connection type. In fact, the one option we can set - Access Type - needs changing: select Connectivity Agent.


Then click the Associate Agent Group button to select the on-premises agent appropriate to your environment. Once selected, click Use and then Test and Save the connection.

Step 2 of 4: Create A Connection To Azure Blob Storage

Now we repeat the process of adding a connection, this time selecting Azure Storage.


Add in the identifying data and click Create again.


The Azure connector supports only one type of security with Azure, but fortunately it is one of the strongest. You will need to provide:

  • Storage Account Name - the name of the storage account as specified in Azure (no need to use the URI or ID, just the lowercase name)
  • Tenant ID - the GUID of the Azure tenant you are connecting to
  • Client ID - the GUID of the client application as registered with the tenant's Entra ID identity management service
  • Client Secret - the password (secret) associated with the Client application


Once set up, click the Test and then the Save button.

Step 3 of 4: Create A Connection To The Azure Storage API Via REST

Unfortunately, at the time of writing this article the OIC3 adapter for Azure Storage is missing a really useful function: it is unable to list the blobs in a blob storage container. To work around this gap we need to call the Azure Storage REST API directly, using the same credentials as our Azure Storage connection.

Create a new REST connection as Invoke only and configure it like this:


  • Properties:
    • Connection type - set to REST API Base URL because we are going to access resources under one common API
    • Connection URL - set to the base URL of the API without any paths to specific resources (for blob storage this is typically https://<storage account name>.blob.core.windows.net)
  • Security:
    • Security policy - set to OAuth Client Credentials to use the same credentials as our main Azure Storage connection
    • Access Token URI - this is generally at login.microsoftonline.com, followed by your Tenant ID and then /oauth2/v2.0/token, and is used to issue a temporary access token based on your Client ID
    • Client ID - use the same as the main Azure connector
    • Client Secret - also the same as the main Azure connector

This will create a connection which will automatically request an OAuth token and use it to access the API.
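To make the moving parts concrete, here is a minimal Python sketch of the client-credentials exchange the connection performs for you. It is useful for verifying your Tenant ID, Client ID and Client Secret outside OIC3; the requests library, the placeholder GUIDs and the Azure Storage scope are my assumptions, not OIC3 internals:

```python
import requests

TENANT_ID = "00000000-0000-0000-0000-000000000000"  # placeholder Tenant ID
CLIENT_ID = "11111111-1111-1111-1111-111111111111"  # placeholder Client ID
CLIENT_SECRET = "<client secret>"                   # placeholder secret

# Token endpoint for the tenant (the Access Token URI from the connection).
token_url = f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token"

response = requests.post(
    token_url,
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        # The .default scope requests the roles granted to the app for Azure Storage.
        "scope": "https://storage.azure.com/.default",
    },
)
response.raise_for_status()
access_token = response.json()["access_token"]
print("Token acquired, expires in", response.json()["expires_in"], "seconds")
```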

Step 4 of 4: Create An Integration Between On-Premises And Azure

❌ A word of warning: this step is very picky because the Azure API insists on returning XML encoded as UTF-8 with a leading Byte Order Mark (BOM), while OIC insists on reading XML without any indication of the encoding (no BOM). There is a critical step here which will cause failures if you get it wrong.
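To illustrate why the BOM matters, here is a small Python demonstration (not part of the OIC3 configuration): once a UTF-8 BOM survives decoding as the character U+FEFF, a standard XML parser rejects the document, while decoding with utf-8-sig strips it cleanly:

```python
import xml.etree.ElementTree as ET

# Simulated response body: a UTF-8 BOM followed by the XML payload.
raw = b"\xef\xbb\xbf<EnumerationResults/>"

try:
    # Decoding as plain UTF-8 keeps the BOM as U+FEFF, which is not
    # valid before the XML declaration or root element.
    ET.fromstring(raw.decode("utf-8"))
except ET.ParseError as err:
    print("Parse failed:", err)

# Decoding with utf-8-sig strips the BOM, so parsing succeeds.
root = ET.fromstring(raw.decode("utf-8-sig"))
print("Parsed element:", root.tag)
```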

We are going to create a schedule-based integration in a project, the same as in Part 1. We are then going to use the connection to our REST API to get our list of files. We are aiming for our integration to look like this:


Let's look at the properties of the Invoke, as these need to be set exactly right for the Azure Storage API.

The first page should look like this:


Note that the field that says "What is the endpoint's relative URL?" is actually asking for the Azure blob container name prefixed with a / character. Also note the checkbox settings, as they drive the next pages of the configuration.


Add two query parameters, both of type string, to the configuration:

  • restype
  • comp
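Together with the relative URL from the previous page, these parameters produce a request URL shaped like this (the account and container names are placeholders):

```
https://mystorageaccount.blob.core.windows.net/mycontainer?restype=container&comp=list
```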


Add the Standard Header called Accept and a Custom Header called x-ms-version.


This page is very important. Make sure the HTTP method is GET and the type of response is XML Schema, then upload the XML schema provided by Microsoft for the Azure Storage API response. This will automatically set the media type to XML and, although you can overtype this, it will ignore any settings (such as encoding details) that you try to add.
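For reference, the List Blobs response that the schema describes has roughly this shape (trimmed to the elements we care about; element names are from Microsoft's documented List Blobs operation, the values are placeholders):

```xml
<?xml version="1.0" encoding="utf-8"?>
<EnumerationResults ServiceEndpoint="https://mystorageaccount.blob.core.windows.net/" ContainerName="mycontainer">
  <Blobs>
    <Blob>
      <Name>example-file.csv</Name>
      <Properties>
        <Content-Length>1024</Content-Length>
        <Content-Type>text/csv</Content-Type>
      </Properties>
    </Blob>
  </Blobs>
  <NextMarker />
</EnumerationResults>
```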

Once done, click the Save button to complete the configuration. This creates the invoke action and the preceding map action, which we will edit next.


The map settings are absolutely critical here and are case-sensitive:

  • HTTP Headers:
    • Standard HTTP Headers:
      • Accept - Must be set to "application/xml"
    • Custom HTTP Headers:
      • x-ms-version - must be set to a valid API version such as "2021-08-06"
  • Query Parameters:
    • restype - must be set to "container"
    • comp - must be set to "list"
  • Connectivity Properties:
    • Plugin:
      • Ignore Byte Order Mark - must be set to the Boolean value true using the function true()

Without these settings the data sent from Azure to Oracle via REST will be a binary stream with a leading Byte Order Mark (BOM) character that will prevent any attempt to process the XML.
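Pulling those settings together, here is a hedged Python sketch of the equivalent raw List Blobs call, reusing the token from the earlier client-credentials sketch. It shows the role of each header and query parameter; decoding with utf-8-sig plays the same part as the Ignore Byte Order Mark property. The account and container names are placeholders, and this is an illustration of the raw API call, not OIC3's internal implementation:

```python
import requests
import xml.etree.ElementTree as ET

ACCOUNT = "mystorageaccount"   # placeholder storage account name
CONTAINER = "mycontainer"      # placeholder blob container name
access_token = "<token from the earlier client-credentials sketch>"

response = requests.get(
    f"https://{ACCOUNT}.blob.core.windows.net/{CONTAINER}",
    params={"restype": "container", "comp": "list"},  # the two query parameters
    headers={
        "Accept": "application/xml",        # standard header
        "x-ms-version": "2021-08-06",       # custom header: a valid API version
        "Authorization": f"Bearer {access_token}",
    },
)
response.raise_for_status()

# utf-8-sig strips any leading BOM, mirroring OIC3's Ignore Byte Order Mark.
body = response.content.decode("utf-8-sig")
blob_names = [name.text for name in ET.fromstring(body).iter("Name")]
print(blob_names)
```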

The next steps of our integration handle looping over our blob list, downloading the content using the Azure Storage connection and then writing it to the on-premises folder.
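In plain code terms, the loop does something like the sketch below, continuing from the blob_names list in the previous sketch. The blob download stands in for the Azure Storage connection's Get Blob action and the file write for the File connection; the target folder path is hypothetical:

```python
import os
import requests

ACCOUNT = "mystorageaccount"     # placeholder storage account name
CONTAINER = "mycontainer"        # placeholder blob container name
TARGET_DIR = r"\\fileserver\share\inbound"   # hypothetical on-premises folder
access_token = "<token from the earlier client-credentials sketch>"

for blob_name in blob_names:     # list produced by the previous sketch
    # Get Blob: a plain GET on the blob URL returns its content.
    blob = requests.get(
        f"https://{ACCOUNT}.blob.core.windows.net/{CONTAINER}/{blob_name}",
        headers={
            "x-ms-version": "2021-08-06",
            "Authorization": f"Bearer {access_token}",
        },
    )
    blob.raise_for_status()

    # Write Blob to file: save the content under the same name on-premises.
    with open(os.path.join(TARGET_DIR, blob_name), "wb") as f:
        f.write(blob.content)
```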


By configuring the Get Blob from Azure and Write Blob to File actions in a similar way to Part 1 of these blog articles, you will soon have a simple but effective solution.

✅ Conclusion

  • In this solution we created a simple, schedule-based integration which iterates over the blobs in an Azure blob storage container and writes them as files to an on-premises folder
  • We can improve this solution by using similar techniques to delete or archive the source files or to gracefully handle Azure storage exceptions
