🧩 Using OIC3 To Integrate Azure Blob Storage and On-Premises Files - Part 1
A brief guide to integrating on-premises files and Azure Storage using OIC3
🧭 Overview
We have multiple on-premises servers hosting file shares that are used to integrate data between systems. We also have Azure storage, both our own accounts and third-party ones, which is likewise used to integrate file-based data.
We now have an Oracle cloud-based solution that uses Oracle Integration Cloud version 3 (OIC3) to import and export data, and we would like to leverage that investment to move data between our file systems and Azure storage.
This first part will explain how to move files from an on-premises folder to an Azure Blob Container.
📌 What This Post Covers
- ✅ Configuring an on-premises file connection from OIC3
- ✅ Configuring an Azure Blob Storage connection from OIC3
- ✅ Utilizing those connections to read and write data
- ✅ Managing data using the XML datatype in SQL Server
⚠️ Configuring an on-premises Oracle agent is out of scope for this post. The process for setting up an agent will be covered in its own post.
🧱 High-Level Approach
In this solution we will read files from an on-premises folder using a File connection (via the connectivity agent). We will then write that data to an Azure Blob Storage container hosted by a third party. There is no data mapping or enrichment in this solution, to keep the focus on the key points of this article.
🛠️ Technical Solution
Step 1 of 3: Create A Connection To An On-Premises File System
For this demo we'll start with a new OIC3 project.
Click the Add button in the Connections section and then scroll down to the File option.
You have the Trigger and Invoke options (Trigger allows the integration to start based on activity detected by the connector, while Invoke lets you perform actions with the connector during the integration). For this integration we are going to choose just Invoke. Fill in the other fields with suitably descriptive values and, once set, click the Create button.
You'll notice there aren't many options for this connection type. The one we can set - Access Type - needs to be changed to Connectivity Agent.
Then click the Associate Agent Group button to select the on-premises agent appropriate to your environment. Once selected, click Use and then Test and Save the connection.
Step 2 of 3: Create A Connection To Azure Blob Storage
Now we repeat the process of adding a connection, this time selecting Azure Storage.
Add in the identifying data and click Create again.
The Azure connector is only able to use one type of security with Azure, but fortunately it is one of the strongest. You will need to be able to provide:
- Storage Account Name - the name of the storage account as specified in Azure (no need to use the URI or ID, just the lowercase name)
- Tenant ID - the GUID of the Azure tenant you are connecting to
- Client ID - the GUID of the client application as registered with the tenant's Entra identity management service
- Client Secret - the password (secret) associated with the Client application
Once set up, click the Test and then the Save button.
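If you want to sanity-check these values outside OIC3, the same service-principal credentials can be used with the Azure SDK. Below is a minimal, hypothetical sketch in Python (the account name, GUIDs and secret are placeholders, and it assumes the app registration has been granted a data-plane role such as Storage Blob Data Contributor on the storage account).

```python
# Hypothetical sketch - not part of OIC3 - verifying the same service principal
# credentials with the Azure SDK for Python.
# Requires: pip install azure-identity azure-storage-blob
from azure.identity import ClientSecretCredential
from azure.storage.blob import BlobServiceClient

STORAGE_ACCOUNT = "mystorageaccount"  # lowercase storage account name (placeholder)
TENANT_ID = "<tenant guid>"           # Tenant ID from the connection
CLIENT_ID = "<client guid>"           # Client ID of the app registration
CLIENT_SECRET = "<client secret>"     # Client Secret value

credential = ClientSecretCredential(TENANT_ID, CLIENT_ID, CLIENT_SECRET)
service = BlobServiceClient(
    account_url=f"https://{STORAGE_ACCOUNT}.blob.core.windows.net",
    credential=credential,
)

# Listing containers confirms both the credentials and the role assignment work.
for container in service.list_containers():
    print(container.name)
```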
Step 3 of 3: Creating An Integration Between On-Premises And Azure
For this demo we are going to create an integration that is triggered on a schedule. There are other options but this is going to keep our demo simple.
Straight after this we are going to add a call to our Azure Storage connection, which will automatically add a map on first save, so our integration is going to look something like this.
Let's look at the configuration of that Invoke action.
The first page sets the basic identification details.
The second page allows us to set the particular action we'd like to take (in this case, we are going to list the files in the folder).
The last page asks you to confirm the details and save.
We can also use other activities such as Read File and Delete File.
On some of those actions you will use the preceding map to set the directory and file names.
In our example we are going to use a for-each action to iterate through the files found by the List Files action.
Here we can see the configuration of that for-each, pointing to the repeating element in the response from our List Files action.
Now that we have our list of files and can iterate over them, we need to upload them to blob storage. To do this we will invoke our other connection, setting the identification information as appropriate and then selecting the Blobs resource type.
The next page of the configuration asks you to set a blob container and blob name (although we are going to override this information from the map in a moment).
And that's all the configuration we need to upload a blob, so on the summary page select Save to complete the configuration. When done, open the preceding map.
There are several absolutely key pieces of information you must complete in the map:
- Headers/x-ms-version - this must be a valid API version such as "2025-07-05"
- Headers/x-ms-blob-type - there are a few documented types, all case sensitive, but a general, simple, robust one is "BlockBlob"
- Parameters/Blob - the name of the blob file (ideally a unique file name)
- Parameters/Container - the name of the individual container within the storage account (in lowercase)
- File/Stream-Reference - this must point to a reference produced by a preceding action that produces a data stream (such as the Get File Contents action)
Failure to set these correctly will result in complete failure of your integration, often with little or no trace data or useful error information. Successful configuration will mean you now have a fully working solution.
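For context, these map values appear to correspond to the headers and URL parts of Azure's Put Blob REST operation. The sketch below is a hypothetical Python equivalent of that single request (account, container, blob and file names are placeholders), which can be useful for testing values such as x-ms-version and x-ms-blob-type independently of the integration.

```python
# Hypothetical sketch of the raw Put Blob request.
# Requires: pip install requests azure-identity
# Account, container, blob and local file names are placeholders.
import requests
from azure.identity import ClientSecretCredential

ACCOUNT = "mystorageaccount"
CONTAINER = "my-container"    # container name, lowercase
BLOB_NAME = "export-001.csv"  # ideally a unique blob name

credential = ClientSecretCredential("<tenant guid>", "<client guid>", "<client secret>")
token = credential.get_token("https://storage.azure.com/.default").token

with open("export-001.csv", "rb") as f:
    body = f.read()

resp = requests.put(
    f"https://{ACCOUNT}.blob.core.windows.net/{CONTAINER}/{BLOB_NAME}",
    headers={
        "Authorization": f"Bearer {token}",
        "x-ms-version": "2025-07-05",   # a valid API version, as in the map above
        "x-ms-blob-type": "BlockBlob",  # case sensitive
    },
    data=body,
)
resp.raise_for_status()  # Put Blob returns 201 Created on success
```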
✅ Conclusion
- In this solution we created a simple, schedule-based integration which iterates over files stored in an on-premises folder and writes the data to an Azure Blob Storage container
- We can improve this solution by using similar techniques to delete or archive the source files or to gracefully handle Azure storage exceptions