Azure Data Factory is a cloud data integration platform that supports 90+ data sources. With its pre-built connectors, it is easy to orchestrate data movement from source to target.
SAP BW is an OLAP system that contains both star-schema and flat-table based data container objects. Data from multiple ERP systems is extracted, transformed, and loaded into various providers.
Azure Data Factory provides three types of connectors for connecting to SAP BW –
- SAP Open Hub
- SAP MDX
- SAP Table
Let us take a detailed look at the Open Hub connector in this blog.
BW Open Hub
Open Hub is a service in BW that allows us to send BW data to downstream systems and applications. There are three types of Open Hub Destinations based on the target, and the ADF BW Open Hub connector supports only the ‘Database table’ type.
Data from a variety of ERP systems can be extracted into SAP BW and then pushed into providers based on the modeling. Multiple staging layers are possible in the architecture before the data reaches the final provider.
Each of these providers and staging layers can be extracted individually via Open Hub. The Open Hub Destination creates a new database table and fills it with data from the other BW objects. It supports all BW data providers except InfoSets.
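In ADF, the Open Hub Destination is reached through a linked service and a dataset. Below is a minimal sketch of the two, written as Python dictionaries that mirror the documented JSON definitions; the integration runtime name “SHIR-SAP”, the Open Hub Destination name “ZOH_SALES”, and the connection placeholders are assumptions for illustration only.

```python
# Sketch of the ADF artifacts needed to read from a BW Open Hub Destination.
# All names and connection values below are placeholders, not real systems.

sap_bw_linked_service = {
    "name": "SapBwOpenHubLinkedService",
    "properties": {
        "type": "SapOpenHub",
        "typeProperties": {
            "server": "<bw application server>",
            "systemNumber": "<system number, e.g. 00>",
            "clientId": "<client, e.g. 100>",
            "userName": "<bw user>",
            "password": {"type": "SecureString", "value": "<password>"},
        },
        # An on-premises BW system is reached through a self-hosted
        # integration runtime (hypothetical name).
        "connectVia": {
            "referenceName": "SHIR-SAP",
            "type": "IntegrationRuntimeReference",
        },
    },
}

open_hub_dataset = {
    "name": "SapBwOpenHubDataset",
    "properties": {
        "type": "SapOpenHubTable",
        "linkedServiceName": {
            "referenceName": "SapBwOpenHubLinkedService",
            "type": "LinkedServiceReference",
        },
        "typeProperties": {
            # Name of the 'Database table' type Open Hub Destination in BW.
            "openHubDestinationName": "ZOH_SALES",
        },
    },
}
```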

Features available in ADF Pipeline activities

- Exclude last request: This option is enabled by default and skips the last DTP request when loading from the Open Hub table. If a request is still being loaded by the DTP, it is ignored, so no records from an in-progress request are lost.
- Base request ID (>): This allows us to use the delta mechanism of BW. We specify a DTP request ID here, and ADF loads only the requests whose IDs are greater than this number.
- Additional columns: We can append extra columns to the copied data here, populated with static values or dynamic content.
- Parallel copies: This option copies the data in parallel through multiple requests. Each parallel copy loads a data partition based on the DTP request ID and package ID.
- Staging: When copying a large volume of data, this option lets us stage and compress the data in Blob Storage. These options map to properties of the copy activity, as in the sketch after this list.
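Here is a minimal sketch of how these options appear in a copy activity definition, again as Python dictionaries mirroring the JSON. The dataset names, the Blob sink, the staging linked service, and the sample values (request ID 122, four parallel copies) are placeholders/assumptions, not values from this blog.

```python
# Copy activity sketch using the options described above. Property names
# follow the SAP BW Open Hub source schema; names and values are placeholders.

copy_activity = {
    "name": "CopyFromBwOpenHub",
    "type": "Copy",
    "inputs": [{"referenceName": "SapBwOpenHubDataset", "type": "DatasetReference"}],
    "outputs": [{"referenceName": "AzureBlobOutputDataset", "type": "DatasetReference"}],
    "typeProperties": {
        "source": {
            "type": "SapOpenHubSource",
            "excludeLastRequest": True,  # skip the request currently being loaded
            "baseRequestId": 122,        # delta: copy only requests with ID > 122 (sample value)
            # Append extra columns to the copied rows.
            "additionalColumns": [
                {"name": "load_source", "value": "SAP_BW_OPEN_HUB"}
            ],
        },
        "sink": {"type": "BlobSink"},  # illustrative sink; use the sink your target needs
        # Split the copy into parallel partitions by DTP request ID / package ID.
        "parallelCopies": 4,
        # Stage and compress the data in Blob Storage when volumes are large.
        "enableStaging": True,
        "stagingSettings": {
            "linkedServiceName": {
                "referenceName": "StagingBlobLinkedService",
                "type": "LinkedServiceReference",
            },
            "enableCompression": True,
        },
    },
}
```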
Use cases
- Using this connector, we can retrieve data from both the final (Composite Provider, InfoCube, MultiProvider) and staging (ADSO, DSO) layers in BW.
- This connector can also be used to bring in master data of InfoObjects – texts, attributes, and hierarchies.
- Since delta load is supported, only the most recent data can be fetched. This is helpful during a migration, where data from the existing process needs to be captured until the new process is fully deployed, as sketched after this list.
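For the delta scenario above, one common pattern is to keep track of the highest request ID copied so far and feed it into the base request ID on the next run, for example through a Lookup activity. The sketch below assumes a hypothetical Lookup activity named “LookupLastRequestId” that returns a field called lastRequestId; both names are assumptions for illustration.

```python
# Delta copy sketch: only requests newer than the last copied one are fetched.
# The Lookup activity name and its output field are hypothetical.

delta_source = {
    "type": "SapOpenHubSource",
    "excludeLastRequest": True,
    # Pipeline expression that pulls the last copied request ID from a
    # preceding Lookup activity (hypothetical name "LookupLastRequestId").
    "baseRequestId": {
        "value": "@activity('LookupLastRequestId').output.firstRow.lastRequestId",
        "type": "Expression",
    },
}
```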
Limitations
- Since Open Hub does not support InfoSets, data cannot be brought directly from an InfoSet. As a workaround, the InfoSet can be staged into another BW object, such as a DSO or a cube, placed between the InfoSet and the Open Hub Destination.
- The output data is flat, not multidimensional, and the data type of the output cannot be changed.
- Delta loading from a BEx query to Open Hub is not possible, so if delta is required, it must be handled in the design of the BEx query.
In this blog, we have seen the possibilities and constraints of using the Azure Data Factory BW Open Hub connector. This connectivity feature of Azure Data Factory helps us orchestrate data from the SAP BW system. We will look at the MDX connector in the next blog.
Learn more about Visual BI’s Microsoft Azure offerings here.