We have many options in Azure Data Factory (ADF) and Logic Apps to read data from HTTP/API endpoints. Here we consider whether the reverse is possible in Azure: an HTTP listener that receives data pushed to Azure from outside. There is no built-in option for this in ADF or Azure Logic Apps.
Businesses can use FTP or third-party connectors to achieve this functionality, but that is insecure, since we are exposing our systems to external services, and it is also cost-ineffective.
Let us achieve this within the Azure ecosystem with a simple listener built on Azure Functions. In this blog, we will set up an HTTP listener using Azure Functions that reads the input stream and moves it to Blob Storage.
HTTP-triggered Azure Functions are a great tool because they can be invoked on demand and run on a modern serverless architecture, so we can author and execute our business logic without the hassle of managing servers. Each trigger of the function is treated as an isolated instance, so multiple applications can invoke the function simultaneously.
Let us take a scenario where an HR team runs an internal application whose database cannot be exposed to external applications because of data sensitivity. The application can only push its data out as CSV extracts, to an FTP location or a web service endpoint. Since we don't want to bring an intermediate layer into our architecture, we decided to set up an HTTP endpoint.
In this example, we have created an HTTP-triggered Azure Function with an output binding to Azure Blob Storage. The function is configured to accept both GET and POST methods; since we are building it to ingest data, we will invoke it using the POST method. With the setup ready, the data should be sent to the function as binary data, with the 'Content-Type' header set to text/plain. The function reads the input binary stream and writes it to Blob Storage through the output binding configured in the function app. With this, we can send data into Azure from any application that supports a web service call.
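A minimal sketch of this setup could look like the following. The binding names, container path, and connection setting below are assumptions for illustration; adjust them to your own function app. The `function.json` declares the HTTP trigger and the blob output binding:

```json
{
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": ["get", "post"]
    },
    {
      "type": "blob",
      "direction": "out",
      "name": "outputBlob",
      "path": "incoming/{rand-guid}.csv",
      "connection": "AzureWebJobsStorage"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    }
  ]
}
```

A matching Python handler, as one possible sketch, simply copies the request body to the blob output binding:

```python
import azure.functions as func

def main(req: func.HttpRequest, outputBlob: func.Out[bytes]) -> func.HttpResponse:
    body = req.get_body()   # raw bytes of the incoming stream
    outputBlob.set(body)    # the output binding writes the bytes to Blob Storage
    return func.HttpResponse(f"Received {len(body)} bytes", status_code=200)
```

The `{rand-guid}` token in the blob path gives each request its own blob, which fits the isolated-instance model described above.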
Data is transferred over HTTPS, so it is encrypted in transit. We can further secure the function by setting its authorization level. If the authorization level is anonymous, anyone with the URL can hit the endpoint. The recommended levels are function and admin, where an authorization key is required along with the URL to invoke the function.
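A client invoking the secured endpoint might look like the sketch below, using only the Python standard library. The URL and key are placeholders; the `x-functions-key` header is how the function-level authorization key is passed (it can also go in the query string as `?code=...`).

```python
import urllib.request

# Hypothetical values -- replace with your function's URL and key.
FUNCTION_URL = "https://myfuncapp.azurewebsites.net/api/HttpListener"
FUNCTION_KEY = "<your-function-key>"

payload = b"id,name\n1,Alice\n2,Bob\n"  # a CSV extract sent as raw bytes

req = urllib.request.Request(
    FUNCTION_URL,
    data=payload,
    method="POST",
    headers={
        "Content-Type": "text/plain",       # as required by our function
        "x-functions-key": FUNCTION_KEY,    # needed at function/admin auth level
    },
)

# urllib.request.urlopen(req) would send the request; it is left out here
# so the sketch runs without a live endpoint.
```

Any platform that can make an HTTPS POST (curl, Postman, an application's HTTP client) can invoke the endpoint the same way.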
Azure Functions imposes a 100 MB limit on the HTTP request body size, so this solution cannot ingest huge files in a single call. But we can always partition large files into smaller chunks and ingest them separately.
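As a simple sketch of that partitioning, the helper below (a hypothetical name, not part of any Azure SDK) splits a payload into chunks no larger than a given byte size, each of which could then be POSTed to the function separately:

```python
def chunk_bytes(data: bytes, chunk_size: int) -> list:
    """Split a payload into consecutive chunks of at most chunk_size bytes."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

# Example: a 250-byte payload split into 100-byte requests
payload = b"x" * 250
chunks = chunk_bytes(payload, 100)
print(len(chunks))  # 3 chunks: 100 + 100 + 50 bytes
```

For CSV extracts you would typically split on row boundaries instead of raw byte offsets, and repeat the header row in each chunk, so that every request is a well-formed extract on its own.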
If your company has been using FTP to temporarily land source files before ingesting them into Azure, this approach is a wonderful alternative on a pay-per-use model. You can find pricing details of Azure Functions here.