Logic Apps is an enterprise integration service in the Azure cloud, used to integrate apps, systems, and services in one place to build a scalable solution. A Logic App contains triggers and actions that implement the business requirements. Here, we are going to cover the use cases mentioned below and the activities they depend on:

  • Writing a file to an FTP Server
  • Trigger a Logic App from an ADF pipeline
  • Passing values from the pipeline to the Logic App

[Flow diagram illustrating the process: Leveraging Logic Apps in Azure Data Factory (ADF) to Support FTP]

Writing file into FTP Server

We can use the FTP connector available in Azure Data Factory (ADF) to read a file from the server. However, due to connector limitations, we cannot use an FTP server as a sink in an ADF pipeline. To write or delete files and folders on the FTP server, we can use a Logic App instead. Create the Logic App (check out the documentation on how to create a Logic App here). Once we create the Logic App, we need to add the actions given below:

  • Add HTTP Request

We need to trigger the Logic App from ADF and pass parameter values to it. ADF does not have any Logic App-specific activity for this. To overcome this limitation, we create a Web activity in the ADF pipeline and trigger the Logic App via an API call, which we will discuss in detail later in this blog. Add the HTTP request trigger as the first step in the Logic App and define the JSON schema of the request body to receive values from the ADF pipeline. Once you save the Logic App, you will get an HTTP POST URL, which will then be used in the ADF Web activity for triggering.
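As a sketch, the request body JSON schema for the HTTP trigger could look like the following. The property names 'filename' and 'filepath' are assumptions here; use whatever names your pipeline actually passes:

```json
{
    "type": "object",
    "properties": {
        "filename": { "type": "string" },
        "filepath": { "type": "string" }
    }
}
```

With this schema in place, the trigger outputs become available to later actions as dynamic content.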

  • Add Data lake action

Use this action to get the contents of the file that resides in the data lake. This file is generated by the Copy activity in the ADF pipeline. Notice the parameter 'filename' in the file path, which we get from the HTTP request body received in the previous step.
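As a hedged sketch, the file path in the 'Get file content' action can reference the trigger body with an expression like the one below. The container name 'output' is a placeholder, not taken from the original post:

```
/output/@{triggerBody()?['filename']}
```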

Note: You need to configure and authenticate the data lake before reading the file.

  • Add FTP Create File action

Choose the FTP 'Create file' action in the Logic App and configure the FTP server connection with the proper authentication details. Once you have configured the action, it asks for the root folder, file name, and file content for creating the file on the FTP server. You get the file name from the HTTP trigger and the file content from the Data Lake action.
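In the Logic App code view, the FTP 'Create file' action ends up roughly as shown below. This is a minimal sketch assuming a managed FTP connection; the folder path, connection name, and action names are illustrative, not taken from the original post:

```json
"Create_file": {
    "type": "ApiConnection",
    "inputs": {
        "host": {
            "connection": {
                "name": "@parameters('$connections')['ftp']['connectionId']"
            }
        },
        "method": "post",
        "queries": {
            "folderPath": "/upload",
            "name": "@triggerBody()?['filename']"
        },
        "body": "@body('Get_file_content')"
    }
}
```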


Integrating and triggering the Logic App from the ADF pipeline

Create the pipeline with the associated dataset, linked service, and integration runtime (IR). If you want to know how to create a pipeline, please click here.

  • Add Copy Activity

Once you have created the pipeline, drag and drop the Copy activity to copy the data from the source file to the target file in the data lake. As mentioned earlier, this file is read by the Logic App's 'Get file content' action, and its content is used to create the file on the FTP server.
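A trimmed Copy activity definition in the pipeline JSON might look like the sketch below. The dataset names and source/sink types are placeholder assumptions; substitute your own datasets:

```json
{
    "name": "CopyToDataLake",
    "type": "Copy",
    "inputs": [
        { "referenceName": "SourceDataset", "type": "DatasetReference" }
    ],
    "outputs": [
        { "referenceName": "DataLakeDataset", "type": "DatasetReference" }
    ],
    "typeProperties": {
        "source": { "type": "DelimitedTextSource" },
        "sink": { "type": "DelimitedTextSink" }
    }
}
```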

  • Add Web Activity

The Web activity is used to trigger the Logic App via an HTTP POST call. The URL should be the HTTP POST URL we copied from the HTTP request trigger. We need to pass the parameters 'filename' and 'filepath' from the ADF Web activity to the Logic App in the request body.
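As a sketch, the Web activity settings could be expressed in pipeline JSON as below. The URL is the POST URL copied from the Logic App trigger (shown as a placeholder), the pipeline parameter names are assumptions, and the body keys must match the trigger's request schema:

```json
{
    "name": "TriggerLogicApp",
    "type": "WebActivity",
    "typeProperties": {
        "url": "<HTTP POST URL from the Logic App trigger>",
        "method": "POST",
        "headers": { "Content-Type": "application/json" },
        "body": {
            "filename": "@{pipeline().parameters.filename}",
            "filepath": "@{pipeline().parameters.filepath}"
        }
    }
}
```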


We have completed all the activities needed to integrate the Logic App into the ADF pipeline and write the file to the FTP server. Now you can trigger or schedule the pipeline to check the Logic App execution and verify that the file is created on the FTP server. You can also trigger the Logic App separately, without the ADF pipeline.

Learn more about Visual BI’s Microsoft Azure offerings here.
