Scenario:- In this blog we will see how to import a file into D365FO through recurring integrations using Logic Apps.
Step 1:- Go to the D365FO Data management workspace and create an import data project as shown below.
Click on Create recurring data job.
Step 2:- Give a name, a description, and the Application ID of the app registered in Azure, and enable all options as shown below.
Step 3:- Click Set processing recurrence and choose the time and count.
Step 4:- Click Set monitoring recurrence and choose the time and count.
Step 5:- Click OK, and when the confirmation popup appears, click Yes.
All setup in D365FO for importing through recurring integrations is now done.
Step 6:- Go to Azure, create a new Logic App, and start building a new workflow. The overall workflow can be seen below.
Step 7:- The first component is the blob-modified trigger: whenever a new file is added to Azure Blob Storage, the Logic App fires.
Step 8:- Next, use the Get blob content action so the new file can be fetched directly.
Step 9:- Add an HTTP action, set the method to POST, and supply the Active Directory OAuth details (tenant ID, client ID, and secret).
url:- https://trial-rh4cr1.trial.operations.dynamics.com/api/connector/enqueue/{59B7CEE6-71C3-4DB6-8101-415E5CF11C00}?entity=Customer groups
In the body, pass the file content from the earlier connector via dynamic content.
Note:- Disable the Allow chunking option in the HTTP connector's settings; otherwise the request fails with a partial-content error, because the enqueue endpoint does not accept chunked uploads.
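For reference, the enqueue call the HTTP connector makes can be sketched in Python. This is a minimal sketch, not the Logic App itself: the tenant/client placeholders are assumptions you must replace with your own Azure AD app registration values, and the instance URL and recurring job ID are the ones used above.

```python
import json
import urllib.parse
import urllib.request

# Placeholders (assumptions): substitute your own Azure AD app registration values.
TENANT_ID = "<tenant-id>"
CLIENT_ID = "<client-id>"
CLIENT_SECRET = "<client-secret>"
D365_URL = "https://trial-rh4cr1.trial.operations.dynamics.com"
ACTIVITY_ID = "{59B7CEE6-71C3-4DB6-8101-415E5CF11C00}"  # recurring data job ID from the data project

def get_token() -> str:
    """Client-credentials token from Azure AD; the resource is the D365FO root URL."""
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "resource": D365_URL,
    }).encode()
    req = urllib.request.Request(
        f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/token", data=body)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]

def enqueue(file_bytes: bytes) -> str:
    """POST the raw file to the enqueue endpoint; the response body is the message ID."""
    query = urllib.parse.urlencode({"entity": "Customer groups"})
    req = urllib.request.Request(
        f"{D365_URL}/api/connector/enqueue/{urllib.parse.quote(ACTIVITY_ID)}?{query}",
        data=file_bytes,  # sent in a single request: chunked uploads are not accepted
        headers={"Authorization": f"Bearer {get_token()}"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()
```

Keep the message ID returned by `enqueue`; it is what the status check in the next steps needs.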
Step 10:- Put a Delay before the next step, because it takes time for the data to land in D365FO.
Step 11:- Add one more HTTP action to check the status of the file import.
POST: D365url/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetMessageStatus
BODY
{
  "messageId": "<message ID returned by the earlier enqueue call>"
}
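The same status check can be sketched in Python, assuming you already have an Azure AD bearer token (for example from a client-credentials call) and the message ID returned by enqueue. The `"value"` field name in the response is my assumption about the action's JSON shape, based on the usual OData action response.

```python
import json
import urllib.request

D365_URL = "https://trial-rh4cr1.trial.operations.dynamics.com"
STATUS_PATH = "/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetMessageStatus"

def get_message_status(message_id: str, token: str) -> str:
    """Return the processing status string for an enqueued message."""
    body = json.dumps({"messageId": message_id}).encode()
    req = urllib.request.Request(
        D365_URL + STATUS_PATH,
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        # e.g. {"@odata.context": "...", "value": "Processed"}
        return json.load(resp)["value"]
```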
Step 12:- Navigate to the D365FO data project and check Manage messages.
Step 13:- Go to Customer groups and check for the new entries.
Step 14:- You can also check the status in the Logic App itself, and share the result with other messaging apps by using the Parse JSON connector.
Thank you
Keep Daxing!!