Thursday, October 13, 2022

Overview of Synapse Link for Dynamics 365 Finance and Operations

Azure Synapse Link enables continuous data export from Dynamics 365 to Azure Synapse Analytics and Azure Data Lake Storage Gen2, facilitating advanced analytics and insights.

Practical Applications (a YouTube video is always a good place to start for the right understanding)

  • Analytics and Reporting: By exporting data to Azure Synapse Analytics, organizations can leverage powerful analytics tools to gain insights and make data-driven decisions.

  • Data Integration: Synapse Link facilitates the integration of Dynamics 365 data with other data sources, enabling comprehensive data analysis.


Here are some individual links providing additional information:
  • Azure Synapse Link for Dataverse: This article explains how to use Azure Synapse Link to connect Microsoft Dataverse data to Azure Synapse Analytics. It covers prerequisites, how to connect Dataverse to a Synapse workspace, manage table data, and monitor the link. The guide also includes steps to unlink and relink the Synapse Link and view data in Azure Synapse Analytics.

  • Export Data from D365 FO using Synapse Link: This blog post details the process of exporting data from Dynamics 365 Finance and Operations (D365 FO) using Azure Synapse Link. It highlights the benefits of continuous data export to Azure Synapse Analytics and Azure Data Lake Storage Gen2, and provides a step-by-step guide on setting up the export, managing data, and troubleshooting common issues. 

  • Set Up Access Control for Azure Synapse Workspace: This article provides a comprehensive guide on setting up access control for an Azure Synapse workspace. It discusses various access control mechanisms, including Azure roles, Synapse roles, SQL permissions, and Git permissions. The guide also includes steps to secure a Synapse workspace by configuring security groups, preparing ADLS Gen2 storage accounts, and assigning roles.

  • Azure Synapse Link with Managed Identities: This document explains how to use managed identities for Azure with Azure Synapse Link. It covers the prerequisites, steps to enable enterprise policy for Azure subscriptions, and how to grant reader access to the enterprise policy. The guide also provides instructions on creating and configuring managed identities to secure access to Azure Data Lake Storage accounts.

  • Azure Synapse RBAC Roles: This article describes the built-in role-based access control (RBAC) roles in Azure Synapse Analytics. It details the permissions granted by each role, the scopes at which they can be used, and how to review and assign Synapse RBAC roles. The guide also includes a table summarizing the roles and their associated permissions.

  • Grant Permissions to Managed Identity in Synapse Workspace: This guide teaches how to grant permissions to the managed identity in an Azure Synapse workspace. It explains the steps to assign the Storage Blob Data Contributor role to the managed identity for accessing ADLS Gen2 storage accounts. The article also covers the process of granting permissions during and after workspace creation.

  • Synapse Link Lake Database Permissions: This blog post discusses the permissions required for using Synapse Link with lake databases in Azure Synapse Analytics. It highlights the security model for lake databases, including Azure role-based access control (RBAC) and Microsoft Entra ID (formerly Azure AD) permissions. The post also provides tips on managing permissions for lake databases and troubleshooting common issues.

  • Common Azure Synapse Link for SQL Storage Permission Issues: This blog post addresses common permission issues encountered when using Azure Synapse Link for SQL storage. It provides solutions for resolving access problems to Azure Data Lake Storage Gen2 accounts, including steps to refresh access tokens and assign the correct permissions. The post also offers troubleshooting tips for both new and existing Synapse Link setups.

Wednesday, October 12, 2022

Integration Key Patterns and Best Practices from MSFT sessions

Microsoft Dynamics 365 FastTrack TechTalks are a must-attend series for anyone looking to deepen their knowledge of Dynamics. These TechTalks are designed to provide in-depth knowledge and practical guidance directly from Microsoft experts, helping you maximize the potential of your Dynamics 365 solutions. One such series covers integration patterns and related best practices.

  1. Dynamics 365 Integration General Guidance | October 2, 2023:
    https://community.dynamics.com/blogs/post/?postid=13b3efef-ee69-ee11-9ae7-000d3a574bff 
    This video provides an overview of general integration strategies and best practices for Dynamics 365. It covers various integration scenarios, tools, and techniques to ensure seamless data flow and system interoperability. Key topics include the use of APIs, data connectors, and middleware solutions to integrate Dynamics 365 with other applications and services.

  2. Integration Patterns for Dynamics 365 Finance and Operations Applications | October 30, 2023:
    https://community.dynamics.com/blogs/post/?postid=27adb1ef-d19d-ee11-be37-000d3a4e511f
    This session focuses on specific integration patterns for Dynamics 365 Finance and Operations applications. It discusses synchronous and asynchronous integration methods, the use of web application programming interfaces (APIs), and data integration scenarios. The video aims to help developers and solution architects understand the best practices for integrating finance and operations apps with other systems.

  3. Integration Patterns for Dataverse | November 6, 2023:
    https://community.dynamics.com/blogs/post/?postid=388a6e70-738d-ee11-8179-00224827e5da
    This video explores various integration patterns for Microsoft Dataverse. It covers inbound and outbound integration methods, including the use of APIs, Power Automate, Logic Apps, and Azure Data Factory. The session also highlights best practices for handling large data volumes, real-time data integration, and ensuring data security and scalability.
There is also a fourth video, "Session 4 - Complex Integration Scenarios", which was scheduled for December 4, 2023. However, I couldn't find the link to that video, so if you manage to find it, please do post it here. Thanks.

Tuesday, October 11, 2022

Extensible Data Security (XDS) in D365FO

 Let’s break down Extensible Data Security (XDS) in Dynamics 365 Finance and Operations (D365FO) in a way that’s easy to understand.

What is Extensible Data Security (XDS)?

XDS is a feature in D365FO that adds an extra layer of security to your data. It goes beyond the basic role-based security by allowing you to control access to specific records in your database based on certain conditions or policies.

How Does It Work?

  1. Constrained Tables: These are the tables where you want to restrict access. For example, if you want to limit access to customer transactions, the table containing these transactions would be a constrained table.

  2. Primary Tables: These tables are used to define the conditions for access. They have a direct relationship with the constrained tables. For instance, if you want to restrict access based on customer groups, the table containing customer group information would be a primary table.

  3. Policy Query: This is a query that sets the conditions for access. It filters the data in the constrained tables based on the criteria defined in the primary tables. For example, you might create a policy that only allows access to transactions for customers in a specific group.

  4. Context: This determines when the policy is applied. There are two main types:

    • Role Context: Applies the policy based on the user’s role. For example, only users with a specific role can access certain data.
    • Application Context: Applies the policy based on conditions set by the application itself.

Why is XDS Important?

  • Enhanced Security: It provides more granular control over who can see what data, ensuring sensitive information is only accessible to authorized users.
  • Flexibility: You can create complex security policies tailored to your business needs.
  • Compliance: Helps in meeting regulatory requirements by ensuring data is accessed appropriately.

Example Scenario

Imagine you have a sales team, and you want each salesperson to only see their own customers’ orders. You can set up an XDS policy where:

  • The constrained table is the table with sales orders.
  • The primary table is the table with salesperson information.
  • The policy query filters sales orders to only show those related to the logged-in salesperson.

This way, each salesperson only sees the orders relevant to them, enhancing data security and privacy.
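Purely to illustrate the filtering logic, here is a hedged X++ sketch of what the policy query behind such a setup effectively does. Note that a real XDS policy query is created as an AOT query object and attached to a security policy in Visual Studio rather than built in code, and the table and field names below (MySalespersonTable, UserId) are hypothetical ones invented for this example.

// Illustration only - not how a policy is actually authored.
// MySalespersonTable is a hypothetical primary table mapping a user id to a salesperson;
// SalesTable would be the constrained table, filtered through its relation to this data source.
Query                   policyQuery = new Query();
QueryBuildDataSource    primaryDs;

primaryDs = policyQuery.addDataSource(tableNum(MySalespersonTable));

// Limit the primary table to the record of the logged-in user
primaryDs.addRange(fieldNum(MySalespersonTable, UserId))
         .value(queryValue(curUserId()));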

I hope this gives you a clear understanding of Extensible Data Security in D365FO! Below are brief summaries of some related blogs available out there:

  1. Extensible Data Security (XDS) Framework in D365FO by Alex Meyer
    This blog explains the basics of the XDS framework in D365FO, highlighting its evolution from record-level security in previous versions of Dynamics AX. It covers key concepts such as constrained tables, primary tables, policy queries, and contexts (role and application). The blog also provides a step-by-step example of setting up an XDS policy to restrict access to sales orders based on customer groups.

  2. Extensible Data Security Examples - Secure by Warehouse on Dynamicspedia 
    This post focuses on using XDS to secure access to warehouses in D365FO. It discusses the challenges of creating policies for multiple warehouses and suggests using a custom table to link users with their allowed warehouses. The blog provides a detailed example of setting up such a policy, including technical details and considerations for implementation.

  3. Record Level Security on Sami's Blog 
    This blog discusses various aspects of record-level security in D365FO. It provides examples and scenarios where record-level security can be applied to restrict access to specific records or tables in the database. The blog walks you through a simple example on how to implement XDS.

  4. Record Level Security on Raziq D365FO's Blog 
    This blog covers the use of record-level security to set restrictions on specific records or tables in AX 2012 (the previous version of D365FO). It includes examples demonstrating how to use record-level security to control data visibility in reports and forms. The blog highlights the practical applications of record-level security in various business scenarios and offers an opportunity to compare the framework across versions.

Monday, October 10, 2022

Enhancing Security with Azure Conditional Access for Dynamics 365

In today’s digital landscape, securing access to critical business applications is paramount. Azure Conditional Access provides a robust solution to ensure that only authorized devices and users can access your Dynamics 365 environments. In this post, we’ll explore two insightful articles that delve into the application of Azure Conditional Access for Dynamics 365 CRM and Dynamics 365 for Finance and Operations.

The first article from Inogic, titled “Use of Conditional Access to Restrict Access to Dynamics 365 CRM by Operating System,” offers a detailed guide on setting up Azure Conditional Access policies to restrict CRM access based on the operating system. This ensures that only Windows-based devices can access the CRM, enhancing security by blocking non-Windows operating systems.

The second article, “Azure Conditional Access Support for Dynamics 365 for Finance and Operations,” by Peter Dahl, discusses how to extend conditional access support to Dynamics 365 for Finance and Operations. Although specific conditional access rules for Dynamics 365 are not available, defining a policy for “All cloud apps” can effectively secure access to this critical application.

By implementing these strategies, organizations can significantly bolster their security posture, ensuring that sensitive data remains protected and accessible only to authorized users.

Tuesday, February 11, 2020

To create a new user for BYOD (Azure SQL database)

Sometimes, when working with Dynamics 365 for Finance and Operations, we go with the option of BYOD (bring your own database) in order to enable asynchronous integrations. In one such scenario, I had to create a new database user for the BYOD Azure SQL database, and below are the scripts I used.

-- create SQL auth login from master 
CREATE LOGIN test WITH PASSWORD = 'SuperSecret!' 

-- select your db in the dropdown and create a user mapped to a login 
CREATE USER [test] FOR LOGIN [test] WITH DEFAULT_SCHEMA = dbo; 

-- add user to role(s) in db 
ALTER ROLE db_datareader ADD MEMBER [test]; 
ALTER ROLE db_datawriter ADD MEMBER [test]; 

These are the three steps to be performed in order to add a new login/user for your BYOD.
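If you want to double-check the result, an optional query like the sketch below lists the database roles assigned to the new user (assuming the same user name 'test' as above):

-- optional sanity check: list the role memberships for the new user
SELECT dp.name AS RoleName,
       mp.name AS MemberName
FROM sys.database_role_members drm
JOIN sys.database_principals dp ON dp.principal_id = drm.role_principal_id
JOIN sys.database_principals mp ON mp.principal_id = drm.member_principal_id
WHERE mp.name = 'test';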
More info in this blog.

Tuesday, November 5, 2019

D365FO OData endpoint and filters for optimal use

By now you know that the metadata for the data entities in a particular Dynamics 365 for Finance and Operations instance is exposed through the OData endpoint and can be retrieved using the URL
https://<Your-Devbox-URL>.cloudax.dynamics.com/data/$metadata 

You can also retrieve data using filtering conditions directly in the URL, for example:


  • Top 2 records:
    https://<your-devbox-url>.cloudax.dynamics.com/data/ProductStatusSetups?$top=2

  • Select a single field:
    https://<your-devbox-url>.cloudax.dynamics.com/data/ProductStatusSetups?$select=StatusId

  • Filter (URL-encoded):
    https://<your-devbox-url>.cloudax.dynamics.com/data/ProductStatusTables?$filter=ProductStatusId%20eq%20%27CANCELLED%27

  • The same filter written plainly (the browser encodes it for you):
    https://<your-devbox-url>.cloudax.dynamics.com/data/ProductStatusTables?$filter=ProductStatusId eq 'CANCELLED'
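You can also combine query options, and the cross-company option is worth knowing about when you need data from more than the default legal entity. The entity and field names below are just the same sample ones used above:

  • Combined filter, select and cross-company:
    https://<your-devbox-url>.cloudax.dynamics.com/data/ProductStatusTables?$filter=ProductStatusId eq 'CANCELLED'&$select=ProductStatusId&cross-company=true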


Do make use of the flexibility you get from these OData URLs in Dynamics.

Happy coding!




Saturday, January 5, 2019

Set up Azure DevOps as version control for a D365 F&O project

The setup needed to get your latest Dynamics 365 for Finance and Operations changes into Azure DevOps (previously called Visual Studio Online) is quite different from the AX 2012 version control setup.

And the main reasons for this change in VCS are: 

  1. MorphX is gone with D365 F&O, and all development and setup move to Visual Studio instead
  2. The file-system approach for storing code changes is also back, although the files holding the code changes and related metadata are now in XML format

So obviously it is important to know the local repository location 😄

As D365 F&O now runs as a web application, you can always:

  1. Go to the root directory - simply right-click the web application (under IIS) and select Explore
  2. Find the web.config file and open it
  3. Finally, search for "Aos.PackageDirectory" to get the local repository location (or use the command-line lookup shown below).
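If you prefer the command line, the same lookup can be done with findstr. The web.config path below is a typical one on a cloud-hosted dev box and is only an assumption - adjust it to your environment:

findstr /i "Aos.PackageDirectory" "K:\AosService\WebRoot\web.config"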



The next important thing to understand is how to set up the folder structure under your Azure DevOps project (don't forget to select the "Team Foundation Version Control" option when creating the project).




The folder structure basically consists of two separate folders, so that Metadata and Projects can be mapped to two separate local folders:

  1. Metadata
  2. Projects




Then set up the workspaces in Visual Studio so that:
  1. Metadata - is mapped to the local repository folder (which you can find as mentioned above)
  2. Projects - is mapped to any specific folder you create locally. This is where all your new project files will eventually be stored.




Hope this helps. Happy coding!

Monday, March 19, 2018

How to perform Silent installation of Dynamics AX Client

I believe that silent installations are best in the long run.

Basically, whenever you have a .msi file (Microsoft Installer), you can see the available options by using the /? parameter.

sampleinstaller.msi /?

I used the same help command and executed the steps below one after the other to install the Dynamics AX client in silent mode. Of course, once you get this far, you can always script it in a better way...


// Copy the files from the file share to a local folder

xcopy "\\SHARE\AX2012 R3 RTM + CU8" "D:\AX2012 R3 RTM + CU8" /E /H /I

//Silent install - MSChart

"D:\AX2012 R3 RTM + CU8\MSChart.exe" /q /norestart

//Silent install - SQLSysClrTypes2012

"D:\AX2012 R3 RTM + CU8\SQLSysClrTypes2012.msi" /quiet

//Silent install - Report viewer 

"D:\AX2012 R3 RTM + CU8\ReportViewer2012.exe" /q:a /c:"install.exe /q"

//And Finally Silent Install - Dynamics AX client component


"D:\AX2012 R3 RTM + CU8\setup.exe" RunMode=Custom AcceptLicenseTerms=1 HideUi=1 ByPassWarnings=1 InstallClientUi=1 ClientConfigFile="D:\AX2012 R3 RTM + CU8\UAT.axc" ClientLanguage=en-us ConfigurePreRequisites=1 LogDir="D:\logs"



Hope this helps. Happy coding!

Sunday, March 18, 2018

What is GIT?

Having worked with Microsoft technologies, I am used to TFS, a.k.a. VSTS, a.k.a. Azure DevOps. However, I wanted to understand Git as well, so I did some reading, and below is my summary.

Git is a distributed version control system.

So, if we compare Git with TFS: TFS basically has a server component and a client component. The server component has to be installed on a dedicated machine with the required CPU, RAM and hard disk, sized according to the project, the number of developers involved and the volume of commits and reads made. This can be expensive.

In fact, this is the reason the creator of Git, Linus Torvalds, built it - to move the Linux kernel development code from an existing "free" VCS to a VCS of their own.


A distributed version control system works on a peer-to-peer approach, unlike TFS or any other centralized version control system for that matter. Every developer machine that connects to the repository and syncs gets the entire code base plus the full change history of the repository. This makes commits, comparisons, viewing history and reverting changes much faster.

The other actions, push and pull, are used to interact with the main repository - these can be a little more time consuming though.
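To make that concrete, here are a few everyday Git commands (the repository URL and branch name are placeholders). Only the pull and push at the end talk to the remote repository; everything else runs against your local copy:

git clone https://github.com/<org>/<repo>.git
git log --oneline
git commit -am "Describe the change"
git pull origin master
git push origin master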

What is a Pull request?

Git being a distributed VCS, there is ultimately one main source base and several forked repositories on which developers make their changes. When a module or piece of development is completed, the developer makes a pull request to the project moderator. This request is to review the code changes and, if approved, merge them into the main source base.
There are several ways to handle pull requests - automatic approvals, manual verification and testing on new branches of the source base, to name a few.

Now, what is GitHub then?

GitHub can best be compared to VSTS: you don't just store your source code there, you also get all the other services around development and maintenance, such as bug tracking, boards, discussion forums and an introduction page.

GitHub is basically third-party software built on top of Git, letting users leverage not only Git but also the services this third party provides. There is also a GitHub Desktop application which helps sync the code between the GitHub website and your local computer.


Git Bash is another tool, UNIX-flavoured: you get all the standard Git features/commands, and standard UNIX commands can be run as well.


Git CMD is the Windows-specific tool: you get all the standard Git features/commands plus standard Windows commands to perform various tasks.


This is what I understood - please do comment if you have any suggestions. Happy coding!

Thursday, October 26, 2017

SQL scripts repo

Just sharing a collection of SQL commands we can use to retrieve various information about databases and tables in a SQL instance. I will try to keep this post updated with the commands I encounter.

Go here... https://github.com/dax516/SQL/    

  • One example is the command below, which checks the number of tables in a particular database:
    USE [YOUR DATABASE NAME]
    SELECT COUNT(*) from information_schema.tables
    WHERE table_type = 'base table' 
Happy coding! 

Monday, October 16, 2017

Rebuild index for all tables in a SQL Express DB

I have a peculiar situation in one of my ongoing projects: several hundred SQL Express instances hosting Dynamics AX channel databases, spread across a few high-performance virtual servers. Over time we noticed that server performance had degraded, and after a good meeting with all the parties involved, we identified that an SQL index rebuild had never been performed, even though loads of transactions and master data have been moving through the channel databases.

The obvious reason: the instances are SQL Express, which doesn't support SQL Server Agent, so you can't simply configure a rebuild job out of the box. The script below can be helpful in order to:



  1. Get the current index fragmentation status
  2. Rebuild the indexes for all objects in all databases on a particular SQL instance.
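The script itself isn't reproduced here, but a minimal sketch of the idea looks like the one below. It covers one database at a time, so run it per database (or wrap it in a loop over sys.databases), and schedule it via Windows Task Scheduler with sqlcmd since Express has no Agent:

-- 1. Current fragmentation status for the selected database
SELECT OBJECT_NAME(ips.object_id)       AS TableName,
       i.name                           AS IndexName,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') ips
JOIN sys.indexes i
  ON i.object_id = ips.object_id AND i.index_id = ips.index_id
WHERE ips.index_id > 0
ORDER BY ips.avg_fragmentation_in_percent DESC;

-- 2. Rebuild every index on every user table in the selected database
DECLARE @sql nvarchar(max) = N'';
SELECT @sql += N'ALTER INDEX ALL ON ' + QUOTENAME(s.name) + N'.' + QUOTENAME(t.name) + N' REBUILD;' + CHAR(13)
FROM sys.tables t
JOIN sys.schemas s ON s.schema_id = t.schema_id;
EXEC sp_executesql @sql;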

Hope this helps. Happy coding!

Wednesday, September 20, 2017

Switching user in Visual Studio 2015

Today, when I logged into one of my VMs and tried to use another account with Visual Studio 2015, I faced this issue: "Your account XXXX@YYYY.com cannot be connected to Visual Studio as you have already logged in with AAAA@BBBB.com. If you want to change the user, use Switch user. If you want to continue, enter the credentials again."

The error message might not be exactly the same, but you get the point. 


The solution I applied was to clear the registry entry for the current user setting. You can do that as follows:



  1. Open Run > regedit
  2. Go to HKEY_CURRENT_USER > Software > Microsoft > VSCommon > 14.0 > ClientServices > VisualStudio > IdeUser
  3. Delete the existing ID
  4. Restart Visual Studio
  5. Log in with your desired credentials
  6. and voilà... it should work.
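The same cleanup can also be scripted from an elevated command prompt. This is a hedged one-liner - the 14.0 hive is specific to Visual Studio 2015, and it's wise to export the key first as a backup:

reg delete "HKCU\Software\Microsoft\VSCommon\14.0\ClientServices\VisualStudio\IdeUser" /va /f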
Happy coding!

Saturday, March 18, 2017

Pack() and Unpack() methods with RunBaseBatch

The pack() and unpack() methods can be overridden when you extend your class from RunBaseBatch. In Dynamics AX, pack() and unpack() define serialization.

It is easier to understand with an example: most interactive objects in Dynamics AX automatically store the last saved user parameters/values, and when you reopen the object, the values are preloaded from where you left off.
This happens because of the pack() and unpack() methods. 

For example, you may remember running a report with specific parameters; the next time you open the same report, the parameter values are preloaded as per your earlier selection.

Have a look at AOT\Classes\Tutorial_RunBaseBatch for an example of RunBaseBatch with pack() and unpack() methods.

The confusing part, if you are a beginner, could be the macros. So let's go through an example without using any macros, and then we can improve it step by step to understand better.

Simple example - without macros:


class AJSimpleRunBase extends RunBaseBatch
{
    date    simStartDate, simEndDate;
}

public container pack()

{
    //the values stored need to be returned as a container
    
    //Simple
    return [simStartDate, simEndDate];
}

public boolean unpack(container packedValues)

{
    //saves the user-provided packedValues into the variables defined
    
    //Simple
    [simStartDate, simEndDate] = packedValues;
    return true;
}

Next level - adding additional parameter:

Say you need to add an additional parameter (isPosted) to the class. We can do so by adding the declaration in the classDeclaration() method and including it in the pack() method. However, for the unpack() method, the previously stored values become obsolete. So we need to introduce versioning and ensure that the unpack() method knows which version of values it has to restore.

 class AJSimpleRunBase extends RunBaseBatch

{
    date    simStartDate, simEndDate;
    boolean isPosted;
}

public container pack()

{
    //the values stored need to be returned as a container
    
    //Next level - versioning as new parameter added

    return [1, simStartDate, simEndDate, isPosted]; //hardcoded version
}

public boolean unpack(container packedValues)

{
    //saves the user-provided packedValues into the variables defined
    
    //Next level - versioning as new parameter added
    boolean ret = true; 
    int version; 
    ;
    if (conPeek(packedValues,1) ==1) //hardcoded version
    {
        [version, simStartDate, simEndDate, isPosted] = packedValues;
    }
    else
    {
        ret=false;
    }
    return ret;
}


Using macros - to avoid rework for versioning.
If you want to add one more parameter to the class, you would have to repeat the same steps as shown above and then change the hard-coded values. This is where #macros come to the rescue: with #macros we only have to change one spot when we change what we want to pack().

class AJSimpleRunBase extends RunBaseBatch

{
    date    simStartDate, simEndDate;
    boolean isPosted;
    
    //Macros, what they do is fill in exactly what the macro says at compile time. 
    //So whenever you write #CurrentVersion, the compiler fills in '1'.  
    //Whenever you write #CurrentList, the compiler fills in 'simStartDate, simEndDate, isPosted'
    #define.CurrentVersion(1)
    
    #localMacro.CurrentList
    simStartDate, 
    simEndDate, 
    isPosted
    #endMacro
}    

public container pack()

{
    //the values stored need to be returned as a container
    
    //With Macros
    return [#CurrentVersion, #CurrentList];

}

public boolean unpack(container packedValues)

{
    //saves the user-provided packedValues into the variables defined
    
    //With Macros
    boolean ret = true;
    int version;
    ;
    if (conPeek(packedValues,1) == #CurrentVersion)
    {
        [version, #CurrentList] = packedValues;
    }
    else
    {
        ret=false;
    }
    return ret;

}

Hope this helps you better understand the pack() and unpack() methods and gives some insight into #macros.


Happy coding!

Wednesday, March 8, 2017

How to kill tasks on your machine

Sometimes the only way to get out of a stuck screen is to kill an ongoing process that has been executing for too long - say, for example, when the "Application Object Server" service is in the "Starting" state for a pretty long time and you cannot take any action from within the Services window.

It is always good to rely on Windows Task Manager (right-click the taskbar > Task Manager); under the Services tab you can find the details of all the services on the system. You can explore the Processes and Details tabs for further actions.



And if Task Manager is not able to help in your situation, you can always rely on the good old taskkill command, using the PID of the process to identify the correct service/process.

Open command prompt (Run as Administrator)
taskkill /f /pid 4224
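If you don't know the PID yet, tasklist can find it first. For example, filtering on the AOS executable name - Ax32Serv.exe here is an assumption for an AX 2012 AOS, so substitute the image name of your stuck process:

tasklist /svc /fi "imagename eq Ax32Serv.exe"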


Happy coding!