Add Security role to the user from SQL- D365 FO

There are cases where users lose access to an environment after it is refreshed from PROD or another environment. If the user still exists in the database, but only with minimal access, the database admin can grant the necessary roles from SQL.
Check the role id using the command below:
            SELECT RECID, * FROM SECURITYROLE


The role id is the RECID field of the table above.
Now, how do we apply this to a user who has only minimal access to the environment and needs to be upgraded to higher access? Insert a record into the SECURITYUSERROLE table, passing the required role's RECID for the user.
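The insert statement itself was in a screenshot that did not survive. A rough sketch of the idea is below; the user id, the role name, and the enum values (1 = enabled, 1 = manual assignment) are assumptions to verify against your environment before running:

```sql
-- Grant the 'System administrator' role to user 'jdoe' (both are placeholders)
INSERT INTO SECURITYUSERROLE (USER_, SECURITYROLE, ASSIGNMENTSTATUS, ASSIGNMENTMODE, RECVERSION, RECID)
SELECT 'jdoe', sr.RECID, 1, 1, 1,
       (SELECT MAX(RECID) + 1 FROM SECURITYUSERROLE)  -- RECID must be supplied for a direct SQL insert
FROM SECURITYROLE sr
WHERE sr.NAME = 'System administrator';
```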


On the next logon to the environment, the above user will have the granted access.
This saves a developer from waiting for access to be granted on a development environment that was just refreshed.




Export and Import bacpac for D365 FO

This post details the export and import process of a bacpac file for D365 FO.
Before jumping into the actual process, let me share the issue I faced when the bacpac was generated directly from the original database: the export includes a few objects that are not valid when importing into SQL Server, which makes the import fail.
So I referred to the MSDN article that describes the export/import process. In addition, I am attaching screenshots that give a clear view of these steps.
Let's start now!
  1. Create a copy of the original database using the CREATE DATABASE … AS COPY OF command.
  2. Copying to a new database takes time, so we should not assume the copied db is ready as soon as the command returns. Even though the copy command itself completes, the copy process keeps running in the background. To verify whether the copied db is ready, use the queries below: while the copy is in progress, the status of the copied db is shown as "Copying", and when it completes, the status is shown as "Online". The first query shows the ids of the original db and the copied db; it returns no records once the copy process is done.
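The command screenshots did not survive; based on the documented Azure SQL copy process, the statements look like this (the database names are placeholders):

```sql
-- Create a copy of the original database (run in the master database)
CREATE DATABASE MyNewCopy AS COPY OF axdb_mySourceDatabase;

-- Monitor the copy: one row per copy in progress, no rows once finished
SELECT * FROM sys.dm_database_copies;

-- Check the state of the new database: 'COPYING' while in progress, 'ONLINE' when ready
SELECT name, state_desc FROM sys.databases WHERE name = 'MyNewCopy';
```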
  3. Now the copied db is ready. It's time to clean up some objects which are not needed for importing. Execute the scripts below one set at a time; each set is marked with Begin/End comments:
    --Prepare a database in Azure SQL Database for export to SQL Server.
    --Remove certificates in the database from Electronic Signature usage
    -- Execute set 1 – Begin
    DECLARE @SQL nvarchar(512);
    DECLARE certCursor CURSOR for
    select 'DROP CERTIFICATE ' + QUOTENAME(c.name) + ';'
    from sys.certificates c;
    OPEN certCursor;
    FETCH certCursor into @SQL;
    WHILE @@Fetch_Status = 0
    BEGIN
        exec sp_executesql @SQL;
        FETCH certCursor into @SQL;
    END
    CLOSE certCursor;
    DEALLOCATE certCursor;
    -- Execute set 1 – End
    -- Execute set 2 – Begin
    --Re-assign full text catalogs to [dbo]
    DECLARE @catalogName nvarchar(256);
    DECLARE @sqlStmtTable nvarchar(512)
    DECLARE reassignFullTextCatalogCursor CURSOR for
    SELECT name
    FROM sys.fulltext_catalogs
    --Open cursor and re-assign ownership on all catalogs returned
    OPEN reassignFullTextCatalogCursor
    FETCH NEXT FROM reassignFullTextCatalogCursor INTO @catalogName
    WHILE @@Fetch_Status = 0
    BEGIN
        SET @sqlStmtTable = 'ALTER AUTHORIZATION ON Fulltext Catalog::[' + @catalogName + '] TO [dbo]'
        EXEC sp_executesql @sqlStmtTable
        FETCH NEXT FROM reassignFullTextCatalogCursor INTO @catalogName
    END
    CLOSE reassignFullTextCatalogCursor
    DEALLOCATE reassignFullTextCatalogCursor
    -- Execute set 2 – End
    -- Execute set 3 – Begin
    --Disable change tracking on tables where it is enabled.
    DECLARE @SQL varchar(1000)
    set quoted_identifier off
    declare changeTrackingCursor CURSOR for
    select 'ALTER TABLE ' + t.name + ' DISABLE CHANGE_TRACKING'
    from sys.change_tracking_tables c, sys.tables t
    where t.object_id = c.object_id
    OPEN changeTrackingCursor
    FETCH changeTrackingCursor into @SQL
    WHILE @@Fetch_Status = 0
    BEGIN
        exec(@SQL)
        FETCH changeTrackingCursor into @SQL
    END
    CLOSE changeTrackingCursor
    DEALLOCATE changeTrackingCursor
    -- Execute set 3 – End
    -- Execute set 4 – Begin
    --Disable change tracking on the database itself.
    ALTER DATABASE MyNewCopy    --Mention the copied db name here
    SET CHANGE_TRACKING = OFF
    --Remove the database level users from the database;
    --these will be recreated after importing in SQL Server.
    DECLARE @userSQL varchar(1000)
    set quoted_identifier off
    declare userCursor CURSOR for
    select 'DROP USER [' + name + ']'
    from sys.sysusers
    where issqlrole = 0 and hasdbaccess = 1 and name <> 'dbo'
    OPEN userCursor
    FETCH userCursor into @userSQL
    WHILE @@Fetch_Status = 0
    BEGIN
        exec(@userSQL)
        FETCH userCursor into @userSQL
    END
    CLOSE userCursor
    DEALLOCATE userCursor
    -- Execute set 4 – End
    -- Execute set 5 – Begin
    --Delete the SYSSQLRESOURCESTATSVIEW view as it has an Azure-specific definition in it.
    --We will run db synch later to recreate the correct view for SQL Server.
    if(1=(select 1 from sys.views where name = 'SYSSQLRESOURCESTATSVIEW'))
        DROP VIEW SYSSQLRESOURCESTATSVIEW
    --Next, set system parameters ready for being a SQL Server database.
    update sysglobalconfiguration
    set value = 'SQLSERVER'
    where name = 'BACKENDDB'
    update sysglobalconfiguration
    set value = 0
    where name = 'TEMPTABLEINAXDB'
    --Clean up the batch server configuration, server sessions, and printers from the previous environment.
    TRUNCATE TABLE SYSSERVERCONFIG
    TRUNCATE TABLE SYSSERVERSESSIONS
    TRUNCATE TABLE SYSCORPNETPRINTERS
    TRUNCATE TABLE SYSCLIENTSESSIONS
    TRUNCATE TABLE BATCHSERVERCONFIG
    TRUNCATE TABLE BATCHSERVERGROUP
    --Remove records which could lead to accidentally sending an email externally.
    UPDATE SysEmailParameters
    SET SMTPRELAYSERVERNAME = '', MAILERNONINTERACTIVE = 'SMTP' --LANE.SWENKA 9/12/18 Forcing SMTP as Exchange provider can still email on refresh
    --Remove encrypted SMTP Password record(s)
    TRUNCATE TABLE SYSEMAILSMTPPASSWORD
    -- Execute set 5 – End
    -- Execute set 6 – Begin
    UPDATE LogisticsElectronicAddress
    SET LOCATOR = ''
    WHERE Locator LIKE '%@%'
    TRUNCATE TABLE PrintMgmtSettings
    TRUNCATE TABLE PrintMgmtDocInstance
    --Set any waiting, executing, ready, or canceling batches to withhold.
    UPDATE BatchJob
    SET STATUS = 0
    WHERE STATUS IN (1,2,5,7)
    -- Execute set 6 – End
    -- Execute set 7 – Begin
    --Clear encrypted hardware profile merchant properties
    update dbo.RETAILHARDWAREPROFILE
    set SECUREMERCHANTPROPERTIES = null
    where SECUREMERCHANTPROPERTIES is not null
    -- Execute set 7 – End
  4. Now the copied db is ready for export. Execute the export command from the command prompt. The working directory should be set to the DAC bin folder of the SQL Server installation. Usually it will be in the path

                              C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin\

    Change to the above path in the command prompt and execute the export command. The parameters are:
    • /ssn : source server name – the SQL server to connect to. For Azure SQL the name should always be suffixed with .database.windows.net.
    • /sdn : source database name – the name of the copied database to export.
    • /tf : target file – the full path where the bacpac has to be created.
    • /su : user name for the SQL server. This can be found in LCS.
    • /sp : password for the above user name. This is also available in LCS.
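For reference, the export command from the Microsoft documentation looks roughly like this (server name, password, and paths are placeholders):

```shell
cd /d "C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin"
SqlPackage.exe /a:export /ssn:<server>.database.windows.net /sdn:MyNewCopy ^
    /tf:D:\Exportedbacpac\my.bacpac /p:CommandTimeout=1200 ^
    /p:VerifyFullTextDocumentTypesSupported=false /su:sqladmin /sp:<password>
```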
  5. After the command completes, the bacpac file will be visible in the mentioned path.
  6. Now the bacpac file is ready to be imported. If the file is large, I usually upload it to OneDrive or the project SharePoint and then download it on the machine where the import has to be done. Execute the import command. As with the export, the command prompt path has to be changed to the DAC bin folder of SQL Server. The parameters are:
    • /sf : source file – the bacpac to import. This should be in a local folder of the machine where the import happens.
    • /tsn : target server name – the SQL Server instance to import into.
    • /tdb : target database name. This database need not exist beforehand; it gets created with the name given in this parameter.
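The import command from the documentation looks roughly like this (file path and database name are placeholders):

```shell
SqlPackage.exe /a:import /sf:D:\Exportedbacpac\my.bacpac ^
    /tsn:localhost /tdb:AxDB_fromProd /p:CommandTimeout=1200
```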
  7. After the import process is over, the imported database is visible on the target server.
  8. Now the copied db has to be renamed to the original db name. The process is already described in my previous post DbNameChange.


Hope the post helps 🙂


Adding a new Financial Dimension D365 FO

Here is the process to add a new financial dimension to a table and display it on the form.
As an example, I will add a new dimension field to Project category.
  1. Extend the table and add the field for the dimension.
  2. Create an EDT specific to ProjCategory. This has to extend "LedgerDimensionValueSet".
  3. Create a controller class which refers to the new EDT we have created. This step is needed to create the dimension entry control for the new field. If this step is missed, the dimensions will not be visible in the form.
  4. Extend the form and add a new tab for the dimension.
Build the solution. Now you can see the financial dimension 🙂



Unable to change the primary address in Customer – D365 FO

We had a strange issue where we could not change the primary address when there is more than one address in the Customer master.
The button was disabled and we were blocked from swapping the primary address.
So I had to troubleshoot the issue to unblock swapping the address.
I observed that when an address is created manually, the check box is enabled. The customer addresses in question had been imported using the Data management framework, and it looked like a data import issue. On examining the file and the logic in the form "LogisticsPostalAddress", the field "DateTo" must hold the max date value.
To be precise, DateTo should show as "Never" on the screen. If it is set to any other date, even a future one, the check box will not be enabled.
So while importing, care should be taken to give the correct value for the field "DateTo". The relevant screen is under "More options" -> "Advanced" in the Addresses tab.
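A quick way to spot imported addresses with a wrong DateTo is to query the backing table. This is a sketch under two assumptions: the value is stored in the VALIDTO column of LOGISTICSPOSTALADDRESS, and "Never" corresponds to the system max date (2154-12-31):

```sql
-- Addresses whose validity does not extend to 'Never' (the max date)
SELECT LOCATION, ADDRESS, VALIDTO
FROM LOGISTICSPOSTALADDRESS
WHERE VALIDTO < '2154-12-31';
```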



PowerApps – Filter the values in the lookup

This post briefs on using the FILTER function in PowerApps.

Let's take this scenario: when a Site is selected in a lookup, the warehouse values should be filtered based on the selected site.

  1. Add a Dropdown control and connect it to the Sites entity. I have used the "OperationsSites" entity in this case. Value is set to "SiteId" in the Advanced properties.


2. I created another Dropdown for Warehouse, which has to display values based on the selected site. The values are filtered using the FILTER function.
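The filter formula in the screenshot is gone; as a sketch, the Items property of the warehouse dropdown would look roughly like this (the entity, field, and control names – Warehouses, SiteId, SiteDropdown – are assumptions for illustration):

```
Filter(Warehouses, SiteId = SiteDropdown.Selected.SiteId)
```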



Save the App and publish. Now on running the app, the warehouse values will be displayed only for the related site.


Interesting, isn't it? 🙂 🙂



Sales Invoice from PowerApps using MS Flow

I am sharing my first post on PowerApps integrated with D365 FO using Microsoft Flow.

What is PowerApps ?

Well… you will get ample information on this topic when you browse. I will give a brief intro of it.

“PowerApps connects to the cloud services and data sources you’re already using, giving you the ability to quickly build apps that suit specific needs – building on skills you already have. You can share apps instantly with your co-workers across the web, tablets, and mobile devices. PowerApps also integrates with Microsoft Flow, making it possible to trigger workflows from within apps. PowerApps can be simple and fast – capable of producing an app in minutes that pulls in data from Excel or a cloud service. But PowerApps is also robust and enterprise-grade, ready for complex requirements like tracking assets across a corporation and tying into your backend systems.”

In order to implement the above concept, I gave a try to invoicing a Sales Order from PowerApps using MS Flow.

Given below are the steps:

  1. Created a simple class (picture 1) to include the logic of Sales Order invoicing. This class will be used in the flow as a service. The method takes two parameters – the SalesId and the DataAreaId.
  2. Microsoft Flow refers to the entities which are exposed, so let us create a new method in the entity "SalesOrderEntity" (picture 2) in order to consume the posting logic.
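The class and entity method screenshots did not survive. Below is a minimal sketch of the pattern; the class and method names (SalesOrderInvoiceService, postInvoice, PostSalesInvoice) are assumptions, while SalesFormLetter is the standard posting API:

```xpp
public class SalesOrderInvoiceService
{
    // Posts the invoice for the given sales order in the given company
    public static void postInvoice(SalesId _salesId, DataAreaId _dataAreaId)
    {
        changecompany(_dataAreaId)
        {
            SalesTable      salesTable = SalesTable::find(_salesId);
            SalesFormLetter formLetter = SalesFormLetter::construct(DocumentStatus::Invoice);

            // Invoice all quantities, no proforma, no printing
            formLetter.update(salesTable, systemDateGet(), SalesUpdate::All,
                AccountOrder::None, NoYes::No, NoYes::No);
        }
    }
}
```

The entity method exposing this to OData (and hence to Flow) would then look like:

```xpp
// New static method on SalesOrderEntity, decorated so it shows up as an OData action
[SysODataActionAttribute("PostSalesInvoice", false)]
public static void postSalesInvoice(SalesId _salesId, DataAreaId _dataAreaId)
{
    SalesOrderInvoiceService::postInvoice(_salesId, _dataAreaId);
}
```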



3. Now let us move to Microsoft Flow. Create a new flow from a blank template and name it.


4. Enter the instance value. The Action lookup shows the new method which was created in SalesOrderEntity. Since we created the method with two parameters, those fields will be populated in the next columns.


5. These values can either be hard coded or taken as input from PowerApps. On clicking the field, an option pops up on the right side. Select "Ask in PowerApps".


6. Now Microsoft flow is ready to be integrated.


7. It is also possible to test this flow independently before integrating it with PowerApps; there is a Test option.





8. Now the main part: configuring PowerApps to consume the flow we have created.

Before that, we need to bring the flow into PowerApps. In the Action tab, click Flows and add the new flow. Create a new button on the screen and name it "Invoice". In the OnSelect event of the Invoice button, call the name of the flow followed by Run. This takes two parameters: the current sales order and the DataAreaId.
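The OnSelect formula from the screenshot is lost; as a sketch, it would look roughly like this (the flow name, gallery name, field name, and company id are assumptions for illustration):

```
// OnSelect of the Invoice button
SalesOrderInvoiceFlow.Run(SalesOrderGallery.Selected.SalesOrderNumber, "usmf")
```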


Now, the integration is done.

Save the App and publish it. It can be tested from mobile. On clicking the button, the user will see that the SO has been invoiced.





Are you missing model reference Error – D365 FO

We are all aware that we need to refresh the model parameters when a new model is included in the environment: go to Dynamics 365 -> Model Management -> Update model parameters.
This works fine in the development environment when we try to access an object from the recently added model. But what happens when such changes to the objects are checked in?


When the solution is built on another machine, that machine does not have a reference to the new model, and the "Are you missing a model reference?" error appears. The same error happens during a TFS build as well.
It is not feasible to manually refresh the model parameters in every environment each time.
The solution here is the DESCRIPTOR file of the model: it has to be updated with the new model which is referred to in the solution.
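The screenshot is gone; for illustration, the reference lives in the model's descriptor XML (under the model's Descriptor folder in PackagesLocalDirectory), inside the ModuleReferences element. The structure below follows the standard descriptor format, and "MyNewModel" is a placeholder for the model being referenced:

```xml
<ModuleReferences xmlns:d2p1="http://schemas.microsoft.com/2003/10/Serialization/Arrays">
  <d2p1:string>ApplicationPlatform</d2p1:string>
  <d2p1:string>ApplicationSuite</d2p1:string>
  <d2p1:string>MyNewModel</d2p1:string>
</ModuleReferences>
```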

