
How to enable the new Microsoft Teams – Public Preview!


New Microsoft Teams is just AWESOME, and this quick but useful post shows how to enable the preview feature to make your life EASY! Open the Microsoft Teams admin center [ask an admin in your organization if you don't have access] and follow the path Teams > Teams update policies > click on an existing policy or create a new one, then follow steps 1 and 2 below. Toggle "Try the new Teams" to get yourself into the NEW TEAMS world; you may choose to go back to the classic (GA) version at any point in time. Let the change happen! Get it, preview the new Teams! Experience Teams across tenants and get notifications from accounts in multiple tenants. This is PRETTY COOL!

Electronic Reporting: Send vendor payments to external Azure storage via X++


The Electronic Reporting (ER) module in Microsoft Dynamics 365 Finance and Operations lets you archive files generated by ER to a SharePoint location or to Azure Storage, as described in Archive ER destination type – Finance & Operations | Dynamics 365 | Microsoft Learn. APIs can be used to check message status and read the file from either location, and Logic Apps or Power Automate can call those APIs, read the files, and perform the required action. This post is not about how that can be done via integration 🙂 It's been a while since I have written a full-code post (no low code :)). To send ER-generated files directly to your own Azure blob container, below is the sample class.

using Microsoft.Azure;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.File;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.Auth;

class DAX_ERVendPaymOutFieUploadHelper
{
    /// <summary>
    /// Handles the attachingFile event from Electronic reporting.
    /// </summary>
    /// <param name = "_args">Event args for the event handler</param>
    [SubscribesTo(classStr(ERDocuManagementEvents), staticDelegateStr(ERDocuManagementEvents, attachingFile))]
    public static void ERDocuManagementEvents_attachingFile(ERDocuManagementAttachingFileEventArgs _args)
    {
        ERFormatMappingRunJobTable ERFormatMappingRunJobTable;
        Common                     common = _args.getOwner();

        if (common.TableId == tableNum(ERFormatMappingRunJobTable))
        {
            ERFormatMappingRunJobTable = ERFormatMappingRunJobTable::find(common.RecId);
        }

        if (!_args.isHandled() && ERFormatMappingRunJobTable.Archived == NoYes::No)
        {
            DAX_ERVendPaymOutFieUploadHelper uploadHandler = DAX_ERVendPaymOutFieUploadHelper::construct();
            uploadHandler.uploadFile(_args.getStream());
        }
    }

    /// <summary>
    /// Creates an object of the DAX_ERVendPaymOutFieUploadHelper class.
    /// </summary>
    /// <returns>DAX_ERVendPaymOutFieUploadHelper class object</returns>
    public static DAX_ERVendPaymOutFieUploadHelper construct()
    {
        return new DAX_ERVendPaymOutFieUploadHelper();
    }

    /// <summary>
    /// Uploads a file to the custom Azure blob container specified in parameters.
    /// </summary>
    /// <param name = "_fileStream">File stream to be uploaded</param>
    /// <returns>true if the file is uploaded successfully</returns>
    private boolean uploadFile(System.IO.Stream _fileStream)
    {
        boolean ret = true;

        // Custom parameters table that stores the Azure storage account and container info
        DAX_Parameters parameters = DAX_Parameters::find();

        try
        {
            StorageCredentials  credentials    = new StorageCredentials(parameters.StorageAccountName, parameters.Key);
            CloudStorageAccount storageAccount = new CloudStorageAccount(credentials, true);
            CloudBlobClient     blobClient     = storageAccount.CreateCloudBlobClient();
            CloudBlobContainer  rootContainer  = blobClient.GetContainerReference(parameters.ContainerName);

            if (!rootContainer.Exists(null, null))
            {
                return checkFailed('Azure storage parameters are not set up correctly.');
            }

            CloudBlobDirectory directory = rootContainer.GetDirectoryReference(parameters.BankOutPaymFolder);
            CloudBlockBlob     blockBlob = directory.GetBlockBlobReference('VendOutPaym.xml');

            // Rewind the stream before upload in case it has already been read
            if (_fileStream.CanSeek)
            {
                _fileStream.Seek(0, System.IO.SeekOrigin::Begin);
            }

            blockBlob.UploadFromStream(_fileStream, null, null, null);
            info('File uploaded');
        }
        catch (Exception::Error)
        {
            ret = checkFailed('Error occurred while uploading the file');
        }
        catch (Exception::CLRError)
        {
            ret = checkFailed('CLR error occurred while uploading the file');
        }

        return ret;
    }
}
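The class above reads the storage account name, access key, container name, and folder from DAX_Parameters, a custom parameters table whose definition is not shown in the post. As a minimal sketch (the table and its fields StorageAccountName, Key, ContainerName, and BankOutPaymFolder are assumptions inferred from the class above), its find method could look like this:

    /// <summary>
    /// Minimal sketch of a find method on the hypothetical DAX_Parameters table.
    /// Field names are assumptions inferred from the upload class above.
    /// </summary>
    public static DAX_Parameters find()
    {
        DAX_Parameters parameters;

        // Parameters tables typically hold a single record per company
        select firstonly parameters;

        return parameters;
    }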

Download large bacpac (sandbox database) to DEV environment much faster


As the LCS website gets slower and slower and database backups get bigger and bigger, use AZCopy to download objects from the LCS asset library. It is incredibly quick compared to manually downloading the files (roughly a minute for a gig vs. an hour+). Download AZCopy to the environment (https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10?toc=/azure/storage/files/toc.json#download-azcopy) – for example, extract the AzCopy zip folder to C:\Temp – and use the PowerShell command:

.\azcopy copy "LCS SAS Link" "LocalPath"

The only issue I noticed is that the local path had to be a folder, not the root of the drive (so "C:\Temp", not "C:\"), which is more related to Windows security than anything else. It took 3 minutes to download an almost 18 GB data file – WOW feeling 🙂

Get your Dynamics 365 FO tier 2 (sandbox) environment today!!


Pakistan User Group is hosting a FREE training program for everyone, covering the Microsoft Business Applications and Azure components of the Microsoft ecosystem, from beginner to advanced level. Register now if you have not yet, and join us on Saturday 20th November at 4pm Pakistan Standard Time (GMT+5). All details apart, this post is a quick guide to getting your own Microsoft Dynamics 365 Finance and Operations tier 2 environment FREE!! I will create step-by-step videos to explain all these steps in detail, as I know they require detailed explanation. Subscribe: https://www.youtube.com/c/DaxtureD365FO

Let's begin…

1. Open this URL https://dynamics.microsoft.com/en-au/intelligent-order-management/overview/?ef_id=e0b92d13d85e177270894c83385bd79c:G:s&OCID=AID2200017_SEM_e0b92d13d85e177270894c83385bd79c:G:s&msclkid=e0b92d13d85e177270894c83385bd79c and click on Request a demo and sign up now.
2. Enter a work or school email address (create a new one if you don't have one – this can be a Gmail or Hotmail account, so don't worry too much – but it should be your own, valid email, as you will receive a confirmation email on this account). Upon entering your email address, it will ask you to set up a new account.
3. Complete all steps and verify your account either via email or SMS, then Get Started.
4. Choose a region on the next screen and Submit.
5. Log on to Lifecycle Services https://lcs.dynamics.com with the account you created above (e.g. I created 1DynamicsDaxture@1dynamics675.onmicrosoft.com). The first time, you will see the following screen.
6. Click on the + sign to create a new project (select the product of your choice – I have chosen Finance and Operations).
7. The project is created; click on Project onboarding and follow the documentation to complete project onboarding. This is a must-do step before environments will get deployed. Comment to discuss this process with me.
8. Upon project onboarding completion, the Configure button will be enabled (note: for this example I have not completed the onboarding process, hence the Configure button is disabled).
9. Configure the new environment following the MS Docs article – any questions, again, ping me directly. This will take less than an hour to deploy a new sandbox (tier 2) environment for Finance and Operations. This also creates a new environment in Power Platform; check it from this URL https://make.powerapps.com/environments
10. Sign up for FREE on portal.azure.com for 1 month using the same account 🙂

I know some steps require more explanation; I will create short videos on all steps and share them. Stay tuned!!

MS D365 FinOps: How to create a new LCS project and deploy a Tier 1 (DEV) VM – Even if you are not a MS customer or partner :) – Part I


Scenario: You are willing to work on the Microsoft Dynamics 365 Finance and Operations product and want to get your hands dirty with some development. You have heard so much about this product but never got a chance to work on this MS ERP. Andre wrote a detailed post on how you can set up a trial environment for MS D365 Finance and Operations.

Solution: With this post I will explain, step by step, how:

- to create your own LCS project
- to deploy a new Tier 1 VM
- to log on to the Azure portal
- to access Power Platform environments
- to create your own Power App and use other features
- to deploy solutions in Power Platform to integrate with Finance and Operations

NOTE: You will need an Azure subscription to deploy the VM.

First, create a new domain to perform all the above steps – it's easy, just follow these steps:

1. Open the Office 365 E3 site in incognito mode or as a guest and go with the Free Trial option.
2. Fill in the details – you can use your personal email or sign up for a new email account and use that one.
3. Provide as much information as you can – it will be good for you 🙂
4. Choose a verification method; I always select "Text me".
5. Once verified, enter your business details and check the availability.
6. Sign up, and you are ready to use this account to perform all the steps mentioned under the solution section.

The Manage your subscription option will take you to the Microsoft 365 admin center, where you will have 25 free user licenses for the whole month. You can use this account to sign up for Teams and enjoy all its features for the whole month FREE!!

Log on to LCS (lcs.dynamics.com) using the account created above. Create a new project by clicking on the + sign and fill in the information – the product name should be Finance and Operations. Your LCS project is ready; click on the hamburger sign and go to Project settings. Under Organization and ownership, the type will be either customer or partner, based on the account you used to log on to LCS: if your account is linked to a partner organization it will be partner, and it will be customer if your account is of type customer. Remember, the account created in this post is linked to neither a partner nor a customer, so we cannot deploy any tier 1 environment in LCS, because to connect LCS to the Azure portal the company account must be either customer or partner. So, we are blocked here 🙁

Here is the trick to convert this prospect account into a customer account to unblock ourselves. Browse https://trials.dynamics.com/, choose Finance and Operations, enter your new account, and hit Get Started. This will deploy a new trial environment with demo data within the next 30 minutes. Read Andre's post to find the downsides of this environment. After the trial environment is deployed, refresh the project settings page (you can sign out and sign in again) to see that the type has changed from prospect to customer.

Now you are a customer, so let's continue our journey of completing the solution – but this is it for this post; we will continue deploying a cloud-hosted environment through LCS in the Azure portal in the next post.

Dual-write learning series – Dual-write initial sync is the data integrator


One of the features of dual-write is initial sync, where you copy data from the source app (Finance and Operations or Dataverse) to the target app (Finance and Operations or Dataverse) depending on the selection in the Master for initial sync option. This initial sync is the Data Integrator service running behind the scenes to copy your data over. You configure the application ID for the Data Integrator and add it to both apps (Finance and Operations and Dataverse); I have documented this in my previous post, The Dual Write implementation – Part 2 – understand and action pre-requisites.

Master for initial sync can be either Common Data Service (Dataverse) or the Finance and Operations app. For example, if I choose the Finance and Operations app in the example below, where I am syncing Functional Locations, then all records will be copied from Finance and Operations to Dataverse.

Initial sync is a full push, which means that if an individual row fails to sync, you cannot resync only the failed ones. If the initial synchronization only partially succeeds, a second synchronization runs for all the rows, not just the rows that failed to sync during the initial synchronization. For example:

1st initial sync run for 1,000 records from FO to CDS → 700 pass and 300 fail
2nd initial sync run will again run for all 1,000 records

Do check Considerations for initial sync on Microsoft Docs.

Initial sync runs against all legal entities configured for dual-write. If you have entered a filter for a specific legal entity in a table map on the Finance and Operations side, as shown below as an example, it will not work for initial sync, because initial sync runs against all legal entities configured for dual-write under environment details.

D365FO: Right-click on any control in the D365FO browser takes you directly to the control in the AOT


Last week I explored a very interesting feature, especially for developers: right-click on any field/control on a form and follow these steps. This opens Visual Studio in non-admin mode, opens the correct form, and takes you directly to the control in the AOT. NOTE: You can only get this feature within a development VM where your browser and Visual Studio are on the same machine. I am at 10.0.14 but not sure when this great feature first became available 🙁

Another step closer – Finance Operations data in Power Platform – Virtual Entities


This post focuses on the integration technologies available to make Microsoft Dynamics 365 Finance and Operations data available in Dataverse/Common Data Service/CDS. What could be better than having the biggest ERP system's data in Power Platform? You can build Power Portals, Power Apps, and Power BI analytical reports, use Power Virtual Agents for inventory closing and year-end closing processes, and manage expenses and employee/contractor time entry processes. Most of these processes can run without even logging in to the MS ERP (Dynamics 365 Finance and Operations), so you can save on license costs too.

Let's see what options are available to integrate F&O data with Power Platform; this post, however, is dedicated to Virtual Entities.

Three options are available out of the box to integrate F&O data with Power Platform:

👉 Data Integrator – click on the link to read more
👉 Dual-write – click on the link to read more
👉 Virtual Entities – MS Tech Talk on Virtual Entities

Before we jump to the installation and configuration part, let's see when virtual entities became generally available and what features they offer compared to the other two integration technologies.

Virtual Entities generally available:
✔️ Finance and Supply Chain Management app: 10.0.12 or later
✔️ Dataverse: Service update 189 or later

Virtual Entities features:
✔️ Finance and Operations is available as a data source in Dataverse
✔️ No Finance and Operations data replication in Dataverse
✔️ Access to all public data entities of Finance and Operations in Dataverse
✔️ Support for all CRUD operations

Install the Virtual Entities solution:
1. Head to this link https://appsource.microsoft.com/en-us/product/dynamics-365/mscrm.finance_and_operations_virtual_entity and Get it Now.
2. Enter your work or school account and Sign in.
3. Choose the environment where you want to install this solution.
4. Wait for the installation to finish; the Finance and Operations Virtual Entity solution shows as Enabled.
5. The Finance and Operations Virtual Entity solution is installed successfully – hurray!! That was easy.

Register an app in Azure Active Directory (the AAD application must be created on the same tenant as F&O):
1. Log on to http://portal.azure.com
2. Azure Active Directory > App registrations > New registration.
3. Define these attributes: Name; Account type; Redirect URI – leave blank.
4. Select Register.
5. Make a note of the Application (client) ID; you will need it later.
6. Create a symmetric key (client secret) for the application, save it, and note it for later use.

Steps to follow in the Dataverse environment:
1. Log on to the Dataverse environment and click on Advanced settings.
2. Go to Administration.
3. Choose Virtual entity data sources.
4. Finance and Operations is available as one of the data sources in Dataverse.
5. Click on Finance and Operations and the following screen pops up; this is where the connection is established.

Configuration in Finance and Operations:
1. Log on to Finance and Operations and go to System administration | Users | Users. Create a new user and assign the 'CDS virtual entity application' role to it – don't assign the system admin role to this user. This user is used to look up the metadata of the data entities from the Dataverse instance.
2. Enter the Application ID on the System administration | Setup | Azure Active Directory applications screen, with User ID = <the user created in step 1>.

Test Finance and Operations data in Dataverse: log on to the Dataverse instance and click on the little funnel to open Advanced Find, then look for 'Available Finance and Operations Entities' in the list of tables in the Dataverse instance.
By default, not all entities are enabled; this is to avoid cluttering the user experience in Dataverse, but individual entities can be enabled. For example, I enabled DataManagementDefinitionGroupEntity and marked it visible to make it a virtual entity in Dataverse. To illustrate this example, I created an export data project in Finance and Operations under Data management with the name 'CDSVirtualEntitiesExport' – the data entity behind this data export project is DataManagementDefinitionGroupEntity, which was marked as a virtual entity in the step above. Restart Advanced Find in the Dataverse instance, look for the Definition Group (mserp) table map, and Run it to see the output.

This is it for today; in the next post I will explain how to do customization/extension in F&O and get the data into Dataverse using virtual entities. I hope you have enjoyed the post; please do provide your feedback. Enjoy your break!!

D365FO: Entity cannot be deleted while dependent Entities for a processing group exist. Delete dependent Entities for a processing group and try again.


Scenario: There are times when you want to delete an entity from the target entity list, and when you do so, you face an error message that does not tell you where exactly the entity has been used:

"Entity cannot be deleted while dependent Entities for a processing group exist. Delete dependent Entities for a processing group and try again."

Solution: Browse the environment by appending /?mi=SysTableBrowser&TableName=DMFDefinitionGroupEntity&cmp=USMF to the end of the URL. For example, if the environment URL is https://daxture.sandbox.operations.dynamics.com then the complete URL will be https://daxture.sandbox.operations.dynamics.com/?mi=SysTableBrowser&TableName=DMFDefinitionGroupEntity&cmp=USMF

Filter on the Entity column and it will give you the DefinitionGroup where the entity has been added or used in data management import/export projects. Get the DefinitionGroup name, search the export/import projects under Data management, and either delete the whole project or remove the entity from the group. Try deleting/removing the entity from the target entity list again, and it should work now.
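If you prefer code over the table browser, a quick X++ runnable class can list the definition groups that reference a given entity by querying the same DMFDefinitionGroupEntity table. A minimal sketch – the class name and the 'Customers' entity name are hypothetical placeholders:

internal final class DAX_FindEntityUsage
{
    public static void main(Args _args)
    {
        DMFDefinitionGroupEntity defGroupEntity;

        // List every data project (processing group) that references the entity.
        // 'Customers' is a hypothetical target entity name used for illustration.
        while select defGroupEntity
            where defGroupEntity.Entity == 'Customers'
        {
            info(strFmt("Used in definition group: %1", defGroupEntity.DefinitionGroup));
        }
    }
}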

How to run a dual-write table map when the underlying entity table is set up under a cross-company data sharing policy


Scenario: Project groups are shared across all legal entities in the D365 Finance & Operations app, so they have been set up under one of the cross-company data sharing policies. You are also required to set up a dual-write table map for project groups to sync from FO to CDS. However, you get the following error message when you try to Run the table map:

"Copying pre-existing data completed with errors. For additional details, go to initial sync details tab."

Follow these steps as a workaround to overcome the issue:

1. Disable the cross-company data sharing policy where the ProjGroup table has been used.
2. Choose Yes at the next pop-up window.
3. Go to Data management > Dual write > select the Project groups table map and Run it.
4. Enable the cross-company data sharing policy for project groups again.
5. Choose No at the next pop-up window unless you want to copy data across all companies.
