Quickly Updating Multiple Entity Attributes

Have you ever needed to make changes to a large number of fields? Perhaps 25 fields need their max value updated? Or all of the currency base fields need to be prefixed with “zzz” so they sort to the end of Advanced Find? You could tediously open and change each field by hand. Or, you could use the Attribute Editor tool in XrmToolBox.

This tool turns the process into a few simple steps:

  1. Back up the entity being edited by creating and/or exporting a solution.
  2. Download and/or open “Attribute Editor” in XrmToolBox.
  3. Click “Load Entities”.
  4. Select the entity to be edited from the entity drop-down.
  5. Click “Download Template”.
  6. Open the downloaded template.
  7. Edit the spreadsheet as necessary. Note that the logical name, schema name, and type columns should not be changed.
  8. Save and close the spreadsheet.
  9. Upload the edited template by clicking on the “…” to the left of the “Upload Template” button, selecting the spreadsheet, and then clicking “Upload Template”.
  10. Verify that the fields displayed in the grid match the expected fields.
  11. Click “Save Entity”.
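As an illustration of step 7, a template edit that prefixes currency base fields with “zzz” might look roughly like the rows below. The column names here are assumptions based on the tool’s template; only the display name is changed, while the logical name, schema name, and type columns are left untouched:

```
Logical Name        Schema Name         Type    Display Name
totalamount_base    TotalAmount_Base    Money   zzzTotal Amount (Base)
totaltax_base       TotalTax_Base       Money   zzzTotal Tax (Base)
```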

Be sure to report any issues and/or suggestions in the comments below or at the project site.

Migrating Personal Views and Dashboards with Kingswaysoft SSIS Tools

Migrating personal views and dashboards can be difficult. The primary challenge is that personal views and dashboards can only be queried by their owner. Fortunately, KingswaySoft SSIS tools make this possible with zero code. However, the setup for this is rather unintuitive. The goal of this post is to solve that problem.

Before getting started, I want to thank Sheila Shahpari and Daniel Cai for their work on this functionality.


The steps below explain how to set up the control and data flow to migrate personal views and dashboards. When the control flow is finished, it will look like this:

1. Set up variables.

First, we need to set up the variables used by the Foreach Loop Container. To do this, open the Variables window using the “SSIS” menu in the Visual Studio main menu bar. Then, create the variables shown below. Note that the scope is set automatically and does not need to be changed.
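The variable list from the original screenshot is not reproduced here. A plausible set, assuming one Object variable to hold the user recordset (the UsersRecordSet name is referenced later by the Foreach Loop) plus String variables for the mapped columns, might look like this:

```
Name             Scope     Data type
UsersRecordSet   Package   Object
SystemUserId     Package   String
FullName         Package   String
```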

2. Create a data flow task to read the users and save them to a recordset.

Add a data flow task for reading the users. This data flow task uses a Dynamics CRM Source component to retrieve the users and then assigns the result to a recordset using a Recordset Destination. This recordset will be iterated over by the Foreach Loop Container created in the next step. The FetchXml used in the Dynamics CRM Source component is included below.

When configuring the recordset destination, select all available input columns in the “Input Columns” tab. Then, open the “Input and Output Properties” tab and make a note of the order of the input columns. You will need this in step 3.

Final data flow for this step:

FetchXml for Reading users:
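The original query was lost from this copy of the post. A minimal sketch that pulls the id and name of every enabled user might look like the following; the attribute and filter choices are assumptions, not the post’s original query:

```xml
<fetch>
  <entity name="systemuser">
    <attribute name="systemuserid" />
    <attribute name="fullname" />
    <filter>
      <condition attribute="isdisabled" operator="eq" value="0" />
    </filter>
  </entity>
</fetch>
```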


3. Create a Foreach Loop Container to iterate over the recordset populated in step 2.

Add a Foreach Loop Container to the control flow. Configure this task to run after the data flow task added in step 2.

To configure the Foreach Loop, open the settings by double-clicking on the task. Then, in the Collection tab, set the Enumerator to “Foreach ADO Enumerator”, and set the “ADO object source variable” to “User::UsersRecordSet” as shown below. Next, set up the Variable Mappings tab as shown below. Note that the indexes may need to be changed to match the input column order from the Recordset Destination used in step 2. If you don’t know what this is referring to, you may want to re-read step 2.

Collection Tab Setup:

Variable Mappings:

4. Add a data flow task to the Foreach Loop Container to migrate personal views.

Inside the Foreach Loop Container, add a data flow task to migrate personal views. This data flow task uses a Dynamics CRM Source component to read the views, then uses a Dynamics CRM Destination component to create them. The FetchXml used by the Source component is below.

Final data flow for this step:

Query for reading views:
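The query itself is missing from this copy of the post. Since personal views are stored in the userquery entity, a minimal sketch (an assumption, not the post’s original query) might simply pull every attribute:

```xml
<fetch>
  <entity name="userquery">
    <all-attributes />
  </entity>
</fetch>
```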

5. Add a data flow task to the Foreach Loop Container to migrate personal dashboards.

Inside the Foreach Loop Container, add a data flow task to migrate personal dashboards. This data flow task uses a Dynamics CRM Source component to read the dashboards, then uses a Dynamics CRM Destination component to create them. In this step, the source component retrieves all records and properties from the “userform” entity.
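Since the source component simply retrieves everything from the “userform” entity, the equivalent FetchXml would be along these lines (a sketch, not the post’s original configuration):

```xml
<fetch>
  <entity name="userform">
    <all-attributes />
  </entity>
</fetch>
```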

Final data flow for this step:

6. Set up user impersonation

From the control flow, right-click on the “Migrate Personal Views” data flow task and click “Properties”. Then, open the “Expressions” property and add an expression with the property “[Read Views].[ImpersonateAs]” and the value set to the expression below. “Read Views” is the name of the Dynamics CRM Source component used within the data flow task.


ImpersonateAs Expression:
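The expression was lost from this copy of the post. Assuming a String variable named User::SystemUserId was mapped in the Foreach Loop to hold the current user’s id, the expression would simply reference that variable:

```
@[User::SystemUserId]
```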

Repeat the process above for the “Migrate Personal Dashboards” component. This time the property name used when setting the expression will be “[Read Dashboards from Source].[ImpersonateAs]”.

Completed Control Flow

The completed control flow should look like this.


This setup does not account for differences in systemuser ids between the source and destination systems; those can be handled with a Merge Join or Lookup on the systemuser’s fullname or some other joining property. Even so, it should provide a good starting point.

Since personal views and dashboards can only be read by their owners, we used a Foreach Loop Container to iterate over the users and impersonate each one while retrieving personal views and dashboards. The impersonation was achieved by using the “ImpersonateAs” property on the Dynamics CRM Source component.

I hope this was helpful! Please leave any feedback or issues in the comment section below.

Common Data Service

This post is intended to provide a high-level description of the Common Data Service. It took me some effort to come up with what I feel is a good summarized description, but I eventually settled on this:

The Common Data Service is a scalable and standardized data store for integrating between and with Dynamics 365 applications.

Some of the advantages of the Common Data Service are below.

Standardized entities/fields between applications (Common Data Model)

The common data model provides a standardized set of entities and fields. The schema for these entities is automatically added when a database is created, and they cannot be deleted. Custom entities and fields can also be added as necessary; however, it is best to use the standardized entities and fields when possible. Using the standard model helps ensure that your application can take advantage of new features and apps. It also allows developers who are familiar with the Common Data Service structure to quickly become familiar with other applications that use the same structure.

Consolidated data

Bringing data together in a single database helps eliminate data silos between applications and allows data to be reported on from a central location. My initial thought was that the Common Data Service could potentially be used as a data warehouse; however, with the current limit of 10 GB per database, I do not think this is a good fit, nor is it Microsoft’s intention for the service.

Data Integration Service

This service provides a standard set of maps for integrating data between the Common Data Service and other applications, such as Dynamics 365 for Sales and Dynamics AX. It is a core component of the Prospect to cash solution.


Running Custom Code in Dynamics CRM Portal

I recently used the steps in a blog post by Roman Savran of UDS Consulting to set up a CRM Portal to run custom actions. You can find this post here. Below are the two major things that I learned in the process.

FetchXml results in Liquid Templates are Cached

FetchXml results in liquid templates are cached, and I ran into a lot of issues with plugins not firing when I expected them to because of this. It turns out that when testing, it is very important to change the cacheString parameter between requests.

Plugins Registered on the Execute message do not appear in the Plugin Trace Log unless there was an error.

At one point while working on this, I suspected that the Portal might be using a different message; this was because I had not yet realized the above. To identify the message, I registered a plugin on the Execute message. CRM was configured to write all plugin executions to the trace log, yet the plugin only appeared in the log when there was an error.