Running Custom Code in Dynamics CRM Portal – Part 2 – Security

In a previous post, I talked about my experience running custom code from the portal. That method allowed a web file to be created and then called asynchronously from JavaScript. The post is linked below; if you have not read it, I recommend doing so, as it provides context for the remainder of this post.

Running Custom Code in Dynamics CRM Portal

One of the major aspects missing from the last post is security. When the RetrieveMultiple plugin is triggered, it runs in the context of the system user, and, as written, the plugin has no knowledge of which portal user triggered it. This problem can be solved using the steps below.

Providing the current portal user (contact) id to the plugin

The first step is to pass the id of the current portal user to the plugin. This can be achieved by adding a field to the Portal Actions entity and then adding a condition for it to the FetchXml filter, as shown below.
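The original filter isn't reproduced here, but the idea can be sketched as follows. In the portal's Liquid template, `{{ user.id }}` resolves to the signed-in contact's id, so it can be written into a lookup field on the Portal Actions entity. The entity and field names below (`new_portalaction`, `new_calledbyid`) are placeholders, not the actual schema:

```xml
<fetch>
  <entity name="new_portalaction">
    <attribute name="new_name" />
    <filter type="and">
      <!-- new_calledbyid is a hypothetical lookup to Contact added for this purpose;
           {{ user.id }} is the portal's Liquid object for the current contact -->
      <condition attribute="new_calledbyid" operator="eq" value="{{ user.id }}" />
    </filter>
  </entity>
</fetch>
```

The RetrieveMultiple plugin can then pull this condition value out of the incoming query to identify the portal user.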

Verifying Security on Organization Service Calls

The next step is to ensure that all calls to the OrganizationService have filtering applied. To do this, I created a class implementing the IOrganizationService interface that delegates calls to the organization service provided by the plugin. When a call is made, I query the entity permissions assigned to the user through their associated web roles and use these to determine what the user should be able to do and see.
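A skeleton of this wrapper might look like the following sketch. The `AssertPermission` helper and the web-role lookup it implies are illustrative placeholders, not the actual implementation, and the remaining interface members are omitted:

```csharp
using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

// Sketch: delegates to the plugin-provided IOrganizationService after verifying
// the portal user's entity permissions (assigned through their web roles).
public class PortalSecuredOrganizationService : IOrganizationService
{
    private readonly IOrganizationService _service;
    private readonly Guid _portalContactId;

    public PortalSecuredOrganizationService(IOrganizationService service, Guid portalContactId)
    {
        _service = service;
        _portalContactId = portalContactId;
    }

    public Guid Create(Entity entity)
    {
        AssertPermission(entity.LogicalName, "Create");
        return _service.Create(entity);
    }

    public Entity Retrieve(string entityName, Guid id, ColumnSet columnSet)
    {
        AssertPermission(entityName, "Read");
        return _service.Retrieve(entityName, id, columnSet);
    }

    // Update, Delete, Execute, Associate, Disassociate, and RetrieveMultiple
    // follow the same pattern and are omitted here for brevity.

    private void AssertPermission(string entityName, string privilege)
    {
        // Hypothetical helper: query the contact's web roles (remembering that
        // "Authenticated Users" may apply implicitly) and the entity permissions
        // associated with them, then throw if the privilege is not granted.
        throw new NotImplementedException();
    }
}
```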

Keep in mind that the Authenticated User web role is applied to Contacts automatically and will not always be explicitly assigned to the user.

Efficient Plugin Development

The process used when developing plugins can have a major influence on efficiency. This post explains the process that I have found to work best for me.


The first step of the development process is to determine whether the plugin should be placed in an existing assembly or in a new one. Having multiple assemblies allows multiple developers to work in parallel; however, it also increases the number of assemblies that must be maintained and deployed.

It is also beneficial to consider how files are organized within the assembly. I have found it works well to organize plugins by the entities they are registered on. It can also be helpful to leave each file's namespace as the assembly namespace (although ReSharper will complain). This allows files to be reorganized easily without ending up in a namespace that contradicts the folder structure.

Also, when creating the plugin, inheriting from a base class can be helpful. This minimizes boilerplate code, making the plugin easier to understand and more standardized.
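As a sketch, such a base class might centralize the service-location boilerplate like this (the member names are illustrative, not a standard SDK type):

```csharp
using System;
using Microsoft.Xrm.Sdk;

// Sketch of a plugin base class that resolves the common services once so that
// derived plugins only implement their actual logic.
public abstract class PluginBase : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        var service = factory.CreateOrganizationService(context.UserId);
        var tracing = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

        ExecutePlugin(context, service, tracing);
    }

    // Derived plugins implement only this method.
    protected abstract void ExecutePlugin(IPluginExecutionContext context,
        IOrganizationService service, ITracingService tracing);
}
```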


Exceptions and Plugin Profiler

When developers first start writing Dynamics plugins, they often throw exceptions purely to get feedback during development. They may write a trace after every line, or they may even use the Plugin Profiler. Although these techniques can all be useful at times, they are not usually very efficient.

I have found the methods below have improved my development process.

Unit Testing

To allow unit testing, I move the bulk of the plugin functionality into a separate function. Then, I make this function public so that it can be called from a unit test. When calling this function, it is often simplest to identify a record in CRM to test against and then pass in this record id along with an Organization Service connected to the development organization. However, if the plugin needs to support many varied and/or complex cases then I will instead use FakeXrmEasy to ensure that the various cases are sufficiently tested.
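As a sketch, the split looks like this; the plugin name and the logic inside `DoWork` are hypothetical:

```csharp
using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public class AccountUpdatePlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        var service = factory.CreateOrganizationService(context.UserId);

        DoWork(service, context.PrimaryEntityId);
    }

    // Public so a unit test can call it directly with a known record id and an
    // IOrganizationService connected to a development org (or a FakeXrmEasy fake).
    public void DoWork(IOrganizationService service, Guid accountId)
    {
        var account = service.Retrieve("account", accountId, new ColumnSet("name"));
        // ...business logic under test...
    }
}
```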

Plugin Trace Log

If an issue is difficult to set up, or only occurs once the plugin has been deployed to CRM, it can be helpful to debug using trace statements. To do this, enable the plugin logging built into recent versions of CRM, and then view the Plugin Trace Log through the XrmToolbox Plugin Trace Viewer. This eliminates the need to install the Plugin Profiler solution and allows tracing without adding extra exceptions to the plugin.

Plugin Deployment

One more helpful tool in this process is the XrmToolbox Plugin Auto Deployer. It monitors the plugin assembly for changes and automatically updates the plugin in CRM whenever the assembly is built in Visual Studio. When the Plugin Auto Deployer is set to monitor the release build, the debug build can be used to work out most issues through unit testing without deploying to CRM. This also helps avoid interfering with other developers during development.


Let me know what works well for you in the comments below.

Quickly Updating Multiple Entity Attributes

Have you ever needed to make changes to a large number of fields? Perhaps 25 fields need their max value updated, or all of the currency base fields need to be prefixed with "zzz" so they sort to the end of Advanced Find. You could tediously open and change each field by hand. Or, you could use the XrmToolbox Bulk Attribute Editor.

This tool turns this process into 11 simple steps:

  1. Back up the entity being edited by creating and/or exporting a solution.
  2. Download and/or open the "Attribute Editor" in XrmToolbox.
  3. Click “Load Entities”.
  4. Select the entity to be edited from the entity drop-down.
  5. Click “Download Template”.
  6. Open the downloaded template.
  7. Edit the spreadsheet as necessary. Note that the logical name, schema name, and type columns should not be changed.
  8. Save and close the spreadsheet.
  9. Upload the edited template by clicking on the “…” to the left of the “Upload Template” button, selecting the spreadsheet, and then clicking “Upload Template”.
  10. Verify that the fields displayed in the grid match the expected fields.
  11. Click “Save Entity”.

Be sure to report any issues and/or suggestions in the comments below or at the Project Site:

Migrating Personal Views and Dashboards with KingswaySoft SSIS Tools

Migrating personal views and dashboards can be difficult; the primary challenge is that they can only be queried by their owner. Fortunately, the KingswaySoft SSIS tools make this possible with zero code, although the setup is rather unintuitive. The goal of this post is to solve that problem.

Before getting started, I want to thank Sheila Shahpari and Daniel Cai for their work on this functionality.


The steps below explain how to set up the control flow and data flows to migrate personal views and dashboards. When the control flow is finished, it will look like this:

1. Set up variables.

First, we need to set up the variables used by the Foreach Loop Container. To do this, open the Variables window from the "SSIS" menu in the Visual Studio main menu bar. Then, create the variables shown below. Note that the scope is set automatically and does not need to be changed.

2. Create a data flow task to read the users and save them to a recordset.

Add a data flow task for reading the users. This data flow task uses a Dynamics CRM Source component to retrieve the users and then assigns the result to a recordset using a RecordSet Destination. This recordset will be iterated over by the Foreach Loop Container created in the next step. The FetchXml used in the Dynamics CRM source component is included below.

When configuring the recordset destination, select all available input columns in the “Input Columns” tab. Then, open the “Input and Output Properties” tab and make a note of the order of the input columns. You will need this in step 3.

Final Data Flow for this step:

FetchXml for Reading users:
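The exact query from the original isn't reproduced here; a minimal reconstruction that pulls the id and full name of enabled users would look something like this:

```xml
<fetch>
  <entity name="systemuser">
    <attribute name="systemuserid" />
    <attribute name="fullname" />
    <filter type="and">
      <!-- isdisabled = 0 keeps the loop to enabled users only -->
      <condition attribute="isdisabled" operator="eq" value="0" />
    </filter>
  </entity>
</fetch>
```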


3. Create a Foreach Loop Container to iterate over the recordset created in step 2.

Add a Foreach Loop Container to the control flow. Configure this task to run after the data flow task added in step 2.

To configure the Foreach Loop, open the settings by double-clicking on the task. Then, in the Collection tab, set the Enumerator to "Foreach ADO Enumerator" and set the "ADO object source variable" to "User::UsersRecordSet" as shown below. Next, set up the Variable Mappings tab as shown below. Note that the indexes may need to be changed to match the input column order from the RecordSet destination used in step 2. If you don't know what this is referring to, you may want to re-read step 2.

Collection Tab Setup:

Variable Mappings:

4. Add a data flow task to the Foreach Loop Container to migrate personal views.

Inside of the Foreach container, add a data flow task to migrate personal views. This data flow task uses a Dynamics CRM Source component to read the views, then uses a Dynamics CRM Destination component to create the views. The FetchXml used by the Source Component is below.

Final Data Flow for this step:

Query for reading views:
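The original query isn't shown here, but since personal views are stored in the userquery entity, a minimal version would be along these lines (with impersonation applied, it returns only the impersonated user's views):

```xml
<fetch>
  <entity name="userquery">
    <all-attributes />
  </entity>
</fetch>
```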

5. Add a data flow task to the Foreach Loop Container to migrate personal dashboards.

Inside of the Foreach Loop container, add a data flow task to migrate personal dashboards. This data flow task uses a Dynamics CRM Source component to read the dashboards, then uses a Dynamics CRM Destination component to create the dashboards. In this step, the source component retrieves all records and properties from the “userform” entity.
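Since personal dashboards are stored in the userform entity, a minimal FetchXml for this source would be:

```xml
<fetch>
  <entity name="userform">
    <all-attributes />
  </entity>
</fetch>
```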

Final Data Flow for this step:

6. Set up user impersonation

From the control flow, right-click on the "Migrate Personal Views" data flow task and click "Properties". Then, open the "Expressions" property and add an expression with the property set to "[Read Views].[ImpersonateAs]" and the value set to the expression below. "Read Views" is the name of the Dynamics CRM Source component used within the data flow task.


ImpersonateAs Expression:
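The exact expression isn't reproduced here, but assuming the Foreach Loop maps the current user's id into a string variable named `User::UserId` (as in step 1), the expression can simply reference that variable:

```
@[User::UserId]
```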

Repeat the process above for the “Migrate Personal Dashboards” component. This time the property name used when setting the expression will be “[Read Dashboards from Source].[ImpersonateAs]”.

Completed Control Flow

The completed control flow should look like this.


This setup does not account for differences in systemuser ids between the source and destination systems; those can be handled with a merge join or a lookup on the systemuser's fullname or another joining property. Even so, it should provide a good starting point.

Since personal views and dashboards can only be read by their owners, we used a Foreach Loop Container to iterate over the users and impersonate each one while retrieving personal views and dashboards. The impersonation was achieved by using the “ImpersonateAs” property on the Dynamics CRM Source component.

I hope this was helpful! Please leave any feedback or issues in the comment section below.

Common Data Service

This post is intended to provide a high-level description of the Common Data Service. It took me some effort to come up with what I feel is a good summary, but I eventually settled on this:

The Common Data Service is a scalable and standardized data store for integrating between and with Dynamics 365 applications.

Some of the advantages of the Common Data Service are below.

Standardized entities/fields between applications (Common Data Model)

The Common Data Model provides a standardized set of entities and fields. The schema for these entities is added automatically when a database is created, and they cannot be deleted. Custom entities and fields can also be added as necessary; however, it is best to use the standardized ones when possible. Using the standard model ensures that your application can leverage new features and apps built against it, and it allows developers who are familiar with the Common Data Service structure to quickly become familiar with other applications that use the same structure.

Consolidated data

Bringing data together in a single database helps to eliminate data silos between applications. It also allows data to be reported on from a central location. My initial thought was that the Common Data Service could potentially be used as a data warehouse. However, with the current limit of 10GB per database, I do not think this is a good fit nor is it Microsoft’s intention for the service.

Data Integration Service

This service provides a standard set of maps for integrating data with the Common Data Service; two examples are Dynamics 365 for Sales and Dynamics AX. This is a core component of the Prospect to Cash solution.


Running Custom Code in Dynamics CRM Portal

I recently used the steps in the blog post created by Roman Savran of UDS Consulting to set up a CRM Portal to run Custom Actions. You can find this post here. Below are the two major things that I learned in the process.

FetchXml results in Liquid Templates are Cached

FetchXml results in liquid templates are cached, and this caused me a lot of issues with plugins not firing when I expected them to. It turns out that, when testing, it is very important to change the cacheString parameter between requests.
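As a minimal sketch of working around this from the calling JavaScript, a unique cacheString can be appended to the web file URL on every request (the path and parameter name here are illustrative, not taken from the original post):

```javascript
// Build the web file URL with a unique cacheString so the portal does not
// return a cached copy of the Liquid/FetchXml result.
function buildPortalActionUrl(basePath, cacheString) {
  var separator = basePath.indexOf("?") === -1 ? "?" : "&";
  return basePath + separator + "cacheString=" + encodeURIComponent(cacheString);
}

// Typically something unique per request, such as the current timestamp:
var url = buildPortalActionUrl("/portal-actions/", new Date().getTime());
```

The same URL can then be passed to whatever asynchronous call mechanism the page already uses.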

Plugins registered on the Execute message do not appear in the Plugin Trace Log unless there is an error.

At one point while working on this, I suspected that the Portal might be using a different message, because I had not yet recognized the behavior above. To identify the message, I registered a plugin on the Execute message, and CRM was set to write all plugin executions to the trace log. However, I only saw this plugin in the trace log when there was an error.

See my follow up post on security at the link below.
Running Custom Code in Dynamics CRM Portal – Part 2 – Security