Team Blog

Keep up to date with everything going on at Simego.


16 July 2014

Dynamics CRM Audit History Download

Dynamics CRM 2013 and Dynamics CRM Online both have excellent auditing capabilities. Unfortunately, neither platform allows the full audit trail to be extracted in bulk; instead, each audit record must be retrieved item by item. There are several methods of extracting the details directly from the on-premise Dynamics CRM SQL Server database, but I would always shy away from these solutions as they are unsupported by Microsoft.

 

The solution I am going to present consists of three stages:

 

  1. Extraction of the core ‘Audit’ entity into a SQL Server table.
  2. For each Audit entity the extraction of the detailed audit record into a ‘Dynamic Column’.
  3. Filtering out subsequent requests so that we do not repeatedly extract the same audit detail unnecessarily.

 

Step 1: Extraction of the core ‘Audit’ entity into a SQL Server table.

 

Open the Connect To Data Source dialog and complete the details for your Dynamics CRM Online instance.

 

Connecting to Dynamics CRM Online

 

Once you have connected, DS3 will display all of the available attributes in the Data Source (A) tree.

 

Extracting the Dynamics CRM Audit History entity schema

 

We are going to add a new Dynamic Column called auditdetails to the source, giving us our complete source schema.

 

Step 2: For each Audit entity the extraction of the detailed audit record into a ‘Dynamic Column’.

 

To add a Dynamic Column you first need to enable the Dynamic Column feature. From the View menu select the ‘Dynamic Columns Window’. This will present you with the Dynamic Columns tab. On this tab, click the ‘Enable Dynamic Columns’ link.

 

DS3 Dynamic Columns

 

Dynamic Columns are a very special feature which allows users to create new columns from code. The code can re-use the source and target connections, as well as the values of the current row being processed. When you enable the feature you get a default code template.

 

DS3 Dynamic Columns template
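
The generated template is a small C# class. It looks roughly like the sketch below; the class and base type names are as we recall them from the generator, so treat them as indicative rather than exact:

partial class DataSourceRowOverride : Simego.DataSync.DynamicColumns.DataSourceRowInternal
{
    public override bool BeginRow()
    {
        return true; // return false to skip the current row
    }

    // Dynamic column properties are added here, between BeginRow() and Setup().

    public override void Setup()
    {
        // Runs once, before any rows are processed.
    }
}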

 

If you are familiar with C# this will be no surprise. We are going to use some SDK code to extract the audit details for each row, but only if there has been a change.

 

First we are going to add using directives for the necessary Dynamics CRM SDK namespaces:

 

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;
using Microsoft.Xrm.Sdk.Client;
using Microsoft.Xrm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Metadata;

using System.Runtime.Serialization;
using System.Xml;
using System.IO;

// These namespaces are found in the Microsoft.Crm.Sdk.Proxy.dll assembly
// located in the SDK\bin folder of the SDK download.
using Microsoft.Crm.Sdk.Messages;

 

In between the BeginRow() and Setup() methods we are going to add the following code.

 

The code essentially exposes the audit change details as a property called auditdetails, which returns an XML-formatted string that we can store in the database.

 

public string auditdetails 
{ 
    get 
    {
        // Only get the audit details if there is a Create, Update or Delete
        if (actionname == "Update" || actionname == "Create" || actionname == "Delete")
        {
            Trace.WriteLine("Row {0}", NEXTINT());

            // Re-use the current Data Source (A) connection
            Simego.DataSync.Providers.MSCrm2011.MSCrmDataSourceReader reader = 
                (Simego.DataSync.Providers.MSCrm2011.MSCrmDataSourceReader)this.DataSourceA.Reader;
            OrganizationServiceProxy orgService = reader.Connection.GetProxy();

            // Create a change-history request for this audit record's target
            RetrieveRecordChangeHistoryRequest changeRequest = new RetrieveRecordChangeHistoryRequest();
            changeRequest.Target = new EntityReference(this.objecttypecode, this.objectid);

            // Get the change history from the CRM system
            RetrieveRecordChangeHistoryResponse changeResponse = 
                (RetrieveRecordChangeHistoryResponse)orgService.Execute(changeRequest);

            // Serialise the response into an XML string
            using (var stringWriter = new StringWriter()) 
            {
                XmlWriterSettings settings = new XmlWriterSettings { Indent = true };
                using (var writer = XmlWriter.Create(stringWriter, settings)) 
                {
                    var contractSerializer = new DataContractSerializer(typeof(RetrieveRecordChangeHistoryResponse));
                    contractSerializer.WriteObject(writer, changeResponse);
                }
                return stringWriter.ToString();
            }
        }

        // Return an empty string if it is not a Create, Update or Delete
        return "";
    } 
}

 

This result is essentially the exact contents of a ‘RetrieveRecordChangeHistoryResponse’ as defined by the Dynamics CRM SDK. Why did we do this? At a later date we can recover the full details of each change by deserializing the stored XML back into a RetrieveRecordChangeHistoryResponse.
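
As a sketch of the read-back (illustrative only; auditXml stands for a value read back from the SQL table, and the property names follow the CRM SDK):

using System;
using System.IO;
using System.Runtime.Serialization;
using System.Xml;
using Microsoft.Crm.Sdk.Messages;

// Sketch: rebuild the SDK response object from the stored XML string.
// 'auditXml' is assumed to have been read back from the SQL table.
string auditXml = File.ReadAllText(@"c:\temp\auditdetails-sample.xml");

var serializer = new DataContractSerializer(typeof(RetrieveRecordChangeHistoryResponse));
using (var reader = XmlReader.Create(new StringReader(auditXml)))
{
    var response = (RetrieveRecordChangeHistoryResponse)serializer.ReadObject(reader);

    // Each AuditDetail describes one change made to the record
    foreach (var detail in response.AuditDetailCollection.AuditDetails)
    {
        Console.WriteLine(detail.AuditRecord.Id);
    }
}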

 

Pressing the Build button converts the auditdetails code into a column.

 

Dynamics CRM Audit Details extracted into a column

 

We can now use another of DS3’s unique features to create a SQL table with exactly the same schema as Data Source (A). From the menu select Tools –> Create SQL Table. You will be prompted through a wizard, which will eventually create a SQL table in a SQL Server and connect Data Source (B) to it.

 

Create a SQL Table from Dynamics CRM Schema

 

Once connected to the SQL Server table, the schemas should automatically map to each other.

 

Dynamics CRM Integration with SQL Server Table for Audit History

 

We could now read all of the details from the audit entity and write them to the SQL table. Given the probable size of the audit table, however, this could be an extremely slow process to repeat unnecessarily. So we need some way of telling DS3 that we have already written an audit record to the local SQL Server, and that there is no point in calling back to Dynamics CRM for it again. We can do this by using a lookup into the target SQL audit database.

 

Step 3: Filtering out subsequent requests so that we do not repeatedly extract the same audit detail unnecessarily.

 

For each row we need to look up in Data Source (B) whether we have already extracted the audit record. We can achieve this by creating another calculated column, called ‘AuditValueInDatabase’, using the LOOKUPB function. Essentially, this tells DS3 to find the auditid value in the CRMAudit table where the audit id in CRM matches the auditid in the database. If the lookup returns a value then the record is already in the database and we can skip extracting it again.

 

Calculated column for looking up values in SQL Table

 

The next step is to filter the results to include only the items we cannot find in the target SQL table. This is easy, as DS3 has a filter feature which we can set to match when ‘AuditValueInDatabase’ is null (nothing returned):

 

ISNULL(AuditValueInDatabase)

 

Filtering out the previously recorded audit items

 

We have almost finished our solution. Ideally we want to keep extracting the audit trail to keep it up to date, rather than extracting the whole audit table every day. We will use the ‘FetchXMLFilterExpression’ feature to extract only the last 2 days’ audit entries. We can run this extract every day, and if we miss a day it will catch up without writing duplicates, as DS3 reconciles the changes before writing any records. The fetchxml looks like this:

 

<filter type="and">
     <condition attribute="createdon" operator="last-x-days" value="2" />
</filter>

 

We can now run our complete Dynamics CRM Audit download project every day.


2 July 2014

Business Intelligence + Dynamics CRM, getting the Data

Being able to view dashboards and mine your Dynamics CRM data is of utmost importance to most organisations. Dynamics CRM has great features allowing you to visualise your data using dashboards, charts and other graphics.

 

Sometimes, though, Dynamics CRM data needs to be joined up with data from other systems. This often requires you to extract the latest data from Dynamics CRM and load it closer to your data warehouse.

 

This blog quickly demonstrates using DS3 to extract a set of entities from Dynamics CRM and schedule the continuous download of changes.

 

Open DS3 Automation Server and create an empty project.

Creating a new Integration project

This empty project will hold the steps that will extract the data from each entity in CRM.

 

Open the Project and add a Data Synchronisation Step.

 

Choosing a new Simego Project Step

 

Create and document the step, then click Save. An empty integration project will be created so you can open it and start editing straight away.

 

Editing the new Step

Once you have saved the step, open the integration definition by clicking ‘Open File’.

 

This will launch DS3 Developer, where we can define the integration source Dynamics CRM entity, the SQL table target and the mapping.

 

Navigate through your connections tree and drag and drop the account entity onto the Data Source (alternatively you can right-click and select ‘Connect to DataSource (A)’).

 

Setting the Dynamics CRM Integration source

 

All the Account entity fields in Dynamics CRM will now be available for integration.

 

Selecting the Dynamics CRM Entity attributes

 

Tick the box beside each field you want extracted from the entity; this will add them to the source schema.

 

Setting the primary key for the integration

 

Using the Tools menu you can ask DS3 to create a SQL Table from the exact list of fields you have chosen.

 

Open the Tools menu and click the  ‘Create SQL Table’ option.

 

Automatically creating the SQL Table to import the Dynamics CRM Entity data into

 

The SQL Table Wizard will appear. This will guide you through creating a SQL Table with the same schema as the source CRM entity.

 

Simego SQL Table Wizard

 

At the end of the wizard, DS3 will use the newly created table as the target.

 

Dynamics CRM Integration with SQL mapping

 

To verify that the integration is working, click Compare A->B and you can then preview the records that will come across.

 

Preview of integration items

 

Now you can just save the project back to the Automation server by clicking ‘Save’.

 

Back in the Automation Server you can run the project by clicking ‘Start Project’.

Automating the CRM Data Extract

 

You can view the progress on the ‘Reporting’ page in real-time.

Running the Dynamics CRM Integration

 

…and when it completes successfully.

 

CRM Integration completed successfully

 

Drilling down into the logs shows some detail of the integration, such as that 9916 records were missing from SQL Server and that it took 25.225 seconds to insert them.

 

Detailed logging

 

We keep repeating this process of connecting to the Dynamics CRM entities and creating target tables until we have the full set of entities to bring into the data warehouse.

 

Overall Business Intelligence Extract from CRM

 

We now have to set a schedule to run these extracts. The Automation Server ‘triggers’ projects based on events, and these triggers can be set to fire on many different criteria. We are going to use the simplest: ‘Time of Day’.

 

Create a new Time Trigger from the ‘Triggers’ menu.

 

Create a Schedule for the Business Intelligence extract

 

Set the time to 11 pm every weeknight and call it ‘Late Night Weekday Trigger’.

 

Schedule configuration

 

Navigate to the ‘Dynamics CRM Data Extract for Reporting’ project and click Add Trigger from the sub-menu, then select the late night trigger.

 

Whenever we need to download an additional entity, we can just add a new integration to our steps and it will be included in the synchronisation.


23 June 2014

If your product or service was a person, would you like them?

Think of the service you provide and give it a personality and a name like ‘Jim’.

 

Think of yourself as a consumer of your service and product:

  1. Appreciate that every time you interact with ‘Jim’ you form an opinion about him.
  2. If you ‘love’ Jim, you will tell all your friends and invest your credibility in promoting Jim’s abilities.
  3. If you ‘hate’ Jim, you will tell all your friends and invest time in avoiding Jim or bad mouthing him.
  4. If you don’t trust Jim, you will never invest time getting to know him.

 

Jim can influence you by:

  1. Responding to questions and requests clearly and accurately.
  2. Not Ignoring you.
  3. Not Ignoring your opinion.
  4. Being attentive during conversations.
  5. Asking your opinion.
  6. Valuing your opinion.
  7. Valuing your personal information.
  8. Valuing your relationship.
  9. Being endearing.
  10. Apologizing when he makes a mistake.
  11. Introducing you to his friends that he knows you’ll like.

Not an exhaustive list, but you get the idea. It is important to notice that if Jim pretends to be someone he is not, it will only backfire. Asking for your address so he can introduce you to his friends nearby, and then telling everyone where you live, is not endearing.


30 April 2014

Reading Data from Dynamics NAV (Navision) using the SOAP Web Service Connector

SOAP web services remain a popular method of integration, even with the rise of RESTful APIs and JSON-based integration protocols. Key Microsoft products like SharePoint and Dynamics CRM, and others like Navision, utilise SOAP as a clean and approachable integration point.

 

Navision (NAV) SOAP Web Services

  

Data can be extracted from Dynamics NAV via direct SQL queries, NAV-hosted web services, or a combination of both. In this post I want to highlight that you can extract data from Dynamics NAV using the DS3 SOAP provider, and the steps needed to achieve this.

 

In this example we are going to expose the Navision Customer Card (List) as a Web Service in NAV 2009 R2.

 

Expose / Publish the Web Service

 

From within NAV navigate to Administration –> IT Administration –> General, then click on the Web Services list item link.

 

image

 

You should see a list of web services, which may be empty or may already contain active services:

 

image

 

Click on the New button, set the ‘Object Type’ to ‘Page’, set the Object ID to 21, name the service ‘Customer’ and tick the Publish tick box.

 

image

 

Creating the Correct Webservice URL

 

One of the issues with this process is finding the web service endpoint URL. Typically the format is:

 

http://servername:7047/DynamicsNAV/WS/CompanyName/services

 

In my case, for testing, it is (if you are not using Simego DS3 you will have to replace the spaces with %20):

 

http://192.168.1.115:7047/DynamicsNAV/WS/CRONUS International Ltd/services

 

You can open this URL and it will display the list of endpoints (services) available for this company:

 

image

 

The most important URL is the endpoint pointing to the Customer service:

 

http://192.168.1.115:7047/DynamicsNAV/WS/CRONUS International Ltd/Page/Customer

 

This is the endpoint we are going to use to read customer data from Navision.
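
As an aside, if you wanted to consume this same page service directly from C# rather than through DS3, the proxy that Visual Studio generates from the WSDL typically exposes a ReadMultiple method. A minimal sketch follows; the Customer_Service and Customer_Filter types and the field names come from a hypothetical generated service reference, so treat them as assumptions:

using System;

// Sketch only: calling the NAV 2009 Customer page service through a
// generated proxy. Type and member names are assumptions based on the
// usual NAV page-service pattern.
var service = new Customer_Service();
service.UseDefaultCredentials = true;
service.Url = "http://192.168.1.115:7047/DynamicsNAV/WS/CRONUS%20International%20Ltd/Page/Customer";

// No filters, no bookmark, fetch up to 100 customers
Customer[] customers = service.ReadMultiple(new Customer_Filter[0], null, 100);

foreach (var customer in customers)
{
    Console.WriteLine("{0} {1}", customer.No, customer.Name);
}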

 

Connecting to the Web Services using DS3

 

(Skip down to the animated GIF if you want to see how this is done.)

 

1. Start DS3, click the Data Source A green button and select the SOAP Services Connector.

 

2. In the WebServiceURL field, paste in the Customer service URL:

 

http://192.168.1.115:7047/DynamicsNAV/WS/CRONUS International Ltd/Page/Customer

3. Enter the Namespace (this is not strictly necessary, but I like things clean and tidy).

 

4. If you need a username and password to reach the URL, enter them in the credentials settings.

 

5. Click OK and you will be presented with the Select Webservice Method dialog. We want to get a list of customers, so we navigate to and select ReadMultiple, which returns an array of Customer objects. We do not need to fill in the parameters.

 

image

 

6. Click OK and DS3 will create all the necessary ‘plumbing’ to read data from the web service.

 

7. Click on the Preview A button to check whether we can retrieve the data from Navision’s web service.

 

image

 

 

image

 

8. You can now choose to connect it to another system, or simply create a CSV destination using Tools –> Create CSV File, then Compare A->B and synchronise.

 

Reading Data From Navision Web Services

 

Contact us if you have any questions; we are happy to demonstrate the power of DS3.


22 April 2014

Uploading and Downloading files using FTP in Simego DS3 Server (Ouvvi)

FTP is a mainstay of B2B integration, and whilst there may be other options to transfer files online, its ability to offer direct B2B connectivity makes it a popular choice.

 

FTP Upload and Download steps are now included in the growing set of DS3 Server project ‘Steps’.

 

This post will demonstrate how to create a project which downloads an XML file from, and uploads a CSV file back to, an FTP server.

 

The FTP server in this example is configured to allow only secure connections, as this is a typical configuration option.

 

Create a Project to contain the FTP Steps

From the Projects Page of DS3 Server, create a new project and name it ‘FTP Integration test’.

 

Add a new ‘FTP Download’ Step

image

 

Name the step ‘Download File via FTP (s)’ and then configure it.

 

FTP Server File Path 

This is the URL-style path to the file on the FTP server, e.g. ftp://localhost/data.xml

 

FTP Server uses Passive Mode

This allows the FTP server to change the port the FTP process is working through. This is normally enabled.

 

FTP Server uses SSL

If the server is configured to transfer files and authentication details over SSL then you will need to enable this setting. This is normally enabled.

 

FTP Server accept SSL certificate

It is possible (and likely) that the certificate used to secure the FTP server is not signed by a trusted root authority or is not trusted by your own machine. In this case you can indicate that it is OK to accept the SSL certificate anyway. This is optionally enabled.

 

Destination Filename 

Define the name and location of the file once it has been downloaded, e.g. c:\temp\medals.xml

 

Username 

Set the username used to authenticate with the FTP server.

 

Password

Set the corresponding password to use with the username.

 

Timeout

This is the time in seconds that we will wait for a response from the FTP server before giving up. Normally 30 seconds (00:00:30) is reasonable.
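
Taken together, these settings correspond roughly to the following .NET call. This is a sketch for illustration only (DS3 Server performs the download for you); the host, paths and credentials are placeholders:

using System;
using System.IO;
using System.Net;

// Sketch: what the download step's settings control, in FtpWebRequest terms.
ServicePointManager.ServerCertificateValidationCallback =
    (sender, cert, chain, errors) => true;                // 'Accept SSL certificate'

var request = (FtpWebRequest)WebRequest.Create("ftp://localhost/data.xml");
request.Method = WebRequestMethods.Ftp.DownloadFile;
request.UsePassive = true;                                // 'Passive Mode'
request.EnableSsl = true;                                 // 'Uses SSL'
request.Credentials = new NetworkCredential("username", "password");
request.Timeout = 30000;                                  // 30 seconds, in milliseconds

using (var response = (FtpWebResponse)request.GetResponse())
using (var ftpStream = response.GetResponseStream())
using (var file = File.Create(@"c:\temp\medals.xml"))     // 'Destination Filename'
{
    ftpStream.CopyTo(file);
}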

 

Once you have completed the configuration it should look something like this.

 

image

 

Save the Step.

 

Add a new ‘FTP Upload’ Step

We will assume there is a corresponding file called medals.csv in the local file system that we would like to upload to the FTP server.

 

Add a ‘Ftp File Upload’ step, name it ‘Upload File via FTP (s)’ and then configure it.

 

Source Filename

This is the name and path of the file to be uploaded to the FTP server, e.g. c:\temp\local\medals.csv

 

FTP Server Path 

This is the URL-style path of the location you want the file uploaded to, e.g. ftp://localhost/

 

FTP Server uses Passive Mode

This allows the FTP server to change the port the FTP process is working through. This is normally enabled.

 

FTP Server uses SSL

If the server is configured to transfer files and authentication details over SSL then you will need to enable this setting. This is normally enabled.

 

FTP Server accept SSL certificate

It is possible (and likely) that the certificate used to secure the FTP server is not signed by a trusted root authority or is not trusted by your own machine. In this case you can indicate that it is OK to accept the SSL certificate anyway. This is optionally enabled.

 

Delete Local File after Upload

After uploading you can choose to delete the source file; in this example it is disabled.

 

Username 

Set the username used to authenticate with the FTP server.

 

Password

Set the corresponding password to use with the username.

 

Timeout

This is the time in seconds that we will wait for a response from the FTP server before giving up. Normally 30 seconds (00:00:30) is reasonable.
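
And again for the upload step, the rough .NET equivalent (a sketch, same placeholders as before):

using System.IO;
using System.Net;

// Sketch: the upload side, in FtpWebRequest terms.
var request = (FtpWebRequest)WebRequest.Create("ftp://localhost/medals.csv");
request.Method = WebRequestMethods.Ftp.UploadFile;
request.UsePassive = true;
request.EnableSsl = true;
request.Credentials = new NetworkCredential("username", "password");
request.Timeout = 30000;

using (var file = File.OpenRead(@"c:\temp\local\medals.csv"))   // 'Source Filename'
using (var ftpStream = request.GetRequestStream())
{
    file.CopyTo(ftpStream);
}
request.GetResponse().Close();  // completes the transfer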

 

Once configured and saved, the step should look like this:

 

image

 

You can now download and upload via FTP, with or without SSL transport enabled, by running the project.

 

Upload and Download via FTP


21 April 2014

Speeding up small updates to large data using Incremental synchronisation

When we are integrating two different systems we often want to use Simego DS3 to insert or update only a small portion of the destination’s data. For example, we might want to synchronise 100 client records from a source file to our Dynamics CRM environment containing thousands of records.

 

Using DS3’s normal compare mode, it would download the entire destination record set, compare it with the source and present the results:

 

image

 

In cases like this we have no interest in the fact that there are 24944 records in the destination that are not in the source. We are only interested in the 99 updates, yet we have spent 14 seconds reading the entire destination record set. If we scale the problem up, this becomes even more significant when reading millions of records from the destination.

 

In cases like these we can instruct DS3 to reconcile A to B incrementally. This means DS3 will only load from B the records found in A. It can still identify records that are missing in B, and records that have changed. It will not detect records to delete, but we are not interested in deletions in this scenario. DS3 uses the primary keys in the source to request only the matching records, in parallel, from the target. Note that you may only have one primary key column for this to work.
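
Conceptually the incremental reconcile behaves something like the sketch below. This is an illustration of the idea only, not DS3’s actual implementation:

using System;
using System.Collections.Generic;
using System.Linq;

// Illustration: incremental A->B reconciliation. Only keys present in
// the source are fetched from the target, so the bulk of the destination
// data is never read.
var source = new Dictionary<int, string> { { 1, "Alice" }, { 2, "Bob" } };

// Stand-in for a keyed query against the target,
// e.g. SELECT Id, Name FROM Clients WHERE Id IN (@keys)
Func<ICollection<int>, Dictionary<int, string>> fetchTargetByKeys =
    keys => keys.Where(k => k == 1).ToDictionary(k => k, k => "Alicia");

var target = fetchTargetByKeys(source.Keys);

foreach (var row in source)
{
    if (!target.ContainsKey(row.Key))
        Console.WriteLine("Insert {0}", row.Key);   // missing in B
    else if (target[row.Key] != row.Value)
        Console.WriteLine("Update {0}", row.Key);   // changed in B
}

// Deletes cannot be detected: target rows outside the source key set
// were never loaded.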

 

To turn on incremental mode for a project, navigate to the File menu and open the project properties. Change the SyncOption from AtoB to AtoBIncremental.

 

image

 

As you can see, using the AtoBIncremental setting has reduced the load time from 14.147 seconds to 1.418 seconds, and presented us with only the updates rather than the deletes.

 

image


16 April 2014

Triggering a Synchronisation Project from Dynamics CRM Online

There are many ways to integrate data between systems. What is often forgotten is that we only want to process these imports or exports when something has changed.

 

We often just run the synchronisation or integration at a specific time or periodically.

 

For example, we may want to identify when a new contact has been created or updated within Dynamics CRM Online. We perhaps want this information synchronised with a local SQL table for reporting purposes.

 

We currently have several choices.

 

Synchronise on a regular basis. We can simply reconcile and then synchronise at a regular interval between the Dynamics CRM contact list and our local database contacts table. This technique requires that each time we connect to Dynamics CRM, we authenticate and then query the Contacts entity to evaluate whether there have been any changes. This obviously increases the workload on Dynamics CRM Online, and potentially 95% of the time there will be no data changes and therefore no synchronisation required.

 

Poll continuously, looking for new records or records that have changed. This technique also requires that each time we connect to Dynamics CRM, we authenticate and then query the Contacts to evaluate whether there have been any changes; if there are changes, we then request a full synchronisation. This method is slightly less onerous than just synchronising regularly, but still impacts the Dynamics CRM Online workload.

 

Send messages via the Azure Service Bus. This seems ideal, as Microsoft has given us some infrastructure to pass messages through. Unfortunately, we would have to go through some significant authentication and authorisation steps. In addition we would want something flexible enough to attach to any change, which we would have to program ourselves. This would be technically challenging and expensive to develop.

 

Create a plug-in or custom workflow. We can use Dynamics CRM’s built-in extensibility to create a plug-in or, better, a custom workflow step that notifies the synchronisation engine that an event has occurred and that it should begin synchronisation. The challenge in this scenario is that we would need to create some form of authentication to allow us to tell the on-premise system that there have been changes. In addition we would need to configure some security from the internet-based CRM Online to the on-premise synchronisation engine, which may be behind a firewall.

 

Design

What we really need is a combination that requires no authentication, no development, and is easy to configure.

 

1. Create a workflow component we can call from anywhere within Dynamics CRM that can notify an external system when something has changed.

 

2. Create an internet notification store that can be told when something has changed and can be queried about those changes. No identifying data is stored and no authentication is required, because we don’t want to store security credentials in the Dynamics CRM environment or tunnel through internal firewalls.

 

3. Create a polling system that can monitor the internet based store for changes and trigger processes if there has been a change.

 

The Dynamics CRM Workflow Component

You can download it from here (including source code).

 

http://www.simego.com/downloads/Simego.CRM.Workflow.zip

 

Register the component in the same way as any other Dynamics CRM workflow/plug-in (Simego.CRM.Workflow.dll).

 

Create a process and call the workflow component

 

image

 

Configure the workflow step to use a made-up GUID. I chose the one below, but you can use any GUID; just remember it, as we have to tell the monitoring system to watch for it.

 

a8243db5-fbf7-4637-881e-d635ca99269c

 

 

image

 

Save and Close then Activate the workflow.

 

The ‘Internet Notification Store’

Using the Simego Online Last Changed Service

Simego has developed a solution for exactly this purpose. It can be used by any system that can call a URL. If you want to read about it in detail, the help is here:

 

http://www.simego.com/Help/Online/Last-Change-Service

 

Every time a Contact changes, the workflow will now call this URL:

 

https://online.simego.com/Change/Update/a8243db5-fbf7-4637-881e-d635ca99269c

 

and when that URL is called it will update the ‘lastchange’ date/time associated with the GUID.

 

This allows any system to query the URL and find out if something has changed since the last time it looked. This is unsecured but contains no identifying information and scales incredibly well.

 

To query the last change we just call this URL:

 

https://online.simego.com/Change/a8243db5-fbf7-4637-881e-d635ca99269c
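
A quick sketch of the round trip from C# (illustrative only):

using System;
using System.Net;

// Illustration: notify the Last Changed Service, then query it.
const string key = "a8243db5-fbf7-4637-881e-d635ca99269c";

using (var client = new WebClient())
{
    // This is effectively what the workflow step does on every change
    client.DownloadString("https://online.simego.com/Change/Update/" + key);

    // Any poller can then ask when the key last changed and compare the
    // 'lastchange' value in the JSON response with the one it saw last time
    string json = client.DownloadString("https://online.simego.com/Change/" + key);
    Console.WriteLine(json);
}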

 

The ‘Internet Notification Store’ Polling

 

Simego DS3 Automation Server and Simego Online both have the ability to poll the ‘Last Changed Service’ and trigger associated projects using a JSON Trigger.

 

Attaching a Trigger using the Json Trigger in the Automation Server.

 

Open the automation (Ouvvi) server and navigate to the triggers page. Choose Add Trigger and select the Json Trigger.

 

image

 

Configure the trigger with your Service URL

 

https://online.simego.com/Change/a8243db5-fbf7-4637-881e-d635ca99269c

 

Set the Json Expression to ‘lastchange’.

 

image

 

We can now create a synchronisation project to synchronise data from Dynamics CRM Online to our on-premise database.

 

Every time there is a change in the contact entity online it will be synchronised locally.


15 April 2014

Web API Ouvvi Step Handler

We have migrated the really handy Web API Step Handler from Simego Online into DS3 Automation Server (Ouvvi). This allows you to set up calls to web applications that have documented APIs, such as other notification systems.

 

For example, configuring the Web API Step to send messages to your mobile devices via the Pushover service is as easy as setting up the request as shown below.

 

image
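
Under the hood the step is just issuing an HTTP request. For Pushover, the equivalent call looks roughly like this (a sketch; the token and user values are placeholders for your own Pushover keys):

using System;
using System.Collections.Specialized;
using System.Net;
using System.Text;

// Sketch: the kind of POST the Web API step sends to Pushover.
using (var client = new WebClient())
{
    var form = new NameValueCollection
    {
        { "token",   "APP_TOKEN_PLACEHOLDER" },   // your application token
        { "user",    "USER_KEY_PLACEHOLDER" },    // your user key
        { "message", "Ouvvi project completed" }
    };

    byte[] response = client.UploadValues("https://api.pushover.net/1/messages.json", form);
    Console.WriteLine(Encoding.UTF8.GetString(response));
}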

 

 

You can also use this to test your Web APIs and ensure they are working as intended, or perhaps use it as a queue relay broker service to integrate application messages.

 

The Web API step allows you to control the request details and validate the response, so you can test to ensure the response is what you expected.

 

image


15 April 2014

Ouvvi 32/64 bit Data Sync Options

We have a new release of both Data Sync and DS3 Automation Server (Ouvvi) that enables running both 64-bit and 32-bit processes from a single installation.

 

The 64-bit version of Data Sync now ships with 32-bit components as well, and from Ouvvi you can choose how to run the Data Sync task. This can be In-Process (the default, which is how it works today), or one of two new options: External 32-Bit Process and External 64-Bit Process.

 

These new options run the Data Sync task in a new process that is either 32-bit or 64-bit. This enables you to mix sync tasks when you need to access 32-bit legacy ODBC drivers whilst maintaining a 64-bit installation.

 

To set the execution mode you simply select the mode of operation on the Data Sync Task in Ouvvi.

 

image


11 April 2014

Connection Library Data Preview

We’ve been busy extending the Connection Library features of Data Sync to really help you discover your data whilst building out your migration or integration.

 

You can now preview data right from the context menu in the Connection Library. This works against any data object in the Connection Library and returns approximately the first 1000 items.

 

For Dynamics CRM we now also support OptionSets directly, which can be used for drag & drop lookups or on their own.

 

image