Team Blog

Keep up to date with everything going on at Simego.


11 December 2014

Export Active Directory User Group Membership

In this example we show how you can use Data Sync to create a Data Set from your Active Directory that returns all the Groups that your users belong to.

 

The idea is that we want a row for each user and group combination, like the example below.

 

[Image]

 

By default Data Sync returns the group membership (Member Of) as a comma-separated list, like this: Builtin,Remote Desktop Users,Simego Remote Desktop Users,Users

 

We need to use a little trick in Dynamic Columns to override the row loading and expand this list into multiple rows, one per group, copying the other column values across the rows.

 

Go into Dynamic Columns and use the code below to expand the Member Of column into multiple rows.

 

 

using System;

partial class DataSourceRowOverride : Simego.DataSync.DynamicColumns.DataSourceRowInternal //Do Not Change This Line
{
    public override bool BeginRow()
    {        
        if(!string.IsNullOrEmpty(MemberOf))
        {
            var parts = MemberOf.Split(',');
            
            if ( parts.Length > 1 )
            {
                //Create a new row for each group, copying the other column values across
                foreach(var s in parts)
                {
                    var new_row = Table.NewRow();
                    foreach(var c in Table.Columns) 
                    {
                        if ( c.ColumnName == "Member Of" )
                        {
                            new_row[c.ColumnName] = s; //the single group name
                        } 
                        else 
                        {
                            new_row[c.ColumnName] = Row[c.ColumnName];    
                        }
                    }
                    
                    //AddWithIdentifier keeps the user's AD identifier (ADPath) on each new row
                    Table.Rows.AddWithIdentifier(new_row, ADPath);
                }
                
                //Cancel the original row; the expanded rows replace it
                return false;
            }
        }
                            
        //Zero or one group: keep the row as-is
        return true; 
    }    
}

Since the DSID is no longer unique, we need to use a combination of two columns for the key; here we use DSID and Member Of.

 

[Image]

 

Now when you look at the results you get a row for each group and user.

 

[Image]


3 December 2014

Copy Accounts address to Contacts address records

In this example we show how you can use Data Sync to copy the address details from the parent account record to the contact record in Dynamics CRM.

 

Within Dynamics CRM the parentcustomerid field defines the relationship to the parent Account entity record. We need to set up Data Sync to pull in the details of the Account record via a Lookup.

 

First set up Data Sync so that the Source and Target are mapped to the same contact entity. Then create a filter to only return the rows that have a parentcustomerid value.

 

[Image: Setup]

 

Add a Filter to exclude rows where the parentcustomerid is NULL i.e.

NOT(ISNULL(parentcustomerid))

 

[Image: Filter Options]

 

We also need to disable the DELETE and ADD actions on the Target, since we only want to UPDATE the records. (The filter may exclude rows, which would otherwise create DELETE actions on the target; an alternative is to use the SyncAtoBIncremental mode in Project Settings.)

 

[Image: Target Actions]

 

Next we add a Lookup on parentcustomerid to account.accountid.

 

[Image: Lookup Configuration]

 

Now we just map the Account Lookup Column values to the Contact record.

 

[Image: Final Schema Map]

 

When we Compare A->B we now get an UPDATE action to copy over the values from the Account record to the Contact Record.

 

[Image: Compare Results]
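Conceptually, the project performs something like the sketch below (this is not the Data Sync API; contacts and accountsById are stand-ins for the two entity sets, and the address1_* attribute names are the standard CRM ones, so adjust to your schema map):

foreach (var contact in contacts.Where(c => c.ParentCustomerId != null))
{
    var account = accountsById[contact.ParentCustomerId.Value];

    // Copy the parent account's address onto the contact - each change
    // surfaces as an UPDATE action in the compare results.
    contact.Address1_Line1 = account.Address1_Line1;
    contact.Address1_City = account.Address1_City;
    contact.Address1_PostalCode = account.Address1_PostalCode;
}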


2 December 2014

Implementing an Address Verification Web Service

This is an example showing how you can use a 3rd party system with Data Sync to verify or clean up your addresses in your CRM system or Database.

 

The principal idea is to go through each address, pass the details to the 3rd party service, then mark the record Valid or Invalid and set a date value for the check to prevent too many API calls. If the service you use returns corrected address data you could then use this to update the address.

 

I am using a basic address validation service from http://www.address-validator.net/index.html; this just returns whether the address is valid or not.

 

I will use a simple SQL table for this example; however, it would be easy to adapt to Dynamics CRM or SharePoint.

 

CREATE TABLE [dbo].[Addresses] (
    [ID] int IDENTITY(1, 1) PRIMARY KEY NOT NULL,
    [UUID] uniqueidentifier DEFAULT(newid()) NOT NULL,
    [FirstName] nvarchar(255) NULL,
    [LastName] nvarchar(255) NULL,
    [Company] nvarchar(255) NULL,
    [Address] nvarchar(255) NULL,
    [City] nvarchar(255) NULL,
    [County] nvarchar(255) NULL,
    [State] nvarchar(255) NULL,
    [ZIP] nvarchar(255) NULL,
    [Country] nvarchar(255) NULL,
    [Phone] nvarchar(255) NULL,
    [Fax] nvarchar(255) NULL,
    [Email] nvarchar(255) NULL,
    [Web] nvarchar(255) NULL,
    [DateLastChecked] datetime NULL,
    [Valid] bit DEFAULT(0) NOT NULL
)

 

There are two fields in here that are going to control the process: DateLastChecked and Valid. DateLastChecked will be updated with the date we ran the check, and Valid will be updated with whether the address passed validation.

 

The first step is to connect the Source and Target in Data Sync to the same table, as we want to update the table based on the result of the web service.

 

[Image]

 

Next we want to drop into the code editor to write the code to call the service and update the values on the source. Go to View->Dynamic Columns and Enable Dynamic Columns.

 

First we write the code that provides the Dynamic Columns to map to the Target and gives us a place to call the web service. We want to create virtual columns that we update with results from the web service. The code below creates virtual/dynamic columns for the address fields, named My*, and also checks that we only validate addresses last checked more than 90 days ago.

 

#region Usings
using System;
using System.Collections.Generic;
using System.Data;
using System.Globalization;
using System.Text;
using System.Linq;

using Simego.DataSync;
#endregion

partial class DataSourceRowOverride : Simego.DataSync.DynamicColumns.DataSourceRowInternal //Do Not Change This Line
{    
    public DateTime MyDateLastChecked { get; set; }
    public bool MyValid { get; set; }
        
    public string MyAddress { get; set; }
    public string MyCity { get; set; }
    public string MyCounty { get; set; }
    public string MyState { get; set; }
    public string MyZip { get; set; }
        
    public override bool BeginRow()
    {
        MyAddress = Address ?? string.Empty;
        MyCity = City ?? string.Empty;
        MyCounty = County ?? string.Empty;
        MyState = State ?? string.Empty;
        MyZip = ZIP ?? string.Empty;
        
        MyDateLastChecked = DateLastChecked ?? DateTime.Parse("2014-01-01");
             
        MyValid = Valid;
        
        //Check Addresses that are 90 Days old.
        if(MyDateLastChecked < DateTime.Today.AddDays(-90))
        {        
            //Check Address here
            
            MyDateLastChecked = DateTime.Today;
        }
        
        return true; // return false to skip row from results.
    }
}


Next we re-configure the Schema Map to map these My* columns to the Target.

 

[Image]

 

Before we write the rest of the code we need to ensure that the .NET 4.0 System.Web.dll assembly is loaded into Data Sync. To do this go to File->Properties and add an Assembly reference for C:\Windows\Microsoft.NET\Framework64\v4.0.30319\System.Web.dll (this path may be different on your machine).

 

[Image]

 

Now we configure the rest of the code to call the Web Service and set the Valid result.

 

#region Usings
using System;
using System.Collections.Generic;
using System.Data;
using System.Globalization;
using System.Text;
using System.Linq;

using Simego.DataSync;
#endregion

using System.Web;
using System.Net;
using System.IO;

partial class DataSourceRowOverride : Simego.DataSync.DynamicColumns.DataSourceRowInternal //Do Not Change This Line
{
    const string APIURL = "http://api.address-validator.net/api/verify";
    const string APIKEY = "my-api-key";
    
    public DateTime MyDateLastChecked { get; set; }
    public bool MyValid { get; set; }
        
    public string MyAddress { get; set; }
    public string MyCity { get; set; }
    public string MyCounty { get; set; }
    public string MyState { get; set; }
    public string MyZip { get; set; }
        
    public override bool BeginRow()
    {
        MyAddress = Address ?? string.Empty;
        MyCity = City ?? string.Empty;
        MyCounty = County ?? string.Empty;
        MyState = State ?? string.Empty;
        MyZip = ZIP ?? string.Empty;
        
        MyDateLastChecked = DateLastChecked ?? DateTime.Parse("2014-01-01");
             
        MyValid = Valid;
        
        //Check Addresses that are 90 Days old.
        if(MyDateLastChecked < DateTime.Today.AddDays(-90))
        {        
            var postData = string.Format("StreetAddress={0}&AdditionalAddressInfo={1}&City={2}&PostalCode={3}&State={4}&CountryCode={5}&Locale=en&APIKey={6}",                
                HttpUtility.UrlEncode(MyAddress),
                HttpUtility.UrlEncode(""),
                HttpUtility.UrlEncode(MyCity),
                HttpUtility.UrlEncode(MyZip),
                HttpUtility.UrlEncode(MyState),
                HttpUtility.UrlEncode(Country),
                APIKEY);
                        
            var postBytes = System.Text.Encoding.UTF8.GetBytes(postData);
            
            //Do the Address Lookup here...                    
            HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(APIURL);
            
            request.Method = "POST";
            request.ContentLength = postBytes.Length;
            request.ContentType = "application/x-www-form-urlencoded; charset=UTF-8";
            
            using(var requestStream = request.GetRequestStream())
            {
                requestStream.Write(postBytes, 0, postBytes.Length);
            }
            
            using(var response = request.GetResponse())
            {
                if(response != null)
                {
                    using(var responseStream = new StreamReader(response.GetResponseStream()))
                    {
                        var resultString = responseStream.ReadToEnd();
                        Trace.WriteLine(resultString);
                        
                        dynamic results = Json.Decode(resultString);
                        
                        MyValid = results["status"] == "VALID";
                    }
                }                
            }
            
            MyDateLastChecked = DateTime.Today;
        }
        
        return true; // return false to skip row from results.
    }    
}

 

That’s all there is to it. If your service returns an updated address, you can then use this result with the My* virtual/dynamic columns to update the database with the corrected address details.
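For example, the result handling inside the response block could be extended along the lines below; the corrected-address field names are assumptions, so check the response format of the service you are using:

dynamic results = Json.Decode(resultString);

MyValid = results["status"] == "VALID";

if (MyValid)
{
    // Hypothetical response field names - adjust to match the actual API.
    MyAddress = results["street"] ?? MyAddress;
    MyCity = results["city"] ?? MyCity;
    MyState = results["state"] ?? MyState;
    MyZip = results["postalcode"] ?? MyZip;
}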

 

[Image]

 

The same concept can be used to clean up, fix or decorate your data with data from other systems via web service calls or similar.


1 December 2014

Deleting Records from an External List

This is an example of using Data Sync to clean up your data by deleting records you have in a separate list or document.

 

In this example we have an Excel Spreadsheet that contains the Record ID numbers of the records we wish to delete from a SQL Table.

 

[Image]

 

To set up a project that will delete the records listed in this spreadsheet, we need to use a Lookup with the source and target mapped to the same data source. The idea here is that we want to return from Data Source A only the records we wish to keep.

 

First set up the project with the Source and Target mapped to the same data source. On its own this will always do nothing, since both sides are equal.

 

[Image]

 

Next add a Lookup to the Key Column (ID) to match the ID values from the Excel Spreadsheet. Set the Data Type to match the Spreadsheet type and Target Column to be ID from the Spreadsheet.

 

[Image]

 

Next add an Expression in the Source Filter to return only those records we wish to keep, i.e. the records that are not listed in the Excel Spreadsheet. We use a test for NULL: if the lookup result is NULL we want to keep the record.

 

The expression in this case is ISNULL(Lookup1_ID)

 

[Image]

 

Now when we compare A->B the result is a DELETE set that matches the records in the Excel Spreadsheet.

 

[Image]

 

This same procedure can be used with virtually any data source, and the delete list can come from many different sources such as a SQL query, FetchXML expression etc.
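To make the set logic concrete, here is a rough C# sketch of what the compare is doing (illustrative only, not the Data Sync API; spreadsheetIds and tableRows are stand-ins for the two data sources):

// IDs read from the spreadsheet - the records we want to delete.
var idsToDelete = new HashSet<int>(spreadsheetIds);

// Data Source A after the filter: only the records we want to keep.
var keep = tableRows.Where(r => !idsToDelete.Contains(r.ID));

// Comparing 'keep' (A) against the full table (B) leaves the spreadsheet
// records unmatched on B, so they come out as DELETE actions.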


21 November 2014

Data Sync and Windows 10

Data Synchronisation Studio looking good on Windows 10 Tech Preview.

 

[Image]


3 November 2014

Re-writing File Paths

Requires: Data Sync Release Version 3.0.888 or higher.

 

The Data Sync File System provider now allows for the re-writing of filenames and paths, which is handy if you want to clean up some directories prior to importing into SharePoint. You can even use it to remove all the directories to create a flat list of files.

 

Start by configuring a simple project that maps the source folder to a target folder.

 

[Image]

 

This source directory contains a few folders and five documents.

 

[Image]

 

To remove all the folders we simply map the Filename to FullFilename and remove the Path and FileName from the schema map.

 

[Image]

 

This produces a directory containing the files with no directories.

 

[Image]

 

Now imagine we want the path and the filename combined. We can easily do this with a Calculated Column that creates a new filename: below we use the FORMAT function to concatenate the Path and Filename, and mapping the result to FullFilename re-writes all the filenames on the target.
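The calculated column expression is along these lines (illustrative; adjust the separator and format string to taste):

FORMAT("{0}-{1}", Path, Filename)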

 

[Image]

 

Producing the following result.

 

[Image]

 

Then with a Dynamic Column you could write a function to calculate a path based on when the file was last modified, allowing you to group files into directories by date.

 

Here we create a new path for each file based on the last time the file was written to, and put them into folders grouped by month.
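The Dynamic Column might look something like the sketch below; the MyPath property name is arbitrary and the DateModified column is an assumption about the File System provider's schema, so adjust to suit (then map MyPath to Path in the schema map):

partial class DataSourceRowOverride : Simego.DataSync.DynamicColumns.DataSourceRowInternal //Do Not Change This Line
{
    //Group files into folders named by year and month, e.g. "2014-11"
    public string MyPath
    {
        get { return DateModified.ToString("yyyy-MM"); }
    }
}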

 

[Image]

 

The schema map is slightly different since we want to map Filename and Path this time.

 

[Image]

 

This time we get folders named as dates, with the files that were modified in that month placed under them.

 

[Image]

 

As you can see, using the File System provider to clean up your directories prior to importing into SharePoint is now quick and easy, and with a little bit of code you can do almost anything.


23 October 2014

Split Column Values into multiple Rows

We were asked how you might use Data Sync to split values stored in a column with a separator character into multiple rows, repeating the data on each row.

 

This is not something Data Sync handles out of the box but with a little bit of Dynamic Column code it is quite possible.

 

Assuming you have some data like this

 

ID,Name,Value
1,Item 1,Option1;Option2;Option3
2,Item 2,Option2;Option3
3,Item 3,

 

You can see that Item 1 and Item 2 have multiple Options where they are separated by a semi-colon (;). Item 3 has no options and is NULL.

 

To do this in Data Sync we need to override the row loading with Dynamic Columns and split the Value column. We also need to be careful, since we can cause a StackOverflowException and crash Data Sync if we do not exit the loop correctly.

 

So in the code below we split the Value column and, if we have more than one result, cancel the current row and create new row items for each element we found. We have to copy the row manually, element by element, and put the single split value into the Value column.

 

    public override bool BeginRow()
    {        
        if(!string.IsNullOrEmpty(Value))
        {
            var parts = Value.Split(';');
            
            if ( parts.Length > 1 )
            {
                foreach(var s in parts)
                {
                    var new_row = Table.NewRow();
                    foreach(var c in Table.Columns) 
                    {
                        if ( c.ColumnName == "Value" )
                        {
                            new_row[c.ColumnName] = s;
                        } 
                        else 
                        {
                            new_row[c.ColumnName] = Row[c.ColumnName];    
                        }
                    }
                    
                    Table.Rows.Add(new_row);
                }
                
                return false;
            }
        }
                            
        return true; 
    }    

Adding the new rows with this code causes the BeginRow() method to be called again for each of the split values. But this time the Value column only contains one value, so we just drop out of the bottom of the method.
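For the sample data above, the expanded result looks like this:

ID,Name,Value
1,Item 1,Option1
1,Item 1,Option2
1,Item 1,Option3
2,Item 2,Option2
2,Item 2,Option3
3,Item 3,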

 

[Image]


17 October 2014

File Drop Automation with Ouvvi

An example where a file is dropped into a folder and processed by Ouvvi. This example is a little more advanced than the simple case, in that the file is stored in the Out or Failure folder depending on the result of the synchronisation.

 

We have 4 folders: Drop, In, Out and Failure.

 

The file is dropped into the Drop Folder, copied to the In folder and then moved to either Out or Failure depending on the result.

 

[Image]

 

This project also uses Ouvvi configuration variables so the paths can be changed easily. We created a variable called ProductAutomationFolder that is the base path to our folder and share.

 

[Image]

 

We have a File based trigger that looks for the products.xml document being dropped into the Drop folder. (Note the timestamp on this file needs to change for the trigger to detect the file).

 

[Image]

 

You can also use a Folder Trigger if you do not know the name of the incoming file. However, in this situation you need to use wildcard file patterns with your Data Sync projects so that Data Sync can locate the file. For instance the CSV reader supports *.csv and the XML Data Set provider supports *.xml.

 

[Image]

 

File and Folder Triggers have a Time between Trigger Capture setting to prevent a double trigger when a file may take a while to be written. Enter a time value here to prevent the double-trigger issue with slow file writing.

 

The main automation project has 5 steps where we copy the Drop file to the In folder so we can work on the file and then process the file.

 

[Image]

 

Step 1 - Copy the file to the IN folder

 

[Image]

 

Step 2 - Run our Data Sync Task

 

[Image]

 

Step 3 - Copy the file to the OUT folder.

This step runs provided the previous steps were successful. We also create a folder for each instance so that we can link the file to the Ouvvi log.

 

[Image]

 

Step 4 - Copy the file to the Failure folder when there was an error.

We also create a folder for each instance so that we can link the file to the Ouvvi Log.

 

[Image]

 

Step 5 - Delete the file from the IN folder.

 

[Image]


8 October 2014

Data Sync with SQL Stored Procedures

It’s been requested many times and finally we’ve done it … you can now use SQL Server Stored Procedures as a Read-Only Data Source in Data Sync release 3.0.882.

 

Simply enter the Stored Procedure in the Command or SQL Tab of the connection using the keyword EXEC to indicate that it’s a Stored Procedure.
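For example (the procedure and parameter names here are hypothetical):

EXEC dbo.GetCustomerAddresses @CountryCode = 'US'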

 

[Image]

 

Because stored procedures can return different results depending on the parameters passed, it’s difficult to get the schema, so we have to execute the stored procedure. We do this in a transaction and roll back any changes that may be made to the database. This means you can use temp tables and the like in the stored procedure, but it might take a while to load if it has a lot of work to do.


16 July 2014

Dynamics CRM Audit History Download

Dynamics CRM 2013 and Dynamics CRM Online both have excellent auditing capabilities. Unfortunately, neither platform allows the full audit trail to be extracted in bulk; they require explicit item-by-item extraction. There are several methods of extracting the details directly out of the on-premise Dynamics CRM SQL Server, but I would always shy away from these solutions as they are unsupported by Microsoft.

 

The solution I am going to present has 3 stages:

 

  1. Extraction of the core ‘Audit’ entity into a SQL Server table.
  2. For each Audit entity the extraction of the detailed audit record into a ‘Dynamic Column’.
  3. Filtering out subsequent requests so that we do not repeatedly extract the same audit detail unnecessarily.

 

Step 1: Extraction of the core ‘Audit’ entity into a SQL Server table.

 

Open the Connect To Data Source dialog and complete the details for your Dynamics CRM Online instance.

 

[Image: Connecting to Dynamics CRM Online]

 

Once you have connected DS3 will display all of the available attributes in the Data Source (A) tree.

 

[Image: Extracting the Dynamics CRM Audit History entity schema]

 

We are going to add a new Dynamic Column called auditdetails to the source giving our complete source schema.

 

Step 2: For each Audit entity the extraction of the detailed audit record into a ‘Dynamic Column’.

 

To add a Dynamic Column you first need to enable the Dynamic Columns feature. From the View menu select the ‘Dynamic Columns Window’; this will present you with the Dynamic Columns tab. On this tab click the ‘Enable Dynamic Columns’ link.

 

[Image: DS3 Dynamic Columns]

 

Dynamic Columns are a very special feature which allows users to create new columns from code. The code can re-use the source and target connections as well as the values of the current row being processed. When you enable the feature you get a default code template.

 

[Image: DS3 Dynamic Columns template]

 

If you are familiar with C# this will be no surprise. We are going to use some SDK code to extract the audit details for each row, but only if there has been a change.

 

Firstly we are going to add some using directives for the necessary Dynamics CRM SDK namespaces:

 

using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;
using Microsoft.Xrm.Sdk.Client;
using Microsoft.Xrm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Metadata;

using System.Runtime.Serialization;
using System.Xml;
using System.IO;

// These namespaces are found in the Microsoft.Crm.Sdk.Proxy.dll assembly
// located in the SDK\bin folder of the SDK download.
using Microsoft.Crm.Sdk.Messages;

 

In between the BeginRow() and Setup() methods we are going to add the following code.

 

The code essentially exposes the audit change details as a property called auditdetails, which returns an XML-formatted string that we can store in the database.

 

public string auditdetails 
{ 
    get 
    {
        //Only get the audit details if there is a Create, Update or Delete
        if(actionname == "Update" || actionname == "Create" || actionname == "Delete")
        {
            Trace.WriteLine("Row {0}", NEXTINT());
            
            //Use the current connection
            Simego.DataSync.Providers.MSCrm2011.MSCrmDataSourceReader reader = (Simego.DataSync.Providers.MSCrm2011.MSCrmDataSourceReader) this.DataSourceA.Reader;
            OrganizationServiceProxy orgService = reader.Connection.GetProxy();
            
            //Create a history request
            RetrieveRecordChangeHistoryRequest changeRequest = new RetrieveRecordChangeHistoryRequest();
            changeRequest.Target = new EntityReference(this.objecttypecode, this.objectid);

            //Get the history from the CRM system
            RetrieveRecordChangeHistoryResponse changeResponse = (RetrieveRecordChangeHistoryResponse)orgService.Execute(changeRequest);

            //Serialise the response into an XML string
            using (var stringWriter = new StringWriter()) 
            {
                XmlWriterSettings settings = new XmlWriterSettings { Indent = true };
                using (var writer = XmlWriter.Create(stringWriter, settings)) 
                {
                    var contractSerializer = new DataContractSerializer(typeof(RetrieveRecordChangeHistoryResponse));
                    contractSerializer.WriteObject(writer, changeResponse);
                }
                return stringWriter.ToString();
            }
        }
        
        //Return an empty string if it is not a Create, Update or Delete
        return "";    
    } 
}

 

This result is essentially the exact contents of a ‘RetrieveRecordChangeHistoryResponse’ as defined by the Dynamics CRM SDK. Why did we do this? Because at a later date we can extract the full details of the change by deserializing it back into a RetrieveRecordChangeHistoryResponse.
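A minimal sketch of that re-hydration step, assuming the XML has been read back out of the database into a string (the auditXml variable name is illustrative):

using (var reader = XmlReader.Create(new StringReader(auditXml)))
{
    var serializer = new DataContractSerializer(typeof(RetrieveRecordChangeHistoryResponse));
    var response = (RetrieveRecordChangeHistoryResponse)serializer.ReadObject(reader);
    
    // response.AuditDetailCollection now holds the full change history details.
}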

 

Pressing the Build button converts the auditdetails code into a column.

 

[Image: Dynamics CRM Audit Details extracted into a column]

 

We can now use another of DS3’s unique features, which will create a SQL table with exactly the same schema as Data Source (A). From the Tools menu select Tools –> Create SQL Table. You will be prompted through a wizard which will eventually create a SQL table in a SQL Server and connect Data Source (B) to it.

 

[Image: Create a SQL Table from the Dynamics CRM schema]

 

Once connected to the SQL Server table, the schemas should automatically map themselves together.

 

[Image: Dynamics CRM integration with a SQL Server Table for Audit History]

 

We could now read all of the details from the audit entity and write them to the SQL table. But if we consider the probable size of the audit table, this could be an extremely long process to repeat unnecessarily. So we need some way of telling DS3 that we have already written an audit record to the local SQL Server, so there is no point in calling back to Dynamics CRM again. We can do this by using a lookup into the target SQL audit database.

 

Step 3: Filtering out subsequent requests so that we do not repeatedly extract the same audit detail unnecessarily.

 

 

 

For each row we need to look up in Data Source (B) whether we have already extracted the audit record. We can achieve this by creating a Calculated Column called ‘AuditValueInDatabase’ using the LOOKUPB function. Essentially, this tells DS3 to find the auditid value from the CRMAudit table where the audit id in CRM matches the auditid in the database. If the result has no value then we still need to extract the record.
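The expression takes roughly this form (treat the exact LOOKUPB signature as an assumption and check the Data Sync function library):

LOOKUPB("auditid", "CRMAudit", WHEN("auditid", auditid))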

 

[Image: Calculated column for looking up values in the SQL Table]

 

The next step is to filter the results to only include items we cannot find in the target SQL table. This is easy, as DS3 has a filter feature which we can set to include rows where ‘AuditValueInDatabase’ is null (nothing returned).

 

ISNULL(AuditValueInDatabase)

 

[Image: Filtering out the previously recorded audit items]

 

We are almost finished with our solution. Ideally we want to keep extracting the audit trail to keep it up to date, rather than extracting the whole audit table every day. We will use the ‘FetchXMLFilterExpression’ feature to extract only the last 2 days’ audit entries. We can run this extract every day, and if we miss a day it will catch up without writing duplicates, as DS3 reconciles the changes before writing any records. The FetchXML looks like:

 

<filter type="and">
     <condition attribute="createdon" operator="last-x-days" value="2" />
</filter>

 

We can now run our complete Dynamics CRM Audit download project every day.