
Team Blog

Keep up to date with everything going on at Simego.

9 August 2016

Use Visual Studio External Tools with Data Sync

This is an interesting use-case I just discovered: Visual Studio's External Tools feature allows you to run Data Sync projects from the Visual Studio designer and capture the output in the Visual Studio Output window.


Set up the External Tool to start the Data Sync RunTool.
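For reference, a typical External Tools entry (Tools → External Tools in Visual Studio) might look like the following. The executable name and install path shown here are assumptions; check your own Data Sync install folder for the actual run tool. The $(ItemPath) and $(ItemDir) macros are standard Visual Studio External Tools variables.

```
Title:             Run Data Sync Project
Command:           C:\Program Files\Simego\Data Synchronisation Studio\Simego.DataSync.Run.exe   (path/name assumed)
Arguments:         "$(ItemPath)"
Initial directory: $(ItemDir)
[x] Use Output window
```

Ticking "Use Output window" is what redirects the run tool's console output into the Visual Studio Output window.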




Then select the Data Sync project in Solution Explorer and run it via the External Tool.



25 July 2016

Connection Dialog Updated

In the latest preview release of Data Sync we have updated the Connection Dialog window to make it easier to locate providers and access the connection library.


As Data Sync gained more and more providers it was becoming difficult to find a provider in the list. We now group the providers by type in a tree list, which makes it much easier. Also, since this was a rewrite of some old code, it's now faster too.




Additionally, you can now Quick Connect without creating a Library connection by selecting Connect.


Lastly, connecting via the Connection Library also updates the Library *.dsconn file, so it's now easier to update credentials: simply update a Library connection and connect.

29 June 2016

Method to add Change Tracking to SQL Table

This is a method to add an Updated timestamp to a SQL table where one does not exist and the source application does not record change information as data is modified.


This would normally be implemented within the application, or perhaps with SQL triggers; however, here is another approach using Data Sync and a hash value stored on the row.


You might want to consider this if your tables are large and you're syncing into something that is slow or over the internet, e.g. Dynamics CRM, where Data Sync incremental sync would be faster. With a change timestamp you can select records that have changed since a point in time.


I have started out with a simple Table, a copy of the Suppliers Table from the Northwind database. This does not include any tracking information.


CREATE TABLE [dbo].[MySuppliers] (
    [SupplierID] int IDENTITY(1,1) NOT NULL PRIMARY KEY, -- key column from Northwind Suppliers
    [CompanyName] nvarchar(40) NOT NULL,
    [ContactName] nvarchar(30) NULL,
    [ContactTitle] nvarchar(30) NULL,
    [Address] nvarchar(60) NULL,
    [City] nvarchar(15) NULL,
    [Region] nvarchar(15) NULL,
    [PostalCode] nvarchar(10) NULL,
    [Country] nvarchar(15) NULL,
    [Phone] nvarchar(24) NULL,
    [Fax] nvarchar(24) NULL,
    [HomePage] nvarchar(MAX) NULL
)


Next we add two columns: one to hold the hash value and one to hold the time of change.


ALTER TABLE [dbo].[MySuppliers] ADD [Hash] nvarchar(50) NULL
ALTER TABLE [dbo].[MySuppliers] ADD [Updated] datetime DEFAULT(GETUTCDATE()) NOT NULL


Next we setup the Data Sync project, connect the Source and Target to the same SQL Table, MySuppliers in this case.


Then Create 2 Calculated Columns which will provide the change tracking information.


1. MyHash – Returns a String MD5 hash of the values in the row.






2. MyUpdated – Returns DateTime of the Change (when this project is run)




Map these new Calculated columns to Hash and Updated in the Target.




Now for the trick: we want to update the time and hash values only when the data has changed. To do this we set the TriggerUpdate property in the schema map to False for all columns except MyHash.




Now, the first time we run the project all rows will be updated, but on subsequent runs the Updated value will only change for rows where the hash is different.
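A minimal Python sketch of the idea (this illustrates the technique, not Data Sync's internals; the exact hash function and value formatting Data Sync uses may differ):

```python
import hashlib
from datetime import datetime, timezone

def row_hash(row: dict) -> str:
    # Join the column values and MD5 them, mirroring the MyHash calculated column.
    joined = "|".join("" if v is None else str(v) for v in row.values())
    return hashlib.md5(joined.encode("utf-8")).hexdigest()

def track_change(stored_hash, row):
    """Return (hash, updated) - the timestamp only moves when the hash differs."""
    new_hash = row_hash(row)
    if new_hash != stored_hash:
        return new_hash, datetime.now(timezone.utc)  # row changed: refresh Updated
    return stored_hash, None                         # row unchanged: Updated stays put
```

Because the stored hash only changes when the underlying values change, the timestamp column ends up recording the last time the row's data actually differed.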


To generate the change tracking, you would run this as a pre-step in Ouvvi before the project you want to run as an incremental sync. The benefit is that if you have a table of, say, 1 million rows going into Dynamics CRM, running this against SQL locally first ensures the subsequent sync into CRM in incremental mode is fast, as only changed rows need be loaded.



8 June 2016

Dynamics CRM Error Logging

We have improved the error response you get when you hit an error against Dynamics CRM with Data Sync. We now format the request in a more human-readable way and add it to the error log along with the response from Dynamics CRM.


For example, attempting to update the totalamount on an opportunity to a value that is out of range writes the text below to the error log. You can now easily see the Guid of the target entity item and the data being sent in the request. This should make it much easier to diagnose issues with Dynamics CRM.




A validation error occurred.  The value of 'totalamount' on record of type 'opportunity' is outside the valid range.

    createdbyname=Sean Cleaver
    createdbyyominame=Sean Cleaver
    <meta http-equiv="Content-Type" content="text/html; charset=utf-8"> <body> ... (value truncated)
    modifiedbyname=Sean Cleaver
    modifiedbyyominame=Sean Cleaver
    owneridname=Sean Cleaver
    owneridyominame=Sean Cleaver
    <meta http-equiv="Content-Type" content="text/html; charset=utf-8"> <body> ... (value truncated)
    subject=Thank you for your order 205-6373703!

Account With Id = ccc5ef83-c210-e411-80bf-00155d013a04 Does Not Exist

Dynamics CRM Error

3 June 2016

Static Data Provider

This is something I have been meaning to build for some time now. I wanted the ability to take some data, store it in the Data Sync project file, and use this as the source for some other project. You could always use XML files, but that requires two files: the XML data and the Data Sync project.


You might use this to create Test Environments or reset some Data Source prior to testing or even port data between two environments that have no physical connection.


The problem was I had no way of adding a block of XML to the project file during load and save other than in an XML attribute. So I have extended the Data Sync IDataSourceReader interface to allow for loading and saving an XML element as well as the Parameter elements.


This provider persists the data in a format similar to the .NET DataSet, so it's human readable in the project file and small changes can be made manually if necessary. The data is loaded when the project is opened, so if you make a change through the back door you need to reload the project. Also, if you change the schema it's best to re-create it.
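As an illustration, the persisted block might look something like a DataSet-style XML fragment. The element names below are hypothetical, not the exact schema Data Sync writes; inspect your own project file to see the real layout.

```xml
<StaticData>
  <MySuppliers>
    <CompanyName>Exotic Liquids</CompanyName>
    <City>London</City>
  </MySuppliers>
  <MySuppliers>
    <CompanyName>New Orleans Cajun Delights</CompanyName>
    <City>New Orleans</City>
  </MySuppliers>
</StaticData>
```

In a DataSet-style layout each repeated element is a row and its children are the column values, which is why small hand edits in the project file are practical.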


It's easy to use: first connect to the source of your data; this is what you want to copy into the project file.




Then choose Create Static Datasource from the Tools Menu




This automatically sets up the Static Datasource copying the Schema from the Schema Map.




Compare and Sync your Data to load the Data into the Static Datasource provider.




Then use the Swap Sides function from the Tools Menu to make the Static Data the Source of your project.




That's it: you now have a Static Datasource set up that contains a copy of the data in the Data Sync project file.



1 June 2016

Custom REST Data Sync Provider with Project Automation

In this example we are going to show you how to use a new feature in Project Automation with the Dynamic Data Source Provider to create a custom REST provider.


First, the REST service: this is the Ouvvi Data Service, which returns a simple Json result from a Url like the one below. You will see each data item is within the array “data”.
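The response has roughly this shape; only the “data” array is given in the text above, so the field names below are illustrative rather than the actual Ouvvi Data Service schema:

```json
{
  "data": [
    { "Id": 1, "Name": "Project One" },
    { "Id": 2, "Name": "Project Two" }
  ]
}
```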




First we set up the project: we use the Empty/Dynamic Datasource as the Source provider and configure the Columns collection to match the columns that will be returned from the REST service. This is the default schema, as this provider will not discover the schema.


We also set the Target to be an instance of the Empty/Dynamic Datasource for development; we can change this later when it's all working. (FYI: Tools->Create Null/Empty Datasource does this automatically, matching the Schema Map.)




Running this returns zero results since it does nothing right now, but we're now set to jump into Project Automation and write some code that will call the REST service, return a Json document, then parse the result and populate the Data Sync DataTableStore table with the data.




First up, a bit of boilerplate code. We need to let the Source provider know that we have an implementation for the data load, so we set a delegate to be called for the data load and then return the DataTableStore (currently empty).




    public override void Start()
    {
        DataSourceA.GetDataTableCallback = GetDataTableSource;
    }

    public DataTableStore GetDataTableSource(DataTableStoreCallbackInfo info)
    {
        return info.Store;
    }

Next we need some help with the Http request calls. We could do it all ourselves, but it would be better if Data Sync could help us here … HttpWebRequestHelper to the rescue. Add a new instance of the HttpWebRequestHelper class to the class instance variables.


    private HttpWebRequestHelper helper = new HttpWebRequestHelper() { UseDefaultCredentials = true };

    public override void Start()
    {
        DataSourceA.GetDataTableCallback = GetDataTableSource;
    }

    public DataTableStore GetDataTableSource(DataTableStoreCallbackInfo info)
    {
        return info.Store;
    }

Now for the bit where we get the Json document and load the DataTableStore with data. A simple call via GetRequestAsJson returns the Json document from the service; we then enumerate the “data” array, adding each row to the DataTableStore.


The Add method on the DataTableStore Rows collection takes a delegate that asks for each column, passing you the column name from the source. So you simply look it up and return its value.




    private HttpWebRequestHelper helper = new HttpWebRequestHelper() { UseDefaultCredentials = true };

    public override void Start()
    {
        DataSourceA.GetDataTableCallback = GetDataTableSource;
    }

    public DataTableStore GetDataTableSource(DataTableStoreCallbackInfo info)
    {
        var jsonResponse = helper.GetRequestAsJson("http://test.internal.simego.com/api/list/projects");

        foreach (var item_json in jsonResponse["data"])
        {
            info.Store.Rows.Add(info, (o, columnName) => item_json[columnName].ToObject(typeof(object)));
        }

        return info.Store;
    }


There you have it: a custom REST-based provider in Data Sync, all without Visual Studio and in just a few lines of code. If you need to write back to the data source you can implement your own code in the Project Automation item events, or provide a full IDataSourceWriter implementation via the GetWriterCallback delegate.




The HttpWebRequestHelper class provides many helper methods to assist with working with Http Calls. We have already re-written the Simego Web API provider to use this new helper and started a new OData 4.0 provider based on it.



1 June 2016

Sending Email Message in Project Automation

We have added a helper function to the Project Automation base class for sending email messages; you can now simply call the SendEmail method to send an email at the end of the sync. (SendEmail is a wrapper around the .NET SmtpClient in System.Net.Mail.)


    public override void End(ProjectAutomationResult result)
    {
        SendEmail("support@simego.com", "Test Send", "Testing Send");
    }

There are several method overloads that also allow you to add attachments to your email messages.


The SMTP server settings can be supplied either inline or via the Project Properties. If you're running under Ouvvi, the SMTP settings will be read via the Values Store Service, so make sure you point it at your Ouvvi tenant.


If you're using project properties for the SMTP server details, use the following property names:








If you use the project properties and you're running under Ouvvi, Ouvvi will automatically update them from the Ouvvi configuration database at runtime.

1 June 2016

Output Change Set as XML

It’s now possible to output the Data Sync change set as an XML document directly from the Compare Results and via Project Automation.


Simply press the “Export Results Xml” button on the Compare Results to generate an Xml Document containing the change details.




The XML output format is a .NET DataSet XML file that contains multiple tables: Added, Changed-Source, Changed-Target and Deleted. (These files can be read by the .NET DataSet provider in Data Sync.)
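If you want to post-process an exported change set outside of .NET, a small sketch like this handles DataSet-style XML; the table and column names in the sample document are illustrative, so check a real exported file for the exact schema:

```python
import xml.etree.ElementTree as ET

# A minimal DataSet-style change set. The element names here are
# illustrative, not the exact schema Data Sync emits.
CHANGESET = """\
<ChangeSet>
  <Added><CompanyName>Exotic Liquids</CompanyName></Added>
  <Added><CompanyName>Tokyo Traders</CompanyName></Added>
  <Deleted><CompanyName>Old Supplier</CompanyName></Deleted>
</ChangeSet>
"""

def rows_by_table(xml_text):
    # Group each row element under its table name (Added, Deleted, ...).
    tables = {}
    for row in ET.fromstring(xml_text):
        tables.setdefault(row.tag, []).append({c.tag: c.text for c in row})
    return tables

print(rows_by_table(CHANGESET))
```

In DataSet XML each child element of the root is one row of the table named by its tag, which is why grouping by tag recovers the Added/Changed/Deleted tables.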




If you then want to output a change set file automatically each time the sync runs, you can use the Project Automation End event to write the change file.


Get a Change Set XML Serializer from the Compare Result and write the result to a file.


    public override void End(ProjectAutomationResult result)
    {
        result.CompareResult.GetChangeSetSerializer().WriteXml(string.Format("ChangeSet-{0}.xml", Environment.TickCount));
    }

19 April 2016

Output Change Set Data

Outputting the changes between syncs has been requested by several clients recently. There's no easy way to do it other than to enumerate the change set. We are hoping we can fix this, so this is a request for some feedback!


Currently we have added a method called GetChangeSets() on the compareResult; this returns an object with three DataTableStore objects: Added, Updated and Deleted.


There is also a helper method to combine them into a DataSet with three tables, which you can then simply write to a file.




public override void AfterCompare(ProjectAutomationCompareResult compareResult)
{
    compareResult.GetChangeSets().GetAllChangesDataSet().WriteXml("AllChanges.xml", XmlWriteMode.WriteSchema);
}




One problem with the .NET DataSet object is that it likes to munch on system memory, so if the change set is large it's going to eat a lot of memory.


So I guess we’re asking given a change set what do you want to do with it?


3xCSV files? 3xXML Files?


How would you see this working?

30 March 2016

Converting Integers to Guids

We know that integers are not Guids; they are very different data types. This causes a few complications with systems like Dynamics CRM that use Guids for primary keys.


Most source systems will use an integer as the primary key, typically an auto-incrementing ID column on the source table.


If we could simply create a Guid from the Integer value we could keep the Integer ID column in our source system and link it to the Guid Primary Key in the Target System. No longer would we need to map the ID to a custom column or manage other lookup/conversion tables.


If we take a Guid value and replace the first 4 bytes with the bytes from an integer, we can then reliably convert an integer to a Guid.


OK, so we lose the guarantee that every Guid value is unique; however, are these values really likely to exist in our small applications?


Taking an Empty Guid and replacing the first 4 bytes with the values 1-9 produces.












Or take an existing Guid and replace the first 4 bytes with the values 1-9, which produces:












So as of the Data Sync 3.0.992 release we have added new Calculated Column functions, NEWGUID(int) and NEWGUID(guid, int); these functions help us produce the values above.


This version will merge your integer value with an Empty Guid.




This version will merge your integer value with a defined Guid value.


NEWGUID(GUID("{E03F2193-7DF5-4947-AC47-E05EA89D21E4}"), ID)


So now you can convert your source integer ID columns directly to a Guid to use with Dynamics CRM and map directly to the CRM Entity ID.
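For illustration, the byte merge described above can be sketched in Python. This is a sketch of the technique, not Simego's implementation; bytes_le is used because it mirrors the .NET Guid byte layout, where the first 4 bytes are the little-endian Data1 field.

```python
import struct
import uuid

def new_guid(n: int, base: uuid.UUID = uuid.UUID(int=0)) -> uuid.UUID:
    # Overwrite the first 4 bytes (.NET Guid's little-endian Data1 field)
    # with the integer, leaving the remaining 12 bytes of the base Guid intact.
    b = bytearray(base.bytes_le)
    b[0:4] = struct.pack("<i", n)
    return uuid.UUID(bytes_le=bytes(b))

print(new_guid(1))
# 00000001-0000-0000-0000-000000000000
print(new_guid(1, uuid.UUID("{E03F2193-7DF5-4947-AC47-E05EA89D21E4}")))
# 00000001-7df5-4947-ac47-e05ea89d21e4
```

Because only the first 4 bytes change, the mapping from integer to Guid is deterministic and reversible: reading those 4 bytes back recovers the original ID.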