Team Blog

Keep up to date with everything going on at Simego.

16 April 2014

Triggering a Synchronisation Project from Dynamics CRM Online

There are many ways to integrate data between systems. What is often forgotten is that we only want to process these imports or exports when something has changed.


We often just run the synchronisation or integration at a specific time or periodically.


For example, we may want to identify when a new contact has been created or updated within Dynamics CRM Online. Perhaps we want this information to be synchronised with a local SQL table for reporting purposes.


We currently have several choices.


Synchronise on a regular basis. We can just reconcile and then synchronise at a regular interval between the Dynamics CRM contact list and our local database contacts table. This technique requires that each time we connect to Dynamics CRM, we authenticate, then query the Contacts entity to evaluate whether there have been any changes. This obviously increases the workload on Dynamics CRM Online, and potentially 95% of the time there will be no data changes and therefore no synchronisation required.


Poll continuously, looking to see if there are any new records or records that have changed. This technique requires that each time we connect to Dynamics CRM, we authenticate, then query the Contacts entity to evaluate whether there have been any changes. If there are changes, we then request a full synchronisation. This method is slightly less onerous than just synchronising regularly, but is still impactful on the Dynamics CRM Online workload.


Send a message via the Azure Service Bus. This seems ideal, as Microsoft has given us some infrastructure to pass messages over. Unfortunately, we would have to go through some significant authentication and authorisation steps. In addition, we would want something flexible enough to attach to any change, which we would have to program ourselves. This would be technically challenging and expensive to develop.


Create a Plug-in or Custom Workflow. We can use the built-in Dynamics CRM extensibility to create a plug-in or, better, a workflow step that will notify the synchronisation engine that an event has occurred and it should begin synchronisation. The challenge in this scenario is that we would need to create some form of authentication to allow us to tell the on-premise system that there have been changes. In addition, we would need to configure some security from the internet-based CRM Online to the on-premise synchronisation engine, which may be behind a firewall.



What we really need is a combination that requires no authentication, needs no development, and is easy to configure:


1. Create a workflow component we can call from anywhere within Dynamics CRM that can notify an external system when something has changed.


2. Create an internet notification store that can be told when something has changed and can be queried about those changes. No identifying data is stored and no authentication is required, because we don't want to store security credentials in the Dynamics CRM environment or tunnel through internal firewalls.


3. Create a polling system that can monitor the internet based store for changes and trigger processes if there has been a change.
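At its core, step 3 just compares the timestamp the notification store returns now against the one seen on the previous poll, and fires the process only when they differ. A minimal sketch of that comparison (the function name and timestamps are illustrative, not part of any Simego product):

```shell
# Compare the previously seen timestamp with the store's current one.
check_changed() {
  previous="$1"
  current="$2"
  if [ "$current" != "$previous" ]; then
    echo "changed"     # trigger the synchronisation project
  else
    echo "unchanged"   # nothing to do this poll
  fi
}

check_changed "2014-04-16T09:00:00Z" "2014-04-16T10:30:00Z"   # prints "changed"
```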


The Dynamics CRM Workflow Component

You can download it from here (including source code).


Register the component the same as any other Dynamics CRM workflow/plug-in (Simego.CRM.Workflow.dll).


Create a process and call the workflow component




Configure the workflow step to use a made-up GUID. I chose this one, but you can use any one. Just remember it, as we have to tell the monitoring system to watch for it.
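Any value will do as long as it is a valid GUID and you keep a note of it. As a sketch, you can generate one from the command line (assuming a Linux-style environment):

```shell
# Generate a random GUID to paste into the workflow step configuration
GUID=$(uuidgen 2>/dev/null || cat /proc/sys/kernel/random/uuid)
echo "$GUID"
```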







Save and Close then Activate the workflow.


The ‘Internet Notification Store’

Using the Simego Online Last Changed Service

Simego has developed a solution for exactly this purpose. It can be used by any system that can call a URL. If you want to read about it in detail, the help is here:


Every time a Contact now changes the workflow is going to call this URL:


and when that URL is called it will update the 'lastchanged' date/time associated with the GUID.


This allows any system to query the URL and find out if something has changed since the last time it looked. This is unsecured but contains no identifying information and scales incredibly well.
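In practice the store is driven by nothing more than two unauthenticated HTTP GETs. The sketch below simulates the update/query pattern with a local file store; the commented curl lines show the shape of the real calls with an illustrative URL, not the documented endpoints:

```shell
# Shape of the real calls (illustrative URL; substitute your service URL and GUID):
#   curl "https://<last-changed-service>/update/<your-guid>"   # workflow step: record a change
#   curl "https://<last-changed-service>/query/<your-guid>"    # poller: read last-changed time
# The same update/query pattern, simulated with a local file store:
STORE=$(mktemp)
update() { date -u +%Y-%m-%dT%H:%M:%SZ > "$STORE"; }
query()  { cat "$STORE"; }
update   # what the workflow component does when a contact changes
query    # what any system can do to see when the GUID last changed
```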


To query the last change we just call this URL:


The ‘Internet Notification Store’ Polling


Simego DS3 Automation Server and Simego Online both have the ability to poll the 'Last Changed Service' and trigger associated projects by using a Json Trigger.


Attaching a Trigger using the Json Trigger in the Automation Server.


Open the Automation Server (Ouvvi) and navigate to the Triggers page. Choose Add Trigger and select the Json Trigger.




Configure the trigger with your Service URL


Set the Json Expression to ‘lastchange’.




We can now create a synchronisation project to synchronise data from Dynamics CRM Online to our on-premise database.


Every time there is a change in the contact entity online it will be synchronised locally.

15 April 2014

Web API Ouvvi Step Handler

We have migrated the really handy Web API Step Handler from Simego Online into DS3 Automation Server (Ouvvi). This allows you to set up calls to web applications that have documented APIs, such as other notification systems.


For example, configuring the Web API Step to send messages to your mobile devices via the Pushover system is as easy as setting up the request as shown below.





You can also use this to test your Web APIs and ensure that they are working as intended, or perhaps as a Queue Relay Broker service to integrate application messages.


The Web API step allows you to control the request details and validate the response, so you can test to ensure the response is what you expected.



15 April 2014

Ouvvi 32/64 bit Data Sync Options

We have a new release of both Data Sync and DS3 Automation Server (Ouvvi) that enables running both 64-bit and 32-bit processes from a single installation.


The 64-bit version of Data Sync now ships with 32-bit components as well, and from Ouvvi you can choose how to run the Data Sync Task. This can be In-Process (the default, which is how it works today), or one of two new options: External 32-Bit Process and External 64-Bit Process.


These new options run the Data Sync Task in a new process that is either 32-bit or 64-bit. This enables you to mix sync tasks when you need to access 32-bit legacy ODBC drivers whilst maintaining a 64-bit installation.


To set the execution mode you simply select the mode of operation on the Data Sync Task in Ouvvi.



11 April 2014

Connection Library Data Preview

We’ve been busy extending the Connection Library features of Data Sync to really help you discover your data whilst building out your migration or integration.


You can now preview data right from the context menu in the Connection Library. This works against any data object in the Connection Library and will return approximately the first 1000 items.


For Dynamics CRM we also now support OptionSets directly which can be used for drag & drop Lookups or on their own.



24 March 2014

Using FTP and Simego Automation Server to Import and Export files

FTP continues to be a highly used internet protocol despite many innovative products like Dropbox and SkyDrive. Unfortunately, due to considerable complexity in implementing FTP downloads and uploads, developing your own solutions can be painful.


We have worked with our clients to provide the easiest and most consistently successful method of downloading and uploading files over FTP, using the power and auditability of Simego Automation Server and 'Wget', a well-known and reliable HTTP, HTTPS and FTP command line application.


1. Download the Wget binaries and place them in a location on the Automation Server (Ouvvi), e.g. C:\Temp\FTPClient.


Download the binaries here: in the bin directory is the wget.exe, which needs to be copied.


2. Create 2 new 'User Settings' in Ouvvi
Settings -> User Settings -> Add Setting


3. Create a new 'External Program' step in a project to call the wget command line tool to download the files.



Running the above will download the i386.exe file to the working directory.
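As a sketch, the command line the External Program step executes looks roughly like the following; the host, remote path and setting values are hypothetical, with the two credentials coming from the Ouvvi User Settings created in step 2 (only the command string is built and printed here):

```shell
# Values normally supplied by the two Ouvvi User Settings (hypothetical names/values):
FTP_USER="ftpuser"
FTP_PASSWORD="secret"
# wget's --ftp-user/--ftp-password options carry the credentials:
CMD="C:\Temp\FTPClient\wget.exe --ftp-user=$FTP_USER --ftp-password=$FTP_PASSWORD ftp://ftp.example.com/pub/i386.exe"
echo "$CMD"
```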


You can browse all of the functionality of wget here, which includes an excellent way to synchronise FTP and local directories.

17 March 2014

Simple Application Integration with Simego Online

This is a quick example of how we used the Simego Online platform to quickly provide some application integration between our systems without coding (implementation time: 30 minutes).


We started out like a lot of people: we had a website and then a LOB application (Helpdesk). Both systems were independent, each with its own database and list of users, all hosted on Windows Azure. This was fine for a while, but then it got to a point where it would be nice to at least link the user profiles.




Simego Online and Data Sync to the rescue: by setting up a simple Simego Online project to run a Data Sync project whenever users update their profile on our main website, profiles in our HelpDesk are kept in sync.


We used the SQL (Azure) Trigger in Simego Online to run the project whenever a change was made, so that changes were reflected in near real-time.




The benefit here is that this is all managed in the cloud on the Simego Online platform, so we do not need to manage any servers or databases on-premise, and the whole process just works!


We could now easily add another project to sync user profile data into Dynamics CRM or other systems, all automatically.

17 March 2014

Application Asynchronous Integration

By leveraging the Simego Online platform, you can have Simego Online call your application back to create asynchronous tasks. You might want to leverage this capability for long-running tasks where you do not want to build background services, but rather just create simple HTTP services that are called.


For example, simple tasks like sending email or querying mailboxes can take some time to execute, and could instead be executed outside of the user context.


We use this service to run the background processing of our HelpDesk without the need to run a separate server. Our HelpDesk runs on Windows Azure Websites and all the background processing is handled via callbacks from Simego Online.


One of our process flows is as follows:

New Email –> Call HelpDesk to Process Inbox –> For Each Message Set up Callback to get Message and Process


Then we move onto

Get Message –> Add to HelpDesk –> Set up Callback to Email User


This means we can create very small ASP.NET MVC Controller Actions that execute a simple task and tie them together via Simego Online callbacks. All this then runs in the cloud as if by magic, and we can view every execution from Simego Online and even re-submit failed tasks.


Example dashboard showing HelpDesk operations being executed against our HelpDesk Website.




So how do you set this up?


Simego Online has a very useful Step Handler called Web API. This allows you to execute web actions and validate their responses. It could be used for Web API endpoint testing or anything else where you want to call an API endpoint.


You can send GET/POST/PUT/DELETE requests, add headers, add a body, use Basic Authentication and validate the response.



So in our HelpDesk, one of our useful projects is a simple relay service where we send in a message that gets relayed back to us some time later for processing. Because we don't want to get overloaded, this all happens in a nice orderly fashion, one at a time. We also have the ability to configure a retry and failure condition, and we could also set up an Email Task to let us know if something fails.


We’re also using a little-known feature of Simego Online that allows us to expand variables at runtime. In the configuration of the step we set the URL to {{ReturnURL}}, which is actually a value passed to Simego Online by the client application; we also use this in the body, so when we get called back we receive an ID value that means something to us. We also validate that the response returns a 200 OK, otherwise we retry 3 times before putting the message on the Failure Queue.




To initiate a Task to run we need to POST our message data to Simego Online to start the Project.


The URL is like this: <account>/<workstream>/ProjectApi/Start/<ProjectName>


Where you replace account, workstream and project name with your values.


The body you POST is a JSON value with the details you want to use in the project. This URL is authenticated with Basic auth over SSL, so you should create an API user account under Settings –> User Profiles and set the Permission Level to User so that this user can start projects.


We’re using ReturnURL and ID in our step, so our body is:


{ "ReturnURL": "MyURLGoesHere", "ID": 1234 }
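Expressed as an HTTP call, the request looks roughly like the commented curl line below; the host, credentials and project name are placeholders, and only the body construction is executed here:

```shell
# JSON body with the values the project expects (callback URL is a placeholder):
BODY='{ "ReturnURL": "https://example.com/helpdesk/callback", "ID": 1234 }'
# The POST itself (placeholders for host, account, workstream, project and credentials):
#   curl -u api-user:password -H "Content-Type: application/json" \
#        -d "$BODY" "https://<host>/<account>/<workstream>/ProjectApi/Start/<ProjectName>"
echo "$BODY"
```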


Then when Simego Online executes the project, we get called back via a POST to our URL with the ID value, and it's up to us to simply execute the task.


We set up a simple helper function to POST the message to Simego Online like this:


using System.IO;
using System.Net;
using System.Text;

public string PostJsonRequest(string url, string jsonString)
{
    // Create and configure the JSON POST request
    HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create(url);
    webRequest.Method = "POST";
    webRequest.ContentType = "application/json";
    webRequest.Accept = "application/json";
    webRequest.ServicePoint.Expect100Continue = false;
    webRequest.Credentials = new NetworkCredential(Settings.GetSetting<string>(SettingKey.QueueApiUsername), "");

    if (!string.IsNullOrEmpty(jsonString))
    {
        // Write the JSON payload to the request body
        byte[] data = Encoding.UTF8.GetBytes(jsonString);

        webRequest.ContentLength = data.Length;

        using (Stream requestStream = webRequest.GetRequestStream())
        {
            requestStream.Write(data, 0, data.Length);
        }
    }

    // Read and return the response body
    using (HttpWebResponse response = (HttpWebResponse)webRequest.GetResponse())
    using (StreamReader sr = new StreamReader(response.GetResponseStream()))
    {
        return sr.ReadToEnd();
    }
}


This system allows us to easily deploy new services that execute background tasks via simple web sites, which otherwise would have become complicated and required multiple services.


You can of course simply have this called on a schedule, like every 5 minutes, to run some background tasks.

20 February 2014

Announcing Intercom Data Provider for DS3

We have a new Data Provider available for Intercom, which provides full read/write capability, allowing you to import, export and synchronise data between Intercom and many different systems.


For example with this you could set up an Intercom to Dynamics CRM Lead Capture integration, or use Data Sync to decorate your Intercom data with data from your own internal databases.






The full source code for the Intercom provider is available in the Data Sync installation directory, so that you can study it to build similar REST API providers or adapt it to your own needs.

19 February 2014

Creating Custom Data Providers with Visual Studio

We’ve added a new feature to make it super easy to start building your own data providers for Data Sync. This feature is currently in beta so you need release 3.0.808 or greater.


With a click of a button, the feature creates a Visual Studio project from a template that can be opened in Visual Studio and gives you F5 run and debug.






This new Project Opened in Visual Studio 2012




Full F5 Debugging via Visual Studio to help you build the provider.




This allows you to code up whatever you want and then deploy the assembly to Data Sync and use it just like the providers we have built.


We often get asked how long it takes to write a provider for Data Sync. It really depends on the system you connect to and the API it exposes; however, you can generally be up and running with a simple read-only provider in a couple of hours. We recently created a complete read-write provider for Intercom, which has a JSON REST-based API, and this took approximately 8 hours over a weekend to complete.

6 February 2014

New Connection Library Video

A quick video showing how easy it is now to connect to your source or target data, preview the data and attach a Lookup to the data source.