Team Blog

Keep up to date with everything going on at Simego.


22 April 2014

Uploading and Downloading files using FTP in Simego DS3 Server (Ouvvi)

FTP is a mainstay of B2B integration, and whilst there may be other options to transfer files online, its ability to offer direct B2B connectivity makes it a popular choice.

 

FTP Upload and Download are now included in the increasing number of DS3 Server Project ‘Steps’.

 

This post demonstrates how to create a project which downloads an XML file from, and uploads a CSV file back to, an FTP server.

 

The FTP server is configured to allow only secure connections, as this is a typical configuration.

 

Create a Project to contain the FTP Steps

From the Projects Page of DS3 Server, create a new project and name it ‘FTP Integration test’.

 

Add a new ‘FTP Download’ Step

[Screenshot: adding the FTP Download step]

 

Name the step ‘Download File via FTP (s)’ and then configure it.

 

FTP Server File Path 

This is the URL-style path to the file on the FTP server, e.g. ftp://localhost/data.xml

 

FTP Server uses Passive Mode

This allows the FTP server to nominate the port used for the data transfer, which is usually friendlier to firewalls on the client side. This is normally enabled.

 

FTP Server uses SSL

If the server is configured to transfer files and authentication details over SSL then you will need to enable this setting. This is normally enabled.

 

FTP Server accept SSL certificate

It is possible (and likely) that the certificate used to secure the FTP server is not signed by a trusted root authority or is not trusted by your own machine. In this case you can indicate that it is OK to accept this SSL certificate. This is optionally enabled.

 

Destination Filename 

Define the name and location of the file once it has been downloaded, e.g. c:\temp\medals.xml

 

Username 

Set the username used to authenticate with the FTP server.

 

Password

Set the corresponding password to be used with the Username.

 

Timeout

This is how long we will wait for a response from the FTP server before giving up, expressed as hh:mm:ss. 30 seconds (00:00:30) is normally reasonable.

 

Once you have completed the configuration it should look something like this.

 

[Screenshot: completed FTP Download step configuration]

 

Save the Step.
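
As a point of reference, these options map closely onto the standard .NET FTP client settings. Below is a minimal C# sketch of an equivalent secure download, using the example paths from above and a hypothetical account; it is only an illustration of what the step does for you, not Ouvvi's actual implementation.

using System.IO;
using System.Net;

class FtpDownloadSketch
{
    static void Main()
    {
        // 'FTP Server accept SSL certificate': trust the server's certificate
        // even if it is not signed by a trusted root authority.
        ServicePointManager.ServerCertificateValidationCallback =
            (sender, certificate, chain, errors) => true;

        // 'FTP Server File Path': URL-style path to the file on the server.
        FtpWebRequest request = (FtpWebRequest)WebRequest.Create("ftp://localhost/data.xml");
        request.Method = WebRequestMethods.Ftp.DownloadFile;
        request.UsePassive = true;   // 'FTP Server uses Passive Mode'
        request.EnableSsl = true;    // 'FTP Server uses SSL'
        request.Timeout = 30000;     // 00:00:30, expressed in milliseconds
        request.Credentials = new NetworkCredential("username", "password"); // hypothetical account

        // 'Destination Filename': where to write the downloaded file.
        using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
        using (Stream ftpStream = response.GetResponseStream())
        using (FileStream file = File.Create(@"c:\temp\medals.xml"))
        {
            ftpStream.CopyTo(file);
        }
    }
}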

 

Add a new ‘FTP Upload’ Step

We have assumed that there is a corresponding file called medals.csv on the local file system that we would like to upload to the FTP server.

 

Add a ‘Ftp File Upload’ step and name it ‘Upload File via FTP (s)’ and then configure it.

 

Source Filename

This is the name and path of the file to be uploaded to the FTP server, e.g. c:\temp\local\medals.csv

 

FTP Server Path 

This is the URL-style path of the location you want the file uploaded to, e.g. ftp://localhost/

 

FTP Server uses Passive Mode

This allows the FTP server to nominate the port used for the data transfer, which is usually friendlier to firewalls on the client side. This is normally enabled.

 

FTP Server uses SSL

If the server is configured to transfer files and authentication details over SSL then you will need to enable this setting. This is normally enabled.

 

FTP Server accept SSL certificate

It is possible (and likely) that the certificate used to secure the FTP server is not signed by a trusted root authority or is not trusted by your own machine. In this case you can indicate that it is OK to accept this SSL certificate. This is optionally enabled.

 

Delete Local File after Upload

After uploading you can choose to delete the source file. In this example it is disabled.

 

Username 

Set the username used to authenticate with the FTP server.

 

Password

Set the corresponding password to be used with the Username.

 

Timeout

This is how long we will wait for a response from the FTP server before giving up, expressed as hh:mm:ss. 30 seconds (00:00:30) is normally reasonable.
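
The upload settings map onto the same .NET FTP client API. A minimal sketch of the equivalent upload, again using the example paths and a hypothetical account, and again only an illustration rather than Ouvvi's actual implementation:

using System.IO;
using System.Net;

class FtpUploadSketch
{
    static void Main()
    {
        // 'FTP Server Path' plus the name of the file to create on the server.
        FtpWebRequest request = (FtpWebRequest)WebRequest.Create("ftp://localhost/medals.csv");
        request.Method = WebRequestMethods.Ftp.UploadFile;
        request.UsePassive = true;   // 'FTP Server uses Passive Mode'
        request.EnableSsl = true;    // 'FTP Server uses SSL'
        request.Timeout = 30000;     // 00:00:30, expressed in milliseconds
        request.Credentials = new NetworkCredential("username", "password"); // hypothetical account

        // 'Source Filename': the local file to upload.
        string sourceFile = @"c:\temp\local\medals.csv";
        using (Stream requestStream = request.GetRequestStream())
        using (FileStream file = File.OpenRead(sourceFile))
        {
            file.CopyTo(requestStream);
        }

        using (request.GetResponse()) { } // complete the transfer

        // 'Delete Local File after Upload' is disabled in this walkthrough:
        // File.Delete(sourceFile);
    }
}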

 

Once configured and saved, the step should look like this:

 

[Screenshot: completed FTP Upload step configuration]

 

The project can now download and upload files via FTP, with or without SSL transport, whenever it is run.

 

Upload and Download via FTP


21 April 2014

Speeding up small updates to large data using Incremental synchronisation

When we are integrating two different systems we often want to use Simego DS3 to insert or update only a small portion of the destination's data. For example we might want to synchronise 100 client records from a source file to our Dynamics CRM environment containing thousands of records.

 

Using DS3's normal compare mode, it would download all of the destination records, compare them with the source and present the results.

 

[Screenshot: full compare results]

 

In cases like this we have no interest in the 24,944 records in the destination that are not in the source. We are only interested in the 99 updates, yet we have spent 14 seconds reading the entire set of destination records. If we scale the problem up to millions of destination records, the cost becomes even more significant.

 

In cases like these we can instruct DS3 to reconcile A to B incrementally. This means DS3 will only load from B the records found in A. It can identify records that are missing from B and records that have changed. It will not detect records to delete, but we are not interested in deletion in this scenario. DS3 uses the primary keys in the source to request just those records, in parallel, from the target. This means that you may have only one primary key column for this to work.
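
To make the idea concrete, here is an illustrative sketch of incremental reconciliation. This is conceptual pseudocode only, not DS3's actual implementation, and the loader methods are hypothetical stand-ins for the source file and the CRM target.

using System.Collections.Generic;
using System.Linq;

// Conceptual sketch only; not DS3's actual implementation.
class IncrementalReconcileSketch
{
    class Client { public int Id; public string Name; }

    // Hypothetical loaders standing in for the source file and the CRM target.
    static IList<Client> LoadSource() { return new List<Client>(); }
    static Client LoadTargetByKey(int id) { return null; }

    static void Reconcile()
    {
        IList<Client> source = LoadSource(); // e.g. the 100 client records

        // Request only the source keys from the target, in parallel,
        // instead of reading all ~25,000 destination records.
        var pairs = source.AsParallel()
                          .Select(s => new { Source = s, Target = LoadTargetByKey(s.Id) })
                          .ToList();

        var inserts = pairs.Where(p => p.Target == null);                                   // missing in B
        var updates = pairs.Where(p => p.Target != null && p.Target.Name != p.Source.Name); // changed
        // Deletes cannot be detected because B was never fully read.
    }
}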

 

To turn on Incremental mode for a project, navigate to the File menu then the project properties. Change the SyncOption from AtoB to AtoBIncremental.

 

[Screenshot: project properties with SyncOption set to AtoBIncremental]

 

As you can see, using the AtoBIncremental setting has reduced the load time from 14.147 seconds to 1.418 seconds and presented us with only the updates rather than the deletes.

 

[Screenshot: incremental compare results]


16 April 2014

Triggering a Synchronisation Project from Dynamics CRM Online

There are many ways to integrate data between systems. What is often forgotten is that we only want to process these imports or exports when something has changed.

 

We often just run the synchronisation or integration at a specific time or periodically.

 

For example we may want to identify when a contact has been created or updated within Dynamics CRM Online, and have this information synchronised to a local SQL table for reporting purposes.

 

We currently have several choices.

 

Synchronise on a regular basis. We can just reconcile and then synchronise at a regular interval between the Dynamics CRM contact list and our local database contacts table. This technique requires that each time we connect to Dynamics CRM, we authenticate, then query the Contacts entity to evaluate whether there have been any changes. This obviously increases the workload on Dynamics CRM Online, and potentially 95% of the time there will be no data changes and therefore no synchronisation required.

 

Poll continuously, looking to see if there are any new records or records that have changed. This technique requires that each time we connect to Dynamics CRM, we authenticate, then query the Contacts entity to evaluate whether there have been any changes; if there are, we request a full synchronisation. This method is slightly less onerous than just synchronising regularly, but is still impactful on the Dynamics CRM Online workload.

 

Send a message via the Azure Service Bus. This seems ideal, as Microsoft has given us some infrastructure to pass messages through. Unfortunately, we would have to go through some significant authentication and authorisation steps. In addition we would want something flexible enough to attach to any change, which we would have to program ourselves. This would be technically challenging and expensive to develop.

 

Create a Plug-in or Custom Workflow. We can use the built-in Dynamics CRM extensibility to create a plug-in or, better, a workflow step that notifies the synchronisation engine that an event has occurred and it should begin synchronisation. The challenge in this scenario is that we would need to create some form of authentication to allow us to tell the on-premise system that there have been changes. In addition we would need to configure some security path from the internet-based CRM Online to the on-premise synchronisation engine, which may be behind a firewall.

 

Design

What we really need is a solution that requires no authentication, needs no development and is easy to configure:

 

1. Create a workflow component we can call from anywhere within Dynamics CRM that can notify an external system when something has changed.

 

2. Create an internet notification store that can be told when something has changed and can be queried about those changes. No identifying data is stored and no authentication is required, because we don't want to store security credentials in the Dynamics CRM environment or tunnel through internal firewalls.

 

3. Create a polling system that can monitor the internet based store for changes and trigger processes if there has been a change.

 

The Dynamics CRM Workflow Component

You can download it from here (including source code).

 

http://www.simego.com/downloads/Simego.CRM.Workflow.zip

 

Register the component the same as any other Dynamics CRM workflow/plug-in (Simego.CRM.Workflow.dll).

 

Create a process and call the workflow component

 

[Screenshot: CRM process calling the workflow component]

 

Configure the workflow step to use a made-up GUID. I chose the one below, but you can use any GUID; just remember it, as we have to tell the monitoring system to watch for it.

 

a8243db5-fbf7-4637-881e-d635ca99269c

 

 

[Screenshot: workflow step configured with the GUID]

 

Save and Close then Activate the workflow.

 

The ‘Internet Notification Store’

Using the Simego Online Last Changed Service

Simego has developed a solution for exactly this purpose. It can be used by any system that can call a URL. If you want to read about it in detail, the help is here:

 

http://www.simego.com/Help/Online/Last-Change-Service

 

Every time a Contact changes, the workflow will call this URL:

 

https://online.simego.com/Change/Update/a8243db5-fbf7-4637-881e-d635ca99269c

 

and when that URL is called it will update the ‘lastchanged’ date-time associated with the GUID.

 

This allows any system to query the URL and find out if something has changed since the last time it looked. This is unsecured but contains no identifying information and scales incredibly well.

 

To query the last change we just call this URL:

 

https://online.simego.com/Change/a8243db5-fbf7-4637-881e-d635ca99269c
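
Anything that can issue an HTTP GET can take part in this scheme. As a rough C# sketch, the two calls mirror the two URLs above; the exact shape of the response body is an assumption here, beyond the 'lastchange' value used by the Json Trigger later in this post.

using System.Net;

// Rough sketch of calling the Last Changed Service from any .NET client.
class LastChangeSketch
{
    const string ChangeId = "a8243db5-fbf7-4637-881e-d635ca99269c";

    // What the CRM workflow component effectively does on every change.
    static void NotifyChange()
    {
        using (var client = new WebClient())
            client.DownloadString("https://online.simego.com/Change/Update/" + ChangeId);
    }

    // Poll this from anywhere to find out when something last changed.
    static string QueryLastChange()
    {
        using (var client = new WebClient())
            return client.DownloadString("https://online.simego.com/Change/" + ChangeId);
    }
}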

 

The ‘Internet Notification Store’ Polling

 

Simego DS3 Automation Server and Simego Online both have the ability to poll the ‘Last Changed Service’ and trigger associated projects using a Json Trigger.

 

Attaching a Trigger using the Json Trigger in the Automation Server

 

Open the Automation Server (Ouvvi) and navigate to the Triggers page. Choose Add Trigger and select the Json Trigger.

 

[Screenshot: adding the Json Trigger]

 

Configure the trigger with your service URL:

 

https://online.simego.com/Change/a8243db5-fbf7-4637-881e-d635ca99269c

 

Set the Json Expression to ‘lastchange’.

 

[Screenshot: Json Trigger configuration]

 

We can now create a synchronisation project to synchronise data from Dynamics CRM Online to our on-premise database.

 

Every time there is a change in the contact entity online it will be synchronised locally.


15 April 2014

Web API Ouvvi Step Handler

We have migrated the really handy Web API Step Handler from Simego Online into DS3 Automation Server (Ouvvi). This allows you to set up calls to web applications that have documented APIs, such as other notification systems.

 

For example, configuring the Web API Step to send messages to your mobile devices via the Pushover system is as easy as setting up the request as shown below.

 

[Screenshot: Web API step configured to call Pushover]
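
The step is essentially composing a plain HTTP request for you. For reference, a roughly equivalent request in C#, based on Pushover's documented messages endpoint, might look like this; the token and user key values are hypothetical placeholders.

using System.Collections.Specialized;
using System.Net;

// Rough equivalent of the Web API step's request to Pushover.
class PushoverSketch
{
    static void Send(string message)
    {
        using (var client = new WebClient())
        {
            var form = new NameValueCollection
            {
                { "token",   "YOUR_APP_TOKEN" }, // hypothetical application token
                { "user",    "YOUR_USER_KEY" },  // hypothetical user key
                { "message", message }
            };
            client.UploadValues("https://api.pushover.net/1/messages.json", form);
        }
    }
}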

 

 

You can also use this to test your Web APIs and ensure that they are working as intended, or perhaps use it as a queue relay broker service to integrate application messages.

 

The Web API step allows you to control the request details and validate the response, so you can ensure the response is what you expected.

 

[Screenshot: Web API step response validation]


15 April 2014

Ouvvi 32/64 bit Data Sync Options

We have a new release of both Data Sync and DS3 Automation Server (Ouvvi) that enables you to run both 64-bit and 32-bit processes from a single installation.

 

The 64-bit version of Data Sync now ships with 32-bit components as well, and from Ouvvi you can choose how to run the Data Sync task. This can be In-Process (the default, which is how it works today), or one of two new options: External 32-Bit Process and External 64-Bit Process.

 

These new options run the Data Sync task in a new process that is either 32-bit or 64-bit. This enables you to mix sync tasks when you need to access legacy 32-bit ODBC drivers whilst maintaining a 64-bit installation.

 

To set the execution mode you simply select the mode of operation on the Data Sync Task in Ouvvi.

 

[Screenshot: selecting the execution mode on the Data Sync task]


11 April 2014

Connection Library Data Preview

We’ve been busy extending the Connection Library features of Data Sync to really help you discover your data whilst building out your migration or integration.

 

You can now preview data right from the context menu in the Connection Library. This works against any data object in the Connection Library and returns approximately the first 1,000 items.

 

For Dynamics CRM we also now support OptionSets directly, which can be used for drag & drop Lookups or on their own.

 

[Screenshot: Connection Library data preview]


24 March 2014

Using FTP and Simego Automation Server to Import and Export files

FTP continues to be a heavily used internet protocol despite many innovative products like Dropbox, SkyDrive and Box.net. Unfortunately, due to the considerable complexity of implementing FTP download and upload, developing your own solutions can be painful.

 

We have worked with our clients to provide the easiest and most consistently successful method of downloading and uploading files over FTP, using the power and auditability of Simego Automation Server and ‘Wget’, a well-known and reliable HTTP, HTTPS and FTP command-line application.

 

1. Download the Wget binaries and place them in a location on the Automation Server (Ouvvi), e.g. C:\Temp\FTPClient.

 

https://www.gnu.org/software/wget/

 

Download the binaries here: http://gnuwin32.sourceforge.net/packages/wget.htm. In the bin directory is wget.exe, which is the file that needs to be copied.

 

2. Create 2 new 'User Settings' in Ouvvi
Settings -> User Settings -> Add Setting

[Screenshot: User Settings configuration]


3. Create a new 'External Program' step in a project to call the wget command-line tool and download the files.

[Screenshot: External Program step configuration]
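
As an illustration only, the command line for the step might look something like the line below. The server name and path are hypothetical placeholders, and in practice the username and password would come from the User Settings created in step 2 rather than being typed inline.

C:\Temp\FTPClient\wget.exe --ftp-user=myuser --ftp-password=mypassword ftp://ftp.example.com/pub/i386.exe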

 

Running the above will download the i386.exe file to the working directory.

 

You can browse all of the functionality of wget at https://www.gnu.org/software/wget/manual/, which includes an excellent way to synchronise FTP and local directories.


17 March 2014

Simple Application Integration with Simego Online

This is a quick example of how we used the Simego Online platform to quickly provide some application integration between our systems without coding (implementation time: 30 minutes).

 

We started out like a lot of people: we had a website and a LOB application (Helpdesk). Both systems were independent, each with its own database and list of users, all hosted on Windows Azure. This was fine for a while, but it got to the point where it would be nice to at least link the user profiles.

 

[Diagram: independent website and HelpDesk systems]

 

Simego Online and Data Sync to the rescue: by setting up a simple Simego Online project to run a Data Sync project whenever users update their profile on our main website, the profiles in our HelpDesk are kept in sync.

 

We used the SQL (Azure) Trigger in Simego Online to run the project whenever a change was made, so that changes were reflected in near real time.

 

[Screenshot: SQL (Azure) Trigger configuration]

 

The benefit here is that this is all managed in the cloud on the Simego Online platform, so we do not need to manage any servers or databases on premise, and the whole process just works!

 

We could now easily add another project to sync user profile data into Dynamics CRM or other systems, all automatically.


17 March 2014

Application Asynchronous Integration

By leveraging the Simego Online platform you can have Simego Online call your application back to create asynchronous tasks. You might want to use this capability for long-running tasks where you do not want to build background services, but would rather just create simple HTTP services that are called.

 

For example, simple tasks like sending email or querying mailboxes can take some time to execute and could run outside of the user context.

 

We use this service to run the background processing of our HelpDesk without the need to run a separate server. Our HelpDesk runs on Windows Azure Websites and all the background processing is handled via callbacks from Simego Online.

 

One of our process flows is as follows:

New Email –> Call HelpDesk to Process Inbox –> For Each Message Set up Callback to get Message and Process

 

Then we move on to:

Get Message –> Add to HelpDesk –> Set up Callback to Email User

 

This means we can create very small ASP.NET MVC Controller Actions that execute a simple task and tie them together via Simego Online callbacks. All this then runs in the cloud as if by magic, and we can view every execution from Simego Online and even re-submit failed tasks.

 

Example dashboard showing HelpDesk operations being executed against our HelpDesk Website.

 

[Screenshot: dashboard of HelpDesk operations]

 

So how do you set this up?

 

Simego Online has a very useful Step Handler called Web API. This allows you to execute web actions and validate their responses. It could be used for Web API endpoint testing or anything else where you want to call an API endpoint.

 

You can send GET/POST/PUT/DELETE requests, add headers and a body, use Basic Authentication, and validate the response.

 

[Screenshot: Web API step configuration]

So in our HelpDesk one of our useful projects is a simple relay service, where we send in a message that gets relayed back to us some time later for processing. Because we don't want to get overloaded, this all happens in a nice orderly fashion, one at a time. We also have the ability to configure retry and failure conditions, and we could set up an Email Task to let us know if something fails.

 

We’re also using a little-known feature of Simego Online that allows us to expand variables at runtime. In the configuration of the step we set the URL to {{ReturnURL}}, which is actually a value passed to Simego Online by the client application. We also use this in the body, so when we get called back we receive an ID value that means something to us. Finally, we validate that the response returns a 200 OK; otherwise we retry 3 times before putting the message on the failure queue.

 

[Screenshot: relay step configuration using {{ReturnURL}}]

 

To initiate a Task to run we need to POST our message data to Simego Online to start the Project.

 

The URL looks like this:

https://online.simego.com/<account>/<workstream>/ProjectApi/Start/<ProjectName>

 

where you replace account, workstream and project name with your own values.

 

The body you POST is a JSON value with the details you want to use in the project. This URL is authenticated with Basic auth over SSL, so you should create an API user account under Settings –> User Profiles and set the Permission Level to User so that this user can start projects.

 

We’re using ReturnURL and ID in our step, so our body is:

 

{ "ReturnURL": "MyURLGoesHere", "ID": 1234 }

 

Then, when Simego Online executes the project, we get called back via a POST to our URL with the ID value, and it's up to us to simply execute the task.
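
The receiving end can be tiny. A minimal sketch of such a controller action follows; the controller name, action name and task logic here are hypothetical.

using System.Web.Mvc;

// Minimal sketch of a callback endpoint; names and task logic are hypothetical.
public class CallbackController : Controller
{
    [HttpPost]
    public ActionResult ProcessMessage(int id)
    {
        // Execute the one small task this action is responsible for,
        // e.g. fetch message 'id' from the inbox and add it to the HelpDesk.

        // Returning 200 OK tells Simego Online the task succeeded;
        // anything else will cause the configured retries to kick in.
        return new HttpStatusCodeResult(200);
    }
}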

 

We set up a simple helper function to POST the message to Simego Online, like this:

 

using System.IO;
using System.Net;
using System.Text;

public string PostJsonRequest(string url, string jsonString)
{
    HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create(url);
    webRequest.Method = "POST";
    webRequest.ContentType = "application/json";
    webRequest.Accept = "application/json";
    webRequest.ServicePoint.Expect100Continue = false;

    // Basic auth credentials for the API user account created above.
    webRequest.Credentials = new NetworkCredential(Settings.GetSetting<string>(SettingKey.QueueApiUsername), "");

    if (!string.IsNullOrEmpty(jsonString))
    {
        // Write the JSON body to the request stream.
        byte[] data = Encoding.UTF8.GetBytes(jsonString);

        webRequest.ContentLength = data.Length;

        using (Stream requestStream = webRequest.GetRequestStream())
        {
            requestStream.Write(data, 0, data.Length);
        }
    }

    // Return the response body as a string.
    using (HttpWebResponse response = (HttpWebResponse)webRequest.GetResponse())
    using (StreamReader sr = new StreamReader(response.GetResponseStream()))
    {
        return sr.ReadToEnd();
    }
}
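
Putting it together, starting the relay project from our application is then a single call, with account, workstream and project name replaced by your own values:

string result = PostJsonRequest(
    "https://online.simego.com/<account>/<workstream>/ProjectApi/Start/<ProjectName>",
    "{ \"ReturnURL\": \"MyURLGoesHere\", \"ID\": 1234 }");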

 

This system allows us to easily deploy new services that execute background tasks via simple websites, which otherwise would have become complicated and required multiple services.

 

You can of course simply have this called on a schedule, like every 5 minutes, to run some background tasks.


20 February 2014

Announcing Intercom Data Provider for DS3

We have a new Data Provider available for Intercom which provides full read/write capability, allowing you to import, export and synchronise data between many different systems and Intercom.

 

For example with this you could set up an Intercom to Dynamics CRM Lead Capture integration, or use Data Sync to decorate your Intercom data with data from your own internal databases.

 

[Screenshot: Intercom data provider in Data Sync]

 


 

The full source code for the Intercom provider is available in the Data Sync installation directory, so you can study it to build similar REST API providers or adapt it to your own needs.