Team Blog

Keep up to date with everything going on at Simego.

Data Sync Projects vs Traditional Blocky Workflows

Comparison of Data Synchronisation Studio Projects against other ETL Tools

5 July 2019

In this blog we will cover the difference between Data Synchronisation Studio and other Data Integration products that take a blocky, single-step-at-a-time workflow approach.


For this example, let's take a simple scenario: you receive a file daily into a mailbox, and you want to save that file to a folder while renaming it with today's date.

Typical Workflow

Below is an example of the steps you might take to complete this process.

Email Attachment Process Flow

The Data Sync Approach

In Data Sync we model our Source Data as we want it represented in the Target, doing everything at once and in batch. Data Sync then works out what it needs to do to make the target the same as the source. We then run multiple Data Sync projects in sequence to build more complex processes.

So in this example we wrap all of this into a single Data Sync project: we connect to an Exchange Mailbox, get today's mail with a specific subject, then map the attachment to a folder and rename the file before it is written.

Filter Email for Today and Subject

This is done via Project Automation, which updates the project configuration at runtime to return only email messages received today. To filter for messages whose subject starts with NEW LEADS, we enter a value of NEW LEADS* into the FilterBySubject property of the source.

DataSourceA.FilterByReceivedDateTime = DateTime.Today.ToString("yyyy-MM-dd HH:mm:ss");

Extract Attachment and Save to Folder

This is done by mapping the Filename from the Attachment to the Target Folder; at the same time we use a Calculated Column to rename the file using today's date.

FORMAT("LEADS_{0}.csv", DATESTR(TODAY(), "yyyyMMdd"))
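For illustration, the filename the calculated column produces can be reproduced outside Data Sync in a few lines; this is a small Python sketch (the date value is just an example), not part of the Data Sync project itself:

```python
from datetime import date

def leads_filename(d: date) -> str:
    # Mirrors FORMAT("LEADS_{0}.csv", DATESTR(TODAY(), "yyyyMMdd"))
    return f"LEADS_{d:%Y%m%d}.csv"

# For 5 July 2019 this yields "LEADS_20190705.csv"
print(leads_filename(date(2019, 7, 5)))
```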

Data Sync Project in Designer

Data Sync Project

Next Process: Import the File

Next we might want to import the data from the file into our Target System. The target could be almost anything, but in general the process is the same: we process each row one at a time, looking up against the target to decide whether we need to add the row because it is new, or update an existing row with new data. This can get really complicated when our source data doesn't contain the required identifier in the target system. It can also be really slow to look up each record one at a time.

Email Attachment Process Flow

Fortunately, in Data Sync all this is really easy: we just need to connect the source to our CSV file. We can use a wildcard for the path if we do not know the exact filename, e.g. C:\Temp\Leads_Drop\LEADS_*.csv.
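The wildcard works the same way as ordinary shell-style globbing; a quick Python check of the pattern against some illustrative filenames:

```python
from fnmatch import fnmatch

# The source file pattern from the example; the filenames are illustrative.
pattern = "LEADS_*.csv"

print(fnmatch("LEADS_20190705.csv", pattern))   # True - matches the pattern
print(fnmatch("ORDERS_20190705.csv", pattern))  # False - different prefix
```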

We then connect the Target and define the mapping between the two data sources. We need a Key column to identify the records; this is used by Data Sync to calculate whether a record is new or is an update.

Internally Data Sync tracks the target identifiers so we can always update a record even when the source data doesn't include this value.
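Conceptually, the compare step is a key-based diff. This is a minimal Python sketch of the idea only (the row data is made up; Data Sync's actual engine does far more, including tracking target identifiers):

```python
def plan_changes(source, target, key):
    """Decide per-row actions the way a key-based compare does:
    rows whose key is missing from the target are adds,
    rows whose key exists but whose data differs are updates."""
    target_by_key = {row[key]: row for row in target}
    adds, updates = [], []
    for row in source:
        existing = target_by_key.get(row[key])
        if existing is None:
            adds.append(row)
        elif existing != row:
            updates.append(row)
    return adds, updates

source = [{"Email": "a@example.com", "Name": "Ann"},
          {"Email": "b@example.com", "Name": "Bob"}]
target = [{"Email": "a@example.com", "Name": "Anne"}]

adds, updates = plan_changes(source, target, "Email")
print(len(adds), len(updates))  # one add (b@...), one update (a@...)
```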

Data Sync Project

Bringing it all together

Finally, we need a way to run these Data Sync projects in sequence, configure a schedule or real-time trigger, and monitor the execution once the process is in production.


Ouvvi is our solution for scheduling and monitoring the individual Data Sync projects. Within Ouvvi you define a project and add steps to execute in a sequence using flow control to manage which steps run based on a running status.

For our example here we add three steps.

  1. Download the CSV file from email.
  2. Import the CSV file to a SQL table.
  3. On failure, send an email report.

We can document each of these steps in Ouvvi to ensure we have some visibility on the defined process.

Once the solution is configured and tested we add a Trigger to run this at Start Of Day to ensure it runs automatically for us.

Projects configured in Ouvvi

Ouvvi Project

This is just a small fraction of what is possible with Data Sync+Ouvvi. Hopefully this helps you understand the difference between Data Sync+Ouvvi and other products and you can see how easy it is to use Data Sync+Ouvvi for your Data Integration processes.

Data Sync Avangate/2Checkout Connector

Avangate/2Checkout Connector for Subscriptions and Promotions

2 July 2019

We have created a new Data Sync connector for the Avangate/2Checkout REST V4 API.

This new Connector can read Subscriptions and Promotions from the API. With this new connector you can easily sync your subscription data in 2Checkout with your CRM System or SQL Database.

Avangate/2Checkout API docs are here

Avangate-2Checkout Connector

Write operations are manual via Project Automation Events and calling the REST API directly.

For example, if you wanted to synchronise promotion discount rates with your own internal system, you might use the following code in the Project Automation BeforeUpdateItem event to retrieve the current promotion JSON document, update the rate, and then send the document back.

public override void BeforeUpdateItem(object sender, DataCompareItemInvariant item, object identity)
{
	var mapping = new DataSchemaMapping(SchemaMap, DataSchemaSide.DataSourceB);
	var toUpdate = item.ToUpdateItemDictionary(mapping);

	var helper = DataSourceB.GetWebRequestHelper();
	var info = DataSourceB.GetDatasourceInfo();
	var url = info.GetAvangateItemEndpointUrl((string)identity);

	// Get the Existing Promotion Details
	var promotion = helper.GetRequestAsJson(url);

	// Update the Discount
	promotion["Discount"]["Value"] = toUpdate["DiscountValue"].ToString();

	// Send it back
	helper.PutRequestAsJson(promotion, url);
}

Data Sync Mailchimp Connector

Import, Export Mailchimp Audience data and set Tags

1 July 2019

We have created a new connector for Data Synchronisation Studio to work with Mailchimp data.

With this connector you can Import and Export Mailchimp Audience Contacts. You can also update existing Contacts with any changes you may have in your database. This connector also supports Incremental Sync mode when you use the ID column as the Key column.

Mailchimp Data Sync Connector

When you prepare your data for the Mailchimp import it's a good idea to lower-case the email address; this ensures that we can also calculate the MD5/id value that Mailchimp uses as the record identifier.

It's easy to lower-case the email address with a Calculated Column using the LOWER(Email) function.

Mailchimp Lower Email Address

We can also generate the same MD5 hash from the email address that Mailchimp uses with the MD5HASH(LOWER(Email)) function.
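Because the id is simply the MD5 hash of the lower-cased address, MD5HASH(LOWER(Email)) matches what Mailchimp computes. As an outside-Data-Sync illustration (the address is made up), the same value in Python:

```python
import hashlib

def mailchimp_id(email: str) -> str:
    # Same idea as MD5HASH(LOWER(Email)): hash the lower-cased address
    return hashlib.md5(email.lower().encode("utf-8")).hexdigest()

# Mixed-case input produces the same id as the lower-cased address
print(mailchimp_id("Someone@Example.com") == mailchimp_id("someone@example.com"))
```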

Mailchimp MD5 Hash Email Address

To configure your project map your source columns to the Mailchimp columns and use the results of your functions to map Email and ID columns.

Mailchimp Data Sync Project

You can specify the contact status during the import by providing a text value for the status, e.g. subscribed, transactional, etc.

It's also possible to add or remove tags from contacts; this is done by specifying an array of Tags to apply to the contact.

If your source data holds your tags in separate columns like Tag1, Tag2, etc., it's quite easy to combine these into an array with a helper function.

If you add a function to the Dynamic Columns class, like the ARRAY function below, you can then call it from Calculated Columns with the function ARRAY(Tag1, Tag2).

public string [] ARRAY(params string [] values)
{
	return values.Where(p => !string.IsNullOrEmpty(p)).ToArray();
}

You can also extend the import further in Project Automation by intercepting the Item Events and calling the Mailchimp REST API directly with the help of the Data Sync HttpWebRequest helper.

Deploying Ouvvi + Data Sync on Windows 10

Using Windows 10 Pro rather than Windows Server

7 June 2019

Ouvvi is typically deployed on a dedicated Windows Server Virtual Machine. This can have significant Windows Server License implications.

Ouvvi and Data Sync can be run on Windows 10 Pro with IIS and SQL Server Express Edition just as well as it would on a Windows Server OS. You do need the Pro SKU of Windows 10 as Ouvvi requires IIS.

Dell offers a tiny workstation, the OptiPlex 3060, which can be ordered in several configurations. The 6-core i5-8500T with 8GB RAM and a 256GB M.2 SSD is an interesting configuration and would make a great little Ouvvi + Data Sync Data Integration server (we have 6 of them!). This specification is capable of running some very significant data integration processes.

DELL Optiplex 3060

If you were to get a Windows Server VM of similar specification (4 cores and 8GB RAM) from a well-known cloud service vendor, you would be looking at ~$285/month.

AWS c5.xlarge costs

I would imagine this little PC would still be significantly faster than any of the cloud VM offerings. Additionally, amortised over 3 years, this little PC works out at ~$20/month.
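The arithmetic behind the comparison is simple; a quick sketch, assuming a one-off purchase price of roughly $720 (an assumption; the post only states the ~$20/month and ~$285/month figures):

```python
# Rough cost comparison; the $720 purchase price is an assumption,
# the ~$285/month cloud figure comes from the post above.
pc_price = 720          # assumed one-off cost of the OptiPlex 3060 (USD)
months = 36             # 3-year amortisation
cloud_monthly = 285     # approximate cloud VM cost per month (USD)

pc_monthly = pc_price / months
print(round(pc_monthly))                  # ~20 per month
print(round(cloud_monthly / pc_monthly))  # the cloud VM costs ~14x more
```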

You can Domain Join and enable Remote Desktop on this little Windows 10 Pro machine and just leave it in a cupboard somewhere and pretend it’s a “real server”.

Windows 10 Pro only allows a single Remote Desktop session, so you're limited to one user logging on at a time. However, you can make Ouvvi accessible over your LAN by enabling the Ouvvi web TCP port in Windows Firewall. Multiple network users can then view the Dashboards and keep an eye on what's running via the Web Interface.

Ouvvi runs as a service, so this machine does not need to be logged in, just switched on and running.

Ouvvi Dashboard View

So there you go: you can deploy Ouvvi + Data Sync on Windows 10 Pro, and it might just be a better option for you.

Email Address Validation via ZeroBounce

Using Data Sync Dynamic Columns with ZeroBounce API

6 June 2019

We found this great SaaS service online to clean up email addresses: basically, you can ask ZeroBounce whether an email address is valid or not, and then use that information to keep your database clean.

This is a simple HTTP JSON service where you send an email address and it returns a JSON response with the result. Armed with this information, it is really easy to build an automated integration with Data Sync to continuously check the status of email addresses in your database.

For this example I created a simple SQL table to hold some email addresses and the response from ZeroBounce. As you're charged based on API credits, it's a good idea to only call the API when you need to, so in this example we store a last-checked value and only call the API again if 30 days have passed since the last check.

Note: You need to complete the sync to write the values back to the SQL table; if you keep comparing without syncing, you will keep calling the API.
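The 30-day recheck rule can be sketched on its own; a minimal Python illustration (the helper name and dates are made up, the real check lives in the Dynamic Columns code later in this post):

```python
from datetime import date, timedelta

def needs_check(last_checked, today):
    """Return True when we should spend an API credit:
    never checked, or last checked more than 30 days ago."""
    return last_checked is None or today - last_checked > timedelta(days=30)

today = date(2019, 6, 6)
print(needs_check(None, today))              # True  (never checked)
print(needs_check(date(2019, 1, 1), today))  # True  (stale, >30 days)
print(needs_check(date(2019, 5, 20), today)) # False (checked recently)
```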

SQL Table

Here is a T-SQL definition for the SQL Table I used in this example.

CREATE TABLE [EmailValidation] (
	[EmailAddress] nvarchar(255) NOT NULL,
	[LastChecked] datetime NULL,
	[Status] nvarchar(100) NULL,
	[SubStatus] nvarchar(100) NULL,
	[FreeEmail] bit NULL
);

You then need to fill the EmailAddress column with email addresses. You could use a Data Sync project to import these from a CSV file, another SQL table, or any other data source.

Email Validation Table

Data Sync Project

We now need to configure the Data Sync project. As this project updates the same data source (the SQL table) with information from ZeroBounce, we map the SQL table as both Source and Target, then use Dynamic Columns in a lookup-style function to call the ZeroBounce API and get the result for each email address in the table.

To configure this Data Sync project:

  1. Load the SQL Table as both the Source and Target.
  2. Copy the Dynamic Columns code, add your ZeroBounce API key, and check that it compiles (Build button).
  3. Map the columns as shown in the screenshot below so that the data from ZeroBounce is mapped to the Target SQL Table.
  4. Compare and Sync.

Data Sync Project

Data Sync Project Code

Dynamic Columns Code

partial class DataSourceRowOverride : Simego.DataSync.DynamicColumns.DataSourceRowInternal
{
    private const string API_KEY = "YOUR_API_KEY";
    // ZeroBounce v2 validate endpoint; {0} = API key, {1} = email address
    private const string API_URL = "https://api.zerobounce.net/v2/validate?api_key={0}&email={1}&ip_address=";

    private HttpWebRequestHelper helper = new HttpWebRequestHelper();

    public DateTime? Fx_LastChecked { get; set; }
    public string Fx_Status { get; set; }
    public string Fx_SubStatus { get; set; }
    public bool? Fx_FreeEmail { get; set; }

    public override bool BeginRow()
    {
	    Fx_LastChecked = LastChecked;
	    Fx_Status = Status;
	    Fx_SubStatus = SubStatus;
	    Fx_FreeEmail = FreeEmail;

	    // Only call the API if this address has never been checked
	    // or the last check was more than 30 days ago.
	    if(!Fx_LastChecked.HasValue || DateTime.Today.AddDays(-30) > Fx_LastChecked.Value)
	    {
		    Fx_LastChecked = DateTime.UtcNow;

		    try
		    {
			    var result = helper.GetRequestAsJson(string.Format(API_URL, API_KEY, EmailAddress));

			    Fx_Status = DataSchemaTypeConverter.ConvertTo<string>(result["status"]);
			    Fx_SubStatus = DataSchemaTypeConverter.ConvertTo<string>(result["sub_status"]);
			    Fx_FreeEmail = DataSchemaTypeConverter.ConvertTo<bool?>(result["free_email"]);
		    }
		    catch(Exception e)
		    {
			    Fx_Status = "Error";
		    }
	    }

        return true;
    }
}

Email Validation Results

After you run the sync you will then see the results of the email validation in the SQL Table. If you add more email addresses to the Table only those new email addresses added will be checked next time.

Email Validation Table Results

Ouvvi DevOps

Automating Ouvvi Solution Deployment

25 April 2019

Requires: Ouvvi Version 4.0.574 or greater.

We have a new Solution packaging feature for Ouvvi. This allows you to package Ouvvi assets into a single deployable package and then deploy to your Ouvvi server.

These Ouvvi solutions include Dashboards, Groups, Projects, Steps, Triggers, Connections and Settings.

There is a new Ouvvi Step Handler that can be used to automatically pick up a solution file and deploy it to your server whenever the solution file changes. For live running servers this will queue the import job, and it will run when the current job queue is clear.

Export Solution from Existing Ouvvi Instance

To export your existing Ouvvi projects to the new solution file format use the new Solution Export feature.

Go to Projects->Export->Solution Export.

Export Projects

Export Solution

And check the items to be included in the export.

Choose Projects to Export

This will export a ZIP Archive file containing the solution assets which you can then extract and modify as necessary.

Solution Files

Import Solution to Ouvvi Instance

To import a solution file make sure that you zip all your solution assets into a ZIP Archive and then go to Projects->Import->Solution Import.

The Import process is a MERGE operation with your Ouvvi instance where items are matched by name. The only items that are deleted are steps within an imported project where they no longer exist.

Important: You must use unique names for your Ouvvi items so that matching by name during the import works correctly.
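As a rough illustration of the merge-by-name semantics described above, here is a Python sketch for a single project's steps (the data shapes are made up; the real import handles all Ouvvi item types):

```python
def merge_project_steps(existing_steps, imported_steps):
    """Merge-by-name, as the solution import does for a project's steps:
    steps are matched on name, imported steps replace or add entries,
    and existing steps absent from the import are the only deletions."""
    imported_names = {s["name"] for s in imported_steps}
    # Steps missing from the imported solution are deleted
    kept = [s for s in existing_steps if s["name"] in imported_names]
    by_name = {s["name"]: s for s in kept}
    by_name.update({s["name"]: s for s in imported_steps})
    return list(by_name.values())

existing = [{"name": "Download CSV", "order": 1}, {"name": "Old Step", "order": 2}]
imported = [{"name": "Download CSV", "order": 1}, {"name": "Import CSV", "order": 2}]

result = merge_project_steps(existing, imported)
print(sorted(s["name"] for s in result))  # "Old Step" is gone, "Import CSV" added
```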

Import Projects

Import Solution

Choose file to Import

Before your items are imported you can review the contents of the solution file and choose which items to import.

Check Items to Import

Import Items

Build Solutions with VS Code

Using VS Code, Visual Studio or another IDE allows you to manage these assets in a Source Control System (Git) and, when ready, package them up and deploy to the Ouvvi Server. The package is simply a ZIP Archive of the solution.

VSCode solution project

Within the IDE you can edit the solution XML Files. Store your assets in a source control system and edit the Data Sync projects with the Data Sync Studio Designer. The Connection Library is redirected to the Library within the Solution.

Automate Solution Import

You can set up an Ouvvi project to automate importing a solution file when it changes. To do this, create a new project and add a step of type "Ouvvi Solution File Import". The step configuration should then point to the location of the solution file.

Solution Import Step Type

Solution Import Step Configuration

You can then create a File Trigger that points to the same file and add this to the deployment project. When a new Solution Zip file is then written the contents of the solution are automatically imported.

Import Solution via API

If you create an Ouvvi Deployment project you can also then call Ouvvi via the API to run the import.

An example PowerShell script to execute an Ouvvi hosted project:

Invoke-RestMethod -Method GET -ContentType application/json -UseDefaultCredentials -Uri http://localhost:2026/api/projects/start/1

Would you like to try Data Sync and Ouvvi for your Data Integration projects?