Team Blog

Keep up to date with everything going on at Simego.

Product Updates October 2020

It's been a while! Here's an update and a few of our future plans

19 October 2020

It's been a while since we last posted on our blog, so we thought we would give a quick update about some of the things that we have been up to and some of the things we have planned.

Website Updates

You may have noticed that our docs are looking a little different. We decided to change their structure so the articles are broken down into more sections, and the sections are displayed in a card view. We have also made the docs readily available from the main navigation menu on the website; hopefully this makes them easier for you to find.

New Docs Layout

You will also notice a search bar within the documentation pages, which will search just the docs so you can locate the articles you need as quickly and easily as possible.

The old help pages have been absorbed into the how to articles or into the user-guide information with updated screen captures and guidance. So those old articles you all found useful do still exist, even though the old links to them will no longer work.

We are always updating the documentation to make sure it is up to date and any new features are covered. However, if there is anything you can't find, let us know and we can either point you in the right direction or tell you how to do it.

Data Sync Changes

AD Connector Updates

The Active Directory connector now supports SSL/TLS connections. To connect using SSL, change the UseSecureSocketsLayer dropdown to True.


We have big updates planned for the AD connector which you can read about further down this article.

Dynamics Connector

We have been working hard to keep on top of the many changes Microsoft is making, one of which is the deprecation of Legacy Authentication (username and password) on new and upgraded Dynamics instances. Because of this change, applications will need to connect using OAuth, so the Dynamics connector now supports OAuth.

We've created updated guidance on how to connect, as you need to create an App within Azure AD to get the Client ID and Client Secret needed to log in. Check out the latest Dynamics connection docs here.

It may sound scary, but we promise you it isn't bad at all! The most difficult thing will be getting hold of your Azure AD & Dynamics administrators to configure this for you if you don't have the permissions.

As always, if they have any questions get them to send us an email and we can help guide them on what to do.

SharePoint Connector

As with the Dynamics connector, we have updated the SharePoint connector so it now supports OAuth. The setup for SharePoint requires you to create an App in SharePoint rather than in Azure AD directly; full details can be found in our documentation here.

The OAuth connector also supports File Upload and Download, so you can continue to use Data Sync the same as you would have when connecting via legacy authentication.

Exchange Connector

It's going to sound like a broken record now... Due to the deprecation of legacy authentication, which was due this year for Exchange but has been postponed till 2021, the Exchange connectors now support OAuth.

As you have come to expect, you can connect to your main mailbox, appointments, contacts and tasks as you need to with your new connection details.

You can find the details on how to connect with OAuth here.

AWS S3 Supports Incremental mode

Another update we almost forgot to mention: our AWS S3 connector now supports Incremental Mode. For large datasets you can switch to Incremental Mode to sync your changes incrementally; you should find performance improves when you only have a few updates to make, as it won't try to load your whole dataset.

Ouvvi Changes

Connector Updates

Like the Data Sync updates, the Ouvvi SharePoint and Dynamics connectors now support OAuth Authentication.

For Dynamics, change the Authentication Type dropdown to OAuth, then enter your Client ID into the Username field, your Client Secret into the Password field, and your domain into the Domain field. The documentation for this will be coming soon.

Dynamics OAuth Connection Ouvvi

For SharePoint, simply complete the OAuth credentials with your details. The full documentation can be found here.

SharePoint OAuth Connection Ouvvi

Ouvvi Authentication using Azure AD

You can now configure Ouvvi to use your Azure AD login. If you want to make use of MFA and avoid having yet another password, you can sign in to Ouvvi with your Microsoft Azure AD details.

We have full documentation explaining the process here, but in short you create an App in Azure AD and then use the details from this App within Ouvvi. Deploying an agent requires a different process than usual, which is also covered in the documentation; we are planning to make agent deployment for this more streamlined in the future.

Import and Export Ouvvi Apps

Ouvvi Apps can now be imported and exported.

If you make use of Ouvvi Apps (we recommend you do, as they're incredibly useful) you can now export them to be imported into your other Ouvvi instances. The same applies if you move servers and want to create a clean install of Ouvvi on the new server.

Go to the Ouvvi Apps page, click Export and check the checkbox next to each of the Apps you want to include in your export. To import go to the Ouvvi Apps page, click Import and locate the exported files.

Import/Export Ouvvi Apps

ZipArchive Provider

To bring a DevOps workflow to your Ouvvi projects, the exported Zip archive can be opened within an IDE so you can make changes and store the updates in your chosen source control system. When you're ready to deploy, simply package the solution and import it back into Ouvvi.

AWS S3 Upload and Download Step Handlers

To keep adding to the value of Ouvvi, we now have built in connectors for S3 Upload and Download. You can include S3 Upload and Download steps within your Ouvvi projects.

We specifically added the S3 steps to help us automate our website deployment.

Ouvvi Online Update

It has been a little while since we first launched Ouvvi Online. We know we didn't go with a big launch party but we did want to see how it would go down with you all first.

So for our progress update: we have a number of customers using Ouvvi Online; some are existing customers who have migrated to the online platform (thank you for your support!) and some are brand new customers (welcome to the family!).

We also make use of Ouvvi Online for all our system processes and we have two major instances of Ouvvi to handle these: Systems and Operations.

Systems handles the rolling deployment of Ouvvi Online updates, the website deployment and the building of Data Sync and Ouvvi releases.

Simego Systems

Operations handles the working parts such as the Helpdesk, order notifications, trial notifications and the GDPR data control.

We have been using Ouvvi Online since February/March 2020 and haven't had any issues (touch wood!). We get reports from Pingdom to let us know if there is any downtime, and when you look at our report you can see 100% uptime!

Pingdom Report

To handle updates to the platform we use a rolling deployment and this has been working brilliantly. One of the many benefits of Ouvvi Online is that we handle the updates to the system so you don't have to. You also get access to the latest features before they become available in the on-premise version of Ouvvi, and some features will only be available in Ouvvi Online.

UPDATE (14-Jan-2021): We have come to the conclusion over the past year that running a data integration platform in a SaaS way is very difficult. Many customers are in highly regulated industries and data protection/privacy requirements limit the ability to run operations on shared infrastructure.

With this in mind we have decided to retire our Ouvvi Online project. Overall it was a great learning experience, and it showed us that we could build it. We will be migrating some select features from the online platform into the normal Ouvvi so that you too can make use of them.

Automated Build System

Using Ouvvi Online we have now automated our build process for Data Sync and Ouvvi. So, in a slightly strange but still ever so cool way, we use Ouvvi to deploy Ouvvi (chicken or egg, anyone?). Now when we make an update to Data Sync and/or Ouvvi and want to deploy it to the beta site, we simply log in to our Systems Ouvvi and start the project. Yes, we could take this further and make it fully automated, so that when a file changes in a folder it automatically kicks off the project, but we like having this bit of control.

Automated Build

Previously our website deployment was incredibly manual, so why not automate it? In our process clean-up we thought: "Couldn't we use Ouvvi to do this?"

So we now have three projects that manage the deployment. One to upload to S3, one to deploy to the staging site for our testing, and another to deploy to the live site.

Website Deployment

Going Forward

To wrap it all up, let's have a quick look at what you can expect from us in the near future.

Active Directory Connector Updates

We want to extend the Active Directory connector so that it can do more; however, it is most likely going to be a brand new connector so that we can do things behind the scenes a little differently to make the changes possible. Some of the features we want to add are: automatically mapping the Manager column without having to define the column, support for connecting to multiple OUs, and support for lookups into other OUs and groups.

Are there any other features you think would be useful to have added?

Now is your chance to shape the AD connector to potentially work how you want it to, let us know any thoughts you have and we can look at adding them in.

Adding SQLite to Ouvvi

We are looking to add a self-contained database option for Ouvvi, such as SQLite. The thinking behind this is that it will simplify the installation: you will not need to install SQL Express or have your own SQL Server to make use of Ouvvi.

What are your thoughts, do you think this could be a good option and would you be interested?

Dynamic UI for Step Handlers

We are currently working on a Dynamic UI for the Ouvvi step handlers. This means that you can build a custom user interface from the code without having to define it elsewhere.

You'll be able to create custom step handlers to upload into Ouvvi, and the UI will automatically render a default view unless you decide to define it exactly. You can group the various properties into separate cards. By default each property is rendered as a column that spans the full width, but you can define this further so that properties sit side by side, spanning half or a third of the row, etc.

We've done this to clean up our code base and make our lives easier when writing new step handlers, triggers and connectors for Ouvvi, and if you want to write your own it should help you too!

It will also make it possible to dynamically hide options that aren't relevant for the configuration option you chose.

Got any questions about writing your own Handler? Send us an email and we can help you out.

Continuing through Covid

We've been lucky enough that throughout this pandemic you, our customers, have continued to find ways to use Data Sync and Ouvvi to help automate your processes. Over this time we have taken on 30 new customers, who are either completely new to our Products or have been recommended to use them by someone else.

So this has kept us busy, making sure that everyone can continue to connect to all their applications and trying to plan ahead for the future. If everyone is going to be working remotely more often, is a shift towards being a SaaS product where we need to be? Let us know your thoughts: would being able to access Ouvvi and edit your projects from anywhere, whether on your laptop, mobile or tablet, be useful for you?

If it is, maybe you want to try Ouvvi Online out.

Final Note

On another more personal note, we got to have a bit of a celebration in September as Rebecca got married! Due to restrictions they have had multiple small wedding days and so far all of them have had glorious sunshine. Let's hope the rest are as sunny when they get back to celebrating again in spring.

So as our final note of the post here is a picture of the happy couple on the day.

The Happy Couple

Bulk Upload Documents into SharePoint from Excel

How to use Data Sync to upload documents to a Document Library from a single Excel Spreadsheet

27 April 2020

Let's take a scenario that most of you come across on a frequent basis: you have a directory of documents that need to be added to SharePoint. With Data Sync there are a few ways to do this, and the one we are going to cover today makes use of a simple Excel spreadsheet. Your other options can be seen in our documentation here.

The example shown below uploads only a small sample of documents, but you can use this method to upload many thousands of documents.

Before you begin

The Document Library

You want to make sure you have a Document Library ready in SharePoint to take your documents and define any metadata columns you want to be included.

Empty Document Library

The Spreadsheet

In your Excel spreadsheet you need to specify the name of each document and any metadata you want to be uploaded alongside it. In our example we have ten documents with different modified dates and different approval statuses.

Spreadsheet of Data

Documents in SubFolders

If your documents are contained within folders in your directory, you should include the sub-directory folder name with the file name, for example Folder1\doc1.docx. This ensures that the document can be found and that your directory structure is replicated in the Document Library. An example spreadsheet can be seen here:

Spreadsheet of Data With Folders

The Project

Open Data Sync and connect to your spreadsheet as the Source: choose the Open XML provider from the list, locate the spreadsheet containing your metadata, and click Connect.

Excel Connection

In the connection properties go to the section Settings.Writer and complete the following fields:

  • BlobBasePath : Enter the file path to your directory of documents. In our example this is C:\Users\Rebecca\Documents\Demo Documents.
  • BlobFileName : Choose from the drop down the column that returns the name of the file. In this example it is the column FileName.

Property Settings

Connect to your SharePoint Document Library as the Target by choosing the SharePoint Online or SharePoint ClientAPI (if using an on-prem version of SharePoint) provider from the list.

SharePoint Connection

Schema Mapping

Now map your source columns to your target columns. You need to map one column to the URLPath in SharePoint. For this example we are adding a Title Property to each document, setting whether it is approved or not and a modified date. The full schema mapping of the example can be seen below:

Schema Mapping

As of version 3.0.1276 you can map a different filename to the one you have specified in the BlobFileName field, and the BlobFileName column does not need to be included in the schema mapping. So if you want your files to go to different folders or have a different name, map the other Filename column (such as TargetFileName in the example spreadsheet) to the URL path in the schema map.

Mapping without BlobFileName Included

Run the Compare and Sync

Now run the comparison by clicking Compare A -> B and you can preview the results to make sure that they are appearing as expected.

Comparison Results

Once you are happy, synchronise the results. You can now go to the Document Library in SharePoint and you should see all the documents added with their corresponding metadata.

Results in SharePoint

Results with Folders

If you defined folders for the documents, your results in SharePoint should look like this:

Results in SharePoint with Folders

Clicking inside a folder shows the documents.

Results in SharePoint Inside Folder

Announcing the Simego Referral Program

What the Simego referral program is and how to get started

5 October 2019

Today we are launching a referral program for you, our loyal customers.

We have found that much of our business comes from you recommending us to others. So, in recognition of this, we have created a referral program: if you refer a customer to Simego they will get a 20% discount on their order, and you will receive a 20% credit in the form of points to use against your renewals or new purchases.

Get Started

To get started all you need to do is log in to your Simego account and find the referral section (this should be in the menu to the left of the page). Read through the details and click "Agree Terms". This will then generate your unique referral code.

Copy that referral code and get talking!

You can put it wherever you like: in an email, in a blog post, as a footer, or on social media. You decide how you want to get the word out. Just make sure to tell everyone where they need to go, and that the code needs to be entered in the Partner/Referral box during checkout.

The more you refer the more points you are rewarded!

The referral section in your account will show you how many points you have (the big green section), the transactions you have against your referral code and how many points these orders gave you (or what you spent with a spend code).

You can also generate a Spend Code to spend your points from your account. Click “Create Spend Code” and type in the number of points you want to use then click “Create”. Your spend code will now appear in your transactions table, copy this and use it against your renewal or a new purchase.

Go get signed up and start earning points for every time you refer a new customer to us!

Thank you again for being a Simego customer.

Introducing Ouvvi Apps

An introduction to Ouvvi Apps

4 September 2019

We've been busy working on some new Ouvvi developments over the past few months, one of which is Ouvvi Apps.
This new addition has so many possibilities that we are going to split it across multiple blog posts, so that we don't overwhelm you with information.

So let's get started with our introduction to Ouvvi Apps.

What are Ouvvi Apps?

Ouvvi Apps, in a nutshell, are a new form of table storage with a modern RESTful API. That may not sound like much right now, but once you see how they save you time and effort in your integrations, and the possibilities they open up, we think you might change your mind. Ouvvi Apps were designed to make data integration and reporting on your data simpler and more useful.

Simego Ouvvi Apps

Ouvvi Apps are SQL tables on steroids. Not only can you use the apps as a store or backup of your data, but you also get to use some useful additional features. Ouvvi Apps have a simple REST based API and an Export API which can output the data in a number of standard Data formats.

Built in Connector

Ouvvi Apps comes with a built in ready to use Data Sync connector, so you can implement these in your integrations quickly and with no hassle.

Ouvvi Apps Connector

You also get a quick start function to create new apps within your Ouvvi environment!

Quick Start

Export Formats

Each App has its own API endpoint that you can connect to.
You can also generate exports of your data in a number of formats: XML, HTML, CSV, Excel and JSON. These can all be consumed via a URL from the Ouvvi server, making them a great way to share data with other applications.

Export Formats
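As a sketch of how a downstream script might consume these exports, here's a small Python helper; note the URL pattern below is hypothetical, so copy the real export URL from your App's page in Ouvvi:

```python
from urllib.parse import urljoin

EXPORT_FORMATS = ("xml", "html", "csv", "excel", "json")

def export_url(ouvvi_base: str, app: str, fmt: str) -> str:
    # Hypothetical endpoint layout; use the actual URL shown on your Ouvvi App page
    if fmt not in EXPORT_FORMATS:
        raise ValueError("unsupported export format: " + fmt)
    return urljoin(ouvvi_base, "apps/{0}/export/{1}".format(app, fmt))

print(export_url("http://ouvvi.local/", "orders", "json"))
```

Any tool that can fetch a URL (Power BI, a script, another Ouvvi instance) can then pull the data in the format it prefers.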

Data Consumption

Consume the data stored in Ouvvi Apps straight into Power BI. Getting your data into Power BI can sometimes be difficult or slow, depending on the source system. Now you can use Data Sync to update your Ouvvi Apps and then connect Power BI to the Ouvvi App.

Power BI

Data Transformation

With data integration it can be helpful to use a SQL database as a temporary staging store. With Ouvvi Apps you can use these instead, and no longer need to deal with creating SQL tables on the database server.

Email Reporting

Go from simply having a data store to getting emails about your data. We have two HTML report views, TABLE and LIST, and a new Ouvvi Email handler that can consume a web page as the body of an email. Combining these gives us a powerful new way to import data into an Ouvvi App and then send it as a report by email.

We've been using this loads! One example is when we receive a new order. The data we get from our reseller wasn't detailed enough, and we kept having to log in to find the information we needed. So we combined Data Sync + Ouvvi Apps, and now we get an email every time a new order comes in with all the information we need.

New Order Email

We're going to follow up with some more detailed blogs in the future but for now this is a heads up for what is coming in the next release.

Introducing the Data Sync Connector for Pipedrive

The Pipedrive CRM system connector

30 August 2019

We’ve recently begun using Pipedrive as our main CRM, so naturally thought we should build a connector for this!

So let's tell you a bit more about how you can connect and use the Pipedrive REST API.

The Connector

The Pipedrive connector is available from version 3.0.1228, and has full read and write, lookups and connection library capabilities.

To use it you will need your API token from Pipedrive; get this by going to Settings > Personal > Other > API. Then simply choose the list you want to connect to.

Pipedrive Connector

What can you connect to?

The connector can connect to the Activity, Deal, Note, Organization, Person, Pipeline, Product, Stage, and User lists.

If you want to test what fields will be returned for each list, you can do so at the Pipedrive API reference page. Enter your API token, choose an endpoint to test, and then click GET. This will return a sample of your data stored in Pipedrive.
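Behind the scenes the connector calls the Pipedrive v1 REST API, authenticating with the api_token query parameter. As a rough illustration (a Python sketch, not connector code; the helper name is ours), a request URL is built like this:

```python
from urllib.parse import urlencode

PIPEDRIVE_BASE = "https://api.pipedrive.com/v1"  # v1 REST API base

def pipedrive_url(endpoint: str, api_token: str, **params) -> str:
    # e.g. endpoint "persons" -> https://api.pipedrive.com/v1/persons?api_token=...
    params["api_token"] = api_token
    return "{0}/{1}?{2}".format(PIPEDRIVE_BASE, endpoint, urlencode(params))

print(pipedrive_url("persons", "YOUR_API_TOKEN"))
```

Issuing a GET against that URL returns the same JSON sample you see in the API reference page.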

How might you use this connector?

Maybe you have Pipedrive but don’t know where to get started. Let us give you a few examples of how you might use it, so then you can get up and running automating your Pipedrive integration.

Data Migration

The very first task we needed the connector for was to migrate from our old CRM system to Pipedrive.

We simply connected to the old CRM as the source, connected to Pipedrive as our target, mapped the columns we wanted to include and then clicked Synchronise.
As you have probably found out by now, Data Sync doesn't take long to run these sorts of projects, so we were done in under a minute.

Now we can go into Pipedrive and have our customer data to hand, joining the emails we receive to customers and deals in the system.

Adding Leads

You may receive a list of new leads from one of your departments that need to be uploaded into Pipedrive. This could be a CSV file, or maybe you connect directly to the SQL database where the data is added.
Just connect to your list, connect to Pipedrive, map the columns and sync.

Updating Records

Maybe you edit the user data within Pipedrive but then need to reflect these updates in your business systems. Or spin it the other way and the data is edited in another system, but these changes need to be reflected in your Pipedrive CRM.
Just follow the same process: connect to your source system, connect to your target (destination) system, map your columns, compare and sync.

You can automate the project by using either the run tool to schedule it to run at a set time or by using Ouvvi to apply event or time-based triggers. You could even have Ouvvi send you an email once it has run to let you know something has changed… We will be telling you more about these sorts of possibilities in an upcoming blog post so keep your eyes peeled for our Ouvvi Apps release post.

Want to know more?

So if this has got you thinking about all the possibilities and you want to know more, send us a message and we can help get you started.

Data Sync Projects vs Traditional Blocky Workflows

A comparison of Data Synchronisation Studio projects against other ETL tools

5 July 2019

In this blog we will cover the difference between Data Synchronisation Studio and other data integration products that take a blocky, one-step-at-a-time workflow approach.


For this example, let's take a simple scenario: you receive a file daily into a mailbox, and you want to take this file, save it to a folder and, at the same time, rename the file with today's date.

Typical Workflow

Below is an example of the steps you might take to complete this process.

Email Attachment Process Flow

The Data Sync Approach

In Data Sync we model our Source Data how we want it represented in the Target, doing everything at once and in batch. Data Sync then works out for us what it needs to do to make the target the same as the source. We then run multiple Data Sync projects in a sequence to build more complex processes.

So in this example we wrap all of this up into a single Data Sync project, where we connect to an Exchange mailbox, get mail for today with a specific subject, then map it to a folder and rename the file before it's written.

Filter Email for Today and Subject

This is done via Project Automation, updating the project configuration at runtime to return email messages received today. To filter only messages with a subject starting with NEW LEADS, we enter a value of NEW LEADS* into the FilterBySubject property of the source.

DataSourceA.FilterByReceivedDateTime = DateTime.Today.ToString("yyyy-MM-dd HH:mm:ss");

Extract Attachment and Save to Folder

This is done by simply mapping the Filename from the attachment to the target folder; at the same time we use a Calculated Column to rename the file using today's date.

FORMAT("LEADS_{0}.csv", DATESTR(TODAY(), "yyyyMMdd"))
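If you're curious what that calculated column evaluates to, here's the same logic as a standalone Python sketch (purely illustrative; inside Data Sync you'd use the FORMAT/DATESTR functions above):

```python
from datetime import date

def leads_filename(today: date) -> str:
    # Mirrors FORMAT("LEADS_{0}.csv", DATESTR(TODAY(), "yyyyMMdd"))
    return "LEADS_{0}.csv".format(today.strftime("%Y%m%d"))

print(leads_filename(date(2019, 7, 5)))  # LEADS_20190705.csv
```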

Data Sync Project in Designer

Data Sync Project

Next Process Import File

Next we might want to import the data from the file into our target system. This could be almost anything, but in general the process is the same: we process each row one at a time, looking up against the target to decide whether we need to add the row (because it's new) or update an existing row with new data. This can get really complicated when our source data doesn't contain the required identifier in the target system. It can also be really slow to look up each record one at a time.

Email Attachment Process Flow

Fortunately, in Data Sync all this is really easy: we just need to connect the source to our CSV file. We can use a wildcard for the path if we do not know the exact filename, e.g. C:\Temp\Leads_Drop\LEADS_*.csv.
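The wildcard path behaves just like a normal file glob; this standalone Python sketch (illustrative only, not Data Sync code) shows the idea:

```python
import glob
import os
import tempfile

# Make a drop folder containing two date-stamped lead files and one other file
drop = tempfile.mkdtemp()
for name in ("LEADS_20190704.csv", "LEADS_20190705.csv", "notes.txt"):
    open(os.path.join(drop, name), "w").close()

# LEADS_*.csv matches only the lead files, whatever their date suffix
matches = sorted(os.path.basename(p)
                 for p in glob.glob(os.path.join(drop, "LEADS_*.csv")))
print(matches)  # ['LEADS_20190704.csv', 'LEADS_20190705.csv']
```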

We then connect the Target and define the mapping between the two data sources. We need a Key column to identify the records; this is used by Data Sync to calculate whether a record is new or is an update.

Internally Data Sync tracks the target identifiers so we can always update a record even when the source data doesn't include this value.
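Conceptually, the compare step works something like this simplified Python sketch (a rough illustration of key-based matching only; the real engine also tracks target identifiers and handles deletes):

```python
def compare_sources(source, target):
    """Bucket source rows into adds and updates by comparing on the key."""
    adds = {k: row for k, row in source.items() if k not in target}
    updates = {k: row for k, row in source.items()
               if k in target and target[k] != row}
    return adds, updates

source = {"a@x.com": {"Name": "Ann"}, "b@x.com": {"Name": "Bob"}}
target = {"b@x.com": {"Name": "Robert"}}

adds, updates = compare_sources(source, target)
print(sorted(adds))     # ['a@x.com']
print(sorted(updates))  # ['b@x.com']
```

Because the whole batch is compared at once, there is no per-row lookup round trip, which is why the batch approach stays fast.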

Data Sync Project

Bringing it all together

Finally, we need a way to run these Data Sync projects in a sequence, configure a schedule or real-time trigger, and monitor the execution once the process is in production.


Ouvvi is our solution for scheduling and monitoring the individual Data Sync projects. Within Ouvvi you define a project and add steps to execute in a sequence using flow control to manage which steps run based on a running status.

For our example here we add three steps.

  1. Download CSV file from email.
  2. Import CSV file to SQL table.
  3. On failure, send an email report.

We can document each of these steps in Ouvvi to ensure we have some visibility on the defined process.

Once the solution is configured and tested we add a Trigger to run this at Start Of Day to ensure it runs automatically for us.

Projects configured in Ouvvi

Ouvvi Project

This is just a small fraction of what is possible with Data Sync+Ouvvi. Hopefully this helps you understand the difference between Data Sync+Ouvvi and other products and you can see how easy it is to use Data Sync+Ouvvi for your Data Integration processes.

Data Sync Avangate/2Checkout Connector

Avangate/2Checkout Connector for Subscriptions and Promotions

2 July 2019

We have created a new Data Sync connector for the Avangate/2Checkout REST V4 API.

This new connector can read Subscriptions and Promotions from the API. With it you can easily sync your subscription data in 2Checkout with your CRM system or SQL database.

Avangate/2Checkout API docs are here

Avangate-2Checkout Connector

Write operations are manual via Project Automation Events and calling the REST API directly.

For example, if you wanted to synchronise promotion discount rates with your own internal system, you might use the following code in the Project Automation BeforeUpdateItem event to retrieve the current promotion JSON document, update the rate, and then send the document back.

public override void BeforeUpdateItem(object sender, DataCompareItemInvariant item, object identity)
{
    var mapping = new DataSchemaMapping(SchemaMap, DataSchemaSide.DataSourceB);
    var toUpdate = item.ToUpdateItemDictionary(mapping);

    var helper = DataSourceB.GetWebRequestHelper();
    var info = DataSourceB.GetDatasourceInfo();
    var url = info.GetAvangateItemEndpointUrl((string)identity);

    // Get the existing promotion details
    var promotion = helper.GetRequestAsJson(url);
    // Update the discount
    promotion["Discount"]["Value"] = toUpdate["DiscountValue"].ToString();
    // Send it back
    helper.PutRequestAsJson(promotion, url);
}

Data Sync Mailchimp Connector

Import, Export Mailchimp Audience data and set Tags

1 July 2019

We have created a new connector for Data Synchronisation Studio to work with Mailchimp data.

With this connector you can Import and Export Mailchimp Audience Contacts. You can also update existing Contacts with any changes you may have in your database. This connector also supports Incremental Sync mode when you use the ID column as the Key column.

Mailchimp Data Sync Connector

When you prepare your data for the Mailchimp import, it's a good idea to lower-case the email address; this ensures that we can also calculate the MD5 id value that Mailchimp uses as the record identifier.

It's easy to lower-case the email address with a calculated column using the LOWER(Email) function.

Mailchimp Lower Email Address

We can also generate the same MD5 hash from the email address that Mailchimp uses with the MD5HASH(LOWER(Email)) function.

Mailchimp MD5 Hash Email Address
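For the curious, the MD5HASH(LOWER(Email)) result can be reproduced outside Data Sync; this small Python sketch (illustrative only) shows why lower-casing first matters:

```python
import hashlib

def mailchimp_id(email: str) -> str:
    # MD5 of the lower-cased address, hex encoded, as Mailchimp uses for member ids
    return hashlib.md5(email.lower().encode("utf-8")).hexdigest()

# Case differences disappear once the address is lower-cased first
print(mailchimp_id("Someone@Example.com") == mailchimp_id("someone@example.com"))  # True
```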

To configure your project map your source columns to the Mailchimp columns and use the results of your functions to map Email and ID columns.

Mailchimp Data Sync Project

You can specify the contact status during the import by providing a text value for the status, e.g. subscribed, transactional, etc.

It's also possible to add/remove tags from contacts; this is done by specifying an array of tags to apply to the contact.

If your source data holds your tags in separate columns, such as Tag1, Tag2, etc., it's quite easy to combine these into an array with a helper function.

If you add a function like the ARRAY function below to the Dynamic Columns class, you can then call it from Calculated Columns with the function ARRAY(Tag1, Tag2).

public string[] ARRAY(params string[] values)
{
    return values.Where(p => !string.IsNullOrEmpty(p)).ToArray();
}

You can also extend the import further in Project Automation by intercepting the Item Events and calling the Mailchimp REST API directly with the help of the Data Sync HttpWebRequest helper.

Deploying Ouvvi + Data Sync on Windows 10

Using Windows 10 Pro rather than Windows Server

7 June 2019

Ouvvi is typically deployed on a dedicated Windows Server Virtual Machine. This can have significant Windows Server License implications.

Ouvvi and Data Sync can be run on Windows 10 Pro with IIS and SQL Server Express Edition just as well as it would on a Windows Server OS. You do need the Pro SKU of Windows 10 as Ouvvi requires IIS.

Dell offers a tiny little workstation, the OptiPlex 3060, which can be ordered in several configurations. The 6-core i5-8500T with 8GB RAM and a 256GB M.2 SSD is a particularly interesting configuration and would make a great little Ouvvi + Data Sync data integration server (we have 6 of them!). This specification is capable of running some very significant data integration processes.

DELL Optiplex 3060

If you were to get a similarly specified (4 cores, 8GB RAM) Windows Server VM from a well-known cloud service vendor you would be looking at ~$285/month.

AWS c5.xlarge costs

I would imagine this little PC would still be significantly faster than any of the cloud VM offerings. Amortised over 3 years, it works out at ~$20/month.

You can Domain Join and enable Remote Desktop on this little Windows 10 Pro machine and just leave it in a cupboard somewhere and pretend it’s a “real server”.

Windows 10 Pro only allows a single Remote Desktop session, so you're limited to one user logging on at a time. However, you can make Ouvvi accessible over your LAN by opening the Ouvvi web TCP port in Windows Firewall. Multiple network users can then view the dashboards and keep an eye on what's running via the web interface.

Ouvvi runs as a service, so this machine does not need to be logged in, just switched on and running.

Ouvvi Dashboard View

So there you go: you can deploy Ouvvi + Data Sync on Windows 10 Pro, and it might just be a better option for you.

Email Address Validation via ZeroBounce

Using Data Sync Dynamic Columns with ZeroBounce API

6 June 2019

We found this great SaaS service online to clean up email addresses: basically you ask ZeroBounce whether an email address is valid or not and then use that information to keep your database clean.

This is a simple HTTP JSON service where you send an email address and it returns a JSON response with the result. Armed with this information it is really easy to build an automated integration with Data Sync to continuously check the status of email addresses in your database.
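As a sketch of the request shape (assuming the ZeroBounce v2 validate endpoint; the API key is a placeholder), the query string can be built like this in Python:

```python
from urllib.parse import urlencode

def build_validate_url(api_key: str, email: str, ip_address: str = "") -> str:
    # ZeroBounce v2 validate endpoint; the JSON response includes fields
    # such as "status", "sub_status" and "free_email".
    qs = urlencode({"api_key": api_key, "email": email, "ip_address": ip_address})
    return "https://api.zerobounce.net/v2/validate?" + qs

print(build_validate_url("YOUR_API_KEY", "someone@example.com"))
```

A GET request to that URL returns the validation result for the address.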

For this example I created a simple SQL table to hold some email addresses and the response from ZeroBounce. As you're charged based on API credits, it's a good idea to only call the API when you need to; in this example we store a last-checked value and only call the API again if 30 days have passed since the last check.

Note: You need to complete the sync to write the values to the SQL table; if you only compare without syncing you will keep calling the API.
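The 30-day guard itself is simple date arithmetic; here is a Python sketch of the same rule for illustration:

```python
from datetime import datetime, timedelta

def needs_check(last_checked, days=30, today=None):
    # An address needs re-checking when it has never been checked,
    # or when the last check is more than `days` days old.
    today = today or datetime.today()
    return last_checked is None or (today - timedelta(days=days)) > last_checked

print(needs_check(None))                                   # True
print(needs_check(datetime.today() - timedelta(days=40)))  # True
print(needs_check(datetime.today()))                       # False
```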

SQL Table

Here is a T-SQL definition for the SQL Table I used in this example.

CREATE TABLE [EmailValidation] (
	[EmailAddress] nvarchar(255) NOT NULL,
	[LastChecked] datetime NULL,
	[Status] nvarchar(100) NULL,
	[SubStatus] nvarchar(100) NULL,
	[FreeEmail] bit NULL
)
You then need to fill the EmailAddress column with email addresses. You could use a Data Sync project to import these from a CSV file, another SQL table, or any other data source.

Email Validation Table

Data Sync Project

We now need to configure the Data Sync project. As this project updates the same data source, i.e. the SQL table, with information from ZeroBounce, we map the SQL table as both Source and Target, then use Dynamic Columns in a lookup-style function to call the ZeroBounce API and get the result for each email address in the table.

To configure this Data Sync project:

  1. Load the SQL Table as both the Source and Target.
  2. Copy the Dynamic Columns code, add your ZeroBounce API key and check that it compiles (build button).
  3. Map the columns as shown in the screenshot below so that the data from ZeroBounce is mapped to the Target SQL Table.
  4. Compare and Sync.

Data Sync Project

Data Sync Project Code

Dynamic Columns Code

partial class DataSourceRowOverride : Simego.DataSync.DynamicColumns.DataSourceRowInternal
{
    private const string API_KEY = "YOUR_API_KEY";
    // ZeroBounce v2 validate endpoint
    private const string API_URL = "https://api.zerobounce.net/v2/validate?api_key={0}&email={1}&ip_address=";

    private HttpWebRequestHelper helper = new HttpWebRequestHelper();

    public DateTime? Fx_LastChecked { get; set; }
    public string Fx_Status { get; set; }
    public string Fx_SubStatus { get; set; }
    public bool? Fx_FreeEmail { get; set; }

    public override bool BeginRow()
    {
        Fx_LastChecked = LastChecked;
        Fx_Status = Status;
        Fx_SubStatus = SubStatus;
        Fx_FreeEmail = FreeEmail;

        // Only call the API when the address has never been checked
        // or the last check is more than 30 days old.
        if (!Fx_LastChecked.HasValue || DateTime.Today.AddDays(-30) > Fx_LastChecked.Value)
        {
            Fx_LastChecked = DateTime.UtcNow;
            try
            {
                var result = helper.GetRequestAsJson(API_URL, API_KEY, EmailAddress);
                Fx_Status = DataSchemaTypeConverter.ConvertTo<string>(result["status"]);
                Fx_SubStatus = DataSchemaTypeConverter.ConvertTo<string>(result["sub_status"]);
                Fx_FreeEmail = DataSchemaTypeConverter.ConvertTo<bool?>(result["free_email"]);
            }
            catch(Exception)
            {
                Fx_Status = "Error";
            }
        }
        return true;
    }
}
Email Validation Results

After you run the sync you will see the results of the email validation in the SQL table. If you add more email addresses to the table, only the newly added addresses will be checked next time.

Email Validation Table Results

Ouvvi DevOps

Automating Ouvvi Solution Deployment

25 April 2019

Requires: Ouvvi Version 4.0.574 or greater.

We have a new Solution packaging feature for Ouvvi. This allows you to package Ouvvi assets into a single deployable package and then deploy to your Ouvvi server.

These Ouvvi solutions include Dashboards, Groups, Projects, Steps, Triggers, Connections and Settings.

There is a new Ouvvi Step Handler that can be used to automatically pick up a solution file and deploy it to your server whenever the solution file is changed. For live running servers this will queue the import job, and it will run when the current job queue is clear.

Export Solution from Existing Ouvvi Instance

To export your existing Ouvvi projects to the new solution file format use the new Solution Export feature.

Go to Projects->Export->Solution Export.

Export Projects

Export Solution

And check the items to be included in the export.

Choose Projects to Export

This will export a ZIP Archive file containing the solution assets which you can then extract and modify as necessary.

Solution Files

Import Solution to Ouvvi Instance

To import a solution file make sure that you zip all your solution assets into a ZIP Archive and then go to Projects->Import->Solution Import.

The Import process is a MERGE operation with your Ouvvi instance where items are matched by name. The only items that are deleted are steps within an imported project where they no longer exist.

Important: You must use unique names for your Ouvvi items so that matching by name during the import works correctly.

Import Projects

Import Solution

Choose file to Import

Before your items are imported you can review the contents of the solution file and choose which items to import.

Check Items to Import

Import Items

Build Solutions with VS Code

Using VS Code, Visual Studio or another IDE allows you to manage these assets with a source control system (e.g. Git) and, when ready, package them up and deploy to the Ouvvi server. The package is simply a ZIP archive of the solution.

VSCode solution project

Within the IDE you can edit the solution XML files, store your assets in a source control system, and edit the Data Sync projects with the Data Sync Studio designer. The Connection Library is redirected to the Library within the solution.
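Since a deployable package is just a ZIP archive of the solution assets, the packaging step is easy to script; here is a minimal Python sketch (the folder and archive names are illustrative):

```python
import shutil

def package_solution(solution_dir: str, output_basename: str) -> str:
    # Zip the whole solution folder into <output_basename>.zip and
    # return the path of the archive, ready to deploy to Ouvvi.
    return shutil.make_archive(output_basename, "zip", solution_dir)

# e.g. package_solution("MySolution", "MySolution") -> "MySolution.zip"
```

Running this as part of a build script produces the same kind of archive the Solution Export feature creates.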

Automate Solution Import

You can set up an Ouvvi project to automate importing a solution file when it changes. To do this, create a new project and add a step of type "Ouvvi Solution File Import". The step configuration should then point to the location of the solution file.

Solution Import Step Type

Solution Import Step Configuration

You can then create a File Trigger that points to the same file and add it to the deployment project. When a new solution ZIP file is written, the contents of the solution are automatically imported.

Import Solution via API

If you create an Ouvvi deployment project you can also call Ouvvi via the API to run the import.

Example PowerShell script to execute an Ouvvi hosted project:

Invoke-RestMethod -Method GET -ContentType application/json -UseDefaultCredentials -Uri http://localhost:2026/api/projects/start/1

Pivot DataSource Columns into Rows

Using Dynamic Columns to convert your Columns into new rows

1 November 2018

Here we have a Data Transformation method to pivot normal row data so that the columns in the original data source can be used to create multiple new rows.

Source Data

Taking this simple source data we need to switch it around so that the row columns become new rows.

Pivot source Data

Result Data

The result we're looking for here has 3 columns: ID, the original source ID; MetaKey, the name of the value; and MetaValue, the actual value from the row.

We also want to add additional rows with fixed values.

Pivot Result Data
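The transformation itself is straightforward; here is a Python sketch of the same column-to-row pivot, including the extra fixed-value rows (column names match the example above):

```python
def pivot(rows, fixed_values=None):
    # Turn each source row's columns into (ID, MetaKey, MetaValue) rows,
    # then append the fixed-value rows for each source ID.
    out = []
    for row in rows:
        rid = row["ID"]
        for key, value in row.items():
            if key != "ID":
                out.append({"ID": rid, "MetaKey": key, "MetaValue": value})
        for key, value in (fixed_values or {}).items():
            out.append({"ID": rid, "MetaKey": key, "MetaValue": value})
    return out

source = [{"ID": "1", "Nickname": "sam", "FirstName": "Sam", "LastName": "Smith"}]
print(pivot(source, {"rich_editing": "true"}))
```

In Data Sync the same idea is implemented with Dynamic Columns, as shown below.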

Dynamic Columns

Using Dynamic columns we're going to capture the original row values, add new rows and then remove the original row.

partial class DataSourceRowOverride : Simego.DataSync.DynamicColumns.DataSourceRowInternal
{
    //Our Dynamic Columns
    public string MetaKey { get; set; }
    public string MetaValue { get; set; }

    private bool processingRow = false;

    public override bool BeginRow()
    {
	    //If we're currently adding our new rows return true to include them in the results.
	    if(processingRow) return true;

	    processingRow = true;

	    //Capture this row data
	    var nickname = Nickname;
	    var first_name = FirstName;
	    var last_name = LastName;

	    //Add New Rows
	    Table.Rows.Add(AddRow(ID, "nickname", nickname));
	    Table.Rows.Add(AddRow(ID, "first_name", first_name));
	    Table.Rows.Add(AddRow(ID, "last_name", last_name));
	    Table.Rows.Add(AddRow(ID, "description", ""));
	    Table.Rows.Add(AddRow(ID, "rich_editing", "true"));
	    Table.Rows.Add(AddRow(ID, "syntax_highlighting", "true"));
	    Table.Rows.Add(AddRow(ID, "comment_shortcuts", "false"));

	    processingRow = false;

	    //Remove the original row from the results
	    return false;
    }

    private DataTableStoreRow AddRow(string ID, string key, string value)
    {
	    //Create new Row
	    var row = Table.NewRow();
	    row["ID"] = ID;

	    //Set Dynamic Column Values
	    MetaKey = key;
	    MetaValue = value;

	    return row;
    }
}
There is a slight variation to this when the source connector uses an internal identifier. The more advanced connectors such as Dynamics, Salesforce, AD, Podio and SharePoint require row identifiers to be set for each row.

For example, to use this with the Dynamics CRM systemuser entity you would obtain the row identifier and add the rows with the AddWithIdentifier method.

    // Get the source identifier
    var id = Row.GetIdentifier<Guid>();

    var nickname = domainname;
    var first_name = firstname;
    var last_name = lastname;
    Table.Rows.AddWithIdentifier(AddRow(systemuserid.ToString(), "nickname", nickname), id);
    Table.Rows.AddWithIdentifier(AddRow(systemuserid.ToString(), "first_name", first_name), id);
    Table.Rows.AddWithIdentifier(AddRow(systemuserid.ToString(), "last_name", last_name), id);
    Table.Rows.AddWithIdentifier(AddRow(systemuserid.ToString(), "description", ""), id);
    Table.Rows.AddWithIdentifier(AddRow(systemuserid.ToString(), "rich_editing", "true"), id);
    Table.Rows.AddWithIdentifier(AddRow(systemuserid.ToString(), "syntax_highlighting", "true"), id);
    Table.Rows.AddWithIdentifier(AddRow(systemuserid.ToString(), "comment_shortcuts", "false"), id);

Data Sync Project

The complete Pivot example project loaded in Data Synchronisation Studio.

Pivot Project screenshot

Download Sample Project

Would you like to try Data Sync and Ouvvi for your Data Integration projects?