
Team Blog

Keep up to date with everything going on at Simego.

Deploying Ouvvi + Data Sync on Windows 10

Using Windows 10 Pro rather than Windows Server

7 June 2019

Ouvvi is typically deployed on a dedicated Windows Server virtual machine, which can have significant Windows Server licensing implications.

Ouvvi and Data Sync run just as well on Windows 10 Pro with IIS and SQL Server Express Edition as they do on a Windows Server OS. You do need the Pro SKU of Windows 10, as Ouvvi requires IIS.

Dell offers a tiny little workstation, the OptiPlex 3060, which can be ordered in several configurations. The 6-core i5-8500T with 8 GB RAM and a 256 GB M.2 SSD is a particularly interesting configuration and would make a great little Ouvvi + Data Sync data integration server (we have six of them!). This specification is capable of running some very significant data integration processes.

DELL Optiplex 3060

If you were to get a Windows Server VM of a similar specification (4 cores and 8 GB RAM) from a well-known cloud service vendor, you would be looking at ~$285/month.

AWS c5.xlarge costs

I would imagine this little PC would still be significantly faster than any of the cloud VM offerings. Spread over three years, it also works out at roughly $20/month.

You can domain-join this little Windows 10 Pro machine, enable Remote Desktop, leave it in a cupboard somewhere and pretend it’s a “real server”.

Windows 10 Pro only allows a single Remote Desktop session, so you're limited to one user logging on at a time. However, you can make Ouvvi accessible over your LAN by opening the Ouvvi web TCP port in Windows Firewall. Multiple network users can then view the dashboards and keep an eye on what's running via the web interface.
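
As a rough sketch, opening that port with PowerShell could look something like this (assuming the Ouvvi web site is bound to TCP port 2026, as in the API example later in this post; adjust the port to match your own site binding):

# Allow inbound LAN traffic to the Ouvvi web site (port 2026 is an assumption - use your site's port)
New-NetFirewallRule -DisplayName "Ouvvi Web" -Direction Inbound -Protocol TCP -LocalPort 2026 -Action Allow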

Ouvvi runs as a Windows service, so this machine does not need anyone logged in; it just needs to be switched on and running.

Ouvvi Dashboard View

So there you go: you can deploy Ouvvi + Data Sync on Windows 10 Pro, and it might just be a better option for you.


Email Address Validation via ZeroBounce

Using Data Sync Dynamic Columns with ZeroBounce API

6 June 2019

We found a great SaaS service for cleaning up email addresses: https://www.zerobounce.net/. Basically, you can ask ZeroBounce whether an email address is valid or not and then use that information to keep your database clean.

This is a simple HTTP JSON service: you send an email address and it returns a JSON response with the result. Armed with this, it is really easy to build an automated integration with Data Sync that continuously checks the status of the email addresses in your database.
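
If you want to see what the service returns before wiring it into Data Sync, you can call it from PowerShell. This is only an illustration: the endpoint format is the same one used in the Dynamic Columns code further down, and the API key and email address are placeholders.

# Call the ZeroBounce v2 validate endpoint (API key and email address are placeholders)
$result = Invoke-RestMethod -Method GET -Uri "https://api.zerobounce.net/v2/validate?api_key=YOUR_API_KEY&email=someone@example.com&ip_address="

# The response fields used later in this example
$result.status
$result.sub_status
$result.free_email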

For this example I created a simple SQL Table to hold some email addresses and the response from ZeroBounce. As you're charged based on API credits, it's a good idea to only call the API when you need to, so in this example we store a last-checked value and only call the API again if 30 days have passed since the last check.

Note: You need to complete the sync to write the values back to the SQL Table; if you only keep comparing, you will keep calling the API.

SQL Table

Here is a T-SQL definition for the SQL Table I used in this example.

CREATE TABLE [EmailValidation] (
	[ID] int IDENTITY(1, 1) PRIMARY KEY NOT NULL,
	[EmailAddress] nvarchar(255) NOT NULL,
	[LastChecked] datetime NULL,
	[Status] nvarchar(100) NULL,
	[SubStatus] nvarchar(100) NULL,
	[FreeEmail] bit NULL
)

You then need to fill the EmailAddress column with email addresses. You could use a Data Sync project to import these from a CSV file, another SQL Table or any other data source.
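
If you just want a couple of rows to test with, a quick way to seed the table is from PowerShell with Invoke-Sqlcmd. This is only a sketch: it assumes the SqlServer module is installed, and the instance, database and email addresses are placeholders.

# Seed a few test addresses into the EmailValidation table (instance/database names are placeholders)
Invoke-Sqlcmd -ServerInstance ".\SQLEXPRESS" -Database "Demo" -Query "INSERT INTO EmailValidation (EmailAddress) VALUES ('someone@example.com'), ('someone.else@example.com');"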

Email Validation Table

Data Sync Project

We now need to configure the Data Sync project. As this project updates the same data source (i.e. the SQL Table) with information from ZeroBounce, we map the SQL Table as both Source and Target and then use Dynamic Columns as a lookup-style function to call the ZeroBounce API and get the result for each email address in the Table.

To configure this Data Sync project:

  1. Load the SQL Table as both the Source and Target.
  2. Copy the Dynamic Columns code, add your ZeroBounce API key and check that it compiles (Build button).
  3. Map the columns as shown in the screenshot below so that the data from ZeroBounce is mapped to the Target SQL Table.
  4. Compare and Sync.

Data Sync Project

Data Sync Project Code

Dynamic Columns Code

partial class DataSourceRowOverride : Simego.DataSync.DynamicColumns.DataSourceRowInternal
{
    private const string API_KEY = "YOUR_API_KEY";
    private const string API_URL = "https://api.zerobounce.net/v2/validate?api_key={0}&email={1}&ip_address=";
	
    private HttpWebRequestHelper helper = new HttpWebRequestHelper();
	
    public DateTime? Fx_LastChecked { get; set; }
    public string Fx_Status { get; set; }
    public string Fx_SubStatus { get; set; }
    public bool? Fx_FreeEmail { get; set; }
			
    public override bool BeginRow()
    {
	    Fx_LastChecked = LastChecked;
	    Fx_Status = Status;
	    Fx_SubStatus = SubStatus;
	    Fx_FreeEmail = FreeEmail;
	
	    if(!Fx_LastChecked.HasValue || DateTime.Today.AddDays(-30) > Fx_LastChecked.Value)
	    {
		    Fx_LastChecked = DateTime.UtcNow;

		    try 
		    {			
			    var result = helper.GetRequestAsJson(API_URL, API_KEY, EmailAddress);
			
			    Fx_Status = DataSchemaTypeConverter.
				    ConvertTo<string>(result["status"].ToObject<object>());
				
			    Fx_SubStatus = DataSchemaTypeConverter.
				    ConvertTo<string>(result["sub_status"].ToObject<object>());
				
			    Fx_FreeEmail = DataSchemaTypeConverter.
				    ConvertTo<bool>(result["free_email"].ToObject<object>());				
			
		    } 
		    catch(Exception e)
		    {
			    Fx_Status = "Error";
			    Trace.TraceError(e.ToString());
		    }				
	    }
		
        return true;
    }    
}

Email Validation Results

After you run the sync you will see the results of the email validation in the SQL Table. If you add more email addresses to the Table, only the newly added addresses (and any not checked in the last 30 days) will be checked next time.

Email Validation Table Results


Ouvvi DevOps

Automating Ouvvi Solution Deployment

25 April 2019

Requires: Ouvvi Version 4.0.574 or greater.

We have a new Solution packaging feature for Ouvvi. It allows you to bundle Ouvvi assets into a single deployable package and then deploy it to your Ouvvi server.

These Ouvvi solutions include Dashboards, Groups, Projects, Steps, Triggers, Connections and Settings.

There is a new Ouvvi Step Handler that can automatically pick up a solution file and deploy it to your server whenever the solution file changes. On live servers this queues the import job, which runs when the current job queue is clear.

Export Solution from Existing Ouvvi Instance

To export your existing Ouvvi projects to the new solution file format use the new Solution Export feature.

Go to Projects->Export->Solution Export.

Export Projects

Export Solution

Then check the items to be included in the export.

Choose Projects to Export

This will export a ZIP Archive file containing the solution assets which you can then extract and modify as necessary.

Solution Files

Import Solution to Ouvvi Instance

To import a solution file make sure that you zip all your solution assets into a ZIP Archive and then go to Projects->Import->Solution Import.

The import process is a MERGE operation with your Ouvvi instance, where items are matched by name. The only items that are deleted are steps within an imported project that no longer exist in the solution.

Important: You must use unique names for your Ouvvi items so that matching by name during the import works correctly.

Import Projects

Import Solution

Choose file to Import

Before your items are imported you can review the contents of the solution file and choose which items to import.

Check Items to Import

Import Items

Build Solutions with VS Code

Using VS Code, Visual Studio or another IDE allows you to manage these assets in a source control system (Git) and, when ready, package them up and deploy to the Ouvvi server. The package is simply a ZIP archive of the solution.

VSCode solution project

Within the IDE you can edit the solution XML files, store your assets in a source control system and edit the Data Sync projects with the Data Sync Studio Designer. The Connection Library is redirected to the Library within the Solution.
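
Packaging for deployment can then be scripted as part of your build. A minimal sketch in PowerShell (the folder and file names are placeholders):

# Zip the solution assets into a single deployable package
Compress-Archive -Path .\MySolution\* -DestinationPath .\MySolution.zip -Force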

Automate Solution Import

You can set up an Ouvvi project to automate importing a solution file when it changes. To do this, create a new project and add a step of type "Ouvvi Solution File Import". The step configuration should point to the location of the solution file.

Solution Import Step Type

Solution Import Step Configuration

You can then create a File Trigger that points to the same file and add it to the deployment project. When a new solution ZIP file is written, the contents of the solution are automatically imported.

Import Solution via API

If you create an Ouvvi deployment project you can then also call Ouvvi via the API to run the import.

Example PowerShell script to execute an Ouvvi hosted project:

Invoke-RestMethod -Method GET -ContentType application/json -UseDefaultCredentials -Uri http://localhost:2026/api/projects/start/1


Pivot DataSource Columns into Rows

Using Dynamic Columns to convert your Columns into new rows

1 November 2018

Here we have a Data Transformation method to pivot normal row data so that the columns in the original data source can be used to create multiple new rows.

Source Data

Taking this simple source data we need to switch it around so that the row columns become new rows.

Pivot source Data

Result Data

The result we're looking for here has 3 columns: ID (the original source ID), MetaKey (the name of the value) and MetaValue (the actual value from the row).

We also want to add additional rows with fixed values.

Pivot Result Data

Dynamic Columns

Using Dynamic Columns, we're going to capture the original row values, add new rows and then remove the original row.

partial class DataSourceRowOverride : Simego.DataSync.DynamicColumns.DataSourceRowInternal
{
    //Our Dynamic Columns
    public string MetaKey { get; set; }
    public string MetaValue { get; set; }
	
    private bool processingRow = false;
	
    public override bool BeginRow()
    {		
	    //If we're currently adding our new rows return true to include them in the results.
	    if(processingRow) return true;
		
	    processingRow = true;
		
	    //Capture this row data
	    var nickname = Nickname;
	    var first_name = FirstName;
	    var last_name = LastName;
		
	    //Add New Rows
	    Table.Rows.Add(AddRow(ID, "nickname", nickname));
	    Table.Rows.Add(AddRow(ID, "first_name", first_name));
	    Table.Rows.Add(AddRow(ID, "last_name", last_name));
	    Table.Rows.Add(AddRow(ID, "description", ""));
	    Table.Rows.Add(AddRow(ID, "rich_editing", "true"));
	    Table.Rows.Add(AddRow(ID, "syntax_highlighting", "true"));
	    Table.Rows.Add(AddRow(ID, "comment_shortcuts", "false"));
		
	    processingRow = false;
		
	    //Remove the original row from the results
	    return false;
    }
	
    private DataTableStoreRow AddRow(string ID, string key, string value) 
    {
	    //Create new Row
	    var row =  Table.NewRow();	
	    row["ID"] = ID;		
		
	    //Set Dynamic Column Values
	    MetaKey = key;
	    MetaValue = value;
		
	    return row;
    }
}

There is a slight variation to this when the source connector uses an internal identifier. The more advanced connectors such as Dynamics, Salesforce, AD, Podio and SharePoint require row identifiers to be set for each row.

For example, to use this with the Dynamics CRM systemuser entity you would obtain the row identifier and add the rows with the AddWithIdentifier method.


    // Get the source identifier
    var id = Row.GetIdentifier<Guid>();

    var nickname = domainname;
    var first_name = firstname;
    var last_name = lastname;
		
    Table.Rows.AddWithIdentifier(AddRow(systemuserid.ToString(), "nickname", nickname), id);
    Table.Rows.AddWithIdentifier(AddRow(systemuserid.ToString(), "first_name", first_name), id);
    Table.Rows.AddWithIdentifier(AddRow(systemuserid.ToString(), "last_name", last_name), id);
    Table.Rows.AddWithIdentifier(AddRow(systemuserid.ToString(), "description", ""), id);
    Table.Rows.AddWithIdentifier(AddRow(systemuserid.ToString(), "rich_editing", "true"), id);
    Table.Rows.AddWithIdentifier(AddRow(systemuserid.ToString(), "syntax_highlighting", "true"), id);
    Table.Rows.AddWithIdentifier(AddRow(systemuserid.ToString(), "comment_shortcuts", "false"), id);


Data Sync Project

The complete Pivot example project loaded in Data Synchronisation Studio.

Pivot Project screenshot

Download Sample Project


Summer 2018 Release

Data Sync & Ouvvi Summer 2018 Release Update

24 July 2018

Here we have a round-up of the major new changes to Data Sync and Ouvvi for our Summer 2018 release.

Data Sync

Dynamics 365

Faster connection to Dynamics 365 by bypassing the Discovery Service and connecting directly to the Organisation Service Endpoint.

Dynamics 365 Connection

The Dynamics 365 connector now targets Dynamics CRM SDK v9.0.2, with support for multi-select option sets. Dynamics version 2016+ now uses a specific writer so that multiple requests are no longer required to set the record owner.

UpdateSourceRow support in the Dynamics connector to write back to the CRM record in Project Automation.

public override void AfterAddItem(object sender, DataCompareItemInvariant item, object identity)
{
    var properties = new Dictionary<string, object>() 
                    { 
                        { "new_sync", true } 
                    } ;
        
    DataSourceA.UpdateSourceRow(properties, (Guid)identity);
}

ADO.NET

New ADO.NET connector to connect to ADO.NET-compliant data sources.

The ADO.NET connector supports schema browsing via the ADO.NET schema tables, Connection Library, Lookups, Lookup Functions, Incremental Lookup, Incremental Data Load and INSERT/UPDATE/DELETE target actions. Some of these features depend on the driver implementation.

Connect to ADO.NET

ADO.NET Drivers

Ouvvi

Deployment Manager

The new Deployment Manager enables quick and easy deployment of Ouvvi: it can deploy new instances and update existing ones.

Deployment Manager

Log Instance

Child processes now create their own log instance, so when you call an Ouvvi project from another project the log is recorded against the project that actually ran, with a link back to the parent log.

So now, when one project calls many other projects, you will see the status against each project rather than only against the calling project.

Project View

ForEach Step Handler

The ForEach file step handler has been updated to be more robust and no longer ties up a processing thread in Ouvvi waiting for the project to complete.

Teams Message Step Handler

You can now post a Notification Message to Microsoft Teams via the new Teams Message Step Handler in Ouvvi.

Microsoft Teams Notification

Web API Step Handler

The Web API Step Handler can now save the response to a file, so it can be used to download a document for further processing.

Copy Project

You can now use the Copy Project feature to copy an existing project in Ouvvi, including all of its steps.


Dynamics 365 - Slow Discovery Service

Changing How Data Sync Connects with Dynamics 365

24 April 2018

Data Synchronisation Studio connects with Dynamics 365 first through the Dynamics 365 Discovery Service to obtain the URL of your Organisation SOAP/WCF service endpoint. Starting this week we have noticed this call taking more than 90 seconds to complete. This makes everything far too slow: every interaction with Dynamics 365 was taking nearly 2 minutes.

We therefore decided to remove the Discovery Service from the connection process, which means we need the URL of the Organisation Service to establish the connection.

First you will need to get this URL from your Dynamics instance, under Settings -> Customizations -> Developer Info.

Dynamics 365 Organisation Service

You will find the URL looks like this: https://orgname.api.crm4.dynamics.com/XRMServices/2011/Organization.svc

Next you will need Data Sync version 3.0.1138 or higher; currently this is available here.

When you create a connection to Dynamics 365 you will now see a new form for the connection details, where you enter the URL and your authentication details.

Dynamics 365 Connection

That is basically it; you should notice that connections are now much faster, taking just a few seconds. If you have existing projects linked to Dynamics 365, update your Connection Library connection to use the new URL and you should be good to go.

Questions?

If you ever think there is a feature missing from our products that would help you and other people out, then let us know by sending a message to support@simego.com. We love to hear feedback from our customers and will keep striving to add more functionality to our software.

