
Team Blog

Keep up to date with everything going on at Simego.


22 June 2017

Ouvvi v3 Beta

Ouvvi v3 is finally here in beta, ready for you to try out. To get the latest version, please download it from https://www.simego.com/install/data-synchronisation-studio-beta

 

There has been a big re-write of Ouvvi to move it from ASP.NET Web Forms onto ASP.NET MVC. We also dropped all the database stored procedures to make the database simpler, and the Ouvvi Service now runs over HTTP.

 

Ouvvi v3 runs on Windows Server 2012 R2 and Windows Server 2016. It also runs on Azure Web Apps with Web Jobs, as well as on VMs from AWS, Azure and so on.

 

Ouvvi v3 is 64-bit only at this stage, unless there is demand for a 32-bit version.

 

We have adopted Windows PowerShell to make the installation process as easy as possible. The default install also uses SQL Server LocalDB as the default database engine. This is fine for small deployments; anything larger should be deployed on SQL Server Express or full SQL Server.

 

The Video below shows how to install Ouvvi v3 onto a default Windows Server 2016 image with Default Settings.

 

PowerShell script used in this demo:

 

Import-Module OuvviAdministration
Install-Ouvvi

 

 

Upgrading from v2.5 is very similar, with a little more PowerShell.

 

PowerShell script used in this upgrade demo; note the use of paths and other settings that will be unique to your current setup:

 

Import-Module OuvviAdministration
Stop-Service -Name Ouvvi*
Stop-WebAppPool -Name Ouvvi*

Install-OuvviUpgrade -targetPath C:\OuvviTenants\Default -Url http://localhost:8080
Set-OuvviDBUserDBOwner -targetPath C:\OuvviTenants\Default -UserName "OUVVIV3\Ouvvi"

Start-WebAppPool -Name Ouvvi*
Start-Service -Name Ouvvi*

 

 

 

This version of Ouvvi v3 has been running our internal processes for several months on internal VMs and Azure Web Apps, and it uses a version of Data Sync that is very close to the current release version.


11 April 2017

Ouvvi v3 Data Change Set Feature

With Ouvvi v3 we have a new feature to record the Data Change Set created by Data Sync projects. We record both the high-level statistics, i.e. the number of ADD/UPDATE/DELETE actions against the target data source, and the actual data affected.

 

In the screenshot below you can see the instance log for the “Service Desk Integration” project. This project has a few steps, and each time it runs at 8 AM we get a summary of the records affected.

 

[Screenshot]

 

Now, selecting a log instance from the list allows us to drill in to what actually happened. We can see each step, the number of rows in both Source and Target, and the individual changes applied at each step. This data is stored within the Ouvvi v3 database as a compressed binary blob against the log instance and step.

 

[Screenshot]

 

We can then drill in further and view the actual data that was affected; here we can see a new customer renewal being added to our HelpDesk CRM.

 

[Screenshot]

 

It’s also possible to download the change set as an XML document and we have a new connector for Data Sync that can read this XML format.

 

[Screenshot]

 

Hopefully you can see that this is a great new feature for Ouvvi and Data Sync and not something available from other vendors.


7 April 2017

Latest Update on Ouvvi v3

It’s been a crazy time here at Simego: we’ve been re-building Ouvvi to be bigger, better and faster. We still have much to complete, but it’s time to let you see what’s coming around the corner.

 

The big change is that you can now host Ouvvi in Azure on Azure Web Apps with Azure Active Directory and even run the processes as an Azure Web Job. The Ouvvi service is now an Agent and can be installed on machines behind the corporate firewall. This allows you to deploy the Ouvvi Web App (Mothership) in the Cloud and deploy Agents (Smaller ships) where your data is located.

 

We’re running a few instances in Azure on a single small App Service and using Basic (5 DTU) SQL Azure databases. We’ve been tuning the Web Requests and DB Queries so that we can get the most out of a small VM.

 

New Project Dashboard with real-time updates.

New Layout and filter options to filter all your projects by status and group names.

 

[Screenshot]

 

New Change Tracking

Data Sync projects now report the change set details to Ouvvi when they run, and you can drill into the actual data that was part of the change set on every run. This change set can even be downloaded and used as a source in a Data Sync project.

 

[Screenshot]

 

New Agent Dependency

It’s now possible to define which Agent a step process must run on, so if you have a requirement that a step must execute on premises you can define where it runs. This also applies to triggers, which can now run across machines and networks.

 

[Screenshot]

 

Log Views

All the Log views now page the data to reduce the work on the SQL Database when you have really big logs.

 

Connection Library

The Connection library for Ouvvi v3 has moved to the Database so that everything is together in one place.

 

Database

The database schema has been simplified: we dropped all the stored procedures and now we just have a few simple tables. This makes upgrades and adding new features far simpler.

 

Triggers

Triggers now run as individual queue items across agents, in parallel. This enables more triggers to execute without holding the trigger thread and allows triggers to run on multiple nodes.

 

Message Queue Project

You can post a JSON/XML payload to an Ouvvi project and process the data with a special Data Sync project. The message data is assigned as the source data at runtime, which makes this ideal for running simple message-based systems.
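As a rough sketch only (the endpoint path and payload fields below are placeholders, not the documented Ouvvi API), posting a message to one of these projects from .NET could look something like this:

    // Minimal sketch: POST a JSON payload to a message-queue style Ouvvi project.
    // The URL path and the payload fields are hypothetical placeholders.
    using System;
    using System.Net.Http;
    using System.Text;
    using System.Threading.Tasks;

    class PostMessageExample
    {
        static async Task Main()
        {
            using (var client = new HttpClient())
            {
                var json = "{ \"OrderId\": 1001, \"Status\": \"Shipped\" }";
                var content = new StringContent(json, Encoding.UTF8, "application/json");

                // Hypothetical queue project endpoint on your Ouvvi instance.
                var response = await client.PostAsync("http://localhost:8080/api/queue/100", content);

                Console.WriteLine(response.StatusCode);
            }
        }
    }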

 

Data Sync

Of course Data Sync has been getting some love as well: we have a new connector for OData v4 / Web API with OAuth support for Dynamics 365, support for editing projects in the new Ouvvi system when using Azure AD, and other little tweaks.


9 August 2016

Use Visual Studio External Tools with Data Sync

This is an interesting use-case I just discovered: using Visual Studio External Tools allows you to run Data Sync projects from within Visual Studio and capture the output in the Visual Studio Output window.

 

Set up the External Tool to start the Data Sync RunTool.
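The settings are roughly as follows; the command path depends on where Data Sync is installed on your machine, so treat it as a placeholder:

    Title:              Run Data Sync Project
    Command:            <path to the Data Sync RunTool executable in your installation folder>
    Arguments:          "$(ItemPath)"
    Use Output window:  checked

$(ItemPath) is the standard Visual Studio External Tools macro for the file currently selected in Solution Explorer.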

 

[Screenshot]

 

Then select the Data Sync project in Solution Explorer and run it via the External Tool.

 

[Screenshot]


25 July 2016

Connection Dialog Updated

In the latest preview release of Data Sync we have updated the Connection Dialog window to make it easier to locate providers and access the connection library.

 

As Data Sync gains more and more providers it was becoming difficult to find the right provider in the list. We now group the providers by type in a tree list, which makes this much easier. Also, since this was a re-write of some old code, it’s now faster too.

 

[Screenshot]

 

Additionally you can now Quick Connect without creating a Library connection by selecting Connect.

 

Lastly, connecting via the Connection Library also updates the library *.dsconn file, so it is now easier to update credentials: simply update a library connection and connect.


29 June 2016

Method to add Change Tracking to SQL Table

This is a method to add an Updated Timestamp to a SQL Table where one does not exist and the source application does not provide updated information as data is changed.

 

This would normally be implemented within the application or perhaps with SQL triggers; however, this is another approach using Data Sync, via a hash value stored on the row.

 

You might want to consider this if your tables are large and you’re syncing into something that is slow or on the internet, i.e. Dynamics CRM, where Data Sync incremental sync would be faster. With a change timestamp you can select records that have changed since a point in time.

 

I have started out with a simple Table, a copy of the Suppliers Table from the Northwind database. This does not include any tracking information.

 

CREATE TABLE [dbo].[MySuppliers] (
    [SupplierID] int NOT NULL PRIMARY KEY CLUSTERED,
    [CompanyName] nvarchar(40) NOT NULL,
    [ContactName] nvarchar(30) NULL,
    [ContactTitle] nvarchar(30) NULL,
    [Address] nvarchar(60) NULL,
    [City] nvarchar(15) NULL,
    [Region] nvarchar(15) NULL,
    [PostalCode] nvarchar(10) NULL,
    [Country] nvarchar(15) NULL,
    [Phone] nvarchar(24) NULL,
    [Fax] nvarchar(24) NULL,
    [HomePage] nvarchar(MAX) NULL
)
GO

 

Next we add two columns: one to hold the hash value and one to hold the time of change.

 

ALTER TABLE [dbo].[MySuppliers] ADD [Hash] nvarchar(50) NULL
GO
ALTER TABLE [dbo].[MySuppliers] ADD [Updated] datetime DEFAULT(GETUTCDATE()) NOT NULL

 

Next we set up the Data Sync project and connect the Source and Target to the same SQL table, MySuppliers in this case.

 

Then create two Calculated Columns which will provide the change tracking information.

 

1. MyHash – Returns a string MD5 hash of the values in the row (a minimal C# illustration of the idea follows below).

 

MD5HASH(CONCAT(columns))

 

[Screenshot]

 

2. MyUpdated – Returns DateTime of the Change (when this project is run)

 

[Screenshot]
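For illustration only, this is roughly what the MyHash calculated column amounts to: an MD5 hash over the concatenated column values, returned as a string. This C# sketch shows the idea, not Data Sync’s internal implementation:

    using System;
    using System.Security.Cryptography;
    using System.Text;

    static class RowHashExample
    {
        // Concatenate the column values and return an MD5 hash as a hex string.
        public static string HashRow(params object[] values)
        {
            var concatenated = string.Concat(values);

            using (var md5 = MD5.Create())
            {
                byte[] hash = md5.ComputeHash(Encoding.UTF8.GetBytes(concatenated));
                return BitConverter.ToString(hash).Replace("-", string.Empty);
            }
        }
    }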

 

Map these new Calculated columns to Hash and Updated in the Target.

 

[Screenshot]

 

Now for the trick: we want to update the time and hash values only when the data has changed. To do this we set the TriggerUpdate property in the Schema Map to False for all columns except MyHash.

 

[Screenshot]

 

The first time we run the project all rows will be updated, but on subsequent runs the Updated value only changes on rows where the hash is different.

 

To generate the change tracking, you would run this as a pre-step in Ouvvi before the project you want to run as an incremental sync. The benefit is that if you have a table of, say, 1 million rows going into Dynamics CRM, running this against SQL locally first ensures the subsequent sync into CRM in incremental mode is fast, as only the changed rows need to be loaded.
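To illustrate the idea (the connection string and last-sync time below are placeholders, and your real incremental sync would be configured in Data Sync rather than hand-written), a downstream load only needs the rows changed since the last run:

    using System;
    using System.Data.SqlClient;

    class ChangedRowsExample
    {
        static void Main()
        {
            // Placeholder connection string and last-sync timestamp.
            var connectionString = "Server=.;Database=Northwind;Integrated Security=true";
            var lastSync = new DateTime(2016, 6, 1, 0, 0, 0, DateTimeKind.Utc);

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(
                "SELECT * FROM [dbo].[MySuppliers] WHERE [Updated] > @lastSync", connection))
            {
                command.Parameters.AddWithValue("@lastSync", lastSync);
                connection.Open();

                // Only rows whose Updated timestamp moved since the last sync are returned.
                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        Console.WriteLine(reader["CompanyName"]);
                    }
                }
            }
        }
    }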

 

[Screenshot]


8 June 2016

Dynamics CRM Error Logging

We have improved the error response you get when you hit an error against Dynamics CRM with Data Sync. We now format the request in a more human-readable form and add this to the error log along with the response from Dynamics CRM.

 

For example, attempting to update the totalamount on an opportunity to a value that is out of range writes the text below to the error log. You can now easily see the GUID of the target entity and the data being sent in the request, which should make it much easier to diagnose issues with Dynamics CRM.

 

UPDATE:opportunity(731f4eed-d71d-e611-80bb-00155d016e05)
SET
    actualclosedate=2016-06-08T00:00:00.0000000Z
    actualvalue=1000
    totalamount=79228162514264337593543950335

  

A validation error occurred.  The value of 'totalamount' on record of type 'opportunity' is outside the valid range.

 

Another example: attempting to create an email activity that references an account which does not exist produces the following.

CREATE:email(70cbafc4-7563-e411-80c3-00155d010200)
SET
    activitytypecode=email
    activitytypecodename=(null)
    attachmentcount=0
    compressed=False
    compressedname=(null)
    correlationmethod=optionset(0)
    createdby=systemuser(046d4493-e2e7-e311-80bd-00155d013a04)
    createdbyname=Sean Cleaver
    createdbyyominame=Sean Cleaver
    createdon=2014-11-03T16:23:40.0000000Z
    deliveryprioritycode=optionset(1)
    deliveryprioritycodename=(null)
    deliveryreceiptrequested=False
    deliveryreceiptrequestedname=(null)
    description=<html><head><meta http-equiv="Content-Type" content="text/html; charset=utf-8"></head><body> ... (value truncated)
    directioncode=True
    directioncodename=(null)
    isbilled=False
    isbilledname=(null)
    isregularactivity=True
    isregularactivityname=(null)
    isworkflowcreated=False
    isworkflowcreatedname=(null)
    modifiedby=systemuser(046d4493-e2e7-e311-80bd-00155d013a04)
    modifiedbyname=Sean Cleaver
    modifiedbyyominame=Sean Cleaver
    modifiedon=2014-11-03T16:23:40.0000000Z
    notifications=optionset(0)
    notificationsname=(null)
    owneridname=Sean Cleaver
    owneridtype=systemuser
    owneridyominame=Sean Cleaver
    owningbusinessunit=businessunit(1a8577f0-a1e1-e311-80bd-00155d013a04)
    owninguser=systemuser(046d4493-e2e7-e311-80bd-00155d013a04)
    prioritycode=optionset(1)
    prioritycodename=(null)
    readreceiptrequested=False
    readreceiptrequestedname=(null)
    safedescription=<html><head><meta http-equiv="Content-Type" content="text/html; charset=utf-8"></head><body> ... (value truncated)
    statecodename=(null)
    statuscodename=(null)
    subject=Thank you for your order 205-6373703!
    to=account(ccc5ef83-c210-e411-80bf-00155d013a04)

Account With Id = ccc5ef83-c210-e411-80bf-00155d013a04 Does Not Exist


 

Dynamics CRM Error


3 June 2016

Static Data Provider

This is something I have been meaning to build for some time now. I wanted the ability to take some data, store it in the Data Sync project file, and use it as the source for some other project. You could always use XML files, but that requires two files: the XML data and the Data Sync project.

 

You might use this to create Test Environments or reset some Data Source prior to testing or even port data between two environments that have no physical connection.

 

The problem was I had no way of adding a block of XML to the Project file during load and save other than in an XML Attribute. So I have extended the Data Sync IDataSourceReader interface to allow for loading and saving an XML Element as well as the Parameter Elements.

 

This provider persists the data in a format similar to the .NET DataSet so it’s human readable in the project file and small changes can be made manually if necessary. The Data is loaded when the project is opened so if you make a change through the back door you need to reload the project. Also if you change the Schema it’s best to re-create it.

 

It’s easy to use: first connect to the source of your data; this is what you want to copy into the project file.

 

[Screenshot]

 

Then choose Create Static Datasource from the Tools menu.

 

[Screenshot]

 

This automatically sets up the Static Datasource, copying the schema from the Schema Map.

 

[Screenshot]

 

Compare and Sync your Data to load the Data into the Static Datasource provider.

 

[Screenshot]

 

Then use the Swap Sides function from the Tools Menu to make the Static Data the Source of your project.

 

[Screenshot]

 

That’s it: you now have a Static Datasource set up that contains a copy of the data in the Data Sync project file.

 

[Screenshot]


1 June 2016

Custom REST Data Sync Provider with Project Automation

In this example we are going to show you how to use a new feature in Project Automation with the Dynamic Data Source Provider to create a custom REST provider.

 

First, the REST service. This is the Ouvvi Data Service, which returns a simple JSON result from a URL like the one below; you will see each data item is within the “data” array.
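The exact columns depend on the service, but the shape of the response is along these lines (the field names here are illustrative only, not the real Ouvvi Data Service schema):

    {
        "data": [
            { "Id": 1, "Name": "Import Accounts", "Enabled": true },
            { "Id": 2, "Name": "Export Contacts", "Enabled": false }
        ]
    }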

 

[Screenshot]

 

Next we set up the project. We use the Empty/Dynamic Datasource as the Source provider and configure the Columns collection to match the columns that will be returned from the REST service. This defines the schema up front, since this provider will not discover the schema itself.

 

We also set the Target to be an instance of the Empty/Dynamic Datasource for development; we can change this later when it’s all working. (FYI: Tools -> Create Null/Empty Datasource does this automatically, matching the Schema Map.)

 

[Screenshot]

 

Running this returns zero results, since it does nothing right now, but we’re now set to jump into Project Automation and write some code that will call the REST service, get back a JSON document, parse the result and populate the Data Sync DataTableStore table with the data.

 

[Screenshot]

 

First up, a bit of boilerplate code. We need to let the Source provider know that we have an implementation for the data load, so we set a delegate to be called for the data load and then return the DataTableStore (currently empty).

 

[Screenshot]

 

    public override void Start()
    {        
        DataSourceA.GetDataTableCallback = GetDataTableSource;    
    }    

    public DataTableStore GetDataTableSource(DataTableStoreCallbackInfo info)
    {                                    
        return info.Store;
    }

Next we need some help with the HTTP request calls. We could do it all ourselves, but it would be better if Data Sync could help us here… HttpWebRequestHelper to the rescue: add a new instance of the HttpWebRequestHelper class to the class instance variables.

 

    private HttpWebRequestHelper helper = new HttpWebRequestHelper() { UseDefaultCredentials = true };
    
    public override void Start()
    {        
        DataSourceA.GetDataTableCallback = GetDataTableSource;    
    }    

    public DataTableStore GetDataTableSource(DataTableStoreCallbackInfo info)
    {                                            
        return info.Store;
    }

Now for the bit where we get the JSON document and load the DataTableStore with data. A simple call to GetRequestAsJson returns the JSON document from the service; we then enumerate the “data” array, adding each row to the DataTableStore.

 

The Add method on the DataTableStore Rows collection takes a delegate that is called for each column, passing you the column name from the source, so you simply look it up and return its value.

 

[Screenshot]

 

    // Helper for the HTTP calls, using default (Windows) credentials.
    private HttpWebRequestHelper helper = new HttpWebRequestHelper() { UseDefaultCredentials = true };

    public override void Start()
    {
        // Tell the Source provider to call us when it needs the data.
        DataSourceA.GetDataTableCallback = GetDataTableSource;
    }

    public DataTableStore GetDataTableSource(DataTableStoreCallbackInfo info)
    {
        // Fetch the JSON document from the REST service.
        var jsonResponse = helper.GetRequestAsJson("http://test.internal.simego.com/api/list/projects");

        // Add a row for each item in the "data" array; the delegate is called once
        // per column and returns that column's value from the JSON item.
        foreach (var item_json in jsonResponse["data"])
        {
            info.Store.Rows.Add(info, (o, columnName) => item_json[columnName].ToObject(typeof(object)));
        }

        return info.Store;
    }

 

There you have it: a custom REST-based provider in Data Sync, all without Visual Studio and in just a few lines of code. If you need to write back to the data source, you can implement your own code in the Project Automation item events or provide a full IDataSourceWriter implementation via the GetWriterCallback delegate.

 

[Screenshot]

 

The HttpWebRequestHelper class provides many helper methods to assist with HTTP calls. We have already re-written the Simego Web API provider to use this new helper and have started a new OData 4.0 provider based on it.

 

[Screenshot]


1 June 2016

Sending Email Message in Project Automation

We have added a helper function to the Project Automation base class for sending email messages; you can now simply call the SendEmail method to send an email at the end of the sync. (SendEmail is a wrapper around the .NET SmtpClient in System.Net.Mail.)

 

    public override void End(ProjectAutomationResult result)
    {
        SendEmail("support@simego.com", "Test Send", "Testing Send");
    }

There are several method overloads that also allow you to add attachments to your email messages.

 

The SMTP server settings can either be supplied inline or via the project properties. If you’re running under Ouvvi, the SMTP settings will be read via the Values Store Service, so make sure you point it to your Ouvvi tenant.

 

If you’re using project properties for the SMTP server details, use the following property names (illustrative values are shown after the list):

 

SMTPServer
SMTPPort
SMTPUseSSL
SMTPUsername
SMTPPassword
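For example (these values are illustrative only):

    SMTPServer   = mail.example.com
    SMTPPort     = 587
    SMTPUseSSL   = True
    SMTPUsername = ouvvi@example.com
    SMTPPassword = <your password>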

 

If you use the project properties and you’re running under Ouvvi, Ouvvi will automatically update them from the Ouvvi configuration database at runtime.