Integration Manager fully supports Microsoft Azure Logic Apps

Guru Venkataraman and Mark Mortimore from Microsoft presented on using cloud capabilities in Microsoft Azure to build system integrations at the Microsoft Ignite event in Chicago. We were chosen as one of the partners to show the audience the capabilities Microsoft offers its partners and customers in the new service – App Services. In one of their demos they showed how you can extend your Logic Apps with monitoring and logging capabilities by using one of our products – Integration Manager. See the whole recorded session by clicking here.

We are proud to announce that Integration Manager now fully supports logging all events/messages and monitoring Microsoft Azure Logic Apps.

These new capabilities enable you to log all messages and monitor all Logic Apps in your Microsoft Azure subscription. Simply install our all new log and monitor agent for Microsoft Azure Logic Apps, let it connect to Microsoft Azure, and it will fetch all messages and monitor all parts of your Logic Apps.

Logging

Integration Manager’s Log and Monitor Agent for Microsoft Azure Logic Apps will not only log that an operation has been executed, it will also log the input and output data of every action executed within your Logic Apps. Every logged event contains a correlation id as a contextual value, which can easily be used as a Search Field to group related events – enabling you to see the whole chain of events in one place.

Monitoring

The new agent for logging and monitoring Microsoft Azure Logic Apps automatically keeps track of all Logic Apps published in your Azure subscription. Not only are the statuses of your Logic Apps monitored, the different connectors are monitored as well – giving you the insight and control you need to know that everything is in place and running as expected for your business.


Want to see a demo and learn more? Please contact us at info@integrationsoftware.se and we will set up an online meeting and show you how easily you can get in control and retrieve the insights you need!

Azure App Services Series – Finding #4: Hybrid Connections – Could not load file or assembly Newtonsoft.Json

This blog post is one of many in a series we have about Microsoft Azure App Services. See a list of all our findings here.

Introduction

Azure BizTalk Services – Hybrid Connections allow you to access on-premise resources, such as SQL instances and files, from Azure App Services – web, API, mobile and logic apps.

By enabling your app services to use on-premise resources, you can monitor even more together with Integration Manager!

Microsoft has, for example, released a protocol connector enabling you to access on-premise files – the File connector. This connector uses Hybrid Connections to access files located on your on-premise file system.

The Problem

If you follow the tutorials and documentation on how to get started with Hybrid Connections, you learn that you need a BizTalk Services instance in Azure; this instance can have one or several hybrid connections (see the pricing of BizTalk Services to learn more about limitations). While setting up and configuring your hybrid connection, you are required to install a service/agent called the Hybrid Connection Manager on your on-premise machine. Installing the agent sometimes works and sometimes does not.

We came across a file not found error during installation: the Newtonsoft.Json file could not be found, and the installer therefore throws an exception stating “Could not load file or assembly Newtonsoft.Json…”.

Unfortunately there is no post from Microsoft about it. There are some forum posts regarding this error message, but we could not find a proper blog post about how to resolve the problem.

The Solution

We solved the issue by installing the proper version of Newtonsoft.Json.dll into the Global Assembly Cache (GAC). The error message of the installer tells you which version you need, which in our case was version 4.5.x.x. We downloaded the DLL with version 4.5.11 from the Newtonsoft.Json GitHub project downloads and installed it into the GAC. Installing the DLL into the GAC can be done by using the Developer Command Prompt for Visual Studio and executing gacutil.exe with the proper parameters.
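The command looks like this when run from the Developer Command Prompt (the path to the DLL is a placeholder – use wherever you saved the downloaded file):

```
gacutil.exe /i C:\Downloads\Newtonsoft.Json.dll
```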

The command prompt will show a message that the assembly has successfully been added to the GAC (see screenshot 1).


Screenshot 1 – Gacutil added assembly

When you are done, restart the installer of the Hybrid Connection Manager and you should be good to go (if there is not another bug hidden somewhere along the installation!).

 

We hope this helps you solve issues during the installation of the Hybrid Connection Manager!

Azure App Services Series – Finding #3: BizTalk transformer API service still requires Visual Studio 2012

This blog post is one of many in a series we have about Microsoft Azure App Services. See a list of all our findings here.

Introduction

For systems to communicate with each other, data needs to be transformed.

Microsoft has released an API app / connector that can be used in your Logic Apps, enabling you to transform a message from one document type to another using a BizTalk Services map (see Screenshot 1) – the BizTalk Transform Service (see the action in Screenshot 2).


Screenshot 1 – BizTalk Services Map in Visual Studio


Screenshot 2 – BizTalk Transform Service Action

With Integration Manager we can see how the message has been transformed without needing to export the input/output files from your logic app’s operation logs (see Screenshots 3 to 5 below).


Screenshot 3 – Logged events of transformation in Integration Manager

 


Screenshot 4 – Message before transformation

 


Screenshot 5 – Message after transformation

The Problem

The maps used by the BizTalk Transform Service in your Logic App need to be uploaded to your instance of the BizTalk Transform Service (see screenshot 6).


Screenshot 6 – Uploading BizTalk Services map

But how do we create them? In Visual Studio – but which version? The answer is 2012: you can create a BizTalk Services map in Visual Studio 2012, not in 2013 or the 2015 Preview – only in 2012! In our opinion this is once again a miss. Microsoft encourages us to use the latest technologies and the latest tools, but when it comes to a certain level of detail, Microsoft is just not up to date.

What we want

It is obvious – we want better integration with Visual Studio 2013 and, of course, with Visual Studio 2015 as well.

Key Takeaways

  1. Creating a BizTalk Services map to use in the BizTalk Transform Service of your Logic App requires Visual Studio 2012 and the BizTalk Services SDK.
  2. Integration Manager gives you end-to-end tracking across your integrations (not only Logic Apps), helping you understand every step.

Azure App Services Series – Finding #2: Logic App Workflow API documentation is incomplete

This blog post is one of many in a series we have about Microsoft Azure App Services. See a list of all our findings here.

Introduction

With the release of Logic Apps, Microsoft also released the Workflow Management API – an API for accessing the workflows of your logic apps, allowing you to easily enable and disable workflows, get their status, etc.

Microsoft’s documentation of the API is incomplete, and it is not clear which of the API calls work in the current release of the API.

This API allows you to monitor Logic Apps with Integration Manager’s monitoring feature!

The Problem

Problem #1: Incomplete documentation

According to the documentation, the URI for the calls is:

https:///subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Logic/workflows/{workflowName}?api-version=2015-02-01-preview

This is wrong, of course – you can clearly see that something is missing at the beginning of the URI.

What is missing is the URI of the Logic App’s region, for example northeurope.logic.azure.com. The URI to call the API should look like this (example below):

https://northeurope.logic.azure.com/subscriptions/72bad97c-b0ce-40a3-a300-d36e50de34fe/resourceGroups/Default-Web-NorthEurope/providers/Microsoft.Logic/workflows/FTP2BTnoIM?api-version=2015-02-01-preview

Problem #2: Not all of the documented API calls work (yet)

We did not manage to get many of the API calls to work. In fact, we only managed to get one call working – triggering a run by calling the /run method of the API. Most of the others (we did not test all of them) return 404 Not Found.

Problem #3: There is no documentation about authenticating against the API

The documentation says nothing about how the authentication works, which makes the documentation somewhat useless!

For your interest, the supported authentication type is Basic. A username, ‘default’, is created when the logic app is created, and the password to use is the primary key (we are not sure whether the secondary key works as well), which you can find in the logic app’s properties blade in the Azure portal (see screenshot 1).


Screenshot 1 – Primary Access Key

So the proper way of authenticating against the API is to send the Authorization header with your Basic username/password value – see the example below:

Authorization: Basic BASE64(default:primaryAccessKey)
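As a sketch, the header value can be built from the ‘default’ user and the primary key like this (here ‘primaryAccessKey’ stands in for your real key, and the subscription/resource group/workflow names in the commented call are placeholders):

```shell
# Base64-encode "default:<primary key>" for the Basic Authorization header.
# "primaryAccessKey" is a placeholder - use the key from the properties blade.
AUTH=$(printf 'default:primaryAccessKey' | base64)
echo "Authorization: Basic $AUTH"

# Example call of the /run method (the only call we got working):
# curl -X POST -H "Authorization: Basic $AUTH" \
#   "https://northeurope.logic.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Logic/workflows/{workflowName}/run?api-version=2015-02-01-preview"
```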

Azure App Services Series – Finding #1: Problems using Workflow Definition Language in Logic Apps

This blog post is one of many in a series we have about Microsoft Azure App Services. See a list of all our findings here.

Introduction

With the preview release of Logic Apps, Microsoft also released a new language used in the workflows of your Logic Apps – the Workflow Definition Language (WDL).

The language is used in your workflow code, in both conditions and action inputs.

Using it in conditions

A condition makes the action it is attached to execute only when the condition is met. This is especially important when it comes to exception handling.

By default, if one of the actions in an Azure Logic App fails, the message is thrown away. So what you need to do is create actions that are performed when an error occurs, in order not to lose data. In most cases the data itself is still accessible in the operation logs (this depends on the API apps!) and can be extracted from there with the Workflow Management API (the Workflow Management API will be covered in an upcoming blog post).

An example:
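As a sketch (the exact syntax may differ between preview versions, and ‘YOURACTION’ is a placeholder action name):

```
@equals(actions('YOURACTION').status, 'Failed')
```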
The above example is a condition for an action that is executed when the status of the action “YOURACTION” equals ‘Failed’.

Using it in an action’s input

WDL is not only used in conditions, but also when sending input data to the defined actions. For instance, you might have an FTP connector that is triggered as soon as a file is available on the FTP server, and you want to send the file’s content or file name to another action (for logging purposes, maybe). In this case the FTP connector is the trigger and another API App is the action. An example of getting the FTP trigger’s file name:
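As a sketch (the FileName property is an assumption about the FTP connector’s output body; the actual property name may differ):

```
@triggers().outputs.body.FileName
```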

The above example is a WDL expression reading from the trigger’s output body, which you can send into an action as input.

The Problems

Problem #1

There is basically no Intellisense or help at all when it comes to writing WDL expressions. The only help you get is when using a trigger’s or action’s output in another action, and the only thing shown there are the available output alternatives (the output body of succeeded actions – more on that in Problem #2 below!), as shown in screenshot 1 below. In many cases this is enough, but what happens if you want to use functions to concatenate strings, or you want to write conditions? Well, you are on your own!

Screenshot 1 – Available trigger outputs

Working with concat or any other function provided by WDL does not give you any alternatives in the input fields – no help, nothing except the outputs of the triggers and actions that ran before the action you are editing, making it very hard to work with WDL!

Problem #2

As described above and shown in Screenshot 1, we get the triggers’ and actions’ outputs as alternatives in the actions’ inputs (not in the conditions!). But the available outputs are the outputs of succeeded actions. If you have an action that fails based on a condition and you want to send the error message to a third-party tool like Integration Manager using Integration Manager’s connector (coming soon), what is the WDL expression for that? And how can we test the expressions (see Problem #3)? The documentation shows that there is an errors attribute containing an array of errors, but there is no help in the interface on what the WDL expression for it is.

Problem #3

The third problem we see in the current version of Logic Apps is that there is not yet a way of testing your WDL expressions. There is some kind of validation when saving your workflow, but no direct way of testing expressions.

Our wishes

Problem #1: Full intellisense in both action input text fields and condition text fields.

Problem #2: All available outputs, both for when an action fails and when it succeeds. (When building a condition you might, for example, want to choose “action(‘ftpconnector’).status”.)

Problem #3: Testing of WDL expressions and improved validation.

 

Vote for our feedback

http://feedback.azure.com/forums/287593-logic-apps/suggestions/7592121-intellisense-help-on-wdl-expressions

Log events and messages from BizTalk to Integration Manager

Get insights and control!

This post describes the steps necessary to log messages from BizTalk to Integration Manager. Integration Manager has great, out-of-the-box support for Microsoft BizTalk Server and replaces the need for costly and poorly performing alternatives like BAM, SCOM and many other 3rd-party products.

The need to log events and messages flowing through the system integration platform is ever growing, because system integrations are becoming more and more important to the business. Providing business people with the data they need in order to innovate their business is crucial.

Microsoft BizTalk Server is an integration platform for enterprise application integration (EAI) and business-to-business (B2B) integration.

 

 

This blog post assumes you have basic knowledge of administering Microsoft BizTalk Server and Integration Manager, as well as working instances of both BizTalk and Integration Manager.

 


Prepare the BizTalk Server Environment

Simply enable tracking on the receive and send ports within your applications so that Integration Manager can log all the events/messages going through your Microsoft BizTalk Server.


Screenshot 1 – Enable Tracking on a Microsoft BizTalk Server receive port

Integration Manager can also be used from within orchestrations and pipeline stages. This capability uses our Log API, which we will cover in another blog post.

The question we often get asked is how we affect existing BizTalk applications. The answer is – we don’t.

  • Integration Manager only requires BizTalk’s tracking to be enabled on the ports of interest.
  • No pipelines / pipeline components required.
  • No BAM activities – Integration Manager offers you a way to retrieve any kind of logged message content (XML, EDI, CSV) by using either the context values or the message body (this will be covered in another blog post).
  • No solution/project requirements:
    • No analysis
    • No design
    • No code
  • No MSMQ queues

Prepare Integration Manager

Integration Manager’s logging service has a thread that copies data from Microsoft BizTalk Server’s tracking database (usually BizTalkDTADb). For Integration Manager to get the data from the database, we need to enable the log feature in Integration Manager and provide it with the correct connection information for Microsoft BizTalk Server’s management and tracking databases.

You can do this by opening the configuration application (see screenshot 2 below), opening the system settings (see screenshot 3), checking the checkbox saying “Copy messages from BizTalk (BizTalk Tracking Enabled)”, and filling in the required connection information for both the management database (BizTalkMgmtDb) and the tracking database (BizTalkDTADb) (see screenshot 4).


Screenshot 2 – Starting Integration Manager’s configuration application


Screenshot 3 – Opening the system settings


Screenshot 4 – Setting up Microsoft BizTalk Server settings

Finish by pressing the Save button and close the Integration Manager configuration application.

Start the ‘IM Logging’ Windows Service.

Integration Manager will now first sync the end points (send ports, receive ports/locations, etc.) in BizTalk with Integration Manager and then copy all the data from your tracking database to Integration Manager’s logging databases. This is a continuous process, meaning that you can keep a short window of data (3–4 days’ worth) in the BizTalk tracking database.

Use the admin view in Integration Manager, or create custom log views, to search for any event/message logged via Microsoft BizTalk Server. You get access not only to an event indicating that something has been logged, but also to the message itself and its context values.


Screenshot 5 – Message details in the BizTalk Management Console and in Integration Manager

Key Takeaways

Using Integration Manager to log and archive events/messages from Microsoft BizTalk Server is easy and gives you the control and insights you need to run your business – simply follow the steps explained above and you are good to go.

By using Integration Manager and its job for copying events/messages from BizTalk’s tracking database, you can configure the “Purge & archive” SQL job to keep data for just 2 to 3 days, since the data is copied to Integration Manager’s log databases – keeping the performance impact low.

No BAM activities – as already explained above, Integration Manager does not need any BAM activities. Integration Manager can retrieve the values you are interested in from the message body or context values. The values can be retrieved from historic events/messages as well! (This will be covered in an upcoming blog post.)

Integration Manager has long-term storage – every transaction can be kept forever, as long as you have the disk space. We create new databases as we go, keeping the system performant regardless of the total amount of data being kept. There might be legal requirements to keep, for example, outgoing invoices for 7 years – we can guarantee that for you.

To read more about Integration Manager logging click here.

Using Integration Manager with Azure App Services / Logic Apps

This blog post contains a list of our blog posts with our findings – bugs and feature requests – about the new Azure API App Services / Logic Apps.

Mission

We are committed to bringing the same user experience, with all the functionality, support and capabilities, to your cloud-enabled system integrations as we offer today for on-prem BizTalk Server system integrations.

Integration Manager is a productivity tool that helps you save time and get in control. We also provide your organization with insights into the details of all your data – including artifacts, message types, SLA levels, servers… all information about your valuable resources, in one tool.

  • Logging
  • Monitoring
  • Reports
  • Repository  / CMDB

Current status and findings

We have already successfully built logic apps that log events and body data to Integration Manager. We also provide monitoring capabilities for Logic Apps from within Integration Manager. All resources that are part of a distributed system can be monitored (Windows services, queues, databases, BizTalk resources, API endpoints…).

The following blog posts pinpoint individual findings we encountered while developing, testing and deploying functionality to enable monitoring and logging for use with Integration Manager.

| # | IM | Type | Description | Blog | Workaround / Solution / Comment | Resolved? |
|---|----|------|-------------|------|---------------------------------|-----------|
| 0 | – | Feature Request | Re-configuration of API Apps | Link | See blog post | false |
| 1 | Log | Feature Request | Problems using Workflow Definition Language in Logic Apps | Link | – | false |
| 2 | Monitor | Bug | Logic App Workflow API documentation is incomplete | Link | – | false |
| 3 | Log | Feature Request | BizTalk transformer API service still requires Visual Studio 2012 | Link | See blog post | false |
| 4 | Monitor | Bug | Hybrid Connections – Could not load file or assembly Newtonsoft.Json | Link | See blog post | false |
| 5 | Monitor | Feature Request | Changing the order of connectors in your Logic App not possible in designer view | Link | See blog post | false |

The IM column shows which IM capability each post relates to.

How to use Integration Manager’s log4net log appender

This blog post describes one of the ways we offer to log events/messages to Integration Manager: using Integration Manager’s log appender for log4net. We assume that you have basic knowledge of log4net and a running instance of Integration Manager. Before you continue reading, make sure you have the correct link to Integration Manager’s Log API. In my example, the test application and Integration Manager are located on the same server and the address of the Log API is “http://localhost/IM/LogApi/LogApiService.svc“.

We will work with an example solution – a simple console application that prompts the user to enter a message and logs it to a file with log4net’s file appender. In this how-to we will show you how to extend the application with Integration Manager’s log4net appender. You can download the example here.

Before we start, let us take a look at how the application works.

  1. Start the application
    Console Application - Image 1
  2. Write a message to log – Hello World
    Console Application - Image 2
  3. The application will write into a file called tutorial.log in the application’s folder.
    Console Application - Image 3
  4. The result
    Console Application - Image 4

Let’s start adding Integration Manager’s log appender for Log4Net. We start by downloading the appender here.

The file you downloaded contains a readme and DLL files. Start by copying the DLL files from the downloaded ZIP into the application’s folder and open your application’s configuration file, which in the example looks like the code below.
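A minimal configuration using log4net’s file appender might look like the following sketch (your generated file may differ in details such as the pattern layout):

```xml
<?xml version="1.0"?>
<configuration>
  <configSections>
    <!-- Registers the log4net configuration section handler. -->
    <section name="log4net" type="log4net.Config.Log4NetConfigurationSectionHandler, log4net" />
  </configSections>
  <log4net>
    <!-- Writes each logged message to tutorial.log in the application's folder. -->
    <appender name="FileAppender" type="log4net.Appender.FileAppender">
      <file value="tutorial.log" />
      <appendToFile value="true" />
      <layout type="log4net.Layout.PatternLayout">
        <conversionPattern value="%date %-5level %logger - %message%newline" />
      </layout>
    </appender>
    <root>
      <level value="ALL" />
      <appender-ref ref="FileAppender" />
    </root>
  </log4net>
</configuration>
```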

Since we log messages to Integration Manager’s Log API, we need to add the required “serviceModel” configuration – see below.
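A sketch of what this serviceModel section might look like (the contract and endpoint names below are placeholders – use the values given in the readme that ships with the appender):

```xml
<system.serviceModel>
  <bindings>
    <basicHttpBinding>
      <binding name="LogApiBinding" />
    </basicHttpBinding>
  </bindings>
  <client>
    <!-- contract/name are placeholders; take the real values from the readme. -->
    <endpoint address="http://localhost/IM/LogAPI/LogApiService.svc"
              binding="basicHttpBinding"
              bindingConfiguration="LogApiBinding"
              contract="LOGAPI_CONTRACT_FROM_README"
              name="LogApiEndpoint" />
  </client>
</system.serviceModel>
```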

Copy the above code, replace the client/endpoint/@address value “http://localhost/IM/LogAPI/LogApiService.svc” with the address of your Integration Manager Log API instance, and paste it into the application’s configuration file.

The next step is to add an application settings key called “IM.Log4Net.IsOneWay”, which is of type boolean – set it to either true or false depending on your requirements.
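The setting goes into the appSettings section of the configuration file, for example:

```xml
<appSettings>
  <!-- true = fire-and-forget logging; false = wait for the Log API's response. -->
  <add key="IM.Log4Net.IsOneWay" value="true" />
</appSettings>
```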

If the application setting “IM.Log4Net.IsOneWay” is set to true, the application will not ensure delivery to the Log API, since the (IM) appender for log4net is then configured not to wait for a response (one-way).

For log4net to understand that it needs to log an event/message with the provided appender, you need to extend the configuration file with a reference to the appender as well as an appender configuration (IMAppender) – see below.
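A sketch of the shape of this configuration (the appender type value and its parameters are placeholders – the real values are in the readme.pdf shipped with the appender):

```xml
<log4net>
  <!-- The type attribute is a placeholder; copy the exact value from the readme. -->
  <appender name="IMAppender" type="APPENDER_TYPE_FROM_README">
    <!-- Appender parameters (endpoint name, etc.) as described in readme.pdf. -->
  </appender>
  <root>
    <level value="ALL" />
    <appender-ref ref="IMAppender" />
    <appender-ref ref="FileAppender" />
  </root>
</log4net>
```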

All the parameters used in the (IMAppender) appender configuration are described in the readme.pdf that comes with the appender you downloaded earlier.

When the configuration is complete, you can start the application, write a text message and press Enter. The application will then log the event/message via the IM appender to Integration Manager – the configured Log Agent, End Point and Message Type will automatically be added to Integration Manager. You can then create log views allowing users to search only for events/messages coming from applications that use this (IM) appender.

You can see an example search result of Integration Manager in the screenshot below.

Console Application - Image 6

 

You can download the final application here. We hope that this blog post gets you started with logging your log4net logs to Integration Manager, giving you yet another way to log messages to Integration Manager as a centralized platform for all your logs, no matter what integration platform, application or technology you use.

Monitor Web Services and SQL Server with Integration Manager

We are proud to announce the release of two new monitor agents for Integration Manager giving you even more control and insights – Microsoft SQL Server and Web Services.

SQL Server Agent

Integration Manager’s agent for Microsoft SQL Server allows you to monitor:

  • SQL statements
  • SQL jobs
  • Backups
  • Size checks (data and log separately).

It is crucial to know when backups are not taken, jobs are not running or statements are not returning the expected result, since this might cause business-critical errors.

With this agent you can configure Integration Manager to monitor any SQL statement and its result.

Monitor SQL jobs on all configured SQL Server instances.

Verify that backups have been taken for all monitored databases; the time and state of the last backup (if any) are displayed.

Check the size of the data and log files, with support for warning and error threshold values.

  • Any number of SQL Instances can be monitored from a single agent
  • Multiple agents can be deployed

Web Services Agent

Many systems and integrations depend on web services, so monitoring them is important! With Integration Manager’s agent for web services you can easily monitor web APIs, SOAP services and even web sites.

You monitor them by simply providing the required HTTP headers and client certificates as well as the service’s address, and letting the agent do its job – connecting to the service, checking that the response’s HTTP status is one of the expected ones, and evaluating the result of the request using a regular expression or an XPath.

Not only can you check the HTTP status and evaluate the result with a regular expression or an XPath – you can even monitor the response time of the request.

  • Any HTTP endpoint can be monitored
  • Any number of web endpoints can be monitored from a single agent
  • Multiple agents can be deployed
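The kind of probe the agent performs can be sketched manually with curl (the URL below is a placeholder) – the agent automates checks like this, including status and response-time evaluation:

```
curl -s -o /dev/null -w "%{http_code} %{time_total}s" https://example.com/service.svc
```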

Technical Features

  • SOAP
  • Web API
  • Web Applications
  • Body from file
  • All HTTP operations are supported
  • All HTTP Headers are supported
  • Validate Response body using RegEx or XPath
  • Validate HTTP response status code
  • Authentication support
    • Client Cert
    • Impersonation
    • Basic auth
    • API Key