Azure App Services Series – Finding #4: Hybrid Connections – Could not load file or assembly Newtonsoft.Json

This blog post is one of many in a series we have about Microsoft Azure App Services. See a list of all our findings here.

Introduction

Azure BizTalk Services – Hybrid Connections allow you to access on-premise resources, such as SQL Server instances, files and more, from Azure App Services – Web, API, Mobile and Logic Apps.

By enabling your app services to use on-premise resources, you can monitor even more together with Integration Manager!

Microsoft has, for example, created and released a protocol connector that enables you to access on-premise files – the File connector. This connector uses Hybrid Connections to access files located on your on-premise file system.

The Problem

If you follow the tutorials and documentation on how to get started with Hybrid Connections, you learn that you need a BizTalk Services instance in Azure. This instance can have one or several hybrid connections (see the pricing of BizTalk Services to learn more about the limitations). Setting up and configuring your hybrid connection requires you to install a service/agent on your on-premise machine called the Hybrid Connection Manager. Installing the agent sometimes works and sometimes does not.

We came across a file-not-found error during installation: the Newtonsoft.Json assembly could not be found, and the installer therefore throws an exception stating “Could not load file or assembly Newtonsoft.Json…”.

Unfortunately there is no post from Microsoft about it. There are some forum posts regarding this error message, but we could not find a proper blog post about how to resolve the problem.

The Solution

We solved the issue by installing the proper version of Newtonsoft.Json.dll into the Global Assembly Cache (GAC). The error message of the installer tells you which version you need, which in our case was version 4.5.x.x. We downloaded the DLL with version 4.5.11 from the Newtonsoft.Json GitHub project downloads and installed it into the GAC. Installing the DLL into the GAC can be done by using the Developer Command Prompt for Visual Studio and executing gacutil.exe with the proper parameters.
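
For example, from the Developer Command Prompt the install command could look like this (assuming the downloaded Newtonsoft.Json.dll sits in the current directory):

gacutil.exe /i Newtonsoft.Json.dll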

The command prompt will show a message that the assembly has successfully been added to the GAC (see screenshot 1).


Screenshot 1 – Gacutil added assembly

When you are done, restart the installer of the Hybrid Connection Manager and you should be good to go (if there is not another bug hidden somewhere along the installation!).

 

We hope this helps you solve issues during the installation of the Hybrid Connection Manager!

Azure App Services Series – Finding #3: BizTalk transformer API service still requires Visual Studio 2012

This blog post is one of many in a series we have about Microsoft Azure App Services. See a list of all our findings here.

Introduction

For systems to communicate with each other, data needs to be transformed.

Microsoft has released an API app/connector – the BizTalk Transform Service – that can be used in your Logic Apps to transform a message from one document type to another using a BizTalk Services map (see the map in Screenshot 1 and the action in Screenshot 2).


Screenshot 1 – BizTalk Services Map in Visual Studio


Screenshot 2 – BizTalk Transform Service Action

With Integration Manager we can see how the message has been transformed without needing to export the input/output files from your logic app’s operation logs (see Screenshots 3 to 5 below).


Screenshot 3 – Logged events of transformation in Integration Manager

 


Screenshot 4 – Message before transformation

 


Screenshot 5 – Message after transformation

The Problem

The maps used by the BizTalk Transform Service in your Logic App need to be uploaded to your instance of the BizTalk Transform Service (see screenshot 6).


Screenshot 6 – Uploading BizTalk Services map

But how do we create them? In Visual Studio, but which version? The answer is 2012 – you can create a BizTalk Services map in Visual Studio 2012, not in 2013 or the 2015 Preview, just in 2012! In our opinion this is a miss once again. Microsoft encourages us to use the latest techniques, technologies and tools, but when it comes to a certain level of detail, Microsoft is just not up to date.

What we want

It is obvious – we want better integration with Visual Studio 2013 and, of course, Visual Studio 2015 as well.

Key-Takeaways

  1. Creating a BizTalk Services map to use in your BizTalk Transform Service of your Logic App requires you to use Visual Studio 2012 and the BizTalk Services SDK.
  2. Integration Manager gives us end-to-end tracking (not only for Logic Apps), helping you understand all the steps in your integrations.

Azure App Services Series – Finding #2: Logic App Workflow API documentation is incomplete

This blog post is one of many in a series we have about Microsoft Azure App Services. See a list of all our findings here.

Introduction

With the release of Logic Apps, Microsoft also released the Workflow Management API – an API that gives you access to the workflows of your logic apps, allowing you to easily enable and disable workflows, get their status, and more.

Microsoft’s documentation of the API is incomplete, and it is not clear which of the API calls work in the current release of the API.

This API allows us to monitor Logic Apps with Integration Manager’s monitoring feature!

The Problem

Problem #1: Incomplete documentation

According to the documentation the URI to the calls is:

https:///subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Logic/workflows/{workflowName}?api-version=2015-02-01-preview

This is wrong, of course – you can clearly see that something is missing at the beginning of the URI.

What is missing is the host for the Logic App’s region, for example northeurope.logic.azure.com. The URI to call the API should look like this (example below):

https://northeurope.logic.azure.com/subscriptions/72bad97c-b0ce-40a3-a300-d36e50de34fe/resourceGroups/Default-Web-NorthEurope/providers/Microsoft.Logic/workflows/FTP2BTnoIM?api-version=2015-02-01-preview
Problem #2: Not all of the API calls documented are working (yet)

We did not manage to get many of the API calls to work. In fact, we only managed to get one call to work – triggering a run of the workflow by calling the /run method of the API. Most of the others (we did not test them all) return 404 Not Found when calling the API.
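
For illustration, triggering a run could look something like the request below (placeholders as in the URI examples above; see Problem #3 for the authentication header):

POST https://northeurope.logic.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Logic/workflows/{workflowName}/run?api-version=2015-02-01-preview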

Problem #3: There is no documentation about authenticating against the API

The documentation says nothing about what the authentication looks like, which makes the documentation somewhat useless!

For your information, the supported authentication type is Basic. A user named ‘default’ is created when the logic app is created, and the password to use is the primary access key (we are not sure whether the secondary key works as well), which you can find in the logic app’s properties blade of the Azure portal (see Screenshot 1).


Screenshot 1 – Primary Access Key

So the proper way of authenticating against the API is to send the Authorization header with your Basic username/password value – see example below:

Authorization: Basic BASE64(default:primaryAccessKey)
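
The value after “Basic” is the Base64 encoding of username:password. As an illustration, assuming a made-up primary access key of “password”, the header would be built like this:

default:password  →  Base64  →  ZGVmYXVsdDpwYXNzd29yZA==
Authorization: Basic ZGVmYXVsdDpwYXNzd29yZA==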

Azure App Services Series – Finding #1: Problems using Workflow Definition Language in Logic Apps

This blog post is one of many in a series we have about Microsoft Azure App Services. See a list of all our findings here.

Introduction

With the preview release of Logic Apps, Microsoft also released a new language used in the workflows of your Logic Apps – the Workflow Definition Language (WDL).

The language is used in your workflow code in both conditions and action inputs.

Using it in conditions

Conditions are used to make the action you attach them to execute only when the condition is met. This is especially important when it comes to exception handling.

By default, if one of the actions in an Azure Logic App fails, the message is thrown away. What you need to do is create actions that are performed when errors occur, so that you do not lose data. In most cases the data itself is still accessible in the operation logs (this depends on the API apps!) and can be extracted from there with the Workflow Management API (the Workflow Management API will be covered in an upcoming blog post).

An example: the condition below is meant to let an action execute only when the status of the action “YOURACTION” equals ‘Failed’.
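
A minimal sketch of what such a condition expression could look like, assuming the 2015-02-01-preview WDL syntax (the action name is just a placeholder):

@equals(actions('YOURACTION').status, 'Failed')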

Using it in an action’s input

WDL is not only used in conditions, but also when sending input data to the defined actions. For instance, you might have an FTP connector that is triggered as soon as a file is available on the FTP server, and you want to send the file’s content or file name to another action (for logging purposes, maybe). In this case the FTP connector would be the trigger and another API app the action. An example of getting the FTP trigger’s file name:
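
A sketch, assuming the 2015-02-01-preview syntax and a FileName field on the trigger’s output body (the field name is an assumption for illustration):

@triggers().outputs.body.FileName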

The above example is a WDL expression that reads from the trigger’s output body, which you can send into an action as input.

The Problems

Problem #1

There is basically no IntelliSense or help at all when it comes to writing expressions with WDL. The only help you get is when using a trigger’s/action’s output in another action, and the only thing you get there are the available output alternatives (output bodies of succeeded actions! Let me get back to that later in Problem #2 below!) shown in Screenshot 1 below. In many cases this is enough, but what happens if you want to use functions to concatenate strings or you want to write conditions? Well, you are on your own!

Screenshot 1 – Available trigger outputs

Working with concat or any other function provided by WDL gives you no alternatives in the input fields – no help, nothing except the outputs of the triggers or actions that have run before the action you are editing – making it very hard to work with WDL!
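
As an example of the kind of expression you then have to write by hand, a concat call combined with a trigger output could look something like this (assuming the preview WDL syntax; the FileName field is only for illustration):

@concat('Processed file: ', triggers().outputs.body.FileName)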

Problem #2

As described above and shown in Screenshot 1, we get the triggers’ and actions’ outputs as alternatives in the action inputs (not in the conditions!). But the available outputs are only the outputs of succeeded actions! If you have an action that has failed and, based on a condition, you want to send the error message to a third-party tool like Integration Manager using Integration Manager’s connector (coming soon), what is the WDL expression for that? And how can we test the expressions (see Problem #3)? The documentation shows us that there is an errors attribute containing an array of errors, but there is no help in the interface on what the WDL expression for that would be.

Problem #3

The third problem we see in the current version of Logic Apps is that there is no way yet to test your WDL expressions. There is some kind of validation when saving your workflow, but no direct way of testing the expressions.

Our wishes

Problem #1: Full intellisense in both action input text fields and condition text fields.

Problem #2: All available outputs for both failed and succeeded actions (when building a condition you might want to choose “action(‘ftpconnector’).status”).

Problem #3: Testing of WDL expressions and improved validation.

 

Vote for our feedback

http://feedback.azure.com/forums/287593-logic-apps/suggestions/7592121-intellisense-help-on-wdl-expressions

Log events and messages from BizTalk to Integration Manager

Get insights and Control!

This post describes the steps necessary to log messages from BizTalk to Integration Manager. Integration Manager has great, out-of-the-box support for Microsoft BizTalk Server. Integration Manager replaces the need for costly and poorly performing alternatives like BAM, SCOM and many other 3rd-party products.

The need to log events and messages flowing through the systems integration platform is ever growing, because systems integration is becoming more and more important for the business. Providing business people with the data they need in order to innovate their business is crucial.

Microsoft BizTalk Server is an integration platform for enterprise application integration (EAI) and business-to-business (B2B) integration.


This blog post assumes you have basic knowledge of administering Microsoft BizTalk Server and Integration Manager, as well as working instances of both BizTalk and Integration Manager.

Prepare the BizTalk Server Environment

Simply enable tracking on receive and send ports within your applications in order for Integration Manager to log all the events/messages going through your Microsoft BizTalk Server.


Screenshot 1 – Enable Tracking on a Microsoft BizTalk Server receive port

Integration Manager can also be used from within orchestrations and pipeline stages. This capability uses our Log API, which we will cover in another blog post.

The question we often get asked is how we affect existing BizTalk applications. The answer is – we don’t.

  • Integration Manager only requires BizTalk’s tracking to be enabled on ports of interest.
  • No pipelines / pipeline components required.
  • No BAM activities
    Integration Manager offers you a way to retrieve any kind of logged message content (XML, EDI, CSV) by using either the context values or the message body (this will be covered in another blog post).
  • No solution/project requirements
    • No analysis
    • No design
    • No code
  • No MSMQ queues

Prepare Integration Manager

Integration Manager’s logging service has a thread that copies data from Microsoft BizTalk Server’s tracking database (usually BizTalkDTADb). For Integration Manager to get the data from the database, we need to enable the log feature in Integration Manager and provide it with the correct connection information for Microsoft BizTalk Server’s management and tracking databases.

You can do this by opening the configuration application (see Screenshot 2 below), opening the system settings (see Screenshot 3), checking the checkbox saying “Copy messages from BizTalk (BizTalk Tracking Enabled)” and filling in the required connection information for both the management database (BizTalkMgmtDb) and the tracking database (BizTalkDTADb) (see Screenshot 4).


Screenshot 2 – Starting Integration Manager’s configuration application


Screenshot 3 – Opening the system settings


Screenshot 4 – Setting up Microsoft BizTalk Server settings

Finish by pressing the Save button and close the Integration Manager configuration application.

Start the ‘IM Logging’ Windows Service.

Integration Manager will now first sync the endpoints (send ports, receive ports/locations, etc.) in BizTalk with Integration Manager and then copy all the data from your tracking database to Integration Manager’s logging databases. This is a continuous process, meaning that you can keep only a short window of data (3-4 days’ worth) in the BizTalk tracking database.

Use the admin view in Integration Manager or create custom log views to search for any event/message logged via Microsoft BizTalk Server. You get access not only to an event that indicates that something has been logged, but also to the message itself and the context values.


Screenshot 5 – Message details in the BizTalk Administration Console compared to Integration Manager

Key Takeaways

Using Integration Manager to log and archive events/messages from Microsoft BizTalk Server is easy and gives you the control and insights you need in order to run your business – just follow the simple steps explained above and you are good to go.

By using Integration Manager and its job for copying events/messages from BizTalk’s tracking database, you can configure the “Purge & Archive” SQL job to keep data for just 2 to 3 days, since the data is copied to Integration Manager’s log databases – keeping the performance impact low.

No BAM activities – as already explained above, Integration Manager does not need any BAM activities. Integration Manager has ways of getting the values you are interested in from the message body or the context values, and the values can be retrieved from historic events/messages as well! (This will be covered in an upcoming blog post.)

Integration Manager has long-term storage – every transaction can be kept forever, as long as you have the disk space. We create new databases as we go to keep the system performant, independent of the total amount of data being kept. There might be legal demands to keep, for example, outgoing invoices for 7 years – we can guarantee that for you.

To read more about Integration Manager logging click here.

Using Integration Manager with Azure App Services / Logic Apps

This blog post contains a list of our blog posts with our findings – bugs and feature requests – about the new Azure API Apps / Logic Apps.

Mission

We are committed to bringing the same user experience, with all the functionality, support and capabilities, to your cloud-enabled system integrations as we offer today for on-prem BizTalk Server based system integrations.

Integration Manager is a productivity tool that helps you save time and stay in control. We also provide your organization with insights into the details of all your data, including artifacts, message types, SLA levels, servers… all information about your valuable resources, in one tool.

  • Logging
  • Monitoring
  • Reports
  • Repository  / CMDB

Current status and findings

We have already successfully built logic apps that log events and body data to Integration Manager. We also provide monitoring capabilities for Logic Apps from within Integration Manager. All resources that are part of a distributed system can be monitored (Windows services, queues, databases, BizTalk resources, API endpoints…).

The following blog posts pinpoint individual findings we encountered while developing, testing and deploying monitoring and logging functionality for use with Integration Manager.

# | IM | Type | Description | Blog | Workaround / Solution / Comment | Resolved?
0 | – | Feature Request | Re-configuration of API Apps | Link | See blog post | false
1 | Log | Feature Request | Problems using Workflow Definition Language in Logic Apps | Link | – | false
2 | Monitor | Bug | Logic App Workflow API documentation is incomplete | Link | – | false
3 | Log | Feature Request | BizTalk transformer API service still requires Visual Studio 2012 | Link | See blog post | false
4 | Monitor | Bug | Hybrid Connections – Could not load file or assembly Newtonsoft.Json | Link | See blog post | false
5 | Monitor | Feature Request | Changing the order of connectors in your Logic App not possible in designer view | Link | See blog post | false

The IM column shows which IM capability each post relates to.

Azure App Services Series – Finding #0: Re-configuration of API Apps

This blog post is the first in a series we have about Microsoft Azure App Services. See a list of all our findings here.

Problem

Re-configuring API apps added to your subscription will be a common task for everybody working with API and Logic Apps. Let us say we have an FTP connector used in our Logic Apps, and the password of the user configured in the FTP connector has changed.

The Solution

Doing that – or even finding out how to do it – is not very intuitive: you first need to select the API app itself, click on the API app host, select Settings, then Application settings, and scroll down in the blade to find the settings (see Screenshot 1).


Screenshot 1 – Changing connector settings

Our Wish

We would like a direct link to the settings in the API app/connector blade of Azure’s new portal instead of needing to open and wait for two other blades to load.

 

Link to feedback forum: http://feedback.azure.com/forums/287593-logic-apps/suggestions/7575600-improve-ui-re-configuration-of-api-apps

[UPDATE] Problem when using connectors of the same kind in the same logic app

UPDATE 2015-04-15

Microsoft has released an update addressing this issue; the designer now shows you the name of the connectors in your resource group.


Updated Workflow showing name of connector instance


 

Microsoft recently released Microsoft Azure App Service in preview – which enables you to build Logic Apps.

We came across a minor problem while creating and working with logic apps: how do you know which connector has which name when you have several connectors of the same type in the current resource group?

Demo Scenario

Let us say you have a logic app that reads from one FTP Server (your internal ftp) and you need to upload the file to another FTP Server.

For this scenario we need to have two FTP Connectors in our resource group of the Logic App – in my example I have called them “FTPSourceServer” and “FTPTargetServer”.

Let’s take a look at the Logic App triggers and actions interface and see what it looks like (see Screenshot 1 below).


Screenshot 1 – Triggers and actions interface

 

The Problem

How do we know which of the FTP connectors we see is “FTPSourceServer” and which is “FTPTargetServer”, just by looking at the screen shown in Screenshot 1 above?

Short answer: in the current version, as of 2015-04-13, we can’t – not just by looking at the designer view (the one in Screenshot 1).

Long answer: there is a workaround! See “The Solution”.

The Solution

What you can do is simply add the first connector in the list to the logic app, choose the “File Available (Read then Delete)” trigger, save the changes and then switch to “Code View” (see Screenshot 2).


Screenshot 2 – Add FTP Connector

You have now switched from the “Designer” to the “Code View”, which shows you the JSON definition the logic app needs and uses in order to work correctly. To find out which of the FTP connectors you just added, simply search for the “triggers” section of the JSON and look at the name of the trigger – see Screenshot 3.


Screenshot 3 – Name of trigger

As you can see, the name of the trigger is “ftpsourceserver”, which lets me know that the first FTP connector in the list of available API apps is the one I called “FTPSourceServer”, and the second in the list must therefore be the one I called “FTPTargetServer”. (It is good to know that the JSON is case insensitive when it comes to the object names – in case you would like to write the logic app’s code on your own.)
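
For illustration, the relevant part of the Code View might look roughly like this (heavily simplified, with the trigger details omitted):

"triggers": {
    "ftpsourceserver": {
        "type": "ApiApp",
        ...
    }
}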

This is, of course, not the best way, and I am sure Microsoft’s Azure team is aware of this issue and that it will get fixed soon – keep in mind that the new Microsoft Azure portal is still in preview, as are API and Logic Apps!

How to use Integration Manager’s log4net log appender

This blog post describes one of the ways we offer to log events/messages to Integration Manager: by using Integration Manager’s log appender for Log4Net. We assume that you have basic knowledge of Log4Net and a running instance of Integration Manager. Before you continue reading, make sure you have the correct link to Integration Manager’s Log API. In my example the test application and Integration Manager are located on the same server, and the address of the Log API is “http://localhost/IM/LogApi/LogApiService.svc“.

We will work with an example solution – a simple console application that prompts the user to enter a message and logs the message to a file with Log4Net’s file appender. In this how-to we will show you how to extend your application with Integration Manager’s Log4Net appender. You can download the example here.

Before we start, let us take a look at how the application works.

  1. Start the application
    Console Application - Image 1
  2. Write a message to log – Hello World
    Console Application - Image 2
  3. The application will write into a file called tutorial.log in the application’s folder.
    Console Application - Image 3
  4. The result
    Console Application - Image 4

Let’s start adding Integration Manager’s log appender for Log4Net. We start by downloading the appender here.

The file you downloaded contains a readme and DLL files. Start by copying the DLL files from the downloaded ZIP into the application’s folder and open your application’s configuration file, which in the example looks like the code below.
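
For reference, a minimal sketch of such a configuration could look like the following – a standard Log4Net file appender writing to tutorial.log; your actual file may differ:

<configuration>
  <configSections>
    <section name="log4net" type="log4net.Config.Log4NetConfigurationSectionHandler, log4net" />
  </configSections>
  <log4net>
    <appender name="FileAppender" type="log4net.Appender.FileAppender">
      <file value="tutorial.log" />
      <appendToFile value="true" />
      <layout type="log4net.Layout.PatternLayout">
        <conversionPattern value="%date %level %message%newline" />
      </layout>
    </appender>
    <root>
      <level value="ALL" />
      <appender-ref ref="FileAppender" />
    </root>
  </log4net>
</configuration>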

Since we log messages to Integration Manager’s Log API, we need to add the required “system.serviceModel” configuration – see below.
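
A sketch of what that configuration could look like is shown below. Note that the binding, contract and endpoint names here are placeholders – use the exact values from the readme that ships with the appender:

<system.serviceModel>
  <bindings>
    <basicHttpBinding>
      <!-- placeholder binding name; take the real one from the readme -->
      <binding name="LogApiBinding" />
    </basicHttpBinding>
  </bindings>
  <client>
    <!-- the contract and endpoint names are placeholders as well -->
    <endpoint address="http://localhost/IM/LogAPI/LogApiService.svc"
              binding="basicHttpBinding"
              bindingConfiguration="LogApiBinding"
              contract="ILogApiService"
              name="LogApiEndpoint" />
  </client>
</system.serviceModel>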

Copy the above code, replace the client/endpoint/@address value “http://localhost/IM/LogAPI/LogApiService.svc” with the address of your Integration Manager Log API instance, and paste it into the application’s configuration file.

The next step is to add an application settings key called “IM.Log4Net.IsOneWay”, which is of type boolean – set it to either true or false depending on your requirements.
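
In the application’s configuration file this is just an appSettings entry, for example:

<appSettings>
  <add key="IM.Log4Net.IsOneWay" value="true" />
</appSettings>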

If the application setting “IM.Log4Net.IsOneWay” is set to true, the application will not ensure delivery to the Log API, since we have configured the IM appender for Log4Net not to wait for a response (one-way).

For Log4Net to understand that it should log events/messages with the provided appender, you need to extend the configuration file with a reference to the appender as well as an appender configuration (IMAppender) – see below.
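
A rough sketch of the shape of that configuration is shown below – the appender type and its parameters are placeholders here, so take the exact names and values from the readme.pdf:

<log4net>
  <!-- the type attribute and the IM-specific parameters are placeholders; see the readme.pdf -->
  <appender name="IMAppender" type="(see readme.pdf)">
    <!-- IM-specific parameters go here, as described in the readme.pdf -->
  </appender>
  <root>
    <level value="ALL" />
    <appender-ref ref="FileAppender" />
    <appender-ref ref="IMAppender" />
  </root>
</log4net>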

All the parameters used in the IMAppender configuration are described in the readme.pdf that comes with the appender you downloaded earlier.

When the configuration is complete, you can start the application, write a text message and press Enter. The application will then log the event/message via the IM appender to Integration Manager – the configured Log Agent, End Point and Message Type will automatically be added to Integration Manager. You can then create log views allowing users to search only for events/messages coming from applications that use this IM appender.

You can see an example search result of Integration Manager in the screenshot below.

Console Application - Image 6

 

You can download the final application here. We hope that this blog post gets you started with logging your Log4Net logs to Integration Manager, giving you yet another way to log messages to Integration Manager and a centralized platform for all your logs, no matter which integration platform, application or technology you use.
