Handling data from Azure Storage blobs is not straightforward. The return value is binary (application/octet-stream) at first and needs to be cast into the data type you want to process; in our case into application/json.
This write-up is an easy-to-follow, real-world walk-through of the errors beginners may encounter when handling Azure Storage blobs in Azure Logic Apps. All of them happened to me.
As soon as a new file (blob) arrives in an Azure Storage container, it should be processed by an Azure Function app.
1) Create a new Azure Storage account.
2) Create a new container in the storage account.
3) Create a new Azure Logic App.
4) Design the Logic App.
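The first two steps can also be scripted with the Azure CLI. A minimal sketch, assuming an existing resource group named my-rg; the account and container names are illustrative placeholders (the Logic App itself is easiest to create and design in the portal):

```shell
# Create the storage account (the name must be globally unique;
# "mystorageacct" is a placeholder).
az storage account create \
  --name mystorageacct \
  --resource-group my-rg \
  --location westeurope \
  --sku Standard_LRS

# Create a container inside the storage account to drop the test files into.
az storage container create \
  --name inbox \
  --account-name mystorageacct
```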
A first draft could look like this.
With this configuration we have three steps.
Unfortunately, this configuration does not work; it fails with two errors, which we will inspect one by one.
I use the Microsoft Azure Storage Explorer to upload files into the container of my Azure Storage account to trigger my Azure Logic App. With every try I increment the number in the name of my test file. The test file contains simple JSON that my Azure Function can interpret.
After each upload I go back to the Azure Portal to look for new trigger events and, shortly afterwards, for new run events. To avoid flooding my trigger history, I disable the Logic App after each upload while I inspect the run results, and enable it again before each new try.
As you can see our first configuration of the Azure Logic App did not run successfully. Let’s inspect the first error!
The return value of the first step is not an array! If we look at the raw data, we see that the body contains no array. Maybe this is an exception because we uploaded only a single file? Let's try again with two files at the same time.
Now, I upload two files at the same time.
Surprisingly, the Logic App gets triggered for each new file separately.
So my assumption that the action When one or more blobs are added or modified (metadata only) (Preview) returns an array was wrong. The property Number of blobs had misled me into thinking it would.
We can resolve this error easily by removing the for-each loop and designing the flow so that the Logic App is triggered for each new or modified blob separately.
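The per-blob behavior comes from debatching: in code view the blob trigger carries a splitOn property, so each item of the array returned by one poll starts its own run. A sketch of what the trigger can look like in code view; the recurrence, folderId, and maxFileCount values here are illustrative, not copied from my app:

```json
"triggers": {
  "When_one_or_more_blobs_are_added_or_modified_(metadata_only)": {
    "type": "ApiConnection",
    "splitOn": "@triggerBody()",
    "recurrence": {
      "frequency": "Minute",
      "interval": 3
    },
    "inputs": {
      "method": "get",
      "path": "/datasets/default/triggers/batch/onupdatedfile",
      "queries": {
        "folderId": "L2luYm94",
        "maxFileCount": 10
      }
    }
  }
}
```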
Let's try again by uploading a new file. Again, we see an error in the run history, this time at the third step. The first two steps now run successfully: we get the content of the blob that triggered the Logic App. So far, so good! Let's explore the new error!
Even the raw data of the 2nd step looks fine.
The Azure Function action in the third step throws the error UnsupportedMediaType with the message: “The WebHook request must contain an entity body formatted as JSON.” That error may be confusing at first because our file contains pure JSON data. A look at the content type reveals that the Logic App does not know we are handling JSON data; instead it reports application/octet-stream, which is a binary content type.
The Azure Function gets the following raw input:
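Logic Apps wraps binary content in a $content envelope: the bytes are base64-encoded and tagged with their content type. So the raw input typically looks something like this sketch (the base64 payload here is an illustrative encoding of a small JSON object, not my actual test file):

```json
{
  "$content-type": "application/octet-stream",
  "$content": "eyJuYW1lIjoidGVzdCJ9"
}
```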
The raw output of the Azure Function action looks like this.
And for reference, the function stub of the Azure Function looks like this:
Convert into JSON data type
The documentation states that Logic Apps can natively handle application/json and text/plain (see Handle content types in logic apps). As we already have JSON data, we can use the function @json() to cast the content to application/json.
Unfortunately, this approach cannot be saved by the Logic App Designer.
Error message:
Save logic app failed
Failed to save logic app logicapp. The template validation failed: ‘The template action ‘ProcessBlob2’ at line ‘1’ and column ‘43845’ is not valid: “The template language expression ‘json(@{body(‘Get_blob_content’)})’ is not valid: the string character ‘@’ at position ‘5’ is not expected.”.’.
Fortunately, this is only a small shortcoming of the Azure Logic App Designer. We need to look at the configuration in code view. To do so, click on the three dots ... in the upper right corner of the Azure Function action and select Peek code in the menu.
We have to change the expression in the body property: it must not contain more than one expression wrapper @(). The documentation does not say explicitly how to nest expressions (see https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-workflow-definition-language#expressions), but after some trial and error we know we just need to remove the nested expression wrapper @{ and } and leave everything else as it is.
Let's check whether it's working by uploading a new file.
We check the run history again, and this time all actions ran successfully. Let's check the raw input and output.
Raw input:
Raw output:
There is a difference in the input data of the Azure Function action: there is no explicit content type any more, just pure JSON data.
The final Logic App looks like this in the designer. Unfortunately, you don't see all expressions there; you need to peek inside the code, as in the step before. To see everything, switch to code view. That's not nice for designing, but it's good enough to check our configuration.