Sync Response for Durable Functions

Durable Functions provides a great model for reliably running serverless logic, powered by an easy-to-understand orchestration approach.

One of Durable Functions’ specifics is that it follows an async model. This means that you fire a request via HTTP, a queue, or manually, and you get back an endpoint to monitor the status of the execution.

However, I really like Durable Functions’ orchestration capabilities and I want to use them as a general approach for building complex workflows, not limited only to async scenarios. But is it possible to combine the advanced orchestration capabilities of Durable Functions with more typical execution scenarios? Most of the clients we have today – mobile apps and APIs – expect sync responses. In other words, they call an API endpoint and wait until there is a response. One approach to leverage the advanced orchestration capabilities of Durable Functions is to change the existing clients and move them to async behavior. However, this is a difficult task in the short term.

Why don’t we add a sync layer on top of Durable Functions to enable these more common scenarios and take advantage of its orchestration capabilities?

Usually with Durable Functions we have 3 main players – the Orchestration Client, the Orchestration Trigger (Orchestrator) and the Activity Trigger (Activity):

[Architecture diagram]

In this architecture the Mobile Client needs to periodically check the status of the Orchestrator and will eventually get the output.

Can we delegate this responsibility to an external layer – a new function that is called by the mobile client and returns the output if it is available within a pre-configured time frame, or returns the original status check endpoint if the allowed waiting period is over? Below you can see that we are adding a Sync Response Wrapper function:

[Architecture diagram with the Sync Response Wrapper function]

One sample implementation of the Sync Response Wrapper is shown below:

using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using System.Configuration;
using DurableFunc.Model;
using DurableFunc.Services;
using DurableFunc.Utils;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;

namespace DurableFunc
{
    public static class SyncResponse
    {
        [FunctionName("SyncResponse")]
        public static async Task<HttpResponseMessage> Run([HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)]HttpRequestMessage req, TraceWriter log)
        {
            var functionName = await Helper.GetParameterValue(req);
            if (string.IsNullOrEmpty(functionName))
            {
                // Return early when no function name is provided.
                return req.CreateResponse(HttpStatusCode.BadRequest, "Please pass a funcname on the query string or in the request body");
            }
            object result = null;
            string statusUri;
            var orchestrationClientUri = ConfigurationManager.AppSettings["OrchestrationClientUri"];
            using (var httpClient = new HttpClient())
            {
                // Start the orchestration via the Orchestration Client endpoint.
                var orchestrationClientResponse =
                    await httpClient.PostAsync(new Uri($"{orchestrationClientUri}{functionName}"), null);
                orchestrationClientResponse.EnsureSuccessStatusCode();
                var clientResponse =
                    await orchestrationClientResponse.Content.ReadAsAsync<OrchestrationClientResponse>();
                statusUri = clientResponse.StatusQueryGetUri;
                var executionDetails = Helper.GetExecutionDetails();
                var durableFunctionSyncResponseService = new DurableFunctionSyncResponseService();
                // Poll the status endpoint until the output is available or the time box expires.
                result = await durableFunctionSyncResponseService.ProvideOutput(clientResponse, executionDetails, log);
            }
            return result == null
                ? req.CreateResponse(HttpStatusCode.OK, $"The operation is taking more than expected. Keep following the progress here {statusUri}")
                : req.CreateResponse(HttpStatusCode.OK, result);
        }
    }
}

In the Sync Response Wrapper function we call the Orchestration Client, get the status endpoint, and start checking it for the output. The most interesting part is the service that abstracts the calls to the Orchestration Client status endpoint:

public async Task<object> ProvideOutput(OrchestrationClientResponse clientResponse, ExecutionDetails executionDetails, TraceWriter log)
{
    object result = null;
    using (var httpClient = new HttpClient())
    {
        for (var i = 0; i < executionDetails.Iterations; i++)
        {
            // Wait between status checks without blocking the thread.
            await Task.Delay(executionDetails.IterationPeriod);
            string statusCheck;
            try
            {
                statusCheck = await httpClient.GetStringAsync(clientResponse.StatusQueryGetUri);
            }
            catch (Exception e)
            {
                // Log the exception and retry on the next iteration.
                log.Error(e.Message);
                continue;
            }
            var status = JsonConvert.DeserializeObject<StatusResponse>(statusCheck);
            if (status.RuntimeStatus != "Completed") { continue; }
            result = status.Output;
            break;
        }
    }
    return result;
}
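The post does not show the model classes used for deserialization. Below is a minimal sketch of what they might look like – the property names are assumptions based on the JSON payload returned by the Durable Functions HTTP API, and the actual classes in the DurableFunc.Model namespace of the repo may differ:

```csharp
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

namespace DurableFunc.Model
{
    // Shape of the body returned by the Orchestration Client (CreateCheckStatusResponse).
    public class OrchestrationClientResponse
    {
        [JsonProperty("id")]
        public string Id { get; set; }

        [JsonProperty("statusQueryGetUri")]
        public string StatusQueryGetUri { get; set; }
    }

    // Shape of the payload served by the statusQueryGetUri endpoint.
    public class StatusResponse
    {
        [JsonProperty("runtimeStatus")]
        public string RuntimeStatus { get; set; }

        // The orchestrator output can be any JSON value, so JToken keeps it generic.
        [JsonProperty("output")]
        public JToken Output { get; set; }
    }
}
```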

The configuration time boxes the waiting period, and we fall back to the async behavior if the orchestration does not complete on time.

The next question is how to optimize performance. The disadvantage of this architecture is that we are adding one more component, which increases the overall latency. Can we eliminate some of the network calls?

Yes – let’s move the polling logic that produces the sync response into the Orchestration Client. This will let us get the status URL endpoint immediately and leverage the Orchestration Client for picking up the updates coming from the Orchestration Trigger:

[Architecture diagram: polling moved into the Orchestration Client]

Then the code of the Orchestration Client becomes:

[FunctionName("SyncResponseClient")]
public static async Task<HttpResponseMessage> Run(
    [HttpTrigger(AuthorizationLevel.Function, methods: "post", Route = "orchestrator/{functionName}")] HttpRequestMessage req,
    [OrchestrationClient] DurableOrchestrationClient starter,
    string functionName,
    TraceWriter log)
{
    // Function input comes from the request content.
    dynamic eventData = await req.Content.ReadAsAsync<object>();
    string instanceId = await starter.StartNewAsync(functionName, eventData);
    log.Info($"Started orchestration with ID = '{instanceId}'.");
    var responseMessage = starter.CreateCheckStatusResponse(req, instanceId);
    var clientResponse = JsonConvert.DeserializeObject<OrchestrationClientResponse>(await responseMessage.Content.ReadAsStringAsync());
    var executionDetails = Helper.GetExecutionDetails();
    var durableFunctionSyncResponseService = new DurableFunctionSyncResponseService();
    // Reuse the same polling service; no extra HTTP hop through a wrapper function.
    var result = await durableFunctionSyncResponseService.ProvideOutput(clientResponse, executionDetails, log);
    return result == null
        ? req.CreateResponse(HttpStatusCode.OK, $"The operation is taking more than expected. Keep following the progress here {clientResponse.StatusQueryGetUri}")
        : req.CreateResponse(HttpStatusCode.OK, result);
}

We continue to leverage the same service logic, but we have reduced the number of functions and network calls.

Now let’s review a complete example including sub-orchestration. We will call the Sync Response Orchestration Client, which will retrieve the names of 3 cities and, via sub-orchestrations, the current temperature in each of them:

[Architecture diagram of the complete example]

The complete implementation of this sample can be found here – https://github.com/gled4er/durable-functions-sub-orchestrations

In this case the first Orchestrator calls both Activities and another Orchestrator:

using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;

namespace DurableFunc
{
    public static class HelloWorld
    {
        [FunctionName("HelloWorld")]
        public static async Task<List<string>> Run([OrchestrationTrigger] DurableOrchestrationContext context)
        {
            var list = new List<string>
            {
                $"{await context.CallActivityAsync<string>("Hello", "Tokyo")}. The temperature is {await context.CallSubOrchestratorAsync<string>("TemperatureService", "Tokyo")}°C",
                $"{await context.CallActivityAsync<string>("Hello", "Seattle")}. The temperature is {await context.CallSubOrchestratorAsync<string>("TemperatureService", "Seattle")}°C",
                $"{await context.CallActivityAsync<string>("Hello", "London")}. The temperature is {await context.CallSubOrchestratorAsync<string>("TemperatureService", "London")}°C"
            };
            return list;
        }
    }
}

In terms of settings we have the following values for local testing:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "AzureWebJobsDashboard": "UseDevelopmentStorage=true",
    "WeatherApiKey": "{Weather-API-Key}",
    "OrchestrationClientUri": "http://localhost:7071/api/orchestrators/",
    "MaxExecutionTime": "3000",
    "ExecutionPeriod": "100"
  }
}
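The MaxExecutionTime and ExecutionPeriod settings drive the polling schedule: the number of iterations is the total time box divided by the poll interval. A minimal sketch of how Helper.GetExecutionDetails might derive this – a hypothetical implementation: the real helper in the repo takes no arguments and reads these values from app settings via ConfigurationManager, while here they are passed in as parameters to keep the sketch self-contained:

```csharp
namespace DurableFunc.Utils
{
    public class ExecutionDetails
    {
        public int Iterations { get; set; }       // number of status checks to attempt
        public int IterationPeriod { get; set; }  // delay between checks, in milliseconds
    }

    public static class Helper
    {
        // MaxExecutionTime (ms) bounds the total wait; ExecutionPeriod (ms) is the poll interval.
        // With the sample settings above (3000 / 100) this yields 30 iterations of 100 ms each.
        public static ExecutionDetails GetExecutionDetails(int maxExecutionTime, int executionPeriod)
        {
            return new ExecutionDetails
            {
                Iterations = maxExecutionTime / executionPeriod,
                IterationPeriod = executionPeriod
            };
        }
    }
}
```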

And the result we get after execution is:

[Screenshot of the execution output]

I am very interested to hear your opinion about this approach and whether you find Durable Functions useful for sync response scenarios. Please share your feedback by contacting me at @azurekanio

Thank you!

Kanio


Get Azure Durable Functions running in Azure Portal for 5 minutes

Hello All,

Azure Durable Functions provides a unique way to orchestrate Azure Functions via code and to implement advanced workflows leveraging long-running executions as well as persisted local state.

In the Azure Durable Functions documentation you can find a great reference on how to use Visual Studio for development.

In a recent post we covered how to use Visual Studio Code to develop Azure Durable Functions on Windows, Mac and Linux.

In this post we will see how to get Azure Durable Functions up and running in less than 5 minutes in Azure Portal!

We will utilize the templates already configured in Azure Portal and build an application containing 3 types of functions:

  • Orchestrator Client
  • Orchestrator
  • Activity

We will use the following architecture:

[Architecture diagram]

 

A mobile client or any RESTful client calls the orchestration client to trigger the execution. The orchestration client starts the orchestrator function, which calls an activity function multiple times. In the end we check the output via the “statusQueryGetUri” endpoint exposed by the orchestration client.

So, let’s start and keep an eye on the time – we will be done in less than 5 minutes!

  • Let’s access Azure Portal and find Function App


  • Let’s fill in the necessary data for the Function App


  • Azure Durable Functions requires the preview version of the runtime, so let’s change that via Function settings


  • And then let’s select “beta” for runtime version


  • Now let’s create our first function. Please select “Create your own custom function”


  • On the following screen change Scenario to “All” and select DurableFunctionsHttpStart


  • You need to install the extension required for Azure Durable Functions


  • Please press “Install” and you will get the following message. The installation is actually faster than 10 minutes, so we are still on track to meet our deadline:


  • After the install is complete, we are ready to create our first function – “HttpStart”, which is of type Orchestration Client and is created by selecting the DurableFunctionsHttpStart template


This function is responsible for starting the orchestrator function requested by the user:

 
#r "Microsoft.Azure.WebJobs.Extensions.DurableTask"
#r "Newtonsoft.Json"

using System.Net;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req,
    DurableOrchestrationClient starter, string functionName, TraceWriter log)
{
    // Function input comes from the request content.
    dynamic eventData = await req.Content.ReadAsAsync<object>();
    string instanceId = await starter.StartNewAsync(functionName, eventData);

    log.Info($"Started orchestration with ID = '{instanceId}'.");

    return starter.CreateCheckStatusResponse(req, instanceId);
}

with configuration:


{
  "bindings": [
    {
      "authLevel": "anonymous",
      "name": "req",
      "type": "httpTrigger",
      "direction": "in",
      "route": "orchestrators/{functionName}",
      "methods": [
        "post",
        "get"
      ]
    },
    {
      "name": "$return",
      "type": "http",
      "direction": "out"
    },
    {
      "name": "starter",
      "type": "orchestrationClient",
      "direction": "in"
    }
  ],
  "disabled": false
}

  • Now, let’s create our orchestrator function “HelloSequence” from the “DurableFunctionOrchestrator” template


This is our orchestrator function, which will call the activity function 3 times:


/*
 * Before running this sample, please create a Durable Activity function (default name is "hello")
 */

#r "Microsoft.Azure.WebJobs.Extensions.DurableTask"

public static async Task<List<string>> Run(DurableOrchestrationContext context)
{
    var outputs = new List<string>();

    // Replace "hello" with the name of your Durable Activity Function.
    outputs.Add(await context.CallActivityAsync<string>("Hello", "Tokyo"));
    outputs.Add(await context.CallActivityAsync<string>("Hello", "Seattle"));
    outputs.Add(await context.CallActivityAsync<string>("Hello", "London"));

    // returns ["Hello Tokyo!", "Hello Seattle!", "Hello London!"]
    return outputs;
}

And we apply the following configuration:


{
  "bindings": [
    {
      "name": "context",
      "type": "orchestrationTrigger",
      "direction": "in"
    }
  ],
  "disabled": false
}

  • And our last function will be “Hello” from “DurableFunctionActivity” template


This is the function that performs the actual “work”:


/*
 * Before running this sample, please create a Durable Activity function (default name is "hello")
 */

#r "Microsoft.Azure.WebJobs.Extensions.DurableTask"

public static string Run(string name)
{
    return $"Hello {name}!";
}

And we use the following configuration for this function:


{
  "bindings": [
    {
      "name": "name",
      "type": "activityTrigger",
      "direction": "in"
    }
  ],
  "disabled": false
}

  • Now let’s go to the HttpStart function and copy its URL


  • Let’s use Postman or cURL to query the endpoint. No data needs to be passed; just use the POST verb


  • Then let’s call the “statusQueryGetUri” endpoint and we will see the actual output of the Durable Function


  • Congratulations! Your first Azure Durable Function is up and running in Azure Portal!

 

Congratulations! Now you can run Azure Durable Functions in Visual Studio and in Visual Studio Code on Mac and Linux, and in this post we learned how to use Azure Portal for Azure Durable Functions development!

Azure Durable Functions Everywhere! 

Thank you!

Kanio

 

Running Azure Durable Functions in VS Code

Hello All,

Azure Durable Functions are a great new addition to the exciting Azure serverless offerings! They allow you to create complex Azure Functions workflows entirely in code. You can find more information in my previous post – https://azurekan.wordpress.com/2017/11/13/azure-durable-functions/

If you use Visual Studio, you can find instructions on how to run the sample code here – https://docs.microsoft.com/en-us/azure/azure-functions/durable-functions-install

In this post I want to share how easy it is to achieve the same result in Visual Studio Code.

To speed you up, I created this “Hello World” project on GitHub – https://github.com/gled4er/vs-code-azure-durable-functions-hello-world

If you follow the instructions in the README file you will be able to run Azure Durable Functions in VS Code in less than 5 minutes – https://github.com/gled4er/vs-code-azure-durable-functions-hello-world/blob/master/README.md:

1. Clone the project – https://github.com/gled4er/vs-code-azure-durable-functions-hello-world

2. Install the Azure Functions cross-platform tools (more info here http://bit.ly/2ftaOIC) by running the following command (you will need Node.js 8.5 or later):

npm i -g azure-functions-core-tools@core 

If you already have the tools, please be sure to update them to the latest available version. Only after updating to Azure Functions Core Tools (2.0.1-beta.21) and Functions Runtime version 2.0.11370.0 were we able to run the sample successfully on Mac.

3. Navigate to the folder and add the Durable Functions extension by running the following command:

func extensions install -p Microsoft.Azure.WebJobs.Extensions.DurableTask -v 1.0.0-beta

4. Start the local Azure Storage Emulator v5.2 (or later) or use a connection string to an existing Azure Storage account

5. Add a local.settings.json file next to the host.json file and modify it to look like the following if you use the Azure Storage Emulator, or provide the connection string to your storage account in Azure:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true"
  }
}

6. Start the project by running the following command in the folder of your function app:

func host start 

7. In Postman or with cURL call the endpoint provided by Azure Functions Tools:

curl -X POST http://localhost:7071/api/orchestrators/HelloSequence

8. You can check the progress of your function by calling the “statusQueryGetUri” endpoint provided in the response, similar to:

curl -X GET 'http://localhost:7071/runtime/webhooks/DurableTaskExtension/instances/9c71ff5ff9f34e4f82f882c795bb20fa?taskHub=DurableFunctionsHub&connection=Storage&code=xuqaAlxP%2F%2FjlrBxU%2FL8kE5jjzMzhHysVVUucYItg6rBPJvAAIetd%2FA%3D%3D'
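The response from the “statusQueryGetUri” endpoint looks similar to the following (the values are illustrative, and the exact fields depend on the Durable Task extension version):

```json
{
  "runtimeStatus": "Completed",
  "input": null,
  "output": ["Hello Tokyo!", "Hello Seattle!", "Hello London!"],
  "createdTime": "2017-12-01T05:00:00Z",
  "lastUpdatedTime": "2017-12-01T05:00:05Z"
}
```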

9. Great! You are now able to run Azure Durable Functions in VS Code! Congratulations!!!

Let us know what ideas you have and how we can help you be more successful with Azure Functions and Azure Durable Functions! Please contact me here or at @azurekanio

Thank you!

Kanio


Azure Durable Functions

Azure Functions allows you to use the power of serverless to significantly reduce development time while enabling shorter release cycles at the same time.

However, as we enter the serverless era, complexity increases as well. A serverless-driven solution will probably end up having hundreds or thousands of functions. How can we manage so many items in the best possible way? Azure Logic Apps already provides a great way of creating complex workflows that include Azure Functions. However, if we want the code to be the main source of truth for the current state of our application, a code-driven approach is better.

Thus, the Azure Functions offering was extended with Azure Durable Functions – code-driven orchestrator functions that help us easily coordinate the execution of other functions.

You can find a concise review of Azure Durable Functions in the presentation I gave during Tech Summit 2017, Tokyo:

Azure Durable Functions has great documentation that can be found here. Feel free to review the source code, which can be accessed here.

In a series of posts we will cover various topics on Azure Durable Functions. Feel free to contact me here or on Twitter at @azurekanio

We are very interested to hear your requirements and how we can make you more successful with Azure Functions and Azure Durable Functions!

Thank you!

Kanio

Microsoft Azure Update, July 2016

Hello,

I am starting a new initiative to summarize the Microsoft Azure updates for the last month.

Today we have the update for July.

Azure is developing so fast that it is important to have a few minutes to cover what is changing and then dig deeper depending on your needs.

Please find the information I summarized for you in the presentation below:

Let me know if you have suggestions how this initiative can be improved and more useful for you.

Thank you!

Kanio

 

Introduction to Serverless Architectures with Azure Functions

Hello All,

We had the 7th edition of Tokyo Azure Meetup. Now we are making the event global – broadcasting live directly from the Microsoft office in Shinagawa, Tokyo.

This time we talked about serverless architectures. We are in the middle of the next big shift in computing – completely abstracting the underlying infrastructure and focusing 100% on the business logic.

Today we can create applications directly in our browser and leave the decision of how they are hosted and scaled to the cloud provider. Moreover, this approach gives us incredible control over the granularity of our applications, since most of the time we are dealing with a single function at a time.

In this meetup we covered the following topics:

• Microsoft Azure major updates for July

• Introduce Serverless Architectures

• Talk about the advantages of Serverless Architectures

• Discuss event-driven computing in detail

• Cover common Serverless approaches

• See practical applications with Azure Functions

• Compare AWS Lambda and Azure Functions

• Talk about open source alternatives

• Explore the relation between Microservices and Serverless Architectures

You can find the presentation here:

And the recording can be found here:

We will be happy to have you as our guest speaker live or online.

Please contact me if you are interested!

Thank you!

Kanio

Error with ASP.NET Core Project in Visual Studio 2015 Update 3

After installing Visual Studio 2015 Update 3, when I ran the sample ASP.NET Core API project I got the following errors:
Error MSB4064 The “OutputLogFile” parameter is not supported by the “VsTsc” task. Verify the parameter exists on the task, and it is a settable public instance property. C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v14.0\TypeScript\Microsoft.TypeScript.targets 261

and

Severity Code Description Project File Line Suppression State
Error MSB4063 The “VsTsc” task could not be initialized with its input parameters. C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v14.0\TypeScript\Microsoft.TypeScript.targets 247

These errors point to the following code in the file C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v14.0\TypeScript\Microsoft.TypeScript.targets:


<VsTsc
ToolPath="$(TscToolPath)"
ToolExe="$(TscToolExe)"
Configurations="$(TypeScriptBuildConfigurations)"
FullPathsToFiles="@(TypeScriptCompile)"
YieldDuringToolExecution="$(TscYieldDuringToolExecution)"
OutFile="$(TypeScriptOutFile)"
OutDir="$(TypeScriptOutDir)"
ProjectDir="$(ProjectDir)"
ToolsVersion="$(TypeScriptToolsVersion)"
RootDir="$(TypeScriptRootDir)"
TypeScriptCompileBlocked="$(TypeScriptCompileBlocked)"
JsxPreserve="$(JsxPreserve)"
ComputeOutputOnly="false"
OutputLogFile="$(CompilerOutputLog)">
<Output TaskParameter="GeneratedJavascript" ItemName="emittedFiles" />
</VsTsc>

The workaround I found so far is to comment out all of the code above, which fixes the build problem I experienced:


<!--<VsTsc
ToolPath="$(TscToolPath)"
ToolExe="$(TscToolExe)"
Configurations="$(TypeScriptBuildConfigurations)"
FullPathsToFiles="@(TypeScriptCompile)"
YieldDuringToolExecution="$(TscYieldDuringToolExecution)"
OutFile="$(TypeScriptOutFile)"
OutDir="$(TypeScriptOutDir)"
ProjectDir="$(ProjectDir)"
ToolsVersion="$(TypeScriptToolsVersion)"
RootDir="$(TypeScriptRootDir)"
TypeScriptCompileBlocked="$(TypeScriptCompileBlocked)"
JsxPreserve="$(JsxPreserve)"
ComputeOutputOnly="false"
OutputLogFile="$(CompilerOutputLog)">
<Output TaskParameter="GeneratedJavascript" ItemName="emittedFiles" />
</VsTsc>-->

I will update the post if I find a better solution.

Kanio

 

 

de:code 2016

Hello,

I had the honor to present at de:code 2016, the main Microsoft conference in Japan. de:code is a very interesting event that gives the Japanese audience a chance to cover topics discussed during the Build conference, mixed with topics inspired by the local Microsoft community.

This year the event was huge, and the keynote was delivered by Satya Nadella, Microsoft CEO, and Steve Guggenheimer, Corporate Vice President & Chief Evangelist. More information here.

It is amazing to be on the same speaker list as people like Satya and Steve.

I talked about DevOps and our experiences at Rakuten changing the organization to leverage best practices.

I presented together with my colleague Kawaguchi-san, who is the leading DevOps evangelist at Rakuten.

You can find the recording of the talk here (the English part starts at the 22nd minute):

https://channel9.msdn.com/Events/de-code/2016/DOO-004/player

We also participated in the live Channel 9 coverage of the event, where we again explored strategies for adopting DevOps at a big enterprise. The recording is below:

https://channel9.msdn.com/Events/de-code/decode2016-livestreaming/Channel9stageDay2-2/player

The whole DevOps track was very interesting. Microsoft brought leading experts from Chef, Mesosphere (DCOS), HashiCorp and Visual Studio Team Services teams to share their experiences in order to speed up DevOps adoption in Japan.

de:code 2016 was a great event!

I am looking forward to de:code 2017.

Thank you!

Kanio

 

Tokyo Azure Meetup #6

Hello,

In a series of posts I will be giving you more details about the activities of Tokyo Azure Meetup.

Today we will talk about our 6th event, dedicated to Azure Machine Learning and Microsoft Dynamics.

We were very honored to have Taiki Yoshida as our guest speaker for the event. He has extensive experience with Microsoft Dynamics and for several months has also been working with Azure Machine Learning and its applications in other technologies.

Also, at the beginning of the meeting I gave a summary of the latest developments in Microsoft Azure for June.

For the first time we were broadcasting the event live, and you can watch the recording:

If you are interested in the presentations you can find them below as well.

Azure Monthly Update – June:

Azure Machine Learning with Microsoft Dynamics:

Thank you!

Kanio


Tokyo Azure Meetup

Hello,

For the last six months I have been running a new community event in Tokyo, Japan focused on Microsoft Azure.

We started at the beginning of the year and have already had 6 very successful events.

Our community has grown to 184 members so far, and we are adding new people every month.

This is the URL of the meetup: Tokyo Azure Meetup.

There you can find information about our events.

The main goals of Tokyo Azure Meetup are to:

  1. Gather people interested in the cloud and especially Microsoft Azure
  2. Provide very practical knowledge (theory & demos) that can be easily applied in your daily work
  3. Learn from others’ unique experience
  4. Share your unique experience

I will be sharing more information for events here.

I want also to invite you to be our next speaker.

We have only 2 basic rules for our speakers:

  1. Present in English or Japanese
  2. Have a topic related to Microsoft Azure

You can have a talk from 5 to 60 minutes, depending on what suits you best.

We hold our events at the Microsoft Japan office in Shinagawa, Tokyo, Japan.

However, if you are not located in Japan, we can have a virtual session.

So, as you can see there is a lot of flexibility on our side.

If you are interested, please join Tokyo Azure Meetup and send me message there.

We have started adding recordings of the events to our YouTube channel.

The presentations will be accessible in our SlideShare channel

Thank you and see you soon!