Sendgrid and Azure Functions

Azure Functions recently introduced SendGrid as an output type. The feature is currently in beta/preview and the documentation is therefore very sparse, so I thought I’d quickly write up how I got it working. Hopefully it might save you some googling (then again, you probably googled your way here, but I digress).

I currently use SendGrid to mail myself a shopping list every Sunday, built up through the use of a barcode scanner in our kitchen. However, since the content of the email is somewhat irrelevant in this case, we’ll build a function which sends a mail whenever its HTTP endpoint is triggered/accessed.

As a prerequisite for this short tutorial you will need an active sendgrid account. They are kind enough to offer a free tier, so head on over and register if you haven’t already.

I’ll assume you have already created an Azure Function app (if not, follow Microsoft’s instructions). In it, create a new function; select the HttpTrigger-CSharp type, give it a name and click Create.

After a short wait, you’ll be presented with your brand new function. The default implementation of an HTTP trigger will return a name sent to it either via the body as a POST request or as a querystring parameter. For this example’s sake, we will email that name to a hard coded email address as well.

First, add a new out parameter to the function. To do so, select “Integrate” beneath the function name in the panel on the left hand side, and then click on “+New output”:

Creating a new output parameter in azure functions.

Then, select “SendGrid (Preview)” (you might have to scroll down a little in the output type list) and hit “Select”. The following screen is not as self-explanatory as it should be, but here’s the deal: The message parameter name is what you think it is – it’s the name of the parameter in your function which will contain the email message. The SendGrid API Key value, however, shouldn’t be populated with your actual SendGrid API key, but with the key of the app setting containing your SendGrid API key. Leave the value at its default (SendGridApiKey). You can leave the other settings (from & to address, subject and message text) empty, as we’ll populate them via the function code instead. Save the new parameter.
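Behind the scenes, this output parameter is stored as a binding in the function’s function.json. At the time of the preview the generated binding looked roughly like the fragment below – treat the exact property names as approximate, since they may differ slightly between preview versions. Note how apiKey holds the *name* of the app setting, not the key itself:

```json
{
  "bindings": [
    {
      "type": "sendGrid",
      "direction": "out",
      "name": "message",
      "apiKey": "SendGridApiKey"
    }
  ]
}
```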

Now it is time to get the API key from SendGrid and add the required app setting. You create the API key in the SendGrid dashboard, under Settings -> API Keys,


then “Create API key” and “General API Key”


Give the Api Key a descriptive name (I chose “Azure functions sendgrid example”) and make sure you grant the API key full access to “Send mail”:


When you hit save, the API key will be displayed. Copy the key to the clipboard (it might also be an idea to keep that browser window open until you’ve verified it works, since that screen is the first and last time sendgrid will ever show you the key).

Moving back to the azure portal, it’s time to create the app setting. Click “Function app settings” in the bottom left corner:


and then “Go to app service settings.”

There, select “Application Settings” and add a new App Setting. Its key has to be “SendGridApiKey” (as we specified in the sendgrid parameter setting). Paste the sendgrid api key as its value, click save and close the app settings blade. You should be returned to the azure function. Click on “Develop” under the function name – it’s time to write some code!

Since we have specified a new out parameter called “message”, the first thing we have to do is to add that. The sendgrid parameter type expects a “Mail” type object, found in the SendGrid.Helpers.Mail namespace. We also have to import the sendgrid assembly. Add the following two lines at the very top of the function:

#r "SendGrid"

using SendGrid.Helpers.Mail;

Add the new out parameter to the function signature and remove the async and Task parts (out parameters do not play nice with async).

public static HttpResponseMessage Run(
    HttpRequestMessage req, 
    TraceWriter log, 
    out Mail message)

Removing async means you will also have to change the following line

dynamic data = await req.Content.ReadAsAsync<object>();

to

dynamic data = req.Content.ReadAsAsync<object>().Result;

When that is done, all the boilerplate/infrastructure is in place, and it is just a matter of composing the email. The entire sendgrid C# API is documented at github, but the example below is a minimal, working implementation:

 message = new Mail();
 message.Subject = "Someone passed a name to an azure function!";

 var personalization = new Personalization();
 personalization.AddTo(new Email(""));
 message.AddPersonalization(personalization);

 Content content = new Content
 {
     Type = "text/plain",
     Value = $"The name was {name}"
 };
 message.AddContent(content);

That’s everything that is needed to integrate SendGrid with Azure Functions. Here is the complete function:

#r "SendGrid"
using System.Net;
using SendGrid.Helpers.Mail;

public static HttpResponseMessage Run(HttpRequestMessage req, TraceWriter log, out Mail message)
{
    log.Info("C# HTTP trigger function processed a request.");

    // parse query parameter
    string name = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Compare(q.Key, "name", true) == 0)
        .Value;

    // Get request body
    dynamic data = req.Content.ReadAsAsync<object>().Result;

    // Set name to query string or body data
    name = name ?? data?.name;

    message = new Mail();
    message.Subject = "Someone passed a name to an azure function!";

    var personalization = new Personalization();
    personalization.AddTo(new Email(""));
    message.AddPersonalization(personalization);

    Content content = new Content
    {
        Type = "text/plain",
        Value = $"The name was {name}"
    };
    message.AddContent(content);

    return name == null
        ? req.CreateResponse(HttpStatusCode.BadRequest, "Please pass a name on the query string or in the request body")
        : req.CreateResponse(HttpStatusCode.OK, "Hello " + name);
}

Let me know how you get on in the comments!

Azure Function as an HTTP endpoint

Article three describing my first foray into serverless computing. Introduction here, storage queue triggered function here. TL;DR: I’m using this as a way to familiarize myself with Azure Functions, and to save some precious family time better spent not trying to remember what we need come Sunday.

In my projects, we always end up integrating against some kind of third party – be it booking systems, news aggregators, external providers of content or something as simple as an image resizer. Common to these integrations is the need to transform the data received into a format with which we want to work. Sometimes we need to enrich the data by combining it with other sources, and other times the data exposed by the external API is more than we need.

For this project, I had to get product data from Kolonial’s API by barcode in order to add the product to my cart using their internal ID. Their API provides a product endpoint, but it exposes much more data than I needed. I decided to abstract it away behind a GET endpoint of my own.

This is a philosophy I try to follow in all projects I’m responsible for. It means the business code can trust that the contracts we’ve agreed upon won’t break suddenly because of any external factors, and third parties are kept at the edges of the system. Another advantage in this case is that I might want to cache the product data returned at some point, and with the endpoint (function) in place, I have an easy way to implement said caching without having to modify any business code.
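The caching mentioned above can live entirely inside the wrapping endpoint: check an in-memory store before calling the upstream API, and the business code never knows the difference. A minimal sketch of the idea, in Python for brevity – the cache, the fetch function and the barcode value are all hypothetical illustrations, not part of the actual solution:

```python
# Sketch: cache at the integration boundary so callers never notice.
_cache = {}

def get_product(barcode, fetch_product):
    """Return the cached product if present; otherwise fetch and cache it."""
    if barcode not in _cache:
        _cache[barcode] = fetch_product(barcode)
    return _cache[barcode]

calls = []

def fake_fetch(barcode):
    # Stands in for the call to the third-party API.
    calls.append(barcode)
    return {"id": "123", "barcode": barcode}

first = get_product("7038010009457", fake_fetch)
second = get_product("7038010009457", fake_fetch)  # served from the cache
```

Because the cache sits behind the endpoint’s contract, swapping it for a distributed cache later requires no changes to the callers.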

So, an HTTP endpoint was needed, and since I’m working in a serverless architecture, that meant I had to create an HTTP triggered function:

The HTTP triggered function is found in the “API & WebHooks” scenario,

Note that you can specify the access/authorization level for the function:


“Function” means the caller needs to provide a key specific to the function in the query string. “Admin” means an app-wide key is needed, while “anonymous” will allow anyone who accesses the URL to trigger the function.

Now, the HTTP trigger is not always used as a RESTful endpoint – it can just as well trigger a processing job, write to a storage table or whatever else you need it to do, but in this case I wanted to return a product object in the format determined by the Accept header of the request.

In order to make the function return an HTTP response, the function needs to have an HTTP output parameter, which is set up by default when you create the function. I left everything as it was, and focused on writing the small function:

#r "Newtonsoft.Json"
using System;
using System.Configuration;
using Newtonsoft.Json;
using System.Net;

// These are the (interesting parts) of the models returned from the Kolonial API.
// The actual JSON returned contains more properties, but I see no reason to bother the deserializer
// with more than what we actually need.
public class KolonialSearchResponse
{
    public KolonialProduct[] Products { get; set; }
}

public class KolonialProduct
{
    public string Id { get; set; }
    public string Barcode { get; set; }
    public string Brand { get; set; }
    public string Name { get; set; }
}

static HttpClient HttpClient = null;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    if (HttpClient == null)
        HttpClient = CreateHttpClient(log);

    string barcode = GetBarcodeFromRequest(req);

    // Request product from Kolonial
    var httpResult = await HttpClient.GetAsync("" + barcode);
    if (!httpResult.IsSuccessStatusCode)
    {
        // not much we can do here, so just log the error and return an error status code.
        log.Error($"Error occurred when getting data from Kolonial.");
        return req.CreateErrorResponse(HttpStatusCode.InternalServerError, "Product not found");
    }

    var json = await httpResult.Content.ReadAsStringAsync();
    if (json != null)
    {
        log.Info($"Processing JSON: {json}");
        var results = JsonConvert.DeserializeObject<KolonialSearchResponse>(json);

        // If the search call returns anything but a single product
        // we have no idea how to handle it.
        if (results.Products != null && results.Products.Length == 1)
        {
            var product = results.Products.First();
            product.Barcode = barcode;
            return req.CreateResponse(HttpStatusCode.OK, product);
        }
    }

    return req.CreateErrorResponse(HttpStatusCode.BadRequest, "Product not found");
}

public static HttpClient CreateHttpClient(TraceWriter log)
{
    log.Info("Instantiating HTTP client.");
    var httpClient = new HttpClient();
    httpClient.DefaultRequestHeaders.Add("X-Client-Token", ConfigurationManager.AppSettings["KolonialToken"]);
    httpClient.DefaultRequestHeaders.Add("User-Agent", ConfigurationManager.AppSettings["KolonialUserAgent"]);
    return httpClient;
}

public static string GetBarcodeFromRequest(HttpRequestMessage req)
{
    return req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Compare(q.Key, "barcode", true) == 0)
        .Value;
}

Hopefully, the code should be pretty self-explanatory. There are a couple of things which are good to know, though: Newtonsoft.Json is available in azure functions, but it must be imported:

#r "Newtonsoft.Json"


And while I very much doubt that I would ever exhaust anything with the small scale of my project, the Microsoft patterns and practices team recommends that HttpClient is instantiated as few times as possible, and instead kept in memory and re-used. That’s why the static HttpClient is created the first time the function is triggered and kept around:

if(HttpClient == null)
    HttpClient = CreateHttpClient(log);

What’s interesting is that despite the function in principle being serverless, the static variable will hang around (albeit for an unpredictable amount of time), effectively allowing us to follow best practice. You can read more about sharing state in Azure Functions on Mark Heath’s excellent blog.

When it comes to the formatting of the return value, the framework will take care of that for you as long as you stick to the request.CreateResponse(…) functions. As an example, this is how the function responds to an Accept: application/json request:


While requesting XML, predictably, will return this:


(Both screenshots courtesy of the wonderful Postman Chrome plugin)

And that’s that – a small integration with an external service, contained in a single azure function.


Azure functions – storage queue trigger

As I described in the previous blog post (in Norwegian), I’ve spent a few hours the last couple of days to set up a barcode scanner in our kitchen. The barcode scanner is attached to a raspberry pi and when a barcode is scanned, the barcode is pushed to an azure storage queue and the product eventually ends up in our shopping cart.

I’m using the project to familiarize myself with Azure Functions before utilizing them in customer projects, and I decided to use a storage queue triggered function to process incoming barcodes. You can read more about storage queues in the Azure documentation; they’re basically lightweight, high-volume and low-maintenance distributed queues.

Creating the function

Before coding the function, it has to be set up as part of an Azure Function app. An azure function app is basically a special version of a web app. The app can be created either via the Azure Functions Portal or through the regular azure portal.

Side note: While working on this, the continuous deployment of a totally unrelated app in the same Azure subscription suddenly started failing when a PowerShell command tried to switch deployment slots, with the error message

Requested value 'Dynamic' was not found.

This had me scratching my head for quite some time, but some googling revealed that adding a function app to an existing subscription will (may?) break PowerShell functionality. The fix was to completely delete the function app. YMMV.

Once the app is set up, it’s time to create the function. As mentioned, we want an azure storage queue triggered function, and I opted for C#:


Selecting the trigger type will reveal a few options below the “Choose a template” grid:


Here we give the function a descriptive name and enter the name of an existing queue (or a new one) in the “Queue name” field. The storage account connection field is a drop down of the storage accounts available. It obviously needs to be set to the storage account we want the queue to be stored in. Once we click create, the function is added to the function app, and it will be triggered (executed) every time a new message is added to the storage queue “incoming-barcodes”. This configuration (queue name, storage account) can be changed at any time by clicking “Integrate” beneath the function name in the function portal:


The next step is to actually write the function. In this first version, everything is done in one function call, and we’re only covering the happy path: we assume the Kolonial account exists, that the password is correct and that the product exists. If not, the message will end up in the poison queue or just log a warning message. A natural next step would be to alert the user that any errors occurred, but that’s for another day.

The default entry point for the function is a static method called “Run” (I realize that RunAsync would be more correct with regards to naming async methods, but I’m sticking as close as I can to the defaults):

public static async Task Run(string message, TraceWriter log)
{
    log.Info($"Processing incoming barcode: {message}");
    var incoming = IncomingBarcode.FromMessage(message);

    var httpClient = await CreateKolonialHttpClientAsync();
    var kolonialProduct = await GetKolonialProductAsync(httpClient, incoming.Barcode);
    if (kolonialProduct == null)
    {
        log.Warning($"Product with barcode {incoming.Barcode} is not available at Kolonial.");
        return;
    }

    await AddProductToCartAsync(httpClient, kolonialProduct, log);
}

First, we extract the raspberry ID and barcode from the incoming message with a small data transport class (IncomingBarcode), since the barcode is passed to the function by the raspberry pi in the format “rpi-id:barcode”.
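The IncomingBarcode class itself isn’t shown in the post, but given the “rpi-id:barcode” format, the parsing boils down to splitting the message on the first colon. A rough sketch of the equivalent logic, in Python for brevity (the example device ID is made up):

```python
def parse_incoming(message):
    """Split a queue message of the form 'rpi-id:barcode'
    into its device ID and barcode parts."""
    device_id, _, barcode = message.partition(":")
    return device_id, barcode

# Hypothetical example message; the real ID is whatever the rpi sends.
device_id, barcode = parse_incoming("kitchen-rpi:7038010009457")
```

Splitting on the *first* colon keeps the parsing safe even if a barcode were ever to contain a colon itself.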

The Kolonial API needs to be set up with a user agent and a token, and in order to access a specific user’s cart we also need to get a user session. That’s all handled by the CreateKolonialHttpClientAsync function:

public static async Task<HttpClient> CreateKolonialHttpClientAsync()
{
    var httpClient = new HttpClient();

    // The kolonial API requires a client token and a user agent (supplied by kolonial) to work
    httpClient.DefaultRequestHeaders.Add("X-Client-Token", ConfigurationManager.AppSettings["KolonialToken"]);
    httpClient.DefaultRequestHeaders.Add("User-Agent", ConfigurationManager.AppSettings["KolonialUserAgent"]);

    // Modifying the cart requires a valid, active session
    string sessionId = await GetSessionIdAsync(httpClient);

    // Ensure the session cookie is sent as part of the calls to the API.
    httpClient.DefaultRequestHeaders.Add("Cookie", $"sessionid={sessionId}");
    return httpClient;
}

public static async Task<string> GetSessionIdAsync(HttpClient httpClient)
{
    // The session cookie ID is retrieved by passing an object like { username: "something", password: "something-secure" }
    // to the user/login endpoint
    var result = await httpClient.PostAsync("",
        new StringContent(
            JsonConvert.SerializeObject(new { username = "an-email-address", password = "a-password" }),
            Encoding.UTF8, "application/json"));

    var json = await result.Content.ReadAsStringAsync();
    var response = JsonConvert.DeserializeObject<LogInResponse>(json);
    return response.sessionid;
}

As can be seen in the gist above, configuration values are handled just as in regular .net code, by utilizing the ConfigurationManager. The settings themselves are set via the function app settings:

Navigating to the app settings: Function app settings -> Configure app settings…
…and then adding the setting as usual (you’ll recognize the settings blade from ordinary azure web apps).

Once the connection/session to Kolonial is set up, we attempt to get the product by its bar code. I’ve separated the “get product from Kolonial and transform it to a model I need” part into a separate HTTP triggered Azure function, which I’ll cover later, so there’s not a whole lot of logic needed: if the function returns non-null JSON, the barcode is a valid Kolonial product, which is returned; if not, we return null.

public static async Task<KolonialProduct> GetKolonialProductAsync(HttpClient client, string barcode)
{
    var httpResult = await client.GetAsync(ConfigurationManager.AppSettings["GetKolonialProductUri"] + "&barcode=" + barcode);
    if (httpResult.IsSuccessStatusCode)
    {
        var json = await httpResult.Content.ReadAsStringAsync();
        if (json != null)
            return JsonConvert.DeserializeObject<KolonialProduct>(json);
    }
    return null;
}

As can be seen in the Run method, all that’s left to do when the product exists, is to add it to the cart. This is done by POSTing to the /cart/items endpoint:

public static async Task AddProductToCartAsync(HttpClient httpClient, KolonialProduct kolonialProduct, TraceWriter log)
{
    var productsJson = JsonConvert.SerializeObject(
        new { items = new[] { new { product_id = kolonialProduct.Id, quantity = 1 } } });
    log.Info($"Updating Kolonial with {productsJson}");
    var response = await httpClient.PostAsync("",
        new StringContent(productsJson, Encoding.UTF8, "application/json"));
}

That’s all there is to it.

Dev notes

I tried setting the project up in Visual Studio, but the development experience for Azure Functions leaves a lot to be desired (the tools are still in beta), so I ended up coding the function in the function portal.

Testing a storage queue triggered function is actually pretty easy. I used the Azure Storage Explorer to manually add entries to the queue when developing.

When working with REST APIs, I like to have strongly typed models to work with. An easy way to create them is to paste example JSON responses into an online JSON-to-C# converter, which will generate the C# classes for you.

A shopping list? We have systems for that

With a barcode scanner, a raspberry pi, a few lines of code and Microsoft Azure, our household’s shopping list is now updated as we run out of groceries.

A few months ago I bought a raspberry pi, since I somewhat naively thought I would have the time and inclination to build a Magic Mirror. That illusion was shattered the moment I realized it would involve one part coding/software and ninety-nine parts fine carpentry, so the rpi was left in a drawer.

That changed very quickly, though, when I came across a blog post from 2013 about Oscar, a system which automatically updates a shopping list on Trello with the help of a barcode scanner and a bit of coding.

Our household became customers of the online grocery stores early on, so my idea was to combine the principle behind Oscar with an existing online store. No sooner thought than done: one barcode scanner mounted in the kitchen and a few lines of code later, we have a system which automatically updates our shopping cart when we scan groceries, either when we see we need more or when the empty packaging goes in the bin.

This also became a golden opportunity to explore Azure Functions before I use them in actual customer projects – the hope is to publish a small series of blog posts covering the various code and architecture choices the solution consists of.



The barcode scanner itself I bought from a more or less randomly chosen seller on eBay. It is a USB variant which connects to the rpi in the usual way and registers as an input device:


As the screenshot above shows, I chose Python as the development language on the raspberry. I usually find myself on the Microsoft stack in my day-to-day work, so the choice was mostly for variety’s sake, and to use Python for something other than saying hello to the world.

A bit of quick googling revealed that evdev was the natural choice for reading input from a device, and it didn’t take many minutes to put together a small snippet which reads all characters up to a newline and collects them into a barcode.

The first version (that is, the current, and only, version) pushes the barcodes to an Azure Storage Queue, and an Azure function takes over from there. The code below falls into the category “code that works”, by the way, but hardly “dogmatic and structurally correct Python” – take it for what it is.

import evdev
from evdev import *
from azure.storage.queue import QueueService, QueueMessageFormat
import threading
import time
from queue import *
import datetime

# responsible for uploading the barcodes to the azure storage queue.
class BarcodeUploader:
    def __init__(self):
        # Instantiate the azure queue service (from the azure-storage package)
        self.queue_service = QueueService(account_name='wereoutof', account_key='your-key-here')
        # azure functions is _very_ confused if the text isn't base64 encoded
        self.queue_service.encode_function = QueueMessageFormat.text_base64encode
        # use a simple queue to avoid blocking operations
        self.queue = LifoQueue()
        t = threading.Thread(target=self.worker, args=())
        t.daemon = True
        t.start()

    # processes all messages (barcodes) in queue - uploading them to azure one by one
    def worker(self):
        while True:
            while not self.queue.empty():
                barcode = self.queue.get()
                try:
                    self.queue_service.put_message('barcodes', u'account-key:' + barcode)
                except Exception as exc:
                    print("Exception occurred when uploading barcode:" + repr(exc))
                    # re-submit task into queue
                    self.queue.put(barcode)
                    continue
                print("Barcode " + barcode + " registered")
            time.sleep(1)

    def register(self, barcode):
        print("Registering barcode " + barcode + "...")
        self.queue.put(barcode)

current_barcode = ""

# Reads barcode from "device"
def readBarcodes():
    global current_barcode
    print("Reading barcodes from device")
    for event in device.read_loop():
        if event.type == evdev.ecodes.EV_KEY and event.value == 1:
            keycode = categorize(event).keycode
            if keycode == 'KEY_ENTER':
                uploader.register(current_barcode)
                current_barcode = ""
            else:
                current_barcode += keycode[4:]

# Finds the input device with the name "Barcode Reader ".
# Could and should be parameterized, of course. Device name as cmd line parameter, perhaps?
def find_device():
    device_name = 'Barcode Reader '
    devices = [evdev.InputDevice(fn) for fn in evdev.list_devices()]
    for d in devices:
        if d.name == device_name:
            print("Found device " + d.name)
            return d
    return None

# Find device...
device = find_device()
if device is None:
    print("Unable to find the barcode reader")
    exit(1)
#... instantiate the uploader...
uploader = BarcodeUploader()
# ... and read the bar codes.
readBarcodes()

As you can see, the snippet doesn’t do much: it reads digits until “enter” is pressed and sends them (that is, the barcode) to Azure via the storage queue. In this first, rudimentary version, a queue is used as the retry mechanism.
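The base64 requirement noted in the code comment is worth calling out: the Functions runtime expects the body of a string-bound queue message to be base64-encoded text, which is what QueueMessageFormat.text_base64encode arranges. In isolation, the encoding amounts to roughly this (a simplified sketch, not the library’s actual implementation):

```python
import base64

def encode_queue_message(text):
    """UTF-8 encode the message text, then base64 encode it,
    which is roughly what text_base64encode does before upload."""
    return base64.b64encode(text.encode("utf-8")).decode("ascii")

encoded = encode_queue_message("kitchen-rpi:7038010009457")
```

If the message is uploaded without this step, the queue-triggered function fails to decode it and the message eventually lands in the poison queue.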

The script is started at boot as a cron job:

# m h dom mon dow command
@reboot sh /home/pi/ > /home/pi/logs/cronlog 2>&1

The shell script referenced above sets up the Python environment and starts the barcode reader script.

#!/usr/bin/env bash
cd /home/pi/Devel/barcode_reader/

Next time I’ll show how this is handled on the receiving side, with an Azure Function storage queue trigger (that is, a function which executes every time something is added to a specific queue).