Azure Functions – storage queue trigger

As I described in the previous blog post (in Norwegian), I’ve spent a few hours over the last couple of days setting up a barcode scanner in our kitchen. The scanner is attached to a Raspberry Pi, and when a barcode is scanned, it is pushed to an Azure Storage queue and the product eventually ends up in our kolonial.no shopping cart.

I’m using the project to familiarize myself with Azure Functions before utilizing them in customer projects, and I decided to use a storage queue triggered function to process the incoming barcodes. Read more about storage queues at microsoft.com; they’re basically lightweight, high-volume, low-maintenance distributed queues.

Creating the function

Before coding the function, it has to be set up as part of an Azure Function app. An Azure Function app is basically a special version of a web app, and it can be created either via the Azure Functions Portal or through the regular Azure portal.

Side note: While working on this, the continuous deployment of a totally unrelated app in the same Azure subscription suddenly started failing when a PowerShell command tried to switch deployment slots, with the error message

Requested value 'Dynamic' was not found.

This had me scratching my head for quite some time, but some googling revealed that adding a function app to an existing subscription will (may?) break PowerShell functionality. The fix was to completely delete the function app. YMMV.

Once the app is set up, it’s time to create the function. As mentioned, we want an Azure Storage queue triggered function, and I opted for C#:

create-azure-function-storage-trigger.png

Selecting the trigger type will reveal a few options below the “Choose a template” grid:

name-function

Here we give the function a descriptive name and enter the name of an existing queue (or a new one) in the “Queue name” field. The storage account connection field is a drop-down of the available storage accounts; it obviously needs to be set to the storage account we want the queue to be stored in. Once we click Create, the function is added to the function app, and it will be triggered (executed) every time a new message is added to the storage queue “incoming-barcodes”. This configuration (queue name, storage account) can be changed at any time by clicking “Integrate” beneath the function name in the function portal:

integrate.png

The next step is to actually write the function. In this first version, everything is done in one function call, and we’re only covering the happy path: we assume the kolonial.no account exists, that the password is correct and that the product exists. If not, the message either ends up in the poison queue or we simply log a warning. A natural next step would be to alert the user when an error occurs, but that’s for another day.

The default entry point for the function is a static method called “Run” (I realize that RunAsync would be more correct with regard to naming async methods, but I’m sticking as close as I can to the defaults):
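A minimal sketch of the overall shape (GetProductAsync and AddProductToCartAsync are placeholder names of my own, not necessarily the ones in the actual gist):

#r "Newtonsoft.Json"
using System;
using System.Configuration;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json;

public static async Task Run(string incomingBarcode, TraceWriter log)
{
    log.Info($"Processing queue message: {incomingBarcode}");
    var barcode = new IncomingBarcode(incomingBarcode);
    using (var kolonialClient = await CreateKolonialHttpClientAsync())
    {
        var product = await GetProductAsync(barcode.Barcode);
        if (product == null)
        {
            // Happy path only: unknown barcodes are just logged for now.
            log.Warning($"No product found for barcode {barcode.Barcode}");
            return;
        }
        await AddProductToCartAsync(kolonialClient, product);
    }
}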

First, we extract the Raspberry Pi ID and the barcode from the incoming message with a small data transport class (IncomingBarcode), since the barcode is passed to the function by the Raspberry Pi in the format “rpi-id:barcode”.
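A sketch of what such a class could look like:

public class IncomingBarcode
{
    public IncomingBarcode(string message)
    {
        // Messages arrive as "rpi-id:barcode"; split on the first colon only.
        var parts = message.Split(new[] { ':' }, 2);
        RaspberryId = parts[0];
        Barcode = parts[1];
    }

    public string RaspberryId { get; private set; }
    public string Barcode { get; private set; }
}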

The kolonial.no API needs to be set up with a user agent and a token, and in order to access a specific user’s cart we also need to get a user session. That’s all handled by the CreateKolonialHttpClientAsync function:
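Something along these lines; the endpoint path, header names, response shape and setting names below are my assumptions about the kolonial.no API, not taken from the post:

private static async Task<HttpClient> CreateKolonialHttpClientAsync()
{
    // The user agent and token are required on every request.
    var client = new HttpClient { BaseAddress = new Uri("https://kolonial.no/api/v1/") };
    client.DefaultRequestHeaders.Add("User-Agent", ConfigurationManager.AppSettings["Kolonial.UserAgent"]);
    client.DefaultRequestHeaders.Add("X-Client-Token", ConfigurationManager.AppSettings["Kolonial.Token"]);
    // Log in to establish a session for the user whose cart we want to update.
    var credentials = JsonConvert.SerializeObject(new
    {
        username = ConfigurationManager.AppSettings["Kolonial.Username"],
        password = ConfigurationManager.AppSettings["Kolonial.Password"]
    });
    var response = await client.PostAsync("user/login/", new StringContent(credentials, Encoding.UTF8, "application/json"));
    response.EnsureSuccessStatusCode();
    var session = JsonConvert.DeserializeAnonymousType(await response.Content.ReadAsStringAsync(), new { sessionid = "" });
    client.DefaultRequestHeaders.Add("Cookie", "sessionid=" + session.sessionid);
    return client;
}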

As can be seen above, configuration values are handled just as in regular .NET code, by utilizing the ConfigurationManager. The settings themselves are set via the function app settings:

app-settings
Navigating to the app settings: Function app settings -> Configure app settings…
appsettings
…and then adding the setting as usual (you’ll recognize the settings blade from ordinary Azure web apps).

Once the connection/session to kolonial.no is set up, we attempt to get the product by its barcode. I’ve separated the “get the product from kolonial.no and transform it to the model I need” part into a separate HTTP triggered Azure Function, which I’ll cover later, so there isn’t a whole lot of logic needed here: if the function returns non-null JSON, the barcode is a valid kolonial.no product, which is returned; if not, we return null.
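A sketch of that lookup; the “ProductLookup.Url” app setting and the KolonialProduct class are hypothetical:

public class KolonialProduct
{
    public int Id { get; set; }
    public string Name { get; set; }
}

private static async Task<KolonialProduct> GetProductAsync(string barcode)
{
    // "ProductLookup.Url" would hold the URL of the HTTP triggered lookup
    // function (including its code/key query parameter).
    using (var client = new HttpClient())
    {
        var json = await client.GetStringAsync(ConfigurationManager.AppSettings["ProductLookup.Url"] + "&barcode=" + barcode);
        // Non-null JSON means the barcode matched a kolonial.no product.
        return string.IsNullOrWhiteSpace(json) || json == "null"
            ? null
            : JsonConvert.DeserializeObject<KolonialProduct>(json);
    }
}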

As can be seen in the Run method, all that’s left to do when the product exists is to add it to the cart. This is done by POSTing to the /cart/items endpoint:
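A sketch of that POST; the payload shape is an assumption, as the post only names the endpoint:

private static async Task AddProductToCartAsync(HttpClient kolonialClient, KolonialProduct product)
{
    var payload = JsonConvert.SerializeObject(new { items = new[] { new { product_id = product.Id, quantity = 1 } } });
    var response = await kolonialClient.PostAsync("cart/items/", new StringContent(payload, Encoding.UTF8, "application/json"));
    response.EnsureSuccessStatusCode();
}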

That’s all there is to it.

Dev notes

I tried setting the project up in Visual Studio, but the development experience for Azure Functions leaves a lot to be desired (the tools are still in beta), so I ended up coding the function in the function portal.

Testing a storage queue triggered function is actually pretty easy. I used the Azure Storage Explorer to manually add entries to the queue when developing.

When working with REST APIs, I like to have strongly typed models to work with. An easy way to create them is to paste example JSON responses into http://json2csharp.com/, which will create the C# classes for you.

A shopping list? We have systems for that sort of thing

With a barcode scanner, a Raspberry Pi, a few lines of code and Microsoft Azure, our household’s shopping list on kolonial.no is now updated as we run out of groceries.

A few months ago I bought a Raspberry Pi, since I somewhat naively believed I would have the time and the inclination to build a Magic Mirror. That illusion was shattered the moment I realized it would involve one part coding/software and 99 parts fine carpentry, so the Pi ended up in a drawer.

That changed very quickly, though, when I came across a blog post from 2013 about Oscar, a system that automatically updates a shopping list on Trello with the help of a barcode scanner and a bit of coding.

Our household was an early customer of the online grocery stores, so my idea was to combine the principle behind Oscar with an existing online store. No sooner thought than done: one barcode scanner mounted in the kitchen and a few lines of code later, we have a system that automatically updates our shopping cart on kolonial.no when we scan groceries, either when we notice we’re running low or when the empty packaging goes into the bin.

At the same time, this was a golden opportunity to explore Azure Functions before I use them in actual customer projects – the hope is to publish a small series of blog posts shedding light on the various code and architecture choices the solution consists of.

The barcode scanner

barcode-scanner

I bought the barcode scanner itself from a more or less randomly chosen seller on eBay. It’s a USB model that is connected to the Pi in the ordinary way and registers as an input device:

python-list-devices

As the screenshot above shows, I chose Python as the development language on the Raspberry Pi. I usually find myself on the Microsoft stack in my day-to-day work, so the choice was mostly for variety’s sake, and to use Python for something other than saying hello to the world.

A bit of quick googling revealed that evdev is a natural choice when it comes to reading input from a device, and it didn’t take many minutes to put together a small snippet that reads all characters up to a line break and assembles them into a barcode.

The first version (that is, the current and only version) pushes the barcodes to an Azure Storage queue, and an Azure Function takes over from there. The code below falls into the category “code that works”, by the way, but hardly “dogmatic and structurally correct Python” – take it for what it is.
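A minimal sketch of the idea, not the actual script; the device path, the rpi id and the storage account details are placeholders, and I’m assuming the azure.storage package of the time:

#!/usr/bin/env python
# Sketch: read digits from the scanner until enter, then push
# "rpi-id:barcode" to an Azure Storage queue.
from evdev import InputDevice, ecodes
from azure.storage.queue import QueueService

DEVICE_PATH = '/dev/input/event0'  # assumed path; find the scanner by listing input devices
QUEUE_NAME = 'incoming-barcodes'
RPI_ID = 'kitchen-rpi'             # hypothetical id identifying this scanner

# A USB scanner acts as a keyboard; map the digit key codes to characters.
DIGITS = {getattr(ecodes, 'KEY_%d' % d): str(d) for d in range(10)}

queue_service = QueueService(account_name='<account>', account_key='<key>')
device = InputDevice(DEVICE_PATH)

barcode = ''
for event in device.read_loop():
    if event.type != ecodes.EV_KEY or event.value != 1:  # key-down events only
        continue
    if event.code == ecodes.KEY_ENTER:
        # The function on the receiving end expects "rpi-id:barcode".
        queue_service.put_message(QUEUE_NAME, '%s:%s' % (RPI_ID, barcode))
        barcode = ''
    elif event.code in DIGITS:
        barcode += DIGITS[event.code]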

As you can see, the snippet doesn’t do much: it reads digits until “enter” and sends them (that is, the barcode) to Azure via the azure.storage package. In this first, rudimentary version, a queue is used as the retry mechanism.

The script is started at boot as a cron job:

# m h dom mon dow command
@reboot sh /home/pi/launch_barcode_reader.sh > /home/pi/logs/cronlog 2>&1

launch_barcode_reader.sh sets up the Python environment and starts the script.

#!/usr/bin/env bash
cd /home/pi/Devel/barcode_reader/
/home/pi/.virtualenvs/barcode_reader/bin/python reader.py

Next time I’ll show how this is handled on the receiving end, with an Azure Function storage queue trigger (that is, a function that executes every time something is added to a specific queue).

Experiences with Paypal Adaptive Payments API

I just finished a small project involving PayPal’s Adaptive Payments API (and the NuGet package they supply for it). The points below are things I spent too much time on; hopefully this can save someone the trouble.

First of all, I got the error “Your payment can’t be completed. Please return to the participating website and try again.” after completing the payment as a test user in the sandbox. This was solved by creating an application ID for the application and including it in the signature credentials I pass to PayPal. I think this happened after I switched to the SignatureCredential implementation. Its constructor does not accept the Application ID, but the property can be set afterwards.
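In code, the fix looks something like this (assuming the SignatureCredential type from PayPal’s .NET SDK; the variable names are my own):

// The constructor only takes the API username, password and signature;
// the application id must be set through the property afterwards.
var credential = new SignatureCredential(apiUsername, apiPassword, apiSignature)
{
    ApplicationId = applicationId
};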

The second stumbling block was a generic exception (“Input string was not in a correct format.”) when passing the payment info to PayPal. It turns out that the Adaptive Payments package assumes that a point is the universal decimal mark – “100.00” is OK, while “100,00”, which is my locale’s preferred way of expressing a hundred, does not compute.
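One way to guard against this in .NET is to format amounts with the invariant culture, so the decimal mark is always a point:

using System.Globalization;

// Always produces "100.00", regardless of the thread's current culture.
var formattedAmount = amount.ToString("0.00", CultureInfo.InvariantCulture);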

OData Web APIs with AutoMapper 3

When developing an API on top of a domain layer, we rarely want to expose the actual domain objects to the API consumers. Rather, it is usually a matter of presenting the consumers with a subset of the domain object’s properties or a DTO/model object representing a composite of multiple domain objects.

Although the combination of OData and Entity Framework does provide some control over the presentation of the objects returned, it quickly falls apart when more advanced combinations and composites are needed. This meant that, to me, OData via .NET Web API was not a viable alternative in most of my real world (read: customer) projects.

Enter AutoMapper. It has been an invaluable part of my development arsenal for quite some time, and the introduction of LINQ functionality in its latest incarnation makes an already awesome library even better. The LINQ support means that AutoMapper no longer has to materialize the source objects completely; it skips any properties which aren’t needed for the mapping to the destination type. In other words, sensibly designed DTOs and some careful mapping configuration are all that’s needed to create an effective OData API.

Example

Given the domain object below:

[gist https://gist.github.com/JoachimL/6387670 /]

An ordinary domain object, containing a couple of properties we are not likely to want to expose over an OData API. It would be a horrible idea to expose the Image byte array, and there’s no need to expose the user who added the movie to the database, either. For this example’s sake, we will limit the OData presentation of a movie to its Id, Title and Year of Release.

[gist https://gist.github.com/JoachimL/6387685 /]
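In essence, the model is limited to those three properties; a minimal sketch (the property types are my assumptions):

public class MovieModel
{
    public int Id { get; set; }
    public string Title { get; set; }
    public int YearOfRelease { get; set; }
}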

The mapping profile needs to be configured, and that’s typically done in a separate class inheriting from AutoMapper.Profile. AutoMapper does a good job of matching and mapping properties which share names, but it has to be told that we just want the year part of the release date: [gist https://gist.github.com/JoachimL/6387662 /]
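In AutoMapper 3, that configuration could look roughly like this, assuming the domain object exposes the release date as a DateTime property named ReleaseDate:

public class MovieMappingProfile : Profile
{
    protected override void Configure()
    {
        // Id and Title are matched by name; YearOfRelease needs an explicit projection.
        CreateMap<Movie, MovieModel>()
            .ForMember(model => model.YearOfRelease,
                       options => options.MapFrom(movie => movie.ReleaseDate.Year));
    }
}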

When that’s done, we merely need to set up an OData controller to deliver Movie objects over OData, and create a service and repository to bring the objects from the database to the controller. The service, in this example called “MovieService”, fetches the domain objects from an instance of the MovieRepository class. This layering might seem a bit contrived in our simple example, but it should prove that this technique is viable in a real world multi-layered architecture.

[gist https://gist.github.com/JoachimL/6387700 /] [gist https://gist.github.com/JoachimL/6387705 /]

I will skip the configuration of the actual OData endpoint, as I have more or less copied it verbatim from http://msdn.microsoft.com/en-us/magazine/dn201742.aspx. In addition, a Visual Studio 2012 project containing a runnable version of the code in this article is available on GitHub: https://github.com/JoachimL/WebApi.OData.

The most interesting part of this example, at least for those of us familiar with AutoMapper v2, is this line in the service: [gist https://gist.github.com/JoachimL/6387847 /]
In version 2, it would most likely look like this instead: [gist https://gist.github.com/JoachimL/6387854 /]
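For reference, the difference boils down to something like this (the repository field name is my own):

// AutoMapper 3: the projection becomes part of the IQueryable, so Entity Framework
// only selects the columns MovieModel needs (requires AutoMapper.QueryableExtensions).
var models = _movieRepository.GetAll().Project().To<MovieModel>();

// AutoMapper 2: the full domain objects are materialized before being mapped.
var oldStyleModels = Mapper.Map<IEnumerable<Movie>, IEnumerable<MovieModel>>(
    _movieRepository.GetAll().ToList());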

The difference is the Project().To() syntax, which ensures that only the property values needed for the mapping are retrieved from the domain objects (and, by extension, the database). The OData endpoint now returns the Title and the Year of Release:

All movies in the database

SQL Express Profiler proves that the LINQ/AutoMapper/Entity Framework combination leads to an SQL query fetching just the properties needed:

LINQ-generated SQL.

If we try to fetch all Movies starting with a “D”, this is what happens:

movies_whose_filter_start_with_d

And the SQL generated to fetch the objects is still as slim as possible.

How about adding elements to the Movie model which aren’t part of the Movie domain object? No problem, we’ll first extend the Model:

[gist https://gist.github.com/JoachimL/6387670 /]

And then give AutoMapper a hand setting up the properties it’s not able to figure out itself:

[gist https://gist.github.com/JoachimL/6387662 /]

This leads to more data being returned when requesting the same URL as before:

movies_d_with_actors_and_ratings

And even though the SQL’s complexity increases, it’s still limited to the properties/fields it actually needs:

movies_d_with_actors_and_ratings_sql

Hopefully, these examples will give you an idea of how Entity Framework (or another ORM) combined with AutoMapper makes for a powerful and productive toolset. If OData fits the use case, devs can focus on the throttling and security side of things, and let the API consumers decide for themselves how they search.

There are some security issues, not covered in this article, which must be addressed in a production system. There is an abundance of advice and tutorials on that topic all around the internet; http://www.asp.net/web-api/overview/odata-support-in-aspnet-web-api/odata-security-guidance is a good starting point.

I have not yet had the time to experiment with how the OData $expand option can be used within the context of EF/LINQ and AutoMapper.

References:

https://github.com/AutoMapper/AutoMapper/wiki

https://github.com/AutoMapper/AutoMapper/wiki/Queryable-Extensions

http://www.asp.net/web-api/overview/odata-support-in-aspnet-web-api

http://expressprofiler.codeplex.com/

* For a discussion on the topic, see http://stackoverflow.com/questions/16962081/asp-net-webapi-odata-support-for-dtos.