File upload through ASP.NET Core middleware

In my previous article, we discussed the different options for implementing file upload in cloud applications.

In this article, I want to show you an example of how to implement a file upload through a middleware:

[Image: file upload through a middleware]

Example: File Upload to Azure Blob Storage using Angular and ASP.NET Core

We will scaffold our application using the Angular template that is part of the .NET Core CLI and create a component using the Angular CLI.

Prerequisites

* You can create the Storage Account either manually within the Azure Portal or by using an ARM template. A minimal sketch of such a template is shown below (the account name is a placeholder and must be globally unique):
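{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "resources": [
        {
            "type": "Microsoft.Storage/storageAccounts",
            "name": "myuniquestorageaccountname",
            "apiVersion": "2018-02-01",
            "location": "[resourceGroup().location]",
            "sku": {
                "name": "Standard_LRS"
            },
            "kind": "StorageV2",
            "properties": {}
        }
    ]
}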

Scaffold the project

To scaffold the project we use the dotnet new command:

dotnet new angular --name file-upload

Implement the middleware

To store the uploaded files in Azure Blob Storage, we need to specify the connection string within the appsettings.json:

"ConnectionStrings": {
    "StorageAccount": "UseDevelopmentStorage=true"
    },

If you don’t have the Azure Storage Emulator installed, you have to replace the value with an actual Azure Storage Account connection string.
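For reference, a real connection string has the following shape (placeholders in angle brackets; the values can be found under "Access keys" on the Storage Account in the portal):

DefaultEndpointsProtocol=https;AccountName=<account-name>;AccountKey=<account-key>;EndpointSuffix=core.windows.net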

Install WindowsAzure.Storage NuGet Package

You can install the NuGet package using the following command:

dotnet add package WindowsAzure.Storage

File upload implementation

We will implement the file upload in a new controller called AssetController. The controller only exposes a single method called UploadAssetAsync which takes an IFormFile with the name asset (note: IFormFile is suitable for uploading small files; if you have to deal with large files, you have to consider implementing streaming or uploading the files directly from the client to a data store. Further information here.).

The UploadAssetAsync method uploads the passed file to the previously specified Azure Blob Storage and returns the URI of the new blob:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Configuration;
using Microsoft.WindowsAzure.Storage;

namespace file_upload.Controllers
{
    [Route("api/[controller]")]
    public class AssetController : Controller
    {
        private readonly IConfiguration _configuration;
        public AssetController(IConfiguration config)
        {
            _configuration = config;
        }

        [HttpPost]
        public async Task<IActionResult> UploadAssetAsync([FromForm]IFormFile asset)
        {
            CloudStorageAccount storageAccount = null;
            if (CloudStorageAccount.TryParse(_configuration.GetConnectionString("StorageAccount"), out storageAccount))
            {
                var client = storageAccount.CreateCloudBlobClient();
                var container = client.GetContainerReference("fileupload");
                await container.CreateIfNotExistsAsync();

                var blob = container.GetBlockBlobReference(asset.FileName);
                await blob.UploadFromStreamAsync(asset.OpenReadStream());

                return Ok(blob.Uri);
            }

            return StatusCode(StatusCodes.Status500InternalServerError);
        }
    }
}

Disable HttpsRedirection in Development

Before we can run the application, we should disable the HTTPS redirection for the development environment (note: alternatively, you can install a localhost certificate). Otherwise, we will get a warning that the site is not secure. This can be done by replacing the following line in the Configure method within the Startup.cs

app.UseHttpsRedirection();

with:

if (!env.IsDevelopment())
{
   app.UseHttpsRedirection();
}

Implement the frontend

We will implement the file upload in a new component. To create the new component we use the Angular CLI:

ng g component fileUpload

Note: Ensure you invoke the CLI commands within the ClientApp directory.

Configure Routing

Now we add routing to our new component within the app.module.ts:

{ path: 'file-upload', component: FileUploadComponent },

Then we add a link to the component inside the nav-menu.component.html:

<li>
    <a [routerLink]='["/file-upload"]'>
        <span class='glyphicon glyphicon-cloud-upload'></span> File Upload
    </a>
</li>

Install PrimeNG

We use the PrimeNG NPM package to implement the file upload:

npm install primeng --save 
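Besides installing the package, PrimeNG's FileUploadModule has to be registered in the app.module.ts, next to the route we added earlier. A sketch (the module path follows PrimeNG's import convention; adjust it to your installed version):

import { FileUploadModule } from 'primeng/fileupload';

@NgModule({
  // ...existing declarations and providers...
  imports: [
    // ...existing imports, including the RouterModule.forRoot call
    // with the file-upload route shown above...
    FileUploadModule
  ]
})
export class AppModule { }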

Add the file upload

The last thing we have to do is to add the file upload to the file-upload.component.html. The name attribute value must match the IFormFile parameter name in the middleware (in our case “asset“). The url is /api/Asset, which is the address of the UploadAssetAsync web method:

 
<p-fileUpload #fubauto mode="basic" name="asset" url="/api/Asset" maxFileSize="1000000" auto="true"
        chooseLabel="Browse"></p-fileUpload>

That is it. We don’t need to implement any further upload mechanism. To start the application we can use the dotnet CLI:

dotnet run
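If you want to test the endpoint without the frontend, you can also post a file directly, for example with curl (a sketch; adjust the port to the one dotnet run reports and the file name to an existing file):

curl -F "asset=@./test.png" http://localhost:5000/api/asset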

The source code can be found in my GitHub repository.

File upload in Cloud Applications: The Options

Almost every web application requires some form of file upload.  You may want to allow a user to upload a profile picture or to import any kind of data.

Multiple ways to implement the file upload

Depending on the size of the files and the frequency of uploads, you have two options for implementing the upload:

Directly upload the file to a data store

The fastest and most resource-friendly way is to upload the file directly from the client to a data store. This typically requires the client to have the security credentials for the data store:

[Image: direct upload to the data store]

But handing security credentials to potentially untrusted clients isn’t a realistic approach for most web applications. Instead, you want to use a token that grants clients restricted access to a specific resource for a limited validity period. This pattern is known as the Valet Key pattern.
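With the WindowsAzure.Storage library used in the example above, issuing such a token boils down to generating a shared access signature (SAS). A minimal sketch, assuming placeholder names for the connection string, container and blob:

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

// Issue a short-lived, write-only SAS URI for a single blob.
var connectionString = "<storage-connection-string>";
var account = CloudStorageAccount.Parse(connectionString);
var container = account.CreateCloudBlobClient().GetContainerReference("fileupload");
var blob = container.GetBlockBlobReference("asset.png");

var policy = new SharedAccessBlobPolicy
{
    Permissions = SharedAccessBlobPermissions.Write,
    SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(15)
};

// The client uploads directly to this URI; the account key never leaves the server.
var sasUri = blob.Uri + blob.GetSharedAccessSignature(policy);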

Upload the file through a middleware

The second option is to upload the file to your middleware (API), which then handles the movement of the data to the data store.

[Image: file upload through a middleware]

This approach prevents us from exposing any information about the underlying data store to the client. We could even change our data store (e.g. from Azure Blob Storage to Azure File Storage) without updating the client.

The downside is that it consumes valuable middleware resources such as compute, memory and bandwidth.

Find outdated Azure ARM QuickStart Templates on GitHub

In my previous post, Determine latest API version for a resource provider, I showed you how to retrieve the latest API version for a specific resource provider using the Get-AzureRmResourceProviderLatestApiVersion cmdlet. In this post I will use the cmdlet to find outdated resource providers within an ARM template. We will also analyze the Azure quickstart templates on GitHub!

To analyze an ARM template, we have to convert the JavaScript Object Notation (JSON) to a custom PowerShell object that has a property for each field in the JSON.
We will use the following lean template for testing purposes (download):

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "resources": [
        {
            "type": "Microsoft.Storage/storageAccounts",
            "name": "myuniquestorageaccountname",
            "apiVersion": "2016-01-01",
            "location": "West Europe",
            "sku": {
                "name": "Standard_LRS"
            },
            "kind": "Storage"
        }
    ]
}

The first thing we have to do is to load the file using the Get-Content cmdlet and pipe the result to the ConvertFrom-Json cmdlet. To retrieve the resources we further pipe the result to the Select-Object cmdlet:


Get-Content leanArmTemplate.json |
        ConvertFrom-Json |
        Select-Object -ExpandProperty resources

This will give us all necessary information like the resource provider and the used API version:

type       : Microsoft.Storage/storageAccounts
name       : myuniquestorageaccountname
apiVersion : 2016-01-01
location   : West Europe
sku        : @{name=Standard_LRS}
kind       : Storage

Now, to finally analyze the template, we iterate over each resource and retrieve the latest API version using the previously mentioned Get-AzureRmResourceProviderLatestApiVersion cmdlet. To get a handy output, we create a new PsCustomObject containing the resource type, the used and the latest API version, and a flag that specifies whether the used API version is the latest:

Get-Content leanArmTemplate.json |
    ConvertFrom-Json |
    Select-Object -ExpandProperty resources |
    ForEach-Object {
        $latestApiVersion = Get-AzureRmResourceProviderLatestApiVersion -Type $_.type
        [PsCustomObject]@{
            Type = $_.type
            UsedApiVersion = $_.apiVersion
            LatestVersion = $latestApiVersion
            Latest = $_.apiVersion -eq $latestApiVersion
        }
    }

This is the output:

Type                              UsedApiVersion LatestVersion Latest
----                              -------------- ------------- ------
Microsoft.Storage/storageAccounts 2016-01-01     2018-02-01     False

In this example, the used API for the resource provider Microsoft.Storage/storageAccounts is not up to date.

For the sake of convenience I wrapped the code into a Get-OutdatedResourceProvider cmdlet and added some special treatment to handle nested resources:

function Get-OutdatedResourceProvider
{
    [CmdletBinding()]
    [Alias()]
    [OutputType([string])]
    Param
    (
        [Parameter(Mandatory = $true, Position = 0)]
        [string]$Path,

        [switch]$IncludePreview
    )
    Get-Content $Path |
        ConvertFrom-Json |
        Select-Object -ExpandProperty resources |
        ForEach-Object {
            $latestApiVersion = Get-AzureRmResourceProviderLatestApiVersion -Type $_.type -IncludePreview:$IncludePreview
            [PsCustomObject]@{
                Type = $_.type
                UsedApiVersion = $_.apiVersion
                LatestVersion = $latestApiVersion
                Latest = $_.apiVersion -eq $latestApiVersion
            }

            # a resource may include nested sub resources
            foreach ($subResource in $_.resources)
            {
                $subResourceType = '{0}/{1}' -f $_.type, $subResource.type
                $latestApiVersion = Get-AzureRmResourceProviderLatestApiVersion -Type $subResourceType -IncludePreview:$IncludePreview
                [PsCustomObject]@{
                    Type = $subResourceType
                    UsedApiVersion = $subResource.apiVersion
                    LatestVersion = $latestApiVersion
                    Latest = $subResource.apiVersion -eq $latestApiVersion
                }
            }
        }
}
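For example, running it against the lean template from above:

Get-OutdatedResourceProvider -Path .\leanArmTemplate.json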

🕵️‍♂️🕵️‍♂️🕵️‍♂️ Now it’s time to analyze some templates  🕵️‍♂️🕵️‍♂️🕵️‍♂️.

Let’s clone the whole azure-quickstart-templates Git repository:

git clone https://github.com/Azure/azure-quickstart-templates.git

Then we can filter all ARM templates using the Get-ChildItem cmdlet:

$armTemplates = Get-ChildItem 'azure-quickstart-templates' -Filter 'azuredeploy.json' -Recurse

To analyze the templates, we iterate over each one and pass it to the Get-OutdatedResourceProvider cmdlet created above. Because there might be some invalid templates, we first try to parse the JSON and only analyze the templates that parse successfully:

$invalidTemplates = @()
$analyzedTemplates = $armTemplates |
    ForEach-Object {
        $template = $_
        try
        {
            $null = $template | Get-Content | ConvertFrom-Json
            Get-OutdatedResourceProvider -Path $template.FullName
        }
        catch
        {
            $invalidTemplates += $template
        }
    }

Finally we gather and output some metrics:

$validAnalyzes = $analyzedTemplates | Where-Object LatestVersion
$uptodateproviderCount = $validAnalyzes | Where-Object Latest -eq $true | Measure-Object | Select-Object -ExpandProperty Count
$outdatedproviderCount = $validAnalyzes | Where-Object Latest -ne $true | Measure-Object | Select-Object -ExpandProperty Count

Write-Host "Found $($armTemplates.Count) ARM templates in the Azure quickstart repository ($($invalidTemplates.Count) of them are invalid).
They are using $($analyzedTemplates.Count) resource providers. I was able to determine the latest version for $($validAnalyzes.count) resource provider. $uptodateproviderCount of them are using the latest API version whereas $outdatedproviderCount are using an outdated API." -ForegroundColor Cyan

Here is the output:

Found 675 ARM templates in the Azure quickstart repository (3 of them are invalid).
They are using 3873 resource providers. I was able to determine the latest version for 3288 resource providers. 18 of them are using the latest API version whereas 3270 are using an outdated API.

Note: There is no need to always use the latest API version. You should consider upgrading only if you encounter issues with the resource provider or if you want to use new features that are not part of the currently used API.

You can download the complete script and run it for yourself here.

Determine latest API version for a resource provider

Azure Resource Manager templates are great to deploy one or more resources to a resource group.

A mandatory part of an ARM template is the resources section which defines the resource types that are deployed or updated. Here is an example:

"type": "Microsoft.Storage/storageAccounts",
"name": "[variables('storageAccountName')]",
"apiVersion": "2016-01-01",
"location": "[parameters('location')]",
"sku": {
	"name": "Standard_LRS"
},
"kind": "Storage",
"properties": {}

The name of a resource type has the following format: {resource-provider}/{resource-type}, in this example Microsoft.Storage/storageAccounts. The apiVersion specifies the version of the API that is used to deploy and manage the resource.

A resource provider offers a set of operations to deploy and manage resources using a REST API. If a resource provider adds new features, it releases a new version of the API. Therefore, if you want to utilize new features, you may have to use the latest API version. To determine the latest version of a resource type you can use the following Get-AzureRmResourceProviderLatestApiVersion cmdlet:

function Get-AzureRmResourceProviderLatestApiVersion
{
    [CmdletBinding()]
    [Alias()]
    [OutputType([string])]
    Param
    (
        [Parameter(ParameterSetName = 'Full', Mandatory = $true, Position = 0)]
        [string]$Type,

        [Parameter(ParameterSetName = 'ProviderAndType', Mandatory = $true, Position = 0)]
        [string]$ResourceProvider,

        [Parameter(ParameterSetName = 'ProviderAndType', Mandatory = $true, Position = 1)]
        [string]$ResourceType,

        [switch]$IncludePreview
    )

    # retrieving the resource providers is time consuming therefore we store
    # them in a script variable to accelerate subsequent requests.
    if (-not $script:resourceProvider)
    {
        $script:resourceProvider = Get-AzureRmResourceProvider   
    }

    if ($PSCmdlet.ParameterSetName -eq 'Full')
    {
        $ResourceProvider = ($Type -replace "\/.*")
        $ResourceType = ($Type -replace ".*?\/(.+)", '$1')
    }
    
    $provider = $script:resourceProvider |
        Where-Object ProviderNamespace -eq $ResourceProvider

    # select the matching resource type so we don't mix API versions
    # across the provider's other resource types
    $resourceTypeInfo = $provider.ResourceTypes |
        Where-Object ResourceTypeName -eq $ResourceType

    if ($IncludePreview)
    {
        # ApiVersions is sorted descending, so the first entry is the latest
        $resourceTypeInfo.ApiVersions | Select-Object -First 1
    }
    else
    {
        $resourceTypeInfo.ApiVersions | Where-Object {
            $_ -notmatch '-preview'
        } | Select-Object -First 1
    }
}

You can now determine the latest API Version using either the full type:

Get-AzureRmResourceProviderLatestApiVersion -Type Microsoft.Storage/storageAccounts

Or by passing the Resource Provider and Resource Type:

Get-AzureRmResourceProviderLatestApiVersion -ResourceProvider Microsoft.Storage -ResourceType storageAccounts

Finally, to include preview versions you can append the -IncludePreview switch:

Get-AzureRmResourceProviderLatestApiVersion -Type Microsoft.Storage/storageAccounts -IncludePreview

You can download the script from my GitHub repository.

Additional information: Resource providers and types

Three reasons why you should associate multiple subscriptions with the same Azure Active Directory

In Azure, multiple subscriptions can trust the same Azure Active Directory but each subscription trusts only one directory.

If you create a new Azure subscription, a new Azure Active Directory is automatically created and associated with it. To grant a user access to a resource you can use Role-Based Access Control (RBAC), given that the user is part of the associated Azure Active Directory. You can also add existing users from another Azure Active Directory as guests, but I would still recommend linking your subscriptions with the same directory, for the following three reasons:

  1. If you use a different directory for each subscription, you won’t be able to move resources between your subscriptions:

    The source and destination subscriptions must exist within the same Azure Active Directory tenant.

  2. You can easily jump to your resources in the “All resources” blade by using the “Filter by name” search field and don’t have to remember which resource belongs to which subscription:
    [Screenshot: “Filter by name” search field in the All resources blade]
  3. If your user is a guest in many directories, your tenant list will grow and switching directories will become a mess:
    [Screenshot: directory switcher listing many tenants]

 

Read here how to associate or add an Azure subscription to Azure Active Directory

Configure Azure Cloud Shell to use a profile hosted on GitHub

You may have noticed that you can run the Azure Cloud Shell outside the portal as a separate component at https://shell.azure.com.

The shell is really handy since it can be used from everywhere. Today I want to show you how to load a remote profile hosted on GitHub into the Azure Cloud Shell.

A PowerShell profile is used to add aliases, functions or variables to a session every time you start the shell.

The Azure Cloud Shell uses a file share in your storage account to persist files. This also applies to your profile. You can determine your profile path by entering:

$profile

You will see a path similar to this:

[Screenshot: the profile path returned by $profile]

This doesn’t mean that the profile exists; it’s just the path from which the Azure Cloud Shell tries to load your profile when you start it. You can determine whether the file actually exists using the Test-Path cmdlet:

Test-Path $profile

The cmdlet should return false if you haven’t created a profile yet:

[Screenshot: Test-Path $profile returning False]

You could create a profile using the New-Item cmdlet and go to your file share to edit it. But you may like to have a history where you can compare the changes you made. You may also want to use the same profile for different accounts. So how can we connect a profile that is stored in a GitHub repository?

Let’s start by adding the actual profile to our GitHub repository. My profile.ps1 contains a single function to print “Hello World”:

function Show-HelloWorld
{
    Write-Host "hello, world!"
}

Next we have to load the profile. For that purpose I have created another file called Set-Profile.ps1:

$profilePath = 'https://raw.githubusercontent.com/mjisaak/azure/master/profile.ps1'

$downloadString = '{0}?{1}' -f $profilePath, (New-Guid)
Invoke-Expression ((New-Object System.Net.WebClient).DownloadString($downloadString))

The $profilePath contains the URL to the previously created profile.ps1. I append a query string containing a random GUID to the path to prevent the web client from caching the file. This is particularly useful when we update the profile.ps1 in the GitHub repository and want to load the changes without restarting the shell, by dot sourcing the profile.
In line 4, I download the profile.ps1 as a string and execute it using the Invoke-Expression cmdlet to load it into the runspace.

The last step is to set the content of Set-Profile.ps1 as the actual PowerShell profile. We can do this by executing the following snippet in the Azure Cloud Shell:

(New-Object System.Net.WebClient).DownloadString('https://raw.githubusercontent.com/mjisaak/azure/master/Set-Profile.ps1') |
  Set-Content $profile -Force

The snippet uses the web client again, but instead of executing the code, it pipes the string to the Set-Content cmdlet to overwrite the profile. I can verify that by retrieving the content of $profile, which should output the content of my Set-Profile.ps1:

[Screenshot: Get-Content $profile showing the content of Set-Profile.ps1]

Finally, to load the profile, we can either restart the PowerShell session or dot source the profile as mentioned earlier:

. $profile

And now we can use all the aliases, variables and functions we have defined in the profile stored on GitHub:

[Screenshot: Show-HelloWorld printing “hello, world!”]

Using Azure Key Vault in ASP.NET Core 2.0 with the options pattern

The best way to store secrets in your app is not to store secrets in your app

Almost every web application needs some kind of secrets like a SQL Database connection string or the primary key of a Storage Account in order to communicate with external services.

Certainly, we don’t store these secrets within our source code, since this would expose them to every developer that has access to the code. In Azure, we could store the secrets within the Application Settings in the Azure Portal:

[Screenshot: Application Settings in the Azure Portal]

But if a secret is used in multiple applications and we need to change it (e.g. regenerate a storage account key), we have to do that in multiple places. A better place to store secrets in Azure is the Key Vault.

Instead of storing each secret within our app we store them in the Key Vault and configure our app to access the secrets in the vault. Now we have a single place where we can manage our secrets.

Let’s take a look at how we can access those secrets in an ASP.NET Core 2.0 web application without introducing a dependency on Key Vault in the class that uses it. To create a vault, store secrets in it and create a service principal for the access policy, see Get started with Azure Key Vault.
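If you prefer the command line over the portal, creating the vault and the secret can also be done with the AzureRM PowerShell module. A sketch with placeholder names (vault and resource group):

# Create the vault and store the secret used in this example.
New-AzureRmKeyVault -VaultName 'myvault' -ResourceGroupName 'myresourcegroup' -Location 'West Europe'
$secretValue = ConvertTo-SecureString 'my secret value' -AsPlainText -Force
Set-AzureKeyVaultSecret -VaultName 'myvault' -Name 'TestSecret' -SecretValue $secretValue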

Our secret is stored in a class called ValueSettings:

public sealed class ValueSettings
{
	public string TestSecret { get; set; }
}

There is a ValuesController with one HttpGet method that returns our secret:

[Route("api/[controller]")]
public class ValuesController : Controller
{
    private readonly ValueSettings _valueSettings;

    public ValuesController(IOptions<ValueSettings> valueSettings)
    {
        _valueSettings = valueSettings.Value;
    }

    [HttpGet]
    public IActionResult Get()
    {
        return Ok(_valueSettings.TestSecret);
    }
}

As you can see in line 6, the controller uses the options pattern to inject the actual settings. The controller doesn’t know where the secret is coming from and doesn’t have any dependency on Azure Key Vault.

Now let’s take a look at how we need to configure our application. First, we store the vault settings in our appsettings.json:

{
  "KeyVault": {
    "Vault": "https://myvault.vault.azure.net/",
    "ClientId": "myclientid",
    "ClientSecret": "myclientsecret"
  }
}
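Note that the ClientSecret shown here is itself a secret, so it shouldn’t be committed with the source code either. During development you could keep it out of appsettings.json with the Secret Manager tool (a sketch, assuming a UserSecretsId is configured for the project):

dotnet user-secrets set "KeyVault:ClientSecret" "myclientsecret"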

We also have a class that represents these settings:

public class KeyVaultSettings
{
    public string Vault { get; set; }
    public string ClientId { get; set; }
    public string ClientSecret { get; set; }
}

Now, to configure the Key Vault, we use the AddAzureKeyVault extension method in the Program.cs:

public class Program
{
    public static void Main(string[] args)
    {
        BuildWebHost(args).Run();
    }

    public static IWebHost BuildWebHost(string[] args) =>
        WebHost.CreateDefaultBuilder(args)
            .ConfigureAppConfiguration((context, config) =>
            {
                config.SetBasePath(Directory.GetCurrentDirectory())
                    .AddJsonFile("appsettings.json", optional: false)
                    .AddEnvironmentVariables();

                var builtConfig = config.Build();
                var settings = builtConfig.GetSection("KeyVault").Get<KeyVaultSettings>();

                config.AddAzureKeyVault(
                    settings.Vault, settings.ClientId, settings.ClientSecret);

            })
            .UseStartup<Startup>()
            .Build();
}

And finally, this is what our Startup looks like:

public class Startup
{
    public Startup(IConfiguration configuration)
    {
        Configuration = configuration;   
    }

    public IConfiguration Configuration { get; }

    // This method gets called by the runtime. Use this method to add services to the container.
    public void ConfigureServices(IServiceCollection services)
    {
        services.Configure<ValueSettings>(Configuration);

        services.AddMvc();
    }

    // This method gets called by the runtime. Use this method to configure the HTTP request pipeline.
    public void Configure(IApplicationBuilder app, IHostingEnvironment env)
    {
        if (env.IsDevelopment())
        {
            app.UseDeveloperExceptionPage();
        }

        app.UseMvc();
    }
}

You can download the complete example from my GitHub repository.