Fridathon: unstructured learning or hacking you can opt-in to do on any random Friday.

How to use .NET module initializers in a concrete real use case

Module initializers in C# aren’t a radical new thing for sure, but they were esoteric enough, and they required a NuGet package to work, so they seemed like a sort of unofficial/unsupported feature. Not anymore: they are now an officially supported feature with first-class language support in C# 9!

So the first thing might be to ask: what would I use them for? Learning a new thing in C# just for the sake of it isn’t very productive.

I just came across a scenario where I totally needed this feature: unit tests that run MSBuild!

Initializing MSBuild for tests

It turns out that the right way of doing MSBuild unit tests is to use Microsoft.Build.Locator to select the MSBuild instance to use for the tests. This is the only sane way to get all those targets, imports, tasks and SDKs properly resolved.

Simply enough, you’re supposed to invoke Microsoft.Build.Locator.MSBuildLocator.RegisterMSBuildPath(path); before any MSBuild assemblies are loaded. And you can only call it ONCE. Tricky thing, eh? You can’t really put it in one test class’ static constructor. Maybe in a helper? But what if you forget to call the helper from some test class? Everything breaks, and it will be tough to diagnose. What you really need is something that runs only once for the entire assembly (test assemblies are run in isolation by most (all?) runners). In other words, a Module Initializer!

The basic idea is you create a static class with a static void method, annotate it with [ModuleInitializer], and that’s it. Unless you’re targeting .NET 5, however, you won’t have that attribute type defined anywhere to use. Luckily, you can just declare it in your project and things will Just Work too:

namespace System.Runtime.CompilerServices
{
    [AttributeUsage(AttributeTargets.Method, AllowMultiple = false)]
    public sealed class ModuleInitializerAttribute : Attribute { }
}

In my particular scenario, I want to use the MSBuild path that was used to compile the test project itself, to account for side-by-side installs. So how can my code access an MSBuild property (namely, the MSBuildBinPath property)? Another C# 9 powered feature to the rescue: ThisAssembly.Project source generator! I just need to add the following to the test .csproj:

  <ItemGroup>
    <PackageReference Include="ThisAssembly.Project" Version="0.10.6" />  
    <ProjectProperty Include="MSBuildBinPath" />
  </ItemGroup>

And now the initializer can access it and set the path:

internal static class ModuleInitializer
{

    [ModuleInitializer]
    internal static void Run()
    {
        var binPath = ThisAssembly.Project.MSBuildBinPath;
        Microsoft.Build.Locator.MSBuildLocator.RegisterMSBuildPath(binPath);
        // Set environment variables so SDKs can be resolved. 
        Environment.SetEnvironmentVariable("MSBUILD_EXE_PATH", Path.Combine(binPath, "MSBuild.exe"), EnvironmentVariableTarget.Process);
    }
}
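With that in place, test classes need no setup at all. Here’s a sketch of what a test might then look like (the project path and the asserted property value are made up for illustration):

```csharp
using Microsoft.Build.Evaluation;
using Xunit;

public class BuildTests
{
    [Fact]
    public void CanEvaluateProject()
    {
        // By the time any test runs, the module initializer has already
        // registered the right MSBuild, so targets/SDKs resolve properly.
        var project = new Project("TestProject.csproj");
        Assert.Equal("netstandard2.0", project.GetPropertyValue("TargetFramework"));
    }
}
```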

What about mobile?

Here’s another scenario where I’d love to see it used: all those dreaded ThatOrThat.Init(); calls so frequent in Xamarin! I tried the above code in the netstandard library of a Xamarin.Forms app, and both Android and iOS properly invoked the module initializer before executing any code in the shared library. Moreover, I tried having more than one, and they were all invoked too!
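For example, a library could ship a source generator that emits something along these lines (all names here are hypothetical; a stub type stands in for the real library):

```csharp
using System.Runtime.CompilerServices;

// Hypothetical library that today requires a manual Init() call
// from MainActivity/AppDelegate.
static class SomeLibrary
{
    public static void Init() { /* platform setup */ }
}

internal static class GeneratedInitializer
{
    // Emitted by the source generator; runs before any code
    // in this assembly executes, on both Android and iOS.
    [ModuleInitializer]
    internal static void Init() => SomeLibrary.Init();
}
```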

So I think that’s another amazing improvement that could come at some point to the mobile platforms. In that particular case, a source generator would emit the module initializer code so you, the end user, don’t have to do anything and things Just Work after simply installing a nuget package :-).

[Read More]

How to use Visual Studio, MSBuild or Roslyn previews in GitHub or DevOps CI

If you want to leverage the many awesome C# 9 features, including roslyn source generators like ThisAssembly, all of which require the latest and greatest .NET 5.0 preview, it would be a pity to have to give up the safety net of your CI builds (whether GitHub Workflows or Azure DevOps pipelines) just because they don’t provide hosted images with the relevant bits.

This post shows how to install and use the latest Visual Studio preview from your build script.

Yes, it might be enough to just install the .NET Core RC and use dotnet build and dotnet test. But in some cases you do need a full Visual Studio, depending on your project.

The key to enabling this scenario is a little awesome (if I might say so) dotnet global tool called dotnet-vs: “A global tool for running, managing and querying Visual Studio installations”. It’s a cool little thing Adrian Alonso and I created to more easily manage multiple versions of Visual Studio installed side by side. It can get quite crazy at times.

The tool allows you, among other things, to query installed VS versions and install new ones, including adding/removing components. It internally uses vswhere as well as the Visual Studio installer command line to achieve a seamless experience.

So, on to the actual scripts, which are really quite simple.

GitHub Workflow

The whole build workflow (which you can see in action too) is:

name: build
on: push

jobs:
  build:
    runs-on: windows-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-dotnet@v1
        with:
          dotnet-version: 3.1.x
      - run: dotnet tool update -g dotnet-vs
      - run: echo "::set-env name=MSB::$(vs where preview --prop=InstallationPath)"
      - run: vs install preview --quiet +Microsoft.VisualStudio.Component.ManagedDesktop.Core +Microsoft.NetCore.Component.DevelopmentTools
        if: env.MSB == ''
      - run: echo "::add-path::$(vs where preview --prop=InstallationPath)\MSBuild\Current\Bin"
      - run: msbuild -r
      - run: msbuild -t:test

Relevant steps:

  1. Install/update to latest & greatest dotnet-vs by simply using dotnet tool update -g. That will install the tool if it’s not there, and ensure it’s the latest otherwise. I do this because if VS preview requires some newer command args in the future, the latest dotnet-vs tool will likely support that too.

  2. The syntax for setting an environment variable from a GH action is a bit weird, but the notable thing here is that the run command will actually run PowerShell Core by default, unlike on DevOps where it runs cmd.exe (on Windows agents, in both cases):

    RunPwsh.png

    So we take advantage of that fact and just run the vs where command inline to set the value of the installation directory for a preview version of VS. The dotnet-vs tool where command will return the raw value from that execution, or an empty string if no such version is found.

  3. We use that as the condition for the vs install so that we only do so if the preview isn’t there already. Note how you can add any supported workload or component ID to the installation with the simple +[ID] syntax. There are also shorter aliases for common workloads like +core +desktop +azure +mobile, say. The ones I’m installing in this case are just the minimum I need, so I can get the install done in about ~5 minutes!

  4. We finally use the same “trick” as step 2 for adding the MSBuild path to the %PATH% so that we can finally just run msbuild.

All in all, pretty straightforward and concise. I love how GitHub run actions are rendered by default using the first line of the command. I wish Azure DevOps did the same, instead of showing just CmdLine and forcing you to always annotate steps with displayName.

Azure DevOps

The whole build pipeline (which you can see in action too) is:

pool:
  vmImage: 'windows-2019'
steps:
- checkout: self

- task: UseDotNet@2
  inputs:
    packageType: sdk
    version: 3.1.x
    performMultiLevelLookup: true

- script: dotnet tool update -g dotnet-vs
- pwsh: echo "##vso[task.setvariable variable=MSB]$(vs where preview --prop=InstallationPath)"
- script: vs install preview --quiet +Microsoft.VisualStudio.Component.ManagedDesktop.Core +Microsoft.NetCore.Component.DevelopmentTools
  condition: eq(variables['MSB'], '')
- pwsh: echo "##vso[task.prependpath]$(vs where preview --prop=InstallationPath)\MSBuild\Current\Bin"
- script: msbuild -r
- script: msbuild -t:test

(I removed all the displayName for conciseness).

You can see that the structure is pretty much the same as for GitHub workflows. Note that we need to explicitly choose to run with PowerShell by using pwsh instead of script, so that the inline execution of vs commands when expanding the string for the variables works the same way. We use the ##vso[task.XXX] syntax in this case instead.

The condition syntax in GitHub workflows is also so much nicer :).

And that is all you need to install quickly (~5 minutes in both CI systems with this combination of components) and build in CI using the latest and greatest C# features!

[Read More]

How to generate code using Roslyn source generators in real world scenarios

Roslyn (as of 16.8 Preview 3) now brings first-class support for source code generators that run as part of a project compilation. The provided cookbook is a fantastic resource to get to know the capabilities and some specific scenarios this feature was created for. The carefully chosen set of features, driven by concrete scenarios, make for a powerful and flexible toolset to supercharge your nuget packages with. In this blog post I’ll outline how I’m using it, in light of my first real-world use case: ThisAssembly.

NOTE: if you haven’t read the aforementioned cookbook, this would be a good time.

One conspicuous detail left out of the cookbook is how to actually put together the generated code. Surely we’re not expected to use string concatenation for real, right?

How to actually create the generated code

Most (all?) of the code generators I’ve seen resort to “simple” string concatenating approaches to codegen. I happened to have done codegen for long enough to deeply distrust the “simplicity” they offer. There’s a reason why template-based codegen has a multitude of options and has been around for so long: that simplicity is just a shortcut. It works for a hello world sample, but it just doesn’t scale, it’s not maintainable, it’s hard to tweak and modify, even harder to grasp what the output really will look like (with loops and conditionals in between actual text), it’s plain awful and painful to work with.

I’ve used a whole bunch of approaches to this over the years, all the way from CodeDom back in the day to raw Roslyn APIs nowadays, with everything in between (such as T3/T4 and Razor, reflection emit and expression trees). One thing I’ve definitely come to realize is that what works best is:

  • Build a model
  • Apply a template

Any and all logic to process whatever your source is goes into the model building side of things (which can be nicely unit tested as needed), and the template is a very simple thing that just acts on that model and translates it to some output which is your generated stuff (can be code, XML, JSON, HTML, whatever).

After reading quite a bit on all the .NET-native approaches, I found Scriban to be the best suited for the job. I love the author’s extensive background and experience in various approaches and toolkits, which seem to have greatly informed his design choices with Scriban.

How to use Scriban in your source generator

As explained in the cookbook, your generator nuget dependencies must be included with your analyzer. I’m not a fan of the way the packaging of analyzers in general is suggested there, so I do it slightly differently.

<Project>
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
    <GeneratePackageOnBuild>true</GeneratePackageOnBuild>
    <BuildOutputTargetFolder>analyzers</BuildOutputTargetFolder>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Scriban" Version="2.1.2" PrivateAssets="all" Pack="true" />
  </ItemGroup>
</Project>

The BuildOutputTargetFolder property means our assembly will end up in analyzers/netstandard2.0 inside the package, so it will not become a lib dependency for consumers.

NOTE: this does mean that the analyzer would run for any language, as long as the consuming project targets netstandard2.0. This may or may not be what you want.
The documented alternative to have more control over that is to use explicit None items with PackagePath metadata pointing to analyzers/[tfm]/[lang] instead.

Presently, however, netstandard2.0 virtually equates to dotnet (as in, all currently supported target frameworks/runtimes), and targeting all three main languages (C#, F# and VB) is quite trivial when using a text template. Moreover, as of .NET 5.0 I believe source generators will only be supported for C#, so they wouldn’t even run for the other two, and the extra simplicity of BuildOutputTargetFolder works for me.

The Pack=true metadata on the PackageReference works as I explained in my TIL: How to include package reference files in your nuget.

This is what an actual template looks like:
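Roughly, the C# template for the metadata case could be shaped like this (a simplified sketch, not the exact shipping template; the model’s Metadata list renders with its exact member names because the template is applied with a member => member.Name renamer):

```
partial class ThisAssembly
{
    public static partial class Metadata
    {
        {{~ for md in Metadata ~}}
        public const string {{ md.Key }} = "{{ md.Value }}";
        {{~ end ~}}
    }
}
```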

I simply embed the template files in the assembly, which is the most convenient way for me. Again, this can be done in a single place in the Directory.Build.targets:

  <ItemGroup>
    <EmbeddedResource Include="@(None -> WithMetadataValue('Extension', '.sbntxt'))" />
  </ItemGroup>

Then a simple helper method allows us to get its content at run-time:

    static class EmbeddedResource
    {
        public static string GetContent(string relativePath)
        {
            var baseName = Assembly.GetExecutingAssembly().GetName().Name;
            var resourceName = relativePath
                .TrimStart('.')
                .Replace(Path.DirectorySeparatorChar, '.')
                .Replace(Path.AltDirectorySeparatorChar, '.');

            using var stream = Assembly.GetExecutingAssembly()
                .GetManifestResourceStream(baseName + "." + resourceName);

            if (stream == null)
                throw new NotSupportedException();

            using var reader = new StreamReader(stream);
            return reader.ReadToEnd();
        }
    }

Since the Scriban template makes it so easy to support multiple target languages, I basically future-proof my generators by including templates for all three now, and they will just “light up” whenever Roslyn adds support for them in the future. Therefore, the code to look up the template content and apply it to a model is always the same and generic for all target languages:

[Generator]
public class MetadataGenerator : ISourceGenerator
{
    public void Initialize(GeneratorInitializationContext context) { }

    public void Execute(GeneratorExecutionContext context)
    {
        var model = ...; // build the model
        var language = context.ParseOptions.Language;
        // lookup CSharp.sbntxt, VisualBasic.sbntxt or FSharp.sbntxt
        var file = language.Replace("#", "Sharp") + ".sbntxt";
        var template = Template.Parse(EmbeddedResource.GetContent(file), file);
        // apply the template
        var output = template.Render(model, member => member.Name);

        // add the file
        context.AddSource("[HINT_NAME_OF_OUTPUT]", SourceText.From(output, Encoding.UTF8));
    }
}

NOTE: even if I don’t provide a template for VB/F#, this code won’t fail presently since it will only be invoked for C# ;-)

Now on to some concrete scenarios I used that showcase the power and flexibility of source generators.

Debugging source generators

Basically, just add a System.Diagnostics.Debugger.Launch() :).

For a bit more added flexibility, and to avoid having to comment/uncomment that line all the time, I make debugging a configurable option via MSBuild.

There are two parts to enabling MSBuild configuration for your generator:

  1. Declaring the name of the property in a targets file
  2. Reading it in the generator.

For debugging I define the following properties in a file named after the generator package ID (i.e. ThisAssembly.Metadata.targets):

<Project>
  <ItemGroup>
    <CompilerVisibleProperty Include="DebugSourceGenerators" />
    <CompilerVisibleProperty Include="DebugThisAssemblyMetadata" />
  </ItemGroup>
</Project>

The first property, when set to true in a build, will cause the Debugger.Launch to run for all generators. The second allows debugging a specific generator instead. Usage would be: msbuild -p:DebugThisAssemblyMetadata=true, for example.

We next have to include the targets file with the analyzer, but it needs to go to the build package folder. This can be done generically too in the Directory.Build.targets:

  <ItemGroup>
    <_PackageFiles Include="*.props" PackagePath="build/$(TargetFramework)" />
    <_PackageFiles Include="*.targets" PackagePath="build/$(TargetFramework)" />
  </ItemGroup>

(I include .props generically as well, since some generators need those too)

Finally, the debugger check helper:

static class GeneratorExtensions
{
    public static void CheckDebugger(this GeneratorExecutionContext context, string generatorName)
    {
        if (context.AnalyzerConfigOptions.GlobalOptions.TryGetValue("build_property.DebugSourceGenerators", out var debugValue) &&
            bool.TryParse(debugValue, out var shouldDebug) &&
            shouldDebug)
        {
            Debugger.Launch();
        }
        else if (context.AnalyzerConfigOptions.GlobalOptions.TryGetValue("build_property.Debug" + generatorName, out debugValue) &&
            bool.TryParse(debugValue, out shouldDebug) &&
            shouldDebug)
        {
            Debugger.Launch();
        }
    }
}

NOTE: if I wanted this capability only for DEBUG builds, I could simply add [Conditional("DEBUG")] to the above method.

We simply access the MSBuild property as documented in the cookbook and attempt to parse it as a boolean to determine whether the debugger should be launched. Now all my generators can include a single line of code (usually the first in the Execute method) that I never have to remove:

context.CheckDebugger("ThisAssemblyMetadata");

Generating ThisAssembly.Metadata

One scenario where I’ve used codegen in the past, and am quite fond of, is accessing values provided by the build via MSBuild project properties and items. In the past I created MSBuilder.ThisAssembly.Metadata, for example, to pull assembly attributes into a code class with constants.

I ported the concept to Roslyn source generators and the result is available as the ThisAssembly.Metadata package.

The basic concept is that in any project (.NET 5.0 SDK or later), you can readily add assembly metadata by simply adding items via MSBuild (support added by yours truly ;)):

    <ItemGroup>
      <AssemblyMetadata Include="Foo" Value="Bar" />
    </ItemGroup>

Which is automatically turned into the following attribute in the generated AssemblyInfo.cs in your obj folder:

  [assembly: System.Reflection.AssemblyMetadataAttribute("Foo", "Bar")]

Even though you can access that metadata by using reflection, that’s comparatively slower and more annoying than simply accessing a constant like ThisAssembly.Metadata.Foo, say.
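For comparison, here’s roughly what the run-time reflection lookup looks like versus the generated compile-time constant (a sketch, assuming the Foo metadata shown above):

```csharp
using System.Linq;
using System.Reflection;

static class MetadataLookup
{
    // Reflection: scans all [AssemblyMetadata] attributes at run-time.
    public static string GetFooViaReflection() =>
        Assembly.GetExecutingAssembly()
            .GetCustomAttributes<AssemblyMetadataAttribute>()
            .First(a => a.Key == "Foo")
            .Value;

    // Versus the generated constant, resolved at compile-time:
    // public static string GetFoo() => ThisAssembly.Metadata.Foo;
}
```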

This is probably the simplest of generators, since we don’t need to access MSBuild information and can instead just rely on the current compilation passed to the generator to contain the attributes shown above.

The generator basically accesses the current compilation and looks for all attributes in it:

[Generator]
public class MetadataGenerator : ISourceGenerator
{
    public void Initialize(GeneratorInitializationContext context) { }

    public void Execute(GeneratorExecutionContext context)
    {
        var metadata = context.Compilation.Assembly.GetAttributes()
            .Where(x => x.AttributeClass?.Name == nameof(System.Reflection.AssemblyMetadataAttribute) &&
                Microsoft.CodeAnalysis.CSharp.SyntaxFacts.IsValidIdentifier((string)x.ConstructorArguments[0].Value))
            .ToDictionary(x => (string)x.ConstructorArguments[0].Value, x => (string)x.ConstructorArguments[1].Value);
        ...
    }
}

That metadata becomes my Model for the template:

    public class Model
    {
        public Model(IEnumerable<KeyValuePair<string, string>> metadata) => Metadata = metadata.ToList();

        public string Version => Assembly.GetExecutingAssembly().GetName().Version.ToString(3);

        public List<KeyValuePair<string, string>> Metadata { get; }
    }

Which is rendered with the template shown above in the Scriban section.

The entirety of the shipping generator is:

[Generator]
public class MetadataGenerator : ISourceGenerator
{
    public void Initialize(GeneratorInitializationContext context) { }

    public void Execute(GeneratorExecutionContext context)
    {
        context.CheckDebugger("ThisAssemblyMetadata");

        var metadata = context.Compilation.Assembly.GetAttributes()
            .Where(x => x.AttributeClass?.Name == nameof(System.Reflection.AssemblyMetadataAttribute) &&
                Microsoft.CodeAnalysis.CSharp.SyntaxFacts.IsValidIdentifier((string)x.ConstructorArguments[0].Value))
            .Select(x => new KeyValuePair<string, string>((string)x.ConstructorArguments[0].Value, (string)x.ConstructorArguments[1].Value))
            .Distinct(new KeyValueComparer())
            .ToDictionary(x => x.Key, x => x.Value);

        var model = new Model(metadata);
        var language = context.ParseOptions.Language;
        var file = language.Replace("#", "Sharp") + ".sbntxt";
        var template = Template.Parse(EmbeddedResource.GetContent(file), file);
        var output = template.Render(model, member => member.Name);

        context.ApplyDesignTimeFix(output, "ThisAssembly.Metadata", language);
        context.AddSource("ThisAssembly.Metadata", SourceText.From(output, Encoding.UTF8));
    }
}

The next ones are similarly simple and concise.

Generating ThisAssembly.Project

This generator results in a similar end-user experience.

But the goal here is to allow arbitrary MSBuild properties to end up there, without having corresponding assembly-level attributes like the previous generator uses. I hacked this in the past with MSBuild, but it was sketchy (using reflection to access MSBuild properties by name, ugh). This time around, I can be legit :).

The intended usage is to declare properties you want to get as constants via MSBuild items, similar to AssemblyMetadata items:

<Project>
  <ItemGroup>
    <ProjectProperty Include="PackageId" />
  </ItemGroup>
</Project>

The generator includes a few properties out of the box too, in this very fashion.

This source generator is interesting because we have to coordinate the MSBuild side and the generator side more deeply. Namely: we have to turn those items into compiler-visible properties, but we also need to tell the source generator which properties were opted in to codegen, since we don’t want to just emit all CompilerVisibleProperty items, as that might include others used internally by other generators.

The targets file builds up a property containing the opted-in properties and does the item group conversion as needed:

  <Target Name="InjectThisAssemblyProject" BeforeTargets="PrepareForBuild;CompileDesignTime">
    <PropertyGroup>
      <ThisAssemblyProject>@(ProjectProperty, '|')</ThisAssemblyProject>
    </PropertyGroup>
    <ItemGroup Condition="'$(ThisAssemblyProject)' != ''">
      <CompilerVisibleProperty Include="@(ProjectProperty)" />
      <CompilerVisibleProperty Include="ThisAssemblyProject" />
    </ItemGroup>
  </Target>

The source generator will receive a |-separated list of opted-in properties via the ThisAssemblyProject MSBuild property. And it will also have access to all the compiler-visible properties as usual, so it can use the first to filter the second.

The way these compiler-visible properties work is that the built-in SDK targets will generate a .editorconfig in your obj/Debug folder (plus target framework) containing the values. So for the out-of-the-box properties plus the PackageId property, it will look like the following when you install ThisAssembly.Project:

is_global = true
build_property.DebugSourceGenerators = 
build_property.DebugThisAssemblyProject = 
build_property.RootNamespace = ClassLibrary6
build_property.AssemblyName = ClassLibrary6
build_property.TargetFrameworkVersion = v2.0
build_property.TargetFrameworkIdentifier = .NETStandard
build_property.TargetFrameworkMoniker = .NETStandard,Version=v2.0
build_property.PackageId = ClassLibrary6
build_property.ThisAssemblyProject = RootNamespace|AssemblyName|TargetFrameworkVersion|TargetFrameworkIdentifier|TargetFrameworkMoniker|PackageId

The astute reader will notice that the semi-colon character ; marks the beginning of a comment in .editorconfig, so if we had used the default concatenation for items in MSBuild:

    <PropertyGroup>
      <ThisAssemblyProject>@(ProjectProperty)</ThisAssemblyProject>
    </PropertyGroup>

We would have ended with this in the .editorconfig:

build_property.ThisAssemblyProject = RootNamespace;AssemblyName;TargetFrameworkVersion;TargetFrameworkIdentifier;TargetFrameworkMoniker;PackageId

Which would be interpreted as RootNamespace followed by a comment! The generator would only ever see the first property in the @(ProjectProperty) item group! (this was quite the head scratcher ;))

Back at the generator code now, we first read the property list and then get all the properties using the same mechanism, filtering only those that do have a value:

public void Execute(GeneratorExecutionContext context)
{
    context.CheckDebugger("ThisAssemblyProject");

    if (!context.AnalyzerConfigOptions.GlobalOptions.TryGetValue("build_property.ThisAssemblyProject", out var properties))
        return;

    var metadata = properties.Split('|')
        .Select(prop => new KeyValuePair<string, string>(prop,
            context.AnalyzerConfigOptions.GlobalOptions.TryGetValue("build_property." + prop, out var value) ? 
            value : null))
        .Where(pair => pair.Value != null)
        .Distinct(new KeyValueComparer())
        .ToDictionary(x => x.Key, x => x.Value);

    var model = new Model(metadata);
    ...
}

The rest of the method is the same as the previous generator, and the template is almost the same too, except for the nested class name which is Project now instead of Metadata.

Generating ThisAssembly.Info

The third and final generator for this post emits constants for the common attributes applied to the assembly by default when you build an SDK-style project.

With the discussion of the previous two generators, I think you, dear reader, will have no problems making sense of its source, since it looks (unsurprisingly) very similar to the ones shown above.

Next up is the (somewhat popular) netfx-System.StringResources :)

Stay tuned for more source generator galore!

[Read More]

Serverless redirection to save us from ugly URLs

I’m a fan of CI-independent serverless nuget feeds: you can push packages from arbitrary systems to a single feed that is highly available and requires no maintenance. It can also be made public access (which Azure Artifacts/VSTS still doesn’t allow).

There is one minor issue, though: the URL isn’t all that memorable or particularly short. Its format is https://[ACCOUNT].blob.core.windows.net/[CONTAINER]/index.json. It’s still better than a VSTS packaging feed: https://[ACCOUNT].pkgs.visualstudio.com/_packaging/[NAME]/nuget/v3/index.json, but wouldn’t it be nice to have something even shorter, like http://[account].nuget.cloud/index.json? After all, it’s just a trivial HTTP redirect we need. Serverless to the rescue!

NOTE: why even have that index.json at the end? Turns out, that is what tells NuGet to consider the feed as a v3 feed :(

The things we’ll need for this are:

  1. A nice short domain
  2. An Azure DNS zone and records for the domain
  3. An Azure Functions app to perform the redirects

I headed over to namecheap.com, typed “nuget” and found nuget.cloud for ~$3. Then I went to Azure DNS and created a new DNS zone for it.

NOTE: turns out that renewing that domain a year later was ~$21. I’m not going to renew it, but all instructions here are still precise and will work with your own domain, whichever one you choose ;)

create DNS zone

NOTE: best way to find stuff in the Azure Portal is to just type in the search box

search DNS in azure portal

Then back to namecheap to configure the DNS for the domain.

After creating the functions app, I created a redirect function which is simple enough:

using System.Net;
using System.Web;

public static HttpResponseMessage Run(HttpRequestMessage req, TraceWriter log) 
{
    log.Info($"Redirecting {req}");
    
    var account = req.Headers.GetValues("DISGUISED-HOST").First().Replace(".nuget.cloud", "");
    var response = req.CreateResponse(HttpStatusCode.MovedPermanently);
    response.Headers.Location = new Uri($"https://{account}.blob.core.windows.net/nuget/index.json");

    return response;
}

Over in the function app’s Platform features tab, we can configure the custom domain for it:

configure custom domain

I added *.nuget.cloud since I want the redirection to be usable by anyone creating their own custom serverless nuget feeds.

Back at the DNS zone, I added a recordset for *.nuget.cloud to CNAME it to the azure function (nugetcloud.azurewebsites.net in my case) host name:

add recordset
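If you prefer scripting it, the same wildcard CNAME record can be created with the Azure CLI (the resource group name here is hypothetical):

```
az network dns record-set cname set-record \
  --resource-group my-dns-rg \
  --zone-name nuget.cloud \
  --record-set-name "*" \
  --cname nugetcloud.azurewebsites.net
```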

Finally, we need to make the Azure Function accessible from *.nuget.cloud/index.json. The function URL is currently https://nugetcloud.azurewebsites.net/api/redirect. In order to make it accessible via a different URL, we just need to create a Proxy with the desired route:

add proxy

With that in place, anyone using serverless Azure nuget feeds can use a nice short URL like http://kzu.nuget.cloud/index.json. The only requirement is that your storage container name must be nuget, and the storage account name becomes the subdomain of nuget.cloud.
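Consuming such a feed then requires nothing special; it’s just a regular package source entry in nuget.config:

```xml
<configuration>
  <packageSources>
    <!-- short serverless feed URL via the redirect -->
    <add key="kzu" value="http://kzu.nuget.cloud/index.json" />
  </packageSources>
</configuration>
```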

[Read More]