Keeping models in sync with Reinforced.Typings

Keeping types in sync between the server and the frontend has long been a sticking point in my development career. It gives me an uneasy feeling any time I work on a project where the compiler or tooling doesn't know exactly what kind of data it's working with. The same goes for working with models that might slowly drift out of sync. It's not a stretch to imagine a situation where you design an endpoint one way, but then make a slight tweak several months later and forget to update your client. In situations like these, I want my tooling to scream at me that something is wrong!

Syncing types and models between the caller and the receiver has been one of my holy grails of programming for a long time, and to that end I've tried a variety of tools to accomplish it. I've even briefly flirted with building my own tool to translate C# models to TypeScript interfaces. Sadly that project never grew legs, partly because I didn't have the time to dedicate to it, but mostly because this is a hard problem (though I'd be lying if I said I didn't want to return to it some day).

Tooling isn't the only way you can sync your models and types between the frontend and backend, though. One of the easiest ways to do it is to just use the same language across your entire project. TypeScript is an excellent example of this, where you run Node on the server and transpile to JavaScript for the browser, but both sides share the same modules and type/interface declarations. Another language is F#, Microsoft's functional, red-headed stepchild. Using the Fable Compiler, you can write F# that runs on both the server and the browser.

It's not all flowers and sunshine for both of those languages, though. Like anything, they have their advantages and disadvantages.

For TypeScript the disadvantage (to me), is that at this point in my career I tend to prefer "kitchen sink" frameworks like ASP.NET, whereas in the TS/JS world, you're encouraged to stitch together small modules into an ad-hoc Frankenstein framework of your own design. And then there's the elephant in the room: NPM and package management. I'm the first to roll my eyes whenever I see the "lol leftpad" meme being repeated for the millionth time, but there are serious issues with NPM and the general attitude of many JavaScript developers.

For F#, the disadvantage is the tooling around the language. Paket, the package manager used and recommended by most F# developers, is great... when it isn't throwing bizarre mono errors (mozroots anyone? oh, the tool that's been deprecated by mono itself?). Beyond Paket, I consistently run into performance issues with Ionide, the VS Code language extension for F#: after 10 or 15 minutes, the intellisense will just stop working, and codelens will display in places it shouldn't. Once that happens, you need to restart VS Code, or make a change to your F# project file to force the extension to reload the project.

(I must caveat the Ionide performance issues I've had by saying that I'm apparently the only one who experiences them. I don't hear many complaints about Ionide, and I can't find many relevant issues on their GitHub repo. But I've used four different machines over the last 1.5 years -- two Windows laptops, one Windows desktop, and one Ubuntu desktop -- and have had the same performance problem across all of them. At a certain point you're just spending too much time restarting the editor, and it's time to re-evaluate your choice of technology.)

But even if you love TypeScript (I do, and use it as much as possible) or more obscure languages like F# or even Reason, sometimes you just don't have a choice. This is sometimes the case for myself in my freelancing engagements, where a client might request that the project use C# on the server so their own developers can make changes down the road.

This is where you start to get into the territory of tools like Swagger, NSwag and JSON Schema, but in most cases this isn't what I'm looking for; I just want the "source of truth" for my models to come from the server's models, whereas these tools have you define your models in a JSON file or even a visual editor. Don't get me wrong, they're very powerful, and the tooling around them is robust, but again, it's not what I'm looking for.

Reinforced.Typings

A long time ago I started using a tool called Reinforced.Typings after the author posted about it on Reddit. I put it to use in a couple of my projects at the time, but the biggest reason I stopped using it was the fact that it wouldn't run anywhere outside of Visual Studio's build process -- or if it did, it wasn't well documented. Combine that with the fact that I'd briefly moved away from Windows to working on Ubuntu with the rise of .NET Core, and I quickly gave up hope that I'd ever keep my C# backend and TypeScript frontend synchronized without using "heavyweight" solutions like Swagger or JSON Schema.

These days, though, Reinforced.Typings has come a long way and I've been putting it to work in all of my current and future projects. This is largely thanks to:

  1. The tool's support for .NET Core, which means it can synchronize your models whether you're on Windows, Ubuntu, macOS, or any other OS supported by .NET Core.
  2. The executable CLI tool that's packaged and distributed with the library itself, meaning it can easily be run from bash or PowerShell -- a big upgrade from describing tasks in a convoluted MSBuild process written in XML.

One of my only complaints with this tool is that it only works with C# models. I can't seem to get it to work with F# at all, whether I'm using records or true .NET classes. I assume this comes down to the difference in compilers: C# uses Roslyn, but F# uses the F# compiler.

Lack of F# support aside, this tool is nearly perfect for my own specific use case. With PowerShell now available on Linux, and Bash now available on Windows through WSL, it's super easy to write one single script that will run anywhere and generate your TypeScript interfaces on any OS!

Installing the package and CLI tool

I've provided the code for everything in this article [here on GitHub]. There's a folder for your favorite package manager. Despite my complaints about Paket above, I still use it in my dotnet projects because of its support for lockfiles. But package manager politics aside, I know that most dotnet projects use NuGet, so I'll provide examples for both.

To install with Paket:

paket add Reinforced.Typings

To install the package with NuGet:

dotnet add package Reinforced.Typings

And if you installed with NuGet, you'll also need to restore the packages to your project folder (Paket does this automatically):

dotnet restore --packages ./packages

With the package installed, you can actually start using it right away by either running the bundled .exe tool in the .NET 4.5 folder, or by running the compiled DLL from the dotnet CLI. They'll both do the same thing, so it's really up to your personal preference, or whether you're running on Windows or something else.

To run the tool, you need to pass in the source assembly containing your C# models (i.e. the DLL file) and a target directory where the typings should be written. I also pass Hierarchy="true", which structures the typings in different folders depending on the namespace and module options (which we'll get to below). So, running the tool looks like this:

dotnet "./packages/Reinforced.Typings/tools/netcoreapp2.1/rtcli.dll" SourceAssemblies="/absolute/path/to/my/app.dll" TargetDirectory="./typings" Hierarchy="true"

Configuring your C# models, an example

Reinforced.Typings works by compiling your C# projects, loading up the DLL, and then searching for things that have been marked with special Reinforced.Typings attributes. You'll need to be familiar with several of these attributes to put the tool to good use, and one of the most important ones is the [TsGlobal] attribute, which is used to configure global options for Reinforced.Typings.

The [TsGlobal] attribute, unlike the other attributes in Reinforced.Typings, is an assembly attribute, meaning it's applied to your entire assembly rather than just one single class or interface. To use it, open up your Program.cs or Startup.cs file and add the following at the top:

// file: Program.cs 

using Reinforced.Typings.Attributes;
[assembly: TsGlobal(
    UseModules = true,
    AutoOptionalProperties = true,
    DiscardNamespacesWhenUsingModules = true,
    ExportPureTypings = true
)]

These are the global settings that I like to use in my projects. Your mileage may vary, so make sure you consult the docs to learn which options can be set. For my projects using the example above, these properties do the following:

  • UseModules: turns the generated TypeScript files into modules, i.e. they use export statements.
  • AutoOptionalProperties: automatically marks Nullable<T> properties in C# as optional in TypeScript, e.g. int? MyProperty becomes MyProperty?: number.
  • DiscardNamespacesWhenUsingModules: does what it says, e.g. namespace MyModule { export interface XYZ { ... } } becomes export interface XYZ { ... } and the filename becomes the namespace.
  • ExportPureTypings: exports pure declaration files (.d.ts) instead of regular .ts files. Also note that by default, Reinforced.Typings can turn your C# classes into TypeScript classes rather than interfaces; personally I don't find that useful at all, and I prefer to use interfaces when transferring data between the frontend and backend.
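To make those options concrete, here's a sketch of the kind of output they produce. The IPerson interface (and the C# class it would come from) is hypothetical, purely for illustration:

```typescript
// Hypothetical output for a C# class with `string Name { get; set; }` and
// `int? Age { get; set; }`: an exported interface in a module (UseModules +
// ExportPureTypings), with the nullable int mapped to an optional number
// (AutoOptionalProperties).
export interface IPerson {
    Name: string;
    Age?: number;
}

// Because Age is optional, both of these compile:
const withAge: IPerson = { Name: "Ada", Age: 36 };
const withoutAge: IPerson = { Name: "Grace" };

console.log(withAge.Age, withoutAge.Age); // 36 undefined
```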

With the global settings configured, you can start marking your .NET classes with the [TsInterface] attribute, which tells the Reinforced.Typings tool that the class needs to be converted to TypeScript. In my own applications, I like to have fully-typed HTTP requests and responses between my server and my client. If my client is making a request, it should know exactly what data the server is expecting, and it should know exactly what data it's supposed to respond with. Login and authentication mechanisms are very common pieces of web applications, so we'll use a login request and login response as an example.

First, create a new C# file named Models.cs to hold our models. In our demo application, we'll say that the Login endpoint expects the client to post a username and a password in the body, and the server is expected to return a token string or an error. Translating that to models, it looks like this:

// file: Models.cs

using Reinforced.Typings;

namespace MyApp.Models
{
    public class LoginRequest
    {
        public string Username { get; set; }

        public string Password { get; set; }

        public string GetValidationErrors()
        {
            if (string.IsNullOrEmpty(Username))
            {
                return "Username cannot be empty.";
            }

            if (string.IsNullOrEmpty(Password))
            {
                return "Password cannot be empty.";
            }

            return null;
        }
    }

    public class LoginResponse
    {
        public bool Ok => true;

        public string Token { get; set; }
    }

    public class ErrorResponse
    {
        public bool Ok => false;
    
        public string Message { get; set; }
    }
}

So we have three models: two responses (what the server is expected to send back to the client), and one request, which can validate its own properties and return an error message if either is invalid. To translate these models to TypeScript, you just need to mark them with the [TsInterface] attribute. Like the global attribute, this one has several configuration properties, but the only one we'll use in this example is the Namespace = "XYZ" prop. When combined with the UseModules option from the global attribute, this will make Reinforced.Typings write the interface to the XYZ folder.

// file: Models.cs

using Reinforced.Typings.Attributes;

namespace MyApp.Models
{
    [TsInterface(Namespace = "requests")]
    public class LoginRequest 
    {
        ...
    }

    [TsInterface(Namespace = "responses")]
    public class LoginResponse
    {
        ...
    }

    [TsInterface(Namespace = "responses")]
    public class ErrorResponse
    {
        ...
    }
}

In most cases, you're good to go at this point. You can run the tool, and your models will be magically transmuted to TypeScript interfaces through alchemy most wondrous. However, there are a couple edge cases and improvements that you might want to know about.

Using the [TsProperty] and [TsIgnore] attributes

If you're not using C# 8's new nullable reference types, where any value can be marked as nullable, there may be some properties in your C# models that you know can be null, but the tool does not. Yes, with the AutoOptionalProperties global option set, it will know that bool?, int?, etc. need to be converted to optional properties in TypeScript, but public string NullableString won't work. In these cases, you can tell Reinforced.Typings to mark the property as nullable by adding the [TsProperty(ForceNullable = true)] attribute.
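For example, here's a sketch of the resulting interface and a consumer (the IExample interface and its C# counterpart are hypothetical, assuming ForceNullable was applied to the string property):

```typescript
// Hypothetical interface generated from a C# class whose
// `public string NullableString { get; set; }` property was marked with
// [TsProperty(ForceNullable = true)]:
export interface IExample {
    Id: number;
    NullableString?: string;
}

// The TypeScript compiler now forces consumers to handle the missing case:
function describeExample(example: IExample): string {
    return example.NullableString ?? "(no value)";
}

console.log(describeExample({ Id: 1 }));                          // "(no value)"
console.log(describeExample({ Id: 2, NullableString: "hello" })); // "hello"
```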

If you've got a string property, and you know that the only valid values for the string are e.g. "Option 1" and "Option 2", you can force the TypeScript type to a union type, where only those two strings will compile. Just mark your property with [TsProperty(Type = "\"Option 1\" | \"Option 2\"")]. The TypeScript will look like myProperty: "Option 1" | "Option 2", and trying to assign any other value to it will result in a compiler error. This also works with numbers, booleans, and even objects.
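Here's a sketch of what that generated union type looks like on the TypeScript side (the ISettings interface is hypothetical):

```typescript
// Hypothetical interface for a C# string property marked with
// [TsProperty(Type = "\"Option 1\" | \"Option 2\"")]:
export interface ISettings {
    MyProperty: "Option 1" | "Option 2";
}

const settings: ISettings = { MyProperty: "Option 1" }; // compiles
// const bad: ISettings = { MyProperty: "Option 3" };   // compiler error

// The union also narrows in conditionals:
function optionNumber(s: ISettings): number {
    return s.MyProperty === "Option 1" ? 1 : 2;
}

console.log(optionNumber(settings)); // 1
```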

Finally, Reinforced.Typings will also add function signatures for any methods on your class. In the example above, the tool will actually add the GetValidationErrors() method to the TS interface. I don't find that very useful at all, since you can't even serialize a function and send it over HTTP -- and why would you? You can prevent the tool from generating types for methods by marking the method with [TsIgnore]. This also works for skipping properties.

Putting it all together

There are a couple ways you could improve the models from above. First, you should add a [TsIgnore] to the LoginRequest.GetValidationErrors() method so it doesn't show up in your TypeScript interface. Second, since the LoginResponse.Ok property is always true, and since the ErrorResponse.Ok property is always false, you can set their TypeScript types to those literal values; this will let you discriminate on that prop on both models to determine what kind of response you've received. That's not super useful in this specific case, since if there were an error you'd probably return a different status code that you could look for, but it's not difficult to imagine a case where an endpoint might return different types of objects based on what it received.

With that in mind, here's what the final C# models look like:

// file: Models.cs

using Reinforced.Typings.Attributes;

namespace MyApp.Models
{
    [TsInterface(Namespace = "requests")]
    public class LoginRequest
    {
        public string Username { get; set; }

        public string Password { get; set; }

        [TsIgnore]
        public string GetValidationErrors()
        {
            if (string.IsNullOrEmpty(Username))
            {
                return "Username cannot be empty.";
            }

            if (string.IsNullOrEmpty(Password))
            {
                return "Password cannot be empty.";
            }

            return null;
        }
    }

    [TsInterface(Namespace = "responses")]
    public class LoginResponse
    {
        [TsProperty(Type = "true")]
        public bool Ok => true;

        public string Token { get; set; }
    }

    [TsInterface(Namespace = "responses")]
    public class ErrorResponse
    {
        [TsProperty(Type = "false")]
        public bool Ok => false;
    
        public string Message { get; set; }
    }
}

Once converted to TypeScript by Reinforced.Typings, you'll have the following interfaces:

// file: requests/ILoginRequest.d.ts
export interface ILoginRequest {
    Username: string;
    Password: string;
}

// file: responses/ILoginResponse.d.ts
export interface ILoginResponse {
    Ok: true;
    Token: string;
}

// file: responses/IErrorResponse.d.ts
export interface IErrorResponse {
    Ok: false;
    Message: string;
}

Now in your TypeScript client script, you can send a login request (using fetch or even plain old XHR) and use the Ok property to figure out what kind of response you got. If it's true, the TypeScript compiler will automatically know you have an ILoginResponse; if it's false, it knows you have an IErrorResponse. Again, maybe that's not super useful in this specific example, since you'd probably return a different status code if there were an error, but I get a lot of mileage out of these discriminated union types in my own apps.

import { ILoginRequest } from "requests";
import { ILoginResponse, IErrorResponse } from "responses";

async function sendRequest<T>(url: string, data: Object): Promise<T> {
    // Send the request to the server and parse the JSON response body
    const response = await fetch(url, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(data)
    });

    return await response.json();
}

async function login(username: string, password: string) {
    const data: ILoginRequest = {
        Username: username,
        Password: password
    };
    const url = "/auth/login";
    const response = await sendRequest<ILoginResponse | IErrorResponse>(url, data);

    // TypeScript compiler knows that response has an Ok property, but doesn't know if it's 
    // an ILoginResponse or an IErrorResponse until you test that property
    if (response.Ok) {
        // TypeScript compiler knows you have an ILoginResponse and lets you use the Token prop
        alert("You're logged in! Your token is: " + response.Token);
    } else {
        // TypeScript compiler knows you have an IErrorResponse and lets you use the Message prop
        alert("Error logging in: " + response.Message);
    }
}

And there you have it! Our types are in sync, and we can even use TypeScript's type system to decide which kind of response we got based on whether the Ok property is true or false.

Running Reinforced.Typings with PowerShell or Bash on any OS

Finishing up, let's take a look at how you can use a PowerShell or Bash script to run the Reinforced.Typings tool on any operating system -- the only prerequisite is that it has the dotnet CLI installed.

I'll provide both PS and Bash scripts for this example, although my favorite is PowerShell. I know it gets a lot of flak for being a "weird" scripting language (or, really, just different from Bash), but personally I think it's much more readable and easier to work with. If you're a C# dev, the PowerShell syntax looks a lot more familiar than Bash syntax, and you can probably figure out what's going on just by looking at it.

These scripts will have three goals:

  1. Since Reinforced.Typings needs absolute paths for the source assembly, the script must test whether it's running on WSL specifically, and if so, translate the paths from Unix-style to Windows-style using the wslpath binary.
  2. The script must run the Reinforced.Typings CLI tool and place the TypeScript declarations in the typings folder.
  3. The script must create TypeScript "barrels" for the declarations, which just puts an index.d.ts file in each module directory and re-exports every other module from that folder. This isn't strictly necessary; it just lets you write import { ILoginRequest } from "requests" instead of import { ILoginRequest } from "requests/iloginrequest".
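For reference, a generated barrel for the requests module would look something like this (the file name here is hypothetical, based on the interfaces generated earlier):

// file: typings/requests/index.d.ts
export * from "./iloginrequest";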

Goals in mind, let's write our scripts. We'll start with PowerShell, in a file called poco.ps1:

#! /usr/bin/env pwsh
# ^ If running on unix, this tells the shell to run the script using the powershell binary "pwsh"

$dll = "./src/bin/Release/netcoreapp2.0/publish/MyApp.dll"

# Check if the script is running in WSL
if (test-path "/proc/version")
{
    $WSL_running = $(Get-Content "/proc/version").Contains("Microsoft");
}   
else
{
    $WSL_running = $false;
}

# If this script is running in WSL, it's possible that dotnet packages were restored for Windows. Use the Windows dotnet exe to build/publish the project
if ($WSL_running -eq $true)
{
    $dotnet = "dotnet.exe";
    # Reinforced.Typings needs an absolute path, so it must be converted to Windows-style path with wslpath binary
    $dll = wslpath -am $(join-path "$PWD" "$dll")
} 
else 
{
    $dotnet = "$(which dotnet)"
    $dll = join-path "$PWD" "$dll"
}   

# These are the module folders that will be managed and barrelled by this script. These are based on the namespaces in [TsInterface(Namespace = "...")]
$folders = "requests", "responses"

# Find the Reinforced.Typings CLI tool
$cli = "./packages/Reinforced.Typings/tools/netcoreapp2.1/rtcli.dll";

if (! (Test-Path $cli))
{
    Write-Error "CLI tool could not be found. Make sure you restore nuget or paket packages before running this script.";
    exit 1;
}

# Publish the project, so rtcli can find all packages
& $dotnet publish -v quiet -c Release ./src

# Check that the dll is where it's expected to be
if ($WSL_running -eq $true)
{
    $exists = Test-Path $(wslpath -ua "$dll")
}
else
{
    $exists = Test-Path "$dll"
}

if ($exists -eq $false)
{
    Write-Error "DLL could not be found at $dll. Did something go wrong with the dotnet publish command?";
    exit 1;
}

# Empty the target folders, so that any C# models that have been deleted also get deleted from typings
$folders | ForEach-Object {
    Remove-Item -Recurse -ErrorAction SilentlyContinue "./typings/$_";
}

# Run the Reinforced.Typings tool on the DLL and output to the ./typings folder
& $dotnet "$cli" SourceAssemblies="$dll" TargetDirectory="./typings" Hierarchy="true"

# Create a function that will make an export "barrel" in each typings directory
# This lets you do "import { X } from 'requests'" instead of "import { X } from 'requests/x'"
function createBarrel($folderName) {
    $indexFile = "./typings/$folderName/index.d.ts"

    write-host "Creating export barrel for " -NoNewLine
    write-host "./typings/$folderName" -ForegroundColor green
    
    # Clear out the contents of index.d.ts if it exists
    if (test-path "$indexFile") {
        Clear-Content "$indexFile"
    }

    # Gather up all other files in the folder and export their contents from index.d.ts
    Get-ChildItem "./typings/$folderName" -exclude "index.d.ts" | ForEach-Object {
        $baseName = $_.Name -replace "\.d\.ts$", ""

        Add-Content "$indexFile" "export * from `"./$baseName`";"
    }
}

# And finally, run the barrel function on each folder
$folders | ForEach-Object {
    createBarrel $_
}

Once that's saved, you should be able to generate some pocos with a ./poco.ps1 from PowerShell if you're on Windows. If you're on a Unix or macOS machine, you can run the script the same way from your favorite shell, but you'll need to make sure that you've got the open-source PowerShell 6 package installed.

Finally, if you're a lover of Bash, here's that same script, this time in a file called poco.sh:

#! /bin/bash

dll="./src/bin/Release/netcoreapp2.0/publish/MyApp.dll"

# Check if the script is running in WSL
if grep -q Microsoft /proc/version; then
    WSL_running=true
else
    WSL_running=false
fi

# If this script is running in WSL, it's possible that the dotnet packages were restored for Windows. Use the Windows dotnet exe to build/publish the project
if [[ $WSL_running == true ]]; then
    dotnet="dotnet.exe"
    # Reinforced.Typings needs an absolute path, so it must be converted to Windows-style path with wslpath binary
    # "The POSIX standard mandates that multiple / are treated as a single / in a file name. Thus //dir///subdir////file is the same as /dir/subdir/file."
    # Source: https://stackoverflow.com/a/24026057
    dll=$(wslpath -am "$PWD/$dll")
else
    dotnet=$(which dotnet)
    dll="$PWD/$dll"
fi

# These are module folders that will be managed and barrelled by this script. These are based on the namespaces in [TsInterface(Namespace = "...")]
declare -a folders=("requests" "responses")

# Find the Reinforced.Typings CLI tool
cli="./packages/Reinforced.Typings/tools/netcoreapp2.1/rtcli.dll";

if [ ! -f "$cli" ]; then
    echo "CLI tool could not be found. Make sure you restore nuget or paket packages before running this script."
    exit 1
fi

# Publish the project, so rtcli can find all packages
eval $dotnet publish -v quiet -c Release ./src

# Check that the dll is where it's expected to be
if [[ $WSL_running == true ]]; then
    test -f "$(wslpath -ua "$dll")"
    exists=$?
else
    test -f "$dll"
    exists=$?
fi

if [[ $exists == 1 ]]; then 
    echo "DLL could not be found at $dll. Did something go wrong with the dotnet publish command?"
    exit 1
fi

# Empty the target folders, so that any C# models that have been deleted also get deleted from typings
for folder in "${folders[@]}"
do
    rm -rf "./typings/$folder"
done

# Run the Reinforced.Typings tool on the DLL and output to the ./typings folder
eval $dotnet "$cli" SourceAssemblies="$dll" TargetDirectory="./typings" Hierarchy="true"

# Create a function that will make an export "barrel" in each typings directory
# This lets you do "import { X } from 'requests'" instead of "import { X } from 'requests/x'"
function createBarrel {
    folder="$1"
    indexFile="./typings/$folder/index.d.ts"

    echo "Creating export barrel for ./typings/$folder"

    # Clear out the contents of index.d.ts if it exists
    echo "" > "$indexFile"

    # Gather up all other files in the folder and export their contents from index.d.ts
    readarray -t tsmodules < <(ls "./typings/$folder" | grep -i -v "index.d.ts")
    for tsmodule in "${tsmodules[@]}"
    do
        baseName="${tsmodule/.d.ts/}"

        echo "export * from \"./$baseName\";" >> "$indexFile"
    done
}

# And finally, run the barrel function on each folder
for folder in "${folders[@]}"
do
    createBarrel "$folder"
done

There we have it! Just like the PowerShell script, you can run this from your favorite shell -- even on Windows using the Windows Subsystem for Linux -- by executing ./poco.sh.

