Logging exceptions on the client side is just as important as logging them on the server side. It’s also important to let the user know something failed and what they should do next. I’ll cover some strategies to enable this. I’m not going to cover the server side but will assume you have an endpoint set up where JSON error messages can be POSTed.

Outside of Aurelia

Exceptions can occur both inside and outside of Aurelia. To catch all unhandled errors that occur outside of Aurelia, you can set up a global error handler in your main page:

<script>
    // Escapes a string so it can be safely embedded in a JSON string literal.
    var escape = function(value) {
        return !value ? '' : value
            .replace(/\\/g, '\\\\').replace(/\"/g, '\\"')
            .replace(/\//g, '\\/').replace(/[\b]/g, '\\b')
            .replace(/\f/g, '\\f').replace(/\n/g, '\\n')
            .replace(/\r/g, '\\r').replace(/\t/g, '\\t');
    };
    // XHR factory with ActiveX fallbacks for old IE.
    var XHR = window.XMLHttpRequest || function() {
        try { return new ActiveXObject("Msxml3.XMLHTTP"); } catch (e0) {}
        try { return new ActiveXObject("Msxml2.XMLHTTP.6.0"); } catch (e1) {}
        try { return new ActiveXObject("Msxml2.XMLHTTP.3.0"); } catch (e2) {}
        try { return new ActiveXObject("Msxml2.XMLHTTP"); } catch (e3) {}
        try { return new ActiveXObject("Microsoft.XMLHTTP"); } catch (e4) {}
    };
    window.onerror = function(message, source, line, column, error) {
        try {
            var xhr = new XHR();
            xhr.open('POST', '/errors', true);
            xhr.setRequestHeader('Content-type', 'application/json');
            xhr.send('{ ' +
                '"message": "' + escape(message || '') + '",' +
                '"stackTrace": "' + escape(error ? error.stack || '' : '') + '",' +
                '"source": "' + escape(source || '') + '",' +
                '"url": "' + escape(window.location.href) + '",' +
                '"line": "' + (line || 0) + '",' +
                '"column": "' + (column || 0) + '"' +
            '}');
        }
        finally {
            // Display a friendly message to the user.
        }
    };
</script>

This handler is placed in the main page (Not in a separate file) and before all other script tags. It is also very lo-fi so it should run in any browser and not depend on a 3rd party library. I adapted the XHR factory from Angular and quirksmode.org. After logging the exception on the server you will want to notify the user that something broke and what they should do next (Like contact support). Again, lo-fi, so that it works no matter what.
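
As a sketch of that friendly message (the banner markup and wording here are purely illustrative, not part of the handler above), something minimal and dependency-free could go in the finally block:

// Minimal, framework-free notification; styling and copy are placeholders.
var banner = document.createElement('div');
banner.setAttribute('style',
    'position:fixed;top:0;left:0;right:0;padding:10px;' +
    'background:#fdd;text-align:center;z-index:9999');
banner.appendChild(document.createTextNode(
    'Something went wrong. Please refresh the page or contact support.'));
document.body.appendChild(banner);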

In Aurelia

Now that we are logging unhandled exceptions outside of Aurelia we want to handle ones that happen inside of Aurelia. We can do this by creating a log appender that sends errors back to the server. First, let’s install the necessary modules (You can omit the tsd call if not using TypeScript):

jspm install aurelia-http-client lodash
tsd install lodash --save

NOTE: The following code is TypeScript but you can easily adapt it to ES5/ES6.

import { HttpClient } from 'aurelia-http-client';
import { Logger } from 'aurelia-logging';
import * as _ from 'lodash';

export class LogAppender {

    http: HttpClient;

    constructor(http: HttpClient) {
        this.http = http;
    }

    debug(logger: Logger, ...rest: any[]): void {}
    info(logger: Logger, ...rest: any[]): void {}
    warn(logger: Logger, ...rest: any[]): void {}

    error(logger: Logger, ...rest : any[]): void {
        var error = _.find(rest, x => x instanceof Error);
        this.http.post('errors', {
            url: window.location.href,
            source: (<any>logger).id,
            message: rest.join('\r\n'),
            stackTrace: error ? error.stack || '' : ''
        }).then();
        // Display a friendly message to the user.
    }
}

We are only implementing the error method above as that’s all we’re interested in logging, but you could obviously capture much more by implementing the other levels. We also look through the arguments for an Error object; if one was passed in, we can potentially extract the stack trace (if it has one, of course).
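
For context, here is how an error might reach the appender from a view-model. This is just a sketch; the logger name and riskyOperation are hypothetical:

import { LogManager } from 'aurelia-framework';

let logger = LogManager.getLogger('some-view-model');

// Stand-in for any code that might throw.
function riskyOperation(): void {
    throw new Error('Something broke.');
}

try {
    riskyOperation();
} catch (error) {
    // Both the message and the Error object land in the appender's
    // rest args; LogAppender.error pulls the stack trace off the Error.
    logger.error('Failed to do the thing.', error);
}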

Now that we’ve created the appender, we need to register it. This is done in your main.js. If you haven’t already created one, see here for more info.


import { Aurelia, LogManager } from 'aurelia-framework';
import { ConsoleAppender } from 'aurelia-logging-console';
import { HttpClient } from 'aurelia-http-client';
import { LogAppender } from './log-appender';

LogManager.addAppender(new ConsoleAppender());
LogManager.addAppender(new LogAppender(new HttpClient()));
LogManager.setLevel(LogManager.logLevel.debug);

export function configure(aurelia: Aurelia) {
    ...
}

In the example above we import our appender, create it and pass in an HttpClient. You can have multiple appenders and also set the logging level for all appenders.
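
For example, in production you might want only errors to reach the appenders; a sketch:

LogManager.setLevel(LogManager.logLevel.error);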

NOTE: At this time, view errors are not handled by the Aurelia logging infrastructure. Also, at the moment you can’t reliably catch promise rejection errors with the global error handler (this has been fixed as of 10/25/2015 and should be released soon). If a view fails to load you will get something along the lines of this:

Unhandled promise rejection Error: A route with name 'add' could not be found. Check that `name: 'add'` was specified in the route's config.
    at AppRouter.generate (http://localhost/jspm_packages/github/aurelia/router@0.13.0/aurelia-router.js:1228:15)
    at http://localhost/jspm_packages/github/aurelia/templating-router@0.17.0/route-href.js:33:33
    at run (http://localhost/jspm_packages/npm/core-js@0.9.18/modules/es6.promise.js:91:43)
    at http://localhost/jspm_packages/npm/core-js@0.9.18/modules/es6.promise.js:105:11
    at module.exports (http://localhost/jspm_packages/npm/core-js@0.9.18/modules/$.invoke.js:6:25)
    at queue.(anonymous function) (http://localhost/jspm_packages/npm/core-js@0.9.18/modules/$.task.js:40:9)
    at Number.run (http://localhost/jspm_packages/npm/core-js@0.9.18/modules/$.task.js:27:7)
    at listner (http://localhost/jspm_packages/npm/core-js@0.9.18/modules/$.task.js:31:9)

There is an issue tracking this here.


The getting started guide on the Aurelia site is a nice way to get up and running quickly. This post will dig a little deeper and set up a TypeScript Aurelia app, with tests, from scratch. It assumes that you are using gulp and are familiar with TypeScript. If not, start here and here.

Before you continue, you may want to enable TypeScript support in your editor. A list of editors and plugins can be found here.

NOTE: ES6 and ES7 are now officially called ES 2015 and ES 2016 respectively, but I’ll just refer to them as ES6 and ES7 here.

Setting up Compilation

First we need to set up the TypeScript compiler.

Install the following node modules:

npm install gulp-typescript --save
npm install gulp-sourcemaps --save

Then add the following task to your gulpfile.js:

var tsc = require("gulp-typescript");
var sourcemaps = require("gulp-sourcemaps");

gulp.task('tsc', function () {
    return gulp.src('**/*.ts', { base: '.' })
        .pipe(sourcemaps.init())
        .pipe(tsc({
            target: 'ES5',
            module: 'commonjs',
            noEmitOnError: true,
            noImplicitAny: true,
            emitDecoratorMetadata: true,
            experimentalDecorators: true
        }))
        .on('error', function() { process.exit(1); })
        .pipe(sourcemaps.write('.'))
        .pipe(gulp.dest('.'));
});

Here we are compiling TypeScript files (with a .ts extension) to ES5. We are compiling to CommonJS modules as they can be understood in the browser by SystemJS (Which we’ll cover in a bit) and natively by Node.js for our tests. Next we are saying that we don’t want to generate any files when there are errors and we don’t want to allow implicit use of any. For a full list of options see here. We are also generating source map files so that compiled code can be mapped to the original TypeScript code when debugging. Since we don’t want to commit generated .js files to our repository we can selectively exclude them with a .gitignore:

*.js
# Any exclusions
!gulpfile.js

We’ll set up the watcher below when we set up the tests.

Setting up Type Definitions

In order for the TypeScript compiler to understand vanilla JavaScript libraries it needs some sort of metadata that describes them. This metadata is the TypeScript definition file, with an extension of .d.ts, as described here. Fortunately a growing number of definition files for popular libraries have been contributed to the definitelytyped.org repository (~1,200 as of 9/2015). These definitions can easily be downloaded with the TypeScript Definition Manager.
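
For a flavor of what a definition file contains, here is a minimal hand-rolled one for a hypothetical library (illustrative only):

// some-legacy-lib.d.ts: describes the shape of a plain JavaScript
// library so the compiler can type-check calls into it.
declare module 'some-legacy-lib' {
    export function doSomething(value: string): number;
}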

So first, let’s set up the TypeScript Definition Manager:

npm install tsd -g

The syntax for tsd is similar to npm (see here for all the options):

tsd install [name] --save

This command will download the type definitions to a folder called typings, which holds all your type definitions. It also creates a file called tsd.json (If it doesn’t already exist) that contains references to all the saved definitions (Analogous to package.json). We’ll install the type definitions as we go.
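
tsd also maintains a bundle file at typings/tsd.d.ts that references every installed definition, so a single triple-slash reference (assuming the typings folder sits at the project root) makes them all visible to the compiler:

/// <reference path="typings/tsd.d.ts" />

import * as _ from 'lodash'; // now type-checked against the lodash definition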

Setting up the Test Runner

We will use the Mocha test framework and Chai assertion library. We make our test task depend on the tsc task to compile before running the tests. And since Node.js doesn’t natively support source maps, we’ll need to use the source-map-support module. We also need to reference the core-js ES* polyfill to get ES6 API support in Node.

Install the following node modules:

npm install gulp-mocha --save
tsd install mocha --save
npm install chai --save
tsd install chai --save
npm install source-map-support --save
npm install core-js --save

Then add the following task to your gulpfile.js:

var mocha = require('gulp-mocha');
var childProcess = require('child_process');
require('source-map-support').install();
require('core-js');

gulp.task('watch', function () {
    var spawnTests = function() {
        childProcess.spawn('gulp', ['test'], { stdio: 'inherit' });
    }
    spawnTests();
    gulp.watch('**/*.ts', spawnTests);
});

gulp.task('test', ['tsc'], function () {
    return gulp.src(['tests/**/*.js'])
        .pipe(mocha({ reporter: 'spec', ui: 'bdd' }));
});

Here we add the watcher that compiles our TypeScript code, then fires off the test runner. Spawning the test runner prevents certain failures from stopping the watch loop. Also, since the watch is fired only on changes, we run it right off the bat so we don’t have to wait for changes to see results.

Now we’ll create a fixture called app.ts, under the tests folder, with a dummy test to make sure everything is working:

import { expect } from 'chai';
 
describe('Test', () =>
    it('should work.', () => 
        expect(1).to.equal(1)
    )
);

And run the tests:

gulp watch

Setting up SystemJS & jspm

Aurelia fully embraces ES6 and is made up of many composable modules. This allows you to pick and choose what pieces you want to use. As such, there isn’t one single file you can reference like you would with jQuery or Angular 1.x. This presents a number of challenges. First, how do we easily get a hold of all the modules we want? And second, how do we wire up and load all these modules? Both of these problems are solved by the SystemJS module loader and its corresponding package manager jspm. SystemJS supports many different module standards (ES6, AMD, CommonJS) and jspm enables us to easily get a hold of modules and their dependencies (Just like NPM). jspm also automatically wires up modules in the SystemJS loader so we don’t have to. We will install SystemJS (And later Aurelia) with jspm. First let’s install jspm:

# Install jspm globally
npm install jspm -g

# Lock down the local version
npm install jspm --save

# At the root of your web app
jspm init

NOTE: If you have one project, one package.json and multiple web apps, you can run jspm init . in the root of each web app. This will initialize it separately for each app.

jspm init will prompt you for a few pieces of information. You can probably accept the defaults, see here for more info on them. You will want to respond no when prompted to use an ES6 transpiler (As we’ve already handled that).

Would you like jspm to prefix the jspm package.json properties under jspm? [yes]:
Enter server baseURL (public folder path) [./]:
Enter jspm packages folder [./jspm_packages]:
Enter config file path [./config.js]:
Configuration file config.js doesn't exist, create it? [yes]:
Enter client baseURL (public folder URL) [/]:
Do you wish to use an ES6 transpiler? [yes]:no
ok   Verified package.json at package.json
     Verified config file at config.js
     Looking up loader files...
       system.js
       system.js.map
       system-csp-production.js
       system-polyfills.js
       system.src.js
       system-csp-production.src.js
       system-polyfills.js.map
       system-polyfills.src.js
       system-csp-production.js.map
     
     Using loader versions:
       systemjs@0.18.17
ok   Loader files downloaded successfully

The jspm init command does a few things. First it adds a section to your package.json for configuration and dependencies:

{
  "name": "SetecAstronomy",
  "version": "0.0.0",
  ...
  "jspm": {
    "directories": {},
    "dependencies": {
      "core-js": "npm:core-js@^1.1.3"
    }
  }
}

Next it creates a config.js where SystemJS modules are wired up:

System.config({
  baseURL: "/",
  defaultJSExtensions: true,
  transpiler: "none",
  paths: {
    "github:*": "jspm_packages/github/*",
    "npm:*": "jspm_packages/npm/*"
  },

  map: {
    "core-js": "npm:core-js@1.1.3",
    "github:jspm/nodelibs-process@0.1.1": {
      "process": "npm:process@0.10.1"
    },
    "npm:core-js@1.1.3": {
      "fs": "github:jspm/nodelibs-fs@0.1.2",
      "process": "github:jspm/nodelibs-process@0.1.1",
      "systemjs-json": "github:systemjs/plugin-json@0.1.0"
    }
  }
});

When you install packages, jspm automatically wires up all modules and their dependencies here so you don’t have to do it manually. All dependencies are stored in the jspm_packages folder (Analogous to node_modules).

One thing to note about installing jspm modules is that an alias mapping is created for modules you install directly but not for their dependencies. So for example if you install a module some-module with a dependency of some-dependency the config.js will look something along the lines of this:

System.config({
  ...
  map: {
    "some-module": "github:some-repo/some-module@0.1.2",
    "github:some-repo/some-module@0.1.2": {
      "some-dependency": "github:some-repo/some-dependency@0.2.2"
    },
    "github:some-repo/some-dependency@0.2.2": {
      ...
    },
    ...
  }
});

The very first mapping maps an alias to the actual module, in this case on GitHub. Even though some-dependency has been downloaded and referenced, there is no alias mapping. So you can reference some-module in your code but not some-dependency. If you try to reference some-dependency you will get a file not found error. If you jspm install some-dependency it will create the alias mapping for you and you can then reference this in your modules:

System.config({
  ...
  map: {
    "some-module": "github:some-repo/some-module@0.1.2",
    "some-dependency": "github:some-repo/some-module@0.2.2",
    "github:some-repo/some-module@0.1.2": {
      "some-dependency": "github:some-repo/some-dependency@0.2.2"
    },
    "github:some-repo/some-dependency@0.2.2": {
      ...
    },
    ...
  }
});

You can see there is now an alias mapping for some-dependency.

Once we have jspm installed we can then wire up SystemJS and its configuration in our main page:

<script src="jspm_packages/system.js"></script>
<script src="config.js"></script>

Setting up Aurelia

This is pretty simple:

jspm install core-js aurelia-bootstrapper aurelia-router aurelia-http-client
tsd install core-js --save

The bootstrapper is a top level module that fires up Aurelia. The router and http client are optional, so you can omit these and choose your own adventure if you want. You can use jspm update to update these modules.

NOTE: jspm update will only update packages within the version range specified in the package.json. If you want to update to the latest version, outside of this range, you will need to specify the version after the package name, a la jspm install package-name@0.5.0. So in order to update Aurelia modules to the latest version you will have to update each top level module and include the version you want to update to. There is a proposal to add an upgrade flag to jspm so this may be a better option in the future.
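
As a quick illustration of the optional http client (a sketch only; the /api/users endpoint and the Users view-model are hypothetical):

import { inject } from 'aurelia-framework';
import { HttpClient } from 'aurelia-http-client';

@inject(HttpClient)
export class Users {
    users: any[] = [];

    constructor(private http: HttpClient) { }

    // Called by the router when this view-model is navigated to.
    activate() {
        return this.http.get('/api/users')
            .then(response => this.users = response.content);
    }
}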

In our main page we’ll flag the body tag as the base element of the app with the aurelia-app attribute. Then we make the call to fire up Aurelia by importing the bootstrapper module:

<html>
  ...
  <body aurelia-app>
    ...
    <script src="jspm_packages/system.js"></script>
    <script src="config.js"></script>
    <script>
      System.import('aurelia-bootstrapper');
    </script>
  </body>
</html>

Next we need to set up the view and view-model pair. These are conventionally tied together by their name, a la name.html & name.js. The default pair is called app, so app.html and app.js. This can be overridden by setting the aurelia-app attribute in the main page to a name of your choosing:

<html>
  ...
  <body aurelia-app="my-custom-name">
    ...
  </body>
</html>

By default these are loaded from the root. If they are located in a sub folder you will need to adjust the default module path accordingly in the config.js. In the example below, our files are located in the app folder.

System.config({
  ...
  "paths": {
    "*": "app/*.js", // <-- Default path here
    "github:*": "jspm_packages/github/*.js",
    "npm:*": "jspm_packages/npm/*.js"
  }
});
...

Although it may look like it at first glance, the above paths are not globs but simple path mappings. The above configuration would map modules as follows:

Module               Path
somemodule           app/somemodule
github:somemodule    jspm_packages/github/somemodule
npm:somemodule       jspm_packages/npm/somemodule

NOTE: Previously the paths included the .js extension (a la "app/*.js") but this has changed and adding the extension will now cause errors when bundling.
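
To make the default mapping concrete (the module name is illustrative), a plain import in a view-model:

import { Something } from 'somemodule'; // loaded from app/somemodule.js via the "*" path and defaultJSExtensions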

Next we’ll create a simple view model called app.ts:

export class App {
    message: string;

    constructor() {
        this.message = 'Oh hai';
    }

    exclaim() {
        this.message += '!';
    }
}

Then a simple view called app.html:

<template>
    <p>${message}</p>
    <button click.delegate="exclaim()">Exclaim!</button>
</template>

And of course a test. We’ve already stubbed out the test above so we can modify it to test our new view model:

import { expect } from 'chai';
import { App } from '../app';
 
describe('App', () => {
    it('should exclaim!', () => {
        let app = new App();
        app.exclaim();
        app.exclaim();
        expect(app.message).to.equal('Oh hai!!');
    });
});

The test fixture references the app view model (Assuming it’s one level up).


NOTE: The aurelia-bootstrapper type definition is affected by an unresolved issue that will hopefully be fixed in a future release. In the meantime you can just manually change the core-js import to import * as core from 'core-js' in jspm_packages/github/aurelia/bootstrapper@0.17.0/aurelia-bootstrapper.d.ts.

Bundling

As it stands, loading a bunch of little files can be a significant performance hit. This will change with HTTP/2 but in the meantime we need to bundle these files. First install the Aurelia bundler and the text plugin (For bundling html and css files):

npm install aurelia-bundler --save
jspm install text

Next, set up the bundle configuration in your gulpfile. In the example below we are creating two bundles, one for the app and one for Aurelia:

var bundler = require('aurelia-bundler');

...

gulp.task('bundle', function() {
    return bundler.bundle({
        force: true,
        packagePath: '.',
        bundles: {
            "app/app-bundle": {
                includes: [
                    '**/*',
                    '*.html!text'
                ],
                options: {
                    inject: true,
                    minify: true
                }
            },
            "app/aurelia-bundle": {
                includes: [
                    'aurelia-bootstrapper',
                    'aurelia-http-client',
                    'aurelia-router',
                    'github:aurelia/templating-binding',
                    'github:aurelia/templating-resources',
                    'github:aurelia/templating-router',
                    'github:aurelia/loader-default',
                    'github:aurelia/history-browser',
                    'github:aurelia/logging-console'
                ],
                options: {
                    inject: true,
                    minify: true
                }
            }
        }
    });
});

Few things to note:

  • The force flag will overwrite existing bundles.
  • The packagePath points to the folder that contains the package.json at the root of the app.
  • The bundles object, as the name implies, contains configuration for all your bundles. The field names indicate the path of the resulting bundle file (sans the .js extension) and the value contains the bundle configuration.
    • The includes array contains all the resources you want bundled.
      • App bundle:
        • The first include specifies the files we want to include. In this case we want to recursively include every file. The .js file extension can be omitted as this is assumed. This path is relative to the default path we set up in the config.js. So if we specified "*": "app/*" in the config.js and **/* for the bundle path, the bundle would include app/**/*. You can get pretty crazy with these paths using the SystemJS bundler “arithmetic expressions”. These expressions support advanced filtering; see here for more info and the sketch after this list.
        • The second include indicates that we want to include text files. The syntax is the glob followed by an ! and the plugin name, in this case text. You can also include css files.
      • Aurelia bundle:
        • The includes here point to the top level Aurelia modules you want to bundle. The ones listed above should be sufficient at first but as you make use of more Aurelia features you will need to include those modules in the bundle otherwise they will be loaded individually. You will want to include the following modules:
          • Modules that you install via jspm. In the config above these would be the first 3 aurelia-* modules.
          • Any dynamically loaded dependencies that cannot be determined by static analysis. These will show up in the network traffic on page load and look like some-module@0.0.0.js. Look at the response and you’ll see the module name along the lines of github:organization/repository. In the config above these would be the last 6 github:aurelia/* modules.
    • The inject flag tells the bundler to update the config.js with the bundle information. Once this is set, the bundle will be downloaded instead of the individual modules.
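
As an illustration of that bundle arithmetic (the excluded module name is hypothetical), an include can subtract modules from a glob:

"app/app-bundle": {
    includes: [
        // Everything under the default path except a dev-only module,
        // using the SystemJS builder's - (subtract) operator.
        '**/* - dev-tools/mock-data',
        '*.html!text'
    ],
    options: {
        inject: true,
        minify: true
    }
}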

Now run gulp bundle and two bundles will be created relative to your package.json: app/app-bundle.js and app/aurelia-bundle.js. The bundler will also update your config.js with the bundle information so the SystemJS module loader knows about them:

System.config({
  ...
  bundles: {
    "app-bundle": [
      "app.html!github:systemjs/plugin-text@0.0.2",
      "app",
      ...
    ],
    "aurelia-bundle": [
      "github:aurelia/bootstrapper@0.17.0",
      ...
    ]
  },
  ...
});

Now reload the page and verify that the bundle is being downloaded instead of all the individual modules.

NOTE: Currently the SystemJS bundler does not support sourcemap comments. Hopefully this will be supported in future releases.


The getting started guide on the Aurelia site is a nice way to get up and running quickly. This post will dig a little deeper and set up an ES6 Aurelia app, with tests, from scratch. The following post assumes that you are using gulp. If not, start here.

NOTE: ES6 and ES7 are now officially called ES 2015 and ES 2016 respectively.

Setting up Transpilation

First we need to set up transpilation to convert our ES6 code into something current browsers can understand. There are many transpilers out there but we will be using Babel (Formerly 6to5) in this example. Transpilation can be done dynamically in the browser or statically. The former is not appropriate for production so we will do the latter and set up a watcher for dev.

Install the following node modules:

npm install gulp-babel --save
npm install gulp-sourcemaps --save
npm install gulp-rename --save

Then add the following task to your gulpfile.js:

var sourcemaps = require("gulp-sourcemaps");
var babel = require("gulp-babel");
var rename = require('gulp-rename');

gulp.task('babel', function () {
    return gulp.src('**/*.es6', { base: '.' })
        .pipe(sourcemaps.init())
        .pipe(babel())
        .pipe(rename({ extname: '.js' }))
        .pipe(sourcemaps.write('.'))
        .pipe(gulp.dest('.'));
});

Here we are transpiling ES6 files (with a .es6 extension) to ES5. The default module format is CommonJS. We are transpiling to this module format as it can be understood in the browser by SystemJS (Which we’ll cover in a bit) and natively by Node.js for our tests. We are also generating source map files so that transpiled code can be mapped to the original ES6 code when debugging. Finally the output is renamed to *.js. Since we don’t want to commit generated .js files to our repository we can selectively exclude them with a .gitignore:

*.js
# Any exclusions
!gulpfile.js

We’ll set up the watcher below when we set up the tests.

NOTE: As of September 2015 Babel does not enable ES7 decorators by default as they are considered experimental. Aurelia makes use of these so you’ll either have to enable experimental features in Babel a la:

    .pipe(babel({ stage: 1 }))

    // Or

    .pipe(babel({ optional: ["es7.decorators"] }))

Or you can create a static property called decorators on your class and set its value using the Decorators DSL a la:

import { HttpClient } from 'aurelia-http-client';
import { Decorators } from 'aurelia-framework';

export class SomeNiftyClass {...}

SomeNiftyClass.decorators = Decorators.transient().inject(HttpClient);

Setting up the Test Runner

We will use the Mocha test framework and Chai assertion library. We make our test task depend on the babel task to transpile before running the tests. And since Node.js doesn’t natively support source maps, we’ll need to use the source-map-support module. We also need to reference the core-js ES* polyfill to get ES6 API support in Node.

Install the following node modules:

npm install gulp-mocha --save
npm install chai --save
npm install source-map-support --save
npm install core-js --save

Then add the following task to your gulpfile.js:

var mocha = require('gulp-mocha');
var childProcess = require('child_process');
require('source-map-support').install();
require('core-js');

gulp.task('watch', function () {
    var spawnTests = function() {
        childProcess.spawn('gulp', ['test'], { stdio: 'inherit' });
    }
    spawnTests();
    gulp.watch('**/*.es6', spawnTests);
});

gulp.task('test', ['babel'], function () {
    return gulp.src(['tests/**/*.js'])
        .pipe(mocha({ reporter: 'spec', ui: 'bdd' }));
});

Here we add the watcher that transpiles our ES6 code then fires off the test runner. Spawning the test runner prevents certain failures from stopping the watch loop. Also, since the watch is fired only on changes, we run it right off the bat so we don’t have to wait for changes to see results.

Now we’ll create a fixture called app.es6, under the tests folder, with a dummy test to make sure everything is working:

import { expect } from 'chai';
 
describe('Test', () =>
    it('should work.', () => 
        expect(1).to.equal(1)
    )
);

And run the tests:

gulp watch

Setting up SystemJS & jspm

Aurelia fully embraces ES6 and is made up of many composable modules. This allows you to pick and choose what pieces you want to use. As such, there isn’t one single file you can reference like you would with jQuery or Angular 1.x. This presents a number of challenges. First, how do we easily get a hold of all the modules we want? And second, how do we wire up and load all these modules? Both of these problems are solved by the SystemJS module loader and its corresponding package manager jspm. SystemJS supports many different module standards (ES6, AMD, CommonJS) and jspm enables us to easily get a hold of modules and their dependencies (Just like NPM). jspm also automatically wires up modules in the SystemJS loader so we don’t have to. We will install SystemJS (And later Aurelia) with jspm. First let’s install jspm:

# Install jspm globally
npm install jspm -g

# Lock down the local version
npm install jspm --save

# At the root of your web app
jspm init

NOTE: If you have one project, one package.json and multiple web apps, you can run jspm init . in the root of each web app. This will initialize it separately for each app.

jspm init will prompt you for a few pieces of information. You can probably accept the defaults, see here for more info on them. You will want to respond no when prompted to use an ES6 transpiler (As we’ve already handled that).

Would you like jspm to prefix the jspm package.json properties under jspm? [yes]:
Enter server baseURL (public folder path) [./]:
Enter jspm packages folder [./jspm_packages]:
Enter config file path [./config.js]:
Configuration file config.js doesn't exist, create it? [yes]:
Enter client baseURL (public folder URL) [/]:
Do you wish to use an ES6 transpiler? [yes]:no
ok   Verified package.json at package.json
     Verified config file at config.js
     Looking up loader files...
       system.js
       system.js.map
       system-csp-production.js
       system-polyfills.js
       system.src.js
       system-csp-production.src.js
       system-polyfills.js.map
       system-polyfills.src.js
       system-csp-production.js.map
     
     Using loader versions:
       systemjs@0.18.17
ok   Loader files downloaded successfully

The jspm init command does a few things. First it adds a section to your package.json for configuration and dependencies:

{
  "name": "SetecAstronomy",
  "version": "0.0.0",
  ...
  "jspm": {
    "directories": {},
    "dependencies": {
      "core-js": "npm:core-js@^1.1.3"
    }
  }
}

Next it creates a config.js where SystemJS modules are wired up:

System.config({
  baseURL: "/",
  defaultJSExtensions: true,
  transpiler: "none",
  paths: {
    "github:*": "jspm_packages/github/*",
    "npm:*": "jspm_packages/npm/*"
  },

  map: {
    "core-js": "npm:core-js@1.1.3",
    "github:jspm/nodelibs-process@0.1.1": {
      "process": "npm:process@0.10.1"
    },
    "npm:core-js@1.1.3": {
      "fs": "github:jspm/nodelibs-fs@0.1.2",
      "process": "github:jspm/nodelibs-process@0.1.1",
      "systemjs-json": "github:systemjs/plugin-json@0.1.0"
    }
  }
});

When you install packages, jspm automatically wires up all modules and their dependencies here so you don’t have to do it manually. All dependencies are stored in the jspm_packages folder (Analogous to node_modules).

One thing to note about installing jspm modules is that an alias mapping is created for modules you install directly but not for their dependencies. So for example if you install a module some-module with a dependency of some-dependency the config.js will look something along the lines of this:

System.config({
  ...
  map: {
    "some-module": "github:some-repo/some-module@0.1.2",
    "github:some-repo/some-module@0.1.2": {
      "some-dependency": "github:some-repo/some-dependency@0.2.2"
    },
    "github:some-repo/some-dependency@0.2.2": {
      ...
    },
    ...
  }
});

The very first mapping maps an alias to the actual module, in this case on GitHub. Even though some-dependency has been downloaded and referenced, there is no alias mapping. So you can reference some-module in your code but not some-dependency. If you try to reference some-dependency you will get a file not found error. If you jspm install some-dependency it will create the alias mapping for you and you can then reference this in your modules:

System.config({
  ...
  map: {
    "some-module": "github:some-repo/some-module@0.1.2",
    "some-dependency": "github:some-repo/some-module@0.2.2",
    "github:some-repo/some-module@0.1.2": {
      "some-dependency": "github:some-repo/some-dependency@0.2.2"
    },
    "github:some-repo/some-dependency@0.2.2": {
      ...
    },
    ...
  }
});

You can see there is now an alias mapping for some-dependency.

Once we have jspm installed we can then wire up SystemJS and its configuration in our main page:

<script src="jspm_packages/system.js"></script>
<script src="config.js"></script>

Setting up Aurelia

This is pretty simple:

jspm install core-js aurelia-bootstrapper aurelia-router aurelia-http-client

The bootstrapper is a top level module that fires up Aurelia. The router and http client are optional, so you can omit these and choose your own adventure if you want. You can use jspm update to update these modules.

NOTE: jspm update will only update packages within the version range specified in the package.json. If you want to update to the latest version, outside of this range, you will need to specify the version after the package name, a la jspm install package-name@0.5.0. So in order to update Aurelia modules to the latest version you will have to update each top level module and include the version you want to update to. There is a proposal to add an upgrade flag to jspm so this may be a better option in the future.
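
As a quick illustration of the optional router (a sketch; the route names and module ids are hypothetical), the root view-model maps routes like this:

export class App {
    // Aurelia calls configureRouter() on the root view-model at startup.
    configureRouter(config, router) {
        config.title = 'My App';
        config.map([
            { route: ['', 'home'], name: 'home',  moduleId: 'home',  title: 'Home' },
            { route: 'users',      name: 'users', moduleId: 'users', title: 'Users' }
        ]);
        this.router = router;
    }
}

Each matched route’s view is then rendered wherever you place a router-view element in the corresponding view.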

In our main page we’ll flag the body tag as the base element of the app with the aurelia-app attribute. Then we make the call to fire up Aurelia by importing the bootstrapper module:

<html>
  ...
  <body aurelia-app>
    ...
    <script src="jspm_packages/system.js"></script>
    <script src="config.js"></script>
    <script>
      System.import('aurelia-bootstrapper');
    </script>
  </body>
</html>

Next we need to set up the view and view-model pair. These are conventionally tied together by their name, a la name.html & name.js. The default pair is called app, so app.html and app.js. This can be overridden by setting the aurelia-app attribute in the main page to a name of your choosing:

<html>
  ...
  <body aurelia-app="my-custom-name">
    ...
  </body>
</html>

By default these are loaded from the root. If they are located in a sub folder you will need to adjust the default module path accordingly in the config.js. In the example below, our files are located in the app folder.

System.config({
  ...
  "paths": {
    "*": "app/*", // <-- Default path here
    "github:*": "jspm_packages/github/*",
    "npm:*": "jspm_packages/npm/*"
  }
});
...

Although it may look like it at first glance, the above paths are not globs but simple path mappings. The above configuration would map modules as follows:

Module               Path
somemodule           app/somemodule
github:somemodule    jspm_packages/github/somemodule
npm:somemodule       jspm_packages/npm/somemodule

NOTE: Previously the paths included the .js extension (a la "app/*.js") but this has changed and adding the extension will now cause errors when bundling.

Next we’ll create a simple view model called app.es6:

export class App {
    constructor() {
        this.message = 'Oh hai';
    }

    exclaim() {
        this.message += '!';
    }
}

Then a simple view called app.html:

<template>
    <p>${message}</p>
    <button click.delegate="exclaim()">Exclaim!</button>
</template>

And of course a test. We’ve already stubbed out the test above so we can modify it to test our new view model:

import { expect } from 'chai';
import { App } from '../app';
 
describe('App', () => {
    it('should exclaim!', () => {
        let app = new App();
        app.exclaim();
        app.exclaim();
        expect(app.message).to.equal('Oh hai!!');
    });
});

The test fixture references the app view model (Assuming it’s one level up).

Bundling

As it stands, loading a bunch of little files can be a significant performance hit. This will change with HTTP/2 but in the meantime we need to bundle these files. First install the Aurelia bundler and the text plugin (For bundling html and css files):

npm install aurelia-bundler --save
jspm install text

Next, set up the bundle configuration in your gulpfile. In the example below we are creating two bundles, one for the app and one for Aurelia:

var bundler = require('aurelia-bundler');

...

gulp.task('bundle', function() {
    return bundler.bundle({
        force: true,
        packagePath: '.',
        bundles: {
            "app/app-bundle": {
                includes: [
                    '**/*',
                    '*.html!text'
                ],
                options: {
                    inject: true,
                    minify: true
                }
            },
            "app/aurelia-bundle": {
                includes: [
                    'aurelia-bootstrapper',
                    'aurelia-http-client',
                    'aurelia-router',
                    'github:aurelia/templating-binding',
                    'github:aurelia/templating-resources',
                    'github:aurelia/templating-router',
                    'github:aurelia/loader-default',
                    'github:aurelia/history-browser',
                    'github:aurelia/logging-console'
                ],
                options: {
                    inject: true,
                    minify: true
                }
            }
        }
    });
});

Few things to note:

  • The force flag will overwrite existing bundles.
  • The packagePath points to the folder that contains the package.json at the root of the app.
  • The bundles object, as the name implies, contains configuration for all your bundles. The field names indicate the path of the resulting bundle file (sans the .js extension) and the value contains the bundle configuration.
    • The includes array contains all the resources you want bundled.
      • App bundle:
        • The first include specifies the files we want to include. In this case we want to recursively include every file. The .js file extension can be omitted as this is assumed. This path is relative to the default path we set up in the config.js. So if we specified "*": "app/*" in the config.js and **/* for the bundle path, the bundle would include app/**/*. You can get pretty crazy with these paths using the SystemJS bundler “arithmetic expressions”. These expressions support advanced filtering, see here for more info.
        • The second include indicates that we want to include text files. The syntax is the glob followed by an ! and the plugin name, in this case text. You can also include css files.
      • Aurelia bundle:
        • The includes here point to the top level Aurelia modules you want to bundle. The ones listed above should be sufficient at first but as you make use of more Aurelia features you will need to include those modules in the bundle otherwise they will be loaded individually. You will want to include the following modules:
          • Modules that you install via jspm. In the config above these would be the first 3 aurelia-* modules.
          • Any dynamically loaded dependencies that cannot be determined by static analysis. These will show up in the network traffic on page load and look like some-module@0.0.0.js. Look at the response and you’ll see the module name along the lines of github:organization/repository. In the config above these would be the last 6 github:aurelia/* modules.
    • The inject flag tells the bundler to update the config.js with the bundle information. Once this is set, the bundle will be downloaded instead of the individual modules.

Now run gulp bundle and two bundles will be created relative to your package.json: app/app-bundle.js and app/aurelia-bundle.js. The bundler will also update your config.js with the bundle information so the SystemJS module loader knows about them:

System.config({
  ...
  bundles: {
    "app-bundle": [
      "app.html!github:systemjs/plugin-text@0.0.2",
      "app",
      ...
    ],
    "aurelia-bundle": [
      "github:aurelia/bootstrapper@0.17.0",
      ...
    ]
  },
  ...
});

Now reload the page and verify that the bundle is being downloaded instead of all the individual modules.

NOTE: Currently the SystemJS bundler does not support sourcemap comments. Hopefully this will be supported in future releases.


The following post discusses how to setup Salesforce SSO with SAML and a .NET web application. First off I’d highly recommend looking over these resources:

SAML Notes

Few things to note:

  • Identity Provider (IdP): The application that will be authenticating users. In this post, your web app.
  • Service Provider (SP): The 3rd party application that your IdP will be providing authentication for, in this post Salesforce.
  • It appears that although Salesforce supports both SAML 1.1 and 2.0, going forward you can only create new configurations with SAML 2.0.
  • SAML has the notion of…
    • Assertions. We will be creating an “Authentication Assertion” in this post, which is issued by the IdP to tell the SP that the user has been authenticated.
    • Bindings. Bindings describe how communication can be done (For example SOAP, HTTP POST, HTTP Redirect, etc). We will be using the HTTP POST Binding.
    • Profiles. We will be concerning ourselves with the “Web Browser SSO Profile” in this post.

Generating a Certificate

We need to generate a certificate in order to sign the SAML assertion. This can be done with the certificate creation tool makecert.exe and the pvk to pfx conversion tool Pvk2Pfx.exe. These tools are part of the Windows SDK which can be downloaded here. They can be found under C:\Program Files (x86)\Microsoft SDKs\Windows\*\Bin\. The following command will generate the private key and the certificate which we will provide to Salesforce. You will be prompted for a password for the private key which you can leave blank.

makecert -r -a sha256 -n "CN=My Certificate" -sky signature -sv mycert.pvk mycert.cer

A few notes on the flags:

  • -r Creates a self-signed certificate.
  • -a The hashing algorithm used. The default is MD5 which is susceptible to collision attacks.
  • -n The subject where we specify the common name.
  • -sky The subject’s key specification, which in this case we will be signing.
  • -sv The filename of the private key we will generate.
  • And finally the filename of the X.509 certificate we will generate, which contains the public key among other things.

Next we will create a PKCS 12 file which contains the private key and the certificate which will be used in our .NET code to sign the SAML responses. We can delete the private key file after this has been created.

pvk2pfx -pvk mycert.pvk -spc mycert.cer -pfx mycert.pfx
del mycert.pvk

Salesforce Configuration

First you will need to enable SAML, then you will need to add your SAML configuration. You can watch a demo of this here.

  • Go to Salesforce Setup and browse to Administer > Security Controls > Single Sign-On Settings.

SSO Page

  • Edit the “Federated Single Sign-On Using SAML” setting, check “SAML Enabled” and click “Save”.
  • Click “New” on the “SAML Single Sign-On Settings” section.
  • Enter the SSO Settings and click “Save”. You can find the documentation for this here.
    • Name: The friendly name of the IdP, can be whatever you want.
    • API Name: The id of the SSO configuration when accessed by the API. This will be auto generated so you don’t have to worry about it.
    • Issuer: This is the issuer that will be specified in the SAML assertion from our IdP. We will discuss this further below.
    • Entity Id: This will be https://saml.salesforce.com if you do not have a custom domain setup, otherwise you will want to use your custom domain (e.g. a Salesforce sub domain https://myorg.cs11.my.salesforce.com or your own domain).
    • Identity Provider Certificate: This is the certificate (.cer file) we generated in the last section.
    • Request Signing Certificate: The certificate used for signing SP requests. The default is fine unless you want to control expiration.
    • Request Signature Method: The hashing algorithm used to sign SP requests; choose RSA-SHA256 as SHA1 is susceptible to collision attacks.
    • Assertion Decryption Certificate: Set to Assertion not encrypted as we are not encrypting assertions.
    • SAML Identity Type: This tells Salesforce what kind of user id you are passing it, e.g. a Salesforce user name, id or a custom user id from your IdP (Or federation id). In this post we will be passing in a federation id.
    • SAML Identity Location: This tells Salesforce where to look for the user id in the SAML assertion. In this post we will put it in the NameIdentifier element.
    • Service Provider Initiated Request Binding: This option indicates how SP initiated SAML requests will pass the request. In other words, if someone clicks a Salesforce link, Salesforce will need to check with your IdP to authenticate the user. Salesforce can either POST the SAML request or redirect and pass the SAML request in the querystring. According to the Salesforce documentation a redirect is recommended as it plays nicer with iOS devices.
    • Identity Provider Login URL: The url where SP initiated requests (i.e. Salesforce links) are redirected to be authenticated by the IdP. We can provide this information in the SAML assertion so we can omit it here.
    • Identity Provider Logout URL: The url where users are redirected when they click the logout link in Salesforce to be logged out by the IdP. We can provide this information in the SAML assertion so we can omit it here.
    • Custom Error URL: The url where users are directed when there is an error authenticating.
    • User Provisioning Enabled: This indicates whether Salesforce automatically creates new users upon authenticating, if they don’t already exist. I won’t cover that in this post but you can find more about it here.

SSO Settings

  • Once you create your SAML settings you will be taken to a page that displays the configuration as well as the SSO endpoint you will need to use to login users from the IdP:

SSO Settings

Creating the SAML Assertion

In order for a user to access Salesforce, the IdP will need to create and POST a SAML assertion to the Salesforce login url. The following is a SAML assertion with the values specified mustache-style:

<?xml version="1.0" encoding="UTF-8"?>
<saml2p:Response 
    xmlns:saml2p="urn:oasis:names:tc:SAML:2.0:protocol" 
    xmlns:saml2="urn:oasis:names:tc:SAML:2.0:assertion"
    xmlns:xs="http://www.w3.org/2001/XMLSchema" 
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns:ds="http://www.w3.org/2000/09/xmldsig#"
    Destination="{{SalesforceLoginUrl}}" ID="_{{ResponseId}}" 
    IssueInstant="{{Timestamp}}" Version="2.0">
    <saml2:Issuer>{{Issuer}}</saml2:Issuer>
    <saml2p:Status>
        <saml2p:StatusCode Value="urn:oasis:names:tc:SAML:2.0:status:Success"/>
    </saml2p:Status>
    <saml2:Assertion ID="_{{AssertionId}}" IssueInstant="{{Timestamp}}" Version="2.0">
        <saml2:Issuer>{{Issuer}}</saml2:Issuer>
        <saml2:Subject>
            <saml2:NameID Format="urn:oasis:names:tc:SAML:1.1:nameid-format:unspecified">{{FederationId}}</saml2:NameID>
            <saml2:SubjectConfirmation Method="urn:oasis:names:tc:SAML:2.0:cm:bearer">
                <saml2:SubjectConfirmationData NotOnOrAfter="{{Expires}}" Recipient="{{SalesforceLoginUrl}}"/>
            </saml2:SubjectConfirmation>
        </saml2:Subject>
        <saml2:Conditions NotBefore="{{Timestamp}}" NotOnOrAfter="{{Expires}}">
            <saml2:AudienceRestriction>
                <saml2:Audience>{{EntityId}}</saml2:Audience>
            </saml2:AudienceRestriction>
        </saml2:Conditions>
        <saml2:AuthnStatement AuthnInstant="{{Timestamp}}">
            <saml2:AuthnContext>
                <saml2:AuthnContextClassRef>urn:oasis:names:tc:SAML:2.0:ac:classes:unspecified</saml2:AuthnContextClassRef>
            </saml2:AuthnContext>
        </saml2:AuthnStatement>
        <saml2:AttributeStatement>
            <saml2:Attribute Name="ssoStartPage" NameFormat="urn:oasis:names:tc:SAML:2.0:attrname-format:unspecified">
                <saml2:AttributeValue xsi:type="xs:string">{{IdentityProviderLoginUrl}}</saml2:AttributeValue>
            </saml2:Attribute>
            <saml2:Attribute Name="logoutURL" NameFormat="urn:oasis:names:tc:SAML:2.0:attrname-format:unspecified">
                <saml2:AttributeValue xsi:type="xs:string">{{IdentityProviderLogoutUrl}}</saml2:AttributeValue>
            </saml2:Attribute>
        </saml2:AttributeStatement>
    </saml2:Assertion>
</saml2p:Response>

This assertion is then signed and base64 encoded. The following .NET code does this for you (Signed with SHA256 instead of the default SHA1):

public class SamlAssertion
{
    private readonly byte[] _certificate;
    private readonly string _issuer;
    private readonly string _entityId;
    private readonly string _salesforceLoginUrl;
    private readonly string _identityProviderLoginUrl;
    private readonly string _identityProviderLogoutUrl;

    public SamlAssertion(
        byte[] certificate,
        string issuer, 
        string entityId,
        string salesforceLoginUrl, 
        string identityProviderLoginUrl = null, 
        string identityProviderLogoutUrl = null)
    {
        _certificate = certificate;
        _issuer = issuer;
        _entityId = entityId;
        _salesforceLoginUrl = salesforceLoginUrl;
        _identityProviderLoginUrl = identityProviderLoginUrl;
        _identityProviderLogoutUrl = identityProviderLogoutUrl;
    }

    public string Create(string federationId)
    {
        var saml2AssertionNamespace = XNamespace.Get("urn:oasis:names:tc:SAML:2.0:assertion");
        var saml2ProtocolNamespace = XNamespace.Get("urn:oasis:names:tc:SAML:2.0:protocol");
        var xmlSchemaNamespace = XNamespace.Get("http://www.w3.org/2001/XMLSchema");
        var xmlSchemaInstanceNamespace = XNamespace.Get("http://www.w3.org/2001/XMLSchema-instance");
        var timestamp = DateTime.Now.ToString("o");
        var expires = DateTime.Now.AddMinutes(1).ToString("o");
        Func<string> createId = () => string.Format("_{0:N}", Guid.NewGuid());
        var requestId = createId();

        var attributes = new List<XElement>();
        Action<string, string> addAttribute = (name, value) =>
            attributes.Add(new XElement(saml2AssertionNamespace + "Attribute",
                new XAttribute("Name", name),
                new XAttribute("NameFormat", "urn:oasis:names:tc:SAML:2.0:attrname-format:unspecified"),
                new XElement(saml2AssertionNamespace + "AttributeValue",
                    new XAttribute(xmlSchemaInstanceNamespace + "type", "xs:string"), value)));

        if (_identityProviderLoginUrl != null) addAttribute("ssoStartPage", _identityProviderLoginUrl);
        if (_identityProviderLogoutUrl != null) addAttribute("logoutURL", _identityProviderLogoutUrl);

        var assertion =
            new XDocument(
            new XElement(saml2ProtocolNamespace + "Response",
                new XAttribute(XNamespace.Xmlns + "saml2", saml2AssertionNamespace),
                new XAttribute(XNamespace.Xmlns + "saml2p", saml2ProtocolNamespace),
                new XAttribute(XNamespace.Xmlns + "xs", xmlSchemaNamespace),
                new XAttribute(XNamespace.Xmlns + "xsi", xmlSchemaInstanceNamespace),
                new XAttribute("Destination", _salesforceLoginUrl),
                new XAttribute("ID", requestId),
                new XAttribute("IssueInstant", timestamp),
                new XAttribute("Version", "2.0"),
                new XElement((saml2AssertionNamespace + "Issuer"), _issuer),
                new XElement(saml2ProtocolNamespace + "Status",
                    new XElement(saml2ProtocolNamespace + "StatusCode",
                        new XAttribute("Value", "urn:oasis:names:tc:SAML:2.0:status:Success"))),
                new XElement(saml2AssertionNamespace + "Assertion",
                    new XAttribute("ID", createId()),
                    new XAttribute("IssueInstant", timestamp),
                    new XAttribute("Version", "2.0"),
                    new XElement(saml2AssertionNamespace + "Issuer", _issuer),
                    new XElement(saml2AssertionNamespace + "Subject",
                        new XElement(saml2AssertionNamespace + "NameID",
                            new XAttribute("Format", "urn:oasis:names:tc:SAML:1.1:nameid-format:unspecified"), 
                            federationId),
                        new XElement(saml2AssertionNamespace + "SubjectConfirmation",
                            new XAttribute("Method", "urn:oasis:names:tc:SAML:2.0:cm:bearer"),
                            new XElement(saml2AssertionNamespace + "SubjectConfirmationData",
                                new XAttribute("NotOnOrAfter", expires),
                                new XAttribute("Recipient", _salesforceLoginUrl)))),
                    new XElement(saml2AssertionNamespace + "Conditions",
                        new XAttribute("NotBefore", timestamp),
                        new XAttribute("NotOnOrAfter", expires),
                        new XElement(saml2AssertionNamespace + "AudienceRestriction",
                            new XElement(saml2AssertionNamespace + "Audience", _entityId))),
                    new XElement(saml2AssertionNamespace + "AuthnStatement",
                        new XAttribute("AuthnInstant", timestamp),
                        new XElement(saml2AssertionNamespace + "AuthnContext",
                            new XElement(saml2AssertionNamespace + "AuthnContextClassRef", 
                                "urn:oasis:names:tc:SAML:2.0:ac:classes:unspecified"))),
                    new XElement(saml2AssertionNamespace + "AttributeStatement", attributes)))
            );

        return Convert.ToBase64String(Encoding.UTF8.GetBytes(
            SignDocument(assertion, _certificate, requestId).InnerXml));
    }

    private static XmlDocument SignDocument(XDocument xdocument, byte[] certificate, string uri)
    {
        const string sha256Algorithm = "http://www.w3.org/2001/04/xmldsig-more#rsa-sha256";
        CryptoConfig.AddAlgorithm(typeof(RsaPkCs1Sha256SignatureDescription), sha256Algorithm);
        var key = new RSACryptoServiceProvider(new CspParameters(24)) { PersistKeyInCsp = false };
        key.FromXmlString(new X509Certificate2(certificate, "", X509KeyStorageFlags.Exportable)
            .PrivateKey.ToXmlString(true));

        var document = new XmlDocument();
        using (var reader = xdocument.CreateReader()) { document.Load(reader); }

        var orderRef = new Reference("#" + uri);
        orderRef.AddTransform(new XmlDsigEnvelopedSignatureTransform());

        var signer = new SignedXml(document)
        {
            KeyInfo = new KeyInfo(),
            SigningKey = key
        };
        signer.SignedInfo.SignatureMethod = sha256Algorithm;
        signer.KeyInfo.AddClause(new KeyInfoX509Data(new X509Certificate2(certificate, "")));
        signer.AddReference(orderRef);
        signer.ComputeSignature();

        document.DocumentElement.PrependChild(signer.GetXml());

        return document;
    }

    // Pulled from .NET 4.5+
    public class RsaPkCs1Sha256SignatureDescription : SignatureDescription
    {
        public RsaPkCs1Sha256SignatureDescription()
        {
            KeyAlgorithm = "System.Security.Cryptography.RSACryptoServiceProvider";
            DigestAlgorithm = "System.Security.Cryptography.SHA256Managed";
            FormatterAlgorithm = "System.Security.Cryptography.RSAPKCS1SignatureFormatter";
            DeformatterAlgorithm = "System.Security.Cryptography.RSAPKCS1SignatureDeformatter";
        }

        public override AsymmetricSignatureDeformatter CreateDeformatter(AsymmetricAlgorithm key)
        {
            var asymmetricSignatureDeformatter = (AsymmetricSignatureDeformatter)
                CryptoConfig.CreateFromName(DeformatterAlgorithm);
            asymmetricSignatureDeformatter.SetKey(key);
            asymmetricSignatureDeformatter.SetHashAlgorithm("SHA256");
            return asymmetricSignatureDeformatter;
        }
    }
}

This class has 6 constructor parameters:

  • certificate: This is the .pfx file we generated above.
  • issuer: This is the issuer we specified in the Salesforce SAML configuration. It doesn’t matter what this is as long as it matches what’s in Salesforce.
  • entityId: This is the entity id we specified in the Salesforce SAML configuration.
  • salesforceLoginUrl: This is the login url that Salesforce generated for us in the Salesforce SAML configuration (Under Endpoints).
  • identityProviderLoginUrl: This is an optional url where users are sent if they need to login; it should be your IdP login page.
  • identityProviderLogoutUrl: This is an optional url where users are sent if they click the logout link in Salesforce.

The Create method takes a federation id and builds the assertion. Assertions are time-sensitive and must be used immediately or they will expire, so do not generate an assertion to be used later.

HTTP POST Binding

Now that we have everything configured and have a way to generate assertions, we need to be able to POST the assertion to Salesforce. The form is defined as follows:

<form method="post" action="https://myorg.cs21.force.com/customers/login?so=00AA0000000DDbb">
    <input type="hidden" name="SAMLResponse" value="aHR0cDovL2JpdC5seS8xR3YwRzc0..." />
    <input type="hidden" name="RelayState" value="https://myorg.cs21.force.com/customers/888f00000000abc" />
</form>

The SAMLResponse is the assertion we generate above. The RelayState is an optional landing page in Salesforce.

As noted earlier, we need to generate the SAML assertion on the fly and not simply render it in a web page to potentially be clicked later. One way we can accomplish this is by adding an endpoint that generates the SAML assertion, then dynamically creating the form on the client side and submitting it (Unfortunately we can’t simply ajax POST as this violates the same-origin policy). So first, we need to create an endpoint along these lines:

public class SamlController : Controller
{
    public string Assertion()
    {
        return new SamlAssertion(
            Assembly.GetExecutingAssembly().GetEmbeddedResource("mycert.pfx"),
            Configuration.Saml.Issuer,
            Configuration.Saml.EntityId,
            Configuration.Saml.SalesforceLoginUrl,
            Configuration.Saml.IdentityProviderLoginUrl,
            Configuration.Saml.IdentityProviderLogoutUrl)
            .Create(Token.Current.Username);
    }
}

On the client side we can pass our Salesforce URLs into the view. When the link is clicked, a SAML assertion is generated and returned. A hidden form is then dynamically generated, submitted, and removed.

<a href="#" id="salesforce">Login to Salesforce.</a>

<script>
    var salesforceLoginUrl = '@ViewBag.SalesforceLoginUrl';
    var salesforceStartUrl = '@ViewBag.SalesforceStartUrl';
   
    $(function () {
        $('#salesforce').click(function() {
            $.get('/saml/assertion')
            .done(function (data) {
                var form = $('<form>', {
                    'method': 'POST',
                    'action': salesforceLoginUrl,
                    'target': '_blank',
                    'style': 'visibility: hidden'
                });

                form.append($('<input>', {
                    'name': 'SAMLResponse',
                    'value': data
                }))
                .append($('<input>', {
                    'name': 'RelayState',
                    'value': salesforceStartUrl
                }))
                .appendTo('body') // Required for FF
                .submit();

                form.remove();
            });
        });
    });
</script>

And there you have it! If you are having trouble, be sure to check out the SAML Validator. It will contain the assertion from the last login attempt and display any errors. Also check out the SAML login history to see a list of logins and any errors.


ES6 is right around the corner. But thanks to a few libraries and tools, developing and testing ES6 apps today on ES5 is trivial. Let’s see how we can do it.

The code for this post can be found here.

Overview

Developing ES6 apps on ES5 requires three things:

  1. Polyfill: ES6 polyfills add new functionality to ES5 browsers and Node.js.
  2. Module Loader: Module loaders do just what their name implies, load modules.
  3. Transpiler: Converts ES6 code to ES5 code so it can be run by ES5 browsers and Node.js.

Layout

In this post we’ll be working with the following layout:

index.html
app.es6
module.es6
gulpfile.js
tests
    module-tests.es6

I’ve gone the route of naming ES6 files with the .es6 extension (which will be transpiled to .js ES5 files). Linguist, the library GitHub uses for syntax highlighting, recognizes the .es6 extension, so they will be properly highlighted in GitHub.

Polyfill

Polyfills add new ES6 functionality to ES5 browsers and Node.js. Polyfills can only do so much though; for example, they cannot add support for new language constructs (But we can work around this with transpilation). There are a number of polyfill libraries out there but we’ll use core-js in this example.

You will need to add core-js to your html file. It’s available through Bower (bower install core.js) or directly here.

<script src="core.js"></script>

Next install the core-js package (npm install core-js --save) and require it in your gulpfile.js.

require('core-js');
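
To get a feel for what the polyfill provides, here is a quick sketch of standard ES6 runtime APIs that core-js backfills on an ES5 engine (written in plain ES5 syntax so no transpilation is needed):

// All of these are ES6 runtime features provided by core-js on ES5 engines
var numbers = new Set([1, 2, 2, 3]);
console.log(numbers.size);              // 3
console.log(Array.from(numbers));       // [1, 2, 3]
console.log('es6'.includes('6'));       // true
Promise.resolve('oh hai').then(function (x) { console.log(x); });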

Module Loader

Since ES5 browsers do not have the ability to load modules we will need to use a module loader. There are many module loaders out there but we will be using SystemJS in this example. SystemJS has the ability to load a number of different module formats. In this example we will be transpiling to CommonJS modules as these can be handled by SystemJS in the browser and natively by Node.js.

You will need to add SystemJS and the ES6 module loader polyfill (Which SystemJS depends on) to your html file. They’re available through Bower (bower install system.js) or directly here and here respectively.

<script src="es6-module-loader.js"></script>
<script src="system.js"></script>

NOTE: You do not need to include a transpiler, as noted here, since we will not be transpiling on the fly in the browser.

Transpilation

As mentioned earlier, ES6 support cannot be achieved with polyfills alone as there are new language constructs. The way we deal with this is through transpilation, where ES6 is converted to ES5. There are many transpilers out there but we will be using Babel (Formerly 6to5) in this example. Transpilation can be done dynamically in the browser or statically as part of a build. The former is not appropriate for production so we will do the latter and set up a watcher for dev.
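
To make the distinction concrete, here is roughly what Babel does to new language constructs (the output below is approximate; exact output varies by Babel version):

// ES6 input: arrow function and template literal
const greet = name => `oh ${name}`;

// Roughly the ES5 output Babel produces
"use strict";
var greet = function greet(name) {
    return "oh " + name;
};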

Install the following node modules:

 npm install gulp-babel --save
 npm install gulp-sourcemaps --save
 npm install gulp-rename --save

… and add the following task to your gulpfile.js:

var sourcemaps = require("gulp-sourcemaps");
var babel = require("gulp-babel");
var rename = require('gulp-rename');

gulp.task('babel', function () {
    return gulp.src(['**/*.es6'])
        .pipe(sourcemaps.init())
        .pipe(babel())
        .pipe(rename({ extname: '.js' }))
        .pipe(sourcemaps.write('.'))
        .pipe(gulp.dest('.'));
});

Here we are transpiling ES6 files (with a .es6 extension) to ES5. The default module format is CommonJS. We are transpiling to this module format as it can be understood in the browser by SystemJS and natively by Node.js for our tests. We are also generating source map files so that transpiled code can be mapped to the original ES6 code when debugging. Finally, the output is renamed to *.js. Since we don’t want to commit generated files to our repository we can selectively exclude them in our .gitignore:

*.js
# Any exclusions
!gulpfile.js

Bootstrapping

Bootstrapping our app in the browser is done via the module loader:

<script>
    System.import('app').then(function(app) { app.run(); });
</script>
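
For reference, module.es6 and app.es6 might look something like the following minimal sketch (oh() and run() are placeholders consistent with the bootstrapper above and the test further down):

// module.es6
export function oh() {
    return 'hai';
}

// app.es6
import * as module from './module';

export function run() {
    console.log(module.oh());
}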

Testing

We will use the Mocha test framework and Chai assertion library. There is not really anything special we have to do on the Node side except make our test task depend on the babel task to transpile before running the tests.

Install the following node modules:

 npm install gulp-mocha --save
 npm install chai --save

Then add the following task to your gulpfile.js:

var mocha = require('gulp-mocha');
// Named to avoid shadowing the global process object
var childProcess = require('child_process');

function spawnTests() {
    childProcess.spawn('gulp', ['test'], { stdio: 'inherit' });
}

gulp.task('watch', function () {
    spawnTests();
    gulp.watch('**/*.es6', spawnTests);
});

gulp.task('test', ['babel'], function () {
    return gulp.src(['tests/**/*.js'])
        .pipe(mocha({ reporter: 'spec', ui: 'bdd' }));
});

Spawning the test runner prevents certain failures from stopping the watch loop. Also, since the watch fires only on changes, we spawn the tests right off the bat so we don’t have to wait for a change.

Now we can write a test referencing a module:

import { expect } from 'chai';
import * as module from '../module';
 
describe('Module', () =>
    it('should return something.', () => 
        expect(module.oh()).to.equal('hai')
    )
);

And run the tests:

gulp watch

Conclusion

So writing ES6 apps today is a pretty easy task. With the finalization of the spec so close, it seems like a no-brainer at this point to do all new development in ES6. So long ES5, and thanks for all the fish!


I really love GitHub but unfortunately not every organization can host code outside of their network or justify the cost of GitHub Enterprise (Which starts at $400+ p/m). After trying out some different packages, GitLab is the only thing that even comes close to GitHub for internal use without breaking the bank (Free Community Edition or starting at $34 p/m for Enterprise). In fact the free Community Edition would probably be sufficient for most organizations. Below I outline setting up GitLab on Ubuntu 14.04 with Active Directory integration and a self-signed cert.

Install

The following installation instructions are taken from the download page.

sudo apt-get install openssh-server

# Omit if using an existing SMTP server
sudo apt-get install postfix

curl https://packages.gitlab.com/install/repositories/gitlab/gitlab-ce/script.deb.sh | sudo bash
sudo apt-get install gitlab-ce

# If this fails, reboot and try again
sudo gitlab-ctl reconfigure

Browse to your server and make sure everything is up and running. Log in as root/5iveL!fe and change the default password. Also disable sign ups (Admin area > Settings > Signup enabled).

Configure Email

The following enables email delivery and sets basic options:

sudo vim /etc/gitlab/gitlab.rb
 gitlab_rails['gitlab_email_enabled'] = true
 gitlab_rails['gitlab_email_from'] = 'gitlab@mydomain.int'
 gitlab_rails['gitlab_email_display_name'] = 'GitLab'

If you are sending through an existing SMTP server, configure as follows (See here for more options):

 gitlab_rails['smtp_enable'] = true
 gitlab_rails['smtp_address'] = "smtp.mydomain.int"
 gitlab_rails['smtp_port'] = 25

Finally reconfigure for the changes to take effect.

sudo gitlab-ctl reconfigure

Unfortunately there isn’t a test email button anywhere, but you can add an SSH key as this sends a notification.

Configure Active Directory Integration

In order to integrate with Active Directory you will need to either have anonymous queries enabled or create a domain account with query access. By default members of the Domain Users group have this.

sudo vim /etc/gitlab/gitlab.rb

Configure the basic settings, see here for more details and settings.

 gitlab_rails['ldap_enabled'] = true
 gitlab_rails['ldap_servers'] = YAML.load <<-'EOS'
   main: # 'main' is the GitLab 'provider ID' of this LDAP server
     label: 'My Organization' # Label shown on the login page
     host: 'ad1.mydomain.int' # AD server
     port: 389
     uid: 'sAMAccountName'
     method: 'plain' # "tls" or "ssl" or "plain"
     bind_dn: 'GitLab' # AD user that has query access
     password: 'P@$$w0rd' # Password of said user
     active_directory: true
     allow_username_or_email_login: false
     base: 'CN=Users,DC=mydomain,DC=int'
     user_filter: '' # Leave blank if not used
 EOS

If you want to filter based on group membership you can use the following user filter:

user_filter: '(memberOf:1.2.840.113556.1.4.1941:=CN=GitLabUsers,CN=Users,DC=mydomain,DC=int)'

Next, run the following to propagate the changes and ensure configuration is correct.

sudo gitlab-ctl reconfigure
sudo gitlab-rake gitlab:ldap:check RAILS_ENV=production

If the configuration is correct, you should see a list of users that match the base and filter:

Checking LDAP ...

LDAP users with access to your GitLab server (only showing the first 100 results)
Server: ldapmain
     DN: CN=GitLab,CN=Users,DC=mydomain,DC=int  sAMAccountName: GitLab
     DN: CN=Guest,CN=Users,DC=mydomain,DC=int  sAMAccountName: Guest
     ...

Checking LDAP ... Finished

GitLab pulls user email addresses from AD so you will need to make sure these are set on users accessing GitLab. These cannot be modified in GitLab.

Now log in to GitLab with your domain account and set up your profile. Next, log out and log back in as root. Then give your domain account admin privileges (Admin area > Users > Edit > Admin).

Configure SSL

Create the ssl folder where the cert will be saved.

sudo mkdir -p /etc/gitlab/ssl
sudo chmod 700 /etc/gitlab/ssl

Create a self-signed cert as outlined below, or if you already have a cert, copy the crt and key files into the ssl folder as code.mydomain.int.crt and code.mydomain.int.key respectively.

sudo openssl genrsa -out "/etc/gitlab/ssl/code.mydomain.int.key" 2048
sudo openssl req -new -key "/etc/gitlab/ssl/code.mydomain.int.key" -out "gitlab.csr"

# Country Name (2 letter code) [AU]:US
# State or Province Name (full name) [Some-State]:Maryland
# Locality Name (eg, city) []:Fort Meade
# Organization Name (eg, company) [Internet Widgits Pty Ltd]:Setec Astronomy
# Organizational Unit Name (eg, section) []:Research  
# Common Name (e.g. server FQDN or YOUR name) []:code.mydomain.int
# Email Address []:me@mydomain.int
# A challenge password []:
# An optional company name []:

sudo openssl x509 -req -days 3650 -in "gitlab.csr" \
     -signkey "/etc/gitlab/ssl/code.mydomain.int.key" \
     -out "/etc/gitlab/ssl/code.mydomain.int.crt"

sudo rm "gitlab.csr"

Now configure SSL.

sudo vim /etc/gitlab/gitlab.rb
 external_url 'https://code.mydomain.int'
 ...
 nginx['redirect_http_to_https'] = true

Now add an SSL exception to the firewall, propagate the configuration changes, and restart nginx.

# Enable SSL in the firewall
sudo ufw allow https

sudo gitlab-ctl reconfigure
sudo gitlab-ctl restart

You should now be able to access GitLab over SSL.

Reverse Proxy with SSL

If instead you have SSL set up through a reverse proxy, you can change the default url to be https but you will need to disable SSL in nginx.

sudo vim /etc/gitlab/gitlab.rb
 external_url 'https://code.mydomain.com'
 ...
 nginx['listen_https'] = false
sudo gitlab-ctl reconfigure


If you’re running TeamCity and you want to enable status badges, this post is for you. It turns out this is really easy to do, and you can use the badges that ship with TeamCity or the pretty Shields.io badges.

TeamCity Badges

TeamCity has shipped with status badges out of the box for a while now. To enable them you need to do one of two things, depending on whether or not you want to enable guest access (Otherwise the badge will fail to render).

No Guest Access

If you do not want to enable guest access you will have to individually enable status badges for each build configuration. Under the build configuration General Settings section you will see an option called enable status widget. Check this, save, and the badge will now be available.

TeamCity Status Widget

Guest Access

Enabling guest access is the easiest option as it enables the status icons for all build configurations. It also allows you to create a link to the build page that anyone can view. To do this go to the global Administration page, then go to the Server Administration > Authentication section. There will be a checkbox that allows you to enable guest access. Check this, save, and badges will now be available on all build configurations.

TeamCity Guest Access

Badge Url

Now that badges are enabled you can create the badge url. The url is formatted as follows:

http://Server/app/rest/builds/buildType:(id:BuildConfigId)/statusIcon

So for example if our server is build.myorg.com and the build config id was myapp the url would be as follows:

http://build.myorg.com/app/rest/builds/buildType:(id:myapp)/statusIcon

The build config id can be found under the build configuration settings. TeamCity will generate a default one that is nondescript (e.g. bt24) so you can set a more descriptive one here if you like.

TeamCity Build Config Id

If you enabled guest access you can also create a link to the build status page that is formatted as follows:

http://Server/viewType.html?buildTypeId=BuildConfigId&guest=1

The guest flag automatically logs the visitor in as the guest user. So for example if our server is build.myorg.com and the build config id was myapp the html would be as follows:

<a href="http://build.myorg.com/viewType.html?buildTypeId=myapp&guest=1">
<img src="http://build.myorg.com/app/rest/builds/buildType:(id:myapp)/statusIcon"/>
</a>

Shields.io TeamCity Badges

I love TeamCity but I don’t find their status badges very attractive. Shields.io offers beautiful badges that, fortunately for us, integrate nicely with TeamCity. In order to use the shields.io badges you will need to enable badges as described in the previous section. Also your server will need to be publicly accessible. The url is formatted as follows:

https://img.shields.io/teamcity/Protocol/Server:Port/s/BuildConfigId.svg

The protocol can be either http or https and the port is optional if it is 80. Also you can choose between .svg and .png formats. So for example if our server is build.myorg.com and the build config id was myapp the url would be as follows:

https://img.shields.io/teamcity/http/build.myorg.com/s/myapp.svg

You can also specify a flat style (Which I personally prefer) by tacking on the style flag:

https://img.shields.io/teamcity/http/build.myorg.com/s/myapp.svg?style=flat

You may also want to override the label. For example if you have both TravisCI and TeamCity badges you would want to differentiate the two. You can do this by tacking on the label flag:

https://img.shields.io/teamcity/http/build.myorg.com/s/myapp.svg?label=TeamCity

That will produce the same badge with a TeamCity label.


I’ve been using Grunt to build and deploy .NET apps for about a year now. It’s a huge improvement over Rake and the crusty XML build tools, but I’d been hearing a lot of good things about gulp so I thought it was time to check it out. I was not disappointed and would highly recommend using gulp over Grunt. Gulp’s code over configuration approach eliminates the friction I experienced with Grunt. It also reduces (Or eliminates entirely) the code required to wire up tasks. So let’s see how we can use gulp to build .NET apps.

Project Layout

The project layout will be along the lines of this:

/MyApp
    /src
        MyApp.sln
        ...
    /...
    gulpfile.js
    package.json
    ...

At the root of your project will be a package.json that specifies your dependencies, along with a gulpfile.js which is your build script.

Initial Setup

  • Download and install Node.js.
  • Create a minimal package.json at the project root or use npm init to generate it for you:
{
    "name": "MyApp",
    "version": "0.0.0"
}
  • Create a bare bones gulpfile.js at the project root:
var gulp = require('gulp');

gulp.task('default', []);

gulp.task('ci', []);

Above we have a special default task alias that gets run when you type gulp with no arguments. You can decide what you want that to do; I have it set up to run Karma in watch mode. The second task alias, ci, will be what is run by the build server (And you can call this whatever you want so long as the build server knows about it, as we’ll see later); a sketch of where ci ends up pointing follows this list.

  • Install gulp globally (-g): npm install gulp -g
  • Install gulp locally (No -g) and save the dependency to your package.json: npm install gulp --save
  • Run gulp to make sure all is working, should display something along these lines:
[14:48:30] Using gulpfile /.../gulpfile.js
[14:48:30] Starting 'default'...
[14:48:30] Finished 'default' after 32 μs
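
As your build grows, the ci alias simply points at the last task in your chain. A minimal sketch using the task names defined later in this post (deploy transitively pulls in nunit, build, and the rest through task dependencies):

// 'deploy' depends on 'nunit', which depends on 'build', and so on,
// so referencing the last task runs the whole chain
gulp.task('ci', ['deploy']);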

NOTE: Throughout this post I will be using the --save flag to save NPM dependencies to the package.json file. You may have also seen the --save-dev flag and might be confused as to when you should use one over the other. Check out this SO question for an explanation.

Task Sequence

Gulp will try to run every task in parallel. Obviously you will need to run certain tasks in a particular order in your build. The current version of gulp allows you to do this a few ways. First you will need to specify a dependency and then some way to indicate the dependency has completed. According to the gulp docs, you can indicate that a dependency has completed by either returning a stream, returning a promise or taking in a callback and calling it when done. The following demonstrates the stream and callback approaches:

// gulp-clean deletes the files piped through it
var clean = require('gulp-clean');

// Return a stream so gulp can determine completion
gulp.task('clean', function() {
    return gulp
        .src('app/tmp/*.js', { read: false })
        .pipe(clean());
});

// OR

// Take in the gulp callback and call it when the stream ends
gulp.task('clean', function(callback) {
    gulp.src('app/tmp/*.js', { read: false })
        .pipe(clean())
        .on('end', callback);
});

// Specify the dependencies in the second parameter
gulp.task('build', ['clean'], function() {
    // Build...
});

So if you run the build task in this example, the clean task will run and complete first, then the build task will run. I will favor returning the stream throughout this post unless the callback or promise method makes sense.
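
For completeness, the promise approach looks like this; a minimal sketch using the del module (an assumption on my part, any promise-returning module works the same way):

var del = require('del');

// Return a promise so gulp can determine completion
gulp.task('clean', function() {
    return del(['app/tmp/*.js']);
});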

Seem awkward and/or confusing? You’re not alone. The upcoming gulp 4 release will revamp how this is handled. When that is released I will update this post to reflect those changes.

Passing in parameters

Undoubtedly you’ll want to pass parameters into your build scripts, for example the version number or a nuget API key. One way to do this is with environment variables:

gulp.task('default', function() {
    var version = process.env.BUILD_NUMBER;
    var nugetApiKey = process.env.NUGET_API_KEY;
    ...
});

Another way to do this is by passing them into gulp as parameters:

var args = require('yargs').argv;

gulp.task('default', function() {
    console.log(args.buildVersion);
    console.log(args.debug);
});

$ gulp --build-version 1.2.3.4 --debug
[14:48:30] Using gulpfile /.../gulpfile.js
[14:48:30] Starting 'default'...
[14:48:30] 1.2.3.4
[14:48:30] true
[14:48:30] Finished 'default' after 32 μs

Here we are using the yargs module (Which is a fork of optimist) to parse the gulp command line args (npm install yargs --save). You can pass in any arguments you like so long as they don’t conflict with gulp options (Which is why I use build-version instead of version which is already used by gulp). One nice feature of yargs is that arguments that do not have a value are considered flags and represented as booleans (As demonstrated above with --debug). Another nice feature is the automatic conversion of spinal-case-args to camelCase. I will use this approach throughout this post.
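
The two approaches also combine nicely; for example you can fall back to an environment variable when no argument is passed (a sketch; the names are the ones used above):

var args = require('yargs').argv;

// Prefer the command line arg, fall back to the build server environment variable
var buildVersion = args.buildVersion || process.env.BUILD_NUMBER || '0.0.0.0';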

Assembly Info

The first thing you will want to do is set the version number in the project assembly info files (And any other info you’d like). I personally let the build server manage the version and then grab it from an environment variable, but you can do whatever works best for you. To do this we’ll use the gulp-dotnet-assembly-info plugin (npm install gulp-dotnet-assembly-info --save). Use the plugin as follows:

var args = require('yargs').argv,
    assemblyInfo = require('gulp-dotnet-assembly-info');

gulp.task('assemblyInfo', function() {
    return gulp
        .src('**/AssemblyInfo.cs')
        .pipe(assemblyInfo({
            version: args.buildVersion,
            fileVersion: args.buildVersion,
            company: 'Planet Express',
            copyright: function(value) { 
                return value + '-' + new Date().getFullYear(); 
            },
            ...
        }))
        .pipe(gulp.dest('.'));
});

So we pipe in all AssemblyInfo.cs files, modify them and then save them back out. You can specify a value or a function that returns the value. See here for more info.

Setting Configuration Values

You may need to set values in the app.config or web.config. To do this we’ll use the xmlpoke module (npm install xmlpoke --save). Use the module as follows:

var xmlpoke = require('xmlpoke');

gulp.task('configuration', ['assemblyInfo'], function(cb) {
    xmlpoke('**/{web,app}.config', function(xml) {
        xml.withBasePath('configuration')
           .set("appSettings/add[@key='connString']/@value", 
                'Server=server;Database=database;Trusted_Connection=True;')
           .set('system.net/mailSettings/smtp/network/@host', 'smtp.mycompany.com');
    });
    cb();
});

This module sports a lot more features than shown here. See here for more info.

Building

Now for building. To do that we will use the gulp-msbuild plugin (npm install gulp-msbuild --save). Use the plugin as follows:

var msbuild = require('gulp-msbuild');

gulp.task('build', ['configuration'], function() {
    return gulp
        .src('**/*.sln')
        .pipe(msbuild({
            toolsVersion: 12.0,
            targets: ['Clean', 'Build'],
            errorOnFail: true,
            stdout: true
        }));
});

The plugin looks for msbuild in the PATH. You can also specify the version you want to target with the toolsVersion option. This plugin supports more options than shown here, see here for more info.
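
For example, if you need to control the build configuration you can pass MSBuild properties through the plugin’s properties option (a sketch; check the plugin docs for the exact options your version supports):

gulp.task('build', ['configuration'], function() {
    return gulp
        .src('**/*.sln')
        .pipe(msbuild({
            toolsVersion: 12.0,
            targets: ['Clean', 'Build'],
            // Passed to msbuild as /p:Configuration=Release
            properties: { Configuration: 'Release' },
            errorOnFail: true,
            stdout: true
        }));
});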

On a side note, you may run into the following error when building web applications on your build server:

The imported project "C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\
v11.0\WebApplications\Microsoft.WebApplication.targets" was not found.

A solution can be found here.

Running Tests

Next you will want to run your tests. If you are using NUnit, you’re in luck as there is a gulp plugin for that. We will use the gulp-nunit-runner plugin (npm install gulp-nunit-runner --save). Use the plugin as follows:

var nunit = require('gulp-nunit-runner');

gulp.task('test', ['build'], function () {
    return gulp
        .src(['**/bin/**/*Tests.dll'], { read: false })
        .pipe(nunit({
            teamcity: true
        }));
});

The plugin looks for NUnit in the PATH and by default runs the anycpu version of NUnit (The x86 version can be specified with the platform option). You can also explicitly pass the nunit runner path if you like. You’ll notice we’re passing read: false into the source; this indicates that only filenames, and not content, are to be included in the stream. Also, the teamcity option integrates the test results with TeamCity. The plugin supports many more options than shown here, see here for more info.
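
If NUnit is not in the PATH, you can point the plugin at the console runner explicitly (a sketch; the path below is an assumption, adjust it to your install):

gulp.task('test', ['build'], function () {
    return gulp
        .src(['**/bin/**/*Tests.dll'], { read: false })
        .pipe(nunit({
            // Assumed install location; point this at your nunit-console.exe
            executable: 'C:/Program Files (x86)/NUnit 2.6.4/bin/nunit-console.exe',
            teamcity: true
        }));
});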

Controlling Windows Services

Sometimes you will need to deploy Windows services. In order to do this you will need to stop these services before the deploy and start them after. To do this we can use the windows-service-controller node module (npm install windows-service-controller --save). Use the module as follows:

var sc = require('windows-service-controller');

gulp.task('stop-services', ['nunit'], function() {
    return sc.stop('MyServer', 'MyService');
});

...

gulp.task('start-services', ['deploy'], function() {
    return sc.start('MyServer', 'MyService');
});

Here we are passing in a single service although you can pass in an array of service names if there are multiple. The module supports many more options than shown here, see here for more info.
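
For example, stopping multiple services on the same server might look like this (a sketch with hypothetical service names):

gulp.task('stop-services', ['nunit'], function() {
    return sc.stop('MyServer', ['MyService1', 'MyService2']);
});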

Deploying Files

There are a couple of ways to deploy files. Out of the box, gulp’s innate ability to work with files will get you a long way:

gulp.task('deploy', ['nunit'], function() {
    return gulp
        .src('./src/MyApp.Web/**/*.{config,html,htm,js,dll,pdb,png,jpg,jpeg,gif,css}')
        .pipe(gulp.dest('D:/Websites/www.myapp.com/wwwroot'));
});

If for some reason you need more advanced file copy capabilities you can use Robocopy (The successor to xcopy). We can use the robocopy node module to run it (npm install robocopy --save). Use the module as follows:

var robocopy = require('robocopy');

gulp.task('deploy', ['nunit'], function() {
    return robocopy({
        source: 'src/MyApp.Web',
        destination: 'D:/Websites/www.myapp.com/wwwroot',
        files: ['*.config', '*.html', '*.htm', '*.js', '*.dll', '*.pdb',
                '*.png', '*.jpg', '*.jpeg', '*.gif', '*.css'],
        copy: {
            mirror: true
        },
        file: {
            excludeFiles: ['packages.config'],
            excludeDirs: ['obj', 'Properties'],
        },
        retry: {
            count: 2,
            wait: 3
        }
    });
});

The robocopy function returns a promise so we can just return this to allow gulp to know when it has completed. You’ll also notice that robocopy is not a gulp plugin and this is ok. Unlike Grunt where everything is a plugin, gulp plugins are really only useful if they operate on a stream of files. Many times you will just invoke a module in a gulp task like we do above, no plugin involved at all. @ozcinc has a nice writeup on this here.

The Robocopy options above are pretty self explanatory. The mirror option allows you to synchronize your destination with your source folder, removing any deleted files. The retry options allow you to retry the copy after so many seconds if it failed. Both these options can be useful when deploying websites. The task fully supports all the robocopy options, see here for more info.

Database Migration

More than likely your app is data driven and you will need to deploy schema changes. The simplest approach is to apply a delta script. To do this we can use the sqlcmd-runner module (npm install sqlcmd-runner --save).

var sqlcmd = require('sqlcmd-runner');

gulp.task('database', ['robocopy'], function() {
    return sqlcmd({
        server: 'sql.mycompany.int',
        database: 'myapp',
        inputFiles: [ 'delta.sql' ],
        outputFile: 'delta.log',
        failOnSqlErrors: true,
        errorRedirection: true
    });
});

This module is a wrapper around the sqlcmd utility and supports all options; see here for more info.

If you’re using DB Ghost to create your delta script, you can use the dbghost module (npm install dbghost --save).

var dbghost = require('dbghost'),
    sqlcmd = require('sqlcmd-runner');

// Hypothetical value; set this to the database you are targeting
var targetSchema = 'myapp';

gulp.task('database', ['robocopy'], function() {
    return dbghost.buildCompareAndCreateDelta({
        configSavePath: 'CreateDelta.dbgcm',
        changeManager: {
            reportFilename: 'CreateDelta.log',
            buildDatabaseName: 'Source',
            deltaScriptsFilename: 'delta.sql',
            templateDatabase: {
                name: targetSchema,
                server: 'sql.mycompany.int'
            },
            targetDatabase: {
                name: targetSchema,
                server: 'sql.mycompany.int'
            },
            schemaScripts: {
                rootDirectory: 'schema',
            }
        }
    })
    .then(function() {
        return sqlcmd({
            server: 'sql.mycompany.int',
            database: 'myapp',
            inputFiles: [ 'delta.sql' ],
            outputFile: 'delta.log',
            failOnSqlErrors: true,
            errorRedirection: true
        });
    });
});

This module is a wrapper around ChangeManagerCmd.exe and can either generate a config from scratch or from a template, overriding the template with the config passed in. See here for more info.

Nuget

If you are publishing a Nuget package instead of deploying an app there is a module for that too. We can use the nuget-runner module (npm install nuget-runner --save). You will need to create a nuspec file as described here. Use the module as follows:

var args = require('yargs').argv,
    Nuget = require('nuget-runner');

gulp.task('deploy', ['nunit'], function(done) {

    // Copy all package files into a staging folder and wait
    // for the copy to finish before packing
    gulp.src('src/MyLibrary/bin/Release/MyLibrary.*')
        .pipe(gulp.dest('package/lib'))
        .on('end', function() {

            var nuget = Nuget({ apiKey: args.nugetApiKey });

            nuget
                .pack({
                    spec: 'MyLibrary.nuspec',
                    basePath: 'package', // Specify the staging folder as the base path
                    version: args.buildVersion
                })
                .then(function() { return nuget.push('*.nupkg'); })
                .then(function() { done(); }, done);
        });
});

As demonstrated above you can set the version number and your nuget API key from parameters passed in by the build server. Since the file copy is a stream, we wait for it to finish before packing and signal completion through the task callback. One thing to note is that even though you are passing in the version, the version element must still exist in the nuspec file and have a value, otherwise pack will fail.

You can also simplify the above code a bit more by specifying the files in the .nuspec:

<package ...>
   <metadata>
      ...
   </metadata>
   <files>
      <file src="src\MyLibrary\bin\Release\MyLibrary.*" target="lib" />
   </files>
</package>

Then you can forgo creating the staging folder and specifying a basepath. And since there is no longer a copy to wait on, we can simply return the promise that pack returns so gulp knows when the task has completed:

var Nuget = require('nuget-runner');

gulp.task('deploy', ['nunit'], function() {

    var nuget = Nuget({ apiKey: process.env.NUGET_API_KEY });

    return nuget
        .pack({
            spec: 'MyLibrary.nuspec',
            version: process.env.BUILD_NUMBER
        })
        .then(function() { return nuget.push('*.nupkg'); });
});

The module supports more commands and options than shown here, see here for more info.

Build Server

The last step is to setup your build server to run gulp. I’m going to demonstrate how to configure gulp with TeamCity but this should loosely apply to any build server.

  • Download and install Node.js.
  • Install gulp: npm install gulp -g --prefix="C:\Program Files\nodejs". You will need to set the prefix to a folder in the PATH. I simply put it in the node install directory alongside NPM. By default this folder is added to the PATH by the Node.js installer. Note that depending on your UAC settings you may need to run that command in an elevated command prompt as Program Files can be locked down. Also the 32 bit version will be installed to Program Files (x86) by default.
  • Restart the TeamCity build agent so it picks up the Node.js path.
  • Create a Command Line build step in TeamCity, set Run to Custom script and enter the following as the Custom script:
call npm install
call gulp ci

Here we call gulp with the task we want to run.

TeamCity gulp task

As mentioned above you may want to pass parameters into your build script. There are a couple of ways to do this. First you can hard code them directly into the custom script above:

call npm install
call gulp ci --build-version 1.0.0.0 --nuget-api-key 78a53314-c2c0-45c6-9d92-795b2096ae6c

There are a couple of problems with this, however. First, you don’t want to manually manage your version number when TeamCity already does that for you automatically. So you can take advantage of TeamCity predefined build parameters and dynamically pass the version number as follows:

call npm install
call gulp ci --build-version %build.number%

Now the Nuget API key could be hardcoded for some builds, but what if you use the same API key for multiple builds? Again you can make use of build parameters as TeamCity allows you to create custom parameters at different levels. This allows you to set the parameter in one place where it can be referenced by multiple builds. You would specify the custom parameter as you would the predefined one:

call npm install
call gulp ci --nuget-api-key %nuget.api.key%

Here you can see the custom build parameter set at the project level. All build configurations under it inherit this parameter.

TeamCity gulp task

Usually you will want to keep an eye on how long tasks take to execute. TeamCity allows you to customize notifications so you can include task timings in them. To do this we’ll need to edit the build_successful.ftl template of your preferred notification type and add the following snippet. You can add it anywhere you want.

<b>Gulp Task Timings</b>
<br/>

<table border="0">
<#list build.buildLog.messages[1..] as message>
    <#assign tasks = message.toString()?matches(r".*\sFinished\s\'(.*)\'\safter\s(.*)")>
    <#if tasks && !tasks?groups[2]?contains("μ") ><tr>
        <td>${tasks?groups[1]}</td>
        <td>${tasks?groups[2]}</td>
    </tr></#if>
</#list>
</table>

Your notifications will then display task timings along the lines of this:

Gulp Task Timings

init           6.51 ms
assembly-info  1.9 s
config         119 ms
style-cop      71 ms
build          2.88 s
unit-tests     4.11 s
...

Tasks are displayed in the order they finish.

Final Thoughts

Hopefully this demonstrates how easy it is to setup a .NET build/deploy on Windows with gulp. If you are also doing client side development, this will be even more of a win as your tools (Like Karma, JSHint, etc) will be run by the same build tool.