jspm preprocessing (injecting settings for the targeted environment) when bundling

Is there a way to preprocess the .js files (i.e. inject environment specific settings) when bundling using buildStatic?

I don't know of a way to pre-process js files during bundling, but you can have different files/modules for different environments and use the JS API to swap your development version for the production one:
var jspm = require('jspm');

gulp.task('jspm', function() {
    var builder = new jspm.Builder();

    function production(builder) {
        var systemNormalize = builder.loader.normalize;
        builder.loader.normalize = function(name, parentName, parentAddress) {
            if (name === 'ember') name = 'ember/ember.prod';
            if (name === './app-config.dev') name = './app-config.prod';
            return systemNormalize.call(this, name, parentName, parentAddress);
        };
    }

    production(builder);

    return builder.loadConfig('./config.js')
        .then(function() {
            return builder.buildStatic('app/main', 'dist/app.min.js', { sourceMaps: false, minify: false, mangle: false });
        });
});
app-config.dev.js and app-config.prod.js are modules that you use throughout the app and that provide your environment-specific settings. In your code, you should always import app-config.dev. Read more about this workflow in my blog post: How to Use SystemJS Hooks for Building a Production Version of Your App
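For illustration, the two config modules can be plain settings objects with the same shape; the setting names below are made up, not from the original post:

```javascript
// app-config.dev.js — hypothetical development settings
var devConfig = {
  apiRoot: 'http://localhost:3000/api',
  debug: true
};

// app-config.prod.js — same keys, production values
var prodConfig = {
  apiRoot: 'https://example.com/api',
  debug: false
};

// Because both modules expose the same keys, swapping one for the
// other in the normalize hook is transparent to the importing code.
console.log(devConfig.debug, prodConfig.debug); // → true false
```

Keeping the shapes identical is what makes the swap safe: no call site needs to know which environment it is running in.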


Gulp - using ES6 modules in a combined js file?

I am trying to use ES6 modules in my current gulp setup. I've read that this is not yet supported by browsers, so some elaborate setup is needed to make it work, using things such as Browserify, babelify and vinyl-source-stream (which seems like an extremely complex setup).
What I want is different from the examples I have found online. All the examples import external files, and I really don't want that. I want all the files bundled into a single file, with all the modules already in there. Here's what I have:
My current GULP setup is like this:
gulp.task('buildJS', function() {
    var src = [
        // ...component files from /js/dist/components/
        './js/dist/init.js' // must be last
    ];
    return gulp.src(src)
        .pipe(babel({modules:"common"})) // I have no idea what "modules" actually does
        .pipe(concat('app.js'))
        .pipe(gulp.dest('./js/'));
});
And this is an example of a component file in /js/dist/components/.
There are many files like this, and they are all combined into a single file.
module "components/foo" {
    export function render(settings) {
        return ...
    }
}
So later in some page controller I would use it:
import { render } from "components/foo";
Now that I have a single file (transformed using Babel), how can I use the modules via import?
No, don't naively concatenate the files. Use browserify to bundle them, with babelify to compile them (via babel). A basic example would look something like this:
// ...
It's hard to give more specific advice because your use case is unclear. Do you have a dependency graph that begins at one file, or are you trying to bundle together a bunch of independent modules? Are you trying to run a script to kick off an application, or do you just want to be able to access modules individually?
Based on the example you linked to in your comment you should have something like this:
// components/doughnut.js
export default function Doughnut (settings = {}) {
    // ...
}
Doughnut.prototype = {};

// routes/home.js
import Doughnut from './components/doughnut';

export default function () {
    var component = new Doughnut();
    // ...
}
Have each module export what you want to be available from any other module. Have each module import whatever it needs from any other module(s). Whatever uses the controller from this example should then do import home from './routes/home'; These modules aren't tied to a global variable App and can be reused in other applications (as long as you otherwise make them reusable).
.pipe(babel({modules:"common"})) // I have no idea what "modules"
modules is a babel option that determines what module format it compiles ES6 module syntax to. In this case, CommonJS.
module "components/foo" {
Thanks to your comments I now understand why you have this. You need to eliminate that. Your component file should look something like:
export function render (settings) {
    return ...
}
Paired with:
import { render } from "components/foo";
Or if you want a default export / import:
export default function render (settings) {
    return ...
}
import render from "components/foo";
(rather than the named form: import { render } from "components/foo";)
If you're browserifying your modules, you're probably going to need to use relative paths like ./components/foo or use something else to deal with the paths, like babel's resolveModuleSource option.
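For instance, a `resolveModuleSource` hook receives each import source string plus the importing file name and returns the string to resolve instead; here is a sketch of remapping bare `components/...` names to relative paths (the mapping rule itself is made up for illustration):

```javascript
// Hypothetical remapping: bare "components/x" specifiers become relative.
function resolveModuleSource(source, filename) {
  if (source.indexOf('components/') === 0) {
    return './' + source;
  }
  return source; // leave everything else untouched
}

console.log(resolveModuleSource('components/foo', 'app.js')); // → ./components/foo
console.log(resolveModuleSource('./util', 'app.js'));         // → ./util
```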
Since the end of 2015 I have been using rollupjs to create a bundle of ES2015 (ES6) modules, so I can use import/export freely in my code.
I've found rollupjs to be very good and easy to use. The people behind it are great people who devote themselves to the project. I've had many questions, which I posted on the project's GitHub issues page, and I always got answered pretty quickly.
Setup includes these rollupjs plugins:
rollup (basic rollupjs bundler)
rollup-plugin-babel (converts ES2015 code to ES5 or earlier, for legacy browsers support)
rollup-plugin-eslint (verifies the javascript code is valid)
rollup-plugin-uglify (minifies the code, to make it smaller)
rollup-plugin-progress (shows bundle progress in the terminal, including which file is being "worked on")
beepbeep (makes a console beep sound; I use this to inform me of compilation errors)
Simplified GULP setup I'm using:
var gulp = require('gulp'),
gutil = require('gulp-util'),
rollup = require('rollup').rollup,
babelRollup = require('rollup-plugin-babel'),
eslintRollup = require('rollup-plugin-eslint'),
uglifyRollup = require('rollup-plugin-uglify'),
rollupProgress = require('rollup-plugin-progress'),
beep = require('beepbeep');
// ESlint
var eslint_settings = {
    rulePaths: [],
    rules: {
        "no-mixed-spaces-and-tabs" : [2, "smart-tabs"],
        "block-spacing"            : [2, "always"],
        "comma-style"              : [2, "last"],
        "no-debugger"              : [1],
        "no-alert"                 : [2],
        "indent-legacy"            : [1, 4, {"SwitchCase":1}],
        'strict'                   : 0,
        'no-undef'                 : 1
    },
    ecmaFeatures : {
        modules: true,
        sourceType: "module"
    },
    "parserOptions": {
        "ecmaVersion"  : 6,
        "sourceType"   : "module",
        "ecmaFeatures" : {
            "jsx": false,
            "experimentalObjectRestSpread": true
        }
    },
    globals : ['$', '_', 'afterEach', 'assert', 'beforeEach', 'Cookies', 'd3', 'dataLayer', 'describe', 'done', 'expect', 'ga', 'it', 'jQuery', 'sinon'],
    baseConfig: {
        //parser: 'babel-eslint',
        envs: [ 'browser', 'es6' ]
    }
};
// Rollup plugins configuration
function getRollupPlugins( settings = {} ){
    var rollupPlugins = [];

    rollupPlugins.push(eslintRollup( Object.assign({throwOnError:true}, eslint_settings) ));

    rollupPlugins.push(rollupProgress({
        clearLine: true // default: true
    }));

    // I would advise Babel to only be used for production output, since it greatly slows bundle creation
    if( settings.ENV == 'production' ){
        rollupPlugins.push(babelRollup({
            presets        : [['es2015', {"modules": false}]], //['es2015-rollup'],
            runtimeHelpers : true,
            exclude        : 'node_modules/**',
            plugins        : ["external-helpers"]
        }));
        rollupPlugins.push(uglifyRollup());
    }

    return rollupPlugins;
}

var rollupPlugins = getRollupPlugins();
/**
 * a generic Rollup bundle creator
 * @param  {String} outputPath     [where to save the bundle to (must end with /)]
 * @param  {String} outputFileName [bundle file name]
 * @param  {String} entryFile      [rollup entry file to start scanning from]
 * @return {Object}                [Promise]
 */
function rollupBundle(outputPath, outputFileName, entryFile, bundleOptions){
    bundleOptions = bundleOptions || {};
    bundleOptions.plugins = bundleOptions.plugins || rollupPlugins;

    return new Promise(function(resolve, reject) {
        outputFileName += '.js';
        var cache;

        // fs.truncate(outputPath + outputFileName, 0, function() {
        //     gutil.log( gutil.colors.dim.gray('Emptied: '+ outputPath + outputFileName) );
        // });

        rollup({
            entry   : entryFile,
            plugins : bundleOptions.plugins,
            cache   : cache
        })
        .then(function (bundle) {
            var bundleSettings = {
                    format    : bundleOptions.format || 'umd',
                    sourceMap : false,
                    banner    : config.banner
                },
                result = bundle.generate(bundleSettings),
                mapFileName = outputFileName + '.map',
                sourceMappingURL = '\n//# sourceMappingURL='+ mapFileName;

            cache = bundle;

            // if the folder does not exist, create it
            if( !fs.existsSync(outputPath) ){
                gutil.log( gutil.colors.black.bgWhite('Creating directory ' + outputPath) );
                fs.mkdirSync(outputPath);
            }

            // save bundle file to disk
            fs.writeFile( outputPath + outputFileName, result.code + (bundleSettings.sourceMap ? sourceMappingURL : ''), function(){
                // save map file to disk
                if( bundleSettings.sourceMap )
                    fs.writeFile( outputPath + mapFileName, result.map.toString());
                resolve();
            });
        })
        .catch(function(err){
            gutil.log( gutil.colors.white.bgRed('Rollup [catch]: ', err.stack) );
            beep();
            reject(err);
        });
    });
}
// This task bundles the main application, using an entry file which itself has many imports,
// and those imports also have imports... like a branching tree
gulp.task('bundle-app', ()=>{
    return rollupBundle('../dist/js/', 'app', 'js/dist/app.js', {format:'cjs'});
});
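Stripped of the real plugins, the pattern in getRollupPlugins is just conditional accumulation: cheap checks always run, while the expensive transforms are gated behind the production flag. A self-contained sketch (the stand-in plugin functions are made up):

```javascript
// Stand-ins for real rollup plugins, for illustration only.
function lint()   { return { name: 'lint' }; }
function babel()  { return { name: 'babel' }; }
function uglify() { return { name: 'uglify' }; }

function getPlugins(settings = {}) {
  var plugins = [lint()]; // always lint

  // Babel and minification only slow down dev rebuilds, so gate them.
  if (settings.ENV === 'production') {
    plugins.push(babel(), uglify());
  }
  return plugins;
}

console.log(getPlugins().length);                      // → 1
console.log(getPlugins({ ENV: 'production' }).length); // → 3
```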

node development server intermittently does not handle windows integrated authentication with chrome

We are doing some front-end Web development. We are developing a "single page application" (SPA) in a React-with-redux framework. When doing development, we are using the node.exe development server. We are also using Visual Studio as our primary IDE.
About two months ago, we started having intermittent problems with single-sign-on Windows integrated authentication when using the node server in development. About 60% of the time, authentication fails and Chrome asks for a username/password. At this point, we need to kill the node.exe processes, sometimes all Chrome processes, and occasionally even the Visual Studio process, to get things working again.
Has anyone had a similar experience, and does anyone have any idea how to resolve this or what may be causing it?
In detail, we start a debug session in Visual Studio. This fires a cmd script which in turn runs "yarn run start". The start.js script is this:
'use strict';
process.env.BABEL_ENV = 'development';
process.env.NODE_ENV = 'development';
process.on('unhandledRejection', err => {
  throw err;
});
const fs = require('fs');
const chalk = require('chalk');
const webpack = require('webpack');
const WebpackDevServer = require('webpack-dev-server');
const clearConsole = require('react-dev-utils/clearConsole');
const checkRequiredFiles = require('react-dev-utils/checkRequiredFiles');
const {
  choosePort,
  createCompiler,
  prepareProxy,
  prepareUrls,
} = require('react-dev-utils/WebpackDevServerUtils');
const openBrowser = require('react-dev-utils/openBrowser');
const paths = require('../config/paths');
const config = require('../config/webpack.config.dev');
const createDevServerConfig = require('../config/webpackDevServer.config');
const useYarn = fs.existsSync(paths.yarnLockFile);
const isInteractive = process.stdout.isTTY;
if (!checkRequiredFiles([paths.appHtml, paths.appIndexJs])) {
  process.exit(1);
}

const DEFAULT_PORT = parseInt(process.env.PORT, 10) || 3000;
const HOST = process.env.COMPUTERNAME + '.fully.qualified.domain';

choosePort(HOST, DEFAULT_PORT)
  .then(port => {
    if (port == null) {
      return;
    }
    const protocol = process.env.HTTPS === 'true' ? 'https' : 'http';
    const appName = require(paths.appPackageJson).name;
    const urls = prepareUrls(protocol, HOST, port);
    const compiler = createCompiler(webpack, config, appName, urls, useYarn);
    const proxySetting = require(paths.appPackageJson).proxy;
    const proxyConfig = prepareProxy(proxySetting, paths.appPublic);
    const serverConfig = createDevServerConfig(proxyConfig, urls.lanUrlForConfig);
    const devServer = new WebpackDevServer(compiler, serverConfig);
    devServer.listen(port, HOST, err => {
      if (err) {
        return console.log(err);
      }
      if (isInteractive) {
        clearConsole();
      }
      console.log(chalk.cyan('Starting the development server...\n'));
      openBrowser(urls.localUrlForBrowser);
    });

    ['SIGINT', 'SIGTERM'].forEach(function (sig) {
      process.on(sig, function () {
        devServer.close();
        process.exit();
      });
    });
  })
  .catch(err => {
    if (err && err.message) {
      console.log(err.message);
    }
    process.exit(1);
  });
More important is the content of the webpackDevServer.config.js file:
'use strict';
const errorOverlayMiddleware = require('react-dev-utils/errorOverlayMiddleware');
const noopServiceWorkerMiddleware = require('react-dev-utils/noopServiceWorkerMiddleware');
const config = require('./webpack.config.dev');
const paths = require('./paths');
const Agent = require('agentkeepalive');
const protocol = process.env.HTTPS === 'true' ? 'https' : 'http';
const host = process.env.HOST || '';
const DevVirtualDir = "";
const qualifiedTarget = 'http://' + process.env.COMPUTERNAME + '.fully.qualified.domain:88' + DevVirtualDir;
let onProxyRes = function (proxyRes, req, res) {
    const key = "www-authenticate";
    proxyRes.headers[key] = proxyRes.headers[key] && proxyRes.headers[key].split(",");
};
module.exports = function (proxy, allowedHost) {
return {
// WebpackDevServer 2.4.3 introduced a security fix that prevents remote
// websites from potentially accessing local content through DNS rebinding:
// https://github.com/webpack/webpack-dev-server/issues/887
// https://medium.com/webpack/webpack-dev-server-middleware-security-issues-1489d950874a
// However, it made several existing use cases such as development in cloud
// environment or subdomains in development significantly more complicated:
// https://github.com/facebookincubator/create-react-app/issues/2271
// https://github.com/facebookincubator/create-react-app/issues/2233
// While we're investigating better solutions, for now we will take a
// compromise. Since our WDS configuration only serves files in the `public`
// folder we won't consider accessing them a vulnerability. However, if you
// use the `proxy` feature, it gets more dangerous because it can expose
// remote code execution vulnerabilities in backends like Django and Rails.
// So we will disable the host check normally, but enable it if you have
// specified the `proxy` setting. Finally, we let you override it if you
// really know what you're doing with a special environment variable.
    disableHostCheck:
      !proxy || process.env.DANGEROUSLY_DISABLE_HOST_CHECK === 'true',
// Enable gzip compression of generated files.
compress: true,
// Silence WebpackDevServer's own logs since they're generally not useful.
// It will still show compile warnings and errors with this setting.
clientLogLevel: 'none',
// By default WebpackDevServer serves physical files from current directory
// in addition to all the virtual build products that it serves from memory.
// This is confusing because those files won’t automatically be available in
// production build folder unless we copy them. However, copying the whole
// project directory is dangerous because we may expose sensitive files.
// Instead, we establish a convention that only files in `public` directory
// get served. Our build script will copy `public` into the `build` folder.
// In `index.html`, you can get URL of `public` folder with %PUBLIC_URL%:
// <link rel="shortcut icon" href="%PUBLIC_URL%/favicon.ico">
// In JavaScript code, you can access it with `process.env.PUBLIC_URL`.
// Note that we only recommend to use `public` folder as an escape hatch
// for files like `favicon.ico`, `manifest.json`, and libraries that are
// for some reason broken when imported through Webpack. If you just want to
// use an image, put it in `src` and `import` it from JavaScript instead.
contentBase: paths.appPublic,
// By default files from `contentBase` will not trigger a page reload.
watchContentBase: true,
// Enable hot reloading server. It will provide /sockjs-node/ endpoint
// for the WebpackDevServer client so it can learn when the files were
// updated. The WebpackDevServer client is included as an entry point
// in the Webpack development configuration. Note that only changes
// to CSS are currently hot reloaded. JS changes will refresh the browser.
hot: true,
// It is important to tell WebpackDevServer to use the same "root" path
// as we specified in the config. In development, we always serve from /.
publicPath: config.output.publicPath,
// Part of a custom set-up
headers: {'Access-Control-Allow-Origin': '*'},
// WebpackDevServer is noisy by default so we emit custom message instead
// by listening to the compiler events with `compiler.plugin` calls above.
quiet: true,
// Reportedly, this avoids CPU overload on some systems.
// https://github.com/facebookincubator/create-react-app/issues/293
    watchOptions: {
      ignored: /node_modules/,
    },
// Enable HTTPS if the HTTPS environment variable is set to 'true'
https: protocol === 'https',
host: host,
overlay: false,
    historyApiFallback: {
      // Paths with dots should still use the history fallback.
      // See https://github.com/facebookincubator/create-react-app/issues/387.
      disableDotRule: true,
    },
public: allowedHost,
// This is a custom set-up for the API to make sure NTLM works in development
// In package.json, it is insufficient because you can only provide object literals
// not custom JS code.
// In production, this is not an issue because both the API and the static file
// will be on the same origin.
    proxy: {
      '/api': {
        target: qualifiedTarget,
        changeOrigin: true,
        agent: new Agent({
          maxSockets: 100,
          keepAlive: true,
          maxFreeSockets: 10,
          keepAliveMsecs: 100000,
          timeout: 6000000,
          keepAliveTimeout: 90000 // free socket keepalive for 90 seconds
        }),
        onProxyRes: onProxyRes = (proxyRes) => {
          const key = 'www-authenticate';
          proxyRes.headers[key] = proxyRes.headers[key] && proxyRes.headers[key].split(',');
        },
      },
    },
    setup(app) {
      // This lets us open files from the runtime error overlay.
      app.use(errorOverlayMiddleware());
      // This service worker file is effectively a 'no-op' that will reset any
      // previous service worker registered for the same host:port combination.
      // We do this in development to avoid hitting the production cache if
      // it used the same host and port.
      // https://github.com/facebookincubator/create-react-app/issues/2272#issuecomment-302832432
      app.use(noopServiceWorkerMiddleware());
    },
  };
};
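The onProxyRes rewrite above matters because NTLM/Negotiate handshakes can send several WWW-Authenticate values, and the proxy may collapse them into one comma-joined string; splitting restores an array so each challenge goes out as its own header. A minimal, self-contained illustration (the response object is simulated):

```javascript
// Simulated proxied response whose WWW-Authenticate values were
// joined into one comma-separated string by the proxy.
var proxyRes = { headers: { 'www-authenticate': 'NTLM,Negotiate' } };

var key = 'www-authenticate';
proxyRes.headers[key] = proxyRes.headers[key] && proxyRes.headers[key].split(',');

console.log(proxyRes.headers[key]); // → [ 'NTLM', 'Negotiate' ]
```

The `&&` guard leaves responses without a WWW-Authenticate header untouched.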

Browserify's custom dependency name is not working

I am trying to get browserify's custom dependency name feature working with memory streams. The reason I am using a memory stream is that this code is destined to run inside an AWS Lambda that will receive multiple "files" as inputs, and they will not be available to the Lambda via the file system.
According to the documentation at https://github.com/substack/node-browserify, it seems like this should be possible:
b.require(file, opts)
file can also be a stream, but you should also use opts.basedir so that relative requires will be resolvable.
Use the expose property of opts to specify a custom dependency name. require('./vendor/angular/angular.js', {expose: 'angular'}) enables require('angular')
const browserify = require('browserify')
const path = require('path')
const File = require('vinyl')
const jsIndex = new File({
file: path.resolve('index.js'),
contents: new Buffer('var two = require("./two")')})
const jsTwo = new File({
file: path.resolve('two.js'),
contents: new Buffer('console.log("Hello from two.js")')})
browserify({
    entries: [ jsIndex ],
    require: [ jsTwo ],
    basedir: path.resolve('.')
})
.bundle((err, res) => {
    if (err) console.log(err.message)
})
This outputs:
Cannot find module './two' from '...'
Another individual has posted a similar issue on the node-browserify github page here: https://github.com/substack/node-browserify/issues/1622
Browserify makes heavy use of the file system, as it implements its own module resolution that uses the same algorithm as Node's require. To get Browserify to bundle using in-memory source files, one solution would be to monkey patch the file system - in a manner similar to the mock-fs module that's useful for testing.
However, you really only want to do this for the source files. Browserify uses its module resolution for other files that are included in the bundle and having to add those to the in-memory file system would be tedious. The patched file system functions should check for calls that involve in-memory source files, handle them accordingly and forward other calls to the original implementations. (mock-fs does something similar in that it detects file system calls that occur within require and forwards those calls to the original implementations.)
'use strict';
const fs = require('fs');
const path = require('path');
const toStream = require('string-to-stream');
// Create an object hash that contains the source file contents as strings,
// keyed using the resolved file names. The patched fs methods will look in
// this map for source files before falling back to the original
// implementations.
const files = {};
files[path.resolve('one.js')] =
'console.log("Hello from one.js");' +
'var two = require("./two");' +
'exports.one = 1;';
files[path.resolve('two.js')] =
'console.log("Hello from two.js");' +
'exports.two = 2;';
// The three fs methods that need to be patched take a file name as the
// first parameter - so the patch mechanism is the same for all three.
function patch(method, replacement) {
    var original = fs[method];
    fs[method] = function (...args) {
        var name = path.resolve(args[0]);
        if (files[name]) {
            args[0] = name;
            return replacement.apply(null, args);
        } else {
            return original.apply(fs, args);
        }
    };
}
patch('createReadStream', function (name) {
    return toStream(files[name]);
});

patch('lstat', function (...args) {
    args[args.length - 1](null, {
        isDirectory: function () { return false; },
        isFile: function () { return true; },
        isSymbolicLink: function () { return false; }
    });
});

patch('stat', function (...args) {
    args[args.length - 1](null, {
        isDirectory: function () { return false; },
        isFile: function () { return true; }
    });
});
// With the fs module patched, browserify can be required and the bundle
// can be built.
const browserify = require('browserify');

browserify()
    .require(path.resolve('one.js'), { entry: true, expose: 'one' })
    .require(path.resolve('two.js'), { expose: 'two' })
    .bundle()
    .pipe(fs.createWriteStream('bundle.js'));
require and expose
In your question, you mentioned expose. The one.js and two.js modules were bundled using require, so they can be required in the browser using the name specified in the expose options. If that's not what you want, you can just use add instead and they will be modules internal to the bundle.
<!doctype html>
<script src="./bundle.js"></script>
<script>
// At this point, the bundle will have loaded and the entry
// point(s) will have been executed. Exposed modules can be
// required in scripts, like this:
var one = require('one');
var two = require('two');
</script>
I spent some time looking into Browserify's require and expose options when investigating this tsify question. The options facilitate the requiring of modules from outside the bundle, but I'm not at all sure they have any influence on requires between modules within the bundle (which is what you need). Also, their implementation is a little confusing - particularly this part in which the exposed name sometimes has a slash prepended.
Vinyl streams
In your question, you used vinyl. However, I wasn't able to use vinyl for the read streams. Although it does implement pipe, it's not an event emitter and doesn't seem to implement the 'on'-based pre-Streams-2 API, which is what Browserify expects from a createReadStream result.

What is the correct way to load polyfills and shims with Browserify

I'm building a web app and I'm getting to know and love Browserify. One thing has bugged me though.
I'm using some ES6 features that need to be shimmed/polyfilled in older browsers such as es6-promise and object-assign (packages on npm).
Currently I'm just loading them into each module that needs them:
var assign = require('object-assign');
var Promise = require('es6-promise');
I know this is definitely not the way to go. It is hard to maintain and I would like to use the ES6 features transparently instead of having to "depend" on them via requires.
What's the definitive way to load shims like these? I've seen several examples around the internet but they're all different. I could:
load them externally:
var bundle = browserify();
// or should I use bundle.add to make sure the code is run?
The problem I have here is that I don't know which order the modules
will be loaded in in the browser. So the polyfilling might not have happened
yet at call sites that need the polyfilled functionality.
This has the additional downside that backend code cannot benefit from these
polyfills (unless I'm missing something).
use browserify-shim or something similar. I don't really see how this would work for ES6 features.
manually set up the polyfilling:
Object.assign = require('object-assign');
Don't require polyfills in your modules, that's an anti-pattern. Your modules should assume that the runtime is patched (when needed), and that should be part of the contract. A good example of this is ReactJS, where they explicitly define the minimum requirement for the runtime so that the library can work: http://facebook.github.io/react/docs/working-with-the-browser.html#browser-support-and-polyfills
You could use a polyfill service (e.g.: https://cdn.polyfill.io/) to include an optimized script tag at the top of your page to guarantee that the runtime is patched correctly with the pieces you need, while modern browsers will not get penalized.
One solution that worked for me was to use bundle.add
I split my bundle in 2 parts: app.js for app code, and appLibs.js for libraries (this one will be cached, as it does not change often).
See https://github.com/sogko/gulp-recipes/tree/master/browserify-separating-app-and-vendor-bundles
For appLibs.js I use bundle.add for polyfills, as they must be loaded when the script is loaded, while I use bundle.require for other libs, that will be loaded only when required inside app.js.
polyfills.forEach(function(polyfill) {
    b.add(polyfill);
});
libs.forEach(function(lib) {
    b.require(lib);
});
The page loads these 2 bundles in order:
<script type="text/javascript" src="appLibs.js" crossorigin></script>
<script type="text/javascript" src="app.js" crossorigin></script>
This way it seems safe to assume that all polyfills will be loaded even before other libs are initialized. Not sure it's the best option but it worked for me.
My complete setup:
"use strict";
var browserify = require('browserify');
var gulp = require('gulp');
var gutil = require('gulp-util');
var handleErrors = require('../util/handleErrors');
var source = require('vinyl-source-stream');
var watchify = require("watchify");
var livereload = require('gulp-livereload');
var gulpif = require("gulp-if");
var buffer = require('vinyl-buffer');
var uglify = require('gulp-uglify');
// polyfills should be automatically loaded, even if they are never required
var polyfills = [
    // e.g. 'es6-promise', 'object-assign'
];

var libs = [
    // vendor libraries used by the app
];

// permits creating a special bundle for the vendor libs
// See https://github.com/sogko/gulp-recipes/tree/master/browserify-separating-app-and-vendor-bundles
gulp.task('browserify-libs', function () {
    var b = browserify({
        debug: true
    });

    polyfills.forEach(function(polyfill) {
        b.add(polyfill);
    });
    libs.forEach(function(lib) {
        b.require(lib);
    });

    return b.bundle()
        .on('error', handleErrors)
        .pipe(source('appLibs.js'))
        // TODO use node_env instead of "global.buildNoWatch"
        .pipe(gulpif(global.buildNoWatch, buffer()))
        .pipe(gulpif(global.buildNoWatch, uglify()))
        .pipe(gulp.dest('./dist'));
});
// Inspired by http://truongtx.me/2014/08/06/using-watchify-with-gulp-for-fast-browserify-build/
gulp.task('browserify',['cleanAppJs','browserify-libs'],function browserifyShare(){
    var b = browserify({
        cache: {},
        packageCache: {},
        fullPaths: true,
        extensions: ['.jsx'],
        paths: ['./node_modules','./src/'],
        debug: true
    });

    libs.forEach(function(lib) {
        b.external(lib);
    });

    // TODO use node_env instead of "global.buildNoWatch"
    if ( !global.buildNoWatch ) {
        b = watchify(b);
        b.on('update', function() {
            gutil.log("Watchify detected change -> Rebuilding bundle");
            return bundleShare(b);
        });
    }
    b.on('error', handleErrors);

    //b.add('app.js'); // It seems to produce weird behaviors when both using "add" and "require"
    // expose does not seem to work well... see https://github.com/substack/node-browserify/issues/850
    b.require('app.js',{expose: 'app'});

    return bundleShare(b);
});

function bundleShare(b) {
    return b.bundle()
        .on('error', handleErrors)
        .pipe(source('app.js'))
        .pipe(gulp.dest('./dist'))
        // TODO use node_env instead of "global.buildNoWatch"
        .pipe(gulpif(!global.buildNoWatch, livereload()));
}
As you can see, the polyfills are loaded with bundle.add (so they always run), while the other libs are only required. Alternatively, use the polyfill service at https://cdn.polyfill.io/ mentioned above.
This is the method that I'm using. The key is that you have to export your polyfill properly at the top of your main entry file.
The following won't work:
// Using ES6 imports
import './polyfill';
// Using CommonJS style
require('./polyfill');
... // rest of your code goes here
You actually need to export the polyfill:
// Using ES6 export
export * from './polyfill';
// Using CommonJS style
var polyfill = require('./polyfill');
... // rest of your code goes here
Your polyfills will load correctly if you do either of the latter methods.
Below you can find examples of my polyfills.
import './polyfill/Array.from';
import './polyfill/Object.assign';
if (typeof Object.assign !== 'function') {
    (function iife() {
        const ObjectHasOwnProperty = Object.prototype.hasOwnProperty;

        /**
         * Copy the values of all enumerable own properties from one source
         * object to a target object. It will return the target object.
         * @param  {Object} target The target object.
         * @param  {Object} source The source object.
         * @return {Object}        The target object.
         */
        function shallowAssign(target, source) {
            if (target === source) return target;
            Object.keys(source).forEach((key) => {
                // Avoid bugs when hasOwnProperty is shadowed
                if (ObjectHasOwnProperty.call(source, key)) {
                    target[key] = source[key];
                }
            });
            return target;
        }

        /**
         * Copy the values of all enumerable own properties from one or more
         * source objects to a target object. It will return the target object.
         * @param  {Object}    target  The target object.
         * @param  {...Object} sources The source object(s).
         * @return {Object}            The target object.
         */
        Object.assign = function assign(target, ...sources) {
            if (target === null || target === undefined) {
                throw new TypeError('Cannot convert undefined or null to object');
            }
            sources.forEach((source) => {
                if (source != null) { // Skip over if undefined or null
                    shallowAssign(Object(target), Object(source));
                }
            });
            return target;
        };
    }());
}
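As a quick sanity check of the merge logic, here is the shallowAssign core extracted on its own:

```javascript
// Standalone copy of the fallback's merge helper.
function shallowAssign(target, source) {
  if (target === source) return target;
  Object.keys(source).forEach(function (key) {
    // Avoid bugs when hasOwnProperty is shadowed
    if (Object.prototype.hasOwnProperty.call(source, key)) {
      target[key] = source[key];
    }
  });
  return target;
}

var merged = shallowAssign({ a: 1 }, { a: 2, b: 3 });
console.log(merged); // → { a: 2, b: 3 }
```

Later sources win on key collisions, matching the behavior of the native Object.assign.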

How to load environment specific configuration with requirejs?

I am making a single page app using Backbone with RequireJS. Now I want to include different configurations depending on an environment variable. I don't know how to do this or what the best practices are.
I have three different config file:
config.js : This keep common configurations.
config.dev.js : This keeps development environment specific configurations
config.production.js : This keeps production specific configurations
I am extending the Config model (in config.js) from the environment-specific model, and I require the Config model from other modules.
This works fine, but it downloads both the dev and production config files. I want to load only config.dev.js or config.production.js, not both.
// config.js
define([
    'backbone',
    'config.dev',
    'config.production'
], function(Backbone, DevConfig, ProdConfig) {
    // Select environment model
    var EnvConfig = (ENV == 'dev') ? DevConfig : ProdConfig;

    var ConfigModel = EnvConfig.extend({
        // Extending attributes.
        defaults: _.extend({
            foo : 'bar'
        }, EnvConfig.prototype.defaults)
    });

    return new ConfigModel();
});
In other modules I use Config like this:
define([
    'backbone',
    'config'
], function(Backbone, Config) {
    var ProfileModel = Backbone.Model.extend({
        urlRoot: Config.get('apiRoot') + '/profiles'
    });
    return ProfileModel;
});
How can I load only the development- or production-specific config file, not both?
I'm not sure where the ENV variable comes from, but assuming it's just a global variable you could do something like this:
<script type="text/javascript">
    // adding it here so the example works on its own
    window.ENV = 'production'; // or 'dev'

    // this will be added to RequireJS configuration
    var require = {
        paths: {
            // 'env-config' will be an alias for the actual file name
            'env-config': 'config.' + window.ENV
        }
    };
</script>
<script data-main="scripts/main" src="scripts/requirejs.js"></script>
Then each environment file defines its own settings:
// config.dev.js
define({
    'env': 'development!'
});
// config.production.js
define({
    'env': 'production!'
});
require(['config', 'env-config'], function(conf, envConfig) {
    console.log('The environment is: ', envConfig.env);
});
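The whole trick reduces to computing a module path from the environment flag before RequireJS loads anything; as a plain function (the helper name here is made up):

```javascript
// Mirrors the paths entry above: 'env-config' -> 'config.' + ENV
function envConfigPath(env) {
  return 'config.' + env;
}

console.log(envConfigPath('dev'));        // → config.dev
console.log(envConfigPath('production')); // → config.production
```

Only the file named by the computed alias is ever requested, which is what avoids downloading both configs.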