Node.js Module Patterns
- 1/19/2018
- #development
- #node
- #javascript
In the module pattern as in all things, JavaScript offers few prescriptions. On the client side, “modules” are creatures of the build system and our imagination; Node.js implements a filesystem-based module system but leaves its use and abuse largely up to userland. Considerable freedom, minimal structure.
The good news? These wide-open spaces include several well-worn patterns for authoring modules that are clear, flexible, and easily scaled inside a growing application. All we have to do is put them to use.
Modules
Modules break complex systems into reusable parts. They’re a built-in feature of Node.js, but even in browser-based JavaScript it’s common to see modules implemented as immediately-invoked function expressions.
var myModule = (function () {
  var myModule = {
    doSomething: function () {
      return 'consider it done';
    },
  };
  return myModule;
})();
If you’ve ever wrestled with the browser’s global scope, the motivation is clear: wrapping myModule in a separate execution context provides a nice way to separate internal details from the public interface we’re on the hook to maintain.
Under this pattern, module content is private by default: the only variables and methods that external code can access are those we explicitly return:
var myModule = (function () {
  var secret = 'Something only I know';
  return {
    isSecret(guess) {
      return secret === guess;
    }
  };
})();
In Node.js
The pattern looks slightly different in Node.js, where modules are shared, cached, and required through a built-in module system. The concept remains the same, but every file represents a separate context whose “return value” is set using exports.
// myModule.js
const secret = 'Something only I know';
module.exports.isSecret = function (guess) {
  return secret === guess;
};
Other modules in the application can now require this module, and off we go.
// index.js
var myModule = require('./myModule');
console.log(myModule.isSecret('can you hear me now?'));
So far, so good. But let’s dial up the complexity and see how things change.
Dependencies part I: Singletons
On paper each module contains independent functionality, but in practice application subsystems tend to depend (explicitly or otherwise) on data and functionality from elsewhere in the system.
Consider configuration: since development and production environments demand different levels of uptime and visibility, we’ll likely need to adjust application behavior to the environment we’re in. An easy way to do it is to represent the config as a global singleton:
// myModule.js
const config = require('./config');

module.exports = {
  isSecret(guess) {
    return (guess === config.secret);
  },
};

// config.js
module.exports = {
  secret: 'Something only we know',
};
Singletons are the easiest way to provide module dependencies at runtime—and the less change we have to anticipate inside the running app, the happier we’ll all be. But singletons also come with a big downside: since the same instance is shared between all the modules in the app, changes to its behavior are tricky to enact and isolate.
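To see the problem in action, here’s a sketch (the test file and the mutation are hypothetical, not part of the original example). Because require caches config.js, a change made in one place quietly alters the behavior of every other module sharing that instance:
// some_spec.js (hypothetical)
const config = require('./config');
const myModule = require('./myModule');

// Tweaking the shared singleton in one place...
config.secret = 'A different secret';

// ...changes behavior for every module that required the same instance.
console.log(myModule.isSecret('Something only we know')); // false
console.log(myModule.isSecret('A different secret'));     // true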
Dependencies part II: Dependency Injection
When dependencies don’t change, there’s no need for complexity beyond Node’s built-in module system. But as applications grow more complex, we may need better ways to encapsulate and swap out pieces of functionality.
The usual approach is to decouple services from the logic that depends on them through dependency injection (DI). At its heart DI is simply the process of providing dependencies via function arguments. Depending on the DI container, there might be quite a bit of ceremony in how dependencies are declared or received. But for our purposes “injection” is just another function call.
Here’s how it works: this time, instead of require-ing a singleton, we’ll configure a dependency and pass it in when we call the module.
// service.js
const config = require('./config');

module.exports = function createService (opts) {
  // The logger is injected; the secret still comes from the shared config.
  const { logger } = opts;
  const { secret } = config;
  let attempt = 0;

  const isSecret = (guess) => {
    attempt = attempt + 1;
    logger.info({ attempt }, 'Guessing secret');
    return (guess === secret);
  };

  return {
    isSecret,
  };
};
In this example, we can think of createService as a factory function that will be called at least once. It provides a “service” for counting guesses, and takes as its argument another service for logging each guess as it happens.
Program to the Interface
Interfaces are a formal feature of many typed languages. But even in JavaScript they remain a useful tool for thinking about program composition.
For our purposes, interfaces are collections of methods that match a certain shape. We don’t need to worry about how the methods are implemented, only that they match the given form. Since downstream code can call interface methods with no knowledge of the underlying implementation, interfaces provide a powerful tool for changing up functionality on the fly.
The logger service has a simple interface.
interface Logger {
  info(fields: { [k: string]: any }, msg?: string): void;
}
All our module needs to know is that it will receive a logger matching the Logger interface. How it receives it is irrelevant, as is its implementation. If it quacks like a duck (or a Logger), we don’t need to know how.
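To drive that home, even a hand-rolled object will satisfy the service, so long as it has a matching info method (a sketch; consoleLogger is invented here for illustration):
const createService = require('./service');

// A hypothetical hand-rolled logger: it satisfies the Logger interface
// by printing the fields and message to the console.
const consoleLogger = {
  info(fields, msg) {
    console.log(msg, fields);
  },
};

// The service neither knows nor cares that this isn't bunyan.
const service = createService({ logger: consoleLogger });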
One implementation for the Logger interface comes from the bunyan library (which itself has inspired a very similar interface in bole).
const { createLogger } = require('bunyan');
const createService = require('./service');

const logger = createLogger({ name: 'service' });
const service = createService({ logger });

if (service.isSecret('My best guess')) {
  console.log("That's it!");
}
The only downside? Along with the confirmation message ("That’s it!"), the provided logger will print messages to stdout as they arrive. That’s great news for a production audit log but not so nice for testing or local development. Take the output from the service’s test suite, for example:
// service_spec.js
const bunyan = require('bunyan');
const createService = require('./service');

const logger = bunyan.createLogger({ name: 'service' });
const service = createService({ logger });

describe('service', () => {
  it('rejects invalid secret', () =>
    expect(service.isSecret('The wrong secret'))
      .toEqual(false));

  it('passes good secret', () =>
    expect(service.isSecret('Something only we know'))
      .toEqual(true));
});
Running the tests, we’ll see quite a bit of chatter from the logger.
$ jasmine ./service_spec.js
Started
{"name":"service","hostname":"local","pid":25059,"level":30,"attempt":1,"msg":"Guessing secret","time":"2017-12-03T05:42:33.206Z","v":0}
.{"name":"service","hostname":"local","pid":25059,"level":30,"attempt":2,"msg":"Guessing secret","time":"2017-12-03T05:42:33.212Z","v":0}
.
2 specs, 0 failures
Finished in 0.013 seconds
The good news is that we don’t have to pass in a default instance of bunyan. Using dependency injection, we can provide anything that implements the Logger interface. That could be our own mock implementation, or an instance of bunyan configured to write to /dev/null:
const fs = require('fs');
const bunyan = require('bunyan');
const createService = require('./service');

const logger = bunyan.createLogger({
  name: 'null',
  stream: fs.createWriteStream('/dev/null'),
});
const service = createService({ logger });
Now we can run the spec without the extra chatter from the logs:
$ jasmine ./service_spec.js
Started
..
2 specs, 0 failures
Finished in 0.013 seconds
Much better.
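The mock route mentioned above works just as well. A hand-rolled Logger that records calls in memory keeps stdout quiet and lets the tests assert on logging behavior, too (a sketch; recordingLogger is invented for illustration):
const createService = require('./service');

// A hypothetical in-memory mock: same Logger interface, no output.
const entries = [];
const recordingLogger = {
  info(fields, msg) {
    entries.push({ fields, msg });
  },
};

const service = createService({ logger: recordingLogger });
service.isSecret('The wrong secret');

console.log(entries.length); // 1 -- and nothing was written to stdout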
All together now
Putting it all together, here it is: a simple-yet-flexible module pattern that uses injected dependencies to manage complexity.
// service.js
const { createLogger } = require('bunyan');
const config = require('./config');

const DEFAULTS = {
  secret: config.secret,
  logger: createLogger({ name: 'guessing-service' }),
};

module.exports = function createService (opts) {
  const {
    secret, // (optional) the secret we're trying to guess
    logger, // (optional) the auth logger to use with the guessing service
  } = { ...DEFAULTS, ...opts };

  let attempt = 0;

  const isSecret = (guess) => {
    attempt = attempt + 1;
    logger.info({ attempt }, 'Guessing secret');
    return (guess === secret);
  };

  return {
    isSecret,
  };
};
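Consuming the finished module is just a require and a call, and any of the defaults can be overridden per instance (the override values below are invented for illustration):
// index.js
const createService = require('./service');

// Use the defaults from config and bunyan...
const service = createService();

// ...or inject substitutes for this particular instance.
const quietService = createService({
  secret: 'Something else entirely',
  logger: { info() {} }, // a do-nothing Logger
});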
It’s overkill in the sort of scenario where require would do, but only just. An explicit, easily documented interface; visibility restrictions where needed; and clearly-revealed exports. Just what the doctor ordered.