# Contributing to faast.js
Thanks for your interest in contributing to faast.js! The information on this page explains how to contribute, and provides setup instructions for a development environment to hack on faast.js.
## Communication

Sign up for our Discord channel. Also check out the faast.js tag on StackOverflow.
## How to contribute

There are many ways to contribute to faast.js:

- File an issue on GitHub for a problem you're having with faast.js. Please check for duplicates before filing.
- Fix an open issue.
- Contribute a pull request with new functionality. Before doing this, please open a new issue to discuss the feature so we can align on what / when / how before you put in a ton of work.
## Tooling
Prerequisites:
- Node version 8+. Versions of node from OS package managers are often out of date; it is preferable to install directly from nodejs.org or use a node version manager. Consider n, which can be easily installed with n-install.
Included in `package.json`:

- AVA - test runner.
- Docusaurus - documentation and website.
- API-extractor - API documentation generator and more.
- semantic-release - automatic changelog and release generation.
- commitizen - commit message formatting.
External tools that may be useful:
## Building

To install node, see tooling above.

```shell
$ node -v
```

Ensure you're using Node version 8+.
Then build:

```shell
$ npm install
$ npm run build
```

The output is placed in `dist/`.

Running in watch mode can be useful for compiling on the fly as changes are made:

```shell
$ npm run watch
```
## Running the Testsuite

The full testsuite requires an AWS account to be set up. See the instructions for setting up an account on AWS.

```shell
$ npm run test
```

To run only the AWS and local tests:

```shell
$ npm run test-aws
```

Local testing (no network required):

```shell
$ npm run test-local
```
## Documentation

There are two documentation sources: the markdown files in `docs/*.md` and the API documentation generated from the source code. The generated API documentation is output to `docs/api/*` and is checked into the repository.
### Generating API documentation from source code

API documentation generation uses API-extractor and API-documenter. To perform a clean build that includes building API documentation:

```shell
$ npm run build
```

This takes a little time, and unfortunately the documentation generator lacks a watch mode, so documentation is not automatically updated when using `npm run watch`. To occasionally update the documentation without running a full build:

```shell
$ npm run doc
```

This will execute the documentation builder script `doc.ts`.
### Writing API documentation

Every exported symbol is in `index.ts` in the root of the repository. Each of these symbols needs to have a `@public` annotation and some documentation. Review the documentation tags specified by API-extractor.
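As a sketch, a documented export in `index.ts` might look like the following. The symbol name and description here are hypothetical; the tags shown (a summary, `@remarks`, `@param`, `@returns`, and the `@public` release tag) are the common API-extractor doc comment tags:

```typescript
/**
 * Returns the given name unchanged. (Hypothetical example symbol used only
 * to illustrate the doc comment shape expected for exported symbols.)
 *
 * @remarks
 * Longer explanatory text for the API reference goes here.
 *
 * @param name - A descriptive name.
 * @returns The same name that was passed in.
 * @public
 */
export function makeWidget(name: string): string {
    return name;
}
```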
Faast.js also uses an API review file generated by API-extractor in `etc/faastjs.api.md`. This file contains a summary of the public exported API for faast.js. It is generated automatically when you run `npm run doc` or `npm run build`. When this file changes, it means the public API has changed, and this warrants careful review.
When reviewing the API review file:

- All undefined types should be part of declarations marked as `@internal` (except for those referencing AWS or webpack dependency types used for special case options).
- Any changes to the public API should be inspected for possible semantic version changes.
- All public APIs should be documented (i.e. ensure there are no "undocumented" comments on APIs in the API review file).
## Semantic versioning

Faast.js follows semantic versioning. We use the semantic-release tool to automatically generate releases from the master branch on GitHub. Semantic-release automatically increments version numbers, generates changelogs, and performs `npm publish` based on commit messages.
When creating pull requests, you can format your commit messages with the following command:

```shell
$ npm run commit
```
This will run commitizen, prompting you to answer some questions about your changes. Don't worry, we will fix your commit messages (if needed) before accepting your pull request.
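For reference, commitizen produces messages in the conventional-commit style that semantic-release parses. A hypothetical example (the scope and description are invented for illustration):

```
fix(aws): clean up log groups left behind by garbage collection
```

With semantic-release's default conventions, a `fix` commit triggers a patch release, a `feat` commit triggers a minor release, and a commit whose footer contains `BREAKING CHANGE` triggers a major release.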
## Website

Faast.js uses Docusaurus to generate a static website that includes the generated API documentation in `docs/api/*.md`, along with the manual you are currently reading in `docs/*.md`.
Docusaurus has a built-in server:

```shell
$ cd website
$ npm install
$ npm start
```

This should open your browser, allowing you to see a live preview. Note that you'll need to run `npm run doc` to get updated API docs.
## Testsuite Design

### Why AVA (and not Jest, Mocha, etc.)?
AVA is designed to run all tests within a file concurrently. This is a different architecture than most other JavaScript test frameworks, and it is especially suited to faast.js. The faast.js testsuite needs to create many lambda functions and other infrastructure in the cloud. Performing these operations can take some time, but can be done in parallel easily.
In addition, faast.js has to execute the same tests across a test matrix of cloud providers and configurations:

```
{CloudProviders} x {Configurations} x {Tests}
```
The most natural way to write these tests is as follows:

```typescript
for (const provider of providers) {
    for (const config of configs) {
        test(...);
    }
}
```
With Jest and most other JavaScript test frameworks, this style of writing tests results in the serialization of each test. Splitting the elements of the test matrix into different files makes the test structure more complex than it needs to be: the common test code must be factored out, and separate test files must be created for each cloud provider, and possibly each configuration, in order to achieve sufficient test concurrency.
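As a minimal sketch of how the matrix expands, the nested loops above produce one uniquely titled test case per combination. The provider names, config shapes, and title scheme below are illustrative stand-ins, not the real testsuite's values:

```typescript
// Sketch: expanding the test matrix into uniquely titled test cases.
const providers = ["aws", "local"];
const configs = [{ mode: "queue" }, { mode: "https" }];

interface TestCase {
    title: string;
    provider: string;
    config: { mode: string };
}

const cases: TestCase[] = [];
for (const provider of providers) {
    for (const config of configs) {
        // Each combination gets a distinct, filterable title; network
        // providers get the "remote" prefix used by the CI filters.
        const prefix = provider === "local" ? provider : `remote ${provider}`;
        cases.push({
            title: `${prefix} basic calls ${config.mode}`,
            provider,
            config
        });
    }
}
// With AVA, each entry would be registered as its own concurrent test.
```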
### What should be tested
Faast.js focuses mostly on integration tests, not unit tests, and is tested against live cloud services in addition to locally. We've found that this testing philosophy maximizes the return on test effort.
### Test location

Tests are located in `test/*.test.ts`.
### Running AVA tests manually

You can run AVA directly to use more advanced filtering, etc.:

```shell
$ npx ava
```

For example, to run only AWS tests that test garbage collection:

```shell
$ npx ava -m='*aws*garbage*'
```
### Writing tests

The benefit of AVA is that you have more control over test concurrency. The drawback is that tests need to be written so they don't interfere with each other. This takes some effort and diligence, but it boils down to the following:

- Most tests should be written as macros. See `basic.test.ts` for an example.
- Each test should create its own faast instance and properly clean it up in a `finally` clause.
- Don't share resources between tests. This includes global or shared variables, objects, files, etc.
- Each test should have a unique test title that is descriptive enough to make filtering easy.
- Don't ignore intermittent test failures. They may indicate a race condition.
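The cleanup rule can be sketched as follows. `FakeFaastModule` is a hypothetical stand-in for a real faast instance, not the faast.js API; the point is that `cleanup()` runs even when the test body throws, because it is called in a `finally` clause:

```typescript
// Hypothetical stand-in for a faast instance with a cleanup() method.
class FakeFaastModule {
    cleanedUp = false;
    async cleanup() {
        this.cleanedUp = true;
    }
}

// The shape each test body follows: create its own instance, use it, and
// always release cloud resources in a finally clause.
async function exampleTest(faastModule: FakeFaastModule, shouldFail: boolean) {
    try {
        if (shouldFail) {
            throw new Error("simulated test failure");
        }
        // ...assertions against the faast instance would go here...
    } finally {
        await faastModule.cleanup(); // runs whether the body succeeded or threw
    }
}
```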
### Test titles

Test titles are important. Please follow these rules to ensure the CI tests continue to work as expected:

- Titles of tests that require network access should begin with `remote ${provider}`.
- Use the `title()` utility function to help ensure you have the right prefix for test titles.
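The real `title()` helper lives in the testsuite; the sketch below only illustrates the prefixing idea, and its signature and the `local` special case are assumptions made for illustration:

```typescript
// Illustrative sketch of a title() helper (not the real testsuite utility).
function title(provider: string, ...rest: string[]): string {
    // Network providers get the "remote" prefix required by the CI filters;
    // treating "local" specially is an assumption for this sketch.
    const base = provider === "local" ? provider : `remote ${provider}`;
    return [base, ...rest].join(" ");
}
```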
## Continuous integration with CircleCI

See `.circleci/config.yml`. On CircleCI's project settings page, you need to set environment variables to run tests on the cloud providers:
| environment variable    | value                                                     |
| ----------------------- | --------------------------------------------------------- |
| `AWS_ACCESS_KEY_ID`     | An AWS IAM user access key ID. See AWS CLI.               |
| `AWS_SECRET_ACCESS_KEY` | An AWS IAM user access key secret. See AWS CLI.           |
| `CODECOV_TOKEN`         | The project token for codecov, available in web console.  |
| `GH_TOKEN`              | GitHub token (for semantic-release).                      |
| `NPM_TOKEN`             | npm publish token (for semantic-release).                 |
The codecov token can be retrieved from the web console in a codecov account. Code coverage is optional.
## Adding a new cloud provider
We'd love your help in adding more cloud providers! Please reach out on Discord to discuss.