Hapi pipelines plugin for the Screwdriver API
const Hapi = require('hapi');
const server = new Hapi.Server();
const pipelinesPlugin = require('./');

server.connection({ port: 3000 });

server.register({
    register: pipelinesPlugin,
    options: {}
}, () => {
    server.start((err) => {
        if (err) {
            throw err;
        }

        console.log('Server running at:', server.info.uri);
    });
});
page, count, sort, sortBy, search, and configPipelineId are optional.
search will search for a pipeline with a name containing the search keyword in the scmRepo field.
GET /pipelines?page={pageNumber}&count={countNumber}&configPipelineId={configPipelineId}&search={search}
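For illustration, a minimal sketch of calling this endpoint from Node 18+ (global fetch, run as an ES module so top-level await works); the host, the API_TOKEN environment variable, and the query values are assumptions, not part of the plugin:

// List pipelines matching a search keyword (hypothetical host, token, and query values)
const listRes = await fetch(
    'http://localhost:3000/pipelines?page=1&count=10&search=data-model',
    { headers: { Authorization: `Bearer ${process.env.API_TOKEN}` } }
);
const pipelines = await listRes.json();

console.log(pipelines.map(p => p.id));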
GET /pipelines/{id}
Create a pipeline and a job called 'main'
POST /pipelines
Arguments

checkoutUrl - Source code URL for the application. For a git-based repository, it is typically the SSH endpoint and the branch name, separated by an octothorpe. Must be unique.
rootDir - Optional. Root directory where the source code lives. Defaults to an empty string.
Example payload:
{
    "checkoutUrl": "git@github.com:screwdriver-cd/data-model.git#master",
    "rootDir": "src/app/component"
}
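As a sketch under the same assumptions (Node 18+ fetch in an ES module, hypothetical host and API_TOKEN), creating a pipeline with this payload could look like:

// Create a pipeline from the example payload above
const createRes = await fetch('http://localhost:3000/pipelines', {
    method: 'POST',
    headers: {
        Authorization: `Bearer ${process.env.API_TOKEN}`,
        'Content-Type': 'application/json'
    },
    body: JSON.stringify({
        checkoutUrl: 'git@github.com:screwdriver-cd/data-model.git#master',
        rootDir: 'src/app/component'
    })
});
const createdPipeline = await createRes.json();

console.log(createRes.status, createdPipeline.id);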
You can update the checkoutUrl of a pipeline.
PUT /pipelines/{id}
Arguments

checkoutUrl - Source code URL for the application. For a git-based repository, it is typically the SSH endpoint and the branch name, separated by an octothorpe. Must be unique.
rootDir - Optional. Root directory where the source code lives. Defaults to an empty string.
Example payload:
{
    "checkoutUrl": "git@github.com:screwdriver-cd/data-model.git#master",
    "rootDir": "src/app/component"
}
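A similar sketch for the update, again assuming a hypothetical pipeline id, host, and API_TOKEN:

// Point an existing pipeline at a different branch (hypothetical id and branch)
const pipelineId = 123;
const updateRes = await fetch(`http://localhost:3000/pipelines/${pipelineId}`, {
    method: 'PUT',
    headers: {
        Authorization: `Bearer ${process.env.API_TOKEN}`,
        'Content-Type': 'application/json'
    },
    body: JSON.stringify({
        checkoutUrl: 'git@github.com:screwdriver-cd/data-model.git#staging'
    })
});

console.log(updateRes.status);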
DELETE /pipelines/{id}
- Synchronize the pipeline by looking up the latest screwdriver.yaml
- Create, update, or disable jobs if necessary.
- Store/update the pipeline workflowGraph
POST /pipelines/{id}/sync
- Synchronize webhooks for the pipeline
- Add or update webhooks if necessary
POST /pipelines/{id}/sync/webhooks
- Synchronize pull requests for the pipeline
- Add or update pull request jobs if necessary
POST /pipelines/{id}/sync/pullrequests
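For illustration, a sketch that triggers all three sync actions for one pipeline; the id, host, and API_TOKEN are assumptions:

// Trigger the pipeline, webhook, and pull request syncs in turn (hypothetical id)
const syncPipelineId = 123;

for (const action of ['sync', 'sync/webhooks', 'sync/pullrequests']) {
    const syncRes = await fetch(`http://localhost:3000/pipelines/${syncPipelineId}/${action}`, {
        method: 'POST',
        headers: { Authorization: `Bearer ${process.env.API_TOKEN}` }
    });

    console.log(action, syncRes.status);
}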
page, count, sort, and prNum are optional.
Only PR events of the specified PR number will be searched when prNum is set.
GET /pipelines/{id}/events?page={pageNumber}&count={countNumber}&sort={sort}&prNum={prNumber}
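A sketch of fetching only the events for one pull request, with hypothetical id, PR number, host, and API_TOKEN:

// List events for PR #456 of pipeline 123 (hypothetical values)
const eventsRes = await fetch(
    'http://localhost:3000/pipelines/123/events?page=1&count=10&prNum=456',
    { headers: { Authorization: `Bearer ${process.env.API_TOKEN}` } }
);
const events = await eventsRes.json();

console.log(events.length);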
archived is optional and defaults to false, in which case the endpoint does not return archived jobs (e.g. closed pull requests).
GET /pipelines/{id}/jobs?archived={boolean}
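A sketch of including archived jobs in the listing; the id, host, and API_TOKEN are assumptions:

// Include archived jobs such as closed pull requests (hypothetical values)
const jobsRes = await fetch(
    'http://localhost:3000/pipelines/123/jobs?archived=true',
    { headers: { Authorization: `Bearer ${process.env.API_TOKEN}` } }
);
const jobs = await jobsRes.json();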
GET /pipelines/{id}/triggers
GET /pipelines/{id}/secrets
GET /pipelines/{id}/metrics
GET /pipelines/{id}/metrics?startTime=2019-02-01T12:00:00.000Z
GET /pipelines/{id}/metrics?startTime=2019-02-01T12:00:00.000Z&endTime=2019-03-01T12:00:00.000Z
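A sketch of requesting metrics for a fixed window using the same timestamps; the id, host, and API_TOKEN are assumptions:

// Fetch metrics between two ISO-8601 timestamps (hypothetical pipeline id)
const metricsParams = new URLSearchParams({
    startTime: '2019-02-01T12:00:00.000Z',
    endTime: '2019-03-01T12:00:00.000Z'
});
const metricsRes = await fetch(
    `http://localhost:3000/pipelines/123/metrics?${metricsParams}`,
    { headers: { Authorization: `Bearer ${process.env.API_TOKEN}` } }
);
const metrics = await metricsRes.json();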
- Start all child pipelines belonging to this config pipeline at once
POST /pipelines/{id}/startall
POST /pipelines/{id}/token
GET /pipelines/{id}/tokens
PUT /pipelines/{pipelineId}/tokens/{tokenId}
PUT /pipelines/{pipelineId}/tokens/{tokenId}/refresh
DELETE /pipelines/{pipelineId}/tokens/{tokenId}
DELETE /pipelines/{pipelineId}/tokens
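For illustration, a sketch that creates a token, lists tokens, and then revokes the new one; the token payload fields, ids, host, and API_TOKEN are assumptions:

// Create, list, and revoke pipeline tokens (hypothetical payload and ids)
const tokenHeaders = {
    Authorization: `Bearer ${process.env.API_TOKEN}`,
    'Content-Type': 'application/json'
};
const newToken = await fetch('http://localhost:3000/pipelines/123/token', {
    method: 'POST',
    headers: tokenHeaders,
    body: JSON.stringify({ name: 'ci-token', description: 'Used by CI scripts' })
}).then(res => res.json());
const allTokens = await fetch('http://localhost:3000/pipelines/123/tokens', {
    headers: tokenHeaders
}).then(res => res.json());

console.log(allTokens.length);

await fetch(`http://localhost:3000/pipelines/123/tokens/${newToken.id}`, {
    method: 'DELETE',
    headers: tokenHeaders
});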
GET /pipelines/{id}/jobs/{jobName}/latestBuild
You can also search by build status
GET /pipelines/{id}/jobs/{jobName}/latestBuild?status=SUCCESS
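A sketch of checking the most recent successful build of a job; the id, job name, host, and API_TOKEN are assumptions:

// Fetch the latest successful build of the 'main' job (hypothetical values)
const latestRes = await fetch(
    'http://localhost:3000/pipelines/123/jobs/main/latestBuild?status=SUCCESS',
    { headers: { Authorization: `Bearer ${process.env.API_TOKEN}` } }
);

if (latestRes.ok) {
    const latestBuild = await latestRes.json();

    console.log(latestBuild.id, latestBuild.status);
}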
The server supplies factories to plugins in the form of server settings:
// handler pipelinePlugin.js
handler: (request, reply) => {
    const factory = request.server.app.pipelineFactory;

    // ...
}
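For illustration, a sketch of a fuller route handler built around that factory; the factory.get call, the pipeline.toJson() method, and the old-style reply interface are assumptions based on the snippet above, not a verified route from this plugin:

// Hypothetical handler that looks a pipeline up by its route parameter
handler: (request, reply) => {
    const factory = request.server.app.pipelineFactory;

    return factory.get(request.params.id)
        .then((pipeline) => {
            if (!pipeline) {
                return reply({ error: 'Pipeline not found' }).code(404);
            }

            return reply(pipeline.toJson());
        })
        .catch(err => reply(err));
}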