Merge branch 'release/v1.0.0'

balis committed Aug 2, 2014
2 parents f647532 + 65ffe68 commit 35faf0a
Showing 101 changed files with 45,923 additions and 21,436 deletions.
1 change: 1 addition & 0 deletions .gitignore
@@ -1,4 +1,5 @@
node_modules/
coverage/
*~
*.swp
.idea
24 changes: 12 additions & 12 deletions README.md
@@ -2,24 +2,24 @@

##Description

HyperFlow provides a model of computation and an execution engine for complex, distributed workflow applications which consist of a set of **processes** performing well-defined **functions** and exchanging **signals**.
HyperFlow provides a model of computation, a workflow description language, and an enactment engine for complex, distributed workflows.

Browse the [wiki pages](https://github.com/balis/hyperflow/wiki) to learn more about the HyperFlow workflow model.

##Getting started

Latest release of HyperFlow is 1.0.0-beta-6
The latest release of HyperFlow is 1.0.0

Installation & running:
* Download the package: https://github.com/dice-cyfronet/hyperflow/archive/v1.0.0-beta-6.zip
* Install dependencies (in `hyperflow` directory): `npm install -d`
###Installation
* Download the package: https://github.com/dice-cyfronet/hyperflow/archive/v1.0.0.zip
* Install the latest node.js (http://nodejs.org)
* Install dependencies (in `hyperflow` directory): `npm install -d`
* Install the Redis server 2.6.x or higher (http://redis.io); tested with version 2.6.x
* Start the Redis server
* Run example workflows: `node scripts/runwf.js -f workflows/<workflow_file>`
    * Try `Wf_grepfile_simple.json`, `Wf_MapReduce.json`, `Wf_PingPong.json`
* Also try the simulated `Montage` workflows, which require the `-s` flag:
    * `node scripts/runwf.js -f workflows/Montage_143.json -s`
* Look at sample workflows in the `workflows` directory
* Look at example functions invoked from workflow tasks in the `functions` directory
* Start the Redis server.
* Set an environment variable `HFLOW_PATH` to point to your HyperFlow root directory.
* Add `$HFLOW_PATH/bin` to your `PATH` (see the example below).
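For example, in a Unix-like shell (the installation path is hypothetical):

```
export HFLOW_PATH=$HOME/hyperflow
export PATH=$HFLOW_PATH/bin:$PATH
```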

###Running
Run example workflows from the `examples` directory as follows:

```hflow run <wf_directory>```
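
For instance, to run one of the bundled examples (the directory name below is illustrative; pick any workflow from `examples`):

```
hflow run examples/PingPong
```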
95 changes: 95 additions & 0 deletions SPEC.md
@@ -0,0 +1,95 @@
# HyperFlow: a distributed workflow execution engine

HyperFlow provides a model of computation and an execution engine for complex, distributed [workflow](http://en.wikipedia.org/wiki/Workflow) applications which consist of a set of **processes** performing well-defined **functions** and exchanging **signals**. Browse the [wiki pages](https://github.com/dice-cyfronet/hyperflow/wiki) to learn more about the HyperFlow workflow model.


## Getting started

### Installing HyperFlow

HyperFlow requires the [node.js](http://nodejs.org) runtime. The stable version may be installed using the npm package manager:

```shell
$ npm install -g hyperflow
```

You can install the bleeding-edge version from GitHub:

```shell
$ npm install -g https://github.com/dice-cyfronet/hyperflow/archive/develop.tar.gz
```

HyperFlow also requires a [Redis](http://redis.io) server:

```shell
# on Debian/Ubuntu
$ sudo apt-get install redis-server

# on RedHat or CentOS
$ sudo yum install redis
```

### Installing additional modules

The `hyperflow` package provides only the core functionality; additional packages can bundle *functions* and *graphs*. The functions may later be referenced from a workflow graph as `$npm_package_name:$function_name` (see the sketch below).

We provide:

* `hyperflow-amqp` – allows remote execution of tasks by using AMQP queues,
* `hyperflow-map-reduce` – functions for constructing Map-Reduce workflows,
* `hyperflow-pegasus` – support for executing [Pegasus](http://...) DAX workflows,
* `hyperflow-montage` – support for executing [Montage](http://...) workflow.

See the [wiki page](http://...) to learn how to create HyperFlow function packages.
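
For illustration, a sketch of a graph fragment that references a packaged function using the `$npm_package_name:$function_name` convention; the process and signal names are hypothetical, and the full graph schema is described on the wiki:

```json
{
  "processes": [
    {
      "name": "wordCount",
      "function": "hyperflow-map-reduce:map",
      "ins": [ "lines" ],
      "outs": [ "counts" ]
    }
  ]
}
```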

### Running a *hello world* workflow

```shell
$ git clone http://github.com/dice-cyfronet/hyperflow-hello-world.git
$ cd hyperflow-hello-world
$ hflow start
Hyperflow starting!
Listening on *:1234, webui: http://1.2.3.4:1234/
hello-world workflow loaded, sending initial signals.
Workflow id is 9876.
```
### Advanced options

```
hflow start [--background] [--functions functions.js] [--graph graph.json|graph.js] [--config config.json] [--set-KEY=VALUE]
hflow resume [workflow_id] [--functions functions.js] [--graph graph.json|graph.js] [--config config.json] [--set-KEY=VALUE]
hflow terminate [workflow_id]
hflow status [workflow_id]
hflow watch_events [workflow_id]
```
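
For example, a hypothetical invocation that starts a workflow in the background with an explicit graph file and overrides the port setting via `--set-KEY=VALUE`:

```shell
$ hflow start --background --graph graph.json --set-port=8080
```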

### Workflow directory structure

A workflow is a directory that bundles all required files and contains:

* workflow graph:
    * `graph.json` – a static workflow graph in JSON, or
    * `graph.js` – graph generation code as a node.js module (see the sketch after this list),
* `config.json` – HyperFlow configuration and workflow parameters,
* `functions.js` – functions specific to the given workflow.
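
A sketch of the `graph.js` variant, assuming the module exports a function returning the graph object (the exact module interface is defined by HyperFlow; all names here are hypothetical):

```js
// graph.js – hypothetical generator that builds the same structure
// graph.json would contain, but programmatically.
module.exports = function() {
    var graph = { processes: [], signals: [], ins: [], outs: [] };
    for (var i = 0; i < 4; i++) {
        graph.processes.push({
            name: "stage" + i,
            "function": "functions:echo",   // hypothetical function reference
            ins:  i > 0 ? ["sig" + (i - 1)] : [],
            outs: ["sig" + i]
        });
        graph.signals.push({ name: "sig" + i }); // signal linking consecutive stages
    }
    return graph;
};
```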

## Configuration

Configuration is provided in JSON format, and some options may also be specified as environment variables. HyperFlow reads and merges the configuration in the following order, with later sources overriding earlier ones (see the sketch after the list):

* defaults (see [default_config.json](default_config.json)),
* `/etc/hyperflow.json`,
* `~/.hyperflow.json`,
* `hyperflow.json` placed in the same directory as workflow JSON file,
* `$HYPERFLOW_CONFIG`,
* options from environment variables e.g. `$REDIS_URL`,
* options from command line arguments.
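
A minimal sketch of this precedence, assuming a simple shallow merge in which later sources override earlier ones (the `merge` helper below is illustrative, not HyperFlow's actual merge code):

```js
// Later sources override earlier ones.
function merge(target, source) {
    for (var key in source) { target[key] = source[key]; }
    return target;
}

var config = {};
merge(config, { port: 1234, redis_url: "redis://127.0.0.1:6379/0" }); // defaults
merge(config, { port: 8080 });               // e.g. from ~/.hyperflow.json
if (process.env.REDIS_URL) {                 // environment variables win over files
    merge(config, { redis_url: process.env.REDIS_URL });
}
console.log(config.port); // -> 8080
```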

Options are:

* `packages` – list of function packages required by the workflow (e.g. `montage/functions`),
* `graph` – filename of the graph file (defaults to `./graph.[js|json]`); may also refer to a bundled graph (e.g. `montage/graph`),
* `port` or `$PORT` (defaults to `1234`),
* `redis_url` or `$REDIS_URL` (defaults to `redis://127.0.0.1:6379/0`),
* `amqp_url` or `$AMQP_URL` (defaults to `amqp://localhost`),
* `amqp_executor_config` (defaults to `{"storage": "local", "workdir": "/tmp/"}`).
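
Putting these together, a hypothetical `hyperflow.json` that spells out the defaults listed above might read:

```json
{
  "packages": ["montage/functions"],
  "graph": "./graph.json",
  "port": 1234,
  "redis_url": "redis://127.0.0.1:6379/0",
  "amqp_url": "amqp://localhost",
  "amqp_executor_config": { "storage": "local", "workdir": "/tmp/" }
}
```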
91 changes: 91 additions & 0 deletions bin/hflow
@@ -0,0 +1,91 @@
#!/usr/bin/env node
var docopt = require('docopt').docopt,
    spawn = require('child_process').spawn,
    fs = require('fs'),
    pathtool = require('path'),
    redis = require('redis'),
    rcl = redis.createClient(),
    wflib = require('../wflib').init(rcl),
    Engine = require('../engine2'),
    async = require('async'),
    dbId = 0;

var doc = "\
Usage:\n\
  hflow run <workflow_dir_or_file> [-s]\n\
  hflow send <wf_id> ( <signal_file> | -d <signal_data> )\n\
  hflow -h | --help | --version";

var opts = docopt(doc);

var hfroot = process.env.HFLOW_PATH;

if (opts.run) {
    hflow_run();
} else if (opts.send) {
    hflow_send();
}

function hflow_run() {
    var wfpath = opts['<workflow_dir_or_file>'],
        wfstats = fs.lstatSync(wfpath),
        wffile;

    if (wfstats.isDirectory()) {
        wffile = pathtool.join(wfpath, "workflow.json");
    } else if (wfstats.isFile()) {
        wffile = wfpath;
        wfpath = pathtool.dirname(wfpath);
    }

    var runWf = function(wfId) {
        var config = {"emulate": "false", "workdir": pathtool.resolve(wfpath)};
        var engine = new Engine(config, wflib, wfId, function(err) {
            // Example of a custom plugin listening for events from the engine's eventServer:
            // engine.eventServer.on('trace.*', function(exec, args) {
            //     console.log('Event captured: ' + exec + ' ' + args + ' job done');
            // });
            engine.runInstance(function(err) {
                console.log("Wf id=" + wfId);
                if (opts['-s']) {
                    // Flag -s is present: send all input signals to the workflow -> start execution
                    wflib.getWfIns(wfId, false, function(err, wfIns) {
                        engine.wflib.getSignalInfo(wfId, wfIns, function(err, sigs) {
                            engine.emitSignals(sigs);
                        });
                    });
                }
            });
        });
    };

    // Select the Redis DB, flush it, and create a new workflow instance from the file
    var createWf = function(cb) {
        rcl.select(dbId, function(err, rep) {
            rcl.flushdb(function(err, rep) {
                wflib.createInstanceFromFile(wffile, '', function(err, id) {
                    cb(err, id);
                });
            });
        });
    };

    createWf(function(err, wfId) {
        runWf(wfId);
    });
}

function hflow_send() {
console.log("send signal to a workflow: not implemented");
}

function spawn_proc(exec, args) {
    var proc = spawn(exec, args);

    proc.stdout.on('data', function(data) {
        console.log(data.toString().trimRight());
    });

    proc.stderr.on('data', function(data) {
        console.log(data.toString().trimRight());
    });
}
74 changes: 74 additions & 0 deletions bin/hflow-run.js
@@ -0,0 +1,74 @@
#!/usr/bin/env node

/* HyperFlow workflow engine.
* Bartosz Balis, 2013-2014
* hflow-run.js:
* - creates a HyperFlow engine instance for a workflow identified by its Redis id
* - runs this workflow: at this point the engine awaits signals (unless the -s flag is given)
**/

var redis = require('redis'),
    rcl = redis.createClient(),
    wflib = require('../wflib').init(rcl),
    Engine = require('../engine2'),
    async = require('async'),
    argv = require('optimist').argv,
    dbId = 0,
    engine;

function createWf(cb) {
    rcl.select(dbId, function(err, rep) {
        rcl.flushdb(function(err, rep) {
            wflib.createInstanceFromFile(argv.f, '', function(err, id) {
                cb(err, id);
            });
        });
    });
}


// Note: the option is parsed as argv.i (-i), so check argv.i here, not argv.id
if (!argv.i && !argv.f) {
    console.log("hflow-run.js: runs a workflow instance\n");
    console.log("Usage: node hflow-run.js [-f </path/to/wf.json>] [-i WFID] [-s] [-d DBID]");
    console.log("  -f <file> : create a new wf instance from a file and run it");
    console.log("  -i WFID   : use an already created wf instance with WFID as its Redis id");
    console.log("  -s        : send input signals to the workflow (starts execution)");
    console.log("  -d DBID   : Redis db number to be used (default=0)");
    process.exit();
}

if (argv.d) {
    dbId = argv.d;
    console.log("DBID", dbId);
}

var runWf = function(wfId) {
    engine = new Engine({"emulate": "false"}, wflib, wfId, function(err) {
        // Example of a custom plugin listening for events from the engine's eventServer:
        // engine.eventServer.on('trace.*', function(exec, args) {
        //     console.log('Event captured: ' + exec + ' ' + args + ' job done');
        // });
        engine.runInstance(function(err) {
            console.log("Wf id=" + wfId);
            if (argv.s) {
                // Flag -s is present: send all input signals to the workflow -> start execution
                wflib.getWfIns(wfId, false, function(err, wfIns) {
                    engine.wflib.getSignalInfo(wfId, wfIns, function(err, sigs) {
                        engine.emitSignals(sigs);
                    });
                });
            }
        });
    });
};

if (argv.f) {
    createWf(function(err, wfId) {
        runWf(wfId);
    });
} else if (argv.i) {
    runWf(argv.i);
}

2 changes: 2 additions & 0 deletions scripts/runwf.js → bin/hyperflow
100644 → 100755
@@ -1,3 +1,5 @@
#!/usr/bin/env node

/* Hypermedia workflow
* Bartosz Balis, 2013
* runwf:
21 changes: 21 additions & 0 deletions bin/hyperflow-convert-dax
@@ -0,0 +1,21 @@
#!/usr/bin/env node

var PegasusConverter = require('../converters/pegasus_dax.js'),
    argv = require('optimist').argv;
var pc;

if (!argv._[0]) {
    console.log("Usage: hyperflow-convert-dax <DAX file path> [command_name]");
    console.log("  command_name can be: amqpCommand, command_print, command, etc.");
    process.exit();
}

if (argv._[1]) {
    pc = new PegasusConverter(argv._[1]);
} else {
    pc = new PegasusConverter();
}

pc.convertFromFile(argv._[0], function (err, wfOut) {
    console.log(JSON.stringify(wfOut, null, 2));
});
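
A hypothetical invocation, assuming `bin` is on your `PATH` (the DAX file name is illustrative); the converted HyperFlow graph is printed to standard output:

```shell
$ hyperflow-convert-dax montage.dax > graph.json
```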
1 change: 1 addition & 0 deletions scripts/createwf.js → bin/hyperflow-create-workflow
100644 → 100755
@@ -1,3 +1,4 @@
#!/usr/bin/env node
/* Hypermedia workflow
* Bartosz Balis, 2013
* createwf: creates workflow state in Redis db from a json file
File renamed without changes.