Commit

docs
neutralino1 committed Jul 5, 2022
1 parent 56d2298 commit c15b6e3
Showing 6 changed files with 79 additions and 16 deletions.
4 changes: 2 additions & 2 deletions Makefile
@@ -60,10 +60,10 @@ wheel:

test-release:
python3 -m twine upload --repository testpypi bazel-bin/sematic/*.whl
docker build -t sematicai/sematic-server:dev .
docker push sematicai/sematic-server:dev

release:
python3 -m twine upload bazel-bin/sematic/*.whl

release-server:
docker build -t sematicai/sematic-server:latest .
docker push sematicai/sematic-server:latest
2 changes: 1 addition & 1 deletion docs/SUMMARY.md
@@ -16,7 +16,7 @@
* [Type support](type-support.md)
* [Future algebra](future-algebra.md)
* [Graph resolution](coming-soon.md)
* [Deploy Sematic](coming-soon.md)
* [Deploy Sematic](deploy.md)
* [Debugging](coming-soon.md)
* [Glossary](glossary.md)

4 changes: 3 additions & 1 deletion docs/changelog.md
@@ -1,4 +1,6 @@
* HEAD
* 0.4.0
* [feature] ability to deploy the Sematic API to a cloud instance and run
  pipelines against it (pipelines still run locally)
* [improvement] Rename `$ sematic credentials` to `$ sematic settings` so it
  can store settings other than credentials.
* 0.3.0
55 changes: 55 additions & 0 deletions docs/deploy.md
@@ -0,0 +1,55 @@
# Deploy Sematic

When you first install Sematic, everything runs locally: both the web app and
your pipelines execute on your machine.

Here is how to deploy Sematic to take full advantage of your cloud resources.

## Deploy the web app

To collaborate and share results with your team, it is best to deploy the
Sematic web app to a remote server.

Prerequisites:

* A remote instance into which you can SSH
* A Postgres database
* [Install Docker](https://docs.docker.com/engine/install/) onto your remote instance

Then, SSH into your remote server:

```shell
$ ssh my-remote-server.dev
```

Pull the latest server image:

```shell
$ sudo docker pull sematicai/sematic-server:latest
```

Launch the server:

```shell
$ sudo docker run -d -p 80:80 -e DATABASE_URL=<DATABASE_URL> \
-v /home/ubuntu/.sematic:/root/.sematic \
sematicai/sematic-server:latest
```

where `DATABASE_URL` is the fully-qualified URL of your Postgres database. It
should be of the form:

```
postgresql://<username>:<password>@<hostname>:<port>/<database>
```
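As a concrete sketch (the credentials, hostname, and database name below are placeholders, not values from this guide), the URL can be assembled from its parts before being passed to `docker run`:

```shell
# Placeholder Postgres connection parameters -- substitute your own.
PG_USER="sematic"
PG_PASSWORD="s3cret"
PG_HOST="db.example.com"
PG_PORT="5432"
PG_DATABASE="sematic"

# Assemble the fully-qualified URL expected in DATABASE_URL.
DATABASE_URL="postgresql://${PG_USER}:${PG_PASSWORD}@${PG_HOST}:${PG_PORT}/${PG_DATABASE}"
echo "$DATABASE_URL"
```

Keeping the parts in separate variables makes it easier to source them from a secrets file rather than hard-coding the password in your shell history.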

Now you should be able to visit http://my-remote-server.dev and see the landing page.

### Run pipelines against the deployed API

At this time, Sematic still runs your pipelines on your local machine (cloud
execution is coming soon). To write metadata to the deployed API, run:

```shell
$ sematic settings set SEMATIC_API_ADDRESS http://my-remote-server.dev
```
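If you switch between a local and a deployed API often, you can wrap the command in a small helper. This is only a sketch around the `sematic settings set` invocation shown above; the hostname is a placeholder, and the helper prints the command instead of running it so it is safe to dry-run:

```shell
# Build the settings command for a given API address (dry-run sketch).
sematic_api_command() {
    local address="$1"
    echo "sematic settings set SEMATIC_API_ADDRESS ${address}"
}

# Print the command for a placeholder deployment hostname.
sematic_api_command "http://my-remote-server.dev"
```

Drop the `echo` (or pipe the output to `sh`) once you are confident the address is correct.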
28 changes: 17 additions & 11 deletions docs/sematic-vs.md
@@ -63,35 +63,41 @@ pipelines, since it was the first to provide such abilities.

Sematic differs from Airflow in the following ways:

* Iterative development: change code, run in the cloud, visualize, repeat. In Airflow, you can either run a pipeline on a local Airflow instance, or if you want to run it in the cloud, you must merge your code and deploy it to the cloud instance which adds many steps to your iteration workflow.
* **Iterative development** – change code, run in the cloud, visualize, repeat. In Airflow, you can run a pipeline on a local Airflow instance; to run it in the cloud, you must merge your code and deploy it to the cloud instance, which adds many steps to your iteration workflow.

* Semantic UI: Sematic brings visualizations for your functions inputs and outputs straight to the forefront. No need to take care of persisting things elsewhere or fetching them into a local notebook.
* **Semantic UI** – Sematic brings visualizations of your functions' inputs and outputs straight to the forefront. No need to persist artifacts elsewhere or fetch them into a local notebook.

* Dynamic graph:
* **Dynamic graph** – Sematic lets you loop over configurations, run different branches of your pipeline depending on the outcome of upstream steps, and even do hyperparameter tuning.

### Kubeflow Pipelines

* Barrier to entry:
* **Barrier to entry** – Using KFP requires quite a bit of knowledge about
Kubernetes, Docker images, Argo, etc. These things take a lot of time to
master. Sematic tries to provide working solutions out-of-the-box so that you
can focus on your expertise instead of infrastructure.

* Dynamic graph:
* **Dynamic graph** – In KFP, DAGs are fixed ahead of time. Sematic lets you loop over configurations, run different branches of your pipeline depending on the outcome of upstream steps, and even do hyperparameter tuning.

* Lineage tracking:
* **Lineage tracking** – KFP does not rigorously track the assets and artifacts flowing between your steps; you need to implement that yourself on top of it. In Sematic, lineage tracking is a first-class citizen: all inputs and outputs of all Sematic Functions are tracked and versioned.

* Semantic UI:
* **Semantic UI** – The KFP UI offers barebones tools to monitor and inspect runs, but no rich visualization or advanced features.

### MLFlow

### Dagster

### Comet ML

Experiment tracking, use Comet ML with Sematic
Comet ML specializes in experiment tracking. If this is a particular concern of
yours, you can definitely use Comet ML with Sematic. They are complementary
tools.

### Weights & Biases

Experiment tracking, visualizations, use W&B with Sematic
Weights & Biases excels at experiment tracking and visualization. You can
absolutely use W&B together with Sematic; they are complementary tools.

### HuggingFace

Pre-trained models, use HF with Sematic

HuggingFace provides a large collection of pre-trained models and datasets. You
can absolutely use HuggingFace with Sematic.
2 changes: 1 addition & 1 deletion sematic/BUILD
@@ -97,7 +97,7 @@ sematic_py_wheel(
distribution = "sematic",
python_tag = "py3",
platform = "any",
version = "0.3.1",#+{BUILD_TIMESTAMP}",
version = "0.4.0",#+{BUILD_TIMESTAMP}",
python_requires = ">=3.9",
description_file = "//:README.rst",
author = "Sematic AI, Inc.",
