doc site improvements:
- refactor profile description section in "under the hood"
- correct TLS references and add mention of self hosting guide in "tunnel-server"
Roy Razon committed Mar 28, 2024
1 parent 11998c4 commit eb0e085
Showing 2 changed files with 57 additions and 39 deletions.
80 changes: 45 additions & 35 deletions site/docs/intro/under-the-hood.md
@@ -12,69 +12,79 @@ When provisioning a new environment using the [`up`](/cli-reference#preevy-up-service) command:
- Read the tunneling key and default flags from the profile.
- Calculate the environment ID based on the current git branch (or use the `--id` flag).
- Connect to the Tunnel Server using the tunneling key to pre-generate the public URLs in env vars
- Make sure a deployment target (VM or Kubernetes Pod) is provisioned:
- Query the configured cloud provider or Kubernetes cluster for an existing machine
- If the deployment target doesn't exist yet, a new one is provisioned
- Set up an SSH tunnel to the Docker server on the provisioned deployment target
- Build the Compose project
- Extract the build information from the specified Compose file(s) and combine it with the specified build options to generate an interim build Compose file.
- Run `docker buildx bake` with the generated build Compose file.
- The resulting images are either loaded to the provisioned machine or written to an image registry.
- Deploy the Compose services to the machine's Docker server using the `docker compose up` command
- Local volume mounts are copied to the remote machine first
- The original Compose project is augmented with a helper service, `preevy_proxy`, responsible for connecting to the [Tunnel Server](/tunnel-server).
- The `preevy_proxy` service creates a tunnel for each service.
- Fetch the tunneled URLs from the Tunnel Server and print them.
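As a sketch of the environment-ID step above, assuming the ID is a lowercased, URL-safe form of the branch name (the exact normalization is Preevy's own; this is an assumption):

```shell
# Derive an illustrative environment ID from a git branch name.
# Assumed normalization: lowercase, with any character outside [a-z0-9-]
# replaced by a dash. Preevy's real rules may differ.
branch="feature/My_New-Feature"   # in practice: branch=$(git rev-parse --abbrev-ref HEAD)
env_id=$(printf '%s' "$branch" | tr '[:upper:]' '[:lower:]' | tr -c 'a-z0-9-' '-')
echo "$env_id"
# → feature-my-new-feature
```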

## Profile configuration

The Preevy profile provides a mechanism for storing and sharing configuration and state between different machines. Using a shared profile ensures consistent configuration and stable URLs between different CI runs and different developers.

:::note
The profile does not contain any cloud provider credentials.
:::

### Creating, viewing, editing and deleting a profile

Create a profile using the [`preevy init`](/cli-reference/init) or [`preevy profile create`](/cli-reference/profile#preevy-profile-create-name-url) commands.

Import an existing profile using the [`preevy init --from <profile-url>`](/cli-reference/init) or [`preevy profile import <profile-url> --use`](/cli-reference/profile#preevy-profile-import-location) commands.

View the list of imported profiles using the [`preevy profile ls`](/cli-reference/profile#preevy-profile-ls) command.

Copy a profile between storage locations using the [`preevy profile cp`](/cli-reference/profile#preevy-profile-cp) command.

See the [`preevy profile`](/cli-reference/profile) subcommands for other operations on profiles, including viewing the profile contents and editing it.

### Selecting a profile when running Preevy

A profile is identified using a URL. Preevy commands which require a profile accept the `--profile` flag. If the flag is not specified, the default profile is used. The default profile is set to the last created or imported profile, and can be explicitly set using the [`preevy profile use`](/cli-reference/profile#preevy-profile-use-name) command.
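For instance, a sketch of the two selection mechanisms (the profile name and URL below are placeholders, not real values):

```shell
# Set the default profile by name (as shown by `preevy profile ls`):
preevy profile use my-profile

# Or override the default for a single command with a profile URL:
preevy up --profile "s3://my-bucket/my-path?region=us-east-1"
```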

### Remote storage for profiles

Preevy includes built-in support for storing profiles remotely on [AWS S3](https://aws.amazon.com/s3/), [Google Cloud Storage](https://cloud.google.com/storage/) and [Azure Blob Storage](https://azure.microsoft.com/en-us/products/storage/blobs/). Storing the profile on remote storage makes it easy to share the profile and use Preevy in CI.

#### Required credentials

When using S3, the Preevy CLI uses the local AWS credential chain (e.g., from environment variables, AWS profile, or EC2 role). Similarly, when using Google Cloud Storage or Azure Blob Storage, the Preevy CLI uses the locally stored credentials.

For all remote storage providers, Preevy needs specific permissions to create the bucket (if it doesn't already exist) and read/write objects on the bucket.

#### Remote storage profile URLs

Remote storage profile URLs specify a bucket name and an optional base path. By specifying different base paths, the same bucket can be used to store multiple profiles.

Example AWS S3 URLs:

- `s3://my-bucket?region=us-east-1`: S3 bucket named `my-bucket` in the region `us-east-1`
- `s3://my-bucket/my-path?region=us-east-1`: Same as above, with the base path `my-path`

Example Google Cloud Storage URLs:
- `gs://my-bucket?project=my-project`: GCS bucket named `my-bucket` in the project `my-project`
- `gs://my-bucket/my-path?project=my-project`: Same as above, with the base path `my-path`

Example Azure Blob Storage URLs:
- `azblob://my-container?storage_account=myaccount`: AZBlob [container](https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blobs-introduction#containers) named `my-container` in the [storage account](https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blobs-introduction#storage-accounts) `myaccount`
- `azblob://my-container/my-path?storage_account=myaccount`: Same as above, with the base path `my-path`
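Since these URLs share a common `scheme://bucket/base-path?query` shape, a small shell sketch can illustrate how the parts break down (illustrative only; Preevy's own parsing may differ):

```shell
# Split a profile URL into its parts using shell parameter expansion.
# Assumes the URL contains both a base path and a query string.
url="s3://my-bucket/my-path?region=us-east-1"
scheme=${url%%://*}        # storage provider: s3, gs, azblob, local
rest=${url#*://}
hostpath=${rest%%\?*}      # bucket and base path, before the query
query=${rest#*\?}          # provider-specific options
bucket=${hostpath%%/*}
base_path=${hostpath#*/}
echo "$scheme | $bucket | $base_path | $query"
# → s3 | my-bucket | my-path | region=us-east-1
```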

### Local storage for profiles

You can also store the profile on the local filesystem. When using Preevy in CI, a local profile needs to be copied manually in order to create stable URLs between different CI runs.

Example local storage URL:
- `local://profile-name`: A local directory (under the OS's local data dir, e.g., `~/.local/share/preevy`) named `profile-name`

To copy a local profile to remote storage, use the [`preevy profile cp`](/cli-reference/profile#preevy-profile-cp) command.

## Components

@@ -94,6 +104,6 @@ For usage examples, you can go over the [CLI reference](/cli-reference)
#### [Tunnel server](https://github.com/livecycle/preevy/tree/main/packages/tunnel-server)

The tunnel server is a Node.js-based server responsible for exposing friendly HTTPS URLs for the Compose services.
A free public instance is hosted on `livecycle.run`, and it can be [self-hosted](https://github.com/livecycle/preevy/tree/main/tunnel-server/deployment/k8s) as well.

Read more about it: [Tunnel server](/tunnel-server)
16 changes: 12 additions & 4 deletions site/docs/tunnel-server.md
@@ -9,8 +9,10 @@ Preevy uses a tunnel server to expose the preview environment services to the end users.
By default, the CLI uses a public tunnel server hosted on `livecycle.run`, but this configuration can be overridden using the [`--tunnel-server`](cli-reference#preevy-up-service) flag.
The tunnel server can be self-hosted; a Docker/OCI image is publicly available at `ghcr.io/livecycle/preevy/tunnel-server`.

The tunnel server is a Node.js server responsible for creating tunnels for HTTP services using SSH.

The server accepts SSH connections on port 2222 and HTTP traffic on port 3000. It can also accept TLS connections (for both SSH and HTTPS) on port 8443. All ports are configurable using environment variables.
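The port layout above can be sketched as a container invocation. The image name appears on this page, but the rest is an assumption, not the server's documented deployment:

```shell
# Sketch: run the public tunnel-server image with the documented ports exposed:
#   2222 - SSH tunnel connections
#   3000 - plain HTTP traffic
#   8443 - TLS (SSH and HTTPS)
# Treat this as a starting point, not a definitive deployment; the actual
# environment variables for configuring ports are in the tunnel-server repo.
docker run --rm \
  -p 2222:2222 -p 3000:3000 -p 8443:8443 \
  ghcr.io/livecycle/preevy/tunnel-server
```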

Assuming the tunnel server is running on `tunnel-server-host`, creating a tunnel for an HTTP service running on port 5000 is as simple as running:

@@ -26,15 +28,21 @@ The expected output of this command will be something like:

A tunnel from `http://my-tunnel-5rwpwhy5.tunnel-server-host:3000` to `http://localhost:5000` was created and can be accessed by anyone who has the URL (assuming the tunnel server is public).

## Security

To connect to the tunnel server, you must use SSH private-key-based authentication.
URLs for tunnels are derived from the public key of the client (referred to as `clientId`):
`http://{tunnel_name}-{clientId}.{tunnel-server}`.
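Given that pattern, assembling a tunnel URL is plain string interpolation; the values below are taken from the example earlier on this page:

```shell
# Compose a tunnel URL per the pattern http://{tunnel_name}-{clientId}.{tunnel-server}
tunnel_name="my-tunnel"
client_id="5rwpwhy5"                     # derived from the client's public key
tunnel_server="tunnel-server-host:3000"
echo "http://${tunnel_name}-${client_id}.${tunnel_server}"
# → http://my-tunnel-5rwpwhy5.tunnel-server-host:3000
```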

The tunnel server supports TLS connections, which enable both HTTPS and SSH over TLS. For this, a TLS certificate is required.

## Self hosting the tunnel server

The tunnel server can be self-hosted. Use cases for self hosting include private networks, custom domain names and reducing network latency by deploying the tunnel server geographically closer to the deployment machines.

See the [deployment guide](https://github.com/livecycle/preevy/tree/main/tunnel-server/deployment/k8s).

## Observability

