Improve login command ux #859

Open
wants to merge 28 commits into master
Changes from 25 commits
e23dee0  Improve push command usability (rmn-boiko, Sep 6, 2024)
729312f  Add account info endpoint (rmn-boiko, Sep 23, 2024)
ab8451a  Merge branch 'master' into feature/450-improve-login-command-ux (rmn-boiko, Sep 23, 2024)
9c25603  Add repo as a to param (rmn-boiko, Sep 24, 2024)
f0481ec  Fix get account info call (rmn-boiko, Sep 26, 2024)
7dabc73  Merge branch 'master' into feature/450-improve-login-command-ux (rmn-boiko, Sep 26, 2024)
b189c38  Add new Alisa resolver (rmn-boiko, Oct 1, 2024)
51eec38  Finish resolver implementation (rmn-boiko, Oct 1, 2024)
93ea8f9  Merge branch 'master' into feature/450-improve-login-command-ux (rmn-boiko, Oct 1, 2024)
48dbc77  Add workspace info endpoint (rmn-boiko, Oct 1, 2024)
a72186b  Add rest tests (rmn-boiko, Oct 2, 2024)
becebcb  Add dataset ref tests (rmn-boiko, Oct 2, 2024)
b992cb1  Add early return for alias ref (rmn-boiko, Oct 2, 2024)
4b7a016  Fix tests (rmn-boiko, Oct 2, 2024)
db0a379  Update packages (rmn-boiko, Oct 2, 2024)
b32b9cd  Fix grammar (rmn-boiko, Oct 2, 2024)
30872f8  Add push protocol test (rmn-boiko, Oct 3, 2024)
d439086  Improve tests (rmn-boiko, Oct 3, 2024)
fea8286  Merge branch 'master' into feature/450-improve-login-command-ux (rmn-boiko, Oct 3, 2024)
d6af96e  Fix tests (rmn-boiko, Oct 3, 2024)
072f9bf  Clean code (rmn-boiko, Oct 3, 2024)
7b255e4  Merge branch 'master' into feature/450-improve-login-command-ux (rmn-boiko, Oct 3, 2024)
b0ec2fc  Fix review comments (rmn-boiko, Oct 3, 2024)
9b2138d  Update changelog (rmn-boiko, Oct 3, 2024)
546ec2f  Add dataset by id rest api tests (rmn-boiko, Oct 3, 2024)
d87782c  Fix review comments. Iter 1 (rmn-boiko, Oct 8, 2024)
5d87a52  Merge branch 'master' into feature/450-improve-login-command-ux (rmn-boiko, Oct 8, 2024)
339c66c  Fix tests (rmn-boiko, Oct 8, 2024)
2 changes: 2 additions & 0 deletions CHANGELOG.md
@@ -24,6 +24,8 @@ Recommendation: for ease of reading, use the following order:
- Simplified error handling code in repositories
- Hidden part of the test code behind the feature gate
- Updated our crate dependencies so they can be built in isolation
- `kamu push <dataset>` command can now be called without a `--to` reference; the alias or a remote dataset repository will be used as the destination
- `kamu login` command now stores the repository in the repository registry. A name can be provided with the `--repo-name` flag; use `--skip-add-repo` to skip creating the repository

## [0.204.4] - 2024-09-30
### Changed
2 changes: 2 additions & 0 deletions resources/cli-reference.md
@@ -526,6 +526,8 @@ Authenticates with a remote ODF server interactively
* `--user` — Store access token in the user home folder rather than in the workspace
* `--check` — Check whether existing authorization is still valid without triggering a login flow
* `--access-token <ACCESS_TOKEN>` — Provide an existing access token
* `--repo-name <REPO_NAME>` — Name under which the repository will be stored in the repositories list
* `--skip-add-repo` — Don't automatically add a remote repository for this host



95 changes: 95 additions & 0 deletions src/adapter/http/src/data/account_handler.rs
@@ -0,0 +1,95 @@
// Copyright Kamu Data, Inc. and contributors. All rights reserved.
//
// Use of this software is governed by the Business Source License
// included in the LICENSE file.
//
// As of the Change Date specified in that file, in accordance with
// the Business Source License, use of this software will be governed
// by the Apache License, Version 2.0.

use axum::extract::Extension;
use axum::response::Json;
use chrono::{DateTime, Utc};
use database_common_macros::transactional_handler;
use dill::Catalog;
use http_common::*;
use kamu_accounts::{
Account,
AccountDisplayName,
AccountType,
AuthenticationService,
CurrentAccountSubject,
};
use opendatafabric::{AccountID, AccountName};

////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////

#[derive(Debug, serde::Serialize)]
#[serde(rename_all = "camelCase")]
pub struct AccountResponse {
pub id: AccountID,
pub account_name: AccountName,
pub email: Option<String>,
pub display_name: AccountDisplayName,
pub account_type: AccountType,
pub avatar_url: Option<String>,
pub registered_at: DateTime<Utc>,
pub is_admin: bool,
pub provider: String,
pub provider_identity_key: String,
}

impl From<Account> for AccountResponse {
fn from(value: Account) -> Self {
Self {
id: value.id,
account_name: value.account_name,
email: value.email,
display_name: value.display_name,
account_type: value.account_type,
avatar_url: value.avatar_url,
registered_at: value.registered_at,
is_admin: value.is_admin,
provider: value.provider,
provider_identity_key: value.provider_identity_key,
}
}
}

////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////

#[transactional_handler]
pub async fn account_handler(
Extension(catalog): Extension<Catalog>,
) -> Result<Json<AccountResponse>, ApiError> {
let response = get_account(&catalog).await?;
tracing::debug!(?response, "Get account info response");
Ok(response)
}

async fn get_account(catalog: &Catalog) -> Result<Json<AccountResponse>, ApiError> {
let current_account_subject = catalog.get_one::<CurrentAccountSubject>().unwrap();
match current_account_subject.as_ref() {
CurrentAccountSubject::Anonymous(_) => Err(ApiError::new_unauthorized()),
CurrentAccountSubject::Logged(account) => {
let auth_service = catalog.get_one::<dyn AuthenticationService>().unwrap();
let full_account_info_maybe = auth_service.account_by_id(&account.account_id).await?;
if let Some(full_account_info) = full_account_info_maybe {
return Ok(Json(full_account_info.into()));
}

Err(ApiError::not_found_without_body())
}
}
}

////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
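The anonymous-vs-logged dispatch in `get_account` can be sketched with std-only stand-ins. The `Subject` and `ApiErr` types below are hypothetical simplifications of `CurrentAccountSubject` and `ApiError`, and the slice lookup stands in for `AuthenticationService::account_by_id`:

```rust
// Hypothetical, simplified mirror of get_account's control flow:
// anonymous callers are rejected up front, logged-in callers are
// looked up, and a missing record maps to a not-found error.
#[derive(Debug, PartialEq)]
enum Subject {
    Anonymous,
    Logged(String), // account id
}

#[derive(Debug, PartialEq)]
enum ApiErr {
    Unauthorized,
    NotFound,
}

fn resolve_account(subject: &Subject, known_ids: &[&str]) -> Result<String, ApiErr> {
    match subject {
        Subject::Anonymous => Err(ApiErr::Unauthorized),
        Subject::Logged(id) => {
            // Stand-in for auth_service.account_by_id(&account.account_id)
            if known_ids.contains(&id.as_str()) {
                Ok(id.clone())
            } else {
                Err(ApiErr::NotFound)
            }
        }
    }
}
```

The real handler performs the same three-way split, just with the catalog-resolved services instead of the slice.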
75 changes: 75 additions & 0 deletions src/adapter/http/src/data/dataset_info_handler.rs
@@ -0,0 +1,75 @@
// Copyright Kamu Data, Inc. and contributors. All rights reserved.
//
// Use of this software is governed by the Business Source License
// included in the LICENSE file.
//
// As of the Change Date specified in that file, in accordance with
// the Business Source License, use of this software will be governed
// by the Apache License, Version 2.0.

use axum::extract::{Extension, Path};
use axum::response::Json;
use database_common_macros::transactional_handler;
use dill::Catalog;
use http_common::*;
use kamu_core::{DatasetRepository, GetDatasetError};
use opendatafabric::{AccountName, DatasetHandle, DatasetID, DatasetName};

////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////

#[derive(Debug, serde::Serialize)]
#[serde(rename_all = "camelCase")]
pub struct DatasetInfoResponse {
pub id: DatasetID,
pub account_name: Option<AccountName>,
pub dataset_name: DatasetName,
}

impl From<DatasetHandle> for DatasetInfoResponse {
fn from(value: DatasetHandle) -> Self {
Self {
id: value.id,
account_name: value.alias.account_name,
dataset_name: value.alias.dataset_name,
}
}
}

////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////

#[transactional_handler]
pub async fn dataset_info_handler(
Extension(catalog): Extension<Catalog>,
Path(dataset_id): Path<DatasetID>,
) -> Result<Json<DatasetInfoResponse>, ApiError> {
let response = get_dataset_by_id(&catalog, &dataset_id).await?;
tracing::debug!(?response, "Get dataset by id info response");
Ok(response)
}

async fn get_dataset_by_id(
catalog: &Catalog,
dataset_id: &DatasetID,
) -> Result<Json<DatasetInfoResponse>, ApiError> {
let dataset_repo = catalog.get_one::<dyn DatasetRepository>().unwrap();
let dataset_handle = dataset_repo
.resolve_dataset_ref(&dataset_id.clone().as_local_ref())
.await
.map_err(|err| match err {
GetDatasetError::NotFound(e) => ApiError::not_found(e),
GetDatasetError::Internal(e) => e.api_err(),
})?;

Ok(Json(dataset_handle.into()))
}

////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
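The `From<DatasetHandle> for DatasetInfoResponse` impl flattens the handle's nested alias into top-level fields. A std-only sketch with hypothetical stand-ins for `DatasetHandle` and `DatasetAlias`:

```rust
// Stand-in types: Alias mirrors DatasetAlias, Handle mirrors DatasetHandle.
#[derive(Debug, Clone, PartialEq)]
struct Alias {
    account_name: Option<String>, // None in single-tenant workspaces
    dataset_name: String,
}

#[derive(Debug, Clone)]
struct Handle {
    id: String,
    alias: Alias,
}

#[derive(Debug, PartialEq)]
struct InfoResponse {
    id: String,
    account_name: Option<String>,
    dataset_name: String,
}

impl From<Handle> for InfoResponse {
    // Flatten the nested alias into the response's flat field layout,
    // exactly as the handler's From<DatasetHandle> impl does.
    fn from(h: Handle) -> Self {
        Self {
            id: h.id,
            account_name: h.alias.account_name,
            dataset_name: h.alias.dataset_name,
        }
    }
}
```

Flattening here keeps the JSON response shallow for clients that only need the three identifying fields.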
3 changes: 3 additions & 0 deletions src/adapter/http/src/data/mod.rs
@@ -7,12 +7,15 @@
// the Business Source License, use of this software will be governed
// by the Apache License, Version 2.0.

mod account_handler;
mod dataset_info_handler;
mod ingest_handler;
pub mod metadata_handler;
mod query_handler;
pub mod query_types;
mod router;
mod tail_handler;
mod verify_handler;
mod workspace_info_handler;

pub use router::*;
4 changes: 2 additions & 2 deletions src/adapter/http/src/data/query_handler.rs
@@ -143,7 +143,7 @@ pub(crate) async fn query_handler_post_v2(
},
})
} else {
Err(ApiError::not_implemented(ResponseSigningNotCongigured))?
Err(ApiError::not_implemented(ResponseSigningNotConfigured))?
};

Ok(Json(response))
@@ -223,4 +223,4 @@ pub async fn query_handler(

#[derive(Debug, thiserror::Error)]
#[error("Response signing is not enabled by the node operator")]
struct ResponseSigningNotCongigured;
struct ResponseSigningNotConfigured;
12 changes: 12 additions & 0 deletions src/adapter/http/src/data/router.rs
@@ -29,6 +29,18 @@ pub fn root_router() -> axum::Router {
"/verify",
axum::routing::post(super::verify_handler::verify_handler),
)
.route(
"/workspace/info",
axum::routing::get(super::workspace_info_handler::workspace_info_handler),
)
.route(
"/me",
axum::routing::get(super::account_handler::account_handler),
)
.route(
"/datasets/:id",
axum::routing::get(super::dataset_info_handler::dataset_info_handler),
)
}

////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
51 changes: 51 additions & 0 deletions src/adapter/http/src/data/workspace_info_handler.rs
@@ -0,0 +1,51 @@
// Copyright Kamu Data, Inc. and contributors. All rights reserved.
//
// Use of this software is governed by the Business Source License
// included in the LICENSE file.
//
// As of the Change Date specified in that file, in accordance with
// the Business Source License, use of this software will be governed
// by the Apache License, Version 2.0.

use axum::extract::Extension;
use axum::response::Json;
use dill::Catalog;
use http_common::*;
use kamu_core::DatasetRepository;

////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////

#[derive(Debug, serde::Serialize)]
#[serde(rename_all = "camelCase")]
pub struct WorkspaceInfoResponse {
pub is_multi_tenant: bool,
}

////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////

pub async fn workspace_info_handler(
Extension(catalog): Extension<Catalog>,
) -> Result<Json<WorkspaceInfoResponse>, ApiError> {
let response = get_workspace_info(&catalog);
tracing::debug!(?response, "Get workspace info response");
Ok(response)
}

fn get_workspace_info(catalog: &Catalog) -> Json<WorkspaceInfoResponse> {
let dataset_repo = catalog.get_one::<dyn DatasetRepository>().unwrap();

Json(WorkspaceInfoResponse {
is_multi_tenant: dataset_repo.is_multi_tenant(),
})
}

////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
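Because of `#[serde(rename_all = "camelCase")]`, this endpoint serializes the field as `isMultiTenant`, not `is_multi_tenant`. A std-only sketch of that renaming rule (not serde's actual implementation, just the convention it applies to field names):

```rust
// Minimal snake_case -> camelCase conversion, mimicking what
// #[serde(rename_all = "camelCase")] does to struct field names:
// drop each underscore and uppercase the character that follows it.
fn to_camel_case(snake: &str) -> String {
    let mut out = String::with_capacity(snake.len());
    let mut upper_next = false;
    for ch in snake.chars() {
        if ch == '_' {
            upper_next = true;
        } else if upper_next {
            out.extend(ch.to_uppercase());
            upper_next = false;
        } else {
            out.push(ch);
        }
    }
    out
}
```

So a client of `/workspace/info` should expect a body like `{"isMultiTenant": true}`.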
28 changes: 25 additions & 3 deletions src/adapter/http/tests/harness/client_side_harness.rs
@@ -8,13 +8,15 @@
// by the Apache License, Version 2.0.

use std::path::PathBuf;
use std::str::FromStr;
use std::sync::Arc;

use auth::OdfServerAccessTokenResolver;
use container_runtime::ContainerRuntime;
use database_common::NoOpDatabasePlugin;
use dill::Component;
use headers::Header;
use internal_error::{InternalError, ResultIntoInternal};
use kamu::domain::*;
use kamu::*;
use kamu_accounts::CurrentAccountSubject;
@@ -27,7 +29,8 @@ use opendatafabric::{
DatasetID,
DatasetRef,
DatasetRefAny,
DatasetRefRemote,
RepoName,
TransferDatasetRef,
};
use tempfile::TempDir;
use time_source::SystemTimeSourceDefault;
@@ -106,6 +109,7 @@ impl ClientSideHarness {
b.add::<RemoteRepositoryRegistryImpl>();

b.add::<RemoteAliasesRegistryImpl>();
b.add::<RemoteAliasResolverImpl>();

b.add::<EngineProvisionerNull>();

@@ -239,7 +243,7 @@
pub async fn push_dataset(
&self,
dataset_local_ref: DatasetRef,
dataset_remote_ref: DatasetRefRemote,
dataset_remote_ref: TransferDatasetRef,
force: bool,
dataset_visibility: DatasetVisibility,
) -> Vec<PushResponse> {
@@ -266,7 +270,7 @@
pub async fn push_dataset_result(
&self,
dataset_local_ref: DatasetRef,
dataset_remote_ref: DatasetRefRemote,
dataset_remote_ref: TransferDatasetRef,
force: bool,
dataset_visibility: DatasetVisibility,
) -> SyncResult {
@@ -288,6 +292,24 @@
}
}

pub fn add_repository(
&self,
repo_name: &RepoName,
base_url: &str,
) -> Result<(), InternalError> {
let remote_repo_reg = self
.catalog
.get_one::<dyn RemoteRepositoryRegistry>()
.unwrap();

remote_repo_reg
.add_repository(
repo_name,
Url::from_str(&format!("http://{base_url}")).unwrap(),
)
.int_err()
}

pub fn internal_datasets_folder_path(&self) -> PathBuf {
self.tempdir.path().join("datasets")
}
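The new `add_repository` helper builds the repository URL by prepending `http://` to the harness's bare `host:port` before handing it to `Url::from_str`. A sketch of just that formatting step; the scheme check is an added illustration, not part of the original helper:

```rust
// Normalize a bare host:port into the http URL the registry expects.
// Rejecting inputs that already carry a scheme guards against producing
// strings like "http://https://example.com".
fn normalize_base_url(base_url: &str) -> Result<String, String> {
    if base_url.contains("://") {
        return Err(format!("expected a bare host, got: {base_url}"));
    }
    Ok(format!("http://{base_url}"))
}
```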