wrangler e2e tests #666

Merged
pom.xml (4 changes: 2 additions & 2 deletions)
@@ -83,7 +83,7 @@
<aws.sdk.version>1.11.133</aws.sdk.version>
<bigquery.connector.hadoop2.version>0.10.2-hadoop2</bigquery.connector.hadoop2.version>
<bouncycastle.version>1.56</bouncycastle.version>
<cdap.version>6.10.0-SNAPSHOT</cdap.version>
<cdap.version>6.10.0</cdap.version>
<chlorine.version>1.1.5</chlorine.version>
<commons.validator.version>1.6</commons.validator.version>
<commons-io.version>2.5</commons-io.version>
@@ -547,7 +547,7 @@
<dependency>
<groupId>io.cdap.tests.e2e</groupId>
<artifactId>cdap-e2e-framework</artifactId>
<version>0.3.0-SNAPSHOT</version>
<version>0.4.0-SNAPSHOT</version>
<scope>test</scope>
</dependency>
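
The bumped cdap-e2e-framework dependency is what executes the feature files below: the e2e suite is a Cucumber run driven by a JUnit runner in the test sources. A minimal sketch of such a runner, assuming a hypothetical feature directory and glue packages (the real paths live in this repo's e2e test module and are not part of this diff):

```java
// Hypothetical runner class; package name, paths, and glue packages are assumptions.
package io.cdap.plugin.wrangler.runners;

import io.cucumber.junit.Cucumber;
import io.cucumber.junit.CucumberOptions;
import org.junit.runner.RunWith;

// Runs every scenario tagged @Wrangler from the Gherkin feature files in this PR.
@RunWith(Cucumber.class)
@CucumberOptions(
    features = {"src/e2e-test/features"},                        // assumed feature directory
    glue = {"stepsdesign", "io.cdap.plugin.common.stepsdesign"}, // assumed glue packages
    tags = "@Wrangler",
    monochrome = true,
    plugin = {"pretty", "html:target/cucumber-reports/wrangler.html"}
)
public class TestRunner {
}
```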

@@ -15,7 +15,7 @@
@Wrangler
Feature: datatype parsers

@BQ_SOURCE_TS_TEST @BQ_SINK_TEST
@BQ_SOURCE_TS_TEST @BQ_SOURCE_TEST @BQ_SINK_TEST
Scenario: To verify User is able to run a pipeline using parse timestamp directive
Given Open Datafusion Project to configure pipeline
Then Click on the Plus Green Button to import the pipelines
@@ -25,13 +25,13 @@ Feature: datatype parsers
Then Replace input plugin property: "dataset" with value: "dataset"
Then Replace input plugin property: "table" with value: "bqSourceTable"
Then Click on the Get Schema button
Member

why is this removed?

Contributor Author

Since the pipelines are imported from exports, the issue in almost all the PRs is that when we click the Get Schema and Validate buttons, most of the fields randomly come back null, and the schema itself changes. To work around that, we removed this step here.

Member

What do you mean by this?

when we click the Get Schema and Validate buttons, most of the fields randomly come back null, and the schema itself changes

Can you please elaborate with the help of screenshots?

Contributor Author

Added the schema and validation step.

Then Click on the Validate button
Then Validate "BigQueryTable" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery2"
Then Replace input plugin property: "project" with value: "projectId"
Then Replace input plugin property: "table" with value: "bqTargetTable"
Then Replace input plugin property: "dataset" with value: "dataset"
Then Click on the Validate button
Then Validate "BigQuery2" plugin properties
Then Close the Plugin Properties page
Then Rename the pipeline
Then Deploy the pipeline
@@ -43,7 +43,7 @@
Then Validate The Data From BQ To BQ With Actual And Expected File for: "ExpectedDirective_parse_Timestamp"


@BQ_SOURCE_DATETIME_TEST @BQ_SINK_TEST
@BQ_SOURCE_DATETIME_TEST @BQ_SOURCE_TEST @BQ_SINK_TEST
Scenario: To verify User is able to run a pipeline using parse datetime directive
Given Open Datafusion Project to configure pipeline
Then Click on the Plus Green Button to import the pipelines
@@ -53,13 +53,14 @@
Then Replace input plugin property: "dataset" with value: "dataset"
Then Replace input plugin property: "table" with value: "bqSourceTable"
Then Click on the Get Schema button
Then Click on the Validate button
Then Validate "BigQueryTable" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery2"
Then Replace input plugin property: "project" with value: "projectId"
Then Replace input plugin property: "table" with value: "bqTargetTable"
Then Replace input plugin property: "dataset" with value: "dataset"
Then Click on the Validate button
Then Validate "BigQuery2" plugin properties
Then Close the Plugin Properties page
Then Rename the pipeline
Then Deploy the pipeline
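
A note on the recurring tag change in this PR: adding @BQ_SOURCE_TEST next to the existing per-scenario tags lets the framework provision BigQuery test data through Cucumber hooks that match on tag expressions. A rough sketch of how such a hook pair could look, with assumed dataset, table, and column names standing in for the framework's property-driven values:

```java
package stepsdesign; // hypothetical glue package

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.QueryJobConfiguration;
import io.cucumber.java.After;
import io.cucumber.java.Before;

// Creates and drops a throwaway BigQuery source table around every scenario
// tagged @BQ_SOURCE_TEST; the column list here is purely illustrative.
public class BqSetupHooks {

  private static final String DATASET = "e2e_dataset";          // assumed dataset name
  private static final String SOURCE_TABLE = "wrangler_source"; // assumed table name

  private final BigQuery bigQuery = BigQueryOptions.getDefaultInstance().getService();

  @Before(order = 1, value = "@BQ_SOURCE_TEST")
  public void createBqSourceTable() throws InterruptedException {
    bigQuery.query(QueryJobConfiguration.of(
        "CREATE TABLE IF NOT EXISTS `" + DATASET + "." + SOURCE_TABLE + "` "
            + "(id INT64, name STRING, create_date STRING, update_ts TIMESTAMP)"));
  }

  @After(order = 1, value = "@BQ_SOURCE_TEST")
  public void dropBqSourceTable() throws InterruptedException {
    bigQuery.query(QueryJobConfiguration.of(
        "DROP TABLE IF EXISTS `" + DATASET + "." + SOURCE_TABLE + "`"));
  }
}
```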
@@ -13,9 +13,9 @@
# the License.

@Wrangler
Feature: Wrangler - Run time scenarios
Feature: Wrangler - Run time scenarios for parse csv

@BQ_SOURCE_CSV_TEST @BQ_SINK_TEST
@BQ_SOURCE_CSV_TEST @BQ_SOURCE_TEST @BQ_SINK_TEST
Scenario: To verify User is able to run a pipeline using parse csv directive
Given Open Datafusion Project to configure pipeline
Then Click on the Plus Green Button to import the pipelines
@@ -25,13 +25,13 @@ Feature: Wrangler - Run time scenarios
Then Replace input plugin property: "dataset" with value: "dataset"
Then Replace input plugin property: "table" with value: "bqSourceTable"
Then Click on the Get Schema button
Then Click on the Validate button
Then Validate "BigQueryTable" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery2"
Then Replace input plugin property: "project" with value: "projectId"
Then Replace input plugin property: "table" with value: "bqTargetTable"
Then Replace input plugin property: "dataset" with value: "dataset"
Then Click on the Validate button
Then Validate "BigQuery2" plugin properties
Then Close the Plugin Properties page
Then Rename the pipeline
Then Deploy the pipeline
@@ -0,0 +1,40 @@
# Copyright © 2023 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.

@Wrangler
Feature: Parse as excel

@GCS_SOURCE_TEST @BQ_SINK_TEST
Scenario: To verify User is able to run a pipeline using parse Excel directive
Given Open Datafusion Project to configure pipeline
Then Click on the Plus Green Button to import the pipelines
Then Select the file for importing the pipeline for the plugin "Directive_parse_excel"
Then Navigate to the properties page of plugin: "GCSFile"
Then Replace input plugin property: "project" with value: "projectId"
Then Replace input plugin property: "path" with value: "gcsSourceBucket"
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery"
Then Replace input plugin property: "project" with value: "projectId"
Then Replace input plugin property: "table" with value: "bqTargetTable"
Then Replace input plugin property: "dataset" with value: "dataset"
Then Validate "BigQuery" plugin properties
Then Close the Plugin Properties page
Then Rename the pipeline
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate The Data From BQ To BQ With Actual And Expected File for: "ExpectedDirective_parse_excel"
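
The new parse-Excel feature is the only one here that reads from GCS rather than BigQuery, so its @GCS_SOURCE_TEST tag presumably stages the Excel test file in a bucket before the scenario runs. A sketch of what such a hook might look like, with an assumed local test file and bucket naming scheme rather than the framework's real configuration:

```java
package stepsdesign; // hypothetical glue package

import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.BucketInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import io.cucumber.java.After;
import io.cucumber.java.Before;

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.UUID;

// Stages the Excel test data in a throwaway GCS bucket for scenarios tagged
// @GCS_SOURCE_TEST, then cleans the bucket up afterwards.
public class GcsSetupHooks {

  // Assumed local resource; the real file name comes from the framework's properties.
  private static final String LOCAL_EXCEL_FILE = "src/e2e-test/resources/testdata/wrangler_test.xlsx";

  private final Storage storage = StorageOptions.getDefaultInstance().getService();
  private String bucketName;

  @Before("@GCS_SOURCE_TEST")
  public void createGcsSourceBucket() throws IOException {
    bucketName = "wrangler-e2e-" + UUID.randomUUID();
    storage.create(BucketInfo.of(bucketName));
    storage.create(BlobInfo.newBuilder(bucketName, "wrangler_test.xlsx").build(),
                   Files.readAllBytes(Paths.get(LOCAL_EXCEL_FILE)));
  }

  @After("@GCS_SOURCE_TEST")
  public void deleteGcsSourceBucket() {
    // Delete the object first; GCS only removes empty buckets.
    storage.delete(bucketName, "wrangler_test.xlsx");
    storage.delete(bucketName);
  }
}
```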
@@ -15,7 +15,7 @@
@Wrangler
Feature: parse as fixed length

@BQ_SOURCE_FXDLEN_TEST @BQ_SINK_TEST
@BQ_SOURCE_FXDLEN_TEST @BQ_SOURCE_TEST @BQ_SINK_TEST
Scenario: To verify User is able to run a pipeline using parse fixedlength directive
Given Open Datafusion Project to configure pipeline
Then Click on the Plus Green Button to import the pipelines
@@ -25,13 +25,13 @@ Feature: parse as fixed length
Then Replace input plugin property: "dataset" with value: "dataset"
Then Replace input plugin property: "table" with value: "bqSourceTable"
Then Click on the Get Schema button
Then Click on the Validate button
Then Validate "BigQueryTable" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery2"
Then Replace input plugin property: "project" with value: "projectId"
Then Replace input plugin property: "table" with value: "bqTargetTable"
Then Replace input plugin property: "dataset" with value: "dataset"
Then Click on the Validate button
Then Validate "BigQuery2" plugin properties
Then Close the Plugin Properties page
Then Rename the pipeline
Then Deploy the pipeline
@@ -15,7 +15,7 @@
@Wrangler
Feature: parse as HL7

@BQ_SOURCE_HL7_TEST @BQ_SINK_TEST
@BQ_SOURCE_HL7_TEST @BQ_SOURCE_TEST @BQ_SINK_TEST
Scenario: To verify User is able to run a pipeline using parse hl7 directive
Given Open Datafusion Project to configure pipeline
Then Click on the Plus Green Button to import the pipelines
@@ -25,16 +25,15 @@ Feature: parse as HL7
Then Replace input plugin property: "dataset" with value: "dataset"
Then Replace input plugin property: "table" with value: "bqSourceTable"
Then Click on the Get Schema button
Then Click on the Validate button
Then Validate "BigQueryTable" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery2"
Then Replace input plugin property: "project" with value: "projectId"
Then Replace input plugin property: "table" with value: "bqTargetTable"
Then Replace input plugin property: "dataset" with value: "dataset"
Then Click on the Validate button
Then Validate "BigQuery2" plugin properties
Then Close the Plugin Properties page
Then Rename the pipeline
Then Deploy the pipeline
Then Save and Deploy Pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
@@ -0,0 +1,43 @@
# Copyright © 2023 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.

@Wrangler
Feature: parse as Json

@BQ_SOURCE_JSON_TEST @BQ_SOURCE_TEST @BQ_SINK_TEST
Scenario: To verify User is able to run a pipeline using parse Json directive
Given Open Datafusion Project to configure pipeline
Then Click on the Plus Green Button to import the pipelines
Then Select the file for importing the pipeline for the plugin "Directive_parse_json"
Then Navigate to the properties page of plugin: "BigQueryTable"
Then Replace input plugin property: "project" with value: "projectId"
Then Replace input plugin property: "dataset" with value: "dataset"
Then Replace input plugin property: "table" with value: "bqSourceTable"
Then Click on the Get Schema button
Then Validate "BigQueryTable" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery2"
Then Replace input plugin property: "project" with value: "projectId"
Then Replace input plugin property: "table" with value: "bqTargetTable"
Then Replace input plugin property: "dataset" with value: "dataset"
Then Validate "BigQuery2" plugin properties
Then Close the Plugin Properties page
Then Rename the pipeline
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate The Data From BQ To BQ With Actual And Expected File for: "ExpectedDirective_parse_json"
@@ -0,0 +1,43 @@
# Copyright © 2023 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.

@Wrangler
Feature: parse as XmlToJson

@BQ_SOURCE_XML_TEST @BQ_SOURCE_TEST @BQ_SINK_TEST
Scenario: To verify User is able to run a pipeline using parse XmlToJson directive
Given Open Datafusion Project to configure pipeline
Then Click on the Plus Green Button to import the pipelines
Then Select the file for importing the pipeline for the plugin "Directive_parse_xml"
Then Navigate to the properties page of plugin: "BigQueryTable"
Then Replace input plugin property: "project" with value: "projectId"
Then Replace input plugin property: "dataset" with value: "dataset"
Then Replace input plugin property: "table" with value: "bqSourceTable"
Then Click on the Get Schema button
Then Validate "BigQueryTable" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery2"
Then Replace input plugin property: "project" with value: "projectId"
Then Replace input plugin property: "table" with value: "bqTargetTable"
Then Replace input plugin property: "dataset" with value: "dataset"
Then Validate "BigQuery2" plugin properties
Then Close the Plugin Properties page
Then Rename the pipeline
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate The Data From BQ To BQ With Actual And Expected File for: "ExpectedDirective_parse_xml"
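
Every scenario ends with the same BQ-to-expected-file comparison step. The step definition itself is outside this diff, but conceptually it reads the sink table back and compares it row for row against a checked-in expected file. A sketch under those assumptions, with hypothetical dataset, table, and resource paths:

```java
package stepsdesign; // hypothetical glue package

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.FieldValueList;
import com.google.cloud.bigquery.QueryJobConfiguration;
import com.google.cloud.bigquery.TableResult;
import io.cucumber.java.en.Then;
import org.junit.Assert;

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.HashSet;
import java.util.Set;

public class BqValidationSteps {

  // Assumed locations; the real values come from the framework's parameter files.
  private static final String DATASET = "e2e_dataset";
  private static final String TARGET_TABLE = "wrangler_target";
  private static final String EXPECTED_DIR = "src/e2e-test/resources/BQValidationExpectedFiles/";

  @Then("Validate The Data From BQ To BQ With Actual And Expected File for: {string}")
  public void validateActualAgainstExpected(String expectedFile) throws IOException, InterruptedException {
    // Expected output: one JSON record per line, checked in next to the feature files.
    Set<String> expected = new HashSet<>(Files.readAllLines(Paths.get(EXPECTED_DIR + expectedFile)));

    // Actual output: read every row of the sink table back as canonical JSON.
    BigQuery bigQuery = BigQueryOptions.getDefaultInstance().getService();
    TableResult result = bigQuery.query(QueryJobConfiguration.of(
        "SELECT TO_JSON_STRING(t) FROM `" + DATASET + "." + TARGET_TABLE + "` AS t"));

    Set<String> actual = new HashSet<>();
    for (FieldValueList row : result.iterateAll()) {
      actual.add(row.get(0).getStringValue());
    }

    Assert.assertEquals("Sink table rows should match the expected file", expected, actual);
  }
}
```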