e2e for wrangler-groupBy
priyabhatnagar25 authored and AnkitCLI committed Jan 23, 2024
1 parent 7042ab9 commit e20018d
Showing 23 changed files with 370 additions and 103 deletions.
4 changes: 2 additions & 2 deletions pom.xml
@@ -83,7 +83,7 @@
<aws.sdk.version>1.11.133</aws.sdk.version>
<bigquery.connector.hadoop2.version>0.10.2-hadoop2</bigquery.connector.hadoop2.version>
<bouncycastle.version>1.56</bouncycastle.version>
-<cdap.version>6.10.0-SNAPSHOT</cdap.version>
+<cdap.version>6.10.0</cdap.version>
<chlorine.version>1.1.5</chlorine.version>
<commons.validator.version>1.6</commons.validator.version>
<commons-io.version>2.5</commons-io.version>
@@ -547,7 +547,7 @@
<dependency>
<groupId>io.cdap.tests.e2e</groupId>
<artifactId>cdap-e2e-framework</artifactId>
-<version>0.3.0-SNAPSHOT</version>
+<version>0.4.0-SNAPSHOT</version>
<scope>test</scope>
</dependency>

@@ -15,7 +15,7 @@
@Wrangler
Feature: datatype parsers

-@BQ_SOURCE_TS_TEST @BQ_SINK_TEST
+@BQ_SOURCE_TS_TEST @BQ_SOURCE_TEST @BQ_SINK_TEST
Scenario: To verify User is able to run a pipeline using parse timestamp directive
Given Open Datafusion Project to configure pipeline
Then Click on the Plus Green Button to import the pipelines
@@ -25,13 +25,13 @@ Feature: datatype parsers
Then Replace input plugin property: "dataset" with value: "dataset"
Then Replace input plugin property: "table" with value: "bqSourceTable"
Then Click on the Get Schema button
-Then Click on the Validate button
+Then Validate "BigQueryTable" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery2"
Then Replace input plugin property: "project" with value: "projectId"
Then Replace input plugin property: "table" with value: "bqTargetTable"
Then Replace input plugin property: "dataset" with value: "dataset"
-Then Click on the Validate button
+Then Validate "BigQuery2" plugin properties
Then Close the Plugin Properties page
Then Rename the pipeline
Then Deploy the pipeline
@@ -43,7 +43,7 @@
Then Validate The Data From BQ To BQ With Actual And Expected File for: "ExpectedDirective_parse_Timestamp"


-@BQ_SOURCE_DATETIME_TEST @BQ_SINK_TEST
+@BQ_SOURCE_DATETIME_TEST @BQ_SOURCE_TEST @BQ_SINK_TEST
Scenario: To verify User is able to run a pipeline using parse datetime directive
Given Open Datafusion Project to configure pipeline
Then Click on the Plus Green Button to import the pipelines
Expand All @@ -53,13 +53,13 @@ Feature: datatype parsers
Then Replace input plugin property: "dataset" with value: "dataset"
Then Replace input plugin property: "table" with value: "bqSourceTable"
Then Click on the Get Schema button
-Then Click on the Validate button
+Then Validate "BigQueryTable" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery2"
Then Replace input plugin property: "project" with value: "projectId"
Then Replace input plugin property: "table" with value: "bqTargetTable"
Then Replace input plugin property: "dataset" with value: "dataset"
-Then Click on the Validate button
+Then Validate "BigQuery2" plugin properties
Then Close the Plugin Properties page
Then Rename the pipeline
Then Deploy the pipeline
@@ -15,7 +15,7 @@
@Wrangler
Feature: Wrangler - Run time scenarios

-@BQ_SOURCE_CSV_TEST @BQ_SINK_TEST
+@BQ_SOURCE_CSV_TEST @BQ_SOURCE_TEST @BQ_SINK_TEST
Scenario: To verify User is able to run a pipeline using parse csv directive
Given Open Datafusion Project to configure pipeline
Then Click on the Plus Green Button to import the pipelines
@@ -25,13 +25,13 @@ Feature: Wrangler - Run time scenarios
Then Replace input plugin property: "dataset" with value: "dataset"
Then Replace input plugin property: "table" with value: "bqSourceTable"
Then Click on the Get Schema button
-Then Click on the Validate button
+Then Validate "BigQueryTable" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery2"
Then Replace input plugin property: "project" with value: "projectId"
Then Replace input plugin property: "table" with value: "bqTargetTable"
Then Replace input plugin property: "dataset" with value: "dataset"
-Then Click on the Validate button
+Then Validate "BigQuery2" plugin properties
Then Close the Plugin Properties page
Then Rename the pipeline
Then Deploy the pipeline
@@ -15,7 +15,7 @@
@Wrangler
Feature: parse as fixed length

-@BQ_SOURCE_FXDLEN_TEST @BQ_SINK_TEST
+@BQ_SOURCE_FXDLEN_TEST @BQ_SOURCE_TEST @BQ_SINK_TEST
Scenario: To verify User is able to run a pipeline using parse fixedlength directive
Given Open Datafusion Project to configure pipeline
Then Click on the Plus Green Button to import the pipelines
@@ -25,13 +25,13 @@ Feature: parse as fixed length
Then Replace input plugin property: "dataset" with value: "dataset"
Then Replace input plugin property: "table" with value: "bqSourceTable"
Then Click on the Get Schema button
-Then Click on the Validate button
+Then Validate "BigQueryTable" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery2"
Then Replace input plugin property: "project" with value: "projectId"
Then Replace input plugin property: "table" with value: "bqTargetTable"
Then Replace input plugin property: "dataset" with value: "dataset"
-Then Click on the Validate button
+Then Validate "BigQuery2" plugin properties
Then Close the Plugin Properties page
Then Rename the pipeline
Then Deploy the pipeline
@@ -15,7 +15,7 @@
@Wrangler
Feature: parse as HL7

-@BQ_SOURCE_HL7_TEST @BQ_SINK_TEST
+@BQ_SOURCE_HL7_TEST @BQ_SOURCE_TEST @BQ_SINK_TEST
Scenario: To verify User is able to run a pipeline using parse hl7 directive
Given Open Datafusion Project to configure pipeline
Then Click on the Plus Green Button to import the pipelines
@@ -25,13 +25,13 @@ Feature: parse as HL7
Then Replace input plugin property: "dataset" with value: "dataset"
Then Replace input plugin property: "table" with value: "bqSourceTable"
Then Click on the Get Schema button
-Then Click on the Validate button
+Then Validate "BigQueryTable" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery2"
Then Replace input plugin property: "project" with value: "projectId"
Then Replace input plugin property: "table" with value: "bqTargetTable"
Then Replace input plugin property: "dataset" with value: "dataset"
-Then Click on the Validate button
+Then Validate "BigQuery2" plugin properties
Then Close the Plugin Properties page
Then Rename the pipeline
Then Deploy the pipeline
43 changes: 43 additions & 0 deletions wrangler-transform/src/e2e-test/features/Wrangler/Runtime.feature
@@ -0,0 +1,43 @@
# Copyright © 2023 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.

@Wrangler
Feature: Wrangler - Run time scenarios

@BQ_SOURCE_GRPBY_TEST @BQ_SOURCE_TEST @BQ_SINK_TEST
Scenario: To verify User is able to run a pipeline using wrangler and groupBy directive
Given Open Datafusion Project to configure pipeline
Then Click on the Plus Green Button to import the pipelines
Then Select the file for importing the pipeline for the plugin "Directive_GroupBy"
Then Navigate to the properties page of plugin: "BigQueryTable"
Then Replace input plugin property: "project" with value: "projectId"
Then Replace input plugin property: "dataset" with value: "dataset"
Then Replace input plugin property: "table" with value: "bqSourceTable"
Then Click on the Get Schema button
Then Validate "BigQueryTable" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery2"
Then Replace input plugin property: "project" with value: "projectId"
Then Replace input plugin property: "table" with value: "bqTargetTable"
Then Replace input plugin property: "dataset" with value: "dataset"
Then Validate "BigQuery2" plugin properties
Then Close the Plugin Properties page
Then Rename the pipeline
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate The Data From BQ To BQ With Actual And Expected File for: "ExpectedDirective_GroupBy"
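For orientation, the e2e framework typically executes a tagged feature like the one above through a JUnit Cucumber runner. A minimal sketch, assuming Cucumber-JVM 5+; the class name, feature path, and glue package are assumptions, not part of this commit:

import io.cucumber.junit.Cucumber;
import io.cucumber.junit.CucumberOptions;
import org.junit.runner.RunWith;

// Hypothetical runner sketch: executes only scenarios tagged
// @BQ_SOURCE_GRPBY_TEST, i.e. the groupBy scenario added above.
@RunWith(Cucumber.class)
@CucumberOptions(
  features = {"src/e2e-test/features"},           // assumed feature root
  glue = {"io.cdap.plugin.wrangler.stepsdesign"}, // assumed glue package
  tags = "@BQ_SOURCE_GRPBY_TEST"
)
public class GroupByRunnerSketch {
}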
@@ -40,9 +40,10 @@ public class TestSetupHooks {
@Before(order = 1, value = "@BQ_SOURCE_CSV_TEST")
public static void createTempSourceBQTable() throws IOException, InterruptedException {
createSourceBQTableWithQueries(PluginPropertyUtils.pluginProp("CreateBQTableQueryFileCsv"),
PluginPropertyUtils.pluginProp("InsertBQDataQueryFileCsv"));
PluginPropertyUtils.pluginProp("InsertBQDataQueryFileCsv"));
}
@Before(order = 1, value = "@BQ_SINK_TEST")

@Before(order = 2, value = "@BQ_SINK_TEST")
public static void setTempTargetBQTableName() {
String bqTargetTableName = "E2E_TARGET_" + UUID.randomUUID().toString().replaceAll("-", "_");
PluginPropertyUtils.addPluginProp("bqTargetTable", bqTargetTableName);
@@ -54,7 +55,8 @@ public static void deleteTempTargetBQTable() throws IOException, InterruptedException
String bqTargetTableName = PluginPropertyUtils.pluginProp("bqTargetTable");
try {
BigQueryClient.dropBqQuery(bqTargetTableName);
BeforeActions.scenario.write("BQ Target table - " + bqTargetTableName + " deleted successfully");
BeforeActions.scenario.write(
"BQ Target table - " + bqTargetTableName + " deleted successfully");
PluginPropertyUtils.removePluginProp("bqTargetTable");
} catch (BigQueryException e) {
if (e.getMessage().contains("Not found: Table")) {
@@ -71,61 +73,77 @@ public static void deleteTempTargetBQTable() throws IOException, InterruptedException
@Before(order = 1, value = "@BQ_SOURCE_FXDLEN_TEST")
public static void createTempSourceBQTableFxdLen() throws IOException, InterruptedException {
createSourceBQTableWithQueries(PluginPropertyUtils.pluginProp("CreateBQDataQueryFileFxdLen"),
PluginPropertyUtils.pluginProp("InsertBQDataQueryFileFxdLen"));
PluginPropertyUtils.pluginProp("InsertBQDataQueryFileFxdLen"));
}

@Before(order = 1, value = "@BQ_SOURCE_HL7_TEST")
public static void createTempSourceBQTableHl7() throws IOException, InterruptedException {
createSourceBQTableWithQueries(PluginPropertyUtils.pluginProp("CreateBQDataQueryFileHl7"),
PluginPropertyUtils.pluginProp("InsertBQDataQueryFileHl7"));
PluginPropertyUtils.pluginProp("InsertBQDataQueryFileHl7"));
}

@Before(order = 1, value = "@BQ_SOURCE_TS_TEST")
public static void createTempSourceBQTableTimestamp() throws IOException, InterruptedException {
createSourceBQTableWithQueries(PluginPropertyUtils.pluginProp("CreateBQDataQueryFileTimestamp"),
PluginPropertyUtils.pluginProp("InsertBQDataQueryFileTimestamp"));
PluginPropertyUtils.pluginProp("InsertBQDataQueryFileTimestamp"));
}

@Before(order = 1, value = "@BQ_SOURCE_DATETIME_TEST")
public static void createTempSourceBQTableDateTime() throws IOException, InterruptedException {
createSourceBQTableWithQueries(PluginPropertyUtils.pluginProp("CreateBQDataQueryFileDatetime"),
PluginPropertyUtils.pluginProp("InsertBQDataQueryFileDatetime"));
PluginPropertyUtils.pluginProp("InsertBQDataQueryFileDatetime"));
}

@After(order = 1, value = "@BQ_SOURCE_TEST")
@After(order = 2, value = "@BQ_SOURCE_TEST")
public static void deleteTempSourceBQTable() throws IOException, InterruptedException {
String bqSourceTable = PluginPropertyUtils.pluginProp("bqSourceTable");
BigQueryClient.dropBqQuery(bqSourceTable);
BeforeActions.scenario.write("BQ source Table " + bqSourceTable + " deleted successfully");
PluginPropertyUtils.removePluginProp("bqSourceTable");

}

@Before(order = 1, value = "@BQ_SOURCE_GRPBY_TEST")
public static void createTempSourceBQTableGroupBy() throws IOException, InterruptedException {
createSourceBQTableWithQueries(PluginPropertyUtils.pluginProp("CreateBQTableQueryFile"),
PluginPropertyUtils.pluginProp("InsertBQDataQueryFile"));
}

-private static void createSourceBQTableWithQueries(String bqCreateTableQueryFile, String bqInsertDataQueryFile)
-throws IOException, InterruptedException {
-String bqSourceTable = "E2E_SOURCE_" + UUID.randomUUID().toString().substring(0, 5).replaceAll("-",
-"_");
+private static void createSourceBQTableWithQueries(String bqCreateTableQueryFile,
+String bqInsertDataQueryFile)
+throws IOException, InterruptedException {
+String bqSourceTable =
+"E2E_SOURCE_" + UUID.randomUUID().toString().substring(0, 5).replaceAll("-",
+"_");

String createTableQuery = StringUtils.EMPTY;
try {
createTableQuery = new String(Files.readAllBytes(Paths.get(TestSetupHooks.class.getResource
("/" + bqCreateTableQueryFile).toURI()))
, StandardCharsets.UTF_8);
createTableQuery = createTableQuery.replace("DATASET", PluginPropertyUtils.pluginProp("dataset"))
.replace("TABLE_NAME", bqSourceTable);
("/" + bqCreateTableQueryFile).toURI()))
, StandardCharsets.UTF_8);
createTableQuery = createTableQuery.replace("DATASET",
PluginPropertyUtils.pluginProp("dataset"))
.replace("TABLE_NAME", bqSourceTable);
} catch (Exception e) {
BeforeActions.scenario.write("Exception in reading " + bqCreateTableQueryFile + " - " + e.getMessage());
BeforeActions.scenario.write(
"Exception in reading " + bqCreateTableQueryFile + " - " + e.getMessage());
Assert.fail("Exception in BigQuery testdata prerequisite setup " +
"- error in reading create table query file " + e.getMessage());
"- error in reading create table query file " + e.getMessage());
}

String insertDataQuery = StringUtils.EMPTY;
try {
insertDataQuery = new String(Files.readAllBytes(Paths.get(TestSetupHooks.class.getResource
("/" + bqInsertDataQueryFile).toURI()))
, StandardCharsets.UTF_8);
insertDataQuery = insertDataQuery.replace("DATASET", PluginPropertyUtils.pluginProp("dataset"))
.replace("TABLE_NAME", bqSourceTable);
("/" + bqInsertDataQueryFile).toURI()))
, StandardCharsets.UTF_8);
insertDataQuery = insertDataQuery.replace("DATASET",
PluginPropertyUtils.pluginProp("dataset"))
.replace("TABLE_NAME", bqSourceTable);
} catch (Exception e) {
BeforeActions.scenario.write("Exception in reading " + bqInsertDataQueryFile + " - " + e.getMessage());
BeforeActions.scenario.write(
"Exception in reading " + bqInsertDataQueryFile + " - " + e.getMessage());
Assert.fail("Exception in BigQuery testdata prerequisite setup " +
"- error in reading insert data query file " + e.getMessage());
"- error in reading insert data query file " + e.getMessage());
}
BigQueryClient.getSoleQueryResult(createTableQuery);
try {
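The order changes above lean on how Cucumber-JVM sequences tagged hooks: @Before hooks run in ascending order, @After hooks in descending order, so the source table is created (order 1) before the sink setup (order 2), and dropped (order 2) before lower-order @After hooks run. A minimal sketch of that semantics; the class and method names are illustrative, not from this commit:

import io.cucumber.java.After;
import io.cucumber.java.Before;

public class HookOrderSketch {

  // Runs first: lower order runs earlier for @Before hooks.
  @Before(order = 1, value = "@BQ_SOURCE_GRPBY_TEST")
  public static void createSourceTable() {
    // e.g. create the BQ source table before anything else
  }

  // Runs second, so it can assume the source table already exists.
  @Before(order = 2, value = "@BQ_SINK_TEST")
  public static void setTargetTableName() {
  }

  // For @After hooks the order is reversed: order 2 runs before order 1.
  @After(order = 2, value = "@BQ_SOURCE_TEST")
  public static void dropSourceTable() {
  }
}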
@@ -1,3 +1,3 @@
{"create_date":"2023","id":1,"timecolumn":"2006-03-18"}
{"create_date":"2023","id":2,"timecolumn":"2007-03-18"}
{"create_date":"2023","id":3,"timecolumn":"2008-04-19"}
{"create_date":"2024","id":1,"timecolumn":"2006-03-18"}
{"create_date":"2024","id":2,"timecolumn":"2007-03-18"}
{"create_date":"2024","id":3,"timecolumn":"2008-04-19"}
@@ -0,0 +1,5 @@
{"city":"San Jose","cityFirst":"San Jose","firstname":"DOUGLAS","id":"1","lastname":"Williams","state":"CA","zipcode":923564293}
{"city":"Houston","cityFirst":"Houston","firstname":"DAVID","id":"2","lastname":"Johnson","state":"TX","zipcode":1738378970}
{"city":"Manhattan","cityFirst":"Manhattan","firstname":"HUGH","id":"3","lastname":"Jackman","state":"NY","zipcode":-1863622247}
{"city":"San Diego","cityFirst":"San Diego","firstname":"FRANK","id":"5","lastname":"Underwood","state":"CA","zipcode":-1317090526}
{"city":"New York","cityFirst":"New York","firstname":"SARTHAK","id":"7","lastname":"Dash","state":"NY","zipcode":-1949601773}
@@ -0,0 +1,2 @@
create table `DATASET.TABLE_NAME` (id STRING, firstname STRING, lastname STRING, streetAddress STRING,
city STRING, state STRING, zipcode BIGINT, phoneNumber BIGINT)
@@ -1 +1 @@
-create table `DATASET.TABLE_NAME` (id STRING, create_date STRING, timestamp STRING)
+create table `DATASET.TABLE_NAME` (id BIGINT, create_date STRING, timestamp STRING)
@@ -1 +1 @@
-create table `DATASET.TABLE_NAME` (url STRING, fixedlength STRING)
+create table `DATASET.TABLE_NAME` (Url STRING, fixedlength STRING)
@@ -0,0 +1,10 @@
INSERT INTO DATASET.TABLE_NAME (id, firstname, lastname, streetAddress, city, state, zipcode, phoneNumber)
VALUES
('5', 'Frank', 'Underwood', '1609 Far St.', 'San Diego', 'CA', 2977876770, 19061512345),
('1', 'Douglas', 'Williams', '1 Vista Montana', 'San Jose', 'CA', 9513498885, 35834612345),
('4', 'Walter', 'White', '3828 Piermont Dr', 'Orlando', 'FL', 7349864532, 7829812345),
('3', 'Hugh', 'Jackman', '5, Cool Way', 'Manhattan', 'NY', 6726312345, 1695412345),
('7', 'Sarthak', 'Dash', '123 Far St.', 'New York', 'NY', 2345365523, 1324812345),
('6', 'Serena', 'Woods', '123 Far St.', 'Las Vegas', 'NV', 4533456734, 78919612345),
('2', 'David', 'Johnson', '3 Baypointe Parkway', 'Houston', 'TX', 1738378970, 1451412345),
('8', 'Rahul', 'Dash', '22 MG Road.', 'Bangalore', 'KA',NULL, 94864612345);
@@ -1,5 +1,5 @@
INSERT INTO DATASET.TABLE_NAME (id,create_date,timestamp)
VALUES
-('1','2021-01-21','2006-02-18T05:03:42Z[UTC]'),
-('2','2022-02-22','2007-01-18T04:03:22Z[UTC]'),
-('3','2023-03-23','2008-07-19T08:04:22Z[UTC]');
+(1,'2021-01-21','2006-02-18T05:03:42Z[UTC]'),
+(2,'2022-02-22','2007-01-18T04:03:22Z[UTC]'),
+(3,'2023-03-23','2008-07-19T08:04:22Z[UTC]');
@@ -1,4 +1,4 @@
-INSERT INTO DATASET.TABLE_NAME (url,fixedlength)
+INSERT INTO DATASET.TABLE_NAME (Url,fixedlength)
VALUES
('http://example.com:80/docs/books/tutorial/index.html?name=networking#DOWNLOADING','21 10 ABCXYZ'),
('http://geeks.com:80/docs/chair/tutorial/index.html?name=networking#DOWNLOADING','19 13 ABCXYZ'),
@@ -0,0 +1 @@
validationSuccessMessage=No errors found.
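Step definitions can read this new property through the same utility the hooks above use. A tiny hypothetical sketch; only pluginProp() comes from this codebase, and the import path and class name are assumptions:

import io.cdap.e2e.utils.PluginPropertyUtils; // assumed package

public class ValidationMessageSketch {
  // Returns "No errors found." as configured above; how the value is
  // asserted against the captured UI message is up to the step definition.
  public static String expectedValidationMessage() {
    return PluginPropertyUtils.pluginProp("validationSuccessMessage");
  }
}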