Wrangler plugin e2e tests. #650

Closed
105 changes: 105 additions & 0 deletions .github/workflows/e2e.yml
@@ -0,0 +1,105 @@
# Copyright © 2023 Cask Data, Inc.
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
# http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.

# This workflow will build a Java project with Maven
# For more information see: https://help.github.com/actions/language-and-framework-guides/building-and-testing-java-with-maven
# Note: Any changes to this workflow take effect only after they are merged into develop
name: Build e2e tests

on:
  push:
    branches: [ develop ]
  pull_request:
    branches: [ develop ]
    types: [ opened, synchronize, reopened, labeled ]
  workflow_dispatch:

jobs:
  build:
    runs-on: k8s-runner-e2e
    # We allow builds:
    #   1) When triggered manually
    #   2) When it's a push (merge) into develop
    #   3) For PRs that are labeled as build and either
    #      - it's a code change, or
    #      - the build label was just added
    # A bit complex, but prevents builds when other labels are manipulated
    if: >
      github.event_name == 'workflow_dispatch'
      || github.event_name == 'push'
      || (contains(github.event.pull_request.labels.*.name, 'build')
          && (github.event.action != 'labeled' || github.event.label.name == 'build')
      )
    strategy:
      matrix:
        module: [wrangler-transform]
      fail-fast: false

    steps:
      - uses: actions/checkout@v3
        with:
          path: plugin
          submodules: 'recursive'
          ref: ${{ github.event.workflow_run.head_sha }}

      # Pinned 1.0.0 version
      - uses: dorny/paths-filter@b2feaf19c27470162a626bd6fa8438ae5b263721
        if: github.event_name != 'workflow_dispatch' && github.event_name != 'push'
        id: filter
        with:
          working-directory: plugin
          filters: |
            e2e-test:
              - '${{ matrix.module }}/**/e2e-test/**'

      - name: Checkout e2e test repo
        uses: actions/checkout@v3
        with:
          repository: cdapio/cdap-e2e-tests
          path: e2e

      - name: Cache
        uses: actions/cache@v3
        with:
          path: ~/.m2/repository
          key: ${{ runner.os }}-maven-${{ github.workflow }}-${{ hashFiles('**/pom.xml') }}
          restore-keys: |
            ${{ runner.os }}-maven-${{ github.workflow }}

      - name: Run required e2e tests
        if: github.event_name != 'workflow_dispatch' && github.event_name != 'push' && steps.filter.outputs.e2e-test == 'false'
        run: python3 e2e/src/main/scripts/run_e2e_test.py --module ${{ matrix.module }} --testRunner TestRunnerRequired.java

      - name: Run all e2e tests
        if: github.event_name == 'workflow_dispatch' || github.event_name == 'push' || steps.filter.outputs.e2e-test == 'true'
        run: python3 e2e/src/main/scripts/run_e2e_test.py --module ${{ matrix.module }}

      - name: Upload report
        uses: actions/upload-artifact@v3
        if: always()
        with:
          name: Cucumber report - ${{ matrix.module }}
          path: ./**/target/cucumber-reports

      - name: Upload debug files
        uses: actions/upload-artifact@v3
        if: always()
        with:
          name: Debug files - ${{ matrix.module }}
          path: ./**/target/e2e-debug

      - name: Upload files to GCS
        uses: google-github-actions/upload-cloud-storage@v0
        if: always()
        with:
          path: ./plugin
          destination: e2e-tests-cucumber-reports/${{ github.event.repository.name }}/${{ github.ref }}
          glob: '**/target/cucumber-reports/**'
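
Taken together, the on: block and the if: expression give three ways to make this workflow run. For a quick check from a terminal, something like the following should work — a sketch, assuming the GitHub CLI (gh) is installed and authenticated against this repo; the PR number shown is just this PR's own:

    # Manual trigger (the workflow_dispatch branch of the condition)
    gh workflow run e2e.yml --ref develop

    # Force an e2e run on a pull request by adding the 'build' label,
    # which satisfies the labeled-PR branch of the 'if' expression
    gh pr edit 650 --add-label build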
138 changes: 134 additions & 4 deletions pom.xml
@@ -100,7 +100,7 @@
     <google.storage.version>1.106.0</google.storage.version>
     <gson.version>2.6.2</gson.version>
     <guava.retrying.version>2.0.0</guava.retrying.version>
-    <guava.version>20.0</guava.version>
+    <guava.version>31.0.1-jre</guava.version>
     <hadoop.version>2.4.0</hadoop.version>
     <hl7.version>2.2</hl7.version>
     <hsql.version>2.2.4</hsql.version>
@@ -122,6 +122,7 @@
     <simplemagic.version>1.11</simplemagic.version>
     <slf4j.version>1.7.15</slf4j.version>
     <unix4j.version>0.4</unix4j.version>
+    <testSourceLocation>${project.basedir}/src/test/java/</testSourceLocation>
   </properties>

   <repositories>
@@ -172,6 +173,7 @@
   </dependencies>

   <build>
+    <testSourceDirectory>${testSourceLocation}</testSourceDirectory>
     <pluginManagement>
       <plugins>
         <plugin>
@@ -186,7 +188,7 @@
         <plugin>
           <groupId>org.apache.felix</groupId>
           <artifactId>maven-bundle-plugin</artifactId>
-          <version>3.3.0</version>
+          <version>3.5.0</version>
           <extensions>true</extensions>
           <configuration>
             <instructions>
@@ -397,7 +399,6 @@
             <releaseProfiles>releases</releaseProfiles>
           </configuration>
         </plugin>
-
         <plugin>
           <groupId>org.sonatype.plugins</groupId>
           <artifactId>nexus-staging-maven-plugin</artifactId>
@@ -429,6 +430,135 @@
        </plugins>
      </build>
    </profile>
    <profile>
      <id>e2e-tests</id>
      <properties>
        <testSourceLocation>src/e2e-test/java</testSourceLocation>
        <TEST_RUNNER>TestRunner.java</TEST_RUNNER>
      </properties>
      <build>
        <testResources>
          <testResource>
            <directory>src/e2e-test/resources</directory>
          </testResource>
        </testResources>
        <plugins>
          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-surefire-plugin</artifactId>
            <version>2.18.1</version>
            <configuration>
              <skipTests>true</skipTests>
            </configuration>
          </plugin>

          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-failsafe-plugin</artifactId>
            <version>3.0.0</version>
            <dependencies>
              <dependency>
                <groupId>org.apache.maven.surefire</groupId>
                <artifactId>surefire-junit47</artifactId>
                <version>3.0.0</version>
              </dependency>
            </dependencies>
            <configuration>
              <includes>
                <include>${TEST_RUNNER}</include>
              </includes>
              <!-- Start configuration to run TestRunners in parallel -->
              <parallel>classes</parallel> <!-- Run TestRunner classes in parallel -->
              <threadCount>2</threadCount> <!-- Number of classes to run in parallel -->
              <forkCount>2</forkCount> <!-- Number of JVM processes -->
              <reuseForks>true</reuseForks>
              <!-- End configuration to run TestRunners in parallel -->
              <environmentVariables>
                <GOOGLE_APPLICATION_CREDENTIALS>
                  ${GOOGLE_APPLICATION_CREDENTIALS}
                </GOOGLE_APPLICATION_CREDENTIALS>
                <SERVICE_ACCOUNT_TYPE>
                  ${SERVICE_ACCOUNT_TYPE}
                </SERVICE_ACCOUNT_TYPE>
                <SERVICE_ACCOUNT_FILE_PATH>
                  ${SERVICE_ACCOUNT_FILE_PATH}
                </SERVICE_ACCOUNT_FILE_PATH>
                <SERVICE_ACCOUNT_JSON>
                  ${SERVICE_ACCOUNT_JSON}
                </SERVICE_ACCOUNT_JSON>
              </environmentVariables>
            </configuration>
            <executions>
              <execution>
                <goals>
                  <goal>integration-test</goal>
                  <goal>verify</goal>
                </goals>
              </execution>
            </executions>
          </plugin>

          <plugin>
            <groupId>net.masterthought</groupId>
            <artifactId>maven-cucumber-reporting</artifactId>
            <version>5.5.0</version>

            <executions>
              <execution>
                <id>execution</id>
                <phase>verify</phase>
                <goals>
                  <goal>generate</goal>
                </goals>
                <configuration>
                  <projectName>Cucumber Reports</projectName>
                  <outputDirectory>target/cucumber-reports/advanced-reports</outputDirectory>
                  <buildNumber>1</buildNumber>
                  <skip>false</skip>
                  <inputDirectory>${project.build.directory}/cucumber-reports</inputDirectory>
                  <jsonFiles> <!-- supports wildcard or name pattern -->
                    <param>**/*.json</param>
                  </jsonFiles>
                  <classificationDirectory>${project.build.directory}/cucumber-reports</classificationDirectory>
                  <checkBuildResult>true</checkBuildResult>
                </configuration>
              </execution>
            </executions>
          </plugin>
        </plugins>
      </build>
      <dependencyManagement>
        <dependencies>
          <dependency>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
            <version>${guava.version}</version>
          </dependency>
        </dependencies>
      </dependencyManagement>

      <dependencies>
        <dependency>
          <groupId>org.slf4j</groupId>
          <artifactId>slf4j-api</artifactId>
          <version>1.7.15</version>
        </dependency>

        <dependency>
          <groupId>io.cdap.tests.e2e</groupId>
          <artifactId>cdap-e2e-framework</artifactId>
          <version>0.3.0-SNAPSHOT</version>
          <scope>test</scope>
        </dependency>

        <dependency>
          <groupId>ch.qos.logback</groupId>
          <artifactId>logback-classic</artifactId>
          <version>1.2.8</version>
          <scope>runtime</scope>
        </dependency>
      </dependencies>

    </profile>
  </profiles>
</project>
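
Because the profile wires the runner class through the TEST_RUNNER property and skips surefire in favor of failsafe, the suite can also be invoked locally without the CI wrapper. A minimal sketch, assuming a reachable CDAP instance and valid GCP credentials (the exact environment cdap-e2e-framework expects is defined in that repo, not here):

    # Point the tests at a service account; the profile forwards this
    # variable into the forked test JVMs via <environmentVariables>
    export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account.json

    # Run the required-tests runner for the wrangler-transform module;
    # dropping -DTEST_RUNNER falls back to the default TestRunner.java
    mvn clean verify -P e2e-tests -pl wrangler-transform \
        -DTEST_RUNNER=TestRunnerRequired.java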

1 change: 0 additions & 1 deletion wrangler-transform/pom.xml
@@ -215,5 +215,4 @@
     </build>
   </profile>
 </profiles>
-
 </project>
91 changes: 91 additions & 0 deletions wrangler-transform/src/e2e-test/features/Wrangler/RunTime.feature
@@ -0,0 +1,91 @@
# Copyright © 2023 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.

@Wrangler
Feature: Wrangler - Run time scenarios

  @BQ_SOURCE_TEST @BQ_SINK_TEST
  Scenario: To verify User is able to run a pipeline using the copy, drop, count and set-column directives in the Wrangler plugin
    Given Open Datafusion Project to configure pipeline
    Then Click on the Plus Green Button to import the pipelines
    Then Select the file for importing the pipeline for the plugin "Directive_copy_drop_count_setcolmn"
    Then Navigate to the properties page of plugin: "BigQueryTable"
    Then Replace input plugin property: "dataset" with value: "dataset"
    Then Replace input plugin property: "table" with value: "bqSourceTable"
    Then Click on the Get Schema button
    Then Click on the Validate button
    Then Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "BigQuery2"
    Then Replace input plugin property: "table" with value: "bqTargetTable"
    Then Replace input plugin property: "dataset" with value: "dataset"
    Then Click on the Validate button
    Then Close the Plugin Properties page
    Then Rename the pipeline
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Verify the pipeline status is "Succeeded"
    Then Close the pipeline logs
    Then Validate The Data From BQ To BQ With Actual And Expected File for: "ExpectedDirective_copy_drop_count_setcolmn"

  @BQ_SOURCE_TEST @BQ_SINK_TEST
  Scenario: To verify User is able to run a pipeline using the fill-null and send-to-error directives in the Wrangler plugin
    Given Open Datafusion Project to configure pipeline
    Then Click on the Plus Green Button to import the pipelines

Contributor:
Unable to run the tests locally as it says this step (and a few others) are missing. Am I missing some config?

Contributor Author:
A PR has been raised for the steps in cdap-e2e-framework that are required for the wrangler test scenarios (cdapio/cdap-e2e-tests#193).

    Then Select the file for importing the pipeline for the plugin "Directive_Fillempty_sendtoerror"
    Then Navigate to the properties page of plugin: "BigQueryTable"
    Then Replace input plugin property: "dataset" with value: "dataset"
    Then Replace input plugin property: "table" with value: "bqSourceTable"
    Then Click on the Get Schema button
    Then Click on the Validate button
    Then Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "BigQuery2"
    Then Replace input plugin property: "table" with value: "bqTargetTable"
    Then Replace input plugin property: "dataset" with value: "dataset"
    Then Click on the Validate button
    Then Close the Plugin Properties page
    Then Rename the pipeline
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Verify the pipeline status is "Succeeded"
    Then Close the pipeline logs
    Then Validate The Data From BQ To BQ With Actual And Expected File for: "ExpectedDirective_Fillempty_sendtoerror"

  @BQ_SOURCE_TEST @BQ_SINK_TEST
  Scenario: To verify User is able to run a pipeline using the format, concatenate, title-case and copy-column directives in the Wrangler plugin
    Given Open Datafusion Project to configure pipeline
    Then Click on the Plus Green Button to import the pipelines
    Then Select the file for importing the pipeline for the plugin "Directive_Concatenate_titlecase"
    Then Navigate to the properties page of plugin: "BigQueryTable"
    Then Replace input plugin property: "dataset" with value: "dataset"
    Then Replace input plugin property: "table" with value: "bqSourceTable"
    Then Click on the Get Schema button
    Then Click on the Validate button
    Then Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "BigQuery2"
    Then Replace input plugin property: "table" with value: "bqTargetTable"
    Then Replace input plugin property: "dataset" with value: "dataset"
    Then Click on the Validate button
    Then Close the Plugin Properties page
    Then Rename the pipeline
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Verify the pipeline status is "Succeeded"
    Then Close the pipeline logs
    Then Validate The Data From BQ To BQ With Actual And Expected File for: "ExpectedDirective_Concatenate_titlecase"
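
All three scenarios carry the @Wrangler tag along with the BigQuery setup tags, so a subset can be selected at run time without editing the runner class — a sketch, assuming the framework's Cucumber version honors the standard cucumber.filter.tags property:

    # Run only scenarios tagged @Wrangler from this feature file
    mvn verify -P e2e-tests -pl wrangler-transform \
        -Dcucumber.filter.tags="@Wrangler"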