diff --git a/README.md b/README.md
index 0e9fba7..c475f8e 100644
--- a/README.md
+++ b/README.md
@@ -1,13 +1,79 @@
 # Data Cloud integration built in MuleSoft
 
-Temporary (?) place for the Data Cloud integration built using MuleSoft.
+Temporary (or not?) place for the Data Cloud integration built using MuleSoft.
 
-## Run locally
+There are two main folders/projects inside this repo:
+1. [Data Cloud Integration API/](/Data%20Cloud%20Integration%20API/) ~ Contains the files I used to publish the **API Specification** that is publicly available in my Exchange Portal [here](https://anypoint.mulesoft.com/exchange/portals/mulesoft-36559/b903eebf-16e9-46c5-8992-bffd66c2306c/data-cloud-integration-api/).
+2. [data-cloud-integration-impl/](/data-cloud-integration-impl/) ~ Contains the Mule project I created for you to call Data Cloud. This is the code used to generate the JAR file that you will use to deploy the Mule app in Anypoint Platform. You can download the latest JAR from the `releases` section of this repo or by clicking [here](https://github.com/alexandramartinez/datacloud-mulesoft-integration/releases/download/1.0.0/data-cloud-integration-impl-1.0.0-mule-application.jar).
 
-TBD
+## Data Cloud configuration
+
+tbd
+
+## Deploy your own Mule app
+
+Before starting, make sure you have all the Data Cloud settings ready.
+
+- Download the Mule app's JAR file [here](https://github.com/alexandramartinez/datacloud-mulesoft-integration/releases/download/1.0.0/data-cloud-integration-impl-1.0.0-mule-application.jar).
+- Take note of the following credentials. You will need them for the Mule app.
+  - `salesforce.username` - The username you use to log in to Salesforce at login.salesforce.com
+  - `salesforce.password` - The password you use to log in to Salesforce at login.salesforce.com
+  - `cdp.consumer.key` - The Consumer Key for the Connected App you created in Salesforce
+  - `cdp.consumer.secret` - The Consumer Secret for the Connected App you created in Salesforce
+- Create a free trial account in [Anypoint Platform](https://anypoint.mulesoft.com). It will expire in 30 days, but you can then create a new one using the same registration details; just make sure to choose a different username.
+- In Anypoint Platform, navigate to [Runtime Manager](https://anypoint.mulesoft.com/cloudhub).
+- Click on **Deploy Application**.
+- Add any **application name**. For example, `data-cloud-integration`.
+  - We will be deploying to CloudHub 2.0, but if you're deploying to CloudHub 1.0, the application name has to be unique across all applications worldwide. If the application name is not available, you can add your username at the end to make it unique. For example, `data-cloud-integration-amartinez`.
+- Make sure the **deployment target** is set to CloudHub 2.0. It should be selected by default.
+- Select **Choose file > Upload File** under **Application File**.
+- Select the JAR file you downloaded at the beginning, or get it from this repo's [releases](https://github.com/alexandramartinez/datacloud-mulesoft-integration/releases).
+- Check that the following values are correctly set under **Runtime**:
+
+  | field | value |
+  | - | - |
+  | Use Edge Release Channel | ◻️ (unchecked) |
+  | Runtime Version | 4.4.0 |
+  | Replica Count | 1 |
+  | Replica size | 0.1 vCores |
+  | Deployment model | Rolling update |
+  | Run in Runtime Cluster Mode | ◻️ (unchecked) |
+  | Use Object Store V2 | ◻️ (unchecked) |
+
+- Leave everything unchecked under the **Ingress** tab.
+- Add your credentials under **Properties**.
+  - You can add them one by one using the **Table view**
+  - Or upload the following text using the **Text view**, replacing the placeholder values with your own
+
+    ```properties
+    salesforce.username=your_username
+    salesforce.password=your_password
+    cdp.consumer.key=your_key
+    cdp.consumer.secret=your_secret
+    ```
+
+- Click on **Protect** next to each value in the **Table view** to hide your properties (only available in CloudHub 2.0).
+- Click on **Deploy Application**. Wait a few minutes until your application has been deployed 🟢 (green circle).
+  - If you see a 🔴 red circle right after clicking **Deploy**, wait a bit longer. It doesn't mean the app has failed; it's still deploying.
+- Once the app has been deployed, click on the **Dashboard** tab on the left side of the screen.
+- Copy the URL. This is your host URL. It should be something like `https://data-cloud-integration-abcdef123.a1b2c3-1.usa-e2.cloudhub.io/`
+
+> [!CAUTION]
+> **DO NOT** share this URL publicly. Anyone with this URL will be able to access your Data Cloud API.
+>
+> If you wish to create a new URL, you will have to delete this app from the **Settings** tab (where it says **Stop**) and create a new one following the same steps.
+>
+> You can also **Stop** and **Start** your app only when you are using it, to avoid unwanted requests.
+
+## Call your integration
+
+tbd
 
 ## Create your own - step by step
 
+There is no need to go through these steps if you only want to use the integration. Follow them only if you want to learn how to create this whole integration step by step, from designing the API specification to implementing the Mule project.
+
 ### 1. Create the API Specification and publish it to Exchange
 
 > [!NOTE]
diff --git a/data-cloud-integration-impl/pom.xml b/data-cloud-integration-impl/pom.xml
index 971d996..71e73c4 100644
--- a/data-cloud-integration-impl/pom.xml
+++ b/data-cloud-integration-impl/pom.xml
@@ -60,16 +60,6 @@
 			<artifactId>log4j-core</artifactId>
 			<version>2.17.1</version>
 		</dependency>
-		<dependency>
-			<groupId>b903eebf-16e9-46c5-8992-bffd66c2306c</groupId>
-			<artifactId>data-cloud-integration-api</artifactId>
-			<version>1.0.0</version>
-			<classifier>raml</classifier>
-			<type>zip</type>
-		</dependency>
 		<dependency>
 			<groupId>org.mule.modules</groupId>
 			<artifactId>mule-apikit-module</artifactId>
diff --git a/data-cloud-integration-impl/src/main/mule/global.xml b/data-cloud-integration-impl/src/main/mule/global.xml
index b0c2516..e887fba 100644
--- a/data-cloud-integration-impl/src/main/mule/global.xml
+++ b/data-cloud-integration-impl/src/main/mule/global.xml
@@ -22,5 +22,5 @@
-
+
\ No newline at end of file
diff --git a/data-cloud-integration-impl/src/main/resources/api/data-cloud-integration-api.raml b/data-cloud-integration-impl/src/main/resources/api/data-cloud-integration-api.raml
new file mode 100644
index 0000000..fc3a7b0
--- /dev/null
+++ b/data-cloud-integration-impl/src/main/resources/api/data-cloud-integration-api.raml
@@ -0,0 +1,334 @@
+#%RAML 1.0
+title: Data Cloud Integration API
+version: 1.0.0
+
+types:
+  DataCloudSuccessfulResponse:
+    properties:
+      accepted:
+        default: true
+        example: true
+        type: boolean
+
+traits:
+  dataCloudApiParams:
+    queryParameters:
+      sourceApiName:
+        displayName: Ingestion API's name
+        description: |
+          You can find this value by taking a look at your Ingestion API settings (where you uploaded the YAML schema) in Salesforce.
+        example: MuleSoft_Ingestion_API
+        type: string
+      objectName:
+        displayName: Ingestion API's object name
+        description: |
+          You can find this value by taking a look at your Ingestion API settings. This should be one of the objects you uploaded from the YAML schema in Salesforce. You should also be able to find this from your Data Stream settings.
+        example: runner_profiles
+        type: string
+
+/schema:
+  post:
+    displayName: Get YAML Schema for Ingestion API
+    queryParameters:
+      openapiversion?:
+        default: 3.0.3
+        example: 3.0.3
+        type: string
+    body:
+      application/json:
+        example:
+          strict: true
+          value:
+            customer:
+              id: 1
+              first_name: Alex
+              last_name: Martinez
+              email: alex@sf.com
+              address:
+                street: 415 Mission Street
+                city: San Francisco
+                state: CA
+                postalCode: "94105"
+                geo:
+                  lat: 37.78916
+                  lng: -122.39521
+        type: object
+    responses:
+      "200":
+        body:
+          application/raml+yaml:
+            example:
+              strict: true
+              value:
+                openapi: 3.0.3
+                components:
+                  schemas:
+                    customer:
+                      type: object
+                      properties:
+                        id:
+                          type: number
+                        first_name:
+                          type: string
+                        last_name:
+                          type: string
+                        email:
+                          type: string
+                        street:
+                          type: string
+                        city:
+                          type: string
+                        state:
+                          type: string
+                        postalCode:
+                          type: string
+                        lat:
+                          type: number
+                        lng:
+                          type: number
+            type: object
+    description: |
+      With this endpoint, you can send a JSON object to be transformed into the OpenAPI YAML schema. This is needed to create your Ingestion API and Data Stream in Data Cloud.
+
+      Because the Ingestion API doesn't accept nested objects in the schema, this endpoint will transform your multi-level object into the single-level structure needed for Data Cloud.
+
+      For example, the following input payload:
+
+      ```json
+      {
+        "customer": {
+          "id": 1,
+          "first_name": "Alex",
+          "last_name": "Martinez",
+          "email": "alex@sf.com",
+          "address": {
+            "street": "415 Mission Street",
+            "city": "San Francisco",
+            "state": "CA",
+            "postalCode": "94105",
+            "geo": {
+              "lat": 37.78916,
+              "lng": -122.39521
+            }
+          }
+        }
+      }
+      ```
+
+      would be flattened and transformed into the following output:
+
+      ```json
+      {
+        "customer": {
+          "id": 1,
+          "first_name": "Alex",
+          "last_name": "Martinez",
+          "email": "alex@sf.com",
+          "street": "415 Mission Street",
+          "city": "San Francisco",
+          "state": "CA",
+          "postalCode": "94105",
+          "lat": 37.78916,
+          "lng": -122.39521
+        }
+      }
+      ```
+
+      Then, based on this new input, you will receive a YAML schema like the following:
+
+      > ⚠️ **Important**
+      >
+      > Take note of the object name(s) from this YAML schema because you will use them for insertion and deletion.
+      > For example, in the following YAML schema, the object name is `customer`.
+
+      ```yaml
+      openapi: 3.0.3
+      components:
+        schemas:
+          customer:
+            type: object
+            properties:
+              id:
+                type: number
+              first_name:
+                type: string
+              last_name:
+                type: string
+              email:
+                type: string
+              street:
+                type: string
+              city:
+                type: string
+              state:
+                type: string
+              postalCode:
+                type: string
+              lat:
+                type: number
+              lng:
+                type: number
+      ```
+
+      > Important: Set the `openapiversion` query parameter if you need a specific OpenAPI version. Otherwise, the default `3.0.3` will be used.
+
+/query:
+  post:
+    displayName: Perform a SOQL Query
+    body:
+      text/plain:
+        example: SELECT * FROM MuleSoft_Ingestion_API_runner_p_38447E8E__dll LIMIT 100
+        type: string
+    responses:
+      "200":
+        body:
+          application/json:
+            items:
+              example:
+                strict: true
+                value:
+                  DataSourceObject__c: MuleSoft_Ingestion_API_runner_profiles_38447E8E
+                  DataSource__c: MuleSoft_Ingestion_API_996db928_2078_4e3a_9c67_1c80b32790aa
+                  city__c: Toronto
+                  created__c: 2017-07-21
+                  email__c: alex@sf.com
+                  first_name__c: Alex
+                  gender__c: NB
+                  last_name__c: Martinez
+                  maid__c: 1
+                  state__c: ON
+              type: object
+    description: |
+      Send your SOQL query in the body of the request in `text/plain` format.
+
+      For example:
+
+      ```sql
+      SELECT * FROM MuleSoft_Ingestion_API_runner_p_38447E8E__dll LIMIT 100
+      ```
+
+      You will receive a JSON response.
+
+      For example, from the previous query, you'd receive a JSON Array with the results of the `SELECT`:
+
+      ```json
+      [
+        {
+          "DataSourceObject__c": "MuleSoft_Ingestion_API_runner_profiles_38447E8E",
+          "DataSource__c": "MuleSoft_Ingestion_API_996db928_2078_4e3a_9c67_1c80b32790aa",
+          "city__c": "Toronto",
+          "created__c": "2017-07-21",
+          "email__c": "alex@sf.com",
+          "first_name__c": "Alex",
+          "gender__c": "NB",
+          "last_name__c": "Martinez",
+          "maid__c": 1.000000000000000000,
+          "state__c": "ON"
+        }
+      ]
+      ```
+
+/insert:
+  post:
+    displayName: Insert new records through streaming
+    is:
+      - dataCloudApiParams
+    body:
+      application/json:
+        items:
+          example:
+            strict: true
+            value:
+              maid: 1
+              first_name: Alex
+              last_name: Martinez
+              email: alex@sf.com
+              gender: NB
+              city: Toronto
+              state: ON
+              created: 2017-07-21
+          type: object
+    responses:
+      "200":
+        body:
+          application/json:
+            type: DataCloudSuccessfulResponse
+    description: |
+      Make sure you add the following query parameters to let Data Cloud know where you want to insert the new records:
+
+      - `sourceApiName` i.e., `MuleSoft_Ingestion_API`
+      - `objectName` i.e., `runner_profiles`
+
+      Next, in the body of the request, make sure to use a JSON Array. Each Object inside this Array is a new record.
+
+      For example:
+
+      ```json
+      [
+        {
+          "maid": 1,
+          "first_name": "Alex",
+          "last_name": "Martinez",
+          "email": "alex@sf.com",
+          "gender": "NB",
+          "city": "Toronto",
+          "state": "ON",
+          "created": "2017-07-21"
+        }
+      ]
+      ```
+
+      If everything runs smoothly, you will receive a `200 - OK` successful response.
+
+      > ℹ️ **Note**
+      >
+      > It may take a few minutes for your data to be updated in Data Cloud. You can manually check the records in Data Cloud or wait before attempting the `/query` from your MuleSoft API.
+
+/delete:
+  delete:
+    displayName: Delete existing records through streaming
+    is:
+      - dataCloudApiParams
+    body:
+      application/json:
+        items:
+          type:
+            anyOf:
+              -
+                displayName: string
+                type: string
+                example: ID123
+              -
+                displayName: number
+                type: number
+                format: int
+                example: 123
+        uniqueItems: true
+        example:
+          [ 1, 2, 3 ]
+    responses:
+      "200":
+        body:
+          application/json:
+            type: DataCloudSuccessfulResponse
+    description: |
+      Make sure you add the following query parameters to let Data Cloud know where you want to delete the records from:
+
+      - `sourceApiName` i.e., `MuleSoft_Ingestion_API`
+      - `objectName` i.e., `runner_profiles`
+
+      Next, in the body of the request, make sure to use a JSON Array. Each item inside this Array is the ID of a record to delete.
+
+      For example:
+
+      ```json
+      [
+        1
+      ]
+      ```
+
+      If everything runs smoothly, you will receive a `200 - OK` successful response.
+
+      > ℹ️ **Note**
+      >
+      > It may take a few minutes for your data to be updated in Data Cloud. You can manually check the records in Data Cloud or wait before attempting the `/query` from your MuleSoft API.
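The nested-to-flat transformation that the `/schema` endpoint's description walks through (dropping nesting levels while keeping only leaf keys) can be sketched in a few lines of Python. This is an illustrative reimplementation for readers — the function names `flatten` and `flatten_payload` are mine, and the Mule app performs the real transformation in DataWeave:

```python
def flatten(obj: dict) -> dict:
    """Collapse nested objects into a single level, keeping only leaf keys.

    Caveat: if the same leaf key appears at two nesting levels, the later one wins.
    """
    flat = {}
    for key, value in obj.items():
        if isinstance(value, dict):
            flat.update(flatten(value))  # recurse into nested objects
        else:
            flat[key] = value
    return flat


def flatten_payload(payload: dict) -> dict:
    """Keep top-level entity names (e.g. "customer"); flatten only their fields."""
    return {name: flatten(fields) for name, fields in payload.items()}
```

Applied to the nested `customer` example from the `/schema` description, this drops the `address` and `geo` levels while keeping `street`, `city`, `lat`, `lng`, and the other leaf fields, matching the flattened output shown there.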
diff --git a/data-cloud-integration-impl/src/main/resources/props.yaml b/data-cloud-integration-impl/src/main/resources/props.yaml
index ce9aaca..dd7b532 100644
--- a/data-cloud-integration-impl/src/main/resources/props.yaml
+++ b/data-cloud-integration-impl/src/main/resources/props.yaml
@@ -5,4 +5,3 @@ http:
   listener:
     host: "0.0.0.0"
     port: "8081"
-
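For a mental model of what the `/schema` endpoint returns, the mapping from a flattened JSON payload to the OpenAPI `components.schemas` document shown in its example can be sketched in Python. This is a hypothetical illustration — `openapi_type` and `build_schema` are names I made up, not part of the Mule app — assuming JSON numbers map to `number` and everything else to `string`, as in the example schema:

```python
def openapi_type(value) -> str:
    """Map a JSON value to the OpenAPI type names used in the example schema."""
    # bool must be checked before int/float, since bool is an int subclass in Python
    if isinstance(value, bool):
        return "boolean"
    if isinstance(value, (int, float)):
        return "number"
    return "string"


def build_schema(flat_payload: dict, openapi_version: str = "3.0.3") -> dict:
    """Build the components/schemas document for an already-flattened payload."""
    schemas = {
        name: {
            "type": "object",
            "properties": {k: {"type": openapi_type(v)} for k, v in fields.items()},
        }
        for name, fields in flat_payload.items()
    }
    return {"openapi": openapi_version, "components": {"schemas": schemas}}
```

Fed the flattened `customer` object, this produces a dict with the same shape as the YAML returned by `/schema`: `id`, `lat`, and `lng` come out as `number`, while `postalCode` stays a `string` because it is quoted in the payload.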