Commit 9edad25

WMS ID: #11382 (#644)
* Adding labs: added Labs 9, 10, and 11 for OCW24
* Review and adjustments to 23.4
* Additional changes: added the changes from Julian

Co-authored-by: hbaerus <hermann.baer@gmail.com>
1 parent dce954a commit 9edad25

24 files changed (+662 −36 lines)

23aifree/json-collections/json-collections.md

Lines changed: 7 additions & 7 deletions

@@ -22,7 +22,7 @@ In this lab, you will:
 
 ### Prerequisites
 
-- Oracle Database 23ai Free Developer Release
+- Oracle Database 23ai, version 23.4 or above
 - All previous labs successfully completed
 
 ## Task 1: Create Collection
@@ -247,7 +247,7 @@ More generally, constraints can be used to check the data being entered for vari
 Now copy and paste the query below in the worksheet and click the *Run query* button to run the SQL query to alter the **movie** table and add constraints.
 
 ```
-<copy>alter table movies add constraint movies_json_schema
+<copy>alter table "movies" add constraint movies_json_schema
 check (data is json validate '{ "type": "object",
 "properties": {
 "_id": { "type": "number" },
@@ -272,7 +272,7 @@ More generally, constraints can be used to check the data being entered for vari
 
 ```
 <copy>
-alter table movies add constraint no_negative_price
+alter table "movies" add constraint no_negative_price
 check (
 JSON_EXISTS(data, '$?(@.price.number() >= 0)')
 );
@@ -282,7 +282,7 @@ More generally, constraints can be used to check the data being entered for vari
 
 JSON_EXISTS is a SQL/JSON function that checks whether a SQL/JSON path expression selects at least one value in the JSON data. The selected value(s) are not extracted – only their existence is checked. Here, *$?(@.price.number() >= 0)* is a standard SQL/JSON path expression. You'll learn more about SQL/JSON functions later in this lab.
 
-4. Once the **movie** table is altered, navigate back to the JSON workshop. Click the navigation menu on the top left and select **JSON** under Development.
+4. Once the **movies** table is altered, navigate back to the JSON workshop. Click the navigation menu on the top left and select **JSON** under Development.
 
 ![JSON navigation](./images/development-json.png)
@@ -340,7 +340,7 @@ More generally, constraints can be used to check the data being entered for vari
 "genre": "Romance",
 "starring" :"tbd" }'), json_schema )
 AS REPORT
-from user_JSON_SCHEMA_COLUMNS where table_name = 'MOVIES')
+from user_JSON_SCHEMA_COLUMNS where table_name = 'movies')
 select json_serialize(report pretty) from x
 /
 </copy>
@@ -354,7 +354,7 @@ In the SQL tool, run:
 
 ```
 <copy>
-select constraint_name, json_serialize(json_schema) from user_JSON_SCHEMA_COLUMNS where table_name = 'MOVIES';
+select constraint_name, json_serialize(json_schema) from user_JSON_SCHEMA_COLUMNS where table_name = 'movies';
 </copy>
 ```
 ![SQL for data dictionary](./images/sql-data-dict.png)
@@ -372,4 +372,4 @@ You may now proceed to the next lab.
 
 * **Author** - William Masdon, Kaylien Phan, Hermann Baer
 * **Contributors** - David Start, Ranjan Priyadarshi
-* **Last Updated By/Date** - William Masdon, Database Product Manager, April 2023
+* **Last Updated By/Date** - Hermann Baer, Database Product Management, August 2024
Lines changed: 137 additions & 0 deletions
@@ -0,0 +1,137 @@
# JSON-To-Duality Migrator

## Introduction

The **JSON-To-Duality Migrator** is a new tool in Oracle Database 23ai that migrates one or more existing sets of JSON documents to JSON-relational duality views. The migrator can be used to migrate a database from any document database that uses JSON documents, or to build a new application on JSON documents, as it automatically creates the necessary duality views.

The JSON-To-Duality Migrator is PL/SQL based. Its PL/SQL subprograms generate the views based on implicit document-content relations (shared content). By default, document parts that can be shared are shared, and the views are defined for maximum updatability. In this lab, we will migrate a JSON collection called CONF_SCHEDULE into Oracle Database 23ai and test the end result from both SQL Developer and MongoDB Compass.

The migrator has two components:

- **Converter**: Creates the database objects needed to support the original JSON documents: duality views and their underlying tables and indexes.
- **Importer**: Imports Oracle Database JSON-type document sets that correspond to the original external documents into the duality views created by the converter.

The **converter** is composed of these PL/SQL functions in package **DBMS\_JSON\_DUALITY**:

- **infer\_schema** infers the JSON schema that represents all of the input document sets.
- **generate\_schema** produces the code to create the required database objects for each duality view.
- **infer\_and\_generate\_schema** performs both operations.
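To build an intuition for what schema inference means, here is a minimal plain-JavaScript sketch. It is an illustration only, not the Oracle implementation: it derives a flat field-to-type map from a set of documents, loosely analogous to what infer\_schema does at much larger scale.

```javascript
// Hypothetical sketch: infer a flat {field: type} map from a document set.
// Conflicting types across documents are recorded as "mixed".
function inferFlatSchema(docs) {
  const schema = {};
  for (const doc of docs) {
    for (const [key, value] of Object.entries(doc)) {
      const t = Array.isArray(value) ? "array" : typeof value;
      if (!(key in schema)) {
        schema[key] = t;
      } else if (schema[key] !== t) {
        schema[key] = "mixed"; // same field, different types across documents
      }
    }
  }
  return schema;
}

const docs = [
  { _id: 1, title: "Keynote", room: "Hall A" },
  { _id: 2, title: "JSON Duality", speakers: ["Julian", "Hermann"] },
];
console.log(inferFlatSchema(docs));
// { _id: 'number', title: 'string', room: 'string', speakers: 'array' }
```

The real converter goes further: it detects shared sub-documents and emits the DDL for the normalized tables and the duality views on top of them.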
Estimated Time: 10 minutes

### Objectives

In this lab, you will:

- Create a native JSON collection using the new syntax
- Import data from JSON files
- Run the JSON-To-Duality Migrator (both converter and importer)
- Validate the newly created objects (tables and duality views)

### Prerequisites

- Oracle Database 23ai, version 23.4 or above
- MongoDB Compass ([free download](https://www.mongodb.com/docs/compass/current/install/))

## Task 1: Clean up the environment

1. Follow these steps to clean up your environment. These steps are only needed if you are running the workshop more than once. If this is the first time you are using this workshop, you can skip the cleanup.

```
<copy>
DROP VIEW if exists CONF_SCHEDULE_DUALITY;
DROP TABLE if exists CONF_SCHEDULE PURGE;
DROP TABLE if exists CONF_SCHEDULE_SCHEDULE PURGE;
DROP TABLE if exists CONF_SCHEDULE_ROOT PURGE;
</copy>
```

## Task 2: Create a native JSON collection

1. Option 1: In SQL Developer, run:

```
<copy>CREATE JSON COLLECTION TABLE CONF_SCHEDULE;</copy>
```

2. Option 2: In MongoDB Compass, create the collection **CONF\_SCHEDULE**.

![Create Collection](images/create_collection.png)

## Task 3: Import the document(s) from MongoDB Compass into CONF_SCHEDULE

1. Download the [BHRJ_Schedule.json](https://c4u04.objectstorage.us-ashburn-1.oci.customer-oci.com/p/EcTjWk2IuZPZeNnD_fYMcgUhdNDIDA6rt9gaFj_WZMiL7VvxPBNMY60837hu5hga/n/c4u04/b/livelabsfiles/o/labfiles/BHRJ_Schedule.json) document.

![Add File](images/add_file.png)
![Choose File](images/import_data.png)

2. Import the BHRJ_Schedule.json file.

![Import File](images/import_schedule.png)
![Show import](images/imported_completed.png)

## Task 4: Run the JSON-To-Duality Migrator

1. Use the following code to run the JSON-To-Duality Migrator: it performs infer\_schema and generate\_schema together.

```
<copy>
SET SERVEROUTPUT ON
SET LINESIZE 10000
DECLARE
    DBschema_sql clob;
    myCurrentSchema varchar2(128) default null;
BEGIN
    SELECT SYS_CONTEXT('USERENV', 'CURRENT_SCHEMA') into myCurrentSchema;

    DBschema_sql :=
        dbms_json_duality.infer_and_generate_schema(
            json('{"tableNames"    : [ "CONF_SCHEDULE" ],
                   "useFlexFields" : false,
                   "updatability"  : false,
                   "sourceSchema"  : '''|| myCurrentSchema|| '''}'));
    dbms_output.put_line('DDL Script: ');
    dbms_output.put_line(DBschema_sql);

    execute immediate DBschema_sql;

    dbms_json_duality.import(table_name => 'CONF_SCHEDULE', view_name => 'CONF_SCHEDULE_DUALITY');
END;
/
</copy>
```
## Task 5: Validate the newly created objects

1. In SQL Developer, run the following statements and check their output:

```
<copy>
select json_serialize(data pretty) from "CONF_SCHEDULE_DUALITY";
select * from user_objects where object_name like '%SCHEDULE%';
select * from CONF_SCHEDULE_SCHEDULE;
select * from CONF_SCHEDULE_ROOT;
</copy>
```
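Conceptually, the duality view reassembles rows from the generated root and child tables (here CONF\_SCHEDULE\_ROOT and CONF\_SCHEDULE\_SCHEDULE) back into whole documents. The following plain-JavaScript sketch shows the idea; the column and field names are hypothetical, not the ones the migrator actually generates.

```javascript
// Conceptual sketch only: a duality view joins a root row with its child
// rows and nests the children back into each parent document.
const rootRows = [{ id: 1, conference: "OCW24" }];
const scheduleRows = [
  { root_id: 1, session: "Keynote", time: "09:00" },
  { root_id: 1, session: "JSON Duality", time: "10:30" },
];

function assembleDocuments(roots, children) {
  return roots.map((r) => ({
    _id: r.id,
    conference: r.conference,
    // nest the matching child rows as an embedded array
    schedule: children
      .filter((c) => c.root_id === r.id)
      .map(({ session, time }) => ({ session, time })),
  }));
}

console.log(JSON.stringify(assembleDocuments(rootRows, scheduleRows), null, 2));
```

Unlike this one-way sketch, a duality view is also updatable: changes written through the document surface are decomposed back into the underlying rows.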
2. In MongoDB Compass, after refreshing the databases, click the **CONF\_SCHEDULE\_DUALITY** collection under the ADMIN database.

![Conf Schedule Duality](images/conf_schedule_duality%20collection.png)

3. In mongosh, verify the number of documents in the **CONF\_SCHEDULE\_DUALITY** collection:

```
hol23> <copy>db.CONF_SCHEDULE_DUALITY.countDocuments({});
</copy>
```

## Learn More

* [JSON-Relational Duality Developer's Guide](https://docs.oracle.com/en/database/oracle/oracle-database/23/jsnvu/json-duality.html)

## Acknowledgements

* **Author** - Julian Dontcheff, Hermann Baer
* **Contributors** - David Start, Ranjan Priyadarshi
* **Last Updated By/Date** - Carmen Berdant, Technical Program Manager, July 2024

23aifree/json-indexes/json-indexes.md

Lines changed: 184 additions & 0 deletions
@@ -0,0 +1,184 @@
# Creating multi-value JSON indexes and aggregation pipelines using MongoDB Shell (mongosh) in Oracle Database 23ai

## Introduction

In previous versions of Oracle Database, JSON indexes had to be created from SQL, and the type of the data (e.g. string, number, date) had to be specified when the index was created. Multi-value any-type indexes do not require users to specify a data type upfront, and they can be created directly from MongoDB clients. These indexes are picked up by both MongoDB and SQL queries over the same data. Previous versions also did not support MongoDB aggregation pipelines, which sometimes created friction for users migrating MongoDB applications to Oracle Database. In 23ai, aggregation pipelines are supported: they are transparently converted into SQL and executed directly. In this lab, we will create a native JSON collection called SALES and create indexes and aggregation pipelines directly from mongosh.

MongoDB recently added a new operator, $sql, to its aggregation pipeline framework. We at Oracle figured: hey, we have SQL, too. But unlike MongoDB, we have been doing SQL for quite some time, so why not support that operator and offer our customers the power of Oracle SQL within the MongoDB API? The examples below also show how $sql can be used in Oracle Database 23ai.
8+
9+
10+
Estimated Time: 10 minutes
11+
12+
### Objectives
13+
14+
In this lab, you will:
15+
16+
- Create a native JSON collection called SALES
17+
- Create and drop indexes on the SALES collection
18+
- Create aggregation pipelines
19+
- Validate the newly created collection table
20+
21+
22+
23+
### Prerequisites
24+
25+
- Oracle Database 23.5 with direct OS access as oracle user MongoDB shell (mongosh) installed
26+
- All previous labs successfully completed
27+
28+
29+
## Task 1: Clean up the environment:
30+
31+
1. Follow these steps to clean up your environment:
32+
33+
```
34+
<copy>
35+
db.SALES.drop();
36+
</copy>
37+
```
## Task 2: Create a native JSON collection called **SALES**

1. Run the following code:

```
<copy>db.createCollection('SALES');
</copy>
```
*Note: you do not need to explicitly create a collection in MongoDB; you can just insert data, and the collection will be created.*

## Task 3: Populate the SALES collection with data

1. Run the following code:

```
<copy>db.SALES.insertMany([
{ "_id" : 1, "item" : "Espresso", "price" : 5, "size" : "Short", "quantity" : 22, "date" : ISODate("2024-01-15T08:00:00Z") },
{ "_id" : 2, "item" : "Cappuccino", "price" : 6, "size" : "Short", "quantity" : 12, "date" : ISODate("2024-01-16T09:00:00Z") },
{ "_id" : 3, "item" : "Latte", "price" : 10, "size" : "Grande", "quantity" : 25, "date" : ISODate("2024-01-16T09:05:00Z") },
{ "_id" : 4, "item" : "Mocha", "price" : 8, "size" : "Tall", "quantity" : 11, "date" : ISODate("2024-02-17T08:00:00Z") },
{ "_id" : 5, "item" : "Americano", "price" : 1, "size" : "Grande", "quantity" : 12, "date" : ISODate("2024-02-18T21:06:00Z") },
{ "_id" : 6, "item" : "Cortado", "price" : 7, "size" : "Tall", "quantity" : 20, "date" : ISODate("2024-02-20T10:07:00Z") },
{ "_id" : 7, "item" : "Macchiato", "price" : 9, "size" : "Tall", "quantity" : 30, "date" : ISODate("2024-02-21T10:08:00Z") },
{ "_id" : 8, "item" : "Turkish Coffee", "price" : 20, "size" : "Grande", "quantity" : 21, "date" : ISODate("2024-02-22T14:09:00Z") },
{ "_id" : 9, "item" : "Iced Coffee", "price" : 15, "size" : "Grande", "quantity" : 17, "date" : ISODate("2024-02-23T14:09:00Z") },
{ "_id" : 10, "item" : "Dirty Chai", "price" : 12, "size" : "Tall", "quantity" : 15, "date" : ISODate("2024-02-25T14:09:00Z") },
{ "_id" : 11, "item" : "Decaf", "price" : 4, "size" : "Normal", "quantity" : 2, "date" : ISODate("2024-01-16T11:01:00Z") },
{ "_id" : 12, "item" : "Finlandia", "price" : 50, "size" : "Grande", "quantity" : 7, "date" : ISODate("2024-05-16T10:00:00Z") }
]);
</copy>
```
## Task 4: Create indexes on SALES

1. The following example creates an ascending index on the field size:

```
<copy>
db.SALES.createIndex({size:1});
</copy>
```
2. Drop the index:

```
<copy>
db.SALES.dropIndex("size_1");
</copy>
```
3. The following example creates a compound index called sales_ndx on the item field (in ascending order) and the quantity field (in descending order):

```
<copy>
db.SALES.createIndex({item:1, quantity:-1},{name:"sales_ndx"});
</copy>
```
4. Drop the index:

```
<copy>
db.SALES.dropIndex("sales_ndx");
</copy>
```
5. The following example creates a unique index called sales_uniq_ndx on the item field (in ascending order):

```
<copy>
db.SALES.createIndex({item:1}, {name:"sales_uniq_ndx", unique: true});
</copy>
```
6. Check that the index was actually created, using $sql:

```
<copy>
db.aggregate([{ $sql: "select index_name from user_indexes where index_type <> 'LOB' and table_name = 'SALES'" }] );
</copy>
```
7. Check whether the index is used in mongosh:

```
<copy>
db.SALES.find({size:3}).explain();
</copy>
```

8. Drop the index:

```
<copy>
db.SALES.dropIndex("sales_uniq_ndx");
</copy>
```
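To see what the key order {item:1, quantity:-1} in step 3 means, here is a plain-JavaScript sketch (illustration only, not how the database stores the index) that sorts entries the way such a compound index orders its keys:

```javascript
// A compound index on {item: 1, quantity: -1} orders entries by
// item ascending, then quantity descending within equal items.
const entries = [
  { item: "Latte", quantity: 25 },
  { item: "Espresso", quantity: 22 },
  { item: "Espresso", quantity: 30 },
];

const ordered = [...entries].sort((a, b) => {
  if (a.item !== b.item) return a.item < b.item ? -1 : 1; // item ascending
  return b.quantity - a.quantity;                          // quantity descending
});

console.log(ordered.map((e) => `${e.item}:${e.quantity}`));
// [ 'Espresso:30', 'Espresso:22', 'Latte:25' ]
```

This key order is why such an index can answer queries that filter on item and sort by quantity descending without a separate sort step.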
## Task 5: Create aggregation pipelines

1. Calculate the total price of all items:

```
<copy>
db.SALES.aggregate([{$group:{_id:null,"total_price": {$sum:"$price"}}}]);
</copy>
```

2. Using SQL, calculate the total quantity of all items:

```
<copy>
db.SALES.aggregate([
{$sql: 'select sum(f.data.quantity) from SALES f'}
]);
</copy>
```

3. Using SQL, calculate the average price of the items:

```
<copy>db.SALES.aggregate([
{$sql: 'select avg(f.data.price) from SALES f'}
]);
</copy>
```

4. For every size, list the total quantity available:

```
<copy>db.SALES.aggregate([
{
  $group: {
    _id: '$size',
    totalQty: { $sum: '$quantity' },
  },
},
]);
</copy>
```
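As a sanity check on the sample data above, the two $group stages can be reproduced in plain JavaScript. This is a sketch of what the pipeline computes over the 12 sample documents, not how the database executes it:

```javascript
// The 12 sample SALES documents (dates omitted; they don't affect the sums).
const sales = [
  { item: "Espresso", price: 5, size: "Short", quantity: 22 },
  { item: "Cappuccino", price: 6, size: "Short", quantity: 12 },
  { item: "Latte", price: 10, size: "Grande", quantity: 25 },
  { item: "Mocha", price: 8, size: "Tall", quantity: 11 },
  { item: "Americano", price: 1, size: "Grande", quantity: 12 },
  { item: "Cortado", price: 7, size: "Tall", quantity: 20 },
  { item: "Macchiato", price: 9, size: "Tall", quantity: 30 },
  { item: "Turkish Coffee", price: 20, size: "Grande", quantity: 21 },
  { item: "Iced Coffee", price: 15, size: "Grande", quantity: 17 },
  { item: "Dirty Chai", price: 12, size: "Tall", quantity: 15 },
  { item: "Decaf", price: 4, size: "Normal", quantity: 2 },
  { item: "Finlandia", price: 50, size: "Grande", quantity: 7 },
];

// Step 1: like {$group: {_id: null, total_price: {$sum: "$price"}}}
const totalPrice = sales.reduce((sum, s) => sum + s.price, 0);

// Step 4: like {$group: {_id: "$size", totalQty: {$sum: "$quantity"}}}
const qtyBySize = {};
for (const s of sales) {
  qtyBySize[s.size] = (qtyBySize[s.size] ?? 0) + s.quantity;
}

console.log(totalPrice); // 147
console.log(qtyBySize);  // { Short: 34, Grande: 82, Tall: 76, Normal: 2 }
```

These are the numbers you should see coming back from the pipelines in steps 1 and 4.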
* More examples of analytic functions can be found at: https://blogs.oracle.com/database/post/proper-sql-comes-to-mongodb-applications-with-oracle

## Learn More

* [Proper SQL comes to MongoDB applications..with the Oracle Database!](https://blogs.oracle.com/database/post/proper-sql-comes-to-mongodb-applications-with-oracle)

## Acknowledgements

* **Author** - Julian Dontcheff, Hermann Baer
* **Contributors** - David Start, Ranjan Priyadarshi
* **Last Updated By/Date** - Carmen Berdant, Technical Program Manager, July 2024
