# Creating multi-value JSON indexes and aggregation pipelines using MongoDB Shell (mongosh) in Oracle Database 23ai

## Introduction

In previous versions of Oracle Database, JSON indexes had to be created from SQL, and the data type (e.g., string, number, or date) had to be specified when the index was created. Multi-value any-type indexes do not require you to specify a data type upfront, and they can be created directly from MongoDB clients. These indexes are picked up by both MongoDB and SQL queries over the same data. Earlier releases also did not support MongoDB aggregation pipelines, which sometimes created friction for users migrating MongoDB applications to Oracle Database. In 23ai, aggregation pipelines are supported: they are transparently converted into SQL and executed directly. In this lab, we will create a native JSON collection called SALES and create indexes and aggregation pipelines directly from mongosh.

MongoDB recently added a new operator, $sql, to its aggregation pipeline framework. We at Oracle figured: hey, we have SQL, too, and we've been doing SQL for quite some time. So why not support that operator and offer our customers the power of Oracle SQL from within the MongoDB API? The examples below also show how $sql can be used in Oracle Database 23ai.

Estimated Time: 10 minutes

### Objectives

In this lab, you will:

- Create a native JSON collection called SALES
- Create and drop indexes on the SALES collection
- Create aggregation pipelines
- Validate the newly created collection table

### Prerequisites

- Oracle Database 23.5 with direct OS access as the oracle user
- MongoDB Shell (mongosh) installed
- All previous labs successfully completed

## Task 1: Clean up the environment

1. Drop the SALES collection if it already exists:

    ```
    <copy>
    db.SALES.drop();
    </copy>
    ```

## Task 2: Create a native JSON collection called **SALES**

1. Run the following code:

    ```
    <copy>
    db.createCollection('SALES');
    </copy>
    ```

    *Note: you do not need to explicitly create a collection in MongoDB; you can simply insert data and the collection will be created automatically.*

## Task 3: Populate the SALES collection with data

1. Run the following code:

    ```
    <copy>
    db.SALES.insertMany([
      { "_id" : 1,  "item" : "Espresso",       "price" : 5,  "size" : "Short",  "quantity" : 22, "date" : ISODate("2024-01-15T08:00:00Z") },
      { "_id" : 2,  "item" : "Cappuccino",     "price" : 6,  "size" : "Short",  "quantity" : 12, "date" : ISODate("2024-01-16T09:00:00Z") },
      { "_id" : 3,  "item" : "Latte",          "price" : 10, "size" : "Grande", "quantity" : 25, "date" : ISODate("2024-01-16T09:05:00Z") },
      { "_id" : 4,  "item" : "Mocha",          "price" : 8,  "size" : "Tall",   "quantity" : 11, "date" : ISODate("2024-02-17T08:00:00Z") },
      { "_id" : 5,  "item" : "Americano",      "price" : 1,  "size" : "Grande", "quantity" : 12, "date" : ISODate("2024-02-18T21:06:00Z") },
      { "_id" : 6,  "item" : "Cortado",        "price" : 7,  "size" : "Tall",   "quantity" : 20, "date" : ISODate("2024-02-20T10:07:00Z") },
      { "_id" : 7,  "item" : "Macchiato",      "price" : 9,  "size" : "Tall",   "quantity" : 30, "date" : ISODate("2024-02-21T10:08:00Z") },
      { "_id" : 8,  "item" : "Turkish Coffee", "price" : 20, "size" : "Grande", "quantity" : 21, "date" : ISODate("2024-02-22T14:09:00Z") },
      { "_id" : 9,  "item" : "Iced Coffee",    "price" : 15, "size" : "Grande", "quantity" : 17, "date" : ISODate("2024-02-23T14:09:00Z") },
      { "_id" : 10, "item" : "Dirty Chai",     "price" : 12, "size" : "Tall",   "quantity" : 15, "date" : ISODate("2024-02-25T14:09:00Z") },
      { "_id" : 11, "item" : "Decaf",          "price" : 4,  "size" : "Normal", "quantity" : 2,  "date" : ISODate("2024-01-16T11:01:00Z") },
      { "_id" : 12, "item" : "Finlandia",      "price" : 50, "size" : "Grande", "quantity" : 7,  "date" : ISODate("2024-05-16T10:00:00Z") }
    ]);
    </copy>
    ```

## Task 4: Create indexes on SALES

1. The following example creates an ascending index on the field size:

    ```
    <copy>
    db.SALES.createIndex({size:1});
    </copy>
    ```

2. Drop the index:

    ```
    <copy>
    db.SALES.dropIndex("size_1");
    </copy>
    ```

3. The following example creates a compound index called sales_ndx on the item field (in ascending order) and the quantity field (in descending order):

    ```
    <copy>
    db.SALES.createIndex({item:1, quantity:-1}, {name:"sales_ndx"});
    </copy>
    ```

4. Drop the index:

    ```
    <copy>
    db.SALES.dropIndex("sales_ndx");
    </copy>
    ```

5. The following example creates a unique index called sales_uniq_ndx on the item field (in ascending order):

    ```
    <copy>
    db.SALES.createIndex({item:1}, {name:"sales_uniq_ndx", unique: true});
    </copy>
    ```
6. Check that the index was actually created using $sql:

    ```
    <copy>
    db.aggregate([{ $sql: "select index_name from user_indexes where index_type <> 'LOB' and table_name = 'SALES'" }]);
    </copy>
    ```
7. Check that the index is used in mongosh. The remaining index (sales_uniq_ndx) is on the item field, so query by item and inspect the execution plan:

    ```
    <copy>
    db.SALES.find({item:"Espresso"}).explain();
    </copy>
    ```

8. Drop the index:

    ```
    <copy>
    db.SALES.dropIndex("sales_uniq_ndx");
    </copy>
    ```

## Task 5: Create aggregation pipelines

1. Calculate the total price of all items:

    ```
    <copy>
    db.SALES.aggregate([{$group:{_id:null, "total_price": {$sum:"$price"}}}]);
    </copy>
    ```
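To see what this $group stage computes without a database connection, here is a plain-JavaScript sketch (runnable with Node.js, not mongosh). It simply mirrors the arithmetic over the prices from the sample documents in Task 3 and is not part of the lab itself:

```javascript
// Prices from the 12 sample documents inserted in Task 3.
const prices = [5, 6, 10, 8, 1, 7, 9, 20, 15, 12, 4, 50];

// Equivalent of {$group: {_id: null, total_price: {$sum: "$price"}}}:
// one group over all documents, summing the price field.
const totalPrice = prices.reduce((sum, p) => sum + p, 0);
console.log(totalPrice); // 147
```

The pipeline in mongosh should report the same total_price value of 147 for this dataset.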

2. Using SQL, calculate the total quantity of all items:

    ```
    <copy>
    db.SALES.aggregate([
      {$sql: 'select sum(f.data.quantity) from SALES f'}
    ]);
    </copy>
    ```
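The $sql stage above sums f.data.quantity on the server side. As a local illustration of the same arithmetic (plain Node.js over the sample quantities from Task 3, not a live query):

```javascript
// Quantities from the 12 sample documents inserted in Task 3.
const quantities = [22, 12, 25, 11, 12, 20, 30, 21, 17, 15, 2, 7];

// Equivalent of: select sum(f.data.quantity) from SALES f
const totalQuantity = quantities.reduce((sum, q) => sum + q, 0);
console.log(totalQuantity); // 194
```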

3. Using SQL, calculate the average price of the items:

    ```
    <copy>
    db.SALES.aggregate([
      {$sql: 'select avg(f.data.price) from SALES f'}
    ]);
    </copy>
    ```
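For reference, the average can be checked by hand from the Task 3 sample data (a plain Node.js sketch, not mongosh):

```javascript
// Prices from the 12 sample documents inserted in Task 3.
const prices = [5, 6, 10, 8, 1, 7, 9, 20, 15, 12, 4, 50];

// Equivalent of: select avg(f.data.price) from SALES f
const avgPrice = prices.reduce((sum, p) => sum + p, 0) / prices.length;
console.log(avgPrice); // 12.25
```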

4. For every size, list the total quantity available:

    ```
    <copy>
    db.SALES.aggregate([
      {
        $group: {
          _id: '$size',
          totalQty: { $sum: '$quantity' }
        }
      }
    ]);
    </copy>
    ```
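Grouping by a field produces one output document per distinct key. The following plain-JavaScript sketch (Node.js, not mongosh) mirrors what this $group stage does with the (size, quantity) pairs from the Task 3 sample data:

```javascript
// (size, quantity) pairs from the 12 sample documents inserted in Task 3.
const sales = [
  ["Short", 22], ["Short", 12], ["Grande", 25], ["Tall", 11],
  ["Grande", 12], ["Tall", 20], ["Tall", 30], ["Grande", 21],
  ["Grande", 17], ["Tall", 15], ["Normal", 2], ["Grande", 7],
];

// Equivalent of {$group: {_id: "$size", totalQty: {$sum: "$quantity"}}}:
// one result per distinct size, with the quantities summed per group.
const totalsBySize = {};
for (const [size, qty] of sales) {
  totalsBySize[size] = (totalsBySize[size] ?? 0) + qty;
}
console.log(totalsBySize); // { Short: 34, Grande: 82, Tall: 76, Normal: 2 }
```

Note that, as in MongoDB, the order of the groups in the result is not guaranteed; only the per-group totals are.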
* More examples, including analytical functions, can be found at: https://blogs.oracle.com/database/post/proper-sql-comes-to-mongodb-applications-with-oracle

## Learn More

* [Proper SQL comes to MongoDB applications..with the Oracle Database!](https://blogs.oracle.com/database/post/proper-sql-comes-to-mongodb-applications-with-oracle)

## Acknowledgements

* **Author** - Julian Dontcheff, Hermann Baer
* **Contributors** - David Start, Ranjan Priyadarshi
* **Last Updated By/Date** - Carmen Berdant, Technical Program Manager, July 2024