Roboflow Notebooks


Fine-tuning Florence-2 on Object Detection Dataset

+
+

Roboflow arXiv

+

Florence-2 is a lightweight vision-language model open-sourced by Microsoft under the MIT license. The model demonstrates strong zero-shot and fine-tuning capabilities across tasks such as captioning, object detection, grounding, and segmentation.

+
+
+

+
Florence-2 Figure.1
+
+
+

Figure 1. Illustration showing the level of spatial hierarchy and semantic granularity expressed by each task. Source: Florence-2: Advancing a Unified Representation for a Variety of Vision Tasks.

+

The model takes images and task prompts as input, generating the desired results in text format. It uses a DaViT vision encoder to convert images into visual token embeddings. These are then concatenated with BERT-generated text embeddings and processed by a transformer-based multi-modal encoder-decoder to generate the response.

+
+
+

+
Florence-2 Figure.2
+
+
+

Figure 2. Overview of Florence-2 architecture. Source: Florence-2: Advancing a Unified Representation for a Variety of Vision Tasks.

+
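The embedding-concatenation step described above can be sketched with dummy tensors. This is purely illustrative: the token counts and hidden size below are assumptions for the sketch, not Florence-2's actual configuration, and the real fusion happens inside the model's forward pass.

```python
import torch

# Illustrative shapes only; the real model uses a DaViT vision encoder and a
# BART-style multi-modal encoder-decoder with dimensions from its config.
batch, num_visual_tokens, num_text_tokens, d_model = 1, 577, 12, 768

visual_embeds = torch.randn(batch, num_visual_tokens, d_model)  # from the vision encoder
text_embeds = torch.randn(batch, num_text_tokens, d_model)      # from the text embedder

# The multi-modal encoder consumes the concatenated token sequence.
multimodal_input = torch.cat([text_embeds, visual_embeds], dim=1)
print(multimodal_input.shape)  # torch.Size([1, 589, 768])
```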
+

Setup

+
+

Configure your API keys

+

To fine-tune Florence-2, you need to provide your HuggingFace Token and Roboflow API key. Follow these steps:

+
  • Open your HuggingFace Settings page. Click Access Tokens, then New Token, to generate a new token.
  • Go to your Roboflow Settings page. Click Copy. This will place your private key in the clipboard.
  • In Colab, go to the left pane and click on Secrets (🔑).
    • Store the HuggingFace Access Token under the name HF_TOKEN.
    • Store the Roboflow API Key under the name ROBOFLOW_API_KEY.
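Once stored, the secrets can be read back in code. A minimal sketch, where load_secret is a hypothetical helper (not part of this notebook): inside Colab it reads from the Secrets pane via google.colab.userdata, elsewhere it falls back to ordinary environment variables.

```python
import os

def load_secret(name: str):
    """Fetch a secret from Colab's Secrets pane, falling back to env vars."""
    try:
        from google.colab import userdata  # available only inside Colab
        return userdata.get(name)
    except ImportError:
        return os.environ.get(name)  # outside Colab, use the environment

HF_TOKEN = load_secret("HF_TOKEN")
ROBOFLOW_API_KEY = load_secret("ROBOFLOW_API_KEY")
```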

Select the runtime

+

Let’s make sure that we have access to a GPU. We can use the nvidia-smi command to do that. In case of any problems, navigate to Edit -> Notebook settings -> Hardware accelerator, set it to L4 GPU, and then click Save.

+
+
!nvidia-smi
+
+
Tue Feb 11 14:44:13 2025       
++-----------------------------------------------------------------------------------------+
+| NVIDIA-SMI 565.57.01              Driver Version: 565.57.01      CUDA Version: 12.7     |
+|-----------------------------------------+------------------------+----------------------+
+| GPU  Name                 Persistence-M | Bus-Id          Disp.A | Volatile Uncorr. ECC |
+| Fan  Temp   Perf          Pwr:Usage/Cap |           Memory-Usage | GPU-Util  Compute M. |
+|                                         |                        |               MIG M. |
+|=========================================+========================+======================|
+|   0  NVIDIA A100-SXM4-80GB          On  |   00000000:01:00.0 Off |                    0 |
+| N/A   38C    P0             86W /  500W |       5MiB /  81920MiB |      0%      Default |
+|                                         |                        |             Disabled |
++-----------------------------------------+------------------------+----------------------+
+|   1  NVIDIA A100-SXM4-80GB          On  |   00000000:41:00.0 Off |                    0 |
+| N/A   35C    P0             63W /  500W |       5MiB /  81920MiB |      0%      Default |
+|                                         |                        |             Disabled |
++-----------------------------------------+------------------------+----------------------+
+|   2  NVIDIA A100-SXM4-80GB          On  |   00000000:81:00.0 Off |                    0 |
+| N/A   35C    P0             63W /  500W |       5MiB /  81920MiB |      0%      Default |
+|                                         |                        |             Disabled |
++-----------------------------------------+------------------------+----------------------+
+|   3  NVIDIA A100-SXM4-80GB          On  |   00000000:C1:00.0 Off |                    0 |
+| N/A   34C    P0             62W /  500W |       5MiB /  81920MiB |      0%      Default |
+|                                         |                        |             Disabled |
++-----------------------------------------+------------------------+----------------------+
+                                                                                         
++-----------------------------------------------------------------------------------------+
+| Processes:                                                                              |
+|  GPU   GI   CI        PID   Type   Process name                              GPU Memory |
+|        ID   ID                                                               Usage      |
+|=========================================================================================|
+|  No running processes found                                                             |
++-----------------------------------------------------------------------------------------+
+
+
+
+
+
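The same check can be done from Python with torch. A quick sketch that complements nvidia-smi above:

```python
import torch

# Programmatic GPU check - an alternative to the nvidia-smi shell command.
if torch.cuda.is_available():
    print("GPU available:", torch.cuda.get_device_name(0))
else:
    print("No CUDA device visible; computation will fall back to CPU.")
```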

Download example data

+

NOTE: Feel free to replace our example image with your own photo.

+
+
!wget -q https://media.roboflow.com/notebooks/examples/dog.jpeg
+!ls -lh
+
+
total 337M
+-rw-rw-r-- 1 patel_zeel patel_zeel 104K Jun  2  2023 dog.jpeg
+-rw-rw-r-- 1 patel_zeel patel_zeel 104K Jun  2  2023 dog.jpeg.1
+-rw-rw-r-- 1 patel_zeel patel_zeel 104K Jun  2  2023 dog.jpeg.2
+-rw-rw-r-- 1 patel_zeel patel_zeel 104K Jun  2  2023 dog.jpeg.3
+-rw-rw-r-- 1 patel_zeel patel_zeel 4.7M Jan 20 13:40 example.tiff
+-rw-rw-r-- 1 patel_zeel patel_zeel 4.3M Feb 11 14:44 how-to-finetune-florence-2-on-detection-dataset.ipynb
+drwxrwxr-x 7 patel_zeel patel_zeel 4.0K Feb 11 14:33 model_checkpoints
+drwxrwxr-x 5 patel_zeel patel_zeel 4.0K Feb 11 00:54 poker-cards-4
+drwxrwxr-x 4 patel_zeel patel_zeel 4.0K Jan 20 15:43 runs
+-rw-rw-r-- 1 patel_zeel patel_zeel 2.5M Jan 22 10:51 scratchpad.ipynb
+drwxrwxr-x 4 patel_zeel patel_zeel 4.0K Jan 20 15:17 trench_width
+drwxrwxr-x 7 patel_zeel patel_zeel 4.0K Jan 20 15:43 wandb
+-rw-rw-r-- 1 patel_zeel patel_zeel  41M Jan 20 15:36 yolo11m-obb.pt
+-rw-rw-r-- 1 patel_zeel patel_zeel 5.4M Jan 20 15:17 yolo11n.pt
+-rw-rw-r-- 1 patel_zeel patel_zeel  50M Jan 20 15:43 yolov8m.pt
+-rw-rw-r-- 1 patel_zeel patel_zeel  53M Jan 20 15:43 yolov8m-seg.pt
+-rw-rw-r-- 1 patel_zeel patel_zeel  23M Jan 20 15:14 yolov8s-obb.pt
+-rw-rw-r-- 1 patel_zeel patel_zeel  22M Jan 18 23:20 yolov8s.pt
+-rw-rw-r-- 1 patel_zeel patel_zeel 134M Jan 20 15:30 yolov8x-obb.pt
+
+
+
+
EXAMPLE_IMAGE_PATH = "dog.jpeg"
+
+
+
+
+

Download and configure the model

+

Let’s download the model checkpoint and configure it so that you can fine-tune it later on.

+
+
!pip install -q transformers flash_attn timm einops peft
+!pip install -q roboflow git+https://github.com/roboflow/supervision.git
+
+
+
# @title Imports
+
+import io
+import os
+import re
+import json
+import torch
+import html
+import base64
+import itertools
+
+import numpy as np
+import supervision as sv
+
+# from google.colab import userdata
+from IPython.display import display, HTML
+from torch.optim import AdamW  # transformers' AdamW export is deprecated in newer versions
+from torch.utils.data import Dataset, DataLoader
+from transformers import (
+    AutoModelForCausalLM,
+    AutoProcessor,
+    get_scheduler
+)
+from tqdm import tqdm
+from typing import List, Dict, Any, Tuple, Generator
+from peft import LoraConfig, get_peft_model
+from PIL import Image
+from roboflow import Roboflow
+
+
+
+
+

Load the model using AutoModelForCausalLM and the processor using AutoProcessor classes from the transformers library. Note that you need to pass trust_remote_code as True since this model is not a standard transformers model.

+
+
CHECKPOINT = "microsoft/Florence-2-base-ft"
+# REVISION = 'refs/pr/6'
+DEVICE = torch.device("cuda" if torch.cuda.is_available() else "cpu")
+
+model = AutoModelForCausalLM.from_pretrained(CHECKPOINT, trust_remote_code=True).to(DEVICE)
+processor = AutoProcessor.from_pretrained(CHECKPOINT, trust_remote_code=True)
+
+
Importing from timm.models.layers is deprecated, please import via timm.layers
+Florence2LanguageForConditionalGeneration has generative capabilities, as `prepare_inputs_for_generation` is explicitly overwritten. However, it doesn't directly inherit from `GenerationMixin`. From 👉v4.50👈 onwards, `PreTrainedModel` will NOT inherit from `GenerationMixin`, and this model will lose the ability to call `generate` and other related functions.
+  - If you're using `trust_remote_code=True`, you can get rid of this warning by loading the model with an auto class. See https://huggingface.co/docs/transformers/en/model_doc/auto#auto-classes
+  - If you are the owner of the model architecture code, please modify your model class such that it inherits from `GenerationMixin` (after `PreTrainedModel`, otherwise you'll get an exception).
+  - If you are not the owner of the model architecture class, please contact the model code owner to update it.
+
+
+
+
+

Run inference with pre-trained Florence-2 model

+
+
# @title Example object detection inference
+
+image = Image.open(EXAMPLE_IMAGE_PATH)
+task = "<OD>"
+text = "<OD>"
+
+inputs = processor(text=text, images=image, return_tensors="pt").to(DEVICE)
+generated_ids = model.generate(
+    input_ids=inputs["input_ids"],
+    pixel_values=inputs["pixel_values"],
+    max_new_tokens=1024,
+    num_beams=3
+)
+generated_text = processor.batch_decode(generated_ids, skip_special_tokens=False)[0]
+response = processor.post_process_generation(generated_text, task=task, image_size=(image.width, image.height))
+detections = sv.Detections.from_lmm(sv.LMM.FLORENCE_2, response, resolution_wh=image.size)
+
+box_annotator = sv.BoxAnnotator(color_lookup=sv.ColorLookup.INDEX)
+label_annotator = sv.LabelAnnotator(color_lookup=sv.ColorLookup.INDEX)
+
+image = box_annotator.annotate(image, detections)
+image = label_annotator.annotate(image, detections)
+image.thumbnail((600, 600))
+image
+
+
+
+
+
+
+

+
+
+
+
+
+
# @title Example image captioning inference
+
+image = Image.open(EXAMPLE_IMAGE_PATH)
+task = "<DETAILED_CAPTION>"
+text = "<DETAILED_CAPTION>"
+
+inputs = processor(text=text, images=image, return_tensors="pt").to(DEVICE)
+generated_ids = model.generate(
+    input_ids=inputs["input_ids"],
+    pixel_values=inputs["pixel_values"],
+    max_new_tokens=1024,
+    num_beams=3
+)
+generated_text = processor.batch_decode(generated_ids, skip_special_tokens=False)[0]
+response = processor.post_process_generation(generated_text, task=task, image_size=(image.width, image.height))
+response
+
+
{'<DETAILED_CAPTION>': 'In this image we can see a person wearing a bag and holding a dog. In the background there are buildings, poles and sky with clouds.'}
+
+
+
+
# @title Example caption to phrase grounding inference
+
+image = Image.open(EXAMPLE_IMAGE_PATH)
+task = "<CAPTION_TO_PHRASE_GROUNDING>"
+text = "<CAPTION_TO_PHRASE_GROUNDING> Vehicle"
+
+inputs = processor(text=text, images=image, return_tensors="pt").to(DEVICE)
+generated_ids = model.generate(
+    input_ids=inputs["input_ids"],
+    pixel_values=inputs["pixel_values"],
+    max_new_tokens=1024,
+    num_beams=3
+)
+generated_text = processor.batch_decode(generated_ids, skip_special_tokens=False)[0]
+response = processor.post_process_generation(generated_text, task=task, image_size=(image.width, image.height))
+detections = sv.Detections.from_lmm(sv.LMM.FLORENCE_2, response, resolution_wh=image.size)
+
+box_annotator = sv.BoxAnnotator(color_lookup=sv.ColorLookup.INDEX)
+label_annotator = sv.LabelAnnotator(color_lookup=sv.ColorLookup.INDEX)
+
+image = box_annotator.annotate(image, detections)
+image = label_annotator.annotate(image, detections)
+image.thumbnail((600, 600))
+image
+
+
+
+
+
+
+

+
+
+
+
+
+
+

Fine-tune Florence-2 on custom dataset

+
+

Download dataset from Roboflow Universe

+
+
ROBOFLOW_API_KEY = os.getenv("ROBOFLOW_API_KEY")
+rf = Roboflow(api_key=ROBOFLOW_API_KEY)
+
+project = rf.workspace("roboflow-jvuqo").project("poker-cards-fmjio")
+version = project.version(4)
+dataset = version.download("florence2-od")
+
+
loading Roboflow workspace...
+loading Roboflow project...
+
+
+
+
!head -n 5 {dataset.location}/train/annotations.jsonl
+
+
{"image": "IMG_20220316_172418_jpg.rf.e3cb4a86dc0247e71e3697aa3e9db923.jpg", "prefix": "<OD>", "suffix": ":!pg!dmvct<loc_138><loc_100><loc_470><loc_448>21!pg!dmvct<loc_388><loc_145><loc_670><loc_453>kbdl!!pg!dmvct<loc_566><loc_166><loc_823><loc_432>rvffo!pg!dmvct<loc_365><loc_465><loc_765><loc_999>ljoh!pg!dmvct<loc_601><loc_440><loc_949><loc_873>"}
+{"image": "IMG_20220316_171515_jpg.rf.e3b1932bb375b3b3912027647586daa8.jpg", "prefix": "<OD>", "suffix": "6!pg!dmvct<loc_554><loc_2><loc_763><loc_467>7!pg!dmvct<loc_399><loc_79><loc_555><loc_466>8!pg!dmvct<loc_363><loc_484><loc_552><loc_905>9!pg!dmvct<loc_535><loc_449><loc_757><loc_971>"}
+{"image": "IMG_20220316_165139_jpg.rf.e30257ec169a2bfdfecb693211d37250.jpg", "prefix": "<OD>", "suffix": ":!pg!ejbnpoet<loc_596><loc_535><loc_859><loc_982>kbdl!pg!ejbnpoet<loc_211><loc_546><loc_411><loc_880>rvffo!pg!ejbnpoet<loc_430><loc_34><loc_692><loc_518>ljoh!pg!ejbnpoet<loc_223><loc_96><loc_451><loc_523>21!pg!ejbnpoet<loc_387><loc_542><loc_604><loc_925>"}
+{"image": "IMG_20220316_143407_jpg.rf.e1eb3be3efc6c3bbede436cfb5489e7c.jpg", "prefix": "<OD>", "suffix": "bdf!pg!ifbsut<loc_345><loc_315><loc_582><loc_721>3!pg!ifbsut<loc_709><loc_115><loc_888><loc_509>4!pg!ifbsut<loc_529><loc_228><loc_735><loc_613>5!pg!ifbsut<loc_98><loc_421><loc_415><loc_845>"}
+{"image": "IMG_20220316_165139_jpg.rf.e4c229a9128494d17992cbe88af575df.jpg", "prefix": "<OD>", "suffix": ":!pg!ejbnpoet<loc_141><loc_18><loc_404><loc_465>kbdl!pg!ejbnpoet<loc_589><loc_120><loc_789><loc_454>rvffo!pg!ejbnpoet<loc_308><loc_482><loc_570><loc_966>ljoh!pg!ejbnpoet<loc_549><loc_477><loc_777><loc_904>21!pg!ejbnpoet<loc_396><loc_75><loc_613><loc_458>"}
+
+
+
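The suffix strings above use Florence-2's location tokens: each box is a label followed by four <loc_k> tokens, where k is the coordinate quantized to a 0-999 grid. processor.post_process_generation decodes these for you; purely to illustrate the encoding, a hypothetical parser (not part of this notebook) might look like:

```python
import re

def parse_od_suffix(suffix: str, image_wh: tuple) -> list:
    """Decode a Florence-2 <OD> suffix into (label, x1, y1, x2, y2) pixel boxes.

    Each <loc_k> token encodes a coordinate on a 0-999 grid, so it is
    scaled by the image width/height divided by 1000.
    """
    w, h = image_wh
    boxes = []
    # A label is any run of non-'<' characters followed by exactly four loc tokens.
    for label, locs in re.findall(r"([^<]+)((?:<loc_\d+>){4})", suffix):
        x1, y1, x2, y2 = (int(k) for k in re.findall(r"<loc_(\d+)>", locs))
        boxes.append((label.strip(), x1 * w / 1000, y1 * h / 1000, x2 * w / 1000, y2 * h / 1000))
    return boxes

print(parse_od_suffix("dog<loc_100><loc_200><loc_300><loc_400>", (1000, 1000)))
# [('dog', 100.0, 200.0, 300.0, 400.0)]
```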
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
+To disable this warning, you can either:
+    - Avoid using `tokenizers` before the fork if possible
+    - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
+
+
+
+
# Remap the suit names in the test annotations to unrelated words
+# (club -> dog, diamond -> cat, heart -> bird, spade -> fish) to see how
+# label semantics affect fine-tuning. The original annotations are read
+# from the annotations.jsonl.backup copy and annotations.jsonl is rewritten.
+def read_jsonl(file_path: str) -> Generator[Dict[str, Any], None, None]:
+    with open(file_path, "r") as f:
+        for line in f:
+            yield json.loads(line)
+
+lines = []
+split = "test"
+for line in read_jsonl(dataset.location + f"/{split}/annotations.jsonl.backup"):
+    line['suffix'] = line['suffix'].replace("club", "dog").replace("diamond", "cat").replace("heart", "bird").replace("spade", "fish")
+    print(line)
+    lines.append(line)
+
+with open(dataset.location + f"/{split}/annotations.jsonl", "w") as f:
+    for line in lines:
+        f.write(json.dumps(line) + "\n")
+
+
{'image': 'IMG_20220316_140255_jpg.rf.0d10768652a0f20bea317e96632d3448.jpg', 'prefix': '<OD>', 'suffix': '5 of fishs<loc_146><loc_488><loc_541><loc_917>6 of fishs<loc_259><loc_221><loc_604><loc_673>7 of fishs<loc_470><loc_206><loc_761><loc_741>8 of fishs<loc_599><loc_201><loc_949><loc_732>'}
+{'image': 'IMG_20220316_140400_jpg.rf.3f21e54bd916b05218202fbf109d8a5f.jpg', 'prefix': '<OD>', 'suffix': '5 of fishs<loc_127><loc_280><loc_378><loc_597>7 of fishs<loc_244><loc_415><loc_539><loc_904>8 of fishs<loc_414><loc_98><loc_792><loc_513>6 of fishs<loc_450><loc_469><loc_847><loc_999>'}
+{'image': 'IMG_20220316_144511_jpg.rf.40ee049c8f854c558e2ca20f90be3787.jpg', 'prefix': '<OD>', 'suffix': '5 of birds<loc_49><loc_508><loc_221><loc_891>6 of birds<loc_205><loc_391><loc_384><loc_832>7 of birds<loc_387><loc_281><loc_621><loc_777>8 of birds<loc_615><loc_100><loc_971><loc_677>'}
+{'image': 'IMG_20220316_144657_jpg.rf.0b7bb9ab4b594b83097af9c4c1ea46c3.jpg', 'prefix': '<OD>', 'suffix': '5 of birds<loc_643><loc_238><loc_870><loc_672>6 of birds<loc_184><loc_34><loc_555><loc_468>7 of birds<loc_137><loc_531><loc_527><loc_989>8 of birds<loc_517><loc_521><loc_753><loc_982>'}
+{'image': 'IMG_20220316_172435_jpg.rf.854fe0c471b03fe3a3894a3d2cbe00d0.jpg', 'prefix': '<OD>', 'suffix': '9 of dogs<loc_76><loc_146><loc_420><loc_702>10 of dogs<loc_353><loc_111><loc_586><loc_570>jack  of dogs<loc_540><loc_59><loc_724><loc_460>queen of dogs<loc_336><loc_550><loc_586><loc_998>king of dogs<loc_525><loc_456><loc_717><loc_852>'}
+{'image': 'IMG_20220316_161455_jpg.rf.635ccd0ee9f7dd762009f539d6b998e9.jpg', 'prefix': '<OD>', 'suffix': '9 of birds<loc_22><loc_495><loc_367><loc_878>10 of birds<loc_423><loc_33><loc_653><loc_428>jack of birds<loc_170><loc_136><loc_445><loc_563>queen of birds<loc_390><loc_470><loc_687><loc_791>king of birds<loc_666><loc_207><loc_938><loc_643>'}
+{'image': 'IMG_20220316_161313_jpg.rf.12498ecbc8985a8ff65bd2033d8f622a.jpg', 'prefix': '<OD>', 'suffix': '9 of birds<loc_165><loc_191><loc_374><loc_532>10 of birds<loc_361><loc_115><loc_584><loc_471>jack of birds<loc_527><loc_212><loc_912><loc_646>queen of birds<loc_375><loc_616><loc_692><loc_988>king of birds<loc_216><loc_451><loc_463><loc_846>'}
+{'image': 'IMG_20220316_142643_jpg.rf.13b0a2a65a9d4b17580b39ed19de5bba.jpg', 'prefix': '<OD>', 'suffix': 'ace of birds<loc_180><loc_323><loc_416><loc_725>2 of birds<loc_374><loc_577><loc_671><loc_946>3 of birds<loc_300><loc_16><loc_572><loc_485>4 of birds<loc_535><loc_214><loc_973><loc_769>'}
+{'image': 'IMG_20220316_144650_jpg.rf.34b246c8ee646cbb5979a35d68c58901.jpg', 'prefix': '<OD>', 'suffix': '5 of birds<loc_606><loc_265><loc_919><loc_746>6 of birds<loc_258><loc_2><loc_689><loc_439>7 of birds<loc_1><loc_295><loc_330><loc_841>8 of birds<loc_271><loc_352><loc_646><loc_875>'}
+{'image': 'IMG_20220316_161217_jpg.rf.1755e5fefb14ca6df49690604289bb46.jpg', 'prefix': '<OD>', 'suffix': '9 of birds<loc_20><loc_195><loc_275><loc_491>10 of birds<loc_268><loc_188><loc_490><loc_495>jack of birds<loc_491><loc_152><loc_735><loc_452>queen of birds<loc_713><loc_145><loc_991><loc_441>king of birds<loc_298><loc_480><loc_607><loc_935>'}
+{'image': 'IMG_20220316_144711_jpg.rf.19a6cc13f83f27b45e10a6056bb25721.jpg', 'prefix': '<OD>', 'suffix': '5 of birds<loc_738><loc_233><loc_997><loc_663>6 of birds<loc_20><loc_202><loc_258><loc_630>7 of birds<loc_476><loc_220><loc_734><loc_649>8 of birds<loc_264><loc_205><loc_491><loc_630>'}
+{'image': 'IMG_20220316_164257_jpg.rf.0c3abfccf0f7f147946c89251b87f598.jpg', 'prefix': '<OD>', 'suffix': '5 of cats<loc_49><loc_312><loc_290><loc_621>6 of cats<loc_681><loc_245><loc_920><loc_554>7 of cats<loc_242><loc_202><loc_492><loc_559>8 of cats<loc_468><loc_261><loc_687><loc_592>'}
+{'image': 'IMG_20220316_170755_jpg.rf.71804c21e1c7681d5b656427449e0a2a.jpg', 'prefix': '<OD>', 'suffix': 'ace of dogs<loc_47><loc_95><loc_391><loc_754>2 of dogs<loc_577><loc_238><loc_737><loc_674>3 of dogs<loc_723><loc_348><loc_861><loc_708>4 of dogs<loc_366><loc_189><loc_602><loc_711>'}
+{'image': 'IMG_20220316_141442_jpg.rf.41913768e5d56c57566ee3b45391470d.jpg', 'prefix': '<OD>', 'suffix': '9 of fishs<loc_509><loc_233><loc_768><loc_603>10 of fishs<loc_202><loc_86><loc_530><loc_413>jack of fishs<loc_597><loc_476><loc_836><loc_895>queen of fishs<loc_370><loc_441><loc_635><loc_916>king of fishs<loc_145><loc_312><loc_482><loc_782>'}
+{'image': 'IMG_20220316_140723_jpg.rf.826bc2b212c4001c11115ef28f58d142.jpg', 'prefix': '<OD>', 'suffix': '6 of fishs<loc_23><loc_341><loc_329><loc_713>7 of fishs<loc_228><loc_259><loc_442><loc_527>5 of fishs<loc_438><loc_254><loc_709><loc_555>8 of fishs<loc_631><loc_284><loc_999><loc_652>'}
+{'image': 'IMG_20220316_134701_jpg.rf.27aa29de9d6012ae05c64b156f7c07b8.jpg', 'prefix': '<OD>', 'suffix': 'ace of fishs<loc_197><loc_114><loc_489><loc_480>2 of fishs<loc_270><loc_581><loc_538><loc_999>3 of fishs<loc_470><loc_398><loc_718><loc_782>4 of fishs<loc_580><loc_213><loc_824><loc_543>'}
+{'image': 'IMG_20220316_163749_jpg.rf.3c41ec25d2a8390c760eb7fcf2d2466b.jpg', 'prefix': '<OD>', 'suffix': '5 of cats<loc_0><loc_276><loc_289><loc_582>6 of cats<loc_256><loc_286><loc_497><loc_566>7 of cats<loc_485><loc_282><loc_751><loc_559>8 of cats<loc_698><loc_254><loc_995><loc_548>'}
+{'image': 'IMG_20220316_165206_jpg.rf.1e20afbea0b132989e944ffbd800f348.jpg', 'prefix': '<OD>', 'suffix': '9 of cats<loc_57><loc_220><loc_401><loc_691>jack of cats<loc_420><loc_30><loc_713><loc_468>queen of cats<loc_266><loc_490><loc_563><loc_971>king of cats<loc_397><loc_422><loc_797><loc_948>10 of cats<loc_217><loc_74><loc_500><loc_557>'}
+{'image': 'IMG_20220316_144500_jpg.rf.14c3cb5eadd6c3d449916f52d9f9381e.jpg', 'prefix': '<OD>', 'suffix': '5 of birds<loc_43><loc_27><loc_396><loc_608>6 of birds<loc_348><loc_191><loc_610><loc_702>7 of birds<loc_577><loc_326><loc_790><loc_785>8 of birds<loc_795><loc_452><loc_973><loc_883>'}
+{'image': 'IMG_20220316_172537_jpg.rf.8fe076e115c111f04732f8e7f778e51d.jpg', 'prefix': '<OD>', 'suffix': '9 of dogs<loc_635><loc_348><loc_998><loc_738>10 of dogs<loc_1><loc_445><loc_320><loc_852>jack  of dogs<loc_267><loc_177><loc_467><loc_409>queen of dogs<loc_473><loc_141><loc_722><loc_358>king of dogs<loc_340><loc_380><loc_634><loc_788>'}
+{'image': 'IMG_20220316_144652_jpg.rf.316d0dad84d3d696fa8fa3caf53fb700.jpg', 'prefix': '<OD>', 'suffix': '5 of birds<loc_451><loc_505><loc_820><loc_980>6 of birds<loc_519><loc_101><loc_773><loc_495>7 of birds<loc_177><loc_68><loc_452><loc_427>8 of birds<loc_291><loc_301><loc_559><loc_730>'}
+{'image': 'IMG_20220316_173229_jpg.rf.1b93cc9a66ca2d0a28820a7dae74222a.jpg', 'prefix': '<OD>', 'suffix': '9 of dogs<loc_514><loc_426><loc_769><loc_895>10 of dogs<loc_650><loc_168><loc_978><loc_680>jack  of dogs<loc_151><loc_368><loc_432><loc_865>queen of dogs<loc_365><loc_82><loc_657><loc_546>king of dogs<loc_67><loc_272><loc_327><loc_786>'}
+{'image': 'IMG_20220316_171916_jpg.rf.7c8b85f64455e815acc30b5111422bf2.jpg', 'prefix': '<OD>', 'suffix': '5 of dogs<loc_402><loc_294><loc_713><loc_806>6 of dogs<loc_175><loc_587><loc_334><loc_982>7 of dogs<loc_509><loc_5><loc_893><loc_552>8 of dogs<loc_273><loc_414><loc_491><loc_867>'}
+{'image': 'IMG_20220316_165737_jpg.rf.11f2e19b001c300ee820e093b135409a.jpg', 'prefix': '<OD>', 'suffix': '9 of cats<loc_141><loc_350><loc_441><loc_848>jack of cats<loc_257><loc_107><loc_513><loc_429>queen of cats<loc_354><loc_405><loc_688><loc_926>king of cats<loc_201><loc_223><loc_477><loc_597>10 of cats<loc_519><loc_114><loc_783><loc_467>'}
+{'image': 'IMG_20220316_170523_jpg.rf.a106d534bf3279ec40e771452a142c1e.jpg', 'prefix': '<OD>', 'suffix': 'ace of dogs<loc_1><loc_295><loc_290><loc_616>2 of dogs<loc_250><loc_296><loc_509><loc_604>3 of dogs<loc_493><loc_275><loc_765><loc_588>4 of dogs<loc_700><loc_261><loc_999><loc_577>'}
+{'image': 'IMG_20220316_171810_jpg.rf.be96dac3cbdda6973a920a0a787b33f2.jpg', 'prefix': '<OD>', 'suffix': '5 of dogs<loc_546><loc_83><loc_699><loc_530>6 of dogs<loc_301><loc_438><loc_610><loc_906>7 of dogs<loc_363><loc_66><loc_638><loc_515>8 of dogs<loc_238><loc_163><loc_604><loc_547>'}
+{'image': 'IMG_20220316_171557_jpg.rf.9c6f913ba56e578ef4c31cc3faffcf7d.jpg', 'prefix': '<OD>', 'suffix': '5 of dogs<loc_289><loc_130><loc_556><loc_277>6 of dogs<loc_80><loc_391><loc_465><loc_838>7 of dogs<loc_255><loc_213><loc_659><loc_579>8 of dogs<loc_438><loc_258><loc_680><loc_603>'}
+{'image': 'IMG_20220316_140335_jpg.rf.c3310d8f13f66189440daf8419b1ad9c.jpg', 'prefix': '<OD>', 'suffix': '7 of fishs<loc_85><loc_452><loc_295><loc_678>6 of fishs<loc_291><loc_413><loc_488><loc_648>8 of fishs<loc_491><loc_405><loc_696><loc_626>5 of fishs<loc_708><loc_418><loc_961><loc_652>'}
+{'image': 'IMG_20220316_135021_jpg.rf.d038afdef1a927103dae268ff392888f.jpg', 'prefix': '<OD>', 'suffix': 'ace of fishs<loc_514><loc_234><loc_814><loc_523>2 of fishs<loc_326><loc_506><loc_596><loc_916>3 of fishs<loc_601><loc_515><loc_921><loc_937>4 of fishs<loc_219><loc_22><loc_528><loc_347>'}
+{'image': 'IMG_20220316_163800_jpg.rf.e1ad0b7b78e379d5f9f0bec787a9050b.jpg', 'prefix': '<OD>', 'suffix': '5 of cats<loc_26><loc_100><loc_376><loc_752>6 of cats<loc_377><loc_223><loc_612><loc_725>7 of cats<loc_591><loc_290><loc_779><loc_712>8 of cats<loc_760><loc_336><loc_916><loc_702>'}
+{'image': 'IMG_20220316_165711_jpg.rf.c64b84b61322947a5a8c7545e214278c.jpg', 'prefix': '<OD>', 'suffix': '9 of cats<loc_437><loc_86><loc_609><loc_363>10 of cats<loc_217><loc_123><loc_434><loc_458>jack of cats<loc_69><loc_504><loc_395><loc_995>queen of cats<loc_366><loc_530><loc_610><loc_951>king of cats<loc_553><loc_425><loc_745><loc_759>'}
+{'image': 'IMG_20220316_141917_jpg.rf.b3075e4161fe5fa2285d75bb2da3bc7a.jpg', 'prefix': '<OD>', 'suffix': '9 of fishs<loc_682><loc_295><loc_951><loc_709>10 of fishs<loc_63><loc_502><loc_293><loc_807>jack of fishs<loc_462><loc_345><loc_727><loc_729>queen of fishs<loc_256><loc_486><loc_497><loc_819>king of fishs<loc_113><loc_191><loc_411><loc_488>'}
+{'image': 'IMG_20220316_144507_jpg.rf.ab9b95792bbdbe7694910967641fba38.jpg', 'prefix': '<OD>', 'suffix': '5 of birds<loc_1><loc_339><loc_343><loc_888>6 of birds<loc_327><loc_326><loc_563><loc_773>7 of birds<loc_573><loc_320><loc_810><loc_735>8 of birds<loc_784><loc_320><loc_999><loc_684>'}
+{'image': 'IMG_20220316_161515_jpg.rf.b22dd5d2b8037f009a9e65052d2d3b5c.jpg', 'prefix': '<OD>', 'suffix': '9 of birds<loc_91><loc_238><loc_309><loc_620>10 of birds<loc_708><loc_179><loc_900><loc_532>jack of birds<loc_470><loc_161><loc_670><loc_525>queen of birds<loc_386><loc_495><loc_689><loc_816>king of birds<loc_282><loc_163><loc_490><loc_532>'}
+{'image': 'IMG_20220316_164227_jpg.rf.e42455b79bae041d959f90597b7065bc.jpg', 'prefix': '<OD>', 'suffix': '5 of cats<loc_59><loc_326><loc_274><loc_720>6 of cats<loc_740><loc_223><loc_955><loc_625>7 of cats<loc_260><loc_206><loc_523><loc_639>8 of cats<loc_485><loc_252><loc_760><loc_695>'}
+{'image': 'IMG_20220316_135012_jpg.rf.ee7179374d33235528db011cb5418226.jpg', 'prefix': '<OD>', 'suffix': 'ace of fishs<loc_290><loc_323><loc_573><loc_639>2 of fishs<loc_330><loc_652><loc_617><loc_986>3 of fishs<loc_570><loc_447><loc_856><loc_816>4 of fishs<loc_375><loc_49><loc_658><loc_361>'}
+{'image': 'IMG_20220316_161525_jpg.rf.f7962cdc50e0a3cd03e4c081a1d2a67a.jpg', 'prefix': '<OD>', 'suffix': '9 of birds<loc_55><loc_363><loc_334><loc_798>10 of birds<loc_607><loc_155><loc_863><loc_418>jack of birds<loc_400><loc_204><loc_670><loc_516>queen of birds<loc_423><loc_432><loc_778><loc_795>king of birds<loc_224><loc_244><loc_499><loc_602>'}
+{'image': 'IMG_20220316_141940_jpg.rf.e0ca71cfc8dc86a6624c3f90fe6c5e9e.jpg', 'prefix': '<OD>', 'suffix': '9 of fishs<loc_680><loc_333><loc_922><loc_778>10 of fishs<loc_105><loc_438><loc_451><loc_920>jack of fishs<loc_493><loc_230><loc_709><loc_655>queen of fishs<loc_219><loc_227><loc_489><loc_654>king of fishs<loc_366><loc_412><loc_728><loc_796>'}
+{'image': 'IMG_20220316_165221_jpg.rf.a5188caf60a7bd5e8c6dcf92d68f9505.jpg', 'prefix': '<OD>', 'suffix': '9 of cats<loc_43><loc_241><loc_334><loc_550>jack of cats<loc_470><loc_132><loc_757><loc_420>queen of cats<loc_347><loc_275><loc_641><loc_586>king of cats<loc_295><loc_511><loc_623><loc_948>10 of cats<loc_596><loc_446><loc_906><loc_757>'}
+{'image': 'IMG_20220316_162828_jpg.rf.db7557485f5c3e5ce01b2e1ab3d4621e.jpg', 'prefix': '<OD>', 'suffix': 'ace of cats<loc_34><loc_73><loc_348><loc_538>2 of cats<loc_277><loc_523><loc_555><loc_978>3 of cats<loc_318><loc_91><loc_659><loc_498>4 of cats<loc_539><loc_447><loc_838><loc_917>'}
+{'image': 'IMG_20220316_171936_jpg.rf.b6e31b1cc6b5e14dc66462becfa4a63d.jpg', 'prefix': '<OD>', 'suffix': '5 of dogs<loc_329><loc_499><loc_623><loc_984>6 of dogs<loc_353><loc_2><loc_748><loc_355>7 of dogs<loc_390><loc_238><loc_699><loc_731>8 of dogs<loc_38><loc_416><loc_420><loc_784>'}
+{'image': 'IMG_20220316_172800_jpg.rf.e63a3bae897cbf27dccbf969f543bb6f.jpg', 'prefix': '<OD>', 'suffix': '9 of dogs<loc_708><loc_168><loc_906><loc_501>10 of dogs<loc_42><loc_430><loc_409><loc_877>jack  of dogs<loc_481><loc_126><loc_678><loc_445>queen of dogs<loc_232><loc_158><loc_440><loc_488>king of dogs<loc_448><loc_459><loc_737><loc_773>'}
+{'image': 'IMG_20220316_171653_jpg.rf.637e380597f1d844654a33f4a3555471.jpg', 'prefix': '<OD>', 'suffix': '5 of dogs<loc_102><loc_473><loc_418><loc_800>6 of dogs<loc_423><loc_495><loc_798><loc_862>7 of dogs<loc_466><loc_100><loc_816><loc_400>8 of dogs<loc_173><loc_157><loc_466><loc_480>'}
+{'image': 'IMG_20220316_140241_jpg.rf.c44522806b5455bfb03a638aa3ffa896.jpg', 'prefix': '<OD>', 'suffix': '5 of fishs<loc_300><loc_409><loc_405><loc_673>6 of fishs<loc_372><loc_380><loc_489><loc_688>7 of fishs<loc_456><loc_334><loc_623><loc_747>8 of fishs<loc_598><loc_236><loc_880><loc_830>'}
+
+
+
+
# @title Define `DetectionDataset` class
+
+class JSONLDataset:
+    def __init__(self, jsonl_file_path: str, image_directory_path: str):
+        self.jsonl_file_path = jsonl_file_path
+        self.image_directory_path = image_directory_path
+        self.entries = self._load_entries()
+
+    def _load_entries(self) -> List[Dict[str, Any]]:
+        entries = []
+        with open(self.jsonl_file_path, 'r') as file:
+            for line in file:
+                data = json.loads(line)
+                entries.append(data)
+        return entries
+
+    def __len__(self) -> int:
+        return len(self.entries)
+
+    def __getitem__(self, idx: int) -> Tuple[Image.Image, Dict[str, Any]]:
+        if idx < 0 or idx >= len(self.entries):
+            raise IndexError("Index out of range")
+
+        entry = self.entries[idx]
+        image_path = os.path.join(self.image_directory_path, entry['image'])
+        try:
+            image = Image.open(image_path)
+            return (image, entry)
+        except FileNotFoundError:
+            raise FileNotFoundError(f"Image file {image_path} not found.")
+
+
+class DetectionDataset(Dataset):
+    def __init__(self, jsonl_file_path: str, image_directory_path: str):
+        self.dataset = JSONLDataset(jsonl_file_path, image_directory_path)
+
+    def __len__(self):
+        return len(self.dataset)
+
+    def __getitem__(self, idx):
+        image, data = self.dataset[idx]
+        prefix = data['prefix']
+        suffix = data['suffix']
+        return prefix, suffix, image
+
+
+
# @title Initialize `DetectionDataset` and `DataLoader` for train and validation subsets
+
+BATCH_SIZE = 6
+NUM_WORKERS = 0
+
+def collate_fn(batch):
+    questions, answers, images = zip(*batch)
+    inputs = processor(text=list(questions), images=list(images), return_tensors="pt", padding=True).to(DEVICE)
+    return inputs, answers
+
+train_dataset = DetectionDataset(
+    jsonl_file_path=f"{dataset.location}/train/annotations.jsonl",
+    image_directory_path=f"{dataset.location}/train/"
+)
+val_dataset = DetectionDataset(
+    jsonl_file_path=f"{dataset.location}/valid/annotations.jsonl",
+    image_directory_path=f"{dataset.location}/valid/"
+)
+
+train_loader = DataLoader(train_dataset, batch_size=BATCH_SIZE, collate_fn=collate_fn, num_workers=NUM_WORKERS, shuffle=True)
+val_loader = DataLoader(val_dataset, batch_size=BATCH_SIZE, collate_fn=collate_fn, num_workers=NUM_WORKERS)
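The `zip(*batch)` transpose inside `collate_fn` is the key step: the `DataLoader` hands `collate_fn` a list of `(prefix, suffix, image)` tuples, and `zip(*...)` regroups them into parallel tuples of prefixes, suffixes, and images before they reach the processor. A minimal, processor-free sketch with placeholder values:

```python
# Dummy batch of (prefix, suffix, image) tuples, standing in for the
# samples DetectionDataset yields; the strings here are placeholders.
batch = [
    ("<OD>", "cat<loc_1>", "img0"),
    ("<OD>", "dog<loc_2>", "img1"),
]

# zip(*batch) transposes the list of rows into tuples of columns,
# so all prompts, all answers, and all images end up grouped together.
questions, answers, images = zip(*batch)

print(questions)  # ('<OD>', '<OD>')
print(answers)    # ('cat<loc_1>', 'dog<loc_2>')
print(images)     # ('img0', 'img1')
```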
+
+
+
# @title Setup LoRA Florence-2 model
+
+config = LoraConfig(
+    r=8,
+    lora_alpha=8,
+    target_modules=["q_proj", "o_proj", "k_proj", "v_proj", "linear", "Conv2d", "lm_head", "fc2"],
+    task_type="CAUSAL_LM",
+    lora_dropout=0.05,
+    bias="none",
+    inference_mode=False,
+    use_rslora=True,
+    init_lora_weights="gaussian",
+)
+
+peft_model = get_peft_model(model, config)
+peft_model.print_trainable_parameters()
+
+
trainable params: 1,929,928 || all params: 272,733,896 || trainable%: 0.7076
+
+
+
+
torch.cuda.empty_cache()
+
+
+
# @title Run inference with pre-trained Florence-2 model on validation dataset
+
+def render_inline(image: Image.Image, resize=(128, 128)):
+    """Convert image into inline html."""
+    image = image.resize(resize)
+    with io.BytesIO() as buffer:
+        image.save(buffer, format='jpeg')
+        image_b64 = str(base64.b64encode(buffer.getvalue()), "utf-8")
+        return f"data:image/jpeg;base64,{image_b64}"
+
+
+def render_example(image: Image.Image, response):
+    try:
+        detections = sv.Detections.from_lmm(sv.LMM.FLORENCE_2, response, resolution_wh=image.size)
+        image = sv.BoundingBoxAnnotator(color_lookup=sv.ColorLookup.INDEX).annotate(image.copy(), detections)
+        image = sv.LabelAnnotator(color_lookup=sv.ColorLookup.INDEX).annotate(image, detections)
+    except Exception:
+        print('failed to render model response')
+    return f"""
+<div style="display: inline-flex; align-items: center; justify-content: center;">
+    <img style="width:256px; height:256px;" src="{render_inline(image, resize=(128, 128))}" />
+    <p style="width:512px; margin:10px; font-size:small;">{html.escape(json.dumps(response))}</p>
+</div>
+"""
+
+
+def render_inference_results(model, dataset: DetectionDataset, count: int):
+    html_out = ""
+    count = min(count, len(dataset))
+    for i in range(count):
+        image, data = dataset.dataset[i]
+        prefix = data['prefix']
+        suffix = data['suffix']
+        inputs = processor(text=prefix, images=image, return_tensors="pt").to(DEVICE)
+        generated_ids = model.generate(
+            input_ids=inputs["input_ids"],
+            pixel_values=inputs["pixel_values"],
+            max_new_tokens=1024,
+            num_beams=3
+        )
+        generated_text = processor.batch_decode(generated_ids, skip_special_tokens=False)[0]
+        answer = processor.post_process_generation(generated_text, task='<OD>', image_size=image.size)
+        html_out += render_example(image, answer)
+
+    display(HTML(html_out))
+
+render_inference_results(peft_model, val_dataset, 4)
+
+
BoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.
+
+
+ +
+ +

{"<OD>": {"bboxes": [[0.3199999928474426, 0.3199999928474426, 639.0399780273438, 639.0399780273438]], "labels": ["bed"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[0.3199999928474426, 0.3199999928474426, 639.0399780273438, 639.0399780273438]], "labels": ["table"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[0.3199999928474426, 0.3199999928474426, 639.0399780273438, 639.0399780273438]], "labels": ["chair"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[0.3199999928474426, 0.3199999928474426, 639.0399780273438, 639.0399780273438]], "labels": ["furniture"]}}

+
+
+
+
+
+
+

Fine-tune Florence-2 on custom object detection dataset

+
+
# @title Define train loop
+
+def train_model(train_loader, val_loader, model, processor, epochs=10, lr=1e-6):
+    optimizer = AdamW(model.parameters(), lr=lr)
+    num_training_steps = epochs * len(train_loader)
+    lr_scheduler = get_scheduler(
+        name="linear",
+        optimizer=optimizer,
+        num_warmup_steps=0,
+        num_training_steps=num_training_steps,
+    )
+
+    render_inference_results(model, val_loader.dataset, 6)
+
+    for epoch in range(epochs):
+        model.train()
+        train_loss = 0
+        for inputs, answers in tqdm(train_loader, desc=f"Training Epoch {epoch + 1}/{epochs}"):
+
+            input_ids = inputs["input_ids"]
+            pixel_values = inputs["pixel_values"]
+            labels = processor.tokenizer(
+                text=answers,
+                return_tensors="pt",
+                padding=True,
+                return_token_type_ids=False
+            ).input_ids.to(DEVICE)
+
+            outputs = model(input_ids=input_ids, pixel_values=pixel_values, labels=labels)
+            loss = outputs.loss
+
+            loss.backward()
+            optimizer.step()
+            lr_scheduler.step()
+            optimizer.zero_grad()
+            train_loss += loss.item()
+
+        avg_train_loss = train_loss / len(train_loader)
+        print(f"Average Training Loss: {avg_train_loss}")
+
+        model.eval()
+        val_loss = 0
+        with torch.no_grad():
+            for inputs, answers in tqdm(val_loader, desc=f"Validation Epoch {epoch + 1}/{epochs}"):
+
+                input_ids = inputs["input_ids"]
+                pixel_values = inputs["pixel_values"]
+                labels = processor.tokenizer(
+                    text=answers,
+                    return_tensors="pt",
+                    padding=True,
+                    return_token_type_ids=False
+                ).input_ids.to(DEVICE)
+
+                outputs = model(input_ids=input_ids, pixel_values=pixel_values, labels=labels)
+                loss = outputs.loss
+
+                val_loss += loss.item()
+
+            avg_val_loss = val_loss / len(val_loader)
+            print(f"Average Validation Loss: {avg_val_loss}")
+
+            render_inference_results(model, val_loader.dataset, 6)
+
+        output_dir = f"./model_checkpoints/epoch_{epoch+1}"
+        os.makedirs(output_dir, exist_ok=True)
+        model.save_pretrained(output_dir)
+        processor.save_pretrained(output_dir)
+
+
+
%%time
+
+EPOCHS = 10
+LR = 5e-6
+
+train_model(train_loader, val_loader, peft_model, processor, epochs=EPOCHS, lr=LR)
+
+
This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning
+BoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.
+
+
+ +
+ +

{"<OD>": {"bboxes": [[434.8800048828125, 140.47999572753906, 460.47998046875, 178.239990234375]], "labels": ["human face"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[0.3199999928474426, 0.3199999928474426, 639.0399780273438, 639.0399780273438]], "labels": ["table"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[321.6000061035156, 212.1599884033203, 343.3599853515625, 242.87998962402344]], "labels": ["human face"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[0.3199999928474426, 0.3199999928474426, 639.0399780273438, 639.0399780273438]], "labels": ["furniture"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[0.3199999928474426, 0.3199999928474426, 639.0399780273438, 639.0399780273438]], "labels": ["bed"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[0.3199999928474426, 0.3199999928474426, 639.0399780273438, 639.0399780273438]], "labels": ["trousers"]}}

+
+
+
+
Training Epoch 1/10: 100%|██████████| 136/136 [01:32<00:00,  1.48it/s]
+
+
+
Average Training Loss: 5.180607767666087
+
+
+
Validation Epoch 1/10: 100%|██████████| 8/8 [00:02<00:00,  2.76it/s]
+BoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.
+
+
+
Average Validation Loss: 3.619225859642029
+
+
+
BoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.
+
+
+ +
+ +

{"<OD>": {"bboxes": [[372.79998779296875, 112.95999908447266, 512.3200073242188, 356.79998779296875]], "labels": ["queen of spades"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[0.3199999928474426, 129.59999084472656, 267.8399963378906, 411.8399963378906]], "labels": ["6 of 6 of 7 of 8 of 9 of 9's of 7's of 8's of 5's of 6's of 4's of 9' of 5' of 4' of 6' of 7' of 8' of 10's of 10' of 12's of 11's of 16's of 17's of 18's of 19's of 20's of 30's of 40's of 50's of 70's of 80's of eight's of ten's of four's of seven's of all sorts of sorts of eight' of sorts's of sorts"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[19.520000457763672, 289.6000061035156, 223.0399932861328, 580.7999877929688], [186.55999755859375, 274.239990234375, 395.8399963378906, 511.03997802734375]], "labels": ["queen of spades", "queen of spades"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [], "labels": []}}

+
+ +
+ +

{"<OD>": {"bboxes": [[15.039999961853027, 254.39999389648438, 213.44000244140625, 464.9599914550781], [463.67999267578125, 222.39999389648438, 635.8399658203125, 405.44000244140625], [329.2799987792969, 192.3199920654297, 466.8799743652344, 397.1199951171875], [208.95999145507812, 285.1199951171875, 345.91998291015625, 461.7599792480469]], "labels": ["queen of spades", "queen of spades", "queen of spades", "queen of spades"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[294.0799865722656, 176.95999145507812, 624.3200073242188, 398.3999938964844]], "labels": ["6 of spades"]}}

+
+
+
+
Setting `save_embedding_layers` to `True` as embedding layers found in `target_modules`.
+Training Epoch 2/10: 100%|██████████| 136/136 [01:32<00:00,  1.47it/s]
+
+
+
Average Training Loss: 3.457882178180358
+
+
+
Validation Epoch 2/10: 100%|██████████| 8/8 [00:02<00:00,  2.75it/s]
+
+
+
Average Validation Loss: 2.489599049091339
+
+
+
BoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.
+
+
+ +
+ +

{"<OD>": {"bboxes": [[372.79998779296875, 112.95999908447266, 512.3200073242188, 356.79998779296875], [162.87998962402344, 330.55999755859375, 301.1199951171875, 585.2799682617188], [52.79999923706055, 239.0399932861328, 166.0800018310547, 469.44000244140625]], "labels": ["queen of cats", "king of cats", "9 of cats"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[0.3199999928474426, 128.3199920654297, 267.8399963378906, 411.8399963378906], [198.0800018310547, 82.23999786376953, 381.1199951171875, 323.5199890136719], [330.55999755859375, 42.55999755859375, 516.7999877929688, 207.0399932861328]], "labels": ["6 of cats", "7 of cats", "5 of cats"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[18.8799991607666, 289.6000061035156, 223.0399932861328, 581.4400024414062], [368.3199768066406, 235.1999969482422, 517.4400024414062, 490.55999755859375], [185.27999877929688, 273.6000061035156, 396.47998046875, 511.67999267578125], [86.08000183105469, 164.16000366210938, 255.67999267578125, 403.5199890136719], [254.39999389648438, 167.36000061035156, 389.44000244140625, 317.1199951171875]], "labels": ["queen of cats", "9 of cats", "queen's of cats", "queen cats", "queen card"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[55.36000061035156, 228.79998779296875, 331.8399963378906, 637.1199951171875], [331.1999816894531, 151.36000061035156, 479.03997802734375, 447.03997802734375], [296.6399841308594, 253.1199951171875, 459.8399963378906, 550.719970703125], [436.79998779296875, 157.1199951171875, 557.760009765625, 392.0]], "labels": ["8 of cats", "6 of cats", "7 of cats", "6 of dogs"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[329.2799987792969, 192.3199920654297, 466.8799743652344, 397.7599792480469], [208.95999145507812, 285.1199951171875, 345.91998291015625, 461.7599792480469], [14.399999618530273, 254.39999389648438, 214.0800018310547, 464.9599914550781], [463.03997802734375, 222.39999389648438, 636.47998046875, 406.0799865722656]], "labels": ["6 of cats", "7 of cats", "8 of cats", "5 of cats"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[97.5999984741211, 433.5999755859375, 314.55999755859375, 563.5199584960938], [292.1600036621094, 176.3199920654297, 624.9599609375, 399.03997802734375], [10.559999465942383, 228.1599884033203, 275.5199890136719, 427.8399963378906]], "labels": ["5 of cats", "9 of cats", "7 of cats"]}}

+
+
+
+
Training Epoch 3/10: 100%|██████████| 136/136 [01:32<00:00,  1.47it/s]
+
+
+
Average Training Loss: 2.4063032251947067
+
+
+
Validation Epoch 3/10: 100%|██████████| 8/8 [00:02<00:00,  2.76it/s]
+
+
+
Average Validation Loss: 1.7791805267333984
+
+
+
BoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.
+
+
+ +
+ +

{"<OD>": {"bboxes": [[372.79998779296875, 113.5999984741211, 512.3200073242188, 356.79998779296875], [161.59999084472656, 330.55999755859375, 301.1199951171875, 585.2799682617188], [52.79999923706055, 239.67999267578125, 166.0800018310547, 469.44000244140625], [310.0799865722656, 360.6399841308594, 445.7599792480469, 616.6400146484375]], "labels": ["queen of cats", "king of cats", "9 of cats", "10 of cats"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[0.3199999928474426, 127.68000030517578, 267.1999816894531, 410.55999755859375], [200.63999938964844, 82.23999786376953, 381.1199951171875, 323.5199890136719], [333.1199951171875, 42.55999755859375, 516.7999877929688, 207.0399932861328]], "labels": ["6 of dogs", "7 of dogs", "5 of dogs"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[19.520000457763672, 289.6000061035156, 223.0399932861328, 580.7999877929688], [368.3199768066406, 234.55999755859375, 517.4400024414062, 490.55999755859375], [185.9199981689453, 273.6000061035156, 395.8399963378906, 511.03997802734375], [86.08000183105469, 163.51998901367188, 255.0399932861328, 401.6000061035156], [255.0399932861328, 167.36000061035156, 388.79998779296875, 317.1199951171875]], "labels": ["queen of birds", "9 of birds", "king of birds", "queen birds", "queen fish"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[55.36000061035156, 228.79998779296875, 333.1199951171875, 637.1199951171875], [437.44000244140625, 157.1199951171875, 557.760009765625, 391.3599853515625], [296.6399841308594, 253.1199951171875, 459.8399963378906, 550.0800170898438], [331.1999816894531, 150.0800018310547, 479.67999267578125, 447.03997802734375]], "labels": ["8 of birds", "6 of birds", "7 of birds", "5 of birds"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[328.6399841308594, 192.3199920654297, 467.5199890136719, 397.7599792480469], [208.95999145507812, 285.1199951171875, 346.55999755859375, 461.7599792480469], [14.399999618530273, 254.39999389648438, 214.0800018310547, 465.5999755859375], [462.3999938964844, 221.1199951171875, 636.47998046875, 407.3599853515625]], "labels": ["6 of birds", "7 of birds", "8 of birds", "5 of birds"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[97.5999984741211, 433.5999755859375, 314.55999755859375, 563.5199584960938], [308.1600036621094, 423.3599853515625, 548.7999877929688, 563.5199584960938], [10.559999465942383, 228.1599884033203, 275.5199890136719, 427.8399963378906]], "labels": ["5 of cats", "9 of cats", "7 of cats"]}}

+
+
+
+
Training Epoch 4/10: 100%|██████████| 136/136 [01:32<00:00,  1.47it/s]
+
+
+
Average Training Loss: 1.849329402341562
+
+
+
Validation Epoch 4/10: 100%|██████████| 8/8 [00:02<00:00,  2.72it/s]
+
+
+
Average Validation Loss: 1.6737226694822311
+
+
+
BoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.
+
+
+ +
+ +

{"<OD>": {"bboxes": [[372.79998779296875, 113.5999984741211, 512.3200073242188, 358.0799865722656], [162.239990234375, 330.55999755859375, 301.1199951171875, 585.2799682617188], [52.79999923706055, 239.0399932861328, 167.36000061035156, 470.0799865722656], [310.0799865722656, 360.6399841308594, 445.7599792480469, 616.6400146484375]], "labels": ["queen of birds", "king of birds", "9 of birds", "10 of birds"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[333.1199951171875, 43.20000076293945, 516.7999877929688, 207.0399932861328], [0.3199999928474426, 128.3199920654297, 267.1999816894531, 410.55999755859375], [200.0, 83.5199966430664, 381.1199951171875, 323.5199890136719]], "labels": ["5 of cats", "6 of cats", "7 of cats"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[369.6000061035156, 234.55999755859375, 518.0800170898438, 490.55999755859375], [256.32000732421875, 168.0, 388.79998779296875, 317.1199951171875], [18.8799991607666, 289.6000061035156, 223.0399932861328, 582.0800170898438], [186.55999755859375, 273.6000061035156, 396.47998046875, 512.3200073242188], [87.36000061035156, 164.8000030517578, 255.0399932861328, 403.5199890136719]], "labels": ["9 of cats", "queen of cats", "10 of cats", "king of cats", "queen OF cats"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[437.44000244140625, 157.1199951171875, 556.47998046875, 390.7200012207031], [298.55999755859375, 254.39999389648438, 457.2799987792969, 549.4400024414062], [333.1199951171875, 151.36000061035156, 479.03997802734375, 447.03997802734375], [56.0, 228.79998779296875, 331.8399963378906, 637.1199951171875]], "labels": ["6 of dogs", "7 of dogs", "5 of dogs", "8 of dogs"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[328.6399841308594, 192.3199920654297, 467.5199890136719, 397.1199951171875], [208.95999145507812, 285.1199951171875, 346.55999755859375, 461.7599792480469], [15.039999961853027, 254.39999389648438, 214.0800018310547, 465.5999755859375], [464.3199768066406, 222.39999389648438, 636.47998046875, 406.0799865722656]], "labels": ["6 of birds", "7 of birds", "8 of birds", "5 of birds"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[97.5999984741211, 433.5999755859375, 314.55999755859375, 563.5199584960938], [310.0799865722656, 424.0, 548.7999877929688, 562.239990234375], [10.559999465942383, 228.1599884033203, 275.5199890136719, 427.8399963378906]], "labels": ["5 of cats", "6 of cats", "7 of cats"]}}

+
+
+
+
Training Epoch 5/10: 100%|██████████| 136/136 [01:32<00:00,  1.46it/s]
+
+
+
Average Training Loss: 1.7156715840101242
+
+
+
Validation Epoch 5/10: 100%|██████████| 8/8 [00:02<00:00,  2.74it/s]
+
+
+
Average Validation Loss: 1.6302864849567413
+
+
+
BoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.
+
+
+ +
+ +

{"<OD>": {"bboxes": [[371.5199890136719, 112.31999969482422, 512.9599609375, 358.0799865722656], [161.59999084472656, 329.91998291015625, 301.7599792480469, 585.2799682617188], [51.52000045776367, 238.39999389648438, 167.36000061035156, 470.0799865722656], [310.0799865722656, 358.0799865722656, 447.03997802734375, 616.6400146484375]], "labels": ["queen of birds", "king of birds", "9 of birds", "10 of birds"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[333.1199951171875, 43.20000076293945, 516.7999877929688, 207.0399932861328], [0.3199999928474426, 129.59999084472656, 267.1999816894531, 410.55999755859375], [200.63999938964844, 83.5199966430664, 381.1199951171875, 323.5199890136719]], "labels": ["5 of dogs", "6 of dogs", "7 of dogs"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[369.6000061035156, 234.55999755859375, 518.0800170898438, 490.55999755859375], [86.08000183105469, 162.87998962402344, 255.67999267578125, 401.6000061035156], [256.32000732421875, 167.36000061035156, 389.44000244140625, 317.1199951171875], [18.8799991607666, 289.6000061035156, 223.67999267578125, 582.0800170898438], [186.55999755859375, 273.6000061035156, 397.1199951171875, 512.3200073242188]], "labels": ["9 of birds", "queen of birds", "jack of birds", "10 of birds", "king of birds"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[438.0799865722656, 157.1199951171875, 556.47998046875, 390.0799865722656], [333.1199951171875, 150.72000122070312, 479.03997802734375, 447.03997802734375], [298.55999755859375, 254.39999389648438, 458.55999755859375, 550.0800170898438], [56.0, 228.79998779296875, 333.1199951171875, 637.1199951171875]], "labels": ["6 of dogs", "5 of dogs", "7 of dogs", "8 of dogs"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[328.6399841308594, 192.3199920654297, 467.5199890136719, 397.1199951171875], [208.95999145507812, 285.1199951171875, 346.55999755859375, 461.7599792480469], [15.039999961853027, 254.39999389648438, 214.0800018310547, 465.5999755859375], [464.3199768066406, 222.39999389648438, 636.47998046875, 406.0799865722656]], "labels": ["6 of birds", "7 of birds", "8 of birds", "5 of birds"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[97.5999984741211, 433.5999755859375, 314.55999755859375, 563.5199584960938], [310.0799865722656, 424.0, 548.7999877929688, 562.239990234375], [10.559999465942383, 228.1599884033203, 275.5199890136719, 427.8399963378906], [291.5199890136719, 176.3199920654297, 625.5999755859375, 399.03997802734375]], "labels": ["5 of cats", "6 of cats", "7 of cats", "8 of cats"]}}

+
+
+
+
Training Epoch 6/10: 100%|██████████| 136/136 [01:32<00:00,  1.46it/s]
+
+
+
Average Training Loss: 1.6734710993135677
+
+
+
Validation Epoch 6/10: 100%|██████████| 8/8 [00:02<00:00,  2.73it/s]
+
+
+
Average Validation Loss: 1.6009675413370132
+
+
+
BoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.
+
+
+ +
+ +

{"<OD>": {"bboxes": [[371.5199890136719, 112.31999969482422, 512.9599609375, 358.0799865722656], [161.59999084472656, 330.55999755859375, 301.7599792480469, 585.2799682617188], [51.52000045776367, 239.0399932861328, 168.63999938964844, 470.0799865722656], [310.0799865722656, 360.0, 447.03997802734375, 616.6400146484375]], "labels": ["queen of birds", "king of birds", "9 of birds", "10 of birds"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[333.1199951171875, 43.20000076293945, 516.7999877929688, 207.0399932861328], [0.3199999928474426, 129.59999084472656, 267.1999816894531, 410.55999755859375], [200.63999938964844, 83.5199966430664, 381.1199951171875, 323.5199890136719]], "labels": ["5 of dogs", "6 of dogs", "7 of dogs"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[369.6000061035156, 234.55999755859375, 518.0800170898438, 490.55999755859375], [86.08000183105469, 162.87998962402344, 255.67999267578125, 401.6000061035156], [256.32000732421875, 167.36000061035156, 389.44000244140625, 317.1199951171875], [18.8799991607666, 289.6000061035156, 223.67999267578125, 582.0800170898438], [186.55999755859375, 273.6000061035156, 396.47998046875, 511.03997802734375]], "labels": ["9 of birds", "queen of birds", "jack of birds", "10 of birds", "king of birds"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[438.0799865722656, 157.1199951171875, 556.47998046875, 390.0799865722656], [333.1199951171875, 150.72000122070312, 479.03997802734375, 447.03997802734375], [298.55999755859375, 254.39999389648438, 457.2799987792969, 550.0800170898438], [56.0, 228.79998779296875, 333.1199951171875, 637.1199951171875]], "labels": ["6 of dogs", "5 of dogs", "7 of dogs", "8 of dogs"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[328.6399841308594, 192.3199920654297, 467.5199890136719, 397.1199951171875], [464.3199768066406, 222.39999389648438, 636.47998046875, 405.44000244140625], [209.59999084472656, 285.1199951171875, 346.55999755859375, 461.7599792480469], [15.039999961853027, 254.39999389648438, 214.0800018310547, 464.9599914550781]], "labels": ["6 of birds", "5 of birds", "7 of birds", "8 of birds"]}}

+
+ +
+ +

{"<OD>": {"bboxes": [[97.5999984741211, 433.5999755859375, 314.55999755859375, 563.5199584960938], [310.0799865722656, 424.0, 548.7999877929688, 561.5999755859375], [10.559999465942383, 228.1599884033203, 275.5199890136719, 427.8399963378906], [291.5199890136719, 176.3199920654297, 625.5999755859375, 399.03997802734375]], "labels": ["5 of cats", "6 of cats", "7 of cats", "8 of cats"]}}

+
+
+
+
Training Epoch 7/10:  34%|███▍      | 46/136 [00:31<01:02,  1.44it/s]
+
+
+
+
---------------------------------------------------------------------------
+KeyboardInterrupt                         Traceback (most recent call last)
+File <timed exec>:4
+
+Cell In[29], line 29, in train_model(train_loader, val_loader, model, processor, epochs, lr)
+     21 pixel_values = inputs["pixel_values"]
+     22 labels = processor.tokenizer(
+     23     text=answers,
+     24     return_tensors="pt",
+     25     padding=True,
+     26     return_token_type_ids=False
+     27 ).input_ids.to(DEVICE)
+---> 29 outputs = model(input_ids=input_ids, pixel_values=pixel_values, labels=labels)
+     30 loss = outputs.loss
+     32 loss.backward(), optimizer.step(), lr_scheduler.step(), optimizer.zero_grad()
+
+File /opt/anaconda3/envs/zeel_py310/lib/python3.10/site-packages/torch/nn/modules/module.py:1736, in Module._wrapped_call_impl(self, *args, **kwargs)
+   1734     return self._compiled_call_impl(*args, **kwargs)  # type: ignore[misc]
+   1735 else:
+-> 1736     return self._call_impl(*args, **kwargs)
+
+File /opt/anaconda3/envs/zeel_py310/lib/python3.10/site-packages/torch/nn/modules/module.py:1747, in Module._call_impl(self, *args, **kwargs)
+   1742 # If we don't have any hooks, we want to skip the rest of the logic in
+   1743 # this function, and just call forward.
+   1744 if not (self._backward_hooks or self._backward_pre_hooks or self._forward_hooks or self._forward_pre_hooks
+   1745         or _global_backward_pre_hooks or _global_backward_hooks
+   1746         or _global_forward_hooks or _global_forward_pre_hooks):
+-> 1747     return forward_call(*args, **kwargs)
+   1749 result = None
+   1750 called_always_called_hooks = set()
+
+File /opt/anaconda3/envs/zeel_py310/lib/python3.10/site-packages/peft/peft_model.py:1719, in PeftModelForCausalLM.forward(self, input_ids, attention_mask, inputs_embeds, labels, output_attentions, output_hidden_states, return_dict, task_ids, **kwargs)
+   1717     with self._enable_peft_forward_hooks(**kwargs):
+   1718         kwargs = {k: v for k, v in kwargs.items() if k not in self.special_peft_forward_args}
+-> 1719         return self.base_model(
+   1720             input_ids=input_ids,
+   1721             attention_mask=attention_mask,
+   1722             inputs_embeds=inputs_embeds,
+   1723             labels=labels,
+   1724             output_attentions=output_attentions,
+   1725             output_hidden_states=output_hidden_states,
+   1726             return_dict=return_dict,
+   1727             **kwargs,
+   1728         )
+   1730 batch_size = _get_batch_size(input_ids, inputs_embeds)
+   1731 if attention_mask is not None:
+   1732     # concat prompt attention mask
+
+File /opt/anaconda3/envs/zeel_py310/lib/python3.10/site-packages/peft/tuners/tuners_utils.py:197, in BaseTuner.forward(self, *args, **kwargs)
+    196 def forward(self, *args: Any, **kwargs: Any):
+--> 197     return self.model.forward(*args, **kwargs)
+
+File ~/.cache/huggingface/modules/transformers_modules/microsoft/Florence-2-base-ft/9803f52844ec1ae5df004e6089262e9a23e527fd/modeling_florence2.py:2741, in Florence2ForConditionalGeneration.forward(self, input_ids, pixel_values, attention_mask, decoder_input_ids, decoder_attention_mask, head_mask, decoder_head_mask, cross_attn_head_mask, encoder_outputs, past_key_values, inputs_embeds, decoder_inputs_embeds, labels, use_cache, output_attentions, output_hidden_states, return_dict)
+   2739 if inputs_embeds is not None:
+   2740     attention_mask = attention_mask.to(inputs_embeds.dtype)
+-> 2741 outputs = self.language_model(
+   2742     attention_mask=attention_mask,
+   2743     labels=labels,
+   2744     inputs_embeds=inputs_embeds,
+   2745     decoder_input_ids=decoder_input_ids,
+   2746     encoder_outputs=encoder_outputs,
+   2747     decoder_attention_mask=decoder_attention_mask,
+   2748     head_mask=head_mask,
+   2749     decoder_head_mask=decoder_head_mask,
+   2750     cross_attn_head_mask=cross_attn_head_mask,
+   2751     past_key_values=past_key_values,
+   2752     decoder_inputs_embeds=decoder_inputs_embeds,
+   2753     use_cache=use_cache,
+   2754     output_attentions=output_attentions,
+   2755     output_hidden_states=output_hidden_states,
+   2756     return_dict=return_dict,
+   2757 )
+   2759 logits = outputs.logits
+   2760 logits = logits.float()
+
+File ~/.cache/huggingface/modules/transformers_modules/microsoft/Florence-2-base-ft/9803f52844ec1ae5df004e6089262e9a23e527fd/modeling_florence2.py:2140, in Florence2LanguageForConditionalGeneration.forward(self, input_ids, attention_mask, decoder_input_ids, decoder_attention_mask, head_mask, decoder_head_mask, cross_attn_head_mask, encoder_outputs, past_key_values, inputs_embeds, decoder_inputs_embeds, labels, use_cache, output_attentions, output_hidden_states, return_dict)
+   2135     if decoder_input_ids is None and decoder_inputs_embeds is None:
+   2136         decoder_input_ids = shift_tokens_right(
+   2137             labels, self.config.pad_token_id, self.config.decoder_start_token_id
+   2138         )
+-> 2140 outputs = self.model(
+   2141     input_ids,
+   2142     attention_mask=attention_mask,
+   2143     decoder_input_ids=decoder_input_ids,
+   2144     encoder_outputs=encoder_outputs,
+   2145     decoder_attention_mask=decoder_attention_mask,
+   2146     head_mask=head_mask,
+   2147     decoder_head_mask=decoder_head_mask,
+   2148     cross_attn_head_mask=cross_attn_head_mask,
+   2149     past_key_values=past_key_values,
+   2150     inputs_embeds=inputs_embeds,
+   2151     decoder_inputs_embeds=decoder_inputs_embeds,
+   2152     use_cache=use_cache,
+   2153     output_attentions=output_attentions,
+   2154     output_hidden_states=output_hidden_states,
+   2155     return_dict=return_dict,
+   2156 )
+   2158 lm_logits = self.lm_head(outputs[0])
+   2159 lm_logits = lm_logits + self.final_logits_bias.to(lm_logits.device)
+
+File ~/.cache/huggingface/modules/transformers_modules/microsoft/Florence-2-base-ft/9803f52844ec1ae5df004e6089262e9a23e527fd/modeling_florence2.py:2014, in Florence2LanguageModel.forward(self, input_ids, attention_mask, decoder_input_ids, decoder_attention_mask, head_mask, decoder_head_mask, cross_attn_head_mask, encoder_outputs, past_key_values, inputs_embeds, decoder_inputs_embeds, use_cache, output_attentions, output_hidden_states, return_dict)
+   2011 return_dict = return_dict if return_dict is not None else self.config.use_return_dict
+   2013 if encoder_outputs is None:
+-> 2014     encoder_outputs = self.encoder(
+   2015         input_ids=input_ids,
+   2016         attention_mask=attention_mask,
+   2017         head_mask=head_mask,
+   2018         inputs_embeds=inputs_embeds,
+   2019         output_attentions=output_attentions,
+   2020         output_hidden_states=output_hidden_states,
+   2021         return_dict=return_dict,
+   2022     )
+   2023 # If the user passed a tuple for encoder_outputs, we wrap it in a BaseModelOutput when return_dict=True
+   2024 elif return_dict and not isinstance(encoder_outputs, BaseModelOutput):
+
+File ~/.cache/huggingface/modules/transformers_modules/microsoft/Florence-2-base-ft/9803f52844ec1ae5df004e6089262e9a23e527fd/modeling_florence2.py:1593, in Florence2Encoder.forward(self, input_ids, attention_mask, head_mask, inputs_embeds, output_attentions, output_hidden_states, return_dict)
+   1588     attention_mask = attention_mask if 0 in attention_mask else None
+   1589 elif self._use_sdpa and head_mask is None and not output_attentions:
+   1590     # output_attentions=True & head_mask can not be supported when using SDPA, fall back to
+   1591     # the manual implementation that requires a 4D causal mask in all cases.
+   1592     # [bsz, seq_len] -> [bsz, 1, tgt_seq_len, src_seq_len]
+-> 1593     attention_mask = _prepare_4d_attention_mask_for_sdpa(attention_mask, inputs_embeds.dtype)
+   1594 else:
+   1595     # [bsz, seq_len] -> [bsz, 1, tgt_seq_len, src_seq_len]
+   1596     attention_mask = _prepare_4d_attention_mask(attention_mask, inputs_embeds.dtype)
+
+KeyboardInterrupt: 
+
+
+
+
+
+

Fine-tuned model evaluation

+
+
# @title Check if the model can still detect objects outside of the custom dataset
+
+image = Image.open(EXAMPLE_IMAGE_PATH)
+task = "<OD>"
+text = "<OD>"
+
+inputs = processor(text=text, images=image, return_tensors="pt").to(DEVICE)
+generated_ids = peft_model.generate(
+    input_ids=inputs["input_ids"],
+    pixel_values=inputs["pixel_values"],
+    max_new_tokens=1024,
+    num_beams=3
+)
+generated_text = processor.batch_decode(generated_ids, skip_special_tokens=False)[0]
+response = processor.post_process_generation(generated_text, task=task, image_size=(image.width, image.height))
+detections = sv.Detections.from_lmm(sv.LMM.FLORENCE_2, response, resolution_wh=image.size)
+
+bounding_box_annotator = sv.BoundingBoxAnnotator(color_lookup=sv.ColorLookup.INDEX)
+label_annotator = sv.LabelAnnotator(color_lookup=sv.ColorLookup.INDEX)
+
+image = bounding_box_annotator.annotate(image, detections)
+image = label_annotator.annotate(image, detections)
+image.thumbnail((600, 600))
+image
+
+
+
+

+
+
+
+
+

NOTE: It seems that the model can still detect classes that don’t belong to our custom dataset.

+
+
# @title Collect predictions
+
+PATTERN = r'([a-zA-Z0-9 ]+ of [a-zA-Z0-9 ]+)<loc_\d+>'
+
+def extract_classes(dataset: DetectionDataset):
+    class_set = set()
+    for i in range(len(dataset.dataset)):
+        image, data = dataset.dataset[i]
+        suffix = data["suffix"]
+        classes = re.findall(PATTERN, suffix)
+        class_set.update(classes)
+    return sorted(class_set)
+
+CLASSES = extract_classes(train_dataset)
+
+targets = []
+predictions = []
+
+for i in range(len(val_dataset.dataset)):
+    image, data = val_dataset.dataset[i]
+    prefix = data['prefix']
+    suffix = data['suffix']
+
+    inputs = processor(text=prefix, images=image, return_tensors="pt").to(DEVICE)
+    generated_ids = model.generate(
+        input_ids=inputs["input_ids"],
+        pixel_values=inputs["pixel_values"],
+        max_new_tokens=1024,
+        num_beams=3
+    )
+    generated_text = processor.batch_decode(generated_ids, skip_special_tokens=False)[0]
+
+    prediction = processor.post_process_generation(generated_text, task='<OD>', image_size=image.size)
+    prediction = sv.Detections.from_lmm(sv.LMM.FLORENCE_2, prediction, resolution_wh=image.size)
+    prediction = prediction[np.isin(prediction['class_name'], CLASSES)]
+    prediction.class_id = np.array([CLASSES.index(class_name) for class_name in prediction['class_name']])
+    prediction.confidence = np.ones(len(prediction))
+
+    target = processor.post_process_generation(suffix, task='<OD>', image_size=image.size)
+    target = sv.Detections.from_lmm(sv.LMM.FLORENCE_2, target, resolution_wh=image.size)
+    target.class_id = np.array([CLASSES.index(class_name) for class_name in target['class_name']])
+
+    targets.append(target)
+    predictions.append(prediction)
+
+
+
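The PATTERN regex above recovers class phrases (e.g. "6 of birds") from Florence-2's `<OD>` suffix strings by anchoring on the `<loc_*>` tokens that follow them. A quick sanity check on a made-up suffix string (not from the dataset) shows the behavior:

```python
import re

# Same pattern as in the cell above: a class phrase followed by a location token
PATTERN = r'([a-zA-Z0-9 ]+ of [a-zA-Z0-9 ]+)<loc_\d+>'

# Hypothetical Florence-2 <OD> suffix with two objects
suffix = "6 of birds<loc_101><loc_202><loc_303><loc_404>5 of cats<loc_11><loc_22><loc_33><loc_44>"

print(re.findall(PATTERN, suffix))  # ['6 of birds', '5 of cats']
```

Only the first `<loc_*>` token after each phrase participates in the match; the remaining location tokens are skipped because `<`, `>`, and `_` are outside the character class.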
# @title Calculate mAP
+mean_average_precision = sv.MeanAveragePrecision.from_detections(
+    predictions=predictions,
+    targets=targets,
+)
+
+print(f"map50_95: {mean_average_precision.map50_95:.2f}")
+print(f"map50: {mean_average_precision.map50:.2f}")
+print(f"map75: {mean_average_precision.map75:.2f}")
+
+
map50_95: 0.49
+map50: 0.53
+map75: 0.53
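The map50 and map75 numbers differ only in the IoU threshold used to count a prediction as a true positive. As a minimal illustration of the IoU computation underlying the metric (made-up boxes in xyxy format, not from the dataset):

```python
import numpy as np

def box_iou(a: np.ndarray, b: np.ndarray) -> float:
    """Intersection-over-union of two boxes in (x1, y1, x2, y2) format."""
    x1, y1 = np.maximum(a[:2], b[:2])   # top-left corner of the intersection
    x2, y2 = np.minimum(a[2:], b[2:])   # bottom-right corner of the intersection
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / union

pred   = np.array([50.0, 50.0, 150.0, 150.0])
target = np.array([100.0, 100.0, 200.0, 200.0])
iou = box_iou(pred, target)
print(f"{iou:.3f}")  # 0.143 -> a true positive at IoU threshold 0.1, a miss at 0.5
```

At evaluation time, each prediction is greedily matched to the highest-IoU target of the same class before precision and recall are accumulated.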
+
+
+
+
# @title Calculate Confusion Matrix
+confusion_matrix = sv.ConfusionMatrix.from_detections(
+    predictions=predictions,
+    targets=targets,
+    classes=CLASSES
+)
+
+_ = confusion_matrix.plot()
+
+
+
+

+
+
+
+
+
+
+

Save fine-tuned model on hard drive

+
+
peft_model.save_pretrained("/content/florence2-lora")
+processor.save_pretrained("/content/florence2-lora/")
+!ls -la /content/florence2-lora/
+
+
total 11432
+drwxr-xr-x 2 root root    4096 Jun 26 21:43 .
+drwxr-xr-x 1 root root    4096 Jun 26 21:43 ..
+-rw-r--r-- 1 root root     746 Jun 26 21:43 adapter_config.json
+-rw-r--r-- 1 root root 7747264 Jun 26 21:43 adapter_model.safetensors
+-rw-r--r-- 1 root root   22410 Jun 26 21:43 added_tokens.json
+-rw-r--r-- 1 root root  456318 Jun 26 21:43 merges.txt
+-rw-r--r-- 1 root root     947 Jun 26 21:43 preprocessor_config.json
+-rw-r--r-- 1 root root    5102 Jun 26 21:43 README.md
+-rw-r--r-- 1 root root  146627 Jun 26 21:43 special_tokens_map.json
+-rw-r--r-- 1 root root  197658 Jun 26 21:43 tokenizer_config.json
+-rw-r--r-- 1 root root 2297961 Jun 26 21:43 tokenizer.json
+-rw-r--r-- 1 root root  798293 Jun 26 21:43 vocab.json
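The saved directory holds only the LoRA adapter weights plus the processor files, so to use it later the adapter must be re-attached to the frozen base model. A minimal sketch (assuming the same transformers/peft versions used above; the paths are illustrative):

```python
from transformers import AutoModelForCausalLM, AutoProcessor
from peft import PeftModel

# Load the frozen Florence-2 base model, then attach the saved LoRA adapter
base = AutoModelForCausalLM.from_pretrained(
    "microsoft/Florence-2-base-ft", trust_remote_code=True)
model = PeftModel.from_pretrained(base, "/content/florence2-lora")
processor = AutoProcessor.from_pretrained(
    "/content/florence2-lora", trust_remote_code=True)
```

Calling `model.merge_and_unload()` afterwards would fold the adapter into the base weights if you prefer a single standalone checkpoint.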
+
+
+
+
+

Upload model to Roboflow (optional)

+

You can deploy your Florence-2 object detection model on your own hardware (e.g. a cloud GPU server or an NVIDIA Jetson) with Roboflow Inference, an open source computer vision inference server.

+

To deploy your model, you will need a free Roboflow account.

+

To get started, create a new Project in Roboflow if you don’t already have one. Then, upload the dataset you used to train your model. Then, create a dataset Version, which is a snapshot of your dataset with which your model will be associated in Roboflow.

+

You can read our full Deploy Florence-2 with Roboflow guide for step-by-step instructions of these steps.

+

Once you have trained your model, you can upload it to Roboflow using the following code:

+
+
from roboflow import Roboflow
+
+rf = Roboflow(api_key="API_KEY")
+project = rf.workspace("workspace-id").project("project-id")
+version = project.version(VERSION)
+
+version.deploy(model_type="florence-2", model_path="/content/florence2-lora")
+
+

Above, replace:

- API_KEY with your Roboflow API key.
- workspace-id with your Roboflow workspace ID.
- project-id with your Roboflow project ID.
- VERSION with your project version number.

If you are not using our notebook, replace /content/florence2-lora with the directory where you saved your model weights.

+

When you run the code above, the model will be uploaded to Roboflow. It will take a few minutes for the model to be processed before it is ready for use.

+


+
+
+

Deploy to your hardware

+

Once your model has been processed, you can download it to any device on which you want to deploy your model. Deployment is supported through Roboflow Inference, our open source computer vision inference server.

+

Inference can be run as a microservice with Docker, ideal for large deployments where you may need a centralized server on which to run inference, or when you want to run Inference in an isolated container. You can also directly integrate Inference into your project through the Inference Python SDK.

+

For this guide, we will show how to deploy the model with the Python SDK.

+

First, install inference:

+
+
!pip install inference
+
+

Then, create a new Python file and add the following code:

+
+
from inference import get_model
+from PIL import Image
+
+lora_model = get_model("model-id/version-id", api_key="KEY")
+
+image = Image.open("containers.png")
+response = lora_model.infer(image)
+print(response)
+
+

In the code above, we load our model, run it on an image, and print the predictions.

+

When you first run the code, your model weights will be downloaded and cached to your device for subsequent runs. This process may take a few minutes depending on the strength of your internet connection.

+
+
+
+

Congratulations

+

⭐️ If you enjoyed this notebook, star the Roboflow Notebooks repo (and supervision while you’re at it) and let us know what tutorials you’d like to see us do next. ⭐️

+ + +
+ +
+ +
+ + + + + \ No newline at end of file diff --git a/lab/how-to-finetune-florence-2-on-detection-dataset_files/figure-html/cell-10-output-2.png b/lab/how-to-finetune-florence-2-on-detection-dataset_files/figure-html/cell-10-output-2.png new file mode 100644 index 0000000..8033802 Binary files /dev/null and b/lab/how-to-finetune-florence-2-on-detection-dataset_files/figure-html/cell-10-output-2.png differ diff --git a/lab/how-to-finetune-florence-2-on-detection-dataset_files/figure-html/cell-21-output-1.png b/lab/how-to-finetune-florence-2-on-detection-dataset_files/figure-html/cell-21-output-1.png new file mode 100644 index 0000000..4a637d0 Binary files /dev/null and b/lab/how-to-finetune-florence-2-on-detection-dataset_files/figure-html/cell-21-output-1.png differ diff --git a/lab/how-to-finetune-florence-2-on-detection-dataset_files/figure-html/cell-24-output-1.png b/lab/how-to-finetune-florence-2-on-detection-dataset_files/figure-html/cell-24-output-1.png new file mode 100644 index 0000000..d937dab Binary files /dev/null and b/lab/how-to-finetune-florence-2-on-detection-dataset_files/figure-html/cell-24-output-1.png differ diff --git a/lab/how-to-finetune-florence-2-on-detection-dataset_files/figure-html/cell-8-output-2.png b/lab/how-to-finetune-florence-2-on-detection-dataset_files/figure-html/cell-8-output-2.png new file mode 100644 index 0000000..97a4094 Binary files /dev/null and b/lab/how-to-finetune-florence-2-on-detection-dataset_files/figure-html/cell-8-output-2.png differ diff --git a/lab/scratchpad.html b/lab/scratchpad.html new file mode 100644 index 0000000..246ed7d --- /dev/null +++ b/lab/scratchpad.html @@ -0,0 +1,6837 @@ + + + + + + + + + +Predict – blog + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+ +
+ +
+ + + + +
+ +
+
+

Predict

+
+ + + +
+ + + + +
+ + + +
+ + +
+
import requests
+from io import BytesIO
+from PIL import Image
+import numpy as np
+import supervision as sv
+from inference.models.utils import get_roboflow_model
+
+# Create a dummy dataset
+data = requests.get("https://raw.githubusercontent.com/jigsawpieces/dog-api-images/main/pitbull/dog-3981033_1280.jpg")
+image = Image.open(BytesIO(data.content)).reduce(5)
+label = np.random.rand(1, 5) / 10 + 0.5
+label[:, 0] = 0
+!mkdir -p /tmp/dummy_dataset/images
+!mkdir -p /tmp/dummy_dataset/labels
+image.save("/tmp/dummy_dataset/images/0.jpg")
+np.savetxt("/tmp/dummy_dataset/labels/0.txt", label, fmt="%d %f %f %f %f")
+with open("/tmp/dummy_dataset/dataset.yml", "w") as f:
+    f.write("""train: _
+val: _
+test: _
+nc: 1
+names: ["dummy"]""")
+
+# Load as supervision dataset
+dataset = sv.DetectionDataset.from_yolo("/tmp/dummy_dataset/images", "/tmp/dummy_dataset/labels", "/tmp/dummy_dataset/dataset.yml")
+
+# Visualize the first instance
+image_path, image, detection = dataset[0]
+box_annotator = sv.BoxAnnotator()
+label_annotator = sv.LabelAnnotator()
+annotated_image = box_annotator.annotate(image.copy(), detection)
+annotated_image = label_annotator.annotate(annotated_image, detection)
+display(Image.fromarray(annotated_image))
+
+# Visualize the prediction
+model = get_roboflow_model("yolov8s-640")
+prediction = model.infer(image)[0]
+detection = sv.Detections.from_inference(prediction)
+annotated_image = box_annotator.annotate(image.copy(), detection)
+annotated_image = label_annotator.annotate(annotated_image, detection)
+display(Image.fromarray(annotated_image))
+
+# Modified the detection to display class name
+# image_path, image, detection = dataset[0]
+# detection.data = {"class_name": np.array(['dummy'])}
+# box_annotator = sv.BoxAnnotator()
+# label_annotator = sv.LabelAnnotator()
+# annotated_image = box_annotator.annotate(image.copy(), detection)
+# annotated_image = label_annotator.annotate(annotated_image, detection)
+# display(Image.fromarray(annotated_image))
+
+
[01/20/25 22:15:20] WARNING  Your inference package version 0.33.0 is out of date! Please upgrade to __init__.py:41
+                             version 0.34.0 of inference for the latest features and bug fixes by                  
+                             running `pip install --upgrade inference`.                                            
+
+
+
+
+
+

+
+
+
+
+
UserWarning: Specified provider 'OpenVINOExecutionProvider' is not in available provider names.Available providers: 'TensorrtExecutionProvider, CUDAExecutionProvider, CPUExecutionProvider'
+UserWarning: Specified provider 'CoreMLExecutionProvider' is not in available provider names.Available providers: 'TensorrtExecutionProvider, CUDAExecutionProvider, CPUExecutionProvider'
+2025-01-20 22:15:22.319412181 [E:onnxruntime:Default, provider_bridge_ort.cc:1862 TryGetProviderInfo_CUDA] /onnxruntime_src/onnxruntime/core/session/provider_bridge_ort.cc:1539 onnxruntime::Provider& onnxruntime::ProviderLibrary::Get() [ONNXRuntimeError] : 1 : FAIL : Failed to load library libonnxruntime_providers_cuda.so with error: libcudnn_adv.so.9: cannot open shared object file: No such file or directory
+
+2025-01-20 22:15:22.319453861 [W:onnxruntime:Default, onnxruntime_pybind_state.cc:993 CreateExecutionProviderInstance] Failed to create CUDAExecutionProvider. Require cuDNN 9.* and CUDA 12.*. Please install all dependencies as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.
+
+
+
+
+
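The label row written to the text file above follows the YOLO annotation format: `class_id x_center y_center width height`, with all coordinates normalized to [0, 1]. A small converter to pixel xyxy coordinates (a hypothetical helper, roughly what YOLO loaders do internally):

```python
import numpy as np

def yolo_to_xyxy(row: np.ndarray, width: int, height: int) -> np.ndarray:
    """Convert one normalized YOLO row (cls, cx, cy, w, h) to pixel (x1, y1, x2, y2)."""
    _, cx, cy, w, h = row
    x1 = (cx - w / 2) * width
    y1 = (cy - h / 2) * height
    x2 = (cx + w / 2) * width
    y2 = (cy + h / 2) * height
    return np.array([x1, y1, x2, y2])

# Hypothetical 256x256 image with a centered box spanning half of each dimension
print(yolo_to_xyxy(np.array([0.0, 0.5, 0.5, 0.5, 0.5]), 256, 256))  # [ 64.  64. 192. 192.]
```

This is why the dummy label drawn from `np.random.rand(1, 5) / 10 + 0.5` lands near the middle of the image: its center and size values all sit in [0.5, 0.6].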

+
+
+
+
+
+
import os
+import numpy as np
+import supervision as sv
+from ultralytics import YOLO
+
+# Download dataset
+if not os.path.exists("/tmp/rf_animals"):
+    !wget https://universe.roboflow.com/ds/1LLwpXz2td?key=8JnJML5YF6 -O /tmp/rf_animals.zip
+    !unzip /tmp/rf_animals.zip -d /tmp/rf_animals
+
+# Load dataset
+dataset = sv.DetectionDataset.from_yolo("/tmp/rf_animals/train/images", "/tmp/rf_animals/train/labels", "/tmp/rf_animals/data.yaml")
+
+# Inference
+model = YOLO("yolov8s")
+targets, detections = [], []
+for image_path, image, target in dataset:
+    targets.append(target)
+    
+    prediction = model(image, verbose=False)[0]
+    detection = sv.Detections.from_ultralytics(prediction)
+    detection = detection[np.isin(detection['class_name'], dataset.classes)]
+    detection.class_id = np.array([dataset.classes.index(class_name) for class_name in detection['class_name']])
+    detections.append(detection)
+    
+# Method #1
+mAP = sv.metrics.MeanAveragePrecision().update(detections, targets).compute()
+print(f"mAP50: {mAP.map50:.4f}")
+
+# Method #2
+mAP = sv.MeanAveragePrecision.from_detections(detections, targets)
+print(f"mAP50: {mAP.map50:.4f}")
+
+
mAP50: 0.1553
+mAP50: 0.2100
+
+
+
+
def callback(image):
+    prediction = model(image, verbose=False)[0]
+    detection = sv.Detections.from_ultralytics(prediction)
+    detection = detection[np.isin(detection['class_name'], dataset.classes)]
+    detection.class_id = np.array([dataset.classes.index(class_name) for class_name in detection['class_name']])
+    return detection
+
+sv.MeanAveragePrecision.benchmark(dataset, callback).map50
+
+
np.float64(0.2100207753031282)
+
+
+
+
dataset.classes
+
+
['butterfly',
+ 'cat',
+ 'crocodile',
+ 'dear',
+ 'deer',
+ 'dog',
+ 'elephant',
+ 'fog',
+ 'frog',
+ 'giraffe',
+ 'goat',
+ 'hippo',
+ 'kangaroo',
+ 'lion',
+ 'parrot',
+ 'shark',
+ 'sheep',
+ 'spider',
+ 'tiger',
+ 'zebra']
+
+
+
+
import leafmap
+
+
+
m = leafmap.Map(center=(28.25, 77.40), zoom=18)
+# m.add_basemap("SATELLITE")
+m.add_wms_layer("https://wayback.maptiles.arcgis.com/arcgis/rest/services/World_Imagery/WMTS/1.0.0/GoogleMapsCompatible/MapServer/tile/32553/{z}/{y}/{x}", layers="0")
+m
+
+ +
+
+
+
m.zoom
+
+
19.0
+
+
+
+
m = leafmap.Map(center=(28.25, 77.40), zoom=17)
+m.add_basemap("SATELLITE")
+# m.add_wms_layer("https://wayback.maptiles.arcgis.com/arcgis/rest/services/World_Imagery/WMTS/1.0.0/GoogleMapsCompatible/MapServer/tile/32553/{z}/{y}/{x}", layers="0")
+m
+
+ +
+
+
+
import geopandas as gpd
+from tqdm.notebook import tqdm
+from PIL import Image
+from joblib import Parallel, delayed
+from glob import glob
+
+
+
delhi_kilns = gpd.read_file("/home/patel_zeel/kiln_compass_24/regions/labels/delhi_airshed.geojson")
+len(delhi_kilns)
+
+
783
+
+
+
+
zoom = 19
+jobs = []
+for kiln in tqdm(delhi_kilns.geometry):
+    lon_min, lat_min, lon_max, lat_max = kiln.bounds
+    lon_margin = (lon_max - lon_min)/4
+    lat_margin = (lat_max - lat_min)/4
+    outer_bounds = [lon_min - lon_margin, lat_min - lat_margin, lon_max + lon_margin, lat_max + lat_margin]
+    download_path = f"/home/patel_zeel/kiln_compass_24/regions/high_res/{zoom}/{','.join(map(str, kiln.bounds))}.tif"
+    jobs.append(delayed(leafmap.map_tiles_to_geotiff)(download_path, outer_bounds, zoom=zoom, to_cog=True, source="SATELLITE", quiet=True))
+
+_ = Parallel(n_jobs=-1)(tqdm(jobs))
+
+ +
+
+ +
+
+
Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.439517,28.203442,77.440547,28.203933.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.385046,28.221207,77.385974,28.221709.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.439517,28.203442,77.440547,28.203933.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.385046,28.221207,77.385974,28.221709.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.41962,28.208412,77.420769,28.208927.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.41962,28.208412,77.420769,28.208927.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.400691,28.212222,77.402094,28.212772.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.399209,28.215745,77.400439,28.21634.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.400691,28.212222,77.402094,28.212772.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.399209,28.215745,77.400439,28.21634.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.380984,28.223921,77.382308,28.224487.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.423613,28.214831,77.424976,28.215277.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.380984,28.223921,77.382308,28.224487.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.423613,28.214831,77.424976,28.215277.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.420321,28.226731,77.421568,28.227297.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.420321,28.226731,77.421568,28.227297.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.422024,28.218841,77.423172,28.219389.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.397178,28.245326,77.398362,28.245804.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.422024,28.218841,77.423172,28.219389.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.397178,28.245326,77.398362,28.245804.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.378493,28.231351,77.379637,28.231909.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.403983,28.253302,77.405073,28.253971.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.378493,28.231351,77.379637,28.231909.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.403983,28.253302,77.405073,28.253971.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.399211,28.249648,77.400477,28.250317.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.421627,28.226688,77.422657,28.227306.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.381049,28.2314,77.381853,28.232478.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.381049,28.2314,77.381853,28.232478.tif
+Adding overviews...
+[... repeated "Reading input" / "Updating dataset tags" / "Writing output" / "Adding overviews" log lines for the remaining tiles under /home/patel_zeel/kiln_compass_24/regions/high_res/19/ truncated; parallel workers interleave their output ...]
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.923943,28.724198,76.924571,28.725237.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.923943,28.724198,76.924571,28.725237.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.654202,28.449042,77.655425,28.449933.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.892405,28.641308,76.893691,28.641803.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.654202,28.449042,77.655425,28.449933.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.892405,28.641308,76.893691,28.641803.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.597879,28.437297,77.598954,28.438487.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.60059,28.520599,77.601846,28.521552.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.896381,28.633003,76.897692,28.633587.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.597879,28.437297,77.598954,28.438487.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.606773,28.519039,77.607973,28.519999.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.896381,28.633003,76.897692,28.633587.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.60059,28.520599,77.601846,28.521552.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.627612,28.487476,77.628753,28.488303.tif
+
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.605367,28.521803,77.606598,28.522723.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.395823,28.244212,77.396465,28.245036.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.606773,28.519039,77.607973,28.519999.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.599257,28.530944,77.600313,28.531893.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.395823,28.244212,77.396465,28.245036.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.604296,28.52906,77.605367,28.530121.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.627612,28.487476,77.628753,28.488303.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.605367,28.521803,77.606598,28.522723.tif
+Adding overviews...
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.604296,28.52906,77.605367,28.530121.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.599257,28.530944,77.600313,28.531893.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.863749,28.644316,76.864898,28.64488.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.863749,28.644316,76.864898,28.64488.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.60946,28.528705,77.610897,28.52971.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.60946,28.528705,77.610897,28.52971.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.35692,28.720956,77.357871,28.721958.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.531815,28.594874,77.53279,28.595945.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.35692,28.720956,77.357871,28.721958.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.531815,28.594874,77.53279,28.595945.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.928659,28.734502,76.930064,28.734963.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.521977,28.541158,77.523188,28.54226.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.411367,28.733316,77.412423,28.733866.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.928659,28.734502,76.930064,28.734963.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.411367,28.733316,77.412423,28.733866.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.521977,28.541158,77.523188,28.54226.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.60608,28.412178,77.60763,28.413427.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.605627,28.530084,77.606712,28.531215.tif
+
+Adding overviews...
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.605627,28.530084,77.606712,28.531215.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.60608,28.412178,77.60763,28.413427.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.607555,28.519595,77.608658,28.520652.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.643833,28.675536,77.644681,28.676664.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.426727,28.717934,77.427864,28.719015.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.643833,28.675536,77.644681,28.676664.tif
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.611452,28.528599,77.612755,28.529724.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.607555,28.519595,77.608658,28.520652.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.930947,28.733199,76.932218,28.73381.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.53028,28.590155,77.531362,28.591403.tif
+
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.601955,28.5376,77.603468,28.538821.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.426727,28.717934,77.427864,28.719015.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.930947,28.733199,76.932218,28.73381.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.362787,28.722948,77.364138,28.723998.tif
+
+Adding overviews...
+Adding overviews...
+Updating dataset tags...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.53028,28.590155,77.531362,28.591403.tif
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.611452,28.528599,77.612755,28.529724.tif
+Adding overviews...
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.601955,28.5376,77.603468,28.538821.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.362787,28.722948,77.364138,28.723998.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.413781,28.729296,77.414463,28.730468.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.603531,28.538614,77.604863,28.539878.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.413781,28.729296,77.414463,28.730468.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.442072,28.722959,77.443251,28.724072.tif
+
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.924006,28.737248,76.925496,28.737791.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.412711,28.724839,77.413583,28.726029.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.603531,28.538614,77.604863,28.539878.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.421989,28.731187,77.42261,28.732262.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.412711,28.724839,77.413583,28.726029.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.421989,28.731187,77.42261,28.732262.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.924006,28.737248,76.925496,28.737791.tif
+Updating dataset tags...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.42751,28.729953,77.42821,28.730879.tif
+
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.442072,28.722959,77.443251,28.724072.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.421358,28.726853,77.422091,28.728028.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.425279,28.733268,77.425996,28.734352.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.42751,28.729953,77.42821,28.730879.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.421358,28.726853,77.422091,28.728028.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.425279,28.733268,77.425996,28.734352.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.437877,28.719557,77.439105,28.720884.tif
+
+Adding overviews...
+Updating dataset tags...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.932392,28.737181,76.932978,28.738409.tif
+
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.437877,28.719557,77.439105,28.720884.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.92683,28.738921,76.927458,28.739961.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.929012,28.737692,76.930306,28.738261.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.932392,28.737181,76.932978,28.738409.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.929012,28.737692,76.930306,28.738261.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.92683,28.738921,76.927458,28.739961.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.436464,28.726587,77.437863,28.727577.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.565184,28.715374,77.566544,28.716433.tif
+
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.415773,28.728794,77.416588,28.730119.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.436464,28.726587,77.437863,28.727577.tif
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.418617,28.733934,77.419949,28.734676.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.565184,28.715374,77.566544,28.716433.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.415773,28.728794,77.416588,28.730119.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.418617,28.733934,77.419949,28.734676.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.920781,28.739362,76.921518,28.740496.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.920781,28.739362,76.921518,28.740496.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.518341,28.726193,77.519521,28.727224.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.932329,28.741682,76.933006,28.74309.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.349287,28.737819,77.350269,28.738923.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.932329,28.741682,76.933006,28.74309.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.518341,28.726193,77.519521,28.727224.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.349287,28.737819,77.350269,28.738923.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.370372,28.736791,77.371589,28.737673.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.370372,28.736791,77.371589,28.737673.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.445863,28.744009,77.446748,28.744985.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.445863,28.744009,77.446748,28.744985.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.423476,28.743285,77.424735,28.744367.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.411872,28.737316,77.413159,28.738264.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.423476,28.743285,77.424735,28.744367.tif
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.435478,28.742127,77.43679,28.743004.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.411872,28.737316,77.413159,28.738264.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.423105,28.742161,77.424483,28.743185.tif
+
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.914064,28.748339,76.914746,28.749346.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.435478,28.742127,77.43679,28.743004.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.914064,28.748339,76.914746,28.749346.tif
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.415366,28.739043,77.416753,28.740137.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.362702,28.73804,77.363795,28.739191.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.423105,28.742161,77.424483,28.743185.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.416457,28.741858,77.417809,28.742712.tif
+
+Adding overviews...
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.915321,28.746591,76.916147,28.74774.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.415366,28.739043,77.416753,28.740137.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.362702,28.73804,77.363795,28.739191.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.440436,28.735691,77.441665,28.736545.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.416457,28.741858,77.417809,28.742712.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.919578,28.752905,76.920835,28.753456.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.915321,28.746591,76.916147,28.74774.tif
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.356845,28.741321,77.358155,28.742256.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.440436,28.735691,77.441665,28.736545.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.919578,28.752905,76.920835,28.753456.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.419738,28.737114,77.420787,28.738253.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.916183,28.745961,76.917351,28.746512.tif
+
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.433533,28.741994,77.434686,28.743217.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.356845,28.741321,77.358155,28.742256.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.350176,28.737668,77.35157,28.738706.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.916183,28.745961,76.917351,28.746512.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.419738,28.737114,77.420787,28.738253.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.429763,28.739069,77.431202,28.740038.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.433533,28.741994,77.434686,28.743217.tif
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.413784,28.740562,77.415241,28.741668.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.350176,28.737668,77.35157,28.738706.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.922963,28.744732,76.924222,28.745362.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.429763,28.739069,77.431202,28.740038.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.922963,28.744732,76.924222,28.745362.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.413784,28.740562,77.415241,28.741668.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.923111,28.749089,76.924325,28.749746.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.923111,28.749089,76.924325,28.749746.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.42127,28.740169,77.422402,28.74138.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.420238,28.736745,77.421446,28.737915.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.446262,28.742522,77.447695,28.743381.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.425026,28.736699,77.426186,28.737879.tif
+
+Adding overviews...
+Adding overviews...
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.420238,28.736745,77.421446,28.737915.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.42127,28.740169,77.422402,28.74138.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.446262,28.742522,77.447695,28.743381.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.42459,28.740948,77.426,28.742123.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.425026,28.736699,77.426186,28.737879.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.42459,28.740948,77.426,28.742123.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.360944,28.747144,77.362171,28.747956.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.360944,28.747144,77.362171,28.747956.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.428742,28.746448,77.429976,28.74749.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.455849,28.735985,77.457301,28.737144.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.387069,28.343111,77.388004,28.343522.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.425252,28.746076,77.426414,28.747205.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.387069,28.343111,77.388004,28.343522.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.428742,28.746448,77.429976,28.74749.tif
+Adding overviews...
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.425252,28.746076,77.426414,28.747205.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.455849,28.735985,77.457301,28.737144.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.3559,28.746014,77.357159,28.74732.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.345519,28.753813,77.346962,28.7548.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.3559,28.746014,77.357159,28.74732.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.345519,28.753813,77.346962,28.7548.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.434931,28.746573,77.436104,28.747911.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.352773,28.747897,77.354283,28.748955.tif
+
+Adding overviews...
+Adding overviews...
+Updating dataset tags...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.434931,28.746573,77.436104,28.747911.tif
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.352773,28.747897,77.354283,28.748955.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.43642,28.746962,77.437816,28.747859.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.351163,28.745347,77.352644,28.746451.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.43642,28.746962,77.437816,28.747859.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.351163,28.745347,77.352644,28.746451.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.449107,28.749177,77.450327,28.750249.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.449107,28.749177,77.450327,28.750249.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.522613,28.741342,77.524065,28.742711.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.566611,28.34057,77.567861,28.341851.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.522613,28.741342,77.524065,28.742711.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.359709,28.745599,77.361243,28.746612.tif
+
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.471012,28.747688,77.472291,28.748687.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.566611,28.34057,77.567861,28.341851.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.43747,28.753013,77.438884,28.754136.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.453736,28.745112,77.455031,28.746296.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.586977,28.334712,77.58781,28.335848.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.449761,28.748434,77.450963,28.749562.tif
+
+
+Adding overviews...
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.420269,28.746228,77.421749,28.747472.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.359709,28.745599,77.361243,28.746612.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.471012,28.747688,77.472291,28.748687.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.586977,28.334712,77.58781,28.335848.tif
+Adding overviews...
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.43747,28.753013,77.438884,28.754136.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.453736,28.745112,77.455031,28.746296.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.449761,28.748434,77.450963,28.749562.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.420269,28.746228,77.421749,28.747472.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.446959,28.750691,77.448351,28.751787.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.43514,28.753454,77.436648,28.754795.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.446959,28.750691,77.448351,28.751787.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.43514,28.753454,77.436648,28.754795.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.340341,28.758871,77.34125,28.759911.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.340341,28.758871,77.34125,28.759911.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.369066,28.759512,77.370177,28.760617.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.369066,28.759512,77.370177,28.760617.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.37018,28.760358,77.371142,28.761429.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.34456,28.755009,77.346002,28.756079.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.37018,28.760358,77.371142,28.761429.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.34456,28.755009,77.346002,28.756079.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.449522,28.752244,77.450608,28.753502.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.449522,28.752244,77.450608,28.753502.tif
+[... output truncated: the same "Reading input" / "Adding overviews" / "Updating dataset tags" / "Writing output" cycle repeats for several hundred more tiles under /home/patel_zeel/kiln_compass_24/regions/high_res/19/, with lines interleaved across tiles ...]
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.320092,28.79789,77.32117,28.799067.tif
+Adding overviews...
+Adding overviews...
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.445419,28.790159,77.4467,28.791443.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.550714,28.793385,77.552062,28.794533.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.572004,28.786398,77.573226,28.78751.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.554294,28.793159,77.555615,28.794369.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.351676,28.798465,77.352996,28.799467.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.319983,28.80237,77.32139,28.803396.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.329798,28.800491,77.330977,28.801702.tif
+
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.324903,28.800462,77.326324,28.80146.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.351676,28.798465,77.352996,28.799467.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.319983,28.80237,77.32139,28.803396.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.329798,28.800491,77.330977,28.801702.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.324903,28.800462,77.326324,28.80146.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.375588,28.804412,77.37674,28.805199.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.367579,28.801358,77.368752,28.802196.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.37061,28.800435,77.371534,28.80156.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.375588,28.804412,77.37674,28.805199.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.367579,28.801358,77.368752,28.802196.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.328528,28.804005,77.329665,28.805265.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.37061,28.800435,77.371534,28.80156.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.375858,28.798023,77.377101,28.798847.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.344299,28.804002,77.345525,28.80518.tif
+
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.91909,28.811323,76.920412,28.811865.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.937065,28.811135,76.938253,28.811601.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.328528,28.804005,77.329665,28.805265.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.375858,28.798023,77.377101,28.798847.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.91909,28.811323,76.920412,28.811865.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.937065,28.811135,76.938253,28.811601.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.344299,28.804002,77.345525,28.80518.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.920528,28.811296,76.921139,28.812405.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.920528,28.811296,76.921139,28.812405.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.354843,28.796089,77.356008,28.797179.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.332342,28.799474,77.33357,28.800783.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.45339,28.798339,77.454709,28.799275.tif
+
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.374343,28.804902,77.375575,28.805689.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.354843,28.796089,77.356008,28.797179.tif
+Adding overviews...
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.563516,28.795457,77.564549,28.796255.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.332342,28.799474,77.33357,28.800783.tif
+Updating dataset tags...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.45339,28.798339,77.454709,28.799275.tif
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.374343,28.804902,77.375575,28.805689.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.347724,28.794735,77.348996,28.795784.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.563516,28.795457,77.564549,28.796255.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.848985,28.808151,76.849666,28.809344.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.347724,28.794735,77.348996,28.795784.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.848985,28.808151,76.849666,28.809344.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.551507,28.803384,77.552788,28.804405.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.844198,28.812821,76.845654,28.813567.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.847125,28.812669,76.849038,28.813407.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.844198,28.812821,76.845654,28.813567.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.551507,28.803384,77.552788,28.804405.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.847125,28.812669,76.849038,28.813407.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.849026,28.813797,76.850536,28.814523.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.532089,28.802776,77.533219,28.803914.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.849026,28.813797,76.850536,28.814523.tif
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.449381,28.803685,77.450699,28.804721.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.532089,28.802776,77.533219,28.803914.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.449381,28.803685,77.450699,28.804721.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.294141,28.805063,77.295358,28.805989.tif
+
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.31241,28.796202,77.31392,28.797543.tif
+Updating dataset tags...
+
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.294141,28.805063,77.295358,28.805989.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.328377,28.778093,77.329511,28.779191.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.544378,28.799685,77.545591,28.800839.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.302414,28.811316,77.303604,28.81261.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.328377,28.778093,77.329511,28.779191.tif
+Adding overviews...
+Adding overviews...
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.544378,28.799685,77.545591,28.800839.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.31241,28.796202,77.31392,28.797543.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.302414,28.811316,77.303604,28.81261.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.347889,28.80743,77.349771,28.807962.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.347889,28.80743,77.349771,28.807962.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.470509,28.809922,77.471664,28.811036.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.57246,28.798558,77.573539,28.799538.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.470509,28.809922,77.471664,28.811036.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.340993,28.806853,77.342147,28.807982.tif
+
+Adding overviews...
+Updating dataset tags...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.480786,28.805537,77.482111,28.806542.tif
+
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.57246,28.798558,77.573539,28.799538.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.844012,28.816557,76.845175,28.817133.tif
+
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.845233,28.821603,76.846635,28.82219.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.340993,28.806853,77.342147,28.807982.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.844012,28.816557,76.845175,28.817133.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.480786,28.805537,77.482111,28.806542.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.309701,28.810701,77.311179,28.811736.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.920343,28.815163,76.92154,28.81561.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.845233,28.821603,76.846635,28.82219.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.530834,28.813706,77.531994,28.814876.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.920343,28.815163,76.92154,28.81561.tif
+Adding overviews...
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.309701,28.810701,77.311179,28.811736.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.530834,28.813706,77.531994,28.814876.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.533024,28.810578,77.534142,28.81172.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.849191,28.816402,76.850566,28.816934.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.851395,28.830629,76.851948,28.831621.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.56557,28.8051,77.566293,28.806338.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.849191,28.816402,76.850566,28.816934.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.533024,28.810578,77.534142,28.81172.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.851395,28.830629,76.851948,28.831621.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.30894,28.812889,77.310367,28.813934.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.321123,28.804873,77.32256,28.806199.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.56557,28.8051,77.566293,28.806338.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.877693,28.83413,76.8783,28.835169.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.877693,28.83413,76.8783,28.835169.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.422211,28.805184,77.423706,28.806162.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.316865,28.806026,77.318241,28.807386.tif
+
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.313437,28.809409,77.314684,28.810688.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.960004,28.82227,76.961138,28.822781.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.30894,28.812889,77.310367,28.813934.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.851905,28.819129,76.852626,28.820375.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.960004,28.82227,76.961138,28.822781.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.321123,28.804873,77.32256,28.806199.tif
+Adding overviews...
+Updating dataset tags...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.851905,28.819129,76.852626,28.820375.tif
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.422211,28.805184,77.423706,28.806162.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.95858,28.822863,76.959236,28.823842.tif
+
+Adding overviews...
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.313437,28.809409,77.314684,28.810688.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.95858,28.822863,76.959236,28.823842.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.316865,28.806026,77.318241,28.807386.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.290246,28.817769,77.291254,28.818445.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.856739,28.821235,76.857475,28.822434.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.290246,28.817769,77.291254,28.818445.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.306324,28.817327,77.30688,28.818479.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.856739,28.821235,76.857475,28.822434.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.306324,28.817327,77.30688,28.818479.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.948548,28.819426,76.949443,28.82067.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.864128,28.8354,76.865137,28.835862.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.948548,28.819426,76.949443,28.82067.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.864128,28.8354,76.865137,28.835862.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.301883,28.827145,77.302801,28.827909.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.301883,28.827145,77.302801,28.827909.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.368122,28.827311,77.369161,28.828181.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.862663,28.835888,76.863666,28.836418.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.534406,28.812102,77.535806,28.813222.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.862663,28.835888,76.863666,28.836418.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.368122,28.827311,77.369161,28.828181.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.534406,28.812102,77.535806,28.813222.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.536143,28.809452,77.537547,28.81054.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.861135,28.83385,76.862194,28.834368.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.990364,28.842177,76.991035,28.843146.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.861135,28.83385,76.862194,28.834368.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.990364,28.842177,76.991035,28.843146.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.536143,28.809452,77.537547,28.81054.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.371176,28.825624,77.37232,28.82642.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.878808,28.84229,76.879977,28.842772.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.878808,28.84229,76.879977,28.842772.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.371176,28.825624,77.37232,28.82642.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.958927,28.824924,76.960553,28.825451.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.297729,28.832594,77.298992,28.833493.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.958927,28.824924,76.960553,28.825451.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.297729,28.832594,77.298992,28.833493.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.275666,28.836555,77.276798,28.837342.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.541615,28.811052,77.54309,28.812347.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.925608,28.789461,76.926857,28.790274.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.275666,28.836555,77.276798,28.837342.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.925608,28.789461,76.926857,28.790274.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.541615,28.811052,77.54309,28.812347.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.426625,28.843175,77.427459,28.844197.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.426625,28.843175,77.427459,28.844197.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.433702,28.844202,77.4346,28.845223.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.433702,28.844202,77.4346,28.845223.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.53055,28.83775,77.531789,28.838272.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.53055,28.83775,77.531789,28.838272.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.955666,28.853937,76.956239,28.854921.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.955666,28.853937,76.956239,28.854921.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.292388,28.834105,77.293747,28.835142.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.888486,28.849237,76.889794,28.849759.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.888486,28.849237,76.889794,28.849759.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.533118,28.830917,77.534394,28.831924.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.525559,28.843177,77.526696,28.844002.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.292388,28.834105,77.293747,28.835142.tif
+Adding overviews...
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.533118,28.830917,77.534394,28.831924.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.525559,28.843177,77.526696,28.844002.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.043509,28.850643,77.044151,28.851567.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.550966,28.840755,77.552084,28.841475.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.043509,28.850643,77.044151,28.851567.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.454177,28.835745,77.45534,28.836681.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.381884,28.854646,77.382817,28.855262.tif
+
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.95383,28.850743,76.955045,28.851366.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.550966,28.840755,77.552084,28.841475.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.536208,28.836549,77.537073,28.837565.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.381884,28.854646,77.382817,28.855262.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.95383,28.850743,76.955045,28.851366.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.536208,28.836549,77.537073,28.837565.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.454177,28.835745,77.45534,28.836681.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.50953,28.838229,77.510425,28.83928.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.378511,28.8545,77.379516,28.855262.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.546794,28.842254,77.547909,28.843169.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.50953,28.838229,77.510425,28.83928.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.378511,28.8545,77.379516,28.855262.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.546794,28.842254,77.547909,28.843169.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.527433,28.832762,77.528829,28.833787.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.448291,28.842929,77.449389,28.843847.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.527433,28.832762,77.528829,28.833787.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.448291,28.842929,77.449389,28.843847.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.962591,28.85522,76.963807,28.855722.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.962591,28.85522,76.963807,28.855722.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.382044,28.850399,77.383024,28.851449.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.382044,28.850399,77.383024,28.851449.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.964082,28.85757,76.965228,28.858132.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.964082,28.85757,76.965228,28.858132.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.535455,28.837993,77.536679,28.839134.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.169404,28.856465,77.170482,28.857068.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.298331,28.846606,77.299605,28.847408.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.169404,28.856465,77.170482,28.857068.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.285525,28.847659,77.286912,28.848554.tif
+
+Adding overviews...
+Updating dataset tags...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.535455,28.837993,77.536679,28.839134.tif
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.298331,28.846606,77.299605,28.847408.tif
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.2943,28.847591,77.295541,28.848592.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.952902,28.856166,76.953582,28.857208.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.285525,28.847659,77.286912,28.848554.tif
+Updating dataset tags...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.471116,28.848687,77.472301,28.849646.tif
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.952902,28.856166,76.953582,28.857208.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.368875,28.785527,77.370107,28.786592.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.2943,28.847591,77.295541,28.848592.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.287533,28.857049,77.288491,28.85803.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.287533,28.857049,77.288491,28.85803.tif
+[... the same Reading input / Adding overviews / Updating dataset tags / Writing output cycle repeats for the remaining tiles; interleaved parallel log output truncated ...]
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.884405,28.948867,76.885815,28.949482.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.896325,28.941761,76.897155,28.943049.tif
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.898105,28.94721,76.898841,28.94831.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.884405,28.948867,76.885815,28.949482.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.926873,28.927528,76.927908,28.928875.tif
+Updating dataset tags...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.905626,28.951297,76.906812,28.951878.tif
+
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.258113,28.943559,77.259317,28.944645.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.898105,28.94721,76.898841,28.94831.tif
+Updating dataset tags...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.915698,28.935449,76.916596,28.936723.tif
+
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.905626,28.951297,76.906812,28.951878.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.256898,28.934974,77.258232,28.935959.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.915698,28.935449,76.916596,28.936723.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.256898,28.934974,77.258232,28.935959.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.268644,28.934886,77.269672,28.936023.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.289628,28.942986,77.290971,28.94393.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.903601,28.947053,76.90502,28.947635.tif
+
+Adding overviews...
+Updating dataset tags...
+Adding overviews...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.903601,28.947053,76.90502,28.947635.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.289628,28.942986,77.290971,28.94393.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.268644,28.934886,77.269672,28.936023.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.907662,28.950844,76.90835,28.951969.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.907662,28.950844,76.90835,28.951969.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.251985,28.94787,77.25335,28.948452.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.88633,28.948476,76.887096,28.94956.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.251985,28.94787,77.25335,28.948452.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.88633,28.948476,76.887096,28.94956.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.886334,28.950175,76.887182,28.951397.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.917351,28.950071,76.918716,28.950699.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.913062,28.947949,76.914535,28.948656.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.912954,28.951218,76.913692,28.952507.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.886334,28.950175,76.887182,28.951397.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.917351,28.950071,76.918716,28.950699.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.913062,28.947949,76.914535,28.948656.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.912954,28.951218,76.913692,28.952507.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.431279,28.936963,77.432459,28.938163.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.217525,28.955605,77.218273,28.956667.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.217525,28.955605,77.218273,28.956667.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.259178,28.942939,77.260521,28.944094.tif
+
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.401468,28.939561,77.402701,28.940702.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.431279,28.936963,77.432459,28.938163.tif
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.415133,28.938179,77.416279,28.939415.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.259178,28.942939,77.260521,28.944094.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.401468,28.939561,77.402701,28.940702.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.415133,28.938179,77.416279,28.939415.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.915491,28.953007,76.916947,28.953641.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.324468,28.942848,77.325864,28.94388.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.414298,28.93673,77.41581,28.937925.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.915491,28.953007,76.916947,28.953641.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.224086,28.952098,77.225003,28.953237.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.324468,28.942848,77.325864,28.94388.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.414298,28.93673,77.41581,28.937925.tif
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.407728,28.956252,77.408428,28.957289.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.224086,28.952098,77.225003,28.953237.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.407728,28.956252,77.408428,28.957289.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.266991,28.943217,77.268631,28.944473.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.440341,28.873538,77.441825,28.874925.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.266991,28.943217,77.268631,28.944473.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.440341,28.873538,77.441825,28.874925.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.221169,28.952151,77.22219,28.953507.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.291493,28.933235,77.29294,28.934675.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.221169,28.952151,77.22219,28.953507.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.447961,28.958248,77.448554,28.959222.tif
+
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.427677,28.946816,77.428841,28.948161.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.447961,28.958248,77.448554,28.959222.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.291493,28.933235,77.29294,28.934675.tif
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.430814,28.958514,77.43227,28.959079.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.427677,28.946816,77.428841,28.948161.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.430814,28.958514,77.43227,28.959079.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.423406,28.959032,77.424661,28.959707.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.219602,28.959455,77.220595,28.960561.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.423406,28.959032,77.424661,28.959707.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.221756,28.956899,77.22304,28.957726.tif
+
+Adding overviews...
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.219602,28.959455,77.220595,28.960561.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.221756,28.956899,77.22304,28.957726.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.425208,28.958155,77.426535,28.958868.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.418174,28.945075,77.419569,28.946435.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.425208,28.958155,77.426535,28.958868.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.232493,28.956615,77.233848,28.957765.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.421306,28.951913,77.422879,28.952881.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.225407,28.960811,77.226616,28.96238.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.418174,28.945075,77.419569,28.946435.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.232493,28.956615,77.233848,28.957765.tif
+Adding overviews...
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.421306,28.951913,77.422879,28.952881.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.225407,28.960811,77.226616,28.96238.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.880135,28.916879,76.881266,28.917382.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.880135,28.916879,76.881266,28.917382.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.42902,28.946138,77.4303,28.947424.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.88796,28.973951,76.888717,28.9751.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.351114,28.95661,77.352266,28.957959.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.250986,28.95936,77.252325,28.960745.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.88796,28.973951,76.888717,28.9751.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.42902,28.946138,77.4303,28.947424.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.351114,28.95661,77.352266,28.957959.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.291097,28.945003,77.292804,28.94633.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.250986,28.95936,77.252325,28.960745.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.291097,28.945003,77.292804,28.94633.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.293315,28.948056,77.29497,28.949171.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.433782,28.959234,77.434904,28.960545.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.293315,28.948056,77.29497,28.949171.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.441876,28.96009,77.443214,28.961068.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.256212,28.895766,77.257349,28.89702.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.270443,28.971888,77.271871,28.972685.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.433782,28.959234,77.434904,28.960545.tif
+Adding overviews...
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.270443,28.971888,77.271871,28.972685.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.256212,28.895766,77.257349,28.89702.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.441876,28.96009,77.443214,28.961068.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.346764,28.955006,77.348023,28.956306.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.346764,28.955006,77.348023,28.956306.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.26793,28.895126,77.269475,28.896238.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.228188,28.975158,77.229128,28.97626.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.444652,28.962986,77.446015,28.963965.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.228188,28.975158,77.229128,28.97626.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.444652,28.962986,77.446015,28.963965.tif
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.229458,28.955108,77.231257,28.95632.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.92039,28.815175,76.921463,28.815598.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.26793,28.895126,77.269475,28.896238.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.23228,28.97215,77.233554,28.973512.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.92039,28.815175,76.921463,28.815598.tif
+Adding overviews...
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.229458,28.955108,77.231257,28.95632.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.23228,28.97215,77.233554,28.973512.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.223663,28.992069,77.224585,28.99318.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.223663,28.992069,77.224585,28.99318.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.890943,28.987499,76.892324,28.988273.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.42095,28.950123,77.422685,28.951445.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.260693,28.968166,77.261917,28.969265.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.228707,28.97247,77.229954,28.973988.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.890943,28.987499,76.892324,28.988273.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.235394,28.97924,77.236685,28.980241.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.260693,28.968166,77.261917,28.969265.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.422288,28.96797,77.423587,28.968999.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.29058,28.922459,77.291843,28.923733.tif
+
+Adding overviews...
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.42095,28.950123,77.422685,28.951445.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.228707,28.97247,77.229954,28.973988.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.234277,28.978332,77.235809,28.979443.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.235394,28.97924,77.236685,28.980241.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.422288,28.96797,77.423587,28.968999.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.29058,28.922459,77.291843,28.923733.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.440034,28.965291,77.441488,28.966403.tif
+
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.252804,28.919736,77.253839,28.921065.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.234277,28.978332,77.235809,28.979443.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.233107,28.989203,77.234562,28.990181.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.440034,28.965291,77.441488,28.966403.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.252804,28.919736,77.253839,28.921065.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.233107,28.989203,77.234562,28.990181.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.242409,28.981966,77.243709,28.982998.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.927728,28.997463,76.928392,28.998484.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.242409,28.981966,77.243709,28.982998.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.444144,28.991224,77.445482,28.991824.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.242662,28.986902,77.243978,28.988101.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.927728,28.997463,76.928392,28.998484.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.266362,28.975527,77.267611,28.976745.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.444144,28.991224,77.445482,28.991824.tif
+Adding overviews...
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.242662,28.986902,77.243978,28.988101.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.266362,28.975527,77.267611,28.976745.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.238437,28.986875,77.239703,28.988132.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.441475,28.991872,77.44271,28.992574.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.441475,28.991872,77.44271,28.992574.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.238437,28.986875,77.239703,28.988132.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.387019,28.343102,77.387985,28.343462.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.225361,28.995365,77.226303,28.996462.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.387019,28.343102,77.387985,28.343462.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.448081,28.848099,77.449061,28.849067.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.440135,28.998405,77.441464,28.998987.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.225361,28.995365,77.226303,28.996462.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.440135,28.998405,77.441464,28.998987.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.448081,28.848099,77.449061,28.849067.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.849086,28.813816,76.850417,28.81438.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.955645,28.85393,76.956203,28.854832.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.876252,28.839627,76.877453,28.840116.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.955645,28.85393,76.956203,28.854832.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.849086,28.813816,76.850417,28.81438.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.403778,28.412947,77.404786,28.413436.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.876252,28.839627,76.877453,28.840116.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.241341,28.992554,77.242952,28.993526.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.403778,28.412947,77.404786,28.413436.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.049801,28.854869,77.05096,28.855359.tif
+
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.573315,28.36872,77.574238,28.36964.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.049801,28.854869,77.05096,28.855359.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.241341,28.992554,77.242952,28.993526.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.244783,28.987145,77.246338,28.988209.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.573315,28.36872,77.574238,28.36964.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.595223,28.421471,77.596189,28.421991.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.595223,28.421471,77.596189,28.421991.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.244783,28.987145,77.246338,28.988209.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.333294,28.993503,77.334969,28.994655.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.242459,28.988296,77.244017,28.989758.tif
+
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.035317,28.851863,77.035789,28.852821.tif
+
+Adding overviews...
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.333294,28.993503,77.334969,28.994655.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.035317,28.851863,77.035789,28.852821.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.242459,28.988296,77.244017,28.989758.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.244539,28.986063,77.24616,28.987295.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.354189,28.784581,77.355326,28.785607.tif
+
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.362289,28.791722,77.363577,28.793003.tif
+
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.354189,28.784581,77.355326,28.785607.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.925257,28.809107,76.926596,28.809595.tif
+
+Adding overviews...
+Updating dataset tags...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.244539,28.986063,77.24616,28.987295.tif
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.925257,28.809107,76.926596,28.809595.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.362289,28.791722,77.363577,28.793003.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.892829,28.851524,76.89405,28.852051.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.051239,28.897861,77.051797,28.898836.tif
+
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.591579,28.363717,77.592623,28.364617.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.892829,28.851524,76.89405,28.852051.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.051239,28.897861,77.051797,28.898836.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.391783,28.336326,77.392405,28.337207.tif
+
+Adding overviews...
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.44643,28.995227,77.447866,28.996104.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.591579,28.363717,77.592623,28.364617.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.920369,28.815156,76.921484,28.815579.tif
+
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.391783,28.336326,77.392405,28.337207.tif
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.920369,28.815156,76.921484,28.815579.tif
+Adding overviews...
+Updating dataset tags...
+Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.44643,28.995227,77.447866,28.996104.tif
+Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.438629,28.755661,77.439973,28.756577.tif
+... (the same Reading input / Adding overviews / Updating dataset tags / Writing output cycle repeats for the remaining tiles; log truncated)
+
+
+
+
from glob import glob
+from PIL import Image
+
+files = glob("/home/patel_zeel/kiln_compass_24/regions/high_res/19/*.tif")
+print(len(files))
+
+# collect the width and height of every tile to check the size range
+sizes = []
+for file in files:
+    with Image.open(file) as img:
+        sizes.append(img.size[0])
+        sizes.append(img.size[1])
+
+
783
+
+
+
+
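The tile filenames encode each tile's bounding box as `lon_min,lat_min,lon_max,lat_max`. A small helper (my own, not from the notebook) can recover the box from a path:

```python
import os

# Parse "lon_min,lat_min,lon_max,lat_max.tif" back into floats.
# Helper name is mine; the notebook itself does not define it.
def bbox_from_name(path):
    stem = os.path.splitext(os.path.basename(path))[0]
    return tuple(map(float, stem.split(",")))

print(bbox_from_name("76.920369,28.815156,76.921484,28.815579.tif"))
# → (76.920369, 28.815156, 76.921484, 28.815579)
```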
max(sizes), min(sizes)
+
+
(1216, 229)
+
+
+
+
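Tiles range from 229 to 1216 px per side. YOLO input sizes are multiples of the model's max stride (32), so a quick way to choose a resolution that avoids downscaling any tile is to round the largest dimension up to the next multiple of 32 (the helper below is my own; training later uses 1280 for extra margin):

```python
# Round a dimension up to the next multiple of the model stride (32 for YOLO).
# Sketch of the sizing logic; not code from the notebook.
def next_valid_imgsz(max_dim, stride=32):
    return ((max_dim + stride - 1) // stride) * stride

print(next_valid_imgsz(1216), next_valid_imgsz(229))  # → 1216 256
```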
Image.open(files[1])
+
+
+
+

+
+
+
+
+
+
from tqdm import tqdm
+
+# Convert each GeoTIFF tile to PNG. Note: PIL's PNG output drops the
+# GeoTIFF georeferencing; the bounding box stays encoded in the filename.
+for file in tqdm(files):
+    new_file = file.replace(".tif", ".png")
+    with Image.open(file) as img:
+        img.save(new_file)
+
+
+
+
from ultralytics import YOLO
+
+
+
model = YOLO("yolo11m-obb")
+
+
+
model.train(data="../lab/trench_width/data.yaml", epochs=51, batch=-1, imgsz=1280, save_period=5)  # batch=-1: let AutoBatch pick the batch size
+
+
New https://pypi.org/project/ultralytics/8.3.64 available 😃 Update with 'pip install -U ultralytics'
+Ultralytics 8.3.55 🚀 Python-3.10.15 torch-2.5.0+cu124 CUDA:0 (NVIDIA A100-SXM4-80GB, 81156MiB)
+engine/trainer: task=obb, mode=train, model=yolo11m-obb.pt, data=../lab/trench_width/data.yaml, epochs=51, time=None, patience=100, batch=-1, imgsz=1280, save=True, save_period=5, cache=False, device=None, workers=8, project=None, name=train5, exist_ok=False, pretrained=True, optimizer=auto, verbose=True, seed=0, deterministic=True, single_cls=False, rect=False, cos_lr=False, close_mosaic=10, resume=False, amp=True, fraction=1.0, profile=False, freeze=None, multi_scale=False, overlap_mask=True, mask_ratio=4, dropout=0.0, val=True, split=val, save_json=False, save_hybrid=False, conf=None, iou=0.7, max_det=300, half=False, dnn=False, plots=True, source=None, vid_stride=1, stream_buffer=False, visualize=False, augment=False, agnostic_nms=False, classes=None, retina_masks=False, embed=None, show=False, save_frames=False, save_txt=False, save_conf=False, save_crop=False, show_labels=True, show_conf=True, show_boxes=True, line_width=None, format=torchscript, keras=False, optimize=False, int8=False, dynamic=False, simplify=True, opset=None, workspace=None, nms=False, lr0=0.01, lrf=0.01, momentum=0.937, weight_decay=0.0005, warmup_epochs=3.0, warmup_momentum=0.8, warmup_bias_lr=0.1, box=7.5, cls=0.5, dfl=1.5, pose=12.0, kobj=1.0, nbs=64, hsv_h=0.015, hsv_s=0.7, hsv_v=0.4, degrees=0.0, translate=0.1, scale=0.5, shear=0.0, perspective=0.0, flipud=0.0, fliplr=0.5, bgr=0.0, mosaic=1.0, mixup=0.0, copy_paste=0.0, copy_paste_mode=flip, auto_augment=randaugment, erasing=0.4, crop_fraction=1.0, cfg=None, tracker=botsort.yaml, save_dir=runs/obb/train5
+Overriding model.yaml nc=80 with nc=2
+
+                   from  n    params  module                                       arguments                     
+  0                  -1  1      1856  ultralytics.nn.modules.conv.Conv             [3, 64, 3, 2]                 
+  1                  -1  1     73984  ultralytics.nn.modules.conv.Conv             [64, 128, 3, 2]               
+  2                  -1  1    111872  ultralytics.nn.modules.block.C3k2            [128, 256, 1, True, 0.25]     
+  3                  -1  1    590336  ultralytics.nn.modules.conv.Conv             [256, 256, 3, 2]              
+  4                  -1  1    444928  ultralytics.nn.modules.block.C3k2            [256, 512, 1, True, 0.25]     
+  5                  -1  1   2360320  ultralytics.nn.modules.conv.Conv             [512, 512, 3, 2]              
+  6                  -1  1   1380352  ultralytics.nn.modules.block.C3k2            [512, 512, 1, True]           
+  7                  -1  1   2360320  ultralytics.nn.modules.conv.Conv             [512, 512, 3, 2]              
+  8                  -1  1   1380352  ultralytics.nn.modules.block.C3k2            [512, 512, 1, True]           
+  9                  -1  1    656896  ultralytics.nn.modules.block.SPPF            [512, 512, 5]                 
+ 10                  -1  1    990976  ultralytics.nn.modules.block.C2PSA           [512, 512, 1]                 
+ 11                  -1  1         0  torch.nn.modules.upsampling.Upsample         [None, 2, 'nearest']          
+ 12             [-1, 6]  1         0  ultralytics.nn.modules.conv.Concat           [1]                           
+ 13                  -1  1   1642496  ultralytics.nn.modules.block.C3k2            [1024, 512, 1, True]          
+ 14                  -1  1         0  torch.nn.modules.upsampling.Upsample         [None, 2, 'nearest']          
+ 15             [-1, 4]  1         0  ultralytics.nn.modules.conv.Concat           [1]                           
+ 16                  -1  1    542720  ultralytics.nn.modules.block.C3k2            [1024, 256, 1, True]          
+ 17                  -1  1    590336  ultralytics.nn.modules.conv.Conv             [256, 256, 3, 2]              
+ 18            [-1, 13]  1         0  ultralytics.nn.modules.conv.Concat           [1]                           
+ 19                  -1  1   1511424  ultralytics.nn.modules.block.C3k2            [768, 512, 1, True]           
+ 20                  -1  1   2360320  ultralytics.nn.modules.conv.Conv             [512, 512, 3, 2]              
+ 21            [-1, 10]  1         0  ultralytics.nn.modules.conv.Concat           [1]                           
+ 22                  -1  1   1642496  ultralytics.nn.modules.block.C3k2            [1024, 512, 1, True]          
+ 23        [16, 19, 22]  1   2261401  ultralytics.nn.modules.head.OBB              [2, 1, [256, 512, 512]]       
+YOLO11m-obb summary: 434 layers, 20,903,385 parameters, 20,903,369 gradients, 71.9 GFLOPs
+
+Transferred 685/691 items from pretrained weights
+
+
+
wandb: Using wandb-core as the SDK backend. Please refer to https://wandb.me/wandb-core for more information.
+wandb: Currently logged in as: patel_zeel (sustainability-lab). Use `wandb login --relogin` to force relogin
+
+
+Tracking run with wandb version 0.18.5
+Run data is saved locally in /home/patel_zeel/blog/lab/wandb/run-20250120_153755-5wtpe9o7
+Syncing run train5 to Weights & Biases (docs)
+
Freezing layer 'model.23.dfl.conv.weight'
+AMP: running Automatic Mixed Precision (AMP) checks...
+AMP: checks passed ✅
+
+
+
train: Scanning /home/patel_zeel/blog/lab/trench_width/labels.cache... 10 images, 0 backgrounds, 0 corrupt: 100%|██████████| 10/10 [00:00<?, ?it/s]
+
+
+
AutoBatch: Computing optimal batch size for imgsz=1280 at 60.0% CUDA memory utilization.
+AutoBatch: CUDA:0 (NVIDIA A100-SXM4-80GB) 79.25G total, 0.21G reserved, 0.20G allocated, 78.84G free
+
+
+
+
+
+
      Params      GFLOPs  GPU_mem (GB)  forward (ms) backward (ms)                   input                  output
+    20903385       287.6         4.488         46.53           nan      (1, 3, 1280, 1280)                    list
+    20903385       575.1        10.364         44.88           nan      (2, 3, 1280, 1280)                    list
+    20903385        1150        19.864         47.99           nan      (4, 3, 1280, 1280)                    list
+    20903385        2300        39.047         75.16           nan      (8, 3, 1280, 1280)                    list
+    20903385        4601        76.993         149.4           nan     (16, 3, 1280, 1280)                    list
+CUDA out of memory. Tried to allocate 400.00 MiB. GPU 0 has a total capacity of 79.25 GiB of which 39.50 MiB is free. Including non-PyTorch memory, this process has 79.21 GiB memory in use. Of the allocated memory 78.52 GiB is allocated by PyTorch, and 164.18 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation.  See documentation for Memory Management  (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
+CUDA out of memory. Tried to allocate 400.00 MiB. GPU 0 has a total capacity of 79.25 GiB of which 171.50 MiB is free. Including non-PyTorch memory, this process has 79.08 GiB memory in use. Of the allocated memory 77.98 GiB is allocated by PyTorch, and 587.22 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation.  See documentation for Memory Management  (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
+AutoBatch: Using batch-size 10 for CUDA:0 49.53G/79.25G (62%) ✅
+
+
+
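The AutoBatch log above can be read as a memory-profiling sweep: try a few batch sizes, record GPU memory, then pick the batch that lands near the 60% utilization target. A simplified sketch of that idea (the profiled points below loosely mirror the log; this is not Ultralytics' exact implementation):

```python
# Fit a line to (batch size, GPU memory) samples and solve for the batch
# that hits a target fraction of total memory. Pure-python least squares.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def auto_batch(batches, mem_gb, total_gb, target=0.60):
    slope, intercept = fit_line(batches, mem_gb)
    return int((total_gb * target - intercept) / slope)

# Points loosely matching the profiling table in the log (A100, 79.25 GB):
print(auto_batch([1, 2, 4, 8, 16], [4.5, 10.4, 19.9, 39.0, 77.0], 79.25))
```

The log settles on batch 10 at 62% memory; a linear fit like this lands in the same neighborhood.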
train: Scanning /home/patel_zeel/blog/lab/trench_width/labels.cache... 10 images, 0 backgrounds, 0 corrupt: 100%|██████████| 10/10 [00:00<?, ?it/s]
+val: Scanning /home/patel_zeel/blog/lab/trench_width/labels.cache... 10 images, 0 backgrounds, 0 corrupt: 100%|██████████| 10/10 [00:00<?, ?it/s]
+
+
+
Plotting labels to runs/obb/train5/labels.jpg... 
+optimizer: 'optimizer=auto' found, ignoring 'lr0=0.01' and 'momentum=0.937' and determining best 'optimizer', 'lr0' and 'momentum' automatically... 
+optimizer: AdamW(lr=0.001667, momentum=0.9) with parameter groups 112 weight(decay=0.0), 122 weight(decay=0.00046875), 121 bias(decay=0.0)
+Image sizes 1280 train, 1280 val
+Using 8 dataloader workers
+Logging results to runs/obb/train5
+Starting training for 51 epochs...
+
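The auto-selected learning rate in the log (lr=0.001667) is consistent with a simple class-count heuristic of the form lr = 0.002 · 5 / (4 + nc) for nc=2 classes. A sketch of that rule (hedged: the exact formula inside Ultralytics may differ across versions):

```python
# Class-count-dependent lr heuristic; for nc=2 this gives 0.001667,
# matching the auto-optimizer line in the log. Sketch only.
def auto_lr(nc):
    return round(0.002 * 5 / (4 + nc), 6)

print(auto_lr(2))  # → 0.001667
```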
+      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
+
+
+
       1/51      21.9G      3.042      5.112      3.885         57       1280: 100%|██████████| 1/1 [00:00<00:00,  1.23it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  1.08it/s]
+
+
+
                   all         10         20    0.00146        0.2    0.00126   0.000148
+
+
+
+
+
+

+      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
+
+
+
       2/51      21.9G      2.846      4.908      4.012         58       1280: 100%|██████████| 1/1 [00:00<00:00,  3.09it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  1.09it/s]
+
+
+
                   all         10         20    0.00106       0.15   0.000779   9.96e-05
+
+
+
+
+
+

+      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
+
+
+
       3/51      21.9G      3.022      5.258      3.667         56       1280: 100%|██████████| 1/1 [00:00<00:00,  3.19it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  1.07it/s]
+
+
+
                   all         10         20    0.00103       0.15   0.000764   9.76e-05
+
+
+
+
+
+

+      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
+
+
+
       4/51      21.9G      2.789      5.113      3.852         58       1280: 100%|██████████| 1/1 [00:00<00:00,  3.20it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  1.13it/s]
+
+
+
                   all         10         20   0.000675        0.1   0.000516   7.25e-05
+
+
+
+
+
+

+      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
+
+
+
       5/51        22G      2.526       5.45      3.457         42       1280: 100%|██████████| 1/1 [00:00<00:00,  2.71it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  1.13it/s]
+
+
+
                   all         10         20     0.0017       0.25    0.00121    0.00022
+
+
+
+
+
+

+      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
+
+
+
       6/51      22.1G      2.832      4.983      3.874         48       1280: 100%|██████████| 1/1 [00:00<00:00,  3.09it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  1.13it/s]
+
+
+
                   all         10         20    0.00257       0.35    0.00268   0.000685
+
+
+
+
+
+

+      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
+
+
+
       7/51      22.1G      2.534      4.688      3.428         55       1280: 100%|██████████| 1/1 [00:00<00:00,  3.09it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  1.13it/s]
+
+
+
                   all         10         20    0.00417       0.55     0.0161     0.0029
+
+
+
+
+
+

+      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
+
+
+
       8/51      22.1G      2.091      4.292       3.43         56       1280: 100%|██████████| 1/1 [00:00<00:00,  3.06it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  1.14it/s]
+
+
+
                   all         10         20    0.00667        0.6     0.0156    0.00451
+
+
+
+
+
+

+      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
+
+
+
       9/51      22.1G      2.307      4.644      3.652         35       1280: 100%|██████████| 1/1 [00:00<00:00,  3.02it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  1.14it/s]
+
+
+
                   all         10         20     0.0105       0.55     0.0608     0.0119
+
+
+
+
+
+

+      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
+
+
+
      10/51      22.1G        1.8      4.124      2.982         47       1280: 100%|██████████| 1/1 [00:00<00:00,  3.07it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  1.19it/s]
+
+
+
                   all         10         20     0.0111        0.8     0.0696      0.017
+
+
+
+
+
+

+      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
+
+
+
      11/51      22.1G      1.751      3.753      2.608         54       1280: 100%|██████████| 1/1 [00:00<00:00,  3.48it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  1.18it/s]
+
+
+
                   all         10         20     0.0111        0.8     0.0696      0.017
+
+
+
+
+
+

+      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
+
+
+
      12/51      22.2G      1.821      3.793      2.822         56       1280: 100%|██████████| 1/1 [00:00<00:00,  3.00it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  1.29it/s]
+
+
+
                   all         10         20    0.00953          1      0.145     0.0346
+
+
+
+
+
+

+      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
+
+
+
      13/51      22.1G      1.748       3.64      2.721         56       1280: 100%|██████████| 1/1 [00:00<00:00,  3.49it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  1.30it/s]
+
+
+
                   all         10         20    0.00953          1      0.145     0.0346
+
+
+
+
+
+

+      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
+
+
+
      14/51      22.2G      1.711      3.417      2.728         69       1280: 100%|██████████| 1/1 [00:00<00:00,  3.11it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  1.46it/s]
+
+
+
                   all         10         20    0.00856          1      0.301      0.107
+
+
+
+
+
+

+      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
+
+
+
      15/51      22.1G      1.443       3.42      3.097         45       1280: 100%|██████████| 1/1 [00:00<00:00,  3.50it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  1.45it/s]
+
+
+
                   all         10         20    0.00856          1      0.301      0.107
+
+
+
+
+
+

+      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
+
+
+
      16/51      22.2G      1.468      3.528      2.344         38       1280: 100%|██████████| 1/1 [00:00<00:00,  3.09it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  1.63it/s]
+
+
+
                   all         10         20       0.22       0.65      0.358      0.154
+
+
+
+
+
+

+      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
+
+
+
      17/51      22.1G      1.452      2.938      2.311         54       1280: 100%|██████████| 1/1 [00:00<00:00,  3.45it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  1.60it/s]
+
+
+
                   all         10         20       0.22       0.65      0.358      0.154
+
+
+
+
+
+

+      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
+
+
+
      18/51      22.2G      1.321       2.94      3.078         46       1280: 100%|██████████| 1/1 [00:00<00:00,  3.07it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  1.78it/s]
+
+
+
                   all         10         20      0.322        0.6      0.393       0.17
+
+
+
+
+
+

+      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
+
+
+
      19/51      22.1G      1.385      2.793      2.518         53       1280: 100%|██████████| 1/1 [00:00<00:00,  3.51it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  1.79it/s]
+
+
+
                   all         10         20      0.322        0.6      0.393       0.17
+
+
+
+
+
+

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss  Instances       Size
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)
      20/51      22.2G      1.287      2.709      2.337         46       1280
                   all         10         20      0.385        0.8       0.54      0.237
      21/51      22.1G      1.191       2.52      2.295         56       1280
                   all         10         20      0.385        0.8       0.54      0.237
      22/51      22.2G      1.354      2.634      2.612         56       1280
                   all         10         20      0.463       0.85      0.708       0.39
      23/51      22.1G      1.266      2.719       2.22         35       1280
                   all         10         20      0.463       0.85      0.708       0.39
      24/51      22.2G      1.189       2.15      2.245         59       1280
                   all         10         20      0.479       0.85      0.729      0.418
      25/51      22.1G      1.069      2.278      1.992         52       1280
                   all         10         20      0.479       0.85      0.729      0.418
      26/51      22.2G       1.42      2.523      2.593         54       1280
                   all         10         20      0.711        0.9      0.841       0.52
      27/51      22.1G      1.175      2.119      1.997         56       1280
                   all         10         20      0.711        0.9      0.841       0.52
      28/51      22.2G      1.173      2.096      2.219         56       1280
                   all         10         20      0.856      0.889      0.933      0.623
      29/51      22.1G      1.173      2.163      2.431         56       1280
                   all         10         20      0.856      0.889      0.933      0.623
      30/51      22.2G      1.196      2.354      1.996         41       1280
                   all         10         20      0.856      0.889      0.933      0.623
      31/51      22.2G      1.289      2.233      2.543         49       1280
                   all         10         20      0.971       0.89      0.966      0.654
      32/51      22.1G       1.22      2.208      2.129         51       1280
                   all         10         20      0.971       0.89      0.966      0.654
      33/51      22.2G      1.175      2.047      1.935         45       1280
                   all         10         20      0.971       0.89      0.966      0.654
      34/51      22.2G      1.268      1.903      1.914         62       1280
                   all         10         20      0.806       0.93      0.952      0.676
      35/51      22.1G      1.199      2.076       2.07         52       1280
                   all         10         20      0.806       0.93      0.952      0.676
      36/51      22.2G      1.139      2.094      2.366         51       1280
                   all         10         20      0.806       0.93      0.952      0.676
      37/51      22.2G      1.141      2.146      1.818         49       1280
                   all         10         20      0.768       0.94      0.946      0.671
      38/51      22.1G      1.095      1.823      1.974         60       1280
                   all         10         20      0.768       0.94      0.946      0.671
      39/51      22.2G      1.037      1.773      2.015         54       1280
                   all         10         20      0.768       0.94      0.946      0.671
      40/51      22.2G      1.049      1.755      1.971         60       1280
                   all         10         20       0.87       0.95      0.957      0.669
      41/51      22.1G       1.16      2.066      2.284         46       1280
                   all         10         20       0.87       0.95      0.957      0.669
Closing dataloader mosaic
      42/51      22.2G      1.142      3.141      1.802         20       1280
                   all         10         20       0.87       0.95      0.957      0.669
      43/51      22.2G      1.076      2.939      1.903         20       1280
                   all         10         20      0.905       0.93      0.963      0.677
      44/51      22.1G      1.062      2.816      1.615         20       1280
                   all         10         20      0.905       0.93      0.963      0.677
      45/51      22.2G      1.043      2.889      1.868         20       1280
                   all         10         20      0.905       0.93      0.963      0.677
      46/51      22.2G      1.308      3.073      1.914         20       1280
                   all         10         20      0.895       0.95      0.957      0.701
      47/51      22.1G     0.9313      2.941      1.799         20       1280
                   all         10         20      0.895       0.95      0.957      0.701
      48/51      22.2G     0.9851      2.767      1.656         20       1280
                   all         10         20      0.895       0.95      0.957      0.701
      49/51      22.2G      1.306      2.876      2.089         20       1280
                   all         10         20      0.895       0.95      0.957      0.701
      50/51      22.2G      1.156      2.817      1.904         20       1280
                   all         10         20      0.913      0.999      0.968      0.726
      51/51      22.1G      1.119      2.846      1.677         20       1280
                   all         10         20      0.913      0.999      0.968      0.726
51 epochs completed in 0.034 hours.
Optimizer stripped from runs/obb/train5/weights/last.pt, 43.1MB
Optimizer stripped from runs/obb/train5/weights/best.pt, 43.1MB

Validating runs/obb/train5/weights/best.pt...
Ultralytics 8.3.55 🚀 Python-3.10.15 torch-2.5.0+cu124 CUDA:0 (NVIDIA A100-SXM4-80GB, 81156MiB)
YOLO11m-obb summary (fused): 322 layers, 20,880,025 parameters, 0 gradients, 71.3 GFLOPs

                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)
                   all         10         20      0.913      0.999      0.968      0.734
             inner_box         10         10      0.909      0.998       0.94      0.592
             outer_box         10         10      0.918          1      0.995      0.875
Speed: 0.4ms preprocess, 6.9ms inference, 0.0ms loss, 54.4ms postprocess per image
Results saved to runs/obb/train5

Run history:

lr/pg0                    ▁▂▂▃▃▄▄▅▅▅▆▆▇▇▇██████████▇▇▇▇▇▆▆▆▅▅▄▄▃▃▂
lr/pg1                    ▁▂▂▃▃▄▄▅▅▅▆▆▇▇▇▇█████████▇▇▇▇▇▆▆▆▅▅▄▄▃▃▂
lr/pg2                    ▁▂▃▃▃▄▅▅▅▆▆▇▇▇▇███████████▇▇▇▇▆▆▆▅▅▄▄▃▃▂
metrics/mAP50(B)          ▁▁▁▁▁▁▁▂▂▃▄▄▄▄▅▆▆▆▆▇████████████████████
metrics/mAP50-95(B)       ▁▁▁▁▁▁▁▁▁▁▂▂▂▂▃▃▅▅▅▅▆▇▇▇▇▇███▇▇▇▇▇██████
metrics/precision(B)      ▁▁▁▁▁▁▁▁▁▁▁▁▁▃▃▃▄▄▄▄▄▆▇▇▇█▇▇▇▇▇▇▇▇██▇▇▇█
metrics/recall(B)         ▁▁▁▂▃▅▄▆▆███▅▅▅▆▆▇▇▇▇▇▇▇▇▇▇▇▇▇████▇▇████
model/GFLOPs
model/parameters
model/speed_PyTorch(ms)
train/box_loss            █▇█▇▆▆▅▆▄▄▄▄▃▃▃▂▂▂▂▁▂▂▂▂▂▂▂▂▂▂▁▂▂▁▁▂▁▁▂▂
train/cls_loss            ▇▇█▇█▆▆▅▅▅▄▄▄▃▃▃▃▂▂▂▂▂▂▂▂▁▂▂▂▁▁▂▄▃▃▃▃▃▃▃
train/dfl_loss            ██▇█▆▆▆▇▅▄▄▅▃▃▅▃▃▄▃▃▂▃▃▂▄▂▂▂▃▂▂▃▂▂▁▂▂▁▂▁
val/box_loss              ████▇▄▄▃▃▃▃▃▃▃▃▃▃▂▂▂▂▂▂▂▂▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁
val/cls_loss              ███▇▆▅▄▄▄▃▃▃▃▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▁▁▁▁▁▁▁▁▁▁▁▁
val/dfl_loss              ████▇▅▅▆▆▆▅▄▄▄▄▂▂▂▂▂▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁

Run summary:

lr/pg0                    2e-05
lr/pg1                    2e-05
lr/pg2                    2e-05
metrics/mAP50(B)          0.96773
metrics/mAP50-95(B)       0.73366
metrics/precision(B)      0.9135
metrics/recall(B)         0.99876
model/GFLOPs              71.889
model/parameters          20903385
model/speed_PyTorch(ms)   13.092
train/box_loss            1.11869
train/cls_loss            2.84641
train/dfl_loss            1.6767
val/box_loss              0.93457
val/cls_loss              1.79106
val/dfl_loss              1.45649
View run train5 at: https://wandb.ai/sustainability-lab/Ultralytics/runs/5wtpe9o7
View project at: https://wandb.ai/sustainability-lab/Ultralytics
Synced 4 W&B file(s), 0 media file(s), 5 artifact file(s) and 16 other file(s)

Find logs at: ./wandb/run-20250120_153755-5wtpe9o7/logs
ultralytics.utils.metrics.OBBMetrics object with attributes:

ap_class_index: array([0, 1])
box: ultralytics.utils.metrics.Metric object
confusion_matrix: <ultralytics.utils.metrics.ConfusionMatrix object at 0x7fe64f199420>
curves: []
curves_results: []
fitness: np.float64(0.7570705318296267)
keys: ['metrics/precision(B)', 'metrics/recall(B)', 'metrics/mAP50(B)', 'metrics/mAP50-95(B)']
maps: array([    0.59184,     0.87549])
names: {0: 'inner_box', 1: 'outer_box'}
plot: True
results_dict: {'metrics/precision(B)': np.float64(0.9134993192895953), 'metrics/recall(B)': np.float64(0.9987569029703861), 'metrics/mAP50(B)': np.float64(0.9677272727272728), 'metrics/mAP50-95(B)': np.float64(0.7336642272854437), 'fitness': np.float64(0.7570705318296267)}
save_dir: PosixPath('runs/obb/train5')
speed: {'preprocess': 0.35784244537353516, 'inference': 6.9373369216918945, 'loss': 0.0010251998901367188, 'postprocess': 54.367613792419434}
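The numbers in this dump are easy to sanity-check: `maps` holds per-class mAP50-95, the overall mAP50-95 is their mean, and `fitness` is the weighted score Ultralytics uses to select `best.pt` (0.1·mAP50 + 0.9·mAP50-95 for box metrics, in the version used here). A quick check with the values above:

```python
# Per-class mAP50-95 from `maps` above: [inner_box, outer_box]
maps = [0.59184, 0.87549]

# The overall mAP50-95 is the mean over classes
print(round(sum(maps) / len(maps), 3))  # 0.734, matching the "all" row

# Fitness weights mAP50-95 much more heavily than mAP50
map50, map50_95 = 0.9677272727272728, 0.7336642272854437
print(round(0.1 * map50 + 0.9 * map50_95, 6))  # 0.757071, matching `fitness`
```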
import os
from glob import glob

import numpy as np
import supervision as sv
from PIL import Image
from ultralytics import YOLO

# Load the best fine-tuned OBB checkpoint
pred_model = YOLO("/home/patel_zeel/blog/lab/runs/obb/train5/weights/best.pt")

# Pick a random high-resolution tile and flag it if it was part of the training set
files = glob("/home/patel_zeel/kiln_compass_24/regions/high_res/19/*.png")
# np.random.seed(1)
random_file = np.random.choice(files)
base_name = os.path.basename(random_file)
if base_name in [os.path.basename(file) for file in glob("../lab/trench_width/images/*.png")]:
    print("Part of the training dataset")

# Run inference and draw the oriented boxes with supervision
result = pred_model(random_file, imgsz=1280, verbose=False)[0]
detection = sv.Detections.from_ultralytics(result)

img = Image.open(random_file)
box_annotator = sv.OrientedBoxAnnotator()
label_annotator = sv.LabelAnnotator()
annotated_image = box_annotator.annotate(img.copy(), detection)
annotated_image = label_annotator.annotate(annotated_image, detection)
display(annotated_image)
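Since the end goal here is trench width, the oriented boxes are more useful than axis-aligned ones: the width can be read off as the short side of the rotated rectangle. A minimal sketch, assuming the four corners of a box are available as (x, y) pairs (e.g. from `result.obb.xyxyxyxy` in Ultralytics; the corner values below are made up for illustration):

```python
import math

def obb_width(corners):
    """Short-side length of an oriented box given its 4 corners in order."""
    sides = [math.dist(corners[i], corners[(i + 1) % 4]) for i in range(4)]
    # Opposite sides of the rectangle are (near-)equal; width is the shorter pair
    return min(sides)

# Axis-aligned 40 x 10 rectangle
print(obb_width([(0, 0), (40, 0), (40, 10), (0, 10)]))  # 10.0
# The same works for a rotated 10 x 5 rectangle
print(obb_width([(0, 0), (6, 8), (10, 5), (4, -3)]))    # 5.0
```

This gives a width in pixels; multiplying by the imagery's ground resolution would convert it to a physical width.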


Run seg

model = YOLO("yolov8m-seg")
model.train(data="../lab/trench_width/data.yaml", epochs=51, batch=-1, imgsz=1280, save_period=5)
New https://pypi.org/project/ultralytics/8.3.64 available 😃 Update with 'pip install -U ultralytics'
+Ultralytics 8.3.55 🚀 Python-3.10.15 torch-2.5.0+cu124 CUDA:0 (NVIDIA A100-SXM4-80GB, 81156MiB)
+engine/trainer: task=segment, mode=train, model=yolov8m-seg.pt, data=../lab/trench_width/data.yaml, epochs=51, time=None, patience=100, batch=-1, imgsz=1280, save=True, save_period=5, cache=False, device=None, workers=8, project=None, name=train, exist_ok=False, pretrained=True, optimizer=auto, verbose=True, seed=0, deterministic=True, single_cls=False, rect=False, cos_lr=False, close_mosaic=10, resume=False, amp=True, fraction=1.0, profile=False, freeze=None, multi_scale=False, overlap_mask=True, mask_ratio=4, dropout=0.0, val=True, split=val, save_json=False, save_hybrid=False, conf=None, iou=0.7, max_det=300, half=False, dnn=False, plots=True, source=None, vid_stride=1, stream_buffer=False, visualize=False, augment=False, agnostic_nms=False, classes=None, retina_masks=False, embed=None, show=False, save_frames=False, save_txt=False, save_conf=False, save_crop=False, show_labels=True, show_conf=True, show_boxes=True, line_width=None, format=torchscript, keras=False, optimize=False, int8=False, dynamic=False, simplify=True, opset=None, workspace=None, nms=False, lr0=0.01, lrf=0.01, momentum=0.937, weight_decay=0.0005, warmup_epochs=3.0, warmup_momentum=0.8, warmup_bias_lr=0.1, box=7.5, cls=0.5, dfl=1.5, pose=12.0, kobj=1.0, nbs=64, hsv_h=0.015, hsv_s=0.7, hsv_v=0.4, degrees=0.0, translate=0.1, scale=0.5, shear=0.0, perspective=0.0, flipud=0.0, fliplr=0.5, bgr=0.0, mosaic=1.0, mixup=0.0, copy_paste=0.0, copy_paste_mode=flip, auto_augment=randaugment, erasing=0.4, crop_fraction=1.0, cfg=None, tracker=botsort.yaml, save_dir=runs/segment/train
+Overriding model.yaml nc=80 with nc=2
+
+                   from  n    params  module                                       arguments                     
+  0                  -1  1      1392  ultralytics.nn.modules.conv.Conv             [3, 48, 3, 2]                 
+  1                  -1  1     41664  ultralytics.nn.modules.conv.Conv             [48, 96, 3, 2]                
+  2                  -1  2    111360  ultralytics.nn.modules.block.C2f             [96, 96, 2, True]             
+  3                  -1  1    166272  ultralytics.nn.modules.conv.Conv             [96, 192, 3, 2]               
+  4                  -1  4    813312  ultralytics.nn.modules.block.C2f             [192, 192, 4, True]           
+  5                  -1  1    664320  ultralytics.nn.modules.conv.Conv             [192, 384, 3, 2]              
+  6                  -1  4   3248640  ultralytics.nn.modules.block.C2f             [384, 384, 4, True]           
+  7                  -1  1   1991808  ultralytics.nn.modules.conv.Conv             [384, 576, 3, 2]              
+  8                  -1  2   3985920  ultralytics.nn.modules.block.C2f             [576, 576, 2, True]           
+  9                  -1  1    831168  ultralytics.nn.modules.block.SPPF            [576, 576, 5]                 
+ 10                  -1  1         0  torch.nn.modules.upsampling.Upsample         [None, 2, 'nearest']          
+ 11             [-1, 6]  1         0  ultralytics.nn.modules.conv.Concat           [1]                           
+ 12                  -1  2   1993728  ultralytics.nn.modules.block.C2f             [960, 384, 2]                 
+ 13                  -1  1         0  torch.nn.modules.upsampling.Upsample         [None, 2, 'nearest']          
+ 14             [-1, 4]  1         0  ultralytics.nn.modules.conv.Concat           [1]                           
+ 15                  -1  2    517632  ultralytics.nn.modules.block.C2f             [576, 192, 2]                 
+ 16                  -1  1    332160  ultralytics.nn.modules.conv.Conv             [192, 192, 3, 2]              
+ 17            [-1, 12]  1         0  ultralytics.nn.modules.conv.Concat           [1]                           
+ 18                  -1  2   1846272  ultralytics.nn.modules.block.C2f             [576, 384, 2]                 
+ 19                  -1  1   1327872  ultralytics.nn.modules.conv.Conv             [384, 384, 3, 2]              
+ 20             [-1, 9]  1         0  ultralytics.nn.modules.conv.Concat           [1]                           
+ 21                  -1  2   4207104  ultralytics.nn.modules.block.C2f             [960, 576, 2]                 
+ 22        [15, 18, 21]  1   5160182  ultralytics.nn.modules.head.Segment          [2, 32, 192, [192, 384, 576]] 
+YOLOv8m-seg summary: 331 layers, 27,240,806 parameters, 27,240,790 gradients, 110.4 GFLOPs
+
+Transferred 531/537 items from pretrained weights
Tracking run with wandb version 0.18.5
Run data is saved locally in /home/patel_zeel/blog/lab/wandb/run-20250120_154343-tvkavy42
Syncing run train to Weights & Biases (docs)
Freezing layer 'model.22.dfl.conv.weight'
+AMP: running Automatic Mixed Precision (AMP) checks...
+AMP: checks passed ✅
train: Scanning /home/patel_zeel/blog/lab/trench_width/labels.cache... 10 images, 0 backgrounds, 0 corrupt: 100%|██████████| 10/10 [00:00<?, ?it/s]
AutoBatch: Computing optimal batch size for imgsz=1280 at 60.0% CUDA memory utilization.
+AutoBatch: CUDA:0 (NVIDIA A100-SXM4-80GB) 79.25G total, 3.34G reserved, 0.75G allocated, 75.16G free
      Params      GFLOPs  GPU_mem (GB)  forward (ms) backward (ms)                   input                  output
+    27240806       441.6         4.140         29.42           nan      (1, 3, 1280, 1280)                    list
+    27240806       883.2         9.200         36.04           nan      (2, 3, 1280, 1280)                    list
+    27240806        1766        17.132         41.55           nan      (4, 3, 1280, 1280)                    list
+    27240806        3533        33.320          73.9           nan      (8, 3, 1280, 1280)                    list
+    27240806        7065        64.274         142.8           nan     (16, 3, 1280, 1280)                    list
+    27240806   1.413e+04       127.767         281.6           nan     (32, 3, 1280, 1280)                    list
+CUDA out of memory. Tried to allocate 600.00 MiB. GPU 0 has a total capacity of 79.25 GiB of which 167.50 MiB is free. Including non-PyTorch memory, this process has 79.07 GiB memory in use. Of the allocated memory 77.43 GiB is allocated by PyTorch, and 1.08 GiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation.  See documentation for Memory Management  (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
+AutoBatch: Using batch-size 11 for CUDA:0 49.63G/79.25G (63%) ✅
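AutoBatch's choice of 11 can be roughly reproduced: it profiles a few batch sizes, fits a line to memory use versus batch size, and solves for the batch that lands at the target memory fraction (60% of 79.25 GB here). A pure-Python sketch using the measurements above, with the OOM'd batch-32 run discarded (the actual Ultralytics implementation differs in details such as safety clipping):

```python
# (batch, GPU_mem in GB) pairs measured by the profile above; batch 32 ran OOM
xs = [1, 2, 4, 8, 16]
ys = [4.140, 9.200, 17.132, 33.320, 64.274]

# Least-squares line: mem ≈ slope * batch + intercept
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx

# Batch size that should use ~60% of the 79.25 GB card
batch = int((0.60 * 79.25 - intercept) / slope)
print(batch)  # 11, matching "Using batch-size 11"
```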
train: Scanning /home/patel_zeel/blog/lab/trench_width/labels.cache... 10 images, 0 backgrounds, 0 corrupt: 100%|██████████| 10/10 [00:00<?, ?it/s]
+val: Scanning /home/patel_zeel/blog/lab/trench_width/labels.cache... 10 images, 0 backgrounds, 0 corrupt: 100%|██████████| 10/10 [00:00<?, ?it/s]
Plotting labels to runs/segment/train/labels.jpg... 
+optimizer: 'optimizer=auto' found, ignoring 'lr0=0.01' and 'momentum=0.937' and determining best 'optimizer', 'lr0' and 'momentum' automatically... 
+optimizer: AdamW(lr=0.001667, momentum=0.9) with parameter groups 86 weight(decay=0.0), 97 weight(decay=0.000515625), 96 bias(decay=0.0)
+Image sizes 1280 train, 1280 val
+Using 8 dataloader workers
+Logging results to runs/segment/train
+Starting training for 51 epochs...
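The automatically chosen lr=0.001667 is not arbitrary: with `optimizer=auto`, Ultralytics (in the version used here) picks AdamW for short runs and scales the initial learning rate with the number of classes, roughly lr0 = 0.002 · 5/(4 + nc); treat the exact formula as an implementation detail that may change between releases. With nc=2:

```python
nc = 2  # inner_box, outer_box
lr0 = round(0.002 * 5 / (4 + nc), 6)
print(lr0)  # 0.001667, matching AdamW(lr=0.001667, momentum=0.9) above
```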
      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95)
       1/51      19.4G      2.736      7.179      6.442      2.486         57       1280
                   all         10         20          0          0          0          0          0          0          0          0
       2/51      19.4G      2.783      6.387      6.172      2.612         59       1280
                   all         10         20          0          0          0          0          0          0          0          0
       3/51      19.4G      2.929      7.076      6.595      2.577         56       1280
                   all         10         20          0          0          0          0          0          0          0          0
       4/51      19.4G      2.726      7.228      6.208      2.532         58       1280
                   all         10         20          0          0          0          0          0          0          0          0
       5/51      19.3G      2.499      6.182      7.267      2.396         42       1280
                   all         10         20          0          0          0          0          0          0          0          0
       6/51      19.3G      3.034      8.223      7.171      2.728         48       1280
                   all         10         20          0          0          0          0          0          0          0          0
       7/51      19.4G       3.12      7.957      7.105      2.859         55       1280
                   all         10         20          0          0          0          0          0          0          0          0
       8/51      19.4G      2.931      7.516      6.602      2.776         56       1280
                   all         10         20      0.696       0.15      0.145     0.0523      0.617        0.2     0.0585     0.0102
       9/51      19.4G      2.629      5.092      6.662      2.429         35       1280
                   all         10         20       0.13       0.25       0.16     0.0768      0.132        0.1      0.154     0.0326
      10/51      19.5G      1.897      4.349      4.317      1.925         48       1280
                   all         10         20      0.452      0.351       0.39      0.175      0.172       0.35       0.25     0.0854
      11/51      19.6G      1.887      4.137      3.486      1.877         54       1280
                   all         10         20        0.8       0.15      0.357      0.219      0.105      0.614      0.284     0.0629
      12/51      19.6G       1.43      3.731      3.628      1.507         56       1280
                   all         10         20        0.8       0.15      0.357      0.219      0.105      0.614      0.284     0.0629
      13/51      19.7G      1.559      3.545       3.55      1.526         56       1280
                   all         10         20      0.904        0.3        0.5      0.193      0.904        0.3       0.43      0.108
      14/51      19.7G       1.39      2.989      2.648      1.478         69       1280
                   all         10         20      0.904        0.3        0.5      0.193      0.904        0.3       0.43      0.108
      15/51      19.6G      1.729      2.974      3.255      1.676         45       1280
                   all         10         20      0.455      0.727      0.398      0.158      0.451      0.677      0.349      0.137
      16/51      19.4G      1.711      2.639      3.226      1.627         38       1280
                   all         10         20      0.455      0.727      0.398      0.158      0.451      0.677      0.349      0.137
+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
+
+
+
      17/51      19.7G      1.329      2.381      2.383      1.373         54       1280: 100%|██████████| 1/1 [00:00<00:00,  2.92it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  1.26it/s]
+
+
+
                   all         10         20      0.284       0.75      0.629      0.298      0.299        0.8      0.649      0.261
+
+
+
+
+
+

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
+
+
+
      18/51      19.4G      1.228      1.977      2.472      1.299         46       1280: 100%|██████████| 1/1 [00:00<00:00,  3.18it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  1.22it/s]
+
+
+
                   all         10         20      0.284       0.75      0.629      0.298      0.299        0.8      0.649      0.261
+
+
+
+
+
+

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
+
+
+
      19/51      19.7G      1.415      2.453      2.654      1.415         54       1280: 100%|██████████| 1/1 [00:00<00:00,  2.92it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  1.27it/s]
+
+
+
                   all         10         20      0.792       0.35      0.358      0.212      0.792       0.35      0.346      0.179
+
+
+
+
+
+

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      20/51      19.5G      1.282      1.927      2.378      1.391         46       1280: 100%|██████████| 1/1 [00:00<00:00,  3.16it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  1.41it/s]
                   all         10         20      0.792       0.35      0.358      0.212      0.792       0.35      0.346      0.179

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      21/51      19.7G      1.192      1.933       2.26       1.32         56       1280: 100%|██████████| 1/1 [00:00<00:00,  2.94it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  1.64it/s]
                   all         10         20      0.474        0.5      0.538      0.356      0.474        0.5      0.539      0.285

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      22/51      19.6G      1.181      1.726      2.088      1.278         56       1280: 100%|██████████| 1/1 [00:00<00:00,  3.16it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  1.61it/s]
                   all         10         20      0.474        0.5      0.538      0.356      0.474        0.5      0.539      0.285

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      23/51      19.4G        1.3      2.048      2.568      1.359         35       1280: 100%|██████████| 1/1 [00:00<00:00,  2.89it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  2.50it/s]
                   all         10         20      0.778        0.6      0.659      0.352      0.722       0.55      0.637      0.374

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      24/51      19.6G       1.12      1.541      1.564      1.297         60       1280: 100%|██████████| 1/1 [00:00<00:00,  3.15it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  2.32it/s]
                   all         10         20      0.778        0.6      0.659      0.352      0.722       0.55      0.637      0.374

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      25/51      19.6G      1.043      1.471      1.694      1.228         52       1280: 100%|██████████| 1/1 [00:00<00:00,  2.84it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  3.49it/s]
                   all         10         20      0.415        0.7      0.443      0.248      0.436       0.75      0.445      0.307

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      26/51      19.5G      1.355       1.56      1.975       1.43         54       1280: 100%|██████████| 1/1 [00:00<00:00,  3.24it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  3.66it/s]
                   all         10         20      0.415        0.7      0.443      0.248      0.436       0.75      0.445      0.307

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      27/51      19.7G       1.02      1.483      1.475      1.218         56       1280: 100%|██████████| 1/1 [00:00<00:00,  2.86it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  5.42it/s]
                   all         10         20      0.771       0.55      0.749      0.416      0.771       0.55       0.76       0.51

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      28/51      19.6G      1.079      1.262      1.616      1.309         56       1280: 100%|██████████| 1/1 [00:00<00:00,  3.23it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  5.32it/s]
                   all         10         20      0.771       0.55      0.749      0.416      0.771       0.55       0.76       0.51

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      29/51      19.7G      1.093      1.497      1.691      1.302         56       1280: 100%|██████████| 1/1 [00:00<00:00,  2.99it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  5.35it/s]
                   all         10         20      0.774       0.75      0.799      0.436      0.774       0.75      0.788      0.514

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      30/51      19.5G      1.135      1.419      1.848      1.339         42       1280: 100%|██████████| 1/1 [00:00<00:00,  3.19it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  5.87it/s]
                   all         10         20      0.774       0.75      0.799      0.436      0.774       0.75      0.788      0.514

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      31/51      19.6G      1.042      1.496      1.698      1.272         50       1280: 100%|██████████| 1/1 [00:00<00:00,  2.99it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  5.63it/s]
                   all         10         20      0.632        0.7      0.652      0.429      0.663       0.75      0.698      0.501

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      32/51      19.6G       1.05      1.488      1.513      1.214         52       1280: 100%|██████████| 1/1 [00:00<00:00,  3.26it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  5.62it/s]
                   all         10         20      0.632        0.7      0.652      0.429      0.663       0.75      0.698      0.501

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      33/51      19.6G      1.043      1.309       1.44      1.269         45       1280: 100%|██████████| 1/1 [00:00<00:00,  3.27it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  5.10it/s]
                   all         10         20      0.632        0.7      0.652      0.429      0.663       0.75      0.698      0.501

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      34/51      19.7G     0.9872      1.389      1.329      1.191         62       1280: 100%|██████████| 1/1 [00:00<00:00,  2.98it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  5.38it/s]
                   all         10         20      0.646        0.7      0.724      0.512      0.713        0.8      0.763       0.58

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      35/51      19.6G      1.019      1.202      1.568      1.191         52       1280: 100%|██████████| 1/1 [00:00<00:00,  3.23it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  5.40it/s]
                   all         10         20      0.646        0.7      0.724      0.512      0.713        0.8      0.763       0.58

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      36/51      19.6G      1.029       1.35      1.522      1.232         51       1280: 100%|██████████| 1/1 [00:00<00:00,  3.27it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  5.22it/s]
                   all         10         20      0.646        0.7      0.724      0.512      0.713        0.8      0.763       0.58

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      37/51      19.6G      1.056      1.289      1.494      1.221         50       1280: 100%|██████████| 1/1 [00:00<00:00,  2.96it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  5.64it/s]
                   all         10         20      0.635       0.85      0.803        0.6       0.69        0.9      0.846      0.657

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      38/51      19.6G     0.8953      1.298       1.12      1.068         60       1280: 100%|██████████| 1/1 [00:00<00:00,  3.19it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  5.65it/s]
                   all         10         20      0.635       0.85      0.803        0.6       0.69        0.9      0.846      0.657

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      39/51      19.6G     0.8898      1.057      1.206      1.153         54       1280: 100%|██████████| 1/1 [00:00<00:00,  3.25it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  5.70it/s]
                   all         10         20      0.635       0.85      0.803        0.6       0.69        0.9      0.846      0.657

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      40/51      19.7G     0.9538      1.113      1.201      1.206         60       1280: 100%|██████████| 1/1 [00:00<00:00,  2.89it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  5.69it/s]
                   all         10         20      0.736        0.8       0.85      0.621      0.812        0.9      0.919      0.683

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      41/51      19.5G     0.9001      1.213      1.314      1.171         46       1280: 100%|██████████| 1/1 [00:00<00:00,  3.22it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  5.36it/s]
                   all         10         20      0.736        0.8       0.85      0.621      0.812        0.9      0.919      0.683
Closing dataloader mosaic
+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      42/51      19.3G      1.192      1.432      2.694      1.399         20       1280: 100%|██████████| 1/1 [00:00<00:00,  1.16it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  4.71it/s]
                   all         10         20      0.736        0.8       0.85      0.621      0.812        0.9      0.919      0.683

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      43/51      19.3G      1.034      1.252      2.616      1.418         20       1280: 100%|██████████| 1/1 [00:00<00:00,  2.93it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  4.62it/s]
                   all         10         20      0.814       0.85      0.876      0.651      0.866        0.9      0.931      0.702

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      44/51      19.2G     0.9355      1.308       2.01       1.27         20       1280: 100%|██████████| 1/1 [00:00<00:00,  3.24it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  4.74it/s]
                   all         10         20      0.814       0.85      0.876      0.651      0.866        0.9      0.931      0.702

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      45/51      19.3G     0.9141      1.258      2.125      1.412         20       1280: 100%|██████████| 1/1 [00:00<00:00,  3.20it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  5.62it/s]
                   all         10         20      0.814       0.85      0.876      0.651      0.866        0.9      0.931      0.702

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      46/51      19.3G      1.256      1.421      2.302      1.435         20       1280: 100%|██████████| 1/1 [00:00<00:00,  3.01it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  5.75it/s]
                   all         10         20      0.782      0.829       0.84        0.6      0.804      0.819      0.881      0.687

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      47/51      19.2G     0.8384      1.046      1.981      1.267         20       1280: 100%|██████████| 1/1 [00:00<00:00,  3.27it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  5.70it/s]
                   all         10         20      0.782      0.829       0.84        0.6      0.804      0.819      0.881      0.687

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      48/51      19.3G      1.146      1.197      2.063      1.319         20       1280: 100%|██████████| 1/1 [00:00<00:00,  3.25it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  5.71it/s]
                   all         10         20      0.782      0.829       0.84        0.6      0.804      0.819      0.881      0.687

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      49/51      19.3G      1.144      1.099      2.118      1.411         20       1280: 100%|██████████| 1/1 [00:00<00:00,  2.99it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  5.79it/s]
                   all         10         20      0.811      0.776      0.909      0.675      0.797      0.879      0.926      0.728

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      50/51      19.2G      1.022      1.027      1.993      1.325         20       1280: 100%|██████████| 1/1 [00:00<00:00,  3.26it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  5.60it/s]
                   all         10         20      0.811      0.776      0.909      0.675      0.797      0.879      0.926      0.728

+      Epoch    GPU_mem   box_loss   seg_loss   cls_loss   dfl_loss  Instances       Size
      51/51      19.3G      1.167      1.066        2.1      1.344         20       1280: 100%|██████████| 1/1 [00:00<00:00,  3.12it/s]
+                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  5.51it/s]
                   all         10         20      0.811      0.776      0.909      0.675      0.797      0.879      0.926      0.728

+51 epochs completed in 0.031 hours.
+Optimizer stripped from runs/segment/train/weights/last.pt, 55.0MB
+Optimizer stripped from runs/segment/train/weights/best.pt, 55.0MB
+
+Validating runs/segment/train/weights/best.pt...
+Ultralytics 8.3.55 🚀 Python-3.10.15 torch-2.5.0+cu124 CUDA:0 (NVIDIA A100-SXM4-80GB, 81156MiB)
+YOLOv8m-seg summary (fused): 245 layers, 27,223,542 parameters, 0 gradients, 110.0 GFLOPs
                 Class     Images  Instances      Box(P          R      mAP50  mAP50-95)     Mask(P          R      mAP50  mAP50-95): 100%|██████████| 1/1 [00:00<00:00,  5.44it/s]
+                   all         10         20      0.811      0.776      0.909      0.675      0.797      0.879      0.926      0.727
+             inner_box         10         10      0.622        0.7      0.823      0.508      0.594        0.8      0.857      0.678
+             outer_box         10         10          1      0.852      0.995      0.842          1      0.959      0.995      0.776
+Speed: 0.4ms preprocess, 6.9ms inference, 0.0ms loss, 3.5ms postprocess per image
+Results saved to runs/segment/train
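The validation table above packs eleven whitespace-separated columns into each row: class name, image and instance counts, then precision, recall, mAP50, and mAP50-95 for boxes and again for masks. As a minimal sketch of what those columns mean (the `parse_val_row` helper and column names below are illustrative, not an Ultralytics API), one way to pull a printed row into named fields:

```python
# Column layout of an Ultralytics validation results row, as printed above.
COLUMNS = [
    "class", "images", "instances",
    "box_p", "box_r", "box_map50", "box_map50_95",
    "mask_p", "mask_r", "mask_map50", "mask_map50_95",
]

def parse_val_row(row: str) -> dict:
    """Split one whitespace-separated results row and attach column names.

    The first field is the class name; everything else is numeric
    (counts are parsed as floats here for simplicity).
    """
    parts = row.split()
    values = [parts[0]] + [float(v) for v in parts[1:]]
    return dict(zip(COLUMNS, values))

# The outer_box row from the final validation pass above.
row = "outer_box 10 10 1 0.852 0.995 0.842 1 0.959 0.995 0.776"
metrics = parse_val_row(row)
print(metrics["box_map50"])       # box mAP at IoU 0.5
print(metrics["mask_map50_95"])   # mask mAP averaged over IoU 0.5:0.95
```

This makes explicit that the first four-value group after the counts describes boxes and the second describes masks, which is easy to misread in the raw log.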

Run history:

lr/pg0 ▁▂▂▃▃▄▄▅▅▅▆▆▇▇▇▇███████████▇▇▇▇▆▆▆▆▅▄▄▃▂
lr/pg1 ▁▂▂▃▃▄▅▅▅▆▆▇▇▇▇████████████▇▇▇▆▆▆▆▅▄▄▄▃▂
lr/pg2 ▁▂▂▃▃▄▄▅▅▅▆▆▇▇▇▇███████████▇▇▇▆▆▆▆▅▄▄▄▃▂
metrics/mAP50(B) ▁▁▁▁▂▄▄▄▅▅▄▆▆▄▄▅▆▆▄▄▇▇▇▆▆▇▇▇▇▇█████▇▇▇██
metrics/mAP50(M) ▁▁▁▁▁▁▁▂▃▃▄▄▄▆▄▅▅▆▆▄▇▇▇▇▆▇▇▇▇▇██████████
metrics/mAP50-95(B) ▁▁▁▁▁▁▂▂▃▃▃▃▃▃▄▃▃▅▅▅▄▅▅▆▆▅▅▆▆▆▇▇▇▇██▇▇██
metrics/mAP50-95(M) ▁▁▁▁▁▁▁▁▂▂▂▂▂▂▄▃▄▄▅▅▄▆▆▆▆▆▇▇▇▇██████████
metrics/precision(B) ▁▁▁▁▁▇▂▅██▅▅▃▃█▅██▅▅███▆▆▇▇▇▆▆▇▇▇███████
metrics/precision(M) ▁▁▁▁▁▆▂▂▂▂█▄▄▃▃▇▅▅▇▄▇▇▇▇▆▆▇▇▇▆▆▇▇▇██▇▇▇▇
metrics/recall(B) ▁▁▁▁▁▃▄▂▂▃▇▇▇▇▄▅▅▆▆▇▆▇▇▇▇▇▇▇██████████▇▇
metrics/recall(M) ▁▁▁▁▁▃▂▄▆▆▃▆▆▇▇▅▅▅▅▇▅▅▇▇▇▇▇▇▇███████▇▇██
model/GFLOPs
model/parameters
model/speed_PyTorch(ms)
train/box_loss ▇▇▇▇▆█▇▆▄▄▃▃▄▄▃▂▂▂▂▂▃▂▂▂▂▂▁▂▂▂▁▁▁▂▂▁▁▂▂▂
train/cls_loss ▇▇▇▇██▇▇▅▄▄▃▃▃▂▃▂▂▂▃▂▁▁▂▂▁▁▁▁▁▁▁▁▃▃▂▂▂▂▂
train/dfl_loss ▇▇▆▇█▆▄▄▃▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂▁▁▂▂▁▂▁▂▂▂▂▂▂▂▂
train/seg_loss ▇▆▇▇▆██▅▄▄▄▃▃▃▂▂▂▂▂▂▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁
val/box_loss ██████▄▃▃▃▃▃▃▃▃▂▂▂▂▂▃▂▂▂▂▂▁▁▁▁▁▁▁▁▁▁▁▁▁▁
val/cls_loss ▃▃▃▃▃▄▄██▄▄▄▆▆▅▃▅▅██▃▃▃▃▃▂▂▂▂▂▁▁▁▁▁▁▁▁▁▁
val/dfl_loss ██████▄▃▂▂▃▃▃▂▂▂▂▂▂▂▄▂▂▂▂▂▂▂▂▂▁▁▁▁▁▁▁▁▁▁
val/seg_loss █████▇▆▆▅▅▄▃▃▃▃▃▄▄▃▅▂▂▂▂▂▂▂▂▂▁▁▂▂▂▁▁▁▁▁▁

Run summary:

lr/pg0 2e-05
lr/pg1 2e-05
lr/pg2 2e-05
metrics/mAP50(B) 0.90923
metrics/mAP50(M) 0.92577
metrics/mAP50-95(B) 0.67538
metrics/mAP50-95(M) 0.72695
metrics/precision(B) 0.81082
metrics/precision(M) 0.79694
metrics/recall(B) 0.77614
metrics/recall(M) 0.87927
model/GFLOPs 110.395
model/parameters 27240806
model/speed_PyTorch(ms) 13.361
train/box_loss 1.16748
train/cls_loss 2.09975
train/dfl_loss 1.34438
train/seg_loss 1.0662
val/box_loss 1.02605
val/cls_loss 2.05508
val/dfl_loss 1.23359
val/seg_loss 1.12731
+View run train at: https://wandb.ai/sustainability-lab/Ultralytics/runs/tvkavy42
View project at: https://wandb.ai/sustainability-lab/Ultralytics
Synced 5 W&B file(s), 0 media file(s), 20 artifact file(s) and 28 other file(s)
+Find logs at: ./wandb/run-20250120_154343-tvkavy42/logs
ultralytics.utils.metrics.SegmentMetrics object with attributes:
+
+ap_class_index: array([0, 1])
+box: ultralytics.utils.metrics.Metric object
+confusion_matrix: <ultralytics.utils.metrics.ConfusionMatrix object at 0x7fe84e12c040>
+curves: ['Precision-Recall(B)', 'F1-Confidence(B)', 'Precision-Confidence(B)', 'Recall-Confidence(B)', 'Precision-Recall(M)', 'F1-Confidence(M)', 'Precision-Confidence(M)', 'Recall-Confidence(M)']
+curves_results: [[array([          0,    0.001001,    0.002002,    0.003003,    0.004004,    0.005005,    0.006006,    0.007007,    0.008008,    0.009009,     0.01001,    0.011011,    0.012012,    0.013013,    0.014014,    0.015015,    0.016016,    0.017017,    0.018018,    0.019019,     0.02002,    0.021021,    0.022022,    0.023023,
+          0.024024,    0.025025,    0.026026,    0.027027,    0.028028,    0.029029,     0.03003,    0.031031,    0.032032,    0.033033,    0.034034,    0.035035,    0.036036,    0.037037,    0.038038,    0.039039,     0.04004,    0.041041,    0.042042,    0.043043,    0.044044,    0.045045,    0.046046,    0.047047,
+          0.048048,    0.049049,     0.05005,    0.051051,    0.052052,    0.053053,    0.054054,    0.055055,    0.056056,    0.057057,    0.058058,    0.059059,     0.06006,    0.061061,    0.062062,    0.063063,    0.064064,    0.065065,    0.066066,    0.067067,    0.068068,    0.069069,     0.07007,    0.071071,
+          0.072072,    0.073073,    0.074074,    0.075075,    0.076076,    0.077077,    0.078078,    0.079079,     0.08008,    0.081081,    0.082082,    0.083083,    0.084084,    0.085085,    0.086086,    0.087087,    0.088088,    0.089089,     0.09009,    0.091091,    0.092092,    0.093093,    0.094094,    0.095095,
+          0.096096,    0.097097,    0.098098,    0.099099,      0.1001,      0.1011,      0.1021,      0.1031,      0.1041,     0.10511,     0.10611,     0.10711,     0.10811,     0.10911,     0.11011,     0.11111,     0.11211,     0.11311,     0.11411,     0.11512,     0.11612,     0.11712,     0.11812,     0.11912,
+           0.12012,     0.12112,     0.12212,     0.12312,     0.12412,     0.12513,     0.12613,     0.12713,     0.12813,     0.12913,     0.13013,     0.13113,     0.13213,     0.13313,     0.13413,     0.13514,     0.13614,     0.13714,     0.13814,     0.13914,     0.14014,     0.14114,     0.14214,     0.14314,
+           0.14414,     0.14515,     0.14615,     0.14715,     0.14815,     0.14915,     0.15015,     0.15115,     0.15215,     0.15315,     0.15415,     0.15516,     0.15616,     0.15716,     0.15816,     0.15916,     0.16016,     0.16116,     0.16216,     0.16316,     0.16416,     0.16517,     0.16617,     0.16717,
+           0.16817,     0.16917,     0.17017,     0.17117,     0.17217,     0.17317,     0.17417,     0.17518,     0.17618,     0.17718,     0.17818,     0.17918,     0.18018,     0.18118,     0.18218,     0.18318,     0.18418,     0.18519,     0.18619,     0.18719,     0.18819,     0.18919,     0.19019,     0.19119,
+           0.19219,     0.19319,     0.19419,      0.1952,      0.1962,      0.1972,      0.1982,      0.1992,      0.2002,      0.2012,      0.2022,      0.2032,      0.2042,     0.20521,     0.20621,     0.20721,     0.20821,     0.20921,     0.21021,     0.21121,     0.21221,     0.21321,     0.21421,     0.21522,
+           0.21622,     0.21722,     0.21822,     0.21922,     0.22022,     0.22122,     0.22222,     0.22322,     0.22422,     0.22523,     0.22623,     0.22723,     0.22823,     0.22923,     0.23023,     0.23123,     0.23223,     0.23323,     0.23423,     0.23524,     0.23624,     0.23724,     0.23824,     0.23924,
+           0.24024,     0.24124,     0.24224,     0.24324,     0.24424,     0.24525,     0.24625,     0.24725,     0.24825,     0.24925,     0.25025,     0.25125,     0.25225,     0.25325,     0.25425,     0.25526,     0.25626,     0.25726,     0.25826,     0.25926,     0.26026,     0.26126,     0.26226,     0.26326,
+           0.26426,     0.26527,     0.26627,     0.26727,     0.26827,     0.26927,     0.27027,     0.27127,     0.27227,     0.27327,     0.27427,     0.27528,     0.27628,     0.27728,     0.27828,     0.27928,     0.28028,     0.28128,     0.28228,     0.28328,     0.28428,     0.28529,     0.28629,     0.28729,
+           0.28829,     0.28929,     0.29029,     0.29129,     0.29229,     0.29329,     0.29429,      0.2953,      0.2963,      0.2973,      0.2983,      0.2993,      0.3003,      0.3013,      0.3023,      0.3033,      0.3043,     0.30531,     0.30631,     0.30731,     0.30831,     0.30931,     0.31031,     0.31131,
+           0.31231,     0.31331,     0.31431,     0.31532,     0.31632,     0.31732,     0.31832,     0.31932,     0.32032,     0.32132,     0.32232,     0.32332,     0.32432,     0.32533,     0.32633,     0.32733,     0.32833,     0.32933,     0.33033,     0.33133,     0.33233,     0.33333,     0.33433,     0.33534,
+           0.33634,     0.33734,     0.33834,     0.33934,     0.34034,     0.34134,     0.34234,     0.34334,     0.34434,     0.34535,     0.34635,     0.34735,     0.34835,     0.34935,     0.35035,     0.35135,     0.35235,     0.35335,     0.35435,     0.35536,     0.35636,     0.35736,     0.35836,     0.35936,
+          0.048048,    0.049049,     0.05005,    0.051051,    0.052052,    0.053053,    0.054054,    0.055055,    0.056056,    0.057057,    0.058058,    0.059059,     0.06006,    0.061061,    0.062062,    0.063063,    0.064064,    0.065065,    0.066066,    0.067067,    0.068068,    0.069069,     0.07007,    0.071071,
+          0.072072,    0.073073,    0.074074,    0.075075,    0.076076,    0.077077,    0.078078,    0.079079,     0.08008,    0.081081,    0.082082,    0.083083,    0.084084,    0.085085,    0.086086,    0.087087,    0.088088,    0.089089,     0.09009,    0.091091,    0.092092,    0.093093,    0.094094,    0.095095,
+          0.096096,    0.097097,    0.098098,    0.099099,      0.1001,      0.1011,      0.1021,      0.1031,      0.1041,     0.10511,     0.10611,     0.10711,     0.10811,     0.10911,     0.11011,     0.11111,     0.11211,     0.11311,     0.11411,     0.11512,     0.11612,     0.11712,     0.11812,     0.11912,
+           0.12012,     0.12112,     0.12212,     0.12312,     0.12412,     0.12513,     0.12613,     0.12713,     0.12813,     0.12913,     0.13013,     0.13113,     0.13213,     0.13313,     0.13413,     0.13514,     0.13614,     0.13714,     0.13814,     0.13914,     0.14014,     0.14114,     0.14214,     0.14314,
+           0.14414,     0.14515,     0.14615,     0.14715,     0.14815,     0.14915,     0.15015,     0.15115,     0.15215,     0.15315,     0.15415,     0.15516,     0.15616,     0.15716,     0.15816,     0.15916,     0.16016,     0.16116,     0.16216,     0.16316,     0.16416,     0.16517,     0.16617,     0.16717,
+           0.16817,     0.16917,     0.17017,     0.17117,     0.17217,     0.17317,     0.17417,     0.17518,     0.17618,     0.17718,     0.17818,     0.17918,     0.18018,     0.18118,     0.18218,     0.18318,     0.18418,     0.18519,     0.18619,     0.18719,     0.18819,     0.18919,     0.19019,     0.19119,
+           0.19219,     0.19319,     0.19419,      0.1952,      0.1962,      0.1972,      0.1982,      0.1992,      0.2002,      0.2012,      0.2022,      0.2032,      0.2042,     0.20521,     0.20621,     0.20721,     0.20821,     0.20921,     0.21021,     0.21121,     0.21221,     0.21321,     0.21421,     0.21522,
+           0.21622,     0.21722,     0.21822,     0.21922,     0.22022,     0.22122,     0.22222,     0.22322,     0.22422,     0.22523,     0.22623,     0.22723,     0.22823,     0.22923,     0.23023,     0.23123,     0.23223,     0.23323,     0.23423,     0.23524,     0.23624,     0.23724,     0.23824,     0.23924,
+           0.24024,     0.24124,     0.24224,     0.24324,     0.24424,     0.24525,     0.24625,     0.24725,     0.24825,     0.24925,     0.25025,     0.25125,     0.25225,     0.25325,     0.25425,     0.25526,     0.25626,     0.25726,     0.25826,     0.25926,     0.26026,     0.26126,     0.26226,     0.26326,
+           0.26426,     0.26527,     0.26627,     0.26727,     0.26827,     0.26927,     0.27027,     0.27127,     0.27227,     0.27327,     0.27427,     0.27528,     0.27628,     0.27728,     0.27828,     0.27928,     0.28028,     0.28128,     0.28228,     0.28328,     0.28428,     0.28529,     0.28629,     0.28729,
+           0.28829,     0.28929,     0.29029,     0.29129,     0.29229,     0.29329,     0.29429,      0.2953,      0.2963,      0.2973,      0.2983,      0.2993,      0.3003,      0.3013,      0.3023,      0.3033,      0.3043,     0.30531,     0.30631,     0.30731,     0.30831,     0.30931,     0.31031,     0.31131,
+           0.31231,     0.31331,     0.31431,     0.31532,     0.31632,     0.31732,     0.31832,     0.31932,     0.32032,     0.32132,     0.32232,     0.32332,     0.32432,     0.32533,     0.32633,     0.32733,     0.32833,     0.32933,     0.33033,     0.33133,     0.33233,     0.33333,     0.33433,     0.33534,
+           0.33634,     0.33734,     0.33834,     0.33934,     0.34034,     0.34134,     0.34234,     0.34334,     0.34434,     0.34535,     0.34635,     0.34735,     0.34835,     0.34935,     0.35035,     0.35135,     0.35235,     0.35335,     0.35435,     0.35536,     0.35636,     0.35736,     0.35836,     0.35936,
+           0.36036,     0.36136,     0.36236,     0.36336,     0.36436,     0.36537,     0.36637,     0.36737,     0.36837,     0.36937,     0.37037,     0.37137,     0.37237,     0.37337,     0.37437,     0.37538,     0.37638,     0.37738,     0.37838,     0.37938,     0.38038,     0.38138,     0.38238,     0.38338,
+           0.38438,     0.38539,     0.38639,     0.38739,     0.38839,     0.38939,     0.39039,     0.39139,     0.39239,     0.39339,     0.39439,      0.3954,      0.3964,      0.3974,      0.3984,      0.3994,      0.4004,      0.4014,      0.4024,      0.4034,      0.4044,     0.40541,     0.40641,     0.40741,
+           0.40841,     0.40941,     0.41041,     0.41141,     0.41241,     0.41341,     0.41441,     0.41542,     0.41642,     0.41742,     0.41842,     0.41942,     0.42042,     0.42142,     0.42242,     0.42342,     0.42442,     0.42543,     0.42643,     0.42743,     0.42843,     0.42943,     0.43043,     0.43143,
+           0.43243,     0.43343,     0.43443,     0.43544,     0.43644,     0.43744,     0.43844,     0.43944,     0.44044,     0.44144,     0.44244,     0.44344,     0.44444,     0.44545,     0.44645,     0.44745,     0.44845,     0.44945,     0.45045,     0.45145,     0.45245,     0.45345,     0.45445,     0.45546,
+           0.45646,     0.45746,     0.45846,     0.45946,     0.46046,     0.46146,     0.46246,     0.46346,     0.46446,     0.46547,     0.46647,     0.46747,     0.46847,     0.46947,     0.47047,     0.47147,     0.47247,     0.47347,     0.47447,     0.47548,     0.47648,     0.47748,     0.47848,     0.47948,
+           0.48048,     0.48148,     0.48248,     0.48348,     0.48448,     0.48549,     0.48649,     0.48749,     0.48849,     0.48949,     0.49049,     0.49149,     0.49249,     0.49349,     0.49449,      0.4955,      0.4965,      0.4975,      0.4985,      0.4995,      0.5005,      0.5015,      0.5025,      0.5035,
+            0.5045,     0.50551,     0.50651,     0.50751,     0.50851,     0.50951,     0.51051,     0.51151,     0.51251,     0.51351,     0.51451,     0.51552,     0.51652,     0.51752,     0.51852,     0.51952,     0.52052,     0.52152,     0.52252,     0.52352,     0.52452,     0.52553,     0.52653,     0.52753,
+           0.52853,     0.52953,     0.53053,     0.53153,     0.53253,     0.53353,     0.53453,     0.53554,     0.53654,     0.53754,     0.53854,     0.53954,     0.54054,     0.54154,     0.54254,     0.54354,     0.54454,     0.54555,     0.54655,     0.54755,     0.54855,     0.54955,     0.55055,     0.55155,
+           0.55255,     0.55355,     0.55455,     0.55556,     0.55656,     0.55756,     0.55856,     0.55956,     0.56056,     0.56156,     0.56256,     0.56356,     0.56456,     0.56557,     0.56657,     0.56757,     0.56857,     0.56957,     0.57057,     0.57157,     0.57257,     0.57357,     0.57457,     0.57558,
+           0.57658,     0.57758,     0.57858,     0.57958,     0.58058,     0.58158,     0.58258,     0.58358,     0.58458,     0.58559,     0.58659,     0.58759,     0.58859,     0.58959,     0.59059,     0.59159,     0.59259,     0.59359,     0.59459,      0.5956,      0.5966,      0.5976,      0.5986,      0.5996,
+            0.6006,      0.6016,      0.6026,      0.6036,      0.6046,     0.60561,     0.60661,     0.60761,     0.60861,     0.60961,     0.61061,     0.61161,     0.61261,     0.61361,     0.61461,     0.61562,     0.61662,     0.61762,     0.61862,     0.61962,     0.62062,     0.62162,     0.62262,     0.62362,
+           0.62462,     0.62563,     0.62663,     0.62763,     0.62863,     0.62963,     0.63063,     0.63163,     0.63263,     0.63363,     0.63463,     0.63564,     0.63664,     0.63764,     0.63864,     0.63964,     0.64064,     0.64164,     0.64264,     0.64364,     0.64464,     0.64565,     0.64665,     0.64765,
+           0.64865,     0.64965,     0.65065,     0.65165,     0.65265,     0.65365,     0.65465,     0.65566,     0.65666,     0.65766,     0.65866,     0.65966,     0.66066,     0.66166,     0.66266,     0.66366,     0.66466,     0.66567,     0.66667,     0.66767,     0.66867,     0.66967,     0.67067,     0.67167,
+           0.67267,     0.67367,     0.67467,     0.67568,     0.67668,     0.67768,     0.67868,     0.67968,     0.68068,     0.68168,     0.68268,     0.68368,     0.68468,     0.68569,     0.68669,     0.68769,     0.68869,     0.68969,     0.69069,     0.69169,     0.69269,     0.69369,     0.69469,      0.6957,
+            0.6967,      0.6977,      0.6987,      0.6997,      0.7007,      0.7017,      0.7027,      0.7037,      0.7047,     0.70571,     0.70671,     0.70771,     0.70871,     0.70971,     0.71071,     0.71171,     0.71271,     0.71371,     0.71471,     0.71572,     0.71672,     0.71772,     0.71872,     0.71972,
+           0.72072,     0.72172,     0.72272,     0.72372,     0.72472,     0.72573,     0.72673,     0.72773,     0.72873,     0.72973,     0.73073,     0.73173,     0.73273,     0.73373,     0.73473,     0.73574,     0.73674,     0.73774,     0.73874,     0.73974,     0.74074,     0.74174,     0.74274,     0.74374,
+           0.74474,     0.74575,     0.74675,     0.74775,     0.74875,     0.74975,     0.75075,     0.75175,     0.75275,     0.75375,     0.75475,     0.75576,     0.75676,     0.75776,     0.75876,     0.75976,     0.76076,     0.76176,     0.76276,     0.76376,     0.76476,     0.76577,     0.76677,     0.76777,
+           0.76877,     0.76977,     0.77077,     0.77177,     0.77277,     0.77377,     0.77477,     0.77578,     0.77678,     0.77778,     0.77878,     0.77978,     0.78078,     0.78178,     0.78278,     0.78378,     0.78478,     0.78579,     0.78679,     0.78779,     0.78879,     0.78979,     0.79079,     0.79179,
+           0.79279,     0.79379,     0.79479,      0.7958,      0.7968,      0.7978,      0.7988,      0.7998,      0.8008,      0.8018,      0.8028,      0.8038,      0.8048,     0.80581,     0.80681,     0.80781,     0.80881,     0.80981,     0.81081,     0.81181,     0.81281,     0.81381,     0.81481,     0.81582,
+           0.81682,     0.81782,     0.81882,     0.81982,     0.82082,     0.82182,     0.82282,     0.82382,     0.82482,     0.82583,     0.82683,     0.82783,     0.82883,     0.82983,     0.83083,     0.83183,     0.83283,     0.83383,     0.83483,     0.83584,     0.83684,     0.83784,     0.83884,     0.83984,
+           0.84084,     0.84184,     0.84284,     0.84384,     0.84484,     0.84585,     0.84685,     0.84785,     0.84885,     0.84985,     0.85085,     0.85185,     0.85285,     0.85385,     0.85485,     0.85586,     0.85686,     0.85786,     0.85886,     0.85986,     0.86086,     0.86186,     0.86286,     0.86386,
+           0.86486,     0.86587,     0.86687,     0.86787,     0.86887,     0.86987,     0.87087,     0.87187,     0.87287,     0.87387,     0.87487,     0.87588,     0.87688,     0.87788,     0.87888,     0.87988,     0.88088,     0.88188,     0.88288,     0.88388,     0.88488,     0.88589,     0.88689,     0.88789,
+           0.88889,     0.88989,     0.89089,     0.89189,     0.89289,     0.89389,     0.89489,      0.8959,      0.8969,      0.8979,      0.8989,      0.8999,      0.9009,      0.9019,      0.9029,      0.9039,      0.9049,     0.90591,     0.90691,     0.90791,     0.90891,     0.90991,     0.91091,     0.91191,
+           0.91291,     0.91391,     0.91491,     0.91592,     0.91692,     0.91792,     0.91892,     0.91992,     0.92092,     0.92192,     0.92292,     0.92392,     0.92492,     0.92593,     0.92693,     0.92793,     0.92893,     0.92993,     0.93093,     0.93193,     0.93293,     0.93393,     0.93493,     0.93594,
+           0.93694,     0.93794,     0.93894,     0.93994,     0.94094,     0.94194,     0.94294,     0.94394,     0.94494,     0.94595,     0.94695,     0.94795,     0.94895,     0.94995,     0.95095,     0.95195,     0.95295,     0.95395,     0.95495,     0.95596,     0.95696,     0.95796,     0.95896,     0.95996,
+           0.96096,     0.96196,     0.96296,     0.96396,     0.96496,     0.96597,     0.96697,     0.96797,     0.96897,     0.96997,     0.97097,     0.97197,     0.97297,     0.97397,     0.97497,     0.97598,     0.97698,     0.97798,     0.97898,     0.97998,     0.98098,     0.98198,     0.98298,     0.98398,
+           0.98498,     0.98599,     0.98699,     0.98799,     0.98899,     0.98999,     0.99099,     0.99199,     0.99299,     0.99399,     0.99499,       0.996,       0.997,       0.998,       0.999,           1]), array([[          1,           1,           1, ...,     0.20167,           0,           0],
+       [          1,           1,           1, ...,      0.1775,     0.12625,           0]], shape=(2, 1000)), 'Confidence', 'Recall'], [array([          0,    0.001001,    0.002002,    0.003003,    0.004004,    0.005005,    0.006006,    0.007007,    0.008008,    0.009009,     0.01001,    0.011011,    0.012012,    0.013013,    0.014014,    0.015015,    0.016016,    0.017017,    0.018018,    0.019019,     0.02002,    0.021021,    0.022022,    0.023023,
+          0.024024,    0.025025,    0.026026,    0.027027,    0.028028,    0.029029,     0.03003,    0.031031,    0.032032,    0.033033,    0.034034,    0.035035,    0.036036,    0.037037,    0.038038,    0.039039,     0.04004,    0.041041,    0.042042,    0.043043,    0.044044,    0.045045,    0.046046,    0.047047,
+          0.048048,    0.049049,     0.05005,    0.051051,    0.052052,    0.053053,    0.054054,    0.055055,    0.056056,    0.057057,    0.058058,    0.059059,     0.06006,    0.061061,    0.062062,    0.063063,    0.064064,    0.065065,    0.066066,    0.067067,    0.068068,    0.069069,     0.07007,    0.071071,
+          0.072072,    0.073073,    0.074074,    0.075075,    0.076076,    0.077077,    0.078078,    0.079079,     0.08008,    0.081081,    0.082082,    0.083083,    0.084084,    0.085085,    0.086086,    0.087087,    0.088088,    0.089089,     0.09009,    0.091091,    0.092092,    0.093093,    0.094094,    0.095095,
+          0.096096,    0.097097,    0.098098,    0.099099,      0.1001,      0.1011,      0.1021,      0.1031,      0.1041,     0.10511,     0.10611,     0.10711,     0.10811,     0.10911,     0.11011,     0.11111,     0.11211,     0.11311,     0.11411,     0.11512,     0.11612,     0.11712,     0.11812,     0.11912,
+           0.12012,     0.12112,     0.12212,     0.12312,     0.12412,     0.12513,     0.12613,     0.12713,     0.12813,     0.12913,     0.13013,     0.13113,     0.13213,     0.13313,     0.13413,     0.13514,     0.13614,     0.13714,     0.13814,     0.13914,     0.14014,     0.14114,     0.14214,     0.14314,
+           0.14414,     0.14515,     0.14615,     0.14715,     0.14815,     0.14915,     0.15015,     0.15115,     0.15215,     0.15315,     0.15415,     0.15516,     0.15616,     0.15716,     0.15816,     0.15916,     0.16016,     0.16116,     0.16216,     0.16316,     0.16416,     0.16517,     0.16617,     0.16717,
+           0.16817,     0.16917,     0.17017,     0.17117,     0.17217,     0.17317,     0.17417,     0.17518,     0.17618,     0.17718,     0.17818,     0.17918,     0.18018,     0.18118,     0.18218,     0.18318,     0.18418,     0.18519,     0.18619,     0.18719,     0.18819,     0.18919,     0.19019,     0.19119,
+           0.19219,     0.19319,     0.19419,      0.1952,      0.1962,      0.1972,      0.1982,      0.1992,      0.2002,      0.2012,      0.2022,      0.2032,      0.2042,     0.20521,     0.20621,     0.20721,     0.20821,     0.20921,     0.21021,     0.21121,     0.21221,     0.21321,     0.21421,     0.21522,
+           0.21622,     0.21722,     0.21822,     0.21922,     0.22022,     0.22122,     0.22222,     0.22322,     0.22422,     0.22523,     0.22623,     0.22723,     0.22823,     0.22923,     0.23023,     0.23123,     0.23223,     0.23323,     0.23423,     0.23524,     0.23624,     0.23724,     0.23824,     0.23924,
+           0.24024,     0.24124,     0.24224,     0.24324,     0.24424,     0.24525,     0.24625,     0.24725,     0.24825,     0.24925,     0.25025,     0.25125,     0.25225,     0.25325,     0.25425,     0.25526,     0.25626,     0.25726,     0.25826,     0.25926,     0.26026,     0.26126,     0.26226,     0.26326,
+           0.26426,     0.26527,     0.26627,     0.26727,     0.26827,     0.26927,     0.27027,     0.27127,     0.27227,     0.27327,     0.27427,     0.27528,     0.27628,     0.27728,     0.27828,     0.27928,     0.28028,     0.28128,     0.28228,     0.28328,     0.28428,     0.28529,     0.28629,     0.28729,
+           0.28829,     0.28929,     0.29029,     0.29129,     0.29229,     0.29329,     0.29429,      0.2953,      0.2963,      0.2973,      0.2983,      0.2993,      0.3003,      0.3013,      0.3023,      0.3033,      0.3043,     0.30531,     0.30631,     0.30731,     0.30831,     0.30931,     0.31031,     0.31131,
+           0.31231,     0.31331,     0.31431,     0.31532,     0.31632,     0.31732,     0.31832,     0.31932,     0.32032,     0.32132,     0.32232,     0.32332,     0.32432,     0.32533,     0.32633,     0.32733,     0.32833,     0.32933,     0.33033,     0.33133,     0.33233,     0.33333,     0.33433,     0.33534,
+           0.33634,     0.33734,     0.33834,     0.33934,     0.34034,     0.34134,     0.34234,     0.34334,     0.34434,     0.34535,     0.34635,     0.34735,     0.34835,     0.34935,     0.35035,     0.35135,     0.35235,     0.35335,     0.35435,     0.35536,     0.35636,     0.35736,     0.35836,     0.35936,
+           0.36036,     0.36136,     0.36236,     0.36336,     0.36436,     0.36537,     0.36637,     0.36737,     0.36837,     0.36937,     0.37037,     0.37137,     0.37237,     0.37337,     0.37437,     0.37538,     0.37638,     0.37738,     0.37838,     0.37938,     0.38038,     0.38138,     0.38238,     0.38338,
+           0.38438,     0.38539,     0.38639,     0.38739,     0.38839,     0.38939,     0.39039,     0.39139,     0.39239,     0.39339,     0.39439,      0.3954,      0.3964,      0.3974,      0.3984,      0.3994,      0.4004,      0.4014,      0.4024,      0.4034,      0.4044,     0.40541,     0.40641,     0.40741,
+           0.40841,     0.40941,     0.41041,     0.41141,     0.41241,     0.41341,     0.41441,     0.41542,     0.41642,     0.41742,     0.41842,     0.41942,     0.42042,     0.42142,     0.42242,     0.42342,     0.42442,     0.42543,     0.42643,     0.42743,     0.42843,     0.42943,     0.43043,     0.43143,
+           0.43243,     0.43343,     0.43443,     0.43544,     0.43644,     0.43744,     0.43844,     0.43944,     0.44044,     0.44144,     0.44244,     0.44344,     0.44444,     0.44545,     0.44645,     0.44745,     0.44845,     0.44945,     0.45045,     0.45145,     0.45245,     0.45345,     0.45445,     0.45546,
+           0.45646,     0.45746,     0.45846,     0.45946,     0.46046,     0.46146,     0.46246,     0.46346,     0.46446,     0.46547,     0.46647,     0.46747,     0.46847,     0.46947,     0.47047,     0.47147,     0.47247,     0.47347,     0.47447,     0.47548,     0.47648,     0.47748,     0.47848,     0.47948,
+           0.48048,     0.48148,     0.48248,     0.48348,     0.48448,     0.48549,     0.48649,     0.48749,     0.48849,     0.48949,     0.49049,     0.49149,     0.49249,     0.49349,     0.49449,      0.4955,      0.4965,      0.4975,      0.4985,      0.4995,      0.5005,      0.5015,      0.5025,      0.5035,
+            0.5045,     0.50551,     0.50651,     0.50751,     0.50851,     0.50951,     0.51051,     0.51151,     0.51251,     0.51351,     0.51451,     0.51552,     0.51652,     0.51752,     0.51852,     0.51952,     0.52052,     0.52152,     0.52252,     0.52352,     0.52452,     0.52553,     0.52653,     0.52753,
+           0.52853,     0.52953,     0.53053,     0.53153,     0.53253,     0.53353,     0.53453,     0.53554,     0.53654,     0.53754,     0.53854,     0.53954,     0.54054,     0.54154,     0.54254,     0.54354,     0.54454,     0.54555,     0.54655,     0.54755,     0.54855,     0.54955,     0.55055,     0.55155,
+           0.55255,     0.55355,     0.55455,     0.55556,     0.55656,     0.55756,     0.55856,     0.55956,     0.56056,     0.56156,     0.56256,     0.56356,     0.56456,     0.56557,     0.56657,     0.56757,     0.56857,     0.56957,     0.57057,     0.57157,     0.57257,     0.57357,     0.57457,     0.57558,
+           0.57658,     0.57758,     0.57858,     0.57958,     0.58058,     0.58158,     0.58258,     0.58358,     0.58458,     0.58559,     0.58659,     0.58759,     0.58859,     0.58959,     0.59059,     0.59159,     0.59259,     0.59359,     0.59459,      0.5956,      0.5966,      0.5976,      0.5986,      0.5996,
+            0.6006,      0.6016,      0.6026,      0.6036,      0.6046,     0.60561,     0.60661,     0.60761,     0.60861,     0.60961,     0.61061,     0.61161,     0.61261,     0.61361,     0.61461,     0.61562,     0.61662,     0.61762,     0.61862,     0.61962,     0.62062,     0.62162,     0.62262,     0.62362,
+           0.62462,     0.62563,     0.62663,     0.62763,     0.62863,     0.62963,     0.63063,     0.63163,     0.63263,     0.63363,     0.63463,     0.63564,     0.63664,     0.63764,     0.63864,     0.63964,     0.64064,     0.64164,     0.64264,     0.64364,     0.64464,     0.64565,     0.64665,     0.64765,
+           0.64865,     0.64965,     0.65065,     0.65165,     0.65265,     0.65365,     0.65465,     0.65566,     0.65666,     0.65766,     0.65866,     0.65966,     0.66066,     0.66166,     0.66266,     0.66366,     0.66466,     0.66567,     0.66667,     0.66767,     0.66867,     0.66967,     0.67067,     0.67167,
+           0.67267,     0.67367,     0.67467,     0.67568,     0.67668,     0.67768,     0.67868,     0.67968,     0.68068,     0.68168,     0.68268,     0.68368,     0.68468,     0.68569,     0.68669,     0.68769,     0.68869,     0.68969,     0.69069,     0.69169,     0.69269,     0.69369,     0.69469,      0.6957,
+            0.6967,      0.6977,      0.6987,      0.6997,      0.7007,      0.7017,      0.7027,      0.7037,      0.7047,     0.70571,     0.70671,     0.70771,     0.70871,     0.70971,     0.71071,     0.71171,     0.71271,     0.71371,     0.71471,     0.71572,     0.71672,     0.71772,     0.71872,     0.71972,
+           0.72072,     0.72172,     0.72272,     0.72372,     0.72472,     0.72573,     0.72673,     0.72773,     0.72873,     0.72973,     0.73073,     0.73173,     0.73273,     0.73373,     0.73473,     0.73574,     0.73674,     0.73774,     0.73874,     0.73974,     0.74074,     0.74174,     0.74274,     0.74374,
+           0.74474,     0.74575,     0.74675,     0.74775,     0.74875,     0.74975,     0.75075,     0.75175,     0.75275,     0.75375,     0.75475,     0.75576,     0.75676,     0.75776,     0.75876,     0.75976,     0.76076,     0.76176,     0.76276,     0.76376,     0.76476,     0.76577,     0.76677,     0.76777,
+           0.76877,     0.76977,     0.77077,     0.77177,     0.77277,     0.77377,     0.77477,     0.77578,     0.77678,     0.77778,     0.77878,     0.77978,     0.78078,     0.78178,     0.78278,     0.78378,     0.78478,     0.78579,     0.78679,     0.78779,     0.78879,     0.78979,     0.79079,     0.79179,
+           0.79279,     0.79379,     0.79479,      0.7958,      0.7968,      0.7978,      0.7988,      0.7998,      0.8008,      0.8018,      0.8028,      0.8038,      0.8048,     0.80581,     0.80681,     0.80781,     0.80881,     0.80981,     0.81081,     0.81181,     0.81281,     0.81381,     0.81481,     0.81582,
+           0.81682,     0.81782,     0.81882,     0.81982,     0.82082,     0.82182,     0.82282,     0.82382,     0.82482,     0.82583,     0.82683,     0.82783,     0.82883,     0.82983,     0.83083,     0.83183,     0.83283,     0.83383,     0.83483,     0.83584,     0.83684,     0.83784,     0.83884,     0.83984,
+           0.84084,     0.84184,     0.84284,     0.84384,     0.84484,     0.84585,     0.84685,     0.84785,     0.84885,     0.84985,     0.85085,     0.85185,     0.85285,     0.85385,     0.85485,     0.85586,     0.85686,     0.85786,     0.85886,     0.85986,     0.86086,     0.86186,     0.86286,     0.86386,
+           0.86486,     0.86587,     0.86687,     0.86787,     0.86887,     0.86987,     0.87087,     0.87187,     0.87287,     0.87387,     0.87487,     0.87588,     0.87688,     0.87788,     0.87888,     0.87988,     0.88088,     0.88188,     0.88288,     0.88388,     0.88488,     0.88589,     0.88689,     0.88789,
+           0.88889,     0.88989,     0.89089,     0.89189,     0.89289,     0.89389,     0.89489,      0.8959,      0.8969,      0.8979,      0.8989,      0.8999,      0.9009,      0.9019,      0.9029,      0.9039,      0.9049,     0.90591,     0.90691,     0.90791,     0.90891,     0.90991,     0.91091,     0.91191,
+           0.91291,     0.91391,     0.91491,     0.91592,     0.91692,     0.91792,     0.91892,     0.91992,     0.92092,     0.92192,     0.92292,     0.92392,     0.92492,     0.92593,     0.92693,     0.92793,     0.92893,     0.92993,     0.93093,     0.93193,     0.93293,     0.93393,     0.93493,     0.93594,
+           0.93694,     0.93794,     0.93894,     0.93994,     0.94094,     0.94194,     0.94294,     0.94394,     0.94494,     0.94595,     0.94695,     0.94795,     0.94895,     0.94995,     0.95095,     0.95195,     0.95295,     0.95395,     0.95495,     0.95596,     0.95696,     0.95796,     0.95896,     0.95996,
+           0.96096,     0.96196,     0.96296,     0.96396,     0.96496,     0.96597,     0.96697,     0.96797,     0.96897,     0.96997,     0.97097,     0.97197,     0.97297,     0.97397,     0.97497,     0.97598,     0.97698,     0.97798,     0.97898,     0.97998,     0.98098,     0.98198,     0.98298,     0.98398,
+           0.98498,     0.98599,     0.98699,     0.98799,     0.98899,     0.98999,     0.99099,     0.99199,     0.99299,     0.99399,     0.99499,       0.996,       0.997,       0.998,       0.999,           1]), array([[          1,           1,           1, ...,         0.5,         0.5,           0],
+       [          1,           1,           1, ...,           1,           1,           0]], shape=(2, 1000)), 'Recall', 'Precision'], [array([          0,    0.001001,    0.002002,    0.003003,    0.004004,    0.005005,    0.006006,    0.007007,    0.008008,    0.009009,     0.01001,    0.011011,    0.012012,    0.013013,    0.014014,    0.015015,    0.016016,    0.017017,    0.018018,    0.019019,     0.02002,    0.021021,    0.022022,    0.023023,
+          0.024024,    0.025025,    0.026026,    0.027027,    0.028028,    0.029029,     0.03003,    0.031031,    0.032032,    0.033033,    0.034034,    0.035035,    0.036036,    0.037037,    0.038038,    0.039039,     0.04004,    0.041041,    0.042042,    0.043043,    0.044044,    0.045045,    0.046046,    0.047047,
+          0.048048,    0.049049,     0.05005,    0.051051,    0.052052,    0.053053,    0.054054,    0.055055,    0.056056,    0.057057,    0.058058,    0.059059,     0.06006,    0.061061,    0.062062,    0.063063,    0.064064,    0.065065,    0.066066,    0.067067,    0.068068,    0.069069,     0.07007,    0.071071,
+          0.072072,    0.073073,    0.074074,    0.075075,    0.076076,    0.077077,    0.078078,    0.079079,     0.08008,    0.081081,    0.082082,    0.083083,    0.084084,    0.085085,    0.086086,    0.087087,    0.088088,    0.089089,     0.09009,    0.091091,    0.092092,    0.093093,    0.094094,    0.095095,
+[Plot data omitted: raw NumPy arrays backing the F1-Confidence and Precision-Confidence curves (1,000 confidence thresholds from 0 to 1, with per-class metric values; x-axis: Confidence, y-axes: F1 and Precision).]
+           0.81682,     0.81782,     0.81882,     0.81982,     0.82082,     0.82182,     0.82282,     0.82382,     0.82482,     0.82583,     0.82683,     0.82783,     0.82883,     0.82983,     0.83083,     0.83183,     0.83283,     0.83383,     0.83483,     0.83584,     0.83684,     0.83784,     0.83884,     0.83984,
+           0.84084,     0.84184,     0.84284,     0.84384,     0.84484,     0.84585,     0.84685,     0.84785,     0.84885,     0.84985,     0.85085,     0.85185,     0.85285,     0.85385,     0.85485,     0.85586,     0.85686,     0.85786,     0.85886,     0.85986,     0.86086,     0.86186,     0.86286,     0.86386,
+           0.86486,     0.86587,     0.86687,     0.86787,     0.86887,     0.86987,     0.87087,     0.87187,     0.87287,     0.87387,     0.87487,     0.87588,     0.87688,     0.87788,     0.87888,     0.87988,     0.88088,     0.88188,     0.88288,     0.88388,     0.88488,     0.88589,     0.88689,     0.88789,
+           0.88889,     0.88989,     0.89089,     0.89189,     0.89289,     0.89389,     0.89489,      0.8959,      0.8969,      0.8979,      0.8989,      0.8999,      0.9009,      0.9019,      0.9029,      0.9039,      0.9049,     0.90591,     0.90691,     0.90791,     0.90891,     0.90991,     0.91091,     0.91191,
+           0.91291,     0.91391,     0.91491,     0.91592,     0.91692,     0.91792,     0.91892,     0.91992,     0.92092,     0.92192,     0.92292,     0.92392,     0.92492,     0.92593,     0.92693,     0.92793,     0.92893,     0.92993,     0.93093,     0.93193,     0.93293,     0.93393,     0.93493,     0.93594,
+           0.93694,     0.93794,     0.93894,     0.93994,     0.94094,     0.94194,     0.94294,     0.94394,     0.94494,     0.94595,     0.94695,     0.94795,     0.94895,     0.94995,     0.95095,     0.95195,     0.95295,     0.95395,     0.95495,     0.95596,     0.95696,     0.95796,     0.95896,     0.95996,
+           0.96096,     0.96196,     0.96296,     0.96396,     0.96496,     0.96597,     0.96697,     0.96797,     0.96897,     0.96997,     0.97097,     0.97197,     0.97297,     0.97397,     0.97497,     0.97598,     0.97698,     0.97798,     0.97898,     0.97998,     0.98098,     0.98198,     0.98298,     0.98398,
+           0.98498,     0.98599,     0.98699,     0.98799,     0.98899,     0.98999,     0.99099,     0.99199,     0.99299,     0.99399,     0.99499,       0.996,       0.997,       0.998,       0.999,           1]), array([[          1,           1,           1, ...,     0.20167,           0,           0],
+       [          1,           1,           1, ...,      0.1775,     0.12625,           0]], shape=(2, 1000)), 'Confidence', 'Recall']]
+fitness: np.float64(1.4455970167060048)
+keys: ['metrics/precision(B)', 'metrics/recall(B)', 'metrics/mAP50(B)', 'metrics/mAP50-95(B)', 'metrics/precision(M)', 'metrics/recall(M)', 'metrics/mAP50(M)', 'metrics/mAP50-95(M)']
+maps: array([     1.1864,      1.6182])
+names: {0: 'inner_box', 1: 'outer_box'}
+plot: True
+results_dict: {'metrics/precision(B)': np.float64(0.8108231337398004), 'metrics/recall(B)': np.float64(0.7761444955593891), 'metrics/mAP50(B)': np.float64(0.9092307692307693), 'metrics/mAP50-95(B)': np.float64(0.6753775662745483), 'metrics/precision(M)': np.float64(0.7969436046359123), 'metrics/recall(M)': np.float64(0.8792737357610776), 'metrics/mAP50(M)': np.float64(0.9257692307692309), 'metrics/mAP50-95(M)': np.float64(0.726952452287679), 'fitness': np.float64(1.4455970167060048)}
+save_dir: PosixPath('runs/segment/train')
+seg: ultralytics.utils.metrics.Metric object
+speed: {'preprocess': 0.447845458984375, 'inference': 6.885099411010742, 'loss': 0.0012874603271484375, 'postprocess': 3.5344362258911133}
+task: 'segment'
+
+
+
+
+

Predict

+
+
import numpy as np
+import supervision as sv
+pred_model = YOLO("/home/patel_zeel/blog/lab/runs/segment/train/weights/best.pt")
+
+
+
import os
+files = glob("/home/patel_zeel/kiln_compass_24/regions/high_res/19/*.png")
+# np.random.seed(1)
+random_file = np.random.choice(files)
+base_name = os.path.basename(random_file)
+if base_name in [os.path.basename(file) for file in glob("../lab/trench_width/images/*.png")]:
+    print("Part of the training dataset")
+
+result = pred_model(random_file, imgsz=1280, verbose=False)[0]
+detection = sv.Detections.from_ultralytics(result)
+
+img = Image.open(random_file)
+box_annotator = sv.MaskAnnotator()
+label_annotator = sv.LabelAnnotator()
+annotated_image = box_annotator.annotate(img.copy(), detection)
+annotated_image = label_annotator.annotate(annotated_image, detection)
+display(annotated_image)
+
+
+
+

+
+
+
+
+ + +
+ +
+ +
+ + + + + \ No newline at end of file diff --git a/lab/scratchpad_files/figure-html/cell-15-output-1.png b/lab/scratchpad_files/figure-html/cell-15-output-1.png new file mode 100644 index 0000000..ae7ddd1 Binary files /dev/null and b/lab/scratchpad_files/figure-html/cell-15-output-1.png differ diff --git a/lab/scratchpad_files/figure-html/cell-2-output-2.png b/lab/scratchpad_files/figure-html/cell-2-output-2.png new file mode 100644 index 0000000..d26bdb0 Binary files /dev/null and b/lab/scratchpad_files/figure-html/cell-2-output-2.png differ diff --git a/lab/scratchpad_files/figure-html/cell-2-output-4.png b/lab/scratchpad_files/figure-html/cell-2-output-4.png new file mode 100644 index 0000000..419adad Binary files /dev/null and b/lab/scratchpad_files/figure-html/cell-2-output-4.png differ diff --git a/lab/scratchpad_files/figure-html/cell-21-output-1.png b/lab/scratchpad_files/figure-html/cell-21-output-1.png new file mode 100644 index 0000000..e09a288 Binary files /dev/null and b/lab/scratchpad_files/figure-html/cell-21-output-1.png differ diff --git a/lab/scratchpad_files/figure-html/cell-25-output-1.png b/lab/scratchpad_files/figure-html/cell-25-output-1.png new file mode 100644 index 0000000..dd37dd7 Binary files /dev/null and b/lab/scratchpad_files/figure-html/cell-25-output-1.png differ diff --git a/listings.json b/listings.json index a59230e..8df3fc0 100644 --- a/listings.json +++ b/listings.json @@ -2,29 +2,23 @@ { "listing": "/index.html", "items": [ - "/posts/2024-12-27-download_caaqm_locations.html", + "/posts/2024-12-29-object-detection-how-to.html", + "/posts/2024-12-27-download_caaqm_locations copy.html", "/posts/2024-12-10-cpcb-download.html", - "/posts/foundation-models-for-time-series.html", - "/posts/GPT-from-scratch.html", "/posts/fundamentals_across_domains.html", - "/posts/seq_to_seq.html", - "/posts/wrf-tutorial.html", - "/posts/learnings_from_brick_kiln_project.html", - "/posts/Torch-DataLoaders.html", - "/posts/AL_with_MNIST.html", + 
"/posts/2025-02-10-object-detection-random-baseline.html", "/posts/non-gaussian-likelihood-mlps.html", "/posts/pruning_vs_uncertainty.html", - "/posts/air-quality-google-.html", "/posts/bayesian-gaussian-basis-regression.html", "/posts/numpy-algebra- copy.html", "/posts/Rank1_GPs.html", "/posts/PurpleAir.html", - "/posts/Multiclass_GP_classification.html", "/posts/climate-modeling-with-SpecialGP.html", + "/posts/Multiclass_GP_classification.html", "/posts/climate-modeling-with-siren.html", "/posts/GNNs_and_GPs.html", - "/posts/Basis_functions.html", "/posts/GNN_for_regression.html", + "/posts/Basis_functions.html", "/posts/CNPs_for_Images.html", "/posts/ssh-macos.html", "/posts/2023-04-29-sine-combination-netowrks.html", diff --git a/posts/2020-03-28-active_learning_with_bayesian_linear_regression.html b/posts/2020-03-28-active_learning_with_bayesian_linear_regression.html index 7c42220..b3dabd4 100644 --- a/posts/2020-03-28-active_learning_with_bayesian_linear_regression.html +++ b/posts/2020-03-28-active_learning_with_bayesian_linear_regression.html @@ -2,7 +2,7 @@ - + @@ -72,10 +72,10 @@ - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + + + + + + + + + + + + + + + + + + + + + + + + + + + + +
+
+ +
+ +
+
+
+

Object Detection - A how-to guide

+
+
+ Basic operations in object detection task +
+
+
+
ML
+
CV
+
+
+
+ + +
+ +
+
Author
+
+

Zeel B Patel

+
+
+ +
+
Published
+
+

December 29, 2024

+
+
+ + +
+ + +
+ + + + +
+ + + + + +
+

Imports

+
+
# Config
+import os
+
+# Basic
+import numpy as np
+import pandas as pd
+from time import time
+import matplotlib.pyplot as plt
+
+# Monitoring
+from tqdm.notebook import tqdm
+
+# IO
+from os.path import join, exists, basename, dirname, splitext, expanduser
+from glob import glob
+
+# Parallel processing
+from joblib import Parallel, delayed
+
+import yaml
+from PIL import Image
+import supervision as sv
+import cv2
+from supervision.utils.file import list_files_with_extensions, read_txt_file
+from supervision.detection.utils import polygon_to_xyxy
+from ultralytics import YOLO
+from ultralytics.utils.downloads import download
+from pathlib import Path
+from inference.models.utils import get_roboflow_model
+from roboflow import Roboflow
+from typing import List, Tuple
+from dotenv import load_dotenv
+load_dotenv()
+
+%reload_ext memory_profiler
+
+sv.__version__
+
+
'0.25.1'
+
+
+
+
+

Axis-Aligned Bounding Boxes (AABB)

+
+

Dataset

+
+

Download

+
+
rf = Roboflow(api_key=os.getenv("ROBOFLOW_API_KEY"))
+ws = rf.workspace("plan-zkend")
+project = ws.project("animals-ksxhf-plgrl")
+version = project.version(2)
+rf_dataset = version.download("yolov8", location="/tmp/tmp", overwrite=True)
+
+
loading Roboflow workspace...
+loading Roboflow project...
+
+
+
Downloading Dataset Version Zip in /tmp/tmp to yolov8:: 100%|██████████| 3047/3047 [00:02<00:00, 1202.83it/s]
+
+
+
+
+
+

+Extracting Dataset Version Zip to /tmp/tmp in yolov8:: 100%|██████████| 212/212 [00:00<00:00, 5963.05it/s]
+
+
+
+
rf_dataset.location
+
+
'/tmp/tmp'
+
+
+
+
!ls -lh {rf_dataset.location}
+
+
total 24K
+-rw-rw-r-- 1 patel_zeel patel_zeel  423 Feb  3 11:40 data.yaml
+-rw-rw-r-- 1 patel_zeel patel_zeel  141 Feb  3 11:40 README.dataset.txt
+-rw-rw-r-- 1 patel_zeel patel_zeel  896 Feb  3 11:40 README.roboflow.txt
+drwxrwxr-x 4 patel_zeel patel_zeel 4.0K Jan 17 18:47 test
+drwxrwxr-x 4 patel_zeel patel_zeel 4.0K Jan 17 18:47 train
+drwxrwxr-x 4 patel_zeel patel_zeel 4.0K Jan 17 22:52 valid
+
+
+
+
!ls -lh {rf_dataset.location}/test
+
+
total 8.0K
+drwxrwxr-x 2 patel_zeel patel_zeel 4.0K Jan 17 18:47 images
+drwxrwxr-x 2 patel_zeel patel_zeel 4.0K Jan 17 18:47 labels
+
+
+
+
!ls -l {rf_dataset.location}/test/images/*.jpg | wc -l
+
+
7
+
+
+
+
!ls -l {rf_dataset.location}/test/labels/*.txt | wc -l
+
+
7
+
+
+
+

Check a sample manually

+
+
image_paths = glob(f"{rf_dataset.location}/test/images/*.jpg")
+sample_image_path = image_paths[0]
+sample_image = Image.open(sample_image_path)
+sample_image
+
+
+
+

+
+
+
+
+
+
label_paths = glob(f"{rf_dataset.location}/test/labels/*.txt")
+sample_label_path = label_paths[0]
+sample_label = np.loadtxt(sample_label_path, ndmin=2)
+sample_label.shape
+
+
(1, 5)
+
+
+
+
sample_label
+
+
array([[         18,     0.67217,     0.47797,     0.53625,     0.51148]])
+
+
+
+
+
+

Load with supervision

+
+
%%time
+
+dataset = sv.DetectionDataset.from_yolo(images_directory_path=f"{rf_dataset.location}/test/images", annotations_directory_path=f"{rf_dataset.location}/test/labels", data_yaml_path=f"{rf_dataset.location}/data.yaml")
+len(dataset)
+
+
CPU times: user 11.6 ms, sys: 0 ns, total: 11.6 ms
+Wall time: 10.6 ms
+
+
+
7
+
+
+
+
+

Visualize

+

Ideally, LabelAnnotator should show the class names on top of the bounding boxes, but it currently shows class IDs. This issue is tracked here.

+
+
image_path, image, detection = dataset[0]
+
+box_annotator = sv.BoxAnnotator()
+label_annotator = sv.LabelAnnotator()
+annotated_frame = box_annotator.annotate(image.copy(), detection)
+annotated_frame = label_annotator.annotate(annotated_frame.copy(), detection)
+
+Image.fromarray(annotated_frame)
+
+
+
+

+
+
+
+
+

A quick fix for now.

+
+
image_path, image, detection = dataset[0]
+np_classes = np.array(dataset.classes)
+detection.data['class_name'] = np_classes[detection.class_id]
+box_annotator = sv.BoxAnnotator()
+label_annotator = sv.LabelAnnotator()
+annotated_frame = box_annotator.annotate(image.copy(), detection)
+annotated_frame = label_annotator.annotate(annotated_frame.copy(), detection)
+
+Image.fromarray(annotated_frame)
+
+
+
+

+
+
+
+
+
+
+
+

Inference

+
+

With roboflow models

+
+
rf_model = get_roboflow_model("yolov8s-640")
+prediction = rf_model.infer(image)[0]
+detection = sv.Detections.from_inference(prediction)
+annotated_image = box_annotator.annotate(image.copy(), detection)
+annotated_image = label_annotator.annotate(annotated_image, detection)
+Image.fromarray(annotated_image)
+
+
Specified provider 'OpenVINOExecutionProvider' is not in available provider names.Available providers: 'TensorrtExecutionProvider, CUDAExecutionProvider, CPUExecutionProvider'
+Specified provider 'CoreMLExecutionProvider' is not in available provider names.Available providers: 'TensorrtExecutionProvider, CUDAExecutionProvider, CPUExecutionProvider'
+2025-02-03 11:42:55.119414030 [E:onnxruntime:Default, provider_bridge_ort.cc:1862 TryGetProviderInfo_CUDA] /onnxruntime_src/onnxruntime/core/session/provider_bridge_ort.cc:1539 onnxruntime::Provider& onnxruntime::ProviderLibrary::Get() [ONNXRuntimeError] : 1 : FAIL : Failed to load library libonnxruntime_providers_cuda.so with error: libcudnn_adv.so.9: cannot open shared object file: No such file or directory
+
+2025-02-03 11:42:55.119453180 [W:onnxruntime:Default, onnxruntime_pybind_state.cc:993 CreateExecutionProviderInstance] Failed to create CUDAExecutionProvider. Require cuDNN 9.* and CUDA 12.*. Please install all dependencies as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.
+
+
+
+
+

+
+
+
+
+
+
+

With ultralytics models

+
+
model = YOLO("yolov8s")
+prediction = model(image)[0]
+detection = sv.Detections.from_ultralytics(prediction)
+annotated_image = box_annotator.annotate(image.copy(), detection)
+annotated_image = label_annotator.annotate(annotated_image, detection)
+Image.fromarray(annotated_image)
+
+

+0: 640x640 1 dog, 2 horses, 1 sheep, 1 cow, 6.3ms
+Speed: 10.2ms preprocess, 6.3ms inference, 274.3ms postprocess per image at shape (1, 3, 640, 640)
+
+
+
+
+

+
+
+
+
+
+
+
+

Metrics

+
+
model = YOLO("yolov8x")
+targets = []
+predictions = []
+np_classes = np.array(dataset.classes)
+for image_path, image, target in tqdm(dataset):
+    # add class names to detection
+    # target.data['class_name'] = np_classes[target.class_id]
+
+    # remove classes not in model
+    # target = target[np.isin(target['class_name'], list(model.names.values()))]
+    # if len(target) == 0:
+    #     print(f"Skipping {image_path} as it has no classes in model")
+    #     continue
+    
+    prediction = model(image, verbose=False)[0]
+    detection = sv.Detections.from_ultralytics(prediction)
+    
+    # remove classes not in dataset
+    detection = detection[np.isin(detection['class_name'], dataset.classes)]
+    
+    # remap class ids
+    detection.class_id = np.array([dataset.classes.index(class_name) for class_name in detection['class_name']])
+    
+    targets.append(target)
+    predictions.append(detection)
+
+ +
+
+
+
mAP = sv.metrics.MeanAveragePrecision().update(predictions, targets).compute()
+mAP50 = mAP.mAP_scores[0]
+mAP5095 = mAP.mAP_scores.mean()
+print(f"mAP50: {mAP50:.2f}, mAP50-95: {mAP5095:.2f}")
+
+
mAP50: 0.29, mAP50-95: 0.19
+
+
+
+
+
+

Oriented Bounding Boxes (OBB)

+
+

Dataset

+
+

Download

+
+
if not exists('/tmp/DOTAv1.zip'):
+    # Downloaded in 4m 5s with 6.09 MB/s
+    !wget https://github.com/ultralytics/assets/releases/download/v0.0.0/DOTAv1.zip -O /tmp/DOTAv1.zip
+else:
+    print('DOTAv1.zip already downloaded')
+    
+if not exists('/tmp/DOTAv1'):
+    !unzip /tmp/DOTAv1.zip -d /tmp
+else:
+    print('DOTAv1 already unzipped')
+
+
DOTAv1.zip already downloaded
+DOTAv1 already unzipped
+
+
+
+
!ls /tmp/DOTAv1
+
+
+
+
+
images  labels
+
+
+
+
!ls /tmp/DOTAv1/images
+
+
+
+
+
test  train  val
+
+
+
+
print(f"Number of train samples: {len(glob('/tmp/DOTAv1/images/train/*.jpg'))}")
+print(f"Number of val samples: {len(glob('/tmp/DOTAv1/images/val/*.jpg'))}")
+
+
Number of train samples: 1411
+Number of val samples: 458
+
+
+

Keep only 100 samples from each split in a smaller copy of the dataset

+
+
paths = {'images': {}, 'labels': {}}
+
+for split in ['train', 'val']:
+    paths['images'][split] = glob(f"/tmp/DOTAv1/images/{split}/*.jpg")[:100]
+    paths['labels'][split] = [p.replace("images", "labels").replace(".jpg", ".txt") for p in paths['images'][split]]
+    !mkdir -p /tmp/DOTAv1_small/images/{split}
+    !mkdir -p /tmp/DOTAv1_small/labels/{split}
+
+    for img, label in tqdm(zip(paths['images'][split], paths['labels'][split])):
+        os.system(f"cp {img} /tmp/DOTAv1_small/images/{split}/")
+        os.system(f"cp {label} /tmp/DOTAv1_small/labels/{split}/")
+
+
+
+
+ +
+
+
+
+
+ +
+
+
+
print(f"Number of train samples: {len(glob('/tmp/DOTAv1_small/images/train/*.jpg'))}")
+print(f"Number of val samples: {len(glob('/tmp/DOTAv1_small/images/val/*.jpg'))}")
+
+
Number of train samples: 100
+Number of val samples: 100
+
+
+
+
+

Check a sample

+
+
train_images = glob("/tmp/DOTAv1_small/images/train/*.jpg")
+sample_image_path = train_images[1]
+sample_image_path
+
+
'/tmp/DOTAv1_small/images/train/P2732.jpg'
+
+
+
+
sample_image = Image.open(sample_image_path)
+sample_image.size
+
+
(3087, 2632)
+
+
+
+
sample_image.reduce(10)
+
+
+
+

+
+
+
+
+
+
sample_label_path = sample_image_path.replace("images", "labels").replace(".jpg", ".txt")
+assert exists(sample_label_path), f"Error: {sample_label_path} does not exist"
+
+
+
sample_label = np.loadtxt(sample_label_path, ndmin=2)
+sample_label.shape
+
+
(51, 9)
+
+
+
+
sample_label[0]
+
+
array([          7,     0.72076,     0.45023,     0.72238,     0.44871,     0.73178,     0.46087,     0.73016,     0.46201])
+
+
+

The above is the YOLO Oriented Bounding Box (OBB) format: class_id, x1, y1, x2, y2, x3, y3, x4, y4, where the four corner coordinates are normalized to the image width and height.
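To make the format concrete, here is a minimal sketch that decodes one such label line into pixel-space corners. The helper name is mine, not part of any library; the example reuses the first row and image size printed above.

```python
import numpy as np

def decode_yolo_obb(line: str, image_wh: tuple) -> tuple:
    """Decode one YOLO-OBB label line into (class_id, 4x2 array of pixel corners)."""
    values = np.array(line.split(), dtype=float)
    class_id = int(values[0])
    # The eight coordinates are corner (x, y) pairs normalized to [0, 1];
    # scale x by image width and y by image height to get pixels.
    corners = values[1:].reshape(4, 2) * np.array(image_wh)
    return class_id, corners

# First row of the sample label above, with the sample image size (3087, 2632).
class_id, corners = decode_yolo_obb(
    "7 0.72076 0.45023 0.72238 0.44871 0.73178 0.46087 0.73016 0.46201",
    (3087, 2632),
)
print(class_id, corners.shape)  # 7 (4, 2)
```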

+
+
+

Load with supervision

+

supervision does not support the DOTA dataset directly yet, but ultralytics has already converted it to YOLO format. Let's create a data.yml file for the DOTA dataset.

+
+
%%writefile /tmp/DOTAv1_small/data.yml
+train: /tmp/DOTAv1_small/images/train
+val: /tmp/DOTAv1_small/images/val
+test: /tmp/DOTAv1_small/images/test
+nc: 15
+names: ['plane', 'ship', 'storage tank', 'baseball diamond', 'tennis court', 'basketball court', 'ground track field', 'harbor', 'bridge', 'large vehicle', 'small vehicle', 'helicopter', 'roundabout', 'soccer ball field', 'swimming pool']
+
+
Overwriting /tmp/DOTAv1_small/data.yml
+
+
+
+
# %%memit
+
+dataset = sv.DetectionDataset.from_yolo(
+    "/tmp/DOTAv1_small/images/train",
+    "/tmp/DOTAv1_small/labels/train",
+    data_yaml_path="/tmp/DOTAv1_small/data.yml",
+    is_obb=True,
+)
+
+
+
+

Visualize

+
+
sample = dataset[0]
+img_array = sample[1]
+img_detections = sample[2]
+
+annotator = sv.OrientedBoxAnnotator()
+annotated_img = annotator.annotate(img_array, img_detections)
+
+plt.imshow(annotated_img)
+plt.ylim(0, annotated_img.shape[1] // 2)
+plt.axis("off")
+
+
+
+

+
+
+
+
+
+
+
+

Inference

+
    +
  • iou = Non-max suppression IOU threshold
  • +
  • conf = Object confidence threshold
  • +
+
+

Inline method

+
+
model = YOLO("yolo11x-obb")
+
+detections = []
+predictions = []
+for img_path, img, detection in tqdm(dataset):
+    prediction = model(img, imgsz=1024, iou=0.33, max_det=300, conf=0.001, verbose=False)[0]
+    predictions.append(sv.Detections.from_ultralytics(prediction))
+    detections.append(detection)
+
+ +
+
+
+
+

CLI method

+
+
!cd /tmp && yolo obb predict model=yolo11x-obb source=/tmp/DOTAv1_small/images/val exist_ok=True save=False save_txt=True imgsz=1024 iou=0.33 max_det=300 conf=0.001 verbose=False
+
+
Ultralytics 8.3.55 🚀 Python-3.10.15 torch-2.5.0+cu124 CUDA:0 (NVIDIA A100-SXM4-80GB, 81156MiB)
+YOLO11x-obb summary (fused): 483 layers, 58,752,928 parameters, 0 gradients, 202.8 GFLOPs
+Results saved to runs/obb/predict
+20 labels saved to runs/obb/predict/labels
+💡 Learn more at https://docs.ultralytics.com/modes/predict
+
+
+
+
+
+

Metrics

+
+

Inline method

+
+

Confusion matrix

+
+

At the time of writing this post, supervision’s ConfusionMatrix does not support OBB. Follow this issue for updates.

+
+
    +
  • conf_threshold – minimum confidence threshold for a detection to be considered. Instances with confidence below this threshold are ignored as if they were not predicted.
  • +
  • iou_threshold – minimum intersection over union (IoU) threshold for a detection to be considered a true positive. Predictions with IoU below this threshold are considered false positives.
  • +
+
+
cm = sv.ConfusionMatrix.from_detections(
+    predictions, detections, classes=dataset.classes, conf_threshold=0.25, iou_threshold=0.33
+)
+_ = cm.plot()
+
+
+
+

+
+
+
+
+
+
+

Precision, Recall & F1 Score

+

You know the formulas of Precision and Recall.

+
Precision = TP / (TP + FP)
+Recall = TP / (TP + FN)
+

We can also write them as the following:

+
Precision = TP / PP
+Recall = TP / AP
+

where PP is the number of predicted positives and AP is the number of actual positives.
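As a sanity check of these definitions, here is a toy confusion matrix in the same layout as supervision's (last column = FN, last row = FP); the numbers are made up purely for illustration.

```python
import numpy as np

# Toy confusion matrix (values are made up):
# rows = ground-truth classes, columns = predicted classes,
# last column = FN (missed objects), last row = FP (spurious predictions).
cm = np.array([
    [8, 1, 1],   # class 0: 8 correct, 1 confused with class 1, 1 missed
    [0, 6, 1],   # class 1: 6 correct, 1 missed
    [1, 2, 0],   # background predicted as class 0/1: false positives
])

TP = cm.diagonal().sum()   # the FP/FN corner is always 0, so it adds nothing
PP = cm[:, :-1].sum()      # predicted positives: drop the FN column
AP = cm[:-1, :].sum()      # actual positives: drop the FP row

precision, recall = TP / PP, TP / AP
print(TP, PP, AP)  # 14 18 17
print(round(precision, 3), round(recall, 3))  # 0.778 0.824
```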

+

To calculate TP, we can sum the values along the diagonal of the confusion matrix.

+
+
TP = cm.matrix.diagonal().sum()
+TP
+
+
np.float64(847.0)
+
+
+

To calculate PP, we should remove all cells that represent “not predicted” instances, which are nothing but FN. Thus, we will remove the last column, representing FN.

+
+
PP = cm.matrix[:, :-1].sum()
+PP
+
+
np.float64(931.0)
+
+
+

To calculate AP, we should remove all cells that represent “predicted but wrong” instances, which are nothing but FP. Thus, we will remove the last row, representing FP.

+
+
AP = cm.matrix[:-1, :].sum()
+AP
+
+
np.float64(1059.0)
+
+
+
+
P = TP / PP
+R = TP / AP
+F1 = 2 * P * R / (P + R)
+print(f"P: {P:.2f}, R: {R:.2f}, F1: {F1:.2f}")
+
+
P: 0.91, R: 0.80, F1: 0.85
+
+
+

Notice that to compute P, R and F1, we need to fix both a confidence threshold and an IoU threshold. Next, we will see some metrics that integrate over confidence thresholds and use only an IoU threshold.

+

There are dedicated methods in supervision to compute Precision, Recall and F1 Score, but they are currently quite slow. If they become faster in the future, one can use them with the following code.

+
+
# f1_score = sv.metrics.MeanAveragePrecision(sv.metrics.MetricTarget.ORIENTED_BOUNDING_BOXES)
+# f1_score.update(predictions, detections).compute()
+
+
+
# precision = sv.metrics.Precision(sv.metrics.MetricTarget.ORIENTED_BOUNDING_BOXES)
+# precision.update(predictions, detections).compute()
+
+
+
# recall = sv.metrics.Recall(sv.metrics.MetricTarget.ORIENTED_BOUNDING_BOXES)
+# recall.update(predictions, detections).compute()
+
+
+
+
+

CLI method

+

When we use the CLI method of inference in Ultralytics, results are saved to disk. Now we need to load them back to calculate the metrics.

+
+
from ultralytics.engine.results import Results
+
+

The following method is extremely memory-consuming unless this issue is resolved: https://github.com/roboflow/supervision/issues/1762. I am not adding further steps until the issue is resolved, but they should be similar to the axis-aligned bounding box case.
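As a stopgap, one could parse the saved label files manually, without materializing per-instance masks. This is a sketch under the assumption that each saved line holds a class id followed by eight normalized corner coordinates (with an optional trailing confidence); the helper is hypothetical, not a supervision API.

```python
import numpy as np

def load_obb_txt(lines, image_wh):
    """Parse YOLO-OBB prediction lines into class ids, pixel corners, and
    axis-aligned xyxy boxes, without building boolean masks."""
    class_ids, corners, xyxy = [], [], []
    for line in lines:
        values = np.array(line.split(), dtype=float)
        class_ids.append(int(values[0]))
        # Take exactly eight coordinates; a ninth value (confidence) is ignored here.
        pts = values[1:9].reshape(4, 2) * np.array(image_wh)
        corners.append(pts)
        xyxy.append([pts[:, 0].min(), pts[:, 1].min(), pts[:, 0].max(), pts[:, 1].max()])
    return np.array(class_ids), np.array(corners), np.array(xyxy)

# Hypothetical example line in the saved-label format, on a 1000x800 image.
ids, corners, boxes = load_obb_txt(
    ["7 0.1 0.1 0.2 0.1 0.2 0.2 0.1 0.2"], image_wh=(1000, 800)
)
print(ids)  # class ids: [7]
print(boxes)  # one axis-aligned box: [100, 80, 200, 160] in pixels
```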

+
+
predicted_dataset = sv.DetectionDataset.from_yolo(
+    "/tmp/DOTAv1_small/images/val",
+    "/tmp/runs/obb/predict/labels",
+    data_yaml_path="/tmp/DOTAv1_small/data.yml",
+    is_obb=True,
+)
+
+
+
---------------------------------------------------------------------------
+KeyboardInterrupt                         Traceback (most recent call last)
+Cell In[78], line 1
+----> 1 predicted_dataset = sv.DetectionDataset.from_yolo(
+      2     "/tmp/DOTAv1_small/images/val",
+      3     "/tmp/runs/obb/predict/labels",
+      4     data_yaml_path="/tmp/DOTAv1_small/data.yml",
+      5     is_obb=True,
+      6 )
+
+File /opt/anaconda3/envs/zeel_py310/lib/python3.10/site-packages/supervision/dataset/core.py:497, in DetectionDataset.from_yolo(cls, images_directory_path, annotations_directory_path, data_yaml_path, force_masks, is_obb)
+    445 @classmethod
+    446 def from_yolo(
+    447     cls,
+   (...)
+    452     is_obb: bool = False,
+    453 ) -> DetectionDataset:
+    454     """
+    455     Creates a Dataset instance from YOLO formatted data.
+    456 
+   (...)
+    495         ```
+    496     """
+--> 497     classes, image_paths, annotations = load_yolo_annotations(
+    498         images_directory_path=images_directory_path,
+    499         annotations_directory_path=annotations_directory_path,
+    500         data_yaml_path=data_yaml_path,
+    501         force_masks=force_masks,
+    502         is_obb=is_obb,
+    503     )
+    504     return DetectionDataset(
+    505         classes=classes, images=image_paths, annotations=annotations
+    506     )
+
+File /opt/anaconda3/envs/zeel_py310/lib/python3.10/site-packages/supervision/dataset/formats/yolo.py:182, in load_yolo_annotations(images_directory_path, annotations_directory_path, data_yaml_path, force_masks, is_obb)
+    180     with_masks = _with_mask(lines=lines)
+    181     with_masks = force_masks if force_masks else with_masks
+--> 182     annotation = yolo_annotations_to_detections(
+    183         lines=lines,
+    184         resolution_wh=resolution_wh,
+    185         with_masks=with_masks,
+    186         is_obb=is_obb,
+    187     )
+    188     annotations[image_path] = annotation
+    189 return classes, image_paths, annotations
+
+File /opt/anaconda3/envs/zeel_py310/lib/python3.10/site-packages/supervision/dataset/formats/yolo.py:120, in yolo_annotations_to_detections(lines, resolution_wh, with_masks, is_obb)
+    115     return Detections(class_id=class_id, xyxy=xyxy, data=data)
+    117 polygons = [
+    118     (polygon * np.array(resolution_wh)).astype(int) for polygon in relative_polygon
+    119 ]
+--> 120 mask = _polygons_to_masks(polygons=polygons, resolution_wh=resolution_wh)
+    121 return Detections(class_id=class_id, xyxy=xyxy, data=data, mask=mask)
+
+File /opt/anaconda3/envs/zeel_py310/lib/python3.10/site-packages/supervision/dataset/formats/yolo.py:50, in _polygons_to_masks(polygons, resolution_wh)
+     47 def _polygons_to_masks(
+     48     polygons: List[np.ndarray], resolution_wh: Tuple[int, int]
+     49 ) -> np.ndarray:
+---> 50     return np.array(
+     51         [
+     52             polygon_to_mask(polygon=polygon, resolution_wh=resolution_wh)
+     53             for polygon in polygons
+     54         ],
+     55         dtype=bool,
+     56     )
+
+KeyboardInterrupt: 
+
+
+
+
+
+
+
+

Versions

+
+
%reload_ext watermark
+
+%watermark -v -m -p supervision,ultralytics
+
+
Python implementation: CPython
+Python version       : 3.10.15
+IPython version      : 8.28.0
+
+supervision: 0.26.0rc1
+ultralytics: 8.3.55
+
+Compiler    : GCC 13.3.0
+OS          : Linux
+Release     : 5.15.0-124-generic
+Machine     : x86_64
+Processor   : x86_64
+CPU cores   : 64
+Architecture: 64bit
+
+
+
+ + +
+ +
+ +
+ + + + + \ No newline at end of file diff --git a/posts/2024-12-29-object-detection-how-to_files/figure-html/cell-13-output-1.png b/posts/2024-12-29-object-detection-how-to_files/figure-html/cell-13-output-1.png new file mode 100644 index 0000000..22768d8 Binary files /dev/null and b/posts/2024-12-29-object-detection-how-to_files/figure-html/cell-13-output-1.png differ diff --git a/posts/2024-12-29-object-detection-how-to_files/figure-html/cell-14-output-1.png b/posts/2024-12-29-object-detection-how-to_files/figure-html/cell-14-output-1.png new file mode 100644 index 0000000..c5c1c64 Binary files /dev/null and b/posts/2024-12-29-object-detection-how-to_files/figure-html/cell-14-output-1.png differ diff --git a/posts/2024-12-29-object-detection-how-to_files/figure-html/cell-15-output-2.png b/posts/2024-12-29-object-detection-how-to_files/figure-html/cell-15-output-2.png new file mode 100644 index 0000000..b072fef Binary files /dev/null and b/posts/2024-12-29-object-detection-how-to_files/figure-html/cell-15-output-2.png differ diff --git a/posts/2024-12-29-object-detection-how-to_files/figure-html/cell-16-output-2.png b/posts/2024-12-29-object-detection-how-to_files/figure-html/cell-16-output-2.png new file mode 100644 index 0000000..24da045 Binary files /dev/null and b/posts/2024-12-29-object-detection-how-to_files/figure-html/cell-16-output-2.png differ diff --git a/posts/2024-12-29-object-detection-how-to_files/figure-html/cell-27-output-1.png b/posts/2024-12-29-object-detection-how-to_files/figure-html/cell-27-output-1.png new file mode 100644 index 0000000..90871d7 Binary files /dev/null and b/posts/2024-12-29-object-detection-how-to_files/figure-html/cell-27-output-1.png differ diff --git a/posts/2024-12-29-object-detection-how-to_files/figure-html/cell-33-output-1.png b/posts/2024-12-29-object-detection-how-to_files/figure-html/cell-33-output-1.png new file mode 100644 index 0000000..e4ff6d6 Binary files /dev/null and 
b/posts/2024-12-29-object-detection-how-to_files/figure-html/cell-33-output-1.png differ diff --git a/posts/2024-12-29-object-detection-how-to_files/figure-html/cell-36-output-1.png b/posts/2024-12-29-object-detection-how-to_files/figure-html/cell-36-output-1.png new file mode 100644 index 0000000..0baaa4c Binary files /dev/null and b/posts/2024-12-29-object-detection-how-to_files/figure-html/cell-36-output-1.png differ diff --git a/posts/2024-12-29-object-detection-how-to_files/figure-html/cell-9-output-1.png b/posts/2024-12-29-object-detection-how-to_files/figure-html/cell-9-output-1.png new file mode 100644 index 0000000..2fe3953 Binary files /dev/null and b/posts/2024-12-29-object-detection-how-to_files/figure-html/cell-9-output-1.png differ diff --git a/posts/GPT-from-scratch.html b/posts/2025-02-10-object-detection-random-baseline.html similarity index 92% rename from posts/GPT-from-scratch.html rename to posts/2025-02-10-object-detection-random-baseline.html index c9c8c1f..d43ef1a 100644 --- a/posts/GPT-from-scratch.html +++ b/posts/2025-02-10-object-detection-random-baseline.html @@ -2,15 +2,15 @@ - + - - + + -Building GPT from scratch – blog +Object Detection Random Baseline – blog - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
-
- -
- -
-
-
-

Active Learning with MNIST

-
-
- Active Learning with MNIST -
-
-
-
ML
-
-
-
- - -
- -
-
Author
-
-

Zeel B Patel

-
-
- -
-
Published
-
-

September 30, 2023

-
-
- - -
- - -
- - - - -
- - - - - -
-

Imports

-
-
import pandas as pd
-import numpy as np
-from sklearn.ensemble import RandomForestClassifier
-from sklearn.model_selection import train_test_split
-from sklearn.datasets import fetch_openml
-from sklearn.metrics import classification_report, precision_score, recall_score, f1_score, confusion_matrix
-
-import matplotlib.pyplot as plt
-from tqdm import tqdm
-
-import psutil
-
-
-
-

Load data

-
-
X, y = fetch_openml('mnist_784', version=1, data_home='data', return_X_y=True, as_frame=False)
-
-
/home/patel_zeel/miniconda3/lib/python3.9/site-packages/sklearn/datasets/_openml.py:1002: FutureWarning: The default value of `parser` will change from `'liac-arff'` to `'auto'` in 1.4. You can set `parser='auto'` to silence this warning. Therefore, an `ImportError` will be raised from 1.4 if the dataset is dense and pandas is not installed. Note that the pandas parser may return different data types. See the Notes Section in fetch_openml's API doc for details.
-  warn(
-
-
-
-
-

-
-
X_train, X_test, y_train, y_test = X[:60000], X[60000:], y[:60000], y[60000:]
-print(X_train.shape, X_test.shape, y_train.shape, y_test.shape)
-
-
(60000, 784) (10000, 784) (60000,) (10000,)
-
-
-
-
-

Check if things are working as expected

-
-
%%time
-
-clf = RandomForestClassifier(n_estimators=100, max_depth=10, random_state=0, n_jobs=psutil.cpu_count()//2)
-clf.fit(X_train, y_train)
-preds = clf.predict(X_test)
-
-print(classification_report(y_test, preds))
-
-
              precision    recall  f1-score   support
-
-           0       0.96      0.99      0.97       980
-           1       0.98      0.99      0.98      1135
-           2       0.94      0.94      0.94      1032
-           3       0.94      0.94      0.94      1010
-           4       0.95      0.93      0.94       982
-           5       0.96      0.93      0.94       892
-           6       0.96      0.97      0.96       958
-           7       0.95      0.92      0.94      1028
-           8       0.94      0.93      0.93       974
-           9       0.88      0.94      0.91      1009
-
-    accuracy                           0.95     10000
-   macro avg       0.95      0.95      0.95     10000
-weighted avg       0.95      0.95      0.95     10000
-
-CPU times: user 1min 33s, sys: 652 ms, total: 1min 34s
-Wall time: 3.73 s
-
-
-
-
-

Convert to one v/s rest problem

-
-
y_c = (y == '2').astype(np.int8)
-
-y_c_train, y_c_test = y_c[:60000], y_c[60000:]
-
-
-
-

Check if things are working as expected

-
-
clf = RandomForestClassifier(n_estimators=100, max_depth=10, random_state=0, n_jobs=psutil.cpu_count()//2)
-clf.fit(X_train, y_c_train)
-preds = clf.predict(X_test)
-
-
-
print("Precision", precision_score(y_c_test, preds))
-print("Recall", recall_score(y_c_test, preds))
-
-
Precision 0.9808988764044944
-Recall 0.8459302325581395
-
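The precision/recall pair above can be collapsed into a single number via the F1 score, their harmonic mean — a minimal sketch using the values printed above:

```python
def f1(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# values from the one-vs-rest run above
print(round(f1(0.9808988764044944, 0.8459302325581395), 4))  # ≈ 0.9084
```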
-
-
-
-

Divide data into train and pool

-
-
train_size = 200
-X_train, X_pool, y_c_train, y_c_pool = train_test_split(X, y_c, train_size=train_size, random_state=42)
-print(X_train.shape, X_pool.shape, y_c_train.shape, y_c_pool.shape)
-
-# plot a bar chart of the number of samples in each class for the training and test set
-unique, counts = np.unique(y_c_train, return_counts=True)
-print("Number of samples in each class for training set", dict(zip(unique, counts)))
-print("One v/s rest ratio", counts[0]/counts[1], "for training set")
-
-
(200, 784) (69800, 784) (200,) (69800,)
-Number of samples in each class for training set {0: 179, 1: 21}
-One v/s rest ratio 8.523809523809524 for training set
-
-
-
-
-

Prof. Ermon’s method

-
-
X_train, X_pool, y_c_train, y_c_pool = train_test_split(X, y_c, train_size=train_size, random_state=42)
-print(X_train.shape, X_pool.shape, y_c_train.shape, y_c_pool.shape)
-
-clf = RandomForestClassifier(n_estimators=100, max_depth=10, random_state=0, n_jobs=psutil.cpu_count()//2)
-clf.fit(X_train, y_c_train)
-preds = clf.predict(X_test)
-
-test_recall = [recall_score(y_c_test, preds)]
-test_precision = [precision_score(y_c_test, preds)]
-positives = [np.sum(y_c_train)]
-negatives = [len(y_c_train) - positives[-1]]
-labeling_cost = [0]
-tp = np.where((preds == 1) & (y_c_test == 1))[0]
-fp = np.where((preds == 1) & (y_c_test == 0))[0]
-print("Test: Number of false positives", len(fp), "Number of true positives", len(tp))
-print("Iteration", 0, "Precision", test_precision[-1], "Recall", test_recall[-1], "Cost", labeling_cost[-1])
-
-al_iters = 10
-
-for iter in range(al_iters):
-    print()
-    preds = clf.predict(X_pool)
-    # pred_proba = clf.predict_proba(X_pool)
-    # print(pred_proba.shape)
-    # identify instances predicted as positive but are actually negative (false positives)
-    # we only pick points with more than 90% probability of being positive
-    # fp = np.where((pred_proba[:, 1] > 0.8) & (y_pool == 0))[0]
-    fp = np.where((preds == 1) & (y_c_pool == 0))[0]
-    tp = np.where((preds == 1) & (y_c_pool == 1))[0]
-    fn = np.where((preds == 0) & (y_c_pool == 1))[0]
-    print("Pool: Number of false positives", len(fp), "Number of true positives", len(tp), "Number of false negatives", len(fn))
-    tp_fp = np.concatenate((tp, fp))
-    # add them to the training set
-    X_train = np.concatenate((X_train, X_pool[tp_fp]))
-    y_c_train = np.concatenate((y_c_train, y_c_pool[tp_fp]))
-    positives.append(np.sum(y_c_train))
-    negatives.append(len(y_c_train) - positives[-1])
-    # remove from the pool set
-    X_pool = np.delete(X_pool, tp_fp, axis=0)
-    y_c_pool = np.delete(y_c_pool, tp_fp)
-    # add the cost of labeling to the list
-    labeling_cost.append(len(tp_fp))
-    # train the classifier again
-    clf.fit(X_train, y_c_train)
-    # predict on the test set
-    preds = clf.predict(X_test)
-    tp = np.where((preds == 1) & (y_c_test == 1))[0]
-    fp = np.where((preds == 1) & (y_c_test == 0))[0]
-    fn = np.where((preds == 0) & (y_c_test == 1))[0]
-    print("Test: Number of false positives", len(fp), "Number of true positives", len(tp), "Number of false negatives", len(fn))
-    # calculate precision and recall
-    test_recall.append(recall_score(y_c_test, preds))
-    test_precision.append(precision_score(y_c_test, preds))
-    # print information
-    print("Iteration", iter+1, "Precision", test_precision[-1], "Recall", test_recall[-1], "Cost", labeling_cost[-1])
-
-labeling_cost = np.cumsum(labeling_cost)
-
-
(200, 784) (69800, 784) (200,) (69800,)
-Test: Number of false positives 8 Number of true positives 283
-Iteration 0 Precision 0.9725085910652921 Recall 0.2742248062015504 Cost 0
-
-Pool: Number of false positives 73 Number of true positives 1884 Number of false negatives 5085
-Test: Number of false positives 209 Number of true positives 932 Number of false negatives 100
-Iteration 1 Precision 0.8168273444347064 Recall 0.9031007751937985 Cost 1957
-
-Pool: Number of false positives 1389 Number of true positives 4386 Number of false negatives 699
-Test: Number of false positives 489 Number of true positives 1016 Number of false negatives 16
-Iteration 2 Precision 0.6750830564784053 Recall 0.9844961240310077 Cost 5775
-
-Pool: Number of false positives 4088 Number of true positives 598 Number of false negatives 101
-Test: Number of false positives 12 Number of true positives 1006 Number of false negatives 26
-Iteration 3 Precision 0.9882121807465619 Recall 0.9748062015503876 Cost 4686
-
-Pool: Number of false positives 18 Number of true positives 15 Number of false negatives 86
-Test: Number of false positives 10 Number of true positives 1010 Number of false negatives 22
-Iteration 4 Precision 0.9901960784313726 Recall 0.9786821705426356 Cost 33
-
-Pool: Number of false positives 13 Number of true positives 5 Number of false negatives 81
-Test: Number of false positives 11 Number of true positives 1010 Number of false negatives 22
-Iteration 5 Precision 0.9892262487757101 Recall 0.9786821705426356 Cost 18
-
-Pool: Number of false positives 5 Number of true positives 2 Number of false negatives 79
-Test: Number of false positives 10 Number of true positives 1011 Number of false negatives 21
-Iteration 6 Precision 0.990205680705191 Recall 0.9796511627906976 Cost 7
-
-Pool: Number of false positives 5 Number of true positives 1 Number of false negatives 78
-Test: Number of false positives 9 Number of true positives 1011 Number of false negatives 21
-Iteration 7 Precision 0.9911764705882353 Recall 0.9796511627906976 Cost 6
-
-Pool: Number of false positives 5 Number of true positives 2 Number of false negatives 76
-Test: Number of false positives 11 Number of true positives 1010 Number of false negatives 22
-Iteration 8 Precision 0.9892262487757101 Recall 0.9786821705426356 Cost 7
-
-Pool: Number of false positives 5 Number of true positives 2 Number of false negatives 74
-Test: Number of false positives 9 Number of true positives 1010 Number of false negatives 22
-Iteration 9 Precision 0.9911678115799804 Recall 0.9786821705426356 Cost 7
-
-Pool: Number of false positives 3 Number of true positives 1 Number of false negatives 73
-Test: Number of false positives 12 Number of true positives 1010 Number of false negatives 22
-Iteration 10 Precision 0.9882583170254403 Recall 0.9786821705426356 Cost 4
-
-
-
-
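The per-iteration update in the loop above — label every pool point the model predicts positive (TP + FP), move those points to the training set, refit — can be condensed into one reusable step. A minimal sketch; any object with `fit`/`predict` (the RandomForest above, or a stub) works:

```python
import numpy as np

def al_step(clf, X_train, y_train, X_pool, y_pool):
    """One round of the strategy above: every pool point predicted
    positive gets its true label revealed, moves from pool to train,
    and the classifier is refit. Returns updated sets plus the
    labeling cost (number of points moved)."""
    preds = clf.predict(X_pool)
    picked = np.where(preds == 1)[0]  # predicted positives: TP + FP
    X_train = np.concatenate([X_train, X_pool[picked]])
    y_train = np.concatenate([y_train, y_pool[picked]])
    X_pool = np.delete(X_pool, picked, axis=0)
    y_pool = np.delete(y_pool, picked)
    clf.fit(X_train, y_train)
    return clf, X_train, y_train, X_pool, y_pool, len(picked)
```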
# plot the confusion matrix
-import itertools
-cm = confusion_matrix(y_c_test, preds)
-plt.imshow(cm, interpolation="nearest", cmap=plt.cm.Blues)
-# add the numbers inside the boxes
-thresh = cm.max() / 2.0
-for i, j in itertools.product(range(cm.shape[0]), range(cm.shape[1])):
-    plt.text(j, i, cm[i, j], horizontalalignment="center", color="white" if cm[i, j] > thresh else "black")
-plt.title("Confusion Matrix")
-plt.xlabel("Predicted Label")
-plt.ylabel("True Label")
-
-
Text(0, 0.5, 'True Label')
-
-
-
-
-

-
-
-
-
-
-
pd.DataFrame({"Cost": labeling_cost, "Train_Positives": positives, "Train_Negatives": negatives, "Test_Precision": test_precision, "Test_Recall": test_recall})
-
-
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
      Cost  Train_Positives  Train_Negatives  Test_Precision  Test_Recall
0        0               21              179        0.972509     0.274225
1     1957             1905              252        0.816827     0.903101
2     7732             6291             1641        0.675083     0.984496
3    12418             6889             5729        0.988212     0.974806
4    12451             6904             5747        0.990196     0.978682
5    12469             6909             5760        0.989226     0.978682
6    12476             6911             5765        0.990206     0.979651
7    12482             6912             5770        0.991176     0.979651
8    12489             6914             5775        0.989226     0.978682
9    12496             6916             5780        0.991168     0.978682
10   12500             6917             5783        0.988258     0.978682
- -
-
-
- - -
- -
- -
- - - - - \ No newline at end of file diff --git a/posts/AL_with_MNIST_files/figure-html/cell-11-output-2.png b/posts/AL_with_MNIST_files/figure-html/cell-11-output-2.png deleted file mode 100644 index 36c0f0b..0000000 Binary files a/posts/AL_with_MNIST_files/figure-html/cell-11-output-2.png and /dev/null differ diff --git a/posts/Basis_functions.html b/posts/Basis_functions.html index ed7d9f3..c39ab84 100644 --- a/posts/Basis_functions.html +++ b/posts/Basis_functions.html @@ -2,7 +2,7 @@ - + @@ -72,10 +72,10 @@ - + - + - + - + - + - + - + - + - + - + - + - + - + - + - - - - - - - - - - - - - - - - - - - - - - - - - - -
-
- -
- -
-
-
-

Data Handling for Large Scale ML

-
-
- An exploratory analysis of various dataset handling processes to optimize memory, disk space, and speed. -
-
-
-
ML
-
-
-
- - -
- -
-
Author
-
-

Zeel B Patel

-
-
- -
-
Published
-
-

September 30, 2023

-
-
- - -
- - -
- - - - -
- - - - - -
-

Imports

-
-
import os
-os.environ["CUDA_VISIBLE_DEVICES"] = "3"
-
-import torch
-import torch.nn as nn
-from numcodecs import GZip, Zstd, Blosc
-
-from time import time, sleep
-from tqdm import tqdm
-from glob import glob
-from os.path import join
-from torch.utils.data import DataLoader, Dataset
-from joblib import Parallel, delayed
-import xarray as xr
-import numpy as np
-
-from torchvision.models import vit_b_16
-from astra.torch.models import ViTClassifier
-from astra.torch.utils import train_fn
-
-
-
-

Creating Custom Dataset

-
-
base_path = "/home/patel_zeel/bkdb/bangladesh_pnas_pred/team1"
-xr.open_zarr(join(base_path, "21.11,92.18.zarr"), consolidated=False)
-
-
- - - - - - - - - - - - - - -
<xarray.Dataset>
-Dimensions:  (channel: 3, col: 224, lat_lag: 5, lon_lag: 5, row: 224)
-Coordinates:
-  * channel  (channel) uint8 0 1 2
-  * col      (col) uint8 0 1 2 3 4 5 6 7 8 ... 216 217 218 219 220 221 222 223
-    lat      float64 ...
-  * lat_lag  (lat_lag) int8 -2 -1 0 1 2
-    lon      float64 ...
-  * lon_lag  (lon_lag) int8 -2 -1 0 1 2
-  * row      (row) uint8 0 1 2 3 4 5 6 7 8 ... 216 217 218 219 220 221 222 223
-Data variables:
-    data     (lat_lag, lon_lag, row, col, channel) uint8 dask.array<chunksize=(3, 3, 112, 112, 3), meta=np.ndarray>
-    label    (lat_lag, lon_lag) int8 dask.array<chunksize=(5, 5), meta=np.ndarray>
-
-
-
-
class XarrayDataset(Dataset):
-    def __init__(self, path, max_files):
-        self.base_path = path
-        self.all_files = glob(join(path, "*.zarr"))[:max_files]
-        self.all_files.sort()
-        self.lat_lags = [-2, -1, 0, 1, 2]
-        self.lon_lags = [-2, -1, 0, 1, 2]
-        
-    def __len__(self):
-        return len(self.all_files) * 25
-    
-    def __getitem__(self, idx):
-        file_idx = idx // 25
-        local_idx = idx % 25
-        lat_lag = self.lat_lags[local_idx // 5]
-        lon_lag = self.lon_lags[local_idx % 5]
-        
-        with xr.open_zarr(self.all_files[file_idx], consolidated=False) as ds:
-            img =  ds.isel(lat_lag=lat_lag, lon_lag=lon_lag)['data']
-            # swap dims to make it ["channel", "row", "col"]
-            img = img.transpose("channel", "row", "col").values
-            return img.astype(np.float32) / 255
-
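The flat-index arithmetic in `__getitem__` (25 patches per file, row-major over the two lag axes) can be checked in isolation — a small sketch mirroring the class above. One caveat: `isel` treats the negative lags as positions from the end rather than coordinate values; because the coordinate happens to be `[-2, -1, 0, 1, 2]`, all 25 patches are still covered, just in a permuted order (`sel` would index by value).

```python
LAT_LAGS = [-2, -1, 0, 1, 2]
LON_LAGS = [-2, -1, 0, 1, 2]

def locate(idx):
    """Map a flat dataset index to (file_idx, lat_lag, lon_lag),
    exactly as XarrayDataset.__getitem__ does."""
    file_idx, local_idx = divmod(idx, 25)
    return file_idx, LAT_LAGS[local_idx // 5], LON_LAGS[local_idx % 5]

print(locate(0), locate(24), locate(26))  # (0, -2, -2) (0, 2, 2) (1, -2, -1)
```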
-
-
def process_it(dataset, batch_size, num_workers):
-    dataloader = DataLoader(dataset, batch_size=batch_size, shuffle=True, num_workers=num_workers, pin_memory=True, pin_memory_device='cuda', prefetch_factor=num_workers//2)
-
-    model = ViTClassifier(vit_b_16, None, 2).to('cuda')
-    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
-
-    pbar = tqdm(dataloader)
-
-    train_init = time()
-    iter_times = []
-    for batch in pbar:
-        init = time()
-        optimizer.zero_grad()
-        out = model(batch.to('cuda'))
-        loss = nn.CrossEntropyLoss()(out, torch.randint(0, 2, (batch.shape[0],)).to('cuda'))
-        loss.backward()
-        optimizer.step()
-        time_taken = time() - init
-        pbar.set_description(f"Time: {time_taken:.4f}")
-        iter_times.append(time_taken)
-        
-    total_time = time() - train_init
-    print(f"Average Iteration Processing Time: {np.mean(iter_times):.4f} +- {np.std(iter_times):.4f}")
-    print(f"Total time for all iterations: {np.sum(iter_times):.4f}")
-    print(f"Total Wall Time per iteration: {total_time / len(dataloader):.4f}")
-    print(f"Total Wall Time: {total_time:.4f}")
-
-
-
-

Global config

-
-
max_files = 500
-
-
-
batch_size = 256
-num_workers = 32
-
-dataset = XarrayDataset(base_path, max_files=max_files)
-process_it(dataset, batch_size, num_workers)
-
-
Time: 1.5727: 100%|██████████| 49/49 [01:27<00:00,  1.78s/it]
-
-
-
Average Iteration Processing Time: 1.6474 +- 0.2618
-Total time for all iterations: 80.7246
-Total Wall Time per iteration: 1.7799
-Total Wall Time: 87.2134
-
-
-
-
-
-
-
batch_size = 512
-num_workers = 16
-
-dataset = XarrayDataset(base_path, max_files=max_files)
-process_it(dataset, batch_size, num_workers)
-
-
Time: 2.6731: 100%|██████████| 25/25 [01:32<00:00,  3.69s/it]
-
-
-
Average Iteration Processing Time: 3.1956 +- 0.3949
-Total time for all iterations: 79.8897
-Total Wall Time per iteration: 3.6910
-Total Wall Time: 92.2762
-
-
-
-
-
-
-
batch_size = 512
-num_workers = 32
-
-dataset = XarrayDataset(base_path, max_files=max_files)
-process_it(dataset, batch_size, num_workers)
-
-
Time: 2.6726: 100%|██████████| 25/25 [01:32<00:00,  3.69s/it]
-
-
-
Average Iteration Processing Time: 3.1938 +- 0.4043
-Total time for all iterations: 79.8451
-Total Wall Time per iteration: 3.6908
-Total Wall Time: 92.2689
-
-
-
-
-
-
-
batch_size = 128
-num_workers = 32
-
-dataset = XarrayDataset(base_path, max_files=max_files)
-process_it(dataset, batch_size, num_workers)
-
-
Time: 0.7455: 100%|██████████| 98/98 [01:25<00:00,  1.15it/s]
-
-
-
Average Iteration Processing Time: 0.8269 +- 0.0551
-Total time for all iterations: 81.0315
-Total Wall Time per iteration: 0.8716
-Total Wall Time: 85.4156
-
-
-
-
-
-
-

Is .nc better than zarr?

-
-
os.system(f"du -sh {base_path}")
-
-
1.8G    /home/patel_zeel/bkdb/bangladesh_pnas_pred/team1
-
-
-
0
-
-
-
-
save_path = "/tmp/nc_check_uncompressed"
-os.makedirs(save_path, exist_ok=True)
-files = []
-def zarr_to_nc(file):
-    with xr.open_zarr(file, consolidated=False) as ds:
-        ds.to_netcdf(join(save_path, file.split("/")[-1].replace(".zarr", ".nc")))
-
-_ = Parallel(n_jobs=32)(delayed(zarr_to_nc)(file) for file in tqdm(glob(join(base_path, "*.zarr"))))
-
-os.system(f"du -sh {save_path}")
-
-
100%|██████████| 1501/1501 [00:24<00:00, 62.47it/s] 
-
-
-
5.3G    /tmp/nc_check_uncompressed
-
-
-
0
-
-
-
-
save_path = "/tmp/nc_check_compressed"
-os.system(f"rm -rf {save_path}")
-os.makedirs(save_path, exist_ok=True)
-
-encoding = {var: {"zlib": True, "complevel": 1} for var in ["data"]}
-
-files = []
-def zarr_to_nc(file):
-    with xr.open_zarr(file, consolidated=False) as ds:
-        ds.to_netcdf(join(save_path, file.split("/")[-1].replace(".zarr", ".nc")), encoding=encoding)
-
-_ = Parallel(n_jobs=32)(delayed(zarr_to_nc)(file) for file in tqdm(glob(join(base_path, "*.zarr"))))
-
-os.system(f"du -sh {save_path}")
-
-
100%|██████████| 1501/1501 [00:04<00:00, 311.18it/s]
-
-
-
1.8G    /tmp/nc_check_compressed
-
-
-
0
-
-
-
-
class XarrayDatasetWithNC(Dataset):
-    def __init__(self, path, max_files):
-        self.base_path = path
-        self.all_files = glob(join(path, "*.nc"))[:max_files]
-        self.all_files.sort()
-        self.all_ds = [xr.open_dataset(file) for file in tqdm(self.all_files)]
-        self.lat_lags = [-2, -1, 0, 1, 2]
-        self.lon_lags = [-2, -1, 0, 1, 2]
-        
-    def __len__(self):
-        return len(self.all_files) * 25
-    
-    def __getitem__(self, idx):
-        file_idx = idx // 25
-        local_idx = idx % 25
-        lat_lag = self.lat_lags[local_idx // 5]
-        lon_lag = self.lon_lags[local_idx % 5]
-        
-        ds = self.all_ds[file_idx]
-        img =  ds.isel(lat_lag=lat_lag, lon_lag=lon_lag)['data'].values
-        return torch.tensor(np.einsum("hwc->chw", img).astype(np.float32) / 255)
-
-
-
nc_path = "/tmp/nc_check_compressed"
-
-
-
batch_size = 128
-num_workers = 32
-
-dataset = XarrayDatasetWithNC(nc_path, max_files=max_files)
-process_it(dataset, batch_size, num_workers)
-
-
100%|██████████| 500/500 [00:02<00:00, 246.27it/s]
-Time: 0.7414: 100%|██████████| 98/98 [01:25<00:00,  1.15it/s]
-
-
-
Average Iteration Processing Time: 0.8260 +- 0.0530
-Total time for all iterations: 80.9527
-Total Wall Time per iteration: 0.8725
-Total Wall Time: 85.5034
-
-
-
-
-
-
-
-

Additional experiments

-
-
n_images = 60000
-t = 84.9131/500/25 * n_images
-print(f"Time to process {n_images} images: ", t/60, "minutes")
-
-
Time to process 60000 images:  6.793048000000001 minutes
-
-
-
-
files = glob(join(base_path, "*.zarr"))
-data_tensors = []
-for file in tqdm(files):
-    with xr.open_zarr(file, consolidated=False) as ds:
-        # print(ds['data'].values.reshape(-1, 224, 224, 3))
-        data_tensors.append(torch.tensor(np.einsum("nhwc->nchw", ds['data'].values.reshape(-1, 224, 224, 3)).astype(np.float16) / 255))
-
-
100%|██████████| 1501/1501 [02:44<00:00,  9.13it/s]
-
-
-
-
all_in_one = torch.concat(data_tensors, dim=0)
-all_in_one.shape
-
-
torch.Size([37525, 3, 224, 224])
-
-
-
-
all_in_one = all_in_one.to('cuda')
-
-
-
-

Insights

-
    -
  • GPU memory consumption is 17776MiB / 81920MiB at batch size 128 for the ViT model.
  • -
  • Uploading torch.Size([37525, 3, 224, 224]) of float32 data to the GPU takes 22054MiB / 81920MiB of GPU memory; the same data in float16 takes 11202MiB / 81920MiB.
  • -
  • .nc and .zarr make little practical difference in time or memory.
  • -
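The second bullet above follows from simple arithmetic: a dense `(37525, 3, 224, 224)` tensor holds ~5.65B elements, i.e. ~21548 MiB at 4 bytes/element (float32) and ~10774 MiB at 2 bytes/element (float16) — close to the observed 22054MiB and 11202MiB, with the gap plausibly being CUDA context and allocator overhead. A quick sketch:

```python
import math

def tensor_mib(shape, bytes_per_elem):
    """Raw storage for a dense tensor of the given shape, in MiB."""
    return math.prod(shape) * bytes_per_elem / 2**20

print(round(tensor_mib((37525, 3, 224, 224), 4)))  # float32: ~21548 MiB
print(round(tensor_mib((37525, 3, 224, 224), 2)))  # float16: ~10774 MiB
```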
- - -
-
- -
- -
- - - - - \ No newline at end of file diff --git a/posts/air-quality-google-.html b/posts/air-quality-google-.html deleted file mode 100644 index 573cda8..0000000 --- a/posts/air-quality-google-.html +++ /dev/null @@ -1,11965 +0,0 @@ - - - - - - - - - - - - -Google Air Quality Data – blog - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
-
- -
- -
-
-
-

Google Air Quality Data

-
-
- Google Air Quality API -
-
-
-
NumPy, Mathematics
-
-
-
- - -
- -
-
Author
-
-

Zeel B Patel

-
-
- -
-
Published
-
-

August 31, 2023

-
-
- - -
- - -
- - - - -
- - - - - -
-
import requests
-import numpy as np
-import pandas as pd
-import xarray as xr
-from tqdm import tqdm
-import matplotlib.pyplot as plt
-import dask
-from dask.distributed import Client, LocalCluster
-
-if "key" in locals():
-    pass
-else:
-    key = input("Enter your key: ")
-
-url = f"https://airquality.googleapis.com/v1/history:lookup?key={key}"
-url
-
-
'https://airquality.googleapis.com/v1/history:lookup?key=<REDACTED>'
-
-
-
-
if "client" in locals():
-    print(client)
-else:
-    cluster = LocalCluster(n_workers=54, threads_per_worker=1)
-    client = Client(cluster)
-    print(client)
-
-
<Client: 'tcp://127.0.0.1:36239' processes=54 threads=54, memory=503.73 GiB>
-
-
-
-
payload = {
-    "hours": 24 * 30,
-    "pageSize": 168,
-    "pageToken": "",
-    "location": {"latitude": 28.636429, "longitude": 77.201067},
-    "extraComputations": ["POLLUTANT_CONCENTRATION", "LOCAL_AQI"],
-}
-
-headers = {"Content-Type": "application/json"}
-
-response = requests.post(url, json=payload, headers=headers)
-
-res = response.json()
-
-
-
24 * 30
-
-
720
-
-
-
-
len(res["hoursInfo"])
-
-
168
-
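Only 168 of the requested 720 hours came back because `pageSize` caps each response. Collecting the rest means repeating the request with the page token from each response — a sketch, assuming the endpoint returns a `nextPageToken` field in the usual Google API style (worth verifying against the Air Quality API docs). `session` is anything with `requests.Session`'s `.post()` interface:

```python
def fetch_all_hours(session, url, location, hours=720, page_size=168):
    """Page through history:lookup until no page token remains,
    accumulating all hoursInfo entries."""
    payload = {
        "hours": hours,
        "pageSize": page_size,
        "pageToken": "",
        "location": location,
        "extraComputations": ["POLLUTANT_CONCENTRATION", "LOCAL_AQI"],
    }
    hours_info = []
    while True:
        res = session.post(url, json=payload,
                           headers={"Content-Type": "application/json"}).json()
        hours_info.extend(res.get("hoursInfo", []))
        token = res.get("nextPageToken")
        if not token:
            break
        payload["pageToken"] = token
    return hours_info
```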
-
-
-
ts = []
-codes = []
-pm25 = []
-df = pd.DataFrame(columns=["timestamp", "value", "code"])
-
-for each in res["hoursInfo"]:
-    ts.append(each["dateTime"])
-    codes.append(each["pollutants"][4]["code"])
-    pm25.append(each["pollutants"][4]["concentration"]["value"])
-
-df["timestamp"] = ts
-df["value"] = pm25
-df["code"] = codes
-df["timestamp"] = pd.to_datetime(df["timestamp"])
-df.head(20)
-
-
-
---------------------------------------------------------------------------
-KeyError                                  Traceback (most recent call last)
-Cell In[5], line 9
-      7     ts.append(each["dateTime"])
-      8     codes.append(each["pollutants"][4]["code"])
-----> 9     pm25.append(each["pollutants"][4]["concentration"]["value"])
-     11 df["timestamp"] = ts
-     12 df["value"] = pm25
-
-KeyError: 'concentration'
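The failure above is an hour whose PM2.5 entry carries a `code` but no `concentration` field. Guarding the lookup with `dict.get` keeps the loop alive, emitting NaN for the missing hours — a minimal sketch on a hypothetical response fragment (field names follow the cells above):

```python
def extract_pm25(hours_info):
    """Collect (timestamp, pm25, code) rows, tolerating hours
    that lack the 'concentration' field (value becomes NaN)."""
    rows = []
    for each in hours_info:
        p = each["pollutants"][4]
        value = p.get("concentration", {}).get("value", float("nan"))
        rows.append((each["dateTime"], value, p["code"]))
    return rows

# hypothetical fragment: the second hour is missing 'concentration'
sample = [
    {"dateTime": "t0",
     "pollutants": [{}] * 4 + [{"code": "pm25", "concentration": {"value": 42.0}}]},
    {"dateTime": "t1",
     "pollutants": [{}] * 4 + [{"code": "pm25"}]},
]
rows = extract_pm25(sample)
```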
-
-
-
-
-
sensor_data = pd.read_excel(
-    "/home/patel_zeel/blog/posts/site_12220230831125207.xlsx", skiprows=16
-)
-sensor_data["From Date"] = pd.to_datetime(
-    sensor_data["From Date"], format="%d-%m-%Y %H:%M"
-)
-sensor_data["To Date"] = pd.to_datetime(sensor_data["To Date"], format="%d-%m-%Y %H:%M")
-sensor_data["mean_time"] = sensor_data[["From Date", "To Date"]].mean(axis=1)
-sensor_data["utc_time"] = sensor_data["mean_time"] - pd.Timedelta(hours=5, minutes=30)
-
-fig, ax = plt.subplots(figsize=(15, 4))
-sensor_data.plot(x="utc_time", y="PM2.5", ax=ax, label="sensor")
-df.plot(x="timestamp", y="value", ax=ax, label="google")
-
-
-
-

-
-
-
-
-
-

Request on a grid

-
-
from aqmsp_data.data import load_camx
-
-camx = load_camx(years=2022, months=1, days=1, variables="P25")
-camx
-
-
/home/patel_zeel/miniconda3/lib/python3.9/site-packages/xarray/core/indexing.py:1443: PerformanceWarning: Slicing is producing a large chunk. To accept the large
-chunk and silence this warning, set the option
-    >>> with dask.config.set(**{'array.slicing.split_large_chunks': False}):
-    ...     array[indexer]
-
-To avoid creating the large chunks, set the option
-    >>> with dask.config.set(**{'array.slicing.split_large_chunks': True}):
-    ...     array[indexer]
-  return self.array[key]
-
-
-
- - - - - - - - - - - - - - -
<xarray.Dataset>
-Dimensions:    (time: 24, latitude: 80, longitude: 80)
-Coordinates:
-  * longitude  (longitude) float64 76.85 76.86 76.87 76.88 ... 77.62 77.63 77.64
-  * latitude   (latitude) float64 28.2 28.21 28.22 28.23 ... 28.97 28.98 28.99
-  * time       (time) datetime64[ns] 2022-01-01T00:30:00 ... 2022-01-01T23:30:00
-Data variables:
-    P25        (time, latitude, longitude) float32 dask.array<chunksize=(24, 80, 80), meta=np.ndarray>
-Attributes: (12/34)
-    CDATE:          2023126
-    CTIME:          95909
-    EXEC_ID:        ????????????????                                         ...
-    FILEDESC:       I/O API formatted CAMx AVRG output                       ...
-    FTYPE:          1
-    GDNAM:          ????????????????
-    ...             ...
-    XCELL:          0.009999999776482582
-    XCENT:          0.0
-    XORIG:          76.8499984741211
-    YCELL:          0.009999999776482582
-    YCENT:          0.0
-    YORIG:          28.200000762939453
-
-
-
-
lat_grid, lon_grid = np.meshgrid(camx.latitude, camx.longitude)
-lat_lon_grid = np.vstack([lat_grid.ravel(), lon_grid.ravel()]).T
-print(lat_lon_grid.shape)
-
-session = requests.Session()
-delayed_fn = dask.delayed(session.post)
-responses = []
-for lat, lon in tqdm(lat_lon_grid):
-    payload = {
-        "hours": 1,
-        "pageSize": 200,
-        "pageToken": "",
-        "location": {"latitude": lat, "longitude": lon},
-        "extraComputations": ["POLLUTANT_CONCENTRATION"],
-    }
-
-    headers = {"Content-Type": "application/json"}
-
-    response = delayed_fn(url, json=payload, headers=headers)
-    responses.append(response)
-
-all_res = dask.compute(*responses)
-
-
(6400, 2)
-
-
-
100%|██████████| 6400/6400 [00:01<00:00, 5981.06it/s]
-
-
-
-
---------------------------------------------------------------------------
-KeyboardInterrupt                         Traceback (most recent call last)
-Cell In[7], line 22
-     19     response = delayed_fn(url, json=payload, headers=headers)
-     20     responses.append(response)
----> 22 all_res = dask.compute(*responses)
-
-File ~/miniconda3/lib/python3.9/site-packages/dask/base.py:666, in compute(traverse, optimize_graph, scheduler, get, *args, **kwargs)
-    663     keys.append(x.__dask_keys__())
-    664     postcomputes.append(x.__dask_postcompute__())
---> 666 results = schedule(dsk, keys, **kwargs)
-    667 return repack([f(r, *a) for r, (f, a) in zip(results, postcomputes)])
-
-File ~/miniconda3/lib/python3.9/site-packages/dask/threaded.py:89, in get(dsk, keys, cache, num_workers, pool, **kwargs)
-     86     elif isinstance(pool, multiprocessing.pool.Pool):
-     87         pool = MultiprocessingPoolExecutor(pool)
----> 89 results = get_async(
-     90     pool.submit,
-     91     pool._max_workers,
-     92     dsk,
-     93     keys,
-     94     cache=cache,
-     95     get_id=_thread_get_id,
-     96     pack_exception=pack_exception,
-     97     **kwargs,
-     98 )
-    100 # Cleanup pools associated to dead threads
-    101 with pools_lock:
-
-File ~/miniconda3/lib/python3.9/site-packages/dask/local.py:500, in get_async(submit, num_workers, dsk, result, cache, get_id, rerun_exceptions_locally, pack_exception, raise_exception, callbacks, dumps, loads, chunksize, **kwargs)
-    498 while state["waiting"] or state["ready"] or state["running"]:
-    499     fire_tasks(chunksize)
---> 500     for key, res_info, failed in queue_get(queue).result():
-    501         if failed:
-    502             exc, tb = loads(res_info)
-
-File ~/miniconda3/lib/python3.9/site-packages/dask/local.py:137, in queue_get(q)
-    136 def queue_get(q):
---> 137     return q.get()
-
-File ~/miniconda3/lib/python3.9/queue.py:171, in Queue.get(self, block, timeout)
-    169 elif timeout is None:
-    170     while not self._qsize():
---> 171         self.not_empty.wait()
-    172 elif timeout < 0:
-    173     raise ValueError("'timeout' must be a non-negative number")
-
-File ~/miniconda3/lib/python3.9/threading.py:312, in Condition.wait(self, timeout)
-    310 try:    # restore state no matter what (e.g., KeyboardInterrupt)
-    311     if timeout is None:
---> 312         waiter.acquire()
-    313         gotit = True
-    314     else:
-
-KeyboardInterrupt: 
-
-
-
-
-
dfs = []
-for res, (lat, lon) in tqdm(zip(all_res, lat_lon_grid)):
-    res = res.json()
-    df = pd.DataFrame(columns=["timestamp", "value", "code"])
-    ts = []
-    codes = []
-    pm25 = []
-    for each in res["hoursInfo"]:
-        ts.append(each["dateTime"])
-        codes.append(each["pollutants"][4]["code"])
-        try:
-            pm25.append(each["pollutants"][4]["concentration"]["value"])
-        except KeyError:
-            pm25.append(np.nan)
-
-    df["timestamp"] = ts
-    df["value"] = pm25
-    df["code"] = codes
-    df["lat"] = lat
-    df["lon"] = lon
-    df["timestamp"] = pd.to_datetime(df["timestamp"])
-    dfs.append(df)
-master_df = pd.concat(dfs)
-assert master_df["code"].nunique() == 1 and master_df["code"].unique()[0] == "pm25"
-
-ds = master_df.set_index(["lat", "lon", "timestamp"]).to_xarray()
-ds.isel(timestamp=0)["value"].plot(x="lon", y="lat", cmap="RdYlGn_r")
-
-
6400it [00:12, 497.91it/s]
-
-
-
-
-

-
-
-
-
-
-
-

Appendix

-
-
import xarray as xr
-
-
-
global_date = "2023-07-26"
-ds_met = xr.open_dataset(
-    f'../../sarath_auto_download/data/camxmet2d.delhi.{global_date.replace("-","")}.96hours.nc'
-)
-ds_aq = xr.open_dataset(
-    f'../../sarath_auto_download/data/camx120hr_merged_{global_date.replace("-","")}.nc'
-)
-ds_met
-
-
- - - - - - - - - - - - - - -
<xarray.Dataset>
-Dimensions:     (TSTEP: 96, VAR: 14, DATE-TIME: 2, LAY: 1, ROW: 80, COL: 80)
-Dimensions without coordinates: TSTEP, VAR, DATE-TIME, LAY, ROW, COL
-Data variables: (12/15)
-    TFLAG       (TSTEP, VAR, DATE-TIME) int32 ...
-    TSURF_K     (TSTEP, LAY, ROW, COL) float32 ...
-    SNOWEW_M    (TSTEP, LAY, ROW, COL) float32 ...
-    SNOWAGE_HR  (TSTEP, LAY, ROW, COL) float32 ...
-    PRATE_MMpH  (TSTEP, LAY, ROW, COL) float32 ...
-    CLOUD_OD    (TSTEP, LAY, ROW, COL) float32 ...
-    ...          ...
-    SWSFC_WpM2  (TSTEP, LAY, ROW, COL) float32 ...
-    SOLM_M3pM3  (TSTEP, LAY, ROW, COL) float32 ...
-    CLDTOP_KM   (TSTEP, LAY, ROW, COL) float32 ...
-    CAPE        (TSTEP, LAY, ROW, COL) float32 ...
-    PBL_WRF_M   (TSTEP, LAY, ROW, COL) float32 ...
-    PBL_YSU_M   (TSTEP, LAY, ROW, COL) float32 ...
-Attributes: (12/33)
-    IOAPI_VERSION:  $Id: @(#) ioapi library version 3.0 $                    ...
-    EXEC_ID:        ????????????????                                         ...
-    FTYPE:          1
-    CDATE:          2023207
-    CTIME:          72116
-    WDATE:          2023207
-    ...             ...
-    VGLVLS:         [0. 0.]
-    GDNAM:          ????????????????
-    UPNAM:          CAMx2IOAPI      
-    VAR-LIST:       TSURF_K         SNOWEW_M        SNOWAGE_HR      PRATE_MMp...
-    FILEDESC:       I/O API formatted CAMx AVRG output                       ...
-    HISTORY:        
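The `TFLAG` variable above follows the I/O API convention: date encoded as `YYYYDDD` (year plus Julian day, as in the file's `CDATE: 2023207`) and time as `HHMMSS`. A small decoding helper (a sketch; `decode_tflag` is not part of the notebook):

```python
from datetime import datetime, timedelta

def decode_tflag(yyyyddd: int, hhmmss: int) -> datetime:
    """Convert an I/O API (YYYYDDD, HHMMSS) pair into a datetime."""
    year, doy = divmod(yyyyddd, 1000)       # e.g. 2023207 -> (2023, 207)
    hh, rem = divmod(hhmmss, 10000)
    mm, ss = divmod(rem, 100)
    return datetime(year, 1, 1) + timedelta(days=doy - 1, hours=hh, minutes=mm, seconds=ss)

print(decode_tflag(2023207, 0))  # 2023-07-26 00:00:00
```

Day 207 of 2023 is July 26, matching `global_date` above.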
-
-
-
-
import matplotlib.pyplot as plt

plt.figure(figsize=(15, 3))
ds_met["PBL_WRF_M"].isel(LAY=0).mean(dim=["ROW", "COL"]).plot()
twin_x = plt.gca().twinx()
# Drop the first 24 hours of the 120-hour AQ run to align it with the 96-hour met file
ds_aq["P25"].isel(LAY=0, TSTEP=range(24, 120)).mean(dim=["ROW", "COL"]).plot(
    ax=twin_x, color="r"
)
# Color the secondary axis red to match the PM2.5 curve
twin_x.set_ylabel("PM2.5", color="r")
twin_x.tick_params(axis="y", colors="r")
twin_x.set_xlabel("Time")
-
Text(0.5, 0, 'Time')
np.array([1, 2, 3])[:]
-
-
array([1, 2, 3])
-
-
-
-
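The scratch cell above touches a NumPy subtlety worth keeping in mind: basic slicing such as `a[:]` returns a *view* of the original array, not a copy, unlike Python lists where `[:]` copies. A quick demonstration:

```python
import numpy as np

# NumPy: a[:] is a view, so mutating the slice mutates the original
a = np.array([1, 2, 3])
b = a[:]
b[0] = 99
print(a)  # [99  2  3]

# Python list: [:] makes a shallow copy, so the original is untouched
lst = [1, 2, 3]
lst2 = lst[:]
lst2[0] = 99
print(lst)  # [1, 2, 3]
```

Use `a.copy()` when an independent array is actually needed.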
import numpy as np
import pandas as pd

# Correlate each surface met variable (skipping TFLAG) with domain-mean PM2.5
# at hourly lags from -24 to +23; columns of `df` are the lag values.
met_vars = list(ds_met.data_vars)[1:]
df = pd.DataFrame(index=met_vars)

for lag in range(-24, 24):
    for var in met_vars:
        met_series = ds_met[var].isel(LAY=0).mean(dim=["ROW", "COL"]).values
        aq_series = (
            ds_aq["P25"]
            .isel(LAY=0, TSTEP=range(24, 120))
            .mean(dim=["ROW", "COL"])
            .values
        )
        # Shift one series against the other according to the sign of the lag
        if lag > 0:
            met_series, aq_series = met_series[lag:], aq_series[:-lag]
        elif lag < 0:
            met_series, aq_series = met_series[:lag], aq_series[-lag:]
        df.loc[var, lag] = np.corrcoef(met_series, aq_series)[0, 1]
-
-
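The RuntimeWarnings emitted below come from `np.corrcoef` dividing by a zero standard deviation when a series is constant (or all-NaN). A guarded helper (a sketch; `safe_corr` is not part of the original code) sidesteps them by masking non-finite pairs and returning NaN for degenerate inputs:

```python
import numpy as np

def safe_corr(x, y):
    """Pearson correlation that returns NaN instead of warning on degenerate input."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    mask = np.isfinite(x) & np.isfinite(y)  # drop NaN/inf pairs
    x, y = x[mask], y[mask]
    if len(x) < 2 or x.std() == 0 or y.std() == 0:
        return np.nan  # constant or too-short series: correlation undefined
    return np.corrcoef(x, y)[0, 1]
```

Swapping `np.corrcoef(...)[0, 1]` for `safe_corr(met_series, aq_series)` in the loop above would silence the warning spam while leaving valid correlations unchanged.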
/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[:, None]
-/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide
-  c /= stddev[None, :]
-
-
-
-
df.index.values
-
-
array(['TSURF_K', 'SNOWEW_M', 'SNOWAGE_HR', 'PRATE_MMpH', 'CLOUD_OD',
-       'U10_MpS', 'V10_MpS', 'T2_K', 'SWSFC_WpM2', 'SOLM_M3pM3',
-       'CLDTOP_KM', 'CAPE', 'PBL_WRF_M', 'PBL_YSU_M'], dtype=object)
-
-
-
-
import matplotlib.pyplot as plt
-from matplotlib.animation import FuncAnimation
-
-plt.figure(figsize=(6, 3))
-for var in ["TSURF_K", "T2_K", "SWSFC_WpM2", "PBL_WRF_M", "PBL_YSU_M"]:
-    df.loc[var].plot(label=var)
-for var in ["CLOUD_OD", "V10_MpS"]:
-    df.loc[var].plot(label=var, linestyle="--")
-
-plt.legend(bbox_to_anchor=(1, 1))
-plt.ylabel("Correlation with P25 from CAMx")
-plt.tight_layout()
-plt.savefig("lag.pdf")
-
-
-
-

-
-
-
-
-
-
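The `invalid value encountered in divide` warnings above come from `np.corrcoef`, which divides each row by its standard deviation; any constant series (zero stddev — plausibly the snow variables over Delhi) yields NaN correlations and the warning. A minimal sketch of filtering such series before correlating (the toy variable names mirror the dataset but the values here are synthetic):

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for the meteorological variables: one column is
# constant (zero stddev), which is what triggers the divide warnings.
rng = np.random.default_rng(0)
met = pd.DataFrame({
    "T2_K": rng.normal(300, 5, 100),
    "SNOWEW_M": np.zeros(100),        # constant -> zero stddev
    "PBL_WRF_M": rng.normal(800, 200, 100),
})

# Keep only non-constant columns, then correlate warning-free.
varying = met.loc[:, met.std() > 0]
corr = np.corrcoef(varying.to_numpy().T)
```

This keeps the correlation matrix free of NaN rows instead of silencing the warning after the fact.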
plt.rcParams["animation.html"] = "jshtml"
-fig, ax = plt.subplots(figsize=(15, 4))
-mappable = (
-    ds_met["PBL_WRF_M"]
-    .isel(TSTEP=1, LAY=0)
-    .plot(
-        x="COL", y="ROW", cmap="RdYlGn_r", ax=ax, vmin=0, vmax=2000, add_colorbar=False
-    )
-)
-fig.colorbar(mappable)
-
-
-def plot_it(t):
-    ax.cla()
-    tmp = ds_met["PBL_WRF_M"].isel(TSTEP=t, LAY=0)
-    tmp.plot(
-        x="COL", y="ROW", cmap="RdYlGn_r", ax=ax, vmin=0, vmax=2000, add_colorbar=False
-    )
-    ax.set_xlabel("Longitude")
-    ax.set_ylabel("Latitude")
-    ax.set_title(f"Mean PBLH: {tmp.mean().values:.2f} m")
-
-
-anim = FuncAnimation(fig, plot_it, frames=range(1, 13), interval=500)
-plt.close()
-anim
-
from aqmsp_data.data import load_caaqm
-
-caaqm = load_caaqm(years=2022, months=1, days=1, variables="PM2.5")
-
-
-
import pandas as pd
-from glob import glob
-
-files = glob("/home/patel_zeel/aqmsp/aqmsp_data/data/caaqm/raw_2023/*.xlsx")
-
-df = pd.read_excel(
-    files[-3],
-    header=None,
-)
-# print(df.head(7))
-station = df.iloc[6, 1]
-lat = caaqm.sel(station=station).latitude.values.item()
-lon = caaqm.sel(station=station).longitude.values.item()
-df = df.iloc[16:, :]
-# assign first row as column names
-df.columns = df.iloc[0].values
-# drop first row
-df = df.iloc[1:, :]
-print(station)
-df["time"] = pd.to_datetime(df["From Date"], format="%d-%m-%Y %H:%M") + pd.Timedelta(
-    minutes=30
-)
-df
-
-
Vivek Vihar, Delhi - DPCC
-
-
-
      From Date         To Date           PM2.5  PM10    AT     BP      RH     time
17    01-01-2023 00:00  01-01-2023 01:00  141    211     11.95  997.83  78.9   2023-01-01 00:30:00
18    01-01-2023 01:00  01-01-2023 02:00  149    210     11.85  997.47  79.1   2023-01-01 01:30:00
19    01-01-2023 02:00  01-01-2023 03:00  141    186     11.3   997.03  80.25  2023-01-01 02:30:00
20    01-01-2023 03:00  01-01-2023 04:00  138    174     10.3   996.5   83.33  2023-01-01 03:30:00
21    01-01-2023 04:00  01-01-2023 05:00  122    161     10.05  996.38  84.08  2023-01-01 04:30:00
...   ...               ...               ...    ...     ...    ...     ...    ...
5077  30-07-2023 20:00  30-07-2023 21:00  14     95.75   30.87  972.4   66.17  2023-07-30 20:30:00
5078  30-07-2023 21:00  30-07-2023 22:00  18.25  114.75  30.6   972.42  69.22  2023-07-30 21:30:00
5079  30-07-2023 22:00  30-07-2023 23:00  20.5   100.5   30.48  972.43  72     2023-07-30 22:30:00
5080  30-07-2023 23:00  31-07-2023 00:00  17.75  103     30.42  972.35  71.92  2023-07-30 23:30:00
5081  31-07-2023 00:00  31-07-2023 00:00  23     124     30.4   972.3   71.8   2023-07-31 00:30:00

5065 rows × 8 columns
-
-
-
-
-
def process_it(ds, date):
-    if ds.TSTEP.size == 120:
-        print("120")
-        ds = ds.isel(LAY=0, TSTEP=range(24, 48))
-    else:
-        print("96")
-        ds = ds.isel(LAY=0, TSTEP=range(24))
-    ds1 = ds.assign(time=("TSTEP", pd.date_range(date, periods=24, freq="H")))
-    lats = np.arange(80) * ds1.YCELL + ds1.YORIG
-    lons = np.arange(80) * ds1.XCELL + ds1.XORIG
-    ds2 = ds1.assign_coords(lat=("ROW", lats), lon=("COL", lons))
-    ds3 = ds2.swap_dims({"TSTEP": "time", "ROW": "lat", "COL": "lon"})
-    ds3["time"] = ds3["time"] + pd.Timedelta(hours=5, minutes=30)
-    return ds3
-
-
-ds_met_processed = process_it(ds_met, global_date)
-ds_met_processed
-
-
96
-
-
-
<xarray.Dataset>
-Dimensions:     (time: 24, VAR: 14, DATE-TIME: 2, lat: 80, lon: 80)
-Coordinates:
-  * time        (time) datetime64[ns] 2023-07-26T05:30:00 ... 2023-07-27T04:3...
-  * lat         (lat) float64 28.2 28.21 28.22 28.23 ... 28.96 28.97 28.98 28.99
-  * lon         (lon) float64 76.85 76.86 76.87 76.88 ... 77.62 77.63 77.64
-Dimensions without coordinates: VAR, DATE-TIME
-Data variables: (12/15)
-    TFLAG       (time, VAR, DATE-TIME) int32 ...
-    TSURF_K     (time, lat, lon) float32 ...
-    SNOWEW_M    (time, lat, lon) float32 ...
-    SNOWAGE_HR  (time, lat, lon) float32 ...
-    PRATE_MMpH  (time, lat, lon) float32 ...
-    CLOUD_OD    (time, lat, lon) float32 ...
-    ...          ...
-    SWSFC_WpM2  (time, lat, lon) float32 ...
-    SOLM_M3pM3  (time, lat, lon) float32 ...
-    CLDTOP_KM   (time, lat, lon) float32 ...
-    CAPE        (time, lat, lon) float32 ...
-    PBL_WRF_M   (time, lat, lon) float32 ...
-    PBL_YSU_M   (time, lat, lon) float32 ...
-Attributes: (12/33)
-    IOAPI_VERSION:  $Id: @(#) ioapi library version 3.0 $                    ...
-    EXEC_ID:        ????????????????                                         ...
-    FTYPE:          1
-    CDATE:          2023207
-    CTIME:          72116
-    WDATE:          2023207
-    ...             ...
-    VGLVLS:         [0. 0.]
-    GDNAM:          ????????????????
-    UPNAM:          CAMx2IOAPI      
-    VAR-LIST:       TSURF_K         SNOWEW_M        SNOWAGE_HR      PRATE_MMp...
-    FILEDESC:       I/O API formatted CAMx AVRG output                       ...
-    HISTORY:        
-
-
-
-
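The lat/lon reconstruction inside `process_it` above is a plain affine mapping: `coordinate = index * cell_size + origin`. A quick check with grid parameters inferred from the printed dataset (80 cells, 0.01° spacing, origin 28.2°N / 76.85°E — these numbers are read off the output above, not confirmed from the file headers):

```python
import numpy as np

# Assumed grid parameters, back-read from the coordinate printout:
# lat runs 28.2 ... 28.99 over 80 cells => YCELL = 0.01, YORIG = 28.2.
YCELL, YORIG = 0.01, 28.2
lats = np.arange(80) * YCELL + YORIG

# The last cell center should land at 28.2 + 79 * 0.01 = 28.99.
assert abs(lats[-1] - 28.99) < 1e-9
```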
closest_ds = ds_met_processed.sel(lat=lat, lon=lon, method="nearest")
-fig, ax = plt.subplots(figsize=(15, 4))
-
-print(lat, lon)
-closest_ds["TSURF_K"].plot(ax=ax, label="TSURF_K")
-closest_ds["T2_K"].plot(ax=ax, label="T2_K")
-tmp_df = df.set_index("time")
-# select data from global_data
-tmp_df = tmp_df.loc[closest_ds["time"].values[0] : closest_ds["time"].values[-1]]
-tmp_df["AT_K"] = tmp_df["AT"] + 273.15
-tmp_df.plot(ax=ax, y="AT_K")
-plt.legend()
-# tmp_df
-
-
28.672342 77.31526
-
-
-
-
-

-
-
-
-
- - -
- -
- -
- - - - - \ No newline at end of file diff --git a/posts/air-quality-google-_files/figure-html/cell-11-output-2.png b/posts/air-quality-google-_files/figure-html/cell-11-output-2.png deleted file mode 100644 index 2af28f8..0000000 Binary files a/posts/air-quality-google-_files/figure-html/cell-11-output-2.png and /dev/null differ diff --git a/posts/air-quality-google-_files/figure-html/cell-14-output-2.png b/posts/air-quality-google-_files/figure-html/cell-14-output-2.png deleted file mode 100644 index 537f261..0000000 Binary files a/posts/air-quality-google-_files/figure-html/cell-14-output-2.png and /dev/null differ diff --git a/posts/air-quality-google-_files/figure-html/cell-19-output-1.png b/posts/air-quality-google-_files/figure-html/cell-19-output-1.png deleted file mode 100644 index 74774f7..0000000 Binary files a/posts/air-quality-google-_files/figure-html/cell-19-output-1.png and /dev/null differ diff --git a/posts/air-quality-google-_files/figure-html/cell-24-output-2.png b/posts/air-quality-google-_files/figure-html/cell-24-output-2.png deleted file mode 100644 index 1d6afc2..0000000 Binary files a/posts/air-quality-google-_files/figure-html/cell-24-output-2.png and /dev/null differ diff --git a/posts/air-quality-google-_files/figure-html/cell-8-output-1.png b/posts/air-quality-google-_files/figure-html/cell-8-output-1.png deleted file mode 100644 index bf047a1..0000000 Binary files a/posts/air-quality-google-_files/figure-html/cell-8-output-1.png and /dev/null differ diff --git a/posts/bayesian-gaussian-basis-regression.html b/posts/bayesian-gaussian-basis-regression.html index 4fccbee..2ca2892 100644 --- a/posts/bayesian-gaussian-basis-regression.html +++ b/posts/bayesian-gaussian-basis-regression.html @@ -2,7 +2,7 @@ - + @@ -72,10 +72,10 @@ - + - + - + - + - + - + - + - + - - - - - - - - - - - - - - - - - - - - - - - - - - -
-
- -
- -
-
-
-

Foundation Models for Time Series Forecasting

-
-
- Exploring foundation models for time series forecasting -
-
-
-
ML
-
-
-
- - -
- -
-
Author
-
-

Zeel B Patel

-
-
- -
-
Published
-
-

July 6, 2024

-
-
- - -
- - -
- - - - -
- - - - - -
-
# Config
-import os
-
-# Basic
-import numpy as np
-import pandas as pd
-import matplotlib.pyplot as plt
-
-# Monitoring
-from tqdm.notebook import tqdm
-
-# IO
-from os.path import join, exists, basename, dirname
-from glob import glob
-
-# Parallel processing
-from joblib import Parallel, delayed
-
-import xarray as xr
-
-
-

Data

-
-
ds = xr.open_zarr("zip:///::https://huggingface.co/datasets/Zeel/P1/resolve/main/all_in_one.zarr.zip")
-ds
-
-
<xarray.Dataset> Size: 25GB
-Dimensions:      (Timestamp: 245376, station: 537)
-Coordinates:
-  * Timestamp    (Timestamp) datetime64[ns] 2MB 2017-01-01 ... 2023-12-31T23:...
-    address      (station) <U187 402kB ...
-    city         (station) <U18 39kB ...
-    latitude     (station) float64 4kB ...
-    longitude    (station) float64 4kB ...
-    state        (station) <U17 37kB ...
-  * station      (station) <U64 137kB '32Bungalows, Bhilai - CECB' ... 'Ward-...
-Data variables: (12/24)
-    AT           (Timestamp, station) float64 1GB ...
-    BP           (Timestamp, station) float64 1GB ...
-    Benzene      (Timestamp, station) float64 1GB ...
-    CO           (Timestamp, station) float64 1GB ...
-    Eth-Benzene  (Timestamp, station) float64 1GB ...
-    MP-Xylene    (Timestamp, station) float64 1GB ...
-    ...           ...
-    TOT-RF       (Timestamp, station) float64 1GB ...
-    Toluene      (Timestamp, station) float64 1GB ...
-    VWS          (Timestamp, station) float64 1GB ...
-    WD           (Timestamp, station) float64 1GB ...
-    WS           (Timestamp, station) float64 1GB ...
-    Xylene       (Timestamp, station) float64 1GB ...
-
-
-
-
one_station_ds = ds.sel(station="IGI Airport (T3), Delhi - IMD", Timestamp=slice("2022", "2023"))[["PM2.5"]]
-one_station_ds
-
-
<xarray.Dataset> Size: 1MB
-Dimensions:    (Timestamp: 70080)
-Coordinates:
-  * Timestamp  (Timestamp) datetime64[ns] 561kB 2022-01-01 ... 2023-12-31T23:...
-    address    <U187 748B ...
-    city       <U18 72B ...
-    latitude   float64 8B ...
-    longitude  float64 8B ...
-    state      <U17 68B ...
-    station    <U64 256B 'IGI Airport (T3), Delhi - IMD'
-Data variables:
-    PM2.5      (Timestamp) float64 561kB ...
-
-
-
-
data = one_station_ds['PM2.5'].to_dataframe()[['PM2.5']]
-
-# convert to hourly data
-data = data.resample('h').mean()
-
-# how much missing data
-print(f"Missing data: {data.isna().sum().values[0]}")
-
-# fill missing data
-data = data.interpolate(method='linear')
-
-print(f"Missing data after interpolation: {data.isna().sum().values[0]}")
-
-data.head()
-
-
Missing data: 298
-Missing data after interpolation: 0
-
-
-
                        PM2.5
Timestamp
2022-01-01 00:00:00  273.5475
2022-01-01 01:00:00  268.8675
2022-01-01 02:00:00  258.0225
2022-01-01 03:00:00  194.9100
2022-01-01 04:00:00  197.9975
-
-
-
-
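The gap-filling step above (regularize to an hourly grid with `resample('h').mean()`, then fill holes with linear interpolation) can be sketched on a toy series:

```python
import numpy as np
import pandas as pd

# Toy hourly PM2.5 series with two missing readings.
idx = pd.date_range("2022-01-01", periods=6, freq="h")
s = pd.Series([10.0, np.nan, np.nan, 16.0, 18.0, 20.0], index=idx)

# Bin to an hourly grid, then fill gaps linearly between neighbors:
# the two NaNs between 10 and 16 become 12 and 14.
filled = s.resample("h").mean().interpolate(method="linear")
```

Note that linear interpolation is a crude choice for long gaps in air-quality data; it is used here only to give the model a complete context window.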
import timesfm
-
-tfm = timesfm.TimesFm(
-    context_len=32,
-    horizon_len=24,
-    input_patch_len=32,
-    output_patch_len=128,
-    num_layers=20,
-    model_dims=1280,
-    backend="gpu",
-)
-tfm.load_from_checkpoint(repo_id="google/timesfm-1.0-200m")
-
-
Multiprocessing context has already been set.
-Constructing model weights.
-
-
-
WARNING:absl:No registered CheckpointArgs found for handler type: <class 'paxml.checkpoints.FlaxCheckpointHandler'>
-WARNING:absl:Configured `CheckpointManager` using deprecated legacy API. Please follow the instructions at https://orbax.readthedocs.io/en/latest/api_refactor.html to migrate by May 1st, 2024.
-WARNING:absl:train_state_unpadded_shape_dtype_struct is not provided. We assume `train_state` is unpadded.
-
-
-
Constructed model weights in 3.76 seconds.
-Restoring checkpoint from /home/patel_zeel/.cache/huggingface/hub/models--google--timesfm-1.0-200m/snapshots/8775f7531211ac864b739fe776b0b255c277e2be/checkpoints.
-
-
-
-
---------------------------------------------------------------------------
-MemoryError                               Traceback (most recent call last)
-Cell In[6], line 12
-      1 import timesfm
-      3 tfm = timesfm.TimesFm(
-      4     context_len=32,
-      5     horizon_len=24,
-   (...)
-     10     backend="gpu",
-     11 )
----> 12 tfm.load_from_checkpoint(repo_id="google/timesfm-1.0-200m")
-
-File ~/timesfm/src/timesfm.py:270, in TimesFm.load_from_checkpoint(self, checkpoint_path, repo_id, checkpoint_type, step)
-    268 self._logging(f"Restoring checkpoint from {checkpoint_path}.")
-    269 start_time = time.time()
---> 270 self._train_state = checkpoints.restore_checkpoint(
-    271     train_state_local_shapes,
-    272     checkpoint_dir=checkpoint_path,
-    273     checkpoint_type=checkpoint_type,
-    274     state_specs=train_state_partition_specs,
-    275     step=step,
-    276 )
-    277 self._logging(
-    278     f"Restored checkpoint in {time.time() - start_time:.2f} seconds."
-    279 )
-    281 # Initialize and jit the decode fn.
-
-File /opt/anaconda3/envs/tfm_env/lib/python3.10/site-packages/paxml/checkpoints.py:246, in restore_checkpoint(state_global_shapes, checkpoint_dir, global_mesh, checkpoint_type, state_specs, step, enforce_restore_shape_check, state_unpadded_shape_dtype_struct, tensorstore_use_ocdbt, restore_transformations)
-    240 if checkpoint_type == CheckpointType.GDA:
-    241   restore_args = {
-    242       'specs': state_specs,
-    243       'mesh': global_mesh,
-    244       'transforms': restore_transformations,
-    245   }
---> 246 output = checkpoint_manager.restore(
-    247     step,
-    248     state_global_shapes,
-    249     state_unpadded_shape_dtype_struct,
-    250     restore_kwargs=restore_args,
-    251 )
-    252 # Note: `aux_items` argument wasn't passed to checkpoint_manager.restore()
-    253 # so this returns a TrainState instance.
-    254 return cast(train_states.TrainState, output)
-
-File /opt/anaconda3/envs/tfm_env/lib/python3.10/site-packages/paxml/checkpoint_managers.py:568, in OrbaxCheckpointManager.restore(self, step, train_state, train_state_unpadded_shape_dtype_struct, train_input_pipeline, restore_kwargs, aux_items, aux_restore_kwargs)
-    565 if train_input_pipeline and self._train_checkpoint_exists(step):
-    566   items[INPUT_ITEM_NAME] = train_input_pipeline
---> 568 restored = self._manager.restore(
-    569     step, items=items, restore_kwargs=restore_kwargs
-    570 )
-    572 # Skip metadata checks if using transformations, since the TrainState may be
-    573 # completely altered.
-    574 if self.version > 1.0 and not uses_transformations:
-    575   # If unpadded shapes were not provided, skip the shape check for now, as
-    576   # there are many callers that need to be changed.
-
-File /opt/anaconda3/envs/tfm_env/lib/python3.10/site-packages/orbax/checkpoint/checkpoint_manager.py:1055, in CheckpointManager.restore(self, step, items, restore_kwargs, directory, args)
-   1052     args = typing.cast(args_lib.Composite, args)
-   1054 restore_directory = self._get_read_step_directory(step, directory)
--> 1055 restored = self._checkpointer.restore(restore_directory, args=args)
-   1056 if self._single_item:
-   1057   return restored[DEFAULT_ITEM_NAME]
-
-File /opt/anaconda3/envs/tfm_env/lib/python3.10/site-packages/orbax/checkpoint/checkpointer.py:170, in Checkpointer.restore(self, directory, *args, **kwargs)
-    168 logging.info('Restoring item from %s.', directory)
-    169 ckpt_args = construct_checkpoint_args(self._handler, False, *args, **kwargs)
---> 170 restored = self._handler.restore(directory, args=ckpt_args)
-    171 logging.info('Finished restoring checkpoint from %s.', directory)
-    172 utils.sync_global_processes('Checkpointer:restore', self._active_processes)
-
-File /opt/anaconda3/envs/tfm_env/lib/python3.10/site-packages/orbax/checkpoint/composite_checkpoint_handler.py:470, in CompositeCheckpointHandler.restore(self, directory, args)
-    468     continue
-    469   handler = self._get_or_set_handler(item_name, arg)
---> 470   restored[item_name] = handler.restore(
-    471       self._get_item_directory(directory, item_name), args=arg
-    472   )
-    473 return CompositeResults(**restored)
-
-File /opt/anaconda3/envs/tfm_env/lib/python3.10/site-packages/orbax/checkpoint/composite_checkpoint_handler.py:138, in _AsyncLegacyCheckpointHandlerWrapper.restore(self, directory, args)
-    137 def restore(self, directory: epath.Path, args: '_AsyncWrapperArgs'):
---> 138   return self._handler.restore(directory, *args.args, **args.kwargs)
-
-File /opt/anaconda3/envs/tfm_env/lib/python3.10/site-packages/paxml/checkpoints.py:685, in FlaxCheckpointHandler.restore(self, directory, item, restore_args, transforms, transforms_default_to_original, version)
-    680 str_pytree_state = str(pytree_state)
-    681 input_target = {
-    682     'flattened_state': flattened_state,
-    683     'str_pytree_state': str_pytree_state,
-    684 }
---> 685 restored_target = super().restore(directory, input_target)
-    686 # Flax restore_checkpoint returned input_target unchanged if
-    687 # no step specified and no checkpoint files present.
-    688 if restored_target is input_target:
-
-File /opt/anaconda3/envs/tfm_env/lib/python3.10/site-packages/orbax/checkpoint/pytree_checkpoint_handler.py:1089, in PyTreeCheckpointHandler.restore(self, directory, item, restore_args, transforms, transforms_default_to_original, legacy_transform_fn, args)
-   1085   raise FileNotFoundError(
-   1086       f'Requested directory for restore does not exist at {directory}'
-   1087   )
-   1088 byte_limiter = get_byte_limiter(self._concurrent_gb)
--> 1089 structure, use_zarr3_metadata = self._get_internal_metadata(directory)
-   1090 # `checkpoint_restore_args` has a structure relative to the checkpoint,
-   1091 # while `restore_args` remains structured relative to the output.
-   1092 param_infos, checkpoint_restore_args = _get_restore_parameters(
-   1093     directory,
-   1094     item,
-   (...)
-   1102     else self._use_zarr3,
-   1103 )
-
-File /opt/anaconda3/envs/tfm_env/lib/python3.10/site-packages/orbax/checkpoint/pytree_checkpoint_handler.py:1312, in PyTreeCheckpointHandler._get_internal_metadata(self, directory)
-   1296 def _get_internal_metadata(
-   1297     self, directory: epath.Path
-   1298 ) -> Tuple[PyTree, Optional[bool]]:
-   1299   """Gets limited information needed to fully restore the checkpoint.
-   1300 
-   1301   This information just consists of the restore type for each leaf, as well
-   (...)
-   1310     checkpoint.
-   1311   """
--> 1312   aggregate_tree = self._read_aggregate_file(directory)
-   1313   flat_aggregate = utils.to_flat_dict(aggregate_tree, keep_empty_nodes=True)
-   1314   try:
-
-File /opt/anaconda3/envs/tfm_env/lib/python3.10/site-packages/orbax/checkpoint/pytree_checkpoint_handler.py:1172, in PyTreeCheckpointHandler._read_aggregate_file(self, directory)
-   1170 checkpoint_path = directory / self._aggregate_filename
-   1171 if checkpoint_path.exists():
--> 1172   return self._aggregate_handler.deserialize(checkpoint_path)
-   1173 elif self._use_ocdbt:
-   1174   raise FileNotFoundError(
-   1175       f'Checkpoint structure file does not exist at {directory}.'
-   1176   )
-
-File /opt/anaconda3/envs/tfm_env/lib/python3.10/site-packages/orbax/checkpoint/aggregate_handlers.py:86, in MsgpackHandler.deserialize(self, path)
-     84 """See superclass documentation."""
-     85 if path.exists():
----> 86   msgpack = path.read_bytes()
-     87   return msgpack_utils.msgpack_restore(msgpack)
-     88 else:
-
-File /opt/anaconda3/envs/tfm_env/lib/python3.10/site-packages/etils/epath/abstract_path.py:152, in Path.read_bytes(self)
-    150 """Reads contents of self as bytes."""
-    151 with self.open('rb') as f:
---> 152   return f.read()
-
-MemoryError: 
-
-
-
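While the checkpoint restore is being debugged, a seasonal-naive baseline is a useful stand-in for the `horizon_len=24` setup above: repeat the most recent 24-hour cycle and require any learned model to beat it. This is a generic sketch, not part of the TimesFM API:

```python
import numpy as np

def seasonal_naive(context: np.ndarray, horizon: int, season: int = 24) -> np.ndarray:
    """Forecast by repeating the last full seasonal cycle of the context."""
    last_cycle = context[-season:]
    reps = int(np.ceil(horizon / season))
    return np.tile(last_cycle, reps)[:horizon]

y = np.arange(48, dtype=float)      # toy hourly series, two full days
pred = seasonal_naive(y, horizon=24)
```

For hourly PM2.5, with its strong diurnal cycle, this baseline is surprisingly hard to beat at short horizons.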
- - -
- -
- -
- - - - - \ No newline at end of file diff --git a/posts/fundamentals_across_domains.html b/posts/fundamentals_across_domains.html index 3e13d0e..a73ba8c 100644 --- a/posts/fundamentals_across_domains.html +++ b/posts/fundamentals_across_domains.html @@ -2,7 +2,7 @@ - + @@ -38,10 +38,10 @@ - + - + - + - + - + - + - - - - - - - - - - - - - - - - - - - - - - - -
-
- -
- -
-
-
-

Learnings from the Brick Kiln Project

-
-
- Learnings from the Brick Kiln Project -
-
-
-
ML
-
-
-
- - -
- -
-
Author
-
-

Zeel B Patel

-
-
- -
-
Published
-
-

November 28, 2023

-
-
- - -
- - -
- - - - -
- - - - - -
-

Points

-
-

Labeling

-
    -
  • Labeling is the most important and most effort-intensive part of the project. It also becomes the most confusing part if the criteria are not fixed before labeling the images. For example, for brick kiln images we had to settle on this rule: “If the brick kiln firing chamber is visible, fully or partially, at a level where a human would be able to identify it as a brick kiln, we mark it as a brick kiln”.
  • To ensure good label quality, have a small number of images labeled by multiple people and then compare the labels. This helps in identifying mistakes in the labeling process and in improving the labeling instructions.
- - -
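The double-labeling suggestion above can be quantified with an agreement score such as Cohen's kappa over the two annotators' binary kiln/no-kiln labels. A pure-numpy sketch (in practice `sklearn.metrics.cohen_kappa_score` computes the same quantity); the label arrays are made up for illustration:

```python
import numpy as np

def cohens_kappa(a: np.ndarray, b: np.ndarray) -> float:
    """Agreement between two label vectors, corrected for chance."""
    po = np.mean(a == b)                        # observed agreement
    pe = sum(np.mean(a == k) * np.mean(b == k)  # chance agreement
             for k in np.unique(np.concatenate([a, b])))
    return (po - pe) / (1 - pe)

ann1 = np.array([1, 1, 0, 0, 1, 0, 1, 0])  # hypothetical annotator 1
ann2 = np.array([1, 1, 0, 0, 1, 0, 0, 0])  # hypothetical annotator 2
kappa = cohens_kappa(ann1, ann2)
```

A low kappa on the shared subset is the signal to tighten the labeling instructions before scaling up.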
-
- -
- -
- - - - - \ No newline at end of file diff --git a/posts/non-gaussian-likelihood-mlps.html b/posts/non-gaussian-likelihood-mlps.html index 6d2eb7f..226786e 100644 --- a/posts/non-gaussian-likelihood-mlps.html +++ b/posts/non-gaussian-likelihood-mlps.html @@ -2,7 +2,7 @@ - + @@ -72,10 +72,10 @@ - + - + - + - + - + - + - + - + - + - + - - - - - - - - - - - - - - - - - - - - - - - -
-
- -
- -
-
-
-

Seq to Seq

-
-
- Rationale-driven history of Seq to Seq models -
-
-
-
ML
-
-
-
- - -
- -
-
Author
-
-

Zeel B Patel

-
-
- -
-
Published
-
-

June 24, 2024

-
-
- - -
- - -
- - - - -
- - - - - - ------ - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
| Model | Main Disadvantage | Solved by | How? |
|---|---|---|---|
| NN | Can't handle dynamic-length input | RNN | RNN can handle dynamic-length input |
| RNN | Vanishing gradient problem | LSTM | LSTM can handle the vanishing gradient problem |
| LSTM | Non-parallelizable | Transformer | Transformer can parallelize the computation |
| Transformer | Loses sequentiality | Transformer | Positional encoding |
- - - -
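The fix in the last table row can be made concrete: sinusoidal positional encoding injects order information that the parallel Transformer would otherwise lose. A minimal sketch following the standard sin/cos formulation:

```python
import numpy as np

def positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Sinusoidal positional encoding: even dims get sin, odd dims get cos."""
    pos = np.arange(seq_len)[:, None]                 # (seq_len, 1)
    i = np.arange(d_model // 2)[None, :]              # (1, d_model // 2)
    angles = pos / (10000 ** (2 * i / d_model))
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)
    pe[:, 1::2] = np.cos(angles)
    return pe

pe = positional_encoding(seq_len=10, d_model=16)
```

Each position gets a distinct vector, and nearby positions get similar ones, so attention can recover relative order even though the computation is fully parallel.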
- -
- - - - - \ No newline at end of file diff --git a/posts/ssh-macos.html b/posts/ssh-macos.html index 474e191..5b00dcb 100644 --- a/posts/ssh-macos.html +++ b/posts/ssh-macos.html @@ -2,7 +2,7 @@ - + @@ -72,10 +72,10 @@ - + - + - + - + - - - - - - - - - - - - - - - - - - - - - - - -
-
- -
- -
-
-
-

WRF Tutorial

-
-
- WRF end-to-end tutorial -
-
-
-
ML
-
-
-
- - -
- -
-
Author
-
-

Zeel B Patel

-
-
- -
-
Published
-
-

March 4, 2024

-
-
- - -
- - -
- - - - -
- - - - - -
-

What is what?

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
| Acronym | Full Form |
|---|---|
| WRF | Weather Research and Forecasting model |
| WPS | WRF Preprocessing System |
| WRF-ARW | Advanced Research WRF |
| WRF-Hydro | WRF Hydrological model |
| WRF-Chem | WRF Chemical model |
| WRFDA | WRF Data Assimilation |
-
-
-

Explanation in simple words

-

Will add.

-
-
-

System information

-
-
!lsb_release -a
-
-
No LSB modules are available.
-Distributor ID: Ubuntu
-Description:    Ubuntu 20.04.6 LTS
-Release:    20.04
-Codename:   focal
-
-
-
-
# processor info
-!cat /proc/cpuinfo
-
-
processor   : 0
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 2001.451
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 0
-cpu cores   : 32
-apicid      : 0
-initial apicid  : 0
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 4
-cpu cores   : 32
-apicid      : 4
-initial apicid  : 4
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 5
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 5
-cpu cores   : 32
-apicid      : 5
-initial apicid  : 5
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 6
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 6
-cpu cores   : 32
-apicid      : 6
-initial apicid  : 6
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 7
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 7
-cpu cores   : 32
-apicid      : 7
-initial apicid  : 7
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 8
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 8
-cpu cores   : 32
-apicid      : 8
-initial apicid  : 8
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 9
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 9
-cpu cores   : 32
-apicid      : 9
-initial apicid  : 9
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 10
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 10
-cpu cores   : 32
-apicid      : 10
-initial apicid  : 10
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 11
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 11
-cpu cores   : 32
-apicid      : 11
-initial apicid  : 11
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 12
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 2000.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 12
-cpu cores   : 32
-apicid      : 12
-initial apicid  : 12
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 13
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 13
-cpu cores   : 32
-apicid      : 13
-initial apicid  : 13
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 14
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 14
-cpu cores   : 32
-apicid      : 14
-initial apicid  : 14
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 15
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 15
-cpu cores   : 32
-apicid      : 15
-initial apicid  : 15
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 16
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 16
-cpu cores   : 32
-apicid      : 16
-initial apicid  : 16
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 17
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 17
-cpu cores   : 32
-apicid      : 17
-initial apicid  : 17
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 18
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 18
-cpu cores   : 32
-apicid      : 18
-initial apicid  : 18
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 19
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 19
-cpu cores   : 32
-apicid      : 19
-initial apicid  : 19
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 20
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 20
-cpu cores   : 32
-apicid      : 20
-initial apicid  : 20
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 21
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 21
-cpu cores   : 32
-apicid      : 21
-initial apicid  : 21
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 22
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 22
-cpu cores   : 32
-apicid      : 22
-initial apicid  : 22
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 23
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 23
-cpu cores   : 32
-apicid      : 23
-initial apicid  : 23
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 24
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 24
-cpu cores   : 32
-apicid      : 24
-initial apicid  : 24
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 25
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 25
-cpu cores   : 32
-apicid      : 25
-initial apicid  : 25
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 26
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 26
-cpu cores   : 32
-apicid      : 26
-initial apicid  : 26
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 27
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 27
-cpu cores   : 32
-apicid      : 27
-initial apicid  : 27
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 28
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 28
-cpu cores   : 32
-apicid      : 28
-initial apicid  : 28
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 29
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 29
-cpu cores   : 32
-apicid      : 29
-initial apicid  : 29
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 30
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 30
-cpu cores   : 32
-apicid      : 30
-initial apicid  : 30
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 31
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 0
-siblings    : 32
-core id     : 31
-cpu cores   : 32
-apicid      : 31
-initial apicid  : 31
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4699.82
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 32
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 0
-cpu cores   : 32
-apicid      : 64
-initial apicid  : 64
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 33
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 1
-cpu cores   : 32
-apicid      : 65
-initial apicid  : 65
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 34
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 2
-cpu cores   : 32
-apicid      : 66
-initial apicid  : 66
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 35
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 3
-cpu cores   : 32
-apicid      : 67
-initial apicid  : 67
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 36
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 4
-cpu cores   : 32
-apicid      : 68
-initial apicid  : 68
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 37
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 5
-cpu cores   : 32
-apicid      : 69
-initial apicid  : 69
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 38
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 6
-cpu cores   : 32
-apicid      : 70
-initial apicid  : 70
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 39
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 7
-cpu cores   : 32
-apicid      : 71
-initial apicid  : 71
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 40
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 8
-cpu cores   : 32
-apicid      : 72
-initial apicid  : 72
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 41
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 9
-cpu cores   : 32
-apicid      : 73
-initial apicid  : 73
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 42
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 10
-cpu cores   : 32
-apicid      : 74
-initial apicid  : 74
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 43
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 11
-cpu cores   : 32
-apicid      : 75
-initial apicid  : 75
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 44
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 12
-cpu cores   : 32
-apicid      : 76
-initial apicid  : 76
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 45
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 13
-cpu cores   : 32
-apicid      : 77
-initial apicid  : 77
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 46
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 14
-cpu cores   : 32
-apicid      : 78
-initial apicid  : 78
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 47
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 15
-cpu cores   : 32
-apicid      : 79
-initial apicid  : 79
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 48
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 2900.378
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 16
-cpu cores   : 32
-apicid      : 80
-initial apicid  : 80
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 49
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1549.380
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 17
-cpu cores   : 32
-apicid      : 81
-initial apicid  : 81
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 50
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 18
-cpu cores   : 32
-apicid      : 82
-initial apicid  : 82
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 51
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 19
-cpu cores   : 32
-apicid      : 83
-initial apicid  : 83
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 52
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 20
-cpu cores   : 32
-apicid      : 84
-initial apicid  : 84
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 53
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 21
-cpu cores   : 32
-apicid      : 85
-initial apicid  : 85
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 54
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 22
-cpu cores   : 32
-apicid      : 86
-initial apicid  : 86
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 55
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 23
-cpu cores   : 32
-apicid      : 87
-initial apicid  : 87
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 56
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 24
-cpu cores   : 32
-apicid      : 88
-initial apicid  : 88
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 57
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 25
-cpu cores   : 32
-apicid      : 89
-initial apicid  : 89
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 58
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 26
-cpu cores   : 32
-apicid      : 90
-initial apicid  : 90
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 59
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 2350.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 27
-cpu cores   : 32
-apicid      : 91
-initial apicid  : 91
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 60
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 28
-cpu cores   : 32
-apicid      : 92
-initial apicid  : 92
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-
-processor   : 61
-vendor_id   : AuthenticAMD
-cpu family  : 23
-model       : 49
-model name  : AMD EPYC 7452 32-Core Processor
-stepping    : 0
-microcode   : 0x830107a
-cpu MHz     : 1500.000
-cache size  : 512 KB
-physical id : 1
-siblings    : 32
-core id     : 29
-cpu cores   : 32
-apicid      : 93
-initial apicid  : 93
-fpu     : yes
-fpu_exception   : yes
-cpuid level : 16
-wp      : yes
-flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es
-bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso
-bogomips    : 4673.53
-TLB size    : 3072 4K pages
-clflush size    : 64
-cache_alignment : 64
-address sizes   : 43 bits physical, 48 bits virtual
-power management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]
-

WRF-WPS Installation

  • Create a new directory for all WRF codes and put everything into it.
  • Clone the repo recursively (includes WRF, WRFDA & WRF-Chem):

    git clone --recurse-submodules https://github.com/wrf-model/WRF

  • Clone the repo (includes WPS):

    git clone https://github.com/wrf-model/WPS

  • Follow the instructions given at https://www2.mmm.ucar.edu/wrf/OnLineTutorial/compilation_tutorial.php
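The steps above can be condensed into a short shell sketch. The Build_WRF directory name is an assumption (the tutorial lets you pick any name); the clone URLs are the ones given above.

```shell
# Keep all WRF-related checkouts under one directory
# ("Build_WRF" is an assumed name, not mandated by the tutorial).
mkdir -p Build_WRF

# WRF with its submodules (WRFDA & WRF-Chem are pulled in by --recurse-submodules)
git clone --recurse-submodules https://github.com/wrf-model/WRF Build_WRF/WRF

# WPS lives in a separate repository
git clone https://github.com/wrf-model/WPS Build_WRF/WPS
```

From here, continue with the configure and compile steps from the UCAR compilation tutorial linked above.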
- - - - - \ No newline at end of file diff --git a/search.json b/search.json index 92e4e2c..644b162 100644 --- a/search.json +++ b/search.json @@ -1,360 +1,276 @@ [ { - "objectID": "posts/pruning_vs_uncertainty.html", - "href": "posts/pruning_vs_uncertainty.html", - "title": "Pruning vs Uncertainty", + "objectID": "lab/scratchpad.html", + "href": "lab/scratchpad.html", + "title": "Predict", "section": "", - "text": "import os\nos.environ[\"CUDA_VISIBLE_DEVICES\"] = \"3\"\n\nimport numpy as np\n\nimport torch\nimport torch.nn as nn\n\n# import pruning library\nimport torch.nn.utils.prune as prune\n\n# import torchvision\nimport torchvision\nfrom torchvision import transforms, datasets\nfrom torch.utils.data import DataLoader, TensorDataset\n\nfrom tqdm import tqdm\n\nfrom sklearn.calibration import calibration_curve\nfrom sklearn.metrics import classification_report\nimport matplotlib.pyplot as plt\n\ntry:\n from laplace import Laplace\nexcept ModuleNotFoundError:\n %pip install laplace-torch\n from laplace import Laplace\n\n<frozen importlib._bootstrap>:228: RuntimeWarning: scipy._lib.messagestream.MessageStream size changed, may indicate binary incompatibility. 
Expected 56 from C header, got 64 from PyObject" + "text": "import requests\nfrom io import BytesIO\nfrom PIL import Image\nimport numpy as np\nimport supervision as sv\nfrom inference.models.utils import get_roboflow_model\n\n# Create a dummy dataset\ndata = requests.get(\"https://raw.githubusercontent.com/jigsawpieces/dog-api-images/main/pitbull/dog-3981033_1280.jpg\")\nimage = Image.open(BytesIO(data.content)).reduce(5)\nlabel = np.random.rand(1, 5) / 10 + 0.5\nlabel[:, 0] = 0\n!mkdir -p /tmp/dummy_dataset/images\n!mkdir -p /tmp/dummy_dataset/labels\nimage.save(\"/tmp/dummy_dataset/images/0.jpg\")\nnp.savetxt(\"/tmp/dummy_dataset/labels/0.txt\", label, fmt=\"%d %f %f %f %f\")\nwith open(\"/tmp/dummy_dataset/dataset.yml\", \"w\") as f:\n f.write(\"\"\"train: _\nval: _\ntest: _\nnc: 1\nnames: [\"dummy\"]\"\"\")\n\n# Load as supervision dataset\ndataset = sv.DetectionDataset.from_yolo(\"/tmp/dummy_dataset/images\", \"/tmp/dummy_dataset/labels\", \"/tmp/dummy_dataset/dataset.yml\")\n\n# Visualize the first instance\nimage_path, image, detection = dataset[0]\nbox_annotator = sv.BoxAnnotator()\nlabel_annotator = sv.LabelAnnotator()\nannotated_image = box_annotator.annotate(image.copy(), detection)\nannotated_image = label_annotator.annotate(annotated_image, detection)\ndisplay(Image.fromarray(annotated_image))\n\n# Visualize the prediction\nmodel = get_roboflow_model(\"yolov8s-640\")\nprediction = model.infer(image)[0]\ndetection = sv.Detections.from_inference(prediction)\nannotated_image = box_annotator.annotate(image.copy(), detection)\nannotated_image = label_annotator.annotate(annotated_image, detection)\ndisplay(Image.fromarray(annotated_image))\n\n# Modified the detection to display class name\n# image_path, image, detection = dataset[0]\n# detection.data = {\"class_name\": np.array(['dummy'])}\n# box_annotator = sv.BoxAnnotator()\n# label_annotator = sv.LabelAnnotator()\n# annotated_image = box_annotator.annotate(image.copy(), detection)\n# annotated_image = 
label_annotator.annotate(annotated_image, detection)\n# display(Image.fromarray(annotated_image))\n\n[01/20/25 22:15:20] WARNING Your inference package version 0.33.0 is out of date! Please upgrade to __init__.py:41\n version 0.34.0 of inference for the latest features and bug fixes by \n running `pip install --upgrade inference`. \n\n\n\n\n\n\n\n\n\n\nUserWarning: Specified provider 'OpenVINOExecutionProvider' is not in available provider names.Available providers: 'TensorrtExecutionProvider, CUDAExecutionProvider, CPUExecutionProvider'\nUserWarning: Specified provider 'CoreMLExecutionProvider' is not in available provider names.Available providers: 'TensorrtExecutionProvider, CUDAExecutionProvider, CPUExecutionProvider'\n2025-01-20 22:15:22.319412181 [E:onnxruntime:Default, provider_bridge_ort.cc:1862 TryGetProviderInfo_CUDA] /onnxruntime_src/onnxruntime/core/session/provider_bridge_ort.cc:1539 onnxruntime::Provider& onnxruntime::ProviderLibrary::Get() [ONNXRuntimeError] : 1 : FAIL : Failed to load library libonnxruntime_providers_cuda.so with error: libcudnn_adv.so.9: cannot open shared object file: No such file or directory\n\n2025-01-20 22:15:22.319453861 [W:onnxruntime:Default, onnxruntime_pybind_state.cc:993 CreateExecutionProviderInstance] Failed to create CUDAExecutionProvider. Require cuDNN 9.* and CUDA 12.*. 
Please install all dependencies as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.\nimport os\nimport numpy as np\nimport supervision as sv\nfrom ultralytics import YOLO\n\n# Download dataset\nif not os.path.exists(\"/tmp/rf_animals\"):\n !wget https://universe.roboflow.com/ds/1LLwpXz2td?key=8JnJML5YF6 -O /tmp/rf_animals.zip\n !unzip /tmp/dataset.zip -d /tmp/rf_animals\n\n# Load dataset\ndataset = sv.DetectionDataset.from_yolo(\"/tmp/rf_animals/train/images\", \"/tmp/rf_animals/train/labels\", \"/tmp/rf_animals/data.yaml\")\n\n# Inference\nmodel = YOLO(\"yolov8s\")\ntargets, detections = [], []\nfor image_path, image, target in dataset:\n targets.append(target)\n \n prediction = model(image, verbose=False)[0]\n detection = sv.Detections.from_ultralytics(prediction)\n detection = detection[np.isin(detection['class_name'], dataset.classes)]\n detection.class_id = np.array([dataset.classes.index(class_name) for class_name in detection['class_name']])\n detections.append(detection)\n \n# Method #1\nmAP = sv.metrics.MeanAveragePrecision().update(detections, targets).compute()\nprint(f\"mAP50: {mAP.map50:.4f}\")\n\n# Method #2\nmAP = sv.MeanAveragePrecision.from_detections(detections, targets)\nprint(f\"mAP50: {mAP.map50:.4f}\")\n\nmAP50: 0.1553\nmAP50: 0.2100\ndef callback(image):\n prediction = model(image, verbose=False)[0]\n detection = sv.Detections.from_ultralytics(prediction)\n detection = detection[np.isin(detection['class_name'], dataset.classes)]\n detection.class_id = np.array([dataset.classes.index(class_name) for class_name in detection['class_name']])\n return detection\n\nsv.MeanAveragePrecision.benchmark(dataset, callback).map50\n\nnp.float64(0.2100207753031282)\ndataset.classes\n\n['butterfly',\n 'cat',\n 'crocodile',\n 'dear',\n 'deer',\n 'dog',\n 'elephant',\n 'fog',\n 'frog',\n 'giraffe',\n 'goat',\n 
'hippo',\n 'kangaroo',\n 'lion',\n 'parrot',\n 'shark',\n 'sheep',\n 'spider',\n 'tiger',\n 'zebra']\nimport leafmap\nm = leafmap.Map(center=(28.25, 77.40), zoom=18)\n# m.add_basemap(\"SATELLITE\")\nm.add_wms_layer(\"https://wayback.maptiles.arcgis.com/arcgis/rest/services/World_Imagery/WMTS/1.0.0/GoogleMapsCompatible/MapServer/tile/32553/{z}/{y}/{x}\", layers=\"0\")\nm\nm.zoom\n\n19.0\nm = leafmap.Map(center=(28.25, 77.40), zoom=17)\nm.add_basemap(\"SATELLITE\")\n# m.add_wms_layer(\"https://wayback.maptiles.arcgis.com/arcgis/rest/services/World_Imagery/WMTS/1.0.0/GoogleMapsCompatible/MapServer/tile/32553/{z}/{y}/{x}\", layers=\"0\")\nm\nimport geopandas as gpd\nfrom tqdm.notebook import tqdm\nfrom PIL import Image\nfrom joblib import Parallel, delayed\nfrom glob import glob\ndelhi_kilns = gpd.read_file(\"/home/patel_zeel/kiln_compass_24/regions/labels/delhi_airshed.geojson\")\nlen(delhi_kilns)\n\n783\nzoom = 19\njobs = []\nfor kiln in tqdm(delhi_kilns.geometry):\n lon_min, lat_min, lon_max, lat_max = kiln.bounds\n lon_margin = (lon_max - lon_min)/4\n lat_margin = (lat_max - lat_min)/4\n outer_bounds = [lon_min - lon_margin, lat_min - lat_margin, lon_max + lon_margin, lat_max + lat_margin]\n download_path = f\"/home/patel_zeel/kiln_compass_24/regions/high_res/{zoom}/{','.join(map(str, kiln.bounds))}.tif\"\n jobs.append(delayed(leafmap.map_tiles_to_geotiff)(download_path, outer_bounds, zoom=zoom, to_cog=True, source=\"SATELLITE\", quiet=True))\n\n_ = Parallel(n_jobs=-1)(tqdm(jobs))\n\n\n\n\n\n\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.439517,28.203442,77.440547,28.203933.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.385046,28.221207,77.385974,28.221709.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.439517,28.203442,77.440547,28.203933.tif\n\nUpdating dataset tags...\nWriting output to: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/77.385046,28.221207,77.385974,28.221709.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.41962,28.208412,77.420769,28.208927.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.41962,28.208412,77.420769,28.208927.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.400691,28.212222,77.402094,28.212772.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.399209,28.215745,77.400439,28.21634.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.400691,28.212222,77.402094,28.212772.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.399209,28.215745,77.400439,28.21634.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.380984,28.223921,77.382308,28.224487.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.423613,28.214831,77.424976,28.215277.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.380984,28.223921,77.382308,28.224487.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.423613,28.214831,77.424976,28.215277.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.420321,28.226731,77.421568,28.227297.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.420321,28.226731,77.421568,28.227297.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.422024,28.218841,77.423172,28.219389.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.397178,28.245326,77.398362,28.245804.tif\n\nUpdating dataset tags...\nWriting output to: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/77.37092,28.295907,77.371777,28.296921.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.389776,28.33162,77.390867,28.332134.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.391794,28.336363,77.392355,28.337244.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.582304,28.353717,77.583298,28.354337.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.582778,28.352286,77.583749,28.352902.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.391794,28.336363,77.392355,28.337244.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.582304,28.353717,77.583298,28.354337.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.582778,28.352286,77.583749,28.352902.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.391529,28.367164,77.392035,28.367901.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.591217,28.361328,77.592152,28.361705.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.391529,28.367164,77.392035,28.367901.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.567521,28.361494,77.568469,28.362254.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.591217,28.361328,77.592152,28.361705.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.567521,28.361494,77.568469,28.362254.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.35727,28.302428,77.35877,28.303196.tif\n\nReading input: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/77.576011,28.364573,77.576889,28.365287.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.582164,28.345883,77.583188,28.346603.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.576011,28.364573,77.576889,28.365287.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.35727,28.302428,77.35877,28.303196.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.58579,28.350779,77.586734,28.351472.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.582164,28.345883,77.583188,28.346603.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.58579,28.350779,77.586734,28.351472.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.573756,28.369634,77.574639,28.370282.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.573756,28.369634,77.574639,28.370282.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.5687,28.356236,77.569633,28.357074.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.579859,28.344916,77.580884,28.345734.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.578047,28.36547,77.578979,28.366176.tif\n\nAdding overviews...\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.577563,28.355356,77.578563,28.356157.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.5687,28.356236,77.569633,28.357074.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.584383,28.352828,77.585199,28.353929.tif\n\nUpdating dataset tags...\nWriting output to: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/77.579859,28.344916,77.580884,28.345734.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.578047,28.36547,77.578979,28.366176.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.577563,28.355356,77.578563,28.356157.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.584383,28.352828,77.585199,28.353929.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.580639,28.332723,77.581861,28.333598.tif\n\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.373846,28.414445,77.374937,28.415045.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.580639,28.332723,77.581861,28.333598.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.394576,28.377987,77.395569,28.378552.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.373846,28.414445,77.374937,28.415045.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.583154,28.367847,77.584115,28.368713.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.394576,28.377987,77.395569,28.378552.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.379309,28.411173,77.379951,28.411996.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.573654,28.363088,77.574602,28.364024.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.379309,28.411173,77.379951,28.411996.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.585125,28.367876,77.586074,28.368697.tif\n\nAdding overviews...\nUpdating dataset tags...\nAdding overviews...\nWriting 
output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.583154,28.367847,77.584115,28.368713.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.573654,28.363088,77.574602,28.364024.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.586444,28.399282,77.587516,28.400037.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.585125,28.367876,77.586074,28.368697.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.586444,28.399282,77.587516,28.400037.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.562144,28.349463,77.563504,28.35039.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.574721,28.363434,77.575752,28.364597.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.562144,28.349463,77.563504,28.35039.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.574721,28.363434,77.575752,28.364597.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.5834,28.345409,77.584414,28.34647.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.630723,28.398897,77.631673,28.399733.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.5834,28.345409,77.584414,28.34647.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.382717,28.420471,77.383807,28.420933.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.630723,28.398897,77.631673,28.399733.tif\nUpdating dataset tags...\nWriting output to: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/77.382717,28.420471,77.383807,28.420933.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.400725,28.406126,77.401871,28.406861.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.378525,28.407155,77.379336,28.408164.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.400725,28.406126,77.401871,28.406861.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.378525,28.407155,77.379336,28.408164.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.574408,28.335977,77.575719,28.336887.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.590371,28.337407,77.591509,28.338601.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.574408,28.335977,77.575719,28.336887.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.590371,28.337407,77.591509,28.338601.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.38782,28.388839,77.389035,28.389902.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.555503,28.395057,77.556764,28.395932.tif\n\nAdding overviews...\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.38782,28.388839,77.389035,28.389902.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.555503,28.395057,77.556764,28.395932.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.597054,28.421614,77.598106,28.422378.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.597004,28.428189,77.598064,28.428905.tif\n\nUpdating dataset tags...\nWriting output 
to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.597054,28.421614,77.598106,28.422378.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.582468,28.383531,77.583666,28.384522.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.597004,28.428189,77.598064,28.428905.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.645184,28.45388,77.646158,28.454548.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.676666,28.434924,77.677534,28.435544.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.645184,28.45388,77.646158,28.454548.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.676666,28.434924,77.677534,28.435544.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.582468,28.383531,77.583666,28.384522.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.671598,28.430455,77.672823,28.431464.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.671598,28.430455,77.672823,28.431464.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.597862,28.43561,77.59873,28.436718.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.649144,28.446776,77.650364,28.447675.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.597862,28.43561,77.59873,28.436718.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.644936,28.450308,77.646087,28.45115.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.606742,28.406935,77.607844,28.407877.tif\n\nReading input: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/77.592911,28.426001,77.594206,28.426818.tif\n\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.619007,28.410973,77.620136,28.411956.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.649144,28.446776,77.650364,28.447675.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.644936,28.450308,77.646087,28.45115.tif\nAdding overviews...\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.592911,28.426001,77.594206,28.426818.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.606742,28.406935,77.607844,28.407877.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.619007,28.410973,77.620136,28.411956.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.641895,28.450275,77.643003,28.451172.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.641895,28.450275,77.643003,28.451172.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.602472,28.407715,77.603892,28.408836.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.646111,28.453562,77.64719,28.454407.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.593847,28.481626,77.594897,28.482573.tif\n\nAdding overviews...\nUpdating dataset tags...\nAdding overviews...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.602472,28.407715,77.603892,28.408836.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.646111,28.453562,77.64719,28.454407.tif\nAdding 
overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.593847,28.481626,77.594897,28.482573.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.923943,28.724198,76.924571,28.725237.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.923943,28.724198,76.924571,28.725237.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.654202,28.449042,77.655425,28.449933.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.892405,28.641308,76.893691,28.641803.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.654202,28.449042,77.655425,28.449933.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.892405,28.641308,76.893691,28.641803.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.597879,28.437297,77.598954,28.438487.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.60059,28.520599,77.601846,28.521552.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.896381,28.633003,76.897692,28.633587.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.597879,28.437297,77.598954,28.438487.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.606773,28.519039,77.607973,28.519999.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.896381,28.633003,76.897692,28.633587.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.60059,28.520599,77.601846,28.521552.tif\nReading input: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/77.627612,28.487476,77.628753,28.488303.tif\n\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.605367,28.521803,77.606598,28.522723.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.395823,28.244212,77.396465,28.245036.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.606773,28.519039,77.607973,28.519999.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.599257,28.530944,77.600313,28.531893.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.395823,28.244212,77.396465,28.245036.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.604296,28.52906,77.605367,28.530121.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.627612,28.487476,77.628753,28.488303.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.605367,28.521803,77.606598,28.522723.tif\nAdding overviews...\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.604296,28.52906,77.605367,28.530121.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.599257,28.530944,77.600313,28.531893.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.863749,28.644316,76.864898,28.64488.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.863749,28.644316,76.864898,28.64488.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.60946,28.528705,77.610897,28.52971.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/77.60946,28.528705,77.610897,28.52971.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.35692,28.720956,77.357871,28.721958.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.531815,28.594874,77.53279,28.595945.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.35692,28.720956,77.357871,28.721958.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.531815,28.594874,77.53279,28.595945.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.928659,28.734502,76.930064,28.734963.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.521977,28.541158,77.523188,28.54226.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.411367,28.733316,77.412423,28.733866.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.928659,28.734502,76.930064,28.734963.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.411367,28.733316,77.412423,28.733866.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.521977,28.541158,77.523188,28.54226.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.60608,28.412178,77.60763,28.413427.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.605627,28.530084,77.606712,28.531215.tif\n\nAdding overviews...\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.605627,28.530084,77.606712,28.531215.tif\nUpdating dataset tags...\nWriting output to: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/77.60608,28.412178,77.60763,28.413427.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.607555,28.519595,77.608658,28.520652.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.643833,28.675536,77.644681,28.676664.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.426727,28.717934,77.427864,28.719015.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.643833,28.675536,77.644681,28.676664.tif\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.611452,28.528599,77.612755,28.529724.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.607555,28.519595,77.608658,28.520652.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.930947,28.733199,76.932218,28.73381.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.53028,28.590155,77.531362,28.591403.tif\n\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.601955,28.5376,77.603468,28.538821.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.426727,28.717934,77.427864,28.719015.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.930947,28.733199,76.932218,28.73381.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.362787,28.722948,77.364138,28.723998.tif\n\nAdding overviews...\nAdding overviews...\nUpdating dataset tags...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.53028,28.590155,77.531362,28.591403.tif\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.611452,28.528599,77.612755,28.529724.tif\nAdding overviews...\nAdding 
overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.601955,28.5376,77.603468,28.538821.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.362787,28.722948,77.364138,28.723998.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.413781,28.729296,77.414463,28.730468.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.603531,28.538614,77.604863,28.539878.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.413781,28.729296,77.414463,28.730468.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.442072,28.722959,77.443251,28.724072.tif\n\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.924006,28.737248,76.925496,28.737791.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.412711,28.724839,77.413583,28.726029.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.603531,28.538614,77.604863,28.539878.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.421989,28.731187,77.42261,28.732262.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.412711,28.724839,77.413583,28.726029.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.421989,28.731187,77.42261,28.732262.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.924006,28.737248,76.925496,28.737791.tif\nUpdating dataset tags...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.42751,28.729953,77.42821,28.730879.tif\n\nWriting output to: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/77.442072,28.722959,77.443251,28.724072.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.421358,28.726853,77.422091,28.728028.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.425279,28.733268,77.425996,28.734352.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.42751,28.729953,77.42821,28.730879.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.421358,28.726853,77.422091,28.728028.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.425279,28.733268,77.425996,28.734352.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.437877,28.719557,77.439105,28.720884.tif\n\nAdding overviews...\nUpdating dataset tags...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.932392,28.737181,76.932978,28.738409.tif\n\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.437877,28.719557,77.439105,28.720884.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.92683,28.738921,76.927458,28.739961.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.929012,28.737692,76.930306,28.738261.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.932392,28.737181,76.932978,28.738409.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.929012,28.737692,76.930306,28.738261.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.92683,28.738921,76.927458,28.739961.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.436464,28.726587,77.437863,28.727577.tif\n\nReading input: 
[Output truncated: interleaved batch-processing logs. For each GeoTIFF tile under /home/patel_zeel/kiln_compass_24/regions/high_res/19/, the job read the input, added overviews, updated dataset tags, and wrote the output back in place.]
to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.562674,28.775677,77.564047,28.776915.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.355279,28.751545,77.356688,28.752725.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.951123,28.788449,76.952847,28.789189.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.577146,28.77488,77.578443,28.775998.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.951123,28.788449,76.952847,28.789189.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.555181,28.782961,77.55661,28.784202.tif\n\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.356537,28.751511,77.358093,28.752631.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.574241,28.781285,77.575638,28.782461.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.355279,28.751545,77.356688,28.752725.tif\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.566401,28.7804,77.567839,28.781605.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.577146,28.77488,77.578443,28.775998.tif\nAdding overviews...\nAdding overviews...\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.574241,28.781285,77.575638,28.782461.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.555181,28.782961,77.55661,28.784202.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.316595,28.79245,77.317639,28.793517.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.356537,28.751511,77.358093,28.752631.tif\nAdding 
overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.566401,28.7804,77.567839,28.781605.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.316595,28.79245,77.317639,28.793517.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.366517,28.776091,77.368382,28.77747.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.593462,28.775935,77.594835,28.777124.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.584325,28.779093,77.585688,28.780292.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.318081,28.794084,77.319394,28.795044.tif\n\nAdding overviews...\nAdding overviews...\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.366517,28.776091,77.368382,28.77747.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.593462,28.775935,77.594835,28.777124.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.318081,28.794084,77.319394,28.795044.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.584325,28.779093,77.585688,28.780292.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.325143,28.791396,77.326476,28.792341.tif\n\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.320603,28.788732,77.321664,28.789842.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.325143,28.791396,77.326476,28.792341.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.346971,28.793623,77.348156,28.79469.tif\n\nReading input: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/77.443476,28.752078,77.444757,28.753004.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.320603,28.788732,77.321664,28.789842.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.346971,28.793623,77.348156,28.79469.tif\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.587339,28.776679,77.588878,28.777978.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.443476,28.752078,77.444757,28.753004.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.336304,28.764197,77.337484,28.765267.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.326488,28.789608,77.327922,28.790651.tif\n\nAdding overviews...\nAdding overviews...\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.336304,28.764197,77.337484,28.765267.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.587339,28.776679,77.588878,28.777978.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.326488,28.789608,77.327922,28.790651.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.328991,28.790909,77.330241,28.792217.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.324084,28.79007,77.325352,28.791319.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.317802,28.789826,77.319171,28.790893.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.575583,28.778195,77.577203,28.779656.tif\n\nAdding overviews...\nAdding overviews...\nReading input: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/77.359641,28.788344,77.360764,28.789438.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.328991,28.790909,77.330241,28.792217.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.324084,28.79007,77.325352,28.791319.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.317802,28.789826,77.319171,28.790893.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.355221,28.755255,77.356324,28.756254.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.359641,28.788344,77.360764,28.789438.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.350163,28.760839,77.351566,28.761884.tif\n\nAdding overviews...\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.575583,28.778195,77.577203,28.779656.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.355221,28.755255,77.356324,28.756254.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.850943,28.799945,76.852398,28.800496.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.313467,28.792684,77.314777,28.793966.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.850943,28.799945,76.852398,28.800496.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.844777,28.798646,76.845398,28.79978.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.353793,28.786232,77.355052,28.787531.tif\n\nUpdating dataset tags...\nWriting output to: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/77.350163,28.760839,77.351566,28.761884.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.844777,28.798646,76.845398,28.79978.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.924499,28.801519,76.925164,28.802558.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.313467,28.792684,77.314777,28.793966.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.914908,28.798953,76.915447,28.800071.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.916497,28.802148,76.917688,28.8026.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.924499,28.801519,76.925164,28.802558.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.914908,28.798953,76.915447,28.800071.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.916497,28.802148,76.917688,28.8026.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.353793,28.786232,77.355052,28.787531.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.926524,28.797143,76.927189,28.798182.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.926524,28.797143,76.927189,28.798182.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.356603,28.785378,77.357764,28.786621.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.350146,28.7616,77.351815,28.762753.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.356603,28.785378,77.357764,28.786621.tif\nReading 
input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.348332,28.788119,77.349726,28.78921.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.350146,28.7616,77.351815,28.762753.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.348332,28.788119,77.349726,28.78921.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.547596,28.793098,77.548664,28.794104.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.547596,28.793098,77.548664,28.794104.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.933296,28.795963,76.933961,28.796986.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.507843,28.790402,77.509214,28.791599.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.933296,28.795963,76.933961,28.796986.tif\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.358582,28.792517,77.36008,28.793642.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.507843,28.790402,77.509214,28.791599.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.358582,28.792517,77.36008,28.793642.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.362218,28.791709,77.363569,28.793095.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.351374,28.799503,77.352318,28.80047.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.361926,28.788358,77.363249,28.789636.tif\n\nAdding overviews...\nAdding overviews...\nUpdating dataset tags...\nWriting output to: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/77.351374,28.799503,77.352318,28.80047.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.362218,28.791709,77.363569,28.793095.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.368745,28.762421,77.37011,28.763687.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.361926,28.788358,77.363249,28.789636.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.368745,28.762421,77.37011,28.763687.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.351737,28.796406,77.352934,28.797365.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.555977,28.791282,77.557307,28.792281.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.351737,28.796406,77.352934,28.797365.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.555977,28.791282,77.557307,28.792281.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.572004,28.786398,77.573226,28.78751.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.320092,28.79789,77.32117,28.799067.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.445419,28.790159,77.4467,28.791443.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.550714,28.793385,77.552062,28.794533.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.554294,28.793159,77.555615,28.794369.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.320092,28.79789,77.32117,28.799067.tif\nAdding overviews...\nAdding 
overviews...\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.445419,28.790159,77.4467,28.791443.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.550714,28.793385,77.552062,28.794533.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.572004,28.786398,77.573226,28.78751.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.554294,28.793159,77.555615,28.794369.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.351676,28.798465,77.352996,28.799467.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.319983,28.80237,77.32139,28.803396.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.329798,28.800491,77.330977,28.801702.tif\n\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.324903,28.800462,77.326324,28.80146.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.351676,28.798465,77.352996,28.799467.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.319983,28.80237,77.32139,28.803396.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.329798,28.800491,77.330977,28.801702.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.324903,28.800462,77.326324,28.80146.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.375588,28.804412,77.37674,28.805199.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.367579,28.801358,77.368752,28.802196.tif\n\nReading input: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/77.37061,28.800435,77.371534,28.80156.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.375588,28.804412,77.37674,28.805199.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.367579,28.801358,77.368752,28.802196.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.328528,28.804005,77.329665,28.805265.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.37061,28.800435,77.371534,28.80156.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.375858,28.798023,77.377101,28.798847.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.344299,28.804002,77.345525,28.80518.tif\n\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.91909,28.811323,76.920412,28.811865.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.937065,28.811135,76.938253,28.811601.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.328528,28.804005,77.329665,28.805265.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.375858,28.798023,77.377101,28.798847.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.91909,28.811323,76.920412,28.811865.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.937065,28.811135,76.938253,28.811601.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.344299,28.804002,77.345525,28.80518.tif\nReading input: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/76.920528,28.811296,76.921139,28.812405.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.920528,28.811296,76.921139,28.812405.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.354843,28.796089,77.356008,28.797179.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.332342,28.799474,77.33357,28.800783.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.45339,28.798339,77.454709,28.799275.tif\n\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.374343,28.804902,77.375575,28.805689.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.354843,28.796089,77.356008,28.797179.tif\nAdding overviews...\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.563516,28.795457,77.564549,28.796255.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.332342,28.799474,77.33357,28.800783.tif\nUpdating dataset tags...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.45339,28.798339,77.454709,28.799275.tif\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.374343,28.804902,77.375575,28.805689.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.347724,28.794735,77.348996,28.795784.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.563516,28.795457,77.564549,28.796255.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.848985,28.808151,76.849666,28.809344.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.347724,28.794735,77.348996,28.795784.tif\nUpdating 
dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.848985,28.808151,76.849666,28.809344.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.551507,28.803384,77.552788,28.804405.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.844198,28.812821,76.845654,28.813567.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.847125,28.812669,76.849038,28.813407.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.844198,28.812821,76.845654,28.813567.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.551507,28.803384,77.552788,28.804405.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.847125,28.812669,76.849038,28.813407.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.849026,28.813797,76.850536,28.814523.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.532089,28.802776,77.533219,28.803914.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.849026,28.813797,76.850536,28.814523.tif\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.449381,28.803685,77.450699,28.804721.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.532089,28.802776,77.533219,28.803914.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.449381,28.803685,77.450699,28.804721.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.294141,28.805063,77.295358,28.805989.tif\n\nAdding overviews...\nReading input: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/77.31241,28.796202,77.31392,28.797543.tif\nUpdating dataset tags...\n\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.294141,28.805063,77.295358,28.805989.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.328377,28.778093,77.329511,28.779191.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.544378,28.799685,77.545591,28.800839.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.302414,28.811316,77.303604,28.81261.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.328377,28.778093,77.329511,28.779191.tif\nAdding overviews...\nAdding overviews...\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.544378,28.799685,77.545591,28.800839.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.31241,28.796202,77.31392,28.797543.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.302414,28.811316,77.303604,28.81261.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.347889,28.80743,77.349771,28.807962.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.347889,28.80743,77.349771,28.807962.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.470509,28.809922,77.471664,28.811036.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.57246,28.798558,77.573539,28.799538.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.470509,28.809922,77.471664,28.811036.tif\nReading input: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/77.340993,28.806853,77.342147,28.807982.tif\n\nAdding overviews...\nUpdating dataset tags...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.480786,28.805537,77.482111,28.806542.tif\n\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.57246,28.798558,77.573539,28.799538.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.844012,28.816557,76.845175,28.817133.tif\n\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.845233,28.821603,76.846635,28.82219.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.340993,28.806853,77.342147,28.807982.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.844012,28.816557,76.845175,28.817133.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.480786,28.805537,77.482111,28.806542.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.309701,28.810701,77.311179,28.811736.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.920343,28.815163,76.92154,28.81561.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.845233,28.821603,76.846635,28.82219.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.530834,28.813706,77.531994,28.814876.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.920343,28.815163,76.92154,28.81561.tif\nAdding overviews...\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.309701,28.810701,77.311179,28.811736.tif\nUpdating dataset tags...\nWriting output to: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/77.530834,28.813706,77.531994,28.814876.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.533024,28.810578,77.534142,28.81172.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.849191,28.816402,76.850566,28.816934.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.851395,28.830629,76.851948,28.831621.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.56557,28.8051,77.566293,28.806338.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.849191,28.816402,76.850566,28.816934.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.533024,28.810578,77.534142,28.81172.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.851395,28.830629,76.851948,28.831621.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.30894,28.812889,77.310367,28.813934.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.321123,28.804873,77.32256,28.806199.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.56557,28.8051,77.566293,28.806338.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.877693,28.83413,76.8783,28.835169.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.877693,28.83413,76.8783,28.835169.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.422211,28.805184,77.423706,28.806162.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.316865,28.806026,77.318241,28.807386.tif\n\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.313437,28.809409,77.314684,28.810688.tif\n\nReading 
input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.960004,28.82227,76.961138,28.822781.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.30894,28.812889,77.310367,28.813934.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.851905,28.819129,76.852626,28.820375.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.960004,28.82227,76.961138,28.822781.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.321123,28.804873,77.32256,28.806199.tif\nAdding overviews...\nUpdating dataset tags...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.851905,28.819129,76.852626,28.820375.tif\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.422211,28.805184,77.423706,28.806162.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.95858,28.822863,76.959236,28.823842.tif\n\nAdding overviews...\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.313437,28.809409,77.314684,28.810688.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.95858,28.822863,76.959236,28.823842.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.316865,28.806026,77.318241,28.807386.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.290246,28.817769,77.291254,28.818445.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.856739,28.821235,76.857475,28.822434.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.290246,28.817769,77.291254,28.818445.tif\nReading input: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/77.306324,28.817327,77.30688,28.818479.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.856739,28.821235,76.857475,28.822434.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.306324,28.817327,77.30688,28.818479.tif\n\n[... repeated "Reading input" / "Adding overviews" / "Updating dataset tags" / "Writing output" log lines for the remaining tiles truncated ...]\n\nReading input: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/76.874307,28.925557,76.875669,28.926033.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.874307,28.925557,76.875669,28.926033.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.249015,28.905595,77.250543,28.906765.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.884571,28.923891,76.885882,28.924536.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.884571,28.923891,76.885882,28.924536.tif\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.876316,28.922714,76.877748,28.923576.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.875734,28.928389,76.876327,28.929458.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.249015,28.905595,77.250543,28.906765.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.875734,28.928389,76.876327,28.929458.tif\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.898177,28.930259,76.899578,28.930841.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.876316,28.922714,76.877748,28.923576.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.39508,28.918222,77.396294,28.919073.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.897171,28.932963,76.898374,28.933561.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.898177,28.930259,76.899578,28.930841.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.897171,28.932963,76.898374,28.933561.tif\nAdding overviews...\nUpdating dataset 
tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.39508,28.918222,77.396294,28.919073.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.287776,28.82722,77.289045,28.828096.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.397744,28.922422,77.398976,28.923672.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.866704,28.925682,76.868069,28.926316.tif\n\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.373435,28.820064,77.374641,28.821211.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.287776,28.82722,77.289045,28.828096.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.885291,28.926016,76.88598,28.927352.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.866704,28.925682,76.868069,28.926316.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.397744,28.922422,77.398976,28.923672.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.885291,28.926016,76.88598,28.927352.tif\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.319729,28.917191,77.321108,28.917964.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.906632,28.926062,76.907351,28.92732.tif\nUpdating dataset tags...\n\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.373435,28.820064,77.374641,28.821211.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.319729,28.917191,77.321108,28.917964.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.906632,28.926062,76.907351,28.92732.tif\nReading 
input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.914836,28.928373,76.915483,28.929568.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.904607,28.93213,76.905362,28.933388.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.914836,28.928373,76.915483,28.929568.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.256037,28.911106,77.257078,28.91247.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.875666,28.926238,76.877105,28.927012.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.904607,28.93213,76.905362,28.933388.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.903188,28.930197,76.904571,28.930747.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.875666,28.926238,76.877105,28.927012.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.903188,28.930197,76.904571,28.930747.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.256037,28.911106,77.257078,28.91247.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.398581,28.921898,77.39973,28.923136.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.951926,28.853897,76.953188,28.854439.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.951926,28.853897,76.953188,28.854439.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.398581,28.921898,77.39973,28.923136.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.255411,28.920668,77.256678,28.921915.tif\n\nAdding overviews...\nReading input: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/76.918285,28.926785,76.918985,28.92787.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.255411,28.920668,77.256678,28.921915.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.918285,28.926785,76.918985,28.92787.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.912075,28.929552,76.913529,28.930134.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.915626,28.928294,76.916327,28.929583.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.914374,28.930731,76.914984,28.932272.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.912075,28.929552,76.913529,28.930134.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.915626,28.928294,76.916327,28.929583.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.920422,28.931674,76.921141,28.932947.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.914374,28.930731,76.914984,28.932272.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.920422,28.931674,76.921141,28.932947.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.530007,28.832895,77.531028,28.834103.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.909962,28.926185,76.91151,28.926898.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.916755,28.926784,76.917409,28.92792.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.909962,28.926185,76.91151,28.926898.tif\nUpdating dataset tags...\nWriting output to: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/76.916755,28.926784,76.917409,28.92792.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.381082,28.93381,77.381968,28.934706.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.530007,28.832895,77.531028,28.834103.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.901284,28.935685,76.902631,28.936298.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.901284,28.935685,76.902631,28.936298.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.381082,28.93381,77.381968,28.934706.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.918321,28.932224,76.919775,28.932822.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.918321,28.932224,76.919775,28.932822.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.414274,28.919033,77.415722,28.920224.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.901826,28.931758,76.902876,28.933089.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.414274,28.919033,77.415722,28.920224.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.901826,28.931758,76.902876,28.933089.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.908256,28.936154,76.90943,28.936734.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.908256,28.936154,76.90943,28.936734.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.906931,28.938292,76.908238,28.938849.tif\n\nUpdating dataset tags...\nWriting output to: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/76.906931,28.938292,76.908238,28.938849.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.272674,28.933719,77.273739,28.934798.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.272674,28.933719,77.273739,28.934798.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.916201,28.931155,76.917782,28.931784.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.916201,28.931155,76.917782,28.931784.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.259371,28.933412,77.260493,28.934482.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.926873,28.927528,76.927908,28.928875.tif\n\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.896325,28.941761,76.897155,28.943049.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.259371,28.933412,77.260493,28.934482.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.258113,28.943559,77.259317,28.944645.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.884405,28.948867,76.885815,28.949482.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.896325,28.941761,76.897155,28.943049.tif\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.898105,28.94721,76.898841,28.94831.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.884405,28.948867,76.885815,28.949482.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.926873,28.927528,76.927908,28.928875.tif\nUpdating dataset tags...\nReading input: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/76.905626,28.951297,76.906812,28.951878.tif\n\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.258113,28.943559,77.259317,28.944645.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.898105,28.94721,76.898841,28.94831.tif\nUpdating dataset tags...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.915698,28.935449,76.916596,28.936723.tif\n\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.905626,28.951297,76.906812,28.951878.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.256898,28.934974,77.258232,28.935959.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.915698,28.935449,76.916596,28.936723.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.256898,28.934974,77.258232,28.935959.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.268644,28.934886,77.269672,28.936023.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.289628,28.942986,77.290971,28.94393.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.903601,28.947053,76.90502,28.947635.tif\n\nAdding overviews...\nUpdating dataset tags...\nAdding overviews...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.903601,28.947053,76.90502,28.947635.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.289628,28.942986,77.290971,28.94393.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.268644,28.934886,77.269672,28.936023.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.907662,28.950844,76.90835,28.951969.tif\n\nUpdating dataset tags...\nWriting 
output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.907662,28.950844,76.90835,28.951969.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.251985,28.94787,77.25335,28.948452.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.88633,28.948476,76.887096,28.94956.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.251985,28.94787,77.25335,28.948452.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.88633,28.948476,76.887096,28.94956.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.886334,28.950175,76.887182,28.951397.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.917351,28.950071,76.918716,28.950699.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.913062,28.947949,76.914535,28.948656.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.912954,28.951218,76.913692,28.952507.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.886334,28.950175,76.887182,28.951397.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.917351,28.950071,76.918716,28.950699.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.913062,28.947949,76.914535,28.948656.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.912954,28.951218,76.913692,28.952507.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.431279,28.936963,77.432459,28.938163.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.217525,28.955605,77.218273,28.956667.tif\n\nUpdating dataset tags...\nWriting output to: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/77.217525,28.955605,77.218273,28.956667.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.259178,28.942939,77.260521,28.944094.tif\n\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.401468,28.939561,77.402701,28.940702.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.431279,28.936963,77.432459,28.938163.tif\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.415133,28.938179,77.416279,28.939415.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.259178,28.942939,77.260521,28.944094.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.401468,28.939561,77.402701,28.940702.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.415133,28.938179,77.416279,28.939415.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.915491,28.953007,76.916947,28.953641.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.324468,28.942848,77.325864,28.94388.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.414298,28.93673,77.41581,28.937925.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.915491,28.953007,76.916947,28.953641.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.224086,28.952098,77.225003,28.953237.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.324468,28.942848,77.325864,28.94388.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/77.414298,28.93673,77.41581,28.937925.tif\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.407728,28.956252,77.408428,28.957289.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.224086,28.952098,77.225003,28.953237.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.407728,28.956252,77.408428,28.957289.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.266991,28.943217,77.268631,28.944473.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.440341,28.873538,77.441825,28.874925.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.266991,28.943217,77.268631,28.944473.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.440341,28.873538,77.441825,28.874925.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.221169,28.952151,77.22219,28.953507.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.291493,28.933235,77.29294,28.934675.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.221169,28.952151,77.22219,28.953507.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.447961,28.958248,77.448554,28.959222.tif\n\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.427677,28.946816,77.428841,28.948161.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.447961,28.958248,77.448554,28.959222.tif\nUpdating dataset tags...\nWriting output to: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/77.291493,28.933235,77.29294,28.934675.tif\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.430814,28.958514,77.43227,28.959079.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.427677,28.946816,77.428841,28.948161.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.430814,28.958514,77.43227,28.959079.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.423406,28.959032,77.424661,28.959707.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.219602,28.959455,77.220595,28.960561.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.423406,28.959032,77.424661,28.959707.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.221756,28.956899,77.22304,28.957726.tif\n\nAdding overviews...\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.219602,28.959455,77.220595,28.960561.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.221756,28.956899,77.22304,28.957726.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.425208,28.958155,77.426535,28.958868.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.418174,28.945075,77.419569,28.946435.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.425208,28.958155,77.426535,28.958868.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.232493,28.956615,77.233848,28.957765.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.421306,28.951913,77.422879,28.952881.tif\n\nReading input: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/77.225407,28.960811,77.226616,28.96238.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.418174,28.945075,77.419569,28.946435.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.232493,28.956615,77.233848,28.957765.tif\nAdding overviews...\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.421306,28.951913,77.422879,28.952881.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.225407,28.960811,77.226616,28.96238.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.880135,28.916879,76.881266,28.917382.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.880135,28.916879,76.881266,28.917382.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.42902,28.946138,77.4303,28.947424.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.88796,28.973951,76.888717,28.9751.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.351114,28.95661,77.352266,28.957959.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.250986,28.95936,77.252325,28.960745.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.88796,28.973951,76.888717,28.9751.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.42902,28.946138,77.4303,28.947424.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.351114,28.95661,77.352266,28.957959.tif\nReading input: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/77.291097,28.945003,77.292804,28.94633.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.250986,28.95936,77.252325,28.960745.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.291097,28.945003,77.292804,28.94633.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.293315,28.948056,77.29497,28.949171.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.433782,28.959234,77.434904,28.960545.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.293315,28.948056,77.29497,28.949171.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.441876,28.96009,77.443214,28.961068.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.256212,28.895766,77.257349,28.89702.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.270443,28.971888,77.271871,28.972685.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.433782,28.959234,77.434904,28.960545.tif\nAdding overviews...\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.270443,28.971888,77.271871,28.972685.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.256212,28.895766,77.257349,28.89702.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.441876,28.96009,77.443214,28.961068.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.346764,28.955006,77.348023,28.956306.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/77.346764,28.955006,77.348023,28.956306.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.26793,28.895126,77.269475,28.896238.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.228188,28.975158,77.229128,28.97626.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.444652,28.962986,77.446015,28.963965.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.228188,28.975158,77.229128,28.97626.tif\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.444652,28.962986,77.446015,28.963965.tif\nAdding overviews...\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.229458,28.955108,77.231257,28.95632.tif\n\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.92039,28.815175,76.921463,28.815598.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.26793,28.895126,77.269475,28.896238.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.23228,28.97215,77.233554,28.973512.tif\n\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.92039,28.815175,76.921463,28.815598.tif\nAdding overviews...\nAdding overviews...\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.229458,28.955108,77.231257,28.95632.tif\nUpdating dataset tags...\nWriting output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.23228,28.97215,77.233554,28.973512.tif\nReading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.223663,28.992069,77.224585,28.99318.tif\n\nAdding overviews...\nUpdating dataset tags...\nWriting output to: 
/home/patel_zeel/kiln_compass_24/regions/high_res/19/77.223663,28.992069,77.224585,28.99318.tif
Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.890943,28.987499,76.892324,28.988273.tif
Updating dataset tags...
Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/76.890943,28.987499,76.892324,28.988273.tif
Adding overviews...
[... the same "Reading input / Adding overviews / Updating dataset tags / Writing output" cycle repeats, interleaved across parallel workers, for the remaining tiles ...]
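The log phases from different tiles appear interleaved because several conversions run concurrently. A minimal stdlib sketch of that pattern, assuming a hypothetical `convert_tile` worker (the actual conversion tool behind these log lines is not shown in the transcript):

```python
from concurrent.futures import ThreadPoolExecutor

def convert_tile(path):
    # Placeholder for the real conversion; it only echoes the same
    # log phases seen in the transcript above.
    print(f"Reading input: {path}")
    print("Updating dataset tags...")
    print(f"Writing output to: {path}")
    return path

# Hypothetical tile names standing in for the .tif paths.
tiles = [f"tile_{i}.tif" for i in range(4)]

# With more than one worker, the per-tile print statements from
# different threads interleave, exactly as in the log above.
with ThreadPoolExecutor(max_workers=2) as pool:
    done = list(pool.map(convert_tile, tiles))
```

`pool.map` still returns results in input order even though the side-effect logging interleaves.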
/home/patel_zeel/kiln_compass_24/regions/high_res/19/77.396826,28.326171,77.397619,28.326662.tif
Updating dataset tags...
Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.396826,28.326171,77.397619,28.326662.tif
Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.234529,29.002138,77.236003,29.003329.tif
Adding overviews...
Updating dataset tags...
Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.234529,29.002138,77.236003,29.003329.tif
Reading input: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.275279,28.225628,77.275879,28.226649.tif
Updating dataset tags...
Writing output to: /home/patel_zeel/kiln_compass_24/regions/high_res/19/77.275279,28.225628,77.275879,28.226649.tif

# imports needed by the cells below
from glob import glob
from PIL import Image
from tqdm import tqdm

files = glob("/home/patel_zeel/kiln_compass_24/regions/high_res/19/*.tif")
print(len(files))
sizes = []
for file in files:
    with Image.open(file) as img:
        sizes.append(img.size[0])
        sizes.append(img.size[1])

783

max(sizes), min(sizes)

(1216, 229)

Image.open(files[1])

for file in tqdm(files):
    new_file = file.replace(".tif", ".png")
    with Image.open(file) as img:
        img.save(new_file)

from ultralytics import YOLO

model = YOLO("yolo11m-obb")
model.train(data="../lab/trench_width/data.yaml", epochs=51, batch=-1, imgsz=1280, save_period=5)

New https://pypi.org/project/ultralytics/8.3.64 available 😃 Update with 'pip install -U ultralytics'
Ultralytics 8.3.55 🚀 Python-3.10.15 torch-2.5.0+cu124 CUDA:0 (NVIDIA A100-SXM4-80GB, 81156MiB)
engine/trainer: task=obb, mode=train, model=yolo11m-obb.pt, data=../lab/trench_width/data.yaml, epochs=51, time=None, patience=100, batch=-1, imgsz=1280, save=True, save_period=5, cache=False, device=None, workers=8, project=None, name=train5, exist_ok=False, pretrained=True, optimizer=auto, verbose=True, seed=0, deterministic=True, single_cls=False, rect=False, cos_lr=False, close_mosaic=10, resume=False, amp=True, fraction=1.0,
profile=False, freeze=None, multi_scale=False, overlap_mask=True, mask_ratio=4, dropout=0.0, val=True, split=val, save_json=False, save_hybrid=False, conf=None, iou=0.7, max_det=300, half=False, dnn=False, plots=True, source=None, vid_stride=1, stream_buffer=False, visualize=False, augment=False, agnostic_nms=False, classes=None, retina_masks=False, embed=None, show=False, save_frames=False, save_txt=False, save_conf=False, save_crop=False, show_labels=True, show_conf=True, show_boxes=True, line_width=None, format=torchscript, keras=False, optimize=False, int8=False, dynamic=False, simplify=True, opset=None, workspace=None, nms=False, lr0=0.01, lrf=0.01, momentum=0.937, weight_decay=0.0005, warmup_epochs=3.0, warmup_momentum=0.8, warmup_bias_lr=0.1, box=7.5, cls=0.5, dfl=1.5, pose=12.0, kobj=1.0, nbs=64, hsv_h=0.015, hsv_s=0.7, hsv_v=0.4, degrees=0.0, translate=0.1, scale=0.5, shear=0.0, perspective=0.0, flipud=0.0, fliplr=0.5, bgr=0.0, mosaic=1.0, mixup=0.0, copy_paste=0.0, copy_paste_mode=flip, auto_augment=randaugment, erasing=0.4, crop_fraction=1.0, cfg=None, tracker=botsort.yaml, save_dir=runs/obb/train5
Overriding model.yaml nc=80 with nc=2

                   from  n    params  module                                       arguments
  0                  -1  1      1856  ultralytics.nn.modules.conv.Conv             [3, 64, 3, 2]
  1                  -1  1     73984  ultralytics.nn.modules.conv.Conv             [64, 128, 3, 2]
  2                  -1  1    111872  ultralytics.nn.modules.block.C3k2            [128, 256, 1, True, 0.25]
  3                  -1  1    590336  ultralytics.nn.modules.conv.Conv             [256, 256, 3, 2]
  4                  -1  1    444928  ultralytics.nn.modules.block.C3k2            [256, 512, 1, True, 0.25]
  5                  -1  1   2360320  ultralytics.nn.modules.conv.Conv             [512, 512, 3, 2]
  6                  -1  1   1380352  ultralytics.nn.modules.block.C3k2            [512, 512, 1, True]
  7                  -1  1   2360320  ultralytics.nn.modules.conv.Conv             [512, 512, 3, 2]
  8                  -1  1   1380352  ultralytics.nn.modules.block.C3k2            [512, 512, 1, True]
  9                  -1  1    656896  ultralytics.nn.modules.block.SPPF            [512, 512, 5]
 10                  -1  1    990976  ultralytics.nn.modules.block.C2PSA           [512, 512, 1]
 11                  -1  1         0  torch.nn.modules.upsampling.Upsample         [None, 2, 'nearest']
 12             [-1, 6]  1         0  ultralytics.nn.modules.conv.Concat           [1]
 13                  -1  1   1642496  ultralytics.nn.modules.block.C3k2            [1024, 512, 1, True]
 14                  -1  1         0  torch.nn.modules.upsampling.Upsample         [None, 2, 'nearest']
 15             [-1, 4]  1         0  ultralytics.nn.modules.conv.Concat           [1]
 16                  -1  1    542720  ultralytics.nn.modules.block.C3k2            [1024, 256, 1, True]
 17                  -1  1    590336  ultralytics.nn.modules.conv.Conv             [256, 256, 3, 2]
 18            [-1, 13]  1         0  ultralytics.nn.modules.conv.Concat           [1]
 19                  -1  1   1511424  ultralytics.nn.modules.block.C3k2            [768, 512, 1, True]
 20                  -1  1   2360320  ultralytics.nn.modules.conv.Conv             [512, 512, 3, 2]
 21            [-1, 10]  1         0  ultralytics.nn.modules.conv.Concat           [1]
 22                  -1  1   1642496  ultralytics.nn.modules.block.C3k2            [1024, 512, 1, True]
 23        [16, 19, 22]  1   2261401  ultralytics.nn.modules.head.OBB              [2, 1, [256, 512, 512]]
YOLO11m-obb summary: 434 layers, 20,903,385 parameters, 20,903,369 gradients, 71.9 GFLOPs

Transferred 685/691 items from pretrained weights

wandb: Using wandb-core as the SDK backend. Please refer to https://wandb.me/wandb-core for more information.
wandb: Currently logged in as: patel_zeel (sustainability-lab). Use `wandb login --relogin` to force relogin
Tracking run with wandb version 0.18.5
Run data is saved locally in /home/patel_zeel/blog/lab/wandb/run-20250120_153755-5wtpe9o7
Syncing run train5 to Weights & Biases (docs)
View project at https://wandb.ai/sustainability-lab/Ultralytics
View run at https://wandb.ai/sustainability-lab/Ultralytics/runs/5wtpe9o7

Freezing layer 'model.23.dfl.conv.weight'
AMP: running Automatic Mixed Precision (AMP) checks...
AMP: checks passed ✅

train: Scanning /home/patel_zeel/blog/lab/trench_width/labels.cache...
10 images, 0 backgrounds, 0 corrupt: 100%|██████████| 10/10 [00:00<?, ?it/s]

AutoBatch: Computing optimal batch size for imgsz=1280 at 60.0% CUDA memory utilization.
AutoBatch: CUDA:0 (NVIDIA A100-SXM4-80GB) 79.25G total, 0.21G reserved, 0.20G allocated, 78.84G free

      Params   GFLOPs   GPU_mem (GB)   forward (ms)   backward (ms)   input                output
    20903385    287.6          4.488          46.53             nan   (1, 3, 1280, 1280)   list
    20903385    575.1         10.364          44.88             nan   (2, 3, 1280, 1280)   list
    20903385     1150         19.864          47.99             nan   (4, 3, 1280, 1280)   list
    20903385     2300         39.047          75.16             nan   (8, 3, 1280, 1280)   list
    20903385     4601         76.993          149.4             nan   (16, 3, 1280, 1280)  list
CUDA out of memory. Tried to allocate 400.00 MiB. GPU 0 has a total capacity of 79.25 GiB of which 39.50 MiB is free. Including non-PyTorch memory, this process has 79.21 GiB memory in use. Of the allocated memory 78.52 GiB is allocated by PyTorch, and 164.18 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
CUDA out of memory. Tried to allocate 400.00 MiB. GPU 0 has a total capacity of 79.25 GiB of which 171.50 MiB is free. Including non-PyTorch memory, this process has 79.08 GiB memory in use. Of the allocated memory 77.98 GiB is allocated by PyTorch, and 587.22 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
AutoBatch: Using batch-size 10 for CUDA:0 49.53G/79.25G (62%) ✅

train: Scanning /home/patel_zeel/blog/lab/trench_width/labels.cache...
10 images, 0 backgrounds, 0 corrupt: 100%|██████████| 10/10 [00:00<?, ?it/s]
val: Scanning /home/patel_zeel/blog/lab/trench_width/labels.cache... 10 images, 0 backgrounds, 0 corrupt: 100%|██████████| 10/10 [00:00<?, ?it/s]

Plotting labels to runs/obb/train5/labels.jpg...
optimizer: 'optimizer=auto' found, ignoring 'lr0=0.01' and 'momentum=0.937' and determining best 'optimizer', 'lr0' and 'momentum' automatically...
optimizer: AdamW(lr=0.001667, momentum=0.9) with parameter groups 112 weight(decay=0.0), 122 weight(decay=0.00046875), 121 bias(decay=0.0)
Image sizes 1280 train, 1280 val
Using 8 dataloader workers
Logging results to runs/obb/train5
Starting training for 51 epochs...

      Epoch    GPU_mem   box_loss   cls_loss   dfl_loss   Instances   Size
      Class     Images  Instances      Box(P          R      mAP50   mAP50-95)
       1/51      21.9G      3.042      5.112      3.885          57   1280
        all         10         20    0.00146        0.2    0.00126   0.000148
       2/51      21.9G      2.846      4.908      4.012          58   1280
        all         10         20    0.00106       0.15   0.000779   9.96e-05
       3/51      21.9G      3.022      5.258      3.667          56   1280
        all         10         20    0.00103       0.15   0.000764   9.76e-05
       4/51      21.9G      2.789      5.113      3.852          58   1280
        all         10         20   0.000675        0.1   0.000516   7.25e-05
       5/51        22G      2.526       5.45      3.457          42   1280
        all         10         20     0.0017       0.25    0.00121    0.00022
       6/51      22.1G      2.832      4.983      3.874          48   1280
        all         10         20    0.00257       0.35    0.00268   0.000685
       7/51      22.1G      2.534      4.688      3.428          55   1280
        all         10         20    0.00417       0.55     0.0161     0.0029
       8/51      22.1G      2.091      4.292       3.43          56   1280
        all         10         20    0.00667        0.6     0.0156    0.00451
       9/51      22.1G      2.307      4.644      3.652          35   1280
        all         10         20     0.0105       0.55     0.0608     0.0119
      10/51      22.1G        1.8      4.124      2.982          47   1280
        all         10         20     0.0111        0.8     0.0696      0.017
      11/51      22.1G      1.751      3.753      2.608          54   1280
        all         10         20     0.0111        0.8     0.0696      0.017
      12/51      22.2G      1.821      3.793      2.822          56   1280
        all         10         20    0.00953          1      0.145     0.0346
      13/51      22.1G      1.748       3.64      2.721          56   1280
        all         10         20    0.00953          1      0.145     0.0346
      14/51      22.2G      1.711      3.417      2.728          69   1280
        all         10         20    0.00856          1      0.301      0.107
      15/51      22.1G      1.443       3.42      3.097          45   1280
        all         10         20    0.00856          1      0.301      0.107
      16/51      22.2G      1.468      3.528      2.344          38   1280
        all         10         20       0.22       0.65      0.358      0.154
      17/51      22.1G      1.452      2.938      2.311          54   1280
        all         10         20       0.22       0.65      0.358      0.154
      18/51      22.2G      1.321       2.94      3.078          46   1280
        all         10         20
0.322 0.6 0.393 0.17\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 19/51 22.1G 1.385 2.793 2.518 53 1280: 100%|██████████| 1/1 [00:00<00:00, 3.51it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.79it/s]\n\n\n all 10 20 0.322 0.6 0.393 0.17\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 20/51 22.2G 1.287 2.709 2.337 46 1280: 100%|██████████| 1/1 [00:00<00:00, 3.10it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.85it/s]\n\n\n all 10 20 0.385 0.8 0.54 0.237\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 21/51 22.1G 1.191 2.52 2.295 56 1280: 100%|██████████| 1/1 [00:00<00:00, 3.48it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.84it/s]\n\n\n all 10 20 0.385 0.8 0.54 0.237\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 22/51 22.2G 1.354 2.634 2.612 56 1280: 100%|██████████| 1/1 [00:00<00:00, 3.11it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.78it/s]\n\n\n all 10 20 0.463 0.85 0.708 0.39\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 23/51 22.1G 1.266 2.719 2.22 35 1280: 100%|██████████| 1/1 [00:00<00:00, 3.50it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.78it/s]\n\n\n all 10 20 0.463 0.85 0.708 0.39\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 24/51 22.2G 1.189 2.15 2.245 59 1280: 100%|██████████| 1/1 [00:00<00:00, 3.19it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.72it/s]\n\n\n all 10 20 0.479 0.85 0.729 0.418\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 25/51 22.1G 1.069 2.278 1.992 52 1280: 100%|██████████| 1/1 [00:00<00:00, 3.49it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 
100%|██████████| 1/1 [00:00<00:00, 1.73it/s]\n\n\n all 10 20 0.479 0.85 0.729 0.418\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 26/51 22.2G 1.42 2.523 2.593 54 1280: 100%|██████████| 1/1 [00:00<00:00, 3.09it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.59it/s]\n\n\n all 10 20 0.711 0.9 0.841 0.52\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 27/51 22.1G 1.175 2.119 1.997 56 1280: 100%|██████████| 1/1 [00:00<00:00, 3.50it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.60it/s]\n\n\n all 10 20 0.711 0.9 0.841 0.52\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 28/51 22.2G 1.173 2.096 2.219 56 1280: 100%|██████████| 1/1 [00:00<00:00, 3.12it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.53it/s]\n\n\n all 10 20 0.856 0.889 0.933 0.623\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 29/51 22.1G 1.173 2.163 2.431 56 1280: 100%|██████████| 1/1 [00:00<00:00, 3.49it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.52it/s]\n\n\n all 10 20 0.856 0.889 0.933 0.623\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 30/51 22.2G 1.196 2.354 1.996 41 1280: 100%|██████████| 1/1 [00:00<00:00, 3.49it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.51it/s]\n\n\n all 10 20 0.856 0.889 0.933 0.623\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 31/51 22.2G 1.289 2.233 2.543 49 1280: 100%|██████████| 1/1 [00:00<00:00, 3.11it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.49it/s]\n\n\n all 10 20 0.971 0.89 0.966 0.654\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 32/51 22.1G 1.22 2.208 2.129 51 1280: 100%|██████████| 1/1 [00:00<00:00, 
3.50it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.48it/s]\n\n\n all 10 20 0.971 0.89 0.966 0.654\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 33/51 22.2G 1.175 2.047 1.935 45 1280: 100%|██████████| 1/1 [00:00<00:00, 3.50it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.49it/s]\n\n\n all 10 20 0.971 0.89 0.966 0.654\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 34/51 22.2G 1.268 1.903 1.914 62 1280: 100%|██████████| 1/1 [00:00<00:00, 3.05it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.48it/s]\n\n\n all 10 20 0.806 0.93 0.952 0.676\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 35/51 22.1G 1.199 2.076 2.07 52 1280: 100%|██████████| 1/1 [00:00<00:00, 3.49it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.48it/s]\n\n\n all 10 20 0.806 0.93 0.952 0.676\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 36/51 22.2G 1.139 2.094 2.366 51 1280: 100%|██████████| 1/1 [00:00<00:00, 3.48it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.49it/s]\n\n\n all 10 20 0.806 0.93 0.952 0.676\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 37/51 22.2G 1.141 2.146 1.818 49 1280: 100%|██████████| 1/1 [00:00<00:00, 3.10it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.49it/s]\n\n\n all 10 20 0.768 0.94 0.946 0.671\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 38/51 22.1G 1.095 1.823 1.974 60 1280: 100%|██████████| 1/1 [00:00<00:00, 3.49it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.48it/s]\n\n\n all 10 20 0.768 0.94 0.946 0.671\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 39/51 22.2G 
1.037 1.773 2.015 54 1280: 100%|██████████| 1/1 [00:00<00:00, 3.47it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.49it/s]\n\n\n all 10 20 0.768 0.94 0.946 0.671\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 40/51 22.2G 1.049 1.755 1.971 60 1280: 100%|██████████| 1/1 [00:00<00:00, 3.08it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.51it/s]\n\n\n all 10 20 0.87 0.95 0.957 0.669\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 41/51 22.1G 1.16 2.066 2.284 46 1280: 100%|██████████| 1/1 [00:00<00:00, 3.49it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.49it/s]\n\n\n all 10 20 0.87 0.95 0.957 0.669\n\n\n\n\n\nClosing dataloader mosaic\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 42/51 22.2G 1.142 3.141 1.802 20 1280: 100%|██████████| 1/1 [00:00<00:00, 1.20it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.46it/s]\n\n\n all 10 20 0.87 0.95 0.957 0.669\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 43/51 22.2G 1.076 2.939 1.903 20 1280: 100%|██████████| 1/1 [00:00<00:00, 3.10it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.45it/s]\n\n\n all 10 20 0.905 0.93 0.963 0.677\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 44/51 22.1G 1.062 2.816 1.615 20 1280: 100%|██████████| 1/1 [00:00<00:00, 3.51it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.43it/s]\n\n\n all 10 20 0.905 0.93 0.963 0.677\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 45/51 22.2G 1.043 2.889 1.868 20 1280: 100%|██████████| 1/1 [00:00<00:00, 3.49it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.50it/s]\n\n\n all 10 20 0.905 0.93 0.963 
0.677\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 46/51 22.2G 1.308 3.073 1.914 20 1280: 100%|██████████| 1/1 [00:00<00:00, 3.11it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.48it/s]\n\n\n all 10 20 0.895 0.95 0.957 0.701\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 47/51 22.1G 0.9313 2.941 1.799 20 1280: 100%|██████████| 1/1 [00:00<00:00, 3.49it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.50it/s]\n\n\n all 10 20 0.895 0.95 0.957 0.701\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 48/51 22.2G 0.9851 2.767 1.656 20 1280: 100%|██████████| 1/1 [00:00<00:00, 3.48it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.50it/s]\n\n\n all 10 20 0.895 0.95 0.957 0.701\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 49/51 22.2G 1.306 2.876 2.089 20 1280: 100%|██████████| 1/1 [00:00<00:00, 3.47it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.49it/s]\n\n\n all 10 20 0.895 0.95 0.957 0.701\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 50/51 22.2G 1.156 2.817 1.904 20 1280: 100%|██████████| 1/1 [00:00<00:00, 3.12it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.49it/s]\n\n\n all 10 20 0.913 0.999 0.968 0.726\n\n\n\n\n\n\n Epoch GPU_mem box_loss cls_loss dfl_loss Instances Size\n\n\n 51/51 22.1G 1.119 2.846 1.677 20 1280: 100%|██████████| 1/1 [00:00<00:00, 3.49it/s]\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.48it/s]\n\n\n all 10 20 0.913 0.999 0.968 0.726\n\n\n\n\n\n\n51 epochs completed in 0.034 hours.\nOptimizer stripped from runs/obb/train5/weights/last.pt, 43.1MB\nOptimizer stripped from runs/obb/train5/weights/best.pt, 43.1MB\n\nValidating 
runs/obb/train5/weights/best.pt...\nUltralytics 8.3.55 🚀 Python-3.10.15 torch-2.5.0+cu124 CUDA:0 (NVIDIA A100-SXM4-80GB, 81156MiB)\nYOLO11m-obb summary (fused): 322 layers, 20,880,025 parameters, 0 gradients, 71.3 GFLOPs\n\n Class Images Instances Box(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.51it/s]\n\n all 10 20 0.913 0.999 0.968 0.734\n inner_box 10 10 0.909 0.998 0.94 0.592\n outer_box 10 10 0.918 1 0.995 0.875\nSpeed: 0.4ms preprocess, 6.9ms inference, 0.0ms loss, 54.4ms postprocess per image\nResults saved to runs/obb/train5\n\nRun history: [W&B sparkline charts for lr/pg0-pg2, metrics (precision, recall, mAP50, mAP50-95), and train/val losses omitted]\n\nRun summary:\nlr/pg0: 2e-05\nlr/pg1: 2e-05\nlr/pg2: 2e-05\nmetrics/mAP50(B): 0.96773\nmetrics/mAP50-95(B): 0.73366\nmetrics/precision(B): 0.9135\nmetrics/recall(B): 0.99876\nmodel/GFLOPs: 71.889\nmodel/parameters: 20903385\nmodel/speed_PyTorch(ms): 13.092\ntrain/box_loss: 1.11869\ntrain/cls_loss: 2.84641\ntrain/dfl_loss: 1.6767\nval/box_loss: 0.93457\nval/cls_loss: 1.79106\nval/dfl_loss: 1.45649\n\n View run train5 at: 
https://wandb.ai/sustainability-lab/Ultralytics/runs/5wtpe9o7\nView project at: https://wandb.ai/sustainability-lab/Ultralytics\nSynced 4 W&B file(s), 0 media file(s), 5 artifact file(s) and 16 other file(s)\n\n\nFind logs at: ./wandb/run-20250120_153755-5wtpe9o7/logs\n\n\nultralytics.utils.metrics.OBBMetrics object with attributes:\n\nap_class_index: array([0, 1])\nbox: ultralytics.utils.metrics.Metric object\nconfusion_matrix: <ultralytics.utils.metrics.ConfusionMatrix object at 0x7fe64f199420>\ncurves: []\ncurves_results: []\nfitness: np.float64(0.7570705318296267)\nkeys: ['metrics/precision(B)', 'metrics/recall(B)', 'metrics/mAP50(B)', 'metrics/mAP50-95(B)']\nmaps: array([ 0.59184, 0.87549])\nnames: {0: 'inner_box', 1: 'outer_box'}\nplot: True\nresults_dict: {'metrics/precision(B)': np.float64(0.9134993192895953), 'metrics/recall(B)': np.float64(0.9987569029703861), 'metrics/mAP50(B)': np.float64(0.9677272727272728), 'metrics/mAP50-95(B)': np.float64(0.7336642272854437), 'fitness': np.float64(0.7570705318296267)}\nsave_dir: PosixPath('runs/obb/train5')\nspeed: {'preprocess': 0.35784244537353516, 'inference': 6.9373369216918945, 'loss': 0.0010251998901367188, 'postprocess': 54.367613792419434}\nimport numpy as np\nimport supervision as sv\npred_model = YOLO(\"/home/patel_zeel/blog/lab/runs/obb/train5/weights/best.pt\")\nimport os\nfiles = glob(\"/home/patel_zeel/kiln_compass_24/regions/high_res/19/*.png\")\n# np.random.seed(1)\nrandom_file = np.random.choice(files)\nbase_name = os.path.basename(random_file)\nif base_name in [os.path.basename(file) for file in glob(\"../lab/trench_width/images/*.png\")]:\n    print(\"Part of the training dataset\")\n\nresult = pred_model(random_file, imgsz=1280, verbose=False)[0]\ndetection = sv.Detections.from_ultralytics(result)\n\nimg = Image.open(random_file)\nbox_annotator = sv.OrientedBoxAnnotator()\nlabel_annotator = sv.LabelAnnotator()\nannotated_image = box_annotator.annotate(img.copy(), detection)\nannotated_image = 
label_annotator.annotate(annotated_image, detection)\ndisplay(annotated_image)" }, { - "objectID": "posts/pruning_vs_uncertainty.html#train-a-model-on-mnist", - "href": "posts/pruning_vs_uncertainty.html#train-a-model-on-mnist", - "title": "Pruning vs Uncertainty", - "section": "Train a model on MNIST", - "text": "Train a model on MNIST\n\n# Define data transformations\ntransform = transforms.Compose(\n [\n transforms.Resize((224, 224)),\n transforms.Grayscale(num_output_channels=3), # Convert to RGB format\n transforms.ToTensor(),\n transforms.Normalize((0.5,), (0.5,)),\n # convert dtype to float32\n # transforms.Lambda(lambda x: x.to(torch.float32)),\n ]\n)\n\n\n# Load MNIST dataset\ndevice = torch.device(\"cuda\" if torch.cuda.is_available() else \"cpu\")\nprint(f\"Using {device} device\")\ntrain_dataset = datasets.MNIST(\n root=\"./data\", train=True, transform=transform, download=True\n)\nprint(\"Train size\", len(train_dataset))\n\ntrain_dataset = TensorDataset(\n train_dataset.data[..., None]\n .repeat(1, 1, 1, 3)\n .swapaxes(1, 3)\n .swapaxes(2, 3)\n .to(torch.float32)\n .to(device),\n train_dataset.targets.to(device),\n)\ntest_dataset = datasets.MNIST(\n root=\"./data\", train=False, transform=transform, download=True\n)\nprint(\"Test size\", len(test_dataset))\ntest_dataset = TensorDataset(\n test_dataset.data[..., None]\n .repeat(1, 1, 1, 3)\n .swapaxes(1, 3)\n .swapaxes(2, 3)\n .to(torch.float32)\n .to(device),\n test_dataset.targets.to(device),\n)\n\nUsing cuda device\nTrain size 60000\nTest size 10000\n\n\n\ntrain_dataset[0][0].dtype, train_dataset[0][1].dtype\n\n(torch.float32, torch.int64)\n\n\n\n# Define data loaders\nbatch_size = 64\ntrain_loader = DataLoader(train_dataset, batch_size=batch_size, shuffle=True)\ntest_loader = DataLoader(test_dataset, batch_size=batch_size, shuffle=False)\n\n\n# Load pre-trained ResNet model\nresnet = torchvision.models.resnet18(pretrained=True)\nprint(\"Loaded pre-trained ResNet18 
model\")\nprint(resnet.fc.in_features)\n\n# Modify the last fully connected layer to match MNIST's number of classes (10)\nnum_classes = 10\nresnet.fc = nn.Sequential(\n nn.Linear(resnet.fc.in_features, resnet.fc.in_features),\n nn.GELU(),\n nn.Linear(resnet.fc.in_features, num_classes),\n)\n\n# Freeze all layers except the last fully connected layer\nfor name, param in resnet.named_parameters():\n param.requires_grad = False\nresnet.fc.requires_grad_(True)\n\n# Define loss and optimizer\ncriterion = nn.CrossEntropyLoss()\noptimizer = torch.optim.Adam(resnet.parameters(), lr=1e-4)\n\n# Training loop\nnum_epochs = 50\nprint(f\"Training on device {device}\")\nresnet.to(device)\n\nprint(\"Training ResNet18 model\")\nfor epoch in range(num_epochs):\n resnet.train()\n epoch_loss = 0.0\n for images, labels in tqdm(train_loader):\n optimizer.zero_grad()\n outputs = resnet(images)\n loss = criterion(outputs, labels)\n loss.backward()\n optimizer.step()\n epoch_loss += loss.item()\n\n epoch_loss /= len(train_loader)\n\n print(f\"Epoch [{epoch+1}/{num_epochs}] Loss: {epoch_loss:.4f}\")\n\n # Evaluation\n resnet.eval()\n correct = 0\n total = 0\n with torch.no_grad():\n predicted_list = []\n for images, labels in test_loader:\n outputs = resnet(images)\n _, predicted = torch.max(outputs.data, 1)\n total += labels.size(0)\n correct += (predicted == labels).sum().item()\n\n print(f\"Accuracy on the test set: {(100 * correct / total):.2f}%\")\n\n/home/patel_zeel/miniconda3/envs/torch_dt/lib/python3.9/site-packages/torchvision/models/_utils.py:208: UserWarning: The parameter 'pretrained' is deprecated since 0.13 and may be removed in the future, please use 'weights' instead.\n warnings.warn(\n/home/patel_zeel/miniconda3/envs/torch_dt/lib/python3.9/site-packages/torchvision/models/_utils.py:223: UserWarning: Arguments other than a weight enum or `None` for 'weights' are deprecated since 0.13 and may be removed in the future. 
The current behavior is equivalent to passing `weights=ResNet18_Weights.IMAGENET1K_V1`. You can also use `weights=ResNet18_Weights.DEFAULT` to get the most up-to-date weights.\n warnings.warn(msg)\n\n\nLoaded pre-trained ResNet18 model\n512\nTraining on device cuda\nTraining ResNet18 model\n\n\n100%|██████████| 938/938 [00:03<00:00, 242.75it/s]\n\n\nEpoch [1/50] Loss: 1.0877\nAccuracy on the test set: 75.42%\n\n\n100%|██████████| 938/938 [00:03<00:00, 262.53it/s]\n\n\nEpoch [2/50] Loss: 0.8051\nAccuracy on the test set: 76.74%\n\n\n100%|██████████| 938/938 [00:03<00:00, 270.43it/s]\n\n\nEpoch [3/50] Loss: 0.7578\nAccuracy on the test set: 78.27%\n\n\n100%|██████████| 938/938 [00:03<00:00, 265.38it/s]\n\n\nEpoch [4/50] Loss: 0.7290\nAccuracy on the test set: 78.71%\n\n\n100%|██████████| 938/938 [00:03<00:00, 265.51it/s]\n\n\nEpoch [5/50] Loss: 0.7083\nAccuracy on the test set: 79.62%\n\n\n100%|██████████| 938/938 [00:03<00:00, 266.62it/s]\n\n\nEpoch [6/50] Loss: 0.6761\nAccuracy on the test set: 79.82%\n\n\n100%|██████████| 938/938 [00:03<00:00, 268.49it/s]\n\n\nEpoch [7/50] Loss: 0.6627\nAccuracy on the test set: 80.47%\n\n\n100%|██████████| 938/938 [00:03<00:00, 266.33it/s]\n\n\nEpoch [8/50] Loss: 0.6423\nAccuracy on the test set: 80.24%\n\n\n100%|██████████| 938/938 [00:03<00:00, 268.52it/s]\n\n\nEpoch [9/50] Loss: 0.6257\nAccuracy on the test set: 81.11%\n\n\n100%|██████████| 938/938 [00:03<00:00, 269.38it/s]\n\n\nEpoch [10/50] Loss: 0.6131\nAccuracy on the test set: 81.42%\n\n\n100%|██████████| 938/938 [00:03<00:00, 264.77it/s]\n\n\nEpoch [11/50] Loss: 0.5911\nAccuracy on the test set: 82.02%\n\n\n100%|██████████| 938/938 [00:03<00:00, 266.07it/s]\n\n\nEpoch [12/50] Loss: 0.5765\nAccuracy on the test set: 82.32%\n\n\n100%|██████████| 938/938 [00:03<00:00, 262.19it/s]\n\n\nEpoch [13/50] Loss: 0.5611\nAccuracy on the test set: 82.30%\n\n\n100%|██████████| 938/938 [00:04<00:00, 214.62it/s]\n\n\nEpoch [14/50] Loss: 0.5466\nAccuracy on the test set: 
82.49%\n\n\n100%|██████████| 938/938 [00:04<00:00, 219.31it/s]\n\n\nEpoch [15/50] Loss: 0.5358\nAccuracy on the test set: 82.81%\n\n\n100%|██████████| 938/938 [00:04<00:00, 226.53it/s]\n\n\nEpoch [16/50] Loss: 0.5266\nAccuracy on the test set: 83.30%\n\n\n100%|██████████| 938/938 [00:05<00:00, 171.25it/s]\n\n\nEpoch [17/50] Loss: 0.5137\nAccuracy on the test set: 83.37%\n\n\n100%|██████████| 938/938 [00:03<00:00, 278.59it/s]\n\n\nEpoch [18/50] Loss: 0.5051\nAccuracy on the test set: 83.17%\n\n\n100%|██████████| 938/938 [00:03<00:00, 248.82it/s]\n\n\nEpoch [19/50] Loss: 0.4969\nAccuracy on the test set: 83.46%\n\n\n100%|██████████| 938/938 [00:05<00:00, 175.56it/s]\n\n\nEpoch [20/50] Loss: 0.4811\nAccuracy on the test set: 83.76%\n\n\n100%|██████████| 938/938 [00:03<00:00, 277.18it/s]\n\n\nEpoch [21/50] Loss: 0.4714\nAccuracy on the test set: 83.57%\n\n\n100%|██████████| 938/938 [00:03<00:00, 273.71it/s]\n\n\nEpoch [22/50] Loss: 0.4624\nAccuracy on the test set: 84.25%\n\n\n100%|██████████| 938/938 [00:03<00:00, 242.18it/s]\n\n\nEpoch [23/50] Loss: 0.4553\nAccuracy on the test set: 84.27%\n\n\n100%|██████████| 938/938 [00:03<00:00, 279.42it/s]\n\n\nEpoch [24/50] Loss: 0.4506\nAccuracy on the test set: 84.62%\n\n\n100%|██████████| 938/938 [00:03<00:00, 269.21it/s]\n\n\nEpoch [25/50] Loss: 0.4394\nAccuracy on the test set: 83.97%\n\n\n100%|██████████| 938/938 [00:04<00:00, 227.36it/s]\n\n\nEpoch [26/50] Loss: 0.4346\nAccuracy on the test set: 84.16%\n\n\n100%|██████████| 938/938 [00:04<00:00, 222.91it/s]\n\n\nEpoch [27/50] Loss: 0.4271\nAccuracy on the test set: 84.38%\n\n\n100%|██████████| 938/938 [00:04<00:00, 223.68it/s]\n\n\nEpoch [28/50] Loss: 0.4193\nAccuracy on the test set: 84.84%\n\n\n100%|██████████| 938/938 [00:03<00:00, 261.50it/s]\n\n\nEpoch [29/50] Loss: 0.4148\nAccuracy on the test set: 85.05%\n\n\n100%|██████████| 938/938 [00:03<00:00, 246.52it/s]\n\n\nEpoch [30/50] Loss: 0.4040\nAccuracy on the test set: 84.49%\n\n\n100%|██████████| 938/938 
[00:03<00:00, 281.60it/s]\n\n\nEpoch [31/50] Loss: 0.3990\nAccuracy on the test set: 84.59%\n\n\n100%|██████████| 938/938 [00:03<00:00, 278.41it/s]\n\n\nEpoch [32/50] Loss: 0.4016\nAccuracy on the test set: 84.92%\n\n\n100%|██████████| 938/938 [00:03<00:00, 275.60it/s]\n\n\nEpoch [33/50] Loss: 0.3979\nAccuracy on the test set: 85.01%\n\n\n100%|██████████| 938/938 [00:03<00:00, 250.04it/s]\n\n\nEpoch [34/50] Loss: 0.3844\nAccuracy on the test set: 84.82%\n\n\n100%|██████████| 938/938 [00:03<00:00, 280.53it/s]\n\n\nEpoch [35/50] Loss: 0.3789\nAccuracy on the test set: 85.49%\n\n\n100%|██████████| 938/938 [00:03<00:00, 279.26it/s]\n\n\nEpoch [36/50] Loss: 0.3760\nAccuracy on the test set: 85.26%\n\n\n100%|██████████| 938/938 [00:04<00:00, 207.71it/s]\n\n\nEpoch [37/50] Loss: 0.3733\nAccuracy on the test set: 85.36%\n\n\n100%|██████████| 938/938 [00:03<00:00, 265.92it/s]\n\n\nEpoch [38/50] Loss: 0.3655\nAccuracy on the test set: 84.98%\n\n\n100%|██████████| 938/938 [00:03<00:00, 279.79it/s]\n\n\nEpoch [39/50] Loss: 0.3627\nAccuracy on the test set: 85.19%\n\n\n100%|██████████| 938/938 [00:03<00:00, 276.73it/s]\n\n\nEpoch [40/50] Loss: 0.3517\nAccuracy on the test set: 84.78%\n\n\n100%|██████████| 938/938 [00:03<00:00, 278.32it/s]\n\n\nEpoch [41/50] Loss: 0.3526\nAccuracy on the test set: 85.43%\n\n\n100%|██████████| 938/938 [00:03<00:00, 243.70it/s]\n\n\nEpoch [42/50] Loss: 0.3523\nAccuracy on the test set: 85.55%\n\n\n100%|██████████| 938/938 [00:03<00:00, 240.48it/s]\n\n\nEpoch [43/50] Loss: 0.3457\nAccuracy on the test set: 85.02%\n\n\n100%|██████████| 938/938 [00:03<00:00, 274.70it/s]\n\n\nEpoch [44/50] Loss: 0.3447\nAccuracy on the test set: 85.20%\n\n\n100%|██████████| 938/938 [00:03<00:00, 276.08it/s]\n\n\nEpoch [45/50] Loss: 0.3411\nAccuracy on the test set: 85.47%\n\n\n100%|██████████| 938/938 [00:04<00:00, 215.18it/s]\n\n\nEpoch [46/50] Loss: 0.3312\nAccuracy on the test set: 85.55%\n\n\n100%|██████████| 938/938 [00:03<00:00, 244.20it/s]\n\n\nEpoch [47/50] 
Loss: 0.3290\nAccuracy on the test set: 85.52%\n\n\n100%|██████████| 938/938 [00:03<00:00, 267.56it/s]\n\n\nEpoch [48/50] Loss: 0.3277\nAccuracy on the test set: 85.35%\n\n\n100%|██████████| 938/938 [00:03<00:00, 267.91it/s]\n\n\nEpoch [49/50] Loss: 0.3241\nAccuracy on the test set: 85.80%\n\n\n100%|██████████| 938/938 [00:03<00:00, 266.04it/s]\n\n\nEpoch [50/50] Loss: 0.3217\nAccuracy on the test set: 84.93%\n\n\n\n# Evaluation\nresnet.eval()\ncorrect = 0\ntotal = 0\nwith torch.no_grad():\n predicted_list = []\n for images, labels in test_loader:\n outputs = resnet(images)\n _, predicted = torch.max(outputs.data, 1)\n total += labels.size(0)\n correct += (predicted == labels).sum().item()\n softmax_outputs = nn.Softmax(dim=1)(outputs)\n predicted_list.append(softmax_outputs.data.cpu().numpy())\n\nall_predicted = np.concatenate(predicted_list, axis=0)\nprint(f\"Accuracy on the test set: {(100 * correct / total):.2f}%\")\n\nAccuracy on the test set: 84.93%" + "objectID": "lab/scratchpad.html#run-seg", + "href": "lab/scratchpad.html#run-seg", + "title": "Predict", + "section": "Run seg", + "text": "Run seg\n\nmodel = YOLO(\"yolov8m-seg\")\n\n\nmodel.train(data=\"../lab/trench_width/data.yaml\", epochs=51, batch=-1, imgsz=1280, save_period=5)\n\nNew https://pypi.org/project/ultralytics/8.3.64 available 😃 Update with 'pip install -U ultralytics'\nUltralytics 8.3.55 🚀 Python-3.10.15 torch-2.5.0+cu124 CUDA:0 (NVIDIA A100-SXM4-80GB, 81156MiB)\nengine/trainer: task=segment, mode=train, model=yolov8m-seg.pt, data=../lab/trench_width/data.yaml, epochs=51, time=None, patience=100, batch=-1, imgsz=1280, save=True, save_period=5, cache=False, device=None, workers=8, project=None, name=train, exist_ok=False, pretrained=True, optimizer=auto, verbose=True, seed=0, deterministic=True, single_cls=False, rect=False, cos_lr=False, close_mosaic=10, resume=False, amp=True, fraction=1.0, profile=False, freeze=None, multi_scale=False, overlap_mask=True, mask_ratio=4, dropout=0.0, 
val=True, split=val, save_json=False, save_hybrid=False, conf=None, iou=0.7, max_det=300, half=False, dnn=False, plots=True, source=None, vid_stride=1, stream_buffer=False, visualize=False, augment=False, agnostic_nms=False, classes=None, retina_masks=False, embed=None, show=False, save_frames=False, save_txt=False, save_conf=False, save_crop=False, show_labels=True, show_conf=True, show_boxes=True, line_width=None, format=torchscript, keras=False, optimize=False, int8=False, dynamic=False, simplify=True, opset=None, workspace=None, nms=False, lr0=0.01, lrf=0.01, momentum=0.937, weight_decay=0.0005, warmup_epochs=3.0, warmup_momentum=0.8, warmup_bias_lr=0.1, box=7.5, cls=0.5, dfl=1.5, pose=12.0, kobj=1.0, nbs=64, hsv_h=0.015, hsv_s=0.7, hsv_v=0.4, degrees=0.0, translate=0.1, scale=0.5, shear=0.0, perspective=0.0, flipud=0.0, fliplr=0.5, bgr=0.0, mosaic=1.0, mixup=0.0, copy_paste=0.0, copy_paste_mode=flip, auto_augment=randaugment, erasing=0.4, crop_fraction=1.0, cfg=None, tracker=botsort.yaml, save_dir=runs/segment/train\nOverriding model.yaml nc=80 with nc=2\n\n from n params module arguments \n 0 -1 1 1392 ultralytics.nn.modules.conv.Conv [3, 48, 3, 2] \n 1 -1 1 41664 ultralytics.nn.modules.conv.Conv [48, 96, 3, 2] \n 2 -1 2 111360 ultralytics.nn.modules.block.C2f [96, 96, 2, True] \n 3 -1 1 166272 ultralytics.nn.modules.conv.Conv [96, 192, 3, 2] \n 4 -1 4 813312 ultralytics.nn.modules.block.C2f [192, 192, 4, True] \n 5 -1 1 664320 ultralytics.nn.modules.conv.Conv [192, 384, 3, 2] \n 6 -1 4 3248640 ultralytics.nn.modules.block.C2f [384, 384, 4, True] \n 7 -1 1 1991808 ultralytics.nn.modules.conv.Conv [384, 576, 3, 2] \n 8 -1 2 3985920 ultralytics.nn.modules.block.C2f [576, 576, 2, True] \n 9 -1 1 831168 ultralytics.nn.modules.block.SPPF [576, 576, 5] \n 10 -1 1 0 torch.nn.modules.upsampling.Upsample [None, 2, 'nearest'] \n 11 [-1, 6] 1 0 ultralytics.nn.modules.conv.Concat [1] \n 12 -1 2 1993728 ultralytics.nn.modules.block.C2f [960, 384, 2] \n 13 -1 1 0 
torch.nn.modules.upsampling.Upsample [None, 2, 'nearest'] \n 14 [-1, 4] 1 0 ultralytics.nn.modules.conv.Concat [1] \n 15 -1 2 517632 ultralytics.nn.modules.block.C2f [576, 192, 2] \n 16 -1 1 332160 ultralytics.nn.modules.conv.Conv [192, 192, 3, 2] \n 17 [-1, 12] 1 0 ultralytics.nn.modules.conv.Concat [1] \n 18 -1 2 1846272 ultralytics.nn.modules.block.C2f [576, 384, 2] \n 19 -1 1 1327872 ultralytics.nn.modules.conv.Conv [384, 384, 3, 2] \n 20 [-1, 9] 1 0 ultralytics.nn.modules.conv.Concat [1] \n 21 -1 2 4207104 ultralytics.nn.modules.block.C2f [960, 576, 2] \n 22 [15, 18, 21] 1 5160182 ultralytics.nn.modules.head.Segment [2, 32, 192, [192, 384, 576]] \nYOLOv8m-seg summary: 331 layers, 27,240,806 parameters, 27,240,790 gradients, 110.4 GFLOPs\n\nTransferred 531/537 items from pretrained weights\n\n\nTracking run with wandb version 0.18.5\n\n\nRun data is saved locally in /home/patel_zeel/blog/lab/wandb/run-20250120_154343-tvkavy42\n\n\nSyncing run train to Weights & Biases (docs)\n\n\n View project at https://wandb.ai/sustainability-lab/Ultralytics\n\n\n View run at https://wandb.ai/sustainability-lab/Ultralytics/runs/tvkavy42\n\n\nFreezing layer 'model.22.dfl.conv.weight'\nAMP: running Automatic Mixed Precision (AMP) checks...\nAMP: checks passed ✅\n\n\ntrain: Scanning /home/patel_zeel/blog/lab/trench_width/labels.cache... 
10 images, 0 backgrounds, 0 corrupt: 100%|██████████| 10/10 [00:00<?, ?it/s]\n\n\nAutoBatch: Computing optimal batch size for imgsz=1280 at 60.0% CUDA memory utilization.\nAutoBatch: CUDA:0 (NVIDIA A100-SXM4-80GB) 79.25G total, 3.34G reserved, 0.75G allocated, 75.16G free\n\n\n\n\n\n Params GFLOPs GPU_mem (GB) forward (ms) backward (ms) input output\n 27240806 441.6 4.140 29.42 nan (1, 3, 1280, 1280) list\n 27240806 883.2 9.200 36.04 nan (2, 3, 1280, 1280) list\n 27240806 1766 17.132 41.55 nan (4, 3, 1280, 1280) list\n 27240806 3533 33.320 73.9 nan (8, 3, 1280, 1280) list\n 27240806 7065 64.274 142.8 nan (16, 3, 1280, 1280) list\n 27240806 1.413e+04 127.767 281.6 nan (32, 3, 1280, 1280) list\nCUDA out of memory. Tried to allocate 600.00 MiB. GPU 0 has a total capacity of 79.25 GiB of which 167.50 MiB is free. Including non-PyTorch memory, this process has 79.07 GiB memory in use. Of the allocated memory 77.43 GiB is allocated by PyTorch, and 1.08 GiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)\nAutoBatch: Using batch-size 11 for CUDA:0 49.63G/79.25G (63%) ✅\n\n\ntrain: Scanning /home/patel_zeel/blog/lab/trench_width/labels.cache... 10 images, 0 backgrounds, 0 corrupt: 100%|██████████| 10/10 [00:00<?, ?it/s]\nval: Scanning /home/patel_zeel/blog/lab/trench_width/labels.cache... 10 images, 0 backgrounds, 0 corrupt: 100%|██████████| 10/10 [00:00<?, ?it/s]\n\n\nPlotting labels to runs/segment/train/labels.jpg... \noptimizer: 'optimizer=auto' found, ignoring 'lr0=0.01' and 'momentum=0.937' and determining best 'optimizer', 'lr0' and 'momentum' automatically... 
\noptimizer: AdamW(lr=0.001667, momentum=0.9) with parameter groups 86 weight(decay=0.0), 97 weight(decay=0.000515625), 96 bias(decay=0.0)\nImage sizes 1280 train, 1280 val\nUsing 8 dataloader workers\nLogging results to runs/segment/train\nStarting training for 51 epochs...\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 1/51 19.4G 2.736 7.179 6.442 2.486 57 1280: 100%|██████████| 1/1 [00:01<00:00, 1.02s/it]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 2.53it/s]\n\n\n all 10 20 0 0 0 0 0 0 0 0\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 2/51 19.4G 2.783 6.387 6.172 2.612 59 1280: 100%|██████████| 1/1 [00:00<00:00, 2.97it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 3.37it/s]\n\n\n all 10 20 0 0 0 0 0 0 0 0\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 3/51 19.4G 2.929 7.076 6.595 2.577 56 1280: 100%|██████████| 1/1 [00:00<00:00, 2.97it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 3.19it/s]\n\n\n all 10 20 0 0 0 0 0 0 0 0\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 4/51 19.4G 2.726 7.228 6.208 2.532 58 1280: 100%|██████████| 1/1 [00:00<00:00, 3.03it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 4.10it/s]\n\n\n all 10 20 0 0 0 0 0 0 0 0\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 5/51 19.3G 2.499 6.182 7.267 2.396 42 1280: 100%|██████████| 1/1 [00:00<00:00, 3.01it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 3.71it/s]\n\n\n all 10 20 0 0 0 0 0 0 0 0\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 6/51 19.3G 3.034 
8.223 7.171 2.728 48 1280: 100%|██████████| 1/1 [00:00<00:00, 2.99it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 3.97it/s]\n\n\n all 10 20 0 0 0 0 0 0 0 0\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 7/51 19.4G 3.12 7.957 7.105 2.859 55 1280: 100%|██████████| 1/1 [00:00<00:00, 2.99it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 4.12it/s]\n\n\n all 10 20 0 0 0 0 0 0 0 0\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 8/51 19.4G 2.931 7.516 6.602 2.776 56 1280: 100%|██████████| 1/1 [00:00<00:00, 2.89it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.57it/s]\n\n\n all 10 20 0.696 0.15 0.145 0.0523 0.617 0.2 0.0585 0.0102\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 9/51 19.4G 2.629 5.092 6.662 2.429 35 1280: 100%|██████████| 1/1 [00:00<00:00, 2.88it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 2.32it/s]\n\n\n all 10 20 0.13 0.25 0.16 0.0768 0.132 0.1 0.154 0.0326\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 10/51 19.5G 1.897 4.349 4.317 1.925 48 1280: 100%|██████████| 1/1 [00:00<00:00, 3.01it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 2.21it/s]\n\n\n all 10 20 0.452 0.351 0.39 0.175 0.172 0.35 0.25 0.0854\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 11/51 19.6G 1.887 4.137 3.486 1.877 54 1280: 100%|██████████| 1/1 [00:00<00:00, 2.75it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 3.67it/s]\n\n\n all 10 20 0.8 0.15 0.357 0.219 0.105 0.614 0.284 0.0629\n\n\n\n\n\n\n Epoch 
GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 12/51 19.6G 1.43 3.731 3.628 1.507 56 1280: 100%|██████████| 1/1 [00:00<00:00, 3.12it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 3.44it/s]\n\n\n all 10 20 0.8 0.15 0.357 0.219 0.105 0.614 0.284 0.0629\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 13/51 19.7G 1.559 3.545 3.55 1.526 56 1280: 100%|██████████| 1/1 [00:00<00:00, 2.87it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 2.00it/s]\n\n\n all 10 20 0.904 0.3 0.5 0.193 0.904 0.3 0.43 0.108\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 14/51 19.7G 1.39 2.989 2.648 1.478 69 1280: 100%|██████████| 1/1 [00:00<00:00, 3.12it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.83it/s]\n\n\n all 10 20 0.904 0.3 0.5 0.193 0.904 0.3 0.43 0.108\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 15/51 19.6G 1.729 2.974 3.255 1.676 45 1280: 100%|██████████| 1/1 [00:00<00:00, 2.87it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.31it/s]\n\n\n all 10 20 0.455 0.727 0.398 0.158 0.451 0.677 0.349 0.137\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 16/51 19.4G 1.711 2.639 3.226 1.627 38 1280: 100%|██████████| 1/1 [00:00<00:00, 3.28it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.32it/s]\n\n\n all 10 20 0.455 0.727 0.398 0.158 0.451 0.677 0.349 0.137\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 17/51 19.7G 1.329 2.381 2.383 1.373 54 1280: 100%|██████████| 1/1 [00:00<00:00, 2.92it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 
mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.26it/s]\n\n\n all 10 20 0.284 0.75 0.629 0.298 0.299 0.8 0.649 0.261\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 18/51 19.4G 1.228 1.977 2.472 1.299 46 1280: 100%|██████████| 1/1 [00:00<00:00, 3.18it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.22it/s]\n\n\n all 10 20 0.284 0.75 0.629 0.298 0.299 0.8 0.649 0.261\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 19/51 19.7G 1.415 2.453 2.654 1.415 54 1280: 100%|██████████| 1/1 [00:00<00:00, 2.92it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.27it/s]\n\n\n all 10 20 0.792 0.35 0.358 0.212 0.792 0.35 0.346 0.179\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 20/51 19.5G 1.282 1.927 2.378 1.391 46 1280: 100%|██████████| 1/1 [00:00<00:00, 3.16it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.41it/s]\n\n\n all 10 20 0.792 0.35 0.358 0.212 0.792 0.35 0.346 0.179\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 21/51 19.7G 1.192 1.933 2.26 1.32 56 1280: 100%|██████████| 1/1 [00:00<00:00, 2.94it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.64it/s]\n\n\n all 10 20 0.474 0.5 0.538 0.356 0.474 0.5 0.539 0.285\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 22/51 19.6G 1.181 1.726 2.088 1.278 56 1280: 100%|██████████| 1/1 [00:00<00:00, 3.16it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 1.61it/s]\n\n\n all 10 20 0.474 0.5 0.538 0.356 0.474 0.5 0.539 0.285\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 23/51 19.4G 1.3 
2.048 2.568 1.359 35 1280: 100%|██████████| 1/1 [00:00<00:00, 2.89it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 2.50it/s]\n\n\n all 10 20 0.778 0.6 0.659 0.352 0.722 0.55 0.637 0.374\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 24/51 19.6G 1.12 1.541 1.564 1.297 60 1280: 100%|██████████| 1/1 [00:00<00:00, 3.15it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 2.32it/s]\n\n\n all 10 20 0.778 0.6 0.659 0.352 0.722 0.55 0.637 0.374\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 25/51 19.6G 1.043 1.471 1.694 1.228 52 1280: 100%|██████████| 1/1 [00:00<00:00, 2.84it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 3.49it/s]\n\n\n all 10 20 0.415 0.7 0.443 0.248 0.436 0.75 0.445 0.307\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 26/51 19.5G 1.355 1.56 1.975 1.43 54 1280: 100%|██████████| 1/1 [00:00<00:00, 3.24it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 3.66it/s]\n\n\n all 10 20 0.415 0.7 0.443 0.248 0.436 0.75 0.445 0.307\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 27/51 19.7G 1.02 1.483 1.475 1.218 56 1280: 100%|██████████| 1/1 [00:00<00:00, 2.86it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 5.42it/s]\n\n\n all 10 20 0.771 0.55 0.749 0.416 0.771 0.55 0.76 0.51\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 28/51 19.6G 1.079 1.262 1.616 1.309 56 1280: 100%|██████████| 1/1 [00:00<00:00, 3.23it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 5.32it/s]\n\n\n all 10 20 0.771 0.55 0.749 
0.416 0.771 0.55 0.76 0.51\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 29/51 19.7G 1.093 1.497 1.691 1.302 56 1280: 100%|██████████| 1/1 [00:00<00:00, 2.99it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 5.35it/s]\n\n\n all 10 20 0.774 0.75 0.799 0.436 0.774 0.75 0.788 0.514\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 30/51 19.5G 1.135 1.419 1.848 1.339 42 1280: 100%|██████████| 1/1 [00:00<00:00, 3.19it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 5.87it/s]\n\n\n all 10 20 0.774 0.75 0.799 0.436 0.774 0.75 0.788 0.514\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 31/51 19.6G 1.042 1.496 1.698 1.272 50 1280: 100%|██████████| 1/1 [00:00<00:00, 2.99it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 5.63it/s]\n\n\n all 10 20 0.632 0.7 0.652 0.429 0.663 0.75 0.698 0.501\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 32/51 19.6G 1.05 1.488 1.513 1.214 52 1280: 100%|██████████| 1/1 [00:00<00:00, 3.26it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 5.62it/s]\n\n\n all 10 20 0.632 0.7 0.652 0.429 0.663 0.75 0.698 0.501\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 33/51 19.6G 1.043 1.309 1.44 1.269 45 1280: 100%|██████████| 1/1 [00:00<00:00, 3.27it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 5.10it/s]\n\n\n all 10 20 0.632 0.7 0.652 0.429 0.663 0.75 0.698 0.501\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 34/51 19.7G 0.9872 1.389 1.329 1.191 62 1280: 100%|██████████| 1/1 [00:00<00:00, 2.98it/s]\n Class Images Instances Box(P R 
mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 5.38it/s]\n\n\n all 10 20 0.646 0.7 0.724 0.512 0.713 0.8 0.763 0.58\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 35/51 19.6G 1.019 1.202 1.568 1.191 52 1280: 100%|██████████| 1/1 [00:00<00:00, 3.23it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 5.40it/s]\n\n\n all 10 20 0.646 0.7 0.724 0.512 0.713 0.8 0.763 0.58\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 36/51 19.6G 1.029 1.35 1.522 1.232 51 1280: 100%|██████████| 1/1 [00:00<00:00, 3.27it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 5.22it/s]\n\n\n all 10 20 0.646 0.7 0.724 0.512 0.713 0.8 0.763 0.58\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 37/51 19.6G 1.056 1.289 1.494 1.221 50 1280: 100%|██████████| 1/1 [00:00<00:00, 2.96it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 5.64it/s]\n\n\n all 10 20 0.635 0.85 0.803 0.6 0.69 0.9 0.846 0.657\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 38/51 19.6G 0.8953 1.298 1.12 1.068 60 1280: 100%|██████████| 1/1 [00:00<00:00, 3.19it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 5.65it/s]\n\n\n all 10 20 0.635 0.85 0.803 0.6 0.69 0.9 0.846 0.657\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 39/51 19.6G 0.8898 1.057 1.206 1.153 54 1280: 100%|██████████| 1/1 [00:00<00:00, 3.25it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 5.70it/s]\n\n\n all 10 20 0.635 0.85 0.803 0.6 0.69 0.9 0.846 0.657\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 40/51 19.7G 0.9538 1.113 1.201 
1.206 60 1280: 100%|██████████| 1/1 [00:00<00:00, 2.89it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 5.69it/s]\n\n\n all 10 20 0.736 0.8 0.85 0.621 0.812 0.9 0.919 0.683\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 41/51 19.5G 0.9001 1.213 1.314 1.171 46 1280: 100%|██████████| 1/1 [00:00<00:00, 3.22it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 5.36it/s]\n\n\n all 10 20 0.736 0.8 0.85 0.621 0.812 0.9 0.919 0.683\n\n\n\n\n\nClosing dataloader mosaic\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 42/51 19.3G 1.192 1.432 2.694 1.399 20 1280: 100%|██████████| 1/1 [00:00<00:00, 1.16it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 4.71it/s]\n\n\n all 10 20 0.736 0.8 0.85 0.621 0.812 0.9 0.919 0.683\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 43/51 19.3G 1.034 1.252 2.616 1.418 20 1280: 100%|██████████| 1/1 [00:00<00:00, 2.93it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 4.62it/s]\n\n\n all 10 20 0.814 0.85 0.876 0.651 0.866 0.9 0.931 0.702\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 44/51 19.2G 0.9355 1.308 2.01 1.27 20 1280: 100%|██████████| 1/1 [00:00<00:00, 3.24it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 4.74it/s]\n\n\n all 10 20 0.814 0.85 0.876 0.651 0.866 0.9 0.931 0.702\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 45/51 19.3G 0.9141 1.258 2.125 1.412 20 1280: 100%|██████████| 1/1 [00:00<00:00, 3.20it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 5.62it/s]\n\n\n all 10 20 0.814 0.85 
0.876 0.651 0.866 0.9 0.931 0.702\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 46/51 19.3G 1.256 1.421 2.302 1.435 20 1280: 100%|██████████| 1/1 [00:00<00:00, 3.01it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 5.75it/s]\n\n\n all 10 20 0.782 0.829 0.84 0.6 0.804 0.819 0.881 0.687\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 47/51 19.2G 0.8384 1.046 1.981 1.267 20 1280: 100%|██████████| 1/1 [00:00<00:00, 3.27it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 5.70it/s]\n\n\n all 10 20 0.782 0.829 0.84 0.6 0.804 0.819 0.881 0.687\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 48/51 19.3G 1.146 1.197 2.063 1.319 20 1280: 100%|██████████| 1/1 [00:00<00:00, 3.25it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 5.71it/s]\n\n\n all 10 20 0.782 0.829 0.84 0.6 0.804 0.819 0.881 0.687\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 49/51 19.3G 1.144 1.099 2.118 1.411 20 1280: 100%|██████████| 1/1 [00:00<00:00, 2.99it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 5.79it/s]\n\n\n all 10 20 0.811 0.776 0.909 0.675 0.797 0.879 0.926 0.728\n\n\n\n\n\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 50/51 19.2G 1.022 1.027 1.993 1.325 20 1280: 100%|██████████| 1/1 [00:00<00:00, 3.26it/s]\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 5.60it/s]\n\n\n all 10 20 0.811 0.776 0.909 0.675 0.797 0.879 0.926 0.728\n\n Epoch GPU_mem box_loss seg_loss cls_loss dfl_loss Instances Size\n\n\n 51/51 19.3G 1.167 1.066 2.1 1.344 20 1280: 100%|██████████| 1/1 [00:00<00:00, 3.12it/s]\n Class Images Instances Box(P R mAP50 
mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 5.51it/s]\n\n\n all 10 20 0.811 0.776 0.909 0.675 0.797 0.879 0.926 0.728\n\n\n\n\n\n\n51 epochs completed in 0.031 hours.\nOptimizer stripped from runs/segment/train/weights/last.pt, 55.0MB\nOptimizer stripped from runs/segment/train/weights/best.pt, 55.0MB\n\nValidating runs/segment/train/weights/best.pt...\nUltralytics 8.3.55 🚀 Python-3.10.15 torch-2.5.0+cu124 CUDA:0 (NVIDIA A100-SXM4-80GB, 81156MiB)\nYOLOv8m-seg summary (fused): 245 layers, 27,223,542 parameters, 0 gradients, 110.0 GFLOPs\n\n\n Class Images Instances Box(P R mAP50 mAP50-95) Mask(P R mAP50 mAP50-95): 100%|██████████| 1/1 [00:00<00:00, 5.44it/s]\n\n\n all 10 20 0.811 0.776 0.909 0.675 0.797 0.879 0.926 0.727\n inner_box 10 10 0.622 0.7 0.823 0.508 0.594 0.8 0.857 0.678\n outer_box 10 10 1 0.852 0.995 0.842 1 0.959 0.995 0.776\nSpeed: 0.4ms preprocess, 6.9ms inference, 0.0ms loss, 3.5ms postprocess per image\nResults saved to runs/segment/train\n\n\n\n\n\n\nRun 
history:\n\n\n\nlr/pg0\n▁▂▂▃▃▄▄▅▅▅▆▆▇▇▇▇███████████▇▇▇▇▆▆▆▆▅▄▄▃▂\n\n\nlr/pg1\n▁▂▂▃▃▄▅▅▅▆▆▇▇▇▇████████████▇▇▇▆▆▆▆▅▄▄▄▃▂\n\n\nlr/pg2\n▁▂▂▃▃▄▄▅▅▅▆▆▇▇▇▇███████████▇▇▇▆▆▆▆▅▄▄▄▃▂\n\n\nmetrics/mAP50(B)\n▁▁▁▁▂▄▄▄▅▅▄▆▆▄▄▅▆▆▄▄▇▇▇▆▆▇▇▇▇▇█████▇▇▇██\n\n\nmetrics/mAP50(M)\n▁▁▁▁▁▁▁▂▃▃▄▄▄▆▄▅▅▆▆▄▇▇▇▇▆▇▇▇▇▇██████████\n\n\nmetrics/mAP50-95(B)\n▁▁▁▁▁▁▂▂▃▃▃▃▃▃▄▃▃▅▅▅▄▅▅▆▆▅▅▆▆▆▇▇▇▇██▇▇██\n\n\nmetrics/mAP50-95(M)\n▁▁▁▁▁▁▁▁▂▂▂▂▂▂▄▃▄▄▅▅▄▆▆▆▆▆▇▇▇▇██████████\n\n\nmetrics/precision(B)\n▁▁▁▁▁▇▂▅██▅▅▃▃█▅██▅▅███▆▆▇▇▇▆▆▇▇▇███████\n\n\nmetrics/precision(M)\n▁▁▁▁▁▆▂▂▂▂█▄▄▃▃▇▅▅▇▄▇▇▇▇▆▆▇▇▇▆▆▇▇▇██▇▇▇▇\n\n\nmetrics/recall(B)\n▁▁▁▁▁▃▄▂▂▃▇▇▇▇▄▅▅▆▆▇▆▇▇▇▇▇▇▇██████████▇▇\n\n\nmetrics/recall(M)\n▁▁▁▁▁▃▂▄▆▆▃▆▆▇▇▅▅▅▅▇▅▅▇▇▇▇▇▇▇███████▇▇██\n\n\nmodel/GFLOPs\n▁\n\n\nmodel/parameters\n▁\n\n\nmodel/speed_PyTorch(ms)\n▁\n\n\ntrain/box_loss\n▇▇▇▇▆█▇▆▄▄▃▃▄▄▃▂▂▂▂▂▃▂▂▂▂▂▁▂▂▂▁▁▁▂▂▁▁▂▂▂\n\n\ntrain/cls_loss\n▇▇▇▇██▇▇▅▄▄▃▃▃▂▃▂▂▂▃▂▁▁▂▂▁▁▁▁▁▁▁▁▃▃▂▂▂▂▂\n\n\ntrain/dfl_loss\n▇▇▆▇█▆▄▄▃▃▃▂▂▂▂▂▂▂▂▂▂▂▂▂▂▁▁▂▂▁▂▁▂▂▂▂▂▂▂▂\n\n\ntrain/seg_loss\n▇▆▇▇▆██▅▄▄▄▃▃▃▂▂▂▂▂▂▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁\n\n\nval/box_loss\n██████▄▃▃▃▃▃▃▃▃▂▂▂▂▂▃▂▂▂▂▂▁▁▁▁▁▁▁▁▁▁▁▁▁▁\n\n\nval/cls_loss\n▃▃▃▃▃▄▄██▄▄▄▆▆▅▃▅▅██▃▃▃▃▃▂▂▂▂▂▁▁▁▁▁▁▁▁▁▁\n\n\nval/dfl_loss\n██████▄▃▂▂▃▃▃▂▂▂▂▂▂▂▄▂▂▂▂▂▂▂▂▂▁▁▁▁▁▁▁▁▁▁\n\n\nval/seg_loss\n█████▇▆▆▅▅▄▃▃▃▃▃▄▄▃▅▂▂▂▂▂▂▂▂▂▁▁▂▂▂▁▁▁▁▁▁\n\n\n\nRun summary:\n\n\n\nlr/pg0\n2e-05\n\n\nlr/pg1\n2e-05\n\n\nlr/pg2\n2e-05\n\n\nmetrics/mAP50(B)\n0.90923\n\n\nmetrics/mAP50(M)\n0.92577\n\n\nmetrics/mAP50-95(B)\n0.67538\n\n\nmetrics/mAP50-95(M)\n0.72695\n\n\nmetrics/precision(B)\n0.81082\n\n\nmetrics/precision(M)\n0.79694\n\n\nmetrics/recall(B)\n0.77614\n\n\nmetrics/recall(M)\n0.87927\n\n\nmodel/GFLOPs\n110.395\n\n\nmodel/parameters\n27240806\n\n\nmodel/speed_PyTorch(ms)\n13.361\n\n\ntrain/box_loss\n1.16748\n\n\ntrain/cls_loss\n2.09975\n\n\ntrain/dfl_loss\n1.34438\n\n\ntrain/seg_loss\n1.0662\n\n\nval/box_loss\n1.02605\n\n\nval/cls_loss\n2.05508\n\n\nval/dfl_loss\n1.23359\n\n\nval/seg_loss\n1.12731\n\n\n\n\n\n\n View run train at: 
https://wandb.ai/sustainability-lab/Ultralytics/runs/tvkavy42 View project at: https://wandb.ai/sustainability-lab/UltralyticsSynced 5 W&B file(s), 0 media file(s), 20 artifact file(s) and 28 other file(s)\n\n\nFind logs at: ./wandb/run-20250120_154343-tvkavy42/logs\n\n\nultralytics.utils.metrics.SegmentMetrics object with attributes:\n\nap_class_index: array([0, 1])\nbox: ultralytics.utils.metrics.Metric object\nconfusion_matrix: <ultralytics.utils.metrics.ConfusionMatrix object at 0x7fe84e12c040>\ncurves: ['Precision-Recall(B)', 'F1-Confidence(B)', 'Precision-Confidence(B)', 'Recall-Confidence(B)', 'Precision-Recall(M)', 'F1-Confidence(M)', 'Precision-Confidence(M)', 'Recall-Confidence(M)']\ncurves_results: [ … per-curve x/y arrays over 1000 confidence thresholds (shape (2, 1000) per curve); full printed arrays omitted … ]
0.66266, 0.66366, 0.66466, 0.66567, 0.66667, 0.66767, 0.66867, 0.66967, 0.67067, 0.67167,\n 0.67267, 0.67367, 0.67467, 0.67568, 0.67668, 0.67768, 0.67868, 0.67968, 0.68068, 0.68168, 0.68268, 0.68368, 0.68468, 0.68569, 0.68669, 0.68769, 0.68869, 0.68969, 0.69069, 0.69169, 0.69269, 0.69369, 0.69469, 0.6957,\n 0.6967, 0.6977, 0.6987, 0.6997, 0.7007, 0.7017, 0.7027, 0.7037, 0.7047, 0.70571, 0.70671, 0.70771, 0.70871, 0.70971, 0.71071, 0.71171, 0.71271, 0.71371, 0.71471, 0.71572, 0.71672, 0.71772, 0.71872, 0.71972,\n 0.72072, 0.72172, 0.72272, 0.72372, 0.72472, 0.72573, 0.72673, 0.72773, 0.72873, 0.72973, 0.73073, 0.73173, 0.73273, 0.73373, 0.73473, 0.73574, 0.73674, 0.73774, 0.73874, 0.73974, 0.74074, 0.74174, 0.74274, 0.74374,\n 0.74474, 0.74575, 0.74675, 0.74775, 0.74875, 0.74975, 0.75075, 0.75175, 0.75275, 0.75375, 0.75475, 0.75576, 0.75676, 0.75776, 0.75876, 0.75976, 0.76076, 0.76176, 0.76276, 0.76376, 0.76476, 0.76577, 0.76677, 0.76777,\n 0.76877, 0.76977, 0.77077, 0.77177, 0.77277, 0.77377, 0.77477, 0.77578, 0.77678, 0.77778, 0.77878, 0.77978, 0.78078, 0.78178, 0.78278, 0.78378, 0.78478, 0.78579, 0.78679, 0.78779, 0.78879, 0.78979, 0.79079, 0.79179,\n 0.79279, 0.79379, 0.79479, 0.7958, 0.7968, 0.7978, 0.7988, 0.7998, 0.8008, 0.8018, 0.8028, 0.8038, 0.8048, 0.80581, 0.80681, 0.80781, 0.80881, 0.80981, 0.81081, 0.81181, 0.81281, 0.81381, 0.81481, 0.81582,\n 0.81682, 0.81782, 0.81882, 0.81982, 0.82082, 0.82182, 0.82282, 0.82382, 0.82482, 0.82583, 0.82683, 0.82783, 0.82883, 0.82983, 0.83083, 0.83183, 0.83283, 0.83383, 0.83483, 0.83584, 0.83684, 0.83784, 0.83884, 0.83984,\n 0.84084, 0.84184, 0.84284, 0.84384, 0.84484, 0.84585, 0.84685, 0.84785, 0.84885, 0.84985, 0.85085, 0.85185, 0.85285, 0.85385, 0.85485, 0.85586, 0.85686, 0.85786, 0.85886, 0.85986, 0.86086, 0.86186, 0.86286, 0.86386,\n 0.86486, 0.86587, 0.86687, 0.86787, 0.86887, 0.86987, 0.87087, 0.87187, 0.87287, 0.87387, 0.87487, 0.87588, 0.87688, 0.87788, 0.87888, 0.87988, 0.88088, 0.88188, 0.88288, 0.88388, 
0.88488, 0.88589, 0.88689, 0.88789,\n 0.88889, 0.88989, 0.89089, 0.89189, 0.89289, 0.89389, 0.89489, 0.8959, 0.8969, 0.8979, 0.8989, 0.8999, 0.9009, 0.9019, 0.9029, 0.9039, 0.9049, 0.90591, 0.90691, 0.90791, 0.90891, 0.90991, 0.91091, 0.91191,\n 0.91291, 0.91391, 0.91491, 0.91592, 0.91692, 0.91792, 0.91892, 0.91992, 0.92092, 0.92192, 0.92292, 0.92392, 0.92492, 0.92593, 0.92693, 0.92793, 0.92893, 0.92993, 0.93093, 0.93193, 0.93293, 0.93393, 0.93493, 0.93594,\n 0.93694, 0.93794, 0.93894, 0.93994, 0.94094, 0.94194, 0.94294, 0.94394, 0.94494, 0.94595, 0.94695, 0.94795, 0.94895, 0.94995, 0.95095, 0.95195, 0.95295, 0.95395, 0.95495, 0.95596, 0.95696, 0.95796, 0.95896, 0.95996,\n 0.96096, 0.96196, 0.96296, 0.96396, 0.96496, 0.96597, 0.96697, 0.96797, 0.96897, 0.96997, 0.97097, 0.97197, 0.97297, 0.97397, 0.97497, 0.97598, 0.97698, 0.97798, 0.97898, 0.97998, 0.98098, 0.98198, 0.98298, 0.98398,\n 0.98498, 0.98599, 0.98699, 0.98799, 0.98899, 0.98999, 0.99099, 0.99199, 0.99299, 0.99399, 0.99499, 0.996, 0.997, 0.998, 0.999, 1]), array([[ 0.07326, 0.07326, 0.07326, ..., 0.33565, 0, 0],\n [ 0.0072807, 0.0072807, 0.0072807, ..., 0.30149, 0.2242, 0]], shape=(2, 1000)), 'Confidence', 'F1'], [array([ 0, 0.001001, 0.002002, 0.003003, 0.004004, 0.005005, 0.006006, 0.007007, 0.008008, 0.009009, 0.01001, 0.011011, 0.012012, 0.013013, 0.014014, 0.015015, 0.016016, 0.017017, 0.018018, 0.019019, 0.02002, 0.021021, 0.022022, 0.023023,\n 0.024024, 0.025025, 0.026026, 0.027027, 0.028028, 0.029029, 0.03003, 0.031031, 0.032032, 0.033033, 0.034034, 0.035035, 0.036036, 0.037037, 0.038038, 0.039039, 0.04004, 0.041041, 0.042042, 0.043043, 0.044044, 0.045045, 0.046046, 0.047047,\n 0.048048, 0.049049, 0.05005, 0.051051, 0.052052, 0.053053, 0.054054, 0.055055, 0.056056, 0.057057, 0.058058, 0.059059, 0.06006, 0.061061, 0.062062, 0.063063, 0.064064, 0.065065, 0.066066, 0.067067, 0.068068, 0.069069, 0.07007, 0.071071,\n 0.072072, 0.073073, 0.074074, 0.075075, 0.076076, 0.077077, 0.078078, 0.079079, 
0.08008, 0.081081, 0.082082, 0.083083, 0.084084, 0.085085, 0.086086, 0.087087, 0.088088, 0.089089, 0.09009, 0.091091, 0.092092, 0.093093, 0.094094, 0.095095,\n 0.096096, 0.097097, 0.098098, 0.099099, 0.1001, 0.1011, 0.1021, 0.1031, 0.1041, 0.10511, 0.10611, 0.10711, 0.10811, 0.10911, 0.11011, 0.11111, 0.11211, 0.11311, 0.11411, 0.11512, 0.11612, 0.11712, 0.11812, 0.11912,\n 0.12012, 0.12112, 0.12212, 0.12312, 0.12412, 0.12513, 0.12613, 0.12713, 0.12813, 0.12913, 0.13013, 0.13113, 0.13213, 0.13313, 0.13413, 0.13514, 0.13614, 0.13714, 0.13814, 0.13914, 0.14014, 0.14114, 0.14214, 0.14314,\n 0.14414, 0.14515, 0.14615, 0.14715, 0.14815, 0.14915, 0.15015, 0.15115, 0.15215, 0.15315, 0.15415, 0.15516, 0.15616, 0.15716, 0.15816, 0.15916, 0.16016, 0.16116, 0.16216, 0.16316, 0.16416, 0.16517, 0.16617, 0.16717,\n 0.16817, 0.16917, 0.17017, 0.17117, 0.17217, 0.17317, 0.17417, 0.17518, 0.17618, 0.17718, 0.17818, 0.17918, 0.18018, 0.18118, 0.18218, 0.18318, 0.18418, 0.18519, 0.18619, 0.18719, 0.18819, 0.18919, 0.19019, 0.19119,\n 0.19219, 0.19319, 0.19419, 0.1952, 0.1962, 0.1972, 0.1982, 0.1992, 0.2002, 0.2012, 0.2022, 0.2032, 0.2042, 0.20521, 0.20621, 0.20721, 0.20821, 0.20921, 0.21021, 0.21121, 0.21221, 0.21321, 0.21421, 0.21522,\n 0.21622, 0.21722, 0.21822, 0.21922, 0.22022, 0.22122, 0.22222, 0.22322, 0.22422, 0.22523, 0.22623, 0.22723, 0.22823, 0.22923, 0.23023, 0.23123, 0.23223, 0.23323, 0.23423, 0.23524, 0.23624, 0.23724, 0.23824, 0.23924,\n 0.24024, 0.24124, 0.24224, 0.24324, 0.24424, 0.24525, 0.24625, 0.24725, 0.24825, 0.24925, 0.25025, 0.25125, 0.25225, 0.25325, 0.25425, 0.25526, 0.25626, 0.25726, 0.25826, 0.25926, 0.26026, 0.26126, 0.26226, 0.26326,\n 0.26426, 0.26527, 0.26627, 0.26727, 0.26827, 0.26927, 0.27027, 0.27127, 0.27227, 0.27327, 0.27427, 0.27528, 0.27628, 0.27728, 0.27828, 0.27928, 0.28028, 0.28128, 0.28228, 0.28328, 0.28428, 0.28529, 0.28629, 0.28729,\n 0.28829, 0.28929, 0.29029, 0.29129, 0.29229, 0.29329, 0.29429, 0.2953, 0.2963, 0.2973, 0.2983, 0.2993, 
0.3003, 0.3013, 0.3023, 0.3033, 0.3043, 0.30531, 0.30631, 0.30731, 0.30831, 0.30931, 0.31031, 0.31131,\n 0.31231, 0.31331, 0.31431, 0.31532, 0.31632, 0.31732, 0.31832, 0.31932, 0.32032, 0.32132, 0.32232, 0.32332, 0.32432, 0.32533, 0.32633, 0.32733, 0.32833, 0.32933, 0.33033, 0.33133, 0.33233, 0.33333, 0.33433, 0.33534,\n 0.33634, 0.33734, 0.33834, 0.33934, 0.34034, 0.34134, 0.34234, 0.34334, 0.34434, 0.34535, 0.34635, 0.34735, 0.34835, 0.34935, 0.35035, 0.35135, 0.35235, 0.35335, 0.35435, 0.35536, 0.35636, 0.35736, 0.35836, 0.35936,\n 0.36036, 0.36136, 0.36236, 0.36336, 0.36436, 0.36537, 0.36637, 0.36737, 0.36837, 0.36937, 0.37037, 0.37137, 0.37237, 0.37337, 0.37437, 0.37538, 0.37638, 0.37738, 0.37838, 0.37938, 0.38038, 0.38138, 0.38238, 0.38338,\n 0.38438, 0.38539, 0.38639, 0.38739, 0.38839, 0.38939, 0.39039, 0.39139, 0.39239, 0.39339, 0.39439, 0.3954, 0.3964, 0.3974, 0.3984, 0.3994, 0.4004, 0.4014, 0.4024, 0.4034, 0.4044, 0.40541, 0.40641, 0.40741,\n 0.40841, 0.40941, 0.41041, 0.41141, 0.41241, 0.41341, 0.41441, 0.41542, 0.41642, 0.41742, 0.41842, 0.41942, 0.42042, 0.42142, 0.42242, 0.42342, 0.42442, 0.42543, 0.42643, 0.42743, 0.42843, 0.42943, 0.43043, 0.43143,\n 0.43243, 0.43343, 0.43443, 0.43544, 0.43644, 0.43744, 0.43844, 0.43944, 0.44044, 0.44144, 0.44244, 0.44344, 0.44444, 0.44545, 0.44645, 0.44745, 0.44845, 0.44945, 0.45045, 0.45145, 0.45245, 0.45345, 0.45445, 0.45546,\n 0.45646, 0.45746, 0.45846, 0.45946, 0.46046, 0.46146, 0.46246, 0.46346, 0.46446, 0.46547, 0.46647, 0.46747, 0.46847, 0.46947, 0.47047, 0.47147, 0.47247, 0.47347, 0.47447, 0.47548, 0.47648, 0.47748, 0.47848, 0.47948,\n 0.48048, 0.48148, 0.48248, 0.48348, 0.48448, 0.48549, 0.48649, 0.48749, 0.48849, 0.48949, 0.49049, 0.49149, 0.49249, 0.49349, 0.49449, 0.4955, 0.4965, 0.4975, 0.4985, 0.4995, 0.5005, 0.5015, 0.5025, 0.5035,\n 0.5045, 0.50551, 0.50651, 0.50751, 0.50851, 0.50951, 0.51051, 0.51151, 0.51251, 0.51351, 0.51451, 0.51552, 0.51652, 0.51752, 0.51852, 0.51952, 0.52052, 0.52152, 0.52252, 
0.52352, 0.52452, 0.52553, 0.52653, 0.52753,\n 0.52853, 0.52953, 0.53053, 0.53153, 0.53253, 0.53353, 0.53453, 0.53554, 0.53654, 0.53754, 0.53854, 0.53954, 0.54054, 0.54154, 0.54254, 0.54354, 0.54454, 0.54555, 0.54655, 0.54755, 0.54855, 0.54955, 0.55055, 0.55155,\n 0.55255, 0.55355, 0.55455, 0.55556, 0.55656, 0.55756, 0.55856, 0.55956, 0.56056, 0.56156, 0.56256, 0.56356, 0.56456, 0.56557, 0.56657, 0.56757, 0.56857, 0.56957, 0.57057, 0.57157, 0.57257, 0.57357, 0.57457, 0.57558,\n 0.57658, 0.57758, 0.57858, 0.57958, 0.58058, 0.58158, 0.58258, 0.58358, 0.58458, 0.58559, 0.58659, 0.58759, 0.58859, 0.58959, 0.59059, 0.59159, 0.59259, 0.59359, 0.59459, 0.5956, 0.5966, 0.5976, 0.5986, 0.5996,\n 0.6006, 0.6016, 0.6026, 0.6036, 0.6046, 0.60561, 0.60661, 0.60761, 0.60861, 0.60961, 0.61061, 0.61161, 0.61261, 0.61361, 0.61461, 0.61562, 0.61662, 0.61762, 0.61862, 0.61962, 0.62062, 0.62162, 0.62262, 0.62362,\n 0.62462, 0.62563, 0.62663, 0.62763, 0.62863, 0.62963, 0.63063, 0.63163, 0.63263, 0.63363, 0.63463, 0.63564, 0.63664, 0.63764, 0.63864, 0.63964, 0.64064, 0.64164, 0.64264, 0.64364, 0.64464, 0.64565, 0.64665, 0.64765,\n 0.64865, 0.64965, 0.65065, 0.65165, 0.65265, 0.65365, 0.65465, 0.65566, 0.65666, 0.65766, 0.65866, 0.65966, 0.66066, 0.66166, 0.66266, 0.66366, 0.66466, 0.66567, 0.66667, 0.66767, 0.66867, 0.66967, 0.67067, 0.67167,\n 0.67267, 0.67367, 0.67467, 0.67568, 0.67668, 0.67768, 0.67868, 0.67968, 0.68068, 0.68168, 0.68268, 0.68368, 0.68468, 0.68569, 0.68669, 0.68769, 0.68869, 0.68969, 0.69069, 0.69169, 0.69269, 0.69369, 0.69469, 0.6957,\n 0.6967, 0.6977, 0.6987, 0.6997, 0.7007, 0.7017, 0.7027, 0.7037, 0.7047, 0.70571, 0.70671, 0.70771, 0.70871, 0.70971, 0.71071, 0.71171, 0.71271, 0.71371, 0.71471, 0.71572, 0.71672, 0.71772, 0.71872, 0.71972,\n 0.72072, 0.72172, 0.72272, 0.72372, 0.72472, 0.72573, 0.72673, 0.72773, 0.72873, 0.72973, 0.73073, 0.73173, 0.73273, 0.73373, 0.73473, 0.73574, 0.73674, 0.73774, 0.73874, 0.73974, 0.74074, 0.74174, 0.74274, 0.74374,\n 0.74474, 
0.74575, 0.74675, 0.74775, 0.74875, 0.74975, 0.75075, 0.75175, 0.75275, 0.75375, 0.75475, 0.75576, 0.75676, 0.75776, 0.75876, 0.75976, 0.76076, 0.76176, 0.76276, 0.76376, 0.76476, 0.76577, 0.76677, 0.76777,\n 0.76877, 0.76977, 0.77077, 0.77177, 0.77277, 0.77377, 0.77477, 0.77578, 0.77678, 0.77778, 0.77878, 0.77978, 0.78078, 0.78178, 0.78278, 0.78378, 0.78478, 0.78579, 0.78679, 0.78779, 0.78879, 0.78979, 0.79079, 0.79179,\n 0.79279, 0.79379, 0.79479, 0.7958, 0.7968, 0.7978, 0.7988, 0.7998, 0.8008, 0.8018, 0.8028, 0.8038, 0.8048, 0.80581, 0.80681, 0.80781, 0.80881, 0.80981, 0.81081, 0.81181, 0.81281, 0.81381, 0.81481, 0.81582,\n 0.81682, 0.81782, 0.81882, 0.81982, 0.82082, 0.82182, 0.82282, 0.82382, 0.82482, 0.82583, 0.82683, 0.82783, 0.82883, 0.82983, 0.83083, 0.83183, 0.83283, 0.83383, 0.83483, 0.83584, 0.83684, 0.83784, 0.83884, 0.83984,\n 0.84084, 0.84184, 0.84284, 0.84384, 0.84484, 0.84585, 0.84685, 0.84785, 0.84885, 0.84985, 0.85085, 0.85185, 0.85285, 0.85385, 0.85485, 0.85586, 0.85686, 0.85786, 0.85886, 0.85986, 0.86086, 0.86186, 0.86286, 0.86386,\n 0.86486, 0.86587, 0.86687, 0.86787, 0.86887, 0.86987, 0.87087, 0.87187, 0.87287, 0.87387, 0.87487, 0.87588, 0.87688, 0.87788, 0.87888, 0.87988, 0.88088, 0.88188, 0.88288, 0.88388, 0.88488, 0.88589, 0.88689, 0.88789,\n 0.88889, 0.88989, 0.89089, 0.89189, 0.89289, 0.89389, 0.89489, 0.8959, 0.8969, 0.8979, 0.8989, 0.8999, 0.9009, 0.9019, 0.9029, 0.9039, 0.9049, 0.90591, 0.90691, 0.90791, 0.90891, 0.90991, 0.91091, 0.91191,\n 0.91291, 0.91391, 0.91491, 0.91592, 0.91692, 0.91792, 0.91892, 0.91992, 0.92092, 0.92192, 0.92292, 0.92392, 0.92492, 0.92593, 0.92693, 0.92793, 0.92893, 0.92993, 0.93093, 0.93193, 0.93293, 0.93393, 0.93493, 0.93594,\n 0.93694, 0.93794, 0.93894, 0.93994, 0.94094, 0.94194, 0.94294, 0.94394, 0.94494, 0.94595, 0.94695, 0.94795, 0.94895, 0.94995, 0.95095, 0.95195, 0.95295, 0.95395, 0.95495, 0.95596, 0.95696, 0.95796, 0.95896, 0.95996,\n 0.96096, 0.96196, 0.96296, 0.96396, 0.96496, 0.96597, 0.96697, 
0.96797, 0.96897, 0.96997, 0.97097, 0.97197, 0.97297, 0.97397, 0.97497, 0.97598, 0.97698, 0.97798, 0.97898, 0.97998, 0.98098, 0.98198, 0.98298, 0.98398,\n 0.98498, 0.98599, 0.98699, 0.98799, 0.98899, 0.98999, 0.99099, 0.99199, 0.99299, 0.99399, 0.99499, 0.996, 0.997, 0.998, 0.999, 1]), array([[ 0.038023, 0.038023, 0.038023, ..., 1, 1, 1],\n [ 0.0036536, 0.0036536, 0.0036536, ..., 1, 1, 1]], shape=(2, 1000)), 'Confidence', 'Precision'], [array([ 0, 0.001001, 0.002002, 0.003003, 0.004004, 0.005005, 0.006006, 0.007007, 0.008008, 0.009009, 0.01001, 0.011011, 0.012012, 0.013013, 0.014014, 0.015015, 0.016016, 0.017017, 0.018018, 0.019019, 0.02002, 0.021021, 0.022022, 0.023023,\n 0.024024, 0.025025, 0.026026, 0.027027, 0.028028, 0.029029, 0.03003, 0.031031, 0.032032, 0.033033, 0.034034, 0.035035, 0.036036, 0.037037, 0.038038, 0.039039, 0.04004, 0.041041, 0.042042, 0.043043, 0.044044, 0.045045, 0.046046, 0.047047,\n 0.048048, 0.049049, 0.05005, 0.051051, 0.052052, 0.053053, 0.054054, 0.055055, 0.056056, 0.057057, 0.058058, 0.059059, 0.06006, 0.061061, 0.062062, 0.063063, 0.064064, 0.065065, 0.066066, 0.067067, 0.068068, 0.069069, 0.07007, 0.071071,\n 0.072072, 0.073073, 0.074074, 0.075075, 0.076076, 0.077077, 0.078078, 0.079079, 0.08008, 0.081081, 0.082082, 0.083083, 0.084084, 0.085085, 0.086086, 0.087087, 0.088088, 0.089089, 0.09009, 0.091091, 0.092092, 0.093093, 0.094094, 0.095095,\n 0.096096, 0.097097, 0.098098, 0.099099, 0.1001, 0.1011, 0.1021, 0.1031, 0.1041, 0.10511, 0.10611, 0.10711, 0.10811, 0.10911, 0.11011, 0.11111, 0.11211, 0.11311, 0.11411, 0.11512, 0.11612, 0.11712, 0.11812, 0.11912,\n 0.12012, 0.12112, 0.12212, 0.12312, 0.12412, 0.12513, 0.12613, 0.12713, 0.12813, 0.12913, 0.13013, 0.13113, 0.13213, 0.13313, 0.13413, 0.13514, 0.13614, 0.13714, 0.13814, 0.13914, 0.14014, 0.14114, 0.14214, 0.14314,\n 0.14414, 0.14515, 0.14615, 0.14715, 0.14815, 0.14915, 0.15015, 0.15115, 0.15215, 0.15315, 0.15415, 0.15516, 0.15616, 0.15716, 0.15816, 0.15916, 0.16016, 0.16116, 
0.16216, 0.16316, 0.16416, 0.16517, 0.16617, 0.16717,\n 0.16817, 0.16917, 0.17017, 0.17117, 0.17217, 0.17317, 0.17417, 0.17518, 0.17618, 0.17718, 0.17818, 0.17918, 0.18018, 0.18118, 0.18218, 0.18318, 0.18418, 0.18519, 0.18619, 0.18719, 0.18819, 0.18919, 0.19019, 0.19119,\n 0.19219, 0.19319, 0.19419, 0.1952, 0.1962, 0.1972, 0.1982, 0.1992, 0.2002, 0.2012, 0.2022, 0.2032, 0.2042, 0.20521, 0.20621, 0.20721, 0.20821, 0.20921, 0.21021, 0.21121, 0.21221, 0.21321, 0.21421, 0.21522,\n 0.21622, 0.21722, 0.21822, 0.21922, 0.22022, 0.22122, 0.22222, 0.22322, 0.22422, 0.22523, 0.22623, 0.22723, 0.22823, 0.22923, 0.23023, 0.23123, 0.23223, 0.23323, 0.23423, 0.23524, 0.23624, 0.23724, 0.23824, 0.23924,\n 0.24024, 0.24124, 0.24224, 0.24324, 0.24424, 0.24525, 0.24625, 0.24725, 0.24825, 0.24925, 0.25025, 0.25125, 0.25225, 0.25325, 0.25425, 0.25526, 0.25626, 0.25726, 0.25826, 0.25926, 0.26026, 0.26126, 0.26226, 0.26326,\n 0.26426, 0.26527, 0.26627, 0.26727, 0.26827, 0.26927, 0.27027, 0.27127, 0.27227, 0.27327, 0.27427, 0.27528, 0.27628, 0.27728, 0.27828, 0.27928, 0.28028, 0.28128, 0.28228, 0.28328, 0.28428, 0.28529, 0.28629, 0.28729,\n 0.28829, 0.28929, 0.29029, 0.29129, 0.29229, 0.29329, 0.29429, 0.2953, 0.2963, 0.2973, 0.2983, 0.2993, 0.3003, 0.3013, 0.3023, 0.3033, 0.3043, 0.30531, 0.30631, 0.30731, 0.30831, 0.30931, 0.31031, 0.31131,\n 0.31231, 0.31331, 0.31431, 0.31532, 0.31632, 0.31732, 0.31832, 0.31932, 0.32032, 0.32132, 0.32232, 0.32332, 0.32432, 0.32533, 0.32633, 0.32733, 0.32833, 0.32933, 0.33033, 0.33133, 0.33233, 0.33333, 0.33433, 0.33534,\n 0.33634, 0.33734, 0.33834, 0.33934, 0.34034, 0.34134, 0.34234, 0.34334, 0.34434, 0.34535, 0.34635, 0.34735, 0.34835, 0.34935, 0.35035, 0.35135, 0.35235, 0.35335, 0.35435, 0.35536, 0.35636, 0.35736, 0.35836, 0.35936,\n 0.36036, 0.36136, 0.36236, 0.36336, 0.36436, 0.36537, 0.36637, 0.36737, 0.36837, 0.36937, 0.37037, 0.37137, 0.37237, 0.37337, 0.37437, 0.37538, 0.37638, 0.37738, 0.37838, 0.37938, 0.38038, 0.38138, 0.38238, 0.38338,\n 
0.38438, 0.38539, 0.38639, 0.38739, 0.38839, 0.38939, 0.39039, 0.39139, 0.39239, 0.39339, 0.39439, 0.3954, 0.3964, 0.3974, 0.3984, 0.3994, 0.4004, 0.4014, 0.4024, 0.4034, 0.4044, 0.40541, 0.40641, 0.40741,\n 0.40841, 0.40941, 0.41041, 0.41141, 0.41241, 0.41341, 0.41441, 0.41542, 0.41642, 0.41742, 0.41842, 0.41942, 0.42042, 0.42142, 0.42242, 0.42342, 0.42442, 0.42543, 0.42643, 0.42743, 0.42843, 0.42943, 0.43043, 0.43143,\n 0.43243, 0.43343, 0.43443, 0.43544, 0.43644, 0.43744, 0.43844, 0.43944, 0.44044, 0.44144, 0.44244, 0.44344, 0.44444, 0.44545, 0.44645, 0.44745, 0.44845, 0.44945, 0.45045, 0.45145, 0.45245, 0.45345, 0.45445, 0.45546,\n 0.45646, 0.45746, 0.45846, 0.45946, 0.46046, 0.46146, 0.46246, 0.46346, 0.46446, 0.46547, 0.46647, 0.46747, 0.46847, 0.46947, 0.47047, 0.47147, 0.47247, 0.47347, 0.47447, 0.47548, 0.47648, 0.47748, 0.47848, 0.47948,\n 0.48048, 0.48148, 0.48248, 0.48348, 0.48448, 0.48549, 0.48649, 0.48749, 0.48849, 0.48949, 0.49049, 0.49149, 0.49249, 0.49349, 0.49449, 0.4955, 0.4965, 0.4975, 0.4985, 0.4995, 0.5005, 0.5015, 0.5025, 0.5035,\n 0.5045, 0.50551, 0.50651, 0.50751, 0.50851, 0.50951, 0.51051, 0.51151, 0.51251, 0.51351, 0.51451, 0.51552, 0.51652, 0.51752, 0.51852, 0.51952, 0.52052, 0.52152, 0.52252, 0.52352, 0.52452, 0.52553, 0.52653, 0.52753,\n 0.52853, 0.52953, 0.53053, 0.53153, 0.53253, 0.53353, 0.53453, 0.53554, 0.53654, 0.53754, 0.53854, 0.53954, 0.54054, 0.54154, 0.54254, 0.54354, 0.54454, 0.54555, 0.54655, 0.54755, 0.54855, 0.54955, 0.55055, 0.55155,\n 0.55255, 0.55355, 0.55455, 0.55556, 0.55656, 0.55756, 0.55856, 0.55956, 0.56056, 0.56156, 0.56256, 0.56356, 0.56456, 0.56557, 0.56657, 0.56757, 0.56857, 0.56957, 0.57057, 0.57157, 0.57257, 0.57357, 0.57457, 0.57558,\n 0.57658, 0.57758, 0.57858, 0.57958, 0.58058, 0.58158, 0.58258, 0.58358, 0.58458, 0.58559, 0.58659, 0.58759, 0.58859, 0.58959, 0.59059, 0.59159, 0.59259, 0.59359, 0.59459, 0.5956, 0.5966, 0.5976, 0.5986, 0.5996,\n 0.6006, 0.6016, 0.6026, 0.6036, 0.6046, 0.60561, 0.60661, 
0.60761, 0.60861, 0.60961, 0.61061, 0.61161, 0.61261, 0.61361, 0.61461, 0.61562, 0.61662, 0.61762, 0.61862, 0.61962, 0.62062, 0.62162, 0.62262, 0.62362,\n 0.62462, 0.62563, 0.62663, 0.62763, 0.62863, 0.62963, 0.63063, 0.63163, 0.63263, 0.63363, 0.63463, 0.63564, 0.63664, 0.63764, 0.63864, 0.63964, 0.64064, 0.64164, 0.64264, 0.64364, 0.64464, 0.64565, 0.64665, 0.64765,\n 0.64865, 0.64965, 0.65065, 0.65165, 0.65265, 0.65365, 0.65465, 0.65566, 0.65666, 0.65766, 0.65866, 0.65966, 0.66066, 0.66166, 0.66266, 0.66366, 0.66466, 0.66567, 0.66667, 0.66767, 0.66867, 0.66967, 0.67067, 0.67167,\n 0.67267, 0.67367, 0.67467, 0.67568, 0.67668, 0.67768, 0.67868, 0.67968, 0.68068, 0.68168, 0.68268, 0.68368, 0.68468, 0.68569, 0.68669, 0.68769, 0.68869, 0.68969, 0.69069, 0.69169, 0.69269, 0.69369, 0.69469, 0.6957,\n 0.6967, 0.6977, 0.6987, 0.6997, 0.7007, 0.7017, 0.7027, 0.7037, 0.7047, 0.70571, 0.70671, 0.70771, 0.70871, 0.70971, 0.71071, 0.71171, 0.71271, 0.71371, 0.71471, 0.71572, 0.71672, 0.71772, 0.71872, 0.71972,\n 0.72072, 0.72172, 0.72272, 0.72372, 0.72472, 0.72573, 0.72673, 0.72773, 0.72873, 0.72973, 0.73073, 0.73173, 0.73273, 0.73373, 0.73473, 0.73574, 0.73674, 0.73774, 0.73874, 0.73974, 0.74074, 0.74174, 0.74274, 0.74374,\n 0.74474, 0.74575, 0.74675, 0.74775, 0.74875, 0.74975, 0.75075, 0.75175, 0.75275, 0.75375, 0.75475, 0.75576, 0.75676, 0.75776, 0.75876, 0.75976, 0.76076, 0.76176, 0.76276, 0.76376, 0.76476, 0.76577, 0.76677, 0.76777,\n 0.76877, 0.76977, 0.77077, 0.77177, 0.77277, 0.77377, 0.77477, 0.77578, 0.77678, 0.77778, 0.77878, 0.77978, 0.78078, 0.78178, 0.78278, 0.78378, 0.78478, 0.78579, 0.78679, 0.78779, 0.78879, 0.78979, 0.79079, 0.79179,\n 0.79279, 0.79379, 0.79479, 0.7958, 0.7968, 0.7978, 0.7988, 0.7998, 0.8008, 0.8018, 0.8028, 0.8038, 0.8048, 0.80581, 0.80681, 0.80781, 0.80881, 0.80981, 0.81081, 0.81181, 0.81281, 0.81381, 0.81481, 0.81582,\n 0.81682, 0.81782, 0.81882, 0.81982, 0.82082, 0.82182, 0.82282, 0.82382, 0.82482, 0.82583, 0.82683, 0.82783, 0.82883, 
0.82983, 0.83083, 0.83183, 0.83283, 0.83383, 0.83483, 0.83584, 0.83684, 0.83784, 0.83884, 0.83984,\n 0.84084, 0.84184, 0.84284, 0.84384, 0.84484, 0.84585, 0.84685, 0.84785, 0.84885, 0.84985, 0.85085, 0.85185, 0.85285, 0.85385, 0.85485, 0.85586, 0.85686, 0.85786, 0.85886, 0.85986, 0.86086, 0.86186, 0.86286, 0.86386,\n 0.86486, 0.86587, 0.86687, 0.86787, 0.86887, 0.86987, 0.87087, 0.87187, 0.87287, 0.87387, 0.87487, 0.87588, 0.87688, 0.87788, 0.87888, 0.87988, 0.88088, 0.88188, 0.88288, 0.88388, 0.88488, 0.88589, 0.88689, 0.88789,\n 0.88889, 0.88989, 0.89089, 0.89189, 0.89289, 0.89389, 0.89489, 0.8959, 0.8969, 0.8979, 0.8989, 0.8999, 0.9009, 0.9019, 0.9029, 0.9039, 0.9049, 0.90591, 0.90691, 0.90791, 0.90891, 0.90991, 0.91091, 0.91191,\n 0.91291, 0.91391, 0.91491, 0.91592, 0.91692, 0.91792, 0.91892, 0.91992, 0.92092, 0.92192, 0.92292, 0.92392, 0.92492, 0.92593, 0.92693, 0.92793, 0.92893, 0.92993, 0.93093, 0.93193, 0.93293, 0.93393, 0.93493, 0.93594,\n 0.93694, 0.93794, 0.93894, 0.93994, 0.94094, 0.94194, 0.94294, 0.94394, 0.94494, 0.94595, 0.94695, 0.94795, 0.94895, 0.94995, 0.95095, 0.95195, 0.95295, 0.95395, 0.95495, 0.95596, 0.95696, 0.95796, 0.95896, 0.95996,\n 0.96096, 0.96196, 0.96296, 0.96396, 0.96496, 0.96597, 0.96697, 0.96797, 0.96897, 0.96997, 0.97097, 0.97197, 0.97297, 0.97397, 0.97497, 0.97598, 0.97698, 0.97798, 0.97898, 0.97998, 0.98098, 0.98198, 0.98298, 0.98398,\n 0.98498, 0.98599, 0.98699, 0.98799, 0.98899, 0.98999, 0.99099, 0.99199, 0.99299, 0.99399, 0.99499, 0.996, 0.997, 0.998, 0.999, 1]), array([[ 1, 1, 1, ..., 0.20167, 0, 0],\n [ 1, 1, 1, ..., 0.1775, 0.12625, 0]], shape=(2, 1000)), 'Confidence', 'Recall'], [array([ 0, 0.001001, 0.002002, 0.003003, 0.004004, 0.005005, 0.006006, 0.007007, 0.008008, 0.009009, 0.01001, 0.011011, 0.012012, 0.013013, 0.014014, 0.015015, 0.016016, 0.017017, 0.018018, 0.019019, 0.02002, 0.021021, 0.022022, 0.023023,\n 0.024024, 0.025025, 0.026026, 0.027027, 0.028028, 0.029029, 0.03003, 0.031031, 0.032032, 0.033033, 
0.034034, 0.035035, 0.036036, 0.037037, 0.038038, 0.039039, 0.04004, 0.041041, 0.042042, 0.043043, 0.044044, 0.045045, 0.046046, 0.047047,\n 0.048048, 0.049049, 0.05005, 0.051051, 0.052052, 0.053053, 0.054054, 0.055055, 0.056056, 0.057057, 0.058058, 0.059059, 0.06006, 0.061061, 0.062062, 0.063063, 0.064064, 0.065065, 0.066066, 0.067067, 0.068068, 0.069069, 0.07007, 0.071071,\n 0.072072, 0.073073, 0.074074, 0.075075, 0.076076, 0.077077, 0.078078, 0.079079, 0.08008, 0.081081, 0.082082, 0.083083, 0.084084, 0.085085, 0.086086, 0.087087, 0.088088, 0.089089, 0.09009, 0.091091, 0.092092, 0.093093, 0.094094, 0.095095,\n 0.096096, 0.097097, 0.098098, 0.099099, 0.1001, 0.1011, 0.1021, 0.1031, 0.1041, 0.10511, 0.10611, 0.10711, 0.10811, 0.10911, 0.11011, 0.11111, 0.11211, 0.11311, 0.11411, 0.11512, 0.11612, 0.11712, 0.11812, 0.11912,\n 0.12012, 0.12112, 0.12212, 0.12312, 0.12412, 0.12513, 0.12613, 0.12713, 0.12813, 0.12913, 0.13013, 0.13113, 0.13213, 0.13313, 0.13413, 0.13514, 0.13614, 0.13714, 0.13814, 0.13914, 0.14014, 0.14114, 0.14214, 0.14314,\n 0.14414, 0.14515, 0.14615, 0.14715, 0.14815, 0.14915, 0.15015, 0.15115, 0.15215, 0.15315, 0.15415, 0.15516, 0.15616, 0.15716, 0.15816, 0.15916, 0.16016, 0.16116, 0.16216, 0.16316, 0.16416, 0.16517, 0.16617, 0.16717,\n 0.16817, 0.16917, 0.17017, 0.17117, 0.17217, 0.17317, 0.17417, 0.17518, 0.17618, 0.17718, 0.17818, 0.17918, 0.18018, 0.18118, 0.18218, 0.18318, 0.18418, 0.18519, 0.18619, 0.18719, 0.18819, 0.18919, 0.19019, 0.19119,\n 0.19219, 0.19319, 0.19419, 0.1952, 0.1962, 0.1972, 0.1982, 0.1992, 0.2002, 0.2012, 0.2022, 0.2032, 0.2042, 0.20521, 0.20621, 0.20721, 0.20821, 0.20921, 0.21021, 0.21121, 0.21221, 0.21321, 0.21421, 0.21522,\n 0.21622, 0.21722, 0.21822, 0.21922, 0.22022, 0.22122, 0.22222, 0.22322, 0.22422, 0.22523, 0.22623, 0.22723, 0.22823, 0.22923, 0.23023, 0.23123, 0.23223, 0.23323, 0.23423, 0.23524, 0.23624, 0.23724, 0.23824, 0.23924,\n 0.24024, 0.24124, 0.24224, 0.24324, 0.24424, 0.24525, 0.24625, 0.24725, 0.24825, 
0.24925, 0.25025, 0.25125, 0.25225, 0.25325, 0.25425, 0.25526, 0.25626, 0.25726, 0.25826, 0.25926, 0.26026, 0.26126, 0.26226, 0.26326,\n 0.26426, 0.26527, 0.26627, 0.26727, 0.26827, 0.26927, 0.27027, 0.27127, 0.27227, 0.27327, 0.27427, 0.27528, 0.27628, 0.27728, 0.27828, 0.27928, 0.28028, 0.28128, 0.28228, 0.28328, 0.28428, 0.28529, 0.28629, 0.28729,\n 0.28829, 0.28929, 0.29029, 0.29129, 0.29229, 0.29329, 0.29429, 0.2953, 0.2963, 0.2973, 0.2983, 0.2993, 0.3003, 0.3013, 0.3023, 0.3033, 0.3043, 0.30531, 0.30631, 0.30731, 0.30831, 0.30931, 0.31031, 0.31131,\n 0.31231, 0.31331, 0.31431, 0.31532, 0.31632, 0.31732, 0.31832, 0.31932, 0.32032, 0.32132, 0.32232, 0.32332, 0.32432, 0.32533, 0.32633, 0.32733, 0.32833, 0.32933, 0.33033, 0.33133, 0.33233, 0.33333, 0.33433, 0.33534,\n 0.33634, 0.33734, 0.33834, 0.33934, 0.34034, 0.34134, 0.34234, 0.34334, 0.34434, 0.34535, 0.34635, 0.34735, 0.34835, 0.34935, 0.35035, 0.35135, 0.35235, 0.35335, 0.35435, 0.35536, 0.35636, 0.35736, 0.35836, 0.35936,\n 0.36036, 0.36136, 0.36236, 0.36336, 0.36436, 0.36537, 0.36637, 0.36737, 0.36837, 0.36937, 0.37037, 0.37137, 0.37237, 0.37337, 0.37437, 0.37538, 0.37638, 0.37738, 0.37838, 0.37938, 0.38038, 0.38138, 0.38238, 0.38338,\n 0.38438, 0.38539, 0.38639, 0.38739, 0.38839, 0.38939, 0.39039, 0.39139, 0.39239, 0.39339, 0.39439, 0.3954, 0.3964, 0.3974, 0.3984, 0.3994, 0.4004, 0.4014, 0.4024, 0.4034, 0.4044, 0.40541, 0.40641, 0.40741,\n 0.40841, 0.40941, 0.41041, 0.41141, 0.41241, 0.41341, 0.41441, 0.41542, 0.41642, 0.41742, 0.41842, 0.41942, 0.42042, 0.42142, 0.42242, 0.42342, 0.42442, 0.42543, 0.42643, 0.42743, 0.42843, 0.42943, 0.43043, 0.43143,\n 0.43243, 0.43343, 0.43443, 0.43544, 0.43644, 0.43744, 0.43844, 0.43944, 0.44044, 0.44144, 0.44244, 0.44344, 0.44444, 0.44545, 0.44645, 0.44745, 0.44845, 0.44945, 0.45045, 0.45145, 0.45245, 0.45345, 0.45445, 0.45546,\n 0.45646, 0.45746, 0.45846, 0.45946, 0.46046, 0.46146, 0.46246, 0.46346, 0.46446, 0.46547, 0.46647, 0.46747, 0.46847, 0.46947, 0.47047, 
...(elided: dense per-threshold curve arrays for the Recall-Precision, Confidence-F1, Confidence-Precision, and Confidence-Recall plots)...
0.94995, 0.95095, 0.95195, 0.95295, 0.95395, 0.95495, 0.95596, 0.95696, 0.95796, 0.95896, 0.95996,\n 0.96096, 0.96196, 0.96296, 0.96396, 0.96496, 0.96597, 0.96697, 0.96797, 0.96897, 0.96997, 0.97097, 0.97197, 0.97297, 0.97397, 0.97497, 0.97598, 0.97698, 0.97798, 0.97898, 0.97998, 0.98098, 0.98198, 0.98298, 0.98398,\n 0.98498, 0.98599, 0.98699, 0.98799, 0.98899, 0.98999, 0.99099, 0.99199, 0.99299, 0.99399, 0.99499, 0.996, 0.997, 0.998, 0.999, 1]), array([[ 1, 1, 1, ..., 0.20167, 0, 0],\n [ 1, 1, 1, ..., 0.1775, 0.12625, 0]], shape=(2, 1000)), 'Confidence', 'Recall']]\nfitness: np.float64(1.4455970167060048)\nkeys: ['metrics/precision(B)', 'metrics/recall(B)', 'metrics/mAP50(B)', 'metrics/mAP50-95(B)', 'metrics/precision(M)', 'metrics/recall(M)', 'metrics/mAP50(M)', 'metrics/mAP50-95(M)']\nmaps: array([ 1.1864, 1.6182])\nnames: {0: 'inner_box', 1: 'outer_box'}\nplot: True\nresults_dict: {'metrics/precision(B)': np.float64(0.8108231337398004), 'metrics/recall(B)': np.float64(0.7761444955593891), 'metrics/mAP50(B)': np.float64(0.9092307692307693), 'metrics/mAP50-95(B)': np.float64(0.6753775662745483), 'metrics/precision(M)': np.float64(0.7969436046359123), 'metrics/recall(M)': np.float64(0.8792737357610776), 'metrics/mAP50(M)': np.float64(0.9257692307692309), 'metrics/mAP50-95(M)': np.float64(0.726952452287679), 'fitness': np.float64(1.4455970167060048)}\nsave_dir: PosixPath('runs/segment/train')\nseg: ultralytics.utils.metrics.Metric object\nspeed: {'preprocess': 0.447845458984375, 'inference': 6.885099411010742, 'loss': 0.0012874603271484375, 'postprocess': 3.5344362258911133}\ntask: 'segment'" }, { - "objectID": "posts/pruning_vs_uncertainty.html#check-calibration", - "href": "posts/pruning_vs_uncertainty.html#check-calibration", - "title": "Pruning vs Uncertainty", - "section": "Check calibration", - "text": "Check calibration\n\ntest_dataset.tensors[1].cpu().numpy().shape, all_predicted.shape\n\n((10000,), (10000, 10))\n\n\n\n# Compute calibration curve\n\nfig, 
axes = plt.subplots(2, 5, figsize=(20, 5))\naxes = axes.flatten()\n\nfor target_class in range(10):\n true_labels = test_dataset.tensors[1].cpu().numpy() == target_class\n predicted_probabilities = all_predicted[:, target_class]\n\n prob_true, prob_pred = calibration_curve(\n true_labels, predicted_probabilities, n_bins=10\n )\n\n # Plot calibration curve\n axes[target_class].plot(prob_pred, prob_true, marker=\"o\", label=\"Calibration Curve\")\n axes[target_class].plot(\n [0, 1], [0, 1], linestyle=\"--\", label=\"Perfectly Calibrated\"\n )\n axes[target_class].set_xlabel(\"Mean Predicted Probability\")\n axes[target_class].set_ylabel(\"Observed Accuracy\")\n # ece_score = compute_ece(predicted_probabilities, true_labels, num_bins=10)\n # print(f\"Expected Calibration Error (ECE): {ece_score:.4f}\")\n\n axes[target_class].set_title(f\"Class {target_class}\")\nplt.tight_layout()\n\n# Compute expected calibration error (ECE)\nece = compute_ece(all_predicted, test_dataset.tensors[1].cpu().numpy(), 10)\nece\n\n0.021885250088572478\n\n\n\n\n\n\n\n\n\n\nprint(\n classification_report(\n test_dataset.tensors[1].cpu().numpy(), all_predicted.argmax(axis=1)\n )\n)\n\n precision recall f1-score support\n\n 0 0.89 0.91 0.90 980\n 1 0.91 0.97 0.94 1135\n 2 0.75 0.72 0.73 1032\n 3 0.77 0.74 0.75 1010\n 4 0.82 0.88 0.85 982\n 5 0.68 0.70 0.69 892\n 6 0.85 0.84 0.85 958\n 7 0.80 0.79 0.79 1028\n 8 0.76 0.75 0.75 974\n 9 0.81 0.74 0.77 1009\n\n accuracy 0.81 10000\n macro avg 0.80 0.80 0.80 10000\nweighted avg 0.80 0.81 0.80 10000\n\n\n\n\ndef compute_ece(predicted_probs, true_labels, num_bins=10):\n # Ensure predicted_probs is a NumPy array\n predicted_probs = np.array(predicted_probs)\n true_labels = np.array(true_labels)\n\n # Calculate predicted class labels\n predicted_labels = np.argmax(predicted_probs, axis=1)\n\n # Calculate confidence scores (maximum predicted probability)\n confidence_scores = np.max(predicted_probs, axis=1)\n\n # Create bins for confidence scores\n 
bin_edges = np.linspace(0, 1, num_bins + 1)\n\n ece = 0.0\n total_samples = len(true_labels)\n\n for bin_idx in range(num_bins):\n # Find examples whose confidence scores fall into the current bin\n bin_mask = (confidence_scores >= bin_edges[bin_idx]) & (\n confidence_scores < bin_edges[bin_idx + 1]\n )\n\n if np.any(bin_mask):\n # Calculate the accuracy of predictions in this bin\n bin_accuracy = np.mean(predicted_labels[bin_mask] == true_labels[bin_mask])\n\n # Calculate the fraction of examples in this bin\n bin_fraction = np.sum(bin_mask) / total_samples\n\n # Calculate the calibration error in this bin\n bin_error = np.abs(bin_accuracy - np.mean(confidence_scores[bin_mask]))\n\n # Weighted contribution to ECE\n ece += bin_fraction * bin_error\n\n return ece" + "objectID": "lab/scratchpad.html#predict", + "href": "lab/scratchpad.html#predict", + "title": "Predict", + "section": "Predict", + "text": "Predict\n\nimport numpy as np\nimport supervision as sv\npred_model = YOLO(\"/home/patel_zeel/blog/lab/runs/segment/train/weights/best.pt\")\n\n\nimport os\nfiles = glob(\"/home/patel_zeel/kiln_compass_24/regions/high_res/19/*.png\")\n# np.random.seed(1)\nrandom_file = np.random.choice(files)\nbase_name = os.path.basename(random_file)\nif base_name in [os.path.basename(file) for file in glob(\"../lab/trench_width/images/*.png\")]:\n print(\"Part of the training dataset\")\n\nresult = pred_model(random_file, imgsz=1280, verbose=False)[0]\ndetection = sv.Detections.from_ultralytics(result)\n\nimg = Image.open(random_file)\nbox_annotator = sv.MaskAnnotator()\nlabel_annotator = sv.LabelAnnotator()\nannotated_image = box_annotator.annotate(img.copy(), detection)\nannotated_image = label_annotator.annotate(annotated_image, detection)\ndisplay(annotated_image)" }, { - "objectID": "posts/pruning_vs_uncertainty.html#does-mc-dropout-help-with-calibration", - "href": "posts/pruning_vs_uncertainty.html#does-mc-dropout-help-with-calibration", - "title": "Pruning vs Uncertainty", 
- "section": "Does MC-dropout help with calibration?", - "text": "Does MC-dropout help with calibration?" + "objectID": "posts/PurpleAir.html", + "href": "posts/PurpleAir.html", + "title": "Download low-cost data from OpenAQ", + "section": "", + "text": "import requests\nimport pandas as pd\nimport geopandas as gpd\nfrom shapely.geometry import Point\nfrom glob import glob\n\nfrom geopy.geocoders import Nominatim\n\nOpenAQ has an aws s3 bucket that contains all the data to download for free. This is a guide to download the data from the bucket. If you have enough space and bandwidth, aws s3 commands are the fastest way to download the data. If you don’t have enough space/bandwidth or you want to download specific data only then follow along.\nAcknowledgement: Much help is taken from ChatGPT for some complex Linux commands.\nWe will mostly use the following commands:\naws s3 ls\naws s3 cp\nScenario 1: We want to download PurpleAir sensors data for Delhi for entire 2022. I am taking Delhi’s example since there are far lesser sensors in Delhi than in the US. So, it will be easier for this demo.\nSome statistics that I have calculated for PurpleAir sensors in US are as follows:\n\n\n\nCountry\nNumber of sensors\nTotal size\nyears\n\n\n\n\nUSA\n22497\n90.945 GB\n2018, 2019, 2020, 2021, 2022, 2023\n\n\n\nLet’s see how to calculate these statistics for India.\n\n!aws s3 --no-sign-request ls s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/ > /tmp/in.txt\n\nThe output of above command contains all sensor IDs within India. These sensor IDs are assigned by OpenAQ and they are different from PurpleAir “sensor_index”. 
The output looks like the following.\n\nwith open('/tmp/in.txt', 'r') as f:\n lines = [line.strip() for line in f.readlines()]\n \nprint(f\"Number of locations = {len(lines)}\\n\")\nprint(*lines[:3], sep='\\n')\n\nNumber of locations = 623\n\nPRE locationid=160485/\nPRE locationid=218334/\nPRE locationid=218336/\n\n\nCounting the total size of the files:\n\n!aws s3 --no-sign-request ls s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/ --recursive > /tmp/in_files.txt\n\n\n!awk '{ sum += $3 } END { print (sum/1024/1024/1024) \" GB\"}' /tmp/in_files.txt\n\n1.20947 GB\n\n\nNumber of years:\n\n!cat /tmp/in_files.txt | grep -oP 'year=\\K\\w+' | sort | uniq\n\n2019\n2020\n2021\n2022\n2023\n\n\nLet’s find out which of these sensors belong to Delhi.\n\n!cat /tmp/in.txt | grep -oP 'locationid=\\K\\w+' | sort | uniq > /tmp/in_locations.txt\n\nNow we use the OpenAQ REST API. It has a limit of 300 requests per 5 minutes; once the limit is exceeded, it returns an error.\n\nurl = 'https://api.openaq.org/v2/locations'\nparams = {\n 'limit': 1000,\n 'country_id': 'IN',\n \"modelName\": \"PurpleAir Sensor\"\n}\n\n# Request headers\nheaders = {\n 'accept': 'application/json'\n}\n\nresponse = requests.get(url, params=params, headers=headers)\n\nif response.status_code == 200:\n data = response.json()\nelse:\n print(response.status_code, response.reason)\n\n500 Internal Server Error\n\n\n\ndf = pd.DataFrame(data['results'])\ndf.shape\n\n\n---------------------------------------------------------------------------\nNameError Traceback (most recent call last)\nCell In[10], line 1\n----> 1 df = pd.DataFrame(data['results'])\n 2 df.shape\n\nNameError: name 'data' is not defined\n\n\n\nThe following method works, but it takes too much time since it queries an online service for each point:\n\ndef get_city_name(coords):\n latitude = coords[\"latitude\"]\n longitude = coords[\"longitude\"]\n geolocator = Nominatim(user_agent=\"myGeocoder\") # Replace \"myGeocoder\" with your desired user agent\n location = 
geolocator.reverse((latitude, longitude), exactly_one=True)\n\n if location:\n address = location.raw.get('address', {})\n city = address.get('city', '')\n return city\n\n return None\n\ndf[\"city\"] = df[\"coordinates\"].apply(get_city_name)\ndf\n\n\n\n\n\n\n\n\nid\ncity\nname\nentity\ncountry\nsources\nisMobile\nisAnalysis\nparameters\nsensorType\ncoordinates\nlastUpdated\nfirstUpdated\nmeasurements\nbounds\nmanufacturers\n\n\n\n\n0\n318146\nGangtok\nNASA_AQCS_201_cpa\nNone\nIN\nNone\nFalse\nNone\n[{'id': 135, 'unit': 'particles/cm³', 'count':...\nNone\n{'latitude': 27.31013, 'longitude': 88.59687}\n2023-07-26T02:34:05+00:00\n2022-04-23T07:42:24+00:00\n1168071\n[88.59687, 27.31013, 88.59687, 27.31013]\n[{'modelName': 'PurpleAir Sensor', 'manufactur...\n\n\n1\n220706\nGangtok\nNASA_AQCS_139\nNone\nIN\nNone\nFalse\nNone\n[{'id': 100, 'unit': 'c', 'count': 28600, 'ave...\nNone\n{'latitude': 27.310116, 'longitude': 88.59682}\n2023-07-26T02:34:04+00:00\n2021-02-17T09:56:06+00:00\n2205110\n[88.59682, 27.310116, 88.59682, 27.310116]\n[{'modelName': 'PurpleAir Sensor', 'manufactur...\n\n\n2\n66673\nHisar\nNASA_AQCS_160\nNone\nIN\nNone\nFalse\nNone\n[{'id': 132, 'unit': 'mb', 'count': 28651, 'av...\nNone\n{'latitude': 29.146254, 'longitude': 75.72236}\n2023-07-26T02:33:56+00:00\n2021-01-19T23:59:16+00:00\n2396934\n[75.72236, 29.146254, 75.72236, 29.146254]\n[{'modelName': 'PurpleAir Sensor', 'manufactur...\n\n\n3\n72977\nBengaluru\nUT Sensor 101\nNone\nIN\nNone\nFalse\nNone\n[{'id': 126, 'unit': 'particles/cm³', 'count':...\nNone\n{'latitude': 13.045313, 'longitude': 77.573395}\n2023-07-26T02:33:55+00:00\n2021-01-14T01:18:23+00:00\n2498544\n[77.573395, 13.045313, 77.573395, 13.045313]\n[{'modelName': 'PurpleAir Sensor', 'manufactur...\n\n\n4\n235916\nBengaluru\nUW Sensor 311\nNone\nIN\nNone\nFalse\nNone\n[{'id': 130, 'unit': 'particles/cm³', 'count':...\nNone\n{'latitude': 13.048528, 'longitude': 
77.582275}\n2023-07-26T02:33:49+00:00\n2021-09-16T12:29:52+00:00\n1773924\n[77.582275, 13.048528, 77.582275, 13.048528]\n[{'modelName': 'PurpleAir Sensor', 'manufactur...\n\n\n...\n...\n...\n...\n...\n...\n...\n...\n...\n...\n...\n...\n...\n...\n...\n...\n...\n\n\n604\n73922\nNew Delhi District\nUS Embassy A\nNone\nIN\nNone\nFalse\nNone\n[{'id': 2, 'unit': 'µg/m³', 'count': 347, 'ave...\nNone\n{'latitude': 28.5979, 'longitude': 77.1847}\n2021-02-05T18:25:34+00:00\n2021-01-08T12:12:29+00:00\n2082\n[77.1847, 28.5979, 77.1847, 28.5979]\n[{'modelName': 'PurpleAir Sensor', 'manufactur...\n\n\n605\n73924\nNew Delhi District\nUS Embassy B\nNone\nIN\nNone\nFalse\nNone\n[{'id': 2, 'unit': 'µg/m³', 'count': 347, 'ave...\nNone\n{'latitude': 28.5982, 'longitude': 77.1837}\n2021-02-05T18:23:59+00:00\n2021-01-08T12:11:29+00:00\n2082\n[77.1837, 28.5982, 77.1837, 28.5982]\n[{'modelName': 'PurpleAir Sensor', 'manufactur...\n\n\n606\n219370\nBhubaneswar Municipal Corporation\nBhubaneswar India\nNone\nIN\nNone\nFalse\nNone\n[{'id': 2, 'unit': 'µg/m³', 'count': 180, 'ave...\nNone\n{'latitude': 20.2853, 'longitude': 85.7685}\n2021-02-03T10:18:41+00:00\n2021-02-03T03:02:39+00:00\n1080\n[85.7685, 20.2853, 85.7685, 20.2853]\n[{'modelName': 'PurpleAir Sensor', 'manufactur...\n\n\n607\n71465\nGurugram District\nNASA_AQCS_152\nNone\nIN\nNone\nFalse\nNone\n[{'id': 1, 'unit': 'µg/m³', 'count': 1, 'avera...\nNone\n{'latitude': 28.4522, 'longitude': 77.0949}\n2021-01-14T01:18:54+00:00\n2021-01-14T01:18:54+00:00\n6\n[77.0949, 28.4522, 77.0949, 28.4522]\n[{'modelName': 'PurpleAir Sensor', 'manufactur...\n\n\n608\n221974\nBengaluru\nUT sensor\nNone\nIN\nNone\nFalse\nNone\n[{'id': 130, 'unit': 'particles/cm³', 'count':...\nNone\n{'latitude': 13.0449, 'longitude': 77.5788}\n2019-12-17T10:32:35+00:00\n2019-12-17T10:32:35+00:00\n6\n[77.5788, 13.0449, 77.5788, 13.0449]\n[{'modelName': 'PurpleAir Sensor', 'manufactur...\n\n\n\n\n609 rows × 16 columns\n\n\n\nNow, we use another method of shapefile to do 
this:\n\n!wget --no-check-certificate \"https://groups.google.com/group/datameet/attach/29b74b1aef5f2f13/Delhi.zip?part=0.1\" -O /tmp/delhi.zip\n!unzip -o /tmp/delhi.zip -d /tmp/delhi\n\n--2023-07-26 08:37:34-- https://groups.google.com/group/datameet/attach/29b74b1aef5f2f13/Delhi.zip?part=0.1\nResolving groups.google.com (groups.google.com)... 216.239.32.177, 216.239.36.177, 216.239.38.177, ...\nConnecting to groups.google.com (groups.google.com)|216.239.32.177|:443... connected.\nHTTP request sent, awaiting response... 302 Moved Temporarily\nLocation: https://06895207363394598426.googlegroups.com/attach/29b74b1aef5f2f13/Delhi.zip?part=0.1&vt=ANaJVrEpUPnptnb4Y-J5gJRBVJ29K0pIGKzeBG7492Ume1tyn1MY5eTDbztxP0Hdbc7u8XhmH_GbemY_HD60x5OvDhr7M2ib1h8YfDmlNxFefazGPgmAUj0 [following]\n--2023-07-26 08:37:35-- https://06895207363394598426.googlegroups.com/attach/29b74b1aef5f2f13/Delhi.zip?part=0.1&vt=ANaJVrEpUPnptnb4Y-J5gJRBVJ29K0pIGKzeBG7492Ume1tyn1MY5eTDbztxP0Hdbc7u8XhmH_GbemY_HD60x5OvDhr7M2ib1h8YfDmlNxFefazGPgmAUj0\nResolving 06895207363394598426.googlegroups.com (06895207363394598426.googlegroups.com)... 142.251.10.137, 2404:6800:4003:c0f::89\nConnecting to 06895207363394598426.googlegroups.com (06895207363394598426.googlegroups.com)|142.251.10.137|:443... connected.\nHTTP request sent, awaiting response... 
200 OK\nLength: unspecified [application/zip]\nSaving to: ‘/tmp/delhi.zip’\n\n/tmp/delhi.zip [ <=> ] 16.42K --.-KB/s in 0.04s \n\n2023-07-26 08:37:37 (429 KB/s) - ‘/tmp/delhi.zip’ saved [16812]\n\nArchive: /tmp/delhi.zip\n inflating: /tmp/delhi/Delhi.kml \n inflating: /tmp/delhi/Districts.dbf \n inflating: /tmp/delhi/Districts.prj \n inflating: /tmp/delhi/Districts.qpj \n inflating: /tmp/delhi/Districts.shp \n inflating: /tmp/delhi/Districts.shx \n\n\n\ngdf = gpd.read_file('/tmp/delhi/Districts.shp')\ngdf.plot(color=\"none\", edgecolor=\"black\");\n\n\n\n\n\n\n\n\n\n# check if a point is within Delhi\ndef is_within_delhi(coords):\n point = Point(coords[\"longitude\"], coords[\"latitude\"])\n for i, row in gdf.iterrows():\n if row.geometry.contains(point):\n return True\n return False\n\ndf[\"is_within_delhi\"] = df[\"coordinates\"].apply(is_within_delhi)\n\n\ndelhi_df = df[df[\"is_within_delhi\"]]\ndelhi_df.shape\n\n(311, 17)\n\n\n\ndelhi_df.city.value_counts()\n\n 194\nNew Delhi District 112\nDwarka 3\nGhaziabad 2\nName: city, dtype: int64\n\n\nIt seems many points were not resolved by the online geopy geocoder (note the 194 rows with a blank city name).\nNow we know that 311 of the 623 sensors belong to Delhi. Let’s download the data for these sensors. 
For illustration, I will download data for 3 sensors for year 2022 and month of Jan.\n\n# dump delhi_df.id to a file\ndelhi_df.id.to_csv('/tmp/delhi_locations.txt', index=False, header=False)\n!head -n3 /tmp/delhi_locations.txt\n\n274208\n221227\n273205\n\n\n\n!head -n3 /tmp/delhi_locations.txt > /tmp/delhi_locations_3.txt\n!while read -r sensor_id; do aws s3 --no-sign-request cp s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=$sensor_id/year=2022/month=01 /tmp/delhi_data --recursive; done < /tmp/delhi_locations_3.txt\n\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=274208/year=2022/month=01/location-274208-20220107.csv.gz to ../../../../tmp/delhi_data/location-274208-20220107.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=274208/year=2022/month=01/location-274208-20220120.csv.gz to ../../../../tmp/delhi_data/location-274208-20220120.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=274208/year=2022/month=01/location-274208-20220131.csv.gz to ../../../../tmp/delhi_data/location-274208-20220131.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=274208/year=2022/month=01/location-274208-20220122.csv.gz to ../../../../tmp/delhi_data/location-274208-20220122.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=274208/year=2022/month=01/location-274208-20220130.csv.gz to ../../../../tmp/delhi_data/location-274208-20220130.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=274208/year=2022/month=01/location-274208-20220129.csv.gz to ../../../../tmp/delhi_data/location-274208-20220129.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=274208/year=2022/month=01/location-274208-20220114.csv.gz to 
../../../../tmp/delhi_data/location-274208-20220114.csv.gz\n... (remaining download log omitted; one download: line per daily .csv.gz file for location IDs 274208, 221227, and 273205) ...\n\n\nVerify that we got all the sensor data we needed.\n\n!ls /tmp/delhi_data/location-221227* | wc 
-l\n!ls /tmp/delhi_data/location-274208* | wc -l\n!ls /tmp/delhi_data/location-273205* | wc -l\n\n27\n10\n15\n\n\n\nsensor_df = pd.read_csv('/tmp/delhi_data/location-274208-20220107.csv.gz')\nsensor_df.parameter.value_counts()\n\npm10 64\npm25 64\npm1 64\num010 64\num025 64\num100 64\nName: parameter, dtype: int64" }, { - "objectID": "posts/pruning_vs_uncertainty.html#last-layer-only", - "href": "posts/pruning_vs_uncertainty.html#last-layer-only", - "title": "Pruning vs Uncertainty", - "section": "Last layer only", - "text": "Last layer only\n\nclass MCDropout(nn.Module):\n def __init__(self, p):\n super().__init__()\n self.p = p\n self.dropout = nn.Dropout(p=self.p)\n\n def forward(self, x):\n self.train()\n return self.dropout(x)\n\n\nresnet_with_dropout = torchvision.models.resnet18(pretrained=True)\nresnet_with_dropout.fc = nn.Sequential(\n nn.Linear(\n resnet_with_dropout.fc.in_features, resnet_with_dropout.fc.in_features // 2\n ),\n nn.GELU(),\n MCDropout(p=0.33),\n nn.Linear(resnet_with_dropout.fc.in_features // 2, num_classes),\n)\n\nresnet_with_dropout.load_state_dict(resnet.state_dict())\n\nresnet_with_dropout.to(device)\n\nmc_samples = 1000\n\noutputs = []\nfor _ in tqdm(range(mc_samples)):\n output = resnet_with_dropout(test_dataset.tensors[0])\n softmax_output = nn.Softmax(dim=1)(output)\n outputs.append(softmax_output.data.cpu().numpy())\n\n100%|██████████| 1000/1000 [00:18<00:00, 55.50it/s]\n\n\n\nmc_mean = np.mean(outputs, axis=0)\nmc_std = np.std(outputs, axis=0)\nmc_mean.shape\n\n(10000, 10)\n\n\n\n# Compute calibration curve\n\nfig, axes = plt.subplots(2, 5, figsize=(20, 5))\naxes = axes.flatten()\n\nfor target_class in range(10):\n true_labels = test_dataset.tensors[1].cpu().numpy() == target_class\n predicted_probabilities = mc_mean[:, target_class]\n\n prob_true, prob_pred = calibration_curve(\n true_labels, predicted_probabilities, n_bins=10\n )\n\n # Plot calibration curve\n axes[target_class].plot(prob_pred, prob_true, marker=\"o\", 
label=\"Calibration Curve\")\n axes[target_class].plot(\n [0, 1], [0, 1], linestyle=\"--\", label=\"Perfectly Calibrated\"\n )\n axes[target_class].set_xlabel(\"Mean Predicted Probability\")\n axes[target_class].set_ylabel(\"Observed Accuracy\")\n axes[target_class].set_title(f\"Class {target_class}\")\nplt.tight_layout()\n\nece = compute_ece(mc_mean, test_dataset.tensors[1].cpu().numpy(), 10)\nece\n\n0.04250686831623317\n\n\n\n\n\n\n\n\n\n\nprint(\n classification_report(test_dataset.tensors[1].cpu().numpy(), mc_mean.argmax(axis=1))\n)\n\n precision recall f1-score support\n\n 0 0.88 0.92 0.90 980\n 1 0.91 0.98 0.94 1135\n 2 0.73 0.71 0.72 1032\n 3 0.78 0.72 0.75 1010\n 4 0.83 0.87 0.85 982\n 5 0.68 0.71 0.69 892\n 6 0.82 0.88 0.85 958\n 7 0.80 0.78 0.79 1028\n 8 0.77 0.74 0.75 974\n 9 0.82 0.72 0.77 1009\n\n accuracy 0.81 10000\n macro avg 0.80 0.80 0.80 10000\nweighted avg 0.80 0.81 0.80 10000" + "objectID": "posts/climate-modeling-with-siren.html", + "href": "posts/climate-modeling-with-siren.html", + "title": "Climate Modeling with SIRENs", + "section": "", + "text": "import os\nos.environ[\"CUDA_VISIBLE_DEVICES\"] = \"0\"\nos.environ[\"TF_FORCE_GPU_ALLOW_GROWTH\"] = \"true\"\n\nimport numpy as np\nimport xarray as xr\nfrom tqdm.keras import TqdmCallback\n\nimport tensorflow as tf\nfrom tensorflow.keras import layers, initializers, activations\nfrom tensorflow.keras.applications.resnet50 import ResNet50\nfrom tensorflow.keras.callbacks import LearningRateScheduler\nimport tensorflow_addons as tfa\n\nimport matplotlib.pyplot as plt\n\n/home/patel_zeel/miniconda3/envs/tensorflow_gpu/lib/python3.10/site-packages/tqdm/auto.py:21: TqdmWarning: IProgress not found. Please update jupyter and ipywidgets. 
See https://ipywidgets.readthedocs.io/en/stable/user_install.html\n from .autonotebook import tqdm as notebook_tqdm\n2023-07-18 05:13:52.439735: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.\nTo enable the following instructions: AVX2 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.\n2023-07-18 05:13:53.232689: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT\n/home/patel_zeel/miniconda3/envs/tensorflow_gpu/lib/python3.10/site-packages/tensorflow_addons/utils/tfa_eol_msg.py:23: UserWarning: \n\nTensorFlow Addons (TFA) has ended development and introduction of new features.\nTFA has entered a minimal maintenance and release mode until a planned end of life in May 2024.\nPlease modify downstream libraries to take dependencies from other repositories in our TensorFlow community (e.g. Keras, Keras-CV, and Keras-NLP). 
\n\nFor more information see: https://github.com/tensorflow/addons/issues/2807 \n\n warnings.warn(\n\n\n\ndef SIREN(input_dim, output_dim, features, activation_scale, dropout):\n first_init = lambda input_dim: initializers.RandomUniform(-1 / input_dim, 1 / input_dim)\n other_init = lambda input_dim: initializers.RandomUniform(-np.sqrt(6 / input_dim) / activation_scale, np.sqrt(6 / input_dim) / activation_scale)\n model = tf.keras.Sequential()\n model.add(layers.Dense(features[0], input_shape=(input_dim,), kernel_initializer=first_init(input_dim), activation=lambda x: tf.sin(activation_scale*x)))\n for i in range(1, len(features)):\n model.add(layers.Dense(features[i], kernel_initializer=other_init(features[i-1]), activation=lambda x: tf.sin(activation_scale*x)))\n model.add(layers.Dropout(dropout))\n model.add(layers.Dense(output_dim, kernel_initializer=other_init(features[-1]), activation='linear'))\n return model\n\ndef MLP(input_dim, output_dim, features, dropout):\n model = tf.keras.Sequential()\n model.add(layers.Dense(features[0], input_shape=(input_dim,), activation=activations.relu))\n for i in range(1, len(features)):\n model.add(layers.Dense(features[i], activation=activations.relu))\n model.add(layers.Dropout(dropout))\n model.add(layers.Dense(output_dim, activation='linear'))\n return model\n \ndef ResNet():\n resnet = ResNet50(include_top=False, weights=None, input_shape=(64, 32, 1), pooling='avg')\n model = tf.keras.Sequential()\n model.add(resnet)\n model.add(layers.Dense(2048, activation='relu'))\n model.add(layers.Dense(32768, activation='linear'))\n return model\n\n\ndata5 = xr.open_dataset(\"../../super_res/data/era5_low_res/2m_temperature/2m_temperature_2018_5.625deg.nc\")\ndata1 = xr.open_dataset(\"../../super_res/data/era5_high_res/2m_temperature/2m_temperature_2018_1.40625deg.nc\")\n\n\ndata5\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n<xarray.Dataset>\nDimensions: (lon: 64, lat: 32, time: 8760)\nCoordinates:\n * lon (lon) float64 0.0 5.625 11.25 16.88 
... 337.5 343.1 348.8 354.4\n * lat (lat) float64 -87.19 -81.56 -75.94 -70.31 ... 75.94 81.56 87.19\n * time (time) datetime64[ns] 2018-01-01 ... 2018-12-31T23:00:00\nData variables:\n t2m (time, lat, lon) float32 ...\nAttributes:\n Conventions: CF-1.6\n history: 2019-11-06 10:38:21 GMT by grib_to_netcdf-2.14.0: /opt/ecmw...xarray.DatasetDimensions:lon: 64lat: 32time: 8760Coordinates: (3)lon(lon)float640.0 5.625 11.25 ... 348.8 354.4array([ 0. , 5.625, 11.25 , 16.875, 22.5 , 28.125, 33.75 , 39.375,\n 45. , 50.625, 56.25 , 61.875, 67.5 , 73.125, 78.75 , 84.375,\n 90. , 95.625, 101.25 , 106.875, 112.5 , 118.125, 123.75 , 129.375,\n 135. , 140.625, 146.25 , 151.875, 157.5 , 163.125, 168.75 , 174.375,\n 180. , 185.625, 191.25 , 196.875, 202.5 , 208.125, 213.75 , 219.375,\n 225. , 230.625, 236.25 , 241.875, 247.5 , 253.125, 258.75 , 264.375,\n 270. , 275.625, 281.25 , 286.875, 292.5 , 298.125, 303.75 , 309.375,\n 315. , 320.625, 326.25 , 331.875, 337.5 , 343.125, 348.75 , 354.375])lat(lat)float64-87.19 -81.56 ... 81.56 87.19array([-87.1875, -81.5625, -75.9375, -70.3125, -64.6875, -59.0625, -53.4375,\n -47.8125, -42.1875, -36.5625, -30.9375, -25.3125, -19.6875, -14.0625,\n -8.4375, -2.8125, 2.8125, 8.4375, 14.0625, 19.6875, 25.3125,\n 30.9375, 36.5625, 42.1875, 47.8125, 53.4375, 59.0625, 64.6875,\n 70.3125, 75.9375, 81.5625, 87.1875])time(time)datetime64[ns]2018-01-01 ... 
2018-12-31T23:00:00long_name :timearray(['2018-01-01T00:00:00.000000000', '2018-01-01T01:00:00.000000000',\n '2018-01-01T02:00:00.000000000', ..., '2018-12-31T21:00:00.000000000',\n '2018-12-31T22:00:00.000000000', '2018-12-31T23:00:00.000000000'],\n dtype='datetime64[ns]')Data variables: (1)t2m(time, lat, lon)float32...units :Klong_name :2 metre temperature[17940480 values with dtype=float32]Indexes: (3)lonPandasIndexPandasIndex(Index([ 0.0, 5.625, 11.25, 16.875, 22.5, 28.125, 33.75, 39.375,\n 45.0, 50.625, 56.25, 61.875, 67.5, 73.125, 78.75, 84.375,\n 90.0, 95.625, 101.25, 106.875, 112.5, 118.125, 123.75, 129.375,\n 135.0, 140.625, 146.25, 151.875, 157.5, 163.125, 168.75, 174.375,\n 180.0, 185.625, 191.25, 196.875, 202.5, 208.125, 213.75, 219.375,\n 225.0, 230.625, 236.25, 241.875, 247.5, 253.125, 258.75, 264.375,\n 270.0, 275.625, 281.25, 286.875, 292.5, 298.125, 303.75, 309.375,\n 315.0, 320.625, 326.25, 331.875, 337.5, 343.125, 348.75, 354.375],\n dtype='float64', name='lon'))latPandasIndexPandasIndex(Index([-87.1875, -81.5625, -75.9375, -70.3125, -64.6875, -59.0625, -53.4375,\n -47.8125, -42.1875, -36.5625, -30.9375, -25.3125, -19.6875, -14.0625,\n -8.4375, -2.8125, 2.8125, 8.4375, 14.0625, 19.6875, 25.3125,\n 30.9375, 36.5625, 42.1875, 47.8125, 53.4375, 59.0625, 64.6875,\n 70.3125, 75.9375, 81.5625, 87.1875],\n dtype='float64', name='lat'))timePandasIndexPandasIndex(DatetimeIndex(['2018-01-01 00:00:00', '2018-01-01 01:00:00',\n '2018-01-01 02:00:00', '2018-01-01 03:00:00',\n '2018-01-01 04:00:00', '2018-01-01 05:00:00',\n '2018-01-01 06:00:00', '2018-01-01 07:00:00',\n '2018-01-01 08:00:00', '2018-01-01 09:00:00',\n ...\n '2018-12-31 14:00:00', '2018-12-31 15:00:00',\n '2018-12-31 16:00:00', '2018-12-31 17:00:00',\n '2018-12-31 18:00:00', '2018-12-31 19:00:00',\n '2018-12-31 20:00:00', '2018-12-31 21:00:00',\n '2018-12-31 22:00:00', '2018-12-31 23:00:00'],\n dtype='datetime64[ns]', name='time', length=8760, freq=None))Attributes: (2)Conventions 
:CF-1.6history :2019-11-06 10:38:21 GMT by grib_to_netcdf-2.14.0: /opt/ecmwf/eccodes/bin/grib_to_netcdf -o /cache/data2/adaptor.mars.internal-1573035683.1772008-2550-1-601d5659-dae2-45e1-902b-45825d30e8d0.nc /cache/tmp/601d5659-dae2-45e1-902b-45825d30e8d0-adaptor.mars.internal-1573035683.1790879-2550-1-tmp.grib\n\n\n\ntime_stamp = slice(\"2018-01\", \"2018-03\")\ntrain_df = data5.sel(time=time_stamp).to_dataframe().reset_index()\ntest_df = data1.sel(time=time_stamp).to_dataframe().reset_index()\n\nX = np.stack([train_df.lat.values, train_df.lon.values, train_df.time.astype(np.int64) / 10**9], axis=1)\ny = train_df[[\"t2m\"]].values\nprint(f\"{X.shape=}, {y.shape=}\")\n\nX_test = np.stack([test_df.lat.values, test_df.lon.values, test_df.time.astype(np.int64) / 10**9], axis=1)\ny_test = test_df[[\"t2m\"]].values\nprint(f\"{X_test.shape=}, {y_test.shape=}\")\n\n# rff = np.random.normal(size=(2, 16)) * 0.01\n# X = np.concatenate([np.sin(X @ rff), np.cos(X @ rff)], axis=1)\n# print(f\"{sin_cos.shape=}\")\n# X = X @ sin_cos\n# X_test = np.concatenate([np.sin(X_test @ rff), np.cos(X_test @ rff)], axis=1)\n\nprint(f\"{X.shape=}, {X_test.shape=}\")\n\nX.shape=(4423680, 3), y.shape=(4423680, 1)\nX_test.shape=(70778880, 3), y_test.shape=(70778880, 1)\nX.shape=(4423680, 3), X_test.shape=(70778880, 3)\n\n\n\n32*64*24*(31+28+31)\n\n4423680\n\n\n\nX_max = np.max(X, axis=0, keepdims=True)\nX_min = np.min(X, axis=0, keepdims=True)\n\nX_scaled = (X - X_min) / (X_max - X_min)\nX_test_scaled = (X_test - X_min) / (X_max - X_min)\n\n# Scaling time\nif X.shape[1] == 3:\n X_scaled[:, 2] = X_scaled[:, 2] * 10 - 5\n X_test_scaled[:, 2] = X_test_scaled[:, 2] * 10 - 5\n\ny_min = np.min(y, axis=0, keepdims=True)\ny_max = np.max(y, axis=0, keepdims=True)\n\ny_scaled = (y - y_min) / (y_max - y_min)\n\n# y_mean = np.mean(y, axis=0, keepdims=True)\n# y_std = np.std(y, axis=0, keepdims=True)\n\n# y_scaled = (y - y_mean) / y_std\n\n\nmodel = SIREN(3, 1, [256]*4, 30.0, 0.0)\n# model = MLP(3, 1, 
[256]*4, 0.0)\n# model = ResNet()S\n# clr = tfa.optimizers.CyclicalLearningRate(initial_learning_rate=1e-3,\n# maximal_learning_rate=1e-2,\n# scale_fn=lambda x: 1/(2.**(x-1)),\n# step_size=2\n# )\nmodel.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3), loss='mse')\n\n2023-07-18 05:14:34.531498: W tensorflow/core/common_runtime/gpu/gpu_bfc_allocator.cc:47] Overriding orig_value setting because the TF_FORCE_GPU_ALLOW_GROWTH environment variable is set. Original config value was 0.\n2023-07-18 05:14:34.531583: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1635] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 78884 MB memory: -> device: 0, name: NVIDIA A100-SXM4-80GB, pci bus id: 0000:01:00.0, compute capability: 8.0\n\n\n\n0.00148\n\n0.00148\n\n\n\ncallbacks = [TqdmCallback(verbose=1)]\nhistory = model.fit(X_scaled, y_scaled, epochs=1000, batch_size=X_scaled.shape[0], verbose=0, callbacks=callbacks)\n\n 0%| | 0/1000 [00:00<?, ?epoch/s]2023-07-18 05:14:38.677299: I tensorflow/compiler/xla/stream_executor/cuda/cuda_blas.cc:637] TensorFloat-32 will be used for the matrix multiplication. This will only be logged once.\n2023-07-18 05:14:39.357828: I tensorflow/compiler/xla/service/service.cc:169] XLA service 0x7fb81b946130 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:\n2023-07-18 05:14:39.357901: I tensorflow/compiler/xla/service/service.cc:177] StreamExecutor device (0): NVIDIA A100-SXM4-80GB, Compute Capability 8.0\n2023-07-18 05:14:39.363158: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.\n2023-07-18 05:14:40.399794: I tensorflow/compiler/xla/stream_executor/cuda/cuda_dnn.cc:424] Loaded cuDNN version 8600\n2023-07-18 05:14:40.557542: I ./tensorflow/compiler/jit/device_compiler.h:180] Compiled cluster using XLA! 
This line is logged at most once for the lifetime of the process.\n100%|██████████| 1000/1000 [04:46<00:00, 3.49epoch/s, loss=0.000295]\n\n\n\nplt.plot(history.history['loss'][200:], label='loss');\n# plt.plot(history.history['val_loss'][200:], label='val_loss');\nplt.legend();\n\n\n\n\n\n\n\n\n\n128*256*24\n\n786432\n\n\n\nimg_index = 0\ny_pred = model.predict(X_test_scaled, batch_size=20480) * (y_max - y_min) + y_min\nprint(y_pred.shape)\nplt.imshow(y_pred[img_index*(256*128):(img_index+1)*(256*128)].reshape(256, 128), origin='lower', extent=[-180, 180, -90, 90], cmap='coolwarm', interpolation=\"none\");\n\n3456/3456 [==============================] - 5s 1ms/step\n(70778880, 1)\n\n\n\n\n\n\n\n\n\n\nplt.imshow(y.reshape(64, 32), origin='lower', extent=[-180, 180, -90, 90], cmap='coolwarm', interpolation=\"none\");\n\n\ndiff = y_pred.reshape(256, 128) - y_test.reshape(256, 128)\nplt.imshow(diff, origin='lower', extent=[-180, 180, -90, 90], cmap='coolwarm', interpolation=\"none\");\nplt.colorbar();\nplt.title(\"Diff\")\n\n\n# rmse = np.sqrt(np.mean(np.abs(X_test[:, 0:1])*(y_pred.ravel() - y_test.ravel())**2))/np.mean(y_test.ravel() * np.abs(X_test[:, 0:1]))\ndef get_lat_weights(lat):\n lat_weights = np.cos(np.deg2rad(lat))\n lat_weights = lat_weights / lat_weights.mean()\n return lat_weights\n\nlat_weights = get_lat_weights(X_test[:, 0])\nprint(f\"{lat_weights.shape=}\")\n\nlat_squared_error = lat_weights * (y_pred.ravel() - y_test.ravel())**2\nlat_rmse = np.sqrt(lat_squared_error.mean())\nprint(f\"{lat_rmse=}\")\n# y_pred.shape, lat_weights.shape\n\nlat_weights.shape=(70778880,)\nlat_rmse=2.6118446730600438\n\n\n\n# lat_rmse=3.4826956884024356\n\n\nmean_bias = np.mean(y_pred.ravel() - y_test.ravel())\nprint(f\"{mean_bias=}\")" }, { - "objectID": "posts/fundamentals_across_domains.html", - "href": "posts/fundamentals_across_domains.html", - "title": "Fundamentals across ML domains", + "objectID": "posts/2022-10-21-gaussian-processes.html", + "href": 
"posts/2022-10-21-gaussian-processes.html", + "title": "Gaussian Processes - A no-skip-math version", "section": "", - "text": "NN\nTransformer\nCNN\n\n\n\n\n-\nMulti-head\nMulti-channel\n\n\n-\nSkip-connection\nResNet" + "text": "import jax\nimport jax.numpy as jnp\n\nfrom tinygp.kernels import ExpSquared\n\nimport matplotlib.pyplot as plt" }, { - "objectID": "posts/AL_with_MNIST.html", - "href": "posts/AL_with_MNIST.html", - "title": "Active Learning with MNIST", - "section": "", - "text": "import pandas as pd\nimport numpy as np\nfrom sklearn.ensemble import RandomForestClassifier\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.datasets import fetch_openml\nfrom sklearn.metrics import classification_report, precision_score, recall_score, f1_score, confusion_matrix\n\nimport matplotlib.pyplot as plt\nfrom tqdm import tqdm\n\nimport psutil" + "objectID": "posts/2022-10-21-gaussian-processes.html#regression", + "href": "posts/2022-10-21-gaussian-processes.html#regression", + "title": "Gaussian Processes - A no-skip-math version", + "section": "Regression", + "text": "Regression\nIn this post, we will consider the regression problem of finding a reasonable map \\(X \\to \\boldsymbol{y}\\) along with uncertainty. We can do this in a simplest setting with Bayesian linear regression assuming a MultiVariate Normal (MVN) prior \\(\\boldsymbol{\\theta} \\sim \\mathcal{N}(\\boldsymbol{\\mu}_\\theta, \\Sigma_\\theta)\\) (why MVN? because \\(\\theta \\in (-\\infty, \\infty)\\)) and Normal likelihood \\(y \\sim \\mathcal{N}(\\boldsymbol{x}^T\\theta, \\sigma_n^2)\\) with i.i.d. assumption.\nTo start with Gaussian process regression, let us first focus on \\(\\boldsymbol{y}\\) (and ignore \\(X\\)). We assume \\(\\boldsymbol{f}\\) as a random variable and \\(\\boldsymbol{y}\\) as a realization of \\(\\boldsymbol{f}\\) with some noise. 
It would be a natural probabilistic assumption to assume \\(\\boldsymbol{f}\\) to be MVN distributed since its range is \\((-\\infty, \\infty)\\).\n\\[\np(\\boldsymbol{f}) \\sim \\mathcal{N}(\\boldsymbol{m}_f, K_{ff})\n\\tag{prior}\n\\]\nNow, we need to bring \\(X\\) into this formulation in a reasonable way. A core assumption connecting \\(X\\) with \\(\\boldsymbol{y}\\) is the following: > if two inputs \\(\\boldsymbol{x}\\) and \\(\\boldsymbol{x}'\\) are close to each other (how to define the closeness? kernels!), the corresponding \\(\\boldsymbol{y}\\) and \\(\\boldsymbol{y}'\\) are likely to be similar.\nWe use something known as a covariance function or kernel (the latter term is more prevalent) to define this closeness. For example, the RBF or squared exponential is a well-known kernel:\n\\[\nk_{RBF}(\\boldsymbol{x}, \\boldsymbol{x}') = \\sigma^2 \\exp \\left(-{\\frac {\\|\\boldsymbol{x} -\\boldsymbol{x}' \\|^{2}}{2\\ell ^{2}}}\\right)\n\\tag{kernel}\n\\]\n\nx = jnp.array(0.0).reshape(1, 1)\nx_prime = jnp.linspace(-5,5,100).reshape(-1, 1)\n\nplt.plot(x_prime, ExpSquared()(x_prime, x));\nplt.xlabel(\"$x'$\")\nplt.title(f\"$k(x,x')$ where $x={x[0][0]}$ and $x' \\in ${plt.xlim()}\");\n\n\n\n\n\n\n\n\nThe plot above shows that the value of \\(k(\\boldsymbol{x}, \\boldsymbol{x}')\\) increases as \\(\\boldsymbol{x}'\\) approaches \\(\\boldsymbol{x}\\) and decreases as it moves away from \\(\\boldsymbol{x}\\). Now, we will connect \\(X\\) with \\(\\boldsymbol{f}\\) (and thus with \\(\\boldsymbol{y}\\)) through the kernel \\(k\\) with the following two assumptions:\n\nDiagonal entries of \\(K_{ff}\\) represent the variance of \\(f_i\\), which can be represented by \\(k(\\boldsymbol{x}_i, \\boldsymbol{x}_i)\\).\nNon-diagonal entries of \\(K_{ff}\\) represent the covariance between \\(f_i\\) and \\(f_j\\), which can be represented by \\(k(\\boldsymbol{x}_i, \\boldsymbol{x}_j)\\).\n\nAt this point, we have made everything clear about the prior \\(p(\\boldsymbol{f}) \\sim \\mathcal{N}(\\boldsymbol{m}_f, K_{ff})\\). 
Now, we will look at the likelihood. As mentioned earlier, \\(\\boldsymbol{y}\\) is a noisy realization of \\(\\boldsymbol{f}\\), so the following likelihood would be a simple and natural choice.\n\\[\np(\\boldsymbol{y}|\\boldsymbol{f}) \\sim \\mathcal{N}(\\boldsymbol{f}, \\sigma_n^2I)\n\\tag{likelihood}\n\\]\nSo far, we have followed a bottom-up approach and defined the prior and likelihood for this problem. Now we will explore the top-down approach.\nOur ultimate goal is to derive \\(p(\\boldsymbol{y}^*|X^*,\\boldsymbol{y}, X)\\) at new inputs \\(X^*\\). This can be written as:\n\\[\np(\\boldsymbol{y}^*|X^*,\\boldsymbol{y}, X) = \\int p(\\boldsymbol{y}^*|\\boldsymbol{f}^*)p(\\boldsymbol{f}^*|X^*,\\boldsymbol{y}, X)d\\boldsymbol{f}^*\n\\tag{pred post new}\n\\]\nHere, \\(p(\\boldsymbol{f}^*|X^*,\\boldsymbol{y}, X)\\) is the posterior distribution at inputs \\(X^*\\). Once we derive the posterior \\(p(\\boldsymbol{f}|\\boldsymbol{y},X)\\), we can find \\(p(\\boldsymbol{f}^*|X^*,\\boldsymbol{y}, X)\\) as follows:\n\\[\np(\\boldsymbol{f}^*|X^*, \\boldsymbol{y}, X) = \\int p(\\boldsymbol{f}^*|X^*, \\boldsymbol{f}, X)p(\\boldsymbol{f}|\\boldsymbol{y}, X)d\\boldsymbol{f}\n\\tag{post new}\n\\]\nHere, \\(p(\\boldsymbol{f}^*|X^*, \\boldsymbol{f}, X)\\) is a conditional Gaussian distribution with the following closed form:\n\\[\np(\\boldsymbol{f}^*|X^*, \\boldsymbol{f}, X) \\sim \\mathcal{N}(\\boldsymbol{m}_{f^*}+K_{f^*f}K_{ff}^{-1}(\\boldsymbol{f}-\\boldsymbol{m}_{f}), K_{f^*f^*} - K_{f^*f}K_{ff}^{-1}K_{ff^*})\n\\tag{cond}\n\\]\nThe posterior \\(p(\\boldsymbol{f}|\\boldsymbol{y}, X)\\) can be derived following “Bayes’ rule for Gaussians” (section 2.2.6.2 in pml book2):\n\\[\np(\\boldsymbol{f}|\\boldsymbol{y}, X) \\sim \\mathcal{N}(\\boldsymbol{m}_f + K_{ff}\\left(K_{ff}+\\sigma_n^2I\\right)^{-1}(\\boldsymbol{y} - \\boldsymbol{m}_f), K_{ff} - K_{ff}\\left(K_{ff} + \\sigma_n^2I\\right)^{-1}K_{ff})\n\\tag{post}\n\\]\nWe can now substitute Eq. (post) and Eq. (cond) into Eq. (post new). 
The integral can be solved using Eq. 2.90 in section 2.2.6.2 of pml book2, also stated in Eq. (int gaussians) in the Appendix.\n\\[\n\\begin{aligned}\np(\\boldsymbol{f}^*|X^*, \\boldsymbol{y}, X) &\\sim \\mathcal{N}(\\boldsymbol{\\mu}^*, \\Sigma^*)\\\\\n\\boldsymbol{\\mu}^* &= \\boldsymbol{m}_{f^*}+K_{f^*f}K_{ff}^{-1}(\\left[\\boldsymbol{m}_f + K_{ff}\\left(K_{ff}+\\sigma_n^2I\\right)^{-1}(\\boldsymbol{y} - \\boldsymbol{m}_f)\\right]-\\boldsymbol{m}_{f})\\\\\n&=\\boldsymbol{m}_{f^*}+K_{f^*f}K_{ff}^{-1}(K_{ff}\\left(K_{ff}+\\sigma_n^2I\\right)^{-1}(\\boldsymbol{y} - \\boldsymbol{m}_f))\\\\\n&=\\boldsymbol{m}_{f^*}+K_{f^*f}\\left(K_{ff}+\\sigma_n^2I\\right)^{-1}(\\boldsymbol{y} - \\boldsymbol{m}_f)\\\\\n\\\\\n\\Sigma^* &= K_{f^*f^*} - K_{f^*f}K_{ff}^{-1}K_{ff^*} + K_{f^*f}K_{ff}^{-1}\\left[K_{ff} - K_{ff}\\left(K_{ff} + \\sigma_n^2I\\right)^{-1}K_{ff}\\right]K_{ff}^{-1}K_{ff^*}\\\\\n&=K_{f^*f^*} - K_{f^*f}K_{ff}^{-1}K_{ff^*} + K_{f^*f}\\left[I - \\left(K_{ff} + \\sigma_n^2I\\right)^{-1}K_{ff}\\right]K_{ff}^{-1}K_{ff^*}\\\\\n&=K_{f^*f^*} - K_{f^*f}K_{ff}^{-1}K_{ff^*} + K_{f^*f}\\left[K_{ff}^{-1} - \\left(K_{ff} + \\sigma_n^2I\\right)^{-1}\\right]K_{ff^*}\\\\\n&=K_{f^*f^*} - K_{f^*f}K_{ff}^{-1}K_{ff^*} + K_{f^*f}K_{ff}^{-1}K_{ff^*} - K_{f^*f}\\left(K_{ff} + \\sigma_n^2I\\right)^{-1}K_{ff^*}\\\\\n&=K_{f^*f^*} - K_{f^*f}\\left(K_{ff} + \\sigma_n^2I\\right)^{-1}K_{ff^*}\\\\\np(\\boldsymbol{f}^*|X^*, \\boldsymbol{y}, X) &\\sim \\mathcal{N}(\\boldsymbol{m}_{f^*}+K_{f^*f}\\left(K_{ff}+\\sigma_n^2I\\right)^{-1}(\\boldsymbol{y} - \\boldsymbol{m}_f), K_{f^*f^*} - K_{f^*f}\\left(K_{ff} + \\sigma_n^2I\\right)^{-1}K_{ff^*})\n\\end{aligned}\n\\]\nNow, we are almost there. Plugging the above formula into Eq. (pred post new) and using the known result in Eq. 
(int gaussians), we get the predictive posterior as follows:\n\\[\np(\\boldsymbol{y}^*|X^*,\\boldsymbol{y}, X) \\sim \\mathcal{N}(\\boldsymbol{m}_{f^*}+K_{f^*f}\\left(K_{ff}+\\sigma_n^2I\\right)^{-1}(\\boldsymbol{y} - \\boldsymbol{m}_f), K_{f^*f^*} - K_{f^*f}\\left(K_{ff} + \\sigma_n^2I\\right)^{-1}K_{ff^*} + \\sigma_n^2I)\n\\]\n\n\n\n\n\n\nNote\n\n\n\nWe did not exploit the special structure of the likelihood variance \\(\\sigma_n^2I\\) anywhere, so these derivations also hold for full-rank likelihood covariance matrices.\n\n\n\nOptimization\nWe perform type-II likelihood estimation (in other words, maximize the log marginal likelihood or evidence term). Our goal is to find the optimal model \\(\\mathcal{M}\\), represented by the prior (or kernel) hyperparameters and the likelihood hyperparameters. We can get the log marginal likelihood using Eq. (int gaussians):\n\\[\n\\begin{aligned}\np(\\boldsymbol{y}|X, \\mathcal{M}) &= \\int p(\\boldsymbol{y}|\\boldsymbol{f}) p(\\boldsymbol{f})d\\boldsymbol{f}\\\\\n&\\sim \\int \\mathcal{N}(\\boldsymbol{y}|\\boldsymbol{f}, \\sigma_n^2I) \\mathcal{N}(\\boldsymbol{f}|\\boldsymbol{m}_f, K_{ff})\\\\\n&\\sim \\mathcal{N}(\\boldsymbol{y}|\\boldsymbol{m}_f, K_{ff}+\\sigma_n^2I)\n\\end{aligned}\n\\]\nFor the case of the RBF kernel, the parameters of \\(\\mathcal{M}\\) are \\(\\{\\sigma, \\ell, \\sigma_n\\}\\)." 
}, { - "objectID": "posts/AL_with_MNIST.html#imports", - "href": "posts/AL_with_MNIST.html#imports", - "title": "Active Learning with MNIST", - "section": "", - "text": "import pandas as pd\nimport numpy as np\nfrom sklearn.ensemble import RandomForestClassifier\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.datasets import fetch_openml\nfrom sklearn.metrics import classification_report, precision_score, recall_score, f1_score, confusion_matrix\n\nimport matplotlib.pyplot as plt\nfrom tqdm import tqdm\n\nimport psutil" + "objectID": "posts/2022-10-21-gaussian-processes.html#classification-with-laplace-approximation", + "href": "posts/2022-10-21-gaussian-processes.html#classification-with-laplace-approximation", + "title": "Gaussian Processes - A no-skip-math version", + "section": "Classification (with Laplace approximation)", + "text": "Classification (with Laplace approximation)\nWe will derive a GP predictive posterior for binary case only because for multi-class, it gets a bit complex. Our assumption for prior over the \\(\\boldsymbol{f}\\) can still be the same but likelihood needs to be changed because \\(\\boldsymbol{y}\\) is no more a real number but rather a binary value e.g. 0 or 1. From Bayesian point-of-view, Bernoulli likelihood would be the most appropriate as a likelihood here:\n\\[\np(\\boldsymbol{y}|\\boldsymbol{f}) = \\prod_{i=1}^{N} \\sigma(f_i)^{y_i=1}(1-\\sigma(f_i))^{y_i=0}\n\\tag{class likelihood}\n\\]\nSince, MVN prior and Bernoulli likelihood are not conjugate, we need to use an approximate method of inference here. We use Laplace approximation to get the MAP estimate \\(\\boldsymbol{\\hat{f}}\\) and by computing the Hessian \\(H\\) of negative log joint (log prior + log likelihood) with respect to \\(\\boldsymbol{\\hat{f}}\\), we can get the posterior distribution as the following:\n\\[\np(\\boldsymbol{f}|\\boldsymbol{y}, X) \\sim \\mathcal{N}(\\boldsymbol{\\hat{f}}, H^{-1})\n\\tag{class post}\n\\]\nEq. 
(cond) will be the same in this case, and thus, we can solve Eq. (post new) as we did in the regression case:\n\\[\np(\\boldsymbol{f}^*|X^*, \\boldsymbol{y}, X) \\sim \\mathcal{N}(\\boldsymbol{m}_{f^*}+K_{f^*f}K_{ff}^{-1}(\\boldsymbol{\\hat{f}}-\\boldsymbol{m}_{f}), K_{f^*f^*} - K_{f^*f}K_{ff}^{-1}K_{ff^*} + K_{f^*f}K_{ff}^{-1}H^{-1}K_{ff}^{-1}K_{ff^*})\n\\]\n\nOptimization\nTo perform Type-II likelihood estimation for binary classification, we first need to derive the log marginal likelihood, which can be approximated using the Laplace approximation. First, we define the following quantity:\n\\[\n\\boldsymbol{\\psi}(\\boldsymbol{f}) \\triangleq \\log p(\\boldsymbol{y}|\\boldsymbol{f}) + \\log p(\\boldsymbol{f})\n\\]\nNow, computing the log marginal likelihood as suggested in section 3.4.4 of the GPML book (the Gaussian integral over \\(\\boldsymbol{f}\\) evaluates to \\((2\\pi)^{N/2}|H^{-1}|^{1/2}\\)):\n\\[\n\\begin{aligned}\n\\log p(\\boldsymbol{y}|X, \\mathcal{M}) &\\sim \\log \\left[ \\int p(\\boldsymbol{y}|\\boldsymbol{f}) p(\\boldsymbol{f})d\\boldsymbol{f}\\right]\\\\\n&= \\log \\left[ \\int \\exp\\left(\\boldsymbol{\\psi}(\\boldsymbol{f})\\right)d\\boldsymbol{f} \\right]\\\\\n&\\thickapprox \\log \\left[ \\int \\exp\\left(\\boldsymbol{\\psi}(\\boldsymbol{\\hat{f}}) -\\frac{1}{2}(\\mathbf{f}-\\hat{\\mathbf{f}})^{\\top} H(\\mathbf{f}-\\hat{\\mathbf{f}})\\right)d\\boldsymbol{f} \\right]\\\\\n&= \\log \\left[ \\exp \\boldsymbol{\\psi}(\\boldsymbol{\\hat{f}}) \\int \\exp\\left(-\\frac{1}{2}(\\mathbf{f}-\\hat{\\mathbf{f}})^{\\top} H(\\mathbf{f}-\\hat{\\mathbf{f}})\\right)d\\boldsymbol{f}\\right]\\\\\n&= \\log p(\\boldsymbol{y}|\\boldsymbol{\\hat{f}}) + \\log p(\\boldsymbol{\\hat{f}}) + \\frac{N}{2}\\log(2\\pi) + \\frac{1}{2}\\log|H^{-1}|\\\\\n&= \\log p(\\boldsymbol{y}|\\boldsymbol{\\hat{f}}) -\\frac{1}{2}\\boldsymbol{\\hat{f}}^TK_{ff}^{-1}\\boldsymbol{\\hat{f}} - \\frac{1}{2}\\log|K_{ff}| + \\frac{1}{2}\\log|H^{-1}|\n\\end{aligned}\n\\]\nOur final optimization algorithm is as follows:\n1. 
For N iterations do 2. to 4. 2. Optimize for \\(\\boldsymbol{\\hat{f}}\\) with M iterations using standard MAP estimation (maybe use non-centered parametrization). 3. Compute gradient of parameters of \\(\\mathcal{M}\\) w.r.t. log marginal likelihood 4. Update parameters of \\(\\mathcal{M}\\)." }, { - "objectID": "posts/AL_with_MNIST.html#load-data", - "href": "posts/AL_with_MNIST.html#load-data", - "title": "Active Learning with MNIST", - "section": "Load data", - "text": "Load data\n\nX, y = fetch_openml('mnist_784', version=1, data_home='data', return_X_y=True, as_frame=False)\n\n/home/patel_zeel/miniconda3/lib/python3.9/site-packages/sklearn/datasets/_openml.py:1002: FutureWarning: The default value of `parser` will change from `'liac-arff'` to `'auto'` in 1.4. You can set `parser='auto'` to silence this warning. Therefore, an `ImportError` will be raised from 1.4 if the dataset is dense and pandas is not installed. Note that the pandas parser may return different data types. See the Notes Section in fetch_openml's API doc for details.\n warn(" + "objectID": "posts/2022-10-21-gaussian-processes.html#appendix", + "href": "posts/2022-10-21-gaussian-processes.html#appendix", + "title": "Gaussian Processes - A no-skip-math version", + "section": "Appendix", + "text": "Appendix\n\\[\n\\int \\mathcal{N}(\\boldsymbol{y}|W\\boldsymbol{x}+\\boldsymbol{b}, \\Sigma) \\mathcal{N}(\\boldsymbol{x}|\\boldsymbol{\\mu}, K) = \\mathcal{N}(\\boldsymbol{y}|W\\boldsymbol{\\mu}+b, WKW^T+\\Sigma)\n\\tag{int gaussians}\n\\]" }, { - "objectID": "posts/AL_with_MNIST.html#section", - "href": "posts/AL_with_MNIST.html#section", - "title": "Active Learning with MNIST", + "objectID": "posts/Basis_functions.html", + "href": "posts/Basis_functions.html", + "title": "Basis functions", "section": "", - "text": "X_train, X_test, y_train, y_test = X[:60000], X[60000:], y[:60000], y[60000:]\nprint(X_train.shape, X_test.shape, y_train.shape, y_test.shape)\n\n(60000, 784) (10000, 784) (60000,) 
(10000,)" + "text": "import GPy\nimport numpy as np\nimport pandas as pd\n\nfrom sklearn.preprocessing import MinMaxScaler, StandardScaler\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.ensemble import RandomForestRegressor\n\nimport matplotlib.pyplot as plt\n\n\ndata = pd.read_csv(\"../../beat_stgnp/dataset/bjair/NP/processed_raw.csv\")\ndata[\"time\"] = pd.to_datetime(data[\"time\"], format=\"%Y-%m-%d %H:%M:%S\")\ndata[\"time\"] = data[\"time\"].apply(lambda x: x.timestamp())\n\nx = [\"latitude\", \"longitude\", \"time\"]\ny = [\"PM25_Concentration\"]\n\nx_train, x_test, y_train, y_test = train_test_split(data[x], data[y], test_size=0.2, random_state=42)\nx_train, x_test, y_train, y_test = map(lambda x: x.values, [x_train, x_test, y_train, y_test])\n\nx_scaler = MinMaxScaler()\ny_scaler = StandardScaler()\nx_train = x_scaler.fit_transform(x_train)\ny_train = y_scaler.fit_transform(y_train)\nx_test = x_scaler.transform(x_test)\n\nmodel = RandomForestRegressor(n_estimators=1000, random_state=42)\nmodel.fit(x_train, y_train.ravel())\ny_pred = model.predict(x_test)\nprint(\"RMSE\", np.sqrt(np.mean((y_scaler.inverse_transform(y_pred).ravel() - y_test.ravel())**2)))\n\n /tmp/ipykernel_922642/3470971270.py:18: DataConversionWarning:A column-vector y was passed when a 1d array was expected. Please change the shape of y to (n_samples,), for example using ravel()." 
}, { - "objectID": "posts/AL_with_MNIST.html#check-if-things-are-working-as-expected", - "href": "posts/AL_with_MNIST.html#check-if-things-are-working-as-expected", - "title": "Active Learning with MNIST", - "section": "Check if things are working as expected", - "text": "Check if things are working as expected\n\n%%time\n\nclf = RandomForestClassifier(n_estimators=100, max_depth=10, random_state=0, n_jobs=psutil.cpu_count()//2)\nclf.fit(X_train, y_train)\npreds = clf.predict(X_test)\n\nprint(classification_report(y_test, preds))\n\n precision recall f1-score support\n\n 0 0.96 0.99 0.97 980\n 1 0.98 0.99 0.98 1135\n 2 0.94 0.94 0.94 1032\n 3 0.94 0.94 0.94 1010\n 4 0.95 0.93 0.94 982\n 5 0.96 0.93 0.94 892\n 6 0.96 0.97 0.96 958\n 7 0.95 0.92 0.94 1028\n 8 0.94 0.93 0.93 974\n 9 0.88 0.94 0.91 1009\n\n accuracy 0.95 10000\n macro avg 0.95 0.95 0.95 10000\nweighted avg 0.95 0.95 0.95 10000\n\nCPU times: user 1min 33s, sys: 652 ms, total: 1min 34s\nWall time: 3.73 s" + "objectID": "posts/2021-10-23-warped-gp.html", + "href": "posts/2021-10-23-warped-gp.html", + "title": "Input Warped GPs - A failed idea", + "section": "", + "text": "Comments\n\nWe are warping inputs \\(\\mathbf{x}\\) into \\(\\mathbf{w}\\cdot\\mathbf{x}\\)\nLearning second level GP over \\(\\mathbf{w}\\).\nAppling penalty over \\(\\mathbf{w}\\) if varies too much unnecessary.\nSee problems at the end of the notebook.\nWe need to check mathematical concerns related to this transformation.\n\n\nimport math\nimport numpy as np\nimport torch\nimport gpytorch\nfrom matplotlib import pyplot as plt\nimport regdata as rd\nfrom sklearn.cluster import KMeans\n\n\nclass ExactGPModel(gpytorch.models.ExactGP):\n def __init__(self, train_x, train_y, likelihood):\n super(ExactGPModel, self).__init__(train_x, train_y, likelihood)\n self.mean_module = gpytorch.means.ConstantMean()\n self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())\n\n def forward(self, x):\n mean_x = 
self.mean_module(x)\n covar_x = self.covar_module(x)\n return gpytorch.distributions.MultivariateNormal(mean_x, covar_x)\n\nclass ExactNSGPModel(gpytorch.models.ExactGP):\n def __init__(self, train_x, train_y, likelihood, num_latent):\n super(ExactNSGPModel, self).__init__(train_x, train_y, likelihood)\n# inds = np.random.choice(train_x.shape[0], size=num_latent, replace=False)\n# self.x_bar = train_x[inds]\n self.x_bar = torch.tensor(KMeans(n_clusters=num_latent).fit(train_x).cluster_centers_).to(train_x)\n self.w_bar = torch.nn.Parameter(torch.ones(num_latent,).to(self.x_bar))\n self.bias = torch.nn.Parameter(torch.zeros(1,).to(self.x_bar))\n self.latent_likelihood = gpytorch.likelihoods.GaussianLikelihood()\n# We can fix noise to be minimum but it is not ideal. Ideally, noise should automatically reduce to reasonable value.\n# self.latent_likelihood.raw_noise.requires_grad = False\n# self.latent_likelihood.raw_noise = torch.tensor(-10.)\n self.latent_model = ExactGPModel(self.x_bar, self.w_bar, self.latent_likelihood)\n \n self.mean_module = gpytorch.means.ConstantMean()\n self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())\n\n def forward(self, x):\n self.latent_model.eval()\n with gpytorch.settings.detach_test_caches(False): # needed to back propagate thru predictive posterior\n self.latent_model.set_train_data(self.x_bar, self.w_bar, strict=False)\n self.w = self.latent_likelihood(self.latent_model(x)) # predictive posterior\n x_warped = x*self.w.mean[:, None] + self.bias\n mean_x = self.mean_module(x_warped)\n covar_x = self.covar_module(x_warped)\n return gpytorch.distributions.MultivariateNormal(mean_x, covar_x)\n\n\ndef training(model, likelihood):\n training_iter = 100\n\n # Find optimal model hyperparameters\n model.train()\n likelihood.train()\n\n # Use the adam optimizer\n optimizer = torch.optim.Adam([\n {'params': model.parameters()}, # Includes GaussianLikelihood parameters\n ], lr=0.1)\n\n # \"Loss\" for GPs - the 
marginal log likelihood\n mll = gpytorch.mlls.ExactMarginalLogLikelihood(likelihood, model)\n\n for i in range(training_iter):\n # Zero gradients from previous iteration\n optimizer.zero_grad()\n # Output from model\n output = model(train_x)\n # Calc loss and backprop gradients\n try:\n loss = -mll(output, train_y) + torch.square(model.w.mean-1).mean()\n# print(model.latent_likelihood.noise)\n except AttributeError:\n loss = -mll(output, train_y)\n loss.backward()\n# print('Iter %d/%d - Loss: %.3f lengthscale: %.3f noise: %.3f' % (\n# i + 1, training_iter, loss.item(),\n# model.covar_module.base_kernel.lengthscale.item(),\n# model.likelihood.noise.item()\n# ))\n optimizer.step()\n \ndef predict_plot(model, likelihood, title):\n # Get into evaluation (predictive posterior) mode\n model.eval()\n likelihood.eval()\n\n # Test points are regularly spaced along [0,1]\n # Make predictions by feeding model through likelihood\n with torch.no_grad():\n observed_pred = likelihood(model(test_x))\n\n with torch.no_grad():\n # Initialize plot\n f, ax = plt.subplots(1, 1, figsize=(10, 6))\n\n # Get upper and lower confidence bounds\n lower, upper = observed_pred.confidence_region()\n # Plot training data as black stars\n ax.plot(train_x.numpy(), train_y.numpy(), 'k*')\n # Plot predictive means as blue line\n ax.plot(test_x.numpy(), observed_pred.mean.numpy(), 'b')\n # Shade between the lower and upper confidence bounds\n ax.fill_between(test_x.numpy().ravel(), lower.numpy(), upper.numpy(), alpha=0.5)\n ax.legend(['Observed Data', 'Mean', 'Confidence'])\n ax.set_title(title)\n return observed_pred\n\n\ndef GP(num_latent):\n\n # initialize likelihood and model\n likelihood = gpytorch.likelihoods.GaussianLikelihood()\n model = ExactGPModel(train_x, train_y, likelihood)\n \n training(model, likelihood)\n predict_plot(model, likelihood, 'GP')\n\ndef NSGP(num_latent):\n\n # initialize likelihood and model\n likelihood = gpytorch.likelihoods.GaussianLikelihood()\n model = 
ExactNSGPModel(train_x, train_y, likelihood, num_latent)\n \n training(model, likelihood)\n observed_pred = predict_plot(model, likelihood, 'NSGP')\n \n with torch.no_grad():\n model.train()\n model.forward(test_x)\n plt.figure(figsize=(10,6))\n plt.plot(test_x*model.w.mean[:, None], observed_pred.mean.numpy())\n plt.title('Warped test inputs v/s test outputs')\n \n with torch.no_grad():\n model.train()\n model.forward(test_x)\n plt.figure(figsize=(10,6))\n plt.plot(test_x, model.w.mean, label='interpolated')\n plt.scatter(model.x_bar, model.w_bar, label='learned')\n plt.ylim(0,2)\n plt.title('Test input v/s weights')\n plt.legend()\n\n\n\nTesting over various datasets\n\ntrain_x, train_y, test_x = rd.DellaGattaGene(backend='torch').get_data()\nGP(0)\nNSGP(num_latent=7)\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\ntrain_x, train_y, test_x = rd.Heinonen4(backend='torch').get_data()\nGP(0)\nNSGP(num_latent=10)\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\ntrain_x, train_y, test_x = rd.Jump1D(backend='torch').get_data()\nGP(0)\nNSGP(num_latent=10)\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\ntrain_x, train_y, test_x = rd.MotorcycleHelmet(backend='torch').get_data()\nGP(0)\nNSGP(num_latent=10)\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\ntrain_x, train_y, test_x = rd.Olympic(backend='torch').get_data()\nGP(0)\nNSGP(num_latent=10)\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\ntrain_x, train_y, test_x = rd.SineJump1D(backend='torch').get_data()\nGP(0)\nNSGP(num_latent=10)\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\ntrain_x, train_y, test_x = rd.SineNoisy(backend='torch').get_data()\nGP(0)\nNSGP(num_latent=10)\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nProblems\n\nTransformation from x to x_warped is not monotonic." 
}, { - "objectID": "posts/AL_with_MNIST.html#convert-to-one-vs-rest-problem", - "href": "posts/AL_with_MNIST.html#convert-to-one-vs-rest-problem", - "title": "Active Learning with MNIST", - "section": "Convert to one v/s rest problem", - "text": "Convert to one v/s rest problem\n\ny_c = (y == '2').astype(np.int8)\n\ny_c_train, y_c_test = y_c[:60000], y_c[60000:]" + "objectID": "posts/2022-10-27-mogp.html", + "href": "posts/2022-10-27-mogp.html", + "title": "Multi-Output Gaussian Processes", + "section": "", + "text": "Inspired from this GPSS video.\nimport jax\nfrom jax.flatten_util import ravel_pytree\nimport jax.numpy as jnp\n\nimport optax\n\nimport matplotlib.pyplot as plt\nfrom tinygp import kernels" }, { - "objectID": "posts/AL_with_MNIST.html#check-if-things-are-working-as-expected-1", - "href": "posts/AL_with_MNIST.html#check-if-things-are-working-as-expected-1", - "title": "Active Learning with MNIST", - "section": "Check if things are working as expected", - "text": "Check if things are working as expected\n\nclf = RandomForestClassifier(n_estimators=100, max_depth=10, random_state=0, n_jobs=psutil.cpu_count()//2)\nclf.fit(X_train, y_c_train)\npreds = clf.predict(X_test)\n\n\nprint(\"Precision\", precision_score(y_c_test, preds))\nprint(\"Recall\", recall_score(y_c_test, preds))\n\nPrecision 0.9808988764044944\nRecall 0.8459302325581395" + "objectID": "posts/2022-10-27-mogp.html#helper-functions", + "href": "posts/2022-10-27-mogp.html#helper-functions", + "title": "Multi-Output Gaussian Processes", + "section": "Helper functions", + "text": "Helper functions\n\ndef random_fill(key, params):\n values, unravel_fn = ravel_pytree(params)\n random_values = jax.random.normal(key, shape=values.shape)\n return unravel_fn(random_values)\n\ndef get_real_params(params):\n for i in range(1, q_len+1):\n params[f'a{i}'] = params[f'a{i}'].reshape(n_outputs, rank)\n if method == 'icm':\n params['var'] = jnp.exp(params['log_var'])\n params['scale'] = 
jnp.exp(params['log_scale'])\n params['noise'] = jnp.exp(params['log_noise'])\n elif method == 'lmc':\n for i in range(1, q_len+1):\n params[f'var{i}'] = jnp.exp(params[f'log_var{i}'])\n params[f'scale{i}'] = jnp.exp(params[f'log_scale{i}'])\n params[f'noise{i}'] = jnp.exp(params[f'log_noise{i}'])\n return params\n\ndef kron_cov_fn(params, x1, x2, add_noise=False):\n params = get_real_params(params)\n a_list = [params[f'a{i}'] for i in range(1, q_len+1)]\n\n if method == 'icm':\n kernel_fn = params['var'] * kernels.ExpSquared(scale=params['scale'])\n cov = kernel_fn(x1, x2)\n if add_noise:\n cov = cov + jnp.eye(cov.shape[0])*params['noise']\n\n B = jax.tree_util.tree_reduce(lambda x1, x2: x1@x1.T+x2@x2.T, a_list)\n# print(B.shape, cov.shape)\n return jnp.kron(B, cov)\n\n elif method == 'lmc':\n cov_list = []\n for idx in range(1, q_len+1):\n kernel_fn = params[f'var{idx}'] * kernels.ExpSquared(scale=params[f'scale{idx}'])\n cov = kernel_fn(x1, x2)\n if add_noise:\n cov = cov + jnp.eye(cov.shape[0])*params[f'noise{idx}']\n\n B = a_list[idx-1]@a_list[idx-1].T\n cov_list.append(jnp.kron(B, cov))\n \n return jax.tree_util.tree_reduce(lambda x1, x2: x1+x2, cov_list)" }, { - "objectID": "posts/AL_with_MNIST.html#divide-data-into-train-and-pool", - "href": "posts/AL_with_MNIST.html#divide-data-into-train-and-pool", - "title": "Active Learning with MNIST", - "section": "Divide data into train and pool", - "text": "Divide data into train and pool\n\ntrain_size = 200\nX_train, X_pool, y_c_train, y_c_pool = train_test_split(X, y_c, train_size=train_size, random_state=42)\nprint(X_train.shape, X_pool.shape, y_c_train.shape, y_c_pool.shape)\n\n# plot a bar chart of the number of samples in each class for the training and test set\nunique, counts = np.unique(y_c_train, return_counts=True)\nprint(\"Number of samples in each class for training set\", dict(zip(unique, counts)))\nprint(\"One v/s rest ratio\", counts[0]/counts[1], \"for training set\")\n\n(200, 784) (69800, 784) 
(200,) (69800,)\nNumber of samples in each class for training set {0: 179, 1: 21}\nOne v/s rest ratio 8.523809523809524 for training set" +    "objectID": "posts/2022-10-27-mogp.html#configuration", +    "href": "posts/2022-10-27-mogp.html#configuration", +    "title": "Multi-Output Gaussian Processes", +    "section": "Configuration", +    "text": "Configuration\n\nq_len = 2\nrank = 2 # if 1, slfm\nn_outputs = 2\n\nmethod = 'lmc' # lmc, icm\n\nIf rank is set to 1, LMC reduces to SLFM.\n\nGenerative process\n\nx_key = jax.random.PRNGKey(4)\n\nx = jax.random.uniform(x_key, shape=(40, 1)).sort(axis=0)\nx_test = jnp.linspace(0,1,100).reshape(-1, 1)\n\ne1_key, e2_key = jax.random.split(x_key)\n\ne1 = jax.random.normal(e1_key, shape=(x.shape[0],))\ne2 = jax.random.normal(e2_key, shape=(x.shape[0],))\n\nif method == 'icm':\n    noise = 0.01\n    gen_kernel = 1.2*kernels.ExpSquared(scale=0.2)\n    gen_covariance = gen_kernel(x, x) + jnp.eye(x.shape[0])*noise\n    gen_chol = jnp.linalg.cholesky(gen_covariance)\n    \n    y1 = gen_chol@e1\n    y2 = gen_chol@e2\n\n    y = jnp.concatenate([y1, y2])\n    \nelif method == 'lmc':\n    noise1 = 0.01\n    noise2 = 0.1\n    gen_kernel1 = 1.2*kernels.ExpSquared(scale=0.1)\n    gen_covariance1 = gen_kernel1(x, x) + jnp.eye(x.shape[0])*noise1\n    gen_chol1 = jnp.linalg.cholesky(gen_covariance1)\n\n    gen_kernel2 = 0.8*kernels.ExpSquared(scale=0.2)\n    gen_covariance2 = gen_kernel2(x, x) + jnp.eye(x.shape[0])*noise2\n    gen_chol2 = jnp.linalg.cholesky(gen_covariance2)\n    \n    y1 = gen_chol1@e1\n    y2 = gen_chol2@e2\n\n    y = jnp.concatenate([y1, y2])\n    \n\nplt.scatter(x, y1, label='y1')\nplt.scatter(x, y2, label='y2')\nplt.legend();\n\nWARNING:absl:No GPU/TPU found, falling back to CPU. 
(Set TF_CPP_MIN_LOG_LEVEL=0 and rerun for more info.)\n\n\n\n\n\n\n\n\n\n\ndef loss_fn(params):\n mo_cov = kron_cov_fn(params, x, x, add_noise=True)\n# print(y.shape, mo_cov.shape)\n return -jax.scipy.stats.multivariate_normal.logpdf(y, jnp.zeros_like(y), mo_cov)\n\n\nkey = jax.random.PRNGKey(1)\nif method == 'icm':\n params = {'log_var':0.0, 'log_scale':0.0, 'log_noise':0.0}\n for i in range(1, q_len+1):\n params[f'a{i}'] = jnp.zeros((n_outputs, rank))\nelif method == 'lmc':\n params = {}\n for i in range(1, q_len+1):\n params[f'a{i}'] = jnp.zeros((n_outputs, rank))\n params[f'log_var{i}'] = 0.0\n params[f'log_scale{i}'] = 0.0\n params[f'log_noise{i}'] = 0.0\n\nparams = random_fill(key, params)\nparams\n\n{'a1': DeviceArray([[-0.764527 , 1.0286916],\n [-1.0690447, -0.7921495]], dtype=float32),\n 'a2': DeviceArray([[ 0.8845895, -1.1941622],\n [-1.7434924, 1.5159688]], dtype=float32),\n 'log_noise1': DeviceArray(-1.1254696, dtype=float32),\n 'log_noise2': DeviceArray(-0.22446911, dtype=float32),\n 'log_scale1': DeviceArray(0.39719132, dtype=float32),\n 'log_scale2': DeviceArray(-0.22453257, dtype=float32),\n 'log_var1': DeviceArray(-0.7590596, dtype=float32),\n 'log_var2': DeviceArray(-0.08601531, dtype=float32)}\n\n\n\nloss_fn(params)\n\nDeviceArray(116.04026, dtype=float32)\n\n\n\nkey = jax.random.PRNGKey(3)\nparams = random_fill(key, params)\n\nn_iters = 1000\n\nvalue_and_grad_fn = jax.jit(jax.value_and_grad(loss_fn))\nopt = optax.adam(0.01)\nstate = opt.init(params)\n\ndef one_step(params_and_state, xs):\n params, state = params_and_state\n loss, grads = value_and_grad_fn(params)\n updates, state = opt.update(grads, state)\n params = optax.apply_updates(params, updates)\n return (params, state), (params, loss)\n\n(tuned_params, state), (params_history, loss_history) = jax.lax.scan(one_step, init=(params, state), xs=None, length=n_iters)\n\nplt.plot(loss_history);\n\n\n\n\n\n\n\n\n\ndef predict_fn(params, x_test):\n cov = kron_cov_fn(params, x, x, 
add_noise=True)\n test_cov = kron_cov_fn(params, x_test, x_test, add_noise=True)\n cross_cov = kron_cov_fn(params, x_test, x, add_noise=False)\n \n chol = jnp.linalg.cholesky(cov)\n k_inv_y = jax.scipy.linalg.cho_solve((chol, True), y)\n k_inv_cross_cov = jax.scipy.linalg.cho_solve((chol, True), cross_cov.T)\n\n pred_mean = cross_cov@k_inv_y\n pred_cov = test_cov - cross_cov@k_inv_cross_cov\n return pred_mean, pred_cov\n\n\npred_mean, pred_cov = predict_fn(tuned_params, x_test)\npred_conf = 2 * jnp.diag(pred_cov)**0.5\n\nplt.scatter(x, y1, label='y1')\nplt.scatter(x, y2, label='y2')\nplt.plot(x_test, pred_mean[:x_test.shape[0]], label='pred_y1')\nplt.plot(x_test, pred_mean[x_test.shape[0]:], label='pred_y2')\nplt.fill_between(x_test.ravel(), pred_mean[:x_test.shape[0]] - pred_conf[:x_test.shape[0]], pred_mean[:x_test.shape[0]] + pred_conf[:x_test.shape[0]], label='pred_conf_y1', alpha=0.3)\nplt.fill_between(x_test.ravel(), pred_mean[x_test.shape[0]:] - pred_conf[x_test.shape[0]:], pred_mean[x_test.shape[0]:] + pred_conf[x_test.shape[0]:], label='pred_conf_y2', alpha=0.3)\nplt.legend(bbox_to_anchor=(1,1));\n\n\n\n\n\n\n\n\n\nfor name, value in get_real_params(tuned_params).items():\n if not name.startswith('log_'):\n print(name, value)\n\na1 [[0.03664799 0.00039898]\n [0.3191718 0.00344488]]\na2 [[ 0.1351072 0.00248941]\n [-0.05392759 -0.04239884]]\nnoise1 0.6797133\nnoise2 0.4154678\nscale1 5.048228\nscale2 0.10743636\nvar1 0.016275918\nvar2 41.034225" }, { - "objectID": "posts/AL_with_MNIST.html#prof.-ermons-method", - "href": "posts/AL_with_MNIST.html#prof.-ermons-method", - "title": "Active Learning with MNIST", - "section": "Prof. Ermon’s method", - "text": "Prof. 
Ermon’s method\n\nX_train, X_pool, y_c_train, y_c_pool = train_test_split(X, y_c, train_size=train_size, random_state=42)\nprint(X_train.shape, X_pool.shape, y_c_train.shape, y_c_pool.shape)\n\nclf = RandomForestClassifier(n_estimators=100, max_depth=10, random_state=0, n_jobs=psutil.cpu_count()//2)\nclf.fit(X_train, y_c_train)\npreds = clf.predict(X_test)\n\ntest_recall = [recall_score(y_c_test, preds)]\ntest_precision = [precision_score(y_c_test, preds)]\npositives = [np.sum(y_c_train)]\nnegatives = [len(y_c_train) - positives[-1]]\nlabeling_cost = [0]\ntp = np.where((preds == 1) & (y_c_test == 1))[0]\nfp = np.where((preds == 1) & (y_c_test == 0))[0]\nprint(\"Test: Number of false positives\", len(fp), \"Number of true positives\", len(tp))\nprint(\"Iteration\", 0, \"Precision\", test_precision[-1], \"Recall\", test_recall[-1], \"Cost\", labeling_cost[-1])\n\nal_iters = 10\n\nfor iter in range(al_iters):\n print()\n preds = clf.predict(X_pool)\n # pred_proba = clf.predict_proba(X_pool)\n # print(pred_proba.shape)\n # identify instances predicted as positive but are actually negative (false positives)\n # we only pick points with more than 90% probability of being positive\n # fp = np.where((pred_proba[:, 1] > 0.8) & (y_pool == 0))[0]\n fp = np.where((preds == 1) & (y_c_pool == 0))[0]\n tp = np.where((preds == 1) & (y_c_pool == 1))[0]\n fn = np.where((preds == 0) & (y_c_pool == 1))[0]\n print(\"Pool: Number of false positives\", len(fp), \"Number of true positives\", len(tp), \"Number of false negatives\", len(fn))\n tp_fp = np.concatenate((tp, fp))\n # add them to the training set\n X_train = np.concatenate((X_train, X_pool[tp_fp]))\n y_c_train = np.concatenate((y_c_train, y_c_pool[tp_fp]))\n positives.append(np.sum(y_c_train))\n negatives.append(len(y_c_train) - positives[-1])\n # remove from the pool set\n X_pool = np.delete(X_pool, tp_fp, axis=0)\n y_c_pool = np.delete(y_c_pool, tp_fp)\n # add the cost of labeling to the list\n 
labeling_cost.append(len(tp_fp))\n # train the classifier again\n clf.fit(X_train, y_c_train)\n # predict on the test set\n preds = clf.predict(X_test)\n tp = np.where((preds == 1) & (y_c_test == 1))[0]\n fp = np.where((preds == 1) & (y_c_test == 0))[0]\n fn = np.where((preds == 0) & (y_c_test == 1))[0]\n print(\"Test: Number of false positives\", len(fp), \"Number of true positives\", len(tp), \"Number of false negatives\", len(fn))\n # calculate precision and recall\n test_recall.append(recall_score(y_c_test, preds))\n test_precision.append(precision_score(y_c_test, preds))\n # print information\n print(\"Iteration\", iter+1, \"Precision\", test_precision[-1], \"Recall\", test_recall[-1], \"Cost\", labeling_cost[-1])\n\nlabeling_cost = np.cumsum(labeling_cost)\n\n(200, 784) (69800, 784) (200,) (69800,)\nTest: Number of false positives 8 Number of true positives 283\nIteration 0 Precision 0.9725085910652921 Recall 0.2742248062015504 Cost 0\n\nPool: Number of false positives 73 Number of true positives 1884 Number of false negatives 5085\nTest: Number of false positives 209 Number of true positives 932 Number of false negatives 100\nIteration 1 Precision 0.8168273444347064 Recall 0.9031007751937985 Cost 1957\n\nPool: Number of false positives 1389 Number of true positives 4386 Number of false negatives 699\nTest: Number of false positives 489 Number of true positives 1016 Number of false negatives 16\nIteration 2 Precision 0.6750830564784053 Recall 0.9844961240310077 Cost 5775\n\nPool: Number of false positives 4088 Number of true positives 598 Number of false negatives 101\nTest: Number of false positives 12 Number of true positives 1006 Number of false negatives 26\nIteration 3 Precision 0.9882121807465619 Recall 0.9748062015503876 Cost 4686\n\nPool: Number of false positives 18 Number of true positives 15 Number of false negatives 86\nTest: Number of false positives 10 Number of true positives 1010 Number of false negatives 22\nIteration 4 Precision 
0.9901960784313726 Recall 0.9786821705426356 Cost 33\n\nPool: Number of false positives 13 Number of true positives 5 Number of false negatives 81\nTest: Number of false positives 11 Number of true positives 1010 Number of false negatives 22\nIteration 5 Precision 0.9892262487757101 Recall 0.9786821705426356 Cost 18\n\nPool: Number of false positives 5 Number of true positives 2 Number of false negatives 79\nTest: Number of false positives 10 Number of true positives 1011 Number of false negatives 21\nIteration 6 Precision 0.990205680705191 Recall 0.9796511627906976 Cost 7\n\nPool: Number of false positives 5 Number of true positives 1 Number of false negatives 78\nTest: Number of false positives 9 Number of true positives 1011 Number of false negatives 21\nIteration 7 Precision 0.9911764705882353 Recall 0.9796511627906976 Cost 6\n\nPool: Number of false positives 5 Number of true positives 2 Number of false negatives 76\nTest: Number of false positives 11 Number of true positives 1010 Number of false negatives 22\nIteration 8 Precision 0.9892262487757101 Recall 0.9786821705426356 Cost 7\n\nPool: Number of false positives 5 Number of true positives 2 Number of false negatives 74\nTest: Number of false positives 9 Number of true positives 1010 Number of false negatives 22\nIteration 9 Precision 0.9911678115799804 Recall 0.9786821705426356 Cost 7\n\nPool: Number of false positives 3 Number of true positives 1 Number of false negatives 73\nTest: Number of false positives 12 Number of true positives 1010 Number of false negatives 22\nIteration 10 Precision 0.9882583170254403 Recall 0.9786821705426356 Cost 4\n\n\n\n# plot the confusion matrix\nimport itertools\ncm = confusion_matrix(y_c_test, preds)\nplt.imshow(cm, interpolation=\"nearest\", cmap=plt.cm.Blues)\n# add the numbers inside the boxes\nthresh = cm.max() / 2.0\nfor i, j in itertools.product(range(cm.shape[0]), range(cm.shape[1])):\n plt.text(j, i, cm[i, j], horizontalalignment=\"center\", color=\"white\" if 
cm[i, j] > thresh else \"black\")\nplt.title(\"Confusion Matrix\")\nplt.xlabel(\"Predicted Label\")\nplt.ylabel(\"True Label\")\n\nText(0, 0.5, 'True Label')\n\n\n\n\n\n\n\n\n\n\npd.DataFrame({\"Cost\": labeling_cost, \"Train_Positives\": positives, \"Train_Negatives\": negatives, \"Test_Precision\": test_precision, \"Test_Recall\": test_recall})\n\n\n\n\n\n\n\n\nCost\nTrain_Positives\nTrain_Negatives\nTest_Precision\nTest_Recall\n\n\n\n\n0\n0\n21\n179\n0.972509\n0.274225\n\n\n1\n1957\n1905\n252\n0.816827\n0.903101\n\n\n2\n7732\n6291\n1641\n0.675083\n0.984496\n\n\n3\n12418\n6889\n5729\n0.988212\n0.974806\n\n\n4\n12451\n6904\n5747\n0.990196\n0.978682\n\n\n5\n12469\n6909\n5760\n0.989226\n0.978682\n\n\n6\n12476\n6911\n5765\n0.990206\n0.979651\n\n\n7\n12482\n6912\n5770\n0.991176\n0.979651\n\n\n8\n12489\n6914\n5775\n0.989226\n0.978682\n\n\n9\n12496\n6916\n5780\n0.991168\n0.978682\n\n\n10\n12500\n6917\n5783\n0.988258\n0.978682" + "objectID": "posts/2022-08-01-conditional_neural_processes.html", + "href": "posts/2022-08-01-conditional_neural_processes.html", + "title": "Conditional Neural Processes in JAX", + "section": "", + "text": "# Silence WARNING:root:The use of `check_types` is deprecated and does not have any effect.\n# https://github.com/tensorflow/probability/issues/1523\nimport logging\n\nlogger = logging.getLogger()\n\n\nclass CheckTypesFilter(logging.Filter):\n def filter(self, record):\n return \"check_types\" not in record.getMessage()\n\n\nlogger.addFilter(CheckTypesFilter())\n\nimport jax\nimport jax.numpy as jnp\nimport matplotlib.pyplot as plt\nfrom sklearn.model_selection import train_test_split\n\ntry:\n import flax.linen as nn\nexcept ModuleNotFoundError:\n %pip install flax\n import flax.linen as nn\n\ntry:\n import optax\nexcept ModuleNotFoundError:\n %pip install optax\n import optax\n\ntry:\n import tensorflow_probability.substrates.jax as tfp\nexcept ModuleNotFoundError:\n %pip install tensorflow-probability\n import 
tensorflow_probability.substrates.jax as tfp\ntfd = tfp.distributions"   },   {     -    "objectID": "posts/2021-03-22-gp_kernels.html",     -    "href": "posts/2021-03-22-gp_kernels.html",     -    "title": "Understanding Kernels in Gaussian Processes",     -    "section": "",     -    "text": "!pip install -qq GPy\nimport autograd.numpy as np\nimport pandas as pd\nimport GPy\nimport matplotlib.pyplot as plt\nfrom autograd import grad\nfrom matplotlib.animation import FuncAnimation\nfrom matplotlib import rc\nimport seaborn as sns" +    "objectID": "posts/2022-08-01-conditional_neural_processes.html#model", +    "href": "posts/2022-08-01-conditional_neural_processes.html#model", +    "title": "Conditional Neural Processes in JAX", +    "section": "Model", +    "text": "Model\n\nclass Encoder(nn.Module):\n    features: list\n    encoding_dims: int\n\n    @nn.compact\n    def __call__(self, x_context, y_context):\n        x = jnp.hstack([x_context, y_context.reshape(x_context.shape[0], -1)])\n        for n_features in self.features:\n            x = nn.Dense(n_features)(x)\n            x = nn.relu(x)\n\n        x = nn.Dense(self.encoding_dims)(x)\n\n        representation = x.mean(axis=0, keepdims=True)  # option 1\n        return representation  # (1, encoding_dims)\n\nclass Decoder(nn.Module):\n    features: list\n\n    @nn.compact\n    def __call__(self, representation, x):\n        representation = jnp.repeat(representation, x.shape[0], axis=0)\n        x = jnp.hstack([representation, x])\n\n        for n_features in self.features:\n            x = nn.Dense(n_features)(x)\n            x = nn.relu(x)\n\n        x = nn.Dense(2)(x)\n        loc, raw_scale = x[:, 0], x[:, 1]\n        scale = jax.nn.softplus(raw_scale)\n        \n        return loc, scale\n\nclass CNP(nn.Module):\n    encoder_features: list\n    encoding_dims: int\n    decoder_features: list\n\n    @nn.compact\n    def __call__(self, x_context, y_context, x_target):\n        representation = Encoder(self.encoder_features, self.encoding_dims)(x_context, y_context)\n        loc, scale = Decoder(self.decoder_features)(representation, x_target)\n        return loc, scale\n\n    def loss_fn(self, params, x_context, y_context, x_target, y_target):\n        
loc, scale = self.apply(params, x_context, y_context, x_target)\n predictive_distribution = tfd.MultivariateNormalDiag(loc=loc, scale_diag=scale)\n return -predictive_distribution.log_prob(y_target)" }, { - "objectID": "posts/2021-03-22-gp_kernels.html#rbf-radial-basis-function-kernel-stationarity-and-isotropy", - "href": "posts/2021-03-22-gp_kernels.html#rbf-radial-basis-function-kernel-stationarity-and-isotropy", - "title": "Understanding Kernels in Gaussian Processes", - "section": "RBF (Radial basis function) Kernel, Stationarity and Isotropy", - "text": "RBF (Radial basis function) Kernel, Stationarity and Isotropy\nRBF is one of the most commonly used kernels in GPs due to it’s infinetely differentiability (extreme flexibility). This property helps us to model a vast variety of functions \\(X \\to Y\\).\nRBF kernel is given as the following, \\[\n\\begin{aligned}\n\\mathcal{K}(x_1,x_2)= \\sigma^2exp\\left(-\\frac{(x-x')^2}{2l^2}\\right)\n\\end{aligned}\n\\] Where, \\(\\sigma^2\\) is variance and \\(l\\) is known as lengthscale. #### Stationarity RBF is a stationary kernel and so it is invariant to translation in the input space. In other words, \\(\\mathcal{K}(x,x')\\) depends only on \\(x-x'\\).\n\nIsotropy\nRBF is also isotropic kernel, which means that \\(\\mathcal{K}(x,x')\\) depends only on \\(|x-x'|\\). 
Thus, we have \\(\\mathcal{K}(x,x') = \\mathcal{K}(x',x)\\).\nLet’s visualize few functions drawn from the RBF kernel\n\ndef K_rbf(X1, X2, sigma=1., l=1.):\n return (sigma**2)*(np.exp(-0.5*np.square(X1-X2.T)/l**2))\n\n\n\nHelper functions\n\ndef plot_functions(kernel_func, ax0_ylim=(-3,3), ax1_ylim=(-0.1,1.1)):\n mean = np.zeros(X.shape[0])\n cov = kernel_func(X, X, sigma, l)\n functions = np.random.multivariate_normal(mean, cov, size=5)\n fig = plt.figure(figsize=(14,8), constrained_layout=True)\n gs = fig.add_gridspec(2,4)\n ax0 = fig.add_subplot(gs[0, 1:-1])\n ax0.set_ylim(*ax0_ylim)\n ax1 = fig.add_subplot(gs[1, 0:2])\n ax1.set_ylim(*ax1_ylim)\n ax2 = fig.add_subplot(gs[1, 2:4])\n for func in functions:\n ax0.plot(X, func,'o-');\n ax0.set_xlabel('X');ax0.set_ylabel('Y');ax0.set_title('Functions drawn from '+k_name+' kernel');\n ax1.plot(X, cov[:,4]);ax1.set_title('K(0,X)');ax1.set_xlabel('X');ax1.set_ylabel('K(0,X)')\n sns.heatmap(cov.round(2), ax=ax2, xticklabels=X.ravel(), yticklabels=X.ravel(), annot=True);\n ax2.set_xlabel('X');ax2.set_ylabel('X');ax2.set_title('Covariance matrix');\n\ndef animate_functions(kernel_func, val_list, ax0_ylim=(-3,3), ax1_ylim=(-0.1,1.1), \n k_name='',p_name='',symbol=''):\n fig = plt.figure(figsize=(14,8))\n gs = fig.add_gridspec(2,4)\n ax0 = fig.add_subplot(gs[0, 1:-1]);ax1 = fig.add_subplot(gs[1, 0:2]);ax2 = fig.add_subplot(gs[1, 2:4]);\n def update(p):\n ax0.cla();ax1.cla();ax2.cla();\n ax0.set_ylim(*ax0_ylim);ax1.set_ylim(*ax1_ylim)\n if p_name == 'Lengthscale':\n cov = kernel_func(X, X, l=p)\n elif p_name == 'Variance':\n cov = kernel_func(X, X, sigma=np.sqrt(p))\n elif p_name == 'Offset':\n cov = kernel_func(X, X, c=p)\n elif p_name == 'Period':\n cov = kernel_func(X, X, p=p)\n functions = np.random.multivariate_normal(mean, cov, size=5)\n for func in functions:\n ax0.plot(X, func,'o-');\n ax0.set_xlabel('X');ax0.set_ylabel('Y');ax0.set_title('Functions drawn from '+k_name+' kernel\\n'+p_name+' ('+symbol+') = '+str(p));\n 
ax1.plot(X, cov[:,4]);ax1.set_title('K(0,X)');ax1.set_title('K(0,X)');ax1.set_xlabel('X');ax1.set_ylabel('K(0,X)')\n sns.heatmap(cov.round(2), ax=ax2, xticklabels=X.ravel(), yticklabels=X.ravel(), annot=True, cbar=False);\n ax2.set_xlabel('X');ax2.set_ylabel('X');ax2.set_title('Covariance matrix');\n\n anim = FuncAnimation(fig, update, frames=val_list, blit=False)\n plt.close()\n rc('animation', html='jshtml')\n return anim\n\nVerifying if our kernel is consistent with GPy kernels.\n\nX = np.linspace(101,1001,200).reshape(-1,1)\nsigma, l = 7, 11\nassert np.allclose(K_rbf(X,X,sigma,l), GPy.kern.RBF(1, variance=sigma**2, lengthscale=l).K(X,X)) \n\n\nnp.random.seed(0)\nX = np.arange(-4,5).reshape(-1,1)\nsigma = 1.\nl = 3.\nk_name = 'RBF'\nplot_functions(K_rbf, ax0_ylim=(-3.5,3))\n\n\n\n\n\n\n\n\nLet’s see the effect of varying parameters \\(\\sigma\\) and \\(l\\) of the RBF kernel function.\n\nnp.random.seed(0)\nsigma = 1.\nval_list = [0.5,1,2,3,4,5]\nanimate_functions(K_rbf, val_list, k_name='RBF', p_name='Lengthscale', symbol='l')\n\n\n\n\n\n\n\n\n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n Once\n \n Loop\n \n Reflect\n \n \n\n\n\n\n\n\n\nl = 1.\nval_list = [1,4,9,16,25]\nanimate_functions(K_rbf, val_list, ax0_ylim=(-12,12), ax1_ylim=(-0.1, 26),\n k_name='RBF', p_name='Variance', symbol='sigma')\n\n\n\n\n\n\n\n\n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n Once\n \n Loop\n \n Reflect\n \n \n\n\n\n\n\n\nWith increase in value of \\(l\\), functions drawn from the kernel become smoother. Covariance between a pair of points is increasing with increase in \\(l\\).\nIncreasing \\(\\sigma^2\\) increase the overall uncertainty (width of the space where 95% of the functions live) across all the points." 
+ "objectID": "posts/2022-08-01-conditional_neural_processes.html#data", + "href": "posts/2022-08-01-conditional_neural_processes.html#data", + "title": "Conditional Neural Processes in JAX", + "section": "Data", + "text": "Data\n\nN = 100\nseed = jax.random.PRNGKey(0)\nx = jnp.linspace(-1, 1, N).reshape(-1, 1)\nf = lambda x: (jnp.sin(10*x) + x).flatten()\nnoise = jax.random.normal(seed, shape=(N,)) * 0.2\ny = f(x) + noise\n\nx_test = jnp.linspace(-2, 2, N*2+10).reshape(-1, 1)\ny_test = f(x_test) \n\nplt.scatter(x, y, label='train', zorder=5)\nplt.scatter(x_test, y_test, label='test', alpha=0.5)\nplt.legend();" }, { - "objectID": "posts/2021-03-22-gp_kernels.html#matern-kernel", - "href": "posts/2021-03-22-gp_kernels.html#matern-kernel", - "title": "Understanding Kernels in Gaussian Processes", - "section": "Matern Kernel", - "text": "Matern Kernel\nMatern kernels are given by a general formula as following, \\[\n\\begin{aligned}\n\\mathcal{K}(x_1, x_2) = \\sigma^2\\frac{1}{\\Gamma(\\nu)2^{\\nu-1}}\\Bigg(\n\\frac{\\sqrt{2\\nu}}{l} |x_1-x_2|\n\\Bigg)^\\nu K_\\nu\\Bigg(\n\\frac{\\sqrt{2\\nu}}{l} |x_1-x_2|\\Bigg)\n\\end{aligned}\n\\] Where, \\(\\Gamma\\) is gamma function and \\(K_\\nu\\) is modified Bessel function of second order.\nThe general formula is not very intuitive about the functionality of this kernel. 
In practice, Matern with \\(\\nu=\\{0.5,1.5,2.5\\}\\) are used, where GP with each kernel is \\((\\lceil\\nu\\rceil-1)\\) times differentiable.\nMatern functions corresponding to each \\(\\nu\\) values are defined as the following, \\[\n\\begin{aligned}\nMatern12 \\to \\mathcal{K_{\\nu=0.5}}(x_1, x_2) &= \\sigma^2exp\\left(-\\frac{|x_1-x_2|}{l}\\right)\\\\\nMatern32 \\to \\mathcal{K_{\\nu=1.5}}(x_1, x_2) &= \\sigma^2\\left(1+\\frac{\\sqrt{3}|x_1-x_2|}{l}\\right)exp\\left(-\\frac{\\sqrt{3}|x_1-x_2|}{l}\\right)\\\\\nMatern52 \\to \\mathcal{K_{\\nu=2.5}}(x_1, x_2) &= \\sigma^2\\left(1+\\frac{\\sqrt{5}|x_1-x_2|}{l}+\\frac{5(x_1-x_2)^2)}{3l^2}\\right)exp\\left(-\\frac{\\sqrt{5}|x_1-x_2|}{l}\\right)\n\\end{aligned}\n\\] Matern kernels are stationary as well as isotropic. With \\(\\nu \\to \\infty\\) they converge to \\(RBF\\) kernel. \\(Matern12\\) is also known as \\(Exponential\\) kernel in toolkits such as GPy.\nNow, let’s draw few functions from each of these versions and try to get intuition behind each of them.\n\ndef K_m12(X1, X2, sigma=1., l=1.): # v = 0.5\n return (sigma**2)*(np.exp(-np.abs(X1-X2.T)/l))\ndef K_m32(X1, X2, sigma=1., l=1.): # v = 1.5\n return (sigma**2)*(1+((3**0.5)*np.abs(X1-X2.T))/l)*(np.exp(-(3**0.5)*np.abs(X1-X2.T)/l))\ndef K_m52(X1, X2, sigma=1., l=1.): # v = 2.5\n return (sigma**2)*(1+(((5**0.5)*np.abs(X1-X2.T))/l)+((5*(X1-X2.T)**2)/(3*l**2)))*\\\n (np.exp(-(5**0.5)*np.abs(X1-X2.T)/l))\n\nVerifying if our kernels are consistent with GPy kernels.\n\nX = np.linspace(101,1001,50).reshape(-1,1)\nassert np.allclose(K_m32(X,X,sigma=7.,l=11.), GPy.kern.Matern32(1,lengthscale=11.,variance=7**2).K(X,X))\nassert np.allclose(K_m52(X,X,sigma=7.,l=11.), GPy.kern.Matern52(1,lengthscale=11.,variance=7**2).K(X,X))\n\n\nX = np.arange(-4,5).reshape(-1,1)\nsigma = 1.\nl = 3.\n\nfig, ax = plt.subplots(3,2,figsize=(14,10))\nnames = ['Matern12', 'Matern32', 'Matern52']\nfor k_i, kernel in enumerate([K_m12, K_m32, K_m52]):\n mean = np.zeros(X.shape[0])\n cov = 
kernel(X, X, sigma, l)\n functions = np.random.multivariate_normal(mean, cov, size=5)\n for func in functions:\n ax[k_i,0].plot(X, func);\n ax[k_i,0].set_xlabel('X');ax[k_i,0].set_ylabel('Y');ax[k_i,0].set_title('Functions drawn from '+names[k_i]+' kernel');\n sns.heatmap(cov.round(2), ax=ax[k_i,1], xticklabels=X.ravel(), yticklabels=X.ravel(), annot=True);\n ax[k_i,1].set_xlabel('X');ax[k_i,1].set_ylabel('X');ax[k_i,1].set_title('Covariance matrix');\nplt.tight_layout();\n\n\n\n\n\n\n\n\nFrom the above plot, we can say that smoothness is increasing in functions as we increase \\(\\nu\\). Thus, smoothness of functions in terms of kernels is in the following order: Matern12<Matern32<Matern52.\nLet us see effect of varying \\(\\sigma\\) and \\(l\\) on Matern32 which is more popular among the three.\n\nnp.random.seed(0)\nsigma = 1.\nval_list = [0.5,1,2,3,4,5]\nanimate_functions(K_m32, val_list, k_name='Matern32', p_name='Lengthscale', symbol='l')\n\n\n\n\n\n\n\n\n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n Once\n \n Loop\n \n Reflect\n \n \n\n\n\n\n\n\nWe can see that Matern32 kernel behaves similar to RBF with varying \\(l\\). Though, Matern32 is less smoother than RBF. 
A quick comparison would clarify this.\n\nX = np.linspace(-10,10,100).reshape(-1,1)\nplt.plot(X, K_rbf(X,X, l=3.)[:,50], label='RBF')\nplt.plot(X, K_m32(X,X, l=3.)[:,50], label='Matern32')\nplt.legend();plt.xlabel('X');plt.ylabel('Covariance (K(0,X))');\nplt.title('K(0,X)');" + "objectID": "posts/2022-08-01-conditional_neural_processes.html#training", + "href": "posts/2022-08-01-conditional_neural_processes.html#training", + "title": "Conditional Neural Processes in JAX", + "section": "Training", + "text": "Training\n\ndef train_fn(model, optimizer, seed, n_iterations, n_context):\n params = model.init(seed, x, y, x)\n value_and_grad_fn = jax.value_and_grad(model.loss_fn)\n state = optimizer.init(params)\n indices = jnp.arange(N)\n \n def one_step(params_and_state, seed):\n params, state = params_and_state\n shuffled_indices = jax.random.permutation(seed, indices)\n context_indices = shuffled_indices[:n_context]\n target_indices = shuffled_indices[n_context:]\n x_context, y_context = x[context_indices], y[context_indices]\n x_target, y_target = x[target_indices], y[target_indices]\n loss, grads = value_and_grad_fn(params, x_context, y_context, x_target, y_target)\n updates, state = optimizer.update(grads, state)\n params = optax.apply_updates(params, updates)\n return (params, state), loss\n\n seeds = jax.random.split(seed, num=n_iterations)\n (params, state), losses = jax.lax.scan(one_step, (params, state), seeds)\n return params, losses\n\n\nencoder_features = [64, 16, 8]\nencoding_dims = 1\ndecoder_features = [16, 8]\nmodel = CNP(encoder_features, encoding_dims, decoder_features)\noptimizer = optax.adam(learning_rate=0.001)\n\nseed = jax.random.PRNGKey(2)\nn_context = int(0.7 * N)\nn_iterations = 20000\n\nparams, losses = train_fn(model, optimizer, seed, n_iterations=n_iterations, n_context=n_context)\n\n\nplt.plot(losses);" }, { - "objectID": "posts/2021-03-22-gp_kernels.html#periodic-kernel", - "href": "posts/2021-03-22-gp_kernels.html#periodic-kernel", - 
"title": "Understanding Kernels in Gaussian Processes", - "section": "Periodic Kernel", - "text": "Periodic Kernel\nPeriodic Kernel is given as the following, \\[\n\\begin{aligned}\n\\mathcal{K}(x_1,x_2)= \\sigma^2\\exp\\left(-\\frac{\\sin^2(\\pi|x_1 - x_2|/p)}{2l^2}\\right)\n\\end{aligned}\n\\] Where \\(p\\) is period. Let’s visualize few functions drawn from this kernel.\n\ndef K_periodic(X1, X2, sigma=1., l=1., p=3.):\n return sigma**2 * np.exp(-0.5*np.square(np.sin(np.pi*(X1-X2.T)/p))/l**2)\n\nX = np.linspace(10,1001,50).reshape(-1,1)\nassert np.allclose(K_periodic(X,X,sigma=7.,l=11.,p=3.), \n GPy.kern.StdPeriodic(1,lengthscale=11.,variance=7**2,period=3.).K(X,X))\n\n\nnp.random.seed(0)\nX = np.arange(-4,5).reshape(-1,1)\nsigma = 1\nl = 1.\np = 3.\nk_name = 'Periodic'\nplot_functions(K_periodic)\n\n\n\n\n\n\n\n\nWe will investigate the effect of varying period \\(p\\) now.\n\nnp.random.seed(0)\nval_list = [1., 2., 3., 4., 5.]\n\nanimate_functions(K_periodic, val_list, ax1_ylim=(0.4,1.1),\n k_name='Periodic',p_name='Period')\n\n\n\n\n\n\n\n\n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n Once\n \n Loop\n \n Reflect\n \n \n\n\n\n\n\n\nFrom the above animation we can see that, all points that are \\(p\\) distance apart from each other have exactly same values because they have correlation of exactly 1 (\\(\\sigma=1 \\to covariance=correlation\\)).\nNow, we will investigate effect of lenging lengthscale \\(l\\) while other parameters are constant.\n\nnp.random.seed(0)\nval_list = [1., 2., 3., 4., 5.]\n\nanimate_functions(K_periodic, val_list, ax1_ylim=(0.6,1.1),\n k_name='Periodic',p_name='Lengthscale', symbol='l')\n\n\n\n\n\n\n\n\n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n Once\n \n Loop\n \n Reflect\n \n \n\n\n\n\n\n\nWe can see that correlation between a pair of locations \\(\\{x_1,x_2|x_1-x_2<p\\}\\) increases as the lengthscale is increased." 
+ "objectID": "posts/2022-08-01-conditional_neural_processes.html#predict", + "href": "posts/2022-08-01-conditional_neural_processes.html#predict", + "title": "Conditional Neural Processes in JAX", + "section": "Predict", + "text": "Predict\n\nloc, scale = model.apply(params, x, y, x_test)\nlower, upper = loc - 2*scale, loc + 2*scale\n\nplt.scatter(x, y, label='train', alpha=0.5)\nplt.scatter(x_test, y_test, label='test', alpha=0.5)\nplt.plot(x_test, loc);\nplt.fill_between(x_test.flatten(), lower, upper, alpha=0.4);\nplt.ylim(-5, 5);" }, { - "objectID": "posts/2021-03-22-gp_kernels.html#linear-kernel", - "href": "posts/2021-03-22-gp_kernels.html#linear-kernel", - "title": "Understanding Kernels in Gaussian Processes", - "section": "Linear Kernel", - "text": "Linear Kernel\nLinear kernel (a.k.a. dot-product kernel) is given as the following, \\[\n\\begin{aligned}\n\\mathcal{K}(x_1,x_2)= (x_1-c)(x_2-c)+\\sigma^2\n\\end{aligned}\n\\] Let’s visualize few functions drawn from the linear kernel\n\ndef K_lin(X1, X2, sigma=1., c=1.):\n return (X1-c)@(X2.T-c) + sigma**2\n\n\nnp.random.seed(0)\nsigma = 1.\nc = 1.\n\nplot_functions(K_lin, ax0_ylim=(-10,5), ax1_ylim=(-3,7))\n\n\n\n\n\n\n\n\nLet’s see the effect of varying parameters \\(\\sigma\\) and \\(c\\) of the linear kernel function.\n\nval_list = [-3,-2,-1,0,1,2,3]\n\nanimate_functions(K_lin, val_list, ax0_ylim=(-15,12), ax1_ylim=(-3,23), \n p_name='Offset', symbol='c')\n\n\n\n\n\n\n\n\n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n Once\n \n Loop\n \n Reflect\n \n \n\n\n\n\n\n\n\nnp.random.seed(1)\nval_list = np.square(np.array([1,2,3,4,5,8]))\n\nanimate_functions(K_lin, val_list, ax0_ylim=(-25,15), ax1_ylim=(-5,110), \n p_name='Variance', symbol='sigma')\n\n\n\n\n\n\n\n\n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n Once\n \n Loop\n \n Reflect\n \n \n\n\n\n\n\n\nVarying \\(c\\) parameter changes position of shallow region in covariance matrix. 
In other words, as \\(x \\to c\\), points close to \\(x\\) have variance \\(\\to \\sigma^2\\). Distant points have monotonically increasing variance.\nIncreasing \\(\\sigma^2\\) adds a constant in all variance and covariances. So, it allows more uncertainty across all points and weakens the monotonic trend of variance over distant points.\n\nNon-stationary behaviour of Linear kernel\nUnlike other stationary kernels, Linear kernel is not invariant of translations in the input space. The comparison below, visually supports this claim.\n\nfig, ax = plt.subplots(2,2,figsize=(14,8), sharex=True)\nkerns = [K_rbf, K_m32, K_periodic, K_lin]\nk_names = ['RBF', 'Matern32', 'Periodic', 'Linear']\nX = np.linspace(-10,10,21).reshape(-1,1)\ndef update(x):\n count = 0\n for i in range(2):\n for j in range(2):\n ax.ravel()[count].cla()\n tmp_kern = kerns[count]\n mean = np.zeros(X.shape[0])\n cov = tmp_kern(X,X)\n ax.ravel()[count].plot(X, cov[:,x]);\n ax.ravel()[count].set_xlim(X[x-3],X[x+3])\n ax.ravel()[count].set_xlabel('X');\n ax.ravel()[count].set_ylabel('K('+str(X[x].round(2))+',X)');\n ax.ravel()[count].set_title('Covariance K('+str(X[x].round(2))+',X) for '+k_names[count]+' kernel');\n count += 1\n ax.ravel()[3].set_ylim(-5,80)\n plt.tight_layout()\n\nanim = FuncAnimation(fig, update, frames=[5,7,9,11,13,15], blit=False)\nplt.close()\nrc('animation', html='jshtml')\nanim\n\n\n\n\n\n\n\n\n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n Once\n \n Loop\n \n Reflect\n \n \n\n\n\n\n\n\n<Figure size 432x288 with 0 Axes>" + "objectID": "posts/2022-10-18-kfac-laplace.html", + "href": "posts/2022-10-18-kfac-laplace.html", + "title": "Train NN with KFAC-Laplace in JAX", + "section": "", + "text": "from math import prod\nfrom functools import partial\nfrom time import time\n\nimport blackjax\nimport flax.linen as nn\nimport jax\nfrom jax.flatten_util import ravel_pytree\nimport jax.tree_util as jtu\nimport jax.numpy as jnp\n# 
jnp.set_printoptions(linewidth=2000)\n\nimport optax\nfrom tqdm import trange\n\nimport arviz as az\nimport seaborn as sns\n\nimport matplotlib.pyplot as plt\njax.config.update(\"jax_enable_x64\", False)\n\n%reload_ext watermark\n\nSome helper functions:\n\njitter = 1e-6\n\ndef get_shapes(params):\n return jtu.tree_map(lambda x:x.shape, params)\n\ndef svd_inverse(matrix):\n U, S, V = jnp.linalg.svd(matrix+jnp.eye(matrix.shape[0])*jitter)\n \n return V.T/S@U.T\n\n\nDataset\nWe take the XOR dataset to begin with:\n\nX = jnp.array([[0, 0], [0, 1], [1, 0], [1, 1]])\ny = jnp.array([0, 1, 1, 0])\n\nX.shape, y.shape\n\nWARNING:absl:No GPU/TPU found, falling back to CPU. (Set TF_CPP_MIN_LOG_LEVEL=0 and rerun for more info.)\n\n\n((4, 2), (4,))\n\n\n\n\nNN model\n\nclass MLP(nn.Module):\n features: list\n\n @nn.compact\n def __call__(self, x):\n for n_features in self.features[:-1]:\n x = nn.Dense(n_features, kernel_init=jax.nn.initializers.glorot_normal(), bias_init=jax.nn.initializers.normal())(x)\n x = nn.relu(x)\n \n x = nn.Dense(self.features[-1])(x)\n return x.ravel()\n\nLet us initialize the weights of the NN and inspect the shapes of the parameters:\n\nfeatures = [2, 1]\nkey = jax.random.PRNGKey(0)\n\nmodel = MLP(features)\nparams = model.init(key, X).unfreeze()\n\nget_shapes(params)\n\n{'params': {'Dense_0': {'bias': (2,), 'kernel': (2, 2)},\n 'Dense_1': {'bias': (1,), 'kernel': (2, 1)}}}\n\n\n\nmodel.apply(params, X)\n\nDeviceArray([ 0.00687164, -0.01380461, 0. , 0. 
], dtype=float32)\n\n\n\n\nNegative Log Joint\n\nnoise_var = 0.1\n\ndef neg_log_joint(params):\n y_pred = model.apply(params, X)\n flat_params = ravel_pytree(params)[0]\n log_prior = jax.scipy.stats.norm.logpdf(flat_params).sum()\n log_likelihood = jax.scipy.stats.norm.logpdf(y, loc=y_pred, scale=noise_var).sum()\n \n return -(log_prior + log_likelihood)\n\nTesting if it works:\n\nneg_log_joint(params)\n\nDeviceArray(105.03511, dtype=float32)\n\n\n\n\nFind MAP\n\nkey = jax.random.PRNGKey(0)\nparams = model.init(key, X).unfreeze()\nn_iters = 1000\n\nvalue_and_grad_fn = jax.jit(jax.value_and_grad(neg_log_joint))\nopt = optax.adam(0.01)\nstate = opt.init(params)\n\ndef one_step(params_and_state, xs):\n params, state = params_and_state\n loss, grads = value_and_grad_fn(params)\n updates, state = opt.update(grads, state)\n params = optax.apply_updates(params, updates)\n return (params, state), loss\n \n(params, state), losses = jax.lax.scan(one_step, init=(params, state), xs=None, length=n_iters)\n\nplt.plot(losses);\n\n\n\n\n\n\n\n\n\ny_map = model.apply(params, X)\ny_map\n\nDeviceArray([0.01383345, 0.98666817, 0.98563665, 0.01507111], dtype=float32)\n\n\n\nx = jnp.linspace(-0.1,1.1,100)\nX1, X2 = jnp.meshgrid(x, x)\n\ndef predict_fn(x1, x2):\n return model.apply(params, jnp.array([x1,x2]).reshape(1,2))\n\npredict_fn_vec = jax.jit(jax.vmap(jax.vmap(predict_fn)))\n\nZ = predict_fn_vec(X1, X2).squeeze()\n\nplt.contourf(X1, X2, Z)\nplt.colorbar();\n\n\n\n\n\n\n\n\n\n\nFull Hessian Laplace\n\nflat_params, unravel_fn = ravel_pytree(params)\n\ndef neg_log_joint_flat(flat_params):\n return neg_log_joint(unravel_fn(flat_params))\n\nH = jax.hessian(neg_log_joint_flat)(flat_params)\n\nsns.heatmap(H);\n\n\n\n\n\n\n\n\n\nposterior_cov = svd_inverse(H)\n\nsns.heatmap(posterior_cov);\n\n\n\n\n\n\n\n\nNote that we can sample parameters from the posterior and revert them to correct structure with the unravel_fn. 
Here is a class to do it all:\n\nclass FullHessianLaplace:\n def __init__(self, map_params, model):\n flat_params, self.unravel_fn = ravel_pytree(map_params)\n\n def neg_log_joint_flat(flat_params):\n params = self.unravel_fn(flat_params)\n return neg_log_joint(params)\n\n self.H = jax.hessian(neg_log_joint_flat)(flat_params)\n \n self.mean = flat_params\n self.cov = svd_inverse(self.H)\n self.model = model\n\n def _vectorize(self, f, seed, shape, f_kwargs={}):\n length = prod(shape)\n seeds = jax.random.split(seed, num=length).reshape(shape+(2,))\n \n sample_fn = partial(f, **f_kwargs)\n for _ in shape:\n sample_fn = jax.vmap(sample_fn)\n \n return sample_fn(seed=seeds)\n \n def _sample(self, seed):\n sample = jax.random.multivariate_normal(seed, mean=self.mean, cov=self.cov)\n return self.unravel_fn(sample)\n \n def sample(self, seed, shape):\n return self._vectorize(self._sample, seed, shape)\n \n def _predict(self, X, seed):\n sample = self._sample(seed)\n return self.model.apply(sample, X)\n \n def predict(self, X, seed, shape):\n return self._vectorize(self._predict, seed, shape, {'X': X})\n\n\nEstimating predictive posterior\n\nposterior = FullHessianLaplace(params, model)\n\nseed = jax.random.PRNGKey(1)\nn_samples = 100000\ny_pred_full = posterior.predict(X, seed=seed, shape=(n_samples,))\nulim = 5\nllim = -5\n\nfig, ax = plt.subplots(2,2,figsize=(12,4))\nax=ax.ravel()\nfor i in range(len(y)):\n az.plot_dist(y_pred_full[:, i], ax=ax[i]);\n ax[i].grid(True)\n ax[i].set_xticks(range(llim,ulim))\n ax[i].set_xlim(llim, ulim)\n ax[i].set_title(f\"X={X[i]}, y_pred_mean={y_pred_full[:, i].mean():.3f}, y_map={y_map[i]:.3f}\")\nfig.tight_layout()\n\n\n\n\n\n\n\n\n\n\n\nKFAC-Laplace\nWe need to invert partial Hessians to do KFAC-Laplace. We can use tree_flatten with ravel_pytree to ease the workflow. We need to: 1. Pick up partial Hessians in pure matrix form to be able to invert them. 2. Create layer-wise distributions and sample them. These samples will be 1d arrays. 3. 
We need to convert those 1d arrays to params dictionary form so that we can plug it into the flax model and get posterior predictions.\nFirst, we need to segregate the parameters layer-wise. We will use the is_leaf condition to stop traversing the parameter PyTree at a particular depth. See how it is different from vanilla tree_flatten:\n\nflat_params, tree_def = jtu.tree_flatten(params)\ndisplay(flat_params, tree_def)\n\n[DeviceArray([-0.00024913, 0.00027019], dtype=float32),\n DeviceArray([[ 0.8275324 , -0.8314813 ],\n [-0.8276633 , 0.83254045]], dtype=float32),\n DeviceArray([0.01351773], dtype=float32),\n DeviceArray([[1.1750739],\n [1.1685134]], dtype=float32)]\n\n\nPyTreeDef({'params': {'Dense_0': {'bias': *, 'kernel': *}, 'Dense_1': {'bias': *, 'kernel': *}}})\n\n\n\nis_leaf = lambda param: 'bias' in param\nlayers, tree_def = jtu.tree_flatten(params, is_leaf=is_leaf)\ndisplay(layers, tree_def)\n\n[{'bias': DeviceArray([-0.00024913, 0.00027019], dtype=float32),\n 'kernel': DeviceArray([[ 0.8275324 , -0.8314813 ],\n [-0.8276633 , 0.83254045]], dtype=float32)},\n {'bias': DeviceArray([0.01351773], dtype=float32),\n 'kernel': DeviceArray([[1.1750739],\n [1.1685134]], dtype=float32)}]\n\n\nPyTreeDef({'params': {'Dense_0': *, 'Dense_1': *}})\n\n\nThe difference is clearly evident. 
Now, we need to flatten the inner dictionaries to get 1d arrays.\n\nflat_params = list(map(lambda x: ravel_pytree(x)[0], layers))\nunravel_fn_list = list(map(lambda x: ravel_pytree(x)[1], layers))\ndisplay(flat_params, unravel_fn_list)\n\n[DeviceArray([-2.4912864e-04, 2.7019347e-04, 8.2753241e-01,\n -8.3148128e-01, -8.2766330e-01, 8.3254045e-01], dtype=float32),\n DeviceArray([0.01351773, 1.1750739 , 1.1685134 ], dtype=float32)]\n\n\n[<function jax._src.flatten_util.ravel_pytree.<locals>.<lambda>(flat)>,\n <function jax._src.flatten_util.ravel_pytree.<locals>.<lambda>(flat)>]\n\n\n\ndef modified_neg_log_joint_fn(flat_params):\n layers = jtu.tree_map(lambda unravel_fn, flat_param: unravel_fn(flat_param), unravel_fn_list, flat_params)\n params = tree_def.unflatten(layers)\n return neg_log_joint(params)\n\nfull_hessian = jax.hessian(modified_neg_log_joint_fn)(flat_params)\n\n# Pick diagonal entries from the Hessian\nuseful_hessians = [full_hessian[i][i] for i in range(len(full_hessian))]\nuseful_hessians\n\n[DeviceArray([[139.07985, 0. , 138.07985, 0. , 0. ,\n 0. ],\n [ 0. , 410.62708, 0. , 136.54236, 0. ,\n 273.08472],\n [138.07985, 0. , 139.07985, 0. , 0. ,\n 0. ],\n [ 0. , 136.54236, 0. , 137.54236, 0. ,\n 136.54236],\n [ 0. , 0. , 0. , 0. , 1. ,\n 0. ],\n [ 0. , 273.08472, 0. , 136.54236, 0. ,\n 274.08472]], dtype=float32),\n DeviceArray([[400.99997, 82.72832, 83.44101],\n [ 82.72832, 69.43975, 0. ],\n [ 83.44101, 0. , 70.35754]], dtype=float32)]\n\n\nEach entry in above list corresponds to layer-wise hessian matrices. 
Now, we need to create layer-wise distributions, sample from them, and reconstruct params using tricks similar to those we used above:\n\nclass KFACHessianLaplace:\n def __init__(self, map_params, model):\n self.model = model\n layers, self.tree_def = jtu.tree_flatten(map_params, is_leaf=lambda x: 'bias' in x)\n flat_layers = [ravel_pytree(layer) for layer in layers]\n self.means = list(map(lambda x: x[0], flat_layers))\n self.unravel_fn_list = list(map(lambda x: x[1], flat_layers))\n\n def neg_log_joint_flat(flat_params):\n flat_layers = [self.unravel_fn_list[i](flat_params[i]) for i in range(len(flat_params))]\n params = self.tree_def.unflatten(flat_layers)\n return neg_log_joint(params)\n\n self.H = jax.hessian(neg_log_joint_flat)(self.means)\n self.useful_H = [self.H[i][i] for i in range(len(self.H))]\n \n self.covs = [svd_inverse(matrix) for matrix in self.useful_H]\n \n def _vectorize(self, f, seed, shape, f_kwargs={}):\n length = prod(shape)\n seeds = jax.random.split(seed, num=length).reshape(shape+(2,))\n \n sample_fn = partial(f, **f_kwargs)\n for _ in shape:\n sample_fn = jax.vmap(sample_fn)\n \n return sample_fn(seed=seeds)\n \n def _sample_partial(self, seed, unravel_fn, mean, cov):\n sample = jax.random.multivariate_normal(seed, mean=mean, cov=cov)\n return unravel_fn(sample)\n \n def _sample(self, seed):\n seeds = [seed for seed in jax.random.split(seed, num=len(self.means))]\n flat_sample = jtu.tree_map(self._sample_partial, seeds, self.unravel_fn_list, self.means, self.covs)\n sample = self.tree_def.unflatten(flat_sample)\n return sample\n \n def sample(self, seed, shape):\n return self._vectorize(self._sample, seed, shape)\n \n def _predict(self, X, seed):\n sample = self._sample(seed)\n return self.model.apply(sample, X)\n \n def predict(self, X, seed, shape):\n return self._vectorize(self._predict, seed, shape, {'X': X})\n\n\nEstimating predictive posterior\n\nkfac_posterior = KFACHessianLaplace(params, model)\n\nseed = 
jax.random.PRNGKey(1)\nn_samples = 1000000\ny_pred_kfac = kfac_posterior.predict(X, seed=seed, shape=(n_samples, ))\nulim = 5\nllim = -5\n\nfig, ax = plt.subplots(2,2,figsize=(12,4))\nax=ax.ravel()\nfor i in range(len(y)):\n az.plot_dist(y_pred_full[:, i], ax=ax[i], label='full', color='r')\n az.plot_dist(y_pred_kfac[:, i], ax=ax[i], label='kfac', color='b')\n ax[i].grid(True)\n ax[i].set_xticks(range(llim,ulim))\n ax[i].set_xlim(llim, ulim)\n ax[i].set_title(f\"X={X[i]}, y_map={y_map[i]:.3f}\")\nfig.tight_layout()\n\n\n\n\n\n\n\n\nWe can see that KFAC is approximating the trend of Full Hessian Laplace. We can visualize the Covariance matrices as below.\n\nfig, ax = plt.subplots(1,2,figsize=(18,5))\nsns.heatmap(posterior.cov, ax=ax[0], annot=True, fmt = '.2f')\nax[0].set_title('Full')\n\nkfac_cov = posterior.cov * 0\noffset = 0\nfor cov in kfac_posterior.covs:\n length = cov.shape[0]\n kfac_cov = kfac_cov.at[offset:offset+length, offset:offset+length].set(cov)\n offset += length\n\nsns.heatmap(kfac_cov, ax=ax[1], annot=True, fmt = '.2f')\nax[1].set_title('KFAC');\n\n\n\n\n\n\n\n\n\n\n\nComparison with MCMC\nInspired from a blackjax docs example.\n\nkey = jax.random.PRNGKey(0)\nwarmup_key, inference_key = jax.random.split(key, 2)\nnum_warmup = 5000\nnum_samples = n_samples\n\ninitial_position = model.init(key, X)\ndef logprob(params): \n return -neg_log_joint(params)\n\ndef inference_loop(rng_key, kernel, initial_state, num_samples):\n def one_step(state, rng_key):\n state, _ = kernel(rng_key, state)\n return state, state\n\n keys = jax.random.split(rng_key, num_samples)\n _, states = jax.lax.scan(one_step, initial_state, keys)\n\n return states\n\ninit = time()\nadapt = blackjax.window_adaptation(blackjax.nuts, logprob, num_warmup)\nfinal_state, kernel, _ = adapt.run(warmup_key, initial_position)\nstates = inference_loop(inference_key, kernel, final_state, num_samples)\nsamples = states.position.unfreeze()\nprint(f\"Sampled {n_samples} samples in {time()-init:.2f} 
seconds\")\n\nSampled 1000000 samples in 27.85 seconds\n\n\n\ny_pred_mcmc = jax.vmap(model.apply, in_axes=(0, None))(samples, X)\n\nulim = 5\nllim = -5\n\nfig, ax = plt.subplots(2,2,figsize=(12,4))\nax=ax.ravel()\nfor i in range(len(y)):\n az.plot_dist(y_pred_full[:, i], ax=ax[i], label='full', color='r')\n az.plot_dist(y_pred_kfac[:, i], ax=ax[i], label='kfac', color='b')\n az.plot_dist(y_pred_mcmc[:, i], ax=ax[i], label='mcmc', color='k')\n ax[i].grid(True)\n ax[i].set_xticks(range(llim,ulim))\n ax[i].set_xlim(llim, ulim)\n ax[i].set_title(f\"X={X[i]}, y_map={y_map[i]:.3f}\")\nfig.tight_layout()\n\n\n\n\n\n\n\n\n\nfig, ax = plt.subplots(1,3,figsize=(18,5))\nfig.subplots_adjust(wspace=0.1)\nsns.heatmap(posterior.cov, ax=ax[0], annot=True, fmt = '.2f')\nax[0].set_title('Full')\n\nkfac_cov = posterior.cov * 0\noffset = 0\nfor cov in kfac_posterior.covs:\n length = cov.shape[0]\n kfac_cov = kfac_cov.at[offset:offset+length, offset:offset+length].set(cov)\n offset += length\n\nsns.heatmap(kfac_cov, ax=ax[1], annot=True, fmt = '.2f')\nax[1].set_title('KFAC');\n\nmcmc_cov = jnp.cov(jax.vmap(lambda x: ravel_pytree(x)[0])(samples).T)\n\nsns.heatmap(mcmc_cov, ax=ax[2], annot=True, fmt = '.2f')\nax[2].set_title('MCMC');\n\n\n\n\n\n\n\n\n\n\nLibrary versions\n\n%watermark --iversions\n\nflax : 0.6.1\nblackjax : 0.8.2\noptax : 0.1.3\nmatplotlib: 3.5.1\njax : 0.3.23\narviz : 0.12.1\nseaborn : 0.11.2\njson : 2.0.9" }, { - "objectID": "posts/2021-03-22-gp_kernels.html#multiplications-of-kernels", - "href": "posts/2021-03-22-gp_kernels.html#multiplications-of-kernels", - "title": "Understanding Kernels in Gaussian Processes", - "section": "Multiplications of kernels", - "text": "Multiplications of kernels\nIf a single kernel is having high bias in fitting a dataset, we can use mutiple of these kernels in multiplications and/or summations. 
First, let us see effect of multiplication of a few kernels.\n\nPeriodic * Linear\n\nX = np.linspace(-10,10,100).reshape(-1,1)\nplt.plot(X, K_periodic(X,X,sigma=2.)[:,50], label='Periodic')\nplt.plot(X, K_lin(X,X,sigma=0.01,c=0)[:,50], label='Linear')\nplt.plot(X, K_periodic(X,X,sigma=2.)[:,50]*K_lin(X,X,sigma=0.01,c=0)[:,50], label='Periodic*Linear')\nplt.legend(bbox_to_anchor=(1,1));plt.xlabel('X');plt.ylabel('Covariance')\nplt.title('K(0,*)');\n\n\n\n\n\n\n\n\n\n\nLinear * Linear\n\nX = np.linspace(-1,1,100).reshape(-1,1)\nplt.plot(X, K_lin(X,X,c=-1)[:,50], label='Linear1')\nplt.plot(X, K_lin(X,X,c=1)[:,50], label='Linear2')\nplt.plot(X, K_lin(X,X,c=0.5)[:,50], label='Linear3')\nplt.plot(X, K_lin(X,X,c=-1)[:,50]*K_lin(X,X,c=1)[:,50], label='Linear1*Linear3')\nplt.plot(X, K_lin(X,X,c=-1)[:,50]*K_lin(X,X,c=1)[:,50]*K_lin(X,X,c=0.5)[:,50], label='Linear1*Linear2*Linear3')\nplt.legend(bbox_to_anchor=(1,1));\n\n\n\n\n\n\n\n\n\n\nMatern * Linear\n\nX = np.linspace(-1,1,100).reshape(-1,1)\nk1 = K_lin(X,X,c=1)[:,50]\nk2 = K_m32(X,X)[:,50]\nplt.plot(X, k1, label='Linear')\nplt.plot(X, k2, label='Matern32')\nplt.plot(X, k1*k2, label='Matern32*Linear')\nplt.legend(bbox_to_anchor=(1,1));" + "objectID": "posts/2024-12-29-object-detection-how-to.html", + "href": "posts/2024-12-29-object-detection-how-to.html", + "title": "Object Detection - A how-to guide", + "section": "", + "text": "# Config\nimport os\n\n# Basic\nimport numpy as np\nimport pandas as pd\nfrom time import time\nimport matplotlib.pyplot as plt\n\n# Monitoring\nfrom tqdm.notebook import tqdm\n\n# IO\nfrom os.path import join, exists, basename, dirname, splitext, expanduser\nfrom glob import glob\n\n# Parallel processing\nfrom joblib import Parallel, delayed\n\nimport yaml\nfrom PIL import Image\nimport supervision as sv\nimport cv2\nfrom supervision.utils.file import list_files_with_extensions, read_txt_file\nfrom supervision.detection.utils import polygon_to_xyxy\nfrom ultralytics import YOLO\nfrom 
ultralytics.utils.downloads import download\nfrom pathlib import Path\nfrom inference.models.utils import get_roboflow_model\nfrom roboflow import Roboflow\nfrom typing import List, Tuple\nfrom dotenv import load_dotenv\nload_dotenv()\n\n%reload_ext memory_profiler\n\nsv.__version__\n\n'0.25.1'" }, { - "objectID": "posts/2021-03-22-gp_kernels.html#appendix-extra-material", - "href": "posts/2021-03-22-gp_kernels.html#appendix-extra-material", - "title": "Understanding Kernels in Gaussian Processes", - "section": "Appendix (Extra material)", - "text": "Appendix (Extra material)\nAt this stage, we do not know how the fuctions are drawn from linear kernel based covariance matrix end up being lines with various intercepts and slopes.\n\n\nPredicting at a single point after observing value at a single point\nLet’s see how would be a GP prediction after observing value at a single point.\nOur kernel function is given by, * \\(K(x,x')=(x-c) \\cdot (x'-c)+\\sigma^2\\)\nNow, we observe value \\(y\\) at a location \\(x\\) and we want to predict value \\(y^*\\) at location \\(x^*\\). \\[\n\\begin{aligned}\n(y^*|x_1,y_1,x^*) &= K(x^*,x) \\cdot K^{-1}(x,x)\\cdot y \\\\\n&= \\left(\\frac{(x-c)(x^*-c)+\\sigma^2}{(x-c)(x-c)+\\sigma^2}\\right)\\cdot y\n\\end{aligned}\n\\] \\(c\\) and \\(\\sigma^2\\) do not vary in numerator and denominator so, the value of \\(y^* \\propto x^*\\).\n\n\n\nPredicting at a single point after observing values at two points\nNow, we’ll take a case where two values \\({y_1, y_2}\\) are observed at \\({x_1, x_2}\\). 
Let us try to predict value \\(y^*\\) at \\(x^*\\).\n$$ y^* =\n\\[\\begin{bmatrix}\nK(x_1, x^*) & K(x_2,x^*)\n\\end{bmatrix}\\begin{bmatrix}\nK(x_1, x_1) & K(x_1,x_2) \\\\\nK(x_2, x_1) & K(x_2,x_2)\n\\end{bmatrix}\\]\n^{-1}\n\\[\\begin{bmatrix}\ny_1 \\\\\ny_2\n\\end{bmatrix}\\]\n\\\n& =\n\\[\\begin{bmatrix}\n(x_1-c)(x^*-c)+\\sigma^2 & (x_2-c)(x^*-c)+\\sigma^2\n\\end{bmatrix}\n\\begin{bmatrix}\n(x_1-c)^2+\\sigma^2 & (x_1-c) (x_2-c)+\\sigma^2 \\\\\n(x_2-c) (x_1-c)+\\sigma^2 & (x_2-c)^2 +\\sigma^2\n\\end{bmatrix}\\]\n^{-1}\n\\[\\begin{bmatrix}\ny_1 \\\\\ny_2\n\\end{bmatrix}\\]\n\\\n& =\n\\[\\begin{bmatrix}\n(x_1-c)(x^*-c)+\\sigma^2 & (x_2-c)(x^*-c)+\\sigma^2\n\\end{bmatrix} \\frac{1}{\\sigma^2(x_1-x_2)^2}\n\\begin{bmatrix}\n(x_2-c)^2+\\sigma^2 & -[(x_1-c)(x_2-c)+\\sigma^2] \\\\\n-[(x_2-c) (x_1-c)+\\sigma^2] & (x_1-c)^2 +\\sigma^2\n\\end{bmatrix}\n\\begin{bmatrix}\ny_1 \\\\\ny_2\n\\end{bmatrix} \\tag{1}\\]\nFrom Eq. (1) second term, we can say that if \\(\\sigma^2=0\\), matrix is not-invertible because determinant is zero. It means that, if \\(\\sigma^2=0\\), observing a single point is enough, we can infer values at infinite points after observing that single point.\nEvaluating Eq. (1) further, it converges to the following equation, \\[\n\\begin{aligned}\ny^* = \\frac{(x_1y_2-x_2y_1)+x^*(y_1-y_2)}{(x_1-x_2)}\n\\end{aligned}\n\\] Interestingly, we can see that output does not depend on \\(c\\) or \\(\\sigma^2\\) anymore. 
Let us verify experimentally if this is true for observing more than 2 data points.\n\n\nPrepering useful functions\n\nfrom scipy.optimize import minimize\n\n\ndef cov_func(x, x_prime, sigma, c):\n return (x-c)@(x_prime-c) + sigma**2\n\ndef neg_log_likelihood(params):\n n = X.shape[0]\n sigma, c, noise_std = params\n cov = cov_func(X, X.T, sigma, c)\n cov = cov + (noise_std**2)*np.eye(n)\n nll_ar = 0.5*(Y.T@np.linalg.pinv(cov)@Y) + 0.5*n*np.log(2*np.pi) + 0.5*np.log(np.linalg.det(cov)) \n return nll_ar[0,0]\n\ndef predict(params):\n sigma, c, noise_std = params\n k = cov_func(X, X.T, sigma, c)\n np.fill_diagonal(k, k.diagonal()+noise_std**2)\n k_inv = np.linalg.pinv(k)\n k_star = cov_func(X_test, X.T, sigma, c)\n\n mean = k_star@k_inv@Y\n cov = cov_func(X_test, X_test.T, sigma, c) - k_star@k_inv@k_star.T\n return mean, cov\n\n\n\nObserving more than two points and changing hyperparameters manually\n\nX = np.array([3,4,5,6,7,8]).reshape(-1,1)\nY = np.array([6,9,8,11,10,13]).reshape(-1,1)\nX_test = np.linspace(1,8,20).reshape(-1,1)\nparams_grid = [[1., 0.01, 10**-10], [100., 1., 10**-10], \n [100., 0.01, 10**-10], [1., 2., 1.]] # sigma, c, noise_std\n\nX_extra = np.hstack([np.ones((X.shape[0], 1)), X])\nTheta = np.linalg.pinv(X_extra.T@X_extra)@X_extra.T@Y\nX_test_extra = np.hstack([np.ones((X_test.shape[0], 1)), X_test])\nY_test_ideal = X_test_extra@Theta\n\nfig, ax = plt.subplots(1,4,figsize=(16,5), sharey=True)\nmeans = []\nfor p_i, params in enumerate(params_grid):\n Y_test_mean, Y_test_cov = predict(params)\n means.append(Y_test_mean)\n ax[p_i].scatter(X, Y, label='train')\n ax[p_i].scatter(X_test, Y_test_mean, label='test')\n ax[p_i].legend();ax[p_i].set_xlabel('X');ax[p_i].set_ylabel('Y');\n ax[p_i].set_title('sigma='+str(params[0])+', c='+str(params[1])+', noise='+str(params[2]));\n\n\n\n\n\n\n\n\n\nnp.allclose(Y_test_ideal, means[0]),\\\nnp.allclose(Y_test_ideal, means[1]),\\\nnp.allclose(Y_test_ideal, means[2]),\\\nnp.allclose(Y_test_ideal, 
means[3])\n\n(True, True, True, False)\n\n\n\nmodel = GPy.models.GPRegression(X, Y, GPy.kern.Linear(input_dim=1))\n# model['Gaussian_noise'].fix(10**-10)\n# model.kern.variances.fix(10**-10)\nmodel.optimize()\nmodel.plot()\nplt.plot(X_test, Y_test_ideal, label='Normal Eq. fit')\nplt.plot(X_test,model.predict(X_test)[0], label='Prediction')\nplt.legend()\nmodel\n\n\n\n\nModel: GP regression\nObjective: 13.51314321804978\nNumber of Parameters: 2\nNumber of Optimization Parameters: 2\nUpdates: True\n\n\n\n\n\n\nGP_regression.\nvalue\nconstraints\npriors\n\n\nlinear.variances\n2.806515343539501\n+ve\n\n\n\nGaussian_noise.variance\n2.0834221617534134\n+ve\n\n\n\n\n\n\n\n\n\n\n\n\n\nWe can see that there is no change in fit with change in \\(c\\) and \\(\\sigma\\). 4th fit is not matching with the ideal fit obtained by normal equation because of high noise. Now, let us estimate parameters by minimizing negative log marginal likelihood.\n\nparams = [1., 1., 1.]\nresult = minimize(neg_log_likelihood, params, bounds=[(10**-5, 10**5), (10**-5, 10**5), (10**-5, 10**-5)])\nparams = result.x\nprint(params, result.fun)\nY_test_mean, Y_test_cov = predict(params)\nplt.scatter(X, Y, label='train')\nplt.scatter(X_test, Y_test_mean, label='test')\nplt.legend();plt.xlabel('X');plt.ylabel('Y');\nparams = np.round(params, 4)\nplt.title('sigma='+str(params[0])+', c='+str(params[1])+', noise='+str(params[2]));\nnp.allclose(Y_test_ideal, Y_test_mean)\n\n[9.99998123e-01 9.99998123e-01 1.00000000e-05] 10207223403405.541\n\n\nFalse\n\n\n\n\n\n\n\n\n\n\ndef neg_log_likelihood(sigma, c, noise_std):\n n = X.shape[0]\n cov = cov_func(X, X.T, sigma, c)\n cov = cov + (noise_std**2)*np.eye(n)\n nll_ar = 0.5*(Y.T@np.linalg.pinv(cov)@Y) + 0.5*n*np.log(2*np.pi) + 0.5*np.log(np.linalg.det(cov))\n return nll_ar[0,0]\n\n\ngrad_func = grad(neg_log_likelihood, argnum=[0,1,2])\nalpha = 0.01\nloss = []\nsigma, c, noise_std = 1., 1., 1.\nfor _ in range(5000):\n grads = grad_func(sigma, c, noise_std)\n # 
print(grads)\n sigma = sigma - alpha*grads[0]\n c = c - alpha*grads[1]\n noise_std = noise_std - alpha*grads[2]\n loss.append(neg_log_likelihood(sigma, c, noise_std))\nprint(sigma, c, noise_std)\nplt.plot(loss);\nloss[-1]\n\n7.588989986845149 -2.830840439162303 32.2487569348891\n\n\n31.05187173290998\n\n\n\n\n\n\n\n\n\n\nparams = sigma, c, noise_std\nY_test_mean, Y_test_cov = predict(params)\nplt.scatter(X, Y, label='train')\nplt.scatter(X_test, Y_test_mean, label='test')\nplt.legend();plt.xlabel('X');plt.ylabel('Y');\nparams = np.round(params, 4)\nplt.title('sigma='+str(params[0])+', c='+str(params[1])+', noise='+str(params[2]));\nnp.allclose(means[0], Y_test_mean, rtol=10**-1, atol=10**-1)\n\nFalse" + "objectID": "posts/2024-12-29-object-detection-how-to.html#dataset", + "href": "posts/2024-12-29-object-detection-how-to.html#dataset", + "title": "Object Detection - A how-to guide", + "section": "Dataset", + "text": "Dataset\n\nDownload\n\nrf = Roboflow(api_key=os.getenv(\"ROBOFLOW_API_KEY\"))\nws = rf.workspace(\"plan-zkend\")\nproject = ws.project(\"animals-ksxhf-plgrl\")\nversion = project.version(2)\nrf_dataset = version.download(\"yolov8\", location=\"/tmp/tmp\", overwrite=True)\n\nloading Roboflow workspace...\nloading Roboflow project...\n\n\nDownloading Dataset Version Zip in /tmp/tmp to yolov8:: 100%|██████████| 3047/3047 [00:02<00:00, 1202.83it/s]\n\n\n\n\n\n\nExtracting Dataset Version Zip to /tmp/tmp in yolov8:: 100%|██████████| 212/212 [00:00<00:00, 5963.05it/s]\n\n\n\nrf_dataset.location\n\n'/tmp/tmp'\n\n\n\n!ls -lh {rf_dataset.location}\n\ntotal 24K\n-rw-rw-r-- 1 patel_zeel patel_zeel 423 Feb 3 11:40 data.yaml\n-rw-rw-r-- 1 patel_zeel patel_zeel 141 Feb 3 11:40 README.dataset.txt\n-rw-rw-r-- 1 patel_zeel patel_zeel 896 Feb 3 11:40 README.roboflow.txt\ndrwxrwxr-x 4 patel_zeel patel_zeel 4.0K Jan 17 18:47 test\ndrwxrwxr-x 4 patel_zeel patel_zeel 4.0K Jan 17 18:47 train\ndrwxrwxr-x 4 patel_zeel patel_zeel 4.0K Jan 17 22:52 valid\n\n\n\n!ls -lh 
{rf_dataset.location}/test\n\ntotal 8.0K\ndrwxrwxr-x 2 patel_zeel patel_zeel 4.0K Jan 17 18:47 images\ndrwxrwxr-x 2 patel_zeel patel_zeel 4.0K Jan 17 18:47 labels\n\n\n\n!ls -l {rf_dataset.location}/test/images/*.jpg | wc -l\n\n7\n\n\n\n!ls -l {rf_dataset.location}/test/labels/*.txt | wc -l\n\n7\n\n\n\nCheck a sample manually\n\nimage_paths = glob(f\"{rf_dataset.location}/test/images/*.jpg\")\nsample_image_path = image_paths[0]\nsample_image = Image.open(sample_image_path)\nsample_image\n\n\n\n\n\n\n\n\n\nlabel_paths = glob(f\"{rf_dataset.location}/test/labels/*.txt\")\nsample_label_path = label_paths[0]\nsample_label = np.loadtxt(sample_label_path, ndmin=2)\nsample_label.shape\n\n(1, 5)\n\n\n\nsample_label\n\narray([[ 18, 0.67217, 0.47797, 0.53625, 0.51148]])\n\n\n\n\n\nLoad with supervision\n\n%%time\n\ndataset = sv.DetectionDataset.from_yolo(images_directory_path=f\"{rf_dataset.location}/test/images\", annotations_directory_path=f\"{rf_dataset.location}/test/labels\", data_yaml_path=f\"{rf_dataset.location}/data.yaml\")\nlen(dataset)\n\nCPU times: user 11.6 ms, sys: 0 ns, total: 11.6 ms\nWall time: 10.6 ms\n\n\n7\n\n\n\n\nVisualize\nIdeally, LabelAnnotator should show the class names on top of the bounding boxes but currently it shows class IDs. 
This issue is tracked here.\n\nimage_path, image, detection = dataset[0]\n\nbox_annotator = sv.BoxAnnotator()\nlabel_annotator = sv.LabelAnnotator()\nannotated_frame = box_annotator.annotate(image.copy(), detection)\nannotated_frame = label_annotator.annotate(annotated_frame.copy(), detection)\n\nImage.fromarray(annotated_frame)\n\n\n\n\n\n\n\n\nA quick fix for now.\n\nimage_path, image, detection = dataset[0]\nnp_classes = np.array(dataset.classes)\ndetection.data['class_name'] = np_classes[detection.class_id]\nbox_annotator = sv.BoxAnnotator()\nlabel_annotator = sv.LabelAnnotator()\nannotated_frame = box_annotator.annotate(image.copy(), detection)\nannotated_frame = label_annotator.annotate(annotated_frame.copy(), detection)\n\nImage.fromarray(annotated_frame)" }, { - "objectID": "posts/2022-03-06-probabilistic-machine-learning.html", - "href": "posts/2022-03-06-probabilistic-machine-learning.html", - "title": "Probabilistic Machine Learning", - "section": "", - "text": "An inference problem requires statements about the value of an unobserved (latent) variable x based on observations y which are related to x, but may not be sufficient to fully determine x. 
This requires a notion of uncertainty.\n\nWe can define the following rules because \\(p(E) = 1\\) for any event \\(E\\).\n\nSum rule: \\(p(E) = p(E|A) + p(E|\\neg A)\\)\n\nProduct rule: \\(p(E, A) = p(E|A)p(A) = p(A|E)p(E)\\)\n\nBayes’ theorem: \\(p(E|A) = \\frac{p(A|E)p(E)}{p(A)}\\)" + "objectID": "posts/2024-12-29-object-detection-how-to.html#inference", + "href": "posts/2024-12-29-object-detection-how-to.html#inference", + "title": "Object Detection - A how-to guide", + "section": "Inference", + "text": "Inference\n\nWith roboflow models\n\nrf_model = get_roboflow_model(\"yolov8s-640\")\nprediction = rf_model.infer(image)[0]\ndetection = sv.Detections.from_inference(prediction)\nannotated_image = box_annotator.annotate(image.copy(), detection)\nannotated_image = label_annotator.annotate(annotated_image, detection)\nImage.fromarray(annotated_image)\n\nSpecified provider 'OpenVINOExecutionProvider' is not in available provider names.Available providers: 'TensorrtExecutionProvider, CUDAExecutionProvider, CPUExecutionProvider'\nSpecified provider 'CoreMLExecutionProvider' is not in available provider names.Available providers: 'TensorrtExecutionProvider, CUDAExecutionProvider, CPUExecutionProvider'\n2025-02-03 11:42:55.119414030 [E:onnxruntime:Default, provider_bridge_ort.cc:1862 TryGetProviderInfo_CUDA] /onnxruntime_src/onnxruntime/core/session/provider_bridge_ort.cc:1539 onnxruntime::Provider& onnxruntime::ProviderLibrary::Get() [ONNXRuntimeError] : 1 : FAIL : Failed to load library libonnxruntime_providers_cuda.so with error: libcudnn_adv.so.9: cannot open shared object file: No such file or directory\n\n2025-02-03 11:42:55.119453180 [W:onnxruntime:Default, onnxruntime_pybind_state.cc:993 CreateExecutionProviderInstance] Failed to create CUDAExecutionProvider. Require cuDNN 9.* and CUDA 12.*. 
Please install all dependencies as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.\n\n\n\n\n\n\n\n\n\n\n\nWith ultralytics models\n\nmodel = YOLO(\"yolov8s\")\nprediction = model(image)[0]\ndetection = sv.Detections.from_ultralytics(prediction)\nannotated_image = box_annotator.annotate(image.copy(), detection)\nannotated_image = label_annotator.annotate(annotated_image, detection)\nImage.fromarray(annotated_image)\n\n\n0: 640x640 1 dog, 2 horses, 1 sheep, 1 cow, 6.3ms\nSpeed: 10.2ms preprocess, 6.3ms inference, 274.3ms postprocess per image at shape (1, 3, 640, 640)" }, { - "objectID": "posts/2022-03-06-probabilistic-machine-learning.html#introduction", - "href": "posts/2022-03-06-probabilistic-machine-learning.html#introduction", - "title": "Probabilistic Machine Learning", - "section": "", - "text": "An inference problem requires statements about the value of an unobserved (latent) variable x based on observations y which are related to x, but may not be sufficient to fully determine x. 
This requires a notion of uncertainty.\n\nWe can define the following rules because \\(p(E) = 1\\) for any event \\(E\\).\n\nSum rule: \\(p(E) = p(E|A) + p(E|\\neg A)\\)\n\nProduct rule: \\(p(E, A) = p(E|A)p(A) = p(A|E)p(E)\\)\n\nBayes’ theorem: \\(p(E|A) = \\frac{p(A|E)p(E)}{p(A)}\\)" + "objectID": "posts/2024-12-29-object-detection-how-to.html#metrics", + "href": "posts/2024-12-29-object-detection-how-to.html#metrics", + "title": "Object Detection - A how-to guide", + "section": "Metrics", + "text": "Metrics\n\nmodel = YOLO(\"yolov8x\")\ntargets = []\npredictions = []\nnp_classes = np.array(dataset.classes)\nfor image_path, image, target in tqdm(dataset):\n # add class names to detection\n # target.data['class_name'] = np_classes[target.class_id]\n\n # remove classes not in model\n # target = target[np.isin(target['class_name'], list(model.names.values()))]\n # if len(target) == 0:\n # print(f\"Skipping {image_path} as it has no classes in model\")\n # continue\n \n prediction = model(image, verbose=False)[0]\n detection = sv.Detections.from_ultralytics(prediction)\n \n # remove classes not in dataset\n detection = detection[np.isin(detection['class_name'], dataset.classes)]\n \n # remap class ids\n detection.class_id = np.array([dataset.classes.index(class_name) for class_name in detection['class_name']])\n \n targets.append(target)\n predictions.append(detection)\n\n\n\n\n\nmAP = sv.metrics.MeanAveragePrecision().update(predictions, targets).compute()\nmAP50 = mAP.mAP_scores[0]\nmAP5095 = mAP.mAP_scores.mean()\nprint(f\"mAP50: {mAP50:.2f}, mAP50-95: {mAP5095:.2f}\")\n\nmAP50: 0.29, mAP50-95: 0.19" }, { - "objectID": "posts/2020-09-21-programatically_download_openaq_data.html", - "href": "posts/2020-09-21-programatically_download_openaq_data.html", - "title": "Programatically download OpenAQ data", - "section": "", - "text": "# uncomment to install these libraries\n# !pip install boto3 botocore\n\nimport pandas as pd\nimport numpy as np\nimport matplotlib.pyplot 
as plt\nimport sys\nimport boto3\nimport botocore\nimport os\nfrom IPython.display import clear_output"
  },
  {
    "objectID": "posts/2024-12-29-object-detection-how-to.html#dataset-1",
    "href": "posts/2024-12-29-object-detection-how-to.html#dataset-1",
    "title": "Object Detection - A how-to guide",
    "section": "Dataset",
    "text": "Dataset\n\nDownload\n\nif not exists('/tmp/DOTAv1.zip'):\n # Downloaded in 4m 5s with 6.09 MB/s\n !wget https://github.com/ultralytics/assets/releases/download/v0.0.0/DOTAv1.zip -O /tmp/DOTAv1.zip\nelse:\n print('DOTAv1.zip already downloaded')\n \nif not exists('/tmp/DOTAv1'):\n !unzip /tmp/DOTAv1.zip -d /tmp\nelse:\n print('DOTAv1 already unzipped')\n\nDOTAv1.zip already downloaded\nDOTAv1 already unzipped\n\n\n\n!ls /tmp/DOTAv1\n\n21.84s - pydevd: Sending message related to process being replaced timed-out after 5 seconds\n\n\nimages labels\n\n\n\n!ls /tmp/DOTAv1/images\n\n27.24s - pydevd: Sending message related to process being replaced timed-out after 5 seconds\n\n\ntest train val\n\n\n\nprint(f\"Number of train samples: {len(glob('/tmp/DOTAv1/images/train/*.jpg'))}\")\nprint(f\"Number of val samples: {len(glob('/tmp/DOTAv1/images/val/*.jpg'))}\")\n\nNumber of train samples: 1411\nNumber of val samples: 458\n\n\nKeep 100 samples from each split by copying them into /tmp/DOTAv1_small\n\npaths = {'images': {}, 'labels': {}}\n\nfor split in ['train', 'val']:\n paths['images'][split] = glob(f\"/tmp/DOTAv1/images/{split}/*.jpg\")[:100]\n paths['labels'][split] = [p.replace(\"images\", \"labels\").replace(\".jpg\", \".txt\") for p in paths['images'][split]]\n !mkdir -p /tmp/DOTAv1_small/images/{split}\n !mkdir -p /tmp/DOTAv1_small/labels/{split}\n\n for img, label in tqdm(zip(paths['images'][split], paths['labels'][split])):\n os.system(f\"cp {img} /tmp/DOTAv1_small/images/{split}/\")\n os.system(f\"cp {label} /tmp/DOTAv1_small/labels/{split}/\")\n\n32.81s - pydevd: Sending message related to process being replaced timed-out after 5 seconds\n38.11s - pydevd: Sending 
message related to process being replaced timed-out after 5 seconds\n\n\n\n\n\n44.25s - pydevd: Sending message related to process being replaced timed-out after 5 seconds\n49.55s - pydevd: Sending message related to process being replaced timed-out after 5 seconds\n\n\n\n\n\n\nprint(f\"Number of train samples: {len(glob('/tmp/DOTAv1_small/images/train/*.jpg'))}\")\nprint(f\"Number of val samples: {len(glob('/tmp/DOTAv1_small/images/val/*.jpg'))}\")\n\nNumber of train samples: 100\nNumber of val samples: 100\n\n\n\n\nCheck a sample\n\ntrain_images = glob(\"/tmp/DOTAv1_small/images/train/*.jpg\")\nsample_image_path = train_images[1]\nsample_image_path\n\n'/tmp/DOTAv1_small/images/train/P2732.jpg'\n\n\n\nsample_image = Image.open(sample_image_path)\nsample_image.size\n\n(3087, 2632)\n\n\n\nsample_image.reduce(10)\n\n\n\n\n\n\n\n\n\nsample_label_path = sample_image_path.replace(\"images\", \"labels\").replace(\".jpg\", \".txt\")\nassert exists(sample_label_path), f\"Error: {sample_label_path} does not exist\"\n\n\nsample_label = np.loadtxt(sample_label_path, ndmin=2)\nsample_label.shape\n\n(51, 9)\n\n\n\nsample_label[0]\n\narray([ 7, 0.72076, 0.45023, 0.72238, 0.44871, 0.73178, 0.46087, 0.73016, 0.46201])\n\n\nThe above is YOLO-Oriented Bounding Box (OBB) format: class_id, x1, y1, x2, y2, x3, y3, x4, y4\n\n\nLoad with supervision\nsupervision does not support DOTA dataset yet, but ultralytics has already converted it to YOLO format. 
Let’s create data.yml file for DOTA dataset.\n\n%%writefile /tmp/DOTAv1_small/data.yml\ntrain: /tmp/DOTAv1_small/images/train\nval: /tmp/DOTAv1_small/images/val\ntest: /tmp/DOTAv1_small/images/test\nnc: 15\nnames: ['plane', 'ship', 'storage tank', 'baseball diamond', 'tennis court', 'basketball court', 'ground track field', 'harbor', 'bridge', 'large vehicle', 'small vehicle', 'helicopter', 'roundabout', 'soccer ball field', 'swimming pool']\n\nOverwriting /tmp/DOTAv1_small/data.yml\n\n\n\n# %%memit\n\ndataset = sv.DetectionDataset.from_yolo(\n \"/tmp/DOTAv1_small/images/train\",\n \"/tmp/DOTAv1_small/labels/train\",\n data_yaml_path=\"/tmp/DOTAv1_small/data.yml\",\n is_obb=True,\n)\n\n\n\nVisualize\n\nsample = dataset[0]\nimg_array = sample[1]\nimg_detections = sample[2]\n\nannotator = sv.OrientedBoxAnnotator()\nannotated_img = annotator.annotate(img_array, img_detections)\n\nplt.imshow(annotated_img)\nplt.ylim(0, annotated_img.shape[1] // 2)\nplt.axis(\"off\")" }, { - "objectID": "posts/2020-09-21-programatically_download_openaq_data.html#setup", - "href": "posts/2020-09-21-programatically_download_openaq_data.html#setup", - "title": "Programatically download OpenAQ data", - "section": "Setup", - "text": "Setup\n\ns3 = boto3.client('s3', config=botocore.config.Config(signature_version=botocore.UNSIGNED))\nbucket_name = 'openaq-fetches'\nprefix = 'realtime-gzipped/'\n\npath = '/content/drive/MyDrive/IJCAI-21/data/OpenAQ-Delhi/'\n\nstart_date = '2020/01/01' # start date (inclusive)\nend_date = '2020/12/31' # end date (inclusive)\n\n\nDownload\n\nfor date in pd.date_range(start=start_date, end=end_date):\n clear_output(wait=True)\n date = str(date).split(' ')[0] # keeping just YYYY-MM-DD from YYYY-MM-DD HH:MM:SS\n print('Downloading:', date)\n data_dict = s3.list_objects(Bucket = bucket_name, Prefix = prefix+date)\n \n for file_obj in data_dict['Contents']:\n f_name = file_obj['Key']\n tmp_path = '/'.join((path+f_name).split('/')[:-1])\n \n if not 
os.path.exists(tmp_path):\n os.makedirs(tmp_path)\n \n s3.download_file(bucket_name, f_name, path+f_name)\n\nDownloading: 2020-05-04\n\n\n\n\nValidate\n\nfor date in pd.date_range(start=start_date, end=end_date):\n date = str(date).split(' ')[0] # keeping just YYYY-MM-DD from YYYY-MM-DD HH:MM:SS\n data_dict = s3.list_objects(Bucket = bucket_name, Prefix = prefix+date)\n \n for file_obj in data_dict['Contents']:\n assert os.path.exists(path+file_obj['Key']), file_obj['Key']\n\n\nprint('Validated')"
  },
  {
    "objectID": "posts/ssh-macos.html",
    "href": "posts/ssh-macos.html",
    "title": "Passwordless SSH setup for MacOS Hosts",
    "section": "",
    "text": "HOST: The computer physically present with you.\nREMOTE: The remote computer that you’d like to access via ssh.\nREMOTE-IP: IP address of the REMOTE.\nPORT: 
The port on which the ssh server is running on REMOTE.\nssh-keygen # this will generate a public and private key pair. Rename it if you want.\nssh-copy-id -i ~/.ssh/id_rsa.pub -p PORT USERNAME@REMOTE-IP # this will copy the public key to REMOTE\nssh-add ~/.ssh/id_rsa # this command tells the HOST to use the private key for ssh connections\nThat’s it! You should now be able to ssh into the REMOTE without a password. After rebooting the HOST, if VSCode or CLI asks for a password, run ssh-add ~/.ssh/id_rsa again."
  },
  {
    "objectID": "posts/2024-12-29-object-detection-how-to.html#inference-1",
    "href": "posts/2024-12-29-object-detection-how-to.html#inference-1",
    "title": "Object Detection - A how-to guide",
    "section": "Inference",
    "text": "Inference\n\niou = Non-max suppression IoU threshold\nconf = Object confidence threshold\n\n\nInline method\n\nmodel = YOLO(\"yolo11x-obb\")\n\ndetections = []\npredictions = []\nfor img_path, img, detection in tqdm(dataset):\n prediction = model(img, imgsz=1024, iou=0.33, max_det=300, conf=0.001, verbose=False)[0]\n predictions.append(sv.Detections.from_ultralytics(prediction))\n detections.append(detection)\n\n\n\n\n\nCLI method\n\n!cd /tmp && yolo obb predict model=yolo11x-obb source=/tmp/DOTAv1_small/images/val exist_ok=True save=False save_txt=True imgsz=1024 iou=0.33 max_det=300 conf=0.001 verbose=False\n\nUltralytics 8.3.55 🚀 Python-3.10.15 torch-2.5.0+cu124 CUDA:0 (NVIDIA A100-SXM4-80GB, 81156MiB)\nYOLO11x-obb summary (fused): 483 layers, 58,752,928 parameters, 0 gradients, 202.8 GFLOPs\nResults saved to runs/obb/predict\n20 labels saved to runs/obb/predict/labels\n💡 Learn more at https://docs.ultralytics.com/modes/predict"
  },
  {
    "objectID": "posts/2024-12-29-object-detection-how-to.html#metrics-1",
    "href": "posts/2024-12-29-object-detection-how-to.html#metrics-1",
    "title": "Object Detection - A how-to guide",
    "section": "Metrics",
    "text": "Metrics\n\nInline method\n\nConfusion matrix\n\nAt the time of writing this post, supervision’s ConfusionMatrix does not support OBB. Follow this issue for updates.\n\n\nconf_threshold – minimum confidence threshold for a detection to be considered. Instances with confidence below this threshold are ignored as if they were not predicted.\niou_threshold – minimum intersection over union (IoU) threshold for a detection to be considered a true positive. Predictions with IoU below this threshold are considered false positives.\n\n\ncm = sv.ConfusionMatrix.from_detections(\n predictions, detections, classes=dataset.classes, conf_threshold=0.25, iou_threshold=0.33\n)\n_ = cm.plot()\n\n\n\n\n\n\n\n\n\n\nPrecision, Recall & F1 Score\nYou know the formulas of Precision and Recall.\nPrecision = TP / (TP + FP)\nRecall = TP / (TP + FN)\nWe can also write them as follows:\nPrecision = TP / PP\nRecall = TP / AP\nwhere PP is the number of predicted positives and AP is the number of actual positives.\nTo calculate TP, we can sum the values along the diagonal of the confusion matrix.\n\nTP = cm.matrix.diagonal().sum()\nTP\n\nnp.float64(847.0)\n\n\nTo calculate PP, we should remove all cells which represent “not predicted” instances. 
That is nothing but FN. Thus, we will remove the last column representing FN.\n\nPP = cm.matrix[:, :-1].sum()\nPP\n\nnp.float64(931.0)\n\n\nTo calculate AP, we should remove all cells which represent “predicted but wrong” instances. That is nothing but FP. Thus, we will remove the last row representing FP.\n\nAP = cm.matrix[:-1, :].sum()\nAP\n\nnp.float64(1059.0)\n\n\n\nP = TP / PP\nR = TP / AP\nF1 = 2 * P * R / (P + R)\nprint(f\"P: {P:.2f}, R: {R:.2f}, F1: {F1:.2f}\")\n\nP: 0.91, R: 0.80, F1: 0.85\n\n\nNotice that to compute P, R and F1, we need to fix a confidence threshold and an IoU threshold. Next, we will see some metrics which integrate over confidence thresholds and use only an IoU threshold.\nThere are specific methods in supervision to compute Precision, Recall and F1 Score, but they are currently quite slow. If they become faster in the future, one can use them with the following code.\n\n# f1_score = sv.metrics.F1Score(sv.metrics.MetricTarget.ORIENTED_BOUNDING_BOXES)\n# f1_score.update(predictions, detections).compute()\n\n\n# precision = sv.metrics.Precision(sv.metrics.MetricTarget.ORIENTED_BOUNDING_BOXES)\n# precision.update(predictions, detections).compute()\n\n\n# recall = sv.metrics.Recall(sv.metrics.MetricTarget.ORIENTED_BOUNDING_BOXES)\n# recall.update(predictions, detections).compute()\n\n\n\n\nCLI method\nWhen we use the CLI method of inference in Ultralytics, results are saved to disk. Now we need to load them back to calculate metrics.\n\nfrom ultralytics.engine.results import Results\n\nThe following method is extremely space-consuming unless the following issue is resolved: https://github.com/roboflow/supervision/issues/1762. 
Not adding further steps until the issue is resolved but the steps should be similar to the axis-aligned bounding box case.\n\npredicted_dataset = sv.DetectionDataset.from_yolo(\n \"/tmp/DOTAv1_small/images/val\",\n \"/tmp/runs/obb/predict/labels\",\n data_yaml_path=\"/tmp/DOTAv1_small/data.yml\",\n is_obb=True,\n)\n\n\n---------------------------------------------------------------------------\nKeyboardInterrupt Traceback (most recent call last)\nCell In[78], line 1\n----> 1 predicted_dataset = sv.DetectionDataset.from_yolo(\n 2 \"/tmp/DOTAv1_small/images/val\",\n 3 \"/tmp/runs/obb/predict/labels\",\n 4 data_yaml_path=\"/tmp/DOTAv1_small/data.yml\",\n 5 is_obb=True,\n 6 )\n\nFile /opt/anaconda3/envs/zeel_py310/lib/python3.10/site-packages/supervision/dataset/core.py:497, in DetectionDataset.from_yolo(cls, images_directory_path, annotations_directory_path, data_yaml_path, force_masks, is_obb)\n 445 @classmethod\n 446 def from_yolo(\n 447 cls,\n (...)\n 452 is_obb: bool = False,\n 453 ) -> DetectionDataset:\n 454 \"\"\"\n 455 Creates a Dataset instance from YOLO formatted data.\n 456 \n (...)\n 495 ```\n 496 \"\"\"\n--> 497 classes, image_paths, annotations = load_yolo_annotations(\n 498 images_directory_path=images_directory_path,\n 499 annotations_directory_path=annotations_directory_path,\n 500 data_yaml_path=data_yaml_path,\n 501 force_masks=force_masks,\n 502 is_obb=is_obb,\n 503 )\n 504 return DetectionDataset(\n 505 classes=classes, images=image_paths, annotations=annotations\n 506 )\n\nFile /opt/anaconda3/envs/zeel_py310/lib/python3.10/site-packages/supervision/dataset/formats/yolo.py:182, in load_yolo_annotations(images_directory_path, annotations_directory_path, data_yaml_path, force_masks, is_obb)\n 180 with_masks = _with_mask(lines=lines)\n 181 with_masks = force_masks if force_masks else with_masks\n--> 182 annotation = yolo_annotations_to_detections(\n 183 lines=lines,\n 184 resolution_wh=resolution_wh,\n 185 with_masks=with_masks,\n 186 
is_obb=is_obb,\n 187 )\n 188 annotations[image_path] = annotation\n 189 return classes, image_paths, annotations\n\nFile /opt/anaconda3/envs/zeel_py310/lib/python3.10/site-packages/supervision/dataset/formats/yolo.py:120, in yolo_annotations_to_detections(lines, resolution_wh, with_masks, is_obb)\n 115 return Detections(class_id=class_id, xyxy=xyxy, data=data)\n 117 polygons = [\n 118 (polygon * np.array(resolution_wh)).astype(int) for polygon in relative_polygon\n 119 ]\n--> 120 mask = _polygons_to_masks(polygons=polygons, resolution_wh=resolution_wh)\n 121 return Detections(class_id=class_id, xyxy=xyxy, data=data, mask=mask)\n\nFile /opt/anaconda3/envs/zeel_py310/lib/python3.10/site-packages/supervision/dataset/formats/yolo.py:50, in _polygons_to_masks(polygons, resolution_wh)\n 47 def _polygons_to_masks(\n 48 polygons: List[np.ndarray], resolution_wh: Tuple[int, int]\n 49 ) -> np.ndarray:\n---> 50 return np.array(\n 51 [\n 52 polygon_to_mask(polygon=polygon, resolution_wh=resolution_wh)\n 53 for polygon in polygons\n 54 ],\n 55 dtype=bool,\n 56 )\n\nKeyboardInterrupt:"
  },
  {
    "objectID": "posts/ssh-macos.html#terminology",
    "href": "posts/ssh-macos.html#terminology",
    "title": "Passwordless SSH setup for MacOS Hosts",
    "section": "",
    "text": "HOST: The computer physically present with you.\nREMOTE: The remote computer that you’d like to access via ssh.\nREMOTE-IP: IP address of the REMOTE.\nPORT: The port on which the ssh server is running on REMOTE.\nssh-keygen # this will generate a public and private key pair. Rename it if you want.\nssh-copy-id -i ~/.ssh/id_rsa.pub -p PORT USERNAME@REMOTE-IP # this will copy the public key to REMOTE\nssh-add ~/.ssh/id_rsa # this command tells the HOST to use the private key for ssh connections\nThat’s it! You should now be able to ssh into the REMOTE without a password. 
After rebooting the HOST, if VSCode or CLI asks for a password, run ssh-add ~/.ssh/id_rsa again." + "text": "import GPy\nimport numpy as np\nimport pandas as pd\n\nfrom sklearn.preprocessing import MinMaxScaler, StandardScaler\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.ensemble import RandomForestRegressor\n\nimport regdata as rd\nimport matplotlib.pyplot as plt\n\nimport torch\nimport torch.nn as nn\n\n\nx_train, y_train, x_test = rd.Step().get_data()\ny_train = y_train.reshape(-1, 1)\nx_test = x_test * 1.5\nprint(x_train.shape, y_train.shape, x_test.shape)\n\nplt.scatter(x_train, y_train, label='train');\n\n(50, 1) (50, 1) (100, 1)\n\n\n\n\n\n\n\n\n\n\nkernel = GPy.kern.RBF(1, variance=1, lengthscale=1)\nmodel = GPy.models.GPRegression(x_train, y_train.reshape(-1, 1), kernel)\nmodel.Gaussian_noise.variance = 0.1\n\ny_pred_gp, y_var = model.predict(x_test)\n\nplt.scatter(x_train, y_train, label='train');\nplt.plot(x_test, y_pred_gp, label='pred');\n\n\n\n\n\n\n\n\n\nclass GCN_Forward(nn.Module):\n def __init__(self, in_features, out_features):\n super().__init__()\n self.fc = nn.Linear(in_features, out_features)\n \n def forward(self, x, A):\n x = self.fc(x)\n x = torch.matmul(A, x)\n return x\n \nclass GCN_Reverse(nn.Module):\n def __init__(self, in_features, out_features):\n super().__init__()\n self.fc = nn.Linear(in_features, out_features)\n \n def forward(self, x, A):\n x = torch.matmul(A, x)\n x = self.fc(x)\n return x\n\nclass NN(nn.Module):\n def __init__(self, features):\n super().__init__()\n self.features = features\n \n for i, (in_features, out_features) in enumerate(zip(features[:-1], features[1:])):\n setattr(self, f'layer_{i}', nn.Linear(in_features, out_features))\n \n self.last_layer = nn.Linear(features[-1], 1)\n \n def forward(self, x, A):\n for i in range(len(self.features) - 1):\n if isinstance(getattr(self, f'layer_{i}'), GCN_Forward):\n x = getattr(self, f'layer_{i}')(x, A)\n else:\n x = getattr(self, 
f'layer_{i}')(x)\n x = nn.functional.gelu(x)\n \n x = self.last_layer(x)\n return x\n\nclass GCN(NN):\n def __init__(self, features):\n super().__init__(features)\n for i, (in_features, out_features) in enumerate(zip(features[:-1], features[1:])):\n setattr(self, f'layer_{i}', GCN_Forward(in_features, out_features))\n\n\nA = torch.tensor(kernel.K(x_train, x_train)).float()\n# A.fill_diagonal_(0)\nA = A / A.sum(dim=0, keepdim=True)\n# A.fill_diagonal_(1)\n\nnum_epochs = 500\nfeatures = [1, 1024]\n\ngcn_model = GCN(features=features)\nnn_model = NN(features=features)\n\ngcn_optimizer = torch.optim.Adam(gcn_model.parameters(), lr=0.01)\nnn_optimizer = torch.optim.Adam(nn_model.parameters(), lr=0.01)\n\ncriterion = nn.MSELoss()\n\nx_train_torch = torch.from_numpy(x_train).float()\ny_train_torch = torch.from_numpy(y_train).float()\n\ngcn_losses = []\nnn_losses = []\nfor epoch in range(num_epochs):\n gcn_optimizer.zero_grad()\n nn_optimizer.zero_grad()\n \n y_out_gcn = gcn_model(x_train_torch, A)\n y_out_nn = nn_model(x_train_torch, A)\n gcn_loss = criterion(y_out_gcn, y_train_torch)\n nn_loss = criterion(y_out_nn, y_train_torch)\n \n gcn_loss.backward()\n nn_loss.backward()\n \n gcn_losses.append(gcn_loss.item())\n nn_losses.append(nn_loss.item())\n \n gcn_optimizer.step()\n nn_optimizer.step()\n \nplt.plot(gcn_losses, label='gcn');\nplt.plot(nn_losses, label='nn');\nplt.legend();\n\n\n\n\n\n\n\n\n\nA_test = torch.tensor(kernel.K(x_test, x_test)).float()\n# A_test.fill_diagonal_(0)\nA_test = A_test / A_test.sum(dim=0, keepdim=True)\n# A_test.fill_diagonal_(1)\n\ny_pred_nn = nn_model(torch.from_numpy(x_test).float(), A_test).detach().numpy()\ny_pred_gcn = gcn_model(torch.from_numpy(x_test).float(), A_test).detach().numpy()\n\nplt.figure(figsize=(10, 6))\nplt.scatter(x_train, y_train, label='train');\nplt.plot(x_train, y_out_gcn.detach().numpy(), label='pred GCN train');\nplt.plot(x_train, y_out_nn.detach().numpy(), label='pred NN train');\nplt.plot(x_test, y_pred_gp, 
label='pred GP', linestyle='--');\nplt.plot(x_test, y_pred_nn, label='pred NN');\nplt.plot(x_test, y_pred_gcn, label='pred GCN');\nplt.ylim(-3, 3);\nplt.legend();"
  },
  {
    "objectID": "posts/presentation_tips.html",
    "href": "posts/presentation_tips.html",
    "title": "Conference Presentation Tips",
    "section": "",
    "text": "General\n\nThe first page goes like this:\n\nTitle\nAuthors (underline the presenting author; no need to put * in case of equal contribution)\nAffiliations\nConference name\n\nIf importing figures from the paper, avoid including the captions.\nInclude lots of images and less math.\nThe talk should end with a summary, not with future work or a thank-you slide.\nCite the references at the bottom of the same slide.\n\nRefer to the “Giving talks” section of this blog.\n\n\nDos and Don’ts\n\nNever put in overly detailed information that is difficult to grasp: a table with many numbers, a complex derivation all in one go, or a very complicated diagram."
}, { - "objectID": "posts/bayesian-gaussian-basis-regression.html#generate-data", - "href": "posts/bayesian-gaussian-basis-regression.html#generate-data", - "title": "Bayesian Basis Regression", - "section": "Generate data", - "text": "Generate data\n\n# x = torch.linspace(-1, 1, 100)\n# y = (torch.sin(x * 2 * torch.pi) + torch.randn(x.size()) * 0.1).unsqueeze(1)\nx, y, _ = rd.MotorcycleHelmet().get_data()\nx = x.ravel().to(torch.float32)\nidx = np.argsort(x)\nx = x[idx]\ny = y.to(torch.float32)\ny = y[idx]\n\nx = torch.vstack([torch.ones_like(x), x]).T\nprint(x.shape, y.shape)\nx = x.to(device)\ny = y.to(device)\nprint(x.dtype, y.dtype)\n\nplt.scatter(x.cpu().numpy()[:, 1], y.cpu().numpy())\n\ntorch.Size([94, 2]) torch.Size([94])\ntorch.float32 torch.float32\n\n\n\n\n\n\n\n\n\n\nclass MLP(nn.Module):\n def __init__(self, in_dim, out_dim, neurons, transform=None):\n super().__init__()\n self.layers = nn.ModuleList()\n self.transform = transform\n if transform is None:\n self.transform = lambda x: x\n self.layers.append(nn.Linear(in_dim, neurons[0]))\n else:\n self.layers.append(nn.Linear(self.transform.n_grid + 1, neurons[0]))\n for i in range(1, len(neurons)):\n self.layers.append(nn.Linear(neurons[i - 1], neurons[i]))\n self.layers.append(nn.Linear(neurons[-1], out_dim))\n\n def forward(self, x):\n x = self.transform(x)\n # print(x.shape)\n for layer in self.layers[:-1]:\n x = F.gelu(layer(x))\n return self.layers[-1](x)\n\n\nclass RBF(nn.Module):\n def __init__(self, log_gauss_var, n_grid):\n super().__init__()\n self.log_gauss_var = nn.Parameter(torch.tensor(log_gauss_var))\n self.n_grid = n_grid\n self.grid = nn.Parameter(torch.linspace(-1, 1, n_grid))\n self.register_buffer(\"bias\", torch.zeros(1))\n\n def forward(self, x):\n self.dist = dist.Normal(self.grid, torch.exp(self.log_gauss_var))\n features = torch.exp(self.dist.log_prob(x[:, 1:2]))\n # print(features.shape)\n features = torch.cat(\n [\n 
torch.ones_like(self.bias.repeat(features.shape[0])).reshape(-1, 1),\n features,\n ],\n dim=1,\n )\n return features\n\n\nRBF(0.0, 10).to(device)(x).shape\n\ntorch.Size([94, 11])\n\n\n\n# def transform_fn(x):\n# all_x = []\n# for i in range(2, 11):\n# all_x.append(x[:, 1:2] ** i)\n# return torch.hstack([x] + all_x)\n\n\ndef get_mn_sn(x, s0):\n x = transform_fn(x)\n sn_inv = (x.T @ x) / torch.exp(log_var_noise)\n diag = sn_inv.diagonal()\n diag += 1 / s0\n sn = torch.inverse(sn_inv)\n mn = sn @ ((x.T @ y) / torch.exp(log_var_noise))\n return mn, sn\n\n\ndef neg_log_likelihood(x, y, m0, s0):\n x = transform_fn(x)\n cov = (x @ x.T) / s0\n diag = cov.diagonal()\n diag += torch.exp(log_var_noise)\n return (\n -dist.MultivariateNormal(m0.repeat(y.shape[0]), cov).log_prob(y.ravel()).sum()\n )\n\n\ndef get_pred_post(sn, mn, x):\n x = transform_fn(x)\n pred_cov = x @ sn @ x.T\n diag = pred_cov.diagonal()\n diag += torch.exp(log_var_noise)\n pred_mean = x @ mn\n return pred_mean, pred_cov\n\n\ndef plot_preds_and_95(ax, x, pred_mean, pred_cov):\n with torch.no_grad():\n x = x[:, 1].cpu().numpy()\n pred_mean = pred_mean.ravel().cpu().numpy()\n pred_var = pred_cov.diagonal().cpu().numpy()\n ax.plot(x, pred_mean, color=\"red\", label=\"mean\")\n ax.fill_between(\n x,\n (pred_mean - 2 * np.sqrt(pred_var)),\n (pred_mean + 2 * np.sqrt(pred_var)),\n color=\"red\",\n alpha=0.2,\n label=\"95% CI\",\n )\n return ax\n\n\nmlp = MLP(2, 1, [256, 256, 256]).to(device)\n# mlp = RBF(0.1, 20).to(device)\ntransform_fn = mlp.forward\n\nm0 = torch.zeros((1,)).to(device)\ns0 = torch.tensor(1.0).to(device)\nwith torch.no_grad():\n log_var_noise = nn.Parameter(torch.tensor(0.1)).to(device)\n log_var_noise.requires_grad = True\n m0.requires_grad = True\n s0.requires_grad = True\n\n\noptimizer = torch.optim.Adam([*list(mlp.parameters()), log_var_noise, m0, s0], lr=0.01)\nlosses = []\npbar = tqdm(range(500))\nfor i in pbar:\n optimizer.zero_grad()\n loss = neg_log_likelihood(x, y, m0, s0)\n 
loss.backward()\n optimizer.step()\n losses.append(loss.item())\n pbar.set_description(f\"loss: {loss.item():.4f}\")\n\nplt.plot(losses)\n\nloss: 30.6285: 100%|██████████| 500/500 [00:02<00:00, 209.49it/s]\n\n\n\n\n\n\n\n\n\n\nmn, sn = get_mn_sn(x, s0)\npred_mean, pred_var = get_pred_post(sn, mn, x)\n\nfig, ax = plt.subplots()\nax = plot_preds_and_95(ax, x, pred_mean, pred_var)\nwith torch.no_grad():\n ax.scatter(x.cpu().numpy()[:, 1], y.cpu().numpy())\n # ax.vlines(mlp.transform.grid.cpu().numpy(), -1, 1, color=\"black\", alpha=0.2)\nplt.show()\n\n\n\n\n\n\n\n\n\ntorch.exp(log_var_noise), s0, m0\n\n(tensor(0.1191, device='cuda:0', grad_fn=<ExpBackward0>),\n tensor(1.3897, device='cuda:0', requires_grad=True),\n tensor([-0.0693], device='cuda:0', requires_grad=True))" + "objectID": "posts/GNN_for_regression.html", + "href": "posts/GNN_for_regression.html", + "title": "Graph Neural Networks for Regression", + "section": "", + "text": "import os\nos.environ[\"CUDA_VISIBLE_DEVICES\"] = \"3\"\n\nimport GPy\n\nimport torch\nimport torch.nn as nn\n\nfrom tqdm import trange\n\nimport numpy as np\nimport matplotlib.pyplot as plt\n\nfrom sklearn.model_selection import train_test_split\n\ndevice = \"cuda\"" }, { - "objectID": "posts/bayesian-gaussian-basis-regression.html#add-gaussian-transform", - "href": "posts/bayesian-gaussian-basis-regression.html#add-gaussian-transform", - "title": "Bayesian Basis Regression", - "section": "Add Gaussian transform", - "text": "Add Gaussian transform\n\nmlp = MLP(2, 1, [256, 256, 256], transform=RBF(0.1, 10)).to(device)\n# mlp = RBF(0.1, 20).to(device)\ntransform_fn = mlp.forward\n\nm0 = torch.zeros((1,)).to(device)\ns0 = torch.tensor(1.0).to(device)\nwith torch.no_grad():\n log_var_noise = nn.Parameter(torch.tensor(0.1)).to(device)\n log_var_noise.requires_grad = True\n m0.requires_grad = False\n s0.requires_grad = True\n\n\noptimizer = torch.optim.Adam([*list(mlp.parameters()), log_var_noise, m0, s0], lr=0.01)\nlosses = []\npbar = 
tqdm(range(500))\nfor i in pbar:\n optimizer.zero_grad()\n loss = neg_log_likelihood(x, y, m0, s0)\n loss.backward()\n optimizer.step()\n losses.append(loss.item())\n pbar.set_description(f\"loss: {loss.item():.4f}\")\n\nplt.plot(losses)\n\nloss: -29.9227: 100%|██████████| 500/500 [00:03<00:00, 156.90it/s]\n\n\n\n\n\n\n\n\n\n\nmn, sn = get_mn_sn(x, s0)\npred_mean, pred_var = get_pred_post(sn, mn, x)\n\nfig, ax = plt.subplots()\nax = plot_preds_and_95(ax, x, pred_mean, pred_var)\nwith torch.no_grad():\n ax.scatter(x.cpu().numpy()[:, 1], y.cpu().numpy())\n ax.vlines(mlp.transform.grid.cpu().numpy(), -1, 1, color=\"black\", alpha=0.2)\nplt.show()" + "objectID": "posts/GNN_for_regression.html#create-a-synthetic-dataset", + "href": "posts/GNN_for_regression.html#create-a-synthetic-dataset", + "title": "Graph Neural Networks for Regression", + "section": "Create a synthetic dataset", + "text": "Create a synthetic dataset\n\nnp.random.seed(0)\ntorch.random.manual_seed(4)\n\nN = 50\nx = np.linspace(-1, 1, N).reshape(-1, 1)\nkernel = GPy.kern.RBF(input_dim=1, variance=1, lengthscale=0.1)\ny = np.random.multivariate_normal(np.zeros(N), kernel.K(x)).reshape(-1, 1)\ny_noisy = y + np.random.normal(0, 0.1, N).reshape(-1, 1)\n\ntrain_x, test_x, train_y, test_y = train_test_split(x, y_noisy, test_size=0.4, random_state=0)\n\nplt.plot(x, y, label=\"True\");\nplt.plot(train_x, train_y, 'o', label='train')\nplt.plot(test_x, test_y, 'o', label='test')\nplt.legend();\n\nx, y, y_noisy = map(lambda x: torch.tensor(x).float().to(device), (x, y, y_noisy))\ntrain_x, test_x, train_y, test_y = map(lambda x: torch.tensor(x).float().to(device), (train_x, test_x, train_y, test_y))\nprint(x.shape, y.shape, y_noisy.shape)\n\ntorch.Size([50, 1]) torch.Size([50, 1]) torch.Size([50, 1])" }, { - "objectID": "posts/bayesian-gaussian-basis-regression.html#just-gaussian-basis", - "href": "posts/bayesian-gaussian-basis-regression.html#just-gaussian-basis", - "title": "Bayesian Basis Regression", - 
"section": "Just Gaussian basis", - "text": "Just Gaussian basis\n\n# mlp = MLP(2, 1, [32, 32, 32], transform=RBF(0.1, 10)).to(device)\nmlp = RBF(1.0, 5).to(device)\ntransform_fn = mlp.forward\n\nm0 = torch.zeros((1,)).to(device)\ns0 = torch.tensor(1.0).to(device)\nwith torch.no_grad():\n log_var_noise = nn.Parameter(torch.tensor(0.1)).to(device)\n log_var_noise.requires_grad = True\n m0.requires_grad = False\n s0.requires_grad = True\n\n\noptimizer = torch.optim.Adam([*list(mlp.parameters()), log_var_noise, m0, s0], lr=0.001)\nlosses = []\npbar = tqdm(range(500))\nfor i in pbar:\n optimizer.zero_grad()\n loss = neg_log_likelihood(x, y, m0, s0)\n loss.backward()\n optimizer.step()\n losses.append(loss.item())\n pbar.set_description(f\"loss: {loss.item():.4f}\")\n\nplt.plot(losses)\n\nloss: 207.0843: 100%|██████████| 500/500 [00:02<00:00, 195.61it/s]\n\n\n\n\n\n\n\n\n\n\nmn, sn = get_mn_sn(x, s0)\npred_mean, pred_var = get_pred_post(sn, mn, x)\n\nfig, ax = plt.subplots()\nax = plot_preds_and_95(ax, x, pred_mean, pred_var)\nwith torch.no_grad():\n ax.scatter(x.cpu().numpy()[:, 1], y.cpu().numpy())\n ax.vlines(mlp.grid.cpu().numpy(), -1, 1, color=\"black\", alpha=0.2)\nplt.show()" + "objectID": "posts/GNN_for_regression.html#fit-with-a-simple-mlp", + "href": "posts/GNN_for_regression.html#fit-with-a-simple-mlp", + "title": "Graph Neural Networks for Regression", + "section": "Fit with a simple MLP", + "text": "Fit with a simple MLP\n\ndef fit(model, x, y, A=None, lr=0.01, epochs=100):\n optimizer = torch.optim.Adam(model.parameters(), lr=lr)\n loss_fn = nn.MSELoss()\n \n if A is None:\n inputs = (x,)\n else:\n inputs = (x, A)\n \n losses = []\n pbar = trange(epochs)\n for epoch in pbar:\n optimizer.zero_grad()\n y_hat = model(*inputs)\n loss = loss_fn(y_hat, y)\n losses.append(loss.item())\n pbar.set_description(f\"Epoch {epoch} Loss: {loss.item()}\")\n loss.backward()\n optimizer.step()\n \n return losses\n\nclass SimpleMLP(nn.Module):\n def __init__(self, 
features):\n super().__init__()\n layers = [nn.Linear(1, features[0]), nn.ReLU()]\n for in_features, out_features in zip(features, features[1:]):\n layers.append(nn.Linear(in_features, out_features))\n layers.append(nn.ReLU())\n \n layers.append(nn.Linear(features[-1], 1))\n \n self.layers = nn.Sequential(*layers)\n \n def forward(self, x):\n return self.layers(x)\n\n\ntorch.manual_seed(0)\nmodel = SimpleMLP([10, 10, 10]).to(device)\nfit(model, train_x, train_y, lr=0.01, epochs=1000);\n\npred_y = model(x)\n\n(x_, y_, train_x_, train_y_, test_x_, test_y_, pred_y_) = map(lambda x: x.cpu().detach().numpy(), (x, y, train_x, train_y, test_x, test_y, pred_y))\nplt.plot(x_, y_, label=\"True\");\nplt.plot(train_x_, train_y_, 'o', label='train')\nplt.plot(test_x_, test_y_, 'o', label='test')\nplt.plot(x_, pred_y_, label='pred')\nplt.legend();\n\nEpoch 999 Loss: 0.07143261283636093: 100%|██████████| 1000/1000 [00:02<00:00, 410.79it/s]" }, { - "objectID": "posts/bayesian-gaussian-basis-regression.html#appendix", - "href": "posts/bayesian-gaussian-basis-regression.html#appendix", - "title": "Bayesian Basis Regression", - "section": "Appendix", - "text": "Appendix\n\nfrom sklearn.preprocessing import StandardScaler, MinMaxScaler\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.metrics import mean_squared_error\n\n\ndata = pd.read_csv(\"~/datasets/uci/bike/hour.csv\", header=None).iloc[:, 1:]\ndata.shape\n\n(17379, 18)\n\n\n\nX = data.iloc[:, :-1].values\ny = data.iloc[:, -1].values\nX_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.4, random_state=0)\nX_train.shape, X_test.shape, y_train.shape, y_test.shape\n\nx_scaler = MinMaxScaler()\ny_scaler = StandardScaler()\nX_train = x_scaler.fit_transform(X_train)\ny_train = y_scaler.fit_transform(y_train.reshape(-1, 1))\nX_test = x_scaler.transform(X_test)\ny_test = y_scaler.transform(y_test.reshape(-1, 1))\n\nX_train.shape, X_test.shape, y_train.shape, y_test.shape\n\n((10427, 17), (6952, 17), 
(10427, 1), (6952, 1))\n\n\n\n[X_train, X_test, y_train, y_test] = map(\n lambda x: torch.tensor(x, dtype=torch.float32).to(device),\n [X_train, X_test, y_train, y_test],\n)\n\n\nmlp = MLP(17, 1, [10, 10]).to(device)\n\noptimizer = torch.optim.Adam(mlp.parameters(), lr=0.01)\nlosses = []\npbar = tqdm(range(500))\nfor i in pbar:\n optimizer.zero_grad()\n loss = F.mse_loss(mlp(X_train), y_train)\n loss.backward()\n optimizer.step()\n losses.append(loss.item())\n pbar.set_description(f\"loss: {loss.item():.4f}\")\n\nplt.plot(losses)\n\nloss: 0.0040: 100%|██████████| 500/500 [00:01<00:00, 482.25it/s]\n\n\n\n\n\n\n\n\n\n\nwith torch.no_grad():\n y_pred = mlp(X_test).cpu().numpy()\n if isinstance(y_test, torch.Tensor):\n y_test = y_test.cpu().numpy()\n print(y_pred.shape, y_test.shape)\n print(\"RMSE\", mean_squared_error(y_test, y_pred, squared=False))\n\n(6952, 1) (6952, 1)\nRMSE 0.08354535" + "objectID": "posts/GNN_for_regression.html#create-a-gcn-layer", + "href": "posts/GNN_for_regression.html#create-a-gcn-layer", + "title": "Graph Neural Networks for Regression", + "section": "Create a GCN layer", + "text": "Create a GCN layer\n\nclass GCNLayer(nn.Module):\n def __init__(self, in_features, out_features):\n super().__init__()\n self.linear = nn.Linear(in_features, out_features)\n \n def forward(self, x, A): \n return self.linear(A @ x)\n \n \nclass GCN(nn.Module):\n def __init__(self, features):\n super().__init__()\n layers = [GCNLayer(1, features[0]), nn.ReLU()]\n for in_features, out_features in zip(features, features[1:]):\n layers.append(GCNLayer(in_features, out_features))\n layers.append(nn.ReLU())\n \n layers.append(nn.Linear(features[-1], 1))\n self.layers = nn.Sequential(*layers)\n \n def forward(self, x, A):\n for layer in self.layers:\n if isinstance(layer, GCNLayer):\n x = layer(x, A)\n else:\n x = layer(x)\n return x\n \ndef get_eucledean_A(x, exponent):\n d = ((x - x.T)**2)**0.5\n d = torch.where(d==0, torch.min(d[d!=0])/2, d) # self distance is 0, so 
replace it with half of the min distance\n A = 1/(d**exponent)\n return A/A.sum(dim=1, keepdim=True)\n\ndef get_KNN_A(x, k):\n d = torch.abs(x - x.T)\n A = torch.zeros_like(d)\n _, indices = torch.topk(d, k, dim=1, largest=False)\n for i, index in enumerate(indices):\n A[i, index] = 1\n return A/A.sum(dim=1, keepdim=True)\n\ndef fit_and_plot(title):\n model = GCN([10, 10, 10]).to(device)\n losses = fit(model, train_x, train_y, A=A_train, lr=0.001, epochs=3000);\n\n pred_y = model(x, A_all)\n\n fig, ax = plt.subplots(1, 2, figsize=(12, 4))\n axes = ax[0]\n axes.plot(losses)\n axes.set_title(\"Losses\")\n\n (x_, y_, train_x_, train_y_, test_x_, test_y_, pred_y_) = map(lambda x: x.cpu().detach().numpy(), (x, y, train_x, train_y, test_x, test_y, pred_y))\n axes = ax[1]\n axes.plot(x_, y_, label=\"True\");\n axes.plot(train_x_, train_y_, 'o', label='train')\n axes.plot(test_x_, test_y_, 'o', label='test')\n axes.plot(x_, pred_y_, label='pred')\n axes.set_title(title)\n axes.legend();" }, { - "objectID": "posts/2023-03-28-nngp.html", - "href": "posts/2023-03-28-nngp.html", - "title": "Neural Network Gaussian Process", - "section": "", - "text": "# %%capture\n# %pip install -U --force-reinstall jaxutils\n# %pip install -U jax jaxlib optax\n\n\nimport jax\nimport jax.random as jr\nimport jax.numpy as jnp\nfrom jaxutils import Dataset\n\ntry:\n from neural_tangents import stax\nexcept ModuleNotFoundError:\n %pip install neural-tangents\n from neural_tangents import stax\n\ntry:\n import optax as ox\nexcept ModuleNotFoundError:\n %pip install optax\n import optax as ox\n\ntry:\n import gpjax as gpx\nexcept ModuleNotFoundError:\n %pip install gpjax\n import gpjax as gpx\n\ntry:\n import regdata as rd\nexcept ModuleNotFoundError:\n %pip install regdata\n import regdata as rd\n\nimport matplotlib.pyplot as plt\n\n\nclass NTK(gpx.kernels.AbstractKernel):\n def __init__(self) -> None:\n super().__init__()\n\n def __call__(self, params, x, y):\n params = 
jax.tree_util.tree_map(jax.nn.softplus, params)\n init_fn, apply_fn, kernel_fn = stax.serial(\n stax.Dense(512, W_std=params[\"w1\"], b_std=params[\"b1\"]), stax.Relu(),\n stax.Dense(512, W_std=params[\"w2\"], b_std=params[\"b2\"]), stax.Relu(),\n stax.Dense(512, W_std=params[\"w3\"], b_std=params[\"b3\"]), stax.Relu(),\n stax.Dense(512, W_std=params[\"w4\"], b_std=params[\"b4\"]), stax.Relu(),\n stax.Dense(512, W_std=params[\"w5\"], b_std=params[\"b5\"]), stax.Relu(),\n stax.Dense(512, W_std=params[\"w6\"], b_std=params[\"b6\"]), stax.Relu(),\n stax.Dense(512, W_std=params[\"w7\"], b_std=params[\"b7\"]), stax.Relu(),\n stax.Dense(1, W_std=params[\"w8\"], b_std=params[\"b8\"])\n )\n return kernel_fn(x.reshape(1, 1), y.reshape(1, 1)).nngp.squeeze()\n\n def init_params(self, key):\n # return init_fn(key, input_shape=(2,1))\n return {\"w1\": 0.1, \"w2\": 0.2, \"w3\": 0.3, \"w4\": 0.4, \"w5\": 0.5, \"w6\": 0.6, \"w7\": 0.7, \"w8\": 0.8,\n \"b1\": 0.1, \"b2\": 0.2, \"b3\": 0.3, \"b4\": 0.4, \"b5\": 0.5, \"b6\": 0.6, \"b7\": 0.7, \"b8\": 0.8\n }\n\n # This is deprecated. 
Can be removed once JaxKern is updated.\n def _initialise_params(self, key):\n return self.init_params(key)\n\n\nn = 100\nnoise = 0.3\nkey = jr.PRNGKey(123)\n# x = jr.uniform(key=key, minval=-3.0, maxval=3.0, shape=(n,)).sort().reshape(-1, 1)\n# f = lambda x: jnp.sin(4 * x) + jnp.cos(2 * x)\n# signal = f(x)\n# y = signal + jr.normal(key, shape=signal.shape) * noise\nx, y, xtest = rd.MotorcycleHelmet().get_data()\ny = y.reshape(-1, 1)\n\nD = Dataset(X=x, y=y)\n\n# xtest = jnp.linspace(-3.5, 3.5, 500).reshape(-1, 1)\n# ytest = f(xtest)\n\nprint(x.shape, y.shape)\n\n(94, 1) (94, 1)\n\n\n\nkernel = NTK()\nprior = gpx.Prior(kernel=kernel)\nlikelihood = gpx.Gaussian(num_datapoints=D.n)\nposterior = prior * likelihood\n\n\nkey = jr.PRNGKey(1234)\nparameter_state = gpx.initialise(posterior, key)\nparams, trainable, bijectors = parameter_state.unpack()\nparams[\"likelihood\"][\"obs_noise\"] = jnp.array(0.1)\nparameter_state = gpx.parameters.ParameterState(params, trainable, bijectors)\nprint(params)\n\n{'kernel': {'w1': 0.1, 'w2': 0.2, 'w3': 0.3, 'w4': 0.4, 'w5': 0.5, 'w6': 0.6, 'w7': 0.7, 'w8': 0.8, 'b1': 0.1, 'b2': 0.2, 'b3': 0.3, 'b4': 0.4, 'b5': 0.5, 'b6': 0.6, 'b7': 0.7, 'b8': 0.8}, 'mean_function': {}, 'likelihood': {'obs_noise': Array(0.1, dtype=float32, weak_type=True)}}\n\n\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter w1 has no transform. Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter w2 has no transform. Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter w3 has no transform. Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter w4 has no transform. 
Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter w5 has no transform. Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter w6 has no transform. Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter w7 has no transform. Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter w8 has no transform. Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter b1 has no transform. Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter b2 has no transform. Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter b3 has no transform. Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter b4 has no transform. Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter b5 has no transform. Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter b6 has no transform. 
Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter b7 has no transform. Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter b8 has no transform. Defaulting to identity transfom.\n warnings.warn(\n\n\n\nnegative_mll = jax.jit(posterior.marginal_log_likelihood(D, negative=True))\nnegative_mll(params)\n\nArray(415.1062, dtype=float32)\n\n\n\noptimiser = ox.adam(learning_rate=0.01)\n\ninference_state = gpx.fit(\n objective=negative_mll,\n parameter_state=parameter_state,\n optax_optim=optimiser,\n num_iters=500,\n)\n\nlearned_params, training_history = inference_state.unpack()\n\n100%|██████████| 500/500 [00:02<00:00, 172.53it/s, Objective=76.34]\n\n\n\nplt.plot(training_history);\n\n\n\n\n\n\n\n\n\nlearned_params\n\n{'kernel': {'b1': Array(0.03292831, dtype=float32),\n 'b2': Array(-0.9647168, dtype=float32),\n 'b3': Array(-1.2660046, dtype=float32),\n 'b4': Array(-1.3792713, dtype=float32),\n 'b5': Array(-1.4311961, dtype=float32),\n 'b6': Array(-1.4504426, dtype=float32),\n 'b7': Array(-1.4371448, dtype=float32),\n 'b8': Array(-1.3471106, dtype=float32),\n 'w1': Array(1.0706716, dtype=float32),\n 'w2': Array(1.1768614, dtype=float32),\n 'w3': Array(1.2740505, dtype=float32),\n 'w4': Array(1.3689499, dtype=float32),\n 'w5': Array(1.462641, dtype=float32),\n 'w6': Array(1.5562503, dtype=float32),\n 'w7': Array(1.6506695, dtype=float32),\n 'w8': Array(1.7462935, dtype=float32)},\n 'likelihood': {'obs_noise': Array(0.184795, dtype=float32)},\n 'mean_function': {}}\n\n\n\nlatent_dist = posterior(learned_params, D)(xtest)\npredictive_dist = likelihood(learned_params, latent_dist)\n\npredictive_mean = predictive_dist.mean()\npredictive_std = predictive_dist.stddev()\n\n\nfig, ax = plt.subplots(figsize=(12, 5))\nax.plot(x, y, \"o\", label=\"Observations\", 
color=\"tab:red\")\nax.plot(xtest, predictive_mean, label=\"Predictive mean\", color=\"tab:blue\")\nax.fill_between(\n xtest.squeeze(),\n predictive_mean - 2 * predictive_std,\n predictive_mean + 2 * predictive_std,\n alpha=0.2,\n color=\"tab:blue\",\n label=\"Two sigma\",\n)\nax.plot(\n xtest,\n predictive_mean - predictive_std,\n color=\"tab:blue\",\n linestyle=\"--\",\n linewidth=1,\n)\nax.plot(\n xtest,\n predictive_mean + predictive_std,\n color=\"tab:blue\",\n linestyle=\"--\",\n linewidth=1,\n)\n\n# ax.plot(\n# xtest, ytest, label=\"Latent function\", color=\"black\", linestyle=\"--\", linewidth=1\n# )\n\nax.legend();" + "objectID": "posts/GNN_for_regression.html#idw-setting", + "href": "posts/GNN_for_regression.html#idw-setting", + "title": "Graph Neural Networks for Regression", + "section": "IDW setting", + "text": "IDW setting\n\nexponent = 1\nA_train = get_eucledean_A(train_x, exponent).to(device)\nA_all = get_eucledean_A(x, exponent).to(device)\ntitle = f\"Distance based adjacency matrix with exponent {exponent}\"\n\nfit_and_plot(title)\n\nEpoch 2999 Loss: 0.05447980388998985: 100%|██████████| 3000/3000 [00:07<00:00, 390.93it/s] \n\n\n\n\n\n\n\n\n\n\nexponent = 2\nA_train = get_eucledean_A(train_x, exponent).to(device)\nA_all = get_eucledean_A(x, exponent).to(device)\ntitle = f\"Distance based adjacency matrix with exponent {exponent}\"\n\nfit_and_plot(title)\n\nEpoch 2999 Loss: 0.06475391983985901: 100%|██████████| 3000/3000 [00:07<00:00, 413.49it/s]\n\n\n\n\n\n\n\n\n\n\nexponent = 3\nA_train = get_eucledean_A(train_x, exponent).to(device)\nA_all = get_eucledean_A(x, exponent).to(device)\ntitle = f\"Distance based adjacency matrix with exponent {exponent}\"\n\nfit_and_plot(title)\n\nEpoch 2999 Loss: 0.043554823845624924: 100%|██████████| 3000/3000 [00:08<00:00, 367.28it/s]" }, { - "objectID": "posts/2022-01-25-gp_frameworks_comparison.html", - "href": "posts/2022-01-25-gp_frameworks_comparison.html", - "title": "Comparing Gaussian Process Regression 
Frameworks", - "section": "", - "text": "import math\nimport numpy as np\nimport matplotlib.pyplot as plt\nimport pandas as pd\nimport torch\nimport GPy\nimport jax\nimport gpytorch\nimport botorch\nimport tinygp\nimport jax.numpy as jnp\nimport optax\nfrom IPython.display import clear_output\n\nfrom sklearn.preprocessing import StandardScaler\n\nWARNING:absl:No GPU/TPU found, falling back to CPU. (Set TF_CPP_MIN_LOG_LEVEL=0 and rerun for more info.)" + "objectID": "posts/GNN_for_regression.html#knn-setting", + "href": "posts/GNN_for_regression.html#knn-setting", + "title": "Graph Neural Networks for Regression", + "section": "KNN Setting", + "text": "KNN Setting\n\nK = 1\nA_train = get_KNN_A(train_x, K).to(device)\nA_all = get_KNN_A(x, K).to(device)\ntitle = f\"KNN based adjacency matrix with K={K}\"\n\nfit_and_plot(title)\n\nEpoch 2999 Loss: 0.04107221961021423: 100%|██████████| 3000/3000 [00:07<00:00, 383.88it/s] \n\n\n\n\n\n\n\n\n\n\nK = 3\nA_train = get_KNN_A(train_x, K).to(device)\nA_all = get_KNN_A(x, K).to(device)\ntitle = f\"KNN based adjacency matrix with K={K}\"\n\nfit_and_plot(title)\n\nEpoch 2999 Loss: 0.14372628927230835: 100%|██████████| 3000/3000 [00:07<00:00, 404.74it/s]\n\n\n\n\n\n\n\n\n\n\nK = 7\nA_train = get_KNN_A(train_x, K).to(device)\nA_all = get_KNN_A(x, K).to(device)\ntitle = f\"KNN based adjacency matrix with K={K}\"\n\nfit_and_plot(title)\n\nEpoch 2999 Loss: 0.13950258493423462: 100%|██████████| 3000/3000 [00:07<00:00, 381.66it/s]\n\n\n\n\n\n\n\n\n\n\nK = 15\nA_train = get_KNN_A(train_x, K).to(device)\nA_all = get_KNN_A(x, K).to(device)\ntitle = f\"KNN based adjacency matrix with K={K}\"\n\nfit_and_plot(title)\n\nEpoch 2999 Loss: 0.33879855275154114: 100%|██████████| 3000/3000 [00:07<00:00, 376.56it/s]" }, { - "objectID": "posts/2022-01-25-gp_frameworks_comparison.html#data", - "href": "posts/2022-01-25-gp_frameworks_comparison.html#data", - "title": "Comparing Gaussian Process Regression Frameworks", - "section": "Data", - "text": 
"Data\n\nnp.random.seed(0) # We don't want surprises in a presentation :)\nN = 10\ntrain_x = torch.linspace(0, 1, N)\ntrain_y = torch.sin(train_x * (2 * math.pi)) + torch.normal(0, 0.1, size=(N,))\n \ntest_x = torch.linspace(0, 1, N*10)\ntest_y = torch.sin(test_x * (2 * math.pi))\n\n\nplt.plot(train_x, train_y, 'ko', label='train');\nplt.plot(test_x, test_y, label='test');\nplt.legend();" }, { - "objectID": "posts/2022-01-25-gp_frameworks_comparison.html#defining-kernel", - "href": "posts/2022-01-25-gp_frameworks_comparison.html#defining-kernel", - "title": "Comparing Gaussian Process Regression Frameworks", - "section": "Defining kernel", - "text": "Defining kernel\n\\[\\begin{equation}\n\\sigma_f^2 = \\text{variance}\\\\\n\\ell = \\text{lengthscale}\\\\\nk_{RBF}(x_1, x_2) = \\sigma_f^2 \\exp \\left[-\\frac{\\lVert x_1 - x_2 \\rVert^2}{2\\ell^2}\\right]\n\\end{equation}\\]\n\nGPy\n\ngpy_kernel = GPy.kern.RBF(input_dim=1, variance=1., lengthscale=1.)\ngpy_kernel\n\n\n\n\n\n\nrbf.\nvalue\nconstraints\npriors\n\n\nvariance\n1.0\n+ve\n\n\n\nlengthscale\n1.0\n+ve\n\n\n\n\n\n\n\n\nGPyTorch\n\ngpytorch_kernel = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())\ngpytorch_kernel.outputscale = 1. # variance\ngpytorch_kernel.base_kernel.lengthscale = 1. 
# lengthscale\n\ngpytorch_kernel\n\nScaleKernel(\n (base_kernel): RBFKernel(\n (raw_lengthscale_constraint): Positive()\n )\n (raw_outputscale_constraint): Positive()\n)\n\n\n\n\nTinyGP\n\ndef RBFKernel(variance, lengthscale):\n return jnp.exp(variance) * tinygp.kernels.ExpSquared(scale=jnp.exp(lengthscale))\n \ntinygp_kernel = RBFKernel(variance=1., lengthscale=1.)\ntinygp_kernel\n\n<tinygp.kernels.Product at 0x7f544039d710>" - }, - { - "objectID": "posts/2022-01-25-gp_frameworks_comparison.html#define-model", - "href": "posts/2022-01-25-gp_frameworks_comparison.html#define-model", - "title": "Comparing Gaussian Process Regression Frameworks", - "section": "Define model", - "text": "Define model\n\\[\n\\sigma_n^2 = \\text{noise variance}\n\\]\n\nGPy\n\ngpy_model = GPy.models.GPRegression(train_x.numpy()[:,None], train_y.numpy()[:,None], gpy_kernel)\ngpy_model.Gaussian_noise.variance = 0.1\ngpy_model\n\n\n\n\nModel: GP regression\nObjective: 16.757933772959404\nNumber of Parameters: 3\nNumber of Optimization Parameters: 3\nUpdates: True\n\n\n\n\n\n\nGP_regression.\nvalue\nconstraints\npriors\n\n\nrbf.variance\n1.0\n+ve\n\n\n\nrbf.lengthscale\n1.0\n+ve\n\n\n\nGaussian_noise.variance\n0.1\n+ve\n\n\n\n\n\n\n\n\nGPyTorch\n\nclass ExactGPModel(gpytorch.models.ExactGP):\n def __init__(self, train_x, train_y, likelihood, kernel):\n super().__init__(train_x, train_y, likelihood)\n \n self.mean_module = gpytorch.means.ConstantMean()\n self.covar_module = kernel\n\n def forward(self, x):\n mean_x = self.mean_module(x)\n covar_x = self.covar_module(x)\n return gpytorch.distributions.MultivariateNormal(mean_x, covar_x)\n\ngpytorch_likelihood = gpytorch.likelihoods.GaussianLikelihood()\ngpytorch_model = ExactGPModel(train_x, train_y, gpytorch_likelihood, gpytorch_kernel)\n\ngpytorch_model.likelihood.noise = 0.1\ngpytorch_model\n\nExactGPModel(\n (likelihood): GaussianLikelihood(\n (noise_covar): HomoskedasticNoise(\n (raw_noise_constraint): GreaterThan(1.000E-04)\n )\n )\n 
(mean_module): ConstantMean()\n (covar_module): ScaleKernel(\n (base_kernel): RBFKernel(\n (raw_lengthscale_constraint): Positive()\n )\n (raw_outputscale_constraint): Positive()\n )\n)\n\n\n\n\nTinyGP\n\ndef build_gp(theta, X):\n mean = theta[0] \n variance, lengthscale, noise_variance = jnp.exp(theta[1:])\n \n kernel = variance * tinygp.kernels.ExpSquared(lengthscale)\n \n return tinygp.GaussianProcess(kernel, X, diag=noise_variance, mean=mean)\n\ntinygp_model = build_gp(theta=np.array([0., 1., 1., 0.1]), X=train_x.numpy())\n\ntinygp_model\n# __repr__\n\n<tinygp.gp.GaussianProcess at 0x7f5440401850>" - }, - { - "objectID": "posts/2022-01-25-gp_frameworks_comparison.html#train-the-model", - "href": "posts/2022-01-25-gp_frameworks_comparison.html#train-the-model", - "title": "Comparing Gaussian Process Regression Frameworks", - "section": "Train the model", - "text": "Train the model\n\nGPy\n\ngpy_model.optimize(max_iters=50)\ngpy_model\n\n\n\n\nModel: GP regression\nObjective: 3.944394423452163\nNumber of Parameters: 3\nNumber of Optimization Parameters: 3\nUpdates: True\n\n\n\n\n\n\nGP_regression.\nvalue\nconstraints\npriors\n\n\nrbf.variance\n0.9376905183253631\n+ve\n\n\n\nrbf.lengthscale\n0.2559000163858406\n+ve\n\n\n\nGaussian_noise.variance\n0.012506184441481319\n+ve\n\n\n\n\n\n\n\n\nGPyTorch\n\nmll = gpytorch.mlls.ExactMarginalLogLikelihood(gpytorch_likelihood, gpytorch_model)\nbotorch.fit_gpytorch_model(mll)\n\ndisplay(gpytorch_model.mean_module.constant, # Mean\n gpytorch_model.covar_module.outputscale, # Variance\n gpytorch_model.covar_module.base_kernel.lengthscale, # Lengthscale \n gpytorch_model.likelihood.noise) # Noise variance\n\n /opt/conda/lib/python3.7/site-packages/botorch/fit.py:143: UserWarning:CUDA initialization: CUDA unknown error - this may be due to an incorrectly set up environment, e.g. changing env variable CUDA_VISIBLE_DEVICES after program start. Setting the available devices to be zero. 
(Triggered internally at /opt/conda/conda-bld/pytorch_1634272168290/work/c10/cuda/CUDAFunctions.cpp:112.)\n\n\nParameter containing:\ntensor([0.0923], requires_grad=True)\n\n\ntensor(0.9394, grad_fn=<SoftplusBackward0>)\n\n\ntensor([[0.2560]], grad_fn=<SoftplusBackward0>)\n\n\ntensor([0.0124], grad_fn=<AddBackward0>)\n\n\n\n\nTinyGP\n\nfrom scipy.optimize import minimize\n\ndef neg_log_likelihood(theta, X, y):\n gp = build_gp(theta, X)\n return -gp.condition(y)\n\n\nobj = jax.jit(jax.value_and_grad(neg_log_likelihood))\nresult = minimize(obj, [0., 1., 1., 0.1], jac=True, args=(train_x.numpy(), train_y.numpy()))\nresult.x[0], np.exp(result.x[1:])\n\n(0.09213499552879165, array([0.9395271 , 0.25604163, 0.01243025]))" - }, - { - "objectID": "posts/2022-01-25-gp_frameworks_comparison.html#inference", - "href": "posts/2022-01-25-gp_frameworks_comparison.html#inference", - "title": "Comparing Gaussian Process Regression Frameworks", - "section": "Inference", - "text": "Inference\n\ndef plot_gp(pred_y, var_y):\n std_y = var_y ** 0.5\n plt.figure()\n plt.scatter(train_x, train_y, label='train')\n plt.plot(test_x, pred_y, label='predictive mean')\n plt.fill_between(test_x.ravel(), \n pred_y.ravel() - 2*std_y.ravel(), \n pred_y.ravel() + 2*std_y.ravel(), alpha=0.2, label='95% confidence')\n plt.legend()\n\n\nGPy\n\npred_y, var_y = gpy_model.predict(test_x.numpy()[:, None])\nplot_gp(pred_y, var_y)\n\n\n\n\n\n\n\n\n\n\nGPyTorch\n\ngpytorch_model.eval()\n\nwith torch.no_grad(), gpytorch.settings.fast_pred_var():\n pred_dist = gpytorch_likelihood(gpytorch_model(test_x))\n pred_y, var_y = pred_dist.mean, pred_dist.variance\n plot_gp(pred_y, var_y)\n\n\n\n\n\n\n\n\n\n\nTinyGP\n\ntinygp_model = build_gp(result.x, train_x.numpy())\npred_y, var_y = tinygp_model.predict(train_y.numpy(), test_x.numpy(), return_var=True)\n\nplot_gp(pred_y, var_y)" - }, - { - "objectID": "posts/2022-01-25-gp_frameworks_comparison.html#tiny-gp-on-co2-dataset", - "href": 
"posts/2022-01-25-gp_frameworks_comparison.html#tiny-gp-on-co2-dataset", - "title": "Comparing Gaussian Process Regression Frameworks", - "section": "Tiny GP on CO2 dataset", - "text": "Tiny GP on CO2 dataset\n\ndata = pd.read_csv(\"data/co2.csv\")\n\n# Train test split\nX = data[\"0\"].iloc[:290].values.reshape(-1, 1)\nX_test = data[\"0\"].iloc[290:].values.reshape(-1, 1)\ny = data[\"1\"].iloc[:290].values\ny_test = data[\"1\"].iloc[290:].values\n\n# Scaling the dataset\nXscaler = StandardScaler()\nX = Xscaler.fit_transform(X)\nX_test = Xscaler.transform(X_test)\n\nyscaler = StandardScaler()\ny = yscaler.fit_transform(y.reshape(-1, 1)).ravel()\ny_test = yscaler.transform(y_test.reshape(-1, 1)).ravel()\n\n\nplt.plot(X, y, label='train');\nplt.plot(X_test, y_test, label='test');\nplt.legend();\n\n\n\n\n\n\n\n\n\nclass SpectralMixture(tinygp.kernels.Kernel):\n def __init__(self, weight, scale, freq):\n self.weight = jnp.atleast_1d(weight)\n self.scale = jnp.atleast_1d(scale)\n self.freq = jnp.atleast_1d(freq)\n\n def evaluate(self, X1, X2):\n tau = jnp.atleast_1d(jnp.abs(X1 - X2))[..., None]\n return jnp.sum(\n self.weight\n * jnp.prod(\n jnp.exp(-2 * jnp.pi ** 2 * tau ** 2 / self.scale ** 2)\n * jnp.cos(2 * jnp.pi * self.freq * tau),\n axis=-1,\n )\n )\n \ndef build_spectral_gp(theta):\n kernel = SpectralMixture(\n jnp.exp(theta[\"log_weight\"]),\n jnp.exp(theta[\"log_scale\"]),\n jnp.exp(theta[\"log_freq\"]),\n )\n return tinygp.GaussianProcess(\n kernel, X, diag=jnp.exp(theta[\"log_diag\"]), mean=theta[\"mean\"]\n )\n\n\nK = 4 # Number of mixtures\ndiv_factor = 0.4\nnp.random.seed(1)\nparams = {\n \"log_weight\": np.abs(np.random.rand(K))/div_factor,\n \"log_scale\": np.abs(np.random.rand(K))/div_factor,\n \"log_freq\": np.abs(np.random.rand(K))/div_factor,\n \"log_diag\": np.abs(np.random.rand(1))/div_factor,\n \"mean\": 0.,\n}\n\n@jax.jit\n@jax.value_and_grad\ndef loss(theta):\n return -build_spectral_gp(theta).condition(y)\n# opt = 
optax.sgd(learning_rate=0.001)\nopt = optax.adam(learning_rate=0.1)\nopt_state = opt.init(params)\nlosses = []\nfor i in range(100):\n loss_val, grads = loss(params)\n updates, opt_state = opt.update(grads, opt_state)\n params = optax.apply_updates(params, updates)\n losses.append(loss_val)\n clear_output(wait=True)\n print(f\"iter {i}, loss {loss_val}\")\n\nopt_gp = build_spectral_gp(params)\n\nparams\n\niter 99, loss 27.987701416015625\n\n\n{'log_diag': DeviceArray([-2.7388687], dtype=float32),\n 'log_freq': DeviceArray([-3.6072493, -3.1795945, -3.4490397, -2.373117 ], dtype=float32),\n 'log_scale': DeviceArray([3.9890492, 3.8530042, 4.0878096, 4.4860597], dtype=float32),\n 'log_weight': DeviceArray([-1.3715047, -0.6132469, -2.413771 , -1.6582283], dtype=float32),\n 'mean': DeviceArray(0.38844627, dtype=float32)}\n\n\n\nplt.plot(losses);\n\n\n\n\n\n\n\n\n\nmu, var = opt_gp.predict(y, X_test, return_var=True)\n\nplt.plot(X, y, c='k')\nplt.fill_between(\n X_test.ravel(), mu + np.sqrt(var), mu - np.sqrt(var), color=\"C0\", alpha=0.5\n)\nplt.plot(X_test, mu, color=\"C0\", lw=2)\n\n# plt.xlim(t.min(), 2025)\nplt.xlabel(\"year\")\n_ = plt.ylabel(\"CO$_2$ in ppm\")" - }, - { - "objectID": "posts/2022-01-25-gp_frameworks_comparison.html#k_mathbfy-textcov_functionx_train-x_train-sigma_f-ell-sigma_n", - "href": "posts/2022-01-25-gp_frameworks_comparison.html#k_mathbfy-textcov_functionx_train-x_train-sigma_f-ell-sigma_n", - "title": "Comparing Gaussian Process Regression Frameworks", - "section": "\\(K_\\mathbf{y} = \\text{cov_function}(X_{train}, X_{train}, \\sigma_f, \\ell, \\sigma_n)\\)", - "text": "\\(K_\\mathbf{y} = \\text{cov_function}(X_{train}, X_{train}, \\sigma_f, \\ell, \\sigma_n)\\)" - }, - { - "objectID": "posts/2022-01-25-gp_frameworks_comparison.html#gp-loss-log-pmathbfy-mid-mathbfx-theta-frac12-mathbfyt-k_y-1-mathbfy-frac12-log-leftk_yright-fracn2-log-2-pi", - "href": 
"posts/2022-01-25-gp_frameworks_comparison.html#gp-loss-log-pmathbfy-mid-mathbfx-theta-frac12-mathbfyt-k_y-1-mathbfy-frac12-log-leftk_yright-fracn2-log-2-pi", - "title": "Comparing Gaussian Process Regression Frameworks", - "section": "GP Loss: \\(\\log p(\\mathbf{y} \\mid \\mathbf{X}, \\theta)=-\\frac{1}{2} \\mathbf{y}^{T} K_{y}^{-1} \\mathbf{y}-\\frac{1}{2} \\log \\left|K_{y}\\right|-\\frac{n}{2} \\log 2 \\pi\\)", - "text": "GP Loss: \\(\\log p(\\mathbf{y} \\mid \\mathbf{X}, \\theta)=-\\frac{1}{2} \\mathbf{y}^{T} K_{y}^{-1} \\mathbf{y}-\\frac{1}{2} \\log \\left|K_{y}\\right|-\\frac{n}{2} \\log 2 \\pi\\)\n\nMinimize inverse term fully\nNow, Minimize both together" - }, - { - "objectID": "posts/climate-modeling-with-siren.html", - "href": "posts/climate-modeling-with-siren.html", - "title": "Climate Modeling with SIRENs", - "section": "", - "text": "import os\nos.environ[\"CUDA_VISIBLE_DEVICES\"] = \"0\"\nos.environ[\"TF_FORCE_GPU_ALLOW_GROWTH\"] = \"true\"\n\nimport numpy as np\nimport xarray as xr\nfrom tqdm.keras import TqdmCallback\n\nimport tensorflow as tf\nfrom tensorflow.keras import layers, initializers, activations\nfrom tensorflow.keras.applications.resnet50 import ResNet50\nfrom tensorflow.keras.callbacks import LearningRateScheduler\nimport tensorflow_addons as tfa\n\nimport matplotlib.pyplot as plt\n\n/home/patel_zeel/miniconda3/envs/tensorflow_gpu/lib/python3.10/site-packages/tqdm/auto.py:21: TqdmWarning: IProgress not found. Please update jupyter and ipywidgets. 
See https://ipywidgets.readthedocs.io/en/stable/user_install.html\n from .autonotebook import tqdm as notebook_tqdm\n2023-07-18 05:13:52.439735: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.\nTo enable the following instructions: AVX2 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.\n2023-07-18 05:13:53.232689: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT\n/home/patel_zeel/miniconda3/envs/tensorflow_gpu/lib/python3.10/site-packages/tensorflow_addons/utils/tfa_eol_msg.py:23: UserWarning: \n\nTensorFlow Addons (TFA) has ended development and introduction of new features.\nTFA has entered a minimal maintenance and release mode until a planned end of life in May 2024.\nPlease modify downstream libraries to take dependencies from other repositories in our TensorFlow community (e.g. Keras, Keras-CV, and Keras-NLP). 
\n\nFor more information see: https://github.com/tensorflow/addons/issues/2807 \n\n warnings.warn(\n\n\n\ndef SIREN(input_dim, output_dim, features, activation_scale, dropout):\n first_init = lambda input_dim: initializers.RandomUniform(-1 / input_dim, 1 / input_dim)\n other_init = lambda input_dim: initializers.RandomUniform(-np.sqrt(6 / input_dim) / activation_scale, np.sqrt(6 / input_dim) / activation_scale)\n model = tf.keras.Sequential()\n model.add(layers.Dense(features[0], input_shape=(input_dim,), kernel_initializer=first_init(input_dim), activation=lambda x: tf.sin(activation_scale*x)))\n for i in range(1, len(features)):\n model.add(layers.Dense(features[i], kernel_initializer=other_init(features[i-1]), activation=lambda x: tf.sin(activation_scale*x)))\n model.add(layers.Dropout(dropout))\n model.add(layers.Dense(output_dim, kernel_initializer=other_init(features[-1]), activation='linear'))\n return model\n\ndef MLP(input_dim, output_dim, features, dropout):\n model = tf.keras.Sequential()\n model.add(layers.Dense(features[0], input_shape=(input_dim,), activation=activations.relu))\n for i in range(1, len(features)):\n model.add(layers.Dense(features[i], activation=activations.relu))\n model.add(layers.Dropout(dropout))\n model.add(layers.Dense(output_dim, activation='linear'))\n return model\n \ndef ResNet():\n resnet = ResNet50(include_top=False, weights=None, input_shape=(64, 32, 1), pooling='avg')\n model = tf.keras.Sequential()\n model.add(resnet)\n model.add(layers.Dense(2048, activation='relu'))\n model.add(layers.Dense(32768, activation='linear'))\n return model\n\n\ndata5 = xr.open_dataset(\"../../super_res/data/era5_low_res/2m_temperature/2m_temperature_2018_5.625deg.nc\")\ndata1 = xr.open_dataset(\"../../super_res/data/era5_high_res/2m_temperature/2m_temperature_2018_1.40625deg.nc\")\n\n\ndata5\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n<xarray.Dataset>\nDimensions: (lon: 64, lat: 32, time: 8760)\nCoordinates:\n * lon (lon) float64 0.0 5.625 11.25 16.88 
... 337.5 343.1 348.8 354.4\n * lat (lat) float64 -87.19 -81.56 -75.94 -70.31 ... 75.94 81.56 87.19\n * time (time) datetime64[ns] 2018-01-01 ... 2018-12-31T23:00:00\nData variables:\n t2m (time, lat, lon) float32 ...\nAttributes:\n Conventions: CF-1.6\n history: 2019-11-06 10:38:21 GMT by grib_to_netcdf-2.14.0: /opt/ecmw...\n\n\n\ntime_stamp = slice(\"2018-01\", \"2018-03\")\ntrain_df = data5.sel(time=time_stamp).to_dataframe().reset_index()\ntest_df = data1.sel(time=time_stamp).to_dataframe().reset_index()\n\nX = np.stack([train_df.lat.values, train_df.lon.values, train_df.time.astype(np.int64) / 10**9], axis=1)\ny = train_df[[\"t2m\"]].values\nprint(f\"{X.shape=}, {y.shape=}\")\n\nX_test = np.stack([test_df.lat.values, test_df.lon.values, test_df.time.astype(np.int64) / 10**9], axis=1)\ny_test = test_df[[\"t2m\"]].values\nprint(f\"{X_test.shape=}, {y_test.shape=}\")\n\n# rff = np.random.normal(size=(2, 16)) * 0.01\n# X = np.concatenate([np.sin(X @ rff), np.cos(X @ rff)], axis=1)\n# print(f\"{sin_cos.shape=}\")\n# X = X @ sin_cos\n# X_test = np.concatenate([np.sin(X_test @ rff), np.cos(X_test @ rff)], axis=1)\n\nprint(f\"{X.shape=}, {X_test.shape=}\")\n\nX.shape=(4423680, 3), y.shape=(4423680, 1)\nX_test.shape=(70778880, 3), y_test.shape=(70778880, 1)\nX.shape=(4423680, 3), X_test.shape=(70778880, 3)\n\n\n\n32*64*24*(31+28+31)\n\n4423680\n\n\n\nX_max = np.max(X, axis=0, keepdims=True)\nX_min = np.min(X, axis=0, keepdims=True)\n\nX_scaled = (X - X_min) / (X_max - X_min)\nX_test_scaled = (X_test - X_min) / (X_max - X_min)\n\n# Scaling time\nif X.shape[1] == 3:\n X_scaled[:, 2] = X_scaled[:, 2] * 10 - 5\n X_test_scaled[:, 2] = X_test_scaled[:, 2] * 10 - 5\n\ny_min = np.min(y, axis=0, keepdims=True)\ny_max = np.max(y, axis=0, keepdims=True)\n\ny_scaled = (y - y_min) / (y_max - y_min)\n\n# y_mean = np.mean(y, axis=0, keepdims=True)\n# y_std = np.std(y, axis=0, keepdims=True)\n\n# y_scaled = (y - y_mean) / y_std\n\n\nmodel = SIREN(3, 1, [256]*4, 30.0, 0.0)\n# model = MLP(3, 1, 
[256]*4, 0.0)\n# model = ResNet()\n# clr = tfa.optimizers.CyclicalLearningRate(initial_learning_rate=1e-3,\n# maximal_learning_rate=1e-2,\n# scale_fn=lambda x: 1/(2.**(x-1)),\n# step_size=2\n# )\nmodel.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3), loss='mse')\n\n2023-07-18 05:14:34.531498: W tensorflow/core/common_runtime/gpu/gpu_bfc_allocator.cc:47] Overriding orig_value setting because the TF_FORCE_GPU_ALLOW_GROWTH environment variable is set. Original config value was 0.\n2023-07-18 05:14:34.531583: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1635] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 78884 MB memory: -> device: 0, name: NVIDIA A100-SXM4-80GB, pci bus id: 0000:01:00.0, compute capability: 8.0\n\n\n\n0.00148\n\n0.00148\n\n\n\ncallbacks = [TqdmCallback(verbose=1)]\nhistory = model.fit(X_scaled, y_scaled, epochs=1000, batch_size=X_scaled.shape[0], verbose=0, callbacks=callbacks)\n\n 0%| | 0/1000 [00:00<?, ?epoch/s]2023-07-18 05:14:38.677299: I tensorflow/compiler/xla/stream_executor/cuda/cuda_blas.cc:637] TensorFloat-32 will be used for the matrix multiplication. This will only be logged once.\n2023-07-18 05:14:39.357828: I tensorflow/compiler/xla/service/service.cc:169] XLA service 0x7fb81b946130 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:\n2023-07-18 05:14:39.357901: I tensorflow/compiler/xla/service/service.cc:177] StreamExecutor device (0): NVIDIA A100-SXM4-80GB, Compute Capability 8.0\n2023-07-18 05:14:39.363158: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.\n2023-07-18 05:14:40.399794: I tensorflow/compiler/xla/stream_executor/cuda/cuda_dnn.cc:424] Loaded cuDNN version 8600\n2023-07-18 05:14:40.557542: I ./tensorflow/compiler/jit/device_compiler.h:180] Compiled cluster using XLA! 
This line is logged at most once for the lifetime of the process.\n100%|██████████| 1000/1000 [04:46<00:00, 3.49epoch/s, loss=0.000295]\n\n\n\nplt.plot(history.history['loss'][200:], label='loss');\n# plt.plot(history.history['val_loss'][200:], label='val_loss');\nplt.legend();\n\n\n\n\n\n\n\n\n\n128*256*24\n\n786432\n\n\n\nimg_index = 0\ny_pred = model.predict(X_test_scaled, batch_size=20480) * (y_max - y_min) + y_min\nprint(y_pred.shape)\nplt.imshow(y_pred[img_index*(256*128):(img_index+1)*(256*128)].reshape(256, 128), origin='lower', extent=[-180, 180, -90, 90], cmap='coolwarm', interpolation=\"none\");\n\n3456/3456 [==============================] - 5s 1ms/step\n(70778880, 1)\n\n\n\n\n\n\n\n\n\n\nplt.imshow(y.reshape(64, 32), origin='lower', extent=[-180, 180, -90, 90], cmap='coolwarm', interpolation=\"none\");\n\n\ndiff = y_pred.reshape(256, 128) - y_test.reshape(256, 128)\nplt.imshow(diff, origin='lower', extent=[-180, 180, -90, 90], cmap='coolwarm', interpolation=\"none\");\nplt.colorbar();\nplt.title(\"Diff\")\n\n\n# rmse = np.sqrt(np.mean(np.abs(X_test[:, 0:1])*(y_pred.ravel() - y_test.ravel())**2))/np.mean(y_test.ravel() * np.abs(X_test[:, 0:1]))\ndef get_lat_weights(lat):\n lat_weights = np.cos(np.deg2rad(lat))\n lat_weights = lat_weights / lat_weights.mean()\n return lat_weights\n\nlat_weights = get_lat_weights(X_test[:, 0])\nprint(f\"{lat_weights.shape=}\")\n\nlat_squared_error = lat_weights * (y_pred.ravel() - y_test.ravel())**2\nlat_rmse = np.sqrt(lat_squared_error.mean())\nprint(f\"{lat_rmse=}\")\n# y_pred.shape, lat_weights.shape\n\nlat_weights.shape=(70778880,)\nlat_rmse=2.6118446730600438\n\n\n\n# lat_rmse=3.4826956884024356\n\n\nmean_bias = np.mean(y_pred.ravel() - y_test.ravel())\nprint(f\"{mean_bias=}\")" - }, - { - "objectID": "posts/2022-04-06-github_faqs.html", - "href": "posts/2022-04-06-github_faqs.html", - "title": "GitHub Contrubuting FAQs", - "section": "", - "text": "Create separate branches for each issue. 
Do not work on the master branch.\n\n\nWe will see that in Q5."
  },
  {
    "objectID": "posts/2022-04-06-github_faqs.html#q1-what-is-an-efficient-way-to-work-on-multiple-issues-at-once",
    "href": "posts/2022-04-06-github_faqs.html#q1-what-is-an-efficient-way-to-work-on-multiple-issues-at-once",
    "title": "GitHub Contributing FAQs",
    "section": "",
    "text": "Create separate branches for each issue. Do not work on the master branch.\n\n\nWe will see that in Q5."
  },
  {
    "objectID": "posts/2022-04-06-github_faqs.html#q2-what-to-do-if-the-main-or-master-gets-updated-before-i-open-a-pr",
    "href": "posts/2022-04-06-github_faqs.html#q2-what-to-do-if-the-main-or-master-gets-updated-before-i-open-a-pr",
    "title": "GitHub Contributing FAQs",
    "section": "Q2: What to do if the main (or master) gets updated before I open a PR?",
    "text": "Q2: What to do if the main (or master) gets updated before I open a PR?\nPull the changes directly to your branch with:\ngit pull https://github.com/probml/pyprobml"
  },
  {
    "objectID": "posts/2022-04-06-github_faqs.html#q3-what-to-do-with-the-forks-main-when-the-original-main-is-updated",
    "href": "posts/2022-04-06-github_faqs.html#q3-what-to-do-with-the-forks-main-when-the-original-main-is-updated",
    "title": "GitHub Contributing FAQs",
    "section": "Q3: What to do with the fork’s main when the original main is updated?",
    "text": "Q3: What to do with the fork’s main when the original main is updated?\nFetch upstream with GitHub GUI or use the same solution given in Q2."
  },
  {
    "objectID": "posts/2022-04-06-github_faqs.html#q4-why-and-when-keeping-the-forks-main-up-to-date-with-the-original-main-is-important",
    "href": "posts/2022-04-06-github_faqs.html#q4-why-and-when-keeping-the-forks-main-up-to-date-with-the-original-main-is-important",
    "title": "GitHub Contributing FAQs",
    "section": "Q4: Why and when is it important to keep the fork’s main up to date with the original main?",
    "text": "Q4: Why and when is it important to keep the fork’s main up to date with the original main?\nWhenever we need to create new branches (usually from the fork’s main)."
  },
  {
    "objectID": "posts/non-gaussian-likelihood-mlps.html",
    "href": "posts/non-gaussian-likelihood-mlps.html",
    "title": "Non-Gaussian Likelihoods for MLPs",
    "section": "",
    "text": "# %pip install mapie\nimport os\n\nos.environ[\"CUDA_VISIBLE_DEVICES\"] = \"3\"\n\nimport numpy as np\n\nimport torch\nimport torch.nn as nn\nimport torch.distributions as dist\n\nfrom tqdm import tqdm\n\nfrom sklearn.calibration import calibration_curve\nfrom sklearn.metrics import classification_report\nimport matplotlib.pyplot as plt\nimport seaborn as sns\n\nfrom mapie.metrics import regression_coverage_score\n\ndevice = torch.device(\"cuda\" if torch.cuda.is_available() else \"cpu\")\ntorch.manual_seed(0)\n\nN = 100\nx = dist.Uniform(-1, 1).sample((N, 1)).sort(dim=0).values\nx_test = torch.linspace(-1, 1, 2 * N).view(-1, 1).sort(dim=0).values\ny = 3 * x**3 - 2 * x + 1\ny_noisy = y + dist.Gamma(0.1, 0.3).sample((N, 1))\n\nplt.plot(x, y, label=\"true\", color=\"C0\")\nplt.scatter(x, y_noisy, label=\"noisy data\", color=\"C1\")\n\nplt.legend()\nprint(\"x.shape:\", x.shape, \"y.shape:\", y.shape)\n\nx.shape: torch.Size([100, 1]) y.shape: torch.Size([100, 1])"
  },
  {
    "objectID": "posts/non-gaussian-likelihood-mlps.html#define-a-gaussiangamma-mlp",
    "href": "posts/non-gaussian-likelihood-mlps.html#define-a-gaussiangamma-mlp",
    "title": "Non-Gaussian Likelihoods for MLPs",
    "section": "Define a Gaussian/Gamma MLP",
    "text": "Define a Gaussian/Gamma MLP\n\nclass ProbabilisticMLP(nn.Module):\n def __init__(self, input_dim, feature_dims, type):\n super().__init__()\n self.input_dim = input_dim\n self.feature_dims = feature_dims\n self.type = type # \"gaussian\" or \"gamma\"\n\n self.layers = nn.ModuleList()\n self.layers.append(nn.Linear(input_dim, feature_dims[0]))\n for i in range(len(feature_dims) - 1):\n self.layers.append(nn.Linear(feature_dims[i], feature_dims[i + 1]))\n self.layers.append(nn.Linear(feature_dims[-1], 2))\n\n # likelihood parameters\n # if self.type == \"gaussian\":\n # self.register_buffer(\"likelihood_mean\", torch.zeros(1))\n # self.likelihood_log_std = nn.Parameter(torch.zeros(1))\n # elif self.type == \"gamma\":\n # self.likelihood_log_concentration = nn.Parameter(torch.zeros(1))\n # self.likelihood_log_rate = nn.Parameter(torch.zeros(1))\n\n def forward(self, x):\n for layer in self.layers[:-1]:\n x = torch.relu(layer(x))\n\n if self.type == \"gaussian\":\n # y_pred = self.layers[-1](x)\n # likelihood_mean = self.likelihood_mean.expand(y_pred.shape[0])\n # likelihood_log_std = self.likelihood_log_std.expand(y_pred.shape[0])\n # likelihood_std = torch.exp(likelihood_log_std)\n # return y_pred, likelihood_mean, likelihood_std\n\n y_out = self.layers[-1](x)\n mean = y_out[:, 0]\n log_std = y_out[:, 1]\n std = torch.exp(log_std)\n return mean.ravel(), std.ravel()\n\n elif self.type == \"gamma\":\n # y_pred = self.layers[-1](x)\n # likelihood_log_concentration = self.likelihood_log_concentration.expand(\n # y_pred.shape[0]\n # )\n # likelihood_log_rate = self.likelihood_log_rate.expand(y_pred.shape[0])\n # likelihood_concentration = torch.exp(likelihood_log_concentration)\n # likelihood_rate = 
torch.exp(likelihood_log_rate)\n # return y_pred, likelihood_concentration, likelihood_rate\n\n y_out = self.layers[-1](x)\n log_concentration = y_out[:, 0]\n log_rate = y_out[:, 1]\n concentration = torch.exp(log_concentration)\n rate = torch.exp(log_rate)\n return concentration, rate\n\n def loss_fn(self, y, param1, param2):\n if self.type == \"gaussian\":\n # epsilon = y - y_pred\n # mean = param1\n # std = param2\n # dist = torch.distributions.Normal(mean, std + 1e-6)\n # return -dist.log_prob(epsilon).mean()\n mean = param1\n std = param2\n dist = torch.distributions.Normal(mean, std + 1e-3)\n return -dist.log_prob(y.ravel()).mean()\n\n elif self.type == \"gamma\":\n # epsilon = torch.clip(y - y_pred, min=1e-6, max=1e6)\n # concentration = param1\n # rate = param2\n # dist = torch.distributions.Gamma(concentration, rate)\n # return -dist.log_prob(epsilon).mean()\n concentration = param1\n rate = param2\n dist = torch.distributions.Gamma(concentration + 1e-3, rate + 1e-3)\n return -dist.log_prob(y.ravel()).mean()" }, { - "objectID": "posts/2022-04-06-github_faqs.html#q5-how-to-update-a-change-in-a-pr-that-is-open", - "href": "posts/2022-04-06-github_faqs.html#q5-how-to-update-a-change-in-a-pr-that-is-open", - "title": "GitHub Contrubuting FAQs", - "section": "Q5: How to update a change in a PR that is open?", - "text": "Q5: How to update a change in a PR that is open?\nPush the change to the corresponding branch and PR will get updated automatically." 
+ "objectID": "posts/non-gaussian-likelihood-mlps.html#fit-gaussian-mlp", + "href": "posts/non-gaussian-likelihood-mlps.html#fit-gaussian-mlp", + "title": "Non-Gaussian Likelihoods for MLPs", + "section": "Fit Gaussian MLP", + "text": "Fit Gaussian MLP\n\ntorch.manual_seed(0)\n\nmodel = ProbabilisticMLP(1, [32, 32], \"gaussian\").to(device)\n\noptimizer = torch.optim.Adam(model.parameters(), lr=0.01)\nn_epochs = 500\n\npbar = tqdm(range(n_epochs))\nlosses = []\nfor epoch in pbar:\n optimizer.zero_grad()\n param1, param2 = model(x.to(device))\n loss = model.loss_fn(y_noisy.to(device), param1, param2)\n loss.backward()\n optimizer.step()\n losses.append(loss.item())\n\n pbar.set_description(f\"loss: {loss.item():.4f}\")\n\nplt.plot(losses)\n\nloss: 0.4503: 100%|██████████| 500/500 [00:01<00:00, 291.18it/s]\n\n\n\n\n\n\n\n\n\n\n# sns.kdeplot(param2.cpu().detach().numpy(), label=\"std\")\n\n\nwith torch.no_grad():\n y_mean, y_std = model(x_test.to(device))\n y_mean = y_mean.cpu().numpy().ravel()\n y_std = y_std.cpu().numpy().ravel()\n # y_mean = y_pred.cpu().numpy().ravel() + mean.cpu().numpy().ravel()\n # y_std = std.cpu().numpy().ravel()\n\nplt.plot(x, y, label=\"true\", color=\"C0\")\nplt.scatter(x, y_noisy, label=\"noisy data\", color=\"C1\")\nplt.plot(x_test, y_mean, label=\"y_mean\", color=\"C2\")\nplt.fill_between(\n x_test.squeeze(),\n y_mean - 2 * y_std,\n y_mean + 2 * y_std,\n alpha=0.3,\n color=\"C2\",\n label=\"95% CI\",\n)\n\nplt.legend()\n\n\n\n\n\n\n\n\n\nwith torch.no_grad():\n y_mean, y_std = model(x.to(device))\n y_mean = y_mean.cpu().numpy().ravel()\n y_std = y_std.cpu().numpy().ravel()\n\nupper = y_mean + 2 * y_std\nlower = y_mean - 2 * y_std\n\nregression_coverage_score(y_noisy.numpy(), lower, upper)\n\n0.91" }, { - "objectID": "posts/torch-tips.html", - "href": "posts/torch-tips.html", - "title": "PyTorch Tips", - "section": "", - "text": "Several tips for building torch models from scratch from my experience. 
Some of the tips are like zen, they are not immediately intuitive but useful for efficient code.\n\nAll the initializations or new tensor creation should only happen in the __init__ method. During the forward() call, ideally no new tensors should be created from scratch such as torch.zeros(), torch.ones() etc. Reason: Violating this can sometimes break your forward pass and end-to-end backprop may become buggy.\n.cuda() and .cpu() are discouraged, use .to(device) instead. Reason: .to(device) is more dynamic and scalable.\nDo not save models with torch.save(model), as that may become incompatible with different torch versions and may take more memory. Save torch.save(model.state_dict()) instead.\nNeed to set parameter names dynamically? Use this example, zero=0;self.register_parameter(f\"name_{zero}\"). They can be accessed with model.name_0.\nHave something in model which is necessary for forward pass but does not require backprop? Define those variables with self.register_buffer.\nLet .to(device) be set outside the model definition. Reason: It is less confusing to the users this way and it is less messy with internal tools that set the device, such as:\n\nmodule.to(device) sends all parameters and buffers of model/submodules to the device.\n\nmodule.float() or module.double() will convert all model/submodule parameters and buffers into float32 and float64 respectively.\nLet .train() and .eval() be set outside the model definition or set by the user. Reason: It can be confusing to the user if these things are used inside the model against torch conventions.\ntorch.no_grad() should not be used within the model. Reason: Sometimes the user may want to backprop through that chunk of code.\nLink multiple modules together. Reason: Ideally, it is useful if the model is built like an assembled product (say a car). You should be able to replace the parts as per your requirement. 
Several benefits on these lines are:\n\nsetting module.train() or module.eval() puts all submodules in train mode or eval mode respectively.\nAll submodules parameters can be accesses directly from the parent module with module.parameters().\n\nCreating a list of parameters in model __init__ definition? consider torch.nn.ModuleList(params) else individual parameters in the list will not be recognized as parameters." + "objectID": "posts/non-gaussian-likelihood-mlps.html#fit-gamma-mlp", + "href": "posts/non-gaussian-likelihood-mlps.html#fit-gamma-mlp", + "title": "Non-Gaussian Likelihoods for MLPs", + "section": "Fit Gamma MLP", + "text": "Fit Gamma MLP\n\nmodel = ProbabilisticMLP(1, [32, 32, 32], \"gamma\").to(device)\n\noptimizer = torch.optim.Adam(model.parameters(), lr=0.01)\nn_epochs = 1000\n\npbar = tqdm(range(n_epochs))\nlosses = []\nfor epoch in pbar:\n optimizer.zero_grad()\n param1, param2 = model(x.to(device))\n loss = model.loss_fn(y_noisy.to(device), param1, param2)\n loss.backward()\n optimizer.step()\n losses.append(loss.item())\n\n pbar.set_description(f\"loss: {loss.item():.4f}\")\n\nplt.plot(losses)\n\nloss: 0.0775: 100%|██████████| 1000/1000 [00:03<00:00, 266.98it/s]\n\n\n\n\n\n\n\n\n\n\nfrom scipy.special import gammaincinv, gamma\n\nwith torch.no_grad():\n concetration, rate = model(x_test.to(device))\n concetration = concetration.cpu().ravel().numpy()\n rate = rate.cpu().ravel().numpy()\n\n y_mode = (concetration - 1) / rate\n\n quantile_fn = lambda p: gammaincinv(concetration, gamma(concetration) * p) / rate\n\n upper = quantile_fn(0.975)\n lower = quantile_fn(0.025)\n\nplt.plot(x, y, label=\"true\", color=\"C0\")\nplt.scatter(x, y_noisy, label=\"noisy data\", color=\"C1\")\nplt.plot(x_test, y_mode, label=\"mean\", color=\"C2\")\nplt.fill_between(\n x_test.squeeze(),\n lower,\n upper,\n alpha=0.3,\n color=\"C2\",\n label=\"95% CI\",\n)\n\nplt.legend()\n\n\n\n\n\n\n\n\n\nwith torch.no_grad():\n param1, param2 = model(x.to(device))\n 
concetration = param1.cpu().numpy().ravel()\n rate = param2.cpu().numpy().ravel()\n\n upper = quantile_fn(0.975)\n lower = quantile_fn(0.025)\n\nregression_coverage_score(y_noisy.numpy(), lower, upper)\n\n0.07" }, { "objectID": "posts/2022-03-08-torch-essentials.html", @@ -406,193 +322,179 @@ "text": "NN way\n\nclass LinearRegression(torch.nn.Module):\n def __init__(self):\n super().__init__()\n self.layer = torch.nn.Linear(2, 1) # torch.nn.Linear(128, 64)\n # What else? \n# self.activation = torch.nn.ReLU()\n# torch.nn.LSTM()\n# torch.nn.Conv2d()\n \n def forward(self, x): # Don't call directly. it is called by __call__ method\n x_plus_ones = torch.cat([torch.ones_like(x), x], dim=1)\n y_pred = self.layer(x_plus_ones)\n return y_pred" }, { - "objectID": "posts/2024-12-10-cpcb-download.html", - "href": "posts/2024-12-10-cpcb-download.html", - "title": "Download CPCB live data", - "section": "", - "text": "import os\nimport re\nfrom glob import glob\nimport pandas as pd\nfrom tqdm.notebook import tqdm\nfrom selenium import webdriver\nfrom selenium.webdriver.common.by import By\nfrom selenium.webdriver.common.keys import Keys\nfrom selenium.webdriver.support.ui import Select, WebDriverWait\nfrom selenium.webdriver.common.action_chains import ActionChains\nfrom selenium.webdriver.support import expected_conditions as EC\nfrom selenium.webdriver.chrome.options import Options\nfrom time import sleep\n\nHOME_URL = \"https://airquality.cpcb.gov.in/ccr/#/caaqm-dashboard-all/caaqm-landing\"\nDOWNLOAD_OLD_DATA_URL = \"https://airquality.cpcb.gov.in/ccr/#/caaqm-dashboard-all/caaqm-landing/caaqm-data-repository\"\nDOWNLOAD_PAGE_URL = \"https://airquality.cpcb.gov.in/ccr/#/caaqm-dashboard-all/caaqm-landing/data\"\ndef click_it(driver, element):\n driver.execute_script(\"arguments[0].click();\", element)\n \ndef find_it(element, option):\n return element.find_element(By.XPATH, f\"//li[contains(text(), '{option}')]\")\n\ndef select_dropdown_option(driver, element, option):\n 
element.click()\n option = find_it(element, option)\n click_it(driver, option)" - }, - { - "objectID": "posts/2024-12-10-cpcb-download.html#dry-run-to-get-metadata", - "href": "posts/2024-12-10-cpcb-download.html#dry-run-to-get-metadata", - "title": "Download CPCB live data", - "section": "Dry run to get metadata", - "text": "Dry run to get metadata\n\n# headless chrome\noptions = Options()\noptions.add_argument(\"--headless\")\n\n# open the browser\ndriver = webdriver.Chrome(options=options)\n\n# open the website\ndriver.get(DOWNLOAD_OLD_DATA_URL)\n\n# wait for the page to load and the dropdowns to appear\ndropdowns = WebDriverWait(driver, 10).until(EC.presence_of_all_elements_located((By.CSS_SELECTOR, \".select-box\")))\nlen(dropdowns)\n\n5\n\n\n\ndrop_data_type, drop_frequency, drop_states, drop_cities, drop_stations = dropdowns\n\n\n# Select data type\nselect_dropdown_option(driver, drop_data_type, \"Raw data\")\n\n# Select frequency\nselect_dropdown_option(driver, drop_frequency, \"1 day\")\n\n# Get the states\ndrop_states.click() # Open the dropdown\nstates = drop_states.text.replace(\"▲\\n\", \"\").split(\"\\n\")\nprint(\"Number of states:\", len(states))\ndrop_states.click() # Close the dropdown\n\nNumber of states: 31\n\n\n\nmetadata_df = pd.DataFrame(columns=[\"State\", \"City\", \"Station\", \"site_id\"])\n\n# This loop took less than a minute to run\nprogress_bar = tqdm(total=600) # as of 2024, 560 stations. 
update this number if it changes\nfor state in states:\n select_dropdown_option(driver, drop_states, state)\n \n # Get all cities\n drop_cities.click() # Open the dropdown\n cities = drop_cities.text.replace(\"▲\\n\", \"\").split(\"\\n\")\n drop_cities.click() # Close the dropdown\n \n for city in cities:\n select_dropdown_option(driver, drop_cities, city)\n \n # Get all stations\n drop_stations.click() # Open the dropdown\n stations = drop_stations.text.replace(\"▲\\n\", \"\").split(\"\\n\")\n drop_stations.click() # Close the dropdown\n \n for station in stations:\n # corner cases\n if station == \"Municipal Corporation Office, Dharuhera - HSPCB\":\n site_id = \"site_5044\"\n elif station == \"Civil Lines, Ajmer - RSPCB\":\n site_id = \"site_1392\"\n else:\n try:\n select_dropdown_option(driver, drop_stations, station)\n except:\n print(\"Unable to select station\")\n print(station)\n print(drop_stations.text)\n continue\n site_id = drop_stations.get_attribute(\"ng-reflect-model\")\n metadata_df.loc[len(metadata_df)] = [state, city, station, site_id]\n progress_bar.update(1)\n\n\n\n\n\nlen(metadata_df)\n\n560\n\n\n\nmetadata_df.head()\n\n\n\n\n\n\n\n\nState\nCity\nStation\nsite_id\n\n\n\n\n0\nAndhra Pradesh\nAmaravati\nSecretariat, Amaravati - APPCB\nsite_1406\n\n\n1\nAndhra Pradesh\nAnantapur\nGulzarpet, Anantapur - APPCB\nsite_5632\n\n\n2\nAndhra Pradesh\nChittoor\nGangineni Cheruvu, Chittoor - APPCB\nsite_5665\n\n\n3\nAndhra Pradesh\nKadapa\nYerramukkapalli, Kadapa - APPCB\nsite_5693\n\n\n4\nAndhra Pradesh\nRajamahendravaram\nAnand Kala Kshetram, Rajamahendravaram - APPCB\nsite_1399\n\n\n\n\n\n\n\n\nmetadata_df.tail()\n\n\n\n\n\n\n\n\nState\nCity\nStation\nsite_id\n\n\n\n\n555\nWest Bengal\nKolkata\nRabindra Bharati University, Kolkata - WBPCB\nsite_296\n\n\n556\nWest Bengal\nKolkata\nFort William, Kolkata - WBPCB\nsite_5110\n\n\n557\nWest Bengal\nKolkata\nVictoria, Kolkata - WBPCB\nsite_309\n\n\n558\nWest Bengal\nKolkata\nBidhannagar, Kolkata - 
WBPCB\nsite_5129\n\n\n559\nWest Bengal\nSiliguri\nWard-32 Bapupara, Siliguri - WBPCB\nsite_1419\n\n\n\n\n\n\n\n\nfor site_id, more_than_1 in (metadata_df.site_id.value_counts() > 1).items():\n if more_than_1:\n print(metadata_df[metadata_df.site_id == site_id])\n\n State City Station site_id\n25 Bihar Aurangabad MIDC Chilkalthana, Aurangabad - MPCB site_5788\n254 Maharashtra Aurangabad MIDC Chilkalthana, Aurangabad - MPCB site_5788\n State City Station site_id\n26 Bihar Aurangabad More Chowk Waluj, Aurangabad - MPCB site_198\n255 Maharashtra Aurangabad More Chowk Waluj, Aurangabad - MPCB site_198\n State City Station \\\n499 Uttar Pradesh Greater Noida Knowledge Park - V, Greater Noida - UPPCB \n526 Uttar Pradesh Noida Knowledge Park - V, Greater Noida - UPPCB \n\n site_id \n499 site_5121 \n526 site_5121 \n State City \\\n498 Uttar Pradesh Greater Noida \n525 Uttar Pradesh Noida \n\n Station site_id \n498 Knowledge Park - III, Greater Noida - UPPCB site_1541 \n525 Knowledge Park - III, Greater Noida - UPPCB site_1541 \n State City Station site_id\n28 Bihar Aurangabad Rachnakar Colony, Aurangabad - MPCB site_5789\n257 Maharashtra Aurangabad Rachnakar Colony, Aurangabad - MPCB site_5789\n State City Station site_id\n27 Bihar Aurangabad Gurdeo Nagar, Aurangabad - BSPCB site_5544\n256 Maharashtra Aurangabad Gurdeo Nagar, Aurangabad - BSPCB site_5544\n\n\n\n# clean up\ndrop_items = [metadata_df[(metadata_df.State == \"Bihar\") & (metadata_df.Station == \"MIDC Chilkalthana, Aurangabad - MPCB\")].index.item(),\n metadata_df[(metadata_df.City == \"Noida\") & (metadata_df.Station == \"Knowledge Park - III, Greater Noida - UPPCB\")].index.item(),\n metadata_df[(metadata_df.State == \"Bihar\") & (metadata_df.Station == \"More Chowk Waluj, Aurangabad - MPCB\")].index.item(),\n metadata_df[(metadata_df.State == \"Bihar\") & (metadata_df.Station == \"MIDC Chilkalthana, Aurangabad - MPCB\")].index.item(),\n metadata_df[(metadata_df.State == \"Maharashtra\") & (metadata_df.Station 
== \"Gurdeo Nagar, Aurangabad - BSPCB\")].index.item(),\n metadata_df[(metadata_df.State == \"Bihar\") & (metadata_df.Station == \"Rachnakar Colony, Aurangabad - MPCB\")].index.item(),\n metadata_df[(metadata_df.City == \"Noida\") & (metadata_df.Station == \"Knowledge Park - V, Greater Noida - UPPCB\")].index.item()]\n\nmetadata_df.drop(drop_items, inplace=True)\nlen(metadata_df)\n\n554\n\n\n\nassert set(metadata_df.site_id.value_counts()) == {1}\n\n\nmetadata_df.to_csv(\"metadata.csv\", index=False)" - }, - { - "objectID": "posts/2024-12-10-cpcb-download.html#downloading-data", - "href": "posts/2024-12-10-cpcb-download.html#downloading-data", - "title": "Download CPCB live data", - "section": "Downloading data", - "text": "Downloading data\n\n# URL is specific to PM2.5 and PM10 so update it as per your needs\ndef get_url(state, city, site_id):\n return f\"https://airquality.cpcb.gov.in/ccr/#/caaqm-dashboard-all/caaqm-view-data-report/%2522%257B%255C%2522parameter_list%255C%2522%253A%255B%257B%255C%2522id%255C%2522%253A0%252C%255C%2522itemName%255C%2522%253A%255C%2522PM2.5%255C%2522%252C%255C%2522itemValue%255C%2522%253A%255C%2522parameter_193%255C%2522%257D%252C%257B%255C%2522id%255C%2522%253A1%252C%255C%2522itemName%255C%2522%253A%255C%2522PM10%255C%2522%252C%255C%2522itemValue%255C%2522%253A%255C%2522parameter_215%255C%2522%257D%255D%252C%255C%2522criteria%255C%2522%253A%255C%252224%2520Hours%255C%2522%252C%255C%2522reportFormat%255C%2522%253A%255C%2522Tabular%255C%2522%252C%255C%2522fromDate%255C%2522%253A%255C%252201-01-2024%2520T00%253A00%253A00Z%255C%2522%252C%255C%2522toDate%255C%2522%253A%255C%252211-12-2024%2520T16%253A45%253A59Z%255C%2522%252C%255C%2522state%255C%2522%253A%255C%2522{state.replace(' ', '%2520')}%255C%2522%252C%255C%2522city%255C%2522%253A%255C%2522{city.replace(' ', 
'%2520')}%255C%2522%252C%255C%2522station%255C%2522%253A%255C%2522{site_id}%255C%2522%252C%255C%2522parameter%255C%2522%253A%255B%255C%2522parameter_193%255C%2522%252C%255C%2522parameter_215%255C%2522%255D%252C%255C%2522parameterNames%255C%2522%253A%255B%255C%2522PM2.5%255C%2522%252C%255C%2522PM10%255C%2522%255D%257D%2522\"\n\n\n# add download directory\noptions = webdriver.ChromeOptions()\noptions.add_experimental_option(\"prefs\", {\n \"download.default_directory\": \"/Users/project561/cpcb_downloads\"\n})\n\ndriver = webdriver.Chrome(options=options)\ndriver.get(HOME_URL)\n\nEnter Captcha manually before moving ahead\n\nmetadata_df = pd.read_csv(\"metadata.csv\")\nmetadata_df.head(2)\n\n\n\n\n\n\n\n\nState\nCity\nStation\nsite_id\n\n\n\n\n0\nAndhra Pradesh\nAmaravati\nSecretariat, Amaravati - APPCB\nsite_1406\n\n\n1\nAndhra Pradesh\nAnantapur\nGulzarpet, Anantapur - APPCB\nsite_5632\n\n\n\n\n\n\n\n\nfiles = glob(\"/Users/project561/cpcb_downloads/*.xlsx\")\nprint(\"Number of files in the download directory:\", len(files))\nsite_ids = [re.search(r\"site_\\d+?2024\", file).group()[:-4] for file in files]\n# assert len(set(site_ids)) == len(site_ids), pd.Series(site_ids).value_counts()\nsite_ids = set(site_ids)\n\nfor i in range(len(metadata_df)):\n state, city, station, site_id = metadata_df.iloc[i]\n if site_id in site_ids:\n # print(\"Already downloaded\", i, state, city, station, site_id)\n continue\n print(\"Downloading\", i, state, city, station, site_id)\n url = get_url(state, city, site_id)\n \n # open new tab\n driver.execute_script(\"window.open('');\")\n driver.switch_to.window(driver.window_handles[-1])\n driver.get(url)\n excel_button = WebDriverWait(driver, 20).until(\n EC.element_to_be_clickable((By.CLASS_NAME, \"fa-file-excel-o\")))\n click_it(driver, excel_button)\n sleep(1)\n \n if len(driver.window_handles) > 10:\n # close first 9 windows\n for _ in range(9):\n driver.switch_to.window(driver.window_handles[0])\n driver.close()\n \n 
driver.switch_to.window(driver.window_handles[-1])\n sleep(1)\n\nNumber of files in the download directory: 302\nDownloading 301 Maharashtra Nagpur Ram Nagar, Nagpur - MPCB site_5793\nDownloading 302 Maharashtra Nagpur Mahal, Nagpur - MPCB site_5796\nDownloading 303 Maharashtra Nagpur Opp GPO Civil Lines, Nagpur - MPCB site_303\nDownloading 304 Maharashtra Nagpur Ambazari, Nagpur - MPCB site_5792\nDownloading 305 Maharashtra Nanded Sneh Nagar, Nanded - MPCB site_5795\nDownloading 306 Maharashtra Nashik Pandav Nagari, Nashik - MPCB site_5779\nDownloading 307 Maharashtra Nashik MIDC Ambad, Nashik - MPCB site_5781\nDownloading 308 Maharashtra Nashik Gangapur Road, Nashik - MPCB site_304\nDownloading 309 Maharashtra Nashik Hirawadi, Nashik - MPCB site_5782\nDownloading 310 Maharashtra Navi Mumbai Tondare-Taloja, Navi Mumbai - MPCB site_5803\nDownloading 311 Maharashtra Navi Mumbai Sanpada, Navi Mumbai - MPCB site_5815\nDownloading 312 Maharashtra Navi Mumbai Airoli, Navi Mumbai - MPCB site_261\nDownloading 313 Maharashtra Navi Mumbai Mahape, Navi Mumbai - MPCB site_5114\nDownloading 314 Maharashtra Navi Mumbai Kopripada-Vashi, Navi Mumbai - MPCB site_5805\nDownloading 315 Maharashtra Navi Mumbai Sector-19A Nerul, Navi Mumbai - IITM site_5401\nDownloading 316 Maharashtra Navi Mumbai Nerul, Navi Mumbai - MPCB site_5103\nDownloading 317 Maharashtra Navi Mumbai Sector-2E Kalamboli, Navi Mumbai - MPCB site_5799\nDownloading 318 Maharashtra Parbhani Masoom Colony, Parbhani - MPCB site_5794\nDownloading 319 Maharashtra Pimpri-Chinchwad Park Street Wakad, Pimpri Chinchwad - MPCB site_5764\nDownloading 320 Maharashtra Pimpri-Chinchwad Savta Mali Nagar, Pimpri-Chinchwad - IITM site_5998\nDownloading 321 Maharashtra Pimpri-Chinchwad Thergaon, Pimpri Chinchwad - MPCB site_5765\nDownloading 322 Maharashtra Pimpri-Chinchwad Gavalinagar, Pimpri Chinchwad - MPCB site_5763\nDownloading 323 Maharashtra Pune Revenue Colony-Shivajinagar, Pune - IITM site_5409\nDownloading 324 Maharashtra 
Pune Mhada Colony, Pune - IITM site_5404\nDownloading 325 Maharashtra Pune Savitribai Phule Pune University, Pune - MPCB site_5767\nDownloading 326 Maharashtra Pune Bhumkar Nagar, Pune - IITM site_5988\nDownloading 327 Maharashtra Pune Hadapsar, Pune - IITM site_5407\nDownloading 328 Maharashtra Pune Karve Road, Pune - MPCB site_292\nDownloading 329 Maharashtra Pune Alandi, Pune - IITM site_5405" - }, - { - "objectID": "posts/py_over_ipynb.html", - "href": "posts/py_over_ipynb.html", - "title": "Why .py files are better than .ipynb files for ML codebase", + "objectID": "posts/bayesian-gaussian-basis-regression.html", + "href": "posts/bayesian-gaussian-basis-regression.html", + "title": "Bayesian Basis Regression", "section": "", - "text": "I have shifted from .ipynb files to .py files (and Jupyter to VS code) in the last couple of months. Here are some reasons why I feel .py files are better than .ipynb files:" + "text": "import os\n\nos.environ[\"CUDA_VISIBLE_DEVICES\"] = \"3\"\n\nimport numpy as np\nimport pandas as pd\nimport regdata as rd\n\nimport torch\nimport torch.nn as nn\nimport torch.nn.functional as F\nimport torch.distributions as dist\n\nfrom tqdm import tqdm\n\nimport matplotlib.pyplot as plt\n\ndevice = \"cuda\"\nrd.set_backend(\"torch\")" }, { - "objectID": "posts/py_over_ipynb.html#fewer-errors", - "href": "posts/py_over_ipynb.html#fewer-errors", - "title": "Why .py files are better than .ipynb files for ML codebase", - "section": "Fewer Errors", - "text": "Fewer Errors\n\n.py files are easier to debug with a VS code like IDE, making it easier to find the errors.\nExecution of .py starts fresh, unlike some left out variables silently getting carried over from the last execution/deleted cells in .ipynb files." 
+ "objectID": "posts/bayesian-gaussian-basis-regression.html#generate-data", + "href": "posts/bayesian-gaussian-basis-regression.html#generate-data", + "title": "Bayesian Basis Regression", + "section": "Generate data", + "text": "Generate data\n\n# x = torch.linspace(-1, 1, 100)\n# y = (torch.sin(x * 2 * torch.pi) + torch.randn(x.size()) * 0.1).unsqueeze(1)\nx, y, _ = rd.MotorcycleHelmet().get_data()\nx = x.ravel().to(torch.float32)\nidx = np.argsort(x)\nx = x[idx]\ny = y.to(torch.float32)\ny = y[idx]\n\nx = torch.vstack([torch.ones_like(x), x]).T\nprint(x.shape, y.shape)\nx = x.to(device)\ny = y.to(device)\nprint(x.dtype, y.dtype)\n\nplt.scatter(x.cpu().numpy()[:, 1], y.cpu().numpy())\n\ntorch.Size([94, 2]) torch.Size([94])\ntorch.float32 torch.float32\n\n\n\n\n\n\n\n\n\n\nclass MLP(nn.Module):\n def __init__(self, in_dim, out_dim, neurons, transform=None):\n super().__init__()\n self.layers = nn.ModuleList()\n self.transform = transform\n if transform is None:\n self.transform = lambda x: x\n self.layers.append(nn.Linear(in_dim, neurons[0]))\n else:\n self.layers.append(nn.Linear(self.transform.n_grid + 1, neurons[0]))\n for i in range(1, len(neurons)):\n self.layers.append(nn.Linear(neurons[i - 1], neurons[i]))\n self.layers.append(nn.Linear(neurons[-1], out_dim))\n\n def forward(self, x):\n x = self.transform(x)\n # print(x.shape)\n for layer in self.layers[:-1]:\n x = F.gelu(layer(x))\n return self.layers[-1](x)\n\n\nclass RBF(nn.Module):\n def __init__(self, log_gauss_var, n_grid):\n super().__init__()\n self.log_gauss_var = nn.Parameter(torch.tensor(log_gauss_var))\n self.n_grid = n_grid\n self.grid = nn.Parameter(torch.linspace(-1, 1, n_grid))\n self.register_buffer(\"bias\", torch.zeros(1))\n\n def forward(self, x):\n self.dist = dist.Normal(self.grid, torch.exp(self.log_gauss_var))\n features = torch.exp(self.dist.log_prob(x[:, 1:2]))\n # print(features.shape)\n features = torch.cat(\n [\n torch.ones_like(self.bias.repeat(features.shape[0])).reshape(-1, 
1),\n features,\n ],\n dim=1,\n )\n return features\n\n\nRBF(0.0, 10).to(device)(x).shape\n\ntorch.Size([94, 11])\n\n\n\n# def transform_fn(x):\n# all_x = []\n# for i in range(2, 11):\n# all_x.append(x[:, 1:2] ** i)\n# return torch.hstack([x] + all_x)\n\n\ndef get_mn_sn(x, s0):\n x = transform_fn(x)\n sn_inv = (x.T @ x) / torch.exp(log_var_noise)\n diag = sn_inv.diagonal()\n diag += 1 / s0\n sn = torch.inverse(sn_inv)\n mn = sn @ ((x.T @ y) / torch.exp(log_var_noise))\n return mn, sn\n\n\ndef neg_log_likelihood(x, y, m0, s0):\n x = transform_fn(x)\n cov = (x @ x.T) / s0\n diag = cov.diagonal()\n diag += torch.exp(log_var_noise)\n return (\n -dist.MultivariateNormal(m0.repeat(y.shape[0]), cov).log_prob(y.ravel()).sum()\n )\n\n\ndef get_pred_post(sn, mn, x):\n x = transform_fn(x)\n pred_cov = x @ sn @ x.T\n diag = pred_cov.diagonal()\n diag += torch.exp(log_var_noise)\n pred_mean = x @ mn\n return pred_mean, pred_cov\n\n\ndef plot_preds_and_95(ax, x, pred_mean, pred_cov):\n with torch.no_grad():\n x = x[:, 1].cpu().numpy()\n pred_mean = pred_mean.ravel().cpu().numpy()\n pred_var = pred_cov.diagonal().cpu().numpy()\n ax.plot(x, pred_mean, color=\"red\", label=\"mean\")\n ax.fill_between(\n x,\n (pred_mean - 2 * np.sqrt(pred_var)),\n (pred_mean + 2 * np.sqrt(pred_var)),\n color=\"red\",\n alpha=0.2,\n label=\"95% CI\",\n )\n return ax\n\n\nmlp = MLP(2, 1, [256, 256, 256]).to(device)\n# mlp = RBF(0.1, 20).to(device)\ntransform_fn = mlp.forward\n\nm0 = torch.zeros((1,)).to(device)\ns0 = torch.tensor(1.0).to(device)\nwith torch.no_grad():\n log_var_noise = nn.Parameter(torch.tensor(0.1)).to(device)\n log_var_noise.requires_grad = True\n m0.requires_grad = True\n s0.requires_grad = True\n\n\noptimizer = torch.optim.Adam([*list(mlp.parameters()), log_var_noise, m0, s0], lr=0.01)\nlosses = []\npbar = tqdm(range(500))\nfor i in pbar:\n optimizer.zero_grad()\n loss = neg_log_likelihood(x, y, m0, s0)\n loss.backward()\n optimizer.step()\n losses.append(loss.item())\n 
pbar.set_description(f\"loss: {loss.item():.4f}\")\n\nplt.plot(losses)\n\nloss: 30.6285: 100%|██████████| 500/500 [00:02<00:00, 209.49it/s]\n\n\n\n\n\n\n\n\n\n\nmn, sn = get_mn_sn(x, s0)\npred_mean, pred_var = get_pred_post(sn, mn, x)\n\nfig, ax = plt.subplots()\nax = plot_preds_and_95(ax, x, pred_mean, pred_var)\nwith torch.no_grad():\n ax.scatter(x.cpu().numpy()[:, 1], y.cpu().numpy())\n # ax.vlines(mlp.transform.grid.cpu().numpy(), -1, 1, color=\"black\", alpha=0.2)\nplt.show()\n\n\n\n\n\n\n\n\n\ntorch.exp(log_var_noise), s0, m0\n\n(tensor(0.1191, device='cuda:0', grad_fn=<ExpBackward0>),\n tensor(1.3897, device='cuda:0', requires_grad=True),\n tensor([-0.0693], device='cuda:0', requires_grad=True))" }, { - "objectID": "posts/py_over_ipynb.html#better-usage-of-a-shared-server", - "href": "posts/py_over_ipynb.html#better-usage-of-a-shared-server", - "title": "Why .py files are better than .ipynb files for ML codebase", - "section": "Better Usage of a Shared Server", - "text": "Better Usage of a Shared Server\n\n.py files release the resources (e.g., GPU memory) once executed. It could be inconvenient to repeatedly remind or be reminded by someone to release the resources manually from a Jupyter notebook." 
+ "objectID": "posts/bayesian-gaussian-basis-regression.html#add-gaussian-transform", + "href": "posts/bayesian-gaussian-basis-regression.html#add-gaussian-transform", + "title": "Bayesian Basis Regression", + "section": "Add Gaussian transform", + "text": "Add Gaussian transform\n\nmlp = MLP(2, 1, [256, 256, 256], transform=RBF(0.1, 10)).to(device)\n# mlp = RBF(0.1, 20).to(device)\ntransform_fn = mlp.forward\n\nm0 = torch.zeros((1,)).to(device)\ns0 = torch.tensor(1.0).to(device)\nwith torch.no_grad():\n log_var_noise = nn.Parameter(torch.tensor(0.1)).to(device)\n log_var_noise.requires_grad = True\n m0.requires_grad = False\n s0.requires_grad = True\n\n\noptimizer = torch.optim.Adam([*list(mlp.parameters()), log_var_noise, m0, s0], lr=0.01)\nlosses = []\npbar = tqdm(range(500))\nfor i in pbar:\n optimizer.zero_grad()\n loss = neg_log_likelihood(x, y, m0, s0)\n loss.backward()\n optimizer.step()\n losses.append(loss.item())\n pbar.set_description(f\"loss: {loss.item():.4f}\")\n\nplt.plot(losses)\n\nloss: -29.9227: 100%|██████████| 500/500 [00:03<00:00, 156.90it/s]\n\n\n\n\n\n\n\n\n\n\nmn, sn = get_mn_sn(x, s0)\npred_mean, pred_var = get_pred_post(sn, mn, x)\n\nfig, ax = plt.subplots()\nax = plot_preds_and_95(ax, x, pred_mean, pred_var)\nwith torch.no_grad():\n ax.scatter(x.cpu().numpy()[:, 1], y.cpu().numpy())\n ax.vlines(mlp.transform.grid.cpu().numpy(), -1, 1, color=\"black\", alpha=0.2)\nplt.show()" }, { - "objectID": "posts/py_over_ipynb.html#increased-productivity", - "href": "posts/py_over_ipynb.html#increased-productivity", - "title": "Why .py files are better than .ipynb files for ML codebase", - "section": "Increased Productivity", - "text": "Increased Productivity\n\nYou can make use of fantastic auto-complete, syntax-highlighting extensions in VS code to save a lot of time while working with .py files." 
+ "objectID": "posts/bayesian-gaussian-basis-regression.html#just-gaussian-basis", + "href": "posts/bayesian-gaussian-basis-regression.html#just-gaussian-basis", + "title": "Bayesian Basis Regression", + "section": "Just Gaussian basis", + "text": "Just Gaussian basis\n\n# mlp = MLP(2, 1, [32, 32, 32], transform=RBF(0.1, 10)).to(device)\nmlp = RBF(1.0, 5).to(device)\ntransform_fn = mlp.forward\n\nm0 = torch.zeros((1,)).to(device)\ns0 = torch.tensor(1.0).to(device)\nwith torch.no_grad():\n log_var_noise = nn.Parameter(torch.tensor(0.1)).to(device)\n log_var_noise.requires_grad = True\n m0.requires_grad = False\n s0.requires_grad = True\n\n\noptimizer = torch.optim.Adam([*list(mlp.parameters()), log_var_noise, m0, s0], lr=0.001)\nlosses = []\npbar = tqdm(range(500))\nfor i in pbar:\n optimizer.zero_grad()\n loss = neg_log_likelihood(x, y, m0, s0)\n loss.backward()\n optimizer.step()\n losses.append(loss.item())\n pbar.set_description(f\"loss: {loss.item():.4f}\")\n\nplt.plot(losses)\n\nloss: 207.0843: 100%|██████████| 500/500 [00:02<00:00, 195.61it/s]\n\n\n\n\n\n\n\n\n\n\nmn, sn = get_mn_sn(x, s0)\npred_mean, pred_var = get_pred_post(sn, mn, x)\n\nfig, ax = plt.subplots()\nax = plot_preds_and_95(ax, x, pred_mean, pred_var)\nwith torch.no_grad():\n ax.scatter(x.cpu().numpy()[:, 1], y.cpu().numpy())\n ax.vlines(mlp.grid.cpu().numpy(), -1, 1, color=\"black\", alpha=0.2)\nplt.show()" }, { - "objectID": "posts/py_over_ipynb.html#boost-collaboration", - "href": "posts/py_over_ipynb.html#boost-collaboration", - "title": "Why .py files are better than .ipynb files for ML codebase", - "section": "Boost Collaboration", - "text": "Boost Collaboration\n\n.py do not take time to render on GitHub because they are just plain text files, unlike .ipynb files.\nIt is a lot easier to see the changes made by others in a .py file than a .ipynb file." 
+ "objectID": "posts/bayesian-gaussian-basis-regression.html#appendix", + "href": "posts/bayesian-gaussian-basis-regression.html#appendix", + "title": "Bayesian Basis Regression", + "section": "Appendix", + "text": "Appendix\n\nfrom sklearn.preprocessing import StandardScaler, MinMaxScaler\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.metrics import mean_squared_error\n\n\ndata = pd.read_csv(\"~/datasets/uci/bike/hour.csv\", header=None).iloc[:, 1:]\ndata.shape\n\n(17379, 18)\n\n\n\nX = data.iloc[:, :-1].values\ny = data.iloc[:, -1].values\nX_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.4, random_state=0)\nX_train.shape, X_test.shape, y_train.shape, y_test.shape\n\nx_scaler = MinMaxScaler()\ny_scaler = StandardScaler()\nX_train = x_scaler.fit_transform(X_train)\ny_train = y_scaler.fit_transform(y_train.reshape(-1, 1))\nX_test = x_scaler.transform(X_test)\ny_test = y_scaler.transform(y_test.reshape(-1, 1))\n\nX_train.shape, X_test.shape, y_train.shape, y_test.shape\n\n((10427, 17), (6952, 17), (10427, 1), (6952, 1))\n\n\n\n[X_train, X_test, y_train, y_test] = map(\n lambda x: torch.tensor(x, dtype=torch.float32).to(device),\n [X_train, X_test, y_train, y_test],\n)\n\n\nmlp = MLP(17, 1, [10, 10]).to(device)\n\noptimizer = torch.optim.Adam(mlp.parameters(), lr=0.01)\nlosses = []\npbar = tqdm(range(500))\nfor i in pbar:\n optimizer.zero_grad()\n loss = F.mse_loss(mlp(X_train), y_train)\n loss.backward()\n optimizer.step()\n losses.append(loss.item())\n pbar.set_description(f\"loss: {loss.item():.4f}\")\n\nplt.plot(losses)\n\nloss: 0.0040: 100%|██████████| 500/500 [00:01<00:00, 482.25it/s]\n\n\n\n\n\n\n\n\n\n\nwith torch.no_grad():\n y_pred = mlp(X_test).cpu().numpy()\n if isinstance(y_test, torch.Tensor):\n y_test = y_test.cpu().numpy()\n print(y_pred.shape, y_test.shape)\n print(\"RMSE\", mean_squared_error(y_test, y_pred, squared=False))\n\n(6952, 1) (6952, 1)\nRMSE 0.08354535" }, { - "objectID": 
"posts/py_over_ipynb.html#increased-modularity", - "href": "posts/py_over_ipynb.html#increased-modularity", - "title": "Why .py files are better than .ipynb files for ML codebase", - "section": "Increased Modularity", - "text": "Increased Modularity\n\nFunction and Class calls from other files are seamless with .py files.\n\nFeel free to comment your views/suggestions/additions in the comment box." + "objectID": "posts/2022-03-06-probabilistic-machine-learning.html", + "href": "posts/2022-03-06-probabilistic-machine-learning.html", + "title": "Probabilistic Machine Learning", + "section": "", + "text": "An inference problem requires statements about the value of an unobserved (latent) variable x based on observations y which are related to x, but may not be sufficient to fully determine x. This requires a notion of uncertainty.\n\nWe can define the following rules because \\(p(E) = 1\\) for any event \\(E\\).\n\nSum rule: \\(p(E) = p(E|A) + p(E|\\neg A)\\)\n\nProduct rule: \\(p(E, A) = p(E|A)p(A) = p(A|E)p(E)\\)\n\nBayes’ theorem: \\(p(E|A) = \\frac{p(A|E)p(E)}{p(A)}\\)" }, { - "objectID": "posts/2022-10-27-mogp.html", - "href": "posts/2022-10-27-mogp.html", - "title": "Multi-Output Gaussian Processes", + "objectID": "posts/2022-03-06-probabilistic-machine-learning.html#introduction", + "href": "posts/2022-03-06-probabilistic-machine-learning.html#introduction", + "title": "Probabilistic Machine Learning", "section": "", - "text": "Inspired from this GPSS video.\nimport jax\nfrom jax.flatten_util import ravel_pytree\nimport jax.numpy as jnp\n\nimport optax\n\nimport matplotlib.pyplot as plt\nfrom tinygp import kernels" + "text": "An inference problem requires statements about the value of an unobserved (latent) variable x based on observations y which are related to x, but may not be sufficient to fully determine x. 
This requires a notion of uncertainty.\n\nWe can define the following rules because \\(p(E) = 1\\) for any event \\(E\\).\n\nSum rule: \\(p(E) = p(E|A) + p(E|\\neg A)\\)\n\nProduct rule: \\(p(E, A) = p(E|A)p(A) = p(A|E)p(E)\\)\n\nBayes’ theorem: \\(p(E|A) = \\frac{p(A|E)p(E)}{p(A)}\\)" }, { - "objectID": "posts/2022-10-27-mogp.html#helper-functions", - "href": "posts/2022-10-27-mogp.html#helper-functions", - "title": "Multi-Output Gaussian Processes", - "section": "Helper functions", - "text": "Helper functions\n\ndef random_fill(key, params):\n values, unravel_fn = ravel_pytree(params)\n random_values = jax.random.normal(key, shape=values.shape)\n return unravel_fn(random_values)\n\ndef get_real_params(params):\n for i in range(1, q_len+1):\n params[f'a{i}'] = params[f'a{i}'].reshape(n_outputs, rank)\n if method == 'icm':\n params['var'] = jnp.exp(params['log_var'])\n params['scale'] = jnp.exp(params['log_scale'])\n params['noise'] = jnp.exp(params['log_noise'])\n elif method == 'lmc':\n for i in range(1, q_len+1):\n params[f'var{i}'] = jnp.exp(params[f'log_var{i}'])\n params[f'scale{i}'] = jnp.exp(params[f'log_scale{i}'])\n params[f'noise{i}'] = jnp.exp(params[f'log_noise{i}'])\n return params\n\ndef kron_cov_fn(params, x1, x2, add_noise=False):\n params = get_real_params(params)\n a_list = [params[f'a{i}'] for i in range(1, q_len+1)]\n\n if method == 'icm':\n kernel_fn = params['var'] * kernels.ExpSquared(scale=params['scale'])\n cov = kernel_fn(x1, x2)\n if add_noise:\n cov = cov + jnp.eye(cov.shape[0])*params['noise']\n\n B = jax.tree_util.tree_reduce(lambda x1, x2: x1@x1.T+x2@x2.T, a_list)\n# print(B.shape, cov.shape)\n return jnp.kron(B, cov)\n\n elif method == 'lmc':\n cov_list = []\n for idx in range(1, q_len+1):\n kernel_fn = params[f'var{idx}'] * kernels.ExpSquared(scale=params[f'scale{idx}'])\n cov = kernel_fn(x1, x2)\n if add_noise:\n cov = cov + jnp.eye(cov.shape[0])*params[f'noise{idx}']\n\n B = a_list[idx-1]@a_list[idx-1].T\n 
cov_list.append(jnp.kron(B, cov))\n \n return jax.tree_util.tree_reduce(lambda x1, x2: x1+x2, cov_list)" + "objectID": "posts/2022-01-24-query_by_committee.html", + "href": "posts/2022-01-24-query_by_committee.html", + "title": "Query by Committee", + "section": "", + "text": "# Common imports\nimport numpy as np\nimport matplotlib.pyplot as plt\nfrom matplotlib.animation import FuncAnimation\nfrom matplotlib import rc\n\nplt.style.use('fivethirtyeight')\nrc('animation', html='jshtml')\n\n# Copy the models\nfrom copy import deepcopy\n\n# Sklearn imports\nfrom sklearn.ensemble import RandomForestRegressor, RandomForestClassifier\nfrom sklearn.datasets import make_classification\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.metrics import accuracy_score, f1_score\n\n# Entropy function\nfrom scipy.stats import entropy\n\n# Progress helper\nfrom IPython.display import clear_output" }, { - "objectID": "posts/2022-10-27-mogp.html#configuration", - "href": "posts/2022-10-27-mogp.html#configuration", - "title": "Multi-Output Gaussian Processes", - "section": "Configuration", - "text": "Configuration\n\nq_len = 2\nrank = 2 # if 1, slfm\nn_outputs = 2\n\nmethod = 'lmc' # lmc, icm\n\nif rank = 1, lmc becomes slfm.\n\nGenerative process\n\nx_key = jax.random.PRNGKey(4)\n\nx = jax.random.uniform(x_key, shape=(40, 1)).sort(axis=0)\nx_test = jnp.linspace(0,1,100).reshape(-1, 1)\n\ne1_key, e2_key = jax.random.split(x_key)\n\ne1 = jax.random.normal(e1_key, shape=(x.shape[0],))\ne2 = jax.random.normal(e2_key, shape=(x.shape[0],))\n\nif method == 'icm':\n noise = 0.01\n gen_kernel = 1.2*kernels.ExpSquared(scale=0.2)\n gen_covariance = gen_kernel(x, x) + jnp.eye(x.shape[0])*noise\n gen_chol = jnp.linalg.cholesky(gen_covariance)\n \n y1 = gen_chol@e1\n y2 = gen_chol@e2\n\n y = jnp.concatenate([y1, y2])\n \nelif method == 'lmc':\n noise1 = 0.01\n noise2 = 0.1\n gen_kernel1 = 1.2*kernels.ExpSquared(scale=0.1)\n gen_covariance1 = gen_kernel1(x, x) + 
jnp.eye(x.shape[0])*noise1\n gen_chol1 = jnp.linalg.cholesky(gen_covariance1)\n\n gen_kernel2 = 0.8*kernels.ExpSquared(scale=0.2)\n gen_covariance2 = gen_kernel2(x, x) + jnp.eye(x.shape[0])*noise2\n gen_chol2 = jnp.linalg.cholesky(gen_covariance2)\n \n y1 = gen_chol1@e1\n y2 = gen_chol2@e2\n\n y = jnp.concatenate([y1, y2])\n \n\nplt.scatter(x, y1, label='y1')\nplt.scatter(x, y2, label='y2')\nplt.legend();\n\nWARNING:absl:No GPU/TPU found, falling back to CPU. (Set TF_CPP_MIN_LOG_LEVEL=0 and rerun for more info.)\n\n\n\n\n\n\n\n\n\n\ndef loss_fn(params):\n mo_cov = kron_cov_fn(params, x, x, add_noise=True)\n# print(y.shape, mo_cov.shape)\n return -jax.scipy.stats.multivariate_normal.logpdf(y, jnp.zeros_like(y), mo_cov)\n\n\nkey = jax.random.PRNGKey(1)\nif method == 'icm':\n params = {'log_var':0.0, 'log_scale':0.0, 'log_noise':0.0}\n for i in range(1, q_len+1):\n params[f'a{i}'] = jnp.zeros((n_outputs, rank))\nelif method == 'lmc':\n params = {}\n for i in range(1, q_len+1):\n params[f'a{i}'] = jnp.zeros((n_outputs, rank))\n params[f'log_var{i}'] = 0.0\n params[f'log_scale{i}'] = 0.0\n params[f'log_noise{i}'] = 0.0\n\nparams = random_fill(key, params)\nparams\n\n{'a1': DeviceArray([[-0.764527 , 1.0286916],\n [-1.0690447, -0.7921495]], dtype=float32),\n 'a2': DeviceArray([[ 0.8845895, -1.1941622],\n [-1.7434924, 1.5159688]], dtype=float32),\n 'log_noise1': DeviceArray(-1.1254696, dtype=float32),\n 'log_noise2': DeviceArray(-0.22446911, dtype=float32),\n 'log_scale1': DeviceArray(0.39719132, dtype=float32),\n 'log_scale2': DeviceArray(-0.22453257, dtype=float32),\n 'log_var1': DeviceArray(-0.7590596, dtype=float32),\n 'log_var2': DeviceArray(-0.08601531, dtype=float32)}\n\n\n\nloss_fn(params)\n\nDeviceArray(116.04026, dtype=float32)\n\n\n\nkey = jax.random.PRNGKey(3)\nparams = random_fill(key, params)\n\nn_iters = 1000\n\nvalue_and_grad_fn = jax.jit(jax.value_and_grad(loss_fn))\nopt = optax.adam(0.01)\nstate = opt.init(params)\n\ndef one_step(params_and_state, xs):\n 
params, state = params_and_state\n loss, grads = value_and_grad_fn(params)\n updates, state = opt.update(grads, state)\n params = optax.apply_updates(params, updates)\n return (params, state), (params, loss)\n\n(tuned_params, state), (params_history, loss_history) = jax.lax.scan(one_step, init=(params, state), xs=None, length=n_iters)\n\nplt.plot(loss_history);\n\n\n\n\n\n\n\n\n\ndef predict_fn(params, x_test):\n cov = kron_cov_fn(params, x, x, add_noise=True)\n test_cov = kron_cov_fn(params, x_test, x_test, add_noise=True)\n cross_cov = kron_cov_fn(params, x_test, x, add_noise=False)\n \n chol = jnp.linalg.cholesky(cov)\n k_inv_y = jax.scipy.linalg.cho_solve((chol, True), y)\n k_inv_cross_cov = jax.scipy.linalg.cho_solve((chol, True), cross_cov.T)\n\n pred_mean = cross_cov@k_inv_y\n pred_cov = test_cov - cross_cov@k_inv_cross_cov\n return pred_mean, pred_cov\n\n\npred_mean, pred_cov = predict_fn(tuned_params, x_test)\npred_conf = 2 * jnp.diag(pred_cov)**0.5\n\nplt.scatter(x, y1, label='y1')\nplt.scatter(x, y2, label='y2')\nplt.plot(x_test, pred_mean[:x_test.shape[0]], label='pred_y1')\nplt.plot(x_test, pred_mean[x_test.shape[0]:], label='pred_y2')\nplt.fill_between(x_test.ravel(), pred_mean[:x_test.shape[0]] - pred_conf[:x_test.shape[0]], pred_mean[:x_test.shape[0]] + pred_conf[:x_test.shape[0]], label='pred_conf_y1', alpha=0.3)\nplt.fill_between(x_test.ravel(), pred_mean[x_test.shape[0]:] - pred_conf[x_test.shape[0]:], pred_mean[x_test.shape[0]:] + pred_conf[x_test.shape[0]:], label='pred_conf_y2', alpha=0.3)\nplt.legend(bbox_to_anchor=(1,1));\n\n\n\n\n\n\n\n\n\nfor name, value in get_real_params(tuned_params).items():\n if not name.startswith('log_'):\n print(name, value)\n\na1 [[0.03664799 0.00039898]\n [0.3191718 0.00344488]]\na2 [[ 0.1351072 0.00248941]\n [-0.05392759 -0.04239884]]\nnoise1 0.6797133\nnoise2 0.4154678\nscale1 5.048228\nscale2 0.10743636\nvar1 0.016275918\nvar2 41.034225" + "objectID": 
"posts/2022-01-24-query_by_committee.html#qbc-by-posterior-sampling", + "href": "posts/2022-01-24-query_by_committee.html#qbc-by-posterior-sampling", + "title": "Query by Committee", + "section": "QBC by posterior sampling", + "text": "QBC by posterior sampling\n\nInteresting fact: For probabilistic models, QBC is similar to uncertainty sampling. How?\n\nDraw \\(k\\) parameter sets from the posterior distribution representing \\(k\\) different models.\nQuery a point which shows maximum disagreement among the points." }, { - "objectID": "posts/2022-03-05-uncertainty-in-deep-learning.html", - "href": "posts/2022-03-05-uncertainty-in-deep-learning.html", - "title": "Uncertainty in Deep Learning", - "section": "", - "text": "import torch" + "objectID": "posts/2022-01-24-query_by_committee.html#an-example-bayesian-linear-regression", + "href": "posts/2022-01-24-query_by_committee.html#an-example-bayesian-linear-regression", + "title": "Query by Committee", + "section": "An example: Bayesian linear regression", + "text": "An example: Bayesian linear regression\n\nnp.random.seed(0)\nN = 10\nX = np.linspace(-1,1,N).reshape(-1,1)\n\nt0 = 3\nt1 = 2\n\ny = X * t1 + t0 + np.random.rand(N,1)\n\nplt.scatter(X, y);\n\n\n\n\n\n\n\n\n\nAssume a posterior\n\nn_samples = 50\n\nt0_dist_samples = np.random.normal(t0, 0.1, size=n_samples)\nt1_dist_samples = np.random.normal(t1, 1, size=n_samples)\n\n\n\nPlot the models\n\nplt.scatter(X, y)\n\nfor i in range(len(t0_dist_samples)):\n sample_t0 = t0_dist_samples[i]\n sample_t1 = t1_dist_samples[i]\n \n plt.plot(X, X * sample_t1 + sample_t0,alpha=0.1)" }, { - "objectID": "posts/2022-03-05-uncertainty-in-deep-learning.html#introduction", - "href": "posts/2022-03-05-uncertainty-in-deep-learning.html#introduction", - "title": "Uncertainty in Deep Learning", - "section": "1 - Introduction", - "text": "1 - Introduction\n\nAn online deep learning book from Ian Goodfellow, Yoshua Bengio, and Aaron Courville.\n\n\n1.1 - Deep Learning\nWe define a 
single layer network as the following:\n\nclass SingleLayerNetwork(torch.nn.Module):\n def __init__(self, Q, D, K):\n \"\"\"\n Q: number of features\n D: number of outputs\n K: number of hidden features\n \"\"\"\n super().__init__()\n self.input = torch.nn.Linear(Q, K) # Transforms Q features into K hidden features\n self.output = torch.nn.Linear(K, D) # Transforms K hidden features to D output features\n self.non_lin_transform = torch.nn.ReLU() # A non-linear transformation\n \n def forward(self, X):\n \"\"\"\n X: input (N x Q)\n \"\"\"\n self.linear_transformed_X = self.input(X) # (N, Q) -> (N, K)\n self.non_lin_transformed_X = self.non_lin_transform(linear_transformed_X) # (N, K) -> (N, K)\n output = self.output(self.non_lin_transformed_X) # (N, K) -> (N, D)\n return output\n\n\nQ = 10 # Number of features\nN = 100 # Number of samples\nD = 15 # Number of outputs\nK = 32 # Number of hidden features\n\nX = torch.rand(N, Q) # Input\nY = torch.rand(N, D) # Output\n\n\nmodel = SingleLayerNetwork(Q=Q, D=D, K=K)\nmodel\n\nSingleLayerNetwork(\n (input): Linear(in_features=10, out_features=32, bias=True)\n (output): Linear(in_features=32, out_features=15, bias=True)\n (non_lin_transform): ReLU()\n)\n\n\n\nfor name, value in model.named_parameters():\n print(name, value.shape)\n\ninput.weight torch.Size([32, 10])\ninput.bias torch.Size([32])\noutput.weight torch.Size([15, 32])\noutput.bias torch.Size([15])\n\n\nReLU is does not contain any parameters here so it is merely a function.\n\n\n1.2 Model Uncertainty\nIn which cases we want our model to be uncertain?\n\nWhen it encounters a out-of-the-distribution data\nWhen training data is noisy (irreducible/aleatoric uncertainty)\nWhen we have multiple predictors (model/epistemic uncertainty)" + "objectID": "posts/2022-01-24-query_by_committee.html#qbc-by-bootstrapping", + "href": "posts/2022-01-24-query_by_committee.html#qbc-by-bootstrapping", + "title": "Query by Committee", + "section": "QBC by bootstrapping", + "text": 
"QBC by bootstrapping\n\n2 class dataset\n\nX, y = make_classification(n_samples=1000, n_features=2, n_informative=2, n_redundant=0, random_state=3, shuffle=True)\n\nplt.figure()\nplt.scatter(X[:,0], X[:,1], c=y);\n\n\n\n\n\n\n\n\n\n\nFull data fit with RF\n\nmodel = RandomForestClassifier(random_state=0)\nmodel.fit(X, y);\n\nRandomForestClassifier(random_state=0)\n\n\n\n\nVisualize decision boundary\n\ngrid_X1, grid_X2 = np.meshgrid(np.linspace(X[:,0].min()-0.1, X[:,0].max()+0.1, 100), \n np.linspace(X[:,1].min()-0.1, X[:,1].max()+0.1, 100))\n\ngrid_X = [(x1, x2) for x1, x2 in zip(grid_X1.ravel(), grid_X2.ravel())]\n\ngrid_pred = model.predict(grid_X)\n\nplt.figure(figsize=(6,5))\nplt.scatter(X[:,0], X[:,1], c=y);\nplt.contourf(grid_X1, grid_X2, grid_pred.reshape(*grid_X1.shape), alpha=0.2);\n\n\n\n\n\n\n\n\n\n\nTrain, pool, test split\n\nX_train_pool, X_test, y_train_pool, y_test = train_test_split(X, y, test_size=0.2, random_state=0, stratify=y)\nX_train, X_pool, y_train, y_pool = train_test_split(X_train_pool, y_train_pool, train_size=20, random_state=0)\n\nX_list = [X_train, X_pool, X_test]\ny_list = [y_train, y_pool, y_test]\nt_list = ['Train', 'Pool', 'Test']\n\nfig, ax = plt.subplots(1,3,figsize=(15,4), sharex=True, sharey=True)\nfor i in range(3):\n ax[i].scatter(X_list[i][:,0], X_list[i][:,1], c=y_list[i])\n ax[i].set_title(t_list[i])\n \n\n\n\n\n\n\n\n\n\n\nFitting a model on initial train data\n\nAL_model = RandomForestClassifier(n_jobs=28, random_state=0)\n\nAL_model.fit(X_train, y_train);\n\nRandomForestClassifier(n_jobs=28, random_state=0)\n\n\n\n\nGet the votes from trees on pool dataset\n\nvotes = np.zeros(shape=(X_pool.shape[0], len(AL_model.estimators_)))\n\nfor learner_idx, learner in enumerate(AL_model.estimators_):\n votes[:, learner_idx] = learner.predict(X_pool)\n\n\nvotes.shape\n\n(780, 100)\n\n\n\nvotes\n\narray([[0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [1., 1., 1., ..., 0., 1., 1.],\n ...,\n [0., 0., 0., ..., 0., 
0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.]])\n\n\n\nConvert to probabilities\n\np_vote = np.zeros(shape=(X_pool.shape[0], X_pool.shape[1]))\n\nfor vote_idx, vote in enumerate(votes):\n vote_counter = {0 : (1-vote).sum(), 1 : vote.sum()}\n\n for class_idx, class_label in enumerate(range(X.shape[1])):\n p_vote[vote_idx, class_idx] = vote_counter[class_label]/len(AL_model.estimators_)\n\n\np_vote\n\narray([[1. , 0. ],\n [0.89, 0.11],\n [0.06, 0.94],\n ...,\n [0.93, 0.07],\n [1. , 0. ],\n [1. , 0. ]])\n\n\n\nCalculate dissimilarity (entropy)\n\nexample_id = 2\n\n\nans = 0\nfor category in range(X_pool.shape[1]):\n ans += (-p_vote[example_id][category] * np.log(p_vote[example_id][category]))\n\nans\n\n0.22696752250060448\n\n\nentr = entropy(p_vote, axis=1)\n\n\nentr[example_id]\n\n0.22696752250060448\n\n\n\n\nActive Learning Flow\n\ndef get_query_idx():\n # Gather the votes\n votes = np.zeros(shape=(X_pool.shape[0], len(AL_model.estimators_)))\n for learner_idx, learner in enumerate(AL_model.estimators_):\n votes[:, learner_idx] = learner.predict(X_pool)\n \n # Calculate probability of votes\n p_vote = np.zeros(shape=(X_pool.shape[0], X_pool.shape[1]))\n for vote_idx, vote in enumerate(votes):\n vote_counter = {0 : (1-vote).sum(), \n 1 : vote.sum()}\n\n for class_idx, class_label in enumerate(range(X.shape[1])):\n p_vote[vote_idx, class_idx] = vote_counter[class_label]/len(AL_model.estimators_)\n \n # Calculate entropy for each example\n entr = entropy(p_vote, axis=1)\n \n # Choose example with highest entropy (disagreement)\n return entr.argmax()\n\n\n\nPrepare data for random sampling\n\nX_train_rand = X_train.copy()\ny_train_rand = y_train.copy()\nX_pool_rand = X_pool.copy()\ny_pool_rand = y_pool.copy()\n\nrandom_model = RandomForestClassifier(n_jobs=28, random_state=0)\n\n\n\nRun active learning\n\nAL_iters = 100\nnp.random.seed(0)\n\nAL_inds = []\nAL_models = []\nrandom_inds = []\nrandom_models = []\n\nfor iteration in 
range(AL_iters):\n clear_output(wait=True)\n print(\"iteration\", iteration)\n ######## Active Learning ############\n # Fit the model\n AL_model.fit(X_train, y_train)\n AL_models.append(deepcopy(AL_model))\n \n # Query a point\n query_idx = get_query_idx()\n AL_inds.append(query_idx)\n \n # Add it to the train data\n X_train = np.concatenate([X_train, X_pool[query_idx:query_idx+1, :]], axis=0)\n y_train = np.concatenate([y_train, y_pool[query_idx:query_idx+1]], axis=0)\n \n # Remove it from the pool data\n X_pool = np.delete(X_pool, query_idx, axis=0)\n y_pool = np.delete(y_pool, query_idx, axis=0)\n \n ######## Random Sampling ############\n # Fit the model\n random_model.fit(X_train_rand, y_train_rand)\n random_models.append(deepcopy(random_model))\n \n # Query a point\n query_idx = np.random.choice(len(X_pool))\n random_inds.append(query_idx)\n # Add it to the train data\n X_train_rand = np.concatenate([X_train_rand, X_pool_rand[query_idx:query_idx+1, :]], axis=0)\n y_train_rand = np.concatenate([y_train_rand, y_pool_rand[query_idx:query_idx+1]], axis=0)\n \n # Remove it from the pool data\n X_pool_rand = np.delete(X_pool_rand, query_idx, axis=0)\n y_pool_rand = np.delete(y_pool_rand, query_idx, axis=0)\n\niteration 99\n\n\n\n\nPlot accuracy\n\nrandom_scores = []\nAL_scores = []\nfor iteration in range(AL_iters):\n clear_output(wait=True)\n print(\"iteration\", iteration)\n AL_scores.append(accuracy_score(y_test, AL_models[iteration].predict(X_test)))\n random_scores.append(accuracy_score(y_test, random_models[iteration].predict(X_test)))\n \nplt.plot(AL_scores, label='Active Learning');\nplt.plot(random_scores, label='Random Sampling');\nplt.legend();\nplt.xlabel('Iterations');\nplt.ylabel('Accuracy\\n(Higher is better)');\n\niteration 99\n\n\n\n\n\n\n\n\n\n\n\nPlot decision boundary\n\ndef update(i):\n for each in ax:\n each.cla()\n \n AL_grid_preds = AL_models[i].predict(grid_X)\n random_grid_preds = random_models[i].predict(grid_X)\n \n # Active learning\n 
ax[0].scatter(X_train[:n_train,0], X_train[:n_train,1], c=y_train[:n_train], label='initial_train', alpha=0.2)\n ax[0].scatter(X_train[n_train:n_train+i, 0], X_train[n_train:n_train+i, 1], \n c=y_train[n_train:n_train+i], label='new_points')\n ax[0].contourf(grid_X1, grid_X2, AL_grid_preds.reshape(*grid_X1.shape), alpha=0.2);\n ax[0].set_title('New points')\n \n ax[1].scatter(X_test[:, 0], X_test[:, 1], c=y_test, label='test_set')\n ax[1].contourf(grid_X1, grid_X2, AL_grid_preds.reshape(*grid_X1.shape), alpha=0.2);\n ax[1].set_title('Test points');\n ax[0].text(locs[0],locs[1],'Active Learning')\n \n # Random sampling\n ax[2].scatter(X_train_rand[:n_train,0], X_train_rand[:n_train,1], c=y_train_rand[:n_train], label='initial_train', alpha=0.2)\n ax[2].scatter(X_train_rand[n_train:n_train+i, 0], X_train_rand[n_train:n_train+i, 1], \n c=y_train_rand[n_train:n_train+i], label='new_points')\n ax[2].contourf(grid_X1, grid_X2, random_grid_preds.reshape(*grid_X1.shape), alpha=0.2);\n ax[2].set_title('New points')\n \n ax[3].scatter(X_test[:, 0], X_test[:, 1], c=y_test, label='test_set')\n ax[3].contourf(grid_X1, grid_X2, random_grid_preds.reshape(*grid_X1.shape), alpha=0.2);\n ax[3].set_title('Test points');\n ax[2].text(locs[0],locs[1],'Random Sampling');\n\n\nlocs = (2.7, 4)\nfig, ax = plt.subplots(2,2,figsize=(12,6), sharex=True, sharey=True)\nax = ax.ravel()\nn_train = X_train.shape[0]-AL_iters\n\nanim = FuncAnimation(fig, func=update, frames=range(100))\nplt.close()\nanim\n\n\n\n\n\n\n\n\n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n Once\n \n Loop\n \n Reflect" }, { - "objectID": "posts/seq_to_seq.html", - "href": "posts/seq_to_seq.html", - "title": "Seq to Seq", + "objectID": "posts/ssh-macos.html", + "href": "posts/ssh-macos.html", + "title": "Passwordless SSH setup for MacOS Hosts", "section": "", - "text": "Model\nMain Disadvantage\nSolved by\nHow?\n\n\n\n\nNN\nCan’t handle dynamic length input\nRNN\nRNN can handle dynamic length 
input\n\n\nRNN\nVanishing Gradient Problem\nLSTM\nLSTM can handle vanishing gradient problem\n\n\nLSTM\nNot parallelizable\nTransformer\nTransformer can parallelize the computation\n\n\nTransformer\nLoses sequentiality\nTransformer\nPositional Encoding"
  },
  {
    "objectID": "posts/ssh-macos.html",
    "href": "posts/ssh-macos.html",
    "title": "Passwordless SSH setup for MacOS Hosts",
    "section": "",
    "text": "HOST: The computer physically present with you.\nREMOTE: The remote computer that you’d like to access via ssh.\nREMOTE-IP: IP address of the REMOTE.\nPORT: The port on which the ssh server is running on REMOTE.\nssh-keygen # this will generate a public and private key pair. Rename it if you want.\nssh-copy-id -i ~/.ssh/id_rsa.pub -p PORT USERNAME@REMOTE-IP # this will copy the public key to REMOTE\nssh-add ~/.ssh/id_rsa # this command tells the HOST to use the private key for ssh connections\nThat’s it! You should now be able to ssh into the REMOTE without a password. After rebooting the HOST, if VSCode or CLI asks for a password, run ssh-add ~/.ssh/id_rsa again."
  },
  {
    "objectID": "posts/ssh-macos.html#terminology",
    "href": "posts/ssh-macos.html#terminology",
    "title": "Passwordless SSH setup for MacOS Hosts",
    "section": "",
    "text": "HOST: The computer physically present with you.\nREMOTE: The remote computer that you’d like to access via ssh.\nREMOTE-IP: IP address of the REMOTE.\nPORT: The port on which the ssh server is running on REMOTE.\nssh-keygen # this will generate a public and private key pair. Rename it if you want.\nssh-copy-id -i ~/.ssh/id_rsa.pub -p PORT USERNAME@REMOTE-IP # this will copy the public key to REMOTE\nssh-add ~/.ssh/id_rsa # this command tells the HOST to use the private key for ssh connections\nThat’s it! 
You should now be able to ssh into the REMOTE without a password. After rebooting the HOST, if VSCode or CLI asks for a password, run ssh-add ~/.ssh/id_rsa again." }, { - "objectID": "posts/wrf-tutorial.html#what-is-what", - "href": "posts/wrf-tutorial.html#what-is-what", - "title": "WRF Tutorial", - "section": "", - "text": "Acronym\nFull Form\n\n\n\n\nWRF\nWeather Research and Forecasting model\n\n\nWPS\nWRF Preprocessing System\n\n\nWRF-ARW\nAdvanced Research WRF\n\n\nWRF-Hydro\nWRF Hydrological model\n\n\nWRF-Chem\nWRF Chemical model\n\n\nWRFDA\nWRF Data Assimilation" + "objectID": "posts/2022-10-31-stochastic-variational-gp.html", + "href": "posts/2022-10-31-stochastic-variational-gp.html", + "title": "Stochastic Variational Gaussian processes in JAX", + "section": "", + "text": "I recently read a compact and clean explanation of SVGP in the following blog post by Dr. Martin Ingram:\nNow, I am attempting to implement a practical code from scratch for the same (What is practical about it? Sometimes math does not simply translate to code without careful modifications). I am assuming that you have read the blog post cited above before moving further. Let’s go for coding!" }, { - "objectID": "posts/wrf-tutorial.html#explanation-in-simple-words", - "href": "posts/wrf-tutorial.html#explanation-in-simple-words", - "title": "WRF Tutorial", - "section": "Explanation in simple words", - "text": "Explanation in simple words\nWill add." 
+ "objectID": "posts/2022-10-31-stochastic-variational-gp.html#imports", + "href": "posts/2022-10-31-stochastic-variational-gp.html#imports", + "title": "Stochastic Variational Gaussian processes in JAX", + "section": "Imports", + "text": "Imports\n\n# JAX\nimport jax\nfrom jax.flatten_util import ravel_pytree\nimport jax.numpy as jnp\nimport jax.scipy as jsp\n\n# Partially initialize functions\nfrom functools import partial\n\n# TFP\nimport tensorflow_probability.substrates.jax as tfp\ntfd = tfp.distributions\ntfb = tfp.bijectors\n\n# GP Kernels\nfrom tinygp import kernels\n\n# sklearn\nfrom sklearn.datasets import make_moons, make_blobs, make_circles\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.preprocessing import StandardScaler\nfrom sklearn.metrics import accuracy_score\n\n# Optimization\nimport optax\n\n# Plotting\nimport matplotlib.pyplot as plt\nplt.rcParams['scatter.edgecolors'] = \"k\"\n\n# Progress bar\nfrom tqdm import tqdm\n\n# Jitter\nJITTER = 1e-6\n\n# Enable JAX 64bit\njax.config.update(\"jax_enable_x64\", True)" }, { - "objectID": "posts/wrf-tutorial.html#system-information", - "href": "posts/wrf-tutorial.html#system-information", - "title": "WRF Tutorial", - "section": "System information", - "text": "System information\n\n!lsb_release -a\n\nNo LSB modules are available.\nDistributor ID: Ubuntu\nDescription: Ubuntu 20.04.6 LTS\nRelease: 20.04\nCodename: focal\n\n\n\n# processor info\n!cat /proc/cpuinfo\n\nprocessor : 0\nvendor_id : AuthenticAMD\ncpu family : 23\nmodel : 49\nmodel name : AMD EPYC 7452 32-Core Processor\nstepping : 0\nmicrocode : 0x830107a\ncpu MHz : 2001.451\ncache size : 512 KB\nphysical id : 0\nsiblings : 32\ncore id : 0\ncpu cores : 32\napicid : 0\ninitial apicid : 0\nfpu : yes\nfpu_exception : yes\ncpuid level : 16\nwp : yes\nflags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl 
nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es\nbugs : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso\nbogomips : 4699.82\nTLB size : 3072 4K pages\nclflush size : 64\ncache_alignment : 64\naddress sizes : 43 bits physical, 48 bits virtual\npower management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]\n\n[... entries for processors 1-19 omitted: each core reports the same vendor, model name, feature flags, and cache configuration as processor 0 ...]\n\nprocessor : 20\nvendor_id : AuthenticAMD\ncpu family : 23\nmodel : 49\nmodel name : AMD EPYC 7452 32-Core Processor\nstepping : 0\nmicrocode : 0x830107a\ncpu MHz : 1500.000\ncache size : 512 KB\nphysical id : 0\nsiblings : 32\ncore id : 20\ncpu cores : 32\napicid : 20\ninitial apicid : 20\nfpu : yes\nfpu_exception : yes\ncpuid level : 16\nwp : yes\nflags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc
cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es\nbugs : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso\nbogomips : 4699.82\nTLB size : 3072 4K pages\nclflush size : 64\ncache_alignment : 64\naddress sizes : 43 bits physical, 48 bits virtual\npower management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]\n\nprocessor : 21\nvendor_id : AuthenticAMD\ncpu family : 23\nmodel : 49\nmodel name : AMD EPYC 7452 32-Core Processor\nstepping : 0\nmicrocode : 0x830107a\ncpu MHz : 1500.000\ncache size : 512 KB\nphysical id : 0\nsiblings : 32\ncore id : 21\ncpu cores : 32\napicid : 21\ninitial apicid : 21\nfpu : yes\nfpu_exception : yes\ncpuid level : 16\nwp : yes\nflags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es\nbugs : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso\nbogomips : 4699.82\nTLB size : 3072 4K 
pages\nclflush size : 64\ncache_alignment : 64\naddress sizes : 43 bits physical, 48 bits virtual\npower management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]\n\nprocessor : 22\nvendor_id : AuthenticAMD\ncpu family : 23\nmodel : 49\nmodel name : AMD EPYC 7452 32-Core Processor\nstepping : 0\nmicrocode : 0x830107a\ncpu MHz : 1500.000\ncache size : 512 KB\nphysical id : 0\nsiblings : 32\ncore id : 22\ncpu cores : 32\napicid : 22\ninitial apicid : 22\nfpu : yes\nfpu_exception : yes\ncpuid level : 16\nwp : yes\nflags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es\nbugs : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso\nbogomips : 4699.82\nTLB size : 3072 4K pages\nclflush size : 64\ncache_alignment : 64\naddress sizes : 43 bits physical, 48 bits virtual\npower management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]\n\nprocessor : 23\nvendor_id : AuthenticAMD\ncpu family : 23\nmodel : 49\nmodel name : AMD EPYC 7452 32-Core Processor\nstepping : 0\nmicrocode : 0x830107a\ncpu MHz : 1500.000\ncache size : 512 KB\nphysical id : 0\nsiblings : 32\ncore id : 23\ncpu cores : 
32\napicid : 23\ninitial apicid : 23\nfpu : yes\nfpu_exception : yes\ncpuid level : 16\nwp : yes\nflags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es\nbugs : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso\nbogomips : 4699.82\nTLB size : 3072 4K pages\nclflush size : 64\ncache_alignment : 64\naddress sizes : 43 bits physical, 48 bits virtual\npower management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]\n\nprocessor : 24\nvendor_id : AuthenticAMD\ncpu family : 23\nmodel : 49\nmodel name : AMD EPYC 7452 32-Core Processor\nstepping : 0\nmicrocode : 0x830107a\ncpu MHz : 1500.000\ncache size : 512 KB\nphysical id : 0\nsiblings : 32\ncore id : 24\ncpu cores : 32\napicid : 24\ninitial apicid : 24\nfpu : yes\nfpu_exception : yes\ncpuid level : 16\nwp : yes\nflags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm 
cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es\nbugs : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso\nbogomips : 4699.82\nTLB size : 3072 4K pages\nclflush size : 64\ncache_alignment : 64\naddress sizes : 43 bits physical, 48 bits virtual\npower management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]\n\nprocessor : 25\nvendor_id : AuthenticAMD\ncpu family : 23\nmodel : 49\nmodel name : AMD EPYC 7452 32-Core Processor\nstepping : 0\nmicrocode : 0x830107a\ncpu MHz : 1500.000\ncache size : 512 KB\nphysical id : 0\nsiblings : 32\ncore id : 25\ncpu cores : 32\napicid : 25\ninitial apicid : 25\nfpu : yes\nfpu_exception : yes\ncpuid level : 16\nwp : yes\nflags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt 
lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es\nbugs : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso\nbogomips : 4699.82\nTLB size : 3072 4K pages\nclflush size : 64\ncache_alignment : 64\naddress sizes : 43 bits physical, 48 bits virtual\npower management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]\n\nprocessor : 26\nvendor_id : AuthenticAMD\ncpu family : 23\nmodel : 49\nmodel name : AMD EPYC 7452 32-Core Processor\nstepping : 0\nmicrocode : 0x830107a\ncpu MHz : 1500.000\ncache size : 512 KB\nphysical id : 0\nsiblings : 32\ncore id : 26\ncpu cores : 32\napicid : 26\ninitial apicid : 26\nfpu : yes\nfpu_exception : yes\ncpuid level : 16\nwp : yes\nflags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es\nbugs : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso\nbogomips : 4699.82\nTLB size : 3072 4K pages\nclflush size : 64\ncache_alignment : 64\naddress sizes : 43 bits physical, 48 bits virtual\npower 
management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]\n\nprocessor : 27\nvendor_id : AuthenticAMD\ncpu family : 23\nmodel : 49\nmodel name : AMD EPYC 7452 32-Core Processor\nstepping : 0\nmicrocode : 0x830107a\ncpu MHz : 1500.000\ncache size : 512 KB\nphysical id : 0\nsiblings : 32\ncore id : 27\ncpu cores : 32\napicid : 27\ninitial apicid : 27\nfpu : yes\nfpu_exception : yes\ncpuid level : 16\nwp : yes\nflags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es\nbugs : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso\nbogomips : 4699.82\nTLB size : 3072 4K pages\nclflush size : 64\ncache_alignment : 64\naddress sizes : 43 bits physical, 48 bits virtual\npower management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]\n\nprocessor : 28\nvendor_id : AuthenticAMD\ncpu family : 23\nmodel : 49\nmodel name : AMD EPYC 7452 32-Core Processor\nstepping : 0\nmicrocode : 0x830107a\ncpu MHz : 1500.000\ncache size : 512 KB\nphysical id : 0\nsiblings : 32\ncore id : 28\ncpu cores : 32\napicid : 28\ninitial apicid : 28\nfpu : yes\nfpu_exception : yes\ncpuid level : 16\nwp : yes\nflags : fpu 
vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es\nbugs : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso\nbogomips : 4699.82\nTLB size : 3072 4K pages\nclflush size : 64\ncache_alignment : 64\naddress sizes : 43 bits physical, 48 bits virtual\npower management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]\n\nprocessor : 29\nvendor_id : AuthenticAMD\ncpu family : 23\nmodel : 49\nmodel name : AMD EPYC 7452 32-Core Processor\nstepping : 0\nmicrocode : 0x830107a\ncpu MHz : 1500.000\ncache size : 512 KB\nphysical id : 0\nsiblings : 32\ncore id : 29\ncpu cores : 32\napicid : 29\ninitial apicid : 29\nfpu : yes\nfpu_exception : yes\ncpuid level : 16\nwp : yes\nflags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext 
perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es\nbugs : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso\nbogomips : 4699.82\nTLB size : 3072 4K pages\nclflush size : 64\ncache_alignment : 64\naddress sizes : 43 bits physical, 48 bits virtual\npower management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]\n\nprocessor : 30\nvendor_id : AuthenticAMD\ncpu family : 23\nmodel : 49\nmodel name : AMD EPYC 7452 32-Core Processor\nstepping : 0\nmicrocode : 0x830107a\ncpu MHz : 1500.000\ncache size : 512 KB\nphysical id : 0\nsiblings : 32\ncore id : 30\ncpu cores : 32\napicid : 30\ninitial apicid : 30\nfpu : yes\nfpu_exception : yes\ncpuid level : 16\nwp : yes\nflags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic 
v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es\nbugs : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso\nbogomips : 4699.82\nTLB size : 3072 4K pages\nclflush size : 64\ncache_alignment : 64\naddress sizes : 43 bits physical, 48 bits virtual\npower management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]\n\nprocessor : 31\nvendor_id : AuthenticAMD\ncpu family : 23\nmodel : 49\nmodel name : AMD EPYC 7452 32-Core Processor\nstepping : 0\nmicrocode : 0x830107a\ncpu MHz : 1500.000\ncache size : 512 KB\nphysical id : 0\nsiblings : 32\ncore id : 31\ncpu cores : 32\napicid : 31\ninitial apicid : 31\nfpu : yes\nfpu_exception : yes\ncpuid level : 16\nwp : yes\nflags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es\nbugs : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso\nbogomips : 4699.82\nTLB size : 3072 4K pages\nclflush size : 64\ncache_alignment : 64\naddress sizes : 43 bits physical, 48 bits virtual\npower management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]\n\nprocessor : 32\nvendor_id : AuthenticAMD\ncpu 
family : 23\nmodel : 49\nmodel name : AMD EPYC 7452 32-Core Processor\nstepping : 0\nmicrocode : 0x830107a\ncpu MHz : 1500.000\ncache size : 512 KB\nphysical id : 1\nsiblings : 32\ncore id : 0\ncpu cores : 32\napicid : 64\ninitial apicid : 64\nfpu : yes\nfpu_exception : yes\ncpuid level : 16\nwp : yes\nflags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es\nbugs : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso\nbogomips : 4673.53\nTLB size : 3072 4K pages\nclflush size : 64\ncache_alignment : 64\naddress sizes : 43 bits physical, 48 bits virtual\npower management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]\n\nprocessor : 33\nvendor_id : AuthenticAMD\ncpu family : 23\nmodel : 49\nmodel name : AMD EPYC 7452 32-Core Processor\nstepping : 0\nmicrocode : 0x830107a\ncpu MHz : 1500.000\ncache size : 512 KB\nphysical id : 1\nsiblings : 32\ncore id : 1\ncpu cores : 32\napicid : 65\ninitial apicid : 65\nfpu : yes\nfpu_exception : yes\ncpuid level : 16\nwp : yes\nflags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx 
mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es\nbugs : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso\nbogomips : 4673.53\nTLB size : 3072 4K pages\nclflush size : 64\ncache_alignment : 64\naddress sizes : 43 bits physical, 48 bits virtual\npower management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]\n\nprocessor : 34\nvendor_id : AuthenticAMD\ncpu family : 23\nmodel : 49\nmodel name : AMD EPYC 7452 32-Core Processor\nstepping : 0\nmicrocode : 0x830107a\ncpu MHz : 1500.000\ncache size : 512 KB\nphysical id : 1\nsiblings : 32\ncore id : 2\ncpu cores : 32\napicid : 66\ninitial apicid : 66\nfpu : yes\nfpu_exception : yes\ncpuid level : 16\nwp : yes\nflags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase 
bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es\nbugs : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso\nbogomips : 4673.53\nTLB size : 3072 4K pages\nclflush size : 64\ncache_alignment : 64\naddress sizes : 43 bits physical, 48 bits virtual\npower management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]\n\nprocessor : 35\nvendor_id : AuthenticAMD\ncpu family : 23\nmodel : 49\nmodel name : AMD EPYC 7452 32-Core Processor\nstepping : 0\nmicrocode : 0x830107a\ncpu MHz : 1500.000\ncache size : 512 KB\nphysical id : 1\nsiblings : 32\ncore id : 3\ncpu cores : 32\napicid : 67\ninitial apicid : 67\nfpu : yes\nfpu_exception : yes\ncpuid level : 16\nwp : yes\nflags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es\nbugs : sysret_ss_attrs spectre_v1 
spectre_v2 spec_store_bypass retbleed smt_rsb srso\nbogomips : 4673.53\nTLB size : 3072 4K pages\nclflush size : 64\ncache_alignment : 64\naddress sizes : 43 bits physical, 48 bits virtual\npower management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]\n\nprocessor : 36\nvendor_id : AuthenticAMD\ncpu family : 23\nmodel : 49\nmodel name : AMD EPYC 7452 32-Core Processor\nstepping : 0\nmicrocode : 0x830107a\ncpu MHz : 1500.000\ncache size : 512 KB\nphysical id : 1\nsiblings : 32\ncore id : 4\ncpu cores : 32\napicid : 68\ninitial apicid : 68\nfpu : yes\nfpu_exception : yes\ncpuid level : 16\nwp : yes\nflags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es\nbugs : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso\nbogomips : 4673.53\nTLB size : 3072 4K pages\nclflush size : 64\ncache_alignment : 64\naddress sizes : 43 bits physical, 48 bits virtual\npower management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]\n\nprocessor : 37\nvendor_id : AuthenticAMD\ncpu family : 23\nmodel : 49\nmodel name : AMD EPYC 7452 32-Core Processor\nstepping : 0\nmicrocode : 0x830107a\ncpu MHz : 
1500.000\n\n[... output truncated: 25 further identical AMD EPYC 7452 32-Core Processor entries (processors 38-62) omitted ...]\n\nprocessor : 63\nvendor_id : AuthenticAMD\ncpu family : 23\nmodel : 49\nmodel name : AMD EPYC 7452 32-Core Processor\nstepping : 0\nmicrocode : 0x830107a\ncpu MHz : 1500.000\ncache size : 512 KB\nphysical id : 1\nsiblings : 32\ncore id : 31\ncpu cores : 32\napicid : 95\ninitial apicid : 95\nfpu : yes\nfpu_exception : yes\ncpuid level : 16\nwp : yes\nflags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good nopl nonstop_tsc cpuid extd_apicid aperfmperf rapl pni pclmulqdq monitor ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx f16c rdrand lahf_lm cmp_legacy svm extapic cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw ibs skinit wdt tce topoext perfctr_core perfctr_nb bpext perfctr_llc mwaitx cpb cat_l3 cdp_l3 hw_pstate ssbd mba ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 cqm rdt_a rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local clzero irperf xsaveerptr rdpru wbnoinvd amd_ppin arat npt 
lbrv svm_lock nrip_save tsc_scale vmcb_clean flushbyasid decodeassists pausefilter pfthreshold avic v_vmsave_vmload vgif v_spec_ctrl umip rdpid overflow_recov succor smca sme sev sev_es\nbugs : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass retbleed smt_rsb srso\nbogomips : 4673.53\nTLB size : 3072 4K pages\nclflush size : 64\ncache_alignment : 64\naddress sizes : 43 bits physical, 48 bits virtual\npower management: ts ttp tm hwpstate cpb eff_freq_ro [13] [14]" + "objectID": "posts/2022-10-31-stochastic-variational-gp.html#dataset", + "href": "posts/2022-10-31-stochastic-variational-gp.html#dataset", + "title": "Stochastic Variational Gaussian processes in JAX", + "section": "Dataset", + "text": "Dataset\nFor this blog post, we will stick to the classification problem and pick a reasonable classification dataset.\n\nn_samples = 100\nnoise = 0.1\nrandom_state = 0\nshuffle = True\n\nX, y = make_moons(\n n_samples=n_samples, random_state=random_state, noise=noise, shuffle=shuffle\n)\nX = StandardScaler().fit_transform(X) # Yes, this is useful for GPs\n\nX, y = map(jnp.array, (X, y))\n\nplt.scatter(X[:, 0], X[:, 1], c=y);\n\nWARNING:absl:No GPU/TPU found, falling back to CPU. 
(Set TF_CPP_MIN_LOG_LEVEL=0 and rerun for more info.)" }, { -    "objectID": "posts/wrf-tutorial.html#wrf-wps-installation", -    "href": "posts/wrf-tutorial.html#wrf-wps-installation", -    "title": "WRF Tutorial", -    "section": "WRF-WPS Installation", -    "text": "WRF-WPS Installation\n\nCreate a new directory for all WRF codes and put everything into it.\nClone the repo recursively (includes WRF, WRFDA & WRF-Chem)\n\ngit clone --recurse-submodules https://github.com/wrf-model/WRF\n\nClone the repo (includes WPS)\n\ngit clone https://github.com/wrf-model/WPS\n\nFollow instructions given on https://www2.mmm.ucar.edu/wrf/OnLineTutorial/compilation_tutorial.php" }, { +    "objectID": "posts/2022-10-31-stochastic-variational-gp.html#methodology", +    "href": "posts/2022-10-31-stochastic-variational-gp.html#methodology", +    "title": "Stochastic Variational Gaussian processes in JAX", +    "section": "Methodology", +    "text": "Methodology\nTo define a GP, we need a kernel function. Let us use the RBF kernel, also known as the Exponentiated Quadratic or Squared Exponential kernel.\n\nlengthscale = 1.0\nvariance = 1.0\n\nkernel_fn = variance * kernels.ExpSquared(scale=lengthscale)\n\nkernel_fn(X, X).shape\n\n(100, 100)\n\n\nAs explained in the blog post, we want to minimize the following loss function:\n\\[\nKL[q(u|\\eta) || p(u|y, \\theta)] = KL[q(u|\\eta) || p(u | \\theta)] - \\mathbb{E}_{u \\sim q(u|\\eta)} \\log p(y | u, \\theta) + const\n\\]\nLet us break down the loss and discuss each component.\n\nKL divergence\nIn the first term, we want to compute the KL divergence between the prior and the variational distribution of the GP at the inducing points. 
First, we need to define the inducing points.\n\nkey = jax.random.PRNGKey(0)\nn_inducing = 10\nn_dim = X.shape[1]\n\nX_inducing = jax.random.normal(key, shape=(n_inducing, n_dim))\nX_inducing.shape\n\n(10, 2)\n\n\nNow, defining the prior and variational distributions.\n\ngp_mean = 0.43 # a scalar parameter to train\n\nprior_mean = gp_mean * jnp.ones(n_inducing)\nprior_cov = kernel_fn(X_inducing, X_inducing)\n\nprior_distribution = tfd.MultivariateNormalFullCovariance(prior_mean, prior_cov)\n\n\nvariational_mean = jax.random.uniform(key, shape=(n_inducing,)) # a vector parameter to train\n\nA covariance matrix cannot be learned directly due to the positive definite constraint. We can decompose a covariance matrix in the following way:\n\\[\n\\begin{aligned}\nK &= diag(\\boldsymbol{\\sigma})\\Sigma diag(\\boldsymbol{\\sigma})\\\\\n &= diag(\\boldsymbol{\\sigma})LL^T diag(\\boldsymbol{\\sigma})\n\\end{aligned}\n\\]\nwhere \\(\\Sigma\\) is a correlation matrix, \\(L\\) is the lower triangular Cholesky factor of \\(\\Sigma\\), and \\(\\boldsymbol{\\sigma}\\) is the standard deviation vector. We can use tfb.CorrelationCholesky to generate \\(L\\) from an unconstrained vector:\n\nrandom_vector = jax.random.normal(key, shape=(3,))\ncorr_chol = tfb.CorrelationCholesky()(random_vector)\ncorrelation = corr_chol@corr_chol.T\ncorrelation\n\nDeviceArray([[ 1. , 0.54464529, -0.7835968 ],\n [ 0.54464529, 1. , -0.33059078],\n [-0.7835968 , -0.33059078, 1. ]], dtype=float64)\n\n\nTo constrain \\(\\boldsymbol{\\sigma}\\), any positivity constraint would suffice. 
So, combining these tricks, we can model the covariance as follows:\n\nrandom_vector = jax.random.normal(\n key, shape=(n_inducing * (n_inducing - 1) // 2,)\n) # a trainable parameter\nlog_sigma = jax.random.normal(key, shape=(n_inducing, 1)) # a trainable parameter\n\n\nsigma = jnp.exp(log_sigma)\ncorr_chol = tfb.CorrelationCholesky()(random_vector)\nvariational_cov = sigma * sigma.T * (corr_chol @ corr_chol.T)\nprint(variational_cov.shape)\n\nvariational_distribution = tfd.MultivariateNormalFullCovariance(variational_mean, variational_cov\n)\n\n(10, 10)\n\n\nNow, we can compute the KL divergence:\n\nvariational_distribution.kl_divergence(prior_distribution)\n\nDeviceArray(416.89357355, dtype=float64)\n\n\n\n\nExpectation over the likelihood\nWe want to compute the following expectation:\n\\[\n-\\sum_{i=1}^N \\mathbb{E}_{f_i \\sim q(f_i | \\eta, \\theta)} \\log p(y_i| f_i, \\theta)\n\\]\nNote that \\(p(y_i| f_i, \\theta)\\) can be any likelihood depending on the problem; for classification, we may use a Bernoulli likelihood.\n\nf = jax.random.normal(key, shape=y.shape)\nlikelihood_distribution = tfd.Bernoulli(logits=f)\n\nlog_likelihood = likelihood_distribution.log_prob(y).sum()\nlog_likelihood\n\nDeviceArray(-72.04665624, dtype=float64)\n\n\nWe need to sample \\(f_i\\) from \\(q(f_i | \\eta, \\theta)\\), which has the following form:\n\\[\n\\begin{aligned}\nq(u) &\\sim \\mathcal{N}(\\boldsymbol{m}, S)\\\\\nq(f_i | \\eta, \\theta) &\\sim \\mathcal{N}(\\mu_i, \\sigma_i^2)\\\\\n\\mu_i &= A\\boldsymbol{m}\\\\\n\\sigma_i^2 &= K_{ii} + A(S - K_{mm})A^T\\\\\nA &= K_{im}K_{mm}^{-1}\n\\end{aligned}\n\\]\nNote that matrix inversion is often unstable with jnp.linalg.inv, so we will use Cholesky tricks to compute \\(A\\).\n\ndef q_f(x_i):\n x_i = x_i.reshape(1, -1) # ensure correct shape\n K_im = kernel_fn(x_i, X_inducing)\n K_mm = kernel_fn(X_inducing, X_inducing)\n chol_mm = jnp.linalg.cholesky(K_mm + jnp.eye(K_mm.shape[0])*JITTER)\n A = 
jsp.linalg.cho_solve((chol_mm, True), K_im.T).T\n \n mu_i = A@variational_mean\n sigma_sqr_i = kernel_fn(x_i, x_i) + A@(variational_cov - prior_cov)@A.T\n \n return tfd.Normal(loc=mu_i, scale=sigma_sqr_i**0.5)\n\nHere is a function to compute the log likelihood for a single data point:\n\ndef log_likelihood(x_i, y_i, seed):\n sample = q_f(x_i).sample(seed=seed)\n log_likelihood = tfd.Bernoulli(logits=sample).log_prob(y_i)\n return log_likelihood.squeeze()\n\n\nlog_likelihood(X[0], y[0], seed=key)\n\nDeviceArray(-0.17831203, dtype=float64)\n\n\nWe can use jax.vmap to compute log_likelihood over a batch. With that, we can leverage stochastic variational inference following section 10.3.1 (Eq. 10.108) from pml book2. Basically, in each iteration, we need to multiply the batch log likelihood by \\(\\frac{N}{B}\\) to get an unbiased minibatch approximation, where \\(N\\) is the size of the full dataset and \\(B\\) is the batch size.\n\nbatch_size = 10\n\nseeds = jax.random.split(key, num=batch_size)\n\nll = len(y)/batch_size * jax.vmap(log_likelihood)(X[:batch_size], y[:batch_size], seeds).sum()\nll\n\nDeviceArray(-215.46520331, dtype=float64)\n\n\nNote that once the parameters are optimized, we can use the derivations of \\(q(f_i | \\eta, \\theta)\\) to compute the posterior distribution. We have figured out all the pieces by now, so it is time to put it together in a single class. Some pointers to note are the following:\n\nWe define a single function get_constrained_params to transform all unconstrained parameters.\njax.lax.scan gives a huge boost to the training loop.\nThere is some repetition of code because it has not been heavily optimized; you can refactor it at your end if needed."
}, { - "objectID": "posts/2022-05-17-contributors_sorted_by_prs.html", - "href": "posts/2022-05-17-contributors_sorted_by_prs.html", - "title": "Get a list of contributors from a repo", - "section": "", - "text": "import pandas as pd" + "objectID": "posts/2022-10-31-stochastic-variational-gp.html#all-in-one", + "href": "posts/2022-10-31-stochastic-variational-gp.html#all-in-one", + "title": "Stochastic Variational Gaussian processes in JAX", + "section": "All in one", + "text": "All in one\n\nclass SVGP:\n def __init__(self, X_inducing, data_size):\n self.X_inducing = X_inducing\n self.n_inducing = len(X_inducing)\n self.data_size = data_size\n \n def init_params(self, seed):\n variational_corr_chol_param = tfb.CorrelationCholesky().inverse(jnp.eye(self.n_inducing))\n \n dummy_params = {\"log_variance\": jnp.zeros(()),\n \"log_scale\": jnp.zeros(()), \n \"mean\": jnp.zeros(()),\n \"X_inducing\": self.X_inducing,\n \"variational_mean\": jnp.zeros(self.n_inducing),\n \"variational_corr_chol_param\": variational_corr_chol_param,\n \"log_variational_sigma\": jnp.zeros((self.n_inducing, 1)),\n }\n \n flat_params, unravel_fn = ravel_pytree(dummy_params)\n random_params = jax.random.normal(key, shape=(len(flat_params), ))\n params = unravel_fn(random_params)\n return params\n \n @staticmethod\n def get_constrained_params(params):\n return {\"mean\": params[\"mean\"],\n \"variance\": jnp.exp(params['log_variance']), \n \"scale\": jnp.exp(params['log_scale']), \n \"X_inducing\": params[\"X_inducing\"],\n \"variational_mean\": params[\"variational_mean\"],\n \"variational_corr_chol_param\": params[\"variational_corr_chol_param\"],\n \"variational_sigma\": jnp.exp(params[\"log_variational_sigma\"])}\n \n @staticmethod\n def get_q_f(params, x_i, prior_distribution, variational_distribution):\n x_i = x_i.reshape(1, -1) # ensure correct shape\n \n kernel_fn = params['variance'] * kernels.ExpSquared(scale=params[\"scale\"])\n K_im = kernel_fn(x_i, params[\"X_inducing\"])\n K_mm = 
prior_distribution.covariance()\n chol_mm = jnp.linalg.cholesky(K_mm)\n A = jsp.linalg.cho_solve((chol_mm, True), K_im.T).T\n\n mu_i = A@params[\"variational_mean\"]\n sigma_sqr_i = kernel_fn(x_i, x_i) + A@(variational_distribution.covariance() - K_mm)@A.T\n\n return tfd.Normal(loc=mu_i, scale=sigma_sqr_i**0.5)\n \n def get_distributions(self, params):\n kernel_fn = params['variance'] * kernels.ExpSquared(scale=params[\"scale\"])\n prior_mean = params[\"mean\"]\n prior_cov = kernel_fn(params[\"X_inducing\"], params[\"X_inducing\"]) + jnp.eye(self.n_inducing)*JITTER\n prior_distribution = tfd.MultivariateNormalFullCovariance(prior_mean, prior_cov)\n\n corr_chol = tfb.CorrelationCholesky()(params[\"variational_corr_chol_param\"])\n sigma = jnp.diag(params[\"variational_sigma\"])\n variational_cov = sigma*sigma.T*(corr_chol@corr_chol.T) + jnp.eye(self.n_inducing)*JITTER\n variational_distribution = tfd.MultivariateNormalFullCovariance(params[\"variational_mean\"], variational_cov)\n \n return prior_distribution, variational_distribution\n \n def loss_fn(self, params, X_batch, y_batch, seed):\n params = self.get_constrained_params(params)\n \n # Get distributions\n prior_distribution, variational_distribution = self.get_distributions(params)\n \n # Compute kl\n kl = variational_distribution.kl_divergence(prior_distribution)\n\n # Compute log likelihood\n def log_likelihood_fn(x_i, y_i, seed):\n q_f = self.get_q_f(params, x_i, prior_distribution, variational_distribution)\n sample = q_f.sample(seed=seed)\n log_likelihood = tfd.Bernoulli(logits=sample).log_prob(y_i)\n return log_likelihood.squeeze()\n \n seeds = jax.random.split(seed, num=len(y_batch))\n log_likelihood = jax.vmap(log_likelihood_fn)(X_batch, y_batch, seeds).sum() * self.data_size/len(y_batch)\n\n return kl - log_likelihood\n \n def fit_fn(self, X, y, init_params, optimizer, n_iters, batch_size, seed):\n state = optimizer.init(init_params)\n value_and_grad_fn = jax.value_and_grad(self.loss_fn)\n \n def 
one_step(params_and_state, seed):\n params, state = params_and_state\n idx = jax.random.choice(seed, self.data_size, (batch_size,), replace=False)\n X_batch, y_batch = X[idx], y[idx]\n \n seed2 = jax.random.split(seed, 1)[0]\n loss, grads = value_and_grad_fn(params, X_batch, y_batch, seed2)\n updates, state = optimizer.update(grads, state)\n params = optax.apply_updates(params, updates)\n return (params, state), (loss, params)\n \n seeds = jax.random.split(seed, num=n_iters)\n (best_params, _), (loss_history, params_history) = jax.lax.scan(one_step, (init_params, state), xs=seeds)\n return best_params, loss_history, params_history\n\n def predict_fn(self, params, X_new):\n constrained_params = self.get_constrained_params(params)\n prior_distribution, variational_distribution = self.get_distributions(constrained_params)\n \n def _predict_fn(x_i): \n # Get posterior\n q_f = self.get_q_f(constrained_params, x_i, prior_distribution, variational_distribution)\n return q_f.mean().squeeze(), q_f.variance().squeeze()\n \n mean, var = jax.vmap(_predict_fn)(X_new)\n return mean.squeeze(), var.squeeze()" }, { - "objectID": "posts/2022-05-17-contributors_sorted_by_prs.html#config", - "href": "posts/2022-05-17-contributors_sorted_by_prs.html#config", - "title": "Get a list of contributors from a repo", - "section": "Config", - "text": "Config\n\nowner = \"probml\"\nrepo = \"pyprobml\"" + "objectID": "posts/2022-10-31-stochastic-variational-gp.html#train-and-predict", + "href": "posts/2022-10-31-stochastic-variational-gp.html#train-and-predict", + "title": "Stochastic Variational Gaussian processes in JAX", + "section": "Train and predict", + "text": "Train and predict\n\nn_inducing = 20\nn_epochs = 100\nbatch_size = 10\ndata_size = len(y)\nn_iters = n_epochs*(data_size/batch_size)\nn_iters\n\n1000.0\n\n\n\nkey = jax.random.PRNGKey(0)\nkey2, subkey = jax.random.split(key)\noptimizer = optax.adam(learning_rate=0.01)\n\nX_inducing = jax.random.choice(key, X, (n_inducing,), 
replace=False)\nmodel = SVGP(X_inducing, data_size)\n\ninit_params = model.init_params(key2)\n\nmodel.loss_fn(init_params, X, y, key)\nbest_params, loss_history, params_history = model.fit_fn(X, y, init_params, optimizer, n_iters, batch_size, subkey)\n\nplt.figure()\nplt.plot(loss_history);\nplt.title(\"Loss\");\n\n\n\n\n\n\n\n\n\nx = jnp.linspace(-3.5, 3.5, 100)\nseed = jax.random.PRNGKey(123)\n\nX1, X2 = jnp.meshgrid(x, x)\nf = lambda x1, x2: model.predict_fn(best_params, jnp.array([x1, x2]).reshape(1, -1))\npred_mean, pred_var = jax.vmap(jax.vmap(f))(X1, X2)\nlogits = tfd.Normal(pred_mean, pred_var**0.5).sample(seed=seed, sample_shape=(10000,))\nproba = jax.nn.sigmoid(logits)\n\nproba_mean = proba.mean(axis=0)\nproba_std2 = proba.std(axis=0)*2\n\n\nfig, ax = plt.subplots(1, 2, figsize=(12,4))\ncplot1 = ax[0].contourf(X1, X2, proba_mean.squeeze(), alpha=0.5, levels=20)\nplt.colorbar(cplot1, ax=ax[0])\n\ncplot2 = ax[1].contourf(X1, X2, proba_std2.squeeze(), alpha=0.5, levels=20)\nplt.colorbar(cplot2, ax=ax[1])\n\nax[0].scatter(X[:, 0], X[:, 1], c=y);\nax[1].scatter(X[:, 0], X[:, 1], c=y);\n\nax[0].set_title(\"Posterior $\\mu$\");\nax[1].set_title(\"Posterior $\\mu \\pm 2*\\sigma$\");" }, { - "objectID": "posts/2022-05-17-contributors_sorted_by_prs.html#get-all-contributors-to-a-repo", - "href": "posts/2022-05-17-contributors_sorted_by_prs.html#get-all-contributors-to-a-repo", - "title": "Get a list of contributors from a repo", - "section": "Get all contributors to a repo", - "text": "Get all contributors to a repo\n\ncontributors = pd.read_json(f\"https://api.github.com/repos/{owner}/{repo}/contributors?per_page=100\")\ncontributors = contributors.set_index(\"login\")\nprint(f\"Number of contributors: {len(contributors.index.unique())}\")\ncontributors.head(2)\n\nNumber of contributors: 47\n\n\n\n \n \n 
\n\n\n\n\n\n\nid\nnode_id\navatar_url\ngravatar_id\nurl\nhtml_url\nfollowers_url\nfollowing_url\ngists_url\nstarred_url\nsubscriptions_url\norganizations_url\nrepos_url\nevents_url\nreceived_events_url\ntype\nsite_admin\ncontributions\n\n\nlogin\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nmurphyk\n4632336\nMDQ6VXNlcjQ2MzIzMzY=\nhttps://avatars.githubusercontent.com/u/463233...\n\nhttps://api.github.com/users/murphyk\nhttps://github.com/murphyk\nhttps://api.github.com/users/murphyk/followers\nhttps://api.github.com/users/murphyk/following...\nhttps://api.github.com/users/murphyk/gists{/gi...\nhttps://api.github.com/users/murphyk/starred{/...\nhttps://api.github.com/users/murphyk/subscript...\nhttps://api.github.com/users/murphyk/orgs\nhttps://api.github.com/users/murphyk/repos\nhttps://api.github.com/users/murphyk/events{/p...\nhttps://api.github.com/users/murphyk/received_...\nUser\nFalse\n1777\n\n\nNeoanarika\n5188337\nMDQ6VXNlcjUxODgzMzc=\nhttps://avatars.githubusercontent.com/u/518833...\n\nhttps://api.github.com/users/Neoanarika\nhttps://github.com/Neoanarika\nhttps://api.github.com/users/Neoanarika/followers\nhttps://api.github.com/users/Neoanarika/follow...\nhttps://api.github.com/users/Neoanarika/gists{...\nhttps://api.github.com/users/Neoanarika/starre...\nhttps://api.github.com/users/Neoanarika/subscr...\nhttps://api.github.com/users/Neoanarika/orgs\nhttps://api.github.com/users/Neoanarika/repos\nhttps://api.github.com/users/Neoanarika/events...\nhttps://api.github.com/users/Neoanarika/receiv...\nUser\nFalse\n184" + "objectID": "posts/2022-10-31-stochastic-variational-gp.html#some-more-datasets", + "href": "posts/2022-10-31-stochastic-variational-gp.html#some-more-datasets", + "title": "Stochastic Variational Gaussian processes in JAX", + "section": "Some more datasets", + "text": "Some more datasets\n\ndef fit_and_plot(X, y):\n X = StandardScaler().fit_transform(X) # Yes, this is useful for GPs\n X, y = map(jnp.array, (X, y))\n\n X_inducing = 
jax.random.choice(key, X, (n_inducing,), replace=False)\n model = SVGP(X_inducing, data_size)\n\n init_params = model.init_params(key2)\n\n model.loss_fn(init_params, X, y, key)\n best_params, loss_history, params_history = model.fit_fn(X, y, init_params, optimizer, n_iters, batch_size, subkey)\n\n plt.figure()\n plt.plot(loss_history);\n plt.title(\"Loss\");\n \n f = lambda x1, x2: model.predict_fn(best_params, jnp.array([x1, x2]).reshape(1, -1))\n pred_mean, pred_var = jax.vmap(jax.vmap(f))(X1, X2)\n logits = tfd.Normal(pred_mean, pred_var**0.5).sample(seed=seed, sample_shape=(10000,))\n proba = jax.nn.sigmoid(logits)\n\n proba_mean = proba.mean(axis=0)\n proba_std2 = proba.std(axis=0)*2\n \n fig, ax = plt.subplots(1, 2, figsize=(12,4))\n cplot1 = ax[0].contourf(X1, X2, proba_mean.squeeze(), alpha=0.5, levels=20)\n plt.colorbar(cplot1, ax=ax[0])\n\n cplot2 = ax[1].contourf(X1, X2, proba_std2.squeeze(), alpha=0.5, levels=20)\n plt.colorbar(cplot2, ax=ax[1])\n\n ax[0].scatter(X[:, 0], X[:, 1], c=y);\n ax[1].scatter(X[:, 0], X[:, 1], c=y);\n\n ax[0].set_title(\"Posterior $\\mu$\");\n ax[1].set_title(\"Posterior $\\mu \\pm 2*\\sigma$\");\n\n\nmake_blobs\n\nX, y = make_blobs(n_samples=n_samples, random_state=random_state, centers=2)\n\nplt.scatter(X[:, 0], X[:, 1], c=y);\nfit_and_plot(X, y)\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nmake_circles\n\nX, y = make_circles(n_samples=n_samples, random_state=random_state, noise=noise, factor=0.1)\n\nplt.scatter(X[:, 0], X[:, 1], c=y);\nfit_and_plot(X, y)" }, { - "objectID": "posts/2022-05-17-contributors_sorted_by_prs.html#fetch-all-prs-from-a-repo", - "href": "posts/2022-05-17-contributors_sorted_by_prs.html#fetch-all-prs-from-a-repo", - "title": "Get a list of contributors from a repo", - "section": "Fetch all PRs from a repo", - "text": "Fetch all PRs from a repo\n\npage_range = range(1, 6)\nget_pr_df = lambda page: 
pd.read_json(f\"https://api.github.com/repos/probml/pyprobml/pulls?state=all&per_page=100&page={page}\")\npull_requests = pd.concat(map(get_pr_df, page_range))\nprint(f\"Number of PRs: {len(pull_requests)}\")\npull_requests.head(2)\n\nNumber of PRs: 497\n\n\n\n \n \n \n\n\n\n\n\n\nurl\nid\nnode_id\nhtml_url\ndiff_url\npatch_url\nissue_url\nnumber\nstate\nlocked\n...\nreview_comments_url\nreview_comment_url\ncomments_url\nstatuses_url\nhead\nbase\n_links\nauthor_association\nauto_merge\nactive_lock_reason\n\n\n\n\n0\nhttps://api.github.com/repos/probml/pyprobml/p...\n938329819\nPR_kwDOA-3vB8437cbb\nhttps://github.com/probml/pyprobml/pull/841\nhttps://github.com/probml/pyprobml/pull/841.diff\nhttps://github.com/probml/pyprobml/pull/841.patch\nhttps://api.github.com/repos/probml/pyprobml/i...\n841\nclosed\nFalse\n...\nhttps://api.github.com/repos/probml/pyprobml/p...\nhttps://api.github.com/repos/probml/pyprobml/p...\nhttps://api.github.com/repos/probml/pyprobml/i...\nhttps://api.github.com/repos/probml/pyprobml/s...\n{'label': 'karm-patel:posrprocessing', 'ref': ...\n{'label': 'probml:master', 'ref': 'master', 's...\n{'self': {'href': 'https://api.github.com/repo...\nCONTRIBUTOR\nNaN\nNaN\n\n\n1\nhttps://api.github.com/repos/probml/pyprobml/p...\n938317389\nPR_kwDOA-3vB8437ZZN\nhttps://github.com/probml/pyprobml/pull/840\nhttps://github.com/probml/pyprobml/pull/840.diff\nhttps://github.com/probml/pyprobml/pull/840.patch\nhttps://api.github.com/repos/probml/pyprobml/i...\n840\nclosed\nFalse\n...\nhttps://api.github.com/repos/probml/pyprobml/p...\nhttps://api.github.com/repos/probml/pyprobml/p...\nhttps://api.github.com/repos/probml/pyprobml/i...\nhttps://api.github.com/repos/probml/pyprobml/s...\n{'label': 'karm-patel:master', 'ref': 'master'...\n{'label': 'probml:master', 'ref': 'master', 's...\n{'self': {'href': 'https://api.github.com/repo...\nCONTRIBUTOR\nNaN\nNaN\n\n\n\n\n2 rows × 36 columns" + "objectID": "posts/2024-12-10-cpcb-download.html", + "href": 
"posts/2024-12-10-cpcb-download.html", + "title": "Download CPCB live data", + "section": "", + "text": "import os\nimport re\nfrom glob import glob\nimport pandas as pd\nfrom tqdm.notebook import tqdm\nfrom selenium import webdriver\nfrom selenium.webdriver.common.by import By\nfrom selenium.webdriver.common.keys import Keys\nfrom selenium.webdriver.support.ui import Select, WebDriverWait\nfrom selenium.webdriver.common.action_chains import ActionChains\nfrom selenium.webdriver.support import expected_conditions as EC\nfrom selenium.webdriver.chrome.options import Options\nfrom time import sleep\n\nHOME_URL = \"https://airquality.cpcb.gov.in/ccr/#/caaqm-dashboard-all/caaqm-landing\"\nDOWNLOAD_OLD_DATA_URL = \"https://airquality.cpcb.gov.in/ccr/#/caaqm-dashboard-all/caaqm-landing/caaqm-data-repository\"\nDOWNLOAD_PAGE_URL = \"https://airquality.cpcb.gov.in/ccr/#/caaqm-dashboard-all/caaqm-landing/data\"\ndef click_it(driver, element):\n driver.execute_script(\"arguments[0].click();\", element)\n \ndef find_it(element, option):\n return element.find_element(By.XPATH, f\"//li[contains(text(), '{option}')]\")\n\ndef select_dropdown_option(driver, element, option):\n element.click()\n option = find_it(element, option)\n click_it(driver, option)" }, { - "objectID": "posts/2022-05-17-contributors_sorted_by_prs.html#get-a-list-of-contributors-sorted-by-count-of-prs", - "href": "posts/2022-05-17-contributors_sorted_by_prs.html#get-a-list-of-contributors-sorted-by-count-of-prs", - "title": "Get a list of contributors from a repo", - "section": "Get a list of contributors sorted by count of PRs", - "text": "Get a list of contributors sorted by count of PRs\n\npull_requests['login'] = pull_requests['user'].apply(lambda x: x[\"login\"])\nsorted_by_pr_count = pull_requests.groupby(\"login\").agg({'url': len}).sort_values(by='url', ascending=False)\nsorted_by_pr_count.rename(columns={'url': 'Number of PRs'}, inplace=True)\nsorted_by_pr_count.head(5)\n\n\n \n \n 
\n\n\n\n\n\n\nNumber of PRs\n\n\nlogin\n\n\n\n\n\nDrishttii\n79\n\n\ngerdm\n55\n\n\nkaralleyna\n43\n\n\nalways-newbie161\n29\n\n\nkarm-patel\n29" + "objectID": "posts/2024-12-10-cpcb-download.html#dry-run-to-get-metadata", + "href": "posts/2024-12-10-cpcb-download.html#dry-run-to-get-metadata", + "title": "Download CPCB live data", + "section": "Dry run to get metadata", + "text": "Dry run to get metadata\n\n# headless chrome\noptions = Options()\noptions.add_argument(\"--headless\")\n\n# open the browser\ndriver = webdriver.Chrome(options=options)\n\n# open the website\ndriver.get(DOWNLOAD_OLD_DATA_URL)\n\n# wait for the page to load and the dropdowns to appear\ndropdowns = WebDriverWait(driver, 10).until(EC.presence_of_all_elements_located((By.CSS_SELECTOR, \".select-box\")))\nlen(dropdowns)\n\n5\n\n\n\ndrop_data_type, drop_frequency, drop_states, drop_cities, drop_stations = dropdowns\n\n\n# Select data type\nselect_dropdown_option(driver, drop_data_type, \"Raw data\")\n\n# Select frequency\nselect_dropdown_option(driver, drop_frequency, \"1 day\")\n\n# Get the states\ndrop_states.click() # Open the dropdown\nstates = drop_states.text.replace(\"▲\\n\", \"\").split(\"\\n\")\nprint(\"Number of states:\", len(states))\ndrop_states.click() # Close the dropdown\n\nNumber of states: 31\n\n\n\nmetadata_df = pd.DataFrame(columns=[\"State\", \"City\", \"Station\", \"site_id\"])\n\n# This loop took less than a minute to run\nprogress_bar = tqdm(total=600) # as of 2024, 560 stations. 
update this number if it changes\nfor state in states:\n select_dropdown_option(driver, drop_states, state)\n \n # Get all cities\n drop_cities.click() # Open the dropdown\n cities = drop_cities.text.replace(\"▲\\n\", \"\").split(\"\\n\")\n drop_cities.click() # Close the dropdown\n \n for city in cities:\n select_dropdown_option(driver, drop_cities, city)\n \n # Get all stations\n drop_stations.click() # Open the dropdown\n stations = drop_stations.text.replace(\"▲\\n\", \"\").split(\"\\n\")\n drop_stations.click() # Close the dropdown\n \n for station in stations:\n # corner cases\n if station == \"Municipal Corporation Office, Dharuhera - HSPCB\":\n site_id = \"site_5044\"\n elif station == \"Civil Lines, Ajmer - RSPCB\":\n site_id = \"site_1392\"\n else:\n try:\n select_dropdown_option(driver, drop_stations, station)\n except:\n print(\"Unable to select station\")\n print(station)\n print(drop_stations.text)\n continue\n site_id = drop_stations.get_attribute(\"ng-reflect-model\")\n metadata_df.loc[len(metadata_df)] = [state, city, station, site_id]\n progress_bar.update(1)\n\n\n\n\n\nlen(metadata_df)\n\n560\n\n\n\nmetadata_df.head()\n\n\n\n\n\n\n\n\nState\nCity\nStation\nsite_id\n\n\n\n\n0\nAndhra Pradesh\nAmaravati\nSecretariat, Amaravati - APPCB\nsite_1406\n\n\n1\nAndhra Pradesh\nAnantapur\nGulzarpet, Anantapur - APPCB\nsite_5632\n\n\n2\nAndhra Pradesh\nChittoor\nGangineni Cheruvu, Chittoor - APPCB\nsite_5665\n\n\n3\nAndhra Pradesh\nKadapa\nYerramukkapalli, Kadapa - APPCB\nsite_5693\n\n\n4\nAndhra Pradesh\nRajamahendravaram\nAnand Kala Kshetram, Rajamahendravaram - APPCB\nsite_1399\n\n\n\n\n\n\n\n\nmetadata_df.tail()\n\n\n\n\n\n\n\n\nState\nCity\nStation\nsite_id\n\n\n\n\n555\nWest Bengal\nKolkata\nRabindra Bharati University, Kolkata - WBPCB\nsite_296\n\n\n556\nWest Bengal\nKolkata\nFort William, Kolkata - WBPCB\nsite_5110\n\n\n557\nWest Bengal\nKolkata\nVictoria, Kolkata - WBPCB\nsite_309\n\n\n558\nWest Bengal\nKolkata\nBidhannagar, Kolkata - 
WBPCB\nsite_5129\n\n\n559\nWest Bengal\nSiliguri\nWard-32 Bapupara, Siliguri - WBPCB\nsite_1419\n\n\n\n\n\n\n\n\nfor site_id, more_than_1 in (metadata_df.site_id.value_counts() > 1).items():\n if more_than_1:\n print(metadata_df[metadata_df.site_id == site_id])\n\n State City Station site_id\n25 Bihar Aurangabad MIDC Chilkalthana, Aurangabad - MPCB site_5788\n254 Maharashtra Aurangabad MIDC Chilkalthana, Aurangabad - MPCB site_5788\n State City Station site_id\n26 Bihar Aurangabad More Chowk Waluj, Aurangabad - MPCB site_198\n255 Maharashtra Aurangabad More Chowk Waluj, Aurangabad - MPCB site_198\n State City Station \\\n499 Uttar Pradesh Greater Noida Knowledge Park - V, Greater Noida - UPPCB \n526 Uttar Pradesh Noida Knowledge Park - V, Greater Noida - UPPCB \n\n site_id \n499 site_5121 \n526 site_5121 \n State City \\\n498 Uttar Pradesh Greater Noida \n525 Uttar Pradesh Noida \n\n Station site_id \n498 Knowledge Park - III, Greater Noida - UPPCB site_1541 \n525 Knowledge Park - III, Greater Noida - UPPCB site_1541 \n State City Station site_id\n28 Bihar Aurangabad Rachnakar Colony, Aurangabad - MPCB site_5789\n257 Maharashtra Aurangabad Rachnakar Colony, Aurangabad - MPCB site_5789\n State City Station site_id\n27 Bihar Aurangabad Gurdeo Nagar, Aurangabad - BSPCB site_5544\n256 Maharashtra Aurangabad Gurdeo Nagar, Aurangabad - BSPCB site_5544\n\n\n\n# clean up\ndrop_items = [metadata_df[(metadata_df.State == \"Bihar\") & (metadata_df.Station == \"MIDC Chilkalthana, Aurangabad - MPCB\")].index.item(),\n metadata_df[(metadata_df.City == \"Noida\") & (metadata_df.Station == \"Knowledge Park - III, Greater Noida - UPPCB\")].index.item(),\n metadata_df[(metadata_df.State == \"Bihar\") & (metadata_df.Station == \"More Chowk Waluj, Aurangabad - MPCB\")].index.item(),\n metadata_df[(metadata_df.State == \"Bihar\") & (metadata_df.Station == \"MIDC Chilkalthana, Aurangabad - MPCB\")].index.item(),\n metadata_df[(metadata_df.State == \"Maharashtra\") & (metadata_df.Station 
== \"Gurdeo Nagar, Aurangabad - BSPCB\")].index.item(),\n metadata_df[(metadata_df.State == \"Bihar\") & (metadata_df.Station == \"Rachnakar Colony, Aurangabad - MPCB\")].index.item(),\n metadata_df[(metadata_df.City == \"Noida\") & (metadata_df.Station == \"Knowledge Park - V, Greater Noida - UPPCB\")].index.item()]\n\nmetadata_df.drop(drop_items, inplace=True)\nlen(metadata_df)\n\n554\n\n\n\nassert set(metadata_df.site_id.value_counts()) == {1}\n\n\nmetadata_df.to_csv(\"metadata.csv\", index=False)" }, { - "objectID": "posts/2022-05-17-contributors_sorted_by_prs.html#create-a-dashboard", - "href": "posts/2022-05-17-contributors_sorted_by_prs.html#create-a-dashboard", - "title": "Get a list of contributors from a repo", - "section": "Create a dashboard", - "text": "Create a dashboard\n\ndef get_href_user(user):\n username, profile_link = user.split(\"|\")\n return f\"[{username}]({profile_link})\"\n\ndashboard = pd.DataFrame(index=sorted_by_pr_count.index)\ndashboard[\"Avatar\"] = contributors.avatar_url.apply(lambda url: f'<img width=\"25\" alt=\"image\" src=\"{url}\">')\ndashboard[\"Contributor\"] = (contributors.index +\"|\"+ contributors['html_url']).apply(get_href_user)\ndashboard[\"Number of PRs\"] = sorted_by_pr_count[\"Number of PRs\"]\nprint(dashboard.dropna().T.to_markdown())\n\n| | Drishttii | gerdm | karalleyna | always-newbie161 | karm-patel | Duane321 | Nirzu97 | patel-zeel | animesh-007 | ashishpapanai | shivaditya-meduri | Neoanarika | andrewnc | nappaillav | Abdelrahman350 | mjsML | jdf22 | kzymgch | nalzok | nitish1295 | Garvit9000c | AnkitaKumariJain14 | rohit-khoiwal-30 | shobro | raymondyeh07 | khanshehjad | alenm10 | firatoncel | AnandShegde | Aadesh-1404 | nealmcb | nipunbatra | petercerno | posgnu | mvervuurt | hieuza | Prahitha | TripleTop | UmarJ | Vishal987595 | a-fakhri | adamnemecek | galv | jlh2018 | krasserm | yuanx749 
|\n|:--------------|:----------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----
------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:-------------------------------------
-------------------------------------------------|:----------------------------------------------------------------------------------------|\n| Avatar | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/35187749?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/4108759?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/36455180?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/66471669?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/59387624?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/19956442?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/28842790?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/59758528?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/53366877?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/52123364?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/77324692?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/5188337?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/7716402?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/43855961?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/47902062?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/7131192?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/1637094?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/10054419?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/13443062?v=4\"> | <img width=\"25\" 
alt=\"image\" src=\"https://avatars.githubusercontent.com/u/21181046?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/68856476?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/62535006?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/87682045?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/54628243?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/5696982?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/31896767?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/42214173?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/9141211?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/79975787?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/68186100?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/119472?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/60985?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/1649209?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/30136201?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/6399881?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/1021144?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/44160152?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/48208522?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/34779641?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/97757583?v=4\"> | <img 
width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/65111198?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/182415?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/4767568?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/40842099?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/202907?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/47032563?v=4\"> |\n| Contributor | [Drishttii](https://github.com/Drishttii) | [gerdm](https://github.com/gerdm) | [karalleyna](https://github.com/karalleyna) | [always-newbie161](https://github.com/always-newbie161) | [karm-patel](https://github.com/karm-patel) | [Duane321](https://github.com/Duane321) | [Nirzu97](https://github.com/Nirzu97) | [patel-zeel](https://github.com/patel-zeel) | [animesh-007](https://github.com/animesh-007) | [ashishpapanai](https://github.com/ashishpapanai) | [shivaditya-meduri](https://github.com/shivaditya-meduri) | [Neoanarika](https://github.com/Neoanarika) | [andrewnc](https://github.com/andrewnc) | [nappaillav](https://github.com/nappaillav) | [Abdelrahman350](https://github.com/Abdelrahman350) | [mjsML](https://github.com/mjsML) | [jdf22](https://github.com/jdf22) | [kzymgch](https://github.com/kzymgch) | [nalzok](https://github.com/nalzok) | [nitish1295](https://github.com/nitish1295) | [Garvit9000c](https://github.com/Garvit9000c) | [AnkitaKumariJain14](https://github.com/AnkitaKumariJain14) | [rohit-khoiwal-30](https://github.com/rohit-khoiwal-30) | [shobro](https://github.com/shobro) | [raymondyeh07](https://github.com/raymondyeh07) | [khanshehjad](https://github.com/khanshehjad) | [alenm10](https://github.com/alenm10) | [firatoncel](https://github.com/firatoncel) | [AnandShegde](https://github.com/AnandShegde) | [Aadesh-1404](https://github.com/Aadesh-1404) | 
[nealmcb](https://github.com/nealmcb) | [nipunbatra](https://github.com/nipunbatra) | [petercerno](https://github.com/petercerno) | [posgnu](https://github.com/posgnu) | [mvervuurt](https://github.com/mvervuurt) | [hieuza](https://github.com/hieuza) | [Prahitha](https://github.com/Prahitha) | [TripleTop](https://github.com/TripleTop) | [UmarJ](https://github.com/UmarJ) | [Vishal987595](https://github.com/Vishal987595) | [a-fakhri](https://github.com/a-fakhri) | [adamnemecek](https://github.com/adamnemecek) | [galv](https://github.com/galv) | [jlh2018](https://github.com/jlh2018) | [krasserm](https://github.com/krasserm) | [yuanx749](https://github.com/yuanx749) |\n| Number of PRs | 79 | 55 | 43 | 29 | 29 | 29 | 25 | 23 | 18 | 17 | 16 | 10 | 10 | 10 | 8 | 7 | 7 | 6 | 6 | 5 | 4 | 4 | 3 | 3 | 2 | 2 | 2 | 2 | 2 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 |" + "objectID": "posts/2024-12-10-cpcb-download.html#downloading-data", + "href": "posts/2024-12-10-cpcb-download.html#downloading-data", + "title": "Download CPCB live data", + "section": "Downloading data", + "text": "Downloading data\n\n# URL is specific to PM2.5 and PM10 so update it as per your needs\ndef get_url(state, city, site_id):\n return 
f\"https://airquality.cpcb.gov.in/ccr/#/caaqm-dashboard-all/caaqm-view-data-report/%2522%257B%255C%2522parameter_list%255C%2522%253A%255B%257B%255C%2522id%255C%2522%253A0%252C%255C%2522itemName%255C%2522%253A%255C%2522PM2.5%255C%2522%252C%255C%2522itemValue%255C%2522%253A%255C%2522parameter_193%255C%2522%257D%252C%257B%255C%2522id%255C%2522%253A1%252C%255C%2522itemName%255C%2522%253A%255C%2522PM10%255C%2522%252C%255C%2522itemValue%255C%2522%253A%255C%2522parameter_215%255C%2522%257D%255D%252C%255C%2522criteria%255C%2522%253A%255C%252224%2520Hours%255C%2522%252C%255C%2522reportFormat%255C%2522%253A%255C%2522Tabular%255C%2522%252C%255C%2522fromDate%255C%2522%253A%255C%252201-01-2024%2520T00%253A00%253A00Z%255C%2522%252C%255C%2522toDate%255C%2522%253A%255C%252211-12-2024%2520T16%253A45%253A59Z%255C%2522%252C%255C%2522state%255C%2522%253A%255C%2522{state.replace(' ', '%2520')}%255C%2522%252C%255C%2522city%255C%2522%253A%255C%2522{city.replace(' ', '%2520')}%255C%2522%252C%255C%2522station%255C%2522%253A%255C%2522{site_id}%255C%2522%252C%255C%2522parameter%255C%2522%253A%255B%255C%2522parameter_193%255C%2522%252C%255C%2522parameter_215%255C%2522%255D%252C%255C%2522parameterNames%255C%2522%253A%255B%255C%2522PM2.5%255C%2522%252C%255C%2522PM10%255C%2522%255D%257D%2522\"\n\n\n# add download directory\noptions = webdriver.ChromeOptions()\noptions.add_experimental_option(\"prefs\", {\n \"download.default_directory\": \"/Users/project561/cpcb_downloads\"\n})\n\ndriver = webdriver.Chrome(options=options)\ndriver.get(HOME_URL)\n\nEnter Captcha manually before moving ahead\n\nmetadata_df = pd.read_csv(\"metadata.csv\")\nmetadata_df.head(2)\n\n\n\n\n\n\n\n\nState\nCity\nStation\nsite_id\n\n\n\n\n0\nAndhra Pradesh\nAmaravati\nSecretariat, Amaravati - APPCB\nsite_1406\n\n\n1\nAndhra Pradesh\nAnantapur\nGulzarpet, Anantapur - APPCB\nsite_5632\n\n\n\n\n\n\n\n\nfiles = glob(\"/Users/project561/cpcb_downloads/*.xlsx\")\nprint(\"Number of files in the download directory:\", 
len(files))\nsite_ids = [re.search(r\"site_\\d+?2024\", file).group()[:-4] for file in files]\n# assert len(set(site_ids)) == len(site_ids), pd.Series(site_ids).value_counts()\nsite_ids = set(site_ids)\n\nfor i in range(len(metadata_df)):\n state, city, station, site_id = metadata_df.iloc[i]\n if site_id in site_ids:\n # print(\"Already downloaded\", i, state, city, station, site_id)\n continue\n print(\"Downloading\", i, state, city, station, site_id)\n url = get_url(state, city, site_id)\n \n # open new tab\n driver.execute_script(\"window.open('');\")\n driver.switch_to.window(driver.window_handles[-1])\n driver.get(url)\n excel_button = WebDriverWait(driver, 20).until(\n EC.element_to_be_clickable((By.CLASS_NAME, \"fa-file-excel-o\")))\n click_it(driver, excel_button)\n sleep(1)\n \n if len(driver.window_handles) > 10:\n # close first 9 windows\n for _ in range(9):\n driver.switch_to.window(driver.window_handles[0])\n driver.close()\n \n driver.switch_to.window(driver.window_handles[-1])\n sleep(1)\n\nNumber of files in the download directory: 302\nDownloading 301 Maharashtra Nagpur Ram Nagar, Nagpur - MPCB site_5793\nDownloading 302 Maharashtra Nagpur Mahal, Nagpur - MPCB site_5796\nDownloading 303 Maharashtra Nagpur Opp GPO Civil Lines, Nagpur - MPCB site_303\nDownloading 304 Maharashtra Nagpur Ambazari, Nagpur - MPCB site_5792\nDownloading 305 Maharashtra Nanded Sneh Nagar, Nanded - MPCB site_5795\nDownloading 306 Maharashtra Nashik Pandav Nagari, Nashik - MPCB site_5779\nDownloading 307 Maharashtra Nashik MIDC Ambad, Nashik - MPCB site_5781\nDownloading 308 Maharashtra Nashik Gangapur Road, Nashik - MPCB site_304\nDownloading 309 Maharashtra Nashik Hirawadi, Nashik - MPCB site_5782\nDownloading 310 Maharashtra Navi Mumbai Tondare-Taloja, Navi Mumbai - MPCB site_5803\nDownloading 311 Maharashtra Navi Mumbai Sanpada, Navi Mumbai - MPCB site_5815\nDownloading 312 Maharashtra Navi Mumbai Airoli, Navi Mumbai - MPCB site_261\nDownloading 313 Maharashtra Navi 
Mumbai Mahape, Navi Mumbai - MPCB site_5114\nDownloading 314 Maharashtra Navi Mumbai Kopripada-Vashi, Navi Mumbai - MPCB site_5805\nDownloading 315 Maharashtra Navi Mumbai Sector-19A Nerul, Navi Mumbai - IITM site_5401\nDownloading 316 Maharashtra Navi Mumbai Nerul, Navi Mumbai - MPCB site_5103\nDownloading 317 Maharashtra Navi Mumbai Sector-2E Kalamboli, Navi Mumbai - MPCB site_5799\nDownloading 318 Maharashtra Parbhani Masoom Colony, Parbhani - MPCB site_5794\nDownloading 319 Maharashtra Pimpri-Chinchwad Park Street Wakad, Pimpri Chinchwad - MPCB site_5764\nDownloading 320 Maharashtra Pimpri-Chinchwad Savta Mali Nagar, Pimpri-Chinchwad - IITM site_5998\nDownloading 321 Maharashtra Pimpri-Chinchwad Thergaon, Pimpri Chinchwad - MPCB site_5765\nDownloading 322 Maharashtra Pimpri-Chinchwad Gavalinagar, Pimpri Chinchwad - MPCB site_5763\nDownloading 323 Maharashtra Pune Revenue Colony-Shivajinagar, Pune - IITM site_5409\nDownloading 324 Maharashtra Pune Mhada Colony, Pune - IITM site_5404\nDownloading 325 Maharashtra Pune Savitribai Phule Pune University, Pune - MPCB site_5767\nDownloading 326 Maharashtra Pune Bhumkar Nagar, Pune - IITM site_5988\nDownloading 327 Maharashtra Pune Hadapsar, Pune - IITM site_5407\nDownloading 328 Maharashtra Pune Karve Road, Pune - MPCB site_292\nDownloading 329 Maharashtra Pune Alandi, Pune - IITM site_5405"
  },
  {
    "objectID": "posts/2021-10-26-anonymization-tips.html",
    "href": "posts/2021-10-26-anonymization-tips.html",
    "title": "Anonymization tips for double-blind submission",
    "section": "",
    "text": "Use the following command locally to search for author names, institute names, and other terms you think may violate double-blind\n\ngit grep <query>\n\nThe above command matches the query everywhere and is thus a safe way. 
Avoid GitHub search for this purpose: it often fails to match some terms, and there is no regex support there (yet)!\n\n\nDo not use full paths inside the README file. If you move the content to another repo, such links will either break or violate double-blind. So follow the example below.\n\nBad practice: [link](https://github.com/patel-zeel/reponame/blob/master/dataset)\nGood practice: [link](dataset)\n\nPoint no. 2 does not work for GitHub Pages links (username.github.io/stuff), so keep in mind to update those manually (if you have a better idea, let everyone know in the comments below).\nDownload the repo zip locally and create an anonymized repository in your anonymized GitHub account. Open the GitHub web editor by pressing “.” (dot) at the repo homepage.\nNow, you can select and drag all folders to the left pane of the web editor to upload them at once. Finally, commit with a meaningful message and the changes will automatically be uploaded to the main branch of your anonymized repo.\nUpdate the link in your manuscript and submit!\n\n\nEdit:\nAfter acceptance, transfer the ownership of the repository to your personal account and then remove the anonymized account's access from the personal account. This will remove all traces of the repository from the anonymized account. However, the repository will still show that the commits were made by the anonymized account, which is not an explicit violation of double-blind anyway."
  },
  {
    "objectID": "posts/CNPs_for_Images.html",
    "href": "posts/CNPs_for_Images.html",
    "title": "Conditional Neural Processes for Image Interpolation",
    "section": "",
+ "text": "import os\nos.environ[\"CUDA_VISIBLE_DEVICES\"] = \"0\"\n# turn off preallocation by JAX\nos.environ[\"XLA_PYTHON_CLIENT_PREALLOCATE\"] = \"false\"\n\nimport numpy as np\nimport pandas as pd\n\nfrom tqdm import tqdm\nimport jax\nimport jax.numpy as jnp\nimport flax.linen as nn\n\nimport distrax as dx\n\nimport optax\n\n# load mnist dataset from tensorflow datasets\nimport tensorflow_datasets as tfds\n\nfrom sklearn.model_selection import train_test_split\n\nimport matplotlib.pyplot as plt\n# define initializers\ndef first_layer_init(key, shape, dtype=jnp.float32):\n num_input = shape[0] # reverse order compared to torch\n return jax.random.uniform(key, shape, dtype, minval=-1.0/num_input, maxval=1.0/num_input)\n\ndef other_layers_init(key, shape, dtype=jnp.float32):\n num_input = shape[0] # reverse order compared to torch\n return jax.random.uniform(key, shape, dtype, minval=-np.sqrt(6 / num_input)/30, maxval=np.sqrt(6 / num_input)/30)\n\nclass Encoder(nn.Module):\n features: list\n encoding_dims: int\n\n @nn.compact\n def __call__(self, x_context, y_context):\n x = jnp.hstack([x_context, y_context.reshape(x_context.shape[0], -1)])\n \n x = nn.Dense(self.features[0], kernel_init=first_layer_init, bias_init=first_layer_init)(x)\n x = jnp.sin(30*x)\n # x = nn.Dense(self.features[0])(x)\n # x = nn.relu(x)\n \n \n for n_features in self.features[1:]:\n x = nn.Dense(n_features, kernel_init=other_layers_init, bias_init=other_layers_init)(x)\n x = jnp.sin(30*x)\n # x = nn.Dense(n_features)(x)\n # x = nn.relu(x)\n\n x = nn.Dense(self.encoding_dims)(x)\n\n representation = x.mean(axis=0, keepdims=True) # option 1\n return representation # (1, encoding_dims)\n\nclass Decoder(nn.Module):\n features: list\n output_dim: int\n\n @nn.compact\n def __call__(self, representation, x):\n representation = jnp.repeat(representation, x.shape[0], axis=0)\n x = jnp.hstack([representation, x])\n \n x = nn.Dense(self.features[0], kernel_init=first_layer_init, 
bias_init=first_layer_init)(x)\n x = jnp.sin(30*x)\n # x = nn.Dense(self.features[0])(x)\n # x = nn.relu(x)\n\n for n_features in self.features:\n x = nn.Dense(n_features, kernel_init=other_layers_init, bias_init=other_layers_init)(x)\n x = jnp.sin(30*x)\n # x = nn.Dense(n_features)(x)\n # x = nn.relu(x)\n\n x = nn.Dense(self.output_dim*2)(x)\n loc, raw_scale = x[:, :self.output_dim], x[:, self.output_dim:]\n scale = jnp.exp(raw_scale)\n \n return loc, scale\n\nclass CNP(nn.Module):\n encoder_features: list\n encoding_dims: int\n decoder_features: list\n output_dim: int\n\n @nn.compact\n def __call__(self, x_content, y_context, x_target):\n representation = Encoder(self.encoder_features, self.encoding_dims)(x_content, y_context)\n loc, scale = Decoder(self.decoder_features, self.output_dim)(representation, x_target)\n return loc, scale\n\n def loss_fn(self, params, x_context, y_context, x_target, y_target):\n loc, scale = self.apply(params, x_context, y_context, x_target)\n predictive_distribution = dx.MultivariateNormalDiag(loc=loc, scale_diag=0.005+scale)\n return -predictive_distribution.log_prob(y_target)" + }, + { + "objectID": "posts/CNPs_for_Images.html#load-mnist", + "href": "posts/CNPs_for_Images.html#load-mnist", + "title": "Conditional Neural Processes for Image Interpolation", + "section": "Load MNIST", + "text": "Load MNIST\n\nds = tfds.load('mnist')\n\n\ndef dataset_to_arrays(dataset):\n data = []\n labels = []\n stopper = 0\n end = 100\n for sample in dataset:\n data.append(sample[\"image\"].numpy())\n labels.append(sample[\"label\"].numpy())\n stopper += 1\n if stopper == end:\n break\n return np.array(data), np.array(labels)[..., None]\n\ntrain_data, train_labels = dataset_to_arrays(ds[\"train\"])\ntest_data, test_labels = dataset_to_arrays(ds[\"test\"])\n\ntrain_data.shape, train_labels.shape, test_data.shape, test_labels.shape\n\n2023-06-02 09:58:48.609001: W tensorflow/core/kernels/data/cache_dataset_ops.cc:856] The calling iterator did not 
fully read the dataset being cached. In order to avoid unexpected truncation of the dataset, the partially cached contents of the dataset will be discarded. This can happen if you have an input pipeline similar to `dataset.cache().take(k).repeat()`. You should use `dataset.take(k).cache().repeat()` instead.\n2023-06-02 09:58:48.681190: W tensorflow/core/kernels/data/cache_dataset_ops.cc:856] The calling iterator did not fully read the dataset being cached. In order to avoid unexpected truncation of the dataset, the partially cached contents of the dataset will be discarded. This can happen if you have an input pipeline similar to `dataset.cache().take(k).repeat()`. You should use `dataset.take(k).cache().repeat()` instead.\n\n\n((100, 28, 28, 1), (100, 1), (100, 28, 28, 1), (100, 1))\n\n\n\ncoords = np.linspace(-1, 1, 28)\nx, y = np.meshgrid(coords, coords)\ntrain_X = jnp.stack([x, y], axis=-1).reshape(-1, 2)\n\ntrain_y = jax.vmap(lambda x: x.reshape(-1, 1))(train_data) / 255.0\ntrain_X.shape, train_y.shape, type(train_X), type(train_y)\n\n((784, 2),\n (100, 784, 1),\n jaxlib.xla_extension.ArrayImpl,\n jaxlib.xla_extension.ArrayImpl)\n\n\n\niterations = 10000\n\ndef loss_fn(params, context_X, context_y, target_X, target_y):\n def loss_fn_per_sample(context_X, context_y, target_X, target_y):\n loc, scale = model.apply(params, context_X, context_y, target_X)\n # predictive_distribution = dx.MultivariateNormalDiag(loc=loc, scale_diag=scale)\n # return -predictive_distribution.log_prob(target_y)\n return jnp.square(loc.ravel() - target_y.ravel()).mean()\n \n return jax.vmap(loss_fn_per_sample, in_axes=(None, 0, None, 0))(context_X, context_y, target_X, target_y).mean()\n\nvalue_and_grad_fn = jax.jit(jax.value_and_grad(loss_fn))\nmodel = CNP([256]*2, 128, [256]*4, 1)\nparams = model.init(jax.random.PRNGKey(0), train_X, train_y[0], train_X)\noptimizer = optax.adam(1e-5)\nstate = optimizer.init(params)\n\n# losses = []\n# for iter in tqdm(range(iterations)):\n# tmp_index 
= jax.random.permutation(jax.random.PRNGKey(iter), index)\n# context_X = train_X[tmp_index][:int(train_X.shape[0]*0.05)]\n# context_y = train_y[:, tmp_index, :][:, :int(train_X.shape[0]*0.05), :]\n# target_X = train_X[tmp_index][int(train_X.shape[0]*0.05):]\n# target_y = train_y[:, tmp_index, :][:, int(train_X.shape[0]*0.05):, :]\n \n# # print(context_X.shape, context_y.shape, target_X.shape, target_y.shape)\n# # print(loss_fn(params, context_X, context_y, target_X, target_y).shape)\n \n# loss, grads = value_and_grad_fn(params, context_X, context_y, target_X, target_y)\n# updates, state = optimizer.update(grads, state)\n# params = optax.apply_updates(params, updates)\n# losses.append(loss.item())\n\ndef one_step(params_and_state, key):\n params, state = params_and_state\n tmp_index = jax.random.permutation(key, train_X.shape[0])\n context_X = train_X[tmp_index][:int(train_X.shape[0]*0.05)]\n context_y = train_y[:, tmp_index, :][:, :int(train_X.shape[0]*0.05), :]\n target_X = train_X[tmp_index][int(train_X.shape[0]*0.05):]\n target_y = train_y[:, tmp_index, :][:, int(train_X.shape[0]*0.05):, :]\n loss, grads = value_and_grad_fn(params, context_X, context_y, target_X, target_y)\n updates, state = optimizer.update(grads, state)\n params = optax.apply_updates(params, updates)\n return (params, state), loss\n\n(params, state), loss_history = jax.lax.scan(one_step, (params, state), jax.random.split(jax.random.PRNGKey(0), iterations))\n\n\nplt.plot(loss_history[10:]);\n\n\n\n\n\n\n\n\n\ntest_key = jax.random.PRNGKey(0)\ntmp_index = jax.random.permutation(test_key, train_X.shape[0])\ncontext_X = train_X[tmp_index][:int(train_X.shape[0]*0.5)]\ncontext_y = train_y[:, tmp_index, :][:, :int(train_X.shape[0]*0.5), :]\ntarget_X = train_X#[tmp_index][int(train_X.shape[0]*0.5):]\ntarget_y = train_y#[:, tmp_index, :][:, int(train_X.shape[0]*0.5):, :]\n\nid = 91\nplt.imshow(train_y[id].reshape(28, 28), cmap=\"gray\", interpolation=None);\n\nlocs, scales = jax.vmap(model.apply, 
in_axes=(None, None, 0, None))(params, context_X, context_y, target_X)\n# full_preds = jnp.concatenate([context_y, locs], axis=1)\n# full_preds = full_preds.at[:, tmp_index, :].set(full_preds).__array__()\n\nplt.figure()\nplt.imshow(locs[id].reshape(28, 28), cmap=\"gray\", interpolation=None);" }, { "objectID": "posts/Rank1_GPs.html", @@ -602,67 +504,88 @@ "text": "from tqdm import tqdm\n\nimport torch\nimport torch.nn as nn\nimport torch.distributions as dist\nimport torch.nn.functional as F\nimport matplotlib.pyplot as plt\nimport seaborn as sns\nfrom gpytorch.kernels import RBFKernel, Kernel\n\n\nclass Rank1Kernel(nn.Module):\n def __init__(self, input_dim, output_dim, n_neurons_per_layer, activation):\n super().__init__()\n self.init = nn.Linear(input_dim, n_neurons_per_layer[0])\n self.n_neurons_per_layer = n_neurons_per_layer\n self.activation = activation\n \n for i in range(1, len(n_neurons_per_layer)):\n setattr(self, f'fc{i}', nn.Linear(n_neurons_per_layer[i-1], n_neurons_per_layer[i]))\n \n self.out = nn.Linear(n_neurons_per_layer[-1], output_dim)\n \n def forward(self, x1, x2):\n def _forward(x):\n x = self.init(x)\n for i in range(1, len(self.n_neurons_per_layer)):\n x = getattr(self, f'fc{i}')(x)\n x = self.activation(x)\n return self.out(x)\n \n x1 = _forward(x1)\n x2 = _forward(x2)\n \n # print(x1.shape, x2.shape)\n covar = x1 @ x2.T\n # print(covar.shape, gt_covar.shape, x1.shape, x2.shape)\n return covar\n\n\nfixed_kernel = RBFKernel()\nfixed_kernel.lengthscale = 0.3\n\nX1 = torch.linspace(-1, 1, 100).view(-1, 1)\n\n\nepochs = 1000\nn_neurons_per_layer = [64]*4\noutput_dim = 10\nkernel = Rank1Kernel(1, output_dim, n_neurons_per_layer, torch.sin)\noptimizer = torch.optim.Adam(kernel.parameters(), lr=0.001)\n\nlosses = []\nwith torch.no_grad():\n gt_covar = fixed_kernel(X1, X1).evaluate_kernel().tensor\n \nbar = tqdm(range(epochs))\nfor epoch in bar:\n optimizer.zero_grad()\n pred_covar = kernel(X1, X1)\n loss = torch.mean((gt_covar - 
pred_covar)**2)\n losses.append(loss.item())\n loss.backward()\n optimizer.step()\n bar.set_description(f\"Loss: {loss.item():.4f}\")\n\nLoss: 0.0001: 100%|██████████| 1000/1000 [00:06<00:00, 150.34it/s]\n\n\n\nplt.plot(losses);\n\n\n\n\n\n\n\n\n\nfig, ax = plt.subplots(1,2,figsize=(8, 3))\n\nsns.heatmap(gt_covar, ax=ax[0], cmap='RdYlGn_r', cbar=True, vmin=-2, vmax=2)\nax[0].set_title('Ground Truth Covariance')\n\nX_new = torch.linspace(-1.5, 1.5, 100).view(-1, 1)\nwith torch.no_grad():\n est_covar = kernel(X_new, X_new)\nsns.heatmap(est_covar, ax=ax[1], cmap='RdYlGn_r', cbar=True, vmin=-2, vmax=2)\nax[1].set_title('Estimated Covariance');\n\n\n\n\n\n\n\n\n\n# plt.plot()\n\nX2 = torch.zeros(1, 1) + 1\nwith torch.no_grad():\n variance = gt_covar[-1, :]\n plt.plot(X1, variance.numpy(), label=\"fixed kernel\");\n \n variance = kernel(X1, X2)\n plt.plot(X1, variance.numpy(), label=f\"rank-{output_dim} kernel\");\n \n plt.legend()\n\n\n\n\n\n\n\n\n\nprint(gt_covar.shape)\ntorch.random.manual_seed(2)\nnorm = dist.MultivariateNormal(torch.zeros(100), gt_covar + 1e-5 * torch.eye(100))\ny = norm.sample()\nplt.plot(X1, y);\n\ntorch.Size([100, 100])\n\n\n\n\n\n\n\n\n\n\nn = 6\n\nfig, ax = plt.subplots(1, n, figsize=(15, 2))\nd_x = X1\nd_y = y\nfor i in range(n):\n print(f\"{i}: {torch.var(d_y)}\")\n ax[i].plot(d_x, d_y)\n d_x = d_x[1:] - d_x[:-1]\n d_x = torch.cumsum(d_x, dim=0)\n d_y = d_y[1:] - d_y[:-1]\n \nf = lambda x: torch.zeros_like(x)\nax[-1].plot(d_x, f(d_x), c=\"r\", label=\"f(x)\")\n\n0: 0.5698477029800415\n1: 0.006691396702080965\n2: 0.0001796285796444863\n3: 0.00022799619182478637\n4: 0.0008216467685997486\n5: 0.00304242386482656" }, { - "objectID": "posts/PurpleAir.html", - "href": "posts/PurpleAir.html", - "title": "Download low-cost data from OpenAQ", + "objectID": "posts/py_over_ipynb.html", + "href": "posts/py_over_ipynb.html", + "title": "Why .py files are better than .ipynb files for ML codebase", "section": "", - "text": "import requests\nimport pandas as 
pd\nimport geopandas as gpd\nfrom shapely.geometry import Point\nfrom glob import glob\n\nfrom geopy.geocoders import Nominatim\n\nOpenAQ has an AWS S3 bucket that contains all of its data, available to download for free. This is a guide to downloading the data from that bucket. If you have enough space and bandwidth, aws s3 commands are the fastest way to download the data. If you don’t have enough space/bandwidth, or you want to download only specific data, then follow along.\nAcknowledgement: ChatGPT helped with some of the complex Linux commands.\nWe will mostly use the following commands:\naws s3 ls\naws s3 cp\nScenario 1: We want to download PurpleAir sensor data for Delhi for the entire year 2022. I am taking Delhi’s example since there are far fewer sensors in Delhi than in the US, so it will be easier for this demo.\nSome statistics that I have calculated for PurpleAir sensors in the US are as follows:\n\n\n\nCountry\nNumber of sensors\nTotal size\nYears\n\n\n\n\nUSA\n22497\n90.945 GB\n2018, 2019, 2020, 2021, 2022, 2023\n\n\n\nLet’s see how to calculate these statistics for India.\n\n!aws s3 --no-sign-request ls s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/ > /tmp/in.txt\n\nThe output of the above command contains all sensor IDs within India. These sensor IDs are assigned by OpenAQ and are different from the PurpleAir “sensor_index”. 
The output looks like the following.\n\nwith open('/tmp/in.txt', 'r') as f:\n    lines = [line.strip() for line in f.readlines()]\n    \nprint(f\"Number of locations = {len(lines)}\\n\")\nprint(*lines[:3], sep='\\n')\n\nNumber of locations = 623\n\nPRE locationid=160485/\nPRE locationid=218334/\nPRE locationid=218336/\n\n\nCounting the total size of the files:\n\n!aws s3 --no-sign-request ls s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/ --recursive > /tmp/in_files.txt\n\n\n!awk '{ sum += $3 } END { print (sum/1024/1024/1024) \" GB\"}' /tmp/in_files.txt\n\n1.20947 GB\n\n\nNumber of years:\n\n!cat /tmp/in_files.txt | grep -oP 'year=\\K\\w+' | sort | uniq\n\n2019\n2020\n2021\n2022\n2023\n\n\nLet’s find out which sensors among these belong to Delhi.\n\n!cat /tmp/in.txt | grep -oP 'locationid=\\K\\w+' | sort | uniq > /tmp/in_locations.txt\n\nNow we use the OpenAQ REST API. It has a rate limit of 300 requests per 5 minutes; once the limit is exceeded, it returns an error.\n\nurl = 'https://api.openaq.org/v2/locations'\nparams = {\n 'limit': 1000,\n 'country_id': 'IN',\n \"modelName\": \"PurpleAir Sensor\"\n}\n\n# Request headers\nheaders = {\n 'accept': 'application/json'\n}\n\nresponse = requests.get(url, params=params, headers=headers)\n\nif response.status_code == 200:\n    data = response.json()\nelse:\n    print(response.status_code, response.reason)\n\n500 Internal Server Error\n\n\n\ndf = pd.DataFrame(data['results'])\ndf.shape\n\n\n---------------------------------------------------------------------------\nNameError Traceback (most recent call last)\nCell In[10], line 1\n----> 1 df = pd.DataFrame(data['results'])\n 2 df.shape\n\nNameError: name 'data' is not defined\n\n\n\nThe following method works but takes too much time since it queries an online service:\n\ndef get_city_name(coords):\n    latitude = coords[\"latitude\"]\n    longitude = coords[\"longitude\"]\n    geolocator = Nominatim(user_agent=\"myGeocoder\")  # Replace \"myGeocoder\" with your desired user agent\n    location = 
geolocator.reverse((latitude, longitude), exactly_one=True)\n\n if location:\n address = location.raw.get('address', {})\n city = address.get('city', '')\n return city\n\n return None\n\ndf[\"city\"] = df[\"coordinates\"].apply(get_city_name)\ndf\n\n\n\n\n\n\n\n\nid\ncity\nname\nentity\ncountry\nsources\nisMobile\nisAnalysis\nparameters\nsensorType\ncoordinates\nlastUpdated\nfirstUpdated\nmeasurements\nbounds\nmanufacturers\n\n\n\n\n0\n318146\nGangtok\nNASA_AQCS_201_cpa\nNone\nIN\nNone\nFalse\nNone\n[{'id': 135, 'unit': 'particles/cm³', 'count':...\nNone\n{'latitude': 27.31013, 'longitude': 88.59687}\n2023-07-26T02:34:05+00:00\n2022-04-23T07:42:24+00:00\n1168071\n[88.59687, 27.31013, 88.59687, 27.31013]\n[{'modelName': 'PurpleAir Sensor', 'manufactur...\n\n\n1\n220706\nGangtok\nNASA_AQCS_139\nNone\nIN\nNone\nFalse\nNone\n[{'id': 100, 'unit': 'c', 'count': 28600, 'ave...\nNone\n{'latitude': 27.310116, 'longitude': 88.59682}\n2023-07-26T02:34:04+00:00\n2021-02-17T09:56:06+00:00\n2205110\n[88.59682, 27.310116, 88.59682, 27.310116]\n[{'modelName': 'PurpleAir Sensor', 'manufactur...\n\n\n2\n66673\nHisar\nNASA_AQCS_160\nNone\nIN\nNone\nFalse\nNone\n[{'id': 132, 'unit': 'mb', 'count': 28651, 'av...\nNone\n{'latitude': 29.146254, 'longitude': 75.72236}\n2023-07-26T02:33:56+00:00\n2021-01-19T23:59:16+00:00\n2396934\n[75.72236, 29.146254, 75.72236, 29.146254]\n[{'modelName': 'PurpleAir Sensor', 'manufactur...\n\n\n3\n72977\nBengaluru\nUT Sensor 101\nNone\nIN\nNone\nFalse\nNone\n[{'id': 126, 'unit': 'particles/cm³', 'count':...\nNone\n{'latitude': 13.045313, 'longitude': 77.573395}\n2023-07-26T02:33:55+00:00\n2021-01-14T01:18:23+00:00\n2498544\n[77.573395, 13.045313, 77.573395, 13.045313]\n[{'modelName': 'PurpleAir Sensor', 'manufactur...\n\n\n4\n235916\nBengaluru\nUW Sensor 311\nNone\nIN\nNone\nFalse\nNone\n[{'id': 130, 'unit': 'particles/cm³', 'count':...\nNone\n{'latitude': 13.048528, 'longitude': 
77.582275}\n2023-07-26T02:33:49+00:00\n2021-09-16T12:29:52+00:00\n1773924\n[77.582275, 13.048528, 77.582275, 13.048528]\n[{'modelName': 'PurpleAir Sensor', 'manufactur...\n\n\n...\n...\n...\n...\n...\n...\n...\n...\n...\n...\n...\n...\n...\n...\n...\n...\n\n\n604\n73922\nNew Delhi District\nUS Embassy A\nNone\nIN\nNone\nFalse\nNone\n[{'id': 2, 'unit': 'µg/m³', 'count': 347, 'ave...\nNone\n{'latitude': 28.5979, 'longitude': 77.1847}\n2021-02-05T18:25:34+00:00\n2021-01-08T12:12:29+00:00\n2082\n[77.1847, 28.5979, 77.1847, 28.5979]\n[{'modelName': 'PurpleAir Sensor', 'manufactur...\n\n\n605\n73924\nNew Delhi District\nUS Embassy B\nNone\nIN\nNone\nFalse\nNone\n[{'id': 2, 'unit': 'µg/m³', 'count': 347, 'ave...\nNone\n{'latitude': 28.5982, 'longitude': 77.1837}\n2021-02-05T18:23:59+00:00\n2021-01-08T12:11:29+00:00\n2082\n[77.1837, 28.5982, 77.1837, 28.5982]\n[{'modelName': 'PurpleAir Sensor', 'manufactur...\n\n\n606\n219370\nBhubaneswar Municipal Corporation\nBhubaneswar India\nNone\nIN\nNone\nFalse\nNone\n[{'id': 2, 'unit': 'µg/m³', 'count': 180, 'ave...\nNone\n{'latitude': 20.2853, 'longitude': 85.7685}\n2021-02-03T10:18:41+00:00\n2021-02-03T03:02:39+00:00\n1080\n[85.7685, 20.2853, 85.7685, 20.2853]\n[{'modelName': 'PurpleAir Sensor', 'manufactur...\n\n\n607\n71465\nGurugram District\nNASA_AQCS_152\nNone\nIN\nNone\nFalse\nNone\n[{'id': 1, 'unit': 'µg/m³', 'count': 1, 'avera...\nNone\n{'latitude': 28.4522, 'longitude': 77.0949}\n2021-01-14T01:18:54+00:00\n2021-01-14T01:18:54+00:00\n6\n[77.0949, 28.4522, 77.0949, 28.4522]\n[{'modelName': 'PurpleAir Sensor', 'manufactur...\n\n\n608\n221974\nBengaluru\nUT sensor\nNone\nIN\nNone\nFalse\nNone\n[{'id': 130, 'unit': 'particles/cm³', 'count':...\nNone\n{'latitude': 13.0449, 'longitude': 77.5788}\n2019-12-17T10:32:35+00:00\n2019-12-17T10:32:35+00:00\n6\n[77.5788, 13.0449, 77.5788, 13.0449]\n[{'modelName': 'PurpleAir Sensor', 'manufactur...\n\n\n\n\n609 rows × 16 columns\n\n\n\nNow we use another approach, based on a shapefile, to do 
this:\n\n!wget --no-check-certificate \"https://groups.google.com/group/datameet/attach/29b74b1aef5f2f13/Delhi.zip?part=0.1\" -O /tmp/delhi.zip\n!unzip -o /tmp/delhi.zip -d /tmp/delhi\n\n--2023-07-26 08:37:34-- https://groups.google.com/group/datameet/attach/29b74b1aef5f2f13/Delhi.zip?part=0.1\nResolving groups.google.com (groups.google.com)... 216.239.32.177, 216.239.36.177, 216.239.38.177, ...\nConnecting to groups.google.com (groups.google.com)|216.239.32.177|:443... connected.\nHTTP request sent, awaiting response... 302 Moved Temporarily\nLocation: https://06895207363394598426.googlegroups.com/attach/29b74b1aef5f2f13/Delhi.zip?part=0.1&vt=ANaJVrEpUPnptnb4Y-J5gJRBVJ29K0pIGKzeBG7492Ume1tyn1MY5eTDbztxP0Hdbc7u8XhmH_GbemY_HD60x5OvDhr7M2ib1h8YfDmlNxFefazGPgmAUj0 [following]\n--2023-07-26 08:37:35-- https://06895207363394598426.googlegroups.com/attach/29b74b1aef5f2f13/Delhi.zip?part=0.1&vt=ANaJVrEpUPnptnb4Y-J5gJRBVJ29K0pIGKzeBG7492Ume1tyn1MY5eTDbztxP0Hdbc7u8XhmH_GbemY_HD60x5OvDhr7M2ib1h8YfDmlNxFefazGPgmAUj0\nResolving 06895207363394598426.googlegroups.com (06895207363394598426.googlegroups.com)... 142.251.10.137, 2404:6800:4003:c0f::89\nConnecting to 06895207363394598426.googlegroups.com (06895207363394598426.googlegroups.com)|142.251.10.137|:443... connected.\nHTTP request sent, awaiting response... 
200 OK\nLength: unspecified [application/zip]\nSaving to: ‘/tmp/delhi.zip’\n\n/tmp/delhi.zip [ <=> ] 16.42K --.-KB/s in 0.04s \n\n2023-07-26 08:37:37 (429 KB/s) - ‘/tmp/delhi.zip’ saved [16812]\n\nArchive: /tmp/delhi.zip\n inflating: /tmp/delhi/Delhi.kml \n inflating: /tmp/delhi/Districts.dbf \n inflating: /tmp/delhi/Districts.prj \n inflating: /tmp/delhi/Districts.qpj \n inflating: /tmp/delhi/Districts.shp \n inflating: /tmp/delhi/Districts.shx \n\n\n\ngdf = gpd.read_file('/tmp/delhi/Districts.shp')\ngdf.plot(color=\"none\", edgecolor=\"black\");\n\n\n\n\n\n\n\n\n\n# check if a point is within Delhi\ndef is_within_delhi(coords):\n    point = Point(coords[\"longitude\"], coords[\"latitude\"])\n    for i, row in gdf.iterrows():\n        if row.geometry.contains(point):\n            return True\n    return False\n\ndf[\"is_within_delhi\"] = df[\"coordinates\"].apply(is_within_delhi)\n\n\ndelhi_df = df[df[\"is_within_delhi\"]]\ndelhi_df.shape\n\n(311, 17)\n\n\n\ndelhi_df.city.value_counts()\n\n 194\nNew Delhi District 112\nDwarka 3\nGhaziabad 2\nName: city, dtype: int64\n\n\nIt seems many points were not resolved by the online geopy geocoder.\nNow we know that 311 of the 623 sensors belong to Delhi. Let’s download the data for these sensors. 
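As an aside, `row.geometry.contains(point)` is at its core a point-in-polygon test. A rough pure-Python sketch of the same idea via ray casting (not part of the original post; the `delhi_like` rectangle below is a made-up stand-in for the real district geometries):

```python
def point_in_polygon(x, y, poly):
    """Ray-casting test: count how often a ray going right from (x, y) crosses edges."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # Does this edge straddle the horizontal line through y?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Made-up rectangle roughly around Delhi, purely for illustration.
delhi_like = [(76.8, 28.4), (77.4, 28.4), (77.4, 28.9), (76.8, 28.9)]
print(point_in_polygon(77.1, 28.6, delhi_like))  # True
print(point_in_polygon(75.7, 29.1, delhi_like))  # False
```

In practice, shapely’s `contains` also handles holes, multi-polygons, and boundary edge cases, so the real check used in the post should be preferred.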
For illustration, I will download data for 3 sensors for year 2022 and month of Jan.\n\n# dump delhi_df.id to a file\ndelhi_df.id.to_csv('/tmp/delhi_locations.txt', index=False, header=False)\n!head -n3 /tmp/delhi_locations.txt\n\n274208\n221227\n273205\n\n\n\n!head -n3 /tmp/delhi_locations.txt > /tmp/delhi_locations_3.txt\n!while read -r sensor_id; do aws s3 --no-sign-request cp s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=$sensor_id/year=2022/month=01 /tmp/delhi_data --recursive; done < /tmp/delhi_locations_3.txt\n\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=274208/year=2022/month=01/location-274208-20220107.csv.gz to ../../../../tmp/delhi_data/location-274208-20220107.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=274208/year=2022/month=01/location-274208-20220120.csv.gz to ../../../../tmp/delhi_data/location-274208-20220120.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=274208/year=2022/month=01/location-274208-20220131.csv.gz to ../../../../tmp/delhi_data/location-274208-20220131.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=274208/year=2022/month=01/location-274208-20220122.csv.gz to ../../../../tmp/delhi_data/location-274208-20220122.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=274208/year=2022/month=01/location-274208-20220130.csv.gz to ../../../../tmp/delhi_data/location-274208-20220130.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=274208/year=2022/month=01/location-274208-20220129.csv.gz to ../../../../tmp/delhi_data/location-274208-20220129.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=274208/year=2022/month=01/location-274208-20220114.csv.gz to 
../../../../tmp/delhi_data/location-274208-20220114.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=274208/year=2022/month=01/location-274208-20220123.csv.gz to ../../../../tmp/delhi_data/location-274208-20220123.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=274208/year=2022/month=01/location-274208-20220113.csv.gz to ../../../../tmp/delhi_data/location-274208-20220113.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=274208/year=2022/month=01/location-274208-20220121.csv.gz to ../../../../tmp/delhi_data/location-274208-20220121.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=221227/year=2022/month=01/location-221227-20220107.csv.gz to ../../../../tmp/delhi_data/location-221227-20220107.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=221227/year=2022/month=01/location-221227-20220104.csv.gz to ../../../../tmp/delhi_data/location-221227-20220104.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=221227/year=2022/month=01/location-221227-20220115.csv.gz to ../../../../tmp/delhi_data/location-221227-20220115.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=221227/year=2022/month=01/location-221227-20220105.csv.gz to ../../../../tmp/delhi_data/location-221227-20220105.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=221227/year=2022/month=01/location-221227-20220108.csv.gz to ../../../../tmp/delhi_data/location-221227-20220108.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=221227/year=2022/month=01/location-221227-20220106.csv.gz to ../../../../tmp/delhi_data/location-221227-20220106.csv.gz\ndownload: 
s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=221227/year=2022/month=01/location-221227-20220102.csv.gz to ../../../../tmp/delhi_data/location-221227-20220102.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=221227/year=2022/month=01/location-221227-20220117.csv.gz to ../../../../tmp/delhi_data/location-221227-20220117.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=221227/year=2022/month=01/location-221227-20220118.csv.gz to ../../../../tmp/delhi_data/location-221227-20220118.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=221227/year=2022/month=01/location-221227-20220110.csv.gz to ../../../../tmp/delhi_data/location-221227-20220110.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=221227/year=2022/month=01/location-221227-20220131.csv.gz to ../../../../tmp/delhi_data/location-221227-20220131.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=221227/year=2022/month=01/location-221227-20220113.csv.gz to ../../../../tmp/delhi_data/location-221227-20220113.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=221227/year=2022/month=01/location-221227-20220119.csv.gz to ../../../../tmp/delhi_data/location-221227-20220119.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=221227/year=2022/month=01/location-221227-20220121.csv.gz to ../../../../tmp/delhi_data/location-221227-20220121.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=221227/year=2022/month=01/location-221227-20220123.csv.gz to ../../../../tmp/delhi_data/location-221227-20220123.csv.gz\ndownload: 
s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=221227/year=2022/month=01/location-221227-20220116.csv.gz to ../../../../tmp/delhi_data/location-221227-20220116.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=221227/year=2022/month=01/location-221227-20220128.csv.gz to ../../../../tmp/delhi_data/location-221227-20220128.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=221227/year=2022/month=01/location-221227-20220122.csv.gz to ../../../../tmp/delhi_data/location-221227-20220122.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=221227/year=2022/month=01/location-221227-20220101.csv.gz to ../../../../tmp/delhi_data/location-221227-20220101.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=221227/year=2022/month=01/location-221227-20220111.csv.gz to ../../../../tmp/delhi_data/location-221227-20220111.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=221227/year=2022/month=01/location-221227-20220120.csv.gz to ../../../../tmp/delhi_data/location-221227-20220120.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=221227/year=2022/month=01/location-221227-20220109.csv.gz to ../../../../tmp/delhi_data/location-221227-20220109.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=221227/year=2022/month=01/location-221227-20220129.csv.gz to ../../../../tmp/delhi_data/location-221227-20220129.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=221227/year=2022/month=01/location-221227-20220103.csv.gz to ../../../../tmp/delhi_data/location-221227-20220103.csv.gz\ndownload: 
s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=221227/year=2022/month=01/location-221227-20220112.csv.gz to ../../../../tmp/delhi_data/location-221227-20220112.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=221227/year=2022/month=01/location-221227-20220114.csv.gz to ../../../../tmp/delhi_data/location-221227-20220114.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=221227/year=2022/month=01/location-221227-20220130.csv.gz to ../../../../tmp/delhi_data/location-221227-20220130.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=273205/year=2022/month=01/location-273205-20220120.csv.gz to ../../../../tmp/delhi_data/location-273205-20220120.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=273205/year=2022/month=01/location-273205-20220124.csv.gz to ../../../../tmp/delhi_data/location-273205-20220124.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=273205/year=2022/month=01/location-273205-20220107.csv.gz to ../../../../tmp/delhi_data/location-273205-20220107.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=273205/year=2022/month=01/location-273205-20220121.csv.gz to ../../../../tmp/delhi_data/location-273205-20220121.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=273205/year=2022/month=01/location-273205-20220126.csv.gz to ../../../../tmp/delhi_data/location-273205-20220126.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=273205/year=2022/month=01/location-273205-20220114.csv.gz to ../../../../tmp/delhi_data/location-273205-20220114.csv.gz\ndownload: 
s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=273205/year=2022/month=01/location-273205-20220129.csv.gz to ../../../../tmp/delhi_data/location-273205-20220129.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=273205/year=2022/month=01/location-273205-20220130.csv.gz to ../../../../tmp/delhi_data/location-273205-20220130.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=273205/year=2022/month=01/location-273205-20220113.csv.gz to ../../../../tmp/delhi_data/location-273205-20220113.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=273205/year=2022/month=01/location-273205-20220123.csv.gz to ../../../../tmp/delhi_data/location-273205-20220123.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=273205/year=2022/month=01/location-273205-20220122.csv.gz to ../../../../tmp/delhi_data/location-273205-20220122.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=273205/year=2022/month=01/location-273205-20220131.csv.gz to ../../../../tmp/delhi_data/location-273205-20220131.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=273205/year=2022/month=01/location-273205-20220128.csv.gz to ../../../../tmp/delhi_data/location-273205-20220128.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=273205/year=2022/month=01/location-273205-20220125.csv.gz to ../../../../tmp/delhi_data/location-273205-20220125.csv.gz\ndownload: s3://openaq-data-archive/records/csv.gz/provider=purpleair/country=in/locationid=273205/year=2022/month=01/location-273205-20220127.csv.gz to ../../../../tmp/delhi_data/location-273205-20220127.csv.gz\n\n\nVerify that we got all the sensor data we needed.\n\n!ls /tmp/delhi_data/location-221227* | wc 
-l\n!ls /tmp/delhi_data/location-274208* | wc -l\n!ls /tmp/delhi_data/location-273205* | wc -l\n\n27\n10\n15\n\n\n\nsensor_df = pd.read_csv('/tmp/delhi_data/location-274208-20220107.csv.gz')\nsensor_df.parameter.value_counts()\n\npm10 64\npm25 64\npm1 64\num010 64\num025 64\num100 64\nName: parameter, dtype: int64" + "text": "I have shifted from .ipynb files to .py files (and Jupyter to VS code) in the last couple of months. Here are some reasons why I feel .py files are better than .ipynb files:" }, { - "objectID": "posts/2022-08-01-conditional_neural_processes.html", - "href": "posts/2022-08-01-conditional_neural_processes.html", - "title": "Conditional Neural Processes in JAX", - "section": "", - "text": "# Silence WARNING:root:The use of `check_types` is deprecated and does not have any effect.\n# https://github.com/tensorflow/probability/issues/1523\nimport logging\n\nlogger = logging.getLogger()\n\n\nclass CheckTypesFilter(logging.Filter):\n def filter(self, record):\n return \"check_types\" not in record.getMessage()\n\n\nlogger.addFilter(CheckTypesFilter())\n\nimport jax\nimport jax.numpy as jnp\nimport matplotlib.pyplot as plt\nfrom sklearn.model_selection import train_test_split\n\ntry:\n import flax.linen as nn\nexcept ModuleNotFoundError:\n %pip install flax\n import flax.linen as nn\n\ntry:\n import optax\nexcept ModuleNotFoundError:\n %pip install optax\n import optax\n\ntry:\n import tensorflow_probability.substrates.jax as tfp\nexcept ModuleNotFoundError:\n %pip install tensorflow-probability\n import tensorflow_probability.substrates.jax as tfp\ntfd = tfp.distributions" + "objectID": "posts/py_over_ipynb.html#fewer-errors", + "href": "posts/py_over_ipynb.html#fewer-errors", + "title": "Why .py files are better than .ipynb files for ML codebase", + "section": "Fewer Errors", + "text": "Fewer Errors\n\n.py files are easier to debug with a VS code like IDE, making it easier to find the errors.\nExecution of .py starts fresh, unlike some left out 
variables from previous runs or deleted cells silently carrying over, as happens in .ipynb files."
  },
  {
    "objectID": "posts/py_over_ipynb.html#better-usage-of-a-shared-server",
    "href": "posts/py_over_ipynb.html#better-usage-of-a-shared-server",
    "title": "Why .py files are better than .ipynb files 
for ML codebase",
    "section": "Better Usage of a Shared Server",
    "text": "Better Usage of a Shared Server\n\n.py files release the resources (e.g., GPU memory) once execution finishes. It is inconvenient to repeatedly remind someone, or be reminded, to release the resources manually from a Jupyter notebook."
  },
  {
    "objectID": "posts/py_over_ipynb.html#increased-productivity",
    "href": "posts/py_over_ipynb.html#increased-productivity",
    "title": "Why .py files are better than .ipynb files for ML codebase",
    "section": "Increased Productivity",
    "text": "Increased Productivity\n\nYou can make use of the fantastic auto-complete and syntax-highlighting extensions in VS Code to save a lot of time while working with .py files."
},
  {
    "objectID": "posts/py_over_ipynb.html#boost-collaboration",
    "href": "posts/py_over_ipynb.html#boost-collaboration",
    "title": "Why .py files are better than .ipynb files for ML codebase",
    "section": "Boost Collaboration",
    "text": "Boost Collaboration\n\n.py files do not take time to render on GitHub because they are just plain text files, unlike .ipynb files.\nIt is a lot easier to see the changes made by 
others in a .py file than a .ipynb file." }, { - "objectID": "posts/2022-08-01-conditional_neural_processes.html#predict", - "href": "posts/2022-08-01-conditional_neural_processes.html#predict", - "title": "Conditional Neural Processes in JAX", - "section": "Predict", - "text": "Predict\n\nloc, scale = model.apply(params, x, y, x_test)\nlower, upper = loc - 2*scale, loc + 2*scale\n\nplt.scatter(x, y, label='train', alpha=0.5)\nplt.scatter(x_test, y_test, label='test', alpha=0.5)\nplt.plot(x_test, loc);\nplt.fill_between(x_test.flatten(), lower, upper, alpha=0.4);\nplt.ylim(-5, 5);" + "objectID": "posts/py_over_ipynb.html#increased-modularity", + "href": "posts/py_over_ipynb.html#increased-modularity", + "title": "Why .py files are better than .ipynb files for ML codebase", + "section": "Increased Modularity", + "text": "Increased Modularity\n\nFunction and Class calls from other files are seamless with .py files.\n\nFeel free to comment your views/suggestions/additions in the comment box." 
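To make the modularity point concrete, here is a small self-contained sketch (the `utils.py`/`train.py` file names and the `rmse` helper are invented for this demo): it writes a throwaway module to disk and imports a function from it, exactly as you would between two .py files in a project.

```python
import pathlib
import subprocess
import sys
import tempfile

# A plain .py module and a script that imports from it; both are generated
# here only so the example is runnable in one block.
module_src = """\
def rmse(y_true, y_pred):
    n = len(y_true)
    return (sum((a - b) ** 2 for a, b in zip(y_true, y_pred)) / n) ** 0.5
"""
script_src = """\
from utils import rmse  # seamless: no notebook export step needed
print(rmse([1.0, 2.0], [1.0, 4.0]))
"""

with tempfile.TemporaryDirectory() as d:
    root = pathlib.Path(d)
    (root / "utils.py").write_text(module_src)
    (root / "train.py").write_text(script_src)
    # Run the script as a fresh process, the way .py workflows normally do.
    out = subprocess.run(
        [sys.executable, "train.py"], cwd=root, capture_output=True, text=True
    )
    print(out.stdout.strip())  # 1.4142135623730951
```

With .ipynb files, reusing `rmse` elsewhere would require an export step (or tooling like nbimporter); with .py files the import just works.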
},
  {
    "objectID": "posts/kl-divergence.html",
    "href": "posts/kl-divergence.html",
    "title": "KL divergence v/s cross-entropy",
    "section": "",
    "text": "In a classification problem, for a data-point \\(\\mathbf{x}_i\\), we have the true label \\(y_i\\) associated with it.\nLet us assume that we have three possible outcomes \\(\\{L1, L2, L3\\}\\) and that for the current \\(\\mathbf{x}_i\\), the corresponding \\(y_i\\) is \\(L2\\). 
Then Ground truth probability distribution is the following:\n\\[\np_G(y = L1) = 0\\\\\np_G(y = L2) = 1\\\\\np_G(y=L3) = 0\n\\]\nLet us assume that our classifier model Predicted the following distribution:\n\\[\np_P(y = L1) = 0.1\\\\\np_P(y = L2) = 0.8\\\\\np_P(y=L3) = 0.1\n\\]" }, { - "objectID": "posts/2024-12-27-download_caaqm_locations.html", - "href": "posts/2024-12-27-download_caaqm_locations.html", - "title": "Download CPCB CAAQM locations", + "objectID": "posts/kl-divergence.html#ground", + "href": "posts/kl-divergence.html#ground", + "title": "KL divergence v/s cross-entropy", "section": "", - "text": "try:\n import selenium\nexcept ModuleNotFoundError:\n %pip install selenium\n\nimport os\nimport re\nimport numpy as np\nimport pandas as pd\n\nfrom tqdm.notebook import tqdm, trange\nfrom time import sleep, time\nfrom selenium import webdriver\nfrom selenium.webdriver.support.ui import Select\nfrom selenium.webdriver.common.by import By\nfrom selenium.webdriver.support.ui import WebDriverWait\nfrom selenium.webdriver.support import expected_conditions as EC\n\n!rm log.txt\n\ndef print_it(*args, **kwargs):\n print(*args, **kwargs)\n with open('log.txt', 'a') as f:\n print(*args, **kwargs, file=f)\n\nglobal_init = time()\n\nrm: log.txt: No such file or directory\n\n\n\n# Set up WebDriver\nop = webdriver.ChromeOptions()\n\ndriver = webdriver.Chrome(options=op)\n\n# Navigate to the website and manually solve the CAPTCHA\ndriver.get(\"https://airquality.cpcb.gov.in/ccr/#/caaqm-dashboard-all/caaqm-landing\")\n\n\nManually solve captcha before moving on to the next cell..\n\n\n# leaflet-marker-icon custom-div-icon map_markers station_status_live leaflet-zoom-animated leaflet-interactive\nall_station_markers = driver.find_elements(By.CLASS_NAME, 'leaflet-marker-icon')\n\nall_stations_len = len(all_station_markers)\nprint(\"Total stations: \", all_stations_len)\n\nTotal stations: 558\n\n\n\ndef get_after(string, phrase):\n return string[string.index(phrase) + 
len(phrase):]\n\ndata = {}\nall_station_markers = driver.find_elements(By.CLASS_NAME, 'leaflet-marker-icon')\nmarker_id = 0\nprogress_bar = tqdm(total=all_stations_len, desc=\"Progress\")\nwhile marker_id < all_stations_len:\n try:\n marker = all_station_markers[marker_id]\n driver.execute_script(\"arguments[0].click();\", marker)\n WebDriverWait(driver, 10).until(EC.element_to_be_clickable((By.CLASS_NAME, 'close')))\n children = driver.find_elements(By.CLASS_NAME, \"col-md-12\")\n assert \"Station Name\" in children[3].text\n \n # parse it\n \n station, address, location = children[3].text.split('\\n')\n station = get_after(station, \"Station Name: \")\n address = get_after(address, \"Address: \")\n latitude, longitude = location.split(\",\")\n latitude = get_after(latitude, \"Latitude: \")\n longitude = get_after(longitude, \"Longitude: \")\n \n data[station] = {\"address\": address, \"latitude\": float(latitude), \"longitude\": float(longitude)}\n close = driver.find_element(By.CLASS_NAME, \"close\")\n close.click()\n sleep(0.5)\n marker_id += 1\n progress_bar.update(1)\n except Exception as e:\n driver.refresh()\n input(\"Please manually solve the Captcha\")\n all_station_markers = driver.find_elements(By.CLASS_NAME, 'leaflet-marker-icon')\n\n\n\n\n\n---------------------------------------------------------------------------\nKeyboardInterrupt Traceback (most recent call last)\nCell In[4], line 12\n 10 marker = all_station_markers[marker_id]\n 11 driver.execute_script(\"arguments[0].click();\", marker)\n---> 12 WebDriverWait(driver, 10).until(EC.element_to_be_clickable((By.CLASS_NAME, 'close')))\n 13 children = driver.find_elements(By.CLASS_NAME, \"col-md-12\")\n 14 assert \"Station Name\" in children[3].text\n\nFile /opt/miniconda3/lib/python3.12/site-packages/selenium/webdriver/support/wait.py:102, in WebDriverWait.until(self, method, message)\n 100 screen = getattr(exc, \"screen\", None)\n 101 stacktrace = getattr(exc, \"stacktrace\", None)\n--> 102 
time.sleep(self._poll)\n 103 if time.monotonic() > end_time:\n 104 break\n\nKeyboardInterrupt: \n\n\n\n\ndf = pd.DataFrame(data).T\ndf.index.name = \"station\"\ndf.head(2)\n\n\n\n\n\n\n\n\naddress\nlatitude\nlongitude\n\n\nstation\n\n\n\n\n\n\n\nSIDCO Kurichi, Coimbatore - TNPCB\nSIDCO Kurichi, Coimbatore, Tamil Nadu.\n10.942451\n76.978996\n\n\nMuradpur, Patna - BSPCB\nS K Memorial Hall Premises, Near Gandhi Maidan...\n25.619651\n85.147382\n\n\n\n\n\n\n\n\ndf.to_csv(\"station_data.csv\")" + "text": "In a classification problem, for a data-point \\(\\mathbf{x}_i\\), we have the true label \\(y_i\\) associated with it.\nLet us assume that we have three possible outcomes \\(\\{L1, L2, L3\\}\\) and for the current \\(\\mathbf{x}_i\\), the corresponding \\(y_i\\) is \\(L2\\). Then the ground-truth probability distribution is the following:\n\\[\np_G(y = L1) = 0\\\\\np_G(y = L2) = 1\\\\\np_G(y=L3) = 0\n\\]\nLet us assume that our classifier model predicted the following distribution:\n\\[\np_P(y = L1) = 0.1\\\\\np_P(y = L2) = 0.8\\\\\np_P(y=L3) = 0.1\n\\]" + }, + { + "objectID": "posts/kl-divergence.html#kl-divergence", + "href": "posts/kl-divergence.html#kl-divergence", + "title": "KL divergence v/s cross-entropy", + "section": "KL divergence", + "text": "KL divergence\nWe can use KL divergence to check how good our model is. The formula is:\n\\[\nD_{KL}(p_G\\;\\rVert\\;p_P) = \\sum_{y_i \\in \\{L1, L2, L3\\}} p_G(y_i)\\log\\frac{p_G(y_i)}{p_P(y_i)}\n\\]\nFor our example,\n\\[\nD_{KL}(p_G\\;\\rVert\\;p_P) = \\log\\frac{1}{0.8}\n\\]\nIt is evident that if \\(p_P(y = L2)\\) decreases from \\(0.8\\), \\(D_{KL}(p_G\\;\\rVert\\;p_P)\\) will increase and vice versa. Note that KL divergence is not symmetric, which means \\(D_{KL}(p_G\\;\\rVert\\;p_P) \\ne D_{KL}(p_P\\;\\rVert\\;p_G)\\)." 
+ }, + { + "objectID": "posts/kl-divergence.html#cross-entory", + "href": "posts/kl-divergence.html#cross-entory", + "title": "KL divergence v/s cross-entropy", + "section": "Cross-entropy", + "text": "Cross-entropy\nCross-entropy is another measure of distribution similarity. The formula is:\n\\[\nH(p_G, p_P) = \\sum_{y_i \\in \\{L1, L2, L3\\}} - p_G(y_i)\\log p_P(y_i)\n\\]\nFor our example:\n\\[\nH(p_G, p_P) = -\\log 0.8 = \\log \\frac{1}{0.8}\n\\]" + }, + { + "objectID": "posts/kl-divergence.html#kl-divergence-vs-cross-entropy", + "href": "posts/kl-divergence.html#kl-divergence-vs-cross-entropy", + "title": "KL divergence v/s cross-entropy", + "section": "KL divergence v/s cross-entropy", + "text": "KL divergence v/s cross-entropy\nThis shows that KL divergence and cross-entropy return the same value for a simple classification problem. Then why do we use cross-entropy as a loss function and not KL divergence?\nThat’s because KL divergence computes an additional constant term, the entropy of \\(p_G\\) (zero here, since the ground truth is one-hot), which adds no value to the minimization." 
}, { "objectID": "index.html", "href": "index.html", "title": "blog", "section": "", - "text": "Download CPCB CAAQM locations\n\n\n\n\n\n\nData\n\n\n\nDownload CPCB CAAQM locations using Selenium\n\n\n\n\n\nDec 27, 2024\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nDownload CPCB live data\n\n\n\n\n\n\nData\n\n\n\nDownload CPCB data with selenium\n\n\n\n\n\nDec 10, 2024\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nFoundation Models for Time Series Forecasting\n\n\n\n\n\n\nML\n\n\n\nExploring the foundation models for time series forecasting\n\n\n\n\n\nJul 6, 2024\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nBuilding GPT from scratch\n\n\n\n\n\n\nML\n\n\n\nBuilding GPT from scratch\n\n\n\n\n\nJul 1, 2024\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nFundamentals across ML domains\n\n\n\n\n\n\nML\n\n\n\nKnowledge transfer between ML domains\n\n\n\n\n\nJun 25, 2024\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nSeq to Seq\n\n\n\n\n\n\nML\n\n\n\nRational driven history of Seq to Seq models\n\n\n\n\n\nJun 24, 2024\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nWRF Tutorial\n\n\n\n\n\n\nML\n\n\n\nWRF end-to-end tutorial\n\n\n\n\n\nMar 4, 2024\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nLearnings from the Brick Kiln Project\n\n\n\n\n\n\nML\n\n\n\nLearnings from the Brick Kiln Project\n\n\n\n\n\nNov 28, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nData Handling for Large Scale ML\n\n\n\n\n\n\nML\n\n\n\nAn exploratory analysis of various dataset handling processes to optimize memory, diskspace and speed.\n\n\n\n\n\nSep 30, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nActive Learning with MNIST\n\n\n\n\n\n\nML\n\n\n\nActive Learning with MNIST\n\n\n\n\n\nSep 30, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nNon-Gaussian Likelihoods for MLPs\n\n\n\n\n\n\nML\n\n\n\nNon-Gaussian Likelihoods for MLPs\n\n\n\n\n\nSep 16, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nPruning vs Uncertainty\n\n\n\n\n\n\nML\n\n\n\nPruning vs Uncertainty\n\n\n\n\n\nSep 14, 2023\n\n\nZeel B 
Patel\n\n\n\n\n\n\n\n\n\n\n\n\nGoogle Air Quality Data\n\n\n\n\n\n\nNumPy, Mathematics\n\n\n\nGoogle Air Quality API\n\n\n\n\n\nAug 31, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nBayesian Basis Regression\n\n\n\n\n\n\nML\n\n\n\nBayesian Basis Regression\n\n\n\n\n\nAug 31, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nHow numpy handles day-to-day algebra?\n\n\n\n\n\n\nNumPy, Mathematics\n\n\n\nA deep dive into basic operations of numpy\n\n\n\n\n\nAug 26, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nCan Rank 1 GPs represent all GPs?\n\n\n\n\n\n\nML, GP\n\n\n\nA trial\n\n\n\n\n\nJul 31, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nDownload low-cost data from OpenAQ\n\n\n\n\n\n\nML, GP\n\n\n\nA guide to download low-cost sensor data from OpenAQ\n\n\n\n\n\nJul 26, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nMulti-class classification with Gaussian Processes\n\n\n\n\n\n\nML, GP\n\n\n\nMulti-class GP classification with different strategies\n\n\n\n\n\nJul 4, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nClimate Modeling with GPs\n\n\n\n\n\n\nML\n\n\n\nExploring the use of GPs for climate modeling\n\n\n\n\n\nJul 4, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nClimate Modeling with SIRENs\n\n\n\n\n\n\nML\n\n\n\nExploring the use of SIRENs for climate modeling\n\n\n\n\n\nJul 1, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nGNNs and GPs\n\n\n\n\n\n\nML\n\n\n\nExploring similarities between GNNs and GPs\n\n\n\n\n\nJun 23, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nBasis functions\n\n\n\n\n\n\nML\n\n\n\nExploring basis functions\n\n\n\n\n\nJun 12, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nGraph Neural Networks for Regression\n\n\n\n\n\n\nML\n\n\n\nChallenges in using GNNs for regression using various strategies\n\n\n\n\n\nJun 12, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nConditional Neural Processes for Image Interpolation\n\n\n\n\n\n\nML\n\n\n\nExtreme Image Interpolation with Conditional Neural processes\n\n\n\n\n\nMay 31, 
2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nPasswordless SSH setup for MacOS Hosts\n\n\n\n\n\n\nmacOS\n\n\n\nA tiny handbook to setup passwordless ssh in MacOS\n\n\n\n\n\nMay 14, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nSine Combination Networks\n\n\n\n\n\n\nML\n\n\n\nChallenges in fitting to a combination of sine waves\n\n\n\n\n\nApr 29, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nNeural Network Gaussian Process\n\n\n\n\n\n\nGP\n\n\nML\n\n\n\nExploring NTK kernels + GPJax with toy datasets\n\n\n\n\n\nMar 28, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nStochastic Variational Gaussian processes in JAX\n\n\n\n\n\n\nGP\n\n\n\nA practical implementation of Hensman et al. 2015 from scratch in JAX\n\n\n\n\n\nOct 31, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nMulti-Output Gaussian Processes\n\n\n\n\n\n\nML\n\n\n\nExploring MOGPs from scratch\n\n\n\n\n\nOct 27, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nGaussian Processes - A no-skip-math version\n\n\n\n\n\n\nML\n\n\n\nEnd-to-end math derivations for Gaussian process regression and classification\n\n\n\n\n\nOct 21, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nTrain NN with KFAC-Laplace in JAX\n\n\n\n\n\n\nML\n\n\n\nExploring KFAC-Laplace approximation on simple problems in JAX\n\n\n\n\n\nOct 18, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nConditional Neural Processes in JAX\n\n\n\n\n\n\nML\n\n\n\nImplementing conditional neural processes from scratch in JAX\n\n\n\n\n\nAug 1, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nJAX Optimizers\n\n\n\n\n\n\nML\n\n\n\nPros and cons of several jax optimizers.\n\n\n\n\n\nJun 10, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nGet a list of contributors from a repo\n\n\n\n\n\n\nGitHub\n\n\n\nGet contributors’ list using GitHub API and pandas\n\n\n\n\n\nMay 17, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nIteratively reweighted least squares (IRLS) logistic regression\n\n\n\n\n\n\nML\n\n\n\nImplementation of IRLS from Probabilistic ML book of Dr. 
Kevin Murphy and its comparison with naive second order implementation.\n\n\n\n\n\nMay 14, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nGcloud cheatsheet\n\n\n\n\n\n\nGcloud\n\n\n\nMost used commands while working with gcloud\n\n\n\n\n\nApr 9, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nGitHub Contrubuting FAQs\n\n\n\n\n\n\nGitHub\n\n\n\nThis is a collection of FAQs/road-blocks/queries/issues I had over the past 2 years of engagement with GitHub.\n\n\n\n\n\nApr 6, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nTorch essentials\n\n\n\n\n\n\nML\n\n\n\nPractical and direct introduction to PyTorch\n\n\n\n\n\nMar 8, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nProbabilistic Machine Learning\n\n\n\n\n\n\nML\n\n\n\nA video lecture series from Prof. Philipp Hennig\n\n\n\n\n\nMar 6, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nUncertainty in Deep Learning\n\n\n\n\n\n\nML\n\n\n\nReview of PhD thesis of Dr. Yarin Gal\n\n\n\n\n\nMar 5, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nPyTorch Tips\n\n\n\n\n\n\nML\n\n\n\nPyTorch zen tips\n\n\n\n\n\nFeb 25, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nConference Presentation Tips\n\n\n\n\n\n\nAcademic\n\n\n\nConference Presentation Tips\n\n\n\n\n\nJan 29, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nComparing Gaussian Process Regression Frameworks\n\n\n\n\n\n\nML\n\n\n\nA basic comparison among GPy, GPyTorch and TinyGP\n\n\n\n\n\nJan 25, 2022\n\n\nZeel B Patel, Harsh Patel, Shivam Sahni\n\n\n\n\n\n\n\n\n\n\n\n\nQuery by Committee\n\n\n\n\n\n\nML\n\n\n\nA programming introduction to QBC with Random Forest Classifier.\n\n\n\n\n\nJan 24, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nKL divergence v/s cross-entropy\n\n\n\n\n\n\nML\n\n\n\nUnderstanding KL divergence\n\n\n\n\n\nJan 20, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nWhy .py files are better than .ipynb files for ML codebase\n\n\n\n\n\n\nPython\n\n\n\nWhere .py files are better than .ipynb files?\n\n\n\n\n\nJan 15, 2022\n\n\nZeel B 
Patel\n\n\n\n\n\n\n\n\n\n\n\n\nAnonymization tips for double-blind submission\n\n\n\n\n\n\nAcademic\n\n\n\nA last-minute help list\n\n\n\n\n\nOct 26, 2021\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nInput Warped GPs - A failed idea\n\n\n\n\n\n\nML\n\n\n\nAn idea of input warping GPs\n\n\n\n\n\nOct 23, 2021\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nSparseGPs in Stheno\n\n\n\n\n\nA simple demo of sparse regression in stheno with VFE and FITC methods.\n\n\n\n\n\nOct 12, 2021\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nDocker Cheatsheet\n\n\n\n\n\n\nDocker\n\n\n\nMost used command while working with Docker\n\n\n\n\n\nSep 28, 2021\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nHow to apply constraint on parameters in various GP libraries\n\n\n\n\n\nApply constraints in GPy, GPFlow, GPyTorch\n\n\n\n\n\nSep 27, 2021\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nUnderstanding Kernels in Gaussian Processes\n\n\n\n\n\n\nML\n\n\n\nAn exploratory analysis of kernels in GPs\n\n\n\n\n\nMar 22, 2021\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nProgramatically download OpenAQ data\n\n\n\n\n\nProgramatically download OpenAQ data\n\n\n\n\n\nSep 21, 2020\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nActive Learning with Bayesian Linear Regression\n\n\n\n\n\n\nML\n\n\n\nA programming introduction to Active Learning with Bayesian Linear Regression.\n\n\n\n\n\nMar 28, 2020\n\n\nZeel B Patel, Nipun Batra\n\n\n\n\n\n\nNo matching items" + "text": "Object Detection - A how-to guide\n\n\n\n\n\n\nML\n\n\nCV\n\n\n\nBasic operations in object detection task\n\n\n\n\n\nDec 29, 2024\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nDownload CPCB CAAQM locations\n\n\n\n\n\n\nData\n\n\n\nDownload CPCB CAAQM locations using Selenium\n\n\n\n\n\nDec 27, 2024\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nDownload CPCB live data\n\n\n\n\n\n\nData\n\n\n\nDownload CPCB data with selenium\n\n\n\n\n\nDec 10, 2024\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nFundamentals across ML domains\n\n\n\n\n\n\nML\n\n\n\nKnowledge transfer 
between ML domains\n\n\n\n\n\nJun 25, 2024\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nObject Detection Random Baseline\n\n\n\n\n\n\nML\n\n\nCV\n\n\n\nCompare your performance with a random baseline.\n\n\n\n\n\nFeb 10, 2024\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nNon-Gaussian Likelihoods for MLPs\n\n\n\n\n\n\nML\n\n\n\nNon-Gaussian Likelihoods for MLPs\n\n\n\n\n\nSep 16, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nPruning vs Uncertainty\n\n\n\n\n\n\nML\n\n\n\nPruning vs Uncertainty\n\n\n\n\n\nSep 14, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nBayesian Basis Regression\n\n\n\n\n\n\nML\n\n\n\nBayesian Basis Regression\n\n\n\n\n\nAug 31, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nHow numpy handles day-to-day algebra?\n\n\n\n\n\n\nNumPy, Mathematics\n\n\n\nA deep dive into basic operations of numpy\n\n\n\n\n\nAug 26, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nCan Rank 1 GPs represent all GPs?\n\n\n\n\n\n\nML, GP\n\n\n\nA trial\n\n\n\n\n\nJul 31, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nDownload low-cost data from OpenAQ\n\n\n\n\n\n\nML, GP\n\n\n\nA guide to download low-cost sensor data from OpenAQ\n\n\n\n\n\nJul 26, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nClimate Modeling with GPs\n\n\n\n\n\n\nML\n\n\n\nExploring the use of GPs for climate modeling\n\n\n\n\n\nJul 4, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nMulti-class classification with Gaussian Processes\n\n\n\n\n\n\nML, GP\n\n\n\nMulti-class GP classification with different strategies\n\n\n\n\n\nJul 4, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nClimate Modeling with SIRENs\n\n\n\n\n\n\nML\n\n\n\nExploring the use of SIRENs for climate modeling\n\n\n\n\n\nJul 1, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nGNNs and GPs\n\n\n\n\n\n\nML\n\n\n\nExploring similarities between GNNs and GPs\n\n\n\n\n\nJun 23, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nGraph Neural Networks for Regression\n\n\n\n\n\n\nML\n\n\n\nChallenges in using GNNs for regression using various 
strategies\n\n\n\n\n\nJun 12, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nBasis functions\n\n\n\n\n\n\nML\n\n\n\nExploring basis functions\n\n\n\n\n\nJun 12, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nConditional Neural Processes for Image Interpolation\n\n\n\n\n\n\nML\n\n\n\nExtreme Image Interpolation with Conditional Neural processes\n\n\n\n\n\nMay 31, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nPasswordless SSH setup for MacOS Hosts\n\n\n\n\n\n\nmacOS\n\n\n\nA tiny handbook to setup passwordless ssh in MacOS\n\n\n\n\n\nMay 14, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nSine Combination Networks\n\n\n\n\n\n\nML\n\n\n\nChallenges in fitting to a combination of sine waves\n\n\n\n\n\nApr 29, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nNeural Network Gaussian Process\n\n\n\n\n\n\nGP\n\n\nML\n\n\n\nExploring NTK kernels + GPJax with toy datasets\n\n\n\n\n\nMar 28, 2023\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nStochastic Variational Gaussian processes in JAX\n\n\n\n\n\n\nGP\n\n\n\nA practical implementation of Hensman et al. 
2015 from scratch in JAX\n\n\n\n\n\nOct 31, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nMulti-Output Gaussian Processes\n\n\n\n\n\n\nML\n\n\n\nExploring MOGPs from scratch\n\n\n\n\n\nOct 27, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nGaussian Processes - A no-skip-math version\n\n\n\n\n\n\nML\n\n\n\nEnd-to-end math derivations for Gaussian process regression and classification\n\n\n\n\n\nOct 21, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nTrain NN with KFAC-Laplace in JAX\n\n\n\n\n\n\nML\n\n\n\nExploring KFAC-Laplace approximation on simple problems in JAX\n\n\n\n\n\nOct 18, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nConditional Neural Processes in JAX\n\n\n\n\n\n\nML\n\n\n\nImplementing conditional neural processes from scratch in JAX\n\n\n\n\n\nAug 1, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nJAX Optimizers\n\n\n\n\n\n\nML\n\n\n\nPros and cons of several jax optimizers.\n\n\n\n\n\nJun 10, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nGet a list of contributors from a repo\n\n\n\n\n\n\nGitHub\n\n\n\nGet contributors’ list using GitHub API and pandas\n\n\n\n\n\nMay 17, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nIteratively reweighted least squares (IRLS) logistic regression\n\n\n\n\n\n\nML\n\n\n\nImplementation of IRLS from Probabilistic ML book of Dr. 
Kevin Murphy and its comparison with naive second order implementation.\n\n\n\n\n\nMay 14, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nGcloud cheatsheet\n\n\n\n\n\n\nGcloud\n\n\n\nMost used commands while working with gcloud\n\n\n\n\n\nApr 9, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nGitHub Contrubuting FAQs\n\n\n\n\n\n\nGitHub\n\n\n\nThis is a collection of FAQs/road-blocks/queries/issues I had over the past 2 years of engagement with GitHub.\n\n\n\n\n\nApr 6, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nTorch essentials\n\n\n\n\n\n\nML\n\n\n\nPractical and direct introduction to PyTorch\n\n\n\n\n\nMar 8, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nProbabilistic Machine Learning\n\n\n\n\n\n\nML\n\n\n\nA video lecture series from Prof. Philipp Hennig\n\n\n\n\n\nMar 6, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nUncertainty in Deep Learning\n\n\n\n\n\n\nML\n\n\n\nReview of PhD thesis of Dr. Yarin Gal\n\n\n\n\n\nMar 5, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nPyTorch Tips\n\n\n\n\n\n\nML\n\n\n\nPyTorch zen tips\n\n\n\n\n\nFeb 25, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nConference Presentation Tips\n\n\n\n\n\n\nAcademic\n\n\n\nConference Presentation Tips\n\n\n\n\n\nJan 29, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nComparing Gaussian Process Regression Frameworks\n\n\n\n\n\n\nML\n\n\n\nA basic comparison among GPy, GPyTorch and TinyGP\n\n\n\n\n\nJan 25, 2022\n\n\nZeel B Patel, Harsh Patel, Shivam Sahni\n\n\n\n\n\n\n\n\n\n\n\n\nQuery by Committee\n\n\n\n\n\n\nML\n\n\n\nA programming introduction to QBC with Random Forest Classifier.\n\n\n\n\n\nJan 24, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nKL divergence v/s cross-entropy\n\n\n\n\n\n\nML\n\n\n\nUnderstanding KL divergence\n\n\n\n\n\nJan 20, 2022\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nWhy .py files are better than .ipynb files for ML codebase\n\n\n\n\n\n\nPython\n\n\n\nWhere .py files are better than .ipynb files?\n\n\n\n\n\nJan 15, 2022\n\n\nZeel B 
Patel\n\n\n\n\n\n\n\n\n\n\n\n\nAnonymization tips for double-blind submission\n\n\n\n\n\n\nAcademic\n\n\n\nA last-minute help list\n\n\n\n\n\nOct 26, 2021\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nInput Warped GPs - A failed idea\n\n\n\n\n\n\nML\n\n\n\nAn idea of input warping GPs\n\n\n\n\n\nOct 23, 2021\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nSparseGPs in Stheno\n\n\n\n\n\nA simple demo of sparse regression in stheno with VFE and FITC methods.\n\n\n\n\n\nOct 12, 2021\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nDocker Cheatsheet\n\n\n\n\n\n\nDocker\n\n\n\nMost used command while working with Docker\n\n\n\n\n\nSep 28, 2021\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nHow to apply constraint on parameters in various GP libraries\n\n\n\n\n\nApply constraints in GPy, GPFlow, GPyTorch\n\n\n\n\n\nSep 27, 2021\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nUnderstanding Kernels in Gaussian Processes\n\n\n\n\n\n\nML\n\n\n\nAn exploratory analysis of kernels in GPs\n\n\n\n\n\nMar 22, 2021\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nProgramatically download OpenAQ data\n\n\n\n\n\nProgramatically download OpenAQ data\n\n\n\n\n\nSep 21, 2020\n\n\nZeel B Patel\n\n\n\n\n\n\n\n\n\n\n\n\nActive Learning with Bayesian Linear Regression\n\n\n\n\n\n\nML\n\n\n\nA programming introduction to Active Learning with Bayesian Linear Regression.\n\n\n\n\n\nMar 28, 2020\n\n\nZeel B Patel, Nipun Batra\n\n\n\n\n\n\nNo matching items" }, { "objectID": "about.html", @@ -672,186 +595,284 @@ "text": "Hi, I am Zeel. This is my blog, where I add coding + other resources related to my research. Head over to this page for my personal website." 
}, { - "objectID": "posts/2022-01-24-query_by_committee.html", - "href": "posts/2022-01-24-query_by_committee.html", - "title": "Query by Committee", + "objectID": "posts/climate-modeling-with-SpecialGP.html", + "href": "posts/climate-modeling-with-SpecialGP.html", + "title": "Climate Modeling with GPs", "section": "", - "text": "# Common imports\nimport numpy as np\nimport matplotlib.pyplot as plt\nfrom matplotlib.animation import FuncAnimation\nfrom matplotlib import rc\n\nplt.style.use('fivethirtyeight')\nrc('animation', html='jshtml')\n\n# Copy the models\nfrom copy import deepcopy\n\n# Sklearn imports\nfrom sklearn.ensemble import RandomForestRegressor, RandomForestClassifier\nfrom sklearn.datasets import make_classification\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.metrics import accuracy_score, f1_score\n\n# Entropy function\nfrom scipy.stats import entropy\n\n# Progress helper\nfrom IPython.display import clear_output" + "text": "import os\nos.environ[\"CUDA_VISIBLE_DEVICES\"] = \"0\"\n\nimport pyproj\nimport numpy as np\nimport xarray as xr\n\nfrom skgpytorch.models import GPRegression\n\nimport matplotlib.pyplot as plt\n\n\n# def haversine(lon1, lat1, lon2, lat2):\n# \"\"\"\n# Calculate the great circle distance in kilometers between two points \n# on the earth (specified in decimal degrees)\n# \"\"\"\n# # convert decimal degrees to radians \n# lon1, lat1, lon2, lat2 = map(np.radians, [lon1, lat1, lon2, lat2])\n\n# # haversine formula \n# dlon = lon2 - lon1 \n# dlat = lat2 - lat1 \n# a = np.sin(dlat/2)**2 + np.cos(lat1) * np.cos(lat2) * np.sin(dlon/2)**2\n# c = 2 * np.arcsin(np.sqrt(a)) \n# r = 6371 # Radius of earth in kilometers. Use 3956 for miles. 
Determines return value units.\n# return c * r\n\n# def new_coords(lat1, long1):\n# new_lat1 = haversine(0, 0, 0, lat1)\n# new_long1 = haversine(0, 0, long1, 0)\n# return new_lat1, new_long1\n\ndef lat_long_to_cartesian(latitude, longitude):\n # Convert latitude and longitude to radians\n phi = np.radians(latitude)\n lam = np.radians(longitude)\n\n # Constants for WGS 84 ellipsoid\n a = 6378137.0 # equatorial radius in meters\n e = 0.0818191908426 # eccentricity\n\n # Calculate Earth's radius at the given latitude\n R = a / np.sqrt(1 - (e ** 2) * (np.sin(phi) ** 2))\n\n # Convert to Cartesian coordinates\n X = R * np.sin(lam)\n Y = R * np.tan(phi)\n\n return X, Y\n\ndef wgs84_coords(lat, lon): \n # Define coordinate systems\n wgs84 = pyproj.CRS.from_epsg(4326) # WGS 84 lat-long system\n utm_zone_32n = pyproj.CRS.from_string(\"+proj=utm +zone=32 +ellps=WGS84 +datum=WGS84 +units=m +no_defs\")\n\n # Create a transformer object\n transformer = pyproj.Transformer.from_crs(wgs84, utm_zone_32n)\n\n # Convert lat-long coordinates to UTM coordinates\n utm_easting, utm_northing = transformer.transform(lon, lat)\n\n return utm_northing, utm_easting\n\n# Copyright (c) Meta Platforms, Inc. 
and affiliates.\n# All rights reserved.\n\n# This source code is licensed under the license found in the\n# LICENSE file in the root directory of this source tree.\n# --------------------------------------------------------\n# Position embedding utils\n# --------------------------------------------------------\n\n\n# --------------------------------------------------------\n# 2D sine-cosine position embedding\n# References:\n# Transformer: https://github.com/tensorflow/models/blob/master/official/nlp/transformer/model_utils.py\n# MoCo v3: https://github.com/facebookresearch/moco-v3\n# --------------------------------------------------------\ndef get_2d_sincos_pos_embed(embed_dim, grid_size_h, grid_size_w, cls_token=False):\n \"\"\"\n grid_size: int of the grid height and width\n return:\n pos_embed: [grid_size*grid_size, embed_dim] or [1+grid_size*grid_size, embed_dim] (w/ or w/o cls_token)\n \"\"\"\n grid_h = np.arange(grid_size_h, dtype=np.float32)\n grid_w = np.arange(grid_size_w, dtype=np.float32)\n grid = np.meshgrid(grid_w, grid_h) # here w goes first\n grid = np.stack(grid, axis=0)\n\n grid = grid.reshape([2, 1, grid_size_h, grid_size_w])\n pos_embed = get_2d_sincos_pos_embed_from_grid(embed_dim, grid)\n if cls_token:\n pos_embed = np.concatenate([np.zeros([1, embed_dim]), pos_embed], axis=0)\n return pos_embed\n\n\ndef get_2d_sincos_pos_embed_from_grid(embed_dim, grid):\n assert embed_dim % 2 == 0\n\n # use half of dimensions to encode grid_h\n emb_h = get_1d_sincos_pos_embed_from_grid(embed_dim // 2, grid[0]) # (H*W, D/2)\n emb_w = get_1d_sincos_pos_embed_from_grid(embed_dim // 2, grid[1]) # (H*W, D/2)\n\n emb = np.concatenate([emb_h, emb_w], axis=1) # (H*W, D)\n return emb\n\n\ndef get_1d_sincos_pos_embed_from_grid(embed_dim, pos):\n \"\"\"\n embed_dim: output dimension for each position\n pos: a list of positions to be encoded: size (M,)\n out: (M, D)\n \"\"\"\n assert embed_dim % 2 == 0\n omega = np.arange(embed_dim // 2, dtype=np.float)\n omega /= 
embed_dim / 2.0\n omega = 1.0 / 10000**omega # (D/2,)\n\n pos = pos.reshape(-1) # (M,)\n out = np.einsum(\"m,d->md\", pos, omega) # (M, D/2), outer product\n\n emb_sin = np.sin(out) # (M, D/2)\n emb_cos = np.cos(out) # (M, D/2)\n\n emb = np.concatenate([emb_sin, emb_cos], axis=1) # (M, D)\n return emb\n\n\n# --------------------------------------------------------\n# Interpolate position embeddings for high-resolution\n# References:\n# DeiT: https://github.com/facebookresearch/deit\n# --------------------------------------------------------\ndef interpolate_pos_embed(model, checkpoint_model, new_size=(64, 128)):\n if \"net.pos_embed\" in checkpoint_model:\n pos_embed_checkpoint = checkpoint_model[\"net.pos_embed\"]\n embedding_size = pos_embed_checkpoint.shape[-1]\n orig_num_patches = pos_embed_checkpoint.shape[-2]\n patch_size = model.patch_size\n w_h_ratio = 2\n orig_h = int((orig_num_patches // w_h_ratio) ** 0.5)\n orig_w = w_h_ratio * orig_h\n orig_size = (orig_h, orig_w)\n new_size = (new_size[0] // patch_size, new_size[1] // patch_size)\n # print (orig_size)\n # print (new_size)\n if orig_size[0] != new_size[0]:\n print(\"Interpolate PEs from %dx%d to %dx%d\" % (orig_size[0], orig_size[1], new_size[0], new_size[1]))\n pos_tokens = pos_embed_checkpoint.reshape(-1, orig_size[0], orig_size[1], embedding_size).permute(\n 0, 3, 1, 2\n )\n new_pos_tokens = torch.nn.functional.interpolate(\n pos_tokens, size=(new_size[0], new_size[1]), mode=\"bicubic\", align_corners=False\n )\n new_pos_tokens = new_pos_tokens.permute(0, 2, 3, 1).flatten(1, 2)\n checkpoint_model[\"net.pos_embed\"] = new_pos_tokens\n\n\ndef interpolate_channel_embed(checkpoint_model, new_len):\n if \"net.channel_embed\" in checkpoint_model:\n channel_embed_checkpoint = checkpoint_model[\"net.channel_embed\"]\n old_len = channel_embed_checkpoint.shape[1]\n if new_len <= old_len:\n checkpoint_model[\"net.channel_embed\"] = channel_embed_checkpoint[:, :new_len]\n\n\ndef SIREN(input_dim, output_dim, 
features, activation_scale, dropout):\n model = tf.keras.Sequential()\n model.add(layers.Dense(features[0], input_shape=(input_dim,), kernel_initializer=initializers.RandomUniform(-1 / input_dim, 1 / input_dim), activation=tf.sin))\n for i in range(1, len(features)):\n model.add(layers.Dense(features[i], kernel_initializer=initializers.RandomUniform(-np.sqrt(6 / features[i-1]) / activation_scale, np.sqrt(6 / features[i-1]) / activation_scale), activation=tf.sin))\n model.add(layers.Dropout(dropout))\n model.add(layers.Dense(output_dim, kernel_initializer=initializers.RandomUniform(-np.sqrt(6 / features[-1]) / activation_scale, np.sqrt(6 / features[-1]) / activation_scale), activation='linear'))\n return model\n\ndef MLP(input_dim, output_dim, features, activation_scale, dropout):\n model = tf.keras.Sequential()\n model.add(layers.Dense(features[0], input_shape=(input_dim,), activation=activations.relu))\n for i in range(1, len(features)):\n model.add(layers.Dense(features[i], activation=activations.relu))\n model.add(layers.Dropout(dropout))\n model.add(layers.Dense(output_dim, activation='linear'))\n return model\n \ndef ResNet():\n resnet = ResNet50(include_top=False, weights=None, input_shape=(64, 32, 1), pooling='avg')\n model = tf.keras.Sequential()\n model.add(resnet)\n model.add(layers.Dense(2048, activation='relu'))\n model.add(layers.Dense(32768, activation='linear'))\n return model\n\n\ndata5 = xr.open_dataset(\"../data/2m_temperature_2018_5.625deg_Jan.nc\").to_dataframe().reset_index()\ndata1 = xr.open_dataset(\"../data/2m_temperature_2018_1.40625deg_Jan.nc\").to_dataframe().reset_index()\n\n\ndata5.head()\n\n\n\n\n\n\n\n\nlon\nlat\ntime\nt2m\n\n\n\n\n0\n0.0\n-87.1875\n2018-01-01 00:00:00\n250.728180\n\n\n1\n0.0\n-87.1875\n2018-01-01 01:00:00\n250.468552\n\n\n2\n0.0\n-87.1875\n2018-01-01 02:00:00\n250.250931\n\n\n3\n0.0\n-87.1875\n2018-01-01 03:00:00\n250.040314\n\n\n4\n0.0\n-87.1875\n2018-01-01 04:00:00\n249.993790\n\n\n\n\n\n\n\n\ntime_stamp = 
\"2018-01-01 01:00:00\"\ntrain_df = data5[data5.time == time_stamp]\ntest_df = data1[data1.time == time_stamp]\n\nX = np.stack([train_df.lat.values, train_df.lon.values], axis=1)\ny = train_df[[\"t2m\"]].values\nprint(f\"{X.shape=}, {y.shape=}\")\n\nX_test = np.stack([test_df.lat.values, test_df.lon.values], axis=1)\ny_test = test_df[[\"t2m\"]].values\nprint(f\"{X_test.shape=}, {y_test.shape=}\")\n\nrff = np.random.normal(size=(2, 16)) * 0.01\n# X = np.concatenate([np.sin(X @ rff), np.cos(X @ rff)], axis=1)\n# print(f\"{sin_cos.shape=}\")\n# X = X @ sin_cos\n# X_test = np.concatenate([np.sin(X_test @ rff), np.cos(X_test @ rff)], axis=1)\n\nprint(f\"{X.shape=}, {X_test.shape=}\")\n\nX.shape=(2048, 2), y.shape=(2048, 1)\nX_test.shape=(32768, 2), y_test.shape=(32768, 1)\nX.shape=(2048, 2), X_test.shape=(32768, 2)\n\n\n\nX_max = np.max(X, axis=0, keepdims=True)\nX_min = np.min(X, axis=0, keepdims=True)\n\nX_scaled = (X - X_min) / (X_max - X_min)\nX_test_scaled = (X_test - X_min) / (X_max - X_min)\n\ny_min = np.min(y, axis=0, keepdims=True)\ny_max = np.max(y, axis=0, keepdims=True)\n\ny_scaled = (y - y_min) / (y_max - y_min)\n\n# y_mean = np.mean(y, axis=0, keepdims=True)\n# y_std = np.std(y, axis=0, keepdims=True)\n\n# y_scaled = (y - y_mean) / y_std\n\n\nmodel = MLP(2, 1, [256]*4, 30.0, 0.0)\n# model = ResNet()\nmodel.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3), loss='mse')\n\n\nhistory = model.fit(X_scaled, y_scaled, epochs=5000, batch_size=X_scaled.shape[0], verbose=0)\n\n\nplt.plot(history.history['loss']);\n\n\n\n\n\n\n\n\n\ny_pred = model.predict(X_test_scaled) * (y_max - y_min) + y_min\nplt.imshow(y_pred.reshape(256, 128), origin='lower', extent=[-180, 180, -90, 90], cmap='coolwarm', interpolation=\"none\");\n\n1024/1024 [==============================] - 1s 1ms/step\n\n\n\n\n\n\n\n\n\n\nplt.imshow(y.reshape(64, 32), origin='lower', extent=[-180, 180, -90, 90], cmap='coolwarm', interpolation=\"none\");\n\n\n\n\n\n\n\n\n\ndiff = 
y_pred.reshape(256, 128) - y_test.reshape(256, 128)\nplt.imshow(diff, origin='lower', extent=[-180, 180, -90, 90], cmap='coolwarm', interpolation=\"none\");\nplt.colorbar();\nplt.title(\"Diff\")\n\nText(0.5, 1.0, 'Diff')\n\n\n\n\n\n\n\n\n\n\n# rmse = np.sqrt(np.mean(np.abs(X_test[:, 0:1])*(y_pred.ravel() - y_test.ravel())**2))/np.mean(y_test.ravel() * np.abs(X_test[:, 0:1]))\nrmse = np.sqrt(np.mean((y_pred.ravel() - y_test.ravel())**2))\nprint(f\"{rmse=}\")\n\nrmse=2.7606046\n\n\n\nmean_bias = np.mean(y_pred.ravel() - y_test.ravel())\nprint(f\"{mean_bias=}\")\n\nmean_bias=0.10866926" }, { "objectID": "posts/2021-10-26-anonymization-tips.html", "href": "posts/2021-10-26-anonymization-tips.html", "title": "Anonymization tips for double-blind submission", "section": "", "text": "Use the following command locally to search for author names, institution names, and other terms you think may violate double-blind review\n\ngit grep <query>\n\nThe above command matches the query everywhere, making it a safe option. Avoid GitHub search for this purpose; it often fails to identify some terms, and there is no regex support there (yet)!\n\n\nDo not use full paths inside the README file. If you move content to another repo, the links will either become unusable or may violate double-blind review. So follow the example below.\n\nBad practice: [link](https://github.com/patel-zeel/reponame/blob/master/dataset)\nGood practice: [link](dataset)\n\nPoint no. 2 does not work for GitHub pages links (username.github.io/stuff). Thus, remember to update those manually (if you have a better idea, let everyone know in the comments below)\nDownload the repo zip locally and create an anonymized repository in your anonymized GitHub account. Open the GitHub web editor by pressing “.” (dot) at the repo homepage.\nNow, you can select and drag all folders to the left pane of the web editor to upload them at once. Finally, commit with a meaningful message and the changes will automatically be uploaded to the main branch of your anonymized repo.\nUpdate the link in your manuscript and submit!\n\n\nEdit:\nAfter acceptance, transfer the ownership to your personal account and remove the ownership of the anonymized account from your personal account. This will remove all traces of the repository from the anonymized account. However, the repository will still show that the commits were made by the anonymized account, which is not an explicit violation of double-blind anyway." }, { "objectID": "posts/2023-04-29-sine-combination-netowrks.html", "href": "posts/2023-04-29-sine-combination-netowrks.html", "title": 
"Sine Combination Networks", + "section": "", + "text": "We know that any continuous signal can be represented as a sum of sinusoids. The question is, how many sinusoids do we need to represent a signal? In this notebook, we will explore this question.\nimport os\nos.environ[\"CUDA_VISIBLE_DEVICES\"] = \"\"\n\nimport jax\nimport jax.numpy as jnp\nimport optax\n\nfrom tqdm import tqdm\n\nimport matplotlib.pyplot as plt\nfrom mpl_toolkits.mplot3d import Axes3D" }, { - "objectID": "posts/2022-01-24-query_by_committee.html#qbc-by-bootstrapping", - "href": "posts/2022-01-24-query_by_committee.html#qbc-by-bootstrapping", - "title": "Query by Committee", - "section": "QBC by bootstrapping", - "text": "QBC by bootstrapping\n\n2 class dataset\n\nX, y = make_classification(n_samples=1000, n_features=2, n_informative=2, n_redundant=0, random_state=3, shuffle=True)\n\nplt.figure()\nplt.scatter(X[:,0], X[:,1], c=y);\n\n\n\n\n\n\n\n\n\n\nFull data fit with RF\n\nmodel = RandomForestClassifier(random_state=0)\nmodel.fit(X, y);\n\nRandomForestClassifier(random_state=0)\n\n\n\n\nVisualize decision boundary\n\ngrid_X1, grid_X2 = np.meshgrid(np.linspace(X[:,0].min()-0.1, X[:,0].max()+0.1, 100), \n np.linspace(X[:,1].min()-0.1, X[:,1].max()+0.1, 100))\n\ngrid_X = [(x1, x2) for x1, x2 in zip(grid_X1.ravel(), grid_X2.ravel())]\n\ngrid_pred = model.predict(grid_X)\n\nplt.figure(figsize=(6,5))\nplt.scatter(X[:,0], X[:,1], c=y);\nplt.contourf(grid_X1, grid_X2, grid_pred.reshape(*grid_X1.shape), alpha=0.2);\n\n\n\n\n\n\n\n\n\n\nTrain, pool, test split\n\nX_train_pool, X_test, y_train_pool, y_test = train_test_split(X, y, test_size=0.2, random_state=0, stratify=y)\nX_train, X_pool, y_train, y_pool = train_test_split(X_train_pool, y_train_pool, train_size=20, random_state=0)\n\nX_list = [X_train, X_pool, X_test]\ny_list = [y_train, y_pool, y_test]\nt_list = ['Train', 'Pool', 'Test']\n\nfig, ax = plt.subplots(1,3,figsize=(15,4), sharex=True, sharey=True)\nfor i in range(3):\n 
ax[i].scatter(X_list[i][:,0], X_list[i][:,1], c=y_list[i])\n ax[i].set_title(t_list[i])\n \n\n\n\n\n\n\n\n\n\n\nFitting a model on initial train data\n\nAL_model = RandomForestClassifier(n_jobs=28, random_state=0)\n\nAL_model.fit(X_train, y_train);\n\nRandomForestClassifier(n_jobs=28, random_state=0)\n\n\n\n\nGet the votes from trees on pool dataset\n\nvotes = np.zeros(shape=(X_pool.shape[0], len(AL_model.estimators_)))\n\nfor learner_idx, learner in enumerate(AL_model.estimators_):\n votes[:, learner_idx] = learner.predict(X_pool)\n\n\nvotes.shape\n\n(780, 100)\n\n\n\nvotes\n\narray([[0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [1., 1., 1., ..., 0., 1., 1.],\n ...,\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.]])\n\n\n\n\nConvert to probabilities\n\np_vote = np.zeros(shape=(X_pool.shape[0], X_pool.shape[1]))\n\nfor vote_idx, vote in enumerate(votes):\n vote_counter = {0 : (1-vote).sum(), 1 : vote.sum()}\n\n for class_idx, class_label in enumerate(range(X.shape[1])):\n p_vote[vote_idx, class_idx] = vote_counter[class_label]/len(AL_model.estimators_)\n\n\np_vote\n\narray([[1. , 0. ],\n [0.89, 0.11],\n [0.06, 0.94],\n ...,\n [0.93, 0.07],\n [1. , 0. ],\n [1. , 0. 
]])\n\n\n\n\nCalculate dissimilarity (entropy)\n\nexample_id = 2\n\n\nans = 0\nfor category in range(X_pool.shape[1]):\n ans += (-p_vote[example_id][category] * np.log(p_vote[example_id][category]))\n\nans\n\n0.22696752250060448\n\n\n\nentr = entropy(p_vote, axis=1)\n\n\nentr[example_id]\n\n0.22696752250060448\n\n\n\n\nActive Learning Flow\n\ndef get_query_idx():\n # Gather the votes\n votes = np.zeros(shape=(X_pool.shape[0], len(AL_model.estimators_)))\n for learner_idx, learner in enumerate(AL_model.estimators_):\n votes[:, learner_idx] = learner.predict(X_pool)\n \n # Calcuate probability of votes\n p_vote = np.zeros(shape=(X_pool.shape[0], X_pool.shape[1]))\n for vote_idx, vote in enumerate(votes):\n vote_counter = {0 : (1-vote).sum(), \n 1 : vote.sum()}\n\n for class_idx, class_label in enumerate(range(X.shape[1])):\n p_vote[vote_idx, class_idx] = vote_counter[class_label]/len(AL_model.estimators_)\n \n # Calculate entropy for each example\n entr = entropy(p_vote, axis=1)\n \n # Choose example with highest entropy (disagreement)\n return entr.argmax()\n\n\n\nPrepare data for random sampling\n\nX_train_rand = X_train.copy()\ny_train_rand = y_train.copy()\nX_pool_rand = X_pool.copy()\ny_pool_rand = y_pool.copy()\n\nrandom_model = RandomForestClassifier(n_jobs=28, random_state=0)\n\n\n\nRun active learning\n\nAL_iters = 100\nnp.random.seed(0)\n\nAL_inds = []\nAL_models = []\nrandom_inds = []\nrandom_models = []\n\nfor iteration in range(AL_iters):\n clear_output(wait=True)\n print(\"iteration\", iteration)\n ######## Active Learning ############\n # Fit the model\n AL_model.fit(X_train, y_train)\n AL_models.append(deepcopy(AL_model))\n \n # Query a point\n query_idx = get_query_idx()\n AL_inds.append(query_idx)\n \n # Add it to the train data\n X_train = np.concatenate([X_train, X_pool[query_idx:query_idx+1, :]], axis=0)\n y_train = np.concatenate([y_train, y_pool[query_idx:query_idx+1]], axis=0)\n \n # Remove it from the pool data\n X_pool = np.delete(X_pool, 
query_idx, axis=0)\n y_pool = np.delete(y_pool, query_idx, axis=0)\n \n ######## Random Sampling ############\n # Fit the model\n random_model.fit(X_train_rand, y_train_rand)\n random_models.append(deepcopy(random_model))\n \n # Query a point\n query_idx = np.random.choice(len(X_pool))\n random_inds.append(query_idx)\n # Add it to the train data\n X_train_rand = np.concatenate([X_train_rand, X_pool_rand[query_idx:query_idx+1, :]], axis=0)\n y_train_rand = np.concatenate([y_train_rand, y_pool_rand[query_idx:query_idx+1]], axis=0)\n \n # Remove it from the pool data\n X_pool_rand = np.delete(X_pool_rand, query_idx, axis=0)\n y_pool_rand = np.delete(y_pool_rand, query_idx, axis=0)\n\niteration 99\n\n\n\n\nPlot accuracy\n\nrandom_scores = []\nAL_scores = []\nfor iteration in range(AL_iters):\n clear_output(wait=True)\n print(\"iteration\", iteration)\n AL_scores.append(accuracy_score(y_test, AL_models[iteration].predict(X_test)))\n random_scores.append(accuracy_score(y_test, random_models[iteration].predict(X_test)))\n \nplt.plot(AL_scores, label='Active Learning');\nplt.plot(random_scores, label='Random Sampling');\nplt.legend();\nplt.xlabel('Iterations');\nplt.ylabel('Accuracy\\n(Higher is better)');\n\niteration 99\n\n\n\n\n\n\n\n\n\n\n\nPlot decision boundary\n\ndef update(i):\n for each in ax:\n each.cla()\n \n AL_grid_preds = AL_models[i].predict(grid_X)\n random_grid_preds = random_models[i].predict(grid_X)\n \n # Active learning\n ax[0].scatter(X_train[:n_train,0], X_train[:n_train,1], c=y_train[:n_train], label='initial_train', alpha=0.2)\n ax[0].scatter(X_train[n_train:n_train+i, 0], X_train[n_train:n_train+i, 1], \n c=y_train[n_train:n_train+i], label='new_points')\n ax[0].contourf(grid_X1, grid_X2, AL_grid_preds.reshape(*grid_X1.shape), alpha=0.2);\n ax[0].set_title('New points')\n \n ax[1].scatter(X_test[:, 0], X_test[:, 1], c=y_test, label='test_set')\n ax[1].contourf(grid_X1, grid_X2, AL_grid_preds.reshape(*grid_X1.shape), alpha=0.2);\n 
ax[1].set_title('Test points');\n ax[0].text(locs[0],locs[1],'Active Learning')\n \n # Random sampling\n ax[2].scatter(X_train_rand[:n_train,0], X_train_rand[:n_train,1], c=y_train_rand[:n_train], label='initial_train', alpha=0.2)\n ax[2].scatter(X_train_rand[n_train:n_train+i, 0], X_train_rand[n_train:n_train+i, 1], \n c=y_train_rand[n_train:n_train+i], label='new_points')\n ax[2].contourf(grid_X1, grid_X2, random_grid_preds.reshape(*grid_X1.shape), alpha=0.2);\n ax[2].set_title('New points')\n \n ax[3].scatter(X_test[:, 0], X_test[:, 1], c=y_test, label='test_set')\n ax[3].contourf(grid_X1, grid_X2, random_grid_preds.reshape(*grid_X1.shape), alpha=0.2);\n ax[3].set_title('Test points');\n ax[2].text(locs[0],locs[1],'Random Sampling');\n\n\nlocs = (2.7, 4)\nfig, ax = plt.subplots(2,2,figsize=(12,6), sharex=True, sharey=True)\nax = ax.ravel()\nn_train = X_train.shape[0]-AL_iters\n\nanim = FuncAnimation(fig, func=update, frames=range(100))\nplt.close()\nanim\n\n\n\n\n\n\n\n\n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n Once\n \n Loop\n \n Reflect" + "objectID": "posts/2023-04-29-sine-combination-netowrks.html#random-combination-of-sinusoids", + "href": "posts/2023-04-29-sine-combination-netowrks.html#random-combination-of-sinusoids", + "title": "Sine Combination Networks", + "section": "Random Combination of Sinusoids", + "text": "Random Combination of Sinusoids\n\nN = 1000\nx = jnp.linspace(-10, 10, N).reshape(-1, 1)\ny = jnp.sin(x) + jnp.sin(2*x) #+ jax.random.normal(jax.random.PRNGKey(0), (N, 1)) * 0.1\nplt.plot(x, y, \"kx\");\nprint(x.shape, y.shape)\n\n(1000, 1) (1000, 1)" }, { - "objectID": "posts/2022-06-10-jaxoptimizers.html", - "href": "posts/2022-06-10-jaxoptimizers.html", - "title": "JAX Optimizers", + "objectID": "posts/2023-04-29-sine-combination-netowrks.html#recover-the-signal", + "href": "posts/2023-04-29-sine-combination-netowrks.html#recover-the-signal", + "title": "Sine Combination Networks", + "section": "Recover 
the Signal", + "text": "Recover the Signal\n\ndef get_weights(key):\n w1 = jax.random.uniform(key, (), minval=0.0, maxval=5.0)\n key = jax.random.split(key)[0]\n w2 = jax.random.uniform(key, (), minval=0.0, maxval=5.0)\n return w1, w2\n \ndef get_sine(weights, x):\n w1, w2 = weights\n return jnp.sin(w1*x) + jnp.sin(w2*x)\n\ndef loss_fn(weights, x, y):\n output = get_sine(weights, x)\n w1, w2 = weights\n return jnp.mean((output.ravel() - y.ravel())**2)\n\n\ndef one_step(weights_and_state, xs):\n weights, state = weights_and_state\n loss, grads = value_and_grad_fn(weights, x, y)\n updates, state = optimizer.update(grads, state)\n weights = optax.apply_updates(weights, updates)\n return (weights, state), (loss, weights)\n\nepochs = 1000\noptimizer = optax.adam(1e-2)\nvalue_and_grad_fn = jax.jit(jax.value_and_grad(loss_fn))\nfig, ax = plt.subplots(4, 3, figsize=(15, 12))\nfig2, ax2 = plt.subplots(4, 3, figsize=(15, 12))\nax = ax.ravel()\nax2 = ax2.ravel()\nfor seed in tqdm(range(12)):\n key = jax.random.PRNGKey(seed)\n init_weights = get_weights(key)\n state = optimizer.init(init_weights)\n (weights, _), (loss_history, _) = jax.lax.scan(one_step, (init_weights, state), None, length=epochs)\n y_pred = get_sine(weights, x)\n ax[seed].plot(x, y, \"kx\")\n ax[seed].plot(x, y_pred, \"r-\")\n ax[seed].set_title(f\"w_init=({init_weights[0]:.2f}, {init_weights[1]:.2f}), w_pred=({weights[0]:.2f}, {weights[1]:.2f}), loss={loss_fn(weights, x, y):.2f}\")\n ax2[seed].plot(loss_history)\nfig.tight_layout()\n\n100%|██████████| 12/12 [00:00<00:00, 15.91it/s]" + }, + { + "objectID": "posts/2023-04-29-sine-combination-netowrks.html#plot-loss-surface", + "href": "posts/2023-04-29-sine-combination-netowrks.html#plot-loss-surface", + "title": "Sine Combination Networks", + "section": "Plot loss surface", + "text": "Plot loss surface\n\nw1 = jnp.linspace(0, 3, 100)\nw2 = jnp.linspace(0, 3, 100)\nW1, W2 = jnp.meshgrid(w1, w2)\nloss = jax.vmap(jax.vmap(lambda w1, w2: loss_fn((w1, w2), x, 
y)))(W1, W2)\n\n# plot the loss surface in 3D\nfig = plt.figure(figsize=(8, 6))\nax = fig.add_subplot(111, projection='3d')\nax.plot_surface(W1, W2, loss, cmap=\"viridis\", alpha=0.9);\nax.set_xlabel(\"w1\");\nax.set_ylabel(\"w2\");\n# top view\nax.view_init(30, 45)" + }, + { + "objectID": "posts/2020-09-21-programatically_download_openaq_data.html", + "href": "posts/2020-09-21-programatically_download_openaq_data.html", + "title": "Programatically download OpenAQ data", "section": "", - "text": "%%capture\n%pip install -U jax\nimport jax\nimport jax.numpy as jnp\ntry:\n import jaxopt\nexcept ModuleNotFoundError:\n %pip install -qq jaxopt\n import jaxopt\ntry:\n import optax\nexcept ModuleNotFoundError:\n %pip install -qq optax\n import optax\n\nimport tensorflow_probability.substrates.jax as tfp" + "text": "# uncomment to install these libraries\n# !pip install boto3 botocore\n\nimport pandas as pd\nimport numpy as np\nimport matplotlib.pyplot as plt\nimport sys\nimport boto3\nimport botocore\nimport os\nfrom IPython.display import clear_output" }, { - "objectID": "posts/2022-06-10-jaxoptimizers.html#loss-function", - "href": "posts/2022-06-10-jaxoptimizers.html#loss-function", - "title": "JAX Optimizers", - "section": "Loss function", - "text": "Loss function\n\ndef loss_fun(x, a):\n return (((x['param1'] - a) + (x['param2'] - (a+1)))**2).sum()" + "objectID": "posts/2020-09-21-programatically_download_openaq_data.html#setup", + "href": "posts/2020-09-21-programatically_download_openaq_data.html#setup", + "title": "Programatically download OpenAQ data", + "section": "Setup", + "text": "Setup\n\ns3 = boto3.client('s3', config=botocore.config.Config(signature_version=botocore.UNSIGNED))\nbucket_name = 'openaq-fetches'\nprefix = 'realtime-gzipped/'\n\npath = '/content/drive/MyDrive/IJCAI-21/data/OpenAQ-Delhi/'\n\nstart_date = '2020/01/01' # start date (inclusive)\nend_date = '2020/12/31' # end date (inclusive)\n\n\nDownload\n\nfor date in 
pd.date_range(start=start_date, end=end_date):\n clear_output(wait=True)\n date = str(date).split(' ')[0] # keeping just YYYY-MM-DD from YYYY-MM-DD HH:MM:SS\n print('Downloading:', date)\n data_dict = s3.list_objects(Bucket = bucket_name, Prefix = prefix+date)\n \n for file_obj in data_dict['Contents']:\n f_name = file_obj['Key']\n tmp_path = '/'.join((path+f_name).split('/')[:-1])\n \n if not os.path.exists(tmp_path):\n os.makedirs(tmp_path)\n \n s3.download_file(bucket_name, f_name, path+f_name)\n\nDownloading: 2020-05-04\n\n\n\n\nValidate\n\nfor date in pd.date_range(start=start_date, end=end_date):\n date = str(date).split(' ')[0] # keeping just YYYY-MM-DD from YYYY-MM-DD HH:MM:SS\n data_dict = s3.list_objects(Bucket = bucket_name, Prefix = prefix+date)\n \n for file_obj in data_dict['Contents']:\n assert os.path.exists(path+file_obj['Key']), file_obj['Key']\n\n\nprint('Validated')" }, { - "objectID": "posts/2022-06-10-jaxoptimizers.html#initial-parameters", - "href": "posts/2022-06-10-jaxoptimizers.html#initial-parameters", - "title": "JAX Optimizers", - "section": "Initial parameters", - "text": "Initial parameters\n\nN = 3\ninit_params = lambda: {'param1': jnp.zeros(N), 'param2': jnp.ones(N)}\na = 2.0" + "objectID": "posts/2023-03-28-nngp.html", + "href": "posts/2023-03-28-nngp.html", + "title": "Neural Network Gaussian Process", + "section": "", + "text": "# %%capture\n# %pip install -U --force-reinstall jaxutils\n# %pip install -U jax jaxlib optax\n\n\nimport jax\nimport jax.random as jr\nimport jax.numpy as jnp\nfrom jaxutils import Dataset\n\ntry:\n from neural_tangents import stax\nexcept ModuleNotFoundError:\n %pip install neural-tangents\n from neural_tangents import stax\n\ntry:\n import optax as ox\nexcept ModuleNotFoundError:\n %pip install optax\n import optax as ox\n\ntry:\n import gpjax as gpx\nexcept ModuleNotFoundError:\n %pip install gpjax\n import gpjax as gpx\n\ntry:\n import regdata as rd\nexcept ModuleNotFoundError:\n %pip install 
regdata\n import regdata as rd\n\nimport matplotlib.pyplot as plt\n\n\nclass NTK(gpx.kernels.AbstractKernel):\n def __init__(self) -> None:\n super().__init__()\n\n def __call__(self, params, x, y):\n params = jax.tree_util.tree_map(jax.nn.softplus, params)\n init_fn, apply_fn, kernel_fn = stax.serial(\n stax.Dense(512, W_std=params[\"w1\"], b_std=params[\"b1\"]), stax.Relu(),\n stax.Dense(512, W_std=params[\"w2\"], b_std=params[\"b2\"]), stax.Relu(),\n stax.Dense(512, W_std=params[\"w3\"], b_std=params[\"b3\"]), stax.Relu(),\n stax.Dense(512, W_std=params[\"w4\"], b_std=params[\"b4\"]), stax.Relu(),\n stax.Dense(512, W_std=params[\"w5\"], b_std=params[\"b5\"]), stax.Relu(),\n stax.Dense(512, W_std=params[\"w6\"], b_std=params[\"b6\"]), stax.Relu(),\n stax.Dense(512, W_std=params[\"w7\"], b_std=params[\"b7\"]), stax.Relu(),\n stax.Dense(1, W_std=params[\"w8\"], b_std=params[\"b8\"])\n )\n return kernel_fn(x.reshape(1, 1), y.reshape(1, 1)).nngp.squeeze()\n\n def init_params(self, key):\n # return init_fn(key, input_shape=(2,1))\n return {\"w1\": 0.1, \"w2\": 0.2, \"w3\": 0.3, \"w4\": 0.4, \"w5\": 0.5, \"w6\": 0.6, \"w7\": 0.7, \"w8\": 0.8,\n \"b1\": 0.1, \"b2\": 0.2, \"b3\": 0.3, \"b4\": 0.4, \"b5\": 0.5, \"b6\": 0.6, \"b7\": 0.7, \"b8\": 0.8\n }\n\n # This is depreciated. 
Can be removed once JaxKern is updated.\n def _initialise_params(self, key):\n return self.init_params(key)\n\n\nn = 100\nnoise = 0.3\nkey = jr.PRNGKey(123)\n# x = jr.uniform(key=key, minval=-3.0, maxval=3.0, shape=(n,)).sort().reshape(-1, 1)\n# f = lambda x: jnp.sin(4 * x) + jnp.cos(2 * x)\n# signal = f(x)\n# y = signal + jr.normal(key, shape=signal.shape) * noise\nx, y, xtest = rd.MotorcycleHelmet().get_data()\ny = y.reshape(-1, 1)\n\nD = Dataset(X=x, y=y)\n\n# xtest = jnp.linspace(-3.5, 3.5, 500).reshape(-1, 1)\n# ytest = f(xtest)\n\nprint(x.shape, y.shape)\n\n(94, 1) (94, 1)\n\n\n\nkernel = NTK()\nprior = gpx.Prior(kernel=kernel)\nlikelihood = gpx.Gaussian(num_datapoints=D.n)\nposterior = prior * likelihood\n\n\nkey = jr.PRNGKey(1234)\nparameter_state = gpx.initialise(posterior, key)\nparams, trainable, bijectors = parameter_state.unpack()\nparams[\"likelihood\"][\"obs_noise\"] = jnp.array(0.1)\nparameter_state = gpx.parameters.ParameterState(params, trainable, bijectors)\nprint(params)\n\n{'kernel': {'w1': 0.1, 'w2': 0.2, 'w3': 0.3, 'w4': 0.4, 'w5': 0.5, 'w6': 0.6, 'w7': 0.7, 'w8': 0.8, 'b1': 0.1, 'b2': 0.2, 'b3': 0.3, 'b4': 0.4, 'b5': 0.5, 'b6': 0.6, 'b7': 0.7, 'b8': 0.8}, 'mean_function': {}, 'likelihood': {'obs_noise': Array(0.1, dtype=float32, weak_type=True)}}\n\n\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter w1 has no transform. Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter w2 has no transform. Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter w3 has no transform. Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter w4 has no transform. 
Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter w5 has no transform. Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter w6 has no transform. Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter w7 has no transform. Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter w8 has no transform. Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter b1 has no transform. Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter b2 has no transform. Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter b3 has no transform. Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter b4 has no transform. Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter b5 has no transform. Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter b6 has no transform. 
Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter b7 has no transform. Defaulting to identity transfom.\n warnings.warn(\n/home/patel_zeel/0Notebooks/.conda/lib/python3.9/site-packages/gpjax/parameters.py:194: UserWarning: Parameter b8 has no transform. Defaulting to identity transfom.\n warnings.warn(\n\n\n\nnegative_mll = jax.jit(posterior.marginal_log_likelihood(D, negative=True))\nnegative_mll(params)\n\nArray(415.1062, dtype=float32)\n\n\n\noptimiser = ox.adam(learning_rate=0.01)\n\ninference_state = gpx.fit(\n objective=negative_mll,\n parameter_state=parameter_state,\n optax_optim=optimiser,\n num_iters=500,\n)\n\nlearned_params, training_history = inference_state.unpack()\n\n100%|██████████| 500/500 [00:02<00:00, 172.53it/s, Objective=76.34]\n\n\n\nplt.plot(training_history);\n\n\n\n\n\n\n\n\n\nlearned_params\n\n{'kernel': {'b1': Array(0.03292831, dtype=float32),\n 'b2': Array(-0.9647168, dtype=float32),\n 'b3': Array(-1.2660046, dtype=float32),\n 'b4': Array(-1.3792713, dtype=float32),\n 'b5': Array(-1.4311961, dtype=float32),\n 'b6': Array(-1.4504426, dtype=float32),\n 'b7': Array(-1.4371448, dtype=float32),\n 'b8': Array(-1.3471106, dtype=float32),\n 'w1': Array(1.0706716, dtype=float32),\n 'w2': Array(1.1768614, dtype=float32),\n 'w3': Array(1.2740505, dtype=float32),\n 'w4': Array(1.3689499, dtype=float32),\n 'w5': Array(1.462641, dtype=float32),\n 'w6': Array(1.5562503, dtype=float32),\n 'w7': Array(1.6506695, dtype=float32),\n 'w8': Array(1.7462935, dtype=float32)},\n 'likelihood': {'obs_noise': Array(0.184795, dtype=float32)},\n 'mean_function': {}}\n\n\n\nlatent_dist = posterior(learned_params, D)(xtest)\npredictive_dist = likelihood(learned_params, latent_dist)\n\npredictive_mean = predictive_dist.mean()\npredictive_std = predictive_dist.stddev()\n\n\nfig, ax = plt.subplots(figsize=(12, 5))\nax.plot(x, y, \"o\", label=\"Observations\", 
color=\"tab:red\")\nax.plot(xtest, predictive_mean, label=\"Predictive mean\", color=\"tab:blue\")\nax.fill_between(\n xtest.squeeze(),\n predictive_mean - 2 * predictive_std,\n predictive_mean + 2 * predictive_std,\n alpha=0.2,\n color=\"tab:blue\",\n label=\"Two sigma\",\n)\nax.plot(\n xtest,\n predictive_mean - predictive_std,\n color=\"tab:blue\",\n linestyle=\"--\",\n linewidth=1,\n)\nax.plot(\n xtest,\n predictive_mean + predictive_std,\n color=\"tab:blue\",\n linestyle=\"--\",\n linewidth=1,\n)\n\n# ax.plot(\n# xtest, ytest, label=\"Latent function\", color=\"black\", linestyle=\"--\", linewidth=1\n# )\n\nax.legend();" }, { - "objectID": "posts/2022-06-10-jaxoptimizers.html#optimizers", - "href": "posts/2022-06-10-jaxoptimizers.html#optimizers", - "title": "JAX Optimizers", - "section": "Optimizers", - "text": "Optimizers\n\nJaxOpt ScipyMinimize\n\n%%time\nsolver = jaxopt.ScipyMinimize('L-BFGS-B', fun=loss_fun)\nans = solver.run(init_params(), a)\nprint(ans)\n\nWARNING:absl:No GPU/TPU found, falling back to CPU. (Set TF_CPP_MIN_LOG_LEVEL=0 and rerun for more info.)\n\n\nOptStep(params={'param1': DeviceArray([1.9999999, 1.9999999, 1.9999999], dtype=float32), 'param2': DeviceArray([3., 3., 3.], dtype=float32)}, state=ScipyMinimizeInfo(fun_val=DeviceArray(4.2632564e-14, dtype=float32), success=True, status=0, iter_num=2))\nCPU times: user 78.3 ms, sys: 18.5 ms, total: 96.8 ms\nWall time: 95.8 ms\n\n\n\nPros\n\nTwo lines of code will do it all.\n\n\n\nCons\n\nIt only returns the final parameters and final loss. 
No option to retrive in-between loss values.\n\n\n\n\nOptax\n\n%%time\noptimizer = optax.adam(learning_rate=0.1)\nvalue_and_grad_fun = jax.jit(jax.value_and_grad(loss_fun, argnums=0))\nparams = init_params()\nstate = optimizer.init(params)\n\nfor _ in range(100):\n loss_value, gradients = value_and_grad_fun(params, a)\n updates, state = optimizer.update(gradients, state)\n params = optax.apply_updates(params, updates)\n\nprint(params)\n\n{'param1': DeviceArray([2.0084236, 2.0084236, 2.0084236], dtype=float32), 'param2': DeviceArray([3.0084238, 3.0084238, 3.0084238], dtype=float32)}\nCPU times: user 3.09 s, sys: 63.4 ms, total: 3.16 s\nWall time: 4.2 s\n\n\n\nPros:\n\nFull control in user’s hand. We can save intermediate loss values.\n\n\n\nCons:\n\nIts code is verbose, similar to PyTorch optimizers.\n\n\n\n\nJaxopt OptaxSolver\n\n%%time\noptimizer = optax.adam(learning_rate=0.1)\nsolver = jaxopt.OptaxSolver(loss_fun, optimizer, maxiter=100)\nans = solver.run(init_params(), a)\nprint(ans)\n\nOptStep(params={'param1': DeviceArray([2.008423, 2.008423, 2.008423], dtype=float32), 'param2': DeviceArray([3.008423, 3.008423, 3.008423], dtype=float32)}, state=OptaxState(iter_num=DeviceArray(100, dtype=int32, weak_type=True), value=DeviceArray(0.00113989, dtype=float32), error=DeviceArray(0.09549397, dtype=float32), internal_state=(ScaleByAdamState(count=DeviceArray(100, dtype=int32), mu={'param1': DeviceArray([0.02871927, 0.02871927, 0.02871927], dtype=float32), 'param2': DeviceArray([0.02871927, 0.02871927, 0.02871927], dtype=float32)}, nu={'param1': DeviceArray([0.44847375, 0.44847375, 0.44847375], dtype=float32), 'param2': DeviceArray([0.44847375, 0.44847375, 0.44847375], dtype=float32)}), EmptyState()), aux=None))\nCPU times: user 719 ms, sys: 13.4 ms, total: 732 ms\nWall time: 1.09 s\n\n\n\nPros:\n\nLess lines of code.\nApplies lax.scan internally to make it fast [reference].\n\n\n\nCons:\n\nNot able to get in-between state/loss values\n\n\n\n\ntfp math 
minimize\n\n%%time\noptimizer = optax.adam(learning_rate=0.1)\nparams, losses = tfp.math.minimize_stateless(loss_fun, (init_params(), a), num_steps=1000, optimizer=optimizer)\nprint(params)\nprint(losses[:5])\n\n({'param1': DeviceArray([1.0000008, 1.0000008, 1.0000008], dtype=float32), 'param2': DeviceArray([1.9999989, 1.9999989, 1.9999989], dtype=float32)}, DeviceArray(0.9999999, dtype=float32))\n[48. 38.88006 30.751791 23.626852 17.507807]\nCPU times: user 880 ms, sys: 15.2 ms, total: 895 ms\nWall time: 1.53 s\n\n\n\nPros:\n\nOne line of code to optimize the function and return in-between losses.\n\n\n\nCons:\n\nBy default, it optimizes all arguments passed to the loss function. In above example, we can not control if a should be optimized or not. I have raised an issue here for this problem." + "objectID": "posts/2025-02-10-object-detection-random-baseline.html", + "href": "posts/2025-02-10-object-detection-random-baseline.html", + "title": "Object Detection Random Baseline", + "section": "", + "text": "Imports\n\nimport os\nimport numpy as np\nimport supervision as sv\nfrom roboflow import Roboflow\nfrom dotenv import load_dotenv\nload_dotenv()\n\nTrue" }, { - "objectID": "posts/2022-10-18-kfac-laplace.html", - "href": "posts/2022-10-18-kfac-laplace.html", - "title": "Train NN with KFAC-Laplace in JAX", + "objectID": "posts/2022-01-25-gp_frameworks_comparison.html", + "href": "posts/2022-01-25-gp_frameworks_comparison.html", + "title": "Comparing Gaussian Process Regression Frameworks", "section": "", - "text": "from math import prod\nfrom functools import partial\nfrom time import time\n\nimport blackjax\nimport flax.linen as nn\nimport jax\nfrom jax.flatten_util import ravel_pytree\nimport jax.tree_util as jtu\nimport jax.numpy as jnp\n# jnp.set_printoptions(linewidth=2000)\n\nimport optax\nfrom tqdm import trange\n\nimport arviz as az\nimport seaborn as sns\n\nimport matplotlib.pyplot as plt\njax.config.update(\"jax_enable_x64\", False)\n\n%reload_ext 
watermark\n\nSome helper functions:\n\njitter = 1e-6\n\ndef get_shapes(params):\n return jtu.tree_map(lambda x:x.shape, params)\n\ndef svd_inverse(matrix):\n U, S, V = jnp.linalg.svd(matrix+jnp.eye(matrix.shape[0])*jitter)\n \n return V.T/S@U.T\n\n\nDataset\nWe take XOR dataset to begin with:\n\nX = jnp.array([[0, 0], [0, 1], [1, 0], [1, 1]])\ny = jnp.array([0, 1, 1, 0])\n\nX.shape, y.shape\n\nWARNING:absl:No GPU/TPU found, falling back to CPU. (Set TF_CPP_MIN_LOG_LEVEL=0 and rerun for more info.)\n\n\n((4, 2), (4,))\n\n\n\n\nNN model\n\nclass MLP(nn.Module):\n features: []\n\n @nn.compact\n def __call__(self, x):\n for n_features in self.features[:-1]:\n x = nn.Dense(n_features, kernel_init=jax.nn.initializers.glorot_normal(), bias_init=jax.nn.initializers.normal())(x)\n x = nn.relu(x)\n \n x = nn.Dense(features[-1])(x)\n return x.ravel()\n\nLet us initialize the weights of NN and inspect shapes of the parameters:\n\nfeatures = [2, 1]\nkey = jax.random.PRNGKey(0)\n\nmodel = MLP(features)\nparams = model.init(key, X).unfreeze()\n\nget_shapes(params)\n\n{'params': {'Dense_0': {'bias': (2,), 'kernel': (2, 2)},\n 'Dense_1': {'bias': (1,), 'kernel': (2, 1)}}}\n\n\n\nmodel.apply(params, X)\n\nDeviceArray([ 0.00687164, -0.01380461, 0. , 0. 
], dtype=float32)\n\n\n\n\nNegative Log Joint\n\nnoise_var = 0.1\n\ndef neg_log_joint(params):\n y_pred = model.apply(params, X)\n flat_params = ravel_pytree(params)[0]\n log_prior = jax.scipy.stats.norm.logpdf(flat_params).sum()\n log_likelihood = jax.scipy.stats.norm.logpdf(y, loc=y_pred, scale=noise_var).sum()\n \n return -(log_prior + log_likelihood)\n\nTesting if it works:\n\nneg_log_joint(params)\n\nDeviceArray(105.03511, dtype=float32)\n\n\n\n\nFind MAP\n\nkey = jax.random.PRNGKey(0)\nparams = model.init(key, X).unfreeze()\nn_iters = 1000\n\nvalue_and_grad_fn = jax.jit(jax.value_and_grad(neg_log_joint))\nopt = optax.adam(0.01)\nstate = opt.init(params)\n\ndef one_step(params_and_state, xs):\n params, state = params_and_state\n loss, grads = value_and_grad_fn(params)\n updates, state = opt.update(grads, state)\n params = optax.apply_updates(params, updates)\n return (params, state), loss\n \n(params, state), losses = jax.lax.scan(one_step, init=(params, state), xs=None, length=n_iters)\n\nplt.plot(losses);\n\n\n\n\n\n\n\n\n\ny_map = model.apply(params, X)\ny_map\n\nDeviceArray([0.01383345, 0.98666817, 0.98563665, 0.01507111], dtype=float32)\n\n\n\nx = jnp.linspace(-0.1,1.1,100)\nX1, X2 = jnp.meshgrid(x, x)\n\ndef predict_fn(x1, x2):\n return model.apply(params, jnp.array([x1,x2]).reshape(1,2))\n\npredict_fn_vec = jax.jit(jax.vmap(jax.vmap(predict_fn)))\n\nZ = predict_fn_vec(X1, X2).squeeze()\n\nplt.contourf(X1, X2, Z)\nplt.colorbar();\n\n\n\n\n\n\n\n\n\n\nFull Hessian Laplace\n\nflat_params, unravel_fn = ravel_pytree(params)\n\ndef neg_log_joint_flat(flat_params):\n return neg_log_joint(unravel_fn(flat_params))\n\nH = jax.hessian(neg_log_joint_flat)(flat_params)\n\nsns.heatmap(H);\n\n\n\n\n\n\n\n\n\nposterior_cov = svd_inverse(H)\n\nsns.heatmap(posterior_cov);\n\n\n\n\n\n\n\n\nNote that we can sample parameters from the posterior and revert them to correct structure with the unravel_fn. 
Here is a class to do it all:\n\nclass FullHessianLaplace:\n def __init__(self, map_params, model):\n flat_params, self.unravel_fn = ravel_pytree(map_params)\n\n def neg_log_joint_flat(flat_params):\n params = unravel_fn(flat_params)\n return neg_log_joint(params)\n\n self.H = jax.hessian(neg_log_joint_flat)(flat_params)\n \n self.mean = flat_params\n self.cov = svd_inverse(self.H)\n self.model = model\n\n def _vectorize(self, f, seed, shape, f_kwargs={}):\n length = prod(shape)\n seeds = jax.random.split(seed, num=length).reshape(shape+(2,))\n \n sample_fn = partial(f, **f_kwargs)\n for _ in shape:\n sample_fn = jax.vmap(sample_fn)\n \n return sample_fn(seed=seeds)\n \n def _sample(self, seed):\n sample = jax.random.multivariate_normal(seed, mean=self.mean, cov=self.cov)\n return self.unravel_fn(sample)\n \n def sample(self, seed, shape):\n return self._vectorize(self._sample, seed, shape)\n \n def _predict(self, X, seed):\n sample = self._sample(seed)\n return self.model.apply(sample, X)\n \n def predict(self, X, seed, shape):\n return self._vectorize(self._predict, seed, shape, {'X': X})\n\n\nEstimating predictive posterior\n\nposterior = FullHessianLaplace(params, model)\n\nseed = jax.random.PRNGKey(1)\nn_samples = 100000\ny_pred_full = posterior.predict(X, seed=seed, shape=(n_samples,))\nulim = 5\nllim = -5\n\nfig, ax = plt.subplots(2,2,figsize=(12,4))\nax=ax.ravel()\nfor i in range(len(y)):\n az.plot_dist(y_pred_full[:, i], ax=ax[i]);\n ax[i].grid(True)\n ax[i].set_xticks(range(llim,ulim))\n ax[i].set_xlim(llim, ulim)\n ax[i].set_title(f\"X={X[i]}, y_pred_mean={y_pred_full[:, i].mean():.3f}, y_map={y_map[i]:.3f}\")\nfig.tight_layout()\n\n\n\n\n\n\n\n\n\n\n\nKFAC-Laplace\nWe need to invert partial Hessians to do KFAC-Laplace. We can use tree_flatten with ravel_pytree to ease the workflow. We need to: 1. pick up partial Hessians in pure matrix form to be able to invert them. 2. Create layer-wise distributions and sample them. These samples will be 1d arrays. 3. 
We need to convert those 1d arrays to the params dictionary form so that we can plug them into the flax model and get posterior predictions.\nFirst we need to segregate the parameters layer-wise. We will use the is_leaf condition to stop traversing the parameter PyTree at a particular depth. See how it differs from vanilla tree_flatten:\n\nflat_params, tree_def = jtu.tree_flatten(params)\ndisplay(flat_params, tree_def)\n\n[DeviceArray([-0.00024913, 0.00027019], dtype=float32),\n DeviceArray([[ 0.8275324 , -0.8314813 ],\n [-0.8276633 , 0.83254045]], dtype=float32),\n DeviceArray([0.01351773], dtype=float32),\n DeviceArray([[1.1750739],\n [1.1685134]], dtype=float32)]\n\n\nPyTreeDef({'params': {'Dense_0': {'bias': *, 'kernel': *}, 'Dense_1': {'bias': *, 'kernel': *}}})\n\n\n\nis_leaf = lambda param: 'bias' in param\nlayers, tree_def = jtu.tree_flatten(params, is_leaf=is_leaf)\ndisplay(layers, tree_def)\n\n[{'bias': DeviceArray([-0.00024913, 0.00027019], dtype=float32),\n 'kernel': DeviceArray([[ 0.8275324 , -0.8314813 ],\n [-0.8276633 , 0.83254045]], dtype=float32)},\n {'bias': DeviceArray([0.01351773], dtype=float32),\n 'kernel': DeviceArray([[1.1750739],\n [1.1685134]], dtype=float32)}]\n\n\nPyTreeDef({'params': {'Dense_0': *, 'Dense_1': *}})\n\n\nThe difference is clearly evident. 
Now, we need to flatten the inner dictionaries to get 1d arrays.\n\nflat_params = list(map(lambda x: ravel_pytree(x)[0], layers))\nunravel_fn_list = list(map(lambda x: ravel_pytree(x)[1], layers))\ndisplay(flat_params, unravel_fn_list)\n\n[DeviceArray([-2.4912864e-04, 2.7019347e-04, 8.2753241e-01,\n -8.3148128e-01, -8.2766330e-01, 8.3254045e-01], dtype=float32),\n DeviceArray([0.01351773, 1.1750739 , 1.1685134 ], dtype=float32)]\n\n\n[<function jax._src.flatten_util.ravel_pytree.<locals>.<lambda>(flat)>,\n <function jax._src.flatten_util.ravel_pytree.<locals>.<lambda>(flat)>]\n\n\n\ndef modified_neg_log_joint_fn(flat_params):\n layers = jtu.tree_map(lambda unravel_fn, flat_param: unravel_fn(flat_param), unravel_fn_list, flat_params)\n params = tree_def.unflatten(layers)\n return neg_log_joint(params)\n\nfull_hessian = jax.hessian(modified_neg_log_joint_fn)(flat_params)\n\n# Pick diagonal entries from the Hessian\nuseful_hessians = [full_hessian[i][i] for i in range(len(full_hessian))]\nuseful_hessians\n\n[DeviceArray([[139.07985, 0. , 138.07985, 0. , 0. ,\n 0. ],\n [ 0. , 410.62708, 0. , 136.54236, 0. ,\n 273.08472],\n [138.07985, 0. , 139.07985, 0. , 0. ,\n 0. ],\n [ 0. , 136.54236, 0. , 137.54236, 0. ,\n 136.54236],\n [ 0. , 0. , 0. , 0. , 1. ,\n 0. ],\n [ 0. , 273.08472, 0. , 136.54236, 0. ,\n 274.08472]], dtype=float32),\n DeviceArray([[400.99997, 82.72832, 83.44101],\n [ 82.72832, 69.43975, 0. ],\n [ 83.44101, 0. , 70.35754]], dtype=float32)]\n\n\nEach entry in the above list corresponds to a layer-wise Hessian matrix. 
Now, we need to create layer-wise distributions, sample from them and reconstruct params using the same tricks we used above:\n\nclass KFACHessianLaplace:\n def __init__(self, map_params, model):\n self.model = model\n layers, self.tree_def = jtu.tree_flatten(map_params, is_leaf=lambda x: 'bias' in x)\n flat_layers = [ravel_pytree(layer) for layer in layers]\n self.means = list(map(lambda x: x[0], flat_layers))\n self.unravel_fn_list = list(map(lambda x: x[1], flat_layers))\n\n def neg_log_joint_flat(flat_params):\n flat_layers = [self.unravel_fn_list[i](flat_params[i]) for i in range(len(flat_params))]\n params = self.tree_def.unflatten(flat_layers)\n return neg_log_joint(params)\n\n self.H = jax.hessian(neg_log_joint_flat)(self.means)\n self.useful_H = [self.H[i][i] for i in range(len(self.H))]\n \n self.covs = [svd_inverse(matrix) for matrix in self.useful_H]\n \n def _vectorize(self, f, seed, shape, f_kwargs={}):\n length = prod(shape)\n seeds = jax.random.split(seed, num=length).reshape(shape+(2,))\n \n sample_fn = partial(f, **f_kwargs)\n for _ in shape:\n sample_fn = jax.vmap(sample_fn)\n \n return sample_fn(seed=seeds)\n \n def _sample_partial(self, seed, unravel_fn, mean, cov):\n sample = jax.random.multivariate_normal(seed, mean=mean, cov=cov)\n return unravel_fn(sample)\n \n def _sample(self, seed):\n seeds = [seed for seed in jax.random.split(seed, num=len(self.means))]\n flat_sample = jtu.tree_map(self._sample_partial, seeds, self.unravel_fn_list, self.means, self.covs)\n sample = self.tree_def.unflatten(flat_sample)\n return sample\n \n def sample(self, seed, shape):\n return self._vectorize(self._sample, seed, shape)\n \n def _predict(self, X, seed):\n sample = self._sample(seed)\n return self.model.apply(sample, X)\n \n def predict(self, X, seed, shape):\n return self._vectorize(self._predict, seed, shape, {'X': X})\n\n\nEstimating predictive posterior\n\nkfac_posterior = KFACHessianLaplace(params, model)\n\nseed = 
jax.random.PRNGKey(1)\nn_samples = 1000000\ny_pred_kfac = kfac_posterior.predict(X, seed=seed, shape=(n_samples, ))\nulim = 5\nllim = -5\n\nfig, ax = plt.subplots(2,2,figsize=(12,4))\nax=ax.ravel()\nfor i in range(len(y)):\n az.plot_dist(y_pred_full[:, i], ax=ax[i], label='full', color='r')\n az.plot_dist(y_pred_kfac[:, i], ax=ax[i], label='kfac', color='b')\n ax[i].grid(True)\n ax[i].set_xticks(range(llim,ulim))\n ax[i].set_xlim(llim, ulim)\n ax[i].set_title(f\"X={X[i]}, y_map={y_map[i]:.3f}\")\nfig.tight_layout()\n\n\n\n\n\n\n\n\nWe can see that KFAC is approximating the trend of Full Hessian Laplace. We can visualize the covariance matrices as below.\n\nfig, ax = plt.subplots(1,2,figsize=(18,5))\nsns.heatmap(posterior.cov, ax=ax[0], annot=True, fmt = '.2f')\nax[0].set_title('Full')\n\nkfac_cov = posterior.cov * 0\noffset = 0\nfor cov in kfac_posterior.covs:\n length = cov.shape[0]\n kfac_cov = kfac_cov.at[offset:offset+length, offset:offset+length].set(cov)\n offset += length\n\nsns.heatmap(kfac_cov, ax=ax[1], annot=True, fmt = '.2f')\nax[1].set_title('KFAC');\n\n\n\n\n\n\n\n\n\n\n\nComparison with MCMC\nInspired by a blackjax docs example.\n\nkey = jax.random.PRNGKey(0)\nwarmup_key, inference_key = jax.random.split(key, 2)\nnum_warmup = 5000\nnum_samples = n_samples\n\ninitial_position = model.init(key, X)\ndef logprob(params): \n return -neg_log_joint(params)\n\ndef inference_loop(rng_key, kernel, initial_state, num_samples):\n def one_step(state, rng_key):\n state, _ = kernel(rng_key, state)\n return state, state\n\n keys = jax.random.split(rng_key, num_samples)\n _, states = jax.lax.scan(one_step, initial_state, keys)\n\n return states\n\ninit = time()\nadapt = blackjax.window_adaptation(blackjax.nuts, logprob, num_warmup)\nfinal_state, kernel, _ = adapt.run(warmup_key, initial_position)\nstates = inference_loop(inference_key, kernel, final_state, num_samples)\nsamples = states.position.unfreeze()\nprint(f\"Sampled {n_samples} samples in {time()-init:.2f} 
seconds\")\n\nSampled 1000000 samples in 27.85 seconds\n\n\n\ny_pred_mcmc = jax.vmap(model.apply, in_axes=(0, None))(samples, X)\n\nulim = 5\nllim = -5\n\nfig, ax = plt.subplots(2,2,figsize=(12,4))\nax=ax.ravel()\nfor i in range(len(y)):\n az.plot_dist(y_pred_full[:, i], ax=ax[i], label='full', color='r')\n az.plot_dist(y_pred_kfac[:, i], ax=ax[i], label='kfac', color='b')\n az.plot_dist(y_pred_mcmc[:, i], ax=ax[i], label='mcmc', color='k')\n ax[i].grid(True)\n ax[i].set_xticks(range(llim,ulim))\n ax[i].set_xlim(llim, ulim)\n ax[i].set_title(f\"X={X[i]}, y_map={y_map[i]:.3f}\")\nfig.tight_layout()\n\n\n\n\n\n\n\n\n\nfig, ax = plt.subplots(1,3,figsize=(18,5))\nfig.subplots_adjust(wspace=0.1)\nsns.heatmap(posterior.cov, ax=ax[0], annot=True, fmt = '.2f')\nax[0].set_title('Full')\n\nkfac_cov = posterior.cov * 0\noffset = 0\nfor cov in kfac_posterior.covs:\n length = cov.shape[0]\n kfac_cov = kfac_cov.at[offset:offset+length, offset:offset+length].set(cov)\n offset += length\n\nsns.heatmap(kfac_cov, ax=ax[1], annot=True, fmt = '.2f')\nax[1].set_title('KFAC');\n\nmcmc_cov = jnp.cov(jax.vmap(lambda x: ravel_pytree(x)[0])(samples).T)\n\nsns.heatmap(mcmc_cov, ax=ax[2], annot=True, fmt = '.2f')\nax[2].set_title('MCMC');\n\n\n\n\n\n\n\n\n\n\nLibrary versions\n\n%watermark --iversions\n\nflax : 0.6.1\nblackjax : 0.8.2\noptax : 0.1.3\nmatplotlib: 3.5.1\njax : 0.3.23\narviz : 0.12.1\nseaborn : 0.11.2\njson : 2.0.9" + "text": "import math\nimport numpy as np\nimport matplotlib.pyplot as plt\nimport pandas as pd\nimport torch\nimport GPy\nimport jax\nimport gpytorch\nimport botorch\nimport tinygp\nimport jax.numpy as jnp\nimport optax\nfrom IPython.display import clear_output\n\nfrom sklearn.preprocessing import StandardScaler\n\nWARNING:absl:No GPU/TPU found, falling back to CPU. 
(Set TF_CPP_MIN_LOG_LEVEL=0 and rerun for more info.)"
  },
  {
    "objectID": "posts/2022-01-25-gp_frameworks_comparison.html#data",
    "href": "posts/2022-01-25-gp_frameworks_comparison.html#data",
    "title": "Comparing Gaussian Process Regression Frameworks",
    "section": "Data",
    "text": "Data\n\nnp.random.seed(0) # We don't want surprises in a presentation :)\nN = 10\ntrain_x = torch.linspace(0, 1, N)\ntrain_y = torch.sin(train_x * (2 * math.pi)) + torch.normal(0, 0.1, size=(N,))\n \ntest_x = torch.linspace(0, 1, N*10)\ntest_y = torch.sin(test_x * (2 * math.pi))\n\n\nplt.plot(train_x, train_y, 'ko', label='train');\nplt.plot(test_x, test_y, label='test');\nplt.legend();"
  },
  {
    "objectID": "posts/2022-01-25-gp_frameworks_comparison.html#defining-kernel",
    "href": "posts/2022-01-25-gp_frameworks_comparison.html#defining-kernel",
    "title": "Comparing Gaussian Process Regression Frameworks",
    "section": "Defining kernel",
    "text": "Defining kernel\n\\[\\begin{equation}\n\\sigma_f^2 = \\text{variance}\\\\\n\\ell = \\text{lengthscale}\\\\\nk_{RBF}(x_1, x_2) = \\sigma_f^2 \\exp \\left[-\\frac{\\lVert x_1 - x_2 \\rVert^2}{2\\ell^2}\\right]\n\\end{equation}\\]\n\nGPy\n\ngpy_kernel = GPy.kern.RBF(input_dim=1, variance=1., lengthscale=1.)\ngpy_kernel\n\n\n\n\n\n\nrbf.\nvalue\nconstraints\npriors\n\n\nvariance\n1.0\n+ve\n\n\n\nlengthscale\n1.0\n+ve\n\n\n\n\n\n\n\n\nGPyTorch\n\ngpytorch_kernel = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())\ngpytorch_kernel.outputscale = 1. # variance\ngpytorch_kernel.base_kernel.lengthscale = 1. 
# lengthscale\n\ngpytorch_kernel\n\nScaleKernel(\n (base_kernel): RBFKernel(\n (raw_lengthscale_constraint): Positive()\n )\n (raw_outputscale_constraint): Positive()\n)\n\n\n\n\nTinyGP\n\ndef RBFKernel(variance, lengthscale):\n return jnp.exp(variance) * tinygp.kernels.ExpSquared(scale=jnp.exp(lengthscale))\n \ntinygp_kernel = RBFKernel(variance=1., lengthscale=1.)\ntinygp_kernel\n\n<tinygp.kernels.Product at 0x7f544039d710>" + }, + { + "objectID": "posts/2022-01-25-gp_frameworks_comparison.html#define-model", + "href": "posts/2022-01-25-gp_frameworks_comparison.html#define-model", + "title": "Comparing Gaussian Process Regression Frameworks", + "section": "Define model", + "text": "Define model\n\\[\n\\sigma_n^2 = \\text{noise variance}\n\\]\n\nGPy\n\ngpy_model = GPy.models.GPRegression(train_x.numpy()[:,None], train_y.numpy()[:,None], gpy_kernel)\ngpy_model.Gaussian_noise.variance = 0.1\ngpy_model\n\n\n\n\nModel: GP regression\nObjective: 16.757933772959404\nNumber of Parameters: 3\nNumber of Optimization Parameters: 3\nUpdates: True\n\n\n\n\n\n\nGP_regression.\nvalue\nconstraints\npriors\n\n\nrbf.variance\n1.0\n+ve\n\n\n\nrbf.lengthscale\n1.0\n+ve\n\n\n\nGaussian_noise.variance\n0.1\n+ve\n\n\n\n\n\n\n\n\nGPyTorch\n\nclass ExactGPModel(gpytorch.models.ExactGP):\n def __init__(self, train_x, train_y, likelihood, kernel):\n super().__init__(train_x, train_y, likelihood)\n \n self.mean_module = gpytorch.means.ConstantMean()\n self.covar_module = kernel\n\n def forward(self, x):\n mean_x = self.mean_module(x)\n covar_x = self.covar_module(x)\n return gpytorch.distributions.MultivariateNormal(mean_x, covar_x)\n\ngpytorch_likelihood = gpytorch.likelihoods.GaussianLikelihood()\ngpytorch_model = ExactGPModel(train_x, train_y, gpytorch_likelihood, gpytorch_kernel)\n\ngpytorch_model.likelihood.noise = 0.1\ngpytorch_model\n\nExactGPModel(\n (likelihood): GaussianLikelihood(\n (noise_covar): HomoskedasticNoise(\n (raw_noise_constraint): GreaterThan(1.000E-04)\n )\n )\n 
(mean_module): ConstantMean()\n (covar_module): ScaleKernel(\n (base_kernel): RBFKernel(\n (raw_lengthscale_constraint): Positive()\n )\n (raw_outputscale_constraint): Positive()\n )\n)\n\n\n\n\nTinyGP\n\ndef build_gp(theta, X):\n mean = theta[0] \n variance, lengthscale, noise_variance = jnp.exp(theta[1:])\n \n kernel = variance * tinygp.kernels.ExpSquared(lengthscale)\n \n return tinygp.GaussianProcess(kernel, X, diag=noise_variance, mean=mean)\n\ntinygp_model = build_gp(theta=np.array([0., 1., 1., 0.1]), X=train_x.numpy())\n\ntinygp_model\n# __repr__\n\n<tinygp.gp.GaussianProcess at 0x7f5440401850>" + }, + { + "objectID": "posts/2022-01-25-gp_frameworks_comparison.html#train-the-model", + "href": "posts/2022-01-25-gp_frameworks_comparison.html#train-the-model", + "title": "Comparing Gaussian Process Regression Frameworks", + "section": "Train the model", + "text": "Train the model\n\nGPy\n\ngpy_model.optimize(max_iters=50)\ngpy_model\n\n\n\n\nModel: GP regression\nObjective: 3.944394423452163\nNumber of Parameters: 3\nNumber of Optimization Parameters: 3\nUpdates: True\n\n\n\n\n\n\nGP_regression.\nvalue\nconstraints\npriors\n\n\nrbf.variance\n0.9376905183253631\n+ve\n\n\n\nrbf.lengthscale\n0.2559000163858406\n+ve\n\n\n\nGaussian_noise.variance\n0.012506184441481319\n+ve\n\n\n\n\n\n\n\n\nGPyTorch\n\nmll = gpytorch.mlls.ExactMarginalLogLikelihood(gpytorch_likelihood, gpytorch_model)\nbotorch.fit_gpytorch_model(mll)\n\ndisplay(gpytorch_model.mean_module.constant, # Mean\n gpytorch_model.covar_module.outputscale, # Variance\n gpytorch_model.covar_module.base_kernel.lengthscale, # Lengthscale \n gpytorch_model.likelihood.noise) # Noise variance\n\n /opt/conda/lib/python3.7/site-packages/botorch/fit.py:143: UserWarning:CUDA initialization: CUDA unknown error - this may be due to an incorrectly set up environment, e.g. changing env variable CUDA_VISIBLE_DEVICES after program start. Setting the available devices to be zero. 
(Triggered internally at /opt/conda/conda-bld/pytorch_1634272168290/work/c10/cuda/CUDAFunctions.cpp:112.)\n\n\nParameter containing:\ntensor([0.0923], requires_grad=True)\n\n\ntensor(0.9394, grad_fn=<SoftplusBackward0>)\n\n\ntensor([[0.2560]], grad_fn=<SoftplusBackward0>)\n\n\ntensor([0.0124], grad_fn=<AddBackward0>)\n\n\n\n\nTinyGP\n\nfrom scipy.optimize import minimize\n\ndef neg_log_likelihood(theta, X, y):\n gp = build_gp(theta, X)\n return -gp.condition(y)\n\n\nobj = jax.jit(jax.value_and_grad(neg_log_likelihood))\nresult = minimize(obj, [0., 1., 1., 0.1], jac=True, args=(train_x.numpy(), train_y.numpy()))\nresult.x[0], np.exp(result.x[1:])\n\n(0.09213499552879165, array([0.9395271 , 0.25604163, 0.01243025]))" + }, + { + "objectID": "posts/2022-01-25-gp_frameworks_comparison.html#inference", + "href": "posts/2022-01-25-gp_frameworks_comparison.html#inference", + "title": "Comparing Gaussian Process Regression Frameworks", + "section": "Inference", + "text": "Inference\n\ndef plot_gp(pred_y, var_y):\n std_y = var_y ** 0.5\n plt.figure()\n plt.scatter(train_x, train_y, label='train')\n plt.plot(test_x, pred_y, label='predictive mean')\n plt.fill_between(test_x.ravel(), \n pred_y.ravel() - 2*std_y.ravel(), \n pred_y.ravel() + 2*std_y.ravel(), alpha=0.2, label='95% confidence')\n plt.legend()\n\n\nGPy\n\npred_y, var_y = gpy_model.predict(test_x.numpy()[:, None])\nplot_gp(pred_y, var_y)\n\n\n\n\n\n\n\n\n\n\nGPyTorch\n\ngpytorch_model.eval()\n\nwith torch.no_grad(), gpytorch.settings.fast_pred_var():\n pred_dist = gpytorch_likelihood(gpytorch_model(test_x))\n pred_y, var_y = pred_dist.mean, pred_dist.variance\n plot_gp(pred_y, var_y)\n\n\n\n\n\n\n\n\n\n\nTinyGP\n\ntinygp_model = build_gp(result.x, train_x.numpy())\npred_y, var_y = tinygp_model.predict(train_y.numpy(), test_x.numpy(), return_var=True)\n\nplot_gp(pred_y, var_y)" + }, + { + "objectID": "posts/2022-01-25-gp_frameworks_comparison.html#tiny-gp-on-co2-dataset", + "href": 
"posts/2022-01-25-gp_frameworks_comparison.html#tiny-gp-on-co2-dataset", + "title": "Comparing Gaussian Process Regression Frameworks", + "section": "Tiny GP on CO2 dataset", + "text": "Tiny GP on CO2 dataset\n\ndata = pd.read_csv(\"data/co2.csv\")\n\n# Train test split\nX = data[\"0\"].iloc[:290].values.reshape(-1, 1)\nX_test = data[\"0\"].iloc[290:].values.reshape(-1, 1)\ny = data[\"1\"].iloc[:290].values\ny_test = data[\"1\"].iloc[290:].values\n\n# Scaling the dataset\nXscaler = StandardScaler()\nX = Xscaler.fit_transform(X)\nX_test = Xscaler.transform(X_test)\n\nyscaler = StandardScaler()\ny = yscaler.fit_transform(y.reshape(-1, 1)).ravel()\ny_test = yscaler.transform(y_test.reshape(-1, 1)).ravel()\n\n\nplt.plot(X, y, label='train');\nplt.plot(X_test, y_test, label='test');\nplt.legend();\n\n\n\n\n\n\n\n\n\nclass SpectralMixture(tinygp.kernels.Kernel):\n def __init__(self, weight, scale, freq):\n self.weight = jnp.atleast_1d(weight)\n self.scale = jnp.atleast_1d(scale)\n self.freq = jnp.atleast_1d(freq)\n\n def evaluate(self, X1, X2):\n tau = jnp.atleast_1d(jnp.abs(X1 - X2))[..., None]\n return jnp.sum(\n self.weight\n * jnp.prod(\n jnp.exp(-2 * jnp.pi ** 2 * tau ** 2 / self.scale ** 2)\n * jnp.cos(2 * jnp.pi * self.freq * tau),\n axis=-1,\n )\n )\n \ndef build_spectral_gp(theta):\n kernel = SpectralMixture(\n jnp.exp(theta[\"log_weight\"]),\n jnp.exp(theta[\"log_scale\"]),\n jnp.exp(theta[\"log_freq\"]),\n )\n return tinygp.GaussianProcess(\n kernel, X, diag=jnp.exp(theta[\"log_diag\"]), mean=theta[\"mean\"]\n )\n\n\nK = 4 # Number of mixtures\ndiv_factor = 0.4\nnp.random.seed(1)\nparams = {\n \"log_weight\": np.abs(np.random.rand(K))/div_factor,\n \"log_scale\": np.abs(np.random.rand(K))/div_factor,\n \"log_freq\": np.abs(np.random.rand(K))/div_factor,\n \"log_diag\": np.abs(np.random.rand(1))/div_factor,\n \"mean\": 0.,\n}\n\n@jax.jit\n@jax.value_and_grad\ndef loss(theta):\n return -build_spectral_gp(theta).condition(y)\n# opt = 
optax.sgd(learning_rate=0.001)\nopt = optax.adam(learning_rate=0.1)\nopt_state = opt.init(params)\nlosses = []\nfor i in range(100):\n loss_val, grads = loss(params)\n updates, opt_state = opt.update(grads, opt_state)\n params = optax.apply_updates(params, updates)\n losses.append(loss_val)\n clear_output(wait=True)\n print(f\"iter {i}, loss {loss_val}\")\n\nopt_gp = build_spectral_gp(params)\n\nparams\n\niter 99, loss 27.987701416015625\n\n\n{'log_diag': DeviceArray([-2.7388687], dtype=float32),\n 'log_freq': DeviceArray([-3.6072493, -3.1795945, -3.4490397, -2.373117 ], dtype=float32),\n 'log_scale': DeviceArray([3.9890492, 3.8530042, 4.0878096, 4.4860597], dtype=float32),\n 'log_weight': DeviceArray([-1.3715047, -0.6132469, -2.413771 , -1.6582283], dtype=float32),\n 'mean': DeviceArray(0.38844627, dtype=float32)}\n\n\n\nplt.plot(losses);\n\n\n\n\n\n\n\n\n\nmu, var = opt_gp.predict(y, X_test, return_var=True)\n\nplt.plot(X, y, c='k')\nplt.fill_between(\n X_test.ravel(), mu + np.sqrt(var), mu - np.sqrt(var), color=\"C0\", alpha=0.5\n)\nplt.plot(X_test, mu, color=\"C0\", lw=2)\n\n# plt.xlim(t.min(), 2025)\nplt.xlabel(\"year\")\n_ = plt.ylabel(\"CO$_2$ in ppm\")" + }, + { + "objectID": "posts/2022-01-25-gp_frameworks_comparison.html#k_mathbfy-textcov_functionx_train-x_train-sigma_f-ell-sigma_n", + "href": "posts/2022-01-25-gp_frameworks_comparison.html#k_mathbfy-textcov_functionx_train-x_train-sigma_f-ell-sigma_n", + "title": "Comparing Gaussian Process Regression Frameworks", + "section": "\\(K_\\mathbf{y} = \\text{cov_function}(X_{train}, X_{train}, \\sigma_f, \\ell, \\sigma_n)\\)", + "text": "\\(K_\\mathbf{y} = \\text{cov_function}(X_{train}, X_{train}, \\sigma_f, \\ell, \\sigma_n)\\)" + }, + { + "objectID": "posts/2022-01-25-gp_frameworks_comparison.html#gp-loss-log-pmathbfy-mid-mathbfx-theta-frac12-mathbfyt-k_y-1-mathbfy-frac12-log-leftk_yright-fracn2-log-2-pi", + "href": 
"posts/2022-01-25-gp_frameworks_comparison.html#gp-loss-log-pmathbfy-mid-mathbfx-theta-frac12-mathbfyt-k_y-1-mathbfy-frac12-log-leftk_yright-fracn2-log-2-pi", + "title": "Comparing Gaussian Process Regression Frameworks", + "section": "GP Loss: \\(\\log p(\\mathbf{y} \\mid \\mathbf{X}, \\theta)=-\\frac{1}{2} \\mathbf{y}^{T} K_{y}^{-1} \\mathbf{y}-\\frac{1}{2} \\log \\left|K_{y}\\right|-\\frac{n}{2} \\log 2 \\pi\\)", + "text": "GP Loss: \\(\\log p(\\mathbf{y} \\mid \\mathbf{X}, \\theta)=-\\frac{1}{2} \\mathbf{y}^{T} K_{y}^{-1} \\mathbf{y}-\\frac{1}{2} \\log \\left|K_{y}\\right|-\\frac{n}{2} \\log 2 \\pi\\)\n\nMinimize inverse term fully\nNow, Minimize both togather" + }, + { + "objectID": "posts/2021-10-12-sparsegps.html", + "href": "posts/2021-10-12-sparsegps.html", + "title": "SparseGPs in Stheno", "section": "", - "text": "import os\nos.environ[\"CUDA_VISIBLE_DEVICES\"] = \"3\"\n\nimport torch\nimport torch.nn as nn\nfrom numcodecs import GZip, Zstd, Blosc\n\nfrom time import time, sleep\nfrom tqdm import tqdm\nfrom glob import glob\nfrom os.path import join\nfrom torch.utils.data import DataLoader, Dataset\nfrom joblib import Parallel, delayed\nimport xarray as xr\nimport numpy as np\n\nfrom torchvision.models import vit_b_16\nfrom astra.torch.models import ViTClassifier\nfrom astra.torch.utils import train_fn" + "text": "# !pip install -U regdata\n\n\nimport regdata as rd\nimport torch\nimport matplotlib.pyplot as plt\nfrom matplotlib.animation import FuncAnimation\nfrom matplotlib import rc\nimport wbml.out as out\nfrom wbml.plot import tweak\n\nfrom stheno import B, GP, EQ, PseudoObsVFE, PseudoObsFITC\nfrom varz.torch import Vars, minimise_l_bfgs_b, parametrised, Positive\nimport lab.torch" }, { - "objectID": "posts/Torch-DataLoaders.html#imports", - "href": "posts/Torch-DataLoaders.html#imports", - "title": "Data Handling for Large Scale ML", + "objectID": "posts/2021-10-12-sparsegps.html#imports", + "href": "posts/2021-10-12-sparsegps.html#imports", + 
"title": "SparseGPs in Stheno", "section": "", - "text": "import os\nos.environ[\"CUDA_VISIBLE_DEVICES\"] = \"3\"\n\nimport torch\nimport torch.nn as nn\nfrom numcodecs import GZip, Zstd, Blosc\n\nfrom time import time, sleep\nfrom tqdm import tqdm\nfrom glob import glob\nfrom os.path import join\nfrom torch.utils.data import DataLoader, Dataset\nfrom joblib import Parallel, delayed\nimport xarray as xr\nimport numpy as np\n\nfrom torchvision.models import vit_b_16\nfrom astra.torch.models import ViTClassifier\nfrom astra.torch.utils import train_fn" + "text": "# !pip install -U regdata\n\n\nimport regdata as rd\nimport torch\nimport matplotlib.pyplot as plt\nfrom matplotlib.animation import FuncAnimation\nfrom matplotlib import rc\nimport wbml.out as out\nfrom wbml.plot import tweak\n\nfrom stheno import B, GP, EQ, PseudoObsVFE, PseudoObsFITC\nfrom varz.torch import Vars, minimise_l_bfgs_b, parametrised, Positive\nimport lab.torch" }, { - "objectID": "posts/Torch-DataLoaders.html#is-.nc-better-than-zarr", - "href": "posts/Torch-DataLoaders.html#is-.nc-better-than-zarr", - "title": "Data Handling for Large Scale ML", - "section": "Is .nc better than zarr?", - "text": "Is .nc better than zarr?\n\nos.system(f\"du -sh {base_path}\")\n\n1.8G /home/patel_zeel/bkdb/bangladesh_pnas_pred/team1\n\n\n0\n\n\n\nsave_path = \"/tmp/nc_check_uncompressed\"\nos.makedirs(save_path, exist_ok=True)\nfiles = []\ndef zarr_to_nc(file):\n with xr.open_zarr(file, consolidated=False) as ds:\n ds.to_netcdf(join(save_path, file.split(\"/\")[-1].replace(\".zarr\", \".nc\")))\n\n_ = Parallel(n_jobs=32)(delayed(zarr_to_nc)(file) for file in tqdm(glob(join(base_path, \"*.zarr\"))))\n\nos.system(f\"du -sh {save_path}\")\n\n 0%| | 0/1501 [00:00<?, ?it/s]100%|██████████| 1501/1501 [00:24<00:00, 62.47it/s] \n\n\n5.3G /tmp/nc_check_uncompressed\n\n\n0\n\n\n\nsave_path = \"/tmp/nc_check_compressed\"\nos.system(f\"rm -rf {save_path}\")\nos.makedirs(save_path, exist_ok=True)\n\nencoding = {var: 
{\"zlib\": True, \"complevel\": 1} for var in [\"data\"]}\n\nfiles = []\ndef zarr_to_nc(file):\n with xr.open_zarr(file, consolidated=False) as ds:\n ds.to_netcdf(join(save_path, file.split(\"/\")[-1].replace(\".zarr\", \".nc\")), encoding=encoding)\n\n_ = Parallel(n_jobs=32)(delayed(zarr_to_nc)(file) for file in tqdm(glob(join(base_path, \"*.zarr\"))))\n\nos.system(f\"du -sh {save_path}\")\n\n100%|██████████| 1501/1501 [00:04<00:00, 311.18it/s]\n\n\n1.8G /tmp/nc_check_compressed\n\n\n0\n\n\n\nclass XarrayDatasetWithNC(Dataset):\n def __init__(self, path, max_files):\n self.base_path = path\n self.all_files = glob(join(path, \"*.nc\"))[:max_files]\n self.all_files.sort()\n self.all_ds = [xr.open_dataset(file) for file in tqdm(self.all_files)]\n self.lat_lags = [-2, -1, 0, 1, 2]\n self.lon_lags = [-2, -1, 0, 1, 2]\n \n def __len__(self):\n return len(self.all_files) * 25\n \n def __getitem__(self, idx):\n file_idx = idx // 25\n local_idx = idx % 25\n lat_lag = self.lat_lags[local_idx // 5]\n lon_lag = self.lon_lags[local_idx % 5]\n \n ds = self.all_ds[file_idx]\n img = ds.isel(lat_lag=lat_lag, lon_lag=lon_lag)['data'].values\n return torch.tensor(np.einsum(\"hwc->chw\", img).astype(np.float32) / 255)\n\n\nnc_path = \"/tmp/nc_check_compressed\"\n\n\nbatch_size = 128\nnum_workers = 32\n\ndataset = XarrayDatasetWithNC(nc_path, max_files=max_files)\nprocess_it(dataset, batch_size, num_workers)\n\n100%|██████████| 500/500 [00:02<00:00, 246.27it/s]\nTime: 0.7414: 100%|██████████| 98/98 [01:25<00:00, 1.15it/s]\n\n\nAverage Iteration Processing Time: 0.8260 +- 0.0530\nTotal time for all iterations: 80.9527\nTotal Wall Time per iteration: 0.8725\nTotal Wall Time: 85.5034" + "objectID": "posts/2021-10-12-sparsegps.html#data-preperation", + "href": "posts/2021-10-12-sparsegps.html#data-preperation", + "title": "SparseGPs in Stheno", + "section": "Data preperation", + "text": "Data preperation\n\n# Define points to predict at.\nx = B.linspace(0, 10, 100)\nx_obs = B.linspace(0, 
7, 50_000)\nx_ind = B.linspace(0, 10, 20)\n\n# Construct a prior.\nf = GP(EQ().periodic(2 * B.pi))\n\n# Sample a true, underlying function and observations.\nf_true = B.sin(x)\ny_obs = B.sin(x_obs) + B.sqrt(0.5) * B.randn(*x_obs.shape)" }, { - "objectID": "posts/Torch-DataLoaders.html#additional-experiments", - "href": "posts/Torch-DataLoaders.html#additional-experiments", - "title": "Data Handling for Large Scale ML", - "section": "Additional experiments", - "text": "Additional experiments\n\nn_images = 60000\nt = 84.9131/500/25 * n_images\nprint(f\"Time to process {n_images} images: \", t/60, \"minutes\")\n\nTime to process 60000 images: 6.793048000000001 minutes\n\n\n\nfiles = glob(join(base_path, \"*.zarr\"))\ndata_tensors = []\nfor file in tqdm(files):\n with xr.open_zarr(file, consolidated=False) as ds:\n # print(ds['data'].values.reshape(-1, 224, 224, 3))\n data_tensors.append(torch.tensor(np.einsum(\"nhwc->nchw\", ds['data'].values.reshape(-1, 224, 224, 3)).astype(np.float16) / 255))\n\n100%|██████████| 1501/1501 [02:44<00:00, 9.13it/s]\n\n\n\nall_in_one = torch.concat(data_tensors, dim=0)\nall_in_one.shape\n\ntorch.Size([37525, 3, 224, 224])\n\n\n\nall_in_one = all_in_one.to('cuda')" + "objectID": "posts/2021-10-12-sparsegps.html#plotting-function", + "href": "posts/2021-10-12-sparsegps.html#plotting-function", + "title": "SparseGPs in Stheno", + "section": "Plotting function", + "text": "Plotting function\n\ndef plot(method):\n if method == 'VFE':\n # Plot result.\n plt.plot(x, f_true, label=\"True\", style=\"test\")\n plt.scatter(\n x_obs,\n y_obs,\n label=\"Observations\",\n style=\"train\",\n c=\"tab:green\",\n alpha=0.35,\n )\n plt.scatter(\n x_ind,\n obs.mu(f.measure)[:, 0],\n label=\"Inducing Points\",\n style=\"train\",\n s=20,\n )\n plt.plot(x, mean, label=\"Prediction\", style=\"pred\")\n plt.fill_between(x, lower, upper, style=\"pred\")\n tweak()\n\n plt.show()\n else:\n # Plot result.\n plt.plot(x, f_true, label=\"True\", style=\"test\")\n 
plt.scatter(\n x_obs,\n y_obs,\n label=\"Observations\",\n style=\"train\",\n c=\"tab:green\",\n alpha=0.35,\n )\n plt.scatter(\n x_ind,\n B.dense(f_post(x_ind).mean),\n label=\"Inducing Points\",\n style=\"train\",\n s=20,\n )\n plt.plot(x, mean, label=\"Prediction\", style=\"pred\")\n plt.fill_between(x, lower, upper, style=\"pred\")\n tweak()\n\n plt.show()" }, { - "objectID": "posts/Torch-DataLoaders.html#insights", - "href": "posts/Torch-DataLoaders.html#insights", - "title": "Data Handling for Large Scale ML", - "section": "Insights", - "text": "Insights\n\nGPU Memory consumption is 17776MiB / 81920MiB for batch size 128 for ViT model\nUploading torch.Size([37525, 3, 224, 224]) of float32 data to GPU takes 22054MiB / 81920MiB of GPU Memory. Same data with float16 takes 11202MiB / 81920MiB of GPU Memory.\nIt seems .nc or .zarr are not making much difference in terms of time and/or memory." + "objectID": "posts/2021-10-12-sparsegps.html#sparse-regression-with-variational-free-energy-vfe-method", + "href": "posts/2021-10-12-sparsegps.html#sparse-regression-with-variational-free-energy-vfe-method", + "title": "SparseGPs in Stheno", + "section": "Sparse regression with Variational Free Energy (VFE) method", + "text": "Sparse regression with Variational Free Energy (VFE) method\n\n# Compute a pseudo-point approximation of the posterior.\nobs = PseudoObsVFE(f(x_ind), (f(x_obs, 0.5), y_obs))\n\n# Compute the ELBO.\nout.kv(\"ELBO\", obs.elbo(f.measure))\n\n# Compute the approximate posterior.\nf_post = f | obs\n\n# Make predictions with the approximate posterior.\nmean, lower, upper = f_post(x, 0.5).marginal_credible_bounds()\nplot('VFE')\n\nELBO: -5.345e+04" }, { - "objectID": "posts/GNNs_and_GPs.html", - "href": "posts/GNNs_and_GPs.html", - "title": "GNNs and GPs", + "objectID": "posts/2021-10-12-sparsegps.html#sparse-regression-with-fully-independent-training-conditional-fitc-mehod", + "href": 
"posts/2021-10-12-sparsegps.html#sparse-regression-with-fully-independent-training-conditional-fitc-mehod",
    "title": "SparseGPs in Stheno",
    "section": "Sparse Regression with Fully Independent Training Conditional (FITC) method",
    "text": "Sparse Regression with Fully Independent Training Conditional (FITC) method\n\n# Compute a pseudo-point approximation of the posterior.\nobs = PseudoObsFITC(f(x_ind), (f(x_obs, 0.5), y_obs))\n\n# Compute the ELBO.\nout.kv(\"ELBO\", obs.elbo(f.measure))\n\n# Compute the approximate posterior.\nf_post = f | obs\n\n# Make predictions with the approximate posterior.\nmean, lower, upper = f_post(x, 0.5).marginal_credible_bounds()\nplot('FITC')\n\nELBO: -5.345e+04"
  },
  {
    "objectID": "posts/2021-10-12-sparsegps.html#hyperparameter-tuning-noisy-sine-data",
    "href": "posts/2021-10-12-sparsegps.html#hyperparameter-tuning-noisy-sine-data",
    "title": "SparseGPs in Stheno",
    "section": "Hyperparameter tuning (Noisy Sine data)",
    "text": "Hyperparameter tuning (Noisy Sine data)\n\ndef model(vs):\n \"\"\"Construct a model with learnable parameters.\"\"\"\n return vs['variance']*GP(EQ().stretch(vs['length_scale']))\n\n\ntorch.manual_seed(123)\n\ndataObj = rd.SineNoisy(scale_X=False, scale_y=False, return_test=True, backend='torch')\nx_obs, y_obs, x = dataObj.get_data()\n\n\nplt.scatter(x_obs, y_obs, s=2);\n\n\n\n\n\n\n\n\n\nVFE\n\nvs = Vars(torch.float64)\nvs.positive(name=\"noise\")\nvs.positive(name=\"length_scale\");\nvs.positive(name=\"variance\");\nvs.positive(init=torch.linspace(0.4,0.6,10), shape=(10,), name='x_ind')\nvs.requires_grad(True)\n\noptimizer = torch.optim.Adam(vs.get_latent_vars(), lr=0.1)\nfig, ax = plt.subplots(1,2,figsize=(15,5))\nlosses = []\n\ndef update(i):\n optimizer.zero_grad()\n gp = model(vs)\n obs = PseudoObsVFE(gp(vs['x_ind']), (gp(x_obs, vs['noise']), y_obs))\n loss = -obs.elbo(gp.measure)\n losses.append(loss.item())\n loss.backward()\n optimizer.step()\n \n gp_post = gp | obs\n mean, lower, upper = 
gp_post(x, vs['noise']).marginal_credible_bounds()\n ind_mean = B.dense(gp_post(vs['x_ind']).mean)\n \n ax[0].cla();ax[1].cla();\n ax[0].scatter(x_obs, y_obs, s=2)\n with torch.no_grad():\n ax[0].plot()\n ax[0].plot(x, B.dense(mean), label='Prediction')\n ax[0].fill_between(x.ravel(), lower, upper, alpha=0.2, label='Uncertainty')\n ax[0].plot(x, dataObj.f(x), label='True')\n ax[0].scatter(vs['x_ind'], ind_mean, label='Inducing points')\n ax[0].set_xlabel('X')\n ax[0].legend()\n \n ax[1].plot(losses, label='loss')\n ax[1].set_xlabel('Iterations')\n ax[1].legend()\n \nanim = FuncAnimation(fig, update, range(50))\nrc('animation', html='jshtml')\nplt.close()\nanim\n\n\n\n\n\n\n\n\n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n Once\n \n Loop\n \n Reflect\n \n \n\n\n\n\n\n\n\n\nFITC\n\nvs = Vars(torch.float64)\nvs.positive(name=\"noise\")\nvs.positive(name=\"length_scale\");\nvs.positive(name=\"variance\");\nvs.positive(init=torch.linspace(0.4,0.6,10), shape=(10,), name='x_ind')\nvs.requires_grad(True)\n\noptimizer = torch.optim.Adam(vs.get_latent_vars(), lr=0.1)\nfig, ax = plt.subplots(1,2,figsize=(15,5))\nlosses = []\n\ndef update(i):\n optimizer.zero_grad()\n gp = model(vs)\n obs = PseudoObsFITC(gp(vs['x_ind']), (gp(x_obs, vs['noise']), y_obs))\n loss = -obs.elbo(gp.measure)\n losses.append(loss.item())\n loss.backward()\n optimizer.step()\n \n gp_post = gp | obs\n mean, lower, upper = gp_post(x, vs['noise']).marginal_credible_bounds()\n ind_mean = B.dense(gp_post(vs['x_ind']).mean)\n \n ax[0].cla();ax[1].cla();\n ax[0].scatter(x_obs, y_obs, s=2)\n with torch.no_grad():\n ax[0].plot()\n ax[0].plot(x, B.dense(mean), label='Prediction')\n ax[0].fill_between(x.ravel(), lower, upper, alpha=0.2, label='Uncertainty')\n ax[0].plot(x, dataObj.f(x), label='True')\n ax[0].scatter(vs['x_ind'], ind_mean, label='Inducing points')\n ax[0].set_xlabel('X')\n ax[0].legend()\n \n ax[1].plot(losses, label='loss')\n ax[1].set_xlabel('Iterations')\n 
ax[1].legend()\n \nanim = FuncAnimation(fig, update, range(50))\nrc('animation', html='jshtml')\nplt.close()\nanim\n\n\n\n\n\n\n\n\n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n Once\n \n Loop\n \n Reflect" + }, + { + "objectID": "posts/2024-12-27-download_caaqm_locations copy.html", + "href": "posts/2024-12-27-download_caaqm_locations copy.html", + "title": "Download CPCB CAAQM locations", "section": "", - "text": "import GPy\nimport numpy as np\nimport pandas as pd\n\nfrom sklearn.preprocessing import MinMaxScaler, StandardScaler\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.ensemble import RandomForestRegressor\n\nimport regdata as rd\nimport matplotlib.pyplot as plt\n\nimport torch\nimport torch.nn as nn\n\n\nx_train, y_train, x_test = rd.Step().get_data()\ny_train = y_train.reshape(-1, 1)\nx_test = x_test * 1.5\nprint(x_train.shape, y_train.shape, x_test.shape)\n\nplt.scatter(x_train, y_train, label='train');\n\n(50, 1) (50, 1) (100, 1)\n\n\n\n\n\n\n\n\n\n\nkernel = GPy.kern.RBF(1, variance=1, lengthscale=1)\nmodel = GPy.models.GPRegression(x_train, y_train.reshape(-1, 1), kernel)\nmodel.Gaussian_noise.variance = 0.1\n\ny_pred_gp, y_var = model.predict(x_test)\n\nplt.scatter(x_train, y_train, label='train');\nplt.plot(x_test, y_pred_gp, label='pred');\n\n\n\n\n\n\n\n\n\nclass GCN_Forward(nn.Module):\n def __init__(self, in_features, out_features):\n super().__init__()\n self.fc = nn.Linear(in_features, out_features)\n \n def forward(self, x, A):\n x = self.fc(x)\n x = torch.matmul(A, x)\n return x\n \nclass GCN_Reverse(nn.Module):\n def __init__(self, in_features, out_features):\n super().__init__()\n self.fc = nn.Linear(in_features, out_features)\n \n def forward(self, x, A):\n x = torch.matmul(A, x)\n x = self.fc(x)\n return x\n\nclass NN(nn.Module):\n def __init__(self, features):\n super().__init__()\n self.features = features\n \n for i, (in_features, out_features) in enumerate(zip(features[:-1], 
features[1:])):\n setattr(self, f'layer_{i}', nn.Linear(in_features, out_features))\n \n self.last_layer = nn.Linear(features[-1], 1)\n \n def forward(self, x, A):\n for i in range(len(self.features) - 1):\n if isinstance(getattr(self, f'layer_{i}'), GCN_Forward):\n x = getattr(self, f'layer_{i}')(x, A)\n else:\n x = getattr(self, f'layer_{i}')(x)\n x = nn.functional.gelu(x)\n \n x = self.last_layer(x)\n return x\n\nclass GCN(NN):\n def __init__(self, features):\n super().__init__(features)\n for i, (in_features, out_features) in enumerate(zip(features[:-1], features[1:])):\n setattr(self, f'layer_{i}', GCN_Forward(in_features, out_features))\n\n\nA = torch.tensor(kernel.K(x_train, x_train)).float()\n# A.fill_diagonal_(0)\nA = A / A.sum(dim=0, keepdim=True)\n# A.fill_diagonal_(1)\n\nnum_epochs = 500\nfeatures = [1, 1024]\n\ngcn_model = GCN(features=features)\nnn_model = NN(features=features)\n\ngcn_optimizer = torch.optim.Adam(gcn_model.parameters(), lr=0.01)\nnn_optimizer = torch.optim.Adam(nn_model.parameters(), lr=0.01)\n\ncriterion = nn.MSELoss()\n\nx_train_torch = torch.from_numpy(x_train).float()\ny_train_torch = torch.from_numpy(y_train).float()\n\ngcn_losses = []\nnn_losses = []\nfor epoch in range(num_epochs):\n gcn_optimizer.zero_grad()\n nn_optimizer.zero_grad()\n \n y_out_gcn = gcn_model(x_train_torch, A)\n y_out_nn = nn_model(x_train_torch, A)\n gcn_loss = criterion(y_out_gcn, y_train_torch)\n nn_loss = criterion(y_out_nn, y_train_torch)\n \n gcn_loss.backward()\n nn_loss.backward()\n \n gcn_losses.append(gcn_loss.item())\n nn_losses.append(nn_loss.item())\n \n gcn_optimizer.step()\n nn_optimizer.step()\n \nplt.plot(gcn_losses, label='gcn');\nplt.plot(nn_losses, label='nn');\nplt.legend();\n\n\n\n\n\n\n\n\n\nA_test = torch.tensor(kernel.K(x_test, x_test)).float()\n# A_test.fill_diagonal_(0)\nA_test = A_test / A_test.sum(dim=0, keepdim=True)\n# A_test.fill_diagonal_(1)\n\ny_pred_nn = nn_model(torch.from_numpy(x_test).float(), 
A_test).detach().numpy()\ny_pred_gcn = gcn_model(torch.from_numpy(x_test).float(), A_test).detach().numpy()\n\nplt.figure(figsize=(10, 6))\nplt.scatter(x_train, y_train, label='train');\nplt.plot(x_train, y_out_gcn.detach().numpy(), label='pred GCN train');\nplt.plot(x_train, y_out_nn.detach().numpy(), label='pred NN train');\nplt.plot(x_test, y_pred_gp, label='pred GP', linestyle='--');\nplt.plot(x_test, y_pred_nn, label='pred NN');\nplt.plot(x_test, y_pred_gcn, label='pred GCN');\nplt.ylim(-3, 3);\nplt.legend();" + "text": "try:\n import selenium\nexcept ModuleNotFoundError:\n %pip install selenium\n\nimport os\nimport re\nimport numpy as np\nimport pandas as pd\n\nfrom tqdm.notebook import tqdm, trange\nfrom time import sleep, time\nfrom selenium import webdriver\nfrom selenium.webdriver.support.ui import Select\nfrom selenium.webdriver.common.by import By\nfrom selenium.webdriver.support.ui import WebDriverWait\nfrom selenium.webdriver.support import expected_conditions as EC\n\n!rm log.txt\n\ndef print_it(*args, **kwargs):\n print(*args, **kwargs)\n with open('log.txt', 'a') as f:\n print(*args, **kwargs, file=f)\n\nglobal_init = time()\n\nrm: log.txt: No such file or directory\n\n\n\n# Set up WebDriver\nop = webdriver.ChromeOptions()\n\ndriver = webdriver.Chrome(options=op)\n\n# Navigate to the website and manually solve the CAPTCHA\ndriver.get(\"https://airquality.cpcb.gov.in/ccr/#/caaqm-dashboard-all/caaqm-landing\")\n\n\nManually solve captcha before moving on to the next cell..\n\n\n# leaflet-marker-icon custom-div-icon map_markers station_status_live leaflet-zoom-animated leaflet-interactive\nall_station_markers = driver.find_elements(By.CLASS_NAME, 'leaflet-marker-icon')\n\nall_stations_len = len(all_station_markers)\nprint(\"Total stations: \", all_stations_len)\n\nTotal stations: 558\n\n\n\ndef get_after(string, phrase):\n return string[string.index(phrase) + len(phrase):]\n\ndata = {}\nall_station_markers = driver.find_elements(By.CLASS_NAME, 
'leaflet-marker-icon')\nmarker_id = 0\nprogress_bar = tqdm(total=all_stations_len, desc=\"Progress\")\nwhile marker_id < all_stations_len:\n try:\n marker = all_station_markers[marker_id]\n driver.execute_script(\"arguments[0].click();\", marker)\n WebDriverWait(driver, 10).until(EC.element_to_be_clickable((By.CLASS_NAME, 'close')))\n children = driver.find_elements(By.CLASS_NAME, \"col-md-12\")\n assert \"Station Name\" in children[3].text\n \n # parse it\n \n station, address, location = children[3].text.split('\\n')\n station = get_after(station, \"Station Name: \")\n address = get_after(address, \"Address: \")\n latitude, longitude = location.split(\",\")\n latitude = get_after(latitude, \"Latitude: \")\n longitude = get_after(longitude, \"Longitude: \")\n \n data[station] = {\"address\": address, \"latitude\": float(latitude), \"longitude\": float(longitude)}\n close = driver.find_element(By.CLASS_NAME, \"close\")\n close.click()\n sleep(0.5)\n marker_id += 1\n progress_bar.update(1)\n except Exception as e:\n driver.refresh()\n input(\"Please manually solve the Captcha\")\n all_station_markers = driver.find_elements(By.CLASS_NAME, 'leaflet-marker-icon')\n\n\n\n\n\n---------------------------------------------------------------------------\nKeyboardInterrupt Traceback (most recent call last)\nCell In[4], line 12\n 10 marker = all_station_markers[marker_id]\n 11 driver.execute_script(\"arguments[0].click();\", marker)\n---> 12 WebDriverWait(driver, 10).until(EC.element_to_be_clickable((By.CLASS_NAME, 'close')))\n 13 children = driver.find_elements(By.CLASS_NAME, \"col-md-12\")\n 14 assert \"Station Name\" in children[3].text\n\nFile /opt/miniconda3/lib/python3.12/site-packages/selenium/webdriver/support/wait.py:102, in WebDriverWait.until(self, method, message)\n 100 screen = getattr(exc, \"screen\", None)\n 101 stacktrace = getattr(exc, \"stacktrace\", None)\n--> 102 time.sleep(self._poll)\n 103 if time.monotonic() > end_time:\n 104 break\n\nKeyboardInterrupt: 
\n\n\n\n\ndf = pd.DataFrame(data).T\ndf.index.name = \"station\"\ndf.head(2)\n\n\n\n\n\n\n\n\naddress\nlatitude\nlongitude\n\n\nstation\n\n\n\n\n\n\n\nSIDCO Kurichi, Coimbatore - TNPCB\nSIDCO Kurichi, Coimbatore, Tamil Nadu.\n10.942451\n76.978996\n\n\nMuradpur, Patna - BSPCB\nS K Memorial Hall Premises, Near Gandhi Maidan...\n25.619651\n85.147382\n\n\n\n\n\n\n\n\ndf.to_csv(\"station_data.csv\")" + }, + { + "objectID": "posts/2022-04-06-github_faqs.html", + "href": "posts/2022-04-06-github_faqs.html", + "title": "GitHub Contrubuting FAQs", + "section": "", + "text": "Create separate branches for each issue. Do not work on the master branch.\n\n\nWe will see that in Q5." }, { - "objectID": "posts/air-quality-google-.html", - "href": "posts/air-quality-google-.html", - "title": "Google Air Quality Data", + "objectID": "posts/2022-04-06-github_faqs.html#q1-what-is-an-efficient-way-to-work-on-multiple-issues-at-once", + "href": "posts/2022-04-06-github_faqs.html#q1-what-is-an-efficient-way-to-work-on-multiple-issues-at-once", + "title": "GitHub Contrubuting FAQs", "section": "", - "text": "import requests\nimport numpy as np\nimport pandas as pd\nimport xarray as xr\nfrom tqdm import tqdm\nimport matplotlib.pyplot as plt\nimport dask\nfrom dask.distributed import Client, LocalCluster\n\nif \"key\" in locals():\n pass\nelse:\n key = input(\"Enter your key: \")\n\nurl = f\"https://airquality.googleapis.com/v1/history:lookup?key={key}\"\nurl\n\n'https://airquality.googleapis.com/v1/history:lookup?key=AIzaSyA9ytSec31of_INkpuB3TZ6vLR1nzme9iQ'\nif \"client\" in locals():\n print(client)\nelse:\n cluster = LocalCluster(n_workers=54, threads_per_worker=1)\n client = Client(cluster)\n print(client)\n\n<Client: 'tcp://127.0.0.1:36239' processes=54 threads=54, memory=503.73 GiB>\npayload = {\n \"hours\": 24 * 30,\n \"pageSize\": 168,\n \"pageToken\": \"\",\n \"location\": {\"latitude\": 28.636429, \"longitude\": 77.201067},\n \"extraComputations\": [\"POLLUTANT_CONCENTRATION\", 
\"LOCAL_AQI\"],\n}\n\nheaders = {\"Content-Type\": \"application/json\"}\n\nresponse = requests.post(url, json=payload, headers=headers)\n\nres = response.json()\n24 * 30\n\n720\nlen(res[\"hoursInfo\"])\n\n168\nts = []\ncodes = []\npm25 = []\ndf = pd.DataFrame(columns=[\"timestamp\", \"value\", \"code\"])\n\nfor each in res[\"hoursInfo\"]:\n ts.append(each[\"dateTime\"])\n codes.append(each[\"pollutants\"][4][\"code\"])\n pm25.append(each[\"pollutants\"][4][\"concentration\"][\"value\"])\n\ndf[\"timestamp\"] = ts\ndf[\"value\"] = pm25\ndf[\"code\"] = codes\ndf[\"timestamp\"] = pd.to_datetime(df[\"timestamp\"])\ndf.head(20)\n\n\n---------------------------------------------------------------------------\nKeyError Traceback (most recent call last)\nCell In[5], line 9\n 7 ts.append(each[\"dateTime\"])\n 8 codes.append(each[\"pollutants\"][4][\"code\"])\n----> 9 pm25.append(each[\"pollutants\"][4][\"concentration\"][\"value\"])\n 11 df[\"timestamp\"] = ts\n 12 df[\"value\"] = pm25\n\nKeyError: 'concentration'\nsensor_data = pd.read_excel(\n \"/home/patel_zeel/blog/posts/site_12220230831125207.xlsx\", skiprows=16\n)\nsensor_data[\"From Date\"] = pd.to_datetime(\n sensor_data[\"From Date\"], format=\"%d-%m-%Y %H:%M\"\n)\nsensor_data[\"To Date\"] = pd.to_datetime(sensor_data[\"To Date\"], format=\"%d-%m-%Y %H:%M\")\nsensor_data[\"mean_time\"] = sensor_data[[\"From Date\", \"To Date\"]].mean(axis=1)\nsensor_data[\"utc_time\"] = sensor_data[\"mean_time\"] - pd.Timedelta(hours=5, minutes=30)\n\nfig, ax = plt.subplots(figsize=(15, 4))\nsensor_data.plot(x=\"utc_time\", y=\"PM2.5\", ax=ax, label=\"sensor\")\ndf.plot(x=\"timestamp\", y=\"value\", ax=ax, label=\"google\")" + "text": "Create separate branches for each issue. Do not work on the master branch.\n\n\nWe will see that in Q5." 
}, { - "objectID": "posts/air-quality-google-.html#request-at-a-grid.", - "href": "posts/air-quality-google-.html#request-at-a-grid.", - "title": "Google Air Quality Data", - "section": "Request at a grid.", - "text": "Request at a grid.\n\nfrom aqmsp_data.data import load_camx\n\ncamx = load_camx(years=2022, months=1, days=1, variables=\"P25\")\ncamx\n\n/home/patel_zeel/miniconda3/lib/python3.9/site-packages/xarray/core/indexing.py:1443: PerformanceWarning: Slicing is producing a large chunk. To accept the large\nchunk and silence this warning, set the option\n >>> with dask.config.set(**{'array.slicing.split_large_chunks': False}):\n ... array[indexer]\n\nTo avoid creating the large chunks, set the option\n >>> with dask.config.set(**{'array.slicing.split_large_chunks': True}):\n ... array[indexer]\n return self.array[key]\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n<xarray.Dataset>\nDimensions: (time: 24, latitude: 80, longitude: 80)\nCoordinates:\n * longitude (longitude) float64 76.85 76.86 76.87 76.88 ... 77.62 77.63 77.64\n * latitude (latitude) float64 28.2 28.21 28.22 28.23 ... 28.97 28.98 28.99\n * time (time) datetime64[ns] 2022-01-01T00:30:00 ... 2022-01-01T23:30:00\nData variables:\n P25 (time, latitude, longitude) float32 dask.array<chunksize=(24, 80, 80), meta=np.ndarray>\nAttributes: (12/34)\n CDATE: 2023126\n CTIME: 95909\n EXEC_ID: ???????????????? ...\n FILEDESC: I/O API formatted CAMx AVRG output ...\n FTYPE: 1\n GDNAM: ????????????????\n ... ...\n XCELL: 0.009999999776482582\n XCENT: 0.0\n XORIG: 76.8499984741211\n YCELL: 0.009999999776482582\n YCENT: 0.0\n YORIG: 28.200000762939453xarray.DatasetDimensions:time: 24latitude: 80longitude: 80Coordinates: (3)longitude(longitude)float6476.85 76.86 76.87 ... 77.63 77.64array([76.85, 76.86, 76.87, 76.88, 76.89, 76.9 , 76.91, 76.92, 76.93, 76.94,\n 76.95, 76.96, 76.97, 76.98, 76.99, 77. 
, 77.01, 77.02, 77.03, 77.04,\n 77.05, 77.06, 77.07, 77.08, 77.09, 77.1 , 77.11, 77.12, 77.13, 77.14,\n 77.15, 77.16, 77.17, 77.18, 77.19, 77.2 , 77.21, 77.22, 77.23, 77.24,\n 77.25, 77.26, 77.27, 77.28, 77.29, 77.3 , 77.31, 77.32, 77.33, 77.34,\n 77.35, 77.36, 77.37, 77.38, 77.39, 77.4 , 77.41, 77.42, 77.43, 77.44,\n 77.45, 77.46, 77.47, 77.48, 77.49, 77.5 , 77.51, 77.52, 77.53, 77.54,\n 77.55, 77.56, 77.57, 77.58, 77.59, 77.6 , 77.61, 77.62, 77.63, 77.64])latitude(latitude)float6428.2 28.21 28.22 ... 28.98 28.99array([28.2 , 28.21, 28.22, 28.23, 28.24, 28.25, 28.26, 28.27, 28.28, 28.29,\n 28.3 , 28.31, 28.32, 28.33, 28.34, 28.35, 28.36, 28.37, 28.38, 28.39,\n 28.4 , 28.41, 28.42, 28.43, 28.44, 28.45, 28.46, 28.47, 28.48, 28.49,\n 28.5 , 28.51, 28.52, 28.53, 28.54, 28.55, 28.56, 28.57, 28.58, 28.59,\n 28.6 , 28.61, 28.62, 28.63, 28.64, 28.65, 28.66, 28.67, 28.68, 28.69,\n 28.7 , 28.71, 28.72, 28.73, 28.74, 28.75, 28.76, 28.77, 28.78, 28.79,\n 28.8 , 28.81, 28.82, 28.83, 28.84, 28.85, 28.86, 28.87, 28.88, 28.89,\n 28.9 , 28.91, 28.92, 28.93, 28.94, 28.95, 28.96, 28.97, 28.98, 28.99])time(time)datetime64[ns]2022-01-01T00:30:00 ... 
2022-01-...array(['2022-01-01T00:30:00.000000000', '2022-01-01T01:30:00.000000000',\n '2022-01-01T02:30:00.000000000', '2022-01-01T03:30:00.000000000',\n '2022-01-01T04:30:00.000000000', '2022-01-01T05:30:00.000000000',\n '2022-01-01T06:30:00.000000000', '2022-01-01T07:30:00.000000000',\n '2022-01-01T08:30:00.000000000', '2022-01-01T09:30:00.000000000',\n '2022-01-01T10:30:00.000000000', '2022-01-01T11:30:00.000000000',\n '2022-01-01T12:30:00.000000000', '2022-01-01T13:30:00.000000000',\n '2022-01-01T14:30:00.000000000', '2022-01-01T15:30:00.000000000',\n '2022-01-01T16:30:00.000000000', '2022-01-01T17:30:00.000000000',\n '2022-01-01T18:30:00.000000000', '2022-01-01T19:30:00.000000000',\n '2022-01-01T20:30:00.000000000', '2022-01-01T21:30:00.000000000',\n '2022-01-01T22:30:00.000000000', '2022-01-01T23:30:00.000000000'],\n dtype='datetime64[ns]')Data variables: (1)P25(time, latitude, longitude)float32dask.array<chunksize=(24, 80, 80), meta=np.ndarray>long_name :FPRM units :micrograms/m**3 var_desc :VARIABLE FPRM \n\n\n\n\n\n\n\n\n\n\n\nArray\nChunk\n\n\n\n\nBytes\n600.00 kiB\n600.00 kiB\n\n\nShape\n(24, 80, 80)\n(24, 80, 80)\n\n\nDask graph\n1 chunks in 11 graph layers\n\n\nData type\nfloat32 numpy.ndarray\n\n\n\n\n 80 80 24\n\n\n\n\nIndexes: (3)longitudePandasIndexPandasIndex(Index([ 76.85, 76.86, 76.86999999999999,\n 76.88, 76.89, 76.89999999999999,\n 76.91, 76.91999999999999, 76.92999999999999,\n 76.94, 76.94999999999999, 76.96,\n 76.97, 76.97999999999999, 76.99,\n 77.0, 77.00999999999999, 77.02,\n 77.03, 77.03999999999999, 77.05,\n 77.05999999999999, 77.07, 77.08,\n 77.08999999999999, 77.1, 77.11,\n 77.11999999999999, 77.13, 77.14,\n 77.14999999999999, 77.16, 77.16999999999999,\n 77.17999999999999, 77.19, 77.19999999999999,\n 77.21, 77.22, 77.22999999999999,\n 77.24, 77.25, 77.25999999999999,\n 77.27, 77.28, 77.28999999999999,\n 77.3, 77.30999999999999, 77.32,\n 77.33, 77.33999999999999, 77.35,\n 77.36, 77.36999999999999, 77.38,\n 77.39, 77.39999999999999, 
77.41,\n 77.41999999999999, 77.42999999999999, 77.44,\n 77.44999999999999, 77.46, 77.47,\n 77.47999999999999, 77.49, 77.5,\n 77.50999999999999, 77.52, 77.53,\n 77.53999999999999, 77.55, 77.55999999999999,\n 77.57, 77.58, 77.58999999999999,\n 77.6, 77.61, 77.61999999999999,\n 77.63, 77.64],\n dtype='float64', name='longitude'))latitudePandasIndexPandasIndex(Index([ 28.2, 28.21, 28.22,\n 28.23, 28.24, 28.25,\n 28.259999999999998, 28.27, 28.279999999999998,\n 28.29, 28.3, 28.31,\n 28.32, 28.33, 28.34,\n 28.349999999999998, 28.36, 28.37,\n 28.38, 28.39, 28.4,\n 28.41, 28.419999999999998, 28.43,\n 28.439999999999998, 28.45, 28.46,\n 28.47, 28.48, 28.49,\n 28.5, 28.509999999999998, 28.52,\n 28.529999999999998, 28.54, 28.55,\n 28.56, 28.57, 28.58,\n 28.59, 28.599999999999998, 28.61,\n 28.62, 28.63, 28.64,\n 28.65, 28.66, 28.669999999999998,\n 28.68, 28.689999999999998, 28.7,\n 28.71, 28.72, 28.73,\n 28.74, 28.75, 28.759999999999998,\n 28.77, 28.779999999999998, 28.79,\n 28.8, 28.81, 28.82,\n 28.83, 28.84, 28.849999999999998,\n 28.86, 28.87, 28.88,\n 28.89, 28.9, 28.91,\n 28.919999999999998, 28.93, 28.939999999999998,\n 28.95, 28.96, 28.97,\n 28.98, 28.99],\n dtype='float64', name='latitude'))timePandasIndexPandasIndex(DatetimeIndex(['2022-01-01 00:30:00', '2022-01-01 01:30:00',\n '2022-01-01 02:30:00', '2022-01-01 03:30:00',\n '2022-01-01 04:30:00', '2022-01-01 05:30:00',\n '2022-01-01 06:30:00', '2022-01-01 07:30:00',\n '2022-01-01 08:30:00', '2022-01-01 09:30:00',\n '2022-01-01 10:30:00', '2022-01-01 11:30:00',\n '2022-01-01 12:30:00', '2022-01-01 13:30:00',\n '2022-01-01 14:30:00', '2022-01-01 15:30:00',\n '2022-01-01 16:30:00', '2022-01-01 17:30:00',\n '2022-01-01 18:30:00', '2022-01-01 19:30:00',\n '2022-01-01 20:30:00', '2022-01-01 21:30:00',\n '2022-01-01 22:30:00', '2022-01-01 23:30:00'],\n dtype='datetime64[ns]', name='time', freq=None))Attributes: (34)CDATE :2023126CTIME :95909EXEC_ID :???????????????? 
FILEDESC :I/O API formatted CAMx AVRG output FTYPE :1GDNAM :????????????????GDTYP :1HISTORY :Sun May 7 09:57:07 2023: ncrcat camxout.2023.05.06.nc camxout.2023.05.07.nc camxout.2023.05.08.nc camxout.2023.05.09.nc camxout.2023.05.10.nc camx120hr.nc\nIOAPI_VERSION :$Id: @(#) ioapi library version 3.0 $ NCO :netCDF Operators version 4.9.1 (Homepage = http://nco.sf.net, Code = http://github.com/nco/nco)NCOLS :80NLAYS :1NROWS :80NTHIK :1NVARS :2P_ALP :0.0P_BET :0.0P_GAM :0.0SDATE :2023126STIME :0TSTEP :10000UPNAM :CAMXMETOU VAR-LIST :P10 P25 VGLVLS :[0. 0.]VGTOP :-9.998999757492232e+36VGTYP :-9999WDATE :2023126WTIME :95909XCELL :0.009999999776482582XCENT :0.0XORIG :76.8499984741211YCELL :0.009999999776482582YCENT :0.0YORIG :28.200000762939453\n\n\n\nlat_grid, lon_grid = np.meshgrid(camx.latitude, camx.longitude)\nlat_lon_grid = np.vstack([lat_grid.ravel(), lon_grid.ravel()]).T\nprint(lat_lon_grid.shape)\n\nsession = requests.Session()\ndelayed_fn = dask.delayed(session.post)\nresponses = []\nfor lat, lon in tqdm(lat_lon_grid):\n payload = {\n \"hours\": 1,\n \"pageSize\": 200,\n \"pageToken\": \"\",\n \"location\": {\"latitude\": lat, \"longitude\": lon},\n \"extraComputations\": [\"POLLUTANT_CONCENTRATION\"],\n }\n\n headers = {\"Content-Type\": \"application/json\"}\n\n response = delayed_fn(url, json=payload, headers=headers)\n responses.append(response)\n\nall_res = dask.compute(*responses)\n\n(6400, 2)\n\n\n100%|██████████| 6400/6400 [00:01<00:00, 5981.06it/s]\n\n\n\n---------------------------------------------------------------------------\nKeyboardInterrupt Traceback (most recent call last)\nCell In[7], line 22\n 19 response = delayed_fn(url, json=payload, headers=headers)\n 20 responses.append(response)\n---> 22 all_res = dask.compute(*responses)\n\nFile ~/miniconda3/lib/python3.9/site-packages/dask/base.py:666, in compute(traverse, optimize_graph, scheduler, get, *args, **kwargs)\n 663 keys.append(x.__dask_keys__())\n 664 
postcomputes.append(x.__dask_postcompute__())\n--> 666 results = schedule(dsk, keys, **kwargs)\n 667 return repack([f(r, *a) for r, (f, a) in zip(results, postcomputes)])\n\nFile ~/miniconda3/lib/python3.9/site-packages/dask/threaded.py:89, in get(dsk, keys, cache, num_workers, pool, **kwargs)\n 86 elif isinstance(pool, multiprocessing.pool.Pool):\n 87 pool = MultiprocessingPoolExecutor(pool)\n---> 89 results = get_async(\n 90 pool.submit,\n 91 pool._max_workers,\n 92 dsk,\n 93 keys,\n 94 cache=cache,\n 95 get_id=_thread_get_id,\n 96 pack_exception=pack_exception,\n 97 **kwargs,\n 98 )\n 100 # Cleanup pools associated to dead threads\n 101 with pools_lock:\n\nFile ~/miniconda3/lib/python3.9/site-packages/dask/local.py:500, in get_async(submit, num_workers, dsk, result, cache, get_id, rerun_exceptions_locally, pack_exception, raise_exception, callbacks, dumps, loads, chunksize, **kwargs)\n 498 while state[\"waiting\"] or state[\"ready\"] or state[\"running\"]:\n 499 fire_tasks(chunksize)\n--> 500 for key, res_info, failed in queue_get(queue).result():\n 501 if failed:\n 502 exc, tb = loads(res_info)\n\nFile ~/miniconda3/lib/python3.9/site-packages/dask/local.py:137, in queue_get(q)\n 136 def queue_get(q):\n--> 137 return q.get()\n\nFile ~/miniconda3/lib/python3.9/queue.py:171, in Queue.get(self, block, timeout)\n 169 elif timeout is None:\n 170 while not self._qsize():\n--> 171 self.not_empty.wait()\n 172 elif timeout < 0:\n 173 raise ValueError(\"'timeout' must be a non-negative number\")\n\nFile ~/miniconda3/lib/python3.9/threading.py:312, in Condition.wait(self, timeout)\n 310 try: # restore state no matter what (e.g., KeyboardInterrupt)\n 311 if timeout is None:\n--> 312 waiter.acquire()\n 313 gotit = True\n 314 else:\n\nKeyboardInterrupt: \n\n\n\n\ndfs = []\nfor res, (lat, lon) in tqdm(zip(all_res, lat_lon_grid)):\n res = res.json()\n df = pd.DataFrame(columns=[\"timestamp\", \"value\", \"code\"])\n ts = []\n codes = []\n pm25 = []\n for each in 
res[\"hoursInfo\"]:\n ts.append(each[\"dateTime\"])\n codes.append(each[\"pollutants\"][4][\"code\"])\n try:\n pm25.append(each[\"pollutants\"][4][\"concentration\"][\"value\"])\n except KeyError:\n pm25.append(np.nan)\n\n df[\"timestamp\"] = ts\n df[\"value\"] = pm25\n df[\"code\"] = codes\n df[\"lat\"] = lat\n df[\"lon\"] = lon\n df[\"timestamp\"] = pd.to_datetime(df[\"timestamp\"])\n dfs.append(df)\nmaster_df = pd.concat(dfs)\nassert master_df[\"code\"].nunique() == 1 and master_df[\"code\"].unique()[0] == \"pm25\"\n\nds = master_df.set_index([\"lat\", \"lon\", \"timestamp\"]).to_xarray()\nds.isel(timestamp=0)[\"value\"].plot(x=\"lon\", y=\"lat\", cmap=\"RdYlGn_r\")\n\n0it [00:00, ?it/s]6400it [00:12, 497.91it/s]" + "objectID": "posts/2022-04-06-github_faqs.html#q2-what-to-do-if-the-main-or-master-gets-updated-before-i-open-a-pr", + "href": "posts/2022-04-06-github_faqs.html#q2-what-to-do-if-the-main-or-master-gets-updated-before-i-open-a-pr", + "title": "GitHub Contrubuting FAQs", + "section": "Q2: What to do if the main (or master) gets updated before I open a PR?", + "text": "Q2: What to do if the main (or master) gets updated before I open a PR?\nPull the changes directly to your branch with:\ngit pull https://github.com/probml/pyprobml" }, { - "objectID": "posts/air-quality-google-.html#appendix", - "href": "posts/air-quality-google-.html#appendix", - "title": "Google Air Quality Data", - "section": "Appendix", - "text": "Appendix\n\nimport xarray as xr\n\n\nglobal_date = \"2023-07-26\"\nds_met = xr.open_dataset(\n f'../../sarath_auto_download/data/camxmet2d.delhi.{global_date.replace(\"-\",\"\")}.96hours.nc'\n)\nds_aq = xr.open_dataset(\n f'../../sarath_auto_download/data/camx120hr_merged_{global_date.replace(\"-\",\"\")}.nc'\n)\nds_met\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n<xarray.Dataset>\nDimensions: (TSTEP: 96, VAR: 14, DATE-TIME: 2, LAY: 1, ROW: 80, COL: 80)\nDimensions without coordinates: TSTEP, VAR, DATE-TIME, LAY, ROW, COL\nData variables: (12/15)\n 
TFLAG (TSTEP, VAR, DATE-TIME) int32 ...\n TSURF_K (TSTEP, LAY, ROW, COL) float32 ...\n SNOWEW_M (TSTEP, LAY, ROW, COL) float32 ...\n SNOWAGE_HR (TSTEP, LAY, ROW, COL) float32 ...\n PRATE_MMpH (TSTEP, LAY, ROW, COL) float32 ...\n CLOUD_OD (TSTEP, LAY, ROW, COL) float32 ...\n ... ...\n SWSFC_WpM2 (TSTEP, LAY, ROW, COL) float32 ...\n SOLM_M3pM3 (TSTEP, LAY, ROW, COL) float32 ...\n CLDTOP_KM (TSTEP, LAY, ROW, COL) float32 ...\n CAPE (TSTEP, LAY, ROW, COL) float32 ...\n PBL_WRF_M (TSTEP, LAY, ROW, COL) float32 ...\n PBL_YSU_M (TSTEP, LAY, ROW, COL) float32 ...\nAttributes: (12/33)\n IOAPI_VERSION: $Id: @(#) ioapi library version 3.0 $ ...\n EXEC_ID: ???????????????? ...\n FTYPE: 1\n CDATE: 2023207\n CTIME: 72116\n WDATE: 2023207\n ... ...\n VGLVLS: [0. 0.]\n GDNAM: ????????????????\n UPNAM: CAMx2IOAPI \n VAR-LIST: TSURF_K SNOWEW_M SNOWAGE_HR PRATE_MMp...\n FILEDESC: I/O API formatted CAMx AVRG output ...\n HISTORY: \n\n\n\nplt.figure(figsize=(15, 3))\nds_met[\"PBL_WRF_M\"].isel(LAY=0).mean(dim=[\"ROW\", \"COL\"]).plot()\ntwin_x = plt.gca().twinx()\nds_aq[\"P25\"].isel(LAY=0, TSTEP=range(24, 120)).mean(dim=[\"ROW\", \"COL\"]).plot(\n ax=twin_x, color=\"r\"\n)\n# set ylabel color as red\ntwin_x.set_ylabel(\"PM2.5\", color=\"r\")\n# set yticks color as red\ntwin_x.tick_params(axis=\"y\", colors=\"r\")\ntwin_x.set_xlabel(\"Time\")\n\nText(0.5, 0, 'Time')\n\n\n\n\n\n\n\n\n\n\nds_met\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n<xarray.Dataset>\nDimensions: (TSTEP: 96, VAR: 14, DATE-TIME: 2, LAY: 1, ROW: 80, COL: 80)\nDimensions without coordinates: TSTEP, VAR, DATE-TIME, LAY, ROW, COL\nData variables: (12/15)\n TFLAG (TSTEP, VAR, DATE-TIME) int32 ...\n TSURF_K (TSTEP, LAY, ROW, COL) float32 ...\n SNOWEW_M (TSTEP, LAY, ROW, COL) float32 ...\n SNOWAGE_HR (TSTEP, LAY, ROW, COL) float32 ...\n PRATE_MMpH (TSTEP, LAY, ROW, COL) float32 ...\n CLOUD_OD (TSTEP, LAY, ROW, COL) float32 ...\n ... ...\n SWSFC_WpM2 (TSTEP, LAY, ROW, COL) float32 ...\n SOLM_M3pM3 (TSTEP, LAY, ROW, COL) float32 ...\n CLDTOP_KM (TSTEP, LAY, ROW, COL) float32 ...\n CAPE (TSTEP, LAY, ROW, COL) float32 ...\n PBL_WRF_M (TSTEP, LAY, ROW, COL) float32 ...\n PBL_YSU_M (TSTEP, LAY, ROW, COL) float32 ...\nAttributes: (12/33)\n IOAPI_VERSION: $Id: @(#) ioapi library version 3.0 $ ...\n EXEC_ID: ???????????????? ...\n FTYPE: 1\n CDATE: 2023207\n CTIME: 72116\n WDATE: 2023207\n ... ...\n VGLVLS: [0. 
0.]\n GDNAM: ????????????????\n UPNAM: CAMx2IOAPI \n VAR-LIST: TSURF_K SNOWEW_M SNOWAGE_HR PRATE_MMp...\n FILEDESC: I/O API formatted CAMx AVRG output ...\n HISTORY: \n\n\n\nnp.array([1, 2, 3])[:]\n\narray([1, 2, 3])\n\n\n\nimport numpy as np\nimport pandas as pd\n\ndf = pd.DataFrame(columns=[\"lag\"], index=list(ds_met.data_vars)[1:])\n\nfor lag in range(-24, 24):\n for var in list(ds_met.data_vars)[1:]:\n if lag == 0:\n met_series = ds_met[var].isel(LAY=0).mean(dim=[\"ROW\", \"COL\"]).values\n aq_series = (\n ds_aq[\"P25\"]\n .isel(LAY=0, TSTEP=range(24, 120))\n .mean(dim=[\"ROW\", \"COL\"])\n .values\n )\n elif lag > 0:\n met_series = ds_met[var].isel(LAY=0).mean(dim=[\"ROW\", \"COL\"]).values[lag:]\n aq_series = (\n ds_aq[\"P25\"]\n .isel(LAY=0, TSTEP=range(24, 120))\n .mean(dim=[\"ROW\", \"COL\"])\n .values[:-lag]\n )\n else:\n met_series = ds_met[var].isel(LAY=0).mean(dim=[\"ROW\", \"COL\"]).values[:lag]\n aq_series = (\n ds_aq[\"P25\"]\n .isel(LAY=0, TSTEP=range(24, 120))\n .mean(dim=[\"ROW\", \"COL\"])\n .values[-lag:]\n )\n # print(f\"{var}: {np.corrcoef(met_series, aq_series)[0, 1]}\")\n df.loc[var, lag] = np.corrcoef(met_series, 
aq_series)[0, 1]\n\n/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide\n c /= stddev[:, None]\n/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide\n c /= stddev[None, 
:]\n/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide\n c /= stddev[:, None]\n/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide\n c /= stddev[None, :]\n/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide\n c /= stddev[:, None]\n/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide\n c /= stddev[None, :]\n/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide\n c /= stddev[:, None]\n/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide\n c /= stddev[None, :]\n/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide\n c /= stddev[:, None]\n/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide\n c /= stddev[None, :]\n/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide\n c /= stddev[:, None]\n/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide\n c /= stddev[None, :]\n/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2897: RuntimeWarning: invalid value encountered in divide\n c /= stddev[:, None]\n/home/patel_zeel/miniconda3/lib/python3.9/site-packages/numpy/lib/function_base.py:2898: RuntimeWarning: invalid value encountered in divide\n c /= stddev[None, 
:]\n\n\n\ndf.index.values\n\narray(['TSURF_K', 'SNOWEW_M', 'SNOWAGE_HR', 'PRATE_MMpH', 'CLOUD_OD',\n 'U10_MpS', 'V10_MpS', 'T2_K', 'SWSFC_WpM2', 'SOLM_M3pM3',\n 'CLDTOP_KM', 'CAPE', 'PBL_WRF_M', 'PBL_YSU_M'], dtype=object)\n\n\n\nimport matplotlib.pyplot as plt\nfrom matplotlib.animation import FuncAnimation\n\nplt.figure(figsize=(6, 3))\nfor var in [\"TSURF_K\", \"T2_K\", \"SWSFC_WpM2\", \"PBL_WRF_M\", \"PBL_YSU_M\"]:\n df.loc[var].plot(label=var)\nfor var in [\"CLOUD_OD\", \"V10_MpS\"]:\n df.loc[var].plot(label=var, linestyle=\"--\")\n\nplt.legend(bbox_to_anchor=(1, 1))\nplt.ylabel(\"Correlation with P25 from CAMx\")\nplt.tight_layout()\nplt.savefig(\"lag.pdf\")\n\n\n\n\n\n\n\n\n\nplt.rcParams[\"animation.html\"] = \"jshtml\"\nfig, ax = plt.subplots(figsize=(15, 4))\nmappable = (\n ds_met[\"PBL_WRF_M\"]\n .isel(TSTEP=1, LAY=0)\n .plot(\n x=\"COL\", y=\"ROW\", cmap=\"RdYlGn_r\", ax=ax, vmin=0, vmax=2000, add_colorbar=False\n )\n)\nfig.colorbar(mappable)\n\n\ndef plot_it(t):\n ax.cla()\n tmp = ds_met[\"PBL_WRF_M\"].isel(TSTEP=t, LAY=0)\n tmp.plot(\n x=\"COL\", y=\"ROW\", cmap=\"RdYlGn_r\", ax=ax, vmin=0, vmax=2000, add_colorbar=False\n )\n ax.set_xlabel(\"Longitude\")\n ax.set_ylabel(\"Latitude\")\n ax.set_title(f\"Mean PBLH: {tmp.mean().values:.2f} m\")\n\n\nanim = FuncAnimation(fig, plot_it, frames=range(1, 13), interval=500)\nplt.close()\nanim\n\n\n\n\n\n\n\n\n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n Once\n \n Loop\n \n Reflect\n \n \n\n\n\n\n\n\n\nfrom aqmsp_data.data import load_caaqm\n\ncaaqm = load_caaqm(years=2022, months=1, days=1, variables=\"PM2.5\")\n\n\nimport pandas as pd\nfrom glob import glob\n\nfiles = glob(\"/home/patel_zeel/aqmsp/aqmsp_data/data/caaqm/raw_2023/*.xlsx\")\n\ndf = pd.read_excel(\n files[-3],\n header=None,\n)\n# print(df.head(7))\nstation = df.iloc[6, 1]\nlat = caaqm.sel(station=station).latitude.values.item()\nlon = caaqm.sel(station=station).longitude.values.item()\ndf = df.iloc[16:, :]\n# assign 
first row as column names\ndf.columns = df.iloc[0].values\n# drop first row\ndf = df.iloc[1:, :]\nprint(station)\ndf[\"time\"] = pd.to_datetime(df[\"From Date\"], format=\"%d-%m-%Y %H:%M\") + pd.Timedelta(\n minutes=30\n)\ndf\n\nVivek Vihar, Delhi - DPCC\n\n\n\n\n\n\n\n\n\nFrom Date\nTo Date\nPM2.5\nPM10\nAT\nBP\nRH\ntime\n\n\n\n\n17\n01-01-2023 00:00\n01-01-2023 01:00\n141\n211\n11.95\n997.83\n78.9\n2023-01-01 00:30:00\n\n\n18\n01-01-2023 01:00\n01-01-2023 02:00\n149\n210\n11.85\n997.47\n79.1\n2023-01-01 01:30:00\n\n\n19\n01-01-2023 02:00\n01-01-2023 03:00\n141\n186\n11.3\n997.03\n80.25\n2023-01-01 02:30:00\n\n\n20\n01-01-2023 03:00\n01-01-2023 04:00\n138\n174\n10.3\n996.5\n83.33\n2023-01-01 03:30:00\n\n\n21\n01-01-2023 04:00\n01-01-2023 05:00\n122\n161\n10.05\n996.38\n84.08\n2023-01-01 04:30:00\n\n\n...\n...\n...\n...\n...\n...\n...\n...\n...\n\n\n5077\n30-07-2023 20:00\n30-07-2023 21:00\n14\n95.75\n30.87\n972.4\n66.17\n2023-07-30 20:30:00\n\n\n5078\n30-07-2023 21:00\n30-07-2023 22:00\n18.25\n114.75\n30.6\n972.42\n69.22\n2023-07-30 21:30:00\n\n\n5079\n30-07-2023 22:00\n30-07-2023 23:00\n20.5\n100.5\n30.48\n972.43\n72\n2023-07-30 22:30:00\n\n\n5080\n30-07-2023 23:00\n31-07-2023 00:00\n17.75\n103\n30.42\n972.35\n71.92\n2023-07-30 23:30:00\n\n\n5081\n31-07-2023 00:00\n31-07-2023 00:00\n23\n124\n30.4\n972.3\n71.8\n2023-07-31 00:30:00\n\n\n\n\n5065 rows × 8 columns\n\n\n\n\ndef process_it(ds, date):\n if ds.TSTEP.size == 120:\n print(\"120\")\n ds = ds.isel(LAY=0, TSTEP=range(24, 48))\n else:\n print(\"96\")\n ds = ds.isel(LAY=0, TSTEP=range(24))\n ds1 = ds.assign(time=(\"TSTEP\", pd.date_range(date, periods=24, freq=\"H\")))\n lats = np.arange(80) * ds1.YCELL + ds1.YORIG\n lons = np.arange(80) * ds1.XCELL + ds1.XORIG\n ds2 = ds1.assign_coords(lat=(\"ROW\", lats), lon=(\"COL\", lons))\n ds3 = ds2.swap_dims({\"TSTEP\": \"time\", \"ROW\": \"lat\", \"COL\": \"lon\"})\n ds3[\"time\"] = ds3[\"time\"] + pd.Timedelta(hours=5, minutes=30)\n return ds3\n\n\nds_met_processed = 
process_it(ds_met, global_date)\nds_met_processed\n\n96\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n<xarray.Dataset>\nDimensions: (time: 24, VAR: 14, DATE-TIME: 2, lat: 80, lon: 80)\nCoordinates:\n * time (time) datetime64[ns] 2023-07-26T05:30:00 ... 2023-07-27T04:3...\n * lat (lat) float64 28.2 28.21 28.22 28.23 ... 28.96 28.97 28.98 28.99\n * lon (lon) float64 76.85 76.86 76.87 76.88 ... 77.62 77.63 77.64\nDimensions without coordinates: VAR, DATE-TIME\nData variables: (12/15)\n TFLAG (time, VAR, DATE-TIME) int32 ...\n TSURF_K (time, lat, lon) float32 ...\n SNOWEW_M (time, lat, lon) float32 ...\n SNOWAGE_HR (time, lat, lon) float32 ...\n PRATE_MMpH (time, lat, lon) float32 ...\n CLOUD_OD (time, lat, lon) float32 ...\n ... ...\n SWSFC_WpM2 (time, lat, lon) float32 ...\n SOLM_M3pM3 (time, lat, lon) float32 ...\n CLDTOP_KM (time, lat, lon) float32 ...\n CAPE (time, lat, lon) float32 ...\n PBL_WRF_M (time, lat, lon) float32 ...\n PBL_YSU_M (time, lat, lon) float32 ...\nAttributes: (12/33)\n IOAPI_VERSION: $Id: @(#) ioapi library version 3.0 $ ...\n EXEC_ID: ???????????????? ...\n FTYPE: 1\n CDATE: 2023207\n CTIME: 72116\n WDATE: 2023207\n ... ...\n VGLVLS: [0. 0.]\n GDNAM: ????????????????\n UPNAM: CAMx2IOAPI \n VAR-LIST: TSURF_K SNOWEW_M SNOWAGE_HR PRATE_MMp...\n FILEDESC: I/O API formatted CAMx AVRG output ...\n HISTORY: xarray.DatasetDimensions:time: 24VAR: 14DATE-TIME: 2lat: 80lon: 80Coordinates: (3)time(time)datetime64[ns]2023-07-26T05:30:00 ... 
2023-07-...array(['2023-07-26T05:30:00.000000000', '2023-07-26T06:30:00.000000000',\n '2023-07-26T07:30:00.000000000', '2023-07-26T08:30:00.000000000',\n '2023-07-26T09:30:00.000000000', '2023-07-26T10:30:00.000000000',\n '2023-07-26T11:30:00.000000000', '2023-07-26T12:30:00.000000000',\n '2023-07-26T13:30:00.000000000', '2023-07-26T14:30:00.000000000',\n '2023-07-26T15:30:00.000000000', '2023-07-26T16:30:00.000000000',\n '2023-07-26T17:30:00.000000000', '2023-07-26T18:30:00.000000000',\n '2023-07-26T19:30:00.000000000', '2023-07-26T20:30:00.000000000',\n '2023-07-26T21:30:00.000000000', '2023-07-26T22:30:00.000000000',\n '2023-07-26T23:30:00.000000000', '2023-07-27T00:30:00.000000000',\n '2023-07-27T01:30:00.000000000', '2023-07-27T02:30:00.000000000',\n '2023-07-27T03:30:00.000000000', '2023-07-27T04:30:00.000000000'],\n dtype='datetime64[ns]')lat(lat)float6428.2 28.21 28.22 ... 28.98 28.99array([28.200001, 28.210001, 28.220001, 28.230001, 28.240001, 28.250001,\n 28.260001, 28.270001, 28.280001, 28.290001, 28.300001, 28.310001,\n 28.320001, 28.330001, 28.340001, 28.350001, 28.360001, 28.370001,\n 28.380001, 28.390001, 28.400001, 28.410001, 28.420001, 28.430001,\n 28.440001, 28.450001, 28.460001, 28.470001, 28.480001, 28.490001,\n 28.500001, 28.510001, 28.520001, 28.530001, 28.540001, 28.550001,\n 28.560001, 28.570001, 28.580001, 28.590001, 28.600001, 28.610001,\n 28.620001, 28.630001, 28.640001, 28.650001, 28.660001, 28.670001,\n 28.680001, 28.690001, 28.700001, 28.710001, 28.720001, 28.730001,\n 28.740001, 28.750001, 28.760001, 28.770001, 28.780001, 28.790001,\n 28.800001, 28.810001, 28.820001, 28.830001, 28.840001, 28.850001,\n 28.860001, 28.870001, 28.880001, 28.890001, 28.900001, 28.910001,\n 28.920001, 28.930001, 28.940001, 28.950001, 28.960001, 28.970001,\n 28.980001, 28.990001])lon(lon)float6476.85 76.86 76.87 ... 
77.63 77.64array([76.849998, 76.859998, 76.869998, 76.879998, 76.889998, 76.899998,\n 76.909998, 76.919998, 76.929998, 76.939998, 76.949998, 76.959998,\n 76.969998, 76.979998, 76.989998, 76.999998, 77.009998, 77.019998,\n 77.029998, 77.039998, 77.049998, 77.059998, 77.069998, 77.079998,\n 77.089998, 77.099998, 77.109998, 77.119998, 77.129998, 77.139998,\n 77.149998, 77.159998, 77.169998, 77.179998, 77.189998, 77.199998,\n 77.209998, 77.219998, 77.229998, 77.239998, 77.249998, 77.259998,\n 77.269998, 77.279998, 77.289998, 77.299998, 77.309998, 77.319998,\n 77.329998, 77.339998, 77.349998, 77.359998, 77.369998, 77.379998,\n 77.389998, 77.399998, 77.409998, 77.419998, 77.429998, 77.439998,\n 77.449998, 77.459998, 77.469998, 77.479998, 77.489998, 77.499998,\n 77.509998, 77.519998, 77.529998, 77.539998, 77.549998, 77.559998,\n 77.569998, 77.579998, 77.589998, 77.599998, 77.609998, 77.619998,\n 77.629998, 77.639998])Data variables: (15)TFLAG(time, VAR, DATE-TIME)int32...units :<YYYYDDD,HHMMSS>long_name :TFLAG var_desc :Timestep-valid flags: (1) YYYYDDD or (2) HHMMSS [672 values with dtype=int32]TSURF_K(time, lat, lon)float32...long_name :TSURF_K units :ppmV var_desc :VARIABLE TSURF_K [153600 values with dtype=float32]SNOWEW_M(time, lat, lon)float32...long_name :SNOWEW_M units :ppmV var_desc :VARIABLE SNOWEW_M [153600 values with dtype=float32]SNOWAGE_HR(time, lat, lon)float32...long_name :SNOWAGE_HR units :ppmV var_desc :VARIABLE SNOWAGE_HR [153600 values with dtype=float32]PRATE_MMpH(time, lat, lon)float32...long_name :PRATE_MMpH units :ppmV var_desc :VARIABLE PRATE_MMpH [153600 values with dtype=float32]CLOUD_OD(time, lat, lon)float32...long_name :CLOUD_OD units :ppmV var_desc :VARIABLE CLOUD_OD [153600 values with dtype=float32]U10_MpS(time, lat, lon)float32...long_name :U10_MpS units :ppmV var_desc :VARIABLE U10_MpS [153600 values with dtype=float32]V10_MpS(time, lat, lon)float32...long_name :V10_MpS units :ppmV var_desc :VARIABLE V10_MpS [153600 values with 
dtype=float32]T2_K(time, lat, lon)float32...long_name :T2_K units :ppmV var_desc :VARIABLE T2_K [153600 values with dtype=float32]SWSFC_WpM2(time, lat, lon)float32...long_name :SWSFC_WpM2 units :ppmV var_desc :VARIABLE SWSFC_WpM2 [153600 values with dtype=float32]SOLM_M3pM3(time, lat, lon)float32...long_name :SOLM_M3pM3 units :ppmV var_desc :VARIABLE SOLM_M3pM3 [153600 values with dtype=float32]CLDTOP_KM(time, lat, lon)float32...long_name :CLDTOP_KM units :ppmV var_desc :VARIABLE CLDTOP_KM [153600 values with dtype=float32]CAPE(time, lat, lon)float32...long_name :CAPE units :ppmV var_desc :VARIABLE CAPE [153600 values with dtype=float32]PBL_WRF_M(time, lat, lon)float32...long_name :PBL_WRF_M units :ppmV var_desc :VARIABLE PBL_WRF_M [153600 values with dtype=float32]PBL_YSU_M(time, lat, lon)float32...long_name :PBL_YSU_M units :ppmV var_desc :VARIABLE PBL_YSU_M [153600 values with dtype=float32]Indexes: (3)timePandasIndexPandasIndex(DatetimeIndex(['2023-07-26 05:30:00', '2023-07-26 06:30:00',\n '2023-07-26 07:30:00', '2023-07-26 08:30:00',\n '2023-07-26 09:30:00', '2023-07-26 10:30:00',\n '2023-07-26 11:30:00', '2023-07-26 12:30:00',\n '2023-07-26 13:30:00', '2023-07-26 14:30:00',\n '2023-07-26 15:30:00', '2023-07-26 16:30:00',\n '2023-07-26 17:30:00', '2023-07-26 18:30:00',\n '2023-07-26 19:30:00', '2023-07-26 20:30:00',\n '2023-07-26 21:30:00', '2023-07-26 22:30:00',\n '2023-07-26 23:30:00', '2023-07-27 00:30:00',\n '2023-07-27 01:30:00', '2023-07-27 02:30:00',\n '2023-07-27 03:30:00', '2023-07-27 04:30:00'],\n dtype='datetime64[ns]', name='time', freq=None))latPandasIndexPandasIndex(Index([28.200000762939453, 28.210000762715936, 28.22000076249242,\n 28.2300007622689, 28.240000762045383, 28.250000761821866,\n 28.26000076159835, 28.27000076137483, 28.280000761151314,\n 28.290000760927796, 28.30000076070428, 28.31000076048076,\n 28.320000760257244, 28.330000760033727, 28.34000075981021,\n 28.350000759586692, 28.360000759363174, 28.370000759139657,\n 
28.38000075891614, 28.390000758692622, 28.400000758469105,\n 28.410000758245587, 28.42000075802207, 28.430000757798553,\n 28.440000757575035, 28.450000757351518, 28.460000757128,\n 28.470000756904483, 28.480000756680965, 28.490000756457448,\n 28.50000075623393, 28.510000756010413, 28.520000755786896,\n 28.53000075556338, 28.54000075533986, 28.550000755116343,\n 28.560000754892826, 28.57000075466931, 28.58000075444579,\n 28.590000754222274, 28.600000753998756, 28.61000075377524,\n 28.62000075355172, 28.630000753328204, 28.640000753104687,\n 28.65000075288117, 28.660000752657652, 28.670000752434134,\n 28.680000752210617, 28.6900007519871, 28.700000751763582,\n 28.710000751540065, 28.720000751316547, 28.73000075109303,\n 28.740000750869513, 28.750000750645995, 28.760000750422478,\n 28.77000075019896, 28.780000749975443, 28.790000749751925,\n 28.800000749528408, 28.81000074930489, 28.820000749081373,\n 28.830000748857856, 28.84000074863434, 28.85000074841082,\n 28.860000748187304, 28.870000747963786, 28.88000074774027,\n 28.89000074751675, 28.900000747293234, 28.910000747069716,\n 28.9200007468462, 28.93000074662268, 28.940000746399164,\n 28.950000746175647, 28.96000074595213, 28.970000745728612,\n 28.980000745505095, 28.990000745281577],\n dtype='float64', name='lat'))lonPandasIndexPandasIndex(Index([ 76.8499984741211, 76.85999847389758, 76.86999847367406,\n 76.87999847345054, 76.88999847322702, 76.8999984730035,\n 76.90999847277999, 76.91999847255647, 76.92999847233295,\n 76.93999847210944, 76.94999847188592, 76.9599984716624,\n 76.96999847143888, 76.97999847121537, 76.98999847099185,\n 76.99999847076833, 77.00999847054482, 77.0199984703213,\n 77.02999847009778, 77.03999846987426, 77.04999846965075,\n 77.05999846942723, 77.06999846920371, 77.0799984689802,\n 77.08999846875668, 77.09999846853316, 77.10999846830964,\n 77.11999846808612, 77.1299984678626, 77.13999846763909,\n 77.14999846741557, 77.15999846719205, 77.16999846696854,\n 77.17999846674502, 77.1899984665215, 
77.19999846629798,\n 77.20999846607447, 77.21999846585095, 77.22999846562743,\n 77.23999846540391, 77.2499984651804, 77.25999846495688,\n 77.26999846473336, 77.27999846450984, 77.28999846428633,\n 77.29999846406281, 77.30999846383929, 77.31999846361578,\n 77.32999846339226, 77.33999846316874, 77.34999846294522,\n 77.3599984627217, 77.36999846249819, 77.37999846227467,\n 77.38999846205115, 77.39999846182764, 77.40999846160412,\n 77.4199984613806, 77.42999846115708, 77.43999846093357,\n 77.44999846071005, 77.45999846048653, 77.46999846026301,\n 77.4799984600395, 77.48999845981598, 77.49999845959246,\n 77.50999845936894, 77.51999845914543, 77.52999845892191,\n 77.53999845869839, 77.54999845847487, 77.55999845825136,\n 77.56999845802784, 77.57999845780432, 77.5899984575808,\n 77.59999845735729, 77.60999845713377, 77.61999845691025,\n 77.62999845668674, 77.63999845646322],\n dtype='float64', name='lon'))Attributes: (33)IOAPI_VERSION :$Id: @(#) ioapi library version 3.0 $ EXEC_ID :???????????????? FTYPE :1CDATE :2023207CTIME :72116WDATE :2023207WTIME :72116SDATE :2023207STIME :0TSTEP :10000NTHIK :1NCOLS :80NROWS :80NLAYS :1NVARS :14GDTYP :1P_ALP :0.0P_BET :0.0P_GAM :0.0XCENT :0.0YCENT :0.0XORIG :76.8499984741211YORIG :28.200000762939453XCELL :0.009999999776482582YCELL :0.009999999776482582VGTYP :-9999VGTOP :-9.999e+36VGLVLS :[0. 
0.]GDNAM :????????????????UPNAM :CAMx2IOAPI VAR-LIST :TSURF_K SNOWEW_M SNOWAGE_HR PRATE_MMpH CLOUD_OD U10_MpS V10_MpS T2_K SWSFC_WpM2 SOLM_M3pM3 CLDTOP_KM CAPE PBL_WRF_M PBL_YSU_M FILEDESC :I/O API formatted CAMx AVRG output HISTORY :\n\n\n\nclosest_ds = ds_met_processed.sel(lat=lat, lon=lon, method=\"nearest\")\nfig, ax = plt.subplots(figsize=(15, 4))\n\nprint(lat, lon)\nclosest_ds[\"TSURF_K\"].plot(ax=ax, label=\"TSURF_K\")\nclosest_ds[\"T2_K\"].plot(ax=ax, label=\"T2_K\")\ntmp_df = df.set_index(\"time\")\n# select data from global_data\ntmp_df = tmp_df.loc[closest_ds[\"time\"].values[0] : closest_ds[\"time\"].values[-1]]\ntmp_df[\"AT_K\"] = tmp_df[\"AT\"] + 273.15\ntmp_df.plot(ax=ax, y=\"AT_K\")\nplt.legend()\n# tmp_df\n\n28.672342 77.31526" + "objectID": "posts/2022-04-06-github_faqs.html#q3-what-to-do-with-the-forks-main-when-the-original-main-is-updated", + "href": "posts/2022-04-06-github_faqs.html#q3-what-to-do-with-the-forks-main-when-the-original-main-is-updated", + "title": "GitHub Contrubuting FAQs", + "section": "Q3: What to do with the fork’s main when the original main is updated?", + "text": "Q3: What to do with the fork’s main when the original main is updated?\nFetch upstream with GitHub GUI or use the same solution given in Q2." + }, + { + "objectID": "posts/2022-04-06-github_faqs.html#q4-why-and-when-keeping-the-forks-main-up-to-date-with-the-original-main-is-important", + "href": "posts/2022-04-06-github_faqs.html#q4-why-and-when-keeping-the-forks-main-up-to-date-with-the-original-main-is-important", + "title": "GitHub Contrubuting FAQs", + "section": "Q4: Why and when keeping the fork’s main up to date with the original main is important?", + "text": "Q4: Why and when keeping the fork’s main up to date with the original main is important?\nWhenever we need to create new branches (usually from the fork’s main)." 
}, { - "objectID": "posts/gcloud.html", - "href": "posts/gcloud.html", - "title": "Gcloud cheatsheet", - "section": "", - "text": "Following this guide.\n\nTo set default email, project-id & zone:\n\ngcloud config set account your-email-account\ngcloud config set project your-project-id\ngcloud config set compute/zone us-central1-f # us-central1-f for free v2 TPUs and europe-west4-a for free v3 TPUs (only if you have free TRC access)\n\nTo get currently active project and zone related info:\n\ngcloud info\n\nTo create an identity (I don’t know if this is required or not. This command should trigger installation of “gcloud Beta Commands” automatically in another shell and then you need to rerun the following command):\n\ngcloud beta services identity create --service tpu.googleapis.com" + "objectID": "posts/2022-04-06-github_faqs.html#q5-how-to-update-a-change-in-a-pr-that-is-open", + "href": "posts/2022-04-06-github_faqs.html#q5-how-to-update-a-change-in-a-pr-that-is-open", + "title": "GitHub Contrubuting FAQs", + "section": "Q5: How to update a change in a PR that is open?", + "text": "Q5: How to update a change in a PR that is open?\nPush the change to the corresponding branch and PR will get updated automatically." }, { - "objectID": "posts/gcloud.html#initial-setup", - "href": "posts/gcloud.html#initial-setup", - "title": "Gcloud cheatsheet", + "objectID": "posts/2021-03-22-gp_kernels.html", + "href": "posts/2021-03-22-gp_kernels.html", + "title": "Understanding Kernels in Gaussian Processes", "section": "", - "text": "Following this guide.\n\nTo set default email, project-id & zone:\n\ngcloud config set account your-email-account\ngcloud config set project your-project-id\ngcloud config set compute/zone us-central1-f # us-central1-f for free v2 TPUs and europe-west4-a for free v3 TPUs (only if you have free TRC access)\n\nTo get currently active project and zone related info:\n\ngcloud info\n\nTo create an identity (I don’t know if this is required or not. 
This command should trigger installation of “gcloud Beta Commands” automatically in another shell and then you need to rerun the following command):\n\ngcloud beta services identity create --service tpu.googleapis.com" + "text": "!pip install -qq GPy\nimport autograd.numpy as np\nimport pandas as pd\nimport GPy\nimport matplotlib.pyplot as plt\nfrom autograd import grad\nfrom matplotlib.animation import FuncAnimation\nfrom matplotlib import rc\nimport seaborn as sns" }, { - "objectID": "posts/gcloud.html#working-with-tpu-vms", - "href": "posts/gcloud.html#working-with-tpu-vms", - "title": "Gcloud cheatsheet", - "section": "Working with TPU VMs", - "text": "Working with TPU VMs\nThere are two different terms here: “TPU VMs” and “TPU nodes”. TPU nodes can be connected externally via another VM. TPU VMs are stand-alone systems with TPUs, RAM and CPU (96 core Intel 2 GHz processor and 335 GB RAM). We may be charged via GCP for the VM (CPUs and RAM). (I will update this info once I know for sure):\n\n\nTo create a TPU VM in preferred zone via CLI (be careful about the --zone to avoid charges, check the first email received from TRC team to see what kind of TPUs are free in different zones. if --zone is not passed, VM will be created in the default zone that we set initially. 
This command triggered installation of “gcloud Alpha Commands”):\n\ngcloud alpha compute tpus tpu-vm create vm-1 --accelerator-type v2-8 --version tpu-vm-tf-2.8.0 --zone us-central1-f\n\nTo get the list of TPU nodes/VMs:\n\ngcloud compute tpus list\n\nTo delete a TPU node/VM:\n\ngcloud compute tpus delete vm-1\n\nTo connect with a vm via ssh (this automatically creates ssh key pair and places in default ssh config location):\n\ngcloud alpha compute tpus tpu-vm ssh vm-1\n\nFollow this guide to create and attach a persistent disk with the TPU VM" + "objectID": "posts/2021-03-22-gp_kernels.html#rbf-radial-basis-function-kernel-stationarity-and-isotropy", + "href": "posts/2021-03-22-gp_kernels.html#rbf-radial-basis-function-kernel-stationarity-and-isotropy", + "title": "Understanding Kernels in Gaussian Processes", + "section": "RBF (Radial basis function) Kernel, Stationarity and Isotropy", + "text": "RBF (Radial basis function) Kernel, Stationarity and Isotropy\nRBF is one of the most commonly used kernels in GPs due to it’s infinetely differentiability (extreme flexibility). This property helps us to model a vast variety of functions \\(X \\to Y\\).\nRBF kernel is given as the following, \\[\n\\begin{aligned}\n\\mathcal{K}(x_1,x_2)= \\sigma^2exp\\left(-\\frac{(x-x')^2}{2l^2}\\right)\n\\end{aligned}\n\\] Where, \\(\\sigma^2\\) is variance and \\(l\\) is known as lengthscale. #### Stationarity RBF is a stationary kernel and so it is invariant to translation in the input space. In other words, \\(\\mathcal{K}(x,x')\\) depends only on \\(x-x'\\).\n\nIsotropy\nRBF is also isotropic kernel, which means that \\(\\mathcal{K}(x,x')\\) depends only on \\(|x-x'|\\). 
Thus, we have \\(\\mathcal{K}(x,x') = \\mathcal{K}(x',x)\\).\nLet’s visualize few functions drawn from the RBF kernel\n\ndef K_rbf(X1, X2, sigma=1., l=1.):\n return (sigma**2)*(np.exp(-0.5*np.square(X1-X2.T)/l**2))\n\n\n\nHelper functions\n\ndef plot_functions(kernel_func, ax0_ylim=(-3,3), ax1_ylim=(-0.1,1.1)):\n mean = np.zeros(X.shape[0])\n cov = kernel_func(X, X, sigma, l)\n functions = np.random.multivariate_normal(mean, cov, size=5)\n fig = plt.figure(figsize=(14,8), constrained_layout=True)\n gs = fig.add_gridspec(2,4)\n ax0 = fig.add_subplot(gs[0, 1:-1])\n ax0.set_ylim(*ax0_ylim)\n ax1 = fig.add_subplot(gs[1, 0:2])\n ax1.set_ylim(*ax1_ylim)\n ax2 = fig.add_subplot(gs[1, 2:4])\n for func in functions:\n ax0.plot(X, func,'o-');\n ax0.set_xlabel('X');ax0.set_ylabel('Y');ax0.set_title('Functions drawn from '+k_name+' kernel');\n ax1.plot(X, cov[:,4]);ax1.set_title('K(0,X)');ax1.set_xlabel('X');ax1.set_ylabel('K(0,X)')\n sns.heatmap(cov.round(2), ax=ax2, xticklabels=X.ravel(), yticklabels=X.ravel(), annot=True);\n ax2.set_xlabel('X');ax2.set_ylabel('X');ax2.set_title('Covariance matrix');\n\ndef animate_functions(kernel_func, val_list, ax0_ylim=(-3,3), ax1_ylim=(-0.1,1.1), \n k_name='',p_name='',symbol=''):\n fig = plt.figure(figsize=(14,8))\n gs = fig.add_gridspec(2,4)\n ax0 = fig.add_subplot(gs[0, 1:-1]);ax1 = fig.add_subplot(gs[1, 0:2]);ax2 = fig.add_subplot(gs[1, 2:4]);\n def update(p):\n ax0.cla();ax1.cla();ax2.cla();\n ax0.set_ylim(*ax0_ylim);ax1.set_ylim(*ax1_ylim)\n if p_name == 'Lengthscale':\n cov = kernel_func(X, X, l=p)\n elif p_name == 'Variance':\n cov = kernel_func(X, X, sigma=np.sqrt(p))\n elif p_name == 'Offset':\n cov = kernel_func(X, X, c=p)\n elif p_name == 'Period':\n cov = kernel_func(X, X, p=p)\n functions = np.random.multivariate_normal(mean, cov, size=5)\n for func in functions:\n ax0.plot(X, func,'o-');\n ax0.set_xlabel('X');ax0.set_ylabel('Y');ax0.set_title('Functions drawn from '+k_name+' kernel\\n'+p_name+' ('+symbol+') = '+str(p));\n 
ax1.plot(X, cov[:,4]);ax1.set_title('K(0,X)');ax1.set_title('K(0,X)');ax1.set_xlabel('X');ax1.set_ylabel('K(0,X)')\n sns.heatmap(cov.round(2), ax=ax2, xticklabels=X.ravel(), yticklabels=X.ravel(), annot=True, cbar=False);\n ax2.set_xlabel('X');ax2.set_ylabel('X');ax2.set_title('Covariance matrix');\n\n anim = FuncAnimation(fig, update, frames=val_list, blit=False)\n plt.close()\n rc('animation', html='jshtml')\n return anim\n\nVerifying if our kernel is consistent with GPy kernels.\n\nX = np.linspace(101,1001,200).reshape(-1,1)\nsigma, l = 7, 11\nassert np.allclose(K_rbf(X,X,sigma,l), GPy.kern.RBF(1, variance=sigma**2, lengthscale=l).K(X,X)) \n\n\nnp.random.seed(0)\nX = np.arange(-4,5).reshape(-1,1)\nsigma = 1.\nl = 3.\nk_name = 'RBF'\nplot_functions(K_rbf, ax0_ylim=(-3.5,3))\n\n\n\n\n\n\n\n\nLet’s see the effect of varying parameters \\(\\sigma\\) and \\(l\\) of the RBF kernel function.\n\nnp.random.seed(0)\nsigma = 1.\nval_list = [0.5,1,2,3,4,5]\nanimate_functions(K_rbf, val_list, k_name='RBF', p_name='Lengthscale', symbol='l')\n\n\n\n\n\n\n\n\n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n Once\n \n Loop\n \n Reflect\n \n \n\n\n\n\n\n\n\nl = 1.\nval_list = [1,4,9,16,25]\nanimate_functions(K_rbf, val_list, ax0_ylim=(-12,12), ax1_ylim=(-0.1, 26),\n k_name='RBF', p_name='Variance', symbol='sigma')\n\n\n\n\n\n\n\n\n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n Once\n \n Loop\n \n Reflect\n \n \n\n\n\n\n\n\nWith increase in value of \\(l\\), functions drawn from the kernel become smoother. Covariance between a pair of points is increasing with increase in \\(l\\).\nIncreasing \\(\\sigma^2\\) increase the overall uncertainty (width of the space where 95% of the functions live) across all the points." 
}, { - "objectID": "posts/gcloud.html#working-with-tpu-vms-via-vs-code", - "href": "posts/gcloud.html#working-with-tpu-vms-via-vs-code", - "title": "Gcloud cheatsheet", - "section": "Working with TPU VMs via VS-code", - "text": "Working with TPU VMs via VS-code\n\nInstall the following extension on VS-code: \nUse the following button to connect to a remote machine (use “Connect to Host…” button): \nManually update the default ssh config file (in my case, “C:\\.ssh”) to add a VM in VS-code (you can use VS-code command palette to figure out the config file for you and edit it. Please see the screeshot below).\n\n\n\nNote that ssh public-private key pair with name google_compute_engine is automatically generated when you connect with the VM for the first time with gcloud alpha compute tpus tpu-vm ssh command. The VM config for me looks like this:\n\nHost Cloud-TPU-Node-2\n HostName <External-IP-of-your-TPU-VM>\n User zeelp\n Port 22\n IdentityFile C:\\Users\\zeelp\\.ssh\\google_compute_engine" + "objectID": "posts/2021-03-22-gp_kernels.html#matern-kernel", + "href": "posts/2021-03-22-gp_kernels.html#matern-kernel", + "title": "Understanding Kernels in Gaussian Processes", + "section": "Matern Kernel", + "text": "Matern Kernel\nMatern kernels are given by a general formula as following, \\[\n\\begin{aligned}\n\\mathcal{K}(x_1, x_2) = \\sigma^2\\frac{1}{\\Gamma(\\nu)2^{\\nu-1}}\\Bigg(\n\\frac{\\sqrt{2\\nu}}{l} |x_1-x_2|\n\\Bigg)^\\nu K_\\nu\\Bigg(\n\\frac{\\sqrt{2\\nu}}{l} |x_1-x_2|\\Bigg)\n\\end{aligned}\n\\] Where, \\(\\Gamma\\) is gamma function and \\(K_\\nu\\) is modified Bessel function of second order.\nThe general formula is not very intuitive about the functionality of this kernel. 
In practice, Matern with \\(\\nu=\\{0.5,1.5,2.5\\}\\) are used, where GP with each kernel is \\((\\lceil\\nu\\rceil-1)\\) times differentiable.\nMatern functions corresponding to each \\(\\nu\\) values are defined as the following, \\[\n\\begin{aligned}\nMatern12 \\to \\mathcal{K_{\\nu=0.5}}(x_1, x_2) &= \\sigma^2exp\\left(-\\frac{|x_1-x_2|}{l}\\right)\\\\\nMatern32 \\to \\mathcal{K_{\\nu=1.5}}(x_1, x_2) &= \\sigma^2\\left(1+\\frac{\\sqrt{3}|x_1-x_2|}{l}\\right)exp\\left(-\\frac{\\sqrt{3}|x_1-x_2|}{l}\\right)\\\\\nMatern52 \\to \\mathcal{K_{\\nu=2.5}}(x_1, x_2) &= \\sigma^2\\left(1+\\frac{\\sqrt{5}|x_1-x_2|}{l}+\\frac{5(x_1-x_2)^2)}{3l^2}\\right)exp\\left(-\\frac{\\sqrt{5}|x_1-x_2|}{l}\\right)\n\\end{aligned}\n\\] Matern kernels are stationary as well as isotropic. With \\(\\nu \\to \\infty\\) they converge to \\(RBF\\) kernel. \\(Matern12\\) is also known as \\(Exponential\\) kernel in toolkits such as GPy.\nNow, let’s draw few functions from each of these versions and try to get intuition behind each of them.\n\ndef K_m12(X1, X2, sigma=1., l=1.): # v = 0.5\n return (sigma**2)*(np.exp(-np.abs(X1-X2.T)/l))\ndef K_m32(X1, X2, sigma=1., l=1.): # v = 1.5\n return (sigma**2)*(1+((3**0.5)*np.abs(X1-X2.T))/l)*(np.exp(-(3**0.5)*np.abs(X1-X2.T)/l))\ndef K_m52(X1, X2, sigma=1., l=1.): # v = 2.5\n return (sigma**2)*(1+(((5**0.5)*np.abs(X1-X2.T))/l)+((5*(X1-X2.T)**2)/(3*l**2)))*\\\n (np.exp(-(5**0.5)*np.abs(X1-X2.T)/l))\n\nVerifying if our kernels are consistent with GPy kernels.\n\nX = np.linspace(101,1001,50).reshape(-1,1)\nassert np.allclose(K_m32(X,X,sigma=7.,l=11.), GPy.kern.Matern32(1,lengthscale=11.,variance=7**2).K(X,X))\nassert np.allclose(K_m52(X,X,sigma=7.,l=11.), GPy.kern.Matern52(1,lengthscale=11.,variance=7**2).K(X,X))\n\n\nX = np.arange(-4,5).reshape(-1,1)\nsigma = 1.\nl = 3.\n\nfig, ax = plt.subplots(3,2,figsize=(14,10))\nnames = ['Matern12', 'Matern32', 'Matern52']\nfor k_i, kernel in enumerate([K_m12, K_m32, K_m52]):\n mean = np.zeros(X.shape[0])\n cov = 
kernel(X, X, sigma, l)\n functions = np.random.multivariate_normal(mean, cov, size=5)\n for func in functions:\n ax[k_i,0].plot(X, func);\n ax[k_i,0].set_xlabel('X');ax[k_i,0].set_ylabel('Y');ax[k_i,0].set_title('Functions drawn from '+names[k_i]+' kernel');\n sns.heatmap(cov.round(2), ax=ax[k_i,1], xticklabels=X.ravel(), yticklabels=X.ravel(), annot=True);\n ax[k_i,1].set_xlabel('X');ax[k_i,1].set_ylabel('X');ax[k_i,1].set_title('Covariance matrix');\nplt.tight_layout();\n\n\n\n\n\n\n\n\nFrom the above plot, we can say that smoothness is increasing in functions as we increase \\(\\nu\\). Thus, smoothness of functions in terms of kernels is in the following order: Matern12<Matern32<Matern52.\nLet us see effect of varying \\(\\sigma\\) and \\(l\\) on Matern32 which is more popular among the three.\n\nnp.random.seed(0)\nsigma = 1.\nval_list = [0.5,1,2,3,4,5]\nanimate_functions(K_m32, val_list, k_name='Matern32', p_name='Lengthscale', symbol='l')\n\n\n\n\n\n\n\n\n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n Once\n \n Loop\n \n Reflect\n \n \n\n\n\n\n\n\nWe can see that Matern32 kernel behaves similar to RBF with varying \\(l\\). Though, Matern32 is less smoother than RBF. A quick comparison would clarify this.\n\nX = np.linspace(-10,10,100).reshape(-1,1)\nplt.plot(X, K_rbf(X,X, l=3.)[:,50], label='RBF')\nplt.plot(X, K_m32(X,X, l=3.)[:,50], label='Matern32')\nplt.legend();plt.xlabel('X');plt.ylabel('Covariance (K(0,X))');\nplt.title('K(0,X)');" }, { - "objectID": "posts/2023-04-29-sine-combination-netowrks.html", - "href": "posts/2023-04-29-sine-combination-netowrks.html", - "title": "Sine Combination Networks", - "section": "", - "text": "We know that any continuous signal can be represented as a sum of sinusoids. The question is, how many sinusoids do we need to represent a signal? 
In this notebook, we will explore this question.\nimport os\nos.environ[\"CUDA_VISIBLE_DEVICES\"] = \"\"\n\nimport jax\nimport jax.numpy as jnp\nimport optax\n\nfrom tqdm import tqdm\n\nimport matplotlib.pyplot as plt\nfrom mpl_toolkits.mplot3d import Axes3D" + "objectID": "posts/2021-03-22-gp_kernels.html#periodic-kernel", + "href": "posts/2021-03-22-gp_kernels.html#periodic-kernel", + "title": "Understanding Kernels in Gaussian Processes", + "section": "Periodic Kernel", + "text": "Periodic Kernel\nThe periodic kernel is given as follows, \\[\n\\begin{aligned}\n\\mathcal{K}(x_1,x_2)= \\sigma^2\\exp\\left(-\\frac{\\sin^2(\\pi|x_1 - x_2|/p)}{2l^2}\\right)\n\\end{aligned}\n\\] where \\(p\\) is the period. Let’s visualize a few functions drawn from this kernel.\n\ndef K_periodic(X1, X2, sigma=1., l=1., p=3.):\n    return sigma**2 * np.exp(-0.5*np.square(np.sin(np.pi*(X1-X2.T)/p))/l**2)\n\nX = np.linspace(10,1001,50).reshape(-1,1)\nassert np.allclose(K_periodic(X,X,sigma=7.,l=11.,p=3.), \n                   GPy.kern.StdPeriodic(1,lengthscale=11.,variance=7**2,period=3.).K(X,X))\n\n\nnp.random.seed(0)\nX = np.arange(-4,5).reshape(-1,1)\nsigma = 1\nl = 1.\np = 3.\nk_name = 'Periodic'\nplot_functions(K_periodic)\n\nWe will now investigate the effect of varying the period \\(p\\).\n\nnp.random.seed(0)\nval_list = [1., 2., 3., 4., 5.]\n\nanimate_functions(K_periodic, val_list, ax1_ylim=(0.4,1.1),\n                  k_name='Periodic',p_name='Period')\n\nFrom the above animation we can see that all points that are a distance \\(p\\) apart from each other have exactly the same values, because their correlation is exactly 1 (\\(\\sigma=1 \\to covariance=correlation\\)).\nNow, we will investigate the effect of varying the lengthscale \\(l\\) while the other parameters are held constant.\n\nnp.random.seed(0)\nval_list = [1., 2., 3., 4., 5.]\n\nanimate_functions(K_periodic, val_list, 
ax1_ylim=(0.6,1.1),\n                  k_name='Periodic',p_name='Lengthscale', symbol='l')\n\nWe can see that the correlation between a pair of locations \\(x_1, x_2\\) with \\(|x_1-x_2|<p\\) increases as the lengthscale is increased." }, { - "objectID": "posts/2023-04-29-sine-combination-netowrks.html#random-combination-of-sinusoids", - "href": "posts/2023-04-29-sine-combination-netowrks.html#random-combination-of-sinusoids", - "title": "Sine Combination Networks", - "section": "Random Combination of Sinusoids", - "text": "Random Combination of Sinusoids\n\nN = 1000\nx = jnp.linspace(-10, 10, N).reshape(-1, 1)\ny = jnp.sin(x) + jnp.sin(2*x) #+ jax.random.normal(jax.random.PRNGKey(0), (N, 1)) * 0.1\nplt.plot(x, y, \"kx\");\nprint(x.shape, y.shape)\n\n(1000, 1) (1000, 1)" + "objectID": "posts/2021-03-22-gp_kernels.html#linear-kernel", + "href": "posts/2021-03-22-gp_kernels.html#linear-kernel", + "title": "Understanding Kernels in Gaussian Processes", + "section": "Linear Kernel", + "text": "Linear Kernel\nThe linear kernel (a.k.a. 
dot-product kernel) is given as follows, \\[\n\\begin{aligned}\n\\mathcal{K}(x_1,x_2)= (x_1-c)(x_2-c)+\\sigma^2\n\\end{aligned}\n\\] Let’s visualize a few functions drawn from the linear kernel.\n\ndef K_lin(X1, X2, sigma=1., c=1.):\n    return (X1-c)@(X2.T-c) + sigma**2\n\n\nnp.random.seed(0)\nsigma = 1.\nc = 1.\n\nplot_functions(K_lin, ax0_ylim=(-10,5), ax1_ylim=(-3,7))\n\nLet’s see the effect of varying the parameters \\(\\sigma\\) and \\(c\\) of the linear kernel function.\n\nval_list = [-3,-2,-1,0,1,2,3]\n\nanimate_functions(K_lin, val_list, ax0_ylim=(-15,12), ax1_ylim=(-3,23), \n                  p_name='Offset', symbol='c')\n\nnp.random.seed(1)\nval_list = np.square(np.array([1,2,3,4,5,8]))\n\nanimate_functions(K_lin, val_list, ax0_ylim=(-25,15), ax1_ylim=(-5,110), \n                  p_name='Variance', symbol='sigma')\n\nVarying the \\(c\\) parameter changes the position of the shallow region in the covariance matrix. In other words, as \\(x \\to c\\), points close to \\(x\\) have variance \\(\\to \\sigma^2\\), while distant points have monotonically increasing variance.\nIncreasing \\(\\sigma^2\\) adds a constant to all variances and covariances, so it allows more uncertainty across all points and weakens the monotonic growth of variance over distant points.\n\nNon-stationary behaviour of the Linear kernel\nUnlike the stationary kernels above, the linear kernel is not invariant to translations in the input space. 
The comparison below, visually supports this claim.\n\nfig, ax = plt.subplots(2,2,figsize=(14,8), sharex=True)\nkerns = [K_rbf, K_m32, K_periodic, K_lin]\nk_names = ['RBF', 'Matern32', 'Periodic', 'Linear']\nX = np.linspace(-10,10,21).reshape(-1,1)\ndef update(x):\n count = 0\n for i in range(2):\n for j in range(2):\n ax.ravel()[count].cla()\n tmp_kern = kerns[count]\n mean = np.zeros(X.shape[0])\n cov = tmp_kern(X,X)\n ax.ravel()[count].plot(X, cov[:,x]);\n ax.ravel()[count].set_xlim(X[x-3],X[x+3])\n ax.ravel()[count].set_xlabel('X');\n ax.ravel()[count].set_ylabel('K('+str(X[x].round(2))+',X)');\n ax.ravel()[count].set_title('Covariance K('+str(X[x].round(2))+',X) for '+k_names[count]+' kernel');\n count += 1\n ax.ravel()[3].set_ylim(-5,80)\n plt.tight_layout()\n\nanim = FuncAnimation(fig, update, frames=[5,7,9,11,13,15], blit=False)\nplt.close()\nrc('animation', html='jshtml')\nanim\n\n\n\n\n\n\n\n\n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n Once\n \n Loop\n \n Reflect\n \n \n\n\n\n\n\n\n<Figure size 432x288 with 0 Axes>" }, { - "objectID": "posts/2023-04-29-sine-combination-netowrks.html#recover-the-signal", - "href": "posts/2023-04-29-sine-combination-netowrks.html#recover-the-signal", - "title": "Sine Combination Networks", - "section": "Recover the Signal", - "text": "Recover the Signal\n\ndef get_weights(key):\n w1 = jax.random.uniform(key, (), minval=0.0, maxval=5.0)\n key = jax.random.split(key)[0]\n w2 = jax.random.uniform(key, (), minval=0.0, maxval=5.0)\n return w1, w2\n \ndef get_sine(weights, x):\n w1, w2 = weights\n return jnp.sin(w1*x) + jnp.sin(w2*x)\n\ndef loss_fn(weights, x, y):\n output = get_sine(weights, x)\n w1, w2 = weights\n return jnp.mean((output.ravel() - y.ravel())**2)\n\n\ndef one_step(weights_and_state, xs):\n weights, state = weights_and_state\n loss, grads = value_and_grad_fn(weights, x, y)\n updates, state = optimizer.update(grads, state)\n weights = optax.apply_updates(weights, updates)\n return 
(weights, state), (loss, weights)\n\nepochs = 1000\noptimizer = optax.adam(1e-2)\nvalue_and_grad_fn = jax.jit(jax.value_and_grad(loss_fn))\nfig, ax = plt.subplots(4, 3, figsize=(15, 12))\nfig2, ax2 = plt.subplots(4, 3, figsize=(15, 12))\nax = ax.ravel()\nax2 = ax2.ravel()\nfor seed in tqdm(range(12)):\n key = jax.random.PRNGKey(seed)\n init_weights = get_weights(key)\n state = optimizer.init(init_weights)\n (weights, _), (loss_history, _) = jax.lax.scan(one_step, (init_weights, state), None, length=epochs)\n y_pred = get_sine(weights, x)\n ax[seed].plot(x, y, \"kx\")\n ax[seed].plot(x, y_pred, \"r-\")\n ax[seed].set_title(f\"w_init=({init_weights[0]:.2f}, {init_weights[1]:.2f}), w_pred=({weights[0]:.2f}, {weights[1]:.2f}), loss={loss_fn(weights, x, y):.2f}\")\n ax2[seed].plot(loss_history)\nfig.tight_layout()\n\n100%|██████████| 12/12 [00:00<00:00, 15.91it/s]" + "objectID": "posts/2021-03-22-gp_kernels.html#multiplications-of-kernels", + "href": "posts/2021-03-22-gp_kernels.html#multiplications-of-kernels", + "title": "Understanding Kernels in Gaussian Processes", + "section": "Multiplications of kernels", + "text": "Multiplications of kernels\nIf a single kernel is having high bias in fitting a dataset, we can use mutiple of these kernels in multiplications and/or summations. 
First, let us see effect of multiplication of a few kernels.\n\nPeriodic * Linear\n\nX = np.linspace(-10,10,100).reshape(-1,1)\nplt.plot(X, K_periodic(X,X,sigma=2.)[:,50], label='Periodic')\nplt.plot(X, K_lin(X,X,sigma=0.01,c=0)[:,50], label='Linear')\nplt.plot(X, K_periodic(X,X,sigma=2.)[:,50]*K_lin(X,X,sigma=0.01,c=0)[:,50], label='Periodic*Linear')\nplt.legend(bbox_to_anchor=(1,1));plt.xlabel('X');plt.ylabel('Covariance')\nplt.title('K(0,*)');\n\n\n\n\n\n\n\n\n\n\nLinear * Linear\n\nX = np.linspace(-1,1,100).reshape(-1,1)\nplt.plot(X, K_lin(X,X,c=-1)[:,50], label='Linear1')\nplt.plot(X, K_lin(X,X,c=1)[:,50], label='Linear2')\nplt.plot(X, K_lin(X,X,c=0.5)[:,50], label='Linear3')\nplt.plot(X, K_lin(X,X,c=-1)[:,50]*K_lin(X,X,c=1)[:,50], label='Linear1*Linear3')\nplt.plot(X, K_lin(X,X,c=-1)[:,50]*K_lin(X,X,c=1)[:,50]*K_lin(X,X,c=0.5)[:,50], label='Linear1*Linear2*Linear3')\nplt.legend(bbox_to_anchor=(1,1));\n\n\n\n\n\n\n\n\n\n\nMatern * Linear\n\nX = np.linspace(-1,1,100).reshape(-1,1)\nk1 = K_lin(X,X,c=1)[:,50]\nk2 = K_m32(X,X)[:,50]\nplt.plot(X, k1, label='Linear')\nplt.plot(X, k2, label='Matern32')\nplt.plot(X, k1*k2, label='Matern32*Linear')\nplt.legend(bbox_to_anchor=(1,1));" }, { - "objectID": "posts/2023-04-29-sine-combination-netowrks.html#plot-loss-surface", - "href": "posts/2023-04-29-sine-combination-netowrks.html#plot-loss-surface", - "title": "Sine Combination Networks", - "section": "Plot loss surface", - "text": "Plot loss surface\n\nw1 = jnp.linspace(0, 3, 100)\nw2 = jnp.linspace(0, 3, 100)\nW1, W2 = jnp.meshgrid(w1, w2)\nloss = jax.vmap(jax.vmap(lambda w1, w2: loss_fn((w1, w2), x, y)))(W1, W2)\n\n# plot the loss surface in 3D\nfig = plt.figure(figsize=(8, 6))\nax = fig.add_subplot(111, projection='3d')\nax.plot_surface(W1, W2, loss, cmap=\"viridis\", alpha=0.9);\nax.set_xlabel(\"w1\");\nax.set_ylabel(\"w2\");\n# top view\nax.view_init(30, 45)" + "objectID": "posts/2021-03-22-gp_kernels.html#appendix-extra-material", + "href": 
"posts/2021-03-22-gp_kernels.html#appendix-extra-material", + "title": "Understanding Kernels in Gaussian Processes", + "section": "Appendix (Extra material)", + "text": "Appendix (Extra material)\nAt this stage, we do not know how the fuctions are drawn from linear kernel based covariance matrix end up being lines with various intercepts and slopes.\n\n\nPredicting at a single point after observing value at a single point\nLet’s see how would be a GP prediction after observing value at a single point.\nOur kernel function is given by, * \\(K(x,x')=(x-c) \\cdot (x'-c)+\\sigma^2\\)\nNow, we observe value \\(y\\) at a location \\(x\\) and we want to predict value \\(y^*\\) at location \\(x^*\\). \\[\n\\begin{aligned}\n(y^*|x_1,y_1,x^*) &= K(x^*,x) \\cdot K^{-1}(x,x)\\cdot y \\\\\n&= \\left(\\frac{(x-c)(x^*-c)+\\sigma^2}{(x-c)(x-c)+\\sigma^2}\\right)\\cdot y\n\\end{aligned}\n\\] \\(c\\) and \\(\\sigma^2\\) do not vary in numerator and denominator so, the value of \\(y^* \\propto x^*\\).\n\n\n\nPredicting at a single point after observing values at two points\nNow, we’ll take a case where two values \\({y_1, y_2}\\) are observed at \\({x_1, x_2}\\). 
Let us try to predict the value \\(y^*\\) at \\(x^*\\).\n\\[\n\\begin{aligned}\ny^* &= \\begin{bmatrix}\nK(x_1, x^*) & K(x_2,x^*)\n\\end{bmatrix}\\begin{bmatrix}\nK(x_1, x_1) & K(x_1,x_2) \\\\\nK(x_2, x_1) & K(x_2,x_2)\n\\end{bmatrix}^{-1}\n\\begin{bmatrix}\ny_1 \\\\\ny_2\n\\end{bmatrix}\\\\\n&= \\begin{bmatrix}\n(x_1-c)(x^*-c)+\\sigma^2 & (x_2-c)(x^*-c)+\\sigma^2\n\\end{bmatrix}\n\\begin{bmatrix}\n(x_1-c)^2+\\sigma^2 & (x_1-c) (x_2-c)+\\sigma^2 \\\\\n(x_2-c) (x_1-c)+\\sigma^2 & (x_2-c)^2 +\\sigma^2\n\\end{bmatrix}^{-1}\n\\begin{bmatrix}\ny_1 \\\\\ny_2\n\\end{bmatrix}\\\\\n&= \\begin{bmatrix}\n(x_1-c)(x^*-c)+\\sigma^2 & (x_2-c)(x^*-c)+\\sigma^2\n\\end{bmatrix} \\frac{1}{\\sigma^2(x_1-x_2)^2}\n\\begin{bmatrix}\n(x_2-c)^2+\\sigma^2 & -[(x_1-c)(x_2-c)+\\sigma^2] \\\\\n-[(x_2-c) (x_1-c)+\\sigma^2] & (x_1-c)^2 +\\sigma^2\n\\end{bmatrix}\n\\begin{bmatrix}\ny_1 \\\\\ny_2\n\\end{bmatrix}\n\\end{aligned} \\tag{1}\\]\nFrom the second term of Eq. (1), we can see that if \\(\\sigma^2=0\\) the matrix is not invertible, because its determinant is zero. This means that, if \\(\\sigma^2=0\\), observing a single point is enough: we can infer the values at infinitely many points after observing that single point.\nEvaluating Eq. (1) further, it reduces to the following equation, \\[\n\\begin{aligned}\ny^* = \\frac{(x_1y_2-x_2y_1)+x^*(y_1-y_2)}{(x_1-x_2)}\n\\end{aligned}\n\\] Interestingly, the output no longer depends on \\(c\\) or \\(\\sigma^2\\). 
Let us verify experimentally if this is true for observing more than 2 data points.\n\n\nPrepering useful functions\n\nfrom scipy.optimize import minimize\n\n\ndef cov_func(x, x_prime, sigma, c):\n return (x-c)@(x_prime-c) + sigma**2\n\ndef neg_log_likelihood(params):\n n = X.shape[0]\n sigma, c, noise_std = params\n cov = cov_func(X, X.T, sigma, c)\n cov = cov + (noise_std**2)*np.eye(n)\n nll_ar = 0.5*(Y.T@np.linalg.pinv(cov)@Y) + 0.5*n*np.log(2*np.pi) + 0.5*np.log(np.linalg.det(cov)) \n return nll_ar[0,0]\n\ndef predict(params):\n sigma, c, noise_std = params\n k = cov_func(X, X.T, sigma, c)\n np.fill_diagonal(k, k.diagonal()+noise_std**2)\n k_inv = np.linalg.pinv(k)\n k_star = cov_func(X_test, X.T, sigma, c)\n\n mean = k_star@k_inv@Y\n cov = cov_func(X_test, X_test.T, sigma, c) - k_star@k_inv@k_star.T\n return mean, cov\n\n\n\nObserving more than two points and changing hyperparameters manually\n\nX = np.array([3,4,5,6,7,8]).reshape(-1,1)\nY = np.array([6,9,8,11,10,13]).reshape(-1,1)\nX_test = np.linspace(1,8,20).reshape(-1,1)\nparams_grid = [[1., 0.01, 10**-10], [100., 1., 10**-10], \n [100., 0.01, 10**-10], [1., 2., 1.]] # sigma, c, noise_std\n\nX_extra = np.hstack([np.ones((X.shape[0], 1)), X])\nTheta = np.linalg.pinv(X_extra.T@X_extra)@X_extra.T@Y\nX_test_extra = np.hstack([np.ones((X_test.shape[0], 1)), X_test])\nY_test_ideal = X_test_extra@Theta\n\nfig, ax = plt.subplots(1,4,figsize=(16,5), sharey=True)\nmeans = []\nfor p_i, params in enumerate(params_grid):\n Y_test_mean, Y_test_cov = predict(params)\n means.append(Y_test_mean)\n ax[p_i].scatter(X, Y, label='train')\n ax[p_i].scatter(X_test, Y_test_mean, label='test')\n ax[p_i].legend();ax[p_i].set_xlabel('X');ax[p_i].set_ylabel('Y');\n ax[p_i].set_title('sigma='+str(params[0])+', c='+str(params[1])+', noise='+str(params[2]));\n\n\n\n\n\n\n\n\n\nnp.allclose(Y_test_ideal, means[0]),\\\nnp.allclose(Y_test_ideal, means[1]),\\\nnp.allclose(Y_test_ideal, means[2]),\\\nnp.allclose(Y_test_ideal, 
means[3])\n\n(True, True, True, False)\n\n\n\nmodel = GPy.models.GPRegression(X, Y, GPy.kern.Linear(input_dim=1))\n# model['Gaussian_noise'].fix(10**-10)\n# model.kern.variances.fix(10**-10)\nmodel.optimize()\nmodel.plot()\nplt.plot(X_test, Y_test_ideal, label='Normal Eq. fit')\nplt.plot(X_test,model.predict(X_test)[0], label='Prediction')\nplt.legend()\nmodel\n\n\n\n\nModel: GP regression\nObjective: 13.51314321804978\nNumber of Parameters: 2\nNumber of Optimization Parameters: 2\nUpdates: True\n\n\n\n\n\n\nGP_regression.\nvalue\nconstraints\npriors\n\n\nlinear.variances\n2.806515343539501\n+ve\n\n\n\nGaussian_noise.variance\n2.0834221617534134\n+ve\n\n\n\n\n\n\n\n\n\n\n\n\n\nWe can see that there is no change in fit with change in \\(c\\) and \\(\\sigma\\). 4th fit is not matching with the ideal fit obtained by normal equation because of high noise. Now, let us estimate parameters by minimizing negative log marginal likelihood.\n\nparams = [1., 1., 1.]\nresult = minimize(neg_log_likelihood, params, bounds=[(10**-5, 10**5), (10**-5, 10**5), (10**-5, 10**-5)])\nparams = result.x\nprint(params, result.fun)\nY_test_mean, Y_test_cov = predict(params)\nplt.scatter(X, Y, label='train')\nplt.scatter(X_test, Y_test_mean, label='test')\nplt.legend();plt.xlabel('X');plt.ylabel('Y');\nparams = np.round(params, 4)\nplt.title('sigma='+str(params[0])+', c='+str(params[1])+', noise='+str(params[2]));\nnp.allclose(Y_test_ideal, Y_test_mean)\n\n[9.99998123e-01 9.99998123e-01 1.00000000e-05] 10207223403405.541\n\n\nFalse\n\n\n\n\n\n\n\n\n\n\ndef neg_log_likelihood(sigma, c, noise_std):\n n = X.shape[0]\n cov = cov_func(X, X.T, sigma, c)\n cov = cov + (noise_std**2)*np.eye(n)\n nll_ar = 0.5*(Y.T@np.linalg.pinv(cov)@Y) + 0.5*n*np.log(2*np.pi) + 0.5*np.log(np.linalg.det(cov))\n return nll_ar[0,0]\n\n\ngrad_func = grad(neg_log_likelihood, argnum=[0,1,2])\nalpha = 0.01\nloss = []\nsigma, c, noise_std = 1., 1., 1.\nfor _ in range(5000):\n grads = grad_func(sigma, c, noise_std)\n # 
print(grads)\n sigma = sigma - alpha*grads[0]\n c = c - alpha*grads[1]\n noise_std = noise_std - alpha*grads[2]\n loss.append(neg_log_likelihood(sigma, c, noise_std))\nprint(sigma, c, noise_std)\nplt.plot(loss);\nloss[-1]\n\n7.588989986845149 -2.830840439162303 32.2487569348891\n\n\n31.05187173290998\n\n\n\n\n\n\n\n\n\n\nparams = sigma, c, noise_std\nY_test_mean, Y_test_cov = predict(params)\nplt.scatter(X, Y, label='train')\nplt.scatter(X_test, Y_test_mean, label='test')\nplt.legend();plt.xlabel('X');plt.ylabel('Y');\nparams = np.round(params, 4)\nplt.title('sigma='+str(params[0])+', c='+str(params[1])+', noise='+str(params[2]));\nnp.allclose(means[0], Y_test_mean, rtol=10**-1, atol=10**-1)\n\nFalse" }, { "objectID": "posts/docker_cheatsheet.html", @@ -889,11 +910,39 @@ "text": "Set up rootless docker\nRefer to this guide: https://docs.docker.com/engine/security/rootless/\nMain steps:\n\nRun dockerd-rootless-setuptool.sh install.\nSetup PATH and DOCKER_HOME as suggested by command output.\nsystemctl --user restart docker.\nTry docker images to check if things worked.\nTry docker run --rm hello-world to check if things really worked." 
}, { - "objectID": "posts/GPT-from-scratch.html", - "href": "posts/GPT-from-scratch.html", - "title": "Building GPT from scratch", + "objectID": "posts/numpy-algebra- copy.html", + "href": "posts/numpy-algebra- copy.html", + "title": "How numpy handles day-to-day algebra?", + "section": "", + "text": "import numpy as np\n\nnp.__version__\n\n118 False\n140 False\numath 8 False\numath 10 False\numath 12 True\ncore 73 False\nYes!\nnumeric 29 False\nnumeric 41 False\nnumeric 1128 False\nnumeric 2515 False\nnumeric 2517 True\ncore 75 False\ncore 77 True\ncore 83 True\ncore 94 True\n142 False\n144 True\n\n\n'1.25.2'" + }, + { + "objectID": "posts/numpy-algebra- copy.html#motivation", + "href": "posts/numpy-algebra- copy.html#motivation", + "title": "How numpy handles day-to-day algebra?", + "section": "Motivation", + "text": "Motivation\nIn this blog post, we will try to figure out how numpy handles seemingly simple math operations. The motivation behind this exploration is to figure out if there are a few foundational operations behind most of the frequently used functions in numpy. For the sake of right level of abstraction, we will not look into addition, subtraction, multiplication, and division." + }, + { + "objectID": "posts/numpy-algebra- copy.html#pi", + "href": "posts/numpy-algebra- copy.html#pi", + "title": "How numpy handles day-to-day algebra?", + "section": "Pi", + "text": "Pi\n\nBackground\npi is an irrational number and computing its value to a certain precision is a challenging task. This video talks in detail how people used to compute pi in the past. At the time of writing this blog post, Google holds the record for computing pi to the highest precision to 100 trilian digits. They used y-cruncher program (it’s free. try it!) with Chudnovsky algorithm to compute pi. 
Here are the first 100 digits of pi:\n\\(3.1415926535897932384626433832795028841971693993751058209749445923078164062862089986280348253421170679\\)\n\n\nSource code\nThis is how pi is defined in numpy source code upto 36 digits.\n#define NPY_PI 3.141592653589793238462643383279502884 /* pi */\nLet’s verify it.\n\nprint(f\"{np.pi:.64f}\")\n\n3.1415926535897931159979634685441851615905761718750000000000000000\n\n\nHmm, that looks off. From 16th digit onwards, the values are different. Let’s try to figure out why.\n\npi = 3.141592653589793238462643383279502884\npi = np.array(pi, dtype=np.float64)\npi = f\"{pi:.64f}\"\nnp_pi = f\"{np.pi:.64f}\"\nassert np_pi == pi\n\nOkay, so it seems like converting 36 digits of pi to 64 bit precision went wrong from 16th digit onwards. What a waste of last 20 digits of pi due to floating point errors! Anyways, let’s move on." + }, + { + "objectID": "posts/numpy-algebra- copy.html#power", + "href": "posts/numpy-algebra- copy.html#power", + "title": "How numpy handles day-to-day algebra?", + "section": "Power", + "text": "Power\nLet’s find out what happens when you execute the following code in numpy.\n\nnumber = np.float64(1.1)\nnumber**1.2\n\n1.1211693641406024" + }, + { + "objectID": "posts/2020-03-28-active_learning_with_bayesian_linear_regression.html", + "href": "posts/2020-03-28-active_learning_with_bayesian_linear_regression.html", + "title": "Active Learning with Bayesian Linear Regression", "section": "", - "text": "import torch\nimport torch.nn as nn\nimport torch.nn.functional as F" + "text": "A quick wrap-up for Bayesian Linear Regression (BLR)\nWe have a feature matrix \\(X\\) and a target vector \\(Y\\). 
We want to obtain \\(\\theta\\) vector in such a way that the error \\(\\epsilon\\) for the following equation is minimum.\n\\[\nY = X^T\\theta + \\epsilon\n\\] Prior PDF for \\(\\theta\\) is,\n\\[\np(\\theta) \\sim \\mathcal{N}(M_0, S_0)\n\\]\nWhere \\(S_0\\) is prior covariance matrix, and \\(M_0\\) is prior mean.\nPosterier PDF can be given as,\n\\[\n\\begin{aligned}\np(\\theta|X,Y) &\\sim \\mathcal{N}(\\theta | M_n, S_n) \\\\\nS_n &= (S_0^{-1} + \\sigma_{mle}^{-2}X^TX) \\\\\nM_n &= S_n(S_0^{-1}M_0+\\sigma_{mle}^{-2}X^TY)\n\\end{aligned}\n\\]\nMaximum likelihood estimation of \\(\\sigma\\) can be calculated as,\n\\[\n\\begin{aligned}\n\\theta_{mle} &= (X^TX)^{-1}X^TY \\\\\n\\sigma_{mle} &= ||Y - X^T\\theta_{mle}||\n\\end{aligned}\n\\]\nFinally, predicted mean \\(\\hat{Y}_{mean}\\) and predicted covariance matrix \\(\\hat{Y}_{cov}\\) can be given as,\n\\[\n\\begin{aligned}\n\\hat{Y} &\\sim \\mathcal{N}(\\hat{Y}_{mean}, \\hat{Y}_{cov}) \\\\\n\\hat{Y}_{mean} &= XM_n \\\\\n\\hat{Y}_{cov} &= X^TS_nX\n\\end{aligned}\n\\]\nNow, let’s put everything together and write a class for Bayesian Linear Regression.\n\n\nCreating scikit-learn like class with fit predict methods for BLR\n\nimport numpy as np\nimport pandas as pd\nimport matplotlib.pyplot as plt\nfrom sklearn import datasets\nfrom sklearn.preprocessing import PolynomialFeatures, MinMaxScaler\nfrom sklearn.model_selection import train_test_split\nfrom matplotlib.animation import FuncAnimation\nfrom matplotlib import rc\nimport warnings\nwarnings.filterwarnings('ignore')\nseed = 0 # random seed for train_test_split\n\n\nclass BLR():\n def __init__(self,S0, M0): # M0 -> prior mean, S0 -> prior covariance matrix\n self.S0 = S0\n self.M0 = M0\n\n def fit(self,x,y, return_self = False):\n self.x = x\n self.y = y\n\n # Maximum likelihood estimation for sigma parameter\n theta_mle = np.linalg.pinv(x.T@x)@(x.T@y)\n sigma_2_mle = np.linalg.norm(y - x@theta_mle)**2\n sigma_mle = np.sqrt(sigma_2_mle)\n\n # Calculating 
predicted mean and covariance matrix for theta\n self.SN = np.linalg.pinv(np.linalg.pinv(self.S0) + (sigma_mle**-2)*x.T@x)\n self.MN = self.SN@(np.linalg.pinv(self.S0)@self.M0 + (sigma_mle**-2)*(x.T@y).squeeze())\n\n # Calculating predicted mean and covariance matrix for data\n self.pred_var = x@self.SN@x.T\n self.y_hat_map = x@self.MN\n if return_self:\n return (self.y_hat_map, self.pred_var)\n \n def predict(self, x):\n self.pred_var = x@self.SN@x.T\n self.y_hat_map = x@self.MN\n return (self.y_hat_map, self.pred_var)\n\n def plot(self, s=1): # s -> size of dots for scatter plot\n individual_var = self.pred_var.diagonal()\n plt.figure()\n plt.xlabel('x')\n plt.ylabel('y')\n plt.plot(self.x[:,1], self.y_hat_map, color='black', label='model')\n plt.fill_between(self.x[:,1], self.y_hat_map-individual_var, self.y_hat_map+individual_var, alpha=0.4, color='black', label='uncertainty')\n plt.scatter(self.x[:,1], self.y, label='actual data',s=s)\n plt.title('MAE is '+str(np.mean(np.abs(self.y - self.y_hat_map))))\n plt.legend()\n\n\n\nCreating & visualizing dataset\nTo start with, let’s create a random dataset with degree 3 polynomial function with some added noise.\n\\[\nY = (5X^3 - 4X^2 + 3X - 2) + \\mathcal{N}(0,1)\n\\]\n\nnp.random.seed(seed)\nX_init = np.linspace(-1, 1, 1000)\nnoise = np.random.randn(1000, )\nY = (5 * X_init**3 - 4 * X_init**2 + 3 * X_init - 2) + noise\n\nWe’ll try to fit a degree 5 polynomial function to our data.\n\nX = PolynomialFeatures(degree=5, include_bias=True).fit_transform(X_init.reshape(-1,1))\nN_features = X.shape[1]\n\n\nplt.scatter(X[:,1], Y, s=0.5, label = 'data points')\nplt.xlabel(\"X\")\nplt.ylabel(\"Y\")\nplt.legend()\nplt.show()\n\n\n\n\n\n\n\n\n\n\nLearning a BLR model on the entire data\nWe’ll take \\(M_0\\) (prior mean) as zero vector initially, assuming that we do not have any prior knowledge about \\(M_0\\). 
We’re taking \\(S_0\\) (prior covariance) as the identity matrix, assuming that all coefficients are completely independent of each other.\n\nS0 = np.eye(N_features)\nM0 = np.zeros((N_features, ))\nmodel = BLR(S0, M0)\n\n\nmodel.fit(X, Y)\n\n\n\nVisualising the fit\n\nmodel.plot(s=0.5)\n\n\n\n\n\n\n\n\nThis doesn’t look like a good fit, right? Let’s set the prior closer to the real values and visualize the fit again.\n\n\nVisualising the fit after changing the prior\n\nnp.random.seed(seed)\nS0 = np.eye(N_features)\nM0 = np.array([-2, 3, -4, 5, 0, 0]) + np.random.randn(N_features, )\nmodel = BLR(S0, M0)\n\n\nmodel.fit(X, Y)\nmodel.plot(s=0.5)\n\n\n\n\n\n\n\n\nHmm, better. Now let’s see how it fits after reducing the noise and setting the prior mean to zero vector again.\n\n\nVisualising the fit after reducing the noise\n\nnp.random.seed(seed)\nX_init = np.linspace(-1, 1, 1000)\nnoise = np.random.randn(1000, ) * 0.5\nY = (5 * X_init**3 - 4 * X_init**2 + 3 * X_init - 2) + noise\n\n\nS0 = np.eye(N_features)\nM0 = np.zeros((N_features, ))\nmodel = BLR(S0, M0)\n\n\nmodel.fit(X, Y)\nmodel.plot(s=0.5)\n\n\n\n\n\n\n\n\nWhen the noise was high, the model tended to align with the prior. After keeping the prior closer to the original coefficients, the model was improved as expected. From the last plot, we can say that as noise reduces from the data, the impact of the prior reduces, and the model tries to fit the data more precisely. Therefore, we can say that when data is too noisy or insufficient, a wisely chosen prior can produce a precise fit.\n\n\nIntuition to Active Learning (Uncertainty Sampling) with an example\nLet’s take the case where we want to train a machine learning model to classify if a person is infected with COVID-19 or not, but the testing facilities for the same are not available so widely. We may have very few amounts of data for detected positive and detected negative patients. 
Now, we want our model to be highly confident or least uncertain about its results; otherwise, it may create havoc for wrongly classified patients, but, our bottleneck is labeled data. Thanks to active learning techniques, we can overcome this problem smartly. How?\nWe train our model with existing data and test it on all the suspected patients’ data. Let’s say we have an uncertainty measure or confidence level about each tested data point (distance from the decision boundary in case of SVM, variance in case of Gaussian processes, or Bayesian Linear Regression). We can choose a patient for which our model is least certain, and send him to COVID-19 testing facilities (assuming that we can send only one patient at a time). Now, we can include his data to the train set and test the model on everyone else. By following the same procedure repeatedly, we can increase the size of our train data and confidence of the model without sending everyone randomly for testing.\nThis method is called Uncertainty Sampling in Active Learning. Now let’s formally define Active Learning. From Wikipedia,\nActive learning is a special case of machine learning in which a learning algorithm can interactively query a user (or some other information source) to label new data points with the desired outputs.\nNow, we’ll go through the active learning procedure step by step.\n\n\nTrain set, test set, and pool. What is what?\nThe train set includes labeled data points. The pool includes potential data points to query for a label, and the test set includes labeled data points to check the performance of our model. 
Here, we cannot actually query anyone, so we assume that we do not have labels for the pool while training, and after each iteration, we move the data point from the pool for which our model has the highest uncertainty into the train set.\nSo, the algorithm can be represented as follows:\n\nTrain the model with the train set.\nTest the performance on the test set (this is expected to improve).\nTest the model on the pool.\nQuery for the most uncertain datapoint from the pool.\nAdd that datapoint to the train set.\nRepeat steps 1 to 5 for \\(K\\) iterations (\\(K\\) ranges from \\(0\\) to the pool size).\n\n\n\nCreating initial train set, test set, and pool\nLet’s take half of the dataset as the test set; from the other half, we will start with a few points as the train set and keep the remaining points as the pool. Let’s start with 2 data points as the train set.\n\nnp.random.seed(seed)\nX_init = np.linspace(-1, 1, 1000)\nX = PolynomialFeatures(degree=5, include_bias=True).fit_transform(X_init.reshape(-1,1))\nnoise = np.random.randn(1000, ) * 0.5\nY = (5 * X_init**3 - 4 * X_init**2 + 3 * X_init - 2) + noise\n\n\ntrain_pool_X, test_X, train_pool_Y, test_Y = train_test_split(X, Y, test_size = 0.5, random_state=seed)\ntrain_X, pool_X, train_Y, pool_Y = train_test_split(train_pool_X, train_pool_Y, train_size=2, random_state=seed)\n\nVisualizing train, test and pool.\n\nplt.scatter(test_X[:,1], test_Y, label='test set',color='r', s=2)\nplt.scatter(train_X[:,1], train_Y, label='train set',marker='s',color='k', s=50)\nplt.scatter(pool_X[:,1], pool_Y, label='pool',color='b', s=2)\nplt.xlabel('X')\nplt.ylabel('Y')\nplt.legend()\nplt.show()\n\n\n\n\n\n\n\n\nLet’s initialize a few dictionaries to keep track of each iteration.\n\ntrain_X_iter = {} # to store train points at each iteration\ntrain_Y_iter = {} # to store corresponding labels to the train set at each iteration\nmodels = {} # to store the models at each iteration\nestimations = {} # to store the 
estimations on the test set at each iteration\ntest_mae_error = {} # to store MAE(Mean Absolute Error) at each iteration\n\n\n\nTraining & testing initial learner on train set (Iteration 0)\nNow we will train the model for the initial train set, which is iteration 0.\n\ntrain_X_iter[0] = train_X\ntrain_Y_iter[0] = train_Y\n\n\nS0 = np.eye(N_features)\nM0 = np.zeros((N_features, ))\nmodels[0] = BLR(S0, M0)\n\n\nmodels[0].fit(train_X_iter[0], train_Y_iter[0])\n\nCreating a plot method to visualize train, test and pool with estimations and uncertainty.\n\ndef plot(ax, model, init_title=''):\n # Plotting the pool\n ax.scatter(pool_X[:,1], pool_Y, label='pool',s=1,color='r',alpha=0.4)\n \n # Plotting the test data\n ax.scatter(test_X[:,1], test_Y, label='test data',s=1, color='b', alpha=0.4)\n \n # Combining the test & the pool\n test_pool_X, test_pool_Y = np.append(test_X,pool_X, axis=0), np.append(test_Y,pool_Y)\n \n # Sorting test_pool for plotting\n sorted_inds = np.argsort(test_pool_X[:,1])\n test_pool_X, test_pool_Y = test_pool_X[sorted_inds], test_pool_Y[sorted_inds]\n \n # Plotting test_pool with uncertainty\n model.predict(test_pool_X)\n individual_var = model.pred_var.diagonal()\n ax.plot(test_pool_X[:,1], model.y_hat_map, color='black', label='model')\n ax.fill_between(test_pool_X[:,1], model.y_hat_map-individual_var, model.y_hat_map+individual_var\n , alpha=0.2, color='black', label='uncertainty')\n \n # Plotting the train data\n ax.scatter(model.x[:,1], model.y,s=40, color='k', marker='s', label='train data')\n ax.scatter(model.x[-1,1], model.y[-1],s=80, color='r', marker='o', label='last added point')\n \n # Plotting MAE on the test set\n model.predict(test_X)\n ax.set_title(init_title+' MAE is '+str(np.mean(np.abs(test_Y - model.y_hat_map))))\n ax.set_xlabel('x')\n ax.set_ylabel('y')\n ax.legend()\n\nPlotting the estimations and uncertainty.\n\nfig, ax = plt.subplots()\nplot(ax, models[0])\n\n\n\n\n\n\n\n\nLet’s check the maximum uncertainty about any 
point for the model.\n\nmodels[0].pred_var.diagonal().max()\n\n4.8261426545316604e-29\n\n\nOops!! There is almost no uncertainty in the model. Why? Let’s try again with more train points.\n\ntrain_pool_X, test_X, train_pool_Y, test_Y = train_test_split(X, Y, test_size = 0.5, random_state=seed)\ntrain_X, pool_X, train_Y, pool_Y = train_test_split(train_pool_X, train_pool_Y, train_size=7, random_state=seed)\n\n\ntrain_X_iter[0] = train_X\ntrain_Y_iter[0] = train_Y\n\n\nS0 = np.eye(N_features)\nM0 = np.zeros((N_features, ))\nmodels[0] = BLR(S0, M0)\n\n\nmodels[0].fit(train_X_iter[0], train_Y_iter[0])\n\n\nfig, ax = plt.subplots()\nplot(ax, models[0])\n\n\n\n\n\n\n\n\nNow the uncertainty is visible, and currently, it’s high at the left-most points. We are trying to fit a degree 5 polynomial here, so our linear regression has 6 coefficients, including the bias. If we choose 6 or fewer train points, our model fits them perfectly and shows no uncertainty. Choosing more than 6 train points induces uncertainty in the model.\nLet’s evaluate the performance on the test set.\n\nestimations[0], _ = models[0].predict(test_X)\ntest_mae_error[0] = np.mean(np.abs(test_Y - estimations[0]))\n\nThe Mean Absolute Error (MAE) on the test set is\n\ntest_mae_error[0]\n\n0.5783654195019617\n\n\n\n\nMoving the most uncertain point from the pool to the train set\nIn the previous plot, we saw that the model was least certain about the left-most point. 
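The uncertainty values above come from the posterior covariance maintained by the `BLR` class defined earlier in the post. For reference, the standard conjugate update behind such a model can be sketched as follows; this is a minimal sketch assuming a known noise precision `beta`, and the helper names `blr_posterior` and `predictive_variance` are hypothetical, not part of the post's `BLR` class:

```python
import numpy as np

def blr_posterior(X, y, S0, m0, beta=1.0):
    # Conjugate Bayesian linear regression update:
    #   S_N = (S_0^-1 + beta * X^T X)^-1
    #   m_N = S_N (S_0^-1 m_0 + beta * X^T y)
    S0_inv = np.linalg.inv(S0)
    SN = np.linalg.inv(S0_inv + beta * X.T @ X)
    mN = SN @ (S0_inv @ m0 + beta * X.T @ y)
    return mN, SN

def predictive_variance(X_star, SN, beta=1.0):
    # Per-point predictive variance: x^T S_N x + 1/beta
    return np.einsum("ij,jk,ik->i", X_star, SN, X_star) + 1.0 / beta

# Toy check: recover known coefficients from noisy data
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=200)
mN, SN = blr_posterior(X, y, S0=np.eye(3), m0=np.zeros(3))
```

Note that under this formulation the predictive variance never drops below the noise floor `1/beta`; the near-zero variance printed above suggests the post's `BLR` class reports only the weight-uncertainty term `x^T S_N x`.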
We’ll move that point from the pool to the train set and see the effect.\n\nestimations_pool, _ = models[0].predict(pool_X)\n\nFinding the point with the most uncertainty.\n\nin_var = models[0].pred_var.diagonal().argmax()\nto_add_x = pool_X[in_var,:]\nto_add_y = pool_Y[in_var]\n\nAdding the point from the pool to the train set.\n\ntrain_X_iter[1] = np.vstack([train_X_iter[0], to_add_x])\ntrain_Y_iter[1] = np.append(train_Y_iter[0], to_add_y)\n\nDeleting the point from the pool.\n\npool_X = np.delete(pool_X, in_var, axis=0)\npool_Y = np.delete(pool_Y, in_var)\n\n\n\nTraining again and visualising the results (Iteration 1)\nThis time, we will pass the previously learnt posterior mean as the prior for the next iteration.\n\nS0 = np.eye(N_features)\nmodels[1] = BLR(S0, models[0].MN)\n\n\nmodels[1].fit(train_X_iter[1], train_Y_iter[1])\n\n\nestimations[1], _ = models[1].predict(test_X)\ntest_mae_error[1] = np.mean(np.abs(test_Y - estimations[1]))\n\nMAE on the test set is\n\ntest_mae_error[1]\n\n0.5779411133071186\n\n\nVisualizing the results.\n\nfig, ax = plt.subplots()\nplot(ax, models[1])\n\n\n\n\n\n\n\n\nBefore & after adding the most uncertain point\n\nfig, ax = plt.subplots(1,2, figsize=(13.5,4.5))\nplot(ax[0], models[0],'Before')\nplot(ax[1], models[1],'After')\n\n\n\n\n\n\n\n\nWe can see that including the most uncertain point in the train set has produced a better fit, and the MAE on the test set has been reduced. 
Also, uncertainty has reduced at the left part of the data but it has increased a bit on the right part of the data.\nNow let’s do this for few more iterations in a loop and visualise the results.\n\n\nActive learning procedure\n\nnum_iterations = 20\npoints_added_x= np.zeros((num_iterations+1, N_features))\n\npoints_added_y=[]\n\nprint(\"Iteration, Cost\\n\")\nprint(\"-\"*40)\n\nfor iteration in range(2, num_iterations+1):\n # Making predictions on the pool set based on model learnt in the respective train set \n estimations_pool, var = models[iteration-1].predict(pool_X)\n \n # Finding the point from the pool with highest uncertainty\n in_var = var.diagonal().argmax()\n to_add_x = pool_X[in_var,:]\n to_add_y = pool_Y[in_var]\n points_added_x[iteration-1,:] = to_add_x\n points_added_y.append(to_add_y)\n \n # Adding the point to the train set from the pool\n train_X_iter[iteration] = np.vstack([train_X_iter[iteration-1], to_add_x])\n train_Y_iter[iteration] = np.append(train_Y_iter[iteration-1], to_add_y)\n \n # Deleting the point from the pool\n pool_X = np.delete(pool_X, in_var, axis=0)\n pool_Y = np.delete(pool_Y, in_var)\n \n # Training on the new set\n models[iteration] = BLR(S0, models[iteration-1].MN)\n models[iteration].fit(train_X_iter[iteration], train_Y_iter[iteration])\n \n estimations[iteration], _ = models[iteration].predict(test_X)\n test_mae_error[iteration]= pd.Series(estimations[iteration] - test_Y.squeeze()).abs().mean()\n print(iteration, (test_mae_error[iteration]))\n\nIteration, Cost\n\n----------------------------------------\n2 0.49023173501654815\n3 0.4923391714942153\n4 0.49040074812746753\n5 0.49610198614600165\n6 0.5015282102751122\n7 0.5051264429971314\n8 0.5099913097301352\n9 0.504455016053513\n10 0.5029219102020734\n11 0.5009762782262487\n12 0.5004883097883343\n13 0.5005169638980388\n14 0.5002731089932334\n15 0.49927485683909884\n16 0.49698416490822594\n17 0.49355398855432897\n18 0.49191185613804617\n19 0.491164833699368\n20 
0.4908067530719673\n\n\n\npd.Series(test_mae_error).plot(style='ko-')\nplt.xlim((-0.5, num_iterations+0.5))\nplt.ylabel(\"MAE on test set\")\nplt.xlabel(\"# Points Queried\")\nplt.show()\n\n\n\n\n\n\n\n\nThe plot above shows that the MAE on the test set fluctuates a bit initially, then reduces gradually as we keep moving points from the pool to the train set. Let’s visualise the fits for all the iterations; we’ll discuss this behaviour after that.\n\n\nVisualizing active learning procedure\n\nprint('Initial model')\nprint('Y = {0:0.2f} X^5 + {1:0.2f} X^4 + {2:0.2f} X^3 + {3:0.2f} X^2 + {4:0.2f} X + {5:0.2f}'.format(*models[0].MN[::-1]))\nprint('\\nFinal model')\nprint('Y = {0:0.2f} X^5 + {1:0.2f} X^4 + {2:0.2f} X^3 + {3:0.2f} X^2 + {4:0.2f} X + {5:0.2f}'.format(*models[num_iterations].MN[::-1]))\n\nInitial model\nY = 1.89 X^5 + 1.54 X^4 + 0.84 X^3 + -6.48 X^2 + 4.74 X + -1.63\n\nFinal model\nY = 2.50 X^5 + 3.11 X^4 + 0.83 X^3 + -7.08 X^2 + 4.47 X + -1.58\n\n\n\ndef update(iteration):\n ax.cla()\n plot(ax, models[iteration])\n fig.tight_layout()\n\n\nfig, ax = plt.subplots()\nanim = FuncAnimation(fig, update, frames=np.arange(0, num_iterations+1, 1), interval=250)\nplt.close()\nrc('animation', html='jshtml')\n\n\nanim\n\n\n\n\nWe can see that the point with the highest uncertainty was chosen in the first iteration, and it produced a near-optimal fit. 
After that, error reduced gradually.\nNow, let’s put everything together and create a class for active learning procedure\n\n\nCreating a class for active learning procedure\n\nclass ActiveL():\n def __init__(self, X, y, S0=None, M0=None, test_size=0.5, degree = 5, iterations = 20, seed=1):\n self.X_init = X\n self.y = y\n self.S0 = S0\n self.M0 = M0\n self.train_X_iter = {} # to store train points at each iteration\n self.train_Y_iter = {} # to store corresponding labels to the train set at each iteration\n self.models = {} # to store the models at each iteration\n self.estimations = {} # to store the estimations on the test set at each iteration\n self.test_mae_error = {} # to store MAE(Mean Absolute Error) at each iteration\n self.test_size = test_size\n self.degree = degree\n self.iterations = iterations\n self.seed = seed\n self.train_size = degree + 2\n\n def data_preperation(self):\n # Adding polynomial features\n self.X = PolynomialFeatures(degree=self.degree).fit_transform(self.X_init)\n N_features = self.X.shape[1]\n \n # Splitting into train, test and pool\n train_pool_X, self.test_X, train_pool_Y, self.test_Y = train_test_split(self.X, self.y, \n test_size=self.test_size,\n random_state=self.seed)\n self.train_X, self.pool_X, self.train_Y, self.pool_Y = train_test_split(train_pool_X, train_pool_Y, \n train_size=self.train_size, \n random_state=self.seed)\n \n # Setting BLR prior incase of not given\n if self.M0 == None:\n self.M0 = np.zeros((N_features, ))\n if self.S0 == None:\n self.S0 = np.eye(N_features)\n \n def main(self):\n # Training for iteration 0\n self.train_X_iter[0] = self.train_X\n self.train_Y_iter[0] = self.train_Y\n self.models[0] = BLR(self.S0, self.M0)\n self.models[0].fit(self.train_X, self.train_Y)\n\n # Running loop for all iterations\n for iteration in range(1, self.iterations+1):\n # Making predictions on the pool set based on model learnt in the respective train set \n estimations_pool, var = 
self.models[iteration-1].predict(self.pool_X)\n \n # Finding the point from the pool with highest uncertainty\n in_var = var.diagonal().argmax()\n to_add_x = self.pool_X[in_var,:]\n to_add_y = self.pool_Y[in_var]\n \n # Adding the point to the train set from the pool\n self.train_X_iter[iteration] = np.vstack([self.train_X_iter[iteration-1], to_add_x])\n self.train_Y_iter[iteration] = np.append(self.train_Y_iter[iteration-1], to_add_y)\n \n # Deleting the point from the pool\n self.pool_X = np.delete(self.pool_X, in_var, axis=0)\n self.pool_Y = np.delete(self.pool_Y, in_var)\n \n # Training on the new set\n self.models[iteration] = BLR(self.S0, self.models[iteration-1].MN)\n self.models[iteration].fit(self.train_X_iter[iteration], self.train_Y_iter[iteration])\n \n self.estimations[iteration], _ = self.models[iteration].predict(self.test_X)\n self.test_mae_error[iteration]= pd.Series(self.estimations[iteration] - self.test_Y.squeeze()).abs().mean()\n\n def _plot_iter_MAE(self, ax, iteration):\n ax.plot(list(self.test_mae_error.values())[:iteration+1], 'ko-')\n ax.set_title('MAE on test set over iterations')\n ax.set_xlim((-0.5, self.iterations+0.5))\n ax.set_ylabel(\"MAE on test set\")\n ax.set_xlabel(\"# Points Queried\")\n \n def _plot(self, ax, model):\n # Plotting the pool\n ax.scatter(self.pool_X[:,1], self.pool_Y, label='pool',s=1,color='r',alpha=0.4)\n \n # Plotting the test data\n ax.scatter(self.test_X[:,1], self.test_Y, label='test data',s=1, color='b', alpha=0.4)\n \n # Combining test_pool\n test_pool_X, test_pool_Y = np.append(self.test_X, self.pool_X, axis=0), np.append(self.test_Y, self.pool_Y)\n \n # Sorting test_pool\n sorted_inds = np.argsort(test_pool_X[:,1])\n test_pool_X, test_pool_Y = test_pool_X[sorted_inds], test_pool_Y[sorted_inds]\n \n # Plotting test_pool with uncertainty\n preds, var = model.predict(test_pool_X)\n individual_var = var.diagonal()\n ax.plot(test_pool_X[:,1], model.y_hat_map, color='black', label='model')\n 
ax.fill_between(test_pool_X[:,1], model.y_hat_map-individual_var, model.y_hat_map+individual_var\n , alpha=0.2, color='black', label='uncertainty')\n \n # plotting the train data\n ax.scatter(model.x[:,1], model.y,s=10, color='k', marker='s', label='train data')\n ax.scatter(model.x[-1,1], model.y[-1],s=80, color='r', marker='o', label='last added point')\n \n # plotting MAE\n preds, var = model.predict(self.test_X)\n ax.set_title('MAE is '+str(np.mean(np.abs(self.test_Y - preds))))\n ax.set_xlabel('x')\n ax.set_ylabel('y')\n ax.legend()\n \n def visualise_AL(self):\n fig, ax = plt.subplots(1,2,figsize=(13,5))\n def update(iteration):\n ax[0].cla()\n ax[1].cla()\n self._plot(ax[0], self.models[iteration])\n self._plot_iter_MAE(ax[1], iteration)\n fig.tight_layout()\n\n print('Initial model')\n print('Y = '+' + '.join(['{0:0.2f}'.format(self.models[0].MN[i])+' X^'*min(i,1)+str(i)*min(i,1) for i in range(self.degree+1)]))\n print('\\nFinal model')\n print('Y = '+' + '.join(['{0:0.2f}'.format(self.models[self.iterations].MN[i])+' X^'*min(i,1)+str(i)*min(i,1) for i in range(self.degree+1)]))\n\n anim = FuncAnimation(fig, update, frames=np.arange(0, self.iterations+1, 1), interval=250)\n plt.close()\n\n rc('animation', html='jshtml')\n return anim\n\n\n\nVisualizing a different polynomial fit on the same dataset\nLet’s try to fit a degree 7 polynomial to the same data now.\n\nnp.random.seed(seed)\nX_init = np.linspace(-1, 1, 1000)\nnoise = np.random.randn(1000, ) * 0.5\nY = (5 * X_init**3 - 4 * X_init**2 + 3 * X_init - 2) + noise\n\n\nmodel = ActiveL(X_init.reshape(-1,1), Y, degree=7, iterations=20, seed=seed)\n\n\nmodel.data_preperation()\nmodel.main()\nmodel.visualise_AL()\n\nInitial model\nY = -1.92 + 3.79 X^1 + -1.81 X^2 + -0.43 X^3 + -0.51 X^4 + -0.27 X^5 + -0.18 X^6 + -0.11 X^7\n\nFinal model\nY = -1.79 + 4.86 X^1 + -5.38 X^2 + 0.50 X^3 + -0.17 X^4 + 1.19 X^5 + 1.83 X^6 + 1.31 X^7\n\n\n\n\n\n\n\n\n\n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n 
\n\n\nWe can clearly see that the model was fitting the train points well, and uncertainty was high at the left-most position. After the first iteration, the left-most point was added to the train set and the MAE reduced significantly. A similar phenomenon happened at iteration 2 with the right-most point. After that, the error kept reducing gradually at a slower rate, because the fit was already near optimal after just 2 iterations.\n\n\nActive learning for diabetes dataset from the Scikit-learn module\nLet’s run our model on the diabetes data from the sklearn module. The data has various features of diabetic patients, such as age, sex, and weight, and the target is the progression of the disease after one year. We’ll choose only the ‘weight’ feature, which seems to be more correlated with the target than the others.\nWe’ll try to fit a degree 1 polynomial to this data, as it seems to follow a linear trend. First, let’s check the performance of the Scikit-learn linear regression model.\n\nX, Y = datasets.load_diabetes(return_X_y=True)\nX = X[:, 2].reshape(-1,1) # Choosing only feature 2, which seems most relevant to linear regression\n\n# Normalizing\nX = (X - X.min())/(X.max() - X.min())\nY = (Y - Y.min())/(Y.max() - Y.min())\n\nVisualizing the dataset.\n\nplt.scatter(X, Y)\nplt.xlabel('Weight of the patients')\nplt.ylabel('Increase in the disease after a year')\nplt.show()\n\n\n\n\n\n\n\n\nLet’s fit the Scikit-learn linear regression model with a 50% train-test split.\n\nfrom sklearn.linear_model import LinearRegression\ntrain_X, test_X, train_Y, test_Y = train_test_split(X, Y, test_size = 0.5, random_state = seed)\n\n\nclf = LinearRegression()\n\n\nclf.fit(train_X, train_Y)\npred_Y = clf.predict(test_X)\n\nVisualizing the fit & MAE.\n\nplt.scatter(X, Y, label='data', s=5)\nplt.plot(test_X, pred_Y, label='model', color='r')\nplt.xlabel('Weight of the patients')\nplt.ylabel('Increase in the disease after a year')\nplt.title('MAE is '+str(np.mean(np.abs(pred_Y - 
test_Y))))\nplt.legend()\nplt.show()\n\n\n\n\n\n\n\n\nNow we’ll fit the same data with our BLR model.\n\nmodel = ActiveL(X.reshape(-1,1), Y, degree=1, iterations=20, seed=seed)\n\n\nmodel.data_preperation()\nmodel.main()\nmodel.visualise_AL()\n\nInitial model\nY = 0.41 + 0.16 X^1\n\nFinal model\nY = 0.13 + 0.86 X^1\n\n\n\n\nInitially, the fit leans towards zero slope, which shows the influence of the zero-mean prior when the number of training points is low. It’s interesting to see that our initial train points suggest a nearly vertical fit, but the model doesn’t get carried away by that and stabilizes itself with the prior.\n\nprint('MAE for Scikit-learn Linear Regression is',np.mean(np.abs(pred_Y - test_Y)))\nprint('MAE for Bayesian Linear Regression is', model.test_mae_error[20])\n\nMAE for Scikit-learn Linear Regression is 0.15424985705353944\nMAE for Bayesian Linear Regression is 0.15738001811804758\n\n\nIn the end, the results of sklearn linear regression and our active-learning-based BLR model are comparable, even though we’ve trained our model on only a handful of points (the few initial points plus 20 queried) versus the 221 points used by sklearn. This is because active learning enables us to choose the training points that contribute the most towards a precise fit."
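As a recap, the whole procedure used in this post reduces to a short generic loop. Below is a minimal sketch of pool-based uncertainty sampling; `fit` and `predict_var` are hypothetical callables standing in for the `BLR` interface used above:

```python
import numpy as np

def uncertainty_sampling(fit, predict_var, X_train, y_train, X_pool, y_pool, n_queries):
    # Pool-based uncertainty sampling: repeatedly move the most
    # uncertain pool point into the train set and refit.
    for _ in range(n_queries):
        model = fit(X_train, y_train)
        i = predict_var(model, X_pool).argmax()    # most uncertain pool point
        X_train = np.vstack([X_train, X_pool[i]])  # "query" its label
        y_train = np.append(y_train, y_pool[i])
        X_pool = np.delete(X_pool, i, axis=0)
        y_pool = np.delete(y_pool, i)
    return fit(X_train, y_train), X_train, y_train

# Toy run: the "model" is just the mean label, and uncertainty grows
# with |x|, so the extreme points get queried first.
fit = lambda X, y: float(np.mean(y))
predict_var = lambda model, X: np.abs(X[:, 0])
model, X_new, y_new = uncertainty_sampling(
    fit, predict_var,
    np.array([[0.0]]), np.array([0.0]),
    np.array([[1.0], [3.0], [2.0]]), np.array([1.0, 3.0, 2.0]),
    n_queries=2,
)
```

With the post's `BLR` model, `predict_var` would return the diagonal of the predictive covariance, exactly as the main loop above does with `var.diagonal()`.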
}, { "objectID": "posts/Multiclass_GP_classification.html", @@ -952,214 +1001,200 @@ "text": "Try random forest model\n\nmodel = RandomForestClassifier(n_estimators=1000, random_state=0)\nmodel.fit(X_train, y_train.ravel())\n\nprint(\"Random Forest\")\npreds = model.predict(X_test)\nprint(classification_report(y_test, preds))\n\nRandom Forest\n precision recall f1-score support\n\n 0 0.78 0.54 0.64 13\n 1 0.60 0.86 0.71 7\n 2 0.92 0.85 0.88 13\n 3 0.92 1.00 0.96 11\n 4 1.00 1.00 1.00 11\n 5 0.91 1.00 0.95 10\n 6 1.00 1.00 1.00 6\n 7 1.00 1.00 1.00 17\n 8 1.00 1.00 1.00 12\n\n accuracy 0.91 100\n macro avg 0.90 0.92 0.90 100\nweighted avg 0.91 0.91 0.91 100\n\n\n\n\nfig, ax = plt.subplots(figsize=(6, 4))\ny_grid = model.predict(X_grid) + 0.5 # due to some reason, unless we add 0.5, the contourf is not showing all the colors\nax.contourf(Grid1, Grid2, y_grid.reshape(Grid1.shape), cmap='rainbow', alpha=0.5)\nax.scatter(X[:, 0], X[:, 1], c=y, s=50, cmap='rainbow', edgecolor='k');" }, { - "objectID": "posts/foundation-models-for-time-series.html", - "href": "posts/foundation-models-for-time-series.html", - "title": "Foundation Models for Time Series Forecasting", + "objectID": "posts/2022-05-14-iteratively_reweighted_least_squares.html", + "href": "posts/2022-05-14-iteratively_reweighted_least_squares.html", + "title": "Iteratively reweighted least squares (IRLS) logistic regression", "section": "", - "text": "# Config\nimport os\n\n# Basic\nimport numpy as np\nimport pandas as pd\nimport matplotlib.pyplot as plt\n\n# Monitoring\nfrom tqdm.notebook import tqdm\n\n# IO\nfrom os.path import join, exists, basename, dirname\nfrom glob import glob\n\n# Parallel processing\nfrom joblib import Parallel, delayed\n\nimport xarray as xr" + "text": "import jax\nimport jax.numpy as jnp\nimport numpy as np\nfrom sklearn.datasets import make_blobs\nimport matplotlib.pyplot as plt\nfrom matplotlib.animation import FuncAnimation\nfrom time import time\n\n# Enable high precision\nfrom 
jax.config import config\nconfig.update(\"jax_enable_x64\", True)\n\n# To enable animation inside notebook\nplt.rc(\"animation\", html=\"jshtml\")" }, { - "objectID": "posts/foundation-models-for-time-series.html#data", - "href": "posts/foundation-models-for-time-series.html#data", - "title": "Foundation Models for Time Series Forecasting", - "section": "Data", - "text": "Data\n\nds = xr.open_zarr(\"zip:///::https://huggingface.co/datasets/Zeel/P1/resolve/main/all_in_one.zarr.zip\")\nds\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n<xarray.Dataset> Size: 25GB\nDimensions: (Timestamp: 245376, station: 537)\nCoordinates:\n * Timestamp (Timestamp) datetime64[ns] 2MB 2017-01-01 ... 2023-12-31T23:...\n address (station) <U187 402kB ...\n city (station) <U18 39kB ...\n latitude (station) float64 4kB ...\n longitude (station) float64 4kB ...\n state (station) <U17 37kB ...\n * station (station) <U64 137kB '32Bungalows, Bhilai - CECB' ... 'Ward-...\nData variables: (12/24)\n AT (Timestamp, station) float64 1GB ...\n BP (Timestamp, station) float64 1GB ...\n Benzene (Timestamp, station) float64 1GB ...\n CO (Timestamp, station) float64 1GB ...\n Eth-Benzene (Timestamp, station) float64 1GB ...\n MP-Xylene (Timestamp, station) float64 1GB ...\n ... ...\n TOT-RF (Timestamp, station) float64 1GB ...\n Toluene (Timestamp, station) float64 1GB ...\n VWS (Timestamp, station) float64 1GB ...\n WD (Timestamp, station) float64 1GB ...\n WS (Timestamp, station) float64 1GB ...\n Xylene (Timestamp, station) float64 1GB ...xarray.DatasetDimensions:Timestamp: 245376station: 537Coordinates: (7)Timestamp(Timestamp)datetime64[ns]2017-01-01 ... 
2023-12-31T23:45:00array(['2017-01-01T00:00:00.000000000', '2017-01-01T00:15:00.000000000',\n '2017-01-01T00:30:00.000000000', ..., '2023-12-31T23:15:00.000000000',\n '2023-12-31T23:30:00.000000000', '2023-12-31T23:45:00.000000000'],\n dtype='datetime64[ns]')address(station)<U187...[537 values with dtype=<U187]city(station)<U18...[537 values with dtype=<U18]latitude(station)float64...[537 values with dtype=float64]longitude(station)float64...[537 values with dtype=float64]state(station)<U17...[537 values with dtype=<U17]station(station)<U64'32Bungalows, Bhilai - CECB' ......array(['32Bungalows, Bhilai - CECB', 'AIIMS, Raipur - CECB',\n 'Bhatagaon New ISBT, Raipur - CECB', ...,\n 'Sidhu Kanhu Indoor Stadium, Durgapur - WBPCB',\n 'Victoria, Kolkata - WBPCB', 'Ward-32 Bapupara, Siliguri - WBPCB'],\n dtype='<U64')Data variables: (24)AT(Timestamp, station)float64...unit :°C[131766912 values with dtype=float64]BP(Timestamp, station)float64...unit :mmHg[131766912 values with dtype=float64]Benzene(Timestamp, station)float64...unit :µg/m³[131766912 values with dtype=float64]CO(Timestamp, station)float64...unit :mg/m³[131766912 values with dtype=float64]Eth-Benzene(Timestamp, station)float64...unit :µg/m³[131766912 values with dtype=float64]MP-Xylene(Timestamp, station)float64...unit :µg/m³[131766912 values with dtype=float64]NH3(Timestamp, station)float64...unit :µg/m³[131766912 values with dtype=float64]NO(Timestamp, station)float64...unit :µg/m³[131766912 values with dtype=float64]NO2(Timestamp, station)float64...unit :µg/m³[131766912 values with dtype=float64]NOx(Timestamp, station)float64...unit :ppb[131766912 values with dtype=float64]O Xylene(Timestamp, station)float64...unit :µg/m³[131766912 values with dtype=float64]Ozone(Timestamp, station)float64...unit :µg/m³[131766912 values with dtype=float64]PM10(Timestamp, station)float64...unit :µg/m³[131766912 values with dtype=float64]PM2.5(Timestamp, station)float64...unit :µg/m³[131766912 values with 
dtype=float64]RF(Timestamp, station)float64...unit :mm[131766912 values with dtype=float64]RH(Timestamp, station)float64...unit :%[131766912 values with dtype=float64]SO2(Timestamp, station)float64...unit :µg/m³[131766912 values with dtype=float64]SR(Timestamp, station)float64...unit :W/mt2[131766912 values with dtype=float64]TOT-RF(Timestamp, station)float64...unit :mm[131766912 values with dtype=float64]Toluene(Timestamp, station)float64...unit :µg/m³[131766912 values with dtype=float64]VWS(Timestamp, station)float64...unit :m/s[131766912 values with dtype=float64]WD(Timestamp, station)float64...unit :deg[131766912 values with dtype=float64]WS(Timestamp, station)float64...unit :m/s[131766912 values with dtype=float64]Xylene(Timestamp, station)float64...unit :µg/m³[131766912 values with dtype=float64]Indexes: (2)TimestampPandasIndexPandasIndex(DatetimeIndex(['2017-01-01 00:00:00', '2017-01-01 00:15:00',\n '2017-01-01 00:30:00', '2017-01-01 00:45:00',\n '2017-01-01 01:00:00', '2017-01-01 01:15:00',\n '2017-01-01 01:30:00', '2017-01-01 01:45:00',\n '2017-01-01 02:00:00', '2017-01-01 02:15:00',\n ...\n '2023-12-31 21:30:00', '2023-12-31 21:45:00',\n '2023-12-31 22:00:00', '2023-12-31 22:15:00',\n '2023-12-31 22:30:00', '2023-12-31 22:45:00',\n '2023-12-31 23:00:00', '2023-12-31 23:15:00',\n '2023-12-31 23:30:00', '2023-12-31 23:45:00'],\n dtype='datetime64[ns]', name='Timestamp', length=245376, freq=None))stationPandasIndexPandasIndex(Index(['32Bungalows, Bhilai - CECB', 'AIIMS, Raipur - CECB',\n 'Bhatagaon New ISBT, Raipur - CECB',\n 'Civic Center, Bhilai - Bhilai Steel Plant',\n 'Govt. 
Higher Secondary School, Milupara - CECB',\n 'Hathkhoj, Bhilai - CECB', 'Krishak Nagar, Raipur - CECB',\n 'Mangala, Bilaspur - NTPC', 'Nawapara SECL Colony, Chhal - CECB',\n 'OP Jindal Industrial Park, Tumidih - CECB',\n ...\n 'Ghusuri, Howrah - WBPCB', 'Jadavpur, Kolkata - WBPCB',\n 'Padmapukur, Howrah - WBPCB',\n 'Priyambada Housing Estate, Haldia - WBPCB',\n 'Rabindra Bharati University, Kolkata - WBPCB',\n 'Rabindra Sarobar, Kolkata - WBPCB',\n 'SVSPA Campus, Barrackpore - WBPCB',\n 'Sidhu Kanhu Indoor Stadium, Durgapur - WBPCB',\n 'Victoria, Kolkata - WBPCB', 'Ward-32 Bapupara, Siliguri - WBPCB'],\n dtype='object', name='station', length=537))Attributes: (0)\n\n\n\none_station_ds = ds.sel(station=\"IGI Airport (T3), Delhi - IMD\", Timestamp=slice(\"2022\", \"2023\"))[[\"PM2.5\"]]\none_station_ds\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n<xarray.Dataset> Size: 1MB\nDimensions: (Timestamp: 70080)\nCoordinates:\n * Timestamp (Timestamp) datetime64[ns] 561kB 2022-01-01 ... 2023-12-31T23:...\n address <U187 748B ...\n city <U18 72B ...\n latitude float64 8B ...\n longitude float64 8B ...\n state <U17 68B ...\n station <U64 256B 'IGI Airport (T3), Delhi - IMD'\nData variables:\n PM2.5 (Timestamp) float64 561kB ...xarray.DatasetDimensions:Timestamp: 70080Coordinates: (7)Timestamp(Timestamp)datetime64[ns]2022-01-01 ... 
2023-12-31T23:45:00array(['2022-01-01T00:00:00.000000000', '2022-01-01T00:15:00.000000000',\n '2022-01-01T00:30:00.000000000', ..., '2023-12-31T23:15:00.000000000',\n '2023-12-31T23:30:00.000000000', '2023-12-31T23:45:00.000000000'],\n dtype='datetime64[ns]')address()<U187...[1 values with dtype=<U187]city()<U18...[1 values with dtype=<U18]latitude()float64...[1 values with dtype=float64]longitude()float64...[1 values with dtype=float64]state()<U17...[1 values with dtype=<U17]station()<U64'IGI Airport (T3), Delhi - IMD'array('IGI Airport (T3), Delhi - IMD', dtype='<U64')Data variables: (1)PM2.5(Timestamp)float64...unit :µg/m³[70080 values with dtype=float64]Indexes: (1)TimestampPandasIndexPandasIndex(DatetimeIndex(['2022-01-01 00:00:00', '2022-01-01 00:15:00',\n '2022-01-01 00:30:00', '2022-01-01 00:45:00',\n '2022-01-01 01:00:00', '2022-01-01 01:15:00',\n '2022-01-01 01:30:00', '2022-01-01 01:45:00',\n '2022-01-01 02:00:00', '2022-01-01 02:15:00',\n ...\n '2023-12-31 21:30:00', '2023-12-31 21:45:00',\n '2023-12-31 22:00:00', '2023-12-31 22:15:00',\n '2023-12-31 22:30:00', '2023-12-31 22:45:00',\n '2023-12-31 23:00:00', '2023-12-31 23:15:00',\n '2023-12-31 23:30:00', '2023-12-31 23:45:00'],\n dtype='datetime64[ns]', name='Timestamp', length=70080, freq=None))Attributes: (0)\n\n\n\ndata = one_station_ds['PM2.5'].to_dataframe()[['PM2.5']]\n\n# convert to hourly data\ndata = data.resample('h').mean()\n\n# how much missing data\nprint(f\"Missing data: {data.isna().sum().values[0]}\")\n\n# fill missing data\ndata = data.interpolate(method='linear')\n\nprint(f\"Missing data after interpolation: {data.isna().sum().values[0]}\")\n\ndata.head()\n\nMissing data: 298\nMissing data after interpolation: 0\n\n\n\n\n\n\n\n\n\nPM2.5\n\n\nTimestamp\n\n\n\n\n\n2022-01-01 00:00:00\n273.5475\n\n\n2022-01-01 01:00:00\n268.8675\n\n\n2022-01-01 02:00:00\n258.0225\n\n\n2022-01-01 03:00:00\n194.9100\n\n\n2022-01-01 04:00:00\n197.9975\n\n\n\n\n\n\n\n\nimport timesfm\n\ntfm = 
timesfm.TimesFm(\n context_len=32,\n horizon_len=24,\n input_patch_len=32,\n output_patch_len=128,\n num_layers=20,\n model_dims=1280,\n backend=\"gpu\",\n)\ntfm.load_from_checkpoint(repo_id=\"google/timesfm-1.0-200m\")\n\nMultiprocessing context has already been set.\nConstructing model weights.\n\n\nWARNING:absl:No registered CheckpointArgs found for handler type: <class 'paxml.checkpoints.FlaxCheckpointHandler'>\nWARNING:absl:Configured `CheckpointManager` using deprecated legacy API. Please follow the instructions at https://orbax.readthedocs.io/en/latest/api_refactor.html to migrate by May 1st, 2024.\nWARNING:absl:train_state_unpadded_shape_dtype_struct is not provided. We assume `train_state` is unpadded.\n\n\nConstructed model weights in 3.76 seconds.\nRestoring checkpoint from /home/patel_zeel/.cache/huggingface/hub/models--google--timesfm-1.0-200m/snapshots/8775f7531211ac864b739fe776b0b255c277e2be/checkpoints.\n\n\n\n---------------------------------------------------------------------------\nMemoryError Traceback (most recent call last)\nCell In[6], line 12\n 1 import timesfm\n 3 tfm = timesfm.TimesFm(\n 4 context_len=32,\n 5 horizon_len=24,\n (...)\n 10 backend=\"gpu\",\n 11 )\n---> 12 tfm.load_from_checkpoint(repo_id=\"google/timesfm-1.0-200m\")\n\nFile ~/timesfm/src/timesfm.py:270, in TimesFm.load_from_checkpoint(self, checkpoint_path, repo_id, checkpoint_type, step)\n 268 self._logging(f\"Restoring checkpoint from {checkpoint_path}.\")\n 269 start_time = time.time()\n--> 270 self._train_state = checkpoints.restore_checkpoint(\n 271 train_state_local_shapes,\n 272 checkpoint_dir=checkpoint_path,\n 273 checkpoint_type=checkpoint_type,\n 274 state_specs=train_state_partition_specs,\n 275 step=step,\n 276 )\n 277 self._logging(\n 278 f\"Restored checkpoint in {time.time() - start_time:.2f} seconds.\"\n 279 )\n 281 # Initialize and jit the decode fn.\n\nFile /opt/anaconda3/envs/tfm_env/lib/python3.10/site-packages/paxml/checkpoints.py:246, in 
restore_checkpoint(state_global_shapes, checkpoint_dir, global_mesh, checkpoint_type, state_specs, step, enforce_restore_shape_check, state_unpadded_shape_dtype_struct, tensorstore_use_ocdbt, restore_transformations)\n 240 if checkpoint_type == CheckpointType.GDA:\n 241 restore_args = {\n 242 'specs': state_specs,\n 243 'mesh': global_mesh,\n 244 'transforms': restore_transformations,\n 245 }\n--> 246 output = checkpoint_manager.restore(\n 247 step,\n 248 state_global_shapes,\n 249 state_unpadded_shape_dtype_struct,\n 250 restore_kwargs=restore_args,\n 251 )\n 252 # Note: `aux_items` argument wasn't passed to checkpoint_manager.restore()\n 253 # so this returns a TrainState instance.\n 254 return cast(train_states.TrainState, output)\n\nFile /opt/anaconda3/envs/tfm_env/lib/python3.10/site-packages/paxml/checkpoint_managers.py:568, in OrbaxCheckpointManager.restore(self, step, train_state, train_state_unpadded_shape_dtype_struct, train_input_pipeline, restore_kwargs, aux_items, aux_restore_kwargs)\n 565 if train_input_pipeline and self._train_checkpoint_exists(step):\n 566 items[INPUT_ITEM_NAME] = train_input_pipeline\n--> 568 restored = self._manager.restore(\n 569 step, items=items, restore_kwargs=restore_kwargs\n 570 )\n 572 # Skip metadata checks if using transformations, since the TrainState may be\n 573 # completely altered.\n 574 if self.version > 1.0 and not uses_transformations:\n 575 # If unpadded shapes were not provided, skip the shape check for now, as\n 576 # there are many callers that need to be changed.\n\nFile /opt/anaconda3/envs/tfm_env/lib/python3.10/site-packages/orbax/checkpoint/checkpoint_manager.py:1055, in CheckpointManager.restore(self, step, items, restore_kwargs, directory, args)\n 1052 args = typing.cast(args_lib.Composite, args)\n 1054 restore_directory = self._get_read_step_directory(step, directory)\n-> 1055 restored = self._checkpointer.restore(restore_directory, args=args)\n 1056 if self._single_item:\n 1057 return 
restored[DEFAULT_ITEM_NAME]\n\nFile /opt/anaconda3/envs/tfm_env/lib/python3.10/site-packages/orbax/checkpoint/checkpointer.py:170, in Checkpointer.restore(self, directory, *args, **kwargs)\n 168 logging.info('Restoring item from %s.', directory)\n 169 ckpt_args = construct_checkpoint_args(self._handler, False, *args, **kwargs)\n--> 170 restored = self._handler.restore(directory, args=ckpt_args)\n 171 logging.info('Finished restoring checkpoint from %s.', directory)\n 172 utils.sync_global_processes('Checkpointer:restore', self._active_processes)\n\nFile /opt/anaconda3/envs/tfm_env/lib/python3.10/site-packages/orbax/checkpoint/composite_checkpoint_handler.py:470, in CompositeCheckpointHandler.restore(self, directory, args)\n 468 continue\n 469 handler = self._get_or_set_handler(item_name, arg)\n--> 470 restored[item_name] = handler.restore(\n 471 self._get_item_directory(directory, item_name), args=arg\n 472 )\n 473 return CompositeResults(**restored)\n\nFile /opt/anaconda3/envs/tfm_env/lib/python3.10/site-packages/orbax/checkpoint/composite_checkpoint_handler.py:138, in _AsyncLegacyCheckpointHandlerWrapper.restore(self, directory, args)\n 137 def restore(self, directory: epath.Path, args: '_AsyncWrapperArgs'):\n--> 138 return self._handler.restore(directory, *args.args, **args.kwargs)\n\nFile /opt/anaconda3/envs/tfm_env/lib/python3.10/site-packages/paxml/checkpoints.py:685, in FlaxCheckpointHandler.restore(self, directory, item, restore_args, transforms, transforms_default_to_original, version)\n 680 str_pytree_state = str(pytree_state)\n 681 input_target = {\n 682 'flattened_state': flattened_state,\n 683 'str_pytree_state': str_pytree_state,\n 684 }\n--> 685 restored_target = super().restore(directory, input_target)\n 686 # Flax restore_checkpoint returned input_target unchanged if\n 687 # no step specified and no checkpoint files present.\n 688 if restored_target is input_target:\n\nFile 
/opt/anaconda3/envs/tfm_env/lib/python3.10/site-packages/orbax/checkpoint/pytree_checkpoint_handler.py:1089, in PyTreeCheckpointHandler.restore(self, directory, item, restore_args, transforms, transforms_default_to_original, legacy_transform_fn, args)\n 1085 raise FileNotFoundError(\n 1086 f'Requested directory for restore does not exist at {directory}'\n 1087 )\n 1088 byte_limiter = get_byte_limiter(self._concurrent_gb)\n-> 1089 structure, use_zarr3_metadata = self._get_internal_metadata(directory)\n 1090 # `checkpoint_restore_args` has a structure relative to the checkpoint,\n 1091 # while `restore_args` remains structured relative to the output.\n 1092 param_infos, checkpoint_restore_args = _get_restore_parameters(\n 1093 directory,\n 1094 item,\n (...)\n 1102 else self._use_zarr3,\n 1103 )\n\nFile /opt/anaconda3/envs/tfm_env/lib/python3.10/site-packages/orbax/checkpoint/pytree_checkpoint_handler.py:1312, in PyTreeCheckpointHandler._get_internal_metadata(self, directory)\n 1296 def _get_internal_metadata(\n 1297 self, directory: epath.Path\n 1298 ) -> Tuple[PyTree, Optional[bool]]:\n 1299 \"\"\"Gets limited information needed to fully restore the checkpoint.\n 1300 \n 1301 This information just consists of the restore type for each leaf, as well\n (...)\n 1310 checkpoint.\n 1311 \"\"\"\n-> 1312 aggregate_tree = self._read_aggregate_file(directory)\n 1313 flat_aggregate = utils.to_flat_dict(aggregate_tree, keep_empty_nodes=True)\n 1314 try:\n\nFile /opt/anaconda3/envs/tfm_env/lib/python3.10/site-packages/orbax/checkpoint/pytree_checkpoint_handler.py:1172, in PyTreeCheckpointHandler._read_aggregate_file(self, directory)\n 1170 checkpoint_path = directory / self._aggregate_filename\n 1171 if checkpoint_path.exists():\n-> 1172 return self._aggregate_handler.deserialize(checkpoint_path)\n 1173 elif self._use_ocdbt:\n 1174 raise FileNotFoundError(\n 1175 f'Checkpoint structure file does not exist at {directory}.'\n 1176 )\n\nFile 
/opt/anaconda3/envs/tfm_env/lib/python3.10/site-packages/orbax/checkpoint/aggregate_handlers.py:86, in MsgpackHandler.deserialize(self, path)\n 84 \"\"\"See superclass documentation.\"\"\"\n 85 if path.exists():\n---> 86 msgpack = path.read_bytes()\n 87 return msgpack_utils.msgpack_restore(msgpack)\n 88 else:\n\nFile /opt/anaconda3/envs/tfm_env/lib/python3.10/site-packages/etils/epath/abstract_path.py:152, in Path.read_bytes(self)\n 150 \"\"\"Reads contents of self as bytes.\"\"\"\n 151 with self.open('rb') as f:\n--> 152 return f.read()\n\nMemoryError:" + "objectID": "posts/2022-05-14-iteratively_reweighted_least_squares.html#create-dataset",
+ "href": "posts/2022-05-14-iteratively_reweighted_least_squares.html#create-dataset",
+ "title": "Iteratively reweighted least squares (IRLS) logistic regression",
+ "section": "Create dataset",
+ "text": "Create dataset\n\nfeatures, labels = make_blobs(100, n_features=2, centers=2, random_state=0)\nplt.scatter(features[:, 0], features[:, 1], c=labels);\n\nprint(features.shape, features.dtype, labels.shape, labels.dtype)\n\n(100, 2) float64 (100,) int64"
 },
 {
+ "objectID": "posts/learnings_from_brick_kiln_project.html",
+ "href": "posts/learnings_from_brick_kiln_project.html",
+ "title": "Learnings from the Brick Kiln Project",
+ "section": "",
+ "text": "Labeling is the most important and most effort-intensive part of the project. It is also the most confusing part if not done properly for the images. For example, we needed to make this decision for the images of the brick kilns: “If the brick kiln firing chamber is visible fully or partially at a level where a human would be able to identify it as a brick kiln, we mark it as a brick kiln”.\nTo ensure good quality of labels, one should allow a small number of images to be labeled by multiple people and then compare the labels. This will help in identifying mistakes in the labeling process and also help in improving the labeling instructions."
+ "objectID": "posts/2022-05-14-iteratively_reweighted_least_squares.html#implementing-newtons-method-naive-way",
+ "href": "posts/2022-05-14-iteratively_reweighted_least_squares.html#implementing-newtons-method-naive-way",
+ "title": "Iteratively reweighted least squares (IRLS) logistic regression",
+ "section": "Implementing Newton’s method (naive way)",
+ "text": "Implementing Newton’s method (naive way)\nWe will first try to implement Eq. 10.31 directly from the PML book [1]:\n\[\n\boldsymbol{w}_{t+1}=\boldsymbol{w}_{t}-\eta_{t} \mathbf{H}_{t}^{-1} \boldsymbol{g}_{t}\n\]\n\ndef get_logits(params, feature): # for a single data-point\n logits = jnp.sum(feature * params[\"w\"]) + params[\"b\"]\n return logits\n\ndef naive_loss(params, feature, label): # for a single data-point\n logits = get_logits(params, feature)\n prob = jax.nn.sigmoid(logits)\n\n # Check if label is 1 or 0\n is_one = (label == 1)\n loss_if_one = lambda: -jnp.log(prob) # loss if label is 1\n loss_if_zero = lambda: -jnp.log(1 - prob) # loss if label is 0\n\n # Use lax.cond to convert if..else.. 
in jittable format\n loss = jax.lax.cond(is_one, loss_if_one, loss_if_zero)\n\n return loss\n\ndef naive_loss_batch(params, features, labels): # for a batch of data-points\n losses = jax.vmap(naive_loss, in_axes=(None, 0, 0))(params, features, labels)\n return jnp.mean(losses)\n\nWriting the train function\n\ndef naive_train_step(params, features, labels, learning_rate):\n # Find gradient\n loss_value, grads = jax.value_and_grad(naive_loss_batch)(params, features, labels)\n\n # Find Hessian\n hess = jax.hessian(naive_loss_batch)(params, features, labels)\n\n # Adjust Hessian matrix nicely\n hess_matrix = jnp.block([[hess[\"b\"][\"b\"], hess[\"b\"][\"w\"]],\n [hess[\"w\"][\"b\"], hess[\"w\"][\"w\"]]])\n \n # Adjust gradient vector nicely\n grad_vector = jnp.r_[grads[\"b\"], grads[\"w\"]]\n\n # Find H^-1g\n h_inv_g = jnp.dot(jnp.linalg.inv(hess_matrix), grad_vector)\n\n # Get back the structure\n h_inv_g = {\"b\": h_inv_g[0], \"w\": h_inv_g[1:]}\n\n # Apply the update\n params = jax.tree_map(lambda p, g: p - learning_rate*g, params, h_inv_g)\n\n return params, loss_value\n\n# First order method\n# vg = jax.value_and_grad(naive_loss_batch)\n# def train_step(params, features, labels, learning_rate):\n# # Find gradient\n# loss_value, grads = vg(params, features, labels)\n\n# # Apply the update\n# params = jax.tree_map(lambda p, g: p - learning_rate*g, params, grads)\n\n# return params, loss_value\n\n\nkey = jax.random.PRNGKey(0)\nrandom_params = jax.random.normal(key, shape=(3, ))\n# \"b\" should have shape (1,) for hessian trick with jnp.block to work\nparams = {\"w\": random_params[:2], \"b\": random_params[2].reshape(1,)}\nlearning_rate = 1.0\nepochs = 20\n\ntrain_step_jitted = jax.jit(naive_train_step)\n\nhistory = {\"loss\": [], \"params\": []}\n\n# warm up\ntrain_step_jitted(params, features, labels, learning_rate)\n\ninit = time()\nfor _ in range(epochs):\n history[\"params\"].append(params)\n params, loss_value = train_step_jitted(params, features, labels, 
learning_rate)\n history[\"loss\"].append(loss_value)\nprint(time() - init, \"seconds\")\nprint(params)\n\nWARNING:absl:No GPU/TPU found, falling back to CPU. (Set TF_CPP_MIN_LOG_LEVEL=0 and rerun for more info.)\n\n\n0.0015490055084228516 seconds\n{'b': DeviceArray([13.22076694], dtype=float64), 'w': DeviceArray([ 0.59021174, -5.18797851], dtype=float64)}\n\n\nA helper function to animate the learning.\n\ndef animate(history):\n fig, ax = plt.subplots(1, 2, figsize=(10,4))\n def update(idx):\n # Clear previous frame\n ax[0].cla()\n ax[1].cla()\n\n # Plot data\n params = history[\"params\"][idx]\n losses = history[\"loss\"][:idx]\n ax[0].scatter(features[:, 0], features[:, 1], c=labels)\n \n # Calculate and plot decision boundary\n x0_min, x0_max = features[:, 0].min(), features[:, 0].max()\n x1_min = -(params[\"b\"] + params[\"w\"][0] * x0_min)/params[\"w\"][1]\n x1_max = -(params[\"b\"] + params[\"w\"][0] * x0_max)/params[\"w\"][1]\n\n ax[0].plot([x0_min, x0_max], [x1_min, x1_max], label='decision boundary')\n\n # Plot losses\n ax[1].plot(losses, label=\"loss\")\n ax[1].set_xlabel(\"Iterations\")\n\n ax[0].legend()\n ax[1].legend()\n\n anim = FuncAnimation(fig, update, range(epochs))\n plt.close()\n return anim\n\n\nanimate(history)"
 },
 {
+ "objectID": "posts/learnings_from_brick_kiln_project.html#points",
+ "href": "posts/learnings_from_brick_kiln_project.html#points",
+ "title": "Learnings from the Brick Kiln Project",
+ "section": "",
+ "text": "Labeling is the most important and most effort-intensive part of the project. It is also the most confusing part if not done properly for the images. 
For example, we needed to make this decision for the images of the brick kilns: “If brick kiln firing chamber is visible fully or partially at a level where a human would be able to identify it as a brick kiln, we mark it as a brick kiln”.\nTo ensure good quality of labels, one should allow a small number of images to be labeled by multiple people and then compare the labels. This will help in identifying the mistakes in the labeling process and also help in improving the labeling instructions." + "objectID": "posts/2022-05-14-iteratively_reweighted_least_squares.html#implementing-irls-algorithm", + "href": "posts/2022-05-14-iteratively_reweighted_least_squares.html#implementing-irls-algorithm", + "title": "Iteratively reweighted least squares (IRLS) logistic regression", + "section": "Implementing IRLS algorithm", + "text": "Implementing IRLS algorithm\n\ndef get_s_and_z(params, feature, label): # for a single data-point\n logits = get_logits(params, feature)\n prob = jax.nn.sigmoid(logits)\n s = prob * (1 - prob)\n z = logits + (label - prob)/s\n return s, z\n\ndef irls_train_step(params, features, labels):\n s, z = jax.vmap(get_s_and_z, in_axes=(None, 0, 0))(params, features, labels)\n S = jnp.diag(s.flatten()) # convert into a diagonal matrix\n\n # Add column with ones\n X = jnp.c_[jnp.ones(len(z)), features]\n\n # Get weights\n weights = jnp.linalg.inv(X.T@S@X)@X.T@S@z.flatten()\n\n # get correct format\n params = {\"b\": weights[0], \"w\": weights[1:]}\n\n return params\n\n\nkey = jax.random.PRNGKey(0)\nrandom_params = jax.random.normal(key, shape=(3,))\nparams = {\"w\": random_params[:2], \"b\": random_params[2]}\nepochs = 20\n\ntrain_step_jitted = jax.jit(irls_train_step)\n\nirls_history = {\"params\": []}\n\n# warm up\ntrain_step_jitted(params, features, labels)\n\ninit = time()\nfor _ in range(epochs):\n irls_history[\"params\"].append(params)\n params = train_step_jitted(params, features, labels)\nprint(time() - init, 
\"seconds\")\nprint(params)\n\n0.0016303062438964844 seconds\n{'b': DeviceArray(13.22076694, dtype=float64), 'w': DeviceArray([ 0.59021174, -5.18797851], dtype=float64)}"
 },
 {
+ "objectID": "posts/2022-05-14-iteratively_reweighted_least_squares.html#comparison",
+ "href": "posts/2022-05-14-iteratively_reweighted_least_squares.html#comparison",
+ "title": "Iteratively reweighted least squares (IRLS) logistic regression",
+ "section": "Comparison",
+ "text": "Comparison\n\nnaive_params_b = list(map(lambda x: x[\"b\"], history[\"params\"]))\nirls_params_b = list(map(lambda x: x[\"b\"], irls_history[\"params\"]))\n\nnaive_params_w = list(map(lambda x: x[\"w\"], history[\"params\"]))\nirls_params_w = list(map(lambda x: x[\"w\"], irls_history[\"params\"]))\n\n\nplt.plot(naive_params_b, \"o-\", label=\"Naive\")\nplt.plot(irls_params_b, label=\"IRLS\")\nplt.xlabel(\"Iterations\")\nplt.title(\"Bias\")\nplt.legend();\n\nplt.plot(naive_params_w, \"o-\", label=\"Naive\")\nplt.plot(irls_params_w, label=\"IRLS\")\nplt.xlabel(\"Iterations\")\nplt.title(\"Weights\")\nplt.legend();"
 },
 {
+ "objectID": "posts/presentation_tips.html",
+ "href": "posts/presentation_tips.html",
+ "title": "Conference Presentation Tips",
+ "section": "",
+ "text": "General\n\nThe first page goes like this:\n\nTitle\nAuthors (underline the presenting author; no need to put * in case of equal contribution)\nAffiliations\nConference name\n\nIf importing figures from the paper, avoid including the captions.\nInclude lots of images and little math.\nThe talk should end with a summary, not with a future-work or thank-you slide.\nCite the references at the bottom of the same slide.\n\nRefer to the “Giving talks” section of this blog.\n\n\nDos and Don’ts\n\nNever put in overly detailed information that is difficult to grasp: a table with many numbers, a complex derivation all in one go, a very complicated diagram."
 },
 {
+ "objectID": "posts/2021-10-23-warped-gp.html",
+ "href": "posts/2021-10-23-warped-gp.html",
+ "title": "Input 
Warped GPs - A failed idea",
+ "section": "",
+ "text": "Comments\n\nWe are warping inputs \(\mathbf{x}\) into \(\mathbf{w}\cdot\mathbf{x}\)\nLearning a second-level GP over \(\mathbf{w}\).\nApplying a penalty on \(\mathbf{w}\) if it varies too much unnecessarily.\nSee problems at the end of the notebook.\nWe need to check mathematical concerns related to this transformation.\n\n\nimport math\nimport numpy as np\nimport torch\nimport gpytorch\nfrom matplotlib import pyplot as plt\nimport regdata as rd\nfrom sklearn.cluster import KMeans\n\n\nclass ExactGPModel(gpytorch.models.ExactGP):\n def __init__(self, train_x, train_y, likelihood):\n super(ExactGPModel, self).__init__(train_x, train_y, likelihood)\n self.mean_module = gpytorch.means.ConstantMean()\n self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())\n\n def forward(self, x):\n mean_x = self.mean_module(x)\n covar_x = self.covar_module(x)\n return gpytorch.distributions.MultivariateNormal(mean_x, covar_x)\n\nclass ExactNSGPModel(gpytorch.models.ExactGP):\n def __init__(self, train_x, train_y, likelihood, num_latent):\n super(ExactNSGPModel, self).__init__(train_x, train_y, likelihood)\n# inds = np.random.choice(train_x.shape[0], size=num_latent, replace=False)\n# self.x_bar = train_x[inds]\n self.x_bar = torch.tensor(KMeans(n_clusters=num_latent).fit(train_x).cluster_centers_).to(train_x)\n self.w_bar = torch.nn.Parameter(torch.ones(num_latent,).to(self.x_bar))\n self.bias = torch.nn.Parameter(torch.zeros(1,).to(self.x_bar))\n self.latent_likelihood = gpytorch.likelihoods.GaussianLikelihood()\n# We can fix noise to be minimum but it is not ideal. 
Ideally, noise should automatically reduce to reasonable value.\n# self.latent_likelihood.raw_noise.requires_grad = False\n# self.latent_likelihood.raw_noise = torch.tensor(-10.)\n self.latent_model = ExactGPModel(self.x_bar, self.w_bar, self.latent_likelihood)\n \n self.mean_module = gpytorch.means.ConstantMean()\n self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())\n\n def forward(self, x):\n self.latent_model.eval()\n with gpytorch.settings.detach_test_caches(False): # needed to back propagate thru predictive posterior\n self.latent_model.set_train_data(self.x_bar, self.w_bar, strict=False)\n self.w = self.latent_likelihood(self.latent_model(x)) # predictive posterior\n x_warped = x*self.w.mean[:, None] + self.bias\n mean_x = self.mean_module(x_warped)\n covar_x = self.covar_module(x_warped)\n return gpytorch.distributions.MultivariateNormal(mean_x, covar_x)\n\n\ndef training(model, likelihood):\n training_iter = 100\n\n # Find optimal model hyperparameters\n model.train()\n likelihood.train()\n\n # Use the adam optimizer\n optimizer = torch.optim.Adam([\n {'params': model.parameters()}, # Includes GaussianLikelihood parameters\n ], lr=0.1)\n\n # \"Loss\" for GPs - the marginal log likelihood\n mll = gpytorch.mlls.ExactMarginalLogLikelihood(likelihood, model)\n\n for i in range(training_iter):\n # Zero gradients from previous iteration\n optimizer.zero_grad()\n # Output from model\n output = model(train_x)\n # Calc loss and backprop gradients\n try:\n loss = -mll(output, train_y) + torch.square(model.w.mean-1).mean()\n# print(model.latent_likelihood.noise)\n except AttributeError:\n loss = -mll(output, train_y)\n loss.backward()\n# print('Iter %d/%d - Loss: %.3f lengthscale: %.3f noise: %.3f' % (\n# i + 1, training_iter, loss.item(),\n# model.covar_module.base_kernel.lengthscale.item(),\n# model.likelihood.noise.item()\n# ))\n optimizer.step()\n \ndef predict_plot(model, likelihood, title):\n # Get into evaluation (predictive 
posterior) mode\n model.eval()\n likelihood.eval()\n\n # Test points are regularly spaced along [0,1]\n # Make predictions by feeding model through likelihood\n with torch.no_grad():\n observed_pred = likelihood(model(test_x))\n\n with torch.no_grad():\n # Initialize plot\n f, ax = plt.subplots(1, 1, figsize=(10, 6))\n\n # Get upper and lower confidence bounds\n lower, upper = observed_pred.confidence_region()\n # Plot training data as black stars\n ax.plot(train_x.numpy(), train_y.numpy(), 'k*')\n # Plot predictive means as blue line\n ax.plot(test_x.numpy(), observed_pred.mean.numpy(), 'b')\n # Shade between the lower and upper confidence bounds\n ax.fill_between(test_x.numpy().ravel(), lower.numpy(), upper.numpy(), alpha=0.5)\n ax.legend(['Observed Data', 'Mean', 'Confidence'])\n ax.set_title(title)\n return observed_pred\n\n\ndef GP(num_latent):\n\n # initialize likelihood and model\n likelihood = gpytorch.likelihoods.GaussianLikelihood()\n model = ExactGPModel(train_x, train_y, likelihood)\n \n training(model, likelihood)\n predict_plot(model, likelihood, 'GP')\n\ndef NSGP(num_latent):\n\n # initialize likelihood and model\n likelihood = gpytorch.likelihoods.GaussianLikelihood()\n model = ExactNSGPModel(train_x, train_y, likelihood, num_latent)\n \n training(model, likelihood)\n observed_pred = predict_plot(model, likelihood, 'NSGP')\n \n with torch.no_grad():\n model.train()\n model.forward(test_x)\n plt.figure(figsize=(10,6))\n plt.plot(test_x*model.w.mean[:, None], observed_pred.mean.numpy())\n plt.title('Warped test inputs v/s test outputs')\n \n with torch.no_grad():\n model.train()\n model.forward(test_x)\n plt.figure(figsize=(10,6))\n plt.plot(test_x, model.w.mean, label='interpolated')\n plt.scatter(model.x_bar, model.w_bar, label='learned')\n plt.ylim(0,2)\n plt.title('Test input v/s weights')\n plt.legend()\n\n\n\nTesting over various datasets\n\ntrain_x, train_y, test_x = 
rd.DellaGattaGene(backend='torch').get_data()\nGP(0)\nNSGP(num_latent=7)\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\ntrain_x, train_y, test_x = rd.Heinonen4(backend='torch').get_data()\nGP(0)\nNSGP(num_latent=10)\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\ntrain_x, train_y, test_x = rd.Jump1D(backend='torch').get_data()\nGP(0)\nNSGP(num_latent=10)\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\ntrain_x, train_y, test_x = rd.MotorcycleHelmet(backend='torch').get_data()\nGP(0)\nNSGP(num_latent=10)\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\ntrain_x, train_y, test_x = rd.Olympic(backend='torch').get_data()\nGP(0)\nNSGP(num_latent=10)\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\ntrain_x, train_y, test_x = rd.SineJump1D(backend='torch').get_data()\nGP(0)\nNSGP(num_latent=10)\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\ntrain_x, train_y, test_x = rd.SineNoisy(backend='torch').get_data()\nGP(0)\nNSGP(num_latent=10)\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nProblems\n\nTransformation from x to x_warped is not monotonic." 
+ "text": "Similarities among ML domains\n\n\n\nNN\nTransformer\nCNN\n\n\n\n\n-\nMulti-head\nMulti-channel\n\n\n-\nSkip-connection\nResNet\n\n\n\n\n\nProgress of Natural Language Processing\n\n\n\n\n\n\n\n\n\nModel\nMain Disadvantage\nSolved by\nHow?\n\n\n\n\nNN\nCan’t handle dynamic length input\nRNN\nRNN can handle dynamic length input\n\n\nRNN\nVanishing Gradient Problem\nLSTM\nLSTM can handle vanishing gradient problem\n\n\nLSTM\nNon parallelizable\nTransformer\nTransformer can parallelize the computation\n\n\nTrasformer\nlosses sequentiality\nTransformer\nPositional Encoding" }, { - "objectID": "posts/numpy-algebra- copy.html", - "href": "posts/numpy-algebra- copy.html", - "title": "How numpy handles day-to-day algebra?", + "objectID": "posts/pruning_vs_uncertainty.html", + "href": "posts/pruning_vs_uncertainty.html", + "title": "Pruning vs Uncertainty", "section": "", - "text": "import numpy as np\n\nnp.__version__\n\n118 False\n140 False\numath 8 False\numath 10 False\numath 12 True\ncore 73 False\nYes!\nnumeric 29 False\nnumeric 41 False\nnumeric 1128 False\nnumeric 2515 False\nnumeric 2517 True\ncore 75 False\ncore 77 True\ncore 83 True\ncore 94 True\n142 False\n144 True\n\n\n'1.25.2'" + "text": "import os\nos.environ[\"CUDA_VISIBLE_DEVICES\"] = \"3\"\n\nimport numpy as np\n\nimport torch\nimport torch.nn as nn\n\n# import pruning library\nimport torch.nn.utils.prune as prune\n\n# import torchvision\nimport torchvision\nfrom torchvision import transforms, datasets\nfrom torch.utils.data import DataLoader, TensorDataset\n\nfrom tqdm import tqdm\n\nfrom sklearn.calibration import calibration_curve\nfrom sklearn.metrics import classification_report\nimport matplotlib.pyplot as plt\n\ntry:\n from laplace import Laplace\nexcept ModuleNotFoundError:\n %pip install laplace-torch\n from laplace import Laplace\n\n<frozen importlib._bootstrap>:228: RuntimeWarning: scipy._lib.messagestream.MessageStream size changed, may indicate binary incompatibility. 
Expected 56 from C header, got 64 from PyObject" }, { - "objectID": "posts/numpy-algebra- copy.html#motivation", - "href": "posts/numpy-algebra- copy.html#motivation", - "title": "How numpy handles day-to-day algebra?", - "section": "Motivation", - "text": "Motivation\nIn this blog post, we will try to figure out how numpy handles seemingly simple math operations. The motivation behind this exploration is to figure out if there are a few foundational operations behind most of the frequently used functions in numpy. For the sake of right level of abstraction, we will not look into addition, subtraction, multiplication, and division." + "objectID": "posts/pruning_vs_uncertainty.html#train-a-model-on-mnist", + "href": "posts/pruning_vs_uncertainty.html#train-a-model-on-mnist", + "title": "Pruning vs Uncertainty", + "section": "Train a model on MNIST", + "text": "Train a model on MNIST\n\n# Define data transformations\ntransform = transforms.Compose(\n [\n transforms.Resize((224, 224)),\n transforms.Grayscale(num_output_channels=3), # Convert to RGB format\n transforms.ToTensor(),\n transforms.Normalize((0.5,), (0.5,)),\n # convert dtype to float32\n # transforms.Lambda(lambda x: x.to(torch.float32)),\n ]\n)\n\n\n# Load MNIST dataset\ndevice = torch.device(\"cuda\" if torch.cuda.is_available() else \"cpu\")\nprint(f\"Using {device} device\")\ntrain_dataset = datasets.MNIST(\n root=\"./data\", train=True, transform=transform, download=True\n)\nprint(\"Train size\", len(train_dataset))\n\ntrain_dataset = TensorDataset(\n train_dataset.data[..., None]\n .repeat(1, 1, 1, 3)\n .swapaxes(1, 3)\n .swapaxes(2, 3)\n .to(torch.float32)\n .to(device),\n train_dataset.targets.to(device),\n)\ntest_dataset = datasets.MNIST(\n root=\"./data\", train=False, transform=transform, download=True\n)\nprint(\"Test size\", len(test_dataset))\ntest_dataset = TensorDataset(\n test_dataset.data[..., None]\n .repeat(1, 1, 1, 3)\n .swapaxes(1, 3)\n .swapaxes(2, 3)\n .to(torch.float32)\n 
.to(device),\n test_dataset.targets.to(device),\n)\n\nUsing cuda device\nTrain size 60000\nTest size 10000\n\n\n\ntrain_dataset[0][0].dtype, train_dataset[0][1].dtype\n\n(torch.float32, torch.int64)\n\n\n\n# Define data loaders\nbatch_size = 64\ntrain_loader = DataLoader(train_dataset, batch_size=batch_size, shuffle=True)\ntest_loader = DataLoader(test_dataset, batch_size=batch_size, shuffle=False)\n\n\n# Load pre-trained ResNet model\nresnet = torchvision.models.resnet18(pretrained=True)\nprint(\"Loaded pre-trained ResNet18 model\")\nprint(resnet.fc.in_features)\n\n# Modify the last fully connected layer to match MNIST's number of classes (10)\nnum_classes = 10\nresnet.fc = nn.Sequential(\n nn.Linear(resnet.fc.in_features, resnet.fc.in_features),\n nn.GELU(),\n nn.Linear(resnet.fc.in_features, num_classes),\n)\n\n# Freeze all layers except the last fully connected layer\nfor name, param in resnet.named_parameters():\n param.requires_grad = False\nresnet.fc.requires_grad_(True)\n\n# Define loss and optimizer\ncriterion = nn.CrossEntropyLoss()\noptimizer = torch.optim.Adam(resnet.parameters(), lr=1e-4)\n\n# Training loop\nnum_epochs = 50\nprint(f\"Training on device {device}\")\nresnet.to(device)\n\nprint(\"Training ResNet18 model\")\nfor epoch in range(num_epochs):\n resnet.train()\n epoch_loss = 0.0\n for images, labels in tqdm(train_loader):\n optimizer.zero_grad()\n outputs = resnet(images)\n loss = criterion(outputs, labels)\n loss.backward()\n optimizer.step()\n epoch_loss += loss.item()\n\n epoch_loss /= len(train_loader)\n\n print(f\"Epoch [{epoch+1}/{num_epochs}] Loss: {epoch_loss:.4f}\")\n\n # Evaluation\n resnet.eval()\n correct = 0\n total = 0\n with torch.no_grad():\n predicted_list = []\n for images, labels in test_loader:\n outputs = resnet(images)\n _, predicted = torch.max(outputs.data, 1)\n total += labels.size(0)\n correct += (predicted == labels).sum().item()\n\n print(f\"Accuracy on the test set: {(100 * correct / 
total):.2f}%\")\n\n/home/patel_zeel/miniconda3/envs/torch_dt/lib/python3.9/site-packages/torchvision/models/_utils.py:208: UserWarning: The parameter 'pretrained' is deprecated since 0.13 and may be removed in the future, please use 'weights' instead.\n warnings.warn(\n/home/patel_zeel/miniconda3/envs/torch_dt/lib/python3.9/site-packages/torchvision/models/_utils.py:223: UserWarning: Arguments other than a weight enum or `None` for 'weights' are deprecated since 0.13 and may be removed in the future. The current behavior is equivalent to passing `weights=ResNet18_Weights.IMAGENET1K_V1`. You can also use `weights=ResNet18_Weights.DEFAULT` to get the most up-to-date weights.\n warnings.warn(msg)\n\n\nLoaded pre-trained ResNet18 model\n512\nTraining on device cuda\nTraining ResNet18 model\n\n\n100%|██████████| 938/938 [00:03<00:00, 242.75it/s]\n\n\nEpoch [1/50] Loss: 1.0877\nAccuracy on the test set: 75.42%\n\n\n100%|██████████| 938/938 [00:03<00:00, 262.53it/s]\n\n\nEpoch [2/50] Loss: 0.8051\nAccuracy on the test set: 76.74%\n\n\n100%|██████████| 938/938 [00:03<00:00, 270.43it/s]\n\n\nEpoch [3/50] Loss: 0.7578\nAccuracy on the test set: 78.27%\n\n\n100%|██████████| 938/938 [00:03<00:00, 265.38it/s]\n\n\nEpoch [4/50] Loss: 0.7290\nAccuracy on the test set: 78.71%\n\n\n100%|██████████| 938/938 [00:03<00:00, 265.51it/s]\n\n\nEpoch [5/50] Loss: 0.7083\nAccuracy on the test set: 79.62%\n\n\n100%|██████████| 938/938 [00:03<00:00, 266.62it/s]\n\n\nEpoch [6/50] Loss: 0.6761\nAccuracy on the test set: 79.82%\n\n\n100%|██████████| 938/938 [00:03<00:00, 268.49it/s]\n\n\nEpoch [7/50] Loss: 0.6627\nAccuracy on the test set: 80.47%\n\n\n100%|██████████| 938/938 [00:03<00:00, 266.33it/s]\n\n\nEpoch [8/50] Loss: 0.6423\nAccuracy on the test set: 80.24%\n\n\n100%|██████████| 938/938 [00:03<00:00, 268.52it/s]\n\n\nEpoch [9/50] Loss: 0.6257\nAccuracy on the test set: 81.11%\n\n\n100%|██████████| 938/938 [00:03<00:00, 269.38it/s]\n\n\nEpoch [10/50] Loss: 0.6131\nAccuracy on the test 
set: 81.42%\n\n\n100%|██████████| 938/938 [00:03<00:00, 264.77it/s]\n\n\nEpoch [11/50] Loss: 0.5911\nAccuracy on the test set: 82.02%\n\n\n100%|██████████| 938/938 [00:03<00:00, 266.07it/s]\n\n\nEpoch [12/50] Loss: 0.5765\nAccuracy on the test set: 82.32%\n\n\n100%|██████████| 938/938 [00:03<00:00, 262.19it/s]\n\n\nEpoch [13/50] Loss: 0.5611\nAccuracy on the test set: 82.30%\n\n\n100%|██████████| 938/938 [00:04<00:00, 214.62it/s]\n\n\nEpoch [14/50] Loss: 0.5466\nAccuracy on the test set: 82.49%\n\n\n100%|██████████| 938/938 [00:04<00:00, 219.31it/s]\n\n\nEpoch [15/50] Loss: 0.5358\nAccuracy on the test set: 82.81%\n\n\n100%|██████████| 938/938 [00:04<00:00, 226.53it/s]\n\n\nEpoch [16/50] Loss: 0.5266\nAccuracy on the test set: 83.30%\n\n\n100%|██████████| 938/938 [00:05<00:00, 171.25it/s]\n\n\nEpoch [17/50] Loss: 0.5137\nAccuracy on the test set: 83.37%\n\n\n100%|██████████| 938/938 [00:03<00:00, 278.59it/s]\n\n\nEpoch [18/50] Loss: 0.5051\nAccuracy on the test set: 83.17%\n\n\n100%|██████████| 938/938 [00:03<00:00, 248.82it/s]\n\n\nEpoch [19/50] Loss: 0.4969\nAccuracy on the test set: 83.46%\n\n\n100%|██████████| 938/938 [00:05<00:00, 175.56it/s]\n\n\nEpoch [20/50] Loss: 0.4811\nAccuracy on the test set: 83.76%\n\n\n100%|██████████| 938/938 [00:03<00:00, 277.18it/s]\n\n\nEpoch [21/50] Loss: 0.4714\nAccuracy on the test set: 83.57%\n\n\n100%|██████████| 938/938 [00:03<00:00, 273.71it/s]\n\n\nEpoch [22/50] Loss: 0.4624\nAccuracy on the test set: 84.25%\n\n\n100%|██████████| 938/938 [00:03<00:00, 242.18it/s]\n\n\nEpoch [23/50] Loss: 0.4553\nAccuracy on the test set: 84.27%\n\n\n100%|██████████| 938/938 [00:03<00:00, 279.42it/s]\n\n\nEpoch [24/50] Loss: 0.4506\nAccuracy on the test set: 84.62%\n\n\n100%|██████████| 938/938 [00:03<00:00, 269.21it/s]\n\n\nEpoch [25/50] Loss: 0.4394\nAccuracy on the test set: 83.97%\n\n\n100%|██████████| 938/938 [00:04<00:00, 227.36it/s]\n\n\nEpoch [26/50] Loss: 0.4346\nAccuracy on the test set: 84.16%\n\n\n100%|██████████| 938/938 
[00:04<00:00, 222.91it/s]\n\n\nEpoch [27/50] Loss: 0.4271\nAccuracy on the test set: 84.38%\n\n\n100%|██████████| 938/938 [00:04<00:00, 223.68it/s]\n\n\nEpoch [28/50] Loss: 0.4193\nAccuracy on the test set: 84.84%\n\n\n100%|██████████| 938/938 [00:03<00:00, 261.50it/s]\n\n\nEpoch [29/50] Loss: 0.4148\nAccuracy on the test set: 85.05%\n\n\n100%|██████████| 938/938 [00:03<00:00, 246.52it/s]\n\n\nEpoch [30/50] Loss: 0.4040\nAccuracy on the test set: 84.49%\n\n\n100%|██████████| 938/938 [00:03<00:00, 281.60it/s]\n\n\nEpoch [31/50] Loss: 0.3990\nAccuracy on the test set: 84.59%\n\n\n100%|██████████| 938/938 [00:03<00:00, 278.41it/s]\n\n\nEpoch [32/50] Loss: 0.4016\nAccuracy on the test set: 84.92%\n\n\n100%|██████████| 938/938 [00:03<00:00, 275.60it/s]\n\n\nEpoch [33/50] Loss: 0.3979\nAccuracy on the test set: 85.01%\n\n\n100%|██████████| 938/938 [00:03<00:00, 250.04it/s]\n\n\nEpoch [34/50] Loss: 0.3844\nAccuracy on the test set: 84.82%\n\n\n100%|██████████| 938/938 [00:03<00:00, 280.53it/s]\n\n\nEpoch [35/50] Loss: 0.3789\nAccuracy on the test set: 85.49%\n\n\n100%|██████████| 938/938 [00:03<00:00, 279.26it/s]\n\n\nEpoch [36/50] Loss: 0.3760\nAccuracy on the test set: 85.26%\n\n\n100%|██████████| 938/938 [00:04<00:00, 207.71it/s]\n\n\nEpoch [37/50] Loss: 0.3733\nAccuracy on the test set: 85.36%\n\n\n100%|██████████| 938/938 [00:03<00:00, 265.92it/s]\n\n\nEpoch [38/50] Loss: 0.3655\nAccuracy on the test set: 84.98%\n\n\n100%|██████████| 938/938 [00:03<00:00, 279.79it/s]\n\n\nEpoch [39/50] Loss: 0.3627\nAccuracy on the test set: 85.19%\n\n\n100%|██████████| 938/938 [00:03<00:00, 276.73it/s]\n\n\nEpoch [40/50] Loss: 0.3517\nAccuracy on the test set: 84.78%\n\n\n100%|██████████| 938/938 [00:03<00:00, 278.32it/s]\n\n\nEpoch [41/50] Loss: 0.3526\nAccuracy on the test set: 85.43%\n\n\n100%|██████████| 938/938 [00:03<00:00, 243.70it/s]\n\n\nEpoch [42/50] Loss: 0.3523\nAccuracy on the test set: 85.55%\n\n\n100%|██████████| 938/938 [00:03<00:00, 240.48it/s]\n\n\nEpoch [43/50] 
Loss: 0.3457\nAccuracy on the test set: 85.02%\n\n\n100%|██████████| 938/938 [00:03<00:00, 274.70it/s]\n\n\nEpoch [44/50] Loss: 0.3447\nAccuracy on the test set: 85.20%\n\n\n100%|██████████| 938/938 [00:03<00:00, 276.08it/s]\n\n\nEpoch [45/50] Loss: 0.3411\nAccuracy on the test set: 85.47%\n\n\n100%|██████████| 938/938 [00:04<00:00, 215.18it/s]\n\n\nEpoch [46/50] Loss: 0.3312\nAccuracy on the test set: 85.55%\n\n\n100%|██████████| 938/938 [00:03<00:00, 244.20it/s]\n\n\nEpoch [47/50] Loss: 0.3290\nAccuracy on the test set: 85.52%\n\n\n100%|██████████| 938/938 [00:03<00:00, 267.56it/s]\n\n\nEpoch [48/50] Loss: 0.3277\nAccuracy on the test set: 85.35%\n\n\n100%|██████████| 938/938 [00:03<00:00, 267.91it/s]\n\n\nEpoch [49/50] Loss: 0.3241\nAccuracy on the test set: 85.80%\n\n\n100%|██████████| 938/938 [00:03<00:00, 266.04it/s]\n\n\nEpoch [50/50] Loss: 0.3217\nAccuracy on the test set: 84.93%\n\n\n\n# Evaluation\nresnet.eval()\ncorrect = 0\ntotal = 0\nwith torch.no_grad():\n predicted_list = []\n for images, labels in test_loader:\n outputs = resnet(images)\n _, predicted = torch.max(outputs.data, 1)\n total += labels.size(0)\n correct += (predicted == labels).sum().item()\n softmax_outputs = nn.Softmax(dim=1)(outputs)\n predicted_list.append(softmax_outputs.data.cpu().numpy())\n\nall_predicted = np.concatenate(predicted_list, axis=0)\nprint(f\"Accuracy on the test set: {(100 * correct / total):.2f}%\")\n\nAccuracy on the test set: 84.93%" }, { - "objectID": "posts/numpy-algebra- copy.html#pi", - "href": "posts/numpy-algebra- copy.html#pi", - "title": "How numpy handles day-to-day algebra?", - "section": "Pi", - "text": "Pi\n\nBackground\npi is an irrational number and computing its value to a certain precision is a challenging task. This video talks in detail how people used to compute pi in the past. At the time of writing this blog post, Google holds the record for computing pi to the highest precision to 100 trilian digits. They used y-cruncher program (it’s free. 
try it!) with Chudnovsky algorithm to compute pi. Here are the first 100 digits of pi:\n\\(3.1415926535897932384626433832795028841971693993751058209749445923078164062862089986280348253421170679\\)\n\n\nSource code\nThis is how pi is defined in numpy source code upto 36 digits.\n#define NPY_PI 3.141592653589793238462643383279502884 /* pi */\nLet’s verify it.\n\nprint(f\"{np.pi:.64f}\")\n\n3.1415926535897931159979634685441851615905761718750000000000000000\n\n\nHmm, that looks off. From 16th digit onwards, the values are different. Let’s try to figure out why.\n\npi = 3.141592653589793238462643383279502884\npi = np.array(pi, dtype=np.float64)\npi = f\"{pi:.64f}\"\nnp_pi = f\"{np.pi:.64f}\"\nassert np_pi == pi\n\nOkay, so it seems like converting 36 digits of pi to 64 bit precision went wrong from 16th digit onwards. What a waste of last 20 digits of pi due to floating point errors! Anyways, let’s move on." + "objectID": "posts/pruning_vs_uncertainty.html#check-calibration", + "href": "posts/pruning_vs_uncertainty.html#check-calibration", + "title": "Pruning vs Uncertainty", + "section": "Check calibration", + "text": "Check calibration\n\ntest_dataset.tensors[1].cpu().numpy().shape, all_predicted.shape\n\n((10000,), (10000, 10))\n\n\n\n# Compute calibration curve\n\nfig, axes = plt.subplots(2, 5, figsize=(20, 5))\naxes = axes.flatten()\n\nfor target_class in range(10):\n true_labels = test_dataset.tensors[1].cpu().numpy() == target_class\n predicted_probabilities = all_predicted[:, target_class]\n\n prob_true, prob_pred = calibration_curve(\n true_labels, predicted_probabilities, n_bins=10\n )\n\n # Plot calibration curve\n axes[target_class].plot(prob_pred, prob_true, marker=\"o\", label=\"Calibration Curve\")\n axes[target_class].plot(\n [0, 1], [0, 1], linestyle=\"--\", label=\"Perfectly Calibrated\"\n )\n axes[target_class].set_xlabel(\"Mean Predicted Probability\")\n axes[target_class].set_ylabel(\"Observed Accuracy\")\n # ece_score = 
compute_ece(predicted_probabilities, true_labels, num_bins=10)\n # print(f\"Expected Calibration Error (ECE): {ece_score:.4f}\")\n\n axes[target_class].set_title(f\"Class {target_class}\")\nplt.tight_layout()\n\n# Compute expected calibration error (ECE)\nece = compute_ece(all_predicted, test_dataset.tensors[1].cpu().numpy(), 10)\nece\n\n0.021885250088572478\n\n\n\n\n\n\n\n\n\n\nprint(\n classification_report(\n test_dataset.tensors[1].cpu().numpy(), all_predicted.argmax(axis=1)\n )\n)\n\n precision recall f1-score support\n\n 0 0.89 0.91 0.90 980\n 1 0.91 0.97 0.94 1135\n 2 0.75 0.72 0.73 1032\n 3 0.77 0.74 0.75 1010\n 4 0.82 0.88 0.85 982\n 5 0.68 0.70 0.69 892\n 6 0.85 0.84 0.85 958\n 7 0.80 0.79 0.79 1028\n 8 0.76 0.75 0.75 974\n 9 0.81 0.74 0.77 1009\n\n accuracy 0.81 10000\n macro avg 0.80 0.80 0.80 10000\nweighted avg 0.80 0.81 0.80 10000\n\n\n\n\ndef compute_ece(predicted_probs, true_labels, num_bins=10):\n # Ensure predicted_probs is a NumPy array\n predicted_probs = np.array(predicted_probs)\n true_labels = np.array(true_labels)\n\n # Calculate predicted class labels\n predicted_labels = np.argmax(predicted_probs, axis=1)\n\n # Calculate confidence scores (maximum predicted probability)\n confidence_scores = np.max(predicted_probs, axis=1)\n\n # Create bins for confidence scores\n bin_edges = np.linspace(0, 1, num_bins + 1)\n\n ece = 0.0\n total_samples = len(true_labels)\n\n for bin_idx in range(num_bins):\n # Find examples whose confidence scores fall into the current bin\n bin_mask = (confidence_scores >= bin_edges[bin_idx]) & (\n confidence_scores < bin_edges[bin_idx + 1]\n )\n\n if np.any(bin_mask):\n # Calculate the accuracy of predictions in this bin\n bin_accuracy = np.mean(predicted_labels[bin_mask] == true_labels[bin_mask])\n\n # Calculate the fraction of examples in this bin\n bin_fraction = np.sum(bin_mask) / total_samples\n\n # Calculate the calibration error in this bin\n bin_error = np.abs(bin_accuracy - 
np.mean(confidence_scores[bin_mask]))\n\n # Weighted contribution to ECE\n ece += bin_fraction * bin_error\n\n return ece" }, { - "objectID": "posts/numpy-algebra- copy.html#power", - "href": "posts/numpy-algebra- copy.html#power", - "title": "How numpy handles day-to-day algebra?", - "section": "Power", - "text": "Power\nLet’s find out what happens when you execute the following code in numpy.\n\nnumber = np.float64(1.1)\nnumber**1.2\n\n1.1211693641406024" + "objectID": "posts/pruning_vs_uncertainty.html#does-mc-dropout-help-with-calibration", + "href": "posts/pruning_vs_uncertainty.html#does-mc-dropout-help-with-calibration", + "title": "Pruning vs Uncertainty", + "section": "Does MC-dropout help with calibration?", + "text": "Does MC-dropout help with calibration?" }, { - "objectID": "posts/2021-10-12-sparsegps.html", - "href": "posts/2021-10-12-sparsegps.html", - "title": "SparseGPs in Stheno", - "section": "", - "text": "# !pip install -U regdata\n\n\nimport regdata as rd\nimport torch\nimport matplotlib.pyplot as plt\nfrom matplotlib.animation import FuncAnimation\nfrom matplotlib import rc\nimport wbml.out as out\nfrom wbml.plot import tweak\n\nfrom stheno import B, GP, EQ, PseudoObsVFE, PseudoObsFITC\nfrom varz.torch import Vars, minimise_l_bfgs_b, parametrised, Positive\nimport lab.torch" + "objectID": "posts/pruning_vs_uncertainty.html#last-layer-only", + "href": "posts/pruning_vs_uncertainty.html#last-layer-only", + "title": "Pruning vs Uncertainty", + "section": "Last layer only", + "text": "Last layer only\n\nclass MCDropout(nn.Module):\n def __init__(self, p):\n super().__init__()\n self.p = p\n self.dropout = nn.Dropout(p=self.p)\n\n def forward(self, x):\n self.train()\n return self.dropout(x)\n\n\nresnet_with_dropout = torchvision.models.resnet18(pretrained=True)\nresnet_with_dropout.fc = nn.Sequential(\n nn.Linear(\n resnet_with_dropout.fc.in_features, resnet_with_dropout.fc.in_features // 2\n ),\n nn.GELU(),\n MCDropout(p=0.33),\n 
nn.Linear(resnet_with_dropout.fc.in_features // 2, num_classes),\n)\n\nresnet_with_dropout.load_state_dict(resnet.state_dict())\n\nresnet_with_dropout.to(device)\n\nmc_samples = 1000\n\noutputs = []\nfor _ in tqdm(range(mc_samples)):\n output = resnet_with_dropout(test_dataset.tensors[0])\n softmax_output = nn.Softmax(dim=1)(output)\n outputs.append(softmax_output.data.cpu().numpy())\n\n100%|██████████| 1000/1000 [00:18<00:00, 55.50it/s]\n\n\n\nmc_mean = np.mean(outputs, axis=0)\nmc_std = np.std(outputs, axis=0)\nmc_mean.shape\n\n(10000, 10)\n\n\n\n# Compute calibration curve\n\nfig, axes = plt.subplots(2, 5, figsize=(20, 5))\naxes = axes.flatten()\n\nfor target_class in range(10):\n true_labels = test_dataset.tensors[1].cpu().numpy() == target_class\n predicted_probabilities = mc_mean[:, target_class]\n\n prob_true, prob_pred = calibration_curve(\n true_labels, predicted_probabilities, n_bins=10\n )\n\n # Plot calibration curve\n axes[target_class].plot(prob_pred, prob_true, marker=\"o\", label=\"Calibration Curve\")\n axes[target_class].plot(\n [0, 1], [0, 1], linestyle=\"--\", label=\"Perfectly Calibrated\"\n )\n axes[target_class].set_xlabel(\"Mean Predicted Probability\")\n axes[target_class].set_ylabel(\"Observed Accuracy\")\n axes[target_class].set_title(f\"Class {target_class}\")\nplt.tight_layout()\n\nece = compute_ece(mc_mean, test_dataset.tensors[1].cpu().numpy(), 10)\nece\n\n0.04250686831623317\n\n\n\n\n\n\n\n\n\n\nprint(\n classification_report(test_dataset.tensors[1].cpu().numpy(), mc_mean.argmax(axis=1))\n)\n\n precision recall f1-score support\n\n 0 0.88 0.92 0.90 980\n 1 0.91 0.98 0.94 1135\n 2 0.73 0.71 0.72 1032\n 3 0.78 0.72 0.75 1010\n 4 0.83 0.87 0.85 982\n 5 0.68 0.71 0.69 892\n 6 0.82 0.88 0.85 958\n 7 0.80 0.78 0.79 1028\n 8 0.77 0.74 0.75 974\n 9 0.82 0.72 0.77 1009\n\n accuracy 0.81 10000\n macro avg 0.80 0.80 0.80 10000\nweighted avg 0.80 0.81 0.80 10000" }, { - "objectID": "posts/2021-10-12-sparsegps.html#imports", - "href": 
"posts/2021-10-12-sparsegps.html#imports", - "title": "SparseGPs in Stheno", + "objectID": "posts/gcloud.html", + "href": "posts/gcloud.html", + "title": "Gcloud cheatsheet", "section": "", - "text": "# !pip install -U regdata\n\n\nimport regdata as rd\nimport torch\nimport matplotlib.pyplot as plt\nfrom matplotlib.animation import FuncAnimation\nfrom matplotlib import rc\nimport wbml.out as out\nfrom wbml.plot import tweak\n\nfrom stheno import B, GP, EQ, PseudoObsVFE, PseudoObsFITC\nfrom varz.torch import Vars, minimise_l_bfgs_b, parametrised, Positive\nimport lab.torch" - }, - { - "objectID": "posts/2021-10-12-sparsegps.html#data-preperation", - "href": "posts/2021-10-12-sparsegps.html#data-preperation", - "title": "SparseGPs in Stheno", - "section": "Data preperation", - "text": "Data preperation\n\n# Define points to predict at.\nx = B.linspace(0, 10, 100)\nx_obs = B.linspace(0, 7, 50_000)\nx_ind = B.linspace(0, 10, 20)\n\n# Construct a prior.\nf = GP(EQ().periodic(2 * B.pi))\n\n# Sample a true, underlying function and observations.\nf_true = B.sin(x)\ny_obs = B.sin(x_obs) + B.sqrt(0.5) * B.randn(*x_obs.shape)" - }, - { - "objectID": "posts/2021-10-12-sparsegps.html#plotting-function", - "href": "posts/2021-10-12-sparsegps.html#plotting-function", - "title": "SparseGPs in Stheno", - "section": "Plotting function", - "text": "Plotting function\n\ndef plot(method):\n if method == 'VFE':\n # Plot result.\n plt.plot(x, f_true, label=\"True\", style=\"test\")\n plt.scatter(\n x_obs,\n y_obs,\n label=\"Observations\",\n style=\"train\",\n c=\"tab:green\",\n alpha=0.35,\n )\n plt.scatter(\n x_ind,\n obs.mu(f.measure)[:, 0],\n label=\"Inducing Points\",\n style=\"train\",\n s=20,\n )\n plt.plot(x, mean, label=\"Prediction\", style=\"pred\")\n plt.fill_between(x, lower, upper, style=\"pred\")\n tweak()\n\n plt.show()\n else:\n # Plot result.\n plt.plot(x, f_true, label=\"True\", style=\"test\")\n plt.scatter(\n x_obs,\n y_obs,\n label=\"Observations\",\n 
style=\"train\",\n c=\"tab:green\",\n alpha=0.35,\n )\n plt.scatter(\n x_ind,\n B.dense(f_post(x_ind).mean),\n label=\"Inducing Points\",\n style=\"train\",\n s=20,\n )\n plt.plot(x, mean, label=\"Prediction\", style=\"pred\")\n plt.fill_between(x, lower, upper, style=\"pred\")\n tweak()\n\n plt.show()" + "text": "Following this guide.\n\nTo set default email, project-id & zone:\n\ngcloud config set account your-email-account\ngcloud config set project your-project-id\ngcloud config set compute/zone us-central1-f # us-central1-f for free v2 TPUs and europe-west4-a for free v3 TPUs (only if you have free TRC access)\n\nTo get currently active project and zone related info:\n\ngcloud info\n\nTo create an identity (I don’t know if this is required or not. This command should trigger installation of “gcloud Beta Commands” automatically in another shell and then you need to rerun the following command):\n\ngcloud beta services identity create --service tpu.googleapis.com" }, { - "objectID": "posts/2021-10-12-sparsegps.html#sparse-regression-with-variational-free-energy-vfe-method", - "href": "posts/2021-10-12-sparsegps.html#sparse-regression-with-variational-free-energy-vfe-method", - "title": "SparseGPs in Stheno", - "section": "Sparse regression with Variational Free Energy (VFE) method", - "text": "Sparse regression with Variational Free Energy (VFE) method\n\n# Compute a pseudo-point approximation of the posterior.\nobs = PseudoObsVFE(f(x_ind), (f(x_obs, 0.5), y_obs))\n\n# Compute the ELBO.\nout.kv(\"ELBO\", obs.elbo(f.measure))\n\n# Compute the approximate posterior.\nf_post = f | obs\n\n# Make predictions with the approximate posterior.\nmean, lower, upper = f_post(x, 0.5).marginal_credible_bounds()\nplot('VFE')\n\nELBO: -5.345e+04" + "objectID": "posts/gcloud.html#initial-setup", + "href": "posts/gcloud.html#initial-setup", + "title": "Gcloud cheatsheet", + "section": "", + "text": "Following this guide.\n\nTo set default email, project-id & zone:\n\ngcloud 
config set account your-email-account\ngcloud config set project your-project-id\ngcloud config set compute/zone us-central1-f # us-central1-f for free v2 TPUs and europe-west4-a for free v3 TPUs (only if you have free TRC access)\n\nTo get currently active project and zone related info:\n\ngcloud info\n\nTo create an identity (I don’t know if this is required or not. This command should trigger installation of “gcloud Beta Commands” automatically in another shell and then you need to rerun the following command):\n\ngcloud beta services identity create --service tpu.googleapis.com" }, { - "objectID": "posts/2021-10-12-sparsegps.html#sparse-regression-with-fully-independent-training-conditional-fitc-mehod", - "href": "posts/2021-10-12-sparsegps.html#sparse-regression-with-fully-independent-training-conditional-fitc-mehod", - "title": "SparseGPs in Stheno", - "section": "Sparse Regression with Fully Independent Training Conditional (FITC) mehod", - "text": "Sparse Regression with Fully Independent Training Conditional (FITC) mehod\n\n# Compute a pseudo-point approximation of the posterior.\nobs = PseudoObsFITC(f(x_ind), (f(x_obs, 0.5), y_obs))\n\n# Compute the ELBO.\nout.kv(\"ELBO\", obs.elbo(f.measure))\n\n# Compute the approximate posterior.\nf_post = f | obs\n\n# Make predictions with the approximate posterior.\nmean, lower, upper = f_post(x, 0.5).marginal_credible_bounds()\nplot('FITC')\n\nELBO: -5.345e+04" + "objectID": "posts/gcloud.html#working-with-tpu-vms", + "href": "posts/gcloud.html#working-with-tpu-vms", + "title": "Gcloud cheatsheet", + "section": "Working with TPU VMs", + "text": "Working with TPU VMs\nThere are two different terms here: “TPU VMs” and “TPU nodes”. TPU nodes can be connected externally via another VM. TPU VMs are stand-alone systems with TPUs, RAM and CPU (96 core Intel 2 GHz processor and 335 GB RAM). We may be charged via GCP for the VM (CPUs and RAM). 
(I will update this info once I know for sure):\n\n\nTo create a TPU VM in a preferred zone via the CLI (be careful about the --zone to avoid charges, check the first email received from the TRC team to see what kind of TPUs are free in different zones. If --zone is not passed, the VM will be created in the default zone that we set initially. This command triggered installation of “gcloud Alpha Commands”):\n\ngcloud alpha compute tpus tpu-vm create vm-1 --accelerator-type v2-8 --version tpu-vm-tf-2.8.0 --zone us-central1-f\n\nTo get the list of TPU nodes/VMs:\n\ngcloud compute tpus list\n\nTo delete a TPU node/VM:\n\ngcloud compute tpus delete vm-1\n\nTo connect to a VM via ssh (this automatically creates an ssh key pair and places it in the default ssh config location):\n\ngcloud alpha compute tpus tpu-vm ssh vm-1\n\nFollow this guide to create and attach a persistent disk to the TPU VM" }, { "objectID": "posts/2021-10-12-sparsegps.html#hyperparameter-tuning-noisy-sine-data", "href": "posts/2021-10-12-sparsegps.html#hyperparameter-tuning-noisy-sine-data", "title": "SparseGPs in Stheno", "section": "Hyperparameter tuning (Noisy Sine data)", "text": "Hyperparameter tuning (Noisy Sine data)\n\ndef model(vs):\n \"\"\"Constuct a model with learnable parameters.\"\"\"\n return vs['variance']*GP(EQ().stretch(vs['length_scale']))\n\n\ntorch.manual_seed(123)\n\ndataObj = rd.SineNoisy(scale_X=False, scale_y=False, return_test=True, backend='torch')\nx_obs, y_obs, x = dataObj.get_data()\n\n\nplt.scatter(x_obs, y_obs, s=2);\n\n\n\n\n\n\n\n\n\nVFE\n\nvs = Vars(torch.float64)\nvs.positive(name=\"noise\")\nvs.positive(name=\"length_scale\");\nvs.positive(name=\"variance\");\nvs.positive(init=torch.linspace(0.4,0.6,10), shape=(10,), name='x_ind')\nvs.requires_grad(True)\n\noptimizer = torch.optim.Adam(vs.get_latent_vars(), lr=0.1)\nfig, ax = plt.subplots(1,2,figsize=(15,5))\nlosses = []\n\ndef update(i):\n optimizer.zero_grad()\n gp = model(vs)\n obs = PseudoObsVFE(gp(vs['x_ind']), 
(gp(x_obs, vs['noise']), y_obs))\n loss = -obs.elbo(gp.measure)\n losses.append(loss.item())\n loss.backward()\n optimizer.step()\n \n gp_post = gp | obs\n mean, lower, upper = gp_post(x, vs['noise']).marginal_credible_bounds()\n ind_mean = B.dense(gp_post(vs['x_ind']).mean)\n \n ax[0].cla();ax[1].cla();\n ax[0].scatter(x_obs, y_obs, s=2)\n with torch.no_grad():\n ax[0].plot()\n ax[0].plot(x, B.dense(mean), label='Prediction')\n ax[0].fill_between(x.ravel(), lower, upper, alpha=0.2, label='Uncertainty')\n ax[0].plot(x, dataObj.f(x), label='True')\n ax[0].scatter(vs['x_ind'], ind_mean, label='Inducing points')\n ax[0].set_xlabel('X')\n ax[0].legend()\n \n ax[1].plot(losses, label='loss')\n ax[1].set_xlabel('Iterations')\n ax[1].legend()\n \nanim = FuncAnimation(fig, update, range(50))\nrc('animation', html='jshtml')\nplt.close()\nanim\n\n\n\n\n\n\n\n\n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n Once\n \n Loop\n \n Reflect\n \n \n\n\n\n\n\n\n\n\nFITC\n\nvs = Vars(torch.float64)\nvs.positive(name=\"noise\")\nvs.positive(name=\"length_scale\");\nvs.positive(name=\"variance\");\nvs.positive(init=torch.linspace(0.4,0.6,10), shape=(10,), name='x_ind')\nvs.requires_grad(True)\n\noptimizer = torch.optim.Adam(vs.get_latent_vars(), lr=0.1)\nfig, ax = plt.subplots(1,2,figsize=(15,5))\nlosses = []\n\ndef update(i):\n optimizer.zero_grad()\n gp = model(vs)\n obs = PseudoObsFITC(gp(vs['x_ind']), (gp(x_obs, vs['noise']), y_obs))\n loss = -obs.elbo(gp.measure)\n losses.append(loss.item())\n loss.backward()\n optimizer.step()\n \n gp_post = gp | obs\n mean, lower, upper = gp_post(x, vs['noise']).marginal_credible_bounds()\n ind_mean = B.dense(gp_post(vs['x_ind']).mean)\n \n ax[0].cla();ax[1].cla();\n ax[0].scatter(x_obs, y_obs, s=2)\n with torch.no_grad():\n ax[0].plot()\n ax[0].plot(x, B.dense(mean), label='Prediction')\n ax[0].fill_between(x.ravel(), lower, upper, alpha=0.2, label='Uncertainty')\n ax[0].plot(x, dataObj.f(x), label='True')\n 
ax[0].scatter(vs['x_ind'], ind_mean, label='Inducing points')\n ax[0].set_xlabel('X')\n ax[0].legend()\n \n ax[1].plot(losses, label='loss')\n ax[1].set_xlabel('Iterations')\n ax[1].legend()\n \nanim = FuncAnimation(fig, update, range(50))\nrc('animation', html='jshtml')\nplt.close()\nanim\n\n\n\n\n\n\n\n\n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n Once\n \n Loop\n \n Reflect" }, { "objectID": "posts/gcloud.html#working-with-tpu-vms-via-vs-code", "href": "posts/gcloud.html#working-with-tpu-vms-via-vs-code", "title": "Gcloud cheatsheet", "section": "Working with TPU VMs via VS-code", "text": "Working with TPU VMs via VS-code\n\nInstall the following extension on VS-code: \nUse the following button to connect to a remote machine (use “Connect to Host…” button): \nManually update the default ssh config file (in my case, “C:\\.ssh”) to add a VM in VS-code (you can use the VS-code command palette to figure out the config file for you and edit it. Please see the screenshot below).\n\n\n\nNote that an ssh public-private key pair named google_compute_engine is automatically generated when you connect to the VM for the first time with the gcloud alpha compute tpus tpu-vm ssh command. The VM config for me looks like this:\n\nHost Cloud-TPU-Node-2\n HostName <External-IP-of-your-TPU-VM>\n User zeelp\n Port 22\n IdentityFile C:\\Users\\zeelp\\.ssh\\google_compute_engine" }, { "objectID": "posts/2022-05-17-contributors_sorted_by_prs.html", "href": "posts/2022-05-17-contributors_sorted_by_prs.html", "title": "Get a list of contributors from a repo", "section": "", "text": "import pandas as pd" }, { "objectID": 
"posts/2022-05-17-contributors_sorted_by_prs.html#config", + "href": "posts/2022-05-17-contributors_sorted_by_prs.html#config", + "title": "Get a list of contributors from a repo", + "section": "Config", + "text": "Config\n\nowner = \"probml\"\nrepo = \"pyprobml\"" }, { - "objectID": "posts/GNN_for_regression.html#fit-with-a-simple-mlp", - "href": "posts/GNN_for_regression.html#fit-with-a-simple-mlp", - "title": "Graph Neural Networks for Regression", - "section": "Fit with a simple MLP", - "text": "Fit with a simple MLP\n\ndef fit(model, x, y, A=None, lr=0.01, epochs=100):\n optimizer = torch.optim.Adam(model.parameters(), lr=lr)\n loss_fn = nn.MSELoss()\n \n if A is None:\n inputs = (x,)\n else:\n inputs = (x, A)\n \n losses = []\n pbar = trange(epochs)\n for epoch in pbar:\n optimizer.zero_grad()\n y_hat = model(*inputs)\n loss = loss_fn(y_hat, y)\n losses.append(loss.item())\n pbar.set_description(f\"Epoch {epoch} Loss: {loss.item()}\")\n loss.backward()\n optimizer.step()\n \n return losses\n\nclass SimpleMLP(nn.Module):\n def __init__(self, features):\n super().__init__()\n layers = [nn.Linear(1, features[0]), nn.ReLU()]\n for in_features, out_features in zip(features, features[1:]):\n layers.append(nn.Linear(in_features, out_features))\n layers.append(nn.ReLU())\n \n layers.append(nn.Linear(features[-1], 1))\n \n self.layers = nn.Sequential(*layers)\n \n def forward(self, x):\n return self.layers(x)\n\n\ntorch.manual_seed(0)\nmodel = SimpleMLP([10, 10, 10]).to(device)\nfit(model, train_x, train_y, lr=0.01, epochs=1000);\n\npred_y = model(x)\n\n(x_, y_, train_x_, train_y_, test_x_, test_y_, pred_y_) = map(lambda x: x.cpu().detach().numpy(), (x, y, train_x, train_y, test_x, test_y, pred_y))\nplt.plot(x_, y_, label=\"True\");\nplt.plot(train_x_, train_y_, 'o', label='train')\nplt.plot(test_x_, test_y_, 'o', label='test')\nplt.plot(x_, pred_y_, label='pred')\nplt.legend();\n\nEpoch 999 Loss: 0.07143261283636093: 100%|██████████| 1000/1000 [00:02<00:00, 
410.79it/s]" + "objectID": "posts/2022-05-17-contributors_sorted_by_prs.html#get-all-contributors-to-a-repo", + "href": "posts/2022-05-17-contributors_sorted_by_prs.html#get-all-contributors-to-a-repo", + "title": "Get a list of contributors from a repo", + "section": "Get all contributors to a repo", + "text": "Get all contributors to a repo\n\ncontributors = pd.read_json(f\"https://api.github.com/repos/{owner}/{repo}/contributors?per_page=100\")\ncontributors = contributors.set_index(\"login\")\nprint(f\"Number of contributors: {len(contributors.index.unique())}\")\ncontributors.head(2)\n\nNumber of contributors: 47\n\n\n\n \n \n \n\n\n\n\n\n\nid\nnode_id\navatar_url\ngravatar_id\nurl\nhtml_url\nfollowers_url\nfollowing_url\ngists_url\nstarred_url\nsubscriptions_url\norganizations_url\nrepos_url\nevents_url\nreceived_events_url\ntype\nsite_admin\ncontributions\n\n\nlogin\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nmurphyk\n4632336\nMDQ6VXNlcjQ2MzIzMzY=\nhttps://avatars.githubusercontent.com/u/463233...\n\nhttps://api.github.com/users/murphyk\nhttps://github.com/murphyk\nhttps://api.github.com/users/murphyk/followers\nhttps://api.github.com/users/murphyk/following...\nhttps://api.github.com/users/murphyk/gists{/gi...\nhttps://api.github.com/users/murphyk/starred{/...\nhttps://api.github.com/users/murphyk/subscript...\nhttps://api.github.com/users/murphyk/orgs\nhttps://api.github.com/users/murphyk/repos\nhttps://api.github.com/users/murphyk/events{/p...\nhttps://api.github.com/users/murphyk/received_...\nUser\nFalse\n1777\n\n\nNeoanarika\n5188337\nMDQ6VXNlcjUxODgzMzc=\nhttps://avatars.githubusercontent.com/u/518833...\n\nhttps://api.github.com/users/Neoanarika\nhttps://github.com/Neoanarika\nhttps://api.github.com/users/Neoanarika/followers\nhttps://api.github.com/users/Neoanarika/follow...\nhttps://api.github.com/users/Neoanarika/gists{...\nhttps://api.github.com/users/Neoanarika/starre...\nhttps://api.github.com/users/Neoanarika/subscr...\nhttps://api.github.com/
users/Neoanarika/orgs\nhttps://api.github.com/users/Neoanarika/repos\nhttps://api.github.com/users/Neoanarika/events...\nhttps://api.github.com/users/Neoanarika/receiv...\nUser\nFalse\n184" }, { - "objectID": "posts/GNN_for_regression.html#create-a-gcn-layer", - "href": "posts/GNN_for_regression.html#create-a-gcn-layer", - "title": "Graph Neural Networks for Regression", - "section": "Create a GCN layer", - "text": "Create a GCN layer\n\nclass GCNLayer(nn.Module):\n def __init__(self, in_features, out_features):\n super().__init__()\n self.linear = nn.Linear(in_features, out_features)\n \n def forward(self, x, A): \n return self.linear(A @ x)\n \n \nclass GCN(nn.Module):\n def __init__(self, features):\n super().__init__()\n layers = [GCNLayer(1, features[0]), nn.ReLU()]\n for in_features, out_features in zip(features, features[1:]):\n layers.append(GCNLayer(in_features, out_features))\n layers.append(nn.ReLU())\n \n layers.append(nn.Linear(features[-1], 1))\n self.layers = nn.Sequential(*layers)\n \n def forward(self, x, A):\n for layer in self.layers:\n if isinstance(layer, GCNLayer):\n x = layer(x, A)\n else:\n x = layer(x)\n return x\n \ndef get_eucledean_A(x, exponent):\n d = ((x - x.T)**2)**0.5\n d = torch.where(d==0, torch.min(d[d!=0])/2, d) # self distance is 0, so replace it with half of the min distance\n A = 1/(d**exponent)\n return A/A.sum(dim=1, keepdim=True)\n\ndef get_KNN_A(x, k):\n d = torch.abs(x - x.T)\n A = torch.zeros_like(d)\n _, indices = torch.topk(d, k, dim=1, largest=False)\n for i, index in enumerate(indices):\n A[i, index] = 1\n return A/A.sum(dim=1, keepdim=True)\n\ndef fit_and_plot(title):\n model = GCN([10, 10, 10]).to(device)\n losses = fit(model, train_x, train_y, A=A_train, lr=0.001, epochs=3000);\n\n pred_y = model(x, A_all)\n\n fig, ax = plt.subplots(1, 2, figsize=(12, 4))\n axes = ax[0]\n axes.plot(losses)\n axes.set_title(\"Losses\")\n\n (x_, y_, train_x_, train_y_, test_x_, test_y_, pred_y_) = map(lambda x: 
x.cpu().detach().numpy(), (x, y, train_x, train_y, test_x, test_y, pred_y))\n axes = ax[1]\n axes.plot(x_, y_, label=\"True\");\n axes.plot(train_x_, train_y_, 'o', label='train')\n axes.plot(test_x_, test_y_, 'o', label='test')\n axes.plot(x_, pred_y_, label='pred')\n axes.set_title(title)\n axes.legend();" + "objectID": "posts/2022-05-17-contributors_sorted_by_prs.html#fetch-all-prs-from-a-repo", + "href": "posts/2022-05-17-contributors_sorted_by_prs.html#fetch-all-prs-from-a-repo", + "title": "Get a list of contributors from a repo", + "section": "Fetch all PRs from a repo", + "text": "Fetch all PRs from a repo\n\npage_range = range(1, 6)\nget_pr_df = lambda page: pd.read_json(f\"https://api.github.com/repos/probml/pyprobml/pulls?state=all&per_page=100&page={page}\")\npull_requests = pd.concat(map(get_pr_df, page_range))\nprint(f\"Number of PRs: {len(pull_requests)}\")\npull_requests.head(2)\n\nNumber of PRs: 497\n\n\n\n \n \n \n\n\n\n\n\n\nurl\nid\nnode_id\nhtml_url\ndiff_url\npatch_url\nissue_url\nnumber\nstate\nlocked\n...\nreview_comments_url\nreview_comment_url\ncomments_url\nstatuses_url\nhead\nbase\n_links\nauthor_association\nauto_merge\nactive_lock_reason\n\n\n\n\n0\nhttps://api.github.com/repos/probml/pyprobml/p...\n938329819\nPR_kwDOA-3vB8437cbb\nhttps://github.com/probml/pyprobml/pull/841\nhttps://github.com/probml/pyprobml/pull/841.diff\nhttps://github.com/probml/pyprobml/pull/841.patch\nhttps://api.github.com/repos/probml/pyprobml/i...\n841\nclosed\nFalse\n...\nhttps://api.github.com/repos/probml/pyprobml/p...\nhttps://api.github.com/repos/probml/pyprobml/p...\nhttps://api.github.com/repos/probml/pyprobml/i...\nhttps://api.github.com/repos/probml/pyprobml/s...\n{'label': 'karm-patel:posrprocessing', 'ref': ...\n{'label': 'probml:master', 'ref': 'master', 's...\n{'self': {'href': 
'https://api.github.com/repo...\nCONTRIBUTOR\nNaN\nNaN\n\n\n1\nhttps://api.github.com/repos/probml/pyprobml/p...\n938317389\nPR_kwDOA-3vB8437ZZN\nhttps://github.com/probml/pyprobml/pull/840\nhttps://github.com/probml/pyprobml/pull/840.diff\nhttps://github.com/probml/pyprobml/pull/840.patch\nhttps://api.github.com/repos/probml/pyprobml/i...\n840\nclosed\nFalse\n...\nhttps://api.github.com/repos/probml/pyprobml/p...\nhttps://api.github.com/repos/probml/pyprobml/p...\nhttps://api.github.com/repos/probml/pyprobml/i...\nhttps://api.github.com/repos/probml/pyprobml/s...\n{'label': 'karm-patel:master', 'ref': 'master'...\n{'label': 'probml:master', 'ref': 'master', 's...\n{'self': {'href': 'https://api.github.com/repo...\nCONTRIBUTOR\nNaN\nNaN\n\n\n\n\n2 rows × 36 columns" }, { - "objectID": "posts/GNN_for_regression.html#idw-setting", - "href": "posts/GNN_for_regression.html#idw-setting", - "title": "Graph Neural Networks for Regression", - "section": "IDW setting", - "text": "IDW setting\n\nexponent = 1\nA_train = get_eucledean_A(train_x, exponent).to(device)\nA_all = get_eucledean_A(x, exponent).to(device)\ntitle = f\"Distance based adjacency matrix with exponent {exponent}\"\n\nfit_and_plot(title)\n\nEpoch 2999 Loss: 0.05447980388998985: 100%|██████████| 3000/3000 [00:07<00:00, 390.93it/s] \n\n\n\n\n\n\n\n\n\n\nexponent = 2\nA_train = get_eucledean_A(train_x, exponent).to(device)\nA_all = get_eucledean_A(x, exponent).to(device)\ntitle = f\"Distance based adjacency matrix with exponent {exponent}\"\n\nfit_and_plot(title)\n\nEpoch 2999 Loss: 0.06475391983985901: 100%|██████████| 3000/3000 [00:07<00:00, 413.49it/s]\n\n\n\n\n\n\n\n\n\n\nexponent = 3\nA_train = get_eucledean_A(train_x, exponent).to(device)\nA_all = get_eucledean_A(x, exponent).to(device)\ntitle = f\"Distance based adjacency matrix with exponent {exponent}\"\n\nfit_and_plot(title)\n\nEpoch 2999 Loss: 0.043554823845624924: 100%|██████████| 3000/3000 [00:08<00:00, 367.28it/s]" + "objectID": 
"posts/2022-05-17-contributors_sorted_by_prs.html#get-a-list-of-contributors-sorted-by-count-of-prs", + "href": "posts/2022-05-17-contributors_sorted_by_prs.html#get-a-list-of-contributors-sorted-by-count-of-prs", + "title": "Get a list of contributors from a repo", + "section": "Get a list of contributors sorted by count of PRs", + "text": "Get a list of contributors sorted by count of PRs\n\npull_requests['login'] = pull_requests['user'].apply(lambda x: x[\"login\"])\nsorted_by_pr_count = pull_requests.groupby(\"login\").agg({'url': len}).sort_values(by='url', ascending=False)\nsorted_by_pr_count.rename(columns={'url': 'Number of PRs'}, inplace=True)\nsorted_by_pr_count.head(5)\n\n\n \n \n \n\n\n\n\n\n\nNumber of PRs\n\n\nlogin\n\n\n\n\n\nDrishttii\n79\n\n\ngerdm\n55\n\n\nkaralleyna\n43\n\n\nalways-newbie161\n29\n\n\nkarm-patel\n29" }, { - "objectID": "posts/GNN_for_regression.html#knn-setting", - "href": "posts/GNN_for_regression.html#knn-setting", - "title": "Graph Neural Networks for Regression", - "section": "KNN Setting", - "text": "KNN Setting\n\nK = 1\nA_train = get_KNN_A(train_x, K).to(device)\nA_all = get_KNN_A(x, K).to(device)\ntitle = f\"KNN based adjacency matrix with K={K}\"\n\nfit_and_plot(title)\n\nEpoch 2999 Loss: 0.04107221961021423: 100%|██████████| 3000/3000 [00:07<00:00, 383.88it/s] \n\n\n\n\n\n\n\n\n\n\nK = 3\nA_train = get_KNN_A(train_x, K).to(device)\nA_all = get_KNN_A(x, K).to(device)\ntitle = f\"KNN based adjacency matrix with K={K}\"\n\nfit_and_plot(title)\n\nEpoch 2999 Loss: 0.14372628927230835: 100%|██████████| 3000/3000 [00:07<00:00, 404.74it/s]\n\n\n\n\n\n\n\n\n\n\nK = 7\nA_train = get_KNN_A(train_x, K).to(device)\nA_all = get_KNN_A(x, K).to(device)\ntitle = f\"KNN based adjacency matrix with K={K}\"\n\nfit_and_plot(title)\n\nEpoch 2999 Loss: 0.13950258493423462: 100%|██████████| 3000/3000 [00:07<00:00, 381.66it/s]\n\n\n\n\n\n\n\n\n\n\nK = 15\nA_train = get_KNN_A(train_x, K).to(device)\nA_all = get_KNN_A(x, K).to(device)\ntitle = 
f\"KNN based adjacency matrix with K={K}\"\n\nfit_and_plot(title)\n\nEpoch 2999 Loss: 0.33879855275154114: 100%|██████████| 3000/3000 [00:07<00:00, 376.56it/s]" + "objectID": "posts/2022-05-17-contributors_sorted_by_prs.html#create-a-dashboard", + "href": "posts/2022-05-17-contributors_sorted_by_prs.html#create-a-dashboard", + "title": "Get a list of contributors from a repo", + "section": "Create a dashboard", + "text": "Create a dashboard\n\ndef get_href_user(user):\n username, profile_link = user.split(\"|\")\n return f\"[{username}]({profile_link})\"\n\ndashboard = pd.DataFrame(index=sorted_by_pr_count.index)\ndashboard[\"Avatar\"] = contributors.avatar_url.apply(lambda url: f'<img width=\"25\" alt=\"image\" src=\"{url}\">')\ndashboard[\"Contributor\"] = (contributors.index +\"|\"+ contributors['html_url']).apply(get_href_user)\ndashboard[\"Number of PRs\"] = sorted_by_pr_count[\"Number of PRs\"]\nprint(dashboard.dropna().T.to_markdown())\n\n| | Drishttii | gerdm | karalleyna | always-newbie161 | karm-patel | Duane321 | Nirzu97 | patel-zeel | animesh-007 | ashishpapanai | shivaditya-meduri | Neoanarika | andrewnc | nappaillav | Abdelrahman350 | mjsML | jdf22 | kzymgch | nalzok | nitish1295 | Garvit9000c | AnkitaKumariJain14 | rohit-khoiwal-30 | shobro | raymondyeh07 | khanshehjad | alenm10 | firatoncel | AnandShegde | Aadesh-1404 | nealmcb | nipunbatra | petercerno | posgnu | mvervuurt | hieuza | Prahitha | TripleTop | UmarJ | Vishal987595 | a-fakhri | adamnemecek | galv | jlh2018 | krasserm | yuanx749 
|\n|:--------------|:----------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----
------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------|:-------------------------------------
-------------------------------------------------|:----------------------------------------------------------------------------------------|\n| Avatar | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/35187749?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/4108759?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/36455180?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/66471669?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/59387624?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/19956442?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/28842790?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/59758528?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/53366877?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/52123364?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/77324692?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/5188337?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/7716402?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/43855961?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/47902062?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/7131192?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/1637094?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/10054419?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/13443062?v=4\"> | <img width=\"25\" 
alt=\"image\" src=\"https://avatars.githubusercontent.com/u/21181046?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/68856476?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/62535006?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/87682045?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/54628243?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/5696982?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/31896767?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/42214173?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/9141211?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/79975787?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/68186100?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/119472?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/60985?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/1649209?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/30136201?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/6399881?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/1021144?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/44160152?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/48208522?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/34779641?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/97757583?v=4\"> | <img 
width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/65111198?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/182415?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/4767568?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/40842099?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/202907?v=4\"> | <img width=\"25\" alt=\"image\" src=\"https://avatars.githubusercontent.com/u/47032563?v=4\"> |\n| Contributor | [Drishttii](https://github.com/Drishttii) | [gerdm](https://github.com/gerdm) | [karalleyna](https://github.com/karalleyna) | [always-newbie161](https://github.com/always-newbie161) | [karm-patel](https://github.com/karm-patel) | [Duane321](https://github.com/Duane321) | [Nirzu97](https://github.com/Nirzu97) | [patel-zeel](https://github.com/patel-zeel) | [animesh-007](https://github.com/animesh-007) | [ashishpapanai](https://github.com/ashishpapanai) | [shivaditya-meduri](https://github.com/shivaditya-meduri) | [Neoanarika](https://github.com/Neoanarika) | [andrewnc](https://github.com/andrewnc) | [nappaillav](https://github.com/nappaillav) | [Abdelrahman350](https://github.com/Abdelrahman350) | [mjsML](https://github.com/mjsML) | [jdf22](https://github.com/jdf22) | [kzymgch](https://github.com/kzymgch) | [nalzok](https://github.com/nalzok) | [nitish1295](https://github.com/nitish1295) | [Garvit9000c](https://github.com/Garvit9000c) | [AnkitaKumariJain14](https://github.com/AnkitaKumariJain14) | [rohit-khoiwal-30](https://github.com/rohit-khoiwal-30) | [shobro](https://github.com/shobro) | [raymondyeh07](https://github.com/raymondyeh07) | [khanshehjad](https://github.com/khanshehjad) | [alenm10](https://github.com/alenm10) | [firatoncel](https://github.com/firatoncel) | [AnandShegde](https://github.com/AnandShegde) | [Aadesh-1404](https://github.com/Aadesh-1404) | 
[nealmcb](https://github.com/nealmcb) | [nipunbatra](https://github.com/nipunbatra) | [petercerno](https://github.com/petercerno) | [posgnu](https://github.com/posgnu) | [mvervuurt](https://github.com/mvervuurt) | [hieuza](https://github.com/hieuza) | [Prahitha](https://github.com/Prahitha) | [TripleTop](https://github.com/TripleTop) | [UmarJ](https://github.com/UmarJ) | [Vishal987595](https://github.com/Vishal987595) | [a-fakhri](https://github.com/a-fakhri) | [adamnemecek](https://github.com/adamnemecek) | [galv](https://github.com/galv) | [jlh2018](https://github.com/jlh2018) | [krasserm](https://github.com/krasserm) | [yuanx749](https://github.com/yuanx749) |\n| Number of PRs | 79 | 55 | 43 | 29 | 29 | 29 | 25 | 23 | 18 | 17 | 16 | 10 | 10 | 10 | 8 | 7 | 7 | 6 | 6 | 5 | 4 | 4 | 3 | 3 | 2 | 2 | 2 | 2 | 2 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 |" }, { - "objectID": "posts/2022-10-31-stochastic-variational-gp.html", - "href": "posts/2022-10-31-stochastic-variational-gp.html", - "title": "Stochastic Variational Gaussian processes in JAX", + "objectID": "posts/2022-03-05-uncertainty-in-deep-learning.html", + "href": "posts/2022-03-05-uncertainty-in-deep-learning.html", + "title": "Uncertainty in Deep Learning", "section": "", - "text": "I recently read a compact and clean explanation of SVGP in the following blog post by Dr. Martin Ingram:\nNow, I am attempting to implement a practical code from scratch for the same (What is practical about it? Sometimes math does not simply translate to code without careful modifications). I am assuming that you have read the blog post cited above before moving further. Let’s go for coding!" 
+ "text": "import torch" }, { - "objectID": "posts/2022-10-31-stochastic-variational-gp.html#imports", - "href": "posts/2022-10-31-stochastic-variational-gp.html#imports", - "title": "Stochastic Variational Gaussian processes in JAX", - "section": "Imports", - "text": "Imports\n\n# JAX\nimport jax\nfrom jax.flatten_util import ravel_pytree\nimport jax.numpy as jnp\nimport jax.scipy as jsp\n\n# Partially initialize functions\nfrom functools import partial\n\n# TFP\nimport tensorflow_probability.substrates.jax as tfp\ntfd = tfp.distributions\ntfb = tfp.bijectors\n\n# GP Kernels\nfrom tinygp import kernels\n\n# sklearn\nfrom sklearn.datasets import make_moons, make_blobs, make_circles\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.preprocessing import StandardScaler\nfrom sklearn.metrics import accuracy_score\n\n# Optimization\nimport optax\n\n# Plotting\nimport matplotlib.pyplot as plt\nplt.rcParams['scatter.edgecolors'] = \"k\"\n\n# Progress bar\nfrom tqdm import tqdm\n\n# Jitter\nJITTER = 1e-6\n\n# Enable JAX 64bit\njax.config.update(\"jax_enable_x64\", True)" + "objectID": "posts/2022-03-05-uncertainty-in-deep-learning.html#introduction", + "href": "posts/2022-03-05-uncertainty-in-deep-learning.html#introduction", + "title": "Uncertainty in Deep Learning", + "section": "1 - Introduction", + "text": "1 - Introduction\n\nAn online deep learning book from Ian Goodfellow, Yoshua Bengio, and Aaron Courville.\n\n\n1.1 - Deep Learning\nWe define a single layer network as the following:\n\nclass SingleLayerNetwork(torch.nn.Module):\n def __init__(self, Q, D, K):\n \"\"\"\n Q: number of features\n D: number of outputs\n K: number of hidden features\n \"\"\"\n super().__init__()\n self.input = torch.nn.Linear(Q, K) # Transforms Q features into K hidden features\n self.output = torch.nn.Linear(K, D) # Transforms K hidden features to D output features\n self.non_lin_transform = torch.nn.ReLU() # A non-linear transformation\n \n def forward(self, X):\n 
\"\"\"\n X: input (N x Q)\n \"\"\"\n self.linear_transformed_X = self.input(X) # (N, Q) -> (N, K)\n self.non_lin_transformed_X = self.non_lin_transform(self.linear_transformed_X) # (N, K) -> (N, K)\n output = self.output(self.non_lin_transformed_X) # (N, K) -> (N, D)\n return output\n\n\nQ = 10 # Number of features\nN = 100 # Number of samples\nD = 15 # Number of outputs\nK = 32 # Number of hidden features\n\nX = torch.rand(N, Q) # Input\nY = torch.rand(N, D) # Output\n\n\nmodel = SingleLayerNetwork(Q=Q, D=D, K=K)\nmodel\n\nSingleLayerNetwork(\n (input): Linear(in_features=10, out_features=32, bias=True)\n (output): Linear(in_features=32, out_features=15, bias=True)\n (non_lin_transform): ReLU()\n)\n\n\n\nfor name, value in model.named_parameters():\n print(name, value.shape)\n\ninput.weight torch.Size([32, 10])\ninput.bias torch.Size([32])\noutput.weight torch.Size([15, 32])\noutput.bias torch.Size([15])\n\n\nReLU does not contain any parameters here, so it is merely a function.\n\n\n1.2 Model Uncertainty\nIn which cases do we want our model to be uncertain?\n\nWhen it encounters out-of-distribution data\nWhen training data is noisy (irreducible/aleatoric uncertainty)\nWhen we have multiple predictors (model/epistemic uncertainty)" }, { - "objectID": "posts/2022-10-31-stochastic-variational-gp.html#dataset", - "href": "posts/2022-10-31-stochastic-variational-gp.html#dataset", - "title": "Stochastic Variational Gaussian processes in JAX", - "section": "Dataset", - "text": "Dataset\nFor this blog post, we will stick to the classification problem and pick a reasonable classification dataset.\n\nn_samples = 100\nnoise = 0.1\nrandom_state = 0\nshuffle = True\n\nX, y = make_moons(\n n_samples=n_samples, random_state=random_state, noise=noise, shuffle=shuffle\n)\nX = StandardScaler().fit_transform(X) # Yes, this is useful for GPs\n\nX, y = map(jnp.array, (X, y))\n\nplt.scatter(X[:, 0], X[:, 1], c=y);\n\nWARNING:absl:No GPU/TPU found, falling back to CPU. 
(Set TF_CPP_MIN_LOG_LEVEL=0 and rerun for more info.)" + "objectID": "posts/2022-06-10-jaxoptimizers.html", + "href": "posts/2022-06-10-jaxoptimizers.html", + "title": "JAX Optimizers", + "section": "", + "text": "%%capture\n%pip install -U jax\nimport jax\nimport jax.numpy as jnp\ntry:\n import jaxopt\nexcept ModuleNotFoundError:\n %pip install -qq jaxopt\n import jaxopt\ntry:\n import optax\nexcept ModuleNotFoundError:\n %pip install -qq optax\n import optax\n\nimport tensorflow_probability.substrates.jax as tfp" }, { - "objectID": "posts/2022-10-31-stochastic-variational-gp.html#methodology", - "href": "posts/2022-10-31-stochastic-variational-gp.html#methodology", - "title": "Stochastic Variational Gaussian processes in JAX", - "section": "Methodology", - "text": "Methodology\nTo define a GP, we need a kernel function. Let us use the RBF or Exponentiated Quadratic or Squared Exponential kernel.\n\nlengthscale = 1.0\nvariance = 1.0\n\nkernel_fn = variance * kernels.ExpSquared(scale=lengthscale)\n\nkernel_fn(X, X).shape\n\n(100, 100)\n\n\nAs explained in the blog post, we want to minimize the following loss function:\n\\[\nKL[q(u|\\eta) || p(u|y, \\theta)] = KL[q(u|\\eta) || p(u | \\theta)] - \\mathbb{E}_{u \\sim q(u|\\eta)} \\log p(y | u, \\theta) + const\n\\]\nLet us break down the loss and discuss each componant.\n\nKL divergence\nIn the first term, we want to compute the KL divergence between prior and variational distribution of GP at inducing points. 
First, we need to define the inducing points.\n\nkey = jax.random.PRNGKey(0)\nn_inducing = 10\nn_dim = X.shape[1]\n\nX_inducing = jax.random.normal(key, shape=(n_inducing, n_dim))\nX_inducing.shape\n\n(10, 2)\n\n\nNow, defining the prior and variational distributions.\n\ngp_mean = 0.43 # a scalar parameter to train\n\nprior_mean = gp_mean * jnp.zeros(n_inducing)\nprior_cov = kernel_fn(X_inducing, X_inducing)\n\nprior_distribution = tfd.MultivariateNormalFullCovariance(prior_mean, prior_cov)\n\n\nvariational_mean = jax.random.uniform(key, shape=(n_inducing,)) # a vector parameter to train\n\nA covariance matrix can not be learned directly due to positive definite constraint. We can decompose a covariance matrix in a following way:\n\\[\n\\begin{aligned}\nK &= diag(\\boldsymbol{\\sigma})\\Sigma diag(\\boldsymbol{\\sigma})\\\\\n &= diag(\\boldsymbol{\\sigma})LL^T diag(\\boldsymbol{\\sigma})\n\\end{aligned}\n\\]\nWhere, \\(\\Sigma\\) is a correlation matrix, \\(L\\) is a lower triangular cholesky decomposition of \\(\\Sigma\\) and \\(\\boldsymbol{\\sigma}\\) is the variance vector. We can use tfb.CorrelationCholesky to generate \\(L\\) from an unconstrained vector:\n\nrandom_vector = jax.random.normal(key, shape=(3,))\ncorr_chol = tfb.CorrelationCholesky()(random_vector)\ncorrelation = corr_chol@corr_chol.T\ncorrelation\n\nDeviceArray([[ 1. , 0.54464529, -0.7835968 ],\n [ 0.54464529, 1. , -0.33059078],\n [-0.7835968 , -0.33059078, 1. ]], dtype=float64)\n\n\nTo constrain \\(\\boldsymbol{\\sigma}\\), any positivity constraint would suffice. 
So, combining these tricks, we can model the covariance as following:\n\nrandom_vector = jax.random.normal(\n key, shape=(n_inducing * (n_inducing - 1) // 2,)\n) # a trainable parameter\nlog_sigma = jax.random.normal(key, shape=(n_inducing, 1)) # a trainable parameter\n\n\nsigma = jnp.exp(log_sigma)\ncorr_chol = tfb.CorrelationCholesky()(random_vector)\nvariational_cov = sigma * sigma.T * (corr_chol @ corr_chol.T)\nprint(variational_cov.shape)\n\nvariational_distribution = tfd.MultivariateNormalFullCovariance(variational_mean, variational_cov\n)\n\n(10, 10)\n\n\nNow, we can compute the KL divergence:\n\nvariational_distribution.kl_divergence(prior_distribution)\n\nDeviceArray(416.89357355, dtype=float64)\n\n\n\n\nExpectation over the likelihood\nWe want to compute the following expectation:\n\\[\n-\\sum_{i=1}^N \\mathbb{E}_{f_i \\sim q(f_i | \\eta, \\theta)} \\log p(y_i| f_i, \\theta)\n\\]\nNote that, \\(p(y_i| f_i, \\theta)\\) can be any likelihood depending upon the problem, but for classification, we may use a Bernoulli likelihood.\n\nf = jax.random.normal(key, shape=y.shape)\nlikelihood_distribution = tfd.Bernoulli(logits=f)\n\nlog_likelihood = likelihood_distribution.log_prob(y).sum()\nlog_likelihood\n\nDeviceArray(-72.04665624, dtype=float64)\n\n\nWe need to sample \\(f_i\\) from \\(q(f_i | \\eta, \\theta)\\) which has the following form:\n\\[\n\\begin{aligned}\nq(u) &\\sim \\mathcal{N}(\\boldsymbol{m}, S)\\\\\nq(f_i | \\eta, \\theta) &\\sim \\mathcal{N}(\\mu_i, \\sigma_i^2)\\\\\n\\mu_i &= A\\boldsymbol{m}\\\\\n\\sigma_i^2 &= K_{ii} + A(S - K_{mm})A^T\\\\\nA &= K_{im}K_{mm}^{-1}\n\\end{aligned}\n\\]\nNote that matrix inversion is often unstable with jnp.linalg.inv and thus we will use cholesky tricks to compute \\(A\\).\n\ndef q_f(x_i):\n x_i = x_i.reshape(1, -1) # ensure correct shape\n K_im = kernel_fn(x_i, X_inducing)\n K_mm = kernel_fn(X_inducing, X_inducing)\n chol_mm = jnp.linalg.cholesky(K_mm + jnp.eye(K_mm.shape[0])*JITTER)\n A = 
jsp.linalg.cho_solve((chol_mm, True), K_im.T).T\n \n mu_i = A@variational_mean\n sigma_sqr_i = kernel_fn(x_i, x_i) + A@(variational_cov - prior_cov)@A.T\n \n return tfd.Normal(loc=mu_i, scale=sigma_sqr_i**0.5)\n\nHere is a function to compute log likelihood for a single data-point:\n\ndef log_likelihood(x_i, y_i, seed):\n sample = q_f(x_i).sample(seed=seed)\n log_likelihood = tfd.Bernoulli(logits=sample).log_prob(y_i)\n return log_likelihood.squeeze()\n\n\nlog_likelihood(X[0], y[0], seed=key)\n\nDeviceArray(-0.17831203, dtype=float64)\n\n\nWe can use jax.vmap to compute log_likelihood over a batch. With that, we can leverage the stochastic variational inference following section 10.3.1 (Eq. 10.108) from pml book2. Basically, in each iteration, we need to multiply the batch log likelihood with \\(\\frac{N}{B}\\) to get an unbiased minibatch approximation where \\(N\\) is size of the full dataset and \\(B\\) is the batch size.\n\nbatch_size = 10\n\nseeds = jax.random.split(key, num=batch_size)\n\nll = len(y)/batch_size * jax.vmap(log_likelihood)(X[:batch_size], y[:batch_size], seeds).sum()\nll\n\nDeviceArray(-215.46520331, dtype=float64)\n\n\nNote that, once the parameters are optimized, we can use the derivations of \\(q(f_i | \\eta, \\theta)\\) to compute the posterior distribution. We have figured out all the pieces by now so it is the time to put it togather in a single class. Some pointers to note are the following:\n\nWe define a single function get_constrained_params to transform all unconstrained parameters.\njax.lax.scan gives a huge boost to a training loop.\nThere is some repeatation of code due to lack of super code optimization. You can do it at your end if needed." 
+ "objectID": "posts/2022-06-10-jaxoptimizers.html#loss-function", + "href": "posts/2022-06-10-jaxoptimizers.html#loss-function", + "title": "JAX Optimizers", + "section": "Loss function", + "text": "Loss function\n\ndef loss_fun(x, a):\n return (((x['param1'] - a) + (x['param2'] - (a+1)))**2).sum()" }, { - "objectID": "posts/2022-10-31-stochastic-variational-gp.html#all-in-one", - "href": "posts/2022-10-31-stochastic-variational-gp.html#all-in-one", - "title": "Stochastic Variational Gaussian processes in JAX", - "section": "All in one", - "text": "All in one\n\nclass SVGP:\n def __init__(self, X_inducing, data_size):\n self.X_inducing = X_inducing\n self.n_inducing = len(X_inducing)\n self.data_size = data_size\n \n def init_params(self, seed):\n variational_corr_chol_param = tfb.CorrelationCholesky().inverse(jnp.eye(self.n_inducing))\n \n dummy_params = {\"log_variance\": jnp.zeros(()),\n \"log_scale\": jnp.zeros(()), \n \"mean\": jnp.zeros(()),\n \"X_inducing\": self.X_inducing,\n \"variational_mean\": jnp.zeros(self.n_inducing),\n \"variational_corr_chol_param\": variational_corr_chol_param,\n \"log_variational_sigma\": jnp.zeros((self.n_inducing, 1)),\n }\n \n flat_params, unravel_fn = ravel_pytree(dummy_params)\n random_params = jax.random.normal(key, shape=(len(flat_params), ))\n params = unravel_fn(random_params)\n return params\n \n @staticmethod\n def get_constrained_params(params):\n return {\"mean\": params[\"mean\"],\n \"variance\": jnp.exp(params['log_variance']), \n \"scale\": jnp.exp(params['log_scale']), \n \"X_inducing\": params[\"X_inducing\"],\n \"variational_mean\": params[\"variational_mean\"],\n \"variational_corr_chol_param\": params[\"variational_corr_chol_param\"],\n \"variational_sigma\": jnp.exp(params[\"log_variational_sigma\"])}\n \n @staticmethod\n def get_q_f(params, x_i, prior_distribution, variational_distribution):\n x_i = x_i.reshape(1, -1) # ensure correct shape\n \n kernel_fn = params['variance'] * 
kernels.ExpSquared(scale=params[\"scale\"])\n K_im = kernel_fn(x_i, params[\"X_inducing\"])\n K_mm = prior_distribution.covariance()\n chol_mm = jnp.linalg.cholesky(K_mm)\n A = jsp.linalg.cho_solve((chol_mm, True), K_im.T).T\n\n mu_i = A@params[\"variational_mean\"]\n sigma_sqr_i = kernel_fn(x_i, x_i) + A@(variational_distribution.covariance() - K_mm)@A.T\n\n return tfd.Normal(loc=mu_i, scale=sigma_sqr_i**0.5)\n \n def get_distributions(self, params):\n kernel_fn = params['variance'] * kernels.ExpSquared(scale=params[\"scale\"])\n prior_mean = params[\"mean\"]\n prior_cov = kernel_fn(params[\"X_inducing\"], params[\"X_inducing\"]) + jnp.eye(self.n_inducing)*JITTER\n prior_distribution = tfd.MultivariateNormalFullCovariance(prior_mean, prior_cov)\n\n corr_chol = tfb.CorrelationCholesky()(params[\"variational_corr_chol_param\"])\n sigma = jnp.diag(params[\"variational_sigma\"])\n variational_cov = sigma*sigma.T*(corr_chol@corr_chol.T) + jnp.eye(self.n_inducing)*JITTER\n variational_distribution = tfd.MultivariateNormalFullCovariance(params[\"variational_mean\"], variational_cov)\n \n return prior_distribution, variational_distribution\n \n def loss_fn(self, params, X_batch, y_batch, seed):\n params = self.get_constrained_params(params)\n \n # Get distributions\n prior_distribution, variational_distribution = self.get_distributions(params)\n \n # Compute kl\n kl = variational_distribution.kl_divergence(prior_distribution)\n\n # Compute log likelihood\n def log_likelihood_fn(x_i, y_i, seed):\n q_f = self.get_q_f(params, x_i, prior_distribution, variational_distribution)\n sample = q_f.sample(seed=seed)\n log_likelihood = tfd.Bernoulli(logits=sample).log_prob(y_i)\n return log_likelihood.squeeze()\n \n seeds = jax.random.split(seed, num=len(y_batch))\n log_likelihood = jax.vmap(log_likelihood_fn)(X_batch, y_batch, seeds).sum() * self.data_size/len(y_batch)\n\n return kl - log_likelihood\n \n def fit_fn(self, X, y, init_params, optimizer, n_iters, batch_size, seed):\n 
state = optimizer.init(init_params)\n value_and_grad_fn = jax.value_and_grad(self.loss_fn)\n \n def one_step(params_and_state, seed):\n params, state = params_and_state\n idx = jax.random.choice(seed, self.data_size, (batch_size,), replace=False)\n X_batch, y_batch = X[idx], y[idx]\n \n seed2 = jax.random.split(seed, 1)[0]\n loss, grads = value_and_grad_fn(params, X_batch, y_batch, seed2)\n updates, state = optimizer.update(grads, state)\n params = optax.apply_updates(params, updates)\n return (params, state), (loss, params)\n \n seeds = jax.random.split(seed, num=n_iters)\n (best_params, _), (loss_history, params_history) = jax.lax.scan(one_step, (init_params, state), xs=seeds)\n return best_params, loss_history, params_history\n\n def predict_fn(self, params, X_new):\n constrained_params = self.get_constrained_params(params)\n prior_distribution, variational_distribution = self.get_distributions(constrained_params)\n \n def _predict_fn(x_i): \n # Get posterior\n q_f = self.get_q_f(constrained_params, x_i, prior_distribution, variational_distribution)\n return q_f.mean().squeeze(), q_f.variance().squeeze()\n \n mean, var = jax.vmap(_predict_fn)(X_new)\n return mean.squeeze(), var.squeeze()" + "objectID": "posts/2022-06-10-jaxoptimizers.html#initial-parameters", + "href": "posts/2022-06-10-jaxoptimizers.html#initial-parameters", + "title": "JAX Optimizers", + "section": "Initial parameters", + "text": "Initial parameters\n\nN = 3\ninit_params = lambda: {'param1': jnp.zeros(N), 'param2': jnp.ones(N)}\na = 2.0" }, { - "objectID": "posts/2022-10-31-stochastic-variational-gp.html#train-and-predict", - "href": "posts/2022-10-31-stochastic-variational-gp.html#train-and-predict", - "title": "Stochastic Variational Gaussian processes in JAX", - "section": "Train and predict", - "text": "Train and predict\n\nn_inducing = 20\nn_epochs = 100\nbatch_size = 10\ndata_size = len(y)\nn_iters = n_epochs*(data_size/batch_size)\nn_iters\n\n1000.0\n\n\n\nkey = 
jax.random.PRNGKey(0)\nkey2, subkey = jax.random.split(key)\noptimizer = optax.adam(learning_rate=0.01)\n\nX_inducing = jax.random.choice(key, X, (n_inducing,), replace=False)\nmodel = SVGP(X_inducing, data_size)\n\ninit_params = model.init_params(key2)\n\nmodel.loss_fn(init_params, X, y, key)\nbest_params, loss_history, params_history = model.fit_fn(X, y, init_params, optimizer, n_iters, batch_size, subkey)\n\nplt.figure()\nplt.plot(loss_history);\nplt.title(\"Loss\");\n\n\n\n\n\n\n\n\n\nx = jnp.linspace(-3.5, 3.5, 100)\nseed = jax.random.PRNGKey(123)\n\nX1, X2 = jnp.meshgrid(x, x)\nf = lambda x1, x2: model.predict_fn(best_params, jnp.array([x1, x2]).reshape(1, -1))\npred_mean, pred_var = jax.vmap(jax.vmap(f))(X1, X2)\nlogits = tfd.Normal(pred_mean, pred_var**0.5).sample(seed=seed, sample_shape=(10000,))\nproba = jax.nn.sigmoid(logits)\n\nproba_mean = proba.mean(axis=0)\nproba_std2 = proba.std(axis=0)*2\n\n\nfig, ax = plt.subplots(1, 2, figsize=(12,4))\ncplot1 = ax[0].contourf(X1, X2, proba_mean.squeeze(), alpha=0.5, levels=20)\nplt.colorbar(cplot1, ax=ax[0])\n\ncplot2 = ax[1].contourf(X1, X2, proba_std2.squeeze(), alpha=0.5, levels=20)\nplt.colorbar(cplot2, ax=ax[1])\n\nax[0].scatter(X[:, 0], X[:, 1], c=y);\nax[1].scatter(X[:, 0], X[:, 1], c=y);\n\nax[0].set_title(\"Posterior $\\mu$\");\nax[1].set_title(\"Posterior $\\mu \\pm 2*\\sigma$\");" + "objectID": "posts/2022-06-10-jaxoptimizers.html#optimizers", + "href": "posts/2022-06-10-jaxoptimizers.html#optimizers", + "title": "JAX Optimizers", + "section": "Optimizers", + "text": "Optimizers\n\nJaxOpt ScipyMinimize\n\n%%time\nsolver = jaxopt.ScipyMinimize('L-BFGS-B', fun=loss_fun)\nans = solver.run(init_params(), a)\nprint(ans)\n\nWARNING:absl:No GPU/TPU found, falling back to CPU. 
(Set TF_CPP_MIN_LOG_LEVEL=0 and rerun for more info.)\n\n\nOptStep(params={'param1': DeviceArray([1.9999999, 1.9999999, 1.9999999], dtype=float32), 'param2': DeviceArray([3., 3., 3.], dtype=float32)}, state=ScipyMinimizeInfo(fun_val=DeviceArray(4.2632564e-14, dtype=float32), success=True, status=0, iter_num=2))\nCPU times: user 78.3 ms, sys: 18.5 ms, total: 96.8 ms\nWall time: 95.8 ms\n\n\n\nPros\n\nTwo lines of code will do it all.\n\n\n\nCons\n\nIt only returns the final parameters and final loss. There is no option to retrieve in-between loss values.\n\n\n\n\nOptax\n\n%%time\noptimizer = optax.adam(learning_rate=0.1)\nvalue_and_grad_fun = jax.jit(jax.value_and_grad(loss_fun, argnums=0))\nparams = init_params()\nstate = optimizer.init(params)\n\nfor _ in range(100):\n loss_value, gradients = value_and_grad_fun(params, a)\n updates, state = optimizer.update(gradients, state)\n params = optax.apply_updates(params, updates)\n\nprint(params)\n\n{'param1': DeviceArray([2.0084236, 2.0084236, 2.0084236], dtype=float32), 'param2': DeviceArray([3.0084238, 3.0084238, 3.0084238], dtype=float32)}\nCPU times: user 3.09 s, sys: 63.4 ms, total: 3.16 s\nWall time: 4.2 s\n\n\n\nPros:\n\nFull control in user’s hand. 
We can save intermediate loss values.\n\n\n\nCons:\n\nIts code is verbose, similar to PyTorch optimizers.\n\n\n\n\nJaxopt OptaxSolver\n\n%%time\noptimizer = optax.adam(learning_rate=0.1)\nsolver = jaxopt.OptaxSolver(loss_fun, optimizer, maxiter=100)\nans = solver.run(init_params(), a)\nprint(ans)\n\nOptStep(params={'param1': DeviceArray([2.008423, 2.008423, 2.008423], dtype=float32), 'param2': DeviceArray([3.008423, 3.008423, 3.008423], dtype=float32)}, state=OptaxState(iter_num=DeviceArray(100, dtype=int32, weak_type=True), value=DeviceArray(0.00113989, dtype=float32), error=DeviceArray(0.09549397, dtype=float32), internal_state=(ScaleByAdamState(count=DeviceArray(100, dtype=int32), mu={'param1': DeviceArray([0.02871927, 0.02871927, 0.02871927], dtype=float32), 'param2': DeviceArray([0.02871927, 0.02871927, 0.02871927], dtype=float32)}, nu={'param1': DeviceArray([0.44847375, 0.44847375, 0.44847375], dtype=float32), 'param2': DeviceArray([0.44847375, 0.44847375, 0.44847375], dtype=float32)}), EmptyState()), aux=None))\nCPU times: user 719 ms, sys: 13.4 ms, total: 732 ms\nWall time: 1.09 s\n\n\n\nPros:\n\nFewer lines of code.\nApplies lax.scan internally to make it fast [reference].\n\n\n\nCons:\n\nNo way to retrieve in-between state/loss values.\n\n\n\ntfp math minimize\n\n%%time\noptimizer = optax.adam(learning_rate=0.1)\nparams, losses = tfp.math.minimize_stateless(loss_fun, (init_params(), a), num_steps=1000, optimizer=optimizer)\nprint(params)\nprint(losses[:5])\n\n({'param1': DeviceArray([1.0000008, 1.0000008, 1.0000008], dtype=float32), 'param2': DeviceArray([1.9999989, 1.9999989, 1.9999989], dtype=float32)}, DeviceArray(0.9999999, dtype=float32))\n[48. 38.88006 30.751791 23.626852 17.507807]\nCPU times: user 880 ms, sys: 15.2 ms, total: 895 ms\nWall time: 1.53 s\n\n\n\nPros:\n\nOne line of code to optimize the function and return in-between losses.\n\n\n\nCons:\n\nBy default, it optimizes all arguments passed to the loss function. 
In the above example, we cannot control whether a should be optimized or not. I have raised an issue here about this problem."
  },
  {
    "objectID": "posts/2022-10-31-stochastic-variational-gp.html#some-more-datasets",
    "href": "posts/2022-10-31-stochastic-variational-gp.html#some-more-datasets",
    "title": "Stochastic Variational Gaussian processes in JAX",
    "section": "Some more datasets",
    "text": "Some more datasets\n\ndef fit_and_plot(X, y):\n X = StandardScaler().fit_transform(X) # Yes, this is useful for GPs\n X, y = map(jnp.array, (X, y))\n\n X_inducing = jax.random.choice(key, X, (n_inducing,), replace=False)\n model = SVGP(X_inducing, data_size)\n\n init_params = model.init_params(key2)\n\n model.loss_fn(init_params, X, y, key)\n best_params, loss_history, params_history = model.fit_fn(X, y, init_params, optimizer, n_iters, batch_size, subkey)\n\n plt.figure()\n plt.plot(loss_history);\n plt.title(\"Loss\");\n \n f = lambda x1, x2: model.predict_fn(best_params, jnp.array([x1, x2]).reshape(1, -1))\n pred_mean, pred_var = jax.vmap(jax.vmap(f))(X1, X2)\n logits = tfd.Normal(pred_mean, pred_var**0.5).sample(seed=seed, sample_shape=(10000,))\n proba = jax.nn.sigmoid(logits)\n\n proba_mean = proba.mean(axis=0)\n proba_std2 = proba.std(axis=0)*2\n \n fig, ax = plt.subplots(1, 2, figsize=(12,4))\n cplot1 = ax[0].contourf(X1, X2, proba_mean.squeeze(), alpha=0.5, levels=20)\n plt.colorbar(cplot1, ax=ax[0])\n\n cplot2 = ax[1].contourf(X1, X2, proba_std2.squeeze(), alpha=0.5, levels=20)\n plt.colorbar(cplot2, ax=ax[1])\n\n ax[0].scatter(X[:, 0], X[:, 1], c=y);\n ax[1].scatter(X[:, 0], X[:, 1], c=y);\n\n ax[0].set_title(\"Posterior $\\mu$\");\n ax[1].set_title(\"Posterior $\\mu \\pm 2*\\sigma$\");\n\n\nmake_blobs\n\nX, y = make_blobs(n_samples=n_samples, random_state=random_state, centers=2)\n\nplt.scatter(X[:, 0], X[:, 1], c=y);\nfit_and_plot(X, y)\n\nmake_circles\n\nX, y = make_circles(n_samples=n_samples, 
random_state=random_state, noise=noise, factor=0.1)\n\nplt.scatter(X[:, 0], X[:, 1], c=y);\nfit_and_plot(X, y)"
  },
  {
    "objectID": "posts/torch-tips.html",
    "href": "posts/torch-tips.html",
    "title": "PyTorch Tips",
    "section": "",
    "text": "Several tips for building torch models from scratch, based on my experience. Some of the tips are like zen: they are not immediately intuitive but are useful for efficient code.\n\nAll initializations or new tensor creation should only happen in the __init__ method. During the forward() call, ideally no new tensors should be created from scratch, such as torch.zeros(), torch.ones() etc. Reason: Violating this can sometimes break your forward pass, and end-to-end backprop may become buggy.\n.cuda() and .cpu() are discouraged; use .to(device) instead. Reason: .to(device) is more dynamic and scalable.\nDo not save models with torch.save(model); that may become incompatible with different torch versions and may take more memory. Save torch.save(model.state_dict()) instead.\nNeed to set parameter names dynamically? Use this example: zero = 0; self.register_parameter(f\"name_{zero}\", torch.nn.Parameter(torch.zeros(1))). They can be accessed with model.name_0.\nHave something in the model which is necessary for the forward pass but does not require backprop? Define those variables with self.register_buffer.\nLet .to(device) be set outside the model definition. Reason: It is less confusing to the users this way, and it is less messy with the internal tools that set the device, such as:\n\nmodule.to(device) sends all parameters and buffers of the model/submodules to the device.\n\nmodule.float() or module.double() will convert all model/submodule parameters and buffers into float32 and float64 respectively.\nLet .train() and .eval() be set outside the model definition by the user. Reason: It can be confusing to the user if these things are used inside the model, against torch conventions.\ntorch.no_grad() should not be used within the model. 
Reason: Sometimes user may want to backprop through that chunk of code.\nLink the multiple modules togather. Reason: Ideally, it is useful if model is built like a assembled product (say a car). You should be able to replace the parts as per your requirement. Several benefits on these lines are:\n\nsetting module.train() or module.eval() puts all submodules in train mode or eval mode respectively.\nAll submodules parameters can be accesses directly from the parent module with module.parameters().\n\nCreating a list of parameters in model __init__ definition? consider torch.nn.ModuleList(params) else individual parameters in the list will not be recognized as parameters." }, { "objectID": "posts/2021-09-27-constraints.html", @@ -1190,157 +1225,73 @@ "text": "GPFlow\n\nfrom gpflow.utilities.bijectors import positive\n\n\ngpflow_trans = positive()\n\n\nplt.plot(x, gpflow_trans(x));\nplt.xlabel('X')\nplt.ylabel('f(X)');\n\n\n\n\n\n\n\n\n\nnp.allclose(gpy_trans.f(x), gpytorch_trans.transform(x))\n\nTrue\n\n\n\nnp.allclose(gpy_trans.f(x), gpflow_trans(x))\n\nTrue" }, { - "objectID": "posts/kl-divergence.html", - "href": "posts/kl-divergence.html", - "title": "KL divergence v/s cross-entropy", - "section": "", - "text": "In a classification problem, for a data-point \\(\\mathbf{x}_i\\), we have the true label \\(y_i\\) associated with it.\nLet us assume that we have three possible outcomes \\(\\{L1, L2, L3\\}\\) and for current \\(\\mathbf{x}_i\\), corresponding \\(y_i\\) is \\(L2\\). 
Then the ground truth probability distribution is the following:\n\\[\np_G(y = L1) = 0\\\\\np_G(y = L2) = 1\\\\\np_G(y=L3) = 0\n\\]\nLet us assume that our classifier model predicted the following distribution:\n\\[\np_P(y = L1) = 0.1\\\\\np_P(y = L2) = 0.8\\\\\np_P(y=L3) = 0.1\n\\]"
  },
  {
    "objectID": "posts/kl-divergence.html#ground",
    "href": "posts/kl-divergence.html#ground",
    "title": "KL divergence v/s cross-entropy",
    "section": "",
    "text": "In a classification problem, for a data-point \\(\\mathbf{x}_i\\), we have the true label \\(y_i\\) associated with it.\nLet us assume that we have three possible outcomes \\(\\{L1, L2, L3\\}\\) and for current \\(\\mathbf{x}_i\\), corresponding \\(y_i\\) is \\(L2\\). Then the ground truth probability distribution is the following:\n\\[\np_G(y = L1) = 0\\\\\np_G(y = L2) = 1\\\\\np_G(y=L3) = 0\n\\]\nLet us assume that our classifier model predicted the following distribution:\n\\[\np_P(y = L1) = 0.1\\\\\np_P(y = L2) = 0.8\\\\\np_P(y=L3) = 0.1\n\\]"
  },
  {
    "objectID": "posts/kl-divergence.html#kl-divergence",
    "href": "posts/kl-divergence.html#kl-divergence",
    "title": "KL divergence v/s cross-entropy",
    "section": "KL divergence",
    "text": "KL divergence\nWe can use KL divergence to check how good our model is. The formula is:\n\\[\nD_{KL}(p_G\\;\\rVert\\;p_P) = \\sum_{y_i \\in \\{L1, L2, L3\\}} p_G(y_i)\\log\\frac{p_G(y_i)}{p_P(y_i)}\n\\]\nFor our example,\n\\[\nD_{KL}(p_G\\;\\rVert\\;p_P) = \\log\\frac{1}{0.8}\n\\]\nIt is evident that if \\(p_P(y = L2)\\) decreases from \\(0.8\\), \\(D_{KL}(p_G\\;\\rVert\\;p_P)\\) will increase and vice versa. Note that KL divergence is not symmetric, which means \\(D_{KL}(p_G\\;\\rVert\\;p_P) \\ne D_{KL}(p_P\\;\\rVert\\;p_G)\\)."
  },
  {
    "objectID": "posts/kl-divergence.html#cross-entory",
    "href": "posts/kl-divergence.html#cross-entory",
    "title": "KL divergence v/s cross-entropy",
    "section": "Cross-entropy",
    "text": "Cross-entropy\nCross-entropy is another measure of distribution similarity. The formula is:\n\\[\nH(p_G, p_P) = \\sum_{y_i \\in \\{L1, L2, L3\\}} - p_G(y_i)\\log p_P(y_i)\n\\]\nFor our example:\n\\[\nH(p_G, p_P) = -\\log 0.8 = \\log \\frac{1}{0.8}\n\\]"
  },
  {
    "objectID": "posts/kl-divergence.html#kl-divergence-vs-cross-entropy",
    "href": "posts/kl-divergence.html#kl-divergence-vs-cross-entropy",
    "title": "KL divergence v/s cross-entropy",
    "section": "KL divergence v/s cross-entropy",
    "text": "KL divergence v/s cross-entropy\nThis shows that KL divergence and cross-entropy return the same values for a simple classification problem. Then why do we use cross-entropy as a loss function and not KL divergence?\nThat’s because KL divergence computes additional constant terms (zero here) that add no value to the minimization."
  },
  {
    "objectID": "posts/2020-03-28-active_learning_with_bayesian_linear_regression.html",
    "href": "posts/2020-03-28-active_learning_with_bayesian_linear_regression.html",
    "title": "Active Learning with Bayesian Linear Regression",
    "section": "",
    "text": "A quick wrap-up for Bayesian Linear Regression (BLR)\nWe have a feature matrix \\(X\\) and a target vector \\(Y\\). 
We want to obtain the \\(\\theta\\) vector in such a way that the error \\(\\epsilon\\) for the following equation is minimized.\n\\[\nY = X\\theta + \\epsilon\n\\]\nThe prior PDF for \\(\\theta\\) is,\n\\[\np(\\theta) \\sim \\mathcal{N}(M_0, S_0)\n\\]\nwhere \\(S_0\\) is the prior covariance matrix and \\(M_0\\) is the prior mean.\nThe posterior PDF can be given as,\n\\[\n\\begin{aligned}\np(\\theta|X,Y) &\\sim \\mathcal{N}(\\theta | M_n, S_n) \\\\\nS_n &= (S_0^{-1} + \\sigma_{mle}^{-2}X^TX)^{-1} \\\\\nM_n &= S_n(S_0^{-1}M_0+\\sigma_{mle}^{-2}X^TY)\n\\end{aligned}\n\\]\nThe maximum likelihood estimate of \\(\\sigma\\) can be calculated as,\n\\[\n\\begin{aligned}\n\\theta_{mle} &= (X^TX)^{-1}X^TY \\\\\n\\sigma_{mle} &= ||Y - X\\theta_{mle}||\n\\end{aligned}\n\\]\nFinally, the predicted mean \\(\\hat{Y}_{mean}\\) and predicted covariance matrix \\(\\hat{Y}_{cov}\\) can be given as,\n\\[\n\\begin{aligned}\n\\hat{Y} &\\sim \\mathcal{N}(\\hat{Y}_{mean}, \\hat{Y}_{cov}) \\\\\n\\hat{Y}_{mean} &= XM_n \\\\\n\\hat{Y}_{cov} &= XS_nX^T\n\\end{aligned}\n\\]\nNow, let’s put everything together and write a class for Bayesian Linear Regression.\n\n\nCreating scikit-learn like class with fit predict methods for BLR\n\nimport numpy as np\nimport pandas as pd\nimport matplotlib.pyplot as plt\nfrom sklearn import datasets\nfrom sklearn.preprocessing import PolynomialFeatures, MinMaxScaler\nfrom sklearn.model_selection import train_test_split\nfrom matplotlib.animation import FuncAnimation\nfrom matplotlib import rc\nimport warnings\nwarnings.filterwarnings('ignore')\nseed = 0 # random seed for train_test_split\n\n\nclass BLR():\n def __init__(self,S0, M0): # M0 -> prior mean, S0 -> prior covariance matrix\n self.S0 = S0\n self.M0 = M0\n\n def fit(self,x,y, return_self = False):\n self.x = x\n self.y = y\n\n # Maximum likelihood estimation for sigma parameter\n theta_mle = np.linalg.pinv(x.T@x)@(x.T@y)\n sigma_2_mle = np.linalg.norm(y - x@theta_mle)**2\n sigma_mle = np.sqrt(sigma_2_mle)\n\n # Calculating 
predicted mean and covariance matrix for theta\n self.SN = np.linalg.pinv(np.linalg.pinv(self.S0) + (sigma_mle**-2)*x.T@x)\n self.MN = self.SN@(np.linalg.pinv(self.S0)@self.M0 + (sigma_mle**-2)*(x.T@y).squeeze())\n\n # Calculating predicted mean and covariance matrix for data\n self.pred_var = x@self.SN@x.T\n self.y_hat_map = x@self.MN\n if return_self:\n return (self.y_hat_map, self.pred_var)\n \n def predict(self, x):\n self.pred_var = x@self.SN@x.T\n self.y_hat_map = x@self.MN\n return (self.y_hat_map, self.pred_var)\n\n def plot(self, s=1): # s -> size of dots for scatter plot\n individual_var = self.pred_var.diagonal()\n plt.figure()\n plt.xlabel('x')\n plt.ylabel('y')\n plt.plot(self.x[:,1], self.y_hat_map, color='black', label='model')\n plt.fill_between(self.x[:,1], self.y_hat_map-individual_var, self.y_hat_map+individual_var, alpha=0.4, color='black', label='uncertainty')\n plt.scatter(self.x[:,1], self.y, label='actual data',s=s)\n plt.title('MAE is '+str(np.mean(np.abs(self.y - self.y_hat_map))))\n plt.legend()\n\n\n\nCreating & visualizing dataset\nTo start with, let’s create a random dataset with degree 3 polynomial function with some added noise.\n\\[\nY = (5X^3 - 4X^2 + 3X - 2) + \\mathcal{N}(0,1)\n\\]\n\nnp.random.seed(seed)\nX_init = np.linspace(-1, 1, 1000)\nnoise = np.random.randn(1000, )\nY = (5 * X_init**3 - 4 * X_init**2 + 3 * X_init - 2) + noise\n\nWe’ll try to fit a degree 5 polynomial function to our data.\n\nX = PolynomialFeatures(degree=5, include_bias=True).fit_transform(X_init.reshape(-1,1))\nN_features = X.shape[1]\n\n\nplt.scatter(X[:,1], Y, s=0.5, label = 'data points')\nplt.xlabel(\"X\")\nplt.ylabel(\"Y\")\nplt.legend()\nplt.show()\n\n\n\n\n\n\n\n\n\n\nLearning a BLR model on the entire data\nWe’ll take \\(M_0\\) (prior mean) as zero vector initially, assuming that we do not have any prior knowledge about \\(M_0\\). 
We’re taking \\(S_0\\) (prior covariance) as the identity matrix, assuming that all coefficients are completely independent of each other.\n\nS0 = np.eye(N_features)\nM0 = np.zeros((N_features, ))\nmodel = BLR(S0, M0)\n\n\nmodel.fit(X, Y)\n\n\n\nVisualising the fit\n\nmodel.plot(s=0.5)\n\n\n\n\n\n\n\n\nThis doesn’t look like a good fit, right? Let’s set the prior closer to the real values and visualize the fit again.\n\n\nVisualising the fit after changing the prior\n\nnp.random.seed(seed)\nS0 = np.eye(N_features)\nM0 = np.array([-2, 3, -4, 5, 0, 0]) + np.random.randn(N_features, )\nmodel = BLR(S0, M0)\n\n\nmodel.fit(X, Y)\nmodel.plot(s=0.5)\n\n\n\n\n\n\n\n\nHmm, better. Now let’s see how it fits after reducing the noise and setting the prior mean to zero vector again.\n\n\nVisualising the fit after reducing the noise\n\nnp.random.seed(seed)\nX_init = np.linspace(-1, 1, 1000)\nnoise = np.random.randn(1000, ) * 0.5\nY = (5 * X_init**3 - 4 * X_init**2 + 3 * X_init - 2) + noise\n\n\nS0 = np.eye(N_features)\nM0 = np.zeros((N_features, ))\nmodel = BLR(S0, M0)\n\n\nmodel.fit(X, Y)\nmodel.plot(s=0.5)\n\n\n\n\n\n\n\n\nWhen the noise was high, the model tended to align with the prior. After keeping the prior closer to the original coefficients, the model was improved as expected. From the last plot, we can say that as noise reduces from the data, the impact of the prior reduces, and the model tries to fit the data more precisely. Therefore, we can say that when data is too noisy or insufficient, a wisely chosen prior can produce a precise fit.\n\n\nIntuition to Active Learning (Uncertainty Sampling) with an example\nLet’s take the case where we want to train a machine learning model to classify if a person is infected with COVID-19 or not, but the testing facilities for the same are not available so widely. We may have very few amounts of data for detected positive and detected negative patients. 
Now, we want our model to be highly confident or least uncertain about its results; otherwise, it may create havoc for wrongly classified patients, but, our bottleneck is labeled data. Thanks to active learning techniques, we can overcome this problem smartly. How?\nWe train our model with existing data and test it on all the suspected patients’ data. Let’s say we have an uncertainty measure or confidence level about each tested data point (distance from the decision boundary in case of SVM, variance in case of Gaussian processes, or Bayesian Linear Regression). We can choose a patient for which our model is least certain, and send him to COVID-19 testing facilities (assuming that we can send only one patient at a time). Now, we can include his data to the train set and test the model on everyone else. By following the same procedure repeatedly, we can increase the size of our train data and confidence of the model without sending everyone randomly for testing.\nThis method is called Uncertainty Sampling in Active Learning. Now let’s formally define Active Learning. From Wikipedia,\nActive learning is a special case of machine learning in which a learning algorithm can interactively query a user (or some other information source) to label new data points with the desired outputs.\nNow, we’ll go through the active learning procedure step by step.\n\n\nTrain set, test set, and pool. What is what?\nThe train set includes labeled data points. The pool includes potential data points to query for a label, and the test set includes labeled data points to check the performance of our model. 
Here, we cannot actually do a query to anyone, so we assume that we do not have labels for the pool while training, and after each iteration, we include a data point from the pool set to the train set for which our model has the highest uncertainty.\nSo, the algorithm can be represented as the following,\n\nTrain the model with the train set.\nTest the performance on the test set (This is expected to improve).\nTest the model with the pool.\nQuery for the most uncertain datapoint from the pool.\nAdd that datapoint into the train set.\nRepeat step 1 to step 5 for \\(K\\) iterations (\\(K\\) ranges from \\(0\\) to the pool size).\n\n\n\nCreating initial train set, test set, and pool\nLet’s take half of the dataset as the test set, and from another half, we will start with some points as the train set and remaining as the pool. Let’s start with 2 data points as the train set.\n\nnp.random.seed(seed)\nX_init = np.linspace(-1, 1, 1000)\nX = PolynomialFeatures(degree=5, include_bias=True).fit_transform(X_init.reshape(-1,1))\nnoise = np.random.randn(1000, ) * 0.5\nY = (5 * X_init**3 - 4 * X_init**2 + 3 * X_init - 2) + noise\n\n\ntrain_pool_X, test_X, train_pool_Y, test_Y = train_test_split(X, Y, test_size = 0.5, random_state=seed)\ntrain_X, pool_X, train_Y, pool_Y = train_test_split(train_pool_X, train_pool_Y, train_size=2, random_state=seed)\n\nVisualizing train, test and pool.\n\nplt.scatter(test_X[:,1], test_Y, label='test set',color='r', s=2)\nplt.scatter(train_X[:,1], train_Y, label='train set',marker='s',color='k', s=50)\nplt.scatter(pool_X[:,1], pool_Y, label='pool',color='b', s=2)\nplt.xlabel('X')\nplt.ylabel('Y')\nplt.legend()\nplt.show()\n\n\n\n\n\n\n\n\nLet’s initialize a few dictionaries to keep track of each iteration.\n\ntrain_X_iter = {} # to store train points at each iteration\ntrain_Y_iter = {} # to store corresponding labels to the train set at each iteration\nmodels = {} # to store the models at each iteration\nestimations = {} # to store the 
estimations on the test set at each iteration\ntest_mae_error = {} # to store MAE(Mean Absolute Error) at each iteration\n\n\n\nTraining & testing initial learner on train set (Iteration 0)\nNow we will train the model for the initial train set, which is iteration 0.\n\ntrain_X_iter[0] = train_X\ntrain_Y_iter[0] = train_Y\n\n\nS0 = np.eye(N_features)\nM0 = np.zeros((N_features, ))\nmodels[0] = BLR(S0, M0)\n\n\nmodels[0].fit(train_X_iter[0], train_Y_iter[0])\n\nCreating a plot method to visualize train, test and pool with estimations and uncertainty.\n\ndef plot(ax, model, init_title=''):\n # Plotting the pool\n ax.scatter(pool_X[:,1], pool_Y, label='pool',s=1,color='r',alpha=0.4)\n \n # Plotting the test data\n ax.scatter(test_X[:,1], test_Y, label='test data',s=1, color='b', alpha=0.4)\n \n # Combining the test & the pool\n test_pool_X, test_pool_Y = np.append(test_X,pool_X, axis=0), np.append(test_Y,pool_Y)\n \n # Sorting test_pool for plotting\n sorted_inds = np.argsort(test_pool_X[:,1])\n test_pool_X, test_pool_Y = test_pool_X[sorted_inds], test_pool_Y[sorted_inds]\n \n # Plotting test_pool with uncertainty\n model.predict(test_pool_X)\n individual_var = model.pred_var.diagonal()\n ax.plot(test_pool_X[:,1], model.y_hat_map, color='black', label='model')\n ax.fill_between(test_pool_X[:,1], model.y_hat_map-individual_var, model.y_hat_map+individual_var\n , alpha=0.2, color='black', label='uncertainty')\n \n # Plotting the train data\n ax.scatter(model.x[:,1], model.y,s=40, color='k', marker='s', label='train data')\n ax.scatter(model.x[-1,1], model.y[-1],s=80, color='r', marker='o', label='last added point')\n \n # Plotting MAE on the test set\n model.predict(test_X)\n ax.set_title(init_title+' MAE is '+str(np.mean(np.abs(test_Y - model.y_hat_map))))\n ax.set_xlabel('x')\n ax.set_ylabel('y')\n ax.legend()\n\nPlotting the estimations and uncertainty.\n\nfig, ax = plt.subplots()\nplot(ax, models[0])\n\n\n\n\n\n\n\n\nLet’s check the maximum uncertainty about any 
point for the model.\n\nmodels[0].pred_var.diagonal().max()\n\n4.8261426545316604e-29\n\n\nOops!! There is almost no uncertainty in the model. Why? Let’s try again with more train points.\n\ntrain_pool_X, test_X, train_pool_Y, test_Y = train_test_split(X, Y, test_size = 0.5, random_state=seed)\ntrain_X, pool_X, train_Y, pool_Y = train_test_split(train_pool_X, train_pool_Y, train_size=7, random_state=seed)\n\n\ntrain_X_iter[0] = train_X\ntrain_Y_iter[0] = train_Y\n\n\nS0 = np.eye(N_features)\nM0 = np.zeros((N_features, ))\nmodels[0] = BLR(S0, M0)\n\n\nmodels[0].fit(train_X_iter[0], train_Y_iter[0])\n\n\nfig, ax = plt.subplots()\nplot(ax, models[0])\n\nNow uncertainty is visible, and currently it’s highest at the left-most points. We are trying to fit a degree 5 polynomial here, so our linear regression has 6 coefficients, including the bias. If we choose 6 or fewer train points, our model perfectly fits the train points and has no uncertainty. Choosing more than 6 train points induces uncertainty in the model.\nLet’s evaluate the performance on the test set.\n\nestimations[0], _ = models[0].predict(test_X)\ntest_mae_error[0] = np.mean(np.abs(test_Y - estimations[0]))\n\nMean Absolute Error (MAE) on the test set is\n\ntest_mae_error[0]\n\n0.5783654195019617\n\n\n\n\nMoving the most uncertain point from the pool to the train set\nIn the previous plot, we saw that the model was least certain about the left-most point. 
We’ll move that point from the pool to the train set and see the effect.\n\nesimations_pool, _ = models[0].predict(pool_X)\n\nFinding out a point having the most uncertainty.\n\nin_var = models[0].pred_var.diagonal().argmax()\nto_add_x = pool_X[in_var,:]\nto_add_y = pool_Y[in_var]\n\nAdding the point from the pool to the train set.\n\ntrain_X_iter[1] = np.vstack([train_X_iter[0], to_add_x])\ntrain_Y_iter[1] = np.append(train_Y_iter[0], to_add_y)\n\nDeleting the point from the pool.\n\npool_X = np.delete(pool_X, in_var, axis=0)\npool_Y = np.delete(pool_Y, in_var)\n\n\n\nTraining again and visualising the results (Iteration 1)\nThis time, we will pass previously learnt prior to the next iteration.\n\nS0 = np.eye(N_features)\nmodels[1] = BLR(S0, models[0].MN)\n\n\nmodels[1].fit(train_X_iter[1], train_Y_iter[1])\n\n\nestimations[1], _ = models[1].predict(test_X)\ntest_mae_error[1] = np.mean(np.abs(test_Y - estimations[1]))\n\nMAE on the test set is\n\ntest_mae_error[1]\n\n0.5779411133071186\n\n\nVisualizing the results.\n\nfig, ax = plt.subplots()\nplot(ax, models[1])\n\n\n\n\n\n\n\n\nBefore & after adding most uncertain point\n\nfig, ax = plt.subplots(1,2, figsize=(13.5,4.5))\nplot(ax[0], models[0],'Before')\nplot(ax[1], models[1],'After')\n\n\n\n\n\n\n\n\nWe can see that including most uncertain point into the train set has produced a better fit and MAE for test set has been reduced. 
Also, uncertainty has reduced at the left part of the data but it has increased a bit on the right part of the data.\nNow let’s do this for few more iterations in a loop and visualise the results.\n\n\nActive learning procedure\n\nnum_iterations = 20\npoints_added_x= np.zeros((num_iterations+1, N_features))\n\npoints_added_y=[]\n\nprint(\"Iteration, Cost\\n\")\nprint(\"-\"*40)\n\nfor iteration in range(2, num_iterations+1):\n # Making predictions on the pool set based on model learnt in the respective train set \n estimations_pool, var = models[iteration-1].predict(pool_X)\n \n # Finding the point from the pool with highest uncertainty\n in_var = var.diagonal().argmax()\n to_add_x = pool_X[in_var,:]\n to_add_y = pool_Y[in_var]\n points_added_x[iteration-1,:] = to_add_x\n points_added_y.append(to_add_y)\n \n # Adding the point to the train set from the pool\n train_X_iter[iteration] = np.vstack([train_X_iter[iteration-1], to_add_x])\n train_Y_iter[iteration] = np.append(train_Y_iter[iteration-1], to_add_y)\n \n # Deleting the point from the pool\n pool_X = np.delete(pool_X, in_var, axis=0)\n pool_Y = np.delete(pool_Y, in_var)\n \n # Training on the new set\n models[iteration] = BLR(S0, models[iteration-1].MN)\n models[iteration].fit(train_X_iter[iteration], train_Y_iter[iteration])\n \n estimations[iteration], _ = models[iteration].predict(test_X)\n test_mae_error[iteration]= pd.Series(estimations[iteration] - test_Y.squeeze()).abs().mean()\n print(iteration, (test_mae_error[iteration]))\n\nIteration, Cost\n\n----------------------------------------\n2 0.49023173501654815\n3 0.4923391714942153\n4 0.49040074812746753\n5 0.49610198614600165\n6 0.5015282102751122\n7 0.5051264429971314\n8 0.5099913097301352\n9 0.504455016053513\n10 0.5029219102020734\n11 0.5009762782262487\n12 0.5004883097883343\n13 0.5005169638980388\n14 0.5002731089932334\n15 0.49927485683909884\n16 0.49698416490822594\n17 0.49355398855432897\n18 0.49191185613804617\n19 0.491164833699368\n20 
0.4908067530719673\n\n\npd.Series(test_mae_error).plot(style='ko-')\nplt.xlim((-0.5, num_iterations+0.5))\nplt.ylabel(\"MAE on test set\")\nplt.xlabel(\"# Points Queried\")\nplt.show()\n\nThe plot above shows that the MAE on the test set fluctuates a bit initially and then reduces gradually as we keep including more points from the pool in the train set. Let’s visualise the fits for all the iterations. We’ll discuss this behaviour after that.\n\n\nVisualizing active learning procedure\n\nprint('Initial model')\nprint('Y = {0:0.2f} X^5 + {1:0.2f} X^4 + {2:0.2f} X^3 + {3:0.2f} X^2 + {4:0.2f} X + {5:0.2f}'.format(*models[0].MN[::-1]))\nprint('\\nFinal model')\nprint('Y = {0:0.2f} X^5 + {1:0.2f} X^4 + {2:0.2f} X^3 + {3:0.2f} X^2 + {4:0.2f} X + {5:0.2f}'.format(*models[num_iterations].MN[::-1]))\n\nInitial model\nY = 1.89 X^5 + 1.54 X^4 + 0.84 X^3 + -6.48 X^2 + 4.74 X + -1.63\n\nFinal model\nY = 2.50 X^5 + 3.11 X^4 + 0.83 X^3 + -7.08 X^2 + 4.47 X + -1.58\n\n\ndef update(iteration):\n ax.cla()\n plot(ax, models[iteration])\n fig.tight_layout()\n\n\nfig, ax = plt.subplots()\nanim = FuncAnimation(fig, update, frames=np.arange(0, num_iterations+1, 1), interval=250)\nplt.close()\nrc('animation', html='jshtml')\n\n\nanim\n\nWe can see that the point with the highest uncertainty was chosen in the first iteration, and it produced a near-optimal fit. 
After that, error reduced gradually.\nNow, let’s put everything together and create a class for active learning procedure\n\n\nCreating a class for active learning procedure\n\nclass ActiveL():\n def __init__(self, X, y, S0=None, M0=None, test_size=0.5, degree = 5, iterations = 20, seed=1):\n self.X_init = X\n self.y = y\n self.S0 = S0\n self.M0 = M0\n self.train_X_iter = {} # to store train points at each iteration\n self.train_Y_iter = {} # to store corresponding labels to the train set at each iteration\n self.models = {} # to store the models at each iteration\n self.estimations = {} # to store the estimations on the test set at each iteration\n self.test_mae_error = {} # to store MAE(Mean Absolute Error) at each iteration\n self.test_size = test_size\n self.degree = degree\n self.iterations = iterations\n self.seed = seed\n self.train_size = degree + 2\n\n def data_preperation(self):\n # Adding polynomial features\n self.X = PolynomialFeatures(degree=self.degree).fit_transform(self.X_init)\n N_features = self.X.shape[1]\n \n # Splitting into train, test and pool\n train_pool_X, self.test_X, train_pool_Y, self.test_Y = train_test_split(self.X, self.y, \n test_size=self.test_size,\n random_state=self.seed)\n self.train_X, self.pool_X, self.train_Y, self.pool_Y = train_test_split(train_pool_X, train_pool_Y, \n train_size=self.train_size, \n random_state=self.seed)\n \n # Setting BLR prior incase of not given\n if self.M0 == None:\n self.M0 = np.zeros((N_features, ))\n if self.S0 == None:\n self.S0 = np.eye(N_features)\n \n def main(self):\n # Training for iteration 0\n self.train_X_iter[0] = self.train_X\n self.train_Y_iter[0] = self.train_Y\n self.models[0] = BLR(self.S0, self.M0)\n self.models[0].fit(self.train_X, self.train_Y)\n\n # Running loop for all iterations\n for iteration in range(1, self.iterations+1):\n # Making predictions on the pool set based on model learnt in the respective train set \n estimations_pool, var = 
self.models[iteration-1].predict(self.pool_X)\n \n # Finding the point from the pool with highest uncertainty\n in_var = var.diagonal().argmax()\n to_add_x = self.pool_X[in_var,:]\n to_add_y = self.pool_Y[in_var]\n \n # Adding the point to the train set from the pool\n self.train_X_iter[iteration] = np.vstack([self.train_X_iter[iteration-1], to_add_x])\n self.train_Y_iter[iteration] = np.append(self.train_Y_iter[iteration-1], to_add_y)\n \n # Deleting the point from the pool\n self.pool_X = np.delete(self.pool_X, in_var, axis=0)\n self.pool_Y = np.delete(self.pool_Y, in_var)\n \n # Training on the new set\n self.models[iteration] = BLR(self.S0, self.models[iteration-1].MN)\n self.models[iteration].fit(self.train_X_iter[iteration], self.train_Y_iter[iteration])\n \n self.estimations[iteration], _ = self.models[iteration].predict(self.test_X)\n self.test_mae_error[iteration]= pd.Series(self.estimations[iteration] - self.test_Y.squeeze()).abs().mean()\n\n def _plot_iter_MAE(self, ax, iteration):\n ax.plot(list(self.test_mae_error.values())[:iteration+1], 'ko-')\n ax.set_title('MAE on test set over iterations')\n ax.set_xlim((-0.5, self.iterations+0.5))\n ax.set_ylabel(\"MAE on test set\")\n ax.set_xlabel(\"# Points Queried\")\n \n def _plot(self, ax, model):\n # Plotting the pool\n ax.scatter(self.pool_X[:,1], self.pool_Y, label='pool',s=1,color='r',alpha=0.4)\n \n # Plotting the test data\n ax.scatter(self.test_X[:,1], self.test_Y, label='test data',s=1, color='b', alpha=0.4)\n \n # Combining test_pool\n test_pool_X, test_pool_Y = np.append(self.test_X, self.pool_X, axis=0), np.append(self.test_Y, self.pool_Y)\n \n # Sorting test_pool\n sorted_inds = np.argsort(test_pool_X[:,1])\n test_pool_X, test_pool_Y = test_pool_X[sorted_inds], test_pool_Y[sorted_inds]\n \n # Plotting test_pool with uncertainty\n preds, var = model.predict(test_pool_X)\n individual_var = var.diagonal()\n ax.plot(test_pool_X[:,1], model.y_hat_map, color='black', label='model')\n 
ax.fill_between(test_pool_X[:,1], model.y_hat_map-individual_var, model.y_hat_map+individual_var\n , alpha=0.2, color='black', label='uncertainty')\n \n # plotting the train data\n ax.scatter(model.x[:,1], model.y,s=10, color='k', marker='s', label='train data')\n ax.scatter(model.x[-1,1], model.y[-1],s=80, color='r', marker='o', label='last added point')\n \n # plotting MAE\n preds, var = model.predict(self.test_X)\n ax.set_title('MAE is '+str(np.mean(np.abs(self.test_Y - preds))))\n ax.set_xlabel('x')\n ax.set_ylabel('y')\n ax.legend()\n \n def visualise_AL(self):\n fig, ax = plt.subplots(1,2,figsize=(13,5))\n def update(iteration):\n ax[0].cla()\n ax[1].cla()\n self._plot(ax[0], self.models[iteration])\n self._plot_iter_MAE(ax[1], iteration)\n fig.tight_layout()\n\n print('Initial model')\n print('Y = '+' + '.join(['{0:0.2f}'.format(self.models[0].MN[i])+' X^'*min(i,1)+str(i)*min(i,1) for i in range(self.degree+1)]))\n print('\\nFinal model')\n print('Y = '+' + '.join(['{0:0.2f}'.format(self.models[self.iterations].MN[i])+' X^'*min(i,1)+str(i)*min(i,1) for i in range(self.degree+1)]))\n\n anim = FuncAnimation(fig, update, frames=np.arange(0, self.iterations+1, 1), interval=250)\n plt.close()\n\n rc('animation', html='jshtml')\n return anim\n\n\n\nVisualizing a different polynomial fit on the same dataset\nLet’s try to fit a degree 7 polynomial to the same data now.\n\nnp.random.seed(seed)\nX_init = np.linspace(-1, 1, 1000)\nnoise = np.random.randn(1000, ) * 0.5\nY = (5 * X_init**3 - 4 * X_init**2 + 3 * X_init - 2) + noise\n\n\nmodel = ActiveL(X_init.reshape(-1,1), Y, degree=7, iterations=20, seed=seed)\n\n\nmodel.data_preperation()\nmodel.main()\nmodel.visualise_AL()\n\nInitial model\nY = -1.92 + 3.79 X^1 + -1.81 X^2 + -0.43 X^3 + -0.51 X^4 + -0.27 X^5 + -0.18 X^6 + -0.11 X^7\n\nFinal model\nY = -1.79 + 4.86 X^1 + -5.38 X^2 + 0.50 X^3 + -0.17 X^4 + 1.19 X^5 + 1.83 X^6 + 1.31 X^7\n\n\n\n\n\n\n\n\n\n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n 
\n\n\nWe can clearly see that the model fits the train points well and that uncertainty is highest at the left-most position. After the first iteration, the left-most point was added to the train set and the MAE reduced significantly. A similar phenomenon happened at iteration 2 with the right-most point. After that, the error kept reducing gradually at a slower rate, because the fit was already near optimal after just 2 iterations.\n\n\nActive learning for diabetes dataset from the Scikit-learn module\nLet’s run our model on the diabetes data from the sklearn module. The data contains various features such as age, sex, and weight of diabetic patients, and the target is the progression of the disease after one year. We’ll use only the ‘weight’ feature, which seems to have the strongest correlation with the target.\nWe’ll try to fit a degree-1 polynomial to this data, as the data seems to follow a linear trend. First, let’s check the performance of the Scikit-learn linear regression model.\n\nX, Y = datasets.load_diabetes(return_X_y=True)\nX = X[:, 2].reshape(-1,1) # Choosing only feature 2 which seems more relevant to linear regression\n\n# Normalizing\nX = (X - X.min())/(X.max() - X.min())\nY = (Y - Y.min())/(Y.max() - Y.min())\n\nVisualizing the dataset.\n\nplt.scatter(X, Y)\nplt.xlabel('Weight of the patients')\nplt.ylabel('Increase in the disease after a year')\nplt.show()\n\n\n\n\n\n\n\n\nLet’s fit the Scikit-learn linear regression model with a 50% train-test split.\n\nfrom sklearn.linear_model import LinearRegression\ntrain_X, test_X, train_Y, test_Y = train_test_split(X, Y, test_size = 0.5, random_state = seed)\n\n\nclf = LinearRegression()\n\n\nclf.fit(train_X, train_Y)\npred_Y = clf.predict(test_X)\n\nVisualizing the fit & MAE.\n\nplt.scatter(X, Y, label='data', s=5)\nplt.plot(test_X, pred_Y, label='model', color='r')\nplt.xlabel('Weight of the patients')\nplt.ylabel('Increase in the disease after a year')\nplt.title('MAE is '+str(np.mean(np.abs(pred_Y - 
test_Y))))\nplt.legend()\nplt.show()\n\n\n\n\n\n\n\n\nNow we’ll fit the same data with our BLR model.\n\nmodel = ActiveL(X.reshape(-1,1), Y, degree=1, iterations=20, seed=seed)\n\n\nmodel.data_preperation()\nmodel.main()\nmodel.visualise_AL()\n\nInitial model\nY = 0.41 + 0.16 X^1\n\nFinal model\nY = 0.13 + 0.86 X^1\n\n\n\nInitially, the fit leans towards a zero slope, which reflects the bias introduced by the zero-mean prior when the number of training points is low. It’s interesting to see that our initial train points tend to make a vertical fit, but the model doesn’t get carried away by that and stabilizes itself with the prior.\n\nprint('MAE for Scikit-learn Linear Regression is',np.mean(np.abs(pred_Y - test_Y)))\nprint('MAE for Bayesian Linear Regression is', model.test_mae_error[20])\n\nMAE for Scikit-learn Linear Regression is 0.15424985705353944\nMAE for Bayesian Linear Regression is 0.15738001811804758\n\n\nIn the end, the results of sklearn linear regression and our active-learning-based BLR model are comparable, even though we used only 20 points to train our model compared to the 221 points used by sklearn. This is because active learning enables us to choose the training datapoints that contribute the most towards a precise fit." 
- }, - { - "objectID": "posts/non-gaussian-likelihood-mlps.html", - "href": "posts/non-gaussian-likelihood-mlps.html", - "title": "Non-Gaussian Likelihoods for MLPs", - "section": "", - "text": "# %pip install mapie\nimport os\n\nos.environ[\"CUDA_VISIBLE_DEVICES\"] = \"3\"\n\nimport numpy as np\n\nimport torch\nimport torch.nn as nn\nimport torch.distributions as dist\n\nfrom tqdm import tqdm\n\nfrom sklearn.calibration import calibration_curve\nfrom sklearn.metrics import classification_report\nimport matplotlib.pyplot as plt\nimport seaborn as sns\n\nfrom mapie.metrics import regression_coverage_score\n\ndevice = torch.device(\"cuda\" if torch.cuda.is_available() else \"cpu\")\ntorch.manual_seed(0)\n\nN = 100\nx = dist.Uniform(-1, 1).sample((N, 1)).sort(dim=0).values\nx_test = torch.linspace(-1, 1, 2 * N).view(-1, 1).sort(dim=0).values\ny = 3 * x**3 - 2 * x + 1\ny_noisy = y + dist.Gamma(0.1, 0.3).sample((N, 1))\n\nplt.plot(x, y, label=\"true\", color=\"C0\")\nplt.scatter(x, y_noisy, label=\"noisy data\", color=\"C1\")\n\nplt.legend()\nprint(\"x.shape:\", x.shape, \"y.shape:\", y.shape)\n\nx.shape: torch.Size([100, 1]) y.shape: torch.Size([100, 1])" - }, - { - "objectID": "posts/non-gaussian-likelihood-mlps.html#define-a-gaussiangamma-mlp", - "href": "posts/non-gaussian-likelihood-mlps.html#define-a-gaussiangamma-mlp", - "title": "Non-Gaussian Likelihoods for MLPs", - "section": "Define a Gaussian/Gamma MLP", - "text": "Define a Gaussian/Gamma MLP\n\nclass ProbabilisticMLP(nn.Module):\n def __init__(self, input_dim, feature_dims, type):\n super().__init__()\n self.input_dim = input_dim\n self.feature_dims = feature_dims\n self.type = type # \"gaussian\" or \"gamma\"\n\n self.layers = nn.ModuleList()\n self.layers.append(nn.Linear(input_dim, feature_dims[0]))\n for i in range(len(feature_dims) - 1):\n self.layers.append(nn.Linear(feature_dims[i], feature_dims[i + 1]))\n self.layers.append(nn.Linear(feature_dims[-1], 2))\n\n # likelihood parameters\n # if self.type 
== \"gaussian\":\n # self.register_buffer(\"likelihood_mean\", torch.zeros(1))\n # self.likelihood_log_std = nn.Parameter(torch.zeros(1))\n # elif self.type == \"gamma\":\n # self.likelihood_log_concentration = nn.Parameter(torch.zeros(1))\n # self.likelihood_log_rate = nn.Parameter(torch.zeros(1))\n\n def forward(self, x):\n for layer in self.layers[:-1]:\n x = torch.relu(layer(x))\n\n if self.type == \"gaussian\":\n # y_pred = self.layers[-1](x)\n # likelihood_mean = self.likelihood_mean.expand(y_pred.shape[0])\n # likelihood_log_std = self.likelihood_log_std.expand(y_pred.shape[0])\n # likelihood_std = torch.exp(likelihood_log_std)\n # return y_pred, likelihood_mean, likelihood_std\n\n y_out = self.layers[-1](x)\n mean = y_out[:, 0]\n log_std = y_out[:, 1]\n std = torch.exp(log_std)\n return mean.ravel(), std.ravel()\n\n elif self.type == \"gamma\":\n # y_pred = self.layers[-1](x)\n # likelihood_log_concentration = self.likelihood_log_concentration.expand(\n # y_pred.shape[0]\n # )\n # likelihood_log_rate = self.likelihood_log_rate.expand(y_pred.shape[0])\n # likelihood_concentration = torch.exp(likelihood_log_concentration)\n # likelihood_rate = torch.exp(likelihood_log_rate)\n # return y_pred, likelihood_concentration, likelihood_rate\n\n y_out = self.layers[-1](x)\n log_concentration = y_out[:, 0]\n log_rate = y_out[:, 1]\n concentration = torch.exp(log_concentration)\n rate = torch.exp(log_rate)\n return concentration, rate\n\n def loss_fn(self, y, param1, param2):\n if self.type == \"gaussian\":\n # epsilon = y - y_pred\n # mean = param1\n # std = param2\n # dist = torch.distributions.Normal(mean, std + 1e-6)\n # return -dist.log_prob(epsilon).mean()\n mean = param1\n std = param2\n dist = torch.distributions.Normal(mean, std + 1e-3)\n return -dist.log_prob(y.ravel()).mean()\n\n elif self.type == \"gamma\":\n # epsilon = torch.clip(y - y_pred, min=1e-6, max=1e6)\n # concentration = param1\n # rate = param2\n # dist = torch.distributions.Gamma(concentration, 
rate)\n # return -dist.log_prob(epsilon).mean()\n concentration = param1\n rate = param2\n dist = torch.distributions.Gamma(concentration + 1e-3, rate + 1e-3)\n return -dist.log_prob(y.ravel()).mean()" - }, - { - "objectID": "posts/non-gaussian-likelihood-mlps.html#fit-gaussian-mlp", - "href": "posts/non-gaussian-likelihood-mlps.html#fit-gaussian-mlp", - "title": "Non-Gaussian Likelihoods for MLPs", - "section": "Fit Gaussian MLP", - "text": "Fit Gaussian MLP\n\ntorch.manual_seed(0)\n\nmodel = ProbabilisticMLP(1, [32, 32], \"gaussian\").to(device)\n\noptimizer = torch.optim.Adam(model.parameters(), lr=0.01)\nn_epochs = 500\n\npbar = tqdm(range(n_epochs))\nlosses = []\nfor epoch in pbar:\n optimizer.zero_grad()\n param1, param2 = model(x.to(device))\n loss = model.loss_fn(y_noisy.to(device), param1, param2)\n loss.backward()\n optimizer.step()\n losses.append(loss.item())\n\n pbar.set_description(f\"loss: {loss.item():.4f}\")\n\nplt.plot(losses)\n\nloss: 0.4503: 100%|██████████| 500/500 [00:01<00:00, 291.18it/s]\n\n\n\n\n\n\n\n\n\n\n# sns.kdeplot(param2.cpu().detach().numpy(), label=\"std\")\n\n\nwith torch.no_grad():\n y_mean, y_std = model(x_test.to(device))\n y_mean = y_mean.cpu().numpy().ravel()\n y_std = y_std.cpu().numpy().ravel()\n # y_mean = y_pred.cpu().numpy().ravel() + mean.cpu().numpy().ravel()\n # y_std = std.cpu().numpy().ravel()\n\nplt.plot(x, y, label=\"true\", color=\"C0\")\nplt.scatter(x, y_noisy, label=\"noisy data\", color=\"C1\")\nplt.plot(x_test, y_mean, label=\"y_mean\", color=\"C2\")\nplt.fill_between(\n x_test.squeeze(),\n y_mean - 2 * y_std,\n y_mean + 2 * y_std,\n alpha=0.3,\n color=\"C2\",\n label=\"95% CI\",\n)\n\nplt.legend()\n\n\n\n\n\n\n\n\n\nwith torch.no_grad():\n y_mean, y_std = model(x.to(device))\n y_mean = y_mean.cpu().numpy().ravel()\n y_std = y_std.cpu().numpy().ravel()\n\nupper = y_mean + 2 * y_std\nlower = y_mean - 2 * y_std\n\nregression_coverage_score(y_noisy.numpy(), lower, upper)\n\n0.91" - }, - { - "objectID": 
"posts/non-gaussian-likelihood-mlps.html#fit-gamma-mlp", - "href": "posts/non-gaussian-likelihood-mlps.html#fit-gamma-mlp", - "title": "Non-Gaussian Likelihoods for MLPs", - "section": "Fit Gamma MLP", - "text": "Fit Gamma MLP\n\nmodel = ProbabilisticMLP(1, [32, 32, 32], \"gamma\").to(device)\n\noptimizer = torch.optim.Adam(model.parameters(), lr=0.01)\nn_epochs = 1000\n\npbar = tqdm(range(n_epochs))\nlosses = []\nfor epoch in pbar:\n optimizer.zero_grad()\n param1, param2 = model(x.to(device))\n loss = model.loss_fn(y_noisy.to(device), param1, param2)\n loss.backward()\n optimizer.step()\n losses.append(loss.item())\n\n pbar.set_description(f\"loss: {loss.item():.4f}\")\n\nplt.plot(losses)\n\nloss: 0.0775: 100%|██████████| 1000/1000 [00:03<00:00, 266.98it/s]\n\n\n\n\n\n\n\n\n\n\nfrom scipy.special import gammaincinv, gamma\n\nwith torch.no_grad():\n concetration, rate = model(x_test.to(device))\n concetration = concetration.cpu().ravel().numpy()\n rate = rate.cpu().ravel().numpy()\n\n y_mode = (concetration - 1) / rate\n\n quantile_fn = lambda p: gammaincinv(concetration, gamma(concetration) * p) / rate\n\n upper = quantile_fn(0.975)\n lower = quantile_fn(0.025)\n\nplt.plot(x, y, label=\"true\", color=\"C0\")\nplt.scatter(x, y_noisy, label=\"noisy data\", color=\"C1\")\nplt.plot(x_test, y_mode, label=\"mean\", color=\"C2\")\nplt.fill_between(\n x_test.squeeze(),\n lower,\n upper,\n alpha=0.3,\n color=\"C2\",\n label=\"95% CI\",\n)\n\nplt.legend()\n\n\n\n\n\n\n\n\n\nwith torch.no_grad():\n param1, param2 = model(x.to(device))\n concetration = param1.cpu().numpy().ravel()\n rate = param2.cpu().numpy().ravel()\n\n upper = quantile_fn(0.975)\n lower = quantile_fn(0.025)\n\nregression_coverage_score(y_noisy.numpy(), lower, upper)\n\n0.07" - }, - { - "objectID": "posts/2022-05-14-iteratively_reweighted_least_squares.html", - "href": "posts/2022-05-14-iteratively_reweighted_least_squares.html", - "title": "Iteratively reweighted least squares (IRLS) logistic regression", 
+ "objectID": "lab/how-to-finetune-florence-2-on-detection-dataset.html", + "href": "lab/how-to-finetune-florence-2-on-detection-dataset.html", + "title": "Fine-tuning Florence-2 on Object Detection Dataset", "section": "", - "text": "import jax\nimport jax.numpy as jnp\nimport numpy as np\nfrom sklearn.datasets import make_blobs\nimport matplotlib.pyplot as plt\nfrom matplotlib.animation import FuncAnimation\nfrom time import time\n\n# Enable high precision\nfrom jax.config import config\nconfig.update(\"jax_enable_x64\", True)\n\n# To enable animation inside notebook\nplt.rc(\"animation\", html=\"jshtml\")" - }, - { - "objectID": "posts/2022-05-14-iteratively_reweighted_least_squares.html#create-dataset", - "href": "posts/2022-05-14-iteratively_reweighted_least_squares.html#create-dataset", - "title": "Iteratively reweighted least squares (IRLS) logistic regression", - "section": "Create dataset", - "text": "Create dataset\n\nfeatures, labels = make_blobs(100, n_features=2, centers=2, random_state=0)\nplt.scatter(features[:, 0], features[:, 1], c=labels);\n\n\n\n\n\n\n\n\n\nprint(features.shape, features.dtype, labels.shape, labels.dtype)\n\n(100, 2) float64 (100,) int64" - }, - { - "objectID": "posts/2022-05-14-iteratively_reweighted_least_squares.html#implementing-newtons-method-naive-way", - "href": "posts/2022-05-14-iteratively_reweighted_least_squares.html#implementing-newtons-method-naive-way", - "title": "Iteratively reweighted least squares (IRLS) logistic regression", - "section": "Implementing Newton’s method (naive way)", - "text": "Implementing Newton’s method (naive way)\nWe will first try to implement Eq. 
10.31 directly from PML book1:\n\\[\n\\boldsymbol{w}_{t+1}=\\boldsymbol{w}_{t}-\\eta_{t} \\mathbf{H}_{t}^{-1} \\boldsymbol{g}_{t}\n\\]\n\ndef get_logits(params, feature): # for a single data-point\n logits = jnp.sum(feature * params[\"w\"]) + params[\"b\"]\n return logits\n\ndef naive_loss(params, feature, label): # for a single data-point\n logits = get_logits(params, feature)\n prob = jax.nn.sigmoid(logits)\n\n # Check if label is 1 or 0\n is_one = (label == 1)\n loss_if_one = lambda: -jnp.log(prob) # loss if label is 1\n loss_if_zero = lambda: -jnp.log(1 - prob) # loss if labels is 0\n\n # Use lax.cond to convert if..else.. in jittable format\n loss = jax.lax.cond(is_one, loss_if_one, loss_if_zero)\n\n return loss\n\ndef naive_loss_batch(params, features, labels): # for a batch of data-points\n losses = jax.vmap(naive_loss, in_axes=(None, 0, 0))(params, features, labels)\n return jnp.mean(losses)\n\nWriting the train function\n\ndef naive_train_step(params, features, labels, learning_rate):\n # Find gradient\n loss_value, grads = jax.value_and_grad(naive_loss_batch)(params, features, labels)\n\n # Find Hessian\n hess = jax.hessian(naive_loss_batch)(params, features, labels)\n\n # Adjust Hessian matrix nicely\n hess_matrix = jnp.block([[hess[\"b\"][\"b\"], hess[\"b\"][\"w\"]],\n [hess[\"w\"][\"b\"], hess[\"w\"][\"w\"]]])\n \n # Adjust gradient vector nicely\n grad_vector = jnp.r_[grads[\"b\"], grads[\"w\"]]\n\n # Find H^-1g\n h_inv_g = jnp.dot(jnp.linalg.inv(hess_matrix), grad_vector)\n\n # Get back the structure\n h_inv_g = {\"b\": h_inv_g[0], \"w\": h_inv_g[1:]}\n\n # Apply the update\n params = jax.tree_map(lambda p, g: p - learning_rate*g, params, h_inv_g)\n\n return params, loss_value\n\n# First order method\n# vg = jax.value_and_grad(naive_loss_batch)\n# def train_step(params, features, labels, learning_rate):\n# # Find gradient\n# loss_value, grads = vg(params, features, labels)\n\n# # Apply the update\n# params = jax.tree_map(lambda p, g: p - 
learning_rate*g, params, grads)\n\n# return params, loss_value\n\n\nkey = jax.random.PRNGKey(0)\nrandom_params = jax.random.normal(key, shape=(3, ))\n# \"b\" should have shape (1,) for hessian trick with jnp.block to work\nparams = {\"w\": random_params[:2], \"b\": random_params[2].reshape(1,)}\nlearning_rate = 1.0\nepochs = 20\n\ntrain_step_jitted = jax.jit(naive_train_step)\n\nhistory = {\"loss\": [], \"params\": []}\n\n# warm up\ntrain_step_jitted(params, features, labels, learning_rate)\n\ninit = time()\nfor _ in range(epochs):\n history[\"params\"].append(params)\n params, loss_value = train_step_jitted(params, features, labels, learning_rate)\n history[\"loss\"].append(loss_value)\nprint(time() - init, \"seconds\")\nprint(params)\n\nWARNING:absl:No GPU/TPU found, falling back to CPU. (Set TF_CPP_MIN_LOG_LEVEL=0 and rerun for more info.)\n\n\n0.0015490055084228516 seconds\n{'b': DeviceArray([13.22076694], dtype=float64), 'w': DeviceArray([ 0.59021174, -5.18797851], dtype=float64)}\n\n\nA helper function to animate the learning.\n\ndef animate(history):\n fig, ax = plt.subplots(1, 2, figsize=(10,4))\n def update(idx):\n # Clear previous frame\n ax[0].cla()\n ax[1].cla()\n\n # Plot data\n params = history[\"params\"][idx]\n losses = history[\"loss\"][:idx]\n ax[0].scatter(features[:, 0], features[:, 1], c=labels)\n \n # Calculate and plot decision boundary\n x0_min, x0_max = features[:, 0].min(), features[:, 0].max()\n x1_min = -(params[\"b\"] + params[\"w\"][0] * x0_min)/params[\"w\"][1]\n x1_max = -(params[\"b\"] + params[\"w\"][0] * x0_max)/params[\"w\"][1]\n\n ax[0].plot([x0_min, x0_max], [x1_min, x1_max], label='decision boundary')\n\n # Plot losses\n ax[1].plot(losses, label=\"loss\")\n ax[1].set_xlabel(\"Iterations\")\n\n ax[0].legend()\n ax[1].legend()\n\n anim = FuncAnimation(fig, update, range(epochs))\n plt.close()\n return anim\n\n\nanimate(history)\n\n\n\n\n\n\n\n\n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n \n Once\n \n 
Loop\n \n Reflect" + "text": "Roboflow Notebooks" }, { - "objectID": "posts/2022-05-14-iteratively_reweighted_least_squares.html#implementing-irls-algorithm", - "href": "posts/2022-05-14-iteratively_reweighted_least_squares.html#implementing-irls-algorithm", - "title": "Iteratively reweighted least squares (IRLS) logistic regression", - "section": "Implementing IRLS algorithm", - "text": "Implementing IRLS algorithm\n\ndef get_s_and_z(params, feature, label): # for a single data-point\n logits = get_logits(params, feature)\n prob = jax.nn.sigmoid(logits)\n s = prob * (1 - prob)\n z = logits + (label - prob)/s\n return s, z\n\ndef irls_train_step(params, features, labels):\n s, z = jax.vmap(get_s_and_z, in_axes=(None, 0, 0))(params, features, labels)\n S = jnp.diag(s.flatten()) # convert into a diagonal matrix\n\n # Add column with ones\n X = jnp.c_[jnp.ones(len(z)), features]\n\n # Get weights\n weights = jnp.linalg.inv(X.T@S@X)@X.T@S@z.flatten()\n\n # get correct format\n params = {\"b\": weights[0], \"w\": weights[1:]}\n\n return params\n\n\nkey = jax.random.PRNGKey(0)\nrandom_params = jax.random.normal(key, shape=(3,))\nparams = {\"w\": random_params[:2], \"b\": random_params[2]}\nepochs = 20\n\ntrain_step_jitted = jax.jit(irls_train_step)\n\nirls_history = {\"params\": []}\n\n# warm up\ntrain_step_jitted(params, features, labels)\n\ninit = time()\nfor _ in range(epochs):\n irls_history[\"params\"].append(params)\n params = train_step_jitted(params, features, labels)\nprint(time() - init, \"seconds\")\nprint(params)\n\n0.0016303062438964844 seconds\n{'b': DeviceArray(13.22076694, dtype=float64), 'w': DeviceArray([ 0.59021174, -5.18797851], dtype=float64)}" + "objectID": "lab/how-to-finetune-florence-2-on-detection-dataset.html#setup", + "href": "lab/how-to-finetune-florence-2-on-detection-dataset.html#setup", + "title": "Fine-tuning Florence-2 on Object Detection Dataset", + "section": "Setup", + "text": "Setup\n\nConfigure your API keys\nTo fine-tune 
Florence-2, you need to provide your HuggingFace Token and Roboflow API key. Follow these steps:\n\nOpen your HuggingFace Settings page. Click Access Tokens then New Token to generate new token.\nGo to your Roboflow Settings page. Click Copy. This will place your private key in the clipboard.\nIn Colab, go to the left pane and click on Secrets (🔑).\n\nStore HuggingFace Access Token under the name HF_TOKEN.\nStore Roboflow API Key under the name ROBOFLOW_API_KEY.\n\n\n\n\nSelect the runtime\nLet’s make sure that we have access to GPU. We can use nvidia-smi command to do that. In case of any problems navigate to Edit -> Notebook settings -> Hardware accelerator, set it to L4 GPU, and then click Save.\n\n!nvidia-smi\n\nTue Feb 11 14:44:13 2025 \n+-----------------------------------------------------------------------------------------+\n| NVIDIA-SMI 565.57.01 Driver Version: 565.57.01 CUDA Version: 12.7 |\n|-----------------------------------------+------------------------+----------------------+\n| GPU Name Persistence-M | Bus-Id Disp.A | Volatile Uncorr. ECC |\n| Fan Temp Perf Pwr:Usage/Cap | Memory-Usage | GPU-Util Compute M. |\n| | | MIG M. 
|\n|=========================================+========================+======================|\n| 0 NVIDIA A100-SXM4-80GB On | 00000000:01:00.0 Off | 0 |\n| N/A 38C P0 86W / 500W | 5MiB / 81920MiB | 0% Default |\n| | | Disabled |\n+-----------------------------------------+------------------------+----------------------+\n| 1 NVIDIA A100-SXM4-80GB On | 00000000:41:00.0 Off | 0 |\n| N/A 35C P0 63W / 500W | 5MiB / 81920MiB | 0% Default |\n| | | Disabled |\n+-----------------------------------------+------------------------+----------------------+\n| 2 NVIDIA A100-SXM4-80GB On | 00000000:81:00.0 Off | 0 |\n| N/A 35C P0 63W / 500W | 5MiB / 81920MiB | 0% Default |\n| | | Disabled |\n+-----------------------------------------+------------------------+----------------------+\n| 3 NVIDIA A100-SXM4-80GB On | 00000000:C1:00.0 Off | 0 |\n| N/A 34C P0 62W / 500W | 5MiB / 81920MiB | 0% Default |\n| | | Disabled |\n+-----------------------------------------+------------------------+----------------------+\n \n+-----------------------------------------------------------------------------------------+\n| Processes: |\n| GPU GI CI PID Type Process name GPU Memory |\n| ID ID Usage |\n|=========================================================================================|\n| No running processes found |\n+-----------------------------------------------------------------------------------------+\n\n\n\n\nDownload example data\nNOTE: Feel free to replace our example image with your own photo.\n\n!wget -q https://media.roboflow.com/notebooks/examples/dog.jpeg\n!ls -lh\n\ntotal 337M\n-rw-rw-r-- 1 patel_zeel patel_zeel 104K Jun 2 2023 dog.jpeg\n-rw-rw-r-- 1 patel_zeel patel_zeel 104K Jun 2 2023 dog.jpeg.1\n-rw-rw-r-- 1 patel_zeel patel_zeel 104K Jun 2 2023 dog.jpeg.2\n-rw-rw-r-- 1 patel_zeel patel_zeel 104K Jun 2 2023 dog.jpeg.3\n-rw-rw-r-- 1 patel_zeel patel_zeel 4.7M Jan 20 13:40 example.tiff\n-rw-rw-r-- 1 patel_zeel patel_zeel 4.3M Feb 11 14:44 
how-to-finetune-florence-2-on-detection-dataset.ipynb\ndrwxrwxr-x 7 patel_zeel patel_zeel 4.0K Feb 11 14:33 model_checkpoints\ndrwxrwxr-x 5 patel_zeel patel_zeel 4.0K Feb 11 00:54 poker-cards-4\ndrwxrwxr-x 4 patel_zeel patel_zeel 4.0K Jan 20 15:43 runs\n-rw-rw-r-- 1 patel_zeel patel_zeel 2.5M Jan 22 10:51 scratchpad.ipynb\ndrwxrwxr-x 4 patel_zeel patel_zeel 4.0K Jan 20 15:17 trench_width\ndrwxrwxr-x 7 patel_zeel patel_zeel 4.0K Jan 20 15:43 wandb\n-rw-rw-r-- 1 patel_zeel patel_zeel 41M Jan 20 15:36 yolo11m-obb.pt\n-rw-rw-r-- 1 patel_zeel patel_zeel 5.4M Jan 20 15:17 yolo11n.pt\n-rw-rw-r-- 1 patel_zeel patel_zeel 50M Jan 20 15:43 yolov8m.pt\n-rw-rw-r-- 1 patel_zeel patel_zeel 53M Jan 20 15:43 yolov8m-seg.pt\n-rw-rw-r-- 1 patel_zeel patel_zeel 23M Jan 20 15:14 yolov8s-obb.pt\n-rw-rw-r-- 1 patel_zeel patel_zeel 22M Jan 18 23:20 yolov8s.pt\n-rw-rw-r-- 1 patel_zeel patel_zeel 134M Jan 20 15:30 yolov8x-obb.pt\n\n\n\nEXAMPLE_IMAGE_PATH = \"dog.jpeg\"" }, { - "objectID": "posts/2022-05-14-iteratively_reweighted_least_squares.html#comparison", - "href": "posts/2022-05-14-iteratively_reweighted_least_squares.html#comparison", - "title": "Iteratively reweighted least squares (IRLS) logistic regression", - "section": "Comparison", - "text": "Comparison\n\nnaive_params_b = list(map(lambda x: x[\"b\"], history[\"params\"]))\nirls_params_b = list(map(lambda x: x[\"b\"], irls_history[\"params\"]))\n\nnaive_params_w = list(map(lambda x: x[\"w\"], history[\"params\"]))\nirls_params_w = list(map(lambda x: x[\"w\"], irls_history[\"params\"]))\n\n\nplt.plot(naive_params_b, \"o-\", label=\"Naive\")\nplt.plot(irls_params_b, label=\"IRLS\")\nplt.xlabel(\"Iterations\")\nplt.title(\"Bias\")\nplt.legend();\n\n\n\n\n\n\n\n\n\nplt.plot(naive_params_w, \"o-\", label=\"Naive\")\nplt.plot(irls_params_w, label=\"IRLS\")\nplt.xlabel(\"Iterations\")\nplt.title(\"Weights\")\nplt.legend();" + "objectID": "lab/how-to-finetune-florence-2-on-detection-dataset.html#download-and-configure-the-model", + 
"href": "lab/how-to-finetune-florence-2-on-detection-dataset.html#download-and-configure-the-model", + "title": "Fine-tuning Florence-2 on Object Detection Dataset", + "section": "Download and configure the model", + "text": "Download and configure the model\nLet’s download the model checkpoint and configure it so that you can fine-tune it later on.\n\n!pip install -q transformers flash_attn timm einops peft\n!pip install -q roboflow git+https://github.com/roboflow/supervision.git\n\n\n# @title Imports\n\nimport io\nimport os\nimport re\nimport json\nimport torch\nimport html\nimport base64\nimport itertools\n\nimport numpy as np\nimport supervision as sv\n\n# from google.colab import userdata\nfrom IPython.core.display import display, HTML\nfrom torch.utils.data import Dataset, DataLoader\nfrom transformers import (\n AdamW,\n AutoModelForCausalLM,\n AutoProcessor,\n get_scheduler\n)\nfrom tqdm import tqdm\nfrom typing import List, Dict, Any, Tuple, Generator\nfrom peft import LoraConfig, get_peft_model\nfrom PIL import Image\nfrom roboflow import Roboflow\n\nDeprecationWarning: Importing display from IPython.core.display is deprecated since IPython 7.14, please import from IPython.display\n\n\nLoad the model using AutoModelForCausalLM and the processor using AutoProcessor classes from the transformers library. Note that you need to pass trust_remote_code as True since this model is not a standard transformers model.\n\nCHECKPOINT = \"microsoft/Florence-2-base-ft\"\n# REVISION = 'refs/pr/6'\nDEVICE = torch.device(\"cuda\" if torch.cuda.is_available() else \"cpu\")\n\nmodel = AutoModelForCausalLM.from_pretrained(CHECKPOINT, trust_remote_code=True).to(DEVICE)\nprocessor = AutoProcessor.from_pretrained(CHECKPOINT, trust_remote_code=True)\n\nImporting from timm.models.layers is deprecated, please import via timm.layers\nFlorence2LanguageForConditionalGeneration has generative capabilities, as `prepare_inputs_for_generation` is explicitly overwritten. 
However, it doesn't directly inherit from `GenerationMixin`. From 👉v4.50👈 onwards, `PreTrainedModel` will NOT inherit from `GenerationMixin`, and this model will lose the ability to call `generate` and other related functions.\n - If you're using `trust_remote_code=True`, you can get rid of this warning by loading the model with an auto class. See https://huggingface.co/docs/transformers/en/model_doc/auto#auto-classes\n - If you are the owner of the model architecture code, please modify your model class such that it inherits from `GenerationMixin` (after `PreTrainedModel`, otherwise you'll get an exception).\n - If you are not the owner of the model architecture class, please contact the model code owner to update it." }, { - "objectID": "posts/CNPs_for_Images.html", - "href": "posts/CNPs_for_Images.html", - "title": "Conditional Neural Processes for Image Interpolation", - "section": "", - "text": "import os\nos.environ[\"CUDA_VISIBLE_DEVICES\"] = \"0\"\n# turn off preallocation by JAX\nos.environ[\"XLA_PYTHON_CLIENT_PREALLOCATE\"] = \"false\"\n\nimport numpy as np\nimport pandas as pd\n\nfrom tqdm import tqdm\nimport jax\nimport jax.numpy as jnp\nimport flax.linen as nn\n\nimport distrax as dx\n\nimport optax\n\n# load mnist dataset from tensorflow datasets\nimport tensorflow_datasets as tfds\n\nfrom sklearn.model_selection import train_test_split\n\nimport matplotlib.pyplot as plt\n# define initializers\ndef first_layer_init(key, shape, dtype=jnp.float32):\n num_input = shape[0] # reverse order compared to torch\n return jax.random.uniform(key, shape, dtype, minval=-1.0/num_input, maxval=1.0/num_input)\n\ndef other_layers_init(key, shape, dtype=jnp.float32):\n num_input = shape[0] # reverse order compared to torch\n return jax.random.uniform(key, shape, dtype, minval=-np.sqrt(6 / num_input)/30, maxval=np.sqrt(6 / num_input)/30)\n\nclass Encoder(nn.Module):\n features: list\n encoding_dims: int\n\n @nn.compact\n def __call__(self, x_context, y_context):\n x = 
jnp.hstack([x_context, y_context.reshape(x_context.shape[0], -1)])\n \n x = nn.Dense(self.features[0], kernel_init=first_layer_init, bias_init=first_layer_init)(x)\n x = jnp.sin(30*x)\n # x = nn.Dense(self.features[0])(x)\n # x = nn.relu(x)\n \n \n for n_features in self.features[1:]:\n x = nn.Dense(n_features, kernel_init=other_layers_init, bias_init=other_layers_init)(x)\n x = jnp.sin(30*x)\n # x = nn.Dense(n_features)(x)\n # x = nn.relu(x)\n\n x = nn.Dense(self.encoding_dims)(x)\n\n representation = x.mean(axis=0, keepdims=True) # option 1\n return representation # (1, encoding_dims)\n\nclass Decoder(nn.Module):\n features: list\n output_dim: int\n\n @nn.compact\n def __call__(self, representation, x):\n representation = jnp.repeat(representation, x.shape[0], axis=0)\n x = jnp.hstack([representation, x])\n \n x = nn.Dense(self.features[0], kernel_init=first_layer_init, bias_init=first_layer_init)(x)\n x = jnp.sin(30*x)\n # x = nn.Dense(self.features[0])(x)\n # x = nn.relu(x)\n\n for n_features in self.features[1:]:\n x = nn.Dense(n_features, kernel_init=other_layers_init, bias_init=other_layers_init)(x)\n x = jnp.sin(30*x)\n # x = nn.Dense(n_features)(x)\n # x = nn.relu(x)\n\n x = nn.Dense(self.output_dim*2)(x)\n loc, raw_scale = x[:, :self.output_dim], x[:, self.output_dim:]\n scale = jnp.exp(raw_scale)\n \n return loc, scale\n\nclass CNP(nn.Module):\n encoder_features: list\n encoding_dims: int\n decoder_features: list\n output_dim: int\n\n @nn.compact\n def __call__(self, x_context, y_context, x_target):\n representation = Encoder(self.encoder_features, self.encoding_dims)(x_context, y_context)\n loc, scale = Decoder(self.decoder_features, self.output_dim)(representation, x_target)\n return loc, scale\n\n def loss_fn(self, params, x_context, y_context, x_target, y_target):\n loc, scale = self.apply(params, x_context, y_context, x_target)\n predictive_distribution = dx.MultivariateNormalDiag(loc=loc, scale_diag=0.005+scale)\n return 
-predictive_distribution.log_prob(y_target)" + "objectID": "lab/how-to-finetune-florence-2-on-detection-dataset.html#run-inference-with-pre-trained-florence-2-model", + "href": "lab/how-to-finetune-florence-2-on-detection-dataset.html#run-inference-with-pre-trained-florence-2-model", + "title": "Fine-tuning Florence-2 on Object Detection Dataset", + "section": "Run inference with pre-trained Florence-2 model", + "text": "Run inference with pre-trained Florence-2 model\n\n# @title Example object detection inference\n\nimage = Image.open(EXAMPLE_IMAGE_PATH)\ntask = \"<OD>\"\ntext = \"<OD>\"\n\ninputs = processor(text=text, images=image, return_tensors=\"pt\").to(DEVICE)\ngenerated_ids = model.generate(\n input_ids=inputs[\"input_ids\"],\n pixel_values=inputs[\"pixel_values\"],\n max_new_tokens=1024,\n num_beams=3\n)\ngenerated_text = processor.batch_decode(generated_ids, skip_special_tokens=False)[0]\nresponse = processor.post_process_generation(generated_text, task=task, image_size=(image.width, image.height))\ndetections = sv.Detections.from_lmm(sv.LMM.FLORENCE_2, response, resolution_wh=image.size)\n\nbounding_box_annotator = sv.BoundingBoxAnnotator(color_lookup=sv.ColorLookup.INDEX)\nlabel_annotator = sv.LabelAnnotator(color_lookup=sv.ColorLookup.INDEX)\n\nimage = bounding_box_annotator.annotate(image, detections)\nimage = label_annotator.annotate(image, detections)\nimage.thumbnail((600, 600))\nimage\n\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. 
`BoundingBoxAnnotator` will be removed in supervision-0.26.0.\n\n\n\n\n\n\n\n\n\n\n# @title Example image captioning inference\n\nimage = Image.open(EXAMPLE_IMAGE_PATH)\ntask = \"<DETAILED_CAPTION>\"\ntext = \"<DETAILED_CAPTION>\"\n\ninputs = processor(text=text, images=image, return_tensors=\"pt\").to(DEVICE)\ngenerated_ids = model.generate(\n input_ids=inputs[\"input_ids\"],\n pixel_values=inputs[\"pixel_values\"],\n max_new_tokens=1024,\n num_beams=3\n)\ngenerated_text = processor.batch_decode(generated_ids, skip_special_tokens=False)[0]\nresponse = processor.post_process_generation(generated_text, task=task, image_size=(image.width, image.height))\nresponse\n\n{'<DETAILED_CAPTION>': 'In this image we can see a person wearing a bag and holding a dog. In the background there are buildings, poles and sky with clouds.'}\n\n\n\n# @title Example caption to phrase grounding inference\n\nimage = Image.open(EXAMPLE_IMAGE_PATH)\ntask = \"<CAPTION_TO_PHRASE_GROUNDING>\"\ntext = \"<CAPTION_TO_PHRASE_GROUNDING> Vehicle\"\n\ninputs = processor(text=text, images=image, return_tensors=\"pt\").to(DEVICE)\ngenerated_ids = model.generate(\n input_ids=inputs[\"input_ids\"],\n pixel_values=inputs[\"pixel_values\"],\n max_new_tokens=1024,\n num_beams=3\n)\ngenerated_text = processor.batch_decode(generated_ids, skip_special_tokens=False)[0]\nresponse = processor.post_process_generation(generated_text, task=task, image_size=(image.width, image.height))\ndetections = sv.Detections.from_lmm(sv.LMM.FLORENCE_2, response, resolution_wh=image.size)\n\nbounding_box_annotator = sv.BoundingBoxAnnotator(color_lookup=sv.ColorLookup.INDEX)\nlabel_annotator = sv.LabelAnnotator(color_lookup=sv.ColorLookup.INDEX)\n\nimage = bounding_box_annotator.annotate(image, detections)\nimage = label_annotator.annotate(image, detections)\nimage.thumbnail((600, 600))\nimage\n\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. 
`BoundingBoxAnnotator` will be removed in supervision-0.26.0." }, { - "objectID": "posts/CNPs_for_Images.html#load-mnist", - "href": "posts/CNPs_for_Images.html#load-mnist", - "title": "Conditional Neural Processes for Image Interpolation", - "section": "Load MNIST", - "text": "Load MNIST\n\nds = tfds.load('mnist')\n\n\ndef dataset_to_arrays(dataset):\n data = []\n labels = []\n stopper = 0\n end = 100\n for sample in dataset:\n data.append(sample[\"image\"].numpy())\n labels.append(sample[\"label\"].numpy())\n stopper += 1\n if stopper == end:\n break\n return np.array(data), np.array(labels)[..., None]\n\ntrain_data, train_labels = dataset_to_arrays(ds[\"train\"])\ntest_data, test_labels = dataset_to_arrays(ds[\"test\"])\n\ntrain_data.shape, train_labels.shape, test_data.shape, test_labels.shape\n\n2023-06-02 09:58:48.609001: W tensorflow/core/kernels/data/cache_dataset_ops.cc:856] The calling iterator did not fully read the dataset being cached. In order to avoid unexpected truncation of the dataset, the partially cached contents of the dataset will be discarded. This can happen if you have an input pipeline similar to `dataset.cache().take(k).repeat()`. You should use `dataset.take(k).cache().repeat()` instead.\n2023-06-02 09:58:48.681190: W tensorflow/core/kernels/data/cache_dataset_ops.cc:856] The calling iterator did not fully read the dataset being cached. In order to avoid unexpected truncation of the dataset, the partially cached contents of the dataset will be discarded. This can happen if you have an input pipeline similar to `dataset.cache().take(k).repeat()`. 
You should use `dataset.take(k).cache().repeat()` instead.\n\n\n((100, 28, 28, 1), (100, 1), (100, 28, 28, 1), (100, 1))\n\n\n\ncoords = np.linspace(-1, 1, 28)\nx, y = np.meshgrid(coords, coords)\ntrain_X = jnp.stack([x, y], axis=-1).reshape(-1, 2)\n\ntrain_y = jax.vmap(lambda x: x.reshape(-1, 1))(train_data) / 255.0\ntrain_X.shape, train_y.shape, type(train_X), type(train_y)\n\n((784, 2),\n (100, 784, 1),\n jaxlib.xla_extension.ArrayImpl,\n jaxlib.xla_extension.ArrayImpl)\n\n\n\niterations = 10000\n\ndef loss_fn(params, context_X, context_y, target_X, target_y):\n def loss_fn_per_sample(context_X, context_y, target_X, target_y):\n loc, scale = model.apply(params, context_X, context_y, target_X)\n # predictive_distribution = dx.MultivariateNormalDiag(loc=loc, scale_diag=scale)\n # return -predictive_distribution.log_prob(target_y)\n return jnp.square(loc.ravel() - target_y.ravel()).mean()\n \n return jax.vmap(loss_fn_per_sample, in_axes=(None, 0, None, 0))(context_X, context_y, target_X, target_y).mean()\n\nvalue_and_grad_fn = jax.jit(jax.value_and_grad(loss_fn))\nmodel = CNP([256]*2, 128, [256]*4, 1)\nparams = model.init(jax.random.PRNGKey(0), train_X, train_y[0], train_X)\noptimizer = optax.adam(1e-5)\nstate = optimizer.init(params)\n\n# losses = []\n# for iter in tqdm(range(iterations)):\n# tmp_index = jax.random.permutation(jax.random.PRNGKey(iter), index)\n# context_X = train_X[tmp_index][:int(train_X.shape[0]*0.05)]\n# context_y = train_y[:, tmp_index, :][:, :int(train_X.shape[0]*0.05), :]\n# target_X = train_X[tmp_index][int(train_X.shape[0]*0.05):]\n# target_y = train_y[:, tmp_index, :][:, int(train_X.shape[0]*0.05):, :]\n \n# # print(context_X.shape, context_y.shape, target_X.shape, target_y.shape)\n# # print(loss_fn(params, context_X, context_y, target_X, target_y).shape)\n \n# loss, grads = value_and_grad_fn(params, context_X, context_y, target_X, target_y)\n# updates, state = optimizer.update(grads, state)\n# params = optax.apply_updates(params, 
updates)\n# losses.append(loss.item())\n\ndef one_step(params_and_state, key):\n params, state = params_and_state\n tmp_index = jax.random.permutation(key, train_X.shape[0])\n context_X = train_X[tmp_index][:int(train_X.shape[0]*0.05)]\n context_y = train_y[:, tmp_index, :][:, :int(train_X.shape[0]*0.05), :]\n target_X = train_X[tmp_index][int(train_X.shape[0]*0.05):]\n target_y = train_y[:, tmp_index, :][:, int(train_X.shape[0]*0.05):, :]\n loss, grads = value_and_grad_fn(params, context_X, context_y, target_X, target_y)\n updates, state = optimizer.update(grads, state)\n params = optax.apply_updates(params, updates)\n return (params, state), loss\n\n(params, state), loss_history = jax.lax.scan(one_step, (params, state), jax.random.split(jax.random.PRNGKey(0), iterations))\n\n\nplt.plot(loss_history[10:]);\n\n\n\n\n\n\n\n\n\ntest_key = jax.random.PRNGKey(0)\ntmp_index = jax.random.permutation(test_key, train_X.shape[0])\ncontext_X = train_X[tmp_index][:int(train_X.shape[0]*0.5)]\ncontext_y = train_y[:, tmp_index, :][:, :int(train_X.shape[0]*0.5), :]\ntarget_X = train_X#[tmp_index][int(train_X.shape[0]*0.5):]\ntarget_y = train_y#[:, tmp_index, :][:, int(train_X.shape[0]*0.5):, :]\n\nid = 91\nplt.imshow(train_y[id].reshape(28, 28), cmap=\"gray\", interpolation=None);\n\nlocs, scales = jax.vmap(model.apply, in_axes=(None, None, 0, None))(params, context_X, context_y, target_X)\n# full_preds = jnp.concatenate([context_y, locs], axis=1)\n# full_preds = full_preds.at[:, tmp_index, :].set(full_preds).__array__()\n\nplt.figure()\nplt.imshow(locs[id].reshape(28, 28), cmap=\"gray\", interpolation=None);" + "objectID": "lab/how-to-finetune-florence-2-on-detection-dataset.html#fine-tune-florence-2-on-custom-dataset", + "href": "lab/how-to-finetune-florence-2-on-detection-dataset.html#fine-tune-florence-2-on-custom-dataset", + "title": "Fine-tuning Florence-2 on Object Detection Dataset", + "section": "Fine-tune Florence-2 on custom dataset", + "text": "Fine-tune Florence-2 on 
custom dataset\n\nDownload dataset from Roboflow Universe\n\nROBOFLOW_API_KEY = os.getenv(\"ROBOFLOW_API_KEY\")\nrf = Roboflow(api_key=ROBOFLOW_API_KEY)\n\nproject = rf.workspace(\"roboflow-jvuqo\").project(\"poker-cards-fmjio\")\nversion = project.version(4)\ndataset = version.download(\"florence2-od\")\n\nloading Roboflow workspace...\nloading Roboflow project...\n\n\n\n!head -n 5 {dataset.location}/train/annotations.jsonl\n\n{\"image\": \"IMG_20220316_172418_jpg.rf.e3cb4a86dc0247e71e3697aa3e9db923.jpg\", \"prefix\": \"<OD>\", \"suffix\": \":!pg!dmvct<loc_138><loc_100><loc_470><loc_448>21!pg!dmvct<loc_388><loc_145><loc_670><loc_453>kbdl!!pg!dmvct<loc_566><loc_166><loc_823><loc_432>rvffo!pg!dmvct<loc_365><loc_465><loc_765><loc_999>ljoh!pg!dmvct<loc_601><loc_440><loc_949><loc_873>\"}\n{\"image\": \"IMG_20220316_171515_jpg.rf.e3b1932bb375b3b3912027647586daa8.jpg\", \"prefix\": \"<OD>\", \"suffix\": \"6!pg!dmvct<loc_554><loc_2><loc_763><loc_467>7!pg!dmvct<loc_399><loc_79><loc_555><loc_466>8!pg!dmvct<loc_363><loc_484><loc_552><loc_905>9!pg!dmvct<loc_535><loc_449><loc_757><loc_971>\"}\n{\"image\": \"IMG_20220316_165139_jpg.rf.e30257ec169a2bfdfecb693211d37250.jpg\", \"prefix\": \"<OD>\", \"suffix\": \":!pg!ejbnpoet<loc_596><loc_535><loc_859><loc_982>kbdl!pg!ejbnpoet<loc_211><loc_546><loc_411><loc_880>rvffo!pg!ejbnpoet<loc_430><loc_34><loc_692><loc_518>ljoh!pg!ejbnpoet<loc_223><loc_96><loc_451><loc_523>21!pg!ejbnpoet<loc_387><loc_542><loc_604><loc_925>\"}\n{\"image\": \"IMG_20220316_143407_jpg.rf.e1eb3be3efc6c3bbede436cfb5489e7c.jpg\", \"prefix\": \"<OD>\", \"suffix\": \"bdf!pg!ifbsut<loc_345><loc_315><loc_582><loc_721>3!pg!ifbsut<loc_709><loc_115><loc_888><loc_509>4!pg!ifbsut<loc_529><loc_228><loc_735><loc_613>5!pg!ifbsut<loc_98><loc_421><loc_415><loc_845>\"}\n{\"image\": \"IMG_20220316_165139_jpg.rf.e4c229a9128494d17992cbe88af575df.jpg\", \"prefix\": \"<OD>\", \"suffix\": 
\":!pg!ejbnpoet<loc_141><loc_18><loc_404><loc_465>kbdl!pg!ejbnpoet<loc_589><loc_120><loc_789><loc_454>rvffo!pg!ejbnpoet<loc_308><loc_482><loc_570><loc_966>ljoh!pg!ejbnpoet<loc_549><loc_477><loc_777><loc_904>21!pg!ejbnpoet<loc_396><loc_75><loc_613><loc_458>\"}\n\n\nhuggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...\nTo disable this warning, you can either:\n - Avoid using `tokenizers` before the fork if possible\n - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)\n\n\n\n# read jsonl file\ndef read_jsonl(file_path: str) -> Generator[Dict[str, Any], None, None]:\n with open(file_path, \"r\") as f:\n for line in f:\n yield json.loads(line)\n\nlines = []\nsplit = \"test\" \nfor line in read_jsonl(dataset.location + f\"/{split}/annotations.jsonl.backup\"):\n # print(line)\n # edit = True\n # copied_line = list(line['suffix'])\n # for i in range(len(copied_line)):\n # if copied_line[i] == \"<\":\n # edit = False\n # elif copied_line[i] == \">\":\n # edit = True\n # else:\n # if edit:\n # copied_line[i] = chr(ord(copied_line[i]) + 1)\n # copied_line = \"\".join(copied_line)\n # line['suffix'] = copied_line\n \n line['suffix'] = line['suffix'].replace(\"club\", \"dog\").replace(\"diamond\", \"cat\").replace(\"heart\", \"bird\").replace(\"spade\", \"fish\")\n print(line)\n lines.append(line)\n\nwith open(dataset.location + f\"/{split}/annotations.jsonl\", \"w\") as f:\n for line in lines:\n f.write(json.dumps(line) + \"\\n\")\n\n{'image': 'IMG_20220316_140255_jpg.rf.0d10768652a0f20bea317e96632d3448.jpg', 'prefix': '<OD>', 'suffix': '5 of fishs<loc_146><loc_488><loc_541><loc_917>6 of fishs<loc_259><loc_221><loc_604><loc_673>7 of fishs<loc_470><loc_206><loc_761><loc_741>8 of fishs<loc_599><loc_201><loc_949><loc_732>'}\n{'image': 'IMG_20220316_140400_jpg.rf.3f21e54bd916b05218202fbf109d8a5f.jpg', 'prefix': '<OD>', 'suffix': '5 of 
fishs<loc_127><loc_280><loc_378><loc_597>7 of fishs<loc_244><loc_415><loc_539><loc_904>8 of fishs<loc_414><loc_98><loc_792><loc_513>6 of fishs<loc_450><loc_469><loc_847><loc_999>'}\n{'image': 'IMG_20220316_144511_jpg.rf.40ee049c8f854c558e2ca20f90be3787.jpg', 'prefix': '<OD>', 'suffix': '5 of birds<loc_49><loc_508><loc_221><loc_891>6 of birds<loc_205><loc_391><loc_384><loc_832>7 of birds<loc_387><loc_281><loc_621><loc_777>8 of birds<loc_615><loc_100><loc_971><loc_677>'}\n{'image': 'IMG_20220316_144657_jpg.rf.0b7bb9ab4b594b83097af9c4c1ea46c3.jpg', 'prefix': '<OD>', 'suffix': '5 of birds<loc_643><loc_238><loc_870><loc_672>6 of birds<loc_184><loc_34><loc_555><loc_468>7 of birds<loc_137><loc_531><loc_527><loc_989>8 of birds<loc_517><loc_521><loc_753><loc_982>'}\n{'image': 'IMG_20220316_172435_jpg.rf.854fe0c471b03fe3a3894a3d2cbe00d0.jpg', 'prefix': '<OD>', 'suffix': '9 of dogs<loc_76><loc_146><loc_420><loc_702>10 of dogs<loc_353><loc_111><loc_586><loc_570>jack of dogs<loc_540><loc_59><loc_724><loc_460>queen of dogs<loc_336><loc_550><loc_586><loc_998>king of dogs<loc_525><loc_456><loc_717><loc_852>'}\n{'image': 'IMG_20220316_161455_jpg.rf.635ccd0ee9f7dd762009f539d6b998e9.jpg', 'prefix': '<OD>', 'suffix': '9 of birds<loc_22><loc_495><loc_367><loc_878>10 of birds<loc_423><loc_33><loc_653><loc_428>jack of birds<loc_170><loc_136><loc_445><loc_563>queen of birds<loc_390><loc_470><loc_687><loc_791>king of birds<loc_666><loc_207><loc_938><loc_643>'}\n{'image': 'IMG_20220316_161313_jpg.rf.12498ecbc8985a8ff65bd2033d8f622a.jpg', 'prefix': '<OD>', 'suffix': '9 of birds<loc_165><loc_191><loc_374><loc_532>10 of birds<loc_361><loc_115><loc_584><loc_471>jack of birds<loc_527><loc_212><loc_912><loc_646>queen of birds<loc_375><loc_616><loc_692><loc_988>king of birds<loc_216><loc_451><loc_463><loc_846>'}\n{'image': 'IMG_20220316_142643_jpg.rf.13b0a2a65a9d4b17580b39ed19de5bba.jpg', 'prefix': '<OD>', 'suffix': 'ace of birds<loc_180><loc_323><loc_416><loc_725>2 of 
birds<loc_374><loc_577><loc_671><loc_946>3 of birds<loc_300><loc_16><loc_572><loc_485>4 of birds<loc_535><loc_214><loc_973><loc_769>'}\n{'image': 'IMG_20220316_144650_jpg.rf.34b246c8ee646cbb5979a35d68c58901.jpg', 'prefix': '<OD>', 'suffix': '5 of birds<loc_606><loc_265><loc_919><loc_746>6 of birds<loc_258><loc_2><loc_689><loc_439>7 of birds<loc_1><loc_295><loc_330><loc_841>8 of birds<loc_271><loc_352><loc_646><loc_875>'}\n{'image': 'IMG_20220316_161217_jpg.rf.1755e5fefb14ca6df49690604289bb46.jpg', 'prefix': '<OD>', 'suffix': '9 of birds<loc_20><loc_195><loc_275><loc_491>10 of birds<loc_268><loc_188><loc_490><loc_495>jack of birds<loc_491><loc_152><loc_735><loc_452>queen of birds<loc_713><loc_145><loc_991><loc_441>king of birds<loc_298><loc_480><loc_607><loc_935>'}\n{'image': 'IMG_20220316_144711_jpg.rf.19a6cc13f83f27b45e10a6056bb25721.jpg', 'prefix': '<OD>', 'suffix': '5 of birds<loc_738><loc_233><loc_997><loc_663>6 of birds<loc_20><loc_202><loc_258><loc_630>7 of birds<loc_476><loc_220><loc_734><loc_649>8 of birds<loc_264><loc_205><loc_491><loc_630>'}\n{'image': 'IMG_20220316_164257_jpg.rf.0c3abfccf0f7f147946c89251b87f598.jpg', 'prefix': '<OD>', 'suffix': '5 of cats<loc_49><loc_312><loc_290><loc_621>6 of cats<loc_681><loc_245><loc_920><loc_554>7 of cats<loc_242><loc_202><loc_492><loc_559>8 of cats<loc_468><loc_261><loc_687><loc_592>'}\n{'image': 'IMG_20220316_170755_jpg.rf.71804c21e1c7681d5b656427449e0a2a.jpg', 'prefix': '<OD>', 'suffix': 'ace of dogs<loc_47><loc_95><loc_391><loc_754>2 of dogs<loc_577><loc_238><loc_737><loc_674>3 of dogs<loc_723><loc_348><loc_861><loc_708>4 of dogs<loc_366><loc_189><loc_602><loc_711>'}\n{'image': 'IMG_20220316_141442_jpg.rf.41913768e5d56c57566ee3b45391470d.jpg', 'prefix': '<OD>', 'suffix': '9 of fishs<loc_509><loc_233><loc_768><loc_603>10 of fishs<loc_202><loc_86><loc_530><loc_413>jack of fishs<loc_597><loc_476><loc_836><loc_895>queen of fishs<loc_370><loc_441><loc_635><loc_916>king of 
fishs<loc_145><loc_312><loc_482><loc_782>'}\n{'image': 'IMG_20220316_140723_jpg.rf.826bc2b212c4001c11115ef28f58d142.jpg', 'prefix': '<OD>', 'suffix': '6 of fishs<loc_23><loc_341><loc_329><loc_713>7 of fishs<loc_228><loc_259><loc_442><loc_527>5 of fishs<loc_438><loc_254><loc_709><loc_555>8 of fishs<loc_631><loc_284><loc_999><loc_652>'}\n{'image': 'IMG_20220316_134701_jpg.rf.27aa29de9d6012ae05c64b156f7c07b8.jpg', 'prefix': '<OD>', 'suffix': 'ace of fishs<loc_197><loc_114><loc_489><loc_480>2 of fishs<loc_270><loc_581><loc_538><loc_999>3 of fishs<loc_470><loc_398><loc_718><loc_782>4 of fishs<loc_580><loc_213><loc_824><loc_543>'}\n{'image': 'IMG_20220316_163749_jpg.rf.3c41ec25d2a8390c760eb7fcf2d2466b.jpg', 'prefix': '<OD>', 'suffix': '5 of cats<loc_0><loc_276><loc_289><loc_582>6 of cats<loc_256><loc_286><loc_497><loc_566>7 of cats<loc_485><loc_282><loc_751><loc_559>8 of cats<loc_698><loc_254><loc_995><loc_548>'}\n{'image': 'IMG_20220316_165206_jpg.rf.1e20afbea0b132989e944ffbd800f348.jpg', 'prefix': '<OD>', 'suffix': '9 of cats<loc_57><loc_220><loc_401><loc_691>jack of cats<loc_420><loc_30><loc_713><loc_468>queen of cats<loc_266><loc_490><loc_563><loc_971>king of cats<loc_397><loc_422><loc_797><loc_948>10 of cats<loc_217><loc_74><loc_500><loc_557>'}\n{'image': 'IMG_20220316_144500_jpg.rf.14c3cb5eadd6c3d449916f52d9f9381e.jpg', 'prefix': '<OD>', 'suffix': '5 of birds<loc_43><loc_27><loc_396><loc_608>6 of birds<loc_348><loc_191><loc_610><loc_702>7 of birds<loc_577><loc_326><loc_790><loc_785>8 of birds<loc_795><loc_452><loc_973><loc_883>'}\n{'image': 'IMG_20220316_172537_jpg.rf.8fe076e115c111f04732f8e7f778e51d.jpg', 'prefix': '<OD>', 'suffix': '9 of dogs<loc_635><loc_348><loc_998><loc_738>10 of dogs<loc_1><loc_445><loc_320><loc_852>jack of dogs<loc_267><loc_177><loc_467><loc_409>queen of dogs<loc_473><loc_141><loc_722><loc_358>king of dogs<loc_340><loc_380><loc_634><loc_788>'}\n{'image': 'IMG_20220316_144652_jpg.rf.316d0dad84d3d696fa8fa3caf53fb700.jpg', 'prefix': '<OD>', 
'suffix': '5 of birds<loc_451><loc_505><loc_820><loc_980>6 of birds<loc_519><loc_101><loc_773><loc_495>7 of birds<loc_177><loc_68><loc_452><loc_427>8 of birds<loc_291><loc_301><loc_559><loc_730>'}\n{'image': 'IMG_20220316_173229_jpg.rf.1b93cc9a66ca2d0a28820a7dae74222a.jpg', 'prefix': '<OD>', 'suffix': '9 of dogs<loc_514><loc_426><loc_769><loc_895>10 of dogs<loc_650><loc_168><loc_978><loc_680>jack of dogs<loc_151><loc_368><loc_432><loc_865>queen of dogs<loc_365><loc_82><loc_657><loc_546>king of dogs<loc_67><loc_272><loc_327><loc_786>'}\n{'image': 'IMG_20220316_171916_jpg.rf.7c8b85f64455e815acc30b5111422bf2.jpg', 'prefix': '<OD>', 'suffix': '5 of dogs<loc_402><loc_294><loc_713><loc_806>6 of dogs<loc_175><loc_587><loc_334><loc_982>7 of dogs<loc_509><loc_5><loc_893><loc_552>8 of dogs<loc_273><loc_414><loc_491><loc_867>'}\n{'image': 'IMG_20220316_165737_jpg.rf.11f2e19b001c300ee820e093b135409a.jpg', 'prefix': '<OD>', 'suffix': '9 of cats<loc_141><loc_350><loc_441><loc_848>jack of cats<loc_257><loc_107><loc_513><loc_429>queen of cats<loc_354><loc_405><loc_688><loc_926>king of cats<loc_201><loc_223><loc_477><loc_597>10 of cats<loc_519><loc_114><loc_783><loc_467>'}\n{'image': 'IMG_20220316_170523_jpg.rf.a106d534bf3279ec40e771452a142c1e.jpg', 'prefix': '<OD>', 'suffix': 'ace of dogs<loc_1><loc_295><loc_290><loc_616>2 of dogs<loc_250><loc_296><loc_509><loc_604>3 of dogs<loc_493><loc_275><loc_765><loc_588>4 of dogs<loc_700><loc_261><loc_999><loc_577>'}\n{'image': 'IMG_20220316_171810_jpg.rf.be96dac3cbdda6973a920a0a787b33f2.jpg', 'prefix': '<OD>', 'suffix': '5 of dogs<loc_546><loc_83><loc_699><loc_530>6 of dogs<loc_301><loc_438><loc_610><loc_906>7 of dogs<loc_363><loc_66><loc_638><loc_515>8 of dogs<loc_238><loc_163><loc_604><loc_547>'}\n{'image': 'IMG_20220316_171557_jpg.rf.9c6f913ba56e578ef4c31cc3faffcf7d.jpg', 'prefix': '<OD>', 'suffix': '5 of dogs<loc_289><loc_130><loc_556><loc_277>6 of dogs<loc_80><loc_391><loc_465><loc_838>7 of dogs<loc_255><loc_213><loc_659><loc_579>8 of 
dogs<loc_438><loc_258><loc_680><loc_603>'}\n{'image': 'IMG_20220316_140335_jpg.rf.c3310d8f13f66189440daf8419b1ad9c.jpg', 'prefix': '<OD>', 'suffix': '7 of fishs<loc_85><loc_452><loc_295><loc_678>6 of fishs<loc_291><loc_413><loc_488><loc_648>8 of fishs<loc_491><loc_405><loc_696><loc_626>5 of fishs<loc_708><loc_418><loc_961><loc_652>'}\n{'image': 'IMG_20220316_135021_jpg.rf.d038afdef1a927103dae268ff392888f.jpg', 'prefix': '<OD>', 'suffix': 'ace of fishs<loc_514><loc_234><loc_814><loc_523>2 of fishs<loc_326><loc_506><loc_596><loc_916>3 of fishs<loc_601><loc_515><loc_921><loc_937>4 of fishs<loc_219><loc_22><loc_528><loc_347>'}\n{'image': 'IMG_20220316_163800_jpg.rf.e1ad0b7b78e379d5f9f0bec787a9050b.jpg', 'prefix': '<OD>', 'suffix': '5 of cats<loc_26><loc_100><loc_376><loc_752>6 of cats<loc_377><loc_223><loc_612><loc_725>7 of cats<loc_591><loc_290><loc_779><loc_712>8 of cats<loc_760><loc_336><loc_916><loc_702>'}\n{'image': 'IMG_20220316_165711_jpg.rf.c64b84b61322947a5a8c7545e214278c.jpg', 'prefix': '<OD>', 'suffix': '9 of cats<loc_437><loc_86><loc_609><loc_363>10 of cats<loc_217><loc_123><loc_434><loc_458>jack of cats<loc_69><loc_504><loc_395><loc_995>queen of cats<loc_366><loc_530><loc_610><loc_951>king of cats<loc_553><loc_425><loc_745><loc_759>'}\n{'image': 'IMG_20220316_141917_jpg.rf.b3075e4161fe5fa2285d75bb2da3bc7a.jpg', 'prefix': '<OD>', 'suffix': '9 of fishs<loc_682><loc_295><loc_951><loc_709>10 of fishs<loc_63><loc_502><loc_293><loc_807>jack of fishs<loc_462><loc_345><loc_727><loc_729>queen of fishs<loc_256><loc_486><loc_497><loc_819>king of fishs<loc_113><loc_191><loc_411><loc_488>'}\n{'image': 'IMG_20220316_144507_jpg.rf.ab9b95792bbdbe7694910967641fba38.jpg', 'prefix': '<OD>', 'suffix': '5 of birds<loc_1><loc_339><loc_343><loc_888>6 of birds<loc_327><loc_326><loc_563><loc_773>7 of birds<loc_573><loc_320><loc_810><loc_735>8 of birds<loc_784><loc_320><loc_999><loc_684>'}\n{'image': 'IMG_20220316_161515_jpg.rf.b22dd5d2b8037f009a9e65052d2d3b5c.jpg', 'prefix': 
'<OD>', 'suffix': '9 of birds<loc_91><loc_238><loc_309><loc_620>10 of birds<loc_708><loc_179><loc_900><loc_532>jack of birds<loc_470><loc_161><loc_670><loc_525>queen of birds<loc_386><loc_495><loc_689><loc_816>king of birds<loc_282><loc_163><loc_490><loc_532>'}\n{'image': 'IMG_20220316_164227_jpg.rf.e42455b79bae041d959f90597b7065bc.jpg', 'prefix': '<OD>', 'suffix': '5 of cats<loc_59><loc_326><loc_274><loc_720>6 of cats<loc_740><loc_223><loc_955><loc_625>7 of cats<loc_260><loc_206><loc_523><loc_639>8 of cats<loc_485><loc_252><loc_760><loc_695>'}\n{'image': 'IMG_20220316_135012_jpg.rf.ee7179374d33235528db011cb5418226.jpg', 'prefix': '<OD>', 'suffix': 'ace of fishs<loc_290><loc_323><loc_573><loc_639>2 of fishs<loc_330><loc_652><loc_617><loc_986>3 of fishs<loc_570><loc_447><loc_856><loc_816>4 of fishs<loc_375><loc_49><loc_658><loc_361>'}\n{'image': 'IMG_20220316_161525_jpg.rf.f7962cdc50e0a3cd03e4c081a1d2a67a.jpg', 'prefix': '<OD>', 'suffix': '9 of birds<loc_55><loc_363><loc_334><loc_798>10 of birds<loc_607><loc_155><loc_863><loc_418>jack of birds<loc_400><loc_204><loc_670><loc_516>queen of birds<loc_423><loc_432><loc_778><loc_795>king of birds<loc_224><loc_244><loc_499><loc_602>'}\n{'image': 'IMG_20220316_141940_jpg.rf.e0ca71cfc8dc86a6624c3f90fe6c5e9e.jpg', 'prefix': '<OD>', 'suffix': '9 of fishs<loc_680><loc_333><loc_922><loc_778>10 of fishs<loc_105><loc_438><loc_451><loc_920>jack of fishs<loc_493><loc_230><loc_709><loc_655>queen of fishs<loc_219><loc_227><loc_489><loc_654>king of fishs<loc_366><loc_412><loc_728><loc_796>'}\n{'image': 'IMG_20220316_165221_jpg.rf.a5188caf60a7bd5e8c6dcf92d68f9505.jpg', 'prefix': '<OD>', 'suffix': '9 of cats<loc_43><loc_241><loc_334><loc_550>jack of cats<loc_470><loc_132><loc_757><loc_420>queen of cats<loc_347><loc_275><loc_641><loc_586>king of cats<loc_295><loc_511><loc_623><loc_948>10 of cats<loc_596><loc_446><loc_906><loc_757>'}\n{'image': 'IMG_20220316_162828_jpg.rf.db7557485f5c3e5ce01b2e1ab3d4621e.jpg', 'prefix': '<OD>', 'suffix': 
'ace of cats<loc_34><loc_73><loc_348><loc_538>2 of cats<loc_277><loc_523><loc_555><loc_978>3 of cats<loc_318><loc_91><loc_659><loc_498>4 of cats<loc_539><loc_447><loc_838><loc_917>'}\n{'image': 'IMG_20220316_171936_jpg.rf.b6e31b1cc6b5e14dc66462becfa4a63d.jpg', 'prefix': '<OD>', 'suffix': '5 of dogs<loc_329><loc_499><loc_623><loc_984>6 of dogs<loc_353><loc_2><loc_748><loc_355>7 of dogs<loc_390><loc_238><loc_699><loc_731>8 of dogs<loc_38><loc_416><loc_420><loc_784>'}\n{'image': 'IMG_20220316_172800_jpg.rf.e63a3bae897cbf27dccbf969f543bb6f.jpg', 'prefix': '<OD>', 'suffix': '9 of dogs<loc_708><loc_168><loc_906><loc_501>10 of dogs<loc_42><loc_430><loc_409><loc_877>jack of dogs<loc_481><loc_126><loc_678><loc_445>queen of dogs<loc_232><loc_158><loc_440><loc_488>king of dogs<loc_448><loc_459><loc_737><loc_773>'}\n{'image': 'IMG_20220316_171653_jpg.rf.637e380597f1d844654a33f4a3555471.jpg', 'prefix': '<OD>', 'suffix': '5 of dogs<loc_102><loc_473><loc_418><loc_800>6 of dogs<loc_423><loc_495><loc_798><loc_862>7 of dogs<loc_466><loc_100><loc_816><loc_400>8 of dogs<loc_173><loc_157><loc_466><loc_480>'}\n{'image': 'IMG_20220316_140241_jpg.rf.c44522806b5455bfb03a638aa3ffa896.jpg', 'prefix': '<OD>', 'suffix': '5 of fishs<loc_300><loc_409><loc_405><loc_673>6 of fishs<loc_372><loc_380><loc_489><loc_688>7 of fishs<loc_456><loc_334><loc_623><loc_747>8 of fishs<loc_598><loc_236><loc_880><loc_830>'}\n\n\n\n# @title Define `DetectionsDataset` class\n\nclass JSONLDataset:\n def __init__(self, jsonl_file_path: str, image_directory_path: str):\n self.jsonl_file_path = jsonl_file_path\n self.image_directory_path = image_directory_path\n self.entries = self._load_entries()\n\n def _load_entries(self) -> List[Dict[str, Any]]:\n entries = []\n with open(self.jsonl_file_path, 'r') as file:\n for line in file:\n data = json.loads(line)\n entries.append(data)\n return entries\n\n def __len__(self) -> int:\n return len(self.entries)\n\n def __getitem__(self, idx: int) -> Tuple[Image.Image, Dict[str, 
Any]]:\n if idx < 0 or idx >= len(self.entries):\n raise IndexError(\"Index out of range\")\n\n entry = self.entries[idx]\n image_path = os.path.join(self.image_directory_path, entry['image'])\n try:\n image = Image.open(image_path)\n return (image, entry)\n except FileNotFoundError:\n raise FileNotFoundError(f\"Image file {image_path} not found.\")\n\n\nclass DetectionDataset(Dataset):\n def __init__(self, jsonl_file_path: str, image_directory_path: str):\n self.dataset = JSONLDataset(jsonl_file_path, image_directory_path)\n\n def __len__(self):\n return len(self.dataset)\n\n def __getitem__(self, idx):\n image, data = self.dataset[idx]\n prefix = data['prefix']\n suffix = data['suffix']\n return prefix, suffix, image\n\n\n# @title Initialize `DetectionDataset` and `DataLoader` for train and validation subsets\n\nBATCH_SIZE = 6\nNUM_WORKERS = 0\n\ndef collate_fn(batch):\n questions, answers, images = zip(*batch)\n inputs = processor(text=list(questions), images=list(images), return_tensors=\"pt\", padding=True).to(DEVICE)\n return inputs, answers\n\ntrain_dataset = DetectionDataset(\n jsonl_file_path = f\"{dataset.location}/train/annotations.jsonl\",\n image_directory_path = f\"{dataset.location}/train/\"\n)\nval_dataset = DetectionDataset(\n jsonl_file_path = f\"{dataset.location}/valid/annotations.jsonl\",\n image_directory_path = f\"{dataset.location}/valid/\"\n)\n\ntrain_loader = DataLoader(train_dataset, batch_size=BATCH_SIZE, collate_fn=collate_fn, num_workers=NUM_WORKERS, shuffle=True)\nval_loader = DataLoader(val_dataset, batch_size=BATCH_SIZE, collate_fn=collate_fn, num_workers=NUM_WORKERS)\n\n\n# @title Setup LoRA Florence-2 model\n\nconfig = LoraConfig(\n r=8,\n lora_alpha=8,\n target_modules=[\"q_proj\", \"o_proj\", \"k_proj\", \"v_proj\", \"linear\", \"Conv2d\", \"lm_head\", \"fc2\"],\n task_type=\"CAUSAL_LM\",\n lora_dropout=0.05,\n bias=\"none\",\n inference_mode=False,\n use_rslora=True,\n init_lora_weights=\"gaussian\",\n 
)\n\npeft_model = get_peft_model(model, config)\npeft_model.print_trainable_parameters()\n\ntrainable params: 1,929,928 || all params: 272,733,896 || trainable%: 0.7076\n\n\n\ntorch.cuda.empty_cache()\n\n\n# @title Run inference with pre-trained Florence-2 model on validation dataset\n\ndef render_inline(image: Image.Image, resize=(128, 128)):\n \"\"\"Convert image into inline html.\"\"\"\n image = image.resize(resize)\n with io.BytesIO() as buffer:\n image.save(buffer, format='jpeg')\n image_b64 = str(base64.b64encode(buffer.getvalue()), \"utf-8\")\n return f\"data:image/jpeg;base64,{image_b64}\"\n\n\ndef render_example(image: Image.Image, response):\n try:\n detections = sv.Detections.from_lmm(sv.LMM.FLORENCE_2, response, resolution_wh=image.size)\n image = sv.BoundingBoxAnnotator(color_lookup=sv.ColorLookup.INDEX).annotate(image.copy(), detections)\n image = sv.LabelAnnotator(color_lookup=sv.ColorLookup.INDEX).annotate(image, detections)\n except:\n print('failed to render model response')\n return f\"\"\"\n<div style=\"display: inline-flex; align-items: center; justify-content: center;\">\n <img style=\"width:256px; height:256px;\" src=\"{render_inline(image, resize=(128, 128))}\" />\n <p style=\"width:512px; margin:10px; font-size:small;\">{html.escape(json.dumps(response))}</p>\n</div>\n\"\"\"\n\n\ndef render_inference_results(model, dataset: DetectionDataset, count: int):\n html_out = \"\"\n count = min(count, len(dataset))\n for i in range(count):\n image, data = dataset.dataset[i]\n prefix = data['prefix']\n suffix = data['suffix']\n inputs = processor(text=prefix, images=image, return_tensors=\"pt\").to(DEVICE)\n generated_ids = model.generate(\n input_ids=inputs[\"input_ids\"],\n pixel_values=inputs[\"pixel_values\"],\n max_new_tokens=1024,\n num_beams=3\n )\n generated_text = processor.batch_decode(generated_ids, skip_special_tokens=False)[0]\n answer = processor.post_process_generation(generated_text, task='<OD>', 
image_size=image.size)\n html_out += render_example(image, answer)\n\n display(HTML(html_out))\n\nrender_inference_results(peft_model, val_dataset, 4)\n\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\n\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[0.3199999928474426, 0.3199999928474426, 639.0399780273438, 639.0399780273438]], \"labels\": [\"bed\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[0.3199999928474426, 0.3199999928474426, 639.0399780273438, 639.0399780273438]], \"labels\": [\"table\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[0.3199999928474426, 0.3199999928474426, 639.0399780273438, 639.0399780273438]], \"labels\": [\"chair\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[0.3199999928474426, 0.3199999928474426, 639.0399780273438, 639.0399780273438]], \"labels\": [\"furniture\"]}}"
  },
  {
    "objectID": "posts/climate-modeling-with-SpecialGP.html",
    "href": "posts/climate-modeling-with-SpecialGP.html",
    "title": "Climate Modeling with GPs",
    "section": "",
    "text": "import os\nos.environ[\"CUDA_VISIBLE_DEVICES\"] = \"0\"\n\nimport pyproj\nimport numpy as np\nimport torch\nimport xarray as xr\n\nimport tensorflow as tf\nfrom tensorflow.keras import activations, initializers, layers\nfrom tensorflow.keras.applications import ResNet50\n\nfrom skgpytorch.models import GPRegression\n\nimport matplotlib.pyplot as plt\n\n\n# def haversine(lon1, lat1, lon2, lat2):\n# \"\"\"\n# Calculate the great circle distance in kilometers between two points \n# on the earth (specified in decimal degrees)\n# 
\"\"\"\n# # convert decimal degrees to radians \n# lon1, lat1, lon2, lat2 = map(np.radians, [lon1, lat1, lon2, lat2])\n\n# # haversine formula \n# dlon = lon2 - lon1 \n# dlat = lat2 - lat1 \n# a = np.sin(dlat/2)**2 + np.cos(lat1) * np.cos(lat2) * np.sin(dlon/2)**2\n# c = 2 * np.arcsin(np.sqrt(a)) \n# r = 6371 # Radius of earth in kilometers. Use 3956 for miles. Determines return value units.\n# return c * r\n\n# def new_coords(lat1, long1):\n# new_lat1 = haversine(0, 0, 0, lat1)\n# new_long1 = haversine(0, 0, long1, 0)\n# return new_lat1, new_long1\n\ndef lat_long_to_cartesian(latitude, longitude):\n # Convert latitude and longitude to radians\n phi = np.radians(latitude)\n lam = np.radians(longitude)\n\n # Constants for WGS 84 ellipsoid\n a = 6378137.0 # equatorial radius in meters\n e = 0.0818191908426 # eccentricity\n\n # Calculate Earth's radius at the given latitude\n R = a / np.sqrt(1 - (e ** 2) * (np.sin(phi) ** 2))\n\n # Convert to Cartesian coordinates\n X = R * np.sin(lam)\n Y = R * np.tan(phi)\n\n return X, Y\n\ndef wgs84_coords(lat, lon): \n # Define coordinate systems\n wgs84 = pyproj.CRS.from_epsg(4326) # WGS 84 lat-long system\n utm_zone_32n = pyproj.CRS.from_string(\"+proj=utm +zone=32 +ellps=WGS84 +datum=WGS84 +units=m +no_defs\")\n\n # Create a transformer object\n transformer = pyproj.Transformer.from_crs(wgs84, utm_zone_32n)\n\n # Convert lat-long coordinates to UTM coordinates\n utm_easting, utm_northing = transformer.transform(lon, lat)\n\n return utm_northing, utm_easting\n\n# Copyright (c) Meta Platforms, Inc. 
and affiliates.\n# All rights reserved.\n\n# This source code is licensed under the license found in the\n# LICENSE file in the root directory of this source tree.\n# --------------------------------------------------------\n# Position embedding utils\n# --------------------------------------------------------\n\n\n# --------------------------------------------------------\n# 2D sine-cosine position embedding\n# References:\n# Transformer: https://github.com/tensorflow/models/blob/master/official/nlp/transformer/model_utils.py\n# MoCo v3: https://github.com/facebookresearch/moco-v3\n# --------------------------------------------------------\ndef get_2d_sincos_pos_embed(embed_dim, grid_size_h, grid_size_w, cls_token=False):\n \"\"\"\n grid_size: int of the grid height and width\n return:\n pos_embed: [grid_size*grid_size, embed_dim] or [1+grid_size*grid_size, embed_dim] (w/ or w/o cls_token)\n \"\"\"\n grid_h = np.arange(grid_size_h, dtype=np.float32)\n grid_w = np.arange(grid_size_w, dtype=np.float32)\n grid = np.meshgrid(grid_w, grid_h) # here w goes first\n grid = np.stack(grid, axis=0)\n\n grid = grid.reshape([2, 1, grid_size_h, grid_size_w])\n pos_embed = get_2d_sincos_pos_embed_from_grid(embed_dim, grid)\n if cls_token:\n pos_embed = np.concatenate([np.zeros([1, embed_dim]), pos_embed], axis=0)\n return pos_embed\n\n\ndef get_2d_sincos_pos_embed_from_grid(embed_dim, grid):\n assert embed_dim % 2 == 0\n\n # use half of dimensions to encode grid_h\n emb_h = get_1d_sincos_pos_embed_from_grid(embed_dim // 2, grid[0]) # (H*W, D/2)\n emb_w = get_1d_sincos_pos_embed_from_grid(embed_dim // 2, grid[1]) # (H*W, D/2)\n\n emb = np.concatenate([emb_h, emb_w], axis=1) # (H*W, D)\n return emb\n\n\ndef get_1d_sincos_pos_embed_from_grid(embed_dim, pos):\n \"\"\"\n embed_dim: output dimension for each position\n pos: a list of positions to be encoded: size (M,)\n out: (M, D)\n \"\"\"\n assert embed_dim % 2 == 0\n omega = np.arange(embed_dim // 2, dtype=np.float64)\n omega /= 
embed_dim / 2.0\n omega = 1.0 / 10000**omega # (D/2,)\n\n pos = pos.reshape(-1) # (M,)\n out = np.einsum(\"m,d->md\", pos, omega) # (M, D/2), outer product\n\n emb_sin = np.sin(out) # (M, D/2)\n emb_cos = np.cos(out) # (M, D/2)\n\n emb = np.concatenate([emb_sin, emb_cos], axis=1) # (M, D)\n return emb\n\n\n# --------------------------------------------------------\n# Interpolate position embeddings for high-resolution\n# References:\n# DeiT: https://github.com/facebookresearch/deit\n# --------------------------------------------------------\ndef interpolate_pos_embed(model, checkpoint_model, new_size=(64, 128)):\n if \"net.pos_embed\" in checkpoint_model:\n pos_embed_checkpoint = checkpoint_model[\"net.pos_embed\"]\n embedding_size = pos_embed_checkpoint.shape[-1]\n orig_num_patches = pos_embed_checkpoint.shape[-2]\n patch_size = model.patch_size\n w_h_ratio = 2\n orig_h = int((orig_num_patches // w_h_ratio) ** 0.5)\n orig_w = w_h_ratio * orig_h\n orig_size = (orig_h, orig_w)\n new_size = (new_size[0] // patch_size, new_size[1] // patch_size)\n # print (orig_size)\n # print (new_size)\n if orig_size[0] != new_size[0]:\n print(\"Interpolate PEs from %dx%d to %dx%d\" % (orig_size[0], orig_size[1], new_size[0], new_size[1]))\n pos_tokens = pos_embed_checkpoint.reshape(-1, orig_size[0], orig_size[1], embedding_size).permute(\n 0, 3, 1, 2\n )\n new_pos_tokens = torch.nn.functional.interpolate(\n pos_tokens, size=(new_size[0], new_size[1]), mode=\"bicubic\", align_corners=False\n )\n new_pos_tokens = new_pos_tokens.permute(0, 2, 3, 1).flatten(1, 2)\n checkpoint_model[\"net.pos_embed\"] = new_pos_tokens\n\n\ndef interpolate_channel_embed(checkpoint_model, new_len):\n if \"net.channel_embed\" in checkpoint_model:\n channel_embed_checkpoint = checkpoint_model[\"net.channel_embed\"]\n old_len = channel_embed_checkpoint.shape[1]\n if new_len <= old_len:\n checkpoint_model[\"net.channel_embed\"] = channel_embed_checkpoint[:, :new_len]\n\n\ndef SIREN(input_dim, output_dim, 
features, activation_scale, dropout):\n model = tf.keras.Sequential()\n model.add(layers.Dense(features[0], input_shape=(input_dim,), kernel_initializer=initializers.RandomUniform(-1 / input_dim, 1 / input_dim), activation=tf.sin))\n for i in range(1, len(features)):\n model.add(layers.Dense(features[i], kernel_initializer=initializers.RandomUniform(-np.sqrt(6 / features[i-1]) / activation_scale, np.sqrt(6 / features[i-1]) / activation_scale), activation=tf.sin))\n model.add(layers.Dropout(dropout))\n model.add(layers.Dense(output_dim, kernel_initializer=initializers.RandomUniform(-np.sqrt(6 / features[-1]) / activation_scale, np.sqrt(6 / features[-1]) / activation_scale), activation='linear'))\n return model\n\ndef MLP(input_dim, output_dim, features, activation_scale, dropout):\n model = tf.keras.Sequential()\n model.add(layers.Dense(features[0], input_shape=(input_dim,), activation=activations.relu))\n for i in range(1, len(features)):\n model.add(layers.Dense(features[i], activation=activations.relu))\n model.add(layers.Dropout(dropout))\n model.add(layers.Dense(output_dim, activation='linear'))\n return model\n \ndef ResNet():\n resnet = ResNet50(include_top=False, weights=None, input_shape=(64, 32, 1), pooling='avg')\n model = tf.keras.Sequential()\n model.add(resnet)\n model.add(layers.Dense(2048, activation='relu'))\n model.add(layers.Dense(32768, activation='linear'))\n return model\n\n\ndata5 = xr.open_dataset(\"../data/2m_temperature_2018_5.625deg_Jan.nc\").to_dataframe().reset_index()\ndata1 = xr.open_dataset(\"../data/2m_temperature_2018_1.40625deg_Jan.nc\").to_dataframe().reset_index()\n\n\ndata5.head()\n\n\n\n\n\n\n\n\nlon\nlat\ntime\nt2m\n\n\n\n\n0\n0.0\n-87.1875\n2018-01-01 00:00:00\n250.728180\n\n\n1\n0.0\n-87.1875\n2018-01-01 01:00:00\n250.468552\n\n\n2\n0.0\n-87.1875\n2018-01-01 02:00:00\n250.250931\n\n\n3\n0.0\n-87.1875\n2018-01-01 03:00:00\n250.040314\n\n\n4\n0.0\n-87.1875\n2018-01-01 04:00:00\n249.993790\n\n\n\n\n\n\n\n\ntime_stamp = 
\"2018-01-01 01:00:00\"\ntrain_df = data5[data5.time == time_stamp]\ntest_df = data1[data1.time == time_stamp]\n\nX = np.stack([train_df.lat.values, train_df.lon.values], axis=1)\ny = train_df[[\"t2m\"]].values\nprint(f\"{X.shape=}, {y.shape=}\")\n\nX_test = np.stack([test_df.lat.values, test_df.lon.values], axis=1)\ny_test = test_df[[\"t2m\"]].values\nprint(f\"{X_test.shape=}, {y_test.shape=}\")\n\nrff = np.random.normal(size=(2, 16)) * 0.01\n# X = np.concatenate([np.sin(X @ rff), np.cos(X @ rff)], axis=1)\n# print(f\"{sin_cos.shape=}\")\n# X = X @ sin_cos\n# X_test = np.concatenate([np.sin(X_test @ rff), np.cos(X_test @ rff)], axis=1)\n\nprint(f\"{X.shape=}, {X_test.shape=}\")\n\nX.shape=(2048, 2), y.shape=(2048, 1)\nX_test.shape=(32768, 2), y_test.shape=(32768, 1)\nX.shape=(2048, 2), X_test.shape=(32768, 2)\n\n\n\nX_max = np.max(X, axis=0, keepdims=True)\nX_min = np.min(X, axis=0, keepdims=True)\n\nX_scaled = (X - X_min) / (X_max - X_min)\nX_test_scaled = (X_test - X_min) / (X_max - X_min)\n\ny_min = np.min(y, axis=0, keepdims=True)\ny_max = np.max(y, axis=0, keepdims=True)\n\ny_scaled = (y - y_min) / (y_max - y_min)\n\n# y_mean = np.mean(y, axis=0, keepdims=True)\n# y_std = np.std(y, axis=0, keepdims=True)\n\n# y_scaled = (y - y_mean) / y_std\n\n\nmodel = MLP(2, 1, [256]*4, 30.0, 0.0)\n# model = ResNet()\nmodel.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3), loss='mse')\n\n\nhistory = model.fit(X_scaled, y_scaled, epochs=5000, batch_size=X_scaled.shape[0], verbose=0)\n\n\nplt.plot(history.history['loss']);\n\n\n\n\n\n\n\n\n\ny_pred = model.predict(X_test_scaled) * (y_max - y_min) + y_min\nplt.imshow(y_pred.reshape(256, 128), origin='lower', extent=[-180, 180, -90, 90], cmap='coolwarm', interpolation=\"none\");\n\n1024/1024 [==============================] - 1s 1ms/step\n\n\n\n\n\n\n\n\n\n\nplt.imshow(y.reshape(64, 32), origin='lower', extent=[-180, 180, -90, 90], cmap='coolwarm', interpolation=\"none\");\n\n\n\n\n\n\n\n\n\ndiff = 
y_pred.reshape(256, 128) - y_test.reshape(256, 128)\nplt.imshow(diff, origin='lower', extent=[-180, 180, -90, 90], cmap='coolwarm', interpolation=\"none\");\nplt.colorbar();\nplt.title(\"Diff\")\n\nText(0.5, 1.0, 'Diff')\n\n\n\n\n\n\n\n\n\n\n# rmse = np.sqrt(np.mean(np.abs(X_test[:, 0:1])*(y_pred.ravel() - y_test.ravel())**2))/np.mean(y_test.ravel() * np.abs(X_test[:, 0:1]))\nrmse = np.sqrt(np.mean((y_pred.ravel() - y_test.ravel())**2))\nprint(f\"{rmse=}\")\n\nrmse=2.7606046\n\n\n\nmean_bias = np.mean(y_pred.ravel() - y_test.ravel())\nprint(f\"{mean_bias=}\")\n\nmean_bias=0.10866926" + "objectID": "lab/how-to-finetune-florence-2-on-detection-dataset.html#fine-tune-florence-2-on-custom-object-detection-dataset", + "href": "lab/how-to-finetune-florence-2-on-detection-dataset.html#fine-tune-florence-2-on-custom-object-detection-dataset", + "title": "Fine-tuning Florence-2 on Object Detection Dataset", + "section": "Fine-tune Florence-2 on custom object detection dataset", + "text": "Fine-tune Florence-2 on custom object detection dataset\n\n# @title Define train loop\n\ndef train_model(train_loader, val_loader, model, processor, epochs=10, lr=1e-6):\n optimizer = AdamW(model.parameters(), lr=lr)\n num_training_steps = epochs * len(train_loader)\n lr_scheduler = get_scheduler(\n name=\"linear\",\n optimizer=optimizer,\n num_warmup_steps=0,\n num_training_steps=num_training_steps,\n )\n\n render_inference_results(peft_model, val_loader.dataset, 6)\n\n for epoch in range(epochs):\n model.train()\n train_loss = 0\n for inputs, answers in tqdm(train_loader, desc=f\"Training Epoch {epoch + 1}/{epochs}\"):\n\n input_ids = inputs[\"input_ids\"]\n pixel_values = inputs[\"pixel_values\"]\n labels = processor.tokenizer(\n text=answers,\n return_tensors=\"pt\",\n padding=True,\n return_token_type_ids=False\n ).input_ids.to(DEVICE)\n\n outputs = model(input_ids=input_ids, pixel_values=pixel_values, labels=labels)\n loss = outputs.loss\n\n loss.backward(), optimizer.step(), 
lr_scheduler.step(), optimizer.zero_grad()\n train_loss += loss.item()\n\n avg_train_loss = train_loss / len(train_loader)\n print(f\"Average Training Loss: {avg_train_loss}\")\n\n model.eval()\n val_loss = 0\n with torch.no_grad():\n for inputs, answers in tqdm(val_loader, desc=f\"Validation Epoch {epoch + 1}/{epochs}\"):\n\n input_ids = inputs[\"input_ids\"]\n pixel_values = inputs[\"pixel_values\"]\n labels = processor.tokenizer(\n text=answers,\n return_tensors=\"pt\",\n padding=True,\n return_token_type_ids=False\n ).input_ids.to(DEVICE)\n\n outputs = model(input_ids=input_ids, pixel_values=pixel_values, labels=labels)\n loss = outputs.loss\n\n val_loss += loss.item()\n\n avg_val_loss = val_loss / len(val_loader)\n print(f\"Average Validation Loss: {avg_val_loss}\")\n\n render_inference_results(peft_model, val_loader.dataset, 6)\n\n output_dir = f\"./model_checkpoints/epoch_{epoch+1}\"\n os.makedirs(output_dir, exist_ok=True)\n model.save_pretrained(output_dir)\n processor.save_pretrained(output_dir)\n\n\n%%time\n\nEPOCHS = 10\nLR = 5e-6\n\ntrain_model(train_loader, val_loader, peft_model, processor, epochs=EPOCHS, lr=LR)\n\nThis implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set `no_deprecation_warning=True` to disable this warning\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. 
`BoundingBoxAnnotator` will be removed in supervision-0.26.0.\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\n\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[434.8800048828125, 140.47999572753906, 460.47998046875, 178.239990234375]], \"labels\": [\"human face\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[0.3199999928474426, 0.3199999928474426, 639.0399780273438, 639.0399780273438]], \"labels\": [\"table\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[321.6000061035156, 212.1599884033203, 343.3599853515625, 242.87998962402344]], \"labels\": [\"human face\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[0.3199999928474426, 0.3199999928474426, 639.0399780273438, 639.0399780273438]], \"labels\": [\"furniture\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[0.3199999928474426, 0.3199999928474426, 639.0399780273438, 639.0399780273438]], \"labels\": [\"bed\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[0.3199999928474426, 0.3199999928474426, 639.0399780273438, 639.0399780273438]], \"labels\": [\"trousers\"]}}\n\n\n\nTraining Epoch 1/10: 100%|██████████| 136/136 [01:32<00:00, 1.48it/s]\n\n\nAverage Training Loss: 5.180607767666087\n\n\nValidation Epoch 1/10: 100%|██████████| 8/8 [00:02<00:00, 2.76it/s]\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\n\n\nAverage Validation Loss: 3.619225859642029\n\n\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. 
`BoundingBoxAnnotator` will be removed in supervision-0.26.0.\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\n\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[372.79998779296875, 112.95999908447266, 512.3200073242188, 356.79998779296875]], \"labels\": [\"queen of spades\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[0.3199999928474426, 129.59999084472656, 267.8399963378906, 411.8399963378906]], \"labels\": [\"6 of 6 of 7 of 8 of 9 of 9's of 7's of 8's of 5's of 6's of 4's of 9' of 5' of 4' of 6' of 7' of 8' of 10's of 10' of 12's of 11's of 16's of 17's of 18's of 19's of 20's of 30's of 40's of 50's of 70's of 80's of eight's of ten's of four's of seven's of all sorts of sorts of eight' of sorts's of sorts\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[19.520000457763672, 289.6000061035156, 223.0399932861328, 580.7999877929688], [186.55999755859375, 274.239990234375, 395.8399963378906, 511.03997802734375]], \"labels\": [\"queen of spades\", \"queen of spades\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [], \"labels\": []}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[15.039999961853027, 254.39999389648438, 213.44000244140625, 464.9599914550781], [463.67999267578125, 222.39999389648438, 635.8399658203125, 405.44000244140625], [329.2799987792969, 192.3199920654297, 466.8799743652344, 397.1199951171875], [208.95999145507812, 285.1199951171875, 345.91998291015625, 461.7599792480469]], \"labels\": [\"queen of spades\", \"queen of spades\", \"queen of spades\", \"queen of spades\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": 
[[294.0799865722656, 176.95999145507812, 624.3200073242188, 398.3999938964844]], \"labels\": [\"6 of spades\"]}}\n\n\n\nSetting `save_embedding_layers` to `True` as embedding layers found in `target_modules`.\nTraining Epoch 2/10: 100%|██████████| 136/136 [01:32<00:00, 1.47it/s]\n\n\nAverage Training Loss: 3.457882178180358\n\n\nValidation Epoch 2/10: 100%|██████████| 8/8 [00:02<00:00, 2.75it/s]\n\n\nAverage Validation Loss: 2.489599049091339\n\n\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. 
`BoundingBoxAnnotator` will be removed in supervision-0.26.0.\n\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[372.79998779296875, 112.95999908447266, 512.3200073242188, 356.79998779296875], [162.87998962402344, 330.55999755859375, 301.1199951171875, 585.2799682617188], [52.79999923706055, 239.0399932861328, 166.0800018310547, 469.44000244140625]], \"labels\": [\"queen of cats\", \"king of cats\", \"9 of cats\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[0.3199999928474426, 128.3199920654297, 267.8399963378906, 411.8399963378906], [198.0800018310547, 82.23999786376953, 381.1199951171875, 323.5199890136719], [330.55999755859375, 42.55999755859375, 516.7999877929688, 207.0399932861328]], \"labels\": [\"6 of cats\", \"7 of cats\", \"5 of cats\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[18.8799991607666, 289.6000061035156, 223.0399932861328, 581.4400024414062], [368.3199768066406, 235.1999969482422, 517.4400024414062, 490.55999755859375], [185.27999877929688, 273.6000061035156, 396.47998046875, 511.67999267578125], [86.08000183105469, 164.16000366210938, 255.67999267578125, 403.5199890136719], [254.39999389648438, 167.36000061035156, 389.44000244140625, 317.1199951171875]], \"labels\": [\"queen of cats\", \"9 of cats\", \"queen's of cats\", \"queen cats\", \"queen card\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[55.36000061035156, 228.79998779296875, 331.8399963378906, 637.1199951171875], [331.1999816894531, 151.36000061035156, 479.03997802734375, 447.03997802734375], [296.6399841308594, 253.1199951171875, 459.8399963378906, 550.719970703125], [436.79998779296875, 157.1199951171875, 557.760009765625, 392.0]], \"labels\": [\"8 of cats\", \"6 of cats\", \"7 of cats\", \"6 of dogs\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[329.2799987792969, 192.3199920654297, 466.8799743652344, 397.7599792480469], [208.95999145507812, 285.1199951171875, 345.91998291015625, 461.7599792480469], [14.399999618530273, 254.39999389648438, 214.0800018310547, 464.9599914550781], [463.03997802734375, 
222.39999389648438, 636.47998046875, 406.0799865722656]], \"labels\": [\"6 of cats\", \"7 of cats\", \"8 of cats\", \"5 of cats\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[97.5999984741211, 433.5999755859375, 314.55999755859375, 563.5199584960938], [292.1600036621094, 176.3199920654297, 624.9599609375, 399.03997802734375], [10.559999465942383, 228.1599884033203, 275.5199890136719, 427.8399963378906]], \"labels\": [\"5 of cats\", \"9 of cats\", \"7 of cats\"]}}\n\n\n\nTraining Epoch 3/10: 100%|██████████| 136/136 [01:32<00:00, 1.47it/s]\n\n\nAverage Training Loss: 2.4063032251947067\n\n\nValidation Epoch 3/10: 100%|██████████| 8/8 [00:02<00:00, 2.76it/s]\n\n\nAverage Validation Loss: 1.7791805267333984\n\n\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. 
`BoundingBoxAnnotator` will be removed in supervision-0.26.0.\n\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[372.79998779296875, 113.5999984741211, 512.3200073242188, 356.79998779296875], [161.59999084472656, 330.55999755859375, 301.1199951171875, 585.2799682617188], [52.79999923706055, 239.67999267578125, 166.0800018310547, 469.44000244140625], [310.0799865722656, 360.6399841308594, 445.7599792480469, 616.6400146484375]], \"labels\": [\"queen of cats\", \"king of cats\", \"9 of cats\", \"10 of cats\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[0.3199999928474426, 127.68000030517578, 267.1999816894531, 410.55999755859375], [200.63999938964844, 82.23999786376953, 381.1199951171875, 323.5199890136719], [333.1199951171875, 42.55999755859375, 516.7999877929688, 207.0399932861328]], \"labels\": [\"6 of dogs\", \"7 of dogs\", \"5 of dogs\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[19.520000457763672, 289.6000061035156, 223.0399932861328, 580.7999877929688], [368.3199768066406, 234.55999755859375, 517.4400024414062, 490.55999755859375], [185.9199981689453, 273.6000061035156, 395.8399963378906, 511.03997802734375], [86.08000183105469, 163.51998901367188, 255.0399932861328, 401.6000061035156], [255.0399932861328, 167.36000061035156, 388.79998779296875, 317.1199951171875]], \"labels\": [\"queen of birds\", \"9 of birds\", \"king of birds\", \"queen birds\", \"queen fish\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[55.36000061035156, 228.79998779296875, 333.1199951171875, 637.1199951171875], [437.44000244140625, 157.1199951171875, 557.760009765625, 391.3599853515625], [296.6399841308594, 253.1199951171875, 459.8399963378906, 550.0800170898438], [331.1999816894531, 150.0800018310547, 479.67999267578125, 447.03997802734375]], \"labels\": [\"8 of birds\", \"6 of birds\", \"7 of birds\", \"5 of birds\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[328.6399841308594, 192.3199920654297, 467.5199890136719, 397.7599792480469], [208.95999145507812, 285.1199951171875, 346.55999755859375, 461.7599792480469], 
[14.399999618530273, 254.39999389648438, 214.0800018310547, 465.5999755859375], [462.3999938964844, 221.1199951171875, 636.47998046875, 407.3599853515625]], \"labels\": [\"6 of birds\", \"7 of birds\", \"8 of birds\", \"5 of birds\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[97.5999984741211, 433.5999755859375, 314.55999755859375, 563.5199584960938], [308.1600036621094, 423.3599853515625, 548.7999877929688, 563.5199584960938], [10.559999465942383, 228.1599884033203, 275.5199890136719, 427.8399963378906]], \"labels\": [\"5 of cats\", \"9 of cats\", \"7 of cats\"]}}\n\n\n\nTraining Epoch 4/10: 100%|██████████| 136/136 [01:32<00:00, 1.47it/s]\n\n\nAverage Training Loss: 1.849329402341562\n\n\nValidation Epoch 4/10: 100%|██████████| 8/8 [00:02<00:00, 2.72it/s]\n\n\nAverage Validation Loss: 1.6737226694822311\n\n\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. 
`BoundingBoxAnnotator` will be removed in supervision-0.26.0.\n\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[372.79998779296875, 113.5999984741211, 512.3200073242188, 358.0799865722656], [162.239990234375, 330.55999755859375, 301.1199951171875, 585.2799682617188], [52.79999923706055, 239.0399932861328, 167.36000061035156, 470.0799865722656], [310.0799865722656, 360.6399841308594, 445.7599792480469, 616.6400146484375]], \"labels\": [\"queen of birds\", \"king of birds\", \"9 of birds\", \"10 of birds\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[333.1199951171875, 43.20000076293945, 516.7999877929688, 207.0399932861328], [0.3199999928474426, 128.3199920654297, 267.1999816894531, 410.55999755859375], [200.0, 83.5199966430664, 381.1199951171875, 323.5199890136719]], \"labels\": [\"5 of cats\", \"6 of cats\", \"7 of cats\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[369.6000061035156, 234.55999755859375, 518.0800170898438, 490.55999755859375], [256.32000732421875, 168.0, 388.79998779296875, 317.1199951171875], [18.8799991607666, 289.6000061035156, 223.0399932861328, 582.0800170898438], [186.55999755859375, 273.6000061035156, 396.47998046875, 512.3200073242188], [87.36000061035156, 164.8000030517578, 255.0399932861328, 403.5199890136719]], \"labels\": [\"9 of cats\", \"queen of cats\", \"10 of cats\", \"king of cats\", \"queen OF cats\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[437.44000244140625, 157.1199951171875, 556.47998046875, 390.7200012207031], [298.55999755859375, 254.39999389648438, 457.2799987792969, 549.4400024414062], [333.1199951171875, 151.36000061035156, 479.03997802734375, 447.03997802734375], [56.0, 228.79998779296875, 331.8399963378906, 637.1199951171875]], \"labels\": [\"6 of dogs\", \"7 of dogs\", \"5 of dogs\", \"8 of dogs\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[328.6399841308594, 192.3199920654297, 467.5199890136719, 397.1199951171875], [208.95999145507812, 285.1199951171875, 346.55999755859375, 461.7599792480469], [15.039999961853027, 254.39999389648438, 
214.0800018310547, 465.5999755859375], [464.3199768066406, 222.39999389648438, 636.47998046875, 406.0799865722656]], \"labels\": [\"6 of birds\", \"7 of birds\", \"8 of birds\", \"5 of birds\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[97.5999984741211, 433.5999755859375, 314.55999755859375, 563.5199584960938], [310.0799865722656, 424.0, 548.7999877929688, 562.239990234375], [10.559999465942383, 228.1599884033203, 275.5199890136719, 427.8399963378906]], \"labels\": [\"5 of cats\", \"6 of cats\", \"7 of cats\"]}}\n\n\n\nTraining Epoch 5/10: 100%|██████████| 136/136 [01:32<00:00, 1.46it/s]\n\n\nAverage Training Loss: 1.7156715840101242\n\n\nValidation Epoch 5/10: 100%|██████████| 8/8 [00:02<00:00, 2.74it/s]\n\n\nAverage Validation Loss: 1.6302864849567413\n\n\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\n\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[371.5199890136719, 112.31999969482422, 512.9599609375, 358.0799865722656], [161.59999084472656, 329.91998291015625, 301.7599792480469, 585.2799682617188], [51.52000045776367, 238.39999389648438, 167.36000061035156, 470.0799865722656], [310.0799865722656, 358.0799865722656, 447.03997802734375, 616.6400146484375]], \"labels\": [\"queen of birds\", \"king of birds\", \"9 of birds\", \"10 of birds\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[333.1199951171875, 43.20000076293945, 516.7999877929688, 207.0399932861328], [0.3199999928474426, 129.59999084472656, 267.1999816894531, 410.55999755859375], [200.63999938964844, 83.5199966430664, 381.1199951171875, 323.5199890136719]], \"labels\": [\"5 of dogs\", \"6 of dogs\", \"7 of dogs\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[369.6000061035156, 234.55999755859375, 518.0800170898438, 490.55999755859375], [86.08000183105469, 162.87998962402344, 255.67999267578125, 401.6000061035156], [256.32000732421875, 167.36000061035156, 389.44000244140625, 317.1199951171875], [18.8799991607666, 289.6000061035156, 223.67999267578125, 582.0800170898438], [186.55999755859375, 273.6000061035156, 397.1199951171875, 512.3200073242188]], \"labels\": [\"9 of birds\", \"queen of birds\", \"jack of birds\", \"10 of birds\", \"king of birds\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[438.0799865722656, 157.1199951171875, 556.47998046875, 390.0799865722656], [333.1199951171875, 150.72000122070312, 479.03997802734375, 447.03997802734375], [298.55999755859375, 254.39999389648438, 458.55999755859375, 550.0800170898438], [56.0, 228.79998779296875, 333.1199951171875, 637.1199951171875]], \"labels\": [\"6 of dogs\", \"5 of dogs\", \"7 of dogs\", \"8 of dogs\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[328.6399841308594, 192.3199920654297, 467.5199890136719, 397.1199951171875], [208.95999145507812, 285.1199951171875, 346.55999755859375, 461.7599792480469], 
[15.039999961853027, 254.39999389648438, 214.0800018310547, 465.5999755859375], [464.3199768066406, 222.39999389648438, 636.47998046875, 406.0799865722656]], \"labels\": [\"6 of birds\", \"7 of birds\", \"8 of birds\", \"5 of birds\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[97.5999984741211, 433.5999755859375, 314.55999755859375, 563.5199584960938], [310.0799865722656, 424.0, 548.7999877929688, 562.239990234375], [10.559999465942383, 228.1599884033203, 275.5199890136719, 427.8399963378906], [291.5199890136719, 176.3199920654297, 625.5999755859375, 399.03997802734375]], \"labels\": [\"5 of cats\", \"6 of cats\", \"7 of cats\", \"8 of cats\"]}}\n\n\n\nTraining Epoch 6/10: 100%|██████████| 136/136 [01:32<00:00, 1.46it/s]\n\n\nAverage Training Loss: 1.6734710993135677\n\n\nValidation Epoch 6/10: 100%|██████████| 8/8 [00:02<00:00, 2.73it/s]\n\n\nAverage Validation Loss: 1.6009675413370132\n\n\nBoundingBoxAnnotator is deprecated: `BoundingBoxAnnotator` is deprecated and has been renamed to `BoxAnnotator`. `BoundingBoxAnnotator` will be removed in supervision-0.26.0.\n\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[371.5199890136719, 112.31999969482422, 512.9599609375, 358.0799865722656], [161.59999084472656, 330.55999755859375, 301.7599792480469, 585.2799682617188], [51.52000045776367, 239.0399932861328, 168.63999938964844, 470.0799865722656], [310.0799865722656, 360.0, 447.03997802734375, 616.6400146484375]], \"labels\": [\"queen of birds\", \"king of birds\", \"9 of birds\", \"10 of birds\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[333.1199951171875, 43.20000076293945, 516.7999877929688, 207.0399932861328], [0.3199999928474426, 129.59999084472656, 267.1999816894531, 410.55999755859375], [200.63999938964844, 83.5199966430664, 381.1199951171875, 323.5199890136719]], \"labels\": [\"5 of dogs\", \"6 of dogs\", \"7 of dogs\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[369.6000061035156, 234.55999755859375, 518.0800170898438, 490.55999755859375], [86.08000183105469, 162.87998962402344, 255.67999267578125, 401.6000061035156], [256.32000732421875, 167.36000061035156, 389.44000244140625, 317.1199951171875], [18.8799991607666, 289.6000061035156, 223.67999267578125, 582.0800170898438], [186.55999755859375, 273.6000061035156, 396.47998046875, 511.03997802734375]], \"labels\": [\"9 of birds\", \"queen of birds\", \"jack of birds\", \"10 of birds\", \"king of birds\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[438.0799865722656, 157.1199951171875, 556.47998046875, 390.0799865722656], [333.1199951171875, 150.72000122070312, 479.03997802734375, 447.03997802734375], [298.55999755859375, 254.39999389648438, 457.2799987792969, 550.0800170898438], [56.0, 228.79998779296875, 333.1199951171875, 637.1199951171875]], \"labels\": [\"6 of dogs\", \"5 of dogs\", \"7 of dogs\", \"8 of dogs\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[328.6399841308594, 192.3199920654297, 467.5199890136719, 397.1199951171875], [464.3199768066406, 222.39999389648438, 636.47998046875, 405.44000244140625], [209.59999084472656, 
285.1199951171875, 346.55999755859375, 461.7599792480469], [15.039999961853027, 254.39999389648438, 214.0800018310547, 464.9599914550781]], \"labels\": [\"6 of birds\", \"5 of birds\", \"7 of birds\", \"8 of birds\"]}}\n\n\n\n \n {\"<OD>\": {\"bboxes\": [[97.5999984741211, 433.5999755859375, 314.55999755859375, 563.5199584960938], [310.0799865722656, 424.0, 548.7999877929688, 561.5999755859375], [10.559999465942383, 228.1599884033203, 275.5199890136719, 427.8399963378906], [291.5199890136719, 176.3199920654297, 625.5999755859375, 399.03997802734375]], \"labels\": [\"5 of cats\", \"6 of cats\", \"7 of cats\", \"8 of cats\"]}}\n\n\n\nTraining Epoch 7/10: 34%|███▍ | 46/136 [00:31<01:02, 1.44it/s]\n\n\n\n---------------------------------------------------------------------------\nKeyboardInterrupt Traceback (most recent call last)\nFile <timed exec>:4\n\nCell In[29], line 29, in train_model(train_loader, val_loader, model, processor, epochs, lr)\n 21 pixel_values = inputs[\"pixel_values\"]\n 22 labels = processor.tokenizer(\n 23 text=answers,\n 24 return_tensors=\"pt\",\n 25 padding=True,\n 26 return_token_type_ids=False\n 27 ).input_ids.to(DEVICE)\n---> 29 outputs = model(input_ids=input_ids, pixel_values=pixel_values, labels=labels)\n 30 loss = outputs.loss\n 32 loss.backward(), optimizer.step(), lr_scheduler.step(), optimizer.zero_grad()\n\nFile /opt/anaconda3/envs/zeel_py310/lib/python3.10/site-packages/torch/nn/modules/module.py:1736, in Module._wrapped_call_impl(self, *args, **kwargs)\n 1734 return self._compiled_call_impl(*args, **kwargs) # type: ignore[misc]\n 1735 else:\n-> 1736 return self._call_impl(*args, **kwargs)\n\nFile /opt/anaconda3/envs/zeel_py310/lib/python3.10/site-packages/torch/nn/modules/module.py:1747, in Module._call_impl(self, *args, **kwargs)\n 1742 # If we don't have any hooks, we want to skip the rest of the logic in\n 1743 # this function, and just call forward.\n 1744 if not (self._backward_hooks or self._backward_pre_hooks or 
self._forward_hooks or self._forward_pre_hooks\n 1745 or _global_backward_pre_hooks or _global_backward_hooks\n 1746 or _global_forward_hooks or _global_forward_pre_hooks):\n-> 1747 return forward_call(*args, **kwargs)\n 1749 result = None\n 1750 called_always_called_hooks = set()\n\nFile /opt/anaconda3/envs/zeel_py310/lib/python3.10/site-packages/peft/peft_model.py:1719, in PeftModelForCausalLM.forward(self, input_ids, attention_mask, inputs_embeds, labels, output_attentions, output_hidden_states, return_dict, task_ids, **kwargs)\n 1717 with self._enable_peft_forward_hooks(**kwargs):\n 1718 kwargs = {k: v for k, v in kwargs.items() if k not in self.special_peft_forward_args}\n-> 1719 return self.base_model(\n 1720 input_ids=input_ids,\n 1721 attention_mask=attention_mask,\n 1722 inputs_embeds=inputs_embeds,\n 1723 labels=labels,\n 1724 output_attentions=output_attentions,\n 1725 output_hidden_states=output_hidden_states,\n 1726 return_dict=return_dict,\n 1727 **kwargs,\n 1728 )\n 1730 batch_size = _get_batch_size(input_ids, inputs_embeds)\n 1731 if attention_mask is not None:\n 1732 # concat prompt attention mask\n\nFile /opt/anaconda3/envs/zeel_py310/lib/python3.10/site-packages/torch/nn/modules/module.py:1736, in Module._wrapped_call_impl(self, *args, **kwargs)\n 1734 return self._compiled_call_impl(*args, **kwargs) # type: ignore[misc]\n 1735 else:\n-> 1736 return self._call_impl(*args, **kwargs)\n\nFile /opt/anaconda3/envs/zeel_py310/lib/python3.10/site-packages/torch/nn/modules/module.py:1747, in Module._call_impl(self, *args, **kwargs)\n 1742 # If we don't have any hooks, we want to skip the rest of the logic in\n 1743 # this function, and just call forward.\n 1744 if not (self._backward_hooks or self._backward_pre_hooks or self._forward_hooks or self._forward_pre_hooks\n 1745 or _global_backward_pre_hooks or _global_backward_hooks\n 1746 or _global_forward_hooks or _global_forward_pre_hooks):\n-> 1747 return forward_call(*args, **kwargs)\n 1749 result = 
None\n 1750 called_always_called_hooks = set()\n\nFile /opt/anaconda3/envs/zeel_py310/lib/python3.10/site-packages/peft/tuners/tuners_utils.py:197, in BaseTuner.forward(self, *args, **kwargs)\n 196 def forward(self, *args: Any, **kwargs: Any):\n--> 197 return self.model.forward(*args, **kwargs)\n\nFile ~/.cache/huggingface/modules/transformers_modules/microsoft/Florence-2-base-ft/9803f52844ec1ae5df004e6089262e9a23e527fd/modeling_florence2.py:2741, in Florence2ForConditionalGeneration.forward(self, input_ids, pixel_values, attention_mask, decoder_input_ids, decoder_attention_mask, head_mask, decoder_head_mask, cross_attn_head_mask, encoder_outputs, past_key_values, inputs_embeds, decoder_inputs_embeds, labels, use_cache, output_attentions, output_hidden_states, return_dict)\n 2739 if inputs_embeds is not None:\n 2740 attention_mask = attention_mask.to(inputs_embeds.dtype)\n-> 2741 outputs = self.language_model(\n 2742 attention_mask=attention_mask,\n 2743 labels=labels,\n 2744 inputs_embeds=inputs_embeds,\n 2745 decoder_input_ids=decoder_input_ids,\n 2746 encoder_outputs=encoder_outputs,\n 2747 decoder_attention_mask=decoder_attention_mask,\n 2748 head_mask=head_mask,\n 2749 decoder_head_mask=decoder_head_mask,\n 2750 cross_attn_head_mask=cross_attn_head_mask,\n 2751 past_key_values=past_key_values,\n 2752 decoder_inputs_embeds=decoder_inputs_embeds,\n 2753 use_cache=use_cache,\n 2754 output_attentions=output_attentions,\n 2755 output_hidden_states=output_hidden_states,\n 2756 return_dict=return_dict,\n 2757 )\n 2759 logits = outputs.logits\n 2760 logits = logits.float()\n\nFile /opt/anaconda3/envs/zeel_py310/lib/python3.10/site-packages/torch/nn/modules/module.py:1736, in Module._wrapped_call_impl(self, *args, **kwargs)\n 1734 return self._compiled_call_impl(*args, **kwargs) # type: ignore[misc]\n 1735 else:\n-> 1736 return self._call_impl(*args, **kwargs)\n\nFile /opt/anaconda3/envs/zeel_py310/lib/python3.10/site-packages/torch/nn/modules/module.py:1747, in 
Module._call_impl(self, *args, **kwargs)\n 1742 # If we don't have any hooks, we want to skip the rest of the logic in\n 1743 # this function, and just call forward.\n 1744 if not (self._backward_hooks or self._backward_pre_hooks or self._forward_hooks or self._forward_pre_hooks\n 1745 or _global_backward_pre_hooks or _global_backward_hooks\n 1746 or _global_forward_hooks or _global_forward_pre_hooks):\n-> 1747 return forward_call(*args, **kwargs)\n 1749 result = None\n 1750 called_always_called_hooks = set()\n\nFile ~/.cache/huggingface/modules/transformers_modules/microsoft/Florence-2-base-ft/9803f52844ec1ae5df004e6089262e9a23e527fd/modeling_florence2.py:2140, in Florence2LanguageForConditionalGeneration.forward(self, input_ids, attention_mask, decoder_input_ids, decoder_attention_mask, head_mask, decoder_head_mask, cross_attn_head_mask, encoder_outputs, past_key_values, inputs_embeds, decoder_inputs_embeds, labels, use_cache, output_attentions, output_hidden_states, return_dict)\n 2135 if decoder_input_ids is None and decoder_inputs_embeds is None:\n 2136 decoder_input_ids = shift_tokens_right(\n 2137 labels, self.config.pad_token_id, self.config.decoder_start_token_id\n 2138 )\n-> 2140 outputs = self.model(\n 2141 input_ids,\n 2142 attention_mask=attention_mask,\n 2143 decoder_input_ids=decoder_input_ids,\n 2144 encoder_outputs=encoder_outputs,\n 2145 decoder_attention_mask=decoder_attention_mask,\n 2146 head_mask=head_mask,\n 2147 decoder_head_mask=decoder_head_mask,\n 2148 cross_attn_head_mask=cross_attn_head_mask,\n 2149 past_key_values=past_key_values,\n 2150 inputs_embeds=inputs_embeds,\n 2151 decoder_inputs_embeds=decoder_inputs_embeds,\n 2152 use_cache=use_cache,\n 2153 output_attentions=output_attentions,\n 2154 output_hidden_states=output_hidden_states,\n 2155 return_dict=return_dict,\n 2156 )\n 2158 lm_logits = self.lm_head(outputs[0])\n 2159 lm_logits = lm_logits + self.final_logits_bias.to(lm_logits.device)\n\nFile 
/opt/anaconda3/envs/zeel_py310/lib/python3.10/site-packages/torch/nn/modules/module.py:1736, in Module._wrapped_call_impl(self, *args, **kwargs)\n 1734 return self._compiled_call_impl(*args, **kwargs) # type: ignore[misc]\n 1735 else:\n-> 1736 return self._call_impl(*args, **kwargs)\n\nFile /opt/anaconda3/envs/zeel_py310/lib/python3.10/site-packages/torch/nn/modules/module.py:1747, in Module._call_impl(self, *args, **kwargs)\n 1742 # If we don't have any hooks, we want to skip the rest of the logic in\n 1743 # this function, and just call forward.\n 1744 if not (self._backward_hooks or self._backward_pre_hooks or self._forward_hooks or self._forward_pre_hooks\n 1745 or _global_backward_pre_hooks or _global_backward_hooks\n 1746 or _global_forward_hooks or _global_forward_pre_hooks):\n-> 1747 return forward_call(*args, **kwargs)\n 1749 result = None\n 1750 called_always_called_hooks = set()\n\nFile ~/.cache/huggingface/modules/transformers_modules/microsoft/Florence-2-base-ft/9803f52844ec1ae5df004e6089262e9a23e527fd/modeling_florence2.py:2014, in Florence2LanguageModel.forward(self, input_ids, attention_mask, decoder_input_ids, decoder_attention_mask, head_mask, decoder_head_mask, cross_attn_head_mask, encoder_outputs, past_key_values, inputs_embeds, decoder_inputs_embeds, use_cache, output_attentions, output_hidden_states, return_dict)\n 2011 return_dict = return_dict if return_dict is not None else self.config.use_return_dict\n 2013 if encoder_outputs is None:\n-> 2014 encoder_outputs = self.encoder(\n 2015 input_ids=input_ids,\n 2016 attention_mask=attention_mask,\n 2017 head_mask=head_mask,\n 2018 inputs_embeds=inputs_embeds,\n 2019 output_attentions=output_attentions,\n 2020 output_hidden_states=output_hidden_states,\n 2021 return_dict=return_dict,\n 2022 )\n 2023 # If the user passed a tuple for encoder_outputs, we wrap it in a BaseModelOutput when return_dict=True\n 2024 elif return_dict and not isinstance(encoder_outputs, BaseModelOutput):\n\nFile 
/opt/anaconda3/envs/zeel_py310/lib/python3.10/site-packages/torch/nn/modules/module.py:1736, in Module._wrapped_call_impl(self, *args, **kwargs)\n 1734 return self._compiled_call_impl(*args, **kwargs) # type: ignore[misc]\n 1735 else:\n-> 1736 return self._call_impl(*args, **kwargs)\n\nFile /opt/anaconda3/envs/zeel_py310/lib/python3.10/site-packages/torch/nn/modules/module.py:1747, in Module._call_impl(self, *args, **kwargs)\n 1742 # If we don't have any hooks, we want to skip the rest of the logic in\n 1743 # this function, and just call forward.\n 1744 if not (self._backward_hooks or self._backward_pre_hooks or self._forward_hooks or self._forward_pre_hooks\n 1745 or _global_backward_pre_hooks or _global_backward_hooks\n 1746 or _global_forward_hooks or _global_forward_pre_hooks):\n-> 1747 return forward_call(*args, **kwargs)\n 1749 result = None\n 1750 called_always_called_hooks = set()\n\nFile ~/.cache/huggingface/modules/transformers_modules/microsoft/Florence-2-base-ft/9803f52844ec1ae5df004e6089262e9a23e527fd/modeling_florence2.py:1593, in Florence2Encoder.forward(self, input_ids, attention_mask, head_mask, inputs_embeds, output_attentions, output_hidden_states, return_dict)\n 1588 attention_mask = attention_mask if 0 in attention_mask else None\n 1589 elif self._use_sdpa and head_mask is None and not output_attentions:\n 1590 # output_attentions=True & head_mask can not be supported when using SDPA, fall back to\n 1591 # the manual implementation that requires a 4D causal mask in all cases.\n 1592 # [bsz, seq_len] -> [bsz, 1, tgt_seq_len, src_seq_len]\n-> 1593 attention_mask = _prepare_4d_attention_mask_for_sdpa(attention_mask, inputs_embeds.dtype)\n 1594 else:\n 1595 # [bsz, seq_len] -> [bsz, 1, tgt_seq_len, src_seq_len]\n 1596 attention_mask = _prepare_4d_attention_mask(attention_mask, inputs_embeds.dtype)\n\nKeyboardInterrupt:" }, { - "objectID": "posts/2022-10-21-gaussian-processes.html", - "href": "posts/2022-10-21-gaussian-processes.html", - "title": 
"Gaussian Processes - A no-skip-math version", - "section": "", - "text": "import jax\nimport jax.numpy as jnp\n\nfrom tinygp.kernels import ExpSquared\n\nimport matplotlib.pyplot as plt" + "objectID": "lab/how-to-finetune-florence-2-on-detection-dataset.html#fine-tuned-model-evaluation", + "href": "lab/how-to-finetune-florence-2-on-detection-dataset.html#fine-tuned-model-evaluation", + "title": "Fine-tuning Florence-2 on Object Detection Dataset", + "section": "Fine-tuned model evaluation", + "text": "Fine-tuned model evaluation\n\n# @title Check if the model can still detect objects outside of the custom dataset\n\nimage = Image.open(EXAMPLE_IMAGE_PATH)\ntask = \"<OD>\"\ntext = \"<OD>\"\n\ninputs = processor(text=text, images=image, return_tensors=\"pt\").to(DEVICE)\ngenerated_ids = peft_model.generate(\n input_ids=inputs[\"input_ids\"],\n pixel_values=inputs[\"pixel_values\"],\n max_new_tokens=1024,\n num_beams=3\n)\ngenerated_text = processor.batch_decode(generated_ids, skip_special_tokens=False)[0]\nresponse = processor.post_process_generation(generated_text, task=task, image_size=(image.width, image.height))\ndetections = sv.Detections.from_lmm(sv.LMM.FLORENCE_2, response, resolution_wh=image.size)\n\nbounding_box_annotator = sv.BoundingBoxAnnotator(color_lookup=sv.ColorLookup.INDEX)\nlabel_annotator = sv.LabelAnnotator(color_lookup=sv.ColorLookup.INDEX)\n\nimage = bounding_box_annotator.annotate(image, detections)\nimage = label_annotator.annotate(image, detections)\nimage.thumbnail((600, 600))\nimage\n\n\n\n\n\n\n\n\nNOTE: It seems that the model can still detect classes that don’t belong to our custom dataset.\n\n# @title Collect predictions\n\nPATTERN = r'([a-zA-Z0-9 ]+ of [a-zA-Z0-9 ]+)<loc_\\d+>'\n\ndef extract_classes(dataset: DetectionDataset):\n class_set = set()\n for i in range(len(dataset.dataset)):\n image, data = dataset.dataset[i]\n suffix = data[\"suffix\"]\n classes = re.findall(PATTERN, suffix)\n class_set.update(classes)\n return 
sorted(class_set)\n\nCLASSES = extract_classes(train_dataset)\n\ntargets = []\npredictions = []\n\nfor i in range(len(val_dataset.dataset)):\n image, data = val_dataset.dataset[i]\n prefix = data['prefix']\n suffix = data['suffix']\n\n inputs = processor(text=prefix, images=image, return_tensors=\"pt\").to(DEVICE)\n generated_ids = model.generate(\n input_ids=inputs[\"input_ids\"],\n pixel_values=inputs[\"pixel_values\"],\n max_new_tokens=1024,\n num_beams=3\n )\n generated_text = processor.batch_decode(generated_ids, skip_special_tokens=False)[0]\n\n prediction = processor.post_process_generation(generated_text, task='<OD>', image_size=image.size)\n prediction = sv.Detections.from_lmm(sv.LMM.FLORENCE_2, prediction, resolution_wh=image.size)\n prediction = prediction[np.isin(prediction['class_name'], CLASSES)]\n prediction.class_id = np.array([CLASSES.index(class_name) for class_name in prediction['class_name']])\n prediction.confidence = np.ones(len(prediction))\n\n target = processor.post_process_generation(suffix, task='<OD>', image_size=image.size)\n target = sv.Detections.from_lmm(sv.LMM.FLORENCE_2, target, resolution_wh=image.size)\n target.class_id = np.array([CLASSES.index(class_name) for class_name in target['class_name']])\n\n targets.append(target)\n predictions.append(prediction)\n\n\n# @title Calculate mAP\nmean_average_precision = sv.MeanAveragePrecision.from_detections(\n predictions=predictions,\n targets=targets,\n)\n\nprint(f\"map50_95: {mean_average_precision.map50_95:.2f}\")\nprint(f\"map50: {mean_average_precision.map50:.2f}\")\nprint(f\"map75: {mean_average_precision.map75:.2f}\")\n\nmap50_95: 0.49\nmap50: 0.53\nmap75: 0.53\n\n\n\n# @title Calculate Confusion Matrix\nconfusion_matrix = sv.ConfusionMatrix.from_detections(\n predictions=predictions,\n targets=targets,\n classes=CLASSES\n)\n\n_ = confusion_matrix.plot()" }, { - "objectID": "posts/2022-10-21-gaussian-processes.html#regression", - "href": 
"posts/2022-10-21-gaussian-processes.html#regression", - "title": "Gaussian Processes - A no-skip-math version", - "section": "Regression", - "text": "Regression\nIn this post, we will consider the regression problem of finding a reasonable map \\(X \\to \\boldsymbol{y}\\) along with uncertainty. We can do this in a simplest setting with Bayesian linear regression assuming a MultiVariate Normal (MVN) prior \\(\\boldsymbol{\\theta} \\sim \\mathcal{N}(\\boldsymbol{\\mu}_\\theta, \\Sigma_\\theta)\\) (why MVN? because \\(\\theta \\in (-\\infty, \\infty)\\)) and Normal likelihood \\(y \\sim \\mathcal{N}(\\boldsymbol{x}^T\\theta, \\sigma_n^2)\\) with i.i.d. assumption.\nTo start with Gaussian process regression, let us first focus on \\(\\boldsymbol{y}\\) (and ignore \\(X\\)). We assume \\(\\boldsymbol{f}\\) as a random variable and \\(\\boldsymbol{y}\\) as a realization of \\(\\boldsymbol{f}\\) with some noise. It would be a natural probabilistic assumption to assume \\(\\boldsymbol{f}\\) to be MVN distributed since its range is \\((-\\infty, \\infty)\\).\n\\[\np(\\boldsymbol{f}) \\sim \\mathcal{N}(\\boldsymbol{m}_f, K_{ff})\n\\tag{prior}\n\\]\nNow, we need to bring in \\(X\\) in a reasonable way to this formulation. A core assumption connecting \\(X\\) with \\(\\boldsymbol{y}\\) is the following: > if two inputs \\(\\boldsymbol{x}\\) and \\(\\boldsymbol{x}'\\) are close to each other (how to define the closeness? kernels!), corresponding \\(\\boldsymbol{y}\\) and \\(\\boldsymbol{y}'\\) are likely to be similar.\nWe use something known as covariance function or kernel (later is more prevalent) to define this closeness. 
For example, RBF or squared exponential is a well-known kernel:\n\\[\nk_{RBF}(\\boldsymbol{x}, \\boldsymbol{x}') = \\sigma^2 \\exp \\left(-{\\frac {\\|\\boldsymbol{x} -\\boldsymbol{x}' \\|^{2}}{2\\ell ^{2}}}\\right)\n\\tag{kernel}\n\\]\n\nx = jnp.array(0.0).reshape(1, 1)\nx_prime = jnp.linspace(-5,5,100).reshape(-1, 1)\n\nplt.plot(x_prime, ExpSquared()(x_prime, x));\nplt.xlabel(\"$x'$\")\nplt.title(f\"$k(x,x')$ where $x={x[0][0]}$ and $x' \\in ${plt.xlim()}\");\n\n\n\n\n\n\n\n\nThe plot above shows that value of \\(k(\\boldsymbol{x}, \\boldsymbol{x}')\\) increases as \\(\\boldsymbol{x}'\\) approaches \\(\\boldsymbol{x}\\) and reduces as it moves far from \\(\\boldsymbol{x}\\). Now, we will connect \\(X\\) with \\(\\boldsymbol{f}\\) (and thus with \\(\\boldsymbol{y}\\)) through kernel \\(k\\) with two following assumptions:\n\nDiagonal entries of \\(K_{ff}\\) represent variance of \\(f_i\\), which can be represented by \\(k(\\boldsymbol{x}_i, \\boldsymbol{x}_i)\\).\nNon-diagonal entries of \\(K_{ff}\\) represent covariance between \\(f_i\\) and \\(f_j\\) and can be represented by \\(k(\\boldsymbol{x}_i, \\boldsymbol{x}_j)\\).\n\nAt this point, we have made everything clear about prior \\(p(\\boldsymbol{f}) \\sim \\mathcal{N}(\\boldsymbol{m}_f, K_{ff})\\). Now, we will look at the likelihood. As mentioned earlier, \\(\\boldsymbol{y}\\) is noisy realization of \\(f\\) so the following likelihood would be a simple and natural choice.\n\\[\np(\\boldsymbol{y}|\\boldsymbol{f}) \\sim \\mathcal{N}(\\boldsymbol{f}, \\sigma_n^2I)\n\\tag{likelihood}\n\\]\nTill now, we followed bottom-up approach and defined prior and likelihood for this problem. Now we will explore the top-down approach.\nOur ultimate goal is derive \\(p(\\boldsymbol{y}^*|X^*,\\boldsymbol{y}, X)\\) at new inputs \\(X^*\\). 
This can be written as:\n\\[\np(\\boldsymbol{y}^*|X^*,\\boldsymbol{y}, X) = \\int p(\\boldsymbol{y}^*|\\boldsymbol{f}^*)p(\\boldsymbol{f}^*|X^*,\\boldsymbol{y}, X)d\\boldsymbol{f}^*\n\\tag{pred post new}\n\\]\nHere, \\(p(\\boldsymbol{f}^*|X^*,\\boldsymbol{y}, X)\\) is the posterior distribution at inputs \\(X^*\\). Once we derive posterior \\(p(\\boldsymbol{f}|\\boldsymbol{y},X)\\), We can find \\(p(\\boldsymbol{f}^*|X^*,\\boldsymbol{y}, X)\\) like following:\n\\[\np(\\boldsymbol{f}^*|X^*, \\boldsymbol{y}, X) = \\int p(\\boldsymbol{f}^*|X^*, \\boldsymbol{f}, X)p(\\boldsymbol{f}|\\boldsymbol{y}, X)d\\boldsymbol{f}\n\\tag{post new}\n\\]\nHere, \\(p(\\boldsymbol{f}^*|X^*, \\boldsymbol{f}, X)\\) is a conditional Gaussian distribution with the following closed form:\n\\[\np(\\boldsymbol{f}^*|X^*, \\boldsymbol{f}, X) \\sim \\mathcal{N}(\\boldsymbol{m}_{f^*}+K_{f^*f}K_{ff}^{-1}(\\boldsymbol{f}-\\boldsymbol{m}_{f}), K_{f^*f^*} - K_{f^*f}K_{ff}^{-1}K_{ff^*})\n\\tag{cond}\n\\]\nPosterior \\(p(\\boldsymbol{f}|\\boldsymbol{y}, X)\\) can be derived following “Bayes’ rule for Gaussians” (section 2.2.6.2 in pml book2):\n\\[\np(\\boldsymbol{f}|\\boldsymbol{y}, X) \\sim \\mathcal{N}(\\boldsymbol{m}_f + K_{ff}\\left(K_{ff}+\\sigma_n^2I\\right)^{-1}(\\boldsymbol{y} - \\boldsymbol{m}_f), K_{ff} - K_{ff}\\left(K_{ff} + \\sigma_n^2I\\right)^{-1}K_{ff})\n\\tag{post}\n\\]\nWe can now substitute Eq. (post) and Eq. (cond) in Eq. (post new). The integral can be solved with using Eq. 2.90 in section 2.2.6.2 in pml book2 and also mentioned in Eq. 
(int gaussians) in Appendix.\n\\[\n\\begin{aligned}\np(\\boldsymbol{f}^*|X^*, \\boldsymbol{y}, X) &\\sim \\mathcal{N}(\\boldsymbol{\\mu}^*, \\Sigma^*)\\\\\n\\boldsymbol{\\mu}^* &= \\boldsymbol{m}_{f^*}+K_{f^*f}K_{ff}^{-1}(\\left[\\boldsymbol{m}_f + K_{ff}\\left(K_{ff}+\\sigma_n^2I\\right)^{-1}(\\boldsymbol{y} - \\boldsymbol{m}_f)\\right]-\\boldsymbol{m}_{f})\\\\\n&=\\boldsymbol{m}_{f^*}+K_{f^*f}K_{ff}^{-1}(K_{ff}\\left(K_{ff}+\\sigma_n^2I\\right)^{-1}(\\boldsymbol{y} - \\boldsymbol{m}_f))\\\\\n&=\\boldsymbol{m}_{f^*}+K_{f^*f}\\left(K_{ff}+\\sigma_n^2I\\right)^{-1}(\\boldsymbol{y} - \\boldsymbol{m}_f)\\\\\n\\\\\n\\Sigma^* &= K_{f^*f^*} - K_{f^*f}K_{ff}^{-1}K_{ff^*} + K_{f^*f}K_{ff}^{-1}\\left[K_{ff} - K_{ff}\\left(K_{ff} + \\sigma_n^2I\\right)^{-1}K_{ff}\\right]K_{ff}^{-1}K_{ff^*}\\\\\n&=K_{f^*f^*} - K_{f^*f}K_{ff}^{-1}K_{ff^*} + K_{f^*f}\\left[I - \\left(K_{ff} + \\sigma_n^2I\\right)^{-1}K_{ff}\\right]K_{ff}^{-1}K_{ff^*}\\\\\n&=K_{f^*f^*} - K_{f^*f}K_{ff}^{-1}K_{ff^*} + K_{f^*f}\\left[K_{ff}^{-1} - \\left(K_{ff} + \\sigma_n^2I\\right)^{-1}\\right]K_{ff^*}\\\\\n&=K_{f^*f^*} - K_{f^*f}K_{ff}^{-1}K_{ff^*} + K_{f^*f}K_{ff}^{-1}K_{ff^*} - K_{f^*f}\\left(K_{ff} + \\sigma_n^2I\\right)^{-1}K_{ff^*}\\\\\n&=K_{f^*f^*} - K_{f^*f}\\left(K_{ff} + \\sigma_n^2I\\right)^{-1}K_{ff^*}\\\\\np(\\boldsymbol{f}^*|X^*, \\boldsymbol{y}, X) &\\sim \\mathcal{N}(\\boldsymbol{m}_{f^*}+K_{f^*f}\\left(K_{ff}+\\sigma_n^2I\\right)^{-1}(\\boldsymbol{y} - \\boldsymbol{m}_f), K_{f^*f^*} - K_{f^*f}\\left(K_{ff} + \\sigma_n^2I\\right)^{-1}K_{ff^*})\n\\end{aligned}\n\\]\nNow, we are almost there. Plugging in the above formula in Eq. (pred post) and using known result in Eq. 
(int gaussians), we get the predictive posterior as following:\n\\[\np(\\boldsymbol{y}^*|X^*,\\boldsymbol{y}, X) \\sim \\mathcal{N}(\\boldsymbol{m}_{f^*}+K_{f^*f}\\left(K_{ff}+\\sigma_n^2I\\right)^{-1}(\\boldsymbol{y} - \\boldsymbol{m}_f), K_{f^*f^*} - K_{f^*f}\\left(K_{ff} + \\sigma_n^2I\\right)^{-1}K_{ff^*} + \\sigma_n^2I)\n\\]\n\n\n\n\n\n\nNote\n\n\n\nWe did not exploit the special structure of likelihood variance \\(\\sigma_n^2I\\) anywhere, so, these derivations hold true for full rank likelihood covariance matrices also.\n\n\n\nOptimization\nWe perform type-II likelihood estimation (in other words, minimize log marginal likelihood or evidence term). Our goal is to find optimal model \\(\\mathcal{M}\\) represented by prior (or kernel) hyperparameters and likelihood hyperparameters. We can get the log marginal likelihood using Eq. (int gaussians):\n\\[\n\\begin{aligned}\np(\\boldsymbol{y}|X, \\mathcal{M}) &= \\int p(\\boldsymbol{y}|\\boldsymbol{f}) p(\\boldsymbol{f})d\\boldsymbol{f}\\\\\n&\\sim \\int \\mathcal{N}(\\boldsymbol{y}|\\boldsymbol{f}, \\sigma_n^2I) \\mathcal{N}(\\boldsymbol{f}|\\boldsymbol{m}_f, K_{ff})\\\\\n&\\sim \\mathcal{N}(\\boldsymbol{y}|\\boldsymbol{m}_f, K_{ff}+\\sigma_n^2I)\n\\end{aligned}\n\\]\nFor case of RBF kernel, \\(\\mathcal{M}\\) parameters will be \\(\\{\\sigma, \\ell, \\sigma_n\\}\\)." 
+ "objectID": "lab/how-to-finetune-florence-2-on-detection-dataset.html#save-fine-tuned-model-on-hard-drive", + "href": "lab/how-to-finetune-florence-2-on-detection-dataset.html#save-fine-tuned-model-on-hard-drive", + "title": "Fine-tuning Florence-2 on Object Detection Dataset", + "section": "Save fine-tuned model on hard drive", + "text": "Save fine-tuned model on hard drive\n\npeft_model.save_pretrained(\"/content/florence2-lora\")\nprocessor.save_pretrained(\"/content/florence2-lora/\")\n!ls -la /content/florence2-lora/\n\ntotal 11432\ndrwxr-xr-x 2 root root 4096 Jun 26 21:43 .\ndrwxr-xr-x 1 root root 4096 Jun 26 21:43 ..\n-rw-r--r-- 1 root root 746 Jun 26 21:43 adapter_config.json\n-rw-r--r-- 1 root root 7747264 Jun 26 21:43 adapter_model.safetensors\n-rw-r--r-- 1 root root 22410 Jun 26 21:43 added_tokens.json\n-rw-r--r-- 1 root root 456318 Jun 26 21:43 merges.txt\n-rw-r--r-- 1 root root 947 Jun 26 21:43 preprocessor_config.json\n-rw-r--r-- 1 root root 5102 Jun 26 21:43 README.md\n-rw-r--r-- 1 root root 146627 Jun 26 21:43 special_tokens_map.json\n-rw-r--r-- 1 root root 197658 Jun 26 21:43 tokenizer_config.json\n-rw-r--r-- 1 root root 2297961 Jun 26 21:43 tokenizer.json\n-rw-r--r-- 1 root root 798293 Jun 26 21:43 vocab.json" }, { - "objectID": "posts/2022-10-21-gaussian-processes.html#classification-with-laplace-approximation", - "href": "posts/2022-10-21-gaussian-processes.html#classification-with-laplace-approximation", - "title": "Gaussian Processes - A no-skip-math version", - "section": "Classification (with Laplace approximation)", - "text": "Classification (with Laplace approximation)\nWe will derive a GP predictive posterior for binary case only because for multi-class, it gets a bit complex. Our assumption for prior over the \\(\\boldsymbol{f}\\) can still be the same but likelihood needs to be changed because \\(\\boldsymbol{y}\\) is no more a real number but rather a binary value e.g. 0 or 1. 
From Bayesian point-of-view, Bernoulli likelihood would be the most appropriate as a likelihood here:\n\\[\np(\\boldsymbol{y}|\\boldsymbol{f}) = \\prod_{i=1}^{N} \\sigma(f_i)^{y_i=1}(1-\\sigma(f_i))^{y_i=0}\n\\tag{class likelihood}\n\\]\nSince, MVN prior and Bernoulli likelihood are not conjugate, we need to use an approximate method of inference here. We use Laplace approximation to get the MAP estimate \\(\\boldsymbol{\\hat{f}}\\) and by computing the Hessian \\(H\\) of negative log joint (log prior + log likelihood) with respect to \\(\\boldsymbol{\\hat{f}}\\), we can get the posterior distribution as the following:\n\\[\np(\\boldsymbol{f}|\\boldsymbol{y}, X) \\sim \\mathcal{N}(\\boldsymbol{\\hat{f}}, H^{-1})\n\\tag{class post}\n\\]\nEq. (cond) will be the same in this case, and thus, we can solve Eq. (post new) as we did for regression case, like the following:\n\\[\np(\\boldsymbol{f}^*|X^*, \\boldsymbol{y}, X) \\sim \\mathcal{N}(\\boldsymbol{m}_{f^*}+K_{f^*f}K_{ff}^{-1}(\\boldsymbol{\\hat{f}}-\\boldsymbol{m}_{f}), K_{f^*f^*} - K_{f^*f}K_{ff}^{-1}K_{ff^*} + K_{f^*f}K_{ff}^{-1}H^{-1}K_{ff}^{-1}K_{ff^*})\n\\]\n\nOptimization\nTo perform Type-II likelihood estimation for binary classification, we first need to derive the log marginal likelihood which can be approximated with Laplace approximation. 
First, we define the following quantity:\n\\[\n\\boldsymbol{\\psi}(\\boldsymbol{f}) \\triangleq \\log p(\\boldsymbol{y}|\\boldsymbol{f}) + \\log p(\\boldsymbol{f})\n\\]\nNow, computing the log marginal likelihood as suggested in section 3.4.4 of the GPML book:\n\\[\n\\begin{aligned}\n\\log p(\\boldsymbol{y}|X, \\mathcal{M}) &= \\log \\left[ \\int p(\\boldsymbol{y}|\\boldsymbol{f}) p(\\boldsymbol{f})d\\boldsymbol{f}\\right]\\\\\n&= \\log \\left[ \\int \\exp\\left(\\boldsymbol{\\psi}(\\boldsymbol{f})\\right)d\\boldsymbol{f} \\right]\\\\\n&\\thickapprox \\log \\left[ \\int \\exp\\left(\\boldsymbol{\\psi}(\\boldsymbol{\\hat{f}}) -\\frac{1}{2}(\\boldsymbol{f}-\\boldsymbol{\\hat{f}})^{\\top} H(\\boldsymbol{f}-\\boldsymbol{\\hat{f}})\\right)d\\boldsymbol{f} \\right]\\\\\n&= \\log \\left[ \\exp\\left(\\boldsymbol{\\psi}(\\boldsymbol{\\hat{f}})\\right) \\int \\exp\\left(-\\frac{1}{2}(\\boldsymbol{f}-\\boldsymbol{\\hat{f}})^{\\top} H(\\boldsymbol{f}-\\boldsymbol{\\hat{f}})\\right)d\\boldsymbol{f}\\right]\\\\\n&= \\log p(\\boldsymbol{y}|\\boldsymbol{\\hat{f}}) + \\log p(\\boldsymbol{\\hat{f}}) + \\frac{N}{2}\\log(2\\pi) + \\frac{1}{2}\\log|H^{-1}|\\\\\n&= \\log p(\\boldsymbol{y}|\\boldsymbol{\\hat{f}}) -\\frac{1}{2}\\boldsymbol{\\hat{f}}^TK_{ff}^{-1}\\boldsymbol{\\hat{f}} - \\frac{1}{2}\\log|K_{ff}| + \\frac{1}{2}\\log|H^{-1}|\n\\end{aligned}\n\\]\nwhere, in the last step, the \\(\\frac{N}{2}\\log(2\\pi)\\) term from the Gaussian integral cancels the \\(-\\frac{N}{2}\\log(2\\pi)\\) term of the MVN log prior.\nOur final optimization algorithm is as follows: 1. For N iterations, repeat steps 2. to 4. 2. Optimize for \\(\\boldsymbol{\\hat{f}}\\) with M iterations using standard MAP estimation (perhaps with a non-centered parametrization). 3. Compute the gradient of the log marginal likelihood w.r.t. the parameters of \\(\\mathcal{M}\\). 4. Update the parameters of \\(\\mathcal{M}\\)." 
+ "objectID": "lab/how-to-finetune-florence-2-on-detection-dataset.html#upload-model-to-roboflow-optional", + "href": "lab/how-to-finetune-florence-2-on-detection-dataset.html#upload-model-to-roboflow-optional", + "title": "Fine-tuning Florence-2 on Object Detection Dataset", + "section": "Upload model to Roboflow (optional)", + "text": "Upload model to Roboflow (optional)\nYou can deploy your Florence-2 object detection model on your own hardware (i.e. a cloud GPu server or an NVIDIA Jetson) with Roboflow Inference, an open source computer vision inference server.\nTo deploy your model, you will need a free Roboflow account.\nTo get started, create a new Project in Roboflow if you don’t already have one. Then, upload the dataset you used to train your model. Then, create a dataset Version, which is a snapshot of your dataset with which your model will be associated in Roboflow.\nYou can read our full Deploy Florence-2 with Roboflow guide for step-by-step instructions of these steps.\nOnce you have trained your model A, you can upload it to Roboflow using the following code:\n\nimport roboflow\n\nrf = Roboflow(api_key=\"API_KEY\")\nproject = rf.workspace(\"workspace-id\").project(\"project-id\")\nversion = project.version(VERSION)\n\nversion.deploy(model_type=\"florence-2\", model_path=\"/content/florence2-lora\")\n\nAbove, replace:\n\nAPI_KEY with your Roboflow API key.\nworkspace-id and project-id with your workspace and project IDs.\nVERSION with your project version.\n\nIf you are not using our notebook, replace /content/florence2-lora with the directory where you saved your model weights.\nWhen you run the code above, the model will be uploaded to Roboflow. It will take a few minutes for the model to be processed before it is ready for use.\nYour model will be uploaded to Roboflow." 
}, { - "objectID": "posts/2022-10-21-gaussian-processes.html#appendix", - "href": "posts/2022-10-21-gaussian-processes.html#appendix", - "title": "Gaussian Processes - A no-skip-math version", - "section": "Appendix", - "text": "Appendix\n\\[\n\\int \\mathcal{N}(\\boldsymbol{y}|W\\boldsymbol{x}+\\boldsymbol{b}, \\Sigma) \\mathcal{N}(\\boldsymbol{x}|\\boldsymbol{\\mu}, K) = \\mathcal{N}(\\boldsymbol{y}|W\\boldsymbol{\\mu}+b, WKW^T+\\Sigma)\n\\tag{int gaussians}\n\\]" + "objectID": "lab/how-to-finetune-florence-2-on-detection-dataset.html#deploy-to-your-hardware", + "href": "lab/how-to-finetune-florence-2-on-detection-dataset.html#deploy-to-your-hardware", + "title": "Fine-tuning Florence-2 on Object Detection Dataset", + "section": "Deploy to your hardware", + "text": "Deploy to your hardware\nOnce your model has been processed, you can download it to any device on which you want to deploy your model. Deployment is supported through Roboflow Inference, our open source computer vision inference server.\nInference can be run as a microservice with Docker, ideal for large deployments where you may need a centralized server on which to run inference, or when you want to run Inference in an isolated container. You can also directly integrate Inference into your project through the Inference Python SDK.\nFor this guide, we will show how to deploy the model with the Python SDK.\nFirst, install inference:\n\n!pip install inference\n\nThen, create a new Python file and add the following code:\n\nimport os\nfrom inference import get_model\nfrom PIL import Image\nimport json\n\nlora_model = get_model(\"model-id/version-id\", api_key=\"KEY\")\n\nimage = Image.open(\"containers.png\")\nresponse = lora_model.infer(image)\nprint(response)\n\nIn the code avove, we load our model, run it on an image, then plot the predictions with the supervision Python package.\nWhen you first run the code, your model weights will be downloaded and cached to your device for subsequent runs. 
This process may take a few minutes depending on the strength of your internet connection." } ] \ No newline at end of file diff --git a/site_libs/bootstrap/bootstrap-659650fc26dc25888fc1474f317bb8ac.min.css b/site_libs/bootstrap/bootstrap-5027bf1c1f92ac6615724d89c8213d6a.min.css similarity index 86% rename from site_libs/bootstrap/bootstrap-659650fc26dc25888fc1474f317bb8ac.min.css rename to site_libs/bootstrap/bootstrap-5027bf1c1f92ac6615724d89c8213d6a.min.css index 83fb211..b257798 100644 --- a/site_libs/bootstrap/bootstrap-659650fc26dc25888fc1474f317bb8ac.min.css +++ b/site_libs/bootstrap/bootstrap-5027bf1c1f92ac6615724d89c8213d6a.min.css @@ -2,7 +2,7 @@ * Bootstrap v5.3.1 (https://getbootstrap.com/) * Copyright 2011-2023 The Bootstrap Authors * Licensed under MIT (https://github.com/twbs/bootstrap/blob/main/LICENSE) - */@import"https://fonts.googleapis.com/css2?family=Source+Sans+Pro:wght@300;400;700&display=swap";:root,[data-bs-theme=light]{--bs-blue: #2780e3;--bs-indigo: #6610f2;--bs-purple: #613d7c;--bs-pink: #e83e8c;--bs-red: #ff0039;--bs-orange: #f0ad4e;--bs-yellow: #ff7518;--bs-green: #3fb618;--bs-teal: #20c997;--bs-cyan: #9954bb;--bs-black: #000;--bs-white: #fff;--bs-gray: #6c757d;--bs-gray-dark: #343a40;--bs-gray-100: #f8f9fa;--bs-gray-200: #e9ecef;--bs-gray-300: #dee2e6;--bs-gray-400: #ced4da;--bs-gray-500: #adb5bd;--bs-gray-600: #6c757d;--bs-gray-700: #495057;--bs-gray-800: #343a40;--bs-gray-900: #212529;--bs-default: #343a40;--bs-primary: #2780e3;--bs-secondary: #343a40;--bs-success: #3fb618;--bs-info: #9954bb;--bs-warning: #ff7518;--bs-danger: #ff0039;--bs-light: #f8f9fa;--bs-dark: #343a40;--bs-default-rgb: 52, 58, 64;--bs-primary-rgb: 39, 128, 227;--bs-secondary-rgb: 52, 58, 64;--bs-success-rgb: 63, 182, 24;--bs-info-rgb: 153, 84, 187;--bs-warning-rgb: 255, 117, 24;--bs-danger-rgb: 255, 0, 57;--bs-light-rgb: 248, 249, 250;--bs-dark-rgb: 52, 58, 64;--bs-primary-text-emphasis: #10335b;--bs-secondary-text-emphasis: #15171a;--bs-success-text-emphasis: 
#19490a;--bs-info-text-emphasis: #3d224b;--bs-warning-text-emphasis: #662f0a;--bs-danger-text-emphasis: #660017;--bs-light-text-emphasis: #495057;--bs-dark-text-emphasis: #495057;--bs-primary-bg-subtle: #d4e6f9;--bs-secondary-bg-subtle: #d6d8d9;--bs-success-bg-subtle: #d9f0d1;--bs-info-bg-subtle: #ebddf1;--bs-warning-bg-subtle: #ffe3d1;--bs-danger-bg-subtle: #ffccd7;--bs-light-bg-subtle: #fcfcfd;--bs-dark-bg-subtle: #ced4da;--bs-primary-border-subtle: #a9ccf4;--bs-secondary-border-subtle: #aeb0b3;--bs-success-border-subtle: #b2e2a3;--bs-info-border-subtle: #d6bbe4;--bs-warning-border-subtle: #ffc8a3;--bs-danger-border-subtle: #ff99b0;--bs-light-border-subtle: #e9ecef;--bs-dark-border-subtle: #adb5bd;--bs-white-rgb: 255, 255, 255;--bs-black-rgb: 0, 0, 0;--bs-font-sans-serif: "Source Sans Pro", -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, "Helvetica Neue", Arial, sans-serif, "Apple Color Emoji", "Segoe UI Emoji", "Segoe UI Symbol";--bs-font-monospace: SFMono-Regular, Menlo, Monaco, Consolas, "Liberation Mono", "Courier New", monospace;--bs-gradient: linear-gradient(180deg, rgba(255, 255, 255, 0.15), rgba(255, 255, 255, 0));--bs-root-font-size: 17px;--bs-body-font-family: "Source Sans Pro", -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, "Helvetica Neue", Arial, sans-serif, "Apple Color Emoji", "Segoe UI Emoji", "Segoe UI Symbol";--bs-body-font-size:1rem;--bs-body-font-weight: 400;--bs-body-line-height: 1.5;--bs-body-color: #343a40;--bs-body-color-rgb: 52, 58, 64;--bs-body-bg: #fff;--bs-body-bg-rgb: 255, 255, 255;--bs-emphasis-color: #000;--bs-emphasis-color-rgb: 0, 0, 0;--bs-secondary-color: rgba(52, 58, 64, 0.75);--bs-secondary-color-rgb: 52, 58, 64;--bs-secondary-bg: #e9ecef;--bs-secondary-bg-rgb: 233, 236, 239;--bs-tertiary-color: rgba(52, 58, 64, 0.5);--bs-tertiary-color-rgb: 52, 58, 64;--bs-tertiary-bg: #f8f9fa;--bs-tertiary-bg-rgb: 248, 249, 250;--bs-heading-color: inherit;--bs-link-color: #2761e3;--bs-link-color-rgb: 39, 97, 
227;--bs-link-decoration: underline;--bs-link-hover-color: #1f4eb6;--bs-link-hover-color-rgb: 31, 78, 182;--bs-code-color: #7d12ba;--bs-highlight-bg: #ffe3d1;--bs-border-width: 1px;--bs-border-style: solid;--bs-border-color: #dee2e6;--bs-border-color-translucent: rgba(0, 0, 0, 0.175);--bs-border-radius: 0.25rem;--bs-border-radius-sm: 0.2em;--bs-border-radius-lg: 0.5rem;--bs-border-radius-xl: 1rem;--bs-border-radius-xxl: 2rem;--bs-border-radius-2xl: var(--bs-border-radius-xxl);--bs-border-radius-pill: 50rem;--bs-box-shadow: 0 0.5rem 1rem rgba(0, 0, 0, 0.15);--bs-box-shadow-sm: 0 0.125rem 0.25rem rgba(0, 0, 0, 0.075);--bs-box-shadow-lg: 0 1rem 3rem rgba(0, 0, 0, 0.175);--bs-box-shadow-inset: inset 0 1px 2px rgba(0, 0, 0, 0.075);--bs-focus-ring-width: 0.25rem;--bs-focus-ring-opacity: 0.25;--bs-focus-ring-color: rgba(39, 128, 227, 0.25);--bs-form-valid-color: #3fb618;--bs-form-valid-border-color: #3fb618;--bs-form-invalid-color: #ff0039;--bs-form-invalid-border-color: #ff0039}[data-bs-theme=dark]{color-scheme:dark;--bs-body-color: #dee2e6;--bs-body-color-rgb: 222, 226, 230;--bs-body-bg: #212529;--bs-body-bg-rgb: 33, 37, 41;--bs-emphasis-color: #fff;--bs-emphasis-color-rgb: 255, 255, 255;--bs-secondary-color: rgba(222, 226, 230, 0.75);--bs-secondary-color-rgb: 222, 226, 230;--bs-secondary-bg: #343a40;--bs-secondary-bg-rgb: 52, 58, 64;--bs-tertiary-color: rgba(222, 226, 230, 0.5);--bs-tertiary-color-rgb: 222, 226, 230;--bs-tertiary-bg: #2b3035;--bs-tertiary-bg-rgb: 43, 48, 53;--bs-primary-text-emphasis: #7db3ee;--bs-secondary-text-emphasis: #85898c;--bs-success-text-emphasis: #8cd374;--bs-info-text-emphasis: #c298d6;--bs-warning-text-emphasis: #ffac74;--bs-danger-text-emphasis: #ff6688;--bs-light-text-emphasis: #f8f9fa;--bs-dark-text-emphasis: #dee2e6;--bs-primary-bg-subtle: #081a2d;--bs-secondary-bg-subtle: #0a0c0d;--bs-success-bg-subtle: #0d2405;--bs-info-bg-subtle: #1f1125;--bs-warning-bg-subtle: #331705;--bs-danger-bg-subtle: #33000b;--bs-light-bg-subtle: 
#343a40;--bs-dark-bg-subtle: #1a1d20;--bs-primary-border-subtle: #174d88;--bs-secondary-border-subtle: #1f2326;--bs-success-border-subtle: #266d0e;--bs-info-border-subtle: #5c3270;--bs-warning-border-subtle: #99460e;--bs-danger-border-subtle: #990022;--bs-light-border-subtle: #495057;--bs-dark-border-subtle: #343a40;--bs-heading-color: inherit;--bs-link-color: #7db3ee;--bs-link-hover-color: #97c2f1;--bs-link-color-rgb: 125, 179, 238;--bs-link-hover-color-rgb: 151, 194, 241;--bs-code-color: white;--bs-border-color: #495057;--bs-border-color-translucent: rgba(255, 255, 255, 0.15);--bs-form-valid-color: #8cd374;--bs-form-valid-border-color: #8cd374;--bs-form-invalid-color: #ff6688;--bs-form-invalid-border-color: #ff6688}*,*::before,*::after{box-sizing:border-box}:root{font-size:var(--bs-root-font-size)}body{margin:0;font-family:var(--bs-body-font-family);font-size:var(--bs-body-font-size);font-weight:var(--bs-body-font-weight);line-height:var(--bs-body-line-height);color:var(--bs-body-color);text-align:var(--bs-body-text-align);background-color:var(--bs-body-bg);-webkit-text-size-adjust:100%;-webkit-tap-highlight-color:rgba(0,0,0,0)}hr{margin:1rem 0;color:inherit;border:0;border-top:1px solid;opacity:.25}h6,.h6,h5,.h5,h4,.h4,h3,.h3,h2,.h2,h1,.h1{margin-top:0;margin-bottom:.5rem;font-weight:400;line-height:1.2;color:var(--bs-heading-color)}h1,.h1{font-size:calc(1.325rem + 0.9vw)}@media(min-width: 1200px){h1,.h1{font-size:2rem}}h2,.h2{font-size:calc(1.29rem + 0.48vw)}@media(min-width: 1200px){h2,.h2{font-size:1.65rem}}h3,.h3{font-size:calc(1.27rem + 0.24vw)}@media(min-width: 1200px){h3,.h3{font-size:1.45rem}}h4,.h4{font-size:1.25rem}h5,.h5{font-size:1.1rem}h6,.h6{font-size:1rem}p{margin-top:0;margin-bottom:1rem}abbr[title]{text-decoration:underline dotted;-webkit-text-decoration:underline dotted;-moz-text-decoration:underline dotted;-ms-text-decoration:underline dotted;-o-text-decoration:underline 
dotted;cursor:help;text-decoration-skip-ink:none}address{margin-bottom:1rem;font-style:normal;line-height:inherit}ol,ul{padding-left:2rem}ol,ul,dl{margin-top:0;margin-bottom:1rem}ol ol,ul ul,ol ul,ul ol{margin-bottom:0}dt{font-weight:700}dd{margin-bottom:.5rem;margin-left:0}blockquote{margin:0 0 1rem;padding:.625rem 1.25rem;border-left:.25rem solid #e9ecef}blockquote p:last-child,blockquote ul:last-child,blockquote ol:last-child{margin-bottom:0}b,strong{font-weight:bolder}small,.small{font-size:0.875em}mark,.mark{padding:.1875em;background-color:var(--bs-highlight-bg)}sub,sup{position:relative;font-size:0.75em;line-height:0;vertical-align:baseline}sub{bottom:-0.25em}sup{top:-0.5em}a{color:rgba(var(--bs-link-color-rgb), var(--bs-link-opacity, 1));text-decoration:underline;-webkit-text-decoration:underline;-moz-text-decoration:underline;-ms-text-decoration:underline;-o-text-decoration:underline}a:hover{--bs-link-color-rgb: var(--bs-link-hover-color-rgb)}a:not([href]):not([class]),a:not([href]):not([class]):hover{color:inherit;text-decoration:none}pre,code,kbd,samp{font-family:SFMono-Regular,Menlo,Monaco,Consolas,"Liberation Mono","Courier New",monospace;font-size:1em}pre{display:block;margin-top:0;margin-bottom:1rem;overflow:auto;font-size:0.875em;color:#000;background-color:#f8f9fa;line-height:1.5;padding:.5rem;border:1px solid var(--bs-border-color, #dee2e6)}pre code{background-color:rgba(0,0,0,0);font-size:inherit;color:inherit;word-break:normal}code{font-size:0.875em;color:var(--bs-code-color);background-color:#f8f9fa;padding:.125rem .25rem;word-wrap:break-word}a>code{color:inherit}kbd{padding:.4rem .4rem;font-size:0.875em;color:#fff;background-color:#343a40}kbd kbd{padding:0;font-size:1em}figure{margin:0 0 
1rem}img,svg{vertical-align:middle}table{caption-side:bottom;border-collapse:collapse}caption{padding-top:.5rem;padding-bottom:.5rem;color:rgba(52,58,64,.75);text-align:left}th{text-align:inherit;text-align:-webkit-match-parent}thead,tbody,tfoot,tr,td,th{border-color:inherit;border-style:solid;border-width:0}label{display:inline-block}button{border-radius:0}button:focus:not(:focus-visible){outline:0}input,button,select,optgroup,textarea{margin:0;font-family:inherit;font-size:inherit;line-height:inherit}button,select{text-transform:none}[role=button]{cursor:pointer}select{word-wrap:normal}select:disabled{opacity:1}[list]:not([type=date]):not([type=datetime-local]):not([type=month]):not([type=week]):not([type=time])::-webkit-calendar-picker-indicator{display:none !important}button,[type=button],[type=reset],[type=submit]{-webkit-appearance:button}button:not(:disabled),[type=button]:not(:disabled),[type=reset]:not(:disabled),[type=submit]:not(:disabled){cursor:pointer}::-moz-focus-inner{padding:0;border-style:none}textarea{resize:vertical}fieldset{min-width:0;padding:0;margin:0;border:0}legend{float:left;width:100%;padding:0;margin-bottom:.5rem;font-size:calc(1.275rem + 0.3vw);line-height:inherit}@media(min-width: 1200px){legend{font-size:1.5rem}}legend+*{clear:left}::-webkit-datetime-edit-fields-wrapper,::-webkit-datetime-edit-text,::-webkit-datetime-edit-minute,::-webkit-datetime-edit-hour-field,::-webkit-datetime-edit-day-field,::-webkit-datetime-edit-month-field,::-webkit-datetime-edit-year-field{padding:0}::-webkit-inner-spin-button{height:auto}[type=search]{-webkit-appearance:textfield;outline-offset:-2px}::-webkit-search-decoration{-webkit-appearance:none}::-webkit-color-swatch-wrapper{padding:0}::file-selector-button{font:inherit;-webkit-appearance:button}output{display:inline-block}iframe{border:0}summary{display:list-item;cursor:pointer}progress{vertical-align:baseline}[hidden]{display:none 
!important}.lead{font-size:1.25rem;font-weight:300}.display-1{font-size:calc(1.625rem + 4.5vw);font-weight:300;line-height:1.2}@media(min-width: 1200px){.display-1{font-size:5rem}}.display-2{font-size:calc(1.575rem + 3.9vw);font-weight:300;line-height:1.2}@media(min-width: 1200px){.display-2{font-size:4.5rem}}.display-3{font-size:calc(1.525rem + 3.3vw);font-weight:300;line-height:1.2}@media(min-width: 1200px){.display-3{font-size:4rem}}.display-4{font-size:calc(1.475rem + 2.7vw);font-weight:300;line-height:1.2}@media(min-width: 1200px){.display-4{font-size:3.5rem}}.display-5{font-size:calc(1.425rem + 2.1vw);font-weight:300;line-height:1.2}@media(min-width: 1200px){.display-5{font-size:3rem}}.display-6{font-size:calc(1.375rem + 1.5vw);font-weight:300;line-height:1.2}@media(min-width: 1200px){.display-6{font-size:2.5rem}}.list-unstyled{padding-left:0;list-style:none}.list-inline{padding-left:0;list-style:none}.list-inline-item{display:inline-block}.list-inline-item:not(:last-child){margin-right:.5rem}.initialism{font-size:0.875em;text-transform:uppercase}.blockquote{margin-bottom:1rem;font-size:1.25rem}.blockquote>:last-child{margin-bottom:0}.blockquote-footer{margin-top:-1rem;margin-bottom:1rem;font-size:0.875em;color:#6c757d}.blockquote-footer::before{content:"— "}.img-fluid{max-width:100%;height:auto}.img-thumbnail{padding:.25rem;background-color:#fff;border:1px solid #dee2e6;max-width:100%;height:auto}.figure{display:inline-block}.figure-img{margin-bottom:.5rem;line-height:1}.figure-caption{font-size:0.875em;color:rgba(52,58,64,.75)}.container,.container-fluid,.container-xxl,.container-xl,.container-lg,.container-md,.container-sm{--bs-gutter-x: 1.5rem;--bs-gutter-y: 0;width:100%;padding-right:calc(var(--bs-gutter-x)*.5);padding-left:calc(var(--bs-gutter-x)*.5);margin-right:auto;margin-left:auto}@media(min-width: 576px){.container-sm,.container{max-width:540px}}@media(min-width: 768px){.container-md,.container-sm,.container{max-width:720px}}@media(min-width: 
992px){.container-lg,.container-md,.container-sm,.container{max-width:960px}}@media(min-width: 1200px){.container-xl,.container-lg,.container-md,.container-sm,.container{max-width:1140px}}@media(min-width: 1400px){.container-xxl,.container-xl,.container-lg,.container-md,.container-sm,.container{max-width:1320px}}:root{--bs-breakpoint-xs: 0;--bs-breakpoint-sm: 576px;--bs-breakpoint-md: 768px;--bs-breakpoint-lg: 992px;--bs-breakpoint-xl: 1200px;--bs-breakpoint-xxl: 1400px}.grid{display:grid;grid-template-rows:repeat(var(--bs-rows, 1), 1fr);grid-template-columns:repeat(var(--bs-columns, 12), 1fr);gap:var(--bs-gap, 1.5rem)}.grid .g-col-1{grid-column:auto/span 1}.grid .g-col-2{grid-column:auto/span 2}.grid .g-col-3{grid-column:auto/span 3}.grid .g-col-4{grid-column:auto/span 4}.grid .g-col-5{grid-column:auto/span 5}.grid .g-col-6{grid-column:auto/span 6}.grid .g-col-7{grid-column:auto/span 7}.grid .g-col-8{grid-column:auto/span 8}.grid .g-col-9{grid-column:auto/span 9}.grid .g-col-10{grid-column:auto/span 10}.grid .g-col-11{grid-column:auto/span 11}.grid .g-col-12{grid-column:auto/span 12}.grid .g-start-1{grid-column-start:1}.grid .g-start-2{grid-column-start:2}.grid .g-start-3{grid-column-start:3}.grid .g-start-4{grid-column-start:4}.grid .g-start-5{grid-column-start:5}.grid .g-start-6{grid-column-start:6}.grid .g-start-7{grid-column-start:7}.grid .g-start-8{grid-column-start:8}.grid .g-start-9{grid-column-start:9}.grid .g-start-10{grid-column-start:10}.grid .g-start-11{grid-column-start:11}@media(min-width: 576px){.grid .g-col-sm-1{grid-column:auto/span 1}.grid .g-col-sm-2{grid-column:auto/span 2}.grid .g-col-sm-3{grid-column:auto/span 3}.grid .g-col-sm-4{grid-column:auto/span 4}.grid .g-col-sm-5{grid-column:auto/span 5}.grid .g-col-sm-6{grid-column:auto/span 6}.grid .g-col-sm-7{grid-column:auto/span 7}.grid .g-col-sm-8{grid-column:auto/span 8}.grid .g-col-sm-9{grid-column:auto/span 9}.grid .g-col-sm-10{grid-column:auto/span 10}.grid .g-col-sm-11{grid-column:auto/span 
11}.grid .g-col-sm-12{grid-column:auto/span 12}.grid .g-start-sm-1{grid-column-start:1}.grid .g-start-sm-2{grid-column-start:2}.grid .g-start-sm-3{grid-column-start:3}.grid .g-start-sm-4{grid-column-start:4}.grid .g-start-sm-5{grid-column-start:5}.grid .g-start-sm-6{grid-column-start:6}.grid .g-start-sm-7{grid-column-start:7}.grid .g-start-sm-8{grid-column-start:8}.grid .g-start-sm-9{grid-column-start:9}.grid .g-start-sm-10{grid-column-start:10}.grid .g-start-sm-11{grid-column-start:11}}@media(min-width: 768px){.grid .g-col-md-1{grid-column:auto/span 1}.grid .g-col-md-2{grid-column:auto/span 2}.grid .g-col-md-3{grid-column:auto/span 3}.grid .g-col-md-4{grid-column:auto/span 4}.grid .g-col-md-5{grid-column:auto/span 5}.grid .g-col-md-6{grid-column:auto/span 6}.grid .g-col-md-7{grid-column:auto/span 7}.grid .g-col-md-8{grid-column:auto/span 8}.grid .g-col-md-9{grid-column:auto/span 9}.grid .g-col-md-10{grid-column:auto/span 10}.grid .g-col-md-11{grid-column:auto/span 11}.grid .g-col-md-12{grid-column:auto/span 12}.grid .g-start-md-1{grid-column-start:1}.grid .g-start-md-2{grid-column-start:2}.grid .g-start-md-3{grid-column-start:3}.grid .g-start-md-4{grid-column-start:4}.grid .g-start-md-5{grid-column-start:5}.grid .g-start-md-6{grid-column-start:6}.grid .g-start-md-7{grid-column-start:7}.grid .g-start-md-8{grid-column-start:8}.grid .g-start-md-9{grid-column-start:9}.grid .g-start-md-10{grid-column-start:10}.grid .g-start-md-11{grid-column-start:11}}@media(min-width: 992px){.grid .g-col-lg-1{grid-column:auto/span 1}.grid .g-col-lg-2{grid-column:auto/span 2}.grid .g-col-lg-3{grid-column:auto/span 3}.grid .g-col-lg-4{grid-column:auto/span 4}.grid .g-col-lg-5{grid-column:auto/span 5}.grid .g-col-lg-6{grid-column:auto/span 6}.grid .g-col-lg-7{grid-column:auto/span 7}.grid .g-col-lg-8{grid-column:auto/span 8}.grid .g-col-lg-9{grid-column:auto/span 9}.grid .g-col-lg-10{grid-column:auto/span 10}.grid .g-col-lg-11{grid-column:auto/span 11}.grid 
.g-col-lg-12{grid-column:auto/span 12}.grid .g-start-lg-1{grid-column-start:1}.grid .g-start-lg-2{grid-column-start:2}.grid .g-start-lg-3{grid-column-start:3}.grid .g-start-lg-4{grid-column-start:4}.grid .g-start-lg-5{grid-column-start:5}.grid .g-start-lg-6{grid-column-start:6}.grid .g-start-lg-7{grid-column-start:7}.grid .g-start-lg-8{grid-column-start:8}.grid .g-start-lg-9{grid-column-start:9}.grid .g-start-lg-10{grid-column-start:10}.grid .g-start-lg-11{grid-column-start:11}}@media(min-width: 1200px){.grid .g-col-xl-1{grid-column:auto/span 1}.grid .g-col-xl-2{grid-column:auto/span 2}.grid .g-col-xl-3{grid-column:auto/span 3}.grid .g-col-xl-4{grid-column:auto/span 4}.grid .g-col-xl-5{grid-column:auto/span 5}.grid .g-col-xl-6{grid-column:auto/span 6}.grid .g-col-xl-7{grid-column:auto/span 7}.grid .g-col-xl-8{grid-column:auto/span 8}.grid .g-col-xl-9{grid-column:auto/span 9}.grid .g-col-xl-10{grid-column:auto/span 10}.grid .g-col-xl-11{grid-column:auto/span 11}.grid .g-col-xl-12{grid-column:auto/span 12}.grid .g-start-xl-1{grid-column-start:1}.grid .g-start-xl-2{grid-column-start:2}.grid .g-start-xl-3{grid-column-start:3}.grid .g-start-xl-4{grid-column-start:4}.grid .g-start-xl-5{grid-column-start:5}.grid .g-start-xl-6{grid-column-start:6}.grid .g-start-xl-7{grid-column-start:7}.grid .g-start-xl-8{grid-column-start:8}.grid .g-start-xl-9{grid-column-start:9}.grid .g-start-xl-10{grid-column-start:10}.grid .g-start-xl-11{grid-column-start:11}}@media(min-width: 1400px){.grid .g-col-xxl-1{grid-column:auto/span 1}.grid .g-col-xxl-2{grid-column:auto/span 2}.grid .g-col-xxl-3{grid-column:auto/span 3}.grid .g-col-xxl-4{grid-column:auto/span 4}.grid .g-col-xxl-5{grid-column:auto/span 5}.grid .g-col-xxl-6{grid-column:auto/span 6}.grid .g-col-xxl-7{grid-column:auto/span 7}.grid .g-col-xxl-8{grid-column:auto/span 8}.grid .g-col-xxl-9{grid-column:auto/span 9}.grid .g-col-xxl-10{grid-column:auto/span 10}.grid .g-col-xxl-11{grid-column:auto/span 11}.grid 
.g-col-xxl-12{grid-column:auto/span 12}.grid .g-start-xxl-1{grid-column-start:1}.grid .g-start-xxl-2{grid-column-start:2}.grid .g-start-xxl-3{grid-column-start:3}.grid .g-start-xxl-4{grid-column-start:4}.grid .g-start-xxl-5{grid-column-start:5}.grid .g-start-xxl-6{grid-column-start:6}.grid .g-start-xxl-7{grid-column-start:7}.grid .g-start-xxl-8{grid-column-start:8}.grid .g-start-xxl-9{grid-column-start:9}.grid .g-start-xxl-10{grid-column-start:10}.grid .g-start-xxl-11{grid-column-start:11}}.table{--bs-table-color-type: initial;--bs-table-bg-type: initial;--bs-table-color-state: initial;--bs-table-bg-state: initial;--bs-table-color: #343a40;--bs-table-bg: #fff;--bs-table-border-color: #dee2e6;--bs-table-accent-bg: transparent;--bs-table-striped-color: #343a40;--bs-table-striped-bg: rgba(0, 0, 0, 0.05);--bs-table-active-color: #343a40;--bs-table-active-bg: rgba(0, 0, 0, 0.1);--bs-table-hover-color: #343a40;--bs-table-hover-bg: rgba(0, 0, 0, 0.075);width:100%;margin-bottom:1rem;vertical-align:top;border-color:var(--bs-table-border-color)}.table>:not(caption)>*>*{padding:.5rem .5rem;color:var(--bs-table-color-state, var(--bs-table-color-type, var(--bs-table-color)));background-color:var(--bs-table-bg);border-bottom-width:1px;box-shadow:inset 0 0 0 9999px var(--bs-table-bg-state, var(--bs-table-bg-type, var(--bs-table-accent-bg)))}.table>tbody{vertical-align:inherit}.table>thead{vertical-align:bottom}.table-group-divider{border-top:calc(1px*2) solid #9a9da0}.caption-top{caption-side:top}.table-sm>:not(caption)>*>*{padding:.25rem .25rem}.table-bordered>:not(caption)>*{border-width:1px 0}.table-bordered>:not(caption)>*>*{border-width:0 1px}.table-borderless>:not(caption)>*>*{border-bottom-width:0}.table-borderless>:not(:first-child){border-top-width:0}.table-striped>tbody>tr:nth-of-type(odd)>*{--bs-table-color-type: var(--bs-table-striped-color);--bs-table-bg-type: var(--bs-table-striped-bg)}.table-striped-columns>:not(caption)>tr>:nth-child(even){--bs-table-color-type: 
var(--bs-table-striped-color);--bs-table-bg-type: var(--bs-table-striped-bg)}.table-active{--bs-table-color-state: var(--bs-table-active-color);--bs-table-bg-state: var(--bs-table-active-bg)}.table-hover>tbody>tr:hover>*{--bs-table-color-state: var(--bs-table-hover-color);--bs-table-bg-state: var(--bs-table-hover-bg)}.table-primary{--bs-table-color: #000;--bs-table-bg: #d4e6f9;--bs-table-border-color: #bfcfe0;--bs-table-striped-bg: #c9dbed;--bs-table-striped-color: #000;--bs-table-active-bg: #bfcfe0;--bs-table-active-color: #000;--bs-table-hover-bg: #c4d5e6;--bs-table-hover-color: #000;color:var(--bs-table-color);border-color:var(--bs-table-border-color)}.table-secondary{--bs-table-color: #000;--bs-table-bg: #d6d8d9;--bs-table-border-color: #c1c2c3;--bs-table-striped-bg: #cbcdce;--bs-table-striped-color: #000;--bs-table-active-bg: #c1c2c3;--bs-table-active-color: #000;--bs-table-hover-bg: #c6c8c9;--bs-table-hover-color: #000;color:var(--bs-table-color);border-color:var(--bs-table-border-color)}.table-success{--bs-table-color: #000;--bs-table-bg: #d9f0d1;--bs-table-border-color: #c3d8bc;--bs-table-striped-bg: #cee4c7;--bs-table-striped-color: #000;--bs-table-active-bg: #c3d8bc;--bs-table-active-color: #000;--bs-table-hover-bg: #c9dec1;--bs-table-hover-color: #000;color:var(--bs-table-color);border-color:var(--bs-table-border-color)}.table-info{--bs-table-color: #000;--bs-table-bg: #ebddf1;--bs-table-border-color: #d4c7d9;--bs-table-striped-bg: #dfd2e5;--bs-table-striped-color: #000;--bs-table-active-bg: #d4c7d9;--bs-table-active-color: #000;--bs-table-hover-bg: #d9ccdf;--bs-table-hover-color: #000;color:var(--bs-table-color);border-color:var(--bs-table-border-color)}.table-warning{--bs-table-color: #000;--bs-table-bg: #ffe3d1;--bs-table-border-color: #e6ccbc;--bs-table-striped-bg: #f2d8c7;--bs-table-striped-color: #000;--bs-table-active-bg: #e6ccbc;--bs-table-active-color: #000;--bs-table-hover-bg: #ecd2c1;--bs-table-hover-color: 
#000;color:var(--bs-table-color);border-color:var(--bs-table-border-color)}.table-danger{--bs-table-color: #000;--bs-table-bg: #ffccd7;--bs-table-border-color: #e6b8c2;--bs-table-striped-bg: #f2c2cc;--bs-table-striped-color: #000;--bs-table-active-bg: #e6b8c2;--bs-table-active-color: #000;--bs-table-hover-bg: #ecbdc7;--bs-table-hover-color: #000;color:var(--bs-table-color);border-color:var(--bs-table-border-color)}.table-light{--bs-table-color: #000;--bs-table-bg: #f8f9fa;--bs-table-border-color: #dfe0e1;--bs-table-striped-bg: #ecedee;--bs-table-striped-color: #000;--bs-table-active-bg: #dfe0e1;--bs-table-active-color: #000;--bs-table-hover-bg: #e5e6e7;--bs-table-hover-color: #000;color:var(--bs-table-color);border-color:var(--bs-table-border-color)}.table-dark{--bs-table-color: #fff;--bs-table-bg: #343a40;--bs-table-border-color: #484e53;--bs-table-striped-bg: #3e444a;--bs-table-striped-color: #fff;--bs-table-active-bg: #484e53;--bs-table-active-color: #fff;--bs-table-hover-bg: #43494e;--bs-table-hover-color: #fff;color:var(--bs-table-color);border-color:var(--bs-table-border-color)}.table-responsive{overflow-x:auto;-webkit-overflow-scrolling:touch}@media(max-width: 575.98px){.table-responsive-sm{overflow-x:auto;-webkit-overflow-scrolling:touch}}@media(max-width: 767.98px){.table-responsive-md{overflow-x:auto;-webkit-overflow-scrolling:touch}}@media(max-width: 991.98px){.table-responsive-lg{overflow-x:auto;-webkit-overflow-scrolling:touch}}@media(max-width: 1199.98px){.table-responsive-xl{overflow-x:auto;-webkit-overflow-scrolling:touch}}@media(max-width: 1399.98px){.table-responsive-xxl{overflow-x:auto;-webkit-overflow-scrolling:touch}}.form-label,.shiny-input-container .control-label{margin-bottom:.5rem}.col-form-label{padding-top:calc(0.375rem + 1px);padding-bottom:calc(0.375rem + 1px);margin-bottom:0;font-size:inherit;line-height:1.5}.col-form-label-lg{padding-top:calc(0.5rem + 1px);padding-bottom:calc(0.5rem + 
1px);font-size:1.25rem}.col-form-label-sm{padding-top:calc(0.25rem + 1px);padding-bottom:calc(0.25rem + 1px);font-size:0.875rem}.form-text{margin-top:.25rem;font-size:0.875em;color:rgba(52,58,64,.75)}.form-control{display:block;width:100%;padding:.375rem .75rem;font-size:1rem;font-weight:400;line-height:1.5;color:#343a40;appearance:none;-webkit-appearance:none;-moz-appearance:none;-ms-appearance:none;-o-appearance:none;background-color:#fff;background-clip:padding-box;border:1px solid #dee2e6;border-radius:0;transition:border-color .15s ease-in-out,box-shadow .15s ease-in-out}@media(prefers-reduced-motion: reduce){.form-control{transition:none}}.form-control[type=file]{overflow:hidden}.form-control[type=file]:not(:disabled):not([readonly]){cursor:pointer}.form-control:focus{color:#343a40;background-color:#fff;border-color:#93c0f1;outline:0;box-shadow:0 0 0 .25rem rgba(39,128,227,.25)}.form-control::-webkit-date-and-time-value{min-width:85px;height:1.5em;margin:0}.form-control::-webkit-datetime-edit{display:block;padding:0}.form-control::placeholder{color:rgba(52,58,64,.75);opacity:1}.form-control:disabled{background-color:#e9ecef;opacity:1}.form-control::file-selector-button{padding:.375rem .75rem;margin:-0.375rem -0.75rem;margin-inline-end:.75rem;color:#343a40;background-color:#f8f9fa;pointer-events:none;border-color:inherit;border-style:solid;border-width:0;border-inline-end-width:1px;border-radius:0;transition:color .15s ease-in-out,background-color .15s ease-in-out,border-color .15s ease-in-out,box-shadow .15s ease-in-out}@media(prefers-reduced-motion: reduce){.form-control::file-selector-button{transition:none}}.form-control:hover:not(:disabled):not([readonly])::file-selector-button{background-color:#e9ecef}.form-control-plaintext{display:block;width:100%;padding:.375rem 0;margin-bottom:0;line-height:1.5;color:#343a40;background-color:rgba(0,0,0,0);border:solid rgba(0,0,0,0);border-width:1px 
0}.form-control-plaintext:focus{outline:0}.form-control-plaintext.form-control-sm,.form-control-plaintext.form-control-lg{padding-right:0;padding-left:0}.form-control-sm{min-height:calc(1.5em + 0.5rem + calc(1px * 2));padding:.25rem .5rem;font-size:0.875rem}.form-control-sm::file-selector-button{padding:.25rem .5rem;margin:-0.25rem -0.5rem;margin-inline-end:.5rem}.form-control-lg{min-height:calc(1.5em + 1rem + calc(1px * 2));padding:.5rem 1rem;font-size:1.25rem}.form-control-lg::file-selector-button{padding:.5rem 1rem;margin:-0.5rem -1rem;margin-inline-end:1rem}textarea.form-control{min-height:calc(1.5em + 0.75rem + calc(1px * 2))}textarea.form-control-sm{min-height:calc(1.5em + 0.5rem + calc(1px * 2))}textarea.form-control-lg{min-height:calc(1.5em + 1rem + calc(1px * 2))}.form-control-color{width:3rem;height:calc(1.5em + 0.75rem + calc(1px * 2));padding:.375rem}.form-control-color:not(:disabled):not([readonly]){cursor:pointer}.form-control-color::-moz-color-swatch{border:0 !important}.form-control-color::-webkit-color-swatch{border:0 !important}.form-control-color.form-control-sm{height:calc(1.5em + 0.5rem + calc(1px * 2))}.form-control-color.form-control-lg{height:calc(1.5em + 1rem + calc(1px * 2))}.form-select{--bs-form-select-bg-img: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 16 16'%3e%3cpath fill='none' stroke='%23343a40' stroke-linecap='round' stroke-linejoin='round' stroke-width='2' d='m2 5 6 6 6-6'/%3e%3c/svg%3e");display:block;width:100%;padding:.375rem 2.25rem .375rem .75rem;font-size:1rem;font-weight:400;line-height:1.5;color:#343a40;appearance:none;-webkit-appearance:none;-moz-appearance:none;-ms-appearance:none;-o-appearance:none;background-color:#fff;background-image:var(--bs-form-select-bg-img),var(--bs-form-select-bg-icon, none);background-repeat:no-repeat;background-position:right .75rem center;background-size:16px 12px;border:1px solid #dee2e6;border-radius:0;transition:border-color .15s ease-in-out,box-shadow 
.15s ease-in-out}@media(prefers-reduced-motion: reduce){.form-select{transition:none}}.form-select:focus{border-color:#93c0f1;outline:0;box-shadow:0 0 0 .25rem rgba(39,128,227,.25)}.form-select[multiple],.form-select[size]:not([size="1"]){padding-right:.75rem;background-image:none}.form-select:disabled{background-color:#e9ecef}.form-select:-moz-focusring{color:rgba(0,0,0,0);text-shadow:0 0 0 #343a40}.form-select-sm{padding-top:.25rem;padding-bottom:.25rem;padding-left:.5rem;font-size:0.875rem}.form-select-lg{padding-top:.5rem;padding-bottom:.5rem;padding-left:1rem;font-size:1.25rem}[data-bs-theme=dark] .form-select{--bs-form-select-bg-img: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 16 16'%3e%3cpath fill='none' stroke='%23dee2e6' stroke-linecap='round' stroke-linejoin='round' stroke-width='2' d='m2 5 6 6 6-6'/%3e%3c/svg%3e")}.form-check,.shiny-input-container .checkbox,.shiny-input-container .radio{display:block;min-height:1.5rem;padding-left:0;margin-bottom:.125rem}.form-check .form-check-input,.form-check .shiny-input-container .checkbox input,.form-check .shiny-input-container .radio input,.shiny-input-container .checkbox .form-check-input,.shiny-input-container .checkbox .shiny-input-container .checkbox input,.shiny-input-container .checkbox .shiny-input-container .radio input,.shiny-input-container .radio .form-check-input,.shiny-input-container .radio .shiny-input-container .checkbox input,.shiny-input-container .radio .shiny-input-container .radio input{float:left;margin-left:0}.form-check-reverse{padding-right:0;padding-left:0;text-align:right}.form-check-reverse .form-check-input{float:right;margin-right:0;margin-left:0}.form-check-input,.shiny-input-container .checkbox input,.shiny-input-container .checkbox-inline input,.shiny-input-container .radio input,.shiny-input-container .radio-inline input{--bs-form-check-bg: 
#fff;width:1em;height:1em;margin-top:.25em;vertical-align:top;appearance:none;-webkit-appearance:none;-moz-appearance:none;-ms-appearance:none;-o-appearance:none;background-color:var(--bs-form-check-bg);background-image:var(--bs-form-check-bg-image);background-repeat:no-repeat;background-position:center;background-size:contain;border:1px solid #dee2e6;print-color-adjust:exact}.form-check-input[type=radio],.shiny-input-container .checkbox input[type=radio],.shiny-input-container .checkbox-inline input[type=radio],.shiny-input-container .radio input[type=radio],.shiny-input-container .radio-inline input[type=radio]{border-radius:50%}.form-check-input:active,.shiny-input-container .checkbox input:active,.shiny-input-container .checkbox-inline input:active,.shiny-input-container .radio input:active,.shiny-input-container .radio-inline input:active{filter:brightness(90%)}.form-check-input:focus,.shiny-input-container .checkbox input:focus,.shiny-input-container .checkbox-inline input:focus,.shiny-input-container .radio input:focus,.shiny-input-container .radio-inline input:focus{border-color:#93c0f1;outline:0;box-shadow:0 0 0 .25rem rgba(39,128,227,.25)}.form-check-input:checked,.shiny-input-container .checkbox input:checked,.shiny-input-container .checkbox-inline input:checked,.shiny-input-container .radio input:checked,.shiny-input-container .radio-inline input:checked{background-color:#2780e3;border-color:#2780e3}.form-check-input:checked[type=checkbox],.shiny-input-container .checkbox input:checked[type=checkbox],.shiny-input-container .checkbox-inline input:checked[type=checkbox],.shiny-input-container .radio input:checked[type=checkbox],.shiny-input-container .radio-inline input:checked[type=checkbox]{--bs-form-check-bg-image: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 20 20'%3e%3cpath fill='none' stroke='%23fff' stroke-linecap='round' stroke-linejoin='round' stroke-width='3' d='m6 10 3 3 
6-6'/%3e%3c/svg%3e")}.form-check-input:checked[type=radio],.shiny-input-container .checkbox input:checked[type=radio],.shiny-input-container .checkbox-inline input:checked[type=radio],.shiny-input-container .radio input:checked[type=radio],.shiny-input-container .radio-inline input:checked[type=radio]{--bs-form-check-bg-image: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='-4 -4 8 8'%3e%3ccircle r='2' fill='%23fff'/%3e%3c/svg%3e")}.form-check-input[type=checkbox]:indeterminate,.shiny-input-container .checkbox input[type=checkbox]:indeterminate,.shiny-input-container .checkbox-inline input[type=checkbox]:indeterminate,.shiny-input-container .radio input[type=checkbox]:indeterminate,.shiny-input-container .radio-inline input[type=checkbox]:indeterminate{background-color:#2780e3;border-color:#2780e3;--bs-form-check-bg-image: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 20 20'%3e%3cpath fill='none' stroke='%23fff' stroke-linecap='round' stroke-linejoin='round' stroke-width='3' d='M6 10h8'/%3e%3c/svg%3e")}.form-check-input:disabled,.shiny-input-container .checkbox input:disabled,.shiny-input-container .checkbox-inline input:disabled,.shiny-input-container .radio input:disabled,.shiny-input-container .radio-inline input:disabled{pointer-events:none;filter:none;opacity:.5}.form-check-input[disabled]~.form-check-label,.form-check-input[disabled]~span,.form-check-input:disabled~.form-check-label,.form-check-input:disabled~span,.shiny-input-container .checkbox input[disabled]~.form-check-label,.shiny-input-container .checkbox input[disabled]~span,.shiny-input-container .checkbox input:disabled~.form-check-label,.shiny-input-container .checkbox input:disabled~span,.shiny-input-container .checkbox-inline input[disabled]~.form-check-label,.shiny-input-container .checkbox-inline input[disabled]~span,.shiny-input-container .checkbox-inline input:disabled~.form-check-label,.shiny-input-container .checkbox-inline 
input:disabled~span,.shiny-input-container .radio input[disabled]~.form-check-label,.shiny-input-container .radio input[disabled]~span,.shiny-input-container .radio input:disabled~.form-check-label,.shiny-input-container .radio input:disabled~span,.shiny-input-container .radio-inline input[disabled]~.form-check-label,.shiny-input-container .radio-inline input[disabled]~span,.shiny-input-container .radio-inline input:disabled~.form-check-label,.shiny-input-container .radio-inline input:disabled~span{cursor:default;opacity:.5}.form-check-label,.shiny-input-container .checkbox label,.shiny-input-container .checkbox-inline label,.shiny-input-container .radio label,.shiny-input-container .radio-inline label{cursor:pointer}.form-switch{padding-left:2.5em}.form-switch .form-check-input{--bs-form-switch-bg: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='-4 -4 8 8'%3e%3ccircle r='3' fill='rgba%280, 0, 0, 0.25%29'/%3e%3c/svg%3e");width:2em;margin-left:-2.5em;background-image:var(--bs-form-switch-bg);background-position:left center;transition:background-position .15s ease-in-out}@media(prefers-reduced-motion: reduce){.form-switch .form-check-input{transition:none}}.form-switch .form-check-input:focus{--bs-form-switch-bg: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='-4 -4 8 8'%3e%3ccircle r='3' fill='%2393c0f1'/%3e%3c/svg%3e")}.form-switch .form-check-input:checked{background-position:right center;--bs-form-switch-bg: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='-4 -4 8 8'%3e%3ccircle r='3' fill='%23fff'/%3e%3c/svg%3e")}.form-switch.form-check-reverse{padding-right:2.5em;padding-left:0}.form-switch.form-check-reverse .form-check-input{margin-right:-2.5em;margin-left:0}.form-check-inline{display:inline-block;margin-right:1rem}.btn-check{position:absolute;clip:rect(0, 0, 0, 
0);pointer-events:none}.btn-check[disabled]+.btn,.btn-check:disabled+.btn{pointer-events:none;filter:none;opacity:.65}[data-bs-theme=dark] .form-switch .form-check-input:not(:checked):not(:focus){--bs-form-switch-bg: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='-4 -4 8 8'%3e%3ccircle r='3' fill='rgba%28255, 255, 255, 0.25%29'/%3e%3c/svg%3e")}.form-range{width:100%;height:1.5rem;padding:0;appearance:none;-webkit-appearance:none;-moz-appearance:none;-ms-appearance:none;-o-appearance:none;background-color:rgba(0,0,0,0)}.form-range:focus{outline:0}.form-range:focus::-webkit-slider-thumb{box-shadow:0 0 0 1px #fff,0 0 0 .25rem rgba(39,128,227,.25)}.form-range:focus::-moz-range-thumb{box-shadow:0 0 0 1px #fff,0 0 0 .25rem rgba(39,128,227,.25)}.form-range::-moz-focus-outer{border:0}.form-range::-webkit-slider-thumb{width:1rem;height:1rem;margin-top:-0.25rem;appearance:none;-webkit-appearance:none;-moz-appearance:none;-ms-appearance:none;-o-appearance:none;background-color:#2780e3;border:0;transition:background-color .15s ease-in-out,border-color .15s ease-in-out,box-shadow .15s ease-in-out}@media(prefers-reduced-motion: reduce){.form-range::-webkit-slider-thumb{transition:none}}.form-range::-webkit-slider-thumb:active{background-color:#bed9f7}.form-range::-webkit-slider-runnable-track{width:100%;height:.5rem;color:rgba(0,0,0,0);cursor:pointer;background-color:#f8f9fa;border-color:rgba(0,0,0,0)}.form-range::-moz-range-thumb{width:1rem;height:1rem;appearance:none;-webkit-appearance:none;-moz-appearance:none;-ms-appearance:none;-o-appearance:none;background-color:#2780e3;border:0;transition:background-color .15s ease-in-out,border-color .15s ease-in-out,box-shadow .15s ease-in-out}@media(prefers-reduced-motion: 
reduce){.form-range::-moz-range-thumb{transition:none}}.form-range::-moz-range-thumb:active{background-color:#bed9f7}.form-range::-moz-range-track{width:100%;height:.5rem;color:rgba(0,0,0,0);cursor:pointer;background-color:#f8f9fa;border-color:rgba(0,0,0,0)}.form-range:disabled{pointer-events:none}.form-range:disabled::-webkit-slider-thumb{background-color:rgba(52,58,64,.75)}.form-range:disabled::-moz-range-thumb{background-color:rgba(52,58,64,.75)}.form-floating{position:relative}.form-floating>.form-control,.form-floating>.form-control-plaintext,.form-floating>.form-select{height:calc(3.5rem + calc(1px * 2));min-height:calc(3.5rem + calc(1px * 2));line-height:1.25}.form-floating>label{position:absolute;top:0;left:0;z-index:2;height:100%;padding:1rem .75rem;overflow:hidden;text-align:start;text-overflow:ellipsis;white-space:nowrap;pointer-events:none;border:1px solid rgba(0,0,0,0);transform-origin:0 0;transition:opacity .1s ease-in-out,transform .1s ease-in-out}@media(prefers-reduced-motion: reduce){.form-floating>label{transition:none}}.form-floating>.form-control,.form-floating>.form-control-plaintext{padding:1rem .75rem}.form-floating>.form-control::placeholder,.form-floating>.form-control-plaintext::placeholder{color:rgba(0,0,0,0)}.form-floating>.form-control:focus,.form-floating>.form-control:not(:placeholder-shown),.form-floating>.form-control-plaintext:focus,.form-floating>.form-control-plaintext:not(:placeholder-shown){padding-top:1.625rem;padding-bottom:.625rem}.form-floating>.form-control:-webkit-autofill,.form-floating>.form-control-plaintext:-webkit-autofill{padding-top:1.625rem;padding-bottom:.625rem}.form-floating>.form-select{padding-top:1.625rem;padding-bottom:.625rem}.form-floating>.form-control:focus~label,.form-floating>.form-control:not(:placeholder-shown)~label,.form-floating>.form-control-plaintext~label,.form-floating>.form-select~label{color:rgba(var(--bs-body-color-rgb), 0.65);transform:scale(0.85) translateY(-0.5rem) 
translateX(0.15rem)}.form-floating>.form-control:focus~label::after,.form-floating>.form-control:not(:placeholder-shown)~label::after,.form-floating>.form-control-plaintext~label::after,.form-floating>.form-select~label::after{position:absolute;inset:1rem .375rem;z-index:-1;height:1.5em;content:"";background-color:#fff}.form-floating>.form-control:-webkit-autofill~label{color:rgba(var(--bs-body-color-rgb), 0.65);transform:scale(0.85) translateY(-0.5rem) translateX(0.15rem)}.form-floating>.form-control-plaintext~label{border-width:1px 0}.form-floating>:disabled~label,.form-floating>.form-control:disabled~label{color:#6c757d}.form-floating>:disabled~label::after,.form-floating>.form-control:disabled~label::after{background-color:#e9ecef}.input-group{position:relative;display:flex;display:-webkit-flex;flex-wrap:wrap;-webkit-flex-wrap:wrap;align-items:stretch;-webkit-align-items:stretch;width:100%}.input-group>.form-control,.input-group>.form-select,.input-group>.form-floating{position:relative;flex:1 1 auto;-webkit-flex:1 1 auto;width:1%;min-width:0}.input-group>.form-control:focus,.input-group>.form-select:focus,.input-group>.form-floating:focus-within{z-index:5}.input-group .btn{position:relative;z-index:2}.input-group .btn:focus{z-index:5}.input-group-text{display:flex;display:-webkit-flex;align-items:center;-webkit-align-items:center;padding:.375rem .75rem;font-size:1rem;font-weight:400;line-height:1.5;color:#343a40;text-align:center;white-space:nowrap;background-color:#f8f9fa;border:1px solid #dee2e6}.input-group-lg>.form-control,.input-group-lg>.form-select,.input-group-lg>.input-group-text,.input-group-lg>.btn{padding:.5rem 1rem;font-size:1.25rem}.input-group-sm>.form-control,.input-group-sm>.form-select,.input-group-sm>.input-group-text,.input-group-sm>.btn{padding:.25rem 
.5rem;font-size:0.875rem}.input-group-lg>.form-select,.input-group-sm>.form-select{padding-right:3rem}.input-group>:not(:first-child):not(.dropdown-menu):not(.valid-tooltip):not(.valid-feedback):not(.invalid-tooltip):not(.invalid-feedback){margin-left:calc(1px*-1)}.valid-feedback{display:none;width:100%;margin-top:.25rem;font-size:0.875em;color:#3fb618}.valid-tooltip{position:absolute;top:100%;z-index:5;display:none;max-width:100%;padding:.25rem .5rem;margin-top:.1rem;font-size:0.875rem;color:#fff;background-color:#3fb618}.was-validated :valid~.valid-feedback,.was-validated :valid~.valid-tooltip,.is-valid~.valid-feedback,.is-valid~.valid-tooltip{display:block}.was-validated .form-control:valid,.form-control.is-valid{border-color:#3fb618;padding-right:calc(1.5em + 0.75rem);background-image:url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 8 8'%3e%3cpath fill='%233fb618' d='M2.3 6.73.6 4.53c-.4-1.04.46-1.4 1.1-.8l1.1 1.4 3.4-3.8c.6-.63 1.6-.27 1.2.7l-4 4.6c-.43.5-.8.4-1.1.1z'/%3e%3c/svg%3e");background-repeat:no-repeat;background-position:right calc(0.375em + 0.1875rem) center;background-size:calc(0.75em + 0.375rem) calc(0.75em + 0.375rem)}.was-validated .form-control:valid:focus,.form-control.is-valid:focus{border-color:#3fb618;box-shadow:0 0 0 .25rem rgba(63,182,24,.25)}.was-validated textarea.form-control:valid,textarea.form-control.is-valid{padding-right:calc(1.5em + 0.75rem);background-position:top calc(0.375em + 0.1875rem) right calc(0.375em + 0.1875rem)}.was-validated .form-select:valid,.form-select.is-valid{border-color:#3fb618}.was-validated .form-select:valid:not([multiple]):not([size]),.was-validated .form-select:valid:not([multiple])[size="1"],.form-select.is-valid:not([multiple]):not([size]),.form-select.is-valid:not([multiple])[size="1"]{--bs-form-select-bg-icon: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 8 8'%3e%3cpath fill='%233fb618' d='M2.3 6.73.6 4.53c-.4-1.04.46-1.4 1.1-.8l1.1 1.4 
3.4-3.8c.6-.63 1.6-.27 1.2.7l-4 4.6c-.43.5-.8.4-1.1.1z'/%3e%3c/svg%3e");padding-right:4.125rem;background-position:right .75rem center,center right 2.25rem;background-size:16px 12px,calc(0.75em + 0.375rem) calc(0.75em + 0.375rem)}.was-validated .form-select:valid:focus,.form-select.is-valid:focus{border-color:#3fb618;box-shadow:0 0 0 .25rem rgba(63,182,24,.25)}.was-validated .form-control-color:valid,.form-control-color.is-valid{width:calc(3rem + calc(1.5em + 0.75rem))}.was-validated .form-check-input:valid,.form-check-input.is-valid{border-color:#3fb618}.was-validated .form-check-input:valid:checked,.form-check-input.is-valid:checked{background-color:#3fb618}.was-validated .form-check-input:valid:focus,.form-check-input.is-valid:focus{box-shadow:0 0 0 .25rem rgba(63,182,24,.25)}.was-validated .form-check-input:valid~.form-check-label,.form-check-input.is-valid~.form-check-label{color:#3fb618}.form-check-inline .form-check-input~.valid-feedback{margin-left:.5em}.was-validated .input-group>.form-control:not(:focus):valid,.input-group>.form-control:not(:focus).is-valid,.was-validated .input-group>.form-select:not(:focus):valid,.input-group>.form-select:not(:focus).is-valid,.was-validated .input-group>.form-floating:not(:focus-within):valid,.input-group>.form-floating:not(:focus-within).is-valid{z-index:3}.invalid-feedback{display:none;width:100%;margin-top:.25rem;font-size:0.875em;color:#ff0039}.invalid-tooltip{position:absolute;top:100%;z-index:5;display:none;max-width:100%;padding:.25rem .5rem;margin-top:.1rem;font-size:0.875rem;color:#fff;background-color:#ff0039}.was-validated :invalid~.invalid-feedback,.was-validated :invalid~.invalid-tooltip,.is-invalid~.invalid-feedback,.is-invalid~.invalid-tooltip{display:block}.was-validated .form-control:invalid,.form-control.is-invalid{border-color:#ff0039;padding-right:calc(1.5em + 0.75rem);background-image:url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 12 12' width='12' height='12' 
fill='none' stroke='%23ff0039'%3e%3ccircle cx='6' cy='6' r='4.5'/%3e%3cpath stroke-linejoin='round' d='M5.8 3.6h.4L6 6.5z'/%3e%3ccircle cx='6' cy='8.2' r='.6' fill='%23ff0039' stroke='none'/%3e%3c/svg%3e");background-repeat:no-repeat;background-position:right calc(0.375em + 0.1875rem) center;background-size:calc(0.75em + 0.375rem) calc(0.75em + 0.375rem)}.was-validated .form-control:invalid:focus,.form-control.is-invalid:focus{border-color:#ff0039;box-shadow:0 0 0 .25rem rgba(255,0,57,.25)}.was-validated textarea.form-control:invalid,textarea.form-control.is-invalid{padding-right:calc(1.5em + 0.75rem);background-position:top calc(0.375em + 0.1875rem) right calc(0.375em + 0.1875rem)}.was-validated .form-select:invalid,.form-select.is-invalid{border-color:#ff0039}.was-validated .form-select:invalid:not([multiple]):not([size]),.was-validated .form-select:invalid:not([multiple])[size="1"],.form-select.is-invalid:not([multiple]):not([size]),.form-select.is-invalid:not([multiple])[size="1"]{--bs-form-select-bg-icon: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 12 12' width='12' height='12' fill='none' stroke='%23ff0039'%3e%3ccircle cx='6' cy='6' r='4.5'/%3e%3cpath stroke-linejoin='round' d='M5.8 3.6h.4L6 6.5z'/%3e%3ccircle cx='6' cy='8.2' r='.6' fill='%23ff0039' stroke='none'/%3e%3c/svg%3e");padding-right:4.125rem;background-position:right .75rem center,center right 2.25rem;background-size:16px 12px,calc(0.75em + 0.375rem) calc(0.75em + 0.375rem)}.was-validated .form-select:invalid:focus,.form-select.is-invalid:focus{border-color:#ff0039;box-shadow:0 0 0 .25rem rgba(255,0,57,.25)}.was-validated .form-control-color:invalid,.form-control-color.is-invalid{width:calc(3rem + calc(1.5em + 0.75rem))}.was-validated .form-check-input:invalid,.form-check-input.is-invalid{border-color:#ff0039}.was-validated .form-check-input:invalid:checked,.form-check-input.is-invalid:checked{background-color:#ff0039}.was-validated 
.form-check-input:invalid:focus,.form-check-input.is-invalid:focus{box-shadow:0 0 0 .25rem rgba(255,0,57,.25)}.was-validated .form-check-input:invalid~.form-check-label,.form-check-input.is-invalid~.form-check-label{color:#ff0039}.form-check-inline .form-check-input~.invalid-feedback{margin-left:.5em}.was-validated .input-group>.form-control:not(:focus):invalid,.input-group>.form-control:not(:focus).is-invalid,.was-validated .input-group>.form-select:not(:focus):invalid,.input-group>.form-select:not(:focus).is-invalid,.was-validated .input-group>.form-floating:not(:focus-within):invalid,.input-group>.form-floating:not(:focus-within).is-invalid{z-index:4}.btn{--bs-btn-padding-x: 0.75rem;--bs-btn-padding-y: 0.375rem;--bs-btn-font-family: ;--bs-btn-font-size:1rem;--bs-btn-font-weight: 400;--bs-btn-line-height: 1.5;--bs-btn-color: #343a40;--bs-btn-bg: transparent;--bs-btn-border-width: 1px;--bs-btn-border-color: transparent;--bs-btn-border-radius: 0.25rem;--bs-btn-hover-border-color: transparent;--bs-btn-box-shadow: inset 0 1px 0 rgba(255, 255, 255, 0.15), 0 1px 1px rgba(0, 0, 0, 0.075);--bs-btn-disabled-opacity: 0.65;--bs-btn-focus-box-shadow: 0 0 0 0.25rem rgba(var(--bs-btn-focus-shadow-rgb), .5);display:inline-block;padding:var(--bs-btn-padding-y) var(--bs-btn-padding-x);font-family:var(--bs-btn-font-family);font-size:var(--bs-btn-font-size);font-weight:var(--bs-btn-font-weight);line-height:var(--bs-btn-line-height);color:var(--bs-btn-color);text-align:center;text-decoration:none;-webkit-text-decoration:none;-moz-text-decoration:none;-ms-text-decoration:none;-o-text-decoration:none;vertical-align:middle;cursor:pointer;user-select:none;-webkit-user-select:none;-moz-user-select:none;-ms-user-select:none;-o-user-select:none;border:var(--bs-btn-border-width) solid var(--bs-btn-border-color);background-color:var(--bs-btn-bg);transition:color .15s ease-in-out,background-color .15s ease-in-out,border-color .15s ease-in-out,box-shadow .15s 
ease-in-out}@media(prefers-reduced-motion: reduce){.btn{transition:none}}.btn:hover{color:var(--bs-btn-hover-color);background-color:var(--bs-btn-hover-bg);border-color:var(--bs-btn-hover-border-color)}.btn-check+.btn:hover{color:var(--bs-btn-color);background-color:var(--bs-btn-bg);border-color:var(--bs-btn-border-color)}.btn:focus-visible{color:var(--bs-btn-hover-color);background-color:var(--bs-btn-hover-bg);border-color:var(--bs-btn-hover-border-color);outline:0;box-shadow:var(--bs-btn-focus-box-shadow)}.btn-check:focus-visible+.btn{border-color:var(--bs-btn-hover-border-color);outline:0;box-shadow:var(--bs-btn-focus-box-shadow)}.btn-check:checked+.btn,:not(.btn-check)+.btn:active,.btn:first-child:active,.btn.active,.btn.show{color:var(--bs-btn-active-color);background-color:var(--bs-btn-active-bg);border-color:var(--bs-btn-active-border-color)}.btn-check:checked+.btn:focus-visible,:not(.btn-check)+.btn:active:focus-visible,.btn:first-child:active:focus-visible,.btn.active:focus-visible,.btn.show:focus-visible{box-shadow:var(--bs-btn-focus-box-shadow)}.btn:disabled,.btn.disabled,fieldset:disabled .btn{color:var(--bs-btn-disabled-color);pointer-events:none;background-color:var(--bs-btn-disabled-bg);border-color:var(--bs-btn-disabled-border-color);opacity:var(--bs-btn-disabled-opacity)}.btn-default{--bs-btn-color: #fff;--bs-btn-bg: #343a40;--bs-btn-border-color: #343a40;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #2c3136;--bs-btn-hover-border-color: #2a2e33;--bs-btn-focus-shadow-rgb: 82, 88, 93;--bs-btn-active-color: #fff;--bs-btn-active-bg: #2a2e33;--bs-btn-active-border-color: #272c30;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #fff;--bs-btn-disabled-bg: #343a40;--bs-btn-disabled-border-color: #343a40}.btn-primary{--bs-btn-color: #fff;--bs-btn-bg: #2780e3;--bs-btn-border-color: #2780e3;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #216dc1;--bs-btn-hover-border-color: #1f66b6;--bs-btn-focus-shadow-rgb: 71, 147, 
231;--bs-btn-active-color: #fff;--bs-btn-active-bg: #1f66b6;--bs-btn-active-border-color: #1d60aa;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #fff;--bs-btn-disabled-bg: #2780e3;--bs-btn-disabled-border-color: #2780e3}.btn-secondary{--bs-btn-color: #fff;--bs-btn-bg: #343a40;--bs-btn-border-color: #343a40;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #2c3136;--bs-btn-hover-border-color: #2a2e33;--bs-btn-focus-shadow-rgb: 82, 88, 93;--bs-btn-active-color: #fff;--bs-btn-active-bg: #2a2e33;--bs-btn-active-border-color: #272c30;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #fff;--bs-btn-disabled-bg: #343a40;--bs-btn-disabled-border-color: #343a40}.btn-success{--bs-btn-color: #fff;--bs-btn-bg: #3fb618;--bs-btn-border-color: #3fb618;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #369b14;--bs-btn-hover-border-color: #329213;--bs-btn-focus-shadow-rgb: 92, 193, 59;--bs-btn-active-color: #fff;--bs-btn-active-bg: #329213;--bs-btn-active-border-color: #2f8912;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #fff;--bs-btn-disabled-bg: #3fb618;--bs-btn-disabled-border-color: #3fb618}.btn-info{--bs-btn-color: #fff;--bs-btn-bg: #9954bb;--bs-btn-border-color: #9954bb;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #82479f;--bs-btn-hover-border-color: #7a4396;--bs-btn-focus-shadow-rgb: 168, 110, 197;--bs-btn-active-color: #fff;--bs-btn-active-bg: #7a4396;--bs-btn-active-border-color: #733f8c;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #fff;--bs-btn-disabled-bg: #9954bb;--bs-btn-disabled-border-color: #9954bb}.btn-warning{--bs-btn-color: #fff;--bs-btn-bg: #ff7518;--bs-btn-border-color: #ff7518;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #d96314;--bs-btn-hover-border-color: #cc5e13;--bs-btn-focus-shadow-rgb: 255, 138, 59;--bs-btn-active-color: #fff;--bs-btn-active-bg: #cc5e13;--bs-btn-active-border-color: 
#bf5812;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #fff;--bs-btn-disabled-bg: #ff7518;--bs-btn-disabled-border-color: #ff7518}.btn-danger{--bs-btn-color: #fff;--bs-btn-bg: #ff0039;--bs-btn-border-color: #ff0039;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #d90030;--bs-btn-hover-border-color: #cc002e;--bs-btn-focus-shadow-rgb: 255, 38, 87;--bs-btn-active-color: #fff;--bs-btn-active-bg: #cc002e;--bs-btn-active-border-color: #bf002b;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #fff;--bs-btn-disabled-bg: #ff0039;--bs-btn-disabled-border-color: #ff0039}.btn-light{--bs-btn-color: #000;--bs-btn-bg: #f8f9fa;--bs-btn-border-color: #f8f9fa;--bs-btn-hover-color: #000;--bs-btn-hover-bg: #d3d4d5;--bs-btn-hover-border-color: #c6c7c8;--bs-btn-focus-shadow-rgb: 211, 212, 213;--bs-btn-active-color: #000;--bs-btn-active-bg: #c6c7c8;--bs-btn-active-border-color: #babbbc;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #000;--bs-btn-disabled-bg: #f8f9fa;--bs-btn-disabled-border-color: #f8f9fa}.btn-dark{--bs-btn-color: #fff;--bs-btn-bg: #343a40;--bs-btn-border-color: #343a40;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #52585d;--bs-btn-hover-border-color: #484e53;--bs-btn-focus-shadow-rgb: 82, 88, 93;--bs-btn-active-color: #fff;--bs-btn-active-bg: #5d6166;--bs-btn-active-border-color: #484e53;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #fff;--bs-btn-disabled-bg: #343a40;--bs-btn-disabled-border-color: #343a40}.btn-outline-default{--bs-btn-color: #343a40;--bs-btn-border-color: #343a40;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #343a40;--bs-btn-hover-border-color: #343a40;--bs-btn-focus-shadow-rgb: 52, 58, 64;--bs-btn-active-color: #fff;--bs-btn-active-bg: #343a40;--bs-btn-active-border-color: #343a40;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #343a40;--bs-btn-disabled-bg: 
transparent;--bs-btn-disabled-border-color: #343a40;--bs-btn-bg: transparent;--bs-gradient: none}.btn-outline-primary{--bs-btn-color: #2780e3;--bs-btn-border-color: #2780e3;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #2780e3;--bs-btn-hover-border-color: #2780e3;--bs-btn-focus-shadow-rgb: 39, 128, 227;--bs-btn-active-color: #fff;--bs-btn-active-bg: #2780e3;--bs-btn-active-border-color: #2780e3;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #2780e3;--bs-btn-disabled-bg: transparent;--bs-btn-disabled-border-color: #2780e3;--bs-btn-bg: transparent;--bs-gradient: none}.btn-outline-secondary{--bs-btn-color: #343a40;--bs-btn-border-color: #343a40;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #343a40;--bs-btn-hover-border-color: #343a40;--bs-btn-focus-shadow-rgb: 52, 58, 64;--bs-btn-active-color: #fff;--bs-btn-active-bg: #343a40;--bs-btn-active-border-color: #343a40;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #343a40;--bs-btn-disabled-bg: transparent;--bs-btn-disabled-border-color: #343a40;--bs-btn-bg: transparent;--bs-gradient: none}.btn-outline-success{--bs-btn-color: #3fb618;--bs-btn-border-color: #3fb618;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #3fb618;--bs-btn-hover-border-color: #3fb618;--bs-btn-focus-shadow-rgb: 63, 182, 24;--bs-btn-active-color: #fff;--bs-btn-active-bg: #3fb618;--bs-btn-active-border-color: #3fb618;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #3fb618;--bs-btn-disabled-bg: transparent;--bs-btn-disabled-border-color: #3fb618;--bs-btn-bg: transparent;--bs-gradient: none}.btn-outline-info{--bs-btn-color: #9954bb;--bs-btn-border-color: #9954bb;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #9954bb;--bs-btn-hover-border-color: #9954bb;--bs-btn-focus-shadow-rgb: 153, 84, 187;--bs-btn-active-color: #fff;--bs-btn-active-bg: #9954bb;--bs-btn-active-border-color: #9954bb;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 
0.125);--bs-btn-disabled-color: #9954bb;--bs-btn-disabled-bg: transparent;--bs-btn-disabled-border-color: #9954bb;--bs-btn-bg: transparent;--bs-gradient: none}.btn-outline-warning{--bs-btn-color: #ff7518;--bs-btn-border-color: #ff7518;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #ff7518;--bs-btn-hover-border-color: #ff7518;--bs-btn-focus-shadow-rgb: 255, 117, 24;--bs-btn-active-color: #fff;--bs-btn-active-bg: #ff7518;--bs-btn-active-border-color: #ff7518;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #ff7518;--bs-btn-disabled-bg: transparent;--bs-btn-disabled-border-color: #ff7518;--bs-btn-bg: transparent;--bs-gradient: none}.btn-outline-danger{--bs-btn-color: #ff0039;--bs-btn-border-color: #ff0039;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #ff0039;--bs-btn-hover-border-color: #ff0039;--bs-btn-focus-shadow-rgb: 255, 0, 57;--bs-btn-active-color: #fff;--bs-btn-active-bg: #ff0039;--bs-btn-active-border-color: #ff0039;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #ff0039;--bs-btn-disabled-bg: transparent;--bs-btn-disabled-border-color: #ff0039;--bs-btn-bg: transparent;--bs-gradient: none}.btn-outline-light{--bs-btn-color: #f8f9fa;--bs-btn-border-color: #f8f9fa;--bs-btn-hover-color: #000;--bs-btn-hover-bg: #f8f9fa;--bs-btn-hover-border-color: #f8f9fa;--bs-btn-focus-shadow-rgb: 248, 249, 250;--bs-btn-active-color: #000;--bs-btn-active-bg: #f8f9fa;--bs-btn-active-border-color: #f8f9fa;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #f8f9fa;--bs-btn-disabled-bg: transparent;--bs-btn-disabled-border-color: #f8f9fa;--bs-btn-bg: transparent;--bs-gradient: none}.btn-outline-dark{--bs-btn-color: #343a40;--bs-btn-border-color: #343a40;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #343a40;--bs-btn-hover-border-color: #343a40;--bs-btn-focus-shadow-rgb: 52, 58, 64;--bs-btn-active-color: #fff;--bs-btn-active-bg: #343a40;--bs-btn-active-border-color: 
#343a40;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #343a40;--bs-btn-disabled-bg: transparent;--bs-btn-disabled-border-color: #343a40;--bs-btn-bg: transparent;--bs-gradient: none}.btn-link{--bs-btn-font-weight: 400;--bs-btn-color: #2761e3;--bs-btn-bg: transparent;--bs-btn-border-color: transparent;--bs-btn-hover-color: #1f4eb6;--bs-btn-hover-border-color: transparent;--bs-btn-active-color: #1f4eb6;--bs-btn-active-border-color: transparent;--bs-btn-disabled-color: #6c757d;--bs-btn-disabled-border-color: transparent;--bs-btn-box-shadow: 0 0 0 #000;--bs-btn-focus-shadow-rgb: 71, 121, 231;text-decoration:underline;-webkit-text-decoration:underline;-moz-text-decoration:underline;-ms-text-decoration:underline;-o-text-decoration:underline}.btn-link:focus-visible{color:var(--bs-btn-color)}.btn-link:hover{color:var(--bs-btn-hover-color)}.btn-lg,.btn-group-lg>.btn{--bs-btn-padding-y: 0.5rem;--bs-btn-padding-x: 1rem;--bs-btn-font-size:1.25rem;--bs-btn-border-radius: 0.5rem}.btn-sm,.btn-group-sm>.btn{--bs-btn-padding-y: 0.25rem;--bs-btn-padding-x: 0.5rem;--bs-btn-font-size:0.875rem;--bs-btn-border-radius: 0.2em}.fade{transition:opacity .15s linear}@media(prefers-reduced-motion: reduce){.fade{transition:none}}.fade:not(.show){opacity:0}.collapse:not(.show){display:none}.collapsing{height:0;overflow:hidden;transition:height .2s ease}@media(prefers-reduced-motion: reduce){.collapsing{transition:none}}.collapsing.collapse-horizontal{width:0;height:auto;transition:width .35s ease}@media(prefers-reduced-motion: reduce){.collapsing.collapse-horizontal{transition:none}}.dropup,.dropend,.dropdown,.dropstart,.dropup-center,.dropdown-center{position:relative}.dropdown-toggle{white-space:nowrap}.dropdown-toggle::after{display:inline-block;margin-left:.255em;vertical-align:.255em;content:"";border-top:.3em solid;border-right:.3em solid rgba(0,0,0,0);border-bottom:0;border-left:.3em solid 
rgba(0,0,0,0)}.dropdown-toggle:empty::after{margin-left:0}.dropdown-menu{--bs-dropdown-zindex: 1000;--bs-dropdown-min-width: 10rem;--bs-dropdown-padding-x: 0;--bs-dropdown-padding-y: 0.5rem;--bs-dropdown-spacer: 0.125rem;--bs-dropdown-font-size:1rem;--bs-dropdown-color: #343a40;--bs-dropdown-bg: #fff;--bs-dropdown-border-color: rgba(0, 0, 0, 0.175);--bs-dropdown-border-radius: 0.25rem;--bs-dropdown-border-width: 1px;--bs-dropdown-inner-border-radius: calc(0.25rem - 1px);--bs-dropdown-divider-bg: rgba(0, 0, 0, 0.175);--bs-dropdown-divider-margin-y: 0.5rem;--bs-dropdown-box-shadow: 0 0.5rem 1rem rgba(0, 0, 0, 0.15);--bs-dropdown-link-color: #343a40;--bs-dropdown-link-hover-color: #343a40;--bs-dropdown-link-hover-bg: #f8f9fa;--bs-dropdown-link-active-color: #fff;--bs-dropdown-link-active-bg: #2780e3;--bs-dropdown-link-disabled-color: rgba(52, 58, 64, 0.5);--bs-dropdown-item-padding-x: 1rem;--bs-dropdown-item-padding-y: 0.25rem;--bs-dropdown-header-color: #6c757d;--bs-dropdown-header-padding-x: 1rem;--bs-dropdown-header-padding-y: 0.5rem;position:absolute;z-index:var(--bs-dropdown-zindex);display:none;min-width:var(--bs-dropdown-min-width);padding:var(--bs-dropdown-padding-y) var(--bs-dropdown-padding-x);margin:0;font-size:var(--bs-dropdown-font-size);color:var(--bs-dropdown-color);text-align:left;list-style:none;background-color:var(--bs-dropdown-bg);background-clip:padding-box;border:var(--bs-dropdown-border-width) solid var(--bs-dropdown-border-color)}.dropdown-menu[data-bs-popper]{top:100%;left:0;margin-top:var(--bs-dropdown-spacer)}.dropdown-menu-start{--bs-position: start}.dropdown-menu-start[data-bs-popper]{right:auto;left:0}.dropdown-menu-end{--bs-position: end}.dropdown-menu-end[data-bs-popper]{right:0;left:auto}@media(min-width: 576px){.dropdown-menu-sm-start{--bs-position: start}.dropdown-menu-sm-start[data-bs-popper]{right:auto;left:0}.dropdown-menu-sm-end{--bs-position: end}.dropdown-menu-sm-end[data-bs-popper]{right:0;left:auto}}@media(min-width: 
768px){.dropdown-menu-md-start{--bs-position: start}.dropdown-menu-md-start[data-bs-popper]{right:auto;left:0}.dropdown-menu-md-end{--bs-position: end}.dropdown-menu-md-end[data-bs-popper]{right:0;left:auto}}@media(min-width: 992px){.dropdown-menu-lg-start{--bs-position: start}.dropdown-menu-lg-start[data-bs-popper]{right:auto;left:0}.dropdown-menu-lg-end{--bs-position: end}.dropdown-menu-lg-end[data-bs-popper]{right:0;left:auto}}@media(min-width: 1200px){.dropdown-menu-xl-start{--bs-position: start}.dropdown-menu-xl-start[data-bs-popper]{right:auto;left:0}.dropdown-menu-xl-end{--bs-position: end}.dropdown-menu-xl-end[data-bs-popper]{right:0;left:auto}}@media(min-width: 1400px){.dropdown-menu-xxl-start{--bs-position: start}.dropdown-menu-xxl-start[data-bs-popper]{right:auto;left:0}.dropdown-menu-xxl-end{--bs-position: end}.dropdown-menu-xxl-end[data-bs-popper]{right:0;left:auto}}.dropup .dropdown-menu[data-bs-popper]{top:auto;bottom:100%;margin-top:0;margin-bottom:var(--bs-dropdown-spacer)}.dropup .dropdown-toggle::after{display:inline-block;margin-left:.255em;vertical-align:.255em;content:"";border-top:0;border-right:.3em solid rgba(0,0,0,0);border-bottom:.3em solid;border-left:.3em solid rgba(0,0,0,0)}.dropup .dropdown-toggle:empty::after{margin-left:0}.dropend .dropdown-menu[data-bs-popper]{top:0;right:auto;left:100%;margin-top:0;margin-left:var(--bs-dropdown-spacer)}.dropend .dropdown-toggle::after{display:inline-block;margin-left:.255em;vertical-align:.255em;content:"";border-top:.3em solid rgba(0,0,0,0);border-right:0;border-bottom:.3em solid rgba(0,0,0,0);border-left:.3em solid}.dropend .dropdown-toggle:empty::after{margin-left:0}.dropend .dropdown-toggle::after{vertical-align:0}.dropstart .dropdown-menu[data-bs-popper]{top:0;right:100%;left:auto;margin-top:0;margin-right:var(--bs-dropdown-spacer)}.dropstart .dropdown-toggle::after{display:inline-block;margin-left:.255em;vertical-align:.255em;content:""}.dropstart 
.dropdown-toggle::after{display:none}.dropstart .dropdown-toggle::before{display:inline-block;margin-right:.255em;vertical-align:.255em;content:"";border-top:.3em solid rgba(0,0,0,0);border-right:.3em solid;border-bottom:.3em solid rgba(0,0,0,0)}.dropstart .dropdown-toggle:empty::after{margin-left:0}.dropstart .dropdown-toggle::before{vertical-align:0}.dropdown-divider{height:0;margin:var(--bs-dropdown-divider-margin-y) 0;overflow:hidden;border-top:1px solid var(--bs-dropdown-divider-bg);opacity:1}.dropdown-item{display:block;width:100%;padding:var(--bs-dropdown-item-padding-y) var(--bs-dropdown-item-padding-x);clear:both;font-weight:400;color:var(--bs-dropdown-link-color);text-align:inherit;text-decoration:none;-webkit-text-decoration:none;-moz-text-decoration:none;-ms-text-decoration:none;-o-text-decoration:none;white-space:nowrap;background-color:rgba(0,0,0,0);border:0}.dropdown-item:hover,.dropdown-item:focus{color:var(--bs-dropdown-link-hover-color);background-color:var(--bs-dropdown-link-hover-bg)}.dropdown-item.active,.dropdown-item:active{color:var(--bs-dropdown-link-active-color);text-decoration:none;background-color:var(--bs-dropdown-link-active-bg)}.dropdown-item.disabled,.dropdown-item:disabled{color:var(--bs-dropdown-link-disabled-color);pointer-events:none;background-color:rgba(0,0,0,0)}.dropdown-menu.show{display:block}.dropdown-header{display:block;padding:var(--bs-dropdown-header-padding-y) var(--bs-dropdown-header-padding-x);margin-bottom:0;font-size:0.875rem;color:var(--bs-dropdown-header-color);white-space:nowrap}.dropdown-item-text{display:block;padding:var(--bs-dropdown-item-padding-y) var(--bs-dropdown-item-padding-x);color:var(--bs-dropdown-link-color)}.dropdown-menu-dark{--bs-dropdown-color: #dee2e6;--bs-dropdown-bg: #343a40;--bs-dropdown-border-color: rgba(0, 0, 0, 0.175);--bs-dropdown-box-shadow: ;--bs-dropdown-link-color: #dee2e6;--bs-dropdown-link-hover-color: #fff;--bs-dropdown-divider-bg: rgba(0, 0, 0, 
0.175);--bs-dropdown-link-hover-bg: rgba(255, 255, 255, 0.15);--bs-dropdown-link-active-color: #fff;--bs-dropdown-link-active-bg: #2780e3;--bs-dropdown-link-disabled-color: #adb5bd;--bs-dropdown-header-color: #adb5bd}.btn-group,.btn-group-vertical{position:relative;display:inline-flex;vertical-align:middle}.btn-group>.btn,.btn-group-vertical>.btn{position:relative;flex:1 1 auto;-webkit-flex:1 1 auto}.btn-group>.btn-check:checked+.btn,.btn-group>.btn-check:focus+.btn,.btn-group>.btn:hover,.btn-group>.btn:focus,.btn-group>.btn:active,.btn-group>.btn.active,.btn-group-vertical>.btn-check:checked+.btn,.btn-group-vertical>.btn-check:focus+.btn,.btn-group-vertical>.btn:hover,.btn-group-vertical>.btn:focus,.btn-group-vertical>.btn:active,.btn-group-vertical>.btn.active{z-index:1}.btn-toolbar{display:flex;display:-webkit-flex;flex-wrap:wrap;-webkit-flex-wrap:wrap;justify-content:flex-start;-webkit-justify-content:flex-start}.btn-toolbar .input-group{width:auto}.btn-group>:not(.btn-check:first-child)+.btn,.btn-group>.btn-group:not(:first-child){margin-left:calc(1px*-1)}.dropdown-toggle-split{padding-right:.5625rem;padding-left:.5625rem}.dropdown-toggle-split::after,.dropup .dropdown-toggle-split::after,.dropend .dropdown-toggle-split::after{margin-left:0}.dropstart .dropdown-toggle-split::before{margin-right:0}.btn-sm+.dropdown-toggle-split,.btn-group-sm>.btn+.dropdown-toggle-split{padding-right:.375rem;padding-left:.375rem}.btn-lg+.dropdown-toggle-split,.btn-group-lg>.btn+.dropdown-toggle-split{padding-right:.75rem;padding-left:.75rem}.btn-group-vertical{flex-direction:column;-webkit-flex-direction:column;align-items:flex-start;-webkit-align-items:flex-start;justify-content:center;-webkit-justify-content:center}.btn-group-vertical>.btn,.btn-group-vertical>.btn-group{width:100%}.btn-group-vertical>.btn:not(:first-child),.btn-group-vertical>.btn-group:not(:first-child){margin-top:calc(1px*-1)}.nav{--bs-nav-link-padding-x: 1rem;--bs-nav-link-padding-y: 
0.5rem;--bs-nav-link-font-weight: ;--bs-nav-link-color: #2761e3;--bs-nav-link-hover-color: #1f4eb6;--bs-nav-link-disabled-color: rgba(52, 58, 64, 0.75);display:flex;display:-webkit-flex;flex-wrap:wrap;-webkit-flex-wrap:wrap;padding-left:0;margin-bottom:0;list-style:none}.nav-link{display:block;padding:var(--bs-nav-link-padding-y) var(--bs-nav-link-padding-x);font-size:var(--bs-nav-link-font-size);font-weight:var(--bs-nav-link-font-weight);color:var(--bs-nav-link-color);text-decoration:none;-webkit-text-decoration:none;-moz-text-decoration:none;-ms-text-decoration:none;-o-text-decoration:none;background:none;border:0;transition:color .15s ease-in-out,background-color .15s ease-in-out,border-color .15s ease-in-out}@media(prefers-reduced-motion: reduce){.nav-link{transition:none}}.nav-link:hover,.nav-link:focus{color:var(--bs-nav-link-hover-color)}.nav-link:focus-visible{outline:0;box-shadow:0 0 0 .25rem rgba(39,128,227,.25)}.nav-link.disabled,.nav-link:disabled{color:var(--bs-nav-link-disabled-color);pointer-events:none;cursor:default}.nav-tabs{--bs-nav-tabs-border-width: 1px;--bs-nav-tabs-border-color: #dee2e6;--bs-nav-tabs-border-radius: 0.25rem;--bs-nav-tabs-link-hover-border-color: #e9ecef #e9ecef #dee2e6;--bs-nav-tabs-link-active-color: #000;--bs-nav-tabs-link-active-bg: #fff;--bs-nav-tabs-link-active-border-color: #dee2e6 #dee2e6 #fff;border-bottom:var(--bs-nav-tabs-border-width) solid var(--bs-nav-tabs-border-color)}.nav-tabs .nav-link{margin-bottom:calc(-1*var(--bs-nav-tabs-border-width));border:var(--bs-nav-tabs-border-width) solid rgba(0,0,0,0)}.nav-tabs .nav-link:hover,.nav-tabs .nav-link:focus{isolation:isolate;border-color:var(--bs-nav-tabs-link-hover-border-color)}.nav-tabs .nav-link.active,.nav-tabs .nav-item.show .nav-link{color:var(--bs-nav-tabs-link-active-color);background-color:var(--bs-nav-tabs-link-active-bg);border-color:var(--bs-nav-tabs-link-active-border-color)}.nav-tabs 
.dropdown-menu{margin-top:calc(-1*var(--bs-nav-tabs-border-width))}.nav-pills{--bs-nav-pills-border-radius: 0.25rem;--bs-nav-pills-link-active-color: #fff;--bs-nav-pills-link-active-bg: #2780e3}.nav-pills .nav-link.active,.nav-pills .show>.nav-link{color:var(--bs-nav-pills-link-active-color);background-color:var(--bs-nav-pills-link-active-bg)}.nav-underline{--bs-nav-underline-gap: 1rem;--bs-nav-underline-border-width: 0.125rem;--bs-nav-underline-link-active-color: #000;gap:var(--bs-nav-underline-gap)}.nav-underline .nav-link{padding-right:0;padding-left:0;border-bottom:var(--bs-nav-underline-border-width) solid rgba(0,0,0,0)}.nav-underline .nav-link:hover,.nav-underline .nav-link:focus{border-bottom-color:currentcolor}.nav-underline .nav-link.active,.nav-underline .show>.nav-link{font-weight:700;color:var(--bs-nav-underline-link-active-color);border-bottom-color:currentcolor}.nav-fill>.nav-link,.nav-fill .nav-item{flex:1 1 auto;-webkit-flex:1 1 auto;text-align:center}.nav-justified>.nav-link,.nav-justified .nav-item{flex-basis:0;-webkit-flex-basis:0;flex-grow:1;-webkit-flex-grow:1;text-align:center}.nav-fill .nav-item .nav-link,.nav-justified .nav-item .nav-link{width:100%}.tab-content>.tab-pane{display:none}.tab-content>.active{display:block}.navbar{--bs-navbar-padding-x: 0;--bs-navbar-padding-y: 0.5rem;--bs-navbar-color: #545555;--bs-navbar-hover-color: rgba(31, 78, 182, 0.8);--bs-navbar-disabled-color: rgba(84, 85, 85, 0.75);--bs-navbar-active-color: #1f4eb6;--bs-navbar-brand-padding-y: 0.3125rem;--bs-navbar-brand-margin-end: 1rem;--bs-navbar-brand-font-size: 1.25rem;--bs-navbar-brand-color: #545555;--bs-navbar-brand-hover-color: #1f4eb6;--bs-navbar-nav-link-padding-x: 0.5rem;--bs-navbar-toggler-padding-y: 0.25;--bs-navbar-toggler-padding-x: 0;--bs-navbar-toggler-font-size: 1.25rem;--bs-navbar-toggler-icon-bg: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 30 30'%3e%3cpath stroke='%23545555' stroke-linecap='round' 
stroke-miterlimit='10' stroke-width='2' d='M4 7h22M4 15h22M4 23h22'/%3e%3c/svg%3e");--bs-navbar-toggler-border-color: rgba(84, 85, 85, 0);--bs-navbar-toggler-border-radius: 0.25rem;--bs-navbar-toggler-focus-width: 0.25rem;--bs-navbar-toggler-transition: box-shadow 0.15s ease-in-out;position:relative;display:flex;display:-webkit-flex;flex-wrap:wrap;-webkit-flex-wrap:wrap;align-items:center;-webkit-align-items:center;justify-content:space-between;-webkit-justify-content:space-between;padding:var(--bs-navbar-padding-y) var(--bs-navbar-padding-x)}.navbar>.container,.navbar>.container-fluid,.navbar>.container-sm,.navbar>.container-md,.navbar>.container-lg,.navbar>.container-xl,.navbar>.container-xxl{display:flex;display:-webkit-flex;flex-wrap:inherit;-webkit-flex-wrap:inherit;align-items:center;-webkit-align-items:center;justify-content:space-between;-webkit-justify-content:space-between}.navbar-brand{padding-top:var(--bs-navbar-brand-padding-y);padding-bottom:var(--bs-navbar-brand-padding-y);margin-right:var(--bs-navbar-brand-margin-end);font-size:var(--bs-navbar-brand-font-size);color:var(--bs-navbar-brand-color);text-decoration:none;-webkit-text-decoration:none;-moz-text-decoration:none;-ms-text-decoration:none;-o-text-decoration:none;white-space:nowrap}.navbar-brand:hover,.navbar-brand:focus{color:var(--bs-navbar-brand-hover-color)}.navbar-nav{--bs-nav-link-padding-x: 0;--bs-nav-link-padding-y: 0.5rem;--bs-nav-link-font-weight: ;--bs-nav-link-color: var(--bs-navbar-color);--bs-nav-link-hover-color: var(--bs-navbar-hover-color);--bs-nav-link-disabled-color: var(--bs-navbar-disabled-color);display:flex;display:-webkit-flex;flex-direction:column;-webkit-flex-direction:column;padding-left:0;margin-bottom:0;list-style:none}.navbar-nav .nav-link.active,.navbar-nav .nav-link.show{color:var(--bs-navbar-active-color)}.navbar-nav .dropdown-menu{position:static}.navbar-text{padding-top:.5rem;padding-bottom:.5rem;color:var(--bs-navbar-color)}.navbar-text a,.navbar-text 
a:hover,.navbar-text a:focus{color:var(--bs-navbar-active-color)}.navbar-collapse{flex-basis:100%;-webkit-flex-basis:100%;flex-grow:1;-webkit-flex-grow:1;align-items:center;-webkit-align-items:center}.navbar-toggler{padding:var(--bs-navbar-toggler-padding-y) var(--bs-navbar-toggler-padding-x);font-size:var(--bs-navbar-toggler-font-size);line-height:1;color:var(--bs-navbar-color);background-color:rgba(0,0,0,0);border:var(--bs-border-width) solid var(--bs-navbar-toggler-border-color);transition:var(--bs-navbar-toggler-transition)}@media(prefers-reduced-motion: reduce){.navbar-toggler{transition:none}}.navbar-toggler:hover{text-decoration:none}.navbar-toggler:focus{text-decoration:none;outline:0;box-shadow:0 0 0 var(--bs-navbar-toggler-focus-width)}.navbar-toggler-icon{display:inline-block;width:1.5em;height:1.5em;vertical-align:middle;background-image:var(--bs-navbar-toggler-icon-bg);background-repeat:no-repeat;background-position:center;background-size:100%}.navbar-nav-scroll{max-height:var(--bs-scroll-height, 75vh);overflow-y:auto}@media(min-width: 576px){.navbar-expand-sm{flex-wrap:nowrap;-webkit-flex-wrap:nowrap;justify-content:flex-start;-webkit-justify-content:flex-start}.navbar-expand-sm .navbar-nav{flex-direction:row;-webkit-flex-direction:row}.navbar-expand-sm .navbar-nav .dropdown-menu{position:absolute}.navbar-expand-sm .navbar-nav .nav-link{padding-right:var(--bs-navbar-nav-link-padding-x);padding-left:var(--bs-navbar-nav-link-padding-x)}.navbar-expand-sm .navbar-nav-scroll{overflow:visible}.navbar-expand-sm .navbar-collapse{display:flex !important;display:-webkit-flex !important;flex-basis:auto;-webkit-flex-basis:auto}.navbar-expand-sm .navbar-toggler{display:none}.navbar-expand-sm .offcanvas{position:static;z-index:auto;flex-grow:1;-webkit-flex-grow:1;width:auto !important;height:auto !important;visibility:visible !important;background-color:rgba(0,0,0,0) !important;border:0 !important;transform:none !important;transition:none}.navbar-expand-sm 
.offcanvas .offcanvas-header{display:none}.navbar-expand-sm .offcanvas .offcanvas-body{display:flex;display:-webkit-flex;flex-grow:0;-webkit-flex-grow:0;padding:0;overflow-y:visible}}@media(min-width: 768px){.navbar-expand-md{flex-wrap:nowrap;-webkit-flex-wrap:nowrap;justify-content:flex-start;-webkit-justify-content:flex-start}.navbar-expand-md .navbar-nav{flex-direction:row;-webkit-flex-direction:row}.navbar-expand-md .navbar-nav .dropdown-menu{position:absolute}.navbar-expand-md .navbar-nav .nav-link{padding-right:var(--bs-navbar-nav-link-padding-x);padding-left:var(--bs-navbar-nav-link-padding-x)}.navbar-expand-md .navbar-nav-scroll{overflow:visible}.navbar-expand-md .navbar-collapse{display:flex !important;display:-webkit-flex !important;flex-basis:auto;-webkit-flex-basis:auto}.navbar-expand-md .navbar-toggler{display:none}.navbar-expand-md .offcanvas{position:static;z-index:auto;flex-grow:1;-webkit-flex-grow:1;width:auto !important;height:auto !important;visibility:visible !important;background-color:rgba(0,0,0,0) !important;border:0 !important;transform:none !important;transition:none}.navbar-expand-md .offcanvas .offcanvas-header{display:none}.navbar-expand-md .offcanvas .offcanvas-body{display:flex;display:-webkit-flex;flex-grow:0;-webkit-flex-grow:0;padding:0;overflow-y:visible}}@media(min-width: 992px){.navbar-expand-lg{flex-wrap:nowrap;-webkit-flex-wrap:nowrap;justify-content:flex-start;-webkit-justify-content:flex-start}.navbar-expand-lg .navbar-nav{flex-direction:row;-webkit-flex-direction:row}.navbar-expand-lg .navbar-nav .dropdown-menu{position:absolute}.navbar-expand-lg .navbar-nav .nav-link{padding-right:var(--bs-navbar-nav-link-padding-x);padding-left:var(--bs-navbar-nav-link-padding-x)}.navbar-expand-lg .navbar-nav-scroll{overflow:visible}.navbar-expand-lg .navbar-collapse{display:flex !important;display:-webkit-flex !important;flex-basis:auto;-webkit-flex-basis:auto}.navbar-expand-lg .navbar-toggler{display:none}.navbar-expand-lg 
.offcanvas{position:static;z-index:auto;flex-grow:1;-webkit-flex-grow:1;width:auto !important;height:auto !important;visibility:visible !important;background-color:rgba(0,0,0,0) !important;border:0 !important;transform:none !important;transition:none}.navbar-expand-lg .offcanvas .offcanvas-header{display:none}.navbar-expand-lg .offcanvas .offcanvas-body{display:flex;display:-webkit-flex;flex-grow:0;-webkit-flex-grow:0;padding:0;overflow-y:visible}}@media(min-width: 1200px){.navbar-expand-xl{flex-wrap:nowrap;-webkit-flex-wrap:nowrap;justify-content:flex-start;-webkit-justify-content:flex-start}.navbar-expand-xl .navbar-nav{flex-direction:row;-webkit-flex-direction:row}.navbar-expand-xl .navbar-nav .dropdown-menu{position:absolute}.navbar-expand-xl .navbar-nav .nav-link{padding-right:var(--bs-navbar-nav-link-padding-x);padding-left:var(--bs-navbar-nav-link-padding-x)}.navbar-expand-xl .navbar-nav-scroll{overflow:visible}.navbar-expand-xl .navbar-collapse{display:flex !important;display:-webkit-flex !important;flex-basis:auto;-webkit-flex-basis:auto}.navbar-expand-xl .navbar-toggler{display:none}.navbar-expand-xl .offcanvas{position:static;z-index:auto;flex-grow:1;-webkit-flex-grow:1;width:auto !important;height:auto !important;visibility:visible !important;background-color:rgba(0,0,0,0) !important;border:0 !important;transform:none !important;transition:none}.navbar-expand-xl .offcanvas .offcanvas-header{display:none}.navbar-expand-xl .offcanvas .offcanvas-body{display:flex;display:-webkit-flex;flex-grow:0;-webkit-flex-grow:0;padding:0;overflow-y:visible}}@media(min-width: 1400px){.navbar-expand-xxl{flex-wrap:nowrap;-webkit-flex-wrap:nowrap;justify-content:flex-start;-webkit-justify-content:flex-start}.navbar-expand-xxl .navbar-nav{flex-direction:row;-webkit-flex-direction:row}.navbar-expand-xxl .navbar-nav .dropdown-menu{position:absolute}.navbar-expand-xxl .navbar-nav 
.nav-link{padding-right:var(--bs-navbar-nav-link-padding-x);padding-left:var(--bs-navbar-nav-link-padding-x)}.navbar-expand-xxl .navbar-nav-scroll{overflow:visible}.navbar-expand-xxl .navbar-collapse{display:flex !important;display:-webkit-flex !important;flex-basis:auto;-webkit-flex-basis:auto}.navbar-expand-xxl .navbar-toggler{display:none}.navbar-expand-xxl .offcanvas{position:static;z-index:auto;flex-grow:1;-webkit-flex-grow:1;width:auto !important;height:auto !important;visibility:visible !important;background-color:rgba(0,0,0,0) !important;border:0 !important;transform:none !important;transition:none}.navbar-expand-xxl .offcanvas .offcanvas-header{display:none}.navbar-expand-xxl .offcanvas .offcanvas-body{display:flex;display:-webkit-flex;flex-grow:0;-webkit-flex-grow:0;padding:0;overflow-y:visible}}.navbar-expand{flex-wrap:nowrap;-webkit-flex-wrap:nowrap;justify-content:flex-start;-webkit-justify-content:flex-start}.navbar-expand .navbar-nav{flex-direction:row;-webkit-flex-direction:row}.navbar-expand .navbar-nav .dropdown-menu{position:absolute}.navbar-expand .navbar-nav .nav-link{padding-right:var(--bs-navbar-nav-link-padding-x);padding-left:var(--bs-navbar-nav-link-padding-x)}.navbar-expand .navbar-nav-scroll{overflow:visible}.navbar-expand .navbar-collapse{display:flex !important;display:-webkit-flex !important;flex-basis:auto;-webkit-flex-basis:auto}.navbar-expand .navbar-toggler{display:none}.navbar-expand .offcanvas{position:static;z-index:auto;flex-grow:1;-webkit-flex-grow:1;width:auto !important;height:auto !important;visibility:visible !important;background-color:rgba(0,0,0,0) !important;border:0 !important;transform:none !important;transition:none}.navbar-expand .offcanvas .offcanvas-header{display:none}.navbar-expand .offcanvas .offcanvas-body{display:flex;display:-webkit-flex;flex-grow:0;-webkit-flex-grow:0;padding:0;overflow-y:visible}.navbar-dark,.navbar[data-bs-theme=dark]{--bs-navbar-color: #545555;--bs-navbar-hover-color: rgba(31, 78, 182, 
0.8);--bs-navbar-disabled-color: rgba(84, 85, 85, 0.75);--bs-navbar-active-color: #1f4eb6;--bs-navbar-brand-color: #545555;--bs-navbar-brand-hover-color: #1f4eb6;--bs-navbar-toggler-border-color: rgba(84, 85, 85, 0);--bs-navbar-toggler-icon-bg: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 30 30'%3e%3cpath stroke='%23545555' stroke-linecap='round' stroke-miterlimit='10' stroke-width='2' d='M4 7h22M4 15h22M4 23h22'/%3e%3c/svg%3e")}[data-bs-theme=dark] .navbar-toggler-icon{--bs-navbar-toggler-icon-bg: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 30 30'%3e%3cpath stroke='%23545555' stroke-linecap='round' stroke-miterlimit='10' stroke-width='2' d='M4 7h22M4 15h22M4 23h22'/%3e%3c/svg%3e")}.card{--bs-card-spacer-y: 1rem;--bs-card-spacer-x: 1rem;--bs-card-title-spacer-y: 0.5rem;--bs-card-title-color: ;--bs-card-subtitle-color: ;--bs-card-border-width: 1px;--bs-card-border-color: rgba(0, 0, 0, 0.175);--bs-card-border-radius: 0.25rem;--bs-card-box-shadow: ;--bs-card-inner-border-radius: calc(0.25rem - 1px);--bs-card-cap-padding-y: 0.5rem;--bs-card-cap-padding-x: 1rem;--bs-card-cap-bg: rgba(52, 58, 64, 0.25);--bs-card-cap-color: ;--bs-card-height: ;--bs-card-color: ;--bs-card-bg: #fff;--bs-card-img-overlay-padding: 1rem;--bs-card-group-margin: 0.75rem;position:relative;display:flex;display:-webkit-flex;flex-direction:column;-webkit-flex-direction:column;min-width:0;height:var(--bs-card-height);color:var(--bs-body-color);word-wrap:break-word;background-color:var(--bs-card-bg);background-clip:border-box;border:var(--bs-card-border-width) solid var(--bs-card-border-color)}.card>hr{margin-right:0;margin-left:0}.card>.list-group{border-top:inherit;border-bottom:inherit}.card>.list-group:first-child{border-top-width:0}.card>.list-group:last-child{border-bottom-width:0}.card>.card-header+.list-group,.card>.list-group+.card-footer{border-top:0}.card-body{flex:1 1 auto;-webkit-flex:1 1 
auto;padding:var(--bs-card-spacer-y) var(--bs-card-spacer-x);color:var(--bs-card-color)}.card-title{margin-bottom:var(--bs-card-title-spacer-y);color:var(--bs-card-title-color)}.card-subtitle{margin-top:calc(-0.5*var(--bs-card-title-spacer-y));margin-bottom:0;color:var(--bs-card-subtitle-color)}.card-text:last-child{margin-bottom:0}.card-link+.card-link{margin-left:var(--bs-card-spacer-x)}.card-header{padding:var(--bs-card-cap-padding-y) var(--bs-card-cap-padding-x);margin-bottom:0;color:var(--bs-card-cap-color);background-color:var(--bs-card-cap-bg);border-bottom:var(--bs-card-border-width) solid var(--bs-card-border-color)}.card-footer{padding:var(--bs-card-cap-padding-y) var(--bs-card-cap-padding-x);color:var(--bs-card-cap-color);background-color:var(--bs-card-cap-bg);border-top:var(--bs-card-border-width) solid var(--bs-card-border-color)}.card-header-tabs{margin-right:calc(-0.5*var(--bs-card-cap-padding-x));margin-bottom:calc(-1*var(--bs-card-cap-padding-y));margin-left:calc(-0.5*var(--bs-card-cap-padding-x));border-bottom:0}.card-header-tabs .nav-link.active{background-color:var(--bs-card-bg);border-bottom-color:var(--bs-card-bg)}.card-header-pills{margin-right:calc(-0.5*var(--bs-card-cap-padding-x));margin-left:calc(-0.5*var(--bs-card-cap-padding-x))}.card-img-overlay{position:absolute;top:0;right:0;bottom:0;left:0;padding:var(--bs-card-img-overlay-padding)}.card-img,.card-img-top,.card-img-bottom{width:100%}.card-group>.card{margin-bottom:var(--bs-card-group-margin)}@media(min-width: 576px){.card-group{display:flex;display:-webkit-flex;flex-flow:row wrap;-webkit-flex-flow:row wrap}.card-group>.card{flex:1 0 0%;-webkit-flex:1 0 0%;margin-bottom:0}.card-group>.card+.card{margin-left:0;border-left:0}}.accordion{--bs-accordion-color: #343a40;--bs-accordion-bg: #fff;--bs-accordion-transition: color 0.15s ease-in-out, background-color 0.15s ease-in-out, border-color 0.15s ease-in-out, box-shadow 0.15s ease-in-out, border-radius 0.15s 
ease;--bs-accordion-border-color: #dee2e6;--bs-accordion-border-width: 1px;--bs-accordion-border-radius: 0.25rem;--bs-accordion-inner-border-radius: calc(0.25rem - 1px);--bs-accordion-btn-padding-x: 1.25rem;--bs-accordion-btn-padding-y: 1rem;--bs-accordion-btn-color: #343a40;--bs-accordion-btn-bg: #fff;--bs-accordion-btn-icon: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 16 16' fill='%23343a40'%3e%3cpath fill-rule='evenodd' d='M1.646 4.646a.5.5 0 0 1 .708 0L8 10.293l5.646-5.647a.5.5 0 0 1 .708.708l-6 6a.5.5 0 0 1-.708 0l-6-6a.5.5 0 0 1 0-.708z'/%3e%3c/svg%3e");--bs-accordion-btn-icon-width: 1.25rem;--bs-accordion-btn-icon-transform: rotate(-180deg);--bs-accordion-btn-icon-transition: transform 0.2s ease-in-out;--bs-accordion-btn-active-icon: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 16 16' fill='%2310335b'%3e%3cpath fill-rule='evenodd' d='M1.646 4.646a.5.5 0 0 1 .708 0L8 10.293l5.646-5.647a.5.5 0 0 1 .708.708l-6 6a.5.5 0 0 1-.708 0l-6-6a.5.5 0 0 1 0-.708z'/%3e%3c/svg%3e");--bs-accordion-btn-focus-border-color: #93c0f1;--bs-accordion-btn-focus-box-shadow: 0 0 0 0.25rem rgba(39, 128, 227, 0.25);--bs-accordion-body-padding-x: 1.25rem;--bs-accordion-body-padding-y: 1rem;--bs-accordion-active-color: #10335b;--bs-accordion-active-bg: #d4e6f9}.accordion-button{position:relative;display:flex;display:-webkit-flex;align-items:center;-webkit-align-items:center;width:100%;padding:var(--bs-accordion-btn-padding-y) var(--bs-accordion-btn-padding-x);font-size:1rem;color:var(--bs-accordion-btn-color);text-align:left;background-color:var(--bs-accordion-btn-bg);border:0;overflow-anchor:none;transition:var(--bs-accordion-transition)}@media(prefers-reduced-motion: reduce){.accordion-button{transition:none}}.accordion-button:not(.collapsed){color:var(--bs-accordion-active-color);background-color:var(--bs-accordion-active-bg);box-shadow:inset 0 calc(-1*var(--bs-accordion-border-width)) 0 
var(--bs-accordion-border-color)}.accordion-button:not(.collapsed)::after{background-image:var(--bs-accordion-btn-active-icon);transform:var(--bs-accordion-btn-icon-transform)}.accordion-button::after{flex-shrink:0;-webkit-flex-shrink:0;width:var(--bs-accordion-btn-icon-width);height:var(--bs-accordion-btn-icon-width);margin-left:auto;content:"";background-image:var(--bs-accordion-btn-icon);background-repeat:no-repeat;background-size:var(--bs-accordion-btn-icon-width);transition:var(--bs-accordion-btn-icon-transition)}@media(prefers-reduced-motion: reduce){.accordion-button::after{transition:none}}.accordion-button:hover{z-index:2}.accordion-button:focus{z-index:3;border-color:var(--bs-accordion-btn-focus-border-color);outline:0;box-shadow:var(--bs-accordion-btn-focus-box-shadow)}.accordion-header{margin-bottom:0}.accordion-item{color:var(--bs-accordion-color);background-color:var(--bs-accordion-bg);border:var(--bs-accordion-border-width) solid var(--bs-accordion-border-color)}.accordion-item:not(:first-of-type){border-top:0}.accordion-body{padding:var(--bs-accordion-body-padding-y) var(--bs-accordion-body-padding-x)}.accordion-flush .accordion-collapse{border-width:0}.accordion-flush .accordion-item{border-right:0;border-left:0}.accordion-flush .accordion-item:first-child{border-top:0}.accordion-flush .accordion-item:last-child{border-bottom:0}[data-bs-theme=dark] .accordion-button::after{--bs-accordion-btn-icon: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 16 16' fill='%237db3ee'%3e%3cpath fill-rule='evenodd' d='M1.646 4.646a.5.5 0 0 1 .708 0L8 10.293l5.646-5.647a.5.5 0 0 1 .708.708l-6 6a.5.5 0 0 1-.708 0l-6-6a.5.5 0 0 1 0-.708z'/%3e%3c/svg%3e");--bs-accordion-btn-active-icon: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 16 16' fill='%237db3ee'%3e%3cpath fill-rule='evenodd' d='M1.646 4.646a.5.5 0 0 1 .708 0L8 10.293l5.646-5.647a.5.5 0 0 1 .708.708l-6 6a.5.5 0 0 1-.708 0l-6-6a.5.5 0 0 1 
0-.708z'/%3e%3c/svg%3e")}.breadcrumb{--bs-breadcrumb-padding-x: 0;--bs-breadcrumb-padding-y: 0;--bs-breadcrumb-margin-bottom: 1rem;--bs-breadcrumb-bg: ;--bs-breadcrumb-border-radius: ;--bs-breadcrumb-divider-color: rgba(52, 58, 64, 0.75);--bs-breadcrumb-item-padding-x: 0.5rem;--bs-breadcrumb-item-active-color: rgba(52, 58, 64, 0.75);display:flex;display:-webkit-flex;flex-wrap:wrap;-webkit-flex-wrap:wrap;padding:var(--bs-breadcrumb-padding-y) var(--bs-breadcrumb-padding-x);margin-bottom:var(--bs-breadcrumb-margin-bottom);font-size:var(--bs-breadcrumb-font-size);list-style:none;background-color:var(--bs-breadcrumb-bg)}.breadcrumb-item+.breadcrumb-item{padding-left:var(--bs-breadcrumb-item-padding-x)}.breadcrumb-item+.breadcrumb-item::before{float:left;padding-right:var(--bs-breadcrumb-item-padding-x);color:var(--bs-breadcrumb-divider-color);content:var(--bs-breadcrumb-divider, ">") /* rtl: var(--bs-breadcrumb-divider, ">") */}.breadcrumb-item.active{color:var(--bs-breadcrumb-item-active-color)}.pagination{--bs-pagination-padding-x: 0.75rem;--bs-pagination-padding-y: 0.375rem;--bs-pagination-font-size:1rem;--bs-pagination-color: #2761e3;--bs-pagination-bg: #fff;--bs-pagination-border-width: 1px;--bs-pagination-border-color: #dee2e6;--bs-pagination-border-radius: 0.25rem;--bs-pagination-hover-color: #1f4eb6;--bs-pagination-hover-bg: #f8f9fa;--bs-pagination-hover-border-color: #dee2e6;--bs-pagination-focus-color: #1f4eb6;--bs-pagination-focus-bg: #e9ecef;--bs-pagination-focus-box-shadow: 0 0 0 0.25rem rgba(39, 128, 227, 0.25);--bs-pagination-active-color: #fff;--bs-pagination-active-bg: #2780e3;--bs-pagination-active-border-color: #2780e3;--bs-pagination-disabled-color: rgba(52, 58, 64, 0.75);--bs-pagination-disabled-bg: #e9ecef;--bs-pagination-disabled-border-color: #dee2e6;display:flex;display:-webkit-flex;padding-left:0;list-style:none}.page-link{position:relative;display:block;padding:var(--bs-pagination-padding-y) 
var(--bs-pagination-padding-x);font-size:var(--bs-pagination-font-size);color:var(--bs-pagination-color);text-decoration:none;-webkit-text-decoration:none;-moz-text-decoration:none;-ms-text-decoration:none;-o-text-decoration:none;background-color:var(--bs-pagination-bg);border:var(--bs-pagination-border-width) solid var(--bs-pagination-border-color);transition:color .15s ease-in-out,background-color .15s ease-in-out,border-color .15s ease-in-out,box-shadow .15s ease-in-out}@media(prefers-reduced-motion: reduce){.page-link{transition:none}}.page-link:hover{z-index:2;color:var(--bs-pagination-hover-color);background-color:var(--bs-pagination-hover-bg);border-color:var(--bs-pagination-hover-border-color)}.page-link:focus{z-index:3;color:var(--bs-pagination-focus-color);background-color:var(--bs-pagination-focus-bg);outline:0;box-shadow:var(--bs-pagination-focus-box-shadow)}.page-link.active,.active>.page-link{z-index:3;color:var(--bs-pagination-active-color);background-color:var(--bs-pagination-active-bg);border-color:var(--bs-pagination-active-border-color)}.page-link.disabled,.disabled>.page-link{color:var(--bs-pagination-disabled-color);pointer-events:none;background-color:var(--bs-pagination-disabled-bg);border-color:var(--bs-pagination-disabled-border-color)}.page-item:not(:first-child) .page-link{margin-left:calc(1px*-1)}.pagination-lg{--bs-pagination-padding-x: 1.5rem;--bs-pagination-padding-y: 0.75rem;--bs-pagination-font-size:1.25rem;--bs-pagination-border-radius: 0.5rem}.pagination-sm{--bs-pagination-padding-x: 0.5rem;--bs-pagination-padding-y: 0.25rem;--bs-pagination-font-size:0.875rem;--bs-pagination-border-radius: 0.2em}.badge{--bs-badge-padding-x: 0.65em;--bs-badge-padding-y: 0.35em;--bs-badge-font-size:0.75em;--bs-badge-font-weight: 700;--bs-badge-color: #fff;--bs-badge-border-radius: 0.25rem;display:inline-block;padding:var(--bs-badge-padding-y) 
var(--bs-badge-padding-x);font-size:var(--bs-badge-font-size);font-weight:var(--bs-badge-font-weight);line-height:1;color:var(--bs-badge-color);text-align:center;white-space:nowrap;vertical-align:baseline}.badge:empty{display:none}.btn .badge{position:relative;top:-1px}.alert{--bs-alert-bg: transparent;--bs-alert-padding-x: 1rem;--bs-alert-padding-y: 1rem;--bs-alert-margin-bottom: 1rem;--bs-alert-color: inherit;--bs-alert-border-color: transparent;--bs-alert-border: 0 solid var(--bs-alert-border-color);--bs-alert-border-radius: 0.25rem;--bs-alert-link-color: inherit;position:relative;padding:var(--bs-alert-padding-y) var(--bs-alert-padding-x);margin-bottom:var(--bs-alert-margin-bottom);color:var(--bs-alert-color);background-color:var(--bs-alert-bg);border:var(--bs-alert-border)}.alert-heading{color:inherit}.alert-link{font-weight:700;color:var(--bs-alert-link-color)}.alert-dismissible{padding-right:3rem}.alert-dismissible .btn-close{position:absolute;top:0;right:0;z-index:2;padding:1.25rem 1rem}.alert-default{--bs-alert-color: var(--bs-default-text-emphasis);--bs-alert-bg: var(--bs-default-bg-subtle);--bs-alert-border-color: var(--bs-default-border-subtle);--bs-alert-link-color: var(--bs-default-text-emphasis)}.alert-primary{--bs-alert-color: var(--bs-primary-text-emphasis);--bs-alert-bg: var(--bs-primary-bg-subtle);--bs-alert-border-color: var(--bs-primary-border-subtle);--bs-alert-link-color: var(--bs-primary-text-emphasis)}.alert-secondary{--bs-alert-color: var(--bs-secondary-text-emphasis);--bs-alert-bg: var(--bs-secondary-bg-subtle);--bs-alert-border-color: var(--bs-secondary-border-subtle);--bs-alert-link-color: var(--bs-secondary-text-emphasis)}.alert-success{--bs-alert-color: var(--bs-success-text-emphasis);--bs-alert-bg: var(--bs-success-bg-subtle);--bs-alert-border-color: var(--bs-success-border-subtle);--bs-alert-link-color: var(--bs-success-text-emphasis)}.alert-info{--bs-alert-color: var(--bs-info-text-emphasis);--bs-alert-bg: 
var(--bs-info-bg-subtle);--bs-alert-border-color: var(--bs-info-border-subtle);--bs-alert-link-color: var(--bs-info-text-emphasis)}.alert-warning{--bs-alert-color: var(--bs-warning-text-emphasis);--bs-alert-bg: var(--bs-warning-bg-subtle);--bs-alert-border-color: var(--bs-warning-border-subtle);--bs-alert-link-color: var(--bs-warning-text-emphasis)}.alert-danger{--bs-alert-color: var(--bs-danger-text-emphasis);--bs-alert-bg: var(--bs-danger-bg-subtle);--bs-alert-border-color: var(--bs-danger-border-subtle);--bs-alert-link-color: var(--bs-danger-text-emphasis)}.alert-light{--bs-alert-color: var(--bs-light-text-emphasis);--bs-alert-bg: var(--bs-light-bg-subtle);--bs-alert-border-color: var(--bs-light-border-subtle);--bs-alert-link-color: var(--bs-light-text-emphasis)}.alert-dark{--bs-alert-color: var(--bs-dark-text-emphasis);--bs-alert-bg: var(--bs-dark-bg-subtle);--bs-alert-border-color: var(--bs-dark-border-subtle);--bs-alert-link-color: var(--bs-dark-text-emphasis)}@keyframes progress-bar-stripes{0%{background-position-x:.5rem}}.progress,.progress-stacked{--bs-progress-height: 0.5rem;--bs-progress-font-size:0.75rem;--bs-progress-bg: #e9ecef;--bs-progress-border-radius: 0.25rem;--bs-progress-box-shadow: inset 0 1px 2px rgba(0, 0, 0, 0.075);--bs-progress-bar-color: #fff;--bs-progress-bar-bg: #2780e3;--bs-progress-bar-transition: width 0.6s ease;display:flex;display:-webkit-flex;height:var(--bs-progress-height);overflow:hidden;font-size:var(--bs-progress-font-size);background-color:var(--bs-progress-bg)}.progress-bar{display:flex;display:-webkit-flex;flex-direction:column;-webkit-flex-direction:column;justify-content:center;-webkit-justify-content:center;overflow:hidden;color:var(--bs-progress-bar-color);text-align:center;white-space:nowrap;background-color:var(--bs-progress-bar-bg);transition:var(--bs-progress-bar-transition)}@media(prefers-reduced-motion: reduce){.progress-bar{transition:none}}.progress-bar-striped{background-image:linear-gradient(45deg, rgba(255, 
255, 255, 0.15) 25%, transparent 25%, transparent 50%, rgba(255, 255, 255, 0.15) 50%, rgba(255, 255, 255, 0.15) 75%, transparent 75%, transparent);background-size:var(--bs-progress-height) var(--bs-progress-height)}.progress-stacked>.progress{overflow:visible}.progress-stacked>.progress>.progress-bar{width:100%}.progress-bar-animated{animation:1s linear infinite progress-bar-stripes}@media(prefers-reduced-motion: reduce){.progress-bar-animated{animation:none}}.list-group{--bs-list-group-color: #343a40;--bs-list-group-bg: #fff;--bs-list-group-border-color: #dee2e6;--bs-list-group-border-width: 1px;--bs-list-group-border-radius: 0.25rem;--bs-list-group-item-padding-x: 1rem;--bs-list-group-item-padding-y: 0.5rem;--bs-list-group-action-color: rgba(52, 58, 64, 0.75);--bs-list-group-action-hover-color: #000;--bs-list-group-action-hover-bg: #f8f9fa;--bs-list-group-action-active-color: #343a40;--bs-list-group-action-active-bg: #e9ecef;--bs-list-group-disabled-color: rgba(52, 58, 64, 0.75);--bs-list-group-disabled-bg: #fff;--bs-list-group-active-color: #fff;--bs-list-group-active-bg: #2780e3;--bs-list-group-active-border-color: #2780e3;display:flex;display:-webkit-flex;flex-direction:column;-webkit-flex-direction:column;padding-left:0;margin-bottom:0}.list-group-numbered{list-style-type:none;counter-reset:section}.list-group-numbered>.list-group-item::before{content:counters(section, ".") ". 
";counter-increment:section}.list-group-item-action{width:100%;color:var(--bs-list-group-action-color);text-align:inherit}.list-group-item-action:hover,.list-group-item-action:focus{z-index:1;color:var(--bs-list-group-action-hover-color);text-decoration:none;background-color:var(--bs-list-group-action-hover-bg)}.list-group-item-action:active{color:var(--bs-list-group-action-active-color);background-color:var(--bs-list-group-action-active-bg)}.list-group-item{position:relative;display:block;padding:var(--bs-list-group-item-padding-y) var(--bs-list-group-item-padding-x);color:var(--bs-list-group-color);text-decoration:none;-webkit-text-decoration:none;-moz-text-decoration:none;-ms-text-decoration:none;-o-text-decoration:none;background-color:var(--bs-list-group-bg);border:var(--bs-list-group-border-width) solid var(--bs-list-group-border-color)}.list-group-item.disabled,.list-group-item:disabled{color:var(--bs-list-group-disabled-color);pointer-events:none;background-color:var(--bs-list-group-disabled-bg)}.list-group-item.active{z-index:2;color:var(--bs-list-group-active-color);background-color:var(--bs-list-group-active-bg);border-color:var(--bs-list-group-active-border-color)}.list-group-item+.list-group-item{border-top-width:0}.list-group-item+.list-group-item.active{margin-top:calc(-1*var(--bs-list-group-border-width));border-top-width:var(--bs-list-group-border-width)}.list-group-horizontal{flex-direction:row;-webkit-flex-direction:row}.list-group-horizontal>.list-group-item.active{margin-top:0}.list-group-horizontal>.list-group-item+.list-group-item{border-top-width:var(--bs-list-group-border-width);border-left-width:0}.list-group-horizontal>.list-group-item+.list-group-item.active{margin-left:calc(-1*var(--bs-list-group-border-width));border-left-width:var(--bs-list-group-border-width)}@media(min-width: 
576px){.list-group-horizontal-sm{flex-direction:row;-webkit-flex-direction:row}.list-group-horizontal-sm>.list-group-item.active{margin-top:0}.list-group-horizontal-sm>.list-group-item+.list-group-item{border-top-width:var(--bs-list-group-border-width);border-left-width:0}.list-group-horizontal-sm>.list-group-item+.list-group-item.active{margin-left:calc(-1*var(--bs-list-group-border-width));border-left-width:var(--bs-list-group-border-width)}}@media(min-width: 768px){.list-group-horizontal-md{flex-direction:row;-webkit-flex-direction:row}.list-group-horizontal-md>.list-group-item.active{margin-top:0}.list-group-horizontal-md>.list-group-item+.list-group-item{border-top-width:var(--bs-list-group-border-width);border-left-width:0}.list-group-horizontal-md>.list-group-item+.list-group-item.active{margin-left:calc(-1*var(--bs-list-group-border-width));border-left-width:var(--bs-list-group-border-width)}}@media(min-width: 992px){.list-group-horizontal-lg{flex-direction:row;-webkit-flex-direction:row}.list-group-horizontal-lg>.list-group-item.active{margin-top:0}.list-group-horizontal-lg>.list-group-item+.list-group-item{border-top-width:var(--bs-list-group-border-width);border-left-width:0}.list-group-horizontal-lg>.list-group-item+.list-group-item.active{margin-left:calc(-1*var(--bs-list-group-border-width));border-left-width:var(--bs-list-group-border-width)}}@media(min-width: 1200px){.list-group-horizontal-xl{flex-direction:row;-webkit-flex-direction:row}.list-group-horizontal-xl>.list-group-item.active{margin-top:0}.list-group-horizontal-xl>.list-group-item+.list-group-item{border-top-width:var(--bs-list-group-border-width);border-left-width:0}.list-group-horizontal-xl>.list-group-item+.list-group-item.active{margin-left:calc(-1*var(--bs-list-group-border-width));border-left-width:var(--bs-list-group-border-width)}}@media(min-width: 
1400px){.list-group-horizontal-xxl{flex-direction:row;-webkit-flex-direction:row}.list-group-horizontal-xxl>.list-group-item.active{margin-top:0}.list-group-horizontal-xxl>.list-group-item+.list-group-item{border-top-width:var(--bs-list-group-border-width);border-left-width:0}.list-group-horizontal-xxl>.list-group-item+.list-group-item.active{margin-left:calc(-1*var(--bs-list-group-border-width));border-left-width:var(--bs-list-group-border-width)}}.list-group-flush>.list-group-item{border-width:0 0 var(--bs-list-group-border-width)}.list-group-flush>.list-group-item:last-child{border-bottom-width:0}.list-group-item-default{--bs-list-group-color: var(--bs-default-text-emphasis);--bs-list-group-bg: var(--bs-default-bg-subtle);--bs-list-group-border-color: var(--bs-default-border-subtle);--bs-list-group-action-hover-color: var(--bs-emphasis-color);--bs-list-group-action-hover-bg: var(--bs-default-border-subtle);--bs-list-group-action-active-color: var(--bs-emphasis-color);--bs-list-group-action-active-bg: var(--bs-default-border-subtle);--bs-list-group-active-color: var(--bs-default-bg-subtle);--bs-list-group-active-bg: var(--bs-default-text-emphasis);--bs-list-group-active-border-color: var(--bs-default-text-emphasis)}.list-group-item-primary{--bs-list-group-color: var(--bs-primary-text-emphasis);--bs-list-group-bg: var(--bs-primary-bg-subtle);--bs-list-group-border-color: var(--bs-primary-border-subtle);--bs-list-group-action-hover-color: var(--bs-emphasis-color);--bs-list-group-action-hover-bg: var(--bs-primary-border-subtle);--bs-list-group-action-active-color: var(--bs-emphasis-color);--bs-list-group-action-active-bg: var(--bs-primary-border-subtle);--bs-list-group-active-color: var(--bs-primary-bg-subtle);--bs-list-group-active-bg: var(--bs-primary-text-emphasis);--bs-list-group-active-border-color: var(--bs-primary-text-emphasis)}.list-group-item-secondary{--bs-list-group-color: var(--bs-secondary-text-emphasis);--bs-list-group-bg: 
var(--bs-secondary-bg-subtle);--bs-list-group-border-color: var(--bs-secondary-border-subtle);--bs-list-group-action-hover-color: var(--bs-emphasis-color);--bs-list-group-action-hover-bg: var(--bs-secondary-border-subtle);--bs-list-group-action-active-color: var(--bs-emphasis-color);--bs-list-group-action-active-bg: var(--bs-secondary-border-subtle);--bs-list-group-active-color: var(--bs-secondary-bg-subtle);--bs-list-group-active-bg: var(--bs-secondary-text-emphasis);--bs-list-group-active-border-color: var(--bs-secondary-text-emphasis)}.list-group-item-success{--bs-list-group-color: var(--bs-success-text-emphasis);--bs-list-group-bg: var(--bs-success-bg-subtle);--bs-list-group-border-color: var(--bs-success-border-subtle);--bs-list-group-action-hover-color: var(--bs-emphasis-color);--bs-list-group-action-hover-bg: var(--bs-success-border-subtle);--bs-list-group-action-active-color: var(--bs-emphasis-color);--bs-list-group-action-active-bg: var(--bs-success-border-subtle);--bs-list-group-active-color: var(--bs-success-bg-subtle);--bs-list-group-active-bg: var(--bs-success-text-emphasis);--bs-list-group-active-border-color: var(--bs-success-text-emphasis)}.list-group-item-info{--bs-list-group-color: var(--bs-info-text-emphasis);--bs-list-group-bg: var(--bs-info-bg-subtle);--bs-list-group-border-color: var(--bs-info-border-subtle);--bs-list-group-action-hover-color: var(--bs-emphasis-color);--bs-list-group-action-hover-bg: var(--bs-info-border-subtle);--bs-list-group-action-active-color: var(--bs-emphasis-color);--bs-list-group-action-active-bg: var(--bs-info-border-subtle);--bs-list-group-active-color: var(--bs-info-bg-subtle);--bs-list-group-active-bg: var(--bs-info-text-emphasis);--bs-list-group-active-border-color: var(--bs-info-text-emphasis)}.list-group-item-warning{--bs-list-group-color: var(--bs-warning-text-emphasis);--bs-list-group-bg: var(--bs-warning-bg-subtle);--bs-list-group-border-color: 
var(--bs-warning-border-subtle);--bs-list-group-action-hover-color: var(--bs-emphasis-color);--bs-list-group-action-hover-bg: var(--bs-warning-border-subtle);--bs-list-group-action-active-color: var(--bs-emphasis-color);--bs-list-group-action-active-bg: var(--bs-warning-border-subtle);--bs-list-group-active-color: var(--bs-warning-bg-subtle);--bs-list-group-active-bg: var(--bs-warning-text-emphasis);--bs-list-group-active-border-color: var(--bs-warning-text-emphasis)}.list-group-item-danger{--bs-list-group-color: var(--bs-danger-text-emphasis);--bs-list-group-bg: var(--bs-danger-bg-subtle);--bs-list-group-border-color: var(--bs-danger-border-subtle);--bs-list-group-action-hover-color: var(--bs-emphasis-color);--bs-list-group-action-hover-bg: var(--bs-danger-border-subtle);--bs-list-group-action-active-color: var(--bs-emphasis-color);--bs-list-group-action-active-bg: var(--bs-danger-border-subtle);--bs-list-group-active-color: var(--bs-danger-bg-subtle);--bs-list-group-active-bg: var(--bs-danger-text-emphasis);--bs-list-group-active-border-color: var(--bs-danger-text-emphasis)}.list-group-item-light{--bs-list-group-color: var(--bs-light-text-emphasis);--bs-list-group-bg: var(--bs-light-bg-subtle);--bs-list-group-border-color: var(--bs-light-border-subtle);--bs-list-group-action-hover-color: var(--bs-emphasis-color);--bs-list-group-action-hover-bg: var(--bs-light-border-subtle);--bs-list-group-action-active-color: var(--bs-emphasis-color);--bs-list-group-action-active-bg: var(--bs-light-border-subtle);--bs-list-group-active-color: var(--bs-light-bg-subtle);--bs-list-group-active-bg: var(--bs-light-text-emphasis);--bs-list-group-active-border-color: var(--bs-light-text-emphasis)}.list-group-item-dark{--bs-list-group-color: var(--bs-dark-text-emphasis);--bs-list-group-bg: var(--bs-dark-bg-subtle);--bs-list-group-border-color: var(--bs-dark-border-subtle);--bs-list-group-action-hover-color: var(--bs-emphasis-color);--bs-list-group-action-hover-bg: 
var(--bs-dark-border-subtle);--bs-list-group-action-active-color: var(--bs-emphasis-color);--bs-list-group-action-active-bg: var(--bs-dark-border-subtle);--bs-list-group-active-color: var(--bs-dark-bg-subtle);--bs-list-group-active-bg: var(--bs-dark-text-emphasis);--bs-list-group-active-border-color: var(--bs-dark-text-emphasis)}.btn-close{--bs-btn-close-color: #000;--bs-btn-close-bg: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 16 16' fill='%23000'%3e%3cpath d='M.293.293a1 1 0 0 1 1.414 0L8 6.586 14.293.293a1 1 0 1 1 1.414 1.414L9.414 8l6.293 6.293a1 1 0 0 1-1.414 1.414L8 9.414l-6.293 6.293a1 1 0 0 1-1.414-1.414L6.586 8 .293 1.707a1 1 0 0 1 0-1.414z'/%3e%3c/svg%3e");--bs-btn-close-opacity: 0.5;--bs-btn-close-hover-opacity: 0.75;--bs-btn-close-focus-shadow: 0 0 0 0.25rem rgba(39, 128, 227, 0.25);--bs-btn-close-focus-opacity: 1;--bs-btn-close-disabled-opacity: 0.25;--bs-btn-close-white-filter: invert(1) grayscale(100%) brightness(200%);box-sizing:content-box;width:1em;height:1em;padding:.25em .25em;color:var(--bs-btn-close-color);background:rgba(0,0,0,0) var(--bs-btn-close-bg) center/1em auto no-repeat;border:0;opacity:var(--bs-btn-close-opacity)}.btn-close:hover{color:var(--bs-btn-close-color);text-decoration:none;opacity:var(--bs-btn-close-hover-opacity)}.btn-close:focus{outline:0;box-shadow:var(--bs-btn-close-focus-shadow);opacity:var(--bs-btn-close-focus-opacity)}.btn-close:disabled,.btn-close.disabled{pointer-events:none;user-select:none;-webkit-user-select:none;-moz-user-select:none;-ms-user-select:none;-o-user-select:none;opacity:var(--bs-btn-close-disabled-opacity)}.btn-close-white{filter:var(--bs-btn-close-white-filter)}[data-bs-theme=dark] .btn-close{filter:var(--bs-btn-close-white-filter)}.toast{--bs-toast-zindex: 1090;--bs-toast-padding-x: 0.75rem;--bs-toast-padding-y: 0.5rem;--bs-toast-spacing: 1.5rem;--bs-toast-max-width: 350px;--bs-toast-font-size:0.875rem;--bs-toast-color: ;--bs-toast-bg: rgba(255, 255, 255, 
0.85);--bs-toast-border-width: 1px;--bs-toast-border-color: rgba(0, 0, 0, 0.175);--bs-toast-border-radius: 0.25rem;--bs-toast-box-shadow: 0 0.5rem 1rem rgba(0, 0, 0, 0.15);--bs-toast-header-color: rgba(52, 58, 64, 0.75);--bs-toast-header-bg: rgba(255, 255, 255, 0.85);--bs-toast-header-border-color: rgba(0, 0, 0, 0.175);width:var(--bs-toast-max-width);max-width:100%;font-size:var(--bs-toast-font-size);color:var(--bs-toast-color);pointer-events:auto;background-color:var(--bs-toast-bg);background-clip:padding-box;border:var(--bs-toast-border-width) solid var(--bs-toast-border-color);box-shadow:var(--bs-toast-box-shadow)}.toast.showing{opacity:0}.toast:not(.show){display:none}.toast-container{--bs-toast-zindex: 1090;position:absolute;z-index:var(--bs-toast-zindex);width:max-content;width:-webkit-max-content;width:-moz-max-content;width:-ms-max-content;width:-o-max-content;max-width:100%;pointer-events:none}.toast-container>:not(:last-child){margin-bottom:var(--bs-toast-spacing)}.toast-header{display:flex;display:-webkit-flex;align-items:center;-webkit-align-items:center;padding:var(--bs-toast-padding-y) var(--bs-toast-padding-x);color:var(--bs-toast-header-color);background-color:var(--bs-toast-header-bg);background-clip:padding-box;border-bottom:var(--bs-toast-border-width) solid var(--bs-toast-header-border-color)}.toast-header .btn-close{margin-right:calc(-0.5*var(--bs-toast-padding-x));margin-left:var(--bs-toast-padding-x)}.toast-body{padding:var(--bs-toast-padding-x);word-wrap:break-word}.modal{--bs-modal-zindex: 1055;--bs-modal-width: 500px;--bs-modal-padding: 1rem;--bs-modal-margin: 0.5rem;--bs-modal-color: ;--bs-modal-bg: #fff;--bs-modal-border-color: rgba(0, 0, 0, 0.175);--bs-modal-border-width: 1px;--bs-modal-border-radius: 0.5rem;--bs-modal-box-shadow: 0 0.125rem 0.25rem rgba(0, 0, 0, 0.075);--bs-modal-inner-border-radius: calc(0.5rem - 1px);--bs-modal-header-padding-x: 1rem;--bs-modal-header-padding-y: 1rem;--bs-modal-header-padding: 1rem 
1rem;--bs-modal-header-border-color: #dee2e6;--bs-modal-header-border-width: 1px;--bs-modal-title-line-height: 1.5;--bs-modal-footer-gap: 0.5rem;--bs-modal-footer-bg: ;--bs-modal-footer-border-color: #dee2e6;--bs-modal-footer-border-width: 1px;position:fixed;top:0;left:0;z-index:var(--bs-modal-zindex);display:none;width:100%;height:100%;overflow-x:hidden;overflow-y:auto;outline:0}.modal-dialog{position:relative;width:auto;margin:var(--bs-modal-margin);pointer-events:none}.modal.fade .modal-dialog{transition:transform .3s ease-out;transform:translate(0, -50px)}@media(prefers-reduced-motion: reduce){.modal.fade .modal-dialog{transition:none}}.modal.show .modal-dialog{transform:none}.modal.modal-static .modal-dialog{transform:scale(1.02)}.modal-dialog-scrollable{height:calc(100% - var(--bs-modal-margin)*2)}.modal-dialog-scrollable .modal-content{max-height:100%;overflow:hidden}.modal-dialog-scrollable .modal-body{overflow-y:auto}.modal-dialog-centered{display:flex;display:-webkit-flex;align-items:center;-webkit-align-items:center;min-height:calc(100% - var(--bs-modal-margin)*2)}.modal-content{position:relative;display:flex;display:-webkit-flex;flex-direction:column;-webkit-flex-direction:column;width:100%;color:var(--bs-modal-color);pointer-events:auto;background-color:var(--bs-modal-bg);background-clip:padding-box;border:var(--bs-modal-border-width) solid var(--bs-modal-border-color);outline:0}.modal-backdrop{--bs-backdrop-zindex: 1050;--bs-backdrop-bg: #000;--bs-backdrop-opacity: 
0.5;position:fixed;top:0;left:0;z-index:var(--bs-backdrop-zindex);width:100vw;height:100vh;background-color:var(--bs-backdrop-bg)}.modal-backdrop.fade{opacity:0}.modal-backdrop.show{opacity:var(--bs-backdrop-opacity)}.modal-header{display:flex;display:-webkit-flex;flex-shrink:0;-webkit-flex-shrink:0;align-items:center;-webkit-align-items:center;justify-content:space-between;-webkit-justify-content:space-between;padding:var(--bs-modal-header-padding);border-bottom:var(--bs-modal-header-border-width) solid var(--bs-modal-header-border-color)}.modal-header .btn-close{padding:calc(var(--bs-modal-header-padding-y)*.5) calc(var(--bs-modal-header-padding-x)*.5);margin:calc(-0.5*var(--bs-modal-header-padding-y)) calc(-0.5*var(--bs-modal-header-padding-x)) calc(-0.5*var(--bs-modal-header-padding-y)) auto}.modal-title{margin-bottom:0;line-height:var(--bs-modal-title-line-height)}.modal-body{position:relative;flex:1 1 auto;-webkit-flex:1 1 auto;padding:var(--bs-modal-padding)}.modal-footer{display:flex;display:-webkit-flex;flex-shrink:0;-webkit-flex-shrink:0;flex-wrap:wrap;-webkit-flex-wrap:wrap;align-items:center;-webkit-align-items:center;justify-content:flex-end;-webkit-justify-content:flex-end;padding:calc(var(--bs-modal-padding) - var(--bs-modal-footer-gap)*.5);background-color:var(--bs-modal-footer-bg);border-top:var(--bs-modal-footer-border-width) solid var(--bs-modal-footer-border-color)}.modal-footer>*{margin:calc(var(--bs-modal-footer-gap)*.5)}@media(min-width: 576px){.modal{--bs-modal-margin: 1.75rem;--bs-modal-box-shadow: 0 0.5rem 1rem rgba(0, 0, 0, 0.15)}.modal-dialog{max-width:var(--bs-modal-width);margin-right:auto;margin-left:auto}.modal-sm{--bs-modal-width: 300px}}@media(min-width: 992px){.modal-lg,.modal-xl{--bs-modal-width: 800px}}@media(min-width: 1200px){.modal-xl{--bs-modal-width: 1140px}}.modal-fullscreen{width:100vw;max-width:none;height:100%;margin:0}.modal-fullscreen .modal-content{height:100%;border:0}.modal-fullscreen 
.modal-body{overflow-y:auto}@media(max-width: 575.98px){.modal-fullscreen-sm-down{width:100vw;max-width:none;height:100%;margin:0}.modal-fullscreen-sm-down .modal-content{height:100%;border:0}.modal-fullscreen-sm-down .modal-body{overflow-y:auto}}@media(max-width: 767.98px){.modal-fullscreen-md-down{width:100vw;max-width:none;height:100%;margin:0}.modal-fullscreen-md-down .modal-content{height:100%;border:0}.modal-fullscreen-md-down .modal-body{overflow-y:auto}}@media(max-width: 991.98px){.modal-fullscreen-lg-down{width:100vw;max-width:none;height:100%;margin:0}.modal-fullscreen-lg-down .modal-content{height:100%;border:0}.modal-fullscreen-lg-down .modal-body{overflow-y:auto}}@media(max-width: 1199.98px){.modal-fullscreen-xl-down{width:100vw;max-width:none;height:100%;margin:0}.modal-fullscreen-xl-down .modal-content{height:100%;border:0}.modal-fullscreen-xl-down .modal-body{overflow-y:auto}}@media(max-width: 1399.98px){.modal-fullscreen-xxl-down{width:100vw;max-width:none;height:100%;margin:0}.modal-fullscreen-xxl-down .modal-content{height:100%;border:0}.modal-fullscreen-xxl-down .modal-body{overflow-y:auto}}.tooltip{--bs-tooltip-zindex: 1080;--bs-tooltip-max-width: 200px;--bs-tooltip-padding-x: 0.5rem;--bs-tooltip-padding-y: 0.25rem;--bs-tooltip-margin: ;--bs-tooltip-font-size:0.875rem;--bs-tooltip-color: #fff;--bs-tooltip-bg: #000;--bs-tooltip-border-radius: 0.25rem;--bs-tooltip-opacity: 0.9;--bs-tooltip-arrow-width: 0.8rem;--bs-tooltip-arrow-height: 0.4rem;z-index:var(--bs-tooltip-zindex);display:block;margin:var(--bs-tooltip-margin);font-family:"Source Sans Pro",-apple-system,BlinkMacSystemFont,"Segoe UI",Roboto,"Helvetica Neue",Arial,sans-serif,"Apple Color Emoji","Segoe UI Emoji","Segoe UI 
Symbol";font-style:normal;font-weight:400;line-height:1.5;text-align:left;text-align:start;text-decoration:none;text-shadow:none;text-transform:none;letter-spacing:normal;word-break:normal;white-space:normal;word-spacing:normal;line-break:auto;font-size:var(--bs-tooltip-font-size);word-wrap:break-word;opacity:0}.tooltip.show{opacity:var(--bs-tooltip-opacity)}.tooltip .tooltip-arrow{display:block;width:var(--bs-tooltip-arrow-width);height:var(--bs-tooltip-arrow-height)}.tooltip .tooltip-arrow::before{position:absolute;content:"";border-color:rgba(0,0,0,0);border-style:solid}.bs-tooltip-top .tooltip-arrow,.bs-tooltip-auto[data-popper-placement^=top] .tooltip-arrow{bottom:calc(-1*var(--bs-tooltip-arrow-height))}.bs-tooltip-top .tooltip-arrow::before,.bs-tooltip-auto[data-popper-placement^=top] .tooltip-arrow::before{top:-1px;border-width:var(--bs-tooltip-arrow-height) calc(var(--bs-tooltip-arrow-width)*.5) 0;border-top-color:var(--bs-tooltip-bg)}.bs-tooltip-end .tooltip-arrow,.bs-tooltip-auto[data-popper-placement^=right] .tooltip-arrow{left:calc(-1*var(--bs-tooltip-arrow-height));width:var(--bs-tooltip-arrow-height);height:var(--bs-tooltip-arrow-width)}.bs-tooltip-end .tooltip-arrow::before,.bs-tooltip-auto[data-popper-placement^=right] .tooltip-arrow::before{right:-1px;border-width:calc(var(--bs-tooltip-arrow-width)*.5) var(--bs-tooltip-arrow-height) calc(var(--bs-tooltip-arrow-width)*.5) 0;border-right-color:var(--bs-tooltip-bg)}.bs-tooltip-bottom .tooltip-arrow,.bs-tooltip-auto[data-popper-placement^=bottom] .tooltip-arrow{top:calc(-1*var(--bs-tooltip-arrow-height))}.bs-tooltip-bottom .tooltip-arrow::before,.bs-tooltip-auto[data-popper-placement^=bottom] .tooltip-arrow::before{bottom:-1px;border-width:0 calc(var(--bs-tooltip-arrow-width)*.5) var(--bs-tooltip-arrow-height);border-bottom-color:var(--bs-tooltip-bg)}.bs-tooltip-start .tooltip-arrow,.bs-tooltip-auto[data-popper-placement^=left] 
.tooltip-arrow{right:calc(-1*var(--bs-tooltip-arrow-height));width:var(--bs-tooltip-arrow-height);height:var(--bs-tooltip-arrow-width)}.bs-tooltip-start .tooltip-arrow::before,.bs-tooltip-auto[data-popper-placement^=left] .tooltip-arrow::before{left:-1px;border-width:calc(var(--bs-tooltip-arrow-width)*.5) 0 calc(var(--bs-tooltip-arrow-width)*.5) var(--bs-tooltip-arrow-height);border-left-color:var(--bs-tooltip-bg)}.tooltip-inner{max-width:var(--bs-tooltip-max-width);padding:var(--bs-tooltip-padding-y) var(--bs-tooltip-padding-x);color:var(--bs-tooltip-color);text-align:center;background-color:var(--bs-tooltip-bg)}.popover{--bs-popover-zindex: 1070;--bs-popover-max-width: 276px;--bs-popover-font-size:0.875rem;--bs-popover-bg: #fff;--bs-popover-border-width: 1px;--bs-popover-border-color: rgba(0, 0, 0, 0.175);--bs-popover-border-radius: 0.5rem;--bs-popover-inner-border-radius: calc(0.5rem - 1px);--bs-popover-box-shadow: 0 0.5rem 1rem rgba(0, 0, 0, 0.15);--bs-popover-header-padding-x: 1rem;--bs-popover-header-padding-y: 0.5rem;--bs-popover-header-font-size:1rem;--bs-popover-header-color: inherit;--bs-popover-header-bg: #e9ecef;--bs-popover-body-padding-x: 1rem;--bs-popover-body-padding-y: 1rem;--bs-popover-body-color: #343a40;--bs-popover-arrow-width: 1rem;--bs-popover-arrow-height: 0.5rem;--bs-popover-arrow-border: var(--bs-popover-border-color);z-index:var(--bs-popover-zindex);display:block;max-width:var(--bs-popover-max-width);font-family:"Source Sans Pro",-apple-system,BlinkMacSystemFont,"Segoe UI",Roboto,"Helvetica Neue",Arial,sans-serif,"Apple Color Emoji","Segoe UI Emoji","Segoe UI 
Symbol";font-style:normal;font-weight:400;line-height:1.5;text-align:left;text-align:start;text-decoration:none;text-shadow:none;text-transform:none;letter-spacing:normal;word-break:normal;white-space:normal;word-spacing:normal;line-break:auto;font-size:var(--bs-popover-font-size);word-wrap:break-word;background-color:var(--bs-popover-bg);background-clip:padding-box;border:var(--bs-popover-border-width) solid var(--bs-popover-border-color)}.popover .popover-arrow{display:block;width:var(--bs-popover-arrow-width);height:var(--bs-popover-arrow-height)}.popover .popover-arrow::before,.popover .popover-arrow::after{position:absolute;display:block;content:"";border-color:rgba(0,0,0,0);border-style:solid;border-width:0}.bs-popover-top>.popover-arrow,.bs-popover-auto[data-popper-placement^=top]>.popover-arrow{bottom:calc(-1*(var(--bs-popover-arrow-height)) - var(--bs-popover-border-width))}.bs-popover-top>.popover-arrow::before,.bs-popover-auto[data-popper-placement^=top]>.popover-arrow::before,.bs-popover-top>.popover-arrow::after,.bs-popover-auto[data-popper-placement^=top]>.popover-arrow::after{border-width:var(--bs-popover-arrow-height) calc(var(--bs-popover-arrow-width)*.5) 0}.bs-popover-top>.popover-arrow::before,.bs-popover-auto[data-popper-placement^=top]>.popover-arrow::before{bottom:0;border-top-color:var(--bs-popover-arrow-border)}.bs-popover-top>.popover-arrow::after,.bs-popover-auto[data-popper-placement^=top]>.popover-arrow::after{bottom:var(--bs-popover-border-width);border-top-color:var(--bs-popover-bg)}.bs-popover-end>.popover-arrow,.bs-popover-auto[data-popper-placement^=right]>.popover-arrow{left:calc(-1*(var(--bs-popover-arrow-height)) - 
var(--bs-popover-border-width));width:var(--bs-popover-arrow-height);height:var(--bs-popover-arrow-width)}.bs-popover-end>.popover-arrow::before,.bs-popover-auto[data-popper-placement^=right]>.popover-arrow::before,.bs-popover-end>.popover-arrow::after,.bs-popover-auto[data-popper-placement^=right]>.popover-arrow::after{border-width:calc(var(--bs-popover-arrow-width)*.5) var(--bs-popover-arrow-height) calc(var(--bs-popover-arrow-width)*.5) 0}.bs-popover-end>.popover-arrow::before,.bs-popover-auto[data-popper-placement^=right]>.popover-arrow::before{left:0;border-right-color:var(--bs-popover-arrow-border)}.bs-popover-end>.popover-arrow::after,.bs-popover-auto[data-popper-placement^=right]>.popover-arrow::after{left:var(--bs-popover-border-width);border-right-color:var(--bs-popover-bg)}.bs-popover-bottom>.popover-arrow,.bs-popover-auto[data-popper-placement^=bottom]>.popover-arrow{top:calc(-1*(var(--bs-popover-arrow-height)) - var(--bs-popover-border-width))}.bs-popover-bottom>.popover-arrow::before,.bs-popover-auto[data-popper-placement^=bottom]>.popover-arrow::before,.bs-popover-bottom>.popover-arrow::after,.bs-popover-auto[data-popper-placement^=bottom]>.popover-arrow::after{border-width:0 calc(var(--bs-popover-arrow-width)*.5) var(--bs-popover-arrow-height)}.bs-popover-bottom>.popover-arrow::before,.bs-popover-auto[data-popper-placement^=bottom]>.popover-arrow::before{top:0;border-bottom-color:var(--bs-popover-arrow-border)}.bs-popover-bottom>.popover-arrow::after,.bs-popover-auto[data-popper-placement^=bottom]>.popover-arrow::after{top:var(--bs-popover-border-width);border-bottom-color:var(--bs-popover-bg)}.bs-popover-bottom .popover-header::before,.bs-popover-auto[data-popper-placement^=bottom] .popover-header::before{position:absolute;top:0;left:50%;display:block;width:var(--bs-popover-arrow-width);margin-left:calc(-0.5*var(--bs-popover-arrow-width));content:"";border-bottom:var(--bs-popover-border-width) solid 
var(--bs-popover-header-bg)}.bs-popover-start>.popover-arrow,.bs-popover-auto[data-popper-placement^=left]>.popover-arrow{right:calc(-1*(var(--bs-popover-arrow-height)) - var(--bs-popover-border-width));width:var(--bs-popover-arrow-height);height:var(--bs-popover-arrow-width)}.bs-popover-start>.popover-arrow::before,.bs-popover-auto[data-popper-placement^=left]>.popover-arrow::before,.bs-popover-start>.popover-arrow::after,.bs-popover-auto[data-popper-placement^=left]>.popover-arrow::after{border-width:calc(var(--bs-popover-arrow-width)*.5) 0 calc(var(--bs-popover-arrow-width)*.5) var(--bs-popover-arrow-height)}.bs-popover-start>.popover-arrow::before,.bs-popover-auto[data-popper-placement^=left]>.popover-arrow::before{right:0;border-left-color:var(--bs-popover-arrow-border)}.bs-popover-start>.popover-arrow::after,.bs-popover-auto[data-popper-placement^=left]>.popover-arrow::after{right:var(--bs-popover-border-width);border-left-color:var(--bs-popover-bg)}.popover-header{padding:var(--bs-popover-header-padding-y) var(--bs-popover-header-padding-x);margin-bottom:0;font-size:var(--bs-popover-header-font-size);color:var(--bs-popover-header-color);background-color:var(--bs-popover-header-bg);border-bottom:var(--bs-popover-border-width) solid var(--bs-popover-border-color)}.popover-header:empty{display:none}.popover-body{padding:var(--bs-popover-body-padding-y) var(--bs-popover-body-padding-x);color:var(--bs-popover-body-color)}.carousel{position:relative}.carousel.pointer-event{touch-action:pan-y;-webkit-touch-action:pan-y;-moz-touch-action:pan-y;-ms-touch-action:pan-y;-o-touch-action:pan-y}.carousel-inner{position:relative;width:100%;overflow:hidden}.carousel-inner::after{display:block;clear:both;content:""}.carousel-item{position:relative;display:none;float:left;width:100%;margin-right:-100%;backface-visibility:hidden;-webkit-backface-visibility:hidden;-moz-backface-visibility:hidden;-ms-backface-visibility:hidden;-o-backface-visibility:hidden;transition:transform 
.6s ease-in-out}@media(prefers-reduced-motion: reduce){.carousel-item{transition:none}}.carousel-item.active,.carousel-item-next,.carousel-item-prev{display:block}.carousel-item-next:not(.carousel-item-start),.active.carousel-item-end{transform:translateX(100%)}.carousel-item-prev:not(.carousel-item-end),.active.carousel-item-start{transform:translateX(-100%)}.carousel-fade .carousel-item{opacity:0;transition-property:opacity;transform:none}.carousel-fade .carousel-item.active,.carousel-fade .carousel-item-next.carousel-item-start,.carousel-fade .carousel-item-prev.carousel-item-end{z-index:1;opacity:1}.carousel-fade .active.carousel-item-start,.carousel-fade .active.carousel-item-end{z-index:0;opacity:0;transition:opacity 0s .6s}@media(prefers-reduced-motion: reduce){.carousel-fade .active.carousel-item-start,.carousel-fade .active.carousel-item-end{transition:none}}.carousel-control-prev,.carousel-control-next{position:absolute;top:0;bottom:0;z-index:1;display:flex;display:-webkit-flex;align-items:center;-webkit-align-items:center;justify-content:center;-webkit-justify-content:center;width:15%;padding:0;color:#fff;text-align:center;background:none;border:0;opacity:.5;transition:opacity .15s ease}@media(prefers-reduced-motion: reduce){.carousel-control-prev,.carousel-control-next{transition:none}}.carousel-control-prev:hover,.carousel-control-prev:focus,.carousel-control-next:hover,.carousel-control-next:focus{color:#fff;text-decoration:none;outline:0;opacity:.9}.carousel-control-prev{left:0}.carousel-control-next{right:0}.carousel-control-prev-icon,.carousel-control-next-icon{display:inline-block;width:2rem;height:2rem;background-repeat:no-repeat;background-position:50%;background-size:100% 100%}.carousel-control-prev-icon{background-image:url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 16 16' fill='%23fff'%3e%3cpath d='M11.354 1.646a.5.5 0 0 1 0 .708L5.707 8l5.647 5.646a.5.5 0 0 1-.708.708l-6-6a.5.5 0 0 1 0-.708l6-6a.5.5 0 0 1 .708 
0z'/%3e%3c/svg%3e")}.carousel-control-next-icon{background-image:url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 16 16' fill='%23fff'%3e%3cpath d='M4.646 1.646a.5.5 0 0 1 .708 0l6 6a.5.5 0 0 1 0 .708l-6 6a.5.5 0 0 1-.708-.708L10.293 8 4.646 2.354a.5.5 0 0 1 0-.708z'/%3e%3c/svg%3e")}.carousel-indicators{position:absolute;right:0;bottom:0;left:0;z-index:2;display:flex;display:-webkit-flex;justify-content:center;-webkit-justify-content:center;padding:0;margin-right:15%;margin-bottom:1rem;margin-left:15%}.carousel-indicators [data-bs-target]{box-sizing:content-box;flex:0 1 auto;-webkit-flex:0 1 auto;width:30px;height:3px;padding:0;margin-right:3px;margin-left:3px;text-indent:-999px;cursor:pointer;background-color:#fff;background-clip:padding-box;border:0;border-top:10px solid rgba(0,0,0,0);border-bottom:10px solid rgba(0,0,0,0);opacity:.5;transition:opacity .6s ease}@media(prefers-reduced-motion: reduce){.carousel-indicators [data-bs-target]{transition:none}}.carousel-indicators .active{opacity:1}.carousel-caption{position:absolute;right:15%;bottom:1.25rem;left:15%;padding-top:1.25rem;padding-bottom:1.25rem;color:#fff;text-align:center}.carousel-dark .carousel-control-prev-icon,.carousel-dark .carousel-control-next-icon{filter:invert(1) grayscale(100)}.carousel-dark .carousel-indicators [data-bs-target]{background-color:#000}.carousel-dark .carousel-caption{color:#000}[data-bs-theme=dark] .carousel .carousel-control-prev-icon,[data-bs-theme=dark] .carousel .carousel-control-next-icon,[data-bs-theme=dark].carousel .carousel-control-prev-icon,[data-bs-theme=dark].carousel .carousel-control-next-icon{filter:invert(1) grayscale(100)}[data-bs-theme=dark] .carousel .carousel-indicators [data-bs-target],[data-bs-theme=dark].carousel .carousel-indicators [data-bs-target]{background-color:#000}[data-bs-theme=dark] .carousel .carousel-caption,[data-bs-theme=dark].carousel 
.carousel-caption{color:#000}.spinner-grow,.spinner-border{display:inline-block;width:var(--bs-spinner-width);height:var(--bs-spinner-height);vertical-align:var(--bs-spinner-vertical-align);border-radius:50%;animation:var(--bs-spinner-animation-speed) linear infinite var(--bs-spinner-animation-name)}@keyframes spinner-border{to{transform:rotate(360deg) /* rtl:ignore */}}.spinner-border{--bs-spinner-width: 2rem;--bs-spinner-height: 2rem;--bs-spinner-vertical-align: -0.125em;--bs-spinner-border-width: 0.25em;--bs-spinner-animation-speed: 0.75s;--bs-spinner-animation-name: spinner-border;border:var(--bs-spinner-border-width) solid currentcolor;border-right-color:rgba(0,0,0,0)}.spinner-border-sm{--bs-spinner-width: 1rem;--bs-spinner-height: 1rem;--bs-spinner-border-width: 0.2em}@keyframes spinner-grow{0%{transform:scale(0)}50%{opacity:1;transform:none}}.spinner-grow{--bs-spinner-width: 2rem;--bs-spinner-height: 2rem;--bs-spinner-vertical-align: -0.125em;--bs-spinner-animation-speed: 0.75s;--bs-spinner-animation-name: spinner-grow;background-color:currentcolor;opacity:0}.spinner-grow-sm{--bs-spinner-width: 1rem;--bs-spinner-height: 1rem}@media(prefers-reduced-motion: reduce){.spinner-border,.spinner-grow{--bs-spinner-animation-speed: 1.5s}}.offcanvas,.offcanvas-xxl,.offcanvas-xl,.offcanvas-lg,.offcanvas-md,.offcanvas-sm{--bs-offcanvas-zindex: 1045;--bs-offcanvas-width: 400px;--bs-offcanvas-height: 30vh;--bs-offcanvas-padding-x: 1rem;--bs-offcanvas-padding-y: 1rem;--bs-offcanvas-color: #343a40;--bs-offcanvas-bg: #fff;--bs-offcanvas-border-width: 1px;--bs-offcanvas-border-color: rgba(0, 0, 0, 0.175);--bs-offcanvas-box-shadow: 0 0.125rem 0.25rem rgba(0, 0, 0, 0.075);--bs-offcanvas-transition: transform 0.3s ease-in-out;--bs-offcanvas-title-line-height: 1.5}@media(max-width: 
575.98px){.offcanvas-sm{position:fixed;bottom:0;z-index:var(--bs-offcanvas-zindex);display:flex;display:-webkit-flex;flex-direction:column;-webkit-flex-direction:column;max-width:100%;color:var(--bs-offcanvas-color);visibility:hidden;background-color:var(--bs-offcanvas-bg);background-clip:padding-box;outline:0;transition:var(--bs-offcanvas-transition)}}@media(max-width: 575.98px)and (prefers-reduced-motion: reduce){.offcanvas-sm{transition:none}}@media(max-width: 575.98px){.offcanvas-sm.offcanvas-start{top:0;left:0;width:var(--bs-offcanvas-width);border-right:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateX(-100%)}.offcanvas-sm.offcanvas-end{top:0;right:0;width:var(--bs-offcanvas-width);border-left:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateX(100%)}.offcanvas-sm.offcanvas-top{top:0;right:0;left:0;height:var(--bs-offcanvas-height);max-height:100%;border-bottom:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateY(-100%)}.offcanvas-sm.offcanvas-bottom{right:0;left:0;height:var(--bs-offcanvas-height);max-height:100%;border-top:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateY(100%)}.offcanvas-sm.showing,.offcanvas-sm.show:not(.hiding){transform:none}.offcanvas-sm.showing,.offcanvas-sm.hiding,.offcanvas-sm.show{visibility:visible}}@media(min-width: 576px){.offcanvas-sm{--bs-offcanvas-height: auto;--bs-offcanvas-border-width: 0;background-color:rgba(0,0,0,0) !important}.offcanvas-sm .offcanvas-header{display:none}.offcanvas-sm .offcanvas-body{display:flex;display:-webkit-flex;flex-grow:0;-webkit-flex-grow:0;padding:0;overflow-y:visible;background-color:rgba(0,0,0,0) !important}}@media(max-width: 
767.98px){.offcanvas-md{position:fixed;bottom:0;z-index:var(--bs-offcanvas-zindex);display:flex;display:-webkit-flex;flex-direction:column;-webkit-flex-direction:column;max-width:100%;color:var(--bs-offcanvas-color);visibility:hidden;background-color:var(--bs-offcanvas-bg);background-clip:padding-box;outline:0;transition:var(--bs-offcanvas-transition)}}@media(max-width: 767.98px)and (prefers-reduced-motion: reduce){.offcanvas-md{transition:none}}@media(max-width: 767.98px){.offcanvas-md.offcanvas-start{top:0;left:0;width:var(--bs-offcanvas-width);border-right:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateX(-100%)}.offcanvas-md.offcanvas-end{top:0;right:0;width:var(--bs-offcanvas-width);border-left:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateX(100%)}.offcanvas-md.offcanvas-top{top:0;right:0;left:0;height:var(--bs-offcanvas-height);max-height:100%;border-bottom:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateY(-100%)}.offcanvas-md.offcanvas-bottom{right:0;left:0;height:var(--bs-offcanvas-height);max-height:100%;border-top:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateY(100%)}.offcanvas-md.showing,.offcanvas-md.show:not(.hiding){transform:none}.offcanvas-md.showing,.offcanvas-md.hiding,.offcanvas-md.show{visibility:visible}}@media(min-width: 768px){.offcanvas-md{--bs-offcanvas-height: auto;--bs-offcanvas-border-width: 0;background-color:rgba(0,0,0,0) !important}.offcanvas-md .offcanvas-header{display:none}.offcanvas-md .offcanvas-body{display:flex;display:-webkit-flex;flex-grow:0;-webkit-flex-grow:0;padding:0;overflow-y:visible;background-color:rgba(0,0,0,0) !important}}@media(max-width: 
991.98px){.offcanvas-lg{position:fixed;bottom:0;z-index:var(--bs-offcanvas-zindex);display:flex;display:-webkit-flex;flex-direction:column;-webkit-flex-direction:column;max-width:100%;color:var(--bs-offcanvas-color);visibility:hidden;background-color:var(--bs-offcanvas-bg);background-clip:padding-box;outline:0;transition:var(--bs-offcanvas-transition)}}@media(max-width: 991.98px)and (prefers-reduced-motion: reduce){.offcanvas-lg{transition:none}}@media(max-width: 991.98px){.offcanvas-lg.offcanvas-start{top:0;left:0;width:var(--bs-offcanvas-width);border-right:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateX(-100%)}.offcanvas-lg.offcanvas-end{top:0;right:0;width:var(--bs-offcanvas-width);border-left:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateX(100%)}.offcanvas-lg.offcanvas-top{top:0;right:0;left:0;height:var(--bs-offcanvas-height);max-height:100%;border-bottom:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateY(-100%)}.offcanvas-lg.offcanvas-bottom{right:0;left:0;height:var(--bs-offcanvas-height);max-height:100%;border-top:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateY(100%)}.offcanvas-lg.showing,.offcanvas-lg.show:not(.hiding){transform:none}.offcanvas-lg.showing,.offcanvas-lg.hiding,.offcanvas-lg.show{visibility:visible}}@media(min-width: 992px){.offcanvas-lg{--bs-offcanvas-height: auto;--bs-offcanvas-border-width: 0;background-color:rgba(0,0,0,0) !important}.offcanvas-lg .offcanvas-header{display:none}.offcanvas-lg .offcanvas-body{display:flex;display:-webkit-flex;flex-grow:0;-webkit-flex-grow:0;padding:0;overflow-y:visible;background-color:rgba(0,0,0,0) !important}}@media(max-width: 
1199.98px){.offcanvas-xl{position:fixed;bottom:0;z-index:var(--bs-offcanvas-zindex);display:flex;display:-webkit-flex;flex-direction:column;-webkit-flex-direction:column;max-width:100%;color:var(--bs-offcanvas-color);visibility:hidden;background-color:var(--bs-offcanvas-bg);background-clip:padding-box;outline:0;transition:var(--bs-offcanvas-transition)}}@media(max-width: 1199.98px)and (prefers-reduced-motion: reduce){.offcanvas-xl{transition:none}}@media(max-width: 1199.98px){.offcanvas-xl.offcanvas-start{top:0;left:0;width:var(--bs-offcanvas-width);border-right:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateX(-100%)}.offcanvas-xl.offcanvas-end{top:0;right:0;width:var(--bs-offcanvas-width);border-left:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateX(100%)}.offcanvas-xl.offcanvas-top{top:0;right:0;left:0;height:var(--bs-offcanvas-height);max-height:100%;border-bottom:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateY(-100%)}.offcanvas-xl.offcanvas-bottom{right:0;left:0;height:var(--bs-offcanvas-height);max-height:100%;border-top:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateY(100%)}.offcanvas-xl.showing,.offcanvas-xl.show:not(.hiding){transform:none}.offcanvas-xl.showing,.offcanvas-xl.hiding,.offcanvas-xl.show{visibility:visible}}@media(min-width: 1200px){.offcanvas-xl{--bs-offcanvas-height: auto;--bs-offcanvas-border-width: 0;background-color:rgba(0,0,0,0) !important}.offcanvas-xl .offcanvas-header{display:none}.offcanvas-xl .offcanvas-body{display:flex;display:-webkit-flex;flex-grow:0;-webkit-flex-grow:0;padding:0;overflow-y:visible;background-color:rgba(0,0,0,0) !important}}@media(max-width: 
1399.98px){.offcanvas-xxl{position:fixed;bottom:0;z-index:var(--bs-offcanvas-zindex);display:flex;display:-webkit-flex;flex-direction:column;-webkit-flex-direction:column;max-width:100%;color:var(--bs-offcanvas-color);visibility:hidden;background-color:var(--bs-offcanvas-bg);background-clip:padding-box;outline:0;transition:var(--bs-offcanvas-transition)}}@media(max-width: 1399.98px)and (prefers-reduced-motion: reduce){.offcanvas-xxl{transition:none}}@media(max-width: 1399.98px){.offcanvas-xxl.offcanvas-start{top:0;left:0;width:var(--bs-offcanvas-width);border-right:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateX(-100%)}.offcanvas-xxl.offcanvas-end{top:0;right:0;width:var(--bs-offcanvas-width);border-left:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateX(100%)}.offcanvas-xxl.offcanvas-top{top:0;right:0;left:0;height:var(--bs-offcanvas-height);max-height:100%;border-bottom:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateY(-100%)}.offcanvas-xxl.offcanvas-bottom{right:0;left:0;height:var(--bs-offcanvas-height);max-height:100%;border-top:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateY(100%)}.offcanvas-xxl.showing,.offcanvas-xxl.show:not(.hiding){transform:none}.offcanvas-xxl.showing,.offcanvas-xxl.hiding,.offcanvas-xxl.show{visibility:visible}}@media(min-width: 1400px){.offcanvas-xxl{--bs-offcanvas-height: auto;--bs-offcanvas-border-width: 0;background-color:rgba(0,0,0,0) !important}.offcanvas-xxl .offcanvas-header{display:none}.offcanvas-xxl .offcanvas-body{display:flex;display:-webkit-flex;flex-grow:0;-webkit-flex-grow:0;padding:0;overflow-y:visible;background-color:rgba(0,0,0,0) 
!important}}.offcanvas{position:fixed;bottom:0;z-index:var(--bs-offcanvas-zindex);display:flex;display:-webkit-flex;flex-direction:column;-webkit-flex-direction:column;max-width:100%;color:var(--bs-offcanvas-color);visibility:hidden;background-color:var(--bs-offcanvas-bg);background-clip:padding-box;outline:0;transition:var(--bs-offcanvas-transition)}@media(prefers-reduced-motion: reduce){.offcanvas{transition:none}}.offcanvas.offcanvas-start{top:0;left:0;width:var(--bs-offcanvas-width);border-right:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateX(-100%)}.offcanvas.offcanvas-end{top:0;right:0;width:var(--bs-offcanvas-width);border-left:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateX(100%)}.offcanvas.offcanvas-top{top:0;right:0;left:0;height:var(--bs-offcanvas-height);max-height:100%;border-bottom:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateY(-100%)}.offcanvas.offcanvas-bottom{right:0;left:0;height:var(--bs-offcanvas-height);max-height:100%;border-top:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateY(100%)}.offcanvas.showing,.offcanvas.show:not(.hiding){transform:none}.offcanvas.showing,.offcanvas.hiding,.offcanvas.show{visibility:visible}.offcanvas-backdrop{position:fixed;top:0;left:0;z-index:1040;width:100vw;height:100vh;background-color:#000}.offcanvas-backdrop.fade{opacity:0}.offcanvas-backdrop.show{opacity:.5}.offcanvas-header{display:flex;display:-webkit-flex;align-items:center;-webkit-align-items:center;justify-content:space-between;-webkit-justify-content:space-between;padding:var(--bs-offcanvas-padding-y) var(--bs-offcanvas-padding-x)}.offcanvas-header .btn-close{padding:calc(var(--bs-offcanvas-padding-y)*.5) 
calc(var(--bs-offcanvas-padding-x)*.5);margin-top:calc(-0.5*var(--bs-offcanvas-padding-y));margin-right:calc(-0.5*var(--bs-offcanvas-padding-x));margin-bottom:calc(-0.5*var(--bs-offcanvas-padding-y))}.offcanvas-title{margin-bottom:0;line-height:var(--bs-offcanvas-title-line-height)}.offcanvas-body{flex-grow:1;-webkit-flex-grow:1;padding:var(--bs-offcanvas-padding-y) var(--bs-offcanvas-padding-x);overflow-y:auto}.placeholder{display:inline-block;min-height:1em;vertical-align:middle;cursor:wait;background-color:currentcolor;opacity:.5}.placeholder.btn::before{display:inline-block;content:""}.placeholder-xs{min-height:.6em}.placeholder-sm{min-height:.8em}.placeholder-lg{min-height:1.2em}.placeholder-glow .placeholder{animation:placeholder-glow 2s ease-in-out infinite}@keyframes placeholder-glow{50%{opacity:.2}}.placeholder-wave{mask-image:linear-gradient(130deg, #000 55%, rgba(0, 0, 0, 0.8) 75%, #000 95%);-webkit-mask-image:linear-gradient(130deg, #000 55%, rgba(0, 0, 0, 0.8) 75%, #000 95%);mask-size:200% 100%;-webkit-mask-size:200% 100%;animation:placeholder-wave 2s linear infinite}@keyframes placeholder-wave{100%{mask-position:-200% 0%;-webkit-mask-position:-200% 0%}}.clearfix::after{display:block;clear:both;content:""}.text-bg-default{color:#fff !important;background-color:RGBA(var(--bs-default-rgb), var(--bs-bg-opacity, 1)) !important}.text-bg-primary{color:#fff !important;background-color:RGBA(var(--bs-primary-rgb), var(--bs-bg-opacity, 1)) !important}.text-bg-secondary{color:#fff !important;background-color:RGBA(var(--bs-secondary-rgb), var(--bs-bg-opacity, 1)) !important}.text-bg-success{color:#fff !important;background-color:RGBA(var(--bs-success-rgb), var(--bs-bg-opacity, 1)) !important}.text-bg-info{color:#fff !important;background-color:RGBA(var(--bs-info-rgb), var(--bs-bg-opacity, 1)) !important}.text-bg-warning{color:#fff !important;background-color:RGBA(var(--bs-warning-rgb), var(--bs-bg-opacity, 1)) !important}.text-bg-danger{color:#fff 
!important;background-color:RGBA(var(--bs-danger-rgb), var(--bs-bg-opacity, 1)) !important}.text-bg-light{color:#000 !important;background-color:RGBA(var(--bs-light-rgb), var(--bs-bg-opacity, 1)) !important}.text-bg-dark{color:#fff !important;background-color:RGBA(var(--bs-dark-rgb), var(--bs-bg-opacity, 1)) !important}.link-default{color:RGBA(var(--bs-default-rgb), var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(var(--bs-default-rgb), var(--bs-link-underline-opacity, 1)) !important}.link-default:hover,.link-default:focus{color:RGBA(42, 46, 51, var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(42, 46, 51, var(--bs-link-underline-opacity, 1)) !important}.link-primary{color:RGBA(var(--bs-primary-rgb), var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(var(--bs-primary-rgb), var(--bs-link-underline-opacity, 1)) !important}.link-primary:hover,.link-primary:focus{color:RGBA(31, 102, 182, var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(31, 102, 182, var(--bs-link-underline-opacity, 1)) !important}.link-secondary{color:RGBA(var(--bs-secondary-rgb), var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(var(--bs-secondary-rgb), var(--bs-link-underline-opacity, 1)) !important}.link-secondary:hover,.link-secondary:focus{color:RGBA(42, 46, 51, var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(42, 46, 51, var(--bs-link-underline-opacity, 1)) !important}.link-success{color:RGBA(var(--bs-success-rgb), var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(var(--bs-success-rgb), var(--bs-link-underline-opacity, 1)) !important}.link-success:hover,.link-success:focus{color:RGBA(50, 146, 19, var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(50, 146, 19, var(--bs-link-underline-opacity, 1)) !important}.link-info{color:RGBA(var(--bs-info-rgb), var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(var(--bs-info-rgb), var(--bs-link-underline-opacity, 1)) 
!important}.link-info:hover,.link-info:focus{color:RGBA(122, 67, 150, var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(122, 67, 150, var(--bs-link-underline-opacity, 1)) !important}.link-warning{color:RGBA(var(--bs-warning-rgb), var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(var(--bs-warning-rgb), var(--bs-link-underline-opacity, 1)) !important}.link-warning:hover,.link-warning:focus{color:RGBA(204, 94, 19, var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(204, 94, 19, var(--bs-link-underline-opacity, 1)) !important}.link-danger{color:RGBA(var(--bs-danger-rgb), var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(var(--bs-danger-rgb), var(--bs-link-underline-opacity, 1)) !important}.link-danger:hover,.link-danger:focus{color:RGBA(204, 0, 46, var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(204, 0, 46, var(--bs-link-underline-opacity, 1)) !important}.link-light{color:RGBA(var(--bs-light-rgb), var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(var(--bs-light-rgb), var(--bs-link-underline-opacity, 1)) !important}.link-light:hover,.link-light:focus{color:RGBA(249, 250, 251, var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(249, 250, 251, var(--bs-link-underline-opacity, 1)) !important}.link-dark{color:RGBA(var(--bs-dark-rgb), var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(var(--bs-dark-rgb), var(--bs-link-underline-opacity, 1)) !important}.link-dark:hover,.link-dark:focus{color:RGBA(42, 46, 51, var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(42, 46, 51, var(--bs-link-underline-opacity, 1)) !important}.link-body-emphasis{color:RGBA(var(--bs-emphasis-color-rgb), var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(var(--bs-emphasis-color-rgb), var(--bs-link-underline-opacity, 1)) !important}.link-body-emphasis:hover,.link-body-emphasis:focus{color:RGBA(var(--bs-emphasis-color-rgb), var(--bs-link-opacity, 0.75)) 
!important;text-decoration-color:RGBA(var(--bs-emphasis-color-rgb), var(--bs-link-underline-opacity, 0.75)) !important}.focus-ring:focus{outline:0;box-shadow:var(--bs-focus-ring-x, 0) var(--bs-focus-ring-y, 0) var(--bs-focus-ring-blur, 0) var(--bs-focus-ring-width) var(--bs-focus-ring-color)}.icon-link{display:inline-flex;gap:.375rem;align-items:center;-webkit-align-items:center;text-decoration-color:rgba(var(--bs-link-color-rgb), var(--bs-link-opacity, 0.5));text-underline-offset:.25em;backface-visibility:hidden;-webkit-backface-visibility:hidden;-moz-backface-visibility:hidden;-ms-backface-visibility:hidden;-o-backface-visibility:hidden}.icon-link>.bi{flex-shrink:0;-webkit-flex-shrink:0;width:1em;height:1em;fill:currentcolor;transition:.2s ease-in-out transform}@media(prefers-reduced-motion: reduce){.icon-link>.bi{transition:none}}.icon-link-hover:hover>.bi,.icon-link-hover:focus-visible>.bi{transform:var(--bs-icon-link-transform, translate3d(0.25em, 0, 0))}.ratio{position:relative;width:100%}.ratio::before{display:block;padding-top:var(--bs-aspect-ratio);content:""}.ratio>*{position:absolute;top:0;left:0;width:100%;height:100%}.ratio-1x1{--bs-aspect-ratio: 100%}.ratio-4x3{--bs-aspect-ratio: 75%}.ratio-16x9{--bs-aspect-ratio: 56.25%}.ratio-21x9{--bs-aspect-ratio: 42.8571428571%}.fixed-top{position:fixed;top:0;right:0;left:0;z-index:1030}.fixed-bottom{position:fixed;right:0;bottom:0;left:0;z-index:1030}.sticky-top{position:sticky;top:0;z-index:1020}.sticky-bottom{position:sticky;bottom:0;z-index:1020}@media(min-width: 576px){.sticky-sm-top{position:sticky;top:0;z-index:1020}.sticky-sm-bottom{position:sticky;bottom:0;z-index:1020}}@media(min-width: 768px){.sticky-md-top{position:sticky;top:0;z-index:1020}.sticky-md-bottom{position:sticky;bottom:0;z-index:1020}}@media(min-width: 992px){.sticky-lg-top{position:sticky;top:0;z-index:1020}.sticky-lg-bottom{position:sticky;bottom:0;z-index:1020}}@media(min-width: 
1200px){.sticky-xl-top{position:sticky;top:0;z-index:1020}.sticky-xl-bottom{position:sticky;bottom:0;z-index:1020}}@media(min-width: 1400px){.sticky-xxl-top{position:sticky;top:0;z-index:1020}.sticky-xxl-bottom{position:sticky;bottom:0;z-index:1020}}.hstack{display:flex;display:-webkit-flex;flex-direction:row;-webkit-flex-direction:row;align-items:center;-webkit-align-items:center;align-self:stretch;-webkit-align-self:stretch}.vstack{display:flex;display:-webkit-flex;flex:1 1 auto;-webkit-flex:1 1 auto;flex-direction:column;-webkit-flex-direction:column;align-self:stretch;-webkit-align-self:stretch}.visually-hidden,.visually-hidden-focusable:not(:focus):not(:focus-within){width:1px !important;height:1px !important;padding:0 !important;margin:-1px !important;overflow:hidden !important;clip:rect(0, 0, 0, 0) !important;white-space:nowrap !important;border:0 !important}.visually-hidden:not(caption),.visually-hidden-focusable:not(:focus):not(:focus-within):not(caption){position:absolute !important}.stretched-link::after{position:absolute;top:0;right:0;bottom:0;left:0;z-index:1;content:""}.text-truncate{overflow:hidden;text-overflow:ellipsis;white-space:nowrap}.vr{display:inline-block;align-self:stretch;-webkit-align-self:stretch;width:1px;min-height:1em;background-color:currentcolor;opacity:.25}.align-baseline{vertical-align:baseline !important}.align-top{vertical-align:top !important}.align-middle{vertical-align:middle !important}.align-bottom{vertical-align:bottom !important}.align-text-bottom{vertical-align:text-bottom !important}.align-text-top{vertical-align:text-top !important}.float-start{float:left !important}.float-end{float:right !important}.float-none{float:none !important}.object-fit-contain{object-fit:contain !important}.object-fit-cover{object-fit:cover !important}.object-fit-fill{object-fit:fill !important}.object-fit-scale{object-fit:scale-down !important}.object-fit-none{object-fit:none !important}.opacity-0{opacity:0 !important}.opacity-25{opacity:.25 
!important}.opacity-50{opacity:.5 !important}.opacity-75{opacity:.75 !important}.opacity-100{opacity:1 !important}.overflow-auto{overflow:auto !important}.overflow-hidden{overflow:hidden !important}.overflow-visible{overflow:visible !important}.overflow-scroll{overflow:scroll !important}.overflow-x-auto{overflow-x:auto !important}.overflow-x-hidden{overflow-x:hidden !important}.overflow-x-visible{overflow-x:visible !important}.overflow-x-scroll{overflow-x:scroll !important}.overflow-y-auto{overflow-y:auto !important}.overflow-y-hidden{overflow-y:hidden !important}.overflow-y-visible{overflow-y:visible !important}.overflow-y-scroll{overflow-y:scroll !important}.d-inline{display:inline !important}.d-inline-block{display:inline-block !important}.d-block{display:block !important}.d-grid{display:grid !important}.d-inline-grid{display:inline-grid !important}.d-table{display:table !important}.d-table-row{display:table-row !important}.d-table-cell{display:table-cell !important}.d-flex{display:flex !important}.d-inline-flex{display:inline-flex !important}.d-none{display:none !important}.shadow{box-shadow:0 .5rem 1rem rgba(0,0,0,.15) !important}.shadow-sm{box-shadow:0 .125rem .25rem rgba(0,0,0,.075) !important}.shadow-lg{box-shadow:0 1rem 3rem rgba(0,0,0,.175) !important}.shadow-none{box-shadow:none !important}.focus-ring-default{--bs-focus-ring-color: rgba(var(--bs-default-rgb), var(--bs-focus-ring-opacity))}.focus-ring-primary{--bs-focus-ring-color: rgba(var(--bs-primary-rgb), var(--bs-focus-ring-opacity))}.focus-ring-secondary{--bs-focus-ring-color: rgba(var(--bs-secondary-rgb), var(--bs-focus-ring-opacity))}.focus-ring-success{--bs-focus-ring-color: rgba(var(--bs-success-rgb), var(--bs-focus-ring-opacity))}.focus-ring-info{--bs-focus-ring-color: rgba(var(--bs-info-rgb), var(--bs-focus-ring-opacity))}.focus-ring-warning{--bs-focus-ring-color: rgba(var(--bs-warning-rgb), var(--bs-focus-ring-opacity))}.focus-ring-danger{--bs-focus-ring-color: rgba(var(--bs-danger-rgb), 
var(--bs-focus-ring-opacity))}.focus-ring-light{--bs-focus-ring-color: rgba(var(--bs-light-rgb), var(--bs-focus-ring-opacity))}.focus-ring-dark{--bs-focus-ring-color: rgba(var(--bs-dark-rgb), var(--bs-focus-ring-opacity))}.position-static{position:static !important}.position-relative{position:relative !important}.position-absolute{position:absolute !important}.position-fixed{position:fixed !important}.position-sticky{position:sticky !important}.top-0{top:0 !important}.top-50{top:50% !important}.top-100{top:100% !important}.bottom-0{bottom:0 !important}.bottom-50{bottom:50% !important}.bottom-100{bottom:100% !important}.start-0{left:0 !important}.start-50{left:50% !important}.start-100{left:100% !important}.end-0{right:0 !important}.end-50{right:50% !important}.end-100{right:100% !important}.translate-middle{transform:translate(-50%, -50%) !important}.translate-middle-x{transform:translateX(-50%) !important}.translate-middle-y{transform:translateY(-50%) !important}.border{border:var(--bs-border-width) var(--bs-border-style) var(--bs-border-color) !important}.border-0{border:0 !important}.border-top{border-top:var(--bs-border-width) var(--bs-border-style) var(--bs-border-color) !important}.border-top-0{border-top:0 !important}.border-end{border-right:var(--bs-border-width) var(--bs-border-style) var(--bs-border-color) !important}.border-end-0{border-right:0 !important}.border-bottom{border-bottom:var(--bs-border-width) var(--bs-border-style) var(--bs-border-color) !important}.border-bottom-0{border-bottom:0 !important}.border-start{border-left:var(--bs-border-width) var(--bs-border-style) var(--bs-border-color) !important}.border-start-0{border-left:0 !important}.border-default{--bs-border-opacity: 1;border-color:rgba(var(--bs-default-rgb), var(--bs-border-opacity)) !important}.border-primary{--bs-border-opacity: 1;border-color:rgba(var(--bs-primary-rgb), var(--bs-border-opacity)) !important}.border-secondary{--bs-border-opacity: 
1;border-color:rgba(var(--bs-secondary-rgb), var(--bs-border-opacity)) !important}.border-success{--bs-border-opacity: 1;border-color:rgba(var(--bs-success-rgb), var(--bs-border-opacity)) !important}.border-info{--bs-border-opacity: 1;border-color:rgba(var(--bs-info-rgb), var(--bs-border-opacity)) !important}.border-warning{--bs-border-opacity: 1;border-color:rgba(var(--bs-warning-rgb), var(--bs-border-opacity)) !important}.border-danger{--bs-border-opacity: 1;border-color:rgba(var(--bs-danger-rgb), var(--bs-border-opacity)) !important}.border-light{--bs-border-opacity: 1;border-color:rgba(var(--bs-light-rgb), var(--bs-border-opacity)) !important}.border-dark{--bs-border-opacity: 1;border-color:rgba(var(--bs-dark-rgb), var(--bs-border-opacity)) !important}.border-black{--bs-border-opacity: 1;border-color:rgba(var(--bs-black-rgb), var(--bs-border-opacity)) !important}.border-white{--bs-border-opacity: 1;border-color:rgba(var(--bs-white-rgb), var(--bs-border-opacity)) !important}.border-primary-subtle{border-color:var(--bs-primary-border-subtle) !important}.border-secondary-subtle{border-color:var(--bs-secondary-border-subtle) !important}.border-success-subtle{border-color:var(--bs-success-border-subtle) !important}.border-info-subtle{border-color:var(--bs-info-border-subtle) !important}.border-warning-subtle{border-color:var(--bs-warning-border-subtle) !important}.border-danger-subtle{border-color:var(--bs-danger-border-subtle) !important}.border-light-subtle{border-color:var(--bs-light-border-subtle) !important}.border-dark-subtle{border-color:var(--bs-dark-border-subtle) !important}.border-1{border-width:1px !important}.border-2{border-width:2px !important}.border-3{border-width:3px !important}.border-4{border-width:4px !important}.border-5{border-width:5px !important}.border-opacity-10{--bs-border-opacity: 0.1}.border-opacity-25{--bs-border-opacity: 0.25}.border-opacity-50{--bs-border-opacity: 0.5}.border-opacity-75{--bs-border-opacity: 
0.75}.border-opacity-100{--bs-border-opacity: 1}.w-25{width:25% !important}.w-50{width:50% !important}.w-75{width:75% !important}.w-100{width:100% !important}.w-auto{width:auto !important}.mw-100{max-width:100% !important}.vw-100{width:100vw !important}.min-vw-100{min-width:100vw !important}.h-25{height:25% !important}.h-50{height:50% !important}.h-75{height:75% !important}.h-100{height:100% !important}.h-auto{height:auto !important}.mh-100{max-height:100% !important}.vh-100{height:100vh !important}.min-vh-100{min-height:100vh !important}.flex-fill{flex:1 1 auto !important}.flex-row{flex-direction:row !important}.flex-column{flex-direction:column !important}.flex-row-reverse{flex-direction:row-reverse !important}.flex-column-reverse{flex-direction:column-reverse !important}.flex-grow-0{flex-grow:0 !important}.flex-grow-1{flex-grow:1 !important}.flex-shrink-0{flex-shrink:0 !important}.flex-shrink-1{flex-shrink:1 !important}.flex-wrap{flex-wrap:wrap !important}.flex-nowrap{flex-wrap:nowrap !important}.flex-wrap-reverse{flex-wrap:wrap-reverse !important}.justify-content-start{justify-content:flex-start !important}.justify-content-end{justify-content:flex-end !important}.justify-content-center{justify-content:center !important}.justify-content-between{justify-content:space-between !important}.justify-content-around{justify-content:space-around !important}.justify-content-evenly{justify-content:space-evenly !important}.align-items-start{align-items:flex-start !important}.align-items-end{align-items:flex-end !important}.align-items-center{align-items:center !important}.align-items-baseline{align-items:baseline !important}.align-items-stretch{align-items:stretch !important}.align-content-start{align-content:flex-start !important}.align-content-end{align-content:flex-end !important}.align-content-center{align-content:center !important}.align-content-between{align-content:space-between !important}.align-content-around{align-content:space-around 
!important}.align-content-stretch{align-content:stretch !important}.align-self-auto{align-self:auto !important}.align-self-start{align-self:flex-start !important}.align-self-end{align-self:flex-end !important}.align-self-center{align-self:center !important}.align-self-baseline{align-self:baseline !important}.align-self-stretch{align-self:stretch !important}.order-first{order:-1 !important}.order-0{order:0 !important}.order-1{order:1 !important}.order-2{order:2 !important}.order-3{order:3 !important}.order-4{order:4 !important}.order-5{order:5 !important}.order-last{order:6 !important}.m-0{margin:0 !important}.m-1{margin:.25rem !important}.m-2{margin:.5rem !important}.m-3{margin:1rem !important}.m-4{margin:1.5rem !important}.m-5{margin:3rem !important}.m-auto{margin:auto !important}.mx-0{margin-right:0 !important;margin-left:0 !important}.mx-1{margin-right:.25rem !important;margin-left:.25rem !important}.mx-2{margin-right:.5rem !important;margin-left:.5rem !important}.mx-3{margin-right:1rem !important;margin-left:1rem !important}.mx-4{margin-right:1.5rem !important;margin-left:1.5rem !important}.mx-5{margin-right:3rem !important;margin-left:3rem !important}.mx-auto{margin-right:auto !important;margin-left:auto !important}.my-0{margin-top:0 !important;margin-bottom:0 !important}.my-1{margin-top:.25rem !important;margin-bottom:.25rem !important}.my-2{margin-top:.5rem !important;margin-bottom:.5rem !important}.my-3{margin-top:1rem !important;margin-bottom:1rem !important}.my-4{margin-top:1.5rem !important;margin-bottom:1.5rem !important}.my-5{margin-top:3rem !important;margin-bottom:3rem !important}.my-auto{margin-top:auto !important;margin-bottom:auto !important}.mt-0{margin-top:0 !important}.mt-1{margin-top:.25rem !important}.mt-2{margin-top:.5rem !important}.mt-3{margin-top:1rem !important}.mt-4{margin-top:1.5rem !important}.mt-5{margin-top:3rem !important}.mt-auto{margin-top:auto !important}.me-0{margin-right:0 !important}.me-1{margin-right:.25rem 
!important}.me-2{margin-right:.5rem !important}.me-3{margin-right:1rem !important}.me-4{margin-right:1.5rem !important}.me-5{margin-right:3rem !important}.me-auto{margin-right:auto !important}.mb-0{margin-bottom:0 !important}.mb-1{margin-bottom:.25rem !important}.mb-2{margin-bottom:.5rem !important}.mb-3{margin-bottom:1rem !important}.mb-4{margin-bottom:1.5rem !important}.mb-5{margin-bottom:3rem !important}.mb-auto{margin-bottom:auto !important}.ms-0{margin-left:0 !important}.ms-1{margin-left:.25rem !important}.ms-2{margin-left:.5rem !important}.ms-3{margin-left:1rem !important}.ms-4{margin-left:1.5rem !important}.ms-5{margin-left:3rem !important}.ms-auto{margin-left:auto !important}.p-0{padding:0 !important}.p-1{padding:.25rem !important}.p-2{padding:.5rem !important}.p-3{padding:1rem !important}.p-4{padding:1.5rem !important}.p-5{padding:3rem !important}.px-0{padding-right:0 !important;padding-left:0 !important}.px-1{padding-right:.25rem !important;padding-left:.25rem !important}.px-2{padding-right:.5rem !important;padding-left:.5rem !important}.px-3{padding-right:1rem !important;padding-left:1rem !important}.px-4{padding-right:1.5rem !important;padding-left:1.5rem !important}.px-5{padding-right:3rem !important;padding-left:3rem !important}.py-0{padding-top:0 !important;padding-bottom:0 !important}.py-1{padding-top:.25rem !important;padding-bottom:.25rem !important}.py-2{padding-top:.5rem !important;padding-bottom:.5rem !important}.py-3{padding-top:1rem !important;padding-bottom:1rem !important}.py-4{padding-top:1.5rem !important;padding-bottom:1.5rem !important}.py-5{padding-top:3rem !important;padding-bottom:3rem !important}.pt-0{padding-top:0 !important}.pt-1{padding-top:.25rem !important}.pt-2{padding-top:.5rem !important}.pt-3{padding-top:1rem !important}.pt-4{padding-top:1.5rem !important}.pt-5{padding-top:3rem !important}.pe-0{padding-right:0 !important}.pe-1{padding-right:.25rem !important}.pe-2{padding-right:.5rem !important}.pe-3{padding-right:1rem 
!important}.pe-4{padding-right:1.5rem !important}.pe-5{padding-right:3rem !important}.pb-0{padding-bottom:0 !important}.pb-1{padding-bottom:.25rem !important}.pb-2{padding-bottom:.5rem !important}.pb-3{padding-bottom:1rem !important}.pb-4{padding-bottom:1.5rem !important}.pb-5{padding-bottom:3rem !important}.ps-0{padding-left:0 !important}.ps-1{padding-left:.25rem !important}.ps-2{padding-left:.5rem !important}.ps-3{padding-left:1rem !important}.ps-4{padding-left:1.5rem !important}.ps-5{padding-left:3rem !important}.gap-0{gap:0 !important}.gap-1{gap:.25rem !important}.gap-2{gap:.5rem !important}.gap-3{gap:1rem !important}.gap-4{gap:1.5rem !important}.gap-5{gap:3rem !important}.row-gap-0{row-gap:0 !important}.row-gap-1{row-gap:.25rem !important}.row-gap-2{row-gap:.5rem !important}.row-gap-3{row-gap:1rem !important}.row-gap-4{row-gap:1.5rem !important}.row-gap-5{row-gap:3rem !important}.column-gap-0{column-gap:0 !important}.column-gap-1{column-gap:.25rem !important}.column-gap-2{column-gap:.5rem !important}.column-gap-3{column-gap:1rem !important}.column-gap-4{column-gap:1.5rem !important}.column-gap-5{column-gap:3rem !important}.font-monospace{font-family:var(--bs-font-monospace) !important}.fs-1{font-size:calc(1.325rem + 0.9vw) !important}.fs-2{font-size:calc(1.29rem + 0.48vw) !important}.fs-3{font-size:calc(1.27rem + 0.24vw) !important}.fs-4{font-size:1.25rem !important}.fs-5{font-size:1.1rem !important}.fs-6{font-size:1rem !important}.fst-italic{font-style:italic !important}.fst-normal{font-style:normal !important}.fw-lighter{font-weight:lighter !important}.fw-light{font-weight:300 !important}.fw-normal{font-weight:400 !important}.fw-medium{font-weight:500 !important}.fw-semibold{font-weight:600 !important}.fw-bold{font-weight:700 !important}.fw-bolder{font-weight:bolder !important}.lh-1{line-height:1 !important}.lh-sm{line-height:1.25 !important}.lh-base{line-height:1.5 !important}.lh-lg{line-height:2 !important}.text-start{text-align:left 
!important}.text-end{text-align:right !important}.text-center{text-align:center !important}.text-decoration-none{text-decoration:none !important}.text-decoration-underline{text-decoration:underline !important}.text-decoration-line-through{text-decoration:line-through !important}.text-lowercase{text-transform:lowercase !important}.text-uppercase{text-transform:uppercase !important}.text-capitalize{text-transform:capitalize !important}.text-wrap{white-space:normal !important}.text-nowrap{white-space:nowrap !important}.text-break{word-wrap:break-word !important;word-break:break-word !important}.text-default{--bs-text-opacity: 1;color:rgba(var(--bs-default-rgb), var(--bs-text-opacity)) !important}.text-primary{--bs-text-opacity: 1;color:rgba(var(--bs-primary-rgb), var(--bs-text-opacity)) !important}.text-secondary{--bs-text-opacity: 1;color:rgba(var(--bs-secondary-rgb), var(--bs-text-opacity)) !important}.text-success{--bs-text-opacity: 1;color:rgba(var(--bs-success-rgb), var(--bs-text-opacity)) !important}.text-info{--bs-text-opacity: 1;color:rgba(var(--bs-info-rgb), var(--bs-text-opacity)) !important}.text-warning{--bs-text-opacity: 1;color:rgba(var(--bs-warning-rgb), var(--bs-text-opacity)) !important}.text-danger{--bs-text-opacity: 1;color:rgba(var(--bs-danger-rgb), var(--bs-text-opacity)) !important}.text-light{--bs-text-opacity: 1;color:rgba(var(--bs-light-rgb), var(--bs-text-opacity)) !important}.text-dark{--bs-text-opacity: 1;color:rgba(var(--bs-dark-rgb), var(--bs-text-opacity)) !important}.text-black{--bs-text-opacity: 1;color:rgba(var(--bs-black-rgb), var(--bs-text-opacity)) !important}.text-white{--bs-text-opacity: 1;color:rgba(var(--bs-white-rgb), var(--bs-text-opacity)) !important}.text-body{--bs-text-opacity: 1;color:rgba(var(--bs-body-color-rgb), var(--bs-text-opacity)) !important}.text-muted{--bs-text-opacity: 1;color:var(--bs-secondary-color) !important}.text-black-50{--bs-text-opacity: 1;color:rgba(0,0,0,.5) 
!important}.text-white-50{--bs-text-opacity: 1;color:rgba(255,255,255,.5) !important}.text-body-secondary{--bs-text-opacity: 1;color:var(--bs-secondary-color) !important}.text-body-tertiary{--bs-text-opacity: 1;color:var(--bs-tertiary-color) !important}.text-body-emphasis{--bs-text-opacity: 1;color:var(--bs-emphasis-color) !important}.text-reset{--bs-text-opacity: 1;color:inherit !important}.text-opacity-25{--bs-text-opacity: 0.25}.text-opacity-50{--bs-text-opacity: 0.5}.text-opacity-75{--bs-text-opacity: 0.75}.text-opacity-100{--bs-text-opacity: 1}.text-primary-emphasis{color:var(--bs-primary-text-emphasis) !important}.text-secondary-emphasis{color:var(--bs-secondary-text-emphasis) !important}.text-success-emphasis{color:var(--bs-success-text-emphasis) !important}.text-info-emphasis{color:var(--bs-info-text-emphasis) !important}.text-warning-emphasis{color:var(--bs-warning-text-emphasis) !important}.text-danger-emphasis{color:var(--bs-danger-text-emphasis) !important}.text-light-emphasis{color:var(--bs-light-text-emphasis) !important}.text-dark-emphasis{color:var(--bs-dark-text-emphasis) !important}.link-opacity-10{--bs-link-opacity: 0.1}.link-opacity-10-hover:hover{--bs-link-opacity: 0.1}.link-opacity-25{--bs-link-opacity: 0.25}.link-opacity-25-hover:hover{--bs-link-opacity: 0.25}.link-opacity-50{--bs-link-opacity: 0.5}.link-opacity-50-hover:hover{--bs-link-opacity: 0.5}.link-opacity-75{--bs-link-opacity: 0.75}.link-opacity-75-hover:hover{--bs-link-opacity: 0.75}.link-opacity-100{--bs-link-opacity: 1}.link-opacity-100-hover:hover{--bs-link-opacity: 1}.link-offset-1{text-underline-offset:.125em !important}.link-offset-1-hover:hover{text-underline-offset:.125em !important}.link-offset-2{text-underline-offset:.25em !important}.link-offset-2-hover:hover{text-underline-offset:.25em !important}.link-offset-3{text-underline-offset:.375em !important}.link-offset-3-hover:hover{text-underline-offset:.375em !important}.link-underline-default{--bs-link-underline-opacity: 
1;text-decoration-color:rgba(var(--bs-default-rgb), var(--bs-link-underline-opacity)) !important}.link-underline-primary{--bs-link-underline-opacity: 1;text-decoration-color:rgba(var(--bs-primary-rgb), var(--bs-link-underline-opacity)) !important}.link-underline-secondary{--bs-link-underline-opacity: 1;text-decoration-color:rgba(var(--bs-secondary-rgb), var(--bs-link-underline-opacity)) !important}.link-underline-success{--bs-link-underline-opacity: 1;text-decoration-color:rgba(var(--bs-success-rgb), var(--bs-link-underline-opacity)) !important}.link-underline-info{--bs-link-underline-opacity: 1;text-decoration-color:rgba(var(--bs-info-rgb), var(--bs-link-underline-opacity)) !important}.link-underline-warning{--bs-link-underline-opacity: 1;text-decoration-color:rgba(var(--bs-warning-rgb), var(--bs-link-underline-opacity)) !important}.link-underline-danger{--bs-link-underline-opacity: 1;text-decoration-color:rgba(var(--bs-danger-rgb), var(--bs-link-underline-opacity)) !important}.link-underline-light{--bs-link-underline-opacity: 1;text-decoration-color:rgba(var(--bs-light-rgb), var(--bs-link-underline-opacity)) !important}.link-underline-dark{--bs-link-underline-opacity: 1;text-decoration-color:rgba(var(--bs-dark-rgb), var(--bs-link-underline-opacity)) !important}.link-underline{--bs-link-underline-opacity: 1;text-decoration-color:rgba(var(--bs-link-color-rgb), var(--bs-link-underline-opacity, 1)) !important}.link-underline-opacity-0{--bs-link-underline-opacity: 0}.link-underline-opacity-0-hover:hover{--bs-link-underline-opacity: 0}.link-underline-opacity-10{--bs-link-underline-opacity: 0.1}.link-underline-opacity-10-hover:hover{--bs-link-underline-opacity: 0.1}.link-underline-opacity-25{--bs-link-underline-opacity: 0.25}.link-underline-opacity-25-hover:hover{--bs-link-underline-opacity: 0.25}.link-underline-opacity-50{--bs-link-underline-opacity: 0.5}.link-underline-opacity-50-hover:hover{--bs-link-underline-opacity: 
0.5}.link-underline-opacity-75{--bs-link-underline-opacity: 0.75}.link-underline-opacity-75-hover:hover{--bs-link-underline-opacity: 0.75}.link-underline-opacity-100{--bs-link-underline-opacity: 1}.link-underline-opacity-100-hover:hover{--bs-link-underline-opacity: 1}.bg-default{--bs-bg-opacity: 1;background-color:rgba(var(--bs-default-rgb), var(--bs-bg-opacity)) !important}.bg-primary{--bs-bg-opacity: 1;background-color:rgba(var(--bs-primary-rgb), var(--bs-bg-opacity)) !important}.bg-secondary{--bs-bg-opacity: 1;background-color:rgba(var(--bs-secondary-rgb), var(--bs-bg-opacity)) !important}.bg-success{--bs-bg-opacity: 1;background-color:rgba(var(--bs-success-rgb), var(--bs-bg-opacity)) !important}.bg-info{--bs-bg-opacity: 1;background-color:rgba(var(--bs-info-rgb), var(--bs-bg-opacity)) !important}.bg-warning{--bs-bg-opacity: 1;background-color:rgba(var(--bs-warning-rgb), var(--bs-bg-opacity)) !important}.bg-danger{--bs-bg-opacity: 1;background-color:rgba(var(--bs-danger-rgb), var(--bs-bg-opacity)) !important}.bg-light{--bs-bg-opacity: 1;background-color:rgba(var(--bs-light-rgb), var(--bs-bg-opacity)) !important}.bg-dark{--bs-bg-opacity: 1;background-color:rgba(var(--bs-dark-rgb), var(--bs-bg-opacity)) !important}.bg-black{--bs-bg-opacity: 1;background-color:rgba(var(--bs-black-rgb), var(--bs-bg-opacity)) !important}.bg-white{--bs-bg-opacity: 1;background-color:rgba(var(--bs-white-rgb), var(--bs-bg-opacity)) !important}.bg-body{--bs-bg-opacity: 1;background-color:rgba(var(--bs-body-bg-rgb), var(--bs-bg-opacity)) !important}.bg-transparent{--bs-bg-opacity: 1;background-color:rgba(0,0,0,0) !important}.bg-body-secondary{--bs-bg-opacity: 1;background-color:rgba(var(--bs-secondary-bg-rgb), var(--bs-bg-opacity)) !important}.bg-body-tertiary{--bs-bg-opacity: 1;background-color:rgba(var(--bs-tertiary-bg-rgb), var(--bs-bg-opacity)) !important}.bg-opacity-10{--bs-bg-opacity: 0.1}.bg-opacity-25{--bs-bg-opacity: 0.25}.bg-opacity-50{--bs-bg-opacity: 
0.5}.bg-opacity-75{--bs-bg-opacity: 0.75}.bg-opacity-100{--bs-bg-opacity: 1}.bg-primary-subtle{background-color:var(--bs-primary-bg-subtle) !important}.bg-secondary-subtle{background-color:var(--bs-secondary-bg-subtle) !important}.bg-success-subtle{background-color:var(--bs-success-bg-subtle) !important}.bg-info-subtle{background-color:var(--bs-info-bg-subtle) !important}.bg-warning-subtle{background-color:var(--bs-warning-bg-subtle) !important}.bg-danger-subtle{background-color:var(--bs-danger-bg-subtle) !important}.bg-light-subtle{background-color:var(--bs-light-bg-subtle) !important}.bg-dark-subtle{background-color:var(--bs-dark-bg-subtle) !important}.bg-gradient{background-image:var(--bs-gradient) !important}.user-select-all{user-select:all !important}.user-select-auto{user-select:auto !important}.user-select-none{user-select:none !important}.pe-none{pointer-events:none !important}.pe-auto{pointer-events:auto !important}.rounded{border-radius:var(--bs-border-radius) !important}.rounded-0{border-radius:0 !important}.rounded-1{border-radius:var(--bs-border-radius-sm) !important}.rounded-2{border-radius:var(--bs-border-radius) !important}.rounded-3{border-radius:var(--bs-border-radius-lg) !important}.rounded-4{border-radius:var(--bs-border-radius-xl) !important}.rounded-5{border-radius:var(--bs-border-radius-xxl) !important}.rounded-circle{border-radius:50% !important}.rounded-pill{border-radius:var(--bs-border-radius-pill) !important}.rounded-top{border-top-left-radius:var(--bs-border-radius) !important;border-top-right-radius:var(--bs-border-radius) !important}.rounded-top-0{border-top-left-radius:0 !important;border-top-right-radius:0 !important}.rounded-top-1{border-top-left-radius:var(--bs-border-radius-sm) !important;border-top-right-radius:var(--bs-border-radius-sm) !important}.rounded-top-2{border-top-left-radius:var(--bs-border-radius) !important;border-top-right-radius:var(--bs-border-radius) 
!important}.rounded-top-3{border-top-left-radius:var(--bs-border-radius-lg) !important;border-top-right-radius:var(--bs-border-radius-lg) !important}.rounded-top-4{border-top-left-radius:var(--bs-border-radius-xl) !important;border-top-right-radius:var(--bs-border-radius-xl) !important}.rounded-top-5{border-top-left-radius:var(--bs-border-radius-xxl) !important;border-top-right-radius:var(--bs-border-radius-xxl) !important}.rounded-top-circle{border-top-left-radius:50% !important;border-top-right-radius:50% !important}.rounded-top-pill{border-top-left-radius:var(--bs-border-radius-pill) !important;border-top-right-radius:var(--bs-border-radius-pill) !important}.rounded-end{border-top-right-radius:var(--bs-border-radius) !important;border-bottom-right-radius:var(--bs-border-radius) !important}.rounded-end-0{border-top-right-radius:0 !important;border-bottom-right-radius:0 !important}.rounded-end-1{border-top-right-radius:var(--bs-border-radius-sm) !important;border-bottom-right-radius:var(--bs-border-radius-sm) !important}.rounded-end-2{border-top-right-radius:var(--bs-border-radius) !important;border-bottom-right-radius:var(--bs-border-radius) !important}.rounded-end-3{border-top-right-radius:var(--bs-border-radius-lg) !important;border-bottom-right-radius:var(--bs-border-radius-lg) !important}.rounded-end-4{border-top-right-radius:var(--bs-border-radius-xl) !important;border-bottom-right-radius:var(--bs-border-radius-xl) !important}.rounded-end-5{border-top-right-radius:var(--bs-border-radius-xxl) !important;border-bottom-right-radius:var(--bs-border-radius-xxl) !important}.rounded-end-circle{border-top-right-radius:50% !important;border-bottom-right-radius:50% !important}.rounded-end-pill{border-top-right-radius:var(--bs-border-radius-pill) !important;border-bottom-right-radius:var(--bs-border-radius-pill) !important}.rounded-bottom{border-bottom-right-radius:var(--bs-border-radius) !important;border-bottom-left-radius:var(--bs-border-radius) 
!important}.rounded-bottom-0{border-bottom-right-radius:0 !important;border-bottom-left-radius:0 !important}.rounded-bottom-1{border-bottom-right-radius:var(--bs-border-radius-sm) !important;border-bottom-left-radius:var(--bs-border-radius-sm) !important}.rounded-bottom-2{border-bottom-right-radius:var(--bs-border-radius) !important;border-bottom-left-radius:var(--bs-border-radius) !important}.rounded-bottom-3{border-bottom-right-radius:var(--bs-border-radius-lg) !important;border-bottom-left-radius:var(--bs-border-radius-lg) !important}.rounded-bottom-4{border-bottom-right-radius:var(--bs-border-radius-xl) !important;border-bottom-left-radius:var(--bs-border-radius-xl) !important}.rounded-bottom-5{border-bottom-right-radius:var(--bs-border-radius-xxl) !important;border-bottom-left-radius:var(--bs-border-radius-xxl) !important}.rounded-bottom-circle{border-bottom-right-radius:50% !important;border-bottom-left-radius:50% !important}.rounded-bottom-pill{border-bottom-right-radius:var(--bs-border-radius-pill) !important;border-bottom-left-radius:var(--bs-border-radius-pill) !important}.rounded-start{border-bottom-left-radius:var(--bs-border-radius) !important;border-top-left-radius:var(--bs-border-radius) !important}.rounded-start-0{border-bottom-left-radius:0 !important;border-top-left-radius:0 !important}.rounded-start-1{border-bottom-left-radius:var(--bs-border-radius-sm) !important;border-top-left-radius:var(--bs-border-radius-sm) !important}.rounded-start-2{border-bottom-left-radius:var(--bs-border-radius) !important;border-top-left-radius:var(--bs-border-radius) !important}.rounded-start-3{border-bottom-left-radius:var(--bs-border-radius-lg) !important;border-top-left-radius:var(--bs-border-radius-lg) !important}.rounded-start-4{border-bottom-left-radius:var(--bs-border-radius-xl) !important;border-top-left-radius:var(--bs-border-radius-xl) !important}.rounded-start-5{border-bottom-left-radius:var(--bs-border-radius-xxl) 
!important;border-top-left-radius:var(--bs-border-radius-xxl) !important}.rounded-start-circle{border-bottom-left-radius:50% !important;border-top-left-radius:50% !important}.rounded-start-pill{border-bottom-left-radius:var(--bs-border-radius-pill) !important;border-top-left-radius:var(--bs-border-radius-pill) !important}.visible{visibility:visible !important}.invisible{visibility:hidden !important}.z-n1{z-index:-1 !important}.z-0{z-index:0 !important}.z-1{z-index:1 !important}.z-2{z-index:2 !important}.z-3{z-index:3 !important}@media(min-width: 576px){.float-sm-start{float:left !important}.float-sm-end{float:right !important}.float-sm-none{float:none !important}.object-fit-sm-contain{object-fit:contain !important}.object-fit-sm-cover{object-fit:cover !important}.object-fit-sm-fill{object-fit:fill !important}.object-fit-sm-scale{object-fit:scale-down !important}.object-fit-sm-none{object-fit:none !important}.d-sm-inline{display:inline !important}.d-sm-inline-block{display:inline-block !important}.d-sm-block{display:block !important}.d-sm-grid{display:grid !important}.d-sm-inline-grid{display:inline-grid !important}.d-sm-table{display:table !important}.d-sm-table-row{display:table-row !important}.d-sm-table-cell{display:table-cell !important}.d-sm-flex{display:flex !important}.d-sm-inline-flex{display:inline-flex !important}.d-sm-none{display:none !important}.flex-sm-fill{flex:1 1 auto !important}.flex-sm-row{flex-direction:row !important}.flex-sm-column{flex-direction:column !important}.flex-sm-row-reverse{flex-direction:row-reverse !important}.flex-sm-column-reverse{flex-direction:column-reverse !important}.flex-sm-grow-0{flex-grow:0 !important}.flex-sm-grow-1{flex-grow:1 !important}.flex-sm-shrink-0{flex-shrink:0 !important}.flex-sm-shrink-1{flex-shrink:1 !important}.flex-sm-wrap{flex-wrap:wrap !important}.flex-sm-nowrap{flex-wrap:nowrap !important}.flex-sm-wrap-reverse{flex-wrap:wrap-reverse !important}.justify-content-sm-start{justify-content:flex-start 
!important}.justify-content-sm-end{justify-content:flex-end !important}.justify-content-sm-center{justify-content:center !important}.justify-content-sm-between{justify-content:space-between !important}.justify-content-sm-around{justify-content:space-around !important}.justify-content-sm-evenly{justify-content:space-evenly !important}.align-items-sm-start{align-items:flex-start !important}.align-items-sm-end{align-items:flex-end !important}.align-items-sm-center{align-items:center !important}.align-items-sm-baseline{align-items:baseline !important}.align-items-sm-stretch{align-items:stretch !important}.align-content-sm-start{align-content:flex-start !important}.align-content-sm-end{align-content:flex-end !important}.align-content-sm-center{align-content:center !important}.align-content-sm-between{align-content:space-between !important}.align-content-sm-around{align-content:space-around !important}.align-content-sm-stretch{align-content:stretch !important}.align-self-sm-auto{align-self:auto !important}.align-self-sm-start{align-self:flex-start !important}.align-self-sm-end{align-self:flex-end !important}.align-self-sm-center{align-self:center !important}.align-self-sm-baseline{align-self:baseline !important}.align-self-sm-stretch{align-self:stretch !important}.order-sm-first{order:-1 !important}.order-sm-0{order:0 !important}.order-sm-1{order:1 !important}.order-sm-2{order:2 !important}.order-sm-3{order:3 !important}.order-sm-4{order:4 !important}.order-sm-5{order:5 !important}.order-sm-last{order:6 !important}.m-sm-0{margin:0 !important}.m-sm-1{margin:.25rem !important}.m-sm-2{margin:.5rem !important}.m-sm-3{margin:1rem !important}.m-sm-4{margin:1.5rem !important}.m-sm-5{margin:3rem !important}.m-sm-auto{margin:auto !important}.mx-sm-0{margin-right:0 !important;margin-left:0 !important}.mx-sm-1{margin-right:.25rem !important;margin-left:.25rem !important}.mx-sm-2{margin-right:.5rem !important;margin-left:.5rem !important}.mx-sm-3{margin-right:1rem 
!important;margin-left:1rem !important}.mx-sm-4{margin-right:1.5rem !important;margin-left:1.5rem !important}.mx-sm-5{margin-right:3rem !important;margin-left:3rem !important}.mx-sm-auto{margin-right:auto !important;margin-left:auto !important}.my-sm-0{margin-top:0 !important;margin-bottom:0 !important}.my-sm-1{margin-top:.25rem !important;margin-bottom:.25rem !important}.my-sm-2{margin-top:.5rem !important;margin-bottom:.5rem !important}.my-sm-3{margin-top:1rem !important;margin-bottom:1rem !important}.my-sm-4{margin-top:1.5rem !important;margin-bottom:1.5rem !important}.my-sm-5{margin-top:3rem !important;margin-bottom:3rem !important}.my-sm-auto{margin-top:auto !important;margin-bottom:auto !important}.mt-sm-0{margin-top:0 !important}.mt-sm-1{margin-top:.25rem !important}.mt-sm-2{margin-top:.5rem !important}.mt-sm-3{margin-top:1rem !important}.mt-sm-4{margin-top:1.5rem !important}.mt-sm-5{margin-top:3rem !important}.mt-sm-auto{margin-top:auto !important}.me-sm-0{margin-right:0 !important}.me-sm-1{margin-right:.25rem !important}.me-sm-2{margin-right:.5rem !important}.me-sm-3{margin-right:1rem !important}.me-sm-4{margin-right:1.5rem !important}.me-sm-5{margin-right:3rem !important}.me-sm-auto{margin-right:auto !important}.mb-sm-0{margin-bottom:0 !important}.mb-sm-1{margin-bottom:.25rem !important}.mb-sm-2{margin-bottom:.5rem !important}.mb-sm-3{margin-bottom:1rem !important}.mb-sm-4{margin-bottom:1.5rem !important}.mb-sm-5{margin-bottom:3rem !important}.mb-sm-auto{margin-bottom:auto !important}.ms-sm-0{margin-left:0 !important}.ms-sm-1{margin-left:.25rem !important}.ms-sm-2{margin-left:.5rem !important}.ms-sm-3{margin-left:1rem !important}.ms-sm-4{margin-left:1.5rem !important}.ms-sm-5{margin-left:3rem !important}.ms-sm-auto{margin-left:auto !important}.p-sm-0{padding:0 !important}.p-sm-1{padding:.25rem !important}.p-sm-2{padding:.5rem !important}.p-sm-3{padding:1rem !important}.p-sm-4{padding:1.5rem !important}.p-sm-5{padding:3rem 
!important}.px-sm-0{padding-right:0 !important;padding-left:0 !important}.px-sm-1{padding-right:.25rem !important;padding-left:.25rem !important}.px-sm-2{padding-right:.5rem !important;padding-left:.5rem !important}.px-sm-3{padding-right:1rem !important;padding-left:1rem !important}.px-sm-4{padding-right:1.5rem !important;padding-left:1.5rem !important}.px-sm-5{padding-right:3rem !important;padding-left:3rem !important}.py-sm-0{padding-top:0 !important;padding-bottom:0 !important}.py-sm-1{padding-top:.25rem !important;padding-bottom:.25rem !important}.py-sm-2{padding-top:.5rem !important;padding-bottom:.5rem !important}.py-sm-3{padding-top:1rem !important;padding-bottom:1rem !important}.py-sm-4{padding-top:1.5rem !important;padding-bottom:1.5rem !important}.py-sm-5{padding-top:3rem !important;padding-bottom:3rem !important}.pt-sm-0{padding-top:0 !important}.pt-sm-1{padding-top:.25rem !important}.pt-sm-2{padding-top:.5rem !important}.pt-sm-3{padding-top:1rem !important}.pt-sm-4{padding-top:1.5rem !important}.pt-sm-5{padding-top:3rem !important}.pe-sm-0{padding-right:0 !important}.pe-sm-1{padding-right:.25rem !important}.pe-sm-2{padding-right:.5rem !important}.pe-sm-3{padding-right:1rem !important}.pe-sm-4{padding-right:1.5rem !important}.pe-sm-5{padding-right:3rem !important}.pb-sm-0{padding-bottom:0 !important}.pb-sm-1{padding-bottom:.25rem !important}.pb-sm-2{padding-bottom:.5rem !important}.pb-sm-3{padding-bottom:1rem !important}.pb-sm-4{padding-bottom:1.5rem !important}.pb-sm-5{padding-bottom:3rem !important}.ps-sm-0{padding-left:0 !important}.ps-sm-1{padding-left:.25rem !important}.ps-sm-2{padding-left:.5rem !important}.ps-sm-3{padding-left:1rem !important}.ps-sm-4{padding-left:1.5rem !important}.ps-sm-5{padding-left:3rem !important}.gap-sm-0{gap:0 !important}.gap-sm-1{gap:.25rem !important}.gap-sm-2{gap:.5rem !important}.gap-sm-3{gap:1rem !important}.gap-sm-4{gap:1.5rem !important}.gap-sm-5{gap:3rem !important}.row-gap-sm-0{row-gap:0 
!important}.row-gap-sm-1{row-gap:.25rem !important}.row-gap-sm-2{row-gap:.5rem !important}.row-gap-sm-3{row-gap:1rem !important}.row-gap-sm-4{row-gap:1.5rem !important}.row-gap-sm-5{row-gap:3rem !important}.column-gap-sm-0{column-gap:0 !important}.column-gap-sm-1{column-gap:.25rem !important}.column-gap-sm-2{column-gap:.5rem !important}.column-gap-sm-3{column-gap:1rem !important}.column-gap-sm-4{column-gap:1.5rem !important}.column-gap-sm-5{column-gap:3rem !important}.text-sm-start{text-align:left !important}.text-sm-end{text-align:right !important}.text-sm-center{text-align:center !important}}@media(min-width: 768px){.float-md-start{float:left !important}.float-md-end{float:right !important}.float-md-none{float:none !important}.object-fit-md-contain{object-fit:contain !important}.object-fit-md-cover{object-fit:cover !important}.object-fit-md-fill{object-fit:fill !important}.object-fit-md-scale{object-fit:scale-down !important}.object-fit-md-none{object-fit:none !important}.d-md-inline{display:inline !important}.d-md-inline-block{display:inline-block !important}.d-md-block{display:block !important}.d-md-grid{display:grid !important}.d-md-inline-grid{display:inline-grid !important}.d-md-table{display:table !important}.d-md-table-row{display:table-row !important}.d-md-table-cell{display:table-cell !important}.d-md-flex{display:flex !important}.d-md-inline-flex{display:inline-flex !important}.d-md-none{display:none !important}.flex-md-fill{flex:1 1 auto !important}.flex-md-row{flex-direction:row !important}.flex-md-column{flex-direction:column !important}.flex-md-row-reverse{flex-direction:row-reverse !important}.flex-md-column-reverse{flex-direction:column-reverse !important}.flex-md-grow-0{flex-grow:0 !important}.flex-md-grow-1{flex-grow:1 !important}.flex-md-shrink-0{flex-shrink:0 !important}.flex-md-shrink-1{flex-shrink:1 !important}.flex-md-wrap{flex-wrap:wrap !important}.flex-md-nowrap{flex-wrap:nowrap !important}.flex-md-wrap-reverse{flex-wrap:wrap-reverse 
!important}.justify-content-md-start{justify-content:flex-start !important}.justify-content-md-end{justify-content:flex-end !important}.justify-content-md-center{justify-content:center !important}.justify-content-md-between{justify-content:space-between !important}.justify-content-md-around{justify-content:space-around !important}.justify-content-md-evenly{justify-content:space-evenly !important}.align-items-md-start{align-items:flex-start !important}.align-items-md-end{align-items:flex-end !important}.align-items-md-center{align-items:center !important}.align-items-md-baseline{align-items:baseline !important}.align-items-md-stretch{align-items:stretch !important}.align-content-md-start{align-content:flex-start !important}.align-content-md-end{align-content:flex-end !important}.align-content-md-center{align-content:center !important}.align-content-md-between{align-content:space-between !important}.align-content-md-around{align-content:space-around !important}.align-content-md-stretch{align-content:stretch !important}.align-self-md-auto{align-self:auto !important}.align-self-md-start{align-self:flex-start !important}.align-self-md-end{align-self:flex-end !important}.align-self-md-center{align-self:center !important}.align-self-md-baseline{align-self:baseline !important}.align-self-md-stretch{align-self:stretch !important}.order-md-first{order:-1 !important}.order-md-0{order:0 !important}.order-md-1{order:1 !important}.order-md-2{order:2 !important}.order-md-3{order:3 !important}.order-md-4{order:4 !important}.order-md-5{order:5 !important}.order-md-last{order:6 !important}.m-md-0{margin:0 !important}.m-md-1{margin:.25rem !important}.m-md-2{margin:.5rem !important}.m-md-3{margin:1rem !important}.m-md-4{margin:1.5rem !important}.m-md-5{margin:3rem !important}.m-md-auto{margin:auto !important}.mx-md-0{margin-right:0 !important;margin-left:0 !important}.mx-md-1{margin-right:.25rem !important;margin-left:.25rem !important}.mx-md-2{margin-right:.5rem 
!important;margin-left:.5rem !important}.mx-md-3{margin-right:1rem !important;margin-left:1rem !important}.mx-md-4{margin-right:1.5rem !important;margin-left:1.5rem !important}.mx-md-5{margin-right:3rem !important;margin-left:3rem !important}.mx-md-auto{margin-right:auto !important;margin-left:auto !important}.my-md-0{margin-top:0 !important;margin-bottom:0 !important}.my-md-1{margin-top:.25rem !important;margin-bottom:.25rem !important}.my-md-2{margin-top:.5rem !important;margin-bottom:.5rem !important}.my-md-3{margin-top:1rem !important;margin-bottom:1rem !important}.my-md-4{margin-top:1.5rem !important;margin-bottom:1.5rem !important}.my-md-5{margin-top:3rem !important;margin-bottom:3rem !important}.my-md-auto{margin-top:auto !important;margin-bottom:auto !important}.mt-md-0{margin-top:0 !important}.mt-md-1{margin-top:.25rem !important}.mt-md-2{margin-top:.5rem !important}.mt-md-3{margin-top:1rem !important}.mt-md-4{margin-top:1.5rem !important}.mt-md-5{margin-top:3rem !important}.mt-md-auto{margin-top:auto !important}.me-md-0{margin-right:0 !important}.me-md-1{margin-right:.25rem !important}.me-md-2{margin-right:.5rem !important}.me-md-3{margin-right:1rem !important}.me-md-4{margin-right:1.5rem !important}.me-md-5{margin-right:3rem !important}.me-md-auto{margin-right:auto !important}.mb-md-0{margin-bottom:0 !important}.mb-md-1{margin-bottom:.25rem !important}.mb-md-2{margin-bottom:.5rem !important}.mb-md-3{margin-bottom:1rem !important}.mb-md-4{margin-bottom:1.5rem !important}.mb-md-5{margin-bottom:3rem !important}.mb-md-auto{margin-bottom:auto !important}.ms-md-0{margin-left:0 !important}.ms-md-1{margin-left:.25rem !important}.ms-md-2{margin-left:.5rem !important}.ms-md-3{margin-left:1rem !important}.ms-md-4{margin-left:1.5rem !important}.ms-md-5{margin-left:3rem !important}.ms-md-auto{margin-left:auto !important}.p-md-0{padding:0 !important}.p-md-1{padding:.25rem !important}.p-md-2{padding:.5rem !important}.p-md-3{padding:1rem 
!important}.p-md-4{padding:1.5rem !important}.p-md-5{padding:3rem !important}.px-md-0{padding-right:0 !important;padding-left:0 !important}.px-md-1{padding-right:.25rem !important;padding-left:.25rem !important}.px-md-2{padding-right:.5rem !important;padding-left:.5rem !important}.px-md-3{padding-right:1rem !important;padding-left:1rem !important}.px-md-4{padding-right:1.5rem !important;padding-left:1.5rem !important}.px-md-5{padding-right:3rem !important;padding-left:3rem !important}.py-md-0{padding-top:0 !important;padding-bottom:0 !important}.py-md-1{padding-top:.25rem !important;padding-bottom:.25rem !important}.py-md-2{padding-top:.5rem !important;padding-bottom:.5rem !important}.py-md-3{padding-top:1rem !important;padding-bottom:1rem !important}.py-md-4{padding-top:1.5rem !important;padding-bottom:1.5rem !important}.py-md-5{padding-top:3rem !important;padding-bottom:3rem !important}.pt-md-0{padding-top:0 !important}.pt-md-1{padding-top:.25rem !important}.pt-md-2{padding-top:.5rem !important}.pt-md-3{padding-top:1rem !important}.pt-md-4{padding-top:1.5rem !important}.pt-md-5{padding-top:3rem !important}.pe-md-0{padding-right:0 !important}.pe-md-1{padding-right:.25rem !important}.pe-md-2{padding-right:.5rem !important}.pe-md-3{padding-right:1rem !important}.pe-md-4{padding-right:1.5rem !important}.pe-md-5{padding-right:3rem !important}.pb-md-0{padding-bottom:0 !important}.pb-md-1{padding-bottom:.25rem !important}.pb-md-2{padding-bottom:.5rem !important}.pb-md-3{padding-bottom:1rem !important}.pb-md-4{padding-bottom:1.5rem !important}.pb-md-5{padding-bottom:3rem !important}.ps-md-0{padding-left:0 !important}.ps-md-1{padding-left:.25rem !important}.ps-md-2{padding-left:.5rem !important}.ps-md-3{padding-left:1rem !important}.ps-md-4{padding-left:1.5rem !important}.ps-md-5{padding-left:3rem !important}.gap-md-0{gap:0 !important}.gap-md-1{gap:.25rem !important}.gap-md-2{gap:.5rem !important}.gap-md-3{gap:1rem !important}.gap-md-4{gap:1.5rem 
!important}.gap-md-5{gap:3rem !important}.row-gap-md-0{row-gap:0 !important}.row-gap-md-1{row-gap:.25rem !important}.row-gap-md-2{row-gap:.5rem !important}.row-gap-md-3{row-gap:1rem !important}.row-gap-md-4{row-gap:1.5rem !important}.row-gap-md-5{row-gap:3rem !important}.column-gap-md-0{column-gap:0 !important}.column-gap-md-1{column-gap:.25rem !important}.column-gap-md-2{column-gap:.5rem !important}.column-gap-md-3{column-gap:1rem !important}.column-gap-md-4{column-gap:1.5rem !important}.column-gap-md-5{column-gap:3rem !important}.text-md-start{text-align:left !important}.text-md-end{text-align:right !important}.text-md-center{text-align:center !important}}@media(min-width: 992px){.float-lg-start{float:left !important}.float-lg-end{float:right !important}.float-lg-none{float:none !important}.object-fit-lg-contain{object-fit:contain !important}.object-fit-lg-cover{object-fit:cover !important}.object-fit-lg-fill{object-fit:fill !important}.object-fit-lg-scale{object-fit:scale-down !important}.object-fit-lg-none{object-fit:none !important}.d-lg-inline{display:inline !important}.d-lg-inline-block{display:inline-block !important}.d-lg-block{display:block !important}.d-lg-grid{display:grid !important}.d-lg-inline-grid{display:inline-grid !important}.d-lg-table{display:table !important}.d-lg-table-row{display:table-row !important}.d-lg-table-cell{display:table-cell !important}.d-lg-flex{display:flex !important}.d-lg-inline-flex{display:inline-flex !important}.d-lg-none{display:none !important}.flex-lg-fill{flex:1 1 auto !important}.flex-lg-row{flex-direction:row !important}.flex-lg-column{flex-direction:column !important}.flex-lg-row-reverse{flex-direction:row-reverse !important}.flex-lg-column-reverse{flex-direction:column-reverse !important}.flex-lg-grow-0{flex-grow:0 !important}.flex-lg-grow-1{flex-grow:1 !important}.flex-lg-shrink-0{flex-shrink:0 !important}.flex-lg-shrink-1{flex-shrink:1 !important}.flex-lg-wrap{flex-wrap:wrap 
!important}.flex-lg-nowrap{flex-wrap:nowrap !important}.flex-lg-wrap-reverse{flex-wrap:wrap-reverse !important}.justify-content-lg-start{justify-content:flex-start !important}.justify-content-lg-end{justify-content:flex-end !important}.justify-content-lg-center{justify-content:center !important}.justify-content-lg-between{justify-content:space-between !important}.justify-content-lg-around{justify-content:space-around !important}.justify-content-lg-evenly{justify-content:space-evenly !important}.align-items-lg-start{align-items:flex-start !important}.align-items-lg-end{align-items:flex-end !important}.align-items-lg-center{align-items:center !important}.align-items-lg-baseline{align-items:baseline !important}.align-items-lg-stretch{align-items:stretch !important}.align-content-lg-start{align-content:flex-start !important}.align-content-lg-end{align-content:flex-end !important}.align-content-lg-center{align-content:center !important}.align-content-lg-between{align-content:space-between !important}.align-content-lg-around{align-content:space-around !important}.align-content-lg-stretch{align-content:stretch !important}.align-self-lg-auto{align-self:auto !important}.align-self-lg-start{align-self:flex-start !important}.align-self-lg-end{align-self:flex-end !important}.align-self-lg-center{align-self:center !important}.align-self-lg-baseline{align-self:baseline !important}.align-self-lg-stretch{align-self:stretch !important}.order-lg-first{order:-1 !important}.order-lg-0{order:0 !important}.order-lg-1{order:1 !important}.order-lg-2{order:2 !important}.order-lg-3{order:3 !important}.order-lg-4{order:4 !important}.order-lg-5{order:5 !important}.order-lg-last{order:6 !important}.m-lg-0{margin:0 !important}.m-lg-1{margin:.25rem !important}.m-lg-2{margin:.5rem !important}.m-lg-3{margin:1rem !important}.m-lg-4{margin:1.5rem !important}.m-lg-5{margin:3rem !important}.m-lg-auto{margin:auto !important}.mx-lg-0{margin-right:0 !important;margin-left:0 
!important}.mx-lg-1{margin-right:.25rem !important;margin-left:.25rem !important}.mx-lg-2{margin-right:.5rem !important;margin-left:.5rem !important}.mx-lg-3{margin-right:1rem !important;margin-left:1rem !important}.mx-lg-4{margin-right:1.5rem !important;margin-left:1.5rem !important}.mx-lg-5{margin-right:3rem !important;margin-left:3rem !important}.mx-lg-auto{margin-right:auto !important;margin-left:auto !important}.my-lg-0{margin-top:0 !important;margin-bottom:0 !important}.my-lg-1{margin-top:.25rem !important;margin-bottom:.25rem !important}.my-lg-2{margin-top:.5rem !important;margin-bottom:.5rem !important}.my-lg-3{margin-top:1rem !important;margin-bottom:1rem !important}.my-lg-4{margin-top:1.5rem !important;margin-bottom:1.5rem !important}.my-lg-5{margin-top:3rem !important;margin-bottom:3rem !important}.my-lg-auto{margin-top:auto !important;margin-bottom:auto !important}.mt-lg-0{margin-top:0 !important}.mt-lg-1{margin-top:.25rem !important}.mt-lg-2{margin-top:.5rem !important}.mt-lg-3{margin-top:1rem !important}.mt-lg-4{margin-top:1.5rem !important}.mt-lg-5{margin-top:3rem !important}.mt-lg-auto{margin-top:auto !important}.me-lg-0{margin-right:0 !important}.me-lg-1{margin-right:.25rem !important}.me-lg-2{margin-right:.5rem !important}.me-lg-3{margin-right:1rem !important}.me-lg-4{margin-right:1.5rem !important}.me-lg-5{margin-right:3rem !important}.me-lg-auto{margin-right:auto !important}.mb-lg-0{margin-bottom:0 !important}.mb-lg-1{margin-bottom:.25rem !important}.mb-lg-2{margin-bottom:.5rem !important}.mb-lg-3{margin-bottom:1rem !important}.mb-lg-4{margin-bottom:1.5rem !important}.mb-lg-5{margin-bottom:3rem !important}.mb-lg-auto{margin-bottom:auto !important}.ms-lg-0{margin-left:0 !important}.ms-lg-1{margin-left:.25rem !important}.ms-lg-2{margin-left:.5rem !important}.ms-lg-3{margin-left:1rem !important}.ms-lg-4{margin-left:1.5rem !important}.ms-lg-5{margin-left:3rem !important}.ms-lg-auto{margin-left:auto !important}.p-lg-0{padding:0 
!important}.p-lg-1{padding:.25rem !important}.p-lg-2{padding:.5rem !important}.p-lg-3{padding:1rem !important}.p-lg-4{padding:1.5rem !important}.p-lg-5{padding:3rem !important}.px-lg-0{padding-right:0 !important;padding-left:0 !important}.px-lg-1{padding-right:.25rem !important;padding-left:.25rem !important}.px-lg-2{padding-right:.5rem !important;padding-left:.5rem !important}.px-lg-3{padding-right:1rem !important;padding-left:1rem !important}.px-lg-4{padding-right:1.5rem !important;padding-left:1.5rem !important}.px-lg-5{padding-right:3rem !important;padding-left:3rem !important}.py-lg-0{padding-top:0 !important;padding-bottom:0 !important}.py-lg-1{padding-top:.25rem !important;padding-bottom:.25rem !important}.py-lg-2{padding-top:.5rem !important;padding-bottom:.5rem !important}.py-lg-3{padding-top:1rem !important;padding-bottom:1rem !important}.py-lg-4{padding-top:1.5rem !important;padding-bottom:1.5rem !important}.py-lg-5{padding-top:3rem !important;padding-bottom:3rem !important}.pt-lg-0{padding-top:0 !important}.pt-lg-1{padding-top:.25rem !important}.pt-lg-2{padding-top:.5rem !important}.pt-lg-3{padding-top:1rem !important}.pt-lg-4{padding-top:1.5rem !important}.pt-lg-5{padding-top:3rem !important}.pe-lg-0{padding-right:0 !important}.pe-lg-1{padding-right:.25rem !important}.pe-lg-2{padding-right:.5rem !important}.pe-lg-3{padding-right:1rem !important}.pe-lg-4{padding-right:1.5rem !important}.pe-lg-5{padding-right:3rem !important}.pb-lg-0{padding-bottom:0 !important}.pb-lg-1{padding-bottom:.25rem !important}.pb-lg-2{padding-bottom:.5rem !important}.pb-lg-3{padding-bottom:1rem !important}.pb-lg-4{padding-bottom:1.5rem !important}.pb-lg-5{padding-bottom:3rem !important}.ps-lg-0{padding-left:0 !important}.ps-lg-1{padding-left:.25rem !important}.ps-lg-2{padding-left:.5rem !important}.ps-lg-3{padding-left:1rem !important}.ps-lg-4{padding-left:1.5rem !important}.ps-lg-5{padding-left:3rem !important}.gap-lg-0{gap:0 !important}.gap-lg-1{gap:.25rem 
!important}.gap-lg-2{gap:.5rem !important}.gap-lg-3{gap:1rem !important}.gap-lg-4{gap:1.5rem !important}.gap-lg-5{gap:3rem !important}.row-gap-lg-0{row-gap:0 !important}.row-gap-lg-1{row-gap:.25rem !important}.row-gap-lg-2{row-gap:.5rem !important}.row-gap-lg-3{row-gap:1rem !important}.row-gap-lg-4{row-gap:1.5rem !important}.row-gap-lg-5{row-gap:3rem !important}.column-gap-lg-0{column-gap:0 !important}.column-gap-lg-1{column-gap:.25rem !important}.column-gap-lg-2{column-gap:.5rem !important}.column-gap-lg-3{column-gap:1rem !important}.column-gap-lg-4{column-gap:1.5rem !important}.column-gap-lg-5{column-gap:3rem !important}.text-lg-start{text-align:left !important}.text-lg-end{text-align:right !important}.text-lg-center{text-align:center !important}}@media(min-width: 1200px){.float-xl-start{float:left !important}.float-xl-end{float:right !important}.float-xl-none{float:none !important}.object-fit-xl-contain{object-fit:contain !important}.object-fit-xl-cover{object-fit:cover !important}.object-fit-xl-fill{object-fit:fill !important}.object-fit-xl-scale{object-fit:scale-down !important}.object-fit-xl-none{object-fit:none !important}.d-xl-inline{display:inline !important}.d-xl-inline-block{display:inline-block !important}.d-xl-block{display:block !important}.d-xl-grid{display:grid !important}.d-xl-inline-grid{display:inline-grid !important}.d-xl-table{display:table !important}.d-xl-table-row{display:table-row !important}.d-xl-table-cell{display:table-cell !important}.d-xl-flex{display:flex !important}.d-xl-inline-flex{display:inline-flex !important}.d-xl-none{display:none !important}.flex-xl-fill{flex:1 1 auto !important}.flex-xl-row{flex-direction:row !important}.flex-xl-column{flex-direction:column !important}.flex-xl-row-reverse{flex-direction:row-reverse !important}.flex-xl-column-reverse{flex-direction:column-reverse !important}.flex-xl-grow-0{flex-grow:0 !important}.flex-xl-grow-1{flex-grow:1 !important}.flex-xl-shrink-0{flex-shrink:0 
!important}.flex-xl-shrink-1{flex-shrink:1 !important}.flex-xl-wrap{flex-wrap:wrap !important}.flex-xl-nowrap{flex-wrap:nowrap !important}.flex-xl-wrap-reverse{flex-wrap:wrap-reverse !important}.justify-content-xl-start{justify-content:flex-start !important}.justify-content-xl-end{justify-content:flex-end !important}.justify-content-xl-center{justify-content:center !important}.justify-content-xl-between{justify-content:space-between !important}.justify-content-xl-around{justify-content:space-around !important}.justify-content-xl-evenly{justify-content:space-evenly !important}.align-items-xl-start{align-items:flex-start !important}.align-items-xl-end{align-items:flex-end !important}.align-items-xl-center{align-items:center !important}.align-items-xl-baseline{align-items:baseline !important}.align-items-xl-stretch{align-items:stretch !important}.align-content-xl-start{align-content:flex-start !important}.align-content-xl-end{align-content:flex-end !important}.align-content-xl-center{align-content:center !important}.align-content-xl-between{align-content:space-between !important}.align-content-xl-around{align-content:space-around !important}.align-content-xl-stretch{align-content:stretch !important}.align-self-xl-auto{align-self:auto !important}.align-self-xl-start{align-self:flex-start !important}.align-self-xl-end{align-self:flex-end !important}.align-self-xl-center{align-self:center !important}.align-self-xl-baseline{align-self:baseline !important}.align-self-xl-stretch{align-self:stretch !important}.order-xl-first{order:-1 !important}.order-xl-0{order:0 !important}.order-xl-1{order:1 !important}.order-xl-2{order:2 !important}.order-xl-3{order:3 !important}.order-xl-4{order:4 !important}.order-xl-5{order:5 !important}.order-xl-last{order:6 !important}.m-xl-0{margin:0 !important}.m-xl-1{margin:.25rem !important}.m-xl-2{margin:.5rem !important}.m-xl-3{margin:1rem !important}.m-xl-4{margin:1.5rem !important}.m-xl-5{margin:3rem !important}.m-xl-auto{margin:auto 
!important}.mx-xl-0{margin-right:0 !important;margin-left:0 !important}.mx-xl-1{margin-right:.25rem !important;margin-left:.25rem !important}.mx-xl-2{margin-right:.5rem !important;margin-left:.5rem !important}.mx-xl-3{margin-right:1rem !important;margin-left:1rem !important}.mx-xl-4{margin-right:1.5rem !important;margin-left:1.5rem !important}.mx-xl-5{margin-right:3rem !important;margin-left:3rem !important}.mx-xl-auto{margin-right:auto !important;margin-left:auto !important}.my-xl-0{margin-top:0 !important;margin-bottom:0 !important}.my-xl-1{margin-top:.25rem !important;margin-bottom:.25rem !important}.my-xl-2{margin-top:.5rem !important;margin-bottom:.5rem !important}.my-xl-3{margin-top:1rem !important;margin-bottom:1rem !important}.my-xl-4{margin-top:1.5rem !important;margin-bottom:1.5rem !important}.my-xl-5{margin-top:3rem !important;margin-bottom:3rem !important}.my-xl-auto{margin-top:auto !important;margin-bottom:auto !important}.mt-xl-0{margin-top:0 !important}.mt-xl-1{margin-top:.25rem !important}.mt-xl-2{margin-top:.5rem !important}.mt-xl-3{margin-top:1rem !important}.mt-xl-4{margin-top:1.5rem !important}.mt-xl-5{margin-top:3rem !important}.mt-xl-auto{margin-top:auto !important}.me-xl-0{margin-right:0 !important}.me-xl-1{margin-right:.25rem !important}.me-xl-2{margin-right:.5rem !important}.me-xl-3{margin-right:1rem !important}.me-xl-4{margin-right:1.5rem !important}.me-xl-5{margin-right:3rem !important}.me-xl-auto{margin-right:auto !important}.mb-xl-0{margin-bottom:0 !important}.mb-xl-1{margin-bottom:.25rem !important}.mb-xl-2{margin-bottom:.5rem !important}.mb-xl-3{margin-bottom:1rem !important}.mb-xl-4{margin-bottom:1.5rem !important}.mb-xl-5{margin-bottom:3rem !important}.mb-xl-auto{margin-bottom:auto !important}.ms-xl-0{margin-left:0 !important}.ms-xl-1{margin-left:.25rem !important}.ms-xl-2{margin-left:.5rem !important}.ms-xl-3{margin-left:1rem !important}.ms-xl-4{margin-left:1.5rem !important}.ms-xl-5{margin-left:3rem 
!important}.ms-xl-auto{margin-left:auto !important}.p-xl-0{padding:0 !important}.p-xl-1{padding:.25rem !important}.p-xl-2{padding:.5rem !important}.p-xl-3{padding:1rem !important}.p-xl-4{padding:1.5rem !important}.p-xl-5{padding:3rem !important}.px-xl-0{padding-right:0 !important;padding-left:0 !important}.px-xl-1{padding-right:.25rem !important;padding-left:.25rem !important}.px-xl-2{padding-right:.5rem !important;padding-left:.5rem !important}.px-xl-3{padding-right:1rem !important;padding-left:1rem !important}.px-xl-4{padding-right:1.5rem !important;padding-left:1.5rem !important}.px-xl-5{padding-right:3rem !important;padding-left:3rem !important}.py-xl-0{padding-top:0 !important;padding-bottom:0 !important}.py-xl-1{padding-top:.25rem !important;padding-bottom:.25rem !important}.py-xl-2{padding-top:.5rem !important;padding-bottom:.5rem !important}.py-xl-3{padding-top:1rem !important;padding-bottom:1rem !important}.py-xl-4{padding-top:1.5rem !important;padding-bottom:1.5rem !important}.py-xl-5{padding-top:3rem !important;padding-bottom:3rem !important}.pt-xl-0{padding-top:0 !important}.pt-xl-1{padding-top:.25rem !important}.pt-xl-2{padding-top:.5rem !important}.pt-xl-3{padding-top:1rem !important}.pt-xl-4{padding-top:1.5rem !important}.pt-xl-5{padding-top:3rem !important}.pe-xl-0{padding-right:0 !important}.pe-xl-1{padding-right:.25rem !important}.pe-xl-2{padding-right:.5rem !important}.pe-xl-3{padding-right:1rem !important}.pe-xl-4{padding-right:1.5rem !important}.pe-xl-5{padding-right:3rem !important}.pb-xl-0{padding-bottom:0 !important}.pb-xl-1{padding-bottom:.25rem !important}.pb-xl-2{padding-bottom:.5rem !important}.pb-xl-3{padding-bottom:1rem !important}.pb-xl-4{padding-bottom:1.5rem !important}.pb-xl-5{padding-bottom:3rem !important}.ps-xl-0{padding-left:0 !important}.ps-xl-1{padding-left:.25rem !important}.ps-xl-2{padding-left:.5rem !important}.ps-xl-3{padding-left:1rem !important}.ps-xl-4{padding-left:1.5rem !important}.ps-xl-5{padding-left:3rem 
!important}.gap-xl-0{gap:0 !important}.gap-xl-1{gap:.25rem !important}.gap-xl-2{gap:.5rem !important}.gap-xl-3{gap:1rem !important}.gap-xl-4{gap:1.5rem !important}.gap-xl-5{gap:3rem !important}.row-gap-xl-0{row-gap:0 !important}.row-gap-xl-1{row-gap:.25rem !important}.row-gap-xl-2{row-gap:.5rem !important}.row-gap-xl-3{row-gap:1rem !important}.row-gap-xl-4{row-gap:1.5rem !important}.row-gap-xl-5{row-gap:3rem !important}.column-gap-xl-0{column-gap:0 !important}.column-gap-xl-1{column-gap:.25rem !important}.column-gap-xl-2{column-gap:.5rem !important}.column-gap-xl-3{column-gap:1rem !important}.column-gap-xl-4{column-gap:1.5rem !important}.column-gap-xl-5{column-gap:3rem !important}.text-xl-start{text-align:left !important}.text-xl-end{text-align:right !important}.text-xl-center{text-align:center !important}}@media(min-width: 1400px){.float-xxl-start{float:left !important}.float-xxl-end{float:right !important}.float-xxl-none{float:none !important}.object-fit-xxl-contain{object-fit:contain !important}.object-fit-xxl-cover{object-fit:cover !important}.object-fit-xxl-fill{object-fit:fill !important}.object-fit-xxl-scale{object-fit:scale-down !important}.object-fit-xxl-none{object-fit:none !important}.d-xxl-inline{display:inline !important}.d-xxl-inline-block{display:inline-block !important}.d-xxl-block{display:block !important}.d-xxl-grid{display:grid !important}.d-xxl-inline-grid{display:inline-grid !important}.d-xxl-table{display:table !important}.d-xxl-table-row{display:table-row !important}.d-xxl-table-cell{display:table-cell !important}.d-xxl-flex{display:flex !important}.d-xxl-inline-flex{display:inline-flex !important}.d-xxl-none{display:none !important}.flex-xxl-fill{flex:1 1 auto !important}.flex-xxl-row{flex-direction:row !important}.flex-xxl-column{flex-direction:column !important}.flex-xxl-row-reverse{flex-direction:row-reverse !important}.flex-xxl-column-reverse{flex-direction:column-reverse !important}.flex-xxl-grow-0{flex-grow:0 
!important}.flex-xxl-grow-1{flex-grow:1 !important}.flex-xxl-shrink-0{flex-shrink:0 !important}.flex-xxl-shrink-1{flex-shrink:1 !important}.flex-xxl-wrap{flex-wrap:wrap !important}.flex-xxl-nowrap{flex-wrap:nowrap !important}.flex-xxl-wrap-reverse{flex-wrap:wrap-reverse !important}.justify-content-xxl-start{justify-content:flex-start !important}.justify-content-xxl-end{justify-content:flex-end !important}.justify-content-xxl-center{justify-content:center !important}.justify-content-xxl-between{justify-content:space-between !important}.justify-content-xxl-around{justify-content:space-around !important}.justify-content-xxl-evenly{justify-content:space-evenly !important}.align-items-xxl-start{align-items:flex-start !important}.align-items-xxl-end{align-items:flex-end !important}.align-items-xxl-center{align-items:center !important}.align-items-xxl-baseline{align-items:baseline !important}.align-items-xxl-stretch{align-items:stretch !important}.align-content-xxl-start{align-content:flex-start !important}.align-content-xxl-end{align-content:flex-end !important}.align-content-xxl-center{align-content:center !important}.align-content-xxl-between{align-content:space-between !important}.align-content-xxl-around{align-content:space-around !important}.align-content-xxl-stretch{align-content:stretch !important}.align-self-xxl-auto{align-self:auto !important}.align-self-xxl-start{align-self:flex-start !important}.align-self-xxl-end{align-self:flex-end !important}.align-self-xxl-center{align-self:center !important}.align-self-xxl-baseline{align-self:baseline !important}.align-self-xxl-stretch{align-self:stretch !important}.order-xxl-first{order:-1 !important}.order-xxl-0{order:0 !important}.order-xxl-1{order:1 !important}.order-xxl-2{order:2 !important}.order-xxl-3{order:3 !important}.order-xxl-4{order:4 !important}.order-xxl-5{order:5 !important}.order-xxl-last{order:6 !important}.m-xxl-0{margin:0 !important}.m-xxl-1{margin:.25rem !important}.m-xxl-2{margin:.5rem 
!important}.m-xxl-3{margin:1rem !important}.m-xxl-4{margin:1.5rem !important}.m-xxl-5{margin:3rem !important}.m-xxl-auto{margin:auto !important}.mx-xxl-0{margin-right:0 !important;margin-left:0 !important}.mx-xxl-1{margin-right:.25rem !important;margin-left:.25rem !important}.mx-xxl-2{margin-right:.5rem !important;margin-left:.5rem !important}.mx-xxl-3{margin-right:1rem !important;margin-left:1rem !important}.mx-xxl-4{margin-right:1.5rem !important;margin-left:1.5rem !important}.mx-xxl-5{margin-right:3rem !important;margin-left:3rem !important}.mx-xxl-auto{margin-right:auto !important;margin-left:auto !important}.my-xxl-0{margin-top:0 !important;margin-bottom:0 !important}.my-xxl-1{margin-top:.25rem !important;margin-bottom:.25rem !important}.my-xxl-2{margin-top:.5rem !important;margin-bottom:.5rem !important}.my-xxl-3{margin-top:1rem !important;margin-bottom:1rem !important}.my-xxl-4{margin-top:1.5rem !important;margin-bottom:1.5rem !important}.my-xxl-5{margin-top:3rem !important;margin-bottom:3rem !important}.my-xxl-auto{margin-top:auto !important;margin-bottom:auto !important}.mt-xxl-0{margin-top:0 !important}.mt-xxl-1{margin-top:.25rem !important}.mt-xxl-2{margin-top:.5rem !important}.mt-xxl-3{margin-top:1rem !important}.mt-xxl-4{margin-top:1.5rem !important}.mt-xxl-5{margin-top:3rem !important}.mt-xxl-auto{margin-top:auto !important}.me-xxl-0{margin-right:0 !important}.me-xxl-1{margin-right:.25rem !important}.me-xxl-2{margin-right:.5rem !important}.me-xxl-3{margin-right:1rem !important}.me-xxl-4{margin-right:1.5rem !important}.me-xxl-5{margin-right:3rem !important}.me-xxl-auto{margin-right:auto !important}.mb-xxl-0{margin-bottom:0 !important}.mb-xxl-1{margin-bottom:.25rem !important}.mb-xxl-2{margin-bottom:.5rem !important}.mb-xxl-3{margin-bottom:1rem !important}.mb-xxl-4{margin-bottom:1.5rem !important}.mb-xxl-5{margin-bottom:3rem !important}.mb-xxl-auto{margin-bottom:auto !important}.ms-xxl-0{margin-left:0 !important}.ms-xxl-1{margin-left:.25rem 
!important}.ms-xxl-2{margin-left:.5rem !important}.ms-xxl-3{margin-left:1rem !important}.ms-xxl-4{margin-left:1.5rem !important}.ms-xxl-5{margin-left:3rem !important}.ms-xxl-auto{margin-left:auto !important}.p-xxl-0{padding:0 !important}.p-xxl-1{padding:.25rem !important}.p-xxl-2{padding:.5rem !important}.p-xxl-3{padding:1rem !important}.p-xxl-4{padding:1.5rem !important}.p-xxl-5{padding:3rem !important}.px-xxl-0{padding-right:0 !important;padding-left:0 !important}.px-xxl-1{padding-right:.25rem !important;padding-left:.25rem !important}.px-xxl-2{padding-right:.5rem !important;padding-left:.5rem !important}.px-xxl-3{padding-right:1rem !important;padding-left:1rem !important}.px-xxl-4{padding-right:1.5rem !important;padding-left:1.5rem !important}.px-xxl-5{padding-right:3rem !important;padding-left:3rem !important}.py-xxl-0{padding-top:0 !important;padding-bottom:0 !important}.py-xxl-1{padding-top:.25rem !important;padding-bottom:.25rem !important}.py-xxl-2{padding-top:.5rem !important;padding-bottom:.5rem !important}.py-xxl-3{padding-top:1rem !important;padding-bottom:1rem !important}.py-xxl-4{padding-top:1.5rem !important;padding-bottom:1.5rem !important}.py-xxl-5{padding-top:3rem !important;padding-bottom:3rem !important}.pt-xxl-0{padding-top:0 !important}.pt-xxl-1{padding-top:.25rem !important}.pt-xxl-2{padding-top:.5rem !important}.pt-xxl-3{padding-top:1rem !important}.pt-xxl-4{padding-top:1.5rem !important}.pt-xxl-5{padding-top:3rem !important}.pe-xxl-0{padding-right:0 !important}.pe-xxl-1{padding-right:.25rem !important}.pe-xxl-2{padding-right:.5rem !important}.pe-xxl-3{padding-right:1rem !important}.pe-xxl-4{padding-right:1.5rem !important}.pe-xxl-5{padding-right:3rem !important}.pb-xxl-0{padding-bottom:0 !important}.pb-xxl-1{padding-bottom:.25rem !important}.pb-xxl-2{padding-bottom:.5rem !important}.pb-xxl-3{padding-bottom:1rem !important}.pb-xxl-4{padding-bottom:1.5rem !important}.pb-xxl-5{padding-bottom:3rem !important}.ps-xxl-0{padding-left:0 
!important}.ps-xxl-1{padding-left:.25rem !important}.ps-xxl-2{padding-left:.5rem !important}.ps-xxl-3{padding-left:1rem !important}.ps-xxl-4{padding-left:1.5rem !important}.ps-xxl-5{padding-left:3rem !important}.gap-xxl-0{gap:0 !important}.gap-xxl-1{gap:.25rem !important}.gap-xxl-2{gap:.5rem !important}.gap-xxl-3{gap:1rem !important}.gap-xxl-4{gap:1.5rem !important}.gap-xxl-5{gap:3rem !important}.row-gap-xxl-0{row-gap:0 !important}.row-gap-xxl-1{row-gap:.25rem !important}.row-gap-xxl-2{row-gap:.5rem !important}.row-gap-xxl-3{row-gap:1rem !important}.row-gap-xxl-4{row-gap:1.5rem !important}.row-gap-xxl-5{row-gap:3rem !important}.column-gap-xxl-0{column-gap:0 !important}.column-gap-xxl-1{column-gap:.25rem !important}.column-gap-xxl-2{column-gap:.5rem !important}.column-gap-xxl-3{column-gap:1rem !important}.column-gap-xxl-4{column-gap:1.5rem !important}.column-gap-xxl-5{column-gap:3rem !important}.text-xxl-start{text-align:left !important}.text-xxl-end{text-align:right !important}.text-xxl-center{text-align:center !important}}.bg-default{color:#fff}.bg-primary{color:#fff}.bg-secondary{color:#fff}.bg-success{color:#fff}.bg-info{color:#fff}.bg-warning{color:#fff}.bg-danger{color:#fff}.bg-light{color:#000}.bg-dark{color:#fff}@media(min-width: 1200px){.fs-1{font-size:2rem !important}.fs-2{font-size:1.65rem !important}.fs-3{font-size:1.45rem !important}}@media print{.d-print-inline{display:inline !important}.d-print-inline-block{display:inline-block !important}.d-print-block{display:block !important}.d-print-grid{display:grid !important}.d-print-inline-grid{display:inline-grid !important}.d-print-table{display:table !important}.d-print-table-row{display:table-row !important}.d-print-table-cell{display:table-cell !important}.d-print-flex{display:flex !important}.d-print-inline-flex{display:inline-flex !important}.d-print-none{display:none !important}}:root{--bslib-spacer: 1rem;--bslib-mb-spacer: var(--bslib-spacer, 
1rem)}.bslib-mb-spacing{margin-bottom:var(--bslib-mb-spacer)}.bslib-gap-spacing{gap:var(--bslib-mb-spacer)}.bslib-gap-spacing>.bslib-mb-spacing,.bslib-gap-spacing>.form-group,.bslib-gap-spacing>p,.bslib-gap-spacing>pre{margin-bottom:0}.html-fill-container>.html-fill-item.bslib-mb-spacing{margin-bottom:0}.tab-content>.tab-pane.html-fill-container{display:none}.tab-content>.active.html-fill-container{display:flex}.tab-content.html-fill-container{padding:0}.bg-blue{--bslib-color-bg: #2780e3;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-blue{--bslib-color-fg: #2780e3;color:var(--bslib-color-fg)}.bg-indigo{--bslib-color-bg: #6610f2;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-indigo{--bslib-color-fg: #6610f2;color:var(--bslib-color-fg)}.bg-purple{--bslib-color-bg: #613d7c;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-purple{--bslib-color-fg: #613d7c;color:var(--bslib-color-fg)}.bg-pink{--bslib-color-bg: #e83e8c;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-pink{--bslib-color-fg: #e83e8c;color:var(--bslib-color-fg)}.bg-red{--bslib-color-bg: #ff0039;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-red{--bslib-color-fg: #ff0039;color:var(--bslib-color-fg)}.bg-orange{--bslib-color-bg: #f0ad4e;--bslib-color-fg: #000;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-orange{--bslib-color-fg: #f0ad4e;color:var(--bslib-color-fg)}.bg-yellow{--bslib-color-bg: #ff7518;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-yellow{--bslib-color-fg: #ff7518;color:var(--bslib-color-fg)}.bg-green{--bslib-color-bg: #3fb618;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-green{--bslib-color-fg: 
#3fb618;color:var(--bslib-color-fg)}.bg-teal{--bslib-color-bg: #20c997;--bslib-color-fg: #000;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-teal{--bslib-color-fg: #20c997;color:var(--bslib-color-fg)}.bg-cyan{--bslib-color-bg: #9954bb;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-cyan{--bslib-color-fg: #9954bb;color:var(--bslib-color-fg)}.text-default{--bslib-color-fg: #343a40}.bg-default{--bslib-color-bg: #343a40;--bslib-color-fg: #fff}.text-primary{--bslib-color-fg: #2780e3}.bg-primary{--bslib-color-bg: #2780e3;--bslib-color-fg: #fff}.text-secondary{--bslib-color-fg: #343a40}.bg-secondary{--bslib-color-bg: #343a40;--bslib-color-fg: #fff}.text-success{--bslib-color-fg: #3fb618}.bg-success{--bslib-color-bg: #3fb618;--bslib-color-fg: #fff}.text-info{--bslib-color-fg: #9954bb}.bg-info{--bslib-color-bg: #9954bb;--bslib-color-fg: #fff}.text-warning{--bslib-color-fg: #ff7518}.bg-warning{--bslib-color-bg: #ff7518;--bslib-color-fg: #fff}.text-danger{--bslib-color-fg: #ff0039}.bg-danger{--bslib-color-bg: #ff0039;--bslib-color-fg: #fff}.text-light{--bslib-color-fg: #f8f9fa}.bg-light{--bslib-color-bg: #f8f9fa;--bslib-color-fg: #000}.text-dark{--bslib-color-fg: #343a40}.bg-dark{--bslib-color-bg: #343a40;--bslib-color-fg: #fff}.bg-gradient-blue-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #4053e9;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #4053e9;color:#fff}.bg-gradient-blue-purple{--bslib-color-fg: #fff;--bslib-color-bg: #3e65ba;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #3e65ba;color:#fff}.bg-gradient-blue-pink{--bslib-color-fg: #fff;--bslib-color-bg: #7466c0;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) 
#7466c0;color:#fff}.bg-gradient-blue-red{--bslib-color-fg: #fff;--bslib-color-bg: #7d4d9f;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #7d4d9f;color:#fff}.bg-gradient-blue-orange{--bslib-color-fg: #fff;--bslib-color-bg: #7792a7;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #7792a7;color:#fff}.bg-gradient-blue-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #7d7c92;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #7d7c92;color:#fff}.bg-gradient-blue-green{--bslib-color-fg: #fff;--bslib-color-bg: #319692;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #319692;color:#fff}.bg-gradient-blue-teal{--bslib-color-fg: #fff;--bslib-color-bg: #249dc5;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #249dc5;color:#fff}.bg-gradient-blue-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #556ed3;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #556ed3;color:#fff}.bg-gradient-indigo-blue{--bslib-color-fg: #fff;--bslib-color-bg: #4d3dec;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #4d3dec;color:#fff}.bg-gradient-indigo-purple{--bslib-color-fg: #fff;--bslib-color-bg: #6422c3;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #6422c3;color:#fff}.bg-gradient-indigo-pink{--bslib-color-fg: #fff;--bslib-color-bg: #9a22c9;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 
var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #9a22c9;color:#fff}.bg-gradient-indigo-red{--bslib-color-fg: #fff;--bslib-color-bg: #a30aa8;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #a30aa8;color:#fff}.bg-gradient-indigo-orange{--bslib-color-fg: #fff;--bslib-color-bg: #9d4fb0;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #9d4fb0;color:#fff}.bg-gradient-indigo-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #a3389b;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #a3389b;color:#fff}.bg-gradient-indigo-green{--bslib-color-fg: #fff;--bslib-color-bg: #56529b;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #56529b;color:#fff}.bg-gradient-indigo-teal{--bslib-color-fg: #fff;--bslib-color-bg: #4a5ace;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #4a5ace;color:#fff}.bg-gradient-indigo-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #7a2bdc;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #7a2bdc;color:#fff}.bg-gradient-purple-blue{--bslib-color-fg: #fff;--bslib-color-bg: #4a58a5;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #4a58a5;color:#fff}.bg-gradient-purple-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #632bab;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #632bab;color:#fff}.bg-gradient-purple-pink{--bslib-color-fg: #fff;--bslib-color-bg: 
#973d82;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #973d82;color:#fff}.bg-gradient-purple-red{--bslib-color-fg: #fff;--bslib-color-bg: #a02561;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #a02561;color:#fff}.bg-gradient-purple-orange{--bslib-color-fg: #fff;--bslib-color-bg: #9a6a6a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #9a6a6a;color:#fff}.bg-gradient-purple-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #a05354;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #a05354;color:#fff}.bg-gradient-purple-green{--bslib-color-fg: #fff;--bslib-color-bg: #536d54;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #536d54;color:#fff}.bg-gradient-purple-teal{--bslib-color-fg: #fff;--bslib-color-bg: #477587;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #477587;color:#fff}.bg-gradient-purple-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #774695;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #774695;color:#fff}.bg-gradient-pink-blue{--bslib-color-fg: #fff;--bslib-color-bg: #9b58af;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #9b58af;color:#fff}.bg-gradient-pink-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #b42cb5;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) 
#b42cb5;color:#fff}.bg-gradient-pink-purple{--bslib-color-fg: #fff;--bslib-color-bg: #b23e86;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #b23e86;color:#fff}.bg-gradient-pink-red{--bslib-color-fg: #fff;--bslib-color-bg: #f1256b;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #f1256b;color:#fff}.bg-gradient-pink-orange{--bslib-color-fg: #fff;--bslib-color-bg: #eb6a73;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #eb6a73;color:#fff}.bg-gradient-pink-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #f1545e;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #f1545e;color:#fff}.bg-gradient-pink-green{--bslib-color-fg: #fff;--bslib-color-bg: #a46e5e;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #a46e5e;color:#fff}.bg-gradient-pink-teal{--bslib-color-fg: #fff;--bslib-color-bg: #987690;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #987690;color:#fff}.bg-gradient-pink-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #c8479f;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #c8479f;color:#fff}.bg-gradient-red-blue{--bslib-color-fg: #fff;--bslib-color-bg: #a9337d;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #a9337d;color:#fff}.bg-gradient-red-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #c20683;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 
var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #c20683;color:#fff}.bg-gradient-red-purple{--bslib-color-fg: #fff;--bslib-color-bg: #c01854;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #c01854;color:#fff}.bg-gradient-red-pink{--bslib-color-fg: #fff;--bslib-color-bg: #f6195a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #f6195a;color:#fff}.bg-gradient-red-orange{--bslib-color-fg: #fff;--bslib-color-bg: #f94541;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #f94541;color:#fff}.bg-gradient-red-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #ff2f2c;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #ff2f2c;color:#fff}.bg-gradient-red-green{--bslib-color-fg: #fff;--bslib-color-bg: #b2492c;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #b2492c;color:#fff}.bg-gradient-red-teal{--bslib-color-fg: #fff;--bslib-color-bg: #a6505f;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #a6505f;color:#fff}.bg-gradient-red-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #d6226d;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #d6226d;color:#fff}.bg-gradient-orange-blue{--bslib-color-fg: #fff;--bslib-color-bg: #a09b8a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #a09b8a;color:#fff}.bg-gradient-orange-indigo{--bslib-color-fg: #fff;--bslib-color-bg: 
#b96e90;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #b96e90;color:#fff}.bg-gradient-orange-purple{--bslib-color-fg: #fff;--bslib-color-bg: #b78060;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #b78060;color:#fff}.bg-gradient-orange-pink{--bslib-color-fg: #fff;--bslib-color-bg: #ed8167;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #ed8167;color:#fff}.bg-gradient-orange-red{--bslib-color-fg: #fff;--bslib-color-bg: #f66846;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #f66846;color:#fff}.bg-gradient-orange-yellow{--bslib-color-fg: #000;--bslib-color-bg: #f69738;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #f69738;color:#000}.bg-gradient-orange-green{--bslib-color-fg: #000;--bslib-color-bg: #a9b138;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #a9b138;color:#000}.bg-gradient-orange-teal{--bslib-color-fg: #000;--bslib-color-bg: #9db86b;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #9db86b;color:#000}.bg-gradient-orange-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #cd897a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #cd897a;color:#fff}.bg-gradient-yellow-blue{--bslib-color-fg: #fff;--bslib-color-bg: #a97969;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) 
#a97969;color:#fff}.bg-gradient-yellow-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #c24d6f;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #c24d6f;color:#fff}.bg-gradient-yellow-purple{--bslib-color-fg: #fff;--bslib-color-bg: #c05f40;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #c05f40;color:#fff}.bg-gradient-yellow-pink{--bslib-color-fg: #fff;--bslib-color-bg: #f65f46;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #f65f46;color:#fff}.bg-gradient-yellow-red{--bslib-color-fg: #fff;--bslib-color-bg: #ff4625;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #ff4625;color:#fff}.bg-gradient-yellow-orange{--bslib-color-fg: #000;--bslib-color-bg: #f98b2e;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #f98b2e;color:#000}.bg-gradient-yellow-green{--bslib-color-fg: #fff;--bslib-color-bg: #b28f18;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #b28f18;color:#fff}.bg-gradient-yellow-teal{--bslib-color-fg: #fff;--bslib-color-bg: #a6974b;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #a6974b;color:#fff}.bg-gradient-yellow-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #d66859;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #d66859;color:#fff}.bg-gradient-green-blue{--bslib-color-fg: #fff;--bslib-color-bg: #35a069;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 
var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #35a069;color:#fff}.bg-gradient-green-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #4f746f;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #4f746f;color:#fff}.bg-gradient-green-purple{--bslib-color-fg: #fff;--bslib-color-bg: #4d8640;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #4d8640;color:#fff}.bg-gradient-green-pink{--bslib-color-fg: #fff;--bslib-color-bg: #838646;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #838646;color:#fff}.bg-gradient-green-red{--bslib-color-fg: #fff;--bslib-color-bg: #8c6d25;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #8c6d25;color:#fff}.bg-gradient-green-orange{--bslib-color-fg: #000;--bslib-color-bg: #86b22e;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #86b22e;color:#000}.bg-gradient-green-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #8c9c18;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #8c9c18;color:#fff}.bg-gradient-green-teal{--bslib-color-fg: #000;--bslib-color-bg: #33be4b;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #33be4b;color:#000}.bg-gradient-green-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #638f59;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #638f59;color:#fff}.bg-gradient-teal-blue{--bslib-color-fg: #fff;--bslib-color-bg: 
#23acb5;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #23acb5;color:#fff}.bg-gradient-teal-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #3c7fbb;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #3c7fbb;color:#fff}.bg-gradient-teal-purple{--bslib-color-fg: #fff;--bslib-color-bg: #3a918c;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #3a918c;color:#fff}.bg-gradient-teal-pink{--bslib-color-fg: #fff;--bslib-color-bg: #709193;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #709193;color:#fff}.bg-gradient-teal-red{--bslib-color-fg: #fff;--bslib-color-bg: #797971;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #797971;color:#fff}.bg-gradient-teal-orange{--bslib-color-fg: #000;--bslib-color-bg: #73be7a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #73be7a;color:#000}.bg-gradient-teal-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #79a764;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #79a764;color:#fff}.bg-gradient-teal-green{--bslib-color-fg: #000;--bslib-color-bg: #2cc164;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #2cc164;color:#000}.bg-gradient-teal-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #509aa5;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) 
#509aa5;color:#fff}.bg-gradient-cyan-blue{--bslib-color-fg: #fff;--bslib-color-bg: #6b66cb;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #6b66cb;color:#fff}.bg-gradient-cyan-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #8539d1;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #8539d1;color:#fff}.bg-gradient-cyan-purple{--bslib-color-fg: #fff;--bslib-color-bg: #834ba2;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #834ba2;color:#fff}.bg-gradient-cyan-pink{--bslib-color-fg: #fff;--bslib-color-bg: #b94ba8;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #b94ba8;color:#fff}.bg-gradient-cyan-red{--bslib-color-fg: #fff;--bslib-color-bg: #c23287;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #c23287;color:#fff}.bg-gradient-cyan-orange{--bslib-color-fg: #fff;--bslib-color-bg: #bc788f;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #bc788f;color:#fff}.bg-gradient-cyan-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #c2617a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #c2617a;color:#fff}.bg-gradient-cyan-green{--bslib-color-fg: #fff;--bslib-color-bg: #757b7a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #757b7a;color:#fff}.bg-gradient-cyan-teal{--bslib-color-fg: #fff;--bslib-color-bg: #6983ad;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb 
var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #6983ad;color:#fff}.bg-blue{--bslib-color-bg: #2780e3;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-blue{--bslib-color-fg: #2780e3;color:var(--bslib-color-fg)}.bg-indigo{--bslib-color-bg: #6610f2;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-indigo{--bslib-color-fg: #6610f2;color:var(--bslib-color-fg)}.bg-purple{--bslib-color-bg: #613d7c;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-purple{--bslib-color-fg: #613d7c;color:var(--bslib-color-fg)}.bg-pink{--bslib-color-bg: #e83e8c;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-pink{--bslib-color-fg: #e83e8c;color:var(--bslib-color-fg)}.bg-red{--bslib-color-bg: #ff0039;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-red{--bslib-color-fg: #ff0039;color:var(--bslib-color-fg)}.bg-orange{--bslib-color-bg: #f0ad4e;--bslib-color-fg: #000;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-orange{--bslib-color-fg: #f0ad4e;color:var(--bslib-color-fg)}.bg-yellow{--bslib-color-bg: #ff7518;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-yellow{--bslib-color-fg: #ff7518;color:var(--bslib-color-fg)}.bg-green{--bslib-color-bg: #3fb618;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-green{--bslib-color-fg: #3fb618;color:var(--bslib-color-fg)}.bg-teal{--bslib-color-bg: #20c997;--bslib-color-fg: #000;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-teal{--bslib-color-fg: #20c997;color:var(--bslib-color-fg)}.bg-cyan{--bslib-color-bg: #9954bb;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-cyan{--bslib-color-fg: 
#9954bb;color:var(--bslib-color-fg)}.text-default{--bslib-color-fg: #343a40}.bg-default{--bslib-color-bg: #343a40;--bslib-color-fg: #fff}.text-primary{--bslib-color-fg: #2780e3}.bg-primary{--bslib-color-bg: #2780e3;--bslib-color-fg: #fff}.text-secondary{--bslib-color-fg: #343a40}.bg-secondary{--bslib-color-bg: #343a40;--bslib-color-fg: #fff}.text-success{--bslib-color-fg: #3fb618}.bg-success{--bslib-color-bg: #3fb618;--bslib-color-fg: #fff}.text-info{--bslib-color-fg: #9954bb}.bg-info{--bslib-color-bg: #9954bb;--bslib-color-fg: #fff}.text-warning{--bslib-color-fg: #ff7518}.bg-warning{--bslib-color-bg: #ff7518;--bslib-color-fg: #fff}.text-danger{--bslib-color-fg: #ff0039}.bg-danger{--bslib-color-bg: #ff0039;--bslib-color-fg: #fff}.text-light{--bslib-color-fg: #f8f9fa}.bg-light{--bslib-color-bg: #f8f9fa;--bslib-color-fg: #000}.text-dark{--bslib-color-fg: #343a40}.bg-dark{--bslib-color-bg: #343a40;--bslib-color-fg: #fff}.bg-gradient-blue-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #4053e9;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #4053e9;color:#fff}.bg-gradient-blue-purple{--bslib-color-fg: #fff;--bslib-color-bg: #3e65ba;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #3e65ba;color:#fff}.bg-gradient-blue-pink{--bslib-color-fg: #fff;--bslib-color-bg: #7466c0;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #7466c0;color:#fff}.bg-gradient-blue-red{--bslib-color-fg: #fff;--bslib-color-bg: #7d4d9f;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #7d4d9f;color:#fff}.bg-gradient-blue-orange{--bslib-color-fg: #fff;--bslib-color-bg: #7792a7;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 
var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #7792a7;color:#fff}.bg-gradient-blue-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #7d7c92;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #7d7c92;color:#fff}.bg-gradient-blue-green{--bslib-color-fg: #fff;--bslib-color-bg: #319692;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #319692;color:#fff}.bg-gradient-blue-teal{--bslib-color-fg: #fff;--bslib-color-bg: #249dc5;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #249dc5;color:#fff}.bg-gradient-blue-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #556ed3;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #556ed3;color:#fff}.bg-gradient-indigo-blue{--bslib-color-fg: #fff;--bslib-color-bg: #4d3dec;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #4d3dec;color:#fff}.bg-gradient-indigo-purple{--bslib-color-fg: #fff;--bslib-color-bg: #6422c3;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #6422c3;color:#fff}.bg-gradient-indigo-pink{--bslib-color-fg: #fff;--bslib-color-bg: #9a22c9;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #9a22c9;color:#fff}.bg-gradient-indigo-red{--bslib-color-fg: #fff;--bslib-color-bg: #a30aa8;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #a30aa8;color:#fff}.bg-gradient-indigo-orange{--bslib-color-fg: #fff;--bslib-color-bg: 
#9d4fb0;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #9d4fb0;color:#fff}.bg-gradient-indigo-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #a3389b;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #a3389b;color:#fff}.bg-gradient-indigo-green{--bslib-color-fg: #fff;--bslib-color-bg: #56529b;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #56529b;color:#fff}.bg-gradient-indigo-teal{--bslib-color-fg: #fff;--bslib-color-bg: #4a5ace;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #4a5ace;color:#fff}.bg-gradient-indigo-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #7a2bdc;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #7a2bdc;color:#fff}.bg-gradient-purple-blue{--bslib-color-fg: #fff;--bslib-color-bg: #4a58a5;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #4a58a5;color:#fff}.bg-gradient-purple-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #632bab;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #632bab;color:#fff}.bg-gradient-purple-pink{--bslib-color-fg: #fff;--bslib-color-bg: #973d82;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #973d82;color:#fff}.bg-gradient-purple-red{--bslib-color-fg: #fff;--bslib-color-bg: #a02561;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) 
#a02561;color:#fff}.bg-gradient-purple-orange{--bslib-color-fg: #fff;--bslib-color-bg: #9a6a6a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #9a6a6a;color:#fff}.bg-gradient-purple-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #a05354;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #a05354;color:#fff}.bg-gradient-purple-green{--bslib-color-fg: #fff;--bslib-color-bg: #536d54;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #536d54;color:#fff}.bg-gradient-purple-teal{--bslib-color-fg: #fff;--bslib-color-bg: #477587;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #477587;color:#fff}.bg-gradient-purple-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #774695;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #774695;color:#fff}.bg-gradient-pink-blue{--bslib-color-fg: #fff;--bslib-color-bg: #9b58af;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #9b58af;color:#fff}.bg-gradient-pink-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #b42cb5;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #b42cb5;color:#fff}.bg-gradient-pink-purple{--bslib-color-fg: #fff;--bslib-color-bg: #b23e86;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #b23e86;color:#fff}.bg-gradient-pink-red{--bslib-color-fg: #fff;--bslib-color-bg: #f1256b;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c 
var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #f1256b;color:#fff}.bg-gradient-pink-orange{--bslib-color-fg: #fff;--bslib-color-bg: #eb6a73;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #eb6a73;color:#fff}.bg-gradient-pink-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #f1545e;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #f1545e;color:#fff}.bg-gradient-pink-green{--bslib-color-fg: #fff;--bslib-color-bg: #a46e5e;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #a46e5e;color:#fff}.bg-gradient-pink-teal{--bslib-color-fg: #fff;--bslib-color-bg: #987690;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #987690;color:#fff}.bg-gradient-pink-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #c8479f;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #c8479f;color:#fff}.bg-gradient-red-blue{--bslib-color-fg: #fff;--bslib-color-bg: #a9337d;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #a9337d;color:#fff}.bg-gradient-red-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #c20683;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #c20683;color:#fff}.bg-gradient-red-purple{--bslib-color-fg: #fff;--bslib-color-bg: #c01854;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #c01854;color:#fff}.bg-gradient-red-pink{--bslib-color-fg: #fff;--bslib-color-bg: 
#f6195a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #f6195a;color:#fff}.bg-gradient-red-orange{--bslib-color-fg: #fff;--bslib-color-bg: #f94541;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #f94541;color:#fff}.bg-gradient-red-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #ff2f2c;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #ff2f2c;color:#fff}.bg-gradient-red-green{--bslib-color-fg: #fff;--bslib-color-bg: #b2492c;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #b2492c;color:#fff}.bg-gradient-red-teal{--bslib-color-fg: #fff;--bslib-color-bg: #a6505f;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #a6505f;color:#fff}.bg-gradient-red-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #d6226d;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #d6226d;color:#fff}.bg-gradient-orange-blue{--bslib-color-fg: #fff;--bslib-color-bg: #a09b8a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #a09b8a;color:#fff}.bg-gradient-orange-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #b96e90;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #b96e90;color:#fff}.bg-gradient-orange-purple{--bslib-color-fg: #fff;--bslib-color-bg: #b78060;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) 
#b78060;color:#fff}.bg-gradient-orange-pink{--bslib-color-fg: #fff;--bslib-color-bg: #ed8167;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #ed8167;color:#fff}.bg-gradient-orange-red{--bslib-color-fg: #fff;--bslib-color-bg: #f66846;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #f66846;color:#fff}.bg-gradient-orange-yellow{--bslib-color-fg: #000;--bslib-color-bg: #f69738;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #f69738;color:#000}.bg-gradient-orange-green{--bslib-color-fg: #000;--bslib-color-bg: #a9b138;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #a9b138;color:#000}.bg-gradient-orange-teal{--bslib-color-fg: #000;--bslib-color-bg: #9db86b;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #9db86b;color:#000}.bg-gradient-orange-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #cd897a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #cd897a;color:#fff}.bg-gradient-yellow-blue{--bslib-color-fg: #fff;--bslib-color-bg: #a97969;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #a97969;color:#fff}.bg-gradient-yellow-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #c24d6f;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #c24d6f;color:#fff}.bg-gradient-yellow-purple{--bslib-color-fg: #fff;--bslib-color-bg: #c05f40;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 
var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #c05f40;color:#fff}.bg-gradient-yellow-pink{--bslib-color-fg: #fff;--bslib-color-bg: #f65f46;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #f65f46;color:#fff}.bg-gradient-yellow-red{--bslib-color-fg: #fff;--bslib-color-bg: #ff4625;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #ff4625;color:#fff}.bg-gradient-yellow-orange{--bslib-color-fg: #000;--bslib-color-bg: #f98b2e;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #f98b2e;color:#000}.bg-gradient-yellow-green{--bslib-color-fg: #fff;--bslib-color-bg: #b28f18;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #b28f18;color:#fff}.bg-gradient-yellow-teal{--bslib-color-fg: #fff;--bslib-color-bg: #a6974b;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #a6974b;color:#fff}.bg-gradient-yellow-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #d66859;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #d66859;color:#fff}.bg-gradient-green-blue{--bslib-color-fg: #fff;--bslib-color-bg: #35a069;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #35a069;color:#fff}.bg-gradient-green-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #4f746f;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #4f746f;color:#fff}.bg-gradient-green-purple{--bslib-color-fg: #fff;--bslib-color-bg: 
#4d8640;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #4d8640;color:#fff}.bg-gradient-green-pink{--bslib-color-fg: #fff;--bslib-color-bg: #838646;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #838646;color:#fff}.bg-gradient-green-red{--bslib-color-fg: #fff;--bslib-color-bg: #8c6d25;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #8c6d25;color:#fff}.bg-gradient-green-orange{--bslib-color-fg: #000;--bslib-color-bg: #86b22e;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #86b22e;color:#000}.bg-gradient-green-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #8c9c18;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #8c9c18;color:#fff}.bg-gradient-green-teal{--bslib-color-fg: #000;--bslib-color-bg: #33be4b;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #33be4b;color:#000}.bg-gradient-green-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #638f59;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #638f59;color:#fff}.bg-gradient-teal-blue{--bslib-color-fg: #fff;--bslib-color-bg: #23acb5;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #23acb5;color:#fff}.bg-gradient-teal-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #3c7fbb;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) 
#3c7fbb;color:#fff}.bg-gradient-teal-purple{--bslib-color-fg: #fff;--bslib-color-bg: #3a918c;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #3a918c;color:#fff}.bg-gradient-teal-pink{--bslib-color-fg: #fff;--bslib-color-bg: #709193;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #709193;color:#fff}.bg-gradient-teal-red{--bslib-color-fg: #fff;--bslib-color-bg: #797971;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #797971;color:#fff}.bg-gradient-teal-orange{--bslib-color-fg: #000;--bslib-color-bg: #73be7a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #73be7a;color:#000}.bg-gradient-teal-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #79a764;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #79a764;color:#fff}.bg-gradient-teal-green{--bslib-color-fg: #000;--bslib-color-bg: #2cc164;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #2cc164;color:#000}.bg-gradient-teal-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #509aa5;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #509aa5;color:#fff}.bg-gradient-cyan-blue{--bslib-color-fg: #fff;--bslib-color-bg: #6b66cb;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #6b66cb;color:#fff}.bg-gradient-cyan-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #8539d1;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb 
var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #8539d1;color:#fff}.bg-gradient-cyan-purple{--bslib-color-fg: #fff;--bslib-color-bg: #834ba2;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #834ba2;color:#fff}.bg-gradient-cyan-pink{--bslib-color-fg: #fff;--bslib-color-bg: #b94ba8;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #b94ba8;color:#fff}.bg-gradient-cyan-red{--bslib-color-fg: #fff;--bslib-color-bg: #c23287;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #c23287;color:#fff}.bg-gradient-cyan-orange{--bslib-color-fg: #fff;--bslib-color-bg: #bc788f;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #bc788f;color:#fff}.bg-gradient-cyan-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #c2617a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #c2617a;color:#fff}.bg-gradient-cyan-green{--bslib-color-fg: #fff;--bslib-color-bg: #757b7a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #757b7a;color:#fff}.bg-gradient-cyan-teal{--bslib-color-fg: #fff;--bslib-color-bg: #6983ad;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #6983ad;color:#fff}:root{--bslib-spacer: 1rem;--bslib-mb-spacer: var(--bslib-spacer, 
1rem)}.bslib-mb-spacing{margin-bottom:var(--bslib-mb-spacer)}.bslib-gap-spacing{gap:var(--bslib-mb-spacer)}.bslib-gap-spacing>.bslib-mb-spacing,.bslib-gap-spacing>.form-group,.bslib-gap-spacing>p,.bslib-gap-spacing>pre{margin-bottom:0}.html-fill-container>.html-fill-item.bslib-mb-spacing{margin-bottom:0}.tab-content>.tab-pane.html-fill-container{display:none}.tab-content>.active.html-fill-container{display:flex}.tab-content.html-fill-container{padding:0}html{height:100%}.bslib-page-fill{width:100%;height:100%;margin:0;padding:var(--bslib-spacer, 1rem);gap:var(--bslib-spacer, 1rem)}@media(max-width: 575.98px){.bslib-page-fill{height:var(--bslib-page-fill-mobile-height, auto)}}.bslib-grid{display:grid !important;gap:var(--bslib-spacer, 1rem);height:var(--bslib-grid-height)}.bslib-grid.grid{grid-template-columns:repeat(var(--bs-columns, 12), minmax(0, 1fr));grid-template-rows:unset;grid-auto-rows:var(--bslib-grid--row-heights);--bslib-grid--row-heights--xs: unset;--bslib-grid--row-heights--sm: unset;--bslib-grid--row-heights--md: unset;--bslib-grid--row-heights--lg: unset;--bslib-grid--row-heights--xl: unset;--bslib-grid--row-heights--xxl: unset}.bslib-grid.grid.bslib-grid--row-heights--xs{--bslib-grid--row-heights: var(--bslib-grid--row-heights--xs)}@media(min-width: 576px){.bslib-grid.grid.bslib-grid--row-heights--sm{--bslib-grid--row-heights: var(--bslib-grid--row-heights--sm)}}@media(min-width: 768px){.bslib-grid.grid.bslib-grid--row-heights--md{--bslib-grid--row-heights: var(--bslib-grid--row-heights--md)}}@media(min-width: 992px){.bslib-grid.grid.bslib-grid--row-heights--lg{--bslib-grid--row-heights: var(--bslib-grid--row-heights--lg)}}@media(min-width: 1200px){.bslib-grid.grid.bslib-grid--row-heights--xl{--bslib-grid--row-heights: var(--bslib-grid--row-heights--xl)}}@media(min-width: 1400px){.bslib-grid.grid.bslib-grid--row-heights--xxl{--bslib-grid--row-heights: 
var(--bslib-grid--row-heights--xxl)}}.bslib-grid>*>.shiny-input-container{width:100%}.bslib-grid-item{grid-column:auto/span 1}@media(max-width: 767.98px){.bslib-grid-item{grid-column:1/-1}}@media(max-width: 575.98px){.bslib-grid{grid-template-columns:1fr !important;height:var(--bslib-grid-height-mobile)}.bslib-grid.grid{height:unset !important;grid-auto-rows:var(--bslib-grid--row-heights--xs, auto)}}.navbar+.container-fluid:has(>.tab-content>.tab-pane.active.html-fill-container),.navbar+.container-sm:has(>.tab-content>.tab-pane.active.html-fill-container),.navbar+.container-md:has(>.tab-content>.tab-pane.active.html-fill-container),.navbar+.container-lg:has(>.tab-content>.tab-pane.active.html-fill-container),.navbar+.container-xl:has(>.tab-content>.tab-pane.active.html-fill-container),.navbar+.container-xxl:has(>.tab-content>.tab-pane.active.html-fill-container){padding-left:0;padding-right:0}.navbar+.container-fluid>.tab-content>.tab-pane.active.html-fill-container,.navbar+.container-sm>.tab-content>.tab-pane.active.html-fill-container,.navbar+.container-md>.tab-content>.tab-pane.active.html-fill-container,.navbar+.container-lg>.tab-content>.tab-pane.active.html-fill-container,.navbar+.container-xl>.tab-content>.tab-pane.active.html-fill-container,.navbar+.container-xxl>.tab-content>.tab-pane.active.html-fill-container{padding:var(--bslib-spacer, 1rem);gap:var(--bslib-spacer, 
1rem)}.navbar+.container-fluid>.tab-content>.tab-pane.active.html-fill-container:has(>.bslib-sidebar-layout:only-child),.navbar+.container-sm>.tab-content>.tab-pane.active.html-fill-container:has(>.bslib-sidebar-layout:only-child),.navbar+.container-md>.tab-content>.tab-pane.active.html-fill-container:has(>.bslib-sidebar-layout:only-child),.navbar+.container-lg>.tab-content>.tab-pane.active.html-fill-container:has(>.bslib-sidebar-layout:only-child),.navbar+.container-xl>.tab-content>.tab-pane.active.html-fill-container:has(>.bslib-sidebar-layout:only-child),.navbar+.container-xxl>.tab-content>.tab-pane.active.html-fill-container:has(>.bslib-sidebar-layout:only-child){padding:0}.navbar+.container-fluid>.tab-content>.tab-pane.active.html-fill-container>.bslib-sidebar-layout:only-child:not([data-bslib-sidebar-border=true]),.navbar+.container-sm>.tab-content>.tab-pane.active.html-fill-container>.bslib-sidebar-layout:only-child:not([data-bslib-sidebar-border=true]),.navbar+.container-md>.tab-content>.tab-pane.active.html-fill-container>.bslib-sidebar-layout:only-child:not([data-bslib-sidebar-border=true]),.navbar+.container-lg>.tab-content>.tab-pane.active.html-fill-container>.bslib-sidebar-layout:only-child:not([data-bslib-sidebar-border=true]),.navbar+.container-xl>.tab-content>.tab-pane.active.html-fill-container>.bslib-sidebar-layout:only-child:not([data-bslib-sidebar-border=true]),.navbar+.container-xxl>.tab-content>.tab-pane.active.html-fill-container>.bslib-sidebar-layout:only-child:not([data-bslib-sidebar-border=true]){border-left:none;border-right:none;border-bottom:none}.navbar+.container-fluid>.tab-content>.tab-pane.active.html-fill-container>.bslib-sidebar-layout:only-child:not([data-bslib-sidebar-border-radius=true]),.navbar+.container-sm>.tab-content>.tab-pane.active.html-fill-container>.bslib-sidebar-layout:only-child:not([data-bslib-sidebar-border-radius=true]),.navbar+.container-md>.tab-content>.tab-pane.active.html-fill-container>.bslib-sidebar-layout:o
nly-child:not([data-bslib-sidebar-border-radius=true]),.navbar+.container-lg>.tab-content>.tab-pane.active.html-fill-container>.bslib-sidebar-layout:only-child:not([data-bslib-sidebar-border-radius=true]),.navbar+.container-xl>.tab-content>.tab-pane.active.html-fill-container>.bslib-sidebar-layout:only-child:not([data-bslib-sidebar-border-radius=true]),.navbar+.container-xxl>.tab-content>.tab-pane.active.html-fill-container>.bslib-sidebar-layout:only-child:not([data-bslib-sidebar-border-radius=true]){border-radius:0}.navbar+div>.bslib-sidebar-layout{border-top:var(--bslib-sidebar-border)}.bslib-card{overflow:auto}.bslib-card .card-body+.card-body{padding-top:0}.bslib-card .card-body{overflow:auto}.bslib-card .card-body p{margin-top:0}.bslib-card .card-body p:last-child{margin-bottom:0}.bslib-card .card-body{max-height:var(--bslib-card-body-max-height, none)}.bslib-card[data-full-screen=true]>.card-body{max-height:var(--bslib-card-body-max-height-full-screen, none)}.bslib-card .card-header .form-group{margin-bottom:0}.bslib-card .card-header .selectize-control{margin-bottom:0}.bslib-card .card-header .selectize-control .item{margin-right:1.15rem}.bslib-card .card-footer{margin-top:auto}.bslib-card .bslib-navs-card-title{display:flex;flex-wrap:wrap;justify-content:space-between;align-items:center}.bslib-card .bslib-navs-card-title .nav{margin-left:auto}.bslib-card .bslib-sidebar-layout:not([data-bslib-sidebar-border=true]){border:none}.bslib-card .bslib-sidebar-layout:not([data-bslib-sidebar-border-radius=true]){border-top-left-radius:0;border-top-right-radius:0}[data-full-screen=true]{position:fixed;inset:3.5rem 1rem 1rem;height:auto !important;max-height:none !important;width:auto !important;z-index:1070}.bslib-full-screen-enter{display:none;position:absolute;bottom:var(--bslib-full-screen-enter-bottom, 0.2rem);right:var(--bslib-full-screen-enter-right, 0);top:var(--bslib-full-screen-enter-top);left:var(--bslib-full-screen-enter-left);color:var(--bslib-color-fg, 
var(--bs-card-color));background-color:var(--bslib-color-bg, var(--bs-card-bg, var(--bs-body-bg)));border:var(--bs-card-border-width) solid var(--bslib-color-fg, var(--bs-card-border-color));box-shadow:0 2px 4px rgba(0,0,0,.15);margin:.2rem .4rem;padding:.55rem !important;font-size:.8rem;cursor:pointer;opacity:.7;z-index:1070}.bslib-full-screen-enter:hover{opacity:1}.card[data-full-screen=false]:hover>*>.bslib-full-screen-enter{display:block}.bslib-has-full-screen .card:hover>*>.bslib-full-screen-enter{display:none}@media(max-width: 575.98px){.bslib-full-screen-enter{display:none !important}}.bslib-full-screen-exit{position:relative;top:1.35rem;font-size:.9rem;cursor:pointer;text-decoration:none;display:flex;float:right;margin-right:2.15rem;align-items:center;color:rgba(var(--bs-body-bg-rgb), 0.8)}.bslib-full-screen-exit:hover{color:rgba(var(--bs-body-bg-rgb), 1)}.bslib-full-screen-exit svg{margin-left:.5rem;font-size:1.5rem}#bslib-full-screen-overlay{position:fixed;inset:0;background-color:rgba(var(--bs-body-color-rgb), 0.6);backdrop-filter:blur(2px);-webkit-backdrop-filter:blur(2px);z-index:1069;animation:bslib-full-screen-overlay-enter 400ms cubic-bezier(0.6, 0.02, 0.65, 1) forwards}@keyframes bslib-full-screen-overlay-enter{0%{opacity:0}100%{opacity:1}}.accordion .accordion-header{font-size:calc(1.29rem + 0.48vw);margin-top:0;margin-bottom:.5rem;font-weight:400;line-height:1.2;color:var(--bs-heading-color);margin-bottom:0}@media(min-width: 1200px){.accordion .accordion-header{font-size:1.65rem}}.accordion .accordion-icon:not(:empty){margin-right:.75rem;display:flex}.accordion .accordion-button:not(.collapsed){box-shadow:none}.accordion .accordion-button:not(.collapsed):focus{box-shadow:var(--bs-accordion-btn-focus-box-shadow)}.bslib-sidebar-layout{--bslib-sidebar-transition-duration: 500ms;--bslib-sidebar-transition-easing-x: cubic-bezier(0.8, 0.78, 0.22, 1.07);--bslib-sidebar-border: var(--bs-card-border-width, 1px) solid var(--bs-card-border-color, rgba(0, 0, 
0, 0.175));--bslib-sidebar-border-radius: var(--bs-border-radius);--bslib-sidebar-vert-border: var(--bs-card-border-width, 1px) solid var(--bs-card-border-color, rgba(0, 0, 0, 0.175));--bslib-sidebar-bg: rgba(var(--bs-emphasis-color-rgb, 0, 0, 0), 0.05);--bslib-sidebar-fg: var(--bs-emphasis-color, black);--bslib-sidebar-main-fg: var(--bs-card-color, var(--bs-body-color));--bslib-sidebar-main-bg: var(--bs-card-bg, var(--bs-body-bg));--bslib-sidebar-toggle-bg: rgba(var(--bs-emphasis-color-rgb, 0, 0, 0), 0.1);--bslib-sidebar-padding: calc(var(--bslib-spacer) * 1.5);--bslib-sidebar-icon-size: var(--bslib-spacer, 1rem);--bslib-sidebar-icon-button-size: calc(var(--bslib-sidebar-icon-size, 1rem) * 2);--bslib-sidebar-padding-icon: calc(var(--bslib-sidebar-icon-button-size, 2rem) * 1.5);--bslib-collapse-toggle-border-radius: var(--bs-border-radius, 0.25rem);--bslib-collapse-toggle-transform: 0deg;--bslib-sidebar-toggle-transition-easing: cubic-bezier(1, 0, 0, 1);--bslib-collapse-toggle-right-transform: 180deg;--bslib-sidebar-column-main: minmax(0, 1fr);display:grid !important;grid-template-columns:min(100% - var(--bslib-sidebar-icon-size),var(--bslib-sidebar-width, 250px)) var(--bslib-sidebar-column-main);position:relative;transition:grid-template-columns ease-in-out var(--bslib-sidebar-transition-duration);border:var(--bslib-sidebar-border);border-radius:var(--bslib-sidebar-border-radius)}@media(prefers-reduced-motion: reduce){.bslib-sidebar-layout{transition:none}}.bslib-sidebar-layout[data-bslib-sidebar-border=false]{border:none}.bslib-sidebar-layout[data-bslib-sidebar-border-radius=false]{border-radius:initial}.bslib-sidebar-layout>.main,.bslib-sidebar-layout>.sidebar{grid-row:1/2;border-radius:inherit;overflow:auto}.bslib-sidebar-layout>.main{grid-column:2/3;border-top-left-radius:0;border-bottom-left-radius:0;padding:var(--bslib-sidebar-padding);transition:padding var(--bslib-sidebar-transition-easing-x) 
var(--bslib-sidebar-transition-duration);color:var(--bslib-sidebar-main-fg);background-color:var(--bslib-sidebar-main-bg)}.bslib-sidebar-layout>.sidebar{grid-column:1/2;width:100%;height:100%;border-right:var(--bslib-sidebar-vert-border);border-top-right-radius:0;border-bottom-right-radius:0;color:var(--bslib-sidebar-fg);background-color:var(--bslib-sidebar-bg);backdrop-filter:blur(5px)}.bslib-sidebar-layout>.sidebar>.sidebar-content{display:flex;flex-direction:column;gap:var(--bslib-spacer, 1rem);padding:var(--bslib-sidebar-padding);padding-top:var(--bslib-sidebar-padding-icon)}.bslib-sidebar-layout>.sidebar>.sidebar-content>:last-child:not(.sidebar-title){margin-bottom:0}.bslib-sidebar-layout>.sidebar>.sidebar-content>.accordion{margin-left:calc(-1*var(--bslib-sidebar-padding));margin-right:calc(-1*var(--bslib-sidebar-padding))}.bslib-sidebar-layout>.sidebar>.sidebar-content>.accordion:last-child{margin-bottom:calc(-1*var(--bslib-sidebar-padding))}.bslib-sidebar-layout>.sidebar>.sidebar-content>.accordion:not(:last-child){margin-bottom:1rem}.bslib-sidebar-layout>.sidebar>.sidebar-content>.accordion .accordion-body{display:flex;flex-direction:column}.bslib-sidebar-layout>.sidebar>.sidebar-content>.accordion:not(:first-child) .accordion-item:first-child{border-top:var(--bs-accordion-border-width) solid var(--bs-accordion-border-color)}.bslib-sidebar-layout>.sidebar>.sidebar-content>.accordion:not(:last-child) .accordion-item:last-child{border-bottom:var(--bs-accordion-border-width) solid var(--bs-accordion-border-color)}.bslib-sidebar-layout>.sidebar>.sidebar-content.has-accordion>.sidebar-title{border-bottom:none;padding-bottom:0}.bslib-sidebar-layout>.sidebar 
.shiny-input-container{width:100%}.bslib-sidebar-layout[data-bslib-sidebar-open=always]>.sidebar>.sidebar-content{padding-top:var(--bslib-sidebar-padding)}.bslib-sidebar-layout>.collapse-toggle{grid-row:1/2;grid-column:1/2;display:inline-flex;align-items:center;position:absolute;right:calc(var(--bslib-sidebar-icon-size));top:calc(var(--bslib-sidebar-icon-size, 1rem)/2);border:none;border-radius:var(--bslib-collapse-toggle-border-radius);height:var(--bslib-sidebar-icon-button-size, 2rem);width:var(--bslib-sidebar-icon-button-size, 2rem);display:flex;align-items:center;justify-content:center;padding:0;color:var(--bslib-sidebar-fg);background-color:unset;transition:color var(--bslib-sidebar-transition-easing-x) var(--bslib-sidebar-transition-duration),top var(--bslib-sidebar-transition-easing-x) var(--bslib-sidebar-transition-duration),right var(--bslib-sidebar-transition-easing-x) var(--bslib-sidebar-transition-duration),left var(--bslib-sidebar-transition-easing-x) var(--bslib-sidebar-transition-duration)}.bslib-sidebar-layout>.collapse-toggle:hover{background-color:var(--bslib-sidebar-toggle-bg)}.bslib-sidebar-layout>.collapse-toggle>.collapse-icon{opacity:.8;width:var(--bslib-sidebar-icon-size);height:var(--bslib-sidebar-icon-size);transform:rotateY(var(--bslib-collapse-toggle-transform));transition:transform var(--bslib-sidebar-toggle-transition-easing) var(--bslib-sidebar-transition-duration)}.bslib-sidebar-layout>.collapse-toggle:hover>.collapse-icon{opacity:1}.bslib-sidebar-layout .sidebar-title{font-size:1.25rem;line-height:1.25;margin-top:0;margin-bottom:1rem;padding-bottom:1rem;border-bottom:var(--bslib-sidebar-border)}.bslib-sidebar-layout.sidebar-right{grid-template-columns:var(--bslib-sidebar-column-main) min(100% - var(--bslib-sidebar-icon-size),var(--bslib-sidebar-width, 
250px))}.bslib-sidebar-layout.sidebar-right>.main{grid-column:1/2;border-top-right-radius:0;border-bottom-right-radius:0;border-top-left-radius:inherit;border-bottom-left-radius:inherit}.bslib-sidebar-layout.sidebar-right>.sidebar{grid-column:2/3;border-right:none;border-left:var(--bslib-sidebar-vert-border);border-top-left-radius:0;border-bottom-left-radius:0}.bslib-sidebar-layout.sidebar-right>.collapse-toggle{grid-column:2/3;left:var(--bslib-sidebar-icon-size);right:unset;border:var(--bslib-collapse-toggle-border)}.bslib-sidebar-layout.sidebar-right>.collapse-toggle>.collapse-icon{transform:rotateY(var(--bslib-collapse-toggle-right-transform))}.bslib-sidebar-layout.sidebar-collapsed{--bslib-collapse-toggle-transform: 180deg;--bslib-collapse-toggle-right-transform: 0deg;--bslib-sidebar-vert-border: none;grid-template-columns:0 minmax(0, 1fr)}.bslib-sidebar-layout.sidebar-collapsed.sidebar-right{grid-template-columns:minmax(0, 1fr) 0}.bslib-sidebar-layout.sidebar-collapsed:not(.transitioning)>.sidebar>*{display:none}.bslib-sidebar-layout.sidebar-collapsed>.main{border-radius:inherit}.bslib-sidebar-layout.sidebar-collapsed:not(.sidebar-right)>.main{padding-left:var(--bslib-sidebar-padding-icon)}.bslib-sidebar-layout.sidebar-collapsed.sidebar-right>.main{padding-right:var(--bslib-sidebar-padding-icon)}.bslib-sidebar-layout.sidebar-collapsed>.collapse-toggle{color:var(--bslib-sidebar-main-fg);top:calc(var(--bslib-sidebar-overlap-counter, 0)*(var(--bslib-sidebar-icon-size) + var(--bslib-sidebar-padding)) + var(--bslib-sidebar-icon-size, 1rem)/2);right:calc(-2.5*var(--bslib-sidebar-icon-size) - var(--bs-card-border-width, 1px))}.bslib-sidebar-layout.sidebar-collapsed.sidebar-right>.collapse-toggle{left:calc(-2.5*var(--bslib-sidebar-icon-size) - var(--bs-card-border-width, 1px));right:unset}@media(min-width: 576px){.bslib-sidebar-layout.transitioning>.sidebar>.sidebar-content{display:none}}@media(max-width: 
575.98px){.bslib-sidebar-layout[data-bslib-sidebar-open=desktop]{--bslib-sidebar-js-init-collapsed: true}.bslib-sidebar-layout>.sidebar,.bslib-sidebar-layout.sidebar-right>.sidebar{border:none}.bslib-sidebar-layout>.main,.bslib-sidebar-layout.sidebar-right>.main{grid-column:1/3}.bslib-sidebar-layout[data-bslib-sidebar-open=always]{display:block !important}.bslib-sidebar-layout[data-bslib-sidebar-open=always]>.sidebar{max-height:var(--bslib-sidebar-max-height-mobile);overflow-y:auto;border-top:var(--bslib-sidebar-vert-border)}.bslib-sidebar-layout:not([data-bslib-sidebar-open=always]){grid-template-columns:100% 0}.bslib-sidebar-layout:not([data-bslib-sidebar-open=always]):not(.sidebar-collapsed)>.sidebar{z-index:1}.bslib-sidebar-layout:not([data-bslib-sidebar-open=always]):not(.sidebar-collapsed)>.collapse-toggle{z-index:1}.bslib-sidebar-layout:not([data-bslib-sidebar-open=always]).sidebar-right{grid-template-columns:0 100%}.bslib-sidebar-layout:not([data-bslib-sidebar-open=always]).sidebar-collapsed{grid-template-columns:0 100%}.bslib-sidebar-layout:not([data-bslib-sidebar-open=always]).sidebar-collapsed.sidebar-right{grid-template-columns:100% 0}.bslib-sidebar-layout:not([data-bslib-sidebar-open=always]):not(.sidebar-right)>.main{padding-left:var(--bslib-sidebar-padding-icon)}.bslib-sidebar-layout:not([data-bslib-sidebar-open=always]).sidebar-right>.main{padding-right:var(--bslib-sidebar-padding-icon)}.bslib-sidebar-layout:not([data-bslib-sidebar-open=always])>.main{opacity:0;transition:opacity var(--bslib-sidebar-transition-easing-x) var(--bslib-sidebar-transition-duration)}.bslib-sidebar-layout:not([data-bslib-sidebar-open=always]).sidebar-collapsed>.main{opacity:1}}:root{--bslib-value-box-shadow: none;--bslib-value-box-border-width-auto-yes: var(--bslib-value-box-border-width-baseline);--bslib-value-box-border-width-auto-no: 0;--bslib-value-box-border-width-baseline: 1px}.bslib-value-box{border-width:var(--bslib-value-box-border-width-auto-no, 
var(--bslib-value-box-border-width-baseline));container-name:bslib-value-box;container-type:inline-size}.bslib-value-box.card{box-shadow:var(--bslib-value-box-shadow)}.bslib-value-box.border-auto{border-width:var(--bslib-value-box-border-width-auto-yes, var(--bslib-value-box-border-width-baseline))}.bslib-value-box.default{--bslib-value-box-bg-default: var(--bs-card-bg, #fff);--bslib-value-box-border-color-default: var(--bs-card-border-color, rgba(0, 0, 0, 0.175));color:var(--bslib-value-box-color);background-color:var(--bslib-value-box-bg, var(--bslib-value-box-bg-default));border-color:var(--bslib-value-box-border-color, var(--bslib-value-box-border-color-default))}.bslib-value-box .value-box-grid{display:grid;grid-template-areas:"left right";align-items:center;overflow:hidden}.bslib-value-box .value-box-showcase{height:100%;max-height:var(---bslib-value-box-showcase-max-h, 100%)}.bslib-value-box .value-box-showcase,.bslib-value-box .value-box-showcase>.html-fill-item{width:100%}.bslib-value-box[data-full-screen=true] .value-box-showcase{max-height:var(---bslib-value-box-showcase-max-h-fs, 100%)}@media screen and (min-width: 575.98px){@container bslib-value-box (max-width: 300px){.bslib-value-box:not(.showcase-bottom) .value-box-grid{grid-template-columns:1fr !important;grid-template-rows:auto auto;grid-template-areas:"top" "bottom"}.bslib-value-box:not(.showcase-bottom) .value-box-grid .value-box-showcase{grid-area:top !important}.bslib-value-box:not(.showcase-bottom) .value-box-grid .value-box-area{grid-area:bottom !important;justify-content:end}}}.bslib-value-box .value-box-area{justify-content:center;padding:1.5rem 1rem;font-size:.9rem;font-weight:500}.bslib-value-box .value-box-area *{margin-bottom:0;margin-top:0}.bslib-value-box .value-box-title{font-size:1rem;margin-top:0;margin-bottom:.5rem;font-weight:400;line-height:1.2}.bslib-value-box .value-box-title:empty::after{content:" "}.bslib-value-box .value-box-value{font-size:calc(1.29rem + 
0.48vw);margin-top:0;margin-bottom:.5rem;font-weight:400;line-height:1.2}@media(min-width: 1200px){.bslib-value-box .value-box-value{font-size:1.65rem}}.bslib-value-box .value-box-value:empty::after{content:" "}.bslib-value-box .value-box-showcase{align-items:center;justify-content:center;margin-top:auto;margin-bottom:auto;padding:1rem}.bslib-value-box .value-box-showcase .bi,.bslib-value-box .value-box-showcase .fa,.bslib-value-box .value-box-showcase .fab,.bslib-value-box .value-box-showcase .fas,.bslib-value-box .value-box-showcase .far{opacity:.85;min-width:50px;max-width:125%}.bslib-value-box .value-box-showcase .bi,.bslib-value-box .value-box-showcase .fa,.bslib-value-box .value-box-showcase .fab,.bslib-value-box .value-box-showcase .fas,.bslib-value-box .value-box-showcase .far{font-size:4rem}.bslib-value-box.showcase-top-right .value-box-grid{grid-template-columns:1fr var(---bslib-value-box-showcase-w, 50%)}.bslib-value-box.showcase-top-right .value-box-grid .value-box-showcase{grid-area:right;margin-left:auto;align-self:start;align-items:end;padding-left:0;padding-bottom:0}.bslib-value-box.showcase-top-right .value-box-grid .value-box-area{grid-area:left;align-self:end}.bslib-value-box.showcase-top-right[data-full-screen=true] .value-box-grid{grid-template-columns:auto var(---bslib-value-box-showcase-w-fs, 1fr)}.bslib-value-box.showcase-top-right[data-full-screen=true] .value-box-grid>div{align-self:center}.bslib-value-box.showcase-top-right:not([data-full-screen=true]) .value-box-showcase{margin-top:0}@container bslib-value-box (max-width: 300px){.bslib-value-box.showcase-top-right:not([data-full-screen=true]) .value-box-grid .value-box-showcase{padding-left:1rem}}.bslib-value-box.showcase-left-center .value-box-grid{grid-template-columns:var(---bslib-value-box-showcase-w, 30%) auto}.bslib-value-box.showcase-left-center[data-full-screen=true] .value-box-grid{grid-template-columns:var(---bslib-value-box-showcase-w-fs, 1fr) 
auto}.bslib-value-box.showcase-left-center:not([data-full-screen=true]) .value-box-grid .value-box-showcase{grid-area:left}.bslib-value-box.showcase-left-center:not([data-full-screen=true]) .value-box-grid .value-box-area{grid-area:right}.bslib-value-box.showcase-bottom .value-box-grid{grid-template-columns:1fr;grid-template-rows:1fr var(---bslib-value-box-showcase-h, auto);grid-template-areas:"top" "bottom";overflow:hidden}.bslib-value-box.showcase-bottom .value-box-grid .value-box-showcase{grid-area:bottom;padding:0;margin:0}.bslib-value-box.showcase-bottom .value-box-grid .value-box-area{grid-area:top}.bslib-value-box.showcase-bottom[data-full-screen=true] .value-box-grid{grid-template-rows:1fr var(---bslib-value-box-showcase-h-fs, 2fr)}.bslib-value-box.showcase-bottom[data-full-screen=true] .value-box-grid .value-box-showcase{padding:1rem}[data-bs-theme=dark] .bslib-value-box{--bslib-value-box-shadow: 0 0.5rem 1rem rgb(0 0 0 / 50%)}:root{--bslib-page-sidebar-title-bg: #f8f9fa;--bslib-page-sidebar-title-color: #000}.bslib-page-title{background-color:var(--bslib-page-sidebar-title-bg);color:var(--bslib-page-sidebar-title-color);font-size:1.25rem;font-weight:300;padding:var(--bslib-spacer, 1rem);padding-left:1.5rem;margin-bottom:0;border-bottom:1px solid #dee2e6}@media(min-width: 576px){.nav:not(.nav-hidden){display:flex !important;display:-webkit-flex !important}.nav:not(.nav-hidden):not(.nav-stacked):not(.flex-column){float:none !important}.nav:not(.nav-hidden):not(.nav-stacked):not(.flex-column)>.bslib-nav-spacer{margin-left:auto !important}.nav:not(.nav-hidden):not(.nav-stacked):not(.flex-column)>.form-inline{margin-top:auto;margin-bottom:auto}.nav:not(.nav-hidden).nav-stacked{flex-direction:column;-webkit-flex-direction:column;height:100%}.nav:not(.nav-hidden).nav-stacked>.bslib-nav-spacer{margin-top:auto !important}}.html-fill-container{display:flex;flex-direction:column;min-height:0;min-width:0}.html-fill-container>.html-fill-item{flex:1 1
auto;min-height:0;min-width:0}.html-fill-container>:not(.html-fill-item){flex:0 0 auto}.quarto-container{min-height:calc(100vh - 132px)}body.hypothesis-enabled #quarto-header{margin-right:16px}footer.footer .nav-footer,#quarto-header>nav{padding-left:1em;padding-right:1em}footer.footer div.nav-footer p:first-child{margin-top:0}footer.footer div.nav-footer p:last-child{margin-bottom:0}#quarto-content>*{padding-top:14px}#quarto-content>#quarto-sidebar-glass{padding-top:0px}@media(max-width: 991.98px){#quarto-content>*{padding-top:0}#quarto-content .subtitle{padding-top:14px}#quarto-content section:first-of-type h2:first-of-type,#quarto-content section:first-of-type .h2:first-of-type{margin-top:1rem}}.headroom-target,header.headroom{will-change:transform;transition:position 200ms linear;transition:all 200ms linear}header.headroom--pinned{transform:translateY(0%)}header.headroom--unpinned{transform:translateY(-100%)}.navbar-container{width:100%}.navbar-brand{overflow:hidden;text-overflow:ellipsis}.navbar-brand-container{max-width:calc(100% - 115px);min-width:0;display:flex;align-items:center}@media(min-width: 992px){.navbar-brand-container{margin-right:1em}}.navbar-brand.navbar-brand-logo{margin-right:4px;display:inline-flex}.navbar-toggler{flex-basis:content;flex-shrink:0}.navbar .navbar-brand-container{order:2}.navbar .navbar-toggler{order:1}.navbar .navbar-container>.navbar-nav{order:20}.navbar .navbar-container>.navbar-brand-container{margin-left:0 !important;margin-right:0 !important}.navbar .navbar-collapse{order:20}.navbar #quarto-search{order:4;margin-left:auto}.navbar .navbar-toggler{margin-right:.5em}.navbar-collapse .quarto-navbar-tools{margin-left:.5em}.navbar-logo{max-height:24px;width:auto;padding-right:4px}nav .nav-item:not(.compact){padding-top:1px}nav .nav-link i,nav .dropdown-item i{padding-right:1px}.navbar-expand-lg .navbar-nav .nav-link{padding-left:.6rem;padding-right:.6rem}nav .nav-item.compact 
.nav-link{padding-left:.5rem;padding-right:.5rem;font-size:1.1rem}.navbar .quarto-navbar-tools{order:3}.navbar .quarto-navbar-tools div.dropdown{display:inline-block}.navbar .quarto-navbar-tools .quarto-navigation-tool{color:#545555}.navbar .quarto-navbar-tools .quarto-navigation-tool:hover{color:#1f4eb6}.navbar-nav .dropdown-menu{min-width:220px;font-size:.9rem}.navbar .navbar-nav .nav-link.dropdown-toggle::after{opacity:.75;vertical-align:.175em}.navbar ul.dropdown-menu{padding-top:0;padding-bottom:0}.navbar .dropdown-header{text-transform:uppercase;font-size:.8rem;padding:0 .5rem}.navbar .dropdown-item{padding:.4rem .5rem}.navbar .dropdown-item>i.bi{margin-left:.1rem;margin-right:.25em}.sidebar #quarto-search{margin-top:-1px}.sidebar #quarto-search svg.aa-SubmitIcon{width:16px;height:16px}.sidebar-navigation a{color:inherit}.sidebar-title{margin-top:.25rem;padding-bottom:.5rem;font-size:1.3rem;line-height:1.6rem;visibility:visible}.sidebar-title>a{font-size:inherit;text-decoration:none}.sidebar-title .sidebar-tools-main{margin-top:-6px}@media(max-width: 991.98px){#quarto-sidebar div.sidebar-header{padding-top:.2em}}.sidebar-header-stacked .sidebar-title{margin-top:.6rem}.sidebar-logo{max-width:90%;padding-bottom:.5rem}.sidebar-logo-link{text-decoration:none}.sidebar-navigation li a{text-decoration:none}.sidebar-navigation .quarto-navigation-tool{opacity:.7;font-size:.875rem}#quarto-sidebar>nav>.sidebar-tools-main{margin-left:14px}.sidebar-tools-main{display:inline-flex;margin-left:0px;order:2}.sidebar-tools-main:not(.tools-wide){vertical-align:middle}.sidebar-navigation .quarto-navigation-tool.dropdown-toggle::after{display:none}.sidebar.sidebar-navigation>*{padding-top:1em}.sidebar-item{margin-bottom:.2em;line-height:1rem;margin-top:.4rem}.sidebar-section{padding-left:.5em;padding-bottom:.2em}.sidebar-item .sidebar-item-container{display:flex;justify-content:space-between;cursor:pointer}.sidebar-item-toggle:hover{cursor:pointer}.sidebar-item 
.sidebar-item-toggle .bi{font-size:.7rem;text-align:center}.sidebar-item .sidebar-item-toggle .bi-chevron-right::before{transition:transform 200ms ease}.sidebar-item .sidebar-item-toggle[aria-expanded=false] .bi-chevron-right::before{transform:none}.sidebar-item .sidebar-item-toggle[aria-expanded=true] .bi-chevron-right::before{transform:rotate(90deg)}.sidebar-item-text{width:100%}.sidebar-navigation .sidebar-divider{margin-left:0;margin-right:0;margin-top:.5rem;margin-bottom:.5rem}@media(max-width: 991.98px){.quarto-secondary-nav{display:block}.quarto-secondary-nav button.quarto-search-button{padding-right:0em;padding-left:2em}.quarto-secondary-nav button.quarto-btn-toggle{margin-left:-0.75rem;margin-right:.15rem}.quarto-secondary-nav nav.quarto-title-breadcrumbs{display:none}.quarto-secondary-nav nav.quarto-page-breadcrumbs{display:flex;align-items:center;padding-right:1em;margin-left:-0.25em}.quarto-secondary-nav nav.quarto-page-breadcrumbs a{text-decoration:none}.quarto-secondary-nav nav.quarto-page-breadcrumbs ol.breadcrumb{margin-bottom:0}}@media(min-width: 992px){.quarto-secondary-nav{display:none}}.quarto-title-breadcrumbs .breadcrumb{margin-bottom:.5em;font-size:.9rem}.quarto-title-breadcrumbs .breadcrumb li:last-of-type a{color:#6c757d}.quarto-secondary-nav .quarto-btn-toggle{color:#595959}.quarto-secondary-nav[aria-expanded=false] .quarto-btn-toggle .bi-chevron-right::before{transform:none}.quarto-secondary-nav[aria-expanded=true] .quarto-btn-toggle .bi-chevron-right::before{transform:rotate(90deg)}.quarto-secondary-nav .quarto-btn-toggle .bi-chevron-right::before{transition:transform 200ms ease}.quarto-secondary-nav{cursor:pointer}.no-decor{text-decoration:none}.quarto-secondary-nav-title{margin-top:.3em;color:#595959;padding-top:4px}.quarto-secondary-nav nav.quarto-page-breadcrumbs{color:#595959}.quarto-secondary-nav nav.quarto-page-breadcrumbs a{color:#595959}.quarto-secondary-nav nav.quarto-page-breadcrumbs 
a:hover{color:rgba(33,81,191,.8)}.quarto-secondary-nav nav.quarto-page-breadcrumbs .breadcrumb-item::before{color:#8c8c8c}.breadcrumb-item{line-height:1.2rem}div.sidebar-item-container{color:#595959}div.sidebar-item-container:hover,div.sidebar-item-container:focus{color:rgba(33,81,191,.8)}div.sidebar-item-container.disabled{color:rgba(89,89,89,.75)}div.sidebar-item-container .active,div.sidebar-item-container .show>.nav-link,div.sidebar-item-container .sidebar-link>code{color:#2151bf}div.sidebar.sidebar-navigation.rollup.quarto-sidebar-toggle-contents,nav.sidebar.sidebar-navigation:not(.rollup){background-color:#fff}@media(max-width: 991.98px){.sidebar-navigation .sidebar-item a,.nav-page .nav-page-text,.sidebar-navigation{font-size:1rem}.sidebar-navigation ul.sidebar-section.depth1 .sidebar-section-item{font-size:1.1rem}.sidebar-logo{display:none}.sidebar.sidebar-navigation{position:static;border-bottom:1px solid #dee2e6}.sidebar.sidebar-navigation.collapsing{position:fixed;z-index:1000}.sidebar.sidebar-navigation.show{position:fixed;z-index:1000}.sidebar.sidebar-navigation{min-height:100%}nav.quarto-secondary-nav{background-color:#fff;border-bottom:1px solid #dee2e6}.quarto-banner nav.quarto-secondary-nav{background-color:#f8f9fa;color:#545555;border-top:1px solid #dee2e6}.sidebar .sidebar-footer{visibility:visible;padding-top:1rem;position:inherit}.sidebar-tools-collapse{display:block}}#quarto-sidebar{transition:width .15s ease-in}#quarto-sidebar>*{padding-right:1em}@media(max-width: 991.98px){#quarto-sidebar .sidebar-menu-container{white-space:nowrap;min-width:225px}#quarto-sidebar.show{transition:width .15s ease-out}}@media(min-width: 992px){#quarto-sidebar{display:flex;flex-direction:column}.nav-page .nav-page-text,.sidebar-navigation .sidebar-section .sidebar-item{font-size:.875rem}.sidebar-navigation .sidebar-item{font-size:.925rem}.sidebar.sidebar-navigation{display:block;position:sticky}.sidebar-search{width:100%}.sidebar 
.sidebar-footer{visibility:visible}}@media(min-width: 992px){#quarto-sidebar-glass{display:none}}@media(max-width: 991.98px){#quarto-sidebar-glass{position:fixed;top:0;bottom:0;left:0;right:0;background-color:rgba(255,255,255,0);transition:background-color .15s ease-in;z-index:-1}#quarto-sidebar-glass.collapsing{z-index:1000}#quarto-sidebar-glass.show{transition:background-color .15s ease-out;background-color:rgba(102,102,102,.4);z-index:1000}}.sidebar .sidebar-footer{padding:.5rem 1rem;align-self:flex-end;color:#6c757d;width:100%}.quarto-page-breadcrumbs .breadcrumb-item+.breadcrumb-item,.quarto-page-breadcrumbs .breadcrumb-item{padding-right:.33em;padding-left:0}.quarto-page-breadcrumbs .breadcrumb-item::before{padding-right:.33em}.quarto-sidebar-footer{font-size:.875em}.sidebar-section .bi-chevron-right{vertical-align:middle}.sidebar-section .bi-chevron-right::before{font-size:.9em}.notransition{-webkit-transition:none !important;-moz-transition:none !important;-o-transition:none !important;transition:none !important}.btn:focus:not(:focus-visible){box-shadow:none}.page-navigation{display:flex;justify-content:space-between}.nav-page{padding-bottom:.75em}.nav-page .bi{font-size:1.8rem;vertical-align:middle}.nav-page .nav-page-text{padding-left:.25em;padding-right:.25em}.nav-page a{color:#6c757d;text-decoration:none;display:flex;align-items:center}.nav-page a:hover{color:#1f4eb6}.nav-footer .toc-actions{padding-bottom:.5em;padding-top:.5em}.nav-footer .toc-actions a,.nav-footer .toc-actions a:hover{text-decoration:none}.nav-footer .toc-actions ul{display:flex;list-style:none}.nav-footer .toc-actions ul :first-child{margin-left:auto}.nav-footer .toc-actions ul :last-child{margin-right:auto}.nav-footer .toc-actions ul li{padding-right:1.5em}.nav-footer .toc-actions ul li i.bi{padding-right:.4em}.nav-footer .toc-actions ul 
li:last-of-type{padding-right:0}.nav-footer{display:flex;flex-direction:row;flex-wrap:wrap;justify-content:space-between;align-items:baseline;text-align:center;padding-top:.5rem;padding-bottom:.5rem;background-color:#fff}body.nav-fixed{padding-top:64px}.nav-footer-contents{color:#6c757d;margin-top:.25rem}.nav-footer{min-height:3.5em;color:#757575}.nav-footer a{color:#757575}.nav-footer .nav-footer-left{font-size:.825em}.nav-footer .nav-footer-center{font-size:.825em}.nav-footer .nav-footer-right{font-size:.825em}.nav-footer-left .footer-items,.nav-footer-center .footer-items,.nav-footer-right .footer-items{display:inline-flex;padding-top:.3em;padding-bottom:.3em;margin-bottom:0em}.nav-footer-left .footer-items .nav-link,.nav-footer-center .footer-items .nav-link,.nav-footer-right .footer-items .nav-link{padding-left:.6em;padding-right:.6em}@media(min-width: 768px){.nav-footer-left{flex:1 1 0px;text-align:left}}@media(max-width: 575.98px){.nav-footer-left{margin-bottom:1em;flex:100%}}@media(min-width: 768px){.nav-footer-right{flex:1 1 0px;text-align:right}}@media(max-width: 575.98px){.nav-footer-right{margin-bottom:1em;flex:100%}}.nav-footer-center{text-align:center;min-height:3em}@media(min-width: 768px){.nav-footer-center{flex:1 1 0px}}.nav-footer-center .footer-items{justify-content:center}@media(max-width: 767.98px){.nav-footer-center{margin-bottom:1em;flex:100%}}@media(max-width: 767.98px){.nav-footer-center{margin-top:3em;order:10}}.navbar .quarto-reader-toggle.reader .quarto-reader-toggle-btn{background-color:#545555;border-radius:3px}@media(max-width: 991.98px){.quarto-reader-toggle{display:none}}.quarto-reader-toggle.reader.quarto-navigation-tool .quarto-reader-toggle-btn{background-color:#595959;border-radius:3px}.quarto-reader-toggle .quarto-reader-toggle-btn{display:inline-flex;padding-left:.2em;padding-right:.2em;margin-left:-0.2em;margin-right:-0.2em;text-align:center}.navbar .quarto-reader-toggle:not(.reader) 
.bi::before{background-image:url('data:image/svg+xml,')}.navbar .quarto-reader-toggle.reader .bi::before{background-image:url('data:image/svg+xml,')}.sidebar-navigation .quarto-reader-toggle:not(.reader) .bi::before{background-image:url('data:image/svg+xml,')}.sidebar-navigation .quarto-reader-toggle.reader .bi::before{background-image:url('data:image/svg+xml,')}#quarto-back-to-top{display:none;position:fixed;bottom:50px;background-color:#fff;border-radius:.25rem;box-shadow:0 .2rem .5rem #6c757d,0 0 .05rem #6c757d;color:#6c757d;text-decoration:none;font-size:.9em;text-align:center;left:50%;padding:.4rem .8rem;transform:translate(-50%, 0)}#quarto-announcement{padding:.5em;display:flex;justify-content:space-between;margin-bottom:0;font-size:.9em}#quarto-announcement .quarto-announcement-content{margin-right:auto}#quarto-announcement .quarto-announcement-content p{margin-bottom:0}#quarto-announcement .quarto-announcement-icon{margin-right:.5em;font-size:1.2em;margin-top:-0.15em}#quarto-announcement .quarto-announcement-action{cursor:pointer}.aa-DetachedSearchButtonQuery{display:none}.aa-DetachedOverlay ul.aa-List,#quarto-search-results ul.aa-List{list-style:none;padding-left:0}.aa-DetachedOverlay .aa-Panel,#quarto-search-results .aa-Panel{background-color:#fff;position:absolute;z-index:2000}#quarto-search-results .aa-Panel{max-width:400px}#quarto-search input{font-size:.925rem}@media(min-width: 992px){.navbar #quarto-search{margin-left:.25rem;order:999}}.navbar.navbar-expand-sm #quarto-search,.navbar.navbar-expand-md #quarto-search{order:999}@media(min-width: 992px){.navbar .quarto-navbar-tools{order:900}}@media(min-width: 992px){.navbar .quarto-navbar-tools.tools-end{margin-left:auto !important}}@media(max-width: 991.98px){#quarto-sidebar .sidebar-search{display:none}}#quarto-sidebar .sidebar-search .aa-Autocomplete{width:100%}.navbar .aa-Autocomplete .aa-Form{width:180px}.navbar #quarto-search.type-overlay .aa-Autocomplete{width:40px}.navbar 
#quarto-search.type-overlay .aa-Autocomplete .aa-Form{background-color:inherit;border:none}.navbar #quarto-search.type-overlay .aa-Autocomplete .aa-Form:focus-within{box-shadow:none;outline:none}.navbar #quarto-search.type-overlay .aa-Autocomplete .aa-Form .aa-InputWrapper{display:none}.navbar #quarto-search.type-overlay .aa-Autocomplete .aa-Form .aa-InputWrapper:focus-within{display:inherit}.navbar #quarto-search.type-overlay .aa-Autocomplete .aa-Form .aa-Label svg,.navbar #quarto-search.type-overlay .aa-Autocomplete .aa-Form .aa-LoadingIndicator svg{width:26px;height:26px;color:#545555;opacity:1}.navbar #quarto-search.type-overlay .aa-Autocomplete svg.aa-SubmitIcon{width:26px;height:26px;color:#545555;opacity:1}.aa-Autocomplete .aa-Form,.aa-DetachedFormContainer .aa-Form{align-items:center;background-color:#fff;border:1px solid #dee2e6;border-radius:.25rem;color:#343a40;display:flex;line-height:1em;margin:0;position:relative;width:100%}.aa-Autocomplete .aa-Form:focus-within,.aa-DetachedFormContainer .aa-Form:focus-within{box-shadow:rgba(39,128,227,.6) 0 0 0 1px;outline:currentColor none medium}.aa-Autocomplete .aa-Form .aa-InputWrapperPrefix,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperPrefix{align-items:center;display:flex;flex-shrink:0;order:1}.aa-Autocomplete .aa-Form .aa-InputWrapperPrefix .aa-Label,.aa-Autocomplete .aa-Form .aa-InputWrapperPrefix .aa-LoadingIndicator,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperPrefix .aa-Label,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperPrefix .aa-LoadingIndicator{cursor:initial;flex-shrink:0;padding:0;text-align:left}.aa-Autocomplete .aa-Form .aa-InputWrapperPrefix .aa-Label svg,.aa-Autocomplete .aa-Form .aa-InputWrapperPrefix .aa-LoadingIndicator svg,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperPrefix .aa-Label svg,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperPrefix .aa-LoadingIndicator svg{color:#343a40;opacity:.5}.aa-Autocomplete .aa-Form .aa-InputWrapperPrefix 
.aa-SubmitButton,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperPrefix .aa-SubmitButton{appearance:none;background:none;border:0;margin:0}.aa-Autocomplete .aa-Form .aa-InputWrapperPrefix .aa-LoadingIndicator,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperPrefix .aa-LoadingIndicator{align-items:center;display:flex;justify-content:center}.aa-Autocomplete .aa-Form .aa-InputWrapperPrefix .aa-LoadingIndicator[hidden],.aa-DetachedFormContainer .aa-Form .aa-InputWrapperPrefix .aa-LoadingIndicator[hidden]{display:none}.aa-Autocomplete .aa-Form .aa-InputWrapper,.aa-DetachedFormContainer .aa-Form .aa-InputWrapper{order:3;position:relative;width:100%}.aa-Autocomplete .aa-Form .aa-InputWrapper .aa-Input,.aa-DetachedFormContainer .aa-Form .aa-InputWrapper .aa-Input{appearance:none;background:none;border:0;color:#343a40;font:inherit;height:calc(1.5em + .1rem + 2px);padding:0;width:100%}.aa-Autocomplete .aa-Form .aa-InputWrapper .aa-Input::placeholder,.aa-DetachedFormContainer .aa-Form .aa-InputWrapper .aa-Input::placeholder{color:#343a40;opacity:.8}.aa-Autocomplete .aa-Form .aa-InputWrapper .aa-Input:focus,.aa-DetachedFormContainer .aa-Form .aa-InputWrapper .aa-Input:focus{border-color:none;box-shadow:none;outline:none}.aa-Autocomplete .aa-Form .aa-InputWrapper .aa-Input::-webkit-search-decoration,.aa-Autocomplete .aa-Form .aa-InputWrapper .aa-Input::-webkit-search-cancel-button,.aa-Autocomplete .aa-Form .aa-InputWrapper .aa-Input::-webkit-search-results-button,.aa-Autocomplete .aa-Form .aa-InputWrapper .aa-Input::-webkit-search-results-decoration,.aa-DetachedFormContainer .aa-Form .aa-InputWrapper .aa-Input::-webkit-search-decoration,.aa-DetachedFormContainer .aa-Form .aa-InputWrapper .aa-Input::-webkit-search-cancel-button,.aa-DetachedFormContainer .aa-Form .aa-InputWrapper .aa-Input::-webkit-search-results-button,.aa-DetachedFormContainer .aa-Form .aa-InputWrapper .aa-Input::-webkit-search-results-decoration{display:none}.aa-Autocomplete .aa-Form 
.aa-InputWrapperSuffix,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperSuffix{align-items:center;display:flex;order:4}.aa-Autocomplete .aa-Form .aa-InputWrapperSuffix .aa-ClearButton,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperSuffix .aa-ClearButton{align-items:center;background:none;border:0;color:#343a40;opacity:.8;cursor:pointer;display:flex;margin:0;width:calc(1.5em + .1rem + 2px)}.aa-Autocomplete .aa-Form .aa-InputWrapperSuffix .aa-ClearButton:hover,.aa-Autocomplete .aa-Form .aa-InputWrapperSuffix .aa-ClearButton:focus,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperSuffix .aa-ClearButton:hover,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperSuffix .aa-ClearButton:focus{color:#343a40;opacity:.8}.aa-Autocomplete .aa-Form .aa-InputWrapperSuffix .aa-ClearButton[hidden],.aa-DetachedFormContainer .aa-Form .aa-InputWrapperSuffix .aa-ClearButton[hidden]{display:none}.aa-Autocomplete .aa-Form .aa-InputWrapperSuffix .aa-ClearButton svg,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperSuffix .aa-ClearButton svg{width:calc(1.5em + 0.75rem + calc(1px * 2))}.aa-Autocomplete .aa-Form .aa-InputWrapperSuffix .aa-CopyButton,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperSuffix .aa-CopyButton{border:none;align-items:center;background:none;color:#343a40;opacity:.4;font-size:.7rem;cursor:pointer;display:none;margin:0;width:calc(1em + .1rem + 2px)}.aa-Autocomplete .aa-Form .aa-InputWrapperSuffix .aa-CopyButton:hover,.aa-Autocomplete .aa-Form .aa-InputWrapperSuffix .aa-CopyButton:focus,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperSuffix .aa-CopyButton:hover,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperSuffix .aa-CopyButton:focus{color:#343a40;opacity:.8}.aa-Autocomplete .aa-Form .aa-InputWrapperSuffix .aa-CopyButton[hidden],.aa-DetachedFormContainer .aa-Form .aa-InputWrapperSuffix 
.aa-CopyButton[hidden]{display:none}.aa-PanelLayout:empty{display:none}.quarto-search-no-results.no-query{display:none}.aa-Source:has(.no-query){display:none}#quarto-search-results .aa-Panel{border:solid #dee2e6 1px}#quarto-search-results .aa-SourceNoResults{width:398px}.aa-DetachedOverlay .aa-Panel,#quarto-search-results .aa-Panel{max-height:65vh;overflow-y:auto;font-size:.925rem}.aa-DetachedOverlay .aa-SourceNoResults,#quarto-search-results .aa-SourceNoResults{height:60px;display:flex;justify-content:center;align-items:center}.aa-DetachedOverlay .search-error,#quarto-search-results .search-error{padding-top:10px;padding-left:20px;padding-right:20px;cursor:default}.aa-DetachedOverlay .search-error .search-error-title,#quarto-search-results .search-error .search-error-title{font-size:1.1rem;margin-bottom:.5rem}.aa-DetachedOverlay .search-error .search-error-title .search-error-icon,#quarto-search-results .search-error .search-error-title .search-error-icon{margin-right:8px}.aa-DetachedOverlay .search-error .search-error-text,#quarto-search-results .search-error .search-error-text{font-weight:300}.aa-DetachedOverlay .search-result-text,#quarto-search-results .search-result-text{font-weight:300;overflow:hidden;text-overflow:ellipsis;display:-webkit-box;-webkit-line-clamp:2;-webkit-box-orient:vertical;line-height:1.2rem;max-height:2.4rem}.aa-DetachedOverlay .aa-SourceHeader .search-result-header,#quarto-search-results .aa-SourceHeader .search-result-header{font-size:.875rem;background-color:#f2f2f2;padding-left:14px;padding-bottom:4px;padding-top:4px}.aa-DetachedOverlay .aa-SourceHeader .search-result-header-no-results,#quarto-search-results .aa-SourceHeader .search-result-header-no-results{display:none}.aa-DetachedOverlay .aa-SourceFooter .algolia-search-logo,#quarto-search-results .aa-SourceFooter .algolia-search-logo{width:110px;opacity:.85;margin:8px;float:right}.aa-DetachedOverlay .search-result-section,#quarto-search-results 
.search-result-section{font-size:.925em}.aa-DetachedOverlay a.search-result-link,#quarto-search-results a.search-result-link{color:inherit;text-decoration:none}.aa-DetachedOverlay li.aa-Item[aria-selected=true] .search-item,#quarto-search-results li.aa-Item[aria-selected=true] .search-item{background-color:#2780e3}.aa-DetachedOverlay li.aa-Item[aria-selected=true] .search-item.search-result-more,.aa-DetachedOverlay li.aa-Item[aria-selected=true] .search-item .search-result-section,.aa-DetachedOverlay li.aa-Item[aria-selected=true] .search-item .search-result-text,.aa-DetachedOverlay li.aa-Item[aria-selected=true] .search-item .search-result-title-container,.aa-DetachedOverlay li.aa-Item[aria-selected=true] .search-item .search-result-text-container,#quarto-search-results li.aa-Item[aria-selected=true] .search-item.search-result-more,#quarto-search-results li.aa-Item[aria-selected=true] .search-item .search-result-section,#quarto-search-results li.aa-Item[aria-selected=true] .search-item .search-result-text,#quarto-search-results li.aa-Item[aria-selected=true] .search-item .search-result-title-container,#quarto-search-results li.aa-Item[aria-selected=true] .search-item .search-result-text-container{color:#fff;background-color:#2780e3}.aa-DetachedOverlay li.aa-Item[aria-selected=true] .search-item mark.search-match,.aa-DetachedOverlay li.aa-Item[aria-selected=true] .search-item .search-match.mark,#quarto-search-results li.aa-Item[aria-selected=true] .search-item mark.search-match,#quarto-search-results li.aa-Item[aria-selected=true] .search-item .search-match.mark{color:#fff;background-color:#4b95e8}.aa-DetachedOverlay li.aa-Item[aria-selected=false] .search-item,#quarto-search-results li.aa-Item[aria-selected=false] .search-item{background-color:#fff}.aa-DetachedOverlay li.aa-Item[aria-selected=false] .search-item.search-result-more,.aa-DetachedOverlay li.aa-Item[aria-selected=false] .search-item .search-result-section,.aa-DetachedOverlay 
li.aa-Item[aria-selected=false] .search-item .search-result-text,.aa-DetachedOverlay li.aa-Item[aria-selected=false] .search-item .search-result-title-container,.aa-DetachedOverlay li.aa-Item[aria-selected=false] .search-item .search-result-text-container,#quarto-search-results li.aa-Item[aria-selected=false] .search-item.search-result-more,#quarto-search-results li.aa-Item[aria-selected=false] .search-item .search-result-section,#quarto-search-results li.aa-Item[aria-selected=false] .search-item .search-result-text,#quarto-search-results li.aa-Item[aria-selected=false] .search-item .search-result-title-container,#quarto-search-results li.aa-Item[aria-selected=false] .search-item .search-result-text-container{color:#343a40}.aa-DetachedOverlay li.aa-Item[aria-selected=false] .search-item mark.search-match,.aa-DetachedOverlay li.aa-Item[aria-selected=false] .search-item .search-match.mark,#quarto-search-results li.aa-Item[aria-selected=false] .search-item mark.search-match,#quarto-search-results li.aa-Item[aria-selected=false] .search-item .search-match.mark{color:inherit;background-color:#e5effc}.aa-DetachedOverlay .aa-Item .search-result-doc:not(.document-selectable) .search-result-title-container,#quarto-search-results .aa-Item .search-result-doc:not(.document-selectable) .search-result-title-container{background-color:#fff;color:#343a40}.aa-DetachedOverlay .aa-Item .search-result-doc:not(.document-selectable) .search-result-text-container,#quarto-search-results .aa-Item .search-result-doc:not(.document-selectable) .search-result-text-container{padding-top:0px}.aa-DetachedOverlay li.aa-Item .search-result-doc.document-selectable .search-result-text-container,#quarto-search-results li.aa-Item .search-result-doc.document-selectable .search-result-text-container{margin-top:-4px}.aa-DetachedOverlay .aa-Item,#quarto-search-results .aa-Item{cursor:pointer}.aa-DetachedOverlay .aa-Item .search-item,#quarto-search-results .aa-Item 
.search-item{border-left:none;border-right:none;border-top:none;background-color:#fff;border-color:#dee2e6;color:#343a40}.aa-DetachedOverlay .aa-Item .search-item p,#quarto-search-results .aa-Item .search-item p{margin-top:0;margin-bottom:0}.aa-DetachedOverlay .aa-Item .search-item i.bi,#quarto-search-results .aa-Item .search-item i.bi{padding-left:8px;padding-right:8px;font-size:1.3em}.aa-DetachedOverlay .aa-Item .search-item .search-result-title,#quarto-search-results .aa-Item .search-item .search-result-title{margin-top:.3em;margin-bottom:0em}.aa-DetachedOverlay .aa-Item .search-item .search-result-crumbs,#quarto-search-results .aa-Item .search-item .search-result-crumbs{white-space:nowrap;text-overflow:ellipsis;font-size:.8em;font-weight:300;margin-right:1em}.aa-DetachedOverlay .aa-Item .search-item .search-result-crumbs:not(.search-result-crumbs-wrap),#quarto-search-results .aa-Item .search-item .search-result-crumbs:not(.search-result-crumbs-wrap){max-width:30%;margin-left:auto;margin-top:.5em;margin-bottom:.1rem}.aa-DetachedOverlay .aa-Item .search-item .search-result-crumbs.search-result-crumbs-wrap,#quarto-search-results .aa-Item .search-item .search-result-crumbs.search-result-crumbs-wrap{flex-basis:100%;margin-top:0em;margin-bottom:.2em;margin-left:37px}.aa-DetachedOverlay .aa-Item .search-result-title-container,#quarto-search-results .aa-Item .search-result-title-container{font-size:1em;display:flex;flex-wrap:wrap;padding:6px 4px 6px 4px}.aa-DetachedOverlay .aa-Item .search-result-text-container,#quarto-search-results .aa-Item .search-result-text-container{padding-bottom:8px;padding-right:8px;margin-left:42px}.aa-DetachedOverlay .aa-Item .search-result-doc-section,.aa-DetachedOverlay .aa-Item .search-result-more,#quarto-search-results .aa-Item .search-result-doc-section,#quarto-search-results .aa-Item .search-result-more{padding-top:8px;padding-bottom:8px;padding-left:44px}.aa-DetachedOverlay .aa-Item .search-result-more,#quarto-search-results .aa-Item 
.search-result-more{font-size:.8em;font-weight:400}.aa-DetachedOverlay .aa-Item .search-result-doc,#quarto-search-results .aa-Item .search-result-doc{border-top:1px solid #dee2e6}.aa-DetachedSearchButton{background:none;border:none}.aa-DetachedSearchButton .aa-DetachedSearchButtonPlaceholder{display:none}.navbar .aa-DetachedSearchButton .aa-DetachedSearchButtonIcon{color:#545555}.sidebar-tools-collapse #quarto-search,.sidebar-tools-main #quarto-search{display:inline}.sidebar-tools-collapse #quarto-search .aa-Autocomplete,.sidebar-tools-main #quarto-search .aa-Autocomplete{display:inline}.sidebar-tools-collapse #quarto-search .aa-DetachedSearchButton,.sidebar-tools-main #quarto-search .aa-DetachedSearchButton{padding-left:4px;padding-right:4px}.sidebar-tools-collapse #quarto-search .aa-DetachedSearchButton .aa-DetachedSearchButtonIcon,.sidebar-tools-main #quarto-search .aa-DetachedSearchButton .aa-DetachedSearchButtonIcon{color:#595959}.sidebar-tools-collapse #quarto-search .aa-DetachedSearchButton .aa-DetachedSearchButtonIcon .aa-SubmitIcon,.sidebar-tools-main #quarto-search .aa-DetachedSearchButton .aa-DetachedSearchButtonIcon .aa-SubmitIcon{margin-top:-3px}.aa-DetachedContainer{background:rgba(255,255,255,.65);width:90%;bottom:0;box-shadow:rgba(222,226,230,.6) 0 0 0 1px;outline:currentColor none medium;display:flex;flex-direction:column;left:0;margin:0;overflow:hidden;padding:0;position:fixed;right:0;top:0;z-index:1101}.aa-DetachedContainer::after{height:32px}.aa-DetachedContainer .aa-SourceHeader{margin:var(--aa-spacing-half) 0 var(--aa-spacing-half) 2px}.aa-DetachedContainer .aa-Panel{background-color:#fff;border-radius:0;box-shadow:none;flex-grow:1;margin:0;padding:0;position:relative}.aa-DetachedContainer .aa-PanelLayout{bottom:0;box-shadow:none;left:0;margin:0;max-height:none;overflow-y:auto;position:absolute;right:0;top:0;width:100%}.aa-DetachedFormContainer{background-color:#fff;border-bottom:1px solid 
#dee2e6;display:flex;flex-direction:row;justify-content:space-between;margin:0;padding:.5em}.aa-DetachedCancelButton{background:none;font-size:.8em;border:0;border-radius:3px;color:#343a40;cursor:pointer;margin:0 0 0 .5em;padding:0 .5em}.aa-DetachedCancelButton:hover,.aa-DetachedCancelButton:focus{box-shadow:rgba(39,128,227,.6) 0 0 0 1px;outline:currentColor none medium}.aa-DetachedContainer--modal{bottom:inherit;height:auto;margin:0 auto;position:absolute;top:100px;border-radius:6px;max-width:850px}@media(max-width: 575.98px){.aa-DetachedContainer--modal{width:100%;top:0px;border-radius:0px;border:none}}.aa-DetachedContainer--modal .aa-PanelLayout{max-height:var(--aa-detached-modal-max-height);padding-bottom:var(--aa-spacing-half);position:static}.aa-Detached{height:100vh;overflow:hidden}.aa-DetachedOverlay{background-color:rgba(52,58,64,.4);position:fixed;left:0;right:0;top:0;margin:0;padding:0;height:100vh;z-index:1100}.quarto-dashboard.nav-fixed.dashboard-sidebar #quarto-content.quarto-dashboard-content{padding:0em}.quarto-dashboard #quarto-content.quarto-dashboard-content{padding:1em}.quarto-dashboard #quarto-content.quarto-dashboard-content>*{padding-top:0}@media(min-width: 576px){.quarto-dashboard{height:100%}}.quarto-dashboard .card.valuebox.bslib-card.bg-primary{background-color:#5397e9 !important}.quarto-dashboard .card.valuebox.bslib-card.bg-secondary{background-color:#343a40 !important}.quarto-dashboard .card.valuebox.bslib-card.bg-success{background-color:#3aa716 !important}.quarto-dashboard .card.valuebox.bslib-card.bg-info{background-color:rgba(153,84,187,.7019607843) !important}.quarto-dashboard .card.valuebox.bslib-card.bg-warning{background-color:#fa6400 !important}.quarto-dashboard .card.valuebox.bslib-card.bg-danger{background-color:rgba(255,0,57,.7019607843) !important}.quarto-dashboard .card.valuebox.bslib-card.bg-light{background-color:#f8f9fa !important}.quarto-dashboard .card.valuebox.bslib-card.bg-dark{background-color:#343a40 
!important}.quarto-dashboard.dashboard-fill{display:flex;flex-direction:column}.quarto-dashboard #quarto-appendix{display:none}.quarto-dashboard #quarto-header #quarto-dashboard-header{border-top:solid 1px #dae0e5;border-bottom:solid 1px #dae0e5}.quarto-dashboard #quarto-header #quarto-dashboard-header>nav{padding-left:1em;padding-right:1em}.quarto-dashboard #quarto-header #quarto-dashboard-header>nav .navbar-brand-container{padding-left:0}.quarto-dashboard #quarto-header #quarto-dashboard-header .navbar-toggler{margin-right:0}.quarto-dashboard #quarto-header #quarto-dashboard-header .navbar-toggler-icon{height:1em;width:1em;background-image:url('data:image/svg+xml,')}.quarto-dashboard #quarto-header #quarto-dashboard-header .navbar-brand-container{padding-right:1em}.quarto-dashboard #quarto-header #quarto-dashboard-header .navbar-title{font-size:1.1em}.quarto-dashboard #quarto-header #quarto-dashboard-header .navbar-nav{font-size:.9em}.quarto-dashboard #quarto-dashboard-header .navbar{padding:0}.quarto-dashboard #quarto-dashboard-header .navbar .navbar-container{padding-left:1em}.quarto-dashboard #quarto-dashboard-header .navbar.slim .navbar-brand-container .nav-link,.quarto-dashboard #quarto-dashboard-header .navbar.slim .navbar-nav .nav-link{padding:.7em}.quarto-dashboard #quarto-dashboard-header .navbar .quarto-color-scheme-toggle{order:9}.quarto-dashboard #quarto-dashboard-header .navbar .navbar-toggler{margin-left:.5em;order:10}.quarto-dashboard #quarto-dashboard-header .navbar .navbar-nav .nav-link{padding:.5em;height:100%;display:flex;align-items:center}.quarto-dashboard #quarto-dashboard-header .navbar .navbar-nav .active{background-color:#e0e5e9}.quarto-dashboard #quarto-dashboard-header .navbar .navbar-brand-container{padding:.5em .5em .5em 0;display:flex;flex-direction:row;margin-right:2em;align-items:center}@media(max-width: 767.98px){.quarto-dashboard #quarto-dashboard-header .navbar .navbar-brand-container{margin-right:auto}}.quarto-dashboard 
#quarto-dashboard-header .navbar .navbar-collapse{align-self:stretch}@media(min-width: 768px){.quarto-dashboard #quarto-dashboard-header .navbar .navbar-collapse{order:8}}@media(max-width: 767.98px){.quarto-dashboard #quarto-dashboard-header .navbar .navbar-collapse{order:1000;padding-bottom:.5em}}.quarto-dashboard #quarto-dashboard-header .navbar .navbar-collapse .navbar-nav{align-self:stretch}.quarto-dashboard #quarto-dashboard-header .navbar .navbar-title{font-size:1.25em;line-height:1.1em;display:flex;flex-direction:row;flex-wrap:wrap;align-items:baseline}.quarto-dashboard #quarto-dashboard-header .navbar .navbar-title .navbar-title-text{margin-right:.4em}.quarto-dashboard #quarto-dashboard-header .navbar .navbar-title a{text-decoration:none;color:inherit}.quarto-dashboard #quarto-dashboard-header .navbar .navbar-subtitle,.quarto-dashboard #quarto-dashboard-header .navbar .navbar-author{font-size:.9rem;margin-right:.5em}.quarto-dashboard #quarto-dashboard-header .navbar .navbar-author{margin-left:auto}.quarto-dashboard #quarto-dashboard-header .navbar .navbar-logo{max-height:48px;min-height:30px;object-fit:cover;margin-right:1em}.quarto-dashboard #quarto-dashboard-header .navbar .quarto-dashboard-links{order:9;padding-right:1em}.quarto-dashboard #quarto-dashboard-header .navbar .quarto-dashboard-link-text{margin-left:.25em}.quarto-dashboard #quarto-dashboard-header .navbar .quarto-dashboard-link{padding-right:0em;padding-left:.7em;text-decoration:none;color:#545555}.quarto-dashboard .page-layout-custom .tab-content{padding:0;border:none}.quarto-dashboard-img-contain{height:100%;width:100%;object-fit:contain}@media(max-width: 575.98px){.quarto-dashboard .bslib-grid{grid-template-rows:minmax(1em, max-content) !important}.quarto-dashboard .sidebar-content{height:inherit}.quarto-dashboard .page-layout-custom{min-height:100vh}}.quarto-dashboard.dashboard-toolbar>.page-layout-custom,.quarto-dashboard.dashboard-sidebar>.page-layout-custom{padding:0}.quarto-dashboard 
.quarto-dashboard-content.quarto-dashboard-pages{padding:0}.quarto-dashboard .callout{margin-bottom:0;margin-top:0}.quarto-dashboard .html-fill-container figure{overflow:hidden}.quarto-dashboard bslib-tooltip .rounded-pill{border:solid #6c757d 1px}.quarto-dashboard bslib-tooltip .rounded-pill .svg{fill:#343a40}.quarto-dashboard .tabset .dashboard-card-no-title .nav-tabs{margin-left:0;margin-right:auto}.quarto-dashboard .tabset .tab-content{border:none}.quarto-dashboard .tabset .card-header .nav-link[role=tab]{margin-top:-6px;padding-top:6px;padding-bottom:6px}.quarto-dashboard .card.valuebox,.quarto-dashboard .card.bslib-value-box{min-height:3rem}.quarto-dashboard .card.valuebox .card-body,.quarto-dashboard .card.bslib-value-box .card-body{padding:0}.quarto-dashboard .bslib-value-box .value-box-value{font-size:clamp(.1em,15cqw,5em)}.quarto-dashboard .bslib-value-box .value-box-showcase .bi{font-size:clamp(.1em,max(18cqw,5.2cqh),5em);text-align:center;height:1em}.quarto-dashboard .bslib-value-box .value-box-showcase .bi::before{vertical-align:1em}.quarto-dashboard .bslib-value-box .value-box-area{margin-top:auto;margin-bottom:auto}.quarto-dashboard .card figure.quarto-float{display:flex;flex-direction:column;align-items:center}.quarto-dashboard .dashboard-scrolling{padding:1em}.quarto-dashboard .full-height{height:100%}.quarto-dashboard .showcase-bottom .value-box-grid{display:grid;grid-template-columns:1fr;grid-template-rows:1fr auto;grid-template-areas:"top" "bottom"}.quarto-dashboard .showcase-bottom .value-box-grid .value-box-showcase{grid-area:bottom;padding:0;margin:0}.quarto-dashboard .showcase-bottom .value-box-grid .value-box-showcase i.bi{font-size:4rem}.quarto-dashboard .showcase-bottom .value-box-grid .value-box-area{grid-area:top}.quarto-dashboard .tab-content{margin-bottom:0}.quarto-dashboard .bslib-card .bslib-navs-card-title{justify-content:stretch;align-items:end}.quarto-dashboard 
.card-header{display:flex;flex-wrap:wrap;justify-content:space-between}.quarto-dashboard .card-header .card-title{display:flex;flex-direction:column;justify-content:center;margin-bottom:0}.quarto-dashboard .tabset .card-toolbar{margin-bottom:1em}.quarto-dashboard .bslib-grid>.bslib-sidebar-layout{border:none;gap:var(--bslib-spacer, 1rem)}.quarto-dashboard .bslib-grid>.bslib-sidebar-layout>.main{padding:0}.quarto-dashboard .bslib-grid>.bslib-sidebar-layout>.sidebar{border-radius:.25rem;border:1px solid rgba(0,0,0,.175)}.quarto-dashboard .bslib-grid>.bslib-sidebar-layout>.collapse-toggle{display:none}@media(max-width: 767.98px){.quarto-dashboard .bslib-grid>.bslib-sidebar-layout{grid-template-columns:1fr;grid-template-rows:max-content 1fr}.quarto-dashboard .bslib-grid>.bslib-sidebar-layout>.main{grid-column:1;grid-row:2}.quarto-dashboard .bslib-grid>.bslib-sidebar-layout .sidebar{grid-column:1;grid-row:1}}.quarto-dashboard .sidebar-right .sidebar{padding-left:2.5em}.quarto-dashboard .sidebar-right .collapse-toggle{left:2px}.quarto-dashboard .quarto-dashboard .sidebar-right button.collapse-toggle:not(.transitioning){left:unset}.quarto-dashboard aside.sidebar{padding-left:1em;padding-right:1em;background-color:rgba(52,58,64,.25);color:#343a40}.quarto-dashboard .bslib-sidebar-layout>div.main{padding:.7em}.quarto-dashboard .bslib-sidebar-layout button.collapse-toggle{margin-top:.3em}.quarto-dashboard .bslib-sidebar-layout .collapse-toggle{top:0}.quarto-dashboard .bslib-sidebar-layout.sidebar-collapsed:not(.transitioning):not(.sidebar-right) .collapse-toggle{left:2px}.quarto-dashboard .sidebar>section>.h3:first-of-type{margin-top:0em}.quarto-dashboard .sidebar .h3,.quarto-dashboard .sidebar .h4,.quarto-dashboard .sidebar .h5,.quarto-dashboard .sidebar .h6{margin-top:.5em}.quarto-dashboard .sidebar form{flex-direction:column;align-items:start;margin-bottom:1em}.quarto-dashboard .sidebar form div[class*=oi-][class$=-input]{flex-direction:column}.quarto-dashboard .sidebar 
form[class*=oi-][class$=-toggle]{flex-direction:row-reverse;align-items:center;justify-content:start}.quarto-dashboard .sidebar form input[type=range]{margin-top:.5em;margin-right:.8em;margin-left:1em}.quarto-dashboard .sidebar label{width:fit-content}.quarto-dashboard .sidebar .card-body{margin-bottom:2em}.quarto-dashboard .sidebar .shiny-input-container{margin-bottom:1em}.quarto-dashboard .sidebar .shiny-options-group{margin-top:0}.quarto-dashboard .sidebar .control-label{margin-bottom:.3em}.quarto-dashboard .card .card-body .quarto-layout-row{align-items:stretch}.quarto-dashboard .toolbar{font-size:.9em;display:flex;flex-direction:row;border-top:solid 1px #bcbfc0;padding:1em;flex-wrap:wrap;background-color:rgba(52,58,64,.25)}.quarto-dashboard .toolbar .cell-output-display{display:flex}.quarto-dashboard .toolbar .shiny-input-container{padding-bottom:.5em;margin-bottom:.5em;width:inherit}.quarto-dashboard .toolbar .shiny-input-container>.checkbox:first-child{margin-top:6px}.quarto-dashboard .toolbar>*:last-child{margin-right:0}.quarto-dashboard .toolbar>*>*{margin-right:1em;align-items:baseline}.quarto-dashboard .toolbar>*>*>a{text-decoration:none;margin-top:auto;margin-bottom:auto}.quarto-dashboard .toolbar .shiny-input-container{padding-bottom:0;margin-bottom:0}.quarto-dashboard .toolbar .shiny-input-container>*{flex-shrink:0;flex-grow:0}.quarto-dashboard .toolbar .form-group.shiny-input-container:not([role=group])>label{margin-bottom:0}.quarto-dashboard .toolbar .shiny-input-container.no-baseline{align-items:start;padding-top:6px}.quarto-dashboard .toolbar .shiny-input-container{display:flex;align-items:baseline}.quarto-dashboard .toolbar .shiny-input-container label{padding-right:.4em}.quarto-dashboard .toolbar .shiny-input-container .bslib-input-switch{margin-top:6px}.quarto-dashboard .toolbar input[type=text]{line-height:1;width:inherit}.quarto-dashboard .toolbar .input-daterange{width:inherit}.quarto-dashboard .toolbar .input-daterange 
input[type=text]{height:2.4em;width:10em}.quarto-dashboard .toolbar .input-daterange .input-group-addon{height:auto;padding:0;margin-left:-5px !important;margin-right:-5px}.quarto-dashboard .toolbar .input-daterange .input-group-addon .input-group-text{padding-top:0;padding-bottom:0;height:100%}.quarto-dashboard .toolbar span.irs.irs--shiny{width:10em}.quarto-dashboard .toolbar span.irs.irs--shiny .irs-line{top:9px}.quarto-dashboard .toolbar span.irs.irs--shiny .irs-min,.quarto-dashboard .toolbar span.irs.irs--shiny .irs-max,.quarto-dashboard .toolbar span.irs.irs--shiny .irs-from,.quarto-dashboard .toolbar span.irs.irs--shiny .irs-to,.quarto-dashboard .toolbar span.irs.irs--shiny .irs-single{top:20px}.quarto-dashboard .toolbar span.irs.irs--shiny .irs-bar{top:8px}.quarto-dashboard .toolbar span.irs.irs--shiny .irs-handle{top:0px}.quarto-dashboard .toolbar .shiny-input-checkboxgroup>label{margin-top:6px}.quarto-dashboard .toolbar .shiny-input-checkboxgroup>.shiny-options-group{margin-top:0;align-items:baseline}.quarto-dashboard .toolbar .shiny-input-radiogroup>label{margin-top:6px}.quarto-dashboard .toolbar .shiny-input-radiogroup>.shiny-options-group{align-items:baseline;margin-top:0}.quarto-dashboard .toolbar .shiny-input-radiogroup>.shiny-options-group>.radio{margin-right:.3em}.quarto-dashboard .toolbar .form-select{padding-top:.2em;padding-bottom:.2em}.quarto-dashboard .toolbar .shiny-input-select{min-width:6em}.quarto-dashboard .toolbar div.checkbox{margin-bottom:0px}.quarto-dashboard .toolbar>.checkbox:first-child{margin-top:6px}.quarto-dashboard .toolbar form{width:fit-content}.quarto-dashboard .toolbar form label{padding-top:.2em;padding-bottom:.2em;width:fit-content}.quarto-dashboard .toolbar form input[type=date]{width:fit-content}.quarto-dashboard .toolbar form input[type=color]{width:3em}.quarto-dashboard .toolbar form button{padding:.4em}.quarto-dashboard .toolbar form select{width:fit-content}.quarto-dashboard 
.toolbar>*{font-size:.9em;flex-grow:0}.quarto-dashboard .toolbar .shiny-input-container label{margin-bottom:1px}.quarto-dashboard .toolbar-bottom{margin-top:1em;margin-bottom:0 !important;order:2}.quarto-dashboard .quarto-dashboard-content>.dashboard-toolbar-container>.toolbar-content>.tab-content>.tab-pane>*:not(.bslib-sidebar-layout){padding:1em}.quarto-dashboard .quarto-dashboard-content>.dashboard-toolbar-container>.toolbar-content>*:not(.tab-content){padding:1em}.quarto-dashboard .quarto-dashboard-content>.tab-content>.dashboard-page>.dashboard-toolbar-container>.toolbar-content,.quarto-dashboard .quarto-dashboard-content>.tab-content>.dashboard-page:not(.dashboard-sidebar-container)>*:not(.dashboard-toolbar-container){padding:1em}.quarto-dashboard .toolbar-content{padding:0}.quarto-dashboard .quarto-dashboard-content.quarto-dashboard-pages .tab-pane>.dashboard-toolbar-container .toolbar{border-radius:0;margin-bottom:0}.quarto-dashboard .dashboard-toolbar-container.toolbar-toplevel .toolbar{border-bottom:1px solid rgba(0,0,0,.175)}.quarto-dashboard .dashboard-toolbar-container.toolbar-toplevel .toolbar-bottom{margin-top:0}.quarto-dashboard .dashboard-toolbar-container:not(.toolbar-toplevel) .toolbar{margin-bottom:1em;border-top:none;border-radius:.25rem;border:1px solid rgba(0,0,0,.175)}.quarto-dashboard .vega-embed.has-actions details{width:1.7em;height:2em;position:absolute !important;top:0;right:0}.quarto-dashboard .dashboard-toolbar-container{padding:0}.quarto-dashboard .card .card-header p:last-child,.quarto-dashboard .card .card-footer p:last-child{margin-bottom:0}.quarto-dashboard .card .card-body>.h4:first-child{margin-top:0}.quarto-dashboard .card .card-body{z-index:4}@media(max-width: 767.98px){.quarto-dashboard .card .card-body .itables div.dataTables_wrapper div.dataTables_length,.quarto-dashboard .card .card-body .itables div.dataTables_wrapper div.dataTables_info,.quarto-dashboard .card .card-body .itables div.dataTables_wrapper 
div.dataTables_paginate{text-align:initial}.quarto-dashboard .card .card-body .itables div.dataTables_wrapper div.dataTables_filter{text-align:right}.quarto-dashboard .card .card-body .itables div.dataTables_wrapper div.dataTables_paginate ul.pagination{justify-content:initial}}.quarto-dashboard .card .card-body .itables .dataTables_wrapper{display:flex;flex-wrap:wrap;justify-content:space-between;align-items:center;padding-top:0}.quarto-dashboard .card .card-body .itables .dataTables_wrapper table{flex-shrink:0}.quarto-dashboard .card .card-body .itables .dataTables_wrapper .dt-buttons{margin-bottom:.5em;margin-left:auto;width:fit-content;float:right}.quarto-dashboard .card .card-body .itables .dataTables_wrapper .dt-buttons.btn-group{background:#fff;border:none}.quarto-dashboard .card .card-body .itables .dataTables_wrapper .dt-buttons .btn-secondary{background-color:#fff;background-image:none;border:solid #dee2e6 1px;padding:.2em .7em}.quarto-dashboard .card .card-body .itables .dataTables_wrapper .dt-buttons .btn span{font-size:.8em;color:#343a40}.quarto-dashboard .card .card-body .itables .dataTables_wrapper .dataTables_info{margin-left:.5em;margin-bottom:.5em;padding-top:0}@media(min-width: 768px){.quarto-dashboard .card .card-body .itables .dataTables_wrapper .dataTables_info{font-size:.875em}}@media(max-width: 767.98px){.quarto-dashboard .card .card-body .itables .dataTables_wrapper .dataTables_info{font-size:.8em}}.quarto-dashboard .card .card-body .itables .dataTables_wrapper .dataTables_filter{margin-bottom:.5em;font-size:.875em}.quarto-dashboard .card .card-body .itables .dataTables_wrapper .dataTables_filter input[type=search]{padding:1px 5px 1px 5px;font-size:.875em}.quarto-dashboard .card .card-body .itables .dataTables_wrapper .dataTables_length{flex-basis:1 1 50%;margin-bottom:.5em;font-size:.875em}.quarto-dashboard .card .card-body .itables .dataTables_wrapper .dataTables_length select{padding:.4em 3em .4em 
.5em;font-size:.875em;margin-left:.2em;margin-right:.2em}.quarto-dashboard .card .card-body .itables .dataTables_wrapper .dataTables_paginate{flex-shrink:0}@media(min-width: 768px){.quarto-dashboard .card .card-body .itables .dataTables_wrapper .dataTables_paginate{margin-left:auto}}.quarto-dashboard .card .card-body .itables .dataTables_wrapper .dataTables_paginate ul.pagination .paginate_button .page-link{font-size:.8em}.quarto-dashboard .card .card-footer{font-size:.9em}.quarto-dashboard .card .card-toolbar{display:flex;flex-grow:1;flex-direction:row;width:100%;flex-wrap:wrap}.quarto-dashboard .card .card-toolbar>*{font-size:.8em;flex-grow:0}.quarto-dashboard .card .card-toolbar>.card-title{font-size:1em;flex-grow:1;align-self:flex-start;margin-top:.1em}.quarto-dashboard .card .card-toolbar .cell-output-display{display:flex}.quarto-dashboard .card .card-toolbar .shiny-input-container{padding-bottom:.5em;margin-bottom:.5em;width:inherit}.quarto-dashboard .card .card-toolbar .shiny-input-container>.checkbox:first-child{margin-top:6px}.quarto-dashboard .card .card-toolbar>*:last-child{margin-right:0}.quarto-dashboard .card .card-toolbar>*>*{margin-right:1em;align-items:baseline}.quarto-dashboard .card .card-toolbar>*>*>a{text-decoration:none;margin-top:auto;margin-bottom:auto}.quarto-dashboard .card .card-toolbar form{width:fit-content}.quarto-dashboard .card .card-toolbar form label{padding-top:.2em;padding-bottom:.2em;width:fit-content}.quarto-dashboard .card .card-toolbar form input[type=date]{width:fit-content}.quarto-dashboard .card .card-toolbar form input[type=color]{width:3em}.quarto-dashboard .card .card-toolbar form button{padding:.4em}.quarto-dashboard .card .card-toolbar form select{width:fit-content}.quarto-dashboard .card .card-toolbar .cell-output-display{display:flex}.quarto-dashboard .card .card-toolbar .shiny-input-container{padding-bottom:.5em;margin-bottom:.5em;width:inherit}.quarto-dashboard .card .card-toolbar 
.shiny-input-container>.checkbox:first-child{margin-top:6px}.quarto-dashboard .card .card-toolbar>*:last-child{margin-right:0}.quarto-dashboard .card .card-toolbar>*>*{margin-right:1em;align-items:baseline}.quarto-dashboard .card .card-toolbar>*>*>a{text-decoration:none;margin-top:auto;margin-bottom:auto}.quarto-dashboard .card .card-toolbar .shiny-input-container{padding-bottom:0;margin-bottom:0}.quarto-dashboard .card .card-toolbar .shiny-input-container>*{flex-shrink:0;flex-grow:0}.quarto-dashboard .card .card-toolbar .form-group.shiny-input-container:not([role=group])>label{margin-bottom:0}.quarto-dashboard .card .card-toolbar .shiny-input-container.no-baseline{align-items:start;padding-top:6px}.quarto-dashboard .card .card-toolbar .shiny-input-container{display:flex;align-items:baseline}.quarto-dashboard .card .card-toolbar .shiny-input-container label{padding-right:.4em}.quarto-dashboard .card .card-toolbar .shiny-input-container .bslib-input-switch{margin-top:6px}.quarto-dashboard .card .card-toolbar input[type=text]{line-height:1;width:inherit}.quarto-dashboard .card .card-toolbar .input-daterange{width:inherit}.quarto-dashboard .card .card-toolbar .input-daterange input[type=text]{height:2.4em;width:10em}.quarto-dashboard .card .card-toolbar .input-daterange .input-group-addon{height:auto;padding:0;margin-left:-5px !important;margin-right:-5px}.quarto-dashboard .card .card-toolbar .input-daterange .input-group-addon .input-group-text{padding-top:0;padding-bottom:0;height:100%}.quarto-dashboard .card .card-toolbar span.irs.irs--shiny{width:10em}.quarto-dashboard .card .card-toolbar span.irs.irs--shiny .irs-line{top:9px}.quarto-dashboard .card .card-toolbar span.irs.irs--shiny .irs-min,.quarto-dashboard .card .card-toolbar span.irs.irs--shiny .irs-max,.quarto-dashboard .card .card-toolbar span.irs.irs--shiny .irs-from,.quarto-dashboard .card .card-toolbar span.irs.irs--shiny .irs-to,.quarto-dashboard .card .card-toolbar span.irs.irs--shiny 
.irs-single{top:20px}.quarto-dashboard .card .card-toolbar span.irs.irs--shiny .irs-bar{top:8px}.quarto-dashboard .card .card-toolbar span.irs.irs--shiny .irs-handle{top:0px}.quarto-dashboard .card .card-toolbar .shiny-input-checkboxgroup>label{margin-top:6px}.quarto-dashboard .card .card-toolbar .shiny-input-checkboxgroup>.shiny-options-group{margin-top:0;align-items:baseline}.quarto-dashboard .card .card-toolbar .shiny-input-radiogroup>label{margin-top:6px}.quarto-dashboard .card .card-toolbar .shiny-input-radiogroup>.shiny-options-group{align-items:baseline;margin-top:0}.quarto-dashboard .card .card-toolbar .shiny-input-radiogroup>.shiny-options-group>.radio{margin-right:.3em}.quarto-dashboard .card .card-toolbar .form-select{padding-top:.2em;padding-bottom:.2em}.quarto-dashboard .card .card-toolbar .shiny-input-select{min-width:6em}.quarto-dashboard .card .card-toolbar div.checkbox{margin-bottom:0px}.quarto-dashboard .card .card-toolbar>.checkbox:first-child{margin-top:6px}.quarto-dashboard .card-body>table>thead{border-top:none}.quarto-dashboard .card-body>.table>:not(caption)>*>*{background-color:#fff}.tableFloatingHeaderOriginal{background-color:#fff;position:sticky !important;top:0 !important}.dashboard-data-table{margin-top:-1px}div.value-box-area span.observablehq--number{font-size:calc(clamp(.1em,15cqw,5em)*1.25);line-height:1.2;color:inherit;font-family:var(--bs-body-font-family)}.quarto-listing{padding-bottom:1em}.listing-pagination{padding-top:.5em}ul.pagination{float:right;padding-left:8px;padding-top:.5em}ul.pagination li{padding-right:.75em}ul.pagination li.disabled a,ul.pagination li.active a{color:#fff;text-decoration:none}ul.pagination li:last-of-type{padding-right:0}.listing-actions-group{display:flex}.quarto-listing-filter{margin-bottom:1em;width:200px;margin-left:auto}.quarto-listing-sort{margin-bottom:1em;margin-right:auto;width:auto}.quarto-listing-sort 
.input-group-text{font-size:.8em}.input-group-text{border-right:none}.quarto-listing-sort select.form-select{font-size:.8em}.listing-no-matching{text-align:center;padding-top:2em;padding-bottom:3em;font-size:1em}#quarto-margin-sidebar .quarto-listing-category{padding-top:0;font-size:1rem}#quarto-margin-sidebar .quarto-listing-category-title{cursor:pointer;font-weight:600;font-size:1rem}.quarto-listing-category .category{cursor:pointer}.quarto-listing-category .category.active{font-weight:600}.quarto-listing-category.category-cloud{display:flex;flex-wrap:wrap;align-items:baseline}.quarto-listing-category.category-cloud .category{padding-right:5px}.quarto-listing-category.category-cloud .category-cloud-1{font-size:.75em}.quarto-listing-category.category-cloud .category-cloud-2{font-size:.95em}.quarto-listing-category.category-cloud .category-cloud-3{font-size:1.15em}.quarto-listing-category.category-cloud .category-cloud-4{font-size:1.35em}.quarto-listing-category.category-cloud .category-cloud-5{font-size:1.55em}.quarto-listing-category.category-cloud .category-cloud-6{font-size:1.75em}.quarto-listing-category.category-cloud .category-cloud-7{font-size:1.95em}.quarto-listing-category.category-cloud .category-cloud-8{font-size:2.15em}.quarto-listing-category.category-cloud .category-cloud-9{font-size:2.35em}.quarto-listing-category.category-cloud .category-cloud-10{font-size:2.55em}.quarto-listing-cols-1{grid-template-columns:repeat(1, minmax(0, 1fr));gap:1.5em}@media(max-width: 767.98px){.quarto-listing-cols-1{grid-template-columns:repeat(1, minmax(0, 1fr));gap:1.5em}}@media(max-width: 575.98px){.quarto-listing-cols-1{grid-template-columns:minmax(0, 1fr);gap:1.5em}}.quarto-listing-cols-2{grid-template-columns:repeat(2, minmax(0, 1fr));gap:1.5em}@media(max-width: 767.98px){.quarto-listing-cols-2{grid-template-columns:repeat(2, minmax(0, 1fr));gap:1.5em}}@media(max-width: 575.98px){.quarto-listing-cols-2{grid-template-columns:minmax(0, 
1fr);gap:1.5em}}.quarto-listing-cols-3{grid-template-columns:repeat(3, minmax(0, 1fr));gap:1.5em}@media(max-width: 767.98px){.quarto-listing-cols-3{grid-template-columns:repeat(2, minmax(0, 1fr));gap:1.5em}}@media(max-width: 575.98px){.quarto-listing-cols-3{grid-template-columns:minmax(0, 1fr);gap:1.5em}}.quarto-listing-cols-4{grid-template-columns:repeat(4, minmax(0, 1fr));gap:1.5em}@media(max-width: 767.98px){.quarto-listing-cols-4{grid-template-columns:repeat(2, minmax(0, 1fr));gap:1.5em}}@media(max-width: 575.98px){.quarto-listing-cols-4{grid-template-columns:minmax(0, 1fr);gap:1.5em}}.quarto-listing-cols-5{grid-template-columns:repeat(5, minmax(0, 1fr));gap:1.5em}@media(max-width: 767.98px){.quarto-listing-cols-5{grid-template-columns:repeat(2, minmax(0, 1fr));gap:1.5em}}@media(max-width: 575.98px){.quarto-listing-cols-5{grid-template-columns:minmax(0, 1fr);gap:1.5em}}.quarto-listing-cols-6{grid-template-columns:repeat(6, minmax(0, 1fr));gap:1.5em}@media(max-width: 767.98px){.quarto-listing-cols-6{grid-template-columns:repeat(2, minmax(0, 1fr));gap:1.5em}}@media(max-width: 575.98px){.quarto-listing-cols-6{grid-template-columns:minmax(0, 1fr);gap:1.5em}}.quarto-listing-cols-7{grid-template-columns:repeat(7, minmax(0, 1fr));gap:1.5em}@media(max-width: 767.98px){.quarto-listing-cols-7{grid-template-columns:repeat(2, minmax(0, 1fr));gap:1.5em}}@media(max-width: 575.98px){.quarto-listing-cols-7{grid-template-columns:minmax(0, 1fr);gap:1.5em}}.quarto-listing-cols-8{grid-template-columns:repeat(8, minmax(0, 1fr));gap:1.5em}@media(max-width: 767.98px){.quarto-listing-cols-8{grid-template-columns:repeat(2, minmax(0, 1fr));gap:1.5em}}@media(max-width: 575.98px){.quarto-listing-cols-8{grid-template-columns:minmax(0, 1fr);gap:1.5em}}.quarto-listing-cols-9{grid-template-columns:repeat(9, minmax(0, 1fr));gap:1.5em}@media(max-width: 767.98px){.quarto-listing-cols-9{grid-template-columns:repeat(2, minmax(0, 1fr));gap:1.5em}}@media(max-width: 
575.98px){.quarto-listing-cols-9{grid-template-columns:minmax(0, 1fr);gap:1.5em}}.quarto-listing-cols-10{grid-template-columns:repeat(10, minmax(0, 1fr));gap:1.5em}@media(max-width: 767.98px){.quarto-listing-cols-10{grid-template-columns:repeat(2, minmax(0, 1fr));gap:1.5em}}@media(max-width: 575.98px){.quarto-listing-cols-10{grid-template-columns:minmax(0, 1fr);gap:1.5em}}.quarto-listing-cols-11{grid-template-columns:repeat(11, minmax(0, 1fr));gap:1.5em}@media(max-width: 767.98px){.quarto-listing-cols-11{grid-template-columns:repeat(2, minmax(0, 1fr));gap:1.5em}}@media(max-width: 575.98px){.quarto-listing-cols-11{grid-template-columns:minmax(0, 1fr);gap:1.5em}}.quarto-listing-cols-12{grid-template-columns:repeat(12, minmax(0, 1fr));gap:1.5em}@media(max-width: 767.98px){.quarto-listing-cols-12{grid-template-columns:repeat(2, minmax(0, 1fr));gap:1.5em}}@media(max-width: 575.98px){.quarto-listing-cols-12{grid-template-columns:minmax(0, 1fr);gap:1.5em}}.quarto-listing-grid{gap:1.5em}.quarto-grid-item.borderless{border:none}.quarto-grid-item.borderless .listing-categories .listing-category:last-of-type,.quarto-grid-item.borderless .listing-categories .listing-category:first-of-type{padding-left:0}.quarto-grid-item.borderless .listing-categories .listing-category{border:0}.quarto-grid-link{text-decoration:none;color:inherit}.quarto-grid-link:hover{text-decoration:none;color:inherit}.quarto-grid-item h5.title,.quarto-grid-item .title.h5{margin-top:0;margin-bottom:0}.quarto-grid-item .card-footer{display:flex;justify-content:space-between;font-size:.8em}.quarto-grid-item .card-footer p{margin-bottom:0}.quarto-grid-item p.card-img-top{margin-bottom:0}.quarto-grid-item p.card-img-top>img{object-fit:cover}.quarto-grid-item .card-other-values{margin-top:.5em;font-size:.8em}.quarto-grid-item .card-other-values tr{margin-bottom:.5em}.quarto-grid-item .card-other-values tr>td:first-of-type{font-weight:600;padding-right:1em;padding-left:1em;vertical-align:top}.quarto-grid-item 
div.post-contents{display:flex;flex-direction:column;text-decoration:none;height:100%}.quarto-grid-item .listing-item-img-placeholder{background-color:rgba(52,58,64,.25);flex-shrink:0}.quarto-grid-item .card-attribution{padding-top:1em;display:flex;gap:1em;text-transform:uppercase;color:#6c757d;font-weight:500;flex-grow:10;align-items:flex-end}.quarto-grid-item .description{padding-bottom:1em}.quarto-grid-item .card-attribution .date{align-self:flex-end}.quarto-grid-item .card-attribution.justify{justify-content:space-between}.quarto-grid-item .card-attribution.start{justify-content:flex-start}.quarto-grid-item .card-attribution.end{justify-content:flex-end}.quarto-grid-item .card-title{margin-bottom:.1em}.quarto-grid-item .card-subtitle{padding-top:.25em}.quarto-grid-item .card-text{font-size:.9em}.quarto-grid-item .listing-reading-time{padding-bottom:.25em}.quarto-grid-item .card-text-small{font-size:.8em}.quarto-grid-item .card-subtitle.subtitle{font-size:.9em;font-weight:600;padding-bottom:.5em}.quarto-grid-item .listing-categories{display:flex;flex-wrap:wrap;padding-bottom:5px}.quarto-grid-item .listing-categories .listing-category{color:#6c757d;border:solid 1px #dee2e6;border-radius:.25rem;text-transform:uppercase;font-size:.65em;padding-left:.5em;padding-right:.5em;padding-top:.15em;padding-bottom:.15em;cursor:pointer;margin-right:4px;margin-bottom:4px}.quarto-grid-item.card-right{text-align:right}.quarto-grid-item.card-right .listing-categories{justify-content:flex-end}.quarto-grid-item.card-left{text-align:left}.quarto-grid-item.card-center{text-align:center}.quarto-grid-item.card-center .listing-description{text-align:justify}.quarto-grid-item.card-center .listing-categories{justify-content:center}table.quarto-listing-table td.image{padding:0px}table.quarto-listing-table td.image img{width:100%;max-width:50px;object-fit:contain}table.quarto-listing-table a{text-decoration:none;word-break:keep-all}table.quarto-listing-table th 
a{color:inherit}table.quarto-listing-table th a.asc:after{margin-bottom:-2px;margin-left:5px;display:inline-block;height:1rem;width:1rem;background-repeat:no-repeat;background-size:1rem 1rem;background-image:url('data:image/svg+xml,');content:""}table.quarto-listing-table th a.desc:after{margin-bottom:-2px;margin-left:5px;display:inline-block;height:1rem;width:1rem;background-repeat:no-repeat;background-size:1rem 1rem;background-image:url('data:image/svg+xml,');content:""}table.quarto-listing-table.table-hover td{cursor:pointer}.quarto-post.image-left{flex-direction:row}.quarto-post.image-right{flex-direction:row-reverse}@media(max-width: 767.98px){.quarto-post.image-right,.quarto-post.image-left{gap:0em;flex-direction:column}.quarto-post .metadata{padding-bottom:1em;order:2}.quarto-post .body{order:1}.quarto-post .thumbnail{order:3}}.list.quarto-listing-default div:last-of-type{border-bottom:none}@media(min-width: 992px){.quarto-listing-container-default{margin-right:2em}}div.quarto-post{display:flex;gap:2em;margin-bottom:1.5em;border-bottom:1px solid #dee2e6}@media(max-width: 767.98px){div.quarto-post{padding-bottom:1em}}div.quarto-post .metadata{flex-basis:20%;flex-grow:0;margin-top:.2em;flex-shrink:10}div.quarto-post .thumbnail{flex-basis:30%;flex-grow:0;flex-shrink:0}div.quarto-post .thumbnail img{margin-top:.4em;width:100%;object-fit:cover}div.quarto-post .body{flex-basis:45%;flex-grow:1;flex-shrink:0}div.quarto-post .body h3.listing-title,div.quarto-post .body .listing-title.h3{margin-top:0px;margin-bottom:0px;border-bottom:none}div.quarto-post .body .listing-subtitle{font-size:.875em;margin-bottom:.5em;margin-top:.2em}div.quarto-post .body .description{font-size:.9em}div.quarto-post .body pre code{white-space:pre-wrap}div.quarto-post a{color:#343a40;text-decoration:none}div.quarto-post .metadata{display:flex;flex-direction:column;font-size:.8em;font-family:"Source Sans Pro",-apple-system,BlinkMacSystemFont,"Segoe UI",Roboto,"Helvetica 
Neue",Arial,sans-serif,"Apple Color Emoji","Segoe UI Emoji","Segoe UI Symbol";flex-basis:33%}div.quarto-post .listing-categories{display:flex;flex-wrap:wrap;padding-bottom:5px}div.quarto-post .listing-categories .listing-category{color:#6c757d;border:solid 1px #dee2e6;border-radius:.25rem;text-transform:uppercase;font-size:.65em;padding-left:.5em;padding-right:.5em;padding-top:.15em;padding-bottom:.15em;cursor:pointer;margin-right:4px;margin-bottom:4px}div.quarto-post .listing-description{margin-bottom:.5em}div.quarto-about-jolla{display:flex !important;flex-direction:column;align-items:center;margin-top:10%;padding-bottom:1em}div.quarto-about-jolla .about-image{object-fit:cover;margin-left:auto;margin-right:auto;margin-bottom:1.5em}div.quarto-about-jolla img.round{border-radius:50%}div.quarto-about-jolla img.rounded{border-radius:10px}div.quarto-about-jolla .quarto-title h1.title,div.quarto-about-jolla .quarto-title .title.h1{text-align:center}div.quarto-about-jolla .quarto-title .description{text-align:center}div.quarto-about-jolla h2,div.quarto-about-jolla .h2{border-bottom:none}div.quarto-about-jolla .about-sep{width:60%}div.quarto-about-jolla main{text-align:center}div.quarto-about-jolla .about-links{display:flex}@media(min-width: 992px){div.quarto-about-jolla .about-links{flex-direction:row;column-gap:.8em;row-gap:15px;flex-wrap:wrap}}@media(max-width: 991.98px){div.quarto-about-jolla .about-links{flex-direction:column;row-gap:1em;width:100%;padding-bottom:1.5em}}div.quarto-about-jolla .about-link{color:#626d78;text-decoration:none;border:solid 1px}@media(min-width: 992px){div.quarto-about-jolla .about-link{font-size:.8em;padding:.25em .5em;border-radius:4px}}@media(max-width: 991.98px){div.quarto-about-jolla .about-link{font-size:1.1em;padding:.5em .5em;text-align:center;border-radius:6px}}div.quarto-about-jolla .about-link:hover{color:#2761e3}div.quarto-about-jolla .about-link i.bi{margin-right:.15em}div.quarto-about-solana{display:flex 
!important;flex-direction:column;padding-top:3em !important;padding-bottom:1em}div.quarto-about-solana .about-entity{display:flex !important;align-items:start;justify-content:space-between}@media(min-width: 992px){div.quarto-about-solana .about-entity{flex-direction:row}}@media(max-width: 991.98px){div.quarto-about-solana .about-entity{flex-direction:column-reverse;align-items:center;text-align:center}}div.quarto-about-solana .about-entity .entity-contents{display:flex;flex-direction:column}@media(max-width: 767.98px){div.quarto-about-solana .about-entity .entity-contents{width:100%}}div.quarto-about-solana .about-entity .about-image{object-fit:cover}@media(max-width: 991.98px){div.quarto-about-solana .about-entity .about-image{margin-bottom:1.5em}}div.quarto-about-solana .about-entity img.round{border-radius:50%}div.quarto-about-solana .about-entity img.rounded{border-radius:10px}div.quarto-about-solana .about-entity .about-links{display:flex;justify-content:left;padding-bottom:1.2em}@media(min-width: 992px){div.quarto-about-solana .about-entity .about-links{flex-direction:row;column-gap:.8em;row-gap:15px;flex-wrap:wrap}}@media(max-width: 991.98px){div.quarto-about-solana .about-entity .about-links{flex-direction:column;row-gap:1em;width:100%;padding-bottom:1.5em}}div.quarto-about-solana .about-entity .about-link{color:#626d78;text-decoration:none;border:solid 1px}@media(min-width: 992px){div.quarto-about-solana .about-entity .about-link{font-size:.8em;padding:.25em .5em;border-radius:4px}}@media(max-width: 991.98px){div.quarto-about-solana .about-entity .about-link{font-size:1.1em;padding:.5em .5em;text-align:center;border-radius:6px}}div.quarto-about-solana .about-entity .about-link:hover{color:#2761e3}div.quarto-about-solana .about-entity .about-link i.bi{margin-right:.15em}div.quarto-about-solana .about-contents{padding-right:1.5em;flex-basis:0;flex-grow:1}div.quarto-about-solana .about-contents main.content{margin-top:0}div.quarto-about-solana .about-contents 
h2,div.quarto-about-solana .about-contents .h2{border-bottom:none}div.quarto-about-trestles{display:flex !important;flex-direction:row;padding-top:3em !important;padding-bottom:1em}@media(max-width: 991.98px){div.quarto-about-trestles{flex-direction:column;padding-top:0em !important}}div.quarto-about-trestles .about-entity{display:flex !important;flex-direction:column;align-items:center;text-align:center;padding-right:1em}@media(min-width: 992px){div.quarto-about-trestles .about-entity{flex:0 0 42%}}div.quarto-about-trestles .about-entity .about-image{object-fit:cover;margin-bottom:1.5em}div.quarto-about-trestles .about-entity img.round{border-radius:50%}div.quarto-about-trestles .about-entity img.rounded{border-radius:10px}div.quarto-about-trestles .about-entity .about-links{display:flex;justify-content:center}@media(min-width: 992px){div.quarto-about-trestles .about-entity .about-links{flex-direction:row;column-gap:.8em;row-gap:15px;flex-wrap:wrap}}@media(max-width: 991.98px){div.quarto-about-trestles .about-entity .about-links{flex-direction:column;row-gap:1em;width:100%;padding-bottom:1.5em}}div.quarto-about-trestles .about-entity .about-link{color:#626d78;text-decoration:none;border:solid 1px}@media(min-width: 992px){div.quarto-about-trestles .about-entity .about-link{font-size:.8em;padding:.25em .5em;border-radius:4px}}@media(max-width: 991.98px){div.quarto-about-trestles .about-entity .about-link{font-size:1.1em;padding:.5em .5em;text-align:center;border-radius:6px}}div.quarto-about-trestles .about-entity .about-link:hover{color:#2761e3}div.quarto-about-trestles .about-entity .about-link i.bi{margin-right:.15em}div.quarto-about-trestles .about-contents{flex-basis:0;flex-grow:1}div.quarto-about-trestles .about-contents h2,div.quarto-about-trestles .about-contents .h2{border-bottom:none}@media(min-width: 992px){div.quarto-about-trestles .about-contents{border-left:solid 1px #dee2e6;padding-left:1.5em}}div.quarto-about-trestles .about-contents 
main.content{margin-top:0}div.quarto-about-marquee{padding-bottom:1em}div.quarto-about-marquee .about-contents{display:flex;flex-direction:column}div.quarto-about-marquee .about-image{max-height:550px;margin-bottom:1.5em;object-fit:cover}div.quarto-about-marquee img.round{border-radius:50%}div.quarto-about-marquee img.rounded{border-radius:10px}div.quarto-about-marquee h2,div.quarto-about-marquee .h2{border-bottom:none}div.quarto-about-marquee .about-links{display:flex;justify-content:center;padding-top:1.5em}@media(min-width: 992px){div.quarto-about-marquee .about-links{flex-direction:row;column-gap:.8em;row-gap:15px;flex-wrap:wrap}}@media(max-width: 991.98px){div.quarto-about-marquee .about-links{flex-direction:column;row-gap:1em;width:100%;padding-bottom:1.5em}}div.quarto-about-marquee .about-link{color:#626d78;text-decoration:none;border:solid 1px}@media(min-width: 992px){div.quarto-about-marquee .about-link{font-size:.8em;padding:.25em .5em;border-radius:4px}}@media(max-width: 991.98px){div.quarto-about-marquee .about-link{font-size:1.1em;padding:.5em .5em;text-align:center;border-radius:6px}}div.quarto-about-marquee .about-link:hover{color:#2761e3}div.quarto-about-marquee .about-link i.bi{margin-right:.15em}@media(min-width: 992px){div.quarto-about-marquee .about-link{border:none}}div.quarto-about-broadside{display:flex;flex-direction:column;padding-bottom:1em}div.quarto-about-broadside .about-main{display:flex !important;padding-top:0 !important}@media(min-width: 992px){div.quarto-about-broadside .about-main{flex-direction:row;align-items:flex-start}}@media(max-width: 991.98px){div.quarto-about-broadside .about-main{flex-direction:column}}@media(max-width: 991.98px){div.quarto-about-broadside .about-main .about-entity{flex-shrink:0;width:100%;height:450px;margin-bottom:1.5em;background-size:cover;background-repeat:no-repeat}}@media(min-width: 992px){div.quarto-about-broadside .about-main .about-entity{flex:0 10 
50%;margin-right:1.5em;width:100%;height:100%;background-size:100%;background-repeat:no-repeat}}div.quarto-about-broadside .about-main .about-contents{padding-top:14px;flex:0 0 50%}div.quarto-about-broadside h2,div.quarto-about-broadside .h2{border-bottom:none}div.quarto-about-broadside .about-sep{margin-top:1.5em;width:60%;align-self:center}div.quarto-about-broadside .about-links{display:flex;justify-content:center;column-gap:20px;padding-top:1.5em}@media(min-width: 992px){div.quarto-about-broadside .about-links{flex-direction:row;column-gap:.8em;row-gap:15px;flex-wrap:wrap}}@media(max-width: 991.98px){div.quarto-about-broadside .about-links{flex-direction:column;row-gap:1em;width:100%;padding-bottom:1.5em}}div.quarto-about-broadside .about-link{color:#626d78;text-decoration:none;border:solid 1px}@media(min-width: 992px){div.quarto-about-broadside .about-link{font-size:.8em;padding:.25em .5em;border-radius:4px}}@media(max-width: 991.98px){div.quarto-about-broadside .about-link{font-size:1.1em;padding:.5em .5em;text-align:center;border-radius:6px}}div.quarto-about-broadside .about-link:hover{color:#2761e3}div.quarto-about-broadside .about-link i.bi{margin-right:.15em}@media(min-width: 992px){div.quarto-about-broadside .about-link{border:none}}.tippy-box[data-theme~=quarto]{background-color:#fff;border:solid 1px 
#dee2e6;border-radius:.25rem;color:#343a40;font-size:.875rem}.tippy-box[data-theme~=quarto]>.tippy-backdrop{background-color:#fff}.tippy-box[data-theme~=quarto]>.tippy-arrow:after,.tippy-box[data-theme~=quarto]>.tippy-svg-arrow:after{content:"";position:absolute;z-index:-1}.tippy-box[data-theme~=quarto]>.tippy-arrow:after{border-color:rgba(0,0,0,0);border-style:solid}.tippy-box[data-placement^=top]>.tippy-arrow:before{bottom:-6px}.tippy-box[data-placement^=bottom]>.tippy-arrow:before{top:-6px}.tippy-box[data-placement^=right]>.tippy-arrow:before{left:-6px}.tippy-box[data-placement^=left]>.tippy-arrow:before{right:-6px}.tippy-box[data-theme~=quarto][data-placement^=top]>.tippy-arrow:before{border-top-color:#fff}.tippy-box[data-theme~=quarto][data-placement^=top]>.tippy-arrow:after{border-top-color:#dee2e6;border-width:7px 7px 0;top:17px;left:1px}.tippy-box[data-theme~=quarto][data-placement^=top]>.tippy-svg-arrow>svg{top:16px}.tippy-box[data-theme~=quarto][data-placement^=top]>.tippy-svg-arrow:after{top:17px}.tippy-box[data-theme~=quarto][data-placement^=bottom]>.tippy-arrow:before{border-bottom-color:#fff;bottom:16px}.tippy-box[data-theme~=quarto][data-placement^=bottom]>.tippy-arrow:after{border-bottom-color:#dee2e6;border-width:0 7px 7px;bottom:17px;left:1px}.tippy-box[data-theme~=quarto][data-placement^=bottom]>.tippy-svg-arrow>svg{bottom:15px}.tippy-box[data-theme~=quarto][data-placement^=bottom]>.tippy-svg-arrow:after{bottom:17px}.tippy-box[data-theme~=quarto][data-placement^=left]>.tippy-arrow:before{border-left-color:#fff}.tippy-box[data-theme~=quarto][data-placement^=left]>.tippy-arrow:after{border-left-color:#dee2e6;border-width:7px 0 7px 
7px;left:17px;top:1px}.tippy-box[data-theme~=quarto][data-placement^=left]>.tippy-svg-arrow>svg{left:11px}.tippy-box[data-theme~=quarto][data-placement^=left]>.tippy-svg-arrow:after{left:12px}.tippy-box[data-theme~=quarto][data-placement^=right]>.tippy-arrow:before{border-right-color:#fff;right:16px}.tippy-box[data-theme~=quarto][data-placement^=right]>.tippy-arrow:after{border-width:7px 7px 7px 0;right:17px;top:1px;border-right-color:#dee2e6}.tippy-box[data-theme~=quarto][data-placement^=right]>.tippy-svg-arrow>svg{right:11px}.tippy-box[data-theme~=quarto][data-placement^=right]>.tippy-svg-arrow:after{right:12px}.tippy-box[data-theme~=quarto]>.tippy-svg-arrow{fill:#343a40}.tippy-box[data-theme~=quarto]>.tippy-svg-arrow:after{background-image:url(data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMTYiIGhlaWdodD0iNiIgeG1sbnM9Imh0dHA6Ly93d3cudzMub3JnLzIwMDAvc3ZnIj48cGF0aCBkPSJNMCA2czEuNzk2LS4wMTMgNC42Ny0zLjYxNUM1Ljg1MS45IDYuOTMuMDA2IDggMGMxLjA3LS4wMDYgMi4xNDguODg3IDMuMzQzIDIuMzg1QzE0LjIzMyA2LjAwNSAxNiA2IDE2IDZIMHoiIGZpbGw9InJnYmEoMCwgOCwgMTYsIDAuMikiLz48L3N2Zz4=);background-size:16px 6px;width:16px;height:6px}.top-right{position:absolute;top:1em;right:1em}.visually-hidden{border:0;clip:rect(0 0 0 0);height:auto;margin:0;overflow:hidden;padding:0;position:absolute;width:1px;white-space:nowrap}.hidden{display:none !important}.zindex-bottom{z-index:-1 !important}figure.figure{display:block}.quarto-layout-panel{margin-bottom:1em}.quarto-layout-panel>figure{width:100%}.quarto-layout-panel>figure>figcaption,.quarto-layout-panel>.panel-caption{margin-top:10pt}.quarto-layout-panel>.table-caption{margin-top:0px}.table-caption p{margin-bottom:.5em}.quarto-layout-row{display:flex;flex-direction:row;align-items:flex-start}.quarto-layout-valign-top{align-items:flex-start}.quarto-layout-valign-bottom{align-items:flex-end}.quarto-layout-valign-center{align-items:center}.quarto-layout-cell{position:relative;margin-right:20px}.quarto-layout-cell:last-child{margin-right:0}.quarto-layout-cell 
figure,.quarto-layout-cell>p{margin:.2em}.quarto-layout-cell img{max-width:100%}.quarto-layout-cell .html-widget{width:100% !important}.quarto-layout-cell div figure p{margin:0}.quarto-layout-cell figure{display:block;margin-inline-start:0;margin-inline-end:0}.quarto-layout-cell table{display:inline-table}.quarto-layout-cell-subref figcaption,figure .quarto-layout-row figure figcaption{text-align:center;font-style:italic}.quarto-figure{position:relative;margin-bottom:1em}.quarto-figure>figure{width:100%;margin-bottom:0}.quarto-figure-left>figure>p,.quarto-figure-left>figure>div{text-align:left}.quarto-figure-center>figure>p,.quarto-figure-center>figure>div{text-align:center}.quarto-figure-right>figure>p,.quarto-figure-right>figure>div{text-align:right}.quarto-figure>figure>div.cell-annotation,.quarto-figure>figure>div code{text-align:left}figure>p:empty{display:none}figure>p:first-child{margin-top:0;margin-bottom:0}figure>figcaption.quarto-float-caption-bottom{margin-bottom:.5em}figure>figcaption.quarto-float-caption-top{margin-top:.5em}div[id^=tbl-]{position:relative}.quarto-figure>.anchorjs-link{position:absolute;top:.6em;right:.5em}div[id^=tbl-]>.anchorjs-link{position:absolute;top:.7em;right:.3em}.quarto-figure:hover>.anchorjs-link,div[id^=tbl-]:hover>.anchorjs-link,h2:hover>.anchorjs-link,.h2:hover>.anchorjs-link,h3:hover>.anchorjs-link,.h3:hover>.anchorjs-link,h4:hover>.anchorjs-link,.h4:hover>.anchorjs-link,h5:hover>.anchorjs-link,.h5:hover>.anchorjs-link,h6:hover>.anchorjs-link,.h6:hover>.anchorjs-link,.reveal-anchorjs-link>.anchorjs-link{opacity:1}#title-block-header{margin-block-end:1rem;position:relative;margin-top:-1px}#title-block-header .abstract{margin-block-start:1rem}#title-block-header .abstract .abstract-title{font-weight:600}#title-block-header a{text-decoration:none}#title-block-header .author,#title-block-header .date,#title-block-header .doi{margin-block-end:.2rem}#title-block-header .quarto-title-block>div{display:flex}#title-block-header 
.quarto-title-block>div>h1,#title-block-header .quarto-title-block>div>.h1{flex-grow:1}#title-block-header .quarto-title-block>div>button{flex-shrink:0;height:2.25rem;margin-top:0}@media(min-width: 992px){#title-block-header .quarto-title-block>div>button{margin-top:5px}}tr.header>th>p:last-of-type{margin-bottom:0px}table,table.table{margin-top:.5rem;margin-bottom:.5rem}caption,.table-caption{padding-top:.5rem;padding-bottom:.5rem;text-align:center}figure.quarto-float-tbl figcaption.quarto-float-caption-top{margin-top:.5rem;margin-bottom:.25rem;text-align:center}figure.quarto-float-tbl figcaption.quarto-float-caption-bottom{padding-top:.25rem;margin-bottom:.5rem;text-align:center}.utterances{max-width:none;margin-left:-8px}iframe{margin-bottom:1em}details{margin-bottom:1em}details[show]{margin-bottom:0}details>summary{color:#6c757d}details>summary>p:only-child{display:inline}pre.sourceCode,code.sourceCode{position:relative}dd code:not(.sourceCode),p code:not(.sourceCode){white-space:pre-wrap}code{white-space:pre}@media print{code{white-space:pre-wrap}}pre>code{display:block}pre>code.sourceCode{white-space:pre}pre>code.sourceCode>span>a:first-child::before{text-decoration:none}pre.code-overflow-wrap>code.sourceCode{white-space:pre-wrap}pre.code-overflow-scroll>code.sourceCode{white-space:pre}code a:any-link{color:inherit;text-decoration:none}code a:hover{color:inherit;text-decoration:underline}ul.task-list{padding-left:1em}[data-tippy-root]{display:inline-block}.tippy-content .footnote-back{display:none}.footnote-back{margin-left:.2em}.tippy-content{overflow-x:auto}.quarto-embedded-source-code{display:none}.quarto-unresolved-ref{font-weight:600}.quarto-cover-image{max-width:35%;float:right;margin-left:30px}.cell-output-display 
.widget-subarea{margin-bottom:1em}.cell-output-display:not(.no-overflow-x),.knitsql-table:not(.no-overflow-x){overflow-x:auto}.panel-input{margin-bottom:1em}.panel-input>div,.panel-input>div>div{display:inline-block;vertical-align:top;padding-right:12px}.panel-input>p:last-child{margin-bottom:0}.layout-sidebar{margin-bottom:1em}.layout-sidebar .tab-content{border:none}.tab-content>.page-columns.active{display:grid}div.sourceCode>iframe{width:100%;height:300px;margin-bottom:-0.5em}a{text-underline-offset:3px}.callout pre.sourceCode{padding-left:0}div.ansi-escaped-output{font-family:monospace;display:block}/*! + */@import"https://fonts.googleapis.com/css2?family=Source+Sans+Pro:wght@300;400;700&display=swap";:root,[data-bs-theme=light]{--bs-blue: #2780e3;--bs-indigo: #6610f2;--bs-purple: #613d7c;--bs-pink: #e83e8c;--bs-red: #ff0039;--bs-orange: #f0ad4e;--bs-yellow: #ff7518;--bs-green: #3fb618;--bs-teal: #20c997;--bs-cyan: #9954bb;--bs-black: #000;--bs-white: #fff;--bs-gray: #6c757d;--bs-gray-dark: #343a40;--bs-gray-100: #f8f9fa;--bs-gray-200: #e9ecef;--bs-gray-300: #dee2e6;--bs-gray-400: #ced4da;--bs-gray-500: #adb5bd;--bs-gray-600: #6c757d;--bs-gray-700: #495057;--bs-gray-800: #343a40;--bs-gray-900: #212529;--bs-default: #343a40;--bs-primary: #2780e3;--bs-secondary: #343a40;--bs-success: #3fb618;--bs-info: #9954bb;--bs-warning: #ff7518;--bs-danger: #ff0039;--bs-light: #f8f9fa;--bs-dark: #343a40;--bs-default-rgb: 52, 58, 64;--bs-primary-rgb: 39, 128, 227;--bs-secondary-rgb: 52, 58, 64;--bs-success-rgb: 63, 182, 24;--bs-info-rgb: 153, 84, 187;--bs-warning-rgb: 255, 117, 24;--bs-danger-rgb: 255, 0, 57;--bs-light-rgb: 248, 249, 250;--bs-dark-rgb: 52, 58, 64;--bs-primary-text-emphasis: #10335b;--bs-secondary-text-emphasis: #15171a;--bs-success-text-emphasis: #19490a;--bs-info-text-emphasis: #3d224b;--bs-warning-text-emphasis: #662f0a;--bs-danger-text-emphasis: #660017;--bs-light-text-emphasis: #495057;--bs-dark-text-emphasis: #495057;--bs-primary-bg-subtle: 
#d4e6f9;--bs-secondary-bg-subtle: #d6d8d9;--bs-success-bg-subtle: #d9f0d1;--bs-info-bg-subtle: #ebddf1;--bs-warning-bg-subtle: #ffe3d1;--bs-danger-bg-subtle: #ffccd7;--bs-light-bg-subtle: #fcfcfd;--bs-dark-bg-subtle: #ced4da;--bs-primary-border-subtle: #a9ccf4;--bs-secondary-border-subtle: #aeb0b3;--bs-success-border-subtle: #b2e2a3;--bs-info-border-subtle: #d6bbe4;--bs-warning-border-subtle: #ffc8a3;--bs-danger-border-subtle: #ff99b0;--bs-light-border-subtle: #e9ecef;--bs-dark-border-subtle: #adb5bd;--bs-white-rgb: 255, 255, 255;--bs-black-rgb: 0, 0, 0;--bs-font-sans-serif: "Source Sans Pro", -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, "Helvetica Neue", Arial, sans-serif, "Apple Color Emoji", "Segoe UI Emoji", "Segoe UI Symbol";--bs-font-monospace: SFMono-Regular, Menlo, Monaco, Consolas, "Liberation Mono", "Courier New", monospace;--bs-gradient: linear-gradient(180deg, rgba(255, 255, 255, 0.15), rgba(255, 255, 255, 0));--bs-root-font-size: 17px;--bs-body-font-family: "Source Sans Pro", -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, "Helvetica Neue", Arial, sans-serif, "Apple Color Emoji", "Segoe UI Emoji", "Segoe UI Symbol";--bs-body-font-size:1rem;--bs-body-font-weight: 400;--bs-body-line-height: 1.5;--bs-body-color: #343a40;--bs-body-color-rgb: 52, 58, 64;--bs-body-bg: #fff;--bs-body-bg-rgb: 255, 255, 255;--bs-emphasis-color: #000;--bs-emphasis-color-rgb: 0, 0, 0;--bs-secondary-color: rgba(52, 58, 64, 0.75);--bs-secondary-color-rgb: 52, 58, 64;--bs-secondary-bg: #e9ecef;--bs-secondary-bg-rgb: 233, 236, 239;--bs-tertiary-color: rgba(52, 58, 64, 0.5);--bs-tertiary-color-rgb: 52, 58, 64;--bs-tertiary-bg: #f8f9fa;--bs-tertiary-bg-rgb: 248, 249, 250;--bs-heading-color: inherit;--bs-link-color: #2761e3;--bs-link-color-rgb: 39, 97, 227;--bs-link-decoration: underline;--bs-link-hover-color: #1f4eb6;--bs-link-hover-color-rgb: 31, 78, 182;--bs-code-color: #7d12ba;--bs-highlight-bg: #ffe3d1;--bs-border-width: 1px;--bs-border-style: 
solid;--bs-border-color: #dee2e6;--bs-border-color-translucent: rgba(0, 0, 0, 0.175);--bs-border-radius: 0.25rem;--bs-border-radius-sm: 0.2em;--bs-border-radius-lg: 0.5rem;--bs-border-radius-xl: 1rem;--bs-border-radius-xxl: 2rem;--bs-border-radius-2xl: var(--bs-border-radius-xxl);--bs-border-radius-pill: 50rem;--bs-box-shadow: 0 0.5rem 1rem rgba(0, 0, 0, 0.15);--bs-box-shadow-sm: 0 0.125rem 0.25rem rgba(0, 0, 0, 0.075);--bs-box-shadow-lg: 0 1rem 3rem rgba(0, 0, 0, 0.175);--bs-box-shadow-inset: inset 0 1px 2px rgba(0, 0, 0, 0.075);--bs-focus-ring-width: 0.25rem;--bs-focus-ring-opacity: 0.25;--bs-focus-ring-color: rgba(39, 128, 227, 0.25);--bs-form-valid-color: #3fb618;--bs-form-valid-border-color: #3fb618;--bs-form-invalid-color: #ff0039;--bs-form-invalid-border-color: #ff0039}[data-bs-theme=dark]{color-scheme:dark;--bs-body-color: #dee2e6;--bs-body-color-rgb: 222, 226, 230;--bs-body-bg: #212529;--bs-body-bg-rgb: 33, 37, 41;--bs-emphasis-color: #fff;--bs-emphasis-color-rgb: 255, 255, 255;--bs-secondary-color: rgba(222, 226, 230, 0.75);--bs-secondary-color-rgb: 222, 226, 230;--bs-secondary-bg: #343a40;--bs-secondary-bg-rgb: 52, 58, 64;--bs-tertiary-color: rgba(222, 226, 230, 0.5);--bs-tertiary-color-rgb: 222, 226, 230;--bs-tertiary-bg: #2b3035;--bs-tertiary-bg-rgb: 43, 48, 53;--bs-primary-text-emphasis: #7db3ee;--bs-secondary-text-emphasis: #85898c;--bs-success-text-emphasis: #8cd374;--bs-info-text-emphasis: #c298d6;--bs-warning-text-emphasis: #ffac74;--bs-danger-text-emphasis: #ff6688;--bs-light-text-emphasis: #f8f9fa;--bs-dark-text-emphasis: #dee2e6;--bs-primary-bg-subtle: #081a2d;--bs-secondary-bg-subtle: #0a0c0d;--bs-success-bg-subtle: #0d2405;--bs-info-bg-subtle: #1f1125;--bs-warning-bg-subtle: #331705;--bs-danger-bg-subtle: #33000b;--bs-light-bg-subtle: #343a40;--bs-dark-bg-subtle: #1a1d20;--bs-primary-border-subtle: #174d88;--bs-secondary-border-subtle: #1f2326;--bs-success-border-subtle: #266d0e;--bs-info-border-subtle: #5c3270;--bs-warning-border-subtle: 
#99460e;--bs-danger-border-subtle: #990022;--bs-light-border-subtle: #495057;--bs-dark-border-subtle: #343a40;--bs-heading-color: inherit;--bs-link-color: #7db3ee;--bs-link-hover-color: #97c2f1;--bs-link-color-rgb: 125, 179, 238;--bs-link-hover-color-rgb: 151, 194, 241;--bs-code-color: white;--bs-border-color: #495057;--bs-border-color-translucent: rgba(255, 255, 255, 0.15);--bs-form-valid-color: #8cd374;--bs-form-valid-border-color: #8cd374;--bs-form-invalid-color: #ff6688;--bs-form-invalid-border-color: #ff6688}*,*::before,*::after{box-sizing:border-box}:root{font-size:var(--bs-root-font-size)}body{margin:0;font-family:var(--bs-body-font-family);font-size:var(--bs-body-font-size);font-weight:var(--bs-body-font-weight);line-height:var(--bs-body-line-height);color:var(--bs-body-color);text-align:var(--bs-body-text-align);background-color:var(--bs-body-bg);-webkit-text-size-adjust:100%;-webkit-tap-highlight-color:rgba(0,0,0,0)}hr{margin:1rem 0;color:inherit;border:0;border-top:1px solid;opacity:.25}h6,.h6,h5,.h5,h4,.h4,h3,.h3,h2,.h2,h1,.h1{margin-top:0;margin-bottom:.5rem;font-weight:400;line-height:1.2;color:var(--bs-heading-color)}h1,.h1{font-size:calc(1.325rem + 0.9vw)}@media(min-width: 1200px){h1,.h1{font-size:2rem}}h2,.h2{font-size:calc(1.29rem + 0.48vw)}@media(min-width: 1200px){h2,.h2{font-size:1.65rem}}h3,.h3{font-size:calc(1.27rem + 0.24vw)}@media(min-width: 1200px){h3,.h3{font-size:1.45rem}}h4,.h4{font-size:1.25rem}h5,.h5{font-size:1.1rem}h6,.h6{font-size:1rem}p{margin-top:0;margin-bottom:1rem}abbr[title]{text-decoration:underline dotted;-webkit-text-decoration:underline dotted;-moz-text-decoration:underline dotted;-ms-text-decoration:underline dotted;-o-text-decoration:underline dotted;cursor:help;text-decoration-skip-ink:none}address{margin-bottom:1rem;font-style:normal;line-height:inherit}ol,ul{padding-left:2rem}ol,ul,dl{margin-top:0;margin-bottom:1rem}ol ol,ul ul,ol ul,ul 
ol{margin-bottom:0}dt{font-weight:700}dd{margin-bottom:.5rem;margin-left:0}blockquote{margin:0 0 1rem;padding:.625rem 1.25rem;border-left:.25rem solid #e9ecef}blockquote p:last-child,blockquote ul:last-child,blockquote ol:last-child{margin-bottom:0}b,strong{font-weight:bolder}small,.small{font-size:0.875em}mark,.mark{padding:.1875em;background-color:var(--bs-highlight-bg)}sub,sup{position:relative;font-size:0.75em;line-height:0;vertical-align:baseline}sub{bottom:-0.25em}sup{top:-0.5em}a{color:rgba(var(--bs-link-color-rgb), var(--bs-link-opacity, 1));text-decoration:underline;-webkit-text-decoration:underline;-moz-text-decoration:underline;-ms-text-decoration:underline;-o-text-decoration:underline}a:hover{--bs-link-color-rgb: var(--bs-link-hover-color-rgb)}a:not([href]):not([class]),a:not([href]):not([class]):hover{color:inherit;text-decoration:none}pre,code,kbd,samp{font-family:SFMono-Regular,Menlo,Monaco,Consolas,"Liberation Mono","Courier New",monospace;font-size:1em}pre{display:block;margin-top:0;margin-bottom:1rem;overflow:auto;font-size:0.875em;color:#000;background-color:#f8f9fa;line-height:1.5;padding:.5rem;border:1px solid var(--bs-border-color, #dee2e6)}pre code{background-color:rgba(0,0,0,0);font-size:inherit;color:inherit;word-break:normal}code{font-size:0.875em;color:var(--bs-code-color);background-color:#f8f9fa;padding:.125rem .25rem;word-wrap:break-word}a>code{color:inherit}kbd{padding:.4rem .4rem;font-size:0.875em;color:#fff;background-color:#343a40}kbd kbd{padding:0;font-size:1em}figure{margin:0 0 
1rem}img,svg{vertical-align:middle}table{caption-side:bottom;border-collapse:collapse}caption{padding-top:.5rem;padding-bottom:.5rem;color:rgba(52,58,64,.75);text-align:left}th{text-align:inherit;text-align:-webkit-match-parent}thead,tbody,tfoot,tr,td,th{border-color:inherit;border-style:solid;border-width:0}label{display:inline-block}button{border-radius:0}button:focus:not(:focus-visible){outline:0}input,button,select,optgroup,textarea{margin:0;font-family:inherit;font-size:inherit;line-height:inherit}button,select{text-transform:none}[role=button]{cursor:pointer}select{word-wrap:normal}select:disabled{opacity:1}[list]:not([type=date]):not([type=datetime-local]):not([type=month]):not([type=week]):not([type=time])::-webkit-calendar-picker-indicator{display:none !important}button,[type=button],[type=reset],[type=submit]{-webkit-appearance:button}button:not(:disabled),[type=button]:not(:disabled),[type=reset]:not(:disabled),[type=submit]:not(:disabled){cursor:pointer}::-moz-focus-inner{padding:0;border-style:none}textarea{resize:vertical}fieldset{min-width:0;padding:0;margin:0;border:0}legend{float:left;width:100%;padding:0;margin-bottom:.5rem;font-size:calc(1.275rem + 0.3vw);line-height:inherit}@media(min-width: 1200px){legend{font-size:1.5rem}}legend+*{clear:left}::-webkit-datetime-edit-fields-wrapper,::-webkit-datetime-edit-text,::-webkit-datetime-edit-minute,::-webkit-datetime-edit-hour-field,::-webkit-datetime-edit-day-field,::-webkit-datetime-edit-month-field,::-webkit-datetime-edit-year-field{padding:0}::-webkit-inner-spin-button{height:auto}[type=search]{-webkit-appearance:textfield;outline-offset:-2px}::-webkit-search-decoration{-webkit-appearance:none}::-webkit-color-swatch-wrapper{padding:0}::file-selector-button{font:inherit;-webkit-appearance:button}output{display:inline-block}iframe{border:0}summary{display:list-item;cursor:pointer}progress{vertical-align:baseline}[hidden]{display:none 
!important}.lead{font-size:1.25rem;font-weight:300}.display-1{font-size:calc(1.625rem + 4.5vw);font-weight:300;line-height:1.2}@media(min-width: 1200px){.display-1{font-size:5rem}}.display-2{font-size:calc(1.575rem + 3.9vw);font-weight:300;line-height:1.2}@media(min-width: 1200px){.display-2{font-size:4.5rem}}.display-3{font-size:calc(1.525rem + 3.3vw);font-weight:300;line-height:1.2}@media(min-width: 1200px){.display-3{font-size:4rem}}.display-4{font-size:calc(1.475rem + 2.7vw);font-weight:300;line-height:1.2}@media(min-width: 1200px){.display-4{font-size:3.5rem}}.display-5{font-size:calc(1.425rem + 2.1vw);font-weight:300;line-height:1.2}@media(min-width: 1200px){.display-5{font-size:3rem}}.display-6{font-size:calc(1.375rem + 1.5vw);font-weight:300;line-height:1.2}@media(min-width: 1200px){.display-6{font-size:2.5rem}}.list-unstyled{padding-left:0;list-style:none}.list-inline{padding-left:0;list-style:none}.list-inline-item{display:inline-block}.list-inline-item:not(:last-child){margin-right:.5rem}.initialism{font-size:0.875em;text-transform:uppercase}.blockquote{margin-bottom:1rem;font-size:1.25rem}.blockquote>:last-child{margin-bottom:0}.blockquote-footer{margin-top:-1rem;margin-bottom:1rem;font-size:0.875em;color:#6c757d}.blockquote-footer::before{content:"— "}.img-fluid{max-width:100%;height:auto}.img-thumbnail{padding:.25rem;background-color:#fff;border:1px solid #dee2e6;max-width:100%;height:auto}.figure{display:inline-block}.figure-img{margin-bottom:.5rem;line-height:1}.figure-caption{font-size:0.875em;color:rgba(52,58,64,.75)}.container,.container-fluid,.container-xxl,.container-xl,.container-lg,.container-md,.container-sm{--bs-gutter-x: 1.5rem;--bs-gutter-y: 0;width:100%;padding-right:calc(var(--bs-gutter-x)*.5);padding-left:calc(var(--bs-gutter-x)*.5);margin-right:auto;margin-left:auto}@media(min-width: 576px){.container-sm,.container{max-width:540px}}@media(min-width: 768px){.container-md,.container-sm,.container{max-width:720px}}@media(min-width: 
992px){.container-lg,.container-md,.container-sm,.container{max-width:960px}}@media(min-width: 1200px){.container-xl,.container-lg,.container-md,.container-sm,.container{max-width:1140px}}@media(min-width: 1400px){.container-xxl,.container-xl,.container-lg,.container-md,.container-sm,.container{max-width:1320px}}:root{--bs-breakpoint-xs: 0;--bs-breakpoint-sm: 576px;--bs-breakpoint-md: 768px;--bs-breakpoint-lg: 992px;--bs-breakpoint-xl: 1200px;--bs-breakpoint-xxl: 1400px}.grid{display:grid;grid-template-rows:repeat(var(--bs-rows, 1), 1fr);grid-template-columns:repeat(var(--bs-columns, 12), 1fr);gap:var(--bs-gap, 1.5rem)}.grid .g-col-1{grid-column:auto/span 1}.grid .g-col-2{grid-column:auto/span 2}.grid .g-col-3{grid-column:auto/span 3}.grid .g-col-4{grid-column:auto/span 4}.grid .g-col-5{grid-column:auto/span 5}.grid .g-col-6{grid-column:auto/span 6}.grid .g-col-7{grid-column:auto/span 7}.grid .g-col-8{grid-column:auto/span 8}.grid .g-col-9{grid-column:auto/span 9}.grid .g-col-10{grid-column:auto/span 10}.grid .g-col-11{grid-column:auto/span 11}.grid .g-col-12{grid-column:auto/span 12}.grid .g-start-1{grid-column-start:1}.grid .g-start-2{grid-column-start:2}.grid .g-start-3{grid-column-start:3}.grid .g-start-4{grid-column-start:4}.grid .g-start-5{grid-column-start:5}.grid .g-start-6{grid-column-start:6}.grid .g-start-7{grid-column-start:7}.grid .g-start-8{grid-column-start:8}.grid .g-start-9{grid-column-start:9}.grid .g-start-10{grid-column-start:10}.grid .g-start-11{grid-column-start:11}@media(min-width: 576px){.grid .g-col-sm-1{grid-column:auto/span 1}.grid .g-col-sm-2{grid-column:auto/span 2}.grid .g-col-sm-3{grid-column:auto/span 3}.grid .g-col-sm-4{grid-column:auto/span 4}.grid .g-col-sm-5{grid-column:auto/span 5}.grid .g-col-sm-6{grid-column:auto/span 6}.grid .g-col-sm-7{grid-column:auto/span 7}.grid .g-col-sm-8{grid-column:auto/span 8}.grid .g-col-sm-9{grid-column:auto/span 9}.grid .g-col-sm-10{grid-column:auto/span 10}.grid .g-col-sm-11{grid-column:auto/span 
11}.grid .g-col-sm-12{grid-column:auto/span 12}.grid .g-start-sm-1{grid-column-start:1}.grid .g-start-sm-2{grid-column-start:2}.grid .g-start-sm-3{grid-column-start:3}.grid .g-start-sm-4{grid-column-start:4}.grid .g-start-sm-5{grid-column-start:5}.grid .g-start-sm-6{grid-column-start:6}.grid .g-start-sm-7{grid-column-start:7}.grid .g-start-sm-8{grid-column-start:8}.grid .g-start-sm-9{grid-column-start:9}.grid .g-start-sm-10{grid-column-start:10}.grid .g-start-sm-11{grid-column-start:11}}@media(min-width: 768px){.grid .g-col-md-1{grid-column:auto/span 1}.grid .g-col-md-2{grid-column:auto/span 2}.grid .g-col-md-3{grid-column:auto/span 3}.grid .g-col-md-4{grid-column:auto/span 4}.grid .g-col-md-5{grid-column:auto/span 5}.grid .g-col-md-6{grid-column:auto/span 6}.grid .g-col-md-7{grid-column:auto/span 7}.grid .g-col-md-8{grid-column:auto/span 8}.grid .g-col-md-9{grid-column:auto/span 9}.grid .g-col-md-10{grid-column:auto/span 10}.grid .g-col-md-11{grid-column:auto/span 11}.grid .g-col-md-12{grid-column:auto/span 12}.grid .g-start-md-1{grid-column-start:1}.grid .g-start-md-2{grid-column-start:2}.grid .g-start-md-3{grid-column-start:3}.grid .g-start-md-4{grid-column-start:4}.grid .g-start-md-5{grid-column-start:5}.grid .g-start-md-6{grid-column-start:6}.grid .g-start-md-7{grid-column-start:7}.grid .g-start-md-8{grid-column-start:8}.grid .g-start-md-9{grid-column-start:9}.grid .g-start-md-10{grid-column-start:10}.grid .g-start-md-11{grid-column-start:11}}@media(min-width: 992px){.grid .g-col-lg-1{grid-column:auto/span 1}.grid .g-col-lg-2{grid-column:auto/span 2}.grid .g-col-lg-3{grid-column:auto/span 3}.grid .g-col-lg-4{grid-column:auto/span 4}.grid .g-col-lg-5{grid-column:auto/span 5}.grid .g-col-lg-6{grid-column:auto/span 6}.grid .g-col-lg-7{grid-column:auto/span 7}.grid .g-col-lg-8{grid-column:auto/span 8}.grid .g-col-lg-9{grid-column:auto/span 9}.grid .g-col-lg-10{grid-column:auto/span 10}.grid .g-col-lg-11{grid-column:auto/span 11}.grid 
.g-col-lg-12{grid-column:auto/span 12}.grid .g-start-lg-1{grid-column-start:1}.grid .g-start-lg-2{grid-column-start:2}.grid .g-start-lg-3{grid-column-start:3}.grid .g-start-lg-4{grid-column-start:4}.grid .g-start-lg-5{grid-column-start:5}.grid .g-start-lg-6{grid-column-start:6}.grid .g-start-lg-7{grid-column-start:7}.grid .g-start-lg-8{grid-column-start:8}.grid .g-start-lg-9{grid-column-start:9}.grid .g-start-lg-10{grid-column-start:10}.grid .g-start-lg-11{grid-column-start:11}}@media(min-width: 1200px){.grid .g-col-xl-1{grid-column:auto/span 1}.grid .g-col-xl-2{grid-column:auto/span 2}.grid .g-col-xl-3{grid-column:auto/span 3}.grid .g-col-xl-4{grid-column:auto/span 4}.grid .g-col-xl-5{grid-column:auto/span 5}.grid .g-col-xl-6{grid-column:auto/span 6}.grid .g-col-xl-7{grid-column:auto/span 7}.grid .g-col-xl-8{grid-column:auto/span 8}.grid .g-col-xl-9{grid-column:auto/span 9}.grid .g-col-xl-10{grid-column:auto/span 10}.grid .g-col-xl-11{grid-column:auto/span 11}.grid .g-col-xl-12{grid-column:auto/span 12}.grid .g-start-xl-1{grid-column-start:1}.grid .g-start-xl-2{grid-column-start:2}.grid .g-start-xl-3{grid-column-start:3}.grid .g-start-xl-4{grid-column-start:4}.grid .g-start-xl-5{grid-column-start:5}.grid .g-start-xl-6{grid-column-start:6}.grid .g-start-xl-7{grid-column-start:7}.grid .g-start-xl-8{grid-column-start:8}.grid .g-start-xl-9{grid-column-start:9}.grid .g-start-xl-10{grid-column-start:10}.grid .g-start-xl-11{grid-column-start:11}}@media(min-width: 1400px){.grid .g-col-xxl-1{grid-column:auto/span 1}.grid .g-col-xxl-2{grid-column:auto/span 2}.grid .g-col-xxl-3{grid-column:auto/span 3}.grid .g-col-xxl-4{grid-column:auto/span 4}.grid .g-col-xxl-5{grid-column:auto/span 5}.grid .g-col-xxl-6{grid-column:auto/span 6}.grid .g-col-xxl-7{grid-column:auto/span 7}.grid .g-col-xxl-8{grid-column:auto/span 8}.grid .g-col-xxl-9{grid-column:auto/span 9}.grid .g-col-xxl-10{grid-column:auto/span 10}.grid .g-col-xxl-11{grid-column:auto/span 11}.grid 
.g-col-xxl-12{grid-column:auto/span 12}.grid .g-start-xxl-1{grid-column-start:1}.grid .g-start-xxl-2{grid-column-start:2}.grid .g-start-xxl-3{grid-column-start:3}.grid .g-start-xxl-4{grid-column-start:4}.grid .g-start-xxl-5{grid-column-start:5}.grid .g-start-xxl-6{grid-column-start:6}.grid .g-start-xxl-7{grid-column-start:7}.grid .g-start-xxl-8{grid-column-start:8}.grid .g-start-xxl-9{grid-column-start:9}.grid .g-start-xxl-10{grid-column-start:10}.grid .g-start-xxl-11{grid-column-start:11}}.table{--bs-table-color-type: initial;--bs-table-bg-type: initial;--bs-table-color-state: initial;--bs-table-bg-state: initial;--bs-table-color: #343a40;--bs-table-bg: #fff;--bs-table-border-color: #dee2e6;--bs-table-accent-bg: transparent;--bs-table-striped-color: #343a40;--bs-table-striped-bg: rgba(0, 0, 0, 0.05);--bs-table-active-color: #343a40;--bs-table-active-bg: rgba(0, 0, 0, 0.1);--bs-table-hover-color: #343a40;--bs-table-hover-bg: rgba(0, 0, 0, 0.075);width:100%;margin-bottom:1rem;vertical-align:top;border-color:var(--bs-table-border-color)}.table>:not(caption)>*>*{padding:.5rem .5rem;color:var(--bs-table-color-state, var(--bs-table-color-type, var(--bs-table-color)));background-color:var(--bs-table-bg);border-bottom-width:1px;box-shadow:inset 0 0 0 9999px var(--bs-table-bg-state, var(--bs-table-bg-type, var(--bs-table-accent-bg)))}.table>tbody{vertical-align:inherit}.table>thead{vertical-align:bottom}.table-group-divider{border-top:calc(1px*2) solid #9a9da0}.caption-top{caption-side:top}.table-sm>:not(caption)>*>*{padding:.25rem .25rem}.table-bordered>:not(caption)>*{border-width:1px 0}.table-bordered>:not(caption)>*>*{border-width:0 1px}.table-borderless>:not(caption)>*>*{border-bottom-width:0}.table-borderless>:not(:first-child){border-top-width:0}.table-striped>tbody>tr:nth-of-type(odd)>*{--bs-table-color-type: var(--bs-table-striped-color);--bs-table-bg-type: var(--bs-table-striped-bg)}.table-striped-columns>:not(caption)>tr>:nth-child(even){--bs-table-color-type: 
var(--bs-table-striped-color);--bs-table-bg-type: var(--bs-table-striped-bg)}.table-active{--bs-table-color-state: var(--bs-table-active-color);--bs-table-bg-state: var(--bs-table-active-bg)}.table-hover>tbody>tr:hover>*{--bs-table-color-state: var(--bs-table-hover-color);--bs-table-bg-state: var(--bs-table-hover-bg)}.table-primary{--bs-table-color: #000;--bs-table-bg: #d4e6f9;--bs-table-border-color: #bfcfe0;--bs-table-striped-bg: #c9dbed;--bs-table-striped-color: #000;--bs-table-active-bg: #bfcfe0;--bs-table-active-color: #000;--bs-table-hover-bg: #c4d5e6;--bs-table-hover-color: #000;color:var(--bs-table-color);border-color:var(--bs-table-border-color)}.table-secondary{--bs-table-color: #000;--bs-table-bg: #d6d8d9;--bs-table-border-color: #c1c2c3;--bs-table-striped-bg: #cbcdce;--bs-table-striped-color: #000;--bs-table-active-bg: #c1c2c3;--bs-table-active-color: #000;--bs-table-hover-bg: #c6c8c9;--bs-table-hover-color: #000;color:var(--bs-table-color);border-color:var(--bs-table-border-color)}.table-success{--bs-table-color: #000;--bs-table-bg: #d9f0d1;--bs-table-border-color: #c3d8bc;--bs-table-striped-bg: #cee4c7;--bs-table-striped-color: #000;--bs-table-active-bg: #c3d8bc;--bs-table-active-color: #000;--bs-table-hover-bg: #c9dec1;--bs-table-hover-color: #000;color:var(--bs-table-color);border-color:var(--bs-table-border-color)}.table-info{--bs-table-color: #000;--bs-table-bg: #ebddf1;--bs-table-border-color: #d4c7d9;--bs-table-striped-bg: #dfd2e5;--bs-table-striped-color: #000;--bs-table-active-bg: #d4c7d9;--bs-table-active-color: #000;--bs-table-hover-bg: #d9ccdf;--bs-table-hover-color: #000;color:var(--bs-table-color);border-color:var(--bs-table-border-color)}.table-warning{--bs-table-color: #000;--bs-table-bg: #ffe3d1;--bs-table-border-color: #e6ccbc;--bs-table-striped-bg: #f2d8c7;--bs-table-striped-color: #000;--bs-table-active-bg: #e6ccbc;--bs-table-active-color: #000;--bs-table-hover-bg: #ecd2c1;--bs-table-hover-color: 
#000;color:var(--bs-table-color);border-color:var(--bs-table-border-color)}.table-danger{--bs-table-color: #000;--bs-table-bg: #ffccd7;--bs-table-border-color: #e6b8c2;--bs-table-striped-bg: #f2c2cc;--bs-table-striped-color: #000;--bs-table-active-bg: #e6b8c2;--bs-table-active-color: #000;--bs-table-hover-bg: #ecbdc7;--bs-table-hover-color: #000;color:var(--bs-table-color);border-color:var(--bs-table-border-color)}.table-light{--bs-table-color: #000;--bs-table-bg: #f8f9fa;--bs-table-border-color: #dfe0e1;--bs-table-striped-bg: #ecedee;--bs-table-striped-color: #000;--bs-table-active-bg: #dfe0e1;--bs-table-active-color: #000;--bs-table-hover-bg: #e5e6e7;--bs-table-hover-color: #000;color:var(--bs-table-color);border-color:var(--bs-table-border-color)}.table-dark{--bs-table-color: #fff;--bs-table-bg: #343a40;--bs-table-border-color: #484e53;--bs-table-striped-bg: #3e444a;--bs-table-striped-color: #fff;--bs-table-active-bg: #484e53;--bs-table-active-color: #fff;--bs-table-hover-bg: #43494e;--bs-table-hover-color: #fff;color:var(--bs-table-color);border-color:var(--bs-table-border-color)}.table-responsive{overflow-x:auto;-webkit-overflow-scrolling:touch}@media(max-width: 575.98px){.table-responsive-sm{overflow-x:auto;-webkit-overflow-scrolling:touch}}@media(max-width: 767.98px){.table-responsive-md{overflow-x:auto;-webkit-overflow-scrolling:touch}}@media(max-width: 991.98px){.table-responsive-lg{overflow-x:auto;-webkit-overflow-scrolling:touch}}@media(max-width: 1199.98px){.table-responsive-xl{overflow-x:auto;-webkit-overflow-scrolling:touch}}@media(max-width: 1399.98px){.table-responsive-xxl{overflow-x:auto;-webkit-overflow-scrolling:touch}}.form-label,.shiny-input-container .control-label{margin-bottom:.5rem}.col-form-label{padding-top:calc(0.375rem + 1px);padding-bottom:calc(0.375rem + 1px);margin-bottom:0;font-size:inherit;line-height:1.5}.col-form-label-lg{padding-top:calc(0.5rem + 1px);padding-bottom:calc(0.5rem + 
1px);font-size:1.25rem}.col-form-label-sm{padding-top:calc(0.25rem + 1px);padding-bottom:calc(0.25rem + 1px);font-size:0.875rem}.form-text{margin-top:.25rem;font-size:0.875em;color:rgba(52,58,64,.75)}.form-control{display:block;width:100%;padding:.375rem .75rem;font-size:1rem;font-weight:400;line-height:1.5;color:#343a40;appearance:none;-webkit-appearance:none;-moz-appearance:none;-ms-appearance:none;-o-appearance:none;background-color:#fff;background-clip:padding-box;border:1px solid #dee2e6;border-radius:0;transition:border-color .15s ease-in-out,box-shadow .15s ease-in-out}@media(prefers-reduced-motion: reduce){.form-control{transition:none}}.form-control[type=file]{overflow:hidden}.form-control[type=file]:not(:disabled):not([readonly]){cursor:pointer}.form-control:focus{color:#343a40;background-color:#fff;border-color:#93c0f1;outline:0;box-shadow:0 0 0 .25rem rgba(39,128,227,.25)}.form-control::-webkit-date-and-time-value{min-width:85px;height:1.5em;margin:0}.form-control::-webkit-datetime-edit{display:block;padding:0}.form-control::placeholder{color:rgba(52,58,64,.75);opacity:1}.form-control:disabled{background-color:#e9ecef;opacity:1}.form-control::file-selector-button{padding:.375rem .75rem;margin:-0.375rem -0.75rem;margin-inline-end:.75rem;color:#343a40;background-color:#f8f9fa;pointer-events:none;border-color:inherit;border-style:solid;border-width:0;border-inline-end-width:1px;border-radius:0;transition:color .15s ease-in-out,background-color .15s ease-in-out,border-color .15s ease-in-out,box-shadow .15s ease-in-out}@media(prefers-reduced-motion: reduce){.form-control::file-selector-button{transition:none}}.form-control:hover:not(:disabled):not([readonly])::file-selector-button{background-color:#e9ecef}.form-control-plaintext{display:block;width:100%;padding:.375rem 0;margin-bottom:0;line-height:1.5;color:#343a40;background-color:rgba(0,0,0,0);border:solid rgba(0,0,0,0);border-width:1px 
0}.form-control-plaintext:focus{outline:0}.form-control-plaintext.form-control-sm,.form-control-plaintext.form-control-lg{padding-right:0;padding-left:0}.form-control-sm{min-height:calc(1.5em + 0.5rem + calc(1px * 2));padding:.25rem .5rem;font-size:0.875rem}.form-control-sm::file-selector-button{padding:.25rem .5rem;margin:-0.25rem -0.5rem;margin-inline-end:.5rem}.form-control-lg{min-height:calc(1.5em + 1rem + calc(1px * 2));padding:.5rem 1rem;font-size:1.25rem}.form-control-lg::file-selector-button{padding:.5rem 1rem;margin:-0.5rem -1rem;margin-inline-end:1rem}textarea.form-control{min-height:calc(1.5em + 0.75rem + calc(1px * 2))}textarea.form-control-sm{min-height:calc(1.5em + 0.5rem + calc(1px * 2))}textarea.form-control-lg{min-height:calc(1.5em + 1rem + calc(1px * 2))}.form-control-color{width:3rem;height:calc(1.5em + 0.75rem + calc(1px * 2));padding:.375rem}.form-control-color:not(:disabled):not([readonly]){cursor:pointer}.form-control-color::-moz-color-swatch{border:0 !important}.form-control-color::-webkit-color-swatch{border:0 !important}.form-control-color.form-control-sm{height:calc(1.5em + 0.5rem + calc(1px * 2))}.form-control-color.form-control-lg{height:calc(1.5em + 1rem + calc(1px * 2))}.form-select{--bs-form-select-bg-img: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 16 16'%3e%3cpath fill='none' stroke='%23343a40' stroke-linecap='round' stroke-linejoin='round' stroke-width='2' d='m2 5 6 6 6-6'/%3e%3c/svg%3e");display:block;width:100%;padding:.375rem 2.25rem .375rem .75rem;font-size:1rem;font-weight:400;line-height:1.5;color:#343a40;appearance:none;-webkit-appearance:none;-moz-appearance:none;-ms-appearance:none;-o-appearance:none;background-color:#fff;background-image:var(--bs-form-select-bg-img),var(--bs-form-select-bg-icon, none);background-repeat:no-repeat;background-position:right .75rem center;background-size:16px 12px;border:1px solid #dee2e6;border-radius:0;transition:border-color .15s ease-in-out,box-shadow 
.15s ease-in-out}@media(prefers-reduced-motion: reduce){.form-select{transition:none}}.form-select:focus{border-color:#93c0f1;outline:0;box-shadow:0 0 0 .25rem rgba(39,128,227,.25)}.form-select[multiple],.form-select[size]:not([size="1"]){padding-right:.75rem;background-image:none}.form-select:disabled{background-color:#e9ecef}.form-select:-moz-focusring{color:rgba(0,0,0,0);text-shadow:0 0 0 #343a40}.form-select-sm{padding-top:.25rem;padding-bottom:.25rem;padding-left:.5rem;font-size:0.875rem}.form-select-lg{padding-top:.5rem;padding-bottom:.5rem;padding-left:1rem;font-size:1.25rem}[data-bs-theme=dark] .form-select{--bs-form-select-bg-img: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 16 16'%3e%3cpath fill='none' stroke='%23dee2e6' stroke-linecap='round' stroke-linejoin='round' stroke-width='2' d='m2 5 6 6 6-6'/%3e%3c/svg%3e")}.form-check,.shiny-input-container .checkbox,.shiny-input-container .radio{display:block;min-height:1.5rem;padding-left:0;margin-bottom:.125rem}.form-check .form-check-input,.form-check .shiny-input-container .checkbox input,.form-check .shiny-input-container .radio input,.shiny-input-container .checkbox .form-check-input,.shiny-input-container .checkbox .shiny-input-container .checkbox input,.shiny-input-container .checkbox .shiny-input-container .radio input,.shiny-input-container .radio .form-check-input,.shiny-input-container .radio .shiny-input-container .checkbox input,.shiny-input-container .radio .shiny-input-container .radio input{float:left;margin-left:0}.form-check-reverse{padding-right:0;padding-left:0;text-align:right}.form-check-reverse .form-check-input{float:right;margin-right:0;margin-left:0}.form-check-input,.shiny-input-container .checkbox input,.shiny-input-container .checkbox-inline input,.shiny-input-container .radio input,.shiny-input-container .radio-inline input{--bs-form-check-bg: 
#fff;width:1em;height:1em;margin-top:.25em;vertical-align:top;appearance:none;-webkit-appearance:none;-moz-appearance:none;-ms-appearance:none;-o-appearance:none;background-color:var(--bs-form-check-bg);background-image:var(--bs-form-check-bg-image);background-repeat:no-repeat;background-position:center;background-size:contain;border:1px solid #dee2e6;print-color-adjust:exact}.form-check-input[type=radio],.shiny-input-container .checkbox input[type=radio],.shiny-input-container .checkbox-inline input[type=radio],.shiny-input-container .radio input[type=radio],.shiny-input-container .radio-inline input[type=radio]{border-radius:50%}.form-check-input:active,.shiny-input-container .checkbox input:active,.shiny-input-container .checkbox-inline input:active,.shiny-input-container .radio input:active,.shiny-input-container .radio-inline input:active{filter:brightness(90%)}.form-check-input:focus,.shiny-input-container .checkbox input:focus,.shiny-input-container .checkbox-inline input:focus,.shiny-input-container .radio input:focus,.shiny-input-container .radio-inline input:focus{border-color:#93c0f1;outline:0;box-shadow:0 0 0 .25rem rgba(39,128,227,.25)}.form-check-input:checked,.shiny-input-container .checkbox input:checked,.shiny-input-container .checkbox-inline input:checked,.shiny-input-container .radio input:checked,.shiny-input-container .radio-inline input:checked{background-color:#2780e3;border-color:#2780e3}.form-check-input:checked[type=checkbox],.shiny-input-container .checkbox input:checked[type=checkbox],.shiny-input-container .checkbox-inline input:checked[type=checkbox],.shiny-input-container .radio input:checked[type=checkbox],.shiny-input-container .radio-inline input:checked[type=checkbox]{--bs-form-check-bg-image: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 20 20'%3e%3cpath fill='none' stroke='%23fff' stroke-linecap='round' stroke-linejoin='round' stroke-width='3' d='m6 10 3 3 
6-6'/%3e%3c/svg%3e")}.form-check-input:checked[type=radio],.shiny-input-container .checkbox input:checked[type=radio],.shiny-input-container .checkbox-inline input:checked[type=radio],.shiny-input-container .radio input:checked[type=radio],.shiny-input-container .radio-inline input:checked[type=radio]{--bs-form-check-bg-image: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='-4 -4 8 8'%3e%3ccircle r='2' fill='%23fff'/%3e%3c/svg%3e")}.form-check-input[type=checkbox]:indeterminate,.shiny-input-container .checkbox input[type=checkbox]:indeterminate,.shiny-input-container .checkbox-inline input[type=checkbox]:indeterminate,.shiny-input-container .radio input[type=checkbox]:indeterminate,.shiny-input-container .radio-inline input[type=checkbox]:indeterminate{background-color:#2780e3;border-color:#2780e3;--bs-form-check-bg-image: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 20 20'%3e%3cpath fill='none' stroke='%23fff' stroke-linecap='round' stroke-linejoin='round' stroke-width='3' d='M6 10h8'/%3e%3c/svg%3e")}.form-check-input:disabled,.shiny-input-container .checkbox input:disabled,.shiny-input-container .checkbox-inline input:disabled,.shiny-input-container .radio input:disabled,.shiny-input-container .radio-inline input:disabled{pointer-events:none;filter:none;opacity:.5}.form-check-input[disabled]~.form-check-label,.form-check-input[disabled]~span,.form-check-input:disabled~.form-check-label,.form-check-input:disabled~span,.shiny-input-container .checkbox input[disabled]~.form-check-label,.shiny-input-container .checkbox input[disabled]~span,.shiny-input-container .checkbox input:disabled~.form-check-label,.shiny-input-container .checkbox input:disabled~span,.shiny-input-container .checkbox-inline input[disabled]~.form-check-label,.shiny-input-container .checkbox-inline input[disabled]~span,.shiny-input-container .checkbox-inline input:disabled~.form-check-label,.shiny-input-container .checkbox-inline 
input:disabled~span,.shiny-input-container .radio input[disabled]~.form-check-label,.shiny-input-container .radio input[disabled]~span,.shiny-input-container .radio input:disabled~.form-check-label,.shiny-input-container .radio input:disabled~span,.shiny-input-container .radio-inline input[disabled]~.form-check-label,.shiny-input-container .radio-inline input[disabled]~span,.shiny-input-container .radio-inline input:disabled~.form-check-label,.shiny-input-container .radio-inline input:disabled~span{cursor:default;opacity:.5}.form-check-label,.shiny-input-container .checkbox label,.shiny-input-container .checkbox-inline label,.shiny-input-container .radio label,.shiny-input-container .radio-inline label{cursor:pointer}.form-switch{padding-left:2.5em}.form-switch .form-check-input{--bs-form-switch-bg: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='-4 -4 8 8'%3e%3ccircle r='3' fill='rgba%280, 0, 0, 0.25%29'/%3e%3c/svg%3e");width:2em;margin-left:-2.5em;background-image:var(--bs-form-switch-bg);background-position:left center;transition:background-position .15s ease-in-out}@media(prefers-reduced-motion: reduce){.form-switch .form-check-input{transition:none}}.form-switch .form-check-input:focus{--bs-form-switch-bg: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='-4 -4 8 8'%3e%3ccircle r='3' fill='%2393c0f1'/%3e%3c/svg%3e")}.form-switch .form-check-input:checked{background-position:right center;--bs-form-switch-bg: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='-4 -4 8 8'%3e%3ccircle r='3' fill='%23fff'/%3e%3c/svg%3e")}.form-switch.form-check-reverse{padding-right:2.5em;padding-left:0}.form-switch.form-check-reverse .form-check-input{margin-right:-2.5em;margin-left:0}.form-check-inline{display:inline-block;margin-right:1rem}.btn-check{position:absolute;clip:rect(0, 0, 0, 
0);pointer-events:none}.btn-check[disabled]+.btn,.btn-check:disabled+.btn{pointer-events:none;filter:none;opacity:.65}[data-bs-theme=dark] .form-switch .form-check-input:not(:checked):not(:focus){--bs-form-switch-bg: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='-4 -4 8 8'%3e%3ccircle r='3' fill='rgba%28255, 255, 255, 0.25%29'/%3e%3c/svg%3e")}.form-range{width:100%;height:1.5rem;padding:0;appearance:none;-webkit-appearance:none;-moz-appearance:none;-ms-appearance:none;-o-appearance:none;background-color:rgba(0,0,0,0)}.form-range:focus{outline:0}.form-range:focus::-webkit-slider-thumb{box-shadow:0 0 0 1px #fff,0 0 0 .25rem rgba(39,128,227,.25)}.form-range:focus::-moz-range-thumb{box-shadow:0 0 0 1px #fff,0 0 0 .25rem rgba(39,128,227,.25)}.form-range::-moz-focus-outer{border:0}.form-range::-webkit-slider-thumb{width:1rem;height:1rem;margin-top:-0.25rem;appearance:none;-webkit-appearance:none;-moz-appearance:none;-ms-appearance:none;-o-appearance:none;background-color:#2780e3;border:0;transition:background-color .15s ease-in-out,border-color .15s ease-in-out,box-shadow .15s ease-in-out}@media(prefers-reduced-motion: reduce){.form-range::-webkit-slider-thumb{transition:none}}.form-range::-webkit-slider-thumb:active{background-color:#bed9f7}.form-range::-webkit-slider-runnable-track{width:100%;height:.5rem;color:rgba(0,0,0,0);cursor:pointer;background-color:#f8f9fa;border-color:rgba(0,0,0,0)}.form-range::-moz-range-thumb{width:1rem;height:1rem;appearance:none;-webkit-appearance:none;-moz-appearance:none;-ms-appearance:none;-o-appearance:none;background-color:#2780e3;border:0;transition:background-color .15s ease-in-out,border-color .15s ease-in-out,box-shadow .15s ease-in-out}@media(prefers-reduced-motion: 
reduce){.form-range::-moz-range-thumb{transition:none}}.form-range::-moz-range-thumb:active{background-color:#bed9f7}.form-range::-moz-range-track{width:100%;height:.5rem;color:rgba(0,0,0,0);cursor:pointer;background-color:#f8f9fa;border-color:rgba(0,0,0,0)}.form-range:disabled{pointer-events:none}.form-range:disabled::-webkit-slider-thumb{background-color:rgba(52,58,64,.75)}.form-range:disabled::-moz-range-thumb{background-color:rgba(52,58,64,.75)}.form-floating{position:relative}.form-floating>.form-control,.form-floating>.form-control-plaintext,.form-floating>.form-select{height:calc(3.5rem + calc(1px * 2));min-height:calc(3.5rem + calc(1px * 2));line-height:1.25}.form-floating>label{position:absolute;top:0;left:0;z-index:2;height:100%;padding:1rem .75rem;overflow:hidden;text-align:start;text-overflow:ellipsis;white-space:nowrap;pointer-events:none;border:1px solid rgba(0,0,0,0);transform-origin:0 0;transition:opacity .1s ease-in-out,transform .1s ease-in-out}@media(prefers-reduced-motion: reduce){.form-floating>label{transition:none}}.form-floating>.form-control,.form-floating>.form-control-plaintext{padding:1rem .75rem}.form-floating>.form-control::placeholder,.form-floating>.form-control-plaintext::placeholder{color:rgba(0,0,0,0)}.form-floating>.form-control:focus,.form-floating>.form-control:not(:placeholder-shown),.form-floating>.form-control-plaintext:focus,.form-floating>.form-control-plaintext:not(:placeholder-shown){padding-top:1.625rem;padding-bottom:.625rem}.form-floating>.form-control:-webkit-autofill,.form-floating>.form-control-plaintext:-webkit-autofill{padding-top:1.625rem;padding-bottom:.625rem}.form-floating>.form-select{padding-top:1.625rem;padding-bottom:.625rem}.form-floating>.form-control:focus~label,.form-floating>.form-control:not(:placeholder-shown)~label,.form-floating>.form-control-plaintext~label,.form-floating>.form-select~label{color:rgba(var(--bs-body-color-rgb), 0.65);transform:scale(0.85) translateY(-0.5rem) 
translateX(0.15rem)}.form-floating>.form-control:focus~label::after,.form-floating>.form-control:not(:placeholder-shown)~label::after,.form-floating>.form-control-plaintext~label::after,.form-floating>.form-select~label::after{position:absolute;inset:1rem .375rem;z-index:-1;height:1.5em;content:"";background-color:#fff}.form-floating>.form-control:-webkit-autofill~label{color:rgba(var(--bs-body-color-rgb), 0.65);transform:scale(0.85) translateY(-0.5rem) translateX(0.15rem)}.form-floating>.form-control-plaintext~label{border-width:1px 0}.form-floating>:disabled~label,.form-floating>.form-control:disabled~label{color:#6c757d}.form-floating>:disabled~label::after,.form-floating>.form-control:disabled~label::after{background-color:#e9ecef}.input-group{position:relative;display:flex;display:-webkit-flex;flex-wrap:wrap;-webkit-flex-wrap:wrap;align-items:stretch;-webkit-align-items:stretch;width:100%}.input-group>.form-control,.input-group>.form-select,.input-group>.form-floating{position:relative;flex:1 1 auto;-webkit-flex:1 1 auto;width:1%;min-width:0}.input-group>.form-control:focus,.input-group>.form-select:focus,.input-group>.form-floating:focus-within{z-index:5}.input-group .btn{position:relative;z-index:2}.input-group .btn:focus{z-index:5}.input-group-text{display:flex;display:-webkit-flex;align-items:center;-webkit-align-items:center;padding:.375rem .75rem;font-size:1rem;font-weight:400;line-height:1.5;color:#343a40;text-align:center;white-space:nowrap;background-color:#f8f9fa;border:1px solid #dee2e6}.input-group-lg>.form-control,.input-group-lg>.form-select,.input-group-lg>.input-group-text,.input-group-lg>.btn{padding:.5rem 1rem;font-size:1.25rem}.input-group-sm>.form-control,.input-group-sm>.form-select,.input-group-sm>.input-group-text,.input-group-sm>.btn{padding:.25rem 
.5rem;font-size:0.875rem}.input-group-lg>.form-select,.input-group-sm>.form-select{padding-right:3rem}.input-group>:not(:first-child):not(.dropdown-menu):not(.valid-tooltip):not(.valid-feedback):not(.invalid-tooltip):not(.invalid-feedback){margin-left:calc(1px*-1)}.valid-feedback{display:none;width:100%;margin-top:.25rem;font-size:0.875em;color:#3fb618}.valid-tooltip{position:absolute;top:100%;z-index:5;display:none;max-width:100%;padding:.25rem .5rem;margin-top:.1rem;font-size:0.875rem;color:#fff;background-color:#3fb618}.was-validated :valid~.valid-feedback,.was-validated :valid~.valid-tooltip,.is-valid~.valid-feedback,.is-valid~.valid-tooltip{display:block}.was-validated .form-control:valid,.form-control.is-valid{border-color:#3fb618;padding-right:calc(1.5em + 0.75rem);background-image:url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 8 8'%3e%3cpath fill='%233fb618' d='M2.3 6.73.6 4.53c-.4-1.04.46-1.4 1.1-.8l1.1 1.4 3.4-3.8c.6-.63 1.6-.27 1.2.7l-4 4.6c-.43.5-.8.4-1.1.1z'/%3e%3c/svg%3e");background-repeat:no-repeat;background-position:right calc(0.375em + 0.1875rem) center;background-size:calc(0.75em + 0.375rem) calc(0.75em + 0.375rem)}.was-validated .form-control:valid:focus,.form-control.is-valid:focus{border-color:#3fb618;box-shadow:0 0 0 .25rem rgba(63,182,24,.25)}.was-validated textarea.form-control:valid,textarea.form-control.is-valid{padding-right:calc(1.5em + 0.75rem);background-position:top calc(0.375em + 0.1875rem) right calc(0.375em + 0.1875rem)}.was-validated .form-select:valid,.form-select.is-valid{border-color:#3fb618}.was-validated .form-select:valid:not([multiple]):not([size]),.was-validated .form-select:valid:not([multiple])[size="1"],.form-select.is-valid:not([multiple]):not([size]),.form-select.is-valid:not([multiple])[size="1"]{--bs-form-select-bg-icon: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 8 8'%3e%3cpath fill='%233fb618' d='M2.3 6.73.6 4.53c-.4-1.04.46-1.4 1.1-.8l1.1 1.4 
3.4-3.8c.6-.63 1.6-.27 1.2.7l-4 4.6c-.43.5-.8.4-1.1.1z'/%3e%3c/svg%3e");padding-right:4.125rem;background-position:right .75rem center,center right 2.25rem;background-size:16px 12px,calc(0.75em + 0.375rem) calc(0.75em + 0.375rem)}.was-validated .form-select:valid:focus,.form-select.is-valid:focus{border-color:#3fb618;box-shadow:0 0 0 .25rem rgba(63,182,24,.25)}.was-validated .form-control-color:valid,.form-control-color.is-valid{width:calc(3rem + calc(1.5em + 0.75rem))}.was-validated .form-check-input:valid,.form-check-input.is-valid{border-color:#3fb618}.was-validated .form-check-input:valid:checked,.form-check-input.is-valid:checked{background-color:#3fb618}.was-validated .form-check-input:valid:focus,.form-check-input.is-valid:focus{box-shadow:0 0 0 .25rem rgba(63,182,24,.25)}.was-validated .form-check-input:valid~.form-check-label,.form-check-input.is-valid~.form-check-label{color:#3fb618}.form-check-inline .form-check-input~.valid-feedback{margin-left:.5em}.was-validated .input-group>.form-control:not(:focus):valid,.input-group>.form-control:not(:focus).is-valid,.was-validated .input-group>.form-select:not(:focus):valid,.input-group>.form-select:not(:focus).is-valid,.was-validated .input-group>.form-floating:not(:focus-within):valid,.input-group>.form-floating:not(:focus-within).is-valid{z-index:3}.invalid-feedback{display:none;width:100%;margin-top:.25rem;font-size:0.875em;color:#ff0039}.invalid-tooltip{position:absolute;top:100%;z-index:5;display:none;max-width:100%;padding:.25rem .5rem;margin-top:.1rem;font-size:0.875rem;color:#fff;background-color:#ff0039}.was-validated :invalid~.invalid-feedback,.was-validated :invalid~.invalid-tooltip,.is-invalid~.invalid-feedback,.is-invalid~.invalid-tooltip{display:block}.was-validated .form-control:invalid,.form-control.is-invalid{border-color:#ff0039;padding-right:calc(1.5em + 0.75rem);background-image:url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 12 12' width='12' height='12' 
fill='none' stroke='%23ff0039'%3e%3ccircle cx='6' cy='6' r='4.5'/%3e%3cpath stroke-linejoin='round' d='M5.8 3.6h.4L6 6.5z'/%3e%3ccircle cx='6' cy='8.2' r='.6' fill='%23ff0039' stroke='none'/%3e%3c/svg%3e");background-repeat:no-repeat;background-position:right calc(0.375em + 0.1875rem) center;background-size:calc(0.75em + 0.375rem) calc(0.75em + 0.375rem)}.was-validated .form-control:invalid:focus,.form-control.is-invalid:focus{border-color:#ff0039;box-shadow:0 0 0 .25rem rgba(255,0,57,.25)}.was-validated textarea.form-control:invalid,textarea.form-control.is-invalid{padding-right:calc(1.5em + 0.75rem);background-position:top calc(0.375em + 0.1875rem) right calc(0.375em + 0.1875rem)}.was-validated .form-select:invalid,.form-select.is-invalid{border-color:#ff0039}.was-validated .form-select:invalid:not([multiple]):not([size]),.was-validated .form-select:invalid:not([multiple])[size="1"],.form-select.is-invalid:not([multiple]):not([size]),.form-select.is-invalid:not([multiple])[size="1"]{--bs-form-select-bg-icon: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 12 12' width='12' height='12' fill='none' stroke='%23ff0039'%3e%3ccircle cx='6' cy='6' r='4.5'/%3e%3cpath stroke-linejoin='round' d='M5.8 3.6h.4L6 6.5z'/%3e%3ccircle cx='6' cy='8.2' r='.6' fill='%23ff0039' stroke='none'/%3e%3c/svg%3e");padding-right:4.125rem;background-position:right .75rem center,center right 2.25rem;background-size:16px 12px,calc(0.75em + 0.375rem) calc(0.75em + 0.375rem)}.was-validated .form-select:invalid:focus,.form-select.is-invalid:focus{border-color:#ff0039;box-shadow:0 0 0 .25rem rgba(255,0,57,.25)}.was-validated .form-control-color:invalid,.form-control-color.is-invalid{width:calc(3rem + calc(1.5em + 0.75rem))}.was-validated .form-check-input:invalid,.form-check-input.is-invalid{border-color:#ff0039}.was-validated .form-check-input:invalid:checked,.form-check-input.is-invalid:checked{background-color:#ff0039}.was-validated 
.form-check-input:invalid:focus,.form-check-input.is-invalid:focus{box-shadow:0 0 0 .25rem rgba(255,0,57,.25)}.was-validated .form-check-input:invalid~.form-check-label,.form-check-input.is-invalid~.form-check-label{color:#ff0039}.form-check-inline .form-check-input~.invalid-feedback{margin-left:.5em}.was-validated .input-group>.form-control:not(:focus):invalid,.input-group>.form-control:not(:focus).is-invalid,.was-validated .input-group>.form-select:not(:focus):invalid,.input-group>.form-select:not(:focus).is-invalid,.was-validated .input-group>.form-floating:not(:focus-within):invalid,.input-group>.form-floating:not(:focus-within).is-invalid{z-index:4}.btn{--bs-btn-padding-x: 0.75rem;--bs-btn-padding-y: 0.375rem;--bs-btn-font-family: ;--bs-btn-font-size:1rem;--bs-btn-font-weight: 400;--bs-btn-line-height: 1.5;--bs-btn-color: #343a40;--bs-btn-bg: transparent;--bs-btn-border-width: 1px;--bs-btn-border-color: transparent;--bs-btn-border-radius: 0.25rem;--bs-btn-hover-border-color: transparent;--bs-btn-box-shadow: inset 0 1px 0 rgba(255, 255, 255, 0.15), 0 1px 1px rgba(0, 0, 0, 0.075);--bs-btn-disabled-opacity: 0.65;--bs-btn-focus-box-shadow: 0 0 0 0.25rem rgba(var(--bs-btn-focus-shadow-rgb), .5);display:inline-block;padding:var(--bs-btn-padding-y) var(--bs-btn-padding-x);font-family:var(--bs-btn-font-family);font-size:var(--bs-btn-font-size);font-weight:var(--bs-btn-font-weight);line-height:var(--bs-btn-line-height);color:var(--bs-btn-color);text-align:center;text-decoration:none;-webkit-text-decoration:none;-moz-text-decoration:none;-ms-text-decoration:none;-o-text-decoration:none;vertical-align:middle;cursor:pointer;user-select:none;-webkit-user-select:none;-moz-user-select:none;-ms-user-select:none;-o-user-select:none;border:var(--bs-btn-border-width) solid var(--bs-btn-border-color);background-color:var(--bs-btn-bg);transition:color .15s ease-in-out,background-color .15s ease-in-out,border-color .15s ease-in-out,box-shadow .15s 
ease-in-out}@media(prefers-reduced-motion: reduce){.btn{transition:none}}.btn:hover{color:var(--bs-btn-hover-color);background-color:var(--bs-btn-hover-bg);border-color:var(--bs-btn-hover-border-color)}.btn-check+.btn:hover{color:var(--bs-btn-color);background-color:var(--bs-btn-bg);border-color:var(--bs-btn-border-color)}.btn:focus-visible{color:var(--bs-btn-hover-color);background-color:var(--bs-btn-hover-bg);border-color:var(--bs-btn-hover-border-color);outline:0;box-shadow:var(--bs-btn-focus-box-shadow)}.btn-check:focus-visible+.btn{border-color:var(--bs-btn-hover-border-color);outline:0;box-shadow:var(--bs-btn-focus-box-shadow)}.btn-check:checked+.btn,:not(.btn-check)+.btn:active,.btn:first-child:active,.btn.active,.btn.show{color:var(--bs-btn-active-color);background-color:var(--bs-btn-active-bg);border-color:var(--bs-btn-active-border-color)}.btn-check:checked+.btn:focus-visible,:not(.btn-check)+.btn:active:focus-visible,.btn:first-child:active:focus-visible,.btn.active:focus-visible,.btn.show:focus-visible{box-shadow:var(--bs-btn-focus-box-shadow)}.btn:disabled,.btn.disabled,fieldset:disabled .btn{color:var(--bs-btn-disabled-color);pointer-events:none;background-color:var(--bs-btn-disabled-bg);border-color:var(--bs-btn-disabled-border-color);opacity:var(--bs-btn-disabled-opacity)}.btn-default{--bs-btn-color: #fff;--bs-btn-bg: #343a40;--bs-btn-border-color: #343a40;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #2c3136;--bs-btn-hover-border-color: #2a2e33;--bs-btn-focus-shadow-rgb: 82, 88, 93;--bs-btn-active-color: #fff;--bs-btn-active-bg: #2a2e33;--bs-btn-active-border-color: #272c30;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #fff;--bs-btn-disabled-bg: #343a40;--bs-btn-disabled-border-color: #343a40}.btn-primary{--bs-btn-color: #fff;--bs-btn-bg: #2780e3;--bs-btn-border-color: #2780e3;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #216dc1;--bs-btn-hover-border-color: #1f66b6;--bs-btn-focus-shadow-rgb: 71, 147, 
231;--bs-btn-active-color: #fff;--bs-btn-active-bg: #1f66b6;--bs-btn-active-border-color: #1d60aa;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #fff;--bs-btn-disabled-bg: #2780e3;--bs-btn-disabled-border-color: #2780e3}.btn-secondary{--bs-btn-color: #fff;--bs-btn-bg: #343a40;--bs-btn-border-color: #343a40;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #2c3136;--bs-btn-hover-border-color: #2a2e33;--bs-btn-focus-shadow-rgb: 82, 88, 93;--bs-btn-active-color: #fff;--bs-btn-active-bg: #2a2e33;--bs-btn-active-border-color: #272c30;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #fff;--bs-btn-disabled-bg: #343a40;--bs-btn-disabled-border-color: #343a40}.btn-success{--bs-btn-color: #fff;--bs-btn-bg: #3fb618;--bs-btn-border-color: #3fb618;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #369b14;--bs-btn-hover-border-color: #329213;--bs-btn-focus-shadow-rgb: 92, 193, 59;--bs-btn-active-color: #fff;--bs-btn-active-bg: #329213;--bs-btn-active-border-color: #2f8912;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #fff;--bs-btn-disabled-bg: #3fb618;--bs-btn-disabled-border-color: #3fb618}.btn-info{--bs-btn-color: #fff;--bs-btn-bg: #9954bb;--bs-btn-border-color: #9954bb;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #82479f;--bs-btn-hover-border-color: #7a4396;--bs-btn-focus-shadow-rgb: 168, 110, 197;--bs-btn-active-color: #fff;--bs-btn-active-bg: #7a4396;--bs-btn-active-border-color: #733f8c;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #fff;--bs-btn-disabled-bg: #9954bb;--bs-btn-disabled-border-color: #9954bb}.btn-warning{--bs-btn-color: #fff;--bs-btn-bg: #ff7518;--bs-btn-border-color: #ff7518;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #d96314;--bs-btn-hover-border-color: #cc5e13;--bs-btn-focus-shadow-rgb: 255, 138, 59;--bs-btn-active-color: #fff;--bs-btn-active-bg: #cc5e13;--bs-btn-active-border-color: 
#bf5812;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #fff;--bs-btn-disabled-bg: #ff7518;--bs-btn-disabled-border-color: #ff7518}.btn-danger{--bs-btn-color: #fff;--bs-btn-bg: #ff0039;--bs-btn-border-color: #ff0039;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #d90030;--bs-btn-hover-border-color: #cc002e;--bs-btn-focus-shadow-rgb: 255, 38, 87;--bs-btn-active-color: #fff;--bs-btn-active-bg: #cc002e;--bs-btn-active-border-color: #bf002b;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #fff;--bs-btn-disabled-bg: #ff0039;--bs-btn-disabled-border-color: #ff0039}.btn-light{--bs-btn-color: #000;--bs-btn-bg: #f8f9fa;--bs-btn-border-color: #f8f9fa;--bs-btn-hover-color: #000;--bs-btn-hover-bg: #d3d4d5;--bs-btn-hover-border-color: #c6c7c8;--bs-btn-focus-shadow-rgb: 211, 212, 213;--bs-btn-active-color: #000;--bs-btn-active-bg: #c6c7c8;--bs-btn-active-border-color: #babbbc;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #000;--bs-btn-disabled-bg: #f8f9fa;--bs-btn-disabled-border-color: #f8f9fa}.btn-dark{--bs-btn-color: #fff;--bs-btn-bg: #343a40;--bs-btn-border-color: #343a40;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #52585d;--bs-btn-hover-border-color: #484e53;--bs-btn-focus-shadow-rgb: 82, 88, 93;--bs-btn-active-color: #fff;--bs-btn-active-bg: #5d6166;--bs-btn-active-border-color: #484e53;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #fff;--bs-btn-disabled-bg: #343a40;--bs-btn-disabled-border-color: #343a40}.btn-outline-default{--bs-btn-color: #343a40;--bs-btn-border-color: #343a40;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #343a40;--bs-btn-hover-border-color: #343a40;--bs-btn-focus-shadow-rgb: 52, 58, 64;--bs-btn-active-color: #fff;--bs-btn-active-bg: #343a40;--bs-btn-active-border-color: #343a40;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #343a40;--bs-btn-disabled-bg: 
transparent;--bs-btn-disabled-border-color: #343a40;--bs-btn-bg: transparent;--bs-gradient: none}.btn-outline-primary{--bs-btn-color: #2780e3;--bs-btn-border-color: #2780e3;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #2780e3;--bs-btn-hover-border-color: #2780e3;--bs-btn-focus-shadow-rgb: 39, 128, 227;--bs-btn-active-color: #fff;--bs-btn-active-bg: #2780e3;--bs-btn-active-border-color: #2780e3;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #2780e3;--bs-btn-disabled-bg: transparent;--bs-btn-disabled-border-color: #2780e3;--bs-btn-bg: transparent;--bs-gradient: none}.btn-outline-secondary{--bs-btn-color: #343a40;--bs-btn-border-color: #343a40;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #343a40;--bs-btn-hover-border-color: #343a40;--bs-btn-focus-shadow-rgb: 52, 58, 64;--bs-btn-active-color: #fff;--bs-btn-active-bg: #343a40;--bs-btn-active-border-color: #343a40;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #343a40;--bs-btn-disabled-bg: transparent;--bs-btn-disabled-border-color: #343a40;--bs-btn-bg: transparent;--bs-gradient: none}.btn-outline-success{--bs-btn-color: #3fb618;--bs-btn-border-color: #3fb618;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #3fb618;--bs-btn-hover-border-color: #3fb618;--bs-btn-focus-shadow-rgb: 63, 182, 24;--bs-btn-active-color: #fff;--bs-btn-active-bg: #3fb618;--bs-btn-active-border-color: #3fb618;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #3fb618;--bs-btn-disabled-bg: transparent;--bs-btn-disabled-border-color: #3fb618;--bs-btn-bg: transparent;--bs-gradient: none}.btn-outline-info{--bs-btn-color: #9954bb;--bs-btn-border-color: #9954bb;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #9954bb;--bs-btn-hover-border-color: #9954bb;--bs-btn-focus-shadow-rgb: 153, 84, 187;--bs-btn-active-color: #fff;--bs-btn-active-bg: #9954bb;--bs-btn-active-border-color: #9954bb;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 
0.125);--bs-btn-disabled-color: #9954bb;--bs-btn-disabled-bg: transparent;--bs-btn-disabled-border-color: #9954bb;--bs-btn-bg: transparent;--bs-gradient: none}.btn-outline-warning{--bs-btn-color: #ff7518;--bs-btn-border-color: #ff7518;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #ff7518;--bs-btn-hover-border-color: #ff7518;--bs-btn-focus-shadow-rgb: 255, 117, 24;--bs-btn-active-color: #fff;--bs-btn-active-bg: #ff7518;--bs-btn-active-border-color: #ff7518;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #ff7518;--bs-btn-disabled-bg: transparent;--bs-btn-disabled-border-color: #ff7518;--bs-btn-bg: transparent;--bs-gradient: none}.btn-outline-danger{--bs-btn-color: #ff0039;--bs-btn-border-color: #ff0039;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #ff0039;--bs-btn-hover-border-color: #ff0039;--bs-btn-focus-shadow-rgb: 255, 0, 57;--bs-btn-active-color: #fff;--bs-btn-active-bg: #ff0039;--bs-btn-active-border-color: #ff0039;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #ff0039;--bs-btn-disabled-bg: transparent;--bs-btn-disabled-border-color: #ff0039;--bs-btn-bg: transparent;--bs-gradient: none}.btn-outline-light{--bs-btn-color: #f8f9fa;--bs-btn-border-color: #f8f9fa;--bs-btn-hover-color: #000;--bs-btn-hover-bg: #f8f9fa;--bs-btn-hover-border-color: #f8f9fa;--bs-btn-focus-shadow-rgb: 248, 249, 250;--bs-btn-active-color: #000;--bs-btn-active-bg: #f8f9fa;--bs-btn-active-border-color: #f8f9fa;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #f8f9fa;--bs-btn-disabled-bg: transparent;--bs-btn-disabled-border-color: #f8f9fa;--bs-btn-bg: transparent;--bs-gradient: none}.btn-outline-dark{--bs-btn-color: #343a40;--bs-btn-border-color: #343a40;--bs-btn-hover-color: #fff;--bs-btn-hover-bg: #343a40;--bs-btn-hover-border-color: #343a40;--bs-btn-focus-shadow-rgb: 52, 58, 64;--bs-btn-active-color: #fff;--bs-btn-active-bg: #343a40;--bs-btn-active-border-color: 
#343a40;--bs-btn-active-shadow: inset 0 3px 5px rgba(0, 0, 0, 0.125);--bs-btn-disabled-color: #343a40;--bs-btn-disabled-bg: transparent;--bs-btn-disabled-border-color: #343a40;--bs-btn-bg: transparent;--bs-gradient: none}.btn-link{--bs-btn-font-weight: 400;--bs-btn-color: #2761e3;--bs-btn-bg: transparent;--bs-btn-border-color: transparent;--bs-btn-hover-color: #1f4eb6;--bs-btn-hover-border-color: transparent;--bs-btn-active-color: #1f4eb6;--bs-btn-active-border-color: transparent;--bs-btn-disabled-color: #6c757d;--bs-btn-disabled-border-color: transparent;--bs-btn-box-shadow: 0 0 0 #000;--bs-btn-focus-shadow-rgb: 71, 121, 231;text-decoration:underline;-webkit-text-decoration:underline;-moz-text-decoration:underline;-ms-text-decoration:underline;-o-text-decoration:underline}.btn-link:focus-visible{color:var(--bs-btn-color)}.btn-link:hover{color:var(--bs-btn-hover-color)}.btn-lg,.btn-group-lg>.btn{--bs-btn-padding-y: 0.5rem;--bs-btn-padding-x: 1rem;--bs-btn-font-size:1.25rem;--bs-btn-border-radius: 0.5rem}.btn-sm,.btn-group-sm>.btn{--bs-btn-padding-y: 0.25rem;--bs-btn-padding-x: 0.5rem;--bs-btn-font-size:0.875rem;--bs-btn-border-radius: 0.2em}.fade{transition:opacity .15s linear}@media(prefers-reduced-motion: reduce){.fade{transition:none}}.fade:not(.show){opacity:0}.collapse:not(.show){display:none}.collapsing{height:0;overflow:hidden;transition:height .2s ease}@media(prefers-reduced-motion: reduce){.collapsing{transition:none}}.collapsing.collapse-horizontal{width:0;height:auto;transition:width .35s ease}@media(prefers-reduced-motion: reduce){.collapsing.collapse-horizontal{transition:none}}.dropup,.dropend,.dropdown,.dropstart,.dropup-center,.dropdown-center{position:relative}.dropdown-toggle{white-space:nowrap}.dropdown-toggle::after{display:inline-block;margin-left:.255em;vertical-align:.255em;content:"";border-top:.3em solid;border-right:.3em solid rgba(0,0,0,0);border-bottom:0;border-left:.3em solid 
rgba(0,0,0,0)}.dropdown-toggle:empty::after{margin-left:0}.dropdown-menu{--bs-dropdown-zindex: 1000;--bs-dropdown-min-width: 10rem;--bs-dropdown-padding-x: 0;--bs-dropdown-padding-y: 0.5rem;--bs-dropdown-spacer: 0.125rem;--bs-dropdown-font-size:1rem;--bs-dropdown-color: #343a40;--bs-dropdown-bg: #fff;--bs-dropdown-border-color: rgba(0, 0, 0, 0.175);--bs-dropdown-border-radius: 0.25rem;--bs-dropdown-border-width: 1px;--bs-dropdown-inner-border-radius: calc(0.25rem - 1px);--bs-dropdown-divider-bg: rgba(0, 0, 0, 0.175);--bs-dropdown-divider-margin-y: 0.5rem;--bs-dropdown-box-shadow: 0 0.5rem 1rem rgba(0, 0, 0, 0.15);--bs-dropdown-link-color: #343a40;--bs-dropdown-link-hover-color: #343a40;--bs-dropdown-link-hover-bg: #f8f9fa;--bs-dropdown-link-active-color: #fff;--bs-dropdown-link-active-bg: #2780e3;--bs-dropdown-link-disabled-color: rgba(52, 58, 64, 0.5);--bs-dropdown-item-padding-x: 1rem;--bs-dropdown-item-padding-y: 0.25rem;--bs-dropdown-header-color: #6c757d;--bs-dropdown-header-padding-x: 1rem;--bs-dropdown-header-padding-y: 0.5rem;position:absolute;z-index:var(--bs-dropdown-zindex);display:none;min-width:var(--bs-dropdown-min-width);padding:var(--bs-dropdown-padding-y) var(--bs-dropdown-padding-x);margin:0;font-size:var(--bs-dropdown-font-size);color:var(--bs-dropdown-color);text-align:left;list-style:none;background-color:var(--bs-dropdown-bg);background-clip:padding-box;border:var(--bs-dropdown-border-width) solid var(--bs-dropdown-border-color)}.dropdown-menu[data-bs-popper]{top:100%;left:0;margin-top:var(--bs-dropdown-spacer)}.dropdown-menu-start{--bs-position: start}.dropdown-menu-start[data-bs-popper]{right:auto;left:0}.dropdown-menu-end{--bs-position: end}.dropdown-menu-end[data-bs-popper]{right:0;left:auto}@media(min-width: 576px){.dropdown-menu-sm-start{--bs-position: start}.dropdown-menu-sm-start[data-bs-popper]{right:auto;left:0}.dropdown-menu-sm-end{--bs-position: end}.dropdown-menu-sm-end[data-bs-popper]{right:0;left:auto}}@media(min-width: 
768px){.dropdown-menu-md-start{--bs-position: start}.dropdown-menu-md-start[data-bs-popper]{right:auto;left:0}.dropdown-menu-md-end{--bs-position: end}.dropdown-menu-md-end[data-bs-popper]{right:0;left:auto}}@media(min-width: 992px){.dropdown-menu-lg-start{--bs-position: start}.dropdown-menu-lg-start[data-bs-popper]{right:auto;left:0}.dropdown-menu-lg-end{--bs-position: end}.dropdown-menu-lg-end[data-bs-popper]{right:0;left:auto}}@media(min-width: 1200px){.dropdown-menu-xl-start{--bs-position: start}.dropdown-menu-xl-start[data-bs-popper]{right:auto;left:0}.dropdown-menu-xl-end{--bs-position: end}.dropdown-menu-xl-end[data-bs-popper]{right:0;left:auto}}@media(min-width: 1400px){.dropdown-menu-xxl-start{--bs-position: start}.dropdown-menu-xxl-start[data-bs-popper]{right:auto;left:0}.dropdown-menu-xxl-end{--bs-position: end}.dropdown-menu-xxl-end[data-bs-popper]{right:0;left:auto}}.dropup .dropdown-menu[data-bs-popper]{top:auto;bottom:100%;margin-top:0;margin-bottom:var(--bs-dropdown-spacer)}.dropup .dropdown-toggle::after{display:inline-block;margin-left:.255em;vertical-align:.255em;content:"";border-top:0;border-right:.3em solid rgba(0,0,0,0);border-bottom:.3em solid;border-left:.3em solid rgba(0,0,0,0)}.dropup .dropdown-toggle:empty::after{margin-left:0}.dropend .dropdown-menu[data-bs-popper]{top:0;right:auto;left:100%;margin-top:0;margin-left:var(--bs-dropdown-spacer)}.dropend .dropdown-toggle::after{display:inline-block;margin-left:.255em;vertical-align:.255em;content:"";border-top:.3em solid rgba(0,0,0,0);border-right:0;border-bottom:.3em solid rgba(0,0,0,0);border-left:.3em solid}.dropend .dropdown-toggle:empty::after{margin-left:0}.dropend .dropdown-toggle::after{vertical-align:0}.dropstart .dropdown-menu[data-bs-popper]{top:0;right:100%;left:auto;margin-top:0;margin-right:var(--bs-dropdown-spacer)}.dropstart .dropdown-toggle::after{display:inline-block;margin-left:.255em;vertical-align:.255em;content:""}.dropstart 
.dropdown-toggle::after{display:none}.dropstart .dropdown-toggle::before{display:inline-block;margin-right:.255em;vertical-align:.255em;content:"";border-top:.3em solid rgba(0,0,0,0);border-right:.3em solid;border-bottom:.3em solid rgba(0,0,0,0)}.dropstart .dropdown-toggle:empty::after{margin-left:0}.dropstart .dropdown-toggle::before{vertical-align:0}.dropdown-divider{height:0;margin:var(--bs-dropdown-divider-margin-y) 0;overflow:hidden;border-top:1px solid var(--bs-dropdown-divider-bg);opacity:1}.dropdown-item{display:block;width:100%;padding:var(--bs-dropdown-item-padding-y) var(--bs-dropdown-item-padding-x);clear:both;font-weight:400;color:var(--bs-dropdown-link-color);text-align:inherit;text-decoration:none;-webkit-text-decoration:none;-moz-text-decoration:none;-ms-text-decoration:none;-o-text-decoration:none;white-space:nowrap;background-color:rgba(0,0,0,0);border:0}.dropdown-item:hover,.dropdown-item:focus{color:var(--bs-dropdown-link-hover-color);background-color:var(--bs-dropdown-link-hover-bg)}.dropdown-item.active,.dropdown-item:active{color:var(--bs-dropdown-link-active-color);text-decoration:none;background-color:var(--bs-dropdown-link-active-bg)}.dropdown-item.disabled,.dropdown-item:disabled{color:var(--bs-dropdown-link-disabled-color);pointer-events:none;background-color:rgba(0,0,0,0)}.dropdown-menu.show{display:block}.dropdown-header{display:block;padding:var(--bs-dropdown-header-padding-y) var(--bs-dropdown-header-padding-x);margin-bottom:0;font-size:0.875rem;color:var(--bs-dropdown-header-color);white-space:nowrap}.dropdown-item-text{display:block;padding:var(--bs-dropdown-item-padding-y) var(--bs-dropdown-item-padding-x);color:var(--bs-dropdown-link-color)}.dropdown-menu-dark{--bs-dropdown-color: #dee2e6;--bs-dropdown-bg: #343a40;--bs-dropdown-border-color: rgba(0, 0, 0, 0.175);--bs-dropdown-box-shadow: ;--bs-dropdown-link-color: #dee2e6;--bs-dropdown-link-hover-color: #fff;--bs-dropdown-divider-bg: rgba(0, 0, 0, 
0.175);--bs-dropdown-link-hover-bg: rgba(255, 255, 255, 0.15);--bs-dropdown-link-active-color: #fff;--bs-dropdown-link-active-bg: #2780e3;--bs-dropdown-link-disabled-color: #adb5bd;--bs-dropdown-header-color: #adb5bd}.btn-group,.btn-group-vertical{position:relative;display:inline-flex;vertical-align:middle}.btn-group>.btn,.btn-group-vertical>.btn{position:relative;flex:1 1 auto;-webkit-flex:1 1 auto}.btn-group>.btn-check:checked+.btn,.btn-group>.btn-check:focus+.btn,.btn-group>.btn:hover,.btn-group>.btn:focus,.btn-group>.btn:active,.btn-group>.btn.active,.btn-group-vertical>.btn-check:checked+.btn,.btn-group-vertical>.btn-check:focus+.btn,.btn-group-vertical>.btn:hover,.btn-group-vertical>.btn:focus,.btn-group-vertical>.btn:active,.btn-group-vertical>.btn.active{z-index:1}.btn-toolbar{display:flex;display:-webkit-flex;flex-wrap:wrap;-webkit-flex-wrap:wrap;justify-content:flex-start;-webkit-justify-content:flex-start}.btn-toolbar .input-group{width:auto}.btn-group>:not(.btn-check:first-child)+.btn,.btn-group>.btn-group:not(:first-child){margin-left:calc(1px*-1)}.dropdown-toggle-split{padding-right:.5625rem;padding-left:.5625rem}.dropdown-toggle-split::after,.dropup .dropdown-toggle-split::after,.dropend .dropdown-toggle-split::after{margin-left:0}.dropstart .dropdown-toggle-split::before{margin-right:0}.btn-sm+.dropdown-toggle-split,.btn-group-sm>.btn+.dropdown-toggle-split{padding-right:.375rem;padding-left:.375rem}.btn-lg+.dropdown-toggle-split,.btn-group-lg>.btn+.dropdown-toggle-split{padding-right:.75rem;padding-left:.75rem}.btn-group-vertical{flex-direction:column;-webkit-flex-direction:column;align-items:flex-start;-webkit-align-items:flex-start;justify-content:center;-webkit-justify-content:center}.btn-group-vertical>.btn,.btn-group-vertical>.btn-group{width:100%}.btn-group-vertical>.btn:not(:first-child),.btn-group-vertical>.btn-group:not(:first-child){margin-top:calc(1px*-1)}.nav{--bs-nav-link-padding-x: 1rem;--bs-nav-link-padding-y: 
0.5rem;--bs-nav-link-font-weight: ;--bs-nav-link-color: #2761e3;--bs-nav-link-hover-color: #1f4eb6;--bs-nav-link-disabled-color: rgba(52, 58, 64, 0.75);display:flex;display:-webkit-flex;flex-wrap:wrap;-webkit-flex-wrap:wrap;padding-left:0;margin-bottom:0;list-style:none}.nav-link{display:block;padding:var(--bs-nav-link-padding-y) var(--bs-nav-link-padding-x);font-size:var(--bs-nav-link-font-size);font-weight:var(--bs-nav-link-font-weight);color:var(--bs-nav-link-color);text-decoration:none;-webkit-text-decoration:none;-moz-text-decoration:none;-ms-text-decoration:none;-o-text-decoration:none;background:none;border:0;transition:color .15s ease-in-out,background-color .15s ease-in-out,border-color .15s ease-in-out}@media(prefers-reduced-motion: reduce){.nav-link{transition:none}}.nav-link:hover,.nav-link:focus{color:var(--bs-nav-link-hover-color)}.nav-link:focus-visible{outline:0;box-shadow:0 0 0 .25rem rgba(39,128,227,.25)}.nav-link.disabled,.nav-link:disabled{color:var(--bs-nav-link-disabled-color);pointer-events:none;cursor:default}.nav-tabs{--bs-nav-tabs-border-width: 1px;--bs-nav-tabs-border-color: #dee2e6;--bs-nav-tabs-border-radius: 0.25rem;--bs-nav-tabs-link-hover-border-color: #e9ecef #e9ecef #dee2e6;--bs-nav-tabs-link-active-color: #000;--bs-nav-tabs-link-active-bg: #fff;--bs-nav-tabs-link-active-border-color: #dee2e6 #dee2e6 #fff;border-bottom:var(--bs-nav-tabs-border-width) solid var(--bs-nav-tabs-border-color)}.nav-tabs .nav-link{margin-bottom:calc(-1*var(--bs-nav-tabs-border-width));border:var(--bs-nav-tabs-border-width) solid rgba(0,0,0,0)}.nav-tabs .nav-link:hover,.nav-tabs .nav-link:focus{isolation:isolate;border-color:var(--bs-nav-tabs-link-hover-border-color)}.nav-tabs .nav-link.active,.nav-tabs .nav-item.show .nav-link{color:var(--bs-nav-tabs-link-active-color);background-color:var(--bs-nav-tabs-link-active-bg);border-color:var(--bs-nav-tabs-link-active-border-color)}.nav-tabs 
.dropdown-menu{margin-top:calc(-1*var(--bs-nav-tabs-border-width))}.nav-pills{--bs-nav-pills-border-radius: 0.25rem;--bs-nav-pills-link-active-color: #fff;--bs-nav-pills-link-active-bg: #2780e3}.nav-pills .nav-link.active,.nav-pills .show>.nav-link{color:var(--bs-nav-pills-link-active-color);background-color:var(--bs-nav-pills-link-active-bg)}.nav-underline{--bs-nav-underline-gap: 1rem;--bs-nav-underline-border-width: 0.125rem;--bs-nav-underline-link-active-color: #000;gap:var(--bs-nav-underline-gap)}.nav-underline .nav-link{padding-right:0;padding-left:0;border-bottom:var(--bs-nav-underline-border-width) solid rgba(0,0,0,0)}.nav-underline .nav-link:hover,.nav-underline .nav-link:focus{border-bottom-color:currentcolor}.nav-underline .nav-link.active,.nav-underline .show>.nav-link{font-weight:700;color:var(--bs-nav-underline-link-active-color);border-bottom-color:currentcolor}.nav-fill>.nav-link,.nav-fill .nav-item{flex:1 1 auto;-webkit-flex:1 1 auto;text-align:center}.nav-justified>.nav-link,.nav-justified .nav-item{flex-basis:0;-webkit-flex-basis:0;flex-grow:1;-webkit-flex-grow:1;text-align:center}.nav-fill .nav-item .nav-link,.nav-justified .nav-item .nav-link{width:100%}.tab-content>.tab-pane{display:none}.tab-content>.active{display:block}.navbar{--bs-navbar-padding-x: 0;--bs-navbar-padding-y: 0.5rem;--bs-navbar-color: #545555;--bs-navbar-hover-color: rgba(31, 78, 182, 0.8);--bs-navbar-disabled-color: rgba(84, 85, 85, 0.75);--bs-navbar-active-color: #1f4eb6;--bs-navbar-brand-padding-y: 0.3125rem;--bs-navbar-brand-margin-end: 1rem;--bs-navbar-brand-font-size: 1.25rem;--bs-navbar-brand-color: #545555;--bs-navbar-brand-hover-color: #1f4eb6;--bs-navbar-nav-link-padding-x: 0.5rem;--bs-navbar-toggler-padding-y: 0.25;--bs-navbar-toggler-padding-x: 0;--bs-navbar-toggler-font-size: 1.25rem;--bs-navbar-toggler-icon-bg: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 30 30'%3e%3cpath stroke='%23545555' stroke-linecap='round' 
stroke-miterlimit='10' stroke-width='2' d='M4 7h22M4 15h22M4 23h22'/%3e%3c/svg%3e");--bs-navbar-toggler-border-color: rgba(84, 85, 85, 0);--bs-navbar-toggler-border-radius: 0.25rem;--bs-navbar-toggler-focus-width: 0.25rem;--bs-navbar-toggler-transition: box-shadow 0.15s ease-in-out;position:relative;display:flex;display:-webkit-flex;flex-wrap:wrap;-webkit-flex-wrap:wrap;align-items:center;-webkit-align-items:center;justify-content:space-between;-webkit-justify-content:space-between;padding:var(--bs-navbar-padding-y) var(--bs-navbar-padding-x)}.navbar>.container,.navbar>.container-fluid,.navbar>.container-sm,.navbar>.container-md,.navbar>.container-lg,.navbar>.container-xl,.navbar>.container-xxl{display:flex;display:-webkit-flex;flex-wrap:inherit;-webkit-flex-wrap:inherit;align-items:center;-webkit-align-items:center;justify-content:space-between;-webkit-justify-content:space-between}.navbar-brand{padding-top:var(--bs-navbar-brand-padding-y);padding-bottom:var(--bs-navbar-brand-padding-y);margin-right:var(--bs-navbar-brand-margin-end);font-size:var(--bs-navbar-brand-font-size);color:var(--bs-navbar-brand-color);text-decoration:none;-webkit-text-decoration:none;-moz-text-decoration:none;-ms-text-decoration:none;-o-text-decoration:none;white-space:nowrap}.navbar-brand:hover,.navbar-brand:focus{color:var(--bs-navbar-brand-hover-color)}.navbar-nav{--bs-nav-link-padding-x: 0;--bs-nav-link-padding-y: 0.5rem;--bs-nav-link-font-weight: ;--bs-nav-link-color: var(--bs-navbar-color);--bs-nav-link-hover-color: var(--bs-navbar-hover-color);--bs-nav-link-disabled-color: var(--bs-navbar-disabled-color);display:flex;display:-webkit-flex;flex-direction:column;-webkit-flex-direction:column;padding-left:0;margin-bottom:0;list-style:none}.navbar-nav .nav-link.active,.navbar-nav .nav-link.show{color:var(--bs-navbar-active-color)}.navbar-nav .dropdown-menu{position:static}.navbar-text{padding-top:.5rem;padding-bottom:.5rem;color:var(--bs-navbar-color)}.navbar-text a,.navbar-text 
a:hover,.navbar-text a:focus{color:var(--bs-navbar-active-color)}.navbar-collapse{flex-basis:100%;-webkit-flex-basis:100%;flex-grow:1;-webkit-flex-grow:1;align-items:center;-webkit-align-items:center}.navbar-toggler{padding:var(--bs-navbar-toggler-padding-y) var(--bs-navbar-toggler-padding-x);font-size:var(--bs-navbar-toggler-font-size);line-height:1;color:var(--bs-navbar-color);background-color:rgba(0,0,0,0);border:var(--bs-border-width) solid var(--bs-navbar-toggler-border-color);transition:var(--bs-navbar-toggler-transition)}@media(prefers-reduced-motion: reduce){.navbar-toggler{transition:none}}.navbar-toggler:hover{text-decoration:none}.navbar-toggler:focus{text-decoration:none;outline:0;box-shadow:0 0 0 var(--bs-navbar-toggler-focus-width)}.navbar-toggler-icon{display:inline-block;width:1.5em;height:1.5em;vertical-align:middle;background-image:var(--bs-navbar-toggler-icon-bg);background-repeat:no-repeat;background-position:center;background-size:100%}.navbar-nav-scroll{max-height:var(--bs-scroll-height, 75vh);overflow-y:auto}@media(min-width: 576px){.navbar-expand-sm{flex-wrap:nowrap;-webkit-flex-wrap:nowrap;justify-content:flex-start;-webkit-justify-content:flex-start}.navbar-expand-sm .navbar-nav{flex-direction:row;-webkit-flex-direction:row}.navbar-expand-sm .navbar-nav .dropdown-menu{position:absolute}.navbar-expand-sm .navbar-nav .nav-link{padding-right:var(--bs-navbar-nav-link-padding-x);padding-left:var(--bs-navbar-nav-link-padding-x)}.navbar-expand-sm .navbar-nav-scroll{overflow:visible}.navbar-expand-sm .navbar-collapse{display:flex !important;display:-webkit-flex !important;flex-basis:auto;-webkit-flex-basis:auto}.navbar-expand-sm .navbar-toggler{display:none}.navbar-expand-sm .offcanvas{position:static;z-index:auto;flex-grow:1;-webkit-flex-grow:1;width:auto !important;height:auto !important;visibility:visible !important;background-color:rgba(0,0,0,0) !important;border:0 !important;transform:none !important;transition:none}.navbar-expand-sm 
.offcanvas .offcanvas-header{display:none}.navbar-expand-sm .offcanvas .offcanvas-body{display:flex;display:-webkit-flex;flex-grow:0;-webkit-flex-grow:0;padding:0;overflow-y:visible}}@media(min-width: 768px){.navbar-expand-md{flex-wrap:nowrap;-webkit-flex-wrap:nowrap;justify-content:flex-start;-webkit-justify-content:flex-start}.navbar-expand-md .navbar-nav{flex-direction:row;-webkit-flex-direction:row}.navbar-expand-md .navbar-nav .dropdown-menu{position:absolute}.navbar-expand-md .navbar-nav .nav-link{padding-right:var(--bs-navbar-nav-link-padding-x);padding-left:var(--bs-navbar-nav-link-padding-x)}.navbar-expand-md .navbar-nav-scroll{overflow:visible}.navbar-expand-md .navbar-collapse{display:flex !important;display:-webkit-flex !important;flex-basis:auto;-webkit-flex-basis:auto}.navbar-expand-md .navbar-toggler{display:none}.navbar-expand-md .offcanvas{position:static;z-index:auto;flex-grow:1;-webkit-flex-grow:1;width:auto !important;height:auto !important;visibility:visible !important;background-color:rgba(0,0,0,0) !important;border:0 !important;transform:none !important;transition:none}.navbar-expand-md .offcanvas .offcanvas-header{display:none}.navbar-expand-md .offcanvas .offcanvas-body{display:flex;display:-webkit-flex;flex-grow:0;-webkit-flex-grow:0;padding:0;overflow-y:visible}}@media(min-width: 992px){.navbar-expand-lg{flex-wrap:nowrap;-webkit-flex-wrap:nowrap;justify-content:flex-start;-webkit-justify-content:flex-start}.navbar-expand-lg .navbar-nav{flex-direction:row;-webkit-flex-direction:row}.navbar-expand-lg .navbar-nav .dropdown-menu{position:absolute}.navbar-expand-lg .navbar-nav .nav-link{padding-right:var(--bs-navbar-nav-link-padding-x);padding-left:var(--bs-navbar-nav-link-padding-x)}.navbar-expand-lg .navbar-nav-scroll{overflow:visible}.navbar-expand-lg .navbar-collapse{display:flex !important;display:-webkit-flex !important;flex-basis:auto;-webkit-flex-basis:auto}.navbar-expand-lg .navbar-toggler{display:none}.navbar-expand-lg 
.offcanvas{position:static;z-index:auto;flex-grow:1;-webkit-flex-grow:1;width:auto !important;height:auto !important;visibility:visible !important;background-color:rgba(0,0,0,0) !important;border:0 !important;transform:none !important;transition:none}.navbar-expand-lg .offcanvas .offcanvas-header{display:none}.navbar-expand-lg .offcanvas .offcanvas-body{display:flex;display:-webkit-flex;flex-grow:0;-webkit-flex-grow:0;padding:0;overflow-y:visible}}@media(min-width: 1200px){.navbar-expand-xl{flex-wrap:nowrap;-webkit-flex-wrap:nowrap;justify-content:flex-start;-webkit-justify-content:flex-start}.navbar-expand-xl .navbar-nav{flex-direction:row;-webkit-flex-direction:row}.navbar-expand-xl .navbar-nav .dropdown-menu{position:absolute}.navbar-expand-xl .navbar-nav .nav-link{padding-right:var(--bs-navbar-nav-link-padding-x);padding-left:var(--bs-navbar-nav-link-padding-x)}.navbar-expand-xl .navbar-nav-scroll{overflow:visible}.navbar-expand-xl .navbar-collapse{display:flex !important;display:-webkit-flex !important;flex-basis:auto;-webkit-flex-basis:auto}.navbar-expand-xl .navbar-toggler{display:none}.navbar-expand-xl .offcanvas{position:static;z-index:auto;flex-grow:1;-webkit-flex-grow:1;width:auto !important;height:auto !important;visibility:visible !important;background-color:rgba(0,0,0,0) !important;border:0 !important;transform:none !important;transition:none}.navbar-expand-xl .offcanvas .offcanvas-header{display:none}.navbar-expand-xl .offcanvas .offcanvas-body{display:flex;display:-webkit-flex;flex-grow:0;-webkit-flex-grow:0;padding:0;overflow-y:visible}}@media(min-width: 1400px){.navbar-expand-xxl{flex-wrap:nowrap;-webkit-flex-wrap:nowrap;justify-content:flex-start;-webkit-justify-content:flex-start}.navbar-expand-xxl .navbar-nav{flex-direction:row;-webkit-flex-direction:row}.navbar-expand-xxl .navbar-nav .dropdown-menu{position:absolute}.navbar-expand-xxl .navbar-nav 
.nav-link{padding-right:var(--bs-navbar-nav-link-padding-x);padding-left:var(--bs-navbar-nav-link-padding-x)}.navbar-expand-xxl .navbar-nav-scroll{overflow:visible}.navbar-expand-xxl .navbar-collapse{display:flex !important;display:-webkit-flex !important;flex-basis:auto;-webkit-flex-basis:auto}.navbar-expand-xxl .navbar-toggler{display:none}.navbar-expand-xxl .offcanvas{position:static;z-index:auto;flex-grow:1;-webkit-flex-grow:1;width:auto !important;height:auto !important;visibility:visible !important;background-color:rgba(0,0,0,0) !important;border:0 !important;transform:none !important;transition:none}.navbar-expand-xxl .offcanvas .offcanvas-header{display:none}.navbar-expand-xxl .offcanvas .offcanvas-body{display:flex;display:-webkit-flex;flex-grow:0;-webkit-flex-grow:0;padding:0;overflow-y:visible}}.navbar-expand{flex-wrap:nowrap;-webkit-flex-wrap:nowrap;justify-content:flex-start;-webkit-justify-content:flex-start}.navbar-expand .navbar-nav{flex-direction:row;-webkit-flex-direction:row}.navbar-expand .navbar-nav .dropdown-menu{position:absolute}.navbar-expand .navbar-nav .nav-link{padding-right:var(--bs-navbar-nav-link-padding-x);padding-left:var(--bs-navbar-nav-link-padding-x)}.navbar-expand .navbar-nav-scroll{overflow:visible}.navbar-expand .navbar-collapse{display:flex !important;display:-webkit-flex !important;flex-basis:auto;-webkit-flex-basis:auto}.navbar-expand .navbar-toggler{display:none}.navbar-expand .offcanvas{position:static;z-index:auto;flex-grow:1;-webkit-flex-grow:1;width:auto !important;height:auto !important;visibility:visible !important;background-color:rgba(0,0,0,0) !important;border:0 !important;transform:none !important;transition:none}.navbar-expand .offcanvas .offcanvas-header{display:none}.navbar-expand .offcanvas .offcanvas-body{display:flex;display:-webkit-flex;flex-grow:0;-webkit-flex-grow:0;padding:0;overflow-y:visible}.navbar-dark,.navbar[data-bs-theme=dark]{--bs-navbar-color: #545555;--bs-navbar-hover-color: rgba(31, 78, 182, 
0.8);--bs-navbar-disabled-color: rgba(84, 85, 85, 0.75);--bs-navbar-active-color: #1f4eb6;--bs-navbar-brand-color: #545555;--bs-navbar-brand-hover-color: #1f4eb6;--bs-navbar-toggler-border-color: rgba(84, 85, 85, 0);--bs-navbar-toggler-icon-bg: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 30 30'%3e%3cpath stroke='%23545555' stroke-linecap='round' stroke-miterlimit='10' stroke-width='2' d='M4 7h22M4 15h22M4 23h22'/%3e%3c/svg%3e")}[data-bs-theme=dark] .navbar-toggler-icon{--bs-navbar-toggler-icon-bg: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 30 30'%3e%3cpath stroke='%23545555' stroke-linecap='round' stroke-miterlimit='10' stroke-width='2' d='M4 7h22M4 15h22M4 23h22'/%3e%3c/svg%3e")}.card{--bs-card-spacer-y: 1rem;--bs-card-spacer-x: 1rem;--bs-card-title-spacer-y: 0.5rem;--bs-card-title-color: ;--bs-card-subtitle-color: ;--bs-card-border-width: 1px;--bs-card-border-color: rgba(0, 0, 0, 0.175);--bs-card-border-radius: 0.25rem;--bs-card-box-shadow: ;--bs-card-inner-border-radius: calc(0.25rem - 1px);--bs-card-cap-padding-y: 0.5rem;--bs-card-cap-padding-x: 1rem;--bs-card-cap-bg: rgba(52, 58, 64, 0.25);--bs-card-cap-color: ;--bs-card-height: ;--bs-card-color: ;--bs-card-bg: #fff;--bs-card-img-overlay-padding: 1rem;--bs-card-group-margin: 0.75rem;position:relative;display:flex;display:-webkit-flex;flex-direction:column;-webkit-flex-direction:column;min-width:0;height:var(--bs-card-height);color:var(--bs-body-color);word-wrap:break-word;background-color:var(--bs-card-bg);background-clip:border-box;border:var(--bs-card-border-width) solid var(--bs-card-border-color)}.card>hr{margin-right:0;margin-left:0}.card>.list-group{border-top:inherit;border-bottom:inherit}.card>.list-group:first-child{border-top-width:0}.card>.list-group:last-child{border-bottom-width:0}.card>.card-header+.list-group,.card>.list-group+.card-footer{border-top:0}.card-body{flex:1 1 auto;-webkit-flex:1 1 
auto;padding:var(--bs-card-spacer-y) var(--bs-card-spacer-x);color:var(--bs-card-color)}.card-title{margin-bottom:var(--bs-card-title-spacer-y);color:var(--bs-card-title-color)}.card-subtitle{margin-top:calc(-0.5*var(--bs-card-title-spacer-y));margin-bottom:0;color:var(--bs-card-subtitle-color)}.card-text:last-child{margin-bottom:0}.card-link+.card-link{margin-left:var(--bs-card-spacer-x)}.card-header{padding:var(--bs-card-cap-padding-y) var(--bs-card-cap-padding-x);margin-bottom:0;color:var(--bs-card-cap-color);background-color:var(--bs-card-cap-bg);border-bottom:var(--bs-card-border-width) solid var(--bs-card-border-color)}.card-footer{padding:var(--bs-card-cap-padding-y) var(--bs-card-cap-padding-x);color:var(--bs-card-cap-color);background-color:var(--bs-card-cap-bg);border-top:var(--bs-card-border-width) solid var(--bs-card-border-color)}.card-header-tabs{margin-right:calc(-0.5*var(--bs-card-cap-padding-x));margin-bottom:calc(-1*var(--bs-card-cap-padding-y));margin-left:calc(-0.5*var(--bs-card-cap-padding-x));border-bottom:0}.card-header-tabs .nav-link.active{background-color:var(--bs-card-bg);border-bottom-color:var(--bs-card-bg)}.card-header-pills{margin-right:calc(-0.5*var(--bs-card-cap-padding-x));margin-left:calc(-0.5*var(--bs-card-cap-padding-x))}.card-img-overlay{position:absolute;top:0;right:0;bottom:0;left:0;padding:var(--bs-card-img-overlay-padding)}.card-img,.card-img-top,.card-img-bottom{width:100%}.card-group>.card{margin-bottom:var(--bs-card-group-margin)}@media(min-width: 576px){.card-group{display:flex;display:-webkit-flex;flex-flow:row wrap;-webkit-flex-flow:row wrap}.card-group>.card{flex:1 0 0%;-webkit-flex:1 0 0%;margin-bottom:0}.card-group>.card+.card{margin-left:0;border-left:0}}.accordion{--bs-accordion-color: #343a40;--bs-accordion-bg: #fff;--bs-accordion-transition: color 0.15s ease-in-out, background-color 0.15s ease-in-out, border-color 0.15s ease-in-out, box-shadow 0.15s ease-in-out, border-radius 0.15s 
ease;--bs-accordion-border-color: #dee2e6;--bs-accordion-border-width: 1px;--bs-accordion-border-radius: 0.25rem;--bs-accordion-inner-border-radius: calc(0.25rem - 1px);--bs-accordion-btn-padding-x: 1.25rem;--bs-accordion-btn-padding-y: 1rem;--bs-accordion-btn-color: #343a40;--bs-accordion-btn-bg: #fff;--bs-accordion-btn-icon: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 16 16' fill='%23343a40'%3e%3cpath fill-rule='evenodd' d='M1.646 4.646a.5.5 0 0 1 .708 0L8 10.293l5.646-5.647a.5.5 0 0 1 .708.708l-6 6a.5.5 0 0 1-.708 0l-6-6a.5.5 0 0 1 0-.708z'/%3e%3c/svg%3e");--bs-accordion-btn-icon-width: 1.25rem;--bs-accordion-btn-icon-transform: rotate(-180deg);--bs-accordion-btn-icon-transition: transform 0.2s ease-in-out;--bs-accordion-btn-active-icon: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 16 16' fill='%2310335b'%3e%3cpath fill-rule='evenodd' d='M1.646 4.646a.5.5 0 0 1 .708 0L8 10.293l5.646-5.647a.5.5 0 0 1 .708.708l-6 6a.5.5 0 0 1-.708 0l-6-6a.5.5 0 0 1 0-.708z'/%3e%3c/svg%3e");--bs-accordion-btn-focus-border-color: #93c0f1;--bs-accordion-btn-focus-box-shadow: 0 0 0 0.25rem rgba(39, 128, 227, 0.25);--bs-accordion-body-padding-x: 1.25rem;--bs-accordion-body-padding-y: 1rem;--bs-accordion-active-color: #10335b;--bs-accordion-active-bg: #d4e6f9}.accordion-button{position:relative;display:flex;display:-webkit-flex;align-items:center;-webkit-align-items:center;width:100%;padding:var(--bs-accordion-btn-padding-y) var(--bs-accordion-btn-padding-x);font-size:1rem;color:var(--bs-accordion-btn-color);text-align:left;background-color:var(--bs-accordion-btn-bg);border:0;overflow-anchor:none;transition:var(--bs-accordion-transition)}@media(prefers-reduced-motion: reduce){.accordion-button{transition:none}}.accordion-button:not(.collapsed){color:var(--bs-accordion-active-color);background-color:var(--bs-accordion-active-bg);box-shadow:inset 0 calc(-1*var(--bs-accordion-border-width)) 0 
var(--bs-accordion-border-color)}.accordion-button:not(.collapsed)::after{background-image:var(--bs-accordion-btn-active-icon);transform:var(--bs-accordion-btn-icon-transform)}.accordion-button::after{flex-shrink:0;-webkit-flex-shrink:0;width:var(--bs-accordion-btn-icon-width);height:var(--bs-accordion-btn-icon-width);margin-left:auto;content:"";background-image:var(--bs-accordion-btn-icon);background-repeat:no-repeat;background-size:var(--bs-accordion-btn-icon-width);transition:var(--bs-accordion-btn-icon-transition)}@media(prefers-reduced-motion: reduce){.accordion-button::after{transition:none}}.accordion-button:hover{z-index:2}.accordion-button:focus{z-index:3;border-color:var(--bs-accordion-btn-focus-border-color);outline:0;box-shadow:var(--bs-accordion-btn-focus-box-shadow)}.accordion-header{margin-bottom:0}.accordion-item{color:var(--bs-accordion-color);background-color:var(--bs-accordion-bg);border:var(--bs-accordion-border-width) solid var(--bs-accordion-border-color)}.accordion-item:not(:first-of-type){border-top:0}.accordion-body{padding:var(--bs-accordion-body-padding-y) var(--bs-accordion-body-padding-x)}.accordion-flush .accordion-collapse{border-width:0}.accordion-flush .accordion-item{border-right:0;border-left:0}.accordion-flush .accordion-item:first-child{border-top:0}.accordion-flush .accordion-item:last-child{border-bottom:0}[data-bs-theme=dark] .accordion-button::after{--bs-accordion-btn-icon: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 16 16' fill='%237db3ee'%3e%3cpath fill-rule='evenodd' d='M1.646 4.646a.5.5 0 0 1 .708 0L8 10.293l5.646-5.647a.5.5 0 0 1 .708.708l-6 6a.5.5 0 0 1-.708 0l-6-6a.5.5 0 0 1 0-.708z'/%3e%3c/svg%3e");--bs-accordion-btn-active-icon: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 16 16' fill='%237db3ee'%3e%3cpath fill-rule='evenodd' d='M1.646 4.646a.5.5 0 0 1 .708 0L8 10.293l5.646-5.647a.5.5 0 0 1 .708.708l-6 6a.5.5 0 0 1-.708 0l-6-6a.5.5 0 0 1 
0-.708z'/%3e%3c/svg%3e")}.breadcrumb{--bs-breadcrumb-padding-x: 0;--bs-breadcrumb-padding-y: 0;--bs-breadcrumb-margin-bottom: 1rem;--bs-breadcrumb-bg: ;--bs-breadcrumb-border-radius: ;--bs-breadcrumb-divider-color: rgba(52, 58, 64, 0.75);--bs-breadcrumb-item-padding-x: 0.5rem;--bs-breadcrumb-item-active-color: rgba(52, 58, 64, 0.75);display:flex;display:-webkit-flex;flex-wrap:wrap;-webkit-flex-wrap:wrap;padding:var(--bs-breadcrumb-padding-y) var(--bs-breadcrumb-padding-x);margin-bottom:var(--bs-breadcrumb-margin-bottom);font-size:var(--bs-breadcrumb-font-size);list-style:none;background-color:var(--bs-breadcrumb-bg)}.breadcrumb-item+.breadcrumb-item{padding-left:var(--bs-breadcrumb-item-padding-x)}.breadcrumb-item+.breadcrumb-item::before{float:left;padding-right:var(--bs-breadcrumb-item-padding-x);color:var(--bs-breadcrumb-divider-color);content:var(--bs-breadcrumb-divider, ">") /* rtl: var(--bs-breadcrumb-divider, ">") */}.breadcrumb-item.active{color:var(--bs-breadcrumb-item-active-color)}.pagination{--bs-pagination-padding-x: 0.75rem;--bs-pagination-padding-y: 0.375rem;--bs-pagination-font-size:1rem;--bs-pagination-color: #2761e3;--bs-pagination-bg: #fff;--bs-pagination-border-width: 1px;--bs-pagination-border-color: #dee2e6;--bs-pagination-border-radius: 0.25rem;--bs-pagination-hover-color: #1f4eb6;--bs-pagination-hover-bg: #f8f9fa;--bs-pagination-hover-border-color: #dee2e6;--bs-pagination-focus-color: #1f4eb6;--bs-pagination-focus-bg: #e9ecef;--bs-pagination-focus-box-shadow: 0 0 0 0.25rem rgba(39, 128, 227, 0.25);--bs-pagination-active-color: #fff;--bs-pagination-active-bg: #2780e3;--bs-pagination-active-border-color: #2780e3;--bs-pagination-disabled-color: rgba(52, 58, 64, 0.75);--bs-pagination-disabled-bg: #e9ecef;--bs-pagination-disabled-border-color: #dee2e6;display:flex;display:-webkit-flex;padding-left:0;list-style:none}.page-link{position:relative;display:block;padding:var(--bs-pagination-padding-y) 
var(--bs-pagination-padding-x);font-size:var(--bs-pagination-font-size);color:var(--bs-pagination-color);text-decoration:none;-webkit-text-decoration:none;-moz-text-decoration:none;-ms-text-decoration:none;-o-text-decoration:none;background-color:var(--bs-pagination-bg);border:var(--bs-pagination-border-width) solid var(--bs-pagination-border-color);transition:color .15s ease-in-out,background-color .15s ease-in-out,border-color .15s ease-in-out,box-shadow .15s ease-in-out}@media(prefers-reduced-motion: reduce){.page-link{transition:none}}.page-link:hover{z-index:2;color:var(--bs-pagination-hover-color);background-color:var(--bs-pagination-hover-bg);border-color:var(--bs-pagination-hover-border-color)}.page-link:focus{z-index:3;color:var(--bs-pagination-focus-color);background-color:var(--bs-pagination-focus-bg);outline:0;box-shadow:var(--bs-pagination-focus-box-shadow)}.page-link.active,.active>.page-link{z-index:3;color:var(--bs-pagination-active-color);background-color:var(--bs-pagination-active-bg);border-color:var(--bs-pagination-active-border-color)}.page-link.disabled,.disabled>.page-link{color:var(--bs-pagination-disabled-color);pointer-events:none;background-color:var(--bs-pagination-disabled-bg);border-color:var(--bs-pagination-disabled-border-color)}.page-item:not(:first-child) .page-link{margin-left:calc(1px*-1)}.pagination-lg{--bs-pagination-padding-x: 1.5rem;--bs-pagination-padding-y: 0.75rem;--bs-pagination-font-size:1.25rem;--bs-pagination-border-radius: 0.5rem}.pagination-sm{--bs-pagination-padding-x: 0.5rem;--bs-pagination-padding-y: 0.25rem;--bs-pagination-font-size:0.875rem;--bs-pagination-border-radius: 0.2em}.badge{--bs-badge-padding-x: 0.65em;--bs-badge-padding-y: 0.35em;--bs-badge-font-size:0.75em;--bs-badge-font-weight: 700;--bs-badge-color: #fff;--bs-badge-border-radius: 0.25rem;display:inline-block;padding:var(--bs-badge-padding-y) 
var(--bs-badge-padding-x);font-size:var(--bs-badge-font-size);font-weight:var(--bs-badge-font-weight);line-height:1;color:var(--bs-badge-color);text-align:center;white-space:nowrap;vertical-align:baseline}.badge:empty{display:none}.btn .badge{position:relative;top:-1px}.alert{--bs-alert-bg: transparent;--bs-alert-padding-x: 1rem;--bs-alert-padding-y: 1rem;--bs-alert-margin-bottom: 1rem;--bs-alert-color: inherit;--bs-alert-border-color: transparent;--bs-alert-border: 0 solid var(--bs-alert-border-color);--bs-alert-border-radius: 0.25rem;--bs-alert-link-color: inherit;position:relative;padding:var(--bs-alert-padding-y) var(--bs-alert-padding-x);margin-bottom:var(--bs-alert-margin-bottom);color:var(--bs-alert-color);background-color:var(--bs-alert-bg);border:var(--bs-alert-border)}.alert-heading{color:inherit}.alert-link{font-weight:700;color:var(--bs-alert-link-color)}.alert-dismissible{padding-right:3rem}.alert-dismissible .btn-close{position:absolute;top:0;right:0;z-index:2;padding:1.25rem 1rem}.alert-default{--bs-alert-color: var(--bs-default-text-emphasis);--bs-alert-bg: var(--bs-default-bg-subtle);--bs-alert-border-color: var(--bs-default-border-subtle);--bs-alert-link-color: var(--bs-default-text-emphasis)}.alert-primary{--bs-alert-color: var(--bs-primary-text-emphasis);--bs-alert-bg: var(--bs-primary-bg-subtle);--bs-alert-border-color: var(--bs-primary-border-subtle);--bs-alert-link-color: var(--bs-primary-text-emphasis)}.alert-secondary{--bs-alert-color: var(--bs-secondary-text-emphasis);--bs-alert-bg: var(--bs-secondary-bg-subtle);--bs-alert-border-color: var(--bs-secondary-border-subtle);--bs-alert-link-color: var(--bs-secondary-text-emphasis)}.alert-success{--bs-alert-color: var(--bs-success-text-emphasis);--bs-alert-bg: var(--bs-success-bg-subtle);--bs-alert-border-color: var(--bs-success-border-subtle);--bs-alert-link-color: var(--bs-success-text-emphasis)}.alert-info{--bs-alert-color: var(--bs-info-text-emphasis);--bs-alert-bg: 
var(--bs-info-bg-subtle);--bs-alert-border-color: var(--bs-info-border-subtle);--bs-alert-link-color: var(--bs-info-text-emphasis)}.alert-warning{--bs-alert-color: var(--bs-warning-text-emphasis);--bs-alert-bg: var(--bs-warning-bg-subtle);--bs-alert-border-color: var(--bs-warning-border-subtle);--bs-alert-link-color: var(--bs-warning-text-emphasis)}.alert-danger{--bs-alert-color: var(--bs-danger-text-emphasis);--bs-alert-bg: var(--bs-danger-bg-subtle);--bs-alert-border-color: var(--bs-danger-border-subtle);--bs-alert-link-color: var(--bs-danger-text-emphasis)}.alert-light{--bs-alert-color: var(--bs-light-text-emphasis);--bs-alert-bg: var(--bs-light-bg-subtle);--bs-alert-border-color: var(--bs-light-border-subtle);--bs-alert-link-color: var(--bs-light-text-emphasis)}.alert-dark{--bs-alert-color: var(--bs-dark-text-emphasis);--bs-alert-bg: var(--bs-dark-bg-subtle);--bs-alert-border-color: var(--bs-dark-border-subtle);--bs-alert-link-color: var(--bs-dark-text-emphasis)}@keyframes progress-bar-stripes{0%{background-position-x:.5rem}}.progress,.progress-stacked{--bs-progress-height: 0.5rem;--bs-progress-font-size:0.75rem;--bs-progress-bg: #e9ecef;--bs-progress-border-radius: 0.25rem;--bs-progress-box-shadow: inset 0 1px 2px rgba(0, 0, 0, 0.075);--bs-progress-bar-color: #fff;--bs-progress-bar-bg: #2780e3;--bs-progress-bar-transition: width 0.6s ease;display:flex;display:-webkit-flex;height:var(--bs-progress-height);overflow:hidden;font-size:var(--bs-progress-font-size);background-color:var(--bs-progress-bg)}.progress-bar{display:flex;display:-webkit-flex;flex-direction:column;-webkit-flex-direction:column;justify-content:center;-webkit-justify-content:center;overflow:hidden;color:var(--bs-progress-bar-color);text-align:center;white-space:nowrap;background-color:var(--bs-progress-bar-bg);transition:var(--bs-progress-bar-transition)}@media(prefers-reduced-motion: reduce){.progress-bar{transition:none}}.progress-bar-striped{background-image:linear-gradient(45deg, rgba(255, 
255, 255, 0.15) 25%, transparent 25%, transparent 50%, rgba(255, 255, 255, 0.15) 50%, rgba(255, 255, 255, 0.15) 75%, transparent 75%, transparent);background-size:var(--bs-progress-height) var(--bs-progress-height)}.progress-stacked>.progress{overflow:visible}.progress-stacked>.progress>.progress-bar{width:100%}.progress-bar-animated{animation:1s linear infinite progress-bar-stripes}@media(prefers-reduced-motion: reduce){.progress-bar-animated{animation:none}}.list-group{--bs-list-group-color: #343a40;--bs-list-group-bg: #fff;--bs-list-group-border-color: #dee2e6;--bs-list-group-border-width: 1px;--bs-list-group-border-radius: 0.25rem;--bs-list-group-item-padding-x: 1rem;--bs-list-group-item-padding-y: 0.5rem;--bs-list-group-action-color: rgba(52, 58, 64, 0.75);--bs-list-group-action-hover-color: #000;--bs-list-group-action-hover-bg: #f8f9fa;--bs-list-group-action-active-color: #343a40;--bs-list-group-action-active-bg: #e9ecef;--bs-list-group-disabled-color: rgba(52, 58, 64, 0.75);--bs-list-group-disabled-bg: #fff;--bs-list-group-active-color: #fff;--bs-list-group-active-bg: #2780e3;--bs-list-group-active-border-color: #2780e3;display:flex;display:-webkit-flex;flex-direction:column;-webkit-flex-direction:column;padding-left:0;margin-bottom:0}.list-group-numbered{list-style-type:none;counter-reset:section}.list-group-numbered>.list-group-item::before{content:counters(section, ".") ". 
";counter-increment:section}.list-group-item-action{width:100%;color:var(--bs-list-group-action-color);text-align:inherit}.list-group-item-action:hover,.list-group-item-action:focus{z-index:1;color:var(--bs-list-group-action-hover-color);text-decoration:none;background-color:var(--bs-list-group-action-hover-bg)}.list-group-item-action:active{color:var(--bs-list-group-action-active-color);background-color:var(--bs-list-group-action-active-bg)}.list-group-item{position:relative;display:block;padding:var(--bs-list-group-item-padding-y) var(--bs-list-group-item-padding-x);color:var(--bs-list-group-color);text-decoration:none;-webkit-text-decoration:none;-moz-text-decoration:none;-ms-text-decoration:none;-o-text-decoration:none;background-color:var(--bs-list-group-bg);border:var(--bs-list-group-border-width) solid var(--bs-list-group-border-color)}.list-group-item.disabled,.list-group-item:disabled{color:var(--bs-list-group-disabled-color);pointer-events:none;background-color:var(--bs-list-group-disabled-bg)}.list-group-item.active{z-index:2;color:var(--bs-list-group-active-color);background-color:var(--bs-list-group-active-bg);border-color:var(--bs-list-group-active-border-color)}.list-group-item+.list-group-item{border-top-width:0}.list-group-item+.list-group-item.active{margin-top:calc(-1*var(--bs-list-group-border-width));border-top-width:var(--bs-list-group-border-width)}.list-group-horizontal{flex-direction:row;-webkit-flex-direction:row}.list-group-horizontal>.list-group-item.active{margin-top:0}.list-group-horizontal>.list-group-item+.list-group-item{border-top-width:var(--bs-list-group-border-width);border-left-width:0}.list-group-horizontal>.list-group-item+.list-group-item.active{margin-left:calc(-1*var(--bs-list-group-border-width));border-left-width:var(--bs-list-group-border-width)}@media(min-width: 
576px){.list-group-horizontal-sm{flex-direction:row;-webkit-flex-direction:row}.list-group-horizontal-sm>.list-group-item.active{margin-top:0}.list-group-horizontal-sm>.list-group-item+.list-group-item{border-top-width:var(--bs-list-group-border-width);border-left-width:0}.list-group-horizontal-sm>.list-group-item+.list-group-item.active{margin-left:calc(-1*var(--bs-list-group-border-width));border-left-width:var(--bs-list-group-border-width)}}@media(min-width: 768px){.list-group-horizontal-md{flex-direction:row;-webkit-flex-direction:row}.list-group-horizontal-md>.list-group-item.active{margin-top:0}.list-group-horizontal-md>.list-group-item+.list-group-item{border-top-width:var(--bs-list-group-border-width);border-left-width:0}.list-group-horizontal-md>.list-group-item+.list-group-item.active{margin-left:calc(-1*var(--bs-list-group-border-width));border-left-width:var(--bs-list-group-border-width)}}@media(min-width: 992px){.list-group-horizontal-lg{flex-direction:row;-webkit-flex-direction:row}.list-group-horizontal-lg>.list-group-item.active{margin-top:0}.list-group-horizontal-lg>.list-group-item+.list-group-item{border-top-width:var(--bs-list-group-border-width);border-left-width:0}.list-group-horizontal-lg>.list-group-item+.list-group-item.active{margin-left:calc(-1*var(--bs-list-group-border-width));border-left-width:var(--bs-list-group-border-width)}}@media(min-width: 1200px){.list-group-horizontal-xl{flex-direction:row;-webkit-flex-direction:row}.list-group-horizontal-xl>.list-group-item.active{margin-top:0}.list-group-horizontal-xl>.list-group-item+.list-group-item{border-top-width:var(--bs-list-group-border-width);border-left-width:0}.list-group-horizontal-xl>.list-group-item+.list-group-item.active{margin-left:calc(-1*var(--bs-list-group-border-width));border-left-width:var(--bs-list-group-border-width)}}@media(min-width: 
1400px){.list-group-horizontal-xxl{flex-direction:row;-webkit-flex-direction:row}.list-group-horizontal-xxl>.list-group-item.active{margin-top:0}.list-group-horizontal-xxl>.list-group-item+.list-group-item{border-top-width:var(--bs-list-group-border-width);border-left-width:0}.list-group-horizontal-xxl>.list-group-item+.list-group-item.active{margin-left:calc(-1*var(--bs-list-group-border-width));border-left-width:var(--bs-list-group-border-width)}}.list-group-flush>.list-group-item{border-width:0 0 var(--bs-list-group-border-width)}.list-group-flush>.list-group-item:last-child{border-bottom-width:0}.list-group-item-default{--bs-list-group-color: var(--bs-default-text-emphasis);--bs-list-group-bg: var(--bs-default-bg-subtle);--bs-list-group-border-color: var(--bs-default-border-subtle);--bs-list-group-action-hover-color: var(--bs-emphasis-color);--bs-list-group-action-hover-bg: var(--bs-default-border-subtle);--bs-list-group-action-active-color: var(--bs-emphasis-color);--bs-list-group-action-active-bg: var(--bs-default-border-subtle);--bs-list-group-active-color: var(--bs-default-bg-subtle);--bs-list-group-active-bg: var(--bs-default-text-emphasis);--bs-list-group-active-border-color: var(--bs-default-text-emphasis)}.list-group-item-primary{--bs-list-group-color: var(--bs-primary-text-emphasis);--bs-list-group-bg: var(--bs-primary-bg-subtle);--bs-list-group-border-color: var(--bs-primary-border-subtle);--bs-list-group-action-hover-color: var(--bs-emphasis-color);--bs-list-group-action-hover-bg: var(--bs-primary-border-subtle);--bs-list-group-action-active-color: var(--bs-emphasis-color);--bs-list-group-action-active-bg: var(--bs-primary-border-subtle);--bs-list-group-active-color: var(--bs-primary-bg-subtle);--bs-list-group-active-bg: var(--bs-primary-text-emphasis);--bs-list-group-active-border-color: var(--bs-primary-text-emphasis)}.list-group-item-secondary{--bs-list-group-color: var(--bs-secondary-text-emphasis);--bs-list-group-bg: 
var(--bs-secondary-bg-subtle);--bs-list-group-border-color: var(--bs-secondary-border-subtle);--bs-list-group-action-hover-color: var(--bs-emphasis-color);--bs-list-group-action-hover-bg: var(--bs-secondary-border-subtle);--bs-list-group-action-active-color: var(--bs-emphasis-color);--bs-list-group-action-active-bg: var(--bs-secondary-border-subtle);--bs-list-group-active-color: var(--bs-secondary-bg-subtle);--bs-list-group-active-bg: var(--bs-secondary-text-emphasis);--bs-list-group-active-border-color: var(--bs-secondary-text-emphasis)}.list-group-item-success{--bs-list-group-color: var(--bs-success-text-emphasis);--bs-list-group-bg: var(--bs-success-bg-subtle);--bs-list-group-border-color: var(--bs-success-border-subtle);--bs-list-group-action-hover-color: var(--bs-emphasis-color);--bs-list-group-action-hover-bg: var(--bs-success-border-subtle);--bs-list-group-action-active-color: var(--bs-emphasis-color);--bs-list-group-action-active-bg: var(--bs-success-border-subtle);--bs-list-group-active-color: var(--bs-success-bg-subtle);--bs-list-group-active-bg: var(--bs-success-text-emphasis);--bs-list-group-active-border-color: var(--bs-success-text-emphasis)}.list-group-item-info{--bs-list-group-color: var(--bs-info-text-emphasis);--bs-list-group-bg: var(--bs-info-bg-subtle);--bs-list-group-border-color: var(--bs-info-border-subtle);--bs-list-group-action-hover-color: var(--bs-emphasis-color);--bs-list-group-action-hover-bg: var(--bs-info-border-subtle);--bs-list-group-action-active-color: var(--bs-emphasis-color);--bs-list-group-action-active-bg: var(--bs-info-border-subtle);--bs-list-group-active-color: var(--bs-info-bg-subtle);--bs-list-group-active-bg: var(--bs-info-text-emphasis);--bs-list-group-active-border-color: var(--bs-info-text-emphasis)}.list-group-item-warning{--bs-list-group-color: var(--bs-warning-text-emphasis);--bs-list-group-bg: var(--bs-warning-bg-subtle);--bs-list-group-border-color: 
var(--bs-warning-border-subtle);--bs-list-group-action-hover-color: var(--bs-emphasis-color);--bs-list-group-action-hover-bg: var(--bs-warning-border-subtle);--bs-list-group-action-active-color: var(--bs-emphasis-color);--bs-list-group-action-active-bg: var(--bs-warning-border-subtle);--bs-list-group-active-color: var(--bs-warning-bg-subtle);--bs-list-group-active-bg: var(--bs-warning-text-emphasis);--bs-list-group-active-border-color: var(--bs-warning-text-emphasis)}.list-group-item-danger{--bs-list-group-color: var(--bs-danger-text-emphasis);--bs-list-group-bg: var(--bs-danger-bg-subtle);--bs-list-group-border-color: var(--bs-danger-border-subtle);--bs-list-group-action-hover-color: var(--bs-emphasis-color);--bs-list-group-action-hover-bg: var(--bs-danger-border-subtle);--bs-list-group-action-active-color: var(--bs-emphasis-color);--bs-list-group-action-active-bg: var(--bs-danger-border-subtle);--bs-list-group-active-color: var(--bs-danger-bg-subtle);--bs-list-group-active-bg: var(--bs-danger-text-emphasis);--bs-list-group-active-border-color: var(--bs-danger-text-emphasis)}.list-group-item-light{--bs-list-group-color: var(--bs-light-text-emphasis);--bs-list-group-bg: var(--bs-light-bg-subtle);--bs-list-group-border-color: var(--bs-light-border-subtle);--bs-list-group-action-hover-color: var(--bs-emphasis-color);--bs-list-group-action-hover-bg: var(--bs-light-border-subtle);--bs-list-group-action-active-color: var(--bs-emphasis-color);--bs-list-group-action-active-bg: var(--bs-light-border-subtle);--bs-list-group-active-color: var(--bs-light-bg-subtle);--bs-list-group-active-bg: var(--bs-light-text-emphasis);--bs-list-group-active-border-color: var(--bs-light-text-emphasis)}.list-group-item-dark{--bs-list-group-color: var(--bs-dark-text-emphasis);--bs-list-group-bg: var(--bs-dark-bg-subtle);--bs-list-group-border-color: var(--bs-dark-border-subtle);--bs-list-group-action-hover-color: var(--bs-emphasis-color);--bs-list-group-action-hover-bg: 
var(--bs-dark-border-subtle);--bs-list-group-action-active-color: var(--bs-emphasis-color);--bs-list-group-action-active-bg: var(--bs-dark-border-subtle);--bs-list-group-active-color: var(--bs-dark-bg-subtle);--bs-list-group-active-bg: var(--bs-dark-text-emphasis);--bs-list-group-active-border-color: var(--bs-dark-text-emphasis)}.btn-close{--bs-btn-close-color: #000;--bs-btn-close-bg: url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 16 16' fill='%23000'%3e%3cpath d='M.293.293a1 1 0 0 1 1.414 0L8 6.586 14.293.293a1 1 0 1 1 1.414 1.414L9.414 8l6.293 6.293a1 1 0 0 1-1.414 1.414L8 9.414l-6.293 6.293a1 1 0 0 1-1.414-1.414L6.586 8 .293 1.707a1 1 0 0 1 0-1.414z'/%3e%3c/svg%3e");--bs-btn-close-opacity: 0.5;--bs-btn-close-hover-opacity: 0.75;--bs-btn-close-focus-shadow: 0 0 0 0.25rem rgba(39, 128, 227, 0.25);--bs-btn-close-focus-opacity: 1;--bs-btn-close-disabled-opacity: 0.25;--bs-btn-close-white-filter: invert(1) grayscale(100%) brightness(200%);box-sizing:content-box;width:1em;height:1em;padding:.25em .25em;color:var(--bs-btn-close-color);background:rgba(0,0,0,0) var(--bs-btn-close-bg) center/1em auto no-repeat;border:0;opacity:var(--bs-btn-close-opacity)}.btn-close:hover{color:var(--bs-btn-close-color);text-decoration:none;opacity:var(--bs-btn-close-hover-opacity)}.btn-close:focus{outline:0;box-shadow:var(--bs-btn-close-focus-shadow);opacity:var(--bs-btn-close-focus-opacity)}.btn-close:disabled,.btn-close.disabled{pointer-events:none;user-select:none;-webkit-user-select:none;-moz-user-select:none;-ms-user-select:none;-o-user-select:none;opacity:var(--bs-btn-close-disabled-opacity)}.btn-close-white{filter:var(--bs-btn-close-white-filter)}[data-bs-theme=dark] .btn-close{filter:var(--bs-btn-close-white-filter)}.toast{--bs-toast-zindex: 1090;--bs-toast-padding-x: 0.75rem;--bs-toast-padding-y: 0.5rem;--bs-toast-spacing: 1.5rem;--bs-toast-max-width: 350px;--bs-toast-font-size:0.875rem;--bs-toast-color: ;--bs-toast-bg: rgba(255, 255, 255, 
0.85);--bs-toast-border-width: 1px;--bs-toast-border-color: rgba(0, 0, 0, 0.175);--bs-toast-border-radius: 0.25rem;--bs-toast-box-shadow: 0 0.5rem 1rem rgba(0, 0, 0, 0.15);--bs-toast-header-color: rgba(52, 58, 64, 0.75);--bs-toast-header-bg: rgba(255, 255, 255, 0.85);--bs-toast-header-border-color: rgba(0, 0, 0, 0.175);width:var(--bs-toast-max-width);max-width:100%;font-size:var(--bs-toast-font-size);color:var(--bs-toast-color);pointer-events:auto;background-color:var(--bs-toast-bg);background-clip:padding-box;border:var(--bs-toast-border-width) solid var(--bs-toast-border-color);box-shadow:var(--bs-toast-box-shadow)}.toast.showing{opacity:0}.toast:not(.show){display:none}.toast-container{--bs-toast-zindex: 1090;position:absolute;z-index:var(--bs-toast-zindex);width:max-content;width:-webkit-max-content;width:-moz-max-content;width:-ms-max-content;width:-o-max-content;max-width:100%;pointer-events:none}.toast-container>:not(:last-child){margin-bottom:var(--bs-toast-spacing)}.toast-header{display:flex;display:-webkit-flex;align-items:center;-webkit-align-items:center;padding:var(--bs-toast-padding-y) var(--bs-toast-padding-x);color:var(--bs-toast-header-color);background-color:var(--bs-toast-header-bg);background-clip:padding-box;border-bottom:var(--bs-toast-border-width) solid var(--bs-toast-header-border-color)}.toast-header .btn-close{margin-right:calc(-0.5*var(--bs-toast-padding-x));margin-left:var(--bs-toast-padding-x)}.toast-body{padding:var(--bs-toast-padding-x);word-wrap:break-word}.modal{--bs-modal-zindex: 1055;--bs-modal-width: 500px;--bs-modal-padding: 1rem;--bs-modal-margin: 0.5rem;--bs-modal-color: ;--bs-modal-bg: #fff;--bs-modal-border-color: rgba(0, 0, 0, 0.175);--bs-modal-border-width: 1px;--bs-modal-border-radius: 0.5rem;--bs-modal-box-shadow: 0 0.125rem 0.25rem rgba(0, 0, 0, 0.075);--bs-modal-inner-border-radius: calc(0.5rem - 1px);--bs-modal-header-padding-x: 1rem;--bs-modal-header-padding-y: 1rem;--bs-modal-header-padding: 1rem 
1rem;--bs-modal-header-border-color: #dee2e6;--bs-modal-header-border-width: 1px;--bs-modal-title-line-height: 1.5;--bs-modal-footer-gap: 0.5rem;--bs-modal-footer-bg: ;--bs-modal-footer-border-color: #dee2e6;--bs-modal-footer-border-width: 1px;position:fixed;top:0;left:0;z-index:var(--bs-modal-zindex);display:none;width:100%;height:100%;overflow-x:hidden;overflow-y:auto;outline:0}.modal-dialog{position:relative;width:auto;margin:var(--bs-modal-margin);pointer-events:none}.modal.fade .modal-dialog{transition:transform .3s ease-out;transform:translate(0, -50px)}@media(prefers-reduced-motion: reduce){.modal.fade .modal-dialog{transition:none}}.modal.show .modal-dialog{transform:none}.modal.modal-static .modal-dialog{transform:scale(1.02)}.modal-dialog-scrollable{height:calc(100% - var(--bs-modal-margin)*2)}.modal-dialog-scrollable .modal-content{max-height:100%;overflow:hidden}.modal-dialog-scrollable .modal-body{overflow-y:auto}.modal-dialog-centered{display:flex;display:-webkit-flex;align-items:center;-webkit-align-items:center;min-height:calc(100% - var(--bs-modal-margin)*2)}.modal-content{position:relative;display:flex;display:-webkit-flex;flex-direction:column;-webkit-flex-direction:column;width:100%;color:var(--bs-modal-color);pointer-events:auto;background-color:var(--bs-modal-bg);background-clip:padding-box;border:var(--bs-modal-border-width) solid var(--bs-modal-border-color);outline:0}.modal-backdrop{--bs-backdrop-zindex: 1050;--bs-backdrop-bg: #000;--bs-backdrop-opacity: 
0.5;position:fixed;top:0;left:0;z-index:var(--bs-backdrop-zindex);width:100vw;height:100vh;background-color:var(--bs-backdrop-bg)}.modal-backdrop.fade{opacity:0}.modal-backdrop.show{opacity:var(--bs-backdrop-opacity)}.modal-header{display:flex;display:-webkit-flex;flex-shrink:0;-webkit-flex-shrink:0;align-items:center;-webkit-align-items:center;justify-content:space-between;-webkit-justify-content:space-between;padding:var(--bs-modal-header-padding);border-bottom:var(--bs-modal-header-border-width) solid var(--bs-modal-header-border-color)}.modal-header .btn-close{padding:calc(var(--bs-modal-header-padding-y)*.5) calc(var(--bs-modal-header-padding-x)*.5);margin:calc(-0.5*var(--bs-modal-header-padding-y)) calc(-0.5*var(--bs-modal-header-padding-x)) calc(-0.5*var(--bs-modal-header-padding-y)) auto}.modal-title{margin-bottom:0;line-height:var(--bs-modal-title-line-height)}.modal-body{position:relative;flex:1 1 auto;-webkit-flex:1 1 auto;padding:var(--bs-modal-padding)}.modal-footer{display:flex;display:-webkit-flex;flex-shrink:0;-webkit-flex-shrink:0;flex-wrap:wrap;-webkit-flex-wrap:wrap;align-items:center;-webkit-align-items:center;justify-content:flex-end;-webkit-justify-content:flex-end;padding:calc(var(--bs-modal-padding) - var(--bs-modal-footer-gap)*.5);background-color:var(--bs-modal-footer-bg);border-top:var(--bs-modal-footer-border-width) solid var(--bs-modal-footer-border-color)}.modal-footer>*{margin:calc(var(--bs-modal-footer-gap)*.5)}@media(min-width: 576px){.modal{--bs-modal-margin: 1.75rem;--bs-modal-box-shadow: 0 0.5rem 1rem rgba(0, 0, 0, 0.15)}.modal-dialog{max-width:var(--bs-modal-width);margin-right:auto;margin-left:auto}.modal-sm{--bs-modal-width: 300px}}@media(min-width: 992px){.modal-lg,.modal-xl{--bs-modal-width: 800px}}@media(min-width: 1200px){.modal-xl{--bs-modal-width: 1140px}}.modal-fullscreen{width:100vw;max-width:none;height:100%;margin:0}.modal-fullscreen .modal-content{height:100%;border:0}.modal-fullscreen 
.modal-body{overflow-y:auto}@media(max-width: 575.98px){.modal-fullscreen-sm-down{width:100vw;max-width:none;height:100%;margin:0}.modal-fullscreen-sm-down .modal-content{height:100%;border:0}.modal-fullscreen-sm-down .modal-body{overflow-y:auto}}@media(max-width: 767.98px){.modal-fullscreen-md-down{width:100vw;max-width:none;height:100%;margin:0}.modal-fullscreen-md-down .modal-content{height:100%;border:0}.modal-fullscreen-md-down .modal-body{overflow-y:auto}}@media(max-width: 991.98px){.modal-fullscreen-lg-down{width:100vw;max-width:none;height:100%;margin:0}.modal-fullscreen-lg-down .modal-content{height:100%;border:0}.modal-fullscreen-lg-down .modal-body{overflow-y:auto}}@media(max-width: 1199.98px){.modal-fullscreen-xl-down{width:100vw;max-width:none;height:100%;margin:0}.modal-fullscreen-xl-down .modal-content{height:100%;border:0}.modal-fullscreen-xl-down .modal-body{overflow-y:auto}}@media(max-width: 1399.98px){.modal-fullscreen-xxl-down{width:100vw;max-width:none;height:100%;margin:0}.modal-fullscreen-xxl-down .modal-content{height:100%;border:0}.modal-fullscreen-xxl-down .modal-body{overflow-y:auto}}.tooltip{--bs-tooltip-zindex: 1080;--bs-tooltip-max-width: 200px;--bs-tooltip-padding-x: 0.5rem;--bs-tooltip-padding-y: 0.25rem;--bs-tooltip-margin: ;--bs-tooltip-font-size:0.875rem;--bs-tooltip-color: #fff;--bs-tooltip-bg: #000;--bs-tooltip-border-radius: 0.25rem;--bs-tooltip-opacity: 0.9;--bs-tooltip-arrow-width: 0.8rem;--bs-tooltip-arrow-height: 0.4rem;z-index:var(--bs-tooltip-zindex);display:block;margin:var(--bs-tooltip-margin);font-family:"Source Sans Pro",-apple-system,BlinkMacSystemFont,"Segoe UI",Roboto,"Helvetica Neue",Arial,sans-serif,"Apple Color Emoji","Segoe UI Emoji","Segoe UI 
Symbol";font-style:normal;font-weight:400;line-height:1.5;text-align:left;text-align:start;text-decoration:none;text-shadow:none;text-transform:none;letter-spacing:normal;word-break:normal;white-space:normal;word-spacing:normal;line-break:auto;font-size:var(--bs-tooltip-font-size);word-wrap:break-word;opacity:0}.tooltip.show{opacity:var(--bs-tooltip-opacity)}.tooltip .tooltip-arrow{display:block;width:var(--bs-tooltip-arrow-width);height:var(--bs-tooltip-arrow-height)}.tooltip .tooltip-arrow::before{position:absolute;content:"";border-color:rgba(0,0,0,0);border-style:solid}.bs-tooltip-top .tooltip-arrow,.bs-tooltip-auto[data-popper-placement^=top] .tooltip-arrow{bottom:calc(-1*var(--bs-tooltip-arrow-height))}.bs-tooltip-top .tooltip-arrow::before,.bs-tooltip-auto[data-popper-placement^=top] .tooltip-arrow::before{top:-1px;border-width:var(--bs-tooltip-arrow-height) calc(var(--bs-tooltip-arrow-width)*.5) 0;border-top-color:var(--bs-tooltip-bg)}.bs-tooltip-end .tooltip-arrow,.bs-tooltip-auto[data-popper-placement^=right] .tooltip-arrow{left:calc(-1*var(--bs-tooltip-arrow-height));width:var(--bs-tooltip-arrow-height);height:var(--bs-tooltip-arrow-width)}.bs-tooltip-end .tooltip-arrow::before,.bs-tooltip-auto[data-popper-placement^=right] .tooltip-arrow::before{right:-1px;border-width:calc(var(--bs-tooltip-arrow-width)*.5) var(--bs-tooltip-arrow-height) calc(var(--bs-tooltip-arrow-width)*.5) 0;border-right-color:var(--bs-tooltip-bg)}.bs-tooltip-bottom .tooltip-arrow,.bs-tooltip-auto[data-popper-placement^=bottom] .tooltip-arrow{top:calc(-1*var(--bs-tooltip-arrow-height))}.bs-tooltip-bottom .tooltip-arrow::before,.bs-tooltip-auto[data-popper-placement^=bottom] .tooltip-arrow::before{bottom:-1px;border-width:0 calc(var(--bs-tooltip-arrow-width)*.5) var(--bs-tooltip-arrow-height);border-bottom-color:var(--bs-tooltip-bg)}.bs-tooltip-start .tooltip-arrow,.bs-tooltip-auto[data-popper-placement^=left] 
.tooltip-arrow{right:calc(-1*var(--bs-tooltip-arrow-height));width:var(--bs-tooltip-arrow-height);height:var(--bs-tooltip-arrow-width)}.bs-tooltip-start .tooltip-arrow::before,.bs-tooltip-auto[data-popper-placement^=left] .tooltip-arrow::before{left:-1px;border-width:calc(var(--bs-tooltip-arrow-width)*.5) 0 calc(var(--bs-tooltip-arrow-width)*.5) var(--bs-tooltip-arrow-height);border-left-color:var(--bs-tooltip-bg)}.tooltip-inner{max-width:var(--bs-tooltip-max-width);padding:var(--bs-tooltip-padding-y) var(--bs-tooltip-padding-x);color:var(--bs-tooltip-color);text-align:center;background-color:var(--bs-tooltip-bg)}.popover{--bs-popover-zindex: 1070;--bs-popover-max-width: 276px;--bs-popover-font-size:0.875rem;--bs-popover-bg: #fff;--bs-popover-border-width: 1px;--bs-popover-border-color: rgba(0, 0, 0, 0.175);--bs-popover-border-radius: 0.5rem;--bs-popover-inner-border-radius: calc(0.5rem - 1px);--bs-popover-box-shadow: 0 0.5rem 1rem rgba(0, 0, 0, 0.15);--bs-popover-header-padding-x: 1rem;--bs-popover-header-padding-y: 0.5rem;--bs-popover-header-font-size:1rem;--bs-popover-header-color: inherit;--bs-popover-header-bg: #e9ecef;--bs-popover-body-padding-x: 1rem;--bs-popover-body-padding-y: 1rem;--bs-popover-body-color: #343a40;--bs-popover-arrow-width: 1rem;--bs-popover-arrow-height: 0.5rem;--bs-popover-arrow-border: var(--bs-popover-border-color);z-index:var(--bs-popover-zindex);display:block;max-width:var(--bs-popover-max-width);font-family:"Source Sans Pro",-apple-system,BlinkMacSystemFont,"Segoe UI",Roboto,"Helvetica Neue",Arial,sans-serif,"Apple Color Emoji","Segoe UI Emoji","Segoe UI 
Symbol";font-style:normal;font-weight:400;line-height:1.5;text-align:left;text-align:start;text-decoration:none;text-shadow:none;text-transform:none;letter-spacing:normal;word-break:normal;white-space:normal;word-spacing:normal;line-break:auto;font-size:var(--bs-popover-font-size);word-wrap:break-word;background-color:var(--bs-popover-bg);background-clip:padding-box;border:var(--bs-popover-border-width) solid var(--bs-popover-border-color)}.popover .popover-arrow{display:block;width:var(--bs-popover-arrow-width);height:var(--bs-popover-arrow-height)}.popover .popover-arrow::before,.popover .popover-arrow::after{position:absolute;display:block;content:"";border-color:rgba(0,0,0,0);border-style:solid;border-width:0}.bs-popover-top>.popover-arrow,.bs-popover-auto[data-popper-placement^=top]>.popover-arrow{bottom:calc(-1*(var(--bs-popover-arrow-height)) - var(--bs-popover-border-width))}.bs-popover-top>.popover-arrow::before,.bs-popover-auto[data-popper-placement^=top]>.popover-arrow::before,.bs-popover-top>.popover-arrow::after,.bs-popover-auto[data-popper-placement^=top]>.popover-arrow::after{border-width:var(--bs-popover-arrow-height) calc(var(--bs-popover-arrow-width)*.5) 0}.bs-popover-top>.popover-arrow::before,.bs-popover-auto[data-popper-placement^=top]>.popover-arrow::before{bottom:0;border-top-color:var(--bs-popover-arrow-border)}.bs-popover-top>.popover-arrow::after,.bs-popover-auto[data-popper-placement^=top]>.popover-arrow::after{bottom:var(--bs-popover-border-width);border-top-color:var(--bs-popover-bg)}.bs-popover-end>.popover-arrow,.bs-popover-auto[data-popper-placement^=right]>.popover-arrow{left:calc(-1*(var(--bs-popover-arrow-height)) - 
var(--bs-popover-border-width));width:var(--bs-popover-arrow-height);height:var(--bs-popover-arrow-width)}.bs-popover-end>.popover-arrow::before,.bs-popover-auto[data-popper-placement^=right]>.popover-arrow::before,.bs-popover-end>.popover-arrow::after,.bs-popover-auto[data-popper-placement^=right]>.popover-arrow::after{border-width:calc(var(--bs-popover-arrow-width)*.5) var(--bs-popover-arrow-height) calc(var(--bs-popover-arrow-width)*.5) 0}.bs-popover-end>.popover-arrow::before,.bs-popover-auto[data-popper-placement^=right]>.popover-arrow::before{left:0;border-right-color:var(--bs-popover-arrow-border)}.bs-popover-end>.popover-arrow::after,.bs-popover-auto[data-popper-placement^=right]>.popover-arrow::after{left:var(--bs-popover-border-width);border-right-color:var(--bs-popover-bg)}.bs-popover-bottom>.popover-arrow,.bs-popover-auto[data-popper-placement^=bottom]>.popover-arrow{top:calc(-1*(var(--bs-popover-arrow-height)) - var(--bs-popover-border-width))}.bs-popover-bottom>.popover-arrow::before,.bs-popover-auto[data-popper-placement^=bottom]>.popover-arrow::before,.bs-popover-bottom>.popover-arrow::after,.bs-popover-auto[data-popper-placement^=bottom]>.popover-arrow::after{border-width:0 calc(var(--bs-popover-arrow-width)*.5) var(--bs-popover-arrow-height)}.bs-popover-bottom>.popover-arrow::before,.bs-popover-auto[data-popper-placement^=bottom]>.popover-arrow::before{top:0;border-bottom-color:var(--bs-popover-arrow-border)}.bs-popover-bottom>.popover-arrow::after,.bs-popover-auto[data-popper-placement^=bottom]>.popover-arrow::after{top:var(--bs-popover-border-width);border-bottom-color:var(--bs-popover-bg)}.bs-popover-bottom .popover-header::before,.bs-popover-auto[data-popper-placement^=bottom] .popover-header::before{position:absolute;top:0;left:50%;display:block;width:var(--bs-popover-arrow-width);margin-left:calc(-0.5*var(--bs-popover-arrow-width));content:"";border-bottom:var(--bs-popover-border-width) solid 
var(--bs-popover-header-bg)}.bs-popover-start>.popover-arrow,.bs-popover-auto[data-popper-placement^=left]>.popover-arrow{right:calc(-1*(var(--bs-popover-arrow-height)) - var(--bs-popover-border-width));width:var(--bs-popover-arrow-height);height:var(--bs-popover-arrow-width)}.bs-popover-start>.popover-arrow::before,.bs-popover-auto[data-popper-placement^=left]>.popover-arrow::before,.bs-popover-start>.popover-arrow::after,.bs-popover-auto[data-popper-placement^=left]>.popover-arrow::after{border-width:calc(var(--bs-popover-arrow-width)*.5) 0 calc(var(--bs-popover-arrow-width)*.5) var(--bs-popover-arrow-height)}.bs-popover-start>.popover-arrow::before,.bs-popover-auto[data-popper-placement^=left]>.popover-arrow::before{right:0;border-left-color:var(--bs-popover-arrow-border)}.bs-popover-start>.popover-arrow::after,.bs-popover-auto[data-popper-placement^=left]>.popover-arrow::after{right:var(--bs-popover-border-width);border-left-color:var(--bs-popover-bg)}.popover-header{padding:var(--bs-popover-header-padding-y) var(--bs-popover-header-padding-x);margin-bottom:0;font-size:var(--bs-popover-header-font-size);color:var(--bs-popover-header-color);background-color:var(--bs-popover-header-bg);border-bottom:var(--bs-popover-border-width) solid var(--bs-popover-border-color)}.popover-header:empty{display:none}.popover-body{padding:var(--bs-popover-body-padding-y) var(--bs-popover-body-padding-x);color:var(--bs-popover-body-color)}.carousel{position:relative}.carousel.pointer-event{touch-action:pan-y;-webkit-touch-action:pan-y;-moz-touch-action:pan-y;-ms-touch-action:pan-y;-o-touch-action:pan-y}.carousel-inner{position:relative;width:100%;overflow:hidden}.carousel-inner::after{display:block;clear:both;content:""}.carousel-item{position:relative;display:none;float:left;width:100%;margin-right:-100%;backface-visibility:hidden;-webkit-backface-visibility:hidden;-moz-backface-visibility:hidden;-ms-backface-visibility:hidden;-o-backface-visibility:hidden;transition:transform 
.6s ease-in-out}@media(prefers-reduced-motion: reduce){.carousel-item{transition:none}}.carousel-item.active,.carousel-item-next,.carousel-item-prev{display:block}.carousel-item-next:not(.carousel-item-start),.active.carousel-item-end{transform:translateX(100%)}.carousel-item-prev:not(.carousel-item-end),.active.carousel-item-start{transform:translateX(-100%)}.carousel-fade .carousel-item{opacity:0;transition-property:opacity;transform:none}.carousel-fade .carousel-item.active,.carousel-fade .carousel-item-next.carousel-item-start,.carousel-fade .carousel-item-prev.carousel-item-end{z-index:1;opacity:1}.carousel-fade .active.carousel-item-start,.carousel-fade .active.carousel-item-end{z-index:0;opacity:0;transition:opacity 0s .6s}@media(prefers-reduced-motion: reduce){.carousel-fade .active.carousel-item-start,.carousel-fade .active.carousel-item-end{transition:none}}.carousel-control-prev,.carousel-control-next{position:absolute;top:0;bottom:0;z-index:1;display:flex;display:-webkit-flex;align-items:center;-webkit-align-items:center;justify-content:center;-webkit-justify-content:center;width:15%;padding:0;color:#fff;text-align:center;background:none;border:0;opacity:.5;transition:opacity .15s ease}@media(prefers-reduced-motion: reduce){.carousel-control-prev,.carousel-control-next{transition:none}}.carousel-control-prev:hover,.carousel-control-prev:focus,.carousel-control-next:hover,.carousel-control-next:focus{color:#fff;text-decoration:none;outline:0;opacity:.9}.carousel-control-prev{left:0}.carousel-control-next{right:0}.carousel-control-prev-icon,.carousel-control-next-icon{display:inline-block;width:2rem;height:2rem;background-repeat:no-repeat;background-position:50%;background-size:100% 100%}.carousel-control-prev-icon{background-image:url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 16 16' fill='%23fff'%3e%3cpath d='M11.354 1.646a.5.5 0 0 1 0 .708L5.707 8l5.647 5.646a.5.5 0 0 1-.708.708l-6-6a.5.5 0 0 1 0-.708l6-6a.5.5 0 0 1 .708 
0z'/%3e%3c/svg%3e")}.carousel-control-next-icon{background-image:url("data:image/svg+xml,%3csvg xmlns='http://www.w3.org/2000/svg' viewBox='0 0 16 16' fill='%23fff'%3e%3cpath d='M4.646 1.646a.5.5 0 0 1 .708 0l6 6a.5.5 0 0 1 0 .708l-6 6a.5.5 0 0 1-.708-.708L10.293 8 4.646 2.354a.5.5 0 0 1 0-.708z'/%3e%3c/svg%3e")}.carousel-indicators{position:absolute;right:0;bottom:0;left:0;z-index:2;display:flex;display:-webkit-flex;justify-content:center;-webkit-justify-content:center;padding:0;margin-right:15%;margin-bottom:1rem;margin-left:15%}.carousel-indicators [data-bs-target]{box-sizing:content-box;flex:0 1 auto;-webkit-flex:0 1 auto;width:30px;height:3px;padding:0;margin-right:3px;margin-left:3px;text-indent:-999px;cursor:pointer;background-color:#fff;background-clip:padding-box;border:0;border-top:10px solid rgba(0,0,0,0);border-bottom:10px solid rgba(0,0,0,0);opacity:.5;transition:opacity .6s ease}@media(prefers-reduced-motion: reduce){.carousel-indicators [data-bs-target]{transition:none}}.carousel-indicators .active{opacity:1}.carousel-caption{position:absolute;right:15%;bottom:1.25rem;left:15%;padding-top:1.25rem;padding-bottom:1.25rem;color:#fff;text-align:center}.carousel-dark .carousel-control-prev-icon,.carousel-dark .carousel-control-next-icon{filter:invert(1) grayscale(100)}.carousel-dark .carousel-indicators [data-bs-target]{background-color:#000}.carousel-dark .carousel-caption{color:#000}[data-bs-theme=dark] .carousel .carousel-control-prev-icon,[data-bs-theme=dark] .carousel .carousel-control-next-icon,[data-bs-theme=dark].carousel .carousel-control-prev-icon,[data-bs-theme=dark].carousel .carousel-control-next-icon{filter:invert(1) grayscale(100)}[data-bs-theme=dark] .carousel .carousel-indicators [data-bs-target],[data-bs-theme=dark].carousel .carousel-indicators [data-bs-target]{background-color:#000}[data-bs-theme=dark] .carousel .carousel-caption,[data-bs-theme=dark].carousel 
.carousel-caption{color:#000}.spinner-grow,.spinner-border{display:inline-block;width:var(--bs-spinner-width);height:var(--bs-spinner-height);vertical-align:var(--bs-spinner-vertical-align);border-radius:50%;animation:var(--bs-spinner-animation-speed) linear infinite var(--bs-spinner-animation-name)}@keyframes spinner-border{to{transform:rotate(360deg) /* rtl:ignore */}}.spinner-border{--bs-spinner-width: 2rem;--bs-spinner-height: 2rem;--bs-spinner-vertical-align: -0.125em;--bs-spinner-border-width: 0.25em;--bs-spinner-animation-speed: 0.75s;--bs-spinner-animation-name: spinner-border;border:var(--bs-spinner-border-width) solid currentcolor;border-right-color:rgba(0,0,0,0)}.spinner-border-sm{--bs-spinner-width: 1rem;--bs-spinner-height: 1rem;--bs-spinner-border-width: 0.2em}@keyframes spinner-grow{0%{transform:scale(0)}50%{opacity:1;transform:none}}.spinner-grow{--bs-spinner-width: 2rem;--bs-spinner-height: 2rem;--bs-spinner-vertical-align: -0.125em;--bs-spinner-animation-speed: 0.75s;--bs-spinner-animation-name: spinner-grow;background-color:currentcolor;opacity:0}.spinner-grow-sm{--bs-spinner-width: 1rem;--bs-spinner-height: 1rem}@media(prefers-reduced-motion: reduce){.spinner-border,.spinner-grow{--bs-spinner-animation-speed: 1.5s}}.offcanvas,.offcanvas-xxl,.offcanvas-xl,.offcanvas-lg,.offcanvas-md,.offcanvas-sm{--bs-offcanvas-zindex: 1045;--bs-offcanvas-width: 400px;--bs-offcanvas-height: 30vh;--bs-offcanvas-padding-x: 1rem;--bs-offcanvas-padding-y: 1rem;--bs-offcanvas-color: #343a40;--bs-offcanvas-bg: #fff;--bs-offcanvas-border-width: 1px;--bs-offcanvas-border-color: rgba(0, 0, 0, 0.175);--bs-offcanvas-box-shadow: 0 0.125rem 0.25rem rgba(0, 0, 0, 0.075);--bs-offcanvas-transition: transform 0.3s ease-in-out;--bs-offcanvas-title-line-height: 1.5}@media(max-width: 
575.98px){.offcanvas-sm{position:fixed;bottom:0;z-index:var(--bs-offcanvas-zindex);display:flex;display:-webkit-flex;flex-direction:column;-webkit-flex-direction:column;max-width:100%;color:var(--bs-offcanvas-color);visibility:hidden;background-color:var(--bs-offcanvas-bg);background-clip:padding-box;outline:0;transition:var(--bs-offcanvas-transition)}}@media(max-width: 575.98px)and (prefers-reduced-motion: reduce){.offcanvas-sm{transition:none}}@media(max-width: 575.98px){.offcanvas-sm.offcanvas-start{top:0;left:0;width:var(--bs-offcanvas-width);border-right:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateX(-100%)}.offcanvas-sm.offcanvas-end{top:0;right:0;width:var(--bs-offcanvas-width);border-left:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateX(100%)}.offcanvas-sm.offcanvas-top{top:0;right:0;left:0;height:var(--bs-offcanvas-height);max-height:100%;border-bottom:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateY(-100%)}.offcanvas-sm.offcanvas-bottom{right:0;left:0;height:var(--bs-offcanvas-height);max-height:100%;border-top:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateY(100%)}.offcanvas-sm.showing,.offcanvas-sm.show:not(.hiding){transform:none}.offcanvas-sm.showing,.offcanvas-sm.hiding,.offcanvas-sm.show{visibility:visible}}@media(min-width: 576px){.offcanvas-sm{--bs-offcanvas-height: auto;--bs-offcanvas-border-width: 0;background-color:rgba(0,0,0,0) !important}.offcanvas-sm .offcanvas-header{display:none}.offcanvas-sm .offcanvas-body{display:flex;display:-webkit-flex;flex-grow:0;-webkit-flex-grow:0;padding:0;overflow-y:visible;background-color:rgba(0,0,0,0) !important}}@media(max-width: 
767.98px){.offcanvas-md{position:fixed;bottom:0;z-index:var(--bs-offcanvas-zindex);display:flex;display:-webkit-flex;flex-direction:column;-webkit-flex-direction:column;max-width:100%;color:var(--bs-offcanvas-color);visibility:hidden;background-color:var(--bs-offcanvas-bg);background-clip:padding-box;outline:0;transition:var(--bs-offcanvas-transition)}}@media(max-width: 767.98px)and (prefers-reduced-motion: reduce){.offcanvas-md{transition:none}}@media(max-width: 767.98px){.offcanvas-md.offcanvas-start{top:0;left:0;width:var(--bs-offcanvas-width);border-right:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateX(-100%)}.offcanvas-md.offcanvas-end{top:0;right:0;width:var(--bs-offcanvas-width);border-left:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateX(100%)}.offcanvas-md.offcanvas-top{top:0;right:0;left:0;height:var(--bs-offcanvas-height);max-height:100%;border-bottom:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateY(-100%)}.offcanvas-md.offcanvas-bottom{right:0;left:0;height:var(--bs-offcanvas-height);max-height:100%;border-top:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateY(100%)}.offcanvas-md.showing,.offcanvas-md.show:not(.hiding){transform:none}.offcanvas-md.showing,.offcanvas-md.hiding,.offcanvas-md.show{visibility:visible}}@media(min-width: 768px){.offcanvas-md{--bs-offcanvas-height: auto;--bs-offcanvas-border-width: 0;background-color:rgba(0,0,0,0) !important}.offcanvas-md .offcanvas-header{display:none}.offcanvas-md .offcanvas-body{display:flex;display:-webkit-flex;flex-grow:0;-webkit-flex-grow:0;padding:0;overflow-y:visible;background-color:rgba(0,0,0,0) !important}}@media(max-width: 
991.98px){.offcanvas-lg{position:fixed;bottom:0;z-index:var(--bs-offcanvas-zindex);display:flex;display:-webkit-flex;flex-direction:column;-webkit-flex-direction:column;max-width:100%;color:var(--bs-offcanvas-color);visibility:hidden;background-color:var(--bs-offcanvas-bg);background-clip:padding-box;outline:0;transition:var(--bs-offcanvas-transition)}}@media(max-width: 991.98px)and (prefers-reduced-motion: reduce){.offcanvas-lg{transition:none}}@media(max-width: 991.98px){.offcanvas-lg.offcanvas-start{top:0;left:0;width:var(--bs-offcanvas-width);border-right:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateX(-100%)}.offcanvas-lg.offcanvas-end{top:0;right:0;width:var(--bs-offcanvas-width);border-left:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateX(100%)}.offcanvas-lg.offcanvas-top{top:0;right:0;left:0;height:var(--bs-offcanvas-height);max-height:100%;border-bottom:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateY(-100%)}.offcanvas-lg.offcanvas-bottom{right:0;left:0;height:var(--bs-offcanvas-height);max-height:100%;border-top:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateY(100%)}.offcanvas-lg.showing,.offcanvas-lg.show:not(.hiding){transform:none}.offcanvas-lg.showing,.offcanvas-lg.hiding,.offcanvas-lg.show{visibility:visible}}@media(min-width: 992px){.offcanvas-lg{--bs-offcanvas-height: auto;--bs-offcanvas-border-width: 0;background-color:rgba(0,0,0,0) !important}.offcanvas-lg .offcanvas-header{display:none}.offcanvas-lg .offcanvas-body{display:flex;display:-webkit-flex;flex-grow:0;-webkit-flex-grow:0;padding:0;overflow-y:visible;background-color:rgba(0,0,0,0) !important}}@media(max-width: 
1199.98px){.offcanvas-xl{position:fixed;bottom:0;z-index:var(--bs-offcanvas-zindex);display:flex;display:-webkit-flex;flex-direction:column;-webkit-flex-direction:column;max-width:100%;color:var(--bs-offcanvas-color);visibility:hidden;background-color:var(--bs-offcanvas-bg);background-clip:padding-box;outline:0;transition:var(--bs-offcanvas-transition)}}@media(max-width: 1199.98px)and (prefers-reduced-motion: reduce){.offcanvas-xl{transition:none}}@media(max-width: 1199.98px){.offcanvas-xl.offcanvas-start{top:0;left:0;width:var(--bs-offcanvas-width);border-right:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateX(-100%)}.offcanvas-xl.offcanvas-end{top:0;right:0;width:var(--bs-offcanvas-width);border-left:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateX(100%)}.offcanvas-xl.offcanvas-top{top:0;right:0;left:0;height:var(--bs-offcanvas-height);max-height:100%;border-bottom:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateY(-100%)}.offcanvas-xl.offcanvas-bottom{right:0;left:0;height:var(--bs-offcanvas-height);max-height:100%;border-top:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateY(100%)}.offcanvas-xl.showing,.offcanvas-xl.show:not(.hiding){transform:none}.offcanvas-xl.showing,.offcanvas-xl.hiding,.offcanvas-xl.show{visibility:visible}}@media(min-width: 1200px){.offcanvas-xl{--bs-offcanvas-height: auto;--bs-offcanvas-border-width: 0;background-color:rgba(0,0,0,0) !important}.offcanvas-xl .offcanvas-header{display:none}.offcanvas-xl .offcanvas-body{display:flex;display:-webkit-flex;flex-grow:0;-webkit-flex-grow:0;padding:0;overflow-y:visible;background-color:rgba(0,0,0,0) !important}}@media(max-width: 
1399.98px){.offcanvas-xxl{position:fixed;bottom:0;z-index:var(--bs-offcanvas-zindex);display:flex;display:-webkit-flex;flex-direction:column;-webkit-flex-direction:column;max-width:100%;color:var(--bs-offcanvas-color);visibility:hidden;background-color:var(--bs-offcanvas-bg);background-clip:padding-box;outline:0;transition:var(--bs-offcanvas-transition)}}@media(max-width: 1399.98px)and (prefers-reduced-motion: reduce){.offcanvas-xxl{transition:none}}@media(max-width: 1399.98px){.offcanvas-xxl.offcanvas-start{top:0;left:0;width:var(--bs-offcanvas-width);border-right:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateX(-100%)}.offcanvas-xxl.offcanvas-end{top:0;right:0;width:var(--bs-offcanvas-width);border-left:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateX(100%)}.offcanvas-xxl.offcanvas-top{top:0;right:0;left:0;height:var(--bs-offcanvas-height);max-height:100%;border-bottom:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateY(-100%)}.offcanvas-xxl.offcanvas-bottom{right:0;left:0;height:var(--bs-offcanvas-height);max-height:100%;border-top:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateY(100%)}.offcanvas-xxl.showing,.offcanvas-xxl.show:not(.hiding){transform:none}.offcanvas-xxl.showing,.offcanvas-xxl.hiding,.offcanvas-xxl.show{visibility:visible}}@media(min-width: 1400px){.offcanvas-xxl{--bs-offcanvas-height: auto;--bs-offcanvas-border-width: 0;background-color:rgba(0,0,0,0) !important}.offcanvas-xxl .offcanvas-header{display:none}.offcanvas-xxl .offcanvas-body{display:flex;display:-webkit-flex;flex-grow:0;-webkit-flex-grow:0;padding:0;overflow-y:visible;background-color:rgba(0,0,0,0) 
!important}}.offcanvas{position:fixed;bottom:0;z-index:var(--bs-offcanvas-zindex);display:flex;display:-webkit-flex;flex-direction:column;-webkit-flex-direction:column;max-width:100%;color:var(--bs-offcanvas-color);visibility:hidden;background-color:var(--bs-offcanvas-bg);background-clip:padding-box;outline:0;transition:var(--bs-offcanvas-transition)}@media(prefers-reduced-motion: reduce){.offcanvas{transition:none}}.offcanvas.offcanvas-start{top:0;left:0;width:var(--bs-offcanvas-width);border-right:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateX(-100%)}.offcanvas.offcanvas-end{top:0;right:0;width:var(--bs-offcanvas-width);border-left:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateX(100%)}.offcanvas.offcanvas-top{top:0;right:0;left:0;height:var(--bs-offcanvas-height);max-height:100%;border-bottom:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateY(-100%)}.offcanvas.offcanvas-bottom{right:0;left:0;height:var(--bs-offcanvas-height);max-height:100%;border-top:var(--bs-offcanvas-border-width) solid var(--bs-offcanvas-border-color);transform:translateY(100%)}.offcanvas.showing,.offcanvas.show:not(.hiding){transform:none}.offcanvas.showing,.offcanvas.hiding,.offcanvas.show{visibility:visible}.offcanvas-backdrop{position:fixed;top:0;left:0;z-index:1040;width:100vw;height:100vh;background-color:#000}.offcanvas-backdrop.fade{opacity:0}.offcanvas-backdrop.show{opacity:.5}.offcanvas-header{display:flex;display:-webkit-flex;align-items:center;-webkit-align-items:center;justify-content:space-between;-webkit-justify-content:space-between;padding:var(--bs-offcanvas-padding-y) var(--bs-offcanvas-padding-x)}.offcanvas-header .btn-close{padding:calc(var(--bs-offcanvas-padding-y)*.5) 
calc(var(--bs-offcanvas-padding-x)*.5);margin-top:calc(-0.5*var(--bs-offcanvas-padding-y));margin-right:calc(-0.5*var(--bs-offcanvas-padding-x));margin-bottom:calc(-0.5*var(--bs-offcanvas-padding-y))}.offcanvas-title{margin-bottom:0;line-height:var(--bs-offcanvas-title-line-height)}.offcanvas-body{flex-grow:1;-webkit-flex-grow:1;padding:var(--bs-offcanvas-padding-y) var(--bs-offcanvas-padding-x);overflow-y:auto}.placeholder{display:inline-block;min-height:1em;vertical-align:middle;cursor:wait;background-color:currentcolor;opacity:.5}.placeholder.btn::before{display:inline-block;content:""}.placeholder-xs{min-height:.6em}.placeholder-sm{min-height:.8em}.placeholder-lg{min-height:1.2em}.placeholder-glow .placeholder{animation:placeholder-glow 2s ease-in-out infinite}@keyframes placeholder-glow{50%{opacity:.2}}.placeholder-wave{mask-image:linear-gradient(130deg, #000 55%, rgba(0, 0, 0, 0.8) 75%, #000 95%);-webkit-mask-image:linear-gradient(130deg, #000 55%, rgba(0, 0, 0, 0.8) 75%, #000 95%);mask-size:200% 100%;-webkit-mask-size:200% 100%;animation:placeholder-wave 2s linear infinite}@keyframes placeholder-wave{100%{mask-position:-200% 0%;-webkit-mask-position:-200% 0%}}.clearfix::after{display:block;clear:both;content:""}.text-bg-default{color:#fff !important;background-color:RGBA(var(--bs-default-rgb), var(--bs-bg-opacity, 1)) !important}.text-bg-primary{color:#fff !important;background-color:RGBA(var(--bs-primary-rgb), var(--bs-bg-opacity, 1)) !important}.text-bg-secondary{color:#fff !important;background-color:RGBA(var(--bs-secondary-rgb), var(--bs-bg-opacity, 1)) !important}.text-bg-success{color:#fff !important;background-color:RGBA(var(--bs-success-rgb), var(--bs-bg-opacity, 1)) !important}.text-bg-info{color:#fff !important;background-color:RGBA(var(--bs-info-rgb), var(--bs-bg-opacity, 1)) !important}.text-bg-warning{color:#fff !important;background-color:RGBA(var(--bs-warning-rgb), var(--bs-bg-opacity, 1)) !important}.text-bg-danger{color:#fff 
!important;background-color:RGBA(var(--bs-danger-rgb), var(--bs-bg-opacity, 1)) !important}.text-bg-light{color:#000 !important;background-color:RGBA(var(--bs-light-rgb), var(--bs-bg-opacity, 1)) !important}.text-bg-dark{color:#fff !important;background-color:RGBA(var(--bs-dark-rgb), var(--bs-bg-opacity, 1)) !important}.link-default{color:RGBA(var(--bs-default-rgb), var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(var(--bs-default-rgb), var(--bs-link-underline-opacity, 1)) !important}.link-default:hover,.link-default:focus{color:RGBA(42, 46, 51, var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(42, 46, 51, var(--bs-link-underline-opacity, 1)) !important}.link-primary{color:RGBA(var(--bs-primary-rgb), var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(var(--bs-primary-rgb), var(--bs-link-underline-opacity, 1)) !important}.link-primary:hover,.link-primary:focus{color:RGBA(31, 102, 182, var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(31, 102, 182, var(--bs-link-underline-opacity, 1)) !important}.link-secondary{color:RGBA(var(--bs-secondary-rgb), var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(var(--bs-secondary-rgb), var(--bs-link-underline-opacity, 1)) !important}.link-secondary:hover,.link-secondary:focus{color:RGBA(42, 46, 51, var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(42, 46, 51, var(--bs-link-underline-opacity, 1)) !important}.link-success{color:RGBA(var(--bs-success-rgb), var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(var(--bs-success-rgb), var(--bs-link-underline-opacity, 1)) !important}.link-success:hover,.link-success:focus{color:RGBA(50, 146, 19, var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(50, 146, 19, var(--bs-link-underline-opacity, 1)) !important}.link-info{color:RGBA(var(--bs-info-rgb), var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(var(--bs-info-rgb), var(--bs-link-underline-opacity, 1)) 
!important}.link-info:hover,.link-info:focus{color:RGBA(122, 67, 150, var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(122, 67, 150, var(--bs-link-underline-opacity, 1)) !important}.link-warning{color:RGBA(var(--bs-warning-rgb), var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(var(--bs-warning-rgb), var(--bs-link-underline-opacity, 1)) !important}.link-warning:hover,.link-warning:focus{color:RGBA(204, 94, 19, var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(204, 94, 19, var(--bs-link-underline-opacity, 1)) !important}.link-danger{color:RGBA(var(--bs-danger-rgb), var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(var(--bs-danger-rgb), var(--bs-link-underline-opacity, 1)) !important}.link-danger:hover,.link-danger:focus{color:RGBA(204, 0, 46, var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(204, 0, 46, var(--bs-link-underline-opacity, 1)) !important}.link-light{color:RGBA(var(--bs-light-rgb), var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(var(--bs-light-rgb), var(--bs-link-underline-opacity, 1)) !important}.link-light:hover,.link-light:focus{color:RGBA(249, 250, 251, var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(249, 250, 251, var(--bs-link-underline-opacity, 1)) !important}.link-dark{color:RGBA(var(--bs-dark-rgb), var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(var(--bs-dark-rgb), var(--bs-link-underline-opacity, 1)) !important}.link-dark:hover,.link-dark:focus{color:RGBA(42, 46, 51, var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(42, 46, 51, var(--bs-link-underline-opacity, 1)) !important}.link-body-emphasis{color:RGBA(var(--bs-emphasis-color-rgb), var(--bs-link-opacity, 1)) !important;text-decoration-color:RGBA(var(--bs-emphasis-color-rgb), var(--bs-link-underline-opacity, 1)) !important}.link-body-emphasis:hover,.link-body-emphasis:focus{color:RGBA(var(--bs-emphasis-color-rgb), var(--bs-link-opacity, 0.75)) 
!important;text-decoration-color:RGBA(var(--bs-emphasis-color-rgb), var(--bs-link-underline-opacity, 0.75)) !important}.focus-ring:focus{outline:0;box-shadow:var(--bs-focus-ring-x, 0) var(--bs-focus-ring-y, 0) var(--bs-focus-ring-blur, 0) var(--bs-focus-ring-width) var(--bs-focus-ring-color)}.icon-link{display:inline-flex;gap:.375rem;align-items:center;-webkit-align-items:center;text-decoration-color:rgba(var(--bs-link-color-rgb), var(--bs-link-opacity, 0.5));text-underline-offset:.25em;backface-visibility:hidden;-webkit-backface-visibility:hidden;-moz-backface-visibility:hidden;-ms-backface-visibility:hidden;-o-backface-visibility:hidden}.icon-link>.bi{flex-shrink:0;-webkit-flex-shrink:0;width:1em;height:1em;fill:currentcolor;transition:.2s ease-in-out transform}@media(prefers-reduced-motion: reduce){.icon-link>.bi{transition:none}}.icon-link-hover:hover>.bi,.icon-link-hover:focus-visible>.bi{transform:var(--bs-icon-link-transform, translate3d(0.25em, 0, 0))}.ratio{position:relative;width:100%}.ratio::before{display:block;padding-top:var(--bs-aspect-ratio);content:""}.ratio>*{position:absolute;top:0;left:0;width:100%;height:100%}.ratio-1x1{--bs-aspect-ratio: 100%}.ratio-4x3{--bs-aspect-ratio: 75%}.ratio-16x9{--bs-aspect-ratio: 56.25%}.ratio-21x9{--bs-aspect-ratio: 42.8571428571%}.fixed-top{position:fixed;top:0;right:0;left:0;z-index:1030}.fixed-bottom{position:fixed;right:0;bottom:0;left:0;z-index:1030}.sticky-top{position:sticky;top:0;z-index:1020}.sticky-bottom{position:sticky;bottom:0;z-index:1020}@media(min-width: 576px){.sticky-sm-top{position:sticky;top:0;z-index:1020}.sticky-sm-bottom{position:sticky;bottom:0;z-index:1020}}@media(min-width: 768px){.sticky-md-top{position:sticky;top:0;z-index:1020}.sticky-md-bottom{position:sticky;bottom:0;z-index:1020}}@media(min-width: 992px){.sticky-lg-top{position:sticky;top:0;z-index:1020}.sticky-lg-bottom{position:sticky;bottom:0;z-index:1020}}@media(min-width: 
1200px){.sticky-xl-top{position:sticky;top:0;z-index:1020}.sticky-xl-bottom{position:sticky;bottom:0;z-index:1020}}@media(min-width: 1400px){.sticky-xxl-top{position:sticky;top:0;z-index:1020}.sticky-xxl-bottom{position:sticky;bottom:0;z-index:1020}}.hstack{display:flex;display:-webkit-flex;flex-direction:row;-webkit-flex-direction:row;align-items:center;-webkit-align-items:center;align-self:stretch;-webkit-align-self:stretch}.vstack{display:flex;display:-webkit-flex;flex:1 1 auto;-webkit-flex:1 1 auto;flex-direction:column;-webkit-flex-direction:column;align-self:stretch;-webkit-align-self:stretch}.visually-hidden,.visually-hidden-focusable:not(:focus):not(:focus-within){width:1px !important;height:1px !important;padding:0 !important;margin:-1px !important;overflow:hidden !important;clip:rect(0, 0, 0, 0) !important;white-space:nowrap !important;border:0 !important}.visually-hidden:not(caption),.visually-hidden-focusable:not(:focus):not(:focus-within):not(caption){position:absolute !important}.stretched-link::after{position:absolute;top:0;right:0;bottom:0;left:0;z-index:1;content:""}.text-truncate{overflow:hidden;text-overflow:ellipsis;white-space:nowrap}.vr{display:inline-block;align-self:stretch;-webkit-align-self:stretch;width:1px;min-height:1em;background-color:currentcolor;opacity:.25}.align-baseline{vertical-align:baseline !important}.align-top{vertical-align:top !important}.align-middle{vertical-align:middle !important}.align-bottom{vertical-align:bottom !important}.align-text-bottom{vertical-align:text-bottom !important}.align-text-top{vertical-align:text-top !important}.float-start{float:left !important}.float-end{float:right !important}.float-none{float:none !important}.object-fit-contain{object-fit:contain !important}.object-fit-cover{object-fit:cover !important}.object-fit-fill{object-fit:fill !important}.object-fit-scale{object-fit:scale-down !important}.object-fit-none{object-fit:none !important}.opacity-0{opacity:0 !important}.opacity-25{opacity:.25 
!important}.opacity-50{opacity:.5 !important}.opacity-75{opacity:.75 !important}.opacity-100{opacity:1 !important}.overflow-auto{overflow:auto !important}.overflow-hidden{overflow:hidden !important}.overflow-visible{overflow:visible !important}.overflow-scroll{overflow:scroll !important}.overflow-x-auto{overflow-x:auto !important}.overflow-x-hidden{overflow-x:hidden !important}.overflow-x-visible{overflow-x:visible !important}.overflow-x-scroll{overflow-x:scroll !important}.overflow-y-auto{overflow-y:auto !important}.overflow-y-hidden{overflow-y:hidden !important}.overflow-y-visible{overflow-y:visible !important}.overflow-y-scroll{overflow-y:scroll !important}.d-inline{display:inline !important}.d-inline-block{display:inline-block !important}.d-block{display:block !important}.d-grid{display:grid !important}.d-inline-grid{display:inline-grid !important}.d-table{display:table !important}.d-table-row{display:table-row !important}.d-table-cell{display:table-cell !important}.d-flex{display:flex !important}.d-inline-flex{display:inline-flex !important}.d-none{display:none !important}.shadow{box-shadow:0 .5rem 1rem rgba(0,0,0,.15) !important}.shadow-sm{box-shadow:0 .125rem .25rem rgba(0,0,0,.075) !important}.shadow-lg{box-shadow:0 1rem 3rem rgba(0,0,0,.175) !important}.shadow-none{box-shadow:none !important}.focus-ring-default{--bs-focus-ring-color: rgba(var(--bs-default-rgb), var(--bs-focus-ring-opacity))}.focus-ring-primary{--bs-focus-ring-color: rgba(var(--bs-primary-rgb), var(--bs-focus-ring-opacity))}.focus-ring-secondary{--bs-focus-ring-color: rgba(var(--bs-secondary-rgb), var(--bs-focus-ring-opacity))}.focus-ring-success{--bs-focus-ring-color: rgba(var(--bs-success-rgb), var(--bs-focus-ring-opacity))}.focus-ring-info{--bs-focus-ring-color: rgba(var(--bs-info-rgb), var(--bs-focus-ring-opacity))}.focus-ring-warning{--bs-focus-ring-color: rgba(var(--bs-warning-rgb), var(--bs-focus-ring-opacity))}.focus-ring-danger{--bs-focus-ring-color: rgba(var(--bs-danger-rgb), 
var(--bs-focus-ring-opacity))}.focus-ring-light{--bs-focus-ring-color: rgba(var(--bs-light-rgb), var(--bs-focus-ring-opacity))}.focus-ring-dark{--bs-focus-ring-color: rgba(var(--bs-dark-rgb), var(--bs-focus-ring-opacity))}.position-static{position:static !important}.position-relative{position:relative !important}.position-absolute{position:absolute !important}.position-fixed{position:fixed !important}.position-sticky{position:sticky !important}.top-0{top:0 !important}.top-50{top:50% !important}.top-100{top:100% !important}.bottom-0{bottom:0 !important}.bottom-50{bottom:50% !important}.bottom-100{bottom:100% !important}.start-0{left:0 !important}.start-50{left:50% !important}.start-100{left:100% !important}.end-0{right:0 !important}.end-50{right:50% !important}.end-100{right:100% !important}.translate-middle{transform:translate(-50%, -50%) !important}.translate-middle-x{transform:translateX(-50%) !important}.translate-middle-y{transform:translateY(-50%) !important}.border{border:var(--bs-border-width) var(--bs-border-style) var(--bs-border-color) !important}.border-0{border:0 !important}.border-top{border-top:var(--bs-border-width) var(--bs-border-style) var(--bs-border-color) !important}.border-top-0{border-top:0 !important}.border-end{border-right:var(--bs-border-width) var(--bs-border-style) var(--bs-border-color) !important}.border-end-0{border-right:0 !important}.border-bottom{border-bottom:var(--bs-border-width) var(--bs-border-style) var(--bs-border-color) !important}.border-bottom-0{border-bottom:0 !important}.border-start{border-left:var(--bs-border-width) var(--bs-border-style) var(--bs-border-color) !important}.border-start-0{border-left:0 !important}.border-default{--bs-border-opacity: 1;border-color:rgba(var(--bs-default-rgb), var(--bs-border-opacity)) !important}.border-primary{--bs-border-opacity: 1;border-color:rgba(var(--bs-primary-rgb), var(--bs-border-opacity)) !important}.border-secondary{--bs-border-opacity: 
1;border-color:rgba(var(--bs-secondary-rgb), var(--bs-border-opacity)) !important}.border-success{--bs-border-opacity: 1;border-color:rgba(var(--bs-success-rgb), var(--bs-border-opacity)) !important}.border-info{--bs-border-opacity: 1;border-color:rgba(var(--bs-info-rgb), var(--bs-border-opacity)) !important}.border-warning{--bs-border-opacity: 1;border-color:rgba(var(--bs-warning-rgb), var(--bs-border-opacity)) !important}.border-danger{--bs-border-opacity: 1;border-color:rgba(var(--bs-danger-rgb), var(--bs-border-opacity)) !important}.border-light{--bs-border-opacity: 1;border-color:rgba(var(--bs-light-rgb), var(--bs-border-opacity)) !important}.border-dark{--bs-border-opacity: 1;border-color:rgba(var(--bs-dark-rgb), var(--bs-border-opacity)) !important}.border-black{--bs-border-opacity: 1;border-color:rgba(var(--bs-black-rgb), var(--bs-border-opacity)) !important}.border-white{--bs-border-opacity: 1;border-color:rgba(var(--bs-white-rgb), var(--bs-border-opacity)) !important}.border-primary-subtle{border-color:var(--bs-primary-border-subtle) !important}.border-secondary-subtle{border-color:var(--bs-secondary-border-subtle) !important}.border-success-subtle{border-color:var(--bs-success-border-subtle) !important}.border-info-subtle{border-color:var(--bs-info-border-subtle) !important}.border-warning-subtle{border-color:var(--bs-warning-border-subtle) !important}.border-danger-subtle{border-color:var(--bs-danger-border-subtle) !important}.border-light-subtle{border-color:var(--bs-light-border-subtle) !important}.border-dark-subtle{border-color:var(--bs-dark-border-subtle) !important}.border-1{border-width:1px !important}.border-2{border-width:2px !important}.border-3{border-width:3px !important}.border-4{border-width:4px !important}.border-5{border-width:5px !important}.border-opacity-10{--bs-border-opacity: 0.1}.border-opacity-25{--bs-border-opacity: 0.25}.border-opacity-50{--bs-border-opacity: 0.5}.border-opacity-75{--bs-border-opacity: 
0.75}.border-opacity-100{--bs-border-opacity: 1}.w-25{width:25% !important}.w-50{width:50% !important}.w-75{width:75% !important}.w-100{width:100% !important}.w-auto{width:auto !important}.mw-100{max-width:100% !important}.vw-100{width:100vw !important}.min-vw-100{min-width:100vw !important}.h-25{height:25% !important}.h-50{height:50% !important}.h-75{height:75% !important}.h-100{height:100% !important}.h-auto{height:auto !important}.mh-100{max-height:100% !important}.vh-100{height:100vh !important}.min-vh-100{min-height:100vh !important}.flex-fill{flex:1 1 auto !important}.flex-row{flex-direction:row !important}.flex-column{flex-direction:column !important}.flex-row-reverse{flex-direction:row-reverse !important}.flex-column-reverse{flex-direction:column-reverse !important}.flex-grow-0{flex-grow:0 !important}.flex-grow-1{flex-grow:1 !important}.flex-shrink-0{flex-shrink:0 !important}.flex-shrink-1{flex-shrink:1 !important}.flex-wrap{flex-wrap:wrap !important}.flex-nowrap{flex-wrap:nowrap !important}.flex-wrap-reverse{flex-wrap:wrap-reverse !important}.justify-content-start{justify-content:flex-start !important}.justify-content-end{justify-content:flex-end !important}.justify-content-center{justify-content:center !important}.justify-content-between{justify-content:space-between !important}.justify-content-around{justify-content:space-around !important}.justify-content-evenly{justify-content:space-evenly !important}.align-items-start{align-items:flex-start !important}.align-items-end{align-items:flex-end !important}.align-items-center{align-items:center !important}.align-items-baseline{align-items:baseline !important}.align-items-stretch{align-items:stretch !important}.align-content-start{align-content:flex-start !important}.align-content-end{align-content:flex-end !important}.align-content-center{align-content:center !important}.align-content-between{align-content:space-between !important}.align-content-around{align-content:space-around 
!important}.align-content-stretch{align-content:stretch !important}.align-self-auto{align-self:auto !important}.align-self-start{align-self:flex-start !important}.align-self-end{align-self:flex-end !important}.align-self-center{align-self:center !important}.align-self-baseline{align-self:baseline !important}.align-self-stretch{align-self:stretch !important}.order-first{order:-1 !important}.order-0{order:0 !important}.order-1{order:1 !important}.order-2{order:2 !important}.order-3{order:3 !important}.order-4{order:4 !important}.order-5{order:5 !important}.order-last{order:6 !important}.m-0{margin:0 !important}.m-1{margin:.25rem !important}.m-2{margin:.5rem !important}.m-3{margin:1rem !important}.m-4{margin:1.5rem !important}.m-5{margin:3rem !important}.m-auto{margin:auto !important}.mx-0{margin-right:0 !important;margin-left:0 !important}.mx-1{margin-right:.25rem !important;margin-left:.25rem !important}.mx-2{margin-right:.5rem !important;margin-left:.5rem !important}.mx-3{margin-right:1rem !important;margin-left:1rem !important}.mx-4{margin-right:1.5rem !important;margin-left:1.5rem !important}.mx-5{margin-right:3rem !important;margin-left:3rem !important}.mx-auto{margin-right:auto !important;margin-left:auto !important}.my-0{margin-top:0 !important;margin-bottom:0 !important}.my-1{margin-top:.25rem !important;margin-bottom:.25rem !important}.my-2{margin-top:.5rem !important;margin-bottom:.5rem !important}.my-3{margin-top:1rem !important;margin-bottom:1rem !important}.my-4{margin-top:1.5rem !important;margin-bottom:1.5rem !important}.my-5{margin-top:3rem !important;margin-bottom:3rem !important}.my-auto{margin-top:auto !important;margin-bottom:auto !important}.mt-0{margin-top:0 !important}.mt-1{margin-top:.25rem !important}.mt-2{margin-top:.5rem !important}.mt-3{margin-top:1rem !important}.mt-4{margin-top:1.5rem !important}.mt-5{margin-top:3rem !important}.mt-auto{margin-top:auto !important}.me-0{margin-right:0 !important}.me-1{margin-right:.25rem 
!important}.me-2{margin-right:.5rem !important}.me-3{margin-right:1rem !important}.me-4{margin-right:1.5rem !important}.me-5{margin-right:3rem !important}.me-auto{margin-right:auto !important}.mb-0{margin-bottom:0 !important}.mb-1{margin-bottom:.25rem !important}.mb-2{margin-bottom:.5rem !important}.mb-3{margin-bottom:1rem !important}.mb-4{margin-bottom:1.5rem !important}.mb-5{margin-bottom:3rem !important}.mb-auto{margin-bottom:auto !important}.ms-0{margin-left:0 !important}.ms-1{margin-left:.25rem !important}.ms-2{margin-left:.5rem !important}.ms-3{margin-left:1rem !important}.ms-4{margin-left:1.5rem !important}.ms-5{margin-left:3rem !important}.ms-auto{margin-left:auto !important}.p-0{padding:0 !important}.p-1{padding:.25rem !important}.p-2{padding:.5rem !important}.p-3{padding:1rem !important}.p-4{padding:1.5rem !important}.p-5{padding:3rem !important}.px-0{padding-right:0 !important;padding-left:0 !important}.px-1{padding-right:.25rem !important;padding-left:.25rem !important}.px-2{padding-right:.5rem !important;padding-left:.5rem !important}.px-3{padding-right:1rem !important;padding-left:1rem !important}.px-4{padding-right:1.5rem !important;padding-left:1.5rem !important}.px-5{padding-right:3rem !important;padding-left:3rem !important}.py-0{padding-top:0 !important;padding-bottom:0 !important}.py-1{padding-top:.25rem !important;padding-bottom:.25rem !important}.py-2{padding-top:.5rem !important;padding-bottom:.5rem !important}.py-3{padding-top:1rem !important;padding-bottom:1rem !important}.py-4{padding-top:1.5rem !important;padding-bottom:1.5rem !important}.py-5{padding-top:3rem !important;padding-bottom:3rem !important}.pt-0{padding-top:0 !important}.pt-1{padding-top:.25rem !important}.pt-2{padding-top:.5rem !important}.pt-3{padding-top:1rem !important}.pt-4{padding-top:1.5rem !important}.pt-5{padding-top:3rem !important}.pe-0{padding-right:0 !important}.pe-1{padding-right:.25rem !important}.pe-2{padding-right:.5rem !important}.pe-3{padding-right:1rem 
!important}.pe-4{padding-right:1.5rem !important}.pe-5{padding-right:3rem !important}.pb-0{padding-bottom:0 !important}.pb-1{padding-bottom:.25rem !important}.pb-2{padding-bottom:.5rem !important}.pb-3{padding-bottom:1rem !important}.pb-4{padding-bottom:1.5rem !important}.pb-5{padding-bottom:3rem !important}.ps-0{padding-left:0 !important}.ps-1{padding-left:.25rem !important}.ps-2{padding-left:.5rem !important}.ps-3{padding-left:1rem !important}.ps-4{padding-left:1.5rem !important}.ps-5{padding-left:3rem !important}.gap-0{gap:0 !important}.gap-1{gap:.25rem !important}.gap-2{gap:.5rem !important}.gap-3{gap:1rem !important}.gap-4{gap:1.5rem !important}.gap-5{gap:3rem !important}.row-gap-0{row-gap:0 !important}.row-gap-1{row-gap:.25rem !important}.row-gap-2{row-gap:.5rem !important}.row-gap-3{row-gap:1rem !important}.row-gap-4{row-gap:1.5rem !important}.row-gap-5{row-gap:3rem !important}.column-gap-0{column-gap:0 !important}.column-gap-1{column-gap:.25rem !important}.column-gap-2{column-gap:.5rem !important}.column-gap-3{column-gap:1rem !important}.column-gap-4{column-gap:1.5rem !important}.column-gap-5{column-gap:3rem !important}.font-monospace{font-family:var(--bs-font-monospace) !important}.fs-1{font-size:calc(1.325rem + 0.9vw) !important}.fs-2{font-size:calc(1.29rem + 0.48vw) !important}.fs-3{font-size:calc(1.27rem + 0.24vw) !important}.fs-4{font-size:1.25rem !important}.fs-5{font-size:1.1rem !important}.fs-6{font-size:1rem !important}.fst-italic{font-style:italic !important}.fst-normal{font-style:normal !important}.fw-lighter{font-weight:lighter !important}.fw-light{font-weight:300 !important}.fw-normal{font-weight:400 !important}.fw-medium{font-weight:500 !important}.fw-semibold{font-weight:600 !important}.fw-bold{font-weight:700 !important}.fw-bolder{font-weight:bolder !important}.lh-1{line-height:1 !important}.lh-sm{line-height:1.25 !important}.lh-base{line-height:1.5 !important}.lh-lg{line-height:2 !important}.text-start{text-align:left 
!important}.text-end{text-align:right !important}.text-center{text-align:center !important}.text-decoration-none{text-decoration:none !important}.text-decoration-underline{text-decoration:underline !important}.text-decoration-line-through{text-decoration:line-through !important}.text-lowercase{text-transform:lowercase !important}.text-uppercase{text-transform:uppercase !important}.text-capitalize{text-transform:capitalize !important}.text-wrap{white-space:normal !important}.text-nowrap{white-space:nowrap !important}.text-break{word-wrap:break-word !important;word-break:break-word !important}.text-default{--bs-text-opacity: 1;color:rgba(var(--bs-default-rgb), var(--bs-text-opacity)) !important}.text-primary{--bs-text-opacity: 1;color:rgba(var(--bs-primary-rgb), var(--bs-text-opacity)) !important}.text-secondary{--bs-text-opacity: 1;color:rgba(var(--bs-secondary-rgb), var(--bs-text-opacity)) !important}.text-success{--bs-text-opacity: 1;color:rgba(var(--bs-success-rgb), var(--bs-text-opacity)) !important}.text-info{--bs-text-opacity: 1;color:rgba(var(--bs-info-rgb), var(--bs-text-opacity)) !important}.text-warning{--bs-text-opacity: 1;color:rgba(var(--bs-warning-rgb), var(--bs-text-opacity)) !important}.text-danger{--bs-text-opacity: 1;color:rgba(var(--bs-danger-rgb), var(--bs-text-opacity)) !important}.text-light{--bs-text-opacity: 1;color:rgba(var(--bs-light-rgb), var(--bs-text-opacity)) !important}.text-dark{--bs-text-opacity: 1;color:rgba(var(--bs-dark-rgb), var(--bs-text-opacity)) !important}.text-black{--bs-text-opacity: 1;color:rgba(var(--bs-black-rgb), var(--bs-text-opacity)) !important}.text-white{--bs-text-opacity: 1;color:rgba(var(--bs-white-rgb), var(--bs-text-opacity)) !important}.text-body{--bs-text-opacity: 1;color:rgba(var(--bs-body-color-rgb), var(--bs-text-opacity)) !important}.text-muted{--bs-text-opacity: 1;color:var(--bs-secondary-color) !important}.text-black-50{--bs-text-opacity: 1;color:rgba(0,0,0,.5) 
!important}.text-white-50{--bs-text-opacity: 1;color:rgba(255,255,255,.5) !important}.text-body-secondary{--bs-text-opacity: 1;color:var(--bs-secondary-color) !important}.text-body-tertiary{--bs-text-opacity: 1;color:var(--bs-tertiary-color) !important}.text-body-emphasis{--bs-text-opacity: 1;color:var(--bs-emphasis-color) !important}.text-reset{--bs-text-opacity: 1;color:inherit !important}.text-opacity-25{--bs-text-opacity: 0.25}.text-opacity-50{--bs-text-opacity: 0.5}.text-opacity-75{--bs-text-opacity: 0.75}.text-opacity-100{--bs-text-opacity: 1}.text-primary-emphasis{color:var(--bs-primary-text-emphasis) !important}.text-secondary-emphasis{color:var(--bs-secondary-text-emphasis) !important}.text-success-emphasis{color:var(--bs-success-text-emphasis) !important}.text-info-emphasis{color:var(--bs-info-text-emphasis) !important}.text-warning-emphasis{color:var(--bs-warning-text-emphasis) !important}.text-danger-emphasis{color:var(--bs-danger-text-emphasis) !important}.text-light-emphasis{color:var(--bs-light-text-emphasis) !important}.text-dark-emphasis{color:var(--bs-dark-text-emphasis) !important}.link-opacity-10{--bs-link-opacity: 0.1}.link-opacity-10-hover:hover{--bs-link-opacity: 0.1}.link-opacity-25{--bs-link-opacity: 0.25}.link-opacity-25-hover:hover{--bs-link-opacity: 0.25}.link-opacity-50{--bs-link-opacity: 0.5}.link-opacity-50-hover:hover{--bs-link-opacity: 0.5}.link-opacity-75{--bs-link-opacity: 0.75}.link-opacity-75-hover:hover{--bs-link-opacity: 0.75}.link-opacity-100{--bs-link-opacity: 1}.link-opacity-100-hover:hover{--bs-link-opacity: 1}.link-offset-1{text-underline-offset:.125em !important}.link-offset-1-hover:hover{text-underline-offset:.125em !important}.link-offset-2{text-underline-offset:.25em !important}.link-offset-2-hover:hover{text-underline-offset:.25em !important}.link-offset-3{text-underline-offset:.375em !important}.link-offset-3-hover:hover{text-underline-offset:.375em !important}.link-underline-default{--bs-link-underline-opacity: 
1;text-decoration-color:rgba(var(--bs-default-rgb), var(--bs-link-underline-opacity)) !important}.link-underline-primary{--bs-link-underline-opacity: 1;text-decoration-color:rgba(var(--bs-primary-rgb), var(--bs-link-underline-opacity)) !important}.link-underline-secondary{--bs-link-underline-opacity: 1;text-decoration-color:rgba(var(--bs-secondary-rgb), var(--bs-link-underline-opacity)) !important}.link-underline-success{--bs-link-underline-opacity: 1;text-decoration-color:rgba(var(--bs-success-rgb), var(--bs-link-underline-opacity)) !important}.link-underline-info{--bs-link-underline-opacity: 1;text-decoration-color:rgba(var(--bs-info-rgb), var(--bs-link-underline-opacity)) !important}.link-underline-warning{--bs-link-underline-opacity: 1;text-decoration-color:rgba(var(--bs-warning-rgb), var(--bs-link-underline-opacity)) !important}.link-underline-danger{--bs-link-underline-opacity: 1;text-decoration-color:rgba(var(--bs-danger-rgb), var(--bs-link-underline-opacity)) !important}.link-underline-light{--bs-link-underline-opacity: 1;text-decoration-color:rgba(var(--bs-light-rgb), var(--bs-link-underline-opacity)) !important}.link-underline-dark{--bs-link-underline-opacity: 1;text-decoration-color:rgba(var(--bs-dark-rgb), var(--bs-link-underline-opacity)) !important}.link-underline{--bs-link-underline-opacity: 1;text-decoration-color:rgba(var(--bs-link-color-rgb), var(--bs-link-underline-opacity, 1)) !important}.link-underline-opacity-0{--bs-link-underline-opacity: 0}.link-underline-opacity-0-hover:hover{--bs-link-underline-opacity: 0}.link-underline-opacity-10{--bs-link-underline-opacity: 0.1}.link-underline-opacity-10-hover:hover{--bs-link-underline-opacity: 0.1}.link-underline-opacity-25{--bs-link-underline-opacity: 0.25}.link-underline-opacity-25-hover:hover{--bs-link-underline-opacity: 0.25}.link-underline-opacity-50{--bs-link-underline-opacity: 0.5}.link-underline-opacity-50-hover:hover{--bs-link-underline-opacity: 
0.5}.link-underline-opacity-75{--bs-link-underline-opacity: 0.75}.link-underline-opacity-75-hover:hover{--bs-link-underline-opacity: 0.75}.link-underline-opacity-100{--bs-link-underline-opacity: 1}.link-underline-opacity-100-hover:hover{--bs-link-underline-opacity: 1}.bg-default{--bs-bg-opacity: 1;background-color:rgba(var(--bs-default-rgb), var(--bs-bg-opacity)) !important}.bg-primary{--bs-bg-opacity: 1;background-color:rgba(var(--bs-primary-rgb), var(--bs-bg-opacity)) !important}.bg-secondary{--bs-bg-opacity: 1;background-color:rgba(var(--bs-secondary-rgb), var(--bs-bg-opacity)) !important}.bg-success{--bs-bg-opacity: 1;background-color:rgba(var(--bs-success-rgb), var(--bs-bg-opacity)) !important}.bg-info{--bs-bg-opacity: 1;background-color:rgba(var(--bs-info-rgb), var(--bs-bg-opacity)) !important}.bg-warning{--bs-bg-opacity: 1;background-color:rgba(var(--bs-warning-rgb), var(--bs-bg-opacity)) !important}.bg-danger{--bs-bg-opacity: 1;background-color:rgba(var(--bs-danger-rgb), var(--bs-bg-opacity)) !important}.bg-light{--bs-bg-opacity: 1;background-color:rgba(var(--bs-light-rgb), var(--bs-bg-opacity)) !important}.bg-dark{--bs-bg-opacity: 1;background-color:rgba(var(--bs-dark-rgb), var(--bs-bg-opacity)) !important}.bg-black{--bs-bg-opacity: 1;background-color:rgba(var(--bs-black-rgb), var(--bs-bg-opacity)) !important}.bg-white{--bs-bg-opacity: 1;background-color:rgba(var(--bs-white-rgb), var(--bs-bg-opacity)) !important}.bg-body{--bs-bg-opacity: 1;background-color:rgba(var(--bs-body-bg-rgb), var(--bs-bg-opacity)) !important}.bg-transparent{--bs-bg-opacity: 1;background-color:rgba(0,0,0,0) !important}.bg-body-secondary{--bs-bg-opacity: 1;background-color:rgba(var(--bs-secondary-bg-rgb), var(--bs-bg-opacity)) !important}.bg-body-tertiary{--bs-bg-opacity: 1;background-color:rgba(var(--bs-tertiary-bg-rgb), var(--bs-bg-opacity)) !important}.bg-opacity-10{--bs-bg-opacity: 0.1}.bg-opacity-25{--bs-bg-opacity: 0.25}.bg-opacity-50{--bs-bg-opacity: 
0.5}.bg-opacity-75{--bs-bg-opacity: 0.75}.bg-opacity-100{--bs-bg-opacity: 1}.bg-primary-subtle{background-color:var(--bs-primary-bg-subtle) !important}.bg-secondary-subtle{background-color:var(--bs-secondary-bg-subtle) !important}.bg-success-subtle{background-color:var(--bs-success-bg-subtle) !important}.bg-info-subtle{background-color:var(--bs-info-bg-subtle) !important}.bg-warning-subtle{background-color:var(--bs-warning-bg-subtle) !important}.bg-danger-subtle{background-color:var(--bs-danger-bg-subtle) !important}.bg-light-subtle{background-color:var(--bs-light-bg-subtle) !important}.bg-dark-subtle{background-color:var(--bs-dark-bg-subtle) !important}.bg-gradient{background-image:var(--bs-gradient) !important}.user-select-all{user-select:all !important}.user-select-auto{user-select:auto !important}.user-select-none{user-select:none !important}.pe-none{pointer-events:none !important}.pe-auto{pointer-events:auto !important}.rounded{border-radius:var(--bs-border-radius) !important}.rounded-0{border-radius:0 !important}.rounded-1{border-radius:var(--bs-border-radius-sm) !important}.rounded-2{border-radius:var(--bs-border-radius) !important}.rounded-3{border-radius:var(--bs-border-radius-lg) !important}.rounded-4{border-radius:var(--bs-border-radius-xl) !important}.rounded-5{border-radius:var(--bs-border-radius-xxl) !important}.rounded-circle{border-radius:50% !important}.rounded-pill{border-radius:var(--bs-border-radius-pill) !important}.rounded-top{border-top-left-radius:var(--bs-border-radius) !important;border-top-right-radius:var(--bs-border-radius) !important}.rounded-top-0{border-top-left-radius:0 !important;border-top-right-radius:0 !important}.rounded-top-1{border-top-left-radius:var(--bs-border-radius-sm) !important;border-top-right-radius:var(--bs-border-radius-sm) !important}.rounded-top-2{border-top-left-radius:var(--bs-border-radius) !important;border-top-right-radius:var(--bs-border-radius) 
!important}.rounded-top-3{border-top-left-radius:var(--bs-border-radius-lg) !important;border-top-right-radius:var(--bs-border-radius-lg) !important}.rounded-top-4{border-top-left-radius:var(--bs-border-radius-xl) !important;border-top-right-radius:var(--bs-border-radius-xl) !important}.rounded-top-5{border-top-left-radius:var(--bs-border-radius-xxl) !important;border-top-right-radius:var(--bs-border-radius-xxl) !important}.rounded-top-circle{border-top-left-radius:50% !important;border-top-right-radius:50% !important}.rounded-top-pill{border-top-left-radius:var(--bs-border-radius-pill) !important;border-top-right-radius:var(--bs-border-radius-pill) !important}.rounded-end{border-top-right-radius:var(--bs-border-radius) !important;border-bottom-right-radius:var(--bs-border-radius) !important}.rounded-end-0{border-top-right-radius:0 !important;border-bottom-right-radius:0 !important}.rounded-end-1{border-top-right-radius:var(--bs-border-radius-sm) !important;border-bottom-right-radius:var(--bs-border-radius-sm) !important}.rounded-end-2{border-top-right-radius:var(--bs-border-radius) !important;border-bottom-right-radius:var(--bs-border-radius) !important}.rounded-end-3{border-top-right-radius:var(--bs-border-radius-lg) !important;border-bottom-right-radius:var(--bs-border-radius-lg) !important}.rounded-end-4{border-top-right-radius:var(--bs-border-radius-xl) !important;border-bottom-right-radius:var(--bs-border-radius-xl) !important}.rounded-end-5{border-top-right-radius:var(--bs-border-radius-xxl) !important;border-bottom-right-radius:var(--bs-border-radius-xxl) !important}.rounded-end-circle{border-top-right-radius:50% !important;border-bottom-right-radius:50% !important}.rounded-end-pill{border-top-right-radius:var(--bs-border-radius-pill) !important;border-bottom-right-radius:var(--bs-border-radius-pill) !important}.rounded-bottom{border-bottom-right-radius:var(--bs-border-radius) !important;border-bottom-left-radius:var(--bs-border-radius) 
!important}.rounded-bottom-0{border-bottom-right-radius:0 !important;border-bottom-left-radius:0 !important}.rounded-bottom-1{border-bottom-right-radius:var(--bs-border-radius-sm) !important;border-bottom-left-radius:var(--bs-border-radius-sm) !important}.rounded-bottom-2{border-bottom-right-radius:var(--bs-border-radius) !important;border-bottom-left-radius:var(--bs-border-radius) !important}.rounded-bottom-3{border-bottom-right-radius:var(--bs-border-radius-lg) !important;border-bottom-left-radius:var(--bs-border-radius-lg) !important}.rounded-bottom-4{border-bottom-right-radius:var(--bs-border-radius-xl) !important;border-bottom-left-radius:var(--bs-border-radius-xl) !important}.rounded-bottom-5{border-bottom-right-radius:var(--bs-border-radius-xxl) !important;border-bottom-left-radius:var(--bs-border-radius-xxl) !important}.rounded-bottom-circle{border-bottom-right-radius:50% !important;border-bottom-left-radius:50% !important}.rounded-bottom-pill{border-bottom-right-radius:var(--bs-border-radius-pill) !important;border-bottom-left-radius:var(--bs-border-radius-pill) !important}.rounded-start{border-bottom-left-radius:var(--bs-border-radius) !important;border-top-left-radius:var(--bs-border-radius) !important}.rounded-start-0{border-bottom-left-radius:0 !important;border-top-left-radius:0 !important}.rounded-start-1{border-bottom-left-radius:var(--bs-border-radius-sm) !important;border-top-left-radius:var(--bs-border-radius-sm) !important}.rounded-start-2{border-bottom-left-radius:var(--bs-border-radius) !important;border-top-left-radius:var(--bs-border-radius) !important}.rounded-start-3{border-bottom-left-radius:var(--bs-border-radius-lg) !important;border-top-left-radius:var(--bs-border-radius-lg) !important}.rounded-start-4{border-bottom-left-radius:var(--bs-border-radius-xl) !important;border-top-left-radius:var(--bs-border-radius-xl) !important}.rounded-start-5{border-bottom-left-radius:var(--bs-border-radius-xxl) 
!important;border-top-left-radius:var(--bs-border-radius-xxl) !important}.rounded-start-circle{border-bottom-left-radius:50% !important;border-top-left-radius:50% !important}.rounded-start-pill{border-bottom-left-radius:var(--bs-border-radius-pill) !important;border-top-left-radius:var(--bs-border-radius-pill) !important}.visible{visibility:visible !important}.invisible{visibility:hidden !important}.z-n1{z-index:-1 !important}.z-0{z-index:0 !important}.z-1{z-index:1 !important}.z-2{z-index:2 !important}.z-3{z-index:3 !important}@media(min-width: 576px){.float-sm-start{float:left !important}.float-sm-end{float:right !important}.float-sm-none{float:none !important}.object-fit-sm-contain{object-fit:contain !important}.object-fit-sm-cover{object-fit:cover !important}.object-fit-sm-fill{object-fit:fill !important}.object-fit-sm-scale{object-fit:scale-down !important}.object-fit-sm-none{object-fit:none !important}.d-sm-inline{display:inline !important}.d-sm-inline-block{display:inline-block !important}.d-sm-block{display:block !important}.d-sm-grid{display:grid !important}.d-sm-inline-grid{display:inline-grid !important}.d-sm-table{display:table !important}.d-sm-table-row{display:table-row !important}.d-sm-table-cell{display:table-cell !important}.d-sm-flex{display:flex !important}.d-sm-inline-flex{display:inline-flex !important}.d-sm-none{display:none !important}.flex-sm-fill{flex:1 1 auto !important}.flex-sm-row{flex-direction:row !important}.flex-sm-column{flex-direction:column !important}.flex-sm-row-reverse{flex-direction:row-reverse !important}.flex-sm-column-reverse{flex-direction:column-reverse !important}.flex-sm-grow-0{flex-grow:0 !important}.flex-sm-grow-1{flex-grow:1 !important}.flex-sm-shrink-0{flex-shrink:0 !important}.flex-sm-shrink-1{flex-shrink:1 !important}.flex-sm-wrap{flex-wrap:wrap !important}.flex-sm-nowrap{flex-wrap:nowrap !important}.flex-sm-wrap-reverse{flex-wrap:wrap-reverse !important}.justify-content-sm-start{justify-content:flex-start 
!important}.justify-content-sm-end{justify-content:flex-end !important}.justify-content-sm-center{justify-content:center !important}.justify-content-sm-between{justify-content:space-between !important}.justify-content-sm-around{justify-content:space-around !important}.justify-content-sm-evenly{justify-content:space-evenly !important}.align-items-sm-start{align-items:flex-start !important}.align-items-sm-end{align-items:flex-end !important}.align-items-sm-center{align-items:center !important}.align-items-sm-baseline{align-items:baseline !important}.align-items-sm-stretch{align-items:stretch !important}.align-content-sm-start{align-content:flex-start !important}.align-content-sm-end{align-content:flex-end !important}.align-content-sm-center{align-content:center !important}.align-content-sm-between{align-content:space-between !important}.align-content-sm-around{align-content:space-around !important}.align-content-sm-stretch{align-content:stretch !important}.align-self-sm-auto{align-self:auto !important}.align-self-sm-start{align-self:flex-start !important}.align-self-sm-end{align-self:flex-end !important}.align-self-sm-center{align-self:center !important}.align-self-sm-baseline{align-self:baseline !important}.align-self-sm-stretch{align-self:stretch !important}.order-sm-first{order:-1 !important}.order-sm-0{order:0 !important}.order-sm-1{order:1 !important}.order-sm-2{order:2 !important}.order-sm-3{order:3 !important}.order-sm-4{order:4 !important}.order-sm-5{order:5 !important}.order-sm-last{order:6 !important}.m-sm-0{margin:0 !important}.m-sm-1{margin:.25rem !important}.m-sm-2{margin:.5rem !important}.m-sm-3{margin:1rem !important}.m-sm-4{margin:1.5rem !important}.m-sm-5{margin:3rem !important}.m-sm-auto{margin:auto !important}.mx-sm-0{margin-right:0 !important;margin-left:0 !important}.mx-sm-1{margin-right:.25rem !important;margin-left:.25rem !important}.mx-sm-2{margin-right:.5rem !important;margin-left:.5rem !important}.mx-sm-3{margin-right:1rem 
!important;margin-left:1rem !important}.mx-sm-4{margin-right:1.5rem !important;margin-left:1.5rem !important}.mx-sm-5{margin-right:3rem !important;margin-left:3rem !important}.mx-sm-auto{margin-right:auto !important;margin-left:auto !important}.my-sm-0{margin-top:0 !important;margin-bottom:0 !important}.my-sm-1{margin-top:.25rem !important;margin-bottom:.25rem !important}.my-sm-2{margin-top:.5rem !important;margin-bottom:.5rem !important}.my-sm-3{margin-top:1rem !important;margin-bottom:1rem !important}.my-sm-4{margin-top:1.5rem !important;margin-bottom:1.5rem !important}.my-sm-5{margin-top:3rem !important;margin-bottom:3rem !important}.my-sm-auto{margin-top:auto !important;margin-bottom:auto !important}.mt-sm-0{margin-top:0 !important}.mt-sm-1{margin-top:.25rem !important}.mt-sm-2{margin-top:.5rem !important}.mt-sm-3{margin-top:1rem !important}.mt-sm-4{margin-top:1.5rem !important}.mt-sm-5{margin-top:3rem !important}.mt-sm-auto{margin-top:auto !important}.me-sm-0{margin-right:0 !important}.me-sm-1{margin-right:.25rem !important}.me-sm-2{margin-right:.5rem !important}.me-sm-3{margin-right:1rem !important}.me-sm-4{margin-right:1.5rem !important}.me-sm-5{margin-right:3rem !important}.me-sm-auto{margin-right:auto !important}.mb-sm-0{margin-bottom:0 !important}.mb-sm-1{margin-bottom:.25rem !important}.mb-sm-2{margin-bottom:.5rem !important}.mb-sm-3{margin-bottom:1rem !important}.mb-sm-4{margin-bottom:1.5rem !important}.mb-sm-5{margin-bottom:3rem !important}.mb-sm-auto{margin-bottom:auto !important}.ms-sm-0{margin-left:0 !important}.ms-sm-1{margin-left:.25rem !important}.ms-sm-2{margin-left:.5rem !important}.ms-sm-3{margin-left:1rem !important}.ms-sm-4{margin-left:1.5rem !important}.ms-sm-5{margin-left:3rem !important}.ms-sm-auto{margin-left:auto !important}.p-sm-0{padding:0 !important}.p-sm-1{padding:.25rem !important}.p-sm-2{padding:.5rem !important}.p-sm-3{padding:1rem !important}.p-sm-4{padding:1.5rem !important}.p-sm-5{padding:3rem 
!important}.px-sm-0{padding-right:0 !important;padding-left:0 !important}.px-sm-1{padding-right:.25rem !important;padding-left:.25rem !important}.px-sm-2{padding-right:.5rem !important;padding-left:.5rem !important}.px-sm-3{padding-right:1rem !important;padding-left:1rem !important}.px-sm-4{padding-right:1.5rem !important;padding-left:1.5rem !important}.px-sm-5{padding-right:3rem !important;padding-left:3rem !important}.py-sm-0{padding-top:0 !important;padding-bottom:0 !important}.py-sm-1{padding-top:.25rem !important;padding-bottom:.25rem !important}.py-sm-2{padding-top:.5rem !important;padding-bottom:.5rem !important}.py-sm-3{padding-top:1rem !important;padding-bottom:1rem !important}.py-sm-4{padding-top:1.5rem !important;padding-bottom:1.5rem !important}.py-sm-5{padding-top:3rem !important;padding-bottom:3rem !important}.pt-sm-0{padding-top:0 !important}.pt-sm-1{padding-top:.25rem !important}.pt-sm-2{padding-top:.5rem !important}.pt-sm-3{padding-top:1rem !important}.pt-sm-4{padding-top:1.5rem !important}.pt-sm-5{padding-top:3rem !important}.pe-sm-0{padding-right:0 !important}.pe-sm-1{padding-right:.25rem !important}.pe-sm-2{padding-right:.5rem !important}.pe-sm-3{padding-right:1rem !important}.pe-sm-4{padding-right:1.5rem !important}.pe-sm-5{padding-right:3rem !important}.pb-sm-0{padding-bottom:0 !important}.pb-sm-1{padding-bottom:.25rem !important}.pb-sm-2{padding-bottom:.5rem !important}.pb-sm-3{padding-bottom:1rem !important}.pb-sm-4{padding-bottom:1.5rem !important}.pb-sm-5{padding-bottom:3rem !important}.ps-sm-0{padding-left:0 !important}.ps-sm-1{padding-left:.25rem !important}.ps-sm-2{padding-left:.5rem !important}.ps-sm-3{padding-left:1rem !important}.ps-sm-4{padding-left:1.5rem !important}.ps-sm-5{padding-left:3rem !important}.gap-sm-0{gap:0 !important}.gap-sm-1{gap:.25rem !important}.gap-sm-2{gap:.5rem !important}.gap-sm-3{gap:1rem !important}.gap-sm-4{gap:1.5rem !important}.gap-sm-5{gap:3rem !important}.row-gap-sm-0{row-gap:0 
!important}.row-gap-sm-1{row-gap:.25rem !important}.row-gap-sm-2{row-gap:.5rem !important}.row-gap-sm-3{row-gap:1rem !important}.row-gap-sm-4{row-gap:1.5rem !important}.row-gap-sm-5{row-gap:3rem !important}.column-gap-sm-0{column-gap:0 !important}.column-gap-sm-1{column-gap:.25rem !important}.column-gap-sm-2{column-gap:.5rem !important}.column-gap-sm-3{column-gap:1rem !important}.column-gap-sm-4{column-gap:1.5rem !important}.column-gap-sm-5{column-gap:3rem !important}.text-sm-start{text-align:left !important}.text-sm-end{text-align:right !important}.text-sm-center{text-align:center !important}}@media(min-width: 768px){.float-md-start{float:left !important}.float-md-end{float:right !important}.float-md-none{float:none !important}.object-fit-md-contain{object-fit:contain !important}.object-fit-md-cover{object-fit:cover !important}.object-fit-md-fill{object-fit:fill !important}.object-fit-md-scale{object-fit:scale-down !important}.object-fit-md-none{object-fit:none !important}.d-md-inline{display:inline !important}.d-md-inline-block{display:inline-block !important}.d-md-block{display:block !important}.d-md-grid{display:grid !important}.d-md-inline-grid{display:inline-grid !important}.d-md-table{display:table !important}.d-md-table-row{display:table-row !important}.d-md-table-cell{display:table-cell !important}.d-md-flex{display:flex !important}.d-md-inline-flex{display:inline-flex !important}.d-md-none{display:none !important}.flex-md-fill{flex:1 1 auto !important}.flex-md-row{flex-direction:row !important}.flex-md-column{flex-direction:column !important}.flex-md-row-reverse{flex-direction:row-reverse !important}.flex-md-column-reverse{flex-direction:column-reverse !important}.flex-md-grow-0{flex-grow:0 !important}.flex-md-grow-1{flex-grow:1 !important}.flex-md-shrink-0{flex-shrink:0 !important}.flex-md-shrink-1{flex-shrink:1 !important}.flex-md-wrap{flex-wrap:wrap !important}.flex-md-nowrap{flex-wrap:nowrap !important}.flex-md-wrap-reverse{flex-wrap:wrap-reverse 
!important}.justify-content-md-start{justify-content:flex-start !important}.justify-content-md-end{justify-content:flex-end !important}.justify-content-md-center{justify-content:center !important}.justify-content-md-between{justify-content:space-between !important}.justify-content-md-around{justify-content:space-around !important}.justify-content-md-evenly{justify-content:space-evenly !important}.align-items-md-start{align-items:flex-start !important}.align-items-md-end{align-items:flex-end !important}.align-items-md-center{align-items:center !important}.align-items-md-baseline{align-items:baseline !important}.align-items-md-stretch{align-items:stretch !important}.align-content-md-start{align-content:flex-start !important}.align-content-md-end{align-content:flex-end !important}.align-content-md-center{align-content:center !important}.align-content-md-between{align-content:space-between !important}.align-content-md-around{align-content:space-around !important}.align-content-md-stretch{align-content:stretch !important}.align-self-md-auto{align-self:auto !important}.align-self-md-start{align-self:flex-start !important}.align-self-md-end{align-self:flex-end !important}.align-self-md-center{align-self:center !important}.align-self-md-baseline{align-self:baseline !important}.align-self-md-stretch{align-self:stretch !important}.order-md-first{order:-1 !important}.order-md-0{order:0 !important}.order-md-1{order:1 !important}.order-md-2{order:2 !important}.order-md-3{order:3 !important}.order-md-4{order:4 !important}.order-md-5{order:5 !important}.order-md-last{order:6 !important}.m-md-0{margin:0 !important}.m-md-1{margin:.25rem !important}.m-md-2{margin:.5rem !important}.m-md-3{margin:1rem !important}.m-md-4{margin:1.5rem !important}.m-md-5{margin:3rem !important}.m-md-auto{margin:auto !important}.mx-md-0{margin-right:0 !important;margin-left:0 !important}.mx-md-1{margin-right:.25rem !important;margin-left:.25rem !important}.mx-md-2{margin-right:.5rem 
!important;margin-left:.5rem !important}.mx-md-3{margin-right:1rem !important;margin-left:1rem !important}.mx-md-4{margin-right:1.5rem !important;margin-left:1.5rem !important}.mx-md-5{margin-right:3rem !important;margin-left:3rem !important}.mx-md-auto{margin-right:auto !important;margin-left:auto !important}.my-md-0{margin-top:0 !important;margin-bottom:0 !important}.my-md-1{margin-top:.25rem !important;margin-bottom:.25rem !important}.my-md-2{margin-top:.5rem !important;margin-bottom:.5rem !important}.my-md-3{margin-top:1rem !important;margin-bottom:1rem !important}.my-md-4{margin-top:1.5rem !important;margin-bottom:1.5rem !important}.my-md-5{margin-top:3rem !important;margin-bottom:3rem !important}.my-md-auto{margin-top:auto !important;margin-bottom:auto !important}.mt-md-0{margin-top:0 !important}.mt-md-1{margin-top:.25rem !important}.mt-md-2{margin-top:.5rem !important}.mt-md-3{margin-top:1rem !important}.mt-md-4{margin-top:1.5rem !important}.mt-md-5{margin-top:3rem !important}.mt-md-auto{margin-top:auto !important}.me-md-0{margin-right:0 !important}.me-md-1{margin-right:.25rem !important}.me-md-2{margin-right:.5rem !important}.me-md-3{margin-right:1rem !important}.me-md-4{margin-right:1.5rem !important}.me-md-5{margin-right:3rem !important}.me-md-auto{margin-right:auto !important}.mb-md-0{margin-bottom:0 !important}.mb-md-1{margin-bottom:.25rem !important}.mb-md-2{margin-bottom:.5rem !important}.mb-md-3{margin-bottom:1rem !important}.mb-md-4{margin-bottom:1.5rem !important}.mb-md-5{margin-bottom:3rem !important}.mb-md-auto{margin-bottom:auto !important}.ms-md-0{margin-left:0 !important}.ms-md-1{margin-left:.25rem !important}.ms-md-2{margin-left:.5rem !important}.ms-md-3{margin-left:1rem !important}.ms-md-4{margin-left:1.5rem !important}.ms-md-5{margin-left:3rem !important}.ms-md-auto{margin-left:auto !important}.p-md-0{padding:0 !important}.p-md-1{padding:.25rem !important}.p-md-2{padding:.5rem !important}.p-md-3{padding:1rem 
!important}.p-md-4{padding:1.5rem !important}.p-md-5{padding:3rem !important}.px-md-0{padding-right:0 !important;padding-left:0 !important}.px-md-1{padding-right:.25rem !important;padding-left:.25rem !important}.px-md-2{padding-right:.5rem !important;padding-left:.5rem !important}.px-md-3{padding-right:1rem !important;padding-left:1rem !important}.px-md-4{padding-right:1.5rem !important;padding-left:1.5rem !important}.px-md-5{padding-right:3rem !important;padding-left:3rem !important}.py-md-0{padding-top:0 !important;padding-bottom:0 !important}.py-md-1{padding-top:.25rem !important;padding-bottom:.25rem !important}.py-md-2{padding-top:.5rem !important;padding-bottom:.5rem !important}.py-md-3{padding-top:1rem !important;padding-bottom:1rem !important}.py-md-4{padding-top:1.5rem !important;padding-bottom:1.5rem !important}.py-md-5{padding-top:3rem !important;padding-bottom:3rem !important}.pt-md-0{padding-top:0 !important}.pt-md-1{padding-top:.25rem !important}.pt-md-2{padding-top:.5rem !important}.pt-md-3{padding-top:1rem !important}.pt-md-4{padding-top:1.5rem !important}.pt-md-5{padding-top:3rem !important}.pe-md-0{padding-right:0 !important}.pe-md-1{padding-right:.25rem !important}.pe-md-2{padding-right:.5rem !important}.pe-md-3{padding-right:1rem !important}.pe-md-4{padding-right:1.5rem !important}.pe-md-5{padding-right:3rem !important}.pb-md-0{padding-bottom:0 !important}.pb-md-1{padding-bottom:.25rem !important}.pb-md-2{padding-bottom:.5rem !important}.pb-md-3{padding-bottom:1rem !important}.pb-md-4{padding-bottom:1.5rem !important}.pb-md-5{padding-bottom:3rem !important}.ps-md-0{padding-left:0 !important}.ps-md-1{padding-left:.25rem !important}.ps-md-2{padding-left:.5rem !important}.ps-md-3{padding-left:1rem !important}.ps-md-4{padding-left:1.5rem !important}.ps-md-5{padding-left:3rem !important}.gap-md-0{gap:0 !important}.gap-md-1{gap:.25rem !important}.gap-md-2{gap:.5rem !important}.gap-md-3{gap:1rem !important}.gap-md-4{gap:1.5rem 
!important}.gap-md-5{gap:3rem !important}.row-gap-md-0{row-gap:0 !important}.row-gap-md-1{row-gap:.25rem !important}.row-gap-md-2{row-gap:.5rem !important}.row-gap-md-3{row-gap:1rem !important}.row-gap-md-4{row-gap:1.5rem !important}.row-gap-md-5{row-gap:3rem !important}.column-gap-md-0{column-gap:0 !important}.column-gap-md-1{column-gap:.25rem !important}.column-gap-md-2{column-gap:.5rem !important}.column-gap-md-3{column-gap:1rem !important}.column-gap-md-4{column-gap:1.5rem !important}.column-gap-md-5{column-gap:3rem !important}.text-md-start{text-align:left !important}.text-md-end{text-align:right !important}.text-md-center{text-align:center !important}}@media(min-width: 992px){.float-lg-start{float:left !important}.float-lg-end{float:right !important}.float-lg-none{float:none !important}.object-fit-lg-contain{object-fit:contain !important}.object-fit-lg-cover{object-fit:cover !important}.object-fit-lg-fill{object-fit:fill !important}.object-fit-lg-scale{object-fit:scale-down !important}.object-fit-lg-none{object-fit:none !important}.d-lg-inline{display:inline !important}.d-lg-inline-block{display:inline-block !important}.d-lg-block{display:block !important}.d-lg-grid{display:grid !important}.d-lg-inline-grid{display:inline-grid !important}.d-lg-table{display:table !important}.d-lg-table-row{display:table-row !important}.d-lg-table-cell{display:table-cell !important}.d-lg-flex{display:flex !important}.d-lg-inline-flex{display:inline-flex !important}.d-lg-none{display:none !important}.flex-lg-fill{flex:1 1 auto !important}.flex-lg-row{flex-direction:row !important}.flex-lg-column{flex-direction:column !important}.flex-lg-row-reverse{flex-direction:row-reverse !important}.flex-lg-column-reverse{flex-direction:column-reverse !important}.flex-lg-grow-0{flex-grow:0 !important}.flex-lg-grow-1{flex-grow:1 !important}.flex-lg-shrink-0{flex-shrink:0 !important}.flex-lg-shrink-1{flex-shrink:1 !important}.flex-lg-wrap{flex-wrap:wrap 
!important}.flex-lg-nowrap{flex-wrap:nowrap !important}.flex-lg-wrap-reverse{flex-wrap:wrap-reverse !important}.justify-content-lg-start{justify-content:flex-start !important}.justify-content-lg-end{justify-content:flex-end !important}.justify-content-lg-center{justify-content:center !important}.justify-content-lg-between{justify-content:space-between !important}.justify-content-lg-around{justify-content:space-around !important}.justify-content-lg-evenly{justify-content:space-evenly !important}.align-items-lg-start{align-items:flex-start !important}.align-items-lg-end{align-items:flex-end !important}.align-items-lg-center{align-items:center !important}.align-items-lg-baseline{align-items:baseline !important}.align-items-lg-stretch{align-items:stretch !important}.align-content-lg-start{align-content:flex-start !important}.align-content-lg-end{align-content:flex-end !important}.align-content-lg-center{align-content:center !important}.align-content-lg-between{align-content:space-between !important}.align-content-lg-around{align-content:space-around !important}.align-content-lg-stretch{align-content:stretch !important}.align-self-lg-auto{align-self:auto !important}.align-self-lg-start{align-self:flex-start !important}.align-self-lg-end{align-self:flex-end !important}.align-self-lg-center{align-self:center !important}.align-self-lg-baseline{align-self:baseline !important}.align-self-lg-stretch{align-self:stretch !important}.order-lg-first{order:-1 !important}.order-lg-0{order:0 !important}.order-lg-1{order:1 !important}.order-lg-2{order:2 !important}.order-lg-3{order:3 !important}.order-lg-4{order:4 !important}.order-lg-5{order:5 !important}.order-lg-last{order:6 !important}.m-lg-0{margin:0 !important}.m-lg-1{margin:.25rem !important}.m-lg-2{margin:.5rem !important}.m-lg-3{margin:1rem !important}.m-lg-4{margin:1.5rem !important}.m-lg-5{margin:3rem !important}.m-lg-auto{margin:auto !important}.mx-lg-0{margin-right:0 !important;margin-left:0 
!important}.mx-lg-1{margin-right:.25rem !important;margin-left:.25rem !important}.mx-lg-2{margin-right:.5rem !important;margin-left:.5rem !important}.mx-lg-3{margin-right:1rem !important;margin-left:1rem !important}.mx-lg-4{margin-right:1.5rem !important;margin-left:1.5rem !important}.mx-lg-5{margin-right:3rem !important;margin-left:3rem !important}.mx-lg-auto{margin-right:auto !important;margin-left:auto !important}.my-lg-0{margin-top:0 !important;margin-bottom:0 !important}.my-lg-1{margin-top:.25rem !important;margin-bottom:.25rem !important}.my-lg-2{margin-top:.5rem !important;margin-bottom:.5rem !important}.my-lg-3{margin-top:1rem !important;margin-bottom:1rem !important}.my-lg-4{margin-top:1.5rem !important;margin-bottom:1.5rem !important}.my-lg-5{margin-top:3rem !important;margin-bottom:3rem !important}.my-lg-auto{margin-top:auto !important;margin-bottom:auto !important}.mt-lg-0{margin-top:0 !important}.mt-lg-1{margin-top:.25rem !important}.mt-lg-2{margin-top:.5rem !important}.mt-lg-3{margin-top:1rem !important}.mt-lg-4{margin-top:1.5rem !important}.mt-lg-5{margin-top:3rem !important}.mt-lg-auto{margin-top:auto !important}.me-lg-0{margin-right:0 !important}.me-lg-1{margin-right:.25rem !important}.me-lg-2{margin-right:.5rem !important}.me-lg-3{margin-right:1rem !important}.me-lg-4{margin-right:1.5rem !important}.me-lg-5{margin-right:3rem !important}.me-lg-auto{margin-right:auto !important}.mb-lg-0{margin-bottom:0 !important}.mb-lg-1{margin-bottom:.25rem !important}.mb-lg-2{margin-bottom:.5rem !important}.mb-lg-3{margin-bottom:1rem !important}.mb-lg-4{margin-bottom:1.5rem !important}.mb-lg-5{margin-bottom:3rem !important}.mb-lg-auto{margin-bottom:auto !important}.ms-lg-0{margin-left:0 !important}.ms-lg-1{margin-left:.25rem !important}.ms-lg-2{margin-left:.5rem !important}.ms-lg-3{margin-left:1rem !important}.ms-lg-4{margin-left:1.5rem !important}.ms-lg-5{margin-left:3rem !important}.ms-lg-auto{margin-left:auto !important}.p-lg-0{padding:0 
!important}.p-lg-1{padding:.25rem !important}.p-lg-2{padding:.5rem !important}.p-lg-3{padding:1rem !important}.p-lg-4{padding:1.5rem !important}.p-lg-5{padding:3rem !important}.px-lg-0{padding-right:0 !important;padding-left:0 !important}.px-lg-1{padding-right:.25rem !important;padding-left:.25rem !important}.px-lg-2{padding-right:.5rem !important;padding-left:.5rem !important}.px-lg-3{padding-right:1rem !important;padding-left:1rem !important}.px-lg-4{padding-right:1.5rem !important;padding-left:1.5rem !important}.px-lg-5{padding-right:3rem !important;padding-left:3rem !important}.py-lg-0{padding-top:0 !important;padding-bottom:0 !important}.py-lg-1{padding-top:.25rem !important;padding-bottom:.25rem !important}.py-lg-2{padding-top:.5rem !important;padding-bottom:.5rem !important}.py-lg-3{padding-top:1rem !important;padding-bottom:1rem !important}.py-lg-4{padding-top:1.5rem !important;padding-bottom:1.5rem !important}.py-lg-5{padding-top:3rem !important;padding-bottom:3rem !important}.pt-lg-0{padding-top:0 !important}.pt-lg-1{padding-top:.25rem !important}.pt-lg-2{padding-top:.5rem !important}.pt-lg-3{padding-top:1rem !important}.pt-lg-4{padding-top:1.5rem !important}.pt-lg-5{padding-top:3rem !important}.pe-lg-0{padding-right:0 !important}.pe-lg-1{padding-right:.25rem !important}.pe-lg-2{padding-right:.5rem !important}.pe-lg-3{padding-right:1rem !important}.pe-lg-4{padding-right:1.5rem !important}.pe-lg-5{padding-right:3rem !important}.pb-lg-0{padding-bottom:0 !important}.pb-lg-1{padding-bottom:.25rem !important}.pb-lg-2{padding-bottom:.5rem !important}.pb-lg-3{padding-bottom:1rem !important}.pb-lg-4{padding-bottom:1.5rem !important}.pb-lg-5{padding-bottom:3rem !important}.ps-lg-0{padding-left:0 !important}.ps-lg-1{padding-left:.25rem !important}.ps-lg-2{padding-left:.5rem !important}.ps-lg-3{padding-left:1rem !important}.ps-lg-4{padding-left:1.5rem !important}.ps-lg-5{padding-left:3rem !important}.gap-lg-0{gap:0 !important}.gap-lg-1{gap:.25rem 
!important}.gap-lg-2{gap:.5rem !important}.gap-lg-3{gap:1rem !important}.gap-lg-4{gap:1.5rem !important}.gap-lg-5{gap:3rem !important}.row-gap-lg-0{row-gap:0 !important}.row-gap-lg-1{row-gap:.25rem !important}.row-gap-lg-2{row-gap:.5rem !important}.row-gap-lg-3{row-gap:1rem !important}.row-gap-lg-4{row-gap:1.5rem !important}.row-gap-lg-5{row-gap:3rem !important}.column-gap-lg-0{column-gap:0 !important}.column-gap-lg-1{column-gap:.25rem !important}.column-gap-lg-2{column-gap:.5rem !important}.column-gap-lg-3{column-gap:1rem !important}.column-gap-lg-4{column-gap:1.5rem !important}.column-gap-lg-5{column-gap:3rem !important}.text-lg-start{text-align:left !important}.text-lg-end{text-align:right !important}.text-lg-center{text-align:center !important}}@media(min-width: 1200px){.float-xl-start{float:left !important}.float-xl-end{float:right !important}.float-xl-none{float:none !important}.object-fit-xl-contain{object-fit:contain !important}.object-fit-xl-cover{object-fit:cover !important}.object-fit-xl-fill{object-fit:fill !important}.object-fit-xl-scale{object-fit:scale-down !important}.object-fit-xl-none{object-fit:none !important}.d-xl-inline{display:inline !important}.d-xl-inline-block{display:inline-block !important}.d-xl-block{display:block !important}.d-xl-grid{display:grid !important}.d-xl-inline-grid{display:inline-grid !important}.d-xl-table{display:table !important}.d-xl-table-row{display:table-row !important}.d-xl-table-cell{display:table-cell !important}.d-xl-flex{display:flex !important}.d-xl-inline-flex{display:inline-flex !important}.d-xl-none{display:none !important}.flex-xl-fill{flex:1 1 auto !important}.flex-xl-row{flex-direction:row !important}.flex-xl-column{flex-direction:column !important}.flex-xl-row-reverse{flex-direction:row-reverse !important}.flex-xl-column-reverse{flex-direction:column-reverse !important}.flex-xl-grow-0{flex-grow:0 !important}.flex-xl-grow-1{flex-grow:1 !important}.flex-xl-shrink-0{flex-shrink:0 
!important}.flex-xl-shrink-1{flex-shrink:1 !important}.flex-xl-wrap{flex-wrap:wrap !important}.flex-xl-nowrap{flex-wrap:nowrap !important}.flex-xl-wrap-reverse{flex-wrap:wrap-reverse !important}.justify-content-xl-start{justify-content:flex-start !important}.justify-content-xl-end{justify-content:flex-end !important}.justify-content-xl-center{justify-content:center !important}.justify-content-xl-between{justify-content:space-between !important}.justify-content-xl-around{justify-content:space-around !important}.justify-content-xl-evenly{justify-content:space-evenly !important}.align-items-xl-start{align-items:flex-start !important}.align-items-xl-end{align-items:flex-end !important}.align-items-xl-center{align-items:center !important}.align-items-xl-baseline{align-items:baseline !important}.align-items-xl-stretch{align-items:stretch !important}.align-content-xl-start{align-content:flex-start !important}.align-content-xl-end{align-content:flex-end !important}.align-content-xl-center{align-content:center !important}.align-content-xl-between{align-content:space-between !important}.align-content-xl-around{align-content:space-around !important}.align-content-xl-stretch{align-content:stretch !important}.align-self-xl-auto{align-self:auto !important}.align-self-xl-start{align-self:flex-start !important}.align-self-xl-end{align-self:flex-end !important}.align-self-xl-center{align-self:center !important}.align-self-xl-baseline{align-self:baseline !important}.align-self-xl-stretch{align-self:stretch !important}.order-xl-first{order:-1 !important}.order-xl-0{order:0 !important}.order-xl-1{order:1 !important}.order-xl-2{order:2 !important}.order-xl-3{order:3 !important}.order-xl-4{order:4 !important}.order-xl-5{order:5 !important}.order-xl-last{order:6 !important}.m-xl-0{margin:0 !important}.m-xl-1{margin:.25rem !important}.m-xl-2{margin:.5rem !important}.m-xl-3{margin:1rem !important}.m-xl-4{margin:1.5rem !important}.m-xl-5{margin:3rem !important}.m-xl-auto{margin:auto 
!important}.mx-xl-0{margin-right:0 !important;margin-left:0 !important}.mx-xl-1{margin-right:.25rem !important;margin-left:.25rem !important}.mx-xl-2{margin-right:.5rem !important;margin-left:.5rem !important}.mx-xl-3{margin-right:1rem !important;margin-left:1rem !important}.mx-xl-4{margin-right:1.5rem !important;margin-left:1.5rem !important}.mx-xl-5{margin-right:3rem !important;margin-left:3rem !important}.mx-xl-auto{margin-right:auto !important;margin-left:auto !important}.my-xl-0{margin-top:0 !important;margin-bottom:0 !important}.my-xl-1{margin-top:.25rem !important;margin-bottom:.25rem !important}.my-xl-2{margin-top:.5rem !important;margin-bottom:.5rem !important}.my-xl-3{margin-top:1rem !important;margin-bottom:1rem !important}.my-xl-4{margin-top:1.5rem !important;margin-bottom:1.5rem !important}.my-xl-5{margin-top:3rem !important;margin-bottom:3rem !important}.my-xl-auto{margin-top:auto !important;margin-bottom:auto !important}.mt-xl-0{margin-top:0 !important}.mt-xl-1{margin-top:.25rem !important}.mt-xl-2{margin-top:.5rem !important}.mt-xl-3{margin-top:1rem !important}.mt-xl-4{margin-top:1.5rem !important}.mt-xl-5{margin-top:3rem !important}.mt-xl-auto{margin-top:auto !important}.me-xl-0{margin-right:0 !important}.me-xl-1{margin-right:.25rem !important}.me-xl-2{margin-right:.5rem !important}.me-xl-3{margin-right:1rem !important}.me-xl-4{margin-right:1.5rem !important}.me-xl-5{margin-right:3rem !important}.me-xl-auto{margin-right:auto !important}.mb-xl-0{margin-bottom:0 !important}.mb-xl-1{margin-bottom:.25rem !important}.mb-xl-2{margin-bottom:.5rem !important}.mb-xl-3{margin-bottom:1rem !important}.mb-xl-4{margin-bottom:1.5rem !important}.mb-xl-5{margin-bottom:3rem !important}.mb-xl-auto{margin-bottom:auto !important}.ms-xl-0{margin-left:0 !important}.ms-xl-1{margin-left:.25rem !important}.ms-xl-2{margin-left:.5rem !important}.ms-xl-3{margin-left:1rem !important}.ms-xl-4{margin-left:1.5rem !important}.ms-xl-5{margin-left:3rem 
!important}.ms-xl-auto{margin-left:auto !important}.p-xl-0{padding:0 !important}.p-xl-1{padding:.25rem !important}.p-xl-2{padding:.5rem !important}.p-xl-3{padding:1rem !important}.p-xl-4{padding:1.5rem !important}.p-xl-5{padding:3rem !important}.px-xl-0{padding-right:0 !important;padding-left:0 !important}.px-xl-1{padding-right:.25rem !important;padding-left:.25rem !important}.px-xl-2{padding-right:.5rem !important;padding-left:.5rem !important}.px-xl-3{padding-right:1rem !important;padding-left:1rem !important}.px-xl-4{padding-right:1.5rem !important;padding-left:1.5rem !important}.px-xl-5{padding-right:3rem !important;padding-left:3rem !important}.py-xl-0{padding-top:0 !important;padding-bottom:0 !important}.py-xl-1{padding-top:.25rem !important;padding-bottom:.25rem !important}.py-xl-2{padding-top:.5rem !important;padding-bottom:.5rem !important}.py-xl-3{padding-top:1rem !important;padding-bottom:1rem !important}.py-xl-4{padding-top:1.5rem !important;padding-bottom:1.5rem !important}.py-xl-5{padding-top:3rem !important;padding-bottom:3rem !important}.pt-xl-0{padding-top:0 !important}.pt-xl-1{padding-top:.25rem !important}.pt-xl-2{padding-top:.5rem !important}.pt-xl-3{padding-top:1rem !important}.pt-xl-4{padding-top:1.5rem !important}.pt-xl-5{padding-top:3rem !important}.pe-xl-0{padding-right:0 !important}.pe-xl-1{padding-right:.25rem !important}.pe-xl-2{padding-right:.5rem !important}.pe-xl-3{padding-right:1rem !important}.pe-xl-4{padding-right:1.5rem !important}.pe-xl-5{padding-right:3rem !important}.pb-xl-0{padding-bottom:0 !important}.pb-xl-1{padding-bottom:.25rem !important}.pb-xl-2{padding-bottom:.5rem !important}.pb-xl-3{padding-bottom:1rem !important}.pb-xl-4{padding-bottom:1.5rem !important}.pb-xl-5{padding-bottom:3rem !important}.ps-xl-0{padding-left:0 !important}.ps-xl-1{padding-left:.25rem !important}.ps-xl-2{padding-left:.5rem !important}.ps-xl-3{padding-left:1rem !important}.ps-xl-4{padding-left:1.5rem !important}.ps-xl-5{padding-left:3rem 
!important}.gap-xl-0{gap:0 !important}.gap-xl-1{gap:.25rem !important}.gap-xl-2{gap:.5rem !important}.gap-xl-3{gap:1rem !important}.gap-xl-4{gap:1.5rem !important}.gap-xl-5{gap:3rem !important}.row-gap-xl-0{row-gap:0 !important}.row-gap-xl-1{row-gap:.25rem !important}.row-gap-xl-2{row-gap:.5rem !important}.row-gap-xl-3{row-gap:1rem !important}.row-gap-xl-4{row-gap:1.5rem !important}.row-gap-xl-5{row-gap:3rem !important}.column-gap-xl-0{column-gap:0 !important}.column-gap-xl-1{column-gap:.25rem !important}.column-gap-xl-2{column-gap:.5rem !important}.column-gap-xl-3{column-gap:1rem !important}.column-gap-xl-4{column-gap:1.5rem !important}.column-gap-xl-5{column-gap:3rem !important}.text-xl-start{text-align:left !important}.text-xl-end{text-align:right !important}.text-xl-center{text-align:center !important}}@media(min-width: 1400px){.float-xxl-start{float:left !important}.float-xxl-end{float:right !important}.float-xxl-none{float:none !important}.object-fit-xxl-contain{object-fit:contain !important}.object-fit-xxl-cover{object-fit:cover !important}.object-fit-xxl-fill{object-fit:fill !important}.object-fit-xxl-scale{object-fit:scale-down !important}.object-fit-xxl-none{object-fit:none !important}.d-xxl-inline{display:inline !important}.d-xxl-inline-block{display:inline-block !important}.d-xxl-block{display:block !important}.d-xxl-grid{display:grid !important}.d-xxl-inline-grid{display:inline-grid !important}.d-xxl-table{display:table !important}.d-xxl-table-row{display:table-row !important}.d-xxl-table-cell{display:table-cell !important}.d-xxl-flex{display:flex !important}.d-xxl-inline-flex{display:inline-flex !important}.d-xxl-none{display:none !important}.flex-xxl-fill{flex:1 1 auto !important}.flex-xxl-row{flex-direction:row !important}.flex-xxl-column{flex-direction:column !important}.flex-xxl-row-reverse{flex-direction:row-reverse !important}.flex-xxl-column-reverse{flex-direction:column-reverse !important}.flex-xxl-grow-0{flex-grow:0 
!important}.flex-xxl-grow-1{flex-grow:1 !important}.flex-xxl-shrink-0{flex-shrink:0 !important}.flex-xxl-shrink-1{flex-shrink:1 !important}.flex-xxl-wrap{flex-wrap:wrap !important}.flex-xxl-nowrap{flex-wrap:nowrap !important}.flex-xxl-wrap-reverse{flex-wrap:wrap-reverse !important}.justify-content-xxl-start{justify-content:flex-start !important}.justify-content-xxl-end{justify-content:flex-end !important}.justify-content-xxl-center{justify-content:center !important}.justify-content-xxl-between{justify-content:space-between !important}.justify-content-xxl-around{justify-content:space-around !important}.justify-content-xxl-evenly{justify-content:space-evenly !important}.align-items-xxl-start{align-items:flex-start !important}.align-items-xxl-end{align-items:flex-end !important}.align-items-xxl-center{align-items:center !important}.align-items-xxl-baseline{align-items:baseline !important}.align-items-xxl-stretch{align-items:stretch !important}.align-content-xxl-start{align-content:flex-start !important}.align-content-xxl-end{align-content:flex-end !important}.align-content-xxl-center{align-content:center !important}.align-content-xxl-between{align-content:space-between !important}.align-content-xxl-around{align-content:space-around !important}.align-content-xxl-stretch{align-content:stretch !important}.align-self-xxl-auto{align-self:auto !important}.align-self-xxl-start{align-self:flex-start !important}.align-self-xxl-end{align-self:flex-end !important}.align-self-xxl-center{align-self:center !important}.align-self-xxl-baseline{align-self:baseline !important}.align-self-xxl-stretch{align-self:stretch !important}.order-xxl-first{order:-1 !important}.order-xxl-0{order:0 !important}.order-xxl-1{order:1 !important}.order-xxl-2{order:2 !important}.order-xxl-3{order:3 !important}.order-xxl-4{order:4 !important}.order-xxl-5{order:5 !important}.order-xxl-last{order:6 !important}.m-xxl-0{margin:0 !important}.m-xxl-1{margin:.25rem !important}.m-xxl-2{margin:.5rem 
!important}.m-xxl-3{margin:1rem !important}.m-xxl-4{margin:1.5rem !important}.m-xxl-5{margin:3rem !important}.m-xxl-auto{margin:auto !important}.mx-xxl-0{margin-right:0 !important;margin-left:0 !important}.mx-xxl-1{margin-right:.25rem !important;margin-left:.25rem !important}.mx-xxl-2{margin-right:.5rem !important;margin-left:.5rem !important}.mx-xxl-3{margin-right:1rem !important;margin-left:1rem !important}.mx-xxl-4{margin-right:1.5rem !important;margin-left:1.5rem !important}.mx-xxl-5{margin-right:3rem !important;margin-left:3rem !important}.mx-xxl-auto{margin-right:auto !important;margin-left:auto !important}.my-xxl-0{margin-top:0 !important;margin-bottom:0 !important}.my-xxl-1{margin-top:.25rem !important;margin-bottom:.25rem !important}.my-xxl-2{margin-top:.5rem !important;margin-bottom:.5rem !important}.my-xxl-3{margin-top:1rem !important;margin-bottom:1rem !important}.my-xxl-4{margin-top:1.5rem !important;margin-bottom:1.5rem !important}.my-xxl-5{margin-top:3rem !important;margin-bottom:3rem !important}.my-xxl-auto{margin-top:auto !important;margin-bottom:auto !important}.mt-xxl-0{margin-top:0 !important}.mt-xxl-1{margin-top:.25rem !important}.mt-xxl-2{margin-top:.5rem !important}.mt-xxl-3{margin-top:1rem !important}.mt-xxl-4{margin-top:1.5rem !important}.mt-xxl-5{margin-top:3rem !important}.mt-xxl-auto{margin-top:auto !important}.me-xxl-0{margin-right:0 !important}.me-xxl-1{margin-right:.25rem !important}.me-xxl-2{margin-right:.5rem !important}.me-xxl-3{margin-right:1rem !important}.me-xxl-4{margin-right:1.5rem !important}.me-xxl-5{margin-right:3rem !important}.me-xxl-auto{margin-right:auto !important}.mb-xxl-0{margin-bottom:0 !important}.mb-xxl-1{margin-bottom:.25rem !important}.mb-xxl-2{margin-bottom:.5rem !important}.mb-xxl-3{margin-bottom:1rem !important}.mb-xxl-4{margin-bottom:1.5rem !important}.mb-xxl-5{margin-bottom:3rem !important}.mb-xxl-auto{margin-bottom:auto !important}.ms-xxl-0{margin-left:0 !important}.ms-xxl-1{margin-left:.25rem 
!important}.ms-xxl-2{margin-left:.5rem !important}.ms-xxl-3{margin-left:1rem !important}.ms-xxl-4{margin-left:1.5rem !important}.ms-xxl-5{margin-left:3rem !important}.ms-xxl-auto{margin-left:auto !important}.p-xxl-0{padding:0 !important}.p-xxl-1{padding:.25rem !important}.p-xxl-2{padding:.5rem !important}.p-xxl-3{padding:1rem !important}.p-xxl-4{padding:1.5rem !important}.p-xxl-5{padding:3rem !important}.px-xxl-0{padding-right:0 !important;padding-left:0 !important}.px-xxl-1{padding-right:.25rem !important;padding-left:.25rem !important}.px-xxl-2{padding-right:.5rem !important;padding-left:.5rem !important}.px-xxl-3{padding-right:1rem !important;padding-left:1rem !important}.px-xxl-4{padding-right:1.5rem !important;padding-left:1.5rem !important}.px-xxl-5{padding-right:3rem !important;padding-left:3rem !important}.py-xxl-0{padding-top:0 !important;padding-bottom:0 !important}.py-xxl-1{padding-top:.25rem !important;padding-bottom:.25rem !important}.py-xxl-2{padding-top:.5rem !important;padding-bottom:.5rem !important}.py-xxl-3{padding-top:1rem !important;padding-bottom:1rem !important}.py-xxl-4{padding-top:1.5rem !important;padding-bottom:1.5rem !important}.py-xxl-5{padding-top:3rem !important;padding-bottom:3rem !important}.pt-xxl-0{padding-top:0 !important}.pt-xxl-1{padding-top:.25rem !important}.pt-xxl-2{padding-top:.5rem !important}.pt-xxl-3{padding-top:1rem !important}.pt-xxl-4{padding-top:1.5rem !important}.pt-xxl-5{padding-top:3rem !important}.pe-xxl-0{padding-right:0 !important}.pe-xxl-1{padding-right:.25rem !important}.pe-xxl-2{padding-right:.5rem !important}.pe-xxl-3{padding-right:1rem !important}.pe-xxl-4{padding-right:1.5rem !important}.pe-xxl-5{padding-right:3rem !important}.pb-xxl-0{padding-bottom:0 !important}.pb-xxl-1{padding-bottom:.25rem !important}.pb-xxl-2{padding-bottom:.5rem !important}.pb-xxl-3{padding-bottom:1rem !important}.pb-xxl-4{padding-bottom:1.5rem !important}.pb-xxl-5{padding-bottom:3rem !important}.ps-xxl-0{padding-left:0 
!important}.ps-xxl-1{padding-left:.25rem !important}.ps-xxl-2{padding-left:.5rem !important}.ps-xxl-3{padding-left:1rem !important}.ps-xxl-4{padding-left:1.5rem !important}.ps-xxl-5{padding-left:3rem !important}.gap-xxl-0{gap:0 !important}.gap-xxl-1{gap:.25rem !important}.gap-xxl-2{gap:.5rem !important}.gap-xxl-3{gap:1rem !important}.gap-xxl-4{gap:1.5rem !important}.gap-xxl-5{gap:3rem !important}.row-gap-xxl-0{row-gap:0 !important}.row-gap-xxl-1{row-gap:.25rem !important}.row-gap-xxl-2{row-gap:.5rem !important}.row-gap-xxl-3{row-gap:1rem !important}.row-gap-xxl-4{row-gap:1.5rem !important}.row-gap-xxl-5{row-gap:3rem !important}.column-gap-xxl-0{column-gap:0 !important}.column-gap-xxl-1{column-gap:.25rem !important}.column-gap-xxl-2{column-gap:.5rem !important}.column-gap-xxl-3{column-gap:1rem !important}.column-gap-xxl-4{column-gap:1.5rem !important}.column-gap-xxl-5{column-gap:3rem !important}.text-xxl-start{text-align:left !important}.text-xxl-end{text-align:right !important}.text-xxl-center{text-align:center !important}}.bg-default{color:#fff}.bg-primary{color:#fff}.bg-secondary{color:#fff}.bg-success{color:#fff}.bg-info{color:#fff}.bg-warning{color:#fff}.bg-danger{color:#fff}.bg-light{color:#000}.bg-dark{color:#fff}@media(min-width: 1200px){.fs-1{font-size:2rem !important}.fs-2{font-size:1.65rem !important}.fs-3{font-size:1.45rem !important}}@media print{.d-print-inline{display:inline !important}.d-print-inline-block{display:inline-block !important}.d-print-block{display:block !important}.d-print-grid{display:grid !important}.d-print-inline-grid{display:inline-grid !important}.d-print-table{display:table !important}.d-print-table-row{display:table-row !important}.d-print-table-cell{display:table-cell !important}.d-print-flex{display:flex !important}.d-print-inline-flex{display:inline-flex !important}.d-print-none{display:none !important}}:root{--bslib-spacer: 1rem;--bslib-mb-spacer: var(--bslib-spacer, 
1rem)}.bslib-mb-spacing{margin-bottom:var(--bslib-mb-spacer)}.bslib-gap-spacing{gap:var(--bslib-mb-spacer)}.bslib-gap-spacing>.bslib-mb-spacing,.bslib-gap-spacing>.form-group,.bslib-gap-spacing>p,.bslib-gap-spacing>pre{margin-bottom:0}.html-fill-container>.html-fill-item.bslib-mb-spacing{margin-bottom:0}.tab-content>.tab-pane.html-fill-container{display:none}.tab-content>.active.html-fill-container{display:flex}.tab-content.html-fill-container{padding:0}:root{--bslib-spacer: 1rem;--bslib-mb-spacer: var(--bslib-spacer, 1rem)}.bslib-mb-spacing{margin-bottom:var(--bslib-mb-spacer)}.bslib-gap-spacing{gap:var(--bslib-mb-spacer)}.bslib-gap-spacing>.bslib-mb-spacing,.bslib-gap-spacing>.form-group,.bslib-gap-spacing>p,.bslib-gap-spacing>pre{margin-bottom:0}.html-fill-container>.html-fill-item.bslib-mb-spacing{margin-bottom:0}.tab-content>.tab-pane.html-fill-container{display:none}.tab-content>.active.html-fill-container{display:flex}.tab-content.html-fill-container{padding:0}.bg-blue{--bslib-color-bg: #2780e3;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-blue{--bslib-color-fg: #2780e3;color:var(--bslib-color-fg)}.bg-indigo{--bslib-color-bg: #6610f2;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-indigo{--bslib-color-fg: #6610f2;color:var(--bslib-color-fg)}.bg-purple{--bslib-color-bg: #613d7c;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-purple{--bslib-color-fg: #613d7c;color:var(--bslib-color-fg)}.bg-pink{--bslib-color-bg: #e83e8c;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-pink{--bslib-color-fg: #e83e8c;color:var(--bslib-color-fg)}.bg-red{--bslib-color-bg: #ff0039;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-red{--bslib-color-fg: #ff0039;color:var(--bslib-color-fg)}.bg-orange{--bslib-color-bg: #f0ad4e;--bslib-color-fg: 
#000;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-orange{--bslib-color-fg: #f0ad4e;color:var(--bslib-color-fg)}.bg-yellow{--bslib-color-bg: #ff7518;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-yellow{--bslib-color-fg: #ff7518;color:var(--bslib-color-fg)}.bg-green{--bslib-color-bg: #3fb618;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-green{--bslib-color-fg: #3fb618;color:var(--bslib-color-fg)}.bg-teal{--bslib-color-bg: #20c997;--bslib-color-fg: #000;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-teal{--bslib-color-fg: #20c997;color:var(--bslib-color-fg)}.bg-cyan{--bslib-color-bg: #9954bb;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-cyan{--bslib-color-fg: #9954bb;color:var(--bslib-color-fg)}.text-default{--bslib-color-fg: #343a40}.bg-default{--bslib-color-bg: #343a40;--bslib-color-fg: #fff}.text-primary{--bslib-color-fg: #2780e3}.bg-primary{--bslib-color-bg: #2780e3;--bslib-color-fg: #fff}.text-secondary{--bslib-color-fg: #343a40}.bg-secondary{--bslib-color-bg: #343a40;--bslib-color-fg: #fff}.text-success{--bslib-color-fg: #3fb618}.bg-success{--bslib-color-bg: #3fb618;--bslib-color-fg: #fff}.text-info{--bslib-color-fg: #9954bb}.bg-info{--bslib-color-bg: #9954bb;--bslib-color-fg: #fff}.text-warning{--bslib-color-fg: #ff7518}.bg-warning{--bslib-color-bg: #ff7518;--bslib-color-fg: #fff}.text-danger{--bslib-color-fg: #ff0039}.bg-danger{--bslib-color-bg: #ff0039;--bslib-color-fg: #fff}.text-light{--bslib-color-fg: #f8f9fa}.bg-light{--bslib-color-bg: #f8f9fa;--bslib-color-fg: #000}.text-dark{--bslib-color-fg: #343a40}.bg-dark{--bslib-color-bg: #343a40;--bslib-color-fg: #fff}.bg-gradient-blue-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #4053e9;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #6610f2 
var(--bg-gradient-end, 180%)) #4053e9;color:#fff}.bg-gradient-blue-purple{--bslib-color-fg: #fff;--bslib-color-bg: #3e65ba;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #3e65ba;color:#fff}.bg-gradient-blue-pink{--bslib-color-fg: #fff;--bslib-color-bg: #7466c0;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #7466c0;color:#fff}.bg-gradient-blue-red{--bslib-color-fg: #fff;--bslib-color-bg: #7d4d9f;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #7d4d9f;color:#fff}.bg-gradient-blue-orange{--bslib-color-fg: #fff;--bslib-color-bg: #7792a7;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #7792a7;color:#fff}.bg-gradient-blue-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #7d7c92;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #7d7c92;color:#fff}.bg-gradient-blue-green{--bslib-color-fg: #fff;--bslib-color-bg: #319692;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #319692;color:#fff}.bg-gradient-blue-teal{--bslib-color-fg: #fff;--bslib-color-bg: #249dc5;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #249dc5;color:#fff}.bg-gradient-blue-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #556ed3;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #556ed3;color:#fff}.bg-gradient-indigo-blue{--bslib-color-fg: #fff;--bslib-color-bg: #4d3dec;background:linear-gradient(var(--bg-gradient-deg, 
140deg), #6610f2 var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #4d3dec;color:#fff}.bg-gradient-indigo-purple{--bslib-color-fg: #fff;--bslib-color-bg: #6422c3;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #6422c3;color:#fff}.bg-gradient-indigo-pink{--bslib-color-fg: #fff;--bslib-color-bg: #9a22c9;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #9a22c9;color:#fff}.bg-gradient-indigo-red{--bslib-color-fg: #fff;--bslib-color-bg: #a30aa8;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #a30aa8;color:#fff}.bg-gradient-indigo-orange{--bslib-color-fg: #fff;--bslib-color-bg: #9d4fb0;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #9d4fb0;color:#fff}.bg-gradient-indigo-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #a3389b;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #a3389b;color:#fff}.bg-gradient-indigo-green{--bslib-color-fg: #fff;--bslib-color-bg: #56529b;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #56529b;color:#fff}.bg-gradient-indigo-teal{--bslib-color-fg: #fff;--bslib-color-bg: #4a5ace;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #4a5ace;color:#fff}.bg-gradient-indigo-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #7a2bdc;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #7a2bdc;color:#fff}.bg-gradient-purple-blue{--bslib-color-fg: 
#fff;--bslib-color-bg: #4a58a5;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #4a58a5;color:#fff}.bg-gradient-purple-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #632bab;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #632bab;color:#fff}.bg-gradient-purple-pink{--bslib-color-fg: #fff;--bslib-color-bg: #973d82;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #973d82;color:#fff}.bg-gradient-purple-red{--bslib-color-fg: #fff;--bslib-color-bg: #a02561;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #a02561;color:#fff}.bg-gradient-purple-orange{--bslib-color-fg: #fff;--bslib-color-bg: #9a6a6a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #9a6a6a;color:#fff}.bg-gradient-purple-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #a05354;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #a05354;color:#fff}.bg-gradient-purple-green{--bslib-color-fg: #fff;--bslib-color-bg: #536d54;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #536d54;color:#fff}.bg-gradient-purple-teal{--bslib-color-fg: #fff;--bslib-color-bg: #477587;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #477587;color:#fff}.bg-gradient-purple-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #774695;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 
180%)) #774695;color:#fff}.bg-gradient-pink-blue{--bslib-color-fg: #fff;--bslib-color-bg: #9b58af;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #9b58af;color:#fff}.bg-gradient-pink-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #b42cb5;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #b42cb5;color:#fff}.bg-gradient-pink-purple{--bslib-color-fg: #fff;--bslib-color-bg: #b23e86;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #b23e86;color:#fff}.bg-gradient-pink-red{--bslib-color-fg: #fff;--bslib-color-bg: #f1256b;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #f1256b;color:#fff}.bg-gradient-pink-orange{--bslib-color-fg: #fff;--bslib-color-bg: #eb6a73;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #eb6a73;color:#fff}.bg-gradient-pink-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #f1545e;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #f1545e;color:#fff}.bg-gradient-pink-green{--bslib-color-fg: #fff;--bslib-color-bg: #a46e5e;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #a46e5e;color:#fff}.bg-gradient-pink-teal{--bslib-color-fg: #fff;--bslib-color-bg: #987690;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #987690;color:#fff}.bg-gradient-pink-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #c8479f;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c 
var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #c8479f;color:#fff}.bg-gradient-red-blue{--bslib-color-fg: #fff;--bslib-color-bg: #a9337d;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #a9337d;color:#fff}.bg-gradient-red-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #c20683;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #c20683;color:#fff}.bg-gradient-red-purple{--bslib-color-fg: #fff;--bslib-color-bg: #c01854;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #c01854;color:#fff}.bg-gradient-red-pink{--bslib-color-fg: #fff;--bslib-color-bg: #f6195a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #f6195a;color:#fff}.bg-gradient-red-orange{--bslib-color-fg: #fff;--bslib-color-bg: #f94541;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #f94541;color:#fff}.bg-gradient-red-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #ff2f2c;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #ff2f2c;color:#fff}.bg-gradient-red-green{--bslib-color-fg: #fff;--bslib-color-bg: #b2492c;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #b2492c;color:#fff}.bg-gradient-red-teal{--bslib-color-fg: #fff;--bslib-color-bg: #a6505f;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #a6505f;color:#fff}.bg-gradient-red-cyan{--bslib-color-fg: #fff;--bslib-color-bg: 
#d6226d;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #d6226d;color:#fff}.bg-gradient-orange-blue{--bslib-color-fg: #fff;--bslib-color-bg: #a09b8a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #a09b8a;color:#fff}.bg-gradient-orange-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #b96e90;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #b96e90;color:#fff}.bg-gradient-orange-purple{--bslib-color-fg: #fff;--bslib-color-bg: #b78060;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #b78060;color:#fff}.bg-gradient-orange-pink{--bslib-color-fg: #fff;--bslib-color-bg: #ed8167;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #ed8167;color:#fff}.bg-gradient-orange-red{--bslib-color-fg: #fff;--bslib-color-bg: #f66846;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #f66846;color:#fff}.bg-gradient-orange-yellow{--bslib-color-fg: #000;--bslib-color-bg: #f69738;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #f69738;color:#000}.bg-gradient-orange-green{--bslib-color-fg: #000;--bslib-color-bg: #a9b138;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #a9b138;color:#000}.bg-gradient-orange-teal{--bslib-color-fg: #000;--bslib-color-bg: #9db86b;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) 
#9db86b;color:#000}.bg-gradient-orange-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #cd897a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #cd897a;color:#fff}.bg-gradient-yellow-blue{--bslib-color-fg: #fff;--bslib-color-bg: #a97969;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #a97969;color:#fff}.bg-gradient-yellow-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #c24d6f;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #c24d6f;color:#fff}.bg-gradient-yellow-purple{--bslib-color-fg: #fff;--bslib-color-bg: #c05f40;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #c05f40;color:#fff}.bg-gradient-yellow-pink{--bslib-color-fg: #fff;--bslib-color-bg: #f65f46;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #f65f46;color:#fff}.bg-gradient-yellow-red{--bslib-color-fg: #fff;--bslib-color-bg: #ff4625;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #ff4625;color:#fff}.bg-gradient-yellow-orange{--bslib-color-fg: #000;--bslib-color-bg: #f98b2e;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #f98b2e;color:#000}.bg-gradient-yellow-green{--bslib-color-fg: #fff;--bslib-color-bg: #b28f18;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #b28f18;color:#fff}.bg-gradient-yellow-teal{--bslib-color-fg: #fff;--bslib-color-bg: #a6974b;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 
var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #a6974b;color:#fff}.bg-gradient-yellow-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #d66859;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #d66859;color:#fff}.bg-gradient-green-blue{--bslib-color-fg: #fff;--bslib-color-bg: #35a069;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #35a069;color:#fff}.bg-gradient-green-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #4f746f;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #4f746f;color:#fff}.bg-gradient-green-purple{--bslib-color-fg: #fff;--bslib-color-bg: #4d8640;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #4d8640;color:#fff}.bg-gradient-green-pink{--bslib-color-fg: #fff;--bslib-color-bg: #838646;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #838646;color:#fff}.bg-gradient-green-red{--bslib-color-fg: #fff;--bslib-color-bg: #8c6d25;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #8c6d25;color:#fff}.bg-gradient-green-orange{--bslib-color-fg: #000;--bslib-color-bg: #86b22e;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #86b22e;color:#000}.bg-gradient-green-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #8c9c18;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #8c9c18;color:#fff}.bg-gradient-green-teal{--bslib-color-fg: #000;--bslib-color-bg: 
#33be4b;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #33be4b;color:#000}.bg-gradient-green-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #638f59;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #638f59;color:#fff}.bg-gradient-teal-blue{--bslib-color-fg: #fff;--bslib-color-bg: #23acb5;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #23acb5;color:#fff}.bg-gradient-teal-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #3c7fbb;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #3c7fbb;color:#fff}.bg-gradient-teal-purple{--bslib-color-fg: #fff;--bslib-color-bg: #3a918c;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #3a918c;color:#fff}.bg-gradient-teal-pink{--bslib-color-fg: #fff;--bslib-color-bg: #709193;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #709193;color:#fff}.bg-gradient-teal-red{--bslib-color-fg: #fff;--bslib-color-bg: #797971;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #797971;color:#fff}.bg-gradient-teal-orange{--bslib-color-fg: #000;--bslib-color-bg: #73be7a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #73be7a;color:#000}.bg-gradient-teal-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #79a764;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) 
#79a764;color:#fff}.bg-gradient-teal-green{--bslib-color-fg: #000;--bslib-color-bg: #2cc164;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #2cc164;color:#000}.bg-gradient-teal-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #509aa5;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #509aa5;color:#fff}.bg-gradient-cyan-blue{--bslib-color-fg: #fff;--bslib-color-bg: #6b66cb;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #6b66cb;color:#fff}.bg-gradient-cyan-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #8539d1;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #8539d1;color:#fff}.bg-gradient-cyan-purple{--bslib-color-fg: #fff;--bslib-color-bg: #834ba2;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #834ba2;color:#fff}.bg-gradient-cyan-pink{--bslib-color-fg: #fff;--bslib-color-bg: #b94ba8;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #b94ba8;color:#fff}.bg-gradient-cyan-red{--bslib-color-fg: #fff;--bslib-color-bg: #c23287;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #c23287;color:#fff}.bg-gradient-cyan-orange{--bslib-color-fg: #fff;--bslib-color-bg: #bc788f;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #bc788f;color:#fff}.bg-gradient-cyan-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #c2617a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb 
var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #c2617a;color:#fff}.bg-gradient-cyan-green{--bslib-color-fg: #fff;--bslib-color-bg: #757b7a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #757b7a;color:#fff}.bg-gradient-cyan-teal{--bslib-color-fg: #fff;--bslib-color-bg: #6983ad;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #6983ad;color:#fff}.bg-blue{--bslib-color-bg: #2780e3;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-blue{--bslib-color-fg: #2780e3;color:var(--bslib-color-fg)}.bg-indigo{--bslib-color-bg: #6610f2;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-indigo{--bslib-color-fg: #6610f2;color:var(--bslib-color-fg)}.bg-purple{--bslib-color-bg: #613d7c;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-purple{--bslib-color-fg: #613d7c;color:var(--bslib-color-fg)}.bg-pink{--bslib-color-bg: #e83e8c;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-pink{--bslib-color-fg: #e83e8c;color:var(--bslib-color-fg)}.bg-red{--bslib-color-bg: #ff0039;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-red{--bslib-color-fg: #ff0039;color:var(--bslib-color-fg)}.bg-orange{--bslib-color-bg: #f0ad4e;--bslib-color-fg: #000;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-orange{--bslib-color-fg: #f0ad4e;color:var(--bslib-color-fg)}.bg-yellow{--bslib-color-bg: #ff7518;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-yellow{--bslib-color-fg: #ff7518;color:var(--bslib-color-fg)}.bg-green{--bslib-color-bg: #3fb618;--bslib-color-fg: 
#fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-green{--bslib-color-fg: #3fb618;color:var(--bslib-color-fg)}.bg-teal{--bslib-color-bg: #20c997;--bslib-color-fg: #000;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-teal{--bslib-color-fg: #20c997;color:var(--bslib-color-fg)}.bg-cyan{--bslib-color-bg: #9954bb;--bslib-color-fg: #fff;background-color:var(--bslib-color-bg);color:var(--bslib-color-fg)}.text-cyan{--bslib-color-fg: #9954bb;color:var(--bslib-color-fg)}.text-default{--bslib-color-fg: #343a40}.bg-default{--bslib-color-bg: #343a40;--bslib-color-fg: #fff}.text-primary{--bslib-color-fg: #2780e3}.bg-primary{--bslib-color-bg: #2780e3;--bslib-color-fg: #fff}.text-secondary{--bslib-color-fg: #343a40}.bg-secondary{--bslib-color-bg: #343a40;--bslib-color-fg: #fff}.text-success{--bslib-color-fg: #3fb618}.bg-success{--bslib-color-bg: #3fb618;--bslib-color-fg: #fff}.text-info{--bslib-color-fg: #9954bb}.bg-info{--bslib-color-bg: #9954bb;--bslib-color-fg: #fff}.text-warning{--bslib-color-fg: #ff7518}.bg-warning{--bslib-color-bg: #ff7518;--bslib-color-fg: #fff}.text-danger{--bslib-color-fg: #ff0039}.bg-danger{--bslib-color-bg: #ff0039;--bslib-color-fg: #fff}.text-light{--bslib-color-fg: #f8f9fa}.bg-light{--bslib-color-bg: #f8f9fa;--bslib-color-fg: #000}.text-dark{--bslib-color-fg: #343a40}.bg-dark{--bslib-color-bg: #343a40;--bslib-color-fg: #fff}.bg-gradient-blue-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #4053e9;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #4053e9;color:#fff}.bg-gradient-blue-purple{--bslib-color-fg: #fff;--bslib-color-bg: #3e65ba;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #3e65ba;color:#fff}.bg-gradient-blue-pink{--bslib-color-fg: #fff;--bslib-color-bg: #7466c0;background:linear-gradient(var(--bg-gradient-deg, 
140deg), #2780e3 var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #7466c0;color:#fff}.bg-gradient-blue-red{--bslib-color-fg: #fff;--bslib-color-bg: #7d4d9f;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #7d4d9f;color:#fff}.bg-gradient-blue-orange{--bslib-color-fg: #fff;--bslib-color-bg: #7792a7;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #7792a7;color:#fff}.bg-gradient-blue-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #7d7c92;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #7d7c92;color:#fff}.bg-gradient-blue-green{--bslib-color-fg: #fff;--bslib-color-bg: #319692;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #319692;color:#fff}.bg-gradient-blue-teal{--bslib-color-fg: #fff;--bslib-color-bg: #249dc5;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #249dc5;color:#fff}.bg-gradient-blue-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #556ed3;background:linear-gradient(var(--bg-gradient-deg, 140deg), #2780e3 var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #556ed3;color:#fff}.bg-gradient-indigo-blue{--bslib-color-fg: #fff;--bslib-color-bg: #4d3dec;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #4d3dec;color:#fff}.bg-gradient-indigo-purple{--bslib-color-fg: #fff;--bslib-color-bg: #6422c3;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #6422c3;color:#fff}.bg-gradient-indigo-pink{--bslib-color-fg: #fff;--bslib-color-bg: 
#9a22c9;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #9a22c9;color:#fff}.bg-gradient-indigo-red{--bslib-color-fg: #fff;--bslib-color-bg: #a30aa8;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #a30aa8;color:#fff}.bg-gradient-indigo-orange{--bslib-color-fg: #fff;--bslib-color-bg: #9d4fb0;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #9d4fb0;color:#fff}.bg-gradient-indigo-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #a3389b;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #a3389b;color:#fff}.bg-gradient-indigo-green{--bslib-color-fg: #fff;--bslib-color-bg: #56529b;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #56529b;color:#fff}.bg-gradient-indigo-teal{--bslib-color-fg: #fff;--bslib-color-bg: #4a5ace;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #4a5ace;color:#fff}.bg-gradient-indigo-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #7a2bdc;background:linear-gradient(var(--bg-gradient-deg, 140deg), #6610f2 var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #7a2bdc;color:#fff}.bg-gradient-purple-blue{--bslib-color-fg: #fff;--bslib-color-bg: #4a58a5;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #4a58a5;color:#fff}.bg-gradient-purple-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #632bab;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) 
#632bab;color:#fff}.bg-gradient-purple-pink{--bslib-color-fg: #fff;--bslib-color-bg: #973d82;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #973d82;color:#fff}.bg-gradient-purple-red{--bslib-color-fg: #fff;--bslib-color-bg: #a02561;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #a02561;color:#fff}.bg-gradient-purple-orange{--bslib-color-fg: #fff;--bslib-color-bg: #9a6a6a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #9a6a6a;color:#fff}.bg-gradient-purple-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #a05354;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #a05354;color:#fff}.bg-gradient-purple-green{--bslib-color-fg: #fff;--bslib-color-bg: #536d54;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #536d54;color:#fff}.bg-gradient-purple-teal{--bslib-color-fg: #fff;--bslib-color-bg: #477587;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #477587;color:#fff}.bg-gradient-purple-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #774695;background:linear-gradient(var(--bg-gradient-deg, 140deg), #613d7c var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #774695;color:#fff}.bg-gradient-pink-blue{--bslib-color-fg: #fff;--bslib-color-bg: #9b58af;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #9b58af;color:#fff}.bg-gradient-pink-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #b42cb5;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c 
var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #b42cb5;color:#fff}.bg-gradient-pink-purple{--bslib-color-fg: #fff;--bslib-color-bg: #b23e86;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #b23e86;color:#fff}.bg-gradient-pink-red{--bslib-color-fg: #fff;--bslib-color-bg: #f1256b;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #f1256b;color:#fff}.bg-gradient-pink-orange{--bslib-color-fg: #fff;--bslib-color-bg: #eb6a73;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #eb6a73;color:#fff}.bg-gradient-pink-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #f1545e;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #f1545e;color:#fff}.bg-gradient-pink-green{--bslib-color-fg: #fff;--bslib-color-bg: #a46e5e;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #a46e5e;color:#fff}.bg-gradient-pink-teal{--bslib-color-fg: #fff;--bslib-color-bg: #987690;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #987690;color:#fff}.bg-gradient-pink-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #c8479f;background:linear-gradient(var(--bg-gradient-deg, 140deg), #e83e8c var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #c8479f;color:#fff}.bg-gradient-red-blue{--bslib-color-fg: #fff;--bslib-color-bg: #a9337d;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #a9337d;color:#fff}.bg-gradient-red-indigo{--bslib-color-fg: #fff;--bslib-color-bg: 
#c20683;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #c20683;color:#fff}.bg-gradient-red-purple{--bslib-color-fg: #fff;--bslib-color-bg: #c01854;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #c01854;color:#fff}.bg-gradient-red-pink{--bslib-color-fg: #fff;--bslib-color-bg: #f6195a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #f6195a;color:#fff}.bg-gradient-red-orange{--bslib-color-fg: #fff;--bslib-color-bg: #f94541;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #f94541;color:#fff}.bg-gradient-red-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #ff2f2c;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #ff2f2c;color:#fff}.bg-gradient-red-green{--bslib-color-fg: #fff;--bslib-color-bg: #b2492c;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #b2492c;color:#fff}.bg-gradient-red-teal{--bslib-color-fg: #fff;--bslib-color-bg: #a6505f;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #a6505f;color:#fff}.bg-gradient-red-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #d6226d;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff0039 var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #d6226d;color:#fff}.bg-gradient-orange-blue{--bslib-color-fg: #fff;--bslib-color-bg: #a09b8a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) 
#a09b8a;color:#fff}.bg-gradient-orange-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #b96e90;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #b96e90;color:#fff}.bg-gradient-orange-purple{--bslib-color-fg: #fff;--bslib-color-bg: #b78060;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #b78060;color:#fff}.bg-gradient-orange-pink{--bslib-color-fg: #fff;--bslib-color-bg: #ed8167;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #ed8167;color:#fff}.bg-gradient-orange-red{--bslib-color-fg: #fff;--bslib-color-bg: #f66846;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #f66846;color:#fff}.bg-gradient-orange-yellow{--bslib-color-fg: #000;--bslib-color-bg: #f69738;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #f69738;color:#000}.bg-gradient-orange-green{--bslib-color-fg: #000;--bslib-color-bg: #a9b138;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #a9b138;color:#000}.bg-gradient-orange-teal{--bslib-color-fg: #000;--bslib-color-bg: #9db86b;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #9db86b;color:#000}.bg-gradient-orange-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #cd897a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #f0ad4e var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #cd897a;color:#fff}.bg-gradient-yellow-blue{--bslib-color-fg: #fff;--bslib-color-bg: #a97969;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 
var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #a97969;color:#fff}.bg-gradient-yellow-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #c24d6f;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #c24d6f;color:#fff}.bg-gradient-yellow-purple{--bslib-color-fg: #fff;--bslib-color-bg: #c05f40;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #c05f40;color:#fff}.bg-gradient-yellow-pink{--bslib-color-fg: #fff;--bslib-color-bg: #f65f46;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #f65f46;color:#fff}.bg-gradient-yellow-red{--bslib-color-fg: #fff;--bslib-color-bg: #ff4625;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #ff4625;color:#fff}.bg-gradient-yellow-orange{--bslib-color-fg: #000;--bslib-color-bg: #f98b2e;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #f98b2e;color:#000}.bg-gradient-yellow-green{--bslib-color-fg: #fff;--bslib-color-bg: #b28f18;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #b28f18;color:#fff}.bg-gradient-yellow-teal{--bslib-color-fg: #fff;--bslib-color-bg: #a6974b;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #a6974b;color:#fff}.bg-gradient-yellow-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #d66859;background:linear-gradient(var(--bg-gradient-deg, 140deg), #ff7518 var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #d66859;color:#fff}.bg-gradient-green-blue{--bslib-color-fg: #fff;--bslib-color-bg: 
#35a069;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #35a069;color:#fff}.bg-gradient-green-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #4f746f;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #4f746f;color:#fff}.bg-gradient-green-purple{--bslib-color-fg: #fff;--bslib-color-bg: #4d8640;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #4d8640;color:#fff}.bg-gradient-green-pink{--bslib-color-fg: #fff;--bslib-color-bg: #838646;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #838646;color:#fff}.bg-gradient-green-red{--bslib-color-fg: #fff;--bslib-color-bg: #8c6d25;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #8c6d25;color:#fff}.bg-gradient-green-orange{--bslib-color-fg: #000;--bslib-color-bg: #86b22e;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #86b22e;color:#000}.bg-gradient-green-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #8c9c18;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #8c9c18;color:#fff}.bg-gradient-green-teal{--bslib-color-fg: #000;--bslib-color-bg: #33be4b;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #33be4b;color:#000}.bg-gradient-green-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #638f59;background:linear-gradient(var(--bg-gradient-deg, 140deg), #3fb618 var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) 
#638f59;color:#fff}.bg-gradient-teal-blue{--bslib-color-fg: #fff;--bslib-color-bg: #23acb5;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #23acb5;color:#fff}.bg-gradient-teal-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #3c7fbb;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #3c7fbb;color:#fff}.bg-gradient-teal-purple{--bslib-color-fg: #fff;--bslib-color-bg: #3a918c;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #3a918c;color:#fff}.bg-gradient-teal-pink{--bslib-color-fg: #fff;--bslib-color-bg: #709193;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #709193;color:#fff}.bg-gradient-teal-red{--bslib-color-fg: #fff;--bslib-color-bg: #797971;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #797971;color:#fff}.bg-gradient-teal-orange{--bslib-color-fg: #000;--bslib-color-bg: #73be7a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #73be7a;color:#000}.bg-gradient-teal-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #79a764;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #79a764;color:#fff}.bg-gradient-teal-green{--bslib-color-fg: #000;--bslib-color-bg: #2cc164;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #2cc164;color:#000}.bg-gradient-teal-cyan{--bslib-color-fg: #fff;--bslib-color-bg: #509aa5;background:linear-gradient(var(--bg-gradient-deg, 140deg), #20c997 
var(--bg-gradient-start, 36%), #9954bb var(--bg-gradient-end, 180%)) #509aa5;color:#fff}.bg-gradient-cyan-blue{--bslib-color-fg: #fff;--bslib-color-bg: #6b66cb;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #2780e3 var(--bg-gradient-end, 180%)) #6b66cb;color:#fff}.bg-gradient-cyan-indigo{--bslib-color-fg: #fff;--bslib-color-bg: #8539d1;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #6610f2 var(--bg-gradient-end, 180%)) #8539d1;color:#fff}.bg-gradient-cyan-purple{--bslib-color-fg: #fff;--bslib-color-bg: #834ba2;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #613d7c var(--bg-gradient-end, 180%)) #834ba2;color:#fff}.bg-gradient-cyan-pink{--bslib-color-fg: #fff;--bslib-color-bg: #b94ba8;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #e83e8c var(--bg-gradient-end, 180%)) #b94ba8;color:#fff}.bg-gradient-cyan-red{--bslib-color-fg: #fff;--bslib-color-bg: #c23287;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #ff0039 var(--bg-gradient-end, 180%)) #c23287;color:#fff}.bg-gradient-cyan-orange{--bslib-color-fg: #fff;--bslib-color-bg: #bc788f;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #f0ad4e var(--bg-gradient-end, 180%)) #bc788f;color:#fff}.bg-gradient-cyan-yellow{--bslib-color-fg: #fff;--bslib-color-bg: #c2617a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #ff7518 var(--bg-gradient-end, 180%)) #c2617a;color:#fff}.bg-gradient-cyan-green{--bslib-color-fg: #fff;--bslib-color-bg: #757b7a;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #3fb618 var(--bg-gradient-end, 180%)) #757b7a;color:#fff}.bg-gradient-cyan-teal{--bslib-color-fg: #fff;--bslib-color-bg: 
#6983ad;background:linear-gradient(var(--bg-gradient-deg, 140deg), #9954bb var(--bg-gradient-start, 36%), #20c997 var(--bg-gradient-end, 180%)) #6983ad;color:#fff}.accordion .accordion-header{font-size:calc(1.29rem + 0.48vw);margin-top:0;margin-bottom:.5rem;font-weight:400;line-height:1.2;color:var(--bs-heading-color);margin-bottom:0}@media(min-width: 1200px){.accordion .accordion-header{font-size:1.65rem}}.accordion .accordion-icon:not(:empty){margin-right:.75rem;display:flex}.accordion .accordion-button:not(.collapsed){box-shadow:none}.accordion .accordion-button:not(.collapsed):focus{box-shadow:var(--bs-accordion-btn-focus-box-shadow)}.bslib-sidebar-layout{--bslib-sidebar-transition-duration: 500ms;--bslib-sidebar-transition-easing-x: cubic-bezier(0.8, 0.78, 0.22, 1.07);--bslib-sidebar-border: var(--bs-card-border-width, 1px) solid var(--bs-card-border-color, rgba(0, 0, 0, 0.175));--bslib-sidebar-border-radius: var(--bs-border-radius);--bslib-sidebar-vert-border: var(--bs-card-border-width, 1px) solid var(--bs-card-border-color, rgba(0, 0, 0, 0.175));--bslib-sidebar-bg: rgba(var(--bs-emphasis-color-rgb, 0, 0, 0), 0.05);--bslib-sidebar-fg: var(--bs-emphasis-color, black);--bslib-sidebar-main-fg: var(--bs-card-color, var(--bs-body-color));--bslib-sidebar-main-bg: var(--bs-card-bg, var(--bs-body-bg));--bslib-sidebar-toggle-bg: rgba(var(--bs-emphasis-color-rgb, 0, 0, 0), 0.1);--bslib-sidebar-padding: calc(var(--bslib-spacer) * 1.5);--bslib-sidebar-icon-size: var(--bslib-spacer, 1rem);--bslib-sidebar-icon-button-size: calc(var(--bslib-sidebar-icon-size, 1rem) * 2);--bslib-sidebar-padding-icon: calc(var(--bslib-sidebar-icon-button-size, 2rem) * 1.5);--bslib-collapse-toggle-border-radius: var(--bs-border-radius, 0.25rem);--bslib-collapse-toggle-transform: 0deg;--bslib-sidebar-toggle-transition-easing: cubic-bezier(1, 0, 0, 1);--bslib-collapse-toggle-right-transform: 180deg;--bslib-sidebar-column-main: minmax(0, 1fr);display:grid 
!important;grid-template-columns:min(100% - var(--bslib-sidebar-icon-size),var(--bslib-sidebar-width, 250px)) var(--bslib-sidebar-column-main);position:relative;transition:grid-template-columns ease-in-out var(--bslib-sidebar-transition-duration);border:var(--bslib-sidebar-border);border-radius:var(--bslib-sidebar-border-radius)}@media(prefers-reduced-motion: reduce){.bslib-sidebar-layout{transition:none}}.bslib-sidebar-layout[data-bslib-sidebar-border=false]{border:none}.bslib-sidebar-layout[data-bslib-sidebar-border-radius=false]{border-radius:initial}.bslib-sidebar-layout>.main,.bslib-sidebar-layout>.sidebar{grid-row:1/2;border-radius:inherit;overflow:auto}.bslib-sidebar-layout>.main{grid-column:2/3;border-top-left-radius:0;border-bottom-left-radius:0;padding:var(--bslib-sidebar-padding);transition:padding var(--bslib-sidebar-transition-easing-x) var(--bslib-sidebar-transition-duration);color:var(--bslib-sidebar-main-fg);background-color:var(--bslib-sidebar-main-bg)}.bslib-sidebar-layout>.sidebar{grid-column:1/2;width:100%;height:100%;border-right:var(--bslib-sidebar-vert-border);border-top-right-radius:0;border-bottom-right-radius:0;color:var(--bslib-sidebar-fg);background-color:var(--bslib-sidebar-bg);backdrop-filter:blur(5px)}.bslib-sidebar-layout>.sidebar>.sidebar-content{display:flex;flex-direction:column;gap:var(--bslib-spacer, 1rem);padding:var(--bslib-sidebar-padding);padding-top:var(--bslib-sidebar-padding-icon)}.bslib-sidebar-layout>.sidebar>.sidebar-content>:last-child:not(.sidebar-title){margin-bottom:0}.bslib-sidebar-layout>.sidebar>.sidebar-content>.accordion{margin-left:calc(-1*var(--bslib-sidebar-padding));margin-right:calc(-1*var(--bslib-sidebar-padding))}.bslib-sidebar-layout>.sidebar>.sidebar-content>.accordion:last-child{margin-bottom:calc(-1*var(--bslib-sidebar-padding))}.bslib-sidebar-layout>.sidebar>.sidebar-content>.accordion:not(:last-child){margin-bottom:1rem}.bslib-sidebar-layout>.sidebar>.sidebar-content>.accordion 
.accordion-body{display:flex;flex-direction:column}.bslib-sidebar-layout>.sidebar>.sidebar-content>.accordion:not(:first-child) .accordion-item:first-child{border-top:var(--bs-accordion-border-width) solid var(--bs-accordion-border-color)}.bslib-sidebar-layout>.sidebar>.sidebar-content>.accordion:not(:last-child) .accordion-item:last-child{border-bottom:var(--bs-accordion-border-width) solid var(--bs-accordion-border-color)}.bslib-sidebar-layout>.sidebar>.sidebar-content.has-accordion>.sidebar-title{border-bottom:none;padding-bottom:0}.bslib-sidebar-layout>.sidebar .shiny-input-container{width:100%}.bslib-sidebar-layout[data-bslib-sidebar-open=always]>.sidebar>.sidebar-content{padding-top:var(--bslib-sidebar-padding)}.bslib-sidebar-layout>.collapse-toggle{grid-row:1/2;grid-column:1/2;display:inline-flex;align-items:center;position:absolute;right:calc(var(--bslib-sidebar-icon-size));top:calc(var(--bslib-sidebar-icon-size, 1rem)/2);border:none;border-radius:var(--bslib-collapse-toggle-border-radius);height:var(--bslib-sidebar-icon-button-size, 2rem);width:var(--bslib-sidebar-icon-button-size, 2rem);display:flex;align-items:center;justify-content:center;padding:0;color:var(--bslib-sidebar-fg);background-color:unset;transition:color var(--bslib-sidebar-transition-easing-x) var(--bslib-sidebar-transition-duration),top var(--bslib-sidebar-transition-easing-x) var(--bslib-sidebar-transition-duration),right var(--bslib-sidebar-transition-easing-x) var(--bslib-sidebar-transition-duration),left var(--bslib-sidebar-transition-easing-x) var(--bslib-sidebar-transition-duration)}.bslib-sidebar-layout>.collapse-toggle:hover{background-color:var(--bslib-sidebar-toggle-bg)}.bslib-sidebar-layout>.collapse-toggle>.collapse-icon{opacity:.8;width:var(--bslib-sidebar-icon-size);height:var(--bslib-sidebar-icon-size);transform:rotateY(var(--bslib-collapse-toggle-transform));transition:transform var(--bslib-sidebar-toggle-transition-easing) 
var(--bslib-sidebar-transition-duration)}.bslib-sidebar-layout>.collapse-toggle:hover>.collapse-icon{opacity:1}.bslib-sidebar-layout .sidebar-title{font-size:1.25rem;line-height:1.25;margin-top:0;margin-bottom:1rem;padding-bottom:1rem;border-bottom:var(--bslib-sidebar-border)}.bslib-sidebar-layout.sidebar-right{grid-template-columns:var(--bslib-sidebar-column-main) min(100% - var(--bslib-sidebar-icon-size),var(--bslib-sidebar-width, 250px))}.bslib-sidebar-layout.sidebar-right>.main{grid-column:1/2;border-top-right-radius:0;border-bottom-right-radius:0;border-top-left-radius:inherit;border-bottom-left-radius:inherit}.bslib-sidebar-layout.sidebar-right>.sidebar{grid-column:2/3;border-right:none;border-left:var(--bslib-sidebar-vert-border);border-top-left-radius:0;border-bottom-left-radius:0}.bslib-sidebar-layout.sidebar-right>.collapse-toggle{grid-column:2/3;left:var(--bslib-sidebar-icon-size);right:unset;border:var(--bslib-collapse-toggle-border)}.bslib-sidebar-layout.sidebar-right>.collapse-toggle>.collapse-icon{transform:rotateY(var(--bslib-collapse-toggle-right-transform))}.bslib-sidebar-layout.sidebar-collapsed{--bslib-collapse-toggle-transform: 180deg;--bslib-collapse-toggle-right-transform: 0deg;--bslib-sidebar-vert-border: none;grid-template-columns:0 minmax(0, 1fr)}.bslib-sidebar-layout.sidebar-collapsed.sidebar-right{grid-template-columns:minmax(0, 1fr) 0}.bslib-sidebar-layout.sidebar-collapsed:not(.transitioning)>.sidebar>*{display:none}.bslib-sidebar-layout.sidebar-collapsed>.main{border-radius:inherit}.bslib-sidebar-layout.sidebar-collapsed:not(.sidebar-right)>.main{padding-left:var(--bslib-sidebar-padding-icon)}.bslib-sidebar-layout.sidebar-collapsed.sidebar-right>.main{padding-right:var(--bslib-sidebar-padding-icon)}.bslib-sidebar-layout.sidebar-collapsed>.collapse-toggle{color:var(--bslib-sidebar-main-fg);top:calc(var(--bslib-sidebar-overlap-counter, 0)*(var(--bslib-sidebar-icon-size) + var(--bslib-sidebar-padding)) + var(--bslib-sidebar-icon-size, 
1rem)/2);right:calc(-2.5*var(--bslib-sidebar-icon-size) - var(--bs-card-border-width, 1px))}.bslib-sidebar-layout.sidebar-collapsed.sidebar-right>.collapse-toggle{left:calc(-2.5*var(--bslib-sidebar-icon-size) - var(--bs-card-border-width, 1px));right:unset}@media(min-width: 576px){.bslib-sidebar-layout.transitioning>.sidebar>.sidebar-content{display:none}}@media(max-width: 575.98px){.bslib-sidebar-layout[data-bslib-sidebar-open=desktop]{--bslib-sidebar-js-init-collapsed: true}.bslib-sidebar-layout>.sidebar,.bslib-sidebar-layout.sidebar-right>.sidebar{border:none}.bslib-sidebar-layout>.main,.bslib-sidebar-layout.sidebar-right>.main{grid-column:1/3}.bslib-sidebar-layout[data-bslib-sidebar-open=always]{display:block !important}.bslib-sidebar-layout[data-bslib-sidebar-open=always]>.sidebar{max-height:var(--bslib-sidebar-max-height-mobile);overflow-y:auto;border-top:var(--bslib-sidebar-vert-border)}.bslib-sidebar-layout:not([data-bslib-sidebar-open=always]){grid-template-columns:100% 0}.bslib-sidebar-layout:not([data-bslib-sidebar-open=always]):not(.sidebar-collapsed)>.sidebar{z-index:1}.bslib-sidebar-layout:not([data-bslib-sidebar-open=always]):not(.sidebar-collapsed)>.collapse-toggle{z-index:1}.bslib-sidebar-layout:not([data-bslib-sidebar-open=always]).sidebar-right{grid-template-columns:0 100%}.bslib-sidebar-layout:not([data-bslib-sidebar-open=always]).sidebar-collapsed{grid-template-columns:0 100%}.bslib-sidebar-layout:not([data-bslib-sidebar-open=always]).sidebar-collapsed.sidebar-right{grid-template-columns:100% 0}.bslib-sidebar-layout:not([data-bslib-sidebar-open=always]):not(.sidebar-right)>.main{padding-left:var(--bslib-sidebar-padding-icon)}.bslib-sidebar-layout:not([data-bslib-sidebar-open=always]).sidebar-right>.main{padding-right:var(--bslib-sidebar-padding-icon)}.bslib-sidebar-layout:not([data-bslib-sidebar-open=always])>.main{opacity:0;transition:opacity var(--bslib-sidebar-transition-easing-x) 
var(--bslib-sidebar-transition-duration)}.bslib-sidebar-layout:not([data-bslib-sidebar-open=always]).sidebar-collapsed>.main{opacity:1}}.navbar+.container-fluid:has(>.tab-content>.tab-pane.active.html-fill-container),.navbar+.container-sm:has(>.tab-content>.tab-pane.active.html-fill-container),.navbar+.container-md:has(>.tab-content>.tab-pane.active.html-fill-container),.navbar+.container-lg:has(>.tab-content>.tab-pane.active.html-fill-container),.navbar+.container-xl:has(>.tab-content>.tab-pane.active.html-fill-container),.navbar+.container-xxl:has(>.tab-content>.tab-pane.active.html-fill-container){padding-left:0;padding-right:0}.navbar+.container-fluid>.tab-content>.tab-pane.active.html-fill-container,.navbar+.container-sm>.tab-content>.tab-pane.active.html-fill-container,.navbar+.container-md>.tab-content>.tab-pane.active.html-fill-container,.navbar+.container-lg>.tab-content>.tab-pane.active.html-fill-container,.navbar+.container-xl>.tab-content>.tab-pane.active.html-fill-container,.navbar+.container-xxl>.tab-content>.tab-pane.active.html-fill-container{padding:var(--bslib-spacer, 1rem);gap:var(--bslib-spacer, 
1rem)}.navbar+.container-fluid>.tab-content>.tab-pane.active.html-fill-container:has(>.bslib-sidebar-layout:only-child),.navbar+.container-sm>.tab-content>.tab-pane.active.html-fill-container:has(>.bslib-sidebar-layout:only-child),.navbar+.container-md>.tab-content>.tab-pane.active.html-fill-container:has(>.bslib-sidebar-layout:only-child),.navbar+.container-lg>.tab-content>.tab-pane.active.html-fill-container:has(>.bslib-sidebar-layout:only-child),.navbar+.container-xl>.tab-content>.tab-pane.active.html-fill-container:has(>.bslib-sidebar-layout:only-child),.navbar+.container-xxl>.tab-content>.tab-pane.active.html-fill-container:has(>.bslib-sidebar-layout:only-child){padding:0}.navbar+.container-fluid>.tab-content>.tab-pane.active.html-fill-container>.bslib-sidebar-layout:only-child:not([data-bslib-sidebar-border=true]),.navbar+.container-sm>.tab-content>.tab-pane.active.html-fill-container>.bslib-sidebar-layout:only-child:not([data-bslib-sidebar-border=true]),.navbar+.container-md>.tab-content>.tab-pane.active.html-fill-container>.bslib-sidebar-layout:only-child:not([data-bslib-sidebar-border=true]),.navbar+.container-lg>.tab-content>.tab-pane.active.html-fill-container>.bslib-sidebar-layout:only-child:not([data-bslib-sidebar-border=true]),.navbar+.container-xl>.tab-content>.tab-pane.active.html-fill-container>.bslib-sidebar-layout:only-child:not([data-bslib-sidebar-border=true]),.navbar+.container-xxl>.tab-content>.tab-pane.active.html-fill-container>.bslib-sidebar-layout:only-child:not([data-bslib-sidebar-border=true]){border-left:none;border-right:none;border-bottom:none}.navbar+.container-fluid>.tab-content>.tab-pane.active.html-fill-container>.bslib-sidebar-layout:only-child:not([data-bslib-sidebar-border-radius=true]),.navbar+.container-sm>.tab-content>.tab-pane.active.html-fill-container>.bslib-sidebar-layout:only-child:not([data-bslib-sidebar-border-radius=true]),.navbar+.container-md>.tab-content>.tab-pane.active.html-fill-container>.bslib-sidebar-layout:o
nly-child:not([data-bslib-sidebar-border-radius=true]),.navbar+.container-lg>.tab-content>.tab-pane.active.html-fill-container>.bslib-sidebar-layout:only-child:not([data-bslib-sidebar-border-radius=true]),.navbar+.container-xl>.tab-content>.tab-pane.active.html-fill-container>.bslib-sidebar-layout:only-child:not([data-bslib-sidebar-border-radius=true]),.navbar+.container-xxl>.tab-content>.tab-pane.active.html-fill-container>.bslib-sidebar-layout:only-child:not([data-bslib-sidebar-border-radius=true]){border-radius:0}.navbar+div>.bslib-sidebar-layout{border-top:var(--bslib-sidebar-border)}:root{--bslib-value-box-shadow: none;--bslib-value-box-border-width-auto-yes: var(--bslib-value-box-border-width-baseline);--bslib-value-box-border-width-auto-no: 0;--bslib-value-box-border-width-baseline: 1px}.bslib-value-box{border-width:var(--bslib-value-box-border-width-auto-no, var(--bslib-value-box-border-width-baseline));container-name:bslib-value-box;container-type:inline-size}.bslib-value-box.card{box-shadow:var(--bslib-value-box-shadow)}.bslib-value-box.border-auto{border-width:var(--bslib-value-box-border-width-auto-yes, var(--bslib-value-box-border-width-baseline))}.bslib-value-box.default{--bslib-value-box-bg-default: var(--bs-card-bg, #fff);--bslib-value-box-border-color-default: var(--bs-card-border-color, rgba(0, 0, 0, 0.175));color:var(--bslib-value-box-color);background-color:var(--bslib-value-box-bg, var(--bslib-value-box-bg-default));border-color:var(--bslib-value-box-border-color, var(--bslib-value-box-border-color-default))}.bslib-value-box .value-box-grid{display:grid;grid-template-areas:"left right";align-items:center;overflow:hidden}.bslib-value-box .value-box-showcase{height:100%;max-height:var(---bslib-value-box-showcase-max-h, 100%)}.bslib-value-box .value-box-showcase,.bslib-value-box .value-box-showcase>.html-fill-item{width:100%}.bslib-value-box[data-full-screen=true] .value-box-showcase{max-height:var(---bslib-value-box-showcase-max-h-fs, 
100%)}@media screen and (min-width: 575.98px){@container bslib-value-box (max-width: 300px){.bslib-value-box:not(.showcase-bottom) .value-box-grid{grid-template-columns:1fr !important;grid-template-rows:auto auto;grid-template-areas:"top" "bottom"}.bslib-value-box:not(.showcase-bottom) .value-box-grid .value-box-showcase{grid-area:top !important}.bslib-value-box:not(.showcase-bottom) .value-box-grid .value-box-area{grid-area:bottom !important;justify-content:end}}}.bslib-value-box .value-box-area{justify-content:center;padding:1.5rem 1rem;font-size:.9rem;font-weight:500}.bslib-value-box .value-box-area *{margin-bottom:0;margin-top:0}.bslib-value-box .value-box-title{font-size:1rem;margin-top:0;margin-bottom:.5rem;font-weight:400;line-height:1.2}.bslib-value-box .value-box-title:empty::after{content:" "}.bslib-value-box .value-box-value{font-size:calc(1.29rem + 0.48vw);margin-top:0;margin-bottom:.5rem;font-weight:400;line-height:1.2}@media(min-width: 1200px){.bslib-value-box .value-box-value{font-size:1.65rem}}.bslib-value-box .value-box-value:empty::after{content:" "}.bslib-value-box .value-box-showcase{align-items:center;justify-content:center;margin-top:auto;margin-bottom:auto;padding:1rem}.bslib-value-box .value-box-showcase .bi,.bslib-value-box .value-box-showcase .fa,.bslib-value-box .value-box-showcase .fab,.bslib-value-box .value-box-showcase .fas,.bslib-value-box .value-box-showcase .far{opacity:.85;min-width:50px;max-width:125%}.bslib-value-box .value-box-showcase .bi,.bslib-value-box .value-box-showcase .fa,.bslib-value-box .value-box-showcase .fab,.bslib-value-box .value-box-showcase .fas,.bslib-value-box .value-box-showcase .far{font-size:4rem}.bslib-value-box.showcase-top-right .value-box-grid{grid-template-columns:1fr var(---bslib-value-box-showcase-w, 50%)}.bslib-value-box.showcase-top-right .value-box-grid 
.value-box-showcase{grid-area:right;margin-left:auto;align-self:start;align-items:end;padding-left:0;padding-bottom:0}.bslib-value-box.showcase-top-right .value-box-grid .value-box-area{grid-area:left;align-self:end}.bslib-value-box.showcase-top-right[data-full-screen=true] .value-box-grid{grid-template-columns:auto var(---bslib-value-box-showcase-w-fs, 1fr)}.bslib-value-box.showcase-top-right[data-full-screen=true] .value-box-grid>div{align-self:center}.bslib-value-box.showcase-top-right:not([data-full-screen=true]) .value-box-showcase{margin-top:0}@container bslib-value-box (max-width: 300px){.bslib-value-box.showcase-top-right:not([data-full-screen=true]) .value-box-grid .value-box-showcase{padding-left:1rem}}.bslib-value-box.showcase-left-center .value-box-grid{grid-template-columns:var(---bslib-value-box-showcase-w, 30%) auto}.bslib-value-box.showcase-left-center[data-full-screen=true] .value-box-grid{grid-template-columns:var(---bslib-value-box-showcase-w-fs, 1fr) auto}.bslib-value-box.showcase-left-center:not([data-fill-screen=true]) .value-box-grid .value-box-showcase{grid-area:left}.bslib-value-box.showcase-left-center:not([data-fill-screen=true]) .value-box-grid .value-box-area{grid-area:right}.bslib-value-box.showcase-bottom .value-box-grid{grid-template-columns:1fr;grid-template-rows:1fr var(---bslib-value-box-showcase-h, auto);grid-template-areas:"top" "bottom";overflow:hidden}.bslib-value-box.showcase-bottom .value-box-grid .value-box-showcase{grid-area:bottom;padding:0;margin:0}.bslib-value-box.showcase-bottom .value-box-grid .value-box-area{grid-area:top}.bslib-value-box.showcase-bottom[data-full-screen=true] .value-box-grid{grid-template-rows:1fr var(---bslib-value-box-showcase-h-fs, 2fr)}.bslib-value-box.showcase-bottom[data-full-screen=true] .value-box-grid .value-box-showcase{padding:1rem}[data-bs-theme=dark] .bslib-value-box{--bslib-value-box-shadow: 0 0.5rem 1rem rgb(0 0 0 / 50%)}.bslib-card{overflow:auto}.bslib-card 
.card-body+.card-body{padding-top:0}.bslib-card .card-body{overflow:auto}.bslib-card .card-body p{margin-top:0}.bslib-card .card-body p:last-child{margin-bottom:0}.bslib-card .card-body{max-height:var(--bslib-card-body-max-height, none)}.bslib-card[data-full-screen=true]>.card-body{max-height:var(--bslib-card-body-max-height-full-screen, none)}.bslib-card .card-header .form-group{margin-bottom:0}.bslib-card .card-header .selectize-control{margin-bottom:0}.bslib-card .card-header .selectize-control .item{margin-right:1.15rem}.bslib-card .card-footer{margin-top:auto}.bslib-card .bslib-navs-card-title{display:flex;flex-wrap:wrap;justify-content:space-between;align-items:center}.bslib-card .bslib-navs-card-title .nav{margin-left:auto}.bslib-card .bslib-sidebar-layout:not([data-bslib-sidebar-border=true]){border:none}.bslib-card .bslib-sidebar-layout:not([data-bslib-sidebar-border-radius=true]){border-top-left-radius:0;border-top-right-radius:0}[data-full-screen=true]{position:fixed;inset:3.5rem 1rem 1rem;height:auto !important;max-height:none !important;width:auto !important;z-index:1070}.bslib-full-screen-enter{display:none;position:absolute;bottom:var(--bslib-full-screen-enter-bottom, 0.2rem);right:var(--bslib-full-screen-enter-right, 0);top:var(--bslib-full-screen-enter-top);left:var(--bslib-full-screen-enter-left);color:var(--bslib-color-fg, var(--bs-card-color));background-color:var(--bslib-color-bg, var(--bs-card-bg, var(--bs-body-bg)));border:var(--bs-card-border-width) solid var(--bslib-color-fg, var(--bs-card-border-color));box-shadow:0 2px 4px rgba(0,0,0,.15);margin:.2rem .4rem;padding:.55rem !important;font-size:.8rem;cursor:pointer;opacity:.7;z-index:1070}.bslib-full-screen-enter:hover{opacity:1}.card[data-full-screen=false]:hover>*>.bslib-full-screen-enter{display:block}.bslib-has-full-screen .card:hover>*>.bslib-full-screen-enter{display:none}@media(max-width: 575.98px){.bslib-full-screen-enter{display:none 
!important}}.bslib-full-screen-exit{position:relative;top:1.35rem;font-size:.9rem;cursor:pointer;text-decoration:none;display:flex;float:right;margin-right:2.15rem;align-items:center;color:rgba(var(--bs-body-bg-rgb), 0.8)}.bslib-full-screen-exit:hover{color:rgba(var(--bs-body-bg-rgb), 1)}.bslib-full-screen-exit svg{margin-left:.5rem;font-size:1.5rem}#bslib-full-screen-overlay{position:fixed;inset:0;background-color:rgba(var(--bs-body-color-rgb), 0.6);backdrop-filter:blur(2px);-webkit-backdrop-filter:blur(2px);z-index:1069;animation:bslib-full-screen-overlay-enter 400ms cubic-bezier(0.6, 0.02, 0.65, 1) forwards}@keyframes bslib-full-screen-overlay-enter{0%{opacity:0}100%{opacity:1}}:root{--bslib-page-sidebar-title-bg: #f8f9fa;--bslib-page-sidebar-title-color: #000}.bslib-page-title{background-color:var(--bslib-page-sidebar-title-bg);color:var(--bslib-page-sidebar-title-color);font-size:1.25rem;font-weight:300;padding:var(--bslib-spacer, 1rem);padding-left:1.5rem;margin-bottom:0;border-bottom:1px solid #dee2e6}@media(min-width: 576px){.nav:not(.nav-hidden){display:flex !important;display:-webkit-flex !important}.nav:not(.nav-hidden):not(.nav-stacked):not(.flex-column){float:none !important}.nav:not(.nav-hidden):not(.nav-stacked):not(.flex-column)>.bslib-nav-spacer{margin-left:auto !important}.nav:not(.nav-hidden):not(.nav-stacked):not(.flex-column)>.form-inline{margin-top:auto;margin-bottom:auto}.nav:not(.nav-hidden).nav-stacked{flex-direction:column;-webkit-flex-direction:column;height:100%}.nav:not(.nav-hidden).nav-stacked>.bslib-nav-spacer{margin-top:auto !important}}.bslib-grid{display:grid !important;gap:var(--bslib-spacer, 1rem);height:var(--bslib-grid-height)}.bslib-grid.grid{grid-template-columns:repeat(var(--bs-columns, 12), minmax(0, 1fr));grid-template-rows:unset;grid-auto-rows:var(--bslib-grid--row-heights);--bslib-grid--row-heights--xs: unset;--bslib-grid--row-heights--sm: unset;--bslib-grid--row-heights--md: unset;--bslib-grid--row-heights--lg: 
unset;--bslib-grid--row-heights--xl: unset;--bslib-grid--row-heights--xxl: unset}.bslib-grid.grid.bslib-grid--row-heights--xs{--bslib-grid--row-heights: var(--bslib-grid--row-heights--xs)}@media(min-width: 576px){.bslib-grid.grid.bslib-grid--row-heights--sm{--bslib-grid--row-heights: var(--bslib-grid--row-heights--sm)}}@media(min-width: 768px){.bslib-grid.grid.bslib-grid--row-heights--md{--bslib-grid--row-heights: var(--bslib-grid--row-heights--md)}}@media(min-width: 992px){.bslib-grid.grid.bslib-grid--row-heights--lg{--bslib-grid--row-heights: var(--bslib-grid--row-heights--lg)}}@media(min-width: 1200px){.bslib-grid.grid.bslib-grid--row-heights--xl{--bslib-grid--row-heights: var(--bslib-grid--row-heights--xl)}}@media(min-width: 1400px){.bslib-grid.grid.bslib-grid--row-heights--xxl{--bslib-grid--row-heights: var(--bslib-grid--row-heights--xxl)}}.bslib-grid>*>.shiny-input-container{width:100%}.bslib-grid-item{grid-column:auto/span 1}@media(max-width: 767.98px){.bslib-grid-item{grid-column:1/-1}}@media(max-width: 575.98px){.bslib-grid{grid-template-columns:1fr !important;height:var(--bslib-grid-height-mobile)}.bslib-grid.grid{height:unset !important;grid-auto-rows:var(--bslib-grid--row-heights--xs, auto)}}html{height:100%}.bslib-page-fill{width:100%;height:100%;margin:0;padding:var(--bslib-spacer, 1rem);gap:var(--bslib-spacer, 1rem)}@media(max-width: 575.98px){.bslib-page-fill{height:var(--bslib-page-fill-mobile-height, auto)}}.html-fill-container{display:flex;flex-direction:column;min-height:0;min-width:0}.html-fill-container>.html-fill-item{flex:1 1 auto;min-height:0;min-width:0}.html-fill-container>:not(.html-fill-item){flex:0 0 auto}.quarto-container{min-height:calc(100vh - 132px)}body.hypothesis-enabled #quarto-header{margin-right:16px}footer.footer .nav-footer,#quarto-header>nav{padding-left:1em;padding-right:1em}footer.footer div.nav-footer p:first-child{margin-top:0}footer.footer div.nav-footer 
p:last-child{margin-bottom:0}#quarto-content>*{padding-top:14px}#quarto-content>#quarto-sidebar-glass{padding-top:0px}@media(max-width: 991.98px){#quarto-content>*{padding-top:0}#quarto-content .subtitle{padding-top:14px}#quarto-content section:first-of-type h2:first-of-type,#quarto-content section:first-of-type .h2:first-of-type{margin-top:1rem}}.headroom-target,header.headroom{will-change:transform;transition:position 200ms linear;transition:all 200ms linear}header.headroom--pinned{transform:translateY(0%)}header.headroom--unpinned{transform:translateY(-100%)}.navbar-container{width:100%}.navbar-brand{overflow:hidden;text-overflow:ellipsis}.navbar-brand-container{max-width:calc(100% - 115px);min-width:0;display:flex;align-items:center}@media(min-width: 992px){.navbar-brand-container{margin-right:1em}}.navbar-brand.navbar-brand-logo{margin-right:4px;display:inline-flex}.navbar-toggler{flex-basis:content;flex-shrink:0}.navbar .navbar-brand-container{order:2}.navbar .navbar-toggler{order:1}.navbar .navbar-container>.navbar-nav{order:20}.navbar .navbar-container>.navbar-brand-container{margin-left:0 !important;margin-right:0 !important}.navbar .navbar-collapse{order:20}.navbar #quarto-search{order:4;margin-left:auto}.navbar .navbar-toggler{margin-right:.5em}.navbar-collapse .quarto-navbar-tools{margin-left:.5em}.navbar-logo{max-height:24px;width:auto;padding-right:4px}nav .nav-item:not(.compact){padding-top:1px}nav .nav-link i,nav .dropdown-item i{padding-right:1px}.navbar-expand-lg .navbar-nav .nav-link{padding-left:.6rem;padding-right:.6rem}nav .nav-item.compact .nav-link{padding-left:.5rem;padding-right:.5rem;font-size:1.1rem}.navbar .quarto-navbar-tools{order:3}.navbar .quarto-navbar-tools div.dropdown{display:inline-block}.navbar .quarto-navbar-tools .quarto-navigation-tool{color:#545555}.navbar .quarto-navbar-tools .quarto-navigation-tool:hover{color:#1f4eb6}.navbar-nav .dropdown-menu{min-width:220px;font-size:.9rem}.navbar .navbar-nav 
.nav-link.dropdown-toggle::after{opacity:.75;vertical-align:.175em}.navbar ul.dropdown-menu{padding-top:0;padding-bottom:0}.navbar .dropdown-header{text-transform:uppercase;font-size:.8rem;padding:0 .5rem}.navbar .dropdown-item{padding:.4rem .5rem}.navbar .dropdown-item>i.bi{margin-left:.1rem;margin-right:.25em}.sidebar #quarto-search{margin-top:-1px}.sidebar #quarto-search svg.aa-SubmitIcon{width:16px;height:16px}.sidebar-navigation a{color:inherit}.sidebar-title{margin-top:.25rem;padding-bottom:.5rem;font-size:1.3rem;line-height:1.6rem;visibility:visible}.sidebar-title>a{font-size:inherit;text-decoration:none}.sidebar-title .sidebar-tools-main{margin-top:-6px}@media(max-width: 991.98px){#quarto-sidebar div.sidebar-header{padding-top:.2em}}.sidebar-header-stacked .sidebar-title{margin-top:.6rem}.sidebar-logo{max-width:90%;padding-bottom:.5rem}.sidebar-logo-link{text-decoration:none}.sidebar-navigation li a{text-decoration:none}.sidebar-navigation .quarto-navigation-tool{opacity:.7;font-size:.875rem}#quarto-sidebar>nav>.sidebar-tools-main{margin-left:14px}.sidebar-tools-main{display:inline-flex;margin-left:0px;order:2}.sidebar-tools-main:not(.tools-wide){vertical-align:middle}.sidebar-navigation .quarto-navigation-tool.dropdown-toggle::after{display:none}.sidebar.sidebar-navigation>*{padding-top:1em}.sidebar-item{margin-bottom:.2em;line-height:1rem;margin-top:.4rem}.sidebar-section{padding-left:.5em;padding-bottom:.2em}.sidebar-item .sidebar-item-container{display:flex;justify-content:space-between;cursor:pointer}.sidebar-item-toggle:hover{cursor:pointer}.sidebar-item .sidebar-item-toggle .bi{font-size:.7rem;text-align:center}.sidebar-item .sidebar-item-toggle .bi-chevron-right::before{transition:transform 200ms ease}.sidebar-item .sidebar-item-toggle[aria-expanded=false] .bi-chevron-right::before{transform:none}.sidebar-item .sidebar-item-toggle[aria-expanded=true] .bi-chevron-right::before{transform:rotate(90deg)}.sidebar-item-text{width:100%}.sidebar-navigation 
.sidebar-divider{margin-left:0;margin-right:0;margin-top:.5rem;margin-bottom:.5rem}@media(max-width: 991.98px){.quarto-secondary-nav{display:block}.quarto-secondary-nav button.quarto-search-button{padding-right:0em;padding-left:2em}.quarto-secondary-nav button.quarto-btn-toggle{margin-left:-0.75rem;margin-right:.15rem}.quarto-secondary-nav nav.quarto-title-breadcrumbs{display:none}.quarto-secondary-nav nav.quarto-page-breadcrumbs{display:flex;align-items:center;padding-right:1em;margin-left:-0.25em}.quarto-secondary-nav nav.quarto-page-breadcrumbs a{text-decoration:none}.quarto-secondary-nav nav.quarto-page-breadcrumbs ol.breadcrumb{margin-bottom:0}}@media(min-width: 992px){.quarto-secondary-nav{display:none}}.quarto-title-breadcrumbs .breadcrumb{margin-bottom:.5em;font-size:.9rem}.quarto-title-breadcrumbs .breadcrumb li:last-of-type a{color:#6c757d}.quarto-secondary-nav .quarto-btn-toggle{color:#595959}.quarto-secondary-nav[aria-expanded=false] .quarto-btn-toggle .bi-chevron-right::before{transform:none}.quarto-secondary-nav[aria-expanded=true] .quarto-btn-toggle .bi-chevron-right::before{transform:rotate(90deg)}.quarto-secondary-nav .quarto-btn-toggle .bi-chevron-right::before{transition:transform 200ms ease}.quarto-secondary-nav{cursor:pointer}.no-decor{text-decoration:none}.quarto-secondary-nav-title{margin-top:.3em;color:#595959;padding-top:4px}.quarto-secondary-nav nav.quarto-page-breadcrumbs{color:#595959}.quarto-secondary-nav nav.quarto-page-breadcrumbs a{color:#595959}.quarto-secondary-nav nav.quarto-page-breadcrumbs a:hover{color:rgba(33,81,191,.8)}.quarto-secondary-nav nav.quarto-page-breadcrumbs .breadcrumb-item::before{color:#8c8c8c}.breadcrumb-item{line-height:1.2rem}div.sidebar-item-container{color:#595959}div.sidebar-item-container:hover,div.sidebar-item-container:focus{color:rgba(33,81,191,.8)}div.sidebar-item-container.disabled{color:rgba(89,89,89,.75)}div.sidebar-item-container .active,div.sidebar-item-container 
.show>.nav-link,div.sidebar-item-container .sidebar-link>code{color:#2151bf}div.sidebar.sidebar-navigation.rollup.quarto-sidebar-toggle-contents,nav.sidebar.sidebar-navigation:not(.rollup){background-color:#fff}@media(max-width: 991.98px){.sidebar-navigation .sidebar-item a,.nav-page .nav-page-text,.sidebar-navigation{font-size:1rem}.sidebar-navigation ul.sidebar-section.depth1 .sidebar-section-item{font-size:1.1rem}.sidebar-logo{display:none}.sidebar.sidebar-navigation{position:static;border-bottom:1px solid #dee2e6}.sidebar.sidebar-navigation.collapsing{position:fixed;z-index:1000}.sidebar.sidebar-navigation.show{position:fixed;z-index:1000}.sidebar.sidebar-navigation{min-height:100%}nav.quarto-secondary-nav{background-color:#fff;border-bottom:1px solid #dee2e6}.quarto-banner nav.quarto-secondary-nav{background-color:#f8f9fa;color:#545555;border-top:1px solid #dee2e6}.sidebar .sidebar-footer{visibility:visible;padding-top:1rem;position:inherit}.sidebar-tools-collapse{display:block}}#quarto-sidebar{transition:width .15s ease-in}#quarto-sidebar>*{padding-right:1em}@media(max-width: 991.98px){#quarto-sidebar .sidebar-menu-container{white-space:nowrap;min-width:225px}#quarto-sidebar.show{transition:width .15s ease-out}}@media(min-width: 992px){#quarto-sidebar{display:flex;flex-direction:column}.nav-page .nav-page-text,.sidebar-navigation .sidebar-section .sidebar-item{font-size:.875rem}.sidebar-navigation .sidebar-item{font-size:.925rem}.sidebar.sidebar-navigation{display:block;position:sticky}.sidebar-search{width:100%}.sidebar .sidebar-footer{visibility:visible}}@media(min-width: 992px){#quarto-sidebar-glass{display:none}}@media(max-width: 991.98px){#quarto-sidebar-glass{position:fixed;top:0;bottom:0;left:0;right:0;background-color:rgba(255,255,255,0);transition:background-color .15s ease-in;z-index:-1}#quarto-sidebar-glass.collapsing{z-index:1000}#quarto-sidebar-glass.show{transition:background-color .15s 
ease-out;background-color:rgba(102,102,102,.4);z-index:1000}}.sidebar .sidebar-footer{padding:.5rem 1rem;align-self:flex-end;color:#6c757d;width:100%}.quarto-page-breadcrumbs .breadcrumb-item+.breadcrumb-item,.quarto-page-breadcrumbs .breadcrumb-item{padding-right:.33em;padding-left:0}.quarto-page-breadcrumbs .breadcrumb-item::before{padding-right:.33em}.quarto-sidebar-footer{font-size:.875em}.sidebar-section .bi-chevron-right{vertical-align:middle}.sidebar-section .bi-chevron-right::before{font-size:.9em}.notransition{-webkit-transition:none !important;-moz-transition:none !important;-o-transition:none !important;transition:none !important}.btn:focus:not(:focus-visible){box-shadow:none}.page-navigation{display:flex;justify-content:space-between}.nav-page{padding-bottom:.75em}.nav-page .bi{font-size:1.8rem;vertical-align:middle}.nav-page .nav-page-text{padding-left:.25em;padding-right:.25em}.nav-page a{color:#6c757d;text-decoration:none;display:flex;align-items:center}.nav-page a:hover{color:#1f4eb6}.nav-footer .toc-actions{padding-bottom:.5em;padding-top:.5em}.nav-footer .toc-actions a,.nav-footer .toc-actions a:hover{text-decoration:none}.nav-footer .toc-actions ul{display:flex;list-style:none}.nav-footer .toc-actions ul :first-child{margin-left:auto}.nav-footer .toc-actions ul :last-child{margin-right:auto}.nav-footer .toc-actions ul li{padding-right:1.5em}.nav-footer .toc-actions ul li i.bi{padding-right:.4em}.nav-footer .toc-actions ul li:last-of-type{padding-right:0}.nav-footer{display:flex;flex-direction:row;flex-wrap:wrap;justify-content:space-between;align-items:baseline;text-align:center;padding-top:.5rem;padding-bottom:.5rem;background-color:#fff}body.nav-fixed{padding-top:64px}.nav-footer-contents{color:#6c757d;margin-top:.25rem}.nav-footer{min-height:3.5em;color:#757575}.nav-footer a{color:#757575}.nav-footer .nav-footer-left{font-size:.825em}.nav-footer .nav-footer-center{font-size:.825em}.nav-footer .nav-footer-right{font-size:.825em}.nav-footer-left 
.footer-items,.nav-footer-center .footer-items,.nav-footer-right .footer-items{display:inline-flex;padding-top:.3em;padding-bottom:.3em;margin-bottom:0em}.nav-footer-left .footer-items .nav-link,.nav-footer-center .footer-items .nav-link,.nav-footer-right .footer-items .nav-link{padding-left:.6em;padding-right:.6em}@media(min-width: 768px){.nav-footer-left{flex:1 1 0px;text-align:left}}@media(max-width: 575.98px){.nav-footer-left{margin-bottom:1em;flex:100%}}@media(min-width: 768px){.nav-footer-right{flex:1 1 0px;text-align:right}}@media(max-width: 575.98px){.nav-footer-right{margin-bottom:1em;flex:100%}}.nav-footer-center{text-align:center;min-height:3em}@media(min-width: 768px){.nav-footer-center{flex:1 1 0px}}.nav-footer-center .footer-items{justify-content:center}@media(max-width: 767.98px){.nav-footer-center{margin-bottom:1em;flex:100%}}@media(max-width: 767.98px){.nav-footer-center{margin-top:3em;order:10}}.navbar .quarto-reader-toggle.reader .quarto-reader-toggle-btn{background-color:#545555;border-radius:3px}@media(max-width: 991.98px){.quarto-reader-toggle{display:none}}.quarto-reader-toggle.reader.quarto-navigation-tool .quarto-reader-toggle-btn{background-color:#595959;border-radius:3px}.quarto-reader-toggle .quarto-reader-toggle-btn{display:inline-flex;padding-left:.2em;padding-right:.2em;margin-left:-0.2em;margin-right:-0.2em;text-align:center}.navbar .quarto-reader-toggle:not(.reader) .bi::before{background-image:url('data:image/svg+xml,')}.navbar .quarto-reader-toggle.reader .bi::before{background-image:url('data:image/svg+xml,')}.sidebar-navigation .quarto-reader-toggle:not(.reader) .bi::before{background-image:url('data:image/svg+xml,')}.sidebar-navigation .quarto-reader-toggle.reader .bi::before{background-image:url('data:image/svg+xml,')}#quarto-back-to-top{display:none;position:fixed;bottom:50px;background-color:#fff;border-radius:.25rem;box-shadow:0 .2rem .5rem #6c757d,0 0 .05rem 
#6c757d;color:#6c757d;text-decoration:none;font-size:.9em;text-align:center;left:50%;padding:.4rem .8rem;transform:translate(-50%, 0)}#quarto-announcement{padding:.5em;display:flex;justify-content:space-between;margin-bottom:0;font-size:.9em}#quarto-announcement .quarto-announcement-content{margin-right:auto}#quarto-announcement .quarto-announcement-content p{margin-bottom:0}#quarto-announcement .quarto-announcement-icon{margin-right:.5em;font-size:1.2em;margin-top:-0.15em}#quarto-announcement .quarto-announcement-action{cursor:pointer}.aa-DetachedSearchButtonQuery{display:none}.aa-DetachedOverlay ul.aa-List,#quarto-search-results ul.aa-List{list-style:none;padding-left:0}.aa-DetachedOverlay .aa-Panel,#quarto-search-results .aa-Panel{background-color:#fff;position:absolute;z-index:2000}#quarto-search-results .aa-Panel{max-width:400px}#quarto-search input{font-size:.925rem}@media(min-width: 992px){.navbar #quarto-search{margin-left:.25rem;order:999}}.navbar.navbar-expand-sm #quarto-search,.navbar.navbar-expand-md #quarto-search{order:999}@media(min-width: 992px){.navbar .quarto-navbar-tools{order:900}}@media(min-width: 992px){.navbar .quarto-navbar-tools.tools-end{margin-left:auto !important}}@media(max-width: 991.98px){#quarto-sidebar .sidebar-search{display:none}}#quarto-sidebar .sidebar-search .aa-Autocomplete{width:100%}.navbar .aa-Autocomplete .aa-Form{width:180px}.navbar #quarto-search.type-overlay .aa-Autocomplete{width:40px}.navbar #quarto-search.type-overlay .aa-Autocomplete .aa-Form{background-color:inherit;border:none}.navbar #quarto-search.type-overlay .aa-Autocomplete .aa-Form:focus-within{box-shadow:none;outline:none}.navbar #quarto-search.type-overlay .aa-Autocomplete .aa-Form .aa-InputWrapper{display:none}.navbar #quarto-search.type-overlay .aa-Autocomplete .aa-Form .aa-InputWrapper:focus-within{display:inherit}.navbar #quarto-search.type-overlay .aa-Autocomplete .aa-Form .aa-Label svg,.navbar #quarto-search.type-overlay .aa-Autocomplete .aa-Form 
.aa-LoadingIndicator svg{width:26px;height:26px;color:#545555;opacity:1}.navbar #quarto-search.type-overlay .aa-Autocomplete svg.aa-SubmitIcon{width:26px;height:26px;color:#545555;opacity:1}.aa-Autocomplete .aa-Form,.aa-DetachedFormContainer .aa-Form{align-items:center;background-color:#fff;border:1px solid #dee2e6;border-radius:.25rem;color:#343a40;display:flex;line-height:1em;margin:0;position:relative;width:100%}.aa-Autocomplete .aa-Form:focus-within,.aa-DetachedFormContainer .aa-Form:focus-within{box-shadow:rgba(39,128,227,.6) 0 0 0 1px;outline:currentColor none medium}.aa-Autocomplete .aa-Form .aa-InputWrapperPrefix,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperPrefix{align-items:center;display:flex;flex-shrink:0;order:1}.aa-Autocomplete .aa-Form .aa-InputWrapperPrefix .aa-Label,.aa-Autocomplete .aa-Form .aa-InputWrapperPrefix .aa-LoadingIndicator,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperPrefix .aa-Label,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperPrefix .aa-LoadingIndicator{cursor:initial;flex-shrink:0;padding:0;text-align:left}.aa-Autocomplete .aa-Form .aa-InputWrapperPrefix .aa-Label svg,.aa-Autocomplete .aa-Form .aa-InputWrapperPrefix .aa-LoadingIndicator svg,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperPrefix .aa-Label svg,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperPrefix .aa-LoadingIndicator svg{color:#343a40;opacity:.5}.aa-Autocomplete .aa-Form .aa-InputWrapperPrefix .aa-SubmitButton,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperPrefix .aa-SubmitButton{appearance:none;background:none;border:0;margin:0}.aa-Autocomplete .aa-Form .aa-InputWrapperPrefix .aa-LoadingIndicator,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperPrefix .aa-LoadingIndicator{align-items:center;display:flex;justify-content:center}.aa-Autocomplete .aa-Form .aa-InputWrapperPrefix .aa-LoadingIndicator[hidden],.aa-DetachedFormContainer .aa-Form .aa-InputWrapperPrefix .aa-LoadingIndicator[hidden]{display:none}.aa-Autocomplete .aa-Form 
.aa-InputWrapper,.aa-DetachedFormContainer .aa-Form .aa-InputWrapper{order:3;position:relative;width:100%}.aa-Autocomplete .aa-Form .aa-InputWrapper .aa-Input,.aa-DetachedFormContainer .aa-Form .aa-InputWrapper .aa-Input{appearance:none;background:none;border:0;color:#343a40;font:inherit;height:calc(1.5em + .1rem + 2px);padding:0;width:100%}.aa-Autocomplete .aa-Form .aa-InputWrapper .aa-Input::placeholder,.aa-DetachedFormContainer .aa-Form .aa-InputWrapper .aa-Input::placeholder{color:#343a40;opacity:.8}.aa-Autocomplete .aa-Form .aa-InputWrapper .aa-Input:focus,.aa-DetachedFormContainer .aa-Form .aa-InputWrapper .aa-Input:focus{border-color:none;box-shadow:none;outline:none}.aa-Autocomplete .aa-Form .aa-InputWrapper .aa-Input::-webkit-search-decoration,.aa-Autocomplete .aa-Form .aa-InputWrapper .aa-Input::-webkit-search-cancel-button,.aa-Autocomplete .aa-Form .aa-InputWrapper .aa-Input::-webkit-search-results-button,.aa-Autocomplete .aa-Form .aa-InputWrapper .aa-Input::-webkit-search-results-decoration,.aa-DetachedFormContainer .aa-Form .aa-InputWrapper .aa-Input::-webkit-search-decoration,.aa-DetachedFormContainer .aa-Form .aa-InputWrapper .aa-Input::-webkit-search-cancel-button,.aa-DetachedFormContainer .aa-Form .aa-InputWrapper .aa-Input::-webkit-search-results-button,.aa-DetachedFormContainer .aa-Form .aa-InputWrapper .aa-Input::-webkit-search-results-decoration{display:none}.aa-Autocomplete .aa-Form .aa-InputWrapperSuffix,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperSuffix{align-items:center;display:flex;order:4}.aa-Autocomplete .aa-Form .aa-InputWrapperSuffix .aa-ClearButton,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperSuffix .aa-ClearButton{align-items:center;background:none;border:0;color:#343a40;opacity:.8;cursor:pointer;display:flex;margin:0;width:calc(1.5em + .1rem + 2px)}.aa-Autocomplete .aa-Form .aa-InputWrapperSuffix .aa-ClearButton:hover,.aa-Autocomplete .aa-Form .aa-InputWrapperSuffix .aa-ClearButton:focus,.aa-DetachedFormContainer 
.aa-Form .aa-InputWrapperSuffix .aa-ClearButton:hover,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperSuffix .aa-ClearButton:focus{color:#343a40;opacity:.8}.aa-Autocomplete .aa-Form .aa-InputWrapperSuffix .aa-ClearButton[hidden],.aa-DetachedFormContainer .aa-Form .aa-InputWrapperSuffix .aa-ClearButton[hidden]{display:none}.aa-Autocomplete .aa-Form .aa-InputWrapperSuffix .aa-ClearButton svg,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperSuffix .aa-ClearButton svg{width:calc(1.5em + 0.75rem + calc(1px * 2))}.aa-Autocomplete .aa-Form .aa-InputWrapperSuffix .aa-CopyButton,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperSuffix .aa-CopyButton{border:none;align-items:center;background:none;color:#343a40;opacity:.4;font-size:.7rem;cursor:pointer;display:none;margin:0;width:calc(1em + .1rem + 2px)}.aa-Autocomplete .aa-Form .aa-InputWrapperSuffix .aa-CopyButton:hover,.aa-Autocomplete .aa-Form .aa-InputWrapperSuffix .aa-CopyButton:focus,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperSuffix .aa-CopyButton:hover,.aa-DetachedFormContainer .aa-Form .aa-InputWrapperSuffix .aa-CopyButton:focus{color:#343a40;opacity:.8}.aa-Autocomplete .aa-Form .aa-InputWrapperSuffix .aa-CopyButton[hidden],.aa-DetachedFormContainer .aa-Form .aa-InputWrapperSuffix .aa-CopyButton[hidden]{display:none}.aa-PanelLayout:empty{display:none}.quarto-search-no-results.no-query{display:none}.aa-Source:has(.no-query){display:none}#quarto-search-results .aa-Panel{border:solid #dee2e6 1px}#quarto-search-results .aa-SourceNoResults{width:398px}.aa-DetachedOverlay .aa-Panel,#quarto-search-results .aa-Panel{max-height:65vh;overflow-y:auto;font-size:.925rem}.aa-DetachedOverlay .aa-SourceNoResults,#quarto-search-results .aa-SourceNoResults{height:60px;display:flex;justify-content:center;align-items:center}.aa-DetachedOverlay .search-error,#quarto-search-results .search-error{padding-top:10px;padding-left:20px;padding-right:20px;cursor:default}.aa-DetachedOverlay .search-error 
.search-error-title,#quarto-search-results .search-error .search-error-title{font-size:1.1rem;margin-bottom:.5rem}.aa-DetachedOverlay .search-error .search-error-title .search-error-icon,#quarto-search-results .search-error .search-error-title .search-error-icon{margin-right:8px}.aa-DetachedOverlay .search-error .search-error-text,#quarto-search-results .search-error .search-error-text{font-weight:300}.aa-DetachedOverlay .search-result-text,#quarto-search-results .search-result-text{font-weight:300;overflow:hidden;text-overflow:ellipsis;display:-webkit-box;-webkit-line-clamp:2;-webkit-box-orient:vertical;line-height:1.2rem;max-height:2.4rem}.aa-DetachedOverlay .aa-SourceHeader .search-result-header,#quarto-search-results .aa-SourceHeader .search-result-header{font-size:.875rem;background-color:#f2f2f2;padding-left:14px;padding-bottom:4px;padding-top:4px}.aa-DetachedOverlay .aa-SourceHeader .search-result-header-no-results,#quarto-search-results .aa-SourceHeader .search-result-header-no-results{display:none}.aa-DetachedOverlay .aa-SourceFooter .algolia-search-logo,#quarto-search-results .aa-SourceFooter .algolia-search-logo{width:110px;opacity:.85;margin:8px;float:right}.aa-DetachedOverlay .search-result-section,#quarto-search-results .search-result-section{font-size:.925em}.aa-DetachedOverlay a.search-result-link,#quarto-search-results a.search-result-link{color:inherit;text-decoration:none}.aa-DetachedOverlay li.aa-Item[aria-selected=true] .search-item,#quarto-search-results li.aa-Item[aria-selected=true] .search-item{background-color:#2780e3}.aa-DetachedOverlay li.aa-Item[aria-selected=true] .search-item.search-result-more,.aa-DetachedOverlay li.aa-Item[aria-selected=true] .search-item .search-result-section,.aa-DetachedOverlay li.aa-Item[aria-selected=true] .search-item .search-result-text,.aa-DetachedOverlay li.aa-Item[aria-selected=true] .search-item .search-result-title-container,.aa-DetachedOverlay li.aa-Item[aria-selected=true] .search-item 
.search-result-text-container,#quarto-search-results li.aa-Item[aria-selected=true] .search-item.search-result-more,#quarto-search-results li.aa-Item[aria-selected=true] .search-item .search-result-section,#quarto-search-results li.aa-Item[aria-selected=true] .search-item .search-result-text,#quarto-search-results li.aa-Item[aria-selected=true] .search-item .search-result-title-container,#quarto-search-results li.aa-Item[aria-selected=true] .search-item .search-result-text-container{color:#fff;background-color:#2780e3}.aa-DetachedOverlay li.aa-Item[aria-selected=true] .search-item mark.search-match,.aa-DetachedOverlay li.aa-Item[aria-selected=true] .search-item .search-match.mark,#quarto-search-results li.aa-Item[aria-selected=true] .search-item mark.search-match,#quarto-search-results li.aa-Item[aria-selected=true] .search-item .search-match.mark{color:#fff;background-color:#4b95e8}.aa-DetachedOverlay li.aa-Item[aria-selected=false] .search-item,#quarto-search-results li.aa-Item[aria-selected=false] .search-item{background-color:#fff}.aa-DetachedOverlay li.aa-Item[aria-selected=false] .search-item.search-result-more,.aa-DetachedOverlay li.aa-Item[aria-selected=false] .search-item .search-result-section,.aa-DetachedOverlay li.aa-Item[aria-selected=false] .search-item .search-result-text,.aa-DetachedOverlay li.aa-Item[aria-selected=false] .search-item .search-result-title-container,.aa-DetachedOverlay li.aa-Item[aria-selected=false] .search-item .search-result-text-container,#quarto-search-results li.aa-Item[aria-selected=false] .search-item.search-result-more,#quarto-search-results li.aa-Item[aria-selected=false] .search-item .search-result-section,#quarto-search-results li.aa-Item[aria-selected=false] .search-item .search-result-text,#quarto-search-results li.aa-Item[aria-selected=false] .search-item .search-result-title-container,#quarto-search-results li.aa-Item[aria-selected=false] .search-item .search-result-text-container{color:#343a40}.aa-DetachedOverlay 
li.aa-Item[aria-selected=false] .search-item mark.search-match,.aa-DetachedOverlay li.aa-Item[aria-selected=false] .search-item .search-match.mark,#quarto-search-results li.aa-Item[aria-selected=false] .search-item mark.search-match,#quarto-search-results li.aa-Item[aria-selected=false] .search-item .search-match.mark{color:inherit;background-color:#e5effc}.aa-DetachedOverlay .aa-Item .search-result-doc:not(.document-selectable) .search-result-title-container,#quarto-search-results .aa-Item .search-result-doc:not(.document-selectable) .search-result-title-container{background-color:#fff;color:#343a40}.aa-DetachedOverlay .aa-Item .search-result-doc:not(.document-selectable) .search-result-text-container,#quarto-search-results .aa-Item .search-result-doc:not(.document-selectable) .search-result-text-container{padding-top:0px}.aa-DetachedOverlay li.aa-Item .search-result-doc.document-selectable .search-result-text-container,#quarto-search-results li.aa-Item .search-result-doc.document-selectable .search-result-text-container{margin-top:-4px}.aa-DetachedOverlay .aa-Item,#quarto-search-results .aa-Item{cursor:pointer}.aa-DetachedOverlay .aa-Item .search-item,#quarto-search-results .aa-Item .search-item{border-left:none;border-right:none;border-top:none;background-color:#fff;border-color:#dee2e6;color:#343a40}.aa-DetachedOverlay .aa-Item .search-item p,#quarto-search-results .aa-Item .search-item p{margin-top:0;margin-bottom:0}.aa-DetachedOverlay .aa-Item .search-item i.bi,#quarto-search-results .aa-Item .search-item i.bi{padding-left:8px;padding-right:8px;font-size:1.3em}.aa-DetachedOverlay .aa-Item .search-item .search-result-title,#quarto-search-results .aa-Item .search-item .search-result-title{margin-top:.3em;margin-bottom:0em}.aa-DetachedOverlay .aa-Item .search-item .search-result-crumbs,#quarto-search-results .aa-Item .search-item .search-result-crumbs{white-space:nowrap;text-overflow:ellipsis;font-size:.8em;font-weight:300;margin-right:1em}.aa-DetachedOverlay 
.aa-Item .search-item .search-result-crumbs:not(.search-result-crumbs-wrap),#quarto-search-results .aa-Item .search-item .search-result-crumbs:not(.search-result-crumbs-wrap){max-width:30%;margin-left:auto;margin-top:.5em;margin-bottom:.1rem}.aa-DetachedOverlay .aa-Item .search-item .search-result-crumbs.search-result-crumbs-wrap,#quarto-search-results .aa-Item .search-item .search-result-crumbs.search-result-crumbs-wrap{flex-basis:100%;margin-top:0em;margin-bottom:.2em;margin-left:37px}.aa-DetachedOverlay .aa-Item .search-result-title-container,#quarto-search-results .aa-Item .search-result-title-container{font-size:1em;display:flex;flex-wrap:wrap;padding:6px 4px 6px 4px}.aa-DetachedOverlay .aa-Item .search-result-text-container,#quarto-search-results .aa-Item .search-result-text-container{padding-bottom:8px;padding-right:8px;margin-left:42px}.aa-DetachedOverlay .aa-Item .search-result-doc-section,.aa-DetachedOverlay .aa-Item .search-result-more,#quarto-search-results .aa-Item .search-result-doc-section,#quarto-search-results .aa-Item .search-result-more{padding-top:8px;padding-bottom:8px;padding-left:44px}.aa-DetachedOverlay .aa-Item .search-result-more,#quarto-search-results .aa-Item .search-result-more{font-size:.8em;font-weight:400}.aa-DetachedOverlay .aa-Item .search-result-doc,#quarto-search-results .aa-Item .search-result-doc{border-top:1px solid #dee2e6}.aa-DetachedSearchButton{background:none;border:none}.aa-DetachedSearchButton .aa-DetachedSearchButtonPlaceholder{display:none}.navbar .aa-DetachedSearchButton .aa-DetachedSearchButtonIcon{color:#545555}.sidebar-tools-collapse #quarto-search,.sidebar-tools-main #quarto-search{display:inline}.sidebar-tools-collapse #quarto-search .aa-Autocomplete,.sidebar-tools-main #quarto-search .aa-Autocomplete{display:inline}.sidebar-tools-collapse #quarto-search .aa-DetachedSearchButton,.sidebar-tools-main #quarto-search .aa-DetachedSearchButton{padding-left:4px;padding-right:4px}.sidebar-tools-collapse #quarto-search 
.aa-DetachedSearchButton .aa-DetachedSearchButtonIcon,.sidebar-tools-main #quarto-search .aa-DetachedSearchButton .aa-DetachedSearchButtonIcon{color:#595959}.sidebar-tools-collapse #quarto-search .aa-DetachedSearchButton .aa-DetachedSearchButtonIcon .aa-SubmitIcon,.sidebar-tools-main #quarto-search .aa-DetachedSearchButton .aa-DetachedSearchButtonIcon .aa-SubmitIcon{margin-top:-3px}.aa-DetachedContainer{background:rgba(255,255,255,.65);width:90%;bottom:0;box-shadow:rgba(222,226,230,.6) 0 0 0 1px;outline:currentColor none medium;display:flex;flex-direction:column;left:0;margin:0;overflow:hidden;padding:0;position:fixed;right:0;top:0;z-index:1101}.aa-DetachedContainer::after{height:32px}.aa-DetachedContainer .aa-SourceHeader{margin:var(--aa-spacing-half) 0 var(--aa-spacing-half) 2px}.aa-DetachedContainer .aa-Panel{background-color:#fff;border-radius:0;box-shadow:none;flex-grow:1;margin:0;padding:0;position:relative}.aa-DetachedContainer .aa-PanelLayout{bottom:0;box-shadow:none;left:0;margin:0;max-height:none;overflow-y:auto;position:absolute;right:0;top:0;width:100%}.aa-DetachedFormContainer{background-color:#fff;border-bottom:1px solid #dee2e6;display:flex;flex-direction:row;justify-content:space-between;margin:0;padding:.5em}.aa-DetachedCancelButton{background:none;font-size:.8em;border:0;border-radius:3px;color:#343a40;cursor:pointer;margin:0 0 0 .5em;padding:0 .5em}.aa-DetachedCancelButton:hover,.aa-DetachedCancelButton:focus{box-shadow:rgba(39,128,227,.6) 0 0 0 1px;outline:currentColor none medium}.aa-DetachedContainer--modal{bottom:inherit;height:auto;margin:0 auto;position:absolute;top:100px;border-radius:6px;max-width:850px}@media(max-width: 575.98px){.aa-DetachedContainer--modal{width:100%;top:0px;border-radius:0px;border:none}}.aa-DetachedContainer--modal 
.aa-PanelLayout{max-height:var(--aa-detached-modal-max-height);padding-bottom:var(--aa-spacing-half);position:static}.aa-Detached{height:100vh;overflow:hidden}.aa-DetachedOverlay{background-color:rgba(52,58,64,.4);position:fixed;left:0;right:0;top:0;margin:0;padding:0;height:100vh;z-index:1100}.quarto-dashboard.nav-fixed.dashboard-sidebar #quarto-content.quarto-dashboard-content{padding:0em}.quarto-dashboard #quarto-content.quarto-dashboard-content{padding:1em}.quarto-dashboard #quarto-content.quarto-dashboard-content>*{padding-top:0}@media(min-width: 576px){.quarto-dashboard{height:100%}}.quarto-dashboard .card.valuebox.bslib-card.bg-primary{background-color:#5397e9 !important}.quarto-dashboard .card.valuebox.bslib-card.bg-secondary{background-color:#343a40 !important}.quarto-dashboard .card.valuebox.bslib-card.bg-success{background-color:#3aa716 !important}.quarto-dashboard .card.valuebox.bslib-card.bg-info{background-color:rgba(153,84,187,.7019607843) !important}.quarto-dashboard .card.valuebox.bslib-card.bg-warning{background-color:#fa6400 !important}.quarto-dashboard .card.valuebox.bslib-card.bg-danger{background-color:rgba(255,0,57,.7019607843) !important}.quarto-dashboard .card.valuebox.bslib-card.bg-light{background-color:#f8f9fa !important}.quarto-dashboard .card.valuebox.bslib-card.bg-dark{background-color:#343a40 !important}.quarto-dashboard.dashboard-fill{display:flex;flex-direction:column}.quarto-dashboard #quarto-appendix{display:none}.quarto-dashboard #quarto-header #quarto-dashboard-header{border-top:solid 1px #dae0e5;border-bottom:solid 1px #dae0e5}.quarto-dashboard #quarto-header #quarto-dashboard-header>nav{padding-left:1em;padding-right:1em}.quarto-dashboard #quarto-header #quarto-dashboard-header>nav .navbar-brand-container{padding-left:0}.quarto-dashboard #quarto-header #quarto-dashboard-header .navbar-toggler{margin-right:0}.quarto-dashboard #quarto-header #quarto-dashboard-header 
.navbar-toggler-icon{height:1em;width:1em;background-image:url('data:image/svg+xml,')}.quarto-dashboard #quarto-header #quarto-dashboard-header .navbar-brand-container{padding-right:1em}.quarto-dashboard #quarto-header #quarto-dashboard-header .navbar-title{font-size:1.1em}.quarto-dashboard #quarto-header #quarto-dashboard-header .navbar-nav{font-size:.9em}.quarto-dashboard #quarto-dashboard-header .navbar{padding:0}.quarto-dashboard #quarto-dashboard-header .navbar .navbar-container{padding-left:1em}.quarto-dashboard #quarto-dashboard-header .navbar.slim .navbar-brand-container .nav-link,.quarto-dashboard #quarto-dashboard-header .navbar.slim .navbar-nav .nav-link{padding:.7em}.quarto-dashboard #quarto-dashboard-header .navbar .quarto-color-scheme-toggle{order:9}.quarto-dashboard #quarto-dashboard-header .navbar .navbar-toggler{margin-left:.5em;order:10}.quarto-dashboard #quarto-dashboard-header .navbar .navbar-nav .nav-link{padding:.5em;height:100%;display:flex;align-items:center}.quarto-dashboard #quarto-dashboard-header .navbar .navbar-nav .active{background-color:#e0e5e9}.quarto-dashboard #quarto-dashboard-header .navbar .navbar-brand-container{padding:.5em .5em .5em 0;display:flex;flex-direction:row;margin-right:2em;align-items:center}@media(max-width: 767.98px){.quarto-dashboard #quarto-dashboard-header .navbar .navbar-brand-container{margin-right:auto}}.quarto-dashboard #quarto-dashboard-header .navbar .navbar-collapse{align-self:stretch}@media(min-width: 768px){.quarto-dashboard #quarto-dashboard-header .navbar .navbar-collapse{order:8}}@media(max-width: 767.98px){.quarto-dashboard #quarto-dashboard-header .navbar .navbar-collapse{order:1000;padding-bottom:.5em}}.quarto-dashboard #quarto-dashboard-header .navbar .navbar-collapse .navbar-nav{align-self:stretch}.quarto-dashboard #quarto-dashboard-header .navbar .navbar-title{font-size:1.25em;line-height:1.1em;display:flex;flex-direction:row;flex-wrap:wrap;align-items:baseline}.quarto-dashboard 
#quarto-dashboard-header .navbar .navbar-title .navbar-title-text{margin-right:.4em}.quarto-dashboard #quarto-dashboard-header .navbar .navbar-title a{text-decoration:none;color:inherit}.quarto-dashboard #quarto-dashboard-header .navbar .navbar-subtitle,.quarto-dashboard #quarto-dashboard-header .navbar .navbar-author{font-size:.9rem;margin-right:.5em}.quarto-dashboard #quarto-dashboard-header .navbar .navbar-author{margin-left:auto}.quarto-dashboard #quarto-dashboard-header .navbar .navbar-logo{max-height:48px;min-height:30px;object-fit:cover;margin-right:1em}.quarto-dashboard #quarto-dashboard-header .navbar .quarto-dashboard-links{order:9;padding-right:1em}.quarto-dashboard #quarto-dashboard-header .navbar .quarto-dashboard-link-text{margin-left:.25em}.quarto-dashboard #quarto-dashboard-header .navbar .quarto-dashboard-link{padding-right:0em;padding-left:.7em;text-decoration:none;color:#545555}.quarto-dashboard .page-layout-custom .tab-content{padding:0;border:none}.quarto-dashboard-img-contain{height:100%;width:100%;object-fit:contain}@media(max-width: 575.98px){.quarto-dashboard .bslib-grid{grid-template-rows:minmax(1em, max-content) !important}.quarto-dashboard .sidebar-content{height:inherit}.quarto-dashboard .page-layout-custom{min-height:100vh}}.quarto-dashboard.dashboard-toolbar>.page-layout-custom,.quarto-dashboard.dashboard-sidebar>.page-layout-custom{padding:0}.quarto-dashboard .quarto-dashboard-content.quarto-dashboard-pages{padding:0}.quarto-dashboard .callout{margin-bottom:0;margin-top:0}.quarto-dashboard .html-fill-container figure{overflow:hidden}.quarto-dashboard bslib-tooltip .rounded-pill{border:solid #6c757d 1px}.quarto-dashboard bslib-tooltip .rounded-pill .svg{fill:#343a40}.quarto-dashboard .tabset .dashboard-card-no-title .nav-tabs{margin-left:0;margin-right:auto}.quarto-dashboard .tabset .tab-content{border:none}.quarto-dashboard .tabset .card-header .nav-link[role=tab]{margin-top:-6px;padding-top:6px;padding-bottom:6px}.quarto-dashboard 
.card.valuebox,.quarto-dashboard .card.bslib-value-box{min-height:3rem}.quarto-dashboard .card.valuebox .card-body,.quarto-dashboard .card.bslib-value-box .card-body{padding:0}.quarto-dashboard .bslib-value-box .value-box-value{font-size:clamp(.1em,15cqw,5em)}.quarto-dashboard .bslib-value-box .value-box-showcase .bi{font-size:clamp(.1em,max(18cqw,5.2cqh),5em);text-align:center;height:1em}.quarto-dashboard .bslib-value-box .value-box-showcase .bi::before{vertical-align:1em}.quarto-dashboard .bslib-value-box .value-box-area{margin-top:auto;margin-bottom:auto}.quarto-dashboard .card figure.quarto-float{display:flex;flex-direction:column;align-items:center}.quarto-dashboard .dashboard-scrolling{padding:1em}.quarto-dashboard .full-height{height:100%}.quarto-dashboard .showcase-bottom .value-box-grid{display:grid;grid-template-columns:1fr;grid-template-rows:1fr auto;grid-template-areas:"top" "bottom"}.quarto-dashboard .showcase-bottom .value-box-grid .value-box-showcase{grid-area:bottom;padding:0;margin:0}.quarto-dashboard .showcase-bottom .value-box-grid .value-box-showcase i.bi{font-size:4rem}.quarto-dashboard .showcase-bottom .value-box-grid .value-box-area{grid-area:top}.quarto-dashboard .tab-content{margin-bottom:0}.quarto-dashboard .bslib-card .bslib-navs-card-title{justify-content:stretch;align-items:end}.quarto-dashboard .card-header{display:flex;flex-wrap:wrap;justify-content:space-between}.quarto-dashboard .card-header .card-title{display:flex;flex-direction:column;justify-content:center;margin-bottom:0}.quarto-dashboard .tabset .card-toolbar{margin-bottom:1em}.quarto-dashboard .bslib-grid>.bslib-sidebar-layout{border:none;gap:var(--bslib-spacer, 1rem)}.quarto-dashboard .bslib-grid>.bslib-sidebar-layout>.main{padding:0}.quarto-dashboard .bslib-grid>.bslib-sidebar-layout>.sidebar{border-radius:.25rem;border:1px solid rgba(0,0,0,.175)}.quarto-dashboard .bslib-grid>.bslib-sidebar-layout>.collapse-toggle{display:none}@media(max-width: 767.98px){.quarto-dashboard 
.bslib-grid>.bslib-sidebar-layout{grid-template-columns:1fr;grid-template-rows:max-content 1fr}.quarto-dashboard .bslib-grid>.bslib-sidebar-layout>.main{grid-column:1;grid-row:2}.quarto-dashboard .bslib-grid>.bslib-sidebar-layout .sidebar{grid-column:1;grid-row:1}}.quarto-dashboard .sidebar-right .sidebar{padding-left:2.5em}.quarto-dashboard .sidebar-right .collapse-toggle{left:2px}.quarto-dashboard .quarto-dashboard .sidebar-right button.collapse-toggle:not(.transitioning){left:unset}.quarto-dashboard aside.sidebar{padding-left:1em;padding-right:1em;background-color:rgba(52,58,64,.25);color:#343a40}.quarto-dashboard .bslib-sidebar-layout>div.main{padding:.7em}.quarto-dashboard .bslib-sidebar-layout button.collapse-toggle{margin-top:.3em}.quarto-dashboard .bslib-sidebar-layout .collapse-toggle{top:0}.quarto-dashboard .bslib-sidebar-layout.sidebar-collapsed:not(.transitioning):not(.sidebar-right) .collapse-toggle{left:2px}.quarto-dashboard .sidebar>section>.h3:first-of-type{margin-top:0em}.quarto-dashboard .sidebar .h3,.quarto-dashboard .sidebar .h4,.quarto-dashboard .sidebar .h5,.quarto-dashboard .sidebar .h6{margin-top:.5em}.quarto-dashboard .sidebar form{flex-direction:column;align-items:start;margin-bottom:1em}.quarto-dashboard .sidebar form div[class*=oi-][class$=-input]{flex-direction:column}.quarto-dashboard .sidebar form[class*=oi-][class$=-toggle]{flex-direction:row-reverse;align-items:center;justify-content:start}.quarto-dashboard .sidebar form input[type=range]{margin-top:.5em;margin-right:.8em;margin-left:1em}.quarto-dashboard .sidebar label{width:fit-content}.quarto-dashboard .sidebar .card-body{margin-bottom:2em}.quarto-dashboard .sidebar .shiny-input-container{margin-bottom:1em}.quarto-dashboard .sidebar .shiny-options-group{margin-top:0}.quarto-dashboard .sidebar .control-label{margin-bottom:.3em}.quarto-dashboard .card .card-body .quarto-layout-row{align-items:stretch}.quarto-dashboard 
.toolbar{font-size:.9em;display:flex;flex-direction:row;border-top:solid 1px #bcbfc0;padding:1em;flex-wrap:wrap;background-color:rgba(52,58,64,.25)}.quarto-dashboard .toolbar .cell-output-display{display:flex}.quarto-dashboard .toolbar .shiny-input-container{padding-bottom:.5em;margin-bottom:.5em;width:inherit}.quarto-dashboard .toolbar .shiny-input-container>.checkbox:first-child{margin-top:6px}.quarto-dashboard .toolbar>*:last-child{margin-right:0}.quarto-dashboard .toolbar>*>*{margin-right:1em;align-items:baseline}.quarto-dashboard .toolbar>*>*>a{text-decoration:none;margin-top:auto;margin-bottom:auto}.quarto-dashboard .toolbar .shiny-input-container{padding-bottom:0;margin-bottom:0}.quarto-dashboard .toolbar .shiny-input-container>*{flex-shrink:0;flex-grow:0}.quarto-dashboard .toolbar .form-group.shiny-input-container:not([role=group])>label{margin-bottom:0}.quarto-dashboard .toolbar .shiny-input-container.no-baseline{align-items:start;padding-top:6px}.quarto-dashboard .toolbar .shiny-input-container{display:flex;align-items:baseline}.quarto-dashboard .toolbar .shiny-input-container label{padding-right:.4em}.quarto-dashboard .toolbar .shiny-input-container .bslib-input-switch{margin-top:6px}.quarto-dashboard .toolbar input[type=text]{line-height:1;width:inherit}.quarto-dashboard .toolbar .input-daterange{width:inherit}.quarto-dashboard .toolbar .input-daterange input[type=text]{height:2.4em;width:10em}.quarto-dashboard .toolbar .input-daterange .input-group-addon{height:auto;padding:0;margin-left:-5px !important;margin-right:-5px}.quarto-dashboard .toolbar .input-daterange .input-group-addon .input-group-text{padding-top:0;padding-bottom:0;height:100%}.quarto-dashboard .toolbar span.irs.irs--shiny{width:10em}.quarto-dashboard .toolbar span.irs.irs--shiny .irs-line{top:9px}.quarto-dashboard .toolbar span.irs.irs--shiny .irs-min,.quarto-dashboard .toolbar span.irs.irs--shiny .irs-max,.quarto-dashboard .toolbar span.irs.irs--shiny .irs-from,.quarto-dashboard 
.toolbar span.irs.irs--shiny .irs-to,.quarto-dashboard .toolbar span.irs.irs--shiny .irs-single{top:20px}.quarto-dashboard .toolbar span.irs.irs--shiny .irs-bar{top:8px}.quarto-dashboard .toolbar span.irs.irs--shiny .irs-handle{top:0px}.quarto-dashboard .toolbar .shiny-input-checkboxgroup>label{margin-top:6px}.quarto-dashboard .toolbar .shiny-input-checkboxgroup>.shiny-options-group{margin-top:0;align-items:baseline}.quarto-dashboard .toolbar .shiny-input-radiogroup>label{margin-top:6px}.quarto-dashboard .toolbar .shiny-input-radiogroup>.shiny-options-group{align-items:baseline;margin-top:0}.quarto-dashboard .toolbar .shiny-input-radiogroup>.shiny-options-group>.radio{margin-right:.3em}.quarto-dashboard .toolbar .form-select{padding-top:.2em;padding-bottom:.2em}.quarto-dashboard .toolbar .shiny-input-select{min-width:6em}.quarto-dashboard .toolbar div.checkbox{margin-bottom:0px}.quarto-dashboard .toolbar>.checkbox:first-child{margin-top:6px}.quarto-dashboard .toolbar form{width:fit-content}.quarto-dashboard .toolbar form label{padding-top:.2em;padding-bottom:.2em;width:fit-content}.quarto-dashboard .toolbar form input[type=date]{width:fit-content}.quarto-dashboard .toolbar form input[type=color]{width:3em}.quarto-dashboard .toolbar form button{padding:.4em}.quarto-dashboard .toolbar form select{width:fit-content}.quarto-dashboard .toolbar>*{font-size:.9em;flex-grow:0}.quarto-dashboard .toolbar .shiny-input-container label{margin-bottom:1px}.quarto-dashboard .toolbar-bottom{margin-top:1em;margin-bottom:0 !important;order:2}.quarto-dashboard .quarto-dashboard-content>.dashboard-toolbar-container>.toolbar-content>.tab-content>.tab-pane>*:not(.bslib-sidebar-layout){padding:1em}.quarto-dashboard .quarto-dashboard-content>.dashboard-toolbar-container>.toolbar-content>*:not(.tab-content){padding:1em}.quarto-dashboard .quarto-dashboard-content>.tab-content>.dashboard-page>.dashboard-toolbar-container>.toolbar-content,.quarto-dashboard 
.quarto-dashboard-content>.tab-content>.dashboard-page:not(.dashboard-sidebar-container)>*:not(.dashboard-toolbar-container){padding:1em}.quarto-dashboard .toolbar-content{padding:0}.quarto-dashboard .quarto-dashboard-content.quarto-dashboard-pages .tab-pane>.dashboard-toolbar-container .toolbar{border-radius:0;margin-bottom:0}.quarto-dashboard .dashboard-toolbar-container.toolbar-toplevel .toolbar{border-bottom:1px solid rgba(0,0,0,.175)}.quarto-dashboard .dashboard-toolbar-container.toolbar-toplevel .toolbar-bottom{margin-top:0}.quarto-dashboard .dashboard-toolbar-container:not(.toolbar-toplevel) .toolbar{margin-bottom:1em;border-top:none;border-radius:.25rem;border:1px solid rgba(0,0,0,.175)}.quarto-dashboard .vega-embed.has-actions details{width:1.7em;height:2em;position:absolute !important;top:0;right:0}.quarto-dashboard .dashboard-toolbar-container{padding:0}.quarto-dashboard .card .card-header p:last-child,.quarto-dashboard .card .card-footer p:last-child{margin-bottom:0}.quarto-dashboard .card .card-body>.h4:first-child{margin-top:0}.quarto-dashboard .card .card-body{z-index:4}@media(max-width: 767.98px){.quarto-dashboard .card .card-body .itables div.dataTables_wrapper div.dataTables_length,.quarto-dashboard .card .card-body .itables div.dataTables_wrapper div.dataTables_info,.quarto-dashboard .card .card-body .itables div.dataTables_wrapper div.dataTables_paginate{text-align:initial}.quarto-dashboard .card .card-body .itables div.dataTables_wrapper div.dataTables_filter{text-align:right}.quarto-dashboard .card .card-body .itables div.dataTables_wrapper div.dataTables_paginate ul.pagination{justify-content:initial}}.quarto-dashboard .card .card-body .itables .dataTables_wrapper{display:flex;flex-wrap:wrap;justify-content:space-between;align-items:center;padding-top:0}.quarto-dashboard .card .card-body .itables .dataTables_wrapper table{flex-shrink:0}.quarto-dashboard .card .card-body .itables .dataTables_wrapper 
.dt-buttons{margin-bottom:.5em;margin-left:auto;width:fit-content;float:right}.quarto-dashboard .card .card-body .itables .dataTables_wrapper .dt-buttons.btn-group{background:#fff;border:none}.quarto-dashboard .card .card-body .itables .dataTables_wrapper .dt-buttons .btn-secondary{background-color:#fff;background-image:none;border:solid #dee2e6 1px;padding:.2em .7em}.quarto-dashboard .card .card-body .itables .dataTables_wrapper .dt-buttons .btn span{font-size:.8em;color:#343a40}.quarto-dashboard .card .card-body .itables .dataTables_wrapper .dataTables_info{margin-left:.5em;margin-bottom:.5em;padding-top:0}@media(min-width: 768px){.quarto-dashboard .card .card-body .itables .dataTables_wrapper .dataTables_info{font-size:.875em}}@media(max-width: 767.98px){.quarto-dashboard .card .card-body .itables .dataTables_wrapper .dataTables_info{font-size:.8em}}.quarto-dashboard .card .card-body .itables .dataTables_wrapper .dataTables_filter{margin-bottom:.5em;font-size:.875em}.quarto-dashboard .card .card-body .itables .dataTables_wrapper .dataTables_filter input[type=search]{padding:1px 5px 1px 5px;font-size:.875em}.quarto-dashboard .card .card-body .itables .dataTables_wrapper .dataTables_length{flex-basis:1 1 50%;margin-bottom:.5em;font-size:.875em}.quarto-dashboard .card .card-body .itables .dataTables_wrapper .dataTables_length select{padding:.4em 3em .4em .5em;font-size:.875em;margin-left:.2em;margin-right:.2em}.quarto-dashboard .card .card-body .itables .dataTables_wrapper .dataTables_paginate{flex-shrink:0}@media(min-width: 768px){.quarto-dashboard .card .card-body .itables .dataTables_wrapper .dataTables_paginate{margin-left:auto}}.quarto-dashboard .card .card-body .itables .dataTables_wrapper .dataTables_paginate ul.pagination .paginate_button .page-link{font-size:.8em}.quarto-dashboard .card .card-footer{font-size:.9em}.quarto-dashboard .card .card-toolbar{display:flex;flex-grow:1;flex-direction:row;width:100%;flex-wrap:wrap}.quarto-dashboard .card 
.card-toolbar>*{font-size:.8em;flex-grow:0}.quarto-dashboard .card .card-toolbar>.card-title{font-size:1em;flex-grow:1;align-self:flex-start;margin-top:.1em}.quarto-dashboard .card .card-toolbar .cell-output-display{display:flex}.quarto-dashboard .card .card-toolbar .shiny-input-container{padding-bottom:.5em;margin-bottom:.5em;width:inherit}.quarto-dashboard .card .card-toolbar .shiny-input-container>.checkbox:first-child{margin-top:6px}.quarto-dashboard .card .card-toolbar>*:last-child{margin-right:0}.quarto-dashboard .card .card-toolbar>*>*{margin-right:1em;align-items:baseline}.quarto-dashboard .card .card-toolbar>*>*>a{text-decoration:none;margin-top:auto;margin-bottom:auto}.quarto-dashboard .card .card-toolbar form{width:fit-content}.quarto-dashboard .card .card-toolbar form label{padding-top:.2em;padding-bottom:.2em;width:fit-content}.quarto-dashboard .card .card-toolbar form input[type=date]{width:fit-content}.quarto-dashboard .card .card-toolbar form input[type=color]{width:3em}.quarto-dashboard .card .card-toolbar form button{padding:.4em}.quarto-dashboard .card .card-toolbar form select{width:fit-content}.quarto-dashboard .card .card-toolbar .cell-output-display{display:flex}.quarto-dashboard .card .card-toolbar .shiny-input-container{padding-bottom:.5em;margin-bottom:.5em;width:inherit}.quarto-dashboard .card .card-toolbar .shiny-input-container>.checkbox:first-child{margin-top:6px}.quarto-dashboard .card .card-toolbar>*:last-child{margin-right:0}.quarto-dashboard .card .card-toolbar>*>*{margin-right:1em;align-items:baseline}.quarto-dashboard .card .card-toolbar>*>*>a{text-decoration:none;margin-top:auto;margin-bottom:auto}.quarto-dashboard .card .card-toolbar .shiny-input-container{padding-bottom:0;margin-bottom:0}.quarto-dashboard .card .card-toolbar .shiny-input-container>*{flex-shrink:0;flex-grow:0}.quarto-dashboard .card .card-toolbar .form-group.shiny-input-container:not([role=group])>label{margin-bottom:0}.quarto-dashboard .card .card-toolbar 
.shiny-input-container.no-baseline{align-items:start;padding-top:6px}.quarto-dashboard .card .card-toolbar .shiny-input-container{display:flex;align-items:baseline}.quarto-dashboard .card .card-toolbar .shiny-input-container label{padding-right:.4em}.quarto-dashboard .card .card-toolbar .shiny-input-container .bslib-input-switch{margin-top:6px}.quarto-dashboard .card .card-toolbar input[type=text]{line-height:1;width:inherit}.quarto-dashboard .card .card-toolbar .input-daterange{width:inherit}.quarto-dashboard .card .card-toolbar .input-daterange input[type=text]{height:2.4em;width:10em}.quarto-dashboard .card .card-toolbar .input-daterange .input-group-addon{height:auto;padding:0;margin-left:-5px !important;margin-right:-5px}.quarto-dashboard .card .card-toolbar .input-daterange .input-group-addon .input-group-text{padding-top:0;padding-bottom:0;height:100%}.quarto-dashboard .card .card-toolbar span.irs.irs--shiny{width:10em}.quarto-dashboard .card .card-toolbar span.irs.irs--shiny .irs-line{top:9px}.quarto-dashboard .card .card-toolbar span.irs.irs--shiny .irs-min,.quarto-dashboard .card .card-toolbar span.irs.irs--shiny .irs-max,.quarto-dashboard .card .card-toolbar span.irs.irs--shiny .irs-from,.quarto-dashboard .card .card-toolbar span.irs.irs--shiny .irs-to,.quarto-dashboard .card .card-toolbar span.irs.irs--shiny .irs-single{top:20px}.quarto-dashboard .card .card-toolbar span.irs.irs--shiny .irs-bar{top:8px}.quarto-dashboard .card .card-toolbar span.irs.irs--shiny .irs-handle{top:0px}.quarto-dashboard .card .card-toolbar .shiny-input-checkboxgroup>label{margin-top:6px}.quarto-dashboard .card .card-toolbar .shiny-input-checkboxgroup>.shiny-options-group{margin-top:0;align-items:baseline}.quarto-dashboard .card .card-toolbar .shiny-input-radiogroup>label{margin-top:6px}.quarto-dashboard .card .card-toolbar .shiny-input-radiogroup>.shiny-options-group{align-items:baseline;margin-top:0}.quarto-dashboard .card .card-toolbar 
.shiny-input-radiogroup>.shiny-options-group>.radio{margin-right:.3em}.quarto-dashboard .card .card-toolbar .form-select{padding-top:.2em;padding-bottom:.2em}.quarto-dashboard .card .card-toolbar .shiny-input-select{min-width:6em}.quarto-dashboard .card .card-toolbar div.checkbox{margin-bottom:0px}.quarto-dashboard .card .card-toolbar>.checkbox:first-child{margin-top:6px}.quarto-dashboard .card-body>table>thead{border-top:none}.quarto-dashboard .card-body>.table>:not(caption)>*>*{background-color:#fff}.tableFloatingHeaderOriginal{background-color:#fff;position:sticky !important;top:0 !important}.dashboard-data-table{margin-top:-1px}div.value-box-area span.observablehq--number{font-size:calc(clamp(.1em,15cqw,5em)*1.25);line-height:1.2;color:inherit;font-family:var(--bs-body-font-family)}.quarto-listing{padding-bottom:1em}.listing-pagination{padding-top:.5em}ul.pagination{float:right;padding-left:8px;padding-top:.5em}ul.pagination li{padding-right:.75em}ul.pagination li.disabled a,ul.pagination li.active a{color:#fff;text-decoration:none}ul.pagination li:last-of-type{padding-right:0}.listing-actions-group{display:flex}.quarto-listing-filter{margin-bottom:1em;width:200px;margin-left:auto}.quarto-listing-sort{margin-bottom:1em;margin-right:auto;width:auto}.quarto-listing-sort .input-group-text{font-size:.8em}.input-group-text{border-right:none}.quarto-listing-sort select.form-select{font-size:.8em}.listing-no-matching{text-align:center;padding-top:2em;padding-bottom:3em;font-size:1em}#quarto-margin-sidebar .quarto-listing-category{padding-top:0;font-size:1rem}#quarto-margin-sidebar .quarto-listing-category-title{cursor:pointer;font-weight:600;font-size:1rem}.quarto-listing-category .category{cursor:pointer}.quarto-listing-category .category.active{font-weight:600}.quarto-listing-category.category-cloud{display:flex;flex-wrap:wrap;align-items:baseline}.quarto-listing-category.category-cloud .category{padding-right:5px}.quarto-listing-category.category-cloud 
.category-cloud-1{font-size:.75em}.quarto-listing-category.category-cloud .category-cloud-2{font-size:.95em}.quarto-listing-category.category-cloud .category-cloud-3{font-size:1.15em}.quarto-listing-category.category-cloud .category-cloud-4{font-size:1.35em}.quarto-listing-category.category-cloud .category-cloud-5{font-size:1.55em}.quarto-listing-category.category-cloud .category-cloud-6{font-size:1.75em}.quarto-listing-category.category-cloud .category-cloud-7{font-size:1.95em}.quarto-listing-category.category-cloud .category-cloud-8{font-size:2.15em}.quarto-listing-category.category-cloud .category-cloud-9{font-size:2.35em}.quarto-listing-category.category-cloud .category-cloud-10{font-size:2.55em}.quarto-listing-cols-1{grid-template-columns:repeat(1, minmax(0, 1fr));gap:1.5em}@media(max-width: 767.98px){.quarto-listing-cols-1{grid-template-columns:repeat(1, minmax(0, 1fr));gap:1.5em}}@media(max-width: 575.98px){.quarto-listing-cols-1{grid-template-columns:minmax(0, 1fr);gap:1.5em}}.quarto-listing-cols-2{grid-template-columns:repeat(2, minmax(0, 1fr));gap:1.5em}@media(max-width: 767.98px){.quarto-listing-cols-2{grid-template-columns:repeat(2, minmax(0, 1fr));gap:1.5em}}@media(max-width: 575.98px){.quarto-listing-cols-2{grid-template-columns:minmax(0, 1fr);gap:1.5em}}.quarto-listing-cols-3{grid-template-columns:repeat(3, minmax(0, 1fr));gap:1.5em}@media(max-width: 767.98px){.quarto-listing-cols-3{grid-template-columns:repeat(2, minmax(0, 1fr));gap:1.5em}}@media(max-width: 575.98px){.quarto-listing-cols-3{grid-template-columns:minmax(0, 1fr);gap:1.5em}}.quarto-listing-cols-4{grid-template-columns:repeat(4, minmax(0, 1fr));gap:1.5em}@media(max-width: 767.98px){.quarto-listing-cols-4{grid-template-columns:repeat(2, minmax(0, 1fr));gap:1.5em}}@media(max-width: 575.98px){.quarto-listing-cols-4{grid-template-columns:minmax(0, 1fr);gap:1.5em}}.quarto-listing-cols-5{grid-template-columns:repeat(5, minmax(0, 1fr));gap:1.5em}@media(max-width: 
767.98px){.quarto-listing-cols-5{grid-template-columns:repeat(2, minmax(0, 1fr));gap:1.5em}}@media(max-width: 575.98px){.quarto-listing-cols-5{grid-template-columns:minmax(0, 1fr);gap:1.5em}}.quarto-listing-cols-6{grid-template-columns:repeat(6, minmax(0, 1fr));gap:1.5em}@media(max-width: 767.98px){.quarto-listing-cols-6{grid-template-columns:repeat(2, minmax(0, 1fr));gap:1.5em}}@media(max-width: 575.98px){.quarto-listing-cols-6{grid-template-columns:minmax(0, 1fr);gap:1.5em}}.quarto-listing-cols-7{grid-template-columns:repeat(7, minmax(0, 1fr));gap:1.5em}@media(max-width: 767.98px){.quarto-listing-cols-7{grid-template-columns:repeat(2, minmax(0, 1fr));gap:1.5em}}@media(max-width: 575.98px){.quarto-listing-cols-7{grid-template-columns:minmax(0, 1fr);gap:1.5em}}.quarto-listing-cols-8{grid-template-columns:repeat(8, minmax(0, 1fr));gap:1.5em}@media(max-width: 767.98px){.quarto-listing-cols-8{grid-template-columns:repeat(2, minmax(0, 1fr));gap:1.5em}}@media(max-width: 575.98px){.quarto-listing-cols-8{grid-template-columns:minmax(0, 1fr);gap:1.5em}}.quarto-listing-cols-9{grid-template-columns:repeat(9, minmax(0, 1fr));gap:1.5em}@media(max-width: 767.98px){.quarto-listing-cols-9{grid-template-columns:repeat(2, minmax(0, 1fr));gap:1.5em}}@media(max-width: 575.98px){.quarto-listing-cols-9{grid-template-columns:minmax(0, 1fr);gap:1.5em}}.quarto-listing-cols-10{grid-template-columns:repeat(10, minmax(0, 1fr));gap:1.5em}@media(max-width: 767.98px){.quarto-listing-cols-10{grid-template-columns:repeat(2, minmax(0, 1fr));gap:1.5em}}@media(max-width: 575.98px){.quarto-listing-cols-10{grid-template-columns:minmax(0, 1fr);gap:1.5em}}.quarto-listing-cols-11{grid-template-columns:repeat(11, minmax(0, 1fr));gap:1.5em}@media(max-width: 767.98px){.quarto-listing-cols-11{grid-template-columns:repeat(2, minmax(0, 1fr));gap:1.5em}}@media(max-width: 575.98px){.quarto-listing-cols-11{grid-template-columns:minmax(0, 1fr);gap:1.5em}}.quarto-listing-cols-12{grid-template-columns:repeat(12, 
minmax(0, 1fr));gap:1.5em}@media(max-width: 767.98px){.quarto-listing-cols-12{grid-template-columns:repeat(2, minmax(0, 1fr));gap:1.5em}}@media(max-width: 575.98px){.quarto-listing-cols-12{grid-template-columns:minmax(0, 1fr);gap:1.5em}}.quarto-listing-grid{gap:1.5em}.quarto-grid-item.borderless{border:none}.quarto-grid-item.borderless .listing-categories .listing-category:last-of-type,.quarto-grid-item.borderless .listing-categories .listing-category:first-of-type{padding-left:0}.quarto-grid-item.borderless .listing-categories .listing-category{border:0}.quarto-grid-link{text-decoration:none;color:inherit}.quarto-grid-link:hover{text-decoration:none;color:inherit}.quarto-grid-item h5.title,.quarto-grid-item .title.h5{margin-top:0;margin-bottom:0}.quarto-grid-item .card-footer{display:flex;justify-content:space-between;font-size:.8em}.quarto-grid-item .card-footer p{margin-bottom:0}.quarto-grid-item p.card-img-top{margin-bottom:0}.quarto-grid-item p.card-img-top>img{object-fit:cover}.quarto-grid-item .card-other-values{margin-top:.5em;font-size:.8em}.quarto-grid-item .card-other-values tr{margin-bottom:.5em}.quarto-grid-item .card-other-values tr>td:first-of-type{font-weight:600;padding-right:1em;padding-left:1em;vertical-align:top}.quarto-grid-item div.post-contents{display:flex;flex-direction:column;text-decoration:none;height:100%}.quarto-grid-item .listing-item-img-placeholder{background-color:rgba(52,58,64,.25);flex-shrink:0}.quarto-grid-item .card-attribution{padding-top:1em;display:flex;gap:1em;text-transform:uppercase;color:#6c757d;font-weight:500;flex-grow:10;align-items:flex-end}.quarto-grid-item .description{padding-bottom:1em}.quarto-grid-item .card-attribution .date{align-self:flex-end}.quarto-grid-item .card-attribution.justify{justify-content:space-between}.quarto-grid-item .card-attribution.start{justify-content:flex-start}.quarto-grid-item .card-attribution.end{justify-content:flex-end}.quarto-grid-item 
.card-title{margin-bottom:.1em}.quarto-grid-item .card-subtitle{padding-top:.25em}.quarto-grid-item .card-text{font-size:.9em}.quarto-grid-item .listing-reading-time{padding-bottom:.25em}.quarto-grid-item .card-text-small{font-size:.8em}.quarto-grid-item .card-subtitle.subtitle{font-size:.9em;font-weight:600;padding-bottom:.5em}.quarto-grid-item .listing-categories{display:flex;flex-wrap:wrap;padding-bottom:5px}.quarto-grid-item .listing-categories .listing-category{color:#6c757d;border:solid 1px #dee2e6;border-radius:.25rem;text-transform:uppercase;font-size:.65em;padding-left:.5em;padding-right:.5em;padding-top:.15em;padding-bottom:.15em;cursor:pointer;margin-right:4px;margin-bottom:4px}.quarto-grid-item.card-right{text-align:right}.quarto-grid-item.card-right .listing-categories{justify-content:flex-end}.quarto-grid-item.card-left{text-align:left}.quarto-grid-item.card-center{text-align:center}.quarto-grid-item.card-center .listing-description{text-align:justify}.quarto-grid-item.card-center .listing-categories{justify-content:center}table.quarto-listing-table td.image{padding:0px}table.quarto-listing-table td.image img{width:100%;max-width:50px;object-fit:contain}table.quarto-listing-table a{text-decoration:none;word-break:keep-all}table.quarto-listing-table th a{color:inherit}table.quarto-listing-table th a.asc:after{margin-bottom:-2px;margin-left:5px;display:inline-block;height:1rem;width:1rem;background-repeat:no-repeat;background-size:1rem 1rem;background-image:url('data:image/svg+xml,');content:""}table.quarto-listing-table th a.desc:after{margin-bottom:-2px;margin-left:5px;display:inline-block;height:1rem;width:1rem;background-repeat:no-repeat;background-size:1rem 1rem;background-image:url('data:image/svg+xml,');content:""}table.quarto-listing-table.table-hover td{cursor:pointer}.quarto-post.image-left{flex-direction:row}.quarto-post.image-right{flex-direction:row-reverse}@media(max-width: 
767.98px){.quarto-post.image-right,.quarto-post.image-left{gap:0em;flex-direction:column}.quarto-post .metadata{padding-bottom:1em;order:2}.quarto-post .body{order:1}.quarto-post .thumbnail{order:3}}.list.quarto-listing-default div:last-of-type{border-bottom:none}@media(min-width: 992px){.quarto-listing-container-default{margin-right:2em}}div.quarto-post{display:flex;gap:2em;margin-bottom:1.5em;border-bottom:1px solid #dee2e6}@media(max-width: 767.98px){div.quarto-post{padding-bottom:1em}}div.quarto-post .metadata{flex-basis:20%;flex-grow:0;margin-top:.2em;flex-shrink:10}div.quarto-post .thumbnail{flex-basis:30%;flex-grow:0;flex-shrink:0}div.quarto-post .thumbnail img{margin-top:.4em;width:100%;object-fit:cover}div.quarto-post .body{flex-basis:45%;flex-grow:1;flex-shrink:0}div.quarto-post .body h3.listing-title,div.quarto-post .body .listing-title.h3{margin-top:0px;margin-bottom:0px;border-bottom:none}div.quarto-post .body .listing-subtitle{font-size:.875em;margin-bottom:.5em;margin-top:.2em}div.quarto-post .body .description{font-size:.9em}div.quarto-post .body pre code{white-space:pre-wrap}div.quarto-post a{color:#343a40;text-decoration:none}div.quarto-post .metadata{display:flex;flex-direction:column;font-size:.8em;font-family:"Source Sans Pro",-apple-system,BlinkMacSystemFont,"Segoe UI",Roboto,"Helvetica Neue",Arial,sans-serif,"Apple Color Emoji","Segoe UI Emoji","Segoe UI Symbol";flex-basis:33%}div.quarto-post .listing-categories{display:flex;flex-wrap:wrap;padding-bottom:5px}div.quarto-post .listing-categories .listing-category{color:#6c757d;border:solid 1px #dee2e6;border-radius:.25rem;text-transform:uppercase;font-size:.65em;padding-left:.5em;padding-right:.5em;padding-top:.15em;padding-bottom:.15em;cursor:pointer;margin-right:4px;margin-bottom:4px}div.quarto-post .listing-description{margin-bottom:.5em}div.quarto-about-jolla{display:flex !important;flex-direction:column;align-items:center;margin-top:10%;padding-bottom:1em}div.quarto-about-jolla 
.about-image{object-fit:cover;margin-left:auto;margin-right:auto;margin-bottom:1.5em}div.quarto-about-jolla img.round{border-radius:50%}div.quarto-about-jolla img.rounded{border-radius:10px}div.quarto-about-jolla .quarto-title h1.title,div.quarto-about-jolla .quarto-title .title.h1{text-align:center}div.quarto-about-jolla .quarto-title .description{text-align:center}div.quarto-about-jolla h2,div.quarto-about-jolla .h2{border-bottom:none}div.quarto-about-jolla .about-sep{width:60%}div.quarto-about-jolla main{text-align:center}div.quarto-about-jolla .about-links{display:flex}@media(min-width: 992px){div.quarto-about-jolla .about-links{flex-direction:row;column-gap:.8em;row-gap:15px;flex-wrap:wrap}}@media(max-width: 991.98px){div.quarto-about-jolla .about-links{flex-direction:column;row-gap:1em;width:100%;padding-bottom:1.5em}}div.quarto-about-jolla .about-link{color:#626d78;text-decoration:none;border:solid 1px}@media(min-width: 992px){div.quarto-about-jolla .about-link{font-size:.8em;padding:.25em .5em;border-radius:4px}}@media(max-width: 991.98px){div.quarto-about-jolla .about-link{font-size:1.1em;padding:.5em .5em;text-align:center;border-radius:6px}}div.quarto-about-jolla .about-link:hover{color:#2761e3}div.quarto-about-jolla .about-link i.bi{margin-right:.15em}div.quarto-about-solana{display:flex !important;flex-direction:column;padding-top:3em !important;padding-bottom:1em}div.quarto-about-solana .about-entity{display:flex !important;align-items:start;justify-content:space-between}@media(min-width: 992px){div.quarto-about-solana .about-entity{flex-direction:row}}@media(max-width: 991.98px){div.quarto-about-solana .about-entity{flex-direction:column-reverse;align-items:center;text-align:center}}div.quarto-about-solana .about-entity .entity-contents{display:flex;flex-direction:column}@media(max-width: 767.98px){div.quarto-about-solana .about-entity .entity-contents{width:100%}}div.quarto-about-solana .about-entity .about-image{object-fit:cover}@media(max-width: 
991.98px){div.quarto-about-solana .about-entity .about-image{margin-bottom:1.5em}}div.quarto-about-solana .about-entity img.round{border-radius:50%}div.quarto-about-solana .about-entity img.rounded{border-radius:10px}div.quarto-about-solana .about-entity .about-links{display:flex;justify-content:left;padding-bottom:1.2em}@media(min-width: 992px){div.quarto-about-solana .about-entity .about-links{flex-direction:row;column-gap:.8em;row-gap:15px;flex-wrap:wrap}}@media(max-width: 991.98px){div.quarto-about-solana .about-entity .about-links{flex-direction:column;row-gap:1em;width:100%;padding-bottom:1.5em}}div.quarto-about-solana .about-entity .about-link{color:#626d78;text-decoration:none;border:solid 1px}@media(min-width: 992px){div.quarto-about-solana .about-entity .about-link{font-size:.8em;padding:.25em .5em;border-radius:4px}}@media(max-width: 991.98px){div.quarto-about-solana .about-entity .about-link{font-size:1.1em;padding:.5em .5em;text-align:center;border-radius:6px}}div.quarto-about-solana .about-entity .about-link:hover{color:#2761e3}div.quarto-about-solana .about-entity .about-link i.bi{margin-right:.15em}div.quarto-about-solana .about-contents{padding-right:1.5em;flex-basis:0;flex-grow:1}div.quarto-about-solana .about-contents main.content{margin-top:0}div.quarto-about-solana .about-contents h2,div.quarto-about-solana .about-contents .h2{border-bottom:none}div.quarto-about-trestles{display:flex !important;flex-direction:row;padding-top:3em !important;padding-bottom:1em}@media(max-width: 991.98px){div.quarto-about-trestles{flex-direction:column;padding-top:0em !important}}div.quarto-about-trestles .about-entity{display:flex !important;flex-direction:column;align-items:center;text-align:center;padding-right:1em}@media(min-width: 992px){div.quarto-about-trestles .about-entity{flex:0 0 42%}}div.quarto-about-trestles .about-entity .about-image{object-fit:cover;margin-bottom:1.5em}div.quarto-about-trestles .about-entity 
img.round{border-radius:50%}div.quarto-about-trestles .about-entity img.rounded{border-radius:10px}div.quarto-about-trestles .about-entity .about-links{display:flex;justify-content:center}@media(min-width: 992px){div.quarto-about-trestles .about-entity .about-links{flex-direction:row;column-gap:.8em;row-gap:15px;flex-wrap:wrap}}@media(max-width: 991.98px){div.quarto-about-trestles .about-entity .about-links{flex-direction:column;row-gap:1em;width:100%;padding-bottom:1.5em}}div.quarto-about-trestles .about-entity .about-link{color:#626d78;text-decoration:none;border:solid 1px}@media(min-width: 992px){div.quarto-about-trestles .about-entity .about-link{font-size:.8em;padding:.25em .5em;border-radius:4px}}@media(max-width: 991.98px){div.quarto-about-trestles .about-entity .about-link{font-size:1.1em;padding:.5em .5em;text-align:center;border-radius:6px}}div.quarto-about-trestles .about-entity .about-link:hover{color:#2761e3}div.quarto-about-trestles .about-entity .about-link i.bi{margin-right:.15em}div.quarto-about-trestles .about-contents{flex-basis:0;flex-grow:1}div.quarto-about-trestles .about-contents h2,div.quarto-about-trestles .about-contents .h2{border-bottom:none}@media(min-width: 992px){div.quarto-about-trestles .about-contents{border-left:solid 1px #dee2e6;padding-left:1.5em}}div.quarto-about-trestles .about-contents main.content{margin-top:0}div.quarto-about-marquee{padding-bottom:1em}div.quarto-about-marquee .about-contents{display:flex;flex-direction:column}div.quarto-about-marquee .about-image{max-height:550px;margin-bottom:1.5em;object-fit:cover}div.quarto-about-marquee img.round{border-radius:50%}div.quarto-about-marquee img.rounded{border-radius:10px}div.quarto-about-marquee h2,div.quarto-about-marquee .h2{border-bottom:none}div.quarto-about-marquee .about-links{display:flex;justify-content:center;padding-top:1.5em}@media(min-width: 992px){div.quarto-about-marquee 
.about-links{flex-direction:row;column-gap:.8em;row-gap:15px;flex-wrap:wrap}}@media(max-width: 991.98px){div.quarto-about-marquee .about-links{flex-direction:column;row-gap:1em;width:100%;padding-bottom:1.5em}}div.quarto-about-marquee .about-link{color:#626d78;text-decoration:none;border:solid 1px}@media(min-width: 992px){div.quarto-about-marquee .about-link{font-size:.8em;padding:.25em .5em;border-radius:4px}}@media(max-width: 991.98px){div.quarto-about-marquee .about-link{font-size:1.1em;padding:.5em .5em;text-align:center;border-radius:6px}}div.quarto-about-marquee .about-link:hover{color:#2761e3}div.quarto-about-marquee .about-link i.bi{margin-right:.15em}@media(min-width: 992px){div.quarto-about-marquee .about-link{border:none}}div.quarto-about-broadside{display:flex;flex-direction:column;padding-bottom:1em}div.quarto-about-broadside .about-main{display:flex !important;padding-top:0 !important}@media(min-width: 992px){div.quarto-about-broadside .about-main{flex-direction:row;align-items:flex-start}}@media(max-width: 991.98px){div.quarto-about-broadside .about-main{flex-direction:column}}@media(max-width: 991.98px){div.quarto-about-broadside .about-main .about-entity{flex-shrink:0;width:100%;height:450px;margin-bottom:1.5em;background-size:cover;background-repeat:no-repeat}}@media(min-width: 992px){div.quarto-about-broadside .about-main .about-entity{flex:0 10 50%;margin-right:1.5em;width:100%;height:100%;background-size:100%;background-repeat:no-repeat}}div.quarto-about-broadside .about-main .about-contents{padding-top:14px;flex:0 0 50%}div.quarto-about-broadside h2,div.quarto-about-broadside .h2{border-bottom:none}div.quarto-about-broadside .about-sep{margin-top:1.5em;width:60%;align-self:center}div.quarto-about-broadside .about-links{display:flex;justify-content:center;column-gap:20px;padding-top:1.5em}@media(min-width: 992px){div.quarto-about-broadside .about-links{flex-direction:row;column-gap:.8em;row-gap:15px;flex-wrap:wrap}}@media(max-width: 
991.98px){div.quarto-about-broadside .about-links{flex-direction:column;row-gap:1em;width:100%;padding-bottom:1.5em}}div.quarto-about-broadside .about-link{color:#626d78;text-decoration:none;border:solid 1px}@media(min-width: 992px){div.quarto-about-broadside .about-link{font-size:.8em;padding:.25em .5em;border-radius:4px}}@media(max-width: 991.98px){div.quarto-about-broadside .about-link{font-size:1.1em;padding:.5em .5em;text-align:center;border-radius:6px}}div.quarto-about-broadside .about-link:hover{color:#2761e3}div.quarto-about-broadside .about-link i.bi{margin-right:.15em}@media(min-width: 992px){div.quarto-about-broadside .about-link{border:none}}.tippy-box[data-theme~=quarto]{background-color:#fff;border:solid 1px #dee2e6;border-radius:.25rem;color:#343a40;font-size:.875rem}.tippy-box[data-theme~=quarto]>.tippy-backdrop{background-color:#fff}.tippy-box[data-theme~=quarto]>.tippy-arrow:after,.tippy-box[data-theme~=quarto]>.tippy-svg-arrow:after{content:"";position:absolute;z-index:-1}.tippy-box[data-theme~=quarto]>.tippy-arrow:after{border-color:rgba(0,0,0,0);border-style:solid}.tippy-box[data-placement^=top]>.tippy-arrow:before{bottom:-6px}.tippy-box[data-placement^=bottom]>.tippy-arrow:before{top:-6px}.tippy-box[data-placement^=right]>.tippy-arrow:before{left:-6px}.tippy-box[data-placement^=left]>.tippy-arrow:before{right:-6px}.tippy-box[data-theme~=quarto][data-placement^=top]>.tippy-arrow:before{border-top-color:#fff}.tippy-box[data-theme~=quarto][data-placement^=top]>.tippy-arrow:after{border-top-color:#dee2e6;border-width:7px 7px 0;top:17px;left:1px}.tippy-box[data-theme~=quarto][data-placement^=top]>.tippy-svg-arrow>svg{top:16px}.tippy-box[data-theme~=quarto][data-placement^=top]>.tippy-svg-arrow:after{top:17px}.tippy-box[data-theme~=quarto][data-placement^=bottom]>.tippy-arrow:before{border-bottom-color:#fff;bottom:16px}.tippy-box[data-theme~=quarto][data-placement^=bottom]>.tippy-arrow:after{border-bottom-color:#dee2e6;border-width:0 7px 
7px;bottom:17px;left:1px}.tippy-box[data-theme~=quarto][data-placement^=bottom]>.tippy-svg-arrow>svg{bottom:15px}.tippy-box[data-theme~=quarto][data-placement^=bottom]>.tippy-svg-arrow:after{bottom:17px}.tippy-box[data-theme~=quarto][data-placement^=left]>.tippy-arrow:before{border-left-color:#fff}.tippy-box[data-theme~=quarto][data-placement^=left]>.tippy-arrow:after{border-left-color:#dee2e6;border-width:7px 0 7px 7px;left:17px;top:1px}.tippy-box[data-theme~=quarto][data-placement^=left]>.tippy-svg-arrow>svg{left:11px}.tippy-box[data-theme~=quarto][data-placement^=left]>.tippy-svg-arrow:after{left:12px}.tippy-box[data-theme~=quarto][data-placement^=right]>.tippy-arrow:before{border-right-color:#fff;right:16px}.tippy-box[data-theme~=quarto][data-placement^=right]>.tippy-arrow:after{border-width:7px 7px 7px 0;right:17px;top:1px;border-right-color:#dee2e6}.tippy-box[data-theme~=quarto][data-placement^=right]>.tippy-svg-arrow>svg{right:11px}.tippy-box[data-theme~=quarto][data-placement^=right]>.tippy-svg-arrow:after{right:12px}.tippy-box[data-theme~=quarto]>.tippy-svg-arrow{fill:#343a40}.tippy-box[data-theme~=quarto]>.tippy-svg-arrow:after{background-image:url(data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iMTYiIGhlaWdodD0iNiIgeG1sbnM9Imh0dHA6Ly93d3cudzMub3JnLzIwMDAvc3ZnIj48cGF0aCBkPSJNMCA2czEuNzk2LS4wMTMgNC42Ny0zLjYxNUM1Ljg1MS45IDYuOTMuMDA2IDggMGMxLjA3LS4wMDYgMi4xNDguODg3IDMuMzQzIDIuMzg1QzE0LjIzMyA2LjAwNSAxNiA2IDE2IDZIMHoiIGZpbGw9InJnYmEoMCwgOCwgMTYsIDAuMikiLz48L3N2Zz4=);background-size:16px 6px;width:16px;height:6px}.top-right{position:absolute;top:1em;right:1em}.visually-hidden{border:0;clip:rect(0 0 0 0);height:auto;margin:0;overflow:hidden;padding:0;position:absolute;width:1px;white-space:nowrap}.hidden{display:none !important}.zindex-bottom{z-index:-1 
!important}figure.figure{display:block}.quarto-layout-panel{margin-bottom:1em}.quarto-layout-panel>figure{width:100%}.quarto-layout-panel>figure>figcaption,.quarto-layout-panel>.panel-caption{margin-top:10pt}.quarto-layout-panel>.table-caption{margin-top:0px}.table-caption p{margin-bottom:.5em}.quarto-layout-row{display:flex;flex-direction:row;align-items:flex-start}.quarto-layout-valign-top{align-items:flex-start}.quarto-layout-valign-bottom{align-items:flex-end}.quarto-layout-valign-center{align-items:center}.quarto-layout-cell{position:relative;margin-right:20px}.quarto-layout-cell:last-child{margin-right:0}.quarto-layout-cell figure,.quarto-layout-cell>p{margin:.2em}.quarto-layout-cell img{max-width:100%}.quarto-layout-cell .html-widget{width:100% !important}.quarto-layout-cell div figure p{margin:0}.quarto-layout-cell figure{display:block;margin-inline-start:0;margin-inline-end:0}.quarto-layout-cell table{display:inline-table}.quarto-layout-cell-subref figcaption,figure .quarto-layout-row figure figcaption{text-align:center;font-style:italic}.quarto-figure{position:relative;margin-bottom:1em}.quarto-figure>figure{width:100%;margin-bottom:0}.quarto-figure-left>figure>p,.quarto-figure-left>figure>div{text-align:left}.quarto-figure-center>figure>p,.quarto-figure-center>figure>div{text-align:center}.quarto-figure-right>figure>p,.quarto-figure-right>figure>div{text-align:right}.quarto-figure>figure>div.cell-annotation,.quarto-figure>figure>div 
code{text-align:left}figure>p:empty{display:none}figure>p:first-child{margin-top:0;margin-bottom:0}figure>figcaption.quarto-float-caption-bottom{margin-bottom:.5em}figure>figcaption.quarto-float-caption-top{margin-top:.5em}div[id^=tbl-]{position:relative}.quarto-figure>.anchorjs-link{position:absolute;top:.6em;right:.5em}div[id^=tbl-]>.anchorjs-link{position:absolute;top:.7em;right:.3em}.quarto-figure:hover>.anchorjs-link,div[id^=tbl-]:hover>.anchorjs-link,h2:hover>.anchorjs-link,.h2:hover>.anchorjs-link,h3:hover>.anchorjs-link,.h3:hover>.anchorjs-link,h4:hover>.anchorjs-link,.h4:hover>.anchorjs-link,h5:hover>.anchorjs-link,.h5:hover>.anchorjs-link,h6:hover>.anchorjs-link,.h6:hover>.anchorjs-link,.reveal-anchorjs-link>.anchorjs-link{opacity:1}#title-block-header{margin-block-end:1rem;position:relative;margin-top:-1px}#title-block-header .abstract{margin-block-start:1rem}#title-block-header .abstract .abstract-title{font-weight:600}#title-block-header a{text-decoration:none}#title-block-header .author,#title-block-header .date,#title-block-header .doi{margin-block-end:.2rem}#title-block-header .quarto-title-block>div{display:flex}#title-block-header .quarto-title-block>div>h1,#title-block-header .quarto-title-block>div>.h1{flex-grow:1}#title-block-header .quarto-title-block>div>button{flex-shrink:0;height:2.25rem;margin-top:0}@media(min-width: 992px){#title-block-header .quarto-title-block>div>button{margin-top:5px}}tr.header>th>p:last-of-type{margin-bottom:0px}table,table.table{margin-top:.5rem;margin-bottom:.5rem}caption,.table-caption{padding-top:.5rem;padding-bottom:.5rem;text-align:center}figure.quarto-float-tbl figcaption.quarto-float-caption-top{margin-top:.5rem;margin-bottom:.25rem;text-align:center}figure.quarto-float-tbl 
figcaption.quarto-float-caption-bottom{padding-top:.25rem;margin-bottom:.5rem;text-align:center}.utterances{max-width:none;margin-left:-8px}iframe{margin-bottom:1em}details{margin-bottom:1em}details[show]{margin-bottom:0}details>summary{color:#6c757d}details>summary>p:only-child{display:inline}pre.sourceCode,code.sourceCode{position:relative}dd code:not(.sourceCode),p code:not(.sourceCode){white-space:pre-wrap}code{white-space:pre}@media print{code{white-space:pre-wrap}}pre>code{display:block}pre>code.sourceCode{white-space:pre}pre>code.sourceCode>span>a:first-child::before{text-decoration:none}pre.code-overflow-wrap>code.sourceCode{white-space:pre-wrap}pre.code-overflow-scroll>code.sourceCode{white-space:pre}code a:any-link{color:inherit;text-decoration:none}code a:hover{color:inherit;text-decoration:underline}ul.task-list{padding-left:1em}[data-tippy-root]{display:inline-block}.tippy-content .footnote-back{display:none}.footnote-back{margin-left:.2em}.tippy-content{overflow-x:auto}.quarto-embedded-source-code{display:none}.quarto-unresolved-ref{font-weight:600}.quarto-cover-image{max-width:35%;float:right;margin-left:30px}.cell-output-display .widget-subarea{margin-bottom:1em}.cell-output-display:not(.no-overflow-x),.knitsql-table:not(.no-overflow-x){overflow-x:auto}.panel-input{margin-bottom:1em}.panel-input>div,.panel-input>div>div{display:inline-block;vertical-align:top;padding-right:12px}.panel-input>p:last-child{margin-bottom:0}.layout-sidebar{margin-bottom:1em}.layout-sidebar .tab-content{border:none}.tab-content>.page-columns.active{display:grid}div.sourceCode>iframe{width:100%;height:300px;margin-bottom:-0.5em}a{text-underline-offset:3px}.callout pre.sourceCode{padding-left:0}div.ansi-escaped-output{font-family:monospace;display:block}/*! 
* * ansi colors from IPython notebook's * diff --git a/site_libs/quarto-html/quarto-syntax-highlighting-e26003cea8cd680ca0c55a263523d882.css b/site_libs/quarto-html/quarto-syntax-highlighting-549806ee2085284f45b00abea8c6df48.css similarity index 97% rename from site_libs/quarto-html/quarto-syntax-highlighting-e26003cea8cd680ca0c55a263523d882.css rename to site_libs/quarto-html/quarto-syntax-highlighting-549806ee2085284f45b00abea8c6df48.css index 74e04ef..80e34e4 100644 --- a/site_libs/quarto-html/quarto-syntax-highlighting-e26003cea8cd680ca0c55a263523d882.css +++ b/site_libs/quarto-html/quarto-syntax-highlighting-549806ee2085284f45b00abea8c6df48.css @@ -202,4 +202,4 @@ code span.kw { content: " { - category = atob(category); + // category is URI encoded in EJS template for UTF-8 support + category = decodeURIComponent(atob(category)); if (categoriesLoaded) { activateCategory(category); setCategoryHash(category); diff --git a/sitemap.xml b/sitemap.xml index 849d1dc..7680ebd 100644 --- a/sitemap.xml +++ b/sitemap.xml @@ -1,227 +1,211 @@ - https://patel-zeel.github.io/blog/posts/pruning_vs_uncertainty.html - 2024-12-27T07:12:26.325Z + https://patel-zeel.github.io/blog/lab/scratchpad.html + 2025-02-11T22:05:40.356Z - https://patel-zeel.github.io/blog/posts/fundamentals_across_domains.html - 2024-12-27T07:12:26.313Z + https://patel-zeel.github.io/blog/posts/PurpleAir.html + 2025-02-11T22:05:40.459Z - https://patel-zeel.github.io/blog/posts/AL_with_MNIST.html - 2024-12-27T07:12:26.093Z + https://patel-zeel.github.io/blog/posts/climate-modeling-with-siren.html + 2025-02-11T22:05:40.573Z - https://patel-zeel.github.io/blog/posts/2021-03-22-gp_kernels.html - 2024-12-27T07:12:26.157Z + https://patel-zeel.github.io/blog/posts/2022-10-21-gaussian-processes.html + 2025-02-11T22:05:40.440Z - https://patel-zeel.github.io/blog/posts/2022-03-06-probabilistic-machine-learning.html - 2024-12-27T07:12:26.321Z + https://patel-zeel.github.io/blog/posts/Basis_functions.html + 
2025-02-11T22:05:40.408Z - https://patel-zeel.github.io/blog/posts/2020-09-21-programatically_download_openaq_data.html - 2024-12-27T07:12:26.173Z + https://patel-zeel.github.io/blog/posts/2021-10-23-warped-gp.html + 2025-02-11T22:05:40.568Z - https://patel-zeel.github.io/blog/posts/ssh-macos.html - 2024-12-27T07:12:26.329Z + https://patel-zeel.github.io/blog/posts/2022-10-27-mogp.html + 2025-02-11T22:05:40.448Z - https://patel-zeel.github.io/blog/posts/bayesian-gaussian-basis-regression.html - 2024-12-27T07:12:26.309Z + https://patel-zeel.github.io/blog/posts/2022-08-01-conditional_neural_processes.html + 2025-02-11T22:05:40.574Z - https://patel-zeel.github.io/blog/posts/2023-03-28-nngp.html - 2024-12-27T07:12:26.321Z + https://patel-zeel.github.io/blog/posts/2022-10-18-kfac-laplace.html + 2025-02-11T22:05:40.448Z - https://patel-zeel.github.io/blog/posts/2022-01-25-gp_frameworks_comparison.html - 2024-12-27T07:12:26.129Z + https://patel-zeel.github.io/blog/posts/2024-12-29-object-detection-how-to.html + 2025-02-11T22:05:40.378Z - https://patel-zeel.github.io/blog/posts/climate-modeling-with-siren.html - 2024-12-27T07:12:26.313Z + https://patel-zeel.github.io/blog/posts/GNNs_and_GPs.html + 2025-02-11T22:05:40.413Z - https://patel-zeel.github.io/blog/posts/2022-04-06-github_faqs.html - 2024-12-27T07:12:26.157Z + https://patel-zeel.github.io/blog/posts/presentation_tips.html + 2025-02-11T22:05:40.583Z - https://patel-zeel.github.io/blog/posts/torch-tips.html - 2024-12-27T07:12:26.329Z + https://patel-zeel.github.io/blog/posts/GNN_for_regression.html + 2025-02-11T22:05:40.412Z - https://patel-zeel.github.io/blog/posts/2022-03-08-torch-essentials.html - 2024-12-27T07:12:26.329Z + https://patel-zeel.github.io/blog/posts/non-gaussian-likelihood-mlps.html + 2025-02-11T22:05:40.582Z - https://patel-zeel.github.io/blog/posts/2024-12-10-cpcb-download.html - 2024-12-27T07:12:26.089Z + https://patel-zeel.github.io/blog/posts/2022-03-08-torch-essentials.html + 
2025-02-11T22:05:40.589Z - https://patel-zeel.github.io/blog/posts/py_over_ipynb.html - 2024-12-27T07:12:26.325Z + https://patel-zeel.github.io/blog/posts/bayesian-gaussian-basis-regression.html + 2025-02-11T22:05:40.570Z - https://patel-zeel.github.io/blog/posts/2022-10-27-mogp.html - 2024-12-27T07:12:26.165Z + https://patel-zeel.github.io/blog/posts/2022-03-06-probabilistic-machine-learning.html + 2025-02-11T22:05:40.583Z - https://patel-zeel.github.io/blog/posts/2022-03-05-uncertainty-in-deep-learning.html - 2024-12-27T07:12:26.329Z + https://patel-zeel.github.io/blog/posts/2022-01-24-query_by_committee.html + 2025-02-11T22:05:40.520Z - https://patel-zeel.github.io/blog/posts/seq_to_seq.html - 2024-12-27T07:12:26.325Z + https://patel-zeel.github.io/blog/posts/ssh-macos.html + 2025-02-11T22:05:40.589Z - https://patel-zeel.github.io/blog/posts/wrf-tutorial.html - 2024-12-27T07:12:26.329Z + https://patel-zeel.github.io/blog/posts/2022-10-31-stochastic-variational-gp.html + 2025-02-11T22:05:40.563Z - https://patel-zeel.github.io/blog/posts/2022-05-17-contributors_sorted_by_prs.html - 2024-12-27T07:12:26.313Z + https://patel-zeel.github.io/blog/posts/2024-12-10-cpcb-download.html + 2025-02-11T22:05:40.367Z - https://patel-zeel.github.io/blog/posts/2021-10-26-anonymization-tips.html - 2024-12-27T07:12:26.121Z + https://patel-zeel.github.io/blog/posts/CNPs_for_Images.html + 2025-02-11T22:05:40.409Z https://patel-zeel.github.io/blog/posts/Rank1_GPs.html - 2024-12-27T07:12:26.249Z - - - https://patel-zeel.github.io/blog/posts/PurpleAir.html - 2024-12-27T07:12:26.177Z + 2025-02-11T22:05:40.522Z - https://patel-zeel.github.io/blog/posts/2022-08-01-conditional_neural_processes.html - 2024-12-27T07:12:26.313Z - - - https://patel-zeel.github.io/blog/posts/Basis_functions.html - 2024-12-27T07:12:26.121Z + https://patel-zeel.github.io/blog/posts/py_over_ipynb.html + 2025-02-11T22:05:40.585Z - https://patel-zeel.github.io/blog/posts/2024-12-27-download_caaqm_locations.html - 
2024-12-27T07:12:26.089Z + https://patel-zeel.github.io/blog/posts/kl-divergence.html + 2025-02-11T22:05:40.579Z https://patel-zeel.github.io/blog/index.html - 2024-12-27T07:12:26.077Z + 2025-02-11T22:05:40.315Z https://patel-zeel.github.io/blog/about.html - 2024-12-27T07:12:25.361Z + 2025-02-11T22:05:39.642Z - https://patel-zeel.github.io/blog/posts/2022-01-24-query_by_committee.html - 2024-12-27T07:12:26.245Z - - - https://patel-zeel.github.io/blog/posts/2022-06-10-jaxoptimizers.html - 2024-12-27T07:12:26.165Z - - - https://patel-zeel.github.io/blog/posts/2022-10-18-kfac-laplace.html - 2024-12-27T07:12:26.165Z + https://patel-zeel.github.io/blog/posts/climate-modeling-with-SpecialGP.html + 2025-02-11T22:05:40.573Z - https://patel-zeel.github.io/blog/posts/Torch-DataLoaders.html - 2024-12-27T07:12:26.293Z + https://patel-zeel.github.io/blog/posts/2021-10-26-anonymization-tips.html + 2025-02-11T22:05:40.408Z - https://patel-zeel.github.io/blog/posts/GNNs_and_GPs.html - 2024-12-27T07:12:26.129Z + https://patel-zeel.github.io/blog/posts/2023-04-29-sine-combination-netowrks.html + 2025-02-11T22:05:40.589Z - https://patel-zeel.github.io/blog/posts/air-quality-google-.html - 2024-12-27T07:12:26.305Z + https://patel-zeel.github.io/blog/posts/2020-09-21-programatically_download_openaq_data.html + 2025-02-11T22:05:40.454Z - https://patel-zeel.github.io/blog/posts/gcloud.html - 2024-12-27T07:12:26.313Z + https://patel-zeel.github.io/blog/posts/2023-03-28-nngp.html + 2025-02-11T22:05:40.581Z - https://patel-zeel.github.io/blog/posts/2023-04-29-sine-combination-netowrks.html - 2024-12-27T07:12:26.329Z + https://patel-zeel.github.io/blog/posts/2025-02-10-object-detection-random-baseline.html + 2025-02-11T22:05:40.378Z - https://patel-zeel.github.io/blog/posts/docker_cheatsheet.html - 2024-12-27T07:12:26.313Z + https://patel-zeel.github.io/blog/posts/2022-01-25-gp_frameworks_comparison.html + 2025-02-11T22:05:40.414Z - 
https://patel-zeel.github.io/blog/posts/GPT-from-scratch.html - 2024-12-27T07:12:26.129Z + https://patel-zeel.github.io/blog/posts/2021-10-12-sparsegps.html + 2025-02-11T22:05:40.560Z - https://patel-zeel.github.io/blog/posts/Multiclass_GP_classification.html - 2024-12-27T07:12:26.173Z + https://patel-zeel.github.io/blog/posts/2024-12-27-download_caaqm_locations copy.html + 2025-02-11T22:05:40.367Z - https://patel-zeel.github.io/blog/posts/foundation-models-for-time-series.html - 2024-12-27T07:12:26.313Z + https://patel-zeel.github.io/blog/posts/2022-04-06-github_faqs.html + 2025-02-11T22:05:40.440Z - https://patel-zeel.github.io/blog/posts/learnings_from_brick_kiln_project.html - 2024-12-27T07:12:26.317Z + https://patel-zeel.github.io/blog/posts/2021-03-22-gp_kernels.html + 2025-02-11T22:05:40.440Z - https://patel-zeel.github.io/blog/posts/presentation_tips.html - 2024-12-27T07:12:26.321Z + https://patel-zeel.github.io/blog/posts/docker_cheatsheet.html + 2025-02-11T22:05:40.575Z - https://patel-zeel.github.io/blog/posts/2021-10-23-warped-gp.html - 2024-12-27T07:12:26.297Z + https://patel-zeel.github.io/blog/posts/numpy-algebra- copy.html + 2025-02-11T22:05:40.582Z - https://patel-zeel.github.io/blog/posts/numpy-algebra- copy.html - 2024-12-27T07:12:26.321Z + https://patel-zeel.github.io/blog/posts/2020-03-28-active_learning_with_bayesian_linear_regression.html + 2025-02-11T22:05:40.407Z - https://patel-zeel.github.io/blog/posts/2021-10-12-sparsegps.html - 2024-12-27T07:12:26.289Z + https://patel-zeel.github.io/blog/posts/Multiclass_GP_classification.html + 2025-02-11T22:05:40.453Z - https://patel-zeel.github.io/blog/posts/GNN_for_regression.html - 2024-12-27T07:12:26.129Z + https://patel-zeel.github.io/blog/posts/2022-05-14-iteratively_reweighted_least_squares.html + 2025-02-11T22:05:40.446Z - https://patel-zeel.github.io/blog/posts/2022-10-31-stochastic-variational-gp.html - 2024-12-27T07:12:26.293Z + 
https://patel-zeel.github.io/blog/posts/fundamentals_across_domains.html + 2025-02-11T22:05:40.575Z - https://patel-zeel.github.io/blog/posts/2021-09-27-constraints.html - 2024-12-27T07:12:26.125Z + https://patel-zeel.github.io/blog/posts/pruning_vs_uncertainty.html + 2025-02-11T22:05:40.585Z - https://patel-zeel.github.io/blog/posts/kl-divergence.html - 2024-12-27T07:12:26.317Z + https://patel-zeel.github.io/blog/posts/gcloud.html + 2025-02-11T22:05:40.575Z - https://patel-zeel.github.io/blog/posts/2020-03-28-active_learning_with_bayesian_linear_regression.html - 2024-12-27T07:12:26.121Z + https://patel-zeel.github.io/blog/posts/2022-05-17-contributors_sorted_by_prs.html + 2025-02-11T22:05:40.574Z - https://patel-zeel.github.io/blog/posts/non-gaussian-likelihood-mlps.html - 2024-12-27T07:12:26.321Z + https://patel-zeel.github.io/blog/posts/2022-03-05-uncertainty-in-deep-learning.html + 2025-02-11T22:05:40.589Z - https://patel-zeel.github.io/blog/posts/2022-05-14-iteratively_reweighted_least_squares.html - 2024-12-27T07:12:26.165Z + https://patel-zeel.github.io/blog/posts/2022-06-10-jaxoptimizers.html + 2025-02-11T22:05:40.446Z - https://patel-zeel.github.io/blog/posts/CNPs_for_Images.html - 2024-12-27T07:12:26.125Z + https://patel-zeel.github.io/blog/posts/torch-tips.html + 2025-02-11T22:05:40.589Z - https://patel-zeel.github.io/blog/posts/climate-modeling-with-SpecialGP.html - 2024-12-27T07:12:26.313Z + https://patel-zeel.github.io/blog/posts/2021-09-27-constraints.html + 2025-02-11T22:05:40.409Z - https://patel-zeel.github.io/blog/posts/2022-10-21-gaussian-processes.html - 2024-12-27T07:12:26.157Z + https://patel-zeel.github.io/blog/lab/how-to-finetune-florence-2-on-detection-dataset.html + 2025-02-11T22:05:40.344Z