This repository contains Terraform scripts that prepare different operational sources to be crawled by Kaleidoscope. It currently supports AWS and Azure.
The crawler systematically scans and ingests data and events from cloud resources. The Terraform modules in this repository set up the infrastructure needed to capture that information from sources within each cloud environment. There are two types of crawl:
Data Crawl:
- Involves systematically scanning and gathering information about the configuration, usage, and metadata of cloud resources.
- Helps in understanding how resources are configured, utilized, and interconnected within the cloud environment.
- Enables organizations to identify inefficiencies, optimize resource allocation, and ensure compliance with security and governance policies.
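The exact resources this repository provisions are not shown here, but as a rough sketch of what a data crawl can depend on, an AWS setup might use an AWS Config recorder to capture resource configuration and metadata. The role and recorder names below are hypothetical, and a real setup would also need a delivery channel before recording can start:

```hcl
# Illustrative sketch only -- names are assumptions, not this repo's actual resources.

# IAM role that AWS Config assumes to read resource configuration.
resource "aws_iam_role" "config" {
  name = "kaleidoscope-config-role" # hypothetical name
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect    = "Allow"
      Action    = "sts:AssumeRole"
      Principal = { Service = "config.amazonaws.com" }
    }]
  })
}

# Recorder that captures configuration and metadata for all supported resource types.
# A delivery channel and recorder status resource would be needed to start recording.
resource "aws_config_configuration_recorder" "this" {
  name     = "kaleidoscope-data-crawl" # hypothetical name
  role_arn = aws_iam_role.config.arn

  recording_group {
    all_supported = true
  }
}
```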
Event Crawl:
- Focuses on capturing and processing events generated by cloud services, such as resource provisioning, configuration changes, and access activities.
- Provides insights into resource activities, changes, and potential security incidents.
- Enables organizations to monitor and respond to events in real-time, ensuring the security, availability, and performance of their cloud infrastructure.
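Similarly, and purely as an illustration (the trail and bucket names are assumptions, not this repository's actual resources), an event crawl on AWS might capture management events such as provisioning, configuration changes, and access activity with a CloudTrail trail delivering to S3:

```hcl
# Illustrative sketch only -- names are assumptions, not this repo's actual resources.

# Bucket that receives CloudTrail event logs.
resource "aws_s3_bucket" "trail" {
  bucket = "kaleidoscope-event-crawl-logs" # hypothetical name
}

# Bucket policy granting CloudTrail permission to write log files.
resource "aws_s3_bucket_policy" "trail" {
  bucket = aws_s3_bucket.trail.id
  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Sid       = "AWSCloudTrailAclCheck"
        Effect    = "Allow"
        Principal = { Service = "cloudtrail.amazonaws.com" }
        Action    = "s3:GetBucketAcl"
        Resource  = aws_s3_bucket.trail.arn
      },
      {
        Sid       = "AWSCloudTrailWrite"
        Effect    = "Allow"
        Principal = { Service = "cloudtrail.amazonaws.com" }
        Action    = "s3:PutObject"
        Resource  = "${aws_s3_bucket.trail.arn}/*"
        Condition = { StringEquals = { "s3:x-amz-acl" = "bucket-owner-full-control" } }
      }
    ]
  })
}

# Multi-region trail that records management events across the account.
resource "aws_cloudtrail" "events" {
  name                          = "kaleidoscope-event-crawl" # hypothetical name
  s3_bucket_name                = aws_s3_bucket.trail.id
  include_global_service_events = true
  is_multi_region_trail         = true

  depends_on = [aws_s3_bucket_policy.trail]
}
```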
Both data and event crawls are essential for maintaining visibility and control over cloud resources, supporting proactive management, compliance enforcement, and security monitoring efforts.
For more information and detailed usage instructions, refer to each module's documentation in its respective directory.
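As a rough orientation only, pulling a module into a root configuration could look something like the sketch below. The module paths and names are assumptions; the real inputs each module expects are documented in its own directory:

```hcl
# Hypothetical usage sketch -- source paths and module names are assumptions.
module "aws_crawl" {
  source = "./aws" # assumed module directory
  # provider- and crawl-specific variables go here
}

module "azure_crawl" {
  source = "./azure" # assumed module directory
  # provider- and crawl-specific variables go here
}
```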