Commit

Datastore (#301)
* refactor: Add CaseDataStore model and relationship to Case model

* refactor: Add CaseDataStore model and relationship to Case model

* chore: Add MINIO_SECURE environment variable to .env.example and use it in data_store_session.py

* feat: Add endpoint for uploading case data store...still testing

* refactor: Add endpoint for uploading file to case

* feat: Add delete endpoint for case data store file

* refactor: Add ListCaseDataStoreResponse model and endpoints for listing case data store files

* refactor: Add download endpoint for case data store file

* refactor: Update README.md with YouTube channel subscribers badge

* filter alerts by customer code

* refactor: updated pinia settings

* chore: update vue dependencies

* refactor: vue components and utils

* new eslint settings

* refactor: eslint fix

* refactor: eslint fix

* refactor: eslint fix

* refactor: eslint fix

* refactor: eslint fix

* refactor: eslint fix

* refactor: eslint fix

* fix: vue dependencies conflicts

* refactor: Add customer code field to Case model and update related endpoints

* refactor: Add list_cases_by_customer_code endpoint and related functions

* refactor: eslint fix

* refactor: Improve search for index source in Wazuh-Indexer

* refactor: Update Docker tags for backend and frontend images

* Update miniopy-async dependency to version 1.21.1

* chore: Update default value for MINIO_ROOT_PASSWORD in data_store_session.py

* refactor: Optimize index source retrieval in Wazuh-Indexer

* refactor: Update post_to_copilot_mimecast_module signature to include optional license_key parameter

* refactor: Add license_key parameter to post_to_copilot_huntress_module

* refactor: Update post_to_copilot_carbonblack_module signature to include optional license_key parameter

* refactor: Add list_alerts_by_source_endpoint and related functions

* refactor: agent sca download

* chore: update vue dependencies

* feat: add CaseDataStore components

* feat: add CaseDataStore api/types

* refactor: Remove wazuh_index_fields_resize and rename

* feat: Upload Data Store file

* refactor: lint

* feat: Enable bucket creation during initialization

The code change in `copilot.py` enables bucket creation during initialization: the call was previously commented out and is now restored so that the required buckets are created at startup (a hedged sketch of such a startup hook appears after the `data_store_setup.py` diff below).

* export cases and alerts

* chore: update dependencies in frontend/package.json

* feat: Add multiple filters for listing alerts

* chore: Update Docker tags for backend and frontend images

* feat: add customerCode and source filter to incident management API

* feat: add customer_code and source filter to AlertsList

* feat: add link to customer code in cases list and overview

* feat: autocomplete for customerCode and source filters

* feat: Add CSV export functionality for cases and alerts

* refactor: incidentManagement components

* added customer_code to filename

* feat: improve support for userId parameter in getLogs function

* refactor: update baseInfo computation to return merged SocCaseExt values

* refactor: update Docker workflow to use "lab" tags for backend and frontend images

* refactor: Remove Discord notifications from Docker workflow

* refactor: Update Docker workflow to use v0.1.2 of CoPilot

* Update .env file to configure Copilot environment variables

* chore: update dependencies in frontend

* chore: update optional dependencies in frontend

* feat: add exportCases api

* refactor: ActiveResponseActions

* refactor: download fileName

* refactor: CaseCreationButton props

* refactor: incidentManagement filters

* feat: add cases export feature

* some precommit fixes

* chore: Update Docker tags for backend and frontend images

* update precommit eslint

* fix: eslint config

* precommit fixes

---------

Co-authored-by: Davide Di Modica <webmaster.ddm@gmail.com>
taylorwalton and Linko91 authored Sep 22, 2024
1 parent e532410 commit 2392a6a
Showing 416 changed files with 11,339 additions and 4,898 deletions.
5 changes: 5 additions & 0 deletions .env.example
@@ -6,6 +6,11 @@ MYSQL_ROOT_PASSWORD=REPLACE_WITH_PASSWORD
MYSQL_USER=copilot
MYSQL_PASSWORD=REPLACE_WITH_PASSWORD

+MINIO_URL=copilot-minio
+MINIO_ROOT_USER=admin
+MINIO_ROOT_PASSWORD=REPLACE_ME
+MINIO_SECURE=False
+
# ! ALERT FORWARDING IP
# Set this to the IP of the host running CoPilot. This is used by Graylog to forward alerts to CoPilot
# ! Not needed anymore since we are reading from the index now

12 changes: 2 additions & 10 deletions .pre-commit-config.yaml
@@ -51,15 +51,7 @@ repos:
      - id: prettier

  - repo: https://github.com/pre-commit/mirrors-eslint
-    rev: v9.5.0
+    rev: v9.11.0
    hooks:
      - id: eslint
-        files: \.([cjt]sx?|[cm]ts|[cm]js|cvue)$ # *.js, *.jsx, *.ts, *.tsx and *.vue
-        args: ["--config", "frontend/eslint.config.js"]
-        additional_dependencies:
-          - eslint@9.5.0
-          - "@vue/eslint-config-prettier@9.0.0"
-          - "@vue/eslint-config-typescript@13.0.0"
-          - eslint-plugin-cypress@3.3.0
-          - eslint-plugin-vue@9.26.0
-          - globals@15.6.0
+        args: ["--config", "frontend/eslint.config.mjs"]
2 changes: 2 additions & 0 deletions .vscode/settings.json
@@ -1,6 +1,7 @@
{
  "cSpell.words": [
    "ajoelp",
+    "antfu",
    "apexchart",
    "arcticons",
    "artifactstring",
@@ -31,6 +32,7 @@
    "iconoir",
    "Indicies",
    "lastupdate",
+    "linebreak",
    "Logsource",
    "majesticons",
    "mimecast",

7 changes: 5 additions & 2 deletions README.md
@@ -5,10 +5,12 @@
SOCFortress CoPilot

[![Medium](https://img.shields.io/badge/Medium-12100E?style=for-the-badge&logo=medium&logoColor=white)](https://socfortress.medium.com/)
-[![YouTube](https://img.shields.io/badge/YouTube-%23FF0000.svg?style=for-the-badge&logo=YouTube&logoColor=white)](https://www.youtube.com/@taylorwalton_socfortress/videos)
+[![YouTube Channel Subscribers](https://img.shields.io/youtube/channel/subscribers/UC4EUQtTxeC8wGrKRafI6pZg)](https://www.youtube.com/@taylorwalton_socfortress/videos)
[![Discord Shield](https://discordapp.com/api/guilds/871419379999469568/widget.png?style=shield)](https://discord.gg/UN3pNBzaEQ)
[![GitHub Sponsors](https://img.shields.io/badge/sponsor-30363D?style=for-the-badge&logo=GitHub-Sponsors&logoColor=#EA4AAA)](https://github.com/sponsors/taylorwalton)

+[![Email Us](https://img.shields.io/badge/📧%20Email%20Us-We're%20Here%20to%20Help!-orange?style=for-the-badge)](https://www.socfortress.co/contact_form.html)
+
</h1><h4 align="center">

[SOCFortress CoPilot](https://www.socfortress.co) focuses on providing a single pane of glass for all your security operations needs. Simplify your open source security stack with a single platform focused on making open source security tools easier to use and more accessible.
@@ -83,14 +85,15 @@ systemctl restart docker

```bash
# Clone the CoPilot repository
-wget https://raw.githubusercontent.com/socfortress/CoPilot/v0.1.1/docker-compose.yml
+wget https://raw.githubusercontent.com/socfortress/CoPilot/v0.1.2/docker-compose.yml

# Edit the docker-compose.yml file to set the server name and/or the services you want to use

# Create the path for storing your data
mkdir data

# Create the .env file based on the .env.example
nano .env

# Run Copilot
docker compose up -d
```
2 changes: 1 addition & 1 deletion backend/alembic/alembic.ini
@@ -60,7 +60,7 @@ version_path_separator = os # Use os.pathsep. Default configuration used for ne
# are written from script.py.mako
# output_encoding = utf-8

-sqlalchemy.url = mysql+pymysql://copilot:REPLACE_ME@copilot-mysql/copilot
+sqlalchemy.url = mysql+pymysql://copilot:H7U3AHsXWSGvE5L123B7$GQdLQz@10.255.254.2/copilot

[post_write_hooks]
1 change: 1 addition & 0 deletions backend/alembic/env.py
@@ -31,6 +31,7 @@
from app.incidents.models import AssetFieldName
from app.incidents.models import Case
from app.incidents.models import CaseAlertLink
+from app.incidents.models import CaseDataStore
from app.incidents.models import Comment
from app.incidents.models import CustomerCodeFieldName
from app.incidents.models import FieldName
47 changes: 47 additions & 0 deletions (new Alembic migration, revision 068f32ae5984)
@@ -0,0 +1,47 @@
"""Add incident management cases data store table
Revision ID: 068f32ae5984
Revises: e18dc3169b33
Create Date: 2024-09-13 15:15:02.050101
"""
from typing import Sequence
from typing import Union

import sqlalchemy as sa

from alembic import op

# revision identifiers, used by Alembic.
revision: str = "068f32ae5984"
down_revision: Union[str, None] = "e18dc3169b33"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
op.create_table(
"incident_management_case_datastore",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("case_id", sa.Integer(), nullable=False),
sa.Column("bucket_name", sa.String(length=255), nullable=False),
sa.Column("object_key", sa.String(length=1024), nullable=False),
sa.Column("file_name", sa.String(length=255), nullable=False),
sa.Column("content_type", sa.String(length=100), nullable=True),
sa.Column("file_size", sa.Integer(), nullable=True),
sa.Column("upload_time", sa.DateTime(), nullable=False),
sa.Column("file_hash", sa.String(length=128), nullable=False),
sa.ForeignKeyConstraint(
["case_id"],
["incident_management_case.id"],
),
sa.PrimaryKeyConstraint("id"),
)
# ### end Alembic commands ###


def downgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
op.drop_table("incident_management_case_datastore")
# ### end Alembic commands ###
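The `CaseDataStore` model imported in `backend/alembic/env.py` above presumably mirrors these columns. Below is a hedged sketch reconstructed from this migration, assuming a plain SQLAlchemy declarative mapping; the commit's actual model definition is not part of this capture:

```python
from datetime import datetime

from sqlalchemy import Column, DateTime, ForeignKey, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class CaseDataStore(Base):
    # Columns reconstructed from the migration above; the real model may differ
    __tablename__ = "incident_management_case_datastore"

    id = Column(Integer, primary_key=True)
    case_id = Column(Integer, ForeignKey("incident_management_case.id"), nullable=False)
    bucket_name = Column(String(255), nullable=False)
    object_key = Column(String(1024), nullable=False)
    file_name = Column(String(255), nullable=False)
    content_type = Column(String(100), nullable=True)
    file_size = Column(Integer, nullable=True)
    upload_time = Column(DateTime, nullable=False, default=datetime.utcnow)
    file_hash = Column(String(128), nullable=False)
```
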
31 changes: 31 additions & 0 deletions (new Alembic migration, revision ba64d98dbdc5)
@@ -0,0 +1,31 @@
"""Add customer code column to cases table
Revision ID: ba64d98dbdc5
Revises: 068f32ae5984
Create Date: 2024-09-18 08:25:27.546888
"""
from typing import Sequence
from typing import Union

import sqlalchemy as sa

from alembic import op

# revision identifiers, used by Alembic.
revision: str = "ba64d98dbdc5"
down_revision: Union[str, None] = "068f32ae5984"
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
op.add_column("incident_management_case", sa.Column("customer_code", sa.String(length=50), nullable=True))
# ### end Alembic commands ###


def downgrade() -> None:
# ### commands auto generated by Alembic - please adjust! ###
op.drop_column("incident_management_case", "customer_code")
# ### end Alembic commands ###
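Both migrations can be applied programmatically with Alembic's command API; a minimal sketch (the ini path assumes the repo's backend layout):

```python
from alembic import command
from alembic.config import Config

# Apply pending migrations (068f32ae5984, then ba64d98dbdc5) up to head
cfg = Config("backend/alembic/alembic.ini")
command.upgrade(cfg, "head")
```
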
22 changes: 22 additions & 0 deletions backend/app/connectors/utils.py
@@ -36,3 +36,25 @@ async def get_connector_info_from_db(
    else:
        logger.warning("No connector found.")
        return None


async def is_connector_verified(connector_name: str, db: AsyncSession) -> bool:
    """
    Checks if a connector is verified.

    Args:
        connector_name (str): The name of the connector to check.
        db (AsyncSession): The database session.

    Returns:
        bool: True if the connector is verified, otherwise False.
    """
    logger.info(f"Checking if connector {connector_name} is verified")
    query = select(Connectors).where(Connectors.connector_name == connector_name)
    result = await db.execute(query)
    connector = result.scalars().first()
    if connector:
        return connector.connector_verified
    else:
        logger.warning("No connector found.")
        return False
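A short sketch of how a route might gate on this helper; the route path and the `get_db` session dependency are assumptions for illustration, not part of this diff:

```python
from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy.ext.asyncio import AsyncSession

from app.connectors.utils import is_connector_verified
from app.db.db_session import get_db  # assumed session dependency

router = APIRouter()


@router.get("/connectors/{connector_name}/verified")
async def connector_verified(connector_name: str, db: AsyncSession = Depends(get_db)) -> dict:
    # Refuse to proceed when the connector has not been verified yet
    if not await is_connector_verified(connector_name, db):
        raise HTTPException(status_code=400, detail=f"Connector {connector_name} is not verified")
    return {"connector_name": connector_name, "verified": True}
```
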
29 changes: 27 additions & 2 deletions backend/app/connectors/wazuh_indexer/utils/universal.py
@@ -507,18 +507,43 @@ async def return_graylog_events_index_names():
    return list(indices.keys())


+# async def get_index_source(index_name: str):
+#     """
+#     Get the 10 latest results from the index and search for where the source contains a field name of `syslog_type` or `integration`
+#     """
+#     es_client = await create_wazuh_indexer_client("Wazuh-Indexer")
+#     query = {"size": 10, "query": {"bool": {"must": [{"exists": {"field": "syslog_type"}}]}}}
+#     response = es_client.search(index=index_name, body=query)
+#     for hit in response["hits"]["hits"]:  # Loop through each hit in the response
+#         if "syslog_type" in hit["_source"]:  # Check if 'syslog_type' exists in the source of the hit
+#             if hit["_source"]["syslog_type"] == "integration" and "integration" in hit["_source"]:
+#                 return hit["_source"]["integration"]  # Return the value of 'integration' if 'syslog_type' equals 'integration'
+#             return hit["_source"]["syslog_type"]  # Return the value of 'syslog_type' for other cases
+#     raise HTTPException(status_code=404, detail=f"Source not found in index {index_name}")


async def get_index_source(index_name: str):
    """
    Get the 10 latest results from the index and search for where the source contains a field name of `syslog_type` or `integration`
    """
    es_client = await create_wazuh_indexer_client("Wazuh-Indexer")
-    query = {"size": 10, "query": {"bool": {"must": [{"exists": {"field": "syslog_type"}}]}}}
-    response = es_client.search(index=index_name, body=query)
+
+    # First search for 'syslog_type'
+    query_syslog_type = {"size": 10, "query": {"bool": {"must": [{"exists": {"field": "syslog_type"}}]}}}
+    response = es_client.search(index=index_name, body=query_syslog_type)
    for hit in response["hits"]["hits"]:  # Loop through each hit in the response
        if "syslog_type" in hit["_source"]:  # Check if 'syslog_type' exists in the source of the hit
            if hit["_source"]["syslog_type"] == "integration" and "integration" in hit["_source"]:
                return hit["_source"]["integration"]  # Return the value of 'integration' if 'syslog_type' equals 'integration'
            return hit["_source"]["syslog_type"]  # Return the value of 'syslog_type' for other cases
+
+    # If no 'syslog_type' found, search for 'integration'
+    query_integration = {"size": 10, "query": {"bool": {"must": [{"exists": {"field": "integration"}}]}}}
+    response = es_client.search(index=index_name, body=query_integration)
+    for hit in response["hits"]["hits"]:  # Loop through each hit in the response
+        if "integration" in hit["_source"]:  # Check if 'integration' exists in the source of the hit
+            return hit["_source"]["integration"]  # Return the value of 'integration'

    raise HTTPException(status_code=404, detail=f"Source not found in index {index_name}")
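A quick sketch of calling the updated helper; the index name here is illustrative, not taken from this diff:

```python
import asyncio

from fastapi import HTTPException

from app.connectors.wazuh_indexer.utils.universal import get_index_source


async def main() -> None:
    try:
        # Resolve the log source for a sample index; falls back from 'syslog_type' to 'integration'
        source = await get_index_source("wazuh-alerts-sample")
        print(f"Index source: {source}")
    except HTTPException as exc:
        print(f"No source field found: {exc.detail}")


asyncio.run(main())
```
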
95 changes: 95 additions & 0 deletions backend/app/data_store/data_store_operations.py
@@ -0,0 +1,95 @@
import os

import aiofiles
import aiohttp
from fastapi import HTTPException
from fastapi import UploadFile
from loguru import logger

from app.data_store.data_store_schema import CaseDataStoreCreation
from app.data_store.data_store_session import create_session


async def create_bucket_if_not_exists(bucket_name: str) -> None:
    client = await create_session()
    if not await client.bucket_exists(bucket_name):
        await client.make_bucket(bucket_name)
        logger.info(f"Created bucket {bucket_name}")
    else:
        logger.info(f"Bucket {bucket_name} already exists")


async def upload_case_data_store(data: CaseDataStoreCreation, file: UploadFile) -> None:
    client = await create_session()
    logger.info(f"Uploading file {file.filename} to bucket {data.bucket_name}")

    # Define the temporary file path
    temp_file_path = os.path.join(os.getcwd(), file.filename)

    # Save the file to the temporary location
    async with aiofiles.open(temp_file_path, "wb") as out_file:
        content = await file.read()
        await out_file.write(content)

    # Upload the file to Minio
    await client.fput_object(
        bucket_name=data.bucket_name,
        object_name=f"{data.case_id}/{file.filename}",
        file_path=temp_file_path,
        content_type=data.content_type,
    )

    # Optionally, remove the temporary file after upload
    os.remove(temp_file_path)


async def download_case_data_store(bucket_name: str, object_name: str) -> bytes:
    client = await create_session()
    logger.info(f"Downloading file {object_name} from bucket {bucket_name}")
    try:
        # Check if the file exists
        await client.stat_object(bucket_name, object_name)

        # If no exception is raised, the file exists, proceed to download
        async with aiohttp.ClientSession() as session:
            response = await client.get_object(bucket_name, object_name, session)
            if response is None:
                raise Exception("Received None response from get_object")
            if not isinstance(response, aiohttp.ClientResponse):
                raise Exception("Response is not an instance of aiohttp.ClientResponse")
            data = await response.read()  # Ensure to read the data
            response.close()  # Close the response to release resources
            logger.info(f"Downloaded file {object_name} from bucket {bucket_name} and returning data")
            return data
    except Exception as e:
        # If an exception is raised, the file does not exist
        logger.info(f"Error: {e}")
        # List all objects in the bucket
        objects = client.list_objects(bucket_name, recursive=True)
        objects_list = [obj.object_name async for obj in objects]
        logger.info(f"Objects in bucket {bucket_name}: {objects_list}")
        raise HTTPException(status_code=404, detail=f"File {object_name} not found in bucket {bucket_name}")


async def list_case_data_store_files(bucket_name: str, case_id: int) -> list:
    client = await create_session()
    objects = await client.list_objects(bucket_name, prefix=f"{case_id}/")
    return objects


async def create_buckets() -> None:
    await create_bucket_if_not_exists("copilot-cases")


async def delete_file(bucket_name: str, object_name: str) -> None:
    client = await create_session()
    try:
        # Check if the file exists
        await client.stat_object(bucket_name, object_name)
        # If no exception is raised, the file exists, proceed to delete
        await client.remove_object(bucket_name, object_name)
        logger.info(f"Deleted file {object_name} from bucket {bucket_name}")
    except Exception as e:
        # If an exception is raised, the file does not exist
        logger.info(f"Error: {e}")
        raise HTTPException(status_code=404, detail=f"File {object_name} not found in bucket {bucket_name}")
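For illustration, a minimal sketch of an upload route driving these helpers; the route path, hash choice, and metadata wiring are assumptions, not necessarily the endpoint added by this commit:

```python
import hashlib
from datetime import datetime

from fastapi import APIRouter, UploadFile

from app.data_store.data_store_operations import upload_case_data_store
from app.data_store.data_store_schema import CaseDataStoreCreation

router = APIRouter()


@router.post("/case/{case_id}/data-store/upload")
async def upload_file_to_case(case_id: int, file: UploadFile) -> dict:
    content = await file.read()
    await file.seek(0)  # rewind so upload_case_data_store can read the file again
    metadata = CaseDataStoreCreation(
        case_id=case_id,
        bucket_name="copilot-cases",
        object_key=f"{case_id}/{file.filename}",
        file_name=file.filename,
        content_type=file.content_type,
        file_size=len(content),
        upload_time=datetime.utcnow(),
        file_hash=hashlib.sha256(content).hexdigest(),
    )
    await upload_case_data_store(metadata, file)
    return {"message": f"Uploaded {file.filename} to case {case_id}"}
```
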
16 changes: 16 additions & 0 deletions backend/app/data_store/data_store_schema.py
@@ -0,0 +1,16 @@
from datetime import datetime
from typing import Optional

from pydantic import BaseModel
from pydantic import Field


class CaseDataStoreCreation(BaseModel):
    case_id: int
    bucket_name: str = Field(max_length=255)
    object_key: str = Field(max_length=1024)
    file_name: str = Field(max_length=255)
    content_type: Optional[str] = Field(max_length=100, default=None)
    file_size: Optional[int] = None
    upload_time: datetime = Field(default_factory=datetime.utcnow)
    file_hash: str = Field(max_length=128)
23 changes: 23 additions & 0 deletions backend/app/data_store/data_store_session.py
@@ -0,0 +1,23 @@
from pathlib import Path

from environs import Env
from loguru import logger
from miniopy_async import Minio

env = Env()
env.read_env(Path(__file__).parent.parent / ".env")
# env.read_env(Path(__file__).parent.parent.parent / "docker-env" / ".env")
logger.info(f"Loading environment from {Path(__file__).parent.parent / '.env'}")


minio_root_user = env.str("MINIO_ROOT_USER", default="admin")
minio_root_password = env.str("MINIO_ROOT_PASSWORD", default="password")
minio_url = env.str("MINIO_URL", default="copilot-minio")
minio_secure = env.bool("MINIO_SECURE", default=False)

logger.info(f"Minio Root User: {minio_root_user} and password: {minio_root_password}")


async def create_session() -> Minio:
    client = Minio(f"{minio_url}:9000", access_key=minio_root_user, secret_key=minio_root_password, secure=minio_secure)
    return client
16 changes: 16 additions & 0 deletions backend/app/data_store/data_store_setup.py
@@ -0,0 +1,16 @@
from loguru import logger

from app.data_store.data_store_session import create_session


async def create_bucket_if_not_exists(bucket_name: str) -> None:
    client = await create_session()
    if not await client.bucket_exists(bucket_name):
        await client.make_bucket(bucket_name)
        logger.info(f"Created bucket {bucket_name}")
    else:
        logger.info(f"Bucket {bucket_name} already exists")


async def create_buckets() -> None:
    await create_bucket_if_not_exists("copilot-cases")
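The commit message above notes that `copilot.py` now runs bucket creation during initialization; `copilot.py` itself is not shown in this capture, so the following is only a hedged sketch of what such a startup hook could look like (the FastAPI app object and lifespan wiring are assumptions):

```python
from contextlib import asynccontextmanager

from fastapi import FastAPI

from app.data_store.data_store_setup import create_buckets


@asynccontextmanager
async def lifespan(app: FastAPI):
    # Ensure the MinIO buckets (e.g. "copilot-cases") exist before serving requests
    await create_buckets()
    yield


app = FastAPI(lifespan=lifespan)
```
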