Feature/make lambda #11

Merged 3 commits on Oct 22, 2024
19 changes: 9 additions & 10 deletions Makefile
@@ -81,21 +81,14 @@ format:
 	$(VENV_DIR)/bin/poetry run ruff format
 
 
-.PHONY: build-api-image
-build-api-image:
-	docker build -t $(DOCKER_IMAGE_NAME)-api -f deployment/docker/Dockerfile.api .
-
-
 .PHONY: build-batch-image
 build-batch-image:
 	docker build -t $(DOCKER_IMAGE_NAME)-batch -f deployment/docker/Dockerfile.batch .
 
 
-## Starts the API locally [Docker]
-.PHONY: api
-api: build-api-image
-	@echo "Launching the API locally..."
-	docker run -p $(API_PORT):$(API_PORT) $(DOCKER_IMAGE_NAME)-api:latest
+.PHONY: build-lambda-image
+build-lambda-image:
+	docker build -t $(DOCKER_IMAGE_NAME)-lambda -f deployment/docker/Dockerfile.lambda .
 
 
 ## Starts the batch locally [Docker]
@@ -104,6 +97,12 @@ batch: build-batch-image
 	@echo "Launching the batch locally..."
 	docker run $(DOCKER_IMAGE_NAME)-batch:latest
 
+## Starts the lambda locally [Docker]
+.PHONY: lambda
+lambda: build-lambda-image
+	@echo "Launching the lambda locally..."
+	docker run -p 9000:8080 $(DOCKER_IMAGE_NAME)-lambda:latest
+
 
 #################################################################################
 # Self Documenting Commands                                                     #
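Since the new `lambda` target maps host port 9000 to the container's port 8080 (the convention used by the AWS Lambda Runtime Interface Emulator in AWS Lambda base images), the container started by `make lambda` can be smoke-tested with a POST to the emulator's fixed invocation route. A minimal sketch, assuming the container is already running and that the JSON payload shape is whatever your handler expects:

```shell
# Invoke the locally running Lambda container via the Runtime Interface Emulator.
# The invocation path below is the RIE's standard route; the payload is a placeholder.
ENDPOINT="http://localhost:9000/2015-03-31/functions/function/invocations"
curl -s -X POST "$ENDPOINT" -d '{"ping": "local-test"}' || echo "lambda container is not running"
```

The response body is whatever the handler returns, so this is a quick way to confirm the image builds and the handler is wired up before pushing to the deploy pipeline.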
4 changes: 2 additions & 2 deletions README.md
@@ -233,7 +233,7 @@ make init             Prepares the whole repository with poetry and pre-commit
 make clean            Removes the entire virtual environment and unconfigures pre-commit
 make lint             Lint using ruff (use `make format` for formatting)
 make format           Formats the source code with ruff
-make api              Starts the API locally [Docker]
+make lambda           Starts the lambda locally [Docker]
 make batch            Starts the batch locally [Docker]
 ```

@@ -327,7 +327,7 @@ Deploys can only be made through the GitHub CI/CD pipelines. With th
 
 - Make sure that changes made to the inference code in `src/pipelines/predict.py` are reflected in the Dockerfiles, as instructed in `Inferência`.
 
-- If you are unsure whether the deploy will work, use the makefile commands that let you test locally: `make batch` and `make api`. They run a Docker container the same way it will be executed in the cloud.
+- If you are unsure whether the deploy will work, use the makefile commands that let you test locally: `make batch` and `make lambda`. They run a Docker container the same way it will be executed in the cloud.
 
 - For the deploy itself, a role must be created in AWS with OIDC. Here is the official AWS tutorial on how to create and configure it: https://aws.amazon.com/pt/blogs/security/use-iam-roles-to-connect-github-actions-to-actions-in-aws/ . After configuring, put the role's ARN in the yml file of the batch deploy and online deploy (lambda) pipelines.
