Minor cleanup changes #61

Merged: 5 commits, Oct 16, 2024

Changes from all commits
6 changes: 2 additions & 4 deletions .github/workflows/linters.yml
@@ -16,17 +16,15 @@ jobs:
     strategy:
       max-parallel: 4
       matrix:
-        python-version: [3.9]
+        python-version: [3.11]
     steps:
       - uses: actions/checkout@v2
         with:
           fetch-depth: 0
       - name: Set up Python ${{ matrix.python-version }}
-        uses: actions/setup-python@v1
+        uses: actions/setup-python@v5
         with:
           python-version: ${{ matrix.python-version }}
-      - name: Run a commitlint
-        uses: wagoid/commitlint-github-action@v2
       - name: Install dependencies
         run: |
           sudo apt-get update -y
7 changes: 7 additions & 0 deletions .gitleaks.toml
@@ -0,0 +1,7 @@
+[allowlist]
+description = "Global Allowlist"
+
+paths = [
+  # Ignore the python code generation dataset, it has fake secrets
+  '''datasets\/python_code_alpaca_subset.jsonl$''',
+]
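As an aside for reviewers: the allowlist entry above is a regular expression applied to file paths. A minimal way to sanity-check it, using Python's re module as a stand-in for the Go regex engine gitleaks actually uses (for this simple anchored pattern the behavior should be the same):

    import re

    # The path pattern from the allowlist above (the TOML ''' literal keeps the backslash as-is).
    pattern = r"datasets\/python_code_alpaca_subset.jsonl$"

    # Paths as gitleaks would see them, relative to the repository root.
    print(bool(re.search(pattern, "datasets/python_code_alpaca_subset.jsonl")))  # True: skipped by the scan
    print(bool(re.search(pattern, "datasets/openorca-subset-006.json")))         # False: still scanned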
18 changes: 10 additions & 8 deletions config.yaml
@@ -2,9 +2,9 @@ output:
   format: "json" # Maybe add option for pickle?
   dir: "./output/"
   file: "output.json"
-warmup: True
+warmup: False
 warmup_options:
-  requests: 11
+  requests: 10
   timeout_sec: 20
 storage: # TODO
   type: local
@@ -13,19 +13,21 @@ dataset:
   max_queries: 1000
   min_input_tokens: 0
   max_input_tokens: 1024
-  max_output_tokens: 256
-  max_sequence_tokens: 1024
+  min_output_tokens: 0
+  max_output_tokens: 1024
+  max_sequence_tokens: 2048
 load_options:
   type: constant #Future options: loadgen, stair-step
-  concurrency: 2
+  concurrency: 1
   duration: 20 # In seconds. Maybe in future support "100s" "10m", etc...
-plugin: "tgis_grpc_plugin"
+plugin: "openai_plugin"
 plugin_options:
-  #interface: "grpc" # Some plugins like caikit-nlp-client should support grpc/http
   use_tls: False # Use True if querying an SSL grpc endpoint over https
   streaming: True
   model_name: "flan-t5-small"
   host: "route.to.host"
-  port: 8033
+  endpoint: "/v1/completions" # for openai plugin
+  # interface: "grpc" # Some plugins like caikit-nlp-client should support grpc/http
+  # port: 8033 # Some plugins like the standalone TGIS plugin need port
 extra_metadata:
   replicas: 1
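For context on the plugin switch above: "/v1/completions" is the OpenAI-compatible text completions endpoint, so the openai_plugin presumably issues HTTP POST requests of roughly the following shape. This is a sketch under that assumption, not the plugin's actual code; the payload fields mirror the config values above, and the prompt is a placeholder:

    import requests

    # Rough shape of an OpenAI-compatible completions request (assumption: this is what
    # the openai_plugin sends; its real payload construction may differ).
    base_url = "http://route.to.host"      # "host" from config.yaml (placeholder value)
    endpoint = "/v1/completions"           # "endpoint" from config.yaml

    payload = {
        "model": "flan-t5-small",          # "model_name" from config.yaml
        "prompt": "Say hello.",            # placeholder; the tool draws prompts from its dataset
        "max_tokens": 1024,                # mirrors max_output_tokens
        "stream": False,                   # config.yaml enables streaming; kept off here for brevity
    }

    resp = requests.post(base_url + endpoint, json=payload, timeout=20)
    resp.raise_for_status()
    print(resp.json()["choices"][0]["text"])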
361 changes: 0 additions & 361 deletions datasets/openorca-subset-006.json

This file was deleted.
