deploy: 8aba343
fjxmlzn committed Jan 11, 2025
1 parent 9ecc03c commit c8e5ba3
Showing 8 changed files with 48 additions and 25 deletions.
14 changes: 12 additions & 2 deletions _modules/pe/llm/azure_openai.html
@@ -1028,6 +1028,7 @@ <h1>Source code for pe.llm.azure_openai</h1><div class="highlight"><pre>
<span class="kn">from</span><span class="w"> </span><span class="nn">openai</span><span class="w"> </span><span class="kn">import</span> <span class="n">PermissionDeniedError</span>
<span class="kn">from</span><span class="w"> </span><span class="nn">azure.identity</span><span class="w"> </span><span class="kn">import</span> <span class="n">AzureCliCredential</span><span class="p">,</span> <span class="n">get_bearer_token_provider</span>
<span class="kn">import</span><span class="w"> </span><span class="nn">os</span>
<span class="kn">from</span><span class="w"> </span><span class="nn">tqdm</span><span class="w"> </span><span class="kn">import</span> <span class="n">tqdm</span>
<span class="kn">from</span><span class="w"> </span><span class="nn">tenacity</span><span class="w"> </span><span class="kn">import</span> <span class="n">retry</span>
<span class="kn">from</span><span class="w"> </span><span class="nn">tenacity</span><span class="w"> </span><span class="kn">import</span> <span class="n">retry_if_not_exception_type</span>
<span class="kn">from</span><span class="w"> </span><span class="nn">tenacity</span><span class="w"> </span><span class="kn">import</span> <span class="n">stop_after_attempt</span>
@@ -1053,9 +1054,11 @@ <h1>Source code for pe.llm.azure_openai</h1><div class="highlight"><pre>
<span class="sd"> * ``AZURE_OPENAI_API_ENDPOINT``: Azure OpenAI endpoint. You can get it from https://portal.azure.com/.</span>
<span class="sd"> * ``AZURE_OPENAI_API_VERSION``: Azure OpenAI API version. You can get it from https://portal.azure.com/.&quot;&quot;&quot;</span>

<div class="viewcode-block" id="AzureOpenAILLM.__init__"><a class="viewcode-back" href="../../../api/pe.llm.azure_openai.html#pe.llm.AzureOpenAILLM.__init__">[docs]</a> <span class="k">def</span><span class="w"> </span><span class="fm">__init__</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">dry_run</span><span class="o">=</span><span class="kc">False</span><span class="p">,</span> <span class="n">num_threads</span><span class="o">=</span><span class="mi">1</span><span class="p">,</span> <span class="o">**</span><span class="n">generation_args</span><span class="p">):</span>
<div class="viewcode-block" id="AzureOpenAILLM.__init__"><a class="viewcode-back" href="../../../api/pe.llm.azure_openai.html#pe.llm.AzureOpenAILLM.__init__">[docs]</a> <span class="k">def</span><span class="w"> </span><span class="fm">__init__</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">progress_bar</span><span class="o">=</span><span class="kc">True</span><span class="p">,</span> <span class="n">dry_run</span><span class="o">=</span><span class="kc">False</span><span class="p">,</span> <span class="n">num_threads</span><span class="o">=</span><span class="mi">1</span><span class="p">,</span> <span class="o">**</span><span class="n">generation_args</span><span class="p">):</span>
<span class="w"> </span><span class="sd">&quot;&quot;&quot;Constructor.</span>

<span class="sd"> :param progress_bar: Whether to show the progress bar, defaults to True</span>
<span class="sd"> :type progress_bar: bool, optional</span>
<span class="sd"> :param dry_run: Whether to enable dry run. When dry run is enabled, the responses are fake and the APIs are</span>
<span class="sd"> not called. Defaults to False</span>
<span class="sd"> :type dry_run: bool, optional</span>
@@ -1064,6 +1067,7 @@ <h1>Source code for pe.llm.azure_openai</h1><div class="highlight"><pre>
<span class="sd"> :param \\*\\*generation_args: The generation arguments that will be passed to the OpenAI API</span>
<span class="sd"> :type \\*\\*generation_args: str</span>
<span class="sd"> &quot;&quot;&quot;</span>
<span class="bp">self</span><span class="o">.</span><span class="n">_progress_bar</span> <span class="o">=</span> <span class="n">progress_bar</span>
<span class="bp">self</span><span class="o">.</span><span class="n">_dry_run</span> <span class="o">=</span> <span class="n">dry_run</span>
<span class="bp">self</span><span class="o">.</span><span class="n">_num_threads</span> <span class="o">=</span> <span class="n">num_threads</span>
<span class="bp">self</span><span class="o">.</span><span class="n">_generation_args</span> <span class="o">=</span> <span class="n">generation_args</span>
@@ -1130,7 +1134,13 @@ <h1>Source code for pe.llm.azure_openai</h1><div class="highlight"><pre>
<span class="k">for</span> <span class="n">request</span> <span class="ow">in</span> <span class="n">requests</span>
<span class="p">]</span>
<span class="k">with</span> <span class="n">ThreadPoolExecutor</span><span class="p">(</span><span class="n">max_workers</span><span class="o">=</span><span class="bp">self</span><span class="o">.</span><span class="n">_num_threads</span><span class="p">)</span> <span class="k">as</span> <span class="n">executor</span><span class="p">:</span>
<span class="n">responses</span> <span class="o">=</span> <span class="nb">list</span><span class="p">(</span><span class="n">executor</span><span class="o">.</span><span class="n">map</span><span class="p">(</span><span class="bp">self</span><span class="o">.</span><span class="n">_get_response_for_one_request</span><span class="p">,</span> <span class="n">messages_list</span><span class="p">,</span> <span class="n">generation_args_list</span><span class="p">))</span>
<span class="n">responses</span> <span class="o">=</span> <span class="nb">list</span><span class="p">(</span>
<span class="n">tqdm</span><span class="p">(</span>
<span class="n">executor</span><span class="o">.</span><span class="n">map</span><span class="p">(</span><span class="bp">self</span><span class="o">.</span><span class="n">_get_response_for_one_request</span><span class="p">,</span> <span class="n">messages_list</span><span class="p">,</span> <span class="n">generation_args_list</span><span class="p">),</span>
<span class="n">total</span><span class="o">=</span><span class="nb">len</span><span class="p">(</span><span class="n">messages_list</span><span class="p">),</span>
<span class="n">disable</span><span class="o">=</span><span class="ow">not</span> <span class="bp">self</span><span class="o">.</span><span class="n">_progress_bar</span><span class="p">,</span>
<span class="p">)</span>
<span class="p">)</span>
<span class="k">return</span> <span class="n">responses</span></div>
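The hunk above wraps the lazy iterator returned by `executor.map` in `tqdm` so that the thread pool reports per-request progress. A standalone sketch of the same pattern follows; `get_response` and `get_responses` are hypothetical stand-ins for `_get_response_for_one_request` and the surrounding method, and the `ImportError` fallback is an addition so the sketch runs even without tqdm installed.

```python
from concurrent.futures import ThreadPoolExecutor

# tqdm is the progress-bar library added in this commit; fall back to a
# no-op wrapper so the sketch still runs if it is not installed.
try:
    from tqdm import tqdm
except ImportError:
    def tqdm(iterable, **kwargs):
        return iterable


# Hypothetical stand-in for _get_response_for_one_request.
def get_response(messages, generation_args):
    return f"echo: {messages}"


def get_responses(messages_list, generation_args_list, num_threads=1, progress_bar=True):
    with ThreadPoolExecutor(max_workers=num_threads) as executor:
        # total= is required because the map() iterator has no len(),
        # and disable= hides the bar when progress_bar is False.
        responses = list(
            tqdm(
                executor.map(get_response, messages_list, generation_args_list),
                total=len(messages_list),
                disable=not progress_bar,
            )
        )
    return responses
```

Wrapping the iterator (rather than the list) means the bar advances as each worker finishes, instead of jumping to 100% after `list()` has already consumed everything.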

<div class="viewcode-block" id="AzureOpenAILLM._get_response_for_one_request"><a class="viewcode-back" href="../../../api/pe.llm.azure_openai.html#pe.llm.AzureOpenAILLM._get_response_for_one_request">[docs]</a> <span class="nd">@retry</span><span class="p">(</span>
14 changes: 12 additions & 2 deletions _modules/pe/llm/openai.html
@@ -1027,6 +1027,7 @@ <h1>Source code for pe.llm.openai</h1><div class="highlight"><pre>
<span class="kn">from</span><span class="w"> </span><span class="nn">openai</span><span class="w"> </span><span class="kn">import</span> <span class="n">NotFoundError</span>
<span class="kn">from</span><span class="w"> </span><span class="nn">openai</span><span class="w"> </span><span class="kn">import</span> <span class="n">PermissionDeniedError</span>
<span class="kn">import</span><span class="w"> </span><span class="nn">os</span>
<span class="kn">from</span><span class="w"> </span><span class="nn">tqdm</span><span class="w"> </span><span class="kn">import</span> <span class="n">tqdm</span>
<span class="kn">from</span><span class="w"> </span><span class="nn">tenacity</span><span class="w"> </span><span class="kn">import</span> <span class="n">retry</span>
<span class="kn">from</span><span class="w"> </span><span class="nn">tenacity</span><span class="w"> </span><span class="kn">import</span> <span class="n">retry_if_not_exception_type</span>
<span class="kn">from</span><span class="w"> </span><span class="nn">tenacity</span><span class="w"> </span><span class="kn">import</span> <span class="n">stop_after_attempt</span>
@@ -1047,9 +1048,11 @@ <h1>Source code for pe.llm.openai</h1><div class="highlight"><pre>
<span class="sd"> * ``OPENAI_API_KEY``: OpenAI API key. You can get it from https://platform.openai.com/account/api-keys. Multiple</span>
<span class="sd"> keys can be separated by commas, and a key will be selected randomly for each request.&quot;&quot;&quot;</span>

<div class="viewcode-block" id="OpenAILLM.__init__"><a class="viewcode-back" href="../../../api/pe.llm.openai.html#pe.llm.OpenAILLM.__init__">[docs]</a> <span class="k">def</span><span class="w"> </span><span class="fm">__init__</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">dry_run</span><span class="o">=</span><span class="kc">False</span><span class="p">,</span> <span class="n">num_threads</span><span class="o">=</span><span class="mi">1</span><span class="p">,</span> <span class="o">**</span><span class="n">generation_args</span><span class="p">):</span>
<div class="viewcode-block" id="OpenAILLM.__init__"><a class="viewcode-back" href="../../../api/pe.llm.openai.html#pe.llm.OpenAILLM.__init__">[docs]</a> <span class="k">def</span><span class="w"> </span><span class="fm">__init__</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">progress_bar</span><span class="o">=</span><span class="kc">True</span><span class="p">,</span> <span class="n">dry_run</span><span class="o">=</span><span class="kc">False</span><span class="p">,</span> <span class="n">num_threads</span><span class="o">=</span><span class="mi">1</span><span class="p">,</span> <span class="o">**</span><span class="n">generation_args</span><span class="p">):</span>
<span class="w"> </span><span class="sd">&quot;&quot;&quot;Constructor.</span>

<span class="sd"> :param progress_bar: Whether to show the progress bar, defaults to True</span>
<span class="sd"> :type progress_bar: bool, optional</span>
<span class="sd"> :param dry_run: Whether to enable dry run. When dry run is enabled, the responses are fake and the APIs are</span>
<span class="sd"> not called. Defaults to False</span>
<span class="sd"> :type dry_run: bool, optional</span>
@@ -1058,6 +1061,7 @@ <h1>Source code for pe.llm.openai</h1><div class="highlight"><pre>
<span class="sd"> :param \\*\\*generation_args: The generation arguments that will be passed to the OpenAI API</span>
<span class="sd"> :type \\*\\*generation_args: str</span>
<span class="sd"> &quot;&quot;&quot;</span>
<span class="bp">self</span><span class="o">.</span><span class="n">_progress_bar</span> <span class="o">=</span> <span class="n">progress_bar</span>
<span class="bp">self</span><span class="o">.</span><span class="n">_dry_run</span> <span class="o">=</span> <span class="n">dry_run</span>
<span class="bp">self</span><span class="o">.</span><span class="n">_num_threads</span> <span class="o">=</span> <span class="n">num_threads</span>
<span class="bp">self</span><span class="o">.</span><span class="n">_generation_args</span> <span class="o">=</span> <span class="n">generation_args</span>
@@ -1097,7 +1101,13 @@ <h1>Source code for pe.llm.openai</h1><div class="highlight"><pre>
<span class="k">for</span> <span class="n">request</span> <span class="ow">in</span> <span class="n">requests</span>
<span class="p">]</span>
<span class="k">with</span> <span class="n">ThreadPoolExecutor</span><span class="p">(</span><span class="n">max_workers</span><span class="o">=</span><span class="bp">self</span><span class="o">.</span><span class="n">_num_threads</span><span class="p">)</span> <span class="k">as</span> <span class="n">executor</span><span class="p">:</span>
<span class="n">responses</span> <span class="o">=</span> <span class="nb">list</span><span class="p">(</span><span class="n">executor</span><span class="o">.</span><span class="n">map</span><span class="p">(</span><span class="bp">self</span><span class="o">.</span><span class="n">_get_response_for_one_request</span><span class="p">,</span> <span class="n">messages_list</span><span class="p">,</span> <span class="n">generation_args_list</span><span class="p">))</span>
<span class="n">responses</span> <span class="o">=</span> <span class="nb">list</span><span class="p">(</span>
<span class="n">tqdm</span><span class="p">(</span>
<span class="n">executor</span><span class="o">.</span><span class="n">map</span><span class="p">(</span><span class="bp">self</span><span class="o">.</span><span class="n">_get_response_for_one_request</span><span class="p">,</span> <span class="n">messages_list</span><span class="p">,</span> <span class="n">generation_args_list</span><span class="p">),</span>
<span class="n">total</span><span class="o">=</span><span class="nb">len</span><span class="p">(</span><span class="n">messages_list</span><span class="p">),</span>
<span class="n">disable</span><span class="o">=</span><span class="ow">not</span> <span class="bp">self</span><span class="o">.</span><span class="n">_progress_bar</span><span class="p">,</span>
<span class="p">)</span>
<span class="p">)</span>
<span class="k">return</span> <span class="n">responses</span></div>
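Both constructors gain the same `progress_bar` flag, stored next to the existing options. A minimal sketch of the argument handling follows; `MinimalLLM` is a hypothetical stand-in for `OpenAILLM`/`AzureOpenAILLM` that omits the API-client setup the real classes perform.

```python
# Minimal sketch of the constructor pattern added in this commit: the new
# progress_bar flag is stored alongside the existing options, and all
# remaining keyword arguments are collected as generation arguments.
class MinimalLLM:
    def __init__(self, progress_bar=True, dry_run=False, num_threads=1, **generation_args):
        self._progress_bar = progress_bar
        self._dry_run = dry_run
        self._num_threads = num_threads
        self._generation_args = generation_args
```

Note that because `progress_bar` is inserted before `dry_run` in the signature, any caller that passed `dry_run` positionally would now be setting `progress_bar` instead; passing these options by keyword avoids the ambiguity.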

<div class="viewcode-block" id="OpenAILLM._get_response_for_one_request"><a class="viewcode-back" href="../../../api/pe.llm.openai.html#pe.llm.OpenAILLM._get_response_for_one_request">[docs]</a> <span class="nd">@retry</span><span class="p">(</span>
17 changes: 8 additions & 9 deletions _modules/pe/logging.html
@@ -1025,39 +1025,38 @@ <h1>Source code for pe.logging</h1><div class="highlight"><pre>
<span class="kn">import</span><span class="w"> </span><span class="nn">os</span>

<span class="c1">#: The logger that will be used to log the execution information</span>
<span class="n">execution_logger</span> <span class="o">=</span> <span class="n">logging</span><span class="o">.</span><span class="n">getLogger</span><span class="p">()</span>
<span class="n">execution_logger</span> <span class="o">=</span> <span class="n">logging</span><span class="o">.</span><span class="n">getLogger</span><span class="p">(</span><span class="s2">&quot;pe&quot;</span><span class="p">)</span>


<div class="viewcode-block" id="setup_logging"><a class="viewcode-back" href="../../api/pe.logging.html#pe.logging.setup_logging">[docs]</a><span class="k">def</span><span class="w"> </span><span class="nf">setup_logging</span><span class="p">(</span>
<span class="n">log_file</span><span class="o">=</span><span class="kc">None</span><span class="p">,</span>
<span class="n">log_screen</span><span class="o">=</span><span class="kc">True</span><span class="p">,</span>
<span class="n">datefmt</span><span class="o">=</span><span class="s2">&quot;%m/</span><span class="si">%d</span><span class="s2">/%Y %H:%M:%S %p&quot;</span><span class="p">,</span>
<span class="n">fmt</span><span class="o">=</span><span class="s2">&quot;</span><span class="si">%(asctime)s</span><span class="s2"> [</span><span class="si">%(name)s</span><span class="s2">] [</span><span class="si">%(levelname)-5.5s</span><span class="s2">] </span><span class="si">%(message)s</span><span class="s2">&quot;</span><span class="p">,</span>
<span class="n">level</span><span class="o">=</span><span class="n">logging</span><span class="o">.</span><span class="n">INFO</span><span class="p">,</span>
<span class="n">name</span><span class="o">=</span><span class="s2">&quot;logger&quot;</span><span class="p">,</span>
<span class="p">):</span>
<span class="w"> </span><span class="sd">&quot;&quot;&quot;Setup the logging configuration.</span>

<span class="sd"> :param log_file: The log file path, defaults to None</span>
<span class="sd"> :type log_file: str, optional</span>
<span class="sd"> :param log_screen: Whether to log to the screen, defaults to True</span>
<span class="sd"> :type log_screen: bool, optional</span>
<span class="sd"> :param datefmt: The date format, defaults to &quot;%m/%d/%Y %H:%M:%S %p&quot;</span>
<span class="sd"> :type datefmt: str, optional</span>
<span class="sd"> :param fmt: The log format, defaults to &quot;%(asctime)s [%(name)s] [%(levelname)-5.5s] %(message)s&quot;</span>
<span class="sd"> :type fmt: str, optional</span>
<span class="sd"> :param level: The log level, defaults to logging.INFO</span>
<span class="sd"> :type level: int, optional</span>
<span class="sd"> :param name: The logger name, defaults to &quot;logger&quot;</span>
<span class="sd"> :type name: str, optional</span>
<span class="sd"> &quot;&quot;&quot;</span>
<span class="n">execution_logger</span><span class="o">.</span><span class="n">name</span> <span class="o">=</span> <span class="n">name</span>

<span class="n">execution_logger</span><span class="o">.</span><span class="n">handlers</span><span class="o">.</span><span class="n">clear</span><span class="p">()</span>
<span class="n">execution_logger</span><span class="o">.</span><span class="n">setLevel</span><span class="p">(</span><span class="n">level</span><span class="p">)</span>

<span class="n">log_formatter</span> <span class="o">=</span> <span class="n">logging</span><span class="o">.</span><span class="n">Formatter</span><span class="p">(</span><span class="n">fmt</span><span class="o">=</span><span class="n">fmt</span><span class="p">,</span> <span class="n">datefmt</span><span class="o">=</span><span class="n">datefmt</span><span class="p">)</span>

<span class="n">console_handler</span> <span class="o">=</span> <span class="n">logging</span><span class="o">.</span><span class="n">StreamHandler</span><span class="p">()</span>
<span class="n">console_handler</span><span class="o">.</span><span class="n">setFormatter</span><span class="p">(</span><span class="n">log_formatter</span><span class="p">)</span>
<span class="n">execution_logger</span><span class="o">.</span><span class="n">addHandler</span><span class="p">(</span><span class="n">console_handler</span><span class="p">)</span>
<span class="k">if</span> <span class="n">log_screen</span><span class="p">:</span>
<span class="n">console_handler</span> <span class="o">=</span> <span class="n">logging</span><span class="o">.</span><span class="n">StreamHandler</span><span class="p">()</span>
<span class="n">console_handler</span><span class="o">.</span><span class="n">setFormatter</span><span class="p">(</span><span class="n">log_formatter</span><span class="p">)</span>
<span class="n">execution_logger</span><span class="o">.</span><span class="n">addHandler</span><span class="p">(</span><span class="n">console_handler</span><span class="p">)</span>

<span class="k">if</span> <span class="n">log_file</span> <span class="ow">is</span> <span class="ow">not</span> <span class="kc">None</span><span class="p">:</span>
<span class="n">os</span><span class="o">.</span><span class="n">makedirs</span><span class="p">(</span><span class="n">os</span><span class="o">.</span><span class="n">path</span><span class="o">.</span><span class="n">dirname</span><span class="p">(</span><span class="n">log_file</span><span class="p">),</span> <span class="n">exist_ok</span><span class="o">=</span><span class="kc">True</span><span class="p">)</span>
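The logging hunk makes three changes: `execution_logger` becomes a named `"pe"` logger instead of the root logger, stale handlers are cleared on every `setup_logging` call, and the console handler is only attached when `log_screen` is true. The sketch below reproduces those semantics; the diff is truncated after the `makedirs` call, so the file-handler body is a reasonable reconstruction rather than the exact source.

```python
import logging
import os

# Named logger instead of the root logger, mirroring the change above:
# records from unrelated libraries no longer flow through this logger.
execution_logger = logging.getLogger("pe")


def setup_logging(
    log_file=None,
    log_screen=True,
    datefmt="%m/%d/%Y %H:%M:%S %p",
    fmt="%(asctime)s [%(name)s] [%(levelname)-5.5s] %(message)s",
    level=logging.INFO,
):
    # Clearing handlers makes repeated setup calls idempotent
    # instead of stacking duplicate handlers on each call.
    execution_logger.handlers.clear()
    execution_logger.setLevel(level)

    log_formatter = logging.Formatter(fmt=fmt, datefmt=datefmt)

    if log_screen:
        console_handler = logging.StreamHandler()
        console_handler.setFormatter(log_formatter)
        execution_logger.addHandler(console_handler)

    if log_file is not None:
        os.makedirs(os.path.dirname(log_file), exist_ok=True)
        # Reconstructed: the file-handler lines are cut off in the diff.
        file_handler = logging.FileHandler(log_file)
        file_handler.setFormatter(log_formatter)
        execution_logger.addHandler(file_handler)
```

Clearing handlers up front is what lets `setup_logging` be called more than once (e.g. per run) without duplicating every log line.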
