Pawan modular v2#13

Open
pawantr wants to merge 14 commits into main from pawan-modular-v2

Conversation


pawantr commented Mar 23, 2026

No description provided.

pawantr added 14 commits March 23, 2026 16:50
New apexa_agents.py replaces the monolithic agentic loop with:
- ArgoProvider: single class for all Argo Gateway HTTP calls
- APEXAAgent: lightweight agent definition (instructions + tool filter)
- AgentRunner: dual-mode tool calling (native API + text-based TOOL_CALL parsing)
- OrchestratorAgent: keyword-score routing to specialist agents
  (CalibrationAgent, AnalysisAgent, KnowledgeAgent)
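A minimal sketch of the keyword-score routing idea, assuming a simple word-overlap score; the keyword tables below are illustrative, not the actual tables in apexa_agents.py:

```python
# Illustrative keyword tables -- the real ones in apexa_agents.py may differ.
KEYWORDS = {
    "CalibrationAgent": {"calibrate", "calibrant", "detector", "distance"},
    "AnalysisAgent": {"indexing", "grains", "strain", "workflow"},
    "KnowledgeAgent": {"explain", "reference", "background"},
}

def route(query: str) -> str:
    """Pick the specialist agent whose keyword set best overlaps the query."""
    words = set(query.lower().split())
    scores = {agent: len(words & kw) for agent, kw in KEYWORDS.items()}
    best = max(scores, key=scores.get)
    # Fall back to the knowledge agent when nothing scores.
    return best if scores[best] > 0 else "KnowledgeAgent"

print(route("calibrate the detector distance"))  # → CalibrationAgent
```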

Text-based TOOL_CALL:/ARGUMENTS: parsing is the primary mode because
Argo Gateway returns plain-string responses with native tool_calls stripped.
Tools are listed in the system prompt with parameter schemas so the
model knows exactly what to call and how.
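The text-based mode could be parsed along these lines; this is a hedged sketch, and the exact TOOL_CALL:/ARGUMENTS: layout AgentRunner expects may differ:

```python
import json
import re

# Hypothetical parser for the text-based tool-call protocol described above;
# the precise format emitted by the model is an assumption here.
PATTERN = re.compile(r"TOOL_CALL:\s*(\w+)\s*ARGUMENTS:\s*(\{.*?\})", re.DOTALL)

def parse_tool_call(text: str):
    """Return (tool_name, args dict) if the reply contains a tool call, else None."""
    m = PATTERN.search(text)
    if not m:
        return None
    try:
        return m.group(1), json.loads(m.group(2))
    except json.JSONDecodeError:
        return None  # malformed arguments: treat as plain text

reply = 'Checking.\nTOOL_CALL: list_directory\nARGUMENTS: {"path": "/data"}'
print(parse_tool_call(reply))  # → ('list_directory', {'path': '/data'})
```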
Removes ~850 lines of monolithic code, now replaced by apexa_agents.py:
- Removed: call_argo_chat_api(), _prepare_argo_payload(),
  _convert_tools_to_claude_format(), get_all_available_tools(),
  process_diffraction_query(), _extract_peak_positions(),
  CALCULATION_KEYWORDS/MAP dicts, self.http_client,
  self.conversation_history
- Added: _tool_registry (O(1) dict lookup built at connection time),
  _available_tools list, OrchestratorAgent initialization
- New run_query() delegates to orchestrator.process()
- execute_tool_call() routing: O(n) waterfall → O(1) registry lookup
- Fixed stale CLI shortcuts: run/ls commands use correct tool names
  (run_ff_hedm_full_workflow, run_command, list_directory on core server)
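The O(n) waterfall → O(1) registry change can be sketched as follows; class and method names are illustrative, not the actual code:

```python
# Sketch of the tool-registry change: a dict built once at connection time
# replaces per-call iteration over every connected server's tool list.
class ToolRegistry:
    def __init__(self):
        self._registry = {}

    def register_server(self, server_name, tool_names):
        """Called once per server when the connection is established."""
        for name in tool_names:
            self._registry[name] = server_name

    def lookup(self, tool_name):
        """O(1) dict lookup instead of an O(n) waterfall over servers."""
        server = self._registry.get(tool_name)
        if server is None:
            raise KeyError(f"Unknown tool: {tool_name}")
        return server

reg = ToolRegistry()
reg.register_server("core", ["list_directory", "run_command"])
reg.register_server("midas", ["run_ff_hedm_full_workflow"])
print(reg.lookup("run_command"))  # → core
```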
servers.config now loads 2 servers instead of 3:
- core (beamline_core_server.py): unified file ops, command execution,
  X-ray calculations — replaces filesystem, executor, utilities servers
- midas (midas_comprehensive_server.py): all HEDM analysis tools

Legacy servers (filesystem, executor, utilities) commented out with
explanation. beamline_core_server.py provides 9 tools: list_directory,
read_file, write_file, get_file_info, run_command, check_environment,
xray_calculate, validate_beamline_parameters, list_common_calibrants.
gradio_ui.py:
- Added _read_servers_config() to parse servers.config
- Fixed _initialize_async() to call connect_to_multiple_servers()
- Changed process_diffraction_query() → run_query()

web_server.py:
- Fixed 3 stale call sites: process_diffraction_query → run_query
- Fixed send_message (non-existent method) → run_query
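A sketch of what _read_servers_config() might do; the actual servers.config format is not shown in this PR, so a simple "name = script" layout is assumed here:

```python
# Hedged sketch of _read_servers_config(); the real servers.config syntax
# is an assumption. Commented-out legacy servers are skipped.
def read_servers_config(text: str) -> dict:
    """Map server name -> server script, ignoring blanks and # comments."""
    servers = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):  # legacy servers stay commented out
            continue
        name, _, script = line.partition("=")
        servers[name.strip()] = script.strip()
    return servers

example = """
# filesystem = filesystem_server.py  (replaced by beamline_core_server.py)
core = beamline_core_server.py
midas = midas_comprehensive_server.py
"""
print(read_servers_config(example))
```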
Removed duplicate entries for fastapi, websockets, python-multipart.
Reorganized deps into categories: HTTP & API, MCP & AI, UI,
Scientific/diffraction, Numeric/plotting, Data formats,
Knowledge base/RAG, CLI. No new dependencies needed.
- Remove debug prints: [Argo], [AgentRunner], [Agent] per-query logs
- Keep only tool execution prints (→ tool_name) for useful feedback
- Consolidate server connection logging (one line per server)
- Simplify interactive session banner to single compact header
- Fix uv VIRTUAL_ENV warning: unset stale env var in launch script
- Remove duplicate tool_name print between AgentRunner and execute_tool_call
beamline_core_server.py:
- Silence numexpr INFO logging (was printing thread count banner)
- Remove ✓ success prints; keep only ⚠ warnings on missing deps
- Remove startup banner (===, server name, available commands count)

midas_comprehensive_server.py:
- Remove all ✓ success prints from Python/conda detection
- Remove ✓ MIDAS installation validation prints
- Remove AutoCalibrateZarr.py found print
- Remove MIDAS scientific/Python API available prints
- Remove full startup banner with tool listing (40 lines → 2)
- Keep only ⚠/❌ warnings for missing deps, invalid paths

start_beamline_assistant.sh:
- Silence numexpr via NUMEXPR_MAX_THREADS env var
Rule: warnings/errors always visible, success messages removed.
Keep: which MIDAS path is loaded, ⚠ multiple installs, ❌ not found,
      ⚠ MIDAS_PATH invalid, ⚠ missing Python deps, ⚠ missing AutoCalibrateZarr.

Always print one line showing which MIDAS is selected so operators
can confirm the correct installation at a glance.
…ains tool

- run_ff_hedm_full_workflow: add use_gpu (-useGPU), resume_file (--resume),
  restart_from (--restartFrom) for v10 checkpoint/resume support
- run_pf_hedm_workflow: add use_gpu, resume_file, restart_from, do_tomo (-doTomo)
- run_nf_hedm_reconstruction: add use_gpu (-gpuFit for FitOrientationGPU),
  resume_file, restart_from
- run_ff_calibration: switch CalibrantOMP → CalibrantIntegratorOMP (v10 primary)
- New match_grains tool: wraps match_grains.py for Hungarian algorithm grain
  matching across load states and NF-HEDM layer stitching
- validate_midas_installation: update key executables list for v10 (IndexerGPU,
  FitPosOrStrainsGPU, FitOrientationGPU, IntegratorZarrOMP, ProcessImagesCombined)
- validate_midas_installation: update workflow script paths (drop v7 subdir)
- apexa_agents.py: add match_grains to AnalysisAgent tool list; fix unused
  loop variable (iteration → _)
- beamline_core_server.py: ALLOWED_COMMANDS updated with full v10 executable set
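To illustrate the matching objective behind match_grains: the real match_grains.py uses the Hungarian algorithm, which scales to realistic grain counts; the toy version below brute-forces the same minimum-cost assignment for a handful of grains, with made-up coordinates:

```python
from itertools import permutations

# Toy minimum-cost grain matching. match_grains.py uses the Hungarian
# algorithm; brute force over permutations shown here only illustrates the
# objective and is feasible only for a few grains.
def match_grains(grains_a, grains_b):
    """Return the index map a→b minimizing total squared distance."""
    def cost(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    best_perm, best_cost = None, float("inf")
    for perm in permutations(range(len(grains_b))):
        total = sum(cost(grains_a[i], grains_b[j]) for i, j in enumerate(perm))
        if total < best_cost:
            best_perm, best_cost = perm, total
    return best_perm

load_state_1 = [(0.0, 0.0), (5.0, 5.0)]
load_state_2 = [(5.1, 5.0), (0.1, 0.0)]  # same grains, reordered and shifted
print(match_grains(load_state_1, load_state_2))  # → (1, 0)
```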
- Replace -StoppingStrain (removed in v10) with --n-iterations,
  --tol-shifts, --tol-rotation (actual AutoCalibrateZarr.py args)
- File search now also looks in cwd subdirs (e.g. test1/) when
  image or param file not found at the given path
- Show full path including subdir in search results
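The subdir fallback could look like this minimal sketch, assuming a one-level-deep search under the working directory; names and depth are illustrative:

```python
from pathlib import Path

# Sketch of the subdir-fallback search: if the file is not at the given
# path, also check one level down in cwd subdirectories (e.g. test1/).
def find_file(name, cwd="."):
    """Return the file's full path (including any subdir), or None."""
    direct = Path(cwd) / name
    if direct.exists():
        return direct
    for sub in Path(cwd).iterdir():
        candidate = sub / name
        if sub.is_dir() and candidate.exists():
            return candidate  # full path including the subdir
    return None
```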
…llback

- get_midas_env(): add MIDAS_ROOT/utils to PYTHONPATH so midas_config
  is importable regardless of working directory (fixes ModuleNotFoundError)
- find_midas_python(): raise RuntimeError instead of silently falling back
  to current Python when midas_env is not found — fails loud and clear
get_midas_env() sets DYLD_LIBRARY_PATH for C++ executables (they need
MIDAS's own libhdf5). get_midas_python_env() deliberately omits
DYLD_LIBRARY_PATH so the midas conda env's h5py links against its own
libhdf5 without conflict.

midas_auto_calibrate and run_python_script now use get_midas_python_env().
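The split between the two environments can be sketched as below; variable names and the lib/ layout are assumptions, not the actual midas_comprehensive_server.py code:

```python
import os

# Hedged sketch of the two env builders described above.
def get_midas_env(midas_root):
    """Env for MIDAS C++ executables, which need MIDAS's own libhdf5."""
    env = os.environ.copy()
    # MIDAS_ROOT/utils on PYTHONPATH makes midas_config importable
    # regardless of the working directory.
    env["PYTHONPATH"] = os.pathsep.join(
        p for p in [os.path.join(midas_root, "utils"), env.get("PYTHONPATH", "")] if p
    )
    env["DYLD_LIBRARY_PATH"] = os.path.join(midas_root, "lib")  # assumed layout
    return env

def get_midas_python_env(midas_root):
    """Env for the midas conda env's Python: DYLD_LIBRARY_PATH is
    deliberately omitted so h5py links against the env's own libhdf5."""
    env = get_midas_env(midas_root)
    env.pop("DYLD_LIBRARY_PATH", None)
    return env
```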
