
Connection Error during config-changed event #99

Closed
gruyaume opened this issue Feb 26, 2024 · 0 comments · Fixed by #115
Assignees
Labels
bug Something isn't working

Comments

@gruyaume (Collaborator)

Describe the bug

An error occurs during the config-changed event.

To Reproduce

  1. Pack the charm:
charmcraft pack --verbose
  2. Deploy the charm:
juju deploy ./sdcore-upf-k8s_ubuntu-22.04-amd64.charm --trust --resource bessd-image=ghcr.io/canonical/sdcore-upf-bess:1.3 --resource pfcp-agent-image=ghcr.io/canonical/sdcore-upf-pfcpiface:1.3

Expected behavior

The config-changed event is handled without raising an uncaught exception.

Logs

unit-sdcore-upf-k8s-0: 10:13:55 ERROR unit.sdcore-upf-k8s/0.juju-log Uncaught exception while in charm code:
Traceback (most recent call last):
  File "/var/lib/juju/agents/unit-sdcore-upf-k8s-0/charm/./src/charm.py", line 1040, in <module>
    main(UPFOperatorCharm)
  File "/var/lib/juju/agents/unit-sdcore-upf-k8s-0/charm/venv/ops/main.py", line 456, in main
    _emit_charm_event(charm, dispatcher.event_name)
  File "/var/lib/juju/agents/unit-sdcore-upf-k8s-0/charm/venv/ops/main.py", line 144, in _emit_charm_event
    event_to_emit.emit(*args, **kwargs)
  File "/var/lib/juju/agents/unit-sdcore-upf-k8s-0/charm/venv/ops/framework.py", line 351, in emit
    framework._emit(event)
  File "/var/lib/juju/agents/unit-sdcore-upf-k8s-0/charm/venv/ops/framework.py", line 853, in _emit
    self._reemit(event_path)
  File "/var/lib/juju/agents/unit-sdcore-upf-k8s-0/charm/venv/ops/framework.py", line 943, in _reemit
    custom_handler(event)
  File "/var/lib/juju/agents/unit-sdcore-upf-k8s-0/charm/./src/charm.py", line 513, in _on_config_changed
    self._on_bessd_pebble_ready(event)
  File "/var/lib/juju/agents/unit-sdcore-upf-k8s-0/charm/./src/charm.py", line 535, in _on_bessd_pebble_ready
    self._configure_bessd_workload()
  File "/var/lib/juju/agents/unit-sdcore-upf-k8s-0/charm/./src/charm.py", line 593, in _configure_bessd_workload
    self._run_bess_configuration()
  File "/var/lib/juju/agents/unit-sdcore-upf-k8s-0/charm/./src/charm.py", line 601, in _run_bess_configuration
    if not self._is_bessctl_executed():
  File "/var/lib/juju/agents/unit-sdcore-upf-k8s-0/charm/./src/charm.py", line 637, in _is_bessctl_executed
    return self._bessd_container.exists(path=f"/{BESSCTL_CONFIGURE_EXECUTED_FILE_NAME}")
  File "/var/lib/juju/agents/unit-sdcore-upf-k8s-0/charm/venv/ops/model.py", line 2578, in exists
    self._pebble.list_files(str(path), itself=True)
  File "/var/lib/juju/agents/unit-sdcore-upf-k8s-0/charm/venv/ops/pebble.py", line 2318, in list_files
    resp = self._request('GET', '/v1/files', query)
  File "/var/lib/juju/agents/unit-sdcore-upf-k8s-0/charm/venv/ops/pebble.py", line 1754, in _request
    response = self._request_raw(method, path, query, headers, data)
  File "/var/lib/juju/agents/unit-sdcore-upf-k8s-0/charm/venv/ops/pebble.py", line 1803, in _request_raw
    raise ConnectionError(
ops.pebble.ConnectionError: Could not connect to Pebble: socket not found at '/charm/containers/bessd/pebble.socket' (container restarted?)

Environment

  • Charm / library version (if relevant):
  • Juju version (output from juju --version): 3.4.0
  • Cloud Environment: MicroK8s
  • Kubernetes version (output from kubectl version --short): v1.28.7

Additional context

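For what it's worth, a possible mitigation (an assumption on my side, not necessarily what #115 implements) would be to guard Pebble access with Container.can_connect() and treat ops.pebble.ConnectionError as "not ready yet". A rough sketch, with the class and attribute names taken from the traceback above and everything else assumed:

import ops

# Constant name taken from the traceback above; the concrete value here is an
# assumption for illustration only.
BESSCTL_CONFIGURE_EXECUTED_FILE_NAME = "bessctl_configure_executed"


class UPFOperatorCharm(ops.CharmBase):
    def __init__(self, *args):
        super().__init__(*args)
        self._bessd_container = self.unit.get_container("bessd")
        self.framework.observe(self.on.config_changed, self._on_config_changed)

    def _on_config_changed(self, event: ops.EventBase) -> None:
        # If the bessd Pebble socket is gone (container restarted), defer the
        # event instead of letting exists()/list_files() raise ConnectionError.
        if not self._bessd_container.can_connect():
            event.defer()
            return
        # ... continue with the normal bessd configuration path ...

    def _is_bessctl_executed(self) -> bool:
        # exists() can still race with a container restart, so treat a lost
        # Pebble connection as "bessctl has not been executed yet".
        try:
            return self._bessd_container.exists(
                path=f"/{BESSCTL_CONFIGURE_EXECUTED_FILE_NAME}"
            )
        except ops.pebble.ConnectionError:
            return False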
@gruyaume gruyaume added the bug Something isn't working label Feb 26, 2024
@dariofaccin dariofaccin self-assigned this Mar 5, 2024