Decompose the provisioner #109

Merged
merged 4 commits into harvester:master on Aug 21, 2024

Conversation

@Vicente-Cheng (Collaborator) commented on Jun 5, 2024:

Problem:
We need to support various storage backends.

Solution:
Decompose the provisioner so that each storage backend can define its own behaviors (a rough interface sketch follows the task list below).

Related Issue:
TBD

Test plan:
Make sure all operations work as usual.

TASKS

  • decompose the provisioner
  • create Longhornv1
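
A minimal Go sketch of what the decomposed shape could look like — the names here (Provisioner, longhornV1, and the BlockDevice stub) are illustrative assumptions, not the actual code in pkg/provisioner:

```go
// Illustrative sketch only; the real interface in pkg/provisioner may differ.
package provisioner

// BlockDevice is a stand-in for the harvesterhci.io/v1beta1 BlockDevice CR.
type BlockDevice struct {
	Name string
	WWN  string
}

// Provisioner abstracts a storage backend so that each backend
// (Longhorn v1, and later the v2 engine and LVM) defines its own behavior.
type Provisioner interface {
	Provision(bd *BlockDevice) error   // add the device to the backend
	UnProvision(bd *BlockDevice) error // remove the device from the backend
	Update(bd *BlockDevice) error      // reconcile the device with the backend
}

// longhornV1 implements Provisioner for the Longhorn v1 engine, matching
// the "create Longhornv1" task above.
type longhornV1 struct{}

func (p *longhornV1) Provision(bd *BlockDevice) error   { return nil } // stub
func (p *longhornV1) UnProvision(bd *BlockDevice) error { return nil } // stub
func (p *longhornV1) Update(bd *BlockDevice) error      { return nil } // stub
```

With a shape like this, the controller holds a Provisioner value and never branches on the backend type, which is what lets the v2 engine and LVM land later as separate implementations.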

@Vicente-Cheng force-pushed the wip-decompose-NDM branch 5 times, most recently from 7d96e8b to e65b17e on June 5, 2024 17:09
@bk201 requested a review from Yu-Jack on June 6, 2024 06:21
@Vicente-Cheng force-pushed the wip-decompose-NDM branch 7 times, most recently from 2f2d814 to 1d00ec5 on June 7, 2024 08:31
@Vicente-Cheng marked this pull request as ready for review on June 7, 2024 08:57
@Vicente-Cheng (Collaborator, Author) commented:

CodeFactor complains about files generated by wrangler (the redundant return in the generated deepcopy funcs).

@Vicente-Cheng force-pushed the wip-decompose-NDM branch 2 times, most recently from 133aba5 to df2eb1e on June 12, 2024 03:24
@Vicente-Cheng requested review from tserong and removed the request for Yu-Jack on June 19, 2024 04:10
Resolved review threads on: pkg/provisioner/lvm.go (3 threads), pkg/provisioner/common.go (1 thread)
@tserong (Contributor) left a comment:

Why is the provisioner info duplicated in both the CR spec and status sections? Doesn't this only need to be set in the spec?

@@ -23,7 +23,7 @@ spec:
     - jsonPath: .status.deviceStatus.details.deviceType
       name: Type
       type: string
-    - jsonPath: .spec.devPath
+    - jsonPath: .status.deviceStatus.devPath
@tserong (Contributor) commented:

A general question about NDM which I've only thought to ask just now while reviewing this PR: Why do we have both spec.DevPath and status.deviceStatus.devPath? Can these two values ever be different? If not, should we remove one of them?

@Vicente-Cheng (Collaborator, Author) replied:

spec.DevPath is the device path recorded when the blockdevice CR is created (set the first time, then fixed).
status.deviceStatus.devPath is the current device path.

So I thought showing status.deviceStatus.devPath would be more useful for us.
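
For illustration, a hedged sketch of how the two fields could look in pkg/apis/harvesterhci.io/v1beta1/types.go — the field layout here is assumed from this discussion, not copied from the file:

```go
package v1beta1

// BlockDeviceSpec holds the creation-time intent.
type BlockDeviceSpec struct {
	// DevPath is recorded when the BlockDevice CR is first created and is
	// not updated afterwards.
	DevPath string `json:"devPath"`
}

// DeviceStatus reports what the scanner currently observes.
type DeviceStatus struct {
	// DevPath is refreshed on every scan, so it tracks the path the kernel
	// currently assigns; it can drift from spec.devPath when the device is
	// re-enumerated (e.g. /dev/sdb coming back as /dev/sdc).
	DevPath string `json:"devPath"`
}
```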

@tserong (Contributor) replied:

Yeah showing the actual current path makes sense.

I've just realised this (kinda) explains harvester/harvester#6224 where there were discrepancies between spec.DevPath and status.deviceStatus.devPath. I mean, there never should have been three BDs that all had the same status.deviceStatus.devPath, and I assume that was (somehow) caused by multiple dodgy devices with the same WWN, but the fact that one is the path when the device was added, and one is the current path, at least sheds some light on the discrepancies I noticed in that issue.

Thinking aloud: Maybe we should just drop spec.DevPath? Does it help at all to know what the path was originally? Note: I'm not suggesting we make any further changes to this PR in this area. Just an item for thought/consideration later.

@Vicente-Cheng (Collaborator, Author) commented on Aug 14, 2024:

> Maybe we should just drop spec.DevPath

+1, it's a redundant field with the current design.
We can do more cleanup on these fields. I plan to introduce a new naming mechanism to replace the current one in the next release (i.e. v1.5), which would work better for devices without a WWN. Maybe we could redesign these fields at that time.

We still need to check for duplicated WWNs, but we could handle devices without a WWN better.

UPDATE

> I assume that was (somehow) caused by multiple dodgy devices with the same WWN, but the fact that one is the path when the device was added, and one is the current path, at least sheds some light on the discrepancies I noticed in that issue.

I am also curious about that part. We should make sure a duplicated WWN is never added, but somehow it was. Maybe there is still a corner case in the WWN checking mechanism.
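
To illustrate the kind of guard being discussed, a hypothetical sketch — ensureUniqueWWN and its inputs are illustrative, not actual NDM code:

```go
package provisioner

import "fmt"

// ensureUniqueWWN rejects a device whose WWN is already claimed by another
// blockdevice. The corner case suspected above would be a path that skips
// this guard, e.g. when the reported WWN is empty or not actually unique.
func ensureUniqueWWN(wwn, device string, registered map[string]string) error {
	if wwn == "" {
		// No WWN to deduplicate on; the planned v1.5 naming mechanism is
		// meant to handle these devices.
		return fmt.Errorf("device %s has no WWN", device)
	}
	if owner, ok := registered[wwn]; ok && owner != device {
		return fmt.Errorf("WWN %s is already registered by %s", wwn, owner)
	}
	registered[wwn] = device
	return nil
}
```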

@votdev (Member) commented on Aug 15, 2024:

> Can these two values ever be different?

Yes. A device file like /dev/disk/by-path/pci-0000:00:10.0-scsi-0:0:0:0 can change when the PCI slots are reordered, e.g. by adding new hardware. Therefore /dev/disk/by-id/ should be used, because those device files are predictable: they contain, for example, the WWN or serial number of the disk.
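
As a self-contained illustration (not NDM code), here is one way to map a kernel device node to its stable /dev/disk/by-id symlink:

```go
// Resolve a kernel device node such as /dev/sda to its stable
// /dev/disk/by-id symlink, which encodes the WWN or serial number and
// survives PCI re-enumeration.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func stableID(devPath string) (string, error) {
	links, err := filepath.Glob("/dev/disk/by-id/*")
	if err != nil {
		return "", err
	}
	for _, link := range links {
		// Each by-id entry is a symlink; resolve it to the kernel node.
		target, err := filepath.EvalSymlinks(link)
		if err != nil {
			continue
		}
		if target == devPath {
			return link, nil // e.g. /dev/disk/by-id/wwn-0x5000c500a1b2c3d4
		}
	}
	return "", fmt.Errorf("no by-id symlink found for %s", devPath)
}

func main() {
	if id, err := stableID("/dev/sda"); err == nil {
		fmt.Println(id)
	} else {
		fmt.Fprintln(os.Stderr, err)
	}
}
```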

@Vicente-Cheng (Collaborator, Author) replied:

Thanks @votdev,
Yes, if we want to keep the spec.devPath field, we should use a meaningful path like the above instead of /dev/sdX. As I mentioned above, we could redesign the field and the naming mechanism in v1.5.

@Vicente-Cheng (Collaborator, Author) commented:

Hi guys, I would like to reduce this PR to only decomposing the provisioner and making longhornv1 work as usual.

After that, both @tserong and I can build on it to develop the v2 engine and LVM.
Please help to check it again. Thanks!

@Vicente-Cheng force-pushed the wip-decompose-NDM branch 3 times, most recently from 63fd98c to 13ef984 on August 14, 2024 01:46
@Yu-Jack self-requested a review on August 14, 2024 01:59
@Yu-Jack left a comment:

LGTM, thanks.

@Vicente-Cheng force-pushed the wip-decompose-NDM branch 3 times, most recently from 005f971 to 3542487 on August 15, 2024 05:28
Review threads on: manifests/crds/harvesterhci.io_blockdevices.yaml, pkg/apis/harvesterhci.io/v1beta1/types.go, and pkg/provisioner/longhornv1.go (5 threads, all resolved)
Commit messages (4 commits):

    - Also update the deploy crd for testing

Signed-off-by: Vicente Cheng <vicente.cheng@suse.com>
Signed-off-by: Vicente Cheng <vicente.cheng@suse.com>

    - We might need LH v1.7.0 for v2 engine, so move to wrangler v3 first.

Signed-off-by: Vicente Cheng <vicente.cheng@suse.com>
Signed-off-by: Vicente Cheng <vicente.cheng@suse.com>
@votdev (Member) left a comment:

LGTM

@Vicente-Cheng merged commit b3d80cc into harvester:master on Aug 21, 2024
5 of 6 checks passed
@Vicente-Cheng (Collaborator, Author) commented:

@Mergifyio backport v0.7.x

mergify bot commented on Aug 21, 2024:

> backport v0.7.x

✅ Backports have been created

@mergify bot mentioned this pull request on Aug 21, 2024