I have a use case where I need to impose "local" NChooseK constraints, e.g., for features x1-x6, I need to impose 2Choose1 only on features x5 and x6. The following repro works:
import copy
import random
from typing import List

import numpy as np
import torch
from torch.quasirandom import SobolEngine

from ax import Data, Experiment, ParameterType, RangeParameter, SearchSpace
from ax.core.observation import ObservationFeatures
from ax.modelbridge.generation_strategy import GenerationStrategy, GenerationStep
from ax.modelbridge.registry import Models
from ax.models.torch.botorch_modular.surrogate import Surrogate
from ax.runners.synthetic import SyntheticRunner
from ax.service.ax_client import AxClient, ObjectiveProperties
from ax.utils.measurement.synthetic_functions import branin, hartmann6
from botorch.acquisition.monte_carlo import qNoisyExpectedImprovement
from botorch.models.gp_regression import SingleTaskGP
from botorch.test_functions import Hartmann

# Load our sample 2-objective problem
from botorch.test_functions.multi_objective import BraninCurrin, DTLZ2

branin_currin = BraninCurrin(negate=True).to(
    dtype=torch.double,
    device=torch.device("cuda" if torch.cuda.is_available() else "cpu"),
)
test_dtlz2 = DTLZ2(dim=6, negate=True)

# Example usage
n = 1000
dim_of_problem = 6
list_of_nck = [[3, 5]]  # 3-choose-1
list_of_non_zeros = [1]
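# NOTE: `narrow_gaussian` is used below but is not defined in this repro. A minimal
# sketch, assuming the Gaussian "near-zero" indicator relaxation commonly used in
# the Ax/BoTorch NChooseK examples (an assumption, not part of the original code):
def narrow_gaussian(x: torch.Tensor, ell: float) -> torch.Tensor:
    # Smooth approximation of the indicator "x is zero": close to 1 near 0, close to 0 otherwise.
    return torch.exp(-0.5 * (x / ell) ** 2)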
var_idxs = list_of_nck
non_zeros = list_of_non_zeros


def create_ineq_constraint(var_idx: List[int], non_zero: int):
    def ineq_constraint_on_vars(x: torch.Tensor, ell: float = 1e-3):
        """Each callable is expected to take a `(num_restarts) x q x d`-dim tensor as
        an input and return a `(num_restarts) x q`-dim tensor with the constraint values."""
        x_slice = x[..., var_idx[0]:var_idx[1]]
        return narrow_gaussian(x_slice, ell).sum(dim=-1) - (x_slice.shape[-1] - non_zero)

    return ineq_constraint_on_vars


def setup_ineqs(var_idxs: List[List[int]], non_zeros: List[int]):
    """Set up the inequality constraints for the optimization problem."""
    return [
        create_ineq_constraint(var_idx, non_zero)
        for var_idx, non_zero in zip(var_idxs, non_zeros)
    ]


# Create a list of inequality constraint functions
ineq_constraints = setup_ineqs(var_idxs, non_zeros)
from botorch.utils.sampling import DelaunayPolytopeSampler
from botorch.sampling.normal import SobolQMCNormalSampler

# Nonsimple constraints
# (`get_batch_initial_conditions_multisubspace` and `q` are defined elsewhere and not shown in this repro.)
get_batch_initial_conditions_multitarget = get_batch_initial_conditions_multisubspace(
    num_restarts=1,
    raw_samples=512,
    q=q,
    list_of_nck=list_of_nck,
    list_of_non_zeros=list_of_non_zeros,
)
generation_strategy = GenerationStrategy(
    steps=[
        GenerationStep(
            model=Models.SOBOL,
            num_trials=1,  # https://github.com/facebook/Ax/issues/922
            min_trials_observed=1,
            max_parallelism=6,
            model_kwargs={"seed": 9999},
            model_gen_kwargs={
                "model_gen_options": {
                    # "optimizer_kwargs": {
                    #     "nonlinear_inequality_constraints": [ineq_constraint],
                    #     "batch_initial_conditions": batch_initial_conditions,
                    # }
                }
            },
        ),
        GenerationStep(
            model=Models.BOTORCH_MODULAR,
            num_trials=-1,
            model_gen_kwargs={
                "model_gen_options": {
                    "optimizer_kwargs": {
                        # "nonlinear_inequality_constraints": [ineq_constraint_first_2, ineq_constraint_last_2],
                        "nonlinear_inequality_constraints": ineq_constraints,
                        "batch_initial_conditions": get_batch_initial_conditions_multitarget,
                    },
                    "acqf_kwargs": {
                        "sampler": DelaunayPolytopeSampler,
                    },
                }
            },
            should_deduplicate=True,
        ),
    ]
)
ax_client = AxClient(generation_strategy=generation_strategy)
ax_client.create_experiment(
    name="moo_experiment",
    parameters=[
        {
            "name": f"x{i + 1}",
            "type": "range",
            "bounds": [0.0, 1.0],
        }
        for i in range(dim_of_problem)
    ],
    objectives={
        # `threshold` arguments are optional
        "a": ObjectiveProperties(minimize=False, threshold=test_dtlz2.ref_point[0]),
        "b": ObjectiveProperties(minimize=False, threshold=test_dtlz2.ref_point[1]),
    },
    overwrite_existing_experiment=True,
    is_test=True,
)
def evaluate(parameters):
    evaluation = test_dtlz2(
        torch.tensor(
            [
                parameters.get("x1"),
                parameters.get("x2"),
                parameters.get("x3"),
                parameters.get("x4"),
                parameters.get("x5"),
                parameters.get("x6"),
            ]
        )
    )
    # In our case, standard error is 0, since we are computing a synthetic function.
    # Set standard error to None if the noise level is unknown.
    return {"a": (evaluation[0].item(), 0.0), "b": (evaluation[1].item(), 0.0)}


for i in range(20):
    parameters, trial_index = ax_client.get_next_trial()
    # Local evaluation here can be replaced with deployment to external system.
    ax_client.complete_trial(trial_index=trial_index, raw_data=evaluate(parameters))
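As a quick sanity check of the relaxed constraint itself (not part of the original repro): BoTorch treats a nonlinear inequality constraint as satisfied when the callable returns a value >= 0, so the closure above can be evaluated directly on hand-picked points. The two test points below are purely illustrative.

# Illustrative sanity check of the relaxed constraint.
# Convention: the constraint is satisfied when the callable returns >= 0.
feasible = torch.tensor([[0.3, 0.7, 0.5, 0.0, 0.9, 0.0]])    # only x5 active among x4..x6
infeasible = torch.tensor([[0.3, 0.7, 0.5, 0.7, 0.5, 0.2]])  # x4, x5, x6 all active
print(ineq_constraints[0](feasible))    # ~0, i.e. feasible
print(ineq_constraints[0](infeasible))  # negative, i.e. infeasible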
In this repro I impose 3Choose1 on x4, x5, and x6. The resulting suggestions are shown below:
[INFO 07-02 17:13:50] ax.service.ax_client: Generated new trial 0 with parameters {'x1': 0.288336, 'x2': 0.699209, 'x3': 0.507989, 'x4': 0.692475, 'x5': 0.49404, 'x6': 0.184216}.
[INFO 07-02 17:13:50] ax.service.ax_client: Completed trial 0 with data: {'a': (-1.057924, 0.0), 'b': (-0.514846, 0.0)}.
[INFO 07-02 17:13:50] ax.modelbridge.transforms.standardize_y: Outcome a is constant, within tolerance.
[INFO 07-02 17:13:50] ax.modelbridge.transforms.standardize_y: Outcome b is constant, within tolerance.
[INFO 07-02 17:13:50] ax.service.ax_client: Generated new trial 1 with parameters {'x1': 0.815077, 'x2': 0.000158, 'x3': 1.0, 'x4': 0.0, 'x5': 0.931009, 'x6': 3e-05}.
[INFO 07-02 17:13:50] ax.service.ax_client: Completed trial 1 with data: {'a': (-0.625969, 0.0), 'b': (-2.094022, 0.0)}.
[INFO 07-02 17:13:51] ax.service.ax_client: Generated new trial 2 with parameters {'x1': 0.0, 'x2': 0.533964, 'x3': 0.0, 'x4': 0.026203, 'x5': 0.0, 'x6': 0.00677}.
[INFO 07-02 17:13:51] ax.service.ax_client: Completed trial 2 with data: {'a': (-1.968914, 0.0), 'b': (-0.0, 0.0)}.
[INFO 07-02 17:13:51] ax.service.ax_client: Generated new trial 3 with parameters {'x1': 0.518246, 'x2': 0.820945, 'x3': 0.654583, 'x4': 0.0, 'x5': 0.611085, 'x6': 0.312251}.
[INFO 07-02 17:13:51] ax.service.ax_client: Completed trial 3 with data: {'a': (-0.977989, 0.0), 'b': (-1.035719, 0.0)}.
[INFO 07-02 17:18:00] ax.service.ax_client: Generated new trial 4 with parameters {'x1': 0.580242, 'x2': 0.071386, 'x3': 0.780743, 'x4': 0.0, 'x5': 0.736197, 'x6': 0.0}.
[INFO 07-02 17:18:01] ax.service.ax_client: Completed trial 4 with data: {'a': (-1.113913, 0.0), 'b': (-1.437173, 0.0)}.
[INFO 07-02 17:22:05] ax.modelbridge.generation_node: The generator run produced duplicate arms. Re-running the generation step in an attempt to deduplicate. Candidates produced in the last generator run: [Arm(name='4_0', parameters={'x1': 0.5802415013313293, 'x2': 0.0713861957192421, 'x3': 0.7807427644729614, 'x4': 0.0, 'x5': 0.7361965179443359, 'x6': 0.0})].
[INFO 07-02 17:22:06] ax.service.ax_client: Generated new trial 5 with parameters {'x1': 0.328518, 'x2': 0.731405, 'x3': 0.569487, 'x4': 0.719355, 'x5': 0.0, 'x6': 0.207749}.
[INFO 07-02 17:22:06] ax.service.ax_client: Completed trial 5 with data: {'a': (-1.254143, 0.0), 'b': (-0.711487, 0.0)}.
[INFO 07-02 17:26:28] ax.modelbridge.generation_node: The generator run produced duplicate arms. Re-running the generation step in an attempt to deduplicate. Candidates produced in the last generator run: [Arm(name='4_0', parameters={'x1': 0.5802415013313293, 'x2': 0.0713861957192421, 'x3': 0.7807427644729614, 'x4': 0.0, 'x5': 0.7361965179443359, 'x6': 0.0})].
[INFO 07-02 17:26:29] ax.service.ax_client: Generated new trial 6 with parameters {'x1': 0.701085, 'x2': 0.73956, 'x3': 0.141509, 'x4': 0.818724, 'x5': 0.0, 'x6': 0.133479}.
[INFO 07-02 17:26:29] ax.service.ax_client: Completed trial 6 with data: {'a': (-0.756455, 0.0), 'b': (-1.4909, 0.0)}.
[INFO 07-02 17:30:30] ax.modelbridge.generation_node: The generator run produced duplicate arms. Re-running the generation step in an attempt to deduplicate. Candidates produced in the last generator run: [Arm(name='4_0', parameters={'x1': 0.5802415013313293, 'x2': 0.0713861957192421, 'x3': 0.7807427644729614, 'x4': 0.0, 'x5': 0.7361965179443359, 'x6': 0.0})].
[INFO 07-02 17:34:54] ax.modelbridge.generation_node: The generator run produced duplicate arms. Re-running the generation step in an attempt to deduplicate. Candidates produced in the last generator run: [Arm(name='4_0', parameters={'x1': 0.5802415013313293, 'x2': 0.0713861957192421, 'x3': 0.7807427644729614, 'x4': 0.0, 'x5': 0.7361965179443359, 'x6': 0.0})].
[INFO 07-02 17:39:21] ax.modelbridge.generation_node: The generator run produced duplicate arms. Re-running the generation step in an attempt to deduplicate. Candidates produced in the last generator run: [Arm(name='4_0', parameters={'x1': 0.5802415013313293, 'x2': 0.0713861957192421, 'x3': 0.7807427644729614, 'x4': 0.0, 'x5': 0.7361965179443359, 'x6': 0.0})].
[INFO 07-02 17:43:35] ax.modelbridge.generation_node: The generator run produced duplicate arms. Re-running the generation step in an attempt to deduplicate. Candidates produced in the last generator run: [Arm(name='4_0', parameters={'x1': 0.5802415013313293, 'x2': 0.0713861957192421, 'x3': 0.7807427644729614, 'x4': 0.0, 'x5': 0.7361965179443359, 'x6': 0.0})].
[INFO 07-02 17:47:46] ax.modelbridge.generation_node: The generator run produced duplicate arms. Re-running the generation step in an attempt to deduplicate. Candidates produced in the last generator run: [Arm(name='4_0', parameters={'x1': 0.5802415013313293, 'x2': 0.0713861957192421, 'x3': 0.7807427644729614, 'x4': 0.0, 'x5': 0.7361965179443359, 'x6': 0.0})].
[INFO 07-02 17:47:46] ax.service.ax_client: Generated new trial 7 with parameters {'x1': 0.052018, 'x2': 0.873592, 'x3': 0.467638, 'x4': 0.0, 'x5': 0.752633, 'x6': 0.160381}.
[INFO 07-02 17:47:46] ax.service.ax_client: Completed trial 7 with data: {'a': (-1.564546, 0.0), 'b': (-0.128124, 0.0)}.
[INFO 07-02 17:47:47] ax.service.ax_client: Generated new trial 8 with parameters {'x1': 0.720643, 'x2': 1.0, 'x3': 0.042758, 'x4': 0.0, 'x5': 0.44248, 'x6': 0.079273}.
[INFO 07-02 17:47:47] ax.service.ax_client: Completed trial 8 with data: {'a': (-0.802736, 0.0), 'b': (-1.710383, 0.0)}.
[INFO 07-02 17:47:47] ax.service.ax_client: Generated new trial 9 with parameters {'x1': 0.276204, 'x2': 0.890964, 'x3': 0.776269, 'x4': 0.0, 'x5': 1.0, 'x6': 0.707807}.
[INFO 07-02 17:47:48] ax.service.ax_client: Completed trial 9 with data: {'a': (-1.608152, 0.0), 'b': (-0.745059, 0.0)}.
[INFO 07-02 17:47:48] ax.service.ax_client: Generated new trial 10 with parameters {'x1': 0.0, 'x2': 0.122837, 'x3': 0.914065, 'x4': 0.0, 'x5': 0.302231, 'x6': 1.0}.
[INFO 07-02 17:47:48] ax.service.ax_client: Completed trial 10 with data: {'a': (-1.852815, 0.0), 'b': (-0.0, 0.0)}.
[INFO 07-02 17:47:49] ax.service.ax_client: Generated new trial 11 with parameters {'x1': 0.992469, 'x2': 1.0, 'x3': 1.0, 'x4': 0.0, 'x5': 0.634098, 'x6': 0.205465}.
[INFO 07-02 17:47:49] ax.service.ax_client: Completed trial 11 with data: {'a': (-0.021941, 0.0), 'b': (-1.854603, 0.0)}.
[INFO 07-02 17:47:49] ax.service.ax_client: Generated new trial 12 with parameters {'x1': 0.066958, 'x2': 0.841555, 'x3': 0.0, 'x4': 1.0, 'x5': 0.0, 'x6': 0.0}.
[INFO 07-02 17:47:49] ax.service.ax_client: Completed trial 12 with data: {'a': (-2.104963, 0.0), 'b': (-0.222213, 0.0)}.
[INFO 07-02 17:47:50] ax.service.ax_client: Generated new trial 13 with parameters {'x1': 1.0, 'x2': 1.0, 'x3': 1.0, 'x4': 0.0, 'x5': 0.41536, 'x6': 0.771124}.
[INFO 07-02 17:47:50] ax.service.ax_client: Completed trial 13 with data: {'a': (0.0, 0.0), 'b': (-1.830672, 0.0)}.
As can be seen, most of the suggestions do not conform to the imposed constraint (3Choose1 on x4, x5, x6). What could be the reason for this?
Edit: From the log output one can also see that there are a lot of duplicated suggestions. I found this from some time ago, but it doesn't really address my specific question.
For future reference, for others who might have a use for this code: the error was a bug in the function
def create_ineq_constraint(var_idx: List[int], non_zero: int):
    def ineq_constraint_on_vars(x: torch.Tensor, ell: float = 1e-3):
        """Each callable is expected to take a `(num_restarts) x q x d`-dim tensor as
        an input and return a `(num_restarts) x q`-dim tensor with the constraint values."""
        x_slice = x[..., var_idx[0]:var_idx[1]]
        return narrow_gaussian(x_slice, ell).sum(dim=-1) - (x_slice.shape[-1] - non_zero)

    return ineq_constraint_on_vars
where instead of x_slice = x[..., var_idx[0]:var_idx[1]] the slice should have read x_slice = x[..., var_idx[0]:var_idx[1]+1], so that the last index of the constrained block is included.
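For completeness, a corrected version of the helper with the inclusive end index (a sketch of the fix described above, otherwise identical to the original):

def create_ineq_constraint(var_idx: List[int], non_zero: int):
    def ineq_constraint_on_vars(x: torch.Tensor, ell: float = 1e-3):
        """Each callable is expected to take a `(num_restarts) x q x d`-dim tensor as
        an input and return a `(num_restarts) x q`-dim tensor with the constraint values."""
        # +1 so the slice covers the full constrained block, e.g. var_idx = [3, 5] -> x4..x6.
        x_slice = x[..., var_idx[0]:var_idx[1] + 1]
        return narrow_gaussian(x_slice, ell).sum(dim=-1) - (x_slice.shape[-1] - non_zero)

    return ineq_constraint_on_vars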