Possibly modify the downsampling protocol to repeat until a desired connectance is reached. Since downsampling is a way to prune links so the web is more representative of real-world metawebs, I can't see why it would be wrong to continue removing links by recalculating the link distribution after each downsampling event.
How I see this could work:
downsample -> make binary -> check Co -> if Co > desired Co, downsample again -> repeat while Co > desired Co
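A minimal sketch of that loop, assuming connectance is Co = L / S^2 for a binary S x S adjacency matrix, and using a hypothetical `downsample` step (the real protocol's link-removal rule isn't shown here) in which each remaining link is kept with its recalculated probability:

```python
import numpy as np

def connectance(adj):
    """Connectance Co = L / S^2 of a binary S x S adjacency matrix."""
    S = adj.shape[0]
    return adj.sum() / S**2

def downsample(adj, prob, rng):
    """Hypothetical single downsampling event: keep each existing link
    with its (recalculated) probability, otherwise remove it."""
    keep = rng.random(adj.shape) < prob
    return np.where(keep, adj, 0)

def downsample_until(adj, prob, target_co, rng=None, max_iter=1000):
    """Repeat: downsample -> make binary -> check Co, while Co > target Co."""
    rng = rng or np.random.default_rng()
    adj = (adj > 0).astype(int)  # make binary
    iters = 0
    while connectance(adj) > target_co and iters < max_iter:
        adj = downsample(adj, prob, rng)
        iters += 1
    return adj
```

Because links are only ever removed, Co decreases monotonically and the loop terminates either at the desired connectance or at `max_iter`.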