We present a notion, relative independence, that models independence relative to a predicate. The intuition is to capture a minimum of dependencies among variables with respect to the predicate. We prove that relative independence coincides with conditional independence only in a trivial case. For use in second-order probability, we let the predicate express first-order probability, i.e. require that the probability variables sum to one, so that dependency is restricted to the relation that necessarily holds between probabilities of exhaustive and mutually exclusive events. We then give examples of Dirichlet distributions that do and do not have the property of relative independence, and compare these distributions with respect to the impact of dependencies beyond those imposed by the predicate.
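As a minimal illustration of the dependency the predicate itself induces (the symbols $X_i$ and $\alpha_i$ are assumed here for the sketch and are not fixed by the abstract): for a Dirichlet distribution with parameters $\alpha_1,\dots,\alpha_k$, the components lie on the probability simplex and are therefore negatively correlated,
\[
\sum_{i=1}^{k} X_i = 1,
\qquad
\operatorname{Cov}(X_i, X_j) \;=\; \frac{-\,\alpha_i \alpha_j}{\alpha_0^{2}\,(\alpha_0 + 1)} \quad (i \neq j),
\qquad
\alpha_0 = \sum_{i=1}^{k} \alpha_i .
\]
This negative covariance stems solely from the sum-to-one predicate; the comparison above concerns dependencies over and above this necessary one.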