We examine the minimization of information entropy for measures on the phase space of bounded domains, subject to constraints that are averages of grand canonical distributions. We describe the set of all such constraints and show that it equals the set of averages of all probability measures absolutely continuous with respect to the standard measure on the phase space (with the exception of the measure concentrated on the empty configuration). We also investigate how the set of constraints relates to the domain of the microcanonical thermodynamic limit entropy. We then show that, for fixed constraints, the parameters of the corresponding grand canonical distribution converge, as volume increases, to the corresponding parameters (derivatives...
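
For orientation, the following is a sketch of the standard variational problem behind this statement, written in generic notation (the symbols $\Gamma_\Lambda$, $H_\Lambda$, $N_\Lambda$, $\beta$, $\mu$, $\Xi_\Lambda$ are illustrative and not the paper's own; the sign convention is the one under which the information functional is minimized). On a bounded domain $\Lambda$ one minimizes
\[
  \mathcal{I}(\rho) \;=\; \int_{\Gamma_\Lambda} \rho \log \rho \, d\lambda
\]
over probability densities $\rho$ on the phase space $\Gamma_\Lambda$ (absolutely continuous with respect to the standard measure $\lambda$), subject to prescribed mean energy and mean particle number,
\[
  \int_{\Gamma_\Lambda} H_\Lambda \, \rho \, d\lambda = e,
  \qquad
  \int_{\Gamma_\Lambda} N_\Lambda \, \rho \, d\lambda = n.
\]
When a minimizer exists, it has the grand canonical form
\[
  \rho_{\beta,\mu}
  \;=\; \Xi_\Lambda(\beta,\mu)^{-1} \, e^{-\beta (H_\Lambda - \mu N_\Lambda)},
  \qquad
  \Xi_\Lambda(\beta,\mu) \;=\; \int_{\Gamma_\Lambda} e^{-\beta (H_\Lambda - \mu N_\Lambda)} \, d\lambda,
\]
with the Lagrange multipliers $\beta$ and $\mu$ determined by the constraint values $(e,n)$; these multipliers are, in this generic picture, the parameters whose infinite-volume behavior the abstract describes.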