Abstract: This work addresses (mixed-integer) joint chance-constrained optimization problems in which the only uncertain parameter is the right-hand-side coefficient vector of an inequality system. It exploits one-dimensional marginals to construct upper and lower models of the underlying joint probability function, which need not be differentiable or satisfy generalized concavity properties. Based on these models, two optimization methods are proposed. Neither requires the probability function’s (sub)gradients, so both can be considered derivative-free methods. The first approach iteratively enriches an upper model of the probability function within an outer approximation algorithm that is shown, under mild assumptions, to compute an approximate global solution to the nonconvex chance-constrained problem. When the problem’s data are linear, the outer approximation algorithm requires (approximately) solving one mixed-integer linear programming problem per iteration. The second method combines a lower model with penalization techniques to efficiently compute points satisfying a new criticality condition. This approach, which handles only continuous variables, defines iterates as critical points of a nonlinear master program that can be handled with off-the-shelf nonlinear programming solvers. Numerical experiments on academic chance-constrained problems highlight the approaches’ potential and limitations.