The main objective of this thesis is to develop stochastic control theory and its applications to population dynamics. From a theoretical point of view, we study finite-horizon stochastic control problems on diffusion processes, nonlinear branching processes, and branching diffusion processes. In each case we establish a dynamic programming principle by carefully proving a conditioning argument similar to the strong Markov property for controlled processes. We then deduce that the value function is a (viscosity or regular) solution of the associated Hamilton-Jacobi-Bellman equation. In the regular case, we further identify an optimal control in the class of Markovian strategies thanks to a verification theorem. From a practical point of view, we a...