We study a stochastic gradient algorithm for performing constrained binary logistic regression online in the case of streaming or massive data. Assuming that the observed data are realizations of a random vector, these data are standardized online, in particular to avoid numerical explosion or when a shrinkage method such as the LASSO is used. We prove the almost sure convergence of a variable step-size constrained stochastic gradient process with averaging when a varying number of new data points is introduced at each step. Twenty-four stochastic approximation processes are compared on real or simulated datasets: classical processes with raw data, processes with online standardized data, with or without averaging, and with variable or piecewise constant step-sizes.
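As a rough illustration of the kind of procedure described above, the following Python sketch combines online (Welford) standardization of the features, a projected stochastic gradient step for the logistic loss with a variable step-size gamma_n = c / n^alpha, and Polyak-Ruppert averaging of the iterates, with a mini-batch of varying size processed at each step. The constraint set (here an L2 ball of radius `radius`), the step-size schedule, and all function and parameter names are illustrative assumptions, not the authors' exact specification.

```python
import numpy as np

def project_l2_ball(theta, radius):
    """Project theta onto the L2 ball of given radius (the assumed constraint set)."""
    norm = np.linalg.norm(theta)
    return theta if norm <= radius else theta * (radius / norm)

def averaged_constrained_sgd_logistic(stream, dim, radius=10.0, c=1.0, alpha=2/3):
    """Averaged projected stochastic gradient for online binary logistic regression.

    `stream` yields mini-batches (X, y) with y in {0, 1}; the batch size may vary
    from step to step.  Features are standardized online with running estimates
    of the mean and variance before each gradient step.
    """
    theta = np.zeros(dim + 1)          # intercept + coefficients
    theta_bar = theta.copy()           # Polyak-Ruppert average of the iterates
    mean = np.zeros(dim)               # running mean of the features
    m2 = np.zeros(dim)                 # running sum of squared deviations (Welford)
    n_obs = 0                          # number of observations seen so far

    for n, (X, y) in enumerate(stream, start=1):
        # --- online standardization of the new observations ---------------
        for x in X:
            n_obs += 1
            delta = x - mean
            mean += delta / n_obs
            m2 += delta * (x - mean)
        std = np.sqrt(m2 / max(n_obs - 1, 1)) + 1e-12
        Z = (X - mean) / std

        # --- stochastic gradient of the logistic loss on the batch --------
        Z1 = np.hstack([np.ones((Z.shape[0], 1)), Z])   # add intercept column
        p = 1.0 / (1.0 + np.exp(-Z1 @ theta))
        grad = Z1.T @ (p - y) / Z.shape[0]

        # --- projected step with variable step-size gamma_n = c / n^alpha -
        gamma = c / n ** alpha
        theta = project_l2_ball(theta - gamma * grad, radius)

        # --- running average of the constrained iterates -------------------
        theta_bar += (theta - theta_bar) / n

    return theta_bar
```

In this sketch the averaged iterate `theta_bar` is returned as the online estimator; replacing `gamma = c / n ** alpha` by a piecewise constant schedule, or skipping the standardization step, yields the other variants compared in the experiments.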