Long-term visual recording of everyday human activities and behaviour is now feasible, and the widespread use of camera-equipped wearable devices offers the potential to gain real insight into wearers’ activities and behaviour. To date we have concentrated on automatically detecting semantic concepts within visual lifelogs, yet identifying human activities from such lifelogged images or videos remains a major challenge if lifelogs are to be used to maximum benefit. In this paper, we propose an activity classification method for visual lifelogs based on Fisher kernels, which extract discriminative embeddings from Hidden Markov Models (HMMs) of occurrences of semantic concepts. By using the gradients as features,...
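As a minimal sketch of the idea (not the paper's exact formulation), the Fisher-kernel embedding of a sequence is the gradient of the HMM log-likelihood with respect to the model parameters. Assuming a discrete HMM over concept-occurrence symbols, the gradient with respect to the (unconstrained) emission parameters B[j, k] is sum_t gamma_t(j) · 1[obs_t = k] / B[j, k], where gamma is the state posterior from forward-backward; all parameter values below are illustrative:

```python
import numpy as np

def forward_backward(obs, pi, A, B):
    """State posteriors and likelihood for a discrete HMM.

    obs: sequence of symbol indices; pi: initial state probs (S,);
    A: state transitions (S, S); B: emission probs (S, K).
    """
    T, S = len(obs), len(pi)
    alpha = np.zeros((T, S))
    beta = np.zeros((T, S))
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):                      # forward pass
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):             # backward pass
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    likelihood = alpha[-1].sum()               # P(obs | model)
    gamma = alpha * beta / likelihood          # posteriors (T, S)
    return gamma, likelihood

def fisher_score(obs, pi, A, B):
    """Gradient of log P(obs) w.r.t. emission parameters, flattened.

    This per-sequence gradient vector is the fixed-length embedding
    that a discriminative classifier (e.g. an SVM) can consume.
    """
    gamma, _ = forward_backward(obs, pi, A, B)
    grad = np.zeros_like(B)
    for t, o in enumerate(obs):
        grad[:, o] += gamma[t] / B[:, o]
    return grad.ravel()
```

Sequences of different lengths are thereby mapped into one fixed-dimensional space, which is what makes the HMM usable with a discriminative classifier; a fuller treatment would also include gradients with respect to the transition and initial-state parameters.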