This work presents the development of a robotic wheelchair that can be commanded by users in a supervised way or driven by a fully automatic, unsupervised navigation system. It offers the flexibility to choose among different modalities to command the wheelchair, making it suitable for people with different levels of disability. Users can command the wheelchair through eye blinks, eye movements, head movements, sip-and-puff, or brain signals. The wheelchair can also operate as an auto-guided vehicle, following metallic tapes, or in a fully autonomous way. The system is provided with an easy-to-use and flexible graphical user interface running onboard a personal digital assistant (PDA), which allows users to choose the commands to be sent to the wheelchair.
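
To make the command flow concrete, the sketch below models how a selection made on the PDA interface could be dispatched to the wheelchair under the different operation modes. All class, enum, and method names are hypothetical illustrations of the modality-independent command idea described above, not the authors' implementation.

```python
from enum import Enum, auto

class InputModality(Enum):
    """Hypothetical labels for the input channels described in the text."""
    EYE_BLINK = auto()
    EYE_MOVEMENT = auto()
    HEAD_MOVEMENT = auto()
    SIP_AND_PUFF = auto()
    BRAIN_SIGNAL = auto()

class NavigationMode(Enum):
    """Hypothetical operation modes: user-supervised, tape-following, autonomous."""
    SUPERVISED = auto()
    AUTO_GUIDED = auto()   # follows metallic tapes like an auto-guided vehicle
    AUTONOMOUS = auto()

class WheelchairController:
    """Illustrative dispatcher: the PDA interface selects a command and the
    controller forwards it to the wheelchair according to the active mode."""

    def __init__(self, mode: NavigationMode = NavigationMode.SUPERVISED):
        self.mode = mode

    def send_command(self, modality: InputModality, command: str) -> str:
        # In supervised mode the user's selection is forwarded directly;
        # in the automatic modes the navigation system generates motion itself.
        if self.mode is NavigationMode.SUPERVISED:
            return f"user command '{command}' via {modality.name}"
        return f"user input ignored; wheelchair navigating in {self.mode.name} mode"

# Example: a user selects "turn left" on the PDA interface using eye blinks.
controller = WheelchairController()
print(controller.send_command(InputModality.EYE_BLINK, "turn left"))
```

The point of the sketch is only that the command layer is independent of the input modality: any of the interfaces listed above can produce the same high-level command, which the controller forwards or suppresses depending on the current navigation mode.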