We give an exposition of the principles of quantum computing (logic gates, exponential parallelism from polynomial hardware, fast quantum algorithms, quantum error correction, hardware requirements, and experimental milestones). A compact description of the quantum Fourier transform used to find the period of a function, the key step in Shor's factoring algorithm, illustrates how parallel state evolution along many classical computational paths produces fast algorithms through constructive interference, much as Bragg reflections arise in x-ray crystallography. On the hardware side, we present a new method for estimating the critical time scales for the operation of a quantum computer. We derive a universal upper bound on the probability that a computation fails...
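The period-finding idea summarized above can be illustrated classically: the discrete Fourier transform of a periodic sequence f(x) = a^x mod N concentrates its weight at multiples of M/r, where r is the period, and it is this interference pattern that the quantum Fourier transform exploits. The following sketch (the parameters a = 7, N = 15, M = 64 are illustrative choices, not taken from the paper) shows the peak structure with an ordinary FFT; it is a classical analogy, not a simulation of the quantum algorithm.

```python
import numpy as np

# f(x) = 7^x mod 15 is periodic with period r = 4: 1, 7, 4, 13, 1, ...
a, N, M = 7, 15, 64
f = np.array([pow(a, x, N) for x in range(M)], dtype=float)

# Remove the mean (DC component), then look for the dominant frequency.
spectrum = np.abs(np.fft.fft(f - f.mean()))
peak = int(np.argmax(spectrum[1 : M // 2])) + 1  # skip index 0

r = round(M / peak)  # estimated period
print(r)             # -> 4
```

The spectrum is nonzero only at multiples of M/r = 16, so the lowest peak directly reveals the period; in the quantum setting the same concentration of amplitude makes a measurement after the QFT yield a multiple of M/r with high probability.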