Most research on algorithms aims at reducing computational time complexity. Such research often neglects the amount of memory used, resulting in algorithms that require large amounts of memory and cannot be executed on ordinary PCs. On the other hand, research on reducing the amount of memory required for computation has a long history. However, while most of it was theoretically interesting, it was too restrictive to be practical, for example asking whether some computation can be done with O(log n) bits for input size n. In recent years, the use of big data has become widespread, and the inputs to algorithms tend to be larger than in the past. In the past, it was natural to use the same amount of memory as the size of ...