At present, in new hardware designs for deep learning, the memristor, a non-volatile memory device with in-memory computing capability, has become a research hotspot. The weights in a deep neural network are floating-point numbers. Writing a floating-point value into a memristor causes a loss of accuracy, and the writing process takes additional time. The binarized neural network (BNN) binarizes the weights and activation values, which are originally floating-point numbers, to +1 and -1. This greatly reduces both the storage space consumed and the time spent programming the resistance values of the memristors. Furthermore, it helps to simplify the programming of memristors in deep neural network circuits and speeds up the inference process...
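To make the binarization step concrete, the sketch below shows deterministic sign-based binarization of weights and activations, the scheme commonly used in BNN inference. This is a minimal NumPy illustration under assumed names (binarize, full_precision_weights), not a circuit-level or memristor-specific implementation; it only shows how floating-point values collapse to the two levels +1 and -1 that a memristor cell would need to store.

```python
import numpy as np

def binarize(x):
    """Deterministic binarization: map each floating-point value to +1 or -1
    by its sign (zeros mapped to +1)."""
    return np.where(x >= 0, 1.0, -1.0)

# Illustrative fully connected layer with binarized weights and activations.
rng = np.random.default_rng(0)
full_precision_weights = rng.normal(size=(4, 3))   # trained float weights
activations = rng.normal(size=(1, 4))              # float inputs to the layer

w_bin = binarize(full_precision_weights)           # values in {+1, -1}; one two-level cell each
a_bin = binarize(activations)                      # binarized activations

# The forward pass uses only +1/-1 operands, so every product is +1 or -1 and
# the multiply-accumulate reduces to counting sign agreements.
output = a_bin @ w_bin
print(output)
```

Because each binarized value needs only two resistance levels, writing a weight to a memristor becomes a single coarse programming step rather than a precise analog tuning of the resistance, which is the source of the storage and time savings described above.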