Recent developments in deep learning have pushed the limits of what is possible, with large language models exhibiting outstanding capabilities and models for computer vision and natural language processing exceeding human-level performance. However, this progress comes at the expense of immense power consumption during the training of such large-scale models. From this perspective, the advance is not sustainable, especially in light of the climate change concerns looming over the present day. The enormity of this energy consumption can be attributed to the architecture of conventional computers, which is not optimized for the energy demands of deep learning workloads. The human brain, on the other hand, excels in this respect, performing complex patt...