The unprecedented growth of Artificial Intelligence (AI) models, particularly transformer-based language models such as GPT-3 and BERT, has ushered in the era of large-scale AI development. Training AI models on conventional graphics processing unit (GPU) chips with 2D circuit designs contributes significantly to carbon emissions. For instance, training a single GPT-3 model on conventional 2D GPUs consumed roughly 1.2 gigawatt-hours (GWh) of electricity and generated 552 tons of CO2 emissions, equivalent to the annual emissions of about 123 gasoline-powered passenger vehicles. Moreover, training these AI models on GPUs with a 2D circuit configuration poses significant computational power challenges. There is a dire need for innovative circuit configurations that reduce the carbon footprint of AI computing for the welfare of humanity. Our groundbreaking Tasawwur transistor-based GPU can cut power consumption by a factor of 2.5 (a 60% reduction) using a state-of-the-art 3D circuit design configuration. For instance, training a single GPT-3 model on the proposed Tasawwur GPU would consume just 0.48 GWh and cut CO2 emissions from 552 to roughly 220 tons, a saving equivalent to planting 9 million trees annually. Hence, our groundbreaking Tasawwur GPU is a low-carbon electronic chip that can revolutionize future AI computation.
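The claimed savings can be sanity-checked with a short calculation, assuming CO2 emissions scale linearly with energy consumed; the baseline figures (1.2 energy units, 552 tons of CO2) and the 2.5x efficiency factor are taken from the text above, and the variable names are illustrative only:

```python
# Sanity check of the claimed Tasawwur GPU savings, assuming CO2
# emissions scale linearly with energy consumed during training.
baseline_energy = 1.2        # energy to train GPT-3 on conventional 2D GPUs
baseline_co2_tons = 552.0    # reported CO2 emissions for that training run
efficiency_factor = 2.5      # claimed improvement from the 3D circuit design

proposed_energy = baseline_energy / efficiency_factor
proposed_co2_tons = baseline_co2_tons / efficiency_factor
co2_avoided_tons = baseline_co2_tons - proposed_co2_tons

print(f"proposed energy:   {proposed_energy:.2f}")      # 0.48
print(f"proposed CO2 (t):  {proposed_co2_tons:.1f}")    # 220.8
print(f"CO2 avoided (t):   {co2_avoided_tons:.1f}")     # 331.2
```

The arithmetic confirms that a 2.5x efficiency gain brings 1.2 units of energy down to 0.48 and 552 tons of CO2 down to about 220 tons.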