What Tech Powers ChatGPT Behind The Scenes?
Micah Preston
February 8, 2026 at 07:07 PM
Hey everyone, been wondering about the tech stack behind ChatGPT. Like, does it run on TensorFlow, PyTorch, or something else entirely? Would love to hear from folks who know more about these AI frameworks and how they relate to models like ChatGPT.
Comments (11)
I'm curious if the inference part (like what we use) still relies on PyTorch or if they convert the model to other runtimes?
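On the runtime-conversion question: a common pattern (not specific to OpenAI, whose serving stack isn't public) is to trace a PyTorch model into TorchScript, or export it to ONNX, so inference can run outside the Python training environment. A minimal sketch with a toy model (the model and sizes are made up for illustration):

```python
import torch
import torch.nn as nn

# Toy model standing in for a real network (illustrative only).
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(8, 4)

    def forward(self, x):
        return torch.relu(self.linear(x))

model = TinyNet().eval()
example = torch.randn(1, 8)

# Trace the eager-mode model into TorchScript, a serializable
# representation that can run without Python (e.g. via libtorch in C++).
scripted = torch.jit.trace(model, example)

# The traced module produces the same outputs as the original.
assert torch.allclose(model(example), scripted(example))

# ONNX export for other runtimes follows a similar pattern:
# torch.onnx.export(model, example, "tinynet.onnx")
```

Whether any given production system ships TorchScript, ONNX, or a custom engine depends on the deployment, but this is the general shape of "train in PyTorch, serve elsewhere."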
Honestly, I've always preferred TensorFlow for production due to its robust tools. Just interesting to see PyTorch dominating research.
Does anyone know if the training infrastructure uses any other frameworks or custom setups besides PyTorch?
Thanks for the insights folks, helps clear up a lot about how these AI giants operate tech-wise.
I heard that OpenAI switched from TensorFlow to PyTorch a while back for their big projects. It’s more user-friendly for research stuff.
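To illustrate the "more user-friendly for research" point: PyTorch's eager execution lets you write a training step as ordinary Python and inspect tensors mid-run. A minimal, hypothetical example (nothing here is from OpenAI's actual code):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# A tiny regression model; purely illustrative.
model = nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

x = torch.randn(32, 4)
y = x.sum(dim=1, keepdim=True)  # synthetic target

# Eager-mode training loop: plain Python control flow,
# gradients computed on the fly by autograd.
losses = []
for step in range(50):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    losses.append(loss.item())

# Loss should drop as the model fits the synthetic data.
assert losses[-1] < losses[0]
```

You can drop a print or a debugger breakpoint anywhere in that loop, which is a big part of why researchers gravitated to it over graph-compile-then-run workflows.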
I guess this just shows how important the ecosystem and community support are when picking tools for AI.
Does that mean TensorFlow is kinda obsolete for these big NLP models now?
Is there any chance ChatGPT will move back to TensorFlow or another framework in the future?
Found this discussion helpful, does anyone know where I can check out new AI tools related to this?
Do you think the choice of PyTorch over TensorFlow affects how fast new features get developed in ChatGPT?
From what I know, ChatGPT is mostly built using PyTorch. It's pretty popular for research and development in AI these days.