Not to be impolite, but this is incorrect. One detail they did share in their paper is that they were able to tune and select their hyperparameters on a model that needed 1,000x less compute than the final GPT-4 model. OpenAI is definitely leading in training very large models cost-effectively.