Handling the refresh of large schemas

Is there a best practice for handling the refresh of large schemas? We have a few thousand entities in our schema and run into issues whenever a schema change is refreshed across our deployment of a few hundred servers.

Can you share more specifics about the issues you are seeing and what version of Gateway or Router you are running?

Our schema file is currently 35k lines and is expected to grow to possibly 2-3 times that. We have a few hundred servers serving requests. Whenever we update the schema, we reload the schema file dynamically, which causes a spike in CPU usage and higher response times for a few seconds.

We are wondering whether anyone is applying delta changes to an existing schema rather than completely regenerating the schema from the file every time something changes, or whether it is possible to break the schema into separate files and only update the affected parts when changes are made.

We routinely update the schema through a pull request process and reload it whenever schema changes are merged. We do not want to restart the servers when schema updates are made.
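To make the question concrete, here is a rough sketch of the kind of reload we have in mind (illustrative only, not our production code): the new schema is built off the request path, the live reference is swapped atomically, and a random jitter staggers rebuilds across the fleet. The shard paths and function names here are made up, and since `buildSchema` produces an SDL-only schema, a server with resolvers would use something like `makeExecutableSchema` from `@graphql-tools/schema` instead.

```typescript
import { readFile } from 'node:fs/promises';
import { setTimeout as sleep } from 'node:timers/promises';
import { buildSchema, GraphQLSchema } from 'graphql';

// Live schema reference read by request handlers. Swapping it is a
// single assignment, so in-flight requests keep the schema they
// started with and new requests pick up the new one.
let activeSchema: GraphQLSchema | null = null;

// Illustrative shard layout: splitting the 35k-line file shortens
// edits and reviews, but the executable schema must still be rebuilt
// as a whole, so splitting alone does not remove the CPU spike.
const shardPaths = ['schema/core.graphql', 'schema/orders.graphql'];

export function getSchema(): GraphQLSchema {
  if (activeSchema === null) throw new Error('schema not loaded yet');
  return activeSchema;
}

async function rebuildSchema(): Promise<void> {
  // File reads are async and cheap; the expensive part is buildSchema.
  const parts = await Promise.all(shardPaths.map((p) => readFile(p, 'utf8')));
  // buildSchema is synchronous CPU work: on Node it blocks the event
  // loop for the duration, which is the spike described above.
  const next = buildSchema(parts.join('\n'));
  // Swap only after the new schema built successfully.
  activeSchema = next;
}

// Called when merged schema changes are detected (file watcher,
// CI webhook, polling a registry -- whatever signals the update).
export async function onSchemaChanged(): Promise<void> {
  // Random jitter staggers rebuilds so a few hundred servers do not
  // all spike CPU (and response times) at the same moment.
  await sleep(Math.random() * 30_000);
  try {
    await rebuildSchema();
  } catch (err) {
    // Keep serving the previous schema if the new SDL fails to build.
    console.error('schema rebuild failed; keeping previous schema', err);
  }
}
```

One caveat as we understand it: a built `GraphQLSchema` is an object full of functions and so cannot simply be handed back from a worker thread, which means on a single process the rebuild cost can be shortened but not fully hidden; the jitter is mainly what keeps the fleet-wide latency impact small.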