E2E testing, CI/CD pipelines in AWS and more

Hey, we have 10+ microservices successfully deployed in AWS with CI/CD pipelines, and all of that works just fine for now. However, our microservices also have e2e tests, and we want to add those to the pipelines, because so far we only run them locally (which sucks, right?). Since it’s e2e testing, we run queries/mutations, etc. directly against the Apollo gateway. And since we have things like authentication, and fetch entities from other services via RabbitMQ, we need all of the microservices running in order to execute the e2e tests. The question is: what is the best approach to do this? I came up with 2 solutions:

  1. Do all the grunt work in CodeBuild: clone the GitHub repositories, build the containers (including installing packages, etc.), load envs into each of them (not sure of the best way to do that without exposing .env.test files tbh, but I could think of something), run the tests, and then destroy these containers.

  2. Create an entire test environment in AWS, which means creating an ECS cluster, ElastiCache, RDS, DocumentDB, Amazon MQ instances, NAT gateways, a VPC, etc.
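For what it’s worth, here is a rough sketch of what option 1 could look like as a CodeBuild buildspec. Everything here is a placeholder assumption, not your actual setup: the repo URLs, the `docker-compose.e2e.yml` file, the `e2e-tests` service, and the SSM Parameter Store paths. The `env.parameter-store` section is one real CodeBuild feature that addresses the “.env.test exposure” worry: secrets live in Parameter Store and are injected as environment variables at build time instead of being committed.

```yaml
# Hypothetical buildspec.yml for running e2e tests in CodeBuild.
# All repo names, compose files, and parameter paths are placeholders.
version: 0.2

env:
  parameter-store:
    # Pull test secrets from SSM instead of shipping .env.test in the repo
    TEST_DB_PASSWORD: /e2e/test/db_password
    TEST_JWT_SECRET: /e2e/test/jwt_secret

phases:
  pre_build:
    commands:
      # Clone the sibling services the gateway needs to resolve its schema
      - git clone https://github.com/your-org/service-a.git
      - git clone https://github.com/your-org/service-b.git
  build:
    commands:
      # Bring up the gateway, services, RabbitMQ, Redis, and DBs on one network
      - docker compose -f docker-compose.e2e.yml up -d --build
      # Run the test suite as its own one-off container; its exit code fails the build
      - docker compose -f docker-compose.e2e.yml run --rm e2e-tests
  post_build:
    commands:
      # Tear everything down, including volumes, so each run starts clean
      - docker compose -f docker-compose.e2e.yml down -v
```

Note that the CodeBuild project would need privileged mode enabled to run Docker-in-Docker, and the compose file would carry the RabbitMQ/Redis/Postgres images so no AWS-managed infrastructure is touched at all.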

Maybe one of you has been in a similar situation and found a decent solution? If not, maybe you still have some ideas? I would like to hear all opinions; there are no dumb solutions, any solution is a potential solution, or at least a spark for thought.

Hello, my experience is that it is always better to have a separate environment. We use CloudFormation for all our infrastructure, so it is very easy to set up a second and a third. We always have what we call a staging environment that we can run both manual and automated tests against.
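To illustrate the “easy to set up a 2nd and 3rd” point: if the template takes the environment name as a parameter, the same file can be deployed as many stacks as you like. A minimal sketch, assuming a hypothetical `infra.yml` template with an `EnvironmentName` parameter that is used to namespace resources:

```
# Hypothetical: the same CloudFormation template deployed twice,
# once per environment. Stack and template names are placeholders.
aws cloudformation deploy \
  --template-file infra.yml \
  --stack-name myapp-staging \
  --parameter-overrides EnvironmentName=staging

aws cloudformation deploy \
  --template-file infra.yml \
  --stack-name myapp-prod \
  --parameter-overrides EnvironmentName=prod
```

The staging stack can then use smaller instance sizes via a `Mappings` section keyed on `EnvironmentName`, so the cost of keeping it running stays reasonable.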