What is the best practice to separate Prod and Test Environments in AWS?
I can think of 2 options (let's assume my website is called: blue-sky.com)
1. Create 2 AWS accounts: blue-sky and test-blue-sky (create a Test user under the Test account and a Prod user under the Prod account).
2. Create 1 AWS account: blue-sky, with 2 users, Prod and Test. The Test user has access to the Test servers and the Prod user has access to the Prod servers.
Which is better? Any alternative approach?
Both of the above scenarios have the same drawback. Say our release manager has access to both the Test and Prod servers. He wants to tear down a Test server, but since he holds credentials for both environments, he mistakenly logs in with his Prod credentials and deletes a Prod server.
If you want to do it right, take it one step further.
First about your user accounts.
Have 3 accounts - login, test and prod. All your IAM users are only in the login account and they all have MFA enabled for security.
In test and prod accounts you only have IAM Roles that can be assumed from the login account.
Check out Cross-account access for more details.
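As a sketch of that setup (the account ID 111111111111 is a placeholder for your login account), the trust policy on a role in the test or prod account could look something like this, requiring MFA on the user who assumes it:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::111111111111:root" },
      "Action": "sts:AssumeRole",
      "Condition": { "Bool": { "aws:MultiFactorAuthPresent": "true" } }
    }
  ]
}
```

Users in the login account then switch into the role via the console's "Switch Role" feature or `aws sts assume-role`, so no long-lived credentials exist in test or prod at all.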
Next your deployments.
Don't worry about mixing up Prod and Test credentials - your Prod deployments should be fully automated through CI/CD, so you should never need to log in to the Prod account to deploy manually.
Take advantage of CloudFormation together with AWS CodePipeline, CodeBuild and CodeDeploy to automate your builds and deployments. Or use other tools like Jenkins, GoCD, etc.; it doesn't matter, as long as deployments to both Test and Prod are done from the same CI/CD pipeline.
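As one rough example of such a pipeline step, here is a minimal CodeBuild buildspec (the file names `run_tests.sh` and `template.yml`, and the stack name, are placeholders for your own) that runs tests and then deploys the same CloudFormation template to whichever account the build role points at:

```yaml
version: 0.2
phases:
  install:
    runtime-versions:
      python: 3.12
  build:
    commands:
      # Run the test suite first; a non-zero exit code fails the build
      # and stops the pipeline before anything is deployed.
      - ./run_tests.sh
      # Deploy the same template in Test and Prod; only the target
      # account/role differs between pipeline stages.
      - aws cloudformation deploy
          --template-file template.yml
          --stack-name blue-sky-app
          --no-fail-on-empty-changeset
```

Because the identical template flows through both stages, whatever reached Prod is exactly what was already exercised in Test.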
There will indeed be cases in Test where you'll have to log in to the console to test or debug something. But any changes will be ironed out in Test before being promoted to Prod through your automation.
That way you will have 100% consistent and repeatable deployments, and you won't have to worry about a mistake taking your Production environment down.
Hope that helps :)
Short general answer: one account per environment lowers risk, reduces the chances of hitting service limits, and gives the best isolation.
--
Longer answer: There's no one answer to this question, but here are a few options:
- One account per application per environment. Best isolation for workloads and users, lowest risk, lowest chance of a busy test account hitting AWS limits. It takes more effort to secure and monitor everything, and can cost more for networking.
- One account per application, one VPC per environment. Good inter-application isolation, but the environments within an application are exposed to each other's user issues, limits, etc. You can restrict user access with tags and IAM policies, but it's more work. Same comment on costs as above, though slightly less.
- One account, one VPC per application (or per application per environment). Easy to set up, but it takes more work to achieve good isolation, and the risk of hitting limits is even higher.
- One account, one VPC, multiple applications. You can see where this is going - the risk rises again.
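To illustrate the tag-based workaround mentioned above, here is a sketch of an IAM policy (the `Environment` tag key is an assumption - substitute whatever tagging convention you use) that lets a Test user start, stop and terminate only instances tagged as test:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "ec2:StartInstances",
        "ec2:StopInstances",
        "ec2:TerminateInstances"
      ],
      "Resource": "*",
      "Condition": {
        "StringEquals": { "aws:ResourceTag/Environment": "test" }
      }
    }
  ]
}
```

This only works if tagging is enforced consistently, which is exactly the extra work referred to above.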
In all scenarios there's a risk of a careless user deleting the wrong resource. Automated deployments, as MLu suggested, are one way to mitigate that risk.
Any important deployment to any server or cloud should have a good backup strategy. Some financial regulations mandate that all data stored in the cloud also be stored in a second location, such as on-premise in an organisation-owned data centre. For my own personal deployments I back up to S3 nightly, sync S3 to my PC, and take an incremental backup that is stored offsite.
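For the "second location" requirement, one option is S3 cross-region replication. A sketch of a replication configuration (the role ARN, prefix and bucket names are placeholders, and the destination bucket must have versioning enabled) might be:

```json
{
  "Role": "arn:aws:iam::111111111111:role/s3-replication-role",
  "Rules": [
    {
      "ID": "replicate-backups",
      "Status": "Enabled",
      "Priority": 1,
      "Filter": { "Prefix": "backups/" },
      "DeleteMarkerReplication": { "Status": "Disabled" },
      "Destination": { "Bucket": "arn:aws:s3:::blue-sky-backup-replica" }
    }
  ]
}
```

That covers a second cloud location; regulations that demand an on-premise copy still require something like the nightly sync described above.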