Question:
Our CloudFormation templates are stored in GitHub. Inside CodePipeline we’re using GitHub as our Source, but we can’t reference nested CloudFormation Stacks when they’re not stored on S3.
How can we reference CloudFormation nested Stacks when using GitHub as our source in CodePipeline?
If this is not possible, how can we upload the CloudFormation templates from GitHub to S3 between the Source Stage and the Deploy Stage in CodePipeline?
Answer:
There are two approaches I can think of to reference nested CloudFormation Stacks from a GitHub source for a CodePipeline deployment:
1. pre-commit Git hook
Add a pre-commit client-side Git hook that runs aws cloudformation package on your template, committing the generated template (with its rewritten S3 references) to your GitHub repository alongside the changes to the source template.
The benefit of this approach is that you can leverage the existing template-rewriting logic in aws cloudformation package, and you don't have to modify your existing CodePipeline configuration.
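As a rough sketch only (the bucket name and template filenames below are placeholders, not part of the published example), the hook could run the packaging step and stage the generated template:

#!/bin/sh
# .git/hooks/pre-commit -- minimal sketch, assuming a source template
# cfn-template-src.yml, a generated template cfn-template.yml, and an
# existing bucket my-artifact-bucket; substitute your own names.
set -e
aws cloudformation package \
  --template-file cfn-template-src.yml \
  --s3-bucket my-artifact-bucket \
  --output-template-file cfn-template.yml
# Stage the generated template so it is committed alongside the source change.
git add cfn-template.yml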
2. Lambda pipeline Stage
Add a Lambda-based pipeline Stage that extracts the specified nested-stack template file from the GitHub Source Artifact and uploads it to the S3 location referenced by the parent stack template.
The benefit of this approach is that the Pipeline remains entirely self-contained, with no extra pre-processing step required of the committer.
I've published a complete reference example implementation at wjordan/aws-codepipeline-nested-stack:
AWSTemplateFormatVersion: 2010-09-09
Description: Infrastructure Continuous Delivery with CodePipeline and CloudFormation, with a project containing a nested stack.
Parameters:
  ArtifactBucket:
    Type: String
    Description: Name of existing S3 bucket for storing pipeline artifacts
  StackFilename:
    Type: String
    Default: cfn-template.yml
    Description: CloudFormation stack template filename in the Git repo
  GitHubOwner:
    Type: String
    Description: GitHub repository owner
  GitHubRepo:
    Type: String
    Default: aws-codepipeline-nested-stack
    Description: GitHub repository name
  GitHubBranch:
    Type: String
    Default: master
    Description: GitHub repository branch
  GitHubToken:
    Type: String
    Description: GitHub repository OAuth token
  NestedStackFilename:
    Type: String
    Description: GitHub filename (and S3 Object Key) for nested stack template.
    Default: nested.yml
Resources:
  Pipeline:
    Type: AWS::CodePipeline::Pipeline
    Properties:
      RoleArn: !GetAtt [PipelineRole, Arn]
      ArtifactStore:
        Type: S3
        Location: !Ref ArtifactBucket
      Stages:
      - Name: Source
        Actions:
        - Name: Source
          ActionTypeId:
            Category: Source
            Owner: ThirdParty
            Version: 1
            Provider: GitHub
          Configuration:
            Owner: !Ref GitHubOwner
            Repo: !Ref GitHubRepo
            Branch: !Ref GitHubBranch
            OAuthToken: !Ref GitHubToken
          OutputArtifacts: [Name: Template]
          RunOrder: 1
      - Name: Deploy
        Actions:
        - Name: S3Upload
          ActionTypeId:
            Category: Invoke
            Owner: AWS
            Provider: Lambda
            Version: 1
          InputArtifacts: [Name: Template]
          Configuration:
            FunctionName: !Ref S3UploadObject
            UserParameters: !Ref NestedStackFilename
          RunOrder: 1
        - Name: Deploy
          RunOrder: 2
          ActionTypeId:
            Category: Deploy
            Owner: AWS
            Version: 1
            Provider: CloudFormation
          InputArtifacts: [Name: Template]
          Configuration:
            ActionMode: REPLACE_ON_FAILURE
            RoleArn: !GetAtt [CFNRole, Arn]
            StackName: !Ref AWS::StackName
            TemplatePath: !Sub "Template::${StackFilename}"
            Capabilities: CAPABILITY_IAM
            ParameterOverrides: !Sub |
              {
                "ArtifactBucket": "${ArtifactBucket}",
                "StackFilename": "${StackFilename}",
                "GitHubOwner": "${GitHubOwner}",
                "GitHubRepo": "${GitHubRepo}",
                "GitHubBranch": "${GitHubBranch}",
                "GitHubToken": "${GitHubToken}",
                "NestedStackFilename": "${NestedStackFilename}"
              }
  CFNRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Statement:
        - Action: ['sts:AssumeRole']
          Effect: Allow
          Principal: {Service: [cloudformation.amazonaws.com]}
        Version: '2012-10-17'
      Path: /
      ManagedPolicyArns:
        # TODO grant least privilege to only allow managing your CloudFormation stack resources
        - "arn:aws:iam::aws:policy/AdministratorAccess"
  PipelineRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Statement:
        - Action: ['sts:AssumeRole']
          Effect: Allow
          Principal: {Service: [codepipeline.amazonaws.com]}
        Version: '2012-10-17'
      Path: /
      Policies:
      - PolicyName: CodePipelineAccess
        PolicyDocument:
          Version: '2012-10-17'
          Statement:
          - Action:
            - 's3:*'
            - 'cloudformation:*'
            - 'iam:PassRole'
            - 'lambda:*'
            Effect: Allow
            Resource: '*'
  Dummy:
    Type: AWS::CloudFormation::WaitConditionHandle
  NestedStack:
    Type: AWS::CloudFormation::Stack
    Properties:
      TemplateURL: !Sub "https://s3.amazonaws.com/${ArtifactBucket}/${NestedStackFilename}"
  S3UploadObject:
    Type: AWS::Lambda::Function
    Properties:
      Description: Extracts and uploads the specified InputArtifact file to S3.
      Handler: index.handler
      Role: !GetAtt LambdaExecutionRole.Arn
      Code:
        ZipFile: !Sub |
          var exec = require('child_process').exec;
          var AWS = require('aws-sdk');
          var codePipeline = new AWS.CodePipeline();
          exports.handler = function(event, context, callback) {
            var job = event["CodePipeline.job"];
            var s3Download = new AWS.S3({
              credentials: job.data.artifactCredentials,
              signatureVersion: 'v4'
            });
            var s3Upload = new AWS.S3({
              signatureVersion: 'v4'
            });
            var jobId = job.id;
            function respond(e) {
              var params = {jobId: jobId};
              if (e) {
                params['failureDetails'] = {
                  message: JSON.stringify(e),
                  type: 'JobFailed',
                  externalExecutionId: context.invokeid
                };
                codePipeline.putJobFailureResult(params, (err, data) => callback(e));
              } else {
                codePipeline.putJobSuccessResult(params, (err, data) => callback(e));
              }
            }
            var filename = job.data.actionConfiguration.configuration.UserParameters;
            var location = job.data.inputArtifacts[0].location.s3Location;
            var bucket = location.bucketName;
            var key = location.objectKey;
            var tmpFile = '/tmp/file.zip';
            s3Download.getObject({Bucket: bucket, Key: key})
              .createReadStream()
              .pipe(require('fs').createWriteStream(tmpFile))
              .on('finish', ()=>{
                exec(`unzip -p ${!tmpFile} ${!filename}`, (err, stdout)=>{
                  if (err) { respond(err); }
                  s3Upload.putObject({Bucket: bucket, Key: filename, Body: stdout},
                    (err, data) => respond(err));
                });
              });
          };
      Timeout: 30
      Runtime: nodejs4.3
  LambdaExecutionRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
        - Effect: Allow
          Principal: {Service: [lambda.amazonaws.com]}
          Action: ['sts:AssumeRole']
      Path: /
      ManagedPolicyArns:
      - "arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole"
      - "arn:aws:iam::aws:policy/AWSCodePipelineCustomActionAccess"
      Policies:
      - PolicyName: S3Policy
        PolicyDocument:
          Version: '2012-10-17'
          Statement:
          - Effect: Allow
            Action:
            - 's3:PutObject'
            - 's3:PutObjectAcl'
            Resource: !Sub "arn:aws:s3:::${ArtifactBucket}/${NestedStackFilename}"
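If you want to try the reference example, the pipeline stack can be launched with the AWS CLI along the following lines. The template filename pipeline.yml, the stack name, and all parameter values here are placeholders, not values taken from the repository; substitute your own bucket, GitHub details, and OAuth token:

# Placeholder stack name, filename, and parameter values; CAPABILITY_IAM is
# required because the template creates IAM roles.
aws cloudformation create-stack \
  --stack-name nested-stack-pipeline \
  --template-body file://pipeline.yml \
  --capabilities CAPABILITY_IAM \
  --parameters \
    ParameterKey=ArtifactBucket,ParameterValue=my-artifact-bucket \
    ParameterKey=GitHubOwner,ParameterValue=my-github-user \
    ParameterKey=GitHubToken,ParameterValue=my-oauth-token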