
Package software.amazon.awscdk.services.codepipeline.actions

AWS CodePipeline Actions


This package contains Actions that can be used in a CodePipeline.

 import software.amazon.awscdk.services.codepipeline.*;
 import software.amazon.awscdk.services.codepipeline.actions.*;
 

Sources

AWS CodeCommit

To use a CodeCommit Repository in a CodePipeline:

 Repository repo = Repository.Builder.create(this, "Repo")
         .repositoryName("MyRepo")
         .build();
 
 Pipeline pipeline = Pipeline.Builder.create(this, "MyPipeline")
         .pipelineName("MyPipeline")
         .build();
 Artifact sourceOutput = new Artifact();
 CodeCommitSourceAction sourceAction = CodeCommitSourceAction.Builder.create()
         .actionName("CodeCommit")
         .repository(repo)
         .output(sourceOutput)
         .build();
 pipeline.addStage(StageOptions.builder()
         .stageName("Source")
         .actions(List.of(sourceAction))
         .build());
 

If you want to use an existing Role for the CloudWatch Event Rule created for the 'on commit' event, you can specify it in the eventRole property:

 Repository repo;
 IRole eventRole = Role.fromRoleArn(this, "Event-role", "roleArn");
 CodeCommitSourceAction sourceAction = CodeCommitSourceAction.Builder.create()
         .actionName("CodeCommit")
         .repository(repo)
         .output(new Artifact())
         .eventRole(eventRole)
         .build();
 

If you want to clone the entire CodeCommit repository (only available for CodeBuild actions), you can set the codeBuildCloneOutput property to true:

 PipelineProject project;
 Repository repo;
 
 Artifact sourceOutput = new Artifact();
 CodeCommitSourceAction sourceAction = CodeCommitSourceAction.Builder.create()
         .actionName("CodeCommit")
         .repository(repo)
         .output(sourceOutput)
         .codeBuildCloneOutput(true)
         .build();
 
 CodeBuildAction buildAction = CodeBuildAction.Builder.create()
         .actionName("CodeBuild")
         .project(project)
         .input(sourceOutput) // The build action must use the CodeCommitSourceAction output as input.
         .outputs(List.of(new Artifact()))
         .build();
 

The CodeCommit source action emits variables:

 PipelineProject project;
 Repository repo;
 
 Artifact sourceOutput = new Artifact();
 CodeCommitSourceAction sourceAction = CodeCommitSourceAction.Builder.create()
         .actionName("CodeCommit")
         .repository(repo)
         .output(sourceOutput)
         .variablesNamespace("MyNamespace")
         .build();
 
 // later:
 CodeBuildAction.Builder.create()
         .actionName("CodeBuild")
         .project(project)
         .input(sourceOutput)
         .environmentVariables(Map.of(
                 "COMMIT_ID", BuildEnvironmentVariable.builder()
                         .value(sourceAction.getVariables().getCommitId())
                         .build()))
         .build();
 

GitHub

If you want to use a GitHub repository as the source, you must create:

- A GitHub Access Token, with scopes repo and admin:repo_hook.
- A Secrets Manager secret containing the value of the GitHub Access Token.

To use GitHub as the source of a CodePipeline:

 // Read the secret from Secrets Manager
 Pipeline pipeline = new Pipeline(this, "MyPipeline");
 Artifact sourceOutput = new Artifact();
 GitHubSourceAction sourceAction = GitHubSourceAction.Builder.create()
         .actionName("GitHub_Source")
         .owner("awslabs")
         .repo("aws-cdk")
         .oauthToken(SecretValue.secretsManager("my-github-token"))
         .output(sourceOutput)
         .branch("develop")
         .build();
 pipeline.addStage(StageOptions.builder()
         .stageName("Source")
         .actions(List.of(sourceAction))
         .build());
 

The GitHub source action emits variables:

 Artifact sourceOutput;
 PipelineProject project;
 
 
 GitHubSourceAction sourceAction = GitHubSourceAction.Builder.create()
         .actionName("Github_Source")
         .output(sourceOutput)
         .owner("my-owner")
         .repo("my-repo")
         .oauthToken(SecretValue.secretsManager("my-github-token"))
         .variablesNamespace("MyNamespace")
         .build();
 
 // later:
 CodeBuildAction.Builder.create()
         .actionName("CodeBuild")
         .project(project)
         .input(sourceOutput)
         .environmentVariables(Map.of(
                 "COMMIT_URL", BuildEnvironmentVariable.builder()
                         .value(sourceAction.getVariables().getCommitUrl())
                         .build()))
         .build();
 

BitBucket

CodePipeline can use a BitBucket Git repository as a source:

Note: you have to manually connect CodePipeline through the AWS Console with your BitBucket account. This is a one-time operation for a given AWS account in a given region. The simplest way to do that is to either start creating a new CodePipeline, or edit an existing one, while being logged in to BitBucket. Choose BitBucket as the source, and grant CodePipeline permissions to your BitBucket account. Copy & paste the Connection ARN that you get in the console, or use the codestar-connections list-connections AWS CLI operation to find it. After that, you can safely abort creating or editing the pipeline - the connection has already been created.

 Artifact sourceOutput = new Artifact();
 CodeStarConnectionsSourceAction sourceAction = CodeStarConnectionsSourceAction.Builder.create()
         .actionName("BitBucket_Source")
         .owner("aws")
         .repo("aws-cdk")
         .output(sourceOutput)
         .connectionArn("arn:aws:codestar-connections:us-east-1:123456789012:connection/12345678-abcd-12ab-34cdef5678gh")
         .build();
 

You can also use the CodeStarConnectionsSourceAction to connect to GitHub, in the same way (you just have to select GitHub as the source when creating the connection in the console).

Similarly to GitHubSourceAction, CodeStarConnectionsSourceAction also emits the variables:

 Project project;
 
 
 Artifact sourceOutput = new Artifact();
 CodeStarConnectionsSourceAction sourceAction = CodeStarConnectionsSourceAction.Builder.create()
         .actionName("BitBucket_Source")
         .owner("aws")
         .repo("aws-cdk")
         .output(sourceOutput)
         .connectionArn("arn:aws:codestar-connections:us-east-1:123456789012:connection/12345678-abcd-12ab-34cdef5678gh")
         .variablesNamespace("SomeSpace")
         .build();
 
 // later:
 CodeBuildAction.Builder.create()
         .actionName("CodeBuild")
         .project(project)
         .input(sourceOutput)
         .environmentVariables(Map.of(
                 "COMMIT_ID", BuildEnvironmentVariable.builder()
                         .value(sourceAction.getVariables().getCommitId())
                         .build()))
         .build();
 

AWS S3 Source

To use an S3 Bucket as a source in CodePipeline:

 Bucket sourceBucket = Bucket.Builder.create(this, "MyBucket")
         .versioned(true)
         .build();
 
 Pipeline pipeline = new Pipeline(this, "MyPipeline");
 Artifact sourceOutput = new Artifact();
 S3SourceAction sourceAction = S3SourceAction.Builder.create()
         .actionName("S3Source")
         .bucket(sourceBucket)
         .bucketKey("path/to/file.zip")
         .output(sourceOutput)
         .build();
 pipeline.addStage(StageOptions.builder()
         .stageName("Source")
         .actions(List.of(sourceAction))
         .build());
 

The region of the action will be determined by the region the bucket itself is in. When using a newly created bucket, that region will be taken from the stack the bucket belongs to; for an imported bucket, you can specify the region explicitly:

 IBucket sourceBucket = Bucket.fromBucketAttributes(this, "SourceBucket", BucketAttributes.builder()
         .bucketName("my-bucket")
         .region("ap-southeast-1")
         .build());
 

By default, the Pipeline will poll the Bucket to detect changes. You can change that behavior to use CloudWatch Events by setting the trigger property to S3Trigger.EVENTS (it's S3Trigger.POLL by default). If you do that, make sure the source Bucket is part of an AWS CloudTrail Trail - otherwise, the CloudWatch Events will not be emitted, and your Pipeline will not react to changes in the Bucket. You can do it through the CDK:

 import software.amazon.awscdk.services.cloudtrail.*;
 
 Bucket sourceBucket;
 
 Artifact sourceOutput = new Artifact();
 String key = "some/key.zip";
 Trail trail = new Trail(this, "CloudTrail");
 trail.addS3EventSelector(List.of(S3EventSelector.builder()
         .bucket(sourceBucket)
         .objectPrefix(key)
         .build()), AddEventSelectorOptions.builder()
         .readWriteType(ReadWriteType.WRITE_ONLY)
         .build());
 S3SourceAction sourceAction = S3SourceAction.Builder.create()
         .actionName("S3Source")
         .bucketKey(key)
         .bucket(sourceBucket)
         .output(sourceOutput)
         .trigger(S3Trigger.EVENTS)
         .build();
 

The S3 source action emits variables:

 Bucket sourceBucket;
 
 PipelineProject project;
 String key = "some/key.zip";
 Artifact sourceOutput = new Artifact();
 S3SourceAction sourceAction = S3SourceAction.Builder.create()
         .actionName("S3Source")
         .bucketKey(key)
         .bucket(sourceBucket)
         .output(sourceOutput)
         .variablesNamespace("MyNamespace")
         .build();
 
 // later:
 CodeBuildAction.Builder.create()
         .actionName("CodeBuild")
         .project(project)
         .input(sourceOutput)
         .environmentVariables(Map.of(
                 "VERSION_ID", BuildEnvironmentVariable.builder()
                         .value(sourceAction.getVariables().getVersionId())
                         .build()))
         .build();
 

AWS ECR

To use an ECR Repository as a source in a Pipeline:

 import software.amazon.awscdk.services.ecr.*;
 
 Repository ecrRepository;
 
 Pipeline pipeline = new Pipeline(this, "MyPipeline");
 Artifact sourceOutput = new Artifact();
 EcrSourceAction sourceAction = EcrSourceAction.Builder.create()
         .actionName("ECR")
         .repository(ecrRepository)
         .imageTag("some-tag") // optional, default: 'latest'
         .output(sourceOutput)
         .build();
 pipeline.addStage(StageOptions.builder()
         .stageName("Source")
         .actions(List.of(sourceAction))
         .build());
 

The ECR source action emits variables:

 import software.amazon.awscdk.services.ecr.*;
 Repository ecrRepository;
 
 PipelineProject project;
 
 
 Artifact sourceOutput = new Artifact();
 EcrSourceAction sourceAction = EcrSourceAction.Builder.create()
         .actionName("Source")
         .output(sourceOutput)
         .repository(ecrRepository)
         .variablesNamespace("MyNamespace")
         .build();
 
 // later:
 CodeBuildAction.Builder.create()
         .actionName("CodeBuild")
         .project(project)
         .input(sourceOutput)
         .environmentVariables(Map.of(
                 "IMAGE_URI", BuildEnvironmentVariable.builder()
                         .value(sourceAction.getVariables().getImageUri())
                         .build()))
         .build();
 

Build & test

AWS CodeBuild

Example of a CodeBuild Project used in a Pipeline, alongside CodeCommit:

 Repository repository = Repository.Builder.create(this, "MyRepository")
         .repositoryName("MyRepository")
         .build();
 PipelineProject project = new PipelineProject(this, "MyProject");
 
 Artifact sourceOutput = new Artifact();
 CodeCommitSourceAction sourceAction = CodeCommitSourceAction.Builder.create()
         .actionName("CodeCommit")
         .repository(repository)
         .output(sourceOutput)
         .build();
 CodeBuildAction buildAction = CodeBuildAction.Builder.create()
         .actionName("CodeBuild")
         .project(project)
         .input(sourceOutput)
         .outputs(List.of(new Artifact())) // optional
         .executeBatchBuild(true) // optional, defaults to false
         .combineBatchBuildArtifacts(true)
         .build();
 
 Pipeline.Builder.create(this, "MyPipeline")
         .stages(List.of(StageProps.builder()
                 .stageName("Source")
                 .actions(List.of(sourceAction))
                 .build(), StageProps.builder()
                 .stageName("Build")
                 .actions(List.of(buildAction))
                 .build()))
         .build();
 

The default category of the CodeBuild Action is Build; if you want a Test Action instead, override the type property:

 PipelineProject project;
 
 Artifact sourceOutput = new Artifact();
 CodeBuildAction testAction = CodeBuildAction.Builder.create()
         .actionName("IntegrationTest")
         .project(project)
         .input(sourceOutput)
         .type(CodeBuildActionType.TEST)
         .build();
 

Multiple inputs and outputs

When you want to have multiple inputs and/or outputs for a Project used in a Pipeline, instead of using the secondarySources and secondaryArtifacts properties of the Project class, you need to use the extraInputs and outputs properties of the CodeBuild CodePipeline Actions. Example:

 Repository repository1;
 Repository repository2;
 
 PipelineProject project;
 
 Artifact sourceOutput1 = new Artifact();
 CodeCommitSourceAction sourceAction1 = CodeCommitSourceAction.Builder.create()
         .actionName("Source1")
         .repository(repository1)
         .output(sourceOutput1)
         .build();
 Artifact sourceOutput2 = new Artifact("source2");
 CodeCommitSourceAction sourceAction2 = CodeCommitSourceAction.Builder.create()
         .actionName("Source2")
         .repository(repository2)
         .output(sourceOutput2)
         .build();
 CodeBuildAction buildAction = CodeBuildAction.Builder.create()
         .actionName("Build")
         .project(project)
         .input(sourceOutput1)
         .extraInputs(List.of(sourceOutput2))
         .outputs(List.of(
             new Artifact("artifact1"),  // for better buildspec readability - see below
             new Artifact("artifact2")))
         .build();
 

Note: when a CodeBuild Action in a Pipeline has more than one output, it only uses the secondary-artifacts field of the buildspec, never the primary output specification directly under artifacts. Because of that, it pays to explicitly name all output artifacts of that Action, like we did above, so that you know what name to use in the buildspec.

Example buildspec for the above project:

 PipelineProject project = PipelineProject.Builder.create(this, "MyProject")
         .buildSpec(BuildSpec.fromObject(Map.of(
                 "version", "0.2",
                 "phases", Map.of(
                         "build", Map.of(
                                 "commands", List.of())),
                 "artifacts", Map.of(
                         "secondary-artifacts", Map.of(
                                 "artifact1", Map.of(),
                                 "artifact2", Map.of())))))
         .build();
 

Variables

The CodeBuild action emits variables. Unlike many other actions, the variables are not static, but dynamic, defined in the buildspec, in the 'exported-variables' subsection of the 'env' section. Example:

 PipelineProject project;
 Artifact sourceOutput = new Artifact();
 CodeBuildAction buildAction = CodeBuildAction.Builder.create()
         .actionName("Build1")
         .input(sourceOutput)
         .project(PipelineProject.Builder.create(this, "Project")
                 .buildSpec(BuildSpec.fromObject(Map.of(
                         "version", "0.2",
                         "env", Map.of(
                                 "exported-variables", List.of("MY_VAR")),
                         "phases", Map.of(
                                 "build", Map.of(
                                         "commands", "export MY_VAR=\"some value\"")))))
                 .build())
         .variablesNamespace("MyNamespace")
         .build();
 
 // later:
 CodeBuildAction.Builder.create()
         .actionName("CodeBuild")
         .project(project)
         .input(sourceOutput)
         .environmentVariables(Map.of(
                 "MyVar", BuildEnvironmentVariable.builder()
                         .value(buildAction.variable("MY_VAR"))
                         .build()))
         .build();
 

Jenkins

In order to use Jenkins Actions in the Pipeline, you first need to create a JenkinsProvider:

 JenkinsProvider jenkinsProvider = JenkinsProvider.Builder.create(this, "JenkinsProvider")
         .providerName("MyJenkinsProvider")
         .serverUrl("http://my-jenkins.com:8080")
         .version("2")
         .build();
 

If you've registered a Jenkins provider in a different CDK app, or outside the CDK (in the CodePipeline AWS Console, for example), you can import it:

 IJenkinsProvider jenkinsProvider = JenkinsProvider.fromJenkinsProviderAttributes(this, "JenkinsProvider", JenkinsProviderAttributes.builder()
         .providerName("MyJenkinsProvider")
         .serverUrl("http://my-jenkins.com:8080")
         .version("2")
         .build());
 

Note that a Jenkins provider (identified by the tuple of provider name, category (Build or Test), and version) must always be registered in the given account, in the given AWS region, before it can be used in CodePipeline.

With a JenkinsProvider, we can create a Jenkins Action:

 JenkinsProvider jenkinsProvider;
 
 JenkinsAction buildAction = JenkinsAction.Builder.create()
         .actionName("JenkinsBuild")
         .jenkinsProvider(jenkinsProvider)
         .projectName("MyProject")
         .type(JenkinsActionType.BUILD)
         .build();
 

Deploy

AWS CloudFormation

This module contains Actions that allow you to deploy to CloudFormation from AWS CodePipeline.

For example, the following code fragment defines a pipeline that automatically deploys a CloudFormation template directly from a CodeCommit repository, with a manual approval step in between to confirm the changes:

example Pipeline to deploy CloudFormation
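The original example was stripped during extraction; the pattern it describes can be sketched as follows. The repository, stack, and ChangeSet names here are illustrative, not taken from the original:

```java
Repository repo = Repository.Builder.create(this, "Repo")
        .repositoryName("TemplateRepo")
        .build();
Artifact sourceOutput = new Artifact();
CodeCommitSourceAction sourceAction = CodeCommitSourceAction.Builder.create()
        .actionName("Source")
        .repository(repo)
        .output(sourceOutput)
        .build();

Pipeline.Builder.create(this, "Pipeline")
        .stages(List.of(StageProps.builder()
                .stageName("Source")
                .actions(List.of(sourceAction))
                .build(), StageProps.builder()
                .stageName("Deploy")
                .actions(List.of(
                    // prepare a ChangeSet from the template in the source output
                    CloudFormationCreateReplaceChangeSetAction.Builder.create()
                            .actionName("PrepareChanges")
                            .stackName("MyStack")
                            .changeSetName("PipelineChange")
                            .adminPermissions(true)
                            .templatePath(sourceOutput.atPath("template.yaml"))
                            .runOrder(1)
                            .build(),
                    // a human must approve the ChangeSet before it is executed
                    ManualApprovalAction.Builder.create()
                            .actionName("ApproveChanges")
                            .runOrder(2)
                            .build(),
                    CloudFormationExecuteChangeSetAction.Builder.create()
                            .actionName("ExecuteChanges")
                            .stackName("MyStack")
                            .changeSetName("PipelineChange")
                            .runOrder(3)
                            .build()))
                .build()))
        .build();
```

Using a ChangeSet pair (create, then execute after approval) lets the approver inspect the exact resource changes in the CloudFormation console before they are applied.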

See the AWS documentation for more details about using CloudFormation in CodePipeline.

Actions defined by this package

This package contains the following CloudFormation actions:

- CloudFormationCreateReplaceChangeSetAction
- CloudFormationExecuteChangeSetAction
- CloudFormationCreateUpdateStackAction
- CloudFormationDeleteStackAction

Lambda deployed through CodePipeline

If you want to deploy your Lambda through CodePipeline, and you don't use assets (for example, because your CDK code and Lambda code are separate), you can use a special Lambda Code class, CfnParametersCode. Note that your Lambda must be in a different Stack than your Pipeline. The Lambda itself will be deployed, alongside the entire Stack it belongs to, using a CloudFormation CodePipeline Action. Example:

Example of deploying a Lambda through CodePipeline
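The original example was stripped during extraction; a condensed sketch of the pattern follows. The variable names, template file name, and stack names are assumptions for illustration:

```java
// in the Lambda Stack (a different Stack than the Pipeline's):
CfnParametersCode lambdaCode = Code.fromCfnParameters();
Function.Builder.create(lambdaStack, "Lambda")
        .code(lambdaCode)
        .handler("index.handler")
        .runtime(Runtime.NODEJS_12_X)
        .build();

// in the Pipeline Stack, after the build Stage has produced the two artifacts:
Artifact cdkBuildOutput = new Artifact();    // contains the synthesized template
Artifact lambdaBuildOutput = new Artifact(); // contains the zipped Lambda code
CloudFormationCreateUpdateStackAction.Builder.create()
        .actionName("Lambda_CFN_Deploy")
        .templatePath(cdkBuildOutput.atPath("LambdaStack.template.json"))
        .stackName("LambdaStackDeployedName")
        .adminPermissions(true)
        // fills the template's Parameters with the S3 location of the Lambda code
        .parameterOverrides(lambdaCode.assign(lambdaBuildOutput.getS3Location()))
        .extraInputs(List.of(lambdaBuildOutput))
        .build();
```

The assign call resolves the CloudFormation Parameters that CfnParametersCode created, so the deployed Function picks up whatever code the Pipeline just built.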

Cross-account actions

If you want to update stacks in a different account, pass the account property when creating the action:

 Artifact sourceOutput = new Artifact();
 CloudFormationCreateUpdateStackAction.Builder.create()
         .actionName("CloudFormationCreateUpdate")
         .stackName("MyStackName")
         .adminPermissions(true)
         .templatePath(sourceOutput.atPath("template.yaml"))
         .account("123456789012")
         .build();
 

This will create a new stack, called <PipelineStackName>-support-123456789012, in your App, that will contain the role that the pipeline will assume in account 123456789012 before executing this action. This support stack will automatically be deployed before the stack containing the pipeline.

You can also pass a role explicitly when creating the action - in that case, the account property is ignored, and the action will operate in the same account the role belongs to:

 import software.amazon.awscdk.core.PhysicalName;
 
 // in stack for account 123456789012...
 Stack otherAccountStack;
 
 Role actionRole = Role.Builder.create(otherAccountStack, "ActionRole")
         .assumedBy(new AccountPrincipal("123456789012"))
         // the role has to have a physical name set
         .roleName(PhysicalName.GENERATE_IF_NEEDED)
         .build();
 
 // in the pipeline stack...
 Artifact sourceOutput = new Artifact();
 CloudFormationCreateUpdateStackAction.Builder.create()
         .actionName("CloudFormationCreateUpdate")
         .stackName("MyStackName")
         .adminPermissions(true)
         .templatePath(sourceOutput.atPath("template.yaml"))
         .role(actionRole)
         .build();
 

AWS CodeDeploy

Server deployments

To use CodeDeploy for EC2/on-premise deployments in a Pipeline:

 ServerDeploymentGroup deploymentGroup;
 Pipeline pipeline = Pipeline.Builder.create(this, "MyPipeline")
         .pipelineName("MyPipeline")
         .build();
 
 // add the source and build Stages to the Pipeline...
 Artifact buildOutput = new Artifact();
 CodeDeployServerDeployAction deployAction = CodeDeployServerDeployAction.Builder.create()
         .actionName("CodeDeploy")
         .input(buildOutput)
         .deploymentGroup(deploymentGroup)
         .build();
 pipeline.addStage(StageOptions.builder()
         .stageName("Deploy")
         .actions(List.of(deployAction))
         .build());
 

Lambda deployments

To use CodeDeploy for blue-green Lambda deployments in a Pipeline:

 CfnParametersCode lambdaCode = Code.fromCfnParameters();
 Function func = Function.Builder.create(this, "Lambda")
         .code(lambdaCode)
         .handler("index.handler")
         .runtime(Runtime.NODEJS_12_X)
         .build();
 // used to make sure each CDK synthesis produces a different Version
 Version version = func.addVersion("NewVersion");
 Alias alias = Alias.Builder.create(this, "LambdaAlias")
         .aliasName("Prod")
         .version(version)
         .build();
 
 LambdaDeploymentGroup.Builder.create(this, "DeploymentGroup")
         .alias(alias)
         .deploymentConfig(LambdaDeploymentConfig.LINEAR_10PERCENT_EVERY_1MINUTE)
         .build();
 

Then, you need to create your Pipeline Stack, where you will define your Pipeline, and deploy the lambdaStack using a CloudFormation CodePipeline Action (see above for a complete example).

ECS

CodePipeline can deploy an ECS service. The deploy Action receives one input Artifact which contains the image definition file:

 import software.amazon.awscdk.core.*;
 
 FargateService service;
 
 Pipeline pipeline = new Pipeline(this, "MyPipeline");
 Artifact buildOutput = new Artifact();
 IStage deployStage = pipeline.addStage(StageOptions.builder()
         .stageName("Deploy")
         .actions(List.of(
             EcsDeployAction.Builder.create()
                     .actionName("DeployAction")
                     .service(service)
                     // if your file is called imagedefinitions.json,
                     // use the `input` property,
                     // and leave out the `imageFile` property
                     .input(buildOutput)
                     // if your file name is _not_ imagedefinitions.json,
                     // use the `imageFile` property,
                     // and leave out the `input` property
                     .imageFile(buildOutput.atPath("imageDef.json"))
                     .deploymentTimeout(Duration.minutes(60))
                     .build()))
         .build());
 

Deploying ECS applications stored in a separate source code repository

The idiomatic CDK way of deploying an ECS application is to have your Dockerfiles and your CDK code in the same source code repository, leveraging Docker Assets, and use the CDK Pipelines module.

However, if you want to deploy a Docker application whose source code is kept in a separate version control repository than the CDK code, you can use the TagParameterContainerImage class from the ECS module. Here's an example:

example ECS pipeline for an application in a separate source code repository
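The original example was stripped during extraction; a heavily condensed sketch follows. The names, the dockerBuildAction variable, and the exported IMAGE_TAG buildspec variable are assumptions; see the ECS module documentation for the full pattern:

```java
// in the application Stack: reference the image through a CloudFormation
// Parameter instead of a Docker Asset
TagParameterContainerImage appImage = new TagParameterContainerImage(ecrRepo);
FargateTaskDefinition taskDefinition = new FargateTaskDefinition(appStack, "TaskDef");
taskDefinition.addContainer("AppContainer", ContainerDefinitionOptions.builder()
        .image(appImage)
        .build());

// in the Pipeline Stack: deploy the application Stack with a CloudFormation
// action, passing the tag the Docker build pushed (exported as IMAGE_TAG
// in the build action's buildspec) as the image-tag Parameter
CloudFormationCreateUpdateStackAction.Builder.create()
        .actionName("CFN_Deploy")
        .stackName("AppStackDeployedName")
        .adminPermissions(true)
        .templatePath(cdkBuildOutput.atPath("AppStack.template.json"))
        .parameterOverrides(Map.of(
                appImage.getTagParameterName(), dockerBuildAction.variable("IMAGE_TAG")))
        .build();
```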

AWS S3 Deployment

To use an S3 Bucket as a deployment target in CodePipeline:

 Artifact sourceOutput = new Artifact();
 Bucket targetBucket = new Bucket(this, "MyBucket");
 
 Pipeline pipeline = new Pipeline(this, "MyPipeline");
 S3DeployAction deployAction = S3DeployAction.Builder.create()
         .actionName("S3Deploy")
         .bucket(targetBucket)
         .input(sourceOutput)
         .build();
 IStage deployStage = pipeline.addStage(StageOptions.builder()
         .stageName("Deploy")
         .actions(List.of(deployAction))
         .build());
 

Invalidating the CloudFront cache when deploying to S3

There is currently no native support in CodePipeline for invalidating a CloudFront cache after deployment. One workaround is to add another build step after the deploy step, and use the AWS CLI to invalidate the cache:

 // Create a Cloudfront Web Distribution
 import software.amazon.awscdk.services.cloudfront.*;
 Distribution distribution;
 
 
 // Create the build project that will invalidate the cache
 PipelineProject invalidateBuildProject = PipelineProject.Builder.create(this, "InvalidateProject")
         .buildSpec(BuildSpec.fromObject(Map.of(
                 "version", "0.2",
                 "phases", Map.of(
                         "build", Map.of(
                                 "commands", List.of("aws cloudfront create-invalidation --distribution-id ${CLOUDFRONT_ID} --paths \"/*\""))))))
         .environmentVariables(Map.of(
                 "CLOUDFRONT_ID", BuildEnvironmentVariable.builder().value(distribution.getDistributionId()).build()))
         .build();
 
 // Add Cloudfront invalidation permissions to the project
 String distributionArn = String.format("arn:aws:cloudfront::%s:distribution/%s", this.getAccount(), distribution.getDistributionId());
 invalidateBuildProject.addToRolePolicy(PolicyStatement.Builder.create()
         .resources(List.of(distributionArn))
         .actions(List.of("cloudfront:CreateInvalidation"))
         .build());
 
 // Create the pipeline (here only the S3 deploy and Invalidate cache build)
 Bucket deployBucket = new Bucket(this, "DeployBucket");
 Artifact deployInput = new Artifact();
 Pipeline.Builder.create(this, "Pipeline")
         .stages(List.of(StageProps.builder()
                 .stageName("Deploy")
                 .actions(List.of(
                     S3DeployAction.Builder.create()
                             .actionName("S3Deploy")
                             .bucket(deployBucket)
                             .input(deployInput)
                             .runOrder(1)
                             .build(),
                     CodeBuildAction.Builder.create()
                             .actionName("InvalidateCache")
                             .project(invalidateBuildProject)
                             .input(deployInput)
                             .runOrder(2)
                             .build()))
                 .build()))
         .build();
 

Alexa Skill

You can deploy to Alexa using CodePipeline with the following Action:

 // Read the secrets from Secrets Manager
 SecretValue clientId = SecretValue.secretsManager("AlexaClientId");
 SecretValue clientSecret = SecretValue.secretsManager("AlexaClientSecret");
 SecretValue refreshToken = SecretValue.secretsManager("AlexaRefreshToken");
 
 // Add deploy action
 Artifact sourceOutput = new Artifact();
 AlexaSkillDeployAction.Builder.create()
         .actionName("DeploySkill")
         .runOrder(1)
         .input(sourceOutput)
         .clientId(clientId.toString())
         .clientSecret(clientSecret)
         .refreshToken(refreshToken)
         .skillId("amzn1.ask.skill.12345678-1234-1234-1234-123456789012")
         .build();
 

If you need manifest overrides you can specify them as parameterOverridesArtifact in the action:

 // Deploy some CFN change set and store output
 Artifact executeOutput = new Artifact("CloudFormation");
 CloudFormationExecuteChangeSetAction executeChangeSetAction = CloudFormationExecuteChangeSetAction.Builder.create()
         .actionName("ExecuteChangesTest")
         .runOrder(2)
         .stackName("MyStack")
         .changeSetName("MyChangeSet")
         .outputFileName("overrides.json")
         .output(executeOutput)
         .build();
 
 // Provide CFN output as manifest overrides
 SecretValue clientId = SecretValue.secretsManager("AlexaClientId");
 SecretValue clientSecret = SecretValue.secretsManager("AlexaClientSecret");
 SecretValue refreshToken = SecretValue.secretsManager("AlexaRefreshToken");
 Artifact sourceOutput = new Artifact();
 AlexaSkillDeployAction.Builder.create()
         .actionName("DeploySkill")
         .runOrder(1)
         .input(sourceOutput)
         .parameterOverridesArtifact(executeOutput)
         .clientId(clientId.toString())
         .clientSecret(clientSecret)
         .refreshToken(refreshToken)
         .skillId("amzn1.ask.skill.12345678-1234-1234-1234-123456789012")
         .build();
 

AWS Service Catalog

You can deploy a CloudFormation template to an existing Service Catalog product with the following Action:

 Artifact cdkBuildOutput = new Artifact();
 ServiceCatalogDeployActionBeta1 serviceCatalogDeployAction = ServiceCatalogDeployActionBeta1.Builder.create()
         .actionName("ServiceCatalogDeploy")
         .templatePath(cdkBuildOutput.atPath("Sample.template.json"))
         .productVersionName("Version - " + new Date())
         .productVersionDescription("This is a version from the pipeline with a new description.")
         .productId("prod-XXXXXXXX")
         .build();
 

Approve & invoke

Manual approval Action

This package contains an Action that stops the Pipeline until someone manually clicks the approve button:

 import software.amazon.awscdk.services.sns.*;
 
 
 Pipeline pipeline = new Pipeline(this, "MyPipeline");
 IStage approveStage = pipeline.addStage(StageOptions.builder().stageName("Approve").build());
 ManualApprovalAction manualApprovalAction = ManualApprovalAction.Builder.create()
         .actionName("Approve")
         .notificationTopic(new Topic(this, "Topic")) // optional
         .notifyEmails(List.of("some_email@example.com")) // optional
         .additionalInformation("additional info")
         .build();
 approveStage.addAction(manualApprovalAction);
 

If the notificationTopic has not been provided, but notifyEmails were, a new SNS Topic will be created (and accessible through the notificationTopic property of the Action).
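For instance, you can read the automatically-created Topic back from the Action after it has been added to a Stage (a sketch; getNotificationTopic is the Java getter for the notificationTopic property):

```java
Pipeline pipeline = new Pipeline(this, "MyPipeline");
IStage approveStage = pipeline.addStage(StageOptions.builder().stageName("Approve").build());
ManualApprovalAction manualApprovalAction = ManualApprovalAction.Builder.create()
        .actionName("Approve")
        .notifyEmails(List.of("some_email@example.com"))
        .build();
approveStage.addAction(manualApprovalAction);

// the SNS Topic that was created for the notifyEmails subscription:
ITopic topic = manualApprovalAction.getNotificationTopic();
```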

If you want to grant a principal permissions to approve the changes, you can invoke the method grantManualApproval passing it a IGrantable:

 Pipeline pipeline = new Pipeline(this, "MyPipeline");
 IStage approveStage = pipeline.addStage(StageOptions.builder().stageName("Approve").build());
 ManualApprovalAction manualApprovalAction = ManualApprovalAction.Builder.create()
         .actionName("Approve")
         .build();
 approveStage.addAction(manualApprovalAction);
 
 IRole role = Role.fromRoleArn(this, "Admin", Arn.format(ArnComponents.builder().service("iam").resource("role").resourceName("Admin").build(), this));
 manualApprovalAction.grantManualApproval(role);
 

AWS Lambda

This module contains an Action that allows you to invoke a Lambda function in a Pipeline:

 Function fn;
 
 Pipeline pipeline = new Pipeline(this, "MyPipeline");
 LambdaInvokeAction lambdaAction = LambdaInvokeAction.Builder.create()
         .actionName("Lambda")
         .lambda(fn)
         .build();
 pipeline.addStage(StageOptions.builder()
         .stageName("Lambda")
         .actions(List.of(lambdaAction))
         .build());
 

The Lambda Action can have up to 5 inputs, and up to 5 outputs:

 Function fn;
 
 Artifact sourceOutput = new Artifact();
 Artifact buildOutput = new Artifact();
 LambdaInvokeAction lambdaAction = LambdaInvokeAction.Builder.create()
         .actionName("Lambda")
         .inputs(List.of(sourceOutput, buildOutput))
         .outputs(List.of(
             new Artifact("Out1"),
             new Artifact("Out2")))
         .lambda(fn)
         .build();
 

The Lambda Action supports custom user parameters that the Pipeline will pass to the Lambda function:

 Function fn;
 
 
 Pipeline pipeline = new Pipeline(this, "MyPipeline");
 LambdaInvokeAction lambdaAction = LambdaInvokeAction.Builder.create()
         .actionName("Lambda")
         .lambda(fn)
         .userParameters(Map.of(
                 "foo", "bar",
                 "baz", "qux"))
         // OR
         .userParametersString("my-parameter-string")
         .build();
 

The Lambda invoke action emits variables. Unlike many other actions, the variables are not static, but dynamic: they are defined by the function calling the PutJobSuccessResult API with the outputVariables property filled with a map of variables. Example:

 PipelineProject project;
 LambdaInvokeAction lambdaInvokeAction = LambdaInvokeAction.Builder.create()
         .actionName("Lambda")
         .lambda(Function.Builder.create(this, "Func")
                 .runtime(Runtime.NODEJS_12_X)
                 .handler("index.handler")
                 .code(Code.fromInline("\n        const AWS = require('aws-sdk');\n\n        exports.handler = async function(event, context) {\n            const codepipeline = new AWS.CodePipeline();\n            await codepipeline.putJobSuccessResult({\n                jobId: event['CodePipeline.job'].id,\n                outputVariables: {\n                    MY_VAR: \"some value\",\n                },\n            }).promise();\n        }\n    "))
                 .build())
         .variablesNamespace("MyNamespace")
         .build();
 Artifact sourceOutput = new Artifact();
 
 // later:
 CodeBuildAction.Builder.create()
         .actionName("CodeBuild")
         .project(project)
         .input(sourceOutput)
         .environmentVariables(Map.of(
                 "MyVar", BuildEnvironmentVariable.builder()
                         .value(lambdaInvokeAction.variable("MY_VAR"))
                         .build()))
         .build();
 

See the AWS documentation on how to write a Lambda function invoked from CodePipeline.

AWS Step Functions

This module contains an Action that allows you to invoke a Step Function in a Pipeline:

 import software.amazon.awscdk.services.stepfunctions.*;
 
 Pipeline pipeline = new Pipeline(this, "MyPipeline");
 Pass startState = new Pass(this, "StartState");
 StateMachine simpleStateMachine = StateMachine.Builder.create(this, "SimpleStateMachine")
         .definition(startState)
         .build();
 StepFunctionInvokeAction stepFunctionAction = StepFunctionInvokeAction.Builder.create()
         .actionName("Invoke")
         .stateMachine(simpleStateMachine)
         .stateMachineInput(StateMachineInput.literal(Map.of("IsHelloWorldExample", true)))
         .build();
 pipeline.addStage(StageOptions.builder()
         .stageName("StepFunctions")
         .actions(List.of(stepFunctionAction))
         .build());
 

The StateMachineInput can be created with one of two static factory methods: literal, which takes an arbitrary map as its only argument, or filePath, which reads the input from a file in one of the Action's input Artifacts:

 import software.amazon.awscdk.services.stepfunctions.*;
 
 
 Pipeline pipeline = new Pipeline(this, "MyPipeline");
 Artifact inputArtifact = new Artifact();
 Pass startState = new Pass(this, "StartState");
 StateMachine simpleStateMachine = StateMachine.Builder.create(this, "SimpleStateMachine")
         .definition(startState)
         .build();
 StepFunctionInvokeAction stepFunctionAction = StepFunctionInvokeAction.Builder.create()
         .actionName("Invoke")
         .stateMachine(simpleStateMachine)
         .stateMachineInput(StateMachineInput.filePath(inputArtifact.atPath("assets/input.json")))
         .build();
 pipeline.addStage(StageOptions.builder()
         .stageName("StepFunctions")
         .actions(List.of(stepFunctionAction))
         .build());
 

See the AWS documentation for more information on the action structure reference.


Copyright © 2022. All rights reserved.