
Package software.amazon.awscdk.services.stepfunctions.tasks

Tasks for AWS Step Functions


---

cdk-constructs: Stable


AWS Step Functions is a web service that enables you to coordinate the components of distributed applications and microservices using visual workflows. You build applications from individual components that each perform a discrete function, or task, allowing you to scale and change applications quickly.

A Task state represents a single unit of work performed by a state machine. All work in your state machine is performed by tasks.

This module is part of the AWS Cloud Development Kit project.

Task

A Task state represents a single unit of work performed by a state machine. In the CDK, the exact work to be done is determined by a class that implements IStepFunctionsTask.

AWS Step Functions integrates with some AWS services so that you can call API actions, and coordinate executions directly from the Amazon States Language in Step Functions. You can directly call and pass parameters to the APIs of those services.

Paths

In the Amazon States Language, a path is a string beginning with $ that you can use to identify components within JSON text.

Learn more about input and output processing in Step Functions here
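The idea of a reference path can be sketched in plain Java. The class below is an invented illustration (not the actual Amazon States Language evaluator): it models the state JSON with `java.util.Map` and walks one field per path segment.

```java
import java.util.Map;

// Conceptual sketch of how a path like "$.a.b" identifies a component of
// the state JSON (modeled here with java.util.Map). This is NOT the real
// Step Functions path evaluator; it only illustrates the idea.
public class PathSketch {
    @SuppressWarnings("unchecked")
    public static Object resolve(Object json, String path) {
        if (path.equals("$")) {
            return json; // "$" refers to the entire input
        }
        Object current = json;
        // Strip the leading "$." and descend one field per segment
        for (String segment : path.substring(2).split("\\.")) {
            current = ((Map<String, Object>) current).get(segment);
        }
        return current;
    }

    public static void main(String[] args) {
        Map<String, Object> input = Map.of(
                "order", Map.of("id", "order-123", "total", 42));
        System.out.println(resolve(input, "$.order.id")); // order-123
    }
}
```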

InputPath

Both InputPath and Parameters fields provide a way to manipulate JSON as it moves through your workflow. AWS Step Functions applies the InputPath field first, and then the Parameters field. You can first filter your raw input to a selection you want using InputPath, and then apply Parameters to manipulate that input further, or add new values. If you don't specify an InputPath, a default value of $ will be used.

The following example provides the field named input as the input to the Task state that runs a Lambda function.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 LambdaInvoke submitJob = new LambdaInvoke(this, "Invoke Handler", new LambdaInvokeProps()
         .lambdaFunction(fn)
         .inputPath("$.input"));
 

OutputPath

Tasks also allow you to select a portion of the state output to pass to the next state. This enables you to filter out unwanted information, and pass only the portion of the JSON that you care about. If you don't specify an OutputPath, a default value of $ will be used. This passes the entire JSON node to the next state.

The response from a Lambda function includes the response from the function as well as other metadata.

The following example assigns the output from the Task to a field named result.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 LambdaInvoke submitJob = new LambdaInvoke(this, "Invoke Handler", new LambdaInvokeProps()
         .lambdaFunction(fn)
         .outputPath("$.Payload.result"));
 

ResultPath

The output of a state can be a copy of its input, the result it produces (for example, output from a Task state’s Lambda function), or a combination of its input and result. Use ResultPath to control which combination of these is passed to the state output. If you don't specify a ResultPath, a default value of $ will be used.

The following example writes an item with DynamoDB's putItem API, adds the API response to the state input at $.Item, and passes the combined JSON to the next state.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 new DynamoPutItem(this, "PutItem", new DynamoPutItemProps()
         .item(Map.of(
                 "MessageId", tasks.DynamoAttributeValue.fromString("message-id")))
         .table(myTable)
         .resultPath("$.Item"));
 

⚠️ The OutputPath is computed after applying ResultPath. All service integrations return metadata as part of their response. When using ResultPath, it's not possible to merge a subset of the task output to the input.
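The processing order described above (InputPath first, then the task runs, then ResultPath, then OutputPath last) can be sketched in plain Java. This is an invented illustration using Maps in place of JSON, not CDK code or the Step Functions runtime; the class and field names are made up for the example.

```java
import java.util.HashMap;
import java.util.Map;

// Conceptual sketch of the order in which a Task state processes JSON:
// InputPath -> (Parameters would apply here) -> task runs -> ResultPath
// -> OutputPath. Maps stand in for JSON; this is illustrative only.
public class PipelineSketch {

    // Resolve "$" or a single-level "$.field" against a map (for brevity).
    static Object at(Map<String, Object> json, String path) {
        return path.equals("$") ? json : json.get(path.substring(2));
    }

    public static Map<String, Object> run(Map<String, Object> stateInput) {
        // 1. InputPath "$.input" filters the raw input first
        Object effectiveInput = at(stateInput, "$.input");

        // 2. The task produces a result from that filtered input
        Object taskResult = "processed:" + effectiveInput;

        // 3. ResultPath "$.result" inserts the result into the ORIGINAL input...
        Map<String, Object> combined = new HashMap<>(stateInput);
        combined.put("result", taskResult);

        // 4. ...and OutputPath is applied last. "$" (the default) passes the
        // whole combined JSON on; "$.result" would pass only the result.
        return combined;
    }

    public static void main(String[] args) {
        System.out.println(run(new HashMap<>(Map.of("input", "abc", "other", 1))));
    }
}
```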

Task parameters from the state JSON

Most tasks take parameters. Parameter values can either be static, supplied directly in the workflow definition, or dynamic, taken from the state machine's execution at runtime (either its input or the output of a prior state). Dynamic parameter values can be specified via the JsonPath class, using methods such as JsonPath.stringAt().

The following example provides the field named input as the input to the Lambda function and invokes it asynchronously.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 LambdaInvoke submitJob = new LambdaInvoke(this, "Invoke Handler", new LambdaInvokeProps()
         .lambdaFunction(fn)
         .payload(sfn.TaskInput.fromDataAt("$.input"))
         .invocationType(tasks.LambdaInvocationType.getEVENT()));
 

Each service integration has its own set of parameters that can be supplied.

Evaluate Expression

Use the EvaluateExpression construct to perform simple operations that reference state paths. The expression referenced in the task is evaluated in a Lambda function using eval(), so you don't have to write Lambda code for simple operations.

Example: convert a wait time from milliseconds to seconds, concatenate it into a message, and wait:

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 EvaluateExpression convertToSeconds = new EvaluateExpression(this, "Convert to seconds", new EvaluateExpressionProps()
         .expression("$.waitMilliseconds / 1000")
         .resultPath("$.waitSeconds"));
 
 EvaluateExpression createMessage = new EvaluateExpression(this, "Create message", new EvaluateExpressionProps()
         // Note: this is a string inside a string.
         .expression("`Now waiting ${$.waitSeconds} seconds...`")
         .runtime(lambda.Runtime.getNODEJS_14_X())
         .resultPath("$.message"));
 
 SnsPublish publishMessage = new SnsPublish(this, "Publish message", new SnsPublishProps()
         .topic(new Topic(this, "cool-topic"))
         .message(sfn.TaskInput.fromDataAt("$.message"))
         .resultPath("$.sns"));
 
 Wait wait = new Wait(this, "Wait", new WaitProps()
         .time(sfn.WaitTime.secondsPath("$.waitSeconds")));
 
 new StateMachine(this, "StateMachine", new StateMachineProps()
         .definition(convertToSeconds
             .next(createMessage)
             .next(publishMessage).next(wait)));
 

The EvaluateExpression construct supports a runtime prop to specify the Lambda runtime used to evaluate the expression. Currently, only runtimes of the Node.js family are supported.

API Gateway

Step Functions supports API Gateway through the service integration pattern.

HTTP APIs are designed for low-latency, cost-effective integrations with AWS services, including AWS Lambda, and HTTP endpoints. HTTP APIs support OIDC and OAuth 2.0 authorization, and come with built-in support for CORS and automatic deployments. Previous-generation REST APIs currently offer more features. More details can be found here.

Call REST API Endpoint

The CallApiGatewayRestApiEndpoint calls the REST API endpoint.

 // Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
 import software.amazon.awscdk.services.stepfunctions.*;
 import software.amazon.awscdk.services.apigateway.*;
 
 
 RestApi restApi = new RestApi(stack, "MyRestApi");
 
 CallApiGatewayRestApiEndpoint invokeTask = new CallApiGatewayRestApiEndpoint(stack, "Call REST API", new CallApiGatewayRestApiEndpointProps()
         .api(restApi)
         .stageName("prod")
         .method(HttpMethod.getGET()));
 

Call HTTP API Endpoint

The CallApiGatewayHttpApiEndpoint calls the HTTP API endpoint.

 // Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
 import software.amazon.awscdk.services.stepfunctions.*;
 import software.amazon.awscdk.services.apigatewayv2.*;
 
 
 HttpApi httpApi = new HttpApi(stack, "MyHttpApi");
 
 CallApiGatewayHttpApiEndpoint invokeTask = new CallApiGatewayHttpApiEndpoint(stack, "Call HTTP API", new CallApiGatewayHttpApiEndpointProps()
         .api(httpApi)
         .method(HttpMethod.getGET()));
 

Athena

Step Functions supports Athena through the service integration pattern.

StartQueryExecution

The StartQueryExecution API runs the SQL query statement.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 AthenaStartQueryExecution startQueryExecutionJob = new AthenaStartQueryExecution(this, "Start Athena Query", new AthenaStartQueryExecutionProps()
         .queryString(sfn.JsonPath.stringAt("$.queryString"))
         .queryExecutionContext(new QueryExecutionContext()
                 .databaseName("mydatabase"))
         .resultConfiguration(new ResultConfiguration()
                 .encryptionConfiguration(new EncryptionConfiguration()
                         .encryptionOption(tasks.EncryptionOption.getS3_MANAGED()))
                 .outputLocation(new Location()
                         .bucketName("query-results-bucket")
                         .objectKey("folder"))));
 

GetQueryExecution

The GetQueryExecution API gets information about a single execution of a query.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 AthenaGetQueryExecution getQueryExecutionJob = new AthenaGetQueryExecution(this, "Get Query Execution", new AthenaGetQueryExecutionProps()
         .queryExecutionId(sfn.JsonPath.stringAt("$.QueryExecutionId")));
 

GetQueryResults

The GetQueryResults API streams the results of a single query execution specified by QueryExecutionId from S3.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 AthenaGetQueryResults getQueryResultsJob = new AthenaGetQueryResults(this, "Get Query Results", new AthenaGetQueryResultsProps()
         .queryExecutionId(sfn.JsonPath.stringAt("$.QueryExecutionId")));
 

StopQueryExecution

The StopQueryExecution API stops a query execution.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 AthenaStopQueryExecution stopQueryExecutionJob = new AthenaStopQueryExecution(this, "Stop Query Execution", new AthenaStopQueryExecutionProps()
         .queryExecutionId(sfn.JsonPath.stringAt("$.QueryExecutionId")));
 

Batch

Step Functions supports Batch through the service integration pattern.

SubmitJob

The SubmitJob API submits an AWS Batch job from a job definition.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 BatchSubmitJob task = new BatchSubmitJob(this, "Submit Job", new BatchSubmitJobProps()
         .jobDefinition(batchJobDefinition)
         .jobName("MyJob")
         .jobQueue(batchQueue));
 

CodeBuild

Step Functions supports CodeBuild through the service integration pattern.

StartBuild

StartBuild starts a CodeBuild Project by Project Name.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 import software.amazon.awscdk.services.codebuild.*;
 
 
 Project codebuildProject = new Project(this, "Project", new ProjectProps()
         .projectName("MyTestProject")
         .buildSpec(codebuild.BuildSpec.fromObject(Map.of(
                 "version", "0.2",
                 "phases", Map.of(
                         "build", Map.of(
                                 "commands", asList("echo \"Hello, CodeBuild!\"")))))));
 
 CodeBuildStartBuild task = new CodeBuildStartBuild(this, "Task", new CodeBuildStartBuildProps()
         .project(codebuildProject)
         .integrationPattern(sfn.IntegrationPattern.getRUN_JOB())
         .environmentVariablesOverride(Map.of(
                 "ZONE", new BuildEnvironmentVariable()
                         .type(codebuild.BuildEnvironmentVariableType.getPLAINTEXT())
                         .value(sfn.JsonPath.stringAt("$.envVariables.zone")))));
 

DynamoDB

You can call DynamoDB APIs from a Task state. Read more about calling DynamoDB APIs here

GetItem

The GetItem operation returns a set of attributes for the item with the given primary key.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 new DynamoGetItem(this, "Get Item", new DynamoGetItemProps()
         .key(Map.of("messageId", tasks.DynamoAttributeValue.fromString("message-007")))
         .table(myTable));
 

PutItem

The PutItem operation creates a new item, or replaces an old item with a new item.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 new DynamoPutItem(this, "PutItem", new DynamoPutItemProps()
         .item(Map.of(
                 "MessageId", tasks.DynamoAttributeValue.fromString("message-007"),
                 "Text", tasks.DynamoAttributeValue.fromString(sfn.JsonPath.stringAt("$.bar")),
                 "TotalCount", tasks.DynamoAttributeValue.fromNumber(10)))
         .table(myTable));
 

DeleteItem

The DeleteItem operation deletes a single item in a table by primary key.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 new DynamoDeleteItem(this, "DeleteItem", new DynamoDeleteItemProps()
         .key(Map.of("MessageId", tasks.DynamoAttributeValue.fromString("message-007")))
         .table(myTable)
         .resultPath(sfn.JsonPath.getDISCARD()));
 

UpdateItem

The UpdateItem operation edits an existing item's attributes, or adds a new item to the table if it does not already exist.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 new DynamoUpdateItem(this, "UpdateItem", new DynamoUpdateItemProps()
         .key(Map.of(
                 "MessageId", tasks.DynamoAttributeValue.fromString("message-007")))
         .table(myTable)
         .expressionAttributeValues(Map.of(
                 ":val", tasks.DynamoAttributeValue.numberFromString(sfn.JsonPath.stringAt("$.Item.TotalCount.N")),
                 ":rand", tasks.DynamoAttributeValue.fromNumber(20)))
         .updateExpression("SET TotalCount = :val + :rand"));
 

ECS

Step Functions supports ECS/Fargate through the service integration pattern.

RunTask

RunTask starts a new task using the specified task definition.

EC2

The EC2 launch type allows you to run your containerized applications on a cluster of Amazon EC2 instances that you manage.

When a task that uses the EC2 launch type is launched, Amazon ECS must determine where to place the task based on the requirements specified in the task definition, such as CPU and memory. Similarly, when you scale down the task count, Amazon ECS must determine which tasks to terminate. You can apply task placement strategies and constraints to customize how Amazon ECS places and terminates tasks. Learn more about task placement

The latest ACTIVE revision of the passed task definition is used for running the task.

The following example runs a job from a task definition on EC2.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 import software.amazon.awscdk.services.ecs.*;
 
 
 IVpc vpc = ec2.Vpc.fromLookup(this, "Vpc", new VpcLookupOptions()
         .isDefault(true));
 
 Cluster cluster = new Cluster(this, "Ec2Cluster", new ClusterProps().vpc(vpc));
 cluster.addCapacity("DefaultAutoScalingGroup", new AddCapacityOptions()
         .instanceType(new InstanceType("t2.micro"))
         .vpcSubnets(new SubnetSelection().subnetType(ec2.SubnetType.getPUBLIC())));
 
 TaskDefinition taskDefinition = new TaskDefinition(this, "TD", new TaskDefinitionProps()
         .compatibility(ecs.Compatibility.getEC2()));
 
 taskDefinition.addContainer("TheContainer", new ContainerDefinitionOptions()
         .image(ecs.ContainerImage.fromRegistry("foo/bar"))
         .memoryLimitMiB(256));
 
 EcsRunTask runTask = new EcsRunTask(this, "Run", new EcsRunTaskProps()
         .integrationPattern(sfn.IntegrationPattern.getRUN_JOB())
         .cluster(cluster)
         .taskDefinition(taskDefinition)
         .launchTarget(new EcsEc2LaunchTarget(new EcsEc2LaunchTargetOptions()
                 .placementStrategies(asList(ecs.PlacementStrategy.spreadAcrossInstances(), ecs.PlacementStrategy.packedByCpu(), ecs.PlacementStrategy.randomly()))
                 .placementConstraints(asList(ecs.PlacementConstraint.memberOf("blieptuut"))))));
 

Fargate

AWS Fargate is a serverless compute engine for containers that works with Amazon Elastic Container Service (ECS). Fargate makes it easy for you to focus on building your applications. Fargate removes the need to provision and manage servers, lets you specify and pay for resources per application, and improves security through application isolation by design. Learn more about Fargate

The Fargate launch type allows you to run your containerized applications without the need to provision and manage the backend infrastructure. Just register your task definition and Fargate launches the container for you. The latest ACTIVE revision of the passed task definition is used for running the task. Learn more about Fargate Versioning

The following example runs a job from a task definition on Fargate.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 import software.amazon.awscdk.services.ecs.*;
 
 
 IVpc vpc = ec2.Vpc.fromLookup(this, "Vpc", new VpcLookupOptions()
         .isDefault(true));
 
 Cluster cluster = new Cluster(this, "FargateCluster", new ClusterProps().vpc(vpc));
 
 TaskDefinition taskDefinition = new TaskDefinition(this, "TD", new TaskDefinitionProps()
         .memoryMiB("512")
         .cpu("256")
         .compatibility(ecs.Compatibility.getFARGATE()));
 
 ContainerDefinition containerDefinition = taskDefinition.addContainer("TheContainer", new ContainerDefinitionOptions()
         .image(ecs.ContainerImage.fromRegistry("foo/bar"))
         .memoryLimitMiB(256));
 
 EcsRunTask runTask = new EcsRunTask(this, "RunFargate", new EcsRunTaskProps()
         .integrationPattern(sfn.IntegrationPattern.getRUN_JOB())
         .cluster(cluster)
         .taskDefinition(taskDefinition)
         .assignPublicIp(true)
         .containerOverrides(asList(new ContainerOverride()
                 .containerDefinition(containerDefinition)
                 .environment(asList(new TaskEnvironmentVariable().name("SOME_KEY").value(sfn.JsonPath.stringAt("$.SomeKey"))))))
         .launchTarget(new EcsFargateLaunchTarget()));
 

EMR

Step Functions supports Amazon EMR through the service integration pattern. The service integration APIs correspond to Amazon EMR APIs but differ in the parameters that are used.

Read more about the differences when using these service integrations.

Create Cluster

Creates and starts running a cluster (job flow). Corresponds to the runJobFlow API in EMR.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 Role clusterRole = new Role(this, "ClusterRole", new RoleProps()
         .assumedBy(new ServicePrincipal("ec2.amazonaws.com")));
 
 Role serviceRole = new Role(this, "ServiceRole", new RoleProps()
         .assumedBy(new ServicePrincipal("elasticmapreduce.amazonaws.com")));
 
 Role autoScalingRole = new Role(this, "AutoScalingRole", new RoleProps()
         .assumedBy(new ServicePrincipal("elasticmapreduce.amazonaws.com")));
 
 autoScalingRole.getAssumeRolePolicy().addStatements(
         new PolicyStatement(new PolicyStatementProps()
                 .effect(iam.Effect.getALLOW())
                 .principals(asList(
                         new ServicePrincipal("application-autoscaling.amazonaws.com")))
                 .actions(asList("sts:AssumeRole"))));
 
 new EmrCreateCluster(this, "Create Cluster", new EmrCreateClusterProps()
         .instances(Map.of())
         .clusterRole(clusterRole)
         .name(sfn.TaskInput.fromDataAt("$.ClusterName").getValue())
         .serviceRole(serviceRole)
         .autoScalingRole(autoScalingRole));
 

Termination Protection

Locks a cluster (job flow) so the EC2 instances in the cluster cannot be terminated by user intervention, an API call, or a job-flow error.

Corresponds to the setTerminationProtection API in EMR.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 new EmrSetClusterTerminationProtection(this, "Task", new EmrSetClusterTerminationProtectionProps()
         .clusterId("ClusterId")
         .terminationProtected(false));
 

Terminate Cluster

Shuts down a cluster (job flow). Corresponds to the terminateJobFlows API in EMR.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 new EmrTerminateCluster(this, "Task", new EmrTerminateClusterProps()
         .clusterId("ClusterId"));
 

Add Step

Adds a new step to a running cluster. Corresponds to the addJobFlowSteps API in EMR.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 new EmrAddStep(this, "Task", new EmrAddStepProps()
         .clusterId("ClusterId")
         .name("StepName")
         .jar("Jar")
         .actionOnFailure(tasks.ActionOnFailure.getCONTINUE()));
 

Cancel Step

Cancels a pending step in a running cluster. Corresponds to the cancelSteps API in EMR.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 new EmrCancelStep(this, "Task", new EmrCancelStepProps()
         .clusterId("ClusterId")
         .stepId("StepId"));
 

Modify Instance Fleet

Modifies the target On-Demand and target Spot capacities for the instance fleet with the specified InstanceFleetName.

Corresponds to the modifyInstanceFleet API in EMR.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 new EmrModifyInstanceFleetByName(this, "Task", new EmrModifyInstanceFleetByNameProps()
         .clusterId("ClusterId")
         .instanceFleetName("InstanceFleetName")
         .targetOnDemandCapacity(2)
         .targetSpotCapacity(0));
 

Modify Instance Group

Modifies the number of nodes and configuration settings of an instance group.

Corresponds to the modifyInstanceGroups API in EMR.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 new EmrModifyInstanceGroupByName(this, "Task", new EmrModifyInstanceGroupByNameProps()
         .clusterId("ClusterId")
         .instanceGroupName(sfn.JsonPath.stringAt("$.InstanceGroupName"))
         .instanceGroup(Map.of(
                 "instanceCount", 1)));
 

EKS

Step Functions supports Amazon EKS through the service integration pattern. The service integration APIs correspond to Amazon EKS APIs.

Read more about the differences when using these service integrations.

Call

Read and write Kubernetes resource objects via a Kubernetes API endpoint. Corresponds to the call API in Step Functions Connector.

The following code snippet includes a Task state that uses eks:call to list the pods.

 // Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
 import software.amazon.awscdk.services.eks.*;
 import software.amazon.awscdk.services.stepfunctions.*;
 import software.amazon.awscdk.services.stepfunctions.tasks.*;
 
 
 Cluster myEksCluster = new Cluster(this, "my sample cluster", new ClusterProps()
         .version(eks.KubernetesVersion.getV1_18())
         .clusterName("myEksCluster"));
 
 new EksCall(stack, "Call a EKS Endpoint", new EksCallProps()
         .cluster(myEksCluster)
         .httpMethod(MethodType.getGET())
         .httpPath("/api/v1/namespaces/default/pods"));
 

Glue

Step Functions supports AWS Glue through the service integration pattern.

You can call the StartJobRun API from a Task state.

 // Example automatically generated without compilation. See https://github.com/aws/jsii/issues/826
 GlueStartJobRun.Builder.create(this, "Task")
         .glueJobName("my-glue-job")
         .arguments(sfn.TaskInput.fromObject(Map.of(
                 "key", "value")))
         .timeout(cdk.Duration.minutes(30))
         .notifyDelayAfter(cdk.Duration.minutes(5))
         .build();
 

Glue DataBrew

Step Functions supports AWS Glue DataBrew through the service integration pattern.

You can call the StartJobRun API from a Task state.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 new GlueDataBrewStartJobRun(this, "Task", new GlueDataBrewStartJobRunProps()
         .name("databrew-job"));
 

Lambda

Invoke a Lambda function.

You can specify the input to your Lambda function through the payload attribute. By default, Step Functions invokes the Lambda function with the state input (JSON path '$') as the input.

The following snippet invokes a Lambda Function with the state input as the payload by referencing the $ path.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 new LambdaInvoke(this, "Invoke with state input", new LambdaInvokeProps()
         .lambdaFunction(fn));
 

When a function is invoked, the Lambda service sends these response elements back.

⚠️ The response from the Lambda function is in an attribute called Payload.

The following snippet invokes a Lambda function with an empty object as the payload, then invokes it again using the $.Payload path to pass the output of the previous invocation as its input.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 new LambdaInvoke(this, "Invoke with empty object as payload", new LambdaInvokeProps()
         .lambdaFunction(fn)
         .payload(sfn.TaskInput.fromObject(Map.of())));
 
 // use the output of fn as input
 new LambdaInvoke(this, "Invoke with payload field in the state input", new LambdaInvokeProps()
         .lambdaFunction(fn)
         .payload(sfn.TaskInput.fromDataAt("$.Payload")));
 

The following snippet invokes a Lambda and sets the task output to only include the Lambda function response.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 new LambdaInvoke(this, "Invoke and set function response as task output", new LambdaInvokeProps()
         .lambdaFunction(fn)
         .outputPath("$.Payload"));
 

If you want to combine the input and the Lambda function response, you can use the payloadResponseOnly property and specify the resultPath. This puts the Lambda function ARN directly in the "Resource" string, but payloadResponseOnly conflicts with the integrationPattern, invocationType, clientContext, and qualifier properties.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 new LambdaInvoke(this, "Invoke and combine function response with task input", new LambdaInvokeProps()
         .lambdaFunction(fn)
         .payloadResponseOnly(true)
         .resultPath("$.fn"));
 

You can have Step Functions pause a task, and wait for an external process to return a task token. Read more about the callback pattern

To use the callback pattern, set the token property on the task. Call the Step Functions SendTaskSuccess or SendTaskFailure APIs with the token to indicate that the task has completed and the state machine should resume execution.

The following snippet invokes a Lambda with the task token as part of the input to the Lambda.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 new LambdaInvoke(this, "Invoke with callback", new LambdaInvokeProps()
         .lambdaFunction(fn)
         .integrationPattern(sfn.IntegrationPattern.getWAIT_FOR_TASK_TOKEN())
         .payload(sfn.TaskInput.fromObject(Map.of(
                 "token", sfn.JsonPath.getTaskToken(),
                 "input", sfn.JsonPath.stringAt("$.someField")))));
 

⚠️ The task will pause until it receives that task token back with a SendTaskSuccess or SendTaskFailure call. Learn more about Callback with the Task Token.

AWS Lambda can occasionally experience transient service errors. In this case, invoking Lambda results in a 500 error, such as ServiceException, AWSLambdaException, or SdkClientException. As a best practice, the LambdaInvoke task retries on those errors with an interval of 2 seconds, a back-off rate of 2, and a maximum of 6 attempts. Set the retryOnServiceExceptions prop to false to disable this behavior.
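That default retry policy (up to 6 attempts, a 2-second interval that doubles after each failure: 2s, 4s, 8s, ...) can be sketched as a plain-Java loop. This is an invented illustration of the exponential back-off technique, not CDK or Step Functions code; the class and method names are made up for the example.

```java
import java.util.function.Supplier;

// Sketch of retrying a call on transient errors with exponential back-off,
// mirroring the defaults described above (interval 2s, back-off rate 2,
// max 6 attempts). Illustrative only; not the CDK implementation.
public class RetrySketch {
    public static <T> T retryWithBackoff(Supplier<T> call, int maxAttempts,
                                         long intervalMillis, double backoffRate)
            throws InterruptedException {
        RuntimeException last = null;
        long delay = intervalMillis;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return call.get();
            } catch (RuntimeException e) { // e.g. a transient 500 from the service
                last = e;
                if (attempt < maxAttempts) {
                    Thread.sleep(delay);
                    delay = (long) (delay * backoffRate); // 2s, 4s, 8s, ...
                }
            }
        }
        throw last; // all attempts exhausted
    }

    public static void main(String[] args) throws InterruptedException {
        int[] calls = {0};
        // A call that fails twice with a transient error before succeeding
        String result = retryWithBackoff(() -> {
            if (++calls[0] < 3) throw new RuntimeException("ServiceException");
            return "ok";
        }, 6, 1 /* short interval so the demo runs fast */, 2.0);
        System.out.println(result + " after " + calls[0] + " attempts");
    }
}
```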

SageMaker

Step Functions supports Amazon SageMaker through the service integration pattern.

Create Training Job

You can call the CreateTrainingJob API from a Task state.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 new SageMakerCreateTrainingJob(this, "TrainSagemaker", new SageMakerCreateTrainingJobProps()
         .trainingJobName(sfn.JsonPath.stringAt("$.JobName"))
         .algorithmSpecification(new AlgorithmSpecification()
                 .algorithmName("BlazingText")
                 .trainingInputMode(tasks.InputMode.getFILE()))
         .inputDataConfig(asList(new Channel()
                 .channelName("train")
                 .dataSource(new DataSource()
                         .s3DataSource(new S3DataSource()
                                 .s3DataType(tasks.S3DataType.getS3_PREFIX())
                                 .s3Location(tasks.S3Location.fromJsonExpression("$.S3Bucket"))))))
         .outputDataConfig(new OutputDataConfig()
                 .s3OutputLocation(tasks.S3Location.fromBucket(s3.Bucket.fromBucketName(this, "Bucket", "mybucket"), "myoutputpath")))
         .resourceConfig(new ResourceConfig()
                 .instanceCount(1)
                 .instanceType(ec2.InstanceType.of(ec2.InstanceClass.getP3(), ec2.InstanceSize.getXLARGE2()))
                 .volumeSize(cdk.Size.gibibytes(50)))// optional: default is 1 instance of EC2 `M4.XLarge` with `10GB` volume
         .stoppingCondition(new StoppingCondition()
                 .maxRuntime(cdk.Duration.hours(2))));
 

Create Transform Job

You can call the CreateTransformJob API from a Task state.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 new SageMakerCreateTransformJob(this, "Batch Inference", new SageMakerCreateTransformJobProps()
         .transformJobName("MyTransformJob")
         .modelName("MyModelName")
         .modelClientOptions(new ModelClientOptions()
                 .invocationsMaxRetries(3)// default is 0
                 .invocationsTimeout(cdk.Duration.minutes(5)))
         .transformInput(new TransformInput()
                 .transformDataSource(new TransformDataSource()
                         .s3DataSource(new TransformS3DataSource()
                                 .s3Uri("s3://inputbucket/train")
                                 .s3DataType(tasks.S3DataType.getS3_PREFIX()))))
         .transformOutput(new TransformOutput()
                 .s3OutputPath("s3://outputbucket/TransformJobOutputPath"))
         .transformResources(new TransformResources()
                 .instanceCount(1)
                 .instanceType(ec2.InstanceType.of(ec2.InstanceClass.getM4(), ec2.InstanceSize.getXLARGE()))));
 

Create Endpoint

You can call the CreateEndpoint API from a Task state.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 new SageMakerCreateEndpoint(this, "SagemakerEndpoint", new SageMakerCreateEndpointProps()
         .endpointName(sfn.JsonPath.stringAt("$.EndpointName"))
         .endpointConfigName(sfn.JsonPath.stringAt("$.EndpointConfigName")));
 

Create Endpoint Config

You can call the CreateEndpointConfig API from a Task state.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 new SageMakerCreateEndpointConfig(this, "SagemakerEndpointConfig", new SageMakerCreateEndpointConfigProps()
         .endpointConfigName("MyEndpointConfig")
         .productionVariants(asList(new ProductionVariant()
                 .initialInstanceCount(2)
                 .instanceType(ec2.InstanceType.of(ec2.InstanceClass.getM5(), ec2.InstanceSize.getXLARGE()))
                 .modelName("MyModel")
                 .variantName("awesome-variant"))));
 

Create Model

You can call the CreateModel API from a Task state.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 new SageMakerCreateModel(this, "Sagemaker", new SageMakerCreateModelProps()
         .modelName("MyModel")
         .primaryContainer(new ContainerDefinition(new ContainerDefinitionOptions()
                 .image(tasks.DockerImage.fromJsonExpression(sfn.JsonPath.stringAt("$.Model.imageName")))
                 .mode(tasks.Mode.getSINGLE_MODEL())
                 .modelS3Location(tasks.S3Location.fromJsonExpression("$.TrainingJob.ModelArtifacts.S3ModelArtifacts")))));
 

Update Endpoint

You can call the UpdateEndpoint API from a Task state.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 new SageMakerUpdateEndpoint(this, "SagemakerEndpoint", new SageMakerUpdateEndpointProps()
         .endpointName(sfn.JsonPath.stringAt("$.Endpoint.Name"))
         .endpointConfigName(sfn.JsonPath.stringAt("$.Endpoint.EndpointConfig")));
 

SNS

Step Functions supports Amazon SNS through the service integration pattern.

You can call the Publish API from a Task state to publish to an SNS topic.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 Topic topic = new Topic(this, "Topic");
 
 // Use a field from the execution data as message.
 SnsPublish task1 = new SnsPublish(this, "Publish1", new SnsPublishProps()
         .topic(topic)
         .integrationPattern(sfn.IntegrationPattern.getREQUEST_RESPONSE())
         .message(sfn.TaskInput.fromDataAt("$.state.message")));
 
 // Combine a field from the execution data with
 // a literal object.
 SnsPublish task2 = new SnsPublish(this, "Publish2", new SnsPublishProps()
         .topic(topic)
         .message(sfn.TaskInput.fromObject(Map.of(
                 "field1", "somedata",
                 "field2", sfn.JsonPath.stringAt("$.field2")))));
 

Step Functions

Start Execution

You can manage AWS Step Functions executions.

AWS Step Functions supports its own StartExecution API as a service integration.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 // Define a state machine with one Pass state
 StateMachine child = new StateMachine(this, "ChildStateMachine", new StateMachineProps()
         .definition(sfn.Chain.start(new Pass(this, "PassState"))));
 
 // Include the state machine in a Task state with callback pattern
 StepFunctionsStartExecution task = new StepFunctionsStartExecution(this, "ChildTask", new StepFunctionsStartExecutionProps()
         .stateMachine(child)
         .integrationPattern(sfn.IntegrationPattern.getWAIT_FOR_TASK_TOKEN())
         .input(sfn.TaskInput.fromObject(Map.of(
                 "token", sfn.JsonPath.getTaskToken(),
                 "foo", "bar")))
         .name("MyExecutionName"));
 
 // Define a second state machine with the Task state above
 new StateMachine(this, "ParentStateMachine", new StateMachineProps()
         .definition(task));
 

Invoke Activity

You can invoke a Step Functions Activity, which lets you include a task in your state machine whose work is performed by a worker that can be hosted on Amazon EC2, Amazon ECS, AWS Lambda, or practically anywhere. Activities are a way to associate code running somewhere (known as an activity worker) with a specific task in a state machine.

When Step Functions reaches an activity task state, the workflow waits for an activity worker to poll for a task. An activity worker polls Step Functions by using GetActivityTask and sending the ARN of the related activity.

After the activity worker completes its work, it can provide a report of its success or failure by using SendTaskSuccess or SendTaskFailure. These two calls use the taskToken provided by GetActivityTask to associate the result with that task.

The following example creates an activity and creates a task that invokes the activity.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 Activity submitJobActivity = new Activity(this, "SubmitJob");
 
 new StepFunctionsInvokeActivity(this, "Submit Job", new StepFunctionsInvokeActivityProps()
         .activity(submitJobActivity));
 
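The worker side of this handshake is not a CDK construct; it is ordinary application code that calls the Step Functions API. The following is a minimal sketch of that poll-work-report loop, with the GetActivityTask and SendTaskSuccess calls replaced by in-memory stand-ins so the sketch is self-contained. In a real worker, those two methods would be calls on the AWS SDK's Step Functions client instead, and the activity ARN would come from the deployed Activity.

```java
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of the activity-worker protocol described above, using in-memory
// stand-ins for the GetActivityTask and SendTaskSuccess API calls. A real
// worker would make these calls against the Step Functions service via the
// AWS SDK; everything AWS-specific here (ARN, task input) is illustrative.
public class ActivityWorkerSketch {
    // Stand-in for the service side: taskToken -> reported result
    static final Map<String, String> reportedResults = new ConcurrentHashMap<>();

    // Stand-in for GetActivityTask: returns a [taskToken, inputJson] pair
    // for a task that is waiting at the activity's task state.
    static String[] getActivityTask(String activityArn) {
        String taskToken = UUID.randomUUID().toString();
        String input = "{\"jobId\": 42}";
        return new String[] { taskToken, input };
    }

    // Stand-in for SendTaskSuccess: the taskToken ties the result back to
    // the specific task that handed out that token.
    static void sendTaskSuccess(String taskToken, String outputJson) {
        reportedResults.put(taskToken, outputJson);
    }

    public static void main(String[] args) {
        // 1. Poll for work; the response carries a task token and the task input.
        String[] task = getActivityTask("arn:aws:states:us-east-1:123456789012:activity:SubmitJob");
        String taskToken = task[0];
        String input = task[1];

        // 2. Perform the actual work (here: trivially wrap the input).
        String output = "{\"status\": \"done\", \"job\": " + input + "}";

        // 3. Report success, associating the result with the task via its token.
        sendTaskSuccess(taskToken, output);
        System.out.println(reportedResults.get(taskToken));
    }
}
```

On failure, the worker would call SendTaskFailure with the same task token instead, which causes the activity task state to fail.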

SQS

Step Functions supports Amazon SQS.

You can call the SendMessage API from a Task state to send a message to an SQS queue.

 // Example automatically generated. See https://github.com/aws/jsii/issues/826
 Queue queue = new Queue(this, "Queue");
 
 // Use a field from the execution data as message.
 SqsSendMessage task1 = new SqsSendMessage(this, "Send1", new SqsSendMessageProps()
         .queue(queue)
         .messageBody(sfn.TaskInput.fromDataAt("$.message")));
 
 // Combine a field from the execution data with
 // a literal object.
 SqsSendMessage task2 = new SqsSendMessage(this, "Send2", new SqsSendMessageProps()
         .queue(queue)
         .messageBody(sfn.TaskInput.fromObject(Map.of(
                 "field1", "somedata",
                 "field2", sfn.JsonPath.stringAt("$.field2")))));
 

Copyright © 2021. All rights reserved.