"documentation":"<p>Cancels the reprocessing of data through the pipeline.</p>"
},
"CreateChannel":{
"name":"CreateChannel",
"http":{
"method":"POST",
"requestUri":"/channels",
"responseCode":201
},
"input":{"shape":"CreateChannelRequest"},
"output":{"shape":"CreateChannelResponse"},
"errors":[
{"shape":"InvalidRequestException"},
{"shape":"ResourceAlreadyExistsException"},
{"shape":"InternalFailureException"},
{"shape":"ServiceUnavailableException"},
{"shape":"ThrottlingException"},
{"shape":"LimitExceededException"}
],
"documentation":"<p>Creates a channel. A channel collects data from an MQTT topic and archives the raw, unprocessed messages before publishing the data to a pipeline.</p>"
"documentation":"<p>Creates a data set. A data set stores data retrieved from a data store by applying a \"queryAction\" (a SQL query) or a \"containerAction\" (executing a containerized application). This operation creates the skeleton of a data set. The data set can be populated manually by calling \"CreateDatasetContent\" or automatically according to a \"trigger\" you specify.</p>"
"documentation":"<p>Creates the content of a data set by applying a \"queryAction\" (a SQL query) or a \"containerAction\" (executing a containerized application).</p>"
"documentation":"<p>Creates a pipeline. A pipeline consumes messages from one or more channels and allows you to process the messages before storing them in a data store. You must specify both a <code>channel</code> and a <code>datastore</code> activity and, optionally, as many as 23 additional activities in the <code>pipelineActivities</code> array.</p>"
"documentation":"<p>Deletes the specified channel.</p>"
},
"DeleteDataset":{
"name":"DeleteDataset",
"http":{
"method":"DELETE",
"requestUri":"/datasets/{datasetName}",
"responseCode":204
},
"input":{"shape":"DeleteDatasetRequest"},
"errors":[
{"shape":"InvalidRequestException"},
{"shape":"ResourceNotFoundException"},
{"shape":"InternalFailureException"},
{"shape":"ServiceUnavailableException"},
{"shape":"ThrottlingException"}
],
"documentation":"<p>Deletes the specified data set.</p> <p>You do not have to delete the content of the data set before you perform this operation.</p>"
},
"DeleteDatasetContent":{
"name":"DeleteDatasetContent",
"http":{
"method":"DELETE",
"requestUri":"/datasets/{datasetName}/content",
"responseCode":204
},
"input":{"shape":"DeleteDatasetContentRequest"},
"errors":[
{"shape":"InvalidRequestException"},
{"shape":"ResourceNotFoundException"},
{"shape":"InternalFailureException"},
{"shape":"ServiceUnavailableException"},
{"shape":"ThrottlingException"}
],
"documentation":"<p>Deletes the content of the specified data set.</p>"
},
"DeleteDatastore":{
"name":"DeleteDatastore",
"http":{
"method":"DELETE",
"requestUri":"/datastores/{datastoreName}",
"responseCode":204
},
"input":{"shape":"DeleteDatastoreRequest"},
"errors":[
{"shape":"InvalidRequestException"},
{"shape":"ResourceNotFoundException"},
{"shape":"InternalFailureException"},
{"shape":"ServiceUnavailableException"},
{"shape":"ThrottlingException"}
],
"documentation":"<p>Deletes the specified data store.</p>"
},
"DeletePipeline":{
"name":"DeletePipeline",
"http":{
"method":"DELETE",
"requestUri":"/pipelines/{pipelineName}",
"responseCode":204
},
"input":{"shape":"DeletePipelineRequest"},
"errors":[
{"shape":"InvalidRequestException"},
{"shape":"ResourceNotFoundException"},
{"shape":"InternalFailureException"},
{"shape":"ServiceUnavailableException"},
{"shape":"ThrottlingException"}
],
"documentation":"<p>Deletes the specified pipeline.</p>"
},
"DescribeChannel":{
"name":"DescribeChannel",
"http":{
"method":"GET",
"requestUri":"/channels/{channelName}"
},
"input":{"shape":"DescribeChannelRequest"},
"output":{"shape":"DescribeChannelResponse"},
"errors":[
{"shape":"InvalidRequestException"},
{"shape":"ResourceNotFoundException"},
{"shape":"InternalFailureException"},
{"shape":"ServiceUnavailableException"},
{"shape":"ThrottlingException"}
],
"documentation":"<p>Retrieves information about a channel.</p>"
},
"DescribeDataset":{
"name":"DescribeDataset",
"http":{
"method":"GET",
"requestUri":"/datasets/{datasetName}"
},
"input":{"shape":"DescribeDatasetRequest"},
"output":{"shape":"DescribeDatasetResponse"},
"errors":[
{"shape":"InvalidRequestException"},
{"shape":"ResourceNotFoundException"},
{"shape":"InternalFailureException"},
{"shape":"ServiceUnavailableException"},
{"shape":"ThrottlingException"}
],
"documentation":"<p>Retrieves information about a data set.</p>"
},
"DescribeDatastore":{
"name":"DescribeDatastore",
"http":{
"method":"GET",
"requestUri":"/datastores/{datastoreName}"
},
"input":{"shape":"DescribeDatastoreRequest"},
"output":{"shape":"DescribeDatastoreResponse"},
"errors":[
{"shape":"InvalidRequestException"},
{"shape":"ResourceNotFoundException"},
{"shape":"InternalFailureException"},
{"shape":"ServiceUnavailableException"},
{"shape":"ThrottlingException"}
],
"documentation":"<p>Retrieves information about a data store.</p>"
"documentation":"<p>Sets or updates the AWS IoT Analytics logging options.</p> <p>Note that if you update the value of any <code>loggingOptions</code> field, it takes up to one minute for the change to take effect. Also, if you change the policy attached to the role you specified in the roleArn field (for example, to correct an invalid policy) it takes up to 5 minutes for that change to take effect. </p>"
"documentation":"<p>Simulates the results of running a pipeline activity on a message payload.</p>"
},
"SampleChannelData":{
"name":"SampleChannelData",
"http":{
"method":"GET",
"requestUri":"/channels/{channelName}/sample"
},
"input":{"shape":"SampleChannelDataRequest"},
"output":{"shape":"SampleChannelDataResponse"},
"errors":[
{"shape":"InvalidRequestException"},
{"shape":"ResourceNotFoundException"},
{"shape":"InternalFailureException"},
{"shape":"ServiceUnavailableException"},
{"shape":"ThrottlingException"}
],
"documentation":"<p>Retrieves a sample of messages from the specified channel ingested during the specified timeframe. Up to 10 messages can be retrieved.</p>"
"documentation":"<p>Updates the settings of a pipeline. You must specify both a <code>channel</code> and a <code>datastore</code> activity and, optionally, as many as 23 additional activities in the <code>pipelineActivities</code> array.</p>"
"documentation":"<p>The name of the 'addAttributes' activity.</p>"
},
"attributes":{
"shape":"AttributeNameMapping",
"documentation":"<p>A list of 1-50 \"AttributeNameMapping\" objects that map an existing attribute to a new attribute.</p> <note> <p>The existing attributes remain in the message, so if you want to remove the originals, use \"RemoveAttributeActivity\".</p> </note>"
},
"next":{
"shape":"ActivityName",
"documentation":"<p>The next activity in the pipeline.</p>"
}
},
"documentation":"<p>An activity that adds other attributes based on existing attributes in the message.</p>"
},
"AttributeName":{
"type":"string",
"max":256,
"min":1
},
"AttributeNameMapping":{
"type":"map",
"key":{"shape":"AttributeName"},
"value":{"shape":"AttributeName"},
"max":50,
"min":1
},
"AttributeNames":{
"type":"list",
"member":{"shape":"AttributeName"},
"max":50,
"min":1
},
"BatchPutMessageErrorEntries":{
"type":"list",
"member":{"shape":"BatchPutMessageErrorEntry"}
},
"BatchPutMessageErrorEntry":{
"type":"structure",
"members":{
"messageId":{
"shape":"MessageId",
"documentation":"<p>The ID of the message that caused the error. (See the value corresponding to the \"messageId\" key in the message object.)</p>"
},
"errorCode":{
"shape":"ErrorCode",
"documentation":"<p>The code associated with the error.</p>"
},
"errorMessage":{
"shape":"ErrorMessage",
"documentation":"<p>The message associated with the error.</p>"
}
},
"documentation":"<p>Contains informations about errors.</p>"
},
"BatchPutMessageRequest":{
"type":"structure",
"required":[
"channelName",
"messages"
],
"members":{
"channelName":{
"shape":"ChannelName",
"documentation":"<p>The name of the channel where the messages are sent.</p>"
"documentation":"<p>The list of messages to be sent. Each message has format: '{ \"messageId\": \"string\", \"payload\": \"string\"}'.</p> <p>Note that the field names of message payloads (data) that you send to AWS IoT Analytics:</p> <ul> <li> <p>Must contain only alphanumeric characters and undescores (_); no other special characters are allowed.</p> </li> <li> <p>Must begin with an alphabetic character or single underscore (_).</p> </li> <li> <p>Cannot contain hyphens (-).</p> </li> <li> <p>In regular expression terms: \"^[A-Za-z_]([A-Za-z0-9]*|[A-Za-z0-9][A-Za-z0-9_]*)$\". </p> </li> <li> <p>Cannot be greater than 255 characters.</p> </li> <li> <p>Are case-insensitive. (Fields named \"foo\" and \"FOO\" in the same payload are considered duplicates.)</p> </li> </ul> <p>For example, {\"temp_01\": 29} or {\"_temp_01\": 29} are valid, but {\"temp-01\": 29}, {\"01_temp\": 29} or {\"__temp_01\": 29} are invalid in message payloads. </p>"
"documentation":"<p>The status of the channel.</p>"
},
"retentionPeriod":{
"shape":"RetentionPeriod",
"documentation":"<p>How long, in days, message data is kept for the channel.</p>"
},
"creationTime":{
"shape":"Timestamp",
"documentation":"<p>When the channel was created.</p>"
},
"lastUpdateTime":{
"shape":"Timestamp",
"documentation":"<p>When the channel was last updated.</p>"
}
},
"documentation":"<p>A collection of data from an MQTT topic. Channels archive the raw, unprocessed messages before publishing the data to a pipeline.</p>"
},
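A minimal sketch of the payload field-name rules documented for "BatchPutMessage" above, assuming boto3 and hypothetical channel and field names; only calls documented in this model (BatchPutMessage and its error entries) are used:

```python
import json
import boto3

iotanalytics = boto3.client("iotanalytics")

# Hypothetical channel; "temp_01" and "_temp_01" satisfy the documented
# field-name rules, whereas "temp-01" or "01_temp" would not.
response = iotanalytics.batch_put_message(
    channelName="example_channel",
    messages=[
        {"messageId": "1", "payload": json.dumps({"temp_01": 29}).encode()},
        {"messageId": "2", "payload": json.dumps({"_temp_01": 31}).encode()},
    ],
)

# Any failed message is reported as a BatchPutMessageErrorEntry
# (messageId, errorCode, errorMessage), as described above.
for entry in response.get("batchPutMessageErrorEntries", []):
    print(entry["messageId"], entry["errorCode"], entry["errorMessage"])
```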
"ChannelActivity":{
"type":"structure",
"required":[
"name",
"channelName"
],
"members":{
"name":{
"shape":"ActivityName",
"documentation":"<p>The name of the 'channel' activity.</p>"
},
"channelName":{
"shape":"ChannelName",
"documentation":"<p>The name of the channel from which the messages are processed.</p>"
},
"next":{
"shape":"ActivityName",
"documentation":"<p>The next activity in the pipeline.</p>"
}
},
"documentation":"<p>The activity that determines the source of the messages to be processed.</p>"
"documentation":"<p>The ARN of the Docker container stored in your account. The Docker container contains an application and needed support libraries and is used to generate data set contents.</p>"
},
"executionRoleArn":{
"shape":"RoleArn",
"documentation":"<p>The ARN of the role which gives permission to the system to access needed resources in order to run the \"containerAction\". This includes, at minimum, permission to retrieve the data set contents which are the input to the containerized application.</p>"
},
"resourceConfiguration":{
"shape":"ResourceConfiguration",
"documentation":"<p>Configuration of the resource which executes the \"containerAction\".</p>"
},
"variables":{
"shape":"Variables",
"documentation":"<p>The values of variables used within the context of the execution of the containerized application (basically, parameters passed to the application). Each variable must have a name and a value given by one of \"stringValue\", \"datasetContentVersionValue\", or \"outputFileUriValue\".</p>"
}
},
"documentation":"<p>Information needed to run the \"containerAction\" to produce data set contents.</p>"
"documentation":"<p>A list of triggers. A trigger causes data set contents to be populated at a specified time interval or when another data set's contents are created. The list of triggers can be empty or contain up to five <b>DataSetTrigger</b> objects.</p>"
"documentation":"<p>[Optional] How long, in days, versions of data set contents are kept for the data set. If not specified or set to null, versions of data set contents are retained for at most 90 days. The number of versions of data set contents retained is determined by the <code>versioningConfiguration</code> parameter. (For more information, see https://docs.aws.amazon.com/iotanalytics/latest/userguide/getting-started.html#aws-iot-analytics-dataset-versions)</p>"
},
"versioningConfiguration":{
"shape":"VersioningConfiguration",
"documentation":"<p>[Optional] How many versions of data set contents are kept. If not specified or set to null, only the latest version plus the latest succeeded version (if they are different) are kept for the time period specified by the \"retentionPeriod\" parameter. (For more information, see https://docs.aws.amazon.com/iotanalytics/latest/userguide/getting-started.html#aws-iot-analytics-dataset-versions)</p>"
"documentation":"<p>A list of \"PipelineActivity\" objects. Activities perform transformations on your messages, such as removing, renaming or adding message attributes; filtering messages based on attribute values; invoking your Lambda functions on messages for advanced processing; or performing mathematical transformations to normalize device data.</p> <p>The list can be 2-25 <b>PipelineActivity</b> objects and must contain both a <code>channel</code> and a <code>datastore</code> activity. Each entry in the list must contain only one activity, for example:</p> <p> <code>pipelineActivities = [ { \"channel\": { ... } }, { \"lambda\": { ... } }, ... ]</code> </p>"
"documentation":"<p>The name of the Amazon S3 bucket in which channel data is stored.</p>"
},
"keyPrefix":{
"shape":"S3KeyPrefix",
"documentation":"<p>The prefix used to create the keys of the channel data objects. Each object in an Amazon S3 bucket has a key that is its unique identifier within the bucket (each object in a bucket has exactly one key).</p>"
},
"roleArn":{
"shape":"RoleArn",
"documentation":"<p>The ARN of the role which grants AWS IoT Analytics permission to interact with your Amazon S3 resources.</p>"
}
},
"documentation":"<p>Use this to store channel data in an S3 bucket that you manage.</p>"
},
"CustomerManagedChannelS3StorageSummary":{
"type":"structure",
"members":{
"bucket":{
"shape":"BucketName",
"documentation":"<p>The name of the Amazon S3 bucket in which channel data is stored.</p>"
},
"keyPrefix":{
"shape":"S3KeyPrefix",
"documentation":"<p>The prefix used to create the keys of the channel data objects. Each object in an Amazon S3 bucket has a key that is its unique identifier within the bucket (each object in a bucket has exactly one key).</p>"
},
"roleArn":{
"shape":"RoleArn",
"documentation":"<p>The ARN of the role which grants AWS IoT Analytics permission to interact with your Amazon S3 resources.</p>"
}
},
"documentation":"<p>Used to store channel data in an S3 bucket that you manage.</p>"
},
"CustomerManagedDatastoreS3Storage":{
"type":"structure",
"required":[
"bucket",
"roleArn"
],
"members":{
"bucket":{
"shape":"BucketName",
"documentation":"<p>The name of the Amazon S3 bucket in which data store data is stored.</p>"
},
"keyPrefix":{
"shape":"S3KeyPrefix",
"documentation":"<p>The prefix used to create the keys of the data store data objects. Each object in an Amazon S3 bucket has a key that is its unique identifier within the bucket (each object in a bucket has exactly one key).</p>"
},
"roleArn":{
"shape":"RoleArn",
"documentation":"<p>The ARN of the role which grants AWS IoT Analytics permission to interact with your Amazon S3 resources.</p>"
}
},
"documentation":"<p>Use this to store data store data in an S3 bucket that you manage.</p>"
},
"CustomerManagedDatastoreS3StorageSummary":{
"type":"structure",
"members":{
"bucket":{
"shape":"BucketName",
"documentation":"<p>The name of the Amazon S3 bucket in which data store data is stored.</p>"
},
"keyPrefix":{
"shape":"S3KeyPrefix",
"documentation":"<p>The prefix used to create the keys of the data store data objects. Each object in an Amazon S3 bucket has a key that is its unique identifier within the bucket (each object in a bucket has exactly one key).</p>"
},
"roleArn":{
"shape":"RoleArn",
"documentation":"<p>The ARN of the role which grants AWS IoT Analytics permission to interact with your Amazon S3 resources.</p>"
}
},
"documentation":"<p>Used to store data store data in an S3 bucket that you manage.</p>"
"documentation":"<p>[Optional] How many versions of data set contents are kept. If not specified or set to null, only the latest version plus the latest succeeded version (if they are different) are kept for the time period specified by the \"retentionPeriod\" parameter. (For more information, see https://docs.aws.amazon.com/iotanalytics/latest/userguide/getting-started.html#aws-iot-analytics-dataset-versions)</p>"
"documentation":"<p>Information which allows the system to run a containerized application in order to create the data set contents. The application must be in a Docker container along with any needed support libraries.</p>"
"documentation":"<p>A list of triggers. A trigger causes data set content to be populated at a specified time interval or when another data set is populated. The list of triggers can be empty or contain up to five DataSetTrigger objects</p>"
},
"actions":{
"shape":"DatasetActionSummaries",
"documentation":"<p>A list of \"DataActionSummary\" objects.</p>"
"documentation":"<p>The ARN of the data store.</p>"
},
"status":{
"shape":"DatastoreStatus",
"documentation":"<p>The status of a data store:</p> <dl> <dt>CREATING</dt> <dd> <p>The data store is being created.</p> </dd> <dt>ACTIVE</dt> <dd> <p>The data store has been created and can be used.</p> </dd> <dt>DELETING</dt> <dd> <p>The data store is being deleted.</p> </dd> </dl>"
},
"retentionPeriod":{
"shape":"RetentionPeriod",
"documentation":"<p>How long, in days, message data is kept for the data store.</p>"
},
"creationTime":{
"shape":"Timestamp",
"documentation":"<p>When the data store was created.</p>"
},
"lastUpdateTime":{
"shape":"Timestamp",
"documentation":"<p>The last time the data store was updated.</p>"
}
},
"documentation":"<p>Information about a data store.</p>"
},
"DatastoreActivity":{
"type":"structure",
"required":[
"name",
"datastoreName"
],
"members":{
"name":{
"shape":"ActivityName",
"documentation":"<p>The name of the 'datastore' activity.</p>"
},
"datastoreName":{
"shape":"DatastoreName",
"documentation":"<p>The name of the data store where processed messages are stored.</p>"
}
},
"documentation":"<p>The 'datastore' activity that specifies where to store the processed data.</p>"
"documentation":"<p>The status of the data store.</p>"
},
"creationTime":{
"shape":"Timestamp",
"documentation":"<p>When the data store was created.</p>"
},
"lastUpdateTime":{
"shape":"Timestamp",
"documentation":"<p>The last time the data store was updated.</p>"
}
},
"documentation":"<p>A summary of information about a data store.</p>"
},
"DeleteChannelRequest":{
"type":"structure",
"required":["channelName"],
"members":{
"channelName":{
"shape":"ChannelName",
"documentation":"<p>The name of the channel to delete.</p>",
"location":"uri",
"locationName":"channelName"
}
}
},
"DeleteDatasetContentRequest":{
"type":"structure",
"required":["datasetName"],
"members":{
"datasetName":{
"shape":"DatasetName",
"documentation":"<p>The name of the data set whose content is deleted.</p>",
"location":"uri",
"locationName":"datasetName"
},
"versionId":{
"shape":"DatasetContentVersion",
"documentation":"<p>The version of the data set whose content is deleted. You can also use the strings \"$LATEST\" or \"$LATEST_SUCCEEDED\" to delete the latest or latest successfully completed data set. If not specified, \"$LATEST_SUCCEEDED\" is the default.</p>",
"location":"querystring",
"locationName":"versionId"
}
}
},
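A short sketch of the "versionId" behavior documented above, using a hypothetical data set name via boto3; omitting "versionId" is equivalent to passing "$LATEST_SUCCEEDED":

```python
import boto3

iotanalytics = boto3.client("iotanalytics")

# Delete the latest successfully completed content (the default when
# versionId is omitted).
iotanalytics.delete_dataset_content(datasetName="example_dataset")

# Or explicitly target the most recent content version, successful or not.
iotanalytics.delete_dataset_content(
    datasetName="example_dataset",
    versionId="$LATEST",
)
```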
"DeleteDatasetRequest":{
"type":"structure",
"required":["datasetName"],
"members":{
"datasetName":{
"shape":"DatasetName",
"documentation":"<p>The name of the data set to delete.</p>",
"location":"uri",
"locationName":"datasetName"
}
}
},
"DeleteDatastoreRequest":{
"type":"structure",
"required":["datastoreName"],
"members":{
"datastoreName":{
"shape":"DatastoreName",
"documentation":"<p>The name of the data store to delete.</p>",
"location":"uri",
"locationName":"datastoreName"
}
}
},
"DeletePipelineRequest":{
"type":"structure",
"required":["pipelineName"],
"members":{
"pipelineName":{
"shape":"PipelineName",
"documentation":"<p>The name of the pipeline to delete.</p>",
"documentation":"<p>The number of seconds of estimated \"in flight\" lag time of message data. When you create data set contents using message data from a specified time frame, some message data may still be \"in flight\" when processing begins, and so will not arrive in time to be processed. Use this field to make allowances for the \"in flight\" time of your message data, so that data not processed from a previous time frame will be included with the next time frame. Without this, missed message data would be excluded from processing during the next time frame as well, because its timestamp places it within the previous time frame.</p>"
"documentation":"<p>An expression by which the time of the message data may be determined. This may be the name of a timestamp field, or a SQL expression which is used to derive the time the message data was generated.</p>"
"documentation":"<p>Additional statistical information about the data store. Included if the 'includeStatistics' parameter is set to true in the request.</p>"
"documentation":"<p>The next activity in the pipeline.</p>"
}
},
"documentation":"<p>An activity that filters a message based on its attributes.</p>"
},
"FilterExpression":{
"type":"string",
"max":256,
"min":1
},
"GetDatasetContentRequest":{
"type":"structure",
"required":["datasetName"],
"members":{
"datasetName":{
"shape":"DatasetName",
"documentation":"<p>The name of the data set whose contents are retrieved.</p>",
"location":"uri",
"locationName":"datasetName"
},
"versionId":{
"shape":"DatasetContentVersion",
"documentation":"<p>The version of the data set whose contents are retrieved. You can also use the strings \"$LATEST\" or \"$LATEST_SUCCEEDED\" to retrieve the contents of the latest or latest successfully completed data set. If not specified, \"$LATEST_SUCCEEDED\" is the default.</p>",
"location":"querystring",
"locationName":"versionId"
}
}
},
"GetDatasetContentResponse":{
"type":"structure",
"members":{
"entries":{
"shape":"DatasetEntries",
"documentation":"<p>A list of \"DatasetEntry\" objects.</p>"
},
"timestamp":{
"shape":"Timestamp",
"documentation":"<p>The time when the request was made.</p>"
},
"status":{
"shape":"DatasetContentStatus",
"documentation":"<p>The status of the data set content.</p>"
"documentation":"<p>The name of the table in your AWS Glue Data Catalog which is used to perform the ETL (extract, transform and load) operations. (An AWS Glue Data Catalog table contains partitioned data and descriptions of data sources and targets.)</p>"
},
"databaseName":{
"shape":"GlueDatabaseName",
"documentation":"<p>The name of the database in your AWS Glue Data Catalog in which the table is located. (An AWS Glue Data Catalog database contains Glue Data tables.)</p>"
}
},
"documentation":"<p>Configuration information for coordination with the AWS Glue ETL (extract, transform and load) service.</p>"
"documentation":"<p>The name of the 'lambda' activity.</p>"
},
"lambdaName":{
"shape":"LambdaName",
"documentation":"<p>The name of the Lambda function that is run on the message.</p>"
},
"batchSize":{
"shape":"ActivityBatchSize",
"documentation":"<p>The number of messages passed to the Lambda function for processing.</p> <p>The AWS Lambda function must be able to process all of these messages within five minutes, which is the maximum timeout duration for Lambda functions.</p>"
},
"next":{
"shape":"ActivityName",
"documentation":"<p>The next activity in the pipeline.</p>"
}
},
"documentation":"<p>An activity that runs a Lambda function to modify the message.</p>"
},
"LambdaName":{
"type":"string",
"max":64,
"min":1,
"pattern":"^[a-zA-Z0-9_-]+$"
},
"LimitExceededException":{
"type":"structure",
"members":{
"message":{"shape":"errorMessage"}
},
"documentation":"<p>The command caused an internal limit to be exceeded.</p>",
"error":{"httpStatusCode":410},
"exception":true
},
"ListChannelsRequest":{
"type":"structure",
"members":{
"nextToken":{
"shape":"NextToken",
"documentation":"<p>The token for the next set of results.</p>",
"location":"querystring",
"locationName":"nextToken"
},
"maxResults":{
"shape":"MaxResults",
"documentation":"<p>The maximum number of results to return in this request.</p> <p>The default value is 100.</p>",
"location":"querystring",
"locationName":"maxResults"
}
}
},
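A sketch of how the "nextToken"/"maxResults" pagination described above is typically consumed with boto3 (names hypothetical); the loop stops when the response no longer returns a token:

```python
import boto3

iotanalytics = boto3.client("iotanalytics")

# Page through all channels, 100 per request (the documented default).
kwargs = {"maxResults": 100}
while True:
    response = iotanalytics.list_channels(**kwargs)
    for summary in response.get("channelSummaries", []):
        print(summary["channelName"], summary["status"])
    token = response.get("nextToken")
    if not token:
        break
    kwargs["nextToken"] = token
```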
"ListChannelsResponse":{
"type":"structure",
"members":{
"channelSummaries":{
"shape":"ChannelSummaries",
"documentation":"<p>A list of \"ChannelSummary\" objects.</p>"
},
"nextToken":{
"shape":"NextToken",
"documentation":"<p>The token to retrieve the next set of results, or <code>null</code> if there are no more results.</p>"
"documentation":"<p>A filter to limit results to those data set contents whose creation is scheduled on or after the given time. See the field <code>triggers.schedule</code> in the CreateDataset request. (timestamp)</p>",
"location":"querystring",
"locationName":"scheduledOnOrAfter"
},
"scheduledBefore":{
"shape":"Timestamp",
"documentation":"<p>A filter to limit results to those data set contents whose creation is scheduled before the given time. See the field <code>triggers.schedule</code> in the CreateDataset request. (timestamp)</p>",
"documentation":"<p>The payload of the message. This may be a JSON string or a Base64-encoded string representing binary data (in which case you must decode it by means of a pipeline activity).</p>"
"documentation":"<p>The type of the compute resource used to execute the \"containerAction\". Possible values are: ACU_1 (vCPU=4, memory=16GiB) or ACU_2 (vCPU=8, memory=32GiB).</p>"
},
"volumeSizeInGB":{
"shape":"VolumeSizeInGB",
"documentation":"<p>The size (in GB) of the persistent storage available to the resource instance used to execute the \"containerAction\" (min: 1, max: 50).</p>"
}
},
"documentation":"<p>The configuration of the resource used to execute the \"containerAction\".</p>"
"documentation":"<p>A resource with the specified name could not be found.</p>",
"error":{"httpStatusCode":404},
"exception":true
},
"RetentionPeriod":{
"type":"structure",
"members":{
"unlimited":{
"shape":"UnlimitedRetentionPeriod",
"documentation":"<p>If true, message data is kept indefinitely.</p>"
},
"numberOfDays":{
"shape":"RetentionPeriodInDays",
"documentation":"<p>The number of days that message data is kept. The \"unlimited\" parameter must be false.</p>"
}
},
"documentation":"<p>How long, in days, message data is kept.</p>"
},
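A brief sketch of the two "RetentionPeriod" settings above, applied through a hypothetical UpdateChannel call with boto3 (channel name and durations are placeholders):

```python
import boto3

iotanalytics = boto3.client("iotanalytics")

# Keep channel data for 30 days; "unlimited" must be false when
# "numberOfDays" is set, as documented above.
iotanalytics.update_channel(
    channelName="example_channel",
    retentionPeriod={"unlimited": False, "numberOfDays": 30},
)

# Or keep channel data indefinitely.
iotanalytics.update_channel(
    channelName="example_channel",
    retentionPeriod={"unlimited": True},
)
```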
"RetentionPeriodInDays":{
"type":"integer",
"min":1
},
"RoleArn":{
"type":"string",
"max":2048,
"min":20
},
"RunPipelineActivityRequest":{
"type":"structure",
"required":[
"pipelineActivity",
"payloads"
],
"members":{
"pipelineActivity":{
"shape":"PipelineActivity",
"documentation":"<p>The pipeline activity that is run. This must not be a 'channel' activity or a 'datastore' activity because these activities are used in a pipeline only to load the original message and to store the (possibly) transformed message. If a 'lambda' activity is specified, only short-running Lambda functions (those with a timeout of less than 30 seconds or less) can be used.</p>"
},
"payloads":{
"shape":"MessagePayloads",
"documentation":"<p>The sample message payloads on which the pipeline activity is run.</p>"
}
}
},
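A sketch of "RunPipelineActivity" as described above: a single activity (not 'channel' or 'datastore') is simulated against sample payloads; all names are hypothetical and boto3 is assumed:

```python
import json
import boto3

iotanalytics = boto3.client("iotanalytics")

response = iotanalytics.run_pipeline_activity(
    # The simulated activity maps an existing attribute to a new one.
    pipelineActivity={
        "addAttributes": {
            "name": "mapTemp",
            "attributes": {"temp_01": "temperature"},
        }
    },
    payloads=[json.dumps({"temp_01": 29}).encode()],
)

# boto3 returns each transformed payload as bytes.
for payload in response["payloads"]:
    print(payload)
```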
"RunPipelineActivityResponse":{
"type":"structure",
"members":{
"payloads":{
"shape":"MessagePayloads",
"documentation":"<p>The enriched or transformed sample message payloads as base64-encoded strings. (The results of running the pipeline activity on each input sample message payload, encoded in base64.)</p>"
},
"logResult":{
"shape":"LogResult",
"documentation":"<p>In case the pipeline activity fails, the log message that is generated.</p>"
"documentation":"<p>The name of the Amazon S3 bucket to which data set contents are delivered.</p>"
},
"key":{
"shape":"BucketKeyExpression",
"documentation":"<p>The key of the data set contents object. Each object in an Amazon S3 bucket has a key that is its unique identifier within the bucket (each object in a bucket has exactly one key).</p>"
},
"glueConfiguration":{
"shape":"GlueConfiguration",
"documentation":"<p>Configuration information for coordination with the AWS Glue ETL (extract, transform and load) service.</p>"
},
"roleArn":{
"shape":"RoleArn",
"documentation":"<p>The ARN of the role which grants AWS IoT Analytics permission to interact with your Amazon S3 and AWS Glue resources.</p>"
}
},
"documentation":"<p>Configuration information for delivery of data set contents to Amazon S3.</p>"
"documentation":"<p>The expression that defines when to trigger an update. For more information, see <a href=\"https://docs.aws.amazon.com/AmazonCloudWatch/latest/events/ScheduledEvents.html\"> Schedule Expressions for Rules</a> in the Amazon CloudWatch Events User Guide.</p>"
"documentation":"<p>How long, in days, data set contents are kept for the data set.</p>"
},
"versioningConfiguration":{
"shape":"VersioningConfiguration",
"documentation":"<p>[Optional] How many versions of data set contents are kept. If not specified or set to null, only the latest version plus the latest succeeded version (if they are different) are kept for the time period specified by the \"retentionPeriod\" parameter. (For more information, see https://docs.aws.amazon.com/iotanalytics/latest/userguide/getting-started.html#aws-iot-analytics-dataset-versions)</p>"
"documentation":"<p>A list of \"PipelineActivity\" objects. Activities perform transformations on your messages, such as removing, renaming or adding message attributes; filtering messages based on attribute values; invoking your Lambda functions on messages for advanced processing; or performing mathematical transformations to normalize device data.</p> <p>The list can be 2-25 <b>PipelineActivity</b> objects and must contain both a <code>channel</code> and a <code>datastore</code> activity. Each entry in the list must contain only one activity, for example:</p> <p> <code>pipelineActivities = [ { \"channel\": { ... } }, { \"lambda\": { ... } }, ... ]</code> </p>"
"documentation":"<p>The name of the variable.</p>"
},
"stringValue":{
"shape":"StringValue",
"documentation":"<p>The value of the variable as a string.</p>"
},
"doubleValue":{
"shape":"DoubleValue",
"documentation":"<p>The value of the variable as a double (numeric).</p>",
"box":true
},
"datasetContentVersionValue":{
"shape":"DatasetContentVersionValue",
"documentation":"<p>The value of the variable as a structure that specifies a data set content version.</p>"
},
"outputFileUriValue":{
"shape":"OutputFileUriValue",
"documentation":"<p>The value of the variable as a structure that specifies an output file URI.</p>"
}
},
"documentation":"<p>An instance of a variable to be passed to the \"containerAction\" execution. Each variable must have a name and a value given by one of \"stringValue\", \"datasetContentVersionValue\", or \"outputFileUriValue\".</p>"
"documentation":"<p>AWS IoT Analytics allows you to collect large amounts of device data, process messages, and store them. You can then query the data and run sophisticated analytics on it. AWS IoT Analytics enables advanced data exploration through integration with Jupyter Notebooks and data visualization through integration with Amazon QuickSight.</p> <p>Traditional analytics and business intelligence tools are designed to process structured data. IoT data often comes from devices that record noisy processes (such as temperature, motion, or sound). As a result the data from these devices can have significant gaps, corrupted messages, and false readings that must be cleaned up before analysis can occur. Also, IoT data is often only meaningful in the context of other data from external sources. </p> <p>AWS IoT Analytics automates the steps required to analyze data from IoT devices. AWS IoT Analytics filters, transforms, and enriches IoT data before storing it in a time-series data store for analysis. You can set up the service to collect only the data you need from your devices, apply mathematical transforms to process the data, and enrich the data with device-specific metadata such as device type and location before storing it. Then, you can analyze your data by running queries using the built-in SQL query engine, or perform more complex analytics and machine learning inference. AWS IoT Analytics includes pre-built models for common IoT use cases so you can answer questions like which devices are about to fail or which customers are at risk of abandoning their wearable devices.</p>"