Class DataSource
- java.lang.Object
-
- software.amazon.awssdk.services.personalize.model.DataSource
-
- All Implemented Interfaces:
Serializable, SdkPojo, ToCopyableBuilder<DataSource.Builder,DataSource>
@Generated("software.amazon.awssdk:codegen") public final class DataSource extends Object implements SdkPojo, Serializable, ToCopyableBuilder<DataSource.Builder,DataSource>
Describes the data source that contains the data to upload to a dataset, or the list of records to delete from Amazon Personalize.
- See Also:
- Serialized Form
-
-
Nested Class Summary
Nested Classes
- static interface DataSource.Builder
-
Method Summary
All Methods | Static Methods | Instance Methods | Concrete Methods
- static DataSource.Builder builder()
- String dataLocation() : For dataset import jobs, the path to the Amazon S3 bucket where the data that you want to upload to your dataset is stored.
- boolean equals(Object obj)
- boolean equalsBySdkFields(Object obj)
- <T> Optional<T> getValueForField(String fieldName, Class<T> clazz)
- int hashCode()
- Map<String,SdkField<?>> sdkFieldNameToField()
- List<SdkField<?>> sdkFields()
- static Class<? extends DataSource.Builder> serializableBuilderClass()
- DataSource.Builder toBuilder()
- String toString() : Returns a string representation of this object.
-
Methods inherited from class java.lang.Object
clone, finalize, getClass, notify, notifyAll, wait, wait, wait
-
Methods inherited from interface software.amazon.awssdk.utils.builder.ToCopyableBuilder
copy
-
Method Detail
-
dataLocation
public final String dataLocation()
For dataset import jobs, the path to the Amazon S3 bucket where the data that you want to upload to your dataset is stored. For data deletion jobs, the path to the Amazon S3 bucket that stores the list of records to delete.
For example:
s3://bucket-name/folder-name/fileName.csv
If your CSV files are in a folder in your Amazon S3 bucket and you want your import job or data deletion job to consider multiple files, you can specify the path to the folder. With a data deletion job, Amazon Personalize uses all files in the folder and any subfolders. Use the following syntax with a / after the folder name: s3://bucket-name/folder-name/
- Returns:
- For dataset import jobs, the path to the Amazon S3 bucket where the data that you want to upload to your
dataset is stored. For data deletion jobs, the path to the Amazon S3 bucket that stores the list of
records to delete.
For example:
s3://bucket-name/folder-name/fileName.csv
If your CSV files are in a folder in your Amazon S3 bucket and you want your import job or data deletion job to consider multiple files, you can specify the path to the folder. With a data deletion job, Amazon Personalize uses all files in the folder and any subfolders. Use the following syntax with a / after the folder name: s3://bucket-name/folder-name/
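The path syntax above can be sketched with the class's builder. This is a minimal sketch, assuming the AWS SDK for Java 2.x Personalize module is on the classpath; the bucket and folder names are placeholders, not real resources.

```java
import software.amazon.awssdk.services.personalize.model.DataSource;

public class DataLocationExample {
    public static void main(String[] args) {
        // Placeholder bucket/folder names. The trailing "/" tells an import
        // or data deletion job to consider every file in the folder.
        DataSource folderSource = DataSource.builder()
                .dataLocation("s3://bucket-name/folder-name/")
                .build();

        // A single-file location ends with the object key instead.
        DataSource fileSource = DataSource.builder()
                .dataLocation("s3://bucket-name/folder-name/fileName.csv")
                .build();

        System.out.println(folderSource.dataLocation());
        System.out.println(fileSource.dataLocation());
    }
}
```

A `DataSource` built this way is typically passed to a dataset import job or data deletion job request rather than used on its own.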
-
toBuilder
public DataSource.Builder toBuilder()
- Specified by:
toBuilder in interface ToCopyableBuilder<DataSource.Builder,DataSource>
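Because `DataSource` is immutable, `toBuilder()` is the way to derive a modified copy. A minimal sketch, assuming the Personalize module is on the classpath; the S3 paths are placeholders.

```java
import software.amazon.awssdk.services.personalize.model.DataSource;

public class ToBuilderExample {
    public static void main(String[] args) {
        // Placeholder path for an import job's source folder.
        DataSource original = DataSource.builder()
                .dataLocation("s3://bucket-name/import/")
                .build();

        // toBuilder() copies the current field values into a new builder,
        // so a modified copy can be made without mutating the original.
        DataSource modified = original.toBuilder()
                .dataLocation("s3://bucket-name/delete/")
                .build();

        System.out.println(original.dataLocation()); // unchanged
        System.out.println(modified.dataLocation());
    }
}
```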
-
builder
public static DataSource.Builder builder()
-
serializableBuilderClass
public static Class<? extends DataSource.Builder> serializableBuilderClass()
-
equalsBySdkFields
public final boolean equalsBySdkFields(Object obj)
- Specified by:
equalsBySdkFields in interface SdkPojo
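`equalsBySdkFields` compares two objects by their modeled SDK fields only. A minimal sketch, assuming the Personalize module is on the classpath; the path is a placeholder.

```java
import software.amazon.awssdk.services.personalize.model.DataSource;

public class EqualsBySdkFieldsExample {
    public static void main(String[] args) {
        // Two independently built instances with the same field values.
        DataSource a = DataSource.builder()
                .dataLocation("s3://bucket-name/folder-name/")
                .build();
        DataSource b = DataSource.builder()
                .dataLocation("s3://bucket-name/folder-name/")
                .build();

        // True when every modeled SDK field matches.
        System.out.println(a.equalsBySdkFields(b));
    }
}
```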
-
toString
public final String toString()
Returns a string representation of this object. This is useful for testing and debugging. Sensitive data will be redacted from this string using a placeholder value.
-
sdkFieldNameToField
public final Map<String,SdkField<?>> sdkFieldNameToField()
- Specified by:
sdkFieldNameToField in interface SdkPojo
-
-