public interface V1WriteBuilder extends WriteBuilder
Other mix-ins of the WriteBuilder interface, such as SupportsOverwrite and SupportsTruncate,
should be extended as well to support operations other than data appends.
This interface is designed to give Spark DataSources time to migrate to DataSource V2 and will be removed in a future Spark release.
| Modifier and Type | Method and Description |
|---|---|
| `BatchWrite` | `buildForBatch()` Returns a `BatchWrite` to write data to a batch source. |
| `StreamingWrite` | `buildForStreaming()` Returns a `StreamingWrite` to write data to a streaming source. |
| `InsertableRelation` | `buildForV1Write()` Creates an `InsertableRelation` that allows appending a DataFrame to a destination (using data source-specific parameters). |
Methods inherited from interface WriteBuilder: `withInputDataSchema`, `withQueryId`

`BatchWrite buildForBatch()`

Returns a `BatchWrite` to write data to a batch source. By default this method throws an exception; data sources must override it to provide an implementation if the `Table` that creates this write reports `TableCapability.BATCH_WRITE` support in its `Table.capabilities()`.

Specified by: `buildForBatch` in interface `WriteBuilder`

`StreamingWrite buildForStreaming()`

Returns a `StreamingWrite` to write data to a streaming source. By default this method throws an exception; data sources must override it to provide an implementation if the `Table` that creates this write reports `TableCapability.STREAMING_WRITE` support in its `Table.capabilities()`.

Specified by: `buildForStreaming` in interface `WriteBuilder`

`InsertableRelation buildForV1Write()`

Creates an `InsertableRelation` that allows appending a DataFrame to a destination (using data source-specific parameters). The `insert` method will only be called with `overwrite=false`. The DataSource should implement the overwrite behavior as part of the `SupportsOverwrite` and `SupportsTruncate` interfaces.
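As a minimal sketch of how `buildForV1Write()` is typically implemented: the builder hands back an anonymous `InsertableRelation` whose `insert` method performs the append. The class name `MyV1WriteBuilder`, the constructor parameter, and the partition-writing body below are illustrative assumptions, not part of the Spark API; only `V1WriteBuilder`, `InsertableRelation`, and their method signatures come from Spark (this sketch requires the Spark SQL dependency to compile).

```scala
import org.apache.spark.sql.{DataFrame, Row}
import org.apache.spark.sql.connector.write.V1WriteBuilder
import org.apache.spark.sql.sources.InsertableRelation

// Hypothetical builder for a V1 data source migrating to the V2 write path.
class MyV1WriteBuilder(options: Map[String, String]) extends V1WriteBuilder {

  override def buildForV1Write(): InsertableRelation = new InsertableRelation {
    override def insert(data: DataFrame, overwrite: Boolean): Unit = {
      // Spark only invokes this with overwrite = false; overwrite and
      // truncate semantics belong in SupportsOverwrite / SupportsTruncate.
      data.foreachPartition { rows: Iterator[Row] =>
        // Append each partition to the external system (sketch only).
        rows.foreach { _ => /* write row via source-specific client */ }
      }
    }
  }
}
```

Because `V1WriteBuilder` already provides default implementations of `buildForBatch()` and `buildForStreaming()` that delegate to the V1 path, a migrating source usually only needs to supply `buildForV1Write()` as above.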