Class CatalogV2Util
Object
org.apache.spark.sql.connector.catalog.CatalogV2Util
-
Constructor Summary
Constructors -
Method Summary
- static Map&lt;String, String&gt; applyClusterByChanges(Map&lt;String, String&gt; properties, StructType schema, scala.collection.immutable.Seq&lt;TableChange&gt; changes): Apply ClusterBy changes to a Java map and return the result.
- static Transform[] applyClusterByChanges(Transform[] partitioning, StructType schema, scala.collection.immutable.Seq&lt;TableChange&gt; changes): Apply ClusterBy changes to the partitioning transforms and return the result.
- static scala.collection.immutable.Map&lt;String, String&gt; applyClusterByChanges(scala.collection.immutable.Map&lt;String, String&gt; properties, StructType schema, scala.collection.immutable.Seq&lt;TableChange&gt; changes): Apply ClusterBy changes to a map and return the result.
- static Map&lt;String, String&gt; applyNamespaceChanges(Map&lt;String, String&gt; properties, scala.collection.immutable.Seq&lt;NamespaceChange&gt; changes): Apply properties changes to a Java map and return the result.
- static scala.collection.immutable.Map&lt;String, String&gt; applyNamespaceChanges(scala.collection.immutable.Map&lt;String, String&gt; properties, scala.collection.immutable.Seq&lt;NamespaceChange&gt; changes): Apply properties changes to a map and return the result.
- static Map&lt;String, String&gt; applyPropertiesChanges(Map&lt;String, String&gt; properties, scala.collection.immutable.Seq&lt;TableChange&gt; changes): Apply properties changes to a Java map and return the result.
- static scala.collection.immutable.Map&lt;String, String&gt; applyPropertiesChanges(scala.collection.immutable.Map&lt;String, String&gt; properties, scala.collection.immutable.Seq&lt;TableChange&gt; changes): Apply properties changes to a map and return the result.
- static StructType applySchemaChanges(StructType schema, scala.collection.immutable.Seq&lt;TableChange&gt; changes, scala.Option&lt;String&gt; tableProvider, String statementType): Apply schema changes to a schema and return the result.
- static Constraint[] collectConstraintChanges(Table table, scala.collection.immutable.Seq&lt;TableChange&gt; changes): Extracts and validates table constraints from a sequence of table changes.
- convertTableProperties(org.apache.spark.sql.catalyst.plans.logical.TableSpec t)
- static scala.collection.immutable.Map&lt;String, String&gt; convertToProperties(scala.Option&lt;org.apache.spark.sql.catalyst.plans.logical.SerdeInfo&gt; serdeInfo): Converts Hive Serde info to table properties.
- static Table getTable(CatalogPlugin catalog, Identifier ident, scala.Option&lt;org.apache.spark.sql.catalyst.analysis.TimeTravelSpec&gt; timeTravelSpec, scala.Option&lt;String&gt; writePrivilegesString)
- static TableCatalog getTableProviderCatalog(SupportsCatalogOptions provider, org.apache.spark.sql.connector.catalog.CatalogManager catalogManager, CaseInsensitiveStringMap options)
- static boolean isSameTable(org.apache.spark.sql.execution.datasources.v2.DataSourceV2Relation rel, CatalogPlugin catalog, Identifier ident, Table table)
- static boolean isSessionCatalog(CatalogPlugin catalog)
- static scala.Option&lt;UnboundFunction&gt; loadFunction(CatalogPlugin catalog, Identifier ident)
- static scala.Option&lt;org.apache.spark.sql.catalyst.analysis.NamedRelation&gt; loadRelation(CatalogPlugin catalog, Identifier ident)
- static scala.Option&lt;Table&gt; loadTable(CatalogPlugin catalog, Identifier ident, scala.Option&lt;org.apache.spark.sql.catalyst.analysis.TimeTravelSpec&gt; timeTravelSpec, scala.Option&lt;String&gt; writePrivilegesString)
- static scala.Option&lt;org.apache.spark.sql.execution.datasources.v2.DataSourceV2Relation&gt; lookupCachedRelation(org.apache.spark.sql.catalyst.analysis.RelationCache cache, CatalogPlugin catalog, Identifier ident, Table table, org.apache.spark.sql.internal.SQLConf conf)
- static scala.collection.immutable.Seq&lt;String&gt; NAMESPACE_RESERVED_PROPERTIES: The list of reserved namespace properties, which cannot be removed or changed directly by the syntax {{ ALTER NAMESPACE ... }}.
- static scala.collection.immutable.Seq&lt;String&gt; searchPathForTableIdentifier(TableCatalog catalog, Identifier ident): Search path for analysis errors for a V2 table identifier: catalog name, then either the identifier's namespace (if non-empty) or the catalog's default namespace.
- static Column[] structTypeToV2Columns(StructType schema): Converts a StructType to DS v2 columns, decoding the StructField metadata to a v2 column comment and default value or generation expression.
- static scala.collection.immutable.Seq&lt;String&gt; TABLE_RESERVED_PROPERTIES: The list of reserved table properties, which cannot be removed or changed directly by the syntax {{ ALTER TABLE ... }}.
- static StructType toStructType(scala.collection.immutable.Seq&lt;MetadataColumn&gt; cols)
- static StructType v2ColumnsToStructType(Column[] columns)
- static StructType v2ColumnsToStructType(scala.collection.immutable.Seq&lt;Column&gt; columns): Converts DS v2 columns to StructType, encoding column comment and default value into StructField metadata.
- withDefaultOwnership(scala.collection.immutable.Map&lt;String, String&gt; properties)
-
Constructor Details
-
CatalogV2Util
public CatalogV2Util()
-
-
Method Details
-
TABLE_RESERVED_PROPERTIES
public static scala.collection.immutable.Seq<String> TABLE_RESERVED_PROPERTIES()
The list of reserved table properties, which cannot be removed or changed directly by the syntax {{ ALTER TABLE ... SET TBLPROPERTIES ... }}. They require dedicated syntax to be modified.
- Returns:
- (undocumented)
-
NAMESPACE_RESERVED_PROPERTIES
public static scala.collection.immutable.Seq<String> NAMESPACE_RESERVED_PROPERTIES()
The list of reserved namespace properties, which cannot be removed or changed directly by the syntax {{ ALTER NAMESPACE ... SET PROPERTIES ... }}. They require dedicated syntax to be modified.
- Returns:
- (undocumented)
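Because the reserved lists are plain sequences of property keys, a catalog implementation typically validates user-supplied properties against them before applying a change. A minimal self-contained sketch of that check (the key names and the `ReservedProps` helper below are illustrative assumptions, not Spark's actual reserved list or API):

```java
import java.util.Set;

// Toy model: reject direct changes to reserved keys, mirroring how
// CatalogV2Util treats TABLE_RESERVED_PROPERTIES. The key names here
// are assumptions for illustration only.
public class ReservedProps {
    static final Set<String> RESERVED = Set.of("provider", "location", "owner");

    static void checkNotReserved(String key) {
        if (RESERVED.contains(key)) {
            throw new IllegalArgumentException(
                "Reserved table property '" + key + "' requires dedicated syntax");
        }
    }

    public static void main(String[] args) {
        checkNotReserved("my.custom.key");   // not reserved: passes silently
        try {
            checkNotReserved("location");    // reserved: rejected
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: location");
        }
    }
}
```

This is why, for example, the table location is changed with `ALTER TABLE ... SET LOCATION` rather than through `SET TBLPROPERTIES`.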
-
applyNamespaceChanges
public static scala.collection.immutable.Map<String,String> applyNamespaceChanges(scala.collection.immutable.Map<String, String> properties, scala.collection.immutable.Seq<NamespaceChange> changes)
Apply properties changes to a map and return the result.
- Parameters:
- properties - (undocumented)
- changes - (undocumented)
- Returns:
- (undocumented)
-
applyNamespaceChanges
public static Map<String,String> applyNamespaceChanges(Map<String, String> properties, scala.collection.immutable.Seq<NamespaceChange> changes)
Apply properties changes to a Java map and return the result.
- Parameters:
- properties - (undocumented)
- changes - (undocumented)
- Returns:
- (undocumented)
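All of the `apply*Changes` property overloads share the same shape: copy the input map, then fold each change into the copy. A simplified stand-alone model of that behavior (`SetProp`/`RemoveProp` are stand-ins for Spark's set-property and remove-property change types, not the real classes):

```java
import java.util.*;

// Minimal sketch of the apply*Changes semantics: each change either sets
// or removes one key; the input map is copied, never mutated in place.
public class ApplyChanges {
    interface Change {}
    record SetProp(String key, String value) implements Change {}
    record RemoveProp(String key) implements Change {}

    static Map<String, String> apply(Map<String, String> props, List<Change> changes) {
        Map<String, String> out = new HashMap<>(props);
        for (Change c : changes) {
            if (c instanceof SetProp s) out.put(s.key(), s.value());
            else if (c instanceof RemoveProp r) out.remove(r.key());
        }
        return out;
    }
}
```

Returning a fresh map keeps the helper safe to call on shared, possibly immutable property maps, which matches the "return the result" phrasing above.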
-
applyPropertiesChanges
public static scala.collection.immutable.Map<String,String> applyPropertiesChanges(scala.collection.immutable.Map<String, String> properties, scala.collection.immutable.Seq<TableChange> changes)
Apply properties changes to a map and return the result.
- Parameters:
- properties - (undocumented)
- changes - (undocumented)
- Returns:
- (undocumented)
-
applyPropertiesChanges
public static Map<String,String> applyPropertiesChanges(Map<String, String> properties, scala.collection.immutable.Seq<TableChange> changes)
Apply properties changes to a Java map and return the result.
- Parameters:
- properties - (undocumented)
- changes - (undocumented)
- Returns:
- (undocumented)
-
applyClusterByChanges
public static scala.collection.immutable.Map<String,String> applyClusterByChanges(scala.collection.immutable.Map<String, String> properties, StructType schema, scala.collection.immutable.Seq<TableChange> changes)
Apply ClusterBy changes to a map and return the result.
- Parameters:
- properties - (undocumented)
- schema - (undocumented)
- changes - (undocumented)
- Returns:
- (undocumented)
-
applyClusterByChanges
public static Map<String,String> applyClusterByChanges(Map<String, String> properties, StructType schema, scala.collection.immutable.Seq<TableChange> changes)
Apply ClusterBy changes to a Java map and return the result.
- Parameters:
- properties - (undocumented)
- schema - (undocumented)
- changes - (undocumented)
- Returns:
- (undocumented)
-
applyClusterByChanges
public static Transform[] applyClusterByChanges(Transform[] partitioning, StructType schema, scala.collection.immutable.Seq<TableChange> changes)
Apply ClusterBy changes to the partitioning transforms and return the result.
- Parameters:
- partitioning - (undocumented)
- schema - (undocumented)
- changes - (undocumented)
- Returns:
- (undocumented)
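Conceptually, applying a ClusterBy change replaces any existing cluster-by entry in the partitioning with the new clustering columns. A toy model of that replacement (plain strings stand in for `Transform` objects; this is not Spark's internal representation):

```java
import java.util.*;

// Illustrative sketch only: drop the old cluster-by entry, then append
// one built from the new clustering columns. An empty column list
// simply removes clustering.
public class ClusterBySketch {
    static List<String> applyClusterBy(List<String> partitioning, List<String> clusterCols) {
        List<String> out = new ArrayList<>();
        for (String t : partitioning) {
            if (!t.startsWith("cluster_by")) out.add(t);  // keep non-cluster-by transforms
        }
        if (!clusterCols.isEmpty()) {
            out.add("cluster_by(" + String.join(", ", clusterCols) + ")");
        }
        return out;
    }
}
```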
-
applySchemaChanges
public static StructType applySchemaChanges(StructType schema, scala.collection.immutable.Seq<TableChange> changes, scala.Option<String> tableProvider, String statementType)
Apply schema changes to a schema and return the result.
- Parameters:
- schema - (undocumented)
- changes - (undocumented)
- tableProvider - (undocumented)
- statementType - (undocumented)
- Returns:
- (undocumented)
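`applySchemaChanges` folds `TableChange` entries such as add-column and rename-column into the schema. A stripped-down model over a name-to-type map shows the idea (the real method operates on `StructType` and supports many more change kinds, nested fields, and validation against the table provider):

```java
import java.util.*;

// Sketch of schema-change application on a simplified schema
// (column name -> type string). Each operation copies the schema,
// as the real helper returns a new StructType.
public class SchemaChanges {
    static LinkedHashMap<String, String> addColumn(LinkedHashMap<String, String> schema,
                                                   String name, String type) {
        LinkedHashMap<String, String> out = new LinkedHashMap<>(schema);
        if (out.putIfAbsent(name, type) != null)
            throw new IllegalArgumentException("Column already exists: " + name);
        return out;
    }

    static LinkedHashMap<String, String> renameColumn(LinkedHashMap<String, String> schema,
                                                      String from, String to) {
        LinkedHashMap<String, String> out = new LinkedHashMap<>();
        for (var e : schema.entrySet())
            out.put(e.getKey().equals(from) ? to : e.getKey(), e.getValue());
        return out;
    }
}
```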
-
collectConstraintChanges
public static Constraint[] collectConstraintChanges(Table table, scala.collection.immutable.Seq<TableChange> changes)
Extracts and validates table constraints from a sequence of table changes.
- Parameters:
- table - (undocumented)
- changes - (undocumented)
- Returns:
- (undocumented)
-
loadTable
public static scala.Option<Table> loadTable(CatalogPlugin catalog, Identifier ident, scala.Option<org.apache.spark.sql.catalyst.analysis.TimeTravelSpec> timeTravelSpec, scala.Option<String> writePrivilegesString)
-
getTable
public static Table getTable(CatalogPlugin catalog, Identifier ident, scala.Option<org.apache.spark.sql.catalyst.analysis.TimeTravelSpec> timeTravelSpec, scala.Option<String> writePrivilegesString)
-
searchPathForTableIdentifier
public static scala.collection.immutable.Seq<String> searchPathForTableIdentifier(TableCatalog catalog, Identifier ident)
Search path for analysis errors for a V2 table identifier: catalog name, then either the identifier's namespace (if non-empty) or the catalog's default namespace.
- Parameters:
- catalog - (undocumented)
- ident - (undocumented)
- Returns:
- (undocumented)
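The documented rule translates directly into code; a self-contained restatement over plain strings (simplified stand-ins for the catalog and identifier objects):

```java
import java.util.*;

// Search path per the rule above: catalog name first, then the
// identifier's namespace if it is non-empty, otherwise the catalog's
// default namespace.
public class SearchPath {
    static List<String> searchPath(String catalogName, List<String> identNamespace,
                                   List<String> defaultNamespace) {
        List<String> out = new ArrayList<>();
        out.add(catalogName);
        out.addAll(identNamespace.isEmpty() ? defaultNamespace : identNamespace);
        return out;
    }
}
```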
-
loadFunction
public static scala.Option<UnboundFunction> loadFunction(CatalogPlugin catalog, Identifier ident)
-
loadRelation
public static scala.Option<org.apache.spark.sql.catalyst.analysis.NamedRelation> loadRelation(CatalogPlugin catalog, Identifier ident)
-
isSameTable
public static boolean isSameTable(org.apache.spark.sql.execution.datasources.v2.DataSourceV2Relation rel, CatalogPlugin catalog, Identifier ident, Table table)
-
lookupCachedRelation
public static scala.Option<org.apache.spark.sql.execution.datasources.v2.DataSourceV2Relation> lookupCachedRelation(org.apache.spark.sql.catalyst.analysis.RelationCache cache, CatalogPlugin catalog, Identifier ident, Table table, org.apache.spark.sql.internal.SQLConf conf)
-
isSessionCatalog
public static boolean isSessionCatalog(CatalogPlugin catalog)
-
convertTableProperties
-
convertToProperties
public static scala.collection.immutable.Map<String,String> convertToProperties(scala.Option<org.apache.spark.sql.catalyst.plans.logical.SerdeInfo> serdeInfo)
Converts Hive Serde info to table properties. The mapped property keys are:
- INPUTFORMAT/OUTPUTFORMAT: hive.input/output-format
- STORED AS: hive.stored-as
- ROW FORMAT SERDE: hive.serde
- SERDEPROPERTIES: add the "option." prefix
- Parameters:
- serdeInfo - (undocumented)
- Returns:
- (undocumented)
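The key mapping above can be sketched as a plain transformation (the `SerdeInfo` record here is a simplified stand-in for Spark's class, with `null` meaning the clause was absent):

```java
import java.util.*;

// Sketch of the documented serde-to-properties key mapping. Only the
// keys named in the doc are produced; absent clauses are skipped.
public class SerdeProps {
    record SerdeInfo(String inputFormat, String outputFormat, String storedAs,
                     String serde, Map<String, String> serdeProperties) {}

    static Map<String, String> toProperties(SerdeInfo info) {
        Map<String, String> props = new HashMap<>();
        if (info.inputFormat() != null) props.put("hive.input-format", info.inputFormat());
        if (info.outputFormat() != null) props.put("hive.output-format", info.outputFormat());
        if (info.storedAs() != null) props.put("hive.stored-as", info.storedAs());
        if (info.serde() != null) props.put("hive.serde", info.serde());
        // SERDEPROPERTIES entries get the "option." prefix.
        info.serdeProperties().forEach((k, v) -> props.put("option." + k, v));
        return props;
    }
}
```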
-
withDefaultOwnership
-
getTableProviderCatalog
public static TableCatalog getTableProviderCatalog(SupportsCatalogOptions provider, org.apache.spark.sql.connector.catalog.CatalogManager catalogManager, CaseInsensitiveStringMap options)
-
toStructType
public static StructType toStructType(scala.collection.immutable.Seq<MetadataColumn> cols)
-
v2ColumnsToStructType
public static StructType v2ColumnsToStructType(Column[] columns)
-
v2ColumnsToStructType
public static StructType v2ColumnsToStructType(scala.collection.immutable.Seq<Column> columns)
Converts DS v2 columns to StructType, encoding column comment and default value into StructField metadata. This is mainly used to define the schema of a v2 scan, w.r.t. the columns of the v2 table.
- Parameters:
- columns - (undocumented)
- Returns:
- (undocumented)
-
structTypeToV2Columns
public static Column[] structTypeToV2Columns(StructType schema)
Converts a StructType to DS v2 columns, decoding the StructField metadata into a v2 column comment and default value or generation expression. This is mainly used to generate DS v2 columns from the table schema in DDL commands, so that Spark can pass DS v2 columns to the DS v2 createTable and related APIs.
- Parameters:
- schema - (undocumented)
- Returns:
- (undocumented)
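The two conversions above are designed to round-trip: a column's comment (and default value) is encoded into field metadata by one direction and decoded back by the other. A toy illustration with simplified stand-ins for `StructField` and `Column` (these records are not the Spark types):

```java
import java.util.*;

// Toy round trip: the comment travels through a metadata map on the
// field, mirroring how the real conversions use StructField metadata.
public class ColumnRoundTrip {
    record Column(String name, String type, String comment) {}
    record Field(String name, String type, Map<String, String> metadata) {}

    static Field toField(Column c) {
        Map<String, String> meta = new HashMap<>();
        if (c.comment() != null) meta.put("comment", c.comment());
        return new Field(c.name(), c.type(), meta);
    }

    static Column toColumn(Field f) {
        return new Column(f.name(), f.type(), f.metadata().get("comment"));
    }
}
```

Keeping the metadata encoding symmetric is what lets DDL commands build v2 columns from a schema and scans rebuild the schema from v2 columns without losing the comment or default.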
-