  • package spark

    Core Spark functionality. org.apache.spark.SparkContext serves as the main entry point to Spark, while org.apache.spark.rdd.RDD is the data type representing a distributed collection, and provides most parallel operations.

    In addition, org.apache.spark.rdd.PairRDDFunctions contains operations available only on RDDs of key-value pairs, such as groupByKey and join; org.apache.spark.rdd.DoubleRDDFunctions contains operations available only on RDDs of Doubles; and org.apache.spark.rdd.SequenceFileRDDFunctions contains operations available on RDDs that can be saved as SequenceFiles. These operations are automatically available on any RDD of the right type (e.g. RDD[(Int, Int)]) through implicit conversions.

    Java programmers should reference the org.apache.spark.api.java package for Spark programming APIs in Java.

    Classes and methods marked with Experimental are user-facing features which have not been officially adopted by the Spark project. These are subject to change or removal in minor releases.

    Classes and methods marked with Developer API are intended for advanced users who want to extend Spark through lower-level interfaces. These are subject to change or removal in minor releases.

  • package sql

    Allows the execution of relational queries, including those expressed in SQL using Spark.

  • package connector
  • package catalog
  • package constraints
  • Check
  • Constraint
  • ForeignKey
  • PrimaryKey
  • Unique
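
The implicit-conversion pattern mentioned in the spark package overview above (extra operations appearing automatically on any RDD of the right element type) can be sketched in plain Scala. The simplified local types below are illustrative only, not Spark's actual classes:

```scala
// Sketch of Spark's enrichment pattern (hypothetical simplified types):
// extra operations become available on any collection of the right element
// type via an implicit class, the way PairRDDFunctions adds groupByKey
// and join to RDD[(K, V)].
implicit class LocalPairOps[K, V](self: Seq[(K, V)]) {
  // Local stand-in for PairRDDFunctions.groupByKey
  def groupByKeyLocal: Map[K, Seq[V]] =
    self.groupBy(_._1).map { case (k, kvs) => (k, kvs.map(_._2)) }
}
```

With this in scope, `Seq(("a", 1), ("b", 2), ("a", 3)).groupByKeyLocal` yields a map from each key to its values, mirroring how `groupByKey` becomes callable on an `RDD[(K, V)]` without being declared on `RDD` itself.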

class Check extends BaseConstraint

A CHECK constraint.

A CHECK constraint defines a condition each row in a table must satisfy. Connectors can define such constraints either in SQL (Spark SQL dialect) or using a predicate if the condition can be expressed using a supported expression. A CHECK constraint can reference one or more columns. Such a constraint is considered violated if its condition evaluates to FALSE, but not NULL. The search condition must be deterministic and cannot contain subqueries or certain functions such as aggregates or UDFs.

Spark supports enforced and not enforced CHECK constraints, allowing connectors to control whether data modifications that violate the constraint must fail. Each constraint is either valid (the existing data is guaranteed to satisfy the constraint), invalid (some records violate the constraint), or unvalidated (the validity is unknown). If the validity is unknown, Spark will check rely() to see whether the constraint is believed to be true and can be used for query optimization.
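
The violation rule above (FALSE violates, NULL does not) follows SQL's three-valued logic. A minimal sketch, using `Option[Boolean]` to stand in for a nullable boolean result; this is illustrative, not Spark's implementation:

```scala
// SQL three-valued logic for CHECK constraints: a row violates the
// constraint only if the condition evaluates to FALSE; an unknown (NULL)
// result is NOT a violation.
def violatesCheck(condition: Option[Boolean]): Boolean =
  condition.contains(false)
```

A row where the condition evaluates to NULL (e.g. because a referenced column is NULL) therefore passes the check, which is why search conditions in SQL often carry explicit `IS NOT NULL` guards.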

Annotations
@Evolving()
Source
Check.java
Since
4.1.0

Linear Supertypes
BaseConstraint, Constraint, AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##: Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.CloneNotSupportedException]) @IntrinsicCandidate() @native()
  6. def definition(): String
    Attributes
    protected[constraints]
    Definition Classes
    Check → BaseConstraint
    Annotations
    @Override()
  7. def enforced(): Boolean

    Indicates whether this constraint is actively enforced. If enforced, data modifications that violate the constraint fail with a constraint violation error.

    Definition Classes
    BaseConstraint → Constraint
    Annotations
    @Override()
  8. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  9. def equals(other: AnyRef): Boolean
    Definition Classes
    Check → AnyRef → Any
    Annotations
    @Override()
  10. final def getClass(): Class[_ <: AnyRef]
    Definition Classes
    AnyRef → Any
    Annotations
    @IntrinsicCandidate() @native()
  11. def hashCode(): Int
    Definition Classes
    Check → AnyRef → Any
    Annotations
    @Override()
  12. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  13. def name(): String

    Returns the name of this constraint.

    Definition Classes
    BaseConstraint → Constraint
    Annotations
    @Override()
  14. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  15. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @IntrinsicCandidate() @native()
  16. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @IntrinsicCandidate() @native()
  17. def predicate(): Predicate

    Returns the search condition.

  18. def predicateSql(): String

    Returns the SQL representation of the search condition (Spark SQL dialect).

  19. def rely(): Boolean

    Indicates whether this constraint is assumed to hold true if the validity is unknown. Unlike the validation status, this flag is usually provided by the user as a hint to the system.

    Definition Classes
    BaseConstraint → Constraint
    Annotations
    @Override()
  20. final def synchronized[T0](arg0: => T0): T0
    Definition Classes
    AnyRef
  21. def toDDL(columns: Array[NamedReference]): String
    Attributes
    protected[constraints]
    Definition Classes
    BaseConstraint
  22. def toDDL(): String

    Returns the definition of this constraint in the DDL format.

    Definition Classes
    BaseConstraint → Constraint
    Annotations
    @Override()
  23. def toDescription(): String

    Returns the constraint description for DESCRIBE TABLE output, excluding the constraint name (shown separately).

    Definition Classes
    BaseConstraint → Constraint
  24. def toString(): String
    Definition Classes
    BaseConstraint → AnyRef → Any
    Annotations
    @Override()
  25. def validationStatus(): ValidationStatus

    Indicates whether the existing data in the table satisfies this constraint. The constraint can be valid (the data is guaranteed to satisfy the constraint), invalid (some records violate the constraint), or unvalidated (the validity is unknown). The validation status is usually managed by the system and can't be modified by the user.

    Definition Classes
    BaseConstraint → Constraint
    Annotations
    @Override()
  26. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  27. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException]) @native()
  28. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
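
The interplay between validationStatus() and rely() documented in the members above can be sketched as a small decision function. This is a sketch of the documented semantics only, not Spark's optimizer code; the type and function names below are hypothetical:

```scala
// Hypothetical stand-in for Spark's ValidationStatus values.
sealed trait ValidationStatus
case object Valid extends ValidationStatus        // existing data satisfies the constraint
case object Invalid extends ValidationStatus      // some records violate the constraint
case object Unvalidated extends ValidationStatus  // validity is unknown

// May the optimizer assume the constraint holds? Per the class docs, an
// unvalidated constraint is trusted only if the user set the RELY hint.
def usableForOptimization(status: ValidationStatus, rely: Boolean): Boolean =
  status match {
    case Valid       => true
    case Invalid     => false
    case Unvalidated => rely
  }
```

Note that rely() only matters in the Unvalidated case: a known-invalid constraint is never usable, and a validated one needs no hint.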

Deprecated Value Members

  1. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.Throwable]) @Deprecated
    Deprecated

    (Since version 9)
