Packages

  • package root
    Definition Classes
    root
  • package org
    Definition Classes
    root
  • package apache
    Definition Classes
    org
  • package spark

    Core Spark functionality. org.apache.spark.SparkContext serves as the main entry point to Spark, while org.apache.spark.rdd.RDD is the data type representing a distributed collection, and provides most parallel operations.

    In addition, org.apache.spark.rdd.PairRDDFunctions contains operations available only on RDDs of key-value pairs, such as groupByKey and join; org.apache.spark.rdd.DoubleRDDFunctions contains operations available only on RDDs of Doubles; and org.apache.spark.rdd.SequenceFileRDDFunctions contains operations available on RDDs that can be saved as SequenceFiles. These operations are automatically available on any RDD of the right type (e.g. RDD[(Int, Int)]) through implicit conversions.

    Java programmers should reference the org.apache.spark.api.java package for Spark programming APIs in Java.

    Classes and methods marked with Experimental are user-facing features which have not been officially adopted by the Spark project. These are subject to change or removal in minor releases.

    Classes and methods marked with Developer API are intended for advanced users who want to extend Spark through lower-level interfaces. These are subject to change or removal in minor releases.

    Definition Classes
    apache
  • package sql

    Allows the execution of relational queries, including those expressed in SQL using Spark.

    Definition Classes
    spark
  • package connector
    Definition Classes
    sql
  • package catalog
    Definition Classes
    connector
  • package constraints
    Definition Classes
    catalog
  • Check
  • Constraint
  • ForeignKey
  • PrimaryKey
  • Unique

class Unique extends BaseConstraint

A UNIQUE constraint.

A UNIQUE constraint specifies one or more columns as unique columns. Such a constraint is satisfied if and only if no two rows in a table have the same non-null values in the unique columns.
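The satisfaction rule above can be sketched in plain Java, independent of Spark. This is a minimal, hypothetical illustration of the documented semantics (rows with a null in a unique column never conflict), not Spark's implementation; the class and method names are invented for the example.

```java
import java.util.*;

// Hypothetical sketch: a table is a list of rows, a row a list of column
// values, and the UNIQUE constraint holds iff no two rows share the same
// fully non-null values in the key columns.
class UniqueCheck {
    /** Returns true iff no two rows have the same non-null values in keyCols. */
    static boolean satisfiesUnique(List<List<Object>> rows, int[] keyCols) {
        Set<List<Object>> seen = new HashSet<>();
        for (List<Object> row : rows) {
            List<Object> key = new ArrayList<>();
            boolean hasNull = false;
            for (int c : keyCols) {
                Object v = row.get(c);
                if (v == null) { hasNull = true; break; }
                key.add(v);
            }
            // A null in any unique column means this row cannot conflict.
            if (hasNull) continue;
            if (!seen.add(key)) return false; // duplicate non-null key found
        }
        return true;
    }
}
```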

Spark doesn't enforce UNIQUE constraints but leverages them for query optimization. Each constraint is either valid (the existing data is guaranteed to satisfy the constraint), invalid (some records violate the constraint), or unvalidated (the validity is unknown). If the validity is unknown, Spark will check rely() to see whether the constraint is believed to be true and can be used for query optimization.
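The decision described above, whether the optimizer may assume a constraint holds, can be sketched as a small predicate. This is a hypothetical sketch of the documented rule, not Spark's code; the enum and method names are invented for the example, though the three states mirror the documented ValidationStatus values.

```java
// Hypothetical sketch: when may the optimizer treat a constraint as true?
// valid       -> yes, the data is guaranteed to satisfy it
// invalid     -> no, known violations exist
// unvalidated -> only if the user asked the system to rely() on it
class ConstraintTrust {
    enum ValidationStatus { VALID, INVALID, UNVALIDATED }

    /** Returns true iff the constraint may be used for query optimization. */
    static boolean usableForOptimization(ValidationStatus status, boolean rely) {
        switch (status) {
            case VALID:       return true;
            case INVALID:     return false;
            case UNVALIDATED: return rely; // fall back to the user-provided hint
            default:          return false;
        }
    }
}
```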

Annotations
@Evolving()
Source
Unique.java
Since

4.1.0

Linear Supertypes
BaseConstraint, Constraint, AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##: Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.CloneNotSupportedException]) @IntrinsicCandidate() @native()
  6. def columns(): Array[NamedReference]

    Returns the columns that comprise the unique key.

  7. def definition(): String
    Attributes
    protected[constraints]
    Definition Classes
    Unique → BaseConstraint
    Annotations
    @Override()
  8. def enforced(): Boolean

    Indicates whether this constraint is actively enforced. If enforced, data modifications that violate the constraint fail with a constraint violation error.

    Definition Classes
    BaseConstraint → Constraint
    Annotations
    @Override()
  9. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  10. def equals(other: AnyRef): Boolean
    Definition Classes
    Unique → AnyRef → Any
    Annotations
    @Override()
  11. final def getClass(): Class[_ <: AnyRef]
    Definition Classes
    AnyRef → Any
    Annotations
    @IntrinsicCandidate() @native()
  12. def hashCode(): Int
    Definition Classes
    Unique → AnyRef → Any
    Annotations
    @Override()
  13. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  14. def name(): String

    Returns the name of this constraint.

    Definition Classes
    BaseConstraint → Constraint
    Annotations
    @Override()
  15. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  16. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @IntrinsicCandidate() @native()
  17. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @IntrinsicCandidate() @native()
  18. def rely(): Boolean

    Indicates whether this constraint is assumed to hold true if the validity is unknown. Unlike the validation status, this flag is usually provided by the user as a hint to the system.

    Definition Classes
    BaseConstraint → Constraint
    Annotations
    @Override()
  19. final def synchronized[T0](arg0: => T0): T0
    Definition Classes
    AnyRef
  20. def toDDL(columns: Array[NamedReference]): String
    Attributes
    protected[constraints]
    Definition Classes
    BaseConstraint
  21. def toDDL(): String

    Returns the definition of this constraint in the DDL format.

    Definition Classes
    BaseConstraint → Constraint
    Annotations
    @Override()
  22. def toDescription(): String

    Returns the constraint description for DESCRIBE TABLE output, excluding the constraint name (shown separately).

    Definition Classes
    BaseConstraint → Constraint
  23. def toString(): String
    Definition Classes
    BaseConstraint → AnyRef → Any
    Annotations
    @Override()
  24. def validationStatus(): ValidationStatus

    Indicates whether the existing data in the table satisfies this constraint. The constraint can be valid (the data is guaranteed to satisfy the constraint), invalid (some records violate the constraint), or unvalidated (the validity is unknown). The validation status is usually managed by the system and can't be modified by the user.

    Definition Classes
    BaseConstraint → Constraint
    Annotations
    @Override()
  25. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])
  26. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException]) @native()
  27. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.InterruptedException])

Deprecated Value Members

  1. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws(classOf[java.lang.Throwable]) @Deprecated
    Deprecated

    (Since version 9)
