
com.datastax.spark.connector.writer

WritableToCassandra

abstract class WritableToCassandra[T] extends AnyRef

Linear Supertypes
AnyRef, Any

Instance Constructors

  1. new WritableToCassandra()

Abstract Value Members

  1. abstract def deleteFromCassandra(keyspaceName: String, tableName: String, deleteColumns: ColumnSelector, keyColumns: ColumnSelector, writeConf: WriteConf)(implicit connector: CassandraConnector, rwf: RowWriterFactory[T]): Unit

    Deletes data from a Cassandra table, using data from the RDD as a list of primary keys. Uses the specified column names. By default, it deletes all columns of the corresponding Cassandra rows.

    Example:

    CREATE KEYSPACE test WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1 };
    CREATE TABLE test.words(word VARCHAR PRIMARY KEY, count INT, other VARCHAR);
    INSERT INTO test.words (word, count, other) VALUES ('foo', 5, 'foo');
    case class WordCount(word: String, count: Int, other: String)
    val rdd = sc.cassandraTable("test", "words")

    The underlying RDD class must provide data for all primary key columns. To delete only the "other" column values:

    rdd.deleteFromCassandra("test", "words", SomeColumns("other"))

    The consistency level and other properties of this delete are the same as for writes.

    keyspaceName

    the name of the Keyspace to use

    tableName

    the name of the Table to use

    deleteColumns

    The list of column names to delete; an empty ColumnSelector means the full row is deleted.

    keyColumns

    Primary key columns selector. Optional; all primary key columns of the RDD are used by default.

    writeConf

    additional configuration object that allows setting the consistency level, batch size, etc.
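
    As a hedged sketch, assuming the test.words schema and the WordCount case class shown above (the filter predicate is illustrative only), the selectors can also be passed explicitly:

    import com.datastax.spark.connector._

    // Rows to remove: an illustrative filter over the example table.
    val stale = sc.cassandraTable[WordCount]("test", "words").filter(_.count == 0)

    // Delete whole rows matched by their primary keys (relies on the connector's default selectors).
    stale.deleteFromCassandra("test", "words")

    // Delete only the "other" column, matching rows by the "word" primary key column.
    stale.deleteFromCassandra("test", "words",
      deleteColumns = SomeColumns("other"),
      keyColumns = SomeColumns("word"))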

  2. abstract def saveToCassandra(keyspaceName: String, tableName: String, columnNames: ColumnSelector, writeConf: WriteConf)(implicit connector: CassandraConnector, rwf: RowWriterFactory[T]): Unit

    Saves the data from the RDD to a Cassandra table. By default, it saves all properties that have corresponding Cassandra columns.

    Example:

    CREATE KEYSPACE test WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1 };
    CREATE TABLE test.words(word VARCHAR PRIMARY KEY, count INT, other VARCHAR);
    case class WordCount(word: String, count: Int, other: String)
    val rdd = sc.parallelize(Seq(WordCount("foo", 5, "bar")))

    By default, the underlying RDD class must provide data for all columns:

    rdd.saveToCassandra("test", "words")

    By default, writes are performed at ConsistencyLevel.LOCAL_QUORUM. This write consistency level is controlled by the following property:

    • spark.cassandra.output.consistency.level: consistency level for RDD writes, string matching the ConsistencyLevel enum name.
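
    For illustration only, a minimal sketch of setting this property when building the SparkConf (the application name and contact point are assumptions):

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("words-writer")                              // assumed application name
      .set("spark.cassandra.connection.host", "127.0.0.1")     // assumed contact point
      .set("spark.cassandra.output.consistency.level", "ONE")  // must match a ConsistencyLevel enum name
    val sc = new SparkContext(conf)
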
    keyspaceName

    the name of the Keyspace to use

    tableName

    the name of the Table to use

    columnNames

    The list of column names to save data to. Uses only the unique column names, and at least all primary key columns must be selected. Properties that are not selected are not written, and the corresponding Cassandra columns are left unchanged.

    writeConf

    additional configuration object that allows setting the consistency level, batch size, etc.
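
    A hedged sketch of a column-restricted save, assuming the test.words schema and the WordCount case class from the example above; only "word" and "count" are written, and the "other" column is left untouched:

    import com.datastax.spark.connector._

    val rdd = sc.parallelize(Seq(WordCount("foo", 5, "bar")))

    // Save only the selected columns; "other" stays unchanged in Cassandra.
    rdd.saveToCassandra("test", "words", SomeColumns("word", "count"))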

  3. abstract def sparkContext: SparkContext

Concrete Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  6. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  7. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  8. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  9. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  10. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  11. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  12. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  13. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  14. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  15. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  16. def toString(): String
    Definition Classes
    AnyRef → Any
  17. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  18. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  19. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
