Class/Object

com.datastax.spark.connector.rdd

CassandraRDD

Related Docs: object CassandraRDD | package rdd

Permalink

abstract class CassandraRDD[R] extends RDD[R]

Linear Supertypes
RDD[R], Logging, Serializable, Serializable, AnyRef, Any
Known Subclasses
Implicitly
  1. by rddToPairRDDFunctions
  2. by numericRDDToDoubleRDDFunctions
  3. by doubleRDDToDoubleRDDFunctions
  4. by rddToOrderedRDDFunctions
  5. by rddToSequenceFileRDDFunctions
  6. by rddToAsyncRDDActions
  7. by toPairRDDFunctions
  8. by toRDDFunctions
  9. by any2stringadd
  10. by StringFormat
  11. by Ensuring
  12. by ArrowAssoc

Instance Constructors

  1. new CassandraRDD(sc: SparkContext, dep: Seq[Dependency[_]])(implicit arg0: ClassTag[R])

    Permalink

Type Members

  1. abstract type Self <: CassandraRDD[R]

    Permalink

    This is slightly different from Scala's this.type. this.type is the unique singleton type of an object, which is not compatible with other instances of the same type, so returning anything other than this is not really possible without lying to the compiler via explicit casts. Here Self is used to return a copy of the object: a different instance of the same type.
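
    A minimal sketch of this pattern under assumed, simplified names (Table, IntTable and the single limit property are illustrative, not the connector's actual code):

    // Each chained call returns the concrete subtype, not the abstract base type.
    abstract class Table[R] {
      type Self <: Table[R]
      protected def copy(limit: Option[Long]): Self
      def limit(n: Long): Self = copy(Some(n))
    }

    class IntTable(val rowLimit: Option[Long] = None) extends Table[Int] {
      type Self = IntTable
      protected def copy(limit: Option[Long]): IntTable = new IntTable(limit)
    }

    val t: IntTable = new IntTable().limit(10)   // static type stays IntTable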

Abstract Value Members

  1. abstract def cassandraCount(): Long

    Permalink

    Counts the number of items in this RDD by selecting count(*) on the Cassandra table
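
    A minimal usage sketch ("ks" and "kv" are hypothetical keyspace and table names):

    val rdd = sc.cassandraTable("ks", "kv")
    val n = rdd.cassandraCount()   // pushes SELECT count(*) down to Cassandra
    // rdd.count() would instead fetch the rows and count them in Spark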

  2. abstract def clusteringOrder: Option[ClusteringOrder]

    Permalink
    Attributes
    protected
  3. abstract def columnNames: ColumnSelector

    Permalink
    Attributes
    protected
  4. abstract def compute(split: Partition, context: TaskContext): Iterator[R]

    Permalink
    Definition Classes
    RDD
    Annotations
    @DeveloperApi()
  5. abstract def connector: CassandraConnector

    Permalink
    Attributes
    protected
  6. abstract def copy(columnNames: ColumnSelector = columnNames, where: CqlWhereClause = where, limit: Option[CassandraLimit] = limit, clusteringOrder: Option[ClusteringOrder] = None, readConf: ReadConf = readConf, connector: CassandraConnector = connector): Self

    Permalink

    Allows copying this RDD while changing some of its properties.

    Attributes
    protected
  7. abstract def getPartitions: Array[Partition]

    Permalink
    Attributes
    protected
    Definition Classes
    RDD
  8. abstract def keyspaceName: String

    Permalink
    Attributes
    protected[com.datastax.spark.connector]
  9. abstract def limit: Option[CassandraLimit]

    Permalink
    Attributes
    protected
  10. abstract def narrowColumnSelection(columns: Seq[ColumnRef]): Seq[ColumnRef]

    Permalink
    Attributes
    protected
  11. abstract def readConf: ReadConf

    Permalink
    Attributes
    protected
  12. abstract val selectedColumnRefs: Seq[ColumnRef]

    Permalink
  13. abstract def tableName: String

    Permalink
    Attributes
    protected[com.datastax.spark.connector]
  14. abstract def toEmptyCassandraRDD: EmptyCassandraRDD[R]

    Permalink
  15. abstract def where: CqlWhereClause

    Permalink
    Attributes
    protected

Concrete Value Members

  1. final def !=(arg0: Any): Boolean

    Permalink
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Permalink
    Definition Classes
    AnyRef → Any
  3. def +(other: String): String

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to any2stringadd[CassandraRDD[R]] performed by method any2stringadd in scala.Predef.
    Definition Classes
    any2stringadd
  4. def ++(other: RDD[R]): RDD[R]

    Permalink
    Definition Classes
    RDD
  5. def ->[B](y: B): (CassandraRDD[R], B)

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to ArrowAssoc[CassandraRDD[R]] performed by method ArrowAssoc in scala.Predef. This conversion will take place only if R is a superclass of Any and a subclass of (Nothing, Nothing) with Double (R >: Any <: (Nothing, Nothing) with Double).
    Definition Classes
    ArrowAssoc
    Annotations
    @inline()
  6. final def ==(arg0: Any): Boolean

    Permalink
    Definition Classes
    AnyRef → Any
  7. def aggregate[U](zeroValue: U)(seqOp: (U, R) ⇒ U, combOp: (U, U) ⇒ U)(implicit arg0: ClassTag[U]): U

    Permalink
    Definition Classes
    RDD
  8. def aggregateByKey[U](zeroValue: U)(seqOp: (U, V) ⇒ U, combOp: (U, U) ⇒ U)(implicit arg0: ClassTag[U]): RDD[(K, U)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  9. def aggregateByKey[U](zeroValue: U, numPartitions: Int)(seqOp: (U, V) ⇒ U, combOp: (U, U) ⇒ U)(implicit arg0: ClassTag[U]): RDD[(K, U)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  10. def aggregateByKey[U](zeroValue: U, partitioner: Partitioner)(seqOp: (U, V) ⇒ U, combOp: (U, U) ⇒ U)(implicit arg0: ClassTag[U]): RDD[(K, U)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  11. def as[B, A0, A1, A2, A3, A4, A5, A6, A7, A8, A9, A10, A11](f: (A0, A1, A2, A3, A4, A5, A6, A7, A8, A9, A10, A11) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2], arg4: TypeConverter[A3], arg5: TypeConverter[A4], arg6: TypeConverter[A5], arg7: TypeConverter[A6], arg8: TypeConverter[A7], arg9: TypeConverter[A8], arg10: TypeConverter[A9], arg11: TypeConverter[A10], arg12: TypeConverter[A11]): CassandraRDD[B]

    Permalink
  12. def as[B, A0, A1, A2, A3, A4, A5, A6, A7, A8, A9, A10](f: (A0, A1, A2, A3, A4, A5, A6, A7, A8, A9, A10) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2], arg4: TypeConverter[A3], arg5: TypeConverter[A4], arg6: TypeConverter[A5], arg7: TypeConverter[A6], arg8: TypeConverter[A7], arg9: TypeConverter[A8], arg10: TypeConverter[A9], arg11: TypeConverter[A10]): CassandraRDD[B]

    Permalink
  13. def as[B, A0, A1, A2, A3, A4, A5, A6, A7, A8, A9](f: (A0, A1, A2, A3, A4, A5, A6, A7, A8, A9) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2], arg4: TypeConverter[A3], arg5: TypeConverter[A4], arg6: TypeConverter[A5], arg7: TypeConverter[A6], arg8: TypeConverter[A7], arg9: TypeConverter[A8], arg10: TypeConverter[A9]): CassandraRDD[B]

    Permalink
  14. def as[B, A0, A1, A2, A3, A4, A5, A6, A7, A8](f: (A0, A1, A2, A3, A4, A5, A6, A7, A8) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2], arg4: TypeConverter[A3], arg5: TypeConverter[A4], arg6: TypeConverter[A5], arg7: TypeConverter[A6], arg8: TypeConverter[A7], arg9: TypeConverter[A8]): CassandraRDD[B]

    Permalink
  15. def as[B, A0, A1, A2, A3, A4, A5, A6, A7](f: (A0, A1, A2, A3, A4, A5, A6, A7) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2], arg4: TypeConverter[A3], arg5: TypeConverter[A4], arg6: TypeConverter[A5], arg7: TypeConverter[A6], arg8: TypeConverter[A7]): CassandraRDD[B]

    Permalink
  16. def as[B, A0, A1, A2, A3, A4, A5, A6](f: (A0, A1, A2, A3, A4, A5, A6) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2], arg4: TypeConverter[A3], arg5: TypeConverter[A4], arg6: TypeConverter[A5], arg7: TypeConverter[A6]): CassandraRDD[B]

    Permalink
  17. def as[B, A0, A1, A2, A3, A4, A5](f: (A0, A1, A2, A3, A4, A5) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2], arg4: TypeConverter[A3], arg5: TypeConverter[A4], arg6: TypeConverter[A5]): CassandraRDD[B]

    Permalink
  18. def as[B, A0, A1, A2, A3, A4](f: (A0, A1, A2, A3, A4) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2], arg4: TypeConverter[A3], arg5: TypeConverter[A4]): CassandraRDD[B]

    Permalink
  19. def as[B, A0, A1, A2, A3](f: (A0, A1, A2, A3) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2], arg4: TypeConverter[A3]): CassandraRDD[B]

    Permalink
  20. def as[B, A0, A1, A2](f: (A0, A1, A2) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2]): CassandraRDD[B]

    Permalink
  21. def as[B, A0, A1](f: (A0, A1) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1]): CassandraRDD[B]

    Permalink
  22. def as[B, A0](f: (A0) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0]): CassandraRDD[B]

    Permalink

    Maps each row into an object of a different type using the provided function, taking column value(s) as argument(s). Can be used to convert each row to a tuple or a case class object:

    sc.cassandraTable("ks", "table")
      .select("column1")
      .as((s: String) => s)                 // yields CassandraRDD[String]
    
    sc.cassandraTable("ks", "table")
      .select("column1", "column2")
      .as((_: String, _: Long))             // yields CassandraRDD[(String, Long)]
    
    case class MyRow(key: String, value: Long)
    sc.cassandraTable("ks", "table")
      .select("column1", "column2")
      .as(MyRow)                            // yields CassandraRDD[MyRow]
  23. final def asInstanceOf[T0]: T0

    Permalink
    Definition Classes
    Any
  24. def barrier(): RDDBarrier[R]

    Permalink
    Definition Classes
    RDD
    Annotations
    @Experimental() @Since( "2.4.0" )
  25. def cache(): CassandraRDD.this.type

    Permalink
    Definition Classes
    RDD
  26. def cartesian[U](other: RDD[U])(implicit arg0: ClassTag[U]): RDD[(R, U)]

    Permalink
    Definition Classes
    RDD
  27. def checkpoint(): Unit

    Permalink
    Definition Classes
    RDD
  28. def clearDependencies(): Unit

    Permalink
    Attributes
    protected
    Definition Classes
    RDD
  29. def clone(): AnyRef

    Permalink
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  30. def clusteringOrder(order: ClusteringOrder): Self

    Permalink

    Adds a CQL ORDER BY clause to the query. It can be applied only when the table has clustering columns and a primary key predicate is pushed down in where. It is useful when the default direction of ordering rows within a single Cassandra partition needs to be changed.
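
    A minimal usage sketch, assuming a hypothetical table ks.events with partition key user and clustering column ts:

    import com.datastax.spark.connector._
    import com.datastax.spark.connector.rdd.ClusteringOrder

    sc.cassandraTable("ks", "events")
      .where("user = ?", "alice")                    // push the partition key predicate down
      .clusteringOrder(ClusteringOrder.Descending)   // reverse the default ordering within the partition
      .limit(10)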

  31. def coalesce(numPartitions: Int, shuffle: Boolean, partitionCoalescer: Option[PartitionCoalescer])(implicit ord: Ordering[R]): RDD[R]

    Permalink
    Definition Classes
    RDD
  32. def cogroup[W1, W2, W3](other1: RDD[(K, W1)], other2: RDD[(K, W2)], other3: RDD[(K, W3)], numPartitions: Int): RDD[(K, (Iterable[V], Iterable[W1], Iterable[W2], Iterable[W3]))]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  33. def cogroup[W1, W2](other1: RDD[(K, W1)], other2: RDD[(K, W2)], numPartitions: Int): RDD[(K, (Iterable[V], Iterable[W1], Iterable[W2]))]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  34. def cogroup[W](other: RDD[(K, W)], numPartitions: Int): RDD[(K, (Iterable[V], Iterable[W]))]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  35. def cogroup[W1, W2](other1: RDD[(K, W1)], other2: RDD[(K, W2)]): RDD[(K, (Iterable[V], Iterable[W1], Iterable[W2]))]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  36. def cogroup[W](other: RDD[(K, W)]): RDD[(K, (Iterable[V], Iterable[W]))]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  37. def cogroup[W1, W2, W3](other1: RDD[(K, W1)], other2: RDD[(K, W2)], other3: RDD[(K, W3)]): RDD[(K, (Iterable[V], Iterable[W1], Iterable[W2], Iterable[W3]))]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  38. def cogroup[W1, W2](other1: RDD[(K, W1)], other2: RDD[(K, W2)], partitioner: Partitioner): RDD[(K, (Iterable[V], Iterable[W1], Iterable[W2]))]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  39. def cogroup[W](other: RDD[(K, W)], partitioner: Partitioner): RDD[(K, (Iterable[V], Iterable[W]))]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  40. def cogroup[W1, W2, W3](other1: RDD[(K, W1)], other2: RDD[(K, W2)], other3: RDD[(K, W3)], partitioner: Partitioner): RDD[(K, (Iterable[V], Iterable[W1], Iterable[W2], Iterable[W3]))]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  41. def collect[U](f: PartialFunction[R, U])(implicit arg0: ClassTag[U]): RDD[U]

    Permalink
    Definition Classes
    RDD
  42. def collect(): Array[R]

    Permalink
    Definition Classes
    RDD
  43. def collectAsMap(): Map[K, V]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  44. def collectAsync(): FutureAction[Seq[R]]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to AsyncRDDActions[R] performed by method rddToAsyncRDDActions in org.apache.spark.rdd.RDD. This conversion will take place only if R is accompanied by a ClassTag, which is a runtime representation of its type that survives erasure (R: ClassTag).
    Definition Classes
    AsyncRDDActions
  45. def combineByKey[C](createCombiner: (V) ⇒ C, mergeValue: (C, V) ⇒ C, mergeCombiners: (C, C) ⇒ C): RDD[(K, C)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  46. def combineByKey[C](createCombiner: (V) ⇒ C, mergeValue: (C, V) ⇒ C, mergeCombiners: (C, C) ⇒ C, numPartitions: Int): RDD[(K, C)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  47. def combineByKey[C](createCombiner: (V) ⇒ C, mergeValue: (C, V) ⇒ C, mergeCombiners: (C, C) ⇒ C, partitioner: Partitioner, mapSideCombine: Boolean, serializer: Serializer): RDD[(K, C)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  48. def combineByKeyWithClassTag[C](createCombiner: (V) ⇒ C, mergeValue: (C, V) ⇒ C, mergeCombiners: (C, C) ⇒ C)(implicit ct: ClassTag[C]): RDD[(K, C)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
    Annotations
    @Experimental()
  49. def combineByKeyWithClassTag[C](createCombiner: (V) ⇒ C, mergeValue: (C, V) ⇒ C, mergeCombiners: (C, C) ⇒ C, numPartitions: Int)(implicit ct: ClassTag[C]): RDD[(K, C)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
    Annotations
    @Experimental()
  50. def combineByKeyWithClassTag[C](createCombiner: (V) ⇒ C, mergeValue: (C, V) ⇒ C, mergeCombiners: (C, C) ⇒ C, partitioner: Partitioner, mapSideCombine: Boolean, serializer: Serializer)(implicit ct: ClassTag[C]): RDD[(K, C)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
    Annotations
    @Experimental()
  51. def context: SparkContext

    Permalink
    Definition Classes
    RDD
  52. def convertTo[B](implicit arg0: ClassTag[B], arg1: RowReaderFactory[B]): CassandraRDD[B]

    Permalink
    Attributes
    protected
  53. def count(): Long

    Permalink
    Definition Classes
    RDD
  54. def countApprox(timeout: Long, confidence: Double): PartialResult[BoundedDouble]

    Permalink
    Definition Classes
    RDD
  55. def countApproxDistinct(relativeSD: Double): Long

    Permalink
    Definition Classes
    RDD
  56. def countApproxDistinct(p: Int, sp: Int): Long

    Permalink
    Definition Classes
    RDD
  57. def countApproxDistinctByKey(relativeSD: Double): RDD[(K, Long)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  58. def countApproxDistinctByKey(relativeSD: Double, numPartitions: Int): RDD[(K, Long)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  59. def countApproxDistinctByKey(relativeSD: Double, partitioner: Partitioner): RDD[(K, Long)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  60. def countApproxDistinctByKey(p: Int, sp: Int, partitioner: Partitioner): RDD[(K, Long)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  61. def countAsync(): FutureAction[Long]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to AsyncRDDActions[R] performed by method rddToAsyncRDDActions in org.apache.spark.rdd.RDD. This conversion will take place only if R is accompanied by a ClassTag, which is a runtime representation of its type that survives erasure (R: ClassTag).
    Definition Classes
    AsyncRDDActions
  62. def countByKey(): Map[K, Long]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  63. def countByKeyApprox(timeout: Long, confidence: Double): PartialResult[Map[K, BoundedDouble]]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  64. def countByValue()(implicit ord: Ordering[R]): Map[R, Long]

    Permalink
    Definition Classes
    RDD
  65. def countByValueApprox(timeout: Long, confidence: Double)(implicit ord: Ordering[R]): PartialResult[Map[R, BoundedDouble]]

    Permalink
    Definition Classes
    RDD
  66. def deleteFromCassandra(keyspaceName: String, tableName: String, deleteColumns: ColumnSelector = SomeColumns(), keyColumns: ColumnSelector = PrimaryKeyColumns, writeConf: WriteConf = ...)(implicit connector: CassandraConnector = CassandraConnector(sparkContext), rwf: RowWriterFactory[R]): Unit

    Permalink

    Delete data from a Cassandra table, using data from the RDD as primary keys. Uses the specified column names. A usage sketch follows this entry.

    keyspaceName

    the name of the Keyspace to use

    tableName

    the name of the Table to use

    deleteColumns

    the list of column names to delete; an empty ColumnSelector means the full row

    keyColumns

    primary key columns selector, optional; all RDD primary key columns will be checked by default

    writeConf

    additional configuration object allowing you to set consistency level, batch size, etc.

    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to RDDFunctions[R] performed by method toRDDFunctions in com.datastax.spark.connector.
    Definition Classes
    RDDFunctionsWritableToCassandra
    See also

    com.datastax.spark.connector.writer.WritableToCassandra
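
    A minimal usage sketch, assuming a hypothetical table ks.kv with a single text primary key column key and a regular column value:

    import com.datastax.spark.connector._

    // Delete the full rows whose primary keys appear in this RDD.
    sc.parallelize(Seq(Tuple1("a"), Tuple1("b")))
      .deleteFromCassandra("ks", "kv")

    // Delete only the value column for the matching keys.
    sc.parallelize(Seq(Tuple1("a")))
      .deleteFromCassandra("ks", "kv", SomeColumns("value"))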

  67. final def dependencies: Seq[Dependency[_]]

    Permalink
    Definition Classes
    RDD
  68. def distinct(): RDD[R]

    Permalink
    Definition Classes
    RDD
  69. def distinct(numPartitions: Int)(implicit ord: Ordering[R]): RDD[R]

    Permalink
    Definition Classes
    RDD
  70. def ensuring(cond: (CassandraRDD[R]) ⇒ Boolean, msg: ⇒ Any): CassandraRDD[R]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to Ensuring[CassandraRDD[R]] performed by method Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  71. def ensuring(cond: (CassandraRDD[R]) ⇒ Boolean): CassandraRDD[R]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to Ensuring[CassandraRDD[R]] performed by method Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  72. def ensuring(cond: Boolean, msg: ⇒ Any): CassandraRDD[R]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to Ensuring[CassandraRDD[R]] performed by method Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  73. def ensuring(cond: Boolean): CassandraRDD[R]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to Ensuring[CassandraRDD[R]] performed by method Ensuring in scala.Predef.
    Definition Classes
    Ensuring
  74. final def eq(arg0: AnyRef): Boolean

    Permalink
    Definition Classes
    AnyRef
  75. def equals(arg0: Any): Boolean

    Permalink
    Definition Classes
    AnyRef → Any
  76. def filter(f: (R) ⇒ Boolean): RDD[R]

    Permalink
    Definition Classes
    RDD
  77. def filterByRange(lower: K, upper: K): RDD[(K, V)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to OrderedRDDFunctions[K, V, (K, V)] performed by method rddToOrderedRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type Ordering[K] is in scope
    2. an implicit value of type ClassTag[K] is in scope
    3. an implicit value of type ClassTag[V] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    OrderedRDDFunctions
  78. def finalize(): Unit

    Permalink
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  79. def first(): R

    Permalink
    Definition Classes
    RDD
  80. def firstParent[U](implicit arg0: ClassTag[U]): RDD[U]

    Permalink
    Attributes
    protected[org.apache.spark]
    Definition Classes
    RDD
  81. def flatMap[U](f: (R) ⇒ TraversableOnce[U])(implicit arg0: ClassTag[U]): RDD[U]

    Permalink
    Definition Classes
    RDD
  82. def flatMapValues[U](f: (V) ⇒ TraversableOnce[U]): RDD[(K, U)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  83. def fold(zeroValue: R)(op: (R, R) ⇒ R): R

    Permalink
    Definition Classes
    RDD
  84. def foldByKey(zeroValue: V)(func: (V, V) ⇒ V): RDD[(K, V)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  85. def foldByKey(zeroValue: V, numPartitions: Int)(func: (V, V) ⇒ V): RDD[(K, V)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  86. def foldByKey(zeroValue: V, partitioner: Partitioner)(func: (V, V) ⇒ V): RDD[(K, V)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  87. def foreach(f: (R) ⇒ Unit): Unit

    Permalink
    Definition Classes
    RDD
  88. def foreachAsync(f: (R) ⇒ Unit): FutureAction[Unit]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to AsyncRDDActions[R] performed by method rddToAsyncRDDActions in org.apache.spark.rdd.RDD. This conversion will take place only if R is accompanied by a ClassTag, which is a runtime representation of its type that survives erasure (R: ClassTag).
    Definition Classes
    AsyncRDDActions
  89. def foreachPartition(f: (Iterator[R]) ⇒ Unit): Unit

    Permalink
    Definition Classes
    RDD
  90. def foreachPartitionAsync(f: (Iterator[R]) ⇒ Unit): FutureAction[Unit]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to AsyncRDDActions[R] performed by method rddToAsyncRDDActions in org.apache.spark.rdd.RDD. This conversion will take place only if R is accompanied by a ClassTag, which is a runtime representation of its type that survives erasure (R: ClassTag).
    Definition Classes
    AsyncRDDActions
  91. def formatted(fmtstr: String): String

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to StringFormat[CassandraRDD[R]] performed by method StringFormat in scala.Predef.
    Definition Classes
    StringFormat
    Annotations
    @inline()
  92. def fullOuterJoin[W](other: RDD[(K, W)], numPartitions: Int): RDD[(K, (Option[V], Option[W]))]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  93. def fullOuterJoin[W](other: RDD[(K, W)]): RDD[(K, (Option[V], Option[W]))]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  94. def fullOuterJoin[W](other: RDD[(K, W)], partitioner: Partitioner): RDD[(K, (Option[V], Option[W]))]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  95. def getCheckpointFile: Option[String]

    Permalink
    Definition Classes
    RDD
  96. final def getClass(): Class[_]

    Permalink
    Definition Classes
    AnyRef → Any
  97. def getDependencies: Seq[Dependency[_]]

    Permalink
    Attributes
    protected
    Definition Classes
    RDD
  98. final def getNumPartitions: Int

    Permalink
    Definition Classes
    RDD
    Annotations
    @Since( "1.6.0" )
  99. def getOutputDeterministicLevel: org.apache.spark.rdd.DeterministicLevel.Value

    Permalink
    Attributes
    protected
    Definition Classes
    RDD
    Annotations
    @DeveloperApi()
  100. def getPreferredLocations(split: Partition): Seq[String]

    Permalink
    Attributes
    protected
    Definition Classes
    RDD
  101. def getStorageLevel: StorageLevel

    Permalink
    Definition Classes
    RDD
  102. def glom(): RDD[Array[R]]

    Permalink
    Definition Classes
    RDD
  103. def groupBy[K](f: (R) ⇒ K, p: Partitioner)(implicit kt: ClassTag[K], ord: Ordering[K]): RDD[(K, Iterable[R])]

    Permalink
    Definition Classes
    RDD
  104. def groupBy[K](f: (R) ⇒ K, numPartitions: Int)(implicit kt: ClassTag[K]): RDD[(K, Iterable[R])]

    Permalink
    Definition Classes
    RDD
  105. def groupBy[K](f: (R) ⇒ K)(implicit kt: ClassTag[K]): RDD[(K, Iterable[R])]

    Permalink
    Definition Classes
    RDD
  106. def groupByKey(): RDD[(K, Iterable[V])]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  107. def groupByKey(numPartitions: Int): RDD[(K, Iterable[V])]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  108. def groupByKey(partitioner: Partitioner): RDD[(K, Iterable[V])]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  109. def groupWith[W1, W2, W3](other1: RDD[(K, W1)], other2: RDD[(K, W2)], other3: RDD[(K, W3)]): RDD[(K, (Iterable[V], Iterable[W1], Iterable[W2], Iterable[W3]))]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  110. def groupWith[W1, W2](other1: RDD[(K, W1)], other2: RDD[(K, W2)]): RDD[(K, (Iterable[V], Iterable[W1], Iterable[W2]))]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  111. def groupWith[W](other: RDD[(K, W)]): RDD[(K, (Iterable[V], Iterable[W]))]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  112. def hashCode(): Int

    Permalink
    Definition Classes
    AnyRef → Any
  113. val id: Int

    Permalink
    Definition Classes
    RDD
  114. def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  115. def initializeLogIfNecessary(isInterpreter: Boolean): Unit

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  116. def intersection(other: RDD[R], numPartitions: Int): RDD[R]

    Permalink
    Definition Classes
    RDD
  117. def intersection(other: RDD[R], partitioner: Partitioner)(implicit ord: Ordering[R]): RDD[R]

    Permalink
    Definition Classes
    RDD
  118. def intersection(other: RDD[R]): RDD[R]

    Permalink
    Definition Classes
    RDD
  119. lazy val isBarrier_: Boolean

    Permalink
    Attributes
    protected
    Definition Classes
    RDD
  120. def isCheckpointed: Boolean

    Permalink
    Definition Classes
    RDD
  121. def isEmpty(): Boolean

    Permalink
    Definition Classes
    RDD
  122. final def isInstanceOf[T0]: Boolean

    Permalink
    Definition Classes
    Any
  123. def isTraceEnabled(): Boolean

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  124. final def iterator(split: Partition, context: TaskContext): Iterator[R]

    Permalink
    Definition Classes
    RDD
  125. def join[W](other: RDD[(K, W)], numPartitions: Int): RDD[(K, (V, W))]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  126. def join[W](other: RDD[(K, W)]): RDD[(K, (V, W))]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  127. def join[W](other: RDD[(K, W)], partitioner: Partitioner): RDD[(K, (V, W))]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  128. def joinWithCassandraTable[R](keyspaceName: String, tableName: String, selectedColumns: ColumnSelector = AllColumns, joinColumns: ColumnSelector = PartitionKeyColumns)(implicit connector: CassandraConnector = CassandraConnector(sparkContext), newType: ClassTag[R], rrf: RowReaderFactory[R], ev: ValidRDDType[R], currentType: ClassTag[R], rwf: RowWriterFactory[R]): CassandraJoinRDD[R, R]

    Permalink

    Uses the data from the RDD to join with a Cassandra table without retrieving the entire table.

    Any RDD which can be used with saveToCassandra can also be used with joinWithCassandraTable, as can any RDD which specifies only the partition key of a Cassandra table. This method executes single-partition requests against the Cassandra table and accepts the functional modifiers that a normal com.datastax.spark.connector.rdd.CassandraTableScanRDD takes.

    By default this method uses only the partition key for joining, but any combination of columns acceptable to Cassandra can be used in the join. Specify columns using the joinColumns parameter or the on() method.

    Example With Prior Repartitioning:

    val source = sc.parallelize(keys).map(x => new KVRow(x))
    val repart = source.repartitionByCassandraReplica(keyspace, tableName, 10)
    val someCass = repart.joinWithCassandraTable(keyspace, tableName)

    Example Joining on Clustering Columns:

    val source = sc.parallelize(keys).map(x => (x, x * 100))
    val someCass = source.joinWithCassandraTable(keyspace, wideTable).on(SomeColumns("key", "group"))
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to RDDFunctions[R] performed by method toRDDFunctions in com.datastax.spark.connector.
    Definition Classes
    RDDFunctions
  129. def keyBy[K](f: (R) ⇒ K): RDD[(K, R)]

    Permalink
    Definition Classes
    RDD
  130. def keyByCassandraReplica(keyspaceName: String, tableName: String, partitionKeyMapper: ColumnSelector = PartitionKeyColumns)(implicit connector: CassandraConnector = CassandraConnector(sparkContext), currentType: ClassTag[R], rwf: RowWriterFactory[R]): RDD[(Set[InetAddress], R)]

    Permalink

    Keys every row in the RDD with the IP addresses of all the Cassandra nodes which contain a replica of the data specified by that row. The calling RDD must have rows that can be converted into the partition key of the given Cassandra table. A usage sketch follows this entry.

    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to RDDFunctions[R] performed by method toRDDFunctions in com.datastax.spark.connector.
    Definition Classes
    RDDFunctions
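
    A minimal usage sketch, assuming a hypothetical table ks.kv whose partition key is a single text column:

    import com.datastax.spark.connector._

    val keyed = sc.parallelize(Seq(Tuple1("a"), Tuple1("b")))
      .keyByCassandraReplica("ks", "kv")   // RDD[(Set[InetAddress], Tuple1[String])]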
  131. def keys: RDD[K]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  132. def leftJoinWithCassandraTable[R](keyspaceName: String, tableName: String, selectedColumns: ColumnSelector = AllColumns, joinColumns: ColumnSelector = PartitionKeyColumns)(implicit connector: CassandraConnector = CassandraConnector(sparkContext), newType: ClassTag[R], rrf: RowReaderFactory[R], ev: ValidRDDType[R], currentType: ClassTag[R], rwf: RowWriterFactory[R]): CassandraLeftJoinRDD[R, R]

    Permalink

    Uses the data from the RDD to left-join with a Cassandra table without retrieving the entire table.

    Any RDD which can be used with saveToCassandra can also be used with leftJoinWithCassandraTable, as can any RDD which specifies only the partition key of a Cassandra table. This method executes single-partition requests against the Cassandra table and accepts the functional modifiers that a normal com.datastax.spark.connector.rdd.CassandraTableScanRDD takes.

    By default this method uses only the partition key for joining, but any combination of columns acceptable to Cassandra can be used in the join. Specify columns using the joinColumns parameter or the on() method.

    Example With Prior Repartitioning:

    val source = sc.parallelize(keys).map(x => new KVRow(x))
    val repart = source.repartitionByCassandraReplica(keyspace, tableName, 10)
    val someCass = repart.leftJoinWithCassandraTable(keyspace, tableName)

    Example Joining on Clustering Columns:

    val source = sc.parallelize(keys).map(x => (x, x * 100))
    val someCass = source.leftJoinWithCassandraTable(keyspace, wideTable).on(SomeColumns("key", "group"))
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to RDDFunctions[R] performed by method toRDDFunctions in com.datastax.spark.connector.
    Definition Classes
    RDDFunctions
  133. def leftOuterJoin[W](other: RDD[(K, W)], numPartitions: Int): RDD[(K, (V, Option[W]))]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  134. def leftOuterJoin[W](other: RDD[(K, W)]): RDD[(K, (V, Option[W]))]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  135. def leftOuterJoin[W](other: RDD[(K, W)], partitioner: Partitioner): RDD[(K, (V, Option[W]))]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  136. def limit(rowLimit: Long): Self

    Permalink

    Adds a LIMIT clause to the CQL SELECT statement. The limit is applied per created Spark partition; in other words, unless the data are fetched from a single Cassandra partition, the total number of results is unpredictable.

    The main purpose of passing a limit clause is to fetch the top n rows from a single Cassandra partition, when the table uses clustering keys and a partition key predicate is passed to the where clause.
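
    A minimal usage sketch, assuming a hypothetical table ks.events with partition key user and a clustering column:

    sc.cassandraTable("ks", "events")
      .where("user = ?", "alice")   // restrict the scan to a single Cassandra partition
      .limit(10)                    // at most 10 rows returned from that partition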

  137. def localCheckpoint(): CassandraRDD.this.type

    Permalink
    Definition Classes
    RDD
  138. def log: Logger

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  139. def logDebug(msg: ⇒ String, throwable: Throwable): Unit

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  140. def logDebug(msg: ⇒ String): Unit

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  141. def logError(msg: ⇒ String, throwable: Throwable): Unit

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  142. def logError(msg: ⇒ String): Unit

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  143. def logInfo(msg: ⇒ String, throwable: Throwable): Unit

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  144. def logInfo(msg: ⇒ String): Unit

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  145. def logName: String

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  146. def logTrace(msg: ⇒ String, throwable: Throwable): Unit

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  147. def logTrace(msg: ⇒ String): Unit

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  148. def logWarning(msg: ⇒ String, throwable: Throwable): Unit

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  149. def logWarning(msg: ⇒ String): Unit

    Permalink
    Attributes
    protected
    Definition Classes
    Logging
  150. def lookup(key: K): Seq[V]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  151. def map[U](f: (R) ⇒ U)(implicit arg0: ClassTag[U]): RDD[U]

    Permalink
    Definition Classes
    RDD
  152. def mapPartitions[U](f: (Iterator[R]) ⇒ Iterator[U], preservesPartitioning: Boolean)(implicit arg0: ClassTag[U]): RDD[U]

    Permalink
    Definition Classes
    RDD
  153. def mapPartitionsWithIndex[U](f: (Int, Iterator[R]) ⇒ Iterator[U], preservesPartitioning: Boolean)(implicit arg0: ClassTag[U]): RDD[U]

    Permalink
    Definition Classes
    RDD
  154. def mapValues[U](f: (V) ⇒ U): RDD[(K, U)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  155. def max()(implicit ord: Ordering[R]): R

    Permalink
    Definition Classes
    RDD
  156. def min()(implicit ord: Ordering[R]): R

    Permalink
    Definition Classes
    RDD
  157. var name: String

    Permalink
    Definition Classes
    RDD
  158. final def ne(arg0: AnyRef): Boolean

    Permalink
    Definition Classes
    AnyRef
  159. final def notify(): Unit

    Permalink
    Definition Classes
    AnyRef
  160. final def notifyAll(): Unit

    Permalink
    Definition Classes
    AnyRef
  161. def parent[U](j: Int)(implicit arg0: ClassTag[U]): RDD[U]

    Permalink
    Attributes
    protected[org.apache.spark]
    Definition Classes
    RDD
  162. def partitionBy(partitioner: Partitioner): RDD[(K, V)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  163. val partitioner: Option[Partitioner]

    Permalink
    Definition Classes
    RDD
  164. final def partitions: Array[Partition]

    Permalink
    Definition Classes
    RDD
  165. def perPartitionLimit(rowLimit: Long): Self

    Permalink

    Adds a PER PARTITION LIMIT clause to the CQL SELECT statement.

    Adds a PER PARTITION LIMIT clause to the CQL SELECT statement. The limit is applied to every Cassandra partition. Valid only for Cassandra 3.6+.
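
    A minimal sketch, assuming a hypothetical test.events table with partition key user_id, running against Cassandra 3.6+:

    import com.datastax.spark.connector._

    // Hypothetical table: test.events(user_id int, event_time timestamp, payload text,
    //                                 PRIMARY KEY (user_id, event_time))
    // Fetch at most 3 rows from every Cassandra partition of the table.
    val topPerUser = sc.cassandraTable("test", "events").perPartitionLimit(3)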

  166. def persist(): CassandraRDD.this.type

    Permalink
    Definition Classes
    RDD
  167. def persist(newLevel: StorageLevel): CassandraRDD.this.type

    Permalink
    Definition Classes
    RDD
  168. def pipe(command: Seq[String], env: Map[String, String], printPipeContext: ((String) ⇒ Unit) ⇒ Unit, printRDDElement: (R, (String) ⇒ Unit) ⇒ Unit, separateWorkingDir: Boolean, bufferSize: Int, encoding: String): RDD[String]

    Permalink
    Definition Classes
    RDD
  169. def pipe(command: String, env: Map[String, String]): RDD[String]

    Permalink
    Definition Classes
    RDD
  170. def pipe(command: String): RDD[String]

    Permalink
    Definition Classes
    RDD
  171. final def preferredLocations(split: Partition): Seq[String]

    Permalink
    Definition Classes
    RDD
  172. def randomSplit(weights: Array[Double], seed: Long): Array[RDD[R]]

    Permalink
    Definition Classes
    RDD
  173. def reduce(f: (R, R) ⇒ R): R

    Permalink
    Definition Classes
    RDD
  174. def reduceByKey(func: (V, V) ⇒ V): RDD[(K, V)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  175. def reduceByKey(func: (V, V) ⇒ V, numPartitions: Int): RDD[(K, V)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  176. def reduceByKey(partitioner: Partitioner, func: (V, V) ⇒ V): RDD[(K, V)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  177. def reduceByKeyLocally(func: (V, V) ⇒ V): Map[K, V]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  178. def repartition(numPartitions: Int)(implicit ord: Ordering[R]): RDD[R]

    Permalink
    Definition Classes
    RDD
  179. def repartitionAndSortWithinPartitions(partitioner: Partitioner): RDD[(K, V)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to OrderedRDDFunctions[K, V, (K, V)] performed by method rddToOrderedRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type Ordering[K] is in scope
    2. an implicit value of type ClassTag[K] is in scope
    3. an implicit value of type ClassTag[V] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    OrderedRDDFunctions
  180. def repartitionByCassandraReplica(keyspaceName: String, tableName: String, partitionsPerHost: Int = 10, partitionKeyMapper: ColumnSelector = PartitionKeyColumns)(implicit connector: CassandraConnector = CassandraConnector(sparkContext), currentType: ClassTag[R], rwf: RowWriterFactory[R]): CassandraPartitionedRDD[R]

    Permalink

    Repartitions the data (via a shuffle) based upon the replication of the given keyspaceName and tableName.

    Repartitions the data (via a shuffle) based upon the replication of the given keyspaceName and tableName. Calling this method before joinWithCassandraTable ensures that requests are coordinator-local. partitionsPerHost controls the number of Spark partitions created by this repartitioning. The calling RDD must have rows that can be converted into the partition key of the given Cassandra table.
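
    A minimal sketch, assuming a hypothetical test.kv table whose partition key is a single int column named key:

    import com.datastax.spark.connector._

    // Hypothetical table: test.kv(key int PRIMARY KEY, value text)
    val ids = sc.parallelize(1 to 1000).map(Tuple1(_))
    // Shuffle the keys to the replicas that own them, then join coordinator-locally.
    val joined = ids
      .repartitionByCassandraReplica("test", "kv", partitionsPerHost = 10)
      .joinWithCassandraTable("test", "kv")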

    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to RDDFunctions[R] performed by method toRDDFunctions in com.datastax.spark.connector.
    Definition Classes
    RDDFunctions
  181. def rightOuterJoin[W](other: RDD[(K, W)], numPartitions: Int): RDD[(K, (Option[V], W))]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  182. def rightOuterJoin[W](other: RDD[(K, W)]): RDD[(K, (Option[V], W))]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  183. def rightOuterJoin[W](other: RDD[(K, W)], partitioner: Partitioner): RDD[(K, (Option[V], W))]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  184. def sample(withReplacement: Boolean, fraction: Double, seed: Long): RDD[R]

    Permalink
    Definition Classes
    RDD
  185. def sampleByKey(withReplacement: Boolean, fractions: Map[K, Double], seed: Long): RDD[(K, V)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  186. def sampleByKeyExact(withReplacement: Boolean, fractions: Map[K, Double], seed: Long): RDD[(K, V)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  187. def saveAsCassandraTable(keyspaceName: String, tableName: String, columns: ColumnSelector = AllColumns, writeConf: WriteConf = ...)(implicit connector: CassandraConnector = CassandraConnector(sparkContext), rwf: RowWriterFactory[R], columnMapper: ColumnMapper[R]): Unit

    Permalink

    Saves the data from RDD to a new table with definition taken from the ColumnMapper for this class.

    Saves the data from RDD to a new table with definition taken from the ColumnMapper for this class.

    keyspaceName

    the keyspace in which to create the new table

    tableName

    name of the table to create; the table must not exist

    columns

    Selects the columns to save data to. Uses only the unique column names, and you must select at least all primary key columns. All other fields are discarded. Non-selected property/column names are left unchanged. This parameter does not affect table creation.

    writeConf

    additional configuration object that allows setting the consistency level, batch size, etc.

    connector

    optional, implicit connector to Cassandra

    rwf

    factory for obtaining the row writer to be used to extract column values from items of the RDD

    columnMapper

    a column mapper determining the definition of the table
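
    A minimal sketch, assuming a hypothetical WordCount case class and a test keyspace; the target table must not yet exist:

    import com.datastax.spark.connector._

    case class WordCount(word: String, count: Long)

    val counts = sc.parallelize(Seq(WordCount("foo", 10L), WordCount("bar", 2L)))
    // Creates test.words_new from WordCount's ColumnMapper, then writes the rows.
    counts.saveAsCassandraTable("test", "words_new")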

    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to RDDFunctions[R] performed by method toRDDFunctions in com.datastax.spark.connector.
    Definition Classes
    RDDFunctions
  188. def saveAsCassandraTableEx(table: TableDef, columns: ColumnSelector = AllColumns, writeConf: WriteConf = ...)(implicit connector: CassandraConnector = CassandraConnector(sparkContext), rwf: RowWriterFactory[R]): Unit

    Permalink

    Saves the data from RDD to a new table defined by the given TableDef.

    Saves the data from RDD to a new table defined by the given TableDef.

    First it creates a new table with all columns from the TableDef and then it saves RDD content in the same way as saveToCassandra. The table must not exist prior to this call.

    table

    table definition used to create a new table

    columns

    Selects the columns to save data to. Uses only the unique column names, and you must select at least all primary key columns. All other fields are discarded. Non-selected property/column names are left unchanged. This parameter does not affect table creation.

    writeConf

    additional configuration object that allows setting the consistency level, batch size, etc.

    connector

    optional, implicit connector to Cassandra

    rwf

    factory for obtaining the row writer to be used to extract column values from items of the RDD
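
    A minimal sketch, assuming a hypothetical test.words_new table; the ColumnDef and TableDef constructors below follow the connector versions this page covers, but treat the exact shapes as assumptions:

    import com.datastax.spark.connector._
    import com.datastax.spark.connector.cql.{ColumnDef, PartitionKeyColumn, RegularColumn, TableDef}
    import com.datastax.spark.connector.types.{IntType, TextType}

    // Explicit definition of test.words_new(word text PRIMARY KEY, count int)
    val wordCol  = ColumnDef("word", PartitionKeyColumn, TextType)
    val countCol = ColumnDef("count", RegularColumn, IntType)
    val table    = TableDef("test", "words_new", Seq(wordCol), Seq.empty, Seq(countCol))

    sc.parallelize(Seq(("foo", 10), ("bar", 2)))
      .saveAsCassandraTableEx(table, SomeColumns("word", "count"))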

    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to RDDFunctions[R] performed by method toRDDFunctions in com.datastax.spark.connector.
    Definition Classes
    RDDFunctions
  189. def saveAsHadoopDataset(conf: JobConf): Unit

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  190. def saveAsHadoopFile(path: String, keyClass: Class[_], valueClass: Class[_], outputFormatClass: Class[_ <: OutputFormat[_, _]], conf: JobConf, codec: Option[Class[_ <: CompressionCodec]]): Unit

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  191. def saveAsHadoopFile(path: String, keyClass: Class[_], valueClass: Class[_], outputFormatClass: Class[_ <: OutputFormat[_, _]], codec: Class[_ <: CompressionCodec]): Unit

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  192. def saveAsHadoopFile[F <: OutputFormat[K, V]](path: String, codec: Class[_ <: CompressionCodec])(implicit fm: ClassTag[F]): Unit

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  193. def saveAsHadoopFile[F <: OutputFormat[K, V]](path: String)(implicit fm: ClassTag[F]): Unit

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  194. def saveAsNewAPIHadoopDataset(conf: Configuration): Unit

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  195. def saveAsNewAPIHadoopFile(path: String, keyClass: Class[_], valueClass: Class[_], outputFormatClass: Class[_ <: OutputFormat[_, _]], conf: Configuration): Unit

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  196. def saveAsNewAPIHadoopFile[F <: OutputFormat[K, V]](path: String)(implicit fm: ClassTag[F]): Unit

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  197. def saveAsObjectFile(path: String): Unit

    Permalink
    Definition Classes
    RDD
  198. def saveAsSequenceFile(path: String, codec: Option[Class[_ <: CompressionCodec]]): Unit

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to SequenceFileRDDFunctions[K, V] performed by method rddToSequenceFileRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type WritableFactory[K] is in scope
    4. an implicit value of type WritableFactory[V] is in scope
    5. R is (K, V) (R =:= (K, V))
    Definition Classes
    SequenceFileRDDFunctions
  199. def saveAsTextFile(path: String, codec: Class[_ <: CompressionCodec]): Unit

    Permalink
    Definition Classes
    RDD
  200. def saveAsTextFile(path: String): Unit

    Permalink
    Definition Classes
    RDD
  201. def saveToCassandra(keyspaceName: String, tableName: String, columns: ColumnSelector = AllColumns, writeConf: WriteConf = ...)(implicit connector: CassandraConnector = CassandraConnector(sparkContext), rwf: RowWriterFactory[R]): Unit

    Permalink

    Saves the data from RDD to a Cassandra table.

    Saves the data from RDD to a Cassandra table. Uses the specified column names.

    keyspaceName

    the name of the Keyspace to use

    tableName

    the name of the Table to use

    writeConf

    additional configuration object that allows setting the consistency level, batch size, etc.
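
    A minimal sketch, assuming a hypothetical, already existing test.words table with columns word and count:

    import com.datastax.spark.connector._

    // Hypothetical existing table: test.words(word text PRIMARY KEY, count int)
    val counts = sc.parallelize(Seq(("foo", 10), ("bar", 2)))
    counts.saveToCassandra("test", "words", SomeColumns("word", "count"))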

    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to RDDFunctions[R] performed by method toRDDFunctions in com.datastax.spark.connector.
    Definition Classes
    RDDFunctionsWritableToCassandra
    See also

    com.datastax.spark.connector.writer.WritableToCassandra

  202. def select(columns: ColumnRef*): Self

    Permalink

    Narrows down the selected set of columns.

    Narrows down the selected set of columns. Use this for better performance when you don't need all the columns in the result RDD. When called multiple times, it selects a subset of the already selected columns, so once a column has been removed by a previous select call, it cannot be added back.

    The selected columns are ColumnRef instances. This type allows specifying columns for straightforward retrieval as well as reading the TTL or write time of regular columns. Implicit conversions included in the com.datastax.spark.connector package make it possible to provide just column names (which is also backward compatible) and to optionally add a .ttl or .writeTime suffix in order to create an appropriate ColumnRef instance.
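
    A minimal sketch, assuming a hypothetical test.kv table; the .ttl and .writeTime suffixes come from the implicit conversions mentioned above:

    import com.datastax.spark.connector._

    // Hypothetical table: test.kv(key int PRIMARY KEY, value text)
    val rows = sc.cassandraTable("test", "kv")
      .select("key", "value", "value".ttl, "value".writeTime)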

  203. def selectedColumnNames: Seq[String]

    Permalink
  204. def setName(_name: String): CassandraRDD.this.type

    Permalink
    Definition Classes
    RDD
  205. def sortBy[K](f: (R) ⇒ K, ascending: Boolean, numPartitions: Int)(implicit ord: Ordering[K], ctag: ClassTag[K]): RDD[R]

    Permalink
    Definition Classes
    RDD
  206. def sortByKey(ascending: Boolean, numPartitions: Int): RDD[(K, V)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to OrderedRDDFunctions[K, V, (K, V)] performed by method rddToOrderedRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type Ordering[K] is in scope
    2. an implicit value of type ClassTag[K] is in scope
    3. an implicit value of type ClassTag[V] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    OrderedRDDFunctions
  207. def spanBy[U](f: (R) ⇒ U): RDD[(U, Iterable[R])]

    Permalink

    Applies a function to each item, and groups consecutive items having the same value together.

    Applies a function to each item, and groups consecutive items having the same value together. Unlike groupBy, items from the same group must already be next to each other in the original collection. Works locally on each partition, so items from different partitions will never be placed in the same group.
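
    A minimal sketch, assuming a hypothetical test.events table partitioned by user_id, so that rows of one user are adjacent within a Spark partition:

    import com.datastax.spark.connector._

    // Hypothetical table: test.events(user_id int, event_time timestamp, payload text,
    //                                 PRIMARY KEY (user_id, event_time))
    val byUser = sc.cassandraTable("test", "events")
      .spanBy(row => row.getInt("user_id"))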

    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to RDDFunctions[R] performed by method toRDDFunctions in com.datastax.spark.connector.
    Definition Classes
    RDDFunctions
  208. def spanByKey: RDD[(K, Seq[V])]

    Permalink

    Groups items with the same key, assuming the items with the same key are next to each other in the collection.

    Groups items with the same key, assuming the items with the same key are next to each other in the collection. It does not perform a shuffle, so it is much faster than the more general Spark RDD groupByKey. For this method to be useful with Cassandra tables, the key must represent a prefix of the primary key, containing at least the partition key of the Cassandra table.
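
    A minimal sketch, assuming the same hypothetical test.events table, read as (user_id, payload) pairs so that the key is a prefix of the primary key:

    import com.datastax.spark.connector._

    // Hypothetical table: test.events(user_id int, event_time timestamp, payload text,
    //                                 PRIMARY KEY (user_id, event_time))
    val grouped = sc.cassandraTable[(Int, String)]("test", "events")
      .select("user_id", "payload")
      .spanByKey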

    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to PairRDDFunctions[K, V] performed by method toPairRDDFunctions in com.datastax.spark.connector. This conversion will take place only if R is (K, V) (R =:= (K, V)).
    Definition Classes
    PairRDDFunctions
  209. def sparkContext: SparkContext

    Permalink
    Definition Classes
    RDD
  210. def subtract(other: RDD[R], p: Partitioner)(implicit ord: Ordering[R]): RDD[R]

    Permalink
    Definition Classes
    RDD
  211. def subtract(other: RDD[R], numPartitions: Int): RDD[R]

    Permalink
    Definition Classes
    RDD
  212. def subtract(other: RDD[R]): RDD[R]

    Permalink
    Definition Classes
    RDD
  213. def subtractByKey[W](other: RDD[(K, W)], p: Partitioner)(implicit arg0: ClassTag[W]): RDD[(K, V)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  214. def subtractByKey[W](other: RDD[(K, W)], numPartitions: Int)(implicit arg0: ClassTag[W]): RDD[(K, V)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  215. def subtractByKey[W](other: RDD[(K, W)])(implicit arg0: ClassTag[W]): RDD[(K, V)]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  216. final def synchronized[T0](arg0: ⇒ T0): T0

    Permalink
    Definition Classes
    AnyRef
  217. def take(num: Int): Array[R]

    Permalink
    Definition Classes
    CassandraRDD → RDD
  218. def takeAsync(num: Int): FutureAction[Seq[R]]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to AsyncRDDActions[R] performed by method rddToAsyncRDDActions in org.apache.spark.rdd.RDD. This conversion will take place only if R is accompanied by a ClassTag, which is a runtime representation of its type that survives erasure (R: ClassTag).
    Definition Classes
    AsyncRDDActions
  219. def takeOrdered(num: Int)(implicit ord: Ordering[R]): Array[R]

    Permalink
    Definition Classes
    RDD
  220. def takeSample(withReplacement: Boolean, num: Int, seed: Long): Array[R]

    Permalink
    Definition Classes
    RDD
  221. def toDebugString: String

    Permalink
    Definition Classes
    RDD
  222. def toJavaRDD(): JavaRDD[R]

    Permalink
    Definition Classes
    RDD
  223. def toLocalIterator: Iterator[R]

    Permalink
    Definition Classes
    RDD
  224. def toString(): String

    Permalink
    Definition Classes
    RDD → AnyRef → Any
  225. def top(num: Int)(implicit ord: Ordering[R]): Array[R]

    Permalink
    Definition Classes
    RDD
  226. def treeAggregate[U](zeroValue: U)(seqOp: (U, R) ⇒ U, combOp: (U, U) ⇒ U, depth: Int)(implicit arg0: ClassTag[U]): U

    Permalink
    Definition Classes
    RDD
  227. def treeReduce(f: (R, R) ⇒ R, depth: Int): R

    Permalink
    Definition Classes
    RDD
  228. def union(other: RDD[R]): RDD[R]

    Permalink
    Definition Classes
    RDD
  229. def unpersist(blocking: Boolean): CassandraRDD.this.type

    Permalink
    Definition Classes
    RDD
  230. def values: RDD[V]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V] performed by method rddToPairRDDFunctions in org.apache.spark.rdd.RDD.
    This conversion will take place only if all of the following constraints are met:
    1. an implicit value of type ClassTag[K] is in scope
    2. an implicit value of type ClassTag[V] is in scope
    3. an implicit value of type Ordering[K] is in scope
    4. R is (K, V) (R =:= (K, V))
    Definition Classes
    PairRDDFunctions
  231. final def wait(): Unit

    Permalink
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  232. final def wait(arg0: Long, arg1: Int): Unit

    Permalink
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  233. final def wait(arg0: Long): Unit

    Permalink
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  234. def where(cql: String, values: Any*): Self

    Permalink

    Adds CQL WHERE predicates to the query.

    Adds CQL WHERE predicates to the query. Useful for leveraging secondary indexes in Cassandra. Implicitly adds an ALLOW FILTERING clause to the WHERE clause; however, beware that some predicates may be rejected by Cassandra, particularly when they filter on an unindexed, non-clustering column.
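
    A minimal sketch, assuming a hypothetical test.events table with a secondary index on its category column:

    import com.datastax.spark.connector._

    // Hypothetical table: test.events(..., category text) with an index on category
    val logins = sc.cassandraTable("test", "events")
      .where("category = ?", "login")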

  235. def withAscOrder: Self

    Permalink
  236. def withConnector(connector: CassandraConnector): Self

    Permalink

    Returns a copy of this Cassandra RDD with the specified connector.
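
    A minimal sketch; the alternate contact point is hypothetical:

    import com.datastax.spark.connector._
    import com.datastax.spark.connector.cql.CassandraConnector

    // Read the same table through a connector pointed at a different cluster.
    val otherCluster = CassandraConnector(
      sc.getConf.set("spark.cassandra.connection.host", "10.0.0.2"))
    val rdd = sc.cassandraTable("test", "kv").withConnector(otherCluster)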

  237. def withDescOrder: Self

    Permalink
  238. def withReadConf(readConf: ReadConf): Self

    Permalink

    Allows setting a custom read configuration.

    Allows setting a custom read configuration, e.g. consistency level or fetch size.
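
    A minimal sketch; the ReadConf field names below follow the connector's ReadConf case class, but treat them as assumptions:

    import com.datastax.spark.connector._
    import com.datastax.spark.connector.rdd.ReadConf
    import com.datastax.driver.core.ConsistencyLevel

    // Lower the fetch size and raise the read consistency for this RDD only.
    val rdd = sc.cassandraTable("test", "kv")
      .withReadConf(ReadConf(fetchSizeInRows = 500,
                             consistencyLevel = ConsistencyLevel.LOCAL_QUORUM))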

  239. def zip[U](other: RDD[U])(implicit arg0: ClassTag[U]): RDD[(R, U)]

    Permalink
    Definition Classes
    RDD
  240. def zipPartitions[B, C, D, V](rdd2: RDD[B], rdd3: RDD[C], rdd4: RDD[D])(f: (Iterator[R], Iterator[B], Iterator[C], Iterator[D]) ⇒ Iterator[V])(implicit arg0: ClassTag[B], arg1: ClassTag[C], arg2: ClassTag[D], arg3: ClassTag[V]): RDD[V]

    Permalink
    Definition Classes
    RDD
  241. def zipPartitions[B, C, D, V](rdd2: RDD[B], rdd3: RDD[C], rdd4: RDD[D], preservesPartitioning: Boolean)(f: (Iterator[R], Iterator[B], Iterator[C], Iterator[D]) ⇒ Iterator[V])(implicit arg0: ClassTag[B], arg1: ClassTag[C], arg2: ClassTag[D], arg3: ClassTag[V]): RDD[V]

    Permalink
    Definition Classes
    RDD
  242. def zipPartitions[B, C, V](rdd2: RDD[B], rdd3: RDD[C])(f: (Iterator[R], Iterator[B], Iterator[C]) ⇒ Iterator[V])(implicit arg0: ClassTag[B], arg1: ClassTag[C], arg2: ClassTag[V]): RDD[V]

    Permalink
    Definition Classes
    RDD
  243. def zipPartitions[B, C, V](rdd2: RDD[B], rdd3: RDD[C], preservesPartitioning: Boolean)(f: (Iterator[R], Iterator[B], Iterator[C]) ⇒ Iterator[V])(implicit arg0: ClassTag[B], arg1: ClassTag[C], arg2: ClassTag[V]): RDD[V]

    Permalink
    Definition Classes
    RDD
  244. def zipPartitions[B, V](rdd2: RDD[B])(f: (Iterator[R], Iterator[B]) ⇒ Iterator[V])(implicit arg0: ClassTag[B], arg1: ClassTag[V]): RDD[V]

    Permalink
    Definition Classes
    RDD
  245. def zipPartitions[B, V](rdd2: RDD[B], preservesPartitioning: Boolean)(f: (Iterator[R], Iterator[B]) ⇒ Iterator[V])(implicit arg0: ClassTag[B], arg1: ClassTag[V]): RDD[V]

    Permalink
    Definition Classes
    RDD
  246. def zipWithIndex(): RDD[(R, Long)]

    Permalink
    Definition Classes
    RDD
  247. def zipWithUniqueId(): RDD[(R, Long)]

    Permalink
    Definition Classes
    RDD
  248. def →[B](y: B): (CassandraRDD[R], B)

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to ArrowAssoc[CassandraRDD[R]] performed by method ArrowAssoc in scala.Predef. This conversion will take place only if R is a superclass of Any and a subclass of (Nothing, Nothing) with Double (R >: Any <: (Nothing, Nothing) with Double).
    Definition Classes
    ArrowAssoc

Shadowed Implicit Value Members

  1. def histogram(buckets: Array[Double], evenBuckets: Boolean): Array[Long]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to DoubleRDDFunctions performed by method numericRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if R is a numeric class, such as Int, Long, Float or Double (R: Numeric).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraRDD: DoubleRDDFunctions).histogram(buckets, evenBuckets)
    Definition Classes
    DoubleRDDFunctions
  2. def histogram(bucketCount: Int): (Array[Double], Array[Long])

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to DoubleRDDFunctions performed by method numericRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if R is a numeric class, such as Int, Long, Float or Double (R: Numeric).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraRDD: DoubleRDDFunctions).histogram(bucketCount)
    Definition Classes
    DoubleRDDFunctions
  3. def histogram(buckets: Array[Double], evenBuckets: Boolean): Array[Long]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to DoubleRDDFunctions performed by method doubleRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if R is Double (R =:= Double).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraRDD: DoubleRDDFunctions).histogram(buckets, evenBuckets)
    Definition Classes
    DoubleRDDFunctions
  4. def histogram(bucketCount: Int): (Array[Double], Array[Long])

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to DoubleRDDFunctions performed by method doubleRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if R is Double (R =:= Double).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraRDD: DoubleRDDFunctions).histogram(bucketCount)
    Definition Classes
    DoubleRDDFunctions
  5. def mean(): Double

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to DoubleRDDFunctions performed by method numericRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if R is a numeric class, such as Int, Long, Float or Double (R: Numeric).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraRDD: DoubleRDDFunctions).mean()
    Definition Classes
    DoubleRDDFunctions
  6. def mean(): Double

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to DoubleRDDFunctions performed by method doubleRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if R is Double (R =:= Double).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraRDD: DoubleRDDFunctions).mean()
    Definition Classes
    DoubleRDDFunctions
  7. def meanApprox(timeout: Long, confidence: Double): PartialResult[BoundedDouble]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to DoubleRDDFunctions performed by method numericRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if R is a numeric class, such as Int, Long, Float or Double (R: Numeric).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraRDD: DoubleRDDFunctions).meanApprox(timeout, confidence)
    Definition Classes
    DoubleRDDFunctions
  8. def meanApprox(timeout: Long, confidence: Double): PartialResult[BoundedDouble]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to DoubleRDDFunctions performed by method doubleRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if R is Double (R =:= Double).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraRDD: DoubleRDDFunctions).meanApprox(timeout, confidence)
    Definition Classes
    DoubleRDDFunctions
  9. def popStdev(): Double

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to DoubleRDDFunctions performed by method numericRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if R is a numeric class, such as Int, Long, Float or Double (R: Numeric).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraRDD: DoubleRDDFunctions).popStdev()
    Definition Classes
    DoubleRDDFunctions
    Annotations
    @Since( "2.1.0" )
  10. def popStdev(): Double

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to DoubleRDDFunctions performed by method doubleRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if R is Double (R =:= Double).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraRDD: DoubleRDDFunctions).popStdev()
    Definition Classes
    DoubleRDDFunctions
    Annotations
    @Since( "2.1.0" )
  11. def popVariance(): Double

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to DoubleRDDFunctions performed by method numericRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if R is a numeric class, such as Int, Long, Float or Double (R: Numeric).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraRDD: DoubleRDDFunctions).popVariance()
    Definition Classes
    DoubleRDDFunctions
    Annotations
    @Since( "2.1.0" )
  12. def popVariance(): Double

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to DoubleRDDFunctions performed by method doubleRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if R is Double (R =:= Double).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraRDD: DoubleRDDFunctions).popVariance()
    Definition Classes
    DoubleRDDFunctions
    Annotations
    @Since( "2.1.0" )
  13. def sampleStdev(): Double

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to DoubleRDDFunctions performed by method numericRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if R is a numeric class, such as Int, Long, Float or Double (R: Numeric).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraRDD: DoubleRDDFunctions).sampleStdev()
    Definition Classes
    DoubleRDDFunctions
  14. def sampleStdev(): Double

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to DoubleRDDFunctions performed by method doubleRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if R is Double (R =:= Double).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraRDD: DoubleRDDFunctions).sampleStdev()
    Definition Classes
    DoubleRDDFunctions
  15. def sampleVariance(): Double

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to DoubleRDDFunctions performed by method numericRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if R is a numeric class, such as Int, Long, Float or Double (R: Numeric).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraRDD: DoubleRDDFunctions).sampleVariance()
    Definition Classes
    DoubleRDDFunctions
  16. def sampleVariance(): Double

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to DoubleRDDFunctions performed by method doubleRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if R is Double (R =:= Double).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraRDD: DoubleRDDFunctions).sampleVariance()
    Definition Classes
    DoubleRDDFunctions
  17. val sparkContext: SparkContext

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to RDDFunctions[R] performed by method toRDDFunctions in com.datastax.spark.connector.
    Shadowing
    This implicitly inherited member is shadowed by one or more members in this class.
    To access this member you can use a type ascription:
    (cassandraRDD: RDDFunctions[R]).sparkContext
    Definition Classes
    RDDFunctionsWritableToCassandra
  18. def stats(): StatCounter

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to DoubleRDDFunctions performed by method numericRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if R is a numeric class, such as Int, Long, Float or Double (R: Numeric).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraRDD: DoubleRDDFunctions).stats()
    Definition Classes
    DoubleRDDFunctions
  19. def stats(): StatCounter

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to DoubleRDDFunctions performed by method doubleRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if R is Double (R =:= Double).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraRDD: DoubleRDDFunctions).stats()
    Definition Classes
    DoubleRDDFunctions
  20. def stdev(): Double

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to DoubleRDDFunctions performed by method numericRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if R is a numeric class, such as Int, Long, Float or Double (R: Numeric).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraRDD: DoubleRDDFunctions).stdev()
    Definition Classes
    DoubleRDDFunctions
  21. def stdev(): Double

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to DoubleRDDFunctions performed by method doubleRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if R is Double (R =:= Double).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraRDD: DoubleRDDFunctions).stdev()
    Definition Classes
    DoubleRDDFunctions
  22. def sum(): Double

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to DoubleRDDFunctions performed by method numericRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if R is a numeric class, such as Int, Long, Float or Double (R: Numeric).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraRDD: DoubleRDDFunctions).sum()
    Definition Classes
    DoubleRDDFunctions
  23. def sum(): Double

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to DoubleRDDFunctions performed by method doubleRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if R is Double (R =:= Double).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraRDD: DoubleRDDFunctions).sum()
    Definition Classes
    DoubleRDDFunctions
  24. def sumApprox(timeout: Long, confidence: Double): PartialResult[BoundedDouble]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to DoubleRDDFunctions performed by method numericRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if R is a numeric class, such as Int, Long, Float or Double (R: Numeric).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraRDD: DoubleRDDFunctions).sumApprox(timeout, confidence)
    Definition Classes
    DoubleRDDFunctions
  25. def sumApprox(timeout: Long, confidence: Double): PartialResult[BoundedDouble]

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to DoubleRDDFunctions performed by method doubleRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if R is Double (R =:= Double).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraRDD: DoubleRDDFunctions).sumApprox(timeout, confidence)
    Definition Classes
    DoubleRDDFunctions
  26. def variance(): Double

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to DoubleRDDFunctions performed by method numericRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if R is a numeric class, such as Int, Long, Float or Double (R: Numeric).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraRDD: DoubleRDDFunctions).variance()
    Definition Classes
    DoubleRDDFunctions
  27. def variance(): Double

    Permalink
    Implicit information
    This member is added by an implicit conversion from CassandraRDD[R] to DoubleRDDFunctions performed by method doubleRDDToDoubleRDDFunctions in org.apache.spark.rdd.RDD. This conversion will take place only if R is Double (R =:= Double).
    Shadowing
    This implicitly inherited member is ambiguous. One or more implicitly inherited members have similar signatures, so calling this member may produce an ambiguous implicit conversion compiler error.
    To access this member you can use a type ascription:
    (cassandraRDD: DoubleRDDFunctions).variance()
    Definition Classes
    DoubleRDDFunctions

Inherited from RDD[R]

Inherited from Logging

Inherited from Serializable

Inherited from Serializable

Inherited from AnyRef

Inherited from Any

Inherited by implicit conversion rddToPairRDDFunctions from CassandraRDD[R] to org.apache.spark.rdd.PairRDDFunctions[K, V]

Inherited by implicit conversion numericRDDToDoubleRDDFunctions from CassandraRDD[R] to DoubleRDDFunctions

Inherited by implicit conversion doubleRDDToDoubleRDDFunctions from CassandraRDD[R] to DoubleRDDFunctions

Inherited by implicit conversion rddToOrderedRDDFunctions from CassandraRDD[R] to OrderedRDDFunctions[K, V, (K, V)]

Inherited by implicit conversion rddToSequenceFileRDDFunctions from CassandraRDD[R] to SequenceFileRDDFunctions[K, V]

Inherited by implicit conversion rddToAsyncRDDActions from CassandraRDD[R] to AsyncRDDActions[R]

Inherited by implicit conversion toPairRDDFunctions from CassandraRDD[R] to PairRDDFunctions[K, V]

Inherited by implicit conversion toRDDFunctions from CassandraRDD[R] to RDDFunctions[R]

Inherited by implicit conversion any2stringadd from CassandraRDD[R] to any2stringadd[CassandraRDD[R]]

Inherited by implicit conversion StringFormat from CassandraRDD[R] to StringFormat[CassandraRDD[R]]

Inherited by implicit conversion Ensuring from CassandraRDD[R] to Ensuring[CassandraRDD[R]]

Inherited by implicit conversion ArrowAssoc from CassandraRDD[R] to ArrowAssoc[CassandraRDD[R]]

Ungrouped