com.datastax.spark.connector.rdd

EmptyCassandraRDD

class EmptyCassandraRDD[R] extends CassandraRDD[R]

Represents a CassandraRDD with no rows. This RDD does not load any data from Cassandra and does not require the table to exist.
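
For illustration, a minimal sketch of obtaining an empty RDD from a regular one; "ks" and "table" are placeholder names and sc is assumed to be a configured SparkContext:

    import com.datastax.spark.connector._

    val rdd = sc.cassandraTable("ks", "table")   // regular CassandraRDD
    val empty = rdd.toEmptyCassandraRDD          // same metadata, but no rows
    empty.count()                                // 0, computed without contacting Cassandra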

Linear Supertypes
CassandraRDD[R], RDD[R], Logging, Serializable, Serializable, AnyRef, Any

Instance Constructors

  1. new EmptyCassandraRDD(sc: SparkContext, keyspaceName: String, tableName: String, columnNames: ColumnSelector = AllColumns, where: CqlWhereClause = CqlWhereClause.empty, limit: Option[CassandraLimit] = None, clusteringOrder: Option[ClusteringOrder] = None, readConf: ReadConf = ReadConf())(implicit arg0: ClassTag[R])

Type Members

  1. type Self = EmptyCassandraRDD[R]

    This is slightly different from Scala's this.type. this.type is the unique singleton type of an object, which is not compatible with other instances of the same type, so returning anything other than this is not really possible without lying to the compiler via explicit casts. Here, Self is used instead, to return a copy of the object: a different instance of the same type.
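
    For example, chainable transformations declared in CassandraRDD return Self, so on an empty RDD they yield another EmptyCassandraRDD. A sketch, where rdd is a placeholder CassandraRDD of pairs:

    val empty: EmptyCassandraRDD[(String, Long)] = rdd.toEmptyCassandraRDD
    val narrowed: EmptyCassandraRDD[(String, Long)] = empty.select("column1", "column2")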

    Definition Classes
    EmptyCassandraRDD → CassandraRDD

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. def ++(other: RDD[R]): RDD[R]
    Definition Classes
    RDD
  4. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  5. def aggregate[U](zeroValue: U)(seqOp: (U, R) ⇒ U, combOp: (U, U) ⇒ U)(implicit arg0: ClassTag[U]): U
    Definition Classes
    RDD
  6. def as[B, A0, A1, A2, A3, A4, A5, A6, A7, A8, A9, A10, A11](f: (A0, A1, A2, A3, A4, A5, A6, A7, A8, A9, A10, A11) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2], arg4: TypeConverter[A3], arg5: TypeConverter[A4], arg6: TypeConverter[A5], arg7: TypeConverter[A6], arg8: TypeConverter[A7], arg9: TypeConverter[A8], arg10: TypeConverter[A9], arg11: TypeConverter[A10], arg12: TypeConverter[A11]): CassandraRDD[B]
    Definition Classes
    CassandraRDD
  7. def as[B, A0, A1, A2, A3, A4, A5, A6, A7, A8, A9, A10](f: (A0, A1, A2, A3, A4, A5, A6, A7, A8, A9, A10) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2], arg4: TypeConverter[A3], arg5: TypeConverter[A4], arg6: TypeConverter[A5], arg7: TypeConverter[A6], arg8: TypeConverter[A7], arg9: TypeConverter[A8], arg10: TypeConverter[A9], arg11: TypeConverter[A10]): CassandraRDD[B]
    Definition Classes
    CassandraRDD
  8. def as[B, A0, A1, A2, A3, A4, A5, A6, A7, A8, A9](f: (A0, A1, A2, A3, A4, A5, A6, A7, A8, A9) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2], arg4: TypeConverter[A3], arg5: TypeConverter[A4], arg6: TypeConverter[A5], arg7: TypeConverter[A6], arg8: TypeConverter[A7], arg9: TypeConverter[A8], arg10: TypeConverter[A9]): CassandraRDD[B]
    Definition Classes
    CassandraRDD
  9. def as[B, A0, A1, A2, A3, A4, A5, A6, A7, A8](f: (A0, A1, A2, A3, A4, A5, A6, A7, A8) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2], arg4: TypeConverter[A3], arg5: TypeConverter[A4], arg6: TypeConverter[A5], arg7: TypeConverter[A6], arg8: TypeConverter[A7], arg9: TypeConverter[A8]): CassandraRDD[B]
    Definition Classes
    CassandraRDD
  10. def as[B, A0, A1, A2, A3, A4, A5, A6, A7](f: (A0, A1, A2, A3, A4, A5, A6, A7) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2], arg4: TypeConverter[A3], arg5: TypeConverter[A4], arg6: TypeConverter[A5], arg7: TypeConverter[A6], arg8: TypeConverter[A7]): CassandraRDD[B]
    Definition Classes
    CassandraRDD
  11. def as[B, A0, A1, A2, A3, A4, A5, A6](f: (A0, A1, A2, A3, A4, A5, A6) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2], arg4: TypeConverter[A3], arg5: TypeConverter[A4], arg6: TypeConverter[A5], arg7: TypeConverter[A6]): CassandraRDD[B]
    Definition Classes
    CassandraRDD
  12. def as[B, A0, A1, A2, A3, A4, A5](f: (A0, A1, A2, A3, A4, A5) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2], arg4: TypeConverter[A3], arg5: TypeConverter[A4], arg6: TypeConverter[A5]): CassandraRDD[B]
    Definition Classes
    CassandraRDD
  13. def as[B, A0, A1, A2, A3, A4](f: (A0, A1, A2, A3, A4) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2], arg4: TypeConverter[A3], arg5: TypeConverter[A4]): CassandraRDD[B]
    Definition Classes
    CassandraRDD
  14. def as[B, A0, A1, A2, A3](f: (A0, A1, A2, A3) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2], arg4: TypeConverter[A3]): CassandraRDD[B]
    Definition Classes
    CassandraRDD
  15. def as[B, A0, A1, A2](f: (A0, A1, A2) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1], arg3: TypeConverter[A2]): CassandraRDD[B]
    Definition Classes
    CassandraRDD
  16. def as[B, A0, A1](f: (A0, A1) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0], arg2: TypeConverter[A1]): CassandraRDD[B]
    Definition Classes
    CassandraRDD
  17. def as[B, A0](f: (A0) ⇒ B)(implicit arg0: ClassTag[B], arg1: TypeConverter[A0]): CassandraRDD[B]

    Maps each row into an object of a different type using the provided function, taking column value(s) as argument(s). Can be used to convert each row to a tuple or a case class object:

    sc.cassandraTable("ks", "table")
      .select("column1")
      .as((s: String) => s)                 // yields CassandraRDD[String]
    
    sc.cassandraTable("ks", "table")
      .select("column1", "column2")
      .as((_: String, _: Long))             // yields CassandraRDD[(String, Long)]
    
    case class MyRow(key: String, value: Long)
    sc.cassandraTable("ks", "table")
      .select("column1", "column2")
      .as(MyRow)                            // yields CassandraRDD[MyRow]
    Definition Classes
    CassandraRDD
  18. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  19. def barrier(): RDDBarrier[R]
    Definition Classes
    RDD
    Annotations
    @Experimental() @Since( "2.4.0" )
  20. def cache(): EmptyCassandraRDD.this.type
    Definition Classes
    RDD
  21. def cartesian[U](other: RDD[U])(implicit arg0: ClassTag[U]): RDD[(R, U)]
    Definition Classes
    RDD
  22. val cassandraCount: Long

    Counts the number of items in this RDD by selecting count(*) on the Cassandra table.

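    Since this RDD is empty, the count here is 0 and no query is issued. On a regular table RDD the count is pushed down to Cassandra; a usage sketch with placeholder names:

    val n: Long = sc.cassandraTable("ks", "table").cassandraCount   // runs select count(*) server-side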
    Definition Classes
    EmptyCassandraRDD → CassandraRDD
  23. def checkpoint(): Unit
    Definition Classes
    RDD
  24. def cleanShuffleDependencies(blocking: Boolean): Unit
    Definition Classes
    RDD
    Annotations
    @Experimental() @DeveloperApi() @Since( "3.1.0" )
  25. def clearDependencies(): Unit
    Attributes
    protected
    Definition Classes
    RDD
  26. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native() @HotSpotIntrinsicCandidate()
  27. def clusteringOrder(order: ClusteringOrder): Self

    Adds a CQL ORDER BY clause to the query. It can be applied only when the table has clustering columns and a primary key predicate is pushed down in where. It is useful when the default ordering direction of rows within a single Cassandra partition needs to be changed.

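    A sketch with placeholder names, reading the rows of one Cassandra partition in reverse clustering order (withDescOrder is equivalent shorthand):

    import com.datastax.spark.connector.rdd.ClusteringOrder

    sc.cassandraTable("ks", "events")
      .where("key = ?", "k1")
      .clusteringOrder(ClusteringOrder.Descending)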
    Definition Classes
    CassandraRDD
  28. val clusteringOrder: Option[ClusteringOrder]
    Definition Classes
    EmptyCassandraRDD → CassandraRDD
  29. def coalesce(numPartitions: Int, shuffle: Boolean, partitionCoalescer: Option[PartitionCoalescer])(implicit ord: Ordering[R]): RDD[R]
    Definition Classes
    RDD
  30. def collect[U](f: PartialFunction[R, U])(implicit arg0: ClassTag[U]): RDD[U]
    Definition Classes
    RDD
  31. def collect(): Array[R]
    Definition Classes
    RDD
  32. val columnNames: ColumnSelector
    Definition Classes
    EmptyCassandraRDD → CassandraRDD
  33. def compute(split: Partition, context: TaskContext): Iterator[R]
    Definition Classes
    EmptyCassandraRDD → RDD
    Annotations
    @DeveloperApi()
  34. def connector: CassandraConnector
    Attributes
    protected
    Definition Classes
    EmptyCassandraRDD → CassandraRDD
  35. def context: SparkContext
    Definition Classes
    RDD
  36. def convertTo[B](implicit arg0: ClassTag[B], arg1: RowReaderFactory[B]): CassandraRDD[B]
    Attributes
    protected
    Definition Classes
    CassandraRDD
  37. def copy(columnNames: ColumnSelector = columnNames, where: CqlWhereClause = where, limit: Option[CassandraLimit] = limit, clusteringOrder: Option[ClusteringOrder] = None, readConf: ReadConf = readConf, connector: CassandraConnector = connector): EmptyCassandraRDD[R]

    Allows copying this RDD while changing some of its properties.

    Attributes
    protected
    Definition Classes
    EmptyCassandraRDD → CassandraRDD
  38. def count(): Long
    Definition Classes
    RDD
  39. def countApprox(timeout: Long, confidence: Double): PartialResult[BoundedDouble]
    Definition Classes
    RDD
  40. def countApproxDistinct(relativeSD: Double): Long
    Definition Classes
    RDD
  41. def countApproxDistinct(p: Int, sp: Int): Long
    Definition Classes
    RDD
  42. def countByValue()(implicit ord: Ordering[R]): Map[R, Long]
    Definition Classes
    RDD
  43. def countByValueApprox(timeout: Long, confidence: Double)(implicit ord: Ordering[R]): PartialResult[Map[R, BoundedDouble]]
    Definition Classes
    RDD
  44. final def dependencies: Seq[Dependency[_]]
    Definition Classes
    RDD
  45. def distinct(): RDD[R]
    Definition Classes
    RDD
  46. def distinct(numPartitions: Int)(implicit ord: Ordering[R]): RDD[R]
    Definition Classes
    RDD
  47. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  48. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  49. def filter(f: (R) ⇒ Boolean): RDD[R]
    Definition Classes
    RDD
  50. def first(): R
    Definition Classes
    RDD
  51. def firstParent[U](implicit arg0: ClassTag[U]): RDD[U]
    Attributes
    protected[org.apache.spark]
    Definition Classes
    RDD
  52. def flatMap[U](f: (R) ⇒ TraversableOnce[U])(implicit arg0: ClassTag[U]): RDD[U]
    Definition Classes
    RDD
  53. def fold(zeroValue: R)(op: (R, R) ⇒ R): R
    Definition Classes
    RDD
  54. def foreach(f: (R) ⇒ Unit): Unit
    Definition Classes
    RDD
  55. def foreachPartition(f: (Iterator[R]) ⇒ Unit): Unit
    Definition Classes
    RDD
  56. def getCheckpointFile: Option[String]
    Definition Classes
    RDD
  57. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  58. def getDependencies: Seq[Dependency[_]]
    Attributes
    protected
    Definition Classes
    RDD
  59. final def getNumPartitions: Int
    Definition Classes
    RDD
    Annotations
    @Since( "1.6.0" )
  60. def getOutputDeterministicLevel: org.apache.spark.rdd.DeterministicLevel.Value
    Attributes
    protected
    Definition Classes
    RDD
    Annotations
    @DeveloperApi()
  61. def getPartitions: Array[Partition]
    Attributes
    protected
    Definition Classes
    EmptyCassandraRDD → RDD
  62. def getPreferredLocations(split: Partition): Seq[String]
    Attributes
    protected
    Definition Classes
    RDD
  63. def getResourceProfile(): ResourceProfile
    Definition Classes
    RDD
    Annotations
    @Experimental() @Since( "3.1.0" )
  64. def getStorageLevel: StorageLevel
    Definition Classes
    RDD
  65. def glom(): RDD[Array[R]]
    Definition Classes
    RDD
  66. def groupBy[K](f: (R) ⇒ K, p: Partitioner)(implicit kt: ClassTag[K], ord: Ordering[K]): RDD[(K, Iterable[R])]
    Definition Classes
    RDD
  67. def groupBy[K](f: (R) ⇒ K, numPartitions: Int)(implicit kt: ClassTag[K]): RDD[(K, Iterable[R])]
    Definition Classes
    RDD
  68. def groupBy[K](f: (R) ⇒ K)(implicit kt: ClassTag[K]): RDD[(K, Iterable[R])]
    Definition Classes
    RDD
  69. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  70. val id: Int
    Definition Classes
    RDD
  71. def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  72. def initializeLogIfNecessary(isInterpreter: Boolean): Unit
    Attributes
    protected
    Definition Classes
    Logging
  73. def intersection(other: RDD[R], numPartitions: Int): RDD[R]
    Definition Classes
    RDD
  74. def intersection(other: RDD[R], partitioner: Partitioner)(implicit ord: Ordering[R]): RDD[R]
    Definition Classes
    RDD
  75. def intersection(other: RDD[R]): RDD[R]
    Definition Classes
    RDD
  76. lazy val isBarrier_: Boolean
    Attributes
    protected
    Definition Classes
    RDD
    Annotations
    @transient()
  77. def isCheckpointed: Boolean
    Definition Classes
    RDD
  78. def isEmpty(): Boolean
    Definition Classes
    RDD
  79. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  80. def isTraceEnabled(): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  81. final def iterator(split: Partition, context: TaskContext): Iterator[R]
    Definition Classes
    RDD
  82. def keyBy[K](f: (R) ⇒ K): RDD[(K, R)]
    Definition Classes
    RDD
  83. val keyspaceName: String
    Definition Classes
    EmptyCassandraRDD → CassandraRDD
  84. def limit(rowLimit: Long): Self

    Adds the limit clause to the CQL select statement. The limit will be applied to each created Spark partition; in other words, unless the data is fetched from a single Cassandra partition, the number of results is unpredictable.

    The main purpose of passing a limit clause is to fetch the top n rows from a single Cassandra partition, when the table is designed so that it uses clustering keys and a partition key predicate is passed to the where clause.

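    A sketch with placeholder names; pinning the partition key in where makes the limit deterministic:

    sc.cassandraTable("ks", "events")
      .where("key = ?", "k1")   // a single Cassandra partition
      .limit(5)                 // its top 5 rows in clustering order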
    Definition Classes
    CassandraRDD
  85. val limit: Option[CassandraLimit]
    Definition Classes
    EmptyCassandraRDD → CassandraRDD
  86. def localCheckpoint(): EmptyCassandraRDD.this.type
    Definition Classes
    RDD
  87. def log: Logger
    Attributes
    protected
    Definition Classes
    Logging
  88. def logDebug(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  89. def logDebug(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  90. def logError(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  91. def logError(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  92. def logInfo(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  93. def logInfo(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  94. def logName: String
    Attributes
    protected
    Definition Classes
    Logging
  95. def logTrace(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  96. def logTrace(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  97. def logWarning(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  98. def logWarning(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  99. def map[U](f: (R) ⇒ U)(implicit arg0: ClassTag[U]): RDD[U]
    Definition Classes
    RDD
  100. def mapPartitions[U](f: (Iterator[R]) ⇒ Iterator[U], preservesPartitioning: Boolean)(implicit arg0: ClassTag[U]): RDD[U]
    Definition Classes
    RDD
  101. def mapPartitionsWithIndex[U](f: (Int, Iterator[R]) ⇒ Iterator[U], preservesPartitioning: Boolean)(implicit arg0: ClassTag[U]): RDD[U]
    Definition Classes
    RDD
  102. def max()(implicit ord: Ordering[R]): R
    Definition Classes
    RDD
  103. def min()(implicit ord: Ordering[R]): R
    Definition Classes
    RDD
  104. var name: String
    Definition Classes
    RDD
  105. def narrowColumnSelection(columns: Seq[ColumnRef]): Seq[ColumnRef]
    Attributes
    protected
    Definition Classes
    EmptyCassandraRDD → CassandraRDD
  106. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  107. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  108. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @HotSpotIntrinsicCandidate()
  109. def parent[U](j: Int)(implicit arg0: ClassTag[U]): RDD[U]
    Attributes
    protected[org.apache.spark]
    Definition Classes
    RDD
  110. val partitioner: Option[Partitioner]
    Definition Classes
    RDD
  111. final def partitions: Array[Partition]
    Definition Classes
    RDD
  112. def perPartitionLimit(rowLimit: Long): Self

    Adds the PER PARTITION LIMIT clause to the CQL select statement. The limit will be applied to every Cassandra partition. Only valid for Cassandra 3.6+.

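    A sketch with placeholder names:

    sc.cassandraTable("ks", "events")
      .perPartitionLimit(3)   // at most 3 rows from each Cassandra partition (Cassandra 3.6+)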
    Definition Classes
    CassandraRDD
  113. def persist(): EmptyCassandraRDD.this.type
    Definition Classes
    RDD
  114. def persist(newLevel: StorageLevel): EmptyCassandraRDD.this.type
    Definition Classes
    RDD
  115. def pipe(command: Seq[String], env: Map[String, String], printPipeContext: ((String) ⇒ Unit) ⇒ Unit, printRDDElement: (R, (String) ⇒ Unit) ⇒ Unit, separateWorkingDir: Boolean, bufferSize: Int, encoding: String): RDD[String]
    Definition Classes
    RDD
  116. def pipe(command: String, env: Map[String, String]): RDD[String]
    Definition Classes
    RDD
  117. def pipe(command: String): RDD[String]
    Definition Classes
    RDD
  118. final def preferredLocations(split: Partition): Seq[String]
    Definition Classes
    RDD
  119. def randomSplit(weights: Array[Double], seed: Long): Array[RDD[R]]
    Definition Classes
    RDD
  120. val readConf: ReadConf
    Definition Classes
    EmptyCassandraRDD → CassandraRDD
  121. def reduce(f: (R, R) ⇒ R): R
    Definition Classes
    RDD
  122. def repartition(numPartitions: Int)(implicit ord: Ordering[R]): RDD[R]
    Definition Classes
    RDD
  123. def sample(withReplacement: Boolean, fraction: Double, seed: Long): RDD[R]
    Definition Classes
    RDD
  124. def saveAsObjectFile(path: String): Unit
    Definition Classes
    RDD
  125. def saveAsTextFile(path: String, codec: Class[_ <: CompressionCodec]): Unit
    Definition Classes
    RDD
  126. def saveAsTextFile(path: String): Unit
    Definition Classes
    RDD
  127. val sc: SparkContext
  128. def select(columns: ColumnRef*): Self

    Narrows down the selected set of columns. Use this for better performance when you don't need all the columns in the result RDD. When called multiple times, it selects a subset of the already selected columns, so once a column has been removed by a previous select call, it is not possible to add it back.

    The selected columns are ColumnRef instances. This type allows specifying columns for straightforward retrieval as well as reading the TTL or write time of regular columns. Implicit conversions included in the com.datastax.spark.connector package make it possible to provide plain column names (which is also backward compatible) and optionally add a .ttl or .writeTime suffix to create the appropriate ColumnRef instance.

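    For example, selecting a column together with the TTL and write time of another, via the implicit conversions mentioned above (a sketch; names are placeholders):

    import com.datastax.spark.connector._

    sc.cassandraTable("ks", "table")
      .select("column1", "column2".ttl, "column2".writeTime)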
    Definition Classes
    CassandraRDD
  129. def selectedColumnNames: Seq[String]
    Definition Classes
    CassandraRDD
  130. lazy val selectedColumnRefs: Seq[ColumnRef]
    Definition Classes
    EmptyCassandraRDD → CassandraRDD
  131. def setName(_name: String): EmptyCassandraRDD.this.type
    Definition Classes
    RDD
  132. def sortBy[K](f: (R) ⇒ K, ascending: Boolean, numPartitions: Int)(implicit ord: Ordering[K], ctag: ClassTag[K]): RDD[R]
    Definition Classes
    RDD
  133. def sparkContext: SparkContext
    Definition Classes
    RDD
  134. def subtract(other: RDD[R], p: Partitioner)(implicit ord: Ordering[R]): RDD[R]
    Definition Classes
    RDD
  135. def subtract(other: RDD[R], numPartitions: Int): RDD[R]
    Definition Classes
    RDD
  136. def subtract(other: RDD[R]): RDD[R]
    Definition Classes
    RDD
  137. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  138. val tableName: String
    Definition Classes
    EmptyCassandraRDD → CassandraRDD
  139. def take(num: Int): Array[R]
    Definition Classes
    CassandraRDD → RDD
  140. def takeOrdered(num: Int)(implicit ord: Ordering[R]): Array[R]
    Definition Classes
    RDD
  141. def takeSample(withReplacement: Boolean, num: Int, seed: Long): Array[R]
    Definition Classes
    RDD
  142. def toDebugString: String
    Definition Classes
    RDD
  143. def toEmptyCassandraRDD: EmptyCassandraRDD[R]
    Definition Classes
    EmptyCassandraRDD → CassandraRDD
  144. def toJavaRDD(): JavaRDD[R]
    Definition Classes
    RDD
  145. def toLocalIterator: Iterator[R]
    Definition Classes
    RDD
  146. def toString(): String
    Definition Classes
    RDD → AnyRef → Any
  147. def top(num: Int)(implicit ord: Ordering[R]): Array[R]
    Definition Classes
    RDD
  148. def treeAggregate[U](zeroValue: U)(seqOp: (U, R) ⇒ U, combOp: (U, U) ⇒ U, depth: Int)(implicit arg0: ClassTag[U]): U
    Definition Classes
    RDD
  149. def treeReduce(f: (R, R) ⇒ R, depth: Int): R
    Definition Classes
    RDD
  150. def union(other: RDD[R]): RDD[R]
    Definition Classes
    RDD
  151. def unpersist(blocking: Boolean): EmptyCassandraRDD.this.type
    Definition Classes
    RDD
  152. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  153. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  154. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  155. def where(cql: String, values: Any*): Self

    Adds CQL WHERE predicates to the query. Useful for leveraging secondary indexes in Cassandra. Implicitly adds an ALLOW FILTERING clause to the WHERE clause; however, beware that some predicates may be rejected by Cassandra, particularly when they filter on an unindexed, non-clustering column.

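    A sketch with placeholder names, assuming a secondary index on the email column:

    sc.cassandraTable("ks", "users")
      .where("email = ?", "someone@example.com")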
    Definition Classes
    CassandraRDD
  156. val where: CqlWhereClause
    Definition Classes
    EmptyCassandraRDD → CassandraRDD
  157. def withAscOrder: Self
    Definition Classes
    CassandraRDD
  158. def withConnector(connector: CassandraConnector): Self

    Returns a copy of this Cassandra RDD with the specified connector.

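    A sketch, assuming the SparkConf carries the desired connection settings:

    import com.datastax.spark.connector.cql.CassandraConnector

    val connector = CassandraConnector(sc.getConf)
    val rdd2 = rdd.withConnector(connector)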
    Definition Classes
    CassandraRDD
  159. def withDescOrder: Self
    Definition Classes
    CassandraRDD
  160. def withReadConf(readConf: ReadConf): Self

    Allows setting a custom read configuration, e.g. consistency level or fetch size.

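    A sketch; the fetchSizeInRows field is an assumption based on common ReadConf settings:

    import com.datastax.spark.connector.rdd.ReadConf

    sc.cassandraTable("ks", "table")
      .withReadConf(ReadConf(fetchSizeInRows = 500))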
    Definition Classes
    CassandraRDD
  161. def withResources(rp: ResourceProfile): EmptyCassandraRDD.this.type
    Definition Classes
    RDD
    Annotations
    @Experimental() @Since( "3.1.0" )
  162. def zip[U](other: RDD[U])(implicit arg0: ClassTag[U]): RDD[(R, U)]
    Definition Classes
    RDD
  163. def zipPartitions[B, C, D, V](rdd2: RDD[B], rdd3: RDD[C], rdd4: RDD[D])(f: (Iterator[R], Iterator[B], Iterator[C], Iterator[D]) ⇒ Iterator[V])(implicit arg0: ClassTag[B], arg1: ClassTag[C], arg2: ClassTag[D], arg3: ClassTag[V]): RDD[V]
    Definition Classes
    RDD
  164. def zipPartitions[B, C, D, V](rdd2: RDD[B], rdd3: RDD[C], rdd4: RDD[D], preservesPartitioning: Boolean)(f: (Iterator[R], Iterator[B], Iterator[C], Iterator[D]) ⇒ Iterator[V])(implicit arg0: ClassTag[B], arg1: ClassTag[C], arg2: ClassTag[D], arg3: ClassTag[V]): RDD[V]
    Definition Classes
    RDD
  165. def zipPartitions[B, C, V](rdd2: RDD[B], rdd3: RDD[C])(f: (Iterator[R], Iterator[B], Iterator[C]) ⇒ Iterator[V])(implicit arg0: ClassTag[B], arg1: ClassTag[C], arg2: ClassTag[V]): RDD[V]
    Definition Classes
    RDD
  166. def zipPartitions[B, C, V](rdd2: RDD[B], rdd3: RDD[C], preservesPartitioning: Boolean)(f: (Iterator[R], Iterator[B], Iterator[C]) ⇒ Iterator[V])(implicit arg0: ClassTag[B], arg1: ClassTag[C], arg2: ClassTag[V]): RDD[V]
    Definition Classes
    RDD
  167. def zipPartitions[B, V](rdd2: RDD[B])(f: (Iterator[R], Iterator[B]) ⇒ Iterator[V])(implicit arg0: ClassTag[B], arg1: ClassTag[V]): RDD[V]
    Definition Classes
    RDD
  168. def zipPartitions[B, V](rdd2: RDD[B], preservesPartitioning: Boolean)(f: (Iterator[R], Iterator[B]) ⇒ Iterator[V])(implicit arg0: ClassTag[B], arg1: ClassTag[V]): RDD[V]
    Definition Classes
    RDD
  169. def zipWithIndex(): RDD[(R, Long)]
    Definition Classes
    RDD
  170. def zipWithUniqueId(): RDD[(R, Long)]
    Definition Classes
    RDD

Deprecated Value Members

  1. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] ) @Deprecated @deprecated
    Deprecated

    See the corresponding Javadoc for more information.

Inherited from CassandraRDD[R]

Inherited from RDD[R]

Inherited from Logging

Inherited from Serializable

Inherited from Serializable

Inherited from AnyRef

Inherited from Any
