package com.datastax.spark.connector

The root package of the Cassandra connector for Apache Spark. Offers handy implicit conversions that add Cassandra-specific methods to SparkContext and RDD.

Call the cassandraTable method on the SparkContext object to create a CassandraRDD exposing Cassandra tables as Spark RDDs.

Call the saveToCassandra method (provided by RDDFunctions) on any RDD to save the distributed collection to a Cassandra table.

Example:

CREATE KEYSPACE test WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1 };
CREATE TABLE test.words (word text PRIMARY KEY, count int);
INSERT INTO test.words(word, count) VALUES ('and', 50);
import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._

val sparkMasterHost = "127.0.0.1"
val cassandraHost = "127.0.0.1"
val keyspace = "test"
val table = "words"

// Tell Spark the address of one Cassandra node:
val conf = new SparkConf(true).set("spark.cassandra.connection.host", cassandraHost)

// Connect to the Spark cluster:
val sc = new SparkContext("spark://" + sparkMasterHost + ":7077", "example", conf)

// Read the table and print its contents:
val rdd = sc.cassandraTable(keyspace, table)
rdd.collect().foreach(println)

// Write two rows to the table:
val col = sc.parallelize(Seq(("of", 1200), ("the", 863)))
col.saveToCassandra(keyspace, table)

sc.stop()
Linear Supertypes: AnyRef, Any

Type Members

  1. sealed trait BatchSize extends AnyRef
  2. case class BytesInBatch(batchSize: Int) extends BatchSize with Product with Serializable
  3. class CassandraSparkExtensions extends (SparkSessionExtensions) ⇒ Unit with Logging
  4. final class CassandraTableScanPairRDDFunctions[K, V] extends Serializable
  5. final class CassandraTableScanRDDFunctions[R] extends Serializable
  6. implicit final class ColumnNameFunctions extends AnyVal
  7. sealed trait ColumnSelector extends AnyRef
  8. class DatasetFunctions[K] extends Serializable

    Provides Cassandra-specific methods on org.apache.spark.sql.DataFrame

  9. class PairRDDFunctions[K, V] extends Serializable
  10. class RDDFunctions[T] extends WritableToCassandra[T] with Serializable

    Provides Cassandra-specific methods on RDD

  11. case class RowsInBatch(batchSize: Int) extends BatchSize with Product with Serializable
  12. case class SomeColumns(columns: ColumnRef*) extends ColumnSelector with Product with Serializable
  13. class SparkContextFunctions extends Serializable

    Provides Cassandra-specific methods on SparkContext
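BatchSize and ColumnSelector above are small sealed-trait hierarchies: callers pick one concrete case (e.g. RowsInBatch, SomeColumns) and the connector pattern-matches on it. As a minimal, self-contained sketch of how such an ADT is consumed (simplified stand-ins in plain Scala, not the connector's actual classes):

```scala
// Simplified stand-ins mirroring the BatchSize hierarchy listed above
// (the real classes live in com.datastax.spark.connector).
sealed trait BatchSize
case class RowsInBatch(batchSize: Int) extends BatchSize
case class BytesInBatch(batchSize: Int) extends BatchSize

// A consumer (like the connector's batch writer) pattern-matches on the ADT;
// the sealed modifier makes the match exhaustiveness-checked.
def describe(b: BatchSize): String = b match {
  case RowsInBatch(n)  => s"batches of $n rows"
  case BytesInBatch(n) => s"batches of up to $n bytes"
}

println(describe(RowsInBatch(64)))     // batches of 64 rows
println(describe(BytesInBatch(1024)))  // batches of up to 1024 bytes
```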

Value Members

  1. implicit def toCassandraTableScanFunctions[T](rdd: CassandraTableScanRDD[T]): CassandraTableScanRDDFunctions[T]
  2. implicit def toCassandraTableScanRDDPairFunctions[K, V](rdd: CassandraTableScanRDD[(K, V)]): CassandraTableScanPairRDDFunctions[K, V]
  3. implicit def toDataFrameFunctions(dataFrame: DataFrame): DatasetFunctions[Row]
  4. implicit def toDatasetFunctions[K](dataset: Dataset[K])(implicit arg0: Encoder[K]): DatasetFunctions[K]
  5. implicit def toNamedColumnRef(columnName: String): ColumnName
  6. implicit def toPairRDDFunctions[K, V](rdd: RDD[(K, V)]): PairRDDFunctions[K, V]
  7. implicit def toRDDFunctions[T](rdd: RDD[T]): RDDFunctions[T]
  8. implicit def toSparkContextFunctions(sc: SparkContext): SparkContextFunctions
  9. object AllColumns extends ColumnSelector with Product with Serializable
  10. object BatchSize
  11. object DocUtil
  12. object PartitionKeyColumns extends ColumnSelector with Product with Serializable
  13. object PrimaryKeyColumns extends ColumnSelector with Product with Serializable
  14. object SomeColumns extends Serializable
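The implicit defs above follow Scala's enrich-my-library pattern: once imported, the compiler applies them automatically, wrapping an RDD, SparkContext, or Dataset in a functions class that carries the extra Cassandra methods. A minimal sketch of the mechanism, using hypothetical toy names rather than the real Spark types:

```scala
import scala.language.implicitConversions

// Toy stand-ins: a plain class and an enrichment class adding one method.
// (Hypothetical names; the real conversions wrap SparkContext, RDD, etc.)
class Context(val appName: String)

class ContextFunctions(ctx: Context) {
  // The "extra" method the implicit conversion makes available:
  def qualifiedTable(keyspace: String, table: String): String =
    s"${ctx.appName} -> $keyspace.$table"
}

// Analogous to toSparkContextFunctions: the compiler inserts this
// conversion when qualifiedTable is called on a bare Context.
implicit def toContextFunctions(ctx: Context): ContextFunctions =
  new ContextFunctions(ctx)

val sc = new Context("example")
println(sc.qualifiedTable("test", "words"))  // example -> test.words
```

This is why the example at the top begins with `import com.datastax.spark.connector._`: without that import, methods like cassandraTable and saveToCassandra are not in scope.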
