class DStreamFunctions[T] extends WritableToCassandra[T] with Serializable with Logging
Linear Supertypes: WritableToCassandra[T], Logging, Serializable, Serializable, AnyRef, Any
Instance Constructors
- new DStreamFunctions(dstream: DStream[T])
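DStreamFunctions is normally not constructed directly; it is attached to any DStream through an implicit conversion brought in by importing the connector's streaming package. A minimal sketch, assuming a local Spark setup and a hypothetical `test.words` table (the keyspace, table, host, and port are illustrative):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import com.datastax.spark.connector.streaming._ // brings DStreamFunctions into scope

// Hypothetical setup: a local streaming context reading lines from a socket.
val conf = new SparkConf()
  .setAppName("dstream-cassandra-sketch")
  .set("spark.cassandra.connection.host", "127.0.0.1")
val ssc = new StreamingContext(conf, Seconds(5))

val words = ssc.socketTextStream("localhost", 9999)
  .flatMap(_.split("\\s+"))
  .map(w => (w, 1))

// saveToCassandra is now available on the DStream via the implicit conversion.
words.saveToCassandra("test", "words")

ssc.start()
```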
Value Members
- final def !=(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def ##(): Int
  - Definition Classes: AnyRef → Any
- final def ==(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def asInstanceOf[T0]: T0
  - Definition Classes: Any
- def clone(): AnyRef
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native() @HotSpotIntrinsicCandidate()
- def conf: SparkConf
- def deleteFromCassandra(keyspaceName: String, tableName: String, deleteColumns: ColumnSelector = SomeColumns(), keyColumns: ColumnSelector = PrimaryKeyColumns, writeConf: WriteConf = ...)(implicit connector: CassandraConnector = CassandraConnector(sparkContext), rwf: RowWriterFactory[T]): Unit

  Deletes data from a Cassandra table, using data from the stream as a list of primary keys. Uses the specified column names.

  - keyspaceName: the name of the keyspace to use
  - tableName: the name of the table to use
  - deleteColumns: the list of column names to delete; an empty ColumnSelector means a full-row delete
  - keyColumns: primary key column selector (optional); all RDD primary key columns are checked by default
  - writeConf: additional configuration object allowing the consistency level, batch size, etc. to be set
  - Definition Classes: DStreamFunctions → WritableToCassandra
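A sketch of deleteFromCassandra, assuming a hypothetical table `test.words(word text PRIMARY KEY, count int)` and a stream of keys; `WordKey` is an illustrative case class matching the primary key column:

```scala
import com.datastax.spark.connector._
import com.datastax.spark.connector.streaming._
import org.apache.spark.streaming.dstream.DStream

// Hypothetical key type matching the table's primary key column.
case class WordKey(word: String)

def deleteStale(keys: DStream[WordKey]): Unit = {
  // Delete only the `count` column for each incoming primary key:
  keys.deleteFromCassandra("test", "words", SomeColumns("count"))

  // Delete entire rows (an empty ColumnSelector means a full-row delete):
  keys.deleteFromCassandra("test", "words")
}
```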
- final def eq(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- def equals(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def getClass(): Class[_]
  - Definition Classes: AnyRef → Any
  - Annotations: @native() @HotSpotIntrinsicCandidate()
- def hashCode(): Int
  - Definition Classes: AnyRef → Any
  - Annotations: @native() @HotSpotIntrinsicCandidate()
- final def isInstanceOf[T0]: Boolean
  - Definition Classes: Any
- def isTraceEnabled(): Boolean
  - Attributes: protected
  - Definition Classes: Logging
- def joinWithCassandraTable[R](keyspaceName: String, tableName: String, selectedColumns: ColumnSelector = AllColumns, joinColumns: ColumnSelector = PartitionKeyColumns)(implicit connector: CassandraConnector = CassandraConnector(sparkContext), newType: ClassTag[R], rrf: RowReaderFactory[R], ev: ValidRDDType[R], currentType: ClassTag[T], rwf: RowWriterFactory[T]): DStream[(T, R)]

  Transforms each produced RDD with com.datastax.spark.connector.RDDFunctions.joinWithCassandraTable.
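A sketch of the per-batch join against the same hypothetical `test.words` table; `WordKey` and `WordRow` are illustrative case classes:

```scala
import com.datastax.spark.connector.streaming._
import org.apache.spark.streaming.dstream.DStream

case class WordKey(word: String)              // partition key of the lookup table
case class WordRow(word: String, count: Int)  // row type read back from Cassandra

def lookUpCounts(keys: DStream[WordKey]): DStream[(WordKey, WordRow)] =
  // Each batch RDD is joined against test.words on the partition key
  // (the default joinColumns = PartitionKeyColumns):
  keys.joinWithCassandraTable[WordRow]("test", "words")
```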
All of the following Logging members share Attributes: protected and Definition Classes: Logging:

- def log: Logger
- def logDebug(msg: ⇒ String): Unit
- def logDebug(msg: ⇒ String, throwable: Throwable): Unit
- def logError(msg: ⇒ String): Unit
- def logError(msg: ⇒ String, throwable: Throwable): Unit
- def logInfo(msg: ⇒ String): Unit
- def logInfo(msg: ⇒ String, throwable: Throwable): Unit
- def logName: String
- def logTrace(msg: ⇒ String): Unit
- def logTrace(msg: ⇒ String, throwable: Throwable): Unit
- def logWarning(msg: ⇒ String): Unit
- def logWarning(msg: ⇒ String, throwable: Throwable): Unit
- final def ne(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- final def notify(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native() @HotSpotIntrinsicCandidate()
- final def notifyAll(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native() @HotSpotIntrinsicCandidate()
- def repartitionByCassandraReplica(keyspaceName: String, tableName: String, partitionsPerHost: Int = 10, partitionKeyMapper: ColumnSelector = PartitionKeyColumns)(implicit connector: CassandraConnector = CassandraConnector(conf), currentType: ClassTag[T], rwf: RowWriterFactory[T]): DStream[T]

  Transforms each produced RDD with com.datastax.spark.connector.RDDFunctions.repartitionByCassandraReplica.
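repartitionByCassandraReplica can be used to move data to the replica nodes that own it before writing. A sketch under the same hypothetical schema:

```scala
import com.datastax.spark.connector.streaming._
import org.apache.spark.streaming.dstream.DStream

case class WordCount(word: String, count: Int)

def saveLocally(counts: DStream[WordCount]): Unit = {
  // Route each element to a partition co-located with the Cassandra replica
  // owning its partition key, then write with mostly node-local I/O.
  val local = counts.repartitionByCassandraReplica("test", "words", partitionsPerHost = 10)
  local.saveToCassandra("test", "words")
}
```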
- def saveToCassandra(keyspaceName: String, tableName: String, columnNames: ColumnSelector = AllColumns, writeConf: WriteConf = WriteConf.fromSparkConf(conf))(implicit connector: CassandraConnector = CassandraConnector(conf), rwf: RowWriterFactory[T]): Unit

  Performs com.datastax.spark.connector.writer.WritableToCassandra for each produced RDD. Uses the specified column names with an additional batch size.

  - keyspaceName: the name of the keyspace to use
  - tableName: the name of the table to use
  - columnNames: the list of column names to save data to; uses only the unique column names, and at least all primary key columns must be selected. All other fields are discarded. Non-selected property/column names are left unchanged.
  - writeConf: additional configuration object allowing the consistency level, batch size, etc. to be set
  - Definition Classes: DStreamFunctions → WritableToCassandra
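A sketch of saveToCassandra with an explicit column selection and write configuration; the schema and the TTL value are illustrative:

```scala
import com.datastax.spark.connector._
import com.datastax.spark.connector.streaming._
import com.datastax.spark.connector.writer.{TTLOption, WriteConf}
import org.apache.spark.streaming.dstream.DStream

case class WordCount(word: String, count: Int)

// Assuming a hypothetical table test.words(word text PRIMARY KEY, count int).
def persist(counts: DStream[WordCount]): Unit =
  counts.saveToCassandra(
    "test", "words",
    SomeColumns("word", "count"),                         // must include all primary key columns
    writeConf = WriteConf(ttl = TTLOption.constant(3600)) // rows expire after one hour
  )
```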
- def sparkContext: SparkContext
  - Definition Classes: DStreamFunctions → WritableToCassandra
- final def synchronized[T0](arg0: ⇒ T0): T0
  - Definition Classes: AnyRef
- def toString(): String
  - Definition Classes: AnyRef → Any
- final def wait(arg0: Long, arg1: Int): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native()
- final def wait(): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- def warnIfKeepAliveIsShort(): Unit
Deprecated Value Members
- def finalize(): Unit
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( classOf[java.lang.Throwable] ) @Deprecated @deprecated
  - Deprecated: (Since version ) see corresponding Javadoc for more information.