Encoder (Spark 4.0.0 JavaDoc)

All Superinterfaces:
Serializable

Used to convert a JVM object of type T to and from the internal Spark SQL representation.

==Scala== Encoders are generally created automatically through implicits from a SparkSession, or can be explicitly created by calling static methods on Encoders.


   import spark.implicits._

   val ds = Seq(1, 2, 3).toDS() // implicitly provided (spark.implicits.newIntEncoder)
 

==Java== Encoders are specified by calling static methods on Encoders.


   List<String> data = Arrays.asList("abc", "abc", "xyz");
   Dataset<String> ds = context.createDataset(data, Encoders.STRING());
 

Encoders can be composed into tuples:


   Encoder<Tuple2<Integer, String>> encoder2 = Encoders.tuple(Encoders.INT(), Encoders.STRING());
   List<Tuple2<Integer, String>> data2 = Arrays.asList(new scala.Tuple2<>(1, "a"));
   Dataset<Tuple2<Integer, String>> ds2 = context.createDataset(data2, encoder2);
 

Or constructed from Java Beans:


   Encoders.bean(MyClass.class);
 
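Encoders.bean requires the class to follow JavaBean conventions: a public no-argument constructor and matching getter/setter pairs for each field. A minimal sketch, where the Person class is a hypothetical example (not part of the Spark API):

```java
import java.io.Serializable;

// Hypothetical JavaBean with the shape Encoders.bean(Person.class) expects:
// a public no-arg constructor plus getter/setter pairs for each field.
class Person implements Serializable {
    private String name;
    private int age;

    public Person() {}  // required no-arg constructor

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public int getAge() { return age; }
    public void setAge(int age) { this.age = age; }
}

// With a SparkSession `spark` in scope, the encoder would be used like:
//   Encoder<Person> encoder = Encoders.bean(Person.class);
//   Dataset<Person> ds = spark.createDataset(
//       java.util.Collections.singletonList(person), encoder);
```

Fields whose types are themselves beans, collections, or supported primitives are mapped recursively into the Dataset's schema.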

==Implementation== - Encoders should be thread-safe.

Since:
1.6.0
