Spark working with MySQL

With Spark 2.0.x, we can use DataFrameReader and DataFrameWriter.
Use SparkSession.read to access a DataFrameReader and Dataset.write to access a DataFrameWriter.

The examples below assume you are running them in spark-shell, with the MySQL JDBC driver (Connector/J) on the classpath (for example via the --jars option of spark-shell).

read example

// Connection properties and JDBC URL for the MySQL database.
val prop = new java.util.Properties()
prop.put("user", "username")
prop.put("password", "yourpassword")
val url = "jdbc:mysql://host:3306/db_name"

// Read the whole table into a DataFrame.
val df = spark.read.jdbc(url, "table_name", prop)
df.show()
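
For larger tables, the read can be split into parallel partitions with the jdbc overload that takes a partition column and bounds. A minimal sketch, assuming a hypothetical numeric column id whose values roughly span 1 to 100000:

// Split the read on the numeric column "id" into 4 partitions;
// each partition issues its own query for a range of id values.
val partitionedDF = spark.read.jdbc(
  url,
  "table_name",
  "id",      // partition column (assumed numeric)
  1L,        // lowerBound
  100000L,   // upperBound (assumed rough range of id)
  4,         // numPartitions
  prop)
partitionedDF.show()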

read example 2

// Read the whole table using the options-based API.
val jdbcDF = spark.read
  .format("jdbc")
  .option("url", "jdbc:mysql://host:3306/db_name")
  .option("dbtable", "schema.tablename")
  .option("user", "username")
  .option("password", "password")
  .load()
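
If Spark cannot resolve the JDBC driver class from the URL, it can be set explicitly with the driver option. A minimal sketch; the class name com.mysql.jdbc.Driver assumes MySQL Connector/J 5.x (Connector/J 8.x uses com.mysql.cj.jdbc.Driver):

val jdbcDF2 = spark.read
  .format("jdbc")
  .option("url", "jdbc:mysql://host:3306/db_name")
  .option("driver", "com.mysql.jdbc.Driver") // Connector/J 5.x driver class
  .option("dbtable", "schema.tablename")
  .option("user", "username")
  .option("password", "password")
  .load()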

read example 3

If you want to read data from a query result rather than a whole table, wrap the query in an aliased subquery and pass it as the dbtable option:

val sql = """select * from db.your_table where id > 1"""
// Wrap the query as an aliased subquery so it can be used as the dbtable option.
val jdbcDF = spark.read
  .format("jdbc")
  .option("url", "jdbc:mysql://host:3306/db_name")
  .option("dbtable", s"( $sql ) t")
  .option("user", "username")
  .option("password", "password")
  .load()
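
For simple predicates you can also read the table and filter the resulting DataFrame; Spark's JDBC source pushes basic filters down to MySQL, so only matching rows are fetched. A minimal sketch reusing the url and prop values from the first read example:

// The filter below is pushed down to MySQL as a WHERE clause.
val filteredDF = spark.read.jdbc(url, "your_table", prop).filter("id > 1")
filteredDF.show()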

write example

import org.apache.spark.sql.SaveMode

val prop = new java.util.Properties()
prop.put("user", "username")
prop.put("password", "yourpassword")
val url = "jdbc:mysql://host:3306/db_name"
// df is a DataFrame containing the data you want to write.
df.write.mode(SaveMode.Append).jdbc(url, "table_name", prop)
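
The same write can also be expressed with the options-based API; a minimal sketch assuming the same df, url, and credentials as above:

df.write
  .format("jdbc")
  .option("url", url)
  .option("dbtable", "table_name")
  .option("user", "username")
  .option("password", "yourpassword")
  .mode(SaveMode.Append) // use SaveMode.Overwrite to replace existing rows
  .save()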