hadoop - How to save Iterable[String] to HDFS in spark-scala -


val ordersrdd = sc.textFile("/user/cloudera/sqoop_import/orders")
val ordersrddstatus = ordersrdd.map(rec => (rec.split(",")(3), 1))
val countordersstatus = ordersrddstatus.countByKey()
val output = countordersstatus.map(input => input._1 + "\t" + input._2)

How do I save output, which is an Iterable[String], to HDFS in spark-scala?

Note: output is not an RDD (countByKey brings the counts back to the driver as a Map), so I cannot use output.saveAsTextFile("hdfs-path").

One way is to write a plain HDFS file, the same way you would in vanilla Scala or Java; this has nothing to do with Spark.
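For that first approach, here is a minimal sketch using the Hadoop FileSystem API from the spark-shell; the output path /user/cloudera/order_status_count is just a placeholder.

import java.io.PrintWriter
import org.apache.hadoop.fs.{FileSystem, Path}

// Write the strings straight to a single HDFS file, reusing the
// SparkContext's Hadoop configuration so it points at the same HDFS.
val fs = FileSystem.get(sc.hadoopConfiguration)
val out = fs.create(new Path("/user/cloudera/order_status_count/part-00000"))
val writer = new PrintWriter(out)
try {
  output.foreach(line => writer.println(line))
} finally {
  writer.close() // also closes the underlying HDFS output stream
}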

Another way is to convert output to an RDD and save that.

val output = countordersstatus.map(input => input._1 + "\t" + input._2)
sc.makeRDD(output.toList).saveAsTextFile("hdfs-path")
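sc.parallelize works just as well as makeRDD here. Since the counts are a small driver-side collection, coalescing to one partition keeps the result in a single part file; a sketch, again with a placeholder path:

sc.parallelize(output.toSeq).coalesce(1).saveAsTextFile("/user/cloudera/order_status_count")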
