java - Spark - save RDD to multiple files as output


I have a JavaRDD<Model> and need to write more than one output file, each with a different layout (one or two fields in the RDD differ between the layouts).

When I use saveAsTextFile(), it calls the toString() method of the model, which means the same layout is written to every output.

Currently I iterate over the RDD with the map transformation, return a different model with the other layout, and then use the saveAsTextFile() action to write the other output file.

Just because one or two fields differ, I have to iterate over the entire RDD again and create a new RDD for each output file (see the sketch after the example below).

For example:

Current RDD fields:

roleIndicator, name, age, address, department

Output file 1:

name, age, address

Output file 2:

roleIndicator, name, age, department
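
For illustration only, here is a rough sketch of that two-pass approach. The Model class, its fields, the sample data, and the output paths are assumptions made for this example, not taken from the actual job:

    import java.io.Serializable;
    import java.util.Arrays;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    public class TwoPassExample {

        // Assumed minimal model; the real Model class is not shown in the question.
        public static class Model implements Serializable {
            public final String roleIndicator, name, address, department;
            public final int age;

            public Model(String roleIndicator, String name, int age, String address, String department) {
                this.roleIndicator = roleIndicator;
                this.name = name;
                this.age = age;
                this.address = address;
                this.department = department;
            }
        }

        public static void main(String[] args) {
            SparkConf conf = new SparkConf().setAppName("two-pass-example").setMaster("local[*]");
            JavaSparkContext sc = new JavaSparkContext(conf);

            // Assumed sample data; in the real job the RDD comes from an existing pipeline.
            JavaRDD<Model> models = sc.parallelize(Arrays.asList(
                    new Model("MGR", "Alice", 30, "1 Main St", "Sales")));

            // First pass over the RDD: layout of output file 1 (name, age, address).
            models.map(m -> m.name + "," + m.age + "," + m.address)
                  .saveAsTextFile("output1");

            // Second pass over the same RDD: layout of output file 2
            // (roleIndicator, name, age, department).
            models.map(m -> m.roleIndicator + "," + m.name + "," + m.age + "," + m.department)
                  .saveAsTextFile("output2");

            sc.stop();
        }
    }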

Is there a more efficient solution for this?

Regards, Shankar

You want to use foreach, not collect.

You should define the function as an actual named class that extends VoidFunction. Create instance variables for both files, and add a close() method that closes them. Your call() implementation can write whatever you need.

Remember to call close() on the function object after you're done.
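
Below is a minimal sketch of that suggestion, assuming a Model class with the same public fields as in the earlier sketch; the class name and file paths are again assumptions. Note that in current Spark versions VoidFunction is an interface, so the class implements it rather than extending it, and Spark serializes the function passed to foreach, so on a real cluster each task works on its own deserialized copy of the writer; the sketch simply mirrors the structure described above.

    import java.io.FileWriter;
    import java.io.IOException;
    import org.apache.spark.api.java.function.VoidFunction;

    public class TwoLayoutWriter implements VoidFunction<Model> {

        // One handle per output layout; transient so Spark does not try to serialize them.
        private transient FileWriter file1;
        private transient FileWriter file2;

        // Lazily open both files the first time a record arrives.
        private void open() throws IOException {
            if (file1 == null) {
                file1 = new FileWriter("output1.txt", true); // name, age, address
                file2 = new FileWriter("output2.txt", true); // roleIndicator, name, age, department
            }
        }

        @Override
        public void call(Model m) throws Exception {
            open();
            // Write both layouts for the same record in a single pass over the RDD.
            file1.write(m.name + "," + m.age + "," + m.address + "\n");
            file2.write(m.roleIndicator + "," + m.name + "," + m.age + "," + m.department + "\n");
        }

        // Close both files once the iteration is finished.
        public void close() throws IOException {
            if (file1 != null) { file1.close(); file1 = null; }
            if (file2 != null) { file2.close(); file2 = null; }
        }
    }

Used roughly like this:

    TwoLayoutWriter writer = new TwoLayoutWriter();
    models.foreach(writer);   // single pass writing both layouts
    writer.close();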

