by Diego Calvo | Nov 23, 2017 | Apache Spark, Python-example | 0 comments
Displays an example of the mapPartitions function in Spark, which applies a function once per partition rather than once per element.
# Assumes a running PySpark shell, where `sc` is the SparkContext.
def my_func(iterator):
    yield sum(iterator)

nums = range(1, 10)                 # the numbers 1..9
parallel = sc.parallelize(nums, 5)  # distribute them over 5 partitions
parallel.mapPartitions(my_func).collect()
[1, 5, 9, 13, 17]
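To see where that output comes from, here is a minimal sketch in plain Python (no cluster needed): the `partition` helper is hypothetical, written only to mimic how Spark slices a local sequence into evenly sized chunks, and `my_func` is then applied once per chunk, just as mapPartitions applies it once per partition.

```python
def partition(data, num_partitions):
    # Mimic Spark's slicing of a local sequence into num_partitions chunks
    # (hypothetical helper, for illustration only).
    n = len(data)
    return [
        data[i * n // num_partitions:(i + 1) * n // num_partitions]
        for i in range(num_partitions)
    ]

def my_func(iterator):
    yield sum(iterator)

# 9 elements over 5 partitions -> chunks [1], [2,3], [4,5], [6,7], [8,9]
parts = partition(list(range(1, 10)), 5)
result = [x for part in parts for x in my_func(iter(part))]
print(result)  # [1, 5, 9, 13, 17]
```

Each value in the result is the sum of one partition, which is why there are 5 numbers instead of 9.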