Changes


Telemetry/Custom analysis with spark

191 bytes added, 19:45, 15 November 2016
FAQ: Add sc.cancelAllJobs() tip
2. The connection from PySpark to the Spark driver might be lost. Unfortunately, the only reliable way to recover from this at the moment seems to be spinning up a new cluster.
 
3. Canceling execution of a notebook cell doesn't cancel any Spark jobs that might be running in the background. If your Spark commands seem to be hanging, try running `sc.cancelAllJobs()`.
