This repository was archived by the owner on Jan 9, 2020. It is now read-only.

Make Spark launch new executor pods when executor pods fail unexpectedly during the job run #136

@varunkatta

Description


This is the desired behavior to improve Spark's fault tolerance on k8s without affecting performance, and it is in line with what Spark on YARN already provides. This is probably a longer-term goal and a roadmap item to be taken into consideration for future design and release planning.
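To make the requested behavior concrete, a minimal sketch of the recovery logic follows: a reconciliation step that compares the desired executor count against the executor pods still alive and returns how many replacement pods to launch. All names here are hypothetical illustrations, not part of Spark's actual Kubernetes scheduler backend; the pod phases match the standard Kubernetes pod lifecycle.

```python
def executors_to_relaunch(desired_count, executor_phases):
    """Return how many replacement executor pods should be launched.

    desired_count: number of executors requested for the job.
    executor_phases: mapping of executor pod name -> Kubernetes pod phase
                     ("Running", "Pending", "Failed", "Succeeded", ...).
    """
    # Running and Pending pods still count toward the target;
    # pods in a terminal phase (e.g. "Failed") must be replaced.
    alive = sum(
        1 for phase in executor_phases.values()
        if phase in ("Running", "Pending")
    )
    return max(0, desired_count - alive)


if __name__ == "__main__":
    phases = {"exec-1": "Running", "exec-2": "Failed", "exec-3": "Pending"}
    # exec-2 failed unexpectedly, so one replacement pod is needed.
    print(executors_to_relaunch(3, phases))  # → 1
```

In a real driver this function would be driven by a watch on executor pods (comparable to what YARN's allocator does with container-status updates), with the result fed back into pod-creation requests.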
