
I got the following from an API call for monitoring Apache Spark:

"diagnostic_details": {
        "data": {
            "stages":[...],
"executors":{"isDynamicAllocationOn": true,
                "executorInstances": 16,
                "startTime": 1717687393983,
                "endTime": 1717687759107,
                "events":[...],
"processedEvents":[...],
                "executorCores": "8",
                "executorMemory": "56g",
                "efficiency": 22.01,
                "sampleTime": 1717687759107
            }
        }
    }
}
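
For reference, this is roughly how I'm reading the fields out of the response on my side (a minimal sketch; `response_body` is just a trimmed placeholder for the JSON above, not the actual API client call):

    import json

    # Placeholder for the JSON body shown above, trimmed to the fields I care about
    response_body = """
    {
      "diagnostic_details": {
        "data": {
          "executors": {
            "isDynamicAllocationOn": true,
            "executorInstances": 16,
            "executorCores": "8",
            "executorMemory": "56g",
            "efficiency": 22.01,
            "startTime": 1717687393983,
            "endTime": 1717687759107
          }
        }
      }
    }
    """

    executors = json.loads(response_body)["diagnostic_details"]["data"]["executors"]
    print(executors["executorInstances"])  # 16 -- matches the dynamic-allocation max, not the 4 executors I see running
    print(executors["efficiency"])         # 22.01 -- the value I'm unsure how to interpret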

The job is currently using 4 executors (including a driver executor), each with 8 cores and 56 GB of memory. We also have dynamic allocation enabled, with a minimum of 3 executors and a maximum of 16. Does the efficiency here mean the job is only using 22.01% of the 4 executors? A rough sketch of my guess follows below.
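
This is only my assumption about how the percentage might be derived (the docs don't confirm it): efficiency as used core-time divided by allocated core-time over the run, computed against the 4 executors I see running rather than the max of 16.

    # My guess at the arithmetic (assumption only):
    # efficiency = busy core-seconds / allocated core-seconds
    start_time_ms = 1717687393983
    end_time_ms = 1717687759107
    wall_clock_s = (end_time_ms - start_time_ms) / 1000.0   # ~365 s

    executors = 4           # the 4 executors (including the driver) the job is using
    cores_per_executor = 8
    allocated_core_seconds = executors * cores_per_executor * wall_clock_s

    efficiency = 22.01 / 100.0
    implied_busy_core_seconds = efficiency * allocated_core_seconds
    print(f"allocated: {allocated_core_seconds:.0f} core-s, "
          f"implied busy: {implied_busy_core_seconds:.0f} core-s")

Is that the right way to read it, or is the efficiency measured against the 16 instances reported in `executorInstances`?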
