Test_openjdk17_hs_extended.perf_aarch64_linux
Started by upstream project "build-scripts/jobs/release/jobs/jdk17u/jdk17u-release-linux-aarch64-temurin" build number 32
originally caused by:
Started by upstream project "build-scripts/release-openjdk17-pipeline" build number 88
originally caused by:
Started by upstream project "build-scripts/utils/releaseTrigger_jdk17u" build number 6821
originally caused by:
Started by user Andrew Leonard
Checking out git ${ADOPTOPENJDK_REPO} into /home/jenkins/.jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux@script/7d272c0688f17ab4e5b2f6ce77a7dc9cf4df33ff05c3a95eddd38682ef795b79 to read aqa-tests/buildenv/jenkins/openjdk_tests
The recommended git tool is: git
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Using shallow clone with depth 1
Cloning repository https://github.com/adoptium/aqa-tests.git
> git init /home/jenkins/.jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux@script/7d272c0688f17ab4e5b2f6ce77a7dc9cf4df33ff05c3a95eddd38682ef795b79/aqa-tests # timeout=10
Fetching upstream changes from https://github.com/adoptium/aqa-tests.git
> git --version # timeout=10
> git --version # 'git version 2.43.0'
> git fetch --tags --force --progress --depth=1 -- https://github.com/adoptium/aqa-tests.git +refs/heads/*:refs/remotes/origin/* # timeout=60
> git config remote.origin.url https://github.com/adoptium/aqa-tests.git # timeout=10
> git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
Avoid second fetch
> git rev-parse origin/v1.0.7-release^{commit} # timeout=10
Checking out Revision 5a57a73acca15191cbf8bba592670458bc380962 (origin/v1.0.7-release)
> git config core.sparsecheckout # timeout=10
> git checkout -f 5a57a73acca15191cbf8bba592670458bc380962 # timeout=10
Commit message: "Update related repo and branch (#6173)"
First time build. Skipping changelog.
[Pipeline] Start of Pipeline
[Pipeline] timestamps
[Pipeline] {
[Pipeline] nodesByLabel
[2025-04-09T20:36:50.925Z] Found a total of 2 nodes with the 'sw.os.linux&&hw.arch.aarch64&&ci.role.perf' label
[Pipeline] echo
[2025-04-09T20:36:50.938Z] SPEC: linux_aarch64
[Pipeline] echo
[2025-04-09T20:36:50.939Z] LABEL: sw.os.linux&&hw.arch.aarch64&&ci.role.perf
[Pipeline] stage
[Pipeline] { (Queue)
[Pipeline] nodesByLabel
[2025-04-09T20:36:50.954Z] Found a total of 2 nodes with the 'sw.os.linux&&hw.arch.aarch64&&ci.role.perf' label
[Pipeline] echo
[2025-04-09T20:36:50.962Z] dynamicAgents: []
[Pipeline] node
[2025-04-09T20:37:05.967Z] Still waiting to schedule task
[2025-04-09T20:37:05.969Z] Waiting for next available executor on ‘sw.os.linux&&hw.arch.aarch64&&ci.role.perf’
[2025-04-09T21:24:24.752Z] Running on test-osuosl-ubuntu2204-aarch64-1 in /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux
[Pipeline] {
[Pipeline] retry
[Pipeline] {
[Pipeline] timeout
[2025-04-09T21:24:24.785Z] Timeout set to expire in 1 hr 0 min
[Pipeline] {
[Pipeline] cleanWs
[2025-04-09T21:24:25.219Z] [WS-CLEANUP] Deleting project workspace...
[2025-04-09T21:24:25.219Z] [WS-CLEANUP] Deferred wipeout is disabled by the job configuration...
[2025-04-09T21:24:25.399Z] [WS-CLEANUP] done
[Pipeline] }
[Pipeline] // timeout
[Pipeline] checkout
[2025-04-09T21:24:25.613Z] The recommended git tool is: git
[2025-04-09T21:24:25.970Z] No credentials specified
[2025-04-09T21:24:26.148Z] Cloning the remote Git repository
[2025-04-09T21:24:26.411Z] Cloning repository https://github.com/adoptium/aqa-tests.git
[2025-04-09T21:24:26.412Z] > git init /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests # timeout=10
[2025-04-09T21:24:26.417Z] [WARNING] Reference path does not exist: /home/jenkins/openjdk_cache
[2025-04-09T21:24:26.417Z] Fetching upstream changes from https://github.com/adoptium/aqa-tests.git
[2025-04-09T21:24:26.417Z] > git --version # timeout=10
[2025-04-09T21:24:26.419Z] > git --version # 'git version 2.34.1'
[2025-04-09T21:24:26.420Z] > git fetch --tags --force --progress -- https://github.com/adoptium/aqa-tests.git +refs/heads/*:refs/remotes/origin/* # timeout=10
[2025-04-09T21:24:30.012Z] Avoid second fetch
[2025-04-09T21:24:29.718Z] > git config remote.origin.url https://github.com/adoptium/aqa-tests.git # timeout=10
[2025-04-09T21:24:29.724Z] > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
[2025-04-09T21:24:30.197Z] Checking out Revision 5a57a73acca15191cbf8bba592670458bc380962 (origin/v1.0.7-release)
[2025-04-09T21:24:30.097Z] > git rev-parse origin/v1.0.7-release^{commit} # timeout=10
[2025-04-09T21:24:30.283Z] > git config core.sparsecheckout # timeout=10
[2025-04-09T21:24:30.289Z] > git checkout -f 5a57a73acca15191cbf8bba592670458bc380962 # timeout=10
[2025-04-09T21:24:31.368Z] Commit message: "Update related repo and branch (#6173)"
[Pipeline] }
[Pipeline] // retry
[Pipeline] load
[Pipeline] { (/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/buildenv/jenkins/JenkinsfileBase)
[Pipeline] }
[Pipeline] // load
[Pipeline] timeout
[2025-04-09T21:24:33.065Z] Timeout set to expire in 1 day 1 hr
[Pipeline] {
[Pipeline] echo
[2025-04-09T21:24:33.091Z] runMulticastCmd: false
[Pipeline] stage
[Pipeline] { (Setup)
[Pipeline] sh
[2025-04-09T21:24:35.017Z] + LC_TIME=C date +%a, %d %b %Y %T %z
[Pipeline] echo
[2025-04-09T21:24:35.749Z] PROCESSCATCH: Terminating any hung/left over test processes:
[Pipeline] sh
[2025-04-09T21:24:37.611Z] + aqa-tests/terminateTestProcesses.sh jenkins
[2025-04-09T21:24:37.612Z] Unix type machine..
[2025-04-09T21:24:37.612Z] Running on a Linux host
[2025-04-09T21:24:37.612Z] Woohoo - no rogue processes detected!
[Pipeline] sh
[2025-04-09T21:24:40.532Z] + printenv
[2025-04-09T21:24:40.532Z] JENKINS_HOME=/home/jenkins/.jenkins
[2025-04-09T21:24:40.532Z] OPENJ9_REPO=https://github.com/eclipse-openj9/openj9.git
[2025-04-09T21:24:40.532Z] SETUP_JCK_RUN=false
[2025-04-09T21:24:40.532Z] USE_TESTENV_PROPERTIES=true
[2025-04-09T21:24:40.532Z] EXTERNAL_CUSTOM_BRANCH=master
[2025-04-09T21:24:40.532Z] SSH_CLIENT=78.47.239.97 59338 22
[2025-04-09T21:24:40.533Z] BUILD_LIST=perf
[2025-04-09T21:24:40.533Z] USER=jenkins
[2025-04-09T21:24:40.533Z] SDK_RESOURCE=upstream
[2025-04-09T21:24:40.533Z] OPENJ9_BRANCH=master
[2025-04-09T21:24:40.533Z] CI=true
[2025-04-09T21:24:40.533Z] RUN_CHANGES_DISPLAY_URL=https://ci.adoptium.net/job/Test_openjdk17_hs_extended.perf_aarch64_linux/220/display/redirect?page=changes
[2025-04-09T21:24:40.533Z] ADOPTOPENJDK_REPO=https://github.com/adoptium/aqa-tests.git
[2025-04-09T21:24:40.533Z] UPSTREAM_JOB_NUMBER=32
[2025-04-09T21:24:40.533Z] XDG_SESSION_TYPE=tty
[2025-04-09T21:24:40.533Z] EXIT_FAILURE=false
[2025-04-09T21:24:40.533Z] SHLVL=0
[2025-04-09T21:24:40.533Z] HUDSON_URL=https://ci.adoptium.net/
[2025-04-09T21:24:40.533Z] TARGET=extended.perf
[2025-04-09T21:24:40.533Z] NODE_LABELS=aarch64 armv8 hw.arch.aarch64 dockerBuildX ubuntu buildX linux armv8.2 sw.os.linux test-osuosl-ubuntu2204-aarch64-1 ci.role.perf
[2025-04-09T21:24:40.533Z] MOTD_SHOWN=pam
[2025-04-09T21:24:40.533Z] USE_JRE=false
[2025-04-09T21:24:40.533Z] STF_OWNER_BRANCH=adoptium:master
[2025-04-09T21:24:40.533Z] OLDPWD=/home/jenkins
[2025-04-09T21:24:40.533Z] HOME=/home/jenkins
[2025-04-09T21:24:40.533Z] BUILD_URL=https://ci.adoptium.net/job/Test_openjdk17_hs_extended.perf_aarch64_linux/220/
[2025-04-09T21:24:40.533Z] ADOPTOPENJDK_BRANCH=v1.0.7-release
[2025-04-09T21:24:40.533Z] TAP_NAME=Test_openjdk17_hs_extended.perf_aarch64_linux.tap
[2025-04-09T21:24:40.533Z] JDK_IMPL=hotspot
[2025-04-09T21:24:40.533Z] DOCKER_REQUIRED=false
[2025-04-09T21:24:40.533Z] JENKINS_SERVER_COOKIE=durable-ba64336416a204acc64b35d67d9880e17b495ea6f4e408bdc755f4a8a6b89a2a
[2025-04-09T21:24:40.533Z] HUDSON_COOKIE=0f394e5f-bfdc-45cf-9ca4-0f8ddb3d790e
[2025-04-09T21:24:40.533Z] DBUS_SESSION_BUS_ADDRESS=unix:path=/run/user/1002/bus
[2025-04-09T21:24:40.533Z] PERF_ROOT=/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../benchmarks
[2025-04-09T21:24:40.533Z] JDK_REPO=https://github.com/adoptium/jdk17u
[2025-04-09T21:24:40.533Z] WORKSPACE=/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux
[2025-04-09T21:24:40.533Z] KEEP_WORKSPACE=false
[2025-04-09T21:24:40.533Z] ARCHIVE_TEST_RESULTS=false
[2025-04-09T21:24:40.533Z] GENERATE_JOBS=false
[2025-04-09T21:24:40.533Z] PERSONAL_BUILD=false
[2025-04-09T21:24:40.533Z] NODE_NAME=test-osuosl-ubuntu2204-aarch64-1
[2025-04-09T21:24:40.533Z] JDK_BRANCH=jdk-17.0.15+5_adopt
[2025-04-09T21:24:40.533Z] TKG_OWNER_BRANCH=adoptium:master
[2025-04-09T21:24:40.533Z] LOGNAME=jenkins
[2025-04-09T21:24:40.533Z] _=/usr/lib/jvm/jdk17/bin/java
[2025-04-09T21:24:40.533Z] RUN_ARTIFACTS_DISPLAY_URL=https://ci.adoptium.net/job/Test_openjdk17_hs_extended.perf_aarch64_linux/220/display/redirect?page=artifacts
[2025-04-09T21:24:40.533Z] STAGE_NAME=Setup
[2025-04-09T21:24:40.533Z] XDG_SESSION_CLASS=user
[2025-04-09T21:24:40.533Z] AUTO_DETECT=true
[2025-04-09T21:24:40.533Z] EXECUTOR_NUMBER=0
[2025-04-09T21:24:40.533Z] JDK_VERSION=17
[2025-04-09T21:24:40.533Z] XDG_SESSION_ID=22327
[2025-04-09T21:24:40.533Z] BUILD_DISPLAY_NAME=#220
[2025-04-09T21:24:40.533Z] CUSTOMIZED_SDK_URL_CREDENTIAL_ID=eclipse_temurin_bot_email_and_token
[2025-04-09T21:24:40.533Z] TIME_LIMIT=25
[2025-04-09T21:24:40.533Z] EXTERNAL_TEST_CMD=mvn clean install
[2025-04-09T21:24:40.533Z] RUN_TESTS_DISPLAY_URL=https://ci.adoptium.net/job/Test_openjdk17_hs_extended.perf_aarch64_linux/220/display/redirect?page=tests
[2025-04-09T21:24:40.533Z] HUDSON_HOME=/home/jenkins/.jenkins
[2025-04-09T21:24:40.533Z] JOB_BASE_NAME=Test_openjdk17_hs_extended.perf_aarch64_linux
[2025-04-09T21:24:40.533Z] PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
[2025-04-09T21:24:40.533Z] PLATFORM=aarch64_linux
[2025-04-09T21:24:40.533Z] TKG_ITERATIONS=1
[2025-04-09T21:24:40.533Z] EXIT_SUCCESS=false
[2025-04-09T21:24:40.533Z] RERUN_FAILURE=true
[2025-04-09T21:24:40.533Z] BUILD_ID=220
[2025-04-09T21:24:40.533Z] XDG_RUNTIME_DIR=/run/user/1002
[2025-04-09T21:24:40.533Z] ACTIVE_NODE_TIMEOUT=5
[2025-04-09T21:24:40.533Z] SYSTEM_LIB_DIR=/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/system_lib
[2025-04-09T21:24:40.533Z] BUILD_TAG=jenkins-Test_openjdk17_hs_extended.perf_aarch64_linux-220
[2025-04-09T21:24:40.533Z] JENKINS_URL=https://ci.adoptium.net/
[2025-04-09T21:24:40.533Z] OPENJCEPLUS_GIT_REPO=https://github.com/ibmruntimes/OpenJCEPlus.git
[2025-04-09T21:24:40.533Z] LANG=en_US.UTF-8
[2025-04-09T21:24:40.533Z] JCK_GIT_BRANCH=master
[2025-04-09T21:24:40.533Z] JOB_URL=https://ci.adoptium.net/job/Test_openjdk17_hs_extended.perf_aarch64_linux/
[2025-04-09T21:24:40.533Z] KEEP_REPORTDIR=false
[2025-04-09T21:24:40.533Z] ORIGIN_JDK_VERSION=17
[2025-04-09T21:24:40.533Z] BUILD_NUMBER=220
[2025-04-09T21:24:40.533Z] JENKINS_NODE_COOKIE=9d45a140-89b8-45af-8e10-76e6942d8a4e
[2025-04-09T21:24:40.533Z] SHELL=/bin/bash
[2025-04-09T21:24:40.533Z] ITERATIONS=1
[2025-04-09T21:24:40.533Z] OPENJCEPLUS_GIT_BRANCH=semeru-java17
[2025-04-09T21:24:40.533Z] TEST_JDK_HOME=/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image
[2025-04-09T21:24:40.533Z] RUN_DISPLAY_URL=https://ci.adoptium.net/job/Test_openjdk17_hs_extended.perf_aarch64_linux/220/display/redirect
[2025-04-09T21:24:40.533Z] LIGHT_WEIGHT_CHECKOUT=false
[2025-04-09T21:24:40.533Z] OPENJ9_SYSTEMTEST_OWNER_BRANCH=eclipse:master
[2025-04-09T21:24:40.533Z] SPEC=linux_aarch64
[2025-04-09T21:24:40.533Z] HUDSON_SERVER_COOKIE=2d832652af5afba8
[2025-04-09T21:24:40.533Z] JOB_DISPLAY_URL=https://ci.adoptium.net/job/Test_openjdk17_hs_extended.perf_aarch64_linux/display/redirect
[2025-04-09T21:24:40.533Z] NUM_PROCESSORS=8
[2025-04-09T21:24:40.533Z] UPSTREAM_JOB_NAME=build-scripts/jobs/release/jobs/jdk17u/jdk17u-release-linux-aarch64-temurin
[2025-04-09T21:24:40.533Z] ADOPTOPENJDK_SYSTEMTEST_OWNER_BRANCH=adoptium:master
[2025-04-09T21:24:40.533Z] JOB_NAME=Test_openjdk17_hs_extended.perf_aarch64_linux
[2025-04-09T21:24:40.533Z] SLACK_CHANNEL=aqavit-bot
[2025-04-09T21:24:40.533Z] PWD=/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux
[2025-04-09T21:24:40.533Z] LIB_DIR=/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib
[2025-04-09T21:24:40.533Z] SSH_CONNECTION=78.47.239.97 59338 140.211.169.57 22
[2025-04-09T21:24:40.533Z] TEST_TIME=120
[2025-04-09T21:24:40.533Z] WORKSPACE_TMP=/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux@tmp
[2025-04-09T21:24:40.533Z] PARALLEL=Dynamic
[2025-04-09T21:24:40.533Z] JOBSTARTTIME=Wed, 09 Apr 2025 21:24:33 +0000
[2025-04-09T21:24:40.533Z] DYNAMIC_COMPILE=false
[2025-04-09T21:24:40.533Z] RERUN_ITERATIONS=1
[Pipeline] timeout
[2025-04-09T21:24:41.070Z] Timeout set to expire in 1 hr 0 min
[Pipeline] {
[Pipeline] dir
[2025-04-09T21:24:41.085Z] Running in /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary
[Pipeline] {
[Pipeline] copyArtifacts
[2025-04-09T21:26:10.017Z] Copied 7 artifacts from "build-scripts » jobs » release » jobs » jdk17u » jdk17u-release-linux-aarch64-temurin" build number 32
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // timeout
[Pipeline] echo
[2025-04-09T21:26:10.174Z] IS_SVT_TESTREPO is set to false
[Pipeline] dir
[2025-04-09T21:26:10.180Z] Running in /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux
[Pipeline] {
[Pipeline] sshagent
[2025-04-09T21:26:10.285Z] $ ssh-agent
[2025-04-09T21:26:11.096Z] SSH_AUTH_SOCK=/tmp/ssh-XXXXXXwnPftO/agent.1758102
[2025-04-09T21:26:11.096Z] SSH_AGENT_PID=1758105
[2025-04-09T21:26:11.096Z] [ssh-agent] Started.
[Pipeline] {
[Pipeline] dir
[2025-04-09T21:26:11.121Z] Running in /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests
[Pipeline] {
[Pipeline] withCredentials
[2025-04-09T21:26:11.152Z] Masking supported pattern matches of $USERNAME or $PASSWORD
[Pipeline] {
[Pipeline] sh
[2025-04-09T21:26:13.033Z] + pwd
[2025-04-09T21:26:13.033Z] + ./get.sh -s /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/.. -p aarch64_linux -r upstream -j 17 -i hotspot --clone_openj9 false
[2025-04-09T21:26:13.033Z] TESTDIR: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests
[2025-04-09T21:26:13.033Z] load ./testenv/testenv.properties
[2025-04-09T21:26:13.033Z] Running checkTags with ./testenv/testenv.properties and 17
[2025-04-09T21:26:13.033Z] 9fa9b95fce0d913160f64df8e6f94209a91b3567 refs/tags/jdk-17.0.15-dryrun-ga
[2025-04-09T21:26:13.034Z] Use tag name jdk-17.0.15-dryrun-ga from ./testenv/testenv.properties
[2025-04-09T21:26:13.034Z] get jdk binary...
[2025-04-09T21:26:13.034Z] --sdkdir is set to upstream. Therefore, skip download jdk binary
[2025-04-09T21:26:13.034Z] Uncompressing file: OpenJDK17U-jdk_aarch64_linux_hotspot_17.0.15_5.tar.gz ...
[2025-04-09T21:26:15.002Z] List files in jdkbinary folder...
[2025-04-09T21:26:15.962Z] total 487528
[2025-04-09T21:26:15.963Z] -rw-r--r-- 1 jenkins jenkins 158616031 Apr 9 18:55 OpenJDK17U-debugimage_aarch64_linux_hotspot_17.0.15_5.tar.gz
[2025-04-09T21:26:15.963Z] -rw-r--r-- 1 jenkins jenkins 190740818 Apr 9 18:55 OpenJDK17U-jdk_aarch64_linux_hotspot_17.0.15_5.tar.gz
[2025-04-09T21:26:15.963Z] -rw-r--r-- 1 jenkins jenkins 45478300 Apr 9 18:55 OpenJDK17U-jre_aarch64_linux_hotspot_17.0.15_5.tar.gz
[2025-04-09T21:26:15.963Z] -rw-r--r-- 1 jenkins jenkins 158246 Apr 9 19:05 OpenJDK17U-sbom_aarch64_linux_hotspot_17.0.15_5.json
[2025-04-09T21:26:15.963Z] -rw-r--r-- 1 jenkins jenkins 29285 Apr 9 19:02 OpenJDK17U-sbom_aarch64_linux_hotspot_17.0.15_5-metadata.json
[2025-04-09T21:26:15.963Z] -rw-r--r-- 1 jenkins jenkins 3660154 Apr 9 18:55 OpenJDK17U-static-libs-glibc_aarch64_linux_hotspot_17.0.15_5.tar.gz
[2025-04-09T21:26:15.963Z] -rw-r--r-- 1 jenkins jenkins 100520096 Apr 9 18:55 OpenJDK17U-testimage_aarch64_linux_hotspot_17.0.15_5.tar.gz
[2025-04-09T21:26:15.963Z] drwxrwxr-x 3 jenkins jenkins 4096 Apr 9 21:26 tmp
[2025-04-09T21:26:15.963Z] List files in jdkbinary/tmp folder...
[2025-04-09T21:26:15.963Z] total 4
[2025-04-09T21:26:15.963Z] drwxr-xr-x 9 jenkins jenkins 4096 Apr 9 18:55 jdk-17.0.15+5
[2025-04-09T21:26:15.963Z] Moving directory jdk-17.0.15+5/ to ../j2sdk-image
[2025-04-09T21:26:15.963Z] Uncompressing file: OpenJDK17U-jre_aarch64_linux_hotspot_17.0.15_5.tar.gz ...
[2025-04-09T21:26:16.922Z] List files in jdkbinary folder...
[2025-04-09T21:26:16.922Z] total 487532
[2025-04-09T21:26:16.922Z] drwxr-xr-x 9 jenkins jenkins 4096 Apr 9 18:55 j2sdk-image
[2025-04-09T21:26:16.922Z] -rw-r--r-- 1 jenkins jenkins 158616031 Apr 9 18:55 OpenJDK17U-debugimage_aarch64_linux_hotspot_17.0.15_5.tar.gz
[2025-04-09T21:26:16.923Z] -rw-r--r-- 1 jenkins jenkins 190740818 Apr 9 18:55 OpenJDK17U-jdk_aarch64_linux_hotspot_17.0.15_5.tar.gz
[2025-04-09T21:26:16.923Z] -rw-r--r-- 1 jenkins jenkins 45478300 Apr 9 18:55 OpenJDK17U-jre_aarch64_linux_hotspot_17.0.15_5.tar.gz
[2025-04-09T21:26:16.923Z] -rw-r--r-- 1 jenkins jenkins 158246 Apr 9 19:05 OpenJDK17U-sbom_aarch64_linux_hotspot_17.0.15_5.json
[2025-04-09T21:26:16.923Z] -rw-r--r-- 1 jenkins jenkins 29285 Apr 9 19:02 OpenJDK17U-sbom_aarch64_linux_hotspot_17.0.15_5-metadata.json
[2025-04-09T21:26:16.923Z] -rw-r--r-- 1 jenkins jenkins 3660154 Apr 9 18:55 OpenJDK17U-static-libs-glibc_aarch64_linux_hotspot_17.0.15_5.tar.gz
[2025-04-09T21:26:16.923Z] -rw-r--r-- 1 jenkins jenkins 100520096 Apr 9 18:55 OpenJDK17U-testimage_aarch64_linux_hotspot_17.0.15_5.tar.gz
[2025-04-09T21:26:16.923Z] drwxrwxr-x 3 jenkins jenkins 4096 Apr 9 21:26 tmp
[2025-04-09T21:26:16.923Z] List files in jdkbinary/tmp folder...
[2025-04-09T21:26:16.923Z] total 4
[2025-04-09T21:26:16.923Z] drwxr-xr-x 6 jenkins jenkins 4096 Apr 9 18:55 jdk-17.0.15+5-jre
[2025-04-09T21:26:16.923Z] Moving directory jdk-17.0.15+5-jre/ to ../j2re-image
[2025-04-09T21:26:16.923Z] Uncompressing file: OpenJDK17U-static-libs-glibc_aarch64_linux_hotspot_17.0.15_5.tar.gz ...
[2025-04-09T21:26:16.923Z] List files in jdkbinary folder...
[2025-04-09T21:26:16.923Z] total 487536
[2025-04-09T21:26:16.923Z] drwxr-xr-x 6 jenkins jenkins 4096 Apr 9 18:55 j2re-image
[2025-04-09T21:26:16.923Z] drwxr-xr-x 9 jenkins jenkins 4096 Apr 9 18:55 j2sdk-image
[2025-04-09T21:26:16.923Z] -rw-r--r-- 1 jenkins jenkins 158616031 Apr 9 18:55 OpenJDK17U-debugimage_aarch64_linux_hotspot_17.0.15_5.tar.gz
[2025-04-09T21:26:16.923Z] -rw-r--r-- 1 jenkins jenkins 190740818 Apr 9 18:55 OpenJDK17U-jdk_aarch64_linux_hotspot_17.0.15_5.tar.gz
[2025-04-09T21:26:16.923Z] -rw-r--r-- 1 jenkins jenkins 45478300 Apr 9 18:55 OpenJDK17U-jre_aarch64_linux_hotspot_17.0.15_5.tar.gz
[2025-04-09T21:26:16.923Z] -rw-r--r-- 1 jenkins jenkins 158246 Apr 9 19:05 OpenJDK17U-sbom_aarch64_linux_hotspot_17.0.15_5.json
[2025-04-09T21:26:16.923Z] -rw-r--r-- 1 jenkins jenkins 29285 Apr 9 19:02 OpenJDK17U-sbom_aarch64_linux_hotspot_17.0.15_5-metadata.json
[2025-04-09T21:26:16.923Z] -rw-r--r-- 1 jenkins jenkins 3660154 Apr 9 18:55 OpenJDK17U-static-libs-glibc_aarch64_linux_hotspot_17.0.15_5.tar.gz
[2025-04-09T21:26:16.923Z] -rw-r--r-- 1 jenkins jenkins 100520096 Apr 9 18:55 OpenJDK17U-testimage_aarch64_linux_hotspot_17.0.15_5.tar.gz
[2025-04-09T21:26:16.923Z] drwxrwxr-x 3 jenkins jenkins 4096 Apr 9 21:26 tmp
[2025-04-09T21:26:16.923Z] List files in jdkbinary/tmp folder...
[2025-04-09T21:26:16.924Z] total 4
[2025-04-09T21:26:16.924Z] drwxr-xr-x 3 jenkins jenkins 4096 Apr 9 18:55 jdk-17.0.15+5-static-libs
[2025-04-09T21:26:16.924Z] Moving directory jdk-17.0.15+5-static-libs/ to ../static-libs
[2025-04-09T21:26:16.924Z] Uncompressing file: OpenJDK17U-testimage_aarch64_linux_hotspot_17.0.15_5.tar.gz ...
[2025-04-09T21:26:18.892Z] List files in jdkbinary folder...
[2025-04-09T21:26:18.892Z] total 487540
[2025-04-09T21:26:18.892Z] drwxr-xr-x 6 jenkins jenkins 4096 Apr 9 18:55 j2re-image
[2025-04-09T21:26:18.892Z] drwxr-xr-x 9 jenkins jenkins 4096 Apr 9 18:55 j2sdk-image
[2025-04-09T21:26:18.892Z] -rw-r--r-- 1 jenkins jenkins 158616031 Apr 9 18:55 OpenJDK17U-debugimage_aarch64_linux_hotspot_17.0.15_5.tar.gz
[2025-04-09T21:26:18.892Z] -rw-r--r-- 1 jenkins jenkins 190740818 Apr 9 18:55 OpenJDK17U-jdk_aarch64_linux_hotspot_17.0.15_5.tar.gz
[2025-04-09T21:26:18.892Z] -rw-r--r-- 1 jenkins jenkins 45478300 Apr 9 18:55 OpenJDK17U-jre_aarch64_linux_hotspot_17.0.15_5.tar.gz
[2025-04-09T21:26:18.892Z] -rw-r--r-- 1 jenkins jenkins 158246 Apr 9 19:05 OpenJDK17U-sbom_aarch64_linux_hotspot_17.0.15_5.json
[2025-04-09T21:26:18.892Z] -rw-r--r-- 1 jenkins jenkins 29285 Apr 9 19:02 OpenJDK17U-sbom_aarch64_linux_hotspot_17.0.15_5-metadata.json
[2025-04-09T21:26:18.892Z] -rw-r--r-- 1 jenkins jenkins 3660154 Apr 9 18:55 OpenJDK17U-static-libs-glibc_aarch64_linux_hotspot_17.0.15_5.tar.gz
[2025-04-09T21:26:18.892Z] -rw-r--r-- 1 jenkins jenkins 100520096 Apr 9 18:55 OpenJDK17U-testimage_aarch64_linux_hotspot_17.0.15_5.tar.gz
[2025-04-09T21:26:18.892Z] drwxr-xr-x 3 jenkins jenkins 4096 Apr 9 18:55 static-libs
[2025-04-09T21:26:18.892Z] drwxrwxr-x 3 jenkins jenkins 4096 Apr 9 21:26 tmp
[2025-04-09T21:26:18.892Z] List files in jdkbinary/tmp folder...
[2025-04-09T21:26:18.892Z] total 4
[2025-04-09T21:26:18.892Z] drwxr-xr-x 5 jenkins jenkins 4096 Apr 9 18:42 jdk-17.0.15+5-test-image
[2025-04-09T21:26:18.892Z] Moving directory jdk-17.0.15+5-test-image/ to ../openjdk-test-image
[2025-04-09T21:26:18.892Z] Uncompressing OpenJDK17U-debugimage_aarch64_linux_hotspot_17.0.15_5.tar.gz over ./j2sdk-image...
[2025-04-09T21:26:21.932Z] Removing top-level folder jdk-17.0.15+5-debug-image/
[2025-04-09T21:26:21.932Z] Removing top-level folder jdk-17.0.15+5/
[2025-04-09T21:26:21.932Z] Run /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image/bin/java -version
[2025-04-09T21:26:21.932Z] =JAVA VERSION OUTPUT BEGIN=
[2025-04-09T21:26:21.932Z] openjdk version "17.0.15" 2025-04-15
[2025-04-09T21:26:21.932Z] OpenJDK Runtime Environment Temurin-17.0.15+5 (build 17.0.15+5)
[2025-04-09T21:26:21.932Z] OpenJDK 64-Bit Server VM Temurin-17.0.15+5 (build 17.0.15+5, mixed mode, sharing)
[2025-04-09T21:26:21.932Z] =JAVA VERSION OUTPUT END=
[2025-04-09T21:26:21.932Z] =RELEASE INFO BEGIN=
[2025-04-09T21:26:21.932Z] IMPLEMENTOR="Eclipse Adoptium"
[2025-04-09T21:26:21.932Z] IMPLEMENTOR_VERSION="Temurin-17.0.15+5"
[2025-04-09T21:26:21.932Z] JAVA_RUNTIME_VERSION="17.0.15+5"
[2025-04-09T21:26:21.932Z] JAVA_VERSION="17.0.15"
[2025-04-09T21:26:21.932Z] JAVA_VERSION_DATE="2025-04-15"
[2025-04-09T21:26:21.932Z] LIBC="gnu"
[2025-04-09T21:26:21.933Z] MODULES="java.base java.compiler java.datatransfer java.xml java.prefs java.desktop java.instrument java.logging java.management java.security.sasl java.naming java.rmi java.management.rmi java.net.http java.scripting java.security.jgss java.transaction.xa java.sql java.sql.rowset java.xml.crypto java.se java.smartcardio jdk.accessibility jdk.internal.jvmstat jdk.attach jdk.charsets jdk.compiler jdk.crypto.ec jdk.crypto.cryptoki jdk.dynalink jdk.internal.ed jdk.editpad jdk.hotspot.agent jdk.httpserver jdk.incubator.foreign jdk.incubator.vector jdk.internal.le jdk.internal.opt jdk.internal.vm.ci jdk.internal.vm.compiler jdk.internal.vm.compiler.management jdk.jartool jdk.javadoc jdk.jcmd jdk.management jdk.management.agent jdk.jconsole jdk.jdeps jdk.jdwp.agent jdk.jdi jdk.jfr jdk.jlink jdk.jpackage jdk.jshell jdk.jsobject jdk.jstatd jdk.localedata jdk.management.jfr jdk.naming.dns jdk.naming.rmi jdk.net jdk.nio.mapmode jdk.random jdk.sctp jdk.security.auth jdk.security.jgss jdk.unsupported jdk.unsupported.desktop jdk.xml.dom jdk.zipfs"
[2025-04-09T21:26:21.933Z] OS_ARCH="aarch64"
[2025-04-09T21:26:21.933Z] OS_NAME="Linux"
[2025-04-09T21:26:21.933Z] SOURCE=".:git:4ab160f057dc"
[2025-04-09T21:26:21.933Z] BUILD_SOURCE="git:28cff0685e7ea9524e428ec20243d4c0e8c943bd"
[2025-04-09T21:26:21.933Z] BUILD_SOURCE_REPO="https://github.com/adoptium/temurin-build.git"
[2025-04-09T21:26:21.933Z] SOURCE_REPO="https://github.com/adoptium/jdk17u.git"
[2025-04-09T21:26:21.933Z] FULL_VERSION="17.0.15+5"
[2025-04-09T21:26:21.933Z] SEMANTIC_VERSION="17.0.15+5"
[2025-04-09T21:26:21.933Z] BUILD_INFO="OS: Linux Version: 6.8.0-39-generic"
[2025-04-09T21:26:21.933Z] JVM_VARIANT="Hotspot"
[2025-04-09T21:26:21.933Z] JVM_VERSION="17.0.15+5"
[2025-04-09T21:26:21.933Z] IMAGE_TYPE="JDK"
[2025-04-09T21:26:21.933Z] =RELEASE INFO END=
[2025-04-09T21:26:21.933Z] get testKitGen...
[2025-04-09T21:26:21.933Z] git clone -q https://github.com/adoptium/TKG.git
[2025-04-09T21:26:22.893Z] git rev-parse v1.0.7
[2025-04-09T21:26:22.893Z] git checkout -q -f cab8e835b96c95363ec451851b4c9c319c66ca11
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[2025-04-09T21:26:23.576Z] $ ssh-agent -k
[2025-04-09T21:26:23.580Z] unset SSH_AUTH_SOCK;
[2025-04-09T21:26:23.580Z] unset SSH_AGENT_PID;
[2025-04-09T21:26:23.580Z] echo Agent pid 1758105 killed;
[2025-04-09T21:26:24.385Z] [ssh-agent] Stopped.
[Pipeline] // sshagent
[Pipeline] fileExists
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // stage
[Pipeline] echo
[2025-04-09T21:26:25.005Z] Rerun in Grinder: https://ci.adoptium.net/job/Grinder/parambuild/?SDK_RESOURCE=upstream&TARGET=extended.perf&BASE_DOCKER_REGISTRY_CREDENTIAL_ID=&TEST_FLAG=&UPSTREAM_TEST_JOB_NAME=&DOCKER_REQUIRED=false&ACTIVE_NODE_TIMEOUT=5&VENDOR_TEST_DIRS=&EXTRA_DOCKER_ARGS=&TKG_OWNER_BRANCH=adoptium%3Amaster&OPENJ9_SYSTEMTEST_OWNER_BRANCH=eclipse%3Amaster&PLATFORM=aarch64_linux&GENERATE_JOBS=false&KEEP_REPORTDIR=false&PERSONAL_BUILD=false&DOCKER_REGISTRY_DIR=&RERUN_ITERATIONS=0&ADOPTOPENJDK_REPO=https%3A%2F%2Fgithub.com%2Fadoptium%2Faqa-tests.git&SETUP_JCK_RUN=false&DOCKER_REGISTRY_URL_CREDENTIAL_ID=&LABEL=&EXTRA_OPTIONS=&CUSTOMIZED_SDK_URL=&BUILD_IDENTIFIER=&JENKINS_KEY=&ADOPTOPENJDK_BRANCH=v1.0.7-release&LIGHT_WEIGHT_CHECKOUT=false&USE_JRE=false&ARTIFACTORY_SERVER=&KEEP_WORKSPACE=false&USER_CREDENTIALS_ID=&JDK_VERSION=17&DOCKER_REGISTRY_URL=&ITERATIONS=1&VENDOR_TEST_REPOS=&JDK_REPO=https%3A%2F%2Fgithub.com%2Fadoptium%2Fjdk17u&JCK_GIT_BRANCH=master&OPENJ9_BRANCH=master&OPENJ9_SHA=&JCK_GIT_REPO=&VENDOR_TEST_BRANCHES=&UPSTREAM_JOB_NAME=build-scripts%2Fjobs%2Frelease%2Fjobs%2Fjdk17u%2Fjdk17u-release-linux-aarch64-temurin&OPENJ9_REPO=https%3A%2F%2Fgithub.com%2Feclipse-openj9%2Fopenj9.git&CLOUD_PROVIDER=&CUSTOM_TARGET=&VENDOR_TEST_SHAS=&JDK_BRANCH=jdk-17.0.15%2B5_adopt&LABEL_ADDITION=&ARTIFACTORY_REPO=&ARTIFACTORY_ROOT_DIR=&UPSTREAM_TEST_JOB_NUMBER=&DOCKERIMAGE_TAG=&TEST_TIME=120&JDK_IMPL=hotspot&SSH_AGENT_CREDENTIAL=&AUTO_DETECT=true&SLACK_CHANNEL=aqavit-bot&DYNAMIC_COMPILE=false&RELATED_NODES=&ADOPTOPENJDK_SYSTEMTEST_OWNER_BRANCH=adoptium%3Amaster&APPLICATION_OPTIONS=&CUSTOMIZED_SDK_URL_CREDENTIAL_ID=eclipse_temurin_bot_email_and_token&ARCHIVE_TEST_RESULTS=false&NUM_MACHINES=&OPENJDK_SHA=&TRSS_URL=&RERUN_FAILURE=false&USE_TESTENV_PROPERTIES=true&BUILD_LIST=perf&ADDITIONAL_ARTIFACTS_REQUIRED=&UPSTREAM_JOB_NUMBER=32&STF_OWNER_BRANCH=adoptium%3Amaster&TIME_LIMIT=25&JVM_OPTIONS=&PARALLEL=None
[Pipeline] echo
[2025-04-09T21:26:25.015Z] Rerun in Grinder on same machine: https://ci.adoptium.net/job/Grinder/parambuild/?SDK_RESOURCE=upstream&TARGET=extended.perf&BASE_DOCKER_REGISTRY_CREDENTIAL_ID=&TEST_FLAG=&UPSTREAM_TEST_JOB_NAME=&DOCKER_REQUIRED=false&ACTIVE_NODE_TIMEOUT=5&VENDOR_TEST_DIRS=&EXTRA_DOCKER_ARGS=&TKG_OWNER_BRANCH=adoptium%3Amaster&OPENJ9_SYSTEMTEST_OWNER_BRANCH=eclipse%3Amaster&PLATFORM=aarch64_linux&GENERATE_JOBS=false&KEEP_REPORTDIR=false&PERSONAL_BUILD=false&DOCKER_REGISTRY_DIR=&RERUN_ITERATIONS=0&ADOPTOPENJDK_REPO=https%3A%2F%2Fgithub.com%2Fadoptium%2Faqa-tests.git&SETUP_JCK_RUN=false&DOCKER_REGISTRY_URL_CREDENTIAL_ID=&LABEL=test-osuosl-ubuntu2204-aarch64-1&EXTRA_OPTIONS=&CUSTOMIZED_SDK_URL=&BUILD_IDENTIFIER=&JENKINS_KEY=&ADOPTOPENJDK_BRANCH=v1.0.7-release&LIGHT_WEIGHT_CHECKOUT=false&USE_JRE=false&ARTIFACTORY_SERVER=&KEEP_WORKSPACE=false&USER_CREDENTIALS_ID=&JDK_VERSION=17&DOCKER_REGISTRY_URL=&ITERATIONS=1&VENDOR_TEST_REPOS=&JDK_REPO=https%3A%2F%2Fgithub.com%2Fadoptium%2Fjdk17u&JCK_GIT_BRANCH=master&OPENJ9_BRANCH=master&OPENJ9_SHA=&JCK_GIT_REPO=&VENDOR_TEST_BRANCHES=&UPSTREAM_JOB_NAME=build-scripts%2Fjobs%2Frelease%2Fjobs%2Fjdk17u%2Fjdk17u-release-linux-aarch64-temurin&OPENJ9_REPO=https%3A%2F%2Fgithub.com%2Feclipse-openj9%2Fopenj9.git&CLOUD_PROVIDER=&CUSTOM_TARGET=&VENDOR_TEST_SHAS=&JDK_BRANCH=jdk-17.0.15%2B5_adopt&LABEL_ADDITION=&ARTIFACTORY_REPO=&ARTIFACTORY_ROOT_DIR=&UPSTREAM_TEST_JOB_NUMBER=&DOCKERIMAGE_TAG=&TEST_TIME=120&JDK_IMPL=hotspot&SSH_AGENT_CREDENTIAL=&AUTO_DETECT=true&SLACK_CHANNEL=aqavit-bot&DYNAMIC_COMPILE=false&RELATED_NODES=&ADOPTOPENJDK_SYSTEMTEST_OWNER_BRANCH=adoptium%3Amaster&APPLICATION_OPTIONS=&CUSTOMIZED_SDK_URL_CREDENTIAL_ID=eclipse_temurin_bot_email_and_token&ARCHIVE_TEST_RESULTS=false&NUM_MACHINES=&OPENJDK_SHA=&TRSS_URL=&RERUN_FAILURE=false&USE_TESTENV_PROPERTIES=true&BUILD_LIST=perf&ADDITIONAL_ARTIFACTS_REQUIRED=&UPSTREAM_JOB_NUMBER=32&STF_OWNER_BRANCH=adoptium%3Amaster&TIME_LIMIT=25&JVM_OPTIONS=&PARALLEL=None
[Pipeline] stage
[Pipeline] { (setupParallelEnv)
[Pipeline] timeout
[2025-04-09T21:26:25.058Z] Timeout set to expire in 1 hr 0 min
[Pipeline] {
[Pipeline] copyArtifacts
[2025-04-09T21:26:26.152Z] Copied 1 artifact from "getTRSSOutput" build number 1737
[Pipeline] sh
[2025-04-09T21:26:28.029Z] + cd ./aqa-tests/TKG/resources/TRSS
[2025-04-09T21:26:28.029Z] + gzip -cd TRSSOutput.tar.gz
[2025-04-09T21:26:28.029Z] + tar xof -
[2025-04-09T21:26:28.029Z] + rm TRSSOutput.tar.gz
[Pipeline] }
[Pipeline] // timeout
[Pipeline] timeout
[2025-04-09T21:26:28.595Z] Timeout set to expire in 20 min
[Pipeline] {
[Pipeline] echo
[2025-04-09T21:26:28.614Z] Custom URL: https://ci.adoptium.net//job/test.getDependency/lastSuccessfulBuild/artifact/
[Pipeline] sh
[2025-04-09T21:26:30.478Z] + perl ./aqa-tests/TKG/scripts/getDependencies.pl -path /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib -task default -customUrl https://ci.adoptium.net//job/test.getDependency/lastSuccessfulBuild/artifact/
[2025-04-09T21:26:30.479Z] --------------------------------------------
[2025-04-09T21:26:30.479Z] path is set to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib
[2025-04-09T21:26:30.479Z] task is set to default
[2025-04-09T21:26:30.479Z] dependencyList is set to all
[2025-04-09T21:26:30.479Z] --------------------------------------------
[2025-04-09T21:26:30.479Z] Starting download third party dependent jars
[2025-04-09T21:26:30.479Z] --------------------------------------------
[2025-04-09T21:26:30.479Z] downloading dependent third party jars to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib
[2025-04-09T21:26:30.479Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/org.eclipse.osgi-3.16.100.jar exists with correct hash, not downloading
[2025-04-09T21:26:30.479Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/jtreg_5_1_b01.tar.gz exists with correct hash, not downloading
[2025-04-09T21:26:30.479Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/javassist.jar exists with correct hash, not downloading
[2025-04-09T21:26:30.479Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/apache-maven-bin.tar.gz exists with correct hash, not downloading
[2025-04-09T21:26:30.479Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/objenesis.jar exists with correct hash, not downloading
[2025-04-09T21:26:30.479Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/jython-standalone.jar exists with correct hash, not downloading
[2025-04-09T21:26:30.479Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/commons-cli.jar exists with correct hash, not downloading
[2025-04-09T21:26:30.479Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/junit4.jar exists with correct hash, not downloading
[2025-04-09T21:26:30.479Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/asmtools.jar exists with correct hash, not downloading
[2025-04-09T21:26:30.479Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/jtreg_6_1.tar.gz exists with correct hash, not downloading
[2025-04-09T21:26:30.479Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/byte-buddy-agent.jar exists with correct hash, not downloading
[2025-04-09T21:26:30.479Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/asm-all.jar exists with correct hash, not downloading
[2025-04-09T21:26:30.479Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/mockito-core.jar exists with correct hash, not downloading
[2025-04-09T21:26:30.479Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/jaxb-api.jar exists with correct hash, not downloading
[2025-04-09T21:26:30.479Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/openj9jtregtimeouthandler.jar exists with correct hash, not downloading
[2025-04-09T21:26:30.479Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/commons-exec.jar exists with correct hash, not downloading
[2025-04-09T21:26:30.479Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/byte-buddy.jar exists with correct hash, not downloading
[2025-04-09T21:26:30.479Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/jcstress-tests-all-20240222.jar exists with correct hash, not downloading
[2025-04-09T21:26:30.479Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/json-simple.jar exists with correct hash, not downloading
[2025-04-09T21:26:30.479Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/jtreg_7_3_1_1.tar.gz exists with correct hash, not downloading
[2025-04-09T21:26:30.479Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/testng.jar exists with correct hash, not downloading
[2025-04-09T21:26:30.479Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/asm.jar exists with correct hash, not downloading
[2025-04-09T21:26:30.479Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/jcommander.jar exists with correct hash, not downloading
[2025-04-09T21:26:30.479Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/jtreg_7_4_1.tar.gz exists with correct hash, not downloading
[2025-04-09T21:26:30.479Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/jtreg_7_5_1_1.tar.gz exists with correct hash, not downloading
[2025-04-09T21:26:30.479Z] downloaded dependent third party jars successfully
[Pipeline] }
[Pipeline] // timeout
[Pipeline] nodesByLabel
[2025-04-09T21:26:31.048Z] Found a total of 2 nodes with the 'sw.os.linux&&hw.arch.aarch64&&ci.role.perf' label
[Pipeline] sh
[2025-04-09T21:26:32.935Z] + cd ./aqa-tests/TKG
[2025-04-09T21:26:32.935Z] + uname
[2025-04-09T21:26:32.935Z] + [ Linux = AIX ]
[2025-04-09T21:26:32.935Z] + uname
[2025-04-09T21:26:32.935Z] + [ Linux = SunOS ]
[2025-04-09T21:26:32.935Z] + uname
[2025-04-09T21:26:32.935Z] + [ Linux = *BSD ]
[2025-04-09T21:26:32.935Z] + MAKE=make
[2025-04-09T21:26:32.935Z] + make genParallelList TEST=extended.perf TEST_TIME=120 NUM_MACHINES=
[2025-04-09T21:26:32.935Z] JAVA_HOME is set to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image
[2025-04-09T21:26:32.935Z] LIB_DIR is set to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib
[2025-04-09T21:26:32.935Z] rm -f -r /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_compilation; \
[2025-04-09T21:26:32.935Z] mkdir -p /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_compilation; \
[2025-04-09T21:26:32.935Z] (ant -f ./scripts/build_tools.xml "-DTEST_JDK_HOME=/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image" "-DTEST_ROOT=/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/.." "-DLIB_DIR=/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib" 2>&1; echo $? ) | tee "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_compilation/compilation.log"; \
[2025-04-09T21:26:32.935Z] if [ -z "$(tail -1 /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_compilation/compilation.log | grep 0)" ]; then perl scripts/moveDmp.pl --compileLogPath="/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_compilation/compilation.log" --testRoot="/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/.."; false; else rm -f -r "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_compilation"; fi
[2025-04-09T21:26:32.935Z] Buildfile: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/scripts/build_tools.xml
[2025-04-09T21:26:32.935Z]
[2025-04-09T21:26:32.935Z] build:
[2025-04-09T21:26:32.935Z]
[2025-04-09T21:26:32.935Z] clean:
[2025-04-09T21:26:32.935Z]
[2025-04-09T21:26:32.935Z] init:
[2025-04-09T21:26:32.935Z] [mkdir] Created dir: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/bin
[2025-04-09T21:26:32.935Z]
[2025-04-09T21:26:32.935Z] getDependentLibs:
[2025-04-09T21:26:32.935Z] [exec] --------------------------------------------
[2025-04-09T21:26:32.935Z] [exec] path is set to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib
[2025-04-09T21:26:32.936Z] [exec] task is set to default
[2025-04-09T21:26:32.936Z] [exec] dependencyList is set to json_simple
[2025-04-09T21:26:32.936Z] [exec] --------------------------------------------
[2025-04-09T21:26:32.936Z] [exec] Starting download third party dependent jars
[2025-04-09T21:26:32.936Z] [exec] --------------------------------------------
[2025-04-09T21:26:32.936Z] [exec] downloading dependent third party jars to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib
[2025-04-09T21:26:32.936Z] [exec] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/json-simple.jar exists with correct hash, not downloading
[2025-04-09T21:26:32.936Z] [exec] downloaded dependent third party jars successfully
[2025-04-09T21:26:32.936Z]
[2025-04-09T21:26:32.936Z] compile:
[2025-04-09T21:26:32.936Z] [echo] Ant version is Apache Ant(TM) version 1.10.15 compiled on August 25 2024
[2025-04-09T21:26:32.936Z] [echo] ============COMPILER SETTINGS============
[2025-04-09T21:26:32.936Z] [echo] ===fork: yes
[2025-04-09T21:26:32.936Z] [echo] ===debug: on
[2025-04-09T21:26:32.936Z] [javac] Compiling 28 source files to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/bin
[2025-04-09T21:26:37.130Z] [javac] Note: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/src/org/openj9/envInfo/MachineInfo.java uses or overrides a deprecated API.
[2025-04-09T21:26:37.130Z] [javac] Note: Recompile with -Xlint:deprecation for details.
[2025-04-09T21:26:37.130Z]
[2025-04-09T21:26:37.130Z] dist:
[2025-04-09T21:26:37.130Z] [jar] Building jar: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/bin/TestKitGen.jar
[2025-04-09T21:26:37.130Z]
[2025-04-09T21:26:37.130Z] BUILD SUCCESSFUL
[2025-04-09T21:26:37.130Z] Total time: 3 seconds
[2025-04-09T21:26:37.130Z] 0
[2025-04-09T21:26:37.130Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image/bin/java -cp ./bin/TestKitGen.jar org.openj9.envInfo.EnvDetector
[2025-04-09T21:26:37.130Z] Unfamiliar microArch detected in TKG. It will not be added in TKG microArch!
[2025-04-09T21:26:37.130Z] microArchOutput:
[2025-04-09T21:26:37.130Z] ****************************** MACHINE INFO ******************************
[2025-04-09T21:26:37.130Z] File path : /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG
[2025-04-09T21:26:37.130Z] Free Physical Memory Size : 7864266752
[2025-04-09T21:26:37.130Z] Free space (bytes) : 51966406656
[2025-04-09T21:26:37.130Z] Total Physical Memory Size : 16739598336
[2025-04-09T21:26:37.130Z] Total space (bytes) : 83902935040
[2025-04-09T21:26:37.130Z] Usable space (bytes) : 48489496576
[2025-04-09T21:26:37.130Z] antVersion : Apache Ant(TM) version 1.10.15 compiled on August 25 2024
[2025-04-09T21:26:37.130Z] bashVersion :
[2025-04-09T21:26:37.130Z] GNU bash, version 5.1.16(1)-release (aarch64-unknown-linux-gnu)
[2025-04-09T21:26:37.130Z] Copyright (C) 2020 Free Software Foundation, Inc.
[2025-04-09T21:26:37.130Z] License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
[2025-04-09T21:26:37.130Z]
[2025-04-09T21:26:37.130Z] This is free software; you are free to change and redistribute it.
[2025-04-09T21:26:37.130Z] There is NO WARRANTY, to the extent permitted by law.
[2025-04-09T21:26:37.130Z] cpuCores : 8
[2025-04-09T21:26:37.130Z] curlVersion :
[2025-04-09T21:26:37.130Z] curl 7.81.0 (aarch64-unknown-linux-gnu) libcurl/7.81.0 OpenSSL/3.0.2 zlib/1.2.11 brotli/1.0.9 zstd/1.4.8 libidn2/2.3.2 libpsl/0.21.0 (+libidn2/2.3.2) libssh/0.9.6/openssl/zlib nghttp2/1.43.0 librtmp/2.3 OpenLDAP/2.5.18
[2025-04-09T21:26:37.130Z] Release-Date: 2022-01-05
[2025-04-09T21:26:37.130Z] Protocols: dict file ftp ftps gopher gophers http https imap imaps ldap ldaps mqtt pop3 pop3s rtmp rtsp scp sftp smb smbs smtp smtps telnet tftp
[2025-04-09T21:26:37.130Z] Features: alt-svc AsynchDNS brotli GSS-API HSTS HTTP2 HTTPS-proxy IDN IPv6 Kerberos Largefile libz NTLM NTLM_WB PSL SPNEGO SSL TLS-SRP UnixSockets zstd
[2025-04-09T21:26:37.130Z] dockerVersion : Docker version 20.10.23, build 7155243
[2025-04-09T21:26:37.130Z] gcc version : 11
[2025-04-09T21:26:37.130Z] gclibc version : ldd (Ubuntu GLIBC 2.35-0ubuntu3.9) 2.35
[2025-04-09T21:26:37.130Z] gdb version : bash: line 1: gdb: command not found
[2025-04-09T21:26:37.130Z] lldb version : Command could not be executed
[2025-04-09T21:26:37.130Z] makeVersion :
[2025-04-09T21:26:37.130Z] GNU Make 4.3
[2025-04-09T21:26:37.130Z] Built for aarch64-unknown-linux-gnu
[2025-04-09T21:26:37.130Z] Copyright (C) 1988-2020 Free Software Foundation, Inc.
[2025-04-09T21:26:37.130Z] License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
[2025-04-09T21:26:37.130Z] This is free software: you are free to change and redistribute it.
[2025-04-09T21:26:37.130Z] There is NO WARRANTY, to the extent permitted by law.
[2025-04-09T21:26:37.130Z] mavenVersion :
[2025-04-09T21:26:37.130Z] Apache Maven 3.6.3 (cecedd343002696d0abb50b32b541b8a6ba2883f)
[2025-04-09T21:26:37.130Z] Maven home: /usr/local/apache-maven-3.6.3
[2025-04-09T21:26:37.130Z] Java version: 17.0.15, vendor: Eclipse Adoptium, runtime: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image
[2025-04-09T21:26:37.130Z] Default locale: en_US, platform encoding: UTF-8
[2025-04-09T21:26:37.130Z] OS name: "linux", version: "5.15.0-133-generic", arch: "aarch64", family: "unix"
[2025-04-09T21:26:37.130Z] osInfo :
[2025-04-09T21:26:37.130Z] PRETTY_NAME="Ubuntu 22.04.5 LTS"
[2025-04-09T21:26:37.130Z] NAME="Ubuntu"
[2025-04-09T21:26:37.130Z] VERSION_ID="22.04"
[2025-04-09T21:26:37.130Z] VERSION="22.04.5 LTS (Jammy Jellyfish)"
[2025-04-09T21:26:37.130Z] VERSION_CODENAME=jammy
[2025-04-09T21:26:37.130Z] ID=ubuntu
[2025-04-09T21:26:37.130Z] ID_LIKE=debian
[2025-04-09T21:26:37.131Z] HOME_URL="https://www.ubuntu.com/"
[2025-04-09T21:26:37.131Z] SUPPORT_URL="https://help.ubuntu.com/"
[2025-04-09T21:26:37.131Z] BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
[2025-04-09T21:26:37.131Z] PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
[2025-04-09T21:26:37.131Z] UBUNTU_CODENAME=jammy
[2025-04-09T21:26:37.131Z] osLabel : ubuntu.22
[2025-04-09T21:26:37.131Z] perlVersion :
[2025-04-09T21:26:37.131Z]
[2025-04-09T21:26:37.131Z] This is perl 5, version 34, subversion 0 (v5.34.0) built for aarch64-linux-gnu-thread-multi
[2025-04-09T21:26:37.131Z] (with 60 registered patches, see perl -V for more detail)
[2025-04-09T21:26:37.131Z]
[2025-04-09T21:26:37.131Z] Copyright 1987-2021, Larry Wall
[2025-04-09T21:26:37.131Z]
[2025-04-09T21:26:37.131Z] Perl may be copied only under the terms of either the Artistic License or the
[2025-04-09T21:26:37.131Z] GNU General Public License, which may be found in the Perl 5 source kit.
[2025-04-09T21:26:37.131Z]
[2025-04-09T21:26:37.131Z] Complete documentation for Perl, including FAQ lists, should be found on
[2025-04-09T21:26:37.131Z] this system using "man perl" or "perldoc perl". If you have access to the
[2025-04-09T21:26:37.131Z] Internet, point your browser at http://www.perl.org/, the Perl Home Page.
[2025-04-09T21:26:37.131Z]
[2025-04-09T21:26:37.131Z] podmanVersion : bash: line 1: podman: command not found
[2025-04-09T21:26:37.131Z] procArch : aarch64
[2025-04-09T21:26:37.131Z] sysArch : aarch64
[2025-04-09T21:26:37.131Z] sysOS : Linux
[2025-04-09T21:26:37.131Z] ulimit :
[2025-04-09T21:26:37.131Z] real-time non-blocking time (microseconds, -R) unlimited
[2025-04-09T21:26:37.131Z] core file size (blocks, -c) 0
[2025-04-09T21:26:37.131Z] data seg size (kbytes, -d) unlimited
[2025-04-09T21:26:37.131Z] scheduling priority (-e) 0
[2025-04-09T21:26:37.131Z] file size (blocks, -f) unlimited
[2025-04-09T21:26:37.131Z] pending signals (-i) 63441
[2025-04-09T21:26:37.131Z] max locked memory (kbytes, -l) 2043408
[2025-04-09T21:26:37.131Z] max memory size (kbytes, -m) unlimited
[2025-04-09T21:26:37.131Z] open files (-n) 1048576
[2025-04-09T21:26:37.131Z] pipe size (512 bytes, -p) 8
[2025-04-09T21:26:37.131Z] POSIX message queues (bytes, -q) 819200
[2025-04-09T21:26:37.131Z] real-time priority (-r) 0
[2025-04-09T21:26:37.131Z] stack size (kbytes, -s) 8192
[2025-04-09T21:26:37.131Z] cpu time (seconds, -t) unlimited
[2025-04-09T21:26:37.131Z] max user processes (-u) 63441
[2025-04-09T21:26:37.131Z] virtual memory (kbytes, -v) unlimited
[2025-04-09T21:26:37.131Z] file locks (-x) unlimited
[2025-04-09T21:26:37.131Z] uname : Linux test-osuosl-ubuntu2204-aarch64-1 5.15.0-133-generic #144-Ubuntu SMP Sat Feb 8 14:13:21 UTC 2025 aarch64 aarch64 aarch64 GNU/Linux
[2025-04-09T21:26:37.131Z] vmVendor : Oracle Corporation
[2025-04-09T21:26:37.131Z] vmVersion : 17.0.15+5
[2025-04-09T21:26:37.131Z] xlc version : bash: line 1: xlC: command not found
[2025-04-09T21:26:37.131Z] **************************************************************************
[2025-04-09T21:26:37.131Z]
[2025-04-09T21:26:37.131Z] System.getProperty('java.vm.name')=OpenJDK 64-Bit Server VM
[2025-04-09T21:26:37.131Z]
[2025-04-09T21:26:37.131Z] System.getProperty('java.vendor')=Eclipse Adoptium
[2025-04-09T21:26:37.131Z]
[2025-04-09T21:26:37.131Z] System.getProperty('os.name')=Linux
[2025-04-09T21:26:37.131Z]
[2025-04-09T21:26:37.131Z] System.getProperty('os.arch')=aarch64
[2025-04-09T21:26:37.131Z]
[2025-04-09T21:26:37.131Z] System.getProperty('java.fullversion')=null
[2025-04-09T21:26:37.131Z]
[2025-04-09T21:26:37.131Z] System.getProperty('sun.arch.data.model')=64
[2025-04-09T21:26:37.131Z]
[2025-04-09T21:26:37.131Z] make -f makeGen.mk AUTO_DETECT=true MODE=parallelList NUM_MACHINES= TEST_TIME=120 TESTTARGET=extended.perf TESTLIST= TRSS_URL= LIB_DIR=/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib
[2025-04-09T21:26:37.131Z] make[1]: Entering directory '/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG'
[2025-04-09T21:26:37.131Z] perl scripts/configure.pl
[2025-04-09T21:26:38.094Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image/bin/java -cp "./bin/TestKitGen.jar:/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/json-simple.jar" org.testKitGen.MainRunner --mode=parallelList --spec=linux_aarch64 --microArch="" --osLabel="ubuntu.22" --jdkVersion=17 --impl=hotspot --vendor="eclipse" --buildList=perf --iterations=1 --aotIterations= --testFlag= --testTarget=extended.perf --testList= --numOfMachines= --testTime=120 --TRSSURL=
[2025-04-09T21:26:38.094Z] Modes data parsed from resources/modes.xml and resources/ottawa.csv.
[2025-04-09T21:26:38.094Z]
[2025-04-09T21:26:38.094Z]
[2025-04-09T21:26:38.094Z] Starting to generate parallel test lists.
[2025-04-09T21:26:38.094Z]
[2025-04-09T21:26:38.094Z] Parsing /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../perf/dacapo/playlist.xml
[2025-04-09T21:26:38.094Z] Parsing /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../perf/idle_micro/playlist.xml
[2025-04-09T21:26:38.094Z] Parsing /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../perf/liberty/playlist.xml
[2025-04-09T21:26:38.094Z] Parsing /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../perf/specjbb/playlist.xml
[2025-04-09T21:26:38.094Z] Parsing /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../perf/renaissance/playlist.xml
[2025-04-09T21:26:38.094Z] Parsing /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../perf/bumbleBench/playlist.xml
[2025-04-09T21:26:38.094Z] Attempting to get test duration data from TRSS.
[2025-04-09T21:26:38.094Z] curl --silent --max-time 120 -L -k https://trss.adoptopenjdk.net/api/getTestAvgDuration?limit=10&jdkVersion=17&impl=hs&platform=aarch64_linux&group=perf&level=extended
[2025-04-09T21:26:39.059Z] Attempting to get test duration data from cached files.
[2025-04-09T21:26:39.059Z]
[2025-04-09T21:26:39.059Z] TEST DURATION
[2025-04-09T21:26:39.059Z] ====================================================================================
[2025-04-09T21:26:39.059Z] Total number of tests searched: 20
[2025-04-09T21:26:39.059Z] Number of test durations found: 20
[2025-04-09T21:26:39.059Z] Top slowest tests:
[2025-04-09T21:26:39.059Z] 05m50s renaissance-movie-lens_0
[2025-04-09T21:26:39.059Z] 04m06s renaissance-finagle-http_0
[2025-04-09T21:26:39.059Z] 02m51s renaissance-als_0
[2025-04-09T21:26:39.059Z] ====================================================================================
[2025-04-09T21:26:39.059Z]
[2025-04-09T21:26:39.059Z] Test target is split into 1 lists.
[2025-04-09T21:26:39.059Z] Reducing estimated test running time from 25m40s to 25m40s.
[2025-04-09T21:26:39.059Z]
[2025-04-09T21:26:39.059Z] -------------------------------------testList_0-------------------------------------
[2025-04-09T21:26:39.059Z] Number of tests: 20
[2025-04-09T21:26:39.059Z] Estimated running time: 25m40s
[2025-04-09T21:26:39.059Z] TESTLIST=renaissance-movie-lens_0,renaissance-finagle-http_0,renaissance-als_0,renaissance-philosophers_0,renaissance-gauss-mix_0,renaissance-mnemonics_0,renaissance-chi-square_0,renaissance-dec-tree_0,renaissance-par-mnemonics_0,renaissance-log-regression_0,dacapo-jython_0,dacapo-avrora_0,dacapo-xalan_0,dacapo-luindex_0,dacapo-pmd_0,dacapo-fop_0,dacapo-sunflow_0,renaissance-db-shootout_0,dacapo-tomcat_0,renaissance-finagle-chirper_0
[2025-04-09T21:26:39.059Z] ------------------------------------------------------------------------------------
[2025-04-09T21:26:39.059Z]
[2025-04-09T21:26:39.059Z] Parallel test lists file (/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/parallelList.mk) is generated successfully.
[2025-04-09T21:26:39.059Z] make[1]: Leaving directory '/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG'
[Pipeline] fileExists
[Pipeline] echo
[2025-04-09T21:26:39.783Z] read parallelList.mk file: aqa-tests/TKG/parallelList.mk
[Pipeline] readProperties
[Pipeline] echo
[2025-04-09T21:26:40.342Z] Number of test list is 1, no need to run tests in child job.
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Build)
[Pipeline] echo
[2025-04-09T21:26:40.401Z] Building tests...
[Pipeline] timeout
[2025-04-09T21:26:40.404Z] Timeout set to expire in 20 min
[Pipeline] {
[Pipeline] echo
[2025-04-09T21:26:40.426Z] Custom URL: https://ci.adoptium.net//job/test.getDependency/lastSuccessfulBuild/artifact/
[Pipeline] sh
[2025-04-09T21:26:42.302Z] + perl ./aqa-tests/TKG/scripts/getDependencies.pl -path /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib -task default -customUrl https://ci.adoptium.net//job/test.getDependency/lastSuccessfulBuild/artifact/
[2025-04-09T21:26:42.302Z] --------------------------------------------
[2025-04-09T21:26:42.302Z] path is set to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib
[2025-04-09T21:26:42.302Z] task is set to default
[2025-04-09T21:26:42.302Z] dependencyList is set to all
[2025-04-09T21:26:42.302Z] --------------------------------------------
[2025-04-09T21:26:42.302Z] Starting download third party dependent jars
[2025-04-09T21:26:42.302Z] --------------------------------------------
[2025-04-09T21:26:42.302Z] downloading dependent third party jars to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib
[2025-04-09T21:26:42.302Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/javassist.jar exists with correct hash, not downloading
[2025-04-09T21:26:42.302Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/jaxb-api.jar exists with correct hash, not downloading
[2025-04-09T21:26:42.302Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/openj9jtregtimeouthandler.jar exists with correct hash, not downloading
[2025-04-09T21:26:42.302Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/byte-buddy.jar exists with correct hash, not downloading
[2025-04-09T21:26:42.302Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/jcommander.jar exists with correct hash, not downloading
[2025-04-09T21:26:42.302Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/jtreg_7_3_1_1.tar.gz exists with correct hash, not downloading
[2025-04-09T21:26:42.302Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/objenesis.jar exists with correct hash, not downloading
[2025-04-09T21:26:42.302Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/commons-cli.jar exists with correct hash, not downloading
[2025-04-09T21:26:42.302Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/jtreg_7_4_1.tar.gz exists with correct hash, not downloading
[2025-04-09T21:26:42.302Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/testng.jar exists with correct hash, not downloading
[2025-04-09T21:26:42.302Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/asm.jar exists with correct hash, not downloading
[2025-04-09T21:26:42.302Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/jtreg_5_1_b01.tar.gz exists with correct hash, not downloading
[2025-04-09T21:26:42.302Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/mockito-core.jar exists with correct hash, not downloading
[2025-04-09T21:26:42.302Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/jtreg_7_5_1_1.tar.gz exists with correct hash, not downloading
[2025-04-09T21:26:42.302Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/apache-maven-bin.tar.gz exists with correct hash, not downloading
[2025-04-09T21:26:42.302Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/byte-buddy-agent.jar exists with correct hash, not downloading
[2025-04-09T21:26:42.302Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/jtreg_6_1.tar.gz exists with correct hash, not downloading
[2025-04-09T21:26:42.302Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/json-simple.jar exists with correct hash, not downloading
[2025-04-09T21:26:42.302Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/commons-exec.jar exists with correct hash, not downloading
[2025-04-09T21:26:42.302Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/jython-standalone.jar exists with correct hash, not downloading
[2025-04-09T21:26:42.302Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/junit4.jar exists with correct hash, not downloading
[2025-04-09T21:26:42.302Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/jcstress-tests-all-20240222.jar exists with correct hash, not downloading
[2025-04-09T21:26:42.302Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/asm-all.jar exists with correct hash, not downloading
[2025-04-09T21:26:42.302Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/asmtools.jar exists with correct hash, not downloading
[2025-04-09T21:26:42.302Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/org.eclipse.osgi-3.16.100.jar exists with correct hash, not downloading
[2025-04-09T21:26:42.302Z] downloaded dependent third party jars successfully
[Pipeline] }
[Pipeline] // timeout
[Pipeline] fileExists
[Pipeline] fileExists
[Pipeline] withCredentials
[2025-04-09T21:26:43.292Z] Masking supported pattern matches of $USERNAME or $PASSWORD
[Pipeline] {
[Pipeline] dir
[2025-04-09T21:26:43.319Z] Running in /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests
[Pipeline] {
[Pipeline] sshagent
[2025-04-09T21:26:43.423Z] $ ssh-agent
[2025-04-09T21:26:44.240Z] SSH_AUTH_SOCK=/tmp/ssh-XXXXXXKlBIbU/agent.1758577
[2025-04-09T21:26:44.240Z] SSH_AGENT_PID=1758579
[2025-04-09T21:26:44.240Z] [ssh-agent] Started.
[Pipeline] {
[Pipeline] sh
[2025-04-09T21:26:46.130Z] + bash ./compile.sh
[2025-04-09T21:26:46.130Z] Set values based on ./testenv/testenv.properties:
[2025-04-09T21:26:46.130Z] =========
[2025-04-09T21:26:46.130Z] TKG_REPO=https://github.com/adoptium/TKG.git
[2025-04-09T21:26:46.130Z] TKG_BRANCH=v1.0.7
[2025-04-09T21:26:46.130Z] OPENJ9_REPO=https://github.com/eclipse-openj9/openj9.git
[2025-04-09T21:26:46.130Z] OPENJ9_BRANCH=v0.51.0-release
[2025-04-09T21:26:46.130Z] STF_REPO=https://github.com/adoptium/STF.git
[2025-04-09T21:26:46.130Z] STF_BRANCH=v1.0.7
[2025-04-09T21:26:46.130Z] OPENJ9_SYSTEMTEST_REPO=https://github.com/adoptium/openj9-systemtest.git
[2025-04-09T21:26:46.130Z] OPENJ9_SYSTEMTEST_BRANCH=v1.0.7
[2025-04-09T21:26:46.130Z] ADOPTOPENJDK_SYSTEMTEST_REPO=https://github.com/adoptium/aqa-systemtest.git
[2025-04-09T21:26:46.130Z] ADOPTOPENJDK_SYSTEMTEST_BRANCH=v1.0.7
[2025-04-09T21:26:46.130Z] JDK8_REPO=https://github.com/adoptium/jdk8u.git
[2025-04-09T21:26:46.130Z] JDK8_BRANCH=jdk8u452-dryrun-ga
[2025-04-09T21:26:46.130Z] JDK11_REPO=https://github.com/adoptium/jdk11u.git
[2025-04-09T21:26:46.130Z] JDK11_BRANCH=jdk-11.0.27-dryrun-ga
[2025-04-09T21:26:46.130Z] JDK17_REPO=https://github.com/adoptium/jdk17u.git
[2025-04-09T21:26:46.130Z] JDK17_BRANCH=jdk-17.0.15-dryrun-ga
[2025-04-09T21:26:46.130Z] JDK21_REPO=https://github.com/adoptium/jdk21u.git
[2025-04-09T21:26:46.130Z] JDK21_BRANCH=jdk-21.0.7-dryrun-ga
[2025-04-09T21:26:46.130Z] JDK24_REPO=https://github.com/adoptium/jdk24u.git
[2025-04-09T21:26:46.130Z] JDK24_BRANCH=jdk-24.0.1-dryrun-ga
[2025-04-09T21:26:46.130Z] JDK8_OPENJ9_REPO=https://github.com/ibmruntimes/openj9-openjdk-jdk8.git
[2025-04-09T21:26:46.130Z] JDK8_OPENJ9_BRANCH=v0.51.0-release
[2025-04-09T21:26:46.130Z] JDK11_OPENJ9_REPO=https://github.com/ibmruntimes/openj9-openjdk-jdk11.git
[2025-04-09T21:26:46.130Z] JDK11_OPENJ9_BRANCH=v0.51.0-release
[2025-04-09T21:26:46.130Z] JDK17_OPENJ9_REPO=https://github.com/ibmruntimes/openj9-openjdk-jdk17.git
[2025-04-09T21:26:46.130Z] JDK17_OPENJ9_BRANCH=v0.51.0-release
[2025-04-09T21:26:46.130Z] JDK21_OPENJ9_REPO=https://github.com/ibmruntimes/openj9-openjdk-jdk21.git
[2025-04-09T21:26:46.130Z] JDK21_OPENJ9_BRANCH=v0.51.0-release
[2025-04-09T21:26:46.131Z] JDK24_OPENJ9_REPO=https://github.com/ibmruntimes/openj9-openjdk-jdk24.git
[2025-04-09T21:26:46.131Z] JDK24_OPENJ9_BRANCH=v0.51.0-release
[2025-04-09T21:26:46.131Z] JDK11_OPENJCEPLUS_GIT_REPO=https://github.com/ibmruntimes/OpenJCEPlus.git
[2025-04-09T21:26:46.131Z] JDK11_OPENJCEPLUS_GIT_BRANCH=semeru-java-11.0.27
[2025-04-09T21:26:46.131Z] JDK17_OPENJCEPLUS_GIT_REPO=https://github.com/ibmruntimes/OpenJCEPlus.git
[2025-04-09T21:26:46.131Z] JDK17_OPENJCEPLUS_GIT_BRANCH=semeru-java-17.0.15
[2025-04-09T21:26:46.131Z] JDK21_OPENJCEPLUS_GIT_REPO=https://github.com/ibmruntimes/OpenJCEPlus.git
[2025-04-09T21:26:46.131Z] JDK21_OPENJCEPLUS_GIT_BRANCH=semeru-java-21.0.7
[2025-04-09T21:26:46.131Z] JDK24_OPENJCEPLUS_GIT_REPO=https://github.com/ibmruntimes/OpenJCEPlus.git
[2025-04-09T21:26:46.131Z] JDK24_OPENJCEPLUS_GIT_BRANCH=semeru-java-24.0.1
[2025-04-09T21:26:46.131Z] AQA_REQUIRED_TARGETS=sanity.functional,extended.functional,special.functional,sanity.openjdk,extended.openjdk,sanity.system,extended.system,sanity.perf,extended.perf
[2025-04-09T21:26:46.131Z]
[2025-04-09T21:26:46.131Z] =========
[2025-04-09T21:26:46.131Z]
[2025-04-09T21:26:46.131Z] JDK_REPO=https://github.com/adoptium/jdk17u.git
[2025-04-09T21:26:46.131Z] JDK_BRANCH=jdk-17.0.15-dryrun-ga
[2025-04-09T21:26:46.131Z] OPENJCEPLUS_GIT_REPO=https://github.com/ibmruntimes/OpenJCEPlus.git
[2025-04-09T21:26:46.131Z] OPENJCEPLUS_GIT_BRANCH=semeru-java17
[2025-04-09T21:26:46.131Z] JAVA_HOME is set to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image
[2025-04-09T21:26:46.131Z] LIB_DIR is set to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib
[2025-04-09T21:26:46.131Z] rm -f -r /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_compilation; \
[2025-04-09T21:26:46.131Z] mkdir -p /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_compilation; \
[2025-04-09T21:26:46.131Z] (ant -f ./scripts/build_tools.xml "-DTEST_JDK_HOME=/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image" "-DTEST_ROOT=/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/.." "-DLIB_DIR=/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib" 2>&1; echo $? ) | tee "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_compilation/compilation.log"; \
[2025-04-09T21:26:46.132Z] if [ -z "$(tail -1 /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_compilation/compilation.log | grep 0)" ]; then perl scripts/moveDmp.pl --compileLogPath="/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_compilation/compilation.log" --testRoot="/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/.."; false; else rm -f -r "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_compilation"; fi
[2025-04-09T21:26:46.132Z] Buildfile: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/scripts/build_tools.xml
[2025-04-09T21:26:46.132Z]
[2025-04-09T21:26:46.132Z] build:
[2025-04-09T21:26:46.132Z]
[2025-04-09T21:26:46.132Z] clean:
[2025-04-09T21:26:46.132Z] [delete] Deleting: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/autoGenEnv.mk
[2025-04-09T21:26:46.132Z] [delete] Deleting directory /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/bin
[2025-04-09T21:26:46.132Z]
[2025-04-09T21:26:46.132Z] init:
[2025-04-09T21:26:46.132Z] [mkdir] Created dir: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/bin
[2025-04-09T21:26:46.132Z]
[2025-04-09T21:26:46.132Z] getDependentLibs:
[2025-04-09T21:26:46.132Z] [exec] --------------------------------------------
[2025-04-09T21:26:46.132Z] [exec] path is set to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib
[2025-04-09T21:26:46.132Z] [exec] task is set to default
[2025-04-09T21:26:46.132Z] [exec] dependencyList is set to json_simple
[2025-04-09T21:26:46.132Z] [exec] --------------------------------------------
[2025-04-09T21:26:46.132Z] [exec] Starting download third party dependent jars
[2025-04-09T21:26:46.132Z] [exec] --------------------------------------------
[2025-04-09T21:26:46.132Z] [exec] downloading dependent third party jars to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib
[2025-04-09T21:26:46.132Z] [exec] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/json-simple.jar exists with correct hash, not downloading
[2025-04-09T21:26:46.132Z] [exec] downloaded dependent third party jars successfully
[2025-04-09T21:26:46.132Z]
[2025-04-09T21:26:46.132Z] compile:
[2025-04-09T21:26:46.132Z] [echo] Ant version is Apache Ant(TM) version 1.10.15 compiled on August 25 2024
[2025-04-09T21:26:46.132Z] [echo] ============COMPILER SETTINGS============
[2025-04-09T21:26:46.132Z] [echo] ===fork: yes
[2025-04-09T21:26:46.132Z] [echo] ===debug: on
[2025-04-09T21:26:46.133Z] [javac] Compiling 28 source files to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/bin
[2025-04-09T21:26:48.108Z] [javac] Note: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/src/org/openj9/envInfo/MachineInfo.java uses or overrides a deprecated API.
[2025-04-09T21:26:48.108Z] [javac] Note: Recompile with -Xlint:deprecation for details.
[2025-04-09T21:26:48.108Z]
[2025-04-09T21:26:48.108Z] dist:
[2025-04-09T21:26:48.108Z] [jar] Building jar: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/bin/TestKitGen.jar
[2025-04-09T21:26:48.108Z]
[2025-04-09T21:26:48.108Z] BUILD SUCCESSFUL
[2025-04-09T21:26:48.108Z] Total time: 2 seconds
[2025-04-09T21:26:48.108Z] 0
[2025-04-09T21:26:48.108Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image/bin/java -cp ./bin/TestKitGen.jar org.openj9.envInfo.EnvDetector
[2025-04-09T21:26:48.108Z] Unfamiliar microArch detected in TKG. It will not be added in TKG microArch!
[2025-04-09T21:26:48.108Z] microArchOutput:
[2025-04-09T21:26:49.069Z] ****************************** MACHINE INFO ******************************
[2025-04-09T21:26:49.069Z] File path : /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG
[2025-04-09T21:26:49.069Z] Free Physical Memory Size : 7864139776
[2025-04-09T21:26:49.069Z] Free space (bytes) : 51966382080
[2025-04-09T21:26:49.069Z] Total Physical Memory Size : 16739598336
[2025-04-09T21:26:49.069Z] Total space (bytes) : 83902935040
[2025-04-09T21:26:49.069Z] Usable space (bytes) : 48489472000
[2025-04-09T21:26:49.069Z] antVersion : Apache Ant(TM) version 1.10.15 compiled on August 25 2024
[2025-04-09T21:26:49.070Z] bashVersion :
[2025-04-09T21:26:49.070Z] GNU bash, version 5.1.16(1)-release (aarch64-unknown-linux-gnu)
[2025-04-09T21:26:49.070Z] Copyright (C) 2020 Free Software Foundation, Inc.
[2025-04-09T21:26:49.070Z] License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
[2025-04-09T21:26:49.070Z]
[2025-04-09T21:26:49.070Z] This is free software; you are free to change and redistribute it.
[2025-04-09T21:26:49.070Z] There is NO WARRANTY, to the extent permitted by law.
[2025-04-09T21:26:49.070Z] cpuCores : 8
[2025-04-09T21:26:49.070Z] curlVersion :
[2025-04-09T21:26:49.070Z] curl 7.81.0 (aarch64-unknown-linux-gnu) libcurl/7.81.0 OpenSSL/3.0.2 zlib/1.2.11 brotli/1.0.9 zstd/1.4.8 libidn2/2.3.2 libpsl/0.21.0 (+libidn2/2.3.2) libssh/0.9.6/openssl/zlib nghttp2/1.43.0 librtmp/2.3 OpenLDAP/2.5.18
[2025-04-09T21:26:49.070Z] Release-Date: 2022-01-05
[2025-04-09T21:26:49.070Z] Protocols: dict file ftp ftps gopher gophers http https imap imaps ldap ldaps mqtt pop3 pop3s rtmp rtsp scp sftp smb smbs smtp smtps telnet tftp
[2025-04-09T21:26:49.070Z] Features: alt-svc AsynchDNS brotli GSS-API HSTS HTTP2 HTTPS-proxy IDN IPv6 Kerberos Largefile libz NTLM NTLM_WB PSL SPNEGO SSL TLS-SRP UnixSockets zstd
[2025-04-09T21:26:49.070Z] dockerVersion : Docker version 20.10.23, build 7155243
[2025-04-09T21:26:49.070Z] gcc version : 11
[2025-04-09T21:26:49.070Z] gclibc version : ldd (Ubuntu GLIBC 2.35-0ubuntu3.9) 2.35
[2025-04-09T21:26:49.070Z] gdb version : bash: line 1: gdb: command not found
[2025-04-09T21:26:49.070Z] lldb version : Command could not be executed
[2025-04-09T21:26:49.070Z] makeVersion :
[2025-04-09T21:26:49.070Z] GNU Make 4.3
[2025-04-09T21:26:49.070Z] Built for aarch64-unknown-linux-gnu
[2025-04-09T21:26:49.070Z] Copyright (C) 1988-2020 Free Software Foundation, Inc.
[2025-04-09T21:26:49.070Z] License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
[2025-04-09T21:26:49.070Z] This is free software: you are free to change and redistribute it.
[2025-04-09T21:26:49.070Z] There is NO WARRANTY, to the extent permitted by law.
[2025-04-09T21:26:49.070Z] mavenVersion :
[2025-04-09T21:26:49.070Z] Apache Maven 3.6.3 (cecedd343002696d0abb50b32b541b8a6ba2883f)
[2025-04-09T21:26:49.070Z] Maven home: /usr/local/apache-maven-3.6.3
[2025-04-09T21:26:49.070Z] Java version: 17.0.15, vendor: Eclipse Adoptium, runtime: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image
[2025-04-09T21:26:49.070Z] Default locale: en_US, platform encoding: UTF-8
[2025-04-09T21:26:49.070Z] OS name: "linux", version: "5.15.0-133-generic", arch: "aarch64", family: "unix"
[2025-04-09T21:26:49.070Z] osInfo :
[2025-04-09T21:26:49.070Z] PRETTY_NAME="Ubuntu 22.04.5 LTS"
[2025-04-09T21:26:49.070Z] NAME="Ubuntu"
[2025-04-09T21:26:49.070Z] VERSION_ID="22.04"
[2025-04-09T21:26:49.070Z] VERSION="22.04.5 LTS (Jammy Jellyfish)"
[2025-04-09T21:26:49.070Z] VERSION_CODENAME=jammy
[2025-04-09T21:26:49.070Z] ID=ubuntu
[2025-04-09T21:26:49.070Z] ID_LIKE=debian
[2025-04-09T21:26:49.070Z] HOME_URL="https://www.ubuntu.com/"
[2025-04-09T21:26:49.070Z] SUPPORT_URL="https://help.ubuntu.com/"
[2025-04-09T21:26:49.070Z] BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
[2025-04-09T21:26:49.070Z] PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
[2025-04-09T21:26:49.070Z] UBUNTU_CODENAME=jammy
[2025-04-09T21:26:49.071Z] osLabel : ubuntu.22
[2025-04-09T21:26:49.071Z] perlVersion :
[2025-04-09T21:26:49.071Z]
[2025-04-09T21:26:49.071Z] This is perl 5, version 34, subversion 0 (v5.34.0) built for aarch64-linux-gnu-thread-multi
[2025-04-09T21:26:49.071Z] (with 60 registered patches, see perl -V for more detail)
[2025-04-09T21:26:49.071Z]
[2025-04-09T21:26:49.071Z] Copyright 1987-2021, Larry Wall
[2025-04-09T21:26:49.071Z]
[2025-04-09T21:26:49.071Z] Perl may be copied only under the terms of either the Artistic License or the
[2025-04-09T21:26:49.071Z] GNU General Public License, which may be found in the Perl 5 source kit.
[2025-04-09T21:26:49.071Z]
[2025-04-09T21:26:49.071Z] Complete documentation for Perl, including FAQ lists, should be found on
[2025-04-09T21:26:49.071Z] this system using "man perl" or "perldoc perl". If you have access to the
[2025-04-09T21:26:49.071Z] Internet, point your browser at http://www.perl.org/, the Perl Home Page.
[2025-04-09T21:26:49.071Z]
[2025-04-09T21:26:49.071Z] podmanVersion : bash: line 1: podman: command not found
[2025-04-09T21:26:49.071Z] procArch : aarch64
[2025-04-09T21:26:49.071Z] sysArch : aarch64
[2025-04-09T21:26:49.071Z] sysOS : Linux
[2025-04-09T21:26:49.071Z] ulimit :
[2025-04-09T21:26:49.071Z] real-time non-blocking time (microseconds, -R) unlimited
[2025-04-09T21:26:49.071Z] core file size (blocks, -c) 0
[2025-04-09T21:26:49.071Z] data seg size (kbytes, -d) unlimited
[2025-04-09T21:26:49.071Z] scheduling priority (-e) 0
[2025-04-09T21:26:49.071Z] file size (blocks, -f) unlimited
[2025-04-09T21:26:49.071Z] pending signals (-i) 63441
[2025-04-09T21:26:49.071Z] max locked memory (kbytes, -l) 2043408
[2025-04-09T21:26:49.071Z] max memory size (kbytes, -m) unlimited
[2025-04-09T21:26:49.071Z] open files (-n) 1048576
[2025-04-09T21:26:49.071Z] pipe size (512 bytes, -p) 8
[2025-04-09T21:26:49.071Z] POSIX message queues (bytes, -q) 819200
[2025-04-09T21:26:49.071Z] real-time priority (-r) 0
[2025-04-09T21:26:49.071Z] stack size (kbytes, -s) 8192
[2025-04-09T21:26:49.071Z] cpu time (seconds, -t) unlimited
[2025-04-09T21:26:49.071Z] max user processes (-u) 63441
[2025-04-09T21:26:49.071Z] virtual memory (kbytes, -v) unlimited
[2025-04-09T21:26:49.071Z] file locks (-x) unlimited
[2025-04-09T21:26:49.071Z] uname : Linux test-osuosl-ubuntu2204-aarch64-1 5.15.0-133-generic #144-Ubuntu SMP Sat Feb 8 14:13:21 UTC 2025 aarch64 aarch64 aarch64 GNU/Linux
[2025-04-09T21:26:49.071Z] vmVendor : Oracle Corporation
[2025-04-09T21:26:49.071Z] vmVersion : 17.0.15+5
[2025-04-09T21:26:49.071Z] xlc version : bash: line 1: xlC: command not found
[2025-04-09T21:26:49.071Z] **************************************************************************
[2025-04-09T21:26:49.071Z]
[2025-04-09T21:26:49.071Z] System.getProperty('java.vm.name')=OpenJDK 64-Bit Server VM
[2025-04-09T21:26:49.071Z]
[2025-04-09T21:26:49.071Z] System.getProperty('java.vendor')=Eclipse Adoptium
[2025-04-09T21:26:49.071Z]
[2025-04-09T21:26:49.071Z] System.getProperty('os.name')=Linux
[2025-04-09T21:26:49.071Z]
[2025-04-09T21:26:49.071Z] System.getProperty('os.arch')=aarch64
[2025-04-09T21:26:49.071Z]
[2025-04-09T21:26:49.071Z] System.getProperty('java.fullversion')=null
[2025-04-09T21:26:49.071Z]
[2025-04-09T21:26:49.071Z] System.getProperty('sun.arch.data.model')=64
[2025-04-09T21:26:49.071Z]
[2025-04-09T21:26:49.071Z] make -f clean.mk cleanBuild
[2025-04-09T21:26:49.071Z] make[1]: Entering directory '/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG'
[2025-04-09T21:26:49.072Z] rm -f -r /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../../jvmtest
[2025-04-09T21:26:49.072Z] make[1]: Leaving directory '/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG'
[2025-04-09T21:26:49.072Z] make -f compile.mk compile
[2025-04-09T21:26:49.072Z] make[1]: Entering directory '/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG'
[2025-04-09T21:26:49.072Z] rm -f -r /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_compilation; \
[2025-04-09T21:26:49.072Z] mkdir -p /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_compilation; \
[2025-04-09T21:26:49.072Z] (ant -f scripts/build_test.xml "-DTEST_ROOT=/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/.." "-DBUILD_ROOT=/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../../jvmtest" "-DJDK_VERSION=17" "-DJDK_IMPL=hotspot" "-DJDK_VENDOR=eclipse" "-DJCL_VERSION=latest" "-DBUILD_LIST=perf" "-DRESOURCES_DIR=/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../../jvmtest/TestConfig/resources" "-DSPEC=linux_aarch64" "-DTEST_JDK_HOME=/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image" "-DJVM_VERSION=openjdk17" "-DLIB_DIR=/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib" 2>&1; echo $? ) | tee "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_compilation/compilation.log"; \
[2025-04-09T21:26:49.073Z] if [ -z "$(tail -1 /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_compilation/compilation.log | grep 0)" ]; then perl scripts/moveDmp.pl --compileLogPath="/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_compilation/compilation.log" --testRoot="/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/.."; false; else rm -f -r "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_compilation"; fi
[2025-04-09T21:26:50.734Z] Buildfile: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/scripts/build_test.xml
[2025-04-09T21:26:50.734Z] [echo] build.list is TestConfig/build.xml,perf/build.xml
[2025-04-09T21:26:50.734Z]
[2025-04-09T21:26:50.734Z] -create_test_directory:
[2025-04-09T21:26:50.734Z] [mkdir] Created dir: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jvmtest
[2025-04-09T21:26:50.734Z]
[2025-04-09T21:26:50.734Z] stage_test_material:
[2025-04-09T21:26:50.734Z] [copy] Copying 586 files to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jvmtest
[2025-04-09T21:26:50.734Z]
[2025-04-09T21:26:50.734Z] build:
[2025-04-09T21:26:50.734Z]
[2025-04-09T21:26:50.734Z] init:
[2025-04-09T21:26:50.734Z] [mkdir] Created dir: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jvmtest/perf
[2025-04-09T21:26:50.734Z]
[2025-04-09T21:26:50.734Z] dist:
[2025-04-09T21:26:50.734Z] [copy] Copying 1 file to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jvmtest/perf
[2025-04-09T21:26:50.734Z]
[2025-04-09T21:26:50.734Z] build:
[2025-04-09T21:26:50.734Z]
[2025-04-09T21:26:50.734Z] build:
[2025-04-09T21:26:50.734Z]
[2025-04-09T21:26:50.734Z] init:
[2025-04-09T21:26:50.734Z] [mkdir] Created dir: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jvmtest/perf/bumbleBench
[2025-04-09T21:26:50.734Z]
[2025-04-09T21:26:50.734Z] getBumbleBench:
[2025-04-09T21:26:50.734Z] [echo] Cloning BumbleBench
[2025-04-09T21:26:50.734Z] [echo] git clone --depth 1 -b master https://github.com/adoptium/bumblebench.git
[2025-04-09T21:26:50.734Z] [exec] Cloning into 'bumblebench'...
[2025-04-09T21:26:51.696Z]
[2025-04-09T21:26:51.696Z] buildBumbleBench:
[2025-04-09T21:26:51.696Z] [echo] Building BumbleBench
[2025-04-09T21:26:51.696Z]
[2025-04-09T21:26:51.696Z] compile-all:
[2025-04-09T21:26:51.696Z] [javac] Compiling 8 source files to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/bumbleBench/bumblebench
[2025-04-09T21:26:51.696Z] [javac] warning: [options] bootstrap class path not set in conjunction with -source 8
[2025-04-09T21:26:51.696Z] [javac] Note: Some input files use or override a deprecated API.
[2025-04-09T21:26:51.696Z] [javac] Note: Recompile with -Xlint:deprecation for details.
[2025-04-09T21:26:51.696Z] [javac] Note: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/bumbleBench/bumblebench/net/adoptopenjdk/bumblebench/core/Launcher.java uses unchecked or unsafe operations.
[2025-04-09T21:26:51.696Z] [javac] Note: Recompile with -Xlint:unchecked for details.
[2025-04-09T21:26:51.696Z] [javac] 1 warning
[2025-04-09T21:26:51.696Z] [javac] Compiling 1 source file to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/bumbleBench/bumblebench
[2025-04-09T21:26:51.696Z] [javac] warning: [options] bootstrap class path not set in conjunction with -source 8
[2025-04-09T21:26:51.696Z] [javac] 1 warning
[2025-04-09T21:26:51.696Z] [javac] Compiling 11 source files to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/bumbleBench/bumblebench
[2025-04-09T21:26:51.696Z] [javac] warning: [options] bootstrap class path not set in conjunction with -source 8
[2025-04-09T21:26:51.696Z] [javac] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/bumbleBench/bumblebench/net/adoptopenjdk/bumblebench/collections/CollectionsBench.java:29: warning: [removal] Integer(int) in Integer has been deprecated and marked for removal
[2025-04-09T21:26:51.696Z] [javac] result.add(new Integer(i));
[2025-04-09T21:26:51.696Z] [javac] ^
[2025-04-09T21:26:51.697Z] [javac] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/bumbleBench/bumblebench/net/adoptopenjdk/bumblebench/collections/CollectionsBench.java:37: warning: [removal] Integer(int) in Integer has been deprecated and marked for removal
[2025-04-09T21:26:51.697Z] [javac] result.put(new Integer(i), new Integer(i));
[2025-04-09T21:26:51.697Z] [javac] ^
[2025-04-09T21:26:51.697Z] [javac] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/bumbleBench/bumblebench/net/adoptopenjdk/bumblebench/collections/CollectionsBench.java:37: warning: [removal] Integer(int) in Integer has been deprecated and marked for removal
[2025-04-09T21:26:51.697Z] [javac] result.put(new Integer(i), new Integer(i));
[2025-04-09T21:26:51.697Z] [javac] ^
[2025-04-09T21:26:52.657Z] [javac] 4 warnings
[2025-04-09T21:26:52.657Z] [javac] Compiling 5 source files to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/bumbleBench/bumblebench
[2025-04-09T21:26:52.657Z] [javac] warning: [options] bootstrap class path not set in conjunction with -source 8
[2025-04-09T21:26:52.657Z] [javac] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/bumbleBench/bumblebench/net/adoptopenjdk/bumblebench/collections/HashMapReplaceAllLambdaBench.java:29: warning: [removal] Integer(int) in Integer has been deprecated and marked for removal
[2025-04-09T21:26:52.657Z] [javac] hashMap.replaceAll((k, v) -> new Integer(k + v));
[2025-04-09T21:26:52.657Z] [javac] ^
[2025-04-09T21:26:52.657Z] [javac] 2 warnings
[2025-04-09T21:26:52.657Z] [javac] Compiling 9 source files to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/bumbleBench/bumblebench
[2025-04-09T21:26:52.657Z] [javac] warning: [options] bootstrap class path not set in conjunction with -source 8
[2025-04-09T21:26:52.657Z] [javac] Note: Some input files use or override a deprecated API.
[2025-04-09T21:26:52.657Z] [javac] Note: Recompile with -Xlint:deprecation for details.
[2025-04-09T21:26:52.657Z] [javac] 1 warning
[2025-04-09T21:26:52.658Z] [javac] Compiling 1 source file to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/bumbleBench/bumblebench
[2025-04-09T21:26:52.658Z] [javac] warning: [options] bootstrap class path not set in conjunction with -source 8
[2025-04-09T21:26:52.658Z] [javac] 1 warning
[2025-04-09T21:26:52.658Z] [javac] Compiling 2 source files to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/bumbleBench/bumblebench
[2025-04-09T21:26:52.658Z] [javac] warning: [options] bootstrap class path not set in conjunction with -source 8
[2025-04-09T21:26:52.658Z] [javac] 1 warning
[2025-04-09T21:26:52.658Z] [javac] Compiling 8 source files to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/bumbleBench/bumblebench
[2025-04-09T21:26:52.658Z] [javac] warning: [options] bootstrap class path not set in conjunction with -source 8
[2025-04-09T21:26:52.658Z] [javac] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/bumbleBench/bumblebench/net/adoptopenjdk/bumblebench/examples/LockContention.java:28: warning: [synchronization] attempt to synchronize on an instance of a value-based class
[2025-04-09T21:26:52.658Z] [javac] synchronized(init) {
[2025-04-09T21:26:52.658Z] [javac] ^
[2025-04-09T21:26:52.658Z] [javac] 2 warnings
[2025-04-09T21:26:52.658Z] [javac] Compiling 6 source files to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/bumbleBench/bumblebench
[2025-04-09T21:26:52.658Z] [javac] warning: [options] bootstrap class path not set in conjunction with -source 8
[2025-04-09T21:26:52.658Z] [javac] 1 warning
[2025-04-09T21:26:52.658Z] [javac] Compiling 5 source files to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/bumbleBench/bumblebench
[2025-04-09T21:26:52.658Z] [javac] warning: [options] bootstrap class path not set in conjunction with -source 8
[2025-04-09T21:26:52.658Z] [javac] 1 warning
[2025-04-09T21:26:52.658Z] [javac] Compiling 7 source files to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/bumbleBench/bumblebench
[2025-04-09T21:26:52.658Z] [javac] warning: [options] bootstrap class path not set in conjunction with -source 8
[2025-04-09T21:26:53.619Z] [javac] 1 warning
[2025-04-09T21:26:53.620Z] [javac] Compiling 8 source files to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/bumbleBench/bumblebench
[2025-04-09T21:26:53.620Z] [javac] warning: [options] bootstrap class path not set in conjunction with -source 8
[2025-04-09T21:26:53.620Z] [javac] 1 warning
[2025-04-09T21:26:53.620Z] [javac] Compiling 2 source files to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/bumbleBench/bumblebench
[2025-04-09T21:26:53.620Z] [javac] warning: [options] bootstrap class path not set in conjunction with -source 8
[2025-04-09T21:26:53.620Z] [javac] 1 warning
[2025-04-09T21:26:53.620Z] [javac] Compiling 4 source files to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/bumbleBench/bumblebench
[2025-04-09T21:26:53.620Z] [javac] warning: [options] bootstrap class path not set in conjunction with -source 8
[2025-04-09T21:26:53.620Z] [javac] 1 warning
[2025-04-09T21:26:53.620Z] [javac] Compiling 5 source files to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/bumbleBench/bumblebench
[2025-04-09T21:26:53.620Z] [javac] warning: [options] bootstrap class path not set in conjunction with -source 8
[2025-04-09T21:26:53.620Z] [javac] 1 warning
[2025-04-09T21:26:53.620Z]
[2025-04-09T21:26:53.620Z] dist:
[2025-04-09T21:26:53.620Z] [jar] Building jar: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/bumbleBench/bumblebench/BumbleBench.jar
[2025-04-09T21:26:53.620Z]
[2025-04-09T21:26:53.620Z] dist:
[2025-04-09T21:26:53.620Z] [copy] Copying 126 files to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jvmtest/perf/bumbleBench
[2025-04-09T21:26:53.620Z]
[2025-04-09T21:26:53.620Z] build:
[2025-04-09T21:26:53.620Z]
[2025-04-09T21:26:53.620Z] init:
[2025-04-09T21:26:53.620Z] [mkdir] Created dir: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jvmtest/perf/dacapo
[2025-04-09T21:26:53.620Z]
[2025-04-09T21:26:53.620Z] getDacapoSuite:
[2025-04-09T21:26:53.620Z] [echo] curl -Lks -C - https://sourceforge.net/projects/dacapobench/files/latest/download -o dacapo.jar
[2025-04-09T21:27:05.193Z]
[2025-04-09T21:27:05.193Z] dist:
[2025-04-09T21:27:05.193Z] [copy] Copying 3 files to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jvmtest/perf/dacapo
[2025-04-09T21:27:05.193Z]
[2025-04-09T21:27:05.193Z] build:
[2025-04-09T21:27:05.193Z]
[2025-04-09T21:27:05.193Z] init:
[2025-04-09T21:27:05.194Z] [mkdir] Created dir: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jvmtest/perf/idle_micro
[2025-04-09T21:27:05.194Z] [mkdir] Created dir: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/idle_micro/bin
[2025-04-09T21:27:05.194Z]
[2025-04-09T21:27:05.194Z] compile:
[2025-04-09T21:27:05.194Z] [echo] Ant version is Apache Ant(TM) version 1.10.15 compiled on August 25 2024
[2025-04-09T21:27:05.194Z] [echo] ============COMPILER SETTINGS============
[2025-04-09T21:27:05.194Z] [echo] ===fork: yes
[2025-04-09T21:27:05.194Z] [echo] ===executable: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image/bin/javac
[2025-04-09T21:27:05.194Z] [echo] ===debug: on
[2025-04-09T21:27:05.194Z] [echo] ===destdir: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../../jvmtest/perf/idle_micro
[2025-04-09T21:27:05.194Z] [javac] Compiling 3 source files to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/idle_micro/bin
[2025-04-09T21:27:06.156Z]
[2025-04-09T21:27:06.156Z] dist:
[2025-04-09T21:27:06.156Z] [jar] Building jar: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jvmtest/perf/idle_micro/IdleMicrobenchmark.jar
[2025-04-09T21:27:06.156Z] [copy] Copying 2 files to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jvmtest/perf/idle_micro
[2025-04-09T21:27:06.156Z] [copy] Copying 2 files to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jvmtest/perf/idle_micro
[2025-04-09T21:27:06.156Z] [copy] Copying 7 files to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jvmtest/perf/idle_micro
[2025-04-09T21:27:06.156Z]
[2025-04-09T21:27:06.156Z] clean:
[2025-04-09T21:27:06.156Z] [delete] Deleting directory /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/idle_micro/bin
[2025-04-09T21:27:06.156Z]
[2025-04-09T21:27:06.156Z] build:
[2025-04-09T21:27:06.156Z]
[2025-04-09T21:27:06.156Z] build:
[2025-04-09T21:27:06.156Z]
[2025-04-09T21:27:06.156Z] init:
[2025-04-09T21:27:06.157Z] [mkdir] Created dir: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jvmtest/perf/renaissance
[2025-04-09T21:27:06.157Z]
[2025-04-09T21:27:06.157Z] getRenaissanceSuite:
[2025-04-09T21:27:06.157Z] [echo] curl -Lks -C - https://github.com/renaissance-benchmarks/renaissance/releases/download/v0.16.0/renaissance-mit-0.16.0.jar -o renaissance.jar
[2025-04-09T21:27:10.659Z]
[2025-04-09T21:27:10.659Z] dist:
[2025-04-09T21:27:10.659Z] [copy] Copying 3 files to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jvmtest/perf/renaissance
[2025-04-09T21:27:10.659Z]
[2025-04-09T21:27:10.659Z] build:
[2025-04-09T21:27:10.659Z]
[2025-04-09T21:27:10.659Z] init:
[2025-04-09T21:27:10.659Z] [mkdir] Created dir: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jvmtest/perf/specjbb
[2025-04-09T21:27:10.659Z]
[2025-04-09T21:27:10.659Z] getSpecjbbSuite:
[2025-04-09T21:27:10.659Z]
[2025-04-09T21:27:10.659Z] dist:
[2025-04-09T21:27:10.659Z] [copy] Copying 3 files to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jvmtest/perf/specjbb/run
[2025-04-09T21:27:10.659Z]
[2025-04-09T21:27:10.660Z] BUILD SUCCESSFUL
[2025-04-09T21:27:10.660Z] Total time: 21 seconds
[2025-04-09T21:27:10.660Z] 0
[2025-04-09T21:27:10.660Z]
[2025-04-09T21:27:10.660Z]
[2025-04-09T21:27:10.660Z] RECORD TEST REPOs SHA
[2025-04-09T21:27:10.660Z] cd "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/scripts"; \
[2025-04-09T21:27:10.660Z] bash "getSHAs.sh" --test_root_dir "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/.." --shas_file "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/SHAs.txt"
[2025-04-09T21:27:10.660Z] Check shas in /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/.. and store the info in /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/SHAs.txt
[2025-04-09T21:27:10.660Z] touch /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/SHAs.txt
[2025-04-09T21:27:10.660Z] ================================================
[2025-04-09T21:27:10.660Z] timestamp: 20250409-212710
[2025-04-09T21:27:10.660Z] repo dir: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/bumbleBench/bumblebench
[2025-04-09T21:27:10.660Z] git repo:
[2025-04-09T21:27:10.660Z] Fetch URL: https://github.com/adoptium/bumblebench.git
[2025-04-09T21:27:10.660Z] sha:
[2025-04-09T21:27:10.660Z] 4ec780b716e7617094cce740d3b085ca4637ee0f
[2025-04-09T21:27:10.660Z] ================================================
[2025-04-09T21:27:10.660Z] timestamp: 20250409-212710
[2025-04-09T21:27:10.660Z] repo dir: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests
[2025-04-09T21:27:10.660Z] git repo:
[2025-04-09T21:27:10.660Z] Fetch URL: https://github.com/adoptium/aqa-tests.git
[2025-04-09T21:27:10.660Z] sha:
[2025-04-09T21:27:10.660Z] 5a57a73acca15191cbf8bba592670458bc380962
[2025-04-09T21:27:10.660Z] ================================================
[2025-04-09T21:27:10.660Z] timestamp: 20250409-212710
[2025-04-09T21:27:10.660Z] repo dir: /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG
[2025-04-09T21:27:10.660Z] git repo:
[2025-04-09T21:27:10.660Z] Fetch URL: https://github.com/adoptium/TKG.git
[2025-04-09T21:27:10.660Z] sha:
[2025-04-09T21:27:10.660Z] 9db2f4162fc930a8a6980c685c5a7466cda63ee9
[2025-04-09T21:27:10.660Z] make[1]: Leaving directory '/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG'
[Pipeline] }
[2025-04-09T21:27:11.279Z] $ ssh-agent -k
[2025-04-09T21:27:11.284Z] unset SSH_AUTH_SOCK;
[2025-04-09T21:27:11.284Z] unset SSH_AGENT_PID;
[2025-04-09T21:27:11.284Z] echo Agent pid 1758579 killed;
[2025-04-09T21:27:12.088Z] [ssh-agent] Stopped.
[Pipeline] // sshagent
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Test)
[Pipeline] echo
[2025-04-09T21:27:12.223Z] Running tests...
[Pipeline] echo
[2025-04-09T21:27:12.248Z] ITERATION: 1/1
[Pipeline] wrap
[2025-04-09T21:27:12.696Z] $ Xvfb -displayfd 2 -screen 0 1024x768x24 -fbdir /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/.xvfb-220-..fbdir17092142111288158292
[Pipeline] {
[Pipeline] sh
[2025-04-09T21:27:35.198Z] + ps -f
[2025-04-09T21:27:35.199Z] + awk {print $9}
[2025-04-09T21:27:35.199Z] + grep [X]vfb
[Pipeline] echo
[2025-04-09T21:27:35.938Z] env.DISPLAY is :0
[Pipeline] sh
[2025-04-09T21:27:37.819Z] + uname
[2025-04-09T21:27:37.819Z] + [ Linux = AIX ]
[2025-04-09T21:27:37.819Z] + uname
[2025-04-09T21:27:37.819Z] + [ Linux = SunOS ]
[2025-04-09T21:27:37.819Z] + uname
[2025-04-09T21:27:37.819Z] + [ Linux = *BSD ]
[2025-04-09T21:27:37.819Z] + MAKE=make
[2025-04-09T21:27:37.819Z] + make -f ./aqa-tests/TKG/testEnv.mk testEnvTeardown
[2025-04-09T21:27:37.819Z] make: Nothing to be done for 'testEnvTeardown'.
[Pipeline] sh
[2025-04-09T21:27:40.219Z] + uname
[2025-04-09T21:27:40.219Z] + [ Linux = AIX ]
[2025-04-09T21:27:40.219Z] + uname
[2025-04-09T21:27:40.219Z] + [ Linux = SunOS ]
[2025-04-09T21:27:40.219Z] + uname
[2025-04-09T21:27:40.219Z] + [ Linux = *BSD ]
[2025-04-09T21:27:40.219Z] + MAKE=make
[2025-04-09T21:27:40.219Z] + cd ./aqa-tests
[2025-04-09T21:27:40.219Z] + . ./scripts/testenv/testenvSettings.sh
[2025-04-09T21:27:40.219Z] + set +x
[2025-04-09T21:27:40.219Z] Set values based on ./testenv/testenv.properties:
[2025-04-09T21:27:40.219Z] =========
[2025-04-09T21:27:40.219Z] TKG_REPO=https://github.com/adoptium/TKG.git
[2025-04-09T21:27:40.219Z] TKG_BRANCH=v1.0.7
[2025-04-09T21:27:40.219Z] OPENJ9_REPO=https://github.com/eclipse-openj9/openj9.git
[2025-04-09T21:27:40.219Z] OPENJ9_BRANCH=v0.51.0-release
[2025-04-09T21:27:40.219Z] STF_REPO=https://github.com/adoptium/STF.git
[2025-04-09T21:27:40.219Z] STF_BRANCH=v1.0.7
[2025-04-09T21:27:40.219Z] OPENJ9_SYSTEMTEST_REPO=https://github.com/adoptium/openj9-systemtest.git
[2025-04-09T21:27:40.219Z] OPENJ9_SYSTEMTEST_BRANCH=v1.0.7
[2025-04-09T21:27:40.219Z] ADOPTOPENJDK_SYSTEMTEST_REPO=https://github.com/adoptium/aqa-systemtest.git
[2025-04-09T21:27:40.219Z] ADOPTOPENJDK_SYSTEMTEST_BRANCH=v1.0.7
[2025-04-09T21:27:40.219Z] JDK8_REPO=https://github.com/adoptium/jdk8u.git
[2025-04-09T21:27:40.219Z] JDK8_BRANCH=jdk8u452-dryrun-ga
[2025-04-09T21:27:40.219Z] JDK11_REPO=https://github.com/adoptium/jdk11u.git
[2025-04-09T21:27:40.219Z] JDK11_BRANCH=jdk-11.0.27-dryrun-ga
[2025-04-09T21:27:40.219Z] JDK17_REPO=https://github.com/adoptium/jdk17u.git
[2025-04-09T21:27:40.219Z] JDK17_BRANCH=jdk-17.0.15-dryrun-ga
[2025-04-09T21:27:40.219Z] JDK21_REPO=https://github.com/adoptium/jdk21u.git
[2025-04-09T21:27:40.219Z] JDK21_BRANCH=jdk-21.0.7-dryrun-ga
[2025-04-09T21:27:40.219Z] JDK24_REPO=https://github.com/adoptium/jdk24u.git
[2025-04-09T21:27:40.219Z] JDK24_BRANCH=jdk-24.0.1-dryrun-ga
[2025-04-09T21:27:40.219Z] JDK8_OPENJ9_REPO=https://github.com/ibmruntimes/openj9-openjdk-jdk8.git
[2025-04-09T21:27:40.219Z] JDK8_OPENJ9_BRANCH=v0.51.0-release
[2025-04-09T21:27:40.219Z] JDK11_OPENJ9_REPO=https://github.com/ibmruntimes/openj9-openjdk-jdk11.git
[2025-04-09T21:27:40.219Z] JDK11_OPENJ9_BRANCH=v0.51.0-release
[2025-04-09T21:27:40.219Z] JDK17_OPENJ9_REPO=https://github.com/ibmruntimes/openj9-openjdk-jdk17.git
[2025-04-09T21:27:40.219Z] JDK17_OPENJ9_BRANCH=v0.51.0-release
[2025-04-09T21:27:40.219Z] JDK21_OPENJ9_REPO=https://github.com/ibmruntimes/openj9-openjdk-jdk21.git
[2025-04-09T21:27:40.219Z] JDK21_OPENJ9_BRANCH=v0.51.0-release
[2025-04-09T21:27:40.219Z] JDK24_OPENJ9_REPO=https://github.com/ibmruntimes/openj9-openjdk-jdk24.git
[2025-04-09T21:27:40.219Z] JDK24_OPENJ9_BRANCH=v0.51.0-release
[2025-04-09T21:27:40.219Z] JDK11_OPENJCEPLUS_GIT_REPO=https://github.com/ibmruntimes/OpenJCEPlus.git
[2025-04-09T21:27:40.219Z] JDK11_OPENJCEPLUS_GIT_BRANCH=semeru-java-11.0.27
[2025-04-09T21:27:40.219Z] JDK17_OPENJCEPLUS_GIT_REPO=https://github.com/ibmruntimes/OpenJCEPlus.git
[2025-04-09T21:27:40.219Z] JDK17_OPENJCEPLUS_GIT_BRANCH=semeru-java-17.0.15
[2025-04-09T21:27:40.219Z] JDK21_OPENJCEPLUS_GIT_REPO=https://github.com/ibmruntimes/OpenJCEPlus.git
[2025-04-09T21:27:40.219Z] JDK21_OPENJCEPLUS_GIT_BRANCH=semeru-java-21.0.7
[2025-04-09T21:27:40.219Z] JDK24_OPENJCEPLUS_GIT_REPO=https://github.com/ibmruntimes/OpenJCEPlus.git
[2025-04-09T21:27:40.219Z] JDK24_OPENJCEPLUS_GIT_BRANCH=semeru-java-24.0.1
[2025-04-09T21:27:40.219Z] AQA_REQUIRED_TARGETS=sanity.functional,extended.functional,special.functional,sanity.openjdk,extended.openjdk,sanity.system,extended.system,sanity.perf,extended.perf
[2025-04-09T21:27:40.219Z]
[2025-04-09T21:27:40.219Z] =========
[2025-04-09T21:27:40.219Z]
[2025-04-09T21:27:40.219Z] JDK_REPO=https://github.com/adoptium/jdk17u.git
[2025-04-09T21:27:40.219Z] JDK_BRANCH=jdk-17.0.15-dryrun-ga
[2025-04-09T21:27:40.219Z] OPENJCEPLUS_GIT_REPO=https://github.com/ibmruntimes/OpenJCEPlus.git
[2025-04-09T21:27:40.219Z] OPENJCEPLUS_GIT_BRANCH=semeru-java17
[2025-04-09T21:27:40.219Z] JAVA_HOME is set to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image
[2025-04-09T21:27:40.219Z] LIB_DIR is set to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib
[2025-04-09T21:27:40.219Z] make -f makeGen.mk AUTO_DETECT=true MODE=tests TESTTARGET=extended.perf TESTLIST=
[2025-04-09T21:27:40.219Z] make[1]: Entering directory '/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG'
[2025-04-09T21:27:40.219Z] perl scripts/configure.pl
[2025-04-09T21:27:40.219Z] /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image/bin/java -cp "./bin/TestKitGen.jar:/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/../../testDependency/lib/json-simple.jar" org.testKitGen.MainRunner --mode=tests --spec=linux_aarch64 --microArch="" --osLabel="ubuntu.22" --jdkVersion=17 --impl=hotspot --vendor="eclipse" --buildList=perf --iterations=1 --aotIterations= --testFlag= --testTarget=extended.perf --testList= --numOfMachines= --testTime=120 --TRSSURL=
[2025-04-09T21:27:40.219Z] Modes data parsed from resources/modes.xml and resources/ottawa.csv.
[2025-04-09T21:27:40.219Z]
[2025-04-09T21:27:40.219Z] Starting to generate test make files.
[2025-04-09T21:27:40.219Z]
[2025-04-09T21:27:40.220Z] Parsing /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../perf/dacapo/playlist.xml
[2025-04-09T21:27:40.220Z] Generated /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../perf/dacapo/autoGen.mk
[2025-04-09T21:27:40.220Z]
[2025-04-09T21:27:40.220Z] Parsing /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../perf/idle_micro/playlist.xml
[2025-04-09T21:27:40.220Z] Parsing /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../perf/liberty/playlist.xml
[2025-04-09T21:27:40.220Z] Parsing /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../perf/specjbb/playlist.xml
[2025-04-09T21:27:40.220Z] Parsing /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../perf/renaissance/playlist.xml
[2025-04-09T21:27:40.220Z] Generated /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../perf/renaissance/autoGen.mk
[2025-04-09T21:27:40.220Z]
[2025-04-09T21:27:40.220Z] Parsing /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../perf/bumbleBench/playlist.xml
[2025-04-09T21:27:40.220Z] Generated /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../perf/autoGen.mk
[2025-04-09T21:27:40.220Z]
[2025-04-09T21:27:40.220Z] Generated /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../autoGen.mk
[2025-04-09T21:27:40.220Z]
[2025-04-09T21:27:40.220Z] Generated /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/utils.mk
[2025-04-09T21:27:40.220Z]
[2025-04-09T21:27:40.220Z] Generated /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/rerun.mk
[2025-04-09T21:27:40.220Z]
[2025-04-09T21:27:40.220Z] Make files are generated successfully.
[2025-04-09T21:27:40.220Z]
[2025-04-09T21:27:40.220Z] make[1]: Leaving directory '/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG'
[2025-04-09T21:27:40.220Z] make -f runtest.mk _extended.perf
[2025-04-09T21:27:40.220Z] make[1]: Entering directory '/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG'
[2025-04-09T21:27:40.220Z]
[2025-04-09T21:27:40.220Z] Running make 4.3
[2025-04-09T21:27:40.220Z] set TEST_ROOT to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/..
[2025-04-09T21:27:40.220Z] set JDK_VERSION to 17
[2025-04-09T21:27:40.220Z] set JDK_IMPL to hotspot
[2025-04-09T21:27:40.220Z] set JVM_VERSION to openjdk17
[2025-04-09T21:27:40.220Z] set JCL_VERSION to latest
[2025-04-09T21:27:40.220Z] set JAVA_HOME to /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image
[2025-04-09T21:27:40.220Z] set SPEC to linux_aarch64
[2025-04-09T21:27:40.220Z] set TEST_FLAG to
[2025-04-09T21:27:40.220Z] set OS_LABEL to ubuntu.22
[2025-04-09T21:27:40.220Z] Running extended.perf ...
[2025-04-09T21:27:40.220Z] There are 20 test targets in extended.perf
[2025-04-09T21:27:40.220Z] "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image/bin/java" -version
[2025-04-09T21:27:40.220Z] openjdk version "17.0.15" 2025-04-15
[2025-04-09T21:27:40.220Z] OpenJDK Runtime Environment Temurin-17.0.15+5 (build 17.0.15+5)
[2025-04-09T21:27:40.220Z] OpenJDK 64-Bit Server VM Temurin-17.0.15+5 (build 17.0.15+5, mixed mode, sharing)
[2025-04-09T21:27:40.220Z] make[2]: Entering directory '/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests'
[2025-04-09T21:27:40.220Z] make[3]: Entering directory '/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf'
[2025-04-09T21:27:40.220Z] make[4]: Entering directory '/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/dacapo'
[2025-04-09T21:27:40.220Z]
[2025-04-09T21:27:40.220Z] ===============================================
[2025-04-09T21:27:40.220Z] Running test dacapo-avrora_0 ...
[2025-04-09T21:27:40.220Z] ===============================================
[2025-04-09T21:27:40.220Z] dacapo-avrora_0 Start Time: Wed Apr 9 21:27:39 2025 Epoch Time (ms): 1744234059928
[2025-04-09T21:27:40.220Z] variation: NoOptions
[2025-04-09T21:27:40.220Z] JVM_OPTIONS:
[2025-04-09T21:27:40.220Z] { \
[2025-04-09T21:27:40.220Z] echo ""; echo "TEST SETUP:"; \
[2025-04-09T21:27:40.220Z] echo "Nothing to be done for setup."; \
[2025-04-09T21:27:40.220Z] mkdir -p "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/dacapo-avrora_0"; \
[2025-04-09T21:27:40.220Z] cd "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/dacapo-avrora_0"; \
[2025-04-09T21:27:40.220Z] echo ""; echo "TESTING:"; \
[2025-04-09T21:27:40.220Z] "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image/bin/java" -jar "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../../jvmtest/perf/dacapo/dacapo.jar" avrora; \
[2025-04-09T21:27:40.220Z] if [ $? -eq 0 ]; then echo "-----------------------------------"; echo "dacapo-avrora_0""_PASSED"; echo "-----------------------------------"; cd /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/..; rm -f -r "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/dacapo-avrora_0"; else echo "-----------------------------------"; echo "dacapo-avrora_0""_FAILED"; echo "-----------------------------------"; fi; \
[2025-04-09T21:27:40.220Z] echo ""; echo "TEST TEARDOWN:"; \
[2025-04-09T21:27:40.220Z] echo "Nothing to be done for teardown."; \
[2025-04-09T21:27:40.220Z] } 2>&1 | tee -a "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/TestTargetResult";
[2025-04-09T21:27:40.220Z]
[2025-04-09T21:27:40.220Z] TEST SETUP:
[2025-04-09T21:27:40.220Z] Nothing to be done for setup.
[2025-04-09T21:27:40.220Z]
[2025-04-09T21:27:40.220Z] TESTING:
[2025-04-09T21:27:40.220Z] ===== DaCapo 9.12-MR1 avrora starting =====
[2025-04-09T21:27:48.412Z] ===== DaCapo 9.12-MR1 avrora PASSED in 8061 msec =====
[2025-04-09T21:27:48.412Z] -----------------------------------
[2025-04-09T21:27:48.412Z] dacapo-avrora_0_PASSED
[2025-04-09T21:27:48.413Z] -----------------------------------
[2025-04-09T21:27:48.413Z]
[2025-04-09T21:27:48.413Z] TEST TEARDOWN:
[2025-04-09T21:27:48.413Z] Nothing to be done for teardown.
[2025-04-09T21:27:48.413Z] dacapo-avrora_0 Finish Time: Wed Apr 9 21:27:48 2025 Epoch Time (ms): 1744234068246
[2025-04-09T21:27:48.413Z]
[2025-04-09T21:27:48.413Z] ===============================================
[2025-04-09T21:27:48.413Z] Running test dacapo-fop_0 ...
[2025-04-09T21:27:48.413Z] ===============================================
[2025-04-09T21:27:48.413Z] dacapo-fop_0 Start Time: Wed Apr 9 21:27:48 2025 Epoch Time (ms): 1744234068260
[2025-04-09T21:27:48.413Z] variation: NoOptions
[2025-04-09T21:27:48.413Z] JVM_OPTIONS:
[2025-04-09T21:27:48.413Z] { \
[2025-04-09T21:27:48.413Z] echo ""; echo "TEST SETUP:"; \
[2025-04-09T21:27:48.413Z] echo "Nothing to be done for setup."; \
[2025-04-09T21:27:48.413Z] mkdir -p "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/dacapo-fop_0"; \
[2025-04-09T21:27:48.413Z] cd "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/dacapo-fop_0"; \
[2025-04-09T21:27:48.413Z] echo ""; echo "TESTING:"; \
[2025-04-09T21:27:48.413Z] "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image/bin/java" -jar "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../../jvmtest/perf/dacapo/dacapo.jar" fop; \
[2025-04-09T21:27:48.413Z] if [ $? -eq 0 ]; then echo "-----------------------------------"; echo "dacapo-fop_0""_PASSED"; echo "-----------------------------------"; cd /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/..; rm -f -r "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/dacapo-fop_0"; else echo "-----------------------------------"; echo "dacapo-fop_0""_FAILED"; echo "-----------------------------------"; fi; \
[2025-04-09T21:27:48.413Z] echo ""; echo "TEST TEARDOWN:"; \
[2025-04-09T21:27:48.413Z] echo "Nothing to be done for teardown."; \
[2025-04-09T21:27:48.413Z] } 2>&1 | tee -a "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/TestTargetResult";
[2025-04-09T21:27:48.413Z]
[2025-04-09T21:27:48.413Z] TEST SETUP:
[2025-04-09T21:27:48.413Z] Nothing to be done for setup.
[2025-04-09T21:27:48.413Z]
[2025-04-09T21:27:48.413Z] TESTING:
[2025-04-09T21:27:49.372Z] ===== DaCapo 9.12-MR1 fop starting =====
[2025-04-09T21:27:51.346Z] ===== DaCapo 9.12-MR1 fop PASSED in 1812 msec =====
[2025-04-09T21:27:51.346Z] -----------------------------------
[2025-04-09T21:27:51.346Z] dacapo-fop_0_PASSED
[2025-04-09T21:27:51.346Z] -----------------------------------
[2025-04-09T21:27:51.346Z]
[2025-04-09T21:27:51.346Z] TEST TEARDOWN:
[2025-04-09T21:27:51.346Z] Nothing to be done for teardown.
[2025-04-09T21:27:51.346Z] dacapo-fop_0 Finish Time: Wed Apr 9 21:27:50 2025 Epoch Time (ms): 1744234070349
[2025-04-09T21:27:51.346Z]
[2025-04-09T21:27:51.346Z] ===============================================
[2025-04-09T21:27:51.346Z] Running test dacapo-jython_0 ...
[2025-04-09T21:27:51.346Z] ===============================================
[2025-04-09T21:27:51.346Z] dacapo-jython_0 Start Time: Wed Apr 9 21:27:50 2025 Epoch Time (ms): 1744234070365
[2025-04-09T21:27:51.346Z] variation: NoOptions
[2025-04-09T21:27:51.346Z] JVM_OPTIONS:
[2025-04-09T21:27:51.346Z] { \
[2025-04-09T21:27:51.346Z] echo ""; echo "TEST SETUP:"; \
[2025-04-09T21:27:51.346Z] echo "Nothing to be done for setup."; \
[2025-04-09T21:27:51.346Z] mkdir -p "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/dacapo-jython_0"; \
[2025-04-09T21:27:51.346Z] cd "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/dacapo-jython_0"; \
[2025-04-09T21:27:51.346Z] echo ""; echo "TESTING:"; \
[2025-04-09T21:27:51.346Z] "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image/bin/java" -jar "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../../jvmtest/perf/dacapo/dacapo.jar" jython; \
[2025-04-09T21:27:51.346Z] if [ $? -eq 0 ]; then echo "-----------------------------------"; echo "dacapo-jython_0""_PASSED"; echo "-----------------------------------"; cd /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/..; rm -f -r "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/dacapo-jython_0"; else echo "-----------------------------------"; echo "dacapo-jython_0""_FAILED"; echo "-----------------------------------"; fi; \
[2025-04-09T21:27:51.346Z] echo ""; echo "TEST TEARDOWN:"; \
[2025-04-09T21:27:51.346Z] echo "Nothing to be done for teardown."; \
[2025-04-09T21:27:51.346Z] } 2>&1 | tee -a "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/TestTargetResult";
[2025-04-09T21:27:51.346Z]
[2025-04-09T21:27:51.346Z] TEST SETUP:
[2025-04-09T21:27:51.346Z] Nothing to be done for setup.
[2025-04-09T21:27:51.346Z]
[2025-04-09T21:27:51.346Z] TESTING:
[2025-04-09T21:27:53.328Z] ===== DaCapo 9.12-MR1 jython starting =====
[2025-04-09T21:27:56.921Z] -------------------------------------------------------------------------------
[2025-04-09T21:27:56.921Z] PYBENCH 2.0
[2025-04-09T21:27:56.921Z] -------------------------------------------------------------------------------
[2025-04-09T21:27:56.921Z] * using Python 2.5.2
[2025-04-09T21:27:56.921Z] * system check interval set to maximum: 2147483647
[2025-04-09T21:27:56.921Z] * using timer: time.time
[2025-04-09T21:27:56.921Z]
[2025-04-09T21:27:56.921Z] Calibrating tests. Please wait...
[2025-04-09T21:27:56.921Z]
[2025-04-09T21:27:56.921Z] Running 1 round(s) of the suite at warp factor 20:
[2025-04-09T21:27:56.921Z]
[2025-04-09T21:27:59.967Z] * Round 1 done in 2.848 seconds.
[2025-04-09T21:27:59.967Z]
[2025-04-09T21:27:59.967Z] -------------------------------------------------------------------------------
[2025-04-09T21:27:59.967Z] Benchmark: 2025-04-09 21:27:55
[2025-04-09T21:27:59.967Z] -------------------------------------------------------------------------------
[2025-04-09T21:27:59.967Z]
[2025-04-09T21:27:59.967Z] Rounds: 1
[2025-04-09T21:27:59.967Z] Warp: 20
[2025-04-09T21:27:59.967Z] Timer: time.time
[2025-04-09T21:27:59.967Z]
[2025-04-09T21:27:59.967Z] Test minimum average operation overhead
[2025-04-09T21:27:59.967Z] -------------------------------------------------------------------------------
[2025-04-09T21:27:59.967Z] BuiltinFunctionCalls: 50ms 50ms 0.20us 0.000ms
[2025-04-09T21:27:59.967Z] BuiltinMethodLookup: 59ms 59ms 0.11us 0.000ms
[2025-04-09T21:27:59.967Z] CompareFloats: 49ms 49ms 0.08us 0.000ms
[2025-04-09T21:27:59.967Z] CompareFloatsIntegers: 32ms 32ms 0.07us 0.000ms
[2025-04-09T21:27:59.967Z] CompareIntegers: 55ms 55ms 0.06us 0.000ms
[2025-04-09T21:27:59.967Z] CompareInternedStrings: 57ms 57ms 0.08us 0.000ms
[2025-04-09T21:27:59.967Z] CompareLongs: 31ms 31ms 0.06us 0.000ms
[2025-04-09T21:27:59.967Z] CompareStrings: 37ms 37ms 0.07us 0.000ms
[2025-04-09T21:27:59.967Z] CompareUnicode: 29ms 29ms 0.08us 0.000ms
[2025-04-09T21:27:59.967Z] ConcatStrings: 81ms 81ms 0.32us 0.000ms
[2025-04-09T21:27:59.967Z] ConcatUnicode: 37ms 37ms 0.25us 0.000ms
[2025-04-09T21:27:59.967Z] CreateInstances: 37ms 37ms 0.66us 0.000ms
[2025-04-09T21:27:59.967Z] CreateNewInstances: 53ms 53ms 1.26us 0.000ms
[2025-04-09T21:27:59.967Z] CreateStringsWithConcat: 53ms 53ms 0.11us 0.000ms
[2025-04-09T21:27:59.967Z] CreateUnicodeWithConcat: 21ms 21ms 0.11us 0.000ms
[2025-04-09T21:27:59.967Z] DictCreation: 51ms 51ms 0.25us 0.000ms
[2025-04-09T21:27:59.967Z] DictWithFloatKeys: 31ms 31ms 0.07us 0.000ms
[2025-04-09T21:27:59.967Z] DictWithIntegerKeys: 35ms 35ms 0.06us 0.000ms
[2025-04-09T21:27:59.967Z] DictWithStringKeys: 37ms 37ms 0.06us 0.000ms
[2025-04-09T21:27:59.967Z] ForLoops: 86ms 86ms 6.88us 0.000ms
[2025-04-09T21:27:59.968Z] IfThenElse: 67ms 67ms 0.10us 0.000ms
[2025-04-09T21:27:59.968Z] ListSlicing: 18ms 18ms 2.57us 0.000ms
[2025-04-09T21:27:59.968Z] NestedForLoops: 58ms 58ms 0.08us 0.000ms
[2025-04-09T21:27:59.968Z] NormalClassAttribute: 45ms 45ms 0.08us 0.000ms
[2025-04-09T21:27:59.968Z] NormalInstanceAttribute: 42ms 42ms 0.07us 0.000ms
[2025-04-09T21:27:59.968Z] PythonFunctionCalls: 43ms 43ms 0.26us 0.000ms
[2025-04-09T21:27:59.968Z] PythonMethodCalls: 37ms 37ms 0.33us 0.000ms
[2025-04-09T21:27:59.968Z] Recursion: 39ms 39ms 1.56us 0.000ms
[2025-04-09T21:27:59.968Z] SecondImport: 63ms 63ms 1.26us 0.000ms
[2025-04-09T21:27:59.968Z] SecondPackageImport: 35ms 35ms 0.70us 0.000ms
[2025-04-09T21:27:59.968Z] SecondSubmoduleImport: 71ms 71ms 1.42us 0.000ms
[2025-04-09T21:27:59.968Z] SimpleComplexArithmetic: 53ms 53ms 0.12us 0.000ms
[2025-04-09T21:27:59.968Z] SimpleDictManipulation: 44ms 44ms 0.07us 0.000ms
[2025-04-09T21:27:59.968Z] SimpleFloatArithmetic: 59ms 59ms 0.09us 0.000ms
[2025-04-09T21:27:59.968Z] SimpleIntFloatArithmetic: 58ms 58ms 0.09us 0.000ms
[2025-04-09T21:27:59.968Z] SimpleIntegerArithmetic: 54ms 54ms 0.08us 0.000ms
[2025-04-09T21:27:59.968Z] SimpleListManipulation: 48ms 48ms 0.08us 0.000ms
[2025-04-09T21:27:59.968Z] SimpleLongArithmetic: 49ms 49ms 0.15us 0.000ms
[2025-04-09T21:27:59.968Z] SmallLists: 49ms 49ms 0.14us 0.000ms
[2025-04-09T21:27:59.968Z] SmallTuples: 85ms 85ms 0.31us 0.000ms
[2025-04-09T21:27:59.968Z] SpecialClassAttribute: 47ms 47ms 0.08us 0.000ms
[2025-04-09T21:27:59.968Z] SpecialInstanceAttribute: 42ms 42ms 0.07us 0.000ms
[2025-04-09T21:27:59.968Z] StringMappings: 83ms 83ms 0.66us 0.000ms
[2025-04-09T21:27:59.968Z] StringPredicates: 60ms 60ms 0.17us 0.000ms
[2025-04-09T21:27:59.968Z] StringSlicing: 26ms 26ms 0.09us 0.000ms
[2025-04-09T21:27:59.968Z] TryExcept: 17ms 17ms 0.02us 0.000ms
[2025-04-09T21:27:59.968Z] TryRaiseExcept: 152ms 152ms 4.75us 0.000ms
[2025-04-09T21:27:59.968Z] TupleSlicing: 23ms 23ms 0.18us 0.000ms
[2025-04-09T21:27:59.968Z] UnicodeMappings: 121ms 121ms 6.72us 0.000ms
[2025-04-09T21:27:59.968Z] UnicodePredicates: 59ms 59ms 0.22us 0.000ms
[2025-04-09T21:27:59.968Z] UnicodeProperties: 244ms 244ms 1.22us 0.000ms
[2025-04-09T21:27:59.968Z] UnicodeSlicing: 36ms 36ms 0.15us 0.000ms
[2025-04-09T21:27:59.968Z] -------------------------------------------------------------------------------
[2025-04-09T21:27:59.968Z] Totals: 2848ms 2848ms
[2025-04-09T21:27:59.968Z]
[2025-04-09T21:27:59.968Z] ===== DaCapo 9.12-MR1 jython PASSED in 6092 msec =====
[2025-04-09T21:27:59.968Z] -----------------------------------
[2025-04-09T21:27:59.968Z] dacapo-jython_0_PASSED
[2025-04-09T21:27:59.968Z] -----------------------------------
[2025-04-09T21:27:59.968Z]
[2025-04-09T21:27:59.968Z] TEST TEARDOWN:
[2025-04-09T21:27:59.968Z] Nothing to be done for teardown.
[2025-04-09T21:27:59.968Z] dacapo-jython_0 Finish Time: Wed Apr 9 21:27:58 2025 Epoch Time (ms): 1744234078955
[2025-04-09T21:27:59.968Z]
[2025-04-09T21:27:59.968Z] ===============================================
[2025-04-09T21:27:59.968Z] Running test dacapo-luindex_0 ...
[2025-04-09T21:27:59.968Z] ===============================================
[2025-04-09T21:27:59.968Z] dacapo-luindex_0 Start Time: Wed Apr 9 21:27:58 2025 Epoch Time (ms): 1744234078970
[2025-04-09T21:27:59.968Z] variation: NoOptions
[2025-04-09T21:27:59.968Z] JVM_OPTIONS:
[2025-04-09T21:27:59.968Z] { \
[2025-04-09T21:27:59.968Z] echo ""; echo "TEST SETUP:"; \
[2025-04-09T21:27:59.968Z] echo "Nothing to be done for setup."; \
[2025-04-09T21:27:59.968Z] mkdir -p "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/dacapo-luindex_0"; \
[2025-04-09T21:27:59.968Z] cd "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/dacapo-luindex_0"; \
[2025-04-09T21:27:59.968Z] echo ""; echo "TESTING:"; \
[2025-04-09T21:27:59.968Z] "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image/bin/java" -jar "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../../jvmtest/perf/dacapo/dacapo.jar" luindex; \
[2025-04-09T21:27:59.968Z] if [ $? -eq 0 ]; then echo "-----------------------------------"; echo "dacapo-luindex_0""_PASSED"; echo "-----------------------------------"; cd /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/..; rm -f -r "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/dacapo-luindex_0"; else echo "-----------------------------------"; echo "dacapo-luindex_0""_FAILED"; echo "-----------------------------------"; fi; \
[2025-04-09T21:27:59.968Z] echo ""; echo "TEST TEARDOWN:"; \
[2025-04-09T21:27:59.968Z] echo "Nothing to be done for teardown."; \
[2025-04-09T21:27:59.968Z] } 2>&1 | tee -a "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/TestTargetResult";
[2025-04-09T21:27:59.968Z]
[2025-04-09T21:27:59.968Z] TEST SETUP:
[2025-04-09T21:27:59.968Z] Nothing to be done for setup.
[2025-04-09T21:27:59.968Z]
[2025-04-09T21:27:59.968Z] TESTING:
[2025-04-09T21:27:59.968Z] ===== DaCapo 9.12-MR1 luindex starting =====
[2025-04-09T21:28:00.929Z] ===== DaCapo 9.12-MR1 luindex PASSED in 1290 msec =====
[2025-04-09T21:28:00.930Z] -----------------------------------
[2025-04-09T21:28:00.930Z] dacapo-luindex_0_PASSED
[2025-04-09T21:28:00.930Z] -----------------------------------
[2025-04-09T21:28:00.930Z]
[2025-04-09T21:28:00.930Z] TEST TEARDOWN:
[2025-04-09T21:28:00.930Z] Nothing to be done for teardown.
[2025-04-09T21:28:00.930Z] dacapo-luindex_0 Finish Time: Wed Apr 9 21:28:00 2025 Epoch Time (ms): 1744234080634
[2025-04-09T21:28:00.930Z]
[2025-04-09T21:28:00.930Z] ===============================================
[2025-04-09T21:28:00.930Z] Running test dacapo-pmd_0 ...
[2025-04-09T21:28:00.930Z] ===============================================
[2025-04-09T21:28:00.930Z] dacapo-pmd_0 Start Time: Wed Apr 9 21:28:00 2025 Epoch Time (ms): 1744234080650
[2025-04-09T21:28:00.930Z] variation: NoOptions
[2025-04-09T21:28:00.930Z] JVM_OPTIONS:
[2025-04-09T21:28:00.930Z] { \
[2025-04-09T21:28:00.930Z] echo ""; echo "TEST SETUP:"; \
[2025-04-09T21:28:00.930Z] echo "Nothing to be done for setup."; \
[2025-04-09T21:28:00.930Z] mkdir -p "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/dacapo-pmd_0"; \
[2025-04-09T21:28:00.930Z] cd "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/dacapo-pmd_0"; \
[2025-04-09T21:28:00.930Z] echo ""; echo "TESTING:"; \
[2025-04-09T21:28:00.930Z] "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image/bin/java" -jar "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../../jvmtest/perf/dacapo/dacapo.jar" pmd; \
[2025-04-09T21:28:00.930Z] if [ $? -eq 0 ]; then echo "-----------------------------------"; echo "dacapo-pmd_0""_PASSED"; echo "-----------------------------------"; cd /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/..; rm -f -r "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/dacapo-pmd_0"; else echo "-----------------------------------"; echo "dacapo-pmd_0""_FAILED"; echo "-----------------------------------"; fi; \
[2025-04-09T21:28:00.930Z] echo ""; echo "TEST TEARDOWN:"; \
[2025-04-09T21:28:00.930Z] echo "Nothing to be done for teardown."; \
[2025-04-09T21:28:00.930Z] } 2>&1 | tee -a "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/TestTargetResult";
[2025-04-09T21:28:00.930Z]
[2025-04-09T21:28:00.930Z] TEST SETUP:
[2025-04-09T21:28:00.930Z] Nothing to be done for setup.
[2025-04-09T21:28:00.930Z]
[2025-04-09T21:28:00.930Z] TESTING:
[2025-04-09T21:28:01.894Z] ===== DaCapo 9.12-MR1 pmd starting =====
[2025-04-09T21:28:03.888Z] ===== DaCapo 9.12-MR1 pmd PASSED in 2281 msec =====
[2025-04-09T21:28:03.888Z] -----------------------------------
[2025-04-09T21:28:03.888Z] dacapo-pmd_0_PASSED
[2025-04-09T21:28:03.888Z] -----------------------------------
[2025-04-09T21:28:03.888Z]
[2025-04-09T21:28:03.888Z] TEST TEARDOWN:
[2025-04-09T21:28:03.888Z] Nothing to be done for teardown.
[2025-04-09T21:28:03.888Z] dacapo-pmd_0 Finish Time: Wed Apr 9 21:28:03 2025 Epoch Time (ms): 1744234083455
[2025-04-09T21:28:03.888Z]
[2025-04-09T21:28:03.888Z] ===============================================
[2025-04-09T21:28:03.888Z] Running test dacapo-sunflow_0 ...
[2025-04-09T21:28:03.888Z] ===============================================
[2025-04-09T21:28:03.888Z] dacapo-sunflow_0 Start Time: Wed Apr 9 21:28:03 2025 Epoch Time (ms): 1744234083471
[2025-04-09T21:28:03.888Z] variation: NoOptions
[2025-04-09T21:28:03.888Z] JVM_OPTIONS:
[2025-04-09T21:28:03.888Z] { \
[2025-04-09T21:28:03.888Z] echo ""; echo "TEST SETUP:"; \
[2025-04-09T21:28:03.888Z] echo "Nothing to be done for setup."; \
[2025-04-09T21:28:03.888Z] mkdir -p "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/dacapo-sunflow_0"; \
[2025-04-09T21:28:03.888Z] cd "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/dacapo-sunflow_0"; \
[2025-04-09T21:28:03.888Z] echo ""; echo "TESTING:"; \
[2025-04-09T21:28:03.888Z] "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image/bin/java" -jar "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../../jvmtest/perf/dacapo/dacapo.jar" sunflow; \
[2025-04-09T21:28:03.888Z] if [ $? -eq 0 ]; then echo "-----------------------------------"; echo "dacapo-sunflow_0""_PASSED"; echo "-----------------------------------"; cd /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/..; rm -f -r "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/dacapo-sunflow_0"; else echo "-----------------------------------"; echo "dacapo-sunflow_0""_FAILED"; echo "-----------------------------------"; fi; \
[2025-04-09T21:28:03.888Z] echo ""; echo "TEST TEARDOWN:"; \
[2025-04-09T21:28:03.888Z] echo "Nothing to be done for teardown."; \
[2025-04-09T21:28:03.888Z] } 2>&1 | tee -a "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/TestTargetResult";
[2025-04-09T21:28:03.888Z]
[2025-04-09T21:28:03.888Z] TEST SETUP:
[2025-04-09T21:28:03.888Z] Nothing to be done for setup.
[2025-04-09T21:28:03.888Z]
[2025-04-09T21:28:03.888Z] TESTING:
[2025-04-09T21:28:03.888Z] Using scaled threading model. 8 processors detected, 8 threads used to drive the workload, in a possible range of [1,256]
[2025-04-09T21:28:03.888Z] ===== DaCapo 9.12-MR1 sunflow starting =====
[2025-04-09T21:28:05.865Z] ===== DaCapo 9.12-MR1 sunflow PASSED in 1978 msec =====
[2025-04-09T21:28:05.866Z] -----------------------------------
[2025-04-09T21:28:05.866Z] dacapo-sunflow_0_PASSED
[2025-04-09T21:28:05.866Z] -----------------------------------
[2025-04-09T21:28:05.866Z]
[2025-04-09T21:28:05.866Z] TEST TEARDOWN:
[2025-04-09T21:28:05.866Z] Nothing to be done for teardown.
[2025-04-09T21:28:05.866Z] dacapo-sunflow_0 Finish Time: Wed Apr 9 21:28:05 2025 Epoch Time (ms): 1744234085760
[2025-04-09T21:28:05.866Z]
[2025-04-09T21:28:05.866Z] ===============================================
[2025-04-09T21:28:05.866Z] Running test dacapo-tomcat_0 ...
[2025-04-09T21:28:05.866Z] ===============================================
[2025-04-09T21:28:06.826Z] dacapo-tomcat_0 Start Time: Wed Apr 9 21:28:05 2025 Epoch Time (ms): 1744234085782
[2025-04-09T21:28:06.826Z] dacapo-tomcat_0_DISABLED
[2025-04-09T21:28:06.826Z] Disabled Reason:
[2025-04-09T21:28:06.826Z] https://bugs.openjdk.java.net/browse/JDK-8155588
[2025-04-09T21:28:06.826Z] dacapo-tomcat_0 Finish Time: Wed Apr 9 21:28:05 2025 Epoch Time (ms): 1744234085794
[2025-04-09T21:28:06.826Z]
[2025-04-09T21:28:06.826Z] ===============================================
[2025-04-09T21:28:06.826Z] Running test dacapo-xalan_0 ...
[2025-04-09T21:28:06.826Z] ===============================================
[2025-04-09T21:28:06.826Z] dacapo-xalan_0 Start Time: Wed Apr 9 21:28:05 2025 Epoch Time (ms): 1744234085809
[2025-04-09T21:28:06.826Z] variation: NoOptions
[2025-04-09T21:28:06.826Z] JVM_OPTIONS:
[2025-04-09T21:28:06.826Z] { \
[2025-04-09T21:28:06.826Z] echo ""; echo "TEST SETUP:"; \
[2025-04-09T21:28:06.826Z] echo "Nothing to be done for setup."; \
[2025-04-09T21:28:06.826Z] mkdir -p "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/dacapo-xalan_0"; \
[2025-04-09T21:28:06.826Z] cd "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/dacapo-xalan_0"; \
[2025-04-09T21:28:06.826Z] echo ""; echo "TESTING:"; \
[2025-04-09T21:28:06.826Z] "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image/bin/java" -jar "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../../jvmtest/perf/dacapo/dacapo.jar" xalan; \
[2025-04-09T21:28:06.826Z] if [ $? -eq 0 ]; then echo "-----------------------------------"; echo "dacapo-xalan_0""_PASSED"; echo "-----------------------------------"; cd /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/..; rm -f -r "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/dacapo-xalan_0"; else echo "-----------------------------------"; echo "dacapo-xalan_0""_FAILED"; echo "-----------------------------------"; fi; \
[2025-04-09T21:28:06.826Z] echo ""; echo "TEST TEARDOWN:"; \
[2025-04-09T21:28:06.826Z] echo "Nothing to be done for teardown."; \
[2025-04-09T21:28:06.826Z] } 2>&1 | tee -a "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/TestTargetResult";
[2025-04-09T21:28:06.826Z]
[2025-04-09T21:28:06.826Z] TEST SETUP:
[2025-04-09T21:28:06.826Z] Nothing to be done for setup.
[2025-04-09T21:28:06.826Z]
[2025-04-09T21:28:06.826Z] TESTING:
[2025-04-09T21:28:06.826Z] Using scaled threading model. 8 processors detected, 8 threads used to drive the workload, in a possible range of [1,100]
[2025-04-09T21:28:07.785Z] ===== DaCapo 9.12-MR1 xalan starting =====
[2025-04-09T21:28:10.821Z] Normal completion.
[2025-04-09T21:28:10.821Z] ===== DaCapo 9.12-MR1 xalan PASSED in 2666 msec =====
[2025-04-09T21:28:10.821Z] -----------------------------------
[2025-04-09T21:28:10.821Z] dacapo-xalan_0_PASSED
[2025-04-09T21:28:10.821Z] -----------------------------------
[2025-04-09T21:28:10.821Z]
[2025-04-09T21:28:10.821Z] TEST TEARDOWN:
[2025-04-09T21:28:10.821Z] Nothing to be done for teardown.
[2025-04-09T21:28:10.821Z] dacapo-xalan_0 Finish Time: Wed Apr 9 21:28:09 2025 Epoch Time (ms): 1744234089813
[2025-04-09T21:28:10.821Z] make[4]: Leaving directory '/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/dacapo'
[2025-04-09T21:28:10.821Z] make[4]: Entering directory '/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/renaissance'
[2025-04-09T21:28:10.821Z]
[2025-04-09T21:28:10.821Z] ===============================================
[2025-04-09T21:28:10.821Z] Running test renaissance-als_0 ...
[2025-04-09T21:28:10.821Z] ===============================================
[2025-04-09T21:28:10.821Z] renaissance-als_0 Start Time: Wed Apr 9 21:28:09 2025 Epoch Time (ms): 1744234089837
[2025-04-09T21:28:10.821Z] variation: NoOptions
[2025-04-09T21:28:10.821Z] JVM_OPTIONS:
[2025-04-09T21:28:10.821Z] { \
[2025-04-09T21:28:10.821Z] echo ""; echo "TEST SETUP:"; \
[2025-04-09T21:28:10.821Z] echo "Nothing to be done for setup."; \
[2025-04-09T21:28:10.821Z] mkdir -p "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-als_0"; \
[2025-04-09T21:28:10.821Z] cd "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-als_0"; \
[2025-04-09T21:28:10.821Z] echo ""; echo "TESTING:"; \
[2025-04-09T21:28:10.821Z] "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image/bin/java" --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.util=ALL-UNNAMED --add-opens java.base/java.util.concurrent=ALL-UNNAMED --add-opens java.base/java.nio=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/java.lang.invoke=ALL-UNNAMED -jar "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../../jvmtest/perf/renaissance/renaissance.jar" --json ""/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-als_0"/als.json" als; \
[2025-04-09T21:28:10.821Z] if [ $? -eq 0 ]; then echo "-----------------------------------"; echo "renaissance-als_0""_PASSED"; echo "-----------------------------------"; cd /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/..; rm -f -r "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-als_0"; else echo "-----------------------------------"; echo "renaissance-als_0""_FAILED"; echo "-----------------------------------"; fi; \
[2025-04-09T21:28:10.821Z] echo ""; echo "TEST TEARDOWN:"; \
[2025-04-09T21:28:10.821Z] echo "Nothing to be done for teardown."; \
[2025-04-09T21:28:10.821Z] } 2>&1 | tee -a "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/TestTargetResult";
[2025-04-09T21:28:10.821Z]
[2025-04-09T21:28:10.821Z] TEST SETUP:
[2025-04-09T21:28:10.821Z] Nothing to be done for setup.
[2025-04-09T21:28:10.821Z]
[2025-04-09T21:28:10.821Z] TESTING:
[2025-04-09T21:28:16.224Z] NOTE: 'als' benchmark uses Spark local executor with 8 (out of 8) threads.
[2025-04-09T21:28:22.959Z] ====== als (apache-spark) [default], iteration 0 started ======
[2025-04-09T21:28:22.959Z] GC before operation: completed in 40.786 ms, heap usage 63.926 MB -> 36.608 MB.
[2025-04-09T21:28:38.781Z] ====== als (apache-spark) [default], iteration 0 completed (14414.377 ms) ======
[2025-04-09T21:28:38.782Z] ====== als (apache-spark) [default], iteration 1 started ======
[2025-04-09T21:28:38.782Z] GC before operation: completed in 116.241 ms, heap usage 821.278 MB -> 64.249 MB.
[2025-04-09T21:29:09.822Z] Cannot contact test-osuosl-ubuntu2204-aarch64-1: java.lang.InterruptedException
[2025-04-09T21:38:41.774Z] 21:37:38.330 WARN [dispatcher-HeartbeatReceiver] org.apache.spark.HeartbeatReceiver - Removing executor driver with no recent heartbeats: 544403 ms exceeds timeout 120000 ms
[2025-04-09T21:38:41.774Z] 21:37:38.381 WARN [kill-executor-thread] org.apache.spark.SparkContext - Killing executors is not supported by current scheduler.
[2025-04-09T21:38:41.774Z] 21:37:38.401 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.774Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.774Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.774Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.774Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.774Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.774Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.774Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.774Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.774Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.774Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.774Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.774Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.774Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.774Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.774Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.774Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.774Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.774Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.774Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.774Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.774Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.774Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.774Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.774Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.774Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.774Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.774Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.774Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.774Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] ... 8 more
[2025-04-09T21:38:41.775Z] 21:37:38.401 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.775Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.775Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.775Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.775Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.775Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.775Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.775Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.775Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.775Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] ... 3 more
[2025-04-09T21:38:41.775Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.775Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.775Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.775Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] ... 3 more
[2025-04-09T21:38:41.776Z] 21:37:38.416 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.776Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.776Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.776Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.776Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.776Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.776Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.776Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.776Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.776Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] ... 3 more
[2025-04-09T21:38:41.776Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.776Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.776Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.776Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] ... 3 more
[2025-04-09T21:38:41.777Z] 21:37:38.416 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.777Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.777Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.777Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.777Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.777Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.777Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.777Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] ... 8 more
[2025-04-09T21:38:41.778Z] 21:37:38.419 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.778Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.778Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.778Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.778Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.778Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.778Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.778Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] ... 8 more
[2025-04-09T21:38:41.779Z] 21:37:38.419 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.779Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.779Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.779Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.779Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.779Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.779Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.779Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.779Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.779Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] ... 3 more
[2025-04-09T21:38:41.779Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.779Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.779Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.779Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.780Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.780Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.780Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.780Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.780Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.780Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.780Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.780Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.780Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.780Z] ... 3 more
[2025-04-09T21:38:41.780Z] 21:37:38.427 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.780Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.780Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.780Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.780Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.780Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.780Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.787Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.787Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.787Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.787Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.787Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.787Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.787Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.787Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.787Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.787Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.787Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.787Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.788Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.788Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.788Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.788Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] ... 8 more
[2025-04-09T21:38:41.788Z] 21:37:38.427 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.788Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.788Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.788Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.788Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.788Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.788Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.788Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.788Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.788Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.788Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.788Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] ... 3 more
[2025-04-09T21:38:41.789Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.789Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] ... 3 more
[2025-04-09T21:38:41.789Z] 21:37:38.430 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.789Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.789Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.789Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.789Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.789Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.789Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.789Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.789Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.789Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.789Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.789Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] ... 3 more
[2025-04-09T21:38:41.790Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] ... 3 more
[2025-04-09T21:38:41.790Z] 21:37:38.430 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.790Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.790Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.790Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.790Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.790Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.790Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.790Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] ... 8 more
[2025-04-09T21:38:41.791Z] 21:37:38.433 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.791Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.791Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.791Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.791Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.791Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.791Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.791Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] ... 8 more
[2025-04-09T21:38:41.792Z] 21:37:38.433 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.792Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.792Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.792Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.792Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.792Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.792Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.792Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.792Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.792Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] ... 3 more
[2025-04-09T21:38:41.792Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.792Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.792Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.792Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] ... 3 more
[2025-04-09T21:38:41.793Z] 21:37:38.436 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.793Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.793Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.793Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.793Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.793Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.793Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.793Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.793Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] ... 8 more
[2025-04-09T21:38:41.794Z] 21:37:38.436 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.794Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.794Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.794Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.794Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.794Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.794Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.794Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.794Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.794Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] ... 3 more
[2025-04-09T21:38:41.794Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.794Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.794Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.794Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] ... 3 more
[2025-04-09T21:38:41.795Z] 21:37:38.461 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.795Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.795Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.795Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.795Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.795Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.795Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.795Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] ... 8 more
[2025-04-09T21:38:41.796Z] 21:37:38.461 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.796Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.796Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.796Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.796Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.796Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.796Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.796Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.796Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.796Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] ... 3 more
[2025-04-09T21:38:41.796Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.796Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.796Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.796Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] ... 3 more
[2025-04-09T21:38:41.797Z] 21:37:38.483 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.797Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.797Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.797Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.797Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.797Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.797Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.797Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.797Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.797Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] ... 3 more
[2025-04-09T21:38:41.797Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.797Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.797Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.797Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] ... 3 more
[2025-04-09T21:38:41.798Z] 21:37:38.483 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.798Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.798Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.798Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.798Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.798Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] ... 8 more
[2025-04-09T21:38:41.798Z] 21:37:38.487 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.798Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.798Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.798Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.798Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.798Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.798Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] ... 8 more
[2025-04-09T21:38:41.799Z] 21:37:38.487 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.799Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.799Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.799Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.799Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.799Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.799Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.799Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.799Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.799Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] ... 3 more
[2025-04-09T21:38:41.799Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] ... 3 more
[2025-04-09T21:38:41.799Z] 21:37:38.491 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.799Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.799Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.799Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.800Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.800Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.800Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.800Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.800Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.800Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.800Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] ... 3 more
[2025-04-09T21:38:41.800Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] ... 3 more
[2025-04-09T21:38:41.800Z] 21:37:38.491 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.800Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.800Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.800Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.800Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.800Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.800Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] ... 8 more
[2025-04-09T21:38:41.801Z] 21:37:38.496 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.801Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.801Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.801Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.801Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.801Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] ... 8 more
[2025-04-09T21:38:41.801Z] 21:37:38.496 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.801Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.801Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.801Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.801Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.801Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.801Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.801Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.801Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.801Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] ... 3 more
[2025-04-09T21:38:41.801Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.801Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] ... 3 more
[2025-04-09T21:38:41.802Z] 21:37:38.500 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.802Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.802Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.802Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.802Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.802Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.802Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.802Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.802Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.802Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] ... 3 more
[2025-04-09T21:38:41.802Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] ... 3 more
[2025-04-09T21:38:41.802Z] 21:37:38.500 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.802Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.802Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.802Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.803Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.803Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.803Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] ... 8 more
[2025-04-09T21:38:41.803Z] 21:37:38.505 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.803Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.803Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.803Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.803Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.803Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.803Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.803Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.803Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.803Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] ... 3 more
[2025-04-09T21:38:41.803Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.803Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] ... 3 more
[2025-04-09T21:38:41.804Z] 21:37:38.505 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.804Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.804Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.804Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.804Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.804Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] ... 8 more
[2025-04-09T21:38:41.804Z] 21:37:38.509 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.804Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.804Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.804Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.804Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.804Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.804Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] ... 8 more
[2025-04-09T21:38:41.805Z] 21:37:38.509 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.805Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.805Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.805Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.805Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.805Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.805Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.805Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.805Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.805Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] ... 3 more
[2025-04-09T21:38:41.805Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] ... 3 more
[2025-04-09T21:38:41.805Z] 21:37:38.512 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.805Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.805Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.805Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.805Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.805Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.805Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.805Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.805Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.805Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.805Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.805Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] ... 3 more
[2025-04-09T21:38:41.806Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] ... 3 more
[2025-04-09T21:38:41.806Z] 21:37:38.512 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.806Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.806Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.806Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.806Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.806Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] ... 8 more
[2025-04-09T21:38:41.806Z] 21:37:38.516 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.806Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.806Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.806Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.807Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.807Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.807Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] ... 8 more
[2025-04-09T21:38:41.807Z] 21:37:38.516 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.807Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.807Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.807Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.807Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.807Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.807Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.807Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.807Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.807Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] ... 3 more
[2025-04-09T21:38:41.807Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] ... 3 more
[2025-04-09T21:38:41.807Z] 21:37:38.518 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.807Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.807Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.807Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.808Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.808Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.808Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] ... 8 more
[2025-04-09T21:38:41.808Z] 21:37:38.518 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.808Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.808Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.808Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.808Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.808Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.808Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.808Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.808Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.808Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] ... 3 more
[2025-04-09T21:38:41.808Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] ... 3 more
[2025-04-09T21:38:41.808Z] 21:37:38.521 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.808Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.808Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.808Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.808Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.808Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.808Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] ... 8 more
[2025-04-09T21:38:41.809Z] 21:37:38.521 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.809Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.809Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.809Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.809Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.809Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.809Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.809Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.809Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.809Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] ... 3 more
[2025-04-09T21:38:41.809Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] ... 3 more
[2025-04-09T21:38:41.809Z] 21:37:38.524 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.809Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.809Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.809Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.809Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.809Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.809Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.809Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.809Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.809Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] ... 3 more
[2025-04-09T21:38:41.809Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.809Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] ... 3 more
[2025-04-09T21:38:41.810Z] 21:37:38.524 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.810Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.810Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.810Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.810Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.810Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] ... 8 more
[2025-04-09T21:38:41.810Z] 21:37:38.527 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.810Z] 21:37:38.527 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.810Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.810Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.810Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.810Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.810Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.810Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.810Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.810Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.810Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.810Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] ... 3 more
[2025-04-09T21:38:41.810Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.810Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] ... 3 more
[2025-04-09T21:38:41.811Z] 21:37:38.529 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.811Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.811Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.811Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.811Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.811Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] ... 8 more
[2025-04-09T21:38:41.811Z] 21:37:38.529 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.811Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.811Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.811Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.811Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.811Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.811Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.811Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.811Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.811Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] ... 3 more
[2025-04-09T21:38:41.811Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.811Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] ... 3 more
[2025-04-09T21:38:41.812Z] 21:37:38.532 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.812Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.812Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.812Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.812Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.812Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] ... 8 more
[2025-04-09T21:38:41.812Z] 21:37:38.532 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.812Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.812Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.812Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.812Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.812Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.812Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.812Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.812Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.812Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] ... 3 more
[2025-04-09T21:38:41.812Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] ... 3 more
[2025-04-09T21:38:41.812Z] 21:37:38.538 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.812Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.812Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.812Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.812Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.812Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.812Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] ... 8 more
[2025-04-09T21:38:41.813Z] 21:37:38.538 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.813Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.813Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.813Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.813Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.813Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.813Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.813Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.813Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.813Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] ... 3 more
[2025-04-09T21:38:41.813Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] ... 3 more
[2025-04-09T21:38:41.813Z] 21:37:38.541 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.813Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.813Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.813Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.813Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.813Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] ... 8 more
[2025-04-09T21:38:41.813Z] 21:37:38.541 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.813Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.813Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.813Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.814Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.814Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.814Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.814Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.814Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.814Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.814Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] ... 3 more
[2025-04-09T21:38:41.814Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] ... 3 more
[2025-04-09T21:38:41.814Z] 21:37:38.543 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.814Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.814Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.814Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.814Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.814Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] ... 8 more
[2025-04-09T21:38:41.814Z] 21:37:38.544 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.814Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.814Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.814Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.814Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.814Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.814Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.814Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.814Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.814Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] ... 3 more
[2025-04-09T21:38:41.814Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.814Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] ... 3 more
[2025-04-09T21:38:41.815Z] 21:37:38.546 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.815Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.815Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.815Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.815Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.815Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.815Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.815Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.815Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.815Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] ... 3 more
[2025-04-09T21:38:41.815Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] ... 3 more
[2025-04-09T21:38:41.815Z] 21:37:38.546 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.815Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.815Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.815Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.815Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.815Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] ... 8 more
[2025-04-09T21:38:41.815Z] 21:37:38.549 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.815Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.815Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.815Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.816Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.816Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.816Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.816Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.816Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.816Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.816Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] ... 3 more
[2025-04-09T21:38:41.816Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] ... 3 more
[2025-04-09T21:38:41.816Z] 21:37:38.549 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.816Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.816Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.816Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.816Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.816Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] ... 8 more
[2025-04-09T21:38:41.816Z] 21:37:38.552 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.816Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.816Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.816Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.816Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.816Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.816Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] ... 8 more
[2025-04-09T21:38:41.817Z] 21:37:38.552 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.817Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.817Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.817Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.817Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.817Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.817Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.817Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.817Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.817Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] ... 3 more
[2025-04-09T21:38:41.817Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] ... 3 more
[2025-04-09T21:38:41.817Z] 21:37:38.554 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.817Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.817Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.817Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.817Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.817Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] ... 8 more
[2025-04-09T21:38:41.817Z] 21:37:38.554 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.817Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.817Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.817Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.817Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.817Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.817Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.817Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.817Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.817Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.817Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.817Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] ... 3 more
[2025-04-09T21:38:41.818Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] ... 3 more
[2025-04-09T21:38:41.818Z] 21:37:38.556 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.818Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.818Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.818Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.818Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.818Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.818Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.819Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] ... 8 more
[2025-04-09T21:38:41.819Z] 21:37:38.557 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.819Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.819Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.819Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.819Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.819Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.819Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.819Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.819Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.819Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.819Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] ... 3 more
[2025-04-09T21:38:41.819Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.819Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.819Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.819Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.819Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.819Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.819Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.819Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.819Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.819Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.819Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.819Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] ... 3 more
[2025-04-09T21:38:41.820Z] 21:37:38.559 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.820Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.820Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.820Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.820Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.820Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.820Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.820Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.820Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.820Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] ... 3 more
[2025-04-09T21:38:41.820Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.820Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.820Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.820Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] ... 3 more
[2025-04-09T21:38:41.821Z] 21:37:38.559 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.821Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.821Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.821Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.821Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.821Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.821Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.821Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.821Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.822Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.822Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.822Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.822Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.822Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.822Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.822Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.822Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.822Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.822Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.822Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.822Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.822Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.822Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.822Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.822Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.822Z] ... 8 more
[2025-04-09T21:38:41.822Z] 21:37:38.590 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.822Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.822Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.822Z] 21:37:38.590 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.822Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.822Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.822Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.822Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.823Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.823Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.823Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.823Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.823Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.823Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.823Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] ... 3 more
[2025-04-09T21:38:41.823Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.823Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.823Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] ... 3 more
[2025-04-09T21:38:41.823Z] 21:37:38.592 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.823Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.823Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.823Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.824Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.824Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.824Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.824Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.824Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.824Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.824Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.824Z] at scala.concurrent.impl.Promise$DefaultPromise.dispatchOrAddCallbacks(Promise.scala:312) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.824Z] at scala.concurrent.impl.Promise$DefaultPromise.flatMap(Promise.scala:176) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.824Z] at org.apache.spark.rpc.netty.NettyRpcEnv.asyncSetupEndpointRefByURI(NettyRpcEnv.scala:150) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] ... 17 more
[2025-04-09T21:38:41.824Z] 21:37:38.592 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.824Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.824Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.824Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.824Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.824Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.824Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.824Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.824Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.824Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.824Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] ... 3 more
[2025-04-09T21:38:41.824Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.824Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.824Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.824Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.824Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.824Z] at scala.concurrent.impl.Promise$DefaultPromise.dispatchOrAddCallbacks(Promise.scala:312) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.824Z] at scala.concurrent.impl.Promise$DefaultPromise.flatMap(Promise.scala:176) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.824Z] at org.apache.spark.rpc.netty.NettyRpcEnv.asyncSetupEndpointRefByURI(NettyRpcEnv.scala:150) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.824Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] ... 3 more
[2025-04-09T21:38:41.825Z] 21:37:38.595 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.825Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.825Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.825Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.825Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.825Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.825Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.825Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.825Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.825Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.825Z] at scala.concurrent.impl.Promise$DefaultPromise.dispatchOrAddCallbacks(Promise.scala:312) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.825Z] at scala.concurrent.impl.Promise$DefaultPromise.flatMap(Promise.scala:176) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.825Z] at org.apache.spark.rpc.netty.NettyRpcEnv.asyncSetupEndpointRefByURI(NettyRpcEnv.scala:150) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] ... 17 more
[2025-04-09T21:38:41.825Z] 21:37:38.595 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.825Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.825Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.825Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.825Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.825Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.825Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.825Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.825Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.825Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.825Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.825Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] ... 3 more
[2025-04-09T21:38:41.826Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.826Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.826Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.826Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.826Z] at scala.concurrent.impl.Promise$DefaultPromise.dispatchOrAddCallbacks(Promise.scala:312) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.826Z] at scala.concurrent.impl.Promise$DefaultPromise.flatMap(Promise.scala:176) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.netty.NettyRpcEnv.asyncSetupEndpointRefByURI(NettyRpcEnv.scala:150) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] ... 3 more
[2025-04-09T21:38:41.826Z] 21:37:38.597 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.826Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.826Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.826Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.826Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.826Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.826Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.826Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.826Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.826Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.826Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.826Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.826Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.826Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.826Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.826Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.826Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.826Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] ... 8 more
[2025-04-09T21:38:41.827Z] 21:37:38.597 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.827Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.827Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.827Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.827Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.827Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.827Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.827Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.827Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.827Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] ... 3 more
[2025-04-09T21:38:41.827Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.827Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.827Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.828Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.828Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.828Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.828Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.828Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.828Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.828Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.828Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.828Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.828Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.828Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.828Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.828Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] ... 3 more
[2025-04-09T21:38:41.828Z] 21:37:38.599 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.828Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.828Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.828Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.828Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.828Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.828Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.828Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.828Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.828Z] at scala.concurrent.impl.Promise$DefaultPromise.dispatchOrAddCallbacks(Promise.scala:312) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.828Z] at scala.concurrent.impl.Promise$DefaultPromise.flatMap(Promise.scala:176) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.netty.NettyRpcEnv.asyncSetupEndpointRefByURI(NettyRpcEnv.scala:150) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] ... 17 more
[2025-04-09T21:38:41.828Z] 21:37:38.599 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.828Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.828Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.828Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.828Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.828Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.828Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.828Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.828Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.828Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.828Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.828Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] ... 3 more
[2025-04-09T21:38:41.829Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.Promise$DefaultPromise.dispatchOrAddCallbacks(Promise.scala:312) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.Promise$DefaultPromise.flatMap(Promise.scala:176) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.netty.NettyRpcEnv.asyncSetupEndpointRefByURI(NettyRpcEnv.scala:150) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] ... 3 more
[2025-04-09T21:38:41.829Z] 21:37:38.601 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.829Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.829Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.829Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.829Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.829Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] ... 8 more
[2025-04-09T21:38:41.829Z] 21:37:38.601 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.829Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.829Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.829Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.829Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.829Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.829Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.830Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.830Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.830Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.830Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.830Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] ... 3 more
[2025-04-09T21:38:41.830Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] ... 3 more
[2025-04-09T21:38:41.830Z] 21:37:38.603 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.830Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.830Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.830Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.830Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.830Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.830Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] ... 8 more
[2025-04-09T21:38:41.831Z] 21:37:38.604 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.831Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.831Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.831Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.831Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.831Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.831Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.831Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.831Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.831Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] ... 3 more
[2025-04-09T21:38:41.831Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] ... 3 more
[2025-04-09T21:38:41.831Z] 21:37:38.618 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.831Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.831Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.831Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.832Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.832Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.832Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.832Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] ... 8 more
[2025-04-09T21:38:41.832Z] 21:37:38.619 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.832Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.832Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.832Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.832Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.832Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.832Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.832Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.832Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.832Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] ... 3 more
[2025-04-09T21:38:41.832Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.832Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.832Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] ... 3 more
[2025-04-09T21:38:41.833Z] 21:37:38.630 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.833Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.833Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.833Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.833Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.833Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] ... 8 more
[2025-04-09T21:38:41.833Z] 21:37:38.630 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.833Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.833Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.833Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.833Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.833Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.833Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.833Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.833Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.833Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.833Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.833Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] ... 3 more
[2025-04-09T21:38:41.834Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] ... 3 more
[2025-04-09T21:38:41.834Z] 21:37:38.642 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.834Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.834Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.834Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.834Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.834Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.834Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] ... 8 more
[2025-04-09T21:38:41.835Z] 21:37:38.642 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.835Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.835Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.835Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.835Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.835Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.835Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.835Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.835Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.835Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] ... 3 more
[2025-04-09T21:38:41.835Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] ... 3 more
[2025-04-09T21:38:41.835Z] 21:37:38.649 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.835Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.835Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.835Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.836Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.836Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.836Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.836Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] ... 8 more
[2025-04-09T21:38:41.836Z] 21:37:38.650 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.836Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.836Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.836Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.836Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.836Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.836Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.836Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.836Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.836Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] ... 3 more
[2025-04-09T21:38:41.836Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.836Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.836Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.836Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] ... 3 more
[2025-04-09T21:38:41.837Z] 21:37:38.652 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.837Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.837Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.837Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.837Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.837Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] ... 8 more
[2025-04-09T21:38:41.837Z] 21:37:38.652 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.837Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.837Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.837Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.837Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.837Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.837Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.837Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.837Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.837Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.837Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.837Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] ... 3 more
[2025-04-09T21:38:41.838Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] ... 3 more
[2025-04-09T21:38:41.838Z] 21:37:38.662 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.838Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.838Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.838Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.838Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.838Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.838Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.838Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.838Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.838Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] ... 3 more
[2025-04-09T21:38:41.838Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.838Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] ... 3 more
[2025-04-09T21:38:41.839Z] 21:37:38.662 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.839Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.839Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.839Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.839Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.839Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] ... 8 more
[2025-04-09T21:38:41.839Z] 21:37:38.664 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.839Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.839Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.839Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.839Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.839Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] ... 8 more
[2025-04-09T21:38:41.839Z] 21:37:38.664 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.839Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.839Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.839Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.839Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.839Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.839Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.839Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.839Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.839Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.839Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.839Z] ... 3 more
[2025-04-09T21:38:41.839Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] ... 3 more
[2025-04-09T21:38:41.840Z] 21:37:38.670 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.840Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.840Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.840Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.840Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.840Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] ... 8 more
[2025-04-09T21:38:41.840Z] 21:37:38.673 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.840Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.840Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.840Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.840Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.840Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.840Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.840Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.840Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.840Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] ... 3 more
[2025-04-09T21:38:41.840Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.840Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] ... 3 more
[2025-04-09T21:38:41.841Z] 21:37:38.680 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.841Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.841Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.841Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.841Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.841Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.841Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.841Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.841Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.841Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] ... 3 more
[2025-04-09T21:38:41.841Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.Promise$DefaultPromise.dispatchOrAddCallbacks(Promise.scala:312) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.Promise$DefaultPromise.flatMap(Promise.scala:176) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.NettyRpcEnv.asyncSetupEndpointRefByURI(NettyRpcEnv.scala:150) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] ... 3 more
[2025-04-09T21:38:41.841Z] 21:37:38.680 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.841Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.841Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.841Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.841Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.841Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.Promise$DefaultPromise.dispatchOrAddCallbacks(Promise.scala:312) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.Promise$DefaultPromise.flatMap(Promise.scala:176) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.NettyRpcEnv.asyncSetupEndpointRefByURI(NettyRpcEnv.scala:150) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] ... 17 more
[2025-04-09T21:38:41.841Z] 21:37:38.682 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.841Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.841Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.841Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.841Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.841Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.841Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.841Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.841Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.841Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] ... 3 more
[2025-04-09T21:38:41.841Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.841Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] ... 3 more
[2025-04-09T21:38:41.842Z] 21:37:38.682 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.842Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.842Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.842Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.842Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.842Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] ... 8 more
[2025-04-09T21:38:41.842Z] 21:37:38.684 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.842Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.842Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.842Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.842Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.842Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] ... 8 more
[2025-04-09T21:38:41.842Z] 21:37:38.684 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.842Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.842Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.842Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.843Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.843Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.843Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.843Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.843Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.843Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.843Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] ... 3 more
[2025-04-09T21:38:41.843Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] ... 3 more
[2025-04-09T21:38:41.843Z] 21:37:38.687 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.843Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.843Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.843Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.843Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.843Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.843Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.843Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.843Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.843Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] ... 3 more
[2025-04-09T21:38:41.843Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.843Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] ... 3 more
[2025-04-09T21:38:41.844Z] 21:37:38.687 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.844Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.844Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.844Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.844Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.844Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] ... 8 more
[2025-04-09T21:38:41.844Z] 21:37:38.689 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.844Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.844Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.844Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.844Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.844Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] ... 8 more
[2025-04-09T21:38:41.844Z] 21:37:38.689 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.844Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.844Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.844Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.844Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.844Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.844Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.844Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.844Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.844Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.844Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.845Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] ... 3 more
[2025-04-09T21:38:41.845Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] ... 3 more
[2025-04-09T21:38:41.845Z] 21:37:38.691 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.845Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.845Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.845Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.845Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.845Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] ... 8 more
[2025-04-09T21:38:41.845Z] 21:37:38.691 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.845Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.845Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.845Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.845Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.845Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.845Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.845Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.845Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.845Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.845Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.845Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] ... 3 more
[2025-04-09T21:38:41.846Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] ... 3 more
[2025-04-09T21:38:41.846Z] 21:37:38.694 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.846Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.846Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.846Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.846Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.846Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] ... 8 more
[2025-04-09T21:38:41.846Z] 21:37:38.694 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.846Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.846Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.846Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.846Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.846Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.846Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.847Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.847Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.847Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.847Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.847Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] ... 3 more
[2025-04-09T21:38:41.847Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] ... 3 more
[2025-04-09T21:38:41.847Z] ====== als (apache-spark) [default], iteration 1 completed (544338.605 ms) ======
[2025-04-09T21:38:41.847Z] ====== als (apache-spark) [default], iteration 2 started ======
[2025-04-09T21:38:41.847Z] GC before operation: completed in 104.281 ms, heap usage 1.574 GB -> 66.101 MB.
[2025-04-09T21:38:41.847Z] 21:37:43.905 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.847Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.847Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.847Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.847Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.847Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.847Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.847Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.847Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.847Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] ... 3 more
[2025-04-09T21:38:41.847Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] ... 3 more
[2025-04-09T21:38:41.847Z] 21:37:43.905 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.847Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.847Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.847Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.848Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.848Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.848Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] ... 8 more
[2025-04-09T21:38:41.848Z] ====== als (apache-spark) [default], iteration 2 completed (3622.033 ms) ======
[2025-04-09T21:38:41.848Z] ====== als (apache-spark) [default], iteration 3 started ======
[2025-04-09T21:38:41.848Z] GC before operation: completed in 100.090 ms, heap usage 377.519 MB -> 65.512 MB.
[2025-04-09T21:38:41.848Z] ====== als (apache-spark) [default], iteration 3 completed (3495.980 ms) ======
[2025-04-09T21:38:41.848Z] ====== als (apache-spark) [default], iteration 4 started ======
[2025-04-09T21:38:41.848Z] GC before operation: completed in 98.083 ms, heap usage 207.489 MB -> 65.740 MB.
[2025-04-09T21:38:41.848Z] ====== als (apache-spark) [default], iteration 4 completed (3631.539 ms) ======
[2025-04-09T21:38:41.848Z] ====== als (apache-spark) [default], iteration 5 started ======
[2025-04-09T21:38:41.848Z] GC before operation: completed in 123.775 ms, heap usage 160.395 MB -> 66.599 MB.
[2025-04-09T21:38:41.848Z] 21:37:53.904 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.848Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.848Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.848Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.848Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.848Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.848Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.848Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.848Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.848Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] ... 3 more
[2025-04-09T21:38:41.848Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] ... 3 more
[2025-04-09T21:38:41.848Z] 21:37:53.904 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.848Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.848Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.848Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.848Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.848Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.848Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] ... 8 more
[2025-04-09T21:38:41.849Z] ====== als (apache-spark) [default], iteration 5 completed (4137.421 ms) ======
[2025-04-09T21:38:41.849Z] ====== als (apache-spark) [default], iteration 6 started ======
[2025-04-09T21:38:41.849Z] GC before operation: completed in 111.550 ms, heap usage 381.061 MB -> 67.279 MB.
[2025-04-09T21:38:41.849Z] ====== als (apache-spark) [default], iteration 6 completed (4388.097 ms) ======
[2025-04-09T21:38:41.849Z] ====== als (apache-spark) [default], iteration 7 started ======
[2025-04-09T21:38:41.849Z] GC before operation: completed in 157.553 ms, heap usage 489.790 MB -> 68.033 MB.
[2025-04-09T21:38:41.849Z] 21:38:03.910 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.849Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.849Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.849Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.849Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.849Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.849Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.849Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.849Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.849Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] ... 3 more
[2025-04-09T21:38:41.849Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] ... 3 more
[2025-04-09T21:38:41.849Z] 21:38:03.910 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.849Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.849Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.849Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.849Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.849Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.849Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] ... 8 more
[2025-04-09T21:38:41.850Z] ====== als (apache-spark) [default], iteration 7 completed (3687.238 ms) ======
[2025-04-09T21:38:41.850Z] ====== als (apache-spark) [default], iteration 8 started ======
[2025-04-09T21:38:41.850Z] GC before operation: completed in 113.554 ms, heap usage 638.228 MB -> 68.803 MB.
[2025-04-09T21:38:41.850Z] ====== als (apache-spark) [default], iteration 8 completed (3831.070 ms) ======
[2025-04-09T21:38:41.850Z] ====== als (apache-spark) [default], iteration 9 started ======
[2025-04-09T21:38:41.850Z] GC before operation: completed in 135.392 ms, heap usage 916.174 MB -> 69.671 MB.
[2025-04-09T21:38:41.850Z] 21:38:13.909 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.850Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.850Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.850Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.850Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.850Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] ... 8 more
[2025-04-09T21:38:41.850Z] 21:38:13.909 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.850Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.850Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.850Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.850Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.850Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.850Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.850Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.850Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.850Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] ... 3 more
[2025-04-09T21:38:41.850Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.850Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] ... 3 more
[2025-04-09T21:38:41.851Z] ====== als (apache-spark) [default], iteration 9 completed (4241.757 ms) ======
[2025-04-09T21:38:41.851Z] ====== als (apache-spark) [default], iteration 10 started ======
[2025-04-09T21:38:41.851Z] GC before operation: completed in 129.545 ms, heap usage 327.457 MB -> 69.186 MB.
[2025-04-09T21:38:41.851Z] ====== als (apache-spark) [default], iteration 10 completed (4300.853 ms) ======
[2025-04-09T21:38:41.851Z] ====== als (apache-spark) [default], iteration 11 started ======
[2025-04-09T21:38:41.851Z] GC before operation: completed in 165.602 ms, heap usage 638.973 MB -> 69.915 MB.
[2025-04-09T21:38:41.851Z] 21:38:23.916 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.851Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.851Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.851Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.851Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.851Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] ... 8 more
[2025-04-09T21:38:41.851Z] 21:38:23.916 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.851Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.851Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.851Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.851Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.851Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.851Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.851Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.851Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.851Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] ... 3 more
[2025-04-09T21:38:41.851Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] ... 3 more
[2025-04-09T21:38:41.851Z] ====== als (apache-spark) [default], iteration 11 completed (4153.954 ms) ======
[2025-04-09T21:38:41.851Z] ====== als (apache-spark) [default], iteration 12 started ======
[2025-04-09T21:38:41.851Z] GC before operation: completed in 174.948 ms, heap usage 471.806 MB -> 69.872 MB.
[2025-04-09T21:38:41.851Z] ====== als (apache-spark) [default], iteration 12 completed (4126.291 ms) ======
[2025-04-09T21:38:41.851Z] ====== als (apache-spark) [default], iteration 13 started ======
[2025-04-09T21:38:41.851Z] GC before operation: completed in 233.743 ms, heap usage 1.505 GB -> 71.291 MB.
[2025-04-09T21:38:41.851Z] ====== als (apache-spark) [default], iteration 13 completed (3997.652 ms) ======
[2025-04-09T21:38:41.851Z] ====== als (apache-spark) [default], iteration 14 started ======
[2025-04-09T21:38:41.851Z] GC before operation: completed in 137.202 ms, heap usage 672.790 MB -> 70.791 MB.
[2025-04-09T21:38:41.851Z] 21:38:33.924 ERROR [dispatcher-BlockManagerMaster] org.apache.spark.rpc.netty.Inbox - Ignoring error
[2025-04-09T21:38:41.851Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.851Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.851Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.852Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.852Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.852Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] ... 8 more
[2025-04-09T21:38:41.852Z] 21:38:33.925 WARN [executor-heartbeater] org.apache.spark.executor.Executor - Issue communicating with driver in heartbeater
[2025-04-09T21:38:41.852Z] org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.852Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:101) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:85) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:642) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1223) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:295) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18) [scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) [spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
[2025-04-09T21:38:41.852Z] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:305) [?:?]
[2025-04-09T21:38:41.852Z] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) [?:?]
[2025-04-09T21:38:41.852Z] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
[2025-04-09T21:38:41.852Z] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
[2025-04-09T21:38:41.852Z] at java.lang.Thread.run(Thread.java:840) [?:?]
[2025-04-09T21:38:41.852Z] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
[2025-04-09T21:38:41.852Z] at org.apache.spark.util.SparkThreadUtils$.awaitResult(SparkThreadUtils.scala:56) ~[spark-common-utils_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:310) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:124) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:688) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:687) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:725) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:133) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] ... 3 more
[2025-04-09T21:38:41.852Z] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@localhost:43635
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:470) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:504) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.ExecutionContext$parasitic$.execute(ExecutionContext.scala:222) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:335) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.Promise.trySuccess(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.Promise.trySuccess$(Promise.scala:99) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.ExecutionContextImpl.execute(ExecutionContextImpl.scala:21) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$Transformation.submitWithValue(Promise.scala:429) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$DefaultPromise.submitWithValue(Promise.scala:338) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete0(Promise.scala:285) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:278) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.Promise.complete(Promise.scala:57) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.Promise.complete$(Promise.scala:56) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.Promise.success(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.Promise.success$(Promise.scala:91) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:104) ~[scala-library-2.13.15.jar:?]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.13-3.5.3.jar:3.5.3]
[2025-04-09T21:38:41.852Z] ... 3 more
[2025-04-09T21:38:41.852Z] 21:38:33.928 ERROR [executor-heartbeater] org.apache.spark.executor.Executor - Exit as unable to send heartbeats to driver more than 60 times
[2025-04-09T21:38:44.173Z] ====== als (apache-spark) [default], iteration 14 failed (SparkException) ======
[2025-04-09T21:38:45.140Z] -----------------------------------
[2025-04-09T21:38:45.140Z] renaissance-als_0_FAILED
[2025-04-09T21:38:45.140Z] -----------------------------------
[2025-04-09T21:38:45.140Z]
[2025-04-09T21:38:45.140Z] TEST TEARDOWN:
[2025-04-09T21:38:45.140Z] Nothing to be done for teardown.
[2025-04-09T21:38:45.140Z] renaissance-als_0 Finish Time: Wed Apr 9 21:38:44 2025 Epoch Time (ms): 1744234724212
[2025-04-09T21:38:45.140Z]
[2025-04-09T21:38:45.140Z] ===============================================
[2025-04-09T21:38:45.140Z] Running test renaissance-chi-square_0 ...
[2025-04-09T21:38:45.140Z] ===============================================
[2025-04-09T21:38:45.140Z] renaissance-chi-square_0 Start Time: Wed Apr 9 21:38:44 2025 Epoch Time (ms): 1744234724227
[2025-04-09T21:38:45.140Z] variation: NoOptions
[2025-04-09T21:38:45.140Z] JVM_OPTIONS:
[2025-04-09T21:38:45.140Z] { \
[2025-04-09T21:38:45.140Z] echo ""; echo "TEST SETUP:"; \
[2025-04-09T21:38:45.140Z] echo "Nothing to be done for setup."; \
[2025-04-09T21:38:45.140Z] mkdir -p "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-chi-square_0"; \
[2025-04-09T21:38:45.140Z] cd "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-chi-square_0"; \
[2025-04-09T21:38:45.140Z] echo ""; echo "TESTING:"; \
[2025-04-09T21:38:45.140Z] "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image/bin/java" --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.util=ALL-UNNAMED --add-opens java.base/java.util.concurrent=ALL-UNNAMED --add-opens java.base/java.nio=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/java.lang.invoke=ALL-UNNAMED -jar "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../../jvmtest/perf/renaissance/renaissance.jar" --json ""/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-chi-square_0"/chi-square.json" chi-square; \
[2025-04-09T21:38:45.140Z] if [ $? -eq 0 ]; then echo "-----------------------------------"; echo "renaissance-chi-square_0""_PASSED"; echo "-----------------------------------"; cd /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/..; rm -f -r "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-chi-square_0"; else echo "-----------------------------------"; echo "renaissance-chi-square_0""_FAILED"; echo "-----------------------------------"; fi; \
[2025-04-09T21:38:45.140Z] echo ""; echo "TEST TEARDOWN:"; \
[2025-04-09T21:38:45.140Z] echo "Nothing to be done for teardown."; \
[2025-04-09T21:38:45.140Z] } 2>&1 | tee -a "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/TestTargetResult";
[2025-04-09T21:38:45.140Z]
[2025-04-09T21:38:45.140Z] TEST SETUP:
[2025-04-09T21:38:45.140Z] Nothing to be done for setup.
[2025-04-09T21:38:45.140Z]
[2025-04-09T21:38:45.140Z] TESTING:
[2025-04-09T21:38:50.546Z] NOTE: 'chi-square' benchmark uses Spark local executor with 4 (out of 8) threads.
[2025-04-09T21:38:52.518Z] ====== chi-square (apache-spark) [default], iteration 0 started ======
[2025-04-09T21:38:52.518Z] GC before operation: completed in 38.877 ms, heap usage 156.926 MB -> 30.028 MB.
[2025-04-09T21:38:56.696Z] ====== chi-square (apache-spark) [default], iteration 0 completed (3933.929 ms) ======
[2025-04-09T21:38:56.696Z] ====== chi-square (apache-spark) [default], iteration 1 started ======
[2025-04-09T21:38:56.696Z] GC before operation: completed in 77.992 ms, heap usage 356.833 MB -> 179.405 MB.
[2025-04-09T21:38:56.696Z] ====== chi-square (apache-spark) [default], iteration 1 completed (830.067 ms) ======
[2025-04-09T21:38:56.696Z] ====== chi-square (apache-spark) [default], iteration 2 started ======
[2025-04-09T21:38:56.696Z] GC before operation: completed in 86.103 ms, heap usage 275.949 MB -> 179.435 MB.
[2025-04-09T21:38:57.654Z] ====== chi-square (apache-spark) [default], iteration 2 completed (850.324 ms) ======
[2025-04-09T21:38:57.654Z] ====== chi-square (apache-spark) [default], iteration 3 started ======
[2025-04-09T21:38:57.654Z] GC before operation: completed in 66.581 ms, heap usage 201.699 MB -> 179.519 MB.
[2025-04-09T21:38:58.614Z] ====== chi-square (apache-spark) [default], iteration 3 completed (672.046 ms) ======
[2025-04-09T21:38:58.614Z] ====== chi-square (apache-spark) [default], iteration 4 started ======
[2025-04-09T21:38:58.614Z] GC before operation: completed in 87.004 ms, heap usage 551.853 MB -> 179.575 MB.
[2025-04-09T21:38:59.573Z] ====== chi-square (apache-spark) [default], iteration 4 completed (637.690 ms) ======
[2025-04-09T21:38:59.573Z] ====== chi-square (apache-spark) [default], iteration 5 started ======
[2025-04-09T21:38:59.573Z] GC before operation: completed in 72.621 ms, heap usage 545.699 MB -> 179.625 MB.
[2025-04-09T21:39:00.533Z] ====== chi-square (apache-spark) [default], iteration 5 completed (908.194 ms) ======
[2025-04-09T21:39:00.533Z] ====== chi-square (apache-spark) [default], iteration 6 started ======
[2025-04-09T21:39:00.533Z] GC before operation: completed in 72.784 ms, heap usage 907.964 MB -> 179.679 MB.
[2025-04-09T21:39:01.491Z] ====== chi-square (apache-spark) [default], iteration 6 completed (649.783 ms) ======
[2025-04-09T21:39:01.491Z] ====== chi-square (apache-spark) [default], iteration 7 started ======
[2025-04-09T21:39:01.491Z] GC before operation: completed in 74.213 ms, heap usage 528.335 MB -> 179.728 MB.
[2025-04-09T21:39:01.491Z] ====== chi-square (apache-spark) [default], iteration 7 completed (628.403 ms) ======
[2025-04-09T21:39:01.491Z] ====== chi-square (apache-spark) [default], iteration 8 started ======
[2025-04-09T21:39:01.491Z] GC before operation: completed in 83.068 ms, heap usage 541.792 MB -> 179.871 MB.
[2025-04-09T21:39:02.454Z] ====== chi-square (apache-spark) [default], iteration 8 completed (638.728 ms) ======
[2025-04-09T21:39:02.454Z] ====== chi-square (apache-spark) [default], iteration 9 started ======
[2025-04-09T21:39:02.454Z] GC before operation: completed in 74.622 ms, heap usage 536.456 MB -> 179.897 MB.
[2025-04-09T21:39:03.412Z] ====== chi-square (apache-spark) [default], iteration 9 completed (777.288 ms) ======
[2025-04-09T21:39:03.412Z] ====== chi-square (apache-spark) [default], iteration 10 started ======
[2025-04-09T21:39:03.412Z] GC before operation: completed in 71.437 ms, heap usage 901.665 MB -> 179.929 MB.
[2025-04-09T21:39:04.371Z] ====== chi-square (apache-spark) [default], iteration 10 completed (633.617 ms) ======
[2025-04-09T21:39:04.371Z] ====== chi-square (apache-spark) [default], iteration 11 started ======
[2025-04-09T21:39:04.371Z] GC before operation: completed in 89.353 ms, heap usage 516.025 MB -> 179.970 MB.
[2025-04-09T21:39:04.371Z] ====== chi-square (apache-spark) [default], iteration 11 completed (626.564 ms) ======
[2025-04-09T21:39:04.371Z] ====== chi-square (apache-spark) [default], iteration 12 started ======
[2025-04-09T21:39:05.327Z] GC before operation: completed in 80.495 ms, heap usage 539.515 MB -> 180.006 MB.
[2025-04-09T21:39:05.327Z] ====== chi-square (apache-spark) [default], iteration 12 completed (664.508 ms) ======
[2025-04-09T21:39:05.327Z] ====== chi-square (apache-spark) [default], iteration 13 started ======
[2025-04-09T21:39:05.327Z] GC before operation: completed in 78.877 ms, heap usage 546.551 MB -> 180.044 MB.
[2025-04-09T21:39:06.284Z] ====== chi-square (apache-spark) [default], iteration 13 completed (789.752 ms) ======
[2025-04-09T21:39:06.284Z] ====== chi-square (apache-spark) [default], iteration 14 started ======
[2025-04-09T21:39:06.284Z] GC before operation: completed in 80.323 ms, heap usage 905.953 MB -> 180.077 MB.
[2025-04-09T21:39:07.242Z] ====== chi-square (apache-spark) [default], iteration 14 completed (707.696 ms) ======
[2025-04-09T21:39:07.242Z] ====== chi-square (apache-spark) [default], iteration 15 started ======
[2025-04-09T21:39:07.242Z] GC before operation: completed in 77.659 ms, heap usage 521.968 MB -> 180.110 MB.
[2025-04-09T21:39:08.200Z] ====== chi-square (apache-spark) [default], iteration 15 completed (647.141 ms) ======
[2025-04-09T21:39:08.200Z] ====== chi-square (apache-spark) [default], iteration 16 started ======
[2025-04-09T21:39:08.200Z] GC before operation: completed in 87.141 ms, heap usage 529.655 MB -> 180.144 MB.
[2025-04-09T21:39:08.200Z] ====== chi-square (apache-spark) [default], iteration 16 completed (618.474 ms) ======
[2025-04-09T21:39:08.200Z] ====== chi-square (apache-spark) [default], iteration 17 started ======
[2025-04-09T21:39:09.157Z] GC before operation: completed in 65.148 ms, heap usage 530.195 MB -> 180.174 MB.
[2025-04-09T21:39:09.157Z] ====== chi-square (apache-spark) [default], iteration 17 completed (779.936 ms) ======
[2025-04-09T21:39:09.157Z] ====== chi-square (apache-spark) [default], iteration 18 started ======
[2025-04-09T21:39:09.157Z] GC before operation: completed in 86.609 ms, heap usage 898.926 MB -> 180.206 MB.
[2025-04-09T21:39:10.115Z] ====== chi-square (apache-spark) [default], iteration 18 completed (618.124 ms) ======
[2025-04-09T21:39:10.115Z] ====== chi-square (apache-spark) [default], iteration 19 started ======
[2025-04-09T21:39:10.115Z] GC before operation: completed in 71.543 ms, heap usage 530.753 MB -> 180.241 MB.
[2025-04-09T21:39:11.074Z] ====== chi-square (apache-spark) [default], iteration 19 completed (629.254 ms) ======
[2025-04-09T21:39:11.074Z] ====== chi-square (apache-spark) [default], iteration 20 started ======
[2025-04-09T21:39:11.074Z] GC before operation: completed in 86.757 ms, heap usage 529.318 MB -> 180.270 MB.
[2025-04-09T21:39:12.032Z] ====== chi-square (apache-spark) [default], iteration 20 completed (760.758 ms) ======
[2025-04-09T21:39:12.032Z] ====== chi-square (apache-spark) [default], iteration 21 started ======
[2025-04-09T21:39:12.032Z] GC before operation: completed in 74.627 ms, heap usage 900.476 MB -> 180.308 MB.
[2025-04-09T21:39:12.032Z] ====== chi-square (apache-spark) [default], iteration 21 completed (624.673 ms) ======
[2025-04-09T21:39:12.032Z] ====== chi-square (apache-spark) [default], iteration 22 started ======
[2025-04-09T21:39:12.989Z] GC before operation: completed in 85.351 ms, heap usage 528.857 MB -> 180.328 MB.
[2025-04-09T21:39:12.989Z] ====== chi-square (apache-spark) [default], iteration 22 completed (612.953 ms) ======
[2025-04-09T21:39:12.989Z] ====== chi-square (apache-spark) [default], iteration 23 started ======
[2025-04-09T21:39:12.989Z] GC before operation: completed in 63.863 ms, heap usage 528.873 MB -> 180.357 MB.
[2025-04-09T21:39:13.946Z] ====== chi-square (apache-spark) [default], iteration 23 completed (621.516 ms) ======
[2025-04-09T21:39:13.946Z] ====== chi-square (apache-spark) [default], iteration 24 started ======
[2025-04-09T21:39:13.946Z] GC before operation: completed in 75.381 ms, heap usage 532.901 MB -> 180.387 MB.
[2025-04-09T21:39:14.905Z] ====== chi-square (apache-spark) [default], iteration 24 completed (759.323 ms) ======
[2025-04-09T21:39:14.905Z] ====== chi-square (apache-spark) [default], iteration 25 started ======
[2025-04-09T21:39:14.905Z] GC before operation: completed in 76.856 ms, heap usage 899.132 MB -> 180.419 MB.
[2025-04-09T21:39:15.862Z] ====== chi-square (apache-spark) [default], iteration 25 completed (617.769 ms) ======
[2025-04-09T21:39:15.862Z] ====== chi-square (apache-spark) [default], iteration 26 started ======
[2025-04-09T21:39:15.862Z] GC before operation: completed in 69.711 ms, heap usage 522.310 MB -> 180.453 MB.
[2025-04-09T21:39:15.862Z] ====== chi-square (apache-spark) [default], iteration 26 completed (617.625 ms) ======
[2025-04-09T21:39:15.862Z] ====== chi-square (apache-spark) [default], iteration 27 started ======
[2025-04-09T21:39:15.862Z] GC before operation: completed in 86.881 ms, heap usage 522.348 MB -> 180.488 MB.
[2025-04-09T21:39:16.819Z] ====== chi-square (apache-spark) [default], iteration 27 completed (614.740 ms) ======
[2025-04-09T21:39:16.819Z] ====== chi-square (apache-spark) [default], iteration 28 started ======
[2025-04-09T21:39:16.819Z] GC before operation: completed in 76.890 ms, heap usage 529.030 MB -> 180.533 MB.
[2025-04-09T21:39:17.777Z] ====== chi-square (apache-spark) [default], iteration 28 completed (774.712 ms) ======
[2025-04-09T21:39:17.777Z] ====== chi-square (apache-spark) [default], iteration 29 started ======
[2025-04-09T21:39:17.777Z] GC before operation: completed in 93.719 ms, heap usage 901.445 MB -> 180.564 MB.
[2025-04-09T21:39:18.736Z] ====== chi-square (apache-spark) [default], iteration 29 completed (617.701 ms) ======
[2025-04-09T21:39:18.736Z] ====== chi-square (apache-spark) [default], iteration 30 started ======
[2025-04-09T21:39:18.736Z] GC before operation: completed in 73.374 ms, heap usage 505.111 MB -> 180.595 MB.
[2025-04-09T21:39:18.736Z] ====== chi-square (apache-spark) [default], iteration 30 completed (622.395 ms) ======
[2025-04-09T21:39:18.736Z] ====== chi-square (apache-spark) [default], iteration 31 started ======
[2025-04-09T21:39:18.736Z] GC before operation: completed in 71.146 ms, heap usage 520.462 MB -> 180.629 MB.
[2025-04-09T21:39:19.694Z] ====== chi-square (apache-spark) [default], iteration 31 completed (651.446 ms) ======
[2025-04-09T21:39:19.694Z] ====== chi-square (apache-spark) [default], iteration 32 started ======
[2025-04-09T21:39:19.694Z] GC before operation: completed in 80.430 ms, heap usage 520.514 MB -> 180.661 MB.
[2025-04-09T21:39:20.651Z] ====== chi-square (apache-spark) [default], iteration 32 completed (756.990 ms) ======
[2025-04-09T21:39:20.651Z] ====== chi-square (apache-spark) [default], iteration 33 started ======
[2025-04-09T21:39:20.651Z] GC before operation: completed in 73.132 ms, heap usage 897.413 MB -> 180.694 MB.
[2025-04-09T21:39:21.611Z] ====== chi-square (apache-spark) [default], iteration 33 completed (608.490 ms) ======
[2025-04-09T21:39:21.611Z] ====== chi-square (apache-spark) [default], iteration 34 started ======
[2025-04-09T21:39:21.611Z] GC before operation: completed in 77.648 ms, heap usage 513.237 MB -> 180.729 MB.
[2025-04-09T21:39:21.611Z] ====== chi-square (apache-spark) [default], iteration 34 completed (602.906 ms) ======
[2025-04-09T21:39:21.611Z] ====== chi-square (apache-spark) [default], iteration 35 started ======
[2025-04-09T21:39:21.611Z] GC before operation: completed in 101.117 ms, heap usage 516.803 MB -> 180.750 MB.
[2025-04-09T21:39:22.571Z] ====== chi-square (apache-spark) [default], iteration 35 completed (755.454 ms) ======
[2025-04-09T21:39:22.572Z] ====== chi-square (apache-spark) [default], iteration 36 started ======
[2025-04-09T21:39:22.572Z] GC before operation: completed in 70.258 ms, heap usage 889.475 MB -> 180.778 MB.
[2025-04-09T21:39:23.537Z] ====== chi-square (apache-spark) [default], iteration 36 completed (629.915 ms) ======
[2025-04-09T21:39:23.537Z] ====== chi-square (apache-spark) [default], iteration 37 started ======
[2025-04-09T21:39:23.537Z] GC before operation: completed in 72.227 ms, heap usage 512.669 MB -> 180.810 MB.
[2025-04-09T21:39:24.494Z] ====== chi-square (apache-spark) [default], iteration 37 completed (624.004 ms) ======
[2025-04-09T21:39:24.494Z] ====== chi-square (apache-spark) [default], iteration 38 started ======
[2025-04-09T21:39:24.494Z] GC before operation: completed in 80.261 ms, heap usage 519.852 MB -> 180.840 MB.
[2025-04-09T21:39:25.452Z] ====== chi-square (apache-spark) [default], iteration 38 completed (702.808 ms) ======
[2025-04-09T21:39:25.452Z] ====== chi-square (apache-spark) [default], iteration 39 started ======
[2025-04-09T21:39:25.452Z] GC before operation: completed in 81.267 ms, heap usage 525.530 MB -> 180.873 MB.
[2025-04-09T21:39:25.452Z] ====== chi-square (apache-spark) [default], iteration 39 completed (610.037 ms) ======
[2025-04-09T21:39:25.452Z] ====== chi-square (apache-spark) [default], iteration 40 started ======
[2025-04-09T21:39:25.452Z] GC before operation: completed in 71.781 ms, heap usage 505.063 MB -> 180.891 MB.
[2025-04-09T21:39:26.412Z] ====== chi-square (apache-spark) [default], iteration 40 completed (673.567 ms) ======
[2025-04-09T21:39:26.412Z] ====== chi-square (apache-spark) [default], iteration 41 started ======
[2025-04-09T21:39:26.412Z] GC before operation: completed in 69.293 ms, heap usage 519.791 MB -> 180.911 MB.
[2025-04-09T21:39:27.369Z] ====== chi-square (apache-spark) [default], iteration 41 completed (627.978 ms) ======
[2025-04-09T21:39:27.369Z] ====== chi-square (apache-spark) [default], iteration 42 started ======
[2025-04-09T21:39:27.369Z] GC before operation: completed in 92.452 ms, heap usage 525.461 MB -> 180.934 MB.
[2025-04-09T21:39:28.327Z] ====== chi-square (apache-spark) [default], iteration 42 completed (779.883 ms) ======
[2025-04-09T21:39:28.327Z] ====== chi-square (apache-spark) [default], iteration 43 started ======
[2025-04-09T21:39:28.327Z] GC before operation: completed in 122.135 ms, heap usage 895.716 MB -> 180.958 MB.
[2025-04-09T21:39:28.327Z] ====== chi-square (apache-spark) [default], iteration 43 completed (640.741 ms) ======
[2025-04-09T21:39:28.327Z] ====== chi-square (apache-spark) [default], iteration 44 started ======
[2025-04-09T21:39:29.286Z] GC before operation: completed in 82.025 ms, heap usage 512.501 MB -> 180.976 MB.
[2025-04-09T21:39:29.286Z] ====== chi-square (apache-spark) [default], iteration 44 completed (651.477 ms) ======
[2025-04-09T21:39:29.286Z] ====== chi-square (apache-spark) [default], iteration 45 started ======
[2025-04-09T21:39:29.286Z] GC before operation: completed in 71.197 ms, heap usage 520.522 MB -> 180.997 MB.
[2025-04-09T21:39:30.247Z] ====== chi-square (apache-spark) [default], iteration 45 completed (622.355 ms) ======
[2025-04-09T21:39:30.247Z] ====== chi-square (apache-spark) [default], iteration 46 started ======
[2025-04-09T21:39:30.247Z] GC before operation: completed in 76.010 ms, heap usage 524.545 MB -> 181.020 MB.
[2025-04-09T21:39:31.207Z] ====== chi-square (apache-spark) [default], iteration 46 completed (756.421 ms) ======
[2025-04-09T21:39:31.207Z] ====== chi-square (apache-spark) [default], iteration 47 started ======
[2025-04-09T21:39:31.207Z] GC before operation: completed in 71.865 ms, heap usage 892.794 MB -> 181.039 MB.
[2025-04-09T21:39:32.166Z] ====== chi-square (apache-spark) [default], iteration 47 completed (613.010 ms) ======
[2025-04-09T21:39:32.166Z] ====== chi-square (apache-spark) [default], iteration 48 started ======
[2025-04-09T21:39:32.166Z] GC before operation: completed in 65.728 ms, heap usage 512.967 MB -> 181.050 MB.
[2025-04-09T21:39:32.166Z] ====== chi-square (apache-spark) [default], iteration 48 completed (669.637 ms) ======
[2025-04-09T21:39:32.166Z] ====== chi-square (apache-spark) [default], iteration 49 started ======
[2025-04-09T21:39:32.166Z] GC before operation: completed in 82.320 ms, heap usage 511.597 MB -> 181.068 MB.
[2025-04-09T21:39:33.125Z] ====== chi-square (apache-spark) [default], iteration 49 completed (625.820 ms) ======
[2025-04-09T21:39:33.125Z] ====== chi-square (apache-spark) [default], iteration 50 started ======
[2025-04-09T21:39:33.125Z] GC before operation: completed in 89.643 ms, heap usage 519.959 MB -> 181.087 MB.
[2025-04-09T21:39:34.088Z] ====== chi-square (apache-spark) [default], iteration 50 completed (737.992 ms) ======
[2025-04-09T21:39:34.088Z] ====== chi-square (apache-spark) [default], iteration 51 started ======
[2025-04-09T21:39:34.088Z] GC before operation: completed in 86.640 ms, heap usage 893.416 MB -> 181.113 MB.
[2025-04-09T21:39:35.046Z] ====== chi-square (apache-spark) [default], iteration 51 completed (619.442 ms) ======
[2025-04-09T21:39:35.046Z] ====== chi-square (apache-spark) [default], iteration 52 started ======
[2025-04-09T21:39:35.046Z] GC before operation: completed in 77.958 ms, heap usage 520.653 MB -> 181.128 MB.
[2025-04-09T21:39:35.046Z] ====== chi-square (apache-spark) [default], iteration 52 completed (619.856 ms) ======
[2025-04-09T21:39:35.046Z] ====== chi-square (apache-spark) [default], iteration 53 started ======
[2025-04-09T21:39:35.046Z] GC before operation: completed in 79.125 ms, heap usage 527.996 MB -> 181.145 MB.
[2025-04-09T21:39:36.007Z] ====== chi-square (apache-spark) [default], iteration 53 completed (747.471 ms) ======
[2025-04-09T21:39:36.007Z] ====== chi-square (apache-spark) [default], iteration 54 started ======
[2025-04-09T21:39:36.007Z] GC before operation: completed in 72.808 ms, heap usage 897.869 MB -> 181.164 MB.
[2025-04-09T21:39:36.965Z] ====== chi-square (apache-spark) [default], iteration 54 completed (618.896 ms) ======
[2025-04-09T21:39:36.965Z] ====== chi-square (apache-spark) [default], iteration 55 started ======
[2025-04-09T21:39:36.965Z] GC before operation: completed in 80.166 ms, heap usage 513.056 MB -> 181.186 MB.
[2025-04-09T21:39:37.927Z] ====== chi-square (apache-spark) [default], iteration 55 completed (597.936 ms) ======
[2025-04-09T21:39:37.927Z] ====== chi-square (apache-spark) [default], iteration 56 started ======
[2025-04-09T21:39:37.927Z] GC before operation: completed in 77.833 ms, heap usage 520.061 MB -> 181.205 MB.
[2025-04-09T21:39:37.927Z] ====== chi-square (apache-spark) [default], iteration 56 completed (605.766 ms) ======
[2025-04-09T21:39:37.927Z] ====== chi-square (apache-spark) [default], iteration 57 started ======
[2025-04-09T21:39:37.927Z] GC before operation: completed in 73.746 ms, heap usage 528.273 MB -> 181.223 MB.
[2025-04-09T21:39:38.886Z] ====== chi-square (apache-spark) [default], iteration 57 completed (740.727 ms) ======
[2025-04-09T21:39:38.886Z] ====== chi-square (apache-spark) [default], iteration 58 started ======
[2025-04-09T21:39:38.886Z] GC before operation: completed in 75.193 ms, heap usage 895.982 MB -> 181.242 MB.
[2025-04-09T21:39:39.843Z] ====== chi-square (apache-spark) [default], iteration 58 completed (629.571 ms) ======
[2025-04-09T21:39:39.844Z] ====== chi-square (apache-spark) [default], iteration 59 started ======
[2025-04-09T21:39:39.844Z] GC before operation: completed in 76.093 ms, heap usage 520.137 MB -> 181.264 MB.
[2025-04-09T21:39:40.801Z] ====== chi-square (apache-spark) [default], iteration 59 completed (614.399 ms) ======
[2025-04-09T21:39:40.801Z] -----------------------------------
[2025-04-09T21:39:40.801Z] renaissance-chi-square_0_PASSED
[2025-04-09T21:39:40.801Z] -----------------------------------
[2025-04-09T21:39:40.801Z]
[2025-04-09T21:39:40.801Z] TEST TEARDOWN:
[2025-04-09T21:39:40.801Z] Nothing to be done for teardown.
[2025-04-09T21:39:40.801Z] renaissance-chi-square_0 Finish Time: Wed Apr 9 21:39:40 2025 Epoch Time (ms): 1744234780252
[2025-04-09T21:39:40.801Z]
[2025-04-09T21:39:40.801Z] ===============================================
[2025-04-09T21:39:40.801Z] Running test renaissance-db-shootout_0 ...
[2025-04-09T21:39:40.801Z] ===============================================
[2025-04-09T21:39:40.801Z] renaissance-db-shootout_0 Start Time: Wed Apr 9 21:39:40 2025 Epoch Time (ms): 1744234780266
[2025-04-09T21:39:40.801Z] renaissance-db-shootout_0_DISABLED
[2025-04-09T21:39:40.801Z] Disabled Reason:
[2025-04-09T21:39:40.801Z] https://github.com/adoptium/aqa-tests/issues/2500#issuecomment-818965455
[2025-04-09T21:39:40.801Z] https://github.com/adoptium/aqa-tests/issues/2159#issuecomment-759570887
[2025-04-09T21:39:40.801Z] renaissance-db-shootout_0 Finish Time: Wed Apr 9 21:39:40 2025 Epoch Time (ms): 1744234780278
[2025-04-09T21:39:40.801Z]
[2025-04-09T21:39:40.801Z] ===============================================
[2025-04-09T21:39:40.801Z] Running test renaissance-dec-tree_0 ...
[2025-04-09T21:39:40.801Z] ===============================================
[2025-04-09T21:39:40.801Z] renaissance-dec-tree_0 Start Time: Wed Apr 9 21:39:40 2025 Epoch Time (ms): 1744234780291
[2025-04-09T21:39:40.801Z] variation: NoOptions
[2025-04-09T21:39:40.801Z] JVM_OPTIONS:
[2025-04-09T21:39:40.801Z] { \
[2025-04-09T21:39:40.801Z] echo ""; echo "TEST SETUP:"; \
[2025-04-09T21:39:40.801Z] echo "Nothing to be done for setup."; \
[2025-04-09T21:39:40.801Z] mkdir -p "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-dec-tree_0"; \
[2025-04-09T21:39:40.801Z] cd "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-dec-tree_0"; \
[2025-04-09T21:39:40.801Z] echo ""; echo "TESTING:"; \
[2025-04-09T21:39:40.801Z] "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image/bin/java" --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.util=ALL-UNNAMED --add-opens java.base/java.util.concurrent=ALL-UNNAMED --add-opens java.base/java.nio=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/java.lang.invoke=ALL-UNNAMED -jar "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../../jvmtest/perf/renaissance/renaissance.jar" --json ""/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-dec-tree_0"/dec-tree.json" dec-tree; \
[2025-04-09T21:39:40.801Z] if [ $? -eq 0 ]; then echo "-----------------------------------"; echo "renaissance-dec-tree_0""_PASSED"; echo "-----------------------------------"; cd /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/..; rm -f -r "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-dec-tree_0"; else echo "-----------------------------------"; echo "renaissance-dec-tree_0""_FAILED"; echo "-----------------------------------"; fi; \
[2025-04-09T21:39:40.801Z] echo ""; echo "TEST TEARDOWN:"; \
[2025-04-09T21:39:40.801Z] echo "Nothing to be done for teardown."; \
[2025-04-09T21:39:40.801Z] } 2>&1 | tee -a "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/TestTargetResult";
[2025-04-09T21:39:40.801Z]
[2025-04-09T21:39:40.801Z] TEST SETUP:
[2025-04-09T21:39:40.801Z] Nothing to be done for setup.
[2025-04-09T21:39:40.801Z]
[2025-04-09T21:39:40.801Z] TESTING:
[2025-04-09T21:39:46.954Z] NOTE: 'dec-tree' benchmark uses Spark local executor with 6 (out of 8) threads.
[2025-04-09T21:39:48.927Z] ====== dec-tree (apache-spark) [default], iteration 0 started ======
[2025-04-09T21:39:48.927Z] GC before operation: completed in 58.654 ms, heap usage 52.076 MB -> 36.424 MB.
[2025-04-09T21:40:00.488Z] ====== dec-tree (apache-spark) [default], iteration 0 completed (10479.740 ms) ======
[2025-04-09T21:40:00.488Z] ====== dec-tree (apache-spark) [default], iteration 1 started ======
[2025-04-09T21:40:00.488Z] GC before operation: completed in 140.770 ms, heap usage 561.463 MB -> 72.906 MB.
[2025-04-09T21:40:02.456Z] ====== dec-tree (apache-spark) [default], iteration 1 completed (2881.276 ms) ======
[2025-04-09T21:40:02.456Z] ====== dec-tree (apache-spark) [default], iteration 2 started ======
[2025-04-09T21:40:02.456Z] GC before operation: completed in 110.492 ms, heap usage 157.518 MB -> 72.577 MB.
[2025-04-09T21:40:05.496Z] ====== dec-tree (apache-spark) [default], iteration 2 completed (2870.445 ms) ======
[2025-04-09T21:40:05.497Z] ====== dec-tree (apache-spark) [default], iteration 3 started ======
[2025-04-09T21:40:05.497Z] GC before operation: completed in 124.451 ms, heap usage 332.667 MB -> 73.302 MB.
[2025-04-09T21:40:07.463Z] ====== dec-tree (apache-spark) [default], iteration 3 completed (2309.382 ms) ======
[2025-04-09T21:40:07.463Z] ====== dec-tree (apache-spark) [default], iteration 4 started ======
[2025-04-09T21:40:07.463Z] GC before operation: completed in 115.727 ms, heap usage 402.091 MB -> 73.389 MB.
[2025-04-09T21:40:09.433Z] ====== dec-tree (apache-spark) [default], iteration 4 completed (1573.526 ms) ======
[2025-04-09T21:40:09.433Z] ====== dec-tree (apache-spark) [default], iteration 5 started ======
[2025-04-09T21:40:09.433Z] GC before operation: completed in 111.907 ms, heap usage 201.165 MB -> 73.453 MB.
[2025-04-09T21:40:11.412Z] ====== dec-tree (apache-spark) [default], iteration 5 completed (1463.902 ms) ======
[2025-04-09T21:40:11.412Z] ====== dec-tree (apache-spark) [default], iteration 6 started ======
[2025-04-09T21:40:11.412Z] GC before operation: completed in 128.511 ms, heap usage 296.869 MB -> 73.967 MB.
[2025-04-09T21:40:12.399Z] ====== dec-tree (apache-spark) [default], iteration 6 completed (1487.108 ms) ======
[2025-04-09T21:40:12.399Z] ====== dec-tree (apache-spark) [default], iteration 7 started ======
[2025-04-09T21:40:12.399Z] GC before operation: completed in 120.301 ms, heap usage 378.422 MB -> 74.203 MB.
[2025-04-09T21:40:14.384Z] ====== dec-tree (apache-spark) [default], iteration 7 completed (1406.242 ms) ======
[2025-04-09T21:40:14.384Z] ====== dec-tree (apache-spark) [default], iteration 8 started ======
[2025-04-09T21:40:14.384Z] GC before operation: completed in 130.366 ms, heap usage 120.921 MB -> 73.682 MB.
[2025-04-09T21:40:16.359Z] ====== dec-tree (apache-spark) [default], iteration 8 completed (1980.878 ms) ======
[2025-04-09T21:40:16.359Z] ====== dec-tree (apache-spark) [default], iteration 9 started ======
[2025-04-09T21:40:16.359Z] GC before operation: completed in 159.856 ms, heap usage 525.981 MB -> 74.690 MB.
[2025-04-09T21:40:18.333Z] ====== dec-tree (apache-spark) [default], iteration 9 completed (1993.191 ms) ======
[2025-04-09T21:40:18.333Z] ====== dec-tree (apache-spark) [default], iteration 10 started ======
[2025-04-09T21:40:18.333Z] GC before operation: completed in 148.447 ms, heap usage 373.890 MB -> 74.585 MB.
[2025-04-09T21:40:21.375Z] ====== dec-tree (apache-spark) [default], iteration 10 completed (2235.017 ms) ======
[2025-04-09T21:40:21.375Z] ====== dec-tree (apache-spark) [default], iteration 11 started ======
[2025-04-09T21:40:21.375Z] GC before operation: completed in 143.675 ms, heap usage 303.183 MB -> 74.976 MB.
[2025-04-09T21:40:22.338Z] ====== dec-tree (apache-spark) [default], iteration 11 completed (1295.973 ms) ======
[2025-04-09T21:40:22.338Z] ====== dec-tree (apache-spark) [default], iteration 12 started ======
[2025-04-09T21:40:22.338Z] GC before operation: completed in 151.332 ms, heap usage 353.299 MB -> 102.457 MB.
[2025-04-09T21:40:24.316Z] ====== dec-tree (apache-spark) [default], iteration 12 completed (1925.161 ms) ======
[2025-04-09T21:40:24.316Z] ====== dec-tree (apache-spark) [default], iteration 13 started ======
[2025-04-09T21:40:24.316Z] GC before operation: completed in 154.037 ms, heap usage 547.200 MB -> 97.719 MB.
[2025-04-09T21:40:26.295Z] ====== dec-tree (apache-spark) [default], iteration 13 completed (1779.492 ms) ======
[2025-04-09T21:40:26.295Z] ====== dec-tree (apache-spark) [default], iteration 14 started ======
[2025-04-09T21:40:26.295Z] GC before operation: completed in 248.315 ms, heap usage 230.793 MB -> 75.204 MB.
[2025-04-09T21:40:28.277Z] ====== dec-tree (apache-spark) [default], iteration 14 completed (1556.981 ms) ======
[2025-04-09T21:40:28.277Z] ====== dec-tree (apache-spark) [default], iteration 15 started ======
[2025-04-09T21:40:28.277Z] GC before operation: completed in 182.060 ms, heap usage 120.815 MB -> 74.412 MB.
[2025-04-09T21:40:30.244Z] ====== dec-tree (apache-spark) [default], iteration 15 completed (1517.508 ms) ======
[2025-04-09T21:40:30.244Z] ====== dec-tree (apache-spark) [default], iteration 16 started ======
[2025-04-09T21:40:30.244Z] GC before operation: completed in 186.700 ms, heap usage 287.286 MB -> 75.388 MB.
[2025-04-09T21:40:32.206Z] ====== dec-tree (apache-spark) [default], iteration 16 completed (1896.427 ms) ======
[2025-04-09T21:40:32.207Z] ====== dec-tree (apache-spark) [default], iteration 17 started ======
[2025-04-09T21:40:32.207Z] GC before operation: completed in 139.175 ms, heap usage 289.458 MB -> 97.453 MB.
[2025-04-09T21:40:33.166Z] ====== dec-tree (apache-spark) [default], iteration 17 completed (1236.313 ms) ======
[2025-04-09T21:40:33.166Z] ====== dec-tree (apache-spark) [default], iteration 18 started ======
[2025-04-09T21:40:33.166Z] GC before operation: completed in 149.644 ms, heap usage 339.867 MB -> 75.634 MB.
[2025-04-09T21:40:35.139Z] ====== dec-tree (apache-spark) [default], iteration 18 completed (1267.401 ms) ======
[2025-04-09T21:40:35.139Z] ====== dec-tree (apache-spark) [default], iteration 19 started ======
[2025-04-09T21:40:35.139Z] GC before operation: completed in 173.803 ms, heap usage 403.140 MB -> 75.958 MB.
[2025-04-09T21:40:36.099Z] ====== dec-tree (apache-spark) [default], iteration 19 completed (1177.587 ms) ======
[2025-04-09T21:40:36.099Z] ====== dec-tree (apache-spark) [default], iteration 20 started ======
[2025-04-09T21:40:36.099Z] GC before operation: completed in 147.881 ms, heap usage 483.431 MB -> 75.725 MB.
[2025-04-09T21:40:37.055Z] ====== dec-tree (apache-spark) [default], iteration 20 completed (1229.046 ms) ======
[2025-04-09T21:40:37.055Z] ====== dec-tree (apache-spark) [default], iteration 21 started ======
[2025-04-09T21:40:38.012Z] GC before operation: completed in 159.898 ms, heap usage 322.585 MB -> 76.210 MB.
[2025-04-09T21:40:38.968Z] ====== dec-tree (apache-spark) [default], iteration 21 completed (1197.681 ms) ======
[2025-04-09T21:40:38.968Z] ====== dec-tree (apache-spark) [default], iteration 22 started ======
[2025-04-09T21:40:38.968Z] GC before operation: completed in 190.852 ms, heap usage 448.377 MB -> 76.174 MB.
[2025-04-09T21:40:39.931Z] ====== dec-tree (apache-spark) [default], iteration 22 completed (1233.988 ms) ======
[2025-04-09T21:40:39.931Z] ====== dec-tree (apache-spark) [default], iteration 23 started ======
[2025-04-09T21:40:39.931Z] GC before operation: completed in 142.658 ms, heap usage 422.440 MB -> 76.200 MB.
[2025-04-09T21:40:41.918Z] ====== dec-tree (apache-spark) [default], iteration 23 completed (1451.004 ms) ======
[2025-04-09T21:40:41.919Z] ====== dec-tree (apache-spark) [default], iteration 24 started ======
[2025-04-09T21:40:41.919Z] GC before operation: completed in 138.942 ms, heap usage 409.910 MB -> 76.179 MB.
[2025-04-09T21:40:42.886Z] ====== dec-tree (apache-spark) [default], iteration 24 completed (1336.552 ms) ======
[2025-04-09T21:40:42.886Z] ====== dec-tree (apache-spark) [default], iteration 25 started ======
[2025-04-09T21:40:43.850Z] GC before operation: completed in 139.216 ms, heap usage 421.576 MB -> 76.523 MB.
[2025-04-09T21:40:44.815Z] ====== dec-tree (apache-spark) [default], iteration 25 completed (1200.639 ms) ======
[2025-04-09T21:40:44.815Z] ====== dec-tree (apache-spark) [default], iteration 26 started ======
[2025-04-09T21:40:44.815Z] GC before operation: completed in 154.710 ms, heap usage 466.776 MB -> 76.569 MB.
[2025-04-09T21:40:45.775Z] ====== dec-tree (apache-spark) [default], iteration 26 completed (1183.061 ms) ======
[2025-04-09T21:40:45.775Z] ====== dec-tree (apache-spark) [default], iteration 27 started ======
[2025-04-09T21:40:45.775Z] GC before operation: completed in 273.178 ms, heap usage 349.596 MB -> 76.714 MB.
[2025-04-09T21:40:47.768Z] ====== dec-tree (apache-spark) [default], iteration 27 completed (1162.562 ms) ======
[2025-04-09T21:40:47.768Z] ====== dec-tree (apache-spark) [default], iteration 28 started ======
[2025-04-09T21:40:47.768Z] GC before operation: completed in 156.293 ms, heap usage 273.746 MB -> 76.604 MB.
[2025-04-09T21:40:48.731Z] ====== dec-tree (apache-spark) [default], iteration 28 completed (1283.409 ms) ======
[2025-04-09T21:40:48.731Z] ====== dec-tree (apache-spark) [default], iteration 29 started ======
[2025-04-09T21:40:48.731Z] GC before operation: completed in 164.297 ms, heap usage 533.011 MB -> 88.196 MB.
[2025-04-09T21:40:50.704Z] ====== dec-tree (apache-spark) [default], iteration 29 completed (1413.742 ms) ======
[2025-04-09T21:40:50.704Z] ====== dec-tree (apache-spark) [default], iteration 30 started ======
[2025-04-09T21:40:50.704Z] GC before operation: completed in 156.517 ms, heap usage 625.375 MB -> 104.856 MB.
[2025-04-09T21:40:52.683Z] ====== dec-tree (apache-spark) [default], iteration 30 completed (1711.819 ms) ======
[2025-04-09T21:40:52.683Z] ====== dec-tree (apache-spark) [default], iteration 31 started ======
[2025-04-09T21:40:52.683Z] GC before operation: completed in 164.788 ms, heap usage 548.586 MB -> 77.288 MB.
[2025-04-09T21:40:53.652Z] ====== dec-tree (apache-spark) [default], iteration 31 completed (1270.008 ms) ======
[2025-04-09T21:40:53.652Z] ====== dec-tree (apache-spark) [default], iteration 32 started ======
[2025-04-09T21:40:53.652Z] GC before operation: completed in 169.918 ms, heap usage 271.906 MB -> 77.189 MB.
[2025-04-09T21:40:54.618Z] ====== dec-tree (apache-spark) [default], iteration 32 completed (1155.999 ms) ======
[2025-04-09T21:40:54.618Z] ====== dec-tree (apache-spark) [default], iteration 33 started ======
[2025-04-09T21:40:55.575Z] GC before operation: completed in 172.146 ms, heap usage 173.524 MB -> 76.889 MB.
[2025-04-09T21:40:56.532Z] ====== dec-tree (apache-spark) [default], iteration 33 completed (1328.628 ms) ======
[2025-04-09T21:40:56.532Z] ====== dec-tree (apache-spark) [default], iteration 34 started ======
[2025-04-09T21:40:56.532Z] GC before operation: completed in 166.381 ms, heap usage 541.473 MB -> 77.675 MB.
[2025-04-09T21:40:57.493Z] ====== dec-tree (apache-spark) [default], iteration 34 completed (1224.399 ms) ======
[2025-04-09T21:40:57.493Z] ====== dec-tree (apache-spark) [default], iteration 35 started ======
[2025-04-09T21:40:58.450Z] GC before operation: completed in 156.763 ms, heap usage 410.174 MB -> 77.575 MB.
[2025-04-09T21:40:59.413Z] ====== dec-tree (apache-spark) [default], iteration 35 completed (1477.864 ms) ======
[2025-04-09T21:40:59.413Z] ====== dec-tree (apache-spark) [default], iteration 36 started ======
[2025-04-09T21:40:59.413Z] GC before operation: completed in 198.587 ms, heap usage 371.545 MB -> 77.698 MB.
[2025-04-09T21:41:01.391Z] ====== dec-tree (apache-spark) [default], iteration 36 completed (1181.317 ms) ======
[2025-04-09T21:41:01.391Z] ====== dec-tree (apache-spark) [default], iteration 37 started ======
[2025-04-09T21:41:01.391Z] GC before operation: completed in 156.966 ms, heap usage 415.693 MB -> 77.729 MB.
[2025-04-09T21:41:02.355Z] ====== dec-tree (apache-spark) [default], iteration 37 completed (1160.025 ms) ======
[2025-04-09T21:41:02.355Z] ====== dec-tree (apache-spark) [default], iteration 38 started ======
[2025-04-09T21:41:02.355Z] GC before operation: completed in 168.401 ms, heap usage 368.099 MB -> 77.697 MB.
[2025-04-09T21:41:03.320Z] ====== dec-tree (apache-spark) [default], iteration 38 completed (1123.375 ms) ======
[2025-04-09T21:41:03.320Z] ====== dec-tree (apache-spark) [default], iteration 39 started ======
[2025-04-09T21:41:03.320Z] GC before operation: completed in 150.588 ms, heap usage 369.969 MB -> 105.488 MB.
[2025-04-09T21:41:03.320Z] 21:41:03.112 WARN [block-manager-storage-async-thread-pool-60] org.apache.spark.storage.BlockManager - Asked to remove block broadcast_541_piece0, which does not exist
[2025-04-09T21:41:05.298Z] ====== dec-tree (apache-spark) [default], iteration 39 completed (1385.581 ms) ======
[2025-04-09T21:41:05.298Z] -----------------------------------
[2025-04-09T21:41:05.298Z] renaissance-dec-tree_0_PASSED
[2025-04-09T21:41:05.298Z] -----------------------------------
[2025-04-09T21:41:05.298Z]
[2025-04-09T21:41:05.298Z] TEST TEARDOWN:
[2025-04-09T21:41:05.298Z] Nothing to be done for teardown.
[2025-04-09T21:41:05.298Z] renaissance-dec-tree_0 Finish Time: Wed Apr 9 21:41:04 2025 Epoch Time (ms): 1744234864863
[2025-04-09T21:41:05.298Z]
[2025-04-09T21:41:05.298Z] ===============================================
[2025-04-09T21:41:05.298Z] Running test renaissance-finagle-chirper_0 ...
[2025-04-09T21:41:05.298Z] ===============================================
[2025-04-09T21:41:05.298Z] renaissance-finagle-chirper_0 Start Time: Wed Apr 9 21:41:04 2025 Epoch Time (ms): 1744234864881
[2025-04-09T21:41:05.298Z] renaissance-finagle-chirper_0_DISABLED
[2025-04-09T21:41:05.298Z] Disabled Reason:
[2025-04-09T21:41:05.298Z] https://github.com/renaissance-benchmarks/renaissance/issues/231
[2025-04-09T21:41:05.298Z] renaissance-finagle-chirper_0 Finish Time: Wed Apr 9 21:41:04 2025 Epoch Time (ms): 1744234864894
[2025-04-09T21:41:05.298Z]
[2025-04-09T21:41:05.298Z] ===============================================
[2025-04-09T21:41:05.298Z] Running test renaissance-finagle-http_0 ...
[2025-04-09T21:41:05.298Z] ===============================================
[2025-04-09T21:41:05.298Z] renaissance-finagle-http_0 Start Time: Wed Apr 9 21:41:04 2025 Epoch Time (ms): 1744234864908
[2025-04-09T21:41:05.298Z] variation: NoOptions
[2025-04-09T21:41:05.298Z] JVM_OPTIONS:
[2025-04-09T21:41:05.298Z] { \
[2025-04-09T21:41:05.298Z] echo ""; echo "TEST SETUP:"; \
[2025-04-09T21:41:05.298Z] echo "Nothing to be done for setup."; \
[2025-04-09T21:41:05.298Z] mkdir -p "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-finagle-http_0"; \
[2025-04-09T21:41:05.298Z] cd "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-finagle-http_0"; \
[2025-04-09T21:41:05.298Z] echo ""; echo "TESTING:"; \
[2025-04-09T21:41:05.298Z] "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image/bin/java" --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.util=ALL-UNNAMED --add-opens java.base/java.util.concurrent=ALL-UNNAMED --add-opens java.base/java.nio=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/java.lang.invoke=ALL-UNNAMED -jar "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../../jvmtest/perf/renaissance/renaissance.jar" --json ""/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-finagle-http_0"/finagle-http.json" finagle-http; \
[2025-04-09T21:41:05.298Z] if [ $? -eq 0 ]; then echo "-----------------------------------"; echo "renaissance-finagle-http_0""_PASSED"; echo "-----------------------------------"; cd /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/..; rm -f -r "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-finagle-http_0"; else echo "-----------------------------------"; echo "renaissance-finagle-http_0""_FAILED"; echo "-----------------------------------"; fi; \
[2025-04-09T21:41:05.298Z] echo ""; echo "TEST TEARDOWN:"; \
[2025-04-09T21:41:05.298Z] echo "Nothing to be done for teardown."; \
[2025-04-09T21:41:05.298Z] } 2>&1 | tee -a "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/TestTargetResult";
[2025-04-09T21:41:05.298Z]
[2025-04-09T21:41:05.298Z] TEST SETUP:
[2025-04-09T21:41:05.298Z] Nothing to be done for setup.
[2025-04-09T21:41:05.298Z]
[2025-04-09T21:41:05.298Z] TESTING:
[2025-04-09T21:41:08.344Z] finagle-http on :46703 spawning 8 client and default number of server workers.
[2025-04-09T21:41:08.344Z] ====== finagle-http (web) [default], iteration 0 started ======
[2025-04-09T21:41:08.344Z] GC before operation: completed in 29.724 ms, heap usage 13.002 MB -> 12.372 MB.
[2025-04-09T21:41:36.952Z] ====== finagle-http (web) [default], iteration 0 completed (27877.304 ms) ======
[2025-04-09T21:41:36.952Z] ====== finagle-http (web) [default], iteration 1 started ======
[2025-04-09T21:41:36.952Z] GC before operation: completed in 61.874 ms, heap usage 56.478 MB -> 17.684 MB.
[2025-04-09T21:41:55.364Z] ====== finagle-http (web) [default], iteration 1 completed (17710.077 ms) ======
[2025-04-09T21:41:55.364Z] ====== finagle-http (web) [default], iteration 2 started ======
[2025-04-09T21:41:55.364Z] GC before operation: completed in 69.659 ms, heap usage 125.351 MB -> 17.879 MB.
[2025-04-09T21:42:11.229Z] ====== finagle-http (web) [default], iteration 2 completed (16980.819 ms) ======
[2025-04-09T21:42:11.229Z] ====== finagle-http (web) [default], iteration 3 started ======
[2025-04-09T21:42:11.229Z] GC before operation: completed in 94.278 ms, heap usage 53.087 MB -> 17.787 MB.
[2025-04-09T21:42:29.644Z] ====== finagle-http (web) [default], iteration 3 completed (16684.582 ms) ======
[2025-04-09T21:42:29.644Z] ====== finagle-http (web) [default], iteration 4 started ======
[2025-04-09T21:42:29.644Z] GC before operation: completed in 53.233 ms, heap usage 50.311 MB -> 17.809 MB.
[2025-04-09T21:42:48.192Z] ====== finagle-http (web) [default], iteration 4 completed (17796.147 ms) ======
[2025-04-09T21:42:48.192Z] ====== finagle-http (web) [default], iteration 5 started ======
[2025-04-09T21:42:48.192Z] GC before operation: completed in 62.808 ms, heap usage 113.268 MB -> 17.865 MB.
[2025-04-09T21:43:04.089Z] ====== finagle-http (web) [default], iteration 5 completed (16121.875 ms) ======
[2025-04-09T21:43:04.089Z] ====== finagle-http (web) [default], iteration 6 started ======
[2025-04-09T21:43:04.089Z] GC before operation: completed in 70.577 ms, heap usage 36.239 MB -> 17.799 MB.
[2025-04-09T21:43:22.470Z] ====== finagle-http (web) [default], iteration 6 completed (18531.621 ms) ======
[2025-04-09T21:43:22.470Z] ====== finagle-http (web) [default], iteration 7 started ======
[2025-04-09T21:43:22.470Z] GC before operation: completed in 60.350 ms, heap usage 104.719 MB -> 17.846 MB.
[2025-04-09T21:43:40.904Z] ====== finagle-http (web) [default], iteration 7 completed (19662.477 ms) ======
[2025-04-09T21:43:40.904Z] ====== finagle-http (web) [default], iteration 8 started ======
[2025-04-09T21:43:40.904Z] GC before operation: completed in 55.958 ms, heap usage 61.055 MB -> 18.001 MB.
[2025-04-09T21:44:02.402Z] ====== finagle-http (web) [default], iteration 8 completed (20780.178 ms) ======
[2025-04-09T21:44:02.402Z] ====== finagle-http (web) [default], iteration 9 started ======
[2025-04-09T21:44:02.402Z] GC before operation: completed in 78.359 ms, heap usage 42.937 MB -> 17.860 MB.
[2025-04-09T21:44:23.877Z] ====== finagle-http (web) [default], iteration 9 completed (21325.532 ms) ======
[2025-04-09T21:44:23.877Z] ====== finagle-http (web) [default], iteration 10 started ======
[2025-04-09T21:44:23.877Z] GC before operation: completed in 84.565 ms, heap usage 70.355 MB -> 17.793 MB.
[2025-04-09T21:44:45.396Z] ====== finagle-http (web) [default], iteration 10 completed (20720.357 ms) ======
[2025-04-09T21:44:45.396Z] ====== finagle-http (web) [default], iteration 11 started ======
[2025-04-09T21:44:45.396Z] GC before operation: completed in 56.337 ms, heap usage 102.429 MB -> 17.790 MB.
[2025-04-09T21:45:03.924Z] ====== finagle-http (web) [default], iteration 11 completed (19909.933 ms) ======
[2025-04-09T21:45:03.924Z] -----------------------------------
[2025-04-09T21:45:03.924Z] renaissance-finagle-http_0_PASSED
[2025-04-09T21:45:03.924Z] -----------------------------------
[2025-04-09T21:45:03.924Z]
[2025-04-09T21:45:03.924Z] TEST TEARDOWN:
[2025-04-09T21:45:03.924Z] Nothing to be done for teardown.
[2025-04-09T21:45:03.924Z] renaissance-finagle-http_0 Finish Time: Wed Apr 9 21:45:03 2025 Epoch Time (ms): 1744235103675
[2025-04-09T21:45:03.924Z]
[2025-04-09T21:45:03.924Z] ===============================================
[2025-04-09T21:45:03.924Z] Running test renaissance-gauss-mix_0 ...
[2025-04-09T21:45:03.924Z] ===============================================
[2025-04-09T21:45:03.924Z] renaissance-gauss-mix_0 Start Time: Wed Apr 9 21:45:03 2025 Epoch Time (ms): 1744235103698
[2025-04-09T21:45:03.924Z] variation: NoOptions
[2025-04-09T21:45:03.924Z] JVM_OPTIONS:
[2025-04-09T21:45:03.924Z] { \
[2025-04-09T21:45:03.924Z] echo ""; echo "TEST SETUP:"; \
[2025-04-09T21:45:03.924Z] echo "Nothing to be done for setup."; \
[2025-04-09T21:45:03.924Z] mkdir -p "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-gauss-mix_0"; \
[2025-04-09T21:45:03.924Z] cd "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-gauss-mix_0"; \
[2025-04-09T21:45:03.924Z] echo ""; echo "TESTING:"; \
[2025-04-09T21:45:03.924Z] "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image/bin/java" --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.util=ALL-UNNAMED --add-opens java.base/java.util.concurrent=ALL-UNNAMED --add-opens java.base/java.nio=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/java.lang.invoke=ALL-UNNAMED -jar "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../../jvmtest/perf/renaissance/renaissance.jar" --json ""/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-gauss-mix_0"/gauss-mix.json" gauss-mix; \
[2025-04-09T21:45:03.924Z] if [ $? -eq 0 ]; then echo "-----------------------------------"; echo "renaissance-gauss-mix_0""_PASSED"; echo "-----------------------------------"; cd /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/..; rm -f -r "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-gauss-mix_0"; else echo "-----------------------------------"; echo "renaissance-gauss-mix_0""_FAILED"; echo "-----------------------------------"; fi; \
[2025-04-09T21:45:03.924Z] echo ""; echo "TEST TEARDOWN:"; \
[2025-04-09T21:45:03.924Z] echo "Nothing to be done for teardown."; \
[2025-04-09T21:45:03.924Z] } 2>&1 | tee -a "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/TestTargetResult";
[2025-04-09T21:45:03.924Z]
[2025-04-09T21:45:03.924Z] TEST SETUP:
[2025-04-09T21:45:03.924Z] Nothing to be done for setup.
[2025-04-09T21:45:03.924Z]
[2025-04-09T21:45:03.924Z] TESTING:
[2025-04-09T21:45:17.487Z] NOTE: 'gauss-mix' benchmark uses Spark local executor with 4 (out of 8) threads.
[2025-04-09T21:45:17.487Z] ====== gauss-mix (apache-spark) [default], iteration 0 started ======
[2025-04-09T21:45:17.487Z] GC before operation: completed in 43.440 ms, heap usage 64.438 MB -> 30.771 MB.
[2025-04-09T21:45:25.688Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:45:28.720Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:45:30.701Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:45:32.676Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:45:34.652Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:45:36.620Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:45:37.586Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:45:39.556Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:45:39.556Z] ====== gauss-mix (apache-spark) [default], iteration 0 completed (22435.645 ms) ======
[2025-04-09T21:45:39.556Z] ====== gauss-mix (apache-spark) [default], iteration 1 started ======
[2025-04-09T21:45:39.556Z] GC before operation: completed in 83.782 ms, heap usage 427.259 MB -> 38.893 MB.
[2025-04-09T21:45:41.520Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:45:42.477Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:45:44.455Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:45:45.419Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:45:47.389Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:45:48.345Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:45:49.301Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:45:51.268Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:45:51.268Z] ====== gauss-mix (apache-spark) [default], iteration 1 completed (11521.365 ms) ======
[2025-04-09T21:45:51.268Z] ====== gauss-mix (apache-spark) [default], iteration 2 started ======
[2025-04-09T21:45:51.268Z] GC before operation: completed in 94.250 ms, heap usage 254.427 MB -> 40.498 MB.
[2025-04-09T21:45:53.231Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:45:54.187Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:45:56.153Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:45:58.114Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:45:59.068Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:46:01.051Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:46:02.019Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:46:02.980Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:46:02.981Z] ====== gauss-mix (apache-spark) [default], iteration 2 completed (11439.292 ms) ======
[2025-04-09T21:46:02.981Z] ====== gauss-mix (apache-spark) [default], iteration 3 started ======
[2025-04-09T21:46:02.981Z] GC before operation: completed in 67.150 ms, heap usage 324.194 MB -> 42.215 MB.
[2025-04-09T21:46:03.940Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:46:04.902Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:46:06.918Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:46:07.888Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:46:08.852Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:46:09.813Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:46:11.787Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:46:12.749Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:46:12.749Z] ====== gauss-mix (apache-spark) [default], iteration 3 completed (10149.976 ms) ======
[2025-04-09T21:46:12.749Z] ====== gauss-mix (apache-spark) [default], iteration 4 started ======
[2025-04-09T21:46:12.749Z] GC before operation: completed in 112.619 ms, heap usage 505.756 MB -> 43.363 MB.
[2025-04-09T21:46:14.727Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:46:16.712Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:46:17.679Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:46:18.653Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:46:19.611Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:46:20.567Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:46:21.523Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:46:23.487Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:46:23.487Z] ====== gauss-mix (apache-spark) [default], iteration 4 completed (10105.008 ms) ======
[2025-04-09T21:46:23.487Z] ====== gauss-mix (apache-spark) [default], iteration 5 started ======
[2025-04-09T21:46:23.487Z] GC before operation: completed in 97.886 ms, heap usage 990.354 MB -> 43.335 MB.
[2025-04-09T21:46:24.445Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:46:26.418Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:46:28.381Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:46:29.342Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:46:31.308Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:46:32.272Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:46:34.258Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:46:35.217Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:46:35.217Z] ====== gauss-mix (apache-spark) [default], iteration 5 completed (12010.030 ms) ======
[2025-04-09T21:46:35.217Z] ====== gauss-mix (apache-spark) [default], iteration 6 started ======
[2025-04-09T21:46:35.217Z] GC before operation: completed in 97.872 ms, heap usage 92.632 MB -> 43.242 MB.
[2025-04-09T21:46:36.187Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:46:38.161Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:46:40.131Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:46:42.099Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:46:43.055Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:46:44.016Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:46:44.971Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:46:45.929Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:46:45.929Z] ====== gauss-mix (apache-spark) [default], iteration 6 completed (10699.963 ms) ======
[2025-04-09T21:46:45.929Z] ====== gauss-mix (apache-spark) [default], iteration 7 started ======
[2025-04-09T21:46:45.929Z] GC before operation: completed in 100.513 ms, heap usage 466.824 MB -> 43.327 MB.
[2025-04-09T21:46:48.961Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:46:49.922Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:46:51.889Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:46:52.851Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:46:54.826Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:46:55.782Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:46:57.745Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:46:58.704Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:46:58.704Z] ====== gauss-mix (apache-spark) [default], iteration 7 completed (12256.051 ms) ======
[2025-04-09T21:46:58.704Z] ====== gauss-mix (apache-spark) [default], iteration 8 started ======
[2025-04-09T21:46:58.704Z] GC before operation: completed in 73.190 ms, heap usage 243.285 MB -> 43.481 MB.
[2025-04-09T21:46:59.661Z] 21:46:58.816 WARN [block-manager-storage-async-thread-pool-34] org.apache.spark.storage.BlockManager - Asked to remove block broadcast_4019, which does not exist
[2025-04-09T21:47:00.616Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:47:01.572Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:47:02.527Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:47:03.487Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:47:04.446Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:47:05.403Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:47:06.358Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:47:07.315Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:47:07.315Z] ====== gauss-mix (apache-spark) [default], iteration 8 completed (9003.117 ms) ======
[2025-04-09T21:47:07.315Z] ====== gauss-mix (apache-spark) [default], iteration 9 started ======
[2025-04-09T21:47:07.315Z] GC before operation: completed in 85.952 ms, heap usage 733.893 MB -> 43.494 MB.
[2025-04-09T21:47:09.277Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:47:10.233Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:47:11.190Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:47:12.149Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:47:13.111Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:47:15.077Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:47:16.033Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:47:16.989Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:47:16.989Z] ====== gauss-mix (apache-spark) [default], iteration 9 completed (9299.386 ms) ======
[2025-04-09T21:47:16.989Z] ====== gauss-mix (apache-spark) [default], iteration 10 started ======
[2025-04-09T21:47:16.989Z] GC before operation: completed in 83.000 ms, heap usage 1.185 GB -> 43.520 MB.
[2025-04-09T21:47:17.945Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:47:18.901Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:47:19.859Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:47:20.817Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:47:21.774Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:47:22.735Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:47:24.702Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:47:25.657Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:47:25.657Z] ====== gauss-mix (apache-spark) [default], iteration 10 completed (8454.572 ms) ======
[2025-04-09T21:47:25.657Z] ====== gauss-mix (apache-spark) [default], iteration 11 started ======
[2025-04-09T21:47:25.657Z] GC before operation: completed in 90.481 ms, heap usage 160.507 MB -> 43.428 MB.
[2025-04-09T21:47:27.620Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:47:28.575Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:47:29.530Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:47:30.486Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:47:31.440Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:47:32.395Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:47:33.353Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:47:34.311Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:47:34.311Z] ====== gauss-mix (apache-spark) [default], iteration 11 completed (8548.557 ms) ======
[2025-04-09T21:47:34.311Z] ====== gauss-mix (apache-spark) [default], iteration 12 started ======
[2025-04-09T21:47:34.311Z] GC before operation: completed in 109.435 ms, heap usage 512.679 MB -> 43.370 MB.
[2025-04-09T21:47:35.266Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:47:36.221Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:47:37.178Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:47:38.133Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:47:39.108Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:47:40.811Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:47:41.794Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:47:42.752Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:47:42.752Z] ====== gauss-mix (apache-spark) [default], iteration 12 completed (8576.159 ms) ======
[2025-04-09T21:47:42.752Z] ====== gauss-mix (apache-spark) [default], iteration 13 started ======
[2025-04-09T21:47:42.752Z] GC before operation: completed in 115.238 ms, heap usage 444.003 MB -> 43.093 MB.
[2025-04-09T21:47:44.729Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:47:45.687Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:47:46.644Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:47:47.598Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:47:48.553Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:47:49.509Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:47:51.475Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:47:52.429Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:47:52.429Z] ====== gauss-mix (apache-spark) [default], iteration 13 completed (9088.676 ms) ======
[2025-04-09T21:47:52.429Z] ====== gauss-mix (apache-spark) [default], iteration 14 started ======
[2025-04-09T21:47:52.429Z] GC before operation: completed in 103.388 ms, heap usage 339.357 MB -> 43.427 MB.
[2025-04-09T21:47:53.383Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:47:54.337Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:47:55.292Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:47:56.251Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:47:58.222Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:47:59.182Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:48:00.140Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:48:01.100Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:48:01.100Z] ====== gauss-mix (apache-spark) [default], iteration 14 completed (9350.827 ms) ======
[2025-04-09T21:48:01.100Z] ====== gauss-mix (apache-spark) [default], iteration 15 started ======
[2025-04-09T21:48:02.065Z] GC before operation: completed in 106.874 ms, heap usage 205.355 MB -> 43.113 MB.
[2025-04-09T21:48:03.031Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:48:03.987Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:48:04.943Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:48:05.899Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:48:07.862Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:48:08.820Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:48:09.776Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:48:10.736Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:48:10.736Z] ====== gauss-mix (apache-spark) [default], iteration 15 completed (9037.452 ms) ======
[2025-04-09T21:48:10.736Z] ====== gauss-mix (apache-spark) [default], iteration 16 started ======
[2025-04-09T21:48:10.736Z] GC before operation: completed in 97.143 ms, heap usage 433.660 MB -> 43.660 MB.
[2025-04-09T21:48:12.712Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:48:12.712Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:48:13.670Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:48:15.801Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:48:15.801Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:48:16.758Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:48:18.722Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:48:19.678Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:48:19.678Z] ====== gauss-mix (apache-spark) [default], iteration 16 completed (9000.059 ms) ======
[2025-04-09T21:48:19.678Z] ====== gauss-mix (apache-spark) [default], iteration 17 started ======
[2025-04-09T21:48:19.678Z] GC before operation: completed in 103.275 ms, heap usage 62.576 MB -> 43.317 MB.
[2025-04-09T21:48:21.645Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:48:22.600Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:48:24.569Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:48:25.529Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:48:26.489Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:48:27.448Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:48:29.420Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:48:30.382Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:48:30.382Z] ====== gauss-mix (apache-spark) [default], iteration 17 completed (10189.297 ms) ======
[2025-04-09T21:48:30.382Z] ====== gauss-mix (apache-spark) [default], iteration 18 started ======
[2025-04-09T21:48:30.382Z] GC before operation: completed in 101.047 ms, heap usage 183.201 MB -> 43.017 MB.
[2025-04-09T21:48:31.344Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:48:32.303Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:48:33.264Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:48:34.223Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:48:35.183Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:48:36.142Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:48:37.106Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:48:38.066Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:48:39.029Z] ====== gauss-mix (apache-spark) [default], iteration 18 completed (8440.469 ms) ======
[2025-04-09T21:48:39.029Z] ====== gauss-mix (apache-spark) [default], iteration 19 started ======
[2025-04-09T21:48:39.029Z] GC before operation: completed in 86.680 ms, heap usage 589.661 MB -> 43.592 MB.
[2025-04-09T21:48:39.987Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:48:40.946Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:48:41.902Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:48:43.034Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:48:43.993Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:48:44.947Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:48:45.904Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:48:47.883Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:48:47.883Z] ====== gauss-mix (apache-spark) [default], iteration 19 completed (8766.341 ms) ======
[2025-04-09T21:48:47.883Z] ====== gauss-mix (apache-spark) [default], iteration 20 started ======
[2025-04-09T21:48:47.883Z] GC before operation: completed in 72.207 ms, heap usage 185.994 MB -> 43.207 MB.
[2025-04-09T21:48:48.839Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:48:49.796Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:48:50.752Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:48:51.709Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:48:52.665Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:48:53.621Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:48:54.578Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:48:55.536Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:48:55.536Z] ====== gauss-mix (apache-spark) [default], iteration 20 completed (8496.977 ms) ======
[2025-04-09T21:48:55.536Z] ====== gauss-mix (apache-spark) [default], iteration 21 started ======
[2025-04-09T21:48:56.497Z] GC before operation: completed in 70.383 ms, heap usage 652.228 MB -> 43.121 MB.
[2025-04-09T21:48:57.454Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:48:58.410Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:48:59.367Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:49:00.323Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:49:02.287Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:49:03.244Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:49:04.200Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:49:06.161Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:49:06.162Z] ====== gauss-mix (apache-spark) [default], iteration 21 completed (9914.192 ms) ======
[2025-04-09T21:49:06.162Z] ====== gauss-mix (apache-spark) [default], iteration 22 started ======
[2025-04-09T21:49:06.162Z] GC before operation: completed in 88.179 ms, heap usage 572.033 MB -> 43.463 MB.
[2025-04-09T21:49:07.117Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:49:08.074Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:49:09.029Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:49:09.985Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:49:10.941Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:49:11.897Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:49:13.865Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:49:14.821Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:49:14.821Z] ====== gauss-mix (apache-spark) [default], iteration 22 completed (8356.699 ms) ======
[2025-04-09T21:49:14.821Z] ====== gauss-mix (apache-spark) [default], iteration 23 started ======
[2025-04-09T21:49:14.821Z] GC before operation: completed in 76.570 ms, heap usage 109.323 MB -> 43.038 MB.
[2025-04-09T21:49:15.787Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:49:16.744Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:49:17.705Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:49:18.662Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:49:19.618Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:49:20.573Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:49:21.533Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:49:22.488Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:49:22.488Z] ====== gauss-mix (apache-spark) [default], iteration 23 completed (8201.212 ms) ======
[2025-04-09T21:49:22.488Z] ====== gauss-mix (apache-spark) [default], iteration 24 started ======
[2025-04-09T21:49:22.488Z] GC before operation: completed in 82.590 ms, heap usage 335.551 MB -> 43.585 MB.
[2025-04-09T21:49:24.450Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:49:25.407Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:49:26.372Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:49:27.329Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:49:28.284Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:49:29.239Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:49:30.196Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:49:31.153Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:49:31.153Z] ====== gauss-mix (apache-spark) [default], iteration 24 completed (8111.919 ms) ======
[2025-04-09T21:49:31.153Z] ====== gauss-mix (apache-spark) [default], iteration 25 started ======
[2025-04-09T21:49:31.153Z] GC before operation: completed in 84.321 ms, heap usage 564.602 MB -> 43.530 MB.
[2025-04-09T21:49:32.111Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:49:33.072Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:49:34.033Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:49:35.999Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:49:36.955Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:49:37.913Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:49:38.871Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:49:39.828Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:49:39.828Z] ====== gauss-mix (apache-spark) [default], iteration 25 completed (9086.914 ms) ======
[2025-04-09T21:49:39.828Z] ====== gauss-mix (apache-spark) [default], iteration 26 started ======
[2025-04-09T21:49:39.828Z] GC before operation: completed in 110.757 ms, heap usage 551.713 MB -> 43.188 MB.
[2025-04-09T21:49:41.789Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:49:42.745Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:49:43.702Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:49:44.659Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:49:45.619Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:49:46.583Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:49:47.545Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:49:48.511Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:49:48.511Z] ====== gauss-mix (apache-spark) [default], iteration 26 completed (8547.115 ms) ======
[2025-04-09T21:49:48.511Z] ====== gauss-mix (apache-spark) [default], iteration 27 started ======
[2025-04-09T21:49:48.511Z] GC before operation: completed in 113.347 ms, heap usage 406.205 MB -> 43.492 MB.
[2025-04-09T21:49:50.487Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:49:51.451Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:49:52.406Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:49:53.374Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:49:54.339Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:49:55.306Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:49:56.270Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:49:57.235Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:49:57.235Z] ====== gauss-mix (apache-spark) [default], iteration 27 completed (8397.735 ms) ======
[2025-04-09T21:49:57.235Z] ====== gauss-mix (apache-spark) [default], iteration 28 started ======
[2025-04-09T21:49:57.235Z] GC before operation: completed in 112.198 ms, heap usage 649.485 MB -> 43.344 MB.
[2025-04-09T21:49:59.205Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:50:00.162Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:50:02.125Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:50:03.081Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:50:04.037Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:50:04.993Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:50:05.951Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:50:06.907Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:50:06.907Z] ====== gauss-mix (apache-spark) [default], iteration 28 completed (9587.284 ms) ======
[2025-04-09T21:50:06.907Z] ====== gauss-mix (apache-spark) [default], iteration 29 started ======
[2025-04-09T21:50:06.907Z] GC before operation: completed in 108.304 ms, heap usage 783.277 MB -> 43.840 MB.
[2025-04-09T21:50:07.865Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:50:08.823Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:50:10.789Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:50:10.789Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:50:12.758Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:50:13.721Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:50:14.677Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:50:15.631Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:50:15.632Z] ====== gauss-mix (apache-spark) [default], iteration 29 completed (8376.403 ms) ======
[2025-04-09T21:50:15.632Z] ====== gauss-mix (apache-spark) [default], iteration 30 started ======
[2025-04-09T21:50:15.632Z] GC before operation: completed in 92.398 ms, heap usage 543.393 MB -> 43.589 MB.
[2025-04-09T21:50:16.588Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:50:17.544Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:50:18.499Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:50:19.456Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:50:21.420Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:50:22.378Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:50:23.334Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:50:24.288Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:50:24.288Z] ====== gauss-mix (apache-spark) [default], iteration 30 completed (8597.503 ms) ======
[2025-04-09T21:50:24.288Z] ====== gauss-mix (apache-spark) [default], iteration 31 started ======
[2025-04-09T21:50:24.288Z] GC before operation: completed in 86.568 ms, heap usage 719.852 MB -> 43.356 MB.
[2025-04-09T21:50:25.243Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:50:26.200Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:50:27.154Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:50:28.111Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:50:29.068Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:50:30.026Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:50:30.981Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:50:32.945Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:50:32.945Z] ====== gauss-mix (apache-spark) [default], iteration 31 completed (8247.740 ms) ======
[2025-04-09T21:50:32.945Z] ====== gauss-mix (apache-spark) [default], iteration 32 started ======
[2025-04-09T21:50:32.945Z] GC before operation: completed in 102.478 ms, heap usage 792.294 MB -> 43.848 MB.
[2025-04-09T21:50:33.907Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:50:34.871Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:50:35.833Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:50:36.797Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:50:37.763Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:50:38.722Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:50:39.678Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:50:40.639Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:50:40.639Z] ====== gauss-mix (apache-spark) [default], iteration 32 completed (8349.922 ms) ======
[2025-04-09T21:50:40.639Z] ====== gauss-mix (apache-spark) [default], iteration 33 started ======
[2025-04-09T21:50:40.639Z] GC before operation: completed in 85.708 ms, heap usage 361.251 MB -> 43.320 MB.
[2025-04-09T21:50:41.595Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:50:43.561Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:50:44.517Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:50:44.517Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:50:45.471Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:50:47.433Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:50:48.388Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:50:49.345Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:50:49.345Z] ====== gauss-mix (apache-spark) [default], iteration 33 completed (7923.842 ms) ======
[2025-04-09T21:50:49.345Z] ====== gauss-mix (apache-spark) [default], iteration 34 started ======
[2025-04-09T21:50:49.345Z] GC before operation: completed in 75.152 ms, heap usage 197.792 MB -> 43.017 MB.
[2025-04-09T21:50:50.301Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:50:51.258Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:50:52.213Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:50:53.170Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:50:54.127Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:50:55.085Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:50:57.048Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:50:58.003Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:50:58.003Z] ====== gauss-mix (apache-spark) [default], iteration 34 completed (8709.058 ms) ======
[2025-04-09T21:50:58.003Z] ====== gauss-mix (apache-spark) [default], iteration 35 started ======
[2025-04-09T21:50:58.003Z] GC before operation: completed in 102.719 ms, heap usage 612.838 MB -> 43.557 MB.
[2025-04-09T21:50:58.958Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:51:00.918Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:51:01.875Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:51:02.835Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:51:03.790Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:51:04.753Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:51:05.708Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:51:06.670Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:51:06.670Z] ====== gauss-mix (apache-spark) [default], iteration 35 completed (9023.270 ms) ======
[2025-04-09T21:51:06.670Z] ====== gauss-mix (apache-spark) [default], iteration 36 started ======
[2025-04-09T21:51:06.670Z] GC before operation: completed in 111.112 ms, heap usage 498.372 MB -> 43.246 MB.
[2025-04-09T21:51:08.632Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:51:09.588Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:51:10.552Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:51:11.511Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:51:12.468Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:51:13.423Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:51:14.379Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:51:15.334Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:51:15.334Z] ====== gauss-mix (apache-spark) [default], iteration 36 completed (8818.887 ms) ======
[2025-04-09T21:51:15.334Z] ====== gauss-mix (apache-spark) [default], iteration 37 started ======
[2025-04-09T21:51:16.395Z] GC before operation: completed in 94.093 ms, heap usage 212.010 MB -> 43.473 MB.
[2025-04-09T21:51:17.351Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:51:18.306Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:51:19.261Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:51:20.217Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:51:21.172Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:51:22.127Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:51:23.166Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:51:24.122Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:51:24.122Z] ====== gauss-mix (apache-spark) [default], iteration 37 completed (8437.730 ms) ======
[2025-04-09T21:51:24.122Z] ====== gauss-mix (apache-spark) [default], iteration 38 started ======
[2025-04-09T21:51:24.122Z] GC before operation: completed in 141.105 ms, heap usage 271.587 MB -> 43.439 MB.
[2025-04-09T21:51:26.085Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:51:27.040Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:51:27.995Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:51:28.951Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:51:29.910Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:51:30.870Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:51:31.854Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:51:32.808Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:51:32.808Z] ====== gauss-mix (apache-spark) [default], iteration 38 completed (8650.380 ms) ======
[2025-04-09T21:51:32.808Z] ====== gauss-mix (apache-spark) [default], iteration 39 started ======
[2025-04-09T21:51:32.809Z] GC before operation: completed in 77.896 ms, heap usage 410.196 MB -> 43.180 MB.
[2025-04-09T21:51:34.771Z] Accuracy (validation) = 0.99556 for the model trained with K = 12, maxIterations = 25, and seed = 159147643.
[2025-04-09T21:51:35.727Z] Accuracy (validation) = 0.98333 for the model trained with K = 12, maxIterations = 25, and seed = 159147644.
[2025-04-09T21:51:36.688Z] Accuracy (validation) = 0.99778 for the model trained with K = 12, maxIterations = 25, and seed = 159147645.
[2025-04-09T21:51:37.646Z] Accuracy (validation) = 0.99889 for the model trained with K = 12, maxIterations = 25, and seed = 159147646.
[2025-04-09T21:51:38.603Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147643.
[2025-04-09T21:51:39.558Z] Accuracy (validation) = 0.99778 for the model trained with K = 9, maxIterations = 30, and seed = 159147644.
[2025-04-09T21:51:40.514Z] Accuracy (validation) = 0.99889 for the model trained with K = 9, maxIterations = 30, and seed = 159147645.
[2025-04-09T21:51:41.471Z] Accuracy (validation) = 1.00000 for the model trained with K = 9, maxIterations = 30, and seed = 159147646.
[2025-04-09T21:51:41.471Z] ====== gauss-mix (apache-spark) [default], iteration 39 completed (8739.754 ms) ======
[2025-04-09T21:51:42.426Z] -----------------------------------
[2025-04-09T21:51:42.426Z] renaissance-gauss-mix_0_PASSED
[2025-04-09T21:51:42.426Z] -----------------------------------
[2025-04-09T21:51:42.426Z]
[2025-04-09T21:51:42.426Z] TEST TEARDOWN:
[2025-04-09T21:51:42.426Z] Nothing to be done for teardown.
[2025-04-09T21:51:42.426Z] renaissance-gauss-mix_0 Finish Time: Wed Apr 9 21:51:41 2025 Epoch Time (ms): 1744235501926
[2025-04-09T21:51:42.426Z]
[2025-04-09T21:51:42.426Z] ===============================================
[2025-04-09T21:51:42.426Z] Running test renaissance-log-regression_0 ...
[2025-04-09T21:51:42.426Z] ===============================================
[2025-04-09T21:51:42.426Z] renaissance-log-regression_0 Start Time: Wed Apr 9 21:51:41 2025 Epoch Time (ms): 1744235501958
[2025-04-09T21:51:42.426Z] variation: NoOptions
[2025-04-09T21:51:42.426Z] JVM_OPTIONS:
[2025-04-09T21:51:42.426Z] { \
[2025-04-09T21:51:42.426Z] echo ""; echo "TEST SETUP:"; \
[2025-04-09T21:51:42.426Z] echo "Nothing to be done for setup."; \
[2025-04-09T21:51:42.426Z] mkdir -p "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-log-regression_0"; \
[2025-04-09T21:51:42.426Z] cd "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-log-regression_0"; \
[2025-04-09T21:51:42.426Z] echo ""; echo "TESTING:"; \
[2025-04-09T21:51:42.426Z] "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image/bin/java" --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.util=ALL-UNNAMED --add-opens java.base/java.util.concurrent=ALL-UNNAMED --add-opens java.base/java.nio=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/java.lang.invoke=ALL-UNNAMED -jar "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../../jvmtest/perf/renaissance/renaissance.jar" --json ""/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-log-regression_0"/log-regression.json" log-regression; \
[2025-04-09T21:51:42.426Z] if [ $? -eq 0 ]; then echo "-----------------------------------"; echo "renaissance-log-regression_0""_PASSED"; echo "-----------------------------------"; cd /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/..; rm -f -r "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-log-regression_0"; else echo "-----------------------------------"; echo "renaissance-log-regression_0""_FAILED"; echo "-----------------------------------"; fi; \
[2025-04-09T21:51:42.426Z] echo ""; echo "TEST TEARDOWN:"; \
[2025-04-09T21:51:42.426Z] echo "Nothing to be done for teardown."; \
[2025-04-09T21:51:42.426Z] } 2>&1 | tee -a "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/TestTargetResult";
[2025-04-09T21:51:42.426Z]
[2025-04-09T21:51:42.426Z] TEST SETUP:
[2025-04-09T21:51:42.426Z] Nothing to be done for setup.
[2025-04-09T21:51:42.426Z]
[2025-04-09T21:51:42.426Z] TESTING:
[2025-04-09T21:51:55.985Z] NOTE: 'log-regression' benchmark uses Spark local executor with 8 (out of 8) threads.
[2025-04-09T21:52:01.375Z] ====== log-regression (apache-spark) [default], iteration 0 started ======
[2025-04-09T21:52:01.375Z] GC before operation: completed in 61.335 ms, heap usage 49.706 MB -> 36.591 MB.
[2025-04-09T21:52:23.541Z] ====== log-regression (apache-spark) [default], iteration 0 completed (20005.547 ms) ======
[2025-04-09T21:52:23.541Z] ====== log-regression (apache-spark) [default], iteration 1 started ======
[2025-04-09T21:52:23.541Z] GC before operation: completed in 106.206 ms, heap usage 209.630 MB -> 127.130 MB.
[2025-04-09T21:52:25.505Z] ====== log-regression (apache-spark) [default], iteration 1 completed (3611.981 ms) ======
[2025-04-09T21:52:25.505Z] ====== log-regression (apache-spark) [default], iteration 2 started ======
[2025-04-09T21:52:25.505Z] GC before operation: completed in 138.689 ms, heap usage 422.460 MB -> 128.362 MB.
[2025-04-09T21:52:28.537Z] ====== log-regression (apache-spark) [default], iteration 2 completed (3398.863 ms) ======
[2025-04-09T21:52:28.537Z] ====== log-regression (apache-spark) [default], iteration 3 started ======
[2025-04-09T21:52:28.537Z] GC before operation: completed in 110.904 ms, heap usage 452.538 MB -> 128.971 MB.
[2025-04-09T21:52:32.715Z] ====== log-regression (apache-spark) [default], iteration 3 completed (3317.942 ms) ======
[2025-04-09T21:52:32.715Z] ====== log-regression (apache-spark) [default], iteration 4 started ======
[2025-04-09T21:52:32.715Z] GC before operation: completed in 201.216 ms, heap usage 724.966 MB -> 130.177 MB.
[2025-04-09T21:52:35.744Z] ====== log-regression (apache-spark) [default], iteration 4 completed (2854.350 ms) ======
[2025-04-09T21:52:35.744Z] ====== log-regression (apache-spark) [default], iteration 5 started ======
[2025-04-09T21:52:35.744Z] GC before operation: completed in 140.392 ms, heap usage 530.495 MB -> 130.048 MB.
[2025-04-09T21:52:38.791Z] ====== log-regression (apache-spark) [default], iteration 5 completed (3106.933 ms) ======
[2025-04-09T21:52:38.791Z] ====== log-regression (apache-spark) [default], iteration 6 started ======
[2025-04-09T21:52:38.791Z] GC before operation: completed in 133.453 ms, heap usage 451.806 MB -> 130.209 MB.
[2025-04-09T21:52:41.831Z] ====== log-regression (apache-spark) [default], iteration 6 completed (2540.173 ms) ======
[2025-04-09T21:52:41.831Z] ====== log-regression (apache-spark) [default], iteration 7 started ======
[2025-04-09T21:52:41.831Z] GC before operation: completed in 169.479 ms, heap usage 358.061 MB -> 130.248 MB.
[2025-04-09T21:52:44.864Z] ====== log-regression (apache-spark) [default], iteration 7 completed (2944.811 ms) ======
[2025-04-09T21:52:44.864Z] ====== log-regression (apache-spark) [default], iteration 8 started ======
[2025-04-09T21:52:44.864Z] GC before operation: completed in 199.453 ms, heap usage 387.581 MB -> 130.594 MB.
[2025-04-09T21:52:47.889Z] ====== log-regression (apache-spark) [default], iteration 8 completed (3025.000 ms) ======
[2025-04-09T21:52:47.889Z] ====== log-regression (apache-spark) [default], iteration 9 started ======
[2025-04-09T21:52:47.889Z] GC before operation: completed in 198.772 ms, heap usage 230.750 MB -> 130.659 MB.
[2025-04-09T21:52:50.924Z] ====== log-regression (apache-spark) [default], iteration 9 completed (3273.674 ms) ======
[2025-04-09T21:52:50.924Z] ====== log-regression (apache-spark) [default], iteration 10 started ======
[2025-04-09T21:52:50.924Z] GC before operation: completed in 151.974 ms, heap usage 298.706 MB -> 131.233 MB.
[2025-04-09T21:52:53.951Z] ====== log-regression (apache-spark) [default], iteration 10 completed (2763.403 ms) ======
[2025-04-09T21:52:53.951Z] ====== log-regression (apache-spark) [default], iteration 11 started ======
[2025-04-09T21:52:53.951Z] GC before operation: completed in 154.989 ms, heap usage 373.184 MB -> 131.757 MB.
[2025-04-09T21:52:56.974Z] ====== log-regression (apache-spark) [default], iteration 11 completed (2901.888 ms) ======
[2025-04-09T21:52:56.974Z] ====== log-regression (apache-spark) [default], iteration 12 started ======
[2025-04-09T21:52:56.974Z] GC before operation: completed in 269.892 ms, heap usage 235.210 MB -> 131.791 MB.
[2025-04-09T21:53:00.007Z] ====== log-regression (apache-spark) [default], iteration 12 completed (2640.870 ms) ======
[2025-04-09T21:53:00.007Z] ====== log-regression (apache-spark) [default], iteration 13 started ======
[2025-04-09T21:53:00.007Z] GC before operation: completed in 160.131 ms, heap usage 555.484 MB -> 132.986 MB.
[2025-04-09T21:53:03.032Z] ====== log-regression (apache-spark) [default], iteration 13 completed (2739.199 ms) ======
[2025-04-09T21:53:03.032Z] ====== log-regression (apache-spark) [default], iteration 14 started ======
[2025-04-09T21:53:03.032Z] GC before operation: completed in 163.909 ms, heap usage 407.003 MB -> 133.033 MB.
[2025-04-09T21:53:06.059Z] ====== log-regression (apache-spark) [default], iteration 14 completed (2474.792 ms) ======
[2025-04-09T21:53:06.059Z] ====== log-regression (apache-spark) [default], iteration 15 started ======
[2025-04-09T21:53:06.059Z] GC before operation: completed in 232.087 ms, heap usage 416.500 MB -> 133.407 MB.
[2025-04-09T21:53:08.030Z] ====== log-regression (apache-spark) [default], iteration 15 completed (2598.032 ms) ======
[2025-04-09T21:53:08.030Z] ====== log-regression (apache-spark) [default], iteration 16 started ======
[2025-04-09T21:53:08.030Z] GC before operation: completed in 188.749 ms, heap usage 684.078 MB -> 134.342 MB.
[2025-04-09T21:53:11.075Z] ====== log-regression (apache-spark) [default], iteration 16 completed (2793.499 ms) ======
[2025-04-09T21:53:11.075Z] ====== log-regression (apache-spark) [default], iteration 17 started ======
[2025-04-09T21:53:12.034Z] GC before operation: completed in 272.193 ms, heap usage 357.330 MB -> 134.004 MB.
[2025-04-09T21:53:14.025Z] ====== log-regression (apache-spark) [default], iteration 17 completed (2013.422 ms) ======
[2025-04-09T21:53:14.025Z] ====== log-regression (apache-spark) [default], iteration 18 started ======
[2025-04-09T21:53:14.025Z] GC before operation: completed in 153.959 ms, heap usage 479.726 MB -> 134.706 MB.
[2025-04-09T21:53:15.996Z] ====== log-regression (apache-spark) [default], iteration 18 completed (2586.288 ms) ======
[2025-04-09T21:53:15.996Z] ====== log-regression (apache-spark) [default], iteration 19 started ======
[2025-04-09T21:53:16.959Z] GC before operation: completed in 195.124 ms, heap usage 376.529 MB -> 134.746 MB.
[2025-04-09T21:53:18.927Z] ====== log-regression (apache-spark) [default], iteration 19 completed (2432.083 ms) ======
[2025-04-09T21:53:19.882Z] -----------------------------------
[2025-04-09T21:53:19.882Z] renaissance-log-regression_0_PASSED
[2025-04-09T21:53:19.882Z] -----------------------------------
[2025-04-09T21:53:19.882Z]
[2025-04-09T21:53:19.882Z] TEST TEARDOWN:
[2025-04-09T21:53:19.882Z] Nothing to be done for teardown.
[2025-04-09T21:53:19.882Z] renaissance-log-regression_0 Finish Time: Wed Apr 9 21:53:19 2025 Epoch Time (ms): 1744235599198
[2025-04-09T21:53:19.882Z]
[2025-04-09T21:53:19.882Z] ===============================================
[2025-04-09T21:53:19.882Z] Running test renaissance-mnemonics_0 ...
[2025-04-09T21:53:19.882Z] ===============================================
[2025-04-09T21:53:19.882Z] renaissance-mnemonics_0 Start Time: Wed Apr 9 21:53:19 2025 Epoch Time (ms): 1744235599224
[2025-04-09T21:53:19.882Z] variation: NoOptions
[2025-04-09T21:53:19.882Z] JVM_OPTIONS:
[2025-04-09T21:53:19.882Z] { \
[2025-04-09T21:53:19.882Z] echo ""; echo "TEST SETUP:"; \
[2025-04-09T21:53:19.882Z] echo "Nothing to be done for setup."; \
[2025-04-09T21:53:19.882Z] mkdir -p "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-mnemonics_0"; \
[2025-04-09T21:53:19.882Z] cd "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-mnemonics_0"; \
[2025-04-09T21:53:19.882Z] echo ""; echo "TESTING:"; \
[2025-04-09T21:53:19.882Z] "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image/bin/java" --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.util=ALL-UNNAMED --add-opens java.base/java.util.concurrent=ALL-UNNAMED --add-opens java.base/java.nio=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/java.lang.invoke=ALL-UNNAMED -jar "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../../jvmtest/perf/renaissance/renaissance.jar" --json ""/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-mnemonics_0"/mnemonics.json" mnemonics; \
[2025-04-09T21:53:19.882Z] if [ $? -eq 0 ]; then echo "-----------------------------------"; echo "renaissance-mnemonics_0""_PASSED"; echo "-----------------------------------"; cd /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/..; rm -f -r "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-mnemonics_0"; else echo "-----------------------------------"; echo "renaissance-mnemonics_0""_FAILED"; echo "-----------------------------------"; fi; \
[2025-04-09T21:53:19.882Z] echo ""; echo "TEST TEARDOWN:"; \
[2025-04-09T21:53:19.882Z] echo "Nothing to be done for teardown."; \
[2025-04-09T21:53:19.882Z] } 2>&1 | tee -a "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/TestTargetResult";
[2025-04-09T21:53:19.882Z]
[2025-04-09T21:53:19.882Z] TEST SETUP:
[2025-04-09T21:53:19.882Z] Nothing to be done for setup.
[2025-04-09T21:53:19.882Z]
[2025-04-09T21:53:19.882Z] TESTING:
[2025-04-09T21:53:20.839Z] ====== mnemonics (functional) [default], iteration 0 started ======
[2025-04-09T21:53:20.839Z] GC before operation: completed in 34.140 ms, heap usage 5.837 MB -> 4.893 MB.
[2025-04-09T21:53:27.632Z] ====== mnemonics (functional) [default], iteration 0 completed (5339.487 ms) ======
[2025-04-09T21:53:27.632Z] ====== mnemonics (functional) [default], iteration 1 started ======
[2025-04-09T21:53:27.632Z] GC before operation: completed in 17.103 ms, heap usage 462.809 MB -> 4.970 MB.
[2025-04-09T21:53:34.354Z] ====== mnemonics (functional) [default], iteration 1 completed (7003.257 ms) ======
[2025-04-09T21:53:34.354Z] ====== mnemonics (functional) [default], iteration 2 started ======
[2025-04-09T21:53:35.311Z] GC before operation: completed in 14.565 ms, heap usage 380.591 MB -> 4.969 MB.
[2025-04-09T21:53:43.660Z] ====== mnemonics (functional) [default], iteration 2 completed (7057.918 ms) ======
[2025-04-09T21:53:43.660Z] ====== mnemonics (functional) [default], iteration 3 started ======
[2025-04-09T21:53:43.660Z] GC before operation: completed in 12.219 ms, heap usage 392.938 MB -> 4.969 MB.
[2025-04-09T21:53:50.377Z] ====== mnemonics (functional) [default], iteration 3 completed (6911.088 ms) ======
[2025-04-09T21:53:50.377Z] ====== mnemonics (functional) [default], iteration 4 started ======
[2025-04-09T21:53:50.377Z] GC before operation: completed in 11.278 ms, heap usage 147.835 MB -> 4.969 MB.
[2025-04-09T21:53:58.547Z] ====== mnemonics (functional) [default], iteration 4 completed (6905.870 ms) ======
[2025-04-09T21:53:58.547Z] ====== mnemonics (functional) [default], iteration 5 started ======
[2025-04-09T21:53:58.547Z] GC before operation: completed in 13.701 ms, heap usage 106.396 MB -> 4.969 MB.
[2025-04-09T21:54:06.726Z] ====== mnemonics (functional) [default], iteration 5 completed (6940.779 ms) ======
[2025-04-09T21:54:06.726Z] ====== mnemonics (functional) [default], iteration 6 started ======
[2025-04-09T21:54:06.726Z] GC before operation: completed in 15.193 ms, heap usage 472.601 MB -> 4.969 MB.
[2025-04-09T21:54:13.450Z] ====== mnemonics (functional) [default], iteration 6 completed (6999.438 ms) ======
[2025-04-09T21:54:13.450Z] ====== mnemonics (functional) [default], iteration 7 started ======
[2025-04-09T21:54:13.450Z] GC before operation: completed in 17.395 ms, heap usage 161.673 MB -> 4.969 MB.
[2025-04-09T21:54:21.653Z] ====== mnemonics (functional) [default], iteration 7 completed (7197.041 ms) ======
[2025-04-09T21:54:21.653Z] ====== mnemonics (functional) [default], iteration 8 started ======
[2025-04-09T21:54:21.653Z] GC before operation: completed in 20.359 ms, heap usage 153.725 MB -> 4.969 MB.
[2025-04-09T21:54:29.858Z] ====== mnemonics (functional) [default], iteration 8 completed (7159.788 ms) ======
[2025-04-09T21:54:29.858Z] ====== mnemonics (functional) [default], iteration 9 started ======
[2025-04-09T21:54:29.858Z] GC before operation: completed in 24.109 ms, heap usage 531.117 MB -> 4.969 MB.
[2025-04-09T21:54:38.061Z] ====== mnemonics (functional) [default], iteration 9 completed (7109.528 ms) ======
[2025-04-09T21:54:38.061Z] ====== mnemonics (functional) [default], iteration 10 started ======
[2025-04-09T21:54:38.061Z] GC before operation: completed in 23.994 ms, heap usage 531.963 MB -> 4.969 MB.
[2025-04-09T21:54:46.267Z] ====== mnemonics (functional) [default], iteration 10 completed (7202.003 ms) ======
[2025-04-09T21:54:46.267Z] ====== mnemonics (functional) [default], iteration 11 started ======
[2025-04-09T21:54:46.267Z] GC before operation: completed in 20.527 ms, heap usage 109.408 MB -> 4.969 MB.
[2025-04-09T21:54:53.011Z] ====== mnemonics (functional) [default], iteration 11 completed (7144.473 ms) ======
[2025-04-09T21:54:53.011Z] ====== mnemonics (functional) [default], iteration 12 started ======
[2025-04-09T21:54:53.966Z] GC before operation: completed in 18.727 ms, heap usage 506.580 MB -> 4.970 MB.
[2025-04-09T21:55:02.148Z] ====== mnemonics (functional) [default], iteration 12 completed (7132.014 ms) ======
[2025-04-09T21:55:02.148Z] ====== mnemonics (functional) [default], iteration 13 started ======
[2025-04-09T21:55:02.148Z] GC before operation: completed in 24.030 ms, heap usage 136.242 MB -> 4.970 MB.
[2025-04-09T21:55:08.886Z] ====== mnemonics (functional) [default], iteration 13 completed (7020.624 ms) ======
[2025-04-09T21:55:08.886Z] ====== mnemonics (functional) [default], iteration 14 started ======
[2025-04-09T21:55:08.886Z] GC before operation: completed in 16.054 ms, heap usage 464.405 MB -> 4.969 MB.
[2025-04-09T21:55:17.060Z] ====== mnemonics (functional) [default], iteration 14 completed (7105.494 ms) ======
[2025-04-09T21:55:17.060Z] ====== mnemonics (functional) [default], iteration 15 started ======
[2025-04-09T21:55:17.060Z] GC before operation: completed in 16.117 ms, heap usage 419.315 MB -> 4.969 MB.
[2025-04-09T21:55:25.228Z] ====== mnemonics (functional) [default], iteration 15 completed (7019.181 ms) ======
[2025-04-09T21:55:25.228Z] -----------------------------------
[2025-04-09T21:55:25.228Z] renaissance-mnemonics_0_PASSED
[2025-04-09T21:55:25.228Z] -----------------------------------
[2025-04-09T21:55:25.228Z]
[2025-04-09T21:55:25.228Z] TEST TEARDOWN:
[2025-04-09T21:55:25.228Z] Nothing to be done for teardown.
[2025-04-09T21:55:25.228Z] renaissance-mnemonics_0 Finish Time: Wed Apr 9 21:55:24 2025 Epoch Time (ms): 1744235724831
[2025-04-09T21:55:25.228Z]
[2025-04-09T21:55:25.228Z] ===============================================
[2025-04-09T21:55:25.228Z] Running test renaissance-movie-lens_0 ...
[2025-04-09T21:55:25.228Z] ===============================================
[2025-04-09T21:55:25.228Z] renaissance-movie-lens_0 Start Time: Wed Apr 9 21:55:24 2025 Epoch Time (ms): 1744235724857
[2025-04-09T21:55:25.228Z] variation: NoOptions
[2025-04-09T21:55:25.228Z] JVM_OPTIONS:
[2025-04-09T21:55:25.228Z] { \
[2025-04-09T21:55:25.228Z] echo ""; echo "TEST SETUP:"; \
[2025-04-09T21:55:25.228Z] echo "Nothing to be done for setup."; \
[2025-04-09T21:55:25.228Z] mkdir -p "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-movie-lens_0"; \
[2025-04-09T21:55:25.228Z] cd "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-movie-lens_0"; \
[2025-04-09T21:55:25.228Z] echo ""; echo "TESTING:"; \
[2025-04-09T21:55:25.228Z] "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image/bin/java" --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.util=ALL-UNNAMED --add-opens java.base/java.util.concurrent=ALL-UNNAMED --add-opens java.base/java.nio=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/java.lang.invoke=ALL-UNNAMED -jar "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../../jvmtest/perf/renaissance/renaissance.jar" --json ""/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-movie-lens_0"/movie-lens.json" movie-lens; \
[2025-04-09T21:55:25.228Z] if [ $? -eq 0 ]; then echo "-----------------------------------"; echo "renaissance-movie-lens_0""_PASSED"; echo "-----------------------------------"; cd /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/..; rm -f -r "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-movie-lens_0"; else echo "-----------------------------------"; echo "renaissance-movie-lens_0""_FAILED"; echo "-----------------------------------"; fi; \
[2025-04-09T21:55:25.228Z] echo ""; echo "TEST TEARDOWN:"; \
[2025-04-09T21:55:25.228Z] echo "Nothing to be done for teardown."; \
[2025-04-09T21:55:25.228Z] } 2>&1 | tee -a "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/TestTargetResult";
[2025-04-09T21:55:25.228Z]
[2025-04-09T21:55:25.228Z] TEST SETUP:
[2025-04-09T21:55:25.228Z] Nothing to be done for setup.
[2025-04-09T21:55:25.228Z]
[2025-04-09T21:55:25.228Z] TESTING:
[2025-04-09T21:55:38.777Z] NOTE: 'movie-lens' benchmark uses Spark local executor with 8 (out of 8) threads.
[2025-04-09T21:56:00.059Z] Got 100004 ratings from 671 users on 9066 movies.
[2025-04-09T21:56:00.059Z] Training: 60056, validation: 20285, test: 19854
[2025-04-09T21:56:00.059Z] ====== movie-lens (apache-spark) [default], iteration 0 started ======
[2025-04-09T21:56:00.059Z] GC before operation: completed in 198.448 ms, heap usage 240.488 MB -> 77.548 MB.
[2025-04-09T21:56:11.654Z] RMSE (validation) = 3.621968954548751 for the model trained with rank = 8, lambda = 5.0, and numIter = 20.
[2025-04-09T21:56:18.376Z] RMSE (validation) = 2.134092321711807 for the model trained with rank = 10, lambda = 2.0, and numIter = 20.
[2025-04-09T21:56:23.779Z] RMSE (validation) = 1.3105189674610935 for the model trained with rank = 12, lambda = 1.0, and numIter = 20.
[2025-04-09T21:56:27.957Z] RMSE (validation) = 1.004540774800519 for the model trained with rank = 8, lambda = 0.05, and numIter = 20.
[2025-04-09T21:56:30.998Z] RMSE (validation) = 1.2218330581874075 for the model trained with rank = 10, lambda = 0.01, and numIter = 10.
[2025-04-09T21:56:34.031Z] RMSE (validation) = 1.1174953766371012 for the model trained with rank = 8, lambda = 0.02, and numIter = 10.
[2025-04-09T21:56:37.056Z] RMSE (validation) = 0.9275717391338141 for the model trained with rank = 12, lambda = 0.1, and numIter = 10.
[2025-04-09T21:56:40.088Z] RMSE (validation) = 0.9001440981626694 for the model trained with rank = 8, lambda = 0.2, and numIter = 10.
[2025-04-09T21:56:40.088Z] The best model was trained with rank = 8 and lambda = 0.2, and numIter = 10, and its RMSE on the test set is 0.9073522634082535.
[2025-04-09T21:56:41.044Z] The best model improves the baseline by 14.43%.
[2025-04-09T21:56:41.044Z] Top recommended movies for user id 72:
[2025-04-09T21:56:41.044Z] 1: Land of Silence and Darkness (Land des Schweigens und der Dunkelheit) (1971) (rating: 4.674, id: 67504)
[2025-04-09T21:56:41.044Z] 2: Goat, The (1921) (rating: 4.674, id: 83318)
[2025-04-09T21:56:41.044Z] 3: Play House, The (1921) (rating: 4.674, id: 83359)
[2025-04-09T21:56:41.044Z] 4: Cops (1922) (rating: 4.674, id: 83411)
[2025-04-09T21:56:41.044Z] 5: Dear Frankie (2004) (rating: 4.256, id: 8530)
[2025-04-09T21:56:41.044Z] ====== movie-lens (apache-spark) [default], iteration 0 completed (42339.263 ms) ======
[2025-04-09T21:56:41.044Z] ====== movie-lens (apache-spark) [default], iteration 1 started ======
[2025-04-09T21:56:41.044Z] GC before operation: completed in 179.707 ms, heap usage 156.744 MB -> 95.601 MB.
[2025-04-09T21:56:46.441Z] RMSE (validation) = 3.621968954548751 for the model trained with rank = 8, lambda = 5.0, and numIter = 20.
[2025-04-09T21:56:50.614Z] RMSE (validation) = 2.134092321711807 for the model trained with rank = 10, lambda = 2.0, and numIter = 20.
[2025-04-09T21:56:54.807Z] RMSE (validation) = 1.3105189674610933 for the model trained with rank = 12, lambda = 1.0, and numIter = 20.
[2025-04-09T21:56:57.869Z] RMSE (validation) = 1.004540774800519 for the model trained with rank = 8, lambda = 0.05, and numIter = 20.
[2025-04-09T21:57:00.912Z] RMSE (validation) = 1.2218330581874075 for the model trained with rank = 10, lambda = 0.01, and numIter = 10.
[2025-04-09T21:57:02.883Z] RMSE (validation) = 1.1174953766371012 for the model trained with rank = 8, lambda = 0.02, and numIter = 10.
[2025-04-09T21:57:04.852Z] RMSE (validation) = 0.9275717391338141 for the model trained with rank = 12, lambda = 0.1, and numIter = 10.
[2025-04-09T21:57:07.884Z] RMSE (validation) = 0.9001440981626695 for the model trained with rank = 8, lambda = 0.2, and numIter = 10.
[2025-04-09T21:57:07.884Z] The best model was trained with rank = 8 and lambda = 0.2, and numIter = 10, and its RMSE on the test set is 0.9073522634082535.
[2025-04-09T21:57:07.884Z] The best model improves the baseline by 14.43%.
[2025-04-09T21:57:07.884Z] Top recommended movies for user id 72:
[2025-04-09T21:57:07.884Z] 1: Land of Silence and Darkness (Land des Schweigens und der Dunkelheit) (1971) (rating: 4.674, id: 67504)
[2025-04-09T21:57:07.884Z] 2: Goat, The (1921) (rating: 4.674, id: 83318)
[2025-04-09T21:57:07.884Z] 3: Play House, The (1921) (rating: 4.674, id: 83359)
[2025-04-09T21:57:07.884Z] 4: Cops (1922) (rating: 4.674, id: 83411)
[2025-04-09T21:57:07.884Z] 5: Dear Frankie (2004) (rating: 4.256, id: 8530)
[2025-04-09T21:57:07.884Z] ====== movie-lens (apache-spark) [default], iteration 1 completed (27184.590 ms) ======
[2025-04-09T21:57:07.884Z] ====== movie-lens (apache-spark) [default], iteration 2 started ======
[2025-04-09T21:57:08.842Z] GC before operation: completed in 196.538 ms, heap usage 1005.597 MB -> 97.417 MB.
[2025-04-09T21:57:13.030Z] RMSE (validation) = 3.621968954548751 for the model trained with rank = 8, lambda = 5.0, and numIter = 20.
[2025-04-09T21:57:17.209Z] RMSE (validation) = 2.134092321711807 for the model trained with rank = 10, lambda = 2.0, and numIter = 20.
[2025-04-09T21:57:20.249Z] RMSE (validation) = 1.3105189674610935 for the model trained with rank = 12, lambda = 1.0, and numIter = 20.
[2025-04-09T21:57:24.417Z] RMSE (validation) = 1.0045407748005193 for the model trained with rank = 8, lambda = 0.05, and numIter = 20.
[2025-04-09T21:57:26.395Z] RMSE (validation) = 1.2218330581874075 for the model trained with rank = 10, lambda = 0.01, and numIter = 10.
[2025-04-09T21:57:28.354Z] RMSE (validation) = 1.1174953766371012 for the model trained with rank = 8, lambda = 0.02, and numIter = 10.
[2025-04-09T21:57:30.324Z] RMSE (validation) = 0.9275717391338141 for the model trained with rank = 12, lambda = 0.1, and numIter = 10.
[2025-04-09T21:57:32.290Z] RMSE (validation) = 0.9001440981626694 for the model trained with rank = 8, lambda = 0.2, and numIter = 10.
[2025-04-09T21:57:33.252Z] The best model was trained with rank = 8 and lambda = 0.2, and numIter = 10, and its RMSE on the test set is 0.9073522634082535.
[2025-04-09T21:57:33.252Z] The best model improves the baseline by 14.43%.
[2025-04-09T21:57:33.252Z] Top recommended movies for user id 72:
[2025-04-09T21:57:33.252Z] 1: Land of Silence and Darkness (Land des Schweigens und der Dunkelheit) (1971) (rating: 4.674, id: 67504)
[2025-04-09T21:57:33.252Z] 2: Goat, The (1921) (rating: 4.674, id: 83318)
[2025-04-09T21:57:33.252Z] 3: Play House, The (1921) (rating: 4.674, id: 83359)
[2025-04-09T21:57:33.252Z] 4: Cops (1922) (rating: 4.674, id: 83411)
[2025-04-09T21:57:33.252Z] 5: Dear Frankie (2004) (rating: 4.256, id: 8530)
[2025-04-09T21:57:33.252Z] ====== movie-lens (apache-spark) [default], iteration 2 completed (24586.908 ms) ======
[2025-04-09T21:57:33.252Z] ====== movie-lens (apache-spark) [default], iteration 3 started ======
[2025-04-09T21:57:33.252Z] GC before operation: completed in 184.021 ms, heap usage 253.512 MB -> 95.810 MB.
[2025-04-09T21:57:37.420Z] RMSE (validation) = 3.621968954548751 for the model trained with rank = 8, lambda = 5.0, and numIter = 20.
[2025-04-09T21:57:40.447Z] RMSE (validation) = 2.134092321711807 for the model trained with rank = 10, lambda = 2.0, and numIter = 20.
[2025-04-09T21:57:43.480Z] RMSE (validation) = 1.3105189674610935 for the model trained with rank = 12, lambda = 1.0, and numIter = 20.
[2025-04-09T21:57:46.513Z] RMSE (validation) = 1.0045407748005193 for the model trained with rank = 8, lambda = 0.05, and numIter = 20.
[2025-04-09T21:57:48.484Z] RMSE (validation) = 1.2218330581874075 for the model trained with rank = 10, lambda = 0.01, and numIter = 10.
[2025-04-09T21:57:51.530Z] RMSE (validation) = 1.1174953766371012 for the model trained with rank = 8, lambda = 0.02, and numIter = 10.
[2025-04-09T21:57:53.499Z] RMSE (validation) = 0.9275717391338141 for the model trained with rank = 12, lambda = 0.1, and numIter = 10.
[2025-04-09T21:57:55.476Z] RMSE (validation) = 0.9001440981626695 for the model trained with rank = 8, lambda = 0.2, and numIter = 10.
[2025-04-09T21:57:55.476Z] The best model was trained with rank = 8 and lambda = 0.2, and numIter = 10, and its RMSE on the test set is 0.9073522634082535.
[2025-04-09T21:57:55.476Z] The best model improves the baseline by 14.43%.
[2025-04-09T21:57:56.440Z] Top recommended movies for user id 72:
[2025-04-09T21:57:56.440Z] 1: Land of Silence and Darkness (Land des Schweigens und der Dunkelheit) (1971) (rating: 4.674, id: 67504)
[2025-04-09T21:57:56.440Z] 2: Goat, The (1921) (rating: 4.674, id: 83318)
[2025-04-09T21:57:56.440Z] 3: Play House, The (1921) (rating: 4.674, id: 83359)
[2025-04-09T21:57:56.440Z] 4: Cops (1922) (rating: 4.674, id: 83411)
[2025-04-09T21:57:56.441Z] 5: Dear Frankie (2004) (rating: 4.256, id: 8530)
[2025-04-09T21:57:56.441Z] ====== movie-lens (apache-spark) [default], iteration 3 completed (22864.272 ms) ======
[2025-04-09T21:57:56.441Z] ====== movie-lens (apache-spark) [default], iteration 4 started ======
[2025-04-09T21:57:56.441Z] GC before operation: completed in 178.753 ms, heap usage 339.676 MB -> 94.674 MB.
[2025-04-09T21:58:00.610Z] RMSE (validation) = 3.6219689545487506 for the model trained with rank = 8, lambda = 5.0, and numIter = 20.
[2025-04-09T21:58:04.785Z] RMSE (validation) = 2.134092321711807 for the model trained with rank = 10, lambda = 2.0, and numIter = 20.
[2025-04-09T21:58:07.814Z] RMSE (validation) = 1.3105189674610933 for the model trained with rank = 12, lambda = 1.0, and numIter = 20.
[2025-04-09T21:58:11.992Z] RMSE (validation) = 1.0045407748005193 for the model trained with rank = 8, lambda = 0.05, and numIter = 20.
[2025-04-09T21:58:14.534Z] RMSE (validation) = 1.2218330581874075 for the model trained with rank = 10, lambda = 0.01, and numIter = 10.
[2025-04-09T21:58:16.609Z] RMSE (validation) = 1.1174953766371012 for the model trained with rank = 8, lambda = 0.02, and numIter = 10.
[2025-04-09T21:58:19.643Z] RMSE (validation) = 0.9275717391338141 for the model trained with rank = 12, lambda = 0.1, and numIter = 10.
[2025-04-09T21:58:21.605Z] RMSE (validation) = 0.9001440981626695 for the model trained with rank = 8, lambda = 0.2, and numIter = 10.
[2025-04-09T21:58:22.562Z] The best model was trained with rank = 8 and lambda = 0.2, and numIter = 10, and its RMSE on the test set is 0.9073522634082535.
[2025-04-09T21:58:22.562Z] The best model improves the baseline by 14.43%.
[2025-04-09T21:58:22.562Z] Top recommended movies for user id 72:
[2025-04-09T21:58:22.562Z] 1: Land of Silence and Darkness (Land des Schweigens und der Dunkelheit) (1971) (rating: 4.674, id: 67504)
[2025-04-09T21:58:22.562Z] 2: Goat, The (1921) (rating: 4.674, id: 83318)
[2025-04-09T21:58:22.562Z] 3: Play House, The (1921) (rating: 4.674, id: 83359)
[2025-04-09T21:58:22.562Z] 4: Cops (1922) (rating: 4.674, id: 83411)
[2025-04-09T21:58:22.562Z] 5: Dear Frankie (2004) (rating: 4.256, id: 8530)
[2025-04-09T21:58:22.562Z] ====== movie-lens (apache-spark) [default], iteration 4 completed (26240.192 ms) ======
[2025-04-09T21:58:22.562Z] ====== movie-lens (apache-spark) [default], iteration 5 started ======
[2025-04-09T21:58:22.562Z] GC before operation: completed in 153.217 ms, heap usage 468.848 MB -> 92.896 MB.
[2025-04-09T21:58:26.802Z] RMSE (validation) = 3.6219689545487506 for the model trained with rank = 8, lambda = 5.0, and numIter = 20.
[2025-04-09T21:58:30.978Z] RMSE (validation) = 2.134092321711807 for the model trained with rank = 10, lambda = 2.0, and numIter = 20.
[2025-04-09T21:58:35.155Z] RMSE (validation) = 1.3105189674610933 for the model trained with rank = 12, lambda = 1.0, and numIter = 20.
[2025-04-09T21:58:38.191Z] RMSE (validation) = 1.0045407748005193 for the model trained with rank = 8, lambda = 0.05, and numIter = 20.
[2025-04-09T21:58:40.153Z] RMSE (validation) = 1.2218330581874075 for the model trained with rank = 10, lambda = 0.01, and numIter = 10.
[2025-04-09T21:58:43.357Z] RMSE (validation) = 1.1174953766371012 for the model trained with rank = 8, lambda = 0.02, and numIter = 10.
[2025-04-09T21:58:45.318Z] RMSE (validation) = 0.9275717391338141 for the model trained with rank = 12, lambda = 0.1, and numIter = 10.
[2025-04-09T21:58:47.279Z] RMSE (validation) = 0.9001440981626695 for the model trained with rank = 8, lambda = 0.2, and numIter = 10.
[2025-04-09T21:58:47.279Z] The best model was trained with rank = 8 and lambda = 0.2, and numIter = 10, and its RMSE on the test set is 0.9073522634082535.
[2025-04-09T21:58:47.279Z] The best model improves the baseline by 14.43%.
[2025-04-09T21:58:47.279Z] Top recommended movies for user id 72:
[2025-04-09T21:58:47.279Z] 1: Land of Silence and Darkness (Land des Schweigens und der Dunkelheit) (1971) (rating: 4.674, id: 67504)
[2025-04-09T21:58:47.279Z] 2: Goat, The (1921) (rating: 4.674, id: 83318)
[2025-04-09T21:58:47.279Z] 3: Play House, The (1921) (rating: 4.674, id: 83359)
[2025-04-09T21:58:47.279Z] 4: Cops (1922) (rating: 4.674, id: 83411)
[2025-04-09T21:58:47.279Z] 5: Dear Frankie (2004) (rating: 4.256, id: 8530)
[2025-04-09T21:58:47.279Z] ====== movie-lens (apache-spark) [default], iteration 5 completed (25051.035 ms) ======
[2025-04-09T21:58:47.279Z] ====== movie-lens (apache-spark) [default], iteration 6 started ======
[2025-04-09T21:58:48.236Z] GC before operation: completed in 164.725 ms, heap usage 1.171 GB -> 99.480 MB.
[2025-04-09T21:58:51.291Z] RMSE (validation) = 3.621968954548751 for the model trained with rank = 8, lambda = 5.0, and numIter = 20.
[2025-04-09T21:58:55.464Z] RMSE (validation) = 2.134092321711807 for the model trained with rank = 10, lambda = 2.0, and numIter = 20.
[2025-04-09T21:58:58.499Z] RMSE (validation) = 1.3105189674610933 for the model trained with rank = 12, lambda = 1.0, and numIter = 20.
[2025-04-09T21:59:01.530Z] RMSE (validation) = 1.0045407748005193 for the model trained with rank = 8, lambda = 0.05, and numIter = 20.
[2025-04-09T21:59:03.496Z] RMSE (validation) = 1.2218330581874075 for the model trained with rank = 10, lambda = 0.01, and numIter = 10.
[2025-04-09T21:59:05.459Z] RMSE (validation) = 1.1174953766371012 for the model trained with rank = 8, lambda = 0.02, and numIter = 10.
[2025-04-09T21:59:07.422Z] RMSE (validation) = 0.9275717391338141 for the model trained with rank = 12, lambda = 0.1, and numIter = 10.
[2025-04-09T21:59:09.392Z] RMSE (validation) = 0.9001440981626695 for the model trained with rank = 8, lambda = 0.2, and numIter = 10.
[2025-04-09T21:59:10.353Z] The best model was trained with rank = 8 and lambda = 0.2, and numIter = 10, and its RMSE on the test set is 0.9073522634082535.
[2025-04-09T21:59:10.353Z] The best model improves the baseline by 14.43%.
[2025-04-09T21:59:10.353Z] Top recommended movies for user id 72:
[2025-04-09T21:59:10.353Z] 1: Land of Silence and Darkness (Land des Schweigens und der Dunkelheit) (1971) (rating: 4.674, id: 67504)
[2025-04-09T21:59:10.353Z] 2: Goat, The (1921) (rating: 4.674, id: 83318)
[2025-04-09T21:59:10.353Z] 3: Play House, The (1921) (rating: 4.674, id: 83359)
[2025-04-09T21:59:10.353Z] 4: Cops (1922) (rating: 4.674, id: 83411)
[2025-04-09T21:59:10.353Z] 5: Dear Frankie (2004) (rating: 4.256, id: 8530)
[2025-04-09T21:59:10.353Z] ====== movie-lens (apache-spark) [default], iteration 6 completed (22606.872 ms) ======
[2025-04-09T21:59:10.353Z] ====== movie-lens (apache-spark) [default], iteration 7 started ======
[2025-04-09T21:59:10.353Z] GC before operation: completed in 148.612 ms, heap usage 361.413 MB -> 95.344 MB.
[2025-04-09T21:59:14.545Z] RMSE (validation) = 3.6219689545487506 for the model trained with rank = 8, lambda = 5.0, and numIter = 20.
[2025-04-09T21:59:17.590Z] RMSE (validation) = 2.134092321711807 for the model trained with rank = 10, lambda = 2.0, and numIter = 20.
[2025-04-09T21:59:20.628Z] RMSE (validation) = 1.3105189674610933 for the model trained with rank = 12, lambda = 1.0, and numIter = 20.
[2025-04-09T21:59:23.671Z] RMSE (validation) = 1.0045407748005193 for the model trained with rank = 8, lambda = 0.05, and numIter = 20.
[2025-04-09T21:59:25.632Z] RMSE (validation) = 1.2218330581874075 for the model trained with rank = 10, lambda = 0.01, and numIter = 10.
[2025-04-09T21:59:27.597Z] RMSE (validation) = 1.1174953766371012 for the model trained with rank = 8, lambda = 0.02, and numIter = 10.
[2025-04-09T21:59:29.571Z] RMSE (validation) = 0.9275717391338141 for the model trained with rank = 12, lambda = 0.1, and numIter = 10.
[2025-04-09T21:59:31.535Z] RMSE (validation) = 0.9001440981626695 for the model trained with rank = 8, lambda = 0.2, and numIter = 10.
[2025-04-09T21:59:32.495Z] The best model was trained with rank = 8 and lambda = 0.2, and numIter = 10, and its RMSE on the test set is 0.9073522634082535.
[2025-04-09T21:59:32.495Z] The best model improves the baseline by 14.43%.
[2025-04-09T21:59:32.495Z] Top recommended movies for user id 72:
[2025-04-09T21:59:32.495Z] 1: Land of Silence and Darkness (Land des Schweigens und der Dunkelheit) (1971) (rating: 4.674, id: 67504)
[2025-04-09T21:59:32.495Z] 2: Goat, The (1921) (rating: 4.674, id: 83318)
[2025-04-09T21:59:32.495Z] 3: Play House, The (1921) (rating: 4.674, id: 83359)
[2025-04-09T21:59:32.495Z] 4: Cops (1922) (rating: 4.674, id: 83411)
[2025-04-09T21:59:32.495Z] 5: Dear Frankie (2004) (rating: 4.256, id: 8530)
[2025-04-09T21:59:32.495Z] ====== movie-lens (apache-spark) [default], iteration 7 completed (22129.746 ms) ======
[2025-04-09T21:59:32.495Z] ====== movie-lens (apache-spark) [default], iteration 8 started ======
[2025-04-09T21:59:32.495Z] GC before operation: completed in 162.021 ms, heap usage 1.261 GB -> 98.016 MB.
[2025-04-09T21:59:36.664Z] RMSE (validation) = 3.621968954548751 for the model trained with rank = 8, lambda = 5.0, and numIter = 20.
[2025-04-09T21:59:40.846Z] RMSE (validation) = 2.134092321711807 for the model trained with rank = 10, lambda = 2.0, and numIter = 20.
[2025-04-09T21:59:45.014Z] RMSE (validation) = 1.3105189674610933 for the model trained with rank = 12, lambda = 1.0, and numIter = 20.
[2025-04-09T21:59:48.058Z] RMSE (validation) = 1.004540774800519 for the model trained with rank = 8, lambda = 0.05, and numIter = 20.
[2025-04-09T21:59:50.035Z] RMSE (validation) = 1.2218330581874075 for the model trained with rank = 10, lambda = 0.01, and numIter = 10.
[2025-04-09T21:59:53.084Z] RMSE (validation) = 1.1174953766371012 for the model trained with rank = 8, lambda = 0.02, and numIter = 10.
[2025-04-09T21:59:55.053Z] RMSE (validation) = 0.9275717391338142 for the model trained with rank = 12, lambda = 0.1, and numIter = 10.
[2025-04-09T21:59:57.026Z] RMSE (validation) = 0.9001440981626695 for the model trained with rank = 8, lambda = 0.2, and numIter = 10.
[2025-04-09T21:59:57.995Z] The best model was trained with rank = 8 and lambda = 0.2, and numIter = 10, and its RMSE on the test set is 0.9073522634082535.
[2025-04-09T21:59:57.995Z] The best model improves the baseline by 14.43%.
[2025-04-09T21:59:57.995Z] Top recommended movies for user id 72:
[2025-04-09T21:59:57.995Z] 1: Land of Silence and Darkness (Land des Schweigens und der Dunkelheit) (1971) (rating: 4.674, id: 67504)
[2025-04-09T21:59:57.995Z] 2: Goat, The (1921) (rating: 4.674, id: 83318)
[2025-04-09T21:59:57.995Z] 3: Play House, The (1921) (rating: 4.674, id: 83359)
[2025-04-09T21:59:57.995Z] 4: Cops (1922) (rating: 4.674, id: 83411)
[2025-04-09T21:59:57.995Z] 5: Dear Frankie (2004) (rating: 4.256, id: 8530)
[2025-04-09T21:59:57.995Z] ====== movie-lens (apache-spark) [default], iteration 8 completed (24899.372 ms) ======
[2025-04-09T21:59:57.995Z] ====== movie-lens (apache-spark) [default], iteration 9 started ======
[2025-04-09T21:59:57.995Z] GC before operation: completed in 164.517 ms, heap usage 1.253 GB -> 97.678 MB.
[2025-04-09T22:00:02.176Z] RMSE (validation) = 3.6219689545487506 for the model trained with rank = 8, lambda = 5.0, and numIter = 20.
[2025-04-09T22:00:06.372Z] RMSE (validation) = 2.134092321711807 for the model trained with rank = 10, lambda = 2.0, and numIter = 20.
[2025-04-09T22:00:09.412Z] RMSE (validation) = 1.3105189674610935 for the model trained with rank = 12, lambda = 1.0, and numIter = 20.
[2025-04-09T22:00:13.602Z] RMSE (validation) = 1.0045407748005193 for the model trained with rank = 8, lambda = 0.05, and numIter = 20.
[2025-04-09T22:00:15.569Z] RMSE (validation) = 1.2218330581874075 for the model trained with rank = 10, lambda = 0.01, and numIter = 10.
[2025-04-09T22:00:17.729Z] RMSE (validation) = 1.1174953766371012 for the model trained with rank = 8, lambda = 0.02, and numIter = 10.
[2025-04-09T22:00:19.702Z] RMSE (validation) = 0.9275717391338141 for the model trained with rank = 12, lambda = 0.1, and numIter = 10.
[2025-04-09T22:00:21.687Z] RMSE (validation) = 0.9001440981626695 for the model trained with rank = 8, lambda = 0.2, and numIter = 10.
[2025-04-09T22:00:22.647Z] The best model was trained with rank = 8 and lambda = 0.2, and numIter = 10, and its RMSE on the test set is 0.9073522634082535.
[2025-04-09T22:00:22.647Z] The best model improves the baseline by 14.43%.
[2025-04-09T22:00:22.647Z] Top recommended movies for user id 72:
[2025-04-09T22:00:22.647Z] 1: Land of Silence and Darkness (Land des Schweigens und der Dunkelheit) (1971) (rating: 4.674, id: 67504)
[2025-04-09T22:00:22.647Z] 2: Goat, The (1921) (rating: 4.674, id: 83318)
[2025-04-09T22:00:22.647Z] 3: Play House, The (1921) (rating: 4.674, id: 83359)
[2025-04-09T22:00:22.647Z] 4: Cops (1922) (rating: 4.674, id: 83411)
[2025-04-09T22:00:22.647Z] 5: Dear Frankie (2004) (rating: 4.256, id: 8530)
[2025-04-09T22:00:22.647Z] ====== movie-lens (apache-spark) [default], iteration 9 completed (24418.439 ms) ======
[2025-04-09T22:00:22.647Z] ====== movie-lens (apache-spark) [default], iteration 10 started ======
[2025-04-09T22:00:22.647Z] GC before operation: completed in 177.847 ms, heap usage 717.019 MB -> 98.243 MB.
[2025-04-09T22:00:26.852Z] RMSE (validation) = 3.621968954548751 for the model trained with rank = 8, lambda = 5.0, and numIter = 20.
[2025-04-09T22:00:29.897Z] RMSE (validation) = 2.134092321711807 for the model trained with rank = 10, lambda = 2.0, and numIter = 20.
[2025-04-09T22:00:32.940Z] RMSE (validation) = 1.3105189674610933 for the model trained with rank = 12, lambda = 1.0, and numIter = 20.
[2025-04-09T22:00:35.988Z] RMSE (validation) = 1.0045407748005193 for the model trained with rank = 8, lambda = 0.05, and numIter = 20.
[2025-04-09T22:00:37.975Z] RMSE (validation) = 1.2218330581874075 for the model trained with rank = 10, lambda = 0.01, and numIter = 10.
[2025-04-09T22:00:41.031Z] RMSE (validation) = 1.1174953766371012 for the model trained with rank = 8, lambda = 0.02, and numIter = 10.
[2025-04-09T22:00:43.010Z] RMSE (validation) = 0.9275717391338141 for the model trained with rank = 12, lambda = 0.1, and numIter = 10.
[2025-04-09T22:00:44.972Z] RMSE (validation) = 0.9001440981626695 for the model trained with rank = 8, lambda = 0.2, and numIter = 10.
[2025-04-09T22:00:45.928Z] The best model was trained with rank = 8 and lambda = 0.2, and numIter = 10, and its RMSE on the test set is 0.9073522634082535.
[2025-04-09T22:00:45.928Z] The best model improves the baseline by 14.43%.
[2025-04-09T22:00:45.928Z] Top recommended movies for user id 72:
[2025-04-09T22:00:45.928Z] 1: Land of Silence and Darkness (Land des Schweigens und der Dunkelheit) (1971) (rating: 4.674, id: 67504)
[2025-04-09T22:00:45.928Z] 2: Goat, The (1921) (rating: 4.674, id: 83318)
[2025-04-09T22:00:45.928Z] 3: Play House, The (1921) (rating: 4.674, id: 83359)
[2025-04-09T22:00:45.928Z] 4: Cops (1922) (rating: 4.674, id: 83411)
[2025-04-09T22:00:45.928Z] 5: Dear Frankie (2004) (rating: 4.256, id: 8530)
[2025-04-09T22:00:45.928Z] ====== movie-lens (apache-spark) [default], iteration 10 completed (23169.926 ms) ======
[2025-04-09T22:00:45.928Z] ====== movie-lens (apache-spark) [default], iteration 11 started ======
[2025-04-09T22:00:45.929Z] GC before operation: completed in 179.735 ms, heap usage 334.002 MB -> 95.395 MB.
[2025-04-09T22:00:50.120Z] RMSE (validation) = 3.6219689545487506 for the model trained with rank = 8, lambda = 5.0, and numIter = 20.
[2025-04-09T22:00:54.331Z] RMSE (validation) = 2.134092321711807 for the model trained with rank = 10, lambda = 2.0, and numIter = 20.
[2025-04-09T22:00:57.400Z] RMSE (validation) = 1.3105189674610933 for the model trained with rank = 12, lambda = 1.0, and numIter = 20.
[2025-04-09T22:01:00.431Z] RMSE (validation) = 1.004540774800519 for the model trained with rank = 8, lambda = 0.05, and numIter = 20.
[2025-04-09T22:01:03.483Z] RMSE (validation) = 1.2218330581874075 for the model trained with rank = 10, lambda = 0.01, and numIter = 10.
[2025-04-09T22:01:05.452Z] RMSE (validation) = 1.1174953766371012 for the model trained with rank = 8, lambda = 0.02, and numIter = 10.
[2025-04-09T22:01:07.414Z] RMSE (validation) = 0.9275717391338141 for the model trained with rank = 12, lambda = 0.1, and numIter = 10.
[2025-04-09T22:01:09.374Z] RMSE (validation) = 0.9001440981626694 for the model trained with rank = 8, lambda = 0.2, and numIter = 10.
[2025-04-09T22:01:09.374Z] The best model was trained with rank = 8 and lambda = 0.2, and numIter = 10, and its RMSE on the test set is 0.9073522634082535.
[2025-04-09T22:01:09.374Z] The best model improves the baseline by 14.43%.
[2025-04-09T22:01:09.374Z] Top recommended movies for user id 72:
[2025-04-09T22:01:09.374Z] 1: Land of Silence and Darkness (Land des Schweigens und der Dunkelheit) (1971) (rating: 4.674, id: 67504)
[2025-04-09T22:01:09.374Z] 2: Goat, The (1921) (rating: 4.674, id: 83318)
[2025-04-09T22:01:09.374Z] 3: Play House, The (1921) (rating: 4.674, id: 83359)
[2025-04-09T22:01:09.374Z] 4: Cops (1922) (rating: 4.674, id: 83411)
[2025-04-09T22:01:09.374Z] 5: Dear Frankie (2004) (rating: 4.256, id: 8530)
[2025-04-09T22:01:09.374Z] ====== movie-lens (apache-spark) [default], iteration 11 completed (23648.041 ms) ======
[2025-04-09T22:01:09.374Z] ====== movie-lens (apache-spark) [default], iteration 12 started ======
[2025-04-09T22:01:10.331Z] GC before operation: completed in 211.935 ms, heap usage 1.019 GB -> 98.542 MB.
[2025-04-09T22:01:13.381Z] RMSE (validation) = 3.621968954548751 for the model trained with rank = 8, lambda = 5.0, and numIter = 20.
[2025-04-09T22:01:17.554Z] RMSE (validation) = 2.134092321711807 for the model trained with rank = 10, lambda = 2.0, and numIter = 20.
[2025-04-09T22:01:21.789Z] RMSE (validation) = 1.3105189674610933 for the model trained with rank = 12, lambda = 1.0, and numIter = 20.
[2025-04-09T22:01:23.758Z] RMSE (validation) = 1.004540774800519 for the model trained with rank = 8, lambda = 0.05, and numIter = 20.
[2025-04-09T22:01:25.723Z] RMSE (validation) = 1.2218330581874075 for the model trained with rank = 10, lambda = 0.01, and numIter = 10.
[2025-04-09T22:01:28.749Z] RMSE (validation) = 1.1174953766371012 for the model trained with rank = 8, lambda = 0.02, and numIter = 10.
[2025-04-09T22:01:30.712Z] RMSE (validation) = 0.9275717391338142 for the model trained with rank = 12, lambda = 0.1, and numIter = 10.
[2025-04-09T22:01:32.679Z] RMSE (validation) = 0.9001440981626695 for the model trained with rank = 8, lambda = 0.2, and numIter = 10.
[2025-04-09T22:01:32.679Z] The best model was trained with rank = 8 and lambda = 0.2, and numIter = 10, and its RMSE on the test set is 0.9073522634082535.
[2025-04-09T22:01:32.679Z] The best model improves the baseline by 14.43%.
[2025-04-09T22:01:33.636Z] Top recommended movies for user id 72:
[2025-04-09T22:01:33.636Z] 1: Land of Silence and Darkness (Land des Schweigens und der Dunkelheit) (1971) (rating: 4.674, id: 67504)
[2025-04-09T22:01:33.636Z] 2: Goat, The (1921) (rating: 4.674, id: 83318)
[2025-04-09T22:01:33.636Z] 3: Play House, The (1921) (rating: 4.674, id: 83359)
[2025-04-09T22:01:33.636Z] 4: Cops (1922) (rating: 4.674, id: 83411)
[2025-04-09T22:01:33.636Z] 5: Dear Frankie (2004) (rating: 4.256, id: 8530)
[2025-04-09T22:01:33.636Z] ====== movie-lens (apache-spark) [default], iteration 12 completed (23351.264 ms) ======
[2025-04-09T22:01:33.636Z] ====== movie-lens (apache-spark) [default], iteration 13 started ======
[2025-04-09T22:01:33.636Z] GC before operation: completed in 203.461 ms, heap usage 1.333 GB -> 100.682 MB.
[2025-04-09T22:01:36.667Z] RMSE (validation) = 3.621968954548751 for the model trained with rank = 8, lambda = 5.0, and numIter = 20.
[2025-04-09T22:01:40.855Z] RMSE (validation) = 2.134092321711807 for the model trained with rank = 10, lambda = 2.0, and numIter = 20.
[2025-04-09T22:01:43.905Z] RMSE (validation) = 1.3105189674610933 for the model trained with rank = 12, lambda = 1.0, and numIter = 20.
[2025-04-09T22:01:46.943Z] RMSE (validation) = 1.0045407748005193 for the model trained with rank = 8, lambda = 0.05, and numIter = 20.
[2025-04-09T22:01:48.919Z] RMSE (validation) = 1.2218330581874075 for the model trained with rank = 10, lambda = 0.01, and numIter = 10.
[2025-04-09T22:01:50.888Z] RMSE (validation) = 1.1174953766371012 for the model trained with rank = 8, lambda = 0.02, and numIter = 10.
[2025-04-09T22:01:52.860Z] RMSE (validation) = 0.9275717391338141 for the model trained with rank = 12, lambda = 0.1, and numIter = 10.
[2025-04-09T22:01:54.831Z] RMSE (validation) = 0.9001440981626695 for the model trained with rank = 8, lambda = 0.2, and numIter = 10.
[2025-04-09T22:01:55.789Z] The best model was trained with rank = 8 and lambda = 0.2, and numIter = 10, and its RMSE on the test set is 0.9073522634082535.
[2025-04-09T22:01:55.789Z] The best model improves the baseline by 14.43%.
[2025-04-09T22:01:55.789Z] Top recommended movies for user id 72:
[2025-04-09T22:01:55.789Z] 1: Land of Silence and Darkness (Land des Schweigens und der Dunkelheit) (1971) (rating: 4.674, id: 67504)
[2025-04-09T22:01:55.789Z] 2: Goat, The (1921) (rating: 4.674, id: 83318)
[2025-04-09T22:01:55.789Z] 3: Play House, The (1921) (rating: 4.674, id: 83359)
[2025-04-09T22:01:55.789Z] 4: Cops (1922) (rating: 4.674, id: 83411)
[2025-04-09T22:01:55.789Z] 5: Dear Frankie (2004) (rating: 4.256, id: 8530)
[2025-04-09T22:01:55.789Z] ====== movie-lens (apache-spark) [default], iteration 13 completed (22582.803 ms) ======
[2025-04-09T22:01:55.789Z] ====== movie-lens (apache-spark) [default], iteration 14 started ======
[2025-04-09T22:01:55.789Z] GC before operation: completed in 164.416 ms, heap usage 759.422 MB -> 100.213 MB.
[2025-04-09T22:01:59.951Z] RMSE (validation) = 3.6219689545487506 for the model trained with rank = 8, lambda = 5.0, and numIter = 20.
[2025-04-09T22:02:04.127Z] RMSE (validation) = 2.134092321711807 for the model trained with rank = 10, lambda = 2.0, and numIter = 20.
[2025-04-09T22:02:07.173Z] RMSE (validation) = 1.3105189674610935 for the model trained with rank = 12, lambda = 1.0, and numIter = 20.
[2025-04-09T22:02:10.216Z] RMSE (validation) = 1.004540774800519 for the model trained with rank = 8, lambda = 0.05, and numIter = 20.
[2025-04-09T22:02:13.245Z] RMSE (validation) = 1.2218330581874075 for the model trained with rank = 10, lambda = 0.01, and numIter = 10.
[2025-04-09T22:02:15.212Z] RMSE (validation) = 1.1174953766371012 for the model trained with rank = 8, lambda = 0.02, and numIter = 10.
[2025-04-09T22:02:17.187Z] RMSE (validation) = 0.927571739133814 for the model trained with rank = 12, lambda = 0.1, and numIter = 10.
[2025-04-09T22:02:19.157Z] RMSE (validation) = 0.9001440981626694 for the model trained with rank = 8, lambda = 0.2, and numIter = 10.
[2025-04-09T22:02:19.157Z] The best model was trained with rank = 8 and lambda = 0.2, and numIter = 10, and its RMSE on the test set is 0.9073522634082535.
[2025-04-09T22:02:19.157Z] The best model improves the baseline by 14.43%.
[2025-04-09T22:02:19.157Z] Top recommended movies for user id 72:
[2025-04-09T22:02:19.157Z] 1: Land of Silence and Darkness (Land des Schweigens und der Dunkelheit) (1971) (rating: 4.674, id: 67504)
[2025-04-09T22:02:19.157Z] 2: Goat, The (1921) (rating: 4.674, id: 83318)
[2025-04-09T22:02:19.157Z] 3: Play House, The (1921) (rating: 4.674, id: 83359)
[2025-04-09T22:02:19.157Z] 4: Cops (1922) (rating: 4.674, id: 83411)
[2025-04-09T22:02:19.157Z] 5: Dear Frankie (2004) (rating: 4.256, id: 8530)
[2025-04-09T22:02:19.157Z] ====== movie-lens (apache-spark) [default], iteration 14 completed (23437.526 ms) ======
[2025-04-09T22:02:19.157Z] ====== movie-lens (apache-spark) [default], iteration 15 started ======
[2025-04-09T22:02:20.116Z] GC before operation: completed in 167.772 ms, heap usage 1.467 GB -> 102.085 MB.
[2025-04-09T22:02:24.358Z] RMSE (validation) = 3.621968954548751 for the model trained with rank = 8, lambda = 5.0, and numIter = 20.
[2025-04-09T22:02:27.480Z] RMSE (validation) = 2.134092321711807 for the model trained with rank = 10, lambda = 2.0, and numIter = 20.
[2025-04-09T22:02:30.634Z] RMSE (validation) = 1.3105189674610933 for the model trained with rank = 12, lambda = 1.0, and numIter = 20.
[2025-04-09T22:02:33.844Z] RMSE (validation) = 1.0045407748005193 for the model trained with rank = 8, lambda = 0.05, and numIter = 20.
[2025-04-09T22:02:35.839Z] RMSE (validation) = 1.2218330581874075 for the model trained with rank = 10, lambda = 0.01, and numIter = 10.
[2025-04-09T22:02:39.000Z] RMSE (validation) = 1.1174953766371012 for the model trained with rank = 8, lambda = 0.02, and numIter = 10.
[2025-04-09T22:02:40.985Z] RMSE (validation) = 0.9275717391338141 for the model trained with rank = 12, lambda = 0.1, and numIter = 10.
[2025-04-09T22:02:43.004Z] RMSE (validation) = 0.9001440981626694 for the model trained with rank = 8, lambda = 0.2, and numIter = 10.
[2025-04-09T22:02:43.004Z] The best model was trained with rank = 8 and lambda = 0.2, and numIter = 10, and its RMSE on the test set is 0.9073522634082535.
[2025-04-09T22:02:43.004Z] The best model improves the baseline by 14.43%.
[2025-04-09T22:02:43.960Z] Top recommended movies for user id 72:
[2025-04-09T22:02:43.960Z] 1: Land of Silence and Darkness (Land des Schweigens und der Dunkelheit) (1971) (rating: 4.674, id: 67504)
[2025-04-09T22:02:43.960Z] 2: Goat, The (1921) (rating: 4.674, id: 83318)
[2025-04-09T22:02:43.960Z] 3: Play House, The (1921) (rating: 4.674, id: 83359)
[2025-04-09T22:02:43.960Z] 4: Cops (1922) (rating: 4.674, id: 83411)
[2025-04-09T22:02:43.960Z] 5: Dear Frankie (2004) (rating: 4.256, id: 8530)
[2025-04-09T22:02:43.960Z] ====== movie-lens (apache-spark) [default], iteration 15 completed (23752.956 ms) ======
[2025-04-09T22:02:43.960Z] ====== movie-lens (apache-spark) [default], iteration 16 started ======
[2025-04-09T22:02:43.960Z] GC before operation: completed in 160.743 ms, heap usage 1.144 GB -> 102.240 MB.
[2025-04-09T22:02:48.153Z] RMSE (validation) = 3.6219689545487506 for the model trained with rank = 8, lambda = 5.0, and numIter = 20.
[2025-04-09T22:02:51.316Z] RMSE (validation) = 2.134092321711807 for the model trained with rank = 10, lambda = 2.0, and numIter = 20.
[2025-04-09T22:02:55.584Z] RMSE (validation) = 1.3105189674610935 for the model trained with rank = 12, lambda = 1.0, and numIter = 20.
[2025-04-09T22:02:58.764Z] RMSE (validation) = 1.004540774800519 for the model trained with rank = 8, lambda = 0.05, and numIter = 20.
[2025-04-09T22:03:01.857Z] RMSE (validation) = 1.2218330581874075 for the model trained with rank = 10, lambda = 0.01, and numIter = 10.
[2025-04-09T22:03:03.919Z] RMSE (validation) = 1.1174953766371012 for the model trained with rank = 8, lambda = 0.02, and numIter = 10.
[2025-04-09T22:03:05.907Z] RMSE (validation) = 0.927571739133814 for the model trained with rank = 12, lambda = 0.1, and numIter = 10.
[2025-04-09T22:03:07.875Z] RMSE (validation) = 0.9001440981626695 for the model trained with rank = 8, lambda = 0.2, and numIter = 10.
[2025-04-09T22:03:08.830Z] The best model was trained with rank = 8 and lambda = 0.2, and numIter = 10, and its RMSE on the test set is 0.9073522634082535.
[2025-04-09T22:03:08.830Z] The best model improves the baseline by 14.43%.
[2025-04-09T22:03:08.830Z] Top recommended movies for user id 72:
[2025-04-09T22:03:08.830Z] 1: Land of Silence and Darkness (Land des Schweigens und der Dunkelheit) (1971) (rating: 4.674, id: 67504)
[2025-04-09T22:03:08.830Z] 2: Goat, The (1921) (rating: 4.674, id: 83318)
[2025-04-09T22:03:08.830Z] 3: Play House, The (1921) (rating: 4.674, id: 83359)
[2025-04-09T22:03:08.830Z] 4: Cops (1922) (rating: 4.674, id: 83411)
[2025-04-09T22:03:08.830Z] 5: Dear Frankie (2004) (rating: 4.256, id: 8530)
[2025-04-09T22:03:08.830Z] ====== movie-lens (apache-spark) [default], iteration 16 completed (25252.652 ms) ======
[2025-04-09T22:03:08.830Z] ====== movie-lens (apache-spark) [default], iteration 17 started ======
[2025-04-09T22:03:08.830Z] GC before operation: completed in 230.272 ms, heap usage 1.564 GB -> 98.639 MB.
[2025-04-09T22:03:13.025Z] RMSE (validation) = 3.6219689545487506 for the model trained with rank = 8, lambda = 5.0, and numIter = 20.
[2025-04-09T22:03:16.272Z] RMSE (validation) = 2.134092321711807 for the model trained with rank = 10, lambda = 2.0, and numIter = 20.
[2025-04-09T22:03:19.315Z] RMSE (validation) = 1.3105189674610933 for the model trained with rank = 12, lambda = 1.0, and numIter = 20.
[2025-04-09T22:03:22.938Z] RMSE (validation) = 1.0045407748005193 for the model trained with rank = 8, lambda = 0.05, and numIter = 20.
[2025-04-09T22:03:24.911Z] RMSE (validation) = 1.2218330581874075 for the model trained with rank = 10, lambda = 0.01, and numIter = 10.
[2025-04-09T22:03:26.891Z] RMSE (validation) = 1.1174953766371012 for the model trained with rank = 8, lambda = 0.02, and numIter = 10.
[2025-04-09T22:03:28.895Z] RMSE (validation) = 0.9275717391338141 for the model trained with rank = 12, lambda = 0.1, and numIter = 10.
[2025-04-09T22:03:30.891Z] RMSE (validation) = 0.9001440981626694 for the model trained with rank = 8, lambda = 0.2, and numIter = 10.
[2025-04-09T22:03:31.875Z] The best model was trained with rank = 8 and lambda = 0.2, and numIter = 10, and its RMSE on the test set is 0.9073522634082535.
[2025-04-09T22:03:31.875Z] The best model improves the baseline by 14.43%.
[2025-04-09T22:03:31.875Z] Top recommended movies for user id 72:
[2025-04-09T22:03:31.875Z] 1: Land of Silence and Darkness (Land des Schweigens und der Dunkelheit) (1971) (rating: 4.674, id: 67504)
[2025-04-09T22:03:31.875Z] 2: Goat, The (1921) (rating: 4.674, id: 83318)
[2025-04-09T22:03:31.875Z] 3: Play House, The (1921) (rating: 4.674, id: 83359)
[2025-04-09T22:03:31.875Z] 4: Cops (1922) (rating: 4.674, id: 83411)
[2025-04-09T22:03:31.875Z] 5: Dear Frankie (2004) (rating: 4.256, id: 8530)
[2025-04-09T22:03:31.875Z] ====== movie-lens (apache-spark) [default], iteration 17 completed (22591.503 ms) ======
[2025-04-09T22:03:31.875Z] ====== movie-lens (apache-spark) [default], iteration 18 started ======
[2025-04-09T22:03:31.875Z] GC before operation: completed in 184.268 ms, heap usage 302.952 MB -> 94.978 MB.
[2025-04-09T22:03:36.215Z] RMSE (validation) = 3.621968954548751 for the model trained with rank = 8, lambda = 5.0, and numIter = 20.
[2025-04-09T22:03:39.249Z] RMSE (validation) = 2.134092321711807 for the model trained with rank = 10, lambda = 2.0, and numIter = 20.
[2025-04-09T22:03:42.294Z] RMSE (validation) = 1.3105189674610933 for the model trained with rank = 12, lambda = 1.0, and numIter = 20.
[2025-04-09T22:03:45.528Z] RMSE (validation) = 1.0045407748005193 for the model trained with rank = 8, lambda = 0.05, and numIter = 20.
[2025-04-09T22:03:47.525Z] RMSE (validation) = 1.2218330581874075 for the model trained with rank = 10, lambda = 0.01, and numIter = 10.
[2025-04-09T22:03:50.580Z] RMSE (validation) = 1.1174953766371012 for the model trained with rank = 8, lambda = 0.02, and numIter = 10.
[2025-04-09T22:03:52.602Z] RMSE (validation) = 0.9275717391338141 for the model trained with rank = 12, lambda = 0.1, and numIter = 10.
[2025-04-09T22:03:54.591Z] RMSE (validation) = 0.9001440981626695 for the model trained with rank = 8, lambda = 0.2, and numIter = 10.
[2025-04-09T22:03:55.550Z] The best model was trained with rank = 8 and lambda = 0.2, and numIter = 10, and its RMSE on the test set is 0.9073522634082535.
[2025-04-09T22:03:55.550Z] The best model improves the baseline by 14.43%.
[2025-04-09T22:03:55.550Z] Top recommended movies for user id 72:
[2025-04-09T22:03:55.551Z] 1: Land of Silence and Darkness (Land des Schweigens und der Dunkelheit) (1971) (rating: 4.674, id: 67504)
[2025-04-09T22:03:55.551Z] 2: Goat, The (1921) (rating: 4.674, id: 83318)
[2025-04-09T22:03:55.551Z] 3: Play House, The (1921) (rating: 4.674, id: 83359)
[2025-04-09T22:03:55.551Z] 4: Cops (1922) (rating: 4.674, id: 83411)
[2025-04-09T22:03:55.551Z] 5: Dear Frankie (2004) (rating: 4.256, id: 8530)
[2025-04-09T22:03:55.551Z] ====== movie-lens (apache-spark) [default], iteration 18 completed (23437.485 ms) ======
[2025-04-09T22:03:55.551Z] ====== movie-lens (apache-spark) [default], iteration 19 started ======
[2025-04-09T22:03:55.551Z] GC before operation: completed in 190.065 ms, heap usage 1.472 GB -> 100.213 MB.
[2025-04-09T22:03:58.586Z] RMSE (validation) = 3.6219689545487506 for the model trained with rank = 8, lambda = 5.0, and numIter = 20.
[2025-04-09T22:04:02.769Z] RMSE (validation) = 2.134092321711807 for the model trained with rank = 10, lambda = 2.0, and numIter = 20.
[2025-04-09T22:04:05.804Z] RMSE (validation) = 1.3105189674610933 for the model trained with rank = 12, lambda = 1.0, and numIter = 20.
[2025-04-09T22:04:08.844Z] RMSE (validation) = 1.0045407748005193 for the model trained with rank = 8, lambda = 0.05, and numIter = 20.
[2025-04-09T22:04:10.830Z] RMSE (validation) = 1.2218330581874075 for the model trained with rank = 10, lambda = 0.01, and numIter = 10.
[2025-04-09T22:04:12.811Z] RMSE (validation) = 1.1174953766371012 for the model trained with rank = 8, lambda = 0.02, and numIter = 10.
[2025-04-09T22:04:14.787Z] RMSE (validation) = 0.9275717391338141 for the model trained with rank = 12, lambda = 0.1, and numIter = 10.
[2025-04-09T22:04:16.763Z] RMSE (validation) = 0.9001440981626694 for the model trained with rank = 8, lambda = 0.2, and numIter = 10.
[2025-04-09T22:04:17.724Z] The best model was trained with rank = 8 and lambda = 0.2, and numIter = 10, and its RMSE on the test set is 0.9073522634082535.
[2025-04-09T22:04:17.724Z] The best model improves the baseline by 14.43%.
[2025-04-09T22:04:17.724Z] Top recommended movies for user id 72:
[2025-04-09T22:04:17.724Z] 1: Land of Silence and Darkness (Land des Schweigens und der Dunkelheit) (1971) (rating: 4.674, id: 67504)
[2025-04-09T22:04:17.724Z] 2: Goat, The (1921) (rating: 4.674, id: 83318)
[2025-04-09T22:04:17.724Z] 3: Play House, The (1921) (rating: 4.674, id: 83359)
[2025-04-09T22:04:17.724Z] 4: Cops (1922) (rating: 4.674, id: 83411)
[2025-04-09T22:04:17.724Z] 5: Dear Frankie (2004) (rating: 4.256, id: 8530)
[2025-04-09T22:04:17.724Z] ====== movie-lens (apache-spark) [default], iteration 19 completed (22229.140 ms) ======
[2025-04-09T22:04:19.699Z] -----------------------------------
[2025-04-09T22:04:19.699Z] renaissance-movie-lens_0_PASSED
[2025-04-09T22:04:19.699Z] -----------------------------------
[2025-04-09T22:04:19.699Z]
[2025-04-09T22:04:19.699Z] TEST TEARDOWN:
[2025-04-09T22:04:19.699Z] Nothing to be done for teardown.
[2025-04-09T22:04:19.699Z] renaissance-movie-lens_0 Finish Time: Wed Apr 9 22:04:19 2025 Epoch Time (ms): 1744236259027
[2025-04-09T22:04:19.699Z]
[2025-04-09T22:04:19.699Z] ===============================================
[2025-04-09T22:04:19.699Z] Running test renaissance-par-mnemonics_0 ...
[2025-04-09T22:04:19.699Z] ===============================================
[2025-04-09T22:04:19.699Z] renaissance-par-mnemonics_0 Start Time: Wed Apr 9 22:04:19 2025 Epoch Time (ms): 1744236259059
[2025-04-09T22:04:19.699Z] variation: NoOptions
[2025-04-09T22:04:19.699Z] JVM_OPTIONS:
[2025-04-09T22:04:19.699Z] { \
[2025-04-09T22:04:19.699Z] echo ""; echo "TEST SETUP:"; \
[2025-04-09T22:04:19.699Z] echo "Nothing to be done for setup."; \
[2025-04-09T22:04:19.699Z] mkdir -p "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-par-mnemonics_0"; \
[2025-04-09T22:04:19.699Z] cd "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-par-mnemonics_0"; \
[2025-04-09T22:04:19.699Z] echo ""; echo "TESTING:"; \
[2025-04-09T22:04:19.699Z] "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image/bin/java" --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.util=ALL-UNNAMED --add-opens java.base/java.util.concurrent=ALL-UNNAMED --add-opens java.base/java.nio=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/java.lang.invoke=ALL-UNNAMED -jar "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../../jvmtest/perf/renaissance/renaissance.jar" --json ""/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-par-mnemonics_0"/par-mnemonics.json" par-mnemonics; \
[2025-04-09T22:04:19.699Z] if [ $? -eq 0 ]; then echo "-----------------------------------"; echo "renaissance-par-mnemonics_0""_PASSED"; echo "-----------------------------------"; cd /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/..; rm -f -r "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-par-mnemonics_0"; else echo "-----------------------------------"; echo "renaissance-par-mnemonics_0""_FAILED"; echo "-----------------------------------"; fi; \
[2025-04-09T22:04:19.699Z] echo ""; echo "TEST TEARDOWN:"; \
[2025-04-09T22:04:19.699Z] echo "Nothing to be done for teardown."; \
[2025-04-09T22:04:19.699Z] } 2>&1 | tee -a "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/TestTargetResult";
[2025-04-09T22:04:19.699Z]
[2025-04-09T22:04:19.699Z] TEST SETUP:
[2025-04-09T22:04:19.699Z] Nothing to be done for setup.
[2025-04-09T22:04:19.699Z]
[2025-04-09T22:04:19.699Z] TESTING:
[2025-04-09T22:04:20.655Z] ====== par-mnemonics (functional) [default], iteration 0 started ======
[2025-04-09T22:04:20.655Z] GC before operation: completed in 20.041 ms, heap usage 5.825 MB -> 4.890 MB.
[2025-04-09T22:04:26.055Z] ====== par-mnemonics (functional) [default], iteration 0 completed (4555.595 ms) ======
[2025-04-09T22:04:26.056Z] ====== par-mnemonics (functional) [default], iteration 1 started ======
[2025-04-09T22:04:26.056Z] GC before operation: completed in 18.892 ms, heap usage 261.462 MB -> 5.005 MB.
[2025-04-09T22:04:32.790Z] ====== par-mnemonics (functional) [default], iteration 1 completed (7111.435 ms) ======
[2025-04-09T22:04:32.790Z] ====== par-mnemonics (functional) [default], iteration 2 started ======
[2025-04-09T22:04:32.790Z] GC before operation: completed in 17.521 ms, heap usage 291.316 MB -> 5.003 MB.
[2025-04-09T22:04:41.018Z] ====== par-mnemonics (functional) [default], iteration 2 completed (7151.768 ms) ======
[2025-04-09T22:04:41.018Z] ====== par-mnemonics (functional) [default], iteration 3 started ======
[2025-04-09T22:04:41.018Z] GC before operation: completed in 15.639 ms, heap usage 500.499 MB -> 5.003 MB.
[2025-04-09T22:04:47.740Z] ====== par-mnemonics (functional) [default], iteration 3 completed (7114.820 ms) ======
[2025-04-09T22:04:47.740Z] ====== par-mnemonics (functional) [default], iteration 4 started ======
[2025-04-09T22:04:48.703Z] GC before operation: completed in 18.417 ms, heap usage 540.329 MB -> 5.003 MB.
[2025-04-09T22:04:55.452Z] ====== par-mnemonics (functional) [default], iteration 4 completed (6331.642 ms) ======
[2025-04-09T22:04:55.452Z] ====== par-mnemonics (functional) [default], iteration 5 started ======
[2025-04-09T22:04:55.452Z] GC before operation: completed in 18.355 ms, heap usage 335.805 MB -> 5.003 MB.
[2025-04-09T22:05:02.335Z] ====== par-mnemonics (functional) [default], iteration 5 completed (7252.474 ms) ======
[2025-04-09T22:05:02.335Z] ====== par-mnemonics (functional) [default], iteration 6 started ======
[2025-04-09T22:05:02.335Z] GC before operation: completed in 18.690 ms, heap usage 307.851 MB -> 5.004 MB.
[2025-04-09T22:05:10.667Z] ====== par-mnemonics (functional) [default], iteration 6 completed (8038.557 ms) ======
[2025-04-09T22:05:10.667Z] ====== par-mnemonics (functional) [default], iteration 7 started ======
[2025-04-09T22:05:10.667Z] GC before operation: completed in 17.019 ms, heap usage 300.382 MB -> 5.004 MB.
[2025-04-09T22:05:20.462Z] ====== par-mnemonics (functional) [default], iteration 7 completed (8497.153 ms) ======
[2025-04-09T22:05:20.462Z] ====== par-mnemonics (functional) [default], iteration 8 started ======
[2025-04-09T22:05:20.462Z] GC before operation: completed in 19.949 ms, heap usage 202.568 MB -> 5.004 MB.
[2025-04-09T22:05:27.221Z] ====== par-mnemonics (functional) [default], iteration 8 completed (6425.171 ms) ======
[2025-04-09T22:05:27.221Z] ====== par-mnemonics (functional) [default], iteration 9 started ======
[2025-04-09T22:05:27.221Z] GC before operation: completed in 16.602 ms, heap usage 520.213 MB -> 5.004 MB.
[2025-04-09T22:05:35.406Z] ====== par-mnemonics (functional) [default], iteration 9 completed (7586.917 ms) ======
[2025-04-09T22:05:35.406Z] ====== par-mnemonics (functional) [default], iteration 10 started ======
[2025-04-09T22:05:35.406Z] GC before operation: completed in 24.033 ms, heap usage 342.010 MB -> 5.004 MB.
[2025-04-09T22:05:43.588Z] ====== par-mnemonics (functional) [default], iteration 10 completed (7440.242 ms) ======
[2025-04-09T22:05:43.588Z] ====== par-mnemonics (functional) [default], iteration 11 started ======
[2025-04-09T22:05:43.588Z] GC before operation: completed in 18.195 ms, heap usage 130.297 MB -> 5.004 MB.
[2025-04-09T22:05:50.328Z] ====== par-mnemonics (functional) [default], iteration 11 completed (6916.575 ms) ======
[2025-04-09T22:05:50.328Z] ====== par-mnemonics (functional) [default], iteration 12 started ======
[2025-04-09T22:05:50.328Z] GC before operation: completed in 27.189 ms, heap usage 613.816 MB -> 5.004 MB.
[2025-04-09T22:05:58.523Z] ====== par-mnemonics (functional) [default], iteration 12 completed (7091.691 ms) ======
[2025-04-09T22:05:58.523Z] ====== par-mnemonics (functional) [default], iteration 13 started ======
[2025-04-09T22:05:58.523Z] GC before operation: completed in 17.814 ms, heap usage 557.024 MB -> 5.004 MB.
[2025-04-09T22:06:05.247Z] ====== par-mnemonics (functional) [default], iteration 13 completed (7100.968 ms) ======
[2025-04-09T22:06:05.247Z] ====== par-mnemonics (functional) [default], iteration 14 started ======
[2025-04-09T22:06:05.247Z] GC before operation: completed in 13.368 ms, heap usage 437.534 MB -> 5.004 MB.
[2025-04-09T22:06:13.508Z] ====== par-mnemonics (functional) [default], iteration 14 completed (7061.730 ms) ======
[2025-04-09T22:06:13.508Z] ====== par-mnemonics (functional) [default], iteration 15 started ======
[2025-04-09T22:06:13.508Z] GC before operation: completed in 13.055 ms, heap usage 505.453 MB -> 5.004 MB.
[2025-04-09T22:06:20.257Z] ====== par-mnemonics (functional) [default], iteration 15 completed (7169.270 ms) ======
[2025-04-09T22:06:21.216Z] -----------------------------------
[2025-04-09T22:06:21.216Z] renaissance-par-mnemonics_0_PASSED
[2025-04-09T22:06:21.216Z] -----------------------------------
[2025-04-09T22:06:21.216Z]
[2025-04-09T22:06:21.216Z] TEST TEARDOWN:
[2025-04-09T22:06:21.216Z] Nothing to be done for teardown.
[2025-04-09T22:06:21.216Z] renaissance-par-mnemonics_0 Finish Time: Wed Apr 9 22:06:20 2025 Epoch Time (ms): 1744236380174
[2025-04-09T22:06:21.216Z]
[2025-04-09T22:06:21.216Z] ===============================================
[2025-04-09T22:06:21.216Z] Running test renaissance-philosophers_0 ...
[2025-04-09T22:06:21.216Z] ===============================================
[2025-04-09T22:06:21.216Z] renaissance-philosophers_0 Start Time: Wed Apr 9 22:06:20 2025 Epoch Time (ms): 1744236380193
[2025-04-09T22:06:21.216Z] variation: NoOptions
[2025-04-09T22:06:21.216Z] JVM_OPTIONS:
[2025-04-09T22:06:21.216Z] { \
[2025-04-09T22:06:21.216Z] echo ""; echo "TEST SETUP:"; \
[2025-04-09T22:06:21.216Z] echo "Nothing to be done for setup."; \
[2025-04-09T22:06:21.216Z] mkdir -p "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-philosophers_0"; \
[2025-04-09T22:06:21.216Z] cd "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-philosophers_0"; \
[2025-04-09T22:06:21.216Z] echo ""; echo "TESTING:"; \
[2025-04-09T22:06:21.216Z] "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/jdkbinary/j2sdk-image/bin/java" --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/java.util=ALL-UNNAMED --add-opens java.base/java.util.concurrent=ALL-UNNAMED --add-opens java.base/java.nio=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/java.lang.invoke=ALL-UNNAMED -jar "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../../jvmtest/perf/renaissance/renaissance.jar" --json ""/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-philosophers_0"/philosophers.json" philosophers; \
[2025-04-09T22:06:21.216Z] if [ $? -eq 0 ]; then echo "-----------------------------------"; echo "renaissance-philosophers_0""_PASSED"; echo "-----------------------------------"; cd /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/..; rm -f -r "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/renaissance-philosophers_0"; else echo "-----------------------------------"; echo "renaissance-philosophers_0""_FAILED"; echo "-----------------------------------"; fi; \
[2025-04-09T22:06:21.216Z] echo ""; echo "TEST TEARDOWN:"; \
[2025-04-09T22:06:21.216Z] echo "Nothing to be done for teardown."; \
[2025-04-09T22:06:21.216Z] } 2>&1 | tee -a "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/TestTargetResult";
[2025-04-09T22:06:21.216Z]
[2025-04-09T22:06:21.216Z] TEST SETUP:
[2025-04-09T22:06:21.216Z] Nothing to be done for setup.
[2025-04-09T22:06:21.216Z]
[2025-04-09T22:06:21.216Z] TESTING:
[2025-04-09T22:06:22.195Z] ====== philosophers (scala) [default], iteration 0 started ======
[2025-04-09T22:06:22.195Z] GC before operation: completed in 11.568 ms, heap usage 7.385 MB -> 5.844 MB.
[2025-04-09T22:06:27.649Z] Camera thread performed 122 scans.
[2025-04-09T22:06:27.649Z] ====== philosophers (scala) [default], iteration 0 completed (5779.697 ms) ======
[2025-04-09T22:06:27.649Z] ====== philosophers (scala) [default], iteration 1 started ======
[2025-04-09T22:06:27.649Z] GC before operation: completed in 26.659 ms, heap usage 14.625 MB -> 6.209 MB.
[2025-04-09T22:06:31.828Z] Camera thread performed 122 scans.
[2025-04-09T22:06:31.828Z] ====== philosophers (scala) [default], iteration 1 completed (3628.306 ms) ======
[2025-04-09T22:06:31.828Z] ====== philosophers (scala) [default], iteration 2 started ======
[2025-04-09T22:06:31.828Z] GC before operation: completed in 21.839 ms, heap usage 43.402 MB -> 6.209 MB.
[2025-04-09T22:06:33.800Z] Camera thread performed 122 scans.
[2025-04-09T22:06:33.800Z] ====== philosophers (scala) [default], iteration 2 completed (2786.684 ms) ======
[2025-04-09T22:06:33.800Z] ====== philosophers (scala) [default], iteration 3 started ======
[2025-04-09T22:06:33.800Z] GC before operation: completed in 39.727 ms, heap usage 32.366 MB -> 6.210 MB.
[2025-04-09T22:06:36.853Z] Camera thread performed 122 scans.
[2025-04-09T22:06:36.854Z] ====== philosophers (scala) [default], iteration 3 completed (2796.853 ms) ======
[2025-04-09T22:06:36.854Z] ====== philosophers (scala) [default], iteration 4 started ======
[2025-04-09T22:06:36.854Z] GC before operation: completed in 19.706 ms, heap usage 89.758 MB -> 6.211 MB.
[2025-04-09T22:06:38.854Z] Camera thread performed 122 scans.
[2025-04-09T22:06:38.854Z] ====== philosophers (scala) [default], iteration 4 completed (2392.191 ms) ======
[2025-04-09T22:06:38.854Z] ====== philosophers (scala) [default], iteration 5 started ======
[2025-04-09T22:06:38.854Z] GC before operation: completed in 29.717 ms, heap usage 86.176 MB -> 6.211 MB.
[2025-04-09T22:06:41.893Z] Camera thread performed 122 scans.
[2025-04-09T22:06:41.893Z] ====== philosophers (scala) [default], iteration 5 completed (2073.598 ms) ======
[2025-04-09T22:06:41.893Z] ====== philosophers (scala) [default], iteration 6 started ======
[2025-04-09T22:06:41.893Z] GC before operation: completed in 29.470 ms, heap usage 107.633 MB -> 6.212 MB.
[2025-04-09T22:06:43.880Z] Camera thread performed 122 scans.
[2025-04-09T22:06:43.880Z] ====== philosophers (scala) [default], iteration 6 completed (2909.414 ms) ======
[2025-04-09T22:06:43.880Z] ====== philosophers (scala) [default], iteration 7 started ======
[2025-04-09T22:06:44.845Z] GC before operation: completed in 28.879 ms, heap usage 81.387 MB -> 6.212 MB.
[2025-04-09T22:06:47.907Z] Camera thread performed 122 scans.
[2025-04-09T22:06:47.907Z] ====== philosophers (scala) [default], iteration 7 completed (2941.022 ms) ======
[2025-04-09T22:06:47.907Z] ====== philosophers (scala) [default], iteration 8 started ======
[2025-04-09T22:06:47.907Z] GC before operation: completed in 27.191 ms, heap usage 49.223 MB -> 6.212 MB.
[2025-04-09T22:06:50.947Z] Camera thread performed 122 scans.
[2025-04-09T22:06:50.948Z] ====== philosophers (scala) [default], iteration 8 completed (3083.826 ms) ======
[2025-04-09T22:06:50.948Z] ====== philosophers (scala) [default], iteration 9 started ======
[2025-04-09T22:06:50.948Z] GC before operation: completed in 29.509 ms, heap usage 77.906 MB -> 6.213 MB.
[2025-04-09T22:06:53.993Z] Camera thread performed 122 scans.
[2025-04-09T22:06:53.993Z] ====== philosophers (scala) [default], iteration 9 completed (3753.305 ms) ======
[2025-04-09T22:06:53.993Z] ====== philosophers (scala) [default], iteration 10 started ======
[2025-04-09T22:06:53.993Z] GC before operation: completed in 20.570 ms, heap usage 39.851 MB -> 6.213 MB.
[2025-04-09T22:06:58.165Z] Camera thread performed 122 scans.
[2025-04-09T22:06:58.165Z] ====== philosophers (scala) [default], iteration 10 completed (3883.814 ms) ======
[2025-04-09T22:06:58.165Z] ====== philosophers (scala) [default], iteration 11 started ======
[2025-04-09T22:06:58.165Z] GC before operation: completed in 20.182 ms, heap usage 38.131 MB -> 6.213 MB.
[2025-04-09T22:07:01.212Z] Camera thread performed 122 scans.
[2025-04-09T22:07:01.212Z] ====== philosophers (scala) [default], iteration 11 completed (3111.718 ms) ======
[2025-04-09T22:07:01.212Z] ====== philosophers (scala) [default], iteration 12 started ======
[2025-04-09T22:07:01.212Z] GC before operation: completed in 21.216 ms, heap usage 35.288 MB -> 6.213 MB.
[2025-04-09T22:07:04.263Z] Camera thread performed 122 scans.
[2025-04-09T22:07:04.263Z] ====== philosophers (scala) [default], iteration 12 completed (2425.237 ms) ======
[2025-04-09T22:07:04.263Z] ====== philosophers (scala) [default], iteration 13 started ======
[2025-04-09T22:07:04.263Z] GC before operation: completed in 21.185 ms, heap usage 70.490 MB -> 6.213 MB.
[2025-04-09T22:07:07.332Z] Camera thread performed 122 scans.
[2025-04-09T22:07:07.332Z] ====== philosophers (scala) [default], iteration 13 completed (3497.702 ms) ======
[2025-04-09T22:07:07.332Z] ====== philosophers (scala) [default], iteration 14 started ======
[2025-04-09T22:07:07.332Z] GC before operation: completed in 24.746 ms, heap usage 64.302 MB -> 6.213 MB.
[2025-04-09T22:07:10.393Z] Camera thread performed 122 scans.
[2025-04-09T22:07:10.393Z] ====== philosophers (scala) [default], iteration 14 completed (3499.045 ms) ======
[2025-04-09T22:07:10.393Z] ====== philosophers (scala) [default], iteration 15 started ======
[2025-04-09T22:07:10.393Z] GC before operation: completed in 21.146 ms, heap usage 132.242 MB -> 6.214 MB.
[2025-04-09T22:07:14.596Z] Camera thread performed 122 scans.
[2025-04-09T22:07:14.596Z] ====== philosophers (scala) [default], iteration 15 completed (3509.038 ms) ======
[2025-04-09T22:07:14.596Z] ====== philosophers (scala) [default], iteration 16 started ======
[2025-04-09T22:07:14.596Z] GC before operation: completed in 26.232 ms, heap usage 30.842 MB -> 6.213 MB.
[2025-04-09T22:07:17.649Z] Camera thread performed 122 scans.
[2025-04-09T22:07:17.649Z] ====== philosophers (scala) [default], iteration 16 completed (3424.852 ms) ======
[2025-04-09T22:07:17.649Z] ====== philosophers (scala) [default], iteration 17 started ======
[2025-04-09T22:07:17.649Z] GC before operation: completed in 19.236 ms, heap usage 44.310 MB -> 6.214 MB.
[2025-04-09T22:07:19.626Z] Camera thread performed 122 scans.
[2025-04-09T22:07:19.626Z] ====== philosophers (scala) [default], iteration 17 completed (2112.674 ms) ======
[2025-04-09T22:07:19.626Z] ====== philosophers (scala) [default], iteration 18 started ======
[2025-04-09T22:07:19.626Z] GC before operation: completed in 21.119 ms, heap usage 45.504 MB -> 6.213 MB.
[2025-04-09T22:07:23.792Z] Camera thread performed 122 scans.
[2025-04-09T22:07:23.792Z] ====== philosophers (scala) [default], iteration 18 completed (3524.214 ms) ======
[2025-04-09T22:07:23.792Z] ====== philosophers (scala) [default], iteration 19 started ======
[2025-04-09T22:07:23.792Z] GC before operation: completed in 23.076 ms, heap usage 78.462 MB -> 6.214 MB.
[2025-04-09T22:07:26.823Z] Camera thread performed 122 scans.
[2025-04-09T22:07:26.823Z] ====== philosophers (scala) [default], iteration 19 completed (3192.125 ms) ======
[2025-04-09T22:07:26.823Z] ====== philosophers (scala) [default], iteration 20 started ======
[2025-04-09T22:07:26.823Z] GC before operation: completed in 21.344 ms, heap usage 85.050 MB -> 6.214 MB.
[2025-04-09T22:07:29.857Z] Camera thread performed 122 scans.
[2025-04-09T22:07:29.857Z] ====== philosophers (scala) [default], iteration 20 completed (2665.835 ms) ======
[2025-04-09T22:07:29.857Z] ====== philosophers (scala) [default], iteration 21 started ======
[2025-04-09T22:07:29.857Z] GC before operation: completed in 23.602 ms, heap usage 78.938 MB -> 6.214 MB.
[2025-04-09T22:07:34.056Z] Camera thread performed 122 scans.
[2025-04-09T22:07:34.056Z] ====== philosophers (scala) [default], iteration 21 completed (4074.250 ms) ======
[2025-04-09T22:07:34.056Z] ====== philosophers (scala) [default], iteration 22 started ======
[2025-04-09T22:07:34.056Z] GC before operation: completed in 25.450 ms, heap usage 9.698 MB -> 6.214 MB.
[2025-04-09T22:07:38.255Z] Camera thread performed 122 scans.
[2025-04-09T22:07:38.255Z] ====== philosophers (scala) [default], iteration 22 completed (4608.890 ms) ======
[2025-04-09T22:07:38.255Z] ====== philosophers (scala) [default], iteration 23 started ======
[2025-04-09T22:07:38.255Z] GC before operation: completed in 20.363 ms, heap usage 47.998 MB -> 6.214 MB.
[2025-04-09T22:07:42.438Z] Camera thread performed 122 scans.
[2025-04-09T22:07:42.438Z] ====== philosophers (scala) [default], iteration 23 completed (3957.989 ms) ======
[2025-04-09T22:07:42.438Z] ====== philosophers (scala) [default], iteration 24 started ======
[2025-04-09T22:07:42.438Z] GC before operation: completed in 20.701 ms, heap usage 32.093 MB -> 6.214 MB.
[2025-04-09T22:07:45.479Z] Camera thread performed 122 scans.
[2025-04-09T22:07:45.479Z] ====== philosophers (scala) [default], iteration 24 completed (3646.780 ms) ======
[2025-04-09T22:07:45.479Z] ====== philosophers (scala) [default], iteration 25 started ======
[2025-04-09T22:07:45.479Z] GC before operation: completed in 22.960 ms, heap usage 47.386 MB -> 6.214 MB.
[2025-04-09T22:07:49.646Z] Camera thread performed 122 scans.
[2025-04-09T22:07:49.646Z] ====== philosophers (scala) [default], iteration 25 completed (3953.906 ms) ======
[2025-04-09T22:07:49.646Z] ====== philosophers (scala) [default], iteration 26 started ======
[2025-04-09T22:07:49.646Z] GC before operation: completed in 23.913 ms, heap usage 70.919 MB -> 6.214 MB.
[2025-04-09T22:07:52.733Z] Camera thread performed 122 scans.
[2025-04-09T22:07:52.733Z] ====== philosophers (scala) [default], iteration 26 completed (3139.460 ms) ======
[2025-04-09T22:07:52.733Z] ====== philosophers (scala) [default], iteration 27 started ======
[2025-04-09T22:07:52.733Z] GC before operation: completed in 21.017 ms, heap usage 100.738 MB -> 6.214 MB.
[2025-04-09T22:07:56.926Z] Camera thread performed 122 scans.
[2025-04-09T22:07:56.926Z] ====== philosophers (scala) [default], iteration 27 completed (3668.380 ms) ======
[2025-04-09T22:07:56.926Z] ====== philosophers (scala) [default], iteration 28 started ======
[2025-04-09T22:07:56.926Z] GC before operation: completed in 20.059 ms, heap usage 22.167 MB -> 6.214 MB.
[2025-04-09T22:07:59.975Z] Camera thread performed 122 scans.
[2025-04-09T22:07:59.975Z] ====== philosophers (scala) [default], iteration 28 completed (3644.478 ms) ======
[2025-04-09T22:07:59.975Z] ====== philosophers (scala) [default], iteration 29 started ======
[2025-04-09T22:07:59.975Z] GC before operation: completed in 24.235 ms, heap usage 154.426 MB -> 6.214 MB.
[2025-04-09T22:08:04.260Z] Camera thread performed 122 scans.
[2025-04-09T22:08:04.260Z] ====== philosophers (scala) [default], iteration 29 completed (3712.913 ms) ======
[2025-04-09T22:08:04.260Z] -----------------------------------
[2025-04-09T22:08:04.260Z] renaissance-philosophers_0_PASSED
[2025-04-09T22:08:04.260Z] -----------------------------------
[2025-04-09T22:08:04.260Z]
[2025-04-09T22:08:04.260Z] TEST TEARDOWN:
[2025-04-09T22:08:04.260Z] Nothing to be done for teardown.
[2025-04-09T22:08:04.260Z] renaissance-philosophers_0 Finish Time: Wed Apr 9 22:08:03 2025 Epoch Time (ms): 1744236483935
[2025-04-09T22:08:04.260Z] make[4]: Leaving directory '/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf/renaissance'
[2025-04-09T22:08:04.260Z] make[3]: Leaving directory '/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/perf'
[2025-04-09T22:08:04.260Z] make[2]: Leaving directory '/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests'
[2025-04-09T22:08:04.260Z]
[2025-04-09T22:08:04.260Z]
[2025-04-09T22:08:04.260Z] All tests finished, run result summary:
[2025-04-09T22:08:04.260Z] cd "/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/scripts"; \
[2025-04-09T22:08:04.260Z] perl "resultsSum.pl" --failuremk="/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/failedtargets.mk" --resultFile="/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/TestTargetResult" --platFile="/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/resources/buildPlatformMap.properties" --diagnostic=failure --jdkVersion=17 --jdkImpl=hotspot --jdkVendor="eclipse" --spec=linux_aarch64 --buildList=perf --customTarget="" --testTarget=extended.perf --tapPath=/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG/../TKG/output_17442340591361/ --tapName=Test_openjdk17_hs_extended.perf_aarch64_linux.tap --comment=""
[2025-04-09T22:08:04.260Z]
[2025-04-09T22:08:04.260Z]
[2025-04-09T22:08:04.260Z] TEST TARGETS SUMMARY
[2025-04-09T22:08:04.260Z] ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
[2025-04-09T22:08:04.260Z] DISABLED test targets:
[2025-04-09T22:08:04.260Z] dacapo-tomcat_0
[2025-04-09T22:08:04.260Z] renaissance-db-shootout_0
[2025-04-09T22:08:04.260Z] renaissance-finagle-chirper_0
[2025-04-09T22:08:04.260Z]
[2025-04-09T22:08:04.260Z] PASSED test targets:
[2025-04-09T22:08:04.260Z] dacapo-avrora_0
[2025-04-09T22:08:04.260Z] dacapo-fop_0
[2025-04-09T22:08:04.260Z] dacapo-jython_0
[2025-04-09T22:08:04.260Z] dacapo-luindex_0
[2025-04-09T22:08:04.260Z] dacapo-pmd_0
[2025-04-09T22:08:04.260Z] dacapo-sunflow_0
[2025-04-09T22:08:04.260Z] dacapo-xalan_0
[2025-04-09T22:08:04.260Z] renaissance-chi-square_0
[2025-04-09T22:08:04.260Z] renaissance-dec-tree_0
[2025-04-09T22:08:04.260Z] renaissance-finagle-http_0
[2025-04-09T22:08:04.260Z] renaissance-gauss-mix_0
[2025-04-09T22:08:04.260Z] renaissance-log-regression_0
[2025-04-09T22:08:04.260Z] renaissance-mnemonics_0
[2025-04-09T22:08:04.260Z] renaissance-movie-lens_0
[2025-04-09T22:08:04.260Z] renaissance-par-mnemonics_0
[2025-04-09T22:08:04.260Z] renaissance-philosophers_0
[2025-04-09T22:08:04.260Z]
[2025-04-09T22:08:04.260Z] FAILED test targets:
[2025-04-09T22:08:04.260Z] renaissance-als_0
[2025-04-09T22:08:04.260Z]
[2025-04-09T22:08:04.260Z] TOTAL: 20 EXECUTED: 17 PASSED: 16 FAILED: 1 DISABLED: 3 SKIPPED: 0
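The totals line above is internally consistent: EXECUTED is the sum of PASSED and FAILED targets, and TOTAL adds the targets that never ran (DISABLED and SKIPPED). A quick check of the arithmetic as reported:

```python
# Sanity check of the TKG summary counts reported in the log above.
passed, failed, disabled, skipped = 16, 1, 3, 0
executed = passed + failed              # targets that actually ran
total = executed + disabled + skipped   # everything TKG considered
assert executed == 17
assert total == 20
```

The single failure (renaissance-als_0) is what drives the `Error 2` from make below and, later, the UNSTABLE build result.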
[2025-04-09T22:08:04.260Z] ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
[2025-04-09T22:08:04.260Z]
[2025-04-09T22:08:04.260Z] make[1]: *** [settings.mk:445: resultsSummary] Error 2
[2025-04-09T22:08:04.260Z] make[1]: Leaving directory '/home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux/aqa-tests/TKG'
[2025-04-09T22:08:04.260Z] make: *** [makefile:62: _extended.perf] Error 2
[Pipeline] sh
[2025-04-09T22:08:15.282Z] + uname
[2025-04-09T22:08:15.282Z] + [ Linux = AIX ]
[2025-04-09T22:08:15.282Z] + uname
[2025-04-09T22:08:15.282Z] + [ Linux = SunOS ]
[2025-04-09T22:08:15.282Z] + uname
[2025-04-09T22:08:15.282Z] + [ Linux = *BSD ]
[2025-04-09T22:08:15.282Z] + MAKE=make
[2025-04-09T22:08:15.282Z] + make -f ./aqa-tests/TKG/testEnv.mk testEnvTeardown
[2025-04-09T22:08:15.282Z] make: Nothing to be done for 'testEnvTeardown'.
[Pipeline] }
[2025-04-09T22:08:15.819Z] Xvfb stopping
[Pipeline] // wrap
[Pipeline] echo
[2025-04-09T22:08:17.924Z] There were test failures, set build result to UNSTABLE.
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Post)
[Pipeline] step
[2025-04-09T22:08:18.191Z] TAP Reports Processing: START
[2025-04-09T22:08:18.192Z] Looking for TAP results report in workspace using pattern: aqa-tests/TKG/**/*.tap
[2025-04-09T22:08:22.471Z] Saving reports...
[2025-04-09T22:08:24.629Z] Processing '/home/jenkins/.jenkins/jobs/Test_openjdk17_hs_extended.perf_aarch64_linux/builds/220/tap-master-files/aqa-tests/TKG/output_17442340591361/Test_openjdk17_hs_extended.perf_aarch64_linux.tap'
[2025-04-09T22:08:24.629Z] Parsing TAP test result [/home/jenkins/.jenkins/jobs/Test_openjdk17_hs_extended.perf_aarch64_linux/builds/220/tap-master-files/aqa-tests/TKG/output_17442340591361/Test_openjdk17_hs_extended.perf_aarch64_linux.tap].
[2025-04-09T22:08:24.737Z] There are failed test cases. Marking build as UNSTABLE
[2025-04-09T22:08:24.737Z] TAP Reports Processing: FINISH
[Pipeline] echo
[2025-04-09T22:08:24.748Z] Saving aqa-tests/testenv/testenv.properties file on jenkins.
[Pipeline] archiveArtifacts
[2025-04-09T22:08:24.950Z] Archiving artifacts
[2025-04-09T22:08:29.285Z] Recording fingerprints
[Pipeline] echo
[2025-04-09T22:08:29.865Z] Saving aqa-tests/TKG/**/*.tap file on jenkins.
[Pipeline] archiveArtifacts
[2025-04-09T22:08:30.051Z] Archiving artifacts
[2025-04-09T22:08:30.568Z] Recording fingerprints
[Pipeline] sh
[2025-04-09T22:08:32.718Z] + tar -cf benchmark_test_output.tar.gz ./aqa-tests/TKG/output_17442340591361
[Pipeline] echo
[2025-04-09T22:08:33.262Z] ARTIFACTORY_SERVER is not set. Saving artifacts on jenkins.
[Pipeline] archiveArtifacts
[2025-04-09T22:08:33.449Z] Archiving artifacts
[2025-04-09T22:08:33.839Z] Recording fingerprints
[Pipeline] findFiles
[Pipeline] echo
[2025-04-09T22:08:35.219Z] Tap file found: aqa-tests/TKG/output_17442340591361/Test_openjdk17_hs_extended.perf_aarch64_linux.tap...
[Pipeline] readFile
[Pipeline] echo
[2025-04-09T22:08:36.201Z] Rerun in Grinder with failed test targets: https://ci.adoptium.net/job/Grinder/parambuild/?SDK_RESOURCE=upstream&TARGET=testList+TESTLIST=renaissance-als_0&BASE_DOCKER_REGISTRY_CREDENTIAL_ID=&TEST_FLAG=&UPSTREAM_TEST_JOB_NAME=&DOCKER_REQUIRED=false&ACTIVE_NODE_TIMEOUT=5&VENDOR_TEST_DIRS=&EXTRA_DOCKER_ARGS=&TKG_OWNER_BRANCH=adoptium%3Amaster&OPENJ9_SYSTEMTEST_OWNER_BRANCH=eclipse%3Amaster&PLATFORM=aarch64_linux&GENERATE_JOBS=false&KEEP_REPORTDIR=false&PERSONAL_BUILD=false&DOCKER_REGISTRY_DIR=&RERUN_ITERATIONS=0&ADOPTOPENJDK_REPO=https%3A%2F%2Fgithub.com%2Fadoptium%2Faqa-tests.git&SETUP_JCK_RUN=false&DOCKER_REGISTRY_URL_CREDENTIAL_ID=&LABEL=&EXTRA_OPTIONS=&CUSTOMIZED_SDK_URL=&BUILD_IDENTIFIER=&JENKINS_KEY=&ADOPTOPENJDK_BRANCH=v1.0.7-release&LIGHT_WEIGHT_CHECKOUT=false&USE_JRE=false&ARTIFACTORY_SERVER=&KEEP_WORKSPACE=false&USER_CREDENTIALS_ID=&JDK_VERSION=17&DOCKER_REGISTRY_URL=&ITERATIONS=1&VENDOR_TEST_REPOS=&JDK_REPO=https%3A%2F%2Fgithub.com%2Fadoptium%2Fjdk17u&JCK_GIT_BRANCH=master&OPENJ9_BRANCH=master&OPENJ9_SHA=&JCK_GIT_REPO=&VENDOR_TEST_BRANCHES=&UPSTREAM_JOB_NAME=build-scripts%2Fjobs%2Frelease%2Fjobs%2Fjdk17u%2Fjdk17u-release-linux-aarch64-temurin&OPENJ9_REPO=https%3A%2F%2Fgithub.com%2Feclipse-openj9%2Fopenj9.git&CLOUD_PROVIDER=&CUSTOM_TARGET=&VENDOR_TEST_SHAS=&JDK_BRANCH=jdk-17.0.15%2B5_adopt&LABEL_ADDITION=&ARTIFACTORY_REPO=&ARTIFACTORY_ROOT_DIR=&UPSTREAM_TEST_JOB_NUMBER=&DOCKERIMAGE_TAG=&TEST_TIME=120&JDK_IMPL=hotspot&SSH_AGENT_CREDENTIAL=&AUTO_DETECT=true&SLACK_CHANNEL=aqavit-bot&DYNAMIC_COMPILE=false&RELATED_NODES=&ADOPTOPENJDK_SYSTEMTEST_OWNER_BRANCH=adoptium%3Amaster&APPLICATION_OPTIONS=&CUSTOMIZED_SDK_URL_CREDENTIAL_ID=eclipse_temurin_bot_email_and_token&ARCHIVE_TEST_RESULTS=false&NUM_MACHINES=&OPENJDK_SHA=&TRSS_URL=&RERUN_FAILURE=false&USE_TESTENV_PROPERTIES=true&BUILD_LIST=perf&ADDITIONAL_ARTIFACTS_REQUIRED=&UPSTREAM_JOB_NUMBER=32&STF_OWNER_BRANCH=adoptium%3Amaster&TIME_LIMIT=25&JVM_OPTIONS=&PARALLEL=None
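The Grinder rerun link above is a `parambuild` URL with the failed target and the job's configuration URL-encoded as query parameters. A sketch of how such a link can be assembled with the standard library; the parameter names are taken from the log, but the helper function and the reduced parameter set are illustrative, not the pipeline's actual code:

```python
from urllib.parse import urlencode

def grinder_rerun_url(failed_targets,
                      base="https://ci.adoptium.net/job/Grinder/parambuild/"):
    """Build a hypothetical parambuild link for rerunning failed targets."""
    params = {
        "SDK_RESOURCE": "upstream",
        # Spaces become '+' and '=' becomes '%3D' under urlencode.
        "TARGET": "testList TESTLIST=" + ",".join(failed_targets),
        "PLATFORM": "aarch64_linux",
        "JDK_VERSION": "17",
        "JDK_IMPL": "hotspot",
        "BUILD_LIST": "perf",
    }
    return base + "?" + urlencode(params)

url = grinder_rerun_url(["renaissance-als_0"])
```

`urlencode` percent-escapes the nested `TESTLIST=` assignment inside the TARGET value, which Jenkins decodes back into the parameter string when the parambuild page is opened.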
[Pipeline] junit
[2025-04-09T22:08:36.449Z] Recording test results
[2025-04-09T22:08:40.171Z] No test report files were found. Configuration error?
[2025-04-09T22:08:45.085Z] None of the test reports contained any result
[2025-04-09T22:08:45.085Z] [Checks API] No suitable checks publisher found.
[Pipeline] }
[Pipeline] // stage
[Pipeline] echo
[2025-04-09T22:08:45.199Z] PROCESSCATCH: Terminating any hung/left over test processes:
[Pipeline] sh
[2025-04-09T22:08:47.071Z] + aqa-tests/terminateTestProcesses.sh jenkins
[2025-04-09T22:08:47.071Z] Unix type machine..
[2025-04-09T22:08:47.071Z] Running on a Linux host
[2025-04-09T22:08:47.071Z] Woohoo - no rogue processes detected!
[Pipeline] cleanWs
[2025-04-09T22:08:47.964Z] [WS-CLEANUP] Deleting project workspace...
[2025-04-09T22:08:47.964Z] [WS-CLEANUP] Deferred wipeout is disabled by the job configuration...
[2025-04-09T22:08:50.106Z] [WS-CLEANUP] done
[Pipeline] sh
[2025-04-09T22:08:51.971Z] + find /tmp -name *core* -print -exec rm -f {} ;
[2025-04-09T22:08:51.971Z] + true
[Pipeline] }
[Pipeline] // timeout
[Pipeline] timeout
[2025-04-09T22:08:52.623Z] Timeout set to expire in 5 min 0 sec
[Pipeline] {
[Pipeline] }
[Pipeline] // timeout
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // stage
[Pipeline] echo
[2025-04-09T22:08:53.266Z] SETUP_LABEL: ci.role.test
[Pipeline] stage
[Pipeline] { (Parallel Tests)
[Pipeline] parallel
[2025-04-09T22:08:53.416Z] No branches to run
[Pipeline] // parallel
[Pipeline] node
[2025-04-09T22:08:53.549Z] Running on test-docker-ubuntu2404-armv7-3 in /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux
[Pipeline] {
[Pipeline] cleanWs
[2025-04-09T22:08:53.911Z] [WS-CLEANUP] Deleting project workspace...
[2025-04-09T22:08:53.911Z] [WS-CLEANUP] Deferred wipeout is disabled by the job configuration...
[2025-04-09T22:08:54.103Z] [WS-CLEANUP] done
[Pipeline] findFiles
[Pipeline] cleanWs
[2025-04-09T22:08:54.560Z] [WS-CLEANUP] Deleting project workspace...
[2025-04-09T22:08:54.560Z] [WS-CLEANUP] Deferred wipeout is disabled by the job configuration...
[2025-04-09T22:08:54.702Z] [WS-CLEANUP] done
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Rerun)
[Pipeline] echo
[2025-04-09T22:08:55.014Z] allocate a node for generating rerun job ...
[Pipeline] node
[2025-04-09T22:08:55.059Z] Running on test-docker-ubuntu2404-armv7-6 in /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux
[Pipeline] {
[Pipeline] echo
[2025-04-09T22:08:55.145Z] Generating rerun Test_openjdk17_hs_extended.perf_aarch64_linux_rerun job for running failed test(s) ...
[Pipeline] library
[2025-04-09T22:08:55.211Z] Loading library openjdk-jenkins-helper@master
[2025-04-09T22:08:55.812Z] Examining adoptium/jenkins-helper
[2025-04-09T22:08:55.812Z] Attempting to resolve master as a branch
[2025-04-09T22:08:56.029Z] Resolved master as branch master at revision 3cccb6284f500372ec96c4e58b84130fcc2582cd
[2025-04-09T22:08:56.035Z] The recommended git tool is: NONE
[2025-04-09T22:08:56.035Z] using credential eclipse_temurin_bot_email_and_token
[2025-04-09T22:08:56.054Z] Cloning the remote Git repository
[2025-04-09T22:08:56.054Z] Cloning with configured refspecs honoured and without tags
[2025-04-09T22:08:56.054Z] Cloning repository https://github.com/adoptium/jenkins-helper.git
[2025-04-09T22:08:56.054Z] > git init /home/jenkins/.jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux@libs/c31318ca553a5b48d3dae5b71a63edd0af44b52a419d47af015d2ae014de8985 # timeout=10
[2025-04-09T22:08:56.098Z] Fetching upstream changes from https://github.com/adoptium/jenkins-helper.git
[2025-04-09T22:08:56.098Z] > git --version # timeout=10
[2025-04-09T22:08:56.121Z] > git --version # 'git version 2.43.0'
[2025-04-09T22:08:56.121Z] using GIT_ASKPASS to set credentials
[2025-04-09T22:08:56.145Z] > git fetch --no-tags --force --progress -- https://github.com/adoptium/jenkins-helper.git +refs/heads/master:refs/remotes/origin/master # timeout=10
[2025-04-09T22:08:56.740Z] > git config remote.origin.url https://github.com/adoptium/jenkins-helper.git # timeout=10
[2025-04-09T22:08:56.767Z] > git config --add remote.origin.fetch +refs/heads/master:refs/remotes/origin/master # timeout=10
[2025-04-09T22:08:56.791Z] Avoid second fetch
[2025-04-09T22:08:56.791Z] Checking out Revision 3cccb6284f500372ec96c4e58b84130fcc2582cd (master)
[2025-04-09T22:08:56.792Z] > git config core.sparsecheckout # timeout=10
[2025-04-09T22:08:56.811Z] > git checkout -f 3cccb6284f500372ec96c4e58b84130fcc2582cd # timeout=10
[2025-04-09T22:08:56.839Z] Commit message: "Add semgrep differential code check action (#61)"
[2025-04-09T22:08:56.839Z] First time build. Skipping changelog.
[Pipeline] echo
[2025-04-09T22:08:59.076Z] Test_openjdk17_hs_extended.perf_aarch64_linux_rerun jobIsRunnable: false
[Pipeline] parallel
[Pipeline] { (Branch: Test_openjdk17_hs_extended.perf_aarch64_linux_rerun)
[Pipeline] echo
[2025-04-09T22:08:59.129Z] Test job Test_openjdk17_hs_extended.perf_aarch64_linux_rerun doesn't exist, set test job Test_openjdk17_hs_extended.perf_aarch64_linux_rerun params for generating the job
[Pipeline] fileExists
[Pipeline] sh
[2025-04-09T22:09:01.232Z] + curl -Os https://raw.githubusercontent.com/adoptium/aqa-tests/master/buildenv/jenkins/testJobTemplate
[Pipeline] jobDsl
[2025-04-09T22:09:02.388Z] Processing DSL script testJobTemplate
[2025-04-09T22:12:37.403Z] LEVELS: [extended]
[2025-04-09T22:12:37.403Z] JDK_VERSIONS: [8]
[2025-04-09T22:12:37.403Z] GROUPS: [perf]
[2025-04-09T22:12:37.403Z] ARCH_OS_LIST: [aarch64_linux]
[2025-04-09T22:12:37.537Z] Added items:
[2025-04-09T22:12:37.537Z] GeneratedJob{name='Test_openjdk17_hs_extended.perf_aarch64_linux_rerun'}
[Pipeline] }
[Pipeline] // parallel
[Pipeline] }
[Pipeline] // node
[Pipeline] echo
[2025-04-09T22:13:59.650Z] Triggering rerun jobs in parallel ...
[Pipeline] parallel
[Pipeline] { (Branch: Test_openjdk17_hs_extended.perf_aarch64_linux_rerun)
[Pipeline] build
[2025-04-09T22:13:59.706Z] Scheduling project: Test_openjdk17_hs_extended.perf_aarch64_linux_rerun
[2025-04-09T22:14:07.442Z] Starting building: Test_openjdk17_hs_extended.perf_aarch64_linux_rerun #1
[2025-04-09T22:21:41.115Z] Build Test_openjdk17_hs_extended.perf_aarch64_linux_rerun #1 completed: SUCCESS
[Pipeline] }
[Pipeline] // parallel
[Pipeline] node
[2025-04-09T22:21:41.298Z] Running on test-docker-ubuntu2404-armv7-3 in /home/jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux
[Pipeline] {
[Pipeline] cleanWs
[2025-04-09T22:21:41.649Z] [WS-CLEANUP] Deleting project workspace...
[2025-04-09T22:21:41.649Z] [WS-CLEANUP] Deferred wipeout is disabled by the job configuration...
[2025-04-09T22:21:41.788Z] [WS-CLEANUP] done
[Pipeline] echo
[2025-04-09T22:21:41.796Z] Test_openjdk17_hs_extended.perf_aarch64_linux_rerun #1 completed with status SUCCESS
[Pipeline] timeout
[2025-04-09T22:21:41.799Z] Timeout set to expire in 1 hr 0 min
[Pipeline] {
[Pipeline] copyArtifacts
[2025-04-09T22:21:42.784Z] Copied 1 artifact from "Test_openjdk17_hs_extended.perf_aarch64_linux_rerun" build number 1
[Pipeline] }
[Pipeline] // timeout
[Pipeline] step
[2025-04-09T22:21:43.053Z] TAP Reports Processing: START
[2025-04-09T22:21:43.053Z] Looking for TAP results report in workspace using pattern: Test_openjdk17_hs_extended.perf_aarch64_linux_rerun/1/**/*.tap
[2025-04-09T22:21:43.210Z] Saving reports...
[2025-04-09T22:21:43.383Z] Processing '/home/jenkins/.jenkins/jobs/Test_openjdk17_hs_extended.perf_aarch64_linux/builds/220/tap-master-files/Test_openjdk17_hs_extended.perf_aarch64_linux_rerun/1/aqa-tests/TKG/output_1744237114218/Test_openjdk17_hs_extended.perf_aarch64_linux_rerun.tap'
[2025-04-09T22:21:43.383Z] Parsing TAP test result [/home/jenkins/.jenkins/jobs/Test_openjdk17_hs_extended.perf_aarch64_linux/builds/220/tap-master-files/Test_openjdk17_hs_extended.perf_aarch64_linux_rerun/1/aqa-tests/TKG/output_1744237114218/Test_openjdk17_hs_extended.perf_aarch64_linux_rerun.tap].
[2025-04-09T22:21:43.431Z] TAP Reports Processing: FINISH
[Pipeline] echo
[2025-04-09T22:21:43.438Z] Saving Test_openjdk17_hs_extended.perf_aarch64_linux_rerun/1/**/*.tap file on jenkins.
[Pipeline] archiveArtifacts
[2025-04-09T22:21:43.589Z] Archiving artifacts
[2025-04-09T22:21:43.938Z] Recording fingerprints
[Pipeline] findFiles
[Pipeline] readFile
[Pipeline] cleanWs
[2025-04-09T22:21:45.375Z] [WS-CLEANUP] Deleting project workspace...
[2025-04-09T22:21:45.375Z] [WS-CLEANUP] Deferred wipeout is disabled by the job configuration...
[2025-04-09T22:21:45.535Z] [WS-CLEANUP] done
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // timestamps
[Pipeline] End of Pipeline
/home/jenkins/.jenkins/workspace/Test_openjdk17_hs_extended.perf_aarch64_linux@tmp/jfrog/220/.jfrog deleted
Finished: UNSTABLE