Hi,
I'm running into an issue. Can anyone help? Thanks in advance.
After deploying spark-standalone-cluster-on-docker (image: andreper/spark-master:3.0.0) on a server (192.XX.X.X), I tried to test it from another PC (192.XX.X.Y).
cmd steps:
$ spark-shell --master spark://192.XX.X.X:7077
val count = sc.parallelize(1 to 1000).filter { _ =>
val x = math.random
val y = math.random
x*x + y*y < 1
}.count()
I got the error below (the message repeats in an infinite loop):
WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
Build Env.
Images: andreper/spark-master:3.0.0
Docker Engine version: 20.10.14
Docker Compose version: v2.2.3
You have another Spark application already running on this cluster (e.g. a Spark session created in JupyterLab). Close that session/application.
In a standalone cluster, the default behavior is for a Spark application to grab all resources. If that is a long-lived session, it will block any other application from acquiring resources.
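One way to avoid this resource grab (a sketch, using the standard Spark properties `spark.cores.max` and `spark.executor.memory`; the values here are illustrative) is to cap how much of the cluster each application may claim when you start the shell:

```shell
# Cap this application's share of the standalone cluster so other
# applications can still be scheduled alongside it.
spark-shell --master spark://192.XX.X.X:7077 \
  --conf spark.cores.max=2 \
  --conf spark.executor.memory=1g
```

The same `--conf` flags work with `pyspark` and `spark-submit`, or can be set cluster-wide in `spark-defaults.conf`.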
@ThomasMannKenbun I think it would work if Jupyter were run in the same container as the Spark master and PySpark were started with the --master local[*] option. Tested on the Iceberg image; the app isn't listed on localhost:7077.
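To illustrate the suggestion above (a sketch; the container name `spark-master` is an assumption about your Compose setup, adjust to your actual service name): start PySpark in local mode inside the master container, so the session runs on in-process threads and never claims standalone-cluster executors:

```shell
# Assumed container name: spark-master. local[*] runs Spark on the
# container's own CPU threads and does not register an application
# with the standalone master on :7077.
docker exec -it spark-master pyspark --master "local[*]"
```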