Not possible to use a specific Spark instance from the DC/OS Spark CLI #69

Open
aalfonso-stratio opened this issue Oct 7, 2016 · 3 comments


@aalfonso-stratio

I have two spark framework instances in my DC/OS cluster (1.8) and I am trying to run separate tasks in each instance through the CLI.

I launch the second Spark instance using this JSON, as specified in the documentation:
{
  "service": {
    "name": "spark-dev"
  }
}
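A minimal sketch of the deployment step, assuming the JSON above is saved to a file named spark-dev.json (the filename is an assumption, not from the thread):

$ dcos package install spark --options=spark-dev.json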

When I try to specify the service name as indicated in the documentation:
$ dcos config set spark.app_id spark-dev

I receive the following error:
root@dcos-cli:/dcos# dcos config set spark.app_id spark-dev
Traceback (most recent call last):
File "/usr/local/lib/python3.4/site-packages/dcoscli/subcommand.py", line 99, in run_and_capture
exit_code = m.main(self._command + self._args)
File "/usr/local/lib/python3.4/site-packages/dcoscli/config/main.py", line 16, in main
return _main(argv)
File "/usr/local/lib/python3.4/site-packages/dcoscli/util.py", line 21, in wrapper
result = func(*args, **kwargs)
File "/usr/local/lib/python3.4/site-packages/dcoscli/config/main.py", line 31, in _main
return cmds.execute(_cmds(), args)
File "/usr/local/lib/python3.4/site-packages/dcos/cmds.py", line 43, in execute
return function(*params)
File "/usr/local/lib/python3.4/site-packages/dcoscli/config/main.py", line 94, in _set
toml, msg = config.set_val(name, value)
File "/usr/local/lib/python3.4/site-packages/dcos/config.py", line 122, in set_val
config_schema = get_config_schema(section)
File "/usr/local/lib/python3.4/site-packages/dcos/config.py", line 333, in get_config_schema
return subcommand.config_schema(executable, command)
File "/usr/local/lib/python3.4/site-packages/dcos/subcommand.py", line 194, in config_schema
return json.loads(out.decode('utf-8'))
File "/usr/local/lib/python3.4/json/init.py", line 318, in loads
return _default_decoder.decode(s)
File "/usr/local/lib/python3.4/json/decoder.py", line 343, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/local/lib/python3.4/json/decoder.py", line 359, in raw_decode
obj, end = self.scan_once(s, idx)
ValueError: Expecting property name enclosed in double quotes: line 15 column 9 (char 514)

I have also tried using double quotes in the call: $ dcos config set spark.app_id "spark-dev", but with the same result.

It could be that I am doing something wrong, but I cannot figure out what.

@jamartin9

So the error looks like it is failing while reading a JSON config. #68 appears to have fixed it. You can also try removing the trailing comma in ~/.dcos/subcommands/spark/env/lib/python2.7/site-packages/dcos_spark/data/config-schema/spark.json yourself.
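For illustration, the decoder error above ("Expecting property name enclosed in double quotes") is what Python's json module raises when it hits a trailing comma before a closing brace. A hypothetical snippet, not the real contents of spark.json:

{
  "app_id": { "type": "string" },
}

Removing the comma after the last property makes the document parse again.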

@julienlau

Bumping this 3 years later.
I am using DC/OS OSS 1.11 and I want to spin up two different versions of Spark in this cluster.
So I have one Spark service named "spark-1-6" and another Spark service named "spark-2-3".
Deployment is OK, but I cannot use the dcos CLI to submit jobs anymore, because the name of the Spark service seems to be hardcoded to "spark".
It says: message: App '/spark' does not exist

If I try to specify a different config, it crashes:

dcos config set spark.app_id spark-2-3
dcos-spark: error: unknown long flag '--config-schema', try --help
Traceback (most recent call last):
File "cli/dcoscli/subcommand.py", line 101, in run_and_capture
File "cli/dcoscli/config/main.py", line 17, in main
File "cli/dcoscli/util.py", line 24, in wrapper
File "cli/dcoscli/config/main.py", line 32, in _main
File "dcos/cmds.py", line 43, in execute
File "cli/dcoscli/config/main.py", line 87, in _set
File "dcos/config.py", line 244, in set_val
File "dcos/config.py", line 460, in get_config_schema
File "dcos/subcommand.py", line 205, in config_schema
File "dcos/subprocess.py", line 38, in check_output
File "subprocess.py", line 629, in check_output
File "subprocess.py", line 711, in run
subprocess.CalledProcessError: Command '['/home/centos/.dcos/clusters/742ef9ce-7b56-44b1-b9d2-36897dfb4d48/subcommands/spark/env/bin/dcos-spark', 'spark', '--config-schema']' returned non-zero exit status 1

Regards
JL

@julienlau

As suggested in the DC/OS Slack, the submission works fine using the following command:
dcos spark --name="spark-2-3" run blabla
