not possible to use specific Spark instance from the DC/OS Spark CLI #69
Comments
So the error looks like it's failing to read a JSON config. #68 appears to have fixed it. You can also try removing the trailing comma in ~/.dcos/subcommands/spark/env/lib/python2.7/site-packages/dcos_spark/data/config-schema/spark.json yourself.
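For reference, a trailing comma is enough to trigger exactly this kind of failure: Python's json module (which the CLI uses to load the schema) rejects it with the same "Expecting property name" message seen in the traceback. A minimal sketch, independent of DC/OS:

```python
import json

# A JSON document with a trailing comma after the last property,
# similar to the malformed config-schema fixed in #68.
bad_schema = """{
    "first": "value",
}"""

try:
    json.loads(bad_schema)
except ValueError as exc:
    # The parser sees the comma, expects another property name,
    # and finds "}" instead.
    print(exc)
```

The printed message starts with "Expecting property name enclosed in double quotes", matching the ValueError at the bottom of the traceback below.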
Bumping this three years later: if I try to specify a different config, it crashes.
Regards
As suggested in the DC/OS Slack, the submission works fine by using the following cmd:
I have two Spark framework instances in my DC/OS cluster (1.8) and I am trying to run separate tasks in each instance through the CLI.
I launch the second Spark instance using this JSON, as specified in the documentation:
{
"service":
{ "name": "spark-dev" }
}
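Since the CLI errors out on malformed JSON, it can help to syntax-check an options file before handing it to `dcos`. A generic sketch using only the standard library (the filename `spark-dev.json` is just an example, not something the CLI requires):

```python
import json

def validate_options(path):
    """Parse a DC/OS options file and report any JSON syntax error."""
    with open(path) as f:
        try:
            json.load(f)
        except ValueError as exc:
            print("invalid JSON in %s: %s" % (path, exc))
            return False
    return True

# Example: write the options shown above, then validate the file.
with open("spark-dev.json", "w") as f:
    f.write('{"service": {"name": "spark-dev"}}')

print(validate_options("spark-dev.json"))
```

A file containing a stray trailing comma would make `validate_options` print the parse error and return False, which is a quicker signal than the long CLI traceback.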
When I try to specify the service name as indicated in the documentation:
$ dcos config set spark.app_id spark-dev
I receive the following error:
root@dcos-cli:/dcos# dcos config set spark.app_id spark-dev
Traceback (most recent call last):
File "/usr/local/lib/python3.4/site-packages/dcoscli/subcommand.py", line 99, in run_and_capture
exit_code = m.main(self._command + self._args)
File "/usr/local/lib/python3.4/site-packages/dcoscli/config/main.py", line 16, in main
return _main(argv)
File "/usr/local/lib/python3.4/site-packages/dcoscli/util.py", line 21, in wrapper
result = func(*args, **kwargs)
File "/usr/local/lib/python3.4/site-packages/dcoscli/config/main.py", line 31, in _main
return cmds.execute(_cmds(), args)
File "/usr/local/lib/python3.4/site-packages/dcos/cmds.py", line 43, in execute
return function(*params)
File "/usr/local/lib/python3.4/site-packages/dcoscli/config/main.py", line 94, in _set
toml, msg = config.set_val(name, value)
File "/usr/local/lib/python3.4/site-packages/dcos/config.py", line 122, in set_val
config_schema = get_config_schema(section)
File "/usr/local/lib/python3.4/site-packages/dcos/config.py", line 333, in get_config_schema
return subcommand.config_schema(executable, command)
File "/usr/local/lib/python3.4/site-packages/dcos/subcommand.py", line 194, in config_schema
return json.loads(out.decode('utf-8'))
File "/usr/local/lib/python3.4/json/__init__.py", line 318, in loads
return _default_decoder.decode(s)
File "/usr/local/lib/python3.4/json/decoder.py", line 343, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/local/lib/python3.4/json/decoder.py", line 359, in raw_decode
obj, end = self.scan_once(s, idx)
ValueError: Expecting property name enclosed in double quotes: line 15 column 9 (char 514)
I have also tried using double quotes in the call: $ dcos config set spark.app_id "spark-dev", but with the same result.
It could be that I am doing something wrong, but I cannot figure out what.