Input path does not exist: file:/Users/Gabriel/Dropbox/arun/ScalaDataAnalysis/Code/scaladataanalysisCB-tower/chapter3-data-loading/profiles.json #2

Open
nellaivijay opened this issue Jan 3, 2016 · 0 comments


Arun,
The DataFrameFromJSON example uses a hard-coded value for the input file location (profiles.json); can you please fix it as well?
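One way to remove the hard-coded location is to take the input path from the program arguments, falling back to a default. This is only a sketch of the idea, not the book's code; the `InputPath` object and the default file name are illustrative:

```scala
// Sketch: resolve the input path from the command-line arguments instead
// of a hard-coded absolute path. The default value is a placeholder.
object InputPath {
  def resolve(args: Array[String], default: String): String =
    args.headOption.getOrElse(default)
}
```

The caller would then pass the resolved path to `sqlContext.jsonFile(...)` (or `read.json(...)` on newer Spark) instead of the literal string.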

"C:\Program Files\Java\jdk1.8.0_65\bin\java" -Didea.launcher.port=7535 "-Didea.launcher.bin.path=C:\Program Files (x86)\JetBrains\IntelliJ IDEA Community Edition 15.0.2\bin" -Dfile.encoding=UTF-8 -classpath "C:\Program Files\Java\jdk1.8.0_65\jre\lib\charsets.jar;C:\Program Files\Java\jdk1.8.0_65\jre\lib\deploy.jar;C:\Program Files\Java\jdk1.8.0_65\jre\lib\ext\access-bridge-64.jar;C:\Program Files\Java\jdk1.8.0_65\jre\lib\ext\cldrdata.jar;C:\Program Files\Java\jdk1.8.0_65\jre\lib\ext\dnsns.jar;C:\Program Files\Java\jdk1.8.0_65\jre\lib\ext\jaccess.jar;C:\Program Files\Java\jdk1.8.0_65\jre\lib\ext\jfxrt.jar;C:\Program Files\Java\jdk1.8.0_65\jre\lib\ext\localedata.jar;C:\Program Files\Java\jdk1.8.0_65\jre\lib\ext\nashorn.jar;C:\Program Files\Java\jdk1.8.0_65\jre\lib\ext\sunec.jar;C:\Program Files\Java\jdk1.8.0_65\jre\lib\ext\sunjce_provider.jar;C:\Program Files\Java\jdk1.8.0_65\jre\lib\ext\sunmscapi.jar;C:\Program Files\Java\jdk1.8.0_65\jre\lib\ext\sunpkcs11.jar;C:\Program Files\Java\jdk1.8.0_65\jre\lib\ext\zipfs.jar;C:\Program Files\Java\jdk1.8.0_65\jre\lib\javaws.jar;C:\Program Files\Java\jdk1.8.0_65\jre\lib\jce.jar;C:\Program Files\Java\jdk1.8.0_65\jre\lib\jfr.jar;C:\Program Files\Java\jdk1.8.0_65\jre\lib\jfxswt.jar;C:\Program Files\Java\jdk1.8.0_65\jre\lib\jsse.jar;C:\Program Files\Java\jdk1.8.0_65\jre\lib\management-agent.jar;C:\Program Files\Java\jdk1.8.0_65\jre\lib\plugin.jar;C:\Program Files\Java\jdk1.8.0_65\jre\lib\resources.jar;C:\Program 
Files\Java\jdk1.8.0_65\jre\lib\rt.jar;C:\Users\ramdov\IdeaProjects\ScalaDataAnalysisCookbook\chapter3-data-loading\target\scala-2.10\classes;C:\Users\ramdov.ivy2\cache\aopalliance\aopalliance\jars\aopalliance-1.0.jar;C:\Users\ramdov.ivy2\cache\asm\asm\jars\asm-3.2.jar;C:\Users\ramdov.ivy2\cache\com.clearspring.analytics\stream\jars\stream-2.7.0.jar;C:\Users\ramdov.ivy2\cache\com.codahale.metrics\metrics-core\bundles\metrics-core-3.0.2.jar;C:\Users\ramdov.ivy2\cache\com.databricks\spark-csv_2.10\jars\spark-csv_2.10-1.0.3.jar;C:\Users\ramdov.ivy2\cache\com.datastax.cassandra\cassandra-driver-core\bundles\cassandra-driver-core-2.1.5.jar;C:\Users\ramdov.ivy2\cache\com.datastax.spark\spark-cassandra-connector-java_2.10\jars\spark-cassandra-connector-java_2.10-1.2.0.jar;C:\Users\ramdov.ivy2\cache\com.datastax.spark\spark-cassandra-connector_2.10\jars\spark-cassandra-connector_2.10-1.2.0.jar;C:\Users\ramdov.ivy2\cache\com.esotericsoftware.kryo\kryo\bundles\kryo-2.21.jar;C:\Users\ramdov.ivy2\cache\com.esotericsoftware.minlog\minlog\jars\minlog-1.2.jar;C:\Users\ramdov.ivy2\cache\com.esotericsoftware.reflectasm\reflectasm\jars\reflectasm-1.07-shaded.jar;C:\Users\ramdov.ivy2\cache\com.fasterxml.jackson.core\jackson-annotations\bundles\jackson-annotations-2.4.4.jar;C:\Users\ramdov.ivy2\cache\com.fasterxml.jackson.core\jackson-core\bundles\jackson-core-2.4.4.jar;C:\Users\ramdov.ivy2\cache\com.fasterxml.jackson.core\jackson-databind\bundles\jackson-databind-2.4.4.jar;C:\Users\ramdov.ivy2\cache\com.fasterxml.jackson.module\jackson-module-scala_2.10\bundles\jackson-module-scala_2.10-2.4.4.jar;C:\Users\ramdov.ivy2\cache\com.github.fommil.netlib\core\jars\core-1.1.2.jar;C:\Users\ramdov.ivy2\cache\com.github.rwl\jtransforms\jars\jtransforms-2.4.0.jar;C:\Users\ramdov.ivy2\cache\com.google.code.findbugs\jsr305\jars\jsr305-1.3.9.jar;C:\Users\ramdov.ivy2\cache\com.google.guava\guava\bundles\guava-14.0.1.jar;C:\Users\ramdov.ivy2\cache\com.google.inject\guice\jars\guice-3.0.jar;C:\Users\ram
dov.ivy2\cache\com.google.protobuf\protobuf-java\bundles\protobuf-java-2.5.0.jar;C:\Users\ramdov.ivy2\cache\com.ning\compress-lzf\bundles\compress-lzf-1.0.3.jar;C:\Users\ramdov.ivy2\cache\com.sun.istack\istack-commons-runtime\jars\istack-commons-runtime-2.16.jar;C:\Users\ramdov.ivy2\cache\com.sun.jersey\jersey-core\bundles\jersey-core-1.9.jar;C:\Users\ramdov.ivy2\cache\com.sun.jersey\jersey-json\bundles\jersey-json-1.9.jar;C:\Users\ramdov.ivy2\cache\com.sun.jersey\jersey-server\bundles\jersey-server-1.9.jar;C:\Users\ramdov.ivy2\cache\com.sun.jersey.contribs\jersey-guice\jars\jersey-guice-1.9.jar;C:\Users\ramdov.ivy2\cache\com.sun.jersey.jersey-test-framework\jersey-test-framework-grizzly2\jars\jersey-test-framework-grizzly2-1.9.jar;C:\Users\ramdov.ivy2\cache\com.sun.xml.bind\jaxb-core\jars\jaxb-core-2.2.7.jar;C:\Users\ramdov.ivy2\cache\com.sun.xml.bind\jaxb-impl\jars\jaxb-impl-2.2.7.jar;C:\Users\ramdov.ivy2\cache\com.sun.xml.fastinfoset\FastInfoset\jars\FastInfoset-1.2.12.jar;C:\Users\ramdov.ivy2\cache\com.thoughtworks.paranamer\paranamer\jars\paranamer-2.6.jar;C:\Users\ramdov.ivy2\cache\com.twitter\chill-java\jars\chill-java-0.5.0.jar;C:\Users\ramdov.ivy2\cache\com.twitter\chill_2.10\jars\chill_2.10-0.5.0.jar;C:\Users\ramdov.ivy2\cache\com.twitter\jsr166e\jars\jsr166e-1.1.0.jar;C:\Users\ramdov.ivy2\cache\com.twitter\parquet-column\jars\parquet-column-1.6.0rc3.jar;C:\Users\ramdov.ivy2\cache\com.twitter\parquet-common\jars\parquet-common-1.6.0rc3.jar;C:\Users\ramdov.ivy2\cache\com.twitter\parquet-encoding\jars\parquet-encoding-1.6.0rc3.jar;C:\Users\ramdov.ivy2\cache\com.twitter\parquet-format\jars\parquet-format-2.2.0-rc1.jar;C:\Users\ramdov.ivy2\cache\com.twitter\parquet-generator\jars\parquet-generator-1.6.0rc3.jar;C:\Users\ramdov.ivy2\cache\com.twitter\parquet-hadoop\jars\parquet-hadoop-1.6.0rc3.jar;C:\Users\ramdov.ivy2\cache\com.twitter\parquet-jackson\jars\parquet-jackson-1.6.0rc3.jar;C:\Users\ramdov.ivy2\cache\com.typesafe\config\bundles\config-1.2.1.jar;C:\Use
rs\ramdov.ivy2\cache\commons-beanutils\commons-beanutils\jars\commons-beanutils-1.7.0.jar;C:\Users\ramdov.ivy2\cache\commons-beanutils\commons-beanutils-core\jars\commons-beanutils-core-1.8.0.jar;C:\Users\ramdov.ivy2\cache\commons-cli\commons-cli\jars\commons-cli-1.2.jar;C:\Users\ramdov.ivy2\cache\commons-codec\commons-codec\jars\commons-codec-1.5.jar;C:\Users\ramdov.ivy2\cache\commons-collections\commons-collections\jars\commons-collections-3.2.1.jar;C:\Users\ramdov.ivy2\cache\commons-configuration\commons-configuration\jars\commons-configuration-1.6.jar;C:\Users\ramdov.ivy2\cache\commons-digester\commons-digester\jars\commons-digester-1.8.jar;C:\Users\ramdov.ivy2\cache\commons-httpclient\commons-httpclient\jars\commons-httpclient-3.1.jar;C:\Users\ramdov.ivy2\cache\commons-io\commons-io\jars\commons-io-2.4.jar;C:\Users\ramdov.ivy2\cache\commons-lang\commons-lang\jars\commons-lang-2.5.jar;C:\Users\ramdov.ivy2\cache\commons-logging\commons-logging\jars\commons-logging-1.1.1.jar;C:\Users\ramdov.ivy2\cache\commons-net\commons-net\jars\commons-net-2.2.jar;C:\Users\ramdov.ivy2\cache\io.dropwizard.metrics\metrics-core\bundles\metrics-core-3.1.0.jar;C:\Users\ramdov.ivy2\cache\io.dropwizard.metrics\metrics-graphite\bundles\metrics-graphite-3.1.0.jar;C:\Users\ramdov.ivy2\cache\io.dropwizard.metrics\metrics-json\bundles\metrics-json-3.1.0.jar;C:\Users\ramdov.ivy2\cache\io.dropwizard.metrics\metrics-jvm\bundles\metrics-jvm-3.1.0.jar;C:\Users\ramdov.ivy2\cache\io.netty\netty\bundles\netty-3.9.0.Final.jar;C:\Users\ramdov.ivy2\cache\io.netty\netty-all\jars\netty-all-4.0.23.Final.jar;C:\Users\ramdov.ivy2\cache\javax.inject\javax.inject\jars\javax.inject-1.jar;C:\Users\ramdov.ivy2\cache\javax.xml.bind\jaxb-api\jars\jaxb-api-2.2.7.jar;C:\Users\ramdov.ivy2\cache\javax.xml.bind\jsr173_api\jars\jsr173_api-1.0.jar;C:\Users\ramdov.ivy2\cache\jline\jline\jars\jline-0.9.94.jar;C:\Users\ramdov.ivy2\cache\joda-time\joda-time\jars\joda-time-2.3.jar;C:\Users\ramdov.ivy2\cache\log4j\log4j\bundl
es\log4j-1.2.17.jar;C:\Users\ramdov.ivy2\cache\mysql\mysql-connector-java\jars\mysql-connector-java-5.1.34.jar;C:\Users\ramdov.ivy2\cache\net.java.dev.jets3t\jets3t\jars\jets3t-0.7.1.jar;C:\Users\ramdov.ivy2\cache\net.jpountz.lz4\lz4\jars\lz4-1.2.0.jar;C:\Users\ramdov.ivy2\cache\net.razorvine\pyrolite\jars\pyrolite-4.4.jar;C:\Users\ramdov.ivy2\cache\net.sf.opencsv\opencsv\jars\opencsv-2.3.jar;C:\Users\ramdov.ivy2\cache\net.sf.py4j\py4j\jars\py4j-0.8.2.1.jar;C:\Users\ramdov.ivy2\cache\net.sourceforge.f2j\arpack_combined_all\jars\arpack_combined_all-0.1.jar;C:\Users\ramdov.ivy2\cache\net.sourceforge.f2j\arpack_combined_all\jars\arpack_combined_all-0.1-javadoc.jar;C:\Users\ramdov.ivy2\cache\org.apache.avro\avro\jars\avro-1.7.4.jar;C:\Users\ramdov.ivy2\cache\org.apache.cassandra\cassandra-clientutil\jars\cassandra-clientutil-2.1.3.jar;C:\Users\ramdov.ivy2\cache\org.apache.cassandra\cassandra-thrift\jars\cassandra-thrift-2.1.3.jar;C:\Users\ramdov.ivy2\cache\org.apache.commons\commons-compress\jars\commons-compress-1.4.1.jar;C:\Users\ramdov.ivy2\cache\org.apache.commons\commons-csv\jars\commons-csv-1.1.jar;C:\Users\ramdov.ivy2\cache\org.apache.commons\commons-lang3\jars\commons-lang3-3.3.2.jar;C:\Users\ramdov.ivy2\cache\org.apache.commons\commons-math\jars\commons-math-2.1.jar;C:\Users\ramdov.ivy2\cache\org.apache.commons\commons-math3\jars\commons-math3-3.4.1.jar;C:\Users\ramdov.ivy2\cache\org.apache.curator\curator-client\bundles\curator-client-2.4.0.jar;C:\Users\ramdov.ivy2\cache\org.apache.curator\curator-framework\bundles\curator-framework-2.4.0.jar;C:\Users\ramdov.ivy2\cache\org.apache.curator\curator-recipes\bundles\curator-recipes-2.4.0.jar;C:\Users\ramdov.ivy2\cache\org.apache.hadoop\hadoop-annotations\jars\hadoop-annotations-2.2.0.jar;C:\Users\ramdov.ivy2\cache\org.apache.hadoop\hadoop-auth\jars\hadoop-auth-2.2.0.jar;C:\Users\ramdov.ivy2\cache\org.apache.hadoop\hadoop-client\jars\hadoop-client-2.2.0.jar;C:\Users\ramdov.ivy2\cache\org.apache.hadoop\hadoop-common\
jars\hadoop-common-2.2.0.jar;C:\Users\ramdov.ivy2\cache\org.apache.hadoop\hadoop-hdfs\jars\hadoop-hdfs-2.2.0.jar;C:\Users\ramdov.ivy2\cache\org.apache.hadoop\hadoop-mapreduce-client-app\jars\hadoop-mapreduce-client-app-2.2.0.jar;C:\Users\ramdov.ivy2\cache\org.apache.hadoop\hadoop-mapreduce-client-common\jars\hadoop-mapreduce-client-common-2.2.0.jar;C:\Users\ramdov.ivy2\cache\org.apache.hadoop\hadoop-mapreduce-client-core\jars\hadoop-mapreduce-client-core-2.2.0.jar;C:\Users\ramdov.ivy2\cache\org.apache.hadoop\hadoop-mapreduce-client-jobclient\jars\hadoop-mapreduce-client-jobclient-2.2.0.jar;C:\Users\ramdov.ivy2\cache\org.apache.hadoop\hadoop-mapreduce-client-shuffle\jars\hadoop-mapreduce-client-shuffle-2.2.0.jar;C:\Users\ramdov.ivy2\cache\org.apache.hadoop\hadoop-yarn-api\jars\hadoop-yarn-api-2.2.0.jar;C:\Users\ramdov.ivy2\cache\org.apache.hadoop\hadoop-yarn-client\jars\hadoop-yarn-client-2.2.0.jar;C:\Users\ramdov.ivy2\cache\org.apache.hadoop\hadoop-yarn-common\jars\hadoop-yarn-common-2.2.0.jar;C:\Users\ramdov.ivy2\cache\org.apache.hadoop\hadoop-yarn-server-common\jars\hadoop-yarn-server-common-2.2.0.jar;C:\Users\ramdov.ivy2\cache\org.apache.httpcomponents\httpclient\jars\httpclient-4.2.5.jar;C:\Users\ramdov.ivy2\cache\org.apache.httpcomponents\httpcore\jars\httpcore-4.2.4.jar;C:\Users\ramdov.ivy2\cache\org.apache.ivy\ivy\jars\ivy-2.4.0.jar;C:\Users\ramdov.ivy2\cache\org.apache.mesos\mesos\jars\mesos-0.21.1-shaded-protobuf.jar;C:\Users\ramdov.ivy2\cache\org.apache.spark\spark-catalyst_2.10\jars\spark-catalyst_2.10-1.4.1.jar;C:\Users\ramdov.ivy2\cache\org.apache.spark\spark-core_2.10\jars\spark-core_2.10-1.4.1.jar;C:\Users\ramdov.ivy2\cache\org.apache.spark\spark-graphx_2.10\jars\spark-graphx_2.10-1.4.1.jar;C:\Users\ramdov.ivy2\cache\org.apache.spark\spark-launcher_2.10\jars\spark-launcher_2.10-1.4.1.jar;C:\Users\ramdov.ivy2\cache\org.apache.spark\spark-mllib_2.10\jars\spark-mllib_2.10-1.4.1.jar;C:\Users\ramdov.ivy2\cache\org.apache.spark\spark-network-common_2.10\jar
s\spark-network-common_2.10-1.4.1.jar;C:\Users\ramdov.ivy2\cache\org.apache.spark\spark-network-shuffle_2.10\jars\spark-network-shuffle_2.10-1.4.1.jar;C:\Users\ramdov.ivy2\cache\org.apache.spark\spark-sql_2.10\jars\spark-sql_2.10-1.4.1.jar;C:\Users\ramdov.ivy2\cache\org.apache.spark\spark-streaming_2.10\jars\spark-streaming_2.10-1.4.1.jar;C:\Users\ramdov.ivy2\cache\org.apache.spark\spark-unsafe_2.10\jars\spark-unsafe_2.10-1.4.1.jar;C:\Users\ramdov.ivy2\cache\org.apache.thrift\libthrift\jars\libthrift-0.9.2.jar;C:\Users\ramdov.ivy2\cache\org.apache.zookeeper\zookeeper\jars\zookeeper-3.4.5.jar;C:\Users\ramdov.ivy2\cache\org.codehaus.jackson\jackson-core-asl\jars\jackson-core-asl-1.9.11.jar;C:\Users\ramdov.ivy2\cache\org.codehaus.jackson\jackson-jaxrs\jars\jackson-jaxrs-1.8.8.jar;C:\Users\ramdov.ivy2\cache\org.codehaus.jackson\jackson-mapper-asl\jars\jackson-mapper-asl-1.9.11.jar;C:\Users\ramdov.ivy2\cache\org.codehaus.jackson\jackson-xc\jars\jackson-xc-1.8.8.jar;C:\Users\ramdov.ivy2\cache\org.codehaus.jettison\jettison\bundles\jettison-1.1.jar;C:\Users\ramdov.ivy2\cache\org.eclipse.jetty.orbit\javax.servlet\orbits\javax.servlet-3.0.0.v201112011016.jar;C:\Users\ramdov.ivy2\cache\org.joda\joda-convert\jars\joda-convert-1.2.jar;C:\Users\ramdov.ivy2\cache\org.jodd\jodd-core\jars\jodd-core-3.6.3.jar;C:\Users\ramdov.ivy2\cache\org.jpmml\pmml-agent\jars\pmml-agent-1.1.15.jar;C:\Users\ramdov.ivy2\cache\org.jpmml\pmml-model\jars\pmml-model-1.1.15.jar;C:\Users\ramdov.ivy2\cache\org.jpmml\pmml-schema\jars\pmml-schema-1.1.15.jar;C:\Users\ramdov.ivy2\cache\org.json4s\json4s-ast_2.10\jars\json4s-ast_2.10-3.2.10.jar;C:\Users\ramdov.ivy2\cache\org.json4s\json4s-core_2.10\jars\json4s-core_2.10-3.2.10.jar;C:\Users\ramdov.ivy2\cache\org.json4s\json4s-jackson_2.10\jars\json4s-jackson_2.10-3.2.10.jar;C:\Users\ramdov.ivy2\cache\org.mortbay.jetty\jetty-util\jars\jetty-util-6.1.26.jar;C:\Users\ramdov.ivy2\cache\org.objenesis\objenesis\jars\objenesis-1.2.jar;C:\Users\ramdov.ivy2\cache\org.roa
ringbitmap\RoaringBitmap\bundles\RoaringBitmap-0.4.5.jar;C:\Users\ramdov.sbt\boot\scala-2.10.4\lib\scala-compiler.jar;C:\Users\ramdov.ivy2\cache\org.scala-lang\scala-library\jars\scala-library-2.10.5.jar;C:\Users\ramdov.ivy2\cache\org.scala-lang\scala-reflect\jars\scala-reflect-2.10.5.jar;C:\Users\ramdov.ivy2\cache\org.scala-lang\scalap\jars\scalap-2.10.0.jar;C:\Users\ramdov.ivy2\cache\org.scalamacros\quasiquotes_2.10\jars\quasiquotes_2.10-2.0.1.jar;C:\Users\ramdov.ivy2\cache\org.scalanlp\breeze-macros_2.10\jars\breeze-macros_2.10-0.11.2.jar;C:\Users\ramdov.ivy2\cache\org.scalanlp\breeze_2.10\jars\breeze_2.10-0.11.2.jar;C:\Users\ramdov.ivy2\cache\org.slf4j\jcl-over-slf4j\jars\jcl-over-slf4j-1.7.10.jar;C:\Users\ramdov.ivy2\cache\org.slf4j\jul-to-slf4j\jars\jul-to-slf4j-1.7.10.jar;C:\Users\ramdov.ivy2\cache\org.slf4j\slf4j-api\jars\slf4j-api-1.7.10.jar;C:\Users\ramdov.ivy2\cache\org.slf4j\slf4j-log4j12\jars\slf4j-log4j12-1.7.10.jar;C:\Users\ramdov.ivy2\cache\org.sonatype.sisu.inject\cglib\jars\cglib-2.2.1-v20090111.jar;C:\Users\ramdov.ivy2\cache\org.spark-project.akka\akka-actor_2.10\jars\akka-actor_2.10-2.3.4-spark.jar;C:\Users\ramdov.ivy2\cache\org.spark-project.akka\akka-remote_2.10\jars\akka-remote_2.10-2.3.4-spark.jar;C:\Users\ramdov.ivy2\cache\org.spark-project.akka\akka-slf4j_2.10\jars\akka-slf4j_2.10-2.3.4-spark.jar;C:\Users\ramdov.ivy2\cache\org.spark-project.protobuf\protobuf-java\bundles\protobuf-java-2.5.0-spark.jar;C:\Users\ramdov.ivy2\cache\org.spark-project.spark\unused\jars\unused-1.0.0.jar;C:\Users\ramdov.ivy2\cache\org.spire-math\spire-macros_2.10\jars\spire-macros_2.10-0.7.4.jar;C:\Users\ramdov.ivy2\cache\org.spire-math\spire_2.10\jars\spire_2.10-0.7.4.jar;C:\Users\ramdov.ivy2\cache\org.tachyonproject\tachyon\jars\tachyon-0.6.4.jar;C:\Users\ramdov.ivy2\cache\org.tachyonproject\tachyon-client\jars\tachyon-client-0.6.4.jar;C:\Users\ramdov.ivy2\cache\org.tukaani\xz\jars\xz-1.0.jar;C:\Users\ramdov.ivy2\cache\org.uncommons.maths\uncommons-maths\jars\unco
mmons-maths-1.2.2a.jar;C:\Users\ramdov.ivy2\cache\org.xerial.snappy\snappy-java\bundles\snappy-java-1.1.1.7.jar;C:\Users\ramdov.ivy2\cache\oro\oro\jars\oro-2.0.8.jar;C:\Users\ramdov.ivy2\cache\stax\stax-api\jars\stax-api-1.0.1.jar;C:\Users\ramdov.ivy2\cache\xmlenc\xmlenc\jars\xmlenc-0.52.jar;C:\Program Files (x86)\JetBrains\IntelliJ IDEA Community Edition 15.0.2\lib\idea_rt.jar" com.intellij.rt.execution.application.AppMain com.packt.dataload.DataFrameFromJSON
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/01/03 05:07:35 INFO SparkContext: Running Spark version 1.4.1
16/01/03 05:07:36 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/01/03 05:07:36 INFO SecurityManager: Changing view acls to: ramdov
16/01/03 05:07:36 INFO SecurityManager: Changing modify acls to: ramdov
16/01/03 05:07:36 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(ramdov); users with modify permissions: Set(ramdov)
16/01/03 05:07:37 INFO Slf4jLogger: Slf4jLogger started
16/01/03 05:07:37 INFO Remoting: Starting remoting
16/01/03 05:07:37 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.107.1:51476]
16/01/03 05:07:37 INFO Utils: Successfully started service 'sparkDriver' on port 51476.
16/01/03 05:07:37 INFO SparkEnv: Registering MapOutputTracker
16/01/03 05:07:37 INFO SparkEnv: Registering BlockManagerMaster
16/01/03 05:07:37 INFO DiskBlockManager: Created local directory at C:\Users\ramdov\AppData\Local\Temp\spark-dc443474-9805-4d0f-b7ca-6b40beb676dd\blockmgr-d268befd-d547-4b3c-a869-786e01061501
16/01/03 05:07:37 INFO MemoryStore: MemoryStore started with capacity 1439.1 MB
16/01/03 05:07:37 INFO HttpFileServer: HTTP File server directory is C:\Users\ramdov\AppData\Local\Temp\spark-dc443474-9805-4d0f-b7ca-6b40beb676dd\httpd-0c26dc70-dda3-4365-9179-0402ec7b864f
16/01/03 05:07:37 INFO HttpServer: Starting HTTP Server
16/01/03 05:07:37 INFO Utils: Successfully started service 'HTTP file server' on port 51477.
16/01/03 05:07:37 INFO SparkEnv: Registering OutputCommitCoordinator
16/01/03 05:07:37 INFO Utils: Successfully started service 'SparkUI' on port 4040.
16/01/03 05:07:37 INFO SparkUI: Started SparkUI at http://192.168.107.1:4040
16/01/03 05:07:37 INFO Executor: Starting executor ID driver on host localhost
16/01/03 05:07:38 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 51496.
16/01/03 05:07:38 INFO NettyBlockTransferService: Server created on 51496
16/01/03 05:07:38 INFO BlockManagerMaster: Trying to register BlockManager
16/01/03 05:07:38 INFO BlockManagerMasterEndpoint: Registering block manager localhost:51496 with 1439.1 MB RAM, BlockManagerId(driver, localhost, 51496)
16/01/03 05:07:38 INFO BlockManagerMaster: Registered BlockManager
16/01/03 05:07:39 INFO MemoryStore: ensureFreeSpace(110248) called with curMem=0, maxMem=1509005721
16/01/03 05:07:39 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 107.7 KB, free 1439.0 MB)
16/01/03 05:07:39 INFO MemoryStore: ensureFreeSpace(10090) called with curMem=110248, maxMem=1509005721
16/01/03 05:07:39 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 9.9 KB, free 1439.0 MB)
16/01/03 05:07:39 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on localhost:51496 (size: 9.9 KB, free: 1439.1 MB)
16/01/03 05:07:39 INFO SparkContext: Created broadcast 0 from jsonFile at DataFrameFromJSON.scala:19
16/01/03 05:07:39 ERROR Shell: Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:278)
at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:300)
at org.apache.hadoop.util.Shell.(Shell.java:293)
at org.apache.hadoop.util.StringUtils.(StringUtils.java:76)
at org.apache.hadoop.mapred.FileInputFormat.setInputPaths(FileInputFormat.java:362)
at org.apache.spark.SparkContext$$anonfun$hadoopFile$1$$anonfun$32.apply(SparkContext.scala:980)
at org.apache.spark.SparkContext$$anonfun$hadoopFile$1$$anonfun$32.apply(SparkContext.scala:980)
at org.apache.spark.rdd.HadoopRDD$$anonfun$getJobConf$6.apply(HadoopRDD.scala:176)
at org.apache.spark.rdd.HadoopRDD$$anonfun$getJobConf$6.apply(HadoopRDD.scala:176)
at scala.Option.map(Option.scala:145)
at org.apache.spark.rdd.HadoopRDD.getJobConf(HadoopRDD.scala:176)
at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:200)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:217)
at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:32)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:217)
at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:32)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:217)
at org.apache.spark.rdd.RDD$$anonfun$treeAggregate$1.apply(RDD.scala:1073)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:286)
at org.apache.spark.rdd.RDD.treeAggregate(RDD.scala:1071)
at org.apache.spark.sql.json.InferSchema$.apply(InferSchema.scala:58)
at org.apache.spark.sql.json.JSONRelation$$anonfun$schema$1.apply(JSONRelation.scala:139)
at org.apache.spark.sql.json.JSONRelation$$anonfun$schema$1.apply(JSONRelation.scala:138)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.sql.json.JSONRelation.schema$lzycompute(JSONRelation.scala:137)
at org.apache.spark.sql.json.JSONRelation.schema(JSONRelation.scala:137)
at org.apache.spark.sql.sources.LogicalRelation.&lt;init&gt;(LogicalRelation.scala:30)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:120)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:104)
at org.apache.spark.sql.DataFrameReader.json(DataFrameReader.scala:213)
at org.apache.spark.sql.SQLContext.jsonFile(SQLContext.scala:1115)
at com.packt.dataload.DataFrameFromJSON$delayedInit$body.apply(DataFrameFromJSON.scala:19)
at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
at scala.App$class.main(App.scala:71)
at com.packt.dataload.DataFrameFromJSON$.main(DataFrameFromJSON.scala:13)
at com.packt.dataload.DataFrameFromJSON.main(DataFrameFromJSON.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Exception in thread "main" org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: file:/Users/Gabriel/Dropbox/arun/ScalaDataAnalysis/Code/scaladataanalysisCB-tower/chapter3-data-loading/profiles.json
at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:251)
at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:270)
at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:207)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:217)
at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:32)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:217)
at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:32)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:217)
at org.apache.spark.rdd.RDD$$anonfun$treeAggregate$1.apply(RDD.scala:1073)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:286)
at org.apache.spark.rdd.RDD.treeAggregate(RDD.scala:1071)
at org.apache.spark.sql.json.InferSchema$.apply(InferSchema.scala:58)
at org.apache.spark.sql.json.JSONRelation$$anonfun$schema$1.apply(JSONRelation.scala:139)
at org.apache.spark.sql.json.JSONRelation$$anonfun$schema$1.apply(JSONRelation.scala:138)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.sql.json.JSONRelation.schema$lzycompute(JSONRelation.scala:137)
at org.apache.spark.sql.json.JSONRelation.schema(JSONRelation.scala:137)
at org.apache.spark.sql.sources.LogicalRelation.&lt;init&gt;(LogicalRelation.scala:30)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:120)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:104)
at org.apache.spark.sql.DataFrameReader.json(DataFrameReader.scala:213)
at org.apache.spark.sql.SQLContext.jsonFile(SQLContext.scala:1115)
at com.packt.dataload.DataFrameFromJSON$delayedInit$body.apply(DataFrameFromJSON.scala:19)
at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.App$$anonfun$main$1.apply(App.scala:71)
at scala.collection.immutable.List.foreach(List.scala:318)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
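Two separate problems show up in this log. The `winutils.exe` ERROR is a Windows-only Hadoop quirk and is not the actual failure; it can typically be silenced by pointing `hadoop.home.dir` at a directory containing `bin\winutils.exe` before the SparkContext is created. A minimal sketch, assuming winutils has been placed under a local `C:\hadoop` directory (that location is an assumption, not something the project ships):

```scala
// Sketch: set hadoop.home.dir before any Spark/Hadoop classes initialise.
// "C:\\hadoop" is a placeholder -- the directory must contain bin\winutils.exe.
object HadoopHome {
  def configure(home: String): String = {
    System.setProperty("hadoop.home.dir", home)
    // Return the property so the caller can verify it took effect.
    System.getProperty("hadoop.home.dir")
  }
}
```

The `InvalidInputException` is the real failure: the program looks for profiles.json at the author's absolute macOS path, which does not exist on the reporter's machine. Placing profiles.json at a path the program can resolve, or parameterizing the location as the comment requests, resolves it.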
