Local Tutorial Debugging using Spark logs
Dinesh Chandnani edited this page Apr 16, 2019
If you run into issues with your job, you will want to dig into the job logs to determine the cause. The Spark jobs produce detailed logs; you can access them via the docker logs command.
In this tutorial, you'll learn to:
- View docker logs

You can look at the logs by querying the running container via PowerShell:

- Launch PowerShell, then run the following command:

  docker logs --tail 1000 dataxlocal

- If you want to see the logs continuously updated, use the '-f' flag:

  docker logs -f --tail 1000 dataxlocal
This will help you diagnose issues, see exceptions and call stacks, and confirm that jobs are running properly.
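When a job fails, the relevant exception can be buried deep in the log output. One way to narrow things down is to dump the logs to a file and search them. This is a sketch, not part of the tutorial proper: it reuses the dataxlocal container name from the commands above, the spark.log filename is arbitrary, and "exception" is only an assumed search pattern for what your errors look like. The grep form below works in bash or WSL; from PowerShell you can use Select-String -Pattern "exception" instead.

```shell
# Dump the last 1000 log lines from the dataxlocal container to a file.
# (spark.log is an arbitrary filename chosen for this sketch.)
docker logs --tail 1000 dataxlocal > spark.log 2>&1

# Search the captured log case-insensitively for exceptions,
# printing 5 lines of trailing context to capture the stack trace.
grep -i -A 5 "exception" spark.log
```

Redirecting stderr (2>&1) matters here because docker logs writes the container's stderr stream, where Spark often emits its diagnostics, separately from stdout.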