
Research hardware options to train models more quickly #5

Open
acs opened this issue Apr 25, 2019 · 2 comments
acs commented Apr 25, 2019

After the meetup today at:

https://www.meetup.com/es-ES/MachineLearningSpain/events/260664781/

two options look interesting to explore once we need to speed up training of the ML models: NVIDIA GPUs, using the RAPIDS framework (which implements the Pandas API and others on top of GPUs), and Google's Edge TPUs (the Coral Dev Board, the one used in the meetup demo), which can be used from TensorFlow.
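As a minimal sketch of what "Pandas API over GPUs" means: RAPIDS' cuDF is designed as a near drop-in for pandas, so code like the following stays the same and only the import changes. The dataset here is hypothetical, and plain pandas is used since cuDF needs an NVIDIA GPU to run; with RAPIDS installed you would swap the import for `import cudf as pd`.

```python
import pandas as pd  # with RAPIDS: `import cudf as pd` to run on the GPU

# Hypothetical timings of training runs on different hardware
df = pd.DataFrame({
    "hardware": ["cpu", "gpu", "cpu", "gpu"],
    "train_seconds": [120.0, 15.0, 110.0, 14.0],
})

# Same groupby/mean call works in pandas and cuDF
mean_times = df.groupby("hardware")["train_seconds"].mean()
print(mean_times)
```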

GTC (GPU Technology Conference) is a great place to explore AI topics: https://on-demand-gtc.gputechconf.com/gtcnew/on-demand-gtc.php (also deep learning education)

Deep Learning on the Edge


acs commented Apr 25, 2019

Edge TPUs are mainly for inference, not for training:

https://coral.withgoogle.com/docs/edgetpu/faq/
"The Edge TPU is not capable of backward propagation"

For training, this is probably a better fit: https://www.nvidia.com/es-es/autonomous-machines/embedded-systems/jetson-nano/
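The quoted FAQ line is the crux: inference only needs the forward pass, while training also needs gradients from backward propagation, which the Edge TPU cannot compute. A toy sketch of the distinction with a hypothetical one-parameter linear model (pure Python, just to illustrate the two passes):

```python
def forward(w, b, x):
    """Forward pass (inference): the only step an Edge TPU accelerates."""
    return w * x + b

def backward(w, b, x, y_true):
    """Backward pass (training): gradients of the squared-error loss
    (y - y_true)^2 with respect to w and b. This is the step the
    Edge TPU cannot do, so training stays on a CPU or GPU."""
    y = forward(w, b, x)
    dloss_dy = 2 * (y - y_true)
    return dloss_dy * x, dloss_dy  # dL/dw, dL/db

# One gradient-descent update
w, b, lr = 0.0, 0.0, 0.1
dw, db = backward(w, b, x=1.0, y_true=2.0)
w, b = w - lr * dw, b - lr * db
```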


acs commented May 6, 2019

I have bought the NVIDIA Jetson Nano. It will arrive on 13th May. I'm not sure I'll be able to train models with it, but I will certainly learn a lot, especially if I can mount it on the JetBot.

[Screenshot, 2019-05-06 16-22-21]

@acs acs transferred this issue from aylabs/mydogbreed Jun 23, 2019