Commit
Update examples to use kp::Memory instead of kp::Tensor as required.
This is only needed when creating a vector of shared_ptrs of Tensors, as these
can't be automatically converted to a vector of shared_ptrs of Memorys
to be passed to Manager::algorithm.
robquill committed Aug 19, 2024
1 parent 57b62ee commit d6f3c91
Showing 3 changed files with 3 additions and 3 deletions.
@@ -38,7 +38,7 @@ KomputeModelML::train(std::vector<float> yData,

std::shared_ptr<kp::TensorT<float>> lOut = mgr.tensor(zerosData);

-    std::vector<std::shared_ptr<kp::Tensor>> params = { xI, xJ, y,
+    std::vector<std::shared_ptr<kp::Memory>> params = { xI, xJ, y,
wIn, wOutI, wOutJ,
bIn, bOut, lOut };

2 changes: 1 addition & 1 deletion examples/array_multiplication/src/main.cpp
@@ -18,7 +18,7 @@ main()
std::shared_ptr<kp::TensorT<float>> tensorOut =
mgr.tensor({ 0.0, 0.0, 0.0 });

-    const std::vector<std::shared_ptr<kp::Tensor>> params = { tensorInA,
+    const std::vector<std::shared_ptr<kp::Memory>> params = { tensorInA,
tensorInB,
tensorOut };

2 changes: 1 addition & 1 deletion examples/logistic_regression/src/main.cpp
@@ -28,7 +28,7 @@ main()

std::shared_ptr<kp::TensorT<float>> lOut = mgr.tensor({ 0, 0, 0, 0, 0 });

-    std::vector<std::shared_ptr<kp::Tensor>> params = { xI, xJ, y,
+    std::vector<std::shared_ptr<kp::Memory>> params = { xI, xJ, y,
wIn, wOutI, wOutJ,
bIn, bOut, lOut };

