I am using Alink to run inference on a Torch model in an intranet environment with no internet access. The model and data files have already been downloaded manually to the local machine, and the code follows the official documentation, as shown below:
# Read the test set from a local .ak file
test = AkSourceBatchOp() \
    .setFilePath("./mnist_test_vector.ak")

# Convert the vector column into a float tensor of shape [1, 1, 28, 28]
test = VectorToTensorBatchOp() \
    .setTensorDataType("float") \
    .setTensorShape([1, 1, 28, 28]) \
    .setSelectedCol("vec") \
    .setOutputCol("tensor") \
    .setReservedCols(["label"]) \
    .linkFrom(test)

# Run inference with the locally downloaded Torch model
predictor = TorchModelPredictBatchOp() \
    .setModelPath("./mnist_model_pytorch.pt") \
    .setSelectedCols(["tensor"]) \
    .setOutputSchemaStr("probabilities FLOAT_TENSOR")

test = predictor.linkFrom(test).select("label, probabilities")
test.print()
However, inference fails with an error. From the code, it appears that Alink needs to connect to the internet to download the libtorch inference library (CPU version) and then extract it. Since my development environment is on an intranet and cannot reach the internet, I manually downloaded the libtorch zip package. Into which path should it be extracted? Do any parameters or options need to be set to skip the download step?
Also, if I want to run inference on a GPU, do the libtorch installation and the code need to be modified?
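For context, this is roughly how I am extracting the manually downloaded zip. The target directory name below is a placeholder of my own choosing; which directory Alink actually scans for the libtorch runtime is exactly what this issue is asking about.

```python
import zipfile
from pathlib import Path


def extract_libtorch(zip_path: str, plugin_dir: str) -> Path:
    """Extract a manually downloaded libtorch zip into a local directory.

    plugin_dir is a guess: the real path Alink probes for the
    libtorch plugin is unknown to me (hence this question).
    """
    target = Path(plugin_dir)
    target.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(target)
    return target


# "./alink_plugins" is a hypothetical location, not a documented one:
# extract_libtorch("./libtorch-cpu.zip", "./alink_plugins")
```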