Hello, and thanks for your great work. While installing the dependencies for fx2ait inside a Docker container, I ran into the following error message:
```
root@835aee4c9775:/AITemplate/fx2ait# python3 setup.py install
Compiling extensions with following flags:
DEBUG: False
NVCC_FLAGS:
running install
running bdist_egg
running egg_info
writing fx2ait.egg-info/PKG-INFO
writing dependency_links to fx2ait.egg-info/dependency_links.txt
writing requirements to fx2ait.egg-info/requires.txt
writing top-level names to fx2ait.egg-info/top_level.txt
/usr/local/lib/python3.8/dist-packages/torch/utils/cpp_extension.py:476: UserWarning: Attempted to use ninja as the BuildExtension backend but we could not find ninja.. Falling back to using the slow distutils backend.
  warnings.warn(msg.format('we could not find ninja.'))
reading manifest file 'fx2ait.egg-info/SOURCES.txt'
writing manifest file 'fx2ait.egg-info/SOURCES.txt'
installing library code to build/bdist.linux-x86_64/egg
running install_lib
running build_py
running build_ext
building 'fx2ait.libait_model' extension
x86_64-linux-gnu-gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -g -fwrapv -O2 -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/AITemplate/fx2ait/fx2ait/csrc -I/AITemplate/static/include -I/AITemplate/3rdparty/picojson -I/usr/local/lib/python3.8/dist-packages/torch/include -I/usr/local/lib/python3.8/dist-packages/torch/include/torch/csrc/api/include -I/usr/local/lib/python3.8/dist-packages/torch/include/TH -I/usr/local/lib/python3.8/dist-packages/torch/include/THC -I/usr/local/cuda/include -I/usr/include/python3.8 -c /AITemplate/fx2ait/fx2ait/csrc/AITModelImpl.cpp -o build/temp.linux-x86_64-3.8/AITemplate/fx2ait/fx2ait/csrc/AITModelImpl.o -DTORCH_API_INCLUDE_EXTENSION_H -DPYBIND11_COMPILER_TYPE="_gcc" -DPYBIND11_STDLIB="_libstdcpp" -DPYBIND11_BUILD_ABI="_cxxabi1011" -DTORCH_EXTENSION_NAME=libait_model -D_GLIBCXX_USE_CXX11_ABI=0 -std=c++17
/AITemplate/fx2ait/fx2ait/csrc/AITModelImpl.cpp: In member function ‘void torch::aitemplate::AITModelImpl::allocateOutputs(std::vector<c10::intrusive_ptr<c10::StorageImpl> >&, std::vector<AITData>&, std::vector<std::vector<long int> >&, std::vector<long int*>&, const c10::Device&)’:
/AITemplate/fx2ait/fx2ait/csrc/AITModelImpl.cpp:328:44: error: ‘struct c10::StorageImpl’ has no member named ‘mutable_data’
  328 |   ait_outputs.emplace_back(storage_impl->mutable_data(), shape, ait_dtype);
      |                                          ^~~~~~~~~~~~
/AITemplate/fx2ait/fx2ait/csrc/AITModelImpl.cpp: In member function ‘std::vector<AITData> torch::aitemplate::AITModelImpl::processInputs(std::vector<at::Tensor>&, std::vector<at::Tensor>&)’:
/AITemplate/fx2ait/fx2ait/csrc/AITModelImpl.cpp:385:51: warning: comparison of integer expressions of different signedness: ‘int’ and ‘std::vector<std::basic_string<char> >::size_type’ {aka ‘long unsigned int’} [-Wsign-compare]
  385 |   for (int python_input_idx = 0; python_input_idx < input_names_.size();
      |                                  ~~~~~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~
/AITemplate/fx2ait/fx2ait/csrc/AITModelImpl.cpp: In function ‘c10::ScalarType torch::aitemplate::{anonymous}::AITemplateDtypeToTorchDtype(AITemplateDtype)’:
/AITemplate/fx2ait/fx2ait/csrc/AITModelImpl.cpp:261:1: warning: control reaches end of non-void function [-Wreturn-type]
  261 | }
      | ^
error: command 'x86_64-linux-gnu-gcc' failed with exit status 1
```
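From the error it looks to me like a PyTorch version mismatch: the fx2ait C++ source calls `c10::StorageImpl::mutable_data()`, which only exists in more recent PyTorch releases, while the container here has PyTorch built for Python 3.8. As a quick sanity check I used a small script; `has_mutable_data` is just my own helper name, and the 2.x cutoff is an assumption on my part, not something I verified against the PyTorch changelog:

```python
def has_mutable_data(version: str) -> bool:
    """Guess whether this PyTorch build exposes StorageImpl::mutable_data().

    Assumption (unverified): the accessor landed with the 2.x series.
    Handles local build suffixes like "1.13.1+cu117".
    """
    core = version.split("+")[0]          # drop "+cu117"-style suffixes
    major_str = core.split(".")[0]
    major = int("".join(ch for ch in major_str if ch.isdigit()))
    return major >= 2

# Compare the installed version against the one the sources seem to expect.
print(has_mutable_data("1.13.1+cu117"))
print(has_mutable_data("2.0.0"))
```

If the installed `torch.__version__` fails this check, upgrading PyTorch inside the container (or checking out an AITemplate revision matching the installed PyTorch) might resolve the missing-member error.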
Could you please help me address this problem?