
Commit 7949250

goldsborough authored and facebook-github-bot committed Sep 17, 2018
Fixes for Torch Script C++ API (pytorch#11682)
Summary: A couple of fixes I deem necessary to the TorchScript C++ API after writing the tutorial:

1. When I was creating the custom op API, I created `torch/op.h` as the one-stop header for creating custom ops. I now notice that there is no good header for the TorchScript C++ story altogether, i.e. when you just want to load a script module in C++ without necessarily using any custom ops. The `torch/op.h` header suits that purpose just as well, of course, but I think we should rename it to `torch/script.h`, which seems like a great name for this feature.
2. The CMake API we provided so far defined a bunch of variables like `TORCH_LIBRARY_DIRS` and `TORCH_INCLUDES` and expected users to add those variables to their targets themselves. We also had a CMake function that did that for you automatically. A much smarter way of doing this is to create an `IMPORTED` target for the libtorch library in CMake and attach all of this to the link interface of that target. Then all downstream users have to do is `target_link_libraries(my_target torch)` and they get the proper includes, libraries, and compiler flags added to their target. This means we can get rid of the CMake function and all that stuff. orionr AFAIK this is a much, much better way of doing all of this, no?
3. Since we distribute libtorch with `-D_GLIBCXX_USE_CXX11_ABI=0`, dependent libraries must set this flag too. I now add it to the interface compile options of this imported target.
4. Fixes to the JIT docs.

These could likely be 4 different PRs, but given the release I wouldn't mind landing them all asap. zdevito dzhulgakov soumith

Pull Request resolved: pytorch#11682
Differential Revision: D9839431
Pulled By: goldsborough
fbshipit-source-id: fdc47b95f83f22d53e1995aa683e09613b4bfe65
1 parent a7e3cd0 commit 7949250
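
To illustrate point 2 of the summary, here is a minimal sketch of a downstream CMakeLists.txt that consumes the new imported target. The project name `my_app` and source file `main.cpp` are hypothetical, and it assumes CMAKE_PREFIX_PATH (or Torch_DIR) points at the libtorch installation:

# Hypothetical downstream project; assumes CMAKE_PREFIX_PATH points at libtorch.
cmake_minimum_required(VERSION 3.1 FATAL_ERROR)
project(my_app)

# Defines the imported `torch` target plus TORCH_LIBRARIES and TORCH_CXX_FLAGS.
find_package(Torch REQUIRED)

add_executable(my_app main.cpp)
# Linking against the imported target transitively adds the Torch include
# directories, libraries and the _GLIBCXX_USE_CXX11_ABI compile option.
target_link_libraries(my_app "${TORCH_LIBRARIES}")
set_property(TARGET my_app PROPERTY CXX_STANDARD 11)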

File tree

10 files changed (+66 −39 lines)


‎cmake/TorchConfig.cmake.in

+20 −18

@@ -7,15 +7,12 @@
 #
 # TORCH_FOUND -- True if the system has the Torch library
 # TORCH_INCLUDE_DIRS -- The include directories for torch
-# TORCH_LIBRARIES -- Libraries to link to
+# TORCH_LIBRARIES -- Libraries to link against
+# TORCH_CXX_FLAGS -- Additional (required) compiler flags
 #
 # and the following imported targets:
 #
-#   Torch
-#
-# and the following functions:
-#
-#   torch_add_custom_op_library(<name> <source_files>)
+#   torch

 if ($ENV{TORCH_INSTALL_PREFIX})
   set(TORCH_INSTALL_PREFIX $ENV{TORCH_INSTALL_PREFIX})
@@ -26,13 +23,19 @@ else()
 endif()

 # Include directories.
-set(TORCH_INCLUDE_DIRS "${TORCH_INSTALL_PREFIX}/lib/include")
+if (EXISTS "${TORCH_INSTALL_PREFIX}/lib/include")
+  set(TORCH_INCLUDE_DIRS "${TORCH_INSTALL_PREFIX}/lib/include")
+else()
+  set(TORCH_INCLUDE_DIRS "${TORCH_INSTALL_PREFIX}/include")
+endif()

 # Library dependencies.
 find_package(Caffe2 REQUIRED)
+
 find_library(TORCH_LIBRARY torch PATHS "${TORCH_INSTALL_PREFIX}/lib")
+add_library(torch SHARED IMPORTED)
+set(TORCH_LIBRARIES torch ${Caffe2_MAIN_LIBS})

-set(TORCH_LIBRARIES ${TORCH_LIBRARY} ${Caffe2_MAIN_LIBS})
 if (@USE_CUDA@)
   if(MSVC)
     set(NVTOOLEXT_HOME "C:/Program Files/NVIDIA Corporation/NvToolsExt")
@@ -59,13 +62,12 @@ if (@USE_CUDA@)
   list(APPEND TORCH_LIBRARIES ${TORCH_CUDA_LIBRARIES})
 endif()

-# Creates a shared library <name> with the correct include directories
-# and linker flags set to include Torch header files and link with Torch
-# libraries. Also sets the C++ standard version to C++11. All options
-# can be override by specifying further options on the `<name>` CMake target.
-function(torch_add_custom_op_library name source_files)
-  add_library(${name} SHARED ${source_files})
-  target_include_directories(${name} PUBLIC "${TORCH_INCLUDE_DIRS}")
-  target_link_libraries(${name} "${TORCH_LIBRARIES}")
-  set_property(TARGET ${name} PROPERTY CXX_STANDARD 11)
-endfunction(torch_add_custom_op_library)
+# When we build libtorch with the old GCC ABI, dependent libraries must too.
+set(TORCH_CXX_FLAGS "-D_GLIBCXX_USE_CXX11_ABI=@GLIBCXX_USE_CXX11_ABI@")
+
+set_target_properties(torch PROPERTIES
+  IMPORTED_LOCATION ${TORCH_LIBRARY}
+  INTERFACE_INCLUDE_DIRECTORIES ${TORCH_INCLUDE_DIRS}
+  INTERFACE_COMPILE_OPTIONS ${TORCH_CXX_FLAGS}
+  CXX_STANDARD 11
+)
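
For reference, linking against the imported `torch` target is roughly equivalent to applying the exported variables by hand, much like the removed `torch_add_custom_op_library` helper did. A sketch with hypothetical target and source names (`my_ops`, `my_ops.cpp`):

# Rough manual equivalent of linking the imported `torch` target
# (hypothetical names, for illustration only).
add_library(my_ops SHARED my_ops.cpp)
target_include_directories(my_ops PUBLIC "${TORCH_INCLUDE_DIRS}")
target_compile_options(my_ops PUBLIC "${TORCH_CXX_FLAGS}")
target_link_libraries(my_ops "${TORCH_LIBRARIES}")
set_property(TARGET my_ops PROPERTY CXX_STANDARD 11)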

‎docs/source/jit.rst

+6 −5

@@ -7,13 +7,13 @@ Torch Script
 .. currentmodule:: torch.jit

 Torch Script is a way to create serializable and optimizable models from PyTorch code.
-Anything code written in Torch Script can be saved from your Python
-process and loaded/run a process where there is no python dependency.
+Any code written in Torch Script can be saved from your Python
+process and loaded in a process where there is no Python dependency.

 We provide tools to incrementally transition a model from being a pure Python program
-to a Torch Script program that can be run independently from python, for instance, in a standalone C++ process.
+to a Torch Script program that can be run independently from Python, for instance, in a standalone C++ program.
 This makes it possible to train models in PyTorch using familiar tools and then export
-the model to a production environment where it is not a good idea to run models as python programs
+the model to a production environment where it is not a good idea to run models as Python programs
 for performance and multi-threading reasons.

 Creating Torch Script Code
@@ -47,7 +47,7 @@ Mixing Tracing and Scripting
 ----------------------------

 In many cases either tracing or script is an easier approach for converting a model.
-We allow you to compose tracing and scripting to suite the particular requirements
+We allow you to compose tracing and scripting to suit the particular requirements
 of a part of a model.

 Scripted functions can call traced ones. This is particularly useful when you need
@@ -79,6 +79,7 @@ Example:
 ::

     import torch
+
     @torch.jit.script
     def foo(x, y):
         if x.max() > y.max():

‎test/custom_operator/CMakeLists.txt

+4 −11

@@ -1,19 +1,12 @@
 # Basic CMake setup
-cmake_minimum_required(VERSION 3.0 FATAL_ERROR)
+cmake_minimum_required(VERSION 3.1 FATAL_ERROR)
 project(custom_ops)

 find_package(Torch REQUIRED)

-# This convenience function will create a shared library target, configure
-# the right include directories and link against the right libraries. It is
-# exactly equivalent to the following lines:
-#
-#   add_library(custom_ops SHARED op.cpp)
-#   target_include_directories(custom_ops PUBLIC "${TORCH_INCLUDE_DIRS}")
-#   target_link_libraries(custom_ops "${TORCH_LIBRARIES}")
-#   set_property(TARGET custom_ops PROPERTY CXX_STANDARD 11)
-#
-torch_add_custom_op_library(custom_ops op.cpp)
+add_library(custom_ops SHARED op.cpp)
+target_compile_features(custom_ops PUBLIC cxx_range_for)
+target_link_libraries(custom_ops ${TORCH_LIBRARIES})

 add_executable(test_custom_ops test_custom_ops.cpp)
 target_link_libraries(test_custom_ops custom_ops)

‎test/custom_operator/op.cpp

+1 −1

@@ -1,4 +1,4 @@
-#include <torch/op.h>
+#include <torch/script.h>

 #include <cstddef>
 #include <vector>

‎test/custom_operator/op.h

+1 −1

@@ -1,4 +1,4 @@
-#include <torch/op.h>
+#include <torch/script.h>

 #include <cstddef>
 #include <vector>

‎test/custom_operator/test_custom_ops.cpp

+1 −1

@@ -1,4 +1,4 @@
-#include <torch/op.h>
+#include <torch/script.h>

 #include "op.h"

‎torch/CMakeLists.txt

+23 −1

@@ -413,7 +413,7 @@ endif()
 install(DIRECTORY "${TORCH_SRC_DIR}/csrc"
         DESTINATION ${TORCH_INSTALL_INCLUDE_DIR}/torch
         FILES_MATCHING PATTERN "*.h")
-install(FILES "${TORCH_SRC_DIR}/op.h"
+install(FILES "${TORCH_SRC_DIR}/script.h"
         DESTINATION ${TORCH_INSTALL_INCLUDE_DIR}/torch)

 install(TARGETS torch
@@ -488,6 +488,28 @@ if (BUILD_TEST AND NOT NO_API AND NOT USE_ROCM)
   endif()
 endif()

+if ("${CMAKE_CXX_COMPILER_ID}" STREQUAL "GNU")
+  message(STATUS "${CMAKE_CXX_COMPILER} ${CMAKE_CURRENT_LIST_DIR}/abi-check.cpp -o ${CMAKE_BINARY_DIR}/abi-check")
+  execute_process(
+    COMMAND
+      "${CMAKE_CXX_COMPILER}"
+      "${CMAKE_CURRENT_LIST_DIR}/abi-check.cpp"
+      "-o"
+      "${CMAKE_BINARY_DIR}/abi-check"
+    RESULT_VARIABLE ABI_CHECK_COMPILE_RESULT)
+  if (ABI_CHECK_COMPILE_RESULT)
+    message(FATAL_ERROR "Could not compile ABI Check: ${ABI_CHECK_COMPILE_RESULT}")
+  endif()
+  execute_process(
+    COMMAND "${CMAKE_BINARY_DIR}/abi-check"
+    RESULT_VARIABLE ABI_CHECK_RESULT
+    OUTPUT_VARIABLE GLIBCXX_USE_CXX11_ABI)
+  if (ABI_CHECK_RESULT)
+    message(WARNING "Could not run ABI Check: ${ABI_CHECK_RESULT}")
+  endif()
+  message(STATUS "Determined _GLIBCXX_USE_CXX11_ABI=${GLIBCXX_USE_CXX11_ABI}")
+endif()
+
 # CMake config for external projects.
 configure_file(
   ${PROJECT_SOURCE_DIR}/cmake/TorchConfigVersion.cmake.in
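
The GLIBCXX_USE_CXX11_ABI value determined by the ABI check above is what gets substituted for `@GLIBCXX_USE_CXX11_ABI@` in TorchConfig.cmake.in when the config file is generated. A minimal sketch of that substitution mechanism; the file paths and the @ONLY flag here are illustrative assumptions, not the exact call used by the build:

# Illustrative only: shows how configure_file() expands @GLIBCXX_USE_CXX11_ABI@.
set(GLIBCXX_USE_CXX11_ABI 0)  # e.g. the value printed by abi-check
configure_file(TorchConfig.cmake.in TorchConfig.cmake @ONLY)
# In the generated TorchConfig.cmake, the template line
#   set(TORCH_CXX_FLAGS "-D_GLIBCXX_USE_CXX11_ABI=@GLIBCXX_USE_CXX11_ABI@")
# comes out as
#   set(TORCH_CXX_FLAGS "-D_GLIBCXX_USE_CXX11_ABI=0")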

‎torch/abi-check.cpp

+9

@@ -0,0 +1,9 @@
+#include <iostream>
+
+int main() {
+#ifdef _GLIBCXX_USE_CXX11_ABI
+  std::cout << _GLIBCXX_USE_CXX11_ABI;
+#else
+  std::cout << 0;
+#endif
+}

torch/op.h → torch/script.h

File renamed without changes.

‎torch/utils/cpp_extension.py

+1 −1

@@ -288,7 +288,7 @@ def _add_gnu_abi_flag_if_binary(self, extension):
         # if the extension is compiled with gcc >= 5.1,
         # then we have to define _GLIBCXX_USE_CXX11_ABI=0
         # so that the std::string in the API is resolved to
-        # non-C++11 symbols
+        # non-C++11 symbols.
         define = '-D_GLIBCXX_USE_CXX11_ABI=0'
         if is_binary_build():
             if isinstance(extension.extra_compile_args, dict):

0 commit comments
