interface InferenceEngine::ICNNNetwork¶
Overview¶
This is the main interface to describe the NN topology.
#include <ie_icnn_network.hpp>
class ICNNNetwork: public std::enable_shared_from_this<ICNNNetwork>
{
// typedefs
typedef std::shared_ptr<ICNNNetwork> Ptr;
typedef std::map<std::string, SizeVector> InputShapes;
// methods
virtual std::shared_ptr<ngraph::Function> getFunction() = 0;
virtual std::shared_ptr<const ngraph::Function> getFunction() const = 0;
virtual void getOutputsInfo(OutputsDataMap& out) const = 0;
virtual void getInputsInfo(InputsDataMap& inputs) const = 0;
virtual InputInfo::Ptr getInput(const std::string& inputName) const = 0;
virtual const std::string& getName() const = 0;
virtual size_t layerCount() const = 0;
virtual StatusCode addOutput(
const std::string& layerName,
size_t outputIndex = 0,
ResponseDesc* resp = nullptr
) = 0;
virtual StatusCode setBatchSize(size_t size, ResponseDesc* responseDesc) = 0;
virtual size_t getBatchSize() const = 0;
virtual StatusCode reshape(const InputShapes& inputShapes, ResponseDesc* resp);
virtual StatusCode reshape(
const std::map<std::string, ngraph::PartialShape>& partialShapes,
ResponseDesc* resp
);
virtual StatusCode serialize(
const std::string& xmlPath,
const std::string& binPath,
ResponseDesc* resp
) const = 0;
virtual StatusCode serialize(
std::ostream& xmlStream,
std::ostream& binStream,
ResponseDesc* resp
) const = 0;
virtual StatusCode serialize(
std::ostream& xmlStream,
Blob::Ptr& binData,
ResponseDesc* resp
) const = 0;
virtual StatusCode getOVNameForTensor(
std::string& ov_name,
const std::string& orig_name,
ResponseDesc* resp
) const;
protected:
};
Detailed Documentation¶
This is the main interface to describe the NN topology.
Deprecated Use InferenceEngine::CNNNetwork wrapper instead
Typedefs¶
typedef std::shared_ptr<ICNNNetwork> Ptr
A shared pointer to a ICNNNetwork interface.
Deprecated Use InferenceEngine::CNNNetwork wrapper instead
typedef std::map<std::string, SizeVector> InputShapes
Map of pairs: name of corresponding data and its dimension.
Deprecated Use InferenceEngine::CNNNetwork wrapper instead
Methods¶
virtual std::shared_ptr<ngraph::Function> getFunction() = 0
Returns nGraph function.
Deprecated Use InferenceEngine::CNNNetwork wrapper instead
Returns:
nGraph function
virtual std::shared_ptr<const ngraph::Function> getFunction() const = 0
Returns constant nGraph function.
Deprecated Use InferenceEngine::CNNNetwork wrapper instead
Returns:
constant nGraph function
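As the deprecation notes suggest, the nGraph function is normally obtained through the InferenceEngine::CNNNetwork wrapper rather than this interface. A minimal sketch, assuming an OpenVINO 2021.x installation and placeholder model paths:

```cpp
#include <iostream>
#include <memory>

#include <ie_core.hpp>
#include <ngraph/function.hpp>

int main() {
    InferenceEngine::Core core;
    // "model.xml"/"model.bin" are placeholder paths to an IR model.
    InferenceEngine::CNNNetwork network = core.ReadNetwork("model.xml", "model.bin");

    // The wrapper exposes the same nGraph function as ICNNNetwork::getFunction().
    std::shared_ptr<ngraph::Function> func = network.getFunction();
    if (func) {
        std::cout << "Operations in the graph: " << func->get_ops().size() << std::endl;
    }
    return 0;
}
```

Note that the wrapper reports errors by throwing exceptions instead of returning a StatusCode.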
virtual void getOutputsInfo(OutputsDataMap& out) const = 0
Gets the network output Data node information. The received info is stored in the given OutputsDataMap object.
Deprecated Use InferenceEngine::CNNNetwork wrapper instead
For networks with single or multiple outputs.
This method needs to be called to find out the OpenVINO output names, for use later when calling InferenceEngine::InferRequest::GetBlob or InferenceEngine::InferRequest::SetBlob.
If you want to use framework names, use the InferenceEngine::ICNNNetwork::getOVNameForTensor method to map framework names to OpenVINO names.
Parameters:
out | Reference to the OutputsDataMap object
virtual void getInputsInfo(InputsDataMap& inputs) const = 0
Gets the network input Data node information. The received info is stored in the given InputsDataMap object.
Deprecated Use InferenceEngine::CNNNetwork wrapper instead
For networks with single or multiple inputs. This method needs to be called to find out the OpenVINO input names, for use later when calling InferenceEngine::InferRequest::SetBlob.
If you want to use framework names, use the InferenceEngine::ICNNNetwork::getOVNameForTensor method to map framework names to OpenVINO names.
Parameters:
inputs | Reference to the InputsDataMap object
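A hedged sketch of enumerating input and output names through the CNNNetwork wrapper, which forwards to these methods; `network` is assumed to come from `Core::ReadNetwork` as above:

```cpp
#include <iostream>
#include <ie_core.hpp>

void printIONames(const InferenceEngine::CNNNetwork& network) {
    // getInputsInfo()/getOutputsInfo() return maps keyed by the OpenVINO tensor
    // names, the same names later passed to InferRequest::SetBlob / GetBlob.
    InferenceEngine::InputsDataMap inputs = network.getInputsInfo();
    InferenceEngine::OutputsDataMap outputs = network.getOutputsInfo();

    for (const auto& in : inputs)
        std::cout << "input:  " << in.first << std::endl;
    for (const auto& out : outputs)
        std::cout << "output: " << out.first << std::endl;
}
```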
virtual InputInfo::Ptr getInput(const std::string& inputName) const = 0
Returns information on certain input pointed by inputName.
Deprecated Use InferenceEngine::CNNNetwork wrapper instead
Parameters:
inputName | Name of the input layer to get info on
Returns:
A smart pointer to the input information
virtual const std::string& getName() const = 0
Returns the network name.
Deprecated Use InferenceEngine::CNNNetwork wrapper instead
Returns:
Network name
virtual size_t layerCount() const = 0
Returns the number of layers in the network as an integer value.
Deprecated Use InferenceEngine::CNNNetwork wrapper instead
Returns:
The number of layers as an integer value
virtual StatusCode addOutput(
const std::string& layerName,
size_t outputIndex = 0,
ResponseDesc* resp = nullptr
) = 0
Adds an output to the layer.
Deprecated Use InferenceEngine::CNNNetwork wrapper instead
Parameters:
layerName | Name of the layer
outputIndex | Index of the output
resp | Response message
Returns:
Status code of the operation
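Through the CNNNetwork wrapper this is a single call; the wrapper reports errors by throwing rather than via StatusCode/ResponseDesc. A sketch with a hypothetical layer name:

```cpp
// Expose an intermediate layer's first output as an additional network output.
// "conv1" is a placeholder layer name; outputIndex defaults to 0.
network.addOutput("conv1", 0);
```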
virtual StatusCode setBatchSize(size_t size, ResponseDesc* responseDesc) = 0
Changes the inference batch size.
Deprecated Use InferenceEngine::CNNNetwork wrapper instead
This method has several limitations and its use is not recommended. Instead, set the batch in the input shape and call InferenceEngine::ICNNNetwork::reshape.
The current implementation sets the batch size as the first dimension of all layers in the network. Before calling it, make sure that all your layers have the batch in the first dimension; otherwise the method works incorrectly. This limitation is resolved via the shape inference feature, using the InferenceEngine::ICNNNetwork::reshape method. To read more, refer to the Shape Inference section of the documentation.
Parameters:
size | Size of batch to set
responseDesc | Pointer to the response message that holds a description of an error if any occurred
Returns:
Status code of the operation
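The recommended alternative sketched with the CNNNetwork wrapper: read the current shapes, rewrite the batch dimension, and reshape. This assumes every input really does carry the batch in dimension 0:

```cpp
// Change the batch by reshaping, as the setBatchSize notes recommend.
auto shapes = network.getInputShapes();  // an ICNNNetwork::InputShapes map
for (auto& item : shapes)
    item.second[0] = 8;                  // assumes batch is the first dimension
network.reshape(shapes);
```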
virtual size_t getBatchSize() const = 0
Gets the inference batch size.
Deprecated Use InferenceEngine::CNNNetwork wrapper instead
Returns:
The size of batch as a size_t value
virtual StatusCode reshape(const InputShapes& inputShapes, ResponseDesc* resp)
Run shape inference with new input shapes for the network.
Deprecated Use InferenceEngine::CNNNetwork wrapper instead
Parameters:
inputShapes | A map of input names to new input shapes
resp | Pointer to the response message that holds a description of an error if any occurred
Returns:
Status code of the operation
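A sketch of the static-shape overload via the wrapper, using a hypothetical input name:

```cpp
// Run shape inference with an explicit new shape for one input.
InferenceEngine::ICNNNetwork::InputShapes shapes;
shapes["data"] = {1, 3, 320, 320};  // "data" is a placeholder input name
network.reshape(shapes);
```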
virtual StatusCode reshape(
const std::map<std::string, ngraph::PartialShape>& partialShapes,
ResponseDesc* resp
)
Run shape inference with new input shapes for the network.
Deprecated Use InferenceEngine::CNNNetwork wrapper instead
Parameters:
partialShapes |
|
resp |
Pointer to the response message that holds a description of an error if any occurred |
Returns:
Status code of the operation
virtual StatusCode serialize(
const std::string& xmlPath,
const std::string& binPath,
ResponseDesc* resp
) const = 0
Serialize network to IR and weights files.
Deprecated Use InferenceEngine::CNNNetwork wrapper instead
Parameters:
xmlPath | Path to the output IR file
binPath | Path to the output weights file
resp | Pointer to the response message that holds a description of an error if any occurred
Returns:
Status code of the operation
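Via the wrapper, the path-based overload is a single call; the file names here are placeholders, and errors surface as exceptions rather than a StatusCode:

```cpp
// Write the topology to exported.xml and the weights to exported.bin.
network.serialize("exported.xml", "exported.bin");
```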
virtual StatusCode serialize(
std::ostream& xmlStream,
std::ostream& binStream,
ResponseDesc* resp
) const = 0
Serialize the network to IR and weights streams.
Deprecated Use InferenceEngine::CNNNetwork wrapper instead
Parameters:
xmlStream | A stream for the XML content (.xml file)
binStream | A stream for the weights content (.bin file)
resp | Pointer to the response message that holds a description of an error if any occurred
Returns:
Status code of the operation
virtual StatusCode serialize(
std::ostream& xmlStream,
Blob::Ptr& binData,
ResponseDesc* resp
) const = 0
Serialize the network to an IR stream and a weights blob.
Deprecated Use InferenceEngine::CNNNetwork wrapper instead
Parameters:
xmlStream | A stream for the XML content (.xml file)
binData | A blob for the weights content (.bin file)
resp | Pointer to the response message that holds a description of an error if any occurred
Returns:
Status code of the operation
virtual StatusCode getOVNameForTensor(
std::string& ov_name,
const std::string& orig_name,
ResponseDesc* resp
) const
Maps a framework tensor name to an OpenVINO tensor name.
Deprecated Use InferenceEngine::CNNNetwork wrapper instead
Parameters:
ov_name | OpenVINO name
orig_name | Framework tensor name
resp | Pointer to the response message that holds a description of an error if any occurred
Returns:
Status code of the operation
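The wrapper offers a convenience form that returns the name directly and throws on failure. A sketch with a hypothetical framework tensor name:

```cpp
// Resolve the OpenVINO tensor name for a framework-side name.
// "input_tensor:0" is a placeholder framework tensor name.
std::string ovName = network.getOVNameForTensor("input_tensor:0");
```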