class InferenceEngine::ExecutableNetwork¶
Overview¶
This is an interface of an executable network.
#include <ie_executable_network.hpp>
class ExecutableNetwork
{
public:
// construction
ExecutableNetwork();
ExecutableNetwork(const ExecutableNetwork& other);
ExecutableNetwork(ExecutableNetwork&& other);
// methods
ExecutableNetwork& operator = (const ExecutableNetwork& other);
ExecutableNetwork& operator = (ExecutableNetwork&& other);
ConstOutputsDataMap GetOutputsInfo() const;
ConstInputsDataMap GetInputsInfo() const;
InferRequest CreateInferRequest();
void Export(const std::string& modelFileName);
void Export(std::ostream& networkModel);
CNNNetwork GetExecGraphInfo();
void SetConfig(const std::map<std::string, Parameter>& config);
Parameter GetConfig(const std::string& name) const;
Parameter GetMetric(const std::string& name) const;
RemoteContext::Ptr GetContext() const;
bool operator ! () const;
operator bool () const;
void reset(std::shared_ptr<IExecutableNetwork> newActual);
operator std::shared_ptr< IExecutableNetwork > ();
InferRequest::Ptr CreateInferRequestPtr();
};
Detailed Documentation¶
This is an interface of an executable network.
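In typical application code an ExecutableNetwork is not constructed directly; it is returned by InferenceEngine::Core::LoadNetwork (or InferenceEngine::Core::ImportNetwork). A minimal sketch of obtaining one, where the model path "model.xml" and the "CPU" device name are assumptions:
#include <ie_core.hpp>

InferenceEngine::Core core;
// "model.xml" is a placeholder model path
InferenceEngine::CNNNetwork network = core.ReadNetwork("model.xml");
// Compile the network for a device; "CPU" is an assumed device name
InferenceEngine::ExecutableNetwork executableNetwork = core.LoadNetwork(network, "CPU");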
Construction¶
ExecutableNetwork()
Default constructor.
ExecutableNetwork(const ExecutableNetwork& other)
Default copy constructor.
Parameters:
other - other ExecutableNetwork object
ExecutableNetwork(ExecutableNetwork&& other)
Default move constructor.
Parameters:
other - other ExecutableNetwork object
Methods¶
ExecutableNetwork& operator = (const ExecutableNetwork& other)
Default copy assignment operator.
Parameters:
other - other ExecutableNetwork object
Returns:
reference to the current object
ExecutableNetwork& operator = (ExecutableNetwork&& other)
Default move assignment operator.
Parameters:
other - other ExecutableNetwork object
Returns:
reference to the current object
ConstOutputsDataMap GetOutputsInfo() const
Gets the executable network output Data node information.
The received info is stored in the given InferenceEngine::ConstOutputsDataMap object. This method needs to be called to find output names, which are used later when calling InferenceEngine::InferRequest::GetBlob or InferenceEngine::InferRequest::SetBlob.
Returns:
A collection that contains a string as the key and a const Data smart pointer as the value
ConstInputsDataMap GetInputsInfo() const
Gets the executable network input Data node information.
The received info is stored in the given InferenceEngine::ConstInputsDataMap object. This method needs to be called to find input names, which are used later when calling InferenceEngine::InferRequest::SetBlob.
Returns:
A collection that contains a string as the key and a const InputInfo smart pointer as the value
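A minimal sketch of iterating over both maps to discover blob names, assuming the network was compiled as in the LoadNetwork sketch above:
#include <iostream>
#include <ie_core.hpp>

// Prints the blob names of an already compiled network (a sketch).
void printIONames(const InferenceEngine::ExecutableNetwork& net) {
    for (const auto& input : net.GetInputsInfo()) {
        // input.first is the name to pass to InferRequest::SetBlob
        std::cout << "input:  " << input.first << std::endl;
    }
    for (const auto& output : net.GetOutputsInfo()) {
        // output.first is the name to pass to InferRequest::GetBlob
        std::cout << "output: " << output.first << std::endl;
    }
}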
InferRequest CreateInferRequest()
Creates an inference request object used to infer the network.
The created request has allocated input and output blobs (that can be changed later).
Returns:
InferRequest object
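A minimal sketch of a synchronous inference with the created request, assuming an already compiled ExecutableNetwork; input preparation and error handling are omitted:
void runOnce(InferenceEngine::ExecutableNetwork& net) {
    InferenceEngine::InferRequest request = net.CreateInferRequest();
    // The request already owns input and output blobs; fill or replace them
    // via request.GetBlob(...) / request.SetBlob(...) before inferring.
    request.Infer();  // synchronous inference
}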
void Export(const std::string& modelFileName)
Exports the current executable network.
Parameters:
modelFileName - Full path to the location of the exported file
See also:
InferenceEngine::Core::ImportNetwork
void Export(std::ostream& networkModel)
Exports the current executable network.
Parameters:
networkModel - Network model output stream
See also:
InferenceEngine::Core::ImportNetwork
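A minimal sketch of both overloads; the file names are placeholders, and the exported representation can be restored later with InferenceEngine::Core::ImportNetwork:
#include <fstream>

void exportNetwork(InferenceEngine::ExecutableNetwork& net) {
    // Overload taking a file path
    net.Export("model.blob");

    // Overload taking an output stream
    std::ofstream blobFile("model_stream.blob", std::ios::binary);
    net.Export(blobFile);
}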
CNNNetwork GetExecGraphInfo()
Get executable graph information from a device.
Wraps IExecutableNetwork::GetExecGraphInfo.
Returns:
CNNNetwork containing Executable Graph Info
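A minimal sketch that dumps the runtime graph for offline inspection; the output file names are placeholders:
void dumpExecGraph(InferenceEngine::ExecutableNetwork& net) {
    InferenceEngine::CNNNetwork execGraph = net.GetExecGraphInfo();
    // CNNNetwork::serialize writes the graph to IR-like XML/BIN files
    execGraph.serialize("exec_graph.xml", "exec_graph.bin");
}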
void SetConfig(const std::map<std::string, Parameter>& config)
Sets configuration for the current executable network.
Parameters:
config - Map of pairs: (config parameter name, config parameter value)
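Whether a key is accepted at this stage depends on the plugin. A sketch under the assumption that the network was compiled on the MULTI device, whose device priorities can be updated after compilation:
void reprioritize(InferenceEngine::ExecutableNetwork& net) {
    // Assumption: the network was loaded on the MULTI device; other plugins
    // may reject configuration changes after compilation.
    net.SetConfig({{"MULTI_DEVICE_PRIORITIES", "GPU,CPU"}});
}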
Parameter GetConfig(const std::string& name) const
Gets configuration for the current executable network.
The method extracts information that affects executable network execution. The list of supported configuration keys can be obtained via ExecutableNetwork::GetMetric with the SUPPORTED_CONFIG_KEYS key, but some of these keys cannot be changed dynamically. For example, DEVICE_ID cannot be changed if the executable network has already been compiled for a particular device.
Parameters:
name - config key; the supported keys can be obtained via ExecutableNetwork::GetMetric with the SUPPORTED_CONFIG_KEYS key
Returns:
Configuration parameter value
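A minimal sketch that enumerates the supported configuration keys and reads each current value; the METRIC_KEY macro comes from ie_plugin_config.hpp:
#include <ie_plugin_config.hpp>
#include <string>
#include <vector>

void readConfig(const InferenceEngine::ExecutableNetwork& net) {
    std::vector<std::string> keys =
        net.GetMetric(METRIC_KEY(SUPPORTED_CONFIG_KEYS)).as<std::vector<std::string>>();
    for (const std::string& key : keys) {
        InferenceEngine::Parameter value = net.GetConfig(key);  // current value
    }
}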
Parameter GetMetric(const std::string& name) const
Gets general runtime metric for an executable network.
It can be the network name, the actual device ID on which the executable network is running, or other properties that cannot be changed dynamically.
Parameters:
name - metric name to request
Returns:
Metric parameter value
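A minimal sketch reading two commonly available executable-network metrics; the exact set of supported metrics depends on the plugin and can itself be queried with the SUPPORTED_METRICS key:
#include <ie_plugin_config.hpp>
#include <string>

void readMetrics(const InferenceEngine::ExecutableNetwork& net) {
    // Name reported by the device for this compiled network
    std::string name = net.GetMetric(METRIC_KEY(NETWORK_NAME)).as<std::string>();
    // Device hint for how many infer requests to run in parallel
    unsigned int nireq =
        net.GetMetric(METRIC_KEY(OPTIMAL_NUMBER_OF_INFER_REQUESTS)).as<unsigned int>();
}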
RemoteContext::Ptr GetContext() const
Returns a pointer to the plugin-specific shared context on a remote accelerator device that was used to create this ExecutableNetwork.
Returns:
A context
bool operator ! () const
Checks if current ExecutableNetwork object is not initialized.
Returns:
true if current ExecutableNetwork object is not initialized, false - otherwise
operator bool () const
Checks if current ExecutableNetwork object is initialized.
Returns:
true if current ExecutableNetwork object is initialized, false - otherwise
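A minimal sketch of both checks; a default-constructed wrapper is not initialized until it is assigned a network loaded by the Core:
#include <iostream>

void checkState(const InferenceEngine::ExecutableNetwork& loaded) {
    InferenceEngine::ExecutableNetwork empty;  // default-constructed, owns nothing
    if (!empty) {
        std::cout << "empty wrapper is not initialized" << std::endl;
    }
    if (loaded) {
        std::cout << "loaded wrapper is ready to create infer requests" << std::endl;
    }
}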
void reset(std::shared_ptr<IExecutableNetwork> newActual)
Resets the owned object to a new pointer.
Deprecated: The method will be removed.
Essential for cases when simultaneously loaded networks are not expected.
Parameters:
newActual - actual pointed object
operator std::shared_ptr< IExecutableNetwork > ()
Cast operator, used when this wrapper is initialized by LoadNetwork.
Deprecated: Will be removed. Use operator bool instead.
Returns:
A shared pointer to IExecutableNetwork interface.
InferRequest::Ptr CreateInferRequestPtr()
Creates an inference request object used to infer the network.
Deprecated: Use ExecutableNetwork::CreateInferRequest instead.
Wraps IExecutableNetwork::CreateInferRequest.
Returns:
A shared pointer to an InferenceEngine::InferRequest object