nnsight.intervention.backends#

class nnsight.intervention.backends.remote.LocalTracer(*args, **kwargs)[source]#
execute(fn: Callable)[source]#

Execute the compiled function.

Runs the compiled function with the necessary context to execute the traced code block.

Parameters:

fn – The compiled function to execute.

push()[source]#

Push local variables back to the original execution frame.

This allows changes made during tracing to affect the original scope.

Parameters:

state – Dictionary of variable names and values to push to the frame. If None, automatically collects variables from the current frame.

class nnsight.intervention.backends.remote.RemoteBackend(model_key: str, host: str = None, blocking: bool = True, job_id: str = None, ssl: bool = None, api_key: str = '', callback: str = '')[source]#

Backend to execute a context object via a remote service.

Context object must inherit from RemoteMixin and implement its methods.

url#

Remote host URL. Defaults to the value set in CONFIG.API.HOST.

Type:

str

blocking_request(tracer: Tracer) Dict[str, Any] | None[source]#

Send intervention request to the remote service while waiting for updates via websocket.

Parameters:

tracer (Tracer) – Tracer whose intervention request is sent to the remote service.
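The blocking flow can be sketched as a loop that consumes status updates until the job completes or errors. This is a minimal stand-in, not the actual implementation: the `JobStatus` values and the `updates` iterable (which stands in for messages arriving over the websocket) are assumptions for illustration.

```python
from enum import Enum, auto

class JobStatus(Enum):
    # Hypothetical status values, mirroring the role of ResponseModel.JobStatus.
    RECEIVED = auto()
    RUNNING = auto()
    COMPLETED = auto()
    ERROR = auto()

def blocking_request(updates):
    """Consume status updates until the job completes or errors.

    `updates` stands in for messages received over the websocket.
    """
    for status in updates:
        if status is JobStatus.ERROR:
            raise Exception("Remote job failed")
        if status is JobStatus.COMPLETED:
            return "result"  # placeholder for the decoded result
    return None  # connection ended without a terminal status
```

The real backend additionally downloads and decodes the result on completion; here a placeholder string is returned instead.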

get_response() Dict[str, Any] | None[source]#

Retrieves and handles the response object from the remote endpoint.

Raises:

Exception – If the response has a status code other than 200.

Returns:

Response.

Return type:

(ResponseModel)
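The status-code handling described above can be sketched as follows; `FakeResponse` is a stand-in object used instead of a real HTTP client, and the payload passthrough stands in for parsing a ResponseModel.

```python
class FakeResponse:
    """Stand-in for an HTTP response object."""
    def __init__(self, status_code, payload):
        self.status_code = status_code
        self.payload = payload

def get_response(response):
    # Raise for any status other than 200, as the docstring describes.
    if response.status_code != 200:
        raise Exception(f"Bad status: {response.status_code}")
    # A real implementation would parse the body into a ResponseModel here.
    return response.payload
```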

handle_response(response: ResponseModel, tracer: Tracer | None = None) Dict[str, Any] | None[source]#

Handles incoming response data.

Logs the response object. If the job is completed, retrieves and streams the result from the remote endpoint, uses torch.load to decode the ResultModel into memory, and handles the decoded result with the backend object’s .handle_result method.

Parameters:

response (ResponseModel) – Response object to handle.

Raises:

Exception – If the job’s status is ResponseModel.JobStatus.ERROR

Returns:

ResponseModel.

Return type:

ResponseModel
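The dispatch-on-status behavior described above can be sketched with plain dicts standing in for ResponseModel instances; the `"status"`, `"description"`, and `"result"` keys are illustrative assumptions, not the real model's fields.

```python
def handle_response(response):
    """Dispatch on job status: raise on ERROR, return the result when COMPLETED."""
    status = response["status"]
    if status == "ERROR":
        raise Exception(response.get("description", "remote error"))
    if status == "COMPLETED":
        # In the real backend the result is streamed from the endpoint and
        # decoded with torch.load; here it is carried directly in the dict.
        return response.get("result")
    return None  # job still in progress; keep waiting
```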

non_blocking_request(tracer: Tracer)[source]#

Sends an intervention request to the remote service if one is provided; otherwise gets the status of the current job.

Sets CONFIG.API.JOB_ID on the initial request so that the status of that job can be retrieved later.

When the job is completed, clears CONFIG.API.JOB_ID so that a new job can be requested.

Parameters:

tracer (Tracer) – Tracer whose request to submit when starting a new job.
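The job-id bookkeeping described above can be sketched as follows. The `CONFIG` dict and the `submit`/`poll` callables are stand-ins for nnsight's CONFIG object and its network calls, not the library's actual API.

```python
# Stand-in for CONFIG.API.JOB_ID.
CONFIG = {"JOB_ID": None}

def non_blocking_request(submit, poll):
    """Submit a new job if none is pending; otherwise poll the pending one."""
    if CONFIG["JOB_ID"] is None:
        # Initial request: remember the job id so later calls poll its status.
        CONFIG["JOB_ID"] = submit()
        return None
    status, result = poll(CONFIG["JOB_ID"])
    if status == "COMPLETED":
        # Clear the id so the next call submits a fresh job.
        CONFIG["JOB_ID"] = None
        return result
    return None
```

Each call either starts a job or advances it by one status check, which is what makes the request non-blocking.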

stream_send(values: Dict[int, Any], sio: SimpleClient)[source]#

Uploads values to the remote service for the current job.

Parameters:
  • values (Dict[int, Any]) – Values to upload, keyed by integer id.

  • sio (socketio.SimpleClient) – Connected websocket client.
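A sketch of the upload loop, with a fake client in place of socketio.SimpleClient; the `stream_upload` event name and the message shape are hypothetical, chosen only to illustrate emitting one event per entry.

```python
class FakeSio:
    """Stand-in for socketio.SimpleClient that records emitted events."""
    def __init__(self):
        self.sent = []
    def emit(self, event, data):
        self.sent.append((event, data))

def stream_send(values, sio):
    # Emit each (id, value) pair over the websocket as its own event.
    for key, value in values.items():
        sio.emit("stream_upload", {"id": key, "value": value})
```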

submit_request(data: bytes, headers: Dict[str, Any]) ResponseModel | None[source]#

Sends a request to the remote endpoint and handles the response object.

Parameters:

  • data (bytes) – Serialized request body.

  • headers (Dict[str, Any]) – Headers to send with the request.

Raises:

Exception – If the response has a status code other than 200.

Returns:

Response.

Return type:

(ResponseModel)

exception nnsight.intervention.backends.remote.RemoteException[source]#