ndif

CRITICAL_PACKAGES module-attribute

CRITICAL_PACKAGES = {'nnsight', 'transformers', 'torch'}

NDIF_ENV module-attribute

NDIF_ENV = None

NdifStatus

NdifStatus(response: dict)

Bases: dict

Status for remote execution on NDIF.

This class provides a structured view of the NDIF status, including information about all deployed models and their current states. It inherits from dict, allowing direct access to the underlying status data while providing rich formatting for display.

ATTRIBUTE DESCRIPTION
status

The overall service status (UP, REDEPLOYING, or DOWN).

TYPE: Status


PARAMETER DESCRIPTION
response

Dictionary mapping repo_id to model info dictionaries, each containing 'model_class', 'repo_id', 'revision', 'type', and 'state'.

TYPE: dict

status property

status: Status
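The dict-plus-derived-view pattern that NdifStatus describes can be sketched as follows (an illustrative stand-in, not the nnsight implementation; the state-to-status mapping here is an assumption based on the Status and ModelStatus docs below):

```python
from enum import Enum


class Status(Enum):
    UP = "UP"
    REDEPLOYING = "REDEPLOYING"
    DOWN = "DOWN"


class StatusView(dict):
    """Dict subclass: the raw response stays accessible, plus a derived summary."""

    @property
    def status(self) -> Status:
        states = [model.get("state") for model in self.values()]
        if "RUNNING" in states:
            return Status.UP
        if "DEPLOYING" in states:
            return Status.REDEPLOYING
        return Status.DOWN


view = StatusView({
    "meta-llama/Llama-3.1-70B": {
        "model_class": "LanguageModel",
        "repo_id": "meta-llama/Llama-3.1-70B",
        "revision": "main",
        "type": "Dedicated",
        "state": "RUNNING",
    }
})
print(view.status.value)                          # UP
print(view["meta-llama/Llama-3.1-70B"]["state"])  # RUNNING
```

Because the class inherits from dict, both the derived `status` property and plain key access work on the same object.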

Status

Bases: Enum

Overall NDIF service status.

ATTRIBUTE DESCRIPTION
UP

Service is operational with at least one model running.

REDEPLOYING

Service is transitioning; models are deploying or starting.

DOWN

Service is unavailable or no models are running.

UP class-attribute instance-attribute
UP = 'UP'
REDEPLOYING class-attribute instance-attribute
REDEPLOYING = 'REDEPLOYING'
DOWN class-attribute instance-attribute
DOWN = 'DOWN'

ModelStatus

Bases: Enum

Status of an individual model deployment.

ATTRIBUTE DESCRIPTION
RUNNING

Model is fully deployed and accepting requests.

DEPLOYING

Model is currently being deployed.

NOT_DEPLOYED

Model is configured but not yet started.

DOWN

Model deployment has failed or is unavailable.

RUNNING class-attribute instance-attribute
RUNNING = 'RUNNING'
DEPLOYING class-attribute instance-attribute
DEPLOYING = 'DEPLOYING'
NOT_DEPLOYED class-attribute instance-attribute
NOT_DEPLOYED = 'NOT DEPLOYED'
DOWN class-attribute instance-attribute
DOWN = 'DOWN'

DeploymentType

Bases: Enum

Type of model deployment on NDIF.

ATTRIBUTE DESCRIPTION
DEDICATED

Model is a permanent deployment.

PILOT_ONLY

Model available only for pilot users.

SCHEDULED

Model runs on a schedule (e.g., specific hours).

DEDICATED class-attribute instance-attribute
DEDICATED = 'Dedicated'
PILOT_ONLY class-attribute instance-attribute
PILOT_ONLY = 'Pilot-Only'
SCHEDULED class-attribute instance-attribute
SCHEDULED = 'Scheduled'

request_status classmethod

request_status() -> Union[Status, dict]

Fetch raw status data from the NDIF API.

RETURNS DESCRIPTION
Union[Status, dict]

The raw JSON response from the NDIF status endpoint.

RAISES DESCRIPTION
Exception

If the request times out, with a DOWN status message.
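The documented timeout behavior can be sketched as a thin wrapper (the fetch callable here is a stub standing in for the HTTP request; the real endpoint URL is not shown in these docs):

```python
import socket


def request_status_sketch(fetch) -> dict:
    """Re-raise a timeout as a DOWN-status message instead of a raw network error."""
    try:
        return fetch()  # stands in for the GET against the status endpoint
    except socket.timeout as exc:
        raise Exception("NDIF is DOWN: status request timed out") from exc


def slow_fetch():
    raise socket.timeout("timed out")


try:
    request_status_sketch(slow_fetch)
except Exception as exc:
    print(exc)  # NDIF is DOWN: status request timed out
```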

__str__

__str__()

__repr__

__repr__()

register

register(module: ModuleType | str)

Register a local module for serialization by value when executing remotely on NDIF.

When submitting code for remote execution on NDIF, any local modules that are not installed on the server will cause a ModuleNotFoundError. This function registers a module so that its class definitions and function source code are serialized and sent along with the request, allowing them to be rebuilt on the server.

This is a wrapper around cloudpickle.register_pickle_by_value.
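The by-value idea can be sketched with the standard library alone (illustrative only; nnsight delegates the real serialization to cloudpickle, which embeds code objects rather than raw source):

```python
# Illustrative stand-in for serialization "by value": ship the source text
# with the request and rebuild the definition where the module is absent.
MYMODULE_SOURCE = "def myfunction(x):\n    return x * 2\n"  # hypothetical local module

payload = {"code": MYMODULE_SOURCE, "entrypoint": "myfunction"}

# "Server" side: no mymodule installed, so rebuild it from the payload.
namespace: dict = {}
exec(payload["code"], namespace)
result = namespace[payload["entrypoint"]](21)
print(result)  # 42
```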

PARAMETER DESCRIPTION
module

The module to register for serialization by value. Can be the actual module object or a string with the module's name.

TYPE: ModuleType | str

Note
  • Call this function after importing the module but before using any of its contents in a remote context.
  • The module's source code and definitions will be included in the serialized payload, so keep registered modules reasonably sized.

Examples:

import mymodule
from nnsight import LanguageModel
from nnsight.ndif import register

# Register the local module so it can be used remotely
register(mymodule)

# Or register by module name
register("mymodule")

# Now you can use functions/classes from mymodule in remote execution
from mymodule.myfile import myfunction

model = LanguageModel("meta-llama/Llama-3.1-70B")
with model.generate("Hello", remote=True):
    result = myfunction(model)
    result.save()

status

status(raw: bool = False) -> Union[dict, NdifStatus]

Query the current status of the NDIF service and all deployed models.

This is the primary function for checking NDIF availability and model status. When printed, the returned NdifStatus object displays a formatted table of all available models with their deployment type and current state.

PARAMETER DESCRIPTION
raw

If True, return the raw API response dict. If False (default), return a formatted NdifStatus object.

TYPE: bool DEFAULT: False

RETURNS DESCRIPTION
Union[dict, NdifStatus]

If raw=True: the raw JSON response from the NDIF API.
If raw=False: an NdifStatus object with formatted model information, or an empty dict if the request fails.

Examples

from nnsight import ndif_status
status = ndif_status()
print(status)
NDIF Service: Up 🟢
┏━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━┳━━━━━━━━━━━┳━━━━━━━━━┓
┃ Model Class   ┃ Repo ID                    ┃ Revision ┃ Type      ┃ Status  ┃
┡━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━╇━━━━━━━━━━━╇━━━━━━━━━┩
│ LanguageModel │ meta-llama/Llama-3.1-70B   │ main     │ Dedicated │ RUNNING │
└───────────────┴────────────────────────────┴──────────┴───────────┴─────────┘

ndif_status

ndif_status(raw: bool = False) -> Union[dict, NdifStatus]

Deprecated: Use status() instead.

is_model_running

is_model_running(repo_id: str, revision: str = 'main') -> bool

Check whether a specific model is currently running on NDIF.

PARAMETER DESCRIPTION
repo_id

The HuggingFace repository ID (e.g., "meta-llama/Llama-3.1-70B").

TYPE: str

revision

The model revision/branch to check. Defaults to "main".

TYPE: str DEFAULT: 'main'

RETURNS DESCRIPTION
bool

True if the model is running and available, False otherwise (including if the API request fails).

Examples

from nnsight import is_model_running
if is_model_running("meta-llama/Llama-3.1-70B"):
    print("Model is available!")
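The check above can be sketched against a raw status response (field names are taken from the NdifStatus parameter docs; the exact response shape is an assumption):

```python
def model_running_sketch(response: dict, repo_id: str, revision: str = "main") -> bool:
    """Return True if any entry matches repo_id/revision and is RUNNING."""
    for info in response.values():
        if (info.get("repo_id") == repo_id
                and info.get("revision") == revision
                and info.get("state") == "RUNNING"):
            return True
    return False


response = {
    "meta-llama/Llama-3.1-70B": {
        "repo_id": "meta-llama/Llama-3.1-70B",
        "revision": "main",
        "state": "RUNNING",
    }
}
print(model_running_sketch(response, "meta-llama/Llama-3.1-70B"))  # True
```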

get_local_env

get_local_env() -> dict

Get the local Python environment information.
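A sketch of what such a function plausibly collects, keyed to CRITICAL_PACKAGES above (reading versions via importlib.metadata is an assumption about the implementation):

```python
import sys
from importlib import metadata


def local_env_sketch(packages=("nnsight", "transformers", "torch")) -> dict:
    """Collect the local Python version plus versions of the critical packages."""
    env = {"python": sys.version.split()[0]}
    for name in packages:
        try:
            env[name] = metadata.version(name)
        except metadata.PackageNotFoundError:
            env[name] = None  # package not installed locally
    return env


print(local_env_sketch())
```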

get_remote_env

get_remote_env(force_refresh: bool = False) -> dict

Fetch and cache the NDIF environment information from the remote server.

PARAMETER DESCRIPTION
force_refresh

If True, always refetch even if cached.

TYPE: bool DEFAULT: False

RETURNS DESCRIPTION
dict

The environment info from NDIF.

RAISES DESCRIPTION
Exception

If the request fails.
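The cache-unless-forced contract can be sketched with a module-level cache mirroring the NDIF_ENV attribute above (the fetcher is a stub; the real request target is not documented here):

```python
_NDIF_ENV_CACHE = None  # mirrors the NDIF_ENV module attribute


def get_remote_env_sketch(fetch, force_refresh: bool = False) -> dict:
    """Fetch once and cache; refetch only when force_refresh is True."""
    global _NDIF_ENV_CACHE
    if _NDIF_ENV_CACHE is None or force_refresh:
        _NDIF_ENV_CACHE = fetch()
    return _NDIF_ENV_CACHE


calls = []


def fetch():
    calls.append(1)
    return {"torch": "2.4.0"}


get_remote_env_sketch(fetch)                      # network hit
get_remote_env_sketch(fetch)                      # served from cache
get_remote_env_sketch(fetch, force_refresh=True)  # network hit again
print(len(calls))  # 2
```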

build_table

build_table(local_env: dict, remote_env: dict) -> Table

Build the environment comparison table for local and remote.

compare

compare() -> None

Compare the local Python environment with the NDIF remote environment and print the results.

RETURNS DESCRIPTION
None
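Taken together, build_table and compare line up local and remote versions of the critical packages. The version-diff core can be sketched without rich (illustrative only; the real compare() renders a formatted table):

```python
def version_mismatches(local_env: dict, remote_env: dict) -> dict:
    """Map each critical package with differing versions to its (local, remote) pair."""
    return {
        pkg: (local_env.get(pkg), remote_env.get(pkg))
        for pkg in ("nnsight", "transformers", "torch")  # CRITICAL_PACKAGES
        if local_env.get(pkg) != remote_env.get(pkg)
    }


print(version_mismatches(
    {"nnsight": "0.4.0", "torch": "2.4.0"},
    {"nnsight": "0.4.0", "torch": "2.4.1"},
))  # {'torch': ('2.4.0', '2.4.1')}
```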