The provided model needs to be in the Onnx format.

***Security & confidentiality warnings:***

`model`: The model, sent in the Onnx format, is encrypted in transit via TLS (as are all connections). It may be subject to inference attacks if an adversary is able to query the trained model repeatedly to determine whether or not a particular example was part of its training dataset.

Args:
model (str): Path to Onnx model file.
model_name (Optional[str], optional): Name of the model. By default, the server will assign a random UUID. You can call the model with the name you specify here.
optimize (bool): Whether tract (our inference engine) should optimize the model or not. Optimizing should only be turned off when tract wasn't able to optimize the model.

Raises:
HttpError: raised by the requests lib to relay server side errors
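
As an illustration, here is a minimal sketch of calling `upload_model` with the arguments documented above. The `client` object, how it connects to the server, and the example file name are assumptions, not part of the documented signature:

```python
# Minimal sketch, not a verified example. The `client` argument is assumed to be
# an already-connected BlindAI client exposing the upload_model() method
# documented above; how it is constructed and connected is out of scope here.

def upload_onnx_model(client, onnx_path: str = "./resnet18.onnx"):
    """Upload a local Onnx file and return the server's response."""
    response = client.upload_model(
        model=onnx_path,        # path to the Onnx model file
        model_name="resnet18",  # optional; otherwise the server assigns a random UUID
        optimize=True,          # let tract optimize the model (turn off only if tract fails)
    )
    # The reply is expected to carry the model identifier (the SHA-256 hash
    # described in the next section).
    return response
```
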
Send data to the server to make a secure inference.

The data provided must be in a list, as the tensor will be rebuilt inside the server.

***Security & confidentiality warnings:***

`model_id`: Hash of the Onnx model uploaded. The hash is returned via gRPC through the proto files. It's a SHA-256 hash that is generated each time a model is uploaded.

`tensors`: Protected in transit and protected when running on the secure enclave. In the case of a compromised OS, the data is isolated and confidential by SGX design.

Args:
model_id (str): If set, will run a specific model.
input_tensors (Union[List[Any], List[List[Any]]]): The input data. It must be an array of numpy arrays, tensors, or a flat list of the same type as the datum_type specified in `upload_model`.
dtypes (Union[List[ModelDatumType], ModelDatumType], optional): The type of the data you want to upload. Only required if you are uploading flat lists; it will be ignored if you are uploading numpy arrays or tensors (this info will be extracted directly from them).
shapes (Union[List[List[int]], List[int]], optional): The shape of the data you want to upload. Only required if you are uploading flat lists; it will be ignored if you are uploading numpy arrays or tensors (this info will be extracted directly from them).

Raises:
HttpError: raised by the requests lib to relay server side errors
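
The sketch below shows the two documented ways of passing inputs: numpy arrays (dtypes and shapes inferred) or a flat list (dtypes and shapes given explicitly). It assumes the call is exposed as `run_model`, that `ModelDatumType` is importable from the top-level package, and that it has an `F32` member; all three are assumptions:

```python
# Minimal sketch, not a verified example. Assumes a connected BlindAI `client`,
# that this call is exposed as run_model(), and that ModelDatumType with an F32
# member can be imported as shown; these names are assumptions.
import numpy as np

from blindai import ModelDatumType  # import path assumed


def run_inference(client, model_id: str):
    # model_id is the SHA-256 hash returned when the model was uploaded.

    # Option A: numpy inputs -- dtypes and shapes are inferred from the arrays.
    x = np.zeros((1, 3, 224, 224), dtype=np.float32)
    prediction = client.run_model(model_id=model_id, input_tensors=[x])

    # Option B: a flat list -- dtypes and shapes must then be given explicitly.
    flat = [0.0] * (1 * 3 * 224 * 224)
    prediction = client.run_model(
        model_id=model_id,
        input_tensors=flat,
        dtypes=ModelDatumType.F32,
        shapes=[1, 3, 224, 224],
    )
    return prediction
```
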
Delete a model in the inference server.

This may be used to free up some memory. If you did not specify that you wanted your model to be saved on the server, please note that the model will only be present in memory, and will disappear when the server closes.

***Security & confidentiality warnings:***

`model_id`: If you are using this on the Mithril Security Cloud, you can only delete models that you uploaded. Otherwise, the deletion of a model relies only on the `model_id`. It doesn't rely on a session token or anything else, hence if the `model_id` is known, its deletion is possible.
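
A correspondingly small sketch, assuming the deletion call is exposed as `delete_model` (the method name is not given in the text above):

```python
# Minimal sketch, not a verified example: assumes the deletion call is exposed
# as delete_model() and that `client` is a connected BlindAI client.

def delete_uploaded_model(client, model_id: str):
    # model_id is the SHA-256 hash returned when the model was uploaded.
    client.delete_model(model_id)
```
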