Submitting Predictions

Once you've built your differentially private model and made predictions, you can submit them for scoring. This can be done from your local environment or directly from a cell running the %%ag magic command on the Enclave server.

Submitting Predictions Locally

To submit your predictions from your local environment, use the submit_predictions method of the client object returned by ag.login. Ensure that your predictions are a Pandas DataFrame.

# Assuming 'ag_client' is your logged-in client object
# and 'predictions_df' is a DataFrame containing your predictions

result = ag_client.submit_predictions(predictions_df)
print(result)
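
For context, here is a minimal end-to-end sketch of the local workflow. The login credentials, competition name, and prediction column shown are placeholders; replace them with the values and format your competition requires.

# Minimal sketch of the local workflow; credentials, competition name,
# and the prediction column below are placeholders
import antigranular as ag
import pandas as pd

ag_client = ag.login("<client_id>", "<client_secret>", competition="<competition_name>")

# Illustrative predictions; the expected columns depend on the competition
predictions_df = pd.DataFrame({"prediction": [0, 1, 1, 0]})

result = ag_client.submit_predictions(predictions_df)
print(result)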

The method returns a dictionary representing your score. Note that a score is generated only for a competition, not for a dataset.

A successful response looks something like this:

{
  'score': {
    'leaderboard': 0.30710269312062605,
    'logs': {
      'BIN_ACC': 0.30710269312062605,
      'LIN_EPS': -0.0
    }
  }
}
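
Since the response is a plain dictionary, you can read the leaderboard score and the per-metric logs directly from it:

# 'result' is the dictionary returned by submit_predictions
leaderboard_score = result['score']['leaderboard']
metric_logs = result['score']['logs']  # e.g. {'BIN_ACC': ..., 'LIN_EPS': ...}
print(f"Leaderboard score: {leaderboard_score}")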

Submitting Predictions from %%ag Cell

If you're executing code in the Enclave with the %%ag magic cell, you can use the submit_predictions function from the ag_utils library. Make sure your predictions are formatted as a Pandas DataFrame, a PrivateDataFrame, or a PrivateSeries.

%%ag

# Assuming 'predictions_df' is a DataFrame, PrivateDataFrame, or PrivateSeries containing your predictions
result = submit_predictions(predictions_df)

As with a local submission, the function returns a dictionary containing your score. Remember, scoring applies to competitions, not to individual datasets.

If there are issues with the submission, such as mismatched dimensions between your predictions and the test data, you might receive an error message like this:

{ "detail": "Supervisor returned error while submitting predictions. 
Error from upstream service: (400, 'Dimensions of user provided data does not match with test data')" }
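
One way to catch this before submitting is to compare the shape of your predictions against what the competition expects. A minimal sketch, assuming a hypothetical expected_rows value taken from the competition description:

# 'expected_rows' is a hypothetical placeholder for the number of rows in the test set
expected_rows = 10_000

if len(predictions_df) != expected_rows:
    raise ValueError(
        f"Predictions have {len(predictions_df)} rows, expected {expected_rows}"
    )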