
Competing on Antigranular is not just about obtaining the maximum accuracy. It is also about spending the right amount of privacy budget when performing any differentially private method. You can end up with a low score even if your predictions are accurate, if you spend too much privacy budget. To avoid this, spend your privacy budget wisely, and go through the important notes and practice before attempting a competition.

Important notes

  • You can learn about the function signatures and what Antigranular has to offer in the Private Python section.
  • Learn how to use private pandas using the following quick guide.
  • Practice applying differentially private methods to build sample prediction models on various datasets before trying out the competition (a minimal, generic sketch follows this list).
  • Learn to apply commonly used differentially private methods using the sample notebooks that are uploaded here.
  • Epsilon spent on competitions cannot be erased or undone. It is therefore advised to export important variables and download useful results / data to a locally loadable file, so you do not spend unnecessary epsilon recomputing results you already have.
  • When testing a function signature or syntax, spend a negligible amount of epsilon (for example, 0.00001) rather than wasting your precious privacy budget and reducing your overall score.
  • If you need any guidance or want to request a feature, feel free to connect with us on our Discord channel.
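As practice material, here is a minimal, generic sketch of a differentially private mean using the Laplace mechanism in plain NumPy. This is not the Antigranular API; the function name `dp_mean`, the clipping bounds, and the toy ages are purely illustrative, and the sensitivity bound assumes the record count is public.

```python
import numpy as np

def dp_mean(values, lower, upper, epsilon, rng=None):
    """Differentially private mean via the Laplace mechanism.

    Values are clipped to [lower, upper]; for a fixed number of
    records n, changing one record shifts the mean by at most
    (upper - lower) / n, so adding Laplace noise with that scale
    divided by epsilon yields an epsilon-DP estimate.
    """
    rng = rng or np.random.default_rng()
    values = np.clip(np.asarray(values, dtype=float), lower, upper)
    sensitivity = (upper - lower) / len(values)
    return values.mean() + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Toy data: a noisy estimate of a mean age, spending epsilon = 0.1
ages = [23, 31, 45, 52, 29, 38, 61, 27]
print(dp_mean(ages, lower=0, upper=100, epsilon=0.1))
```

Note how a smaller epsilon increases the noise scale: this is exactly the accuracy-versus-budget trade-off the competition score rewards you for managing.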

How is the score calculated?

The scoring mechanism for a particular competition is described in its overview section and will most likely vary depending on the dataset used. Here is an example scoring mechanism, used in our first competition (a worked example follows the list below):

  • Score = Λ - ε/κ
    • Accuracy (Λ): This parameter corresponds to the accuracy of your prediction against the test data and can be a maximum of 1.0 for 100 percent accuracy.

    • Epsilon (ε): A privacy parameter in differential privacy that quantifies the level of privacy protection. Smaller ε provides stronger privacy guarantees but may result in noisier results, while larger ε provides weaker privacy guarantees but more accurate results.

    • Constant (κ): This parameter will vary depending on the size of the dataset used and will be mentioned in the overview section of the competition.

  • You can learn more about other scoring metrics here.
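To make the penalty term concrete, here is a small sketch of the example formula in Python. The accuracy, epsilon, and κ values are made up for illustration; the real constant is published in each competition's overview.

```python
def score(accuracy, epsilon_spent, kappa):
    """Score = Λ - ε/κ: accuracy minus a penalty for the privacy budget spent."""
    return accuracy - epsilon_spent / kappa

# Same accuracy, different budgets (kappa = 10 is an illustrative value).
print(round(score(accuracy=0.92, epsilon_spent=1.0, kappa=10), 2))  # 0.82
print(round(score(accuracy=0.92, epsilon_spent=5.0, kappa=10), 2))  # 0.42
```

The second submission predicts just as well but spends five times the budget, and its score drops accordingly.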

ε in Differential Privacy

  • Epsilon (ε): In differential privacy, epsilon is a key parameter that measures the level of privacy protection. Smaller ε provides stronger privacy but may result in noisier results, while larger ε offers weaker privacy but more accurate outcomes.

  • Privacy Protection: Differential privacy limits the impact of individual data points on query results. It achieves this by adding calibrated noise, making it difficult to distinguish if specific data was used in computations.

  • Trade-off: Epsilon controls the trade-off between privacy and utility. Lower ε offers stronger privacy guarantees at the cost of more noise, while higher ε provides weaker privacy but more accurate results.

  • Formal Definition: A mechanism M is ε-differentially private if, for any two neighboring datasets D and D′ and any set of outputs S, the probability that M(D) produces an output in S is at most e^ε times the probability that M(D′) does (see the displayed statement after this list).

  • Calibrating Privacy: Epsilon allows balancing privacy and data utility. Privacy-conscious analyses aim for an appropriate ε to ensure privacy while preserving result accuracy.
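For reference, the formal definition above can be written out as a single inequality, where M is the randomized mechanism, D and D′ are any pair of neighboring datasets, and S is any set of possible outputs.

```latex
% epsilon-differential privacy: for all neighboring datasets D, D'
% and every set of outputs S
\Pr[\mathcal{M}(D) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[\mathcal{M}(D') \in S]
```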