Differential privacy (DP) provides a way to quantify privacy loss. A privacy budget, conventionally denoted ε (epsilon), measures by how much the risk to an individual's privacy may increase when their data is included in a dataset. The higher the value, the weaker the privacy protection.
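The role ε plays is easiest to see in randomized response, the textbook ε-DP mechanism. This is an illustration of the parameter, not Apple's mechanism (the paper's subject is Apple's sketch-based system), and the function names below are mine:

```python
import math
import random

def truth_probability(epsilon: float) -> float:
    # Probability of reporting the true value under randomized response.
    return math.exp(epsilon) / (math.exp(epsilon) + 1.0)

def randomized_response(true_bit: int, epsilon: float) -> int:
    # Report the true bit with probability e^eps / (e^eps + 1),
    # otherwise flip it. This satisfies epsilon-differential privacy.
    if random.random() < truth_probability(epsilon):
        return true_bit
    return 1 - true_bit

# A small epsilon leaves real deniability; a large one leaves almost none.
print(truth_probability(1.0))   # ~0.73: the report is plausibly a lie
print(truth_probability(16.0))  # ~0.9999999: effectively always the truth
```

At ε = 1 an observer cannot tell whether any single report is truthful; at ε = 16 the report is essentially always the true value, which is why large budgets draw criticism.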
This paper, by Jun Tang, Aleksandra Korolova, Xiaolong Bai, Xueqiang Wang, and Xiaofeng Wang, identifies the components of the DP system under macOS and how the system is organized. Apple's use of DP focuses on new words, emojis, deep links, and lookup hints in Notes. When a user types a new word or uses an emoji, a privatized version of that record is added to a database table.
The paper provides detailed information on what this database contains and how the related privacy budget system works. If you are interested in how the system is put together and don't have the time to reverse engineer it yourself, the paper provides valuable insight, as Apple offers no documentation of the process.
This is perhaps the most important outcome of the paper. The how and what of the differential privacy system is not well explained and is far from transparent. Furthermore, the privacy loss the system permits is much higher than academics would consider reasonable. The allowed loss is ε = 16 per day, which is higher than would be considered privacy conscious, and because the budget resets daily, it permits a much greater cumulative loss over time. Properly implementing DP in the real world is no easy feat, but transparency and accountability regarding the effectiveness of deployed systems are important in moving forward.
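Why the daily reset matters can be made concrete with basic sequential composition, a standard DP result: the ε values of successive releases add up. A rough worst-case sketch (assuming the per-day cap of 16 the paper reports is fully spent each day):

```python
# Basic sequential composition: the epsilons of successive data releases
# add up, so a budget that resets daily grows linearly over time.
DAILY_BUDGET = 16.0  # per-day cap reported in the paper

def cumulative_loss(days: int, daily_budget: float = DAILY_BUDGET) -> float:
    # Worst-case cumulative privacy loss after `days` days,
    # assuming the full daily budget is consumed each day.
    return days * daily_budget

print(cumulative_loss(7))    # 112.0 over a week
print(cumulative_loss(365))  # 5840.0 over a year
```

Even if actual daily usage consumes less than the cap, the point stands: a per-day limit with no lifetime cap bounds nothing over a user's lifetime.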