Personalized Privacy-Preserving Social Recommendation
Xuying Meng^1,2, Suhang Wang^3, Kai Shu^3, Jundong Li^3, Bo Chen^4, Huan Liu^3 and Yujun Zhang^1
^1 Institute of Computing Technology, Chinese Academy of Sciences, Beijing 100190, China
^2 University of Chinese Academy of Sciences, Beijing 100049, China
^3 Computer Science and Engineering, Arizona State University, Tempe, 85281, USA
^4 Department of Computer Science, Michigan Technological University, Houghton, 49931, USA
{mengxuying, zhmj}@ict.ac.cn, {suhang.wang, kai.shu, jundongl, huan.liu}@asu.edu, bchen@mtu.edu
Abstract
Privacy leakage is an important issue for social recommendation. Existing privacy-preserving social recommendation approaches usually allow the recommender to fully control users' information. This may be problematic since the recommender itself may be untrusted, leading to serious privacy leakage. Besides, building social relationships requires sharing interests as well as other private information, which may lead to further privacy leakage. Although users are sometimes allowed to hide their sensitive private data through privacy settings, the data being shared can still be abused by adversaries to infer sensitive private information. Supporting social recommendation with minimal privacy leakage to the untrusted recommender and other users (i.e., friends) is an important yet challenging problem.
In this paper, we aim to achieve privacy-preserving social recommendation under personalized privacy settings. We propose PrivSR, a novel framework for privacy-preserving social recommendation in which users can model ratings and social relationships privately. Meanwhile, by allocating different noise magnitudes to personalized sensitive and non-sensitive ratings, we can protect users' privacy against the untrusted recommender and friends. Theoretical analysis and experimental evaluation on real-world datasets demonstrate that our framework can protect users' privacy while retaining the effectiveness of the underlying recommender system.
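The noise-allocation idea above can be illustrated with a minimal sketch: under a standard Laplace mechanism (as used in differential privacy), sensitive ratings are assigned a smaller privacy budget ε, and hence receive stronger noise, than non-sensitive ones. This is an illustrative sketch only; the function names and ε values are our own assumptions, not the actual PrivSR mechanism.

```python
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def perturb_ratings(ratings, sensitive, eps_sensitive=0.1, eps_plain=1.0,
                    sensitivity=1.0):
    """Perturb each rating with Laplace noise of scale sensitivity/eps.

    Ratings the user marked as sensitive get a smaller epsilon, i.e.,
    larger noise and thus stronger protection.
    """
    noisy = {}
    for item, rating in ratings.items():
        eps = eps_sensitive if item in sensitive else eps_plain
        noisy[item] = rating + laplace_noise(sensitivity / eps)
    return noisy

# Hypothetical example: the medical item is user-marked as sensitive.
ratings = {"movie_a": 4.0, "clinic_b": 5.0}
noisy = perturb_ratings(ratings, sensitive={"clinic_b"})
```

In expectation, the noise added to `clinic_b` is an order of magnitude larger than that added to `movie_a`, matching the intuition that sensitive ratings should be far harder to reconstruct.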
Introduction
Recommender systems have become an imperative component of myriad online commercial platforms. With the increasing popularity of social networks, recommender systems can take advantage of rich social relationships to further improve recommendation effectiveness (Tang, Hu, and Liu 2013; Wang et al. 2017; Shu et al. 2018). Despite their effectiveness, these social relationship-based recommender systems (i.e., social recommendation) may introduce another source of privacy leakage. For example, by observing victim users' ratings on products such as adult or medical items, an attacker may infer the victims' private sexual orientation and health conditions (Fredrikson et al. 2014), which may be further abused for financial benefit (Nikolaenko et al. 2013).
Copyright © 2018, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.
In practice, a privacy-preserving social recommender system that can produce accurate recommendation results without sacrificing users' privacy is highly desirable. A few mechanisms have been dedicated to this line of research. However, most of them suffer from the following defects. First, a vast majority of existing efforts (Liu and Terzi 2010; Jorgensen and Yu 2014) heavily rely on the assumption that the recommender is fully trusted. They neglect the fact that the recommender itself may be untrusted and may conduct malicious behaviors, causing serious privacy leakage. Second, some other works (Hoens, Blanton, and Chawla 2010; Tang and Wang 2016) rely on cryptography to prevent users' exact inputs from being leaked to the untrusted recommender. Nonetheless, it has been shown that attackers can still infer sensitive information about victim users based on their influence on the final results (McSherry and Mironov 2009). In addition, the cryptographic process is usually expensive and may incur large computational overhead. Third, some existing works (Machanavajjhala, Korolova, and Sarma 2011; Jorgensen and Yu 2014; Hua, Xia, and Zhong 2015) rely on friends' historical ratings to make recommendations. These methods, however, do not differentiate sensitive and non-sensitive ratings and simply treat them equally, which contradicts real-world scenarios. In practice, social media sites such as IMDB and Facebook^1 allow users to specify the visibility of their ratings on products. Treating all ratings as equally sensitive, and thus not exposing any non-sensitive ratings, makes it difficult to attract common-interest friends and to make effective recommendations, sacrificing user experience in the long run. Our work allows users to disclose non-sensitive ratings while preventing sensitive ratings from being inferred from the exposed non-sensitive ones.
Resolving all the aforementioned defects is necessary for building an effective privacy-preserving social recommender system, which is a very challenging task due to the following reasons. First, to eliminate the assumption that the recommender is fully trusted, we need to change the recommender system from a fully centralized manner to a semi-centralized manner. In other words, instead of fully relying on the recommender, we now allow users and the rec-
^1 Facebook provides public pages for products, e.g., https://www.facebook.com/pages/Google-Earth/107745592582048