$c_1 = (T_2 \cdot g^{s})^{\theta_1} = g^{(r+s)\theta_1} \bmod N$ and $c_2 = g^{s} \bmod N$, then sends $\{T_1, c_1, c_2\}$ to the proxy.
Upon receiving $\{T_1, c_1, c_2\}$, the proxy computes $(c_1)^{\theta_2} = h^{r+s}$ and its inverse $(h^{r+s})^{-1} \bmod N$. Then, it computes
$$T_1' = T_1 \cdot (h^{r+s})^{-1} \bmod N^2 = E_{pk}^{+}(m \cdot h^{-s})$$
and $c_2' = (c_2)^{\theta_2} = g^{s\theta_2} \bmod N$, and sends $\{T_1', c_2'\}$ to the server.
The server uses $\{T_1', c_2'\}$ to compute $(c_2')^{\theta_1} = h^{s} \bmod N$. Then, the ADD ciphertext can be recovered by calculating
$$(T_1')^{h^{s}} = E_{pk}^{+}(m \cdot h^{-s} \cdot h^{s}) = E_{pk}^{+}(m).$$
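Before moving to the system model, a minimal Python sketch may help make the blinding algebra above concrete. It relies only on the relation $h = g^{\theta_1\theta_2} \bmod N$ implied by the equalities above; the modulus, generator, key shares, and blinding exponents are toy values chosen purely for illustration, and the Paillier operations on $T_1$ are only indicated in comments rather than implemented.

```python
# Toy sketch of the mask-cancellation algebra (not the full EPOC scheme).
# Assumes h = g^(theta1*theta2) mod N; all parameters are hypothetical
# toy values -- a real instantiation uses an RSA-size modulus.
N = 3233                      # toy modulus (61 * 53)
g = 5                         # toy generator
theta1, theta2 = 17, 19       # partial keys (user/server side, proxy side)
h = pow(g, theta1 * theta2, N)

r, s = 7, 11                  # user's blinding exponents

# User: c1 = (T2 * g^s)^theta1 = g^((r+s)*theta1), c2 = g^s (mod N)
c1 = pow(g, (r + s) * theta1, N)
c2 = pow(g, s, N)

# Proxy: raising c1 to theta2 recovers the mask h^(r+s) ...
mask = pow(c1, theta2, N)
assert mask == pow(h, r + s, N)
# ... whose inverse would strip T1 down to E+_pk(m * h^-s) mod N^2.

# Server: recovers h^s from c2' = c2^theta2 using its share theta1
c2_prime = pow(c2, theta2, N)
h_s = pow(c2_prime, theta1, N)
assert h_s == pow(h, s, N)
# Raising T1' to h^s then cancels the h^-s factor inside E+_pk(.),
# by the scalar property E+_pk(x)^k = E+_pk(x * k), yielding E+_pk(m).
```

Both assertions pass because the exponents agree as integers; the actual manipulation of $T_1$ additionally requires the Paillier operations omitted here.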
III. SYSTEM & PRIVACY MODELS
In this section, we formalize the EPOC system and privacy
models.
A. System Model
Our system consists of four types of generic entities: Trusted
Authority (TA), User, Public Cloud Platform (PCP) and Private
Computation Cloud (PCC).
1) Trusted Authority (TA): TA is assumed to be trusted by all the other entities to distribute and manage all the private and public keys in the system.
2) User: The goal of a user is to obtain computed results over the public database according to a function of his choice. The user sends a query to PCP, specifying the function in a privacy-preserving manner. PCP computes the results and returns them to the user. Note that the results can only be decrypted with the help of PCC, i.e., the user cannot obtain the final results without the authorization of PCC.
3) Private Computation Cloud (PCC): PCC provides computation services to the user; e.g., it can process encrypted data, such as performing multiplication over encrypted data. Also, PCC has the ability to partially decrypt ciphertexts sent from PCP, perform some calculations over the plaintexts, and then re-encrypt the data with the corresponding user's public key.
4) Public Cloud Platform (PCP): PCP provides unlimited data storage space, which stores and manages all the public data. PCP stores all the intermediate and final results in encrypted form. Furthermore, PCP has some computation power to perform certain calculations over encrypted data.
Next, we present the dataset and the function used in our EPOC. Suppose a data space $D$ is defined by a set of $\gamma$ dimensions $\{x_1, \cdots, x_\gamma\}$, and a dataset $S$ on $D$ has cardinality $\tau$. A transaction $\vec{a}_i \in S$ can be represented as $\vec{a}_i = (a_{i,1}, \cdots, a_{i,\gamma})$, where $a_{i,k} \in \mathbb{G}_{2p'q'}$ is the value on dimension $x_k$ ($i = 1, \cdots, \tau$; $k = 1, \cdots, \gamma$). All the transactions are stored in plaintext form in PCP, and the dataset $S$ can be accessed by any party in the system (including the adversary $\mathcal{A}^{*}$ defined in Section III-B). A user $\alpha$'s goal is to obtain
the output of the function$^{3}$
$$F:\; o = \sum_{j=1}^{\kappa} C_j\, x_1^{t_{j,1}} \cdots x_\gamma^{t_{j,\gamma}},$$
over the public data stored in PCP, where $C_j \in \mathbb{G}_{2p'q'}$ and $t_{j,k} \in \mathbb{Z}_N$. This kind of function can be used for statistical analysis. For example, the user can calculate the arithmetic mean $\bar{x} = (\sum_{i=1}^{\gamma} x_i)/\gamma$ across different dimensions. The standard deviation $\sigma = \sqrt{\frac{1}{\gamma}\sum_{i=1}^{\gamma}(x_i - \bar{x})^2}$ can also be calculated (note that $X = \frac{1}{\gamma}\sum_{i=1}^{\gamma}(x_i - \bar{x})^2$, a special case of $F$, can be computed on the server side; the user can calculate $\sqrt{X}$ by himself to get the standard deviation). Moreover, for polynomial-kernel SVM classification, the classifier maps the data from the input space $\mathcal{X}$ to a feature space $\mathcal{F}$. In the space $\mathcal{F}$, the discriminant function is $f(x) = \sum_{i=1}^{n} k(x, x_i) + b$, where $k(x, x') = (x^{T} x' + 1)^{d}$ is a degree-$d$ polynomial kernel.

$^{3}$All the functions used in EPOC are defined over the ring of integers $\mathbb{Z}$; thus the function must satisfy the following restrictions: 1) the size of the output of the function must be less than $N$, i.e., $|o| < N$; 2) the size of every monomial must be less than $N$, i.e., $|C_j\, x_1^{t_{j,1}} \cdots x_\gamma^{t_{j,\gamma}}| < N$ for every $j$. If the function is defined over $\mathbb{Z}_N$, the function used in EPOC has no such restriction, i.e., it is formalized as $F:\; o = \sum_{j=1}^{\kappa} C_j\, x_1^{t_{j,1}} \cdots x_\gamma^{t_{j,\gamma}} \bmod N$.
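To make the class of supported functions concrete, the following Python sketch evaluates the monomial form of $F$ directly over the integers and checks that the (unnormalized) arithmetic mean and a degree-2 polynomial kernel both fit it. All numeric values, including the support vector, are hypothetical and chosen only for illustration.

```python
# Hypothetical values for illustration only; F below mirrors
# o = sum_j C_j * x_1^t_{j,1} * ... * x_gamma^t_{j,gamma} over the integers.
from math import prod

def F(coeffs, exps, x):
    """Evaluate sum_j C_j * prod_k x_k^t_{j,k}."""
    return sum(C * prod(xk ** t for xk, t in zip(x, ts))
               for C, ts in zip(coeffs, exps))

x = [4, 7, 9]                 # one transaction (made-up values)
gamma = len(x)

# Arithmetic mean: the server-computable part is sum_i x_i, i.e. F with
# unit coefficients and one degree-1 variable per monomial; the user
# divides by gamma locally (just as he computes sqrt(X) himself for
# the standard deviation).
identity_exps = [[1 if k == i else 0 for k in range(gamma)]
                 for i in range(gamma)]
assert F([1] * gamma, identity_exps, x) == sum(x)

# A degree-2 polynomial kernel k(x, x') = (x^T x' + 1)^2 expands into
# monomials of total degree <= 2 in x, so it also fits the form of F
# (footnote 3's restriction then requires each monomial to stay below N).
x_ref = [2, 1, 3]             # fixed support vector (made-up)
k_val = (sum(a * b for a, b in zip(x, x_ref)) + 1) ** 2
print(k_val)                  # 1849
```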
B. Privacy Model
In our privacy model, PCP and PCC are curious-but-honest parties: they strictly follow the protocol, but are interested in the user's private information; e.g., both PCP and PCC are curious about the user's privately outsourced function and the calculated results. EPOC should meet the following privacy requirements.
• User Privacy: protect the privacy of the user's function and of the function's output from PCP, PCC, and any external adversary. That is, PCP, PCC, and any external adversary cannot learn anything about the user's function in the query, nor anything about the output of the function.
• Privilege Separation: prevent the user from abusing his privilege without the permission of PCC, and allow PCC to efficiently revoke the user's private key.
• Security of Data-in-transit: guarantee confidentiality of
data during transmission.
To satisfy these privacy requirements, we assume that there exists an active adversary $\mathcal{A}^{*}$ in our system. The goal of $\mathcal{A}^{*}$ is to compromise the user's privacy$^{4}$ with the following attacking abilities: $\mathcal{A}^{*}$ may eavesdrop on all the communication links to obtain the challenge user's encrypted queries, encrypted intermediate results, and encrypted final results. Also, $\mathcal{A}^{*}$ may compromise PCP, PCC, and users simultaneously, subject to the following restrictions: 1) $\mathcal{A}^{*}$ cannot compromise PCP and PCC at the same time; 2) $\mathcal{A}^{*}$ cannot compromise the challenge user.
For efficiency reasons, we achieve the aforementioned privacy requirements by defining the following three privacy levels.
Definition 1. (Level-I Privacy): Upon completion of an EPOC protocol, $\mathcal{A}^{*}$ cannot learn the coefficients $C_j$ ($j = 1, \cdots, \kappa$) of the outsourced function $F$ and the final outputs.
Definition 2. (Level-II Privacy): Upon completion of an EPOC protocol, $\mathcal{A}^{*}$ cannot learn the coefficients $C_j$ ($j = 1, \cdots, \kappa$)
$^{4}$Before attacking a user, the adversary $\mathcal{A}^{*}$ first selects a target and tries to guess the plaintext value of the target user's ciphertext with some existing resources. We call the target user the challenge user.