Authentication with Weaker Trust Assumptions for Voting Systems

  • Conference paper
  • In: Progress in Cryptology – AFRICACRYPT 2018 (AFRICACRYPT 2018)

Part of the book series: Lecture Notes in Computer Science (LNSC, volume 10831)

Abstract

Some voting systems are reliant on external authentication services. Others use cryptography to implement their own. We combine digital signatures and non-interactive proofs to derive a generic construction for voting systems with their own authentication mechanisms, from systems that rely on external authentication services. We prove that our construction produces systems satisfying ballot secrecy and election verifiability, assuming the underlying voting system does. Moreover, we observe that works based on similar ideas provide neither ballot secrecy nor election verifiability. Finally, we demonstrate applicability of our results by applying our construction to the Helios voting system.

Notes

  1. Meyer and Smyth describe the application of OAuth in Helios [23].

  2. Quaglia and Smyth [27] provide a tutorial-style introduction to definitions of ballot secrecy and election verifiability, and Smyth [33] provides a technical introduction.

  3. Some voting systems permit the tallier’s role to be distributed amongst several talliers. For simplicity, we consider only a single tallier in this paper.

  4. Let \(A(x_1,\dots ,x_n; r)\) denote the output of probabilistic algorithm A on inputs \(x_1,\dots ,x_n\) and random coins r. Let \(A(x_1,\dots ,x_n)\) denote \(A(x_1,\dots ,x_n;r)\), where r is chosen uniformly at random. And let \(\leftarrow \) denote assignment. Moreover, let \(\langle x \rangle \) denote an optional input and \(\mathbf{v}[v]\) denote component v of vector \(\mathbf{v}\).

  5. Let \(\mathsf {FS}(\varSigma ,\mathcal H)\) denote the non-interactive proof system derived by application of the Fiat-Shamir transformation to sigma protocol \(\varSigma \) and hash function \(\mathcal H\).

  6. We omit a formal definition of asymmetric encryption for brevity.

  7. We adopt the formal definition of comparison-based non-malleability under chosen plaintext attack, which coincides with indistinguishability under a parallel chosen-ciphertext attack (\(\textsf {IND}{\text {-}}\textsf {PA0}\)) [3]. We omit formal security definitions for brevity.

  8. Beyond secrecy and verifiability, attacks against eligibility are also known [23, 38].

  9. Ballot blocking violates the recorded-as-cast assumption used in Cortier et al.’s proof.

  10. Smyth [34] shows that vulnerabilities in Helios cause vulnerabilities in implementations of the mixnet variant and proves verifiability is satisfied when a fix is applied.

  11. Oracles may access game parameters, e.g., \( pk \).

  12. Function \( correct\text {-}outcome \) uses a counting quantifier [31] denoted \(\exists ^{=}\). Predicate \((\exists ^{=\ell } x : P(x))\) holds exactly when there are \(\ell \) distinct values for x such that P(x) is satisfied. Variable x is bound by the quantifier, whereas \(\ell \) is free.
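To make Footnote 5 concrete: below is a toy sketch of the Fiat-Shamir transformation applied to Schnorr's sigma protocol for knowledge of a discrete logarithm. The group parameters, helper names (`_hash_challenge`, `fs_prove`, `fs_verify`), and statement encoding are our own illustrative choices, not the paper's; deployed systems use standardised large groups and careful domain-separated hashing.

```python
import hashlib
import secrets

# Toy Schnorr group: p = 2q + 1 with q prime, g generating the order-q
# subgroup.  Illustrative sizes only; real deployments use large groups.
P, Q, G = 2039, 1019, 4

def _hash_challenge(*parts):
    # Fiat-Shamir: the verifier's random challenge is replaced by a hash
    # of the statement and the prover's commitment.
    data = b"|".join(str(x).encode() for x in parts)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % Q

def fs_prove(x):
    """Non-interactive proof of knowledge of x such that h = g^x mod p."""
    h = pow(G, x, P)
    r = secrets.randbelow(Q)        # prover's random coins
    a = pow(G, r, P)                # sigma-protocol commitment
    c = _hash_challenge(G, h, a)    # challenge derived via the hash
    s = (r + c * x) % Q             # response
    return h, (a, s)

def fs_verify(h, proof):
    a, s = proof
    c = _hash_challenge(G, h, a)
    # Accept iff g^s = a * h^c, exactly as in the interactive protocol.
    return pow(G, s, P) == (a * pow(h, c, P)) % P
```

Because the challenge is recomputed from the transcript, the proof is publicly verifiable without interaction, which is what the construction relies on.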

References

  1. Adida, B.: Helios: web-based open-audit voting. In: USENIX Security 2008: 17th USENIX Security Symposium, pp. 335–348. USENIX Association (2008)

  2. Adida, B., Marneffe, O., Pereira, O., Quisquater, J.: Electing a university president using open-audit voting: analysis of real-world use of Helios. In: EVT/WOTE 2009: Electronic Voting Technology Workshop/Workshop on Trustworthy Elections. USENIX Association (2009)

  3. Bellare, M., Sahai, A.: Non-malleable encryption: equivalence between two notions, and an indistinguishability-based characterization. In: Wiener, M. (ed.) CRYPTO 1999. LNCS, vol. 1666, pp. 519–536. Springer, Heidelberg (1999). https://doi.org/10.1007/3-540-48405-1_33

  4. Benaloh, J., Vaudenay, S., Quisquater, J.: Final report of IACR electronic voting committee. International Association for Cryptologic Research, September 2010. https://iacr.org/elections/eVoting/finalReportHelios_2010-09-27.html

  5. Bernhard, D., Cortier, V., Galindo, D., Pereira, O., Warinschi, B.: SoK: a comprehensive analysis of game-based ballot privacy definitions. In: S&P 2015: 36th Security and Privacy Symposium. IEEE Computer Society (2015)

  6. Bernhard, D., Cortier, V., Pereira, O., Smyth, B., Warinschi, B.: Adapting Helios for provable ballot privacy. In: Atluri, V., Diaz, C. (eds.) ESORICS 2011. LNCS, vol. 6879, pp. 335–354. Springer, Heidelberg (2011). https://doi.org/10.1007/978-3-642-23822-2_19

  7. Bernhard, D., Pereira, O., Warinschi, B.: How not to prove yourself: pitfalls of the Fiat-Shamir heuristic and applications to Helios. In: Wang, X., Sako, K. (eds.) ASIACRYPT 2012. LNCS, vol. 7658, pp. 626–643. Springer, Heidelberg (2012). https://doi.org/10.1007/978-3-642-34961-4_38

  8. Bernhard, D., Pereira, O., Warinschi, B.: On necessary and sufficient conditions for private Ballot submission. Cryptology ePrint Archive, Report 2012/236 (version 20120430:154117b) (2012)

  9. Bulens, P., Giry, D., Pereira, O.: Running mixnet-based elections with Helios. In: EVT/WOTE 2011: Electronic Voting Technology Workshop/Workshop on Trustworthy Elections. USENIX Association (2011)

  10. Bundesverfassungsgericht (Germany’s Federal Constitutional Court): Use of voting computers in 2005 Bundestag election unconstitutional. Press release 19/2009, March 2009

  11. Cortier, V., Galindo, D., Glondu, S., Izabachene, M.: A generic construction for voting correctness at minimum cost - application to Helios. Cryptology ePrint Archive, Report 2013/177 (version 20130521:145727) (2013)

  12. Cortier, V., Galindo, D., Glondu, S., Izabachene, M.: Distributed ElGamal à la Pedersen: application to Helios. In: WPES 2013: Workshop on Privacy in the Electronic Society, pp. 131–142. ACM Press (2013)

  13. Cortier, V., Galindo, D., Glondu, S., Izabachène, M.: Election verifiability for Helios under weaker trust assumptions. In: Kutyłowski, M., Vaidya, J. (eds.) ESORICS 2014 Part II. LNCS, vol. 8713, pp. 327–344. Springer, Cham (2014). https://doi.org/10.1007/978-3-319-11212-1_19

  14. Cortier, V., Galindo, D., Glondu, S., Izabachène, M.: Election verifiability for Helios under weaker trust assumptions. Technical report RR-8555, INRIA (2014)

  15. Cortier, V., Smyth, B.: Attacking and fixing Helios: an analysis of ballot secrecy. In: CSF 2011: 24th Computer Security Foundations Symposium, pp. 297–311. IEEE Computer Society (2011)

  16. Gonggrijp, R., Hengeveld, W.J.: Studying the Nedap/Groenendaal ES3B voting computer: a computer security perspective. In: EVT 2007: Electronic Voting Technology Workshop. USENIX Association (2007)

  17. Gumbel, A.: Steal This Vote: Dirty Elections and the Rotten History of Democracy in America. Nation Books, New York (2005)

  18. Haber, S., Benaloh, J., Halevi, S.: The Helios e-voting demo for the IACR. International Association for Cryptologic Research, May 2010. https://iacr.org/elections/eVoting/heliosDemo.pdf

  19. Jones, D.W., Simons, B.: Broken ballots: will your vote count? CSLI Lecture Notes, vol. 204. Stanford University, Center for the Study of Language and Information (2012)

  20. Juels, A., Catalano, D., Jakobsson, M.: Coercion-resistant electronic elections. In: Chaum, D., Jakobsson, M., Rivest, R.L., Ryan, P.Y.A., Benaloh, J., Kutylowski, M., Adida, B. (eds.) Towards Trustworthy Elections. LNCS, vol. 6000, pp. 37–63. Springer, Heidelberg (2010). https://doi.org/10.1007/978-3-642-12980-3_2

  21. Kiayias, A., Zacharias, T., Zhang, B.: End-to-end verifiable elections in the standard model. In: Oswald, E., Fischlin, M. (eds.) EUROCRYPT 2015 Part II. LNCS, vol. 9057, pp. 468–498. Springer, Heidelberg (2015). https://doi.org/10.1007/978-3-662-46803-6_16

  22. Lijphart, A., Grofman, B.: Choosing an Electoral System: Issues and Alternatives. Praeger, New York (1984)

  23. Meyer, M., Smyth, B.: An attack against the Helios election system that exploits re-voting. arXiv, Report 1612.04099 (2017)

  24. Organization for Security and Co-operation in Europe: Document of the Copenhagen Meeting of the Conference on the Human Dimension of the CSCE (1990)

  25. Organization of American States: American Convention on Human Rights, “Pact of San Jose, Costa Rica” (1969)

  26. Pereira, O.: Internet voting with Helios. In: Real-World Electronic Voting: Design, Analysis and Deployment, Chap. 11. CRC Press (2016)

  27. Quaglia, E.A., Smyth, B.: A short introduction to secrecy and verifiability for elections. arXiv, Report 1702.03168 (2017)

  28. Quaglia, E.A., Smyth, B.: Authentication with weaker trust assumptions for voting systems (2018). https://bensmyth.com/publications/2018-voting-authentication/

  29. Quaglia, E.A., Smyth, B.: Secret, verifiable auctions from elections. Cryptology ePrint Archive, Report 2015/1204 (2018)

  30. Saalfeld, T.: On dogs and whips: recorded votes. In: Döring, H. (ed.) Parliaments and Majority Rule in Western Europe, Chap. 16. St. Martin’s Press (1995)

  31. Schweikardt, N.: Arithmetic, first-order logic, and counting quantifiers. ACM Trans. Comput. Logic 6(3), 634–671 (2005)

  32. Smyth, B.: Ballot secrecy: security definition, sufficient conditions, and analysis of Helios. Cryptology ePrint Archive, Report 2015/942 (2018)

  33. Smyth, B.: A foundation for secret, verifiable elections (2018). https://bensmyth.com/publications/2018-secrecy-verifiability-elections-tutorial/

  34. Smyth, B.: Verifiability of Helios mixnet. In: Voting 2018: 3rd Workshop on Advances in Secure Electronic Voting. LNCS, Springer (2018)

  35. Smyth, B., Bernhard, D.: Ballot secrecy and ballot independence coincide. In: Crampton, J., Jajodia, S., Mayes, K. (eds.) ESORICS 2013. LNCS, vol. 8134, pp. 463–480. Springer, Heidelberg (2013). https://doi.org/10.1007/978-3-642-40203-6_26

  36. Smyth, B., Frink, S., Clarkson, M.R.: Election Verifiability: Cryptographic Definitions and an Analysis of Helios, Helios-C, and JCJ. Cryptology ePrint Archive, Report 2015/233 (2017)

  37. Smyth, B., Hanatani, Y., Muratani, H.: NM-CPA secure encryption with proofs of plaintext knowledge. In: Tanaka, K., Suga, Y. (eds.) IWSEC 2015. LNCS, vol. 9241, pp. 115–134. Springer, Cham (2015). https://doi.org/10.1007/978-3-319-22425-1_8

  38. Smyth, B., Pironti, A.: Truncating TLS Connections to Violate Beliefs in Web Applications. In: WOOT 2013: 7th USENIX Workshop on Offensive Technologies. USENIX Association (2013). First Appeared at Black Hat USA 2013

  39. Springall, D., Finkenauer, T., Durumeric, Z., Kitcat, J., Hursti, H., MacAlpine, M., Halderman, J.A.: Security analysis of the Estonian internet voting system. In: CCS 2014: 21st ACM Conference on Computer and Communications Security, pp. 703–715. ACM Press (2014)

  40. Staff, C.: ACM’s 2014 General Election: Please Take This Opportunity to Vote. Commun. ACM 57(5), 9–17 (2014)

  41. Tsoukalas, G., Papadimitriou, K., Louridas, P., Tsanakas, P.: From Helios to Zeus. J. Elect. Technol. Syst. 1(1), 1–17 (2013)

  42. United Nations: Universal Declaration of Human Rights (1948)

  43. Wolchok, S., Wustrow, E., Halderman, J.A., Prasad, H.K., Kankipati, A., Sakhamuri, S.K., Yagati, V., Gonggrijp, R.: Security analysis of India’s electronic voting machines. In: CCS 2010: 17th ACM Conference on Computer and Communications Security, pp. 1–14. ACM Press (2010)

  44. Wolchok, S., Wustrow, E., Isabel, D., Halderman, J.A.: Attacking the Washington, D.C. internet voting system. In: Keromytis, A.D. (ed.) FC 2012. LNCS, vol. 7397, pp. 114–128. Springer, Heidelberg (2012). https://doi.org/10.1007/978-3-642-32946-3_10

Acknowledgements

In the context of [36], Smyth conceived the fundamental ideas of our construction for election schemes with internal authentication. In addition, Smyth discovered that Helios-C does not satisfy ballot secrecy, whilst analysing election verifiability. Smyth and his co-authors, Frink & Clarkson, decided not to publish these results. This paper builds upon those unpublished results and we are grateful to Frink and Clarkson for their part in inspiring this line of work.

Author information

Corresponding author

Correspondence to Elizabeth A. Quaglia.


Appendices

A Ballot Privacy: Definitions and Proofs

We recall Smyth’s definition of ballot secrecy for election schemes with external authentication (Definition 6), and present a natural, straightforward extension of that definition to capture ballot secrecy for election schemes with internal authentication (Definition 7). Our definitions both use predicate \( balanced \) such that \( balanced (\mathfrak {bb}, nc ,B)\) holds when: for all votes \(v\in \{1,\dots , nc \}\) we have \(|\{b \mid b \in \mathfrak {bb}\wedge \exists v_1 \mathrel . (b,v,v_1) \in B\}| = |\{b \mid b \in \mathfrak {bb}\wedge \exists v_0 \mathrel . (b,v_0,v) \in B\}|\). Intuitively, the definitions challenge an adversary to determine whether the left-right oracle produces ballots for “left” or “right” inputs, by giving the adversary the oracle’s outputs, as well as the election outcome and tallying proof. The definitions prevent the adversary from trivially distinguishing ballots by requiring predicate \( balanced \) to hold.
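The predicate \( balanced \) can be transcribed directly. A minimal Python sketch, where `bb` is the bulletin board and `B` is the set of (ballot, \(v_0\), \(v_1\)) triples recorded by the left-right oracle (the function and variable names are our own):

```python
def balanced(bb, nc, B):
    """balanced(bb, nc, B) holds iff, for every vote v in 1..nc, the number
    of board ballots whose "left" input is v equals the number whose
    "right" input is v."""
    board = set(bb)
    for v in range(1, nc + 1):
        left = {b for (b, v0, v1) in B if b in board and v0 == v}
        right = {b for (b, v0, v1) in B if b in board and v1 == v}
        if len(left) != len(right):
            return False
    return True
```

For example, a board containing one oracle ballot for \((1,2)\) and one for \((2,1)\) is balanced, whereas a board with only the \((1,2)\) ballot is not, so the adversary cannot win merely by counting votes in the outcome.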

Definition 6

(\(\mathsf {Ballot}\text {-}\mathsf {Secrecy}\text {-}\mathsf {Ext}\) [32]). Let \(\varGamma = (\mathsf {Setup}, \mathsf {Vote}, \mathsf {Tally}, \mathsf {Verify})\) be an election scheme with external authentication, \(\mathcal {A}\) be an adversary, \(\kappa \) be a security parameter, and \(\mathsf {Ballot}\text {-}\mathsf {Secrecy}\text {-}\mathsf {Ext}(\varGamma ,\mathcal {A},\kappa )\) be the following game.

[Figure: game \(\mathsf {Ballot}\text {-}\mathsf {Secrecy}\text {-}\mathsf {Ext}(\varGamma ,\mathcal {A},\kappa )\) (omitted).]

Oracle \(\mathcal O_{}\) is defined as follows (Footnote 11):

  • \(\mathcal O_{}(v_0,v_1)\) computes if \(v_0,v_1\in \{1, \ldots , nc \}\) then .

We say \(\varGamma \) satisfies \(\mathsf {Ballot}\text {-}\mathsf {Secrecy}\text {-}\mathsf {Ext}\), if for all probabilistic polynomial-time adversaries \(\mathcal {A}\), there exists a negligible function \(\mathsf {negl}\), such that for all security parameters \(\kappa \), we have \(\mathsf {Succ}(\mathsf {Ballot}\text {-}\mathsf {Secrecy}\text {-}\mathsf {Ext}(\varGamma , \mathcal {A}, \kappa )) \le \frac{1}{2}+\mathsf {negl}(\kappa )\).

Definition 7

(\(\mathsf {Ballot}\text {-}\mathsf {Secrecy}\text {-}\mathsf {Int}\)). Let \(\varGamma = (\mathsf {Setup}, \mathsf {Register}, \mathsf {Vote}, \mathsf {Tally}, \mathsf {Verify})\) be an election scheme with internal authentication, \(\mathcal {A}\) be an adversary, \(\kappa \) be a security parameter, and \(\mathsf {Ballot}\text {-}\mathsf {Secrecy}\text {-}\mathsf {Int}(\varGamma ,\mathcal {A},\kappa )\) be the following game.

[Figure: game \(\mathsf {Ballot}\text {-}\mathsf {Secrecy}\text {-}\mathsf {Int}(\varGamma ,\mathcal {A},\kappa )\) (omitted).]

Oracle \(\mathcal O_{}\) is defined as follows:

  • \(\mathcal O_{}(i,v_0,v_1)\) computes if \(v_0,v_1\in \{1, \ldots , nc \} \wedge i\not \in R\) then ; and

  • \(\mathcal O_{}(i)\) computes if \(i\not \in R\) then .

We say \(\varGamma \) satisfies \(\mathsf {Ballot}\text {-}\mathsf {Secrecy}\text {-}\mathsf {Int}\), if for all probabilistic polynomial-time adversaries \(\mathcal {A}\), there exists a negligible function \(\mathsf {negl}\), such that for all security parameters \(\kappa \), we have \(\mathsf {Succ}(\mathsf {Ballot}\text {-}\mathsf {Secrecy}\text {-}\mathsf {Int}(\varGamma ,\mathcal {A}, \kappa )) \le \frac{1}{2}+ \mathsf {negl}(\kappa )\).

Game \(\mathsf {Ballot}\text {-}\mathsf {Secrecy}\text {-}\mathsf {Int}\) extends \(\mathsf {Ballot}\text {-}\mathsf {Secrecy}\text {-}\mathsf {Ext}\) to take credentials into account. In particular, the challenger constructs \( nv \) credentials, where \( nv \) is chosen by the adversary. These credentials are used to construct ballots and for tallying. Public and private credentials are available to the adversary. However, the oracle reveals a private credential only if that credential has not been used to construct a ballot. Moreover, the oracle may use a private credential to construct a ballot only if it has neither revealed that credential nor constructed a previous ballot with it.
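The credential bookkeeping described above amounts to a small state machine. The following sketch is our own illustration (the class name, callback, and dictionary layout are not the paper's formalism): a single set tracks spent indices, so each credential is either revealed or used to vote, at most once.

```python
class CredentialOracle:
    """Bookkeeping sketch for Ballot-Secrecy-Int's oracle: a credential
    index may be spent once, either by voting or by being revealed."""

    def __init__(self, nv):
        self.nv = nv
        self.R = set()   # indices already revealed or used to vote

    def vote(self, i, v0, v1, make_ballot):
        # O(i, v0, v1): construct a ballot with credential i, if unspent.
        # make_ballot stands in for the challenger's ballot construction.
        if i in self.R or not (1 <= i <= self.nv):
            return None
        self.R.add(i)
        return make_ballot(i, v0, v1)

    def reveal(self, i, private_creds):
        # O(i): reveal private credential d_i, if unspent.
        if i in self.R or not (1 <= i <= self.nv):
            return None
        self.R.add(i)
        return private_creds[i]
```

Once a credential has produced an oracle ballot, revealing it would let the adversary re-sign the ballot and trivially link it, which is exactly what the single spent-set prevents.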

Proof of Theorem 2

Suppose \(\mathsf {Ballot}\text {-}\mathsf {Secrecy}\text {-}\mathsf {Int}\) is not satisfied by the constructed scheme, i.e., there exists an adversary \(\mathcal {A}\) such that for all negligible functions \(\mathsf {negl}\) there exists a security parameter \(\kappa \) for which \(\mathcal {A}\)'s success in the game exceeds \(\frac{1}{2} + \mathsf {negl}(\kappa )\). We construct an adversary \(\mathcal {B}\) against \(\varGamma \) from \(\mathcal {A}\).

Let \(\varGamma = (\mathsf {Setup}_\varGamma , \mathsf {Vote}_\varGamma , \mathsf {Tally}_\varGamma , \mathsf {Verify}_\varGamma )\), \(\varOmega = (\mathsf {Gen}_\varOmega , \mathsf {Sign}_\varOmega , \mathsf {Verify}_\varOmega )\), and \(\mathsf {FS}(\varSigma ,\mathcal H) = (\mathsf {Prove}_\varSigma , \mathsf {Verify}_\varSigma )\). By [7, Theorem 1], non-interactive proof system \((\mathsf {Prove}_\varSigma , \mathsf {Verify}_\varSigma )\) satisfies zero-knowledge, i.e., there exists a simulator for \((\mathsf {Prove}_\varSigma , \mathsf {Verify}_\varSigma )\). Let \(\mathcal S\) be such a simulator. We define \(\mathcal {B}\) as follows:

  • \(\mathcal {B}( pk ,\kappa )\) computes \( nv \leftarrow \mathcal {A}( pk ,\kappa )\); for \(1 \le i \le nv \) computes \(( pd _i, d _i) \leftarrow \mathsf {Gen}_\varOmega (\kappa )\); computes \( nc \leftarrow \mathcal {A}( pd _1,\dots , pd _{ nv })\) and outputs \( nc \).

  • \(\mathcal {B}()\) computes \(R\leftarrow \emptyset ; \mathfrak {bb}\leftarrow \mathcal {A}^{\mathcal O_{}}(); \mathfrak {bb}\leftarrow \mathsf {auth}(\mathfrak {bb},\{ pd _1,\dots , pd _{ nv }\})\) and outputs \(\mathfrak {bb}\), handling oracle calls from \(\mathcal {A}\) as follows. Given an oracle call \(\mathcal O_{}(i,v_0,v_1)\) such that \(v_0,v_1\in \{1, \ldots , nc \} \wedge i\not \in R\), adversary \(\mathcal {B}\) computes \(b\leftarrow \mathcal O_{}(v_0,v_1); \sigma \leftarrow \mathsf {Sign}_\varOmega ( d _i, b); \tau \leftarrow \mathcal S(( pk ,b,\sigma , nc ,\kappa ),\kappa ); R\leftarrow R \cup \{i\}\) and returns \(( pd _i,b,\sigma ,\tau )\) to \(\mathcal {A}\). Moreover, given an oracle call \(\mathcal O_{}(i)\) such that \(i\not \in R\), adversary \(\mathcal {B}\) computes \(R\leftarrow R \cup \{i\}\) and returns \( d _i\) to \(\mathcal {A}\).

  • \(\mathcal {B}(\mathbf{v}, pf )\) computes \(g\leftarrow \mathcal {A}(\mathbf{v}, pf )\) and outputs g.

We prove that \(\mathcal {B}\) wins \(\mathsf {Ballot}\text {-}\mathsf {Secrecy}\text {-}\mathsf {Ext}\) against \(\varGamma \).

Suppose \(( pk , sk , mb , mc )\) is an output of \(\mathsf {Setup}_\varGamma (\kappa )\) and \( nc \) is an output of \(\mathcal {B}( pk ,\kappa )\). It is trivial to see that \(\mathcal {B}( pk ,\kappa )\) simulates \(\mathcal {A}\)’s challenger to \(\mathcal {A}\). Let \(\beta \) be a bit. Suppose \(\mathfrak {bb}\) is an output of \(\mathcal {B}()\). Since \(\mathcal S\) is a simulator for \((\mathsf {Prove}_\varSigma , \mathsf {Verify}_\varSigma )\), we have \(\mathcal {B}()\) simulates \(\mathcal {A}\)’s challenger to \(\mathcal {A}\). In particular, \(\mathcal {B}()\) simulates oracle calls \(\mathcal O_{}(i,v_0,v_1)\). Indeed, adversary \(\mathcal {B}\) computes \( b\leftarrow \mathcal O_{}(v_0,v_1);\sigma \leftarrow \mathsf {Sign}_\varOmega ( d _i, b); \tau \leftarrow \mathcal S(( pk ,b,\sigma , nc ,\kappa ),\kappa ) \), which, by definition of \(\mathcal {B}\)’s oracle, is equivalent to \( b\leftarrow \mathsf {Vote}_\varGamma ( pk , nc ,v_\beta , \kappa ); \sigma \leftarrow \mathsf {Sign}_\varOmega ( d _i, b); \tau \leftarrow \mathcal S(( pk ,b,\sigma , nc ,\kappa ),\kappa ) \). And \(\mathcal {A}\)’s oracle computes \(b\leftarrow \mathsf {Vote}( d _i, pk , nc ,v_\beta , \kappa )\), i.e., \( b \leftarrow \mathsf {Vote}_\varGamma ( pk , nc , v_\beta , \kappa ;r); \sigma \leftarrow \mathsf {Sign}_\varOmega ( d _i, b; r'); \tau \leftarrow \mathsf {Prove}_\varSigma (( pk , b, \sigma , nc , \kappa ),(v_\beta ,r, d _i,r'),\kappa ) \), where r and \(r'\) are coins chosen uniformly at random. Hence, computations of b, \(\sigma \) and \(\tau \) by \(\mathcal {B}\) and \(\mathcal {A}\)’s oracle are equivalent, with overwhelming probability. Suppose \((\mathbf{v}, pf )\) is an output of \(\mathsf {Tally}_\varGamma ( sk ,\mathfrak {bb},{} nc ,{}\kappa )\) and g is an output of \(\mathcal {B}(\mathbf{v}, pf )\). 
We have \(\mathcal {B}(\mathbf{v}, pf )\) simulates \(\mathcal {A}\)'s challenger to \(\mathcal {A}\), because outputs of \(\mathsf {Tally}_\varGamma ( sk ',\mathsf {auth}(\mathfrak {bb}',L),{} nc ',{}\kappa ')\) and \(\mathsf {Tally}( sk ', nc ',\mathfrak {bb}',L,\kappa ')\) are indistinguishable for all \( sk '\), \(\mathfrak {bb}'\), L, \( nc '\), and \(\kappa '\). Indeed, \(\mathsf {Tally}\) computes \((\mathbf{v}', pf ') \leftarrow \mathsf {Tally}_\varGamma ( sk ', \mathsf {auth}(\mathfrak {bb}',L),{} nc ',{}\kappa ')\) and outputs \((\mathbf{v}', pf ')\). Since adversary \(\mathcal {B}\) simulates \(\mathcal {A}\)'s challenger with overwhelming probability, it follows that \(\mathcal {B}\) determines \(\beta \) correctly with the same success as \(\mathcal {A}\), with overwhelming probability. Hence, \(\mathcal {B}\) wins \(\mathsf {Ballot}\text {-}\mathsf {Secrecy}\text {-}\mathsf {Ext}(\varGamma ,\mathcal {B}, \kappa )\), with overwhelming probability, deriving a contradiction and concluding our proof.     \(\square \)

B Election Verifiability: Definitions and Proofs

1.1 B.1 Individual Verifiability

Definition 8

(\(\mathsf {Exp}\text {-}\mathsf {IV}\text {-}\mathsf {Ext}\) [36]). Let \(\varGamma = (\mathsf {Setup}, \mathsf {Vote}, \mathsf {Tally}, \mathsf {Verify})\) be an election scheme with external authentication, \(\mathcal {A}\) be an adversary, \(\kappa \) be a security parameter, and \(\mathsf {Exp}\text {-}\mathsf {IV}\text {-}\mathsf {Ext}(\varGamma , \mathcal {A}, \kappa )\) be the following game.

[Figure: game \(\mathsf {Exp}\text {-}\mathsf {IV}\text {-}\mathsf {Ext}(\varGamma , \mathcal {A}, \kappa )\) (omitted).]

We say \(\varGamma \) satisfies \(\mathsf {Exp}\text {-}\mathsf {IV}\text {-}\mathsf {Ext}\), if for all probabilistic polynomial-time adversaries \(\mathcal {A}\), there exists a negligible function \(\mathsf {negl}\), such that for all security parameters \(\kappa \), we have \(\mathsf {Succ}(\mathsf {Exp}\text {-}\mathsf {IV}\text {-}\mathsf {Ext}(\varGamma , \mathcal {A}, \kappa ))\le \mathsf {negl}(\kappa )\).

Definition 9

(\(\mathsf {Exp}\text {-}\mathsf {IV}\text {-}\mathsf {Int}\) [36]). Let \(\varGamma = (\mathsf {Setup}, \mathsf {Register}, \mathsf {Vote}, \mathsf {Tally}, \mathsf {Verify})\) be an election scheme with internal authentication, \(\mathcal {A}\) be an adversary, \(\kappa \) be a security parameter, and \(\mathsf {Exp}\text {-}\mathsf {IV}\text {-}\mathsf {Int}(\varGamma , \mathcal {A}, \kappa )\) be the following game.

[Figure: game \(\mathsf {Exp}\text {-}\mathsf {IV}\text {-}\mathsf {Int}(\varGamma , \mathcal {A}, \kappa )\) (omitted).]

Oracle \(C\) is defined such that \(C(i)\) computes \( Crpt \leftarrow Crpt \cup \{d_i\}\) and outputs \(d_i\), where \(1\le i \le nv \).

We say \(\varGamma \) satisfies \(\mathsf {Exp}\text {-}\mathsf {IV}\text {-}\mathsf {Int}\), if for all probabilistic polynomial-time adversaries \(\mathcal {A}\), there exists a negligible function \(\mathsf {negl}\), such that for all security parameters \(\kappa \), we have \(\mathsf {Succ}(\mathsf {Exp}\text {-}\mathsf {IV}\text {-}\mathsf {Int}(\varGamma , \mathcal {A}, \kappa ))\le \mathsf {negl}(\kappa )\).

Lemma 10

Let \(\varGamma = (\mathsf {Setup}, \mathsf {Vote}, \mathsf {Tally}, \mathsf {Verify})\) be an election scheme with external authentication, \(\varOmega = (\mathsf {Gen}, \mathsf {Sign}, \mathsf {Verify})\) be a digital signature scheme, \(\varSigma \) be a sigma protocol for relation \(R(\varGamma ,\varOmega )\), and \(\mathcal H\) be a hash function. Suppose \(\varOmega \) satisfies strong unforgeability. Then the election scheme with internal authentication derived from our construction satisfies \(\mathsf {Exp}\text {-}\mathsf {IV}\text {-}\mathsf {Int}\).

Proof

Suppose the constructed scheme does not satisfy \(\mathsf {Exp}\text {-}\mathsf {IV}\text {-}\mathsf {Int}\). Hence, there exists a PPT adversary \(\mathcal {A}\), such that for all negligible functions \(\mathsf {negl}\), there exists a security parameter \(\kappa \) for which \(\mathcal {A}\) wins the game with probability greater than \(\mathsf {negl}(\kappa )\). We construct the following adversary \(\mathcal {B}\) against strong unforgeability from \(\mathcal {A}\):

[Figure: definition of adversary \(\mathcal {B}\) against strong unforgeability (omitted).]

where \(C(i)\) outputs \( d _i\) if \(i\not =i^*\) and aborts otherwise. We prove that \(\mathcal {B}\) wins strong unforgeability against \(\varOmega \).

Since adversary \(\mathcal {B}\) chooses \(i^*\) uniformly at random and independently of adversary \(\mathcal {A}\), and since a winning adversary \(\mathcal {A}\) leaves at least two distinct credentials uncorrupted, adversary \(\mathcal {B}\) aborts with probability upper-bounded by \(\frac{ nv -2}{ nv }\). Let us consider the probability that \(\mathcal {B}\) wins, when there is no abort. Suppose \(( pd , d )\) is an output of \(\mathsf {Gen}(\kappa )\), \(( pk , nv )\) is an output of \(\mathcal {A}(\kappa )\), and \(i^*\) is chosen uniformly at random from \(\{1,\dots , nv \}\). Further suppose \(( pd _i, d _i)\) is an output of \(\mathsf {Gen}(\kappa )\) for each \(i \in \{1,\dots , nv \}\setminus \{i^*\}\). It is straightforward to see that \(\mathcal {B}\) simulates the challenger and oracle in \(\mathsf {Exp}\text {-}\mathsf {IV}\text {-}\mathsf {Int}\) to \(\mathcal {A}\). Suppose \(( nc , v, v', j, k)\) is an output of \(\mathcal {A}^{C}(\{ pd _1,\dots , pd _{i^*-1}, pd , pd _{i^*+1},\dots , pd _{ nv }\})\). Since \(\mathcal {A}\) is a winning adversary, outputs of \(\mathsf {Vote}( d _j, pk , nc ,v,\kappa )\) and \(\mathsf {Vote}( d _k, pk , nc ,v', \kappa )\) collide with non-negligible probability. Hence, if \(i^*=k\), then \(\mathsf {Vote}( d _j, pk , nc ,v, \kappa )\) outputs \(( pd _j, b, \sigma , \tau )\) such that \(\sigma \) is a signature on b with respect to private key \( d _{i^*}\); otherwise \((i^*=j)\), \(\mathsf {Vote}( d _k, pk , nc ,v',\kappa )\) outputs \(( pd _k,b,\sigma ,\tau )\) such that \(\sigma \) is a signature on b with respect to private key \( d _{i^*}\). Thus, \(\mathsf {Succ}(\mathsf {Exp}\text {-}\mathsf {StrongSign}(\varOmega , \mathcal {B},\kappa ))\) is non-negligible.   \(\square \)

1.2 B.2 Universal Verifiability

External authentication. Algorithm \(\mathsf {Verify}\) is required to accept iff the election outcome is correct. The notion of a correct outcome is captured using function \( correct\text {-}outcome \), which is defined such that for all \( pk \), \( nc \), \(\mathfrak {bb}\), \(\kappa \), \(\ell \), and \(v \in \{1, \ldots , nc \}\), we have \( correct\text {-}outcome ( pk , nc ,\mathfrak {bb},\kappa )[v] = \ell \) iff \(\exists ^{=\ell } b\in \mathfrak {bb}\setminus \{\bot \} : \exists r : b=\mathsf {Vote}( pk , nc ,v,\kappa ; r) \) (Footnote 12), and the produced vector is of length \( nc \). Hence, component v of vector \( correct\text {-}outcome ( pk , nc ,\mathfrak {bb},\kappa )\) equals \(\ell \) iff there exist \(\ell \) ballots on the bulletin board that are votes for candidate v. The function requires ballots to be interpreted for only one candidate, which can be ensured by injectivity.
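As an illustration only, the counting-quantifier definition of \( correct\text {-}outcome \) corresponds to the following sketch. Here `opens_to(b, v)` is an assumed stand-in for the existential check \(\exists r : b=\mathsf {Vote}( pk , nc ,v,\kappa ; r)\) (the public key and security parameter are suppressed), and injectivity guarantees each ballot opens to at most one candidate:

```python
def correct_outcome(bb, nc, opens_to):
    """Sketch of correct-outcome: component v counts the ballots on the
    board that are votes for candidate v; bot entries are modelled as
    None and skipped."""
    outcome = [0] * nc                  # vector of length nc
    for b in bb:
        if b is None:                   # b = bot
            continue
        for v in range(1, nc + 1):
            if opens_to(b, v):
                outcome[v - 1] += 1
                break                   # injectivity: one candidate only
    return outcome
```

In the formal definition, `opens_to` is not efficiently computable in general; the counting quantifier \(\exists ^{=\ell }\) merely asserts how many ballots satisfy it.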

The "if" direction of universal verifiability is captured by Completeness, which stipulates that election outcomes produced by algorithm \(\mathsf {Tally}\) will actually be accepted by algorithm \(\mathsf {Verify}\). The "only if" direction is captured by Soundness, which challenges an adversary to concoct a scenario in which algorithm \(\mathsf {Verify}\) accepts, but the election outcome is not correct.

Definition 10

([36]). An election scheme with external authentication \((\mathsf {Setup}, \mathsf {Vote}, \mathsf {Tally}, \mathsf {Verify})\) satisfies Soundness, if the scheme satisfies Injectivity [36] and for all probabilistic polynomial-time adversaries \(\mathcal {A}\), there exists a negligible function \(\mathsf {negl}\), such that for all security parameters \(\kappa \), we have \(\Pr [( pk , nc , \mathfrak {bb}, \mathbf{v}, pf ) \leftarrow \mathcal {A}(\kappa ) : \mathsf {Verify}( pk , \mathfrak {bb}, nc , \mathbf{v}, pf , \kappa ) = 1 \wedge \mathbf{v}\ne correct\text {-}outcome ( pk , nc , \mathfrak {bb}, \kappa )] \le \mathsf {negl}(\kappa )\).

An election scheme with external authentication satisfies \(\mathsf {Exp}\text {-}\mathsf {UV}\text {-}\mathsf {Ext}\), if Injectivity, Completeness and Soundness are satisfied, where formal definitions of Injectivity and Completeness appear in [36].

Internal authentication. Function \( correct\text {-}outcome \) is now modified to tally only authorised ballots: let function \( correct\text {-}outcome \) now be defined such that for all \( pk \), \( nc \), \(\mathfrak {bb}\), \(M\), \(\kappa \), \(\ell \), and \(v \in \{1,\ldots , nc \}\), we have \( correct\text {-}outcome ( pk , nc , \mathfrak {bb}, M, \kappa )[v] = \ell \) iff \(\exists ^{=\ell } b\in authorized ( pk , nc ,(\mathfrak {bb}\setminus \{\bot \}),M, \kappa ) \mathrel : \exists d , r : b=\mathsf {Vote}( d , pk , nc ,v,\kappa ; r)\). A ballot is authorised if it is constructed with a private credential from \(M\), and that private credential was not used to construct any other ballot on \(\mathfrak {bb}\). Let \( authorized \) be defined as follows: \( authorized ( pk , nc , \mathfrak {bb}, M, \kappa ) = \{b \mathrel {:}\; b \in \mathfrak {bb}\mathrel \wedge \exists pd , d , v, r \mathrel : b=\mathsf {Vote}( d , pk , nc ,v,\kappa ; r) \mathrel \wedge ( pd , d )\in M\mathrel \wedge \lnot \exists b',v',r' : b' \in (\mathfrak {bb}\setminus \{b\}) \mathrel \wedge b'=\mathsf {Vote}( d , pk , nc ,v',\kappa ; r')\} \).

Definition 11

([36]). An election scheme with internal authentication \((\mathsf {Setup}, \mathsf {Register}, \mathsf {Vote}, \mathsf {Tally}, \mathsf {Verify})\) satisfies Soundness, if the scheme satisfies Injectivity [36] and for all probabilistic polynomial-time adversaries \(\mathcal {A}\), there exists a negligible function \(\mathsf {negl}\), such that for all security parameters \(\kappa \), we have \(\Pr [( pk , nv ) \leftarrow \mathcal {A}(\kappa );\) for \(1 \le i \le nv \) do \(( pd _i, d _i) \leftarrow \mathsf {Register}( pk ,\kappa );\) \((\mathfrak {bb}, nc , \mathbf{v}, pf ) \leftarrow \mathcal {A}(( pd _1, d _1), \dots , ( pd _{ nv }, d _{ nv })) : \mathsf {Verify}( pk , nc , \mathfrak {bb}, \{ pd _1, \dots , pd _{ nv }\}, \mathbf{v}, pf , \kappa ) = 1 \wedge \mathbf{v}\ne correct\text {-}outcome ( pk , nc , \mathfrak {bb}, \{( pd _1, d _1), \dots , ( pd _{ nv }, d _{ nv })\}, \kappa )] \le \mathsf {negl}(\kappa )\).

An election scheme with internal authentication satisfies \(\mathsf {Exp}\text {-}\mathsf {UV}\text {-}\mathsf {Int}\), if Injectivity, Completeness and Soundness are satisfied.

Lemma 11

Let \(\varGamma = (\mathsf {Setup}_\varGamma , \mathsf {Vote}_\varGamma , \mathsf {Tally}_\varGamma , \mathsf {Verify}_\varGamma )\) be an election scheme with external authentication, \(\varOmega = (\mathsf {Gen}_\varOmega , \mathsf {Sign}_\varOmega , \mathsf {Verify}_\varOmega )\) be a perfectly correct digital signature scheme, \(\varSigma \) be a sigma protocol for relation \(R(\varGamma ,\varOmega )\), and \(\mathcal H\) be a random oracle. Moreover, let \(\mathsf {FS}(\varSigma ,\mathcal H) = (\mathsf {Prove}_\varSigma , \mathsf {Verify}_\varSigma )\). Suppose \(\varGamma \) satisfies \(\mathsf {Exp}\text {-}\mathsf {UV}\text {-}\mathsf {Ext}\), \(\varOmega \) satisfies strong unforgeability, and \(\varSigma \) satisfies perfect special soundness and special honest verifier zero-knowledge. Then the election scheme with internal authentication derived from our construction satisfies \(\mathsf {Exp}\text {-}\mathsf {UV}\text {-}\mathsf {Int}\).

Proof

We prove that the derived scheme satisfies Injectivity, Completeness and Soundness. The proofs for Injectivity and Completeness are straightforward and can be found in our technical report [28].

Soundness. We prove by contradiction that the derived scheme satisfies Soundness. Suppose it does not, i.e., there exists an adversary \(\mathcal {A}\) such that for all negligible functions \(\mathsf {negl}\) there exists a security parameter \(\kappa \) for which the probability defined in Definition 11 is greater than \(\mathsf {negl}(\kappa )\). We use \(\mathcal {A}\) to construct an adversary \(\mathcal {B}\) that wins the Soundness game against \(\varGamma \).

(Figure h: the adversary \(\mathcal {B}\) constructed from \(\mathcal {A}\).)

We prove that \(\mathcal {B}\) wins the Soundness game against \(\varGamma \).

Suppose \(( pk , nv )\) is an output of \(\mathcal {A}(\kappa )\) and \(( pd _1, d _1),\dots ,( pd _{ nv }, d _{ nv })\) are outputs of \(\mathsf {Register}( pk ,\kappa )\). Let \(L= \{ pd _1,\dots , pd _{ nv }\}\) and \(M= \{( pd _1, d _1), \dots , ( pd _{ nv }, d _{ nv })\}\). Suppose \((\mathfrak {bb}, nc , \mathbf{v}, pf )\) is an output of \(\mathcal {A}(M)\). Further suppose \(( pk , nc , \mathsf {auth}(\mathfrak {bb},L), \mathbf{v}, pf )\) is the output of \(\mathcal {B}(\kappa )\). Since \(\mathcal {A}\) is a winning adversary, we have \(\mathsf {Verify}( pk , nc ,\mathfrak {bb},L, \mathbf{v}, pf ,\kappa ) = 1\) with non-negligible probability. By inspection of algorithm \(\mathsf {Verify}\), \(\mathsf {Verify}( pk , nc ,\mathfrak {bb},L, \mathbf{v}, pf ,\kappa ) = 1\) implies \(\mathsf {Verify}_\varGamma ( pk , \mathsf {auth}(\mathfrak {bb},L), nc , \mathbf{v}, pf , \kappa ) = 1\). Hence, it remains to show \(\mathbf{v}\ne correct\text {-}outcome ( pk , nc , \mathsf {auth}(\mathfrak {bb},L), \kappa )\), with probability greater than \(\mathsf {negl}(\kappa )\).

By definition of function \( correct\text {-}outcome \), we have \(\mathbf{v}\) is a vector of length \( nc \) such that

$$\begin{aligned}&correct\text {-}outcome ( pk , nc ,\mathsf {auth}(\mathfrak {bb},L),\kappa )[v] = \ell \\&\quad \Leftrightarrow \exists ^{=\ell } b\in \mathsf {auth}(\mathfrak {bb},L) \setminus \{\bot \}: \exists r: b=\mathsf {Vote}( pk , nc ,v,\kappa ; r) \end{aligned}$$

Since \(\mathcal {A}\) is a winning adversary, it suffices to derive

$$\begin{aligned} \exists ^{=\ell } b\in \mathsf {auth}(\mathfrak {bb},L) \setminus \{\bot \}&: \exists r: b=\mathsf {Vote}( pk , nc ,v,\kappa ; r)\nonumber \\ \Leftrightarrow \exists ^{=\ell } b\in authorized&( pk , nc ,(\mathfrak {bb}\setminus \{\bot \}),M, \kappa )\nonumber \\&{\mathrel : \exists d , r : b=\mathsf {Vote}( d , pk , nc ,v,\kappa ; r)} \end{aligned}$$

(1)

Let set \(auth^* ( pk , nc ,\mathfrak {bb},M,\kappa )\!=\!\{b^*| ( pd , b^*, \sigma , \tau ) \in authorized ( pk , nc ,\mathfrak {bb},M,\kappa ) \}\). To prove (1), it suffices to show \(\mathsf {auth}(\mathfrak {bb},L)\setminus \{\perp \} = auth^* ( pk , nc ,\mathfrak {bb},M,\kappa )\setminus \{\perp \}\), since this would imply that \( correct\text {-}outcome \) is computed on sets of corresponding ballots in both the external and internal authentication setting.

  • \(auth^*( pk , nc ,\mathfrak {bb},M,\kappa )\setminus \{\perp \} \subseteq \mathsf {auth}(\mathfrak {bb},L)\setminus \{\perp \}\)

If \(b^* \in auth^*( pk , nc ,\mathfrak {bb},M,\kappa )\), then \(b^* \ne \perp \) and there exists \(b \in authorized ( pk , nc ,\mathfrak {bb},M,\kappa )\) such that (i) \(b \in \mathfrak {bb}\); (ii) \(\exists pd , d , v, \sigma , \tau , r,r',r'' \mathrel : b = ( pd ,b^*,\sigma ,\tau )\), \(b^* = \mathsf {Vote}_\varGamma ( pk , nc ,v,\kappa ;r)\), \(\sigma = \mathsf {Sign}_\varOmega ( d ,b^*;r')\), and \(\tau = \mathsf {Prove}_\varSigma (( pk , b^*, \sigma , nc , \kappa ), (v, r, d , r'), \kappa ; r'')\), which – by correctness of \(\varOmega \) and completeness of \(\varSigma \) – implies \(\mathsf {Verify}_\varOmega ( pd , b^*, \sigma ) =1\) and \(\mathsf {Verify}_\varSigma (( pk , b^*, nc , \kappa ), \tau , \kappa )=1\); (iii) \( ( pd , d )\in M\), which implies \( pd \in L\) by construction; and (iv) \(\lnot \exists b', v', r,r',r'' \mathrel : b' \in (\mathfrak {bb}\setminus \{b\}) \wedge b' = ( pd , b^{*'}, \sigma ', \tau ')\), \(b^{*'} = \mathsf {Vote}_\varGamma ( pk , nc ,v',\kappa ;r)\), \(\sigma ' = \mathsf {Sign}_\varOmega ( d ,b^{*'};r')\), and \(\tau ' = \mathsf {Prove}_\varSigma (( pk ,b^{*'}, \sigma ', nc ,\kappa ),(v',r, d ,r'),\kappa ; r'')\), which, by correctness of \(\varOmega \), implies \(\mathsf {Verify}_\varOmega ( pd , b^{*'}, \sigma ') =1\). It follows by (i)–(iv) that \(b^* \in auth^*( pk , nc ,\mathfrak {bb},M,\kappa )\) implies \(b^* \in \mathsf {auth}(\mathfrak {bb},L)\setminus \{\perp \}\).

  • \( \mathsf {auth}(\mathfrak {bb},L)\setminus \{\perp \} \subseteq auth^*( pk , nc ,\mathfrak {bb},M,\kappa )\setminus \{\perp \}\)

If \(b^* \in \mathsf {auth}(\mathfrak {bb},L)\setminus \{\perp \}\), then \(b^* \ne \perp \) and there exist \( pd \), \(\sigma \), and \(\tau \) such that (i) \(( pd , b^*, \sigma , \tau ) \in \mathfrak {bb}\); (ii) \(\mathsf {Verify}_\varOmega ( pd , b^*, \sigma ) =1\) and \(\mathsf {Verify}_\varSigma (( pk , b^*, nc , \kappa ), \tau , \kappa )=1\), which – by the security of \(\varOmega \) and \(\varSigma \) – implies \(\exists d , v, r,r',r'' \mathrel : \) \(b^* = \mathsf {Vote}_\varGamma ( pk , nc ,v,\kappa ;r)\), \(\sigma = \mathsf {Sign}_\varOmega ( d ,b^*;r')\), and \(\tau = \mathsf {Prove}_\varSigma (( pk ,b^*,\sigma , nc ,\kappa ), (v,r, d ,r'),\kappa ; r'')\). Indeed, suppose such values did not exist; then \((b^*, \sigma )\) and \((( pk , b^*, nc , \kappa ), \tau )\) could be used by adversaries to break the unforgeability property of \(\varOmega \) and the special soundness and special honest verifier zero-knowledge properties of \(\varSigma \), respectively. Furthermore, we have (iii) \( pd \in L\), which implies \(( pd , d )\in M\) by construction; and (iv) \(\lnot \exists b^{*'}, \sigma ', \tau ' \mathrel : ( pd , b^{*'}, \sigma ', \tau ') \in (\mathfrak {bb}\setminus \{( pd , b^*, \sigma , \tau )\}) \wedge \mathsf {Verify}_\varOmega ( pd , b^{*'}, \sigma ') =1\), by definition of \(\mathsf {auth}\), which implies \(\lnot \exists b', v', r,r',r'' \mathrel : b' \in (\mathfrak {bb}\setminus \{b\}) \wedge b' = ( pd , b^{*'}, \sigma ', \tau ')\), \(b^{*'} = \mathsf {Vote}_\varGamma ( pk , nc ,v',\kappa ;r)\), \(\sigma ' = \mathsf {Sign}_\varOmega ( d ,b^{*'};r')\), and \(\tau ' = \mathsf {Prove}_\varSigma (( pk , b^{*'}, \sigma ', nc , \kappa ), (v', r, d , r'), \kappa ; r'')\), as per the definition of \( authorized \), concluding our proof.    \(\square \)
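The set equality underlying this proof can be illustrated with a toy model (Python; an HMAC with a shared key stands in for the unforgeable signature scheme \(\varOmega \), and the verifier's key lookup stands in for public-key verification — the function and variable names are illustrative, not part of the construction):

```python
import hashlib
import hmac
from collections import Counter

# Toy stand-in for the signature scheme: sign ballot content with the
# private credential d; verification here looks d up from pd, standing
# in for public-key verification under an unforgeable scheme.

def sign(d, content):
    return hmac.new(d.encode(), content.encode(), hashlib.sha256).hexdigest()

def make_ballot(pd, d, content):
    return (pd, content, sign(d, content))

def auth(bb, L, keys):
    """External-authentication view: keep board entries carrying a valid
    signature under a credential in L that signs no other entry."""
    uses = Counter(pd for (pd, _c, _s) in bb)
    return {(pd, c, s) for (pd, c, s) in bb
            if pd in L and uses[pd] == 1
            and hmac.compare_digest(s, sign(keys[pd], c))}

def auth_star(bb, M):
    """Internal view: ballots cast with a registered credential used once."""
    registered = {pd for (pd, _d) in M}
    uses = Counter(pd for (pd, _c, _s) in bb)
    return {b for b in bb if b[0] in registered and uses[b[0]] == 1}

M = {("pd1", "k1"), ("pd2", "k2")}
keys = dict(M)
L = set(keys)
bb = [make_ballot("pd1", "k1", "vote-A"),
      make_ballot("pd2", "k2", "vote-B"),
      make_ballot("pd2", "k2", "vote-C")]   # credential reuse: both dropped

print(auth(bb, L, keys) == auth_star(bb, M))  # True
```

With only honestly signed ballots the two filters agree, which is exactly the content of the lemma: under unforgeability of \(\varOmega \) and soundness of \(\varSigma \), an adversary cannot place a ballot on the board that one filter accepts and the other rejects.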

B.3 Eligibility Verifiability

Definition 12

(Eligibility verifiability [36]). Let \(\varPi = (\mathsf {Setup}, \mathsf {Register}, \mathsf {Vote}, \mathsf {Tally}, \mathsf {Verify})\) be an election scheme with internal authentication, \(\mathcal {A}\) be an adversary, \(\kappa \) be a security parameter, and \(\mathsf {Exp}\text {-}\mathsf {EV}\text {-}\mathsf {Int}(\varPi , \mathcal {A}, \kappa )\) be the following game.

(Figure i: the game \(\mathsf {Exp}\text {-}\mathsf {EV}\text {-}\mathsf {Int}(\varPi , \mathcal {A}, \kappa )\).)

Oracle \(C\) is the same oracle as in \(\mathsf {Exp}\text {-}\mathsf {IV}\text {-}\mathsf {Int}\), and oracle \(R\) is defined such that \(R(i,v, nc )\) computes \(b \leftarrow \mathsf {Vote}( d _i, pk , nc ,v,\kappa ); Rvld \leftarrow Rvld \cup \{b\}\) and outputs \(b\).

We say \(\varPi \) satisfies \(\mathsf {Exp}\text {-}\mathsf {EV}\text {-}\mathsf {Int}\), if for all probabilistic polynomial-time adversaries \(\mathcal {A}\), there exists a negligible function \(\mathsf {negl}\), such that for all security parameters \(\kappa \), we have \(\mathsf {Succ}(\mathsf {Exp}\text {-}\mathsf {EV}\text {-}\mathsf {Int}(\varPi , \mathcal {A}, \kappa ))\le \mathsf {negl}(\kappa )\).

Lemma 12

Let \(\varGamma = (\mathsf {Setup}_\varGamma , \mathsf {Vote}_\varGamma , \mathsf {Tally}_\varGamma , \mathsf {Verify}_\varGamma )\) be an election scheme with external authentication, \(\varOmega = (\mathsf {Gen}_\varOmega , \mathsf {Sign}_\varOmega , \mathsf {Verify}_\varOmega )\) be a digital signature scheme, \(\varSigma \) be a sigma protocol for relation \(R(\varGamma ,\varOmega )\), and \(\mathcal H\) be a hash function. Suppose \(\varSigma \) satisfies special soundness and special honest verifier zero-knowledge, and \(\varOmega \) satisfies strong unforgeability. Then the election scheme with internal authentication derived from \(\varGamma \), \(\varOmega \), and \(\mathsf {FS}(\varSigma ,\mathcal H)\) by our construction satisfies \(\mathsf {Exp}\text {-}\mathsf {EV}\text {-}\mathsf {Int}\).

Proof

Suppose the derived scheme \(\varPi \) does not satisfy \(\mathsf {Exp}\text {-}\mathsf {EV}\text {-}\mathsf {Int}\), i.e., there exists an adversary \(\mathcal {A}\) such that for all negligible functions \(\mathsf {negl}\) there exists a security parameter \(\kappa \) with \(\mathsf {Succ}(\mathsf {Exp}\text {-}\mathsf {EV}\text {-}\mathsf {Int}(\varPi , \mathcal {A}, \kappa ))> \mathsf {negl}(\kappa )\). From \(\mathcal {A}\), we construct the following adversary \(\mathcal {B}\) against the strong unforgeability of \(\varOmega \).

(Figure j: the adversary \(\mathcal {B}\) against the strong unforgeability of \(\varOmega \).)

where oracle calls are handled as follows:

  • C(i) computes \( Crpt \leftarrow Crpt \cup \{ d _i \}\) and returns \( d _i\) if \(i\not =i^*\), and aborts otherwise.

\(R(i, v, nc )\) distinguishes two cases: If \(i=i^*\), then \(\mathcal {B}\) computes \(b \leftarrow \mathsf {Vote}_\varGamma ( pk , nc , v, \kappa ); \sigma \leftarrow \mathcal O(b);\tau \leftarrow \mathcal S(( pk ,b,\sigma , nc ,\kappa ),\kappa )\), computes \( Rvld \leftarrow Rvld \cup \{ ( pd , b, \sigma , \tau )\}\), and returns \(( pd , b, \sigma , \tau )\), where \(\mathcal O\) is \(\mathcal {B}\)’s signing oracle and \(\mathcal S\) is a simulator for \(\mathsf {FS}(\varSigma ,\mathcal H)\) that exists by [7, Theorem 1]. Otherwise, \(\mathcal {B}\) computes \(b \leftarrow \mathsf {Vote}( d _i, pk , nc , v, \kappa )\), \( Rvld \leftarrow Rvld \cup \{ b\}\) and returns \(b\).

We prove that \(\mathcal {B}\) wins the strong unforgeability game against \(\varOmega \).

Let \(\kappa \) be a security parameter. Suppose \(( pd , d )\) is an output of \(\mathsf {Gen}_\varOmega (\kappa )\) and \(( pk , nv )\) is an output of \(\mathcal {A}(\kappa )\). Let \(i^*\) be an integer chosen uniformly at random from \(\{1, \ldots , nv \}\). Suppose \(( pd _i, d _i)\) is an output of \(\mathsf {Register}( pk ,\kappa )\), for each \(i \in \{1, \ldots , nv \}\setminus \{ i^* \} \). Consider an execution of \(\mathcal {A}(\{ pd _{1}, \dots , pd _{i^*-1}, pd , pd _{i^*+1}, \dots , pd _{ nv }\})\), and let \(( nc , v, i, b)\) be the output of \(\mathcal {A}\). By definition of algorithm \(\mathsf {Register}\), \(\mathcal {B}\) correctly simulates \(\mathcal {A}\)’s challenger. Moreover, \(\mathcal {B}\) simulates oracle \(C\) to \(\mathcal {A}\), except when \(\mathcal {B}\) aborts, and \(\mathcal {B}\) simulates oracle \(R\) to \(\mathcal {A}\) as well. In particular, simulator \(\mathcal S\) produces proofs that are indistinguishable from proofs constructed by the non-interactive proof system \(\mathsf {FS}(\varSigma ,\mathcal H)\).

We denote by \(\textsf {Good}\) the event that \(i=i^*\). To determine \(\mathcal {B}\)’s success probability, we assess the probability that \(\mathcal {B}\) does not abort. Since \(\mathcal {A}\) is not allowed to corrupt the credential it finally outputs (as \(\mathcal {A}\) is a winning adversary, \( d _{i} \notin Crpt \) must hold), \(\mathcal {B}\) is never asked for the unknown private credential \( d _{i^*}\) whenever event \(\textsf {Good}\) occurs, i.e., whenever the uniformly random draw \(i^*\leftarrow \{1, \ldots , nv \}\) matches \(\mathcal {A}\)’s output.

This is the case with probability \(\Pr [\textsf {Good}] = \frac{1}{ nv }\), since the choice of \(i^*\) is completely independent of \(\mathcal {A}\)’s view. Therefore, we have \( \mathsf {Succ}(\mathsf {Exp}\text {-}\mathsf {EV}\text {-}\mathsf {Int}(\varPi , \mathcal {A}, \kappa ))\le nv \cdot \mathsf {Succ}(\mathsf {Exp}\text {-}\mathsf {StrongSign}(\varOmega ,\mathcal {B},\kappa )) \).     \(\square \)
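Spelling out the concluding bound: as argued above, \(\mathcal {B}\) succeeds whenever \(\mathcal {A}\) succeeds and \(\textsf {Good}\) occurs, and \(\textsf {Good}\) is independent of \(\mathcal {A}\)’s view, so

$$\mathsf {Succ}(\mathsf {Exp}\text {-}\mathsf {StrongSign}(\varOmega ,\mathcal {B},\kappa )) \;\ge \; \Pr [\textsf {Good}] \cdot \mathsf {Succ}(\mathsf {Exp}\text {-}\mathsf {EV}\text {-}\mathsf {Int}(\varPi , \mathcal {A}, \kappa )) \;=\; \frac{1}{ nv } \cdot \mathsf {Succ}(\mathsf {Exp}\text {-}\mathsf {EV}\text {-}\mathsf {Int}(\varPi , \mathcal {A}, \kappa )),$$

and rearranging gives the stated inequality. Since \( nv \) is polynomially bounded in \(\kappa \), a non-negligible success probability for \(\mathcal {A}\) would yield a non-negligible success probability for \(\mathcal {B}\), contradicting the strong unforgeability of \(\varOmega \).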


Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature


Cite this paper

Quaglia, E.A., Smyth, B. (2018). Authentication with Weaker Trust Assumptions for Voting Systems. In: Joux, A., Nitaj, A., Rachidi, T. (eds) Progress in Cryptology – AFRICACRYPT 2018. AFRICACRYPT 2018. Lecture Notes in Computer Science(), vol 10831. Springer, Cham. https://doi.org/10.1007/978-3-319-89339-6_18


  • Print ISBN: 978-3-319-89338-9

  • Online ISBN: 978-3-319-89339-6
