A Brief History of Cryptography and the Rise of Privacy Computing
This article surveys the evolution of cryptography from its earliest known uses in ancient Mesopotamia through classical ciphers, the Enigma machine, modern public‑key systems, and multi‑party computation, then explains the concept, current challenges, and future directions of privacy‑preserving computation technologies.
Introduction – A Brief History of Cryptography
Privacy‑preserving computation cannot be discussed without cryptography.
The need for secrecy dates back to around 1500 BC in Mesopotamia; for millennia afterward, encryption relied on manual methods or simple mechanical tools.
These early techniques are collectively referred to as classical cryptography.
Classical Cryptography
“If he had anything confidential to say, he wrote it in cipher, that is, by so changing the order of the letters of the alphabet that not a word could be made out. If anyone wishes to decipher these, he must substitute the fourth letter of the alphabet, namely D, for A, and so with the others.” – Suetonius, *The Twelve Caesars*
The Caesar cipher is a simple substitution cipher with a fixed shift (e.g., A→D, B→E, shift = 3). The Vigenère cipher layered multiple shifted alphabets under a repeating keyword (a polyalphabetic substitution), which greatly reduced vulnerability to simple frequency analysis.
Example: plaintext ATTACKATDAWN with repeating key LEMON yields ciphertext LXFOPVEFRNHR using the Vigenère table.
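The two ciphers above can be sketched in a few lines of Python. This is purely illustrative: classical ciphers offer no real security today, and the function names are my own.

```python
# Toy implementations of the Caesar and Vigenere ciphers (uppercase A-Z only).

def caesar_encrypt(plaintext: str, shift: int = 3) -> str:
    """Shift every letter by a fixed amount (A->D, B->E for shift=3)."""
    return "".join(chr((ord(c) - ord("A") + shift) % 26 + ord("A")) for c in plaintext)

def vigenere_encrypt(plaintext: str, key: str) -> str:
    """Shift each letter by the corresponding letter of the repeating key."""
    return "".join(
        chr((ord(c) - ord("A") + ord(key[i % len(key)]) - ord("A")) % 26 + ord("A"))
        for i, c in enumerate(plaintext)
    )

print(caesar_encrypt("ATTACK"))                    # DWWDFN
print(vigenere_encrypt("ATTACKATDAWN", "LEMON"))   # LXFOPVEFRNHR
```

Running the second call reproduces the worked example from the text, with each plaintext letter shifted by the key letter above it in the Vigenère table.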
Modern Cryptography
In the 20th century, complex mechanical and electromechanical devices emerged, most famously the Enigma machine, whose core principle was still a multi‑rotor substitution but automated.
Polish cryptanalysts, led by Marian Rejewski, were the first to apply rigorous mathematical methods to breaking Enigma, paving the way for Alan Turing’s later work at Bletchley Park.
Contemporary Cryptography
Claude Shannon’s 1949 paper “Communication Theory of Secrecy Systems” launched the modern era. He proved that the one‑time pad is theoretically unbreakable and characterized what perfect secrecy requires of the key: truly random, at least as long as the message, and never reused.
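Shannon’s result is easy to demonstrate: XOR the message with a uniformly random key of equal length, and every plaintext of that length is equally consistent with the ciphertext. A minimal sketch (function name is my own):

```python
import secrets

def otp_xor(data: bytes, key: bytes) -> bytes:
    """One-time pad: XOR each byte with a key byte. Perfect secrecy holds only
    if the key is uniformly random, as long as the message, and used once."""
    assert len(key) == len(data)
    return bytes(d ^ k for d, k in zip(data, key))

msg = b"attack at dawn"
key = secrets.token_bytes(len(msg))   # fresh random key, same length as msg
ct = otp_xor(msg, key)                # encrypt
assert otp_xor(ct, key) == msg        # decryption is the same XOR
```

Note that encryption and decryption are the same operation, since XOR is its own inverse.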
In 1976, Diffie and Hellman introduced public‑key cryptography in “New Directions in Cryptography”, work for which they received the 2015 Turing Award.
Public‑key (asymmetric) cryptography solves the key‑distribution problem inherent in symmetric schemes, though it is computationally heavier.
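The Diffie–Hellman key exchange shows how the key‑distribution problem is solved: two parties derive a shared secret over a public channel without ever transmitting it. Below is a toy sketch; the Mersenne prime and generator are chosen only for illustration, and real deployments use standardized 2048‑bit groups or elliptic curves.

```python
import secrets

# Toy Diffie-Hellman key agreement. Parameters are illustrative, not secure.
p = 2**127 - 1   # a Mersenne prime, standing in for a proper DH group modulus
g = 5            # generator (illustrative choice)

a = secrets.randbelow(p - 2) + 1   # Alice's private exponent, never sent
b = secrets.randbelow(p - 2) + 1   # Bob's private exponent, never sent

A = pow(g, a, p)   # Alice sends A over the public channel
B = pow(g, b, p)   # Bob sends B over the public channel

# Each side combines its own secret with the other's public value.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob   # same key derived; only A and B were public
```

An eavesdropper who sees g, p, A, and B must solve a discrete-logarithm problem to recover the shared key, which is believed infeasible at realistic parameter sizes.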
Secure Multi‑Party Computation (MPC)
In 1982, Andrew Yao posed the millionaires’ problem, illustrating how two parties can determine who is wealthier without revealing their exact fortunes.
A key practical building block is 1‑out‑of‑2 Oblivious Transfer (OT), in which a sender holds two messages and the receiver obtains exactly one of them, without the sender learning which one was taken and without the receiver learning the other.
1. Alice generates two RSA key pairs and sends both public keys (p0, p1) to Bob.
2. Bob encrypts a random number with either p0 or p1 (depending on which secret he wants) and sends the ciphertext to Alice.
3. Alice decrypts the ciphertext with both private keys, obtaining k0 and k1; one of these is Bob’s true random number, but she cannot tell which. She XORs k0 with m0 and k1 with m1, producing e0 and e1, and sends both to Bob.
4. Bob XORs the masked value corresponding to his choice with his original random number, recovering the chosen secret (e.g., m0); the other masked value yields nothing meaningful to him.
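The four steps above can be traced in code. This is a toy walkthrough with tiny hand-picked RSA primes and no padding, so it only demonstrates the message flow, not a secure implementation; all names are my own.

```python
import secrets

def make_rsa_keypair(p: int, q: int, e: int = 13):
    """Build a toy RSA key pair from two small primes."""
    n, phi = p * q, (p - 1) * (q - 1)
    d = pow(e, -1, phi)                  # modular inverse (Python 3.8+)
    return (e, n), (d, n)                # (public key, private key)

m0, m1 = 1000, 2000                      # Alice's two secrets
b = 1                                    # Bob's hidden choice bit

# Step 1: Alice generates two key pairs and sends the public keys to Bob.
pub0, prv0 = make_rsa_keypair(1009, 1013)
pub1, prv1 = make_rsa_keypair(1019, 1021)

# Step 2: Bob encrypts a random number under the public key of his choice.
x = secrets.randbelow(1000) + 2
e_b, n_b = (pub0, pub1)[b]
c = pow(x, e_b, n_b)

# Step 3: Alice decrypts c under BOTH private keys. One result is Bob's x,
# the other is effectively random, and she cannot tell which is which.
k0 = pow(c, prv0[0], prv0[1])
k1 = pow(c, prv1[0], prv1[1])
e0, e1 = k0 ^ m0, k1 ^ m1                # masked secrets, both sent to Bob

# Step 4: Bob unmasks the value matching his choice with his random x.
recovered = (e0, e1)[b] ^ x
assert recovered == m1                   # Bob learns m1 only; b stays hidden
```

Because only the decryption under Bob’s chosen key returns his original x, XOR with x strips the mask from exactly one secret, while the other stays hidden behind a value Bob cannot reconstruct.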
By extending this idea, MPC enables data owners to compute jointly while keeping their inputs invisible.
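One concrete way MPC achieves this joint-but-blind computation is additive secret sharing: each party splits its private input into random shares that sum to the input, and only share sums are ever exchanged. A minimal sketch (field modulus and scenario are illustrative):

```python
import secrets

P = 2**61 - 1   # prime modulus for the share arithmetic (illustrative choice)

def share(secret: int, n_parties: int) -> list[int]:
    """Split a secret into n random shares that sum to it mod P."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

# Three parties with private inputs (e.g., salaries) jointly compute the sum.
inputs = [52_000, 61_000, 47_000]
all_shares = [share(x, 3) for x in inputs]

# Party i holds one share from every party and publishes only its local sum;
# each individual share is uniformly random and reveals nothing on its own.
partial_sums = [sum(all_shares[p][i] for p in range(3)) % P for i in range(3)]

total = sum(partial_sums) % P
assert total == sum(inputs)   # correct joint result, raw inputs never exchanged
```

Any subset of fewer than all three shares of an input is statistically independent of that input, which is exactly the “usable but invisible” property privacy computing aims for.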
Privacy Computing – Current State
What Is Privacy Computing?
Privacy computing is an umbrella term for technologies that allow multiple data owners to share, compute, and model on data without exposing the raw data, thereby preserving confidentiality while extracting collective value.
The three main technical routes are Trusted Execution Environments (TEE), Multi‑Party Computation (MPC), and Federated Learning (FL).
Why Is Privacy Computing Gaining Momentum?
The rise of the digital economy, the importance of data assets, and fragmented data infrastructures create a tension that privacy computing resolves by enabling secure data sharing.
Data owners often "cannot give" (due to lack of infrastructure), "do not dare to give" (because of regulatory and privacy concerns), or "do not want to give" (protecting data as a valuable asset).
Challenges and Pain Points
Immature Products and Performance Bottlenecks
Complex technology stacks make deployment difficult, and cryptographic operations (e.g., RSA key generation) are computationally heavy, leading to performance slowdowns in large‑scale federated learning.
Assumptions such as “semi‑honest” participants or “gradient leakage is negligible” are often unrealistic, raising security concerns.
Low Customer Acceptance
Complexity makes it hard to explain to customers and regulators; proving that no plaintext is exposed is challenging.
Lack of End‑to‑End Solutions
Most offerings focus only on the data‑fusion stage, leaving gaps in data acquisition, storage, and lifecycle management.
Future Outlook
Despite challenges, privacy computing is expected to become mainstream.
Convergence of Technical Routes and Interoperable Platforms
MPC, FL, and TEE each have strengths; combining them can mitigate individual weaknesses, and hardware accelerators are emerging to boost MPC performance.
Accelerated Industry Standards
Neutral third‑party bodies and industry alliances are drafting standards that will lower adoption barriers and enable cross‑platform interoperability.
In summary, privacy computing—through TEE, MPC, and FL—offers a way to make data usable yet invisible, addressing the “cannot‑give”, “do‑not‑dare‑give”, and “do‑not‑want‑give” dilemmas of the digital economy.
DataFunTalk
Dedicated to sharing and discussing big data and AI technology applications, aiming to empower a million data scientists. Regularly hosts live tech talks and curates articles on big data, recommendation/search algorithms, advertising algorithms, NLP, intelligent risk control, autonomous driving, and machine learning/deep learning.