Source: analyticsindiamag.com
Researchers from Microsoft, Princeton University, Technion and the Algorand Foundation recently introduced a new framework known as Falcon. Falcon is an end-to-end 3-party protocol for fast and secure computation of deep learning algorithms on large networks.
Today, a vast amount of private data and sensitive information is continuously being generated. According to the researchers, combining this data with deep learning algorithms can transform the current social and technological scenario.
Behind Falcon
Falcon is a deep learning framework that supports both training and inference with malicious security guarantees. It is a hybrid of ideas from SecureNN and ABY3, combined with new protocol constructions for privacy-preserving deep learning.
The codebase of Falcon is written in C++ in about 12.3k LOC and is built on the communication backend of SecureNN. Falcon provides a cryptographically secure framework in which client data is split into unrecognizable shares held by several non-colluding parties.
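To illustrate that splitting, here is a minimal sketch of the 2-out-of-3 replicated secret sharing that Falcon inherits from ABY3, written in C++ since that is the codebase's language. The 64-bit ring and all identifiers are illustrative assumptions, not code from the Falcon repository.

```cpp
// Minimal sketch of 2-out-of-3 replicated secret sharing over Z_{2^64},
// the style of sharing Falcon inherits from ABY3. All identifiers and
// the 64-bit ring are illustrative; this is not code from the Falcon repo.
#include <array>
#include <cstdint>
#include <random>
#include <utility>

using u64 = std::uint64_t;

// A value x is split as x = x0 + x1 + x2 (mod 2^64); party i holds the
// pair (x_i, x_{i+1 mod 3}), so any two parties can reconstruct x.
struct ReplicatedShares {
    std::array<std::pair<u64, u64>, 3> party;
};

ReplicatedShares share(u64 x, std::mt19937_64& rng) {
    u64 x0 = rng(), x1 = rng();
    u64 x2 = x - x0 - x1;  // unsigned wrap-around gives x0 + x1 + x2 == x (mod 2^64)
    ReplicatedShares s;
    s.party[0] = {x0, x1};
    s.party[1] = {x1, x2};
    s.party[2] = {x2, x0};
    return s;
}

// Summing one copy of each distinct share recovers x.
u64 reconstruct(const ReplicatedShares& s) {
    return s.party[0].first + s.party[1].first + s.party[2].first;
}

int main() {
    std::mt19937_64 rng(42);
    ReplicatedShares s = share(123456789ULL, rng);
    return reconstruct(s) == 123456789ULL ? 0 : 1;  // exit code 0 on success
}
```

Because each share is a uniformly random ring element, any single party's view reveals nothing about x, while linear operations such as additions can be computed locally on shares without any communication.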
The framework has three main advantages:
- The framework is highly expressive. Falcon is the first secure framework to support high-capacity networks with over a hundred million parameters, such as VGG16, and the first to support batch normalization.
- Falcon guarantees security with abort against malicious adversaries, assuming an honest majority. The protocol either completes with the correct output for honest participants or aborts when it detects a malicious adversary (see the sketch after this list).
- The framework presents new theoretical insights for protocol design that make it highly efficient and allow it to outperform existing secure deep learning solutions.
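To make the "security with abort" guarantee concrete: in 2-out-of-3 replicated sharing, every share is held by two parties, so honest parties can cross-check the redundant copies during reconstruction and abort on any mismatch. The sketch below is a simplified illustration of that idea only, not Falcon's actual verification protocol, which covers the full computation rather than just reconstruction.

```cpp
// Simplified illustration of security with abort during reconstruction:
// each share exists in two parties' hands, so the redundant copies can be
// cross-checked and any mismatch triggers an abort. Falcon's real checks
// cover the whole computation; this sketch covers reconstruction only.
#include <cstdint>
#include <optional>

using u64 = std::uint64_t;

struct PartyMsg {
    u64 own;   // x_i: the party's own share
    u64 next;  // x_{i+1 mod 3}: its redundant copy of the next share
};

// Returns the reconstructed value, or std::nullopt (abort) if a party
// sent a share that disagrees with another party's redundant copy.
std::optional<u64> reconstruct_or_abort(const PartyMsg m[3]) {
    for (int i = 0; i < 3; ++i) {
        if (m[i].next != m[(i + 1) % 3].own)
            return std::nullopt;  // inconsistency detected: abort
    }
    return m[0].own + m[1].own + m[2].own;
}
```

Since every share is replicated at two parties, a single cheater cannot alter any share without contradicting an honest party's copy, which is what lets honest parties detect the deviation and abort.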
Evaluating Falcon
To evaluate the framework, the researchers used six diverse networks, ranging from a simple 3-layer multilayer perceptron (MLP) with about 118,000 parameters to large 16-layer networks with about 138 million parameters.
The networks were trained on popular datasets such as MNIST, CIFAR-10 and Tiny ImageNet, as appropriate to the network size. According to the researchers, this framework is the first secure machine learning framework to support the training of high-capacity networks, such as AlexNet and VGG16, on the Tiny ImageNet dataset.
The researchers then performed an extensive evaluation of the framework in both LAN and WAN settings, under both semi-honest and malicious adversaries. The evaluation showed performance improvements over SecureNN, a 3-party secure computation framework for neural networks. They also claimed that Falcon is a 3-PC framework optimized for communication, which is often the main bottleneck in multi-party computation protocols.
Contributions of This Project
According to the researchers, Falcon makes secure deep learning techniques practical through the following contributions:
- Malicious Security: The framework provides strong security guarantees in an honest-majority adversarial setting. It proposes new protocols that are secure against malicious corruption and ensure that the computation either completes correctly or aborts upon detecting malicious activity.
- Improved Protocols: Falcon combines techniques from SecureNN and ABY3, resulting in improved protocol efficiency.
- Expressiveness: Falcon is the first framework to demonstrate support for Batch-Normalization layers in private machine learning. It supports both private training and private inference, which makes the framework expressive; a sketch of the fixed-point arithmetic that such private computation relies on follows this list.
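Private training and inference in frameworks of this kind operate on fixed-point encodings of real numbers inside a ring. The cleartext C++ sketch below shows that encoding and the truncation step after multiplication; the 13-bit fractional precision and all names are illustrative assumptions rather than Falcon's exact parameters, and under MPC the truncation is itself a dedicated sub-protocol.

```cpp
// Cleartext sketch of the fixed-point arithmetic that secure frameworks
// compute under MPC. Reals are scaled by 2^F and embedded in the ring;
// a multiplication doubles the scale and must be truncated by F bits.
// The precision F = 13 is an illustrative assumption, not Falcon's
// exact parameter, and under MPC truncation is itself a sub-protocol.
#include <cstdint>
#include <cstdio>

using i64 = std::int64_t;
constexpr int F = 13;  // fractional bits (assumed for illustration)

i64 encode(double x) { return static_cast<i64>(x * (1LL << F)); }
double decode(i64 v) { return static_cast<double>(v) / (1LL << F); }

// Fixed-point multiply: the raw product carries scale 2^{2F}, so shift
// right by F to restore scale 2^F (assumes the product fits in 64 bits).
i64 fx_mul(i64 a, i64 b) { return (a * b) >> F; }

int main() {
    i64 a = encode(1.5), b = encode(-2.25);
    std::printf("%f\n", decode(fx_mul(a, b)));  // prints -3.375000
    return 0;
}
```

Layers such as batch normalization additionally require division and square roots, which are expensive to evaluate on shares; supporting them securely is part of what makes Falcon's expressiveness claim notable.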
Wrapping Up
According to the researchers, Falcon provides malicious security and performance improvements of several orders of magnitude over prior work. The framework proposes more efficient protocols for common machine learning functionalities while providing stronger security guarantees. It is claimed to be the first secure deep learning framework to examine performance on large-scale networks such as AlexNet and VGG16 and on massive datasets such as Tiny ImageNet.