Hardness of (Semiuniform) MLWE with Short Distributions Using the Rényi Divergence


Bibliographic Details
Main Authors: Wenjuan Jia, Baocang Wang
Format: Article
Language:English
Published: Wiley 2023-01-01
Series:IET Information Security
Online Access:http://dx.doi.org/10.1049/2023/2104380
Description
Summary:The module learning with errors (MLWE) problem has attracted considerable attention for its tradeoff between security and efficiency. The quantum/classical worst-case to average-case hardness of the MLWE problem (or, more precisely, of a family of problems) has been established, but most of the known results require the seed distribution to be the uniform distribution. In the present paper, we show that, using the noise flooding technique based on the Rényi divergence, the search MLWE problem with uniform B-bounded secret distribution for 1≤B≪q can still be hard for some seed distributions that are not (even computationally indistinguishable from) the uniform distribution, under the standard MLWE assumption. Specifically, we show that if the seed distribution is a semiuniform distribution (namely, a distribution that can be publicly derived from, and has only a “small difference” from, the uniform distribution), then for suitable parameter choices the search MLWE problem with uniform bounded secret distribution is hard under the standard MLWE assumption. Moreover, we show that, under an appropriate setting of parameters, the search MLWE problem with uniform bounded noise distribution is at least as hard as the standard MLWE assumption, via a different approach from that of Boudgoust et al. [JoC 2023].
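The noise flooding argument mentioned in the summary typically rests on the probability preservation property of the Rényi divergence. As a point of reference (these standard definitions are not quoted from the article itself), for distributions P, Q with Supp(P) ⊆ Supp(Q) and order a ∈ (1, ∞):

```latex
% Rényi divergence of order a between discrete distributions P and Q
R_a(P \,\|\, Q) \;=\; \left( \sum_{x \in \mathrm{Supp}(P)} \frac{P(x)^a}{Q(x)^{a-1}} \right)^{\frac{1}{a-1}}

% Probability preservation: for any event E \subseteq \mathrm{Supp}(Q),
Q(E) \;\ge\; \frac{P(E)^{\frac{a}{a-1}}}{R_a(P \,\|\, Q)}
```

In a search-problem reduction of this kind, if a solver's success event E has noticeable probability under the real distribution P, and R_a(P‖Q) is polynomially bounded, then the success probability under the ideal distribution Q remains noticeable, which is why bounding the Rényi divergence between the semiuniform and uniform seed distributions suffices for search MLWE.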
ISSN:1751-8717