Quantum Parallel Neural Network (量子并行神经网络)
  • Authors: CHEN Jia-Lin (陈佳临); WANG Ling-Li (王伶俐)
  • Affiliation: School of Microelectronics, Fudan University (复旦大学微电子学院)
  • Keywords: quantum neuron (quron); Quantum Parallel Neural Network (QPNN); quantum-implementable; fault tolerant; quantum memory
  • Journal: Chinese Journal of Computers (计算机学报; database code JSJX)
  • Online Date: 2018-09-19
  • Year: 2019
  • Volume: v.42 (No.438); Issue: 06
  • Language: Chinese
  • Article ID: JSJX201906004
  • Pages: 47-59 (13 pages)
  • CN: 11-1826/TP
Abstract
Building on our earlier work on quantum probability neural networks (QPrNN), this paper proposes a physically implementable quantum neural network called the Quantum Parallel Neural Network (QPNN). Its main feature is that, based on the activation mechanism of quantum neurons, it exploits quantum parallelism to trace all network states and thereby improve classification results. Compared with previous work, connections are added between each intermediate layer and the input layer, which strengthens the nonlinear expressive power of the quantum neural network, so the architecture can be extended toward deep networks. Owing to QPNN's distinctive quantum-gate nature, the model is insensitive to noise under many conditions, including phase-flip and amplitude-flip noise. Another advantage of QPNN is that it can serve as memory: it can store and retrieve data like classical memory, and it can also act as a generative model that produces new data. For experimental verification, this study selects two standard benchmarks, MNIST handwritten digit recognition and Cifar-10, to evaluate test error. The results show that QPNN surpasses the corresponding fully connected feed-forward neural network while using only about 3% of the neuron resources of the classical network. Compared with QPrNN, the classification test accuracy on MNIST improves by 0.2% and on Cifar-10 by 3%; meanwhile, the average successful retrieval probability on MNIST improves by 2%.
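The quron activation described in the abstract (the output fires with probability greater than 0.5 exactly when the weighted input is positive, analogous to a sigmoid neuron) can be sketched classically. The sigmoid below is an illustrative stand-in, not the paper's actual quantum activation, and the function names are our own:

```python
import math
import random

def quron_fire_probability(x, w):
    """Sigmoid-like squashing of the weighted input x.w.

    Returns a value in (0, 1) that exceeds 0.5 exactly when x.w > 0,
    mirroring the quron behaviour described in the abstract. The
    sigmoid itself is an illustrative stand-in for the paper's
    quantum activation mechanism.
    """
    z = sum(xi * wi for xi, wi in zip(x, w))
    return 1.0 / (1.0 + math.exp(-z))

def sample_quron(x, w, rng=random.random):
    """Probabilistically activate the output quron (1 = fired, 0 = rest)."""
    return 1 if rng() < quron_fire_probability(x, w) else 0
```

For example, `quron_fire_probability([1.0, -0.5], [0.8, 0.2])` has a positive weighted input (0.7), so its firing probability is above 0.5, while a negative weighted input yields a probability below 0.5.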
        Based on the previous quantum probability neural network (QPrNN) research, an improved quantum-implementable neural network, namely the Quantum Parallel Neural Network (QPNN) model, is proposed in this paper. QPNN is a kind of quantum feed-forward neural network composed of a new type of quantum neurons, or qurons, and their connections. If the input quron x′ satisfies x′·ω′ > 0, then the output quron will be activated with a probability larger than 0.5, and rest otherwise. In this sense, qurons are similar to classical neurons based on the sigmoid function. Taking advantage of quantum parallelism, QPNN can trace all possible network states to get the final output. Moreover, one of the most interesting features of QPNN is that several basic networks with different parameters, or even different structures, can be combined at the same time to improve the result. To achieve this, only n qubits are needed to perform quantum multiplexer gates that create 2^n separable networks. Therefore, QPNN has unique advantages over classical feed-forward neural networks. Compared with the previous QPrNN, direct links between each layer and the input layer are added to enhance the nonlinearity of QPNN, so it can be developed into a deep network structure. Due to its unique quantum nature, this model is robust to several quantum noise channels under certain conditions, such as the phase-flip channel and the bit-flip channel, which can be efficiently implemented by universal quantum computers. Another advantage is that QPNN can be used as memory to retrieve the most relevant data and even to generate new data.
        During the learning phase of QPNN, the most expensive part is the summation over all possible states of the hidden-layer qurons. Therefore, in order to focus on states with relatively large probabilities, classical sampling methods are used to sample the layer. In the experiments, this strategy trades off learning speed against accuracy: for a hidden layer with m qurons, only 2^(m−3) sampled states need to be calculated. Alternatively, on a real quantum computer (supposing one exists), this can be done by repeatedly measuring the hidden layer to obtain a set of the most likely layer states. Note that the classical methods sample the layer efficiently only in the second layer, because those states are tensor-product states. As a result, for deeper network structures this strategy does not work well, and only quantum computers perform efficiently.
        To verify the performance of QPNN, we apply it to two real-life classification applications, MNIST handwritten digit recognition and Cifar-10 classification. In both experiments, Matlab simulation results show that only about 3% of the neuron resources are required for QPNN to obtain a better result than the classical feed-forward neural network. Compared with the previous QPrNN, the test accuracies on MNIST and Cifar-10 are improved by 0.2% and 3%, respectively. In addition to the resource savings, QPNN can also be used as memory to retrieve the most relevant data, where the successful retrieval probability on MNIST is improved by 2% over QPrNN.
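The learning-phase bottleneck described above — summing over all 2^m hidden-layer states — motivates keeping only the most probable states. Because the second-layer state is a tensor product, each basis state's probability factorizes over the individual qurons' firing probabilities. A minimal classical sketch of that idea (illustrative only; the paper's exact sampling procedure is not reproduced here):

```python
from itertools import product

def state_probabilities(fire_probs):
    """Probability of each hidden-layer basis state (a tuple of 0/1 bits),
    assuming a tensor-product state so the probability factorizes as a
    product of per-quron firing probabilities."""
    probs = {}
    for state in product((0, 1), repeat=len(fire_probs)):
        p = 1.0
        for bit, q in zip(state, fire_probs):
            p *= q if bit else (1.0 - q)
        probs[state] = p
    return probs

def most_likely_states(fire_probs, k):
    """Keep the k most probable hidden-layer states; this is the
    speed/accuracy tradeoff (the abstract keeps 2^(m-3) of 2^m states)."""
    probs = state_probabilities(fire_probs)
    return sorted(probs, key=probs.get, reverse=True)[:k]
```

With firing probabilities [0.9, 0.2], for instance, the most likely hidden-layer state is (1, 0) with probability 0.9 × 0.8 = 0.72. For deeper layers the state is no longer a tensor product, which is exactly why, as the abstract notes, this classical shortcut stops working.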
References
    (1) The MNIST database of handwritten digits: http://yann.lecun.com/exdb/mnist/
    (2) The Cifar-10 dataset: http://www.cs.toronto.edu/~kriz/cifar.html
