000 04384cam a2200529 i 4500
001 u15147
003 SA-PMU
005 20210418123321.0
008 160613t20162016maua     b    001 0 eng  
010 _a 2016022992
040 _aDLC
_beng
_erda
_cDLC
_dYDXCP
_dBTCTA
_dBDX
_dOCLCF
_dUUM
_dMYG
_dNRC
_dWW9
_dDGU
_dIUL
_dEUM
_dHEBIS
_dOBE
_dVP@
_dFDA
_dTMA
_dLIP
_dOCLCO
_dCTU
_dCOH
_dUKWOH
_dOSU
_dUCW
_dVT2
_dU3W
_dI8M
_dUV0
_dCSA
_dNJR
_dGZN
_dTEU
_dAUD
_dJ3G
_dUWO
_dBUF
_dVTU
_dJG0
_dLEATE
_dAUW
_dBGAUB
_dCUY
_dOKX
_dYDX
_dGILDS
_dOCLCQ
_dGZM
019 _a951226949
_a964650355
_a1009034043
020 _a9780262035613
_q(hardcover ;
_qalkaline paper)
020 _a0262035618
_q(hardcover ;
_qalkaline paper)
035 _a(OCoLC)955778308
_z(OCoLC)951226949
_z(OCoLC)964650355
_z(OCoLC)1009034043
042 _apcc
050 0 0 _aQ325.5
_b.G66 2016
072 7 _aCOM
_2eflch
072 7 _aCOM
_2ukslc
082 0 0 _a006.3/1
_223
084 _a006.31
_bGOO
_2z
100 1 _aGoodfellow, Ian,
_eauthor.
245 1 0 _aDeep learning /
_cIan Goodfellow, Yoshua Bengio, and Aaron Courville.
264 1 _aCambridge, Massachusetts :
_bThe MIT Press,
_c[2016]
264 4 _c©2016
300 _axxii, 775 pages :
_billustrations (some color) ;
_c24 cm.
336 _atext
_btxt
_2rdacontent
337 _aunmediated
_bn
_2rdamedia
338 _avolume
_bnc
_2rdacarrier
490 1 _aAdaptive computation and machine learning
504 _aIncludes bibliographical references (pages 711-766) and index.
505 0 _aIntroduction -- APPLIED MATH AND MACHINE LEARNING BASICS -- Linear algebra -- Probability and information theory -- Numerical computation -- Machine learning basics -- DEEP NETWORKS: MODERN PRACTICES -- Deep feedforward networks -- Regularization for deep learning -- Optimization for training deep models -- Convolutional networks -- Sequence modeling: recurrent and recursive nets -- Practical methodology -- Applications -- DEEP LEARNING RESEARCH -- Linear factor models -- Autoencoders -- Representation learning -- Structured probabilistic models for deep learning -- Monte Carlo methods -- Confronting the partition function -- Approximate inference -- Deep generative models.
520 _a"Deep learning is a form of machine learning that enables computers to learn from experience and understand the world in terms of a hierarchy of concepts. Because the computer gathers knowledge from experience, there is no need for a human computer operator to formally specify all the knowledge that the computer needs. The hierarchy of concepts allows the computer to learn complicated concepts by building them out of simpler ones; a graph of these hierarchies would be many layers deep. This book introduces a broad range of topics in deep learning. The text offers mathematical and conceptual background, covering relevant concepts in linear algebra, probability theory and information theory, numerical computation, and machine learning. It describes deep learning techniques used by practitioners in industry, including deep feedforward networks, regularization, optimization algorithms, convolutional networks, sequence modeling, and practical methodology; and it surveys such applications as natural language processing, speech recognition, computer vision, online recommendation systems, bioinformatics, and video games. Finally, the book offers research perspectives, covering such theoretical topics as linear factor models, autoencoders, representation learning, structured probabilistic models, Monte Carlo methods, the partition function, approximate inference, and deep generative models. Deep Learning can be used by undergraduate or graduate students planning careers in either industry or research, and by software engineers who want to begin using deep learning in their products or platforms. A website offers supplementary material for both readers and instructors"--Page 4 of cover.
650 0 _aMachine learning.
650 7 _aComputers and IT.
_2eflch
650 7 _aMachine learning.
_2fast
_0(OCoLC)fst01004795
650 7 _aMaschinelles Lernen
_2gnd
650 7 _aComputers and IT.
_2ukslc
650 4 _aMachine learning.
700 1 _aBengio, Yoshua,
_eauthor.
700 1 _aCourville, Aaron,
_eauthor.
830 0 _aAdaptive computation and machine learning.
856 4 2 _uhttp://www.deeplearningbook.org/
596 _a1
942 _cBOOK
999 _c2541
_d2541