  • Entropy and Information Theory (Reprint Edition), from the Foreign Masterworks in Electronics and Information series [Paperback]
  • Available from 2 merchants: ¥60.00 to ¥68.00
  • Author: Robert M. Gray
  • Publisher: Science Press; 1st edition (June 1, 2012)
  • ISBN: 9787030344731

Product Description

    Editorial Recommendation

    Written by Robert M. Gray, this reprint edition of Entropy and Information Theory belongs to the Foreign Masterworks in Electronics and Information series. The book consists of 14 chapters covering entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing, among other topics. It is suitable as a reference text for senior undergraduates and graduate students in electrical engineering, statistics, and communications taking a first course in information theory.
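
    To make these topics concrete, here is a minimal illustrative sketch in Python (ours, not taken from the book): it computes Shannon entropy and, from it, the classical closed-form capacity C = 1 - H(eps) of a binary symmetric channel with crossover probability eps.

        import math

        def entropy(p):
            # Shannon entropy H(X) = -sum_x p(x) * log2 p(x), in bits.
            return -sum(px * math.log2(px) for px in p if px > 0)

        def bsc_capacity(eps):
            # Capacity of a binary symmetric channel with crossover
            # probability eps: C = 1 - H(eps) bits per channel use.
            return 1.0 - entropy([eps, 1.0 - eps])

        print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit
        print(bsc_capacity(0.1))     # about 0.531 bits per use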

    Contents

    Preface
    Introduction
    1 Information Sources
    1.1 Probability Spaces and Random Variables
    1.2 Random Processes and Dynamical Systems
    1.3 Distributions
    1.4 Standard Alphabets
    1.5 Expectation
    1.6 Asymptotic Mean Stationarity
    1.7 Ergodic Properties
    2 Pair Processes: Channels, Codes, and Couplings
    2.1 Pair Processes
    2.2 Channels
    2.3 Stationarity Properties of Channels
    2.4 Extremes: Noiseless and Completely Random Channels
    2.5 Deterministic Channels and Sequence Coders
    2.6 Stationary and Sliding-Block Codes
    2.7 Block Codes
    2.8 Random Punctuation Sequences
    2.9 Memoryless Channels
    2.10 Finite-Memory Channels
    2.11 Output Mixing Channels
    2.12 Block Independent Channels
    2.13 Conditionally Block Independent Channels
    2.14 Stationarizing Block Independent Channels
    2.15 Primitive Channels
    2.16 Additive Noise Channels
    2.17 Markov Channels
    2.18 Finite-State Channels and Codes
    2.19 Cascade Channels
    2.20 Communication Systems
    2.21 Couplings
    2.22 Block to Sliding-Block: The Rohlin-Kakutani Theorem
    3 Entropy
    3.1 Entropy and Entropy Rate
    3.2 Divergence Inequality and Relative Entropy
    3.3 Basic Properties of Entropy
    3.4 Entropy Rate
    3.5 Relative Entropy Rate
    3.6 Conditional Entropy and Mutual Information
    3.7 Entropy Rate Revisited
    3.8 Markov Approximations
    3.9 Relative Entropy Densities
    4 The Entropy Ergodic Theorem
    4.1 History
    4.2 Stationary Ergodic Sources
    4.3 Stationary Nonergodic Sources
    4.4 AMS Sources
    4.5 The Asymptotic Equipartition Property
    5 Distortion and Approximation
    5.1 Distortion Measures
    5.2 Fidelity Criteria
    5.3 Average Limiting Distortion
    5.4 Communications Systems Performance
    5.5 Optimal Performance
    5.6 Code Approximation
    5.7 Approximating Random Vectors and Processes
    5.8 The Monge/Kantorovich/Vasershtein Distance
    5.9 Variation and Distribution Distance
    5.10 Coupling Discrete Spaces with the Hamming Distance
    5.11 Process Distance and Approximation
    5.12 Source Approximation and Codes
    5.13 d-bar Continuous Channels
    6 Distortion and Entropy
    6.1 The Fano Inequality
    6.2 Code Approximation and Entropy Rate
    6.3 Pinsker's and Marton's Inequalities
    6.4 Entropy and Isomorphism
    6.5 Almost Lossless Source Coding
    6.6 Asymptotically Optimal Almost Lossless Codes
    6.7 Modeling and Simulation
    7 Relative Entropy
    7.1 Divergence
    7.2 Conditional Relative Entropy
    7.3 Limiting Entropy Densities
    7.4 Information for General Alphabets
    7.5 Convergence Results
    8 Information Rates
    8.1 Information Rates for Finite Alphabets
    8.2 Information Rates for General Alphabets
    8.3 A Mean Ergodic Theorem for Densities
    8.4 Information Rates of Stationary Processes
    8.5 The Data Processing Theorem
    8.6 Memoryless Channels and Sources
    9 Distortion and Information
    9.1 The Shannon Distortion-Rate Function
    9.2 Basic Properties
    9.3 Process Definitions of the Distortion-Rate Function
    9.4 The Distortion-Rate Function as a Lower Bound
    9.5 Evaluating the Rate-Distortion Function
    10 Relative Entropy Rates
    10.1 Relative Entropy Densities and Rates
    10.2 Markov Dominating Measures
    10.3 Stationary Processes
    10.4 Mean Ergodic Theorems
    11 Ergodic Theorems for Densities
    11.1 Stationary Ergodic Sources
    11.2 Stationary Nonergodic Sources
    11.3 AMS Sources
    11.4 Ergodic Theorems for Information Densities
    12 Source Coding Theorems
    12.1 Source Coding and Channel Coding
    12.2 Block Source Codes for AMS Sources
    12.3 Block Source Code Mismatch
    12.4 Block Coding Stationary Sources
    12.5 Block Coding AMS Ergodic Sources
    12.6 Subadditive Fidelity Criteria
    12.7 Asynchronous Block Codes
    12.8 Sliding-Block Source Codes
    12.9 A Geometric Interpretation
    13 Properties of Good Source Codes
    13.1 Optimal and Asymptotically Optimal Codes
    13.2 Block Codes
    13.3 Sliding-Block Codes
    14 Coding for Noisy Channels
    14.1 Noisy Channels
    14.2 Feinstein's Lemma
    14.3 Feinstein's Theorem
    14.4 Channel Capacity
    14.5 Robust Block Codes
    14.6 Block Coding Theorems for Noisy Channels
    14.7 Joint Source and Channel Block Codes
    14.8 Synchronizing Block Channel Codes
    14.9 Sliding-Block Source and Channel Coding
    References
    Index