交通大學 電子工程學系 電子研究所 「磐石課程」課程綱要

課程名稱 (Course Title):(中文)隨機過程 (英文)Stochastic Processes
學分數 (Credits):3
必/選修 (Required/Elective):選修 (Elective)
開課單位 (Offering Unit):電子研究所 (Institute of Electronics)
永久課號 (Permanent Course Number):IEE5620
開課年級 (Intended Students):電子碩博 (Master's and PhD students, Institute of Electronics)

先修科目或先備能力 (Prerequisites):

1. 必須修過大學部「微積分 I、II」、「機率與統計」。
Must have taken the undergraduate courses "Calculus I, II" and "Probability and Statistics."

2. 必須修過足夠之大學部「工程數學」(含線性代數、機率、常微分方程、複變函數)。例如:必須了解矩陣演算與矩陣分解(「線性代數」課程範圍)、機率的概念與相關運算(「機率與統計」課程範圍)、線性固定係數微分方程式之解的特性(「微分方程」課程範圍)、複變數的觀念與相關運算(「複變函數」課程內容)等。
Must have had adequate undergraduate "Engineering Mathematics" (including Linear Algebra, Probability, Ordinary Differential Equations, and Complex Variables). For example, students must understand matrix computation and matrix decomposition (covered in "Linear Algebra"), the concept of probability and related operations (covered in "Probability and Statistics"), the characteristics of solutions to linear constant-coefficient differential equations (covered in "Differential Equations"), and the concept of complex variables and related operations (covered in "Complex Variables").

3. 強烈建議修過大學部「訊號與系統」。
Strongly recommended to have taken the undergraduate course "Signals and Systems."

課程概述與目標 (Course Description and Objectives):

本課程為研究所基礎課程,主要目的在提供進階課程(如數位通訊、檢測與估計、機器學習理論、行動通訊等)所需之基礎背景知識。課程重點包括高斯隨機變數進階討論,連續與離散時間隨機訊號之統計特性分類與其於線性系統輸出入之統計行為表現,及隨機序列之收斂特性探討。同時並介紹常見之統計學理論及其在檢測估計技術上之應用。

This is a fundamental graduate course aimed at providing the background knowledge needed for more advanced courses such as Digital Communication, Detection and Estimation, Machine Learning Theory, and Mobile Communication. The emphasis is on jointly Gaussian random variables; continuous- and discrete-time random signals and their classifications; the statistical behavior of wide-sense stationary signals passing through linear time-invariant systems (the Wiener-Khinchin theorem); and the convergence properties of random sequences. Basic concepts in statistics and their applications to detection and estimation are also touched on in this course.

課程大綱 (Course Outline):單元主題與內容綱要 (Unit Topics and Contents)

1. Probability and linear algebra review
The first topic offers a quick review of important concepts in probability and linear algebra that are used frequently in this course. In particular, the focus is on the total probability theorem, conditional probability and expectation, Bayes' rule, and the eigen-decomposition of Hermitian matrices.

2. Jointly Gaussian and complex Gaussian
This topic introduces the notion of jointly Gaussian random variables and its extension to complex Gaussian random variables. The central limit theorem and its applications will be discussed.

3. Statistics (1): Fundamentals of Detection
This topic briefly introduces the idea of Bayesian detection, with emphasis on the minimum-error-probability detection rule, i.e., maximum a posteriori (MAP) detection, and on maximum likelihood (ML) detection (see the reference sketch after this outline).

4. Statistics (2): Fundamentals of Estimation
This unit covers several techniques in estimation theory, including interval estimation (introducing the notions of the confidence interval and the Student-t distribution), maximum likelihood (ML) estimation, least-squares (LS) estimation, minimum mean-square error (MMSE) estimation, and linear MMSE estimation.

5. Stochastic Processes
This topic includes two subunits: 1) discrete-time random processes, i.e., random sequences, and 2) continuous-time random processes, both under the notion of "stochastic processes." It first introduces the definition of a stochastic process; stationary, wide-sense stationary (WSS), and cyclo-stationary random processes; the autocorrelation function; and the power spectral density. In particular, the relation between a WSS random input and the corresponding output of an LTI system will be discussed (see the reference sketch after this outline). In addition, various convergence definitions for random sequences will be detailed and their properties discussed; the notion of convergence leads to the two laws of large numbers.

6. Advanced topics in stochastic processes
This unit covers several advanced topics at an abstract level, in particular the definition of the stochastic integral and its convergence properties, the mean-ergodic theorem, the Fourier series expansion, and the Karhunen-Loeve expansion.

7. Applications: Wiener filter, Kalman filter, EM algorithm, etc.
The final topic wraps up the course by extending the MMSE estimator to the Wiener filter and the Kalman filter (a recursive MMSE estimator), and by extending ML estimation to the EM algorithm (an iterative ML method), time permitting.
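Reference sketch for Units 3, 4, and 7 (not part of the official outline): a minimal statement of the standard decision and estimation rules named above, using illustrative notation chosen here (observation y, hypotheses H_i, likelihood f(y | H_i), quantity to be estimated x).

\[
\hat{H}_{\mathrm{MAP}} = \arg\max_{i} \Pr(H_i \mid y) = \arg\max_{i} f(y \mid H_i)\,\Pr(H_i),
\qquad
\hat{H}_{\mathrm{ML}} = \arg\max_{i} f(y \mid H_i),
\qquad
\hat{x}_{\mathrm{MMSE}}(y) = \mathbb{E}[x \mid y].
\]

The MAP rule minimizes the probability of decision error and reduces to the ML rule when the hypotheses are equally likely; the conditional-mean (MMSE) estimator is the starting point that the final unit extends to the Wiener and Kalman filters.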
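Reference sketch for Unit 5 (likewise not taken from the syllabus itself): the standard WSS-input/LTI-output relations the unit alludes to, with x(t) a WSS input having autocorrelation R_x(\tau), h(t) the impulse response of the LTI system with frequency response H(f), and y(t) the (also WSS) output.

\[
S_x(f) = \int_{-\infty}^{\infty} R_x(\tau)\, e^{-j 2\pi f \tau}\, d\tau \quad \text{(Wiener-Khinchin)},
\qquad
R_y(\tau) = h(\tau) * h^{*}(-\tau) * R_x(\tau),
\qquad
S_y(f) = |H(f)|^{2}\, S_x(f).
\]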