Lecture 4 Lab Session: Variance, Standard Deviation, Entropy (Solutions)

Task 1
• Compute the variance of the following set of values: 150, 300, 250, 270, 130, 80, 700, 500, 200, 220, 110, 320, 420, 140
• Solution: 27 435.2

Your explanation…
• Variance (population formula, which is what the solution above uses): the mean of the squared deviations from the mean, σ² = Σ(x_i − μ)² / n.

Task 2
• Compute the variance of the following set of values: −6.9, −17.3, 18.1, 1.5, 8.1, 9.6, −13.1, −14.0, 10.5, −14.8, −6.5, 1.4
• Solution: 127.5

Task 3
Consider the following three data sets A, B and C:
A = {9, 10, 11, 7, 13}
B = {10, 10, 10, 10, 10}
C = {1, 1, 10, 19, 19}
Calculate the standard deviation of each data set. Which set has the largest standard deviation? C

Your explanation and calculations
• "First we should do the same as we did in Tasks 1 and 2 for each set to find the variance, and THEN we'll take the root in order to find the standard deviation."

Task 4
• Last year Sam bought 5 tires, which lasted 17,010, 16,080, 17,050, 16,090, and 17,000 miles. Find the standard deviation for the life of the tires.

Decimal separator & thousand separator
• The decimal separator is a symbol used to mark the border between the integral and the fractional parts of a decimal numeral. As this symbol can be a period "." or a comma ",", decimal point and decimal comma are common names for the decimal separator.
• Great Britain and the United States are two of the few places in the world that use a period to indicate the decimal place. Many other countries use a comma instead.
• Likewise, while the U.K. and U.S. use a comma to separate groups of thousands, many other countries use a period instead, and some countries separate thousands groups with a thin space.

Task 4: watch out!
Thousand separator: the same five tire lifetimes can be written as
• 17.010, 16.080, 17.050, 16.090, 17.000
• 17,010, 16,080, 17,050, 16,090, 17,000
• 17010, 16080, 17050, 16090, 17000
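If you want to double-check Tasks 1-4 outside a spreadsheet, the following minimal Python sketch does so (Python and the helper names are my illustrative choices, not part of the lab, which expects Excel or hand calculation). It uses the population formulas, i.e. division by n, which is what the stated solutions 27 435.2 and 127.5 correspond to; the slides do not state an expected answer for Task 4.

```python
from math import sqrt

def pvariance(xs):
    """Population variance: mean of the squared deviations from the mean."""
    mu = sum(xs) / len(xs)
    return sum((x - mu) ** 2 for x in xs) / len(xs)

def pstdev(xs):
    """Population standard deviation: square root of the population variance."""
    return sqrt(pvariance(xs))

task1 = [150, 300, 250, 270, 130, 80, 700, 500, 200, 220, 110, 320, 420, 140]
task2 = [-6.9, -17.3, 18.1, 1.5, 8.1, 9.6, -13.1, -14.0, 10.5, -14.8, -6.5, 1.4]
task3 = {"A": [9, 10, 11, 7, 13], "B": [10, 10, 10, 10, 10], "C": [1, 1, 10, 19, 19]}
task4 = [17010, 16080, 17050, 16090, 17000]  # tire lifetimes in miles (no separators!)

print(round(pvariance(task1), 1))         # 27435.2
print(round(pvariance(task2), 1))         # 127.5
for name, xs in task3.items():
    print(name, round(pstdev(xs), 2))     # A 2.0, B 0.0, C 8.05 -> C is largest
print(round(pstdev(task4), 1))            # Task 4: about 458.4 with the population formula
```

Python's standard library also provides statistics.pvariance and statistics.pstdev (and statistics.variance / statistics.stdev for the sample versions, which divide by n − 1 and therefore give different numbers).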
Entropy: definitions
• In information theory, entropy is a measure of the uncertainty in a random variable. The term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message.
• In our context, we use the notion of entropy to quantify how good or bad a particular outcome is, especially if we have a sequence or range.
• The entropy is the expectation of the surprisal.

Surprisal (= self-information) (Repetition 10)
• The surprisal is the negative log probability, i.e. −log2(P).
• If an event conveys information, that means it is a surprise.
• If an event always occurs, P(Ai) = 1, it carries no information: −log2(1) = 0.
• If an event rarely occurs (e.g. P(Ai) = 0.001), it carries a lot of information: −log2(0.001) ≈ 9.97.
• The less likely the event, the more information it carries.
• Entropy is the weighted average of the information carried by each event.

Entropy curve (Repetition 11)
• If the events are equally probable, the entropy is maximum.

Entropy: J&M p. 148-149 (Repetition 12)
• Computing entropy requires that we establish a random variable X that ranges over whatever we are predicting (a sequence of letters, words, numbers, etc.) and that has a particular probability function, call it f(x).
• The entropy of X is then: H(X) = −Σ_x f(x) · log2 f(x)

Task 5
• What is the entropy H, in bits, of the following source alphabet, whose letters have the probabilities shown?

  Letter:       A     B     C     D
  Probability:  1/4   1/8   1/2   1/8

• Solution: 1.75

Explanation and calculations
• Several approaches are possible; one is sketched below.
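One way to check Task 5 (and the surprisal values on the repetition slides above) is the short Python sketch below. Python and the helper name entropy are my illustrative choices, not part of the lab; the calculation itself is just H = Σ −p·log2(p).

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: the probability-weighted average surprisal."""
    return sum(-p * log2(p) for p in probs if p > 0)

# Surprisal values quoted on the repetition slides
print(-log2(1))       # 0.0 (Python prints -0.0): a certain event carries no information
print(-log2(0.001))   # 9.965... ~ 9.97 bits: a rare event carries a lot of information

# Task 5: alphabet A, B, C, D with probabilities 1/4, 1/8, 1/2, 1/8
print(entropy([1/4, 1/8, 1/2, 1/8]))   # 1.75 bits
```

By hand: since −log2(1/4) = 2, −log2(1/8) = 3 and −log2(1/2) = 1, we get H = 1/4·2 + 1/8·3 + 1/2·1 + 1/8·3 = 0.5 + 0.375 + 0.5 + 0.375 = 1.75 bits.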
Task 6
• Suppose that the following sequence of Yes/No questions was an optimal strategy to learn which of the letters {A, B, C, D, E, F, G} someone had chosen:
  "Is it A?" "No."
  "Is it a member of the set {B, C}?" "No."
  "Is it a member of the set {D, E}?" "No."
  "Is it F?" "No."
• What is the entropy of this alphabet?
• Solution: 2.25 (if we assume probability 0.5 for each answer)

Solution (probability guessing, I assume)
• If every answer has probability 0.5, the implied letter probabilities are P(A) = 1/2, P(B) = P(C) = 1/8, P(D) = P(E) = 1/16, P(F) = P(G) = 1/16, so the entropy is 1/2·1 + 2·(1/8·3) + 4·(1/16·4) = 2.25 bits.
• Alternative calculations, without probability guessing, etc.: see the sketch after the closing notes below.

• Redo your calculations if they were wrong or if you did not figure out how to solve a problem.
• Some of the reports are sloppy (missing information).
• Thank you for complying with the naming conventions: it saves me a lot of time!
• If you do not want to use Excel and cannot convert your spreadsheet into an Excel format, please paste your calculations and formulae into the PDF.
• Well done! The end.
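As referenced under Task 6, here is a short Python re-check (my illustrative choice; the handout itself expects Excel or hand calculation). It verifies the 2.25-bit answer under the probability-0.5 assumption and also shows one possible reading of the "without probability guessing" alternative, namely a uniform distribution over the seven letters; the slides do not state which alternative calculation was intended.

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits."""
    return sum(-p * log2(p) for p in probs if p > 0)

# Probability-guessing reading: each Yes/No answer is assumed to be 50/50,
# so the implied letter probabilities (A..G) are:
probs_guess = [1/2, 1/8, 1/8, 1/16, 1/16, 1/16, 1/16]
print(sum(probs_guess))        # 1.0 (sanity check: a valid distribution)
print(entropy(probs_guess))    # 2.25 bits, matching the stated solution

# One possible alternative without probability guessing: all 7 letters equally likely
print(entropy([1/7] * 7))      # log2(7) ~ 2.807 bits
```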