Question #51211. Suppose that a computer is connected to a local area network. To send a message, the computer occupies the network for exactly one millisecond. With probability p = 0.8 the transmission is successful; if the transmission fails, the computer makes a new attempt after a random delay, and so on until the message has been successfully transmitted. 1) What is the expected total time required to transmit a message if the random delay has a mean of 3 milliseconds? 2) What is the standard deviation of the total transmission time if the delay between attempts is an integer number of milliseconds chosen uniformly at random from 1 to 5 inclusive (all values equally likely)?
Solution. 1) Here we are in the following situation: either the transmission succeeds immediately (this happens with probability $0.8$), or we wait on average 3 milliseconds for another attempt, which then succeeds (this happens with probability $0.2 \cdot 0.8$), and so on. So the number $N$ of attempts is geometric, $P(N = k) = 0.8 \cdot 0.2^{k-1}$, $k = 1, 2, \dots$, with $E[N] = 1/0.8 = 1.25$, and the number of delays is $N - 1$ with $E[N-1] = 0.25$. Each attempt occupies the network for 1 ms and each delay has mean 3 ms, so the expected total time is $E[T] = E[N] \cdot 1 + E[N-1] \cdot 3 = 1.25 + 0.75 = 2$ ms. 2) Here we have a more complicated situation: after each unsuccessful attempt we wait a random time $X$ distributed as $P(X = k) = 1/5$, $k = 1, \dots, 5$. Denote by $X_1, X_2, \dots$ a sequence of iid r.v. with the same distribution as $X$, and let $N$ be the (geometric) number of the attempt on which the transmission succeeds. Then $T = N + \sum_{i=1}^{N-1} X_i = 1 + \sum_{i=1}^{N-1} Y_i$, where $Y_i = 1 + X_i$. It is well known that for a random sum $S = \sum_{i=1}^{M} Y_i$ with $M$ independent of the iid $Y_i$, $\operatorname{Var} S = E[M] \operatorname{Var} Y + \operatorname{Var} M \,(E[Y])^2$. Here $M = N - 1$, so $E[M] = 0.25$ and $\operatorname{Var} M = \operatorname{Var} N = 0.2/0.8^2 = 0.3125$. It can be easily calculated that $E[X] = 3$ and $\operatorname{Var} X = E[X^2] - (E[X])^2 = 11 - 9 = 2$, hence $E[Y] = 4$ and $\operatorname{Var} Y = 2$. So $\operatorname{Var} T = 0.25 \cdot 2 + 0.3125 \cdot 16 = 5.5$, and the standard deviation is $\sqrt{5.5} \approx 2.35$ ms.
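As a sanity check on both answers, here is a minimal Monte Carlo sketch (assuming, as in the solution, that each attempt occupies the channel for 1 ms and succeeds with probability 0.8; the function and parameter names are illustrative, not from the original problem):

```python
import random

def total_time(p=0.8, delay=lambda: random.randint(1, 5)):
    """Simulate sending one message: each attempt takes 1 ms,
    and after a failure we wait a random delay before retrying."""
    t = 0.0
    while True:
        t += 1.0                 # the attempt itself occupies 1 ms
        if random.random() < p:  # success with probability p
            return t
        t += delay()             # random pause before the next attempt

random.seed(0)
n = 200_000

# Part 1: delays with mean 3 ms (deterministic 3 ms suffices for the mean).
mean1 = sum(total_time(delay=lambda: 3) for _ in range(n)) / n

# Part 2: delays uniform on {1, 2, 3, 4, 5}.
samples = [total_time() for _ in range(n)]
mean2 = sum(samples) / n
sd2 = (sum((x - mean2) ** 2 for x in samples) / n) ** 0.5

print(round(mean1, 2))  # should be close to 2 ms
print(round(sd2, 2))    # should be close to sqrt(5.5) ≈ 2.35 ms
```

With 200,000 runs the empirical mean and standard deviation agree with the derived values of $2$ ms and $\sqrt{5.5}$ ms to within sampling noise.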