Suppose that a computer is connected to a local network. To send a message, the computer occupies the network for exactly one frame, a fixed number of milliseconds. With probability p = 0.8 the transmission succeeds; if it fails, the computer waits a random delay and then attempts to retransmit, repeating until the message has been transmitted successfully.
What is the expected total time required to transmit a message if the random delay has an expected value of 3 milliseconds?
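Although the statement leaves the frame duration unspecified, a sketch of the derivation is possible; write t for that fixed frame time in milliseconds (an assumption, since the value is garbled in the statement), let N be the number of attempts, and assume the back-off delays D_i are i.i.d. with mean 3 ms and independent of N. The total time is

\[
T = Nt + \sum_{i=1}^{N-1} D_i, \qquad P(N = n) = (1-p)^{n-1}p, \qquad \mathbb{E}[N] = \frac{1}{p} = 1.25,
\]

and conditioning on N gives \(\mathbb{E}\big[\sum_{i=1}^{N-1} D_i\big] = \mathbb{E}[N-1]\,\mathbb{E}[D]\), so

\[
\mathbb{E}[T] = \frac{t}{p} + \Big(\frac{1}{p} - 1\Big)\mathbb{E}[D] = 1.25\,t + 0.25 \times 3 = 1.25\,t + 0.75 \ \text{ms}.
\]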
What is the standard deviation of the total transmission time if the delay between successive attempts is an integer number of milliseconds chosen uniformly at random from 1 to 5 inclusive (each value equally likely)?
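For the second question the same model applies (again a sketch under the assumed fixed frame time t). With D uniform on {1, …, 5}, \(\mathbb{E}[D] = 3\) and \(\operatorname{Var}(D) = (5^2 - 1)/12 = 2\); with M = N − 1 retransmissions, \(\mathbb{E}[M] = (1-p)/p = 0.25\) and \(\operatorname{Var}(M) = \operatorname{Var}(N) = (1-p)/p^2 = 0.3125\). Writing \(S = \sum_{i=1}^{M} D_i\), the compound-sum identities give

\[
\operatorname{Var}(S) = \mathbb{E}[M]\operatorname{Var}(D) + \operatorname{Var}(M)\,\mathbb{E}[D]^2 = 0.25 \times 2 + 0.3125 \times 9 = 3.3125,
\]
\[
\operatorname{Cov}(Nt, S) = t\,\mathbb{E}[D]\operatorname{Var}(M) = 0.9375\,t,
\]
\[
\operatorname{Var}(T) = t^2\operatorname{Var}(N) + \operatorname{Var}(S) + 2\operatorname{Cov}(Nt, S) = 0.3125\,t^2 + 1.875\,t + 3.3125,
\]

and the standard deviation is the square root of \(\operatorname{Var}(T)\), in milliseconds.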
The answer to the question is available in the PDF file https://assignmentexpert.com/homework-answers/mathematics-answer-15211.pdf
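As a quick numerical sanity check of the formulas above (a minimal sketch, not the solution from the linked PDF), a Monte Carlo simulation in Python; FRAME_MS is an assumed placeholder value, since the original statement garbles the frame duration:

    import random

    P_SUCCESS = 0.8   # probability a single attempt succeeds
    FRAME_MS = 1.0    # assumed fixed frame duration in ms (not given in the statement)

    def transmit_once():
        """Simulate one message; return the total transmission time in ms."""
        total = FRAME_MS                      # first attempt always occupies a frame
        while random.random() >= P_SUCCESS:   # attempt failed
            total += random.randint(1, 5)     # uniform integer back-off, 1..5 ms
            total += FRAME_MS                 # retransmission occupies another frame
        return total

    def main():
        n = 200_000
        samples = [transmit_once() for _ in range(n)]
        mean = sum(samples) / n
        var = sum((x - mean) ** 2 for x in samples) / (n - 1)
        sd_formula = (0.3125 * FRAME_MS**2 + 1.875 * FRAME_MS + 3.3125) ** 0.5
        print(f"estimated E[T]  = {mean:.4f} ms (formula: {1.25 * FRAME_MS + 0.75:.4f})")
        print(f"estimated SD[T] = {var ** 0.5:.4f} ms (formula: {sd_formula:.4f})")

    if __name__ == "__main__":
        main()

With FRAME_MS = 1.0 the printed estimates should match the closed-form values to within Monte Carlo noise; the uniform 1..5 back-off has mean 3 ms, so the same run checks both questions.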