This experiment will enable the user to understand the above aspects of these channels. To carry out this experiment, the user is expected to know the basics of probability distributions (such as the Bernoulli, Binomial, and Gaussian distributions) and the notion of conditional probability. The user should ideally read the theory part of this experiment before attempting the questions.
**experiment/pretest.json** (1 addition, 1 deletion)
@@ -46,7 +46,7 @@
"d": "$p_X(0)=0.3,\\hspace{0.2cm} p_X(1)=0.6$."
},
"explanations": {
- "a": "Wrong answer. This option is a valid Binomial distribution, not a Bernoulli distribution.",
+ "a": "Incorrect answer. This option is a valid Binomial distribution, not a Bernoulli distribution.",
"b": "Incorrect answer. A Bernoulli random variable takes only two values.",
"c": "Correct answer! A Bernoulli random variable takes two possible values (often represented as $0$ or $1$), and the probabilities should sum to $1$.",
"d": "Incorrect answer! A Bernoulli random variable does take only two possible values (often represented as $0$ or $1$). However, their probabilities should sum to $1$."
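The validity conditions discussed in these explanations can be sketched in code. The helper below is hypothetical (not part of the lab code); it checks the two requirements for a valid Bernoulli distribution over $\{0,1\}$: non-negative probabilities that sum to $1$.

```python
def is_valid_bernoulli(p0, p1, tol=1e-9):
    """Check whether (p0, p1) is a valid Bernoulli pmf over {0, 1}:
    both probabilities non-negative and summing to 1 (up to tolerance)."""
    return p0 >= 0 and p1 >= 0 and abs(p0 + p1 - 1.0) <= tol

# The distribution from option (d): the probabilities sum to 0.9, not 1.
print(is_valid_bernoulli(0.3, 0.6))  # False
print(is_valid_bernoulli(0.4, 0.6))  # True
```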
**experiment/procedure.md** (5 additions, 7 deletions)
@@ -1,5 +1,3 @@
- ### Procedure
The experiment consists of three sub-experiments, through which the user will systematically understand the essential mathematical aspects of three important probabilistic channels discussed in the theory part of this experiment. These channels are:
1. The Binary Erasure Channel, which erases each bit transmitted independently with probability $\epsilon$. The erasure symbol is denoted by $?$.
@@ -8,7 +6,7 @@ The experiment consists of three sub-experiments, through which the user will be
The detailed working of this experiment is as follows.
1. **Select Output Vectors**: Select the possible output vectors ($\vec{y}$) of the Binary Erasure Channel $BEC(\epsilon)$ whose input vector $\vec{x}$ is given. Selected boxes turn green; deselecting them turns them gray.
1. **Select Output Vectors**: Select the possible output vectors ($\vec{y}$) of the Binary Symmetric Channel $BSC(p)$ whose input vector $\vec{x}$ is given. Selected boxes turn green; deselecting them turns them gray.
1. **Enter Probability Values**: According to the statement about the AWGN channel displayed, enter the values in the input boxes provided in the expression that represents the probability density of the output.
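To see which output vectors are possible for the BEC selection step, one can enumerate them directly: each input bit is either delivered intact or replaced by the erasure symbol `?`. This is a minimal sketch, not part of the lab code; the function name is illustrative.

```python
from itertools import product

def bec_possible_outputs(x):
    """All output vectors a BEC(eps) with 0 < eps < 1 can produce for
    input bit-string x: each bit either passes through unchanged or is
    erased to '?'."""
    choices = [(bit, '?') for bit in x]
    return [''.join(y) for y in product(*choices)]

print(bec_possible_outputs('10'))  # ['10', '1?', '?0', '??']
```

For the $BSC(p)$ step, by contrast, every binary vector of the same length as $\vec{x}$ is a possible output (each bit is flipped independently with probability $p$), so no enumeration of erasure patterns is needed.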
**experiment/theory.md** (4 additions, 6 deletions)
@@ -1,14 +1,12 @@
- # Theory
- ## What is a Communication Channel?
+ ### What is a Communication Channel?
A communication channel is a medium through which communication happens. In this virtual lab, we deal specifically with channels that accept binary-valued inputs; we call these binary-input channels. For some binary channels, we write the possible set of inputs as the *logical bits* $\{0,1\}$. That is, at any time instant, we can send a logical "0" through the channel, or a logical "1". Equivalently, we may write the binary alphabet in the *bipolar* form $\{+1,-1\}$. Normally, we take the logical-bit to bipolar mapping as $0\to +1$ and $1\to -1$.
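The logical-bit to bipolar mapping above has a simple closed form, $b \mapsto 1-2b$, sketched below (the function name is illustrative, not from the lab code):

```python
def to_bipolar(bit):
    """Logical-bit to bipolar mapping: 0 -> +1, 1 -> -1,
    via the closed form 1 - 2*b."""
    return 1 - 2 * bit

print([to_bipolar(b) for b in [0, 1, 1, 0]])  # [1, -1, -1, 1]
```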
We generally use the notation $\cal X$ to denote the input alphabet of the channel. From the point of view of the receiver, the input to the channel is unknown, and hence is modelled as a random variable with some input probability distribution. We denote this input random variable by $X$. Similarly, the output of the channel is a random variable, denoted by $Y$. We denote the output alphabet, the set of all values that the output can possibly take, by $\cal Y$.
- ## Types of Channels considered in this virtual lab
+ ### Types of Channels considered in this virtual lab
The problem of designing good communication systems arises precisely due to the existence of *noise* in communication channels. The noise in a communication channel is generally modelled via the conditional probabilities (of the output value, given the input value). We consider three important types of communication channels (in other words, noise models) in this virtual lab.
@@ -47,7 +45,7 @@ $$Y=X+Z.$$
---
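The AWGN relation $Y = X + Z$ from the hunk above can be simulated in a few lines. This is a sketch under stated assumptions, not the lab's implementation: bits are mapped to bipolar form ($0\to +1$, $1\to -1$) and $Z$ is drawn as independent Gaussian noise with mean $0$ and standard deviation `sigma`.

```python
import random

def awgn_channel(x_bits, sigma, rng=random):
    """Simulate an AWGN channel: map each logical bit to bipolar form
    (0 -> +1, 1 -> -1), then add independent Gaussian noise Z with
    mean 0 and standard deviation sigma, so that Y = X + Z."""
    return [(1 - 2 * b) + rng.gauss(0.0, sigma) for b in x_bits]

ys = awgn_channel([0, 1, 0, 1], sigma=0.5)
print(ys)  # four real-valued noisy outputs near +1, -1, +1, -1
```

Note that unlike the BEC and BSC, the output alphabet here is the real line, which is why the lab asks for a probability *density* rather than point probabilities.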
- ## Conditional Distribution Associated with the Communication Channel
+ ### Conditional Distribution Associated with the Communication Channel
We can also describe the channels above using the conditional distribution of the output random variable $Y$ given the input random variable $X$. Specifically, we have the following.
We assume that the three channels considered in this virtual lab have the *memoryless* property and operate *without feedback*. To be precise, if we transmit an $n$-length sequence of bits $(x_1,\ldots,x_n)$ through any of these channels, the output is a sequence $(y_1,\ldots,y_n)$ whose probability is as follows.
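The diff view appears truncated here; for memoryless channels used without feedback, the sentence presumably refers to the standard factorization of the channel transition probability into a product of per-symbol terms:

$$P\big(y_1,\ldots,y_n \mid x_1,\ldots,x_n\big)=\prod_{i=1}^{n} P\big(y_i \mid x_i\big).$$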