From 10b7b800d4e57fb9b1b67c6045294c3b4fbe1651 Mon Sep 17 00:00:00 2001
From: Luke Peters
Date: Mon, 21 Aug 2023 13:20:07 +0200
Subject: [PATCH] More changes to wording/formatting, fixing small errors

---
 README.md | 74 ++++++++++++++++++++++++++++---------------------------
 1 file changed, 38 insertions(+), 36 deletions(-)

diff --git a/README.md b/README.md
index 2729aa3d..826aec7c 100644
--- a/README.md
+++ b/README.md
@@ -73,22 +73,22 @@ The user can decide the type of non-linearity function for certain specified lay
 The current VGSL-Specs implementation supports the following layers:
 
 ## Overview Sheet
-| **Layer**          | **Spec**                                       | **Example**      | **Description**                                                      |
-|--------------------|------------------------------------------------|------------------|----------------------------------------------------------------------|
-| Input              | `[batch, height, width, depth]`                | `None,64,None,1` | Input layer with variable batch_size & width, depth of 1 channel     |
-| Output             | `O(2\|1\|0)(l\|s)<n>`                          | `O1s10`          | Dense layer with a 1D sequence as with 10 output classes and softmax |
-| Conv2D             | `C(s\|t\|r\|e\|l\|m)<x>,<y>[,<s_x>,<s_y>],<d>` | `Cr,3,3,1,1`     | Conv2D layer with Relu, a 3x3 filter and 1,1 stride                  |
-| Dense (FC)         | `F(s\|t\|r\|l\|m)<d>`                          | `Fs64`           | Dense layer with softmax and 64 units                                |
-| LSTM               | `L(f\|r)[s]<n>`                                | `Lf64`           | Forward-only LSTM cell with 64 units                                 |
-| GRU                | `G(f\|r)[s]<n>`                                | `Gr64`           | Reverse-only GRU cell with 64 units                                  |
-| Bidirectional      | `B(g\|l)<n>`                                   | `Bl256`          | Bidirectional layer wrapping a LSTM RNN with 256 units               |
-| BatchNormalization | `Bn`                                           | `Bn`             | BatchNormalization layer                                             |
-| MaxPooling2D       | `Mp<x>,<y>,<s_x>,<s_y>`                        | `Mp2,2,1,1`      | MaxPooling2D layer with 2,2 pool size and 1,1 strides                |
-| AvgPooling2D       | `Ap<x>,<y>,<s_x>,<s_y>`                        | `Ap2,2,2,2`      | AveragePooling2D layer with 2,2 pool size and 1,1 strides            |
-| Dropout            | `D<rate>`                                      | `Do25`           | Dropout layer with `dropout` = 0.25                                  |
-| ------------------ | ------------------------------------
-|------------------|----------------------------------------------------------------------|
-| ResidualBlock      | `TODO`                                         |                  |                                                                      |
-| CTCLayer           | `TODO`                                         |                  |                                                                      |
+| **Layer**          | **Spec**                                       | **Example**        | **Description**                                                             |
+|--------------------|------------------------------------------------|--------------------|-----------------------------------------------------------------------------|
+| Input              | `[batch, height, width, depth]`                | `None,64,None,1`   | Input layer with variable batch_size & width, depth of 1 channel            |
+| Output             | `O(2\|1\|0)(l\|s)<n>`                          | `O1s10`            | Dense layer with a 1D sequence as output with 10 output classes and softmax |
+| Conv2D             | `C(s\|t\|r\|e\|l\|m)<x>,<y>[,<s_x>,<s_y>],<d>` | `Cr3,3,64`         | Conv2D layer with ReLU, a 3x3 filter, 1x1 stride and 64 filters             |
+| Dense (FC)         | `F(s\|t\|r\|e\|l\|m)<d>`                       | `Fs64`             | Dense layer with softmax and 64 units                                       |
+| LSTM               | `L(f\|r)[s]<n>`                                | `Lf64`             | Forward-only LSTM cell with 64 units                                        |
+| GRU                | `G(f\|r)[s]<n>`                                | `Gr64`             | Reverse-only GRU cell with 64 units                                         |
+| Bidirectional      | `B(g\|l)<n>`                                   | `Bl256`            | Bidirectional layer wrapping an LSTM RNN with 256 units                     |
+| BatchNormalization | `Bn`                                           | `Bn`               | BatchNormalization layer                                                    |
+| MaxPooling2D       | `Mp<x>,<y>,<s_x>,<s_y>`                        | `Mp2,2,1,1`        | MaxPooling2D layer with 2x2 pool size and 1x1 strides                       |
+| AvgPooling2D       | `Ap<x>,<y>,<s_x>,<s_y>`                        | `Ap2,2,2,2`        | AveragePooling2D layer with 2x2 pool size and 2x2 strides                   |
+| Dropout            | `D<rate>`                                      | `Do25`             | Dropout layer with `dropout` = 0.25                                         |
+| ------------------ | ---------------------------------------------- | ------------------ | --------------------------------------------------------------------------- |
+| ResidualBlock      | `TODO`                                         |                    |                                                                             |
+| CTCLayer           | `TODO`                                         |                    |                                                                             |
 
 ## Layer Details
 ### Input:
@@ -107,37 +107,39 @@ _Creates a Dense layer with a 1D sequence as output with 10 classes and softmax_
 ### Conv2D:
 Spec: **`C(s|t|r|e|l|m)<x>,<y>[,<s_x>,<s_y>],<d>`**
-Convolves using a `x`,`y` window and `d` units. Optionally, the stride window can be set with (`s_x`, `s_y`)
-Example: `Cr3,3,1,1`
-_Creates a Conv2D layer with a Relu activation function a 3x3 filter and 1,1 stride (if s_x and s_y are not provided, set to (1,1) default)_
+Convolves using an `x`,`y` window and `d` filters. Optionally, the stride window can be set with (`s_x`, `s_y`).
+Example 1: `Cr3,3,64`
+_Creates a Conv2D layer with a ReLU activation function, a 3x3 filter, 1x1 stride (if `s_x` and `s_y` are not provided, they default to (1,1)) and 64 filters_
+Example 2: `Cr3,3,1,3,128`
+_Creates a Conv2D layer with a ReLU activation function, a 3x3 filter, 1x3 strides and 128 filters_
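To make the token format concrete, here is a minimal Python sketch of how such a Conv2D spec string could be decomposed. `parse_conv_spec` and the letter-to-activation mapping are illustrative assumptions inferred from this README, not this repository's actual parser:

```python
import re

# Assumed letter-to-activation mapping, inferred from the examples above.
ACTIVATIONS = {"s": "softmax", "t": "tanh", "r": "relu",
               "e": "elu", "l": "linear", "m": "sigmoid"}

def parse_conv_spec(spec):
    """Split a token like 'Cr3,3,64' or 'Cr3,3,1,3,128' into its parts.

    Returns (activation, kernel_size, strides, filters); strides default
    to (1, 1) when s_x and s_y are omitted, as described above.
    """
    m = re.fullmatch(r"C([streml])(\d+),(\d+)(?:,(\d+),(\d+))?,(\d+)", spec)
    if m is None:
        raise ValueError(f"invalid Conv2D spec: {spec}")
    act, x, y, s_x, s_y, d = m.groups()
    strides = (int(s_x), int(s_y)) if s_x else (1, 1)
    return ACTIVATIONS[act], (int(x), int(y)), strides, int(d)
```

The tuple returned for `Cr3,3,64` would map directly onto a Keras call such as `Conv2D(64, (3, 3), strides=(1, 1), activation='relu')`.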

 ### **Dense (Fully-connected layer)**:
 Spec: **`F(s|t|r|e|l|m)<d>`**
-Fully-connected(FC)) with `s|t|r|e|l|m` non-linearity and `d` outputs
+Fully-connected (FC) layer with `s|t|r|e|l|m` non-linearity and `d` units.
Example: `Fs64`
-_Creates a FC layer with softmax non-linearity and 64 outputs_
+_Creates a FC layer with softmax non-linearity and 64 units_
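A Dense token can be illustrated with the same kind of sketch; `parse_dense_spec` is a hypothetical helper, and the activation letters are assumed to follow the overview table above:

```python
import re

def parse_dense_spec(spec):
    """Parse a Dense token such as 'Fs64' into (activation, units)."""
    activations = {"s": "softmax", "t": "tanh", "r": "relu",
                   "e": "elu", "l": "linear", "m": "sigmoid"}
    m = re.fullmatch(r"F([streml])(\d+)", spec)
    if m is None:
        raise ValueError(f"invalid Dense spec: {spec}")
    return activations[m.group(1)], int(m.group(2))
```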

 ### **LSTM**:
 Spec: **`L(f|r)[s]<n>`**
-LSTM cell running either forward-only (`f`) or reversed-only (`r`), with `n` outputs. Optionally, summarization of the output at the final step can be added by providing a Boolean `d` (corresponds with the return_sequences)
+LSTM cell running either forward-only (`f`) or reversed-only (`r`) with `n` units. Optionally, the output can be summarized at the final step by adding `s` (corresponds with `return_sequences`).
Example: `Lf64`
-_Creates a forward-only LSTM cell with 64 outputs_
+_Creates a forward-only LSTM cell with 64 units_
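The `L(f|r)[s]<n>` form can be sketched as a small parser. This is an assumption-laden illustration, not the repository's API: the helper name is made up, and the optional `s` is read as "summarize the final step", i.e. `return_sequences=False` in Keras terms:

```python
import re

def parse_lstm_spec(spec):
    """Parse an LSTM token like 'Lf64' or 'Lrs128'.

    Returns (go_backwards, return_sequences, units). The optional 's'
    is assumed to mean return_sequences=False (summarize final step).
    """
    m = re.fullmatch(r"L([fr])(s?)(\d+)", spec)
    if m is None:
        raise ValueError(f"invalid LSTM spec: {spec}")
    direction, summarize, units = m.groups()
    return direction == "r", summarize != "s", int(units)
```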

 ### **GRU**:
 Spec: **`G(f|r)[s]<n>`**
-GRU cell running either forward-only (`f`) or reversed-only (`r`), with `n` outputs. Optionally, summarization of the output at the final step can be added by providing a Boolean `d` (corresponds with the return_sequences)
+GRU cell running either forward-only (`f`) or reversed-only (`r`) with `n` units. Optionally, the output can be summarized at the final step by adding `s` (corresponds with `return_sequences`).
Example: `Gf64`
-_Creates a forward-only GRU cell with 64 outputs_
+_Creates a forward-only GRU cell with 64 units_
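The GRU token mirrors the LSTM form; a matching sketch (again a hypothetical helper under the same `s` assumption):

```python
import re

def parse_gru_spec(spec):
    """Parse a GRU token like 'Gf64' into (go_backwards, return_sequences, units)."""
    m = re.fullmatch(r"G([fr])(s?)(\d+)", spec)
    if m is None:
        raise ValueError(f"invalid GRU spec: {spec}")
    direction, summarize, units = m.groups()
    return direction == "r", summarize != "s", int(units)
```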

 ### **Bidirectional**:
 Spec: **`B(g|l)<n>`**
-Bidirectional layer which wraps either a LSTM (`l`) or GRU (`g`) RNN layer and runs it in both forward and backward directions.
+Bidirectional layer that wraps either an LSTM (`l`) or GRU (`g`) RNN layer and runs it in both forward and backward directions, with `n` units.
Example: `Bl256`
-_Creates a Bidirectional RNN layer using a LSTM Cell with 256 outputs_
+_Creates a Bidirectional RNN layer using a LSTM Cell with 256 units_
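For completeness, the same style of sketch for the Bidirectional token (`parse_bidirectional_spec` is illustrative only):

```python
import re

def parse_bidirectional_spec(spec):
    """Parse 'Bl256' or 'Bg128' into (wrapped_rnn, units)."""
    m = re.fullmatch(r"B([gl])(\d+)", spec)
    if m is None:
        raise ValueError(f"invalid Bidirectional spec: {spec}")
    return ("LSTM" if m.group(1) == "l" else "GRU"), int(m.group(2))
```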

 ### **BatchNormalization**:
@@ -149,24 +151,24 @@ Applies a transformation that maintains the mean output close to 0 and the outpu

 ### **MaxPooling2D**:
+Spec: **`Mp<x>,<y>,<s_x>,<s_y>`**
Downsampling technique along the height and width of a 2D image by taking the maximum value over the input window `x`,`y` and is shifted by strides along each dimension (`s_x`, `s_y`). `padding` is always set to "same".
- Spec: **`Mp<x>,<y>,<s_x>,<s_y>`**
- Example: `Mp2,2,2,2`
- _Creates a MaxPooling2D layer with pool size (2,2) and strides of (2,2)_
+Example: `Mp2,2,2,2`
+_Creates a MaxPooling2D layer with pool size (2,2) and strides of (2,2)_

 ### **AvgPooling2D**:
+Spec: **`Ap<x>,<y>,<s_x>,<s_y>`**
Downsampling technique along the height and width of a 2D image by taking the average value over the input window x,y and is shifted by strides along each dimension (s_x, s_y). `padding` is always set to "same".
- Spec: **`Ap<x>,<y>,<s_x>,<s_y>`**
- Example: `Ap2,2,2,2`
- _Creates a AveragePooling2D layer with pool size (2,2) and strides of (2,2)_
+Example: `Ap2,2,2,2`
+_Creates an AveragePooling2D layer with pool size (2,2) and strides of (2,2)_
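The two pooling tokens share one shape, so a single sketch covers both; `parse_pool_spec` is a hypothetical helper, and `padding` is assumed to be "same" (as stated above) and therefore not encoded in the token:

```python
import re

def parse_pool_spec(spec):
    """Parse 'Mp2,2,1,1' or 'Ap2,2,2,2' into (kind, pool_size, strides)."""
    m = re.fullmatch(r"([MA])p(\d+),(\d+),(\d+),(\d+)", spec)
    if m is None:
        raise ValueError(f"invalid pooling spec: {spec}")
    kind = "max" if m.group(1) == "M" else "average"
    x, y, s_x, s_y = (int(g) for g in m.groups()[1:])
    return kind, (x, y), (s_x, s_y)
```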

 ### **Dropout**:
-Regularization layer that randomly sets input units to 0 with a frequency of `rate` at each step during training time. Used to prevent overfitting. Spec: **`D<rate>`**
- Example: `Do50`
- _Creates a Dropout layer with a dropout rate of 0.5 (`D`/100)_
+Spec: **`D<rate>`**
+Regularization layer that randomly sets input units to 0 with a frequency of `rate` at each step during training time. Used to prevent overfitting.
+_Creates a Dropout layer with a dropout rate of 0.5 (`D`/100)_

---