A web application where you upload hand-drawn (pictorial) representations of major HTML components, which are transformed into a single-page website via deep learning.
- Single-page website with a minimal but professional frontend.
- Pictorial Representation : paper drawings of the major HTML components (nav, aside, main, and footer) serve as the dataset. (You are free to create a dataset with components beyond those mentioned, as per your imagination, but the basic goal of the project must remain intact.)
- Deep Learning Model : a trained DL model that recognizes the drawn components.
- HTML Part : modular HTML components with minimal CSS which are independent and can be mixed with each other to create a web page.
The output website is assembled from these HTML and CSS components; a minimal sketch of the assembly step follows.
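A minimal sketch of how the modular components could be stitched into a single page, assuming the model emits component labels in the order it detects them. The snippets and the `build_page` helper are hypothetical placeholders, not the project's actual markup:

```python
# Hypothetical mapping from predicted component labels to HTML snippets.
COMPONENTS = {
    "nav":    "<nav><a href='#'>Home</a><a href='#'>About</a></nav>",
    "aside":  "<aside><p>Sidebar content</p></aside>",
    "main":   "<main><h1>Welcome</h1><p>Main content</p></main>",
    "footer": "<footer><p>Footer content</p></footer>",
}

def build_page(predicted_labels):
    # Keep only recognized components, in the order the model detected them.
    body = "\n".join(COMPONENTS[label] for label in predicted_labels
                     if label in COMPONENTS)
    return f"<!DOCTYPE html>\n<html>\n<body>\n{body}\n</body>\n</html>"

print(build_page(["nav", "aside", "main", "footer"]))
```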
- TensorFlow : TensorFlow is a free and open-source software library for machine learning. It can be used across a range of tasks but has a particular focus on training and inference of deep neural networks.
- TensorFlow (Keras) : Keras is a deep learning API written in Python, running on top of the machine learning platform TensorFlow. It was developed with a focus on enabling fast experimentation, since being able to go from idea to result as fast as possible is key to doing good research. Keras is simple, but not simplistic.
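As a minimal sketch of how the two fit together, Keras is exposed inside TensorFlow as `tf.keras` (the layer sizes here are arbitrary placeholders, not this project's model):

```python
import tensorflow as tf

# A tiny Keras model built on the TensorFlow backend; sizes are illustrative.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.summary()  # prints the layer stack and parameter counts
```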
- ELU : The Exponential Linear Unit (ELU) is an activation function for neural networks. In contrast to ReLUs, ELUs have negative values, which allow them to push mean unit activations closer to zero, like batch normalization but with lower computational complexity.
- ReLU : The rectified linear activation function (ReLU) has been shown to lead to very high-performance networks. It takes a single number as input, returning 0 if the input is negative and the input itself otherwise.
- Sigmoid : The sigmoid function is the mathematical logistic function. It is commonly used in statistics, audio signal processing, and biochemistry, and as the activation function in artificial neurons. Its formula is F(x) = 1/(1 + e^(-x)).
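The three activations above are easy to check numerically. A minimal sketch, assuming the standard definitions (alpha = 1.0 for ELU):

```python
import numpy as np

def relu(x):
    # 0 for negative inputs, the input itself otherwise
    return np.maximum(0.0, x)

def elu(x, alpha=1.0):
    # identity for x > 0; alpha * (e^x - 1) for x <= 0, giving negative outputs
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def sigmoid(x):
    # F(x) = 1 / (1 + e^(-x))
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))     # [0.  0.  0.  0.5 2. ]
print(elu(x))      # negatives pushed toward -alpha, positives unchanged
print(sigmoid(x))  # values squashed into (0, 1)
```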
- For the deep learning part we implemented a CNN (Convolutional Neural Network), a class of deep neural network used in computer vision for analyzing visual imagery; a sketch of such a model follows.
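A minimal Keras sketch of a CNN classifier for the drawn components. The input size (128x128 grayscale), the layer sizes, and the four-class output (nav, aside, main, footer) are assumptions for illustration, not the exact architecture used in this project:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 4  # nav, aside, main, footer

model = models.Sequential([
    layers.Input(shape=(128, 128, 1)),             # assumed grayscale sketches
    layers.Conv2D(32, (3, 3), activation="elu"),   # ELU, as described above
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="elu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),  # one probability per class
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```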
We created a dataset of 100+ hand-drawn images covering the full range of variation, i.e., both complete and incomplete components.
(Figure: sample hand-drawn images of complete components and incomplete components.)
The model's validation loss (val_loss) is 1.3; the sketch below shows how such a value is measured.
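A minimal sketch, continuing the CNN above, of how Keras reports val_loss by holding out part of the data for validation. The arrays here are random placeholders and the hyperparameters are illustrative:

```python
import numpy as np

# Placeholder data matching the assumed input shape; `model` is the compiled
# CNN from the sketch above.
x = np.random.rand(100, 128, 128, 1).astype("float32")  # fake sketch images
y = np.random.randint(0, 4, size=(100,))                # fake component labels

history = model.fit(x, y, validation_split=0.2, epochs=5)  # hold out 20%
print("final val_loss:", history.history["val_loss"][-1])
```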
(Figure: results on the complete dataset and on the incomplete dataset.)
DATASET | ORDER OF RESULT |
---|---|
*(hand-drawn input image)* | Navbar - Sidebar - Footer - Main |