
Add more loss functions like cross-entropy... #9

Open
TarunTomar122 opened this issue Oct 1, 2020 · 22 comments · May be fixed by #211
Assignees
Labels
easy, good first issue (Good for newcomers)

Comments

@TarunTomar122
Member

TarunTomar122 commented Oct 1, 2020

Right now we have implemented only Mean Squared Error and Logarithmic Error as loss functions in our module. As we implement more and more machine learning algorithms, we will need more loss functions. In this issue you can add one or more loss functions, such as cross-entropy.

Resources:

https://keras.io/api/losses/ (covers most of the well-known loss functions)

https://machinelearningmastery.com/how-to-choose-loss-functions-when-training-deep-learning-neural-networks/
https://heartbeat.fritz.ai/5-regression-loss-functions-all-machine-learners-should-know-4fb140e9d4b0
https://www.analyticsvidhya.com/blog/2019/08/detailed-guide-7-loss-functions-machine-learning-python-code/
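Since cross-entropy is the headline request, here is a minimal NumPy sketch of categorical cross-entropy. The function name and signature are illustrative only, not the module's actual API:

```python
import numpy as np

def categorical_cross_entropy(y_true, y_pred, eps=1e-12):
    """Mean categorical cross-entropy: -sum(y_true * log(y_pred)) per sample."""
    y_pred = np.clip(y_pred, eps, 1.0)  # avoid log(0)
    return float(-np.mean(np.sum(y_true * np.log(y_pred), axis=-1)))

# one-hot targets, softmax-style predicted probabilities
y_true = np.array([[1.0, 0.0], [0.0, 1.0]])
y_pred = np.array([[0.9, 0.1], [0.2, 0.8]])
loss = categorical_cross_entropy(y_true, y_pred)
```

Clipping the predictions away from zero before taking the log is the usual guard against numerical overflow.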

@quadri-haider-ali

I would love to work on this part.

@rohansingh9001
Collaborator

@quadri-haider-ali Sure go ahead. Issue assigned to you.

@SaiSrichandra
Contributor

I would also like to work on this issue

@TarunTomar122
Member Author

@SaiSrichandra You can work on this too. Please tell us beforehand which loss function you are going to add.

@SaiSrichandra
Contributor

I will add the KL Divergence loss
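For reference, the KL divergence between two discrete distributions can be sketched with NumPy as below. The name and signature are illustrative, not the module's final API:

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """D_KL(P || Q) = sum over outcomes of p * log(p / q)."""
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return float(np.sum(p * np.log(p / q), axis=-1).mean())

p = np.array([0.5, 0.5])
q = np.array([0.9, 0.1])
```

Note that KL divergence is asymmetric and is zero only when the two distributions match.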

@TarunTomar122
Member Author

Okay @SaiSrichandra, do keep us updated.

@parva-jain
Contributor

Hey, I would like to add mean squared logarithmic error (MSLE) and root mean squared logarithmic error (RMSLE).
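A minimal sketch of both, assuming the usual Keras-style log1p formulation (the names are illustrative, not the module's actual API):

```python
import numpy as np

def msle(y_true, y_pred):
    """Mean squared logarithmic error: mean((log1p(y_pred) - log1p(y_true))^2)."""
    return float(np.mean((np.log1p(y_pred) - np.log1p(y_true)) ** 2))

def rmsle(y_true, y_pred):
    """Root mean squared logarithmic error: sqrt of MSLE."""
    return float(np.sqrt(msle(y_true, y_pred)))
```

Using `log1p` (i.e. log(1 + x)) keeps the loss defined at zero-valued targets and predictions.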

@rohansingh9001
Collaborator

@parva-jain sure go ahead. Keep us updated on your progress and feel free to ask anything in the Gitter channel for further clarification.

@Halix267

Halix267 commented Dec 7, 2020

Hey @rohansingh9001 I would like to add root mean squared logarithmic error (rmsle)

@parva-jain
Contributor

> Hey @rohansingh9001 I would like to add root mean squared logarithmic error (rmsle)

Actually, I'm ready with the mean squared logarithmic error loss function and its derivative equation (hand-written), and am only left with the Python implementation. Also @rohansingh9001, the equation is a bit complex, so how should I justify it?
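For what it's worth, the derivative falls out of the chain rule directly. A sketch, assuming the log1p formulation of MSLE (names are illustrative):

```python
import numpy as np

def msle(y_true, y_pred):
    """Mean squared logarithmic error over n samples."""
    return float(np.mean((np.log1p(y_pred) - np.log1p(y_true)) ** 2))

def msle_derivative(y_true, y_pred):
    """Chain rule: d(MSLE)/d(y_pred_i) = 2*(log1p(y_pred_i) - log1p(y_true_i)) / ((1 + y_pred_i) * n)."""
    n = y_true.size
    return 2.0 * (np.log1p(y_pred) - np.log1p(y_true)) / ((1.0 + y_pred) * n)
```

A finite-difference check against the loss itself is a quick way to verify a hand-derived gradient like this.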

@Vinit-source
Contributor

Hey, I would like to add the Triplet loss.
Refer to this for more info on it.
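For context, the standard margin-based triplet loss over embedding vectors can be sketched like this (the margin value and names are illustrative, not the module's API):

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    """mean(max(0, ||a - p||^2 - ||a - n||^2 + margin)) over a batch of embeddings."""
    pos_dist = np.sum((anchor - positive) ** 2, axis=-1)
    neg_dist = np.sum((anchor - negative) ** 2, axis=-1)
    return float(np.mean(np.maximum(0.0, pos_dist - neg_dist + margin)))
```

The loss is zero once the negative is at least `margin` farther (in squared distance) from the anchor than the positive is.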

@Udit-git-acc
Contributor

I would like to add cosine similarity loss function.
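A Keras-style cosine similarity loss (the negative of the mean cosine similarity, so that minimizing it aligns the vectors) could be sketched as follows; names are illustrative:

```python
import numpy as np

def cosine_similarity_loss(y_true, y_pred, eps=1e-12):
    """Negative mean cosine similarity between corresponding row vectors."""
    num = np.sum(y_true * y_pred, axis=-1)
    denom = np.linalg.norm(y_true, axis=-1) * np.linalg.norm(y_pred, axis=-1)
    return float(-np.mean(num / np.maximum(denom, eps)))
```

Perfectly aligned vectors give -1, orthogonal vectors give 0, and opposite vectors give +1.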

@kwanit1142
Collaborator

@Vinit-source and @Udit-git-acc, Okay 👍

@ssiddharth27
Contributor

I would like to add the logcosh_loss function.
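A numerically stable sketch of log-cosh, using the identity log(cosh(x)) = x + log(1 + e^(-2x)) - log(2) to avoid overflow for large errors (the name is illustrative):

```python
import numpy as np

def log_cosh_loss(y_true, y_pred):
    """Mean of log(cosh(error)); behaves like L2 near 0 and like L1 for large errors."""
    err = y_pred - y_true
    # stable form: log(cosh(x)) = x + log(1 + exp(-2x)) - log(2)
    return float(np.mean(err + np.logaddexp(0.0, -2.0 * err) - np.log(2.0)))
```

The naive `np.log(np.cosh(err))` overflows for |err| beyond roughly 700, which is why the rewritten form is preferred.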

@kwanit1142
Collaborator

@Siddharth-Singh27 sure, go on

@kwanit1142
Collaborator

kwanit1142 commented Mar 28, 2021

Completion Phase-1

- Huber Loss -> Shreya Sachan
- Cosine Similarity -> Udit Agarwal
- Mean Squared Log Loss -> Devyani Gorkar
- Log-cosh Loss -> Siddharth Singh

@dr-ghost

dr-ghost commented Oct 3, 2022

I would like to add the Binary Crossentropy loss function
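A minimal binary cross-entropy sketch for predicted probabilities, with the usual clipping guard (names are illustrative, not the module's API):

```python
import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """-mean(y*log(p) + (1-y)*log(1-p)) for probabilities p in (0, 1)."""
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return float(-np.mean(y_true * np.log(y_pred)
                          + (1.0 - y_true) * np.log(1.0 - y_pred)))
```

Clipping to (eps, 1 - eps) keeps both log terms finite when a prediction saturates at exactly 0 or 1.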

@kwanit1142
Collaborator

Sure, go ahead

@Ritik-in-Tech

I want to add the hinge loss function. May I?
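For reference, the standard hinge loss with labels in {-1, +1}, sketched with NumPy (the name is illustrative):

```python
import numpy as np

def hinge_loss(y_true, y_pred):
    """mean(max(0, 1 - y_true * y_pred)) with labels in {-1, +1}."""
    return float(np.mean(np.maximum(0.0, 1.0 - y_true * y_pred)))
```

The loss is zero only when every prediction has the correct sign with a margin of at least 1.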

@kwanit1142
Collaborator

Sure, go ahead :) Just check the files first to see whether the hinge loss is already there or not.

@yorozuya-2003
Contributor

I would like to add the Poisson Loss function.
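A sketch following the Keras convention for the Poisson loss, mean(y_pred - y_true * log(y_pred)); the name and epsilon guard are illustrative:

```python
import numpy as np

def poisson_loss(y_true, y_pred, eps=1e-12):
    """Keras-style Poisson loss: mean(y_pred - y_true * log(y_pred))."""
    return float(np.mean(y_pred - y_true * np.log(y_pred + eps)))
```

This is the negative log-likelihood of a Poisson model up to a constant, so it suits count-valued targets.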

@kwanit1142
Collaborator

Sure, go ahead :) Just check the files first to see whether the Poisson loss is already there or not.
