Replies: 1 comment
-
Also, one could similarly consider automatic casting of vectors to matrices. While nice in principle, I don't see this getting as much use as the main suggestion, since I expect most functions with a matrix parameter will require a square matrix. But it might be useful for the MatrixNormal class, which I suspect most people use as a vector normal distribution anyway.
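A minimal sketch of what that vector-to-matrix cast could look like, using stand-in types rather than the actual Math.NET classes (in C#, the operator would have to be declared on the vector or matrix type itself):

```csharp
// Stand-in types, not the real Math.NET Vector<T>/Matrix<T>.
public class Mat
{
    public double[,] Values { get; }
    public Mat(double[,] values) => Values = values;
}

public class Vec
{
    public double[] Values { get; }
    public Vec(double[] values) => Values = values;

    // A length-n vector implicitly becomes an n-by-1 column matrix,
    // so it could be passed wherever a matrix parameter is expected
    // (e.g. using MatrixNormal as a plain vector normal distribution).
    public static implicit operator Mat(Vec v)
    {
        var m = new double[v.Values.Length, 1];
        for (int i = 0; i < v.Values.Length; i++)
            m[i, 0] = v.Values[i];
        return new Mat(m);
    }
}
```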
-
When using Math.NET, I often find myself having to convert a scalar into a vector or matrix to match a particular function signature. The typical example would be a class that implements a probability distribution, which can be either univariate or multivariate.
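For illustration, a hypothetical class of this kind might look like the following. Only the `Vector<double>`/`Matrix<double>` types and the `Build.Dense` calls are (as far as I know) the actual Math.NET API; the distribution class itself is made up:

```csharp
using MathNet.Numerics.LinearAlgebra;

// Hypothetical multivariate distribution class (not an actual Math.NET type),
// just to illustrate the kind of signature I mean.
public class GaussianDistribution
{
    public Vector<double> Mean { get; }
    public Matrix<double> Covariance { get; }

    public GaussianDistribution(Vector<double> mean, Matrix<double> covariance)
    {
        Mean = mean;
        Covariance = covariance;
    }
}

public static class Usage
{
    public static void Main()
    {
        // Univariate case today: the scalars have to be wrapped by hand.
        var d = new GaussianDistribution(
            Vector<double>.Build.Dense(1, 0.5),      // mean 0.5 as a one-element vector
            Matrix<double>.Build.Dense(1, 1, 2.0));  // variance 2.0 as a 1-by-1 matrix

        System.Console.WriteLine(d.Mean);
    }
}
```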
It would be nice to be able to pass the double directly in this case, and have it be automatically converted to a one-element vector. Same for matrix arguments, where scalar input could be automatically converted to a 1-by-1 matrix. Other solutions, such as overloading functions or making separate classes for the univariate cases, are cumbersome and lead to a lot of boilerplate.
I'm by no means a C# expert, but should this not be possible to implement with user-defined implicit conversion operators?
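If I understand the language rules correctly, such an operator has to be declared inside either the source or the target type, so it would need to live in the Vector/Matrix classes themselves rather than in user code. A minimal sketch of the idea, again with stand-in types instead of the real Math.NET classes:

```csharp
// Stand-in types, not the real Math.NET Vector<T>/Matrix<T>;
// the operators would have to be added to the library types themselves.
public class Vec
{
    public double[] Values { get; }
    public Vec(params double[] values) => Values = values;

    // A scalar implicitly becomes a one-element vector.
    public static implicit operator Vec(double scalar) => new Vec(scalar);
}

public class Mat
{
    public double[,] Values { get; }
    public Mat(double[,] values) => Values = values;

    // A scalar implicitly becomes a 1-by-1 matrix.
    public static implicit operator Mat(double scalar) => new Mat(new[,] { { scalar } });
}

public static class Demo
{
    static double Sum(Vec v)
    {
        double s = 0;
        foreach (var x in v.Values) s += x;
        return s;
    }

    public static void Main()
    {
        // The double 3.5 is converted to a one-element Vec by the compiler,
        // so the scalar can be passed directly to a vector parameter.
        System.Console.WriteLine(Sum(3.5));   // prints 3.5
    }
}
```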