Hi,
Your book helps me a lot and I really appreciate it.
In Chapter 8, on LWLR: when I try the code with my own data, it raises a warning.
The warning comes from exp(diffMat*diffMat.T/(-2.0*k**2)).
If diffMat*diffMat.T is too big, the exp() result becomes vanishingly small and underflows, which triggers the warning.
Is there any way to deal with this kind of problem?
Thank you very much.
The code at https://github.com/pbharrin/machinelearninginaction/blob/master/Ch08/regression.py#L30 calculates the difference between every point passed in and the test point. If you have points that are really far away from each other, then diffMat will be large. In that case you don't want to use the point anyway, because its weight will be 0. If your data has many dimensions, you may get large values for diffMat*diffMat.T, and then you would want to change k to account for this.
So I would try this: print out diffMat or diffMat*diffMat.T before line 36 to see what data is causing this. Then I would adjust k.
In applications where you have to work with very large or very small numbers, people often use a log transform to keep them representable in a common programming language. For example, when calculating likelihoods the numbers get really small, so people work with the log-likelihood instead. That isn't the case here; you should be able to adjust k to meet your needs.
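To make the effect of k concrete, here is a minimal sketch with made-up data (the function name gaussian_weights, the array xArr, and the sample points are all illustrative, not from the book's code). It computes the same Gaussian kernel weights as lwlr(), vectorized over the rows, and shows how a too-small k makes every distant point's weight underflow to zero while a larger k keeps them non-zero:

```python
import numpy as np

# Illustrative data: the test point plus one nearby point and two far-away points.
xArr = np.array([[0.0, 0.0],
                 [30.0, 30.0],
                 [500.0, 500.0],
                 [1000.0, 0.0]])
testPoint = np.array([0.0, 0.0])

def gaussian_weights(testPoint, xArr, k):
    """Diagonal of the LWLR weight matrix for one test point."""
    # Squared distance per row: the scalar diffMat * diffMat.T in the book's loop
    sq_dists = ((xArr - testPoint) ** 2).sum(axis=1)
    return np.exp(sq_dists / (-2.0 * k ** 2))

w_small = gaussian_weights(testPoint, xArr, k=0.1)
w_large = gaussian_weights(testPoint, xArr, k=100.0)

print(w_small)  # [1. 0. 0. 0.] -- every point except the test point itself underflows
print(w_large)  # all four weights are non-zero, so distant points still contribute
```

Underflowing to a zero weight is harmless on its own (the point simply drops out of the local fit); the warning only signals trouble when k is so small that nearly every weight is zero and the weighted normal equation becomes singular.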