After some discussions on the Julia Discourse regarding Float64 vs Float32, some adjustments should be made to the code to avoid performance pitfalls:

- Move all code back to Float64. Apparently, unless one is running on GPUs, Float32 brings no benefit on CPUs.
- Use Int exponents (e.g. in the flow law). See the explanations in the Discourse thread.
- Avoid memory allocations in the utility functions, following the example in the iceflow_sandbox model.
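The first two points can be sketched as follows; the function name, signature, and constants are illustrative assumptions, not the repository's actual code:

```julia
# Illustrative sketch only: `flow_law` and its arguments are assumed names.
# Glen's exponent as an Int literal: `H^n` with an integer exponent lowers to
# fast repeated multiplication, whereas `H^3.0` calls the generic Float64
# power routine, which is considerably slower.
const n = 3

# Everything stays in Float64: on CPUs there is no speedup from Float32,
# and mixing precisions risks type instabilities and silent promotions.
function flow_law(H::Float64, ∇S::Float64, A::Float64)
    return 2.0 * A / (n + 2) * H^(n + 2) * abs(∇S)^(n - 1)
end
```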
Partially implemented in SIA!. There are still some operations remaining for which the pre-initialized sizes of the matrices need to be adjusted. This is not urgent for the scientific applications, so it will be updated later on.
Optimizing the SIA in a forward model could be a good task for Lucille during her internship.
More optimizations have been implemented. There are still quite a lot of memory allocations, though. Removing almost all of them would result in relatively unreadable code, and it would imply passing even more pre-initialized context matrices to SIA!().
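A minimal sketch of the preallocated-context pattern being discussed, with all names and the placeholder physics as assumptions for illustration (the model's real SIA!() differs):

```julia
# Scratch matrices are allocated once in a context and then mutated in
# place, so the inner time-stepping loop allocates nothing.
make_context(nx, ny) = (D = zeros(nx, ny), dHdt = zeros(nx, ny))

# `step!` is a hypothetical stand-in for an in-place update like SIA!().
function step!(context, H)
    D, dHdt = context.D, context.dHdt
    @. D = H^3          # in-place broadcast: writes into the preallocated D
    @. dHdt = D - H     # placeholder physics; no temporary arrays created
    return nothing
end
```

The trade-off mentioned above is visible even here: every scratch array the function touches has to be threaded through the context argument, which is what makes a fully allocation-free version harder to read.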