I thought the idea of having a Real typedef was that it could serve as a global switch to change computation from double precision to single precision, etc.
This seems to work in most parts of the code; however, there is a lot of code, especially computation involving scalars and vectors, where this isn't the case (i.e., the code only compiles if Real == double). Is this the desired behavior?
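
For example (a made-up snippet, not taken from the actual code base -- the function name and literal are just for illustration), mixing a Real value with a plain double literal is the kind of thing that only compiles when Real == double:

    #include <algorithm>
    #include <cstdio>

    typedef float Real;   // the global precision switch; flip back to double to compare

    Real clamp_positive(Real x) {
        // return std::max(x, 0.5);     // only compiles when Real == double:
        //                              // 0.5 is a double literal, so std::max
        //                              // cannot deduce one T from (float, double)
        return std::max(x, Real(0.5));  // portable: spell the literal in Real
    }

    int main() {
        std::printf("%f\n", (double)clamp_positive(Real(0.25)));
        return 0;
    }

So every scalar literal or double-typed temporary that gets mixed with Real values would have to be written as Real(...) for the single-precision build to go through.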