On May 23, 2005, at 3:28 PM, Abe Stephens wrote:
> I thought that the idea of having a Real typedef was that it could be a global switch to change computation from double precision to single precision, etc.
Yes, this was the plan.
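A minimal sketch of the intended switch (the flag name here is illustrative, not the project's actual configure option):

    // Global precision switch: flip one flag to rebuild the whole
    // system in single precision.
    #ifdef USE_SINGLE_PRECISION
    typedef float Real;
    #else
    typedef double Real;
    #endif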
> This seems to work in most parts of the code, but there is a lot of code, especially computation involving scalars and vectors, where it doesn't (i.e., the code only compiles if Real == double). Is this the desired behavior?
It is not the desired behavior, but the typedef Real is relatively new, so it hasn't been propagated everywhere.
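For reference, a hedged example of the kind of breakage Abe describes; the function here is hypothetical, but the mechanism is a common one: a bare floating-point literal is a double, so template deduction or overload resolution fails once Real becomes float.

    #include <algorithm>

    typedef float Real;  // flipping the switch to single precision

    Real clampUp(Real x) {
        // Fails to compile when Real == float: std::max deduces
        // conflicting template arguments (float from x, double
        // from the 1.0 literal).
        return std::max(x, 1.0);
    }

    // The Real-agnostic fix is to keep literals in Real:
    Real clampUpFixed(Real x) {
        return std::max(x, Real(1.0));
    }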
Steve