Axel Vogt

5821 Reputation

20 Badges

20 years, 221 days
Munich, Bavaria, Germany

MaplePrimes Activity


These are replies submitted by Axel Vogt

After reading the dsolve approach I think one can speed it up as follows (I have not carried it out in detail). Your chi^2 needs only 1 integral and several summations. For that, sort the data by the input values z (I do not know how to do this in Maple for all 3 lists at once while preserving the relative positions); some values occur more than once, like 0.450. For the first value z[1] use some exact and maybe expensive method. For the others use Int(f(zz),zz=0..z[i+1]) = Int(f(zz),zz=0..z[i]) + Int(f(zz),zz=z[i]..z[i+1]) and then use some Gauss-Legendre rule of low degree for the latter. This makes sense as up to z = 0.2 you have close data points, then there is a 'gap' up to 0.4, and at the end, well ... the observed values are a bit jumpy - but the overall spacing is small.

To check that I did a _rough_ estimate for the parameters and used beta = 0.359626349, lambda = 589.9229257, sigma = 2.012847862, P = 0.051979125, B = 0.0001, M = 34.48092251 to look at the integrand of DL over that range and fit it by a low-degree polynomial. Already with deg = 2 that looks quite close and for deg = 4 even very fine (ok, to save time I did it with Excel, so better check it). A cross check, fitting against the integral itself, would suggest a degree of 6 for that.

I interpret this as: using a Gauss-Legendre method (= finite sums) of degree 4 for the integrals in between should give you a quite precise value for chi^2. That will be fast as it only needs about 200 - 300 function evaluations, for which one can use evalhf, plus one 'actual' integration. I am not sure how to combine the GOT (Global Optimization Toolbox) with it, as in this case the objective function is numeric (and the 'simple' versions with Maple's Optimization seem not to like this), which has to be coded.
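The incremental idea above - compute the integral up to z[1] once, then extend it interval by interval with a fixed low-degree Gauss-Legendre rule - can be sketched as follows. This is a Python illustration (not Maple, and not the poster's code); the test integrand is an arbitrary example, and in Maple one would wrap the evaluations in evalhf as suggested.

```python
import numpy as np

def gauss_on_interval(f, a, b, deg=4):
    """Integrate f over [a, b] with a fixed Gauss-Legendre rule of the given degree."""
    nodes, weights = np.polynomial.legendre.leggauss(deg)
    # map the standard nodes from [-1, 1] onto [a, b]
    x = 0.5 * (b - a) * nodes + 0.5 * (b + a)
    return 0.5 * (b - a) * np.dot(weights, f(x))

def cumulative_integrals(f, z, deg=4):
    """Return Int(f, 0..z[i]) for every data point z[i], sorted ascending,
    reusing Int(0..z[i+1]) = Int(0..z[i]) + Int(z[i]..z[i+1])."""
    z = np.sort(np.asarray(z, dtype=float))
    total = gauss_on_interval(f, 0.0, z[0], deg)  # the one 'expensive' first integral
    vals = [total]
    for a, b in zip(z[:-1], z[1:]):
        # each further data point costs only deg extra function evaluations
        total += gauss_on_interval(f, a, b, deg)
        vals.append(total)
    return z, np.array(vals)
```

With 54 data points and deg = 4 this needs on the order of a few hundred function evaluations per chi^2 evaluation, which matches the estimate in the text; duplicate z values (like the 0.450 mentioned above) simply contribute zero-length intervals.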
I think he wants to have his comments translated as well ... a lazy way would be to use a print command or a dummy function and to replace it later in the C source.
It also works fine on Win with IE or Mozilla (no downloading problems for any files here).
Thx for the hint; I do not mind the content of the sidebars. It was just to see more of the postings, as I am used to from other forums. But maybe that is a matter of my personal taste.
Alejandro, hm ... I do not have the GlobalOptimization Toolbox (= GOT?) and never worked with it, so I can only guess what the syntax means. Could you describe what you want to have for those 54 data points? To me it looks like M is some regularization, the s are weights, and you want to maximize/minimize a squared sum (log likelihood?). What is your original problem? To find beta, lambda, sigma, P, B such that exp(m/r) ~ DL(z)? What does the function v mean? Maybe you have some plots and typical solutions (or some values close to what is expected). What I see is quite expensive, as for each evaluation of chin 54 integrals have to be computed. And if that GOT is going to look for gradients numerically it may let you wait a long time :-)
The posted code is almost a bare-bones one and thus may be faster (maybe a Gauss method would be sufficient). But it is up to you to check whether it is appropriate - a library like NAG is expected to do that for you. If you are sure enough to get rid of the 'overhead' ... Personally I would look for a stable/reliable solution first and care about tuning later. I will look at your sheet later this evening.
If your work should be done in Maple then you can force it to use the NAG library for fast and _reliable_ external integration, like evalf(Int(f, x=a..b, method = _d01ajc)); see the help on evalf/int. Does that already help? I have not checked your integral (as something about the parameters should be known), but it simplifies by changing variables to Int(1/(xi*(xi^alpha*(1+sigma*xi^lambda)^a+b*xi)^(1/2)), xi=1..x) using 1+zz = xi and then x = 1+z, 1-beta = alpha, P = a, B = b. But I have not looked up tables to see whether more is known; that algebraic function looks quite general ...
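The substitution above is just a shift of the integration variable (dzz = dxi), so it can be checked numerically. Below is a small Python sketch of that check; the parameter values are purely hypothetical (invented for the test, not taken from the actual problem), and a plain composite Simpson rule stands in for Maple's NAG-based evalf/Int.

```python
import math

# Hypothetical parameter values, chosen only to exercise the substitution check.
beta, sigma, lam, P, B = 0.36, 2.0, 1.5, 0.05, 1e-4
alpha, a, b = 1.0 - beta, P, B  # the renaming used in the text

def f_original(zz):
    # integrand as a function of zz, before the substitution 1 + zz = xi
    u = 1.0 + zz
    return 1.0 / (u * math.sqrt(u**(1.0 - beta) * (1.0 + sigma * u**lam)**P + B * u))

def f_transformed(xi):
    # integrand after substituting 1 + zz = xi (so dzz = dxi)
    return 1.0 / (xi * math.sqrt(xi**alpha * (1.0 + sigma * xi**lam)**a + b * xi))

def simpson(f, lo, hi, n=2000):
    # composite Simpson rule; n must be even
    h = (hi - lo) / n
    s = f(lo) + f(hi)
    s += 4.0 * sum(f(lo + (2 * k - 1) * h) for k in range(1, n // 2 + 1))
    s += 2.0 * sum(f(lo + 2 * k * h) for k in range(1, n // 2))
    return s * h / 3.0

z = 0.45
I_zz = simpson(f_original, 0.0, z)           # Int(f(zz), zz = 0..z)
I_xi = simpson(f_transformed, 1.0, 1.0 + z)  # Int over xi = 1..x with x = 1+z
```

Since both forms sample the same function at shifted nodes, the two quadrature values agree to rounding error, confirming the change of variables.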
Thx - the new workaround works for me!
Thx; for me (Win ME, Maple 10) those suggestions do not work. The directory is Users (with an additional S) and the ini file there does not have those lines. If I add them then nothing happens. Maybe that is due to my somewhat old operating system?