MaplePrimes Posts

MaplePrimes Posts are for sharing your experiences, techniques and opinions about Maple, MapleSim and related products, as well as general interests in math and computing.

Latest Posts
    While reviewing code the other day, I came across the following snippet (here converted to a procedure).

    Ds := proc(V::set, n::posint, t)
    local i, v;
        # return the set of i-th derivatives of each name v in V,
        # for orders i = 1..n, evaluated at the point t
        {seq(seq((D@@i)(v)(t), i = 1 .. n), v in V)};
    end proc:

    The purpose of this is to generate, for a set of unassigned names, the set of their derivatives evaluated at a point. For example

     Ds({x,y},2,0);
                   ...

    It always makes me happy to see people using Maple for interesting things.  So I was pleased to see this blog post on Technology Review about this paper on arXiv on quantum randomness.  In this case, they are just comparing random numbers generated from lasers (this is why physicists get better press than mathematicians: LASERS!) with pseudo-random numbers generated using the Mersenne Twister implemented in Maple, pseudo-random numbers generated using a Cellular Automata method implemented in another computer algebra system, and then binary digits of π treated as a pseudo-random sequence.  (Spoiler: the lasers win.)
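    For reference, Maple's Mersenne Twister is exposed directly through the RandomTools package, so producing a comparable pseudo-random sequence takes only a couple of lines (the sample sizes below are arbitrary choices of mine):

    with(RandomTools[MersenneTwister]):
    floats := [seq(GenerateFloat(), i = 1 .. 5)];                  # uniform floats in [0, 1)
    bits := [seq(GenerateInteger(range = 0 .. 1), i = 1 .. 25)];   # a raw bit stream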

    While not a particularly interesting use of computer algebra systems, it did inspire me to revisit my old blog post on pseudo-random numbers in Maple. I am now working on a follow-up that discusses some of the mathematical and statistical tests used to assess the quality of pseudo-random number sequences, which I hope to post soon.
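    As a taste of what such a test looks like, here is a minimal sketch of a chi-square frequency test applied to Maple's own generator (the digit range and sample size are arbitrary choices of mine):

    with(Statistics):
    r := rand(0 .. 9):                                    # decimal-digit generator
    sample := [seq(r(), i = 1 .. 10000)]:
    observed := [seq(numboccur(sample, d), d = 0 .. 9)]:  # count each digit
    expected := [seq(1000, d = 0 .. 9)]:                  # uniform expectation
    ChiSquareGoodnessOfFitTest(observed, expected);       # should not reject uniformity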

    This site needs better rendering of 2D math.

    Here is a sample from the Wikipedia page for Maplesoft, from the PDEs example section. The image on Wikipedia is quite nice (its alt tags contain LaTeX code, which may be a hint).
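    Maple can itself emit LaTeX for an expression, which is presumably how such images get produced; for instance (the PDE below is just an illustrative choice of mine):

    # generate LaTeX source for a wave-equation-style PDE
    latex( diff(u(x, t), t, t) = c^2 * diff(u(x, t), x, x) );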

    This week, I had the pleasure of attending a rock concert with my son Eric who is now about to turn 15 and who has turned out to possess non-trivial interests and talents in music. The concert was by the band Rush who, to the uninitiated, would be yet another big, loud, over-produced rock band. But to a generation of technocrats (e.g. yours truly) educated from the late 1970s on, they are the band of choice due to an intriguing mix of musicianship, technological...

    There is a problem in the Optimization package's NLPSolve routine (for nonlinear programming problems) which affects multivariate problems given in so-called operator form, that is, when the objective and constraints are not provided as expressions. Below is a workaround for this.

    Usually, an objective (or constraint) gets processed by Optimization using automatic differentiation tools ...
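    For readers unfamiliar with the distinction, here is a minimal sketch of the two input forms; the objective is a made-up example of mine, and the exact calling sequences are documented on the Optimization input-forms help page:

    # expression form: the objective is an expression in named variables
    Optimization:-NLPSolve((x - 1)^2 + (y - 2)^2, x = -5 .. 5, y = -5 .. 5);

    # operator form: the objective is a procedure and the bounds are bare
    # ranges; this is the case affected by the problem described above
    Optimization:-NLPSolve((x, y) -> (x - 1)^2 + (y - 2)^2, -5 .. 5, -5 .. 5);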

    The welcome post for the new Version 2 of MaplePrimes was published on May 27, 2010.

    On June 17, 2010, Will wrote, "We will likely re-visit the number of up-votes required for the various badges. No one has earned any of the up-vote related badges yet, so it may be important that we change the numbers required."

    If I were working for Maplesoft and I wanted to make sure that
    Maple became the best mathematics software in the world,
    the first thing I would look at would be my competitors.

    I had a look at Mathematica's website, and their marketing outlines
    some very cool functionality that speaks to my heart directly:
    1) integrated data sources, 2) real-time data analysis, etc.


    I don't really like Mathematica, because of its clumsy programming language.

    An interesting discussion of rule-based integration, and of a new symbolic computer algebra package for it named Rubi, is ongoing in this long Usenet thread.

    (The Rubi link above seems to work OK, although the link in the top article of that Usenet thread may not. YMMV.)

    There's also some interesting subtext related to how practical developments in computer algebra systems can come about.

    The greatest benefits from bringing Maple into the classroom are realized when the static pedagogy of a printed textbook is enlivened by the interplay of symbolic, graphic, and numeric calculations made possible by technology.  It is not enough merely to compute or check answers with Maple.  To stop after noting that indeed, Maple can compute the correct answer is not a pedagogical breakthrough.

    ...

    It has been a while since my last post, mostly because of a combination of getting Maple 14 ready to ship and a lack of meaty topics to write about. I am trying to get back into the habit of posting more regularly. You can help me achieve my goal by posting questions about parallel programming; I'll do my best to answer. For now, however, I'll give a brief overview of the new parallel programming features in Maple 14.

    A new function has been added to the Task Programming Model. The Threads:-Task:-Return function allows a parallel algorithm implemented on top of the Task Programming Model to perform an early bail-out. Let's imagine that you have implemented a parallel search. You are looking for a particular element in a large set of data. Using the Task Programming Model, you've created a bunch of tasks, each searching a particular subset of the data. However, one of the first tasks to execute finds the element you are looking for. In Maple 13, there was no built-in way of telling the other tasks that the result had been found and that they should not execute. In Maple 14, the Return function allows one task to specify a return value (which will be returned from Threads:-Task:-Start) and signal the other tasks that the algorithm is complete and that additional tasks should not be executed. Tasks that are already running will still run to completion, but tasks that have not started executing will not be started.
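    Here is a minimal sketch of that search scenario (the procedure name, slice-size cutoff, and data layout are my own invention, not a shipping example):

    # search an Array A for the value x between indices lo and hi;
    # the first task to find it sets the result via Threads:-Task:-Return
    search := proc(A, x, lo::posint, hi::posint)
    local i, mid;
        if hi - lo < 1000 then
            # base case: scan this slice sequentially
            for i from lo to hi do
                if A[i] = x then
                    # found: this value becomes the result of Task:-Start,
                    # and tasks not yet started will never run
                    Threads:-Task:-Return(i);
                end if;
            end do;
            NULL;
        else
            # split the slice into two child tasks; the continuation
            # procedure just discards the children's (NULL) results
            mid := iquo(lo + hi, 2);
            Threads:-Task:-Continue(proc() NULL end proc,
                Task = [search, A, x, lo, mid],
                Task = [search, A, x, mid + 1, hi]);
        end if;
    end proc:

    # result := Threads:-Task:-Start(search, A, x, 1, N);  # N = number of elements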

    You may have noticed that there is a race condition with Return. What happens if two tasks both call Return in parallel? Only one of the values will become the value that is returned from Threads:-Task:-Start. I suppose I could say the "first" value is the one that is used, but really, what does that mean? If you call Return, then the value passed to Return should be an acceptable result for the algorithm.  If you call Return more than once, any of those values should be valid, so it shouldn't matter which one becomes the return value.  That said, the Return function does give some feedback. In the task that succeeds in having its value accepted, Return will return true. In all other tasks that call Return, it will return false. This allows the code to know whether a particular result was accepted.
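    In code, that feedback looks something like this (a sketch based on the behaviour described above):

    if Threads:-Task:-Return(result) then
        # this task's value was accepted as the overall result
    else
        # another task's Return won the race; its value will be used
    end if;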

    Maple 14 also adds the Task Programming Model to the C external calling API. This means that you can write your algorithms in C and make use of the Task Programming Model. The C API is similar to the Maple API, with a few differences. In particular, you need to create each child task individually, instead of in a single call to Continue, as you would in Maple. As well, because it is C code, you need to worry about a few details, like memory management, that are handled automatically in Maple.  Using external calling is fairly advanced, so I won't go into too much detail here.  If you'd like to see more details of using the Task Programming Model in external calling, I can write a separate post dedicated to that.

    As with every release of Maple, we spent some time trying to make our existing functionality faster and more stable. For parallel programming, we reduced the overhead of using the Task Programming Model, as well as reducing the locking in the kernel (which should help improve parallelism). Of course many bugs have been fixed, which should make parallel programming more reliable in Maple 14.

    Since the FIFA World Cup final is approaching quickly, I have created this animated Netherlands flag.  It is for my Dutch acquaintances to cheer for their favourite team during the game.

    with(plots):
    # eight frames; the `if` shifts the sine phase band-by-band as j varies
    p := [ seq(
        plot( [ seq((1/4)*sin(x + `if`(j > i, 1, 0)) + 1 + 2*i, i = 0 .. 3) ],
              x = 0 .. 2*Pi, y = 0 .. 8,
              color = [white, blue, white, red], filled = true, axes = none ),
        j = [0, 1, 2, 3, 4, 3, 2, 1] ) ]:
    display(p, insequence = true);   # play the frames as an animation

    Some images have reverted to some sort of equation-form layout.  As an example, look here: http://www.mapleprimes.com/questions/87786-Smoothing-Data-Points-

    What has happened?

    Just for fun, I'm reviving the Maple soccer ball in anticipation of the FIFA final. You can make a simple animation by adding the option viewpoint=[circleleft] to the display command.

    with(plots):
    geom3d[point](o, 0, 0, 0):               # centre of the solid
    geom3d[TruncatedIcosahedron](p, o, 1):   # unit-radius truncated icosahedron
    V := evalf(geom3d[faces](p)):
    # 12 black pentagons and 20 white hexagons, 32 faces in all
    display(seq(polygonplot3d(V[i], color = `if`(nops(V[i]) = 5, black, white)), i = 1 .. 32), scaling = constrained);

    I suppose this topic has come across many people's minds at one time or another, and I've only ever taken this number for granted, with a grain of salt.  I have become curious about how high other Maple users run their memory usage.

    For myself, I find I am usually running below 10 MB or so.  I suppose this is average, for me, but I haven't usually created large worksheets; I don't think I've ever let it run over 40 MB. ...
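    For anyone who wants to check their own session, the kernel reports its memory statistics directly (the same quantities shown in the status bar):

    kernelopts(bytesalloc);   # bytes of storage currently allocated by the kernel
    kernelopts(bytesused);    # total bytes requested since the session started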

    Here is yet another finesse (new to me) in getting better performance for some floating-point computation involving the Statistics package.

    > restart:
    > X := Statistics:-RandomVariable('Normal'(0, 1)):
    > st := time():
      seq(Statistics:-Quantile(X, 1/i, numeric), i = 2 .. 10000):
      time() - st; [%%][-1];
                               6.786
                           -3.719016485

    > restart:
    > X := Statistics:-Distribution(Normal(0, 1)):