roman_pearce

Mr. Roman Pearce

1678 Reputation

19 Badges

20 years, 215 days
CECM/SFU
Research Associate
Abbotsford, British Columbia, Canada

I am a research associate at Simon Fraser University and a member of the Computer Algebra Group at the CECM.

MaplePrimes Activity


These are replies submitted by roman_pearce

The Groebner[Basis] command will actually replace sqrt(3) by a new variable Z and insert the equation Z^2-3 to compute the Groebner basis. The difference is that, to guarantee a correct result in general, Z and any other new variables must be made less important than all of the original variables. For example, if your variables were {x,y}, then any term involving x or y to any power would be larger than a term involving only Z. This forces Z to be eliminated from the original system, which is required to get the correct Groebner basis.

When you replace sqrt(3) by a new variable yourself and just compute a Groebner basis of the new system, all the variables are mixed together and the problem can sometimes be easier. For RootFinding:-Isolate this is fine, because you're going to solve for everything anyway. We should probably extend RootFinding:-Isolate to do this automatically.

So, to answer your question: when you replace sqrt(3) by a variable and compute a Groebner basis, what you get may not be a Groebner basis of the original system. So there may be no way to reproduce the effect in Groebner[Basis] when sqrt(3) is present in the input.
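The substitution trick can be sketched outside Maple as well. Here is a small illustration in Python using sympy (not Maple); the example system and variable names are invented for the sketch. Putting Z last in a lex order makes it least important, mirroring what Groebner[Basis] does internally:

```python
# Illustration (sympy, not Maple) of replacing sqrt(3) by a new
# variable Z and adjoining the defining relation Z^2 - 3.
# The polynomial system below is a made-up example.
from sympy import symbols, groebner

x, y, Z = symbols('x y Z')

# Original system with sqrt(3) written as Z, plus Z^2 - 3.
F = [x**2 - Z*y, y**2 - Z, Z**2 - 3]

# Lex order with Z listed last makes Z less important than x and y,
# so the result is a valid Groebner basis over Q(sqrt(3)).
G = groebner(F, x, y, Z, order='lex')
print(G.exprs)
```

Listing Z first instead would mix the variables together, which is the cheaper-but-not-generally-correct computation described above.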
You can disable a lot of the arithmetic and normalization and use a heuristic test for zero to speed things up:
Normalizer := proc(a) a end proc:  # skip normalization entirely
TestZero := testeq:                # probabilistic zero test instead of full simplification
A := LinearAlgebra:-MatrixInverse(M):
Granted, the result is probably now 10 times more useless.
10 minutes on a 3 GHz computer is O(10^12) cycles, so a computation that is O(10^9) operations or less should terminate within that window, even with a large constant factor. Most problems are not of that size, so unless you know otherwise you are probably just wasting time. It's a fairly safe rule of thumb, and it's conveniently about the same length as a coffee break.
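As a back-of-the-envelope check, the arithmetic behind this rule of thumb can be written out (plain Python, nothing Maple-specific):

```python
# Rough cycle budget for the 10-minute rule of thumb.
clock_hz = 3e9           # 3 GHz processor
seconds = 10 * 60        # 10 minutes
budget = clock_hz * seconds
print(f"cycle budget: {budget:.1e}")   # about 1.8e12 cycles

# A computation needing ~1e9 operations fits with room to spare,
# even allowing for hundreds of cycles per "operation".
ops = 1e9
print(f"headroom factor: {budget / ops:.0f}x")
```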
What do they have in common? Rapid introduction of features, some of which are half-baked, and an unhealthy obsession with backwards compatibility - with previously introduced, half-baked features. The truth is, anyone can add code. For older projects it is often better to remove code. Pare it down to the essential features and then optimize. Build common facilities for things and update the old code to use them. Get rid of code. That is a goal.

I don't want to sound like I'm slagging Maplesoft here. They do a lot of work in this direction, for example, processing options (keyword=value) in the kernel. The fact that these issues exist is simply due to the age of Maple; it's not really anybody's "fault". However, I would encourage Maplesoft to be more aggressive about scrapping old code. If the new code doesn't work properly (e.g. LinearAlgebra vs. linalg for a variety of issues) then fix the new code so you can scrap the old code. If you don't, this is the inevitable result.

Also, Vista was not designed from scratch. They tried to do that and failed because the task was too big. Vista was built off of Windows 2003.
In general, I used to like Maple 10-12 years ago. But starting from Maple 6 it has been degrading and degrading. Is there anything that is not broken by now? New packages are broken, integration is broken (as usual), simplify is broken, solve is broken, ArrayTools is broken, evalf is broken, and even some integer procedures are broken!

The number of routines in Maple has increased substantially, and surely the system is much more powerful than Maple 6. There is no excuse for bugs; however, we must accept a non-zero probability of introducing bugs whenever anything is developed. It can be very difficult to do anything. It was much more fun for everyone when the system was smaller and expectations were lower.

The real question is, how can we fight this miserable trend? I think that a clean internal design is a big win. Maple needs to separate (as much as possible) different parts of the system. That's obvious, but unwinding commands like solve or simplify is a herculean task. Secondly, cruft has to be removed from the system. There is no point in maintaining backwards compatibility with stupid quirks or junk that nobody uses. Get rid of it. This can be very painful to do, but the alternative is that developers go insane. Look at Windows Vista, which should be a cautionary tale for Maplesoft. Note: Windows Vista did not sell. Maple Vista won't fare any better.
The examples shown in the help pages are not computed on your machine (this would be highly undesirable for reading help pages), so what you're seeing is the time an example took on one of Maplesoft's computers. Also, the time reported by the time command can vary somewhat, depending on Maple's internal state. If your computation triggers a garbage collection, then that is counted against the time. If you have a lot of memory allocated, then garbage collection may run slower. Mathematica is the same way; I don't know about Matlab.
It should be safe to remove examples and examplesclassic (70 MB). I'm not sure you can delete the java and jre directories, even if you're content not to use the Java interface. The best place to save space is the lib directory. The .mla files are Maple code, so it would be best not to delete those. The .hdb files are help files, and there are a lot of them. For example:
- SDictionary (62 MB)
- StdWdExamples (50 MB)
- StdWsTask/Tour/Applications/Manuals (72 MB)
I don't know how well Maple functions without these files, but it should work. The lib/classic directory has another dictionary file (55 MB).

I wish Maplesoft would implement automatic compression. It's not rocket science: gzip -1 (fastest) compresses these files by a factor of 5. You implement it once, and then everyone's worksheets are smaller and Maple's footprint is much smaller. It's not hard, just use zlib.
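The compression claim is easy to sanity-check with zlib at its fastest level; here is a small Python sketch (the sample data is made up, and real .hdb content would compress by a different factor):

```python
# Sanity check of the "gzip -1 compresses these files well" claim,
# using zlib at level 1 (the same algorithm and level as gzip -1).
# The data below is an invented stand-in for text-heavy help files.
import zlib

data = b"<entry>Maple help text with much repeated markup</entry>\n" * 5000

compressed = zlib.compress(data, 1)   # level 1 = fastest
ratio = len(data) / len(compressed)
print(f"original: {len(data)} bytes, "
      f"compressed: {len(compressed)} bytes, ratio: {ratio:.1f}x")
```

Highly repetitive markup like this compresses far better than typical prose, so the printed ratio overstates what mixed content would achieve, but the point stands: level 1 is cheap and the savings are large.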
That functionality would be welcome :) If possible, it would be nice to make it really fast, like 10 cycles, so that developers can poll in a tight loop. The hack uses 700 cycles, which means I can only afford to poll every once in a while.
1. Do you have a similar suggestion, if calling an external DLL (which does not know of Maple)?

Operating systems install an interrupt handler, so you could essentially do the same thing. I have no idea how to do that; you would need to read up on operating systems.

2. If I want to interrupt Maple execution for conventional code within a classical sheet on Win there is often no reaction even after long waiting (so I shut down the task)

I have no ideas. Control-C in Linux and Mac terminals has always worked for me.
I am driven to conclude that Maplesoft’s strategy should be to concentrate on the educational market, which as Jacques implies is mature.

Name one company that has targeted the educational market and survived :) Seriously, if Maple can't solve real-world problems then it's only a matter of time before it's not used for teaching either. The value of educational software is approximately $0. It's nice gravy, but people expect the meat.

On the other hand, if you can solve a few large industrial problems, people will never stop buying your product. Old customers may not upgrade, but you can remedy that by solving new problems or solving old ones faster. People will pay for that! Some poor engineer has the choice of writing his own stuff, or trying to cobble together something out of other people's stuff; or he can buy your product, type "solve", and boom! get the answer. That is worth money. You just saved that guy two weeks. You saved his company a paycheque. And once they realize they have a magic box that gives answers to large problems quickly, you can rest assured that they will never run out of problems to try. Nor will they stop paying you money, as long as you keep improving the box.

I think the market for computer algebra is nowhere near what it could be. Maple should be the essential magic box on every engineer's desk. MATLAB is there for a good reason. It can solve a 10M x 10M sparse linear system. It can interpolate millions of points. You can analyze gigabytes of data with it. Maple could potentially do all of this: with hardware floats, in arbitrary precision, and with exact arithmetic. But it has to be fast. The fact is, people hate numerical analysis. They would gladly try the product if it meant they could set Digits := 50; and run the thing again, or solve it exactly and not think at all.
It's not just me; the group at Western is working on solving polynomial systems. That's a very hard problem that nobody really does well, and it takes a long time. I agree that it is much easier to define requirements for "computer algebra" development than for "symbolic computation". Of course, it does not help that int, simplify, limit, etc. are black holes that can swallow up infinite amounts of time. In some cases the problems are not well defined, which makes them research problems in themselves.

I can think of two things that are really holding back Maple development. The first is the test suite. There are thousands of tests, many of them lumped together, and all kinds of things blow up because top-level commands call each other expecting a certain type of result. Maple commands need a hierarchy, and the tests need to be cleaned out and organized. This is quite possibly the most thankless job in the world, but until a small team of people does it we are all going to suffer. It is just not worth it to make any changes to int, simplify, limit, solve, etc. under the current regime. Set ordering could even make this problem worse.

I think the second thing holding back Maple is performance. Many of the algorithms and data structures are too slow. That makes it hard to improve on the previous code, because you are greatly limited in what you can handle. If Maple were 100 or 1000 times faster, you could design more ambitious algorithms for "symbolic computation". As it is now, someone has done something or put something in the kernel, and you can't really beat it with a proper algorithm because Maple code is slow. There may be no good way to do a computation in Maple code using Maple data structures. The solution is to provide fast primitives: find out what people need and provide it, and make all of the basic operations as fast as possible. This will make it possible for people to implement new algorithms. Or at least, that is the idea :)