vs140580

490 Reputation

8 Badges

5 years, 16 days

MaplePrimes Activity


These are replies submitted by vs140580

@tomleslie 

 table([(25, 50) = ["C-H", 1.08652243], (33, 34) = ["C-O", 1.36654528], (23, 48) = ["C-H", 1.08717708], (26, 30) = ["C-C", 1.40591714], (18, 23) = ["C-C", 1.39157249], (11, 16) = ["C-C", 1.39650421], (7, 36) = ["C-H", 1.08977154], (13, 18) = ["C-C", 1.39940202], (3, 6) = ["C-C", 1.47386499], (7, 11) = ["C-C", 1.47502068], (5, 6) = ["N-C", 1.39733282], (17, 23) = ["C-C", 1.39224064], (29, 31) = ["C-C", 1.40271915], (20, 46) = ["C-H", 1.08555101], (29, 52) = ["C-H", 1.08794255], (20, 25) = ["C-C", 1.38934841], (14, 20) = ["C-C", 1.40179385], (32, 55) = ["C-H", 1.08686798], (4, 8) = ["N-C", 1.39794063], (12, 17) = ["C-C", 1.39699284], (16, 42) = ["C-H", 1.08722951], (15, 41) = ["C-H", 1.08699816], (5, 9) = ["N-C", 1.45578467], (15, 21) = ["C-C", 1.41498728], (22, 47) = ["C-H", 1.08651829], (34, 35) = ["O-C", 1.42310435], (1, 2) = ["S-C", 1.78150330], (8, 13) = ["C-C", 1.40057453], (21, 27) = ["C-C", 1.40557782], (16, 22) = ["C-C", 1.39429839], (18, 44) = ["C-H", 1.08793888], (14, 19) = ["C-C", 1.40117130], (1, 3) = ["S-C", 1.72362728], (22, 27) = ["C-C", 1.39684144], (2, 5) = ["C-N", 1.37795936], (9, 14) = ["C-C", 1.52456453], (9, 37) = ["C-H", 1.09752950], (21, 26) = ["C-C", 1.48468482], (32, 33) = ["C-C", 1.39465444], (30, 53) = ["C-H", 1.08791452], (8, 12) = ["C-C", 1.39195618], (26, 29) = ["C-C", 1.40628233], (12, 39) = ["C-H", 1.08344312], (35, 58) = ["C-H", 1.09528489], (27, 51) = ["C-H", 1.08956367], (6, 10) = ["C-O", 1.22372628], (30, 32) = ["C-C", 1.40199679], (9, 38) = ["C-H", 1.08364570], (31, 54) = ["C-H", 1.08455521], (25, 28) = ["C-N", 1.34880132], (19, 45) = ["C-H", 1.08340943], (24, 28) = ["C-N", 1.34753442], (17, 43) = ["C-H", 1.08637148], (24, 49) = ["C-H", 1.08749437], (35, 57) = ["C-H", 1.09439070], (2, 4) = ["C-N", 1.29482084], (19, 24) = ["C-C", 1.39010000], (3, 7) = ["C-C", 1.34033764], (11, 15) = ["C-C", 1.40238725], (31, 33) = ["C-C", 1.39138564], (13, 40) = ["C-H", 1.08708279], (35, 56) = ["C-H", 1.09419651]])

All correct, but when I execute the code for this table, the graph that is drawn does not show the edge weights. Can you help? I don't know why.

@Christian Wolinski How do I store the output in a new list? Which list is it stored in? Please help with a sample code.

@mmcdara 

I have uploaded an Excel sheet in the other thread; can you help run the code with that sheet?

For that thread I was getting an error. Please help verify the code with the analysis Excel sheet in the thread below:

https://www.mapleprimes.com/questions/235524-Give-A-Data-For-Regression-In-Excel?reply=reply

@mmcdara 

 

This is a reply to the reminder, sir. Please help with the below.

 

I kindly request your help. I have posted replies to two of your threads mainly:

the train/test code thread and the PLS/PCR thread, as my Excel sheet analysis does not run with those codes. Please help.

Kindly help to do PLS and PCR on the data, say:

analysis.xlsx

@mmcdara PLS+PCR_Linnerund.mw — I am not able to download this; I don't know the reason. It says a "Page not found" error.

@Carl Love 

Sir, kindly help with a link to your KNN answer thread.

Please help.

@tomleslie 

The question was edited towards the end to generate a list of undirected graphs.

Maybe I need to use a for loop to convert all the lists in the super list to sets of sets and add them as graphs to a new list.

This should work; a short code example would help.
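The for-loop idea can be sketched in Python (the sample data and variable names here are hypothetical; in Maple the analogous loop would build sets of edges to pass to GraphTheory):

```python
# Hypothetical "super list" of edge lists; a pair may appear in
# either orientation, e.g. (1, 2) and (2, 1).
super_list = [
    [(1, 2), (2, 1), (2, 3)],
    [(4, 5), (5, 6)],
]

# frozenset({a, b}) makes (a, b) and (b, a) the same undirected edge,
# so each inner list collapses to a set of undirected edges.
graphs = []
for edges in super_list:
    graphs.append({frozenset(e) for e in edges})
```

Each element of `graphs` is then a set of undirected edges, with duplicate orientations of the same edge merged automatically.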

@mmcdara Can you help with R code for my steps?

Run the train/test code to obtain a model with only two or three regressor variables.

If no such model is found after a few runs, it should print that more data is required.

And if everything is satisfied, produce the charts etc. as I have described.

Same algorithm.

@mmcdara The program should choose only a maximum of 2 or 3 independent variables to design the model. If the model's R^2 is not more than 90%, it should print that the model is not good and that more independent variables should be added.

Here Y is the dependent variable and x1, x2, x3, etc. are the independent variables.

In the actual data there could be more independent variables.

A_sample.xlsx
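A minimal sketch of that search in Python, assuming an exhaustive scan over all subsets of at most three columns with plain least squares (the data, threshold handling, and function name are all hypothetical, not the poster's actual code):

```python
from itertools import combinations

import numpy as np

def best_small_model(X, y, max_vars=3, r2_min=0.90):
    """Try every subset of at most `max_vars` columns of X, fit by
    least squares, and return (best_columns, best_r2).  Print a
    warning if no subset reaches the R^2 threshold."""
    n, p = X.shape
    best_cols, best_r2 = None, -np.inf
    for k in range(1, max_vars + 1):
        for cols in combinations(range(p), k):
            A = np.column_stack([np.ones(n), X[:, cols]])  # intercept + chosen columns
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            resid = y - A @ coef
            r2 = 1 - (resid @ resid) / ((y - y.mean()) ** 2).sum()
            if r2 > best_r2:
                best_cols, best_r2 = cols, r2
    if best_r2 < r2_min:
        print("model is not good: add more independent variables")
    return best_cols, best_r2

# Hypothetical data in which y depends only on columns 0 and 2.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 5))
y = 2.0 * X[:, 0] - 3.0 * X[:, 2] + rng.normal(scale=0.1, size=40)
cols, r2 = best_small_model(X, y)
```

Exhaustive enumeration is only feasible for a handful of candidate variables; with many columns, stepwise or lasso selection (as discussed later in the thread) would replace the inner loop.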

@mmcdara 

Step 1: we need to import the data from Excel.

Step 2: we split the data into train and test sets at random.

Step 3:

Here let us fix the number of regressor (independent) variables to be chosen at a time, say K.

I usually look for K = 2 or 3 in the model equation, not more.

We need to apply stepwise regression or lasso regression; that is, we reduce the residual sum of squared errors on the train data, as described, to get the best fit it gives.

Step 4: if the model also fits well in predicting the test data, then store the train and test data into an Excel sheet,

and the predicted values of the dependent variable too, based on the model equation obtained,

and give all the plots as suggested,

with the train data in one color and the test data in another color, together with the fitted model,

and stop.

Or

If it doesn't, we run again with a new random split and try to get the model.

I expect a 70% train to 30% test split.

Or

If no such model arises with two or three regressor variables even after a certain number of runs,

we print that more regressor (independent) variables need to be created to improve the model.

A sample data file:

In the actual data there could be more independent variables.

A_sample.xlsx
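The steps above can be sketched in Python, assuming plain least squares on the first K columns as a stand-in for the stepwise/lasso choice, and a fixed cap on the number of re-splits (all names and data here are hypothetical; reading the Excel sheet, writing results back, and plotting are omitted):

```python
import numpy as np

def fit_with_resplits(X, y, k=2, r2_min=0.90, max_runs=20, seed=0):
    """Repeatedly split 70/30 at random, fit least squares on the
    first k columns, and accept the first run whose test-set R^2
    reaches the threshold; otherwise report that more regressor
    variables are needed."""
    rng = np.random.default_rng(seed)
    n = len(y)
    for run in range(max_runs):
        idx = rng.permutation(n)
        cut = int(0.7 * n)                 # 70% train / 30% test split
        tr, te = idx[:cut], idx[cut:]
        A_tr = np.column_stack([np.ones(len(tr)), X[tr][:, :k]])
        A_te = np.column_stack([np.ones(len(te)), X[te][:, :k]])
        coef, *_ = np.linalg.lstsq(A_tr, y[tr], rcond=None)
        pred = A_te @ coef
        ss_res = ((y[te] - pred) ** 2).sum()
        ss_tot = ((y[te] - y[te].mean()) ** 2).sum()
        r2 = 1 - ss_res / ss_tot
        if r2 >= r2_min:
            return coef, r2                # good model found: stop
    print("need to create more regressor variables to improve model")
    return None, None

# Hypothetical synthetic data: y depends on the first two columns.
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 3))
y = 1.5 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.05, size=60)
coef, r2 = fit_with_resplits(X, y, k=2)
```

In the real workflow the fixed `[:, :k]` column choice would be replaced by whichever k-variable subset the stepwise or lasso procedure selects on the train data.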

@mmcdara I want the fitting to pick the best of the independent variables, to get a good or best model for the dependent variable.

It should not fit with all 50 variables and produce a big regression equation.

Will the above code be able to do that?

If not, how can we pick the subset of the independent variables that best fits the dependent variable from the data?

@mmcdara I want to write my matrix to a txt file in Python once I get the NumPy matrix or array output, in the format required for the above code. Kind help with Python code, if possible.
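A minimal round-trip sketch with NumPy's `savetxt`/`loadtxt` (the small matrix and the filename are placeholders for the real data):

```python
import numpy as np

# Stand-in for one of the NumPy distance matrices.
M = np.array([[0.0, 1.54544582],
              [1.54544582, 0.0]])

# Write one matrix row per line, whitespace-separated, 8 decimals.
np.savetxt("distance_matrix.txt", M, fmt="%.8f")

# Round trip: loadtxt recovers the same values from the text file.
M2 = np.loadtxt("distance_matrix.txt")
```

The exact `fmt` string only controls the printed precision; the file itself is plain text, so it can then be read by other tools as well.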

@Carl Love 

 

[[0.         1.54544582 2.62118205 1.54430276 1.53285025 3.11145877
  3.13247991 2.57045449 1.09662462 2.17342262 2.16878313 3.52843002
  2.17932583 2.16674307 2.18977151 2.2004811  2.16912137 3.37166319
  2.86734302 4.14751396 4.19987953 3.22693404 3.18442386 2.93131597
  2.77457215 3.50347723]
 [1.54544582 0.         1.54306338 2.55309911 2.54975089 2.55279004
  2.51359895 3.06696785 2.17320488 1.09810015 1.09615493 2.14707345
  2.82391877 3.49528299 3.03398745 2.70230412 3.47202471 2.76921802
  2.88532613 3.49814137 3.41037401 2.63928125 3.06642538 2.84833581
  3.32901734 4.10047077]
 [2.62118205 1.54306338 0.         3.96734923 3.14877773 1.52911407
  1.53187574 4.48313031 2.85963015 2.16849766 2.12983227 1.09726282
  4.29153513 4.77514808 3.1204953  3.27409979 4.21732595 2.1730296
  2.19343389 2.16727128 2.13955654 2.2027129  2.21612749 4.25389009
  4.51408519 5.55529558]
 [1.54430276 2.55309911 3.96734923 0.         2.50242191 4.50340749
  4.54869566 1.52230454 2.13762888 2.78872482 2.74941948 4.69759987
  1.09697454 1.09734176 3.41898787 2.96455502 2.55750665 4.5218565
  4.29381928 5.57785858 5.61430753 4.42209972 4.66859616 2.17911172
  2.1724484  2.1562555 ]
 [1.53285025 2.54975089 3.14877773 2.50242191 0.         3.75302841
  2.86860765 3.88925989 2.14512884 2.80612428 3.48868129 4.13277359
  2.74358604 2.6929382  1.08974972 1.09313603 1.09770091 4.36409157
  3.30622823 4.59390927 3.9432795  2.7402963  2.61086909 4.31282244
  4.17882028 4.65039237]
 [3.11145877 2.55279004 1.52911407 4.50340749 3.75302841 0.
  2.50206663 4.82642653 2.76510097 3.49141611 2.78332281 2.15026242
  5.12935509 5.13144026 3.42301559 4.24050211 4.69248623 1.09498381
  1.09443338 1.09551337 2.90210966 3.45841276 2.62328062 4.76706956
  4.49986095 5.91289522]
 [3.13247991 2.51359895 1.53187574 4.54869566 2.86860765 2.50206663
  0.         5.44964793 3.51615481 2.73659235 3.43595826 2.15048268
  4.70254329 5.19336656 2.58889102 2.72235819 3.93590448 3.46327128
  2.74270573 2.77619767 1.09872851 1.09329082 1.09018047 5.35627805
  5.596883   6.45811306]
 [2.57045449 3.06696785 4.48313031 1.52230454 3.88925989 4.82642653
  5.44964793 0.         2.77971526 3.41142901 2.6671573  5.01159873
  2.14720266 2.13179027 4.68562472 4.36570923 4.02711267 4.48948244
  4.78484155 5.91038526 6.43193331 5.45672621 5.69149833 1.09359571
  1.0943537  1.09509473]
 [1.09662462 2.17320488 2.85963015 2.13762888 2.14512884 2.76510097
  3.51615481 2.77971526 0.         3.07656775 2.51204162 3.82571163
  3.05508523 2.45592594 2.36709154 3.06251483 2.61622376 2.8971508
  2.27731546 3.81109022 4.50972646 3.89919949 3.35756354 3.2911009
  2.54000295 3.721604  ]
 [2.17342262 1.09810015 2.16849766 2.78872482 2.80612428 3.49141611
  2.73659235 3.41142901 3.07656775 0.         1.75215549 2.45792361
  2.61336972 3.78481674 3.47646661 2.53135927 3.67753261 3.77007403
  3.87506235 4.31397885 3.56465023 2.42632872 3.46818459 3.03503962
  3.96035619 4.30419799]
 [2.16878313 1.09615493 2.12983227 2.74941948 3.48868129 2.78332281
  3.43595826 2.6671573  2.51204162 1.75215549 0.         2.39169749
  3.12553535 3.7552195  3.98304059 3.73088171 4.29158833 2.55647964
  3.26971073 3.73701171 4.19761891 3.6508236  4.01333914 2.19837214
  2.81839321 3.71334458]
 [3.52843002 2.14707345 1.09726282 4.69759987 4.13277359 2.15026242
  2.15048268 5.01159873 3.82571163 2.45792361 2.39169749 0.
  4.90016591 5.61441444 4.17399978 4.08746537 5.20622426 2.52201038
  3.07959853 2.45302035 2.31166425 2.65334458 3.05296749 4.55314275
  5.08952498 6.06523756]
 [2.17932583 2.82391877 4.29153513 1.09697454 2.74358604 5.12935509
  4.70254329 2.14720266 3.05508523 2.61336972 3.12553535 4.90016591
  0.         1.76065628 3.80290476 2.79636067 2.78288127 5.23590115
  5.03313411 6.15894147 5.74113638 4.32219336 4.95498731 2.46115568
  3.06629728 2.51819326]
 [2.16674307 3.49528299 4.77514808 1.09734176 2.6929382  5.13144026
  5.19336656 2.13179027 2.45592594 3.78481674 3.7552195  5.61441444
  1.76065628 0.         3.52100828 3.31618563 2.30017726 5.18965584
  4.70570541 6.19321052 6.28631398 5.08500542 5.10523886 3.05379707
  2.54609826 2.39336733]
 [2.18977151 3.03398745 3.1204953  3.41898787 1.08974972 3.42301559
  2.58889102 4.68562472 2.36709154 3.47646661 3.98304059 4.17399978
  3.80290476 3.52100828 0.         1.80467991 1.72210273 4.20045499
  2.79165804 4.10149117 3.55804534 2.75767344 1.94402758 5.11708781
  4.77189066 5.51011897]
 [2.2004811  2.70230412 3.27409979 2.96455502 1.09313603 4.24050211
  2.72235819 4.36570923 3.06251483 2.53135927 3.73088171 4.08746537
  2.79636067 3.31618563 1.80467991 0.         1.74955718 4.91696583
  4.00636879 4.99128269 3.72516524 2.18486233 2.71877051 4.60184212
  4.85311722 5.08304677]
 [2.16912137 3.47202471 4.21732595 2.55750665 1.09770091 4.69248623
  3.93590448 4.02711267 2.61622376 3.67753261 4.29158833 5.20622426
  2.78288127 2.30017726 1.72210273 1.74955718 0.         5.22447236
  4.10434417 5.55536148 4.99013    3.74469462 3.55789324 4.65871848
  4.3411393  4.57902555]
 [3.37166319 2.76921802 2.1730296  4.5218565  4.36409157 1.09498381
  3.46327128 4.48948244 2.8971508  3.77007403 2.55647964 2.52201038
  5.23590115 5.18965584 4.20045499 4.91696583 5.22447236 0.
  1.7737017  1.76883018 3.87210745 4.31693122 3.67485475 4.35818008
  4.01455962 5.55474956]
 [2.86734302 2.88532613 2.19343389 4.29381928 3.30622823 1.09443338
  2.74270573 4.78484155 2.27731546 3.87506235 3.26971073 3.07959853
  5.03313411 4.70570541 2.79165804 4.00636879 4.10434417 1.7737017
  0.         1.76486725 3.3148933  3.67146781 2.4316566  4.97122166
  4.38166932 5.82254256]
 [4.14751396 3.49814137 2.16727128 5.57785858 4.59390927 1.09551337
  2.77619767 5.91038526 3.81109022 4.31397885 3.73701171 2.45302035
  6.15894147 6.19321052 4.10149117 4.99128269 5.55536148 1.76883018
  1.76486725 0.         2.75851653 3.83435237 2.86445731 5.79823051
  5.56335548 6.99895835]
 [4.19987953 3.41037401 2.13955654 5.61430753 3.9432795  2.90210966
  1.09872851 6.43193331 4.50972646 3.56465023 4.19761891 2.31166425
  5.74113638 6.28631398 3.55804534 3.72516524 4.99013    3.87210745
  3.3148933  2.75851653 0.         1.7429245  1.72378589 6.245082
  6.54202604 7.4607489 ]
 [3.22693404 2.63928125 2.2027129  4.42209972 2.7402963  3.45841276
  1.09329082 5.45672621 3.89919949 2.42632872 3.6508236  2.65334458
  4.32219336 5.08500542 2.75767344 2.18486233 3.74469462 4.31693122
  3.67146781 3.83435237 1.7429245  0.         1.8060876  5.3289484
  5.80932649 6.37865082]
 [3.18442386 3.06642538 2.21612749 4.66859616 2.61086909 2.62328062
  1.09018047 5.69149833 3.35756354 3.46818459 4.01333914 3.05296749
  4.95498731 5.10523886 1.94402758 2.71877051 3.55789324 3.67485475
  2.4316566  2.86445731 1.72378589 1.8060876  0.         5.79737769
  5.73205614 6.67521841]
 [2.93131597 2.84833581 4.25389009 2.17911172 4.31282244 4.76706956
  5.35627805 1.09359571 3.2911009  3.03503962 2.19837214 4.55314275
  2.46115568 3.05379707 5.11708781 4.60184212 4.65871848 4.35818008
  4.97122166 5.79823051 6.245082   5.3289484  5.79737769 0.
  1.77860864 1.76091466]
 [2.77457215 3.32901734 4.51408519 2.1724484  4.17882028 4.49986095
  5.596883   1.0943537  2.54000295 3.96035619 2.81839321 5.08952498
  3.06629728 2.54609826 4.77189066 4.85311722 4.3411393  4.01455962
  4.38166932 5.56335548 6.54202604 5.80932649 5.73205614 1.77860864
  0.         1.77080858]
 [3.50347723 4.10047077 5.55529558 2.1562555  4.65039237 5.91289522
  6.45811306 1.09509473 3.721604   4.30419799 3.71334458 6.06523756
  2.51819326 2.39336733 5.51011897 5.08304677 4.57902555 5.55474956
  5.82254256 6.99895835 7.4607489  6.37865082 6.67521841 1.76091466
  1.77080858 0.        ]]

 

This is a NumPy matrix (array).

Now, how do I write it to a txt file in Python, and then how do I read it from Maple?

I may have 80 such matrices; how do I store them all in one place and convert them all together in Maple into Maple Matrices?

One thing: the dimensions will change from matrix to matrix.
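One way to handle a batch of matrices with differing dimensions is one text file per matrix, so each file records its own row and column counts. A Python sketch (the filenames and the random matrices are hypothetical placeholders for the 80 real ones):

```python
import numpy as np

# Hypothetical batch of square matrices of different sizes.
rng = np.random.default_rng(0)
matrices = [rng.random((n, n)) for n in (3, 5, 4)]

# One CSV file per matrix; differing dimensions are no problem
# because each file stands alone.
for i, M in enumerate(matrices):
    np.savetxt(f"matrix_{i}.txt", M, fmt="%.8f", delimiter=",")
```

On the Maple side, something like `ImportMatrix("matrix_0.txt", source = csv)` should read one file back as a Maple Matrix, and a loop or `seq` over the 80 filenames would collect them into a list; check the `ImportMatrix` help page for the options in your Maple version.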
