 Connectivity - Maple Help

Connectivity

Python

Maple 2018 is now packaged with a Python 3.6 kernel.

The Python kernel is linked to Maple. This means you can execute Python scripts and return their results to Maple.

Everyone who installs Maple will also have access to a Python interpreter with a selection of useful Python libraries.  In Maple, with(Python); loads a small set of tools to help execute commands, inspect variables, and interact with Python.



$\mathrm{with}\left(\mathrm{Python}\right);$

 $\left[{\mathrm{EvalFunction}}{,}{\mathrm{EvalMember}}{,}{\mathrm{EvalString}}{,}{\mathrm{GetVariable}}{,}{\mathrm{ImportModule}}{,}{\mathrm{None}}{,}{\mathrm{SetVariable}}{,}{\mathrm{Start}}{,}{\mathrm{Stop}}\right]$ (1.1)

Arbitrary strings can be parsed and evaluated:

$\mathrm{EvalString}\left("1+1"\right);$

 ${2}$ (1.2)

$\mathrm{EvalString}\left("7 ^ 15"\right);$

 ${8}$ (1.3)

$\mathrm{EvalString}\left("7 ** 15"\right);$

 ${4747561509943}$ (1.4)

$\mathrm{EvalString}\left("15 // 2"\right);$

 ${7}$ (1.5)
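These results follow Python's operator semantics, which differ from Maple's: `^` is bitwise XOR, `**` is exponentiation, and `//` is floor division. The same expressions in plain Python:

```python
# Python operator semantics behind the EvalString results above
print(7 ^ 15)    # bitwise XOR: 0b0111 ^ 0b1111 = 0b1000 = 8
print(7 ** 15)   # exponentiation
print(15 // 2)   # floor (integer) division
```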

Here's a slightly longer string to parse and evaluate after importing the "statistics" package:

$\mathrm{ImportModule}\left("statistics"\right);$

$\mathrm{EvalString}\left("statistics.median_grouped([1, 3, 3, 5, 7], interval=2)"\right)$

 ${3.50000000000000}$ (1.6)
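The same call can be checked directly against Python's standard `statistics` module:

```python
import statistics

# Grouped median of [1, 3, 3, 5, 7] with class interval 2,
# matching the Maple EvalString result above
result = statistics.median_grouped([1, 3, 3, 5, 7], interval=2)
print(result)  # 3.5
```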

Evaluating as a string is easy and flexible, but you can go even further and make Python functions look like native Maple functions:

 ${\mathrm{mg}}{≔}{">"}$ (1.7)

 ${3.50000000000000}$ (1.8)

$\mathrm{ImportModule}\left("math"\right);$

 ${\mathrm{pysin}}{≔}{">"}$ (1.9)

$\mathrm{pysin}\left(1.0\right);$

 ${0.841470984807897}$ (1.10)

Module :- member notation can be used to access class methods:

 ${\mathrm{pymath}}{≔}{">"}$ (1.11)

$\mathrm{pymath}:-\mathrm{cos}\left(1.0\right);$

 ${0.540302305868140}$ (1.12)

$\mathrm{pymath}:-\mathrm{acos}\left(1.0\right);$

 ${0.}$ (1.13)

$\mathrm{pymath}:-\mathrm{sqrt}\left(2\right);$

 ${1.41421356237310}$ (1.14)
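The values returned through `pysin` and `pymath` match Python's own `math` module directly:

```python
import math

# The same calls evaluated natively in Python
print(math.sin(1.0))   # ~0.8414709848078965
print(math.cos(1.0))   # ~0.5403023058681398
print(math.acos(1.0))  # 0.0
print(math.sqrt(2))    # ~1.4142135623730951
```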

Here is a more involved example that uses a Python package for HTML parsing and manipulation.

$\mathrm{ImportModule}\left("from bs4 import BeautifulSoup"\right)$

$\mathrm{htmldoc}≔"<html><body>
<h1>Great Novels</h1>
<ul>
<li>Don Quixote</li>
<li>Crime and Punishment</li>
<li>The Count of Monte Cristo</li>
</ul>
</body></html>":$

$\mathrm{soup}≔\mathrm{EvalFunction}\left(\mathrm{BeautifulSoup},\mathrm{htmldoc},"html.parser"\right):$

$\mathrm{soup}:-\mathrm{h1}:-\mathrm{string}$

 ${"Great Novels"}$ (1.15)

 $\left[{"Don Quixote"}{,}{"Crime and Punishment"}{,}{"The Count of Monte Cristo"}\right]$ (1.16)
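BeautifulSoup is a third-party package; a rough sketch of the same extraction using only Python's standard-library `html.parser` (the HTML snippet here is an assumption reconstructed from the outputs above) could look like:

```python
from html.parser import HTMLParser

class TitleCollector(HTMLParser):
    """Collect the text of <h1> and <li> elements."""
    def __init__(self):
        super().__init__()
        self.current = None
        self.h1 = None
        self.items = []
    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "li"):
            self.current = tag
    def handle_endtag(self, tag):
        if tag == self.current:
            self.current = None
    def handle_data(self, data):
        if self.current == "h1":
            self.h1 = data
        elif self.current == "li":
            self.items.append(data)

doc = ("<html><body><h1>Great Novels</h1><ul>"
       "<li>Don Quixote</li><li>Crime and Punishment</li>"
       "<li>The Count of Monte Cristo</li></ul></body></html>")
p = TitleCollector()
p.feed(doc)
print(p.h1)     # Great Novels
print(p.items)  # the three novel titles
```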



New procedures can be defined from Maple:

$\mathrm{EvalString}\left("def mysum(a,b): return a+b",'\mathrm{output}'='\mathrm{none}'\right)$

 ${\mathrm{pysum}}{≔}{">"}$ (1.17)

$\mathrm{pysum}\left(1,1\right);$

 ${2}$ (1.18)
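Under the hood this is comparable to executing a `def` statement held in a string with Python's `exec` (a sketch of the mechanism, not Maple's actual implementation):

```python
namespace = {}
# Executing a function definition from a string returns nothing,
# but the name mysum appears in the namespace afterwards
exec("def mysum(a, b): return a + b", namespace)
print(namespace["mysum"](1, 1))  # 2
```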

Options are available to control how results are returned to Maple.  Above, 'output'='none' prevented any result from coming back.  Next, we use 'output'='python' to avoid converting the result to a Maple object and instead capture an object reference, in this case to an empty string.  This gives us access to Python string methods on the result.

$\mathrm{emptystring}≔\mathrm{EvalString}\left("''",'\mathrm{output}'='\mathrm{python}'\right)$

 ${\mathrm{emptystring}}{≔}{""}$ (1.19)

$\mathrm{emptystring}:-\mathrm{join}\left(\left["a","b","c"\right]\right)$

 ${"abc"}$ (1.20)
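This mirrors the ordinary Python string method:

```python
# str.join concatenates an iterable of strings with the (here empty) separator
print("".join(["a", "b", "c"]))  # abc
```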

The Python process runs in a separate memory space from Maple and can be stopped and restarted at any time.

$\mathrm{EvalString}\left("a=1\nb=2\nc=3"\right):$

 ${1}{,}{2}{,}{3}$ (1.21)

$\mathrm{Stop}\left(\right);$

 ${\mathrm{true}}$ (1.22)

$\mathrm{EvalString}\left("2+2"\right);$

 ${4}$ (1.23)
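Because the interpreter lives in its own process, stopping it only discards Python-side state; a fresh kernel is started transparently on the next call. The separate-process idea can be sketched with Python's own `subprocess` module (an analogy only, not Maple's actual mechanism):

```python
import subprocess
import sys

# Evaluate an expression in a separate Python process and capture the result,
# roughly analogous to Maple talking to its external Python kernel
proc = subprocess.run([sys.executable, "-c", "print(2 + 2)"],
                      capture_output=True, text=True, check=True)
print(proc.stdout.strip())  # 4
```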



For more, see Python.

Deep Learning with TensorFlow™

Maple 2018 includes a new package, DeepLearning, which offers a limited API to the TensorFlow toolset for machine learning using neural networks.

Example: Fitting a Curve

Here we perform least-squares regression to fit a Fourier series to a set of sample data.

$\mathrm{restart}:$

The general formula for a Fourier series with $N$ terms and coefficients $c,\mathrm{a__1},...,\mathrm{a__N},\mathrm{b__1},...,\mathrm{b__N}$ is given by:

 $\left({x}{,}{a}{,}{b}{,}{c}{,}{N}\right){↦}{c}{+}{\mathrm{add}}{}\left({{a}}_{{n}}{}{\mathrm{cos}}{}\left({n}{}{x}\right){+}{{b}}_{{n}}{}{\mathrm{sin}}{}\left({n}{}{x}\right){,}{n}{=}{1}{..}{N}\right)$ (2.1.1)
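The series in (2.1.1) translates directly into a short Python sketch (names and the sample call are illustrative assumptions):

```python
import math

def F(x, a, b, c, N):
    """Fourier series c + sum_{n=1}^{N} (a_n cos(n x) + b_n sin(n x)).
    a and b are 1-indexed conceptually, stored as 0-based lists."""
    return c + sum(a[n - 1] * math.cos(n * x) + b[n - 1] * math.sin(n * x)
                   for n in range(1, N + 1))

# With a single cosine term the series reduces to c + a_1*cos(x)
print(F(0.0, [1.0], [0.0], 0.5, 1))  # 1.5
```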

For this example we take $N=4$:

 ${4}$ (2.1.2)

We first declare a sequence of variables in the deep learning graph corresponding to each $a\left[n\right]$ and $b\left[n\right]$:

$\mathrm{with}\left(\mathrm{DeepLearning}\right):$

$\mathrm{SetEagerExecution}\left(\mathrm{false}\right)$

 ${\mathrm{true}}$ (2.1.3)

We can now define placeholders to hold the sample x- and y-values against which we want to fit $F\left(x,a,b,c,N\right)$:

 ${\mathrm{DeepLearning}}{:-}{\mathrm{Tensor}}{}\left({""}\right)$ (2.1.4)

 ${\mathrm{DeepLearning}}{:-}{\mathrm{Tensor}}{}\left({""}\right)$ (2.1.5)

We can now define a least-squares distance function which we aim to minimize:

 ${\mathrm{DeepLearning}}{:-}{\mathrm{Tensor}}{}\left({""}\right)$ (2.1.6)

 ${\mathrm{DeepLearning}}{:-}{\mathrm{Optimizer}}{}\left({">"}\right)$ (2.1.7)

 ${\mathrm{DeepLearning}}{:-}{\mathrm{Tensor}}{}\left({""}\right)$ (2.1.8)

With the structure of our deep learning graph defined, we can proceed to train it on sample data.  After initializing the session, we run 1000 training cycles using the training data.

 $\left[{0.}{,}{1.}{,}{2.}{,}{3.}{,}{4.}{,}{5.}{,}{6.}{,}{7.}\right]$ (2.1.9)

 $\left[{2.1}{,}{-1.5}{,}{-3.1}{,}{6.3}{,}{8.2}{,}{11.5}{,}{12.7}{,}{8.4}\right]$ (2.1.10)

 ${\mathrm{DeepLearning}}{:-}{\mathrm{Session}}{}\left({">"}\right)$ (2.1.11)

$\mathrm{sess}:-\mathrm{Run}\left(\mathrm{init}\right):$

We can now query the state of the trained model for the present value of the loss function:

 ${0.000535954604856670}$ (2.1.12)

As this is a small value, we have a close fit for the data.  We can then obtain the final value of the parameters from the trained model.

 $\left[\left[\begin{array}{c}2.0857386589050293\\ 8.189836502075195\\ -2.988469123840332\\ -10.794703483581543\end{array}\right]{,}\left[\begin{array}{c}-8.446731567382812\\ -8.051170349121094\\ 1.606977939605713\\ 0.8452249765396118\end{array}\right]{,}\left[\begin{array}{c}5.619427680969238\end{array}\right]\right]$ (2.1.13)

 ${A}{,}{B}{,}{C}{≔}\left[\begin{array}{c}{2.08573865890503}\\ {8.18983650207520}\\ {-2.98846912384033}\\ {-10.7947034835815}\end{array}\right]{,}\left[\begin{array}{c}{-8.44673156738281}\\ {-8.05117034912109}\\ {1.60697793960571}\\ {0.845224976539612}\end{array}\right]{,}\left[\begin{array}{c}{5.61942768096924}\end{array}\right]$ (2.1.14)
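As a check, evaluating the series at the sample points with these trained coefficients reproduces the y-data closely (a sketch; the coefficient values are copied from (2.1.14), and the small residual is consistent with the loss reported in (2.1.12)):

```python
import math

# Trained coefficients from (2.1.14)
A = [2.08573865890503, 8.18983650207520, -2.98846912384033, -10.7947034835815]
B = [-8.44673156738281, -8.05117034912109, 1.60697793960571, 0.845224976539612]
C = 5.61942768096924

def F(x):
    # Fourier series with N = 4 and the trained coefficients above
    return C + sum(A[n - 1] * math.cos(n * x) + B[n - 1] * math.sin(n * x)
                   for n in range(1, 5))

xs = [0., 1., 2., 3., 4., 5., 6., 7.]
ys = [2.1, -1.5, -3.1, 6.3, 8.2, 11.5, 12.7, 8.4]
residual = max(abs(F(x) - y) for x, y in zip(xs, ys))
print(residual)  # small worst-case error over the sample points
```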

Finally, we can visualize the result:

$\mathrm{with}\left(\mathrm{plots}\right):$

Example: Classification

Here we use a deep neural network to classify the famous Iris flower data set collected by Edgar Anderson and made famous by Ronald Fisher.  This data set includes 150 distinct observations of iris flowers, each of which consists of four empirical observations (sepal length, sepal width, petal length, and petal width) along with a classification into one of three known species (I. setosa, I. versicolor, and I. virginica).

We will repeat here the classical task for which this data set is used: attempting prediction of the species based on the four measured quantities.

Training and Test Data

We have divided the data into training and test data: the former is used to build the model, the latter to test its predictive accuracy.

 ${\mathrm{DataFrame}}{}\left({{\mathrm{_rtable}}}_{{18446744074113216990}}{,}{\mathrm{rows}}{=}\left[{1}{,}{2}{,}{3}{,}{4}{,}{5}{,}{6}{,}{7}{,}{8}{,}{9}{,}{10}{,}{11}{,}{12}{,}{13}{,}{14}{,}{15}{,}{16}{,}{17}{,}{18}{,}{19}{,}{20}{,}{21}{,}{22}{,}{23}{,}{24}{,}{25}{,}{26}{,}{27}{,}{28}{,}{29}{,}{30}\right]{,}{\mathrm{columns}}{=}\left[{\mathrm{SepalLength}}{,}{\mathrm{SepalWidth}}{,}{\mathrm{PetalLength}}{,}{\mathrm{PetalWidth}}{,}{\mathrm{Species}}\right]\right)$ (2.2.1)

We see that this data set has 150 samples (120 for training and 30 for testing) and that the Species column has three distinct species:

 ${30}{,}{120}$ (2.2.2)

 $\left\{{"setosa"}{,}{"versicolor"}{,}{"virginica"}\right\}$ (2.2.3)

To simplify things we will replace the strings designating the species classification with the numbers 0,1,2 (corresponding to setosa, versicolor, and virginica, respectively):

 ${\mathrm{DataFrame}}{}\left({{\mathrm{_rtable}}}_{{18446744074839233830}}{,}{\mathrm{rows}}{=}\left[{1}{,}{2}{,}{3}{,}{4}{,}{5}{,}{6}{,}{7}{,}{8}{,}{9}{,}{10}{,}{11}{,}{12}{,}{13}{,}{14}{,}{15}{,}{16}{,}{17}{,}{18}{,}{19}{,}{20}{,}{21}{,}{22}{,}{23}{,}{24}{,}{25}{,}{26}{,}{27}{,}{28}{,}{29}{,}{30}{,}{31}{,}{32}{,}{33}{,}{34}{,}{35}{,}{36}{,}{37}{,}{38}{,}{39}{,}{40}{,}{41}{,}{42}{,}{43}{,}{44}{,}{45}{,}{46}{,}{47}{,}{48}{,}{49}{,}{50}{,}{51}{,}{52}{,}{53}{,}{54}{,}{55}{,}{56}{,}{57}{,}{58}{,}{59}{,}{60}{,}{61}{,}{62}{,}{63}{,}{64}{,}{65}{,}{66}{,}{67}{,}{68}{,}{69}{,}{70}{,}{71}{,}{72}{,}{73}{,}{74}{,}{75}{,}{76}{,}{77}{,}{78}{,}{79}{,}{80}{,}{81}{,}{82}{,}{83}{,}{84}{,}{85}{,}{86}{,}{87}{,}{88}{,}{89}{,}{90}{,}{91}{,}{92}{,}{93}{,}{94}{,}{95}{,}{96}{,}{97}{,}{98}{,}{99}{,}{100}{,}{101}{,}{102}{,}{103}{,}{104}{,}{105}{,}{106}{,}{107}{,}{108}{,}{109}{,}{110}{,}{111}{,}{112}{,}{113}{,}{114}{,}{115}{,}{116}{,}{117}{,}{118}{,}{119}{,}{120}\right]{,}{\mathrm{columns}}{=}\left[{\mathrm{SepalLength}}{,}{\mathrm{SepalWidth}}{,}{\mathrm{PetalLength}}{,}{\mathrm{PetalWidth}}{,}{\mathrm{Species}}\right]\right)$ (2.2.4)

 ${\mathrm{DataFrame}}{}\left({{\mathrm{_rtable}}}_{{18446744074848250806}}{,}{\mathrm{rows}}{=}\left[{1}{,}{2}{,}{3}{,}{4}{,}{5}{,}{6}{,}{7}{,}{8}{,}{9}{,}{10}{,}{11}{,}{12}{,}{13}{,}{14}{,}{15}{,}{16}{,}{17}{,}{18}{,}{19}{,}{20}{,}{21}{,}{22}{,}{23}{,}{24}{,}{25}{,}{26}{,}{27}{,}{28}{,}{29}{,}{30}\right]{,}{\mathrm{columns}}{=}\left[{\mathrm{SepalLength}}{,}{\mathrm{SepalWidth}}{,}{\mathrm{PetalLength}}{,}{\mathrm{PetalWidth}}{,}{\mathrm{Species}}\right]\right)$ (2.2.5)
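The string-to-integer recoding above is the usual label-encoding step; in plain Python (the sample list is illustrative, not taken from the data set):

```python
# Encode species names as integers 0, 1, 2 (setosa, versicolor, virginica)
encoding = {"setosa": 0, "versicolor": 1, "virginica": 2}
species = ["setosa", "virginica", "versicolor", "setosa"]
print([encoding[s] for s in species])  # [0, 2, 1, 0]
```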

Training the Deep Neural Network Model

With our data prepared, we can now actually define and train the model.

$\mathrm{with}\left(\mathrm{DeepLearning}\right):$

Our first step is to define a feature for each of the four observed quantities, that is, for every column of the training data except the final one (Species), which we aim to predict:

 ${\mathrm{cols}}{≔}\left[{\mathrm{SepalLength}}{,}{\mathrm{SepalWidth}}{,}{\mathrm{PetalLength}}{,}{\mathrm{PetalWidth}}{,}{\mathrm{Species}}\right]$ (2.2.6)

 $\left[{\mathbf{module}}\left({}\right)\phantom{\rule[-0.0ex]{0.5em}{0.0ex}}{...}\phantom{\rule[-0.0ex]{0.5em}{0.0ex}}{\mathbf{end module}}{,}{\mathbf{module}}\left({}\right)\phantom{\rule[-0.0ex]{0.5em}{0.0ex}}{...}\phantom{\rule[-0.0ex]{0.5em}{0.0ex}}{\mathbf{end module}}{,}{\mathbf{module}}\left({}\right)\phantom{\rule[-0.0ex]{0.5em}{0.0ex}}{...}\phantom{\rule[-0.0ex]{0.5em}{0.0ex}}{\mathbf{end module}}{,}{\mathbf{module}}\left({}\right)\phantom{\rule[-0.0ex]{0.5em}{0.0ex}}{...}\phantom{\rule[-0.0ex]{0.5em}{0.0ex}}{\mathbf{end module}}\right]$ (2.2.7)

We can now define a deep neural network classifier with these features.  It has 3 classes because there are 3 species of iris in the dataset.

 ${\mathrm{DeepLearning}}{:-}{\mathrm{Estimator}}{}\left({">"}\right)$ (2.2.8)

We are now ready to train the model.

 ${\mathrm{DeepLearning}}{:-}{\mathrm{Tensor}}{}\left({">"}\right)$ (2.2.9)

With the model trained, we can evaluate the classifier on the test set, and we see that we have achieved 96.7% predictive accuracy:

 $\left[\begin{array}{cc}{"accuracy"}& {0.966666638851166}\\ {"average_loss"}& {0.316836178302765}\\ {"loss"}& {9.50508499145508}\\ {"global_step"}& {2000}\end{array}\right]$ (2.2.10)

We can now build a predictor function that takes an arbitrary set of measurements as a DataSeries and returns a prediction.

Using a Trained Model

With this we can take an arbitrary new point, and generate a prediction from the trained model:

 ${\mathrm{DataSeries}}{}\left(\left[\begin{array}{cccc}5.8& 5.8& 5.8& 5.8\end{array}\right]{,}{\mathrm{labels}}{=}\left[{\mathrm{SepalLength}}{,}{\mathrm{SepalWidth}}{,}{\mathrm{PetalLength}}{,}{\mathrm{PetalWidth}}\right]{,}{\mathrm{datatype}}{=}{\mathrm{anything}}\right)$ (2.2.11)

$\mathrm{predictor}\left(\mathrm{ds}\right)$

 ${\mathrm{DataSeries}}{}\left(\left[\begin{array}{cccc}\left[\begin{array}{c}-8.940464973449707\\ 0.861007034778595\\ 4.639074802398682\end{array}\right]& \left[\begin{array}{c}-8.940464973449707\\ 0.861007034778595\\ 4.639074802398682\end{array}\right]& \left[\begin{array}{c}-8.940464973449707\\ 0.861007034778595\\ 4.639074802398682\end{array}\right]& \left[\begin{array}{c}-8.940464973449707\\ 0.861007034778595\\ 4.639074802398682\end{array}\right]\end{array}\right]{,}{\mathrm{labels}}{=}\left[{"logits"}{,}{"probabilities"}{,}{"class_ids"}{,}{"classes"}\right]{,}{\mathrm{datatype}}{=}{\mathrm{anything}}\right)$ (2.2.12)

The probabilities field in the above result records the estimated probabilities for each class.
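The probabilities are obtained from the logits by a softmax; a quick standard-library check (the logit values are copied from the output above):

```python
import math

# Logits for the three classes, from (2.2.12)
logits = [-8.940464973449707, 0.861007034778595, 4.639074802398682]
exps = [math.exp(v) for v in logits]
probs = [e / sum(exps) for e in exps]
best = max(range(3), key=probs.__getitem__)
print(best, probs[best])  # class 2 dominates
```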

In this case, the model predicts with high probability that this particular sample is class 2, and therefore I. virginica.

FileTools

The CanonicalPath command resolves any relative directories and symbolic links present in a given file path to construct a canonical version of this path.

 > $\mathrm{with}\left(\mathrm{FileTools}\right):$
 > $\mathrm{CanonicalPath}\left("C:\\Users\\MapleUser\\.."\right)$
 ${"C:\Users"}$ (3.1)
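Python's standard library offers a comparable operation in os.path.normpath, which collapses `..` segments (unlike CanonicalPath it does not resolve symbolic links; os.path.realpath does). A POSIX-style sketch with a hypothetical path:

```python
import os.path

# Collapse the ".." segment, analogous to FileTools:-CanonicalPath
print(os.path.normpath("/home/MapleUser/.."))  # /home
```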

The IsLink command tests whether a given file path corresponds to a symbolic link.
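The Python counterpart is os.path.islink; a self-contained sketch that creates a symbolic link in a temporary directory and tests both the link and its target:

```python
import os
import tempfile

with tempfile.TemporaryDirectory() as d:
    target = os.path.join(d, "target.txt")
    link = os.path.join(d, "link.txt")
    open(target, "w").close()          # an empty regular file
    os.symlink(target, link)           # a symbolic link pointing at it
    results = (os.path.islink(link), os.path.islink(target))
    print(results)  # (True, False)
```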

 > $\mathrm{IsLink}\left("/usr/local/bin/perl"\right)$
 ${\mathrm{true}}$ (3.2)

XMLTools

ToRecord formats an XML tree as a nested record.

 > $\mathrm{with}\left(\mathrm{XMLTools}\right):$

This example shows how repeated elements are put into a list, and how the order in which they occurred can be deduced from the _order export.
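A rough Python analogue using the standard xml.etree module shows the same repeated-element structure (the XML snippet is reconstructed from the outputs below):

```python
import xml.etree.ElementTree as ET

root = ET.fromstring("<doc><a>1</a><b>2</b><a>3</a></doc>")
print([child.tag for child in root])        # document order of child tags
print([e.text for e in root.findall("a")])  # both repeated <a> elements
print(root.find("b").text)                  # the single <b> element
```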

 > $\mathrm{xml}≔\mathrm{ParseString}\left("<doc><a>1</a><b>2</b><a>3</a></doc>"\right):$
 > $r≔\mathrm{ToRecord}\left(\mathrm{xml}\right):$
 > $\mathrm{Print}\left(r\right)$
     1     2     3
 > $r:-\mathrm{doc}:-a\left[1\right]$
 ${"1"}$ (4.1)
 > $r:-\mathrm{doc}:-b$
 ${"2"}$ (4.2)
 > $r:-\mathrm{doc}:-a\left[2\right]$
 ${"3"}$ (4.3)
 > $r:-\mathrm{doc}:-\mathrm{_order}$
 $\left[{"a"}{,}{"b"}{,}{"a"}\right]$ (4.4)

Additional Updates

The new MapleTA:-QTI command converts IMS Question & Test Interoperability (QTI) files into a Maple T.A./Möbius course module. OpenMaple enhances Java connectivity with several new commands, including toBigDecimal, toBigInteger, Relation, evalBoolean, lhs, and set.