.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/synthetic/plot_1D_3_MAF.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        Click :ref:`here <sphx_glr_download_auto_examples_synthetic_plot_1D_3_MAF.py>`
        to download the full example code or to run this example in your browser via
        Binder.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_synthetic_plot_1D_3_MAF.py:

Unimodal distribution
=====================

.. GENERATED FROM PYTHON SOURCE LINES 8-17

The following example demonstrates the statistical-learning-based determination of
the nuclear shielding tensor parameters from a one-dimensional cross-section of a
magic-angle flipping (MAF) spectrum. In this example, we use a synthetic MAF
line shape from a unimodal tensor distribution.

Before getting started
----------------------

Import all relevant packages.

.. GENERATED FROM PYTHON SOURCE LINES 17-42

.. code-block:: default

    import csdmpy as cp
    import matplotlib.pyplot as plt
    import numpy as np

    from mrinversion.kernel.nmr import ShieldingPALineshape
    from mrinversion.linear_model import SmoothLasso, SmoothLassoCV, TSVDCompression
    from mrinversion.utils import get_polar_grids

    # Setup for the matplotlib figures


    # function for a 2D x-y plot.
    def plot2D(ax, csdm_object, title=""):
        # convert the dimension coordinates of the csdm_object from Hz to ppm.
        _ = [item.to("ppm", "nmr_frequency_ratio") for item in csdm_object.dimensions]

        levels = (np.arange(9) + 1) / 10
        ax.contourf(csdm_object, cmap="gist_ncar", levels=levels)
        ax.grid(None)
        ax.set_title(title)
        get_polar_grids(ax)
        ax.set_aspect("equal")

.. GENERATED FROM PYTHON SOURCE LINES 43-50

Dataset setup
-------------

Import the dataset
''''''''''''''''''

Load the dataset. Here, we import the dataset as a CSDM data-object.

.. GENERATED FROM PYTHON SOURCE LINES 50-58
.. code-block:: default

    # the 1D MAF cross-section data in csdm format
    filename = "https://osu.box.com/shared/static/puxfgdh25rru1q3li124anylkgup8rdp.csdf"
    data_object = cp.load(filename)

    # convert the data dimension from `Hz` to `ppm`.
    data_object.dimensions[0].to("ppm", "nmr_frequency_ratio")

.. GENERATED FROM PYTHON SOURCE LINES 59-62

The variable ``data_object`` holds the 1D MAF cross-section. For comparison, let's
also import the true tensor parameter distribution from which the synthetic 1D pure
anisotropic MAF cross-section line shape is simulated.

.. GENERATED FROM PYTHON SOURCE LINES 62-65

.. code-block:: default

    datafile = "https://osu.box.com/shared/static/s5wpm26w4cv3w64qjhouqu458ch4z0nd.csdf"
    true_data_object = cp.load(datafile)

.. GENERATED FROM PYTHON SOURCE LINES 66-68

The plot of the 1D MAF cross-section, along with the 2D true tensor parameter
distribution of the synthetic dataset, is shown below.

.. GENERATED FROM PYTHON SOURCE LINES 68-79

.. code-block:: default

    # the plot of the 1D MAF cross-section dataset.
    _, ax = plt.subplots(1, 2, figsize=(9, 3.5), subplot_kw={"projection": "csdm"})
    ax[0].plot(data_object)
    ax[0].invert_xaxis()

    # the plot of the true tensor distribution.
    plot2D(ax[1], true_data_object, title="True distribution")
    plt.tight_layout()
    plt.show()

.. rst-class:: sphx-glr-horizontal

    *

      .. image:: /auto_examples/synthetic/images/sphx_glr_plot_1D_3_MAF_001.png
          :alt: True distribution
          :class: sphx-glr-multi-img

    *

      .. image:: /auto_examples/synthetic/images/sphx_glr_plot_1D_3_MAF_002.png
          :alt: plot 1D 3 MAF
          :class: sphx-glr-multi-img

.. GENERATED FROM PYTHON SOURCE LINES 80-88

Linear Inversion setup
----------------------

Dimension setup
'''''''''''''''

**Anisotropic-dimension:** The dimension of the dataset that holds the pure
anisotropic frequency contributions, which, in this case, is the only dimension.

.. GENERATED FROM PYTHON SOURCE LINES 88-90

.. code-block:: default

    anisotropic_dimension = data_object.dimensions[0]
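The two inverse dimensions defined below are 25-point linear grids with a 370 Hz
increment. As a plain-NumPy sketch of the resulting coordinates (independent of
csdmpy, and assuming a zero coordinate offset), each inverse-grid axis therefore
spans 0 Hz to 24 × 370 Hz = 8880 Hz:

.. code-block:: python

    import numpy as np

    # a 25-point linear grid with a 370 Hz increment (zero offset assumed)
    count, increment_Hz = 25, 370.0
    coords_Hz = np.arange(count) * increment_Hz

    print(coords_Hz[0], coords_Hz[-1])  # 0.0 8880.0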
.. GENERATED FROM PYTHON SOURCE LINES 91-93

**x-y dimensions:** The two inverse dimensions corresponding to the `x` and
`y`-axis of the `x`-`y` grid.

.. GENERATED FROM PYTHON SOURCE LINES 93-98

.. code-block:: default

    inverse_dimension = [
        cp.LinearDimension(count=25, increment="370 Hz", label="x"),  # the `x`-dimension.
        cp.LinearDimension(count=25, increment="370 Hz", label="y"),  # the `y`-dimension.
    ]

.. GENERATED FROM PYTHON SOURCE LINES 99-106

Generating the kernel
'''''''''''''''''''''

For MAF datasets, the line-shape kernel corresponds to the pure nuclear shielding
anisotropy line shapes. Use the
:class:`~mrinversion.kernel.nmr.ShieldingPALineshape` class to generate a
shielding line-shape kernel.

.. GENERATED FROM PYTHON SOURCE LINES 106-116

.. code-block:: default

    lineshape = ShieldingPALineshape(
        anisotropic_dimension=anisotropic_dimension,
        inverse_dimension=inverse_dimension,
        channel="29Si",
        magnetic_flux_density="9.4 T",
        rotor_angle="90 deg",
        rotor_frequency="14 kHz",
        number_of_sidebands=4,
    )

.. GENERATED FROM PYTHON SOURCE LINES 117-131

Here, ``lineshape`` is an instance of the
:class:`~mrinversion.kernel.nmr.ShieldingPALineshape` class. The required
arguments of this class are the `anisotropic_dimension`, `inverse_dimension`, and
`channel`. We have already defined the first two arguments in the previous
sub-section. The value of the `channel` argument is the nucleus observed in the
MAF experiment. In this example, this value is '29Si'. The remaining arguments,
such as the `magnetic_flux_density`, `rotor_angle`, and `rotor_frequency`, are
set to match the conditions under which the spectrum was acquired. The value of
the `number_of_sidebands` argument is the number of sidebands calculated for each
line shape within the kernel.

Once the ShieldingPALineshape instance is created, use the
:meth:`~mrinversion.kernel.nmr.ShieldingPALineshape.kernel` method of the
instance to generate the MAF line-shape kernel.

.. GENERATED FROM PYTHON SOURCE LINES 131-133
.. code-block:: default

    K = lineshape.kernel(supersampling=1)

.. GENERATED FROM PYTHON SOURCE LINES 134-139

Data Compression
''''''''''''''''

Data compression is optional but recommended. It may reduce the size of the
inverse problem and, thus, the computation time.

.. GENERATED FROM PYTHON SOURCE LINES 139-145

.. code-block:: default

    new_system = TSVDCompression(K, data_object)
    compressed_K = new_system.compressed_K
    compressed_s = new_system.compressed_s

    print(f"truncation_index = {new_system.truncation_index}")

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    compression factor = 1.5737704918032787
    truncation_index = 61

.. GENERATED FROM PYTHON SOURCE LINES 146-160

Solving the inverse problem
---------------------------

Smooth-LASSO problem
''''''''''''''''''''

Solve the smooth-lasso problem. You may choose to skip this step and proceed
directly to the statistical learning method. Usually, the statistical learning
method is a time-consuming process that solves the smooth-lasso problem over a
range of predefined hyperparameters. If you are unsure what range of
hyperparameters to use, you can use this step for a quick look into a possible
solution by giving guess values for the :math:`\alpha` and :math:`\lambda`
hyperparameters, and then decide on the hyperparameter range accordingly.

.. GENERATED FROM PYTHON SOURCE LINES 160-166

.. code-block:: default

    # guess alpha and lambda values.
    s_lasso = SmoothLasso(alpha=5e-5, lambda1=5e-6, inverse_dimension=inverse_dimension)
    s_lasso.fit(K=compressed_K, s=compressed_s)
    f_sol = s_lasso.f

.. GENERATED FROM PYTHON SOURCE LINES 167-170

Here, ``f_sol`` is the solution corresponding to the hyperparameters
:math:`\alpha=5\times10^{-5}` and :math:`\lambda=5\times 10^{-6}`. The plot of
this solution is

.. GENERATED FROM PYTHON SOURCE LINES 170-180

.. code-block:: default

    _, ax = plt.subplots(1, 2, figsize=(9, 3.5), subplot_kw={"projection": "csdm"})

    # the plot of the guess tensor distribution solution.
    plot2D(ax[0], f_sol / f_sol.max(), title="Guess distribution")

    # the plot of the true tensor distribution.
    plot2D(ax[1], true_data_object, title="True distribution")
    plt.tight_layout()
    plt.show()

.. rst-class:: sphx-glr-horizontal

    *

      .. image:: /auto_examples/synthetic/images/sphx_glr_plot_1D_3_MAF_003.png
          :alt: Guess distribution, True distribution
          :class: sphx-glr-multi-img

    *

      .. image:: /auto_examples/synthetic/images/sphx_glr_plot_1D_3_MAF_004.png
          :alt: plot 1D 3 MAF
          :class: sphx-glr-multi-img

    *

      .. image:: /auto_examples/synthetic/images/sphx_glr_plot_1D_3_MAF_005.png
          :alt: plot 1D 3 MAF
          :class: sphx-glr-multi-img

.. GENERATED FROM PYTHON SOURCE LINES 181-185

Predicted spectrum
''''''''''''''''''

You may also evaluate the predicted spectrum from the above solution, as follows.

.. GENERATED FROM PYTHON SOURCE LINES 185-197

.. code-block:: default

    residuals = s_lasso.residuals(K, data_object)
    predicted_spectrum = data_object - residuals

    plt.figure(figsize=(4, 3))
    plt.subplot(projection="csdm")
    plt.plot(data_object, color="black", label="spectrum")  # the original spectrum
    plt.plot(predicted_spectrum, color="red", label="prediction")  # the predicted spectrum
    plt.gca().invert_xaxis()
    plt.legend()
    plt.tight_layout()
    plt.show()

.. image:: /auto_examples/synthetic/images/sphx_glr_plot_1D_3_MAF_006.png
    :alt: plot 1D 3 MAF
    :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 198-212

As you can see from the predicted spectrum, our guess isn't far from the optimum
hyperparameters. Let's create a search grid about the guess hyperparameters and
run a cross-validation method for hyperparameter selection.

Statistical learning of the tensors
-----------------------------------

Smooth LASSO cross-validation
'''''''''''''''''''''''''''''

Create a guess range of values for the :math:`\alpha` and :math:`\lambda`
hyperparameters. The following code generates a range of :math:`\lambda` and
:math:`\alpha` values that are uniformly sampled on the log scale.
.. GENERATED FROM PYTHON SOURCE LINES 212-226

.. code-block:: default

    lambdas = 10 ** (-5.2 - 1 * (np.arange(6) / 5))
    alphas = 10 ** (-4 - 2 * (np.arange(6) / 5))

    # set up the cross-validation smooth-lasso method
    s_lasso_cv = SmoothLassoCV(
        alphas=alphas,
        lambdas=lambdas,
        inverse_dimension=inverse_dimension,
        sigma=0.005,
        folds=10,
    )

    # run the fit using the compressed kernel and compressed signal.
    s_lasso_cv.fit(compressed_K, compressed_s)

.. GENERATED FROM PYTHON SOURCE LINES 227-233

The optimum hyper-parameters
''''''''''''''''''''''''''''

Use the :attr:`~mrinversion.linear_model.SmoothLassoCV.hyperparameters` attribute
of the instance for the optimum hyper-parameters, :math:`\alpha` and
:math:`\lambda`, determined from the cross-validation.

.. GENERATED FROM PYTHON SOURCE LINES 233-235

.. code-block:: default

    print(s_lasso_cv.hyperparameters)

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    {'alpha': 6.30957344480193e-06, 'lambda': 1.584893192461114e-06}

.. GENERATED FROM PYTHON SOURCE LINES 236-243

The cross-validation surface
''''''''''''''''''''''''''''

Optionally, you may want to visualize the cross-validation error curve/surface.
Use the :attr:`~mrinversion.linear_model.SmoothLassoCV.cross_validation_curve`
attribute of the instance, as follows. The cross-validation metric is the mean
squared error (MSE).

.. GENERATED FROM PYTHON SOURCE LINES 243-258

.. code-block:: default

    cv_curve = s_lasso_cv.cross_validation_curve

    # plot of the cross-validation curve
    plt.figure(figsize=(5, 3.5))
    ax = plt.subplot(projection="csdm")
    ax.contour(np.log10(s_lasso_cv.cross_validation_curve), levels=25)
    ax.scatter(
        -np.log10(s_lasso_cv.hyperparameters["alpha"]),
        -np.log10(s_lasso_cv.hyperparameters["lambda"]),
        marker="x",
        color="k",
    )
    plt.tight_layout(pad=0.5)
    plt.show()

.. image:: /auto_examples/synthetic/images/sphx_glr_plot_1D_3_MAF_007.png
    :alt: plot 1D 3 MAF
    :class: sphx-glr-single-img
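The hyperparameter grids above are uniform on a log scale, spanning
:math:`10^{-5.2}` to :math:`10^{-6.2}` in :math:`\lambda` and :math:`10^{-4}` to
:math:`10^{-6}` in :math:`\alpha`. As a standalone sanity check in plain NumPy
(independent of mrinversion), the optimum reported above falls on interior grid
points of both ranges, so the search window did not clip the minimum:

.. code-block:: python

    import numpy as np

    # the same log-uniform grids passed to SmoothLassoCV above
    lambdas = 10 ** (-5.2 - 1 * (np.arange(6) / 5))
    alphas = 10 ** (-4 - 2 * (np.arange(6) / 5))

    # the optimum reported by the cross-validation run above
    optimum = {"alpha": 6.30957344480193e-06, "lambda": 1.584893192461114e-06}

    # the optimum lies strictly inside both grid ranges ...
    assert lambdas.min() < optimum["lambda"] < lambdas.max()
    assert alphas.min() < optimum["alpha"] < alphas.max()

    # ... and coincides with grid points: alpha = 10**-5.2, lambda = 10**-5.8
    assert np.isclose(alphas, optimum["alpha"]).any()
    assert np.isclose(lambdas, optimum["lambda"]).any()

Had the optimum landed on a grid edge, the sensible next step would be to extend
the search range in that direction and re-run the cross-validation.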
.. GENERATED FROM PYTHON SOURCE LINES 259-264

The optimum solution
''''''''''''''''''''

The :attr:`~mrinversion.linear_model.SmoothLassoCV.f` attribute of the instance
holds the solution.

.. GENERATED FROM PYTHON SOURCE LINES 264-266

.. code-block:: default

    f_sol = s_lasso_cv.f

.. GENERATED FROM PYTHON SOURCE LINES 267-269

The corresponding plot of the solution, along with the true tensor distribution,
is shown below.

.. GENERATED FROM PYTHON SOURCE LINES 269-278

.. code-block:: default

    _, ax = plt.subplots(1, 2, figsize=(9, 3.5), subplot_kw={"projection": "csdm"})

    # the plot of the tensor distribution solution.
    plot2D(ax[0], f_sol / f_sol.max(), title="Optimum distribution")

    # the plot of the true tensor distribution.
    plot2D(ax[1], true_data_object, title="True distribution")
    plt.tight_layout()
    plt.show()

.. rst-class:: sphx-glr-horizontal

    *

      .. image:: /auto_examples/synthetic/images/sphx_glr_plot_1D_3_MAF_008.png
          :alt: Optimum distribution, True distribution
          :class: sphx-glr-multi-img

    *

      .. image:: /auto_examples/synthetic/images/sphx_glr_plot_1D_3_MAF_009.png
          :alt: plot 1D 3 MAF
          :class: sphx-glr-multi-img

    *

      .. image:: /auto_examples/synthetic/images/sphx_glr_plot_1D_3_MAF_010.png
          :alt: plot 1D 3 MAF
          :class: sphx-glr-multi-img

.. rst-class:: sphx-glr-timing

   **Total running time of the script:** ( 0 minutes 19.771 seconds)

.. _sphx_glr_download_auto_examples_synthetic_plot_1D_3_MAF.py:

.. only:: html

  .. container:: sphx-glr-footer
    :class: sphx-glr-footer-example

    .. container:: binder-badge

      .. image:: images/binder_badge_logo.svg
        :target: https://mybinder.org/v2/gh/DeepanshS/mrinversion/master?urlpath=lab/tree/docs/_build/html/../../notebooks/auto_examples/synthetic/plot_1D_3_MAF.ipynb
        :alt: Launch binder
        :width: 150 px

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_1D_3_MAF.py <plot_1D_3_MAF.py>`

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_1D_3_MAF.ipynb <plot_1D_3_MAF.ipynb>`
.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_