Edits to DPF-Core RST and TXT files #729

Merged Jan 10, 2023 · 3 commits
20 changes: 10 additions & 10 deletions docs/source/concepts/concepts.rst
@@ -3,7 +3,7 @@
==================
Terms and concepts
==================
-DPF sees *fields of data*, not physical results. This makes DPF a
+DPF sees **fields of data**, not physical results. This makes DPF a
very versatile tool that can be used across teams, projects, and
simulations.

@@ -20,7 +20,7 @@ Here are descriptions for key DPF terms:
uses three different spatial locations for finite element data: ``Nodal``,
``Elemental``, and ``ElementalNodal``.
- **Operators:** Objects that are used to create and transform the data.
-An operator is composed of a *core* and *pins*. The core handles the
+An operator is composed of a **core** and **pins**. The core handles the
calculation, and the pins provide input data to and output data from
the operator.
- **Scoping:** Spatial and/or temporal subset of a model's support.
@@ -32,19 +32,19 @@ Here are descriptions for key DPF terms:
Scoping
-------
In most cases, you do not want to work with an entire set of data
-but rather with a subset of this data. To achieve this, you define
-a *scoping*, which is a subset of the model's support.
+but rather with a subset. To achieve this, you define
+a **scoping**, which is a subset of the model's support.
Typically, scoping can represent node IDs, element IDs, time steps,
frequencies, and joints. Scoping describes a spatial and/or temporal
subset that the field is scoped on.
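
The scoping idea can be illustrated with plain Python, independent of the DPF API. This is a conceptual sketch only; the node IDs below are invented for illustration:

```python
# Conceptual sketch of a scoping: a subset of a model's support.
# The node IDs below are invented for illustration only.
all_node_ids = list(range(1, 101))  # the model's support: nodes 1..100

# A nodal scoping selects a subset of those IDs at a given location.
nodal_scoping = {"location": "Nodal", "ids": [2, 18, 42, 84]}

# Any evaluation is then restricted to the scoped entities.
scoped = [i for i in all_node_ids if i in set(nodal_scoping["ids"])]
print(scoped)  # [2, 18, 42, 84]
```

The same shape of object, with a different ``location``, would describe an elemental or temporal scoping.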

Field data
----------
In DPF, field data is always associated with its scoping and support, making
-the *field* a self-describing piece of data. For example, in a field of nodal
-displacement, the *displacement* is the simulation data, and the associated
-*nodes* are the scoping. A field can also be defined by its dimensionality,
-unit of data, and *location*.
+the **field** a self-describing piece of data. For example, in a field of nodal
+displacement, the **displacement** is the simulation data, and the associated
+**nodes** are the scoping. A field can also be defined by its dimensionality,
+unit of data, and **location**.

Location
--------
@@ -58,7 +58,7 @@ finite element data, the location is one of three spatial locations: ``Nodal``,
is identified by an ID, which is typically an element number.
- An ``ElementalNodal`` location describes data defined on the nodes of the elements.
To retrieve an elemental node, you must use the ID for the element. To achieve
-this, you define an *elemental scoping* or *nodal scoping*.
+this, you define an elemental scoping or nodal scoping.

Concept summary
---------------
@@ -80,7 +80,7 @@ You use :ref:`ref_dpf_operators_reference` to create and transform the data. An

Workflows
---------
-You can chain operators together to create a *workflow*, which is a global entity
+You can chain operators together to create a **workflow**, which is a global entity
that you use to evaluate data produced by operators. A workflow requires inputs
to operators, which compute requested outputs.
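
Chaining operators can be sketched in plain Python (a conceptual illustration, not the DPF operator API; the ``norm`` and ``minimum`` operators here are stand-ins):

```python
# Plain-Python sketch of chaining operators into a workflow.
def norm(vectors):
    """Operator: compute the Euclidean norm of each vector."""
    return [sum(c * c for c in v) ** 0.5 for v in vectors]

def minimum(values):
    """Operator: reduce a list of values to its minimum."""
    return min(values)

# A workflow wires the output pin of one operator to the
# input pin of the next.
def workflow(data):
    return minimum(norm(data))

print(workflow([[3.0, 4.0], [6.0, 8.0]]))  # 5.0
```

In DPF the wiring is explicit (pin-to-pin connections), but the evaluation order is the same: each operator consumes the previous operator's output.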

6 changes: 1 addition & 5 deletions docs/source/concepts/index.rst
@@ -4,11 +4,7 @@
Concepts
========

-This section gives in depth descriptions and explanations of DPF concepts, including terminology.
-
-Other sections of this guide include :ref:`ref_user_guide`, :ref:`ref_api_section`,
-:ref:`ref_dpf_operators_reference`, and :ref:`gallery`.
-
+This section provides in-depth descriptions and explanations of DPF concepts, including terminology.

DPF concepts
~~~~~~~~~~~~
27 changes: 14 additions & 13 deletions docs/source/concepts/stepbystep.rst
@@ -22,7 +22,7 @@ Data can come from two sources:
defining where the result files are located.
- **Manual input in DPF:** You can create fields of data in DPF.

-Once you have specify data sources or manually create fields in PDF,
+Once you specify data sources or manually create fields in DPF,
you can create field containers (if applicable) and define scopings to
identify the subset of data that you want to evaluate.

@@ -31,7 +31,7 @@ Specify the data source
To evaluate the data in simulation result files, you specify the data source by defining
where the results files are located.

-This example shows how to define the data source:
+This code shows how to define the data source:

.. code-block:: python

@@ -42,15 +42,15 @@
['/tmp/file.rst']

Before data files can be evaluated, they must be opened. To open them, you
-define *streams*. A stream is an entity that contains the data sources.
+define **streams**. A stream is an entity that contains the data sources.
Streams keep the data files open and keep some data cached to make the next
evaluation faster. Streams are particularly convenient when using large
data files. They save time when opening and closing data files. When a stream
is released, the data files are closed.
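
The caching behavior that makes streams fast can be sketched in plain Python (illustrative only, not the DPF streams API; the ``Stream`` class and result key below are invented):

```python
# Sketch of the stream idea: keep the source "open" and cache evaluated
# results so repeated queries avoid re-reading the file.
class Stream:
    def __init__(self, source):
        self.source = source  # stands in for an open result file
        self.calls = 0        # counts actual (slow) reads
        self._cache = {}

    def evaluate(self, key):
        if key not in self._cache:
            self.calls += 1   # only the first access touches the file
            self._cache[key] = self.source[key]
        return self._cache[key]

stream = Stream({"displacement": [0.1, 0.2]})
stream.evaluate("displacement")
stream.evaluate("displacement")  # served from the cache
print(stream.calls)  # 1
```

Releasing a real stream would additionally close the underlying file, as the text above describes.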

Define fields
~~~~~~~~~~~~~
-A *field* is a container of simulation data. In numerical simulations,
+A **field** is a container of simulation data. In numerical simulations,
result data is defined by values associated with entities:

.. image:: ../images/drawings/values-entities.png
@@ -59,7 +59,7 @@ Therefore, a field of data might look something like this:

.. image:: ../images/drawings/field.png

-This example shows how to define a field from scratch:
+This code shows how to define a field from scratch:

.. code-block:: python

@@ -87,7 +87,7 @@ You specify the set of entities by defining a range of IDs:

You must define a scoping prior to its use in the data transformation workflow.

-This example shows how to define a mesh scoping:
+This code shows how to define a mesh scoping:

.. code-block:: python

@@ -105,7 +105,7 @@ This example shows how to define a mesh scoping:

Define field containers
~~~~~~~~~~~~~~~~~~~~~~~
-A *field container* holds a set of fields. It is used mainly for
+A **field container** holds a set of fields. It is used mainly for
transient, harmonic, modal, or multi-step analyses. This image
explains its structure:

@@ -123,7 +123,7 @@ You can define a field container in multiple ways:
- Create a field container from a CSV file.
- Convert existing fields to a field container.
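
The container structure described above can be sketched in plain Python (illustrative only, not the DPF ``FieldsContainer`` API; the label values and data are invented):

```python
# Sketch of a field container: a set of fields labeled, here, by time step.
fields_container = {
    ("time", 1): {"data": [0.0, 0.1], "scoping": [4, 9]},
    ("time", 2): {"data": [0.1, 0.3], "scoping": [4, 9]},
}

# Selecting by label retrieves one field of the transient result.
field_at_step_2 = fields_container[("time", 2)]
print(field_at_step_2["data"])  # [0.1, 0.3]
```

For a harmonic or modal analysis, the label would be a frequency or mode number instead of a time step.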

-This example shows how to define a field container from scratch:
+This code shows how to define a field container from scratch:

.. code-block:: python

@@ -165,7 +165,8 @@ an output that it passes to a field or field container using an output pin.
.. image:: ../images/drawings/circuit.png

Comprehensive information on operators is available in :ref:`ref_dpf_operators_reference`.
-In the **Available Operators** area, you can either type a keyword in the **Search** option
+In the **Available Operators** area for either the **Entry** or **Premium** operators,
+you can either type a keyword in the **Search** option
or browse by operator categories:

.. image:: ../images/drawings/help-operators.png
@@ -186,7 +187,7 @@ language (IronPython, CPython, and C++).

.. image:: ../images/drawings/operator-def.png

-This example shows how to define an operator from a model:
+This code shows how to define an operator from a model:

.. code-block:: python

@@ -203,15 +204,15 @@ data transformation workflow, enabling you to perform all operations necessary
to get the result that you want.

In a workflow, the output pins of one operator can be connected to the input pins
-of another operator, allowing output data from one operator to be passed as
-input to another operator.
+of another operator, allowing the output from one operator to be passed as
+the input to another operator.

This image shows how you would get the norm of a resulting vector from the
dot product of two vectors:

.. image:: ../images/drawings/connect-operators.png

-This example shows how to define a generic workflow that computes the minimum
+This code shows how to define a generic workflow that computes the minimum
of displacement by chaining the ``U`` and ``min_max_fc`` operators:

.. code-block:: python
2 changes: 1 addition & 1 deletion docs/source/concepts/waysofusing.rst
@@ -13,7 +13,7 @@ CPython
Standalone DPF uses CPython and can be accessed with any Python console.
Data can be exported to universal file formats, such as VTK, HDF5, and TXT
files. You can use it to generate TH-plots, screenshots, and animations or
-to create custom result plots using `numpy <https://numpy.org/>`_
+to create custom result plots using the `numpy <https://numpy.org/>`_
and `matplotlib <https://matplotlib.org/>`_ packages.

.. image:: ../images/drawings/dpf-reports.png
8 changes: 3 additions & 5 deletions docs/source/contributing.rst
@@ -7,16 +7,14 @@ Contribute
Overall guidance on contributing to a PyAnsys repository appears in
`Contribute <https://dev.docs.pyansys.com/overview/contributing.html>`_
in the *PyAnsys Developer's Guide*. Ensure that you are thoroughly familiar
-with this guide, paying particular attention to `Guidelines and Best Practices
-<https://dev.docs.pyansys.com/guidelines/index.html>`_, before attempting
-to contribute to PyDPF-Core.
+with this guide before attempting to contribute to PyDPF-Core.

The following contribution information is specific to PyDPF-Core.

Clone the repository
--------------------
-To clone and install the latest version of PyDPF-Core in
-development mode, run:
+Clone and install the latest version of PyDPF-Core in
+development mode by running this code:

.. code::

12 changes: 6 additions & 6 deletions docs/source/getting_started/compatibility.rst
@@ -65,19 +65,19 @@ should also be synchronized with the server version.
- 0.2.2
- 0.2.*

-(** compatibility of DPF 2.0 with ansys-dpf-core 0.5.0 and later is assumed but no longer certified)
+(** Compatibility of DPF 2.0 with ansys-dpf-core 0.5.0 and later is assumed but no longer certified.)

-Updating Python environment
----------------------------
+Update Python environment
+-------------------------

When moving from one Ansys release to another, you must update the ``ansys-dpf-core`` package and its dependencies.
-To get the latest version of the ``ansys-dpf-core`` package, use this code:
+To get the latest version of the ``ansys-dpf-core`` package, use this command:

.. code::

pip install --upgrade --force-reinstall ansys-dpf-core

-To get a specific version of the ``ansys-dpf-core`` package, such as 0.7.0, use this code:
+To get a specific version of the ``ansys-dpf-core`` package, such as 0.7.0, use this command:

.. code::

@@ -88,7 +88,7 @@ To get a specific version of the ``ansys-dpf-core`` package, such as 0.7.0, use
Environment variable
--------------------

-The ``start_local_server`` method uses the ``Ans.Dpf.Grpc.bat`` file or
+The ``start_local_server()`` method uses the ``Ans.Dpf.Grpc.bat`` file or
``Ans.Dpf.Grpc.sh`` file to start the server. Ensure that the ``AWP_ROOT{VER}``
environment variable is set to your installed Ansys version. For example, if Ansys
2022 R2 is installed, ensure that the ``AWP_ROOT222`` environment
8 changes: 4 additions & 4 deletions docs/source/getting_started/dependencies.rst
@@ -7,8 +7,8 @@ Dependencies
Package dependencies
--------------------

-PyDPF-Core dependencies are automatically checked when packages are
-installed. Package dependencies follow:
+Dependencies for the ``ansys-dpf-core`` package are automatically checked when the
+package is installed. Package dependencies follow:

- `ansys.dpf.gate <https://pypi.org/project/ansys-dpf-gate/>`_, which is the gate
to the DPF C API or Python gRPC API. The gate depends on the server configuration:
@@ -28,5 +28,5 @@ Optional dependencies

For plotting, you can install these optional Python packages:

-- `matplotlib <https://pypi.org/project/matplotlib/>`_ for chart plotting
-- `pyvista <https://pypi.org/project/pyvista/>`_ for 3D plotting
+- `matplotlib <https://pypi.org/project/matplotlib/>`_ package for chart plotting
+- `pyvista <https://pypi.org/project/pyvista/>`_ package for 3D plotting
56 changes: 27 additions & 29 deletions docs/source/getting_started/index.rst
@@ -14,10 +14,10 @@ PyDPF-Core is a Python client API communicating with a **DPF Server**, either
through the network using gRPC or directly in the same process.


-Installing PyDPF-Core
----------------------
+Install PyDPF-Core
+------------------

-In a Python environment, run the following command to install PyDPF-Core:
+To install PyDPF-Core, in a Python environment, run this command:

.. code::

@@ -26,54 +26,52 @@ In a Python environment, run the following command to install PyDPF-Core:
For more installation options, see :ref:`Installation section <installation>`.


-Installing DPF Server
----------------------
+Install DPF Server
+------------------

-#. DPF Server is packaged within the **Ansys Unified Installer** starting with Ansys 2021 R1.
-To use it, download the standard installation using your preferred distribution channel,
-and install Ansys following the installer instructions. If you experience problems,
-see :ref:`Environment variable section <target_environment_variable_with_dpf_section>`. For information on getting
-a licensed copy of Ansys, visit the `Ansys website <https://www.ansys.com/>`_.
+* DPF Server is packaged within the **Ansys installer** in Ansys 2021 R1 and later.
+To use it, download the standard installation using your preferred distribution channel,
+and install Ansys following the installer instructions. If you experience problems,
+see :ref:`Environment variable <target_environment_variable_with_dpf_section>`. For information on getting
+a licensed copy of Ansys, visit the `Ansys website <https://www.ansys.com/>`_.

-#. DPF Server is available as a **standalone** package (independent of the Ansys installer) on the
-`DPF Pre-Release page of the Ansys Customer Portal <https://download.ansys.com/Others/DPF%20Pre-Release>`_.
-As explained in :ref:`Ansys licensing section <target_to_ansys_license_mechanism>`,
-DPF Server is protected by an Ansys license mechanism. Once you have access to an
-Ansys license, install DPF Server:
+* DPF Server is available as a **standalone** package (independent of the Ansys installer) on the
+`DPF Pre-Release page <https://download.ansys.com/Others/DPF%20Pre-Release>`_ of the Ansys Customer Portal.
+As explained in :ref:`Ansys licensing <target_to_ansys_license_mechanism>`,
+DPF Server is protected by an Ansys license mechanism. Once you have access to an
+Ansys license, install DPF Server:

.. card::

-* Download the ansys_dpf_server_win_v2023.2.pre0.zip or ansys_dpf_server_lin_v2023.2.pre0.zip
+* Download the ``ansys_dpf_server_win_v2023.2.pre0.zip`` or ``ansys_dpf_server_lin_v2023.2.pre0.zip``
file as appropriate.
-* Unzip the package and go to the root folder of the unzipped package
-(ansys_dpf_server_win_v2023.2.pre0 or ansys_dpf_server_lin_v2023.2.pre0).
-* In a Python environment, run the following command:
+* Unzip the package and go to its root folder (``ansys_dpf_server_win_v2023.2.pre0`` or
+``ansys_dpf_server_lin_v2023.2.pre0``).
+* In a Python environment, run this command:

.. code::

pip install -e .

* DPF Server is protected using the license terms specified in the
-`DPFPreviewLicenseAgreement <https://download.ansys.com/-/media/dpf/dpfpreviewlicenseagreement.ashx?la=en&hash=CCFB07AE38C638F0D43E50D877B5BC87356006C9>`_ file, which is available on the
-`DPF Pre-Release page of the Ansys Customer Portal <https://download.ansys.com/Others/DPF%20Pre-Release>`_.
-To accept these terms, you must set the
-following environment variable:
+`DPFPreviewLicenseAgreement <https://download.ansys.com/-/media/dpf/dpfpreviewlicenseagreement.ashx?la=en&hash=CCFB07AE38C638F0D43E50D877B5BC87356006C9>`_
+file, which is available on the `DPF Pre-Release page <https://download.ansys.com/Others/DPF%20Pre-Release>`_
+of the Ansys Customer Portal. To accept these terms, you must set this
+environment variable:

.. code::

ANSYS_DPF_ACCEPT_LA=Y

-For more information about the license terms, see the :ref:`DPF Preview License Agreement<target_to_license_terms>`
-section.
-
-For installation methods that do not use pip, such as using **Docker containers**, see
-:ref:`ref_getting_started_with_dpf_server`.
+For more information about the license terms, see :ref:`DPF Preview License Agreement<target_to_license_terms>`.
+
+For installation methods that do not use `pip <https://pypi.org/project/pip/>`_,
+such as using **Docker containers**, see :ref:`ref_getting_started_with_dpf_server`.

Use PyDPF-Core
--------------

-In the same Python environment, run the following command to use PyDPF-Core:
+To use PyDPF-Core, in the same Python environment, run this command:

.. code:: python
