Merge/0.5.dev2 fix 2022R2 pipelines #325

Merged
merged 144 commits on Jul 26, 2022
3f8d676
Change ANSYS_VERSION to 222
PProfizi Jul 11, 2022
3431471
Add step for pygate 0.1.dev1 installation
PProfizi Jul 12, 2022
18e666a
Add step for pygate 0.1.dev1 installation
PProfizi Jul 12, 2022
e6f2b79
Add trigger for ci
PProfizi Jul 12, 2022
8a4353a
Add trigger for ci
PProfizi Jul 12, 2022
8be2f2a
Fix conditional for gatebin install steps
PProfizi Jul 12, 2022
99856e8
Fix gatebin install steps
PProfizi Jul 12, 2022
3786694
Add the wheels temporarily
PProfizi Jul 12, 2022
76e7650
Fix step conditional
PProfizi Jul 12, 2022
caed381
Change requirement to ansys-dpf-gate==0.1.dev1
PProfizi Jul 12, 2022
56c151d
Add pygate steps to documentation job
PProfizi Jul 12, 2022
a982047
Run Doc on windows-latest
PProfizi Jul 12, 2022
68bd6c2
Move build installation of pygate to pydpf-actions
PProfizi Jul 12, 2022
0db0e2e
Fix conditionals
PProfizi Jul 12, 2022
d810645
Update gate wheel
PProfizi Jul 12, 2022
16e2ab3
Add up-to-date ansys-grpc-dpf wheel
PProfizi Jul 12, 2022
be6c38a
Remove ANSYS_VERSION env setter from build_doc job
PProfizi Jul 12, 2022
5090ef7
Add install of local ansys-grpc-dpf wheel
PProfizi Jul 12, 2022
834f46d
Solve docstring test for download_file
PProfizi Jul 12, 2022
d270bdf
Solve docstring test for download_file
PProfizi Jul 12, 2022
fb9aa51
Solve docstring test for download_file
PProfizi Jul 12, 2022
5ee0bf4
Solve docstring test for download_file
PProfizi Jul 12, 2022
2e18c13
Update pygate wheel
PProfizi Jul 12, 2022
860d190
Fix doctests
PProfizi Jul 12, 2022
fb05974
Fix doctests
PProfizi Jul 12, 2022
f1df7bf
Fix doctests
PProfizi Jul 12, 2022
6a883a2
Fix doctests
PProfizi Jul 12, 2022
a4ec8b2
Fix doctests
PProfizi Jul 12, 2022
d611689
pytest.ini: add doctest NORMALIZE_WHITESPACE and ELLIPSIS by default
PProfizi Jul 12, 2022
0e5d006
remove inline #doctest: options
PProfizi Jul 12, 2022
a540885
Fix doctest
PProfizi Jul 12, 2022
5138dcf
Fix doctest
PProfizi Jul 12, 2022
816925c
Fix doctest
PProfizi Jul 12, 2022
3e8c229
Fix doctest
PProfizi Jul 12, 2022
ed8ca92
Fix doctest
PProfizi Jul 12, 2022
274ca9e
Fix doctest
PProfizi Jul 12, 2022
b2d7e93
use v2.2 pydpf-actions
TheGoldfish01 Jul 13, 2022
fac439a
Target potential failing test
TheGoldfish01 Jul 13, 2022
e4b47fe
Change local_test_repo to False
TheGoldfish01 Jul 13, 2022
5f03a49
Comment-out segfault test
TheGoldfish01 Jul 13, 2022
40801dc
Remove constraint on ansys-dpf-gate version in setup.py install_requires
TheGoldfish01 Jul 15, 2022
58157f2
Try-out DEBUG option for pydpf-actions/test_package
TheGoldfish01 Jul 18, 2022
38d8cb8
Revert "Comment-out segfault test"
TheGoldfish01 Jul 19, 2022
deba5f6
Revert "Target potential failing test"
TheGoldfish01 Jul 19, 2022
f9e7065
Fix test_busy_port from Lea's PR on TFS
TheGoldfish01 Jul 19, 2022
26ac04a
Fix conftest according to TFS PR from Lea
TheGoldfish01 Jul 19, 2022
406c3cb
Fix gatebin tests from TFS PR Lea
TheGoldfish01 Jul 19, 2022
2644054
Update wheels
PProfizi Jul 19, 2022
fdaeab3
Merge branch 'merge/0.5.dev2' into merge/fix_222
TheGoldfish01 Jul 19, 2022
548eb12
Remove Report from test in install package
TheGoldfish01 Jul 19, 2022
a3ee08d
Fix flake8
TheGoldfish01 Jul 19, 2022
f30158b
Try-out on Python 3.7 same as TFS
TheGoldfish01 Jul 19, 2022
be15028
Merge branch 'merge/0.5.dev2' into merge/test_222
TheGoldfish01 Jul 19, 2022
e3f6097
Bump imageio from 2.19.3 to 2.19.5 (#312)
dependabot[bot] Jul 19, 2022
4cc4680
MAINT: collect dependencies under a requirements/ directory (#307)
jorgepiloto Jul 15, 2022
5bb7c33
Bump sphinx-notfound-page from 0.8 to 0.8.3 (#308)
dependabot[bot] Jul 15, 2022
ddf63d4
Bump coverage from 6.4.1 to 6.4.2 (#306)
dependabot[bot] Jul 13, 2022
a74e635
Bump actions/setup-python from 4.0.0 to 4.1.0 (#302)
dependabot[bot] Jul 12, 2022
36e1076
Add a retrocompatibility workflow (#310)
PProfizi Jul 19, 2022
306390f
Merge branch 'merge/test_222' into merge/fix_222
TheGoldfish01 Jul 19, 2022
ee1913d
Use pydpf-actions @v2.2 for retro.yml
TheGoldfish01 Jul 19, 2022
ec72b07
Add kill-all-servers steps
TheGoldfish01 Jul 19, 2022
472fe07
Add search for AWP_ROOT221 and others in decreasing order as a last r…
TheGoldfish01 Jul 19, 2022
bfedb8f
Reduces restraints in requirements_docs.txt
TheGoldfish01 Jul 19, 2022
6d3f0f7
Try fix doc generation for 06-stress_gradient_path.py
TheGoldfish01 Jul 19, 2022
7d7f959
Try fix doc generation for 06-stress_gradient_path.py
TheGoldfish01 Jul 19, 2022
b05711d
Revert "Try fix doc generation for 06-stress_gradient_path.py"
TheGoldfish01 Jul 19, 2022
c821bf5
Revert "Try fix doc generation for 06-stress_gradient_path.py"
TheGoldfish01 Jul 19, 2022
b2c30ef
Fix doc generation
PProfizi Jul 20, 2022
b3fe38d
Use wheel and wheelhouse options of build_package action v2.2
PProfizi Jul 20, 2022
3743fcc
Set wheelhouse generation to false while gate not available on PyPi
PProfizi Jul 20, 2022
3330a66
Retro now does not test docstrings
PProfizi Jul 20, 2022
5f00e76
Merge branch 'merge/0.5.dev2' into merge/retro/fix_222
PProfizi Jul 20, 2022
d99e7a0
Update wheels of gate, grpc,
PProfizi Jul 20, 2022
d4d0ebb
Move find_ansys() to last resort. If only AWP_ROOT221 is declared, it…
PProfizi Jul 20, 2022
0505f74
Restrict TestServerConfigs tests to servers >= 4.0
PProfizi Jul 20, 2022
c253f60
Moved ansys_path retrieval logic to misc along with find_ansys, in ge…
PProfizi Jul 20, 2022
172d79f
remove rogue prints
PProfizi Jul 20, 2022
e5840cd
Fixture for server<4.0 now only use global server
PProfizi Jul 20, 2022
4239d25
Rename test_remote_workflow.py
TheGoldfish01 Jul 20, 2022
c99cdfe
Fix import
TheGoldfish01 Jul 20, 2022
c4b1f07
Conftest autouse fixture count_servers
PProfizi Jul 21, 2022
45f7a79
Set ansys_path argument of get_ansys_path to None by default
PProfizi Jul 21, 2022
8e8ad1e
Revert "Conftest autouse fixture count_servers"
PProfizi Jul 21, 2022
d72282f
Conftest actually count_servers with fixture
PProfizi Jul 21, 2022
831c2fb
Fix flake8
PProfizi Jul 21, 2022
8666f36
Merge branch 'merge/0.5.dev2' into merge/retro/fix_222
PProfizi Jul 21, 2022
0ef8e83
Fix flake8
PProfizi Jul 21, 2022
f3ca491
Remove code merged by error.
PProfizi Jul 21, 2022
1b343d3
Remove test_set_coordinates_field_meshed_region which should come wit…
PProfizi Jul 21, 2022
daa6393
Remove assert nb_servers=1 from count_servers fixture
PProfizi Jul 21, 2022
503ecde
Move and rename test_launch_server_not_install to test_launcher.py as…
PProfizi Jul 21, 2022
458b4b5
Fix GrpcServer.shutdown()
PProfizi Jul 21, 2022
3da0c1e
Add a version test before trying to launch an InProcess server
PProfizi Jul 21, 2022
1f84078
Remove version test, add warning to get info on ansys_path used.
PProfizi Jul 21, 2022
d1ae076
Merge branch 'merge/0.5.dev2' into merge/retro/fix_222
PProfizi Jul 21, 2022
983fbce
fix circular references
cbellot000 Jul 22, 2022
c02c4a0
ADD WARNINGS FOR DEBUG
PProfizi Jul 22, 2022
4ce75de
ADD TIMESTAMP
PProfizi Jul 22, 2022
e8e5835
Use [email protected]
PProfizi Jul 22, 2022
ce0e3b8
Revert "Use [email protected]"
PProfizi Jul 22, 2022
ec55e9f
Revert "ADD TIMESTAMP"
PProfizi Jul 22, 2022
31f2d6e
Revert "ADD WARNINGS FOR DEBUG"
PProfizi Jul 22, 2022
d47f1f7
Merge branch 'merge/0.5.dev2' into merge/retro/fix_222
PProfizi Jul 22, 2022
e77afa0
changes refs
cbellot000 Jul 22, 2022
da22494
Remove warning used for debug
PProfizi Jul 22, 2022
11e30e0
Fix test_start_local_failed_executable
PProfizi Jul 22, 2022
455bde9
Fix test_busy_port
PProfizi Jul 22, 2022
df35875
fixes
cbellot000 Jul 22, 2022
4d28452
Try [email protected]
PProfizi Jul 22, 2022
cf099db
styling and spelling
cbellot000 Jul 22, 2022
3ff9971
Up doc install-dpf-server to v2.2
PProfizi Jul 22, 2022
88bc7b5
Revert "Try [email protected]"
PProfizi Jul 22, 2022
c08f8d7
disable cache
cbellot000 Jul 22, 2022
4460b74
dummy push
cbellot000 Jul 22, 2022
957a933
reremove cache
cbellot000 Jul 22, 2022
a925b04
Set autouse of count_servers fixture to false
PProfizi Jul 22, 2022
3639832
add cleanup
cbellot000 Jul 25, 2022
1c49739
fix cleanup
cbellot000 Jul 25, 2022
395faa8
cleanup
cbellot000 Jul 25, 2022
37a16fb
shutdown all
cbellot000 Jul 25, 2022
b385bf6
Fix codacy errors
PProfizi Jul 25, 2022
2fb232e
fix
cbellot000 Jul 25, 2022
a2fc58a
cleanup workflow
cbellot000 Jul 25, 2022
23154f3
Add back test_launch_server_not_install and rename it as test_launche…
PProfizi Jul 25, 2022
6ef105d
fixes
cbellot000 Jul 25, 2022
969a800
dummy commit
cbellot000 Jul 25, 2022
fc84b9d
Revert "dummy commit"
cbellot000 Jul 25, 2022
60af04d
Refactor get_ansys_path and find_ansys
PProfizi Jul 25, 2022
8536b89
fix global server
cbellot000 Jul 25, 2022
1237767
fix doctests
cbellot000 Jul 25, 2022
ae1439f
skip tests
cbellot000 Jul 25, 2022
d004206
Add back a print in test_print_data_sources_path
PProfizi Jul 25, 2022
34934d5
Merge branch 'merge/retro/fix_222' of https://github.com/pyansys/DPF-…
cbellot000 Jul 25, 2022
c0b9ab1
fix merge
cbellot000 Jul 25, 2022
4ab0d84
fix
cbellot000 Jul 25, 2022
7087365
dummy
cbellot000 Jul 25, 2022
3fb408d
Merge/retro/pipelines 0.5.dev2 (#334)
cbellot000 Jul 25, 2022
c54fb71
add comment
cbellot000 Jul 25, 2022
84f5bff
skip failing test
cbellot000 Jul 25, 2022
c4ebc44
restart
cbellot000 Jul 25, 2022
0403c5a
fixes
cbellot000 Jul 25, 2022
1250758
fix
cbellot000 Jul 25, 2022
b49c727
fixes
cbellot000 Jul 26, 2022
Binary file not shown.
Binary file not shown.
18 changes: 8 additions & 10 deletions .github/workflows/ci.yml
@@ -43,9 +43,10 @@ jobs:
MODULE: ${{env.MODULE}}
dpf-standalone-TOKEN: ${{secrets.DPF_PIPELINE}}
install_extras: plotting
wheelhouse: "false"

- name: "Test Package"
uses: pyansys/pydpf-actions/[email protected]
uses: pyansys/pydpf-actions/[email protected].dev0
with:
MODULE: ${{env.MODULE}}
DEBUG: True
@@ -75,7 +76,7 @@ jobs:
python-version: 3.8

- id: install-dpf
uses: pyansys/pydpf-actions/install-dpf-server@v2.0
uses: pyansys/pydpf-actions/install-dpf-server@v2.2
with:
dpf-standalone-TOKEN: ${{secrets.DPF_PIPELINE}}
ANSYS_VERSION : ${{env.ANSYS_VERSION}}
@@ -125,21 +126,18 @@ jobs:
run: |
pip install -r requirements/requirements_docs.txt

- name: "Kill all servers"
uses: pyansys/pydpf-actions/[email protected]

- name: Build Documentation
shell: cmd
run: |
cd .ci
build_doc.bat > ..\docs\log.txt && type ..\docs\log.txt 2>&1
timeout-minutes: 20

- name: Kill all servers
shell: cmd
run: |
tasklist /FI "IMAGENAME eq Ans.Dpf.Grpc.exe" 2>NUL | find /I /N "Ans.Dpf.Grpc.exe">NUL
ECHO %ERRORLEVEL%
if "%ERRORLEVEL%"=="0"(taskkill /f /im Ans.Dpf.Grpc.exe)
continue-on-error: true
if: always()
- name: "Kill all servers"
uses: pyansys/pydpf-actions/[email protected]

- name: Publish Documentation artifact
uses: actions/upload-artifact@v3
7 changes: 5 additions & 2 deletions .github/workflows/retro.yml
@@ -28,16 +28,19 @@ jobs:
- uses: actions/checkout@v3

- name: "Build Package"
uses: pyansys/pydpf-actions/build_package@v2.1.1
uses: pyansys/pydpf-actions/build_package@v2.2
with:
python-version: ${{ matrix.python-version }}
ANSYS_VERSION: ${{env.ANSYS_VERSION}}
PACKAGE_NAME: ${{env.PACKAGE_NAME}}
MODULE: ${{env.MODULE}}
dpf-standalone-TOKEN: ${{secrets.DPF_PIPELINE}}
install_extras: plotting
wheel: "false"
wheelhouse: "false"

- name: "Test Package"
uses: pyansys/pydpf-actions/test_package@v2.1.1
uses: pyansys/pydpf-actions/test_package@v2.2.dev0
with:
MODULE: ${{env.MODULE}}
DOCTEST: "false"
61 changes: 48 additions & 13 deletions ansys/dpf/core/_model_helpers.py
@@ -1,14 +1,49 @@
import weakref

def __connect_op__(op, metadata, mesh_by_default=True):
"""Connect the data sources or the streams to the operator."""
if metadata._stream_provider is not None and hasattr(op.inputs, "streams"):
op.inputs.streams.connect(metadata._stream_provider.outputs)
elif metadata._stream_provider is not None and hasattr(op.inputs, "streams_container"):
op.inputs.streams_container.connect(metadata._stream_provider.outputs)
elif metadata._data_sources is not None and hasattr(
op.inputs, "data_sources"
):
op.inputs.data_sources.connect(metadata._data_sources)

if mesh_by_default and metadata.mesh_provider and hasattr(op.inputs, "mesh"):
op.inputs.mesh.connect(metadata.mesh_provider)

class DataSourcesOrStreamsConnector:
def __init__(self, metadata):
self._metadata = weakref.ref(metadata)

@property
def streams_provider(self):
if self._metadata():
return self._metadata().streams_provider
return None

@property
def time_freq_support(self):
if self._metadata():
return self._metadata().time_freq_support
return None

@property
def mesh_provider(self):
if self._metadata():
return self._metadata()._mesh_provider_cached
return None

@property
def data_sources(self):
if self._metadata():
return self._metadata().data_sources
return None

def named_selection(self, name):
if self._metadata():
return self._metadata().named_selection(name)
return None

def __connect_op__(self, op, mesh_by_default=True):
"""Connect the data sources or the streams to the operator."""
if self.streams_provider is not None and hasattr(op.inputs, "streams"):
op.inputs.streams.connect(self.streams_provider.outputs)
elif self.streams_provider is not None and hasattr(op.inputs, "streams_container"):
op.inputs.streams_container.connect(self.streams_provider.outputs)
elif self.data_sources is not None and hasattr(
op.inputs, "data_sources"
):
op.inputs.data_sources.connect(self.data_sources)

if mesh_by_default and self.mesh_provider and hasattr(op.inputs, "mesh"):
op.inputs.mesh.connect(self.mesh_provider)
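The new `DataSourcesOrStreamsConnector` above holds its `Metadata` through `weakref.ref`, so helper objects no longer keep the model alive. A minimal sketch of that pattern, with illustrative stand-in classes rather than the real pydpf-core types:

```python
import weakref


class Metadata:
    """Stand-in for the real Metadata object."""
    def __init__(self, data_sources):
        self.data_sources = data_sources


class Connector:
    """Holds Metadata weakly, mirroring DataSourcesOrStreamsConnector."""
    def __init__(self, metadata):
        self._metadata = weakref.ref(metadata)  # weak, not strong

    @property
    def data_sources(self):
        meta = self._metadata()  # None once the Metadata is collected
        return meta.data_sources if meta is not None else None


meta = Metadata(data_sources="file.rst")
conn = Connector(meta)
assert conn.data_sources == "file.rst"
del meta                          # drop the only strong reference
assert conn.data_sources is None  # the connector did not keep it alive
```

Each property dereferences the weak reference first and returns `None` when the owning metadata is gone, which is exactly the guard the diff repeats in every accessor.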
2 changes: 1 addition & 1 deletion ansys/dpf/core/core.py
@@ -312,7 +312,7 @@ def __init__(self, server=None, load_operators=True, timeout=5):
grpcapi=tmp_dir_grpcapi.TmpDirGRPCAPI
)

# step3: init environement
# step3: init environment
self._api.init_data_processing_environment(self) # creates stub when gRPC

def make_tmp_dir_server(self):
2 changes: 1 addition & 1 deletion ansys/dpf/core/data_sources.py
@@ -52,7 +52,7 @@ def __init__(self, result_path=None, data_sources=None, server=None):
self._api = self._server.get_api_for_type(capi=data_sources_capi.DataSourcesCAPI,
grpcapi=data_sources_grpcapi.DataSourcesGRPCAPI)

# step3: init environement
# step3: init environment
self._api.init_data_sources_environment(self) # creates stub when gRPC

# step4: if object exists: take instance, else create it:
54 changes: 34 additions & 20 deletions ansys/dpf/core/dpf_operator.py
@@ -5,7 +5,6 @@
========
"""

import functools
import logging
import os
import traceback
@@ -29,6 +28,24 @@
LOG.setLevel("DEBUG")


class _SubOperator:
def __init__(self, op_name, op_to_connect):
self.op_name = op_name
self.op = Operator(self.op_name, server=op_to_connect._server)
if op_to_connect.inputs is not None:
for key in op_to_connect.inputs._connected_inputs:
inpt = op_to_connect.inputs._connected_inputs[key]
if type(inpt).__name__ == "dict":
for keyout in inpt:
if inpt[keyout]() is not None:
self.op.connect(key, inpt[keyout](), keyout)
else:
self.op.connect(key, inpt())

def __call__(self):
return self.op


class Operator:
"""Represents an operator, which is an elementary operation.

@@ -74,7 +91,6 @@ def __init__(self, name, config=None, server=None):
self._internal_obj = None
self._description = None
self._inputs = None
self._outputs = None

# step 1: get server
self._server = server_module.get_or_create_server(server)
@@ -98,8 +114,6 @@ def __init__(self, name, config=None, server=None):
# add dynamic inputs
if len(self._spec.inputs) > 0 and self._inputs is None:
self._inputs = Inputs(self._spec.inputs, self)
if len(self._spec.outputs) != 0 and self._outputs is None:
self._outputs = Outputs(self._spec.outputs, self)

# step4: if object exists: take instance (config)
if config:
@@ -138,9 +152,21 @@ def _add_sub_res_operators(self, sub_results):
"""

for result_type in sub_results:
bound_method = self._sub_result_op.__get__(self, self.__class__)
method2 = functools.partial(bound_method, name=result_type["operator name"])
setattr(self, result_type["name"], method2)
try:
setattr(self, result_type["name"], _SubOperator(result_type["operator name"], self))
except KeyError:
pass

@property
def _outputs(self):
if self._spec and len(self._spec.outputs) != 0:
return Outputs(self._spec.outputs, self)

@_outputs.setter
def _outputs(self, value):
# the Operator should not hold a reference on its outputs because outputs hold a reference
# on the Operator
pass

@property
@version_requires("3.0")
@@ -189,7 +215,7 @@ def connect(self, pin, inpt, pin_out=0):
elif isinstance(inpt, Operator):
self._api.operator_connect_operator_output(self, pin, inpt, pin_out)
elif isinstance(inpt, Output):
self._api.operator_connect_operator_output(self, pin, inpt._operator(), inpt._pin)
self._api.operator_connect_operator_output(self, pin, inpt._operator, inpt._pin)
elif isinstance(inpt, list):
from ansys.dpf.core import collection
if server_meet_version("3.0", self._server):
@@ -538,18 +564,6 @@ def _find_outputs_corresponding_pins(
elif python_name == "Any":
corresponding_pins.append(pin)

def _sub_result_op(self, name):
op = Operator(name)
if self.inputs is not None:
for key in self.inputs._connected_inputs:
inpt = self.inputs._connected_inputs[key]
if type(inpt).__name__ == "dict":
for keyout in inpt:
op.connect(key, inpt[keyout], keyout)
else:
op.connect(key, inpt)
return op

def __add__(self, fields_b):
"""Add two fields or two fields containers.

26 changes: 14 additions & 12 deletions ansys/dpf/core/inputs.py
@@ -114,21 +114,22 @@ def connect(self, inpt):
from ansys.dpf.core.results import Result

if isinstance(inpt, _Outputs):
self._operator().connect(self._pin, inpt._operator(), corresponding_pins[0][1])
self._operator().connect(self._pin, inpt._operator, corresponding_pins[0][1])
self._operator().inputs._connected_inputs[self._pin] = {
corresponding_pins[0][1]: inpt._operator()
corresponding_pins[0][1]: weakref.ref(inpt._operator)
}
elif isinstance(inpt, Output):
self._operator().connect(self._pin, inpt._operator(), inpt._pin)
self._operator().inputs._connected_inputs[self._pin] = {inpt._pin: inpt}
self._operator().connect(self._pin, inpt._operator, inpt._pin)
self._operator().inputs._connected_inputs[self._pin] = {inpt._pin: weakref.ref(inpt)}
elif isinstance(inpt, Result):
self._operator().connect(self._pin, inpt(), corresponding_pins[0][1])
self._operator().inputs._connected_inputs[self._pin] = {
corresponding_pins[0][1]: inpt()
corresponding_pins[0][1]: weakref.ref(inpt)
}
else:
self._operator().connect(self._pin, inpt)
self._operator().inputs._connected_inputs[self._pin] = inpt
self._operator().inputs._connected_inputs[self._pin] = weakref.ref(inpt) \
if hasattr(inpt, "__weakref__") else inpt

self.__inc_if_ellipsis()

@@ -242,25 +243,26 @@ def connect(self, inpt):

from ansys.dpf.core.results import Result
if isinstance(inpt, Output):
self._operator().connect(corresponding_pins[0], inpt._operator(), inpt._pin)
self._connected_inputs[corresponding_pins[0]] = {inpt._pin: inpt._operator()}
self._operator().connect(corresponding_pins[0], inpt._operator, inpt._pin)
self._connected_inputs[corresponding_pins[0]] = {inpt._pin: weakref.ref(inpt._operator)}
elif isinstance(inpt, _Outputs):
self._operator().connect(
corresponding_pins[0][0], inpt._operator(), corresponding_pins[0][1]
corresponding_pins[0][0], inpt._operator, corresponding_pins[0][1]
)
self._connected_inputs[corresponding_pins[0][0]] = {
corresponding_pins[0][1]: inpt._operator()
corresponding_pins[0][1]: weakref.ref(inpt._operator)
}
elif isinstance(inpt, Result):
self._operator().connect(
corresponding_pins[0][0], inpt(), corresponding_pins[0][1]
)
self._connected_inputs[corresponding_pins[0][0]] = {
corresponding_pins[0][1]: inpt()
corresponding_pins[0][1]: weakref.ref(inpt)
}
else:
self._operator().connect(corresponding_pins[0], inpt)
self._connected_inputs[corresponding_pins[0]] = inpt
self._connected_inputs[corresponding_pins[0]] = weakref.ref(inpt) \
if hasattr(inpt, "__weakref__") else inpt

def _add_input(self, pin, spec, count_ellipsis=-1):
if spec is not None:
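The `inputs.py` changes above store connected inputs as `weakref.ref(inpt)` only when `hasattr(inpt, "__weakref__")`: built-in values such as ints, lists, and dicts have no `__weakref__` slot, and `weakref.ref()` would raise `TypeError` for them. A small sketch of that guard (the helper name is hypothetical, not part of the library):

```python
import weakref


def maybe_weakref(obj):
    """Store a weak reference when the type supports it, else the object itself."""
    return weakref.ref(obj) if hasattr(obj, "__weakref__") else obj


class FieldLike:
    """Stand-in for a DPF object that supports weak references."""


f = FieldLike()
stored = maybe_weakref(f)
assert stored() is f          # weakref: call it to dereference

stored_int = maybe_weakref(42)
assert stored_int == 42       # plain value is kept as-is
```

The caller then has to distinguish the two shapes when reading the value back, which is why the diff's `_SubOperator` calls `inpt[keyout]()` and checks the result against `None` before reconnecting.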
54 changes: 53 additions & 1 deletion ansys/dpf/core/misc.py
@@ -3,6 +3,9 @@
import glob
import os
from pkgutil import iter_modules
from ansys.dpf.core._version import (
__ansys_version__
)


DEFAULT_FILE_CHUNK_SIZE = 524288
@@ -57,8 +60,46 @@ def is_ubuntu():
return False


def get_ansys_path(ansys_path=None):
"""Give input path back if given, else look for ANSYS_DPF_PATH, then AWP_ROOT+__ansys_version__,
then calls for find_ansys as a last resort.

Parameters
----------
ansys_path : str
Full path to an Ansys installation to use.

Returns
-------
ansys_path : str
Full path to an Ansys installation.

"""
# If no custom path was given in input
# First check the environment variable for a custom path
if ansys_path is None:
ansys_path = os.environ.get("ANSYS_DPF_PATH")
# Then check for usual installation folders with AWP_ROOT and find_ansys
if ansys_path is None:
ansys_path = os.environ.get("AWP_ROOT" + __ansys_version__)
if ansys_path is None:
ansys_path = find_ansys()
# If still no install has been found, throw an exception
if ansys_path is None:
raise ValueError(
"Unable to locate any Ansys installation.\n"
f'Make sure the "AWP_ROOT{__ansys_version__}" environment variable '
f"is set if using ANSYS version {__ansys_version__}.\n"
"You can also manually define the path to the ANSYS installation root folder"
" of the version you want to use (vXXX folder):\n"
'- when starting the server with "start_local_server(ansys_path=*/vXXX)"\n'
'- or by setting it by default with the environment variable "ANSYS_DPF_PATH"')
return ansys_path


def find_ansys():
"""Search the standard installation location to find the path to the latest Ansys installation.
"""Search for a standard ANSYS environment variable (AWP_ROOTXXX) or a standard installation
location to find the path to the latest Ansys installation.

Returns
-------
@@ -77,6 +118,17 @@ def find_ansys():
>>> path = find_ansys()

"""
versions = [key[-3:] for key in os.environ.keys() if "AWP_ROOT" in key]
for version in sorted(versions, reverse=True):
if not version.isnumeric():
continue
if version == __ansys_version__:
continue
elif version < __ansys_version__:
ansys_path = os.environ.get("AWP_ROOT" + version)
if ansys_path:
return ansys_path

base_path = None
if os.name == "nt":
base_path = os.path.join(os.environ["PROGRAMFILES"], "ANSYS INC")
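The `misc.py` diff above establishes a fixed resolution order for locating an Ansys installation: explicit argument, then `ANSYS_DPF_PATH`, then `AWP_ROOT` plus the current version, then `find_ansys()` as a last resort, which itself prefers the newest older `AWP_ROOTXXX` variable. A condensed sketch of that order, assuming a current version string of `"222"` as in this PR (function names here are illustrative):

```python
import os

ANSYS_VERSION = "222"  # stands in for __ansys_version__


def fallback_find_ansys():
    """Last resort: newest AWP_ROOTXXX environment variable older than current."""
    versions = sorted(
        (key[-3:] for key in os.environ if key.startswith("AWP_ROOT")),
        reverse=True,
    )
    for version in versions:
        if version.isnumeric() and version < ANSYS_VERSION:
            path = os.environ.get("AWP_ROOT" + version)
            if path:
                return path
    return None


def resolve_ansys_path(ansys_path=None):
    """Mirror get_ansys_path: argument > ANSYS_DPF_PATH > AWP_ROOT<ver> > fallback."""
    if ansys_path is None:
        ansys_path = os.environ.get("ANSYS_DPF_PATH")
    if ansys_path is None:
        ansys_path = os.environ.get("AWP_ROOT" + ANSYS_VERSION)
    if ansys_path is None:
        ansys_path = fallback_find_ansys()
    if ansys_path is None:
        raise ValueError("Unable to locate any Ansys installation.")
    return ansys_path


os.environ.pop("ANSYS_DPF_PATH", None)
os.environ.pop("AWP_ROOT222", None)
os.environ["AWP_ROOT212"] = "/ansys_inc/v212"
assert resolve_ansys_path() == "/ansys_inc/v212"   # older install found last
assert resolve_ansys_path("/custom/v222") == "/custom/v222"  # argument wins
```

Putting `find_ansys()` last matches the commit "Move find_ansys() to last resort", so an explicitly declared `AWP_ROOT221`-style variable is only used when nothing for the current version is available.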