Is there an existing issue for this?
I have searched the existing issues and did not find a match.
Who can help?
I keep getting this error no matter what I do or which of the examples from your documentation I use.
What are you working on?
I'm trying to train a NER model for quality assessment, but I don't understand where else to look to see what is causing the error. I tried other versions and loading a different dataset (the [(Alice, 1)] example from the documentation), but the error is the same.
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/dist-packages/pyspark/serializers.py", line 458, in dumps
    return cloudpickle.dumps(obj, pickle_protocol)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/pyspark/cloudpickle/cloudpickle_fast.py", line 73, in dumps
    cp.dump(obj)
  File "/usr/local/lib/python3.11/dist-packages/pyspark/cloudpickle/cloudpickle_fast.py", line 602, in dump
    return Pickler.dump(self, obj)
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/pyspark/cloudpickle/cloudpickle_fast.py", line 692, in reducer_override
    return self._function_reduce(obj)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/pyspark/cloudpickle/cloudpickle_fast.py", line 565, in _function_reduce
    return self._dynamic_function_reduce(obj)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/pyspark/cloudpickle/cloudpickle_fast.py", line 546, in _dynamic_function_reduce
    state = _function_getstate(func)
            ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/pyspark/cloudpickle/cloudpickle_fast.py", line 157, in _function_getstate
    f_globals_ref = _extract_code_globals(func.__code__)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/pyspark/cloudpickle/cloudpickle.py", line 334, in _extract_code_globals
    out_names = {names[oparg]: None for _, oparg in _walk_global_ops(co)}
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/pyspark/cloudpickle/cloudpickle.py", line 334, in <dictcomp>
    out_names = {names[oparg]: None for _, oparg in _walk_global_ops(co)}
                 ~~~~~^^^^^^^
IndexError: tuple index out of range
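Note that every frame in the traceback is inside PySpark's vendored cloudpickle, not Spark NLP itself, and the interpreter path shows Python 3.11. Python 3.11 changed how the `LOAD_GLOBAL` bytecode encodes its argument (the oparg carries an extra low flag bit), so an older cloudpickle that indexes `co_names[oparg]` directly can run past the end of the names tuple, which is exactly this `IndexError`. A rough, hypothetical compatibility check is sketched below; the helper name and the version cutoff are my assumptions (based on Spark upgrading its bundled cloudpickle for Python 3.11 in the 3.4 line), not an official API:

```python
import sys

def pyspark_supports_python(pyspark_version: str,
                            py: tuple = sys.version_info[:2]) -> bool:
    """Hypothetical helper: guess whether a PySpark release can pickle
    functions under the running Python.  The (3, 4) cutoff is an assumption
    based on the cloudpickle upgrade in the Spark 3.4 line."""
    major, minor = (int(x) for x in pyspark_version.split(".")[:2])
    if py >= (3, 11):
        # Older bundled cloudpickle predates the 3.11 LOAD_GLOBAL change.
        return (major, minor) >= (3, 4)
    return True

print(pyspark_supports_python("3.3.0", (3, 11)))  # the combination in this report
print(pyspark_supports_python("3.4.0", (3, 11)))
```

If this check is right, pyspark 3.3.0 under Python 3.11 would be expected to fail on any function pickling, regardless of which Spark NLP example is used.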
https://github.com/JohnSnowLabs/spark-nlp/blob/ac203f2906e0cc6ce7185cb929b90c708820ea9e/docs/_posts/maziyarpanahi/2020-02-03-wikiner_840B_300_it.md#how-to-use
Current Behavior
Pickling fails with the IndexError shown in the traceback above.
Expected Behavior
No errors.
Steps To Reproduce
Spark NLP version and Apache Spark
Spark NLP version: 4.2.8
Apache Spark version: 3.3.0
Type of Spark Application
Python Application
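Since the failing frames all live under `/usr/local/lib/python3.11/…`, one plausible workaround is to keep these exact package versions but run them on an interpreter that the bundled cloudpickle predates. A sketch of such a setup, assuming a conda-based environment (the env name is arbitrary, and the Python 3.10 choice is my assumption about the last release the Spark 3.3 cloudpickle handles):

```shell
# Sketch: recreate the environment on Python 3.10, which predates the
# bytecode changes that appear to break the cloudpickle bundled with
# pyspark 3.3.0.  Versions are taken from this report.
conda create -n sparknlp python=3.10 -y
conda activate sparknlp
pip install pyspark==3.3.0 spark-nlp==4.2.8
```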
Java Version
No response
Java Home Directory
No response
Setup and installation
No response
Operating System and Version
No response
Link to your project (if available)
No response
Additional Information