
TF2.0 hub Universal Sentence Encoder Multilingual SentencepieceOp not registered problem #463

Closed
@cbahcevan

Description


Have I written custom code: No
OS Platform and Distribution: Windows 10 / Google Colab
TensorFlow version: tensorflow==2.0.0
Python version: Python 3.6.9

Here is my code.

import tensorflow as tf
import tensorflow_hub as hub
import tf_sentencepiece
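# Importing tf_sentencepiece is intended to register the SentencePiece custom ops
# before the module is loaded, but on TF 2.0 the op does not appear to get
# registered (see the traceback below).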
embedding_layer = hub.load("https://tfhub.dev/google/universal-sentence-encoder-multilingual-large/3")

Problematic output

---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/framework/ops.py in _get_op_def(self, type)
   3819     try:
-> 3820       return self._op_def_cache[type]
   3821     except KeyError:

KeyError: 'SentencepieceOp'

During handling of the above exception, another exception occurred:

NotFoundError                             Traceback (most recent call last)
8 frames
/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/framework/ops.py in _get_op_def(self, type)
   3822       with c_api_util.tf_buffer() as buf:
   3823         # pylint: disable=protected-access
-> 3824         c_api.TF_GraphGetOpDef(self._c_graph, compat.as_bytes(type), buf)
   3825         # pylint: enable=protected-access
   3826         data = c_api.TF_GetBuffer(buf)

NotFoundError: Op type not registered 'SentencepieceOp' in binary running on e2f7765a82a9. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed

I could not get an answer to this issue on the TensorFlow repository. I want to get multilingual sentence embeddings using the pretrained Google Universal Sentence Encoder, but I have not been able to get it to work on Windows 10 or on Google Colab, even though other reports online suggest it should. I was able to get sentence vectors using a different hub module, https://tfhub.dev/google/universal-sentence-encoder/4, which does not depend on SentencePiece.
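For reference, a minimal sketch of a possible workaround, assuming that the tensorflow_text package (pip install tensorflow_text, matching the installed TensorFlow version) registers the SentencePiece ops this module depends on; this is an assumption, not something confirmed in this issue:

import tensorflow_hub as hub
import tensorflow_text  # noqa: F401  # assumed to register SentencepieceOp on import

# Load the multilingual module and embed a couple of sentences.
embed = hub.load(
    "https://tfhub.dev/google/universal-sentence-encoder-multilingual-large/3")
embeddings = embed(["Hello, world!", "Hola, mundo!"])
print(embeddings.shape)  # expected: (2, 512)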
