bert-for-tf2e 0.14.13
BERT for TensorFlow 2.11.0
This is a modified version of the original bert-for-tf2 created by kpe. I made a minor change to the code
so that it works with newer versions of TensorFlow, following a solution I found in the GitHub community;
the change resolves the TypeError issue. Last checked on 1/23/2023 - it worked fine.
This repo contains a TensorFlow 2.11.0_ Keras_ implementation of google-research/bert_
with support for loading of the original pre-trained weights_,
and producing activations numerically identical to the ones calculated by the original model.
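For reference, here is a minimal sketch of creating the Keras layer and loading the original
pre-trained weights. It follows the upstream bert-for-tf2 API (``params_from_pretrained_ckpt``,
``BertModelLayer``, ``load_stock_weights``); the checkpoint directory is a placeholder::

    import os
    import tensorflow as tf
    import bert

    # Placeholder: directory with an extracted google-research/bert checkpoint.
    model_dir = ".models/uncased_L-12_H-768_A-12"

    # Read the BERT configuration from the checkpoint directory.
    bert_params = bert.params_from_pretrained_ckpt(model_dir)
    l_bert = bert.BertModelLayer.from_params(bert_params, name="bert")

    # Wrap the layer in a Keras model so that its weights get built.
    max_seq_len = 128
    l_input_ids = tf.keras.layers.Input(shape=(max_seq_len,), dtype="int32")
    output = l_bert(l_input_ids)  # [batch_size, max_seq_len, hidden_size]
    model = tf.keras.Model(inputs=l_input_ids, outputs=output)
    model.build(input_shape=(None, max_seq_len))

    # Load the original pre-trained weights into the built layer.
    bert.load_stock_weights(l_bert, os.path.join(model_dir, "bert_model.ckpt"))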
ALBERT_ and adapter-BERT_ are also supported by setting the corresponding
configuration parameters (shared_layer=True, embedding_size for ALBERT_
and adapter_size for adapter-BERT_). Setting both results in an adapter-ALBERT,
sharing the BERT parameters across all layers while adapting every layer with a layer-specific adapter.
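As a sketch, those switches are set on the params object like this (the sizes below are
illustrative placeholders, not recommended values)::

    import bert

    model_dir = ".models/uncased_L-12_H-768_A-12"  # placeholder checkpoint dir
    bert_params = bert.params_from_pretrained_ckpt(model_dir)

    # ALBERT: share one transformer layer across the stack
    # and factorize the token embeddings.
    bert_params.shared_layer = True
    bert_params.embedding_size = 128   # illustrative value

    # adapter-BERT: add small trainable adapters to every layer.
    bert_params.adapter_size = 64      # illustrative value

    # Setting all of the above yields the adapter-ALBERT combination.
    l_bert = bert.BertModelLayer.from_params(bert_params, name="bert")

    # For adapter training, the upstream API also provides a helper that
    # freezes everything except the adapters and layer norms:
    # l_bert.apply_adapter_freeze()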
The implementation is built from scratch using only basic TensorFlow operations,
following the code in google-research/bert/modeling.py_
(but skipping dead code and applying some simplifications). It also utilizes kpe/params-flow_ to reduce
common Keras boilerplate code (related to passing model and layer configuration arguments).
bert-for-tf2e_ should work with TensorFlow 2.11.0_ as well as TensorFlow 1.14_ or newer.
Install
bert-for-tf2e (BERT for TensorFlow 2, extended) is on the Python Package Index (PyPI)::

    pip install bert-for-tf2e
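After installing, you can check that the package imports (like the original bert-for-tf2,
it installs under the ``bert`` module name)::

    python -c "import bert; print(bert.__version__)"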
For more details, please check the original version:
SOURCE_ - https://github.com/kpe/bert-for-tf2
For personal and professional use. You may not resell or redistribute these repositories in their original state.