TSDAE-Lemone-mBERT-tax
Pretrained transformer model trained with a Transformer-based Sequential Denoising Auto-Encoder (TSDAE) for unsupervised sentence embedding learning, with a single objective: French tax domain adaptation.
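For reference, below is a minimal sketch of how a TSDAE domain-adaptation run of this kind can be set up with the sentence-transformers library. The corpus file, output path, and hyperparameters are illustrative assumptions, not the exact settings used to train TSDAE-Lemone-mBERT-tax; the base checkpoint "bert-base-multilingual-cased" is chosen to match the mBERT backbone named above.

```python
# Hedged TSDAE domain-adaptation sketch (assumed paths and hyperparameters).
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, models, losses, datasets

# Sentence-embedding model: mBERT encoder + CLS pooling, as recommended for TSDAE.
word_embedding = models.Transformer("bert-base-multilingual-cased")
pooling = models.Pooling(word_embedding.get_word_embedding_dimension(), "cls")
model = SentenceTransformer(modules=[word_embedding, pooling])

# Unlabeled in-domain sentences (e.g. French tax statutes and commentary).
# "french_tax_corpus.txt" is a placeholder path.
with open("french_tax_corpus.txt", encoding="utf-8") as f:
    train_sentences = [line.strip() for line in f if line.strip()]

# DenoisingAutoEncoderDataset deletes tokens to create noisy inputs;
# DenoisingAutoEncoderLoss trains the encoder so that a tied decoder can
# reconstruct the original sentence from the pooled embedding.
train_dataset = datasets.DenoisingAutoEncoderDataset(train_sentences)
train_dataloader = DataLoader(train_dataset, batch_size=8, shuffle=True)
train_loss = losses.DenoisingAutoEncoderLoss(
    model,
    decoder_name_or_path="bert-base-multilingual-cased",
    tie_encoder_decoder=True,
)

model.fit(
    train_objectives=[(train_dataloader, train_loss)],
    epochs=1,
    weight_decay=0,
    scheduler="constantlr",
    optimizer_params={"lr": 3e-5},
    show_progress_bar=True,
)
model.save("output/tsdae-mbert-tax")  # assumed output directory
```

Once published on the Hugging Face Hub, the resulting encoder can be loaded with SentenceTransformer("<repo-id>") and used via .encode() to embed French tax text; the exact repository identifier is not specified here.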