Implementation and demo of explainable coding of clinical notes with Hierarchical Label-wise Attention Networks (HLAN)
The objective of this challenge is to create a machine translation system capable of converting text from French into Fongbe or Ewe.
This repository contains code for a fine-tuning experiment of CamemBERT, a French version of the BERT language model, on a portion of the FQuAD (French Question Answering Dataset) for Question Answering tasks.
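As a hedged sketch of what that fine-tuning experiment builds toward (the plain camembert-base checkpoint and the toy French example are assumptions, not artifacts from this repository), extractive QA with CamemBERT reduces to predicting an answer span:

```python
# Sketch only: this loads the base CamemBERT checkpoint, so the QA head is
# freshly initialized and must be fine-tuned (e.g. on FQuAD) to be useful.
from transformers import AutoModelForQuestionAnswering, AutoTokenizer, pipeline

tokenizer = AutoTokenizer.from_pretrained("camembert-base")
model = AutoModelForQuestionAnswering.from_pretrained("camembert-base")

qa = pipeline("question-answering", model=model, tokenizer=tokenizer)
result = qa(
    question="Qu'est-ce que FQuAD ?",
    context="FQuAD est un jeu de données français pour la question-réponse.",
)
print(result["answer"], result["score"])  # best span and its confidence
```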
Very Simple Transformers provides a simplified interface for packaging, deploying, and serving Transformer models.
The goal of this challenge is to build a machine translation model that translates sentences from Yorùbá to English across several domains, such as news articles, daily conversations, spoken dialogue transcripts, and books.
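A minimal sketch of how such a model could be fine-tuned with Simple Transformers' Seq2SeqModel; the MarianMT checkpoint name and the one-row training DataFrame are assumptions for illustration, not part of the challenge:

```python
import pandas as pd
from simpletransformers.seq2seq import Seq2SeqModel, Seq2SeqArgs

# Illustrative Yorùbá→English pair; a real run needs the challenge dataset.
train_df = pd.DataFrame(
    [["Báwo ni o ṣe wà?", "How are you?"]],
    columns=["input_text", "target_text"],
)

model_args = Seq2SeqArgs(num_train_epochs=1, overwrite_output_dir=True)
model = Seq2SeqModel(
    encoder_decoder_type="marian",
    encoder_decoder_name="Helsinki-NLP/opus-mt-yo-en",  # assumed pretrained checkpoint
    args=model_args,
    use_cuda=False,
)

model.train_model(train_df)
print(model.predict(["Ẹ káàárọ̀."]))  # expect something like "Good morning."
```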
Application for training the pretrained transformer model DeBERTaV3 on an Aspect-Based Sentiment Analysis task.
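One common ABSA formulation frames the aspect as the second sentence of a sentence pair; the sketch below assumes that formulation plus a three-class label set, and may differ from the application's actual training loop:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-v3-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "microsoft/deberta-v3-base",
    num_labels=3,  # assumed labels: negative / neutral / positive
)

# Sentence pair: the review text and the aspect under consideration.
inputs = tokenizer(
    "The food was great but the service was slow.", "service", return_tensors="pt"
)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.softmax(dim=-1))  # class probabilities (random until fine-tuned)
```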
Machine Learning Hackathon by MachineHack hosted by Ugam
This repository contains the annotation framework, dataset and code used for the resource paper "TACO -- Twitter Arguments from COnversations".
Classify forum posts into one of the Amazon e-commerce forum categories using Natural Language Processing (NLP) and machine learning.
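A hedged sketch of how such a classifier might look with Simple Transformers; the two categories and training rows below are invented for illustration:

```python
import pandas as pd
from simpletransformers.classification import ClassificationModel

# Toy data: label 0 = shipping question, label 1 = product complaint (assumed).
train_df = pd.DataFrame(
    [["Where is my order?", 0], ["The product arrived broken.", 1]],
    columns=["text", "labels"],
)

model = ClassificationModel("bert", "bert-base-uncased", num_labels=2, use_cuda=False)
model.train_model(train_df)

predictions, raw_outputs = model.predict(["My package never arrived."])
print(predictions)
```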
Deep learning for smiles win/loss evaluation.
The Simple Transformers library is built on HuggingFace's Transformers BERT library. The goal of question answering is to find the answer to a question given the question and an accompanying context. The predicted answer is either a span of text from the context or an empty string (indicating that the question cannot be answered from the context).
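Concretely, a minimal sketch of that workflow with Simple Transformers (the checkpoint and example data are assumptions; a real model would be fine-tuned first with model.train_model):

```python
from simpletransformers.question_answering import QuestionAnsweringModel

model = QuestionAnsweringModel("bert", "bert-base-cased", use_cuda=False)

# Each item pairs a context with one or more questions, SQuAD-style.
to_predict = [
    {
        "context": "Simple Transformers is built on HuggingFace's Transformers library.",
        "qas": [{"id": "0", "question": "What is Simple Transformers built on?"}],
    }
]

# Returns candidate answer spans; an empty string marks an unanswerable question.
answers, probabilities = model.predict(to_predict)
print(answers)
```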
Deep learning for win/loss evaluation of chess positions in FEN notation.
This library is based on simpletransformers and HuggingFace's Transformers library.
Weakly supervised fake news detection with RoBERTa, XLNet, ALBERT, XGBoost, and Logistic Regression classifiers.
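As a sketch of the classical half of such a stack (an assumption about the setup, not the repository's exact method), weak labels can train TF-IDF baselines like Logistic Regression and XGBoost:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from xgboost import XGBClassifier

texts = ["Shocking cure doctors don't want you to see!", "Parliament passed the budget today."]
weak_labels = [1, 0]  # 1 = fake, 0 = real; assumed output of weak-supervision heuristics

features = TfidfVectorizer().fit_transform(texts)
for clf in (LogisticRegression(), XGBClassifier()):
    clf.fit(features, weak_labels)
    print(type(clf).__name__, clf.predict(features))
```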
Small application to test some functionality of OpenAI's Generative Pre-trained Transformer (GPT-2) model.
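For reference, the kind of functionality such an application might exercise can be reproduced in a few lines with the public GPT-2 weights (a generic sketch, not this application's code):

```python
from transformers import pipeline

# Load the publicly released GPT-2 checkpoint and sample a short continuation.
generator = pipeline("text-generation", model="gpt2")
print(generator("The transformer architecture", max_new_tokens=30)[0]["generated_text"])
```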
Backend for MindPeers ML (NLP) models such as Sentiment Analysis & Keyword Extraction (including Feedback Loops)
Text classification code used to identify spam messages for an in-class Kaggle competition, using the Simple Transformers library. Placed 2nd with a 0.98 accuracy score.
Fork of Simple Transformers that supports T5TokenizerFast and umT5.
🏷️ Multi-label classification with BERT.
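A minimal sketch of BERT multi-label classification with Simple Transformers; the multilingual checkpoint and label vectors are assumptions for illustration:

```python
import pandas as pd
from simpletransformers.classification import MultiLabelClassificationModel

# Each label is a 0/1 vector; here three hypothetical tags per text.
train_df = pd.DataFrame(
    [["Great phone, terrible battery.", [1, 1, 0]]],
    columns=["text", "labels"],
)

model = MultiLabelClassificationModel(
    "bert", "bert-base-multilingual-cased", num_labels=3, use_cuda=False
)
model.train_model(train_df)

predictions, raw_outputs = model.predict(["Fast shipping, great price."])
print(predictions)  # one 0/1 vector per input
```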