
parameters #5

Open
WinnaYuan opened this issue Dec 30, 2019 · 3 comments
Comments

@WinnaYuan
Hi, can you share all the parameters of the clinical fine-tuned model? Thank you!

@AndriyMulyar (Owner) commented Dec 30, 2019 via email

The parameters are shared in the clinical model download. Look in your ~/.cache directory. Andriy

@WinnaYuan (Author)

Thanks for your reply. I only see the parameters below, which don't include the learning rate, number of epochs, or other training settings; did you use the defaults? Another question: where can I download the MED data? Thank you very much!
{
"attention_probs_dropout_prob": 0.1,
"hidden_act": "gelu",
"hidden_dropout_prob": 0.1,
"hidden_size": 768,
"initializer_range": 0.02,
"intermediate_size": 3072,
"max_position_embeddings": 512,
"num_attention_heads": 12,
"num_hidden_layers": 12,
"type_vocab_size": 2,
"vocab_size": 28996
}
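(Editorial note, a hedged sketch: the JSON above is the model's architecture config only; a vocab_size of 28996 matches BERT's cased WordPiece vocabulary, and the layer/head/hidden sizes match BERT-base. Training hyperparameters such as learning rate and epochs are not stored in this file, which is why they don't appear here.)

```python
# Sketch: parse the architecture config posted above and compute a
# rough scale figure. The interpretation (BERT-base-cased shape) is an
# assumption based on the values, not confirmed by the repository owner.
import json

config = json.loads("""{
  "hidden_size": 768,
  "num_hidden_layers": 12,
  "num_attention_heads": 12,
  "intermediate_size": 3072,
  "vocab_size": 28996
}""")

# Weights in one layer's feed-forward block (two projections, biases excluded):
ffn_params = 2 * config["hidden_size"] * config["intermediate_size"]
print(ffn_params)  # 4718592 per layer
```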

@AndriyMulyar (Owner)

I never got around to publishing the training parameters because I didn't have time to clean them up. They were identical to the pytorch-transformers sentence-pair CLS fine-tuning defaults.
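(Editorial note, a hedged sketch: the defaults referenced here are presumably those of the pytorch-transformers example fine-tuning script (`run_glue.py`) as of late 2019. The values below are that script's documented defaults at the time; they are an assumption, not values confirmed by the author for this model.)

```python
# Hypothetical reconstruction of the pytorch-transformers (circa 2019)
# run_glue.py default hyperparameters for sentence-pair classification
# fine-tuning. Not confirmed for this specific model.
finetune_defaults = {
    "learning_rate": 5e-5,            # peak AdamW learning rate
    "num_train_epochs": 3,
    "per_gpu_train_batch_size": 8,
    "max_seq_length": 128,            # pairs truncated/padded to this length
    "warmup_steps": 0,
    "weight_decay": 0.0,
    "adam_epsilon": 1e-8,
    "max_grad_norm": 1.0,             # gradient clipping threshold
}

print(finetune_defaults["num_train_epochs"])  # 3
```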
