Add BERT as a new task
We can reuse the existing transformer code from the translation NLP task here: #33
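For reference, here is a minimal sketch of how a generic transformer encoder can be repurposed for BERT-style masked-LM pretraining. It assumes a PyTorch codebase (not confirmed in this issue), and all class and parameter names are hypothetical rather than taken from the repo:

```python
import torch
import torch.nn as nn

class BertMLM(nn.Module):
    """BERT-style masked-LM model built on a stock transformer encoder."""
    def __init__(self, vocab_size, d_model=768, nhead=12, num_layers=12, max_len=512):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)  # learned positions, as in BERT
        layer = nn.TransformerEncoderLayer(
            d_model, nhead, dim_feedforward=4 * d_model, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.mlm_head = nn.Linear(d_model, vocab_size)

    def forward(self, input_ids, attention_mask=None):
        pos = torch.arange(input_ids.size(1), device=input_ids.device)
        x = self.tok_emb(input_ids) + self.pos_emb(pos)
        # attention_mask: 1 for real tokens, 0 for padding
        pad_mask = None if attention_mask is None else attention_mask == 0
        h = self.encoder(x, src_key_padding_mask=pad_mask)
        return self.mlm_head(h)  # (batch, seq_len, vocab_size) logits
```

The main difference from the translation setup is that only the encoder stack is needed; the decoder and cross-attention can be dropped.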
@guptaprkhr @jbcdnr @mpagli, can you point us to the best code to start from, including the pre-processing pipeline? And later, please have another look here so we can get a draft of standard data-parallel training running.
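On the pre-processing side, the core BERT-specific step is token masking. Below is a self-contained sketch of the standard 80/10/10 masking scheme from the BERT paper, simplified in that it does not exclude special or padding tokens; the function name and signature are hypothetical:

```python
import torch

def mask_tokens(input_ids, vocab_size, mask_token_id, mask_prob=0.15):
    """Select ~15% of tokens as prediction targets; of those, 80% become
    [MASK], 10% become a random token, 10% stay unchanged."""
    labels = input_ids.clone()
    selected = torch.rand(input_ids.shape) < mask_prob
    labels[~selected] = -100  # -100 is ignored by cross-entropy loss

    input_ids = input_ids.clone()
    # 80% of selected tokens -> [MASK]
    to_mask = selected & (torch.rand(input_ids.shape) < 0.8)
    input_ids[to_mask] = mask_token_id
    # half of the remaining 20% -> random token (i.e. 10% of selected)
    to_rand = selected & ~to_mask & (torch.rand(input_ids.shape) < 0.5)
    input_ids[to_rand] = torch.randint(vocab_size, input_ids.shape)[to_rand]
    # the final 10% of selected tokens are left unchanged
    return input_ids, labels
```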
We should define a very light target on the BERT training loss at first, so we have something to iterate on quickly.
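To make that light goal concrete, here is a rough sketch of a standard data-parallel training loop that stops once an averaged loss target is reached. It assumes PyTorch DistributedDataParallel launched via torchrun, plus the `BertMLM` and `mask_tokens` sketches above; `target_loss=4.0` is an arbitrary placeholder, not a measured number:

```python
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, DistributedSampler

def train_mlm(model, dataset, vocab_size, mask_token_id,
              target_loss=4.0, lr=1e-4, batch_size=32, max_epochs=10):
    # Standard single-node data-parallel setup; torchrun provides the env vars.
    dist.init_process_group("nccl")
    rank = dist.get_rank()
    device = torch.device(f"cuda:{rank}")
    model = DDP(model.to(device), device_ids=[rank])

    sampler = DistributedSampler(dataset)  # shards the data across workers
    loader = DataLoader(dataset, batch_size=batch_size, sampler=sampler)
    opt = torch.optim.AdamW(model.parameters(), lr=lr)
    loss_fn = torch.nn.CrossEntropyLoss()  # ignores label -100 by default

    for epoch in range(max_epochs):
        sampler.set_epoch(epoch)  # reshuffle shards each epoch
        for batch in loader:  # batch: (batch_size, seq_len) token ids
            inputs, labels = mask_tokens(batch, vocab_size, mask_token_id)
            logits = model(inputs.to(device))
            loss = loss_fn(logits.view(-1, vocab_size),
                           labels.to(device).view(-1))
            opt.zero_grad()
            loss.backward()  # DDP all-reduces gradients here
            opt.step()
        # Average the last loss across workers; stop at the light target.
        avg = loss.detach().clone()
        dist.all_reduce(avg)
        if avg.item() / dist.get_world_size() < target_loss:
            break
    dist.destroy_process_group()
```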
For comparison, MLPerf currently only has a TensorFlow BERT implementation.