Better and Faster NLP Model Training: 10 Proven Techniques

About this Event

In this talk, we will explore some of the latest and most effective techniques for training natural language processing models. We will cover methods proven in Kaggle competitions, such as adversarial weight perturbation (AWP), stochastic weight averaging (SWA), and pseudo-labeling, as well as best practices learned from W&B power users. With these techniques, you’ll be able to train your models faster and achieve better performance. Whether you’re a seasoned machine learning engineer or just getting started, you’ll come away with valuable insights and practical tips for improving your NLP model training.
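
To make one of these ideas concrete ahead of the talk, here is a minimal sketch of stochastic weight averaging (SWA) using PyTorch’s built-in torch.optim.swa_utils. The tiny model, synthetic data, and hyperparameters (the swa_start epoch and the SWA learning rate) are placeholder assumptions for illustration, not the exact setup from the talk.

```python
import torch
import torch.nn as nn
from torch.optim.swa_utils import AveragedModel, SWALR, update_bn
from torch.utils.data import DataLoader, TensorDataset

# Toy stand-in for an NLP classifier head; in practice this would be
# a transformer-based model (placeholder assumption).
model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 2))
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# Synthetic data just to make the loop runnable.
x = torch.randn(256, 128)
y = torch.randint(0, 2, (256,))
loader = DataLoader(TensorDataset(x, y), batch_size=32, shuffle=True)

swa_model = AveragedModel(model)                # running average of the weights
swa_scheduler = SWALR(optimizer, swa_lr=0.005)  # constant LR during averaging
swa_start = 5                                   # epoch when averaging begins (assumption)

for epoch in range(10):
    for xb, yb in loader:
        optimizer.zero_grad()
        loss = criterion(model(xb), yb)
        loss.backward()
        optimizer.step()
    if epoch >= swa_start:
        swa_model.update_parameters(model)  # fold current weights into the average
        swa_scheduler.step()

# Recompute BatchNorm statistics for the averaged model (a no-op for this toy
# network, but required when the model contains BatchNorm layers).
update_bn(loader, swa_model)
```

At inference time you evaluate swa_model rather than model; averaging weights from the later epochs tends to find a flatter region of the loss surface and often generalizes better than the final checkpoint alone.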

Speakers

Darek Kłeczek

Darek Kłeczek is a Machine Learning Engineer at Weights & Biases, where he leads the W&B education program. Previously, he applied machine learning across supply chain, manufacturing, legal, and commercial use cases, and worked on operationalizing machine learning at P&G. Darek contributed the first Polish versions of the BERT and GPT language models and has been a leader in the Polish NLP community. He’s also a Kaggle Competitions Grandmaster.