Dec 6, 2021

ML6 x Berlin Machine Learning Group present: Dutch GPT-2 & Efficient Transformers

Join another Meetup organized by ML6 at the AI Campus!

This time the event is co-hosted together with the Berlin Machine Learning Group and includes two exciting talks in the NLP domain:

Tune in to learn how ML6 trained their own Dutch GPT-2 using Transformers and to get to know what the future of efficient transformers may hold!

IMPORTANT: This Meetup takes place as a hybrid event.

- To join on-site, please sign up on this event's page. Please note: only people who are vaccinated against or have recovered from Covid-19 (2G) are allowed access to the AI Campus.

- To join online, please sign up here

Talk 1: How we trained our own Dutch GPT-2 using Transformers

Text generation and the GPT series of Transformer models have been a hot topic ever since the public got to know their astounding power. The latest GPT-3 can mimic a human conversation to an almost scary degree. At ML6 we trained and open-sourced our own Dutch GPT-2 model using Huggingface’s Transformers library. This talk addresses the questions: How do you do that? What kind of data do you need, and how do you access enough compute power to actually train the model?
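As a rough illustration of the Transformers API the talk revolves around, here is a minimal sketch that builds a tiny, randomly initialized GPT-2 and computes a causal language-modeling loss. The configuration values are purely illustrative and not ML6's actual training setup, which uses the full GPT-2 architecture and a large Dutch corpus:

```python
import torch
from transformers import GPT2Config, GPT2LMHeadModel

# Tiny illustrative config; the real Dutch GPT-2 uses the standard GPT-2 sizes.
config = GPT2Config(vocab_size=1000, n_positions=64, n_embd=64, n_layer=2, n_head=2)
model = GPT2LMHeadModel(config)

# In real training these ids would come from a tokenizer run over Dutch text.
input_ids = torch.randint(0, 1000, (1, 16))

# Passing labels makes the model return the causal LM (next-token) loss,
# which a Trainer or custom loop would then backpropagate.
out = model(input_ids, labels=input_ids)
print(out.loss.item())
```

The same `GPT2LMHeadModel` class, loaded from a pretrained checkpoint instead of a fresh config, is what you would fine-tune or use for generation.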


Thomas Vrancken is an ML Engineer at ML6 with a background in strategy consulting and research. Thomas is passionate about NLP, data science and making real-world impact with creative Machine Learning applications. Staying versatile is his credo, and attending as many events with interesting talks as possible is one way he achieves it.

Talk 2: Efficient Transformers

In recent years we’ve seen an exponential increase in the size of pre-trained transformer-based models, and although they push the state of the art to ever greater heights, they also become increasingly cumbersome to work with. This has prompted researchers around the world to look for more efficient alternatives to the classic transformer architecture and has spawned an interesting new research direction. In this talk, we will look at some of the interesting ideas in this area and at what the future may hold for these transformer-based models.
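One family of ideas in this research direction (the talk's exact content may differ) is kernelized or "linear" attention, as in the Linear Transformer and Performer papers: by replacing the softmax with a feature map φ and exploiting associativity, φ(Q)(φ(K)ᵀV) avoids materializing the n×n attention matrix, reducing cost in sequence length from quadratic to linear. A minimal NumPy sketch, with an illustrative feature map:

```python
import numpy as np

def standard_attention(Q, K, V):
    # O(n^2) in sequence length n: materializes the full n x n score matrix.
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

def linear_attention(Q, K, V, phi=lambda x: np.maximum(x, 0.0) + 1e-6):
    # O(n) in sequence length: associativity lets us compute phi(K)^T V first,
    # a small d x d matrix, instead of the n x n attention matrix.
    # phi here is an illustrative positive feature map, not the one from any
    # specific paper.
    KV = phi(K).T @ V                    # (d, d)
    norm = phi(Q) @ phi(K).sum(axis=0)   # (n,) normalization per query
    return (phi(Q) @ KV) / norm[:, None]

n, d = 128, 16
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
print(standard_attention(Q, K, V).shape, linear_attention(Q, K, V).shape)
```

Both functions return an (n, d) output, but the linear variant never allocates the (n, n) weight matrix, which is what makes long sequences tractable.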


Mats Uytterhoeven is an ML Engineer at ML6 interested in a broad range of topics. His main focus is on NLP and unsupervised learning problems. When he's not hacking on machine learning code, he likes playing tennis, reading (non-fiction), and traveling. He believes machine learning can have a positive impact on people's lives and loves working on projects that can make a difference.

Europe’s Hub for AI.