
Learner Reviews & Feedback for Generative AI Language Modeling with Transformers by IBM

4.5
stars
81 ratings

About the Course

This course provides a practical introduction to using transformer-based models for natural language processing (NLP) applications. You will learn to build and train models for text classification using encoder-based architectures like Bidirectional Encoder Representations from Transformers (BERT), and explore core concepts such as positional encoding, word embeddings, and attention mechanisms. The course covers multi-head attention, self-attention, and causal language modeling with GPT for tasks like text generation and translation. You will gain hands-on experience implementing transformer models in PyTorch, including pretraining strategies such as masked language modeling (MLM) and next sentence prediction (NSP). Through guided labs, you’ll apply encoder and decoder models to real-world scenarios. This course is designed for learners interested in generative AI engineering and requires prior knowledge of Python, PyTorch, and machine learning. Enroll now to build your skills in NLP with transformers!
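For a sense of the kind of hands-on work the description refers to, here is a minimal PyTorch sketch (not taken from the course materials) that combines token and positional embeddings with a transformer encoder layer, run once with full self-attention (BERT-style) and once with a causal mask (GPT-style). All sizes and variable names are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Toy sizes -- illustrative values, not from the course.
vocab_size, d_model, n_heads, max_len = 1000, 64, 4, 32

# Token embeddings plus learned positional embeddings.
tok_emb = nn.Embedding(vocab_size, d_model)
pos_emb = nn.Embedding(max_len, d_model)

# One encoder layer wraps multi-head self-attention and a feed-forward block.
encoder_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads, batch_first=True)

# A batch of 2 sequences of 10 token IDs.
tokens = torch.randint(0, vocab_size, (2, 10))
positions = torch.arange(10).unsqueeze(0)
x = tok_emb(tokens) + pos_emb(positions)

# Encoder-style (BERT-like) pass: every token attends to every other token.
encoded = encoder_layer(x)

# Decoder-style (GPT-like) pass: an upper-triangular mask of -inf blocks attention
# to future positions, which is what makes autoregressive text generation possible.
causal_mask = torch.triu(torch.full((10, 10), float("-inf")), diagonal=1)
causal = encoder_layer(x, src_mask=causal_mask)

print(encoded.shape, causal.shape)  # both: torch.Size([2, 10, 64])
```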

Top reviews

LW

Nov 10, 2024

Get more familiar with transformers and their applications in language.

AB

Dec 30, 2024

This course gave me a broad picture of what transformers can do.


1 - 15 of 15 Reviews for Generative AI Language Modeling with Transformers

By Ohad H

Feb 3, 2025

The narration is poor. Instead of an expert lecturer, a narrator reads the text without understanding its meaning. Many fundamental terms are left unexplained.

By raul v r

Oct 11, 2024

Once again, great content but not-so-great documentation (no printable cheat sheets, no slides, etc.). Documentation is essential for reviewing course content later. Alas!

By Deleted A

Oct 22, 2024

It is an excellent specialisation, except that the pace of the speaker is very fast. It is difficult to understand, and it sounds very artificial.

By Alexandre T

Feb 22, 2025

Fantastic class, but it takes WAY MORE TIME than what is reported, unless you just don't do the labs or only skim them at a high level. Going in depth in the labs and doing the work necessary to understand all the key concepts and code will easily take you 3-4x more time, depending on your current level of expertise. Example: a 30-minute lab is 15 A4 pages long when you print it. Now imagine all these pages contain key notions and code. Superb class, but the required time is highly underestimated (like most of the IBM Generative AI Engineering certification).

By vikky b

Nov 17, 2024

Needs assistance from humans, which seems lacking; a coach can give guidance, but not to the extent of a human touch.

By XUETING W

Dec 2, 2024

Good content but I truly cannot understand...

By Makhlouf M

Apr 7, 2025

Great course! Clear explanations, solid structure, and just the right mix of theory and hands-on content. Thanks to Dr. Joseph Santarcangelo, Fateme Akbari, and Kang Wang for making complex concepts so accessible. Really enjoyed it and learned a lot about transformers and GenAI!

By LO W

Nov 10, 2024

Get more familiar with transformers and their applications in language.

By Ana A B

Dec 30, 2024

This course gave me a broad picture of what transformers can do.

By Muhammad A

Jan 18, 2025

Exceptional course, and all the labs are industry-related.

By 329_SUDIP C

Dec 2, 2024

Nice course.

By Purva T

Jul 26, 2024

Good.

By Francesco D G

Dec 15, 2024

Maybe a little chaotic. Slides should be available.

By Mykola K

Apr 23, 2025

The course is interesting and challenging. The lab assignments should be divided into more parts. There's too much code to grasp in a single lab session, making it difficult to follow the task. A major drawback is the extremely long training time of the model in the lab work. For example, BERT took over an hour to train. During that time, it's easy to lose interest in continuing the course. Either the model needs to be simplified to train faster, or the performance of the environment running the Jupyter Notebook should be significantly improved.

By Mohammad D

Mar 26, 2025

It's one of the worst courses I've seen. I couldn't understand anything from the explanations and had to resort to external resources to understand the topic (and I am already someone with an ML background).