5 Comments

Wow, Lightning does make it super easy to employ mixed precision and distributed training! When I wrote about mixed precision in 2020, when PyTorch had just released its AMP (automatic mixed precision) module, it was a mess trying to autocast the layers and remember their precisions.

Enjoyed the read! Thanks!
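
For context, a minimal sketch of the contrast described above, assuming Lightning 2.x and a recent PyTorch; LitModel, train_loader, model, and batch are placeholder names:

    import torch
    import lightning as L

    # With Lightning, mixed precision is a single Trainer argument:
    trainer = L.Trainer(precision="bf16-mixed", devices=1)
    # trainer.fit(LitModel(), train_loader)  # placeholder module and dataloader

    # In plain PyTorch, you wrap the forward/loss computation in autocast yourself:
    # with torch.autocast(device_type="cuda", dtype=torch.bfloat16):
    #     loss = model(batch)  # placeholder model and batch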

Aug 24, 2023 · Liked by Sebastian Raschka, PhD

Clear and straight to the point, thanks a lot!


Any specific reasons to prefer bfloat16 to float16?
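
One way to see the trade-off behind that question, assuming a recent PyTorch install: bfloat16 keeps float32's exponent range but has fewer mantissa bits, whereas float16 offers finer precision over a much smaller range and therefore typically relies on loss scaling to avoid overflow and underflow. A quick check:

    import torch

    # Compare dynamic range (max) and relative precision (eps) of the 16-bit formats
    for dtype in (torch.float16, torch.bfloat16):
        info = torch.finfo(dtype)
        print(dtype, "max:", info.max, "eps:", info.eps)

    # float16:  max ~65504,  eps ~9.8e-4  (finer precision, narrow range)
    # bfloat16: max ~3.4e38, eps ~7.8e-3  (float32-like range, coarser precision)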
