GPU out of memory

Hi,
for the past two days I have been getting this error while trying to train my model:

" RuntimeError: CUDA out of memory. Tried to allocate 20.00 MiB (GPU 0; 4.75 GiB total capacity; 3.23 GiB already allocated; 1.50 MiB free; 3.26 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF "
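The message itself points at `PYTORCH_CUDA_ALLOC_CONF`. For reference, a minimal sketch of how that allocator option can be set before PyTorch's first CUDA allocation (the `128` value here is only an illustrative choice, not a recommendation from the error message):

```python
import os

# PYTORCH_CUDA_ALLOC_CONF must be set before PyTorch makes its first CUDA
# allocation, i.e. at the very top of the training script or in the shell
# environment. max_split_size_mb limits the size of splittable cached blocks,
# which can reduce fragmentation when reserved memory >> allocated memory.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"  # example value

print(os.environ["PYTORCH_CUDA_ALLOC_CONF"])
```

Equivalently, it can be exported in the shell before launching training: `export PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:128`.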

Does this mean I no longer have a GPU allocated and should request one again, or is it something else?
Please help.

Related to request: GPU request 3g_20g