Access to the GPU partition

Hello

I would like to run a process on 2 GPUs (150 GB required)
under my account, for my llama_concept_chain project.

All the best,
Nicolas
nicolas.turenne@ird.fr

Hello,

As far as I understand, it is not possible to run your process if it requires 150 GB of GPU RAM.
We currently have NVIDIA Ampere A100 GPUs with 40 GB of RAM each.

Is the 150 GB requirement related to GPU RAM or system RAM?

Best regards

Subject: Clarification on GPU RAM Requirement

Hello,

Thank you for your response.

The 150 GB requirement is specifically related to GPU RAM. The LLaMA server is capable of spreading its memory usage across multiple GPUs, so I believe this would translate to needing 5 GPUs with 40 GB each (four 40 GB cards give 160 GB, which only barely covers the requirement, so a fifth would provide headroom).
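For illustration, here is roughly what that multi-GPU sharding would look like on my side. This is a minimal sketch assuming a Hugging Face transformers setup; the checkpoint name, prompt, and per-card memory cap are placeholders, not the exact values used in the project:

```python
# Minimal sketch: shard a large Llama-family model across all GPUs visible to the job.
# Assumes the "transformers" and "accelerate" packages; the model name is illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "meta-llama/Llama-2-70b-hf"  # placeholder checkpoint (~140 GB in fp16)

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)

# device_map="auto" lets the library place layers on every GPU the scheduler exposes;
# max_memory caps each card slightly below 40 GB to stay within an A100-40GB limit.
model = AutoModelForCausalLM.from_pretrained(
    MODEL_NAME,
    torch_dtype=torch.float16,
    device_map="auto",
    max_memory={i: "38GiB" for i in range(torch.cuda.device_count())},
)

prompt = "Summarize the concept chain extracted from the corpus:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=64)[0]))
```

With this kind of layer-wise sharding, each GPU only holds a slice of the weights, which is why the total requirement can be split across several 40 GB cards rather than needing a single large one.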

Would this configuration be possible?

Best regards,
Nicolas

Hello Nicolas,

Unfortunately, we are unable to fulfill your request, as our GPU resources are limited (see "Cluster description" in the IFB Core Cluster Documentation).
We invite you to explore other bioinformatics platforms (Platforms - IFB), regional computing centers (Les mésocentres en France), or even GENCI (https://www.edari.fr/).

Best regards,

P.S. Out of curiosity, are you writing your messages using an LLM assistant?