P1 NGC HPC

The P1 NGC HPC is hosted at the National Genome Centre and is designed for secure data processing with GDPR compliance. It provides a secure environment for handling sensitive data and research projects.

Requirements: PhD or higher (exceptions may apply), valid Danish university email, and registered P1 affiliation.

Each project must bring documentation (a signed Data Processing Agreement will do) that explicitly names NGC as a data processor and permits the data to be stored there. If the project poses a high risk to the individuals whose personal data is processed, a Data Protection Impact Assessment (DPIA) is also required.

  1. P1 Affiliation Form
    Before accessing the P1 NGC HPC, you must first register to become a member of P1.
  2. Complete and sign the NGC user creation form and forward it to the Compute Coordinator to request access.

You will be added to the NGC Slack channel once you gain access.

The P1 NGC HPC is an air-gapped system requiring:

  • Multi-factor authentication (MFA)
  • A client for accessing the remote VM entrypoint

Specific access instructions will follow after registration, but you can expect to use SFTP for transferring data into the system.

Once connected with the Omnissa Remote Desktop client, you can access the login node using ssh -X <your-username>@login.
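
Inside the remote desktop, a matching entry in ~/.ssh/config saves retyping the flags each time. This is a sketch: the host alias is arbitrary, and the hostname/username are taken from the command above.

Host login
HostName login
User <your-username>
ForwardX11 yes

With this entry, ssh login is equivalent to the full command.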

Important hosts:
- https://console.cld076.vmc/status # Internal Status page
- cld076-0004.cld076.vmc # Internal (SFTP)
- sftp.spc.ngc.dk # External Ingress/Egress (SFTP)

You can transfer sensitive and large data to the cluster using SFTP under the supervision of an admin. You will need to request access to the /data/upload directory as this acts as a data gateway.

It is then recommended to set up an SSH entry in your ~/.ssh/config:

Host ngc
HostName sftp.spc.ngc.dk
Port 6433
User <your-username>_sftp
HostKeyAlgorithms +ssh-rsa

From here you can connect with sftp ngc, then put files into the /data/upload directory from the outside and get them from the inside. During the transfer period the data is accessible only to you and the admins.
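
Uploads can also be scripted non-interactively with sftp's batch mode, using the ngc entry from ~/.ssh/config. This is a sketch; the batch filename is arbitrary and the dataset path is the example used below.

```shell
# Build a batch file of SFTP commands (filename is arbitrary).
cat > /tmp/ngc-upload.batch <<'EOF'
put ~/datasets/ISLES-2022.zip /data/upload/
bye
EOF
# Run it against the "ngc" host alias from ~/.ssh/config:
#   sftp -b /tmp/ngc-upload.batch ngc
```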

As an alternative to SFTP, you can use scp to transfer the data to the cluster, which is arguably easier:

scp ~/datasets/ISLES-2022.zip ngc:/data/upload/

Then inside the cluster you can transfer the data to your home directory using:

scp <your-username>_sftp@cld076-0004.cld076.vmc:/data/upload/ISLES-2022.zip ~/datasets/
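
Because the data passes through the upload gateway in two hops, it is worth verifying integrity on the far side. A minimal sketch with sha256sum, using a stand-in file for illustration:

```shell
# Demo with a stand-in file; in practice checksum your real dataset (e.g. ISLES-2022.zip).
echo "demo payload" > dataset.zip
sha256sum dataset.zip > dataset.zip.sha256   # run before uploading; transfer both files
sha256sum -c dataset.zip.sha256              # run on the cluster after download; prints "dataset.zip: OK"
```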

For miscellaneous and small data such as personal dotfiles or source code you can:

  • Transfer via SFTP (tedious and requires admin supervision)
  • Use the internal server running a simple GitHub proxy/tunnel for public repositories.
  • Mount a host directory using the Omnissa Remote Desktop client (must be enabled by an NGC admin)
  • SSH into the admin node (if you have access to it)
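
If the internal GitHub proxy simply forwards HTTP requests, a git URL rewrite rule can route public clones through it transparently. This is a sketch only: the proxy hostname github-proxy.internal is a placeholder, not the real address — ask an NGC admin for the actual endpoint.

```shell
# Rewrite public GitHub URLs to go via the (hypothetical) internal proxy.
git config --global url."http://github-proxy.internal/".insteadOf "https://github.com/"
# Subsequent clones of public repositories are then fetched through the proxy:
#   git clone https://github.com/owner/repo.git
```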

Your host clipboard works into the remote desktop client, but not the other way around. You can take screenshots of the remote display during the session.

Technical Support

For technical issues, contact the NGC HPC Support Team

Policy Support

For policy issues, contact compute-governance-p1@aicentre.dk

General Questions

Use the #compute or #ask-compute channels on P1 Slack

Compute Coordinator

Contact bstja@dtu.dk for general or technical compute-related questions

System Overview
  • Air-gapped system for secure data processing
  • GDPR compliant infrastructure
  • Secure storage solutions
  • Specific hardware details available upon access approval
  • Scheduling Environment: SLURM
  • Resource allocation details provided during onboarding