diff --git a/external/exchange-channels/atlas-hpc/atlas-hpc.md b/external/exchange-channels/atlas-hpc/atlas-hpc.md
index dd860116921f7f34c50b7c61b13fa0fa4ffbc873..22582e23e4c5e8747a2b590ca7c382995154e769 100644
--- a/external/exchange-channels/atlas-hpc/atlas-hpc.md
+++ b/external/exchange-channels/atlas-hpc/atlas-hpc.md
@@ -12,9 +12,19 @@ redirect_from:
 # Data transfer between Atlas and UL HPC Clusters
 A recommended storage pattern is to keep the master copy of data on Atlas (in the project folder) and to store data on the UL HPC Clusters only temporarily, for the practical duration of the computational analysis. The derived data and results should afterwards be transferred back to Atlas. This How-to Card describes the different methods to transfer data between Atlas and the UL HPC Clusters. The three recommended methods are:
 
-1. [Via laptop with ```scp``` or ```rsync```](1. Via laptop using scp or rsync)
-2. [Via dedicated Virtual Machine (VM)](2. Via dedicated Virtual Machine (VM) using rsync)
-3. [Via Large File Transfer (LFT)](3. Via Large File Transfer (LFT)) 
+1. [Via laptop with `scp` or `rsync`](#via-laptop-using-scp-or-rsync)
+2. [Via dedicated Virtual Machine (VM)](#via-dedicated-virtual-machine-using-rsync)
+3. [Via Large File Transfer (LFT)](#via-large-file-transfer)
 
 Please refer to the dedicated knowledge bases to see how to [connect to UL HPC Clusters](https://hpc-docs.uni.lu/connect/access/) and to [mount Atlas](https://service.uni.lu/sp?id=kb_article_view&sysparm_article=KB0010233).
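+
+For example, with Atlas mounted on the laptop, method 1 could look like the following sketch (the mount point `~/atlas`, the SSH alias `hpc-cluster`, and the `PROJECT` paths are placeholders; adjust them to your own setup):
+
+```bash
+# Copy the raw data from the locally mounted Atlas share to the cluster
+# (paths and the hpc-cluster SSH alias are placeholders, not fixed names).
+rsync -avz ~/atlas/PROJECT/raw-data/ hpc-cluster:/scratch/users/$USER/PROJECT/raw-data/
+
+# Once the analysis is finished, transfer the derived data back to Atlas.
+rsync -avz hpc-cluster:/scratch/users/$USER/PROJECT/results/ ~/atlas/PROJECT/results/
+```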