Transferring a file from a Linux server usually means using a secure copy tool from your terminal. If you’ve ever wondered how to download a file from a Linux server without messing up permissions or losing data, this guide walks you through every step.
You don’t need to be a sysadmin to copy files from a remote Linux machine. With a few commands, you can grab logs, backups, or configuration files in seconds.
Let’s start with the most common method, then explore alternatives for different situations.
Understanding File Transfer Basics
Before you run any commands, you need three things: the server’s IP address or hostname, your username, and the file path. Without these, you can’t connect.
Most Linux servers use SSH (Secure Shell) for encrypted connections. This means your file transfer is secure from eavesdropping.
You’ll also need a terminal emulator on your local machine. On Linux or macOS, the built-in terminal works fine. On Windows, you can use PowerShell, WSL, or a tool like PuTTY.
How To Download File From Linux Server
The fastest way to download a single file is with the scp command. SCP stands for Secure Copy Protocol, and it uses SSH under the hood.
Here’s the basic syntax:
scp username@server_ip:/path/to/remote/file /path/to/local/destination
For example, to download a file named backup.tar.gz from a server with IP 192.168.1.100 to your current directory:
scp user@192.168.1.100:/home/user/backup.tar.gz .
The dot at the end means “current directory.” You can also specify a full path like /home/you/downloads/.
You’ll be prompted for the remote user’s password. Once entered, the file transfers over an encrypted channel.
Using Rsync For Large Or Repeated Transfers
If you need to download large files or sync directories, rsync is better than SCP. It only copies the parts that changed, and with the --partial flag it can resume interrupted transfers.
Basic rsync command:
rsync -avz username@server_ip:/path/to/remote/file /path/to/local/destination
The -a flag preserves permissions and timestamps. -v gives verbose output, and -z compresses data during transfer.
For example:
rsync -avz user@192.168.1.100:/var/log/app.log ./logs/
Rsync is ideal for downloading database dumps or log files that change often.
Downloading Multiple Files With SCP
You can download multiple files by listing them or using wildcards. For instance:
scp user@server:/home/user/{file1.txt,file2.log} /local/path/
To download all .conf files from a directory:
scp user@server:/etc/nginx/*.conf /local/path/
Be careful with wildcards: quote the remote path (for example 'user@server:/etc/nginx/*.conf') so your local shell passes the pattern through to the server instead of trying to expand it itself.
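A quick local demonstration of why quoting matters; the same expansion rules apply to the remote path you hand to scp (the directory and file names here are made up):

```shell
# Create a throwaway directory with two .conf files (hypothetical names).
demo=$(mktemp -d)
touch "$demo/a.conf" "$demo/b.conf"
cd "$demo"
echo *.conf      # unquoted: the local shell expands the glob first
echo '*.conf'    # quoted: the literal pattern survives to the command
```

The first echo prints the matching file names; the second prints the literal *.conf. In practice this means writing scp 'user@server:/etc/nginx/*.conf' /local/path/ so the pattern reaches the remote shell intact.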
Using SFTP For Interactive Downloads
SFTP (SSH File Transfer Protocol) gives you an interactive session. It’s useful when you don’t know the exact file path or want to browse directories.
Start an SFTP session:
sftp username@server_ip
Once connected, you can navigate remote directories with cd and list files with ls. To download a file, use:
get remote_file local_destination
For example:
get /var/log/syslog ./
You can also download entire directories with the -r flag:
get -r /home/user/project/ ./
Type bye or exit to close the session.
Downloading With Wget From A Web Server
If the file is served over HTTP or HTTPS (like from a web server or cloud storage), wget is a simple option. This works even without SSH access.
Basic wget command:
wget http://example.com/file.zip
To save with a different name:
wget -O myfile.zip http://example.com/file.zip
Wget supports resuming downloads with the -c flag:
wget -c http://example.com/largefile.iso
This is handy for large files whose downloads might time out.
Using Curl For More Control
Curl is another command-line tool that supports many protocols including HTTP, FTP, and SFTP. It’s more flexible than wget for authentication and headers.
Download a file with curl:
curl -O http://example.com/file.zip
The -O flag saves the file with its original name. To save with a custom name, use -o:
curl -o custom_name.zip http://example.com/file.zip
For SFTP downloads with curl:
curl -u username sftp://server_ip/path/to/file -o local_file
You’ll be prompted for the password. Curl also supports resume with -C -.
Automating Downloads With Scripts
If you download the same files regularly, write a bash script. This saves time and reduces errors.
Example script download_backup.sh:
#!/bin/bash
SERVER="user@192.168.1.100"
REMOTE_PATH="/backups/daily.tar.gz"
LOCAL_PATH="/home/you/backups/"
scp "$SERVER:$REMOTE_PATH" "$LOCAL_PATH"
echo "Download complete"
Make it executable with chmod +x download_backup.sh and run it with ./download_backup.sh.
You can add SSH keys to avoid password prompts. Generate a key pair with ssh-keygen, then copy the public key to the server with ssh-copy-id.
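A minimal sketch of that key setup; the key path, comment, and server address are assumptions, so adjust them for your environment:

```shell
# Generate a dedicated Ed25519 key pair for unattended downloads (no passphrase).
mkdir -p "$HOME/.ssh"
ssh-keygen -t ed25519 -N "" -q -f "$HOME/.ssh/backup_key" -C "backup automation"
# Authorize it on the server (prompts for the password one last time):
#   ssh-copy-id -i "$HOME/.ssh/backup_key.pub" user@192.168.1.100
# Afterwards, point scp at the key explicitly:
#   scp -i "$HOME/.ssh/backup_key" user@192.168.1.100:/backups/daily.tar.gz .
```

An empty passphrase (-N "") keeps automation non-interactive; for interactive use, set a passphrase and let ssh-agent cache it.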
Handling Permission Issues
Sometimes you get “Permission denied” errors. This usually means your remote user doesn’t have read access to the file, or the local directory isn’t writable.
Check remote file permissions with ls -l /path/to/file. If needed, ask the server admin to adjust permissions or use sudo on the remote side (though SCP doesn’t support sudo directly).
For local write issues, ensure your destination folder exists and you own it. Use mkdir -p /local/path if needed.
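A small pre-flight check along those lines (the destination path is hypothetical):

```shell
# Create the destination if missing, then confirm it is writable.
DEST="$HOME/downloads/logs"   # hypothetical local destination
mkdir -p "$DEST"
if [ -w "$DEST" ]; then
    echo "ok: $DEST is writable"
else
    echo "error: cannot write to $DEST" >&2
fi
```

Running this before a long transfer avoids discovering a permissions problem only after the download fails.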
Downloading Over Slow Connections
If your internet is unreliable, use rsync or wget -c for resume capability. You can also compress files before transfer to reduce size.
Compress a directory on the server:
tar -czf archive.tar.gz /path/to/directory
Then download the single archive file. This is faster than transferring many small files.
Another trick is to use screen or tmux on the server to keep the transfer running even if your SSH session disconnects.
Graphical Alternatives For Beginners
If you prefer a GUI, tools like FileZilla or WinSCP support SFTP and SCP. They show both local and remote directories side by side.
With FileZilla, enter the server details (host, username, password, port 22) and drag files from the remote panel to your local folder.
These tools are great for non-technical users or when you need to browse multiple directories quickly.
Security Considerations
Always use encrypted protocols like SCP, SFTP, or HTTPS. Avoid plain FTP or Telnet, which send passwords in cleartext.
Disable password-based SSH access if possible and use SSH keys instead. This prevents brute-force attacks.
Also, verify file integrity after download. Use md5sum or sha256sum on both ends to compare checksums.
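A local sketch of that checksum workflow (the file name is made up). On a real transfer you would run the first sha256sum on the server, download both files, and run the -c verification step locally:

```shell
# Stand-in for the downloaded file.
echo "example payload" > daily.tar.gz
# On the server you would run: sha256sum daily.tar.gz > daily.tar.gz.sha256
sha256sum daily.tar.gz > daily.tar.gz.sha256
# After downloading both files, verify the hash matches:
sha256sum -c daily.tar.gz.sha256    # prints "daily.tar.gz: OK" on success
```

A mismatch here means the file was corrupted or truncated in transit and should be downloaded again.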
For sensitive files, consider encrypting them with GPG before transfer.
Common Errors And Fixes
Here are typical problems you might encounter:
- Connection refused: the SSH server isn’t running or a firewall blocks port 22. Check with systemctl status sshd.
- Host key verification failed: the server’s fingerprint changed. Remove the old key with ssh-keygen -R server_ip.
- Permission denied (publickey): your SSH key isn’t authorized. Use ssh-copy-id or check key permissions.
- File not found: double-check the remote path. Use ls in SFTP to confirm.
- Disk full: local storage is exhausted. Free up space or choose another destination.
Most issues are simple to fix once you know what to look for.
Downloading From A Server Without SSH
If you only have web access (like a cPanel or shared hosting), look for a file manager in the control panel. Many hosting providers offer one-click downloads.
Alternatively, if the server runs a web server, you can place files in the document root and download via HTTP. This is less secure but works in a pinch.
For cloud servers (AWS, DigitalOcean, etc.), you can use their web consoles or CLI tools like aws s3 cp for S3 buckets.
Using Tmux Or Screen For Long Transfers
Large downloads can take hours. If your SSH session drops, the transfer fails. Use tmux or screen to keep it running.
Start a tmux session:
tmux new -s download
Run your SCP or rsync command inside. Detach with Ctrl+B then D. Reattach later with tmux attach -t download.
This is a lifesaver for overnight transfers.
Downloading With Parallel Transfers
For many small files, parallel transfers speed things up. Tools like parallel-scp (from the pssh suite) can help.
Standard rsync is single-threaded, so a simpler approach is to run several rsync processes side by side on different subdirectories. For massive datasets, consider lftp, which supports segmented parallel downloads.
Example with lftp:
lftp -e "pget -n 4 /path/to/file; exit" sftp://user@server
This downloads using 4 connections simultaneously.
Checking Transfer Progress
Rsync doesn’t show per-file progress by default (SCP does display a progress meter). Add --progress to rsync, or use pv (pipe viewer) to monitor speed on a raw SSH stream.
With rsync:
rsync -avz --progress user@server:/path/to/file /local/path
SCP has no pv hook, but you can stream the file over plain SSH and pipe it through pv:
ssh user@server "cat /path/to/remote/file" | pv > local_file
This shows the transfer rate as the data streams in (add -s with the file’s size for an ETA).
Resuming Interrupted Downloads
If a download fails halfway, don’t restart from scratch. Use rsync’s resume capability or wget’s -c flag.
For SCP, there’s no built-in resume. Switch to rsync for large files. Rsync checks existing data and only transfers missing parts.
Example resume with rsync:
rsync -avz --partial user@server:/path/to/file /local/path
The --partial flag keeps partially transferred files so a rerun can pick up where it left off; -P is a convenient shorthand for --partial --progress.
Downloading Entire Directories
To download a whole folder, use the -r flag with SCP, or rsync with -a (archive mode implies recursion).
SCP recursive:
scp -r user@server:/remote/dir /local/path
Rsync recursive (default):
rsync -avz user@server:/remote/dir /local/path
Rsync is better for directories because it preserves structure and skips unchanged files. Watch the trailing slash: /remote/dir copies the directory itself, while /remote/dir/ copies only its contents.
Using SSH Config For Shorter Commands
If you connect to the same server often, create an SSH config file. Edit ~/.ssh/config and add:
Host myserver
HostName 192.168.1.100
User myuser
Port 22
Then you can download with:
scp myserver:/path/to/file .
This saves typing and reduces errors.
Downloading Files With Different Ports
Some servers use non-standard SSH ports. Specify the port with -P (capital) for SCP; rsync has no SSH port flag of its own, so pass it through the -e option.
SCP example:
scp -P 2222 user@server:/file .
Rsync example:
rsync -avz -e "ssh -p 2222" user@server:/file .
SFTP also accepts port with -oPort=2222.
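If you use that server often, you can pin the port in ~/.ssh/config instead; scp, sftp, and rsync-over-ssh will all pick it up. The host alias, address, and port here are assumptions:

```
Host altport-server
    HostName 192.168.1.100
    Port 2222
    User myuser
```

Then scp altport-server:/file . works without any port flag.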
Downloading As Root Or Sudo
If you need to download a file owned by root, you can’t use SCP directly as a normal user. One workaround is to copy the file to a world-readable location first, or use sudo on the server to change permissions.
Alternatively, enable root SSH login (not recommended for security) or stream the file through sudo:
ssh user@server "sudo cat /etc/shadow" > local_shadow
This streams the file content through SSH and saves it locally. It only works cleanly when sudo doesn’t need to prompt for a password; forcing a terminal with -t lets sudo prompt, but terminal processing can mangle binary data, so reserve this trick for text files.
Using File Compression Before Download
Compressing files reduces transfer time and bandwidth. Use gzip, bzip2, or xz on the server before downloading.
Compress and download in one step with pipes:
ssh user@server "gzip -c /path/to/largefile" > localfile.gz
This avoids storing the compressed file on the server.
For directories, tar and compress:
ssh user@server "tar czf - /path/to/dir" > localdir.tar.gz
The - in tar sends output to stdout, which gets piped to your local file.
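You can sanity-check such a piped archive locally before deleting anything on the server. A sketch with throwaway names:

```shell
# Build a sample directory and archive it via stdout, mirroring the ssh pipe.
mkdir -p demo_dir
echo "hello" > demo_dir/file.txt
tar czf - demo_dir > localdir.tar.gz
# List the contents without extracting; errors here mean a corrupt archive.
tar tzf localdir.tar.gz
```

If tar tzf lists all the expected paths without complaint, the archive survived the pipe intact.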
Downloading From Multiple Servers
If you manage several servers, automate downloads with a loop in bash:
for server in server1 server2 server3; do
scp user@$server:/path/to/log /local/logs/$server.log
done
This downloads the same file from each server into separate local files.
You can also use parallel for concurrent downloads:
parallel -j 3 scp user@{}:/file ./ ::: server1 server2 server3
Downloading With Authentication Keys
Using SSH keys eliminates password prompts, making automation easier. Generate a key pair:
ssh-keygen -t rsa -b 4096