Linux Networking Tools: curl, wget, netstat, ss
Key Insights
- curl excels at API testing and HTTP debugging with granular control over requests, while wget is purpose-built for recursive downloads and mirroring entire websites
- ss has replaced netstat as the modern standard for socket inspection: it’s faster, more feature-rich, and actively maintained in current Linux distributions
- Mastering these four tools eliminates the need for GUI applications in most networking scenarios and enables powerful automation through shell scripting
Why These Tools Matter
Every developer and system administrator encounters networking issues. Whether you’re debugging why an API returns 500 errors, investigating which process is hogging port 8080, or downloading datasets for analysis, these four command-line tools form the foundation of Linux network troubleshooting.
Unlike bloated GUI applications, these utilities are lightweight, scriptable, and available on virtually every Linux distribution. They’re the difference between spending hours clicking through interfaces and solving problems in seconds with a well-crafted command.
curl: The Swiss Army Knife for HTTP Requests
curl is your go-to tool for anything HTTP-related. It supports dozens of protocols, but its real power lies in testing APIs and debugging web services with precise control over every aspect of the request.
Basic Requests and Header Inspection
Start with the fundamentals—viewing response headers alongside the body:
# GET request showing response headers
curl -i https://api.github.com/users/octocat
# Only show headers (HEAD request)
curl -I https://api.github.com/users/octocat
# Verbose output for debugging
curl -v https://api.example.com/endpoint
The -v flag is invaluable for troubleshooting. It shows the complete request/response exchange, including TLS handshake details and redirects.
POST Requests with JSON
Modern APIs expect JSON payloads. Here’s how to send them properly:
# POST with JSON data
curl -X POST https://api.example.com/users \
-H "Content-Type: application/json" \
-d '{"name":"John Doe","email":"john@example.com"}'
# POST from a file
curl -X POST https://api.example.com/users \
-H "Content-Type: application/json" \
-d @user.json
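When the payload comes from shell variables, quoting gets tricky. Here is a minimal sketch of assembling the JSON body before handing it to curl; the values and the endpoint are placeholders, and printf does no JSON escaping, so a production script should build the object with jq instead:

```shell
# Build a JSON body from shell variables (placeholder values).
# Note: printf does no JSON escaping -- prefer jq in real scripts.
name="John Doe"
email="john@example.com"
payload=$(printf '{"name":"%s","email":"%s"}' "$name" "$email")
echo "$payload"
# Then send it:
#   curl -X POST https://api.example.com/users \
#        -H "Content-Type: application/json" -d "$payload"
```

For values that may contain quotes or newlines, `jq -n --arg name "$name" '{name: $name}'` builds the object with proper escaping.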
Authentication Patterns
APIs use various authentication schemes. curl handles them all:
# Basic authentication
curl -u username:password https://api.example.com/protected
# Bearer token (common for OAuth2/JWT)
curl -H "Authorization: Bearer eyJhbGc..." https://api.example.com/data
# API key in header
curl -H "X-API-Key: your-key-here" https://api.example.com/endpoint
Advanced Features
# Follow redirects automatically
curl -L https://short.url/abc123
# Save and send cookies (maintain session)
curl -c cookies.txt -b cookies.txt https://example.com/login
# Measure response time
curl -w "\nTime: %{time_total}s\n" -o /dev/null -s https://api.example.com
# Custom request method
curl -X PATCH https://api.example.com/users/123 \
-H "Content-Type: application/json" \
-d '{"status":"active"}'
The -w flag with format strings is particularly useful for performance testing. You can extract timing breakdowns, HTTP status codes, and more.
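A common performance-testing pattern is to repeat the request and average time_total. The collection loop needs a live endpoint, so this sketch feeds awk three made-up sample timings instead; the commented loop shows how they would really be gathered (URL is a placeholder):

```shell
# Gather timings (sketch; URL is a placeholder):
#   for i in 1 2 3; do
#     curl -s -o /dev/null -w '%{time_total}\n' https://api.example.com
#   done > timings.txt
# Sample values standing in for real measurements:
printf '0.182\n0.175\n0.190\n' > timings.txt
# Average them with awk
awk '{ sum += $1 } END { printf "avg=%.3fs over %d runs\n", sum/NR, NR }' timings.txt
```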
wget: Recursive Downloads and Web Mirroring
While curl focuses on protocol flexibility, wget specializes in downloading files and entire websites. It’s designed for reliability and automation.
Basic File Downloads
# Simple download (preserves filename)
wget https://example.com/file.tar.gz
# Save with different name
wget -O custom-name.tar.gz https://example.com/file.tar.gz
# Download multiple files
wget -i urls.txt
Recursive Website Downloads
This is where wget truly shines:
# Mirror a website (limit depth to avoid infinite recursion)
wget --mirror --convert-links --page-requisites \
--no-parent -P ./local-copy https://example.com
# Recursive download with depth limit
wget -r -l 3 https://example.com/docs/
# Download only specific file types
wget -r -A "*.pdf,*.doc" https://example.com/resources/
The --convert-links option rewrites URLs in downloaded HTML to point to local files, making the mirror browsable offline.
Background Downloads and Resuming
# Background download with logging
wget -b -o download.log https://example.com/large-file.iso
# Resume interrupted download
wget -c https://example.com/large-file.iso
# Retry failed downloads
wget --tries=10 --retry-connrefused https://unreliable-server.com/file
Rate Limiting and Bandwidth Control
Be a good internet citizen when scraping or mirroring:
# Limit download speed to 500KB/s
wget --limit-rate=500k https://example.com/file.tar.gz
# Wait between requests (in seconds)
wget -w 2 --random-wait -r https://example.com/
The --random-wait option adds randomness to the wait time, making automated downloads less detectable.
netstat vs ss: Network Socket Statistics
netstat has been the traditional tool for inspecting network connections, but ss (socket statistics) is faster and more powerful. Most modern distributions ship with ss as the default.
Why ss is Better
ss queries socket information directly from the kernel over the netlink interface instead of parsing the /proc/net files the way netstat does, making it significantly faster, especially on systems with thousands of connections. It also provides more detailed information and better filtering capabilities.
Listing Listening Ports
# netstat: show all listening TCP ports
netstat -tlnp
# ss: equivalent command (faster)
ss -tlnp
# Breakdown: -t (TCP), -l (listening), -n (numeric), -p (processes)
Established Connections
# Show all established TCP connections with process info
ss -tnp state established
# netstat equivalent
netstat -tnp | grep ESTABLISHED
Filtering by Protocol and Port
# Show UDP sockets (-a includes unconnected/listening ones)
ss -uanp
# Find what's using port 8080
ss -tlnp | grep :8080
# Show all connections to specific IP
ss dst 192.168.1.100
Advanced ss Filtering
ss has a powerful filtering syntax:
# Show connections in specific states
ss state time-wait
ss state syn-sent
# Filter by source or destination port
ss sport = :80
ss dport = :443
# Combine filters
ss -tn state established '( dport = :443 or sport = :443 )'
Network Statistics
# Display summary statistics
ss -s
# Show the routing table (ss has no routing view; -r makes ss resolve hostnames)
ip route # or the legacy 'netstat -r'
# TCP socket memory usage
ss -tm
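The `ss -s` summary is plain text, so it scripts easily. A sketch of pulling the TCP socket count out of it; the here-doc stands in for real output, and the exact layout varies between iproute2 versions:

```shell
# Sample 'ss -s' output (invented numbers; layout varies by version):
cat <<'EOF' > ss-summary.txt
Total: 184
TCP:   12 (estab 5, closed 2, orphaned 0, timewait 2)
EOF
# Second field of the TCP line is the socket count
awk '/^TCP:/ { print $2 }' ss-summary.txt
```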
Practical Use Cases and Workflows
Debugging API Connectivity
When an application can’t reach an API:
# 1. Test basic connectivity
curl -v https://api.example.com/health
# 2. Check if the local service is listening
ss -tlnp | grep :3000
# 3. Verify no firewall blocking
curl -I --connect-timeout 5 https://api.example.com
# 4. Test with specific headers the app uses
curl -H "User-Agent: MyApp/1.0" -v https://api.example.com
Monitoring Application Network Activity
# Watch connections in real-time (updates every 2 seconds)
watch -n 2 'ss -tnp | grep myapp'
# Count connections by state
ss -tan | awk 'NR > 1 {print $1}' | sort | uniq -c # NR > 1 skips the header row
# Find the process with most connections
ss -tnp | awk '{print $6}' | sort | uniq -c | sort -rn | head
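These counting pipelines are easy to test offline, since the awk/sort/uniq stages only ever see text. A sketch run against canned `ss -tan` output (the here-doc lines are invented sample connections; `NR > 1` drops the header row):

```shell
# Canned 'ss -tan' output (invented sample connections):
cat <<'EOF' > ss-sample.txt
State      Recv-Q Send-Q Local Address:Port  Peer Address:Port
ESTAB      0      0      10.0.0.5:44321      93.184.216.34:443
ESTAB      0      0      10.0.0.5:44322      93.184.216.34:443
TIME-WAIT  0      0      10.0.0.5:44310      93.184.216.34:80
EOF
# Count by state, skipping the header row
awk 'NR > 1 { print $1 }' ss-sample.txt | sort | uniq -c | sort -rn
```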
Automated Health Checks
#!/bin/bash
# Simple health check script
ENDPOINT="https://api.example.com/health"
TIMEOUT=5
HTTP_CODE=$(curl -o /dev/null -s -w "%{http_code}" --max-time "$TIMEOUT" "$ENDPOINT")
if [ "$HTTP_CODE" -eq 200 ]; then
echo "Service healthy"
exit 0
else
echo "Service unhealthy (HTTP $HTTP_CODE)"
exit 1
fi
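The same check can be factored into a function so the logic is testable without a live endpoint. In this sketch curl is stubbed out with a shell function that always reports 200; delete the stub to run it for real (the endpoint URL is a placeholder):

```shell
# Health check factored into a function (sketch).
check_health() {
  code=$(curl -o /dev/null -s -w '%{http_code}' --max-time 5 "$1")
  if [ "$code" -eq 200 ]; then
    echo "Service healthy"
  else
    echo "Service unhealthy (HTTP $code)"
    return 1
  fi
}
# Stub standing in for real curl -- remove in production:
curl() { printf '200'; }
check_health https://api.example.com/health
```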
Troubleshooting Port Conflicts
# Find what's using a port before starting your app
ss -tlnp | grep :8080
# If occupied, kill the process
kill $(ss -tlnp | grep :8080 | awk '{print $6}' | cut -d',' -f2 | cut -d'=' -f2)
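The cut pipeline depends on exact field positions, which shift between ss versions. Matching the `pid=` token with sed is more robust; the sample line below is invented `ss -tlnp` output for a hypothetical process named myapp:

```shell
# Sample line standing in for 'ss -tlnp | grep :8080' output:
line='LISTEN 0 128 0.0.0.0:8080 0.0.0.0:* users:(("myapp",pid=4242,fd=6))'
# Pull out the number after 'pid='
pid=$(printf '%s\n' "$line" | sed -n 's/.*pid=\([0-9]*\).*/\1/p')
echo "$pid"
# Inspect before killing:
#   ps -p "$pid" && kill "$pid"
```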
Best Practices and Tips
Security Considerations: Never put credentials directly in commands—they’ll appear in shell history. Use environment variables or config files:
# Bad: credentials in history
curl -u admin:password123 https://api.example.com
# Better: use environment variables
curl -u "$API_USER:$API_PASS" https://api.example.com
# Best: use .netrc file for curl
# Create ~/.netrc with: machine api.example.com login admin password secret
curl -n https://api.example.com
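The .netrc entry itself looks like this. The sketch writes to a local file so it can be inspected safely; the real file belongs at ~/.netrc with your own hostname and credentials, and permissions tightened to 0600:

```shell
# Example .netrc entry (placeholder host and credentials).
# The real file goes in ~/.netrc -- written locally here for safety.
netrc=./netrc.example
cat > "$netrc" <<'EOF'
machine api.example.com
  login admin
  password secret
EOF
chmod 600 "$netrc"   # keep credentials out of other users' reach
# curl -n reads ~/.netrc automatically; --netrc-file points elsewhere:
#   curl --netrc-file "$netrc" https://api.example.com
```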
Performance: When downloading many files with wget, use --wait to avoid overwhelming servers. For curl performance testing, use --compressed to test with gzip encoding like browsers do.
Debugging: Always start with -v (verbose) in curl when troubleshooting. For ss, use -p to see process information—it requires root privileges but is essential for debugging.
Scripting: Both curl and wget return meaningful exit codes. Check them in scripts:
if wget -q --spider https://example.com; then
echo "URL exists"
else
echo "URL not reachable"
fi
Quick Reference
curl essentials: -i (headers), -v (verbose), -X (method), -H (header), -d (data), -L (follow redirects)
wget essentials: -r (recursive), -l (depth), -c (continue), -b (background), -A (accept types)
ss essentials: -t (TCP), -u (UDP), -l (listening), -n (numeric), -p (processes), state (filter)
These four tools cover 90% of networking tasks you’ll encounter. Master them, and you’ll troubleshoot faster than colleagues fumbling with GUI tools. Keep these commands in your shell history or a personal cheat sheet—you’ll reference them constantly.