SHA-256 vs MD5 vs SHA-3: Choosing the Right Hash Function
A practical guide to understanding cryptographic hash functions, their security trade-offs, and how to pick the right one for your project — with real-world lessons and JavaScript code you can use today.
Table of Contents
- Introduction: Why Hash Functions Matter
- What Is a Hash Function?
- MD5: The Legacy Standard
- SHA-1: The Deprecated Middle Ground
- SHA-256 (SHA-2 Family)
- SHA-3 (Keccak)
- Head-to-Head Comparison Table
- Hash Functions in Practice
- Password Hashing: A Special Case
- Implementation in JavaScript
- Choosing the Right Hash Function
- Conclusion
Introduction: Why Hash Functions Matter for Every Developer
Hash functions are one of those foundational building blocks that quietly hold together most of the software we write. If you have ever committed code to Git, verified a file download, authenticated a user, or stored a password, you have used a hash function — whether you realized it or not.
Early in my career, I treated hashing as a black box. I would call md5() or sha256(), get back a hex string, and move on. It worked. But then I joined a team that was migrating a legacy PHP application to Node.js, and we discovered thousands of user passwords stored as unsalted MD5 hashes. The security audit that followed taught me more about hash functions in two weeks than I had learned in the previous three years. It also taught me that choosing the wrong hash function — or using the right one incorrectly — can have real consequences for your users.
This guide is the article I wish I had read back then. We will break down the major hash function families — MD5, SHA-1, SHA-256, and SHA-3 — compare their security properties, walk through real-world use cases, and write actual JavaScript code. By the end, you will have a clear mental model for choosing the right hash function in any scenario you encounter as a working developer.
What Is a Hash Function?
A cryptographic hash function takes an arbitrary amount of input data and produces a fixed-size output, commonly called a digest or hash. Think of it as a fingerprint for data: no matter whether you feed in a single byte or a ten-gigabyte file, you always get back a digest of the same length.
For a hash function to be useful in security contexts, it needs to satisfy several key properties:
One-Way (Pre-image Resistance)
Given a hash output h, it should be computationally infeasible to find any input m such that hash(m) = h. In other words, you cannot reverse-engineer the original data from the hash. This is what makes hashing fundamentally different from encryption — encryption is designed to be reversed with a key, while hashing is a one-way trip.
Deterministic Output
The same input always produces the same output. Hash the string "hello" with SHA-256 a million times, and you will get 2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824 every single time. This determinism is what makes hash functions useful for verification — you can compare two hashes to know whether the underlying data is identical without comparing the data itself.
Fixed-Length Digest
MD5 always produces 128 bits (32 hex characters). SHA-256 always produces 256 bits (64 hex characters). SHA-512 always produces 512 bits (128 hex characters). This fixed output length holds regardless of input size. A one-character input and a one-terabyte input both produce a digest of the same length.
Avalanche Effect
A tiny change in the input should produce a dramatically different output. Change a single bit in your input, and roughly half the bits in the output should flip. This property prevents attackers from learning anything about the input by studying small variations in the output. For example, the SHA-256 hash of "hello" and "hellp" differ completely — you cannot look at the two hashes and tell that the inputs were nearly identical.
Collision Resistance
It should be computationally infeasible to find two different inputs m1 and m2 such that hash(m1) = hash(m2). Collisions mathematically must exist (you are mapping an infinite input space to a finite output space, after all), but finding one should require astronomical computational effort. When researchers manage to find collisions efficiently, the hash function is considered broken for security purposes. This is exactly what happened to MD5 and SHA-1.
MD5: The Legacy Standard
History and Design
MD5 (Message Digest Algorithm 5) was designed by Ronald Rivest in 1991 as an improvement over the earlier MD4. It produces a 128-bit (16-byte) hash value, typically rendered as a 32-character hexadecimal string. MD5 quickly became the de facto standard for checksums, file verification, and even password storage. For over a decade, it was the hash function you reached for by default.
The algorithm works by processing input in 512-bit blocks through four rounds of operations, using a Merkle-Damgård construction. It was fast, simple to implement, and widely supported across every language and platform. Libraries shipped with it. Tutorials taught it first. It was everywhere.
The Fall of MD5
Cracks in MD5's armor appeared as early as 1996, when Hans Dobbertin found collisions in the compression function. But the death blow came in 2004, when Xiaoyun Wang and her team demonstrated practical collision attacks against the full MD5 algorithm. They could generate two different inputs with the same MD5 hash in under an hour on a standard computer.
This was not a theoretical concern. In 2008, researchers demonstrated the Rogue CA attack at the Chaos Communication Congress. They used MD5 collisions to create a fraudulent Certificate Authority certificate that browsers would trust. They effectively showed that MD5's weakness could be exploited to impersonate any website on the internet with a valid-looking HTTPS certificate. The attack prompted certificate authorities to rapidly move away from MD5-based signatures.
By 2012, the Flame malware — widely attributed to state-level actors — used an MD5 collision to forge a Microsoft Windows Update certificate. This made MD5's insecurity not just a theoretical problem, but an active weapon in cyber warfare.
Why You Should Stop Using MD5 for Security
Today, generating an MD5 collision takes seconds on commodity hardware. Tools like HashClash can produce chosen-prefix collisions with practical effort. If you are using MD5 for digital signatures, certificate verification, or password hashing, you are building on a broken foundation.
I have personally reviewed codebases where MD5 was still used for password hashing in 2024. In one case, the developer's reasoning was, "It is hashed, so it is secure." But an unsalted MD5 hash of a common password can be reversed via rainbow table lookup in milliseconds. Sites like CrackStation maintain lookup tables with billions of pre-computed MD5 hashes.
Valid Remaining Use Cases
MD5 is not completely useless. In non-security contexts where you only need a quick checksum for data integrity (not against adversarial tampering), MD5 is still acceptable:
- Non-adversarial checksums: Verifying that a file was not corrupted during transfer (when you trust the source).
- Cache key generation: Creating short, unique keys for caching systems where collision is unlikely and not a security risk.
- Deduplication: Quickly identifying duplicate files in a trusted environment.
- Legacy system interoperability: When you must interface with old systems that only support MD5.
SHA-1: The Deprecated Middle Ground
Design and Adoption
SHA-1 (Secure Hash Algorithm 1) was published by NIST in 1995 and produces a 160-bit (20-byte) hash value. It was designed by the NSA as a replacement for SHA-0, which had undisclosed weaknesses. SHA-1 became the dominant hash function for digital signatures, SSL/TLS certificates, and source control systems like Git.
With its 160-bit output, SHA-1 offered significantly more collision resistance than MD5's 128 bits. A brute-force collision attack against SHA-1 would theoretically require 2⁸⁰ operations (by the birthday paradox), compared to 2⁶⁴ for MD5. For many years, this was considered sufficient.
The SHAttered Attack
Theoretical attacks against SHA-1 began appearing in 2005, when Wang (the same researcher who broke MD5) published a method that could find collisions in 2⁶⁹ operations — far below the theoretical 2⁸⁰. Over the following years, the attack complexity continued to drop.
In February 2017, a team from Google and CWI Amsterdam published the SHAttered attack. They produced two different PDF files with identical SHA-1 hashes. The attack required approximately 2⁶³ SHA-1 computations — 6,500 years of single-CPU computation, but achievable in practice using cloud computing resources. Google estimated the attack cost at roughly $110,000 in cloud compute at the time, a price well within reach of any moderately funded attacker.
They demonstrated the attack by creating two PDFs that displayed different content but shared the same SHA-1 hash. The implications were clear: SHA-1 could no longer be trusted for any security-sensitive application.
Browser and Industry Deprecation
The deprecation of SHA-1 in the browser ecosystem followed a gradual timeline:
- 2014: Google Chrome began showing warnings for SHA-1 certificates expiring after 2016.
- 2016: Microsoft Edge, Mozilla Firefox, and Google Chrome stopped accepting new SHA-1 certificates.
- 2017: All major browsers fully rejected SHA-1 signed TLS certificates.
- 2020: Git began planning migration to SHA-256 for object hashing (still ongoing as of 2026, due to backward compatibility requirements).
The lesson from SHA-1 is sobering: a hash function can go from "slightly weakened" to "practically broken" faster than most organizations can migrate away from it. If your systems still depend on SHA-1 for security, the time to migrate was years ago.
SHA-256 (SHA-2 Family)
Design and Background
SHA-256 is a member of the SHA-2 family, published by NIST in 2001. Like SHA-1, the SHA-2 algorithms were designed by the NSA. The "256" refers to the bit length of the output digest — 256 bits, rendered as a 64-character hexadecimal string. SHA-2 uses a Merkle-Damgård construction, similar to MD5 and SHA-1, but with a significantly more complex compression function and larger internal state.
SHA-2 Family Variants
SHA-256 is the most commonly used member, but the SHA-2 family includes several variants:
- SHA-224: A truncated version of SHA-256 with a different initialization vector. Produces 224-bit output. Rarely used in practice.
- SHA-256: The workhorse. 256-bit output, 128-bit security level against collision attacks. Used in TLS, code signing, Bitcoin, and countless applications.
- SHA-384: A truncated version of SHA-512. Produces 384-bit output. Used in some government and financial applications.
- SHA-512: 512-bit output, 256-bit collision resistance. Actually faster than SHA-256 on 64-bit processors because it works with 64-bit words natively.
- SHA-512/224, SHA-512/256: Truncated versions of SHA-512, added later for efficiency on 64-bit platforms while matching shorter output lengths.
Why SHA-256 Is the Current Standard
SHA-256 hits the sweet spot of security, performance, and compatibility. No practical attacks have been found against any SHA-2 variant. The best known attacks reduce the theoretical security margin only slightly, and they remain computationally infeasible by a vast margin. As of 2026, SHA-256 provides a 128-bit security level against collision attacks, which means an attacker would need approximately 2¹²⁸ operations to find a collision — a number beyond the reach of any conceivable computer.
SHA-256 is mandated by numerous standards and regulations. It is used in TLS 1.2 and 1.3 cipher suites, X.509 certificate signing, JWT (JSON Web Token) signatures (HS256, RS256), DNSSEC, and many government security standards. If you need a hash function and do not have a specific reason to choose something else, SHA-256 is the right default.
The Bitcoin Connection
Bitcoin's proof-of-work mining algorithm uses double-SHA-256 (SHA-256 applied twice in sequence). This means the Bitcoin mining network is, in effect, the largest distributed SHA-256 computation in history. As of 2026, the Bitcoin network performs hundreds of exahashes per second — that is, hundreds of quintillions of SHA-256 computations every second. Despite this astronomical volume of computation, no SHA-256 collision has ever been found. This serves as a compelling real-world testament to SHA-256's strength.
Performance Characteristics
SHA-256 is slower than MD5 — roughly 3 to 4 times slower on the same hardware. For most applications, this difference is negligible. However, in performance-critical scenarios (hashing terabytes of data, high-throughput network equipment), the difference matters. Modern CPUs include hardware acceleration for SHA-256 via Intel SHA Extensions (available since Goldmont and Ice Lake architectures), which dramatically close the performance gap. On hardware with these extensions, SHA-256 can actually approach MD5 speeds.
If you need maximum throughput on 64-bit systems, consider SHA-512, which is often faster than SHA-256 because it operates on 64-bit words. On 32-bit systems, SHA-256 outperforms SHA-512 for the opposite reason.
SHA-3 (Keccak)
Origins: The NIST Competition
SHA-3 was born out of prudence. After watching MD5 and SHA-1 fall to collision attacks, the cryptographic community worried that SHA-2 — which shares the Merkle-Damgård construction with its broken predecessors — might eventually succumb to a similar class of attacks. In 2007, NIST launched a public competition to select a new hash function standard, one built on fundamentally different principles.
The competition attracted 64 submissions from cryptographers worldwide. After five years of analysis, public review, and multiple elimination rounds, NIST selected the Keccak algorithm (designed by Guido Bertoni, Joan Daemen, Michaël Peeters, and Gilles Van Assche) as SHA-3 in October 2012. The standard was formally published as FIPS 202 in August 2015.
Sponge Construction vs. Merkle-Damgård
What makes SHA-3 fundamentally different from SHA-2 is its internal construction. While MD5, SHA-1, and SHA-2 all use the Merkle-Damgård construction (processing input blocks sequentially and chaining them through a compression function), SHA-3 uses a sponge construction.
In a sponge construction, the algorithm maintains a large internal state (1,600 bits for Keccak). Input data is "absorbed" into this state in blocks, and the output is "squeezed" out of the state. This fundamentally different design means that any attack strategy developed against Merkle-Damgård hash functions (like length-extension attacks) simply does not apply to SHA-3.
This is the real reason SHA-3 was selected: it provides algorithmic diversity. If a catastrophic flaw were ever discovered in the Merkle-Damgård construction itself (as opposed to a specific algorithm), SHA-3 would be completely unaffected, and vice versa. Having two fundamentally different standards is a form of cryptographic insurance.
SHA-3 Variants
- SHA3-224: 224-bit output, 112-bit collision resistance.
- SHA3-256: 256-bit output, 128-bit collision resistance. The most common SHA-3 variant.
- SHA3-384: 384-bit output, 192-bit collision resistance.
- SHA3-512: 512-bit output, 256-bit collision resistance.
SHAKE: Extensible Output Functions
One of SHA-3's unique features is the SHAKE family of extendable-output functions (XOFs). Unlike traditional hash functions that produce a fixed-length digest, SHAKE128 and SHAKE256 can produce output of any desired length.
SHAKE128 provides 128-bit security, and SHAKE256 provides 256-bit security, regardless of the output length you request. This makes them extremely flexible for use cases like key derivation, randomness generation, and protocols that need variable-length outputs. SHAKE is increasingly used in post-quantum cryptographic schemes like CRYSTALS-Dilithium and CRYSTALS-Kyber.
When to Use SHA-3 Over SHA-2
For most applications today, SHA-256 remains the better choice due to wider support, hardware acceleration, and equivalent security. However, there are compelling reasons to choose SHA-3:
- Defense in depth: If your threat model requires resilience against a theoretical break in Merkle-Damgård constructions.
- Length-extension attack resistance: SHA-3 is inherently immune to length-extension attacks, while SHA-256 requires HMAC wrapping to prevent them.
- Variable-length output: If you need SHAKE's extensible output functionality.
- Post-quantum readiness: SHA-3 and SHAKE are already integrated into many post-quantum cryptographic standards.
- Regulatory compliance: Some newer standards mandate or prefer SHA-3.
Current Adoption Status
SHA-3 adoption has been gradual. OpenSSL added SHA-3 support in version 1.1.1 (2018). Node.js supports SHA-3 via its crypto module. The Web Crypto API in browsers does not natively support SHA-3 as of 2026, which limits its use in client-side web applications without a polyfill or WebAssembly implementation. Python's hashlib has supported SHA-3 since Python 3.6. Most major cloud KMS services now support SHA-3 for signing operations.
Head-to-Head Comparison Table
| Property | MD5 | SHA-1 | SHA-256 | SHA-3-256 |
|---|---|---|---|---|
| Output Size | 128 bits (32 hex) | 160 bits (40 hex) | 256 bits (64 hex) | 256 bits (64 hex) |
| Internal State | 128 bits | 160 bits | 256 bits | 1600 bits |
| Block Size | 512 bits | 512 bits | 512 bits | 1088 bits (rate) |
| Construction | Merkle-Damgård | Merkle-Damgård | Merkle-Damgård | Sponge (Keccak) |
| Year Published | 1991 | 1995 | 2001 | 2015 (FIPS 202) |
| Collision Resistance | Broken (seconds) | Broken (2⁶³) | Secure (2¹²⁸) | Secure (2¹²⁸) |
| Pre-image Resistance | Weakened | Weakened | Secure (2²⁵⁶) | Secure (2²⁵⁶) |
| Length Extension | Vulnerable | Vulnerable | Vulnerable | Immune |
| Speed (relative) | Fastest | Fast | Moderate | Moderate-Slow |
| HW Acceleration | No | ARM (SHA1) | Intel SHA-NI, ARM | Limited |
| Use in 2026 | Checksums only | Avoid entirely | Default choice | Future-proofing |
Hash Functions in Practice
Understanding the theory is only half the battle. Here is how hash functions are actually used in the real-world systems you build every day.
File Integrity Verification
When you download an ISO image or a software package, the distributor often provides a SHA-256 checksum alongside the file. After downloading, you hash the file locally and compare your result to the published checksum. If they match, the file has not been corrupted or tampered with during transit.
This is one of the most straightforward applications of hashing. Package managers like npm, pip, and apt all use hash verification internally. Docker image layers are addressed by their SHA-256 digest. Every time you run docker pull, the client verifies each layer's hash against the manifest.
HMAC for Message Authentication
An HMAC (Hash-based Message Authentication Code) combines a hash function with a secret key to produce an authentication tag. Unlike a plain hash, an HMAC proves that the message was created (or approved) by someone who holds the secret key. HMAC-SHA256 is the standard choice for API authentication, webhook verification (GitHub, Stripe, and Slack all use it), and JWT signature algorithms like HS256.
The HMAC construction also neutralizes length-extension attacks, which is why HMAC-SHA256 is safe to use even though raw SHA-256 is theoretically vulnerable to length-extension. If you are building any kind of API authentication, HMAC-SHA256 should be your default.
Digital Signatures
When you sign a document or a software release, the signing algorithm first hashes the content with a function like SHA-256, then signs that hash with your private key. The recipient uses your public key to verify that the signature matches their own computation of the hash. This is how code signing, TLS certificates, and GPG signatures work. The hash function's collision resistance is critical here — if an attacker could find two documents with the same hash, they could substitute a malicious document for a legitimate one without invalidating the signature.
Content-Addressable Storage
Systems like Git, IPFS, and Docker use hash functions to create content-addressable storage, where the address (or identifier) of a piece of data is derived from the data itself. In Git, every commit, tree, and blob object is identified by its SHA-1 hash (with a gradual transition to SHA-256 underway). This means identical content always gets the same identifier, enabling efficient deduplication and integrity verification.
Git's continued use of SHA-1 is often cited as a concern. However, Git does not rely on SHA-1 for security against adversarial collisions — it uses SHA-1 as a content identifier in a trusted context. That said, the SHAttered attack prompted the Git project to add collision detection logic and to plan the SHA-256 migration, which is available as an experimental option.
Git Object Hashing
Git computes object hashes in a specific way that is worth understanding. It does not simply hash the raw file content. Instead, it prepends a header consisting of the object type, a space, the content length in bytes, and a null byte:
// Git's blob hashing formula:
// SHA-1("blob " + content.length + "\0" + content)
// For example, hashing the file content "hello\n":
// SHA-1("blob 6\0hello\n") = ce013625030ba8dba906f756967f9e9ca394464a
This header prevents type confusion attacks where an attacker might try to substitute one type of Git object for another.
Password Hashing: A Special Case
This is the section I wish someone had drilled into me on day one of my career. You should never use SHA-256 (or any raw hash function) for password storage. This is not because SHA-256 is insecure — it is because it is too fast.
The Speed Problem
A modern GPU can compute billions of SHA-256 hashes per second. This means an attacker with a stolen database of SHA-256 password hashes can try billions of password guesses per second. Even with a reasonably complex password, a brute-force or dictionary attack becomes feasible in hours or days.
The same speed that makes SHA-256 great for file checksums makes it terrible for passwords. Password hashing algorithms need to be deliberately slow to make brute-force attacks impractical.
Why You Need bcrypt, scrypt, or Argon2
Purpose-built password hashing functions solve the speed problem by introducing three key mechanisms:
- Salt: A random value unique to each password, stored alongside the hash. Salts ensure that two users with the same password get different hashes, defeating rainbow table attacks and preventing attackers from amortizing work across multiple accounts.
- Iteration count (work factor): The algorithm runs its internal function thousands or millions of times, making each hash computation deliberately slow. You can tune this parameter to make hashing take, say, 250 milliseconds per password — imperceptible to a user logging in, but devastating to an attacker trying billions of guesses.
- Memory hardness (scrypt, Argon2): Beyond being CPU-intensive, these algorithms require significant amounts of memory to compute. This makes them resistant to GPU-based attacks, where GPUs have massive parallelism but limited per-thread memory.
The Argon2id Recommendation
As of 2026, Argon2id is the recommended password hashing algorithm. It won the Password Hashing Competition in 2015 and is recommended by OWASP, NIST (SP 800-63B), and most modern security guidelines. Argon2id combines the strengths of Argon2i (resistance against side-channel attacks) and Argon2d (resistance against GPU cracking).
Here is a practical configuration for Argon2id:
// Recommended Argon2id parameters (OWASP 2026):
// - Memory: 19 MiB (19456 KiB) minimum, 64 MiB preferred
// - Iterations: 2 minimum
// - Parallelism: 1
// - Output length: 32 bytes
// Using the 'argon2' npm package:
const argon2 = require('argon2');
async function hashPassword(password) {
return await argon2.hash(password, {
type: argon2.argon2id,
memoryCost: 65536, // 64 MiB
timeCost: 3, // 3 iterations
parallelism: 1,
hashLength: 32
});
}
async function verifyPassword(password, hash) {
return await argon2.verify(hash, password);
}
// Usage:
// const hash = await hashPassword('user-password-here');
// const isValid = await verifyPassword('user-password-here', hash);
If Argon2 is not available in your environment, bcrypt remains a solid and widely-supported second choice. Avoid scrypt unless you have specific reasons to prefer it, as its parameter tuning is less intuitive than Argon2's.
Do not use SHA-256(salt + password) or HMAC-SHA256(password) for password storage. These constructions are too fast. Use a dedicated password hashing function with built-in salt and work factor management.
Implementation in JavaScript
Let us get practical. Here is how to use hash functions in both browser and Node.js environments.
Browser: Web Crypto API (SubtleCrypto.digest)
Modern browsers provide native hashing via the SubtleCrypto interface. It supports SHA-1, SHA-256, SHA-384, and SHA-512 (but not MD5 or SHA-3).
// Hash a string using SHA-256 in the browser
async function sha256(message) {
// Encode the string as UTF-8 bytes
const encoder = new TextEncoder();
const data = encoder.encode(message);
// Hash the data
const hashBuffer = await crypto.subtle.digest('SHA-256', data);
// Convert the ArrayBuffer to a hex string
const hashArray = Array.from(new Uint8Array(hashBuffer));
const hashHex = hashArray
.map(byte => byte.toString(16).padStart(2, '0'))
.join('');
return hashHex;
}
// Usage:
// const hash = await sha256('Hello, world!');
// console.log(hash);
// => "315f5bdb76d078c43b8ac0064e4a0164612b1fce77c869345bfc94c75894edd3"
The Web Crypto API is asynchronous by design, which means it will not block the main thread. This is important for hashing large amounts of data in the browser without freezing the UI.
Node.js: The crypto Module
Node.js provides synchronous and streaming hash functions via its built-in crypto module. It supports a wide range of algorithms, including MD5, SHA-1, SHA-256, SHA-512, SHA-3-256, and many more.
const crypto = require('crypto');
// Simple string hashing
function hashString(algorithm, input) {
return crypto.createHash(algorithm).update(input).digest('hex');
}
// Examples:
console.log(hashString('md5', 'Hello, world!'));
// => "6cd3556deb0da54bca060b4c39479839"
console.log(hashString('sha1', 'Hello, world!'));
// => "943a702d06f34599aee1f8da8ef9f7296031d699"
console.log(hashString('sha256', 'Hello, world!'));
// => "315f5bdb76d078c43b8ac0064e4a0164612b1fce77c869345bfc94c75894edd3"
console.log(hashString('sha3-256', 'Hello, world!'));
// => "f345a219da005ebe9c1a1eaad97bbf38a10c8473e0f68554940bfc4522060b8f"
Streaming Hash for Large Files
For files that do not fit comfortably in memory, use Node.js streams to hash them incrementally:
const crypto = require('crypto');
const fs = require('fs');
function hashFile(filePath, algorithm = 'sha256') {
return new Promise((resolve, reject) => {
const hash = crypto.createHash(algorithm);
const stream = fs.createReadStream(filePath);
stream.on('data', (chunk) => hash.update(chunk));
stream.on('end', () => resolve(hash.digest('hex')));
stream.on('error', (err) => reject(err));
});
}
// Usage:
// const fileHash = await hashFile('/path/to/large-file.iso');
// console.log(fileHash);
// You can also verify a file against a known hash:
async function verifyFile(filePath, expectedHash, algorithm = 'sha256') {
const actualHash = await hashFile(filePath, algorithm);
return actualHash === expectedHash;
}
// const isValid = await verifyFile(
// 'download.iso',
// 'a1b2c3d4e5f6...expected-sha256-hash...'
// );
This streaming approach reads the file in chunks (typically 64 KiB by default), so memory usage stays constant regardless of file size. I have used this pattern to hash multi-gigabyte database dumps as part of backup verification pipelines.
HMAC Creation
HMAC is essential for API authentication, webhook verification, and message integrity. Here is how to create and verify HMACs in Node.js:
const crypto = require('crypto');
// Create an HMAC
function createHmac(secret, message, algorithm = 'sha256') {
return crypto.createHmac(algorithm, secret)
.update(message)
.digest('hex');
}
// Verify a webhook signature (e.g., GitHub, Stripe)
function verifyWebhookSignature(secret, payload, signature) {
const expected = createHmac(secret, payload);
// Use timingSafeEqual to prevent timing attacks
const expectedBuffer = Buffer.from(expected, 'hex');
const signatureBuffer = Buffer.from(signature, 'hex');
if (expectedBuffer.length !== signatureBuffer.length) {
return false;
}
return crypto.timingSafeEqual(expectedBuffer, signatureBuffer);
}
// Usage example: Verify a GitHub webhook
// const isValid = verifyWebhookSignature(
// process.env.GITHUB_WEBHOOK_SECRET,
// JSON.stringify(req.body),
// req.headers['x-hub-signature-256'].replace('sha256=', '')
// );
Always use crypto.timingSafeEqual() when comparing hashes and HMACs. A regular === comparison can leak information about which bytes match via timing side-channels, potentially allowing an attacker to forge a valid signature byte by byte.
Comparing Hash Performance in Node.js
Curious about performance differences? Here is a quick benchmark you can run:
const crypto = require('crypto');
function benchmark(algorithm, iterations = 100000) {
const data = crypto.randomBytes(1024); // 1 KiB of random data
const start = performance.now();
for (let i = 0; i < iterations; i++) {
crypto.createHash(algorithm).update(data).digest();
}
const elapsed = performance.now() - start;
const opsPerSec = Math.round(iterations / (elapsed / 1000));
console.log(`${algorithm.padEnd(10)} ${opsPerSec.toLocaleString()} ops/sec (${elapsed.toFixed(0)}ms)`);
}
benchmark('md5');
benchmark('sha1');
benchmark('sha256');
benchmark('sha512');
benchmark('sha3-256');
// Typical results on a modern machine (Apple M2, Node.js 20):
// md5 ~1,200,000 ops/sec
// sha1 ~1,100,000 ops/sec
// sha256 ~800,000 ops/sec
// sha512 ~900,000 ops/sec (faster than sha256 on 64-bit!)
// sha3-256 ~500,000 ops/sec
Notice that SHA-512 is often faster than SHA-256 on 64-bit processors. And while SHA-3 is the slowest of the group, it still handles half a million operations per second on 1 KiB inputs — more than enough for the vast majority of applications.
Choosing the Right Hash Function
After five years of working with these algorithms across different projects, here is the decision framework I have internalized. It boils down to asking one question first: Is this a security-sensitive context?
Decision Flowchart
File Checksum or Data Integrity (Trusted Source)
Use SHA-256. It is fast enough for any file size, universally supported, and provides genuine tamper detection. If you are in a purely non-adversarial context (e.g., detecting accidental corruption on a local disk), even MD5 or CRC32 will work. But SHA-256 costs you almost nothing extra, so just use it and avoid the risk of someone later repurposing your checksum code in a security context.
Password Storage
Use Argon2id. Not SHA-256, not bcrypt (unless Argon2 is unavailable), and definitely not MD5. This is non-negotiable. If your framework or ORM offers a built-in password hashing utility, verify that it uses Argon2id or bcrypt under the hood before trusting it.
HMAC / API Authentication
Use HMAC-SHA256. It is the standard for webhook verification, JWT signing (HS256), and API request authentication. SHA-256's speed is an advantage here, not a liability, because the secret key prevents brute-force attacks on the message content.
Digital Signatures and Certificates
Use SHA-256 or SHA-384 (for higher security margins, as in some TLS configurations). If you are signing with RSA-PSS or ECDSA, SHA-256 is the standard pairing.
Future-Proofing and Post-Quantum Readiness
Consider SHA-3-256 or SHAKE256. If you are building systems that need to remain secure for decades, or if you are working with post-quantum cryptographic algorithms, SHA-3 provides algorithmic diversity and is already integrated into many PQC standards.
Non-Security Contexts (Cache Keys, Deduplication, Checksums for Trusted Data)
Even MD5 is technically fine here. It is fast, widely supported, and produces reasonably distributed hashes. That said, I still default to SHA-256 in most cases because the performance difference is negligible for typical workloads, and it avoids the awkward conversation during code review when a colleague asks, "Why are we using MD5?"
Want to see these hash functions in action? Generate MD5, SHA-1, SHA-256, and SHA-512 hashes instantly in your browser.
Try the Hash Generator Tool
A Note on Migrating Legacy Systems
If you have inherited a codebase that uses MD5 for password hashing, do not panic — but do act. A common migration strategy is to wrap the existing MD5 hashes with bcrypt or Argon2:
// Migration strategy: wrapping legacy MD5 hashes
// Step 1: On migration, hash the existing MD5 hash with Argon2
// existingHash = MD5(password) (stored in your database)
// newHash = Argon2id(existingHash)
// Step 2: On login, compute MD5 of the entered password,
// then verify against the Argon2 hash
// loginAttempt: Argon2id.verify(storedNewHash, MD5(enteredPassword))
// Step 3: Optionally, after successful login, re-hash with
// Argon2id(plaintext_password) and store the upgraded hash
const argon2 = require('argon2');
const crypto = require('crypto');
async function migrateHash(existingMd5Hash) {
// Wrap the MD5 hash with Argon2id
return await argon2.hash(existingMd5Hash, {
type: argon2.argon2id,
memoryCost: 65536,
timeCost: 3,
parallelism: 1
});
}
async function verifyMigratedPassword(enteredPassword, storedArgon2Hash) {
// Compute MD5 of the entered password (matching the legacy scheme)
const md5Hash = crypto.createHash('md5')
.update(enteredPassword)
.digest('hex');
// Verify against the Argon2-wrapped hash
return await argon2.verify(storedArgon2Hash, md5Hash);
}
This approach lets you upgrade security immediately without forcing all users to reset their passwords. Over time, as users log in, you can re-hash with Argon2id directly on the plaintext password and store the fully upgraded hash.
Conclusion
Hash functions are deceptively simple on the surface — put data in, get a fixed-size fingerprint out. But the choices you make about which hash function to use, and how to use it, have real consequences for the security and integrity of your systems.
Here are the key takeaways:
- MD5 is broken for any security purpose. Use it only for non-adversarial checksums or legacy compatibility, and even then, consider just using SHA-256 instead.
- SHA-1 is deprecated. Do not use it for new systems. Migrate existing systems away from it.
- SHA-256 is the default. For file integrity, digital signatures, HMAC, and general-purpose hashing, SHA-256 is the right choice in 2026.
- SHA-3 is the future-proof option. Its fundamentally different construction provides algorithmic diversity. Use it when you need defense in depth, length-extension resistance, or variable-length output (SHAKE).
- Never use raw hash functions for passwords. Use Argon2id (preferred), bcrypt, or scrypt. The deliberate slowness and memory-hardness of these algorithms is a feature, not a bug.
- Always use timingSafeEqual when comparing hashes or HMAC signatures to prevent timing side-channel attacks.
The cryptographic landscape continues to evolve. Post-quantum cryptography research is accelerating, and SHA-3 and SHAKE are already being incorporated into next-generation standards. But for the working developer in 2026, the practical advice is straightforward: SHA-256 for hashing, Argon2id for passwords, HMAC-SHA256 for authentication, and SHA-3 when you need that extra layer of resilience.
Build on solid foundations, and you will not have to rewrite your security layer when the next attack paper drops.
Ready to generate hashes? Our free browser-based tool supports MD5, SHA-1, SHA-256, SHA-384, and SHA-512 with no data sent to any server.
Open Hash Generator