Hash Archive (beta)

Example lookups
Recently fetched URLs
Critical URLs



I made this site because so many Linux distributions still don't provide easy, secure download options in $YEAR. Using this archive (and hopefully other sources), you can have at least a small amount of confidence in the authenticity of your ISOs. Just put in the URL of an ISO, torrent file, or any other HTTP (or HTTPS) resource, and we'll request the file and compute its hashes for you using a wide variety of algorithms.

For sites that don't support HTTPS, this is a little bit like domain validation for certificates. Unless someone can intercept your local traffic and our traffic to a site, you'll be able to spot MITM attacks. This level of security isn't great, but it's better than nothing.

For sites that do support HTTPS, validating hashes through a third party is still useful. It can tell you whether the server was compromised (as happened to Linux Mint recently) or whether the site is making stealth updates (as seen on political campaign sites). This is a little bit like Certificate Transparency, or, of course, the Internet Archive's Wayback Machine.

You can also do reverse lookups to find out what URLs provide (or used to provide) a given hash. This might be useful for finding mirrors (although it only works if we've previously crawled those URLs).


What kind of questions can Hash Archive answer?

It can answer questions such as:

How can I compute and verify my own hashes?

There are several options:
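One option is to compute the hashes yourself and compare them against what the archive (or the distributor) publishes. A minimal sketch using Python's standard `hashlib` module; the function names here are illustrative, not part of any site API:

```python
import hashlib

def file_hashes(path, algorithms=("sha256", "sha1", "md5")):
    """Compute several hashes of a file in a single read pass."""
    hashers = {name: hashlib.new(name) for name in algorithms}
    with open(path, "rb") as f:
        # Read in 1 MiB chunks so large ISOs don't load into memory at once.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            for h in hashers.values():
                h.update(chunk)
    return {name: h.hexdigest() for name, h in hashers.items()}

def verify(path, algorithm, expected_hex):
    """Compare a computed digest against a published one, case-insensitively."""
    return file_hashes(path, (algorithm,))[algorithm] == expected_hex.lower()
```

Always compare the full digest, not just a prefix; a partial match gives much weaker guarantees.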

Are hashes actually secure?

If they use a good algorithm and are at least 16-24 bytes (128-192 bits) long, then yes.

Is it better to use even longer hashes like SHA-384 or SHA-512?

There are diminishing returns. Past a certain point, the length does not matter. I'm not qualified to comment on differences in the algorithms themselves, but SHA-256 is widely accepted.

You can check the probability tables for hash collisions yourself here.
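The math behind those tables is the birthday bound: for n items and an ideal b-bit hash, the chance of any collision is roughly 1 − e^(−n²/2^(b+1)). A quick sketch of the estimate (the numbers chosen below are illustrative):

```python
import math

def collision_probability(n_items, hash_bits):
    """Approximate birthday-bound probability that any two of n_items
    share a value under an ideal hash_bits-bit hash."""
    exponent = (n_items * n_items) / (2 * 2 ** hash_bits)
    # expm1 preserves precision when the probability is astronomically small.
    return -math.expm1(-exponent)

# After hashing 2**64 files, a 128-bit hash already has a ~39% collision
# chance, while a 256-bit hash remains effectively collision-free.
p_128 = collision_probability(2 ** 64, 128)
p_256 = collision_probability(2 ** 64, 256)
```

This is why the returns diminish: going from 128 to 256 bits takes the probability from plausible to negligible, and further length adds nothing you could ever observe.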

What do [#] links do?

Those provide the hash as a raw link, for formats that support it. In theory, you could use it to resolve the hash through a different system. However, you need the appropriate protocol handler, and for now software support is mostly non-existent. My other project StrongLink has limited support, if you configure it. A big problem is the protocol safelist, which makes it difficult for web apps to resolve hashes (see complaints here and here).

If you right click the [#] and choose "copy link," you'll get a link you can paste into a resolver (such as StrongLink) or chat/email.
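For illustration, a raw hash link is just an algorithm name plus a digest packed into a URI. The sketch below assumes a `hash://<algorithm>/<hex>` scheme; check the site's own [#] links for the exact format it emits:

```python
import hashlib

def hash_uri(data, algorithm="sha256"):
    """Build a simple content-addressed link from raw bytes.
    The hash:// scheme here is illustrative, not a guarantee of
    what any particular resolver accepts."""
    digest = hashlib.new(algorithm, data).hexdigest()
    return f"hash://{algorithm}/{digest}"
```

A resolver that understands the scheme can then look the digest up in any archive it trusts, independent of where the bytes originally lived.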

Why aren't the MultiHashes compatible with IPFS, or the Magnet URIs compatible with BitTorrent?

IPFS and BitTorrent compute their hashes in their own protocol-specific ways. BitTorrent hashes vary depending on the file name (amongst other things) and IPFS hashes vary depending on the chunker used. See my article, The Principles of Content Addressing, or this IPFS bug report. However, you can submit URLs from the ipfs.io gateway to bridge between them.
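To see why the same bytes can yield different identifiers, compare a flat whole-file hash with a toy Merkle-style hash over fixed-size chunks. This is a simplified stand-in for what IPFS and BitTorrent actually compute, not their real formats:

```python
import hashlib

def whole_file_hash(data):
    """One SHA-256 over all the bytes -- what this archive reports."""
    return hashlib.sha256(data).hexdigest()

def toy_chunked_hash(data, chunk_size):
    """Hash each fixed-size chunk, then hash the concatenated chunk
    digests -- a toy version of protocol-specific tree hashing."""
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    digests = b"".join(hashlib.sha256(c).digest() for c in chunks)
    return hashlib.sha256(digests).hexdigest()

data = b"example payload " * 1000
# The flat hash and the chunked hash of identical bytes differ, and the
# chunked hash also changes whenever the chunk size changes.
```

Since the chunk size (and, in the real protocols, metadata like file names) feeds into the digest, two systems can disagree on the identifier for byte-identical content.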

Is there an API?

Not yet, but support is planned. For now, please use a database snapshot.

Can you add a new algorithm?

Yes, if: 1. the algorithm is relatively efficient, 2. it's in widespread use or in high demand, and 3. there is a drop-in Node.js module we can use. Please check our issues list and open a new one if it hasn't been proposed already.

Why not use the Web of Trust?

This is a link in the Web of Trust.