Link rot is the tendency of links to web pages to break or decay over time, frustrating users and diminishing scholarly value. Content management systems and permalinks can reduce the problem, as can archiving content and using redirects to avoid losing potential visitors.
Link rot, sometimes spelled linkrot, describes how links to web pages decay or become unavailable over time. A link that works when a page is published will not necessarily stay active forever. Visitors are frustrated when they click a link only to find that the destination no longer exists, and examples of link rot turn up in all kinds of sources. Rot can occur on any type of website, but broken links in academic publications are especially damaging: when an article can no longer reach its cited sources, its scholarly value is diminished.
One driver of link rot is that large sites, such as newspapers, regularly move content to new addresses and don't always leave a connection to the old ones; alternatively, they may begin charging for items that were once free as soon as those items are archived. The original links may then lead to a different location, stop working entirely, or limit access to paying customers. For people searching the internet, running into paywalls or broken links can be very frustrating.
The amateur programmer with only a few web pages can keep link rot at bay by checking links roughly once a month and fixing any that are broken. This strategy doesn't scale for people with a large number of pages and outbound links, or even for one or two pages with many of them. There are a few other strategies that help reduce the problem, however.
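The monthly check described above can be partly automated. The sketch below, using only Python's standard library, sends a HEAD request to each URL and reports links that fail or return an error status; the function names and the idea of injecting a fetcher for testing are illustrative choices, not a standard tool.

```python
from urllib.request import Request, urlopen
from urllib.error import URLError, HTTPError

def check_url(url, timeout=10):
    """Return the HTTP status code for url, or None if the request fails."""
    try:
        request = Request(url, method="HEAD")
        with urlopen(request, timeout=timeout) as response:
            return response.status
    except HTTPError as err:      # server answered, but with an error code
        return err.code
    except URLError:              # DNS failure, refused connection, etc.
        return None

def broken_links(urls, fetch=check_url):
    """Return the subset of urls whose status suggests rot."""
    return [u for u in urls if (status := fetch(u)) is None or status >= 400]
```

A real checker would also want retries, a User-Agent header, and a fallback to GET, since some servers reject HEAD requests outright.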
For large numbers of pages, programmers can use content management systems, applications that can check all of a site's links for potential rot. These checks are not foolproof, because a link may still resolve even though the page no longer contains the same material. This is especially common with deep linking, the practice of linking not to a site's front page but to a specific page within it, which can easily be given a new address later. Programmers with numerous outbound links therefore often avoid deep linking.
Another way to combat link rot is to use permalinks, which give a piece of content a permanent, stable address. This is especially common with blog posts, where an entry stays on the front page only briefly before moving into the archives. Alternatively, many web archives now create copies of web pages so that their content remains accessible even if the original disappears. Linking to archived material, or creating an archive of the material you link to, can prevent or reduce link rot.
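One well-known archive is the Internet Archive's Wayback Machine, which exposes a public "availability" API for finding an archived snapshot of a page. The sketch below queries it with the standard library; the JSON handling assumes the API's documented response shape, and the `fetch_json` hook is a hypothetical convenience so the logic can be exercised without a network connection.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

AVAILABILITY_API = "https://archive.org/wayback/available"

def archived_url(url, fetch_json=None):
    """Return the closest archived snapshot's URL, or None if there is none."""
    query = AVAILABILITY_API + "?" + urlencode({"url": url})
    if fetch_json is None:
        with urlopen(query) as response:   # live network call
            data = json.load(response)
    else:
        data = fetch_json(query)           # injected for offline testing
    closest = data.get("archived_snapshots", {}).get("closest")
    if closest and closest.get("available"):
        return closest["url"]
    return None
```

A site owner could run this over their outbound links and swap in archived URLs for any link the checker reports as broken.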
Link rot can also affect a site's search ranking, which matters to anyone trying to build high-profile pages. More often than not, rankings drop when a site changes the addresses of key material without notifying the people who have linked to those pages. Programmers moving content should instead create redirects that send visitors arriving via an old link on to the new address. These are called 301 ("Moved Permanently") redirects, and it is worthwhile for site owners to know how to create and use them so that a change of address doesn't cost the site potential new visitors.
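In practice, 301 redirects are usually configured in the web server, but the mechanism itself is simple: respond with status 301 and a `Location` header pointing at the new address. A minimal sketch with Python's standard-library HTTP server, where the old and new paths are hypothetical examples:

```python
from http.server import BaseHTTPRequestHandler

# Hypothetical mapping from old addresses to their new homes.
MOVED = {"/old-article": "/articles/new-location"}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = MOVED.get(self.path)
        if target is not None:
            self.send_response(301)               # 301 Moved Permanently
            self.send_header("Location", target)  # where the content lives now
            self.end_headers()
        else:
            self.send_response(404)               # unknown page: plain not-found
            self.end_headers()

    def log_message(self, *args):                 # keep the demo quiet
        pass
```

Browsers follow the `Location` header automatically, and search engines treat a 301 as a signal to transfer the old address's ranking to the new one.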