
10 Reasons Why Websites STILL Get Hacked

Even with all of the cybersecurity solutions on the market today, websites are still getting hacked. Find out how your organization could be exposed to attackers.

In today’s cyber age, it might seem surprising that websites are still frequently hacked. While cybersecurity continues to gain importance as a business priority, it’s difficult for organizations to know every way they’re vulnerable to hackers. Here are our top 10 reasons why websites still get hacked.

  1. Easily hacked ports: According to Bit Discovery data, over 2 billion internet-connected assets are listening on ports 80 and 443, and those exposed services likely contain numerous vulnerabilities that give attackers several ways to hack websites (see the port-exposure sketch after this list).
  2. Unknown websites and assets: Most companies don’t know all of the websites they own, what those sites do, or who is responsible for them. Obviously, you can only scan and secure what you know you own, and for unknown assets it’s impossible to respond quickly to high-priority vulnerability reports.
  3. Limited vulnerability scanning: A decreasing percentage of websites are being scanned. The number of new websites deployed on the internet exceeds the number of new websites covered by vulnerability management.
  4. Ineffective security tests: Dynamic Application Security Testing (DAST) scanning technology hasn’t proven its ability to scale to the size of the internet or even that of a large enterprise.
  5. Limited vulnerability management: Based on our experience working with various security vendors and customers, we believe only a small fraction (an estimated ~20%) of a company’s websites is actively covered by vulnerability management.
  6. Overwhelming backlog: Companies remain resistant to scanning “more,” let alone “everything,” because developers can’t tackle the current backlog. When vulnerabilities are identified, only roughly half are fixed, and it commonly takes six months to do so.
  7. Lack of prioritization: The sheer volume of existing vulnerabilities makes fixing all of them impractical, so prioritization is essential. Traffic-light risk models (red/yellow/green) are laughably unhelpful for allocating development resources because they don’t account for exploitation likelihood, asset value, financial impact and other factors (see the weighted-scoring sketch after this list).
  8. Budget restrictions: Outsourcing vulnerability remediation has not gained traction because security departments believe it should come out of the engineering department’s budget.
  9. Slow integration acceptance: DAST + Web Application Firewall (WAF) integration for “virtual patching” is still gaining acceptance and currently covers only a small portion of all websites (see the virtual-patching sketch after this list).
  10. Missed vulnerabilities: Static Application Security Testing (SAST) does not identify the vulnerabilities adversaries actually find and exploit. Based on our experience working with various security vendors, we estimate the overlap between DAST and SAST findings is only 1%-5%. So far, “shifting left” in this way has delivered only marginal benefit.
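
The exposure described in item 1 is easy to verify for assets you own. Below is a minimal, hypothetical Python sketch that checks whether a single host is listening on ports 80 and 443 using only the standard library; the hostname is a placeholder, and this illustrates basic exposure discovery rather than Bit Discovery’s or Tenable’s actual scanning methodology. Only probe systems you are authorized to test.

```python
# Minimal sketch: check whether a host is listening on the common web ports.
# "example.com" is a placeholder; probe only assets you own or may legally test.
import socket


def open_web_ports(host, ports=(80, 443), timeout=3.0):
    """Return the subset of `ports` that accept a TCP connection on `host`."""
    listening = []
    for port in ports:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                listening.append(port)
        except OSError:
            # Refused, timed out, or unreachable: treat the port as closed.
            pass
    return listening


if __name__ == "__main__":
    host = "example.com"  # placeholder asset
    print(f"{host} is listening on: {open_web_ports(host)}")
```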
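
To make item 7 concrete, here is an illustrative sketch of risk-based prioritization that weighs exploitation likelihood, asset value and financial impact instead of lumping findings into red/yellow/green buckets. The fields, weights and sample data are assumptions for illustration only, not a Tenable scoring formula.

```python
# Illustrative risk-based prioritization; fields, weights and data are assumptions.
from dataclasses import dataclass


@dataclass
class Vulnerability:
    name: str
    exploitation_likelihood: float  # 0.0-1.0, e.g. informed by threat intel
    asset_value: float              # 0.0-1.0, business criticality of the asset
    financial_impact: float         # 0.0-1.0, normalized potential loss


def risk_score(v):
    """Weighted composite score; higher means fix sooner. Weights are illustrative."""
    return (0.5 * v.exploitation_likelihood
            + 0.3 * v.financial_impact
            + 0.2 * v.asset_value)


backlog = [
    Vulnerability("SQL injection on checkout", 0.9, 0.9, 0.8),
    Vulnerability("Missing security header on blog", 0.2, 0.3, 0.1),
    Vulnerability("Outdated CMS plugin on intranet app", 0.6, 0.5, 0.4),
]

# Fix order: highest composite risk first, instead of treating every "red" the same.
for v in sorted(backlog, key=risk_score, reverse=True):
    print(f"{risk_score(v):.2f}  {v.name}")
```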
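
The “virtual patching” in item 9 can be pictured as a WAF rule that blocks exploitation of a known flaw until the code fix ships. The sketch below is a hypothetical, minimal Python WSGI middleware standing in for a WAF; the route, parameter pattern and exploit signature are invented for illustration and do not reflect how Tenable or any specific WAF product implements DAST-to-WAF integration.

```python
# Hypothetical "virtual patch": block requests matching a crude exploit
# signature on a known-vulnerable route until the real fix is deployed.
import re
from urllib.parse import unquote_plus
from wsgiref.simple_server import make_server

EXPLOIT_SIGNATURE = re.compile(r"(?i)(union\s+select|<script)")  # illustrative only


def vulnerable_app(environ, start_response):
    # Stand-in for the real, still-unpatched application.
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"search results\n"]


def virtual_patch(app):
    def middleware(environ, start_response):
        query = unquote_plus(environ.get("QUERY_STRING", ""))
        if environ.get("PATH_INFO") == "/search" and EXPLOIT_SIGNATURE.search(query):
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"blocked by virtual patch\n"]
        return app(environ, start_response)
    return middleware


if __name__ == "__main__":
    # Try /search?q=1 (allowed) vs /search?q=1+union+select (blocked).
    make_server("127.0.0.1", 8000, virtual_patch(vulnerable_app)).serve_forever()
```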

Visit the Tenable.asm product page to learn more about attack surface management.
