
AMAZON HQ – At 2.3 Tbps, the latest DDoS (Distributed Denial-of-Service) attack against Amazon Web Services is the largest ever recorded against the company. A DDoS attack works by overloading a server, stressing it with a flood of requests and data designed to be more than the network and server can bear, to the point that the service crashes.

DDoS attacks are often used to hamper a network's security defenses by distracting them with the overwhelming load while other malware is pushed through the point of penetration, such as Trojan files (files that appear to be one type but in fact carry hundreds to thousands of destructive viral payloads). In other cases, worms are used to open a tunnel into the internal infrastructure of a network, where the attacker can locate files hidden away on the servers and sell the data or plan sabotage.

Amazon recounted that the attack occurred back in February and was contained by AWS Shield, a service designed to protect customers on Amazon's on-demand cloud computing platform against DDoS attacks, as well as against bad bots (malicious automated programs that interfere with a system's normal functions). Amazon has declined to disclose the target or the origin of the attack.
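For customers subscribed to AWS Shield Advanced, detected attacks can also be reviewed programmatically. The snippet below is a minimal, illustrative sketch using the boto3 Shield client's list_attacks call to pull attack summaries for a recent time window; the thirty-day range and the lack of a resource filter are assumptions for the example, not details of the Amazon incident.

```python
# Illustrative only: listing recent attack detections from AWS Shield Advanced.
# Requires a Shield Advanced subscription and suitable IAM permissions; the
# time window below is a placeholder, not tied to the February incident.
from datetime import datetime, timedelta

import boto3

shield = boto3.client("shield")

end = datetime.utcnow()
start = end - timedelta(days=30)

response = shield.list_attacks(
    StartTime={"FromInclusive": start, "ToExclusive": end},
    MaxResults=20,
)

for attack in response.get("AttackSummaries", []):
    vectors = ", ".join(v["VectorType"] for v in attack.get("AttackVectors", []))
    print(attack["AttackId"], attack["ResourceArn"], vectors)
```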

Being the largest to date, it eclipses the previous record, set in March 2018, when NetScout Arbor mitigated a 1.7 Tbps attack.

Amazon said that between Q2 2018 and Q4 2019, the largest attacks it saw were smaller than 1 Tbps, and that in the first quarter of this year 99 percent of attacks were 43 Gbps or smaller. 

As an online entity, Amazon must keep its cyber security up to date in order to protect the data and information of the millions, if not billions, of people who use its services, from Prime to its online streaming networks. Applications have to be tested regularly for vulnerabilities so they can be patched more effectively, and most big companies also hire penetration testers who attempt to hack into the system and locate the weak points in its code and configuration.
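As a toy illustration of the kind of reconnaissance a penetration tester starts with (not a description of Amazon's own process), the sketch below checks which common TCP ports on a host are accepting connections; the target address and port list are placeholders.

```python
# Toy reconnaissance sketch: check which common TCP ports accept connections.
# Only run this against hosts you are explicitly authorized to test; the
# target "127.0.0.1" and the port list are placeholders for illustration.
import socket

TARGET = "127.0.0.1"
COMMON_PORTS = [22, 80, 443, 3306, 8080]

for port in COMMON_PORTS:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(0.5)  # fail fast on closed or filtered ports
        is_open = sock.connect_ex((TARGET, port)) == 0
        print(f"port {port}: {'open' if is_open else 'closed or filtered'}")
```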

The scale of 2.3 Tbps is beyond large; generating that much traffic is an incredible feat, and the number of compromised PCs needed would be close to a small army.
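A rough back-of-the-envelope calculation gives a sense of that scale; the 10 Mbps per-device upload figure is an assumption for illustration, not a number from Amazon's report, and in practice attackers often use reflection and amplification techniques that let far fewer machines generate the same volume.

```python
# Back-of-the-envelope scale estimate. The per-device bandwidth is an assumed
# figure for illustration, not data from Amazon's report.
ATTACK_BPS = 2.3e12        # 2.3 Tbps peak, as reported
PER_DEVICE_BPS = 10e6      # assumed 10 Mbps of upload per compromised device

bytes_per_second = ATTACK_BPS / 8
devices_needed = ATTACK_BPS / PER_DEVICE_BPS

print(f"~{bytes_per_second / 1e9:.0f} GB of data every second")   # ~288 GB/s
print(f"~{devices_needed:,.0f} devices at 10 Mbps each")          # ~230,000
```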


By WBN