Quote Originally Posted by Sjol View Post
Did SE have warning that they were particularly vulnerable to a DDOS
If they genuinely thought they weren't vulnerable, that is a major issue in itself. Any sane network engineer knows that a DDoS cannot truly be prevented unless the design of the internet changes massively (in both protocol and law, and globally). By default you can only mitigate up to a certain point, since every step beyond that costs money. You can reserve some headroom (e.g. keeping backup servers on standby that many companies share and can fall back to when attacked), but in most cases it's just a money/risk trade-off. And money nearly always wins.

Even Google faces DDoS attacks, and very severe ones: attacks at a scale where you wouldn't just get kicked out of the game, you couldn't even get the launcher to connect. Google, however, has capacity that is orders of magnitude higher, because they have to deliver a service to around 2 billion people at once, instead of maybe 200k at peak. Note that PSN was once down for a very long time because of a DDoS: two weeks with absolutely no way to get online.

There is also little reason for SE to over-prepare before an attack, because it takes the attacker only a single test to know whether a DDoS works. If it doesn't, they just pay more to scale it up. DDoS-for-hire providers can easily reach terabytes per second worth of traffic, since they don't need that bandwidth themselves; they only need a large enough botnet. The defender, by contrast, can only win by being able to absorb that volume of traffic and spread it out far enough that it can be processed (and discarded where it's part of the attack). If you protect against 200 Mb/s, the attacker simply tests upward linearly: first 100, then 200, and at 300 they see success and stop scaling. If mitigation then raises the bar to 400, they keep testing upward and win again at 500. They are only limited by the size of the botnet.
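To make the asymmetry concrete, the escalation loop above can be sketched as a toy model. All numbers here are hypothetical and purely illustrative; "capacity" just stands for whatever traffic level the defender can currently absorb:

```python
# Toy model of the attacker's linear escalation described above.
# Hypothetical numbers only; units are Mb/s for illustration.

def find_winning_rate(defender_capacity_mbps, step=100, max_botnet_mbps=2000):
    """Raise the attack rate linearly until it exceeds the defender's
    capacity, or give up when the botnet runs out of bandwidth."""
    rate = step
    while rate <= max_botnet_mbps:
        # A single test run tells the attacker whether this rate works.
        if rate > defender_capacity_mbps:
            return rate
        rate += step
    return None  # botnet too small: the attacker's only real limit

# Defender absorbs 200 Mb/s: the attacker succeeds at 300.
print(find_winning_rate(200))   # -> 300
# Mitigation raises the bar to 400: the attacker succeeds again at 500.
print(find_winning_rate(400))   # -> 500
```

The point of the sketch is that each mitigation step only moves the threshold; the attacker's cost to find the new winning rate stays trivially small compared to the defender's cost of adding capacity.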

(And yes, DDoS attacks are often measured in requests per second rather than raw bandwidth, but the argument is the same either way.)