Getting Hit with a DDoS Attack

How to Know if You Are Getting Hit with a DDoS Attack

  • Unveiling of the People behind Mirai
  • Army of Thermostats and Routers Attacks
  • How Do You Know if You Are Getting Hit by a DDoS?
  • 5 Steps to Defend Yourself from DDoS
  • Now More than Ever


Many people know Dyn as the domain name system (DNS) infrastructure provider that was taken off the Internet by a record-breaking distributed denial of service (DDoS) attack. Interestingly enough, there is a prominent piece on surviving DDoS attacks that was written by Dyn director of operations and client services David Grange back in 2014. Although the attack on Dyn succeeded, and was the largest DDoS of all time, Grange’s comments are broad enough that they remain highly relevant.


In exploring the thoughts provided by Grange, we will answer the questions, “How do you know if you are getting hit by a DDoS?” and “How do you defend yourself from DDoS?” But first, let’s look at the pivotal malware that made DDoS such an important aspect of cybercrime in 2017: Mirai.


Unveiling of the People behind Mirai


The problem with attacking an investigative security journalist with your malware is obvious: you just might get their attention. That’s definitely what happened when a DDoS malware developer went after former Washington Post IT security reporter Brian Krebs.


Krebs was hit by a massive DDoS attack in September 2016. In November 2016, a huge section of the Internet went down because of an attack using the same botnet. Krebs used his training as a reporter to investigate, and he has confidently fingered the culprit. He cited various sources to make a reasonably solid case that the individual behind Mirai is Rutgers student Paras Jha – also, somewhat incredibly, the owner of Protraf Solutions, a DDoS prevention service.


Approximately a week after the DDoS attack on Krebs’ site, a malware author (Jha, in Krebs’ opinion) released the source code for their incredibly powerful zombie botnet of IoT devices, Mirai.


The open-sourcing led to additional attacks, explained David Lumb of Engadget on January 19. “But it also gave Krebs the first clue in [his] long road to uncover Anna Senpai’s real-life identity – an investigation so exhaustive… Krebs made a glossary of cross-referenced names and terms along with an incomplete relational map.”


Army of Thermostats and Routers Attacks


The unveiling of the person who is supposedly behind this operation is somewhat of a distraction from the general trend of DDoS, which continues to rise. What’s particularly notable about this attack from a threat perspective is its vast scope. The attack started on the evening of September 20, at about 8 p.m. ET, and measured 620 gigabits per second (Gbps). What was truly troubling wasn’t just that incredible scale but that the attack appeared to come from a massive army of hacked devices.


To understand this attack on Krebs and Mirai itself, it helps to look back at a previous major DDoS attack against a European media company that hit 363 Gbps. That attack was believed to have been produced by a botnet, but one using various methods to amplify a small attack to a much greater scale. The DDoS tactic of amplification is a type of reflection, in which the perpetrator tries to create a flood of responses to a spoofed IP address (that of the target). In the case of the Krebs DDoS, though, the traffic was neither amplified nor reflected.
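To make the amplification idea concrete, here is a minimal Python sketch of the arithmetic behind it. The packet sizes are purely illustrative, not measurements from any real attack: the attacker sends a small spoofed request, and the reflector answers the victim with a much larger response.

```python
def amplification_factor(request_bytes: int, response_bytes: int) -> float:
    """Ratio of the reflected response size to the attacker's spoofed request."""
    return response_bytes / request_bytes

# Illustrative sizes only: a tiny query that triggers a large response
# multiplies the attacker's bandwidth at the victim's expense.
factor = amplification_factor(request_bytes=64, response_bytes=3072)
print(factor)  # 48.0
```

A 48x factor would mean 1 Gbps of spoofed requests lands 48 Gbps on the target, which is why reflection can turn a small attack into a large one.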


Instead, “many were garbage Web attack methods that require a legitimate connection between the attacking host and the target, including SYN, GET and POST floods,” explained Brian Krebs. There was one type that did not fit that bill, though: traffic imitating generic routing encapsulation (GRE) packets – a protocol used to communicate directly between two network nodes (allowing peers to share data without use of a public network).


An attacker can easily spoof DNS traffic, but that is not the case with GRE traffic, or with the garbage methods described by Krebs. The high volume of these traffic types seen in early analysis of the DDoS pointed to a large number of hacked systems being used – hundreds of thousands of them. It’s now recognized that the collection of systems used was the Mirai botnet.
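The reasoning above (lots of non-spoofable traffic implies lots of real compromised hosts) can be sketched against a hypothetical flow log; the log format and addresses here are assumptions for illustration only.

```python
from collections import Counter

def distinct_sources(flow_log):
    """Count distinct source IPs seen per traffic type."""
    counts = Counter()
    seen = set()
    for src_ip, traffic_type in flow_log:
        if (src_ip, traffic_type) not in seen:
            seen.add((src_ip, traffic_type))
            counts[traffic_type] += 1
    return counts

# GET/POST floods require a completed connection, so each distinct
# source IP behind them is very likely a real (hacked) machine, not a spoof.
log = [("10.0.0.1", "GET"), ("10.0.0.2", "GET"),
       ("10.0.0.1", "GET"), ("10.0.0.3", "SYN")]
print(distinct_sources(log))  # Counter({'GET': 2, 'SYN': 1})
```

Scaled up, hundreds of thousands of distinct sources of connection-based traffic is what pointed analysts toward a genuine botnet.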


Following the attack on Krebs and the open-sourcing of Mirai’s code, another major assault occurred – this one on a DNS provider, Dyn. The Dyn DDoS, which occurred in October, was – like the Krebs attack – the work of more than 100,000 devices. These are Internet of Things devices such as webcams, thermostats, and routers, but they pack a punch in numbers, as is clear from the 1.2 terabits per second (Tbps) of traffic they delivered to Dyn.


How Do You Know if You Are Getting Hit by a DDoS?


What does this story have to do with you? DDoS is becoming increasingly commonplace, so – unfortunately – many companies are having to ask themselves the question, “Am I getting hit by a DDoS?” or “What would I do if I were hit by a DDoS?”


The confusing thing about identifying a DDoS is that it isn’t easy to tell whether a spike in traffic represents legitimate users or an actual distributed denial of service effort, noted Dyn’s Grange. “The key to telling the difference lies in the length of time the service is down – if slow or denied service continues for days rather than a spike during a campaign it is time to start to look into what’s going on.”


Another way to tell something malicious is occurring is when you see that a source continues to query a certain set of data long after the time to live (TTL) has elapsed. (TTL, also called hop limit, is a mechanism that limits how long data stays on a network or computer, discarding it after the set timespan has passed.)
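As a rough sketch of that heuristic (the numbers, names, and log format are hypothetical), you can compare how often a source queries one record against how often a TTL-respecting cache would need to:

```python
from collections import Counter

def suspicious_sources(queries, ttl, window, slack=2):
    """Flag sources that query one record far more often than a
    TTL-respecting cache would need to during the observation window."""
    counts = Counter(queries)    # queries: list of (src, record) pairs
    allowed = window // ttl + 1  # legit lookups if the TTL is honored
    return {src for (src, record), n in counts.items() if n > slack * allowed}

# Toy data: 300-second window, 60-second TTL -> at most ~6 legit lookups,
# so 40 queries for the same record from one source stands out.
qs = ([("203.0.113.9", "example.com")] * 40
      + [("198.51.100.2", "example.com")] * 3)
print(suspicious_sources(qs, ttl=60, window=300))  # {'203.0.113.9'}
```

Real detection systems are far more sophisticated, but the core comparison (observed query rate vs. what TTL behavior would predict) is the same.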


5 Steps to Defend Yourself from DDoS


What are some proactive steps you can take to protect yourself from Mirai and other DDoS attacks? Here is Grange’s advice:


#1 – Focus on awareness. Track your network’s normal activity carefully so that you recognize when anything is amiss and a DDoS might be occurring.


#2 – Improve your capacity. Be certain your capacity is high enough to carry the load, and optimize for performance during spikes. Architect with mitigation in mind.


#3 – Run drills. Go through drills with your staff so everyone is ready if you do experience a DDoS.


#4 – Use an outside provider. Many companies reasonably decide that they do not want to deal with the DDoS challenge internally, so they partner with third parties.


#5 – Err on the side of preparation. “[F]igure out the impact it would have on your company financially if it were to happen,” said Grange. “[T]he cost associated with being attacked is usually much higher than the cost to take safeguards.”


Now More than Ever


Clearly, following the rise of Mirai and these high-profile mega-attacks, it is more important than ever to make sure you have defenses in place so that DDoS can’t sideline your operations. At KnownHost, we offer free DDoS protection with all of our VPS hosting packages. Compare plans.


Why Is It Easy to See That E-Commerce Is Speed?

Speed is essential to e-commerce success. That means, more and more, that speed is central to business – as indicated by recent statistics on e-commerce and, specifically, m-commerce (mobile e-commerce). One truck-parts retailer profiled below is an example of better revenue arising from a mobile focus.


  • The shocking impact of 100 milliseconds
  • Why is the ecommerce market increasingly important?
  • With m-commerce trending, we shop on the go
  • 2017: smart m-commerce and investing in experience
  • Speed and the rise of the mobile app
  • What can you do to accelerate?


Business is all about prioritizing – and so much of success depends not on understanding that something is important to business but on understanding how critical it is when compared to other potential investments. For instance, we can all agree that speed is critical for online sales. However, we may disagree on just how valuable boosting your company’s speed is.


The Shocking Impact of 100 Milliseconds


Speed has become a more prominent factor as studies have been released demonstrating just how directly it impacts revenue. Amazon was one of the first organizations to research the connection between performance and conversion rate. The company’s 2006 study, “Make Data Useful,” found that a 100-millisecond slower page load equated to a 1% reduction in sales.


According to 2016 figures from eMarketer highlighted in Women’s Wear Daily, here is how the top 10 US ecommerce retailers would be affected by a 100-millisecond delay, in annual losses:


  1. – $792.68 million
  2. Wal-Mart Stores – $134.84 million
  3. Apple – $120.00 million
  4. Staples – $107.00 million
  5. Macy’s – $48.29 million
  6. The Home Depot – $42.67 million
  7. Best Buy – $37.80 million
  8. QVC – $37.22 million
  9. Costco Wholesale – $36.18 million
  10. Nordstrom – $26.99 million
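Estimates like these follow directly from the rule of thumb above. Assuming the roughly 1%-of-sales-per-100-ms relationship holds, the arithmetic looks like this (the $2 billion revenue figure is hypothetical):

```python
def delay_loss(annual_revenue, delay_ms, loss_per_100ms=0.01):
    """Projected annual revenue lost to added page-load delay,
    using the ~1% of sales per 100 ms rule of thumb."""
    return annual_revenue * loss_per_100ms * (delay_ms / 100)

# Hypothetical retailer with $2 billion in annual online sales:
print(delay_loss(2_000_000_000, delay_ms=100))  # 20000000.0, i.e. $20M/year
```

The relationship is a rough model, not a law; the point is that at retail scale, even sub-second delays translate into eight-figure sums.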


Why is the ECommerce Market Increasingly Important?


To better understand just how large the e-commerce market is, and how much it continues to grow even today, let’s look at stats from the US Census Bureau (noting that these figures are adjusted for seasonal variation but not pricing fluctuations). In the third quarter of 2016, ecommerce represented an estimated $101.3 billion in United States retail sales – a 4.0 percent rise over Q2 2016.  Overall, retail sales hit $1.2125 trillion – up 0.9 percent over Q2.


These upward bumps may seem small or insignificant, but what’s particularly eye-popping is the growth over the course of the year. “The third quarter 2016 e-commerce estimate increased 15.7 percent (±1.9%) from the third quarter of 2015 while total retail sales increased 2.2 percent (±0.7%) in the same period,” noted the Census. “E-commerce sales in the third quarter of 2016 accounted for 8.4 percent of total sales.”
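The Census share is easy to verify from the two figures above:

```python
ecommerce_q3_2016 = 101.3      # billions of dollars, Census estimate
total_retail_q3_2016 = 1212.5  # billions of dollars

share = ecommerce_q3_2016 / total_retail_q3_2016 * 100
print(round(share, 1))  # 8.4, matching the Census figure
```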


With M-Commerce Trending, We Shop on the Go


M-commerce is responsible for the vast majority of recent e-commerce growth.


The 2016 Mobile 500 study, highlighted by Mark Brohan in Internet Retailer, found that m-commerce (mobile online shopping) was growing at almost three times the rate of general e-commerce. At the time of the analysis, the magazine reported that m-commerce accounted for nearly one-third of all online sales (29.7%, up from 24.6% in 2014). The grand total for m-commerce rose from $75.03 billion to $104.05 billion between 2014 and 2015. We’ll get into a couple of supporting reports below.


Part of the reason m-commerce is becoming increasingly prevalent is simply that more people own tablets and smartphones. In fact, an October 2015 report from Pew Research Center found that more than two-thirds of American consumers (68%) own smartphones, while close to half (45%) own tablets. Smartphone ownership has almost doubled from its 35% adoption rate in 2011.


This growth can’t continue forever, explained Monica Anderson of Pew. “Smartphone ownership is nearing the saturation point with some groups,” she said. “86% of those ages 18-29 have a smartphone, as do 83% of those ages 30-49 and 87% of those living in households earning $75,000 and up annually.”


Although the growth in m-commerce won’t continue forever, it’s clearly now at an all-time high – with the basic findings of the Mobile 500 project echoed in other research. Gartner has forecast that by 2017, half of online sales dollars will come by way of m-commerce. As of 2015, the industry analyst found that smartphones and tablets represented 22 percent of revenue.


Finally, on November 14, 2016, comScore released its findings on m-commerce for the third quarter of 2016. The analyst found that total e-commerce sales amounted to $84.3 billion – fully one-fifth of which came through smartphones and tablets.


“While we’re still experiencing a pronounced channel shift from desktop to mobile spending,” Adam Lella of comScore clarified, “spending on both platforms has been strong throughout 2016, which generally bodes well for the upcoming season.”


2017: Smart M-Commerce and Investing in Experience


Jennifer Polk of Gartner commented that certain sectors had more of an incentive to move fast on mobile – both on the functionality and promotion fronts. She noted that big-box stores don’t have to be as focused on m-commerce because they rely more on the in-store experience. Nonetheless, the 2015 update to credit card processing standards to combat fraud meant that retailers had to revise their point-of-sale (POS) software, and many of those updates also more broadly facilitated mobile transactions.


Companies that are working on building their m-commerce – which should be the vast majority of retailers – should create cross-departmental teams so that mobile consumers get a better path-to-purchase and post-purchase experience. One item that should be addressed is mobile wallet support.


Furthermore, alongside the m-commerce projection above, Gartner also expects that half of the innovation financing allotted to product upgrades in 2015 will be budgeted instead for user experience (UX) by 2017. Gartner had stated in 2015 that nearly nine out of ten (89%) firms would list UX as their central competitive differentiator by 2016.


“In many industries, hypercompetition has eroded traditional product and service advantages, making customer experience the new competitive battlefield,” advised Gartner research director Jake Sorofman. “This is no truer than in durable consumer products markets, which face disproportionate commodity pressure as consumer access to pricing and product information via search and social channels undermine brand loyalty.”


The extent to which differentiation can be maintained through product and business model development is limited, since the market is so competitive on that general innovation front. That’s why almost three out of every four businesses were spending more on UX in 2015 than they did in 2014.


Speed and the Rise of the Mobile App


As the prominence of e-commerce and m-commerce continues to build, speed is a variable whose contribution to success can be measured with increasing accuracy. It isn’t easy, though: in 2014, Forrester reported that performance was the #1 challenge for companies in improving their UX.


Pixlee director of product marketing Andrew Higgins described why speed is such a top priority. “[A]s companies continue to innovate and add new technologies to improve and differentiate their eCommerce stores,” he said, “speed remains a top concern and criteria to evaluate new platforms.”


One truck-parts retailer, listed at #295 in the 2016 Mobile 500, has profited from focusing more on m-commerce. Between 2014 and 2015, the company’s m-commerce revenue increased 107%, to $18.0 million. Its site has been gradually simplified and functionally optimized over the years for users accessing it via smartphone and tablet. The mobile site makes it easy to search for and buy parts based on a truck’s make/model, year, and other parameters – so that there are effectively numerous ways to find each product.


What Can You Do to Accelerate?


One simple and effective way to accelerate your business is to switch from shared hosting to a VPS backed by solid state drives – offering better speed by forgoing moving parts. At KnownHost, our Managed SSD VPS packages offer enterprise-grade hardware running on SSD drives for optimal performance. Compare plans.


What is CentOS, and Why Should You Care?

  • Being Linus Torvalds
  • The story of Linux
  • Things you wanted to know about CentOS but were afraid to ask

CentOS is a particular distribution (aka distro) of the Linux operating system. Let’s look at Linux first to get a sense of that general technology and community, then take a direct look at this particular variation of the open source operating system.

Being Linus Torvalds

Like many major moments in computing or any field, when Linux was introduced, it didn’t seem like that big a deal until years later. On August 25, 1991, Linus Torvalds wrote a simple post in the Usenet newsgroup comp.os.minix. “I’m doing a (free) operating system (just a hobby, won’t be big and professional like gnu) for 386(486) AT clones,” he wrote in part. “This has been brewing since april, and is starting to get ready.” [sic]

The free OS that Linus was casually announcing would end up becoming a major piece of computing networks worldwide. Suffice it to say that today, Linux is not just a single developer’s hobby.

As the operating system began to take the world by storm, Glyn Moody of Ars Technica became interested in the steps that preceded its initial release. He flew to Helsinki, Finland, in December 1996 to speak with Torvalds at his home, resulting in the story detailed below.

The story of Linux

Linus started attending the University of Helsinki in 1988, where he worked on a degree in computer science. In 1990, he became familiar with the Unix operating system in one of his classes. The course had a cap of 16 students because that was the capacity of the school’s license. Torvalds was immediately drawn to the operating system, feeling that its coding interface was surprisingly user-friendly.

One of the textbooks for the class was Operating Systems: Design and Implementation. The book included source code for the OS Minix, which had become available on the Intel 80386 processor. Linus was very interested in chips and thought the 80386 was the best he had seen from the company.

The chip sparked a technological leap, in part because Torvalds had money from student loans and Christmas. “That’s when I actually broke down,” Torvalds told Moody. “I remember the first non-holiday day of the New Year I went to buy a PC.”

Linus bought his PC in January 1991. However, he couldn’t yet work with Minix because he didn’t have the floppy disks. While he waited, he played Prince of Persia and started running tests on the 80386 chip.

He wanted to know how effectively the computer chip could switch from one process to another. He would run two tasks, with a timer set to alternate between them. One task simply wrote the letter A, while the other wrote the letter B. He was not programming very much at that point because he was getting to know the parameters of the Intel CPU.

As bizarre as it may sound, the bare-bones task-alternating project eventually morphed into the Linux kernel. Torvalds realized that he could change the A and B tasks to emulate a terminal. He had one task that was moving information from a keyboard to a modem, while another one brought data from the modem to the monitor.
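As a playful illustration, the A/B experiment amounts to a tiny round-robin scheduler switching between two endless tasks. This sketch is in Python rather than Torvalds’ 386 assembly, with generators standing in for his hardware timer and tasks:

```python
import itertools

def task(letter):
    """An endless task that just emits its letter, like Torvalds' A/B jobs."""
    while True:
        yield letter

def scheduler(tasks, ticks):
    """Round-robin 'timer interrupt': switch to the next task every tick."""
    out = []
    for _, current in zip(range(ticks), itertools.cycle(tasks)):
        out.append(next(current))
    return "".join(out)

print(scheduler([task("A"), task("B")], 6))  # ABABAB
```

Swapping the letter-printing bodies for a keyboard-to-modem task and a modem-to-screen task is, in spirit, the step that turned the toy into a terminal emulator.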

“I had keyboard drivers because I obviously needed some way to communicate with this thing I was writing,” Linus explained, “and I had [a] driver for text mode VGA and I wrote a driver for the serial line so that I could phone up the University and read news.” In other words, he was simply gathering information from newsgroups via the modem.

An advantage of drawing from the newsgroups was that the comments therein helped the young programmer to revise and strengthen the developing OS throughout the summer of 1991. Linus also realized he wanted to be able to download, so he programmed a disk driver. He additionally had to create a file system that could draw from the Minix file system for writing and reading during upload and download. Unix is essentially composed of these basic components, Torvalds noted: alternating between processes, drivers for your devices, and the file system.

Linux received its name by accident, really. Linus needed to know the POSIX standards that make systems similar to Unix compatible with one another. These specifications were a bit expensive, according to a professor at the university, Ari Lemmke. Lemmke added that he was actually focused on operating systems and kernels himself.

“He had this small area on [the FTP server], and he said: ‘[H]ey, I’m putting a directory aside for you,” said Torvalds. “So he created the /pub/os/linux directory.”

Linux was the name Linus had given the project while it was in initial development, but he never intended for that to be the name of the OS when it was released publicly; he feared people would think he was arrogant. He wanted instead to call it Freax, for “free Unix.” But Lemmke saved it under the work-in-progress name Linux, and it simply moved forward under that heading.

The first version of the OS was released via email to some contacts from the newsgroups. Torvalds rushed that version to get something up on the FTP site to which he had access. The next version, which he announced via the Minix newsgroups, represented a vast improvement.

Still, the original base of users was minuscule. “I don’t know how many people got [this first public version in comp.os.minix],” Linus commented. “[P]robably 10, 20, this kind of size.”

Things you wanted to know about CentOS but were afraid to ask

Now let’s look down the line at CentOS, one of the most prominent offspring of Linux.

Known for its stability, consistency, easy-to-use administration, and straightforward replication, this flavor of the open source OS was created as a spinoff of Red Hat Enterprise Linux (RHEL).

Beyond the OS itself, the CentOS Project – the entity that manages development of the platform – serves an organizational role by providing resources so that other groups can more easily develop tools based on the CentOS system.

CentOS, which was first announced in March 2004, is community-developed, based on source code released at no cost by Red Hat. Part of its grounding is that it should maintain compatibility with RHEL. The OS is free to download, use, and make available to others.

The community consists of a core development team and users ranging from casual Linux fans to corporate system administrators.

The basic idea behind the CentOS Project is to give people a strong system for open source groups to use and extend. The framework can be utilized by hosting companies and for processing of scientific data, for instance. Organizations are able to place their programs on a reliable platform.

The CentOS Governing Board consists of original project members and Red Hat personnel, all of whom help with development of the ecosystem.

The Project was designed in a similar manner to the esteemed Apache Foundation. “A governing board… oversees various semi-autonomous Special Interest Groups or SIGs,” notes the CentOS site. “These groups are focused on providing various enhancements, addons, or replacements for core CentOS Linux functionality.”


Want to see CentOS in action? At KnownHost, our managed VPS hosting packages, based on CentOS Linux, give you the flexibility and power of a dedicated server without the high price tag. Learn more.


With DDoS Attacks on the Rise, Are You Protected?

Recent reports suggest that DDoS attacks approximately doubled between 2015 and 2016. It’s becoming more apparent all the time that protection against this form of attack is a necessity for online businesses.

  • Study 1: 125% YOY Bump
  • Study 2: 83% YOY Bump
  • Study 3: 129% YOY Bump
  • Keeping Your Infrastructure Protected

Study 1: 125% YOY Bump

Your website and other digital assets could be more vulnerable than they have ever been, according to a report of Q1 2016 activity highlighted by ZDNet in June. The study found that there was a year-over-year acceleration in distributed denial of service (DDoS) attacks since the first quarter of 2015.

As if that isn’t enough cause for concern, the analysis didn’t just show a greater number of attacks. It also showed that they are longer-lasting than they have been in the past. The Q1 results show that a typical attack continues for over 16 hours, up from less than 15 hours in Q1 2015.

To review: attacks are both more prevalent and take longer to subside. Both of these statistics are clearly bad news for those doing business online. Plus, huge DDoS attacks, delivering 100+ gigabits per second (Gbps) of bogus botnet requests, occur more frequently than they have historically. There were eight attacks of that scope in the first three months of 2015. That number more than doubled in Q1 2016 to 19, a boost of 137.5%.
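The 137.5% figure checks out directly from the counts above:

```python
def pct_increase(old, new):
    """Percentage growth from an old count to a new one."""
    return (new - old) / old * 100

# Eight 100+ Gbps attacks in Q1 2015 vs. 19 in Q1 2016:
print(pct_increase(8, 19))  # 137.5
```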

You can also see how strongly cybercriminals started the year by looking at the last three months of 2015. In October through December, there were just five of those 100+ Gbps mega-attacks.

Plus, reported Steven J. Vaughan-Nichols of ZDNet, the Q1 2016 study detected 4,523 DDoS assaults. “That’s a significant increase from the previous quarter’s 3,693 attacks,” he said. “This increase was largely driven by repeat attacks on customers rather than cyber crooks going after more targets.”

Once you are selected as a DDoS attack target, you have a major problem, because it likely won’t be a single event but dozens of them. Even back in early 2015, each victim experienced an average of 15 attacks. In Q1 2016, that per-target number grew to 29.

Previously, the cybercriminals would realize protections were present and shift to a different target. In 2016, they are pounding sites over and over again in an effort to break through if the shields are ever inactive. Gaming sites are particularly vulnerable to a flurry of DDoS events because even a slight reduction in load times can noticeably impact performance. DDoS attacks are also becoming more repetitive because the attack tools are now, unfortunately, less expensive and more user-friendly.

“Indeed, DDoS attacks no longer require any hacking or networking skills,” noted Vaughan-Nichols. “DDoS-for hire sites now enable anyone with Bitcoin to launch multiple simultaneous attacks from an easy-to-use interface with a menu of attacks.”

Keep in mind, 29 attacks is simply an average. There are horror stories that go far beyond that level. For example, one company experienced 283 attacks just in the first quarter of 2016, which comes out to more than three each day.

The silver lining in the DDoS world is that the top end of the monstrous mega-attacks has receded from previous heights. The largest attack detected in the first quarter by these researchers was 289 Gbps, a decline from a high of 309 Gbps in the last quarter of 2015. The dubious distinction of “victim of the largest attack ever” goes to a French website, which experienced a DDoS in 2014 that nearly hit 400 Gbps. The super-attacks are becoming less grandiose in scope because ISPs have grown more capable of defending against the attack platforms, making those platforms much less efficient.

Study 2: 83% YOY Bump

That is not the only recent analysis that has found DDoS attacks to be increasing to a disturbing degree. An analysis highlighted in BetaNews found that second-quarter attacks went up 83 percent year-over-year from 2015 to 2016, reaching 182,900.

You may think that companies in the United States are the most at risk for these Internet events. However, this study found that Russia was getting the brunt of these efforts. Starlink, a Russian ISP that handles the traffic of SMBs and enterprises, was pummeled with more than 2 out of every 5 attacks detected, over a 48-hour period.

Much of the discussion of DDoS revolves around DDoS-for-hire scenarios simply because they account for so much of the activity, but the analysts of this study believe the Starlink attacks are a project by nationalist hacktivists wanting to hurt Russia’s economy.

The paper’s chief scientist, Terrence Gareau, explained that he doesn’t expect the number of DDoS attacks to fall even as many attackers shift to phishing, ransomware, and other forms of financially motivated assaults. “Organizations can expect cyberattacks to continue growing in frequency this year, especially with more attention on the Summer Olympics and the November election season in the US,” he said in July. “The results from this quarter also show how important it is to not only protect your website, but also to plan for new payloads and attacks on your infrastructure.”

Although Russia held the crown as the biggest target country for this sort of cybercrime, the United States was second, with China third. In South America, Brazil was still in the top ten nations; however, there were fewer than half as many Brazilian attacks as in Q2 2015.

The researchers of this study additionally documented a rise in other types of cybercrime, such as multicast domain name system (mDNS) and routing information protocol (RIP) attacks. Hackers are testing different approaches to go after sites. Again, because the US presidential election is approaching and because the Olympics occurred in the third quarter, security experts expected to see an even higher quantity of DDoS attacks moving forward.

Study 3: 129% YOY Bump

Keep in mind that these studies are typically performed by security firms, content delivery companies, and similar entities – and they cover not the whole Internet but those companies’ own customers. Since different analyses draw on different samples, you’ll see different statistics each quarter.

An additional study of the second quarter, featured in Infosecurity Magazine, found that attacks were still rising similarly to the first quarter – even more dramatically, in fact. This analysis detected 4,919 DDoS events.

The scope of attacks hit a ceiling in 2014, as discussed above. However, this study measured its largest attack, against a European media company, at nearly the same level – 363 Gbps. There were a total of a dozen assaults during the second quarter that were greater than 100 Gbps. The two largest ones, both of which were greater than 300 Gbps, were of media sites.

However, this study had good news in terms of average attack size: attacks were, on average, 36% less powerful, measuring 3.85 Gbps.

Although the size of the typical DDoS is shrinking a bit, attacks are becoming more common all the time, since DDoS platforms are so simple to deploy and generate money for the perpetrators, explained the paper’s head author, Martin McKeay. “This commoditization renders businesses vulnerable to a higher frequency of attacks they can’t defend against on their own,” he said. “[I]t is important for organizations to understand what they are up against, specifically as adversaries increasingly threaten DDoS attacks for ransom.”

Keeping Your Infrastructure Protected

It’s clear from the above statistics that companies both large and small need protection from DDoS attacks. At KnownHost, we offer 24/7 fully managed support for our VPS hosting plans, with comprehensive DDoS protection standardly included. Learn more.
