INTELLIGENT BRANDS // Enterprise Security
the data centre, and could cause poor
performance and system errors. To
maintain high availability, it is essential
to keep garbage data under control.
Common culprits are installation files
duplicated at several locations, as well
as virtual machines that are invisible
because they have been removed from
the infrastructure inventory, but not
permanently deleted.
More often than not, garbage data is
kept because nobody knows what it is,
and no one wants to delete it in case it’s
something important. This method of
keeping useless data is a legacy from
the days when data protection and
availability solutions were much less
sophisticated, and restoring lost data
was a cumbersome and difficult process.
Today data recovery is much quicker,
allowing employees to recover what they
want, when they want. Whether you have
lost a backup copy of an important piece
of data or unintentionally deleted some
garbage data, it can usually be restored
within seconds.
Plan ahead
To ensure that services, applications
and data are available at all times, it is
not only IT solutions that must be put in
place, but also routines. Planning to
restore data as quickly and easily as
possible when a problem arises is
essential to avoid unnecessary downtime
and loss of corporate revenue.
When processes for recovery are in place
and well known by corporate employees,
it should take no more than 15 minutes
to get systems up and running again.
Availability is as important during times
of low staff levels as any other time,
and downtime remains costly no matter
what time of year it occurs. A survey
performed on behalf of Veeam revealed
that companies risk losing millions of
dollars to downtime and solutions that
do not work as they should, as well as
lost productivity and data. This
cost only increases as more time passes,
and unless procedures are put in place
sooner rather than later, there is high risk
of unnecessarily long downtime and high
revenue loss.
At a glance…
Ransomware domains increase 35-fold in Q1, says Infoblox
Infoblox has released the Infoblox DNS Threat Index for the first
quarter of 2016, highlighting a 35-fold increase in newly observed
ransomware domains from the fourth quarter of 2015. This dramatic
uptick helped propel the overall threat index, which measures creation
of malicious Domain Name System (DNS) infrastructure
including malware, exploit kits, phishing, and other threats, to its
highest level ever.
Ransomware is a relatively brazen attack where a malware
infection is used to seize data by encrypting it, and then payment
is demanded for the decryption key. According to Rod Rasmussen,
vice president of cybersecurity at Infoblox, “There has been a
seismic shift in the ransomware threat, expanding from a few
actors pulling off limited, small-dollar heists targeting consumers
to industrial-scale, big-money attacks on all sizes and manner of
organisations, including major enterprises. The threat index shows
cybercriminals rushing to take advantage of this opportunity.”
The FBI recently revealed that ransomware victims in the
United States reported costs of $209 million in the first quarter
of 2016, compared to $24 million for all of 2015. High-profile
Q1 ransomware incidents include the February 2016 attack on
Hollywood Presbyterian Medical Centre in Los Angeles and the
March 2016 breach at MedStar Health in Washington D.C.
The Infoblox DNS Threat Index hit an all-time high of 137 in
Q1 2016, rising 7% from an already elevated level of 128 in the
prior quarter, and topping the previous record of 133 established
in Q2 2015. The Infoblox DNS Threat Index tracks the creation
of malicious DNS infrastructure, through both registration of
new domains and hijacking of previously legitimate domains
or hosts. The baseline for the index is 100, which is the average
for creation of DNS-based threat infrastructure during the eight
quarters of 2013 and 2014.
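The baseline-100 normalisation described above can be sketched as follows. This is a minimal illustration, not Infoblox's actual methodology: the per-quarter domain counts are hypothetical, and only the normalisation scheme (the eight quarters of 2013–2014 average to 100) and the quoted index values (128 rising to 137, roughly 7%) come from the article.

```python
# Sketch of a baseline-100 index like the Infoblox DNS Threat Index.
# The raw counts below are hypothetical; only the normalisation scheme
# (baseline = average of the eight quarters of 2013-2014, scaled to 100)
# is taken from the article.

def threat_index(observed: float, baseline_avg: float) -> float:
    """Scale an observed count so the baseline average maps to 100."""
    return 100.0 * observed / baseline_avg

# Hypothetical baseline: new malicious-domain counts for 2013-2014.
baseline_quarters = [900, 950, 1000, 1050, 1000, 1020, 1030, 1050]
baseline_avg = sum(baseline_quarters) / len(baseline_quarters)

# By construction, the baseline period itself averages to 100.
assert round(threat_index(baseline_avg, baseline_avg)) == 100

# The quarter-over-quarter rise quoted in the article: 128 -> 137,
# which works out to about a 7% increase.
rise = (137 - 128) / 128 * 100
print(f"Q4 2015 -> Q1 2016 rise: {rise:.0f}%")
```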