Proactive Log Management in Banking:
Why it is important and what inhibits it
By Van Symons, President, Clear Technologies, Inc.
Executive Summary
With annual industry expenditures for technology topping an estimated $30 billion, the banking
industry clearly relies on technology to better serve customers. This dependence has continued to
grow since the Internet and e-commerce exploded in the 1990s. The unintended consequence of this
reliance on computers is exposure to data breaches.
Why do breaches occur? In speaking with customers and reviewing existing research, we find
that a majority of breaches in banking occur for three reasons:
1. Outsourced data
2. Hacking is a lucrative business
3. Employee retribution
Why should the banking industry care? The banking industry should be particularly interested in
mitigating data breaches because:
1. It costs a lot to fix
2. Brand blemish
3. Intellectual property
4. Regulations/Laws
5. Mandates
6. Standards/Controls
Attenuate breach impacts. Because it typically takes attackers days to get into a company’s network
and steal data, a recent Verizon RISK and U.S. Secret Service Data Breach Investigations Report
recommended that IT should constantly monitor server activity and red-flag any suspicious activity.
The best method of vigilantly monitoring devices and applications is to monitor their logs.
Because the banking industry relies so heavily on technology to serve customers, monitoring
this "log data", that is, log management for devices, servers, and applications, is too
important a task to be overlooked.
The causes of log management lapses. Despite log management being a great first line of defense
against a data breach, log analysis is seldom performed adequately. To ensure adherence
to laws and mitigate the ramifications of a breach, banking IT executives must first
understand the human factors that inhibit this important task:
1. Most people dislike tedious work.
2. No time to ensure uptime; no time to prevent downtime.
3. “NAH”: Not Affected Here.
The Real Solution. Log data management is too important a task to be overlooked. The best way
to counteract these three behavioral issues is to give your IT staff the right solution to their
problem, which in turn resolves yours.
Since 1993, Clear’s customers have relied on them to meet their hardware needs. Today, their customers look to them to increase their organizational
effectiveness by providing continuity, infrastructure, security, and virtualization solutions. Based in Coppell, Texas, Van can be reached at
www.cleartechnologies.net/DynamicLogAnalysis or (972) 906-7500 or vsymons@cleartechnologies.net.
With annual industry expenditures for technology topping an estimated $30 billion,
the banking industry clearly relies on technology to better serve customers. This
dependence has continued to grow since the Internet and e-commerce exploded
in the 1990s.
Exposure to computer systems’ vulnerabilities has also grown at an alarming rate as
attackers strive to identify and make the most of vulnerabilities. Consequently,
computers are attacked and compromised on a daily basis. A recent Verizon RISK
and U.S. Secret Service Data Breach Investigations Report stated that servers and
applications comprise 50% of all breached assets. These attacks steal personal
identities, bring down entire networks, disable the online presence of businesses, or
destroy sensitive information that is critical for personal or business purposes. One
security survey noted that in 1997, 37% of respondents reported a breach. A 2009
report by the Ponemon Institute, a privacy management research firm, reported a
figure of 85%. Banks are especially susceptible.
Over the past several years, Virginia’s DHS system, TJX, Heartland Payment Systems,
Google, and T-Mobile have been adversely affected by breaches. The Heartland
Payment Systems breach had a ripple effect that exposed customer credit card
details at over 675 regional banks. Interestingly, since the breach, six of the banks
have, according to the FDIC, failed (1st Pacific Bank of California, Columbia River
Bank, Prosperan Bank, Rainier Pacific Bank, Sun West Bank, and TierOne Bank).
Why Do Breaches Occur?
In speaking with customers and reviewing existing research, we find that a majority
of breaches in banking occur for three reasons: the increase in outsourced data,
the profitability of hacking, and employee retribution.
Outsourced Data. Increasingly, cost conscious companies in the banking industry are
outsourcing work to achieve economies of scale. The unintended consequence, as
stated by the 2010 Ponemon study is that 42% of all breach cases involved third party
mistakes. Data breaches involving data outsourced to third parties, especially when
the third party is offshore, are the most costly. The per capita cost of data breaches
involving third parties is $217 versus $194 otherwise, a $23 difference, according to
Ponemon. By comparison, the per capita cost of a data breach involving a negligent
insider or a systems glitch averages $154 and $166, respectively.
Hacking is a lucrative business. In November 2008, the Atlanta-based U.S. payment
processing division for Royal Bank of Scotland, RBS Worldpay network, was infiltrated
by hackers. The hackers obtained unauthorized access and were then able to
reengineer personal identification numbers from a data feed, and defeat the credit
card processing system's encryption. In half a day, the hackers stole over $9 million.
Hackers utilize multiple methods to obtain sensitive information, including stealing
computers, combing through sensitive lost documents, brute-force attacks, and
viruses. According to the Internet Security Threat Report published by Symantec in
April 2009, attackers released Trojan horses, viruses, and worms at a record pace in
2008, primarily targeting computer users’ confidential information, in particular their
online banking account credentials. Symantec documented a record of 1.6 million
instances of malicious code on the Web in 2008, about one million more than 2007.
Twenty-four percent of all cases in Ponemon’s 2010 study involved a malicious or
criminal attack that resulted in the loss or theft of personal information. Moreover, the
two types of data most often compromised are credit card information (54% of all
breaches) and bank account information (32% of all breaches) according to the
Verizon RISK and U.S. Secret Service Data Breach Investigations Report.
Employee retribution. In 2008, the Bank of New York breach made national headlines.
In the second quarter of 2010, the New York County District Attorney's Office stated
that a computer technician formerly employed as a contractor at the headquarters
of the Bank of New York was to blame. Over an eight year period, the culprit stole
personal identifying information of 2,000 bank employees and organized thefts of
more than $1.1 million from various organizations, including charities and nonprofit
organizations.
The Identity Theft Resource Center, a San Diego based nonprofit, found that of the
roughly 250 data breaches publicly reported in the United States between January 1
and June 12, 2008, victims blamed the largest share of incidents on theft by
employees (18.4%). This year, the 2010 Data Breach Investigations Report by Verizon
RISK and the U.S. Secret Service found that 48% of data breaches across all industries
were caused by insiders.
Why Should I Care?
In recent years, banks have paid increasing attention to IT security. This is
understandable given the sheer amount of information now in digital form. A recent
InformationWeek Analytics survey revealed that 75% of its executive-level
respondents (across all industries) stated that information security is among their
highest priorities. The reasons include the following: a breach costs a lot to fix,
diffuses the strength of a brand, places intellectual property at risk, and has
prompted widespread regulations, mandates, and control standards.
It costs a lot to fix. Executives are focused on information security because of the
accompanying liability costs of the ever-increasing volume of corporate and
personal information theft. In certain cases, these events result in costly lawsuits with
much of the expense going to litigation service firms that sift through inaccessible,
unorganized volumes of data. According to the American Bankers Association, some
of the biggest costs associated with a breach are those from reissuing credit and
debit cards, covering fraudulent charges from stolen card numbers, and closing
accounts placed at risk. In other cases, companies incur the expense of setting up
credit monitoring services for customers affected by the breach. According to the
latest Ponemon Institute study, the cost per compromised customer record is $204
and the average total cost of a data breach is $6.75 million, which is up by 44% since
2006.
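As a rough check on the scale these two figures imply, the following Python sketch (using only the cited numbers; the calculation itself is ours, not Ponemon's) estimates how many records an average breach involves:

```python
# Back-of-the-envelope check on the Ponemon figures cited above.
per_record_cost = 204          # USD per compromised customer record (Ponemon)
avg_breach_cost = 6_750_000    # USD average total cost of a breach (Ponemon)

# Implied number of records compromised in an average breach.
implied_records = avg_breach_cost / per_record_cost
print(round(implied_records))  # -> 33088, roughly 33,000 records per breach
```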
The Internet Crime Complaint Center, a partnership of the FBI, the National White
Collar Crime Center, and the Bureau of Justice Assistance, reported that the number of
complaints from victims of cyber crime rose by almost a third since 2007. The total
number reached 275,284, amounting to $265 million in money lost. Research shows
data breaches involving malicious or criminal acts are much more expensive than
incidents resulting from negligence or a systems glitch ($154 and $166 per record,
respectively). The per capita cost of a data breach involving a malicious or criminal
act averages $215. In instances where a bank issued cards affected by a breach,
these costs can mount quickly, and the bank ends up bearing all of the costs itself.
Brand blemish. Next, executives are focused on information security in order to
preserve brand value. For years, BusinessWeek/Interbrand has published its yearly
ranking of the top 100 brands. Because stability is one of the factors in determining
a brand’s value, one can assume that customers will doubt the stability of a
brand that cannot protect their information.
John Watkins, the former SVP of online services at Wachovia (now Wells Fargo),
echoes this sentiment, "Data breaches and any type of security concern in the online
space affect customer confidence…We're concerned that customers will lose
confidence if we can't provide them with a good feeling that they are safe online. It's
about trust."
Building trust is especially needed in the already negative-publicity prone banking
industry. “For the second year in a row, banks experienced a significant increase in
complaints coinciding with 140 bank failures in 2009,” said the Better Business Bureau
(BBB). “Trust in the financial sector is already extremely low and the dramatic increase
in BBB complaints against banks reflects the growing discord between consumers
and the industry.” As a result, the banking industry has been through what marketing
experts call brand image turmoil in the aftermath of the financial meltdown.
Because the banking industry understands that consumer perceptions of trust are
important, this year banks have increased their advertising budgets to offset the
negative publicity and rebuild consumer trust. One example is Citizens Bank’s latest
campaign, ‘Good Banking is Good Citizenship.’ Warren Zafrin, a partner with KPMG,
sums it up best, "Over time, security will be a differentiator for banks. It's about
trust…security really goes beyond preventing data breaches to enhancing
relationships.”
Intellectual property. Because superior intellectual property leads to service
differentiation, executives view it as a key asset that, in the midst of hard economic
times, ensures revenue, market share, and long-term profitable growth. An
intellectual property breach can include unauthorized access, copying, disclosure or
use of client information, trade secrets, copyrighted materials, ongoing research,
strategy, M&A plans, and other such information.
Bradford Newman, the leader of the International Employee Mobility and Trade
Secrets practice at the law firm Paul Hastings Janofsky & Walker LLP, states, “Most
banks do have good data security practices. But to recover that data from the
thousands of employees across the globe is a new risk… companies first have to ask
themselves what their trade secrets are, where the most at-risk secrets lie, and, in
connection with the recent layoffs, how they can reduce the risk of disclosure and
maximize the chance of recovering the data." As such, protecting intellectual
property is essential for any organization.
Regulations/Laws. The current system for regulating and supervising financial
institutions is complex. According to the FDIC, “This complicated regulatory structure
came about because financial regulation has been responsive to several traditional
themes in U.S. history. Among them are a distrust of concentrations of financial
power, including a concentration of regulatory power; a preference for market
competition; and a belief that certain sectors of the economy should be ensured
access to credit, a belief that has led to a multiplicity of niche providers of credit. The
nation’s complex regulatory structure was designed to deal with all of these
sometimes conflicting objectives.”
Although many of the newer laws have focused on consumer protection, a number
of others have addressed issues of regulation and supervision related to concerns
about safety and soundness. As such, banks in the United States are obligated to
reduce ‘operational risk’, the risk of loss resulting from inadequate or failed internal
processes, people, and systems, or from external events, by monitoring systems and
procedures to detect actual and attempted attacks on or intrusions into customer
information systems, under the following laws:
1. Section 216 of the Fair and Accurate Credit Transactions Act (2003) (FACT Act)
- must provide for the identification, detection, and response to patterns,
practices, or specific activities – known as “red flags” – that could indicate
identity theft. Within banks, this requirement can only be sufficiently met
through monitoring.
2. Section 501(b) of the Gramm-Leach-Bliley Act (1999) states that a bank should
manage and control risk by “monitoring systems and procedures to detect
actual and attempted attacks on or intrusions into customer information
systems” both internally and with service providers. It is the all-encompassing
successor to the following:
a. Section 39 of the Federal Deposit Insurance Act
b. Code of Federal Regulations: Title 12: Banks and Banking - Code of
Federal Regulations, Part 30, Appendix B
c. Code of Federal Regulations: Title 12: Banks and Banking - Code of
Federal Regulations, Part 208, Appendix D-2
d. Code of Federal Regulations: Title 12: Banks and Banking - Code of
Federal Regulations, Part 225, Subpart J, Appendix F
e. Code of Federal Regulations: Title 12: Banks and Banking - Code of
Federal Regulations, Part 263, Subpart I, Appendix D-1
f. Code of Federal Regulations: Title 12: Banks and Banking - Code of
Federal Regulations, Part 364, Appendix B
g. Code of Federal Regulations: Title 12: Banks and Banking - Code of
Federal Regulations, Part 570, Appendix
h. Code of Federal Regulations Title 12 (Banks and Banking)
3. USA Patriot Act (2001). Although this Act does not directly ask an organization
to monitor, it increases the ability of law enforcement agencies to search
telephone, email communications, medical, financial, and other records. As a
result, log management is necessary for compliance.
4. Sarbanes-Oxley Act (2002). The formal name of this act is the Public Company
Accounting Reform and Investor Protection Act of 2002. This act requires the
boards, accounting firms, and management of publicly traded firms to adhere
to a higher set of financial recording and reporting standards. The reporting
requirements can only be sufficiently met through monitoring.
5. California Senate Bill 1386. California Senate Bill 1386 was introduced in July
2003. The bill was the first attempt by a state legislature to address the problem
of identity theft by introducing stiff disclosure requirements for businesses and
government agencies that experience security breaches that might expose
the personal information of California residents. Implied in the bill is that, in order
to assess compliance, an organization should monitor its devices and
applications regularly so it can adhere to the following: "Notice must be given to any
resident of California whose PI is or is reasonably believed to have been
acquired by an unauthorized person." Notice must be given in the "most expedient
time possible" and "without unreasonable delay," subject to certain provisions
that define what is reasonable for your organization.
Mandates. Mandates by the Basel Committee on Banking Supervision (Basel II) and
the Payment Card Industry Data Security Standard (PCI DSS) both seek to manage risk.
Basel II. Basel II improved on Basel I, first enacted in the 1980s, by offering more
sophisticated models for calculating the regulatory capital required to back risky
investments, such as subprime mortgage assets moved to unregulated parts of
holding companies. In addition to safeguarding bank solvency
while protecting the international financial system, Basel II also strives to reduce
operational risks. However, the Basel Committee on Banking Supervision recognizes
that operational risk is a term that has a variety of meanings and therefore, for
internal purposes, banks are permitted to adopt their own definitions of operational
risk, provided the minimum elements in the Committee's definition are included.
Through compliance, banks are assured that they hold sufficient capital reserves for
the risk they take on through their lending and investment practices.
PCI DSS. The Payment Card Industry Data Security Standard (PCI DSS) requires that
adequate activity logs be produced, that access to those logs be restricted, and that
logs be reviewed daily, requirements encompassed in the following guidelines:
• 10.5 – Secure audit trails so they cannot be altered. Access to audit trails must be
limited to read-only access. If audit trails can be altered outside of the application,
monitoring controls should be implemented via file-integrity monitoring tools as
required in DSS 10.5.5. Any alteration of audit trails should be investigated.
• 10.6 – Review logs for all system components at least daily. Log reviews must
include those servers that perform security functions such as IDS and VPN.
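As an illustration of the file-integrity monitoring idea behind requirement 10.5, here is a minimal Python sketch, a hypothetical example rather than a PCI-validated tool: it baselines a cryptographic digest of each log file and flags any file whose digest later changes.

```python
import hashlib
from pathlib import Path

def hash_file(path: Path) -> str:
    """SHA-256 digest of a file; any alteration of the log changes the digest."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def detect_tampering(baseline: dict, log_dir: Path) -> list:
    """Compare current digests of *.log files against a stored baseline.

    A mismatch means the audit trail was altered outside the normal append
    path and, per DSS 10.5.5, should be investigated for propriety."""
    altered = []
    for path in sorted(log_dir.glob("*.log")):
        known = baseline.get(path.name)
        if known is not None and known != hash_file(path):
            altered.append(path.name)
    return altered
```

A real deployment would also store the baseline on write-once media and re-baseline only on legitimate log rotation.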
The American Bankers Association (ABA). The ABA is the largest banking trade
association representing all categories of banking institutions, including community,
regional, and money center banks. Although they do not have a mandate, the ABA
supports data security legislation and regulatory policy that creates a uniform
standard for data security across all types of businesses.
Standards/Controls. Standards like the Control Objectives for IT (COBIT), the ISO 27001
standard for Security Management, and the NIST Standards all seek to manage risk.
COBIT. The Control Objectives for Information and related Technology (COBIT) is
based on a “plan-build-run-monitor” framework and is a comprehensive set of IT
management best practices managed by the IT Governance Institute (ITGI). The
best practices are divided into four domains (Plan, Build, Run and Monitor) and 34
high-level processes. It relies on an understanding of the inter-relationships among
technologies across the enterprise and a real-time understanding of risks, impacts,
and operational variables. Its goal is to instill vigilance through monitoring.
ISO27001/2. ISO27001/2 is based on a “plan-do-check-act” framework and is derived
from ISO 17799; ISO 27001 and 27002 (together known as ISO27001/2) were
renumbered in 2007 to conform to the ISO 27000 family numbering scheme.
ISO27001/2 is a widely accepted international standard for information security,
established by the International Organization for Standardization, that offers a broad
set of best practices for information security controls and assists organizations of any
type - commercial, governmental, or nonprofit - in managing information security.
ISO27001/2 also provides oversight of individual security controls. These controls call
for the monitoring and analysis of data generated by all systems, including the IT
infrastructure, network appliances, and security solutions throughout the enterprise.
The framework comprises twelve security clauses that include 39 security categories
with hundreds of control objectives overall. Its goal is to mitigate risk through active
vigilance.
Mortgage Bankers Association (MBA). The MBA is a national association that
represents the real estate finance industry and includes more than 3,000 mortgage
companies, mortgage brokers, commercial banks, thrifts, life insurance companies,
and others in the mortgage lending field. Their Board of Directors Technology Steering
Committee (BoDTech) released a comprehensive approach to information
assurance, which analyzes three critical areas of information assurance: legislative
and regulatory, audit practices, and security standards and framework. The model
was created specifically so that mortgage firms could establish a comprehensive set
of processes that would cover the requirements of the many compliance programs.
The goal of the model is to provide a way for the mortgage industry to assure that
information, data, applications, and processes are in place to protect a firm, its
operations, and its customers. The policy and architecture step takes a deeper dive
into a firm's operating environment. It addresses audit practices, including the
monitoring of devices and applications.
NIST Standards. The National Institute of Standards and Technology is a US federal
technology agency that develops and promotes measurement, standards, and
technology, and relies on a functional-area framework of management, operational,
and technical safeguards. Most banks have adopted this control framework. The
specific log management control outlined in the NIST standards is AU-6, Audit
Monitoring, Analysis, and Reporting. In a nutshell, the control states that an
organization should report indications of inappropriate or unusual activity to an
organization official and be aware of change in risk to organizational operations.
The “control enhancements” category distills the broad goals set forth by
AU-6. To that end, NIST recommends:
1. An organization’s information system must first integrate audit review, analysis, and
reporting processes to support organizational processes for investigation and
response to suspicious activities.
2. An organization’s auditable data needs to be integrated, centralized, robust, and
be able to thoroughly analyze data from multiple devices.
3. An organization should correlate information from audit records with information
obtained from monitoring physical access to further enhance the ability to identify
suspicious, inappropriate, unusual, or malevolent activity.
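A toy illustration of the third enhancement in Python (the event shapes and the 30-minute window are our assumptions for illustration, not part of the NIST control): flag audit events whose user shows no physical badge access near the time of the event.

```python
from datetime import datetime, timedelta

def correlate(audit_events, badge_swipes, window_minutes=30):
    """Flag audit events with no badge swipe by the same user within the window.

    Both arguments are lists of (user, datetime) pairs. An event with no nearby
    physical access, e.g. an after-hours session under a local account, is a
    candidate for AU-6 review."""
    window = timedelta(minutes=window_minutes)
    return [
        (user, when)
        for user, when in audit_events
        if not any(u == user and abs(when - t) <= window for u, t in badge_swipes)
    ]

audit = [("alice", datetime(2010, 6, 1, 9, 0)), ("bob", datetime(2010, 6, 1, 2, 0))]
badges = [("alice", datetime(2010, 6, 1, 8, 45))]
print(correlate(audit, badges))  # -> [('bob', datetime.datetime(2010, 6, 1, 2, 0))]
```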
Statement on Auditing Standards No. 70 (SAS 70). SAS 70 was developed by the
American Institute of Certified Public Accountants to provide guidance to
organizations that provide third-party services and defines the standards an auditor
must employ in order to assess their internal controls. It is an internationally-recognized
standard that reviews all levels of technology service providers over a six-month
period for business practices, communication, internal procedures and security. The
SAS 70 report guidance articulates the requirements for assessing four items: the fair
presentation of management's description of controls, the suitability of the design of
management's controls, whether the controls are in place as of a specified date,
and whether the controls operated with sufficient effectiveness to determine that
management's control objectives were achieved. This standard applies to banks
(usually publicly traded) that rely on hosted data centers and third-party
processors to provide outsourcing services that affect the operation of the bank.
The Solution?
Attenuate breach impacts. A recent Verizon RISK and U.S. Secret Service Data Breach
Investigations Report recommended that IT staff should constantly monitor server
activity and red-flag any suspicious activity because it typically takes criminals days
to get into a company’s network and steal data. The best method of vigilantly
monitoring each device and application is to monitor their logs. Therefore, monitoring
“log data”, that is, log management for devices, servers, and applications, is too
important a task to be overlooked: it acts as a great first line of defense against a
data breach.
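The kind of constant monitoring and red-flagging the report recommends can be sketched in a few lines of Python. The log format, the failed-login pattern, and the five-attempt threshold below are illustrative assumptions; a bank would use a dedicated log-management product rather than an ad hoc script.

```python
import re
from collections import Counter

# Hypothetical syslog-style line: "... sshd: Failed password for <user> from <ip>"
FAILED_LOGIN = re.compile(r"Failed password for \S+ from (?P<ip>\d+\.\d+\.\d+\.\d+)")

def red_flag_ips(log_lines, threshold=5):
    """Return source IPs with at least `threshold` failed logins for review."""
    counts = Counter()
    for line in log_lines:
        match = FAILED_LOGIN.search(line)
        if match:
            counts[match.group("ip")] += 1
    return {ip: n for ip, n in counts.items() if n >= threshold}

sample = ["Jun 1 sshd: Failed password for root from 203.0.113.9"] * 6
print(red_flag_ips(sample))  # -> {'203.0.113.9': 6}
```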
The Problem with the Solution. Why IT puts us at risk. At one of our recent customer
visits, an IT executive was sharing his ongoing frustration with log management and
analysis. To complicate matters, he stated that the laws, regulations, and mandates
on companies of all sizes have made analyzing logs a necessity. He shared that
although his company had both the human and technology assets to perform the
analysis, his team could not do so in a repeatable and timely manner because of the
difficulty of the task.
Given his frustration, we probed further to find out what drives this difficulty. We
were surprised to learn that three factors explain why log management and
analysis is not performed: it is tedious, time consuming, and too abstract to tend to.
No one likes tedious work. Most IT personnel can be generalized as task-oriented
rather than people-oriented. Even so, they do not like to perform mindless tasks. Log
management falls into that category: an IT person must pore over reams of data and
somehow correlate and weigh each security risk, a truly tedious task.
No time to ensure uptime; no time to prevent downtime. On any given day, IT staff
perform multiple tasks that stretch their skills to the limit. One already-overworked IT
administrator stated that he is responsible both for maintaining a 98% service level for
his 900 users and for maintaining and reviewing log data, but he is evaluated only on
his service-level performance. Consequently, he seldom manages and reviews his
logs and hopes that an incident will not bring down his system.
“NAH”. We've all heard the phrase “NIH”, not invented here. With IT staff, however, we
constantly witness a belief system of “NAH”, not affected here. Because of the
limited time and multiple demands placed on IT staff, many are forced to hope for
the best. One IT analyst confided to us that he hoped never to have a breach, since a
breach would cost about $25,000 an hour in lost productivity and on-time delivery
performance.
The real solution. Log data management is too important a task to be overlooked.
To ensure adherence to laws and contain potential costs, IT executives must first
understand, address, and resolve the human factors that inhibit this important task.
The best way to counteract these three behavioral issues is to give your IT staff the
right solution to their problem, which in turn resolves yours.