For more information on this Corporate Presentation Toolkit, please direct your questions to the following people:
Main presentation deck and content: csaunders@websense.com
Financial information, analyst or case study content: rzarkos@websense.com
Product messaging and positioning: dmeizlik@websense.com
Use and sales cycle: jsharer@websense.com
I’d like to say a few words about Websense in case you are not familiar with our company. Over the last year, Websense has achieved a number of milestones. Websense is the global market-share leader in web filtering according to leading IT market research firms such as IDC. Today, more than 24,000 customers, representing over 19 million protected users, rely on Websense technology to manage their employee computing resources. Websense also had its best year financially in 2004; in fact, our annual billings have grown by 35% year over year for the last two years. Forbes recently recognized Websense as one of the fastest-growing technology companies for 2005. And most importantly, Websense remains committed to product research and development. This focus on R&D enables our products to win awards like the 2004 PC Magazine Editors’ Choice.
Websense provides fundamentally solid CMF/DLP functions for data in motion (network) and data at rest (discovery) in the same appliance. The company uses advanced detection techniques, including partial document matching, data fingerprinting and statistical analysis to detect character replacements. Competitive differentiators include network printing analysis and watermarking as a response, offered through a partnership with SourceMedia (formerly Thomson Media). The ability to offer end users self-remediation for quarantined e-mails, such as encrypt-and-forward, can reduce operational costs. The product is internationalized to detect content in double-byte character sets (a capability already in use in Japan), but the user interface is not localized. Websense acquired PortAuthority in January 2007 after a strategic partnership in 2006 and has announced that it intends to integrate the two companies' technologies in 2007. Before the acquisition, PortAuthority provided host functions through a partnership with Safend. The integration will likely involve adding content-awareness capabilities to the Websense Client Policy Manager host agent. Given the stability of its host-based technology, Websense should be well positioned to provide a comprehensive solution for data in motion, data at rest and the endpoint.
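As an aside, "statistical analysis to detect character replacements" refers to catching text in which characters have been swapped (leetspeak-style) to evade keyword matching. A minimal sketch of the general idea only, not Websense's actual algorithm; the substitution table and names here are hypothetical:

```python
# Hypothetical illustration: normalize common character substitutions
# before matching text against a list of sensitive terms.
SUBSTITUTIONS = str.maketrans({"@": "a", "0": "o", "1": "l", "3": "e", "$": "s", "5": "s"})

def normalize(text):
    """Lowercase the text and undo common character replacements."""
    return text.lower().translate(SUBSTITUTIONS)

def contains_term(text, terms):
    """Check whether any sensitive term appears after normalization."""
    normalized = normalize(text)
    return any(term in normalized for term in terms)

print(contains_term("Leaked p@$$w0rd list attached", {"password"}))  # → True
```

A plain keyword filter would miss "p@$$w0rd"; normalizing first catches the obfuscated form.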
To ensure uninterrupted business operations, more and more customers must overcome the challenges of data security. There are several distinct areas of focus:
Managing compliance and risk – Many businesses are now required to meet specific compliance mandates. Data loss (accidental or targeted) can often result in non-compliance, fines and lawsuits. Non-compliance can also disrupt business operations, with a negative impact on the bottom line.
Visibility – The first thing a business must understand is what types of data are stored on the network and endpoints, along with which communication methods are considered valid. Because data is stored in and accessed from databases, document repositories, file shares, end-user file systems, portable storage devices and more, gaining visibility into that information is very complex.
Securing business processes – Without controls to protect against accidental data loss and targeted attacks aimed at stealing sensitive data, businesses struggle to establish and secure their business processes. Aside from the business impact, loss of sensitive data can also adversely affect the company's brand and reputation.
Key Points: Whenever the PortAuthority Server receives a message from a messaging server or application, the PortAuthority Server (via its fingerprint engine) creates a real-time fingerprint of that message and its attachments in memory. That real-time fingerprint is compared against the existing database of known fingerprints, looking for any full or partial matches. The fingerprint library can be built through an automatic fingerprinting process that updates on a regular schedule or whenever records are added, modified or deleted.
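The fingerprint-and-compare flow above can be sketched in miniature. This is an illustrative model only, assuming a simple shingle-hashing scheme so that partial matches survive edits; it is not PortAuthority's actual fingerprint engine, and all names here are hypothetical:

```python
import hashlib

def fingerprint(text, shingle_len=8):
    """Hash overlapping word shingles so partial matches survive edits."""
    words = text.lower().split()
    shingles = [" ".join(words[i:i + shingle_len])
                for i in range(max(1, len(words) - shingle_len + 1))]
    return {hashlib.sha256(s.encode()).hexdigest() for s in shingles}

def match_score(message_fp, known_fp):
    """Fraction of a registered document's shingles found in the message."""
    if not known_fp:
        return 0.0
    return len(message_fp & known_fp) / len(known_fp)

# An outbound message quoting a registered confidential document still matches,
# even though extra words were added around the quoted text.
known = fingerprint("quarterly revenue forecast for the emea region is confidential")
outbound = fingerprint("fyi the quarterly revenue forecast for the emea "
                       "region is confidential do not share")
print(match_score(outbound, known))  # → 1.0
```

Because the comparison works on shingle hashes rather than whole-message hashes, the same lookup detects both full and partial reuse of a registered document.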
Here’s a great example of why locking down the infrastructure is not a great idea. When you first put in a data loss prevention solution, you find interesting things like this. This is a real-life incident that triggered one of the roughly 800 ready-made policies built into our data loss prevention module. What we see here is a file of passwords for a good many systems, which was zipped and encrypted by a user who then sent the zipped file to Yahoo Mail. Now that incident, quite frankly, looks quite malicious at first blush: somebody is sending the passwords to your systems to a Yahoo Mail account and obscuring it by zipping the file, so maybe they don’t want anybody to see what they are doing. The reason we like this example is that it illustrates a few concepts. The most important one is: do you know who caused this problem? Not, as you might think, the person who actually sent the email. This was inadvertently caused by the IT organization and the policies it created. The company in question had a policy that you couldn’t have distribution lists in the email system with external people on them, since that might allow data to leak. It also had a policy to rotate passwords every 30 days, which is a great way to encourage sticky notes and password leakage, but that was the policy. However, this person had to get the passwords to all the [CLICK] business partners who needed them to access the back-end systems so they could conduct business. They couldn’t use their own email system because the IT policy forbade external email addresses, so, proving the point that business will find a way, the enterprising employee used Yahoo Mail, created a distribution list to circumvent the restriction and sent the passwords to all his business partners.
They had been doing this for a couple of years before we put our [CLICK] system in and found it going on. So it’s very illustrative: it shows how IT security policies that say "lock things down" can create opportunities for people to work around those restrictions to get their jobs done, and in doing so create some pretty significant risks for their organizations. It also shows that once people transact in this open manner, the IT department could lock down webmail and the employee would find another way, perhaps via Facebook or LinkedIn connections, and use that as a distribution mechanism. So the moral is that we really need to get a hold of the CONTENT that is transiting our networks.
DLP methodologies and available solutions implement some or all features of this process. The process is normally discussed in the context of network scanning for confidential data, but as highlighted previously, user mobility and privileged access to confidential data, combined with the need for timely and accurate scanning, make a strong case for executing the discovery process with a local software agent where possible:
Identify: A sound DLP discovery project requires prior knowledge of the data important to your organization, whether it is source code, formulas, CAD drawings or customer data. This type of data is usually created and stored in known locations. Other types of data, like healthcare or credit card data, may be stored within these known locations or in bits and pieces, in the form of files or emails, throughout the enterprise. Regardless, if any of these data types are left without proper controls, a breach could be devastating.
Fingerprint: Take a snapshot of the business-confidential data. At minimum this is a hash of the bits, but more sophisticated technology is needed for accurate detection of files.
Discover: Run a network scan and, if needed, use endpoint software agents to run local scans for all confidential data. Network-based scanning offers the widest coverage and the best overall visibility into multiple data stores, finding confidential data in often unexpected places. Agent-based scanning is scalable, since individual discovery jobs on endpoints run independently of each other and report results back to a centralized server once completed.
Report: Business policies for data protection, as well as industry regulations, mandate a current inventory and knowledge of where sensitive data resides, which requires compliance reporting. Even without specific regulations or policies, risk management for business data demands it.
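The Fingerprint and Discover steps above can be sketched, for the simplest exact-match case, as a local agent scan. This assumes plain SHA-256 file hashes (as noted above, more sophisticated matching is needed for accurate detection in practice), and all names are hypothetical:

```python
import hashlib
import os

def file_hash(path):
    """Exact-match fingerprint: SHA-256 of the file's bytes."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def discover(root, known_hashes):
    """Walk a directory tree and report files matching registered fingerprints."""
    findings = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if file_hash(path) in known_hashes:
                    findings.append(path)
            except OSError:
                pass  # unreadable file; a real agent would log and continue
    return findings
```

Each endpoint can run such a scan independently and report its findings list back to a central server, which is what makes the agent-based approach scale.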
Trending across the top violators and most frequent violations is important to help prioritize remediation tasks.
Remediate: Ultimately, it is the responsibility of the data owner to set the policies and controls for how data is created, stored, used and secured. In many cases, a compliance report concludes the automated discovery process and requires a handoff to the data owners, who then use their own tools or techniques to further secure the data. This can be complicated across numerous instances of confidential data storage.
Cover the need, benefits, features/functions and differentiation of our discovery product. We built a methodology for discovery: data identification, data scanning, remediation planning and enforcement. Call out the steps along the way, like fingerprinting via ODBC, automated scanning, distributed deployment and endpoints for parallel scanning, the need for recurring scans, and flexibility of enforcement with tombstoning and ransom notes.
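Tombstoning, mentioned above as an enforcement option, generally means moving a discovered confidential file to quarantine and leaving a placeholder note in its place that tells users where the file went. A sketch of the general technique only, not the product's implementation; the paths and message text are hypothetical:

```python
import os
import shutil

# Hypothetical placeholder text left where the confidential file used to be.
TOMBSTONE_TEXT = (
    "This file contained confidential data and was moved to a secure "
    "quarantine by the data security team. Contact your security "
    "administrator to request access."
)

def tombstone(path, quarantine_dir):
    """Move a confidential file to quarantine, leaving a tombstone note behind."""
    os.makedirs(quarantine_dir, exist_ok=True)
    quarantined = os.path.join(quarantine_dir, os.path.basename(path))
    shutil.move(path, quarantined)
    # Recreate the original path as a note, so users who look for the
    # file find instructions instead of a silent disappearance.
    with open(path, "w") as note:
        note.write(TOMBSTONE_TEXT)
    return quarantined
```

Compared with silently deleting the file, the tombstone preserves the data (in quarantine) and tells the user what happened and whom to contact, which keeps remediation a conversation with the data owner rather than a mystery.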