Symantec Corporation v. The Trustees of Columbia University in the City of New York, No. 12154405 (P.T.A.B. May 31, 2016)

Trials@uspto.gov                                                Paper 57
571-272-7822                                       Entered: May 31, 2016

UNITED STATES PATENT AND TRADEMARK OFFICE
_______________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
_______________

SYMANTEC CORPORATION,
Petitioner,

v.

THE TRUSTEES OF COLUMBIA UNIVERSITY IN THE CITY OF NEW YORK,
Patent Owner.
_______________

Case IPR2015-00374
Patent 7,913,306 B2
_______________

Before HOWARD B. BLANKENSHIP, BRYAN F. MOORE, and ROBERT J. WEINSCHENK, Administrative Patent Judges.

WEINSCHENK, Administrative Patent Judge.

FINAL WRITTEN DECISION
35 U.S.C. § 318(a) and 37 C.F.R. § 42.73

I. INTRODUCTION

Symantec Corporation (“Petitioner”) filed a Petition (Paper 2, “Pet.”) requesting an inter partes review of claims 1–4 and 7–11 of U.S. Patent No. 7,913,306 B2 (Ex. 1001, “the ’306 patent”). The Trustees of Columbia University in the City of New York (“Patent Owner”) filed a Preliminary Response (Paper 10, “Prelim. Resp.”) to the Petition. On June 3, 2015, we instituted an inter partes review of claims 1–4, 7, and 8 (“the challenged claims”) of the ’306 patent on the following grounds:

Claim(s)      | Statutory Basis          | Applied Reference
1–4, 7, and 8 | 35 U.S.C. §§ 102(a), (b) | Rebecca G. Bace, Technology Series: Intrusion Detection (Linda R. Engelman et al. eds., 2000) (Ex. 1007, “Bace”)
1–4 and 7     | 35 U.S.C. § 102(b)       | Dorothy E. Denning, An Intrusion-Detection Model, IEEE Transactions on Software Engineering, Feb. 1987 (Ex. 1021, “Denning”)

Paper 13 (“Dec. on Inst.”), 16. After institution, Patent Owner filed a Response (Paper 23, “PO Resp.”) to the Petition, and Petitioner filed a Reply (Paper 37, “Pet. Reply”) to the Response. An oral hearing was held on February 24, 2016, and a transcript of the hearing is included in the record. Paper 56 (“Tr.”).
We issue this Final Written Decision pursuant to 35 U.S.C. § 318(a) and 37 C.F.R. § 42.73. For the reasons set forth below, Petitioner has shown by a preponderance of the evidence that claims 1–4 of the ’306 patent are unpatentable, but Petitioner has not shown by a preponderance of the evidence that claims 7 and 8 of the ’306 patent are unpatentable.

A. Related Proceedings

The parties indicate that the ’306 patent is at issue in the following district court case: Trustees of Columbia University in New York v. Symantec Corp., No. 3:13-cv-808 (E.D. Va.). Pet. 1; Paper 6, 2. Patent Owner identifies the following petitions for inter partes review as being related to this case (Paper 6, 2):

Case No.      | Involved U.S. Patent No.
IPR2015-00372 | U.S. Patent No. 7,448,084
IPR2015-00378 | U.S. Patent No. 7,448,084

B. The ’306 Patent

The ’306 patent relates to detecting intrusions in a computer system. Ex. 1001, Abstract. The ’306 patent explains that one type of intrusion detection system is a signature-based system. Id. at col. 1, ll. 64–66, col. 2, ll. 26–30. A signature-based system detects intrusions by comparing activity in a computer system to a database of signatures corresponding to known malicious programs. Id. This type of intrusion detection also is known as misuse detection. PO Resp. 18. According to the ’306 patent, the drawback of a signature-based or misuse detection system is that it requires prior knowledge of a malicious program, and, thus, rarely is effective at detecting new attacks. Ex. 1001, col. 1, l. 66–col. 2, l. 21, col. 2, ll. 31–33. The ’306 patent explains that a second type of intrusion detection system is an anomaly detection system. Id. at col. 2, ll. 33–35. An anomaly detection system detects intrusions by generating a model of normal behavior and then determining if the activity in a computer system deviates from that model of normal behavior. Id. at col. 2, ll. 35–38.
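The anomaly detection approach described above, building a model of normal behavior from attack-free records and then flagging activity that deviates from it, can be sketched in a few lines of Python. The record format, feature choice (process name paired with registry key), and threshold below are illustrative assumptions for this sketch, not details taken from the ’306 patent or the record.

```python
from collections import Counter

def train_model(normal_records):
    """Build a probabilistic model of normal usage: the relative
    frequency of each feature observed in attack-free records."""
    counts = Counter(normal_records)
    total = sum(counts.values())
    return {feature: n / total for feature, n in counts.items()}

def is_anomaly(model, record, threshold=0.05):
    """Flag a record as anomalous if its estimated likelihood under
    the model of normal usage falls below the threshold; a record
    never seen during training has likelihood zero."""
    return model.get(record, 0.0) < threshold

# Illustrative training records: (process name, registry key accessed).
normal = [("explorer.exe", r"HKLM\Software\Run")] * 9 + \
         [("winword.exe", r"HKCU\Software\Word")]
model = train_model(normal)

assert not is_anomaly(model, ("explorer.exe", r"HKLM\Software\Run"))
assert is_anomaly(model, ("malware.exe", r"HKLM\Software\Run"))
```

A real detector would, of course, use richer features and a smoothed probability estimate; the point here is only the two-phase structure: train on normal records, then score new accesses against the model.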
Thus, unlike a signature-based or misuse detection system, an anomaly detection system may detect new attacks. Id. at col. 2, ll. 38–40. According to the ’306 patent, the drawback of prior anomaly detection systems is that those systems had high computational overhead and high false positive rates. Id. at col. 2, l. 66–col. 3, l. 8.

The ’306 patent describes an anomaly detection system that monitors accesses to a computer’s file system, such as the computer’s registry. Id. at col. 4, ll. 56–65. The ’306 patent explains that the advantage of monitoring the registry is “that registry activity is regular by nature, that the registry can be monitored with low computational overhead, and that almost all system activities query the registry.” Id. at col. 5, ll. 7–10. The anomaly detection system described in the ’306 patent gathers data, including features from records of normal processes that access the registry, and then generates a probabilistic model of normal computer system usage based on those features. Id. at col. 3, ll. 22–28. To detect a malicious program, the system determines whether a subsequent access to the registry deviates from the model of normal computer system usage. Id. at col. 6, ll. 24–28.

C. Illustrative Claim

Claim 1 is independent and is reproduced below.

1.
A method for detecting intrusions in the operation of a computer system comprising:
(a) gathering features from records of normal processes that access the file system of the computer;
(b) generating a probabilistic model of normal computer system usage based on occurrences of the features and determining the likelihood of observing an event that was not observed during the gathering of features from the records of normal processes; and
(c) analyzing features from a record of a process that accesses the file system to detect deviations from normal computer system usage to determine whether the access to the file system is an anomaly.

Ex. 1001, col. 22, ll. 40–52.

II. ANALYSIS

A. Level of Ordinary Skill in the Art

Petitioner’s declarant, Dr. Michael T. Goodrich, testifies that a person of ordinary skill in the art at the time of the ’306 patent would have had “a Master’s degree in computer science or a related field with two to three years of experience in the field of software security systems.” Ex. 1003 ¶ 20. Patent Owner’s declarant, Dr. George Cybenko, testifies that a person of ordinary skill in the art at the time of the ’306 patent would have had “an undergraduate degree in computer science or mathematics, and one or two years of experience in the field of computer security.” Ex. 2019 ¶ 24. Thus, Petitioner’s declarant, Dr. Goodrich, defines the level of ordinary skill in the art slightly higher in both education and experience. Patent Owner points out this difference, but does not argue that it materially impacts either party’s analysis. PO Resp. 8. In fact, Petitioner’s declarant, Dr. Goodrich, testifies that his analysis would not change under the slightly lower level of ordinary skill in the art set forth by Patent Owner. Ex. 1029 ¶¶ 7–8. In other words, the minor distinctions between the parties’ proposed levels of ordinary skill in the art are not material.
As a result, we determine that both parties define the level of ordinary skill in the art appropriately in this case.

Patent Owner argues that, even if Petitioner correctly defines the level of ordinary skill in the art, Petitioner improperly conducts its anticipation analysis from the perspective of an expert in the art, not a person of ordinary skill in the art. PO Resp. 6–8. Patent Owner argues that Bace discloses that performing certain intrusion detection tasks “may be intuitively obvious only to the most experienced security administrators.” Id. at 6–7 (citing Ex. 1007, 77) (emphasis omitted). The portion of Bace identified by Patent Owner relates specifically to performing the “balancing act” of monitoring enough features to detect significant security problems and not monitoring so many features that it would cause performance loss. Ex. 1007, 77. Patent Owner argues that Denning similarly discloses that monitoring too many features “could severely degrade performance on the target system or overload the intrusion-detection system.” PO Resp. 7 (citing Ex. 1021, 3). Thus, according to Patent Owner, Bace and Denning only enable “the most experienced members of the industry,” not a person of ordinary skill in the art, to make the method recited in the challenged claims. PO Resp. 8.

“[A] prior art reference need not enable its full disclosure; it only needs to enable the portions of its disclosure alleged to anticipate the claimed invention.” In re Antor Media Corp., 689 F.3d 1282, 1290 (Fed. Cir. 2012). Here, Patent Owner does not show that Petitioner relies on the particular portions of Bace and Denning that Patent Owner alleges are not enabled. PO Resp. 6–8. In addition, the challenged claims do not recite a method for maintaining a certain level of computer performance or a certain level of detection accuracy. See, e.g., Ex. 1001, col. 22, ll. 40–52.
As such, even if Patent Owner is correct that the portions of Bace and Denning that discuss maintaining a specific balance of performance and accuracy are not enabled, Petitioner need not rely on those portions of Bace and Denning to show that the challenged claims are anticipated. Therefore, we are not persuaded that Petitioner improperly conducts its anticipation analysis from the perspective of an expert in the art.

B. Anticipation of Claims 1–4, 7, and 8 by Bace

Petitioner argues that claims 1–4, 7, and 8 are anticipated by Bace. Pet. 20–33. We have considered the parties’ arguments and supporting evidence, and we determine that Petitioner has shown by a preponderance of the evidence that claim 1 is anticipated by Bace. We determine that Petitioner has not shown by a preponderance of the evidence that claims 2–4, 7, and 8 are anticipated by Bace.

1. Overview of Bace

Bace describes techniques for detecting intrusions in a computer system. Ex. 1007, 48, 59–60.1 Similar to the ’306 patent, Bace explains that anomaly detection involves collecting reference data from a computer system, using that reference data to create a model of normal computer system usage, and then analyzing future activity to determine if it deviates from the model of normal computer system usage. Id. at 106, 108–110, 121. Bace identifies several data sources that are commonly used for anomaly detection. Id. at 66. Specifically, for computer systems that use a Windows operating system, Bace teaches that Windows event logs, such as the system log, application log, and security log, can be used as data sources. Id. at 74–75. According to Bace, “the security log events are the primary focus of intrusion detection systems.” Id. at 75. Bace also identifies several models that are commonly used for anomaly detection. Id. at 121. Bace points out that Dorothy Denning’s model is the “seminal” anomaly detection model. Ex. 1007, 108, 121.
1 In this Decision, we cite to the exhibit page numbers used by the parties.

2. Claim 1

a. Limitations of Claim 1

Claim 1 recites “gathering features from records of normal processes that access the file system of the computer.” Ex. 1001, col. 22, ll. 42–43. Bace discloses collecting reference data from several Windows event logs, including the system log, application log, and security log. Ex. 1007, 74–75. Bace specifies, though, that “security log events are the primary focus of intrusion detection systems.” Id. at 75 (emphasis added). Bace also discloses that the security log includes events relating to “the creation, deletion, and alteration of system files and other objects.” Id. at 74–75 (emphasis added); Ex. 1003 ¶¶ 105–107, 129 (pp. 57–61).

Claim 1 recites “generating a probabilistic model of normal computer system usage based on occurrences of the features and determining the likelihood of observing an event that was not observed during the gathering of features from the records of normal processes.” Ex. 1001, col. 22, ll. 44–48. Bace discloses generating a model of normal computer system usage. Ex. 1007, 106, 108–110, 121; Ex. 1003 ¶¶ 110–112. Bace describes Denning’s model as the “seminal” anomaly detection model. Ex. 1007, 108, 121; Ex. 1003 ¶ 111. Denning’s anomaly detection model includes a mean and standard deviation model and a Markov process model, which are probabilistic models that determine the likelihood of observing an event that was not observed during the gathering of features from the records of normal processes. Ex. 1007, 121–122; Ex. 1003 ¶¶ 48–50, 114–115, 129 (pp. 61–64). Bace discloses that those models are generated using the reference data discussed above. Ex. 1007, 74–75, 109, 121–122; Ex. 1003 ¶ 112.
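The Markov process model referred to above can be read as a table of transition probabilities over successive events: the likelihood of seeing one event given the event that preceded it, with a transition never observed in the reference data receiving probability zero. The following Python sketch illustrates that reading; the event names are invented for illustration and do not come from Bace, Denning, or the record, and a practical implementation would smooth the zero probabilities.

```python
from collections import Counter, defaultdict

def train_markov(event_sequence):
    """Estimate P(next event | current event) from a sequence of
    events recorded during normal (attack-free) operation."""
    pair_counts = defaultdict(Counter)
    for cur, nxt in zip(event_sequence, event_sequence[1:]):
        pair_counts[cur][nxt] += 1
    return {cur: {nxt: n / sum(counts.values())
                  for nxt, n in counts.items()}
            for cur, counts in pair_counts.items()}

def transition_prob(model, cur, nxt):
    """Likelihood of observing `nxt` immediately after `cur`; a
    transition never seen during training gets probability 0."""
    return model.get(cur, {}).get(nxt, 0.0)

# Invented reference sequence of audit events.
normal_events = ["login", "open_file", "write_file", "close_file",
                 "open_file", "write_file", "close_file", "logout"]
markov = train_markov(normal_events)

assert transition_prob(markov, "open_file", "write_file") == 1.0
assert transition_prob(markov, "login", "delete_file") == 0.0  # unseen
```

A low (or zero) transition probability is exactly the signal such a model uses to score an event as unlikely under normal usage.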
Claim 1 recites “analyzing features from a record of a process that accesses the file system to detect deviations from normal computer system usage to determine whether the access to the file system is an anomaly.” Ex. 1001, col. 22, ll. 49–52. Bace discloses obtaining a record generated by an information source, comparing the record to the contents of the models discussed above, and determining whether the event is anomalous. Ex. 1007, 109–110; Ex. 1003 ¶¶ 113, 129 (pp. 64–66). Bace discloses that the records being analyzed for anomalies also are obtained from the Windows event logs, such as the security log, discussed above. Ex. 1007, 74–75, 109; Ex. 1003 ¶ 113.

b. Patent Owner’s Arguments

Patent Owner presents the following arguments regarding claim 1 in its Response: 1) Bace does not disclose generating a model of normal computer system usage (PO Resp. 15–16); and 2) Bace does not disclose all the limitations of claim 1, arranged as in the claim (id. at 16–23). We address each of Patent Owner’s arguments below.

First, Patent Owner argues that Bace does not disclose generating a model of normal computer system usage. PO Resp. 15–16. Petitioner proposes construing the phrase “probabilistic model of normal computer system usage” to mean a “model of typical attack-free computer system usage that employs probability.” Pet. 12–13. Patent Owner acknowledges that we should apply Petitioner’s proposed construction in this case.2 Tr. 74:12–18.

2 Petitioner’s proposed construction was adopted by the United States District Court for the Eastern District of Virginia and was affirmed by the United States Court of Appeals for the Federal Circuit in Trustees of Columbia University in New York v. Symantec Corp., 811 F.3d 1359, 1367–69 (Fed. Cir. 2016).
Patent Owner argues that Bace does not disclose a model of attack-free computer system usage because Bace states that “[f]inding ‘clean’ training sets for anomaly detectors remains a major issue.” PO Resp. 16 (citing Ex. 1007, 109). Patent Owner’s argument is not persuasive. Bace discloses generating a model of “normal” activity for anomaly detection. Ex. 1007, 106, 110, 121; Ex. 1003 ¶ 110. Bace explains that the term “normal” in the context of anomaly detection refers to activity that is not an attack. Ex. 1007, 105 (“Denning’s initial assertion was that the region of ‘misuse’ activity falls far enough outside the region of ‘normal’ activity to use normality measures as the basis for finding misuse.”); Ex. 1003 ¶ 129 (p. 62). Thus, the model of normal activity disclosed in Bace is a model of attack-free computer system usage.

Patent Owner points out that, according to Bace, “[t]he fact that the historical data used to populate the anomaly detector is devoid of intrusions is often assumed without any corroborating evidence,” and “[f]inding ‘clean’ training sets for anomaly detectors remains a major issue.” Ex. 1007, 109; PO Resp. 16. This portion of Bace, however, does not indicate that attack data is used (or should be used) to generate the anomaly detection model. To the contrary, this portion of Bace simply cautions that it cannot be assumed that data is attack-free. As such, this portion of Bace reinforces the disclosure in Bace discussed above that an anomaly detection model should be generated using normal (i.e., attack-free) data.

Second, Patent Owner argues that Bace does not disclose all the limitations of claim 1, arranged as in the claim. PO Resp. 16–23. Specifically, Patent Owner argues that Bace only discloses using Windows event logs as an information source for misuse detection, not anomaly detection. PO Resp. 18–19 (citing Ex. 1007, 70, 112). Patent Owner’s argument is not persuasive.
As discussed above, anomaly detection involves identifying abnormal computer system usage. Ex. 1001, col. 2, ll. 35–38; Ex. 1007, 106, 110, 121. Bace discloses that the “finer-grained level of detail” provided by operating system audit trails, such as Windows event logs, can be used to identify “abnormal patterns of process execution,” such as “the execution of trojan horses and other malicious code.”3 Ex. 1007, 70. Thus, contrary to Patent Owner’s argument, Bace expressly discloses using Windows event logs for anomaly detection, not just misuse detection. Ex. 1029 ¶ 17.

3 Patent Owner suggests that this portion of Bace only relates to the system call records gathered by UNIX system audit trails. Paper 48 ¶¶ 3–5. Section 3.2.4 of Bace, however, describes the advantages of operating system audit trails generally, not just UNIX system audit trails. Ex. 1007, 70.

Patent Owner also argues that Bace describes various possible information sources and various possible intrusion detection models, but does not disclose using any particular information source with any particular model. PO Resp. 19–23. Thus, according to Patent Owner, Bace does not disclose using records of file system accesses as an information source for Denning’s mean and standard deviation model or Markov process model. Id. Patent Owner’s argument is not persuasive.

Bace specifies that certain information sources are used for a computer system with a Windows operating system. Namely, Bace discloses that, for a Windows computer system, “security log events are the primary focus of intrusion detection systems,” and “especially those having to do with the creation, deletion, and alteration of system files and other objects.” Id. at 74–75 (emphases added). Bace then explains that the reference data obtained from the Windows computer system is used to generate the anomaly detection models discussed later in Bace. Ex.
1007, 109 (“Anomaly detectors are populated by running them against the collected reference event data.”); id. at 74–75, 121–122; Ex. 1003 ¶ 112. Thus, for a Windows computer system, Bace expressly links the reference data obtained by monitoring the creation, deletion, and alteration of system files with the disclosed anomaly detection models. Ex. 1029 ¶¶ 15–19; cf. Net MoneyIN, Inc. v. VeriSign, Inc., 545 F.3d 1359, 1370–71 (Fed. Cir. 2008) (explaining that the prior art did not link two distinct teachings). Bace further discloses that Denning’s anomaly detection model, which includes the mean and standard deviation model, is the “seminal” anomaly detection model. Ex. 1007, 108, 121–122. As a result, Bace provides a specific reason for selecting Denning’s anomaly detection model from among the other models disclosed in Bace. Ex. 1003 ¶ 111; Ex. 1029 ¶ 19.

c. Summary

For the reasons discussed above, we determine that Petitioner has shown by a preponderance of the evidence that claim 1 is anticipated by Bace.

3. Claim 2

Claim 2 depends from claim 1, and recites “wherein the step of gathering features from records of normal processes that access the file system of the computer comprises gathering a feature corresponding to a name of a process accessing the file system of the computer.” Ex. 1001, col. 22, ll. 53–57. Petitioner argues that Bace discloses a Windows event record that includes a “Source” field. Pet. 28 (citing Ex. 1007, 75–76). According to Petitioner, the “Source” field includes the name of a process accessing the file system. Pet. 28. Bace discloses that the “Source” field in a Windows event record identifies “[t]he software responsible for generating the event record.” Ex. 1007, 76. In other words, the “Source” field identifies the software that creates the Windows event record, not the software that performs the event (e.g., accesses the file system). Id.; Ex. 2019 ¶¶ 52–53, 136–139. Patent Owner’s declarant, Dr.
Cybenko, provides an example in which a Notepad application accesses the file system. Ex. 2019 ¶¶ 52, 139; Ex. 2023, 2–3. The “Source” field in the Windows event record only identifies the Security server that generated the Windows event record, not the Notepad application that accessed the file system. Ex. 2019 ¶¶ 52, 139; Ex. 2023, 2–3.

Petitioner also argues that a reference entitled “Windows NT Event Logging,” by James D. Murray (“Murray”) discloses that “[l]arge applications usually consist of multiple processes, and it might be more informative for each to process report events using a different event source name.” Pet. Reply 11 (citing Ex. 1014, 58) (emphasis omitted). Petitioner does not explain whether it relies on Murray to show an inherent disclosure of Bace or to supplement the disclosure of Bace. See Pet. Reply 11. In either case, Petitioner’s argument is not persuasive. Like Bace, the cited disclosure in Murray refers to the “event source” as the software that “process[es] report[s],” not the software that performs the event (e.g., accesses the file system). Ex. 1014, 58. Therefore, we determine that Petitioner has not shown by a preponderance of the evidence that claim 2 is anticipated by Bace.

4. Claim 3

Claim 3 depends from claim 1, and recites “wherein gathering features from records of normal processes that access the file system of the computer comprises gathering a feature corresponding to a type of query being sent to the file system of the computer.” Ex. 1001, col. 22, ll. 58–62. Petitioner argues that Bace discloses a Windows event record that includes a “Description” field and a “Category” field. Pet. 29 (citing Ex. 1007, 75–76). According to Petitioner, the “Description” field and the “Category” field include the type of query sent to the file system. Pet. 29. Bace discloses that the “Description” field in a Windows event record includes: “Variable content, depending on event.
Can be text explanation of problem and recommendation of corrective measures.” Ex. 1007, 75. Bace discloses that the “Category” field includes “[t]he triggering event type, used primarily in the security log to indicate the event type for which success or failure auditing has been enabled.” Id. at 76. Thus, Bace indicates that the “Description” field and “Category” field may include the type of event. Id. at 75–76. Petitioner, however, does not provide an explanation or identify evidence indicating that the type of event necessarily is the type of query sent to the file system. See Pet. 29.

Petitioner argues that the security log in Bace monitors events relating to the “creation, deletion, and alteration of system files,” which are different types of queries, and, thus, the security log gathers features corresponding to the type of query sent to the file system. Pet. Reply 12 (citing Ex. 1007, 74–75; Ex. 1029 ¶ 25). Petitioner’s argument is not persuasive. Although the security log monitors events that may involve different types of queries being sent to the file system, that alone does not show that the Windows event record necessarily reports the type of query sent to the file system. As discussed above, Bace discloses that the Windows event record includes the type of event, but not necessarily the type of query sent to the file system. Ex. 1007, 75–76.

Petitioner also argues that “Murray provides additional evidence that the security log events described in Bace gather features corresponding to a type of query.” Pet. Reply 12. Petitioner does not explain whether it relies on Murray to show an inherent disclosure of Bace or to supplement the disclosure of Bace. Id. In either case, Petitioner’s argument is not persuasive. Like Bace, the cited disclosure in Murray indicates that the security log monitors events that may involve different types of queries being sent to the file system (Ex.
1004, 90–91), but the Windows event record only reports the type of event, not necessarily the type of query sent to the file system (id. at 107). Further, Murray discloses that there are only seven types of events reported in the “Category” field of a Windows event record, none of which specify the type of query sent to the file system. Ex. 1014, 60; Ex. 2019 ¶¶ 145–147. Therefore, we determine that Petitioner has not shown by a preponderance of the evidence that claim 3 is anticipated by Bace.

5. Claim 4

Claim 4 depends from claim 3, and recites “wherein gathering features from records of normal processes that access the file system of the computer comprises gathering a feature corresponding to an outcome of a query being sent to the file system of the computer.” Ex. 1001, col. 22, ll. 63–67. Because claim 4 depends from claim 3, and Petitioner has not shown that claim 3 is anticipated by Bace, we determine that Petitioner also has not shown by a preponderance of the evidence that claim 4 is anticipated by Bace. See supra Section II.B.4.

6. Claim 7

Claim 7 depends from claim 1, and recites “wherein generating a probabilistic model of normal computer system usage comprises determining a likelihood of observing a feature in the records of processes that access the file system of the computer.” Ex. 1001, col. 23, ll. 10–14. Petitioner argues in the Petition that the mean and standard deviation model and the Markov process model in Bace determine the likelihood of observing an event. Pet. 30–32 (citing Ex. 1007, 121–122). Petitioner argues in the Reply that the TIM system in Bace determines the likelihood of observing one event given the occurrence of another event. Pet. Reply 14–15 (citing Ex. 1007, 128–129).
Thus, Petitioner identifies evidence indicating that Bace discloses determining the likelihood of observing an event, but does not identify evidence indicating that Bace discloses determining the likelihood of observing a feature, as recited in claim 7. Pet. 30–32; Pet. Reply 14–15; Tr. 30:2–15, 31:6–12. Petitioner instead argues that an event is the “same thing” as a feature. Tr. 30:20–24, 32:9–13. Petitioner, however, improperly presented this argument for the first time at the oral hearing. Id. at 30:13–15; Pet. 30–32; Pet. Reply 14–15; see Office Patent Trial Practice Guide, 77 Fed. Reg. 48,756, 48,768 (Aug. 14, 2012) (“No new evidence or arguments may be presented at the oral argument.”). Moreover, Petitioner does not identify any evidence from the ’306 patent (or otherwise) to support its argument that an event is the same thing as a feature. Tr. 31:1–5; see Helmsderfer v. Bobrick Washroom Equip., Inc., 527 F.3d 1379, 1381–82 (Fed. Cir. 2008) (“Our precedent instructs that different claim terms are presumed to have different meanings.”). Therefore, we determine that Petitioner has not shown by a preponderance of the evidence that claim 7 is anticipated by Bace.

7. Claim 8

Claim 8 depends from claim 7, and recites “wherein determining a likelihood of observing a feature comprises determining a conditional probability of observing a first feature in the records of processes that access the file system of the computer given an occurrence of a second feature in the records.” Ex. 1001, col. 23, l. 15–col. 24, l. 2. Because claim 8 depends from claim 7, and Petitioner has not shown that claim 7 is anticipated by Bace, we determine that Petitioner also has not shown by a preponderance of the evidence that claim 8 is anticipated by Bace. See supra Section II.B.6.

C. Anticipation of Claims 1–4 and 7 by Denning

Petitioner argues that claims 1–4 and 7 are anticipated by Denning. Pet. 47–54.
We have considered the parties’ arguments and supporting evidence, and we determine that Petitioner has shown by a preponderance of the evidence that claims 1–4 are anticipated by Denning. We determine that Petitioner has not shown by a preponderance of the evidence that claim 7 is anticipated by Denning.

1. Overview of Denning

Denning describes a technique for detecting intrusions in a computer system. Ex. 1021, Abstract. Denning explains that “[t]he basic idea is to monitor the standard operations on a target system: logins, command and program execution, file and device accesses, etc., looking only for deviations in usage.” Id. at 2. More specifically, Denning discloses gathering audit records of “actions performed by subjects on objects,” where the subject can be a process and the object can be a system file. Id. According to Denning, the audit records are used to generate an activity profile that represents “normal activity.” Id. at 3–4. Denning discloses that a new observation is then compared to the model of normal activity to determine whether the new observation is abnormal. Id. at 4.

2. Enablement

Patent Owner argues that Denning does not enable the method recited in the challenged claims, and, thus, does not anticipate any of the challenged claims. PO Resp. 48–52. Specifically, Patent Owner argues that Denning is based on a “hypothesis” and poses several “open questions” about the utility of the disclosed method. Id. at 48–49 (citing Ex. 1021, 1, 10). Patent Owner also points out that Denning does not present any performance data from a working example. PO Resp. 49–50. As a result, according to Patent Owner, it would require undue experimentation for a person of ordinary skill in the art to perform the claimed method based on the disclosure in Denning. Id. at 50–51. Patent Owner’s argument is not persuasive.
As discussed below, Denning discloses specific details about how to perform the disclosed intrusion detection method, including by identifying the specific information sources and models used in the method. See infra Section II.C.3. Denning does not lack enablement just because it does not present a working example. See Antor Media, 689 F.3d at 1290 (“[T]he invention in a prior art publication need not have actually been made or performed to satisfy enablement.”). Denning also does not lack enablement just because it does not show that the disclosed method would be useful. See In re Gleave, 560 F.3d 1331, 1335 (Fed. Cir. 2009) (“[A] reference need disclose no independent use or utility to anticipate a claim under § 102.”). Moreover, a person of ordinary skill in the art at the time of the ’306 patent would have known that the disclosed methods in Denning actually work to detect intrusions. See, e.g., Ex. 1007, 108 (“An example of an anomaly detection-based classification model is the Intrusion Detection Expert System (IDES), an instantiation of Dorothy Denning’s seminal intrusion detection system model.”); Ex. 1029 ¶ 42.

Patent Owner also argues that “a method that, for example, returns a high positive rate and is unable to differentiate between intrusions and non-intrusions, would not be regarded by a person of ordinary skill in the art as ‘detecting intrusions.’” PO Resp. 51. Patent Owner’s argument is not persuasive. Denning discloses a method that differentiates between intrusions and non-intrusions by obtaining a new observation, comparing it to a model of normal activity, and determining whether the new observation is anomalous. Ex. 1021, 4 (“A new observation . . . is defined to be abnormal if it falls outside a confidence interval.”); Ex. 1003 ¶ 137 (pp. 91–92). The challenged claims do not recite maintaining a specific level of accuracy. See, e.g., Ex. 1001, col. 22, ll. 40–52.
Thus, whether Denning discloses a method that achieves a specific level of accuracy has no bearing on whether Denning is enabled for purposes of our anticipation analysis of the challenged claims.

3. Claim 1

a. Limitations of Claim 1

Claim 1 recites “gathering features from records of normal processes that access the file system of the computer.” Ex. 1001, col. 22, ll. 42–43. Denning discloses gathering audit records that represent “actions performed by subjects on objects.” Ex. 1021, 2. Denning discloses that the “subject is typically a terminal user, but might also be a process acting on behalf of users or groups of users, or might be the system itself.” Id. (emphasis added); Ex. 1003 ¶ 127. Denning also specifies that the object includes “[r]esources managed by the system—files, commands, devices, etc.” Ex. 1021, 2 (emphasis added); Ex. 1003 ¶ 127. Thus, Denning discloses gathering features from records of processes that access the file system of the computer. Ex. 1003 ¶ 137 (pp. 89–90).

Claim 1 recites “generating a probabilistic model of normal computer system usage based on occurrences of the features and determining the likelihood of observing an event that was not observed during the gathering of features from the records of normal processes.” Ex. 1001, col. 22, ll. 44–48. Denning discloses using features from the audit records to generate an activity profile that represents “normal activity.” Ex. 1021, 3–4. In particular, Denning discloses a mean and standard deviation model, which is a probabilistic model that determines the likelihood of observing an event that was not observed during the gathering of features from the records of normal processes. Id. at 4 (“[T]he probability of a value falling outside this interval is at most 1/d².”); Ex. 1003 ¶ 137 (pp. 90–91).
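The mean and standard deviation model described above lends itself to a brief illustrative sketch. The code below is ours, not Denning’s; the function name, the history data, and the default threshold d = 2 are assumptions for illustration. A new observation is flagged as abnormal when it falls outside the interval mean ± d·stdev, and by Chebyshev’s inequality a normal value falls outside that interval with probability at most 1/d².

```python
import statistics

def is_abnormal(history, observation, d=2.0):
    """Flag an observation as abnormal if it falls outside the confidence
    interval mean +/- d * stdev computed from past normal observations.
    By Chebyshev's inequality, a value drawn from the normal distribution
    of activity is flagged with probability at most 1/d**2."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    return abs(observation - mean) > d * stdev

# Hypothetical event counts gathered during normal activity.
normal_counts = [3, 4, 5, 4, 3, 5, 4, 4]
print(is_abnormal(normal_counts, 4))   # a typical value is not flagged
print(is_abnormal(normal_counts, 40))  # a far-out value is flagged
```

In this sketch the interval widens or narrows with the variability of the gathered features, which is what makes the model probabilistic rather than a fixed operational threshold.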
Claim 1 recites “analyzing features from a record of a process that accesses the file system to detect deviations from normal computer system usage to determine whether the access to the file system is an anomaly.” Ex. 1001, col. 22, ll. 49–52. Denning discloses obtaining a new observation, comparing it to the model of normal activity, and determining whether the new observation is anomalous. Ex. 1021, 4 (“A new observation . . . is defined to be abnormal if it falls outside a confidence interval.”); Ex. 1003 ¶ 137 (pp. 91–92).

b. Patent Owner’s Arguments

Patent Owner presents the following arguments regarding claim 1 in its Response: 1) Denning does not disclose generating a model of normal computer system usage (PO Resp. 35–37); and 2) Denning does not disclose all the limitations of claim 1, arranged as in the claim (id. at 37–41). We address each of Patent Owner’s arguments below.

First, Patent Owner argues that Denning does not disclose generating a model of normal computer system usage. PO Resp. 35–37. As discussed above, Petitioner proposes construing the phrase “probabilistic model of normal computer system usage” to mean a “model of typical attack-free computer system usage that employs probability.” Pet. 12–13. As also discussed above, Patent Owner acknowledges that we should apply Petitioner’s proposed construction in this case. Tr. 74:12–18. Patent Owner argues that, although Denning discloses that the activity profile represents normal activity, Denning does not disclose that the data input to the activity profile is attack-free. PO Resp. 35–36 (citing Ex. 1021, 3–4). Patent Owner’s argument is not persuasive. Denning discloses generating a model of “normal activity” for anomaly detection. Ex. 1021, 3–4. Denning indicates that the term “normal” in the context of anomaly detection refers to activity that is not an attack. Id.
at Abstract (“The model is based on the hypothesis that security violations can be detected by monitoring a system’s audit records for abnormal patterns of system usage.”) (emphases added); Ex. 1003 ¶ 137 (p. 91). Thus, the model of normal activity disclosed in Denning is a model of attack-free computer system usage.

Second, Patent Owner argues that Denning does not disclose all the limitations of claim 1, arranged as in the claim. PO Resp. 37–41. Specifically, Patent Owner argues that Denning only discloses gathering audit records relating to users that access the file system, not processes that access the file system. Id. Patent Owner points out that Denning states that “[f]ile-access activity is represented in audit records where the subject is a user.” Id. at 37 (citing Ex. 1021, 9) (emphasis omitted). Patent Owner’s argument is not persuasive. As discussed above, Denning discloses that the “subject is typically a terminal user, but might also be a process acting on behalf of users or groups of users, or might be the system itself.” Ex. 1021, 2 (emphasis added). Thus, although the audit records of file-access activity typically may have a user as the subject, Denning discloses that a process also may be the subject of those audit records. Id.; Ex. 1029 ¶¶ 35–36.

Patent Owner also argues that none of “the specific measures that Denning suggested for the file-access activity profile . . . illustrate file accesses by a process.” PO Resp. 38–39. Patent Owner’s argument is not persuasive. Denning discloses ReadFails, WriteFails, DeleteFails, and CreateFails event counters that “could be useful for detecting users who persistently attempt access unauthorized files.” Ex. 1021, 9. As discussed above, though, Denning discloses that the subject is not limited to a user and may also be a process. Id. at 2.
Thus, the ReadFails, WriteFails, DeleteFails, and CreateFails event counters disclosed in Denning also would be useful for detecting processes that persistently attempt to access unauthorized system files. Id. at 2, 9.

c. Summary

For the reasons discussed above, we determine that Petitioner has shown by a preponderance of the evidence that claim 1 is anticipated by Denning.

4. Claim 2

Claim 2 depends from claim 1, and recites “wherein the step of gathering features from records of normal processes that access the file system of the computer comprises gathering a feature corresponding to a name of a process accessing the file system of the computer.” Ex. 1001, col. 22, ll. 53–57. Denning discloses gathering audit records with a “Subject” field that indicates the user or process that performed an action on an object. Ex. 1021, 2.

Patent Owner again argues that Denning only discloses gathering audit records relating to users that access the file system, not processes that access the file system. PO Resp. 42. For the reasons discussed above, Patent Owner’s argument is not persuasive. See supra Section II.C.3.b.

Patent Owner also argues that Denning does not disclose that the “Subject” field in an audit record can be used to generate the mean and standard deviation model. PO Resp. 43. Patent Owner’s argument is not persuasive. Denning discloses ReadFails, WriteFails, DeleteFails, and CreateFails event counters that can be “defined for individual users and files or classes thereof.” Ex. 1021, 9. In other words, Denning discloses event counters that are defined based on a specific subject from the audit records. Id. As discussed above, the subject in the audit records may be a process. Id. at 2. Denning also discloses that the mean and standard deviation model “is applicable to event counters.” Id. at 4.
Thus, Denning discloses that the mean and standard deviation model can be generated using an event counter that is based on the “Subject” field in the audit records. Ex. 1029 ¶ 38. Therefore, we determine that Petitioner has shown by a preponderance of the evidence that claim 2 is anticipated by Denning.

5. Claim 3

Claim 3 depends from claim 1, and recites “wherein gathering features from records of normal processes that access the file system of the computer comprises gathering a feature corresponding to a type of query being sent to the file system of the computer.” Ex. 1001, col. 22, ll. 58–62. Denning discloses gathering a feature corresponding to a type of query sent to the file system, such as “read,” “write,” “create,” “delete,” or “append.” Ex. 1021, 9. Denning also discloses generating the mean and standard deviation model using event counters that are based on the type of query sent to the file system, such as ReadFails, WriteFails, DeleteFails, and CreateFails event counters. Id. at 4, 9. Patent Owner does not present any specific arguments regarding claim 3 in its Response. PO Resp. 44. Therefore, we determine that Petitioner has shown by a preponderance of the evidence that claim 3 is anticipated by Denning.

6. Claim 4

Claim 4 depends from claim 3, and recites “wherein gathering features from records of normal processes that access the file system of the computer comprises gathering a feature corresponding to an outcome of a query being sent to the file system of the computer.” Ex. 1001, col. 22, ll. 63–67. Denning discloses gathering a feature corresponding to the outcome of a query sent to the file system, such as ReadFails, WriteFails, DeleteFails, and CreateFails event counters. Ex. 1021, 9.
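The event counters discussed above, which combine a query type with its outcome, can be sketched briefly for illustration. The record layout, field names, and sample data below are ours, not Denning’s (Denning defines the counters abstractly): a counter such as ReadFails for a given process reflects both the type of query (read) and its outcome (fail).

```python
from collections import Counter

# Hypothetical audit records in the spirit of Denning's subject/action/
# outcome fields; the field names and sample values here are our own.
audit_records = [
    {"subject": "backup_proc", "action": "read",   "outcome": "fail"},
    {"subject": "backup_proc", "action": "read",   "outcome": "fail"},
    {"subject": "editor_proc", "action": "write",  "outcome": "ok"},
    {"subject": "backup_proc", "action": "delete", "outcome": "fail"},
]

def failure_counters(records):
    """Count failed queries per (subject, query type), e.g. a per-process
    ReadFails counter that combines query type with outcome."""
    counts = Counter()
    for record in records:
        if record["outcome"] == "fail":
            counts[(record["subject"], record["action"])] += 1
    return counts

counts = failure_counters(audit_records)
# A per-process counter such as ReadFails for backup_proc could then feed a
# statistical model applicable to event counters.
print(counts[("backup_proc", "read")])
```

Each counter in this sketch is keyed by a specific subject, and its value depends on two different types of features: the query type and the query outcome.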
Patent Owner argues that Petitioner relies on the mean and standard deviation model to disclose other limitations of the challenged claims, but does not show that the mean and standard deviation model can be generated using the ReadFails, WriteFails, DeleteFails, and CreateFails event counters. PO Resp. 45–46. According to Patent Owner, Denning only discloses using those particular event counters with the operational model, not the mean and standard deviation model. Id. at 45 (citing Ex. 1021, 9). Patent Owner’s argument is not persuasive. Denning discloses that the mean and standard deviation model “is applicable to event counters” generally and does not limit that model to any specific event counters. Ex. 1021, 4; Ex. 1029 ¶ 39. Although Denning “suggest[s]” (Ex. 1021, 7) using the ReadFails, WriteFails, DeleteFails, and CreateFails event counters with the operational model (id. at 9), that suggestion does not limit the broader disclosure indicating that the mean and standard deviation model is applicable to event counters generally (id. at 4; Ex. 1029 ¶ 39).

Patent Owner also argues that claim 4, through its dependency from claims 1 and 3, requires gathering two features (the type of query and the outcome of the query) and then generating a model of normal computer system usage based on those two features. PO Resp. 46. Patent Owner contends that Denning does not disclose generating the mean and standard deviation model based on two different types of features. Id. at 47. Patent Owner’s argument is not persuasive. Denning discloses that a model can be defined “for aggregates of subject and objects (i.e., where the Subject and Object patterns match sets of names).” Ex. 1021, 5; Ex. 1029 ¶ 40. Further, as discussed above, Denning discloses generating the mean and standard deviation model using event counters. Ex. 1021, 4.
Each of the ReadFails, WriteFails, DeleteFails, and CreateFails event counters is based on two different types of features. Id. at 9. For example, the ReadFails event counter is based on the type of query (e.g., read) and the outcome of the query (e.g., fail). Id. Thus, Denning discloses generating the mean and standard deviation model based on two different types of features. Therefore, we determine that Petitioner has shown by a preponderance of the evidence that claim 4 is anticipated by Denning.

7. Claim 7

Claim 7 depends from claim 1, and recites “wherein generating a probabilistic model of normal computer system usage comprises determining a likelihood of observing a feature in the records of processes that access the file system of the computer.” Ex. 1001, col. 23, ll. 10–14. Petitioner argues that the mean and standard deviation model in Denning determines the likelihood of observing an event. Pet. 53–54 (citing Ex. 1021, 4). Thus, Petitioner identifies evidence indicating that Denning discloses determining the likelihood of observing an event, but does not identify evidence indicating that Denning discloses determining the likelihood of observing a feature, as recited in claim 7. Pet. 53–54; Pet. Reply 22–23; Tr. 30:2–15, 31:6–12. Petitioner instead argues that an event is the “same thing” as a feature. Tr. 30:20–24, 32:9–13. Petitioner, however, improperly presented this argument for the first time at the oral hearing. Id. at 30:13–15; Pet. 53–54; Pet. Reply 22–23; see Office Patent Trial Practice Guide, 77 Fed. Reg. 48,756, 48,768 (Aug. 14, 2012) (“No new evidence or arguments may be presented at the oral argument.”). Moreover, Petitioner does not identify any evidence from the ’306 patent (or otherwise) to support its argument that an event is the same thing as a feature. Tr. 31:1–5; see Helmsderfer, 527 F.3d at 1381–82.
Therefore, we determine that Petitioner has not shown by a preponderance of the evidence that claim 7 is anticipated by Denning.

D. Patent Owner’s Motion to Strike

Patent Owner filed a Motion to Strike Petitioner’s Reply (Paper 42, “PO Mot. Str.”), to which Petitioner filed an Opposition (Paper 44, “Pet. Opp. Str.”).4 Patent Owner argues that the portion of Petitioner’s Reply relating to claim 7 as being anticipated by Bace should be stricken because it is beyond the scope of a proper reply. PO Mot. Str. 1–2. Specifically, the Petition relies on two anomaly detection models in Bace, the mean and standard deviation model and the Markov process model, as teaching the limitations of claim 7. Id. at 1; Pet. 30–32. Patent Owner argues in the Response that those models do not teach the limitations of claim 7. PO Mot. Str. 2; PO Resp. 31–33. Petitioner responds in the Reply that a different model in Bace, the TIM system, teaches the limitations of claim 7. PO Mot. Str. 2; Pet. Reply 14–15.

“A reply may only respond to arguments raised in the corresponding opposition or patent owner response.” 37 C.F.R. § 42.23(b). Here, Patent Owner argues that Petitioner’s Reply exceeds the scope of a proper reply because it presents a new theory of unpatentability for claim 7 based on the TIM system in Bace that is not included in the Petition. PO Mot. Str. 2–3. Petitioner, on the other hand, argues that its Reply is proper because the Petition relies on the TIM system in Bace with respect to claim 8, and, thus, Patent Owner was on notice that Petitioner also may rely on the TIM system with respect to claim 7. Pet. Opp. Str. 1–2. Petitioner’s arguments in the Reply relating to claim 7 as being anticipated by Bace test the limits of a proper reply under 37 C.F.R. § 42.23(b).

4 We authorized Patent Owner to file a motion to strike and Petitioner to file an opposition. Paper 40, 3.
Petitioner acknowledges that it relied on the TIM system in the Reply simply because “[i]t became clear that . . . the case is easier under the TIM Model.” Tr. 32:16–33:22. As discussed above, though, we determine that Petitioner’s allegedly improper arguments in the Reply relating to claim 7 as being anticipated by Bace are not persuasive. See supra Section II.B.6. As a result, Petitioner’s allegedly improper arguments in the Reply do not prejudice Patent Owner, and we determine that it is unnecessary to strike those portions of Petitioner’s Reply. Therefore, Patent Owner’s Motion to Strike Petitioner’s Reply is denied.

E. Patent Owner’s Motion to Exclude

Patent Owner filed a Motion to Exclude (Paper 45, “PO Mot. Excl.”), to which Petitioner filed an Opposition (Paper 52, “Pet. Opp. Excl.”), and Patent Owner filed a Reply (Paper 53, “PO Reply Excl.”). Patent Owner argues that portions of Exhibit 1029 should be excluded. PO Mot. Excl. 1. Exhibit 1029 is the Supplemental Declaration of Dr. Michael T. Goodrich. Patent Owner argues that paragraphs 31 and 32 of Exhibit 1029 should be excluded as prejudicial under Fed. R. Evid. 403, because Petitioner relies on those paragraphs to support the allegedly improper arguments in the Reply relating to claim 7. PO Mot. Excl. 1. Patent Owner identifies the arguments in its Motion to Strike Petitioner’s Reply as support for its Motion to Exclude. Id. As discussed above, we determine that it is unnecessary to strike the allegedly improper arguments in Petitioner’s Reply. See supra Section II.D. For the same reasons, we determine that it is unnecessary to exclude paragraphs 31 and 32 of Exhibit 1029. Therefore, Patent Owner’s Motion to Exclude paragraphs 31 and 32 of Exhibit 1029 is denied.

F. Patent Owner’s Motion for Observations

Patent Owner filed a Motion for Observations on the cross examination of Dr. Michael T. Goodrich (Paper 48), to which Petitioner filed a Response (Paper 51).
We have considered Patent Owner’s observations and Petitioner’s responses, and we have given Dr. Goodrich’s testimony the appropriate weight in making our determination in this case.

III. CONCLUSION

Petitioner has shown by a preponderance of the evidence that claims 1–4 of the ’306 patent are unpatentable. Petitioner, however, has not shown by a preponderance of the evidence that claims 7 and 8 of the ’306 patent are unpatentable.

IV. ORDER

In consideration of the foregoing, it is hereby:

ORDERED that claims 1–4 of the ’306 patent are shown unpatentable;

FURTHER ORDERED that claims 7 and 8 of the ’306 patent are not shown unpatentable;

FURTHER ORDERED that Patent Owner’s Motion to Strike Petitioner’s Reply is denied;

FURTHER ORDERED that Patent Owner’s Motion to Exclude is denied; and

FURTHER ORDERED that, because this is a Final Written Decision, parties to the proceeding seeking judicial review of the decision must comply with the notice and service requirements of 37 C.F.R. § 90.2.

PETITIONER:

Brian M. Hoffman
Michael Sacksteder
FENWICK & WEST LLP
bhoffman-ptab@fenwick.com
msacksteder-ptab@fenwick.com

PATENT OWNER:

Hong A. Zhong
Michael Fleming
IRELL & MANELLA LLP
hzhong@irell.com
mfleming@irell.com