UNITED STATES PATENT AND TRADEMARK OFFICE
UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS
P.O. Box 1450
Alexandria, Virginia 22313-1450
www.uspto.gov

APPLICATION NO.: 14/818,495
FILING DATE: 08/05/2015
FIRST NAMED INVENTOR: Baiju D. Mandalia
ATTORNEY DOCKET NO.: END920150041US1
CONFIRMATION NO.: 2607

104079 7590 08/02/2021
IBM CORPORATION - Patent Center
1701 North Street B/256-3
Endicott, NY 13760

EXAMINER: SAADOUN, HASSAN
ART UNIT: 2435
NOTIFICATION DATE: 08/02/2021
DELIVERY MODE: ELECTRONIC

Please find below and/or attached an Office communication concerning this application or proceeding. The time period for reply, if any, is set in the attached communication. Notice of the Office communication was sent electronically on the above-indicated “Notification Date” to the following e-mail address(es): fdciplaw@us.ibm.com

PTOL-90A (Rev. 04/07)

UNITED STATES PATENT AND TRADEMARK OFFICE
____________________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
____________________

Ex parte BAIJU D. MANDALIA and TUNG OUYANG
____________________

Appeal 2020-000445
Application 14/818,495[1]
Technology Center 2400
____________________

Before CARL W. WHITEHEAD JR., ERIC S. FRAHM, and JOYCE CRAIG, Administrative Patent Judges.

FRAHM, Administrative Patent Judge.

DECISION ON APPEAL

STATEMENT OF THE CASE

Appellant[2] appeals under 35 U.S.C. § 134(a) from a final rejection of claims 1–13 and 15–21. Claim 14 has been cancelled (see Appeal Br. 22, Claims Appendix). We have jurisdiction under 35 U.S.C. § 6(b). We affirm.

[1] Throughout this Opinion, we refer to: (1) the Final Office Action mailed November 5, 2018 (“Final Act.”); (2) the Advisory Action mailed January 29, 2019 (“Advisory Act.”); (3) the Pre-Appeal Brief filed February 5, 2019 (“Pre-Appeal Br.”); (4) the Appeal Brief filed June 3, 2019 (“Appeal Br.”); (5) the Examiner’s Answer mailed August 2, 2019 (“Ans.”); and (6) the Reply Brief filed October 1, 2019 (“Reply Br.”).

[2] We use the word “Appellant” to refer to “applicant” as defined in 37 C.F.R. § 1.42 (2019). “The word ‘applicant’ when used in this title refers to the inventor or all of the joint inventors, or to the person applying for a patent as provided in §§ 1.43, 1.45, or 1.46.” 37 C.F.R. § 1.42(a). Appellant identifies the real party in interest as International Business Machines Corporation (Appeal Br. 1).

DISCLOSED AND CLAIMED INVENTION

According to Appellant, the disclosed invention, entitled “Security Control For An Enterprise Network” (Title), pertains to security control for a computing device (see Fig. 1, 120) in an enterprise network (see Fig. 1, 130) that connects to another device (see Fig. 1, 110) for administering security over the network (see Spec. 1:1–2, 4:3–5:20). Independent claim 1, reproduced below, is illustrative of the claimed subject matter:
1. A method for administering security for a network, comprising the steps of:

    detecting, by one or more processors of a device for administering security for the network, that a computing device has connected to a network in an area;

    determining, by the one or more processors of the device for administering security for the network, whether the computing device is valid, wherein determining whether the computing device is valid further comprises comparing a MAC address associated with the computing device with a database containing one or more MAC addresses associated with one or more valid devices;

    based on determining that the computing device is valid, determining, by the one or more processors of the device for administering security for the network, whether the computing device is being utilized for one or more suspicious activities;

    based on determining that the computing device is being utilized for one or more suspicious activities:

        determining, by the one or more processors of the device for administering security for the network, a location of the computing device;

        determining, by the one or more processors of the device for administering security for the network, whether [A1] a user of the computing device can be identified by cameras for monitoring the area based on visual recognition; and

        based on determining that [A2] the user of the computing device cannot be identified as authorized to be present in the area, [B] disabling the computing device and [C] transmitting an alert to security personnel.

Appeal Br. 19, Claims Appendix (bracketed lettering and emphases added).
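For orientation only, the following is a minimal sketch, in Python, of the sequence of determinations recited in claim 1. It is not part of the application, the claims, or the cited references; every identifier and helper below is hypothetical and is included solely to make the claimed workflow easier to follow. The first recited step (detecting that a device has connected) is assumed to have already occurred and to trigger the call.

```python
# Hypothetical sketch only. None of these names, helpers, or data structures
# come from the application, the cited references, or the record; they merely
# illustrate the order of determinations recited in claim 1.
from dataclasses import dataclass
from typing import Callable, Iterable, Optional


@dataclass
class ConnectedDevice:
    mac_address: str
    location: Optional[str] = None


def administer_security(
    device: ConnectedDevice,
    valid_macs: Iterable[str],                                # stands in for the claimed MAC-address database
    is_suspicious: Callable[[ConnectedDevice], bool],         # "suspicious activities" check (details unspecified)
    locate: Callable[[ConnectedDevice], str],                 # determines the device's location
    identify_user_by_camera: Callable[[str], Optional[str]],  # visual recognition by cameras monitoring the area
    authorized_users: Iterable[str],
    disable: Callable[[ConnectedDevice], None],
    alert_security: Callable[[str], None],
) -> None:
    # Called once a device is detected connecting to the network in the area.

    # Validate the device by comparing its MAC address against known valid MACs.
    if device.mac_address not in set(valid_macs):
        return  # claim 1 recites the later steps only for devices determined to be valid

    # Check whether the valid device is being utilized for suspicious activities.
    if not is_suspicious(device):
        return

    # Determine the device's location, then attempt to identify the user via cameras.
    device.location = locate(device)
    user = identify_user_by_camera(device.location)

    # Limitations A2, B, C: if the user cannot be identified as authorized to be
    # present in the area, disable the computing device and alert security personnel.
    if user is None or user not in set(authorized_users):
        disable(device)
        alert_security(
            f"Unidentified user on device {device.mac_address} at {device.location}"
        )
```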
EXAMINER’S REJECTIONS and APPELLANT’S CONTENTIONS

Examiner’s Rejections

The Examiner made the following rejections:

(1) Claims 1–3, 8–10, and 15–17 stand rejected under 35 U.S.C. § 103 as being unpatentable over Larue et al. (US 2014/0161027 A1; published June 12, 2014) (hereinafter, “Larue”), and Anderson et al. (US 2010/0205667 A1; published August 12, 2010) (hereinafter, “Anderson”). Final Act. 4–10.

(2) Claims 4, 11, and 18 stand rejected under 35 U.S.C. § 103 as being unpatentable over Larue, Anderson, and Cherian et al. (US 2013/0231130 A1; published September 5, 2013) (hereinafter, “Cherian”). Final Act. 10–11.

(3) Claims 5, 6, 12, 13, 19, and 20 stand rejected under 35 U.S.C. § 103 as being unpatentable over Larue, Anderson, and Grier (US 2014/0208427 A1; published July 24, 2014). Final Act. 11–14.

(4) Claim 7 stands rejected under 35 U.S.C. § 103 as being unpatentable over Larue, Anderson, and Pecen et al. (US 9,497,688 B2; issued Nov. 15, 2016) (hereinafter, “Pecen”). Final Act. 14.

(5) Claim 21 stands rejected under 35 U.S.C. § 103 as being unpatentable over Larue, Anderson, and Aonuma et al. (US 2011/0282662 A1; published Nov. 17, 2011) (hereinafter, “Aonuma”). Final Act. 15.

Appellant’s Contentions

Appellant presents arguments as to claim 1, and relies on those arguments for the patentability of remaining claims 2–13 and 15–21 (see Appeal Br. 13–16). With regard to claim 1, Appellant contends (see Appeal Br. 9–13; Reply Br. 2–5), inter alia, that the Examiner erred in reaching the conclusion of obviousness because (i) Anderson generally fails to disclose or suggest limitations A1, A2, B, and C of claim 1 (see Pre-Appeal Br. 8–11; Appeal Br. 11–13); (ii) Anderson, in particular, fails to disclose or suggest the limitation “disabling the computing device” (claim 1, limitation B) (see Appeal Br. 12–13); and (iii) the Examiner incorrectly interprets the term “user of the computing device” (claim 1, limitations A1, A2) (see Appeal Br. 11–12; Reply Br. 3–5).

Based on Appellant’s arguments, we (i) select claim 1 as representative of claims 1–3, 8–10, and 15–17; and (ii) decide the outcome of claims 4–7, 11–13, and 18–21 on the same basis as claim 1. See 37 C.F.R. § 41.37(c)(1)(iv).

ISSUE

Based on Appellant’s arguments in the Pre-Appeal Brief (Pre-Appeal Br. 8–11), Appeal Brief (Appeal Br. 9–13), and the Reply Brief (Reply Br. 2–5), the following principal issue is presented on appeal:

Has Appellant shown the Examiner erred in rejecting claims 1–13 and 15–21 under 35 U.S.C. § 103 because the collective teachings and suggestions of Larue and Anderson, taken with the knowledge of a person having ordinary skill in the art of security control for a computing device, do not support the conclusion of obviousness as to the subject matter of claim 1?

ANALYSIS

We have reviewed the Examiner’s rejections (Final Act. 2–15; Ans. 3–17) in light of Appellant’s arguments (Pre-Appeal Br. 8–11; Appeal Br. 9–16; Reply Br. 2–5) that the Examiner has erred, as well as the Examiner’s response to Appellant’s arguments in the Appeal Brief (Ans. 15–17). With regard to representative claim 1, we agree with and adopt as our own the Examiner’s findings of fact and conclusions as set forth in the Final Rejection (Final Act. 4–7), Advisory Action (Advisory Act. 2), and Answer (Ans. 3–6, 15–17). Based on the record before us, we agree with the Examiner’s conclusion of obviousness as to claim 1. We provide the following explanation for emphasis only.

Claim Interpretation: “Disabling the Computing Device,” “User of the Computing Device,” and “Security Personnel”

Under the broadest reasonable interpretation standard, and absent any special definition, claim terms are given their ordinary and customary meaning, as would be understood by a person of ordinary skill in the art in the context of the entire disclosure. In re Translogic Tech., Inc., 504 F.3d 1249, 1257 (Fed. Cir. 2007). Any special definition for a claim term must be set forth with reasonable clarity, deliberateness, and precision. In re Paulsen, 30 F.3d 1475, 1480 (Fed. Cir. 1994). “When the specification explains and defines a term used in the claims, without ambiguity or incompleteness, there is no need to search further for the meaning of the term.” Multiform Desiccants, Inc. v. Medzam, Ltd., 133 F.3d 1473, 1478 (Fed. Cir. 1998); Bradium Technologies LLC v. Iancu, 923 F.3d 1032 (Fed. Cir. 2019) (affirming the PTAB in an inter partes review; finding no unambiguous definition of a claim term, the Federal Circuit applied the plain meaning to the disputed PTAB construction). Thus, a clear and unambiguous definition of a claim term is required to redefine the term to mean something other than its plain and ordinary meaning.

The Examiner, in the Final Office Action, determines that Anderson teaches “disabling the computing device” in paragraphs 16 and 79, and in Figures 8A and 8B (Final Act. 6). “The security processor is configured to at least partially control the operation of the data input device and the data output device . . .” (Final Act. 6, citing Anderson ¶ 16) (emphasis added).
“[P]rivacy mode is activated according to the security policy of this example, and the sensitive text (8120) and graphic data (8130) being displayed are obscured” (Final Act. 6–7, citing Anderson ¶ 79).

The limitation of “disabling the computing device” in claim 1 is not further defined or explained in any other dependent claim except for claim 7. Claim 7 recites disabling of the computing device “by way of the network” (Appeal Br. 20, Claims Appendix). Furthermore, no part of the originally filed Specification provides a clear and unambiguous definition of the claim term “disable” or “disabling” to redefine the term to mean something other than its plain and ordinary meaning (see Spec. ¶¶ 21–23) (describing the disabling operation).

Based on the ordinary and customary meaning of the term “disable,” we reasonably conclude that “privacy mode” and obscuring of data are “disabling” of the computing device. Furthermore, Anderson’s teaching of “control” in paragraph 16 is encompassed by claim 1’s “disabling” based on the background knowledge of the artisan that control of computer devices includes disabling part or all of the device.

Paragraph 94 of Anderson also teaches “disabling the computing device,” as recited in claim 1:

    . . . to restrict . . . device access . . . is also used to selectively disable particular devices . . . the configuration settings can indicate that web camera inputs be blocked, or that other data, . . . be substituted for actual camera input data. . . . For example, a DVD player application can be made to pause playback when the user looks away from the display device, or a video recorder application can be paused when the user is not facing the camera. Other actions can be taken . . . microphone inputs can be muted as well.

Anderson ¶ 94 (emphases added).
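To illustrate the kind of selective, policy-driven disabling the Board reads onto limitation B, the following is a hypothetical Python sketch modeled loosely on the behaviors quoted above from Anderson (obscuring displayed data, blocking camera input, muting the microphone). It is not Anderson’s implementation, and none of the identifiers below come from Anderson or from the record.

```python
# Hypothetical illustration only; not Anderson's implementation. It models a
# "privacy mode" that disables parts of a computing device (display output,
# camera input, microphone input) when an unrecognized face-like object is
# detected, in the spirit of the passages quoted from Anderson paragraphs 79 and 94.
from dataclasses import dataclass, field
from typing import List


@dataclass
class DeviceComponents:
    display_obscured: bool = False
    camera_input_blocked: bool = False
    microphone_muted: bool = False
    disabled: List[str] = field(default_factory=list)


def apply_privacy_policy(components: DeviceComponents,
                         unrecognized_face_detected: bool) -> DeviceComponents:
    """Selectively disable device components according to a security policy."""
    if unrecognized_face_detected:
        components.display_obscured = True      # obscure sensitive text and graphics being displayed
        components.camera_input_blocked = True  # block (or substitute) camera input data
        components.microphone_muted = True      # mute microphone inputs
        components.disabled = ["display", "camera", "microphone"]
    return components


if __name__ == "__main__":
    state = apply_privacy_policy(DeviceComponents(), unrecognized_face_detected=True)
    print(state)  # shows which components the policy disabled
```

On this kind of reading, disabling individual components (display output, camera, microphone) is treated as disabling the computing device in part, which is the interpretation the Board adopts in the discussion that follows.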
Therefore, in this light, Appellant’s contention that, in Anderson, “there is no mention of locking down the user from the system based on the facial recognition data in paragraph [0016]” (Appeal Br. 11), is not persuasive of Examiner error inasmuch as claim 1 does not require “locking down” a user, just disabling the computing device or part of the computing device. Therefore, Anderson’s descriptions of (1) controlling the computing device (Anderson ¶ 16), and (2) disabling the device or parts thereof (Anderson ¶ 94), teach or suggest the disabling set forth in limitation B of claim 1, especially to the extent “disabling” is described in Appellant’s Specification.

Given the above discussion, the Examiner reasonably determines that Anderson disables a computing device by controlling applications and elements of the computing device as set forth in claim 1. Appellant’s contention that Anderson fails to “disable the computing device” (Appeal Br. 13) is unpersuasive of Examiner error in view of the teachings of Anderson discussed above in this regard (see Anderson ¶¶ 16, 94). Furthermore, we disagree with Appellant’s argument that “[w]hen the [Anderson] reference is read as a whole, the meaning to the cited portion [paragraph 79] of the reference is clear; the system looks for people eavesdropping, and notifies the user of the system that someone is eavesdropping” (Appeal Br. 13).

Appellant ignores the further teachings of Anderson in (1) paragraph 79 that a “privacy mode is activated according to the security policy of this example, and the sensitive text (8120) and graphic data (8130) being displayed are obscured,” and (2) paragraph 94 that the computing device or components of the device are disabled (see Anderson ¶¶ 79, 94).

Appellant contends, in the Reply Brief, that “[t]he Examiner’s argument, in essence, is that the additional face-like object, or facial recognition data [taught by Anderson], is a ‘user of the computing device’” (Reply Br. 3). Appellant further argues:

    User is defined as “a person who uses or operates something” or “one that uses.” Use is defined as . . . “to put into action or service : avail oneself of : employ.” . . . Based on the above, the broadest reasonable interpretation in this instance is the standard dictionary definition. . . .

    . . . In both provided definitions, the term use requires something (a person or agent) to actively interact with an object to make use of said object.

Reply Br. 3–4.

We disagree with Appellant’s assertion that Anderson does not teach the limitation “user of the computing device” as recited in claim 1. As recognized by Appellant, a user is one that puts into action or service or avails oneself of the computing device (see Reply Br. 3–4). As indicated in the Examiner’s Answer, Anderson in paragraph 79 teaches “the additional face-like object has an attention point on the display (8110) (i.e., is facing the display, is close enough to see the display, and optionally has it’s gaze direction pointed at the display)” (Ans. 15–16) and, thus, the user is availing themselves of the computing device display.

Furthermore, Appellant’s argument that “Anderson makes no determination that the unrecognized object, discussed in paragraph [0079], is actually using the computing device (or even that the unrecognized object is a person)” (Appeal Br. 12) is unpersuasive of Examiner error, because the Examiner relies upon Larue (see Final Act. 5, citing Larue ¶ 31) as teaching detecting whether a rogue computing device (e.g., a wireless access point) is “active” (see Larue ¶ 31). As such, Appellant’s argument addressing the individual shortcomings of Anderson is conclusory and does not address the collective teachings and suggestions of the references.[3] Therefore, Appellant has not shown that the Examiner erred in rejecting claim 1 over the base combination of Larue and Anderson.

[3] We note that each reference cited by the Examiner must be read, not in isolation, but for what it fairly teaches in combination with the prior art as a whole. See In re Merck & Co., 800 F.2d 1091, 1097 (Fed. Cir. 1986) (finding one cannot show non-obviousness by attacking references individually where the rejections are based on combinations of references). We emphasize that the Examiner’s ultimate legal conclusion of obviousness is based upon the combined teachings of the cited references. Moreover, “the question under 35 USC 103 is not merely what the references expressly teach but what they would have suggested to one of ordinary skill in the art at the time the invention was made.” Merck & Co. v. Biocraft Labs., Inc., 874 F.2d 804, 807 (Fed. Cir. 1989) (quoting In re Lamberti, 545 F.2d 747, 750 (CCPA 1976)) (emphasis added); see also MPEP § 2123.

Appellant’s argument that “when read as a whole, Anderson cannot disclose ‘disabl[ing] the computing device’ based on identification of the user of the computing device” (Appeal Br. 12), is not persuasive inasmuch as these arguments are not commensurate in scope with the language of claim 1. See In re Hiniker Co., 150 F.3d 1362, 1369 (Fed. Cir. 1998) (“[The] proffered facts . . . are not commensurate with the claim scope and are therefore unpersuasive.”).
Appellant’s arguments are not persuasive of Examiner error because they are not responsive to the Examiner’s rejection and are not commensurate in scope with the language of claim 1 (which only requires “based on determining that the user of the computing device cannot be identified . . . , disabling the computing device” (Appeal Br. 19, Claims Appendix)). See In re Self, 671 F.2d 1344, 1348 (CCPA 1982) (limitations not appearing in the claims cannot be relied upon for patentability).

Appellant contends in the Pre-Appeal Brief that “Larue in combination with Anderson does not disclose or suggest at least ‘. . . based on determining that the user of the computing device cannot be identified as authorized to be present in the area, program instructions to . . . transmit an alert to security personnel’” (Pre-Appeal Br. 8, emphasis added). The Examiner, in the Final Office Action and the Examiner’s Answer, makes the following finding with regard to the limitation of “transmit an alert to security personnel” by citing to Anderson’s Figures 8A and 8B and paragraph 79:

    The current user is also alerted to the presence of the unauthorized individual by a pop-up video window (8140) that shows the video input device feed with the unauthorized face-like object (8150) indicated by a contrasting color box (8160) placed around it (transmitting an alert) (EN: the generated alert can be transmitt[ed] to a security employee).

(Final Act. 7; Ans. 6) (underlining and bolding added).

Based on the above teachings of Anderson, we agree with the Examiner’s finding that the user (8020) receiving an alert by a pop-up window teaches the limitation of claim 1 “transmitting an alert to security personnel,” in which the user (8020) is the security personnel. In view of the foregoing, we sustain the rejection of representative claim 1 as being obvious over the combination of Larue and Anderson.

CONCLUSION

For the foregoing reasons, Appellant has not demonstrated any error in the rejections of (i) representative claim 1, and claims 2, 3, 8–10, and 15–17 grouped therewith, over the base combination of Larue and Anderson; and/or (ii) claims 4–7, 11–13, and 18–21 over the same base combination taken with various other references. See Appeal Br. 9 (“For convenience of the Honorable Board in addressing the rejections, claims 8 and 15 stand or fall together with independent claim 1.”). Accordingly, all of the obviousness rejections over the base combination of Larue and Anderson are sustained.

In summary:

Claim(s) Rejected      35 U.S.C. §   Reference(s)/Basis          Affirmed               Reversed
1–3, 8–10, 15–17       103           Larue, Anderson             1–3, 8–10, 15–17
4, 11, 18              103           Larue, Anderson, Cherian    4, 11, 18
5, 6, 12, 13, 19, 20   103           Larue, Anderson, Grier      5, 6, 12, 13, 19, 20
7                      103           Larue, Anderson, Pecen      7
21                     103           Larue, Anderson, Aonuma     21
Overall Outcome                                                  1–13, 15–21

No time period for taking any subsequent action in connection with this appeal may be extended under 37 C.F.R. § 1.136(a)(1). See 37 C.F.R. §§ 41.50(f), 41.52(b).

AFFIRMED