Ex parte Wright et al., Appeal 2017-011780, Application 13/569,048 (P.T.A.B. May 24, 2018)

UNITED STATES PATENT AND TRADEMARK OFFICE

APPLICATION NO.: 13/569,048
FILING DATE: 08/07/2012
FIRST NAMED INVENTOR: David G. Wright
ATTORNEY DOCKET NO.: CD09033CIP1
CONFIRMATION NO.: 1055

60909 7590 05/29/2018
CYPRESS SEMICONDUCTOR CORPORATION
198 CHAMPION COURT
SAN JOSE, CA 95134-1709

UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS, P.O. Box 1450, Alexandria, Virginia 22313-1450
www.uspto.gov

EXAMINER: PARKER, JEFFREY ALAN
ART UNIT: 2625

NOTIFICATION DATE: 05/29/2018
DELIVERY MODE: ELECTRONIC

Please find below and/or attached an Office communication concerning this application or proceeding. The time period for reply, if any, is set in the attached communication. Notice of the Office communication was sent electronically on the above-indicated "Notification Date" to the following e-mail address(es): patents@cypress.com

PTOL-90A (Rev. 04/07)

UNITED STATES PATENT AND TRADEMARK OFFICE
BEFORE THE PATENT TRIAL AND APPEAL BOARD

Ex parte DAVID G. WRIGHT, RYAN SEGUINE, STEVE KOLOKOWSKY, and DAVID YOUNG [1]

Appeal 2017-011780
Application 13/569,048
Technology Center 2600

Before ALLEN R. MacDONALD, JOSEPH P. LENTIVECH, and MICHAEL J. ENGLE, Administrative Patent Judges.

MacDONALD, Administrative Patent Judge.

DECISION ON APPEAL

[1] According to Appellants, the real party in interest is Cypress Semiconductor Corporation. App. Br. 3.

STATEMENT OF THE CASE

Appellants appeal under 35 U.S.C. § 134(a) from a final rejection of claims 1, 3, 5-15, and 17-21. Appellants have cancelled claims 2, 4, and 16. App. Br. 20, 21, 24. We have jurisdiction under 35 U.S.C. § 6(b).

Representative Claims

Representative claims 1, 8, and 15 under appeal read as follows (emphasis, formatting, and bracketed material added):

1. A system comprising:
[A.]
a sensor configured to detect user inputs; and
[B.] a processor configured to:
[i.] receive a detected first user input from the sensor, the first user input comprising a gesture;
[ii.] determine whether the gesture has an associated first command;
[iii.] if the gesture does not have an associated first command, provide an interface with one or more available commands;
[iv.] if the gesture has an associated first command, execute the associated first command;
[v.] receive a detected second user input, the second user input comprising an indication that the associated first command is incorrect;
[vi.] in response to the second user input, undo the first command associated with the gesture by reversing the execution of the first command to revert the system back to a state prior to the first command being executed;
[vii.] receive a detected third user input from the sensor, the third user input indicative of a second command; and
[viii.] assign the second command to the gesture based on the third user input.

8. A system comprising:
[A.] a sensing device configured to determine one or more characteristics of at least one of a plurality of user inputs, the at least one of the plurality of user inputs comprising a gesture; and
[B.] a processor configured to:
[i.] receive the determined one or more characteristics;
[ii.] determine whether the determined one or more characteristics are associated with one of a plurality of known commands;
[iii.]
when the determined one or more characteristics are not associated with one of a plurality of known commands, identify at least one of the plurality of known commands to be associated with the at least one of the plurality of user inputs based on the determined one or more characteristics of the gesture, wherein to identify the at least one of the plurality of known commands, the processor is configured to determine whether the determined one or more characteristics of the gesture are within a defined tolerance of allowed characteristics of the at least one of the plurality of known commands; and
[iv.] assign the at least one of the plurality of known commands to the at least one of the plurality of user inputs.

15. A method, comprising:
[A.] receiving a first user input detected by a sensor;
[B.] identifying one or more characteristics of the received first user input;
[C.] determining, by a processor if the one or more characteristics matches a characteristic of a known gesture in a gesture library, the gesture library comprising a plurality of known gestures and one or more characteristics that identify each of the plurality of known gestures; and
[D.] if the one or more characteristics do not match a characteristic of a known gesture,
[i.] generating a new gesture based on the one or more characteristics of the first user input,
[ii.] receiving a second user input, the second user input indicating a command, and
[iii.] associating the command indicated by the second user input with the new gesture associated with the first user input by linking an entry in the gesture library corresponding to the new gesture with an entry in a command library associated with the command, wherein the command library is separate from the gesture library.

Rejections

The Examiner rejected claims 1, 3, 5-7, and 21 under 35 U.S.C. § 103(a) as being unpatentable over the combination of Westerman et al. (US 2008/0036743 A1; pub. Feb.
14, 2008), Rigazio et al. (US 2009/0002218 A1; pub. Jan. 1, 2009), and Nishihara et al. (US 2009/0103780 A1; pub. Apr. 23, 2009). [2]

[2] We select claim 1 as representative. Appellants do not argue separate patentability for claims 3, 5-7, and 21. Except for our ultimate decision, we do not discuss claims 3, 5-7, and 21 further herein.

The Examiner rejected claims 8-14 under 35 U.S.C. § 103(a) as being unpatentable over the combination of Westerman, Nishihara, and Do et al. (US 2010/0013676 A1; pub. Jan. 21, 2010). [3]

[3] We select claim 8 as representative. Appellants do not argue separate patentability for claims 9-14. Except for our ultimate decision, we do not discuss claims 9-14 further herein.

The Examiner rejected claims 15 and 17-20 under 35 U.S.C. § 103(a) as being unpatentable over the combination of Westerman and Marvit et al. (US 2005/0210417 A1; pub. Sept. 22, 2005). [4]

[4] We select claim 15 as representative. Appellants do not argue separate patentability for claims 17-20. Except for our ultimate decision, we do not discuss claims 17-20 further herein.

Issue on Appeal

Did the Examiner err in rejecting claims 1, 8, and 15 as being obvious?

ANALYSIS

We have reviewed the Examiner's rejections in light of Appellants' arguments (Appeal Brief and Reply Brief) that the Examiner has erred. We disagree with Appellants. Except as noted below, we adopt as our own (1) the findings and reasons set forth by the Examiner in the action from which this appeal is taken and (2) the reasons set forth by the Examiner in the Examiner's Answer in response to Appellants' Appeal Brief. We concur with the conclusions reached by the Examiner. We highlight the following points.

A. Appellants raise the following arguments in contending that the Examiner erred in rejecting claim 1 under 35 U.S.C. § 103(a).

1.
[T]he undo command referenced by Westerman pertains not to the "undo" of a previous command input, but to repeat the same command when flipping through pages. (Westerman, paragraph [0110].)

App. Br. 8 (emphasis added).

[W]hile paragraph [0110] of Westerman does describe an undo command in a specific example of a type of command that could be detected, the undo command described by Westerman does not "undo" a prior, executed command.

App. Br. 11 (emphasis added).

Westerman [paragraph 0110] does not describe that an undo command undoes a prior action associated with a gesture, as recited by claim 1. Instead, an undo command in Westerman simply undoes "work," which would apply to typing on a keyboard, clicking on an option with a mouse, drawing on a scratchpad with a mouse, etc. It appears as though the Examiner is incorrectly relying on the doctrine of inherency to demonstrate that Westerman teaches or suggests "receive a detected second user input, the second user input comprising an indication that the associated first command is incorrect (where the first command is associated with a gesture)" and "in response to the second user input, undo the first command associated with the gesture by reversing the execution of the first command to revert the system back to a state prior to the first command being executed," as recited by claim 1.

App. Br. 12 (emphasis added).

The Examiner presents the following response to Appellants' above arguments.

Applicant argues on page 8 that Westerman's paragraph 110 does not teach an undo command but rather a way of continuing/repeating a command. The command that is discussed as being repeatable, however, is a traditional "undo" command and it seems stretched to read otherwise. Further, paragraphs 163 and 174 explicitly discuss that a traditional "undo" command can be mapped to a gesture event, and Fig.
29E demonstrates an interface for programming particular chords/gesture events to commands such as editing operations including "undo."

Ans. 13-14 (emphasis added).

First, we construe the critical term "undo" of claim 1. Reviewing the Specification, we do not find the Appellants provide their own definition for the term "undo." Therefore, we turn to THE MICROSOFT PRESS COMPUTER DICTIONARY (Microsoft Press 1991), which states the term "undo" means:

To reverse the last action: for example, to undo a deletion, thus restoring the text to a document. Many application programs enable the user both to undo an action and to "undo" an undo.

Id. at 356. We find this dictionary definition is consistent with Appellants' use of the term "undo" in the Specification.

Second, based on this dictionary definition of "undo," we conclude the Examiner is correct in concluding that the argued limitations of claim 1 would have been obvious to an artisan. Contrary to Appellants' argument, given the ordinary meaning of "undo," we conclude any Examiner reliance on inherency or obviousness is well founded to show that a prior action associated with a gesture can be "undone." For example, Westerman teaches "commands and gesture events that are linked to the commands," including gestures for editing commands such as "cut" and "paste," in addition to an "undo" command. Westerman ¶ 163. "Cut" and "paste" are examples of prior work that can be "undone," regardless of whether a command to cut or paste comes from a button, a keyboard, or a gesture. See id. ¶¶ 110, 163, 174 (cited by the Examiner (Final Act. 3-4; Ans. 13-15)).

2.

Westerman states "[a] switching event can refer to an event that resets the operation or initiates a chord change." (Westerman, paragraph [0111].) The Office action appears to interpret "resets the operation" to mean that a previous action associated with a first gesture is undone by the switching event. . . .
A switching event does not undo a prior action, as the Office action contends.

App. Br. 9-10 (emphasis added).

Although the Examiner's rejection cites to Westerman paragraph 111 for support, our review of the rejection does not find the "resets the operation" interpretation argued by Appellants. Rather, as we discuss above, Westerman explicitly teaches using an "undo" command, which an artisan would understand as undoing a prior incorrect command.

B. In rejecting claim 8, the Examiner finds:

Regarding claim 8, Westerman teaches a system comprising: a processor ... configured to: determine whether the determined one or more characteristics are associated with one of a plurality of known commands (see Fig. 1 blocks 22 and 24).

Final Act. 7.

Appellants raise the following arguments in contending that the Examiner erred in rejecting claim 8 under 35 U.S.C. § 103(a).

Westerman describes loading the gesture events for commands that are already associated with a gesture, rather than commands that are not yet associated with a gesture but that are going to be associated with the gesture as a result of the assigning. Accordingly, Westerman cannot be properly interpreted as teaching "identify at least one of the plurality of known commands to be associated with the at least one of the plurality of user inputs," as claimed. Do is directed to presence recognition control of electronic devices using a multi-touch device. (Do, Abstract.) Nishihara is directed to hand-gesture recognition. (Nishihara, Abstract.) Appellant respectfully submits that neither Do nor Nishihara can cure the deficiencies of Westerman and neither teaches or suggests identifying at least one of the plurality of known commands to be associated with the at least one of the plurality of user inputs based on the determined one or more characteristics of the gesture, as claimed.

App. Br. 15 (emphasis added).
In response to Appellants' above argument, at pages 19-20 of the Answer, the Examiner additionally cites to Westerman paragraphs 178-180 [5] and Figures 29-32 to support the rejection. The Examiner also presents the following new findings.

Applicant's interpretation of the reference suggests that Westerman just allows looking up gesture events and the commands that are associated with said gesture events but does not allow programming of new correlations. Westerman Figs. 29-32 and paragraphs 178-180 show that Westerman definitely teaches assigning commands to gesture events inputted by the user.

Ans. 19.

[5] "[T]he user may assign commands to gesture events by moving the commands over a gesture. For example, the user can select and drag a particular command and drop[] it over the desired gesture event. Thereafter, the command can be tied to that gesture event for future operations." Westerman ¶ 178.

In the Reply Brief, Appellants expand the initial argument presented in the Appeal Brief.

The cited portion [of Westerman paragraph 178] does not, however, teach or suggest "when the determined one or more characteristics are not associated with one of a plurality of known commands, identify at least one of the plurality of known commands to be associated with the at least one of the plurality of user inputs based on the determined one or more characteristics of the gesture," as recited by claim 8.

Reply Br. 12; see also App. Br. 15.

This expanded argument is unpersuasive. The Examiner did not cite Westerman for the "when the determined one or more characteristics are not associated" limitation. Rather, the Examiner cited Do. Final Act. 8. Thus, Appellants do not address the actual reasoning of the Examiner's rejection. Instead, Appellants attack the reference singly for lacking a teaching that the Examiner relied on a combination of references to show.
It is well-established that one cannot show non-obviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 425 (CCPA 1981); In re Merck & Co., Inc., 800 F.2d 1091, 1097 (Fed. Cir. 1986). References must be read, not in isolation, but for what they fairly teach in combination with the prior art as a whole. Merck, 800 F.2d at 1097.

Additionally, Appellants' above Appeal Brief arguments dismiss the Do reference. However, Appellants overlook that Do at paragraph 69 and Figure 5A (cited by the Examiner (Final Act. 8)) teaches a process paralleling the entire "when the determined one or more characteristics are not associated ..." limitation clause of claim 8. Do teaches identifying characteristics, then determining if a gesture "exactly match[es]" the characteristics or matches "within some predetermined tolerance." In the event of no match, Do looks for similar characteristics previously matched to one or more gestures, and prompts the user with a selection list. Do ¶ 69. Contrary to Appellants' argument, we conclude that Do (in combination with Westerman) more than sufficiently suggests the argued limitation.

C. Appellants raise the following arguments in contending that the Examiner erred in rejecting claim 15 under 35 U.S.C. § 103(a).

Marvit describes that raw motion data of received user input is processed and stored as a gesture in a gesture database. (Marvit, Paragraph [0147].) Once the gesture is stored, Marvit teaches that function mapping information for the gesture is received from the user. The function mapping information includes functions, operations[,] or tasks of the device that the user desires the user-created gesture to command, and is stored in a function database or a gesture mapping database. (Marvit, Paragraph [0148].)
Marvit, however, does not teach or suggest linking an entry in the gesture library corresponding to the new gesture with an entry in a command library associated with the command, wherein the command library is separate from the gesture library, as claimed. ... Marvit does not teach or suggest that one of either the function database or the gesture mapping database includes just the command and that an entry in the function database or the gesture mapping database is linked to an entry for the gesture in the gesture database. Marvit does not describe any connection between any of the gesture database, the function database, or the gesture mapping database, and therefore cannot be properly interpreted as "linking" any entries across the various databases. Therefore, Marvit does not teach or suggest this feature of the claims.

App. Br. 17-18.

In the Final Action at pages 11-12, the Examiner cites to Marvit paragraphs 147-148 and Figure 21 to support the rejection. In response to Appellants' above argument, at pages 22-24 of the Answer, the Examiner additionally cites to Marvit paragraph 137 and Figure 18 to support the rejection. The Examiner also presents the following new findings.

Clearly in Fig. 18 a function (e.g., "command") database and a gesture database are stored in memory. Therefore, Marvit does teach a database that includes "just the command." Marvit also teaches in paragraphs 137 and 148 that the function mapping (to a gesture) information is stored at either the function database or the gesture (mapping) database. Therefore, the gesture (mapping) database can store the function mapping and a separate function database can store "just the command."

Ans. 23-24.

In the Reply Brief at pages 14-15, Appellants repeat word-for-word the argument presented in the Appeal Brief. This repeated argument is silent as to the Examiner's findings as to Marvit's paragraph 137, and Figures 18 and 21.
We do not find where Appellants dispute the Examiner's new findings. Therefore, we find Appellants' argument unpersuasive.

CONCLUSIONS

(1) The Examiner has not erred in rejecting claims 1, 3, 5-15, and 17-21 as being unpatentable under 35 U.S.C. § 103(a).

(2) Claims 1, 3, 5-15, and 17-21 are not patentable.

DECISION

The Examiner's rejections of claims 1, 3, 5-15, and 17-21 are affirmed.

No time period for taking any subsequent action in connection with this appeal may be extended under 37 C.F.R. § 1.136(a)(1)(iv).

AFFIRMED