John Samuels, Appeal No. 2020-000102 (P.T.A.B. Apr. 27, 2021)

UNITED STATES PATENT AND TRADEMARK OFFICE
UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS, P.O. Box 1450, Alexandria, Virginia 22313-1450, www.uspto.gov

APPLICATION NO.: 14/111,380
FILING DATE: 10/11/2013
FIRST NAMED INVENTOR: John Samuels
ATTORNEY DOCKET NO.: 042933/438532
CONFIRMATION NO.: 3487

Correspondence: 10949, 7590, 04/27/2021 — Nokia Corporation and Alston & Bird LLP, c/o Alston & Bird LLP, One South at The Plaza, 101 South Tryon Street, Suite 4000, Charlotte, NC 28280-4000

EXAMINER: PAN, YONGJIA
ART UNIT: 2145
NOTIFICATION DATE: 04/27/2021
DELIVERY MODE: ELECTRONIC

Please find below and/or attached an Office communication concerning this application or proceeding. The time period for reply, if any, is set in the attached communication. Notice of the Office communication was sent electronically on the above-indicated "Notification Date" to the following e-mail address(es): usptomail@alston.com

PTOL-90A (Rev. 04/07)

UNITED STATES PATENT AND TRADEMARK OFFICE
____________
BEFORE THE PATENT TRIAL AND APPEAL BOARD
____________
Ex parte JOHN SAMUELS
____________
Appeal 2020-000102
Application 14/111,380
Technology Center 2100
____________

Before ALLEN R. MACDONALD, ERIC B. CHEN, and ADAM J. PYONIN, Administrative Patent Judges.

CHEN, Administrative Patent Judge.

DECISION ON APPEAL

STATEMENT OF THE CASE

Pursuant to 35 U.S.C. § 134(a), Appellant[1] appeals from the Examiner's decision to reject claims 1, 4, 5, 7, 8, 13–17, 20, 25–27, 31, 41, 42, and 44–47, which constitute all the claims pending in this application. Claims 2, 3, 6, 9–12, 18, 19, 21–24, 28–30, 32–40, and 43 have been cancelled. We have jurisdiction under 35 U.S.C. § 6(b).

We REVERSE.
CLAIMED SUBJECT MATTER

The claimed subject matter is directed to switching between operating in a two-dimensional user interface state and operating in a three-dimensional user interface state on a hand-portable apparatus. (Abstract; Spec. 5:22–23.)

[1] We use the word "Appellant" to refer to "applicant" as defined in 37 C.F.R. § 1.42(a). Appellant identifies the real party in interest as Nokia Technologies Oy. (Appeal Br. 2.)

Claim 1, reproduced below, is illustrative of the claimed subject matter, with disputed limitations in italics:

1. A method comprising:
performing user input detection using at least a first detector, wherein the at least first detector comprises a touch sensitive device configured to detect touch on a two-dimensional surface;
detecting using at least the first detector a predefined first phase of a predefined user gesture, wherein the predefined first phase of the predefined user gesture comprises movement substantially in a first plane;
enabling at least a second detector;
detecting using at least the second detector a predefined second phase of the predefined user gesture, wherein the predefined second phase of the predefined user gesture comprises movement substantially outside of the first plane;
responsive to detecting both the predefined first phase and the predefined second phase of the predefined user gesture, switching between operating in a two-dimensional user interface state and operating in a three-dimensional user interface state,
wherein performing user input detection using at least the first detector comprises performing user input detection using the first detector but not using the second detector; and
wherein enabling at least the second detector comprises: switching on at least the second detector or adjusting at least the second detector from a lower energy consumption mode to a higher energy consumption mode, and
wherein enabling at least the second detector is responsive to detecting, using the first detector, the predefined first phase of the predefined user gesture.

(Appeal Br. 12 (Claims Appendix).)

REFERENCES

Name           Reference           Date
Perski et al.  US 2006/0012580 A1  Jan. 19, 2006
Elias et al.   US 2007/0177803 A1  Aug. 2, 2007
Duarte et al.  US 2010/0156813 A1  June 24, 2010
King et al.    US 2011/0164029 A1  July 7, 2011
Rabii          US 2012/0147531 A1  June 14, 2012

REJECTIONS

Claims 1, 4, 5, 7, 8, 13–16, 31, 41, 42, and 44–47 stand rejected under 35 U.S.C. § 103(a) as being unpatentable over King, Perski, and Rabii.

Claims 17 and 20 stand rejected under 35 U.S.C. § 103(a) as being unpatentable over King, Perski, Rabii, and Elias.

Claims 25 and 26 stand rejected under 35 U.S.C. § 103(a) as being unpatentable over King, Perski, Rabii, and Duarte.

Claim 27 stands rejected under 35 U.S.C. § 103(a) as being unpatentable over King and Rabii.

OPINION

We are persuaded by Appellant's arguments (Appeal Br.
9) that the Examiner improperly combined King, Perski, and Rabii.

The Examiner found that King's touch-screen finger-tracking detection mode and touch-screen stylus-search detection mode combine to render obvious the limitation "wherein performing user input detection using at least the first detector comprises performing user input detection using the first detector but not using the second detector." (Ans. 4; see also Final Act. 4.) The Examiner concluded that "it would have been obvious to one of skilled in the art to modify King to include Perski's switching to achieve the predictable result of toggling detector to improving the detection of user interactions." (Ans. 5; see also Final Act. 4–5.)

We do not agree with the Examiner's findings and conclusions.

King relates to "working with 3D objects" (¶ 1), in particular, "[c]omputer assisted design (CAD) software [that] allows users to generate and manipulate two-dimensional (2D) and three-dimensional (3D) objects" (¶ 2). King explains the following:

A device having a touch-sensitive display that enables a user to generate and manipulate 3D objects using 3D gesture inputs is disclosed. The device has touch sensors that can sense positions and movements of objects contacting a surface of the display, and proximity sensors that can sense positions and movements of objects in a three-dimensional space in the vicinity of the display surface (including movements in proximity to, but not actually touching, the display surface). . . . The touch inputs, 2D gesture inputs, and 3D gesture inputs can be used by application programs to trigger events, such as applying certain transformations to objects, allowing the user to generate and manipulate 3D objects quickly and intuitively.

(¶ 17.)
Moreover, Figures 3–20 of King illustrate "exemplary 3D gesture inputs and 3D objects that are generated or modified based on the 3D gesture inputs according to various embodiments of the invention." (¶ 13.)

Although the Examiner proposes to modify King with Perski, the Examiner has not provided an articulated reasoning with some rational underpinning as to why one of ordinary skill in the art would modify King. See In re Kahn, 441 F.3d 977, 988 (Fed. Cir. 2006) ("[R]ejections on obviousness grounds cannot be sustained by mere conclusory statements; instead, there must be some articulated reasoning with some rational underpinning to support the legal conclusion of obviousness"); see also KSR Int'l Co. v. Teleflex Inc., 550 U.S. 398, 418 (2007). In particular, because King relates to CAD software for the generation of 3D objects using both 2D and 3D gesture inputs, as illustrated in Figures 3–20, the Examiner has not provided sufficient articulated reasoning as to why one of ordinary skill would be "toggling detector[s] to improving the detection of user interactions."

Accordingly, we are persuaded by Appellant's argument that:

King's embodiments disclose a touch screen with a proximity sensor where in response to touching a 2D shape on the touch screen and then lifting the fingers away from the touch screen, the user can extrude the 2D shape into a 3D shape. King discloses that the proximity sensor in combination with the touch sensor detects the 3D gesture inputs. Modifying King such that the touch sensor alone is only used with the proximity sensor not being used to detect user interactions, as allegedly provided by Perski, would fundamentally alter the embodiments of King such that the embodiments of King would not work as intended, and as such, the proposed combination is improper.

(Appeal Br. 9 (emphasis omitted).)
Thus, the Examiner has improperly combined King, Perski, and Rabii to reject claims 1, 4, 5, 7, 8, 13–16, 31, 41, 42, and 44–47 under 35 U.S.C. § 103(a). Accordingly, we do not sustain the rejection of claims 1, 4, 5, 7, 8, 13–16, 31, 41, 42, and 44–47 under 35 U.S.C. § 103(a).

In addition, we do not sustain the rejections of claims 17, 20, 25, 26, and 27 under 35 U.S.C. § 103(a) over various combinations of King, Perski, Rabii, Elias, and Duarte for the same reasons discussed previously with respect to the rejection of claims 1, 4, 5, 7, 8, 13–16, 31, 41, 42, and 44–47 under 35 U.S.C. § 103(a) over King, Perski, and Rabii.

CONCLUSION

The Examiner's decision rejecting claims 1, 4, 5, 7, 8, 13–17, 20, 25–27, 31, 41, 42, and 44–47 under 35 U.S.C. § 103(a) is reversed.

DECISION

In summary:

Claims Rejected                           35 U.S.C. §   Reference(s)/Basis           Affirmed   Reversed
1, 4, 5, 7, 8, 13–16,                     103(a)        King, Perski, Rabii                     1, 4, 5, 7, 8, 13–16,
31, 41, 42, 44–47                                                                               31, 41, 42, 44–47
17, 20                                    103(a)        King, Perski, Rabii, Elias              17, 20
25, 26                                    103(a)        King, Perski, Rabii, Duarte             25, 26
27                                        103(a)        King, Rabii                             27
Overall Outcome                                                                                 1, 4, 5, 7, 8, 13–17, 20,
                                                                                                25–27, 31, 41, 42, 44–47

REVERSED