Ex Parte PATHANGAY et al., 12/769,654 (P.T.A.B. Feb. 2, 2016)

UNITED STATES PATENT AND TRADEMARK OFFICE

APPLICATION NO.: 12/769,654    FILING DATE: 04/29/2010    FIRST NAMED INVENTOR: Vinod PATHANGAY
ATTORNEY DOCKET NO.: 82260008    CONFIRMATION NO.: 8118
EXAMINER: ELL, MATTHEW    ART UNIT: 2172
NOTIFICATION DATE: 02/04/2016    DELIVERY MODE: ELECTRONIC

22879 7590 02/04/2016
HP Inc.
3390 E. Harmony Road
Mail Stop 35
FORT COLLINS, CO 80528-9544

UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS, P.O. Box 1450, Alexandria, Virginia 22313-1450, www.uspto.gov

Please find below and/or attached an Office communication concerning this application or proceeding. The time period for reply, if any, is set in the attached communication. Notice of the Office communication was sent electronically on the above-indicated "Notification Date" to the following e-mail addresses: ipa.mail@hp.com, barbl@hp.com, yvonne.bailey@hp.com

PTOL-90A (Rev. 04/07)

UNITED STATES PATENT AND TRADEMARK OFFICE
BEFORE THE PATENT TRIAL AND APPEAL BOARD

Ex parte VINOD PATHANGAY and ANBUMANI SUBRAMANIAN

Appeal 2014-001499
Application 12/769,654
Technology Center 2100

Before CARLA M. KRIVAK, BRUCE R. WINSOR, and AMBER L. HAGY, Administrative Patent Judges.

WINSOR, Administrative Patent Judge.

DECISION ON APPEAL

Appellants1 appeal under 35 U.S.C. § 134(a) from the final rejection of claims 1-3 and 5-15, which constitute all the claims pending in this application. (App. Br. 4.) We have jurisdiction under 35 U.S.C. § 6(b). Claim 4 is cancelled. (App. Br. (Claims App'x) 29.) We affirm-in-part.

1 The real party in interest identified by Appellants is Hewlett-Packard Development Company, LP. (App. Br. 2.)

STATEMENT OF THE CASE

The Invention

Appellants' disclosed invention relates to "a point, select and transfer hand gesture based user interface." (Abstract.)
Claims 1 and 12, which are illustrative, read as follows:

1. A computer implemented method for a point, select and transfer hand gesture based user interface, comprising:

capturing a depth image of a hand gesture using an in-front camera on a frame by frame basis within a predefined interaction volume and wherein the in-front camera is disposed around a display device and wherein the display device is designed to display a plurality of select options;

finding a nearest point of the hand gesture to a display screen of the display device using a nearest depth value in the captured depth image for each frame;

performing an image-to-screen mapping of the captured depth image and the found nearest point to the display screen upon validating the found nearest point as associated with the hand for each frame; and

pointing and selecting one of the plurality of displayed select options on the display screen of the display device when the nearest depth value is within a number of predetermined threshold ranges, and based on the outcome of the image-to-screen mapping,

in which finding the nearest point of the hand gesture to the display screen of the display device using the nearest depth value in the captured depth image for each frame comprises finding a single pixel associated with the nearest depth value in the captured depth image for each frame.

12.
A non-transitory computer-readable storage medium for a point, select and transfer hand gesture based user interface having instructions that, when executed by a computing device, cause the computing device to perform a method comprising:

capturing a depth image of a hand gesture using an in-front camera on a frame by frame basis within a predefined interaction volume and wherein the in-front camera is disposed around a display device and wherein the display device is designed to display a plurality of select options;

finding a nearest point of the hand gesture to a display screen of the display device using a nearest depth value in the captured depth image for each frame;

performing an image-to-screen mapping of the captured depth image and the found nearest point to the display screen upon validating the found nearest point as associated with the hand for each frame; and

pointing and selecting one of the plurality of displayed select options on the display screen of the display device when the nearest depth value is within one or more predetermined threshold values or ranges, and based on the outcome of the image-to-screen mapping,

in which performing the image-to-screen mapping of the captured depth image and the found nearest point to the display screen upon validating the found nearest point as associated with the hand for each frame comprises smoothening an estimated pointing location on the display screen by temporal averaging of the estimated pointing location upon performing the image-to-screen mapping of the captured depth image and the found nearest point to the display screen.

The Rejections2

Claims 1-3, 5-10, 12, and 14 stand rejected under 35 U.S.C. § 103(a) as being unpatentable over Vassigh et al. (US 2011/0193939 A1; publ. Aug. 11, 2011; filed Feb. 2010; "Vassigh") and Davidson et al. (US 2007/0279412 A1; publ. Dec. 6, 2007; "Davidson"). (See Final Act. 4-14.)
Claims 11, 13, and 15 stand rejected under 35 U.S.C. § 103(a) as being unpatentable over Vassigh, Davidson, and Poon et al. (US 2011/0175822 A1; publ. July 21, 2011; filed Jan. 21, 2010; "Poon"). (See Final Act. 14-16.)

The Record

Rather than repeat the arguments here, we refer to the Briefs ("App. Br." filed June 27, 2013; "Reply Br." filed Nov. 15, 2013) and the Specification ("Spec." filed Apr. 29, 2010) for the positions of Appellants and the Final Office Action ("Final Act." mailed Feb. 28, 2013) and Answer ("Ans." mailed Sept. 18, 2013) for the reasoning, findings, and conclusions of the Examiner. Only those arguments actually made by Appellants have been considered in this decision. Arguments that Appellants did not make in the Briefs have not been considered and are deemed to be waived. See 37 C.F.R. § 41.37(c)(1)(iv) (2012).

2 A rejection of claims 1, 3, 12, and 14 under 35 U.S.C. § 112, first paragraph (see Final Act. 2-3) is withdrawn, and is, therefore, not before us. (Ans. 3.) Appellants' contention to the contrary notwithstanding (see Reply Br. 4 and 10), the Examiner's withdrawal of the § 112 rejection fully addresses Appellants' traversal of the § 112 rejection (see App. Br. 4-16; Reply Br. 4-7).
ISSUES

The issues presented by Appellants' contentions are as follows:

Does the combination of Vassigh and Davidson teach or suggest "finding the nearest point of the hand gesture to the display screen of the display device using the nearest depth value in the captured depth image for each frame comprises finding a single pixel associated with the nearest depth value in the captured depth image for each frame" (hereinafter the "single pixel limitation"), as recited in claim 1?3

Does the combination of Vassigh and Davidson teach or suggest "smoothening an estimated pointing location on the display screen by temporal averaging of the estimated pointing location upon performing the image-to-screen mapping of the captured depth image and the found nearest point to the display screen" (hereinafter the "smoothening limitation"), as recited in claim 12 and similarly recited in claim 14?

ANALYSIS

Claim 1 - Single Pixel Limitation

The Examiner relies on Davidson to teach the single pixel limitation. (Final Act. 6 (citing Davidson ¶¶ 42 and 44); see also Ans. 3-5.) Appellants contend Davidson does not teach or suggest the single pixel limitation "because the closest 2D pixel location in a depth map as taught in [Davidson] ... is merely a 2D pixel closest to a camera and is not indicative of an actual closest pixel indicating the closest point of an individual (e.g., the individual's hand)." (App. Br. 19.)

3 Appellants' arguments regarding claim 1 present additional issues. Because the identified issue is dispositive of claim 1, we do not reach the additional issues.
Although we do not find Appellants' reasoning to be entirely clear, we do agree with Appellants' basic premise: that the "closest 2D pixel location" taught by Davidson (¶ 44) "is not indicative of an actual closest pixel indicating the closest point of an individual (e.g., the individual's hand)" or any other object to a display screen (App. Br. 19). Rather, Davidson teaches, for a plurality of pixel locations, comparing a computed distance for each pixel location with a distance recorded at a "closest 2D pixel location" in a depth map and using that comparison to designate the two pixel locations as either "corresponding" or "occluded." (Davidson ¶ 44.) Thus, Davidson teaches identifying multiple pixel locations in a depth map that are "closest" to computed pixel locations, rather than a single pixel that is closest to a display screen or camera. Therefore, Appellants have persuaded us of error in the rejection of claim 1. Accordingly, we do not sustain the rejections of claim 1 over Vassigh and Davidson, and claims 2, 3, and 5-11, which depend from claim 1, over various combinations of Vassigh, Davidson, and Poon.

Claims 12 and 14 - Smoothening Limitation

The Examiner relies on Vassigh to teach the smoothening limitation. (Final Act. 13 (citing Vassigh ¶ 119); see also Ans. 6.) Appellants contend as follows:

The "data structure" of Vassigh merely defines measurement values of a number of body parts. The data structure of Vassigh may be formed via scan results averaged from multiple depth images which are provided at different points in time. In other words, Vassigh never uses an averaging process or temporal averaging to determine an estimated pointing location on a display screen as recited in claim 12.

(App. Br. 21-22 (discussing Vassigh ¶ 119); see also Reply Br. 11-14 (additionally citing Vassigh ¶ 120).) We agree with the Examiner.
As pointed out by the Examiner, "the data structure formed by the temporal averaging [taught by Vassigh] is what is used in the depth image to perform gestures. This data structure will then translate to the display screen and smoothen an estimated pointing location on the display screen." (Ans. 6.) In explaining the use of the model of the human target represented by the data structure (Vassigh ¶¶ 119-20), Vassigh teaches that "[a]s the user moves in the physical space, information from the depth camera system is used to adjust the skeletal model such that the skeletal model represents a person" (id. ¶ 122). Vassigh further explains as follows:

This data can include coordinates . . . where the coordinates represent a position of the user's hand at a point in time, such as for a camera frame. A reference position of the hand such as the fingertips can be used to represent the hand's position. Individual fingers might also be identified and have respective reference positions if there is sufficient resolution.

(Id. ¶ 124.) Vassigh then describes details of using hand position in the model provided by the data structure to "provide an input to an application" (id. ¶ 125). Therefore, we find no error in the Examiner's finding that Vassigh's teaching of a data structure from which a model of a human target is generated, and in which scan results from multiple depth images provided at different points in time ("temporal") are averaged (Vassigh ¶ 119), teaches or suggests the smoothening limitation.

Appellants have not demonstrated error in the rejection of claim 12. Accordingly, we sustain the rejections over various combinations of Vassigh, Davidson, and Poon of claim 12, claim 14, which was argued on substantially the same bases as claim 12 (compare App. Br. 23-25 with App. Br.
20-22), and claims 13 and 15, which depend from claims 12 and 14, respectively, and were not separately argued with particularity (see App. Br. 27).4

DECISION

The decision of the Examiner to reject claims 12-15 is affirmed. The decision of the Examiner to reject claims 1-3 and 5-11 is reversed.

No time period for taking any subsequent action in connection with this appeal may be extended under 37 C.F.R. § 1.136(a)(1). See 37 C.F.R. § 1.136(a)(1)(iv).

AFFIRMED-IN-PART

4 The patentability of claim 6 was argued on substantially the same bases as claims 12 and 14. (Compare App. Br. 25-27 with App. Br. 20-25.) Similarly, claim 11 was rejected together with claims 13 and 15. (See Final Act. 14-16.) However, we do not sustain the rejections of claims 6 and 11 for the reasons set forth for claim 1, from which claims 6 and 11 depend.