Ex parte Yanai et al., No. 13/652,181 (P.T.A.B. Nov. 14, 2017)

UNITED STATES PATENT AND TRADEMARK OFFICE
UNITED STATES DEPARTMENT OF COMMERCE
Commissioner for Patents, P.O. Box 1450, Alexandria, Virginia 22313-1450
www.uspto.gov

Application No.: 13/652,181
Filing Date: 10/15/2012
First Named Inventor: Yaron Yanai
Attorney Docket No.: 42P59822
Confirmation No.: 1303
Examiner: TRUONG, NGUYEN T
Art Unit: 2486
Notification Date: 11/16/2017 (delivered electronically to Database_Group@bstz.com)
Correspondence: BLAKELY SOKOLOFF TAYLOR & ZAFMAN, 1279 Oakmead Parkway, Sunnyvale, CA 94085-4040

UNITED STATES PATENT AND TRADEMARK OFFICE
BEFORE THE PATENT TRIAL AND APPEAL BOARD

Ex parte YARON YANAI, MAOZ MADMONI, GILBOA LEVY, and GERSHOM KUTLIROFF¹

Appeal 2017-006268
Application 13/652,181
Technology Center 2400

Before CAROLYN D. THOMAS, JEREMY J. CURCURI, and NABEEL U. KHAN, Administrative Patent Judges.

THOMAS, Administrative Patent Judge.

DECISION ON APPEAL

Appellants seek our review under 35 U.S.C. § 134(a) of the Examiner's Final Rejection of claims 1–20, all the pending claims in the present application. See Claims Appendix. We have jurisdiction over this appeal under 35 U.S.C. § 6(b).

We AFFIRM.

¹ The Appellants name Intel Corporation as the real party in interest. App. Br. 3.

The present invention relates generally to combining depth images taken from multiple depth cameras into a composite image. See Abstract. Claim 7 below is illustrative:
7. A system comprising:

a plurality of depth cameras, wherein each depth camera is configured to capture a sequence of depth images of a scene over a period of time;

a group processor configured to:

independently capture depth images from each of the plurality of depth cameras;

project each captured depth image to a three-dimensional coordinate system of each depth camera;

transform sets of three-dimensional points to the coordinate system of a virtual camera by applying a transformation from the coordinate system for each camera to a coordinate system of the virtual camera;

generate a sequence of synthetic images from the sequences of depth images, wherein each synthetic image in the sequence of synthetic images corresponds to one of the depth images in the sequence of depth images from each of the plurality of depth cameras;

separately track movements of one or more persons or body parts in the sequence of synthetic images from each of the depth cameras.

Appellants appeal the following rejections:

R1. Claims 7, 8, and 10–18 are rejected under 35 U.S.C. § 102(b) as being anticipated by Wurmlin (US 2009/0315978 A1, Dec. 24, 2009);

R2. Claims 1, 2, and 4–6 are rejected under 35 U.S.C. § 103(a) as being unpatentable over Wurmlin and Nguyen (US 2009/0055205 A1, Feb. 26, 2009);

R3. Claim 3 is rejected under 35 U.S.C. § 103(a) as being unpatentable over Wurmlin, Nguyen, and Gun A. Lee (Occlusion based Interaction Methods for Tangible Augmented Reality Environments, Virtual Reality Laboratory, Dept. of CSE, POSTECH, Republic of Korea, HIT Lab NZ, University of Canterbury, 419–26 (2004));

R4. Claims 9, 19, and 20 are rejected under 35 U.S.C. § 103(a) as being unpatentable over Wurmlin and Lee.

We review the appealed rejections for error based upon the issues identified by Appellants, and in light of the arguments and evidence produced thereon. Ex parte Frye, 94 USPQ2d 1072, 1075 (BPAI 2010) (precedential).
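For context on the "transform" limitation at the center of this appeal, the following sketch illustrates what transforming a set of three-dimensional points from a real camera's coordinate system into the coordinate system of a virtual camera typically entails, namely a rigid transformation p' = R·p + t. This sketch is purely illustrative and is not part of the record; the rotation matrix R and translation vector t are hypothetical extrinsic parameters relating the two coordinate systems, not values drawn from the application or the cited references.

```python
# Illustrative only: a rigid transformation of 3D points from a real
# depth camera's coordinate system into a virtual camera's coordinate
# system. R (3x3 rotation) and t (3-vector translation) are hypothetical
# extrinsic parameters; they are not taken from the claims or references.

def transform_points(points, R, t):
    """Apply p' = R @ p + t to each 3D point (plain Python, no deps)."""
    out = []
    for (x, y, z) in points:
        px = R[0][0] * x + R[0][1] * y + R[0][2] * z + t[0]
        py = R[1][0] * x + R[1][1] * y + R[1][2] * z + t[1]
        pz = R[2][0] * x + R[2][1] * y + R[2][2] * z + t[2]
        out.append((px, py, pz))
    return out

# Example: identity rotation, virtual camera offset 1 unit along x.
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t = [1.0, 0.0, 0.0]
print(transform_points([(0.0, 0.0, 2.0)], R, t))  # [(1.0, 0.0, 2.0)]
```

In practice each real camera would have its own (R, t) pair relating it to the virtual camera's frame, so the same routine would be applied per camera before the transformed point sets are merged into a synthetic image.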
ANALYSIS

Issue: Did the Examiner err in finding that Wurmlin discloses transforming sets of three-dimensional points to the coordinate system of a virtual camera, as set forth in the claims?

Appellants contend:

Thus, Wurmlin effectively discloses generating virtual camera views at locations where no real camera exists using a combined stream of feeds from two or more real cameras. Nevertheless, such a process of generating virtual camera views is not equivalent to transforming three-dimensional points from the real camera to a coordinate system of the virtual camera. . . . In fact, nowhere in Wurmlin is there suggested an implementation of transforming three-dimensional points from a real camera to coordinate systems of a virtual camera.

(App. Br. 11). Appellants further contend that "Nguyen does not disclose or suggest a process of transforming sets of three-dimensional points to a coordinate system of a virtual camera" (id. at 14).

In response, the Examiner finds "[t]he cited paragraphs of Wurmlin either explicitly or implicitly discloses using the real camera parameters such as location and orientation to derive the parameters of the virtual camera in 3D space in order to generate the virtual images" (Ans. 11). We agree with the Examiner.

For example, the Examiner finds that Wurmlin's "par. 91 discloses the location of the virtual camera is determined in 3D space; par. 205 and fig. 7 disclose virtual cameras located at positions and orientations where no real cameras exist; [and] par. 82 discloses viewpoints different from the actual physical locations of the cameras are generated based on the input video streams" (Ans. 11). Specifically, Wurmlin discloses "[a] method for generating a 3D representation of a dynamically changing 3D scene" (Abstract).
In Wurmlin, "a synthesised view is provided which shows the scene from a virtual view point that is distinct from the positions of the real cameras" (Wurmlin ¶ 77) by at least "providing camera parameters of a virtual camera" (id. ¶ 78). In addition, Wurmlin discloses that "[a]s a result, a global 3D representation of the entire scene is maintained and updated dynamically, based on input from the video streams" (id. ¶ 82). Wurmlin further discloses determining "[t]he size and orientation of the billboard," i.e., the object, and that "[t]he location of the billboard is defined by the calculated 3D position of the object" (id. ¶ 91) and that "[t]he system 100 generates views corresponding to virtual cameras 703 . . . located at positions and orientations where no real camera exists" (id. ¶ 205).

In other words, Wurmlin discloses generating a 3D representation of a changing 3D scene by acquiring video streams from real cameras, determining the identity of objects in the video streams, determining the 3D position of the objects, and providing camera parameters of a virtual camera.

Appellants' contentions fail to explain why the aforementioned disclosures are distinguishable from the claimed "transform sets of three-dimensional points to the coordinate system of a virtual camera," as set forth in representative claim 7. Instead, Appellants merely argue that Wurmlin fails to suggest an implementation of transforming three-dimensional points from a real camera to coordinate systems of a virtual camera (see App. Br. 11), without providing any meaningful analysis that explains why the Examiner erred. A statement which merely points out what a claim recites will not be considered an argument for separate patentability of the claim. See 37 C.F.R. § 41.37(c)(1)(vii). We note that arguments which Appellants could have made but chose not to make in the Principal Brief have not been considered and are deemed to be waived.
Therefore, Appellants' arguments are unpersuasive. Accordingly, we sustain the Examiner's anticipation rejection of claim 7. Appellants' arguments regarding the Examiner's rejection of independent claims 12 and 17 rely on the same arguments as for claim 7, and Appellants do not argue separate patentability for the dependent claims. See App. Br. 8–14. We, therefore, also sustain the Examiner's anticipation rejection of claims 8 and 10–18.

As for the Examiner's obviousness rejections of claims 1–6, 9, 19, and 20, we also sustain these rejections for similar reasons, given that there are no separate arguments pertaining thereto.

DECISION

We affirm the Examiner's § 102(b) and § 103(a) rejections R1–R4.

No time period for taking any subsequent action in connection with this appeal may be extended under 37 C.F.R. § 1.136(a)(1)(iv).

AFFIRMED