Ex parte Tsuyoshi Ishikawa, Appeal 2020-005907 (P.T.A.B. Jan. 7, 2022)

UNITED STATES PATENT AND TRADEMARK OFFICE
UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS, P.O. Box 1450, Alexandria, Virginia 22313-1450, www.uspto.gov

APPLICATION NO.: 13/230,639
FILING DATE: 09/12/2011
FIRST NAMED INVENTOR: Tsuyoshi ISHIKAWA
ATTORNEY DOCKET NO.: 1946-0483
CONFIRMATION NO.: 3468

142241 7590 01/07/2022
Paratus Law Group, PLLC
1765 Greensboro Station Place, Suite 320
Tysons Corner, VA 22102

EXAMINER: KIR, ALBERT
ART UNIT: 2485
MAIL DATE: 01/07/2022
DELIVERY MODE: PAPER

Please find below and/or attached an Office communication concerning this application or proceeding. The time period for reply, if any, is set in the attached communication. PTOL-90A (Rev. 04/07)

UNITED STATES PATENT AND TRADEMARK OFFICE
BEFORE THE PATENT TRIAL AND APPEAL BOARD

Ex parte TSUYOSHI ISHIKAWA

Appeal 2020-005907
Application 13/230,639
Technology Center 2400

Before BRADLEY W. BAUMEISTER, MINN CHUNG, and ADAM J. PYONIN, Administrative Patent Judges.

CHUNG, Administrative Patent Judge.

DECISION ON APPEAL

STATEMENT OF THE CASE

Pursuant to 35 U.S.C. § 134(a), Appellant1 appeals from the Examiner's decision to reject claims 1, 2, 15-17, 19-21, 23, and 25-28, which constitute all the claims pending in this application. See Final Act. 2. We have jurisdiction under 35 U.S.C. § 6(b).

We affirm.2

1 "Appellant" herein refers to "applicant" as defined in 37 C.F.R. § 1.42. Appellant identifies the real party in interest as Sony Corporation. Appeal Br. 3.
2 In this Decision, we refer to Appellant's Appeal Brief filed March 27, 2020 ("Appeal Br.") and Reply Brief filed August 14, 2020 ("Reply Br."); the Examiner's Final Office Action mailed October 4, 2019 ("Final Act.") and Answer mailed June 16, 2020 ("Ans."); and the Specification filed September 12, 2011 ("Spec."). Rather than repeat the Examiner's findings and Appellant's contentions in their entirety, we refer to these documents.

CLAIMED SUBJECT MATTER

The subject matter of the present application pertains to methods and systems for displaying stereoscopic images with a superimposed image, such as a user interface image. Spec. ¶ 4. Appellant's Specification purports to describe and claim embodiments that relate to reducing interference between a stereoscopic image and a superimposed image. Id. ¶ 5.

Claims 1, 19, and 20 are independent. Claim 1, reproduced below with the disputed limitation italicized, illustrates the claimed subject matter:

1. An information processing apparatus, comprising:
    a sensor configured to acquire a first operation; and
    circuitry configured to
        control a display to display a stereoscopic image,
        control the display to display an operation panel displayed in response to the first operation,
        compute a displacement value based on a display position of the stereoscopic image in a depth direction and a display position of the operation panel in the depth direction, and
        modify the display position of the operation panel in the depth direction based on the computed displacement value,
    wherein the sensor is further configured to acquire a second operation to operate the operation panel, and
    wherein the sensor acquires the second operation based on detection of one or more proximity inputs according to a proximity of a portion of an operational tool to the modified display position of the operation panel.

Appeal Br. 22 (Claims App.).
REJECTIONS

The Examiner relies on the following references:

Designation3   Reference            Date
Suh            US 2012/0050473 A1   Aug. 31, 2011
Pihlaja        US 2011/0316679 A1   Dec. 29, 2011
Guillou        US 2010/0208040 A1   Aug. 19, 2010

Claims 1, 2, 15-17, 19-21, 23, and 25-28 stand rejected under 35 U.S.C. § 103(a) as being unpatentable over the combination of Suh and Pihlaja. Final Act. 4.

Claims 1, 2, 15-17, 19-21, 23, and 25-28 stand rejected under 35 U.S.C. § 103(a) as being unpatentable over the combination of Suh, Pihlaja, and Guillou. Final Act. 10.

OPINION

We have considered Appellant's arguments (Appeal Br. 12-20; Reply Br. 4-11) in light of the Examiner's findings and explanations (Final Act. 4-17; Ans. 3-6). For the reasons set forth below, we are not persuaded of Examiner error in the rejection of the pending claims, and we, therefore, sustain the Examiner's rejections.

Appellant argues the rejections of independent claims 1, 19, and 20 collectively with regard to the Examiner's rejections of claim 1 under 35 U.S.C. § 103(a). Appeal Br. 18, 20. Appellant does not separately argue patentability for the dependent claims. Id. In addition, Appellant presents essentially the same arguments with respect to both of the Examiner's rejections. See Appeal Br. 18-20; Reply Br. 9-11. Therefore, based on Appellant's arguments, we decide the appeal of claims 1, 2, 15-17, 19-21, 23, and 25-28 based on claim 1 alone with respect to the Examiner's rejection based on the combination of Suh and Pihlaja. See 37 C.F.R. § 41.37(c)(1)(iv) (2019).

We adopt as our own: (1) the findings and reasons set forth by the Examiner in the action from which this appeal is taken (Final Act. 4-17) and (2) the reasons set forth by the Examiner in the Examiner's Answer in response to Appellant's Appeal Brief (Ans. 3-6).

3 All citations herein to the prior art are by reference to the first named inventor only.
We concur with the conclusions reached by the Examiner, and we highlight the following for emphasis.4

In rejecting claim 1 over the combination of Suh and Pihlaja, the Examiner finds Suh teaches most limitations of claim 1, except for the disputed limitation, "wherein the sensor acquires the second operation based on detection of one or more proximity inputs according to a proximity of a portion of an operational tool to the modified display position of the operation panel." Final Act. 4-6 (citing Suh ¶¶ 2, 16, 22, 41, 80, 95, 124, 267, 280, 289, 300, 414). The Examiner additionally relies on Pihlaja for the recitation "[a proximity] of a portion of an operational tool to the modified display position of the operation panel." Id. at 6 (citing Pihlaja ¶¶ 19, 30, 50-54, 62, Figs. 3, 4a-h); Ans. 4-5 (stating that the Examiner "only admits the deficiency of '[a proximity] of a portion of an operation tool to the modified display position of the operation panel'" in Suh and that the Examiner additionally relies on Pihlaja for the limitation missing in Suh).

Regarding the disputed limitation, the Examiner finds that Suh teaches "wherein the sensor acquires the second operation based on detection of one or more proximity inputs according to a proximity" because Suh describes

    A remote controller (or front panel), functioning as a sensor, is used to receive user input operations including a first input operation and a second input operation, wherein the input operations are non-contact inputs that are not in contact with an information processing receiver.

4 Only those arguments made by Appellant have been considered in this decision. Arguments Appellant did not make are deemed to be waived. See 37 C.F.R. § 41.37(c)(1)(iv).
    The non-contact input of a remote controller is a non-contact input according to a proximity s[t]ate, wherein when the remote controller through which a user input is entered is too far, out of proximity/range, from the corresponding signal receiver, the input is not received, and when the remote controller is close enough, within proximity/range, from the corresponding signal receiver, the input is received.

Final Act. 5-6 (emphases added) (citing Suh ¶¶ 80, 414).

The Examiner also finds Pihlaja teaches "[a proximity] of a portion of an operational tool to the modified display position of the operation panel" because Pihlaja describes that "[t]he detection unit is used to detect a proximity/hovering input to the input surface 112 or at the matching depth of the user interface item 160, wherein the position of the user interface is being changed/adjusted." Final Act. 6 (citing Pihlaja ¶¶ 19, 30, 50-54, 62, Figs. 3, 4a-h).

The Examiner further reasons that a person of ordinary skill in the art would have been motivated to combine the teachings of Suh and Pihlaja "to modify Suh's information processing apparatus to adapt an alternative sensor, by incorporating Pihlaja's teaching where a detection unit and a control unit are added to the apparatus for allowing the user to give proximity/hovering input to the UI and execute the UI's function" because such a combination would have "enable[d] more versatile and intuitive ways of interacting with electronic devices." Final Act. 6.

Appellant argues that "the 3D virtual item referred to in Figure 3 and illustrated in Figures 4a-4h" of Pihlaja cited by the Examiner "at best relates to content of the stereoscopic image itself, rather than a separately displayed operation panel upon which an input is detected." Appeal Br. 16 (emphasis added); Reply Br. 6.
According to Appellant, "there is no teaching or suggestion in Pihlaja that the 3D virtual item may be an operation panel" because "Pihlaja merely teaches to modify a shape or position of the 3D virtual item that is the content of the stereoscopic image 'in response to an approaching input object' (i.e. 'virtual touch') to create 'visual effects' related to 'the visual appearance' of the 3D virtual item." Appeal Br. 16 (citing Pihlaja ¶¶ 50, 54); Reply Br. 6. Appellant argues, therefore, "if one of ordinary skill in the art were to modify the teachings of Suh with Pihlaja, the resulting combination would merely have led the ordinary artisan to detect the virtual touch of Pihlaja in order to interact with an appearance of content of the 3D video of Suh." Appeal Br. 17; Reply Br. 7.

In response, the Examiner explains,

    Pihlaja teaches 3D virtual user interface (UI) that is used to detect user input to operate the display system based on detecting an input tool such as a stylus . . . in close proximity to the input surface, see steps 210, 300, 302, 304 of Fig 3. Thus, Pihlaja clearly teaches detecting proximity of an operation tool (a stylus) to a modified display position of an operation panel, i.e. 3D virtual UI, and such teaching addresses the feature in question.

Ans. 5 (emphases added) (citing Pihlaja ¶¶ 19, 50, 54, Figs. 3 (steps 210, 300, 302, 304), 4a-h).

We find the Examiner's findings and explanations to be reasonable based on the plain language of the claim and the related disclosures in the Specification, as well as the cited disclosures of Suh and Pihlaja.
Claim 1 recites acquiring "a first operation" and displaying "an operation panel displayed in response to the first operation." Claim 1 further recites acquiring "a second operation" "wherein the sensor acquires the second operation based on detection of one or more proximity inputs according to a proximity of a portion of an operational tool to the modified display position of the operation panel."

The Specification describes

    The operation unit 160 is a means for acquiring an operation by the user to input various kinds of data into the information processing apparatus 100 or instruct the information processing apparatus 100 to perform a processing operation. The operation unit 160 may be, for example, a touch panel provided on the display screen of the display apparatus 200. In addition, the operation unit 160 may be various operation means such as a mouse, keyboard, touch panel, button, switch and lever. Furthermore, the operation unit 160 may be a remote controller using infrared rays or other radio waves.

Spec. ¶ 18 (emphases added).

Appellant's Specification also describes displaying "operation panel image 600" of a graphical play control on a touch panel and a user touching the operation panel image to perform an operation. See Spec. Figs. 3, 5, 7, ¶¶ 28-30, 56-60. The Specification describes three embodiments, all of which disclose this feature. Id. In the third embodiment, the Specification describes

    [I]f the operation unit 160 of the information processing apparatus 100 includes a touch panel, the user touches the backward side farther than the actually visible operation panel image 600, which may cause an uncomfortable feeling of the user. In such a case, the operation unit 160 may contain, for example, a proximity sensor capable of acquiring a proximity state.

Spec. ¶ 60 (emphasis added).
Pihlaja describes "a proximity detection system or unit 120 configured to detect when an input object 100, such as a finger or a stylus, is brought in close proximity to, but not in contact with, an input surface." Pihlaja ¶ 19 (emphases added). According to Pihlaja, "(non-touch) inputs by the object 100 in the hovering area may be referred to as hovering" and "[s]uch hovering input is associated with at least one function, for instance selection of an user interface (UI) item, zooming a display area, or activation of a pop-up menu." Id. ¶ 20 (emphases added). Pihlaja further describes generating a "three-dimensional virtual UI item . . . hereafter generally referred also to as 'virtual UI item'," which "can serve as a visual target for finger movements . . . to provide an intuitive and reliable hovering interaction." Id. ¶ 30 (emphasis added).

As discussed above, the Examiner maps Pihlaja's three-dimensional user interface items providing "hovering interaction" to perform a user interaction function such as "zooming a display area, or activation of a pop-up menu" to correspond to the "operation panel" for a "second operation" acquired "based on detection of one or more proximity inputs according to a proximity of a portion of an operational tool to the modified display position of the operation panel," as recited in claim 1. Final Act. 6; Ans. 4-5.

In view of the claim language and the disclosures in the Specification discussed above, we find the Examiner's mapping to be reasonable. See In re Smith Int'l, Inc., 871 F.3d 1375, 1382-83 (Fed. Cir. 2017) (internal citations omitted) ("The correct inquiry in giving a claim term its broadest reasonable interpretation in light of the specification is . . . an interpretation that corresponds with what and how the inventor describes his invention in the specification, i.e., an interpretation that is 'consistent with the specification.'").
Appellant does not explain adequately why Pihlaja's three-dimensional user interface items are excluded from the "operation panel" for a second operation, as recited in claim 1. Appellant also does not dispute the Examiner's finding that the finger or stylus described in Pihlaja teaches the "operational tool" recited in claim 1. See generally Appeal Br.; Reply Br.

Accordingly, Appellant does not persuade us the Examiner erred in finding that Pihlaja's three-dimensional user interface items teach or suggest an "operation panel" for a "second operation" acquired "based on detection of one or more proximity inputs according to a proximity of a portion of an operational tool to the modified display position of the operation panel," as recited in claim 1.

In the Reply, Appellant additionally argues that Suh does not teach or suggest "computation of a displacement value or modification of a display position of an operation panel in the depth direction based on the computed displacement value upon control to display the operation panel in response to a first operation." Reply Br. 5.

We are not persuaded by Appellant's argument. The paragraphs of Suh cited by the Examiner state as follows:

    In another aspect of the present invention, the displaying processor configured to control on screen display (OSD) menu of the apparatus to be displayed on top of the 3D image during the duration of the scene included in the 3D service based on information in the video depth range descriptor and the scene duration information.

Suh ¶ 22 (emphases added).

    For example, the OSD may be displayed so that the OSD does not overlay with the 3D image (or video), or the OSD may be displayed with a 3D effect further apart from the user than any of the other 3D images (or videos) that are currently being displayed.

Id. ¶ 80 (emphases added).
    Depending upon the OSD called by the user, the broadcast receiver receives the depth information respective to at least one or more scenes. Thereafter, the broadcast receiver may display the OSD at a depth that seems to be closer to the user as compared to the depth indicated by the min_disparity field, which has the minimum value. For example, since an OSD for adjusting the volume may be manipulated in a short time, the broadcast receiver may decide the depth of the OSD for adjusting the volume by using the depth information respective to the scene corresponding to the moment when the user calls on the OSD.

Id. ¶ 95 (emphases added).

    When outputting graphics or OSD screens in accordance with user interaction or the performance of message or alarm functions, the receiver uses the depth range information determined in the above-described steps. More specifically, the receiver uses the min_disparity field value so as to decode the disparity according to which the OSD screen is to be located, within a range that the OSD screen does not overlay with the 3D video. For example, the receiver decides the OSD screen to be displayed in a location nearer to the user than the value designated by the min_disparity field.

Id. ¶ 289 (emphases added).

These paragraphs describe that Suh's broadcast receiver (i.e., the device including a processor for receiving and processing 3D video images, as well as controlling display of OSD menus (Suh ¶¶ 22, 41, Fig. 15)), when outputting OSD screens "in accordance with user interaction," decides to display the OSD at a depth that seems to be closer to the user than the depth of the 3D image or video.
Thus, we agree with the Examiner that Suh at least suggests to a person of ordinary skill in the art "circuitry configured to control the display to display an operation panel displayed in response to the first operation" and "compute a displacement value based on a display position of the stereoscopic image in a depth direction and a display position of the operation panel in the depth direction," as recited in claim 1.

Lastly, Appellant argues the Examiner's combination of Suh and Pihlaja is based on impermissible hindsight. Appeal Br. 17-18; Reply Br. 8. We are not persuaded by Appellant's argument because "[a]ny judgment on obviousness is in a sense necessarily a reconstruction based upon hindsight reasoning, but so long as it takes into account only knowledge which was within the level of ordinary skill at the time the claimed invention was made and does not include knowledge gleaned only from applicant's disclosure, such a reconstruction is proper." In re McLaughlin, 443 F.2d 1392, 1395 (CCPA 1971).

Here, the Examiner relies on the disclosures of Suh and Pihlaja to explain that a person of ordinary skill in the art at the time of the Application's filing would have been motivated to combine the references "to modify Suh's information processing apparatus to adapt an alternative sensor, by incorporating Pihlaja's teaching where a detection unit and a control unit are added to the apparatus for allowing the user to give proximity/hovering input to the UI and execute the UI's function" because such a combination would have "enable[d] more versatile and intuitive ways of interacting with electronic devices." Final Act. 6 (citing Pihlaja ¶ 3 (describing "hovering user interfaces" and stating that "[p]eople who are using mobile devices generally wish to have more versatile and intuitive ways to interact with their devices")).

This is not evidence of hindsight bias.
Appellant does not identify, nor do we discern, any particular knowledge used by the Examiner in the proffered combination of Suh and Pihlaja that was gleaned only from the Specification's disclosure and was not within the level of ordinary skill in the art at the time of the filing.

CONCLUSIONS

In sum, Appellant does not persuade us that the Examiner erred in finding that the combination of Suh and Pihlaja teaches or suggests "the sensor acquires the second operation based on detection of one or more proximity inputs according to a proximity of a portion of an operational tool to the modified display position of the operation panel," as recited in claim 1. We also find that the Examiner articulates sufficient reason, supported by rational underpinning, for combining the teachings of Suh and Pihlaja in the manner described by the Examiner.

For the foregoing reasons, we are not persuaded of Examiner error in the rejection of claim 1. We, therefore, sustain the Examiner's rejections on this record. The decision of the Examiner to reject claims 1, 2, 15-17, 19-21, 23, and 25-28 under 35 U.S.C. § 103(a) is affirmed.

DECISION SUMMARY

In summary:

Claim(s) Rejected              35 U.S.C. §   Reference(s)/Basis      Affirmed                        Reversed
1, 2, 15-17, 19-21, 23, 25-28  103(a)        Suh, Pihlaja            1, 2, 15-17, 19-21, 23, 25-28
1, 2, 15-17, 19-21, 23, 25-28  103(a)        Suh, Pihlaja, Guillou   1, 2, 15-17, 19-21, 23, 25-28
Overall Outcome                                                      1, 2, 15-17, 19-21, 23, 25-28

TIME PERIOD FOR RESPONSE

No time period for taking any subsequent action in connection with this appeal may be extended under 37 C.F.R. § 1.136(a). See 37 C.F.R. § 1.136(a)(1)(iv).

AFFIRMED