Ex Parte Koizumi, Appeal 2008-2489, Application 11/005,694 (B.P.A.I. Sep. 17, 2008)

UNITED STATES PATENT AND TRADEMARK OFFICE
____________

BEFORE THE BOARD OF PATENT APPEALS AND INTERFERENCES
____________

Ex parte DAVID H. KOIZUMI
____________

Appeal 2008-2489
Application 11/005,694
Technology Center 2600
____________

Decided: September 17, 2008
____________

Before RICHARD TORCZON, SALLY C. MEDLEY, and JAMES T. MOORE, Administrative Patent Judges.

MEDLEY, Administrative Patent Judge.

DECISION ON APPEAL

A. Statement of the Case

Intel Corporation (“Intel”), the real party in interest, seeks review under 35 U.S.C. § 134(a) of a Final Rejection of claims 1-19, the only claims remaining in the application on appeal. We have jurisdiction under 35 U.S.C. § 6(b). We affirm.

The application on appeal was filed 07 December 2004. The Examiner relies on the following prior art in rejecting the claims on appeal:

Skodlar     5,734,370     Mar. 31, 1998
Mumford     6,377,249     Apr. 23, 2002
Yasutake    6,597,347     Jul. 22, 2003

The Examiner rejected claims 1-10 under 35 U.S.C. § 112, ¶ 1 as failing to comply with the enablement requirement. The Examiner rejected claims 1-19 as being indefinite under 35 U.S.C. § 112, ¶ 2. The Examiner rejected claims 1-2, 5-6 and 11-15 as anticipated under 35 U.S.C. § 102(b) by Skodlar. The Examiner rejected claims 3-4 and 19 as unpatentable under 35 U.S.C. § 103(a) as obvious over Skodlar and Mumford. The Examiner rejected claims 7-10 and 16-18 as unpatentable under 35 U.S.C. § 103(a) as obvious over Skodlar and Yasutake.

B. Issues

The first issue is whether Intel has sustained its burden of showing that the Examiner erred in rejecting appealed claims 1-10 for failing to comply with the enablement requirement under 35 U.S.C. § 112, ¶ 1.
The second issue is whether Intel has sustained its burden of showing that the Examiner erred in rejecting appealed claims 1-19 as being indefinite under 35 U.S.C. § 112, ¶ 2.

The third issue is whether Intel has sustained its burden of showing that the Examiner erred in rejecting appealed claims 1-2, 5-6 and 11-15 as anticipated under 35 U.S.C. § 102(b) by the prior art.

The fourth issue is whether Intel has sustained its burden of showing that the Examiner erred in rejecting appealed claims 3-4 and 19 under 35 U.S.C. § 103(a) as obvious over the prior art.

The fifth issue is whether Intel has sustained its burden of showing that the Examiner erred in rejecting appealed claims 7-10 and 16-18 under 35 U.S.C. § 103(a) as obvious over the prior art.

C. Findings of Fact (“FF”)

The record supports the following findings of fact as well as any other findings of fact set forth in this opinion by at least a preponderance of the evidence.

The Invention

1. The invention is related to a method and apparatus that enables a user to interact with an image displayed on a display screen. Spec. Abs.

2. Referring to figure 2 below [numbers from figure 2 inserted], the apparatus includes an element [10] that can be physically manipulated, such as by utilizing handle [12] and trigger [24]. Spec. 2-4.

3. The image [30] on the display screen [28] is changed based on the physical manipulation of element [10], such as through the movement of the shaft portions [14, 16] and spring-biased light pens [20]. Spec. 2-4.

4. A display screen is two dimensional as depicted in Intel’s figure 2.

Intel’s figure 2 is reproduced below. Figure 2 depicts the interaction between the element and image on display screen.

Claims on Appeal

5. Intel’s claims 1-19 are the subject of this appeal. App. Br. 5.

6. Claims 1 and 11 are independent. App. Br. 13-14.

7.
Claims 2-10 and 12-19 are directly or indirectly dependent on claims 1 and 11 respectively. App. Br. 13-15.

8. Claims 1-10 stand or fall together with respect to the 35 U.S.C. § 112, first paragraph rejection. App. Br. 10.

9. Claims 1-19 stand or fall together with respect to the 35 U.S.C. § 112, second paragraph rejection. App. Br. 10.

10. Claims 1-2, 5-6 and 11-15 stand or fall together with respect to rejection over the prior art. App. Br. 11.

11. Claims 3-4 and 19 stand or fall together with respect to rejection over the prior art. App. Br. 11.

12. Claims 7-10 and 16-18 stand or fall together with respect to rejection over the prior art. App. Br. 11-12.

13. Representative claim 1 [numbers from figure 2 inserted], which we reproduce from the Claim Appendix of the Appeal Brief, reads as follows:

1. A method comprising:
enabling the position of an element [10] on a display screen [28] to be determined automatically; and
in response to physical manipulation of said element [10] in three dimensions, enabling an image [30, 32] displayed on said display screen [28] to be automatically altered in three dimensions.

14. Representative claim 11 [numbers from figure 2 inserted], which we reproduce from the Claim Appendix of the Appeal Brief, reads as follows:

11. An article comprising a medium storing instructions that enable a processor-based system to:
determine the position of an element [10] on a display screen [28]; and
in response to physical manipulation of said element [10] in three dimensions, alter an image [30, 32] displayed on the display screen [28].

Prior Art

Skodlar

15. Referring to figure 6 below [numbers from figure 6 inserted], Skodlar describes a control stick [2], a control apparatus [60], and a monitor display screen [75]. Col. 4, ll. 31-37.

16. Skodlar describes a display screen [75] showing a virtual cue [61] with virtual cue tip [69] as a virtual extension of control stick [2]. Col. 4, ll. 31-37.
17. Skodlar describes that a player will observe the virtual cue [61] on screen [75] moving in accordance with movement of the control stick [2]. Col. 4, ll. 37-39.

18. Skodlar describes that control stick [2] can be equipped with an optical sensor [68] at the tip that provides exact pointing direction. Col. 4, ll. 45-47.

19. The optical sensor [68] is connected to an electronic circuit and provides a signal relating to the cursor, virtual cue tip [69] of virtual cue [61] on the screen, thereby allowing the displayed image including virtual cue [61] to align with control apparatus [60]. Col. 4, ll. 47-52.

Figure 6 from Skodlar is reproduced below. Figure 6 depicts a control apparatus, a control stick, a virtual cue and a display.

Mumford

20. Referring to figure 10 below [numbers from figure 10 inserted], Mumford describes a display screen [102], a light pen including a collimating lens [101], light pipe [103], multi-stranded fiber optic cables [104a, 104b, 104c], photodiodes [106r, 106g, 106b], amplifiers [107r, 107g, 107b] and a computer [108]. Col. 12, ll. 20-46.

21. Mumford describes that the electronic light pen measures red, green and blue pixel values independently through color filtered independent photodiodes. Col. 12, ll. 13-19.

22. Mumford describes that a mechanical assembly could maximize the transmission of light from the display screen [102] to the electronic light pen to adjust the angle of the lens [101] and light pipe [103] with respect to display screen [102]. Col. 13, ll. 6-12.

23. Mumford describes that the mechanical assembly could support the focusing of the electronic light pen onto a very few pixels to make the localization of color or grey scale information very specific. Col. 13, ll. 12-15.

Figure 10 from Mumford is reproduced below. Figure 10 depicts a light pen including photodiodes and a display.

Yasutake

24.
Referring to figure 1 below [numbers from figure 1 inserted], Yasutake describes a force/touch sensitive 3D controller [105] with force-sensitive sensor pads [110, 115, 120, 125, 130, 135] on each face of the controller [105]. Col. 8, ll. 42-57.

25. Yasutake describes that signals responsive to the application of pressure to the force-sensitive sensor pads [110, 115, 120, 125, 130, 135] are input to the computer system. Col. 8, ll. 60-64.

26. Yasutake describes that the computer system can display a computer generated object that can be manipulated in simulated space by controller [105]. Col. 8, ll. 65-67.

27. Yasutake describes that a real or computer generated object will move as if it is responding to the pressure (i.e., force) on controller [105]. Col. 9, ll. 14-34.

28. Yasutake describes that the controlled object will move in the X, Y and Z directions in response to pressing on the force-sensitive sensor pads [110, 115, 120, 125, 130, 135]. Col. 9, ll. 14-34.

29. Yasutake describes that the controller [105] offers the advantage that it exhibits a zero neutral force, i.e., the controller does not require a force on any sensors or mechanical members to maintain a neutral position. Col. 9, ll. 35-37.

30. Yasutake describes that images on the display screen can be altered automatically to appear to extend further into the display screen (along z-axis) in response to an element (i.e., a thumb or forefinger) being pressed harder against the force-sensitive sensor pads. Col. 19, ll. 51-58; figs. 33d1-33d4.

31. Yasutake depicts manipulation of a computer generated object in response to pressure on force-sensitive sensor pads in figs. 26a-26f, 28a-28f, 33a1-a2, 33b1-b2, 33c1-c2, 33d1-d6 and 33e1-e2.

Figure 1 from Yasutake is reproduced below. Figure 1 depicts a force/touch sensitive 3D controller with force sensor pads.

Examiner’s Findings

35 U.S.C. § 112, ¶ 2 Rejection

32.
The Examiner found that it is not clear whether the “element on the screen to be determined automatically” refers to image [30] or physical element [10] as described in Intel’s Specification. Final Rejection 3.

[Footnote 1: The 35 U.S.C. § 112, ¶ 2 Rejection was included in the Final Rejection, but not included in either of the Examiner’s Answers mailed 07 March 2007 and 15 May 2007. The Examiner indicated that this rejection is maintained in the paper mailed 20 December 2007.]

35 U.S.C. § 102(b) Rejection

33. The Examiner found that Skodlar describes a computer control device [60] for controlling an image on an image display device that includes enabling the position of an element [2] on the display screen [67] to be determined automatically because the virtual cue [61] appears on the screen when the stick [2] points to the display screen [67]. Final Rejection 3 and Ans. 4, citing Skodlar col. 4, ll. 45-47; fig. 6.

34. The Examiner found that Skodlar describes that, in response to physical manipulation of the element [2] in three dimensions, an image displayed on the display screen is enabled to be automatically altered in three dimensions, since the optical sensor [68] is connected to an electronic circuit providing a signal relating to the cursor, virtual cue tip [69] of virtual cue [61] on the screen, thereby allowing alignment of the displayed image including a virtual image of the control cue [61]. Final Rejection 4 and Ans. 4, citing Skodlar col. 2, ll. 29-35; col. 4, ll. 48-52; col. 5, ll. 42-45; fig. 6.

35. The Examiner found that Skodlar also describes a medium storing instructions that enable a processor-based computer to perform the method steps. Final Rejection 5 and Ans. 5, citing Skodlar fig. 5.

35 U.S.C. § 103(a) Rejections

Claims 7-10 and 16-18

36.
The Examiner found that Skodlar does not describe enabling the image on the display screen to be altered to appear to extend further into said screen in response to the element being pressed harder against the display screen. Final Rejection 6 and Ans. 7.

37. The Examiner found that the claimed “method for providing touch-sensitive input” reads on Yasutake’s description of the controller [105] having control harness [145] connected to force sensor pads and providing signals in response to application of pressure to the pads. Final Rejection 6 and Ans. 7, citing Yasutake fig. 1.

[Footnote 2: The Examiner’s Answer referred to hereinafter is the Examiner’s Answer mailed 15 May 2007.]

38. The Examiner found that Yasutake describes that the signals are input to a computer system that can display a computer-generated object that can be manipulated in simulated space by the controller [105]. Final Rejection 6 and Ans. 7, citing Yasutake col. 8, ll. 43-65; fig. 1.

39. The Examiner found that Yasutake describes that the user interface is intuitive since a computer object, i.e., an image displayed on the screen, will move further as if it were responding to a pressure on controller [105]. Final Rejection 6 and Ans. 7, citing Yasutake col. 9, ll. 14-34.

40. The Examiner determined that “[i]t would have been obvious for one of ordinary skill in the art at the time of the invention [to] provide the teaching of enabling the generated object, i.e., image displayed on the screen, to move as if i[n] response to pressure on the controller device in the system of Skodlar because it would provide a controller that exhibits a zero neutral force, i.e., the controller does not require a force on any sensors or mechanical members to maintain a neutral position as taught [by] Yasutake.” Final Rejection 6 and Ans. 7, citing Yasutake col. 9, ll. 34-36.

41.
The Examiner found that Yasutake describes in figures 26a-26f the appearance that the image on the display screen is physically connected to the element, by enabling the image to be altered in the fashion in which it would be altered in response to three dimensional manipulation of the element (controller [2500]) if the image were physically connected to the element. Final Rejection 7 and Ans. 7, citing Yasutake col. 16, l. 59 - col. 17, l. 13.

42. The Examiner found that Yasutake describes in figures 28a-28f the displayed image appearing to grasp another image in response to manipulation of the controller [2820]. Final Rejection 7 and Ans. 7.

43. The Examiner found that Yasutake describes the image manipulated by controller [2500] appearing to alter in three dimensions in response to a corresponding manipulation of the controller [2500] in three dimensions. Final Rejection 7 and Ans. 8, citing Yasutake figs. 26a-26f.

44. The Examiner found that Skodlar also describes a medium storing instructions that enable a processor-based computer to perform the method steps. Final Rejection 7 and Ans. 8.

Claims 3-4 and 19

45. The Examiner found that Skodlar describes a control stick [2] with an optical sensor [68] and found the claimed light pen to read on the optical sensor [68]. Final Rejection 5 and Ans. 6.

46. The Examiner found that Skodlar does not provide a plurality of light pens on the end of the control stick [2]. Final Rejection 5 and Ans. 6.

47. The Examiner found that Mumford discloses a plurality of lights [106r, 106g, 106b] at the end of an electronic light pen to determine the position of the tip of the light pen with respect to the display screen. Final Rejection 5 and Ans. 6, citing Mumford fig. 10; col. 12, ll. 34-37.

48.
The Examiner determined it would have been obvious for one of ordinary skill in the art at the time of the invention to modify Skodlar’s optical sensor at the tip of the control stick to have three light sensors because it would support the focusing of the cue tip onto a very few pixels to make the localization of color or gray scale information very specific as taught by Mumford. Final Rejection 6 and Ans. 6, citing Mumford col. 13, ll. 13-15.

49. The Examiner found that Skodlar discloses a sensor to detect the longitudinal movements of the control stick [2] relative to the display screen, i.e., the physical extension of the tip portion having the light sensor [68] is determined from the guide and shown in figure 2a as the translational movement. Final Rejection 5 and Ans. 6, citing Skodlar col. 3, ll. 19-21; fig. 2a.

Intel’s Arguments

Rejection of claims 1-19 under 35 U.S.C. § 112, ¶ 2

50. Intel argues that claim 1 requires the element to be physically manipulated. App. Br. 10.

51. Intel argues that it is not seen how an image element on a display screen could be physically manipulated. App. Br. 10.

52. Intel argues that the rejection is unfounded since one skilled in the art would readily understand that the claim requires determining the position of the physical element. App. Br. 10.

Rejection of claims 1-2, 5-6 and 11-15 under 35 U.S.C. § 102(b)

53. Intel argues that there is no way to read the claim so that the claim does not require that the element must be on the screen to be automatically determined. App. Br. 11.

54. Intel argues that the suggestion that the position of Skodlar’s control stick [2] is automatically determined on the display screen [28] is not supported because the control stick [2] never reaches or touches the display screen [28]. App. Br. 11 and Reply Br. 2-3.

55.
Intel argues that since the control stick [2] never reaches or touches the display screen [28], it never automatically determines the position of an element on the display screen [28]. Reply Br. 3.

Rejection of claims 7-10 and 16-18 under 35 U.S.C. § 103(a)

56. Intel argues that neither Skodlar nor Yasutake teaches enabling the image on the display screen to be altered automatically to appear to extend further into the display screen in response to the element being pressed harder against the display screen, or any rationale to achieve it. App. Br. 11-12.

57. Intel argues that the image modifications that seem to correspond to physical manipulation are nowhere suggested in either Skodlar or Yasutake. App. Br. 12.

Rejection of claims 3-4 and 19 under 35 U.S.C. § 103(a)

58. Intel argues that there is no way that Skodlar’s optical sensor would correspond to a light pen. App. Br. 11.

59. Intel argues that there is no teaching of putting the light pen on the end of the element whose position is to be determined. App. Br. 11.

60. Intel argues that nothing in Mumford would suggest putting the light pen on the end of the stick, as suggested by the Examiner. App. Br. 11.

61. Intel argues that there is no reason why Mumford could not put his light pen directly on the screen without the presence of the stick. App. Br. 11.

62. Intel argues that the addition of the element that is physically manipulated and the attachment of the light pens to the end of the stick is nowhere suggested by Mumford or any combination of Mumford with any other reference. App. Br. 11.

D. Principles of Law

“[T]he definiteness of the language employed must be analyzed - not in a vacuum, but always in light of the teachings of the prior art and of the particular application disclosure as it would be interpreted by one possessing the ordinary level of skill in the pertinent art.” In re Moore, 439 F.2d 1232, 1235 (CCPA 1971).
“When the meaning of claims is in doubt, especially when . . . there is close prior art . . .” a finding of indefiniteness is proper. Amgen, Inc. v. Chugai Pharmaceutical Co. Ltd., 927 F.2d 1200, 1218 (Fed. Cir. 1991). The breadth of a claim is not to be equated with indefiniteness. In re Miller, 441 F.2d 689, 693 (CCPA 1971). If the metes and bounds of the claims are speculative, the rejection of the claims over the prior art will not be addressed. In re Steele, 305 F.2d 859, 863 (CCPA 1962).

“In the patentability context, claims are to be given their broadest reasonable interpretations.” In re Van Geuns, 988 F.2d 1181, 1184 (Fed. Cir. 1993) (citation omitted). Once the PTO has established a prima facie case of anticipation, the burden of production falls upon the applicant to establish entitlement to a patent. In re Morris, 127 F.3d 1048, 1054 (Fed. Cir. 1997) (citations omitted).

“Under § 103, the scope and content of the prior art are to be determined; differences between the prior art and the claims at issue are to be ascertained; and the level of ordinary skill in the pertinent art resolved. Against this background the obviousness or nonobviousness of the subject matter is determined.” KSR Int’l Co. v. Teleflex Inc., 127 S. Ct. 1727, 1734 (2007). In an obviousness analysis, it is not necessary to find precise teachings in the prior art directed to the specific subject matter claimed because inferences and creative steps that a person of ordinary skill in the art would employ can be taken into account. See KSR Int’l Co. v. Teleflex Inc., 127 S. Ct. 1727, 1741 (2007).

E. Analysis

Rejection of claims 1-19 under 35 U.S.C. § 112, ¶ 2

The Examiner found that it is unclear whether the “element on the display screen”, as recited in independent claims 1 and 11, refers to an object [30] (i.e., image) or to a physical element [10]. FF 31.
Intel’s position is that the rejection is unfounded since the claims require the “element” to be physically manipulated and it is unclear how an image can be physically manipulated. FFs 50-52. While Intel’s argument has merit with respect to claims 11-19, the same cannot be said with respect to claims 1-10.

Claim 1 recites “in response to physical manipulation of said element in three dimensions, enabling an image displayed on said display screen to be automatically altered in three dimensions.” FF 13. We construe Intel’s claimed “element” to correspond to element [10], including its component parts such as light pens [20], shaft portions [14, 16], etc., depicted in Intel’s figure 1. We also construe Intel’s “image displayed on said display screen” to correspond to image [30] or image [32] displayed on display screen [28] as depicted in Intel’s figure 1.

[Footnote 3: FF denotes Finding of Fact.]

The limitations of claim 1 require both three dimensional physical manipulation of an element and three dimensional alteration of an image. In the first instance, Intel uses the term “three dimensions” for a real object (i.e., element [10]), while in the second instance, Intel uses the term “three dimensions” for a virtual object (i.e., image [30] or [32]). Further, we note that Intel’s images [30] and [32] are displayed on a display [28] that is two dimensional. FFs 3-4.

In its arguments, Intel relies on three dimensional physical manipulation to urge that the claimed “element” is a real object (i.e., element [10]) and not a virtual object (i.e., image [30] or [32]). FFs 50-52. Yet, claim 1 also requires three dimensional alteration of a virtual object (i.e., image [30] or [32]). Since the image is displayed on a two dimensional display, three dimensional alteration appears to have a non-standard meaning that includes three dimensional alteration of a real object but also connotes three dimensional alteration of a virtual object.
It is not clear why three dimensional physical manipulation should not have a similarly broad meaning, encompassing both three dimensional physical manipulation of a real object and three dimensional physical manipulation of a virtual object, instead of its common meaning. Since the meaning of three dimensional physical manipulation may include three dimensional physical manipulation of an image (i.e., a virtual object), we are not persuaded by Intel’s arguments. Further, since the meaning of three dimensional physical manipulation can include manipulation of both a real object and a virtual object, we agree with the Examiner’s finding that it is not clear whether the element on the display screen refers to element [10] (i.e., real object) or to an image [30] (i.e., virtual object).

We appreciate that ordinarily it is difficult to establish that a claim is indefinite in light of the teachings of the prior art and the application disclosure. However, a finding of indefiniteness is proper when there is close prior art and the meaning of the claim is in doubt, as is the case before us. In other words, enhanced precision in claim drafting is essential when the prior art is close. In this case, the Skodlar reference constitutes close prior art because it describes an element that is physically manipulated and a display screen in which images are altered. FFs 15-19, 33, 34. Moreover, the Examiner’s and Intel’s arguments regarding the interaction of the element that is physically manipulated and the display screen which displays the image indicate that the prior art is close to what is covered by claim 1. FFs 33-35, 54, 55.

For all these reasons, we find that Intel has not demonstrated that the Examiner erred in rejecting claims 1-10 as indefinite.

Unlike independent claim 1, independent claim 11 does not recite enabling an image displayed on a display screen to be automatically altered in three dimensions.
Claim 11 recites “alter an image displayed on the display screen.” A display screen is two dimensional. FF 4. We agree with Intel that the same image that is displayed on the display screen cannot be physically manipulated in three dimensions. The Examiner’s interpretation that claim 11 covers manipulation of either the displayed element or the hand held device is not reasonable. To the extent that the Examiner finds error in the breadth of the claim, breadth of a claim is not to be equated with indefiniteness. For these reasons, Intel has sufficiently demonstrated that the Examiner erred in rejecting claims 11-19 as indefinite.

Rejection of claims 1-10 under 35 U.S.C. § 112, ¶ 1

Since the Examiner’s rejection of claims 1-10 as indefinite under 35 U.S.C. § 112, ¶ 2 has been affirmed, as discussed above, we need not address the rejection of claims 1-10 under 35 U.S.C. § 112 for failing to comply with the enablement requirement.

Rejection of claims 1-10 over the prior art

Since the metes and bounds of claims 1-10 are speculative, as discussed above, we do not address the prior art rejections with respect to those claims.

Rejection of claims 11-15 under 35 U.S.C. § 102(b)

Claim 11 recites “an element on a display screen . . . .” FF 14. The Examiner and Intel disagree as to whether this limitation reads on the Skodlar reference. Skodlar describes a control stick [2], a control apparatus [60], and a monitor display screen [75]. FF 15. A player will observe the virtual cue [61] with virtual cue tip [69] on display screen [75] moving in accordance with movement of the control stick [2] as a virtual extension of control stick [2]. FFs 16, 17.

The Examiner found that Skodlar describes enabling the position of an element [2] on the display screen to be determined automatically because the virtual cue [61] appears on the screen when the stick [2] points to the display screen. FF 33.
In other words, the Examiner interprets an “element” as having both a physical portion and a virtual portion: the control stick [2] and the virtual cue [61].

Intel argues that there is no suggestion that the position of Skodlar’s control stick [2] on the display screen is automatically determined because the control stick [2] never reaches or touches the display screen. FFs 53-55. However, the claim language does not require the “element on a display” to reach or touch the display screen physically. FF 14. According to the Examiner’s interpretation, Skodlar describes that the virtual cue [61] (i.e., the virtual portion of the element) is on the display screen. FFs 15-17. Intel has not presented arguments that the Examiner’s interpretation of the term “element” is unreasonable.

For all these reasons, we find that Intel has not demonstrated that the Examiner erred in finding claims 11-15 anticipated by Skodlar.

Rejection of claims 16-18 under 35 U.S.C. § 103(a)

Claims 16-18 are each dependent on and include all the limitations of claim 11. FF 7. Claims 16-18 stand or fall together. FF 12. Claim 16 recites “instructions that enable the processor-based system to alter an image so that the image appears to extend further into said display screen in response to said element being pressed against said display screen”. App. Br. 15. The Examiner and Intel disagree as to whether Yasutake describes this limitation.

The Examiner found that Yasutake describes that signals responsive to application of pressure to force sensor pads of controller [105] are input to a computer system that can display a computer-generated object that can be manipulated in simulated space by the controller. FFs 37, 38. The Examiner found that Yasutake describes that the user interface is intuitive since a computer object, i.e., an image displayed on the screen, will move further as if responding to a pressure on controller [105]. FF 39.
The Examiner determined that it would have been obvious for one of ordinary skill in the art at the time of the invention to provide the teaching of enabling the generated object, i.e., the image displayed on the screen, to move as if responding to pressure on the controller device in the system of Skodlar because it would provide a controller that exhibits a zero neutral force, i.e., the controller does not require a force on any sensors or mechanical members to maintain a neutral position, as taught by Yasutake. FF 40.

Intel argues that neither Skodlar nor Yasutake teaches enabling the image on the display screen to be altered automatically to appear to extend further into the display screen in response to the element being pressed harder against the display screen, or any rationale to achieve it. FF 56. Intel’s arguments are not persuasive because they are contradicted by the Examiner’s findings and the disclosure in the Yasutake reference. In addition to the Examiner’s findings (FFs 38, 39), Yasutake describes in figures 33d1-33d4 images on the display screen that can be altered automatically to appear to extend further into the display screen (along the z-axis) in response to an element (i.e., a thumb or forefinger) being pressed harder against the force-sensitive sensor pads. FFs 26-28, 30.

Intel also argues that the image modifications that seem to correspond to physical manipulation are nowhere suggested in either Skodlar or Yasutake. FF 57. Intel’s arguments are unpersuasive because they are inconsistent with the Examiner’s findings and Yasutake’s disclosure. In addition to the Examiner’s findings (FFs 38, 39), Yasutake describes image modifications that correspond to physical manipulation in figures 26a-26f, 28a-28f, 33a1-a2, 33b1-b2, 33c1-c2, 33d1-d6 and 33e1-e2. FFs 26-28, 31.
For these reasons, we find that Intel has not demonstrated that the Examiner erred in rejecting claims 16-18 as obvious over the prior art.

Rejection of claim 19 under 35 U.S.C. § 103(a)

Claim 19 is dependent on claim 11 and includes all of the limitations of claim 11. FF 7. Claim 19 recites “[t]he article . . . further storing instructions that enable the processor-based system to detect the position, with respect to said display screen of a light pen associated with said element”. App. Br. 15.

The Examiner found that Skodlar describes a control stick [2] with an optical sensor [68] and found the claimed light pen to read on the optical sensor [68]. FF 45. The Examiner found that Mumford discloses a plurality of lights [106r, 106g, 106b] at the end of an electronic light pen to determine the position of the tip of the light pen with respect to the display screen. FF 47. The Examiner determined it would have been obvious for one of ordinary skill in the art at the time of the invention to modify Skodlar’s optical sensor [68] at the tip of the control stick to have three light sensors because it would support the focusing of the cue tip onto a very few pixels to make the localization of color or gray scale information very specific as taught by Mumford. FF 48.

Intel argues that Skodlar’s optical sensor does not correspond to a light pen. FF 58. Intel has not directed us to evidence in support of its argument. Moreover, Skodlar describes an optical sensor [68] that senses light from the display screen and Mumford describes that the light pen includes three photodiodes [106r, 106g, 106b] that sense different colors of light emitted from a display screen. FFs 20-23. Thus, Skodlar’s optical sensor and Mumford’s light pen both perform the function of sensing light from a display, contrary to Intel’s argument that an optical sensor cannot correspond to a light pen.
As a result, we are unpersuaded by Intel's argument.

Next, Intel argues that Skodlar does not teach putting a light pen on the end of the element whose position is to be determined (i.e., control stick [2]). FF 59. Intel also argues that neither Mumford nor any combination of Mumford and the prior art suggests adding the element that is physically manipulated (i.e., control stick [2]) to the light pen and attaching the light pen(s) to the end of the stick. FFs 60-62. However, Intel's arguments are not commensurate in scope with the language of claim 19. Claim 19 does not require the light pen to be located on the end of the element; it merely requires "a light pen associated with said element". Intel's arguments also fail because, as explained above, Skodlar describes a light pen (i.e., optical sensor [68]) located at the end of the control stick [2].

For all these reasons, we find that Intel has not demonstrated that the Examiner erred in rejecting claim 19 as obvious over the prior art.

F. Decision

Upon consideration of the appeal, and for the reasons given herein, it is

ORDERED that the decision of the Examiner rejecting claims 1-10 as indefinite under 35 U.S.C. § 112, ¶ 2 is affirmed.

ORDERED that the decision of the Examiner rejecting claims 11-19 as indefinite under 35 U.S.C. § 112, ¶ 2 is reversed.

ORDERED that the decision of the Examiner rejecting claims 11-15 as anticipated under 35 U.S.C. § 102(b) by Skodlar is affirmed.

ORDERED that the decision of the Examiner rejecting claims 16-18 as unpatentable under 35 U.S.C. § 103(a) over Skodlar and Yasutake is affirmed.

No time period for taking any subsequent action in connection with this appeal may be extended under 37 C.F.R. § 1.136(a).
AFFIRMED

ADDITIONAL VIEWS by MOORE, Administrative Patent Judge

I join with my colleagues in this decision, but write to express additional and separate views about the language of claim 1. Claim 1 recites as follows:

1. A method comprising: enabling the position of an element on a display screen to be determined automatically; and in response to physical manipulation of said element in three dimensions, enabling an image displayed on said display screen to be automatically altered in three dimensions.

Specifically, the term "enabling" appears twice in this claim. The term also appears in dependent claims 2, 4, and 5-10. The Examiner and my colleagues have attempted to interpret this claim and find themselves unable to do so because of the imprecise language relating to the three-dimensional manipulation. As best they are able, they find the claim to be close to the prior art's description of a computer object (image) displayed on a screen moving further, as if responding to pressure on a controller. I agree that such a prior art description is close to the scope of the claim, and accordingly precision in claim drafting is of utmost importance in this case specifically.

But one other important function of a claim generally is to precisely define the scope of the claimed invention, the so-called "metes and bounds" of the property right. A potential infringer has the right to know where the line is drawn, so to speak, before trespassing. In this instance, I believe one of ordinary skill in the art cannot find the outer limit of claims 1-10 as written, even if the three-dimensional issues discussed in the opinion are ultimately resolved.

The specification sheds no light on what the obfuscatory step of "enabling" entails. Almost anything can "enable" a result. Claims 1-10, as written, represent a desired end result or "wish" without any real steps for achieving that end result.
There are no concrete or tangible steps recited, such as providing software which computes positional input and pressure data and then periodically re-renders the screen. Claims 1-10 represent a "black box" to me. Accordingly, they do not give a potential infringer notice of what is inside them and what the claims cover. I would therefore find claims 1-10 to be unpatentable under 35 U.S.C. § 112, second paragraph, for this additional reason.

qsg

TROP PRUNER & HU, PC
1616 S. VOSS ROAD, SUITE 750
HOUSTON TX 77057-2631