Ex Parte Myllyla, No. 11/954,259 (P.T.A.B. Feb. 20, 2014)

UNITED STATES PATENT AND TRADEMARK OFFICE
__________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
__________

Ex parte TOMI MYLLYLA1
__________

Appeal 2011-013193
Application 11/954,259
Technology Center 2100
__________

Before TONI R. SCHEINER, JEFFREY N. FREDMAN, and ERICA A. FRANKLIN, Administrative Patent Judges.

SCHEINER, Administrative Patent Judge.

DECISION ON APPEAL

This is an appeal under 35 U.S.C. § 134 from the final rejection of claims 1-5, 7, 11-22, and 25-31, directed to a method, apparatus, and computer program for introducing real world and virtual information into media content based on location. The Examiner has rejected the claims on the grounds of anticipation, lack of enablement, and indefiniteness. We have jurisdiction under 35 U.S.C. § 6(b).

__________
1 Appellant identifies the Real Party-In-Interest as Nokia Corporation (App. Br. 1).

STATEMENT OF THE CASE

"Embodiments of the present invention relate generally to augmented reality, and more particularly, relate to . . . displaying virtual media content items in a visual [medium]" (Spec. ¶ 1).

Claim 1, reproduced below, is representative of the subject matter on appeal:

1. A method comprising:
receiving a first media content item that includes location information;
determining a capture zone for the first media content item based upon the location information;
receiving a second media content item that includes location information; and
determining to render at a user interface a presentation of the second media content item within a presentation of the first media content item when the location information of the second media content item indicates that the second media content item is or was located within the capture zone.

Claims 1-5, 7, 11-22, and 25-31 stand rejected under 35 U.S.C. § 102(e) as anticipated by Border et al. (US 7,683,962 B2, March 23, 2010) (Ans. 5-8).

Claims 28, 29, and 31 stand rejected under 35 U.S.C. § 112, first paragraph, as lacking enablement (Ans. 4-5).

Claim 27 stands rejected under 35 U.S.C. § 112, second paragraph, as indefinite (Ans. 5).

We reverse.

FINDINGS OF FACT

The following findings of fact are supported by a preponderance of the evidence of record.

1. Figure 4e of the Specification illustrates an embodiment that meets the limitations of claim 1. Figure 4e is reproduced below:

[Figure 4e of the Specification]

Figure 4e depicts photograph 430, an example of a first media content item. The area displayed in photograph 430 is coextensive with the capture zone for the first media content. Figure 4e also depicts avatar object 450, an example of a second media content item. In this particular example, avatar object 450 represents a person with a mobile device which includes location capturing capabilities. When photograph 430 is displayed, and the person and the mobile device are located within the capture zone of photograph 430, avatar object 450 will also be displayed within photograph 430 at a location corresponding to the location where the mobile device is located within the capture zone. Moreover, as the person and the mobile device move, the display can change to represent the movement. (Spec. ¶¶ 60-63.)
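For illustration only, the determination recited in claim 1 and depicted in Figure 4e can be sketched in Python. This is a minimal sketch, not part of the record: the rectangular geographic zone, the fixed span, the class and function names, and the example coordinates are all assumptions introduced for the sketch, and neither the claim nor the Specification prescribes any particular data structure or zone geometry.

```python
from dataclasses import dataclass


@dataclass
class MediaItem:
    # A media content item that carries location information,
    # e.g. photograph 430 or avatar object 450 in Figure 4e.
    name: str
    latitude: float
    longitude: float


@dataclass
class CaptureZone:
    # Hypothetical rectangular geographic zone used only for this sketch.
    min_lat: float
    max_lat: float
    min_lon: float
    max_lon: float

    def contains(self, item: MediaItem) -> bool:
        return (self.min_lat <= item.latitude <= self.max_lat
                and self.min_lon <= item.longitude <= self.max_lon)


def determine_capture_zone(first_item: MediaItem, span: float = 0.001) -> CaptureZone:
    # Assumption for the sketch: the capture zone is a small box centered
    # on the location information of the first media content item.
    return CaptureZone(first_item.latitude - span, first_item.latitude + span,
                       first_item.longitude - span, first_item.longitude + span)


def determine_to_render_within(first_item: MediaItem, second_item: MediaItem) -> bool:
    # True when the second item's location information indicates that it
    # is or was located within the capture zone of the first item.
    return determine_capture_zone(first_item).contains(second_item)


if __name__ == "__main__":
    photograph = MediaItem("photograph 430", 60.1699, 24.9384)   # example values only
    avatar = MediaItem("avatar object 450", 60.1701, 24.9386)    # example values only
    if determine_to_render_within(photograph, avatar):
        print(f"Render {avatar.name} within the presentation of {photograph.name}")
```

Under these assumptions, the avatar is rendered within the photograph only while the reported coordinates of the mobile device fall inside the capture zone derived from the photograph's location information.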
2. Border discloses "an electronic camera for producing an output image of a scene from a captured image signal" (Border, col. 7, ll. 39-41).

[The] embodiments described herein include an image capture assembly, such as a digital camera - still or video - or a digital scanner, having multiple image capture stages, each composed of a lens and an image sensor, wherein the lenses of the multiple image capture stages have different focal lengths to provide an extended optical zoom range for the image capture assembly. . . . [T]he multiple image capture stages . . . provide an enhanced rangefinding capability. . . . [U]sing the image capture stages for image capture and autofocus . . . improv[es] the sharpness of the captured still and video images, as well as increasing the speed of response of the autofocus system. Besides their auto focus usage, these same images can also be used to provide a range map according to the invention.

There are several embodiments by which one image capture stage may be used to capture digital still images or video images while another image capture stage is simultaneously being used for another purpose, such as enhanced auto focus, . . . and the like.

In a first embodiment described herein, when a user has set a zoom position to be within a first optical zoom range, a first imaging stage containing a first (e.g., zoom) lens is used to capture a still image or a series of images as in a video sequence, while a second imaging stage simultaneously provides images for the purpose of auto focus of the first imaging stage. Since the second imaging stage is not used to capture the images, the focus conditions of the lens in the second imaging stage can be adjusted over a wider range . . . to determine the new best focus setting for the lens in the first imaging stage without detrimentally affecting the focus quality of the images captured by the first imaging stage. When the new best focus condition has been determined using the second imaging stage, the focus condition of the first imaging stage is changed from the previous best focus condition to the new best focus condition. When the user adjusts the zoom position on the image capture assembly to a second optical zoom range, the camera automatically switches to use the second imaging stage (containing, e.g., a second zoom lens) to capture the images, and begins using the first imaging stage for auto focus of the second imaging stage.

(Border, col. 10, ll. 16-61.)
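Purely as an illustrative sketch, the stage-switching behavior Border describes in this passage can be modeled as below. The zoom-range boundaries, the class and function names, and the role assignment are assumptions made for the sketch and are not drawn from Border's actual implementation.

```python
from dataclasses import dataclass
from typing import Tuple


@dataclass
class ImagingStage:
    # One image capture stage: a lens plus an image sensor.
    name: str
    min_zoom: float   # assumed lower bound of this stage's optical zoom range
    max_zoom: float   # assumed upper bound of this stage's optical zoom range


def assign_roles(zoom_position: float,
                 first_stage: ImagingStage,
                 second_stage: ImagingStage) -> Tuple[ImagingStage, ImagingStage]:
    # Returns (capture_stage, autofocus_stage): the stage whose assumed zoom
    # range covers the user's zoom position captures the still or video image,
    # while the other stage is used to search for the best focus setting.
    if first_stage.min_zoom <= zoom_position <= first_stage.max_zoom:
        return first_stage, second_stage
    return second_stage, first_stage


if __name__ == "__main__":
    stage1 = ImagingStage("first imaging stage", 1.0, 3.0)    # example ranges only
    stage2 = ImagingStage("second imaging stage", 3.0, 10.0)  # example ranges only
    capture, autofocus = assign_roles(5.0, stage1, stage2)
    print(f"capture: {capture.name}; autofocus: {autofocus.name}")
```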
3. Border further discloses:

[A] process for selecting one of the imaging stages in a dual lens camera system as the primary capture unit, while relegating the other imaging stage to certain other functions, such as scene analysis. More specifically, the power to the camera is turned on and the initialization process begins (block 1100). After the initialization process is completed, the first and second imaging stages 1 and 2 are set to their default zoom positions (block 1102), which are predetermined initial zoom positions that determine the images that are initially captured and displayed. Thereupon, the first and second imaging stages capture and display first and second preview images (block 1104) on the image display 70. These images could be displayed in several ways, for instance, side by side on the display as thumbnails or the like, one within the other, sequentially, etc. Next, the camera operator is asked to decide whether the first or second preview image should be the primary image (block 1106), where the primary image will be the one that is captured and stored . . . [and] the other (non-selected) imaging stage is designated as the scene analysis capture unit (block 1114).

(Border, col. 25, ll. 26-57.)

ANTICIPATION

The Examiner finds that Border anticipates the invention of claims 1-5, 7, 11-22, and 25-31. As discussed in greater detail above, Border's multi-lens digital camera uses "two (or more) image capture stages, wherein an image capture stage is composed of a sensor, a lens and a lens focus adjuster . . . in a rangefinder configuration to separately capture images of portions of the same scene" and "[t]he images captured by the two (or more) image capture stages can be compared to each other to measure the distance to portions of the scene for purposes of auto focus control and for the production of a range map. In addition, the range map can be populated with GPS locations for portions of the scene" (Border, col. 8, ll. 12-21 (emphasis added); FF 2).

Appellant contends that the Examiner incorrectly "interpreted the images generated by Border et al.'s first and second imaging stages of a digital camera as the first and second media content items recited in the claims" (App. Br. 13). Appellant contends that the images generated by Border's two image capture stages are captured at essentially the same time and are of "the same scene" (Reply Br. 2). Appellant further contends that even when Border's system captures first and second preview images and displays them together for selection of a primary image (App. Br. 15), "the location information of one preview image will always indicate that it[] is or was located within the capture zone of the other preview" (id. at 16). That is, Border's system "knows for sure the items [captured by the two image capture stages] share the same location" (Reply Br. 2), and are "images of the same scene" (id. at 5). That being the case, Appellant contends that the Examiner's finding that Border "'determine[s] to render at a user interface a presentation of the second media content item within a presentation of the first media content item when the location information of the second media content item indicates that the second media content item is or was located within the capture zone'" is not supported by the evidence (App. Br. 16).

Appellant's arguments are persuasive. We agree with Appellant that one of ordinary skill in the art would not consider the images generated by Border's first and second imaging stages to be the first and second media content items required by the claims. That is, when the claims are read in light of the Specification, it is clear that the first and second media content items represent different content, and are not simply two images of the same item in the same scene captured at essentially the same time. Moreover, we agree with Appellant that Border's system does not render a presentation of the second media content item within a presentation of the first media content item based on a determination that the second media content item is or was in the capture zone of the first media content item.

ENABLEMENT

The Examiner finds that claims 28, 29, and 31 are not enabled by the Specification. We agree with Appellant that the Examiner's initial burden of establishing lack of enablement has not been met for the reasons set forth on pages 8 and 9 of the Appeal Brief. Accordingly, the rejection is reversed.

INDEFINITENESS

The Examiner finds that claim 27 is indefinite.
We agree with Appellant that this rejection is without merit for the reasons set forth on pages 10 and 11 of the Appeal Brief. The rejection is reversed.

SUMMARY

The rejection of claims 1-5, 7, 11-22, and 25-31 as anticipated by Border is reversed.

The rejection of claims 28, 29, and 31 as lacking enablement is reversed.

The rejection of claim 27 as indefinite is reversed.

REVERSED

lp