VIACCESS, 2020005646 (P.T.A.B. Mar. 2, 2022)

UNITED STATES PATENT AND TRADEMARK OFFICE
UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS, P.O. Box 1450, Alexandria, Virginia 22313-1450
www.uspto.gov

APPLICATION NO.: 15/571,387
FILING DATE: 11/02/2017
FIRST NAMED INVENTOR: Chem Assayag
ATTORNEY DOCKET NO.: BRV-58337
CONFIRMATION NO.: 4282

116 7590 03/02/2022
PEARNE & GORDON LLP
1801 EAST 9TH STREET, SUITE 1200
CLEVELAND, OH 44114-3108

EXAMINER: MARANDI, JAMES R
ART UNIT: 2421
NOTIFICATION DATE: 03/02/2022
DELIVERY MODE: ELECTRONIC

Please find below and/or attached an Office communication concerning this application or proceeding. The time period for reply, if any, is set in the attached communication. Notice of the Office communication was sent electronically on above-indicated "Notification Date" to the following e-mail address(es): patdocket@pearne.com

PTOL-90A (Rev. 04/07)

UNITED STATES PATENT AND TRADEMARK OFFICE
________________
BEFORE THE PATENT TRIAL AND APPEAL BOARD
________________
Ex parte CHEM ASSAYAG and GILES DUBROEUCQ
________________
Appeal 2020-005646
Application 15/571,387
Technology Center 2400
________________
Before JASON V. MORGAN, JEREMY J. CURCURI, and PHILLIP A. BENNETT, Administrative Patent Judges.

MORGAN, Administrative Patent Judge.

DECISION ON APPEAL

STATEMENT OF THE CASE

Pursuant to 35 U.S.C. § 134(a), Appellant¹ appeals from the Examiner's decision to reject claims 14-24. Claims 1-13 are canceled. Appeal Br. 19 (Claims App.). We have jurisdiction under 35 U.S.C. § 6(b).

We affirm.

¹ "Appellant" refers to "applicant" as defined in 37 C.F.R. § 1.42. Appellant identifies VIACCESS as the real party in interest. Appeal Br. 3.

SUMMARY OF THE DISCLOSURE

Appellant's claimed subject matter relates to "setting the level of definition of at least one image of a multimedia programme displayed on a screen of a terminal" through the steps of (1) a "characterization unit automatically acquir[ing] metadata attached to said multimedia programme and submit[ting] said [meta]data to [a] unit for selecting predefined settings" and (2) a "unit for selecting predefined settings automatically select[ing] the settings according to pre-recorded rules defined depending on the values of said metadata." Abstract.

REFERENCES

The Examiner relies on the following references (only the first named inventor of each reference is listed):

Name       Reference            Date
Kameoka    US 2010/0110297 A1   May 6, 2010
Umeyama    US 2011/0293241 A1   Dec. 1, 2011
Messmer    US 2015/0071615 A1   Mar. 12, 2015

REJECTIONS

The Examiner rejects claims 14-24 as follows:

Claims Rejected       35 U.S.C. §   Reference(s)/Basis          Citation
14, 15, 18, 21, 24    102(a)(1)     Messmer                     Final Act. 4-6
16, 17                103           Messmer, Umeyama            Final Act. 6-7
19, 20, 22            103           Messmer, Kameoka            Final Act. 7-8
23                    103           Messmer, Kameoka, Umeyama   Final Act. 8-9

ANALYSIS

Claim 14, which is representative with respect to claims 14-24, is reproduced below (disputed limitations emphasized and bracketing added).
14. A method for setting the level of definition of at least one image of a multimedia programme displayed on a screen of a terminal provided with at least one unit for characterizing the multimedia programme and at least one unit for selecting predefined settings, in which method,
- the characterising unit automatically acquires metadata attached to said multimedia programme and submits said metadata to the unit for selecting predefined settings,
- [1] the characterising unit generates at least one further metadata from an analysis of a displayed image of the current programme, creates, live and on the fly, at least one rule for defining the desired setting of the final rendering of the image as a function of the metadata of the current programme, at least one of the rules thus created being intended to define the desired setting of the final rendering of the image as a function of the further metadata, and submits said metadata to the unit for selecting predefined settings,
- the unit for selecting predefined settings automatically selects the settings according to prerecorded rules and/or the live on-the-fly defined rule,
characterised in that said further metadata is generated from an object and/or scene type recognition.

Appeal Br. 19 (Claims App.).

In rejecting claim 14, the Examiner finds that Messmer's creation of source video content characterization metadata 225B2 by analysis of video data 103 at decoder stage 207B or during video processing 208 teaches or suggests [1] a "characterising unit [that] generates at least one further metadata from an analysis of a displayed image of the current programme." Ans. 11-12 (citing, e.g., Messmer ¶ 74); Final Act. 4-5 (citing Messmer ¶¶ 45-78).

Appellant acknowledges that "Messmer describes that viewing environment metadata 225C, source video content characterization metadata 225B2 and/or housekeeping metadata 225D may be created or provided by analysis of video data 103 at the encoder stage 207A, the decoder stage 207B and/or by the video processor at block 208." Appeal Br. 14 (citing Messmer ¶¶ 54, 74). But Appellant argues that "all of these stages are post-production stages" (id. (citing Messmer Figs. 3, 4)), and, therefore, "[n]one of the metadata 225C, 225B2, and 225D is based on analysis of an image of the current programme" (id.). See also Reply Br. 3.

Appellant's arguments are not persuasive of reversible Examiner error because, although Messmer teaches that "video content characterization metadata 225B2 may describe post-production edited video content" (Messmer ¶ 54 (italicized emphasis added)), Messmer teaches or suggests creation of this and other metadata "by analysis of video data 103 at . . . the decoder stage 207B and/or by the video processor at block 208" (id. ¶ 74). As Messmer illustrates, video processing 208 and decoding 207B take place after delivery 206 via broadcast medium 205. Id. Figs. 3, 3A. Contrary to Appellant's arguments, such processing post-delivery is not part of a "post-production" stage. Id. Fig. 3.

To be sure, Messmer explicitly illustrates video characterization metadata 225B2 as being generated in post-production stage 204. Id. ¶ 47, Fig. 3. But we agree with the Examiner that Messmer's broad teaching of alternative embodiments in which video characterization metadata 225B2 and other metadata are generated during other stages teaches or suggests disputed recitation [1]. Ans. 12. Accordingly, we sustain the Examiner's 35 U.S.C. § 102(a)(1) rejection of claim 14, and the Examiner's 35 U.S.C. §§ 102(a)(1), 103 rejections of claims 15-24, which Appellant argues are patentable for similar reasons. Appeal Br. 15-17.

CONCLUSION

In summary:

Claim(s) Rejected     35 U.S.C. §   Reference(s)/Basis          Affirmed             Reversed
14, 15, 18, 21, 24    102(a)(1)     Messmer                     14, 15, 18, 21, 24
16, 17                103           Messmer, Umeyama            16, 17
19, 20, 22            103           Messmer, Kameoka            19, 20, 22
23                    103           Messmer, Kameoka, Umeyama   23
Overall Outcome                                                 14-24

TIME PERIOD FOR RESPONSE

No time period for taking subsequent action in connection with this appeal may be extended under 37 C.F.R. § 1.136(a). See 37 C.F.R. § 1.136(a)(1)(iv).

AFFIRMED