Ex Parte Gutta et al., No. 10/014,199 (B.P.A.I. Dec. 20, 2011)

UNITED STATES PATENT AND TRADEMARK OFFICE
____________________

BEFORE THE BOARD OF PATENT APPEALS AND INTERFERENCES
____________________

Ex parte SRINIVAS GUTTA, VASANTH PHILOMIN, and MIROSLAV TRAJKOVIC
____________________

Appeal 2010-000946
Application 10/014,199
Technology Center 3600
____________________

Before: MURRIEL E. CRAWFORD, ANTON W. FETTING, and JOSEPH A. FISCHETTI, Administrative Patent Judges.

FISCHETTI, Administrative Patent Judge.

DECISION ON APPEAL

STATEMENT OF CASE

Appellants seek our review under 35 U.S.C. § 134 from the Examiner's final rejection of claims 1 and 3-12. We affirm.

THE CLAIMED INVENTION

Appellants claim classifiers using Eigen networks that employ Principal Component Analysis (PCA) to determine eigenvalues and eigenvectors for recognition and classification of objects. (Spec. 1:8-11.)

Claim 1 is illustrative of the claimed subject matter:

1. A method for classifying inputs to a neural network, comprising:
performing Principal Component Analysis (PCA) on a plurality of inputs to the neural network to produce a plurality of PCA outputs;
coupling each of the plurality of PCA outputs to a plurality of output nodes;
multiplying each coupled PCA output by a weight selected for the coupled PCA output;
calculating a node output for each output node;
selecting a maximum output from the plurality of node outputs; and
associating an output class with the maximum output.

REFERENCES

The prior art relied upon by the Examiner in rejecting the claims on appeal is:

Pao US 6,134,537 Oct. 17, 2000

REJECTIONS

The following rejection is before us for review: the Examiner rejected claims 1 and 3-12 under 35 U.S.C. § 103(a) over Applicant-Admitted Prior Art ("AAPA") in view of Pao.
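For readers unfamiliar with the technique, the steps recited in claim 1 can be sketched as follows. This is a minimal illustration only, assuming a NumPy environment; every name, shape, and value below is hypothetical and is not drawn from the record.

```python
# Illustrative sketch of the steps recited in claim 1 (hypothetical code,
# not part of the record): PCA on the inputs, PCA outputs coupled to
# weighted output nodes, maximum node output selected, class associated.
import numpy as np

def pca_classify(inputs, weights, classes, n_components=2):
    """Classify a batch of inputs via PCA projection and linear output nodes."""
    X = np.asarray(inputs, dtype=float)
    # Perform PCA: eigendecomposition of the covariance matrix of the inputs.
    centered = X - X.mean(axis=0)
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    # Keep the eigenvectors with the largest eigenvalues (principal components).
    components = eigvecs[:, np.argsort(eigvals)[::-1][:n_components]]
    pca_outputs = centered @ components      # a plurality of PCA outputs
    # Couple each PCA output to every output node, applying a weight per coupling,
    # and calculate a node output for each output node.
    node_outputs = pca_outputs @ weights     # shape: (samples, output nodes)
    # Select the maximum node output and associate its output class.
    return [classes[i] for i in np.argmax(node_outputs, axis=1)]
```

The weight matrix here stands in for the "weight selected for the coupled PCA output"; how those weights are learned is outside this sketch.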
ISSUES

Did the Examiner err in rejecting claims 1 and 3-12 under 35 U.S.C. § 103(a) over AAPA and Pao as disclosing the application of PCA to neural network methods, given that Pao discloses limitations of PCA in certain types of non-linear problems?

Did the Examiner err in rejecting claims 1 and 3-12 under 35 U.S.C. § 103(a) over AAPA and Pao without identifying a motivation to combine that is taught in the references?

FINDINGS OF FACT

We find the following facts by a preponderance of the evidence.

1. Pao discloses a system for organizing "multi-dimensional pattern data into a dimensional representation that includes a neural network." (Col. 5, ll. 23-26.)

2. Pao discloses "[t]hat utilization of the Karhunen-Loeve transform for PCA purposes has been found to be valuable in dealing with many non-trivial problems. But in pattern recognition, it has a failing insofar as what is retained is not necessarily that which helps interclass discrimination." (Col. 2, ll. 18-23.)

3. Pao discloses:

Except for linear PCA methods which are limited by their linearity nature already, other methods mentioned above either map the high dimensional data to discrete grid points in the lower dimensional space or the appearance of the lower dimensional map closely depends on the initial (usually random) choice of mapping parameters or both.

(Col. 4, ll. 10-15.)

4. Pao discloses an analysis of the advantages and disadvantages of the K-L (PCA), auto-associative, and nonlinear variance-conserving methods, where the disadvantages of the PCA method are as follows: "Lengthy computation of co-variance matrix; linear constraint leads to loss of information when dimension-reduction is large." (Col. 10, ll. 45-52.)

5. Pao discloses:

Subsequent and somewhat related developments sought to link the ideas of PCA, K-L transform and linear neural networks.
Such efforts sought to accomplish a linear K-L transform through neural-net computing, with fully-connected multilayer feedforward nets with the backpropagation algorithm for learning the weights, or with use of a Generalized Hebbian Learning algorithm.

(Col. 2, ll. 24-30.)

6. Pao discloses that "earlier works begged the question of what might be achieved if the neurons in the networks were allowed to also be nonlinear." (Col. 2, ll. 38-40.)

ANALYSIS

Initially, we note that Appellants argue independent claims 1, 10, 11, and 12 together as a group. (App. Br. 5-6.) Accordingly, we select representative claim 1 to decide the appeal of these claims, with the remaining independent claims 10, 11, and 12 standing or falling with claim 1. Appellants do not provide a substantive argument as to the separate patentability of claims 3-9, which depend from claim 1. Thus, claims 3-12 stand or fall with claim 1. See 37 C.F.R. § 41.37(c)(1)(vii).

Appellants argue that "[s]ince the Appellants' invention involves reducing the dimensions of input data using a system and method that employs linear outputs derived from PCA, and Pao teaches reducing the dimension of input data with a system that uses nonlinear outputs using the nonlinear variance-constraint method, it is clear that Pao teaches away from using PCA by discussing the limitations and failings of PCA." (Appeal Br. 7.) In support of the conclusion that Pao "teaches away," Appellants cite three sections of Pao (FF 2-4).

The Examiner responded to the discussion at column 2 of Pao (FF 2), which mentions shortcomings of PCA for use in pattern recognition problems, stating that the "non-trivial problems to which Pao et al. is referring include classification/clustering problems other than pattern recognition. Since the instant claims are not limited to pattern recognition, the pertinent comment by Pao et al. is that PCA 'has been found to be valuable', not that 'it has a failing'." (Ans. 6.)
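Pao's observation quoted in FF 2, that what PCA retains is not necessarily what helps interclass discrimination, can be illustrated with a small synthetic example. The sketch below is purely illustrative and assumes NumPy; the data and all numbers are hypothetical, not drawn from Pao.

```python
# Hypothetical toy data illustrating Pao's point (FF 2): PCA keeps the
# direction of greatest variance, which need not separate the classes.
import numpy as np

rng = np.random.default_rng(0)
# Two classes share large variance along x but are separated only along
# the low-variance y axis.
class_a = np.column_stack([rng.normal(0, 10, 200), rng.normal(+1.0, 0.1, 200)])
class_b = np.column_stack([rng.normal(0, 10, 200), rng.normal(-1.0, 0.1, 200)])
X = np.vstack([class_a, class_b])

# First principal component: eigenvector of the covariance matrix with
# the largest eigenvalue.
centered = X - X.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(centered, rowvar=False))
pc1 = eigvecs[:, np.argmax(eigvals)]

# PC1 aligns with the high-variance x axis, so projecting onto it alone
# discards the y axis that actually discriminates the two classes.
assert abs(pc1[0]) > abs(pc1[1])
```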
The Examiner further responded to Appellants' arguments concerning the disclosures in Pao at columns 4 and 10 (FF 3, 4) about PCA being "inherently linear" as follows:

Pao et al. compares the advantages and disadvantages of three dimension-reduction alternatives (col. 3 lines 51-66 and col. 10 lines 45-67). Pao et al. prefers the "NLVC" method (col. 11 lines 1-16), because it enables nonlinear distinctions (illustrated nicely by the three-part, nonlinear dotted line result in Fig. 6). But this nonlinear capability comes with a cost in complexity, as Pao et al. make clear in the comparison (col. 10 lines 45-67). If a problem can accept the simpler, linear constraint, then one of ordinary skill in the art would read Pao et al. to recommend PCA. The appellant's argument acknowledges that the instant invention can accept linear constraints ...

(Ans. 6.)

We agree with the Examiner. Moreover, Appellants' argument that the claimed invention "employs linear outputs derived from PCA, and Pao teaches reducing the dimension of input data with a system that uses nonlinear outputs ..." (Appeal Br. 7) fails from the outset because it is not based on limitations appearing in the claims and is not commensurate with their broader scope: the claims do not recite the terms "linear" or "non-linear," or any equivalent, and thus are not limited to either linear or non-linear problems. In re Self, 671 F.2d 1344, 1348 (CCPA 1982).

Appellants next argue that the admitted prior art and Pao each fails to teach or suggest "classifying inputs to a neural network." (Appeal Br. 8.) We are not persuaded by this argument because the phrase "classifying inputs to a neural network" appears only in the preamble of each independent claim, and not in the body of the claims.
Therefore, we do not read "classifying inputs to a neural network," recited in the preamble of the independent claims, as a limitation because it is not thereafter fully incorporated into the body of the claims. Corning Glass Works v. Sumitomo Elec. U.S.A., Inc., 868 F.2d 1251, 1257 (Fed. Cir. 1989) (an element initially recited in the preamble limits the claim only when it is thereafter fully incorporated into the body of the claim so as to breathe life and meaning into it by setting forth the complete combination).

Appellants, merely repeating core language of each of claims 1, 10, 11, and 12, assert that the combination does not teach or suggest the recited claim limitations. (Appeal Br. 8.) We are not persuaded by this assertion because a general allegation that the art does not teach any of the claim limitations is no more than merely pointing out the claim limitations. A statement which merely points out what a claim recites will not be considered an argument for separate patentability of the claim. 37 C.F.R. § 41.37(c)(1)(vii). "It is not the function of this court to examine the claims in greater detail than argued by an appellant, looking for nonobvious distinctions over the prior art." In re Baxter Travenol Labs., 952 F.2d 388, 391 (Fed. Cir. 1991).

Appellants finally argue that "the Federal Circuit requires a specific identification of a suggestion, motivation, or teaching why one of ordinary skill in the art would have been motivated to select the references and combine them. This the Examiner has not done." (Appeal Br. 9-10.) To the extent Appellants seek an explicit suggestion or motivation in the reference itself, this is no longer the law in view of the Supreme Court's holding in KSR Int'l Co. v. Teleflex Inc., 550 U.S. 398, 419 (2007). Appellants' appeal brief (November 25, 2005) and the Examiner's Answer (June 14, 2006) were each filed before the outcome of KSR.
Notwithstanding, the Examiner found a motivation to combine in Pao, stating:

Appellant has noted that PCA is well known and is widely used. While PCA is limited to linear constraints, appellant has stated that linear constraints are acceptable in the instant invention. (That is, the claims are not limited to nonlinear constraints.) When linear constraints are acceptable, Pao et al., especially through its comparison of pros and cons (col. 10 lines 45-67), makes it clear that PCA, because of its simplicity, is the obvious first choice for dimension reduction. For these reasons, it would have been obvious to one of ordinary skill in the art at the time of the invention, to add the PCA teachings of Pao et al. to the admitted prior art.

(Ans. 7.)

We find this rationale reasonable because Pao addresses both linear and non-linear problem sets using PCA and neural networks (FF 1-5), which would lead one of ordinary skill in the art to contemplate applying PCA to the neural network methods of the AAPA to arrive at the claimed method. We therefore find no error by the Examiner in the combination of the AAPA with Pao, because the Examiner clearly articulated a motivation to combine that derives from the disclosure of Pao.

CONCLUSIONS OF LAW

The Examiner did not err in rejecting claims 1 and 3-12 under 35 U.S.C. § 103(a) over Applicant-Admitted Prior Art and Pao.

DECISION

For the above reasons, the Examiner's rejection of claims 1 and 3-12 is AFFIRMED.

No time period for taking any subsequent action in connection with this appeal may be extended under 37 C.F.R. § 1.136(a)(1)(iv).

AFFIRMED