University of Florida Research Foundation, Inc., Appeal 2020-002931 (P.T.A.B. Dec. 14, 2020)

UNITED STATES PATENT AND TRADEMARK OFFICE
UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS, P.O. Box 1450, Alexandria, Virginia 22313-1450
www.uspto.gov

APPLICATION NO.: 15/065,028
FILING DATE: 03/09/2016
FIRST NAMED INVENTOR: Frank J. Bova
ATTORNEY DOCKET NO.: T15188 (222109-1120)
CONFIRMATION NO.: 7022

24504 7590 12/14/2020
THOMAS | HORSTEMEYER, LLP
3200 WINDY HILL ROAD, SE
SUITE 1600E
ATLANTA, GA 30339

EXAMINER: YIP, JACK
ART UNIT: 3715
NOTIFICATION DATE: 12/14/2020
DELIVERY MODE: ELECTRONIC

Please find below and/or attached an Office communication concerning this application or proceeding. The time period for reply, if any, is set in the attached communication. Notice of the Office communication was sent electronically on above-indicated “Notification Date” to the following e-mail address(es):
docketing@thomashorstemeyer.com
ozzie.liggins@tkhr.com
uspatents@tkhr.com

PTOL-90A (Rev. 04/07)

UNITED STATES PATENT AND TRADEMARK OFFICE
____________
BEFORE THE PATENT TRIAL AND APPEAL BOARD
____________
Ex parte FRANK J. BOVA, SANFORD L. MEEKS, and TWYLA R. WILLOUGHBY
____________
Appeal 2020-002931
Application 15/065,028
Technology Center 3700
____________
Before MEREDITH C. PETRAVICK, WILLIAM V. SAINDON, and ERIC C. JESCHKE, Administrative Patent Judges.

JESCHKE, Administrative Patent Judge.

DECISION ON APPEAL

STATEMENT OF THE CASE

Appellant[1] seeks review, under 35 U.S.C. § 134(a), of the Examiner’s decision, as set forth in the Non-Final Office Action dated May 13, 2019, rejecting claims 1–20. We have jurisdiction under 35 U.S.C. § 6(b). We reverse.

[1] We use the word “Appellant” to refer to “applicant” as defined in 37 C.F.R. § 1.42. Appellant identifies University of Florida Research Foundation, Inc. as the real party in interest. Appeal Br. 3.

BACKGROUND

The disclosed subject matter “is generally in the field of medical training systems, specifically medical training systems in oncology.” Spec. ¶ 1.[2] Claims 1, 12, and 19 are independent. Claim 1 is reproduced below:

[2] Citations to the Specification refer to the version filed on May 14, 2018, which includes paragraph numbering.
1. A method of radiation oncology training performed by a virtual radiation oncology clinic (VROC) computer system that simulates an entire course of radiation therapy including an evaluation of a virtual patient by a trainee, an identification by the trainee of all therapeutic target and non-target regions of the virtual patient, a recommendation by the trainee of radiation doses to structures of the virtual patient in one or more treatment fractions, delivery of the recommended radiation doses in one or more treatment fractions to the structures of the virtual patient in accordance with the recommendation, and a follow up by the trainee with the virtual patient after delivery of all treatment fractions, the method comprising:

(a) with a computing device of the VROC computer system, reading an electronic medical record associated with the virtual patient from a database of the VROC computer system,

(b) with a computing device of the VROC computer system, causing the electronic medical record to be displayed to a trainee on a display of a user interface of the VROC computer system,

(c) in a computing device of the VROC computer system, receiving an electronic treatment plan associated with the virtual patient from the trainee, the electronic treatment plan including the identification by the trainee of all therapeutic target and non-target regions of the virtual patient, the recommendation by the trainee of radiation doses to structures of the virtual patient in one or more treatment fractions, and the delivery of the recommended radiation doses in one or more treatment fractions to the structures of the virtual patient in accordance with the recommendation, wherein the electronic treatment plan is generated by the trainee and entered by the trainee into the VROC computer system via the user interface or is displayed to the trainee on a display device of the user interface and modified by the trainee via the user interface of the VROC computer system, and

(d) in a computing device of the VROC computer system, computing one or more comparison metrics for the electronic treatment plan and providing training feedback to the trainee via the user interface, wherein said one or more comparison metrics are indicative of whether the electronic treatment plan contains one or more errors, and wherein the feedback includes an indication of whether the electronic treatment plan contains said one or more errors.

REJECTIONS

1. Claims 1–3, 5–14, 16, and 18–20 stand rejected under 35 U.S.C. § 102(a)(1) as anticipated by Riker (US 2005/0111621 A1, published May 26, 2005).

2. Claims 4 and 17 stand rejected under 35 U.S.C. § 103 as unpatentable over Riker and Willcut (US 9,764,162 B1, issued Sept. 19, 2017).

3. Claim 15 stands rejected under 35 U.S.C. § 103 as unpatentable over Riker and Purdie (US 2016/0140300 A1, published May 19, 2016).

DISCUSSION

Rejection 1 – The rejection of claims 1–3, 5–14, 16, and 18–20 under 35 U.S.C. § 102(a)(1)
A. Claims 1–3 and 5–11

Independent claim 1 recites, among other limitations, the step of:

(d) in a computing device of the VROC computer system, computing one or more comparison metrics for the electronic treatment plan and providing training feedback to the trainee via the user interface, wherein said one or more comparison metrics are indicative of whether the electronic treatment plan contains one or more errors, and wherein the feedback includes an indication of whether the electronic treatment plan contains said one or more errors.

Appeal Br. 25 (Claims App.) (emphasis added).

In rejecting claim 1, the Examiner found that Riker discloses step (d). Non-Final Act. 5–6 (citing Riker ¶¶ 19, 21, 91, 145, 123, 127, 152, 153, 168). The Examiner found that “Riker teaches a plurality of feedbacks: comparing DVH[3] curves with DVH prescriptions; user reviews, scores, results are determined to be unacceptable, undesirable; treatment plan results; determines an excessive dose.” Id. at 6 (emphasis omitted).

[3] “DVH” stands for “dose volume histogram.” Riker discloses that “DVH curves represent a summary of how much [radiation] dose the individual structures [in the patient’s body] are getting.” Riker ¶ 19.

Appellant acknowledges that “the system and method of [Riker] provide various types of feedback to the physician during the process [of] treatment plan optimization.” Appeal Br. 18. Appellant argues that Riker nevertheless fails to disclose step (d) because the disclosed system “does not provide indications to the physician of whether the plan contains one or more errors. Rather, this determination must be made by the physician.” Id. at 19; see also Reply Br. 11 (“The type of feedback provided in [Riker] is the type of feedback that could only be properly interpreted by someone who is already trained. It is then left up to the physician or other trained oncology professional to determine whether one or more errors are contained in the plan.”).

We agree with Appellant’s understanding of the scope of claim 1 and the disclosures of Riker.

First, we address claim scope. Step (d) generally includes two different requirements: computing “comparison metrics” and providing “training feedback.” See Appeal Br. 26 (Claims App.). Both the “metrics” and “feedback” indicate “whether the electronic treatment plan contains . . . errors.” Id. The Specification discloses that “comparison metrics can be computed for the treatment plan or from the simulated treatment record” and that “[t]he comparison metric can be, for example a contour metric, a dosimetry metric, a reporting error metric, or a combination thereof.” Spec. ¶ 52; see also id. ¶¶ 53–56 (describing the “comparison metrics”). From this description, we view the “comparison metrics” as a quantitative value or values, which are recited as “computed” by the system. In contrast, “training feedback” is a qualitative assessment, recited as “provided” by the system, which categorizes the trainee’s “electronic treatment plan” as proper or improper. See id. ¶ 101 (“[F]eedback is given to the trainee about the overall agreement of their [treatment plan] to the expert [treatment plan.]”), ¶ 80 (“Code was adapted to give feedback for the plan comparison.”).

We turn now to the disclosures of Riker.
The Examiner does not expressly distinguish which portions of Riker were cited to address the “comparison metrics” versus the “training feedback.” The Examiner appears to read step (d) as merely requiring some information that can be used to identify error in a treatment plan. See Ans. 33–34; Non-Final Act. 5–6. For the reasons discussed above, we disagree.

The cited portions of Riker appear to disclose only the “comparison metrics” requirement. For example, relied-upon paragraph 152 of Riker states, in relevant part:

The radiation dose to the patient is simulated based upon the control information for the delivery device 39. The computer planning apparatus 35 provides a multitude of outputs both to the delivery device 39 and the user. A “human” needs to check all the results, therefore, as stated previously, the computer planning apparatus 35 can provide additional graphs and data that lets the user “test fire” the treatment plan without the patient and to make measurements of the dose delivered to test equipment 37 to determine if the computer optimized radiation treatment plan coincides with the expectations of the user, and to ensure that the output . . . of the delivery device 39 matches the radiation treatment plan.

Riker ¶ 152 (emphasis added). Thus, the system in Riker simulates the delivery of a radiation dose to a patient based on a “computer optimized treatment plan,” and the results of this simulation are then provided to the user as a type of “feedback mechanism.” See, e.g., id. ¶ 19 (“DVH curves have been used as a prescription and as a feedback mechanism, whereby the user specifies goals in terms of such DVH curves.”).

Appellant argues that having a user decide for themselves whether a treatment plan includes an error does not satisfy the requirement for “training feedback.” Appeal Br. 19. As noted by Appellant, Riker expressly discloses that “[a] ‘human’ needs to check all the results.” Id. (emphasis added).

We agree with Appellant. As discussed above, step (d) requires the “computer system” to provide “training feedback”—i.e., a qualitative assessment categorizing the trainee’s “electronic treatment plan” as proper or improper. Because the user in Riker identifies whether the treatment plan contains one or more errors, we agree with Appellant that the aspects of Riker relied on by the Examiner fail to disclose the “training feedback” required by step (d).

Other portions of Riker, cited by the Examiner (as well as Appellant), confirm this understanding of Riker’s system. See, e.g., Riker ¶ 21 (“The user can compare DVH curves actually achieved with DVH prescriptions to decide if the developed treatment plan was satisfactory.” (emphasis added)), quoted at Non-Final Act. 5; Riker ¶ 127 (“[T]he user reviews the isodose contours 162 overlaying image 161 in scan window 160 of GUI 150 and determines that an excessive dose is being delivered to a healthy organ structure, such as the rectum. . . . In this example, assume the adjustment resulted in less than desirable results in collateral areas.” (emphasis added)), quoted at Non-Final Act. 5; Riker ¶ 153 (“[T]he user determines whether the computer optimized radiation treatment plan meets expectations. . . . If the plan or results are determined to be unacceptable, undesirable, or even merely subject to improvement, the user returns (loops back) to steps 107–109 . . . .
This loop can be continued until the user determines the plan to be acceptable.” (emphasis added)), quoted at Reply Br. 10 and quoted in part at Non-Final Act. 6. Having considered the remaining disclosures of Riker relied on by the Examiner, none discloses the aspect of step (d) at issue.

In the statement of the rejection in the Answer, the Examiner modifies the relevant findings and adds the conclusion that “Riker compares the actual result from the simulation to the comparison metric to generate feedback (including errors).” See Ans. 8–9. The Examiner relies on the same paragraphs from Riker, however, to support its modified conclusion. Compare Non-Final Act. 5–6, with Ans. 8–9. As discussed above, in Riker, the user identifies whether the electronic treatment plan contains one or more errors.

In the response to argument section of the Answer, the Examiner states that Riker teaches various “treatment errors,” such as “a dose error,” “an error in the treatment time,” and “an error in the patient setup during treatment or a technician deviation from the treatment plan.” Ans. 33–34. Thus, according to the Examiner, “Riker meets at least one type of treatment errors as defined by the Appellant.” Id. at 33 (emphasis omitted). Even assuming that Riker teaches such errors, however, as discussed above, it is the user and not the system that identifies whether the electronic treatment plan contains the “errors.”

For these reasons, we do not sustain the rejection of claim 1, or the rejection of claims 2, 3, and 5–11, which depend from claim 1.

B. Claims 12–14, 16, and 18

Independent claim 12 requires the “computer system” to provide a similar type of “feedback” as that recited in claim 1. See Appeal Br. 29–30 (Claims App.). For claim 12 (and claims 13–14, 16, and 18, which depend from claim 12), the Examiner relies on the same deficient findings and conclusions with regard to Riker discussed above. Compare Non-Final Act. 8–9, with id. at 5–6. Although claim 12 adds that “the feedback includes an indication of whether at least one of the electronic treatment plan and the electronic simulated treatment record contains . . . one or more errors,” Appeal Br. 30 (Claims App.) (emphasis added), the Examiner does not explain, and we do not perceive, how the potentially broader scope of claim 12 remedies the deficiency discussed above. Thus, we also do not sustain the rejection of claims 12–14, 16, and 18.

C. Claims 19 and 20

Similar to claim 12, independent claim 19 requires the “computer system” to provide “feedback” that “includes an indication of whether the electronic treatment plan or the electronic simulated treatment record contains . . . one or more errors.” Appeal Br. 31–32 (Claims App.) (emphasis added). For claim 19 (and claim 20, which depends from claim 19), the Examiner relies on the same deficient findings and conclusions with regard to Riker discussed above with regard to claims 1 and 12. Compare Non-Final Act. 12–13, with id. at 5–6, 8–9. Thus, for the same reasons discussed above, we also do not sustain the rejection of claims 19 and 20.

Rejections 2 and 3 – The rejection of claims 4, 15, and 17 under 35 U.S.C. § 103(a)

Claim 4 depends from claim 1, and claims 15 and 17 depend from claim 12. Appeal Br. 27, 30 (Claims App.).
The Examiner’s added reliance on Willcut (regarding Rejection 2) and Purdie (regarding Rejection 3) does not remedy the deficiencies in the rejection based on Riker, discussed above, regarding independent claims 1 and 12 (see supra Rejection 1). Thus, for the same reasons discussed above, we do not sustain the rejection of claims 4, 15, and 17.

CONCLUSION

We reverse the Examiner’s rejection of claims 1–20. More specifically, we reverse the decision to reject claims 1–3, 5–14, 16, and 18–20 under 35 U.S.C. § 102(a)(1) and reverse the decision to reject claims 4, 15, and 17 under 35 U.S.C. § 103(a).

DECISION SUMMARY

In summary:

Claims Rejected        35 U.S.C. §   Reference(s)/Basis   Affirmed   Reversed
1–3, 5–14, 16, 18–20   102(a)(1)     Riker                           1–3, 5–14, 16, 18–20
4, 17                  103           Riker, Willcut                  4, 17
15                     103           Riker, Purdie                   15
Overall Outcome                                                      1–20

REVERSED