Ex Parte ZHU et al., No. 13/585,432 (P.T.A.B. May 9, 2018)

UNITED STATES PATENT AND TRADEMARK OFFICE

APPLICATION NO.: 13/585,432
FILING DATE: 08/14/2012
FIRST NAMED INVENTOR: Jiajun Zhu
ATTORNEY DOCKET NO.: 12-156
CONFIRMATION NO.: 4822
EXAMINER: KING, RODNEY P
ART UNIT: 3665
MAIL DATE: 05/09/2018
DELIVERY MODE: PAPER

145835 7590 05/09/2018
McDonnell Boehnen Hulbert & Berghoff LLP/Waymo
300 South Wacker Drive
Chicago, IL 60606
UNITED STATES OF AMERICA

UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS, P.O. Box 1450, Alexandria, Virginia 22313-1450, www.uspto.gov

Please find below and/or attached an Office communication concerning this application or proceeding. The time period for reply, if any, is set in the attached communication. PTOL-90A (Rev. 04/07)

UNITED STATES PATENT AND TRADEMARK OFFICE
BEFORE THE PATENT TRIAL AND APPEAL BOARD

Ex parte JIAJUN ZHU and DAVID I. FERGUSON

Appeal 2017-006440
Application 13/585,432
Technology Center 3600

Before BENJAMIN D. M. WOOD, LISA M. GUIJT, and PAUL J. KORNICZKY, Administrative Patent Judges.

GUIJT, Administrative Patent Judge.

DECISION ON APPEAL

Appellants1 appeal under 35 U.S.C. § 134(a) from the Examiner's rejection2 of claims 1, 3-5, 7, 8, 10-12, 14, 15, and 17-19 under 35 U.S.C. § 103(a) as unpatentable over Bageshwar (US 2010/0280699 A1; published Nov. 4, 2010) and Kulik (US 2011/0275408 A1; published Nov. 10, 2011). We have jurisdiction under 35 U.S.C. § 6(b). We affirm.

1 Appellants identify the real party in interest as Google Inc. Appeal Br. 1.
2 Appeal is taken from the Final Office Action dated June 6, 2016.

STATEMENT OF THE CASE

Claims 1, 8, and 15 are the independent claims on appeal. Claim 1, reproduced below, is exemplary of the subject matter on appeal, with disputed limitations italicized for emphasis.

1.
A method comprising:

receiving, using a computer system in a vehicle, ground truth data that relates to a current state of the vehicle in an environment, wherein the vehicle is configured to operate in an autonomous mode in which the computer system controls the vehicle in the autonomous mode based on data obtained by a sensor coupled to the vehicle, and wherein the sensor is a RADAR unit, a LIDAR unit, or a camera;

obtaining, using the computer system in the vehicle, perceived environment data that relates to the current state of the vehicle in the environment as perceived by the sensor;

detecting, using the computer system in the vehicle, a difference between the perceived environment data and the ground truth data; and

adjusting, using the computer system in the vehicle, one or more parameters of the sensor to reduce the difference between the perceived environment data and the ground truth data, wherein the one or more parameters include a latency of the sensor, a position of the sensor, or an orientation of the sensor relative to the vehicle.

ANALYSIS

Appellants argue independent claims 1, 8, and 15 as a group. Appeal Br. 3-8; Reply Br. 1-6. We select claim 1 as representative, with claims 8 and 15 standing or falling with claim 1. See 37 C.F.R. § 41.37(c)(1)(iv).

Regarding independent claim 1, the Examiner finds, inter alia, that Bageshwar discloses receiving, using a computer in a vehicle (i.e., processing functionality 102), ground truth data that relates to a current state of the vehicle in an environment, as claimed. Final Act. 3 (citing Bageshwar ¶¶ 14, 16, 17, 20). Appellants do not dispute the Examiner's finding. However, we note that the Specification provides a very broad description of the claim term "ground truth data."
In particular, the Specification discloses that "[s]ensor parameters may be optimized by collecting sensor produced data and comparing the collected data to known data or ground truth data." Spec. ¶ 16 (emphasis added). The Specification also discloses that

[i]n one example, the vehicle may receive ground truth data that relates to a current state of the vehicle in an environment. For example, the vehicle may be traveling down a road, with other vehicles driving in front of it. The ground truth data may define, for example, external driving conditions, a current state of the vehicle, and a current status of the other vehicles traveling in front of the vehicle. The external driving conditions may include a weather indication, a position of an obstacle in the environment, a position of a landmark in the environment, and/or a terrain map of the environment, for example. Other driving conditions could ... be included. The current status of the other vehicles may include information such as ... velocity or speed of the other vehicles, and the heading of the other vehicles. Other types of status information could also be received. For example, the ground truth data may indicate that another vehicle is in front of the vehicle heading straight on a two-lane, 10 mile road at a speed of 20 miles-per-hour, and there is a left-turn slant at mile 5.

Id. ¶ 19; see also id. ¶ 74; ¶¶ 59, 69-71 (disclosing that ground truth data is data indicating the speed and/or actual location of a vehicle other than the claimed vehicle). The Specification further discloses that "ground truth data may include data regarding the sensors," for example, "indicating a location and operating altitude of a camera ... [system]" (id. ¶ 74) and also that "ground truth data may be obtained directly from the automobile," for example, "one of the plurality of sensors may be a confirmed reliable data source used to obtain the ground truth data" (id. ¶ 75).
In fact, the Specification discloses that "ground truth data may take the form of any data set ... received by the [automobile's] computer system ... to compare and/or validate the integrity of any data that is obtained or collected by one of the plurality of sensors of the automobile." Id. ¶ 74 (emphasis added). Thus, we construe the claim term "ground truth data" to mean any data that is from a reliable data source and representative of an actual state of an environment (i.e., ground), consistent with the Specification. Cf. claim 1, supra (further limiting "ground truth data" to data "that relates to a current state of the vehicle in an environment").

In addition, the Examiner finds that Bageshwar discloses a vehicle configured to operate in an autonomous mode based on data obtained by a sensor (including a RADAR or LIDAR unit, or a camera), and also obtaining, using the vehicle's computer system, perceived environment data perceived from the sensor, as claimed. Final Act. 3 (citing, e.g., Bageshwar ¶ 14 (disclosing that autonomous vehicle 100 includes image sensors 104, for example, "cameras," which "provide data regarding landmarks in the environment around the vehicle 100 to processing functionality 102"), Fig. 1A). The Examiner further finds that Bageshwar discloses detecting, using the vehicle's computer system, a difference between perceived environment data and ground truth data, because Bageshwar discloses comparing an estimated position to a measured position. Id. at 4 (citing Bageshwar ¶¶ 18, 20).
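To make the mechanics of the claimed comparison concrete, the detect-and-adjust steps of claim 1 can be sketched as a simple feedback loop. The following is an illustrative sketch only, not code from Bageshwar, Kulik, or the application on appeal; every name, value, and the proportional-adjustment scheme is hypothetical.

```python
# Illustrative sketch only -- not code from the references or the
# application on appeal. All names and numbers are hypothetical.
# It models the steps recited in claim 1: receive ground truth data,
# obtain perceived environment data, detect the difference, and adjust
# a sensor parameter (here, an orientation offset) to reduce it.

def detect_difference(perceived_position, ground_truth_position):
    """Difference between where the sensor says a landmark is and
    where a reliable source says it actually is (one axis, meters)."""
    return perceived_position - ground_truth_position

def adjust_parameter(orientation_offset, difference, gain=0.5):
    """Nudge the stored orientation correction to shrink the error,
    loosely in the spirit of saved correction/calibration factors."""
    return orientation_offset - gain * difference

# One run of the hypothetical method with made-up numbers:
ground_truth = 10.0          # landmark position from a reliable source
orientation_offset = 0.0     # stored correction applied to raw readings
raw_reading = 10.8           # what the camera/LIDAR/RADAR reports

for _ in range(5):
    perceived = raw_reading + orientation_offset
    difference = detect_difference(perceived, ground_truth)
    orientation_offset = adjust_parameter(orientation_offset, difference)

# After a few iterations the residual error is small:
print(abs(raw_reading + orientation_offset - ground_truth) < 0.1)  # True
```

On this toy model, repeatedly subtracting a fraction of the detected difference drives the perceived position toward the ground truth value, which is the sense in which adjusting a sensor parameter "reduce[s] the difference" recited in the claim.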
The Examiner determines that Bageshwar does not specifically disclose adjusting, using the vehicle's computer system, a parameter of the sensor (including latency, position, or orientation relative to the vehicle) to reduce a difference between perceived environment and ground truth data, as claimed, and relies on Kulik for disclosing that, similar to Bageshwar, "orientation processor 250 can determine the difference between measured and calculated parameters." Final Act. 4 (citing Kulik ¶¶ 60, 61); see also id. ¶ 60 ("orientation processor 250 ... can save the differences as correction or calibration factors to the orientation sensor measurements"). The Examiner further relies on Kulik for teaching the desirability of calibrating sensors, as a "well known ... valuable engineering practice." Ans. 3 (citing Kulik ¶ 3). The Examiner reasons that it would have been obvious "to modify the teachings of Bageshwar ... to include the features of Kulik ... to determine sensor correction" (Final Act. 4), and more specifically, "to modify the sensor ... of Bageshwar ... to include the calibration/correction features of Kuli[k] ... to provide the user with the most accurate sensor readings of vehicle conditions deemed as valuable information for the user to make appropriate counteractions in response to the sensed information" (Ans. 4).

Appellants argue that the Examiner erred because "Bageshwar in view of Kulik does not teach 'adjusting ... one or more parameters of the sensor' wherein 'the sensor is a RADAR unit, a LIDAR unit, or a camera.'" Appeal Br. 4. In support, Appellants submit that Bageshwar's "image sensors 104 are not the sensors that need correction in Bageshwar," but rather, "what needs correction in Bageshwar is the 'kinematic state' of the vehicle." Id. at 5 (citing Bageshwar ¶ 17); see also Reply Br. 1-2.
Appellants further submit that, in Bageshwar, "image sensors 104 provide the 'ground truth data' (measurements of the positions of landmarks in the vehicle's frame) that is used to correct errors" and thus, "image sensors 104 do not need correction." Id. at 6. Appellants conclude that "because the image sensors 104 provide the 'ground truth data' that is used to correct errors, Bageshwar teaches away from adjusting one or more parameters of the image sensors 104." Id.

We are not persuaded by Appellants' argument, which fails to address the rejection as articulated by the Examiner. The Examiner proposes implementing a calibration step within Bageshwar's method, whereby sensors 104 (i.e., a camera) are calibrated, as taught by Kulik, regardless of whether the measurements from Bageshwar's sensors 104 are further used to perform error corrections. Appellants have not provided, nor can we find, support in Bageshwar that discourages calibration of sensors 104 prior to implementing the processing functionality to further calculate errors. Prior art does not teach away from claimed subject matter merely by disclosing a different solution to a similar problem unless the prior art also criticizes, discredits, or otherwise discourages the solution claimed. See In re Fulton, 391 F.3d 1195, 1201 (Fed. Cir. 2004).

We also disagree with Appellants' characterization of sensors 104 (i.e., a camera) as "ground truth sensors" during a calibration step,3 which, by definition, compares the perceived environment data, as perceived by sensors 104, to a standard (or data that is from a reliable data source) to ascertain a deviation, although the same sensor (i.e., a camera) may provide ground truth data after calibration (i.e., after the sensor is confirmed to provide reliable data of an actual state of the vehicle in the environment). In
In 3 An ordinary meaning of the term "calibrate" is "to standardize (as a measuring instrument) by determining the deviation from standard esp. so as to ascertain the proper correction factors." WEBSTER'S THIRD NEW INT'L DICTIONARY 316 (1993). 6 Appeal2017-006440 Application 13/585,432 other words, during Kulik's calibration step, Bageshwar's sensors 104 provide perceived data that is compared against a standard (i.e., ground truth data). Appellants also argue that "Kulik does not provide any motivation to correct the image sensors 104 of Bageshwar." Appeal Br. 6. In support, Appellants contend that Kulik's teachings regarding calibration or correction of orientation sensors do not correspond to the claimed sensor (i.e., a camera), but rather, to accelerometers, gyroscopes, and compasses. Id. at 6- 7; see also Reply Br. 2-3. Again, however, Appellants' argument does not address the rejection as articulated by the Examiner, which relies on Kulik's teaching regarding the desirability of calibrating and correcting sensors, as applied to Bageshwar' s sensors 104, which expressly includes a camera as a sensor, as claimed. Appellants, for the first time in the Reply Brief, more specifically argue that "Bageshwar in view of Kulik does not teach 'adjusting ... one or more parameters of the sensor ... wherein the one or more parameters include a latency of the sensor, a position of the sensor, or an orientation of the sensor relative to the vehicle." Reply Br. 1 (emphasis added). In accordance with our rules, we do not consider such arguments unless good cause is shown. See 37 C.F.R. § 41.41(b)(2). Notwithstanding, Appellants do not provide any additional argument or evidence with respect to this statement of the claim limitation, and as such, Appellants' argument does not constitute separate argument for patentability of the claims pursuant to 37 C.F.R. § 41.37(c)(l)(iv). See In re Lovin, 652 F.3d 1349, 1357 (Fed. Cir. 
2011) (holding that the Board had reasonably interpreted 37 C.F.R. § 41.37(c)(1)(vii) as requiring "more substantive arguments in an appeal brief than a mere recitation of the claim elements and a naked assertion that the corresponding elements were not found in the prior art"). Moreover, as evidenced by Sharma,4 it is well known that position and orientation (including tilt) are adjustable parameters of Bageshwar's sensors 104 (i.e., camera). See Sharma Abstract (referencing "vision sensor[] ... tilt parameters"), ¶ 3 (disclosing that "[t]raditional camera calibration parameters consist of intrinsic and extrinsic parameters," wherein "[e]xtrinsic parameters relate the camera to other world coordinate systems"), ¶ 4 (referencing "the camera's orientation and mounting parameters").

Finally, Appellants argue that the Examiner's determination that "'adding the feature of calibration/correction to the sensor is well known to be a valuable engineering practice' is completely unsubstantiated" and "cannot be used to support an obviousness rejection." Reply Br. 3-4 (citing In re Zurko, 258 F.3d 1379, 1385 (Fed. Cir. 2001) (holding that an assessment of basic knowledge and common sense that is not based on any evidence in the record lacks substantial evidentiary support)). Unlike in Zurko, however, the Examiner provides support for the determination that the calibration and correction of sensors is a well-known engineering practice by quoting Kulik: "even if the device has been calibrated in the factory, additional calibration, user or automatic, is useful and sometimes even unavoidable." Ans. 3 (quoting Kulik ¶ 3). Moreover,

4 US 2010/0235129 A1; published Sept. 16, 2010. Sharma is cited in the record. See Appellants' Information Disclosure Statement filed Aug. 14, 2012.
in further support of the Examiner's finding, as Official Notice of a fact capable of instant and unquestionable demonstration as being well-known, Sharma discloses that "[a]utonomous vehicles need accurate and reliable obstacle detection solutions to aid in navigation and obstacle avoidance" and "[t]ypically sensors such as vision sensors5 ... are utilized to detect obstacles in an autonomous vehicle's path." Sharma ¶ 1. Sharma continues that "[the combining of sensory data] can give inaccurate outputs if the sensors are not accurately registered (calibrated)" and therefore, "[a]s a common practice, registration is done." Id. ¶ 2 (emphasis added). Put another way, Appellants' claim 1 generally recites the well-known method of calibrating (by definition) the sensor of an autonomous vehicle by adjusting a parameter of the sensor. The specific type of sensor, and its measured (or perceived) data and parameter(s), as claimed, are also known, as discussed supra. Thus, we are not apprised of error in the Examiner's findings or reasoning.

Accordingly, we sustain the Examiner's rejection of independent claim 1, and claims 8 and 15 fall with claim 1. Appellants chose not to present separate arguments for the patentability of the dependent claims, and therefore, for the same reasons supra, we also sustain the Examiner's rejection of dependent claims 3-5, 7, 10-12, 14, and 17-19. Appeal Br. 8; Reply Br. 7.

5 Sharma discloses that a camera is a vision sensor. See, e.g., Sharma ¶ 18 ("Sensors 110 are mounted on the vehicle 100 .... [S]ensors 110 consist of vision sensors (for example, a monocular camera)").

DECISION

The Examiner's decision rejecting claims 1, 3-5, 7, 8, 10-12, 14, 15, and 17-19 is AFFIRMED.

No time period for taking any subsequent action in connection with this appeal may be extended under 37 C.F.R. § 1.136(a)(1)(iv).
AFFIRMED