UNITED STATES PATENT AND TRADEMARK OFFICE
UNITED STATES DEPARTMENT OF COMMERCE
United States Patent and Trademark Office
Address: COMMISSIONER FOR PATENTS, P.O. Box 1450, Alexandria, Virginia 22313-1450, www.uspto.gov

APPLICATION NO.: 15/114,860
FILING DATE: 07/28/2016
FIRST NAMED INVENTOR: Anna-Katrina Shedletsky
ATTORNEY DOCKET NO.: P20130US1
CONFIRMATION NO.: 2048

145730 7590 03/01/2021
Dorsey & Whitney LLP/Apple Inc.
111 South Main St
Salt Lake City, UT 84111

EXAMINER: PARKER, JEFFREY ALAN
ART UNIT: 2625
NOTIFICATION DATE: 03/01/2021
DELIVERY MODE: ELECTRONIC

Please find below and/or attached an Office communication concerning this application or proceeding. The time period for reply, if any, is set in the attached communication. Notice of the Office communication was sent electronically on the above-indicated "Notification Date" to the following e-mail address(es): DWAppleteam@dorsey.com; apip.docket@dorsey.com

PTOL-90A (Rev. 04/07)

UNITED STATES PATENT AND TRADEMARK OFFICE
BEFORE THE PATENT TRIAL AND APPEAL BOARD

Ex parte ANNA-KATRINA SHEDLETSKY, ERIK D. DE JONG, FLETCHER R. ROTHKOPF, and SAMUEL B. WEISS

Appeal 2020-006077
Application 15/114,860
Technology Center 2600

Before MAHSHID D. SAADAT, ELENI MANTIS MERCADER, and BETH Z. SHAW, Administrative Patent Judges.

SHAW, Administrative Patent Judge.

DECISION ON APPEAL

STATEMENT OF THE CASE

Pursuant to 35 U.S.C. § 134(a), Appellant[1] appeals from the Examiner's decision to reject claims 1–21. See Final Act. 1. We have jurisdiction under 35 U.S.C. § 6(b). We conducted a telephonic hearing on February 18, 2021.

We AFFIRM.

[1] We use the word Appellant to refer to "applicant" as defined in 37 C.F.R. § 1.42. Appellant identifies the real party in interest as Apple Inc. Appeal Br. 3.

CLAIMED SUBJECT MATTER

The claims are directed to an automatic orientation of a device, and in particular, displaying a user interface on a wearable electronic device based on a determined orientation of the electronic device. Spec. ¶ 1. Claims 1, 9, 16, and 21, reproduced below, are illustrative of the claimed subject matter:

1. A method for presenting a user interface on a wearable electronic device, the method comprising:
receiving touch input from a user via at least one sensor coupled to the wearable electronic device, the at least one sensor comprising a touch sensor, the touch input received by the touch sensor indicating a direction from which the input is provided to the user interface;
determining, based on only the touch input from the at least one sensor, an orientation of the wearable electronic device with respect to an object to which the wearable electronic device is attached; and
displaying a user interface on a display of the wearable electronic device, wherein the user interface is displayed in a first orientation based, at least in part, on the determined orientation of the wearable electronic device.
9. A method for presenting a user interface on a wearable electronic device, the method comprising:
receiving input from a first sensor coupled to the wearable electronic device, the first sensor being configured to determine a direction of blood flow of the user;
determining, based on input from the first sensor, an orientation of the wearable electronic device comprising at least whether the wearable electronic device is on an inside of a wrist of the wearer or on an outside of a wrist of the wearer;
determining, based on input from a second sensor and the orientation of the wearable electronic device, whether a display of the wearable electronic device is in a field of view of a wearer of the wearable electronic device, wherein the display is in the field of view of the wearer when the indicated front or rear side of the wrist of the wearer faces the wearer;
when it is determined that the display of the wearable electronic device is not in a field of view of the wearer of the wearable electronic device, causing the display to enter a standby mode; and
when it is determined that the display of the wearable electronic device is in a field of the view of the wearer of the wearable electronic device, displaying a user interface on the display of the wearable electronic device, wherein the user interface is displayed in a first orientation based, at least in part, on the determined orientation of the wearable electronic device.

16. A device comprising:
a first sensor, the first sensor being configured to detect a styloid;
a second sensor;
at least one processor; and
a memory coupled to the at least one processor, the memory for storing instructions which, when executed by the at least one processor performs a method for presenting a user interface on a wearable electronic device, the method comprising:
receiving input from the first sensor indicating whether a display of the device is located on a front side or a rear side of a wrist of a wearer of the wearable electronic device;
determining, based on input from the second sensor, an orientation of the wearable device;
determining, based on the input from the first and second sensors, whether the display is in a field of view of the wearer of the wearable electronic device, wherein the display is in the field of view of the wearer when the indicated front or rear side of the wrist of the wearer faces the wearer; and
when it is determined that the display of the wearable electronic device is not in a field of view of the wearer of the wearable electronic device, causing the display to enter a standby mode; and
when it is determined that the display of the wearable electronic device is in a field of the view of the wearable electronic device, displaying a user interface on the display of the wearable electronic device, wherein the user interface is displayed in a first orientation based, at least in part, on the determined orientation of the wearable electronic device.

21. The method of claim 1, wherein the touch input is provided by a finger contacting the wearable electronic device, the direction from which the touch input is provided being based on at least one of: a shape of a contact area between the finger and the wearable electronic device, an angle of the finger relative to the wearable electronic device, a capacitance of the finger, and fingerprint information.
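To make the recited flow of claim 1 easier to follow — a touch whose direction is sensed, a device orientation inferred from that direction alone, and a user interface drawn in an orientation chosen from that inference — the following Swift sketch walks through the three steps. It is a minimal illustration only: the type names, the crown-left/crown-right convention, and the particular direction-to-orientation mapping are assumptions made for this sketch and are not taken from Appellant's Specification.

```swift
// Minimal sketch of the claim 1 flow; all names and mappings are hypothetical.
enum InputDirection { case fromLeft, fromRight, fromTop, fromBottom }
enum DeviceOrientation { case crownLeft, crownRight }   // assumed orientation states
enum UIOrientation { case upright, rotated180 }

// Step 2: infer the device's orientation relative to the wrist it is attached to,
// based only on the direction from which the touch input was provided.
func deviceOrientation(from touch: InputDirection) -> DeviceOrientation {
    switch touch {
    case .fromRight, .fromTop:   return .crownRight
    case .fromLeft, .fromBottom: return .crownLeft
    }
}

// Step 3: choose the orientation in which the user interface is displayed,
// based, at least in part, on the determined device orientation.
func uiOrientation(for device: DeviceOrientation) -> UIOrientation {
    switch device {
    case .crownRight: return .upright
    case .crownLeft:  return .rotated180
    }
}

// Step 1 is represented by the sensed direction handed in here.
let touch = InputDirection.fromRight
print(uiOrientation(for: deviceOrientation(from: touch)))   // prints "upright"
```

The point of the sketch is only that the claimed determination consumes a direction of input and produces an orientation of the device, which in turn drives the orientation of the displayed interface; how a real sensor derives that direction is the subject of the disputes discussed below.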
REFERENCES

The prior art relied upon by the Examiner is:

Name      Reference             Date
Tran      US 2013/0172691 A1    July 4, 2013
Winkler   US 2013/0222270 A1    Aug. 29, 2013
Mistry    US 2014/0143785 A1    May 22, 2014
Major     US 2015/0138099 A1    May 21, 2015
Raffa     US 2015/0185874 A1    July 2, 2015

REJECTIONS

Claims 1–8 and 21 are rejected under 35 U.S.C. § 103 as being unpatentable over Winkler and Major. Final Act. 3–5.

Claims 9–15 are rejected under 35 U.S.C. § 103 as being unpatentable over Winkler, Raffa, and Tran. Final Act. 5–8.

Claims 16–20 are rejected under 35 U.S.C. § 103 as being unpatentable over Winkler, Raffa, and Mistry. Final Act. 8–11.

OPINION

Independent Claim 1

Independent claim 1 recites, in part, "determining, based on only the touch input from the at least one sensor, an orientation of the wearable electronic device with respect to an object to which the wearable electronic device is attached." On this record, we see no error in the Examiner's reliance on Winkler and Major for collectively at least suggesting this recited determining step.

As shown in Winkler's paragraph 113, Winkler teaches that the user's touch input, along with other information, can be used to change the presentation of information on a display of a device. Ans. 13 (citing Winkler ¶ 113). Winkler also teaches altering a presentation orientation of visual output. Winkler ¶¶ 42, 43 (emphasis added). Major teaches that its mobile device, which includes a touch sensor, can be configured to receive "orientation data from the touch sensor." Major ¶ 8; Ans. 14. In some embodiments, the mobile device of Major may be "configured to change the orientation of the view." Major ¶ 37. "Alternatively or additionally," Major states, "the first view may be rotated to the second view based on orientation data received from user inputs, such as to a touch sensor." Id. Major also explains that "[i]n some embodiments, some or all of the orientation data may alternatively or additionally be determined based on user inputs, such as touch inputs generated by touch sensor 210." Id. ¶ 50.

Appellant argues "the Examiner appears to be conflating an orientation of displayed content with an orientation of the physical device itself." Appeal Br. 8. Appellant argues that Major's "orientation data" is not the same type of "orientation" that is claimed, because the term "orientation" in "Major appears to be used in the context of determining a new view to be presented to the user for the user's movement through a virtual environment, not in determining an orientation of the mobile electronic device with respect to an object to which the mobile device is attached." Appeal Br. 9.

Although Appellant posits that Major's "orientation data" is strictly limited to only orientation "as viewed" by a player or participant within a virtual environment, we disagree that Major's teachings are so limited. Major does not limit its discussion of "orientation data" in this way, nor does Appellant identify any such disclaimer or explicit limitation of the "orientation data" in Major. Appellant's arguments regarding Major's alleged shortcomings in this regard (Appeal Br. 8–10; Reply Br. 5–6) are, therefore, unavailing and not commensurate with the scope of the claim, which does not preclude Major's relied-upon functionality.
Claim 1 additionally recites "receiving touch input from a user via at least one sensor coupled to the wearable electronic device, the at least one sensor comprising a touch sensor, the touch input received by the touch sensor indicating a direction from which the input is provided to the user interface." Claims App. (emphasis added). On this record, we see no error in the Examiner's reliance on Winkler and Major for collectively at least suggesting this recited receiving step.

Major teaches, in part, "in response to the user touching a swiping leftwards on a left portion of the graphical interface or view, the view of the virtual environment may be rotated a variable amount to the left, such as by an amount that depends on the location, speed and/or duration of the touch or swipe as indicated by the orientation data." Major ¶ 103; Ans. 16. As the Examiner explains, "Major detects a touch input that swipes RIGHT to LEFT, which necessarily contains the information that the input was provided from the RIGHT direction." Ans. 16.

Although Appellant directs us to paragraph 32 of Appellant's Specification (Appeal Br. 11; Reply Br. 7) as an example of how to identify the direction from which the input is provided, the claim does not limit how such input is provided. Appellant's arguments regarding Major's alleged shortcomings in this regard (Appeal Br. 10–11; Reply Br. 8) are, therefore, unavailing and not commensurate with the scope of the claim, which does not preclude Major's relied-upon functionality.

Therefore, we are not persuaded that the Examiner erred in rejecting claim 1, and claims 2–8 not argued separately with particularity.

Dependent Claim 21

Dependent claim 21 depends from claim 1 and recites "wherein the touch input is provided by a finger contacting the wearable electronic device, the direction from which the touch input is provided being based on at least one of: a shape of a contact area between the finger and the wearable electronic device, an angle of the finger relative to the wearable electronic device, a capacitance of the finger, and fingerprint information." Claims App. (emphasis added).

As the Examiner explains, "Major detects a touch input that swipes RIGHT to LEFT, which necessarily contains the information that the input was provided from the RIGHT direction." Ans. 18 (citing Major ¶ 103). In particular, Major teaches that "in response to the user touching a swiping leftwards on a left portion of the graphical interface or view, the view of the virtual environment may be rotated a variable amount to the left, such as by an amount that depends on the location, speed and/or duration of the touch or swipe as indicated by the orientation data." Major ¶ 103. As the Examiner explains, a swipe input from right to left, as shown in Major, contains the shape of the contact area that touches the display. Ans. 18.

Although Major does not state explicitly that the direction of the leftwards swipe is based on a shape of a contact area between the finger and the wearable electronic device, we nonetheless see no error in the Examiner's reliance on Winkler and Major for collectively at least suggesting this feature, particularly given Major's teaching of using the location, speed, or duration of the touch or swipe in a leftwards direction, as disclosed in paragraph 103.

Therefore, we are not persuaded that the Examiner erred in rejecting dependent claim 21.
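The Examiner's reasoning — that a right-to-left swipe necessarily carries the information that the input arrived from the right — can be pictured with a short Swift sketch. Everything in it is an illustrative assumption: the Point type, the coordinate convention (x increasing to the right, y increasing downward), and the tie-breaking rule are not drawn from Major, Winkler, or Appellant's Specification, and claim 21's other recited signals (contact-area shape, finger angle, capacitance, fingerprint) are not modeled here.

```swift
// Illustrative only: infer the side a swipe "comes from" using its start and end points.
// Coordinate convention assumed: x grows to the right, y grows downward.
struct Point { var x: Double; var y: Double }

enum InputDirection { case fromLeft, fromRight, fromTop, fromBottom }

func direction(ofSwipeFrom start: Point, to end: Point) -> InputDirection {
    let dx = end.x - start.x
    let dy = end.y - start.y
    if abs(dx) >= abs(dy) {
        // A right-to-left swipe (dx < 0) is treated as input provided from the right.
        return dx < 0 ? .fromRight : .fromLeft
    } else {
        // An upward swipe (dy < 0) begins at the bottom, so it is treated as input from the bottom.
        return dy < 0 ? .fromBottom : .fromTop
    }
}

// A swipe that starts at x = 200 and ends at x = 40 moved right to left.
print(direction(ofSwipeFrom: Point(x: 200, y: 80), to: Point(x: 40, y: 80)))  // prints "fromRight"
```

In principle the same output could be derived from other per-touch features, such as the elongation of the contact area recited in claim 21; which signals the claim language actually requires is the legal question addressed above.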
Independent Claim 9

Claim 9 recites, in part, "A method for presenting a user interface on a wearable electronic device, the method comprising: receiving input from a first sensor coupled to the wearable electronic device, the first sensor being configured to determine a direction of blood flow of the user." Claims App. (emphasis added). On this record, we see no error in the Examiner's reliance on Winkler, Raffa, and Tran for collectively at least suggesting the claimed first sensor being configured to determine a direction of blood flow of the user. As explained in paragraph 234 of Tran:

The magnet provides a magnetic field, and one or more electrodes similar to electrode 1383 are positioned on the wrist band to measure voltage drops which are proportional to the blood velocity. The electromagnetic embodiment may be mounted on the upper arm of the patient, on the ankle or on the neck where peripheral blood vessels pass through and their blood velocity may be measured with minimal interruptions. The flexible magnet produces a pseudo-uniform (non-gradient) magnetic field. The magnetic field can be normal to the blood flow direction when wrist band 1374 is mounted on the user's wrist or may be a rotative pseudo-uniform magnetic field so that the magnetic field is in a transversal direction in respect to the blood flow direction . . . .

Tran ¶ 234.

As the Examiner explains, according to paragraph 234, Tran's system is at the very least aware of the connection between the direction of the magnet's magnetic field and the blood flow direction. Ans. 19. "[V]elocity is detected by a voltage. Velocity has direction as well as magnitude and voltage can be both raised and lowered to determine the direction of the blood flow." Id.

Appellant argues that Tran merely discloses the orientation of the magnetic field relative to the blood flow direction, meaning that the direction of blood flow has already been determined. Appeal Br. 13–14; Reply Br. 9–10. We see no error in the Examiner's reliance on Tran for at least suggesting "the first sensor being configured to determine a direction of blood flow of the user." Appellant provides insufficient evidence proving that the Specification or claims limit the "first sensor" in a way that, under a broad but reasonable interpretation, is not encompassed by Tran's teachings of the electromagnetic embodiment in paragraph 234. The electromagnetic functionality of Tran's system at least suggests the recited first sensor being configured to determine a direction of blood flow of the user because it determines blood velocity, which has direction.

Therefore, we are not persuaded that the Examiner erred in rejecting independent claim 9, and claims 10–15 not argued separately with particularity.
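To make the chain recited in claim 9 concrete — blood-flow direction to wrist side, wrist side plus a second sensor to field of view, and field of view to standby or display — the following Swift sketch traces the steps. Every name and mapping in it (the toward-hand/toward-elbow flow directions, the inference of inside versus outside of the wrist, and the wrist-raised flag standing in for the second sensor's contribution) is a hypothetical assumption made for illustration; it is not Tran's electromagnetic sensor, Appellant's Specification, or the claimed device's actual behavior.

```swift
// Hypothetical sketch of the claim 9 decision chain; all mappings are assumptions.
enum BloodFlowDirection { case towardHand, towardElbow }
enum WristSide { case insideOfWrist, outsideOfWrist }
enum DisplayState { case standby, showingUI(rotated: Bool) }

// Assumed mapping from the sensed direction of blood flow past the sensor
// to whether the device sits on the inside or the outside of the wrist.
func wristSide(from flow: BloodFlowDirection) -> WristSide {
    switch flow {
    case .towardHand:  return .outsideOfWrist
    case .towardElbow: return .insideOfWrist
    }
}

// Combine the inferred wrist side with a second-sensor reading (here reduced to a
// simple "wrist raised toward the face" flag) to gate standby versus display.
func displayState(flow: BloodFlowDirection, wristRaisedTowardFace: Bool) -> DisplayState {
    let side = wristSide(from: flow)
    guard wristRaisedTowardFace else {
        return .standby   // display not in the wearer's field of view: enter standby
    }
    // Display is in the field of view: show the UI, rotating it when the device
    // is inferred to sit on the inside of the wrist.
    return .showingUI(rotated: side == .insideOfWrist)
}

print(displayState(flow: .towardElbow, wristRaisedTowardFace: true))  // prints "showingUI(rotated: true)"
```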
Independent Claim 16

Claim 16 recites, in part, "A device comprising: a first sensor, the first sensor being configured to detect a styloid." We note that the disputed "styloid" recited in the first line of claim 16 is not again referenced in claim 16. The Examiner determines that the claimed "styloid" is "comprehensive enough to be read on by a stylus" as taught by Mistry because a stylus is a type of styloid. Ans. 20. On this record, we are not persuaded of error in the Examiner's conclusion.

Appellant provides insufficient evidence proving that the Specification or claims limit "styloid" in a way that, under a broad but reasonable interpretation, is not encompassed by Mistry's teachings of a stylus. Although Appellant points to paragraph 26 of the Specification as referencing a "styloid" in the "wrist" of an individual (Appeal Br. 16), Appellant also acknowledges that paragraph 64 of the Specification refers to a "stylus" (Reply Br. 11). Indeed, paragraph 64 of the Specification notes that an "input device 1035" of a wearable electronic device 1000 may include a "stylus." Spec. ¶ 64; Fig. 10. Appellant attempts to distinguish the "stylus" from the "styloid" recited in the claim because paragraph 64 in Appellant's Specification "does not correlate input via the stylus pen with an orientation of the wearable electronic device." Reply Br. 11. Yet, Appellant's arguments are not persuasive because they are not commensurate with the scope of the claim, which, as discussed above, does not reference the "styloid" again in claim 16, let alone recite the "styloid" as being used to correlate input.

Appellant similarly argues that the "Examiner has not indicated how the detection of an input from a stylus pen, such as taught by Mistry, would enable the sensor which detects such input to indicate whether a display of the device is located on a front side or a rear side of a wrist of a wearer of the wearable electronic device." Appeal Br. 17. This argument, however, is not commensurate in scope with claim 16, which does not recite that the styloid does anything, let alone that it is used to "enable the sensor" in this way.

We also note that the Examiner has pointed out that:

In the Response After Final of 1/3/2020, Applicant proposed to amend the claim language to "the first sensor being configured to detect a styloid of a wearer's wrist," however, that amendment was not admitted as noted in the Advisory Action of 1/31/2020. An amendment such as that would make the claim clear on the record and would overcome Mystry. However, a new search would need to be conducted.

Ans. 20.

For these reasons, we are not persuaded that the Examiner erred in rejecting independent claim 16, and claims 17–20 not argued separately with particularity.

CONCLUSION

The Examiner's rejections are affirmed. In particular, we affirm the rejection of claims 1–8 and 21 under 35 U.S.C. § 103 as being unpatentable over Winkler and Major. We affirm the rejection of claims 9–15 under 35 U.S.C. § 103 as being unpatentable over Winkler, Raffa, and Tran. We affirm the rejection of claims 16–20 under 35 U.S.C. § 103 as being unpatentable over Winkler, Raffa, and Mistry.

DECISION SUMMARY

Claims Rejected   35 U.S.C. §   Reference(s)/Basis       Affirmed   Reversed
1–8, 21           103           Winkler, Major           1–8, 21
9–15              103           Winkler, Raffa, Tran     9–15
16–20             103           Winkler, Raffa, Mistry   16–20
Overall Outcome                                          1–21

TIME PERIOD FOR RESPONSE

No time period for taking any subsequent action in connection with this appeal may be extended under 37 C.F.R. § 1.136(a). See 37 C.F.R. § 1.136(a)(1)(iv).

AFFIRMED