34 patents
Utility
Optics for vehicle occupant monitoring systems
2 Jan 24
A focusing optical part, including a plastic body, suitable for being delivered on a tape and reel and mounted on a PCB by an automated mounting machine, the plastic body including a concave mirror including a center aperture input surface through which light enters the plastic body, a convex mirror opposite the center aperture, wherein the concave mirror and the convex mirror form a reflective objective that reflects and focuses the light inside the plastic body, and an exit surface surrounding the convex mirror, through which focused light exits the plastic body.
Björn Alexander Jubner, Lars Bertil Sparf, Robert Sven Pettersson, Hans Anders Jansson
Filed: 23 Jan 19
Utility
Contactless touch input system
12 Dec 23
A proximity sensor, including light emitters and light detectors mounted on a circuit board, two stacked lenses, positioned above the emitters and the detectors, including an extruded cylindrical lens and a Fresnel lens array, wherein each emitter projects light through the two lenses along a common projection plane, wherein a reflective object located in the projection plane reflects light from one or more emitters to one or more detectors, and wherein each emitter-detector pair, when synchronously activated, generates a greatest detection signal at the activated detector when the reflective object is located at a specific 2D location in the projection plane corresponding to the emitter-detector pair, and a processor sequentially activating the emitters and synchronously co-activating one or more detectors, and identifying a location of the object in the projection plane, based on amounts of light detected by the detector of each synchronously activated emitter-detector pair.
Stefan Johannes Holmgren, Jan Tomas Hartman, Tom Richard Berglind, Lars Bertil Sparf, Jonas Daniel Justus Hjelm, John Elis Gösta Karlsson, Per Carl Sture Rosengren, Gunnar Martin Fröjdh, Joseph Shain, Xiatao Wang, Clarence Ray King, III, Oscar Ritzén Praglowski de Radwan
Filed: 30 Dec 20
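The sequential emitter-detector scan described in the abstract above can be sketched as follows; the hotspot table, signal values, and weighted-average location estimate are illustrative assumptions, not taken from the patent.

```python
# Sketch of locating a reflective object from per-pair detection signals.
# Each emitter-detector pair has an associated 2D "hotspot" where a
# reflector produces its greatest detection signal (hypothetical model).

def locate_object(signals, hotspots):
    """Estimate the object's 2D location as a signal-weighted average
    of the hotspot positions of the emitter-detector pairs.

    signals  -- dict mapping (emitter, detector) -> detected light amount
    hotspots -- dict mapping (emitter, detector) -> (x, y) hotspot position
    """
    total = sum(signals.values())
    if total == 0:
        return None  # no reflection detected anywhere
    x = sum(s * hotspots[pair][0] for pair, s in signals.items()) / total
    y = sum(s * hotspots[pair][1] for pair, s in signals.items()) / total
    return (x, y)

# Example: the object is nearest the hotspot of pair (1, 1)
hotspots = {(0, 0): (0.0, 1.0), (1, 1): (1.0, 1.0), (2, 2): (2.0, 1.0)}
signals = {(0, 0): 10, (1, 1): 80, (2, 2): 10}
print(locate_object(signals, hotspots))  # (1.0, 1.0)
```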
Utility
Optical Proximity Sensors
30 Nov 23
An optical method for identifying locations of objects in a plane, including serially projecting light beams along a detection area, from a plurality of locations along an edge of the detection area, whereby a reflective object inserted into the detection area reflects the projected light beams, directing the reflections of the projected light beams arriving at the edge of the detection area onto a plurality of light detectors, in a manner that maximizes amounts of reflected light arriving at the detectors when the light arrives at a particular angle in relation to the edge, and calculating two-dimensional coordinates of the inserted object in the detection area based on the particular angle and the outputs of the detectors.
Stefan Johannes Holmgren, Sairam Iyer, Tom Richard Berglind, Karl Erik Patrik Nordström, Lars Bertil Sparf, Per Carl Sture Rosengren, Erik Anders Claes Rosengren, John Elis Gösta Karlsson, Björn Thomas Eriksson, Björn Alexander Jubner, Remo Behdasht, Simon Greger Fellin, Robin Kjell Åman, Joseph Shain
Filed: 2 Jul 23
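The angle-based calculation in the abstract above can be sketched as a simple intersection; the geometry here (beams projected perpendicular to the edge, a single known reception angle) is an illustrative assumption.

```python
# Sketch of recovering a 2D object location from the projecting
# emitter's position and the angle at which the reflection arrives
# back at the edge. Geometry is a hypothetical simplification.
import math

def object_location(emitter_x, detector_x, angle_deg):
    """Intersect the emitter's outgoing beam (assumed perpendicular to
    the edge, at x = emitter_x) with the reflected ray that reaches the
    detector at the known angle relative to the edge (y = 0)."""
    slope = math.tan(math.radians(angle_deg))
    y = slope * (emitter_x - detector_x)
    return (emitter_x, y)

# A reflection arriving at 45 degrees, two units along the edge from
# the emitter, places the object two units into the detection area.
print(object_location(emitter_x=3.0, detector_x=1.0, angle_deg=45.0))
```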
Utility
User Interface
30 Nov 23
A method for interacting with controls in a graphical user interface (GUI), including recording user interface gestures performed by a user, for each recorded gesture: when the gesture includes the user virtually touching a specific GUI control, applying the gesture to the specific GUI control; and when the gesture is performed without the user virtually touching a specific GUI control, identifying a particular GUI control that the user is gazing at and applying the gesture to that particular GUI control.
Björn Thomas Eriksson, Sven Robert Pettersson, Stefan Johannes Holmgren, Xiatao Wang, Rozita Teymourzadeh, Per Erik Lindström, Emil Anders Braide, Jonas Daniel Justus Hjelm, Erik Anders Claes Rosengren
Filed: 15 Aug 23
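The routing rule in the abstract above, touch target first, gaze target as fallback, can be sketched minimally; the control names are hypothetical.

```python
# Sketch of the gesture-routing rule described above: a gesture that
# virtually touches a control applies to that control; otherwise it is
# applied to the control the user is gazing at.

def route_gesture(touched_control, gazed_control):
    """Return the GUI control a recorded gesture should be applied to."""
    if touched_control is not None:
        return touched_control  # virtual touch targets that control
    return gazed_control        # otherwise fall back to the gaze target

print(route_gesture("volume_slider", "map"))  # volume_slider
print(route_gesture(None, "map"))             # map
```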
Utility
Optical Touch Sensor
12 Oct 23
A method for detecting locations of objects in a plane, including emitting light beams, one at a time, at locations along a first edge of a detection area, refracting each light beam into multiple divergent light beams, directing each of the divergent beams to arrive at a respective pair of focusing lenses mounted along a second edge of the detection area, opposite the first edge, wherein an intensity profile of each divergent beam has maximum intensity along the center of the beam, for each of the focusing lenses, measuring an intensity profile of that portion of the divergent beam that enters the focusing lens, for each pair of focusing lenses receiving a single divergent beam, comparing the measured intensity profiles for light arriving at each lens, and determining a location of an object that partially blocks at least one of the divergent beams, based on the comparisons.
Stefan Johannes Holmgren
Filed: 1 Jun 23
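The paired-lens comparison in the abstract above can be sketched as follows; the loss model (baseline minus measured intensity per lens, with the left/right loss ratio indicating where within the divergent beam the object sits) is an illustrative assumption.

```python
# Sketch of locating a partial blockage by comparing the intensity
# measured at each lens of a pair against its unblocked baseline.

def locate_blockage(baseline, measured):
    """baseline, measured -- lists of (left, right) intensity tuples,
    one tuple per pair of focusing lenses.

    Returns the index of the lens pair with the largest total light
    loss, and the fraction of that loss seen by the left lens, which
    indicates where within the divergent beam the object sits."""
    best = None
    best_loss = 0.0
    best_ratio = 0.5
    for i, ((bl, br), (ml, mr)) in enumerate(zip(baseline, measured)):
        loss_l, loss_r = bl - ml, br - mr
        loss = loss_l + loss_r
        if loss > best_loss:
            best, best_loss = i, loss
            best_ratio = loss_l / loss
    return best, best_ratio

# Pair 1 loses light, mostly at its left lens: the object sits toward
# the left side of that pair's divergent beam.
baseline = [(1.0, 1.0), (1.0, 1.0)]
measured = [(1.0, 1.0), (0.4, 0.8)]
print(locate_blockage(baseline, measured))
```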
Utility
Vehicle User Interface
12 Oct 23
A vehicle autonomous drive system including a steering wheel, a sensor operable to identify each gesture component within a set of gesture components performed on the steering wheel by a driver of the vehicle, the set of gesture components including thumb-tap, thumb touch-and-hold, thumb-glide, hand-grab and hand-tap, a processor for an autonomous drive system in the vehicle, receiving from the sensor, a series of time-stamped, contact coordinates for the gesture components identified by the sensor, and a non-transitory computer readable medium storing instructions thereon that, when executed by the processor, cause the processor to construct compound gestures based on the series of time-stamped, contact coordinates, and to activate features of the autonomous drive system in response to the compound gestures.
Björn Alexander Jubner, Björn Thomas Eriksson, Gunnar Martin Fröjdh, Simon Greger Fellin, Stefan Johannes Holmgren
Filed: 12 May 23
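Constructing compound gestures from time-stamped contact coordinates, as the abstract above describes, can be sketched like this; the component names follow the abstract, but the thresholds, 1-D positions along the grip, and the "double-tap" compound are illustrative assumptions.

```python
# Sketch of classifying gesture components from time-stamped contacts
# and combining them into a compound gesture. Thresholds are made up.

TAP_MAX_S = 0.3        # contacts shorter than this count as taps
GLIDE_MIN_DIST = 0.05  # normalized travel needed to count as a glide

def classify_component(events):
    """events: list of (timestamp, position) samples for one contact,
    with position simplified to a 1-D coordinate along the grip."""
    duration = events[-1][0] - events[0][0]
    travel = abs(events[-1][1] - events[0][1])
    if travel >= GLIDE_MIN_DIST:
        return "thumb-glide"
    return "thumb-tap" if duration < TAP_MAX_S else "thumb touch-and-hold"

def compound_gesture(contacts):
    """Map a sequence of contacts to a compound gesture name."""
    parts = tuple(classify_component(c) for c in contacts)
    if parts == ("thumb-tap", "thumb-tap"):
        return "double-tap"  # e.g. could activate an autonomous-drive feature
    return "+".join(parts)

taps = [[(0.0, 0.5), (0.1, 0.5)], [(0.4, 0.5), (0.5, 0.5)]]
print(compound_gesture(taps))  # double-tap
```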
Utility
User interface
10 Oct 23
A method for interacting with controls in a graphical user interface (GUI) having a plurality of GUI controls, the method including identifying one GUI control that a user is gazing at, from among a plurality of GUI controls presented on a display, detecting user interface gestures performed by the user, when the detected gesture is performed in an airspace away from the display, applying a relative motion corresponding to the gesture to the identified GUI control, and when the detected gesture is performed by the user touching the display, applying a sequence of absolute display locations touched by the gesture to a GUI control presented at the touched locations that is different than the identified GUI control.
Björn Thomas Eriksson, Sven Robert Pettersson, Stefan Johannes Holmgren, Xiatao Wang, Rozita Teymourzadeh, Per Erik Lindström, Emil Anders Braide, Jonas Daniel Justus Hjelm, Erik Anders Claes Rosengren
Filed: 15 Feb 21
Utility
Object detector based on reflected light
22 Aug 23
A sensor for a control panel, including a housing along an edge of the panel, light emitters projecting light along an in-air detection plane over the panel and detectors detecting reflections of the projected light, reflected by an object in the detection plane, lenses oriented such that each detector receives maximum light intensity when light enters a corresponding lens at a particular angle, whereby for each emitter-detector pair, when the object is located at a specific position in the detection plane, light emitted by the emitter of that pair is reflected by the object back through one of the lenses at the particular angle to the detector of that pair, the specific position being associated with that emitter-detector pair, and a processor configured to determine panel locations and to map each location to a position in the detection plane associated with an emitter-detector pair, thereby mapping the panel to the detection plane.
Stefan Johannes Holmgren, Sairam Iyer, Tom Richard Berglind, Karl Erik Patrik Nordström, Lars Bertil Sparf, Per Carl Sture Rosengren, Erik Anders Claes Rosengren, John Elis Gösta Karlsson, Björn Thomas Eriksson, Björn Alexander Jubner, Remo Behdasht, Simon Greger Fellin, Robin Kjell Åman, Joseph Shain
Filed: 26 Jul 21
Utility
Multi-plane reflective sensor
1 Aug 23
An optical assembly including a reflectance-based sensor emitting light into a detection plane and detecting reflections of the emitted light, reflected by an object located in the detection plane, a light redirector positioned away from the sensor redirecting light emitted by the sensor into one or more spatial planes parallel to the detection plane and, when the object is located in the one or more spatial planes, redirecting light reflected by the object into the detection plane, and a processor controlling light emitted by the sensor and receiving outputs from the sensor, and configured such that when an object passes through one or more of the spatial planes, the processor identifies both the spatial planes through which the object passes, and the location of the object within the spatial planes through which it passes, based on the received outputs and the position of the light redirector relative to the sensor.
Thomas Eriksson, Alexander Jubner, Rozita Teymourzadeh, Stefan Holmgren, Lars Sparf, Bengt Henry Hjalmar Edlund, Richard Berglind
Filed: 11 Mar 21
Utility
Vehicle user interface
16 May 23
A vehicle gesture control system for a host vehicle, the host vehicle including an adaptive cruise control or autonomous drive arrangement, the system including a sensor to detect gestures performed by a driver on the surface of a steering wheel grip in the host vehicle, the steering wheel grip including a circular tube surrounding a steering wheel that rotates about a steering column, and a processor receiving outputs from the sensor and connected to a memory unit storing instructions for the processor to activate a plurality of features of the adaptive cruise control or autonomous drive arrangement in response to a respective plurality of different gestures detected by the sensor, wherein two of the gestures represent “up” and “down” commands and include movement of the driver's thumb in opposite directions, respectively, around a lateral section of the steering wheel grip that faces the driver.
Björn Alexander Jubner, Björn Thomas Eriksson, Gunnar Martin Fröjdh, Simon Greger Fellin, Stefan Johannes Holmgren
Filed: 17 Jul 20
Utility
Touch Sensitive Curved and Flexible Displays
23 Mar 23
Generating interactive in-air images, by emitters emitting light pulses along an in-air detection plane, light detectors, lenses configured such that there is a particular angle of entry at which each detector receives maximal light intensity when pulses enter a lens corresponding to the detector at the particular angle of entry, and there are target positions in the detection plane, associated with emitter-detector pairs, whereby for each emitter-detector pair, when an object is located at the target position, then pulses emitted by the emitter are reflected by the object into the lens corresponding to the detector at the particular angle of entry, a projector projecting an image that appears, to a user, to be suspended in the detection plane, and a processor identifying locations of the object in the detection plane and mapping the identified locations to corresponding locations in the image, to register user interactions with the image.
Thomas Eriksson, Björn Alexander Jubner, Rozita Teymourzadeh, Stefan Johannes Holmgren, Lars Bertil Sparf, Bengt Henry Hjalmar Edlund, Tom Richard Berglind
Filed: 13 Nov 22
Utility
Contactless Touch Input System
9 Feb 23
A proximity sensor, including light emitters and light detectors mounted on a circuit board, two stacked lenses, positioned above the emitters and the detectors, including an extruded cylindrical lens and a Fresnel lens array, wherein each emitter projects light through the two lenses along a common projection plane, wherein a reflective object located in the projection plane reflects light from one or more emitters to one or more detectors, and wherein each emitter-detector pair, when synchronously activated, generates a greatest detection signal at the activated detector when the reflective object is located at a specific 2D location in the projection plane corresponding to the emitter-detector pair, and a processor sequentially activating the emitters and synchronously co-activating one or more detectors, and identifying a location of the object in the projection plane, based on amounts of light detected by the detector of each synchronously activated emitter-detector pair.
Stefan Johannes Holmgren, Jan Tomas Hartman, Tom Richard Berglind, Lars Bertil Sparf, Jonas Daniel Justus Hjelm, John Elis Gösta Karlsson, Per Carl Sture Rosengren, Gunnar Martin Fröjdh, Joseph Shain, Xiatao Wang, Clarence Ray King III, Oscar Ritzén Praglowski de Radwan
Filed: 30 Dec 20
Utility
Contactless Control Panel
13 Oct 22
A method including providing a sensor including light emitters, photodiode detectors, and lenses arranged so as to direct light beams from light emitters exiting lenses along a detection plane, and so as to direct light beams entering lenses at a specific angle of incidence onto photodiode detectors, mounting the sensor on a display presenting virtual input controls for an electronic device, such that the detection plane resides in an airspace in front of the display, activating light emitters to project light beams through lenses along the detection plane, wherein at least one of the light beams is interrupted by a finger, detecting light reflected by the finger, identifying emitters that projected the light beam that was reflected and photodiode detectors that detected the reflected light, as emitter-detector pairs, calculating display coordinates based on target positions associated with the identified emitter-detector pairs, and transmitting the calculated display coordinates to the electronic device.
Björn Thomas Eriksson, Björn Alexander Jubner, Rozita Teymourzadeh, Håkan Sven Erik Andersson, Per Carl Sture Rosengren, Xiatao Wang, Stefan Johannes Holmgren, Gunnar Martin Fröjdh, Jan Tomas Hartman, Per Oscar Sverud, Sangtaek Kim, Rasmus Dahl-Örn, Tom Richard Berglind, Karl Erik Patrik Nordström, Lars Bertil Sparf, Erik Anders Claes Rosengren, John Elis Gösta Karlsson, Remo Behdasht, Robin Kjell Åman, Joseph Shain
Filed: 9 Jun 22
Utility
Motorist user interface sensor
30 Aug 22
A steering wheel that includes optoelectronic components, each specific optoelectronic component including a light projector projecting light out of the steering wheel at two different angles, denoted a1 and a2, a light sensor detecting reflections of the light projected by neighboring optoelectronic components by an object above the steering wheel, a lens oriented relative to the light sensor such that the light sensor receives maximum intensity when light enters the lens at either of two particular angles, specifically, when light enters the lens at a particular angle b1, and at a particular angle b2 different than b1, wherein angle b1 views reflections of light projected at angle a1 by the optoelectronic component neighboring the specific optoelectronic component on one side, and angle b2 views reflections of light projected at angle a2 by the optoelectronic component neighboring the specific optoelectronic component on the side opposite the one side.
Stefan Johannes Holmgren, Lars Bertil Sparf, Tom Richard Berglind, Charles Bruce Banter, Per Carl Johan Nohlert
Filed: 25 Nov 19
Utility
Contactless control panel
5 Jul 22
An input panel for an electronic device, including an arrangement of buttons, wherein each button is actuated when pressed, providing input to an electronic device, and a sensor, detecting location of a user's finger above the buttons, the sensor including a housing, a printed circuit board, light emitters and photodiode detectors, lenses mounted in the housing in such a manner that, when the housing is mounted along an edge of the arrangement, the lenses direct light from the emitters along a plane above the buttons, and direct light from the plane, reflected toward the lenses by an object inserted into the plane, onto the detectors, a processor configured to identify a location in the plane at which the object is inserted based on the detections of light reflected by the object, and a communications port configured to output the identified location to the electronic device.
Björn Thomas Eriksson, Björn Alexander Jubner, Rozita Teymourzadeh, Håkan Sven Erik Andersson, Per Carl Sture Rosengren, Xiatao Wang, Stefan Johannes Holmgren, Gunnar Martin Fröjdh, Jan Tomas Hartman, Per Oscar Sverud, Sangtaek Kim, Rasmus Dahl-Örn, Tom Richard Berglind, Karl Erik Patrik Nordström, Lars Bertil Sparf, Erik Anders Claes Rosengren, John Elis Gösta Karlsson, Remo Behdasht, Robin Kjell Åman, Joseph Shain
Filed: 6 Oct 20
Utility
Optical Touch Sensor
31 Mar 22
An optical sensor including lenses along two opposite edges of a detection area, light emitters mounted along a specific one of the edges, each emitter projecting light beams through a respective one of the lenses, wherein the lenses split the projected emitter light into a plurality of divergent light beams directed across the detection area to respective pluralities of lenses along the edge of the detection area opposite the specific edge, wherein light intensity of each directed light beam is maximized along its center and an intensity distribution within each directed beam is known, light detectors along the edge opposite the specific edge, each detector receiving light directed across the detection area through a respective one of the lenses, and a processor receiving outputs from the detectors and calculating object locations in the detection area based on the known intensity distribution within each directed light beam and the received outputs.
Stefan Johannes Holmgren
Filed: 28 Sep 21
Utility
Motorist User Interface Sensor
3 Feb 22
A steering wheel that includes optoelectronic components, each specific optoelectronic component including a light projector projecting light out of the steering wheel at two different angles, denoted a1 and a2, a light sensor detecting reflections of the light projected by neighboring optoelectronic components by an object above the steering wheel, a lens oriented relative to the light sensor such that the light sensor receives maximum intensity when light enters the lens at either of two particular angles, specifically, when light enters the lens at a particular angle b1, and at a particular angle b2 different than b1, wherein angle b1 views reflections of light projected at angle a1 by the optoelectronic component neighboring the specific optoelectronic component on one side, and angle b2 views reflections of light projected at angle a2 by the optoelectronic component neighboring the specific optoelectronic component on the side opposite the one side.
Stefan Johannes Holmgren, Lars Bertil Sparf, Tom Richard Berglind, Charles Bruce Banter, Per Carl Johan Nohlert
Filed: 25 Nov 19
Utility
Optical Proximity Sensors
11 Nov 21
A sensor determining coordinates of a proximal object, including a one-dimensional array of alternating light emitters and detectors, including a plurality of light emitters projecting light along a detection plane, and a plurality of light detectors detecting reflections of the projected light, by a reflective object in the detection plane, and a plurality of lenses mounted and oriented relative to the emitters and the detectors such that the light detectors receive maximum intensity when light enters a corresponding lens at a first particular angle, whereby for each emitter-detector pair, light emitted by the emitter of that pair passes through one of the lenses and is reflected by the object back through one of the lenses to the detector of that pair when the object is located at one of a set of positions in the detection plane, that position being associated with that emitter-detector pair.
Stefan Johannes Holmgren, Sairam Iyer, Tom Richard Berglind, Karl Erik Patrik Nordström, Lars Bertil Sparf, Per Carl Sture Rosengren, Erik Anders Claes Rosengren, John Elis Gösta Karlsson, Björn Thomas Eriksson, Björn Alexander Jubner, Remo Behdasht, Simon Greger Fellin, Robin Kjell Åman, Joseph Shain
Filed: 26 Jul 21
Utility
Optical proximity sensors
27 Jul 21
A modular proximity sensor including a plurality of sensor modules, each sensor module including a housing, lenses, light detectors, each detector positioned along the image plane of a respective lens so as to receive maximum light intensity when light enters the lens at a particular angle, light emitters, each emitter positioned in relation to a respective lens so as to project light into a detection zone, an activating unit synchronously co-activating each emitter with at least one of the detectors, and a calculating unit receiving detector outputs corresponding to amounts of projected light reflected by an object in the detection zone, and calculating a two-dimensional location of the object in the detection zone based on the detector outputs and the particular angle, wherein neighboring sensor modules monitor different detection zones, and a processor receiving outputs from each sensor module and mapping the object location in multiple detection zones over time.
Stefan Johannes Holmgren, Sairam Iyer, Tom Richard Berglind, Karl Erik Patrik Nordström, Lars Bertil Sparf, Per Carl Sture Rosengren, Erik Anders Claes Rosengren, John Elis Gösta Karlsson, Björn Thomas Eriksson, Björn Alexander Jubner, Remo Behdasht, Simon Greger Fellin, Robin Kjell Åman, Joseph Shain
Filed: 10 Jan 20
Utility
Multi-plane Reflective Sensor
22 Jul 21
An optical assembly including a reflectance-based sensor emitting light into a detection plane and detecting reflections of the emitted light, reflected by an object located in the detection plane, a light redirector positioned away from the sensor redirecting light emitted by the sensor into one or more spatial planes parallel to the detection plane and, when the object is located in the one or more spatial planes, redirecting light reflected by the object into the detection plane, and a processor controlling light emitted by the sensor and receiving outputs from the sensor, and configured such that when an object passes through one or more of the spatial planes, the processor identifies both the spatial planes through which the object passes, and the location of the object within the spatial planes through which it passes, based on the received outputs and the position of the light redirector relative to the sensor.
Thomas Eriksson, Alexander Jubner, Rozita Teymourzadeh, Stefan Holmgren, Lars Sparf, Bengt Henry Hjalmar Edlund, Richard Berglind
Filed: 11 Mar 21