Patent classifications
G06F3/042
Information processing apparatus, method of information processing, and information processing system
Provided is an information processing apparatus including: a determination unit that determines an input state of a user based on an image captured by a first imaging device; and a sensor controller that determines, based on the input state and on a positional relation between an input-related object (an object related to the user's input) and each of a plurality of imaging position candidates, an imaging position at which an input image is captured by a second imaging device different from the first imaging device.
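The candidate-selection step described above can be sketched in code. This is a minimal illustration, not the patent's method: the policy (closest candidate while inputting, farthest otherwise) and all names are assumptions made for the example.

```python
import math

def choose_imaging_position(candidates, object_pos, input_state):
    """Pick an imaging position candidate for the second imaging device.

    Illustrative policy (an assumption, not from the abstract): while the
    user is actively inputting, prefer the candidate closest to the
    input-related object (e.g. the hand); otherwise prefer the farthest
    candidate for a wider view.
    """
    def dist(c):
        return math.dist(c, object_pos)

    if input_state == "inputting":
        return min(candidates, key=dist)
    return max(candidates, key=dist)

candidates = [(0.0, 0.0), (1.0, 1.0), (3.0, 0.0)]
hand = (0.9, 1.1)  # position of the input-related object
print(choose_imaging_position(candidates, hand, "inputting"))  # (1.0, 1.0)
print(choose_imaging_position(candidates, hand, "idle"))       # (3.0, 0.0)
```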
Electronic device
An electronic device, including: a display panel, including a fingerprint recognition region, the fingerprint recognition region including a central region and an edge region surrounding the central region; an optical sensor, configured to collect a fingerprint image of a target object and including a plurality of photosensitive units, wherein a projection of the optical sensor on the display panel along a direction perpendicular to the display panel is within the fingerprint recognition region; and a controller, configured to control photosensitive performance of at least one photosensitive unit corresponding to the edge region to be better than photosensitive performance of at least one photosensitive unit corresponding to the central region.
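One way the controller could make edge units outperform central units is a per-unit gain map. The grid layout, gain values, and the use of gain as the "photosensitive performance" knob are illustrative assumptions, not details from the abstract.

```python
def gain_map(n, edge_width, center_gain=1.0, edge_gain=1.5):
    """Toy n x n map of per-photosensitive-unit gains.

    Units in the outer ring (the edge region of the fingerprint
    recognition area) get a higher gain than units in the central
    region. Values and mechanism are illustrative only.
    """
    grid = []
    for r in range(n):
        row = []
        for c in range(n):
            on_edge = (r < edge_width or r >= n - edge_width or
                       c < edge_width or c >= n - edge_width)
            row.append(edge_gain if on_edge else center_gain)
        grid.append(row)
    return grid

m = gain_map(5, 1)
print(m[0][0], m[2][2])  # 1.5 1.0  (edge unit vs. central unit)
```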
Touch input processing method and electronic device supporting the same
An electronic device including: a housing; a sensor module disposed on an inner face of the housing and including a plurality of sensing units; and a processor positioned within the housing and electrically connected to the sensor module. Each of the plurality of sensing units is electrically connected to an adjacent sensing unit among the plurality of sensing units, and includes a central portion and a plurality of peripheral portions connected to a partial area of the central portion and arranged around the central portion; each of the central portion and the plurality of peripheral portions includes a touch sensor. In addition, various other embodiments understood from this document are possible.
INTERACTIVE AND SHARED SURFACES
The interactive and shared surface technique described herein employs hardware that can project on any surface, capture color video of that surface, and get depth information of and above the surface while preventing visual feedback (also known as video feedback, video echo, or visual echo). The technique provides N-way sharing of a surface using video compositing. It also provides for automatic calibration of hardware components, including calibration of any projector, RGB camera, depth camera and any microphones employed by the technique. The technique provides object manipulation with physical, visual, audio, and hover gestures and interaction between digital objects displayed on the surface and physical objects placed on or above the surface. It can capture and scan the surface in a manner that captures or scans exactly what the user sees, which includes both local and remote objects, drawings, annotations, hands, and so forth.
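The N-way sharing via video compositing can be sketched very simply. This is a toy per-pixel composite of same-size grayscale frames, an assumption for illustration; the actual technique composites color video and handles calibration, depth, and feedback suppression.

```python
def composite_surfaces(streams):
    """Toy N-way surface sharing: each stream is a 2D grayscale frame
    of identical size, and each output pixel takes the strongest value
    across participants (per-pixel max). A real system would use
    proper alpha compositing on calibrated color video."""
    h, w = len(streams[0]), len(streams[0][0])
    return [[max(s[r][c] for s in streams) for c in range(w)]
            for r in range(h)]

local  = [[0, 9], [0, 0]]   # marks drawn by the local participant
remote = [[5, 0], [0, 7]]   # marks drawn by a remote participant
print(composite_surfaces([local, remote]))  # [[5, 9], [0, 7]]
```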
Display device having touch sensor and method of driving the same
A display device having touch sensors and a method of driving the same are disclosed. The display device includes: a display panel including a pixel array of pixels and a touch sensor array of touch sensors formed in the pixel array, the pixel array being divided into blocks; a gate driver to sequentially drive a plurality of gate lines in the pixel array block by block; a data driver to drive a plurality of data lines in the pixel array while the gate lines are driven; a touch controller to sequentially drive the touch sensor array block by block; and a timing controller to divide one frame into at least one display mode, in which the pixel array is driven, and at least one touch sensing mode, in which the touch sensor array is driven, and to control the gate driver, the data driver, and the touch controller so that the display mode and the touch sensing mode alternate.
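The timing controller's frame division can be sketched as a schedule. Treating the alternation as strictly block-by-block is one plausible reading of the abstract, assumed here for illustration.

```python
def frame_schedule(num_blocks):
    """Illustrative per-frame schedule: for each block of the pixel
    array, a display period (gate and data drivers drive the block's
    pixels) alternates with a touch-sensing period (the touch
    controller drives that block's sensors)."""
    schedule = []
    for b in range(num_blocks):
        schedule.append(("display", b))
        schedule.append(("touch", b))
    return schedule

print(frame_schedule(3))
# [('display', 0), ('touch', 0), ('display', 1), ('touch', 1),
#  ('display', 2), ('touch', 2)]
```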
Virtualization of tangible interface objects
An example system includes a computing device located proximate to a physical activity surface, a video capture device, and a detector. The video capture device is coupled for communication with the computing device and is adapted to capture a video stream that includes an activity scene of the physical activity surface and one or more interface objects with which a user can physically interact. The detector processes the video stream to detect the one or more interface objects included in the activity scene, to identify the detected interface objects, to generate one or more events describing them, and to provide the events to an activity application configured to render virtual information on the computing device based on those events.
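The detect-then-emit-events pipeline can be sketched as follows. The frame representation, the `Event` shape, and the callback interface are assumptions for the sketch; a real detector would run computer vision on the video stream.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Event:
    """An event describing one detected interface object."""
    object_id: str
    position: tuple

def detect_objects(frame):
    """Stand-in detector: here a 'frame' is just a dict mapping
    object ids to positions; a real detector would process pixels."""
    return list(frame.items())

def process_stream(frames, on_event: Callable[[Event], None]):
    """Detect interface objects in each frame and hand one event per
    detection to the activity application's callback."""
    for frame in frames:
        for obj_id, pos in detect_objects(frame):
            on_event(Event(obj_id, pos))

rendered: List[str] = []
process_stream([{"block_A": (10, 20)}, {"block_A": (12, 20)}],
               lambda e: rendered.append(f"{e.object_id}@{e.position}"))
print(rendered)  # ['block_A@(10, 20)', 'block_A@(12, 20)']
```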
Electronic devices with a deployable flexible display
Examples disclosed herein provide electronic devices with a flexible display. An example electronic device includes an enclosure and a flexible display deployable from the enclosure, where the viewing angle of the flexible display with respect to the enclosure is adjustable. The electronic device also includes a mechanism to autonomously deploy and retract the flexible display within the enclosure, and a supporting structure to reinforce the flexible display when it is deployed from the enclosure.
Information processing device, information processing method, and computer program
There is provided an information processing device, an information processing method, and a computer program capable of more reliably recognizing positions of a plurality of detection targets. The information processing device includes a control unit for recognizing positions of a first detection target and a second detection target that are present on the same surface. The control unit recognizes the position of the first detection target based on sensing data obtained by a first sensor for sensing the first detection target from a first direction, and recognizes the position of the second detection target based on sensing data obtained by a second sensor for sensing the second detection target from a direction opposite to the first direction across the same surface.
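The two-sided sensing idea can be sketched as a simple fusion step. The reading format and the choice of physical sensors (e.g. a depth camera above, an IR touch panel below) are assumptions for illustration; the abstract only specifies opposite sensing directions across the same surface.

```python
def recognize_positions(above_readings, below_readings):
    """Each reading is {'pos': (x, y), 'confidence': float}.

    The first detection target is sensed from one side of the shared
    surface and the second from the opposite side, so each sensor only
    observes its own target and the targets do not occlude each other.
    Here, fusion is simply taking the highest-confidence reading per
    side (an illustrative choice)."""
    first = max(above_readings, key=lambda r: r["confidence"])["pos"]
    second = max(below_readings, key=lambda r: r["confidence"])["pos"]
    return first, second

above = [{"pos": (0.2, 0.5), "confidence": 0.9},
         {"pos": (0.8, 0.1), "confidence": 0.4}]
below = [{"pos": (0.21, 0.52), "confidence": 0.95}]
print(recognize_positions(above, below))  # ((0.2, 0.5), (0.21, 0.52))
```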
Program, recognition apparatus, and recognition method
There is provided a program, a recognition apparatus, and a recognition method that make it possible to improve the accuracy of a hand recognition process. To recognize a hand appearing in an image obtained by an optical sensor of a portable terminal, either a left-hand recognition process or a right-hand recognition process is performed according to the posture of the terminal. The present technology can be applied to cases in which a hand is recognized.
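The posture-based model selection can be sketched as below. Using the sign of the device's roll angle as the posture cue, and the threshold of zero, are assumptions made purely for illustration.

```python
def select_hand_model(roll_deg):
    """Choose the left- or right-hand recognition process from the
    portable terminal's posture. The idea that a negative roll angle
    indicates a left-handed grip is an illustrative assumption."""
    return "left_hand_model" if roll_deg < 0 else "right_hand_model"

def recognize_hand(image, roll_deg):
    """Run the posture-selected recognition process on a sensor image
    (here the 'recognition' is stubbed out)."""
    model = select_hand_model(roll_deg)
    return {"model": model, "image_shape": (len(image), len(image[0]))}

print(recognize_hand([[0, 1], [1, 0]], roll_deg=-30.0)["model"])  # left_hand_model
```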