
HAPTIC PLATFORM AND ECOSYSTEM FOR IMMERSIVE COMPUTER MEDIATED ENVIRONMENTS PCT/US2023/029559
[RUBIN, Jacob A., CROCKETT, Robert S., FOLEY, Edward Leo, LEE, Donald Jeong, MARINO, Joseph R., EICHERMUELLER, Michael C., RUBIN, Madeline K., LIU, Joanna Jin, KJOS, Leif Einar, CELLA, Charles Howard, BUTZER, Mitchell Stanley, WEBER, Christopher Todd, MEDEIROS, Benjamin John, BALLEW, Eric Alexander, ROJANACHAICHANIN, Bodin Limsowan, BROOKS, Teyvon John Hershey, BAIRD, Johnathan, REIMER, Keenan]

2701 McMillan Avenue, Suite 160, San Luis Obispo, California 93401

A device may receive, by a haptic interface module that is configured to interact with one or more haptic interface devices and an application that generates a computer-mediated environment comprising an avatar corresponding to a user wearing the one or more haptic interface devices, respective sensor data for each respective haptic interface device, wherein the respective sensor data for a respective haptic interface device indicates respective positioning of respective sensors of the respective wearable haptic interface. A device may process, by the haptic interface module, the respective sensor data to generate respective relative location data for each respective haptic interface device, wherein the relative location data is relative to a reference location defined with respect to the corresponding wearable haptic interface. A device may receive, by the haptic interface module, tracked location data from one or more motion tracking sensors, wherein the tracked location data indicates respective locations of the one or more haptic interface devices relative to a spatial environment of the user. A device may generate, by the haptic interface module, a series of motion capture frames based on the tracked location data and the respective relative location data for each respective haptic interface device, wherein each respective motion capture frame indicates a set of locations and orientations for each respective haptic interface device at a given time.
A device may generate, by the haptic interface module, a series of kinematic frames based on the series of motion capture frames and one or more mediation processes that collectively convert, for each of the motion capture frames, the set of locations and orientations of the one or more respective haptic interface devices into a set of intended locations and intended orientations for configuring the avatar in the computer-mediated environment. A device may output the series of kinematic frames to the application, wherein the kinematic frames are provided to the application as user input.
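The abstract describes a per-frame pipeline: device sensor data is rebased against a per-device reference location, fused with room-scale tracking into motion capture frames, and then passed through one or more mediation processes to yield the kinematic frames the application consumes as user input. Below is a minimal sketch of that flow; every name (`MoCapFrame`, `KinematicFrame`, `scale_motion`, the device id `glove_l`) is hypothetical, and the mediation stage is reduced to a single illustrative transform rather than the patent's actual processes.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

Vec3 = Tuple[float, float, float]
Pose = Tuple[Vec3, Vec3]  # (location, orientation)

@dataclass
class MoCapFrame:
    """Locations and orientations of each haptic device at one instant."""
    time: float
    poses: Dict[str, Pose]  # device id -> pose in the user's spatial environment

@dataclass
class KinematicFrame:
    """Intended avatar pose derived from a motion capture frame."""
    time: float
    targets: Dict[str, Pose]

def relative_locations(sensor_data: Dict[str, Vec3],
                       reference: Vec3) -> Dict[str, Vec3]:
    """Rebase each sensor position against the device's own reference location."""
    rx, ry, rz = reference
    return {sid: (x - rx, y - ry, z - rz)
            for sid, (x, y, z) in sensor_data.items()}

def mediate(frame: MoCapFrame,
            processes: List[Callable[[Pose], Pose]]) -> KinematicFrame:
    """Apply every mediation process to every device pose, producing the
    intended locations/orientations for configuring the avatar."""
    targets = dict(frame.poses)
    for proc in processes:
        targets = {dev: proc(pose) for dev, pose in targets.items()}
    return KinematicFrame(time=frame.time, targets=targets)

def scale_motion(factor: float) -> Callable[[Pose], Pose]:
    """One illustrative mediation process: scale translation, keep orientation."""
    def proc(pose: Pose) -> Pose:
        (x, y, z), orientation = pose
        return ((x * factor, y * factor, z * factor), orientation)
    return proc

# Example: one tracked glove pose pushed through a single mediation process.
frame = MoCapFrame(time=0.0,
                   poses={"glove_l": ((0.1, 1.2, 0.3), (0.0, 0.0, 0.0))})
kin = mediate(frame, [scale_motion(2.0)])
```

The series of kinematic frames would then be streamed to the application in place of raw tracking data, which is what lets the mediation processes reshape motion before it reaches the avatar.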