{"id":22037,"date":"2025-05-29T17:34:03","date_gmt":"2025-05-29T08:34:03","guid":{"rendered":"https:\/\/www.lp-research.com\/?p=22037"},"modified":"2026-03-23T10:56:15","modified_gmt":"2026-03-23T01:56:15","slug":"slam-system-ar-vr-tracking","status":"publish","type":"post","link":"https:\/\/www.lp-research.com\/ja\/slam-system-ar-vr-tracking\/","title":{"rendered":"SLAM system for AR\/VR: Next-Gen Full Fusion Tracking"},"content":{"rendered":"<div class=\"wpb-content-wrapper\"><p>[vc_row][vc_column][vc_single_image image=&#8221;22140&#8243; img_size=&#8221;medium&#8221; alignment=&#8221;center&#8221; css=&#8221;&#8221;][vc_column_text css=&#8221;&#8221;]<\/p>\n<h4>SLAM system for AR\/VR: Next-Gen Full Fusion Tracking<\/h4>\n<p>[\/vc_column_text][vc_column_text css=&#8221;&#8221;]<\/p>\n<p>At LP-Research, we have been pushing the boundaries of spatial tracking with our latest developments in Visual SLAM (Simultaneous Localization and Mapping) and sensor fusion technologies. Our new SLAM system, combined with what we call &#8220;Full Fusion,&#8221; is designed to deliver highly stable and accurate 6DoF tracking for robotics, augmented and virtual reality applications.<\/p>\n<p>[\/vc_column_text]<header class=\"thememount-heading-wrapper thememount-heading-wrapper-align-left\"><div class=\"thememount-heading-wrapper-inner\"><h2 class=\"thememount-heading-align-left\"><span>System Setup<\/span><\/h2><\/div><\/header>[vc_column_text css=&#8221;&#8221;]To demonstrate the progress of our development, we ran LPSLAM together with <a href=\"https:\/\/www.lp-research.com\/introducing-fusionhub-lp-researchs-sensor-fusion-operating-system\/\"><u>FusionHub<\/u><\/a> on a host computer and forwarded the resulting pose to a Meta Quest 3 mixed reality headset for visualization using <a href=\"https:\/\/www.lp-research.com\/products\/vr-ar-tracking-solutions-lpvr\/lpvr-air\/\"><u>LPVR-AIR<\/u><\/a>. 
We created a custom 3D-printed mount to affix the sensors needed for SLAM and Full Fusion, a <a href=\"https:\/\/www.stereolabs.com\/en-jp\/store\/products\/zed-mini\" target=\"_blank\" rel=\"noopener\"><u>ZED Mini stereo camera<\/u><\/a> and an <a href=\"https:\/\/www.lp-research.com\/products\/inertial-measurement-units-imu\/lpms-curs3-oem-9-axis-imu-ahrs-usb-can-uart\/\" target=\"_blank\" rel=\"noopener\"><u>LPMS-CURS3 IMU sensor<\/u><\/a>, onto the headset.<\/p>\n<p>This mount ensures proper alignment of the sensor and camera with respect to the headset&#8217;s optical axis, which is critical for accurate fusion results. The system connects via USB and runs on a host PC that communicates wirelessly with the HMD. An image of how the IMU and camera are attached to the HMD is shown below.[\/vc_column_text][vc_single_image image=&#8221;22130&#8243; img_size=&#8221;medium&#8221; alignment=&#8221;center&#8221; css=&#8221;&#8221;][vc_column_text css=&#8221;&#8221;]<\/p>\n<p>At the current stage of our development we ran tests in our laboratory. The images below show a photo of the environment next to the LPSLAM map generated from it.<\/p>\n<p>[\/vc_column_text][vc_single_image image=&#8221;22126&#8243; img_size=&#8221;medium&#8221; alignment=&#8221;center&#8221; css=&#8221;&#8221;][vc_single_image image=&#8221;22128&#8243; img_size=&#8221;medium&#8221; alignment=&#8221;center&#8221; css=&#8221;&#8221;]<header class=\"thememount-heading-wrapper thememount-heading-wrapper-align-left\"><div class=\"thememount-heading-wrapper-inner\"><h2 class=\"thememount-heading-align-left\"><span>Tracking Across Larger Areas<\/span><\/h2><\/div><\/header>[vc_column_text css=&#8221;&#8221;]<\/p>\n<p>A walk through the office based on a pre-built map yields good results. The fusion in this experiment is our regular IMU-optical fusion and therefore doesn&#8217;t derive translation information by integrating accelerometer data. 
This leads to short interruptions of position tracking in certain areas where not enough feature points are found. We at least partially solve this problem with the Full Fusion approach described below.<\/p>\n<p>[\/vc_column_text][vc_video link=&#8221;https:\/\/vimeo.com\/1093347595?share=copy&#8221; align=&#8221;center&#8221; css=&#8221;&#8221;]<header class=\"thememount-heading-wrapper thememount-heading-wrapper-align-left\"><div class=\"thememount-heading-wrapper-inner\"><h2 class=\"thememount-heading-align-left\"><span>What is Full Fusion?<\/span><\/h2><\/div><\/header>[vc_column_text css=&#8221;&#8221;]<\/p>\n<p>Traditional tracking systems rely either on Visual SLAM or IMU (Inertial Measurement Unit) data, often with one compensating for the other. Our Full Fusion approach goes beyond <a href=\"https:\/\/www.lp-research.com\/imucore-sensor-fusion\/\"><u>orientation fusion<\/u><\/a> and integrates both IMU and SLAM data to estimate not just orientation but also position. This combination provides smoother, more stable tracking even in complex, dynamic environments where traditional methods tend to struggle.<\/p>\n<p>By fusing IMU velocity estimates with visual SLAM pose data through a specialized filter algorithm, our system handles rapid movements gracefully and removes the jitter seen in SLAM-only tracking. The IMU handles fast short-term movements while SLAM ensures long-term positional stability. Our latest releases even support alignment using fiducial markers, allowing the virtual scene to anchor precisely to the real world. 
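The division of labor just described, with the IMU covering fast short-term motion and SLAM anchoring long-term position, can be illustrated with a minimal complementary-filter sketch. This is an illustration only, not FusionHub's actual filter; the function name, gain value, and update structure are our own simplifications.

```python
import numpy as np

def fuse_step(pos, vel, accel_world, dt, slam_pos=None, k_pos=0.05):
    """One illustrative Full-Fusion-style update (not the FusionHub algorithm).

    High-rate IMU integration predicts motion; whenever a SLAM pose
    arrives, the prediction is pulled toward it by a small gain, so the
    IMU handles fast movement and SLAM bounds long-term drift.
    """
    vel = vel + accel_world * dt          # integrate world-frame acceleration
    pos = pos + vel * dt                  # dead-reckon position from velocity
    if slam_pos is not None:              # SLAM correction, when a pose is available
        pos = pos + k_pos * (slam_pos - pos)
    return pos, vel

# Toy run: constant 1 m/s motion along x, with a SLAM fix every 10th IMU sample.
pos, vel = np.zeros(3), np.array([1.0, 0.0, 0.0])
dt = 0.01
for i in range(100):
    slam = np.array([(i + 1) * dt, 0.0, 0.0]) if i % 10 == 9 else None
    pos, vel = fuse_step(pos, vel, np.zeros(3), dt, slam)
print(pos)  # close to [1, 0, 0]
```

Between SLAM fixes the estimate follows pure inertial dead reckoning; each fix then nudges the state back toward the visually observed pose, which is why gaps in feature coverage cause drift rather than an immediate loss of tracking.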
The video below shows SLAM running in conjunction with Full Fusion.<\/p>\n<p>[\/vc_column_text][vc_video link=&#8221;https:\/\/vimeo.com\/1088945901\/3454266fe1&#8243; css=&#8221;&#8221;]<header class=\"thememount-heading-wrapper thememount-heading-wrapper-align-left\"><div class=\"thememount-heading-wrapper-inner\"><h2 class=\"thememount-heading-align-left\"><span>Real-World Testing and Iteration<\/span><\/h2><\/div><\/header>[vc_column_text css=&#8221;&#8221;]<\/p>\n<p>We&#8217;ve extensively tested this system in both lab conditions and challenging real-world environments. Our recent experiments demonstrated excellent results. By integrating our LPMS IMU sensor and running our software pipeline (LPSLAM and FusionHub), we achieved room-scale tracking with sub-centimeter accuracy and rotation errors as low as 0.45 degrees.<\/p>\n<p>To evaluate the performance of the overall solution, we compared the output from FusionHub with pose data recorded by an <a href=\"https:\/\/ar-tracking.com\/en\/product-program\/smarttrack3\" target=\"_blank\" rel=\"noopener\"><u>ART Smarttrack 3<\/u><\/a> tracking system. The accuracy of an ART tracking system is in the sub-mm range, which is sufficiently accurate to characterize the performance of our SLAM. The result of one of several measurement runs is shown in the image below. Note that both systems were spatially aligned and timestamp-synchronized to allow a correct comparison of poses.<\/p>\n<p>[\/vc_column_text][vc_single_image image=&#8221;22122&#8243; img_size=&#8221;large&#8221; alignment=&#8221;center&#8221; css=&#8221;&#8221;]<header class=\"thememount-heading-wrapper thememount-heading-wrapper-align-left\"><div class=\"thememount-heading-wrapper-inner\"><h2 class=\"thememount-heading-align-left\"><span>Developer-Friendly and Cross-Platform<\/span><\/h2><\/div><\/header>[vc_column_text css=&#8221;&#8221;]<\/p>\n<p>The LP-Research SLAM and FusionHub stack is designed for flexibility. 
Components can run on the PC and stream results to an HMD wirelessly, enabling rapid development and iteration. The system supports OpenXR-compatible headsets and has been tested with Meta Quest 3, Varjo XR-3, and more. Developers can also log and replay sessions for detailed tuning and offline debugging.<\/p>\n<p>[\/vc_column_text]<header class=\"thememount-heading-wrapper thememount-heading-wrapper-align-left\"><div class=\"thememount-heading-wrapper-inner\"><h2 class=\"thememount-heading-align-left\"><span>Looking Ahead<\/span><\/h2><\/div><\/header>[vc_column_text css=&#8221;&#8221;]<\/p>\n<p>Our roadmap includes support for optical flow integration to improve SLAM stability further, expanded hardware compatibility, and refined UI tools for better calibration and monitoring. We\u2019re also continuing our efforts to improve automated calibration and simplify the configuration process.<\/p>\n<p>This is just the beginning. If you&#8217;re building advanced AR\/VR systems and need precise, low-latency tracking that works in the real world, LP-Research&#8217;s Full Fusion system is ready to support your journey.<\/p>\n<p>To learn more or get involved in our beta program, <a href=\"https:\/\/www.lp-research.com\/company\/contact\/\"><u>reach out to us<\/u><\/a>.<\/p>\n<p>[\/vc_column_text][\/vc_column][\/vc_row]<\/p>\n<\/div>","protected":false},"excerpt":{"rendered":"<p>[vc_row][vc_column][vc_single_image image=&#8221;22140&#8243; img_size=&#8221;medium&#8221; alignment=&#8221;c 
[&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"footnotes":""},"categories":[4,49],"tags":[],"class_list":["post-22037","post","type-post","status-publish","format-standard","hentry","category-blog","category-projects"],"yoast_head":"<!-- This site is optimized with the Yoast SEO Premium plugin v24.1 (Yoast SEO v24.9) - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>SLAM system for AR\/VR: Next-Gen Full Fusion Tracking - LP-Research<\/title>\n<meta name=\"description\" content=\"Advanced SLAM system for AR\/VR tracking. High-precision full fusion for mobile robotics and mixed reality. Explore our latest sensor innovation.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.lp-research.com\/slam-system-ar-vr-tracking\/\" \/>\n<meta property=\"og:locale\" content=\"ja_JP\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"SLAM system for AR\/VR: Next-Gen Full Fusion Tracking\" \/>\n<meta property=\"og:description\" content=\"Advanced SLAM system for AR\/VR tracking. High-precision full fusion for mobile robotics and mixed reality. 
Explore our latest sensor innovation.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.lp-research.com\/slam-system-ar-vr-tracking\/\" \/>\n<meta property=\"og:site_name\" content=\"LP-Research\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/lpresearch\" \/>\n<meta property=\"article:published_time\" content=\"2025-05-29T08:34:03+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2026-03-23T01:56:15+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.lp-research.com\/wp-content\/uploads\/2015\/08\/2_LogoNeda.png\" \/>\n\t<meta property=\"og:image:width\" content=\"245\" \/>\n\t<meta property=\"og:image:height\" content=\"104\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Klaus Petersen\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@LPResearch\" \/>\n<meta name=\"twitter:site\" content=\"@LPResearch\" \/>\n<meta name=\"twitter:label1\" content=\"\u57f7\u7b46\u8005\" \/>\n\t<meta name=\"twitter:data1\" content=\"Klaus Petersen\" \/>\n\t<meta name=\"twitter:label2\" content=\"\u63a8\u5b9a\u8aad\u307f\u53d6\u308a\u6642\u9593\" \/>\n\t<meta name=\"twitter:data2\" content=\"5\u5206\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/www.lp-research.com\/slam-system-ar-vr-tracking\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/www.lp-research.com\/slam-system-ar-vr-tracking\/\"},\"author\":{\"name\":\"Klaus Petersen\",\"@id\":\"https:\/\/52.42.213.164\/#\/schema\/person\/d33893eacbac18042e119cbbb28760c6\"},\"headline\":\"SLAM system for AR\/VR: Next-Gen Full Fusion 
Tracking\",\"datePublished\":\"2025-05-29T08:34:03+00:00\",\"dateModified\":\"2026-03-23T01:56:15+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/www.lp-research.com\/slam-system-ar-vr-tracking\/\"},\"wordCount\":870,\"publisher\":{\"@id\":\"https:\/\/52.42.213.164\/#organization\"},\"articleSection\":[\"Blog\",\"Projects\"],\"inLanguage\":\"ja\"},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/www.lp-research.com\/slam-system-ar-vr-tracking\/\",\"url\":\"https:\/\/www.lp-research.com\/slam-system-ar-vr-tracking\/\",\"name\":\"SLAM system for AR\/VR: Next-Gen Full Fusion Tracking - LP-Research\",\"isPartOf\":{\"@id\":\"https:\/\/52.42.213.164\/#website\"},\"datePublished\":\"2025-05-29T08:34:03+00:00\",\"dateModified\":\"2026-03-23T01:56:15+00:00\",\"description\":\"Advanced SLAM system for AR\/VR tracking. High-precision full fusion for mobile robotics and mixed reality. Explore our latest sensor innovation.\",\"breadcrumb\":{\"@id\":\"https:\/\/www.lp-research.com\/slam-system-ar-vr-tracking\/#breadcrumb\"},\"inLanguage\":\"ja\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/www.lp-research.com\/slam-system-ar-vr-tracking\/\"]}]},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/www.lp-research.com\/slam-system-ar-vr-tracking\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/52.42.213.164\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"SLAM system for AR\/VR: Next-Gen Full Fusion Tracking\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/52.42.213.164\/#website\",\"url\":\"https:\/\/52.42.213.164\/\",\"name\":\"LP-RESEARCH\",\"description\":\"Advanced Sensor Fusion Solution and 
IMUs\",\"publisher\":{\"@id\":\"https:\/\/52.42.213.164\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/52.42.213.164\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"ja\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/52.42.213.164\/#organization\",\"name\":\"LP-RESEARCH Inc.\",\"url\":\"https:\/\/52.42.213.164\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"ja\",\"@id\":\"https:\/\/52.42.213.164\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/www.lp-research.com\/wp-content\/uploads\/2015\/08\/2_LogoNeda.png\",\"contentUrl\":\"https:\/\/www.lp-research.com\/wp-content\/uploads\/2015\/08\/2_LogoNeda.png\",\"width\":245,\"height\":104,\"caption\":\"LP-RESEARCH Inc.\"},\"image\":{\"@id\":\"https:\/\/52.42.213.164\/#\/schema\/logo\/image\/\"},\"sameAs\":[\"https:\/\/www.facebook.com\/lpresearch\",\"https:\/\/x.com\/LPResearch\"]},{\"@type\":\"Person\",\"@id\":\"https:\/\/52.42.213.164\/#\/schema\/person\/d33893eacbac18042e119cbbb28760c6\",\"name\":\"Klaus Petersen\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"ja\",\"@id\":\"https:\/\/52.42.213.164\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/0b49144226532e2260097f60df3d62793d0ac9d8a2bd5c475971988daa794271?s=96&d=retro&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/0b49144226532e2260097f60df3d62793d0ac9d8a2bd5c475971988daa794271?s=96&d=retro&r=g\",\"caption\":\"Klaus Petersen\"},\"description\":\"I like to create magical things, especially projects related to new technologies like augmented and virtual reality, mobile robotics and MEMS-based sensor networks. 
I code in C(++) and Python, trying to keep up with my very talented colleagues :-)\",\"url\":\"https:\/\/www.lp-research.com\/ja\/author\/klauspetersenlpresearch\/\"}]}<\/script>\n<!-- \/ Yoast SEO Premium plugin. -->","yoast_head_json":{"title":"SLAM system for AR\/VR: Next-Gen Full Fusion Tracking - LP-Research","description":"Advanced SLAM system for AR\/VR tracking. High-precision full fusion for mobile robotics and mixed reality. Explore our latest sensor innovation.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.lp-research.com\/slam-system-ar-vr-tracking\/","og_locale":"ja_JP","og_type":"article","og_title":"SLAM system for AR\/VR: Next-Gen Full Fusion Tracking","og_description":"Advanced SLAM system for AR\/VR tracking. High-precision full fusion for mobile robotics and mixed reality. Explore our latest sensor innovation.","og_url":"https:\/\/www.lp-research.com\/slam-system-ar-vr-tracking\/","og_site_name":"LP-Research","article_publisher":"https:\/\/www.facebook.com\/lpresearch","article_published_time":"2025-05-29T08:34:03+00:00","article_modified_time":"2026-03-23T01:56:15+00:00","og_image":[{"width":245,"height":104,"url":"https:\/\/www.lp-research.com\/wp-content\/uploads\/2015\/08\/2_LogoNeda.png","type":"image\/png"}],"author":"Klaus Petersen","twitter_card":"summary_large_image","twitter_creator":"@LPResearch","twitter_site":"@LPResearch","twitter_misc":{"\u57f7\u7b46\u8005":"Klaus Petersen","\u63a8\u5b9a\u8aad\u307f\u53d6\u308a\u6642\u9593":"5\u5206"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/www.lp-research.com\/slam-system-ar-vr-tracking\/#article","isPartOf":{"@id":"https:\/\/www.lp-research.com\/slam-system-ar-vr-tracking\/"},"author":{"name":"Klaus 
Petersen","@id":"https:\/\/52.42.213.164\/#\/schema\/person\/d33893eacbac18042e119cbbb28760c6"},"headline":"SLAM system for AR\/VR: Next-Gen Full Fusion Tracking","datePublished":"2025-05-29T08:34:03+00:00","dateModified":"2026-03-23T01:56:15+00:00","mainEntityOfPage":{"@id":"https:\/\/www.lp-research.com\/slam-system-ar-vr-tracking\/"},"wordCount":870,"publisher":{"@id":"https:\/\/52.42.213.164\/#organization"},"articleSection":["Blog","Projects"],"inLanguage":"ja"},{"@type":"WebPage","@id":"https:\/\/www.lp-research.com\/slam-system-ar-vr-tracking\/","url":"https:\/\/www.lp-research.com\/slam-system-ar-vr-tracking\/","name":"SLAM system for AR\/VR: Next-Gen Full Fusion Tracking - LP-Research","isPartOf":{"@id":"https:\/\/52.42.213.164\/#website"},"datePublished":"2025-05-29T08:34:03+00:00","dateModified":"2026-03-23T01:56:15+00:00","description":"Advanced SLAM system for AR\/VR tracking. High-precision full fusion for mobile robotics and mixed reality. Explore our latest sensor innovation.","breadcrumb":{"@id":"https:\/\/www.lp-research.com\/slam-system-ar-vr-tracking\/#breadcrumb"},"inLanguage":"ja","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.lp-research.com\/slam-system-ar-vr-tracking\/"]}]},{"@type":"BreadcrumbList","@id":"https:\/\/www.lp-research.com\/slam-system-ar-vr-tracking\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/52.42.213.164\/"},{"@type":"ListItem","position":2,"name":"SLAM system for AR\/VR: Next-Gen Full Fusion Tracking"}]},{"@type":"WebSite","@id":"https:\/\/52.42.213.164\/#website","url":"https:\/\/52.42.213.164\/","name":"LP-RESEARCH","description":"Advanced Sensor Fusion Solution and 
IMUs","publisher":{"@id":"https:\/\/52.42.213.164\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/52.42.213.164\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"ja"},{"@type":"Organization","@id":"https:\/\/52.42.213.164\/#organization","name":"LP-RESEARCH Inc.","url":"https:\/\/52.42.213.164\/","logo":{"@type":"ImageObject","inLanguage":"ja","@id":"https:\/\/52.42.213.164\/#\/schema\/logo\/image\/","url":"https:\/\/www.lp-research.com\/wp-content\/uploads\/2015\/08\/2_LogoNeda.png","contentUrl":"https:\/\/www.lp-research.com\/wp-content\/uploads\/2015\/08\/2_LogoNeda.png","width":245,"height":104,"caption":"LP-RESEARCH Inc."},"image":{"@id":"https:\/\/52.42.213.164\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/lpresearch","https:\/\/x.com\/LPResearch"]},{"@type":"Person","@id":"https:\/\/52.42.213.164\/#\/schema\/person\/d33893eacbac18042e119cbbb28760c6","name":"Klaus Petersen","image":{"@type":"ImageObject","inLanguage":"ja","@id":"https:\/\/52.42.213.164\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/0b49144226532e2260097f60df3d62793d0ac9d8a2bd5c475971988daa794271?s=96&d=retro&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/0b49144226532e2260097f60df3d62793d0ac9d8a2bd5c475971988daa794271?s=96&d=retro&r=g","caption":"Klaus Petersen"},"description":"I like to create magical things, especially projects related to new technologies like augmented and virtual reality, mobile robotics and MEMS-based sensor networks. 
I code in C(++) and Python, trying to keep up with my very talented colleagues :-)","url":"https:\/\/www.lp-research.com\/ja\/author\/klauspetersenlpresearch\/"}]}},"_links":{"self":[{"href":"https:\/\/www.lp-research.com\/ja\/wp-json\/wp\/v2\/posts\/22037","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.lp-research.com\/ja\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.lp-research.com\/ja\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.lp-research.com\/ja\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.lp-research.com\/ja\/wp-json\/wp\/v2\/comments?post=22037"}],"version-history":[{"count":38,"href":"https:\/\/www.lp-research.com\/ja\/wp-json\/wp\/v2\/posts\/22037\/revisions"}],"predecessor-version":[{"id":23595,"href":"https:\/\/www.lp-research.com\/ja\/wp-json\/wp\/v2\/posts\/22037\/revisions\/23595"}],"wp:attachment":[{"href":"https:\/\/www.lp-research.com\/ja\/wp-json\/wp\/v2\/media?parent=22037"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.lp-research.com\/ja\/wp-json\/wp\/v2\/categories?post=22037"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.lp-research.com\/ja\/wp-json\/wp\/v2\/tags?post=22037"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}