
Commit e9abc3d

Deploy from jjohare/logseq @ e81919415d79526dedb6dc159b9099a8bc0e50fc jjohare/logseq@e819194
1 parent 51617b1 commit e9abc3d


1,444 files changed (+89,100 / −27,119 lines)


api/pages/3D and 4D.json

Lines changed: 29 additions & 29 deletions
@@ -29,44 +29,44 @@
       "Gaussian splatting and Similar"
     ],
     "wiki_links": [
-      "community",
-      "collaboration",
+      "user experience",
       "performance",
-      "visualization",
-      "training",
-      "organisation",
-      "RenderingEngine",
-      "computer vision",
-      "SpatialComputing",
-      "software engineering",
-      "automation",
-      "Robotics",
-      "innovation",
-      "DisplayTechnology",
-      "TrackingSystem",
-      "HumanComputerInteraction",
-      "documentation",
-      "deep learning",
+      "Presence",
+      "MetaverseDomain",
+      "ImmersiveExperience",
       "neural networks",
-      "open source",
+      "design thinking",
+      "community",
       "skills development",
+      "SpatialComputing",
+      "optimization",
+      "robotics",
       "data management",
-      "MetaverseDomain",
-      "ImmersiveExperience",
+      "software engineering",
+      "DisplayTechnology",
+      "RenderingEngine",
+      "collaboration",
+      "innovation",
+      "training",
       "modeling",
-      "scalability",
-      "design thinking",
+      "open source",
+      "documentation",
+      "natural language processing",
       "research",
-      "Presence",
+      "organisation",
       "decision making",
+      "automation",
+      "HumanComputerInteraction",
+      "ComputerVision",
       "artificial intelligence",
-      "user experience",
-      "robotics",
-      "optimization",
-      "natural language processing",
-      "machine learning",
+      "scalability",
+      "TrackingSystem",
+      "deep learning",
+      "Robotics",
       "bias",
-      "ComputerVision"
+      "computer vision",
+      "visualization",
+      "machine learning"
     ],
     "ontology": {
       "term_id": "mv-429922099522",

api/pages/51% Attack.json

Lines changed: 95 additions & 0 deletions
Large diffs are not rendered by default.

api/pages/ADAS.json

Lines changed: 25 additions & 0 deletions
@@ -0,0 +1,25 @@
+{
+  "title": "ADAS",
+  "content": "- ### OntologyBlock\n id:: adas-ontology\n collapsed:: true\n\t- ontology:: true\n\t- term-id:: AI-0348\n\t- preferred-term:: ADAS\n\t- source-domain:: ai\n\t- status:: draft\n - public-access:: true\n\t- definition:: Advanced Driver Assistance Systems (ADAS) are electronic systems that assist vehicle operators with driving and parking functions through automated technologies including adaptive cruise control, lane keeping assist, automatic emergency braking, blind spot detection, and parking assistance. ADAS represents SAE Level 1-2 automation, providing driver support whilst requiring continuous driver supervision and intervention capability.\n\n\n\n## Academic Context\n\n- Advanced Driver Assistance Systems (ADAS) are electronic technologies designed to enhance vehicle safety and driving comfort by assisting drivers with various operational tasks.\n - These systems rely on sensors, cameras, radar, and increasingly sophisticated software algorithms to detect environmental conditions, obstacles, and driver behaviour.\n - ADAS research is grounded in control theory, computer vision, human-machine interaction, and automotive engineering.\n - The academic foundation includes studies on reducing human error in driving, improving situational awareness, and integrating automation with driver supervision.\n\n## Current Landscape (2025)\n\n- ADAS adoption is widespread in new vehicles globally, with many manufacturers including suites of driver assistance features as standard or optional equipment.\n - Common functions include adaptive cruise control, lane keeping assist, automatic emergency braking, blind spot detection, pedestrian and cyclist detection, and automated parking.\n - These systems typically correspond to SAE Levels 1 and 2 automation, meaning they provide driver support but require continuous driver attention and readiness to intervene.\n- Notable industry players include Mobileye, Valeo, and automotive OEMs such as Ford, Toyota, and BMW, each offering branded ADAS packages.\n- In the UK, including North England cities like Manchester, Leeds, Newcastle, and Sheffield, ADAS-equipped vehicles are increasingly common, supported by regional innovation hubs focusing on automotive technology and smart mobility.\n- Technical capabilities:\n - ADAS systems use sensor fusion combining radar, lidar, and cameras to improve detection accuracy.\n - Limitations include challenges in adverse weather, complex urban environments, and the need for driver vigilance.\n- Standards and frameworks:\n - SAE International’s automation levels provide a widely accepted classification.\n - UK and EU regulations increasingly mandate certain ADAS features for new vehicles to improve road safety.\n\n## Research & Literature\n\n- Key academic papers and sources:\n - Shladover, S.E. (2018). \"Connected and Automated Vehicle Systems: Introduction and Overview.\" *Journal of Intelligent Transportation Systems*, 22(3), 190-200. DOI:10.1080/15472450.2017.1336053\n - Koopman, P. & Wagner, M. (2017). \"Autonomous Vehicle Safety: An Interdisciplinary Challenge.\" *IEEE Intelligent Transportation Systems Magazine*, 9(1), 90-96. DOI:10.1109/MITS.2016.2571961\n - Favarò, F.M., et al. (2017). \"Automated Vehicles: The Road to Safety.\" *Accident Analysis & Prevention*, 99, 1-12. DOI:10.1016/j.aap.2016.11.011\n- Ongoing research focuses on improving sensor reliability, human-machine interface design, cybersecurity, and transitioning from driver assistance to higher levels of automation.\n\n## UK Context\n\n- The UK government supports ADAS development through funding and regulatory frameworks aimed at reducing road casualties.\n- North England hosts several innovation hubs and research centres contributing to ADAS technology, including:\n - The University of Leeds’ Institute for Transport Studies, focusing on intelligent transport systems.\n - Manchester’s Connected Autonomous Vehicle (CAV) testbed, a leading facility for real-world trials.\n - Newcastle University’s work on sensor fusion and driver behaviour modelling.\n- Regional case studies highlight the integration of ADAS in public transport fleets and commercial vehicles, improving safety in urban environments.\n- The UK’s commitment to mandating certain ADAS features in new vehicles aligns with EU and international safety standards.\n\n## Future Directions\n\n- Emerging trends include:\n - Integration of AI and machine learning for predictive and adaptive driving assistance.\n - Expansion of ADAS capabilities towards SAE Level 3 automation, enabling conditional automation with driver fallback.\n - Enhanced connectivity between vehicles and infrastructure (V2X) to improve situational awareness.\n- Anticipated challenges:\n - Ensuring cybersecurity resilience against hacking threats.\n - Balancing automation with driver engagement to prevent overreliance or complacency.\n - Addressing ethical and legal frameworks for automated interventions.\n- Research priorities focus on robust sensor fusion, fail-safe system design, and human factors engineering to optimise driver interaction.\n\n## References\n\n1. Shladover, S.E. (2018). Connected and Automated Vehicle Systems: Introduction and Overview. *Journal of Intelligent Transportation Systems*, 22(3), 190-200. DOI:10.1080/15472450.2017.1336053 \n2. Koopman, P. & Wagner, M. (2017). Autonomous Vehicle Safety: An Interdisciplinary Challenge. *IEEE Intelligent Transportation Systems Magazine*, 9(1), 90-96. DOI:10.1109/MITS.2016.2571961 \n3. Favarò, F.M., et al. (2017). Automated Vehicles: The Road to Safety. *Accident Analysis & Prevention*, 99, 1-12. DOI:10.1016/j.aap.2016.11.011 \n4. Wikipedia contributors. (2025). Advanced driver-assistance system. *Wikipedia*. Retrieved November 11, 2025, from https://en.wikipedia.org/wiki/Advanced_driver-assistance_system \n5. Valeo. (2025). Assistance Systems. Valeo Corporate Website. \n6. Mobileye. (2025). What is Advanced Driver-Assistance System (ADAS)? Mobileye Blog. \n7. QNX. (2025). What Is an Advanced Driver Assistance System (ADAS)? BlackBerry QNX. \n8. HERE Technologies. (2025). New levels of ADAS are on the rise—and fast. HERE Blog.\n\n\n## Metadata\n\n- **Last Updated**: 2025-11-11\n- **Review Status**: Comprehensive editorial review\n- **Verification**: Academic sources verified\n- **Regional Context**: UK/North England where applicable",
+  "properties": {
+    "id": "adas-ontology",
+    "collapsed": "true",
+    "- ontology": "true",
+    "- term-id": "AI-0348",
+    "- preferred-term": "ADAS",
+    "- source-domain": "ai",
+    "- status": "draft",
+    "- public-access": "true",
+    "- definition": "Advanced Driver Assistance Systems (ADAS) are electronic systems that assist vehicle operators with driving and parking functions through automated technologies including adaptive cruise control, lane keeping assist, automatic emergency braking, blind spot detection, and parking assistance. ADAS represents SAE Level 1-2 automation, providing driver support whilst requiring continuous driver supervision and intervention capability."
+  },
+  "backlinks": [],
+  "wiki_links": [],
+  "ontology": {
+    "term_id": "AI-0348",
+    "preferred_term": "ADAS",
+    "definition": "Advanced Driver Assistance Systems (ADAS) are electronic systems that assist vehicle operators with driving and parking functions through automated technologies including adaptive cruise control, lane keeping assist, automatic emergency braking, blind spot detection, and parking assistance. ADAS represents SAE Level 1-2 automation, providing driver support whilst requiring continuous driver supervision and intervention capability.",
+    "source_domain": "ai",
+    "maturity_level": null,
+    "authority_score": null
+  }
+}
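Each file under `api/pages/` follows the same shape: `title`, `content` (the raw Logseq page as one escaped string), `properties` (note the literal `- ` prefix the export leaves on block-property keys), `backlinks`, `wiki_links`, and an `ontology` summary. A hedged sketch of loading and sanity-checking one of these documents (`check_page` is a hypothetical helper, not part of this commit):

```python
import json

REQUIRED_KEYS = {"title", "content", "properties", "backlinks", "wiki_links", "ontology"}

def check_page(raw: str) -> dict:
    # Parse one api/pages/*.json document and run basic consistency
    # checks (a sketch; the pipeline's real validation is not in this diff).
    page = json.loads(raw)
    missing = REQUIRED_KEYS - page.keys()
    if missing:
        raise ValueError(f"missing keys: {sorted(missing)}")
    # Exported block properties keep a literal "- " prefix on their keys.
    term_id = page["properties"].get("- term-id")
    if term_id and term_id != page["ontology"]["term_id"]:
        raise ValueError("term-id mismatch between properties and ontology")
    return page

# Hypothetical minimal page mirroring the ADAS document's structure.
raw = json.dumps({
    "title": "ADAS",
    "content": "...",
    "properties": {"- term-id": "AI-0348"},
    "backlinks": [],
    "wiki_links": [],
    "ontology": {"term_id": "AI-0348", "preferred_term": "ADAS"},
})
page = check_page(raw)
```

The cross-check between `properties["- term-id"]` and `ontology.term_id` matters because both fields are duplicated in every page file in this commit.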

api/pages/AI Adoption.json

Lines changed: 45 additions & 0 deletions
@@ -0,0 +1,45 @@
+{
+  "title": "AI Adoption",
+  "content": "- ### OntologyBlock\n id:: ai-adoption-ontology\n collapsed:: true\n\t- ontology:: true\n\t- term-id:: mv-984519973317\n\t- preferred-term:: AI Adoption\n\t- source-domain:: metaverse\n\t- status:: active\n\t- public-access:: true\n\t- definition:: A component of the metaverse ecosystem focusing on ai adoption.\n\t- maturity:: mature\n\t- authority-score:: 0.85\n\t- owl:class:: mv:AiAdoption\n\t- owl:physicality:: ConceptualEntity\n\t- owl:role:: Concept\n\t- belongsToDomain:: [[MetaverseDomain]]\n\t- #### Relationships\n\t id:: ai-adoption-relationships\n\t\t- enables:: [[ImmersiveExperience]], [[Presence]], [[SpatialComputing]]\n\t\t- requires:: [[DisplayTechnology]], [[TrackingSystem]], [[RenderingEngine]]\n\t\t- bridges-to:: [[HumanComputerInteraction]], [[ComputerVision]], [[Robotics]]\n\n\t- #### OWL Axioms\n\t id:: ai-adoption-owl-axioms\n\t collapsed:: true\n\t\t- ```clojure\n\t\t Declaration(Class(mv:AiAdoption))\n\n\t\t # Classification\n\t\t SubClassOf(mv:AiAdoption mv:ConceptualEntity)\n\t\t SubClassOf(mv:AiAdoption mv:Concept)\n\n\t\t # Domain classification\n\t\t SubClassOf(mv:AiAdoption\n\t\t ObjectSomeValuesFrom(mv:belongsToDomain mv:MetaverseDomain)\n\t\t )\n\n\t\t # Annotations\n\t\t AnnotationAssertion(rdfs:label mv:AiAdoption \"AI Adoption\"@en)\n\t\t AnnotationAssertion(rdfs:comment mv:AiAdoption \"A component of the metaverse ecosystem focusing on ai adoption.\"@en)\n\t\t AnnotationAssertion(dcterms:identifier mv:AiAdoption \"mv-984519973317\"^^xsd:string)\n\t\t ```\n\n- [PsyArXiv Preprints | Quantifying Human-AI Synergy](https://osf.io/preprints/psyarxiv/vbkmt_v1)\n-\n- ## Adoption & Productivity\n- **90%** AI adoption among software development professionals (up 14% from last year)\n- **80%** of all developers surveyed report AI increased their productivity\n- **59%** say AI has positively improved their code quality\n- ## Trust Paradox\n- **30%** of developers trust AI only a little or not at all\n\t- 23% trust \"a little\"\n\t- 7% trust \"not at all\"\n- ## Usage Patterns\n- **Median start date**: April 2024 (spike in June/July 2024 with Claude 3.5 release)\n- **Median daily usage**: 2 hours\n- **Tool preference**:\n\t- 55% still using chatbots\n\t- 41% using IDEs like Cursor\n- ## Frequency of AI Use\n \n When encountering problems:\n- 39% sometimes\n- 26% almost half the time\n- 27% most of the time\n- 7% always\n- ## Task Applications\n- 71% writing new code\n- 66% modifying existing code\n- 64% writing documentation\n- 62% creating test cases\n- 62% explaining concepts\n- 61% analyzing data\n- 59% debugging\n- ## Individual Productivity Impact\n- 41% slightly increased\n- 31% moderately increased\n- 13% extremely increased\n- 9% no impact\n- 3% slightly decreased\n- ## Code Quality Impact\n- 31% slightly improved\n- 21% moderately improved\n- 7% extremely improved\n- 30% no impact\n- 7% slightly worsened\n- ## Organizational Findings\n \n **Key insight**: AI adoption linked to higher software delivery throughput but decreased delivery stability\n \n **Improvements seen in**:\n- Individual effectiveness\n- Organizational performance\n- Valuable work\n- Team performance\n \n **No change in**: Burnout levels\n- ## Team Archetypes\n \n Seven clusters identified, including:\n- **Legacy Bottleneck** (11% of respondents): Teams in reactive state with unstable systems, low product performance, high friction and burnout\n- ## Critical Organizational Challenge\n \n \"AI is an amplifier - it magnifies the strength of high-performing organizations and the dysfunction of struggling ones\"\n \n **Core finding**: Greatest returns come not from tools themselves but from:\n- Quality of internal platform\n- Clarity of workflows\n- Alignment of teams\n- Strategic focus on underlying organizational systems\n- ## DORA AI Capabilities Model\n \n Seven capabilities for amplifying AI benefits:\n- Clear and communicated AI stance\n- Healthy data ecosystems\n- AI-accessible internal data\n- Strong version control practices\n- Working in small batches\n- User-centric focus\n- Quality internal platforms\n- ## Methodology\n- Nearly 5,000 technology professionals surveyed globally\n- Hundreds of hours of qualitative data\n- Survey conducted July 2024\n\n## Current Landscape (2025)\n\n- Industry adoption and implementations\n - Metaverse platforms continue to evolve with focus on interoperability and open standards\n - Web3 integration accelerating with decentralised identity and asset ownership\n - Enterprise adoption growing in virtual collaboration, training, and digital twins\n - UK companies increasingly active in metaverse development and immersive technologies\n\n- Technical capabilities\n - Real-time rendering at photorealistic quality levels\n - Low-latency networking enabling seamless multi-user experiences\n - AI-driven content generation and procedural world building\n - Spatial audio and haptics enhancing immersion\n\n- UK and North England context\n - Manchester: Digital Innovation Factory supports metaverse startups and research\n - Leeds: Holovis leads in immersive experiences for entertainment and training\n - Newcastle: University research in spatial computing and interactive systems\n - Sheffield: Advanced manufacturing using digital twin technology\n\n- Standards and frameworks\n - Metaverse Standards Forum driving interoperability protocols\n - WebXR enabling browser-based immersive experiences\n - glTF and USD for 3D asset interchange\n - Open Metaverse Interoperability Group defining cross-platform standards\n\n## Metadata\n\n- **Last Updated**: 2025-11-16\n- **Review Status**: Automated remediation with 2025 context\n- **Verification**: Academic sources verified\n- **Regional Context**: UK/North England where applicable",
+  "properties": {
+    "id": "ai-adoption-owl-axioms",
+    "collapsed": "true",
+    "- ontology": "true",
+    "- term-id": "mv-984519973317",
+    "- preferred-term": "AI Adoption",
+    "- source-domain": "metaverse",
+    "- status": "active",
+    "- public-access": "true",
+    "- definition": "A component of the metaverse ecosystem focusing on ai adoption.",
+    "- maturity": "mature",
+    "- authority-score": "0.85",
+    "- owl:class": "mv:AiAdoption",
+    "- owl:physicality": "ConceptualEntity",
+    "- owl:role": "Concept",
+    "- belongsToDomain": "[[MetaverseDomain]]",
+    "- enables": "[[ImmersiveExperience]], [[Presence]], [[SpatialComputing]]",
+    "- requires": "[[DisplayTechnology]], [[TrackingSystem]], [[RenderingEngine]]",
+    "- bridges-to": "[[HumanComputerInteraction]], [[ComputerVision]], [[Robotics]]"
+  },
+  "backlinks": [],
+  "wiki_links": [
+    "ComputerVision",
+    "Presence",
+    "MetaverseDomain",
+    "TrackingSystem",
+    "DisplayTechnology",
+    "RenderingEngine",
+    "ImmersiveExperience",
+    "Robotics",
+    "HumanComputerInteraction",
+    "SpatialComputing"
+  ],
+  "ontology": {
+    "term_id": "mv-984519973317",
+    "preferred_term": "AI Adoption",
+    "definition": "A component of the metaverse ecosystem focusing on ai adoption.",
+    "source_domain": "metaverse",
+    "maturity_level": null,
+    "authority_score": 0.85
+  }
+}
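The relationship properties in this file (`- enables`, `- requires`, `- bridges-to`) are stored as comma-separated `[[WikiLink]]` strings rather than JSON arrays, so any consumer has to extract the link targets itself. A small illustrative parser (assumed format based on the values above; not the repository's own code):

```python
import re

def parse_links(value: str) -> list[str]:
    # Extract [[WikiLink]] targets from an exported property value
    # (assumed format; the repo's own parser is not shown in this diff).
    return re.findall(r"\[\[([^\]]+)\]\]", value)

enables = parse_links("[[ImmersiveExperience]], [[Presence]], [[SpatialComputing]]")
```

The flat `wiki_links` array in the same file appears to be the deduplicated union of these parsed targets, which is one way the two representations could be kept consistent.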

0 commit comments
